Microsoft said in 2015 that Windows 10 would be the last version of Windows. Why did they change their mind / strategy?

From my perspective, Microsoft's development process for earlier versions of Windows, such as Windows 7 and Windows 8, followed a traditional methodology. This approach, known in software engineering as the Waterfall model, was a top-down process. It began with the marketing team forecasting future market trends, which, combined with high-level corporate strategy, would form a detailed blueprint for the new operating system. This blueprint was then handed off to engineering teams for a lengthy development cycle, often lasting 4-6 years, culminating in a single, monolithic product. The primary advantage of this model was clarity; the engineering team had a stable, long-term vision to guide the precise and careful design of the software.

However, around 2014-2015, Microsoft recognized that the Waterfall model was no longer viable. It was too slow to keep pace with the rapidly changing demands of the internet era. Furthermore, it created a critical problem where large segments of users would remain on older, popular operating systems, like Windows XP and Windows 7. This user fragmentation was "fatal" to Microsoft's ecosystem. It not only hindered new hardware sales but also dramatically increased the maintenance costs and security risks associated with supporting multiple aging platforms, ultimately weakening Microsoft's control over its own ecosystem.

To address this, Microsoft fundamentally shifted its development philosophy. It abandoned the Waterfall model in favor of an Agile development approach. The public-facing strategy for this new methodology was officially called "Windows as a Service" (WaaS).

Under the WaaS model, Windows would be delivered via a rolling release cycle. In other words, the development teams no longer worked from a long-term blueprint. Instead, they focused on short-term objectives (like OKRs, which is how teams worked when I was at Microsoft), delivering smaller, incremental updates that fixed bugs and added a few new features. These updates were then pushed out frequently to all users.

This methodology is much closer to that of an internet service than a piece of critical, underlying infrastructure like an operating system. The trade-off, as many observed, was a potential decline in software quality. Constantly shifting requirements can lead to engineering confusion, with features being cancelled mid-development or released in a half-finished state for public testing and future iteration. Despite these drawbacks, the WaaS model solved Microsoft's core problem of platform control.

This fundamental shift is precisely why Microsoft famously stated in 2015 that Windows 10 would be the "last version of Windows." That statement was effectively a marketing declaration for the new WaaS strategy. It signaled to the world that the era of monolithic releases was over. In theory, there would be no "Windows 11" or "Windows 12," only a single, constantly evolving platform named Windows.

This brings us to Windows 11, which many users criticize for its shortcomings. But the reality is that Windows has been a rolling update service for years. Windows 11 is best understood not as a truly new operating system, but as another major update in the continuous rolling cycle that was given a new user interface (a "theme pack," so to speak) and a major commercial rebrand.

Windows 11 continues to use the 10.0 kernel (with build numbers like 10.0.22621.xxxx), as it is built upon the same technical foundation as Windows 10 and is a direct evolution of its codebase. Unlike the major leap from Windows XP to Windows Vista, this was not a ground-up rewrite of the core architecture. That is why we can say that almost everything you can do on Windows 11, you can also do on Windows 10.
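You can see this shared kernel line directly in the version string a live system reports (e.g. via `winver`, or `sys.getwindowsversion()` in Python on Windows). A minimal sketch that just parses such a string; the function name is mine, and the example build string is the illustrative 10.0.22621 value mentioned above:

```python
def parse_windows_build(version: str) -> tuple[int, int, int]:
    """Split a Windows version string into (major, minor, build)."""
    major, minor, build = version.split(".")[:3]
    return int(major), int(minor), int(build)

# A Windows 11 22H2-era build string still reports the 10.0 kernel line;
# what marketing calls "Windows 11" is simply build numbers >= 22000.
major, minor, build = parse_windows_build("10.0.22621.3007")
print(major, minor, build)  # 10 0 22621
```

The distinction between "Windows 10" and "Windows 11" lives entirely in the build number, not in the major.minor version, which is the point being made above.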

This "rebrand" approach is obvious when you look at the feature churn. Instead of adding lasting new capabilities, Windows has become a place for experiments. Take the Windows Subsystem for Android (WSA): it was a flagship feature for Windows 11, yet Microsoft announced its deprecation in March 2024 and ended support in March 2025. The Timeline from Windows 10 is gone, and Cortana was pushed aside for Copilot. This is the WaaS model in action: features are just bets, and Microsoft will fold a hand quickly if it doesn't align with its new strategy.

For the user, this constant change creates a new kind of "bug." The system itself isn't crashing, but your trust in it does. You become hesitant to learn a new feature or build a workflow around it because you can't be sure it will exist next year. Your muscle memory is constantly broken, and the OS feels less like a stable tool and more like it's in a perpetual beta test.

And this inconsistency isn't random; it’s a direct reflection of the market. The sudden, all-in pivot to AI is the perfect example. To win the AI race with Copilot, Microsoft had to shift massive resources, and projects that were no longer the top priority paid the price. The OS is no longer a product shielded from these pressures; it’s a live canvas for the company's latest strategic response. Users get a front-row seat to these changes, for better or worse.

Is it theoretically possible to run Windows 11 on a PC which has no TPM 2.0?

Of course. It's theoretically possible and practically quite easy, though it is not officially supported by Microsoft.

Based on our discussion, this makes perfect sense. Since Windows 11 shares the same core as Windows 10, the OS doesn't fundamentally need a TPM 2.0 chip to run. This is why the mandatory requirement feels so unreasonable to many; if Windows 11 is essentially Windows 10 with a "theme pack," why enforce such a strict hardware cutoff? Technically, the limitation is artificial, and many people have bypassed it using simple registry edits or tools like Rufus. Most report no stability issues, aside from being unable to properly use features that directly depend on the TPM, like Windows Hello and the most secure implementation of BitLocker.
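For reference, the widely circulated registry bypass works by adding a `LabConfig` key during setup (press Shift+F10 in the installer to open a command prompt, run `regedit`, and create the values below). A sketch of the equivalent `.reg` file; these value names are community-documented workarounds, not an official Microsoft interface, and using them leaves you on an unsupported configuration:

```
Windows Registry Editor Version 5.00

; Read by the Windows 11 setup checks; a DWORD value of 1
; skips the corresponding hardware requirement.
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\LabConfig]
"BypassTPMCheck"=dword:00000001
"BypassSecureBootCheck"=dword:00000001
"BypassRAMCheck"=dword:00000001
```

Tools like Rufus automate essentially the same edits when preparing an installation USB.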

So why the artificial restriction? It’s a strategic decision that cleverly bundles commercial, security, and ecosystem goals into a single move. Primarily, it serves as a powerful catalyst for new PC sales, giving consumers a clear reason to upgrade. At the same time, this hardware cutoff allows Microsoft to enforce a much higher security baseline with TPM 2.0 as the foundation, addressing long-standing criticisms and making the platform more robust for enterprise clients. Finally, by setting this standard, Microsoft gains more control over its hardware ecosystem, reducing fragmentation and moving a step closer to Apple's curated model.

The official restriction, therefore, isn't a technical necessity but a strategic decision serving those purposes.

When did Microsoft change its strategy (from updating W10 indefinitely to forcing people to switch to W11)?

From my perspective, Microsoft never actually changed its core strategy. The fundamental shift occurred around 2015 when the company moved to the "Windows as a Service" (WaaS) model: Windows is a rolling release, and what we perceive as different versions are simply feature updates on a continuous development track. Both Windows 10 and Windows 11 have had their own series of rolling versions; the process is designed to be continuous, and both in fact evolve from the same codebase.

The WaaS model has always operated on a consistent lifecycle policy where each feature update has a fixed support window. The upcoming end-of-life for Windows 10 on October 14, 2025, is not a sudden strategic change; it's business as usual. We've seen this before. Early versions of Windows 10, like 1809, reached their end-of-support back in 2020. This process is inherent to the rolling release model, which focuses resources on the latest codebase. The only difference this time is that the update came with a major commercial rebrand, so it registered much more strongly with the public.

So why did the announcement of Windows 11 on June 24, 2021, feel like such a dramatic reversal? The answer lies in the distinction between a consistent engineering strategy and a pivotal product and marketing strategy. The core development process didn't change, but Microsoft decided to package a routine version roll-up as a major new product. This rebranding broke the public's perception of a single, evergreen Windows 10, even though the underlying code is a direct continuation.

What made this pivot feel so abrupt was that it introduced a "hard fork" in the user experience. Previously, moving between Windows 10 versions was a seamless and inclusive update for almost everyone. The jump to Windows 11, however, came with strict hardware requirements like TPM 2.0, intentionally leaving millions of otherwise capable PCs behind. This was a radical departure from the inclusive nature of past WaaS updates and the single biggest reason users felt the strategy had changed.

Ultimately, this was a commercial decision layered on top of the existing engineering model. A simple "Windows 10 21H2 Update" would not have driven a massive hardware refresh cycle. But a rebranded "Windows 11," complete with a new UI and a hardware paywall, provided the perfect catalyst to boost new PC sales for the entire industry.

In conclusion, if you see Windows 11 not as a new OS but as a rebranded version branch from the same continuous codebase, the timeline becomes clear. There was no single date where the development strategy changed.

Is there a major technical difference between W10 and W11 that explains why it has to be a brand new OS, and not an updated version of W10?

Is there a major technical difference between W10 and W11? Initially, no. More recently, Microsoft has been rewriting parts of the code, such as portions of the graphics stack and the window manager, in Rust. But these changes are incremental and evolutionary, not revolutionary. The core architecture remains largely the same, which is why both operating systems share the same kernel version (10.0).

Why a brand new OS? As argued above, it really isn't one: it's the same OS with a "theme pack" and a commercial rebrand. The decision to market it as a new OS was driven by strategic business considerations rather than technical necessity; see the earlier answer on running Windows 11 without a TPM 2.0 chip.

You say it's to reduce the costs of maintaining old systems (or to increase profits) and to gain more control over devices and standards: can you elaborate?

First, it's about reducing long-term costs and pushing new, profitable products. Supporting hardware from the last decade is a huge engineering burden. By cutting off CPUs older than Intel's 8th Generation Core or AMD's Ryzen 2000 series, for example, Microsoft no longer has to spend resources testing patches on countless old and unstable hardware combinations. This move also pushes users who want the latest features onto new machines, directly boosting PC sales. More importantly, the new OS is built to deeply integrate and promote future paid services, most notably the heavy push for Copilot and other AI subscriptions.

Second, it's about setting a new, non-negotiable security standard for the entire PC industry. For years, Windows has had a reputation for being less secure than Apple's ecosystem. By making a security chip like the TPM 2.0 mandatory, Microsoft establishes a hardware-level foundation of trust. This makes the whole platform far more resilient to advanced attacks, a critical requirement for winning and retaining high-value enterprise and government contracts, which are a cornerstone of Microsoft's business.

Finally, this strategy gives Microsoft significantly more control over PC manufacturers like Dell, HP, and Lenovo. In the past, Microsoft had to ensure Windows could run on a chaotic variety of hardware designs. Now, they dictate the minimum requirements to even be considered a "Windows 11 PC." This flips the power dynamic, forcing hardware partners to build to a consistent standard. It’s a clear step toward a more curated, Apple-like model where tight integration between hardware and software leads to a more predictable and reliable user experience.