When Microsoft rolled out Windows 11 Copilot as the crown jewel of its AI-powered desktop vision, it promised seamless, intelligent assistance right at your fingertips. But by late August 2025, users weren’t praising its smarts. They were rage-quitting it. Across Microsoft’s own Q&A forums, a flood of reports described Copilot crashing after a handful of interactions, losing conversation history, and consuming absurd amounts of RAM, all while promising to make work easier. The twist? It’s not just glitchy. It’s making users nostalgic for Notepad.
From Assistant to Annoyance: The Crash Pattern
It started quietly. Then exploded. On August 21, 2025, user Tyler Hagl posted a detailed breakdown: Copilot took three full seconds to load before crashing when he tried using "Smart GPT-5" for startup research. That wasn’t an outlier. Within 24 hours, dozens more confirmed the same: after 5-7 queries, Copilot slowed to a crawl. By 15-20 interactions, it would vanish entirely—erasing the entire conversation history. Error code -536870904 popped up constantly, a telltale sign of memory overload. And here’s the kicker: upgrading to Copilot Pro made it worse. One user wrote, "I paid for the premium version hoping for stability. Instead, I got more hangs and lockups."
When AI Can’t Find the Text Size Setting
Microsoft’s marketing machine was running full throttle—until it wasn’t. On November 19, 2025, WindowsLatest.com broke the story: Microsoft had pulled a major ad campaign after the AI failed a basic task. The ad showed Copilot guiding a user to adjust text size. Instead of pointing them to Settings > Accessibility > Text size, it directed them to Display settings. A grandmother could’ve spotted the error. The backlash was immediate. Comments flooded in: "This is embarrassing," "I’d trust my toaster more," and "I’m uninstalling this." Within hours, Microsoft locked replies on the post by its Windows Chief, who’d boldly claimed Windows was evolving into an "agentic OS." The silence that followed spoke louder than any press release.
Notepad’s Death by AI: A Case Study in Overreach
Charly Leetham, tech commentator and YouTube host, devoted Episode 637 of his channel (published November 14, 2025) to the problem, and he didn’t just document it; he mourned it. "Notepad used to be the fastest, cleanest, most reliable text editor on Earth," he said. "Now? It’s cluttered with AI buttons, loading spinners, and a cloud dependency you didn’t ask for." Leetham highlighted a deeper issue: privacy. Every time you type in Notepad with Copilot enabled, your text gets sent to Microsoft’s servers. For professionals handling sensitive documents, that’s a non-starter. "It’s not just slow," he added. "It’s invasive. And it’s replacing something perfect with something broken."
Why It’s Failing: The Stateless AI Trap
Technical analysts point to a fundamental flaw: Copilot’s chat is stateless. Every time you ask a question, the entire conversation history gets shipped from your PC to Microsoft’s servers and processed all over again. The longer the chat, the heavier the load; it’s like a backpack that gains a pound with every block you walk. Other AI platforms work the same way, but users say Copilot feels worse. Why? Maybe because Microsoft’s optimizations lag behind. Maybe because it’s been forced into too many apps too fast. Either way, the result is the same: sluggishness, crashes, and frustration.
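To make that mechanic concrete, here is a minimal Python sketch of the stateless pattern. It is illustrative only: the message format and the padded reply sizes are assumptions rather than Copilot’s actual protocol, but it shows how the request payload keeps growing for as long as a conversation runs.

import json

# Minimal sketch of the stateless pattern described above: the client keeps
# the conversation history locally and resends ALL of it with every new
# question. The message format here is illustrative, not Copilot's protocol.
history = []  # grows for the life of the chat

def build_request(user_message: str) -> bytes:
    """Serialize the full history plus the new message into one payload."""
    history.append({"role": "user", "content": user_message})
    return json.dumps({"messages": history}).encode("utf-8")

# Simulate 20 turns and watch the payload size climb with every query.
for turn in range(1, 21):
    payload = build_request(f"Question {turn}: " + "some context " * 50)
    # A simulated assistant reply is appended too, and resent next time.
    history.append({"role": "assistant", "content": "answer " * 200})
    print(f"turn {turn:2d}: request payload = {len(payload):,} bytes")

The payload grows with every turn, which is the same dynamic users describe hitting somewhere around the 15-20 query mark.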
Microsoft’s Fix? Upgrade, Disable, and Hope
Instead of fixing the core issue, Microsoft’s official troubleshooting guide reads like a checklist for tech-savvy power users: upgrade to Windows 11 Pro, enable High Performance Power Plan, disable animations, tweak WSL memory, update drivers, disable antivirus, and restart conversations every 10-15 queries. For Copilot in Outlook and other Microsoft 365 apps, users are told to monitor Task Manager like a hawk. There’s no mention of server-side optimization. No promise of a fix. Just workarounds that require more effort than the AI is supposed to save.
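For anyone following that advice, a small script can do the Task Manager watching automatically. This is a rough sketch built on the third-party psutil package; the process name below is an assumption, since the component hosting Copilot can differ between builds (it often runs inside a WebView2 or Edge child process), so adjust it to match what Task Manager shows on your machine.

import time

import psutil  # third-party: pip install psutil

# Rough sketch of the "monitor Task Manager like a hawk" advice. The process
# name is an assumption; change it to whatever hosts Copilot on your build.
WATCH_NAMES = {"copilot.exe"}
WARN_BYTES = 1_500_000_000  # roughly the 1.5 GB spike users report

while True:
    for proc in psutil.process_iter(["name", "memory_info"], ad_value=None):
        name = (proc.info["name"] or "").lower()
        mem = proc.info["memory_info"]
        if name in WATCH_NAMES and mem is not None:
            note = "  <-- consider restarting the conversation" if mem.rss > WARN_BYTES else ""
            print(f"{name}: {mem.rss / 1_048_576:,.0f} MB{note}")
    time.sleep(10)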
The Bigger Picture: AI Integration at Any Cost?
This isn’t just about a buggy feature. It’s about Microsoft betting its future on AI—fast, loud, and everywhere. From Notepad to Excel, from Edge to Teams, Copilot is being stitched into every corner of Windows and Microsoft 365. But if users can’t trust it to change text size, or if it crashes mid-sentence, how can they trust it to draft contracts, summarize emails, or analyze data? The backlash isn’t just about performance. It’s about trust. And trust, once broken, is harder to rebuild than code.
What’s Next? The Feedback Loop That Might Not Be Enough
Microsoft says it’s listening—through the Microsoft 365 Copilot Community forum, where the "product development team monitors this site around the clock." But users are skeptical. After all, they’ve been reporting crashes since August. The same issues persist. The same ads get pulled. The same executives lock comments. The real question isn’t whether Microsoft will fix Copilot. It’s whether they’ll fix it before users abandon Windows 11 for alternatives—like Chromebooks, Macs, or even Linux.
Frequently Asked Questions
Why does Copilot crash after only a few questions?
Copilot suffers from a stateless design flaw: every query requires sending the full conversation history to Microsoft’s servers. As the chat grows longer, the data load increases, triggering memory errors like -536870904. This isn’t unique to Copilot, but Microsoft’s implementation lacks optimizations seen in competitors like ChatGPT, making crashes more frequent and severe.
Is Copilot Pro better than the free version?
No—users report the opposite. Multiple Q&A forum posts from August 2025 confirm that upgrading to Copilot Pro led to more hangs, lockups, and crashes. The premium version adds no stability improvements and may even increase resource usage due to additional AI features that aren’t optimized for consumer hardware.
What should I do if Copilot is slowing down my PC?
Disable Copilot in apps like Notepad and Outlook until Microsoft releases a patch. Use Task Manager to monitor memory usage—Copilot often spikes to 1.5GB+ of RAM. Switch to High Performance Power Plan, disable visual effects, and update drivers. But the most effective fix? Turn it off entirely. Many users report faster performance and less stress after uninstalling it.
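For readers who want the "turn it off entirely" route, one option is the documented TurnOffWindowsCopilot policy, set here from Python’s standard winreg module. Treat this as a hedged sketch: the policy was written for the original sidebar-style Copilot and may have no effect on builds that ship Copilot as a standalone app, where uninstalling the app or using each application’s own toggle is the surer path.

import winreg

# Writes the "Turn off Windows Copilot" policy for the current user. Caveat:
# this targets the original sidebar-style Copilot and may not affect newer
# builds where Copilot ships as a separate store app.
KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsCopilot"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                        winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "TurnOffWindowsCopilot", 0, winreg.REG_DWORD, 1)

print("Policy written; sign out and back in for it to take effect.")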
Why did Microsoft pull its Copilot ad?
The ad showed Copilot guiding users to change text size—but it incorrectly directed them to Display Settings instead of Settings > Accessibility > Text size. The error was so obvious that even non-tech users spotted it. Microsoft deleted the ad within 24 hours and locked comments on its X post after a flood of ridicule. It was a public embarrassment that exposed the gap between marketing and reality.
Is my data safe when I use Copilot in Notepad?
No. When Copilot is active in Notepad, any text you type is sent to Microsoft’s cloud servers for processing. There’s no local AI option. For users handling confidential documents, legal text, or personal notes, this poses a serious privacy risk. Microsoft doesn’t clearly warn users about this—making it a silent data leak.
Will Microsoft fix Copilot soon?
Microsoft says it monitors feedback, but no major update has addressed the core memory and stateless-design issues since August 2025. With no public roadmap for optimization, users are left guessing. Historically, Microsoft ships fixes like these in major Windows feature updates, which points to a 2026 release at the earliest. Until then, expect more crashes, more complaints, and more users switching to alternatives.