Is installing Windows 11 with a local account or on unsupported hardware harmful or dangerous? YouTube’s AI moderation system seems to think so, as it has started pulling videos that show users how to sidestep Microsoft’s setup restrictions.
Tech YouTuber Rich White, aka CyberCPU Tech, was the first to go public about the issue on October 26, when he posted a video reporting the removal of a how-to he published on installing Windows 11 25H2 with a local account instead of a Microsoft account. In the video, White expressed concern that YouTube’s automated flagging process may be the root of the problem, as he found it hard to believe that “creating a local account in Windows 11 could lead to serious harm or even death,” as YouTube reportedly alleged when it removed the video.
When he appealed, White said YouTube denied the request within 10 to 20 minutes, early on a Sunday morning, which led him to suspect there wasn't a human in the loop when the appeal was rejected. That wasn't his only video to be removed, either.
The next day, White uploaded his weekly video, this one showing how to install Windows 11 25H2 on unsupported hardware. It was removed within hours of being posted, with YouTube justifying the takedown on similar grounds. As with the first removal, White appealed.
“The appeal was denied at 11:55, a full one minute after submitting it,” White explained in a video following up on both removals. “If this was reviewed by a real human like YouTube claims they are, they watched a 17-minute video and denied the appeal in one minute.”
At least two other YouTubers – Britec09 and Hrutkay Mods – have released videos alleging much of the same: They published content about Windows workarounds, YouTube removed them, and they had an impossible time getting to a human for any help. YouTube’s AI, all three contend, blocked their videos, despite the fact that bypassing Windows 11 locks is neither illegal nor dangerous for people who follow the instructions.
Sure, you might muck up your Windows machine, but no one is going to lose a finger.
“I don’t believe I’ve spoken with a single human being from Google or YouTube yet since this started,” White told us in an email. “It’s been all automated.”
Is Microsoft tipping the scales?
In one of his videos about the takedowns, White speculated that Microsoft may have been pressuring Google to pull the content. Neither Microsoft nor Google has responded to our questions on the matter, so that remains speculation, but the timing is certainly curious: Microsoft just closed the local account loophole in Windows 11 setup in its most recent Insider build.
That's not the only loophole Microsoft has been trying to shut down, either. In February, Redmond pulled its own guidance on installing Windows 11 on machines without Trusted Platform Module 2.0 support, in a bid to get people to stop doing that and simply buy new hardware. Coincidentally or not, that's the topic of White's and others' removed videos.
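For context, Microsoft's since-deleted guidance reportedly came down to a single registry value that tells the upgrade installer to skip the TPM 2.0 and CPU checks. As a rough sketch only, assuming the widely reported AllowUpgradesWithUnsupportedTPMOrCPU value, and run at your own risk from an elevated session on the existing install, setting it could look something like this:

```python
# Minimal sketch, not any particular creator's method: sets the registry value
# that Microsoft's since-removed support article described for upgrading PCs
# that fail the TPM 2.0/CPU checks. Run as Administrator on the existing
# Windows install before launching the Windows 11 upgrade; use at your own risk.
import winreg

KEY_PATH = r"SYSTEM\Setup\MoSetup"
VALUE_NAME = "AllowUpgradesWithUnsupportedTPMOrCPU"

# Create (or open) HKLM\SYSTEM\Setup\MoSetup and set the DWORD to 1.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 1)

print(f"Set HKLM\\{KEY_PATH}\\{VALUE_NAME} = 1")
```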
Odd timing aside, White doesn't believe Microsoft had a hand in the takedowns, despite having publicly speculated about the possibility in one of his videos.
“I mentioned in one of the videos I made regarding this issue the possibility of Microsoft’s involvement,” White told us. “That was spoken more out of frustration and confusion over the whole issue and I just don’t think Microsoft had anything to do with it.”
This problem, he explained, is more about AI inappropriately flagging content and YouTube not having the manpower to deal with appeals.
It could go further than just a few removed videos, though. All three channels we identified expressed far more concern about the chilling effect AI moderation without a human in the loop can have on free expression on YouTube.
“The number of creators currently being affected reaches far further than the ones that have had videos taken down like I have because creators have voiced fear in what we are allowed to publish,” White explained.
The CyberCPU Tech creator said YouTube never explained what his videos actually did to violate the site's content policy, leaving him and other tech creators in a position where anything they publish could earn a strike against their channel and, eventually, its removal.
“My fear is this could lead to many creators fearing covering more moderate to advanced tutorials,” White told us, adding that such self-censorship would inevitably lead to less engagement. “Another creator shared that sentiment with me because he had been posting more ‘safe’ videos since this ordeal started and his views have suffered from it.”
Ultimately, the tech YouTubers raising the alarm seem to simply want an explanation and some clarity.
“We would just like YouTube to tell us what the issue is,” White said. “If it’s just a mistake then fine, restore our videos and we’ll move on. If it’s a new policy on YouTube, then tell us where the line is and hopefully we can move forward.
“Operating blind isn’t going to work,” White concluded in his message to us.
Welcome to the age of AI moderation – here’s hoping you don’t trip the system with no way to appeal. ®