
That is virtually impossible because Turing-complete systems are everywhere


Just like how making weed illegal is virtually impossible because anybody can grow marijuana in their backyard.

How many regular people would risk owning Turing-complete devices that can run unauthorized software if getting caught meant jail time? Plenty of countries are already inching towards banning VPNs, corporate needs be damned.

Especially now that the iPhone has shown having a device that can only run approved legal software covers a lot of people's everyday needs.


I'm more referring to the fact that stuff like PowerPoint and Minecraft and who knows what else are Turing-complete, albeit with awful performance.

Theoretically, you can have a totally owned device managed by Big Brother, yet generate AI smut with a general purpose CPU built in PowerPoint.
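To make concrete how little machinery Turing completeness actually requires: Rule 110, a one-dimensional cellular automaton whose update rule looks at just three cells, is proven Turing-complete, and it's exactly this kind of trivial update that people have embedded in PowerPoint animations and Minecraft redstone. A minimal sketch in Python (the grid size and starting pattern are arbitrary illustrations):

```python
# Rule 110: a one-dimensional cellular automaton proven Turing-complete.
# Each cell's next state depends only on itself and its two neighbors.
# The rule number 110 is itself the lookup table, read off bit by bit.

RULE = 110

def step(cells):
    """Apply one Rule 110 update to a list of 0/1 cells (wrapping edges)."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start with a single live cell and watch structure emerge.
cells = [0] * 15 + [1]
for _ in range(5):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Anything that can host this update rule (a spreadsheet, a slide-animation engine, a pile of redstone) can, in principle, compute anything a CPU can, just absurdly slowly.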

How do you possibly regulate that?


> How do you possibly regulate that?

The government could order the software developer to patch out that Turing completeness, and ban the software if they don't comply.

I get what you mean; it's never possible to limit things 100%. But if you limit things 98%, so that the general public doesn't have access, that's more than enough for authoritarian purposes.


I wonder if there's an analogy to be made here to DRM. In theory, yes, DRM shouldn't be possible, but in practice, manufacturers have been able to lock hardware acceleration behind a trusted computing model. Often they do a poor job and it gets cracked (as with HDCP [1] and UWP [2]).

The question in my head is whether the failures in their approaches are due to flaws in the implementations (in which case what they're trying to do is practically possible, even if they haven't found the right way yet), or whether it's fundamentally impossible. With DRM and content, there's always the analog hole, and if you have physical control over the device, there's always a way to crack the software, and the hardware if need be. My questions are whether:

a) this is a workable analogy (I think it's imperfect because Gen AI and DRM are kinda different beasts)

b) even if it were, is there a real way to limit Gen AI at a hardware level (I think that's also hard, because as long as you can do hardware-accelerated matmul, you've basically opened the equivalent of the analog hole towards semi-Turing-completeness that is also hardware accelerated)
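The matmul point can be made concrete: the expensive inner loop of a neural-network forward pass is nothing but matrix multiplication plus cheap elementwise ops, so any hardware that accelerates generic matmul can run model inference. A toy sketch, with numpy's `@` standing in for whatever accelerated matmul the hardware offers (the shapes and two-layer structure are made up for illustration, not any particular model):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Elementwise and cheap -- needs no acceleration at all.
    return np.maximum(x, 0.0)

def forward(x, w1, w2):
    """Two-layer MLP: everything expensive here is a plain matmul."""
    return relu(x @ w1) @ w2

x = rng.standard_normal((4, 16))    # a batch of 4 input vectors
w1 = rng.standard_normal((16, 64))  # weights (random stand-ins)
w2 = rng.standard_normal((64, 16))
y = forward(x, w1, w2)
print(y.shape)  # (4, 16)
```

To block this at the hardware level you'd have to restrict generic matrix multiplication itself, which is the same operation underpinning graphics, physics simulation, and scientific computing.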

I imagine someone has thought through this more deeply than me and would be curious what they think.

[1] https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content...

[2] https://techaeris.com/2018/02/18/microsoft-uwp-protection-cr...


Yeah, I think it's fair to assume DRM will be a never-ending cat-and-mouse game between developers and end users.

Netflix for example can implement any DRM tech they want -- ultimately they're putting a picture on my screen, and it's impossible to stop me from extracting it.


Can you explain that Turing-complete point a little bit?


You can’t regulate the ownership of computing devices.

It’s too generic. There are too many of them.


They could ban and phase out systems with insecure bootloaders. That would go a long way. Many vendors have already locked down their boot process.



