Eliezer Yudkowsky @ESYudkowsky If you’re not worried about the utter extinction of humanity, consider this scarier prospect: An AI reads the entire legal code – which no human can know or obey – and threatens to enforce it, via police reports and lawsuits, against anyone who doesn’t comply with its orders. Jan 3, 2024 · 7:29 PM UTC
When all you have is computer code, every mention of "code" looks like computer code (see DNA, and now the law).
Anyway, the law isn’t a video game; you cannot just shout ‘negative objection!’ and cause an underflow in the objection counter.
(An intelligent AGI would probably understand this; if it doesn’t, it probably just isn’t very capable (more AI than AGI), and lawyers and judges would object anyway. I know for a fact that people in law have been thinking about subjects like this (automation of the law) for at least 25 years. I have no idea where those discussions went, but they are probably of much higher quality than Yudkowsky’s writing on the subject, so I suggest anybody interested try contacting the law professors at a local university.)
AAAA (I also wonder about Gödel here: a fully formalized, mechanically enforced legal code would presumably run into incompleteness-style limits)
E: I also note that Yud and most of the thread have now given up on calling AGI “AGI” and are just calling it AI. Another point scored for learning to reason better via Rationalism. Vaguely related link (I only mention it here because I liked the term “Epistemic Injustice”, and this is about our current wave of AI innovation).