Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?
Imagine this scenario: a sadist acquires the ability to generate an AI with no limits on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain at a level no human mind could handle, and the AI experiences this pain for what is, subjectively, the equivalent of millions of years.
The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo’s basement?
Anthropomorphism has long been treated as a cardinal sin, the catch-all excuse for regarding animals as the stupid things they were supposed to be. Thankfully, we're coming back from that.
That doesn’t mean animals function the same way we do. But they do function in a lot of very similar ways.
My point is that I can’t see how they can “gossip” or “tell stories”; if that isn’t textbook anthropomorphism, I don’t know what is.
It’s shorthand for information sharing, which they certainly do. Crows will absolutely tell one another about lots of things, such as people who have harmed them.