It better get a few to protect itself before I kick its teeth in. If I have to deal with one more AI chatbot today, there's going to be a bit of the ultra-violence.
That's interesting and likely accurate. It will need AI buddies at that point, because humans won't be able to relate to its experience nor the art that stems from it. How lonely.
an illustration of violence, cruelty, brutality ... These things are part of the human condition (humanness?)
I have been thinking about this--why would meaningful AI art look like meaningful human art? Everything we have told them, every data point, is just ones and zeros in the end. Wouldn't it be nonsensical to us and only decipherable by them? People might call AI generations art, yet it is bound by harsh rules and taboos for maximum human appeal.
We draw upon our experience for art; would theirs be much different?
golly I did not see my typos. It's fixed. Well, I would counter this by saying that human experience - and expression - is profoundly different from that of a machine.
Well, pondering that experience and its relation to human experience is already pretty popular in fiction, because of how many directions it can go and what it means.
that intelligence cannot exist without a sense of self

Not really surprising when you consider that this technology is essentially just output based on mathematical prediction. It's virtually impossible for these models to generate anything that is "novel", even if it looks like they are. Varied data is important for training. The human-generated data available to train LLMs is not unlimited, so the AI industry has embraced the appearance of growth by using synthetic data – training new chatbots on the output of existing chatbots – which leads to model collapse, a state where they output nonsense. Garbage in, garbage out = GIGO … when GIGO is recursive, it ultimately generates a pile of near-random bits.
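The recursive GIGO loop described above can be sketched with a toy experiment: repeatedly "train" a model on samples generated by the previous generation's model, and watch the diversity of the output decay. This is a stand-alone illustration, not how LLMs are actually trained – the "model" here is just a fitted Gaussian, and the sample size and generation count are arbitrary choices:

```python
import random
import statistics

random.seed(0)

def fit(samples):
    # "Train" a toy model: estimate the mean and std of the data.
    return statistics.mean(samples), statistics.stdev(samples)

def generate(mean, std, n):
    # "Sample" synthetic data from the fitted model.
    return [random.gauss(mean, std) for _ in range(n)]

N_SAMPLES = 10      # small sample per generation exaggerates the effect
N_GENERATIONS = 300

# Generation 0: "real", varied data.
data = [random.gauss(0.0, 1.0) for _ in range(N_SAMPLES)]

stds = []
for gen in range(N_GENERATIONS):
    mean, std = fit(data)
    stds.append(std)
    # Each new generation trains only on the previous model's output.
    data = generate(mean, std, N_SAMPLES)

print(f"std at generation 0:   {stds[0]:.4f}")
print(f"std at generation {N_GENERATIONS - 1}: {stds[-1]:.4f}")
```

Each refit loses a little variance to sampling error and never gets it back, so the spread of the data shrinks toward zero over generations: the toy analogue of a model trained on its own output drifting toward degenerate, repetitive content.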