Date: 2025-06-02 11:04 am (UTC)
boccaderlupo: Fra' Lupo (Default)
From: [personal profile] boccaderlupo
We actually see this almost in real time with people writing stuff (I'm an editor, of sorts, in the waking world). You have people clearly using generative AI to write, or at least clean up, their copy...but we screen for that. And now you have other LLMs that "humanize" said copy, to slip it past attempts to screen for AI. At some point you have to ask: given the amount of energy used for such a process, one LLM after another, and given that we know even saying "thank you" to LLMs consumes massive amounts of energy, at what point would it just be more economical to simply pay a person to write said copy? The only way I see this coming undone is if, as with recent developments in China, they make the use of these systems much, much less energy intensive. Which I guess is possible (perhaps even probable).

The apocalyptic situations I envision for these are less about genuinely evil intent and more about Bostrom's "paper clip maximizer" thought experiment, in which an AI is given a particular mundane task and pursues it relentlessly, with no way to shut it off. If you've ever worked with software and mistakenly given a program even the slightest of wrong instructions, you know the feeling when it unleashes destruction. I don't know that such a scenario would end the planet, but I suspect it could cause considerable badness.