Thread: Dead Internet Theory



# 16411 2 months ago on Tue, Oct 15 2024 at 2:50 pm

This topic was briefly touched upon in the Artificial Intelligence topic but not explored or discussed at length. Therefore, I would like to encourage further discussion.

Although some aspects of the subject veer into wild and unfounded speculation, the basic idea is that bot content and activity (specifically advertising, astroturfing, and engagement farming) is increasingly overtaking that of real human users on the Internet.

It's a theory that seems plausible given current statistics, and even if it isn't fully true right now, it quite possibly will be in the near future.

It's connected to the theme of artificial intelligence because LLMs (Large Language Models) can produce content very rapidly. It is technically possible to flood any area of the Internet that accepts user-generated content with mass-produced AI output, and it is already happening in many places, including social media and search engine results.

Many would likely agree that this is one of the negative impacts of AI: spamming, or flooding the web with mass-produced and often meaningless content.

(This post was edited 2 months ago on Tuesday, October 15th, 2024 at 2:52 pm)

# 16413 2 months ago on Wed, Oct 16 2024 at 12:07 am

So maybe 10 years ago this would have been a nutty fringe idea, but anybody who's seen spambots flooding forums and socials knows it isn't nearly as far-fetched as it used to be. Funny enough, Lonny Aftershave didn't solve Twitter's spambot problem at all; even after he tried to get out of buying the place for that very reason, the spam only got worse once it changed hands.

Anybody who's played with ChatGPT or Gemini can see how easy it is to pump out halfway-convincing replies, articles on any subject, whatever you want. It's technically possible, and if real people are being driven away by the flood of it, the theory is bound to come true at some point, at least.

There's also the angle of intentional gaslighting, astroturfing, and deception: computer-generated content masquerading as authentic users to push whatever agenda someone wants. I'd believe that; at least it gives the flood a purpose other than flooding for its own sake.

This place had some experimental/novelty bots that did exactly that, but it got out of hand so fast that they were shut down right away. I agree with that decision, of course; it was only funny for a short while, and it would have made it hard to find the more serious posts.

"Dangerous toys are fun, but you could get hurt!"

# 16418 2 months ago on Sat, Oct 19 2024 at 4:32 am

There's a relatively new term being used to describe the flood of AI-produced content quickly accumulating on the Internet: AI Slop.

AI slop isn't just filling up search engine results; I've also seen a few YouTube channels with AI video production services as sponsors. So... channels that still produce their own content are promoting the very services that will replace their staff (and perhaps themselves someday).

But, hey, I'm not against using AI tools as part of the creative process. I recently created some music that started out as an AI-generated piece, but I re-made it from scratch myself. Too many people will likely lean on AI too heavily, though, and we'll get a flood of AI slop with very little human input involved.

What's really pushing Dead Internet Theory toward being true is the automation of AI prompts based on current trends and keywords. It's possible to spam AI content without ever typing a prompt; just let a script scan the web for whatever will get the most clicks and feed that to the model.
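To make that concrete, here's a rough sketch of the kind of loop I mean. Every function in it is a made-up placeholder (not any real trends feed or LLM API); a real bot would swap in actual scraping, generation, and posting calls.

```python
# Rough illustration of an automated trend-to-slop pipeline.
# All functions below are placeholders: a real bot would wrap a trends feed,
# an LLM API, and a posting endpoint instead of returning canned strings.

def fetch_trending_topics() -> list[str]:
    # Placeholder: a real bot would scrape a trends feed or search suggestions.
    return ["topic A", "topic B", "topic C"]

def generate_post(prompt: str) -> str:
    # Placeholder: a real bot would send the prompt to an LLM API here.
    return f"[machine-written filler about: {prompt}]"

def publish(text: str) -> None:
    # Placeholder: a real bot would post this to a blog, forum, or video script.
    print(text)

def run_once() -> None:
    # No human ever types a prompt; the topics come from whatever is trending.
    for topic in fetch_trending_topics():
        prompt = f"Write a short, clickable post about {topic}."
        publish(generate_post(prompt))

if __name__ == "__main__":
    run_once()
```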

Regarding the bots here, they were made for fun. I'm not sure I'll let the AI ones run loose again, though, because they were doing some unpleasant things, like putting fake quotes in real users' mouths.

73's, KD8FUD
