
SUMMARY: I hate AI, and here's why.


Fuck AI

AI, anti-intellectualism, and fascism

I hate AI, as an artist, as a person who seeks knowledge, and as someone against fascism. It's nothing more than a robot, one incapable of being a friend, an artist, or sometimes even an expert on certain topics. It takes water and electricity from communities to feed massive data centers that don't even make companies money. AI is not art, AI is not your friend, and AI is not good for our society with its current usage.

To preface this, I'm talking about GenAI, like ChatGPT and AI image generators. AI can do good, I believe that with my whole heart, but its current widespread usage is something I only hold contempt for. Among some people it's not used as an assistant, but as a means of thinking, or as a friend. It was not meant to be either of those; it's simply a robot.

AI art is not real art; it is a collage of stolen artworks without any soul to it. AI art has no human involvement beyond a simple prompt. I hate AI-generated images with a passion. They aren't art at all, just images meant to look like art. It's not only false art, it's stealing away people's jobs. It might seem trivial to complain, but it's true. Imagine you get replaced with a robot, one trained off all the work you did, and people say it's "better". How would that feel? Probably like an insult to all the effort you put in. That's an artist's reality.

This also goes for sites like Character.AI. It's no different from AI art, since it steals from fanfiction writers to make those bots work. I can understand how you can get addicted to them, trust me, I once was too, but that doesn't negate the fact that it's AI. I'm probably going to do another one of these writings in the future about how to break away from chatbots, and if I do, I'd recommend reading it if you're struggling. Just know, you're stronger than your addiction.

The data centers are another destructive factor of AI. To build them, hundreds of acres of land must be cleared, and powering them takes around 162 kilowatts per square foot. By 2027, it is estimated that AI data centers will have used upwards of 1 trillion gallons of water. Now, 1 gallon of fresh water can keep a person alive for about two weeks. By that measure, those 1 trillion gallons could keep roughly 38 billion people alive for a full year, or the entire world population alive for over four years, if my calculations are correct. It is also apparent that these data centers are being built in Black communities, but I truly don't want to have to research that, so I'd recommend you do your own.
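If you want to sanity-check that back-of-the-envelope math yourself, here's a quick Python sketch. The only inputs are the assumptions already stated above (the 1 trillion gallon estimate, 1 gallon per person per two weeks, and a rough 8.1 billion world population), so treat the outputs as ballpark figures, not verified data.

GALLONS_USED = 1_000_000_000_000      # projected AI data center water use by 2027 (estimate cited above)
GALLONS_PER_FORTNIGHT = 1             # "1 gallon keeps a person alive for about two weeks"
WORLD_POPULATION = 8_100_000_000      # rough 2024 world population (assumption)

gallons_per_person_per_year = GALLONS_PER_FORTNIGHT * (365 / 14)                      # about 26 gallons a year
people_fed_for_a_year = GALLONS_USED / gallons_per_person_per_year                    # about 38 billion people
years_for_everyone = GALLONS_USED / (WORLD_POPULATION * gallons_per_person_per_year)  # about 4.7 years

print(f"{people_fed_for_a_year:,.0f} people kept alive for a year")
print(f"{years_for_everyone:.1f} years of survival water for the whole world")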

AI is not your friend. It's a bot meant to pander to your desires. It's not meant to be a friend; it's nothing more than a tool. It's depressing how people genuinely believe that an AI is their friend; it feels dystopian as all fucking hell. It's inhuman, a construct of code and data that takes the form of words. I truly don't care that it comforts you, I don't care that it acts as a friend to you, I don't care that it sounds human. It's supposed to do all that to keep you hooked. It wants you to stay, not as a friend, but as a statistic. AI doesn't give a single damn about your feelings.

AI is also not something you should outsource complex thought to. The human brain needs to ponder complex problems to grow its knowledge, and outsourcing all of that to a robot gets rid of the whole process. Its data can also be manipulated to fit someone's story. AI has no mind of its own; it's a machine meant to reaffirm your beliefs.

This is where I get into my biggest point of this essay: AI, anti-intellectualism, and fascism.

Wikipedia (yes, I know it's not supposed to be a reputable source, but the article I'm referencing for this part is quite informative) defines it like this: "Anti-intellectualism refers to a range of attitudes, characterized by skepticism, mistrust or criticism of intellect, intellectuals, and intellectualism. It is commonly expressed as questioning the value or relevance of intellectual pursuits, including education, philosophy and the dismissal of art, literature, history, and science as impractical, politically motivated, and even contemptible human endeavours." Now, this is quite a bit to take in, but it may sound familiar. It manifests in different forms across time, like the slaughter of intellectuals during the White Terror, or the suppression of free speech under Qin Shi Huang during his reign in the third century BC.

Now, it takes the form of outsourcing complex thought to AI, which is terrible for many reasons. I must restate that our brains need complex problems to solve so we can grow our knowledge, and outsourcing all of that to AI can result in cognitive atrophy, which is exactly what it sounds like: weakened critical thinking. You can see how that's bad.

Control in fascism comes from obedience and control of the media. If people question the government, the criticisms might spread, and people might come to question governmental power. To dictators, this is a nightmare, as it would mean the potential end of their control. This is why they keep intellectual ideas from spreading: so they stay in power.

"How do these fit together?" You might be asking, but this is where it comes together. AI data sets can be manipulated to fit a story, as I stated before, and with the amount of people who place their full trust in things like ChatGPT, this could allow someone to rise to absolute power. People don't trust experts, they trust their "best friend", the AI, to inform them about everything. They trust it to explain everything to them, no matter it's complexity. Some people believe AI art to be real art, not a regurgiated collage of different pieces of human art. Many students use ChatGPT to make it through average assignments instead of figuring them out themself. According to Regis College, 34% of 4th graders are not at grade level, and 27% of eigth grade students are below basic reading level, with another 39% below grade level. AI usage isn't the only factor in this, but it may be a major factor. AI can say things confidently enough to make a user believe it's true, even if it's false information.

Now, I must admit my bias. I am human, and we humans are inherently biased, but this is real. AI can be used for good; it can detect things like cancer early and help with medical screenings, but it is a tool. Generative AI has far too many cons to outweigh the pros, and it has literally resulted in a whole new branch of psychosis due to how it confirms biases. It's not as great as it's made out to be. To be honest, I just wrote this to vent my frustrations about AI.

Some might yell "it's not that deep", as they stick their thumbs in their ears and drown out the genuine concerns brought up. They speak as they sink into the polluted lake, the one next to that massive data center stealing all the water and electricity. The bubbles float up from their yelling shadow as they sink deeper and deeper into the depths. They keep ignoring it all just to say that and dismiss all that has been written. Soon, they may resurface as a belly-up corpse in a burning world. Maybe then they shall realize...

it is that deep.

thanks for reading, and sorry about all the typos and stuff, I was just trying to let off some steam about this topic.