Dystopia and fear of AI

Between stimulus and response there is a space. In that space is our power to choose our response. In our response lies our growth and our freedom.

Viktor E. Frankl

So, I read a book by this guy Mo Gawdat, Scary Smart. He was a bigwig at Google X, and he wrote the book about AI and the future. In my opinion, it was complete garbage. A load of crap.

I think the reason I didn’t want to post this is that it’s negative, and I don’t want to be negative when I write. But I’ve got to call this out. There are a bunch of books out about the coming AI revolution and how we have to adapt to the AI.

AI, and specifically progress in LLMs, is inevitable. But the book Scary Smart doesn’t understand the true nature of AI. Gawdat gives magic metacognition to the “AI.” He points out that the future does not have to be a dystopian Terminator future. The big conclusion the author wants us to accept is that by loving the AI as if they were our children, we avoid the bad outcomes.

Mo Gawdat also wrote a book about happiness, which was his big book. He came to these conclusions after losing a child. Having also been through the grieving process of losing a child, and having worked with parents who have lost children for years, I kind of understand what he’s saying. But when he tries to take grief and love of a child and apply it to AI, he is totally off base.

He tries to paint a picture of how the world looks with AI as a good thing, and also how the world looks with AI as a bad thing. The extended metaphor is a picture of people sitting around a campfire in the future, talking about what happened in the AI revolution. Either they’re sitting there because technology collapsed, the world is at an end, and everything’s gone right down the toilet. Or they’re just camping because they have nothing but leisure time, and they camp to enjoy time with their family and nature.

Both scenarios are because of AI. I think he misses the whole point of AI and automation. AI can cause a dystopian future even if it becomes all the good things the tech people think it should be. That outcome is a dystopian future just as much as a Terminator future, where the AI tries to kill us.

My fear of AI is a dystopian one. Why? Because a perfect AI will know what’s best for us. Like the smothering mother, we won’t have a choice in our future; therefore, dystopia.

I think he misses what dystopia means. So, let’s look at dystopian novels like Brave New World or 1984, the classic examples of dystopia. The Giver is another one. There are a lot of books in the genre. The Hunger Games series by Suzanne Collins has spawned movies. The Divergent series and The Maze Runner also have movies.

Anyway, in the Divergent novels, each person in society has a “type.” One girl is a combination of types, and that’s not accepted by the society. You get one type, chosen for you; being a combo is bad according to the officials.

In The Giver, everything people require to survive is provided. They have all their material needs covered. They’re not hungry or starving; it’s not some dark future.

Having material needs provided is not the point. The dystopian part of these societies is a lack of choice. You don’t get to choose your own future; it’s chosen for you by society. That’s the dystopian part. Maybe that is why the books appeal to young adults. The big question is: do you take the role assigned by parents, society, and peers, or do you become different, unique, a full individual?

Yes, humans make mistakes. Humans become drug addicts, alcoholics, gamblers, and all the bad things. We have bad relationships and make bad decisions. But all of that, we can at least believe, comes from our own decisions. Even if life comes out badly, as an individual, you made that decision. In a dystopia, society is going to decide that for you. It’s all decided. You don’t get to make those decisions, whether they’re good for you or bad for you.

In the AI-perfect vision of the future, let’s say you have a smartwatch that monitors your pulse, breathing, oxygen levels, and all your activity. You have the watch connected to a smartphone, and you have an app. AI lets you take pictures of what you eat and tells you if you’re eating right or eating wrong. Get more exercise because you had pie after dinner. Go walk and get your steps. I have the watch, and yes, I do all of the above.

Well, it’s good for you. Now, think about this: if you’re a kid who lives in a rich neighborhood and you have access to this technology, you take pictures and the AI app says, “Hey, greasy cheeseburgers and french fries are not the way to go. You need to eat a salad.” Just do what the AI says, and boom, you’re not going to be part of the obesity epidemic. You’re not going to be one of these people with a BMI of 40 or 50 at 16 years old and 5’7″. All this health is because you did what the AI said. It says, “work out.” You work out. You’re fit, you’re healthy. It says you need to go to bed early. You go to bed early.

You feel better. It says “study more,” okay? Study more. You have an AI assistant to help you study. You want to go to college? The AI assistant is there to help you. You’re struggling with a subject? You can ask the AI assistant questions. It’s infinitely patient. AI will answer your questions, and you don’t have to feel embarrassed.

AI assesses your strengths. AI decides what college you should go to. Your AI interfaces with the other AIs and says, “Hey, here’s a nice person you should date,” because you’re using the AI app that looks at all of your history for the last 15 years.

You’ll get along because you’re both lactose intolerant, have a similar family background, about the same age, income bracket and education. There’s no milk in the house. Isn’t that beautiful?

How much of that was actually your choice? How much of that came from the AI? Suddenly, without even realizing it, we are in a dystopian future where we don’t really make choices. We work for the computers.

If any of you are old like me: you go to the gas station for a coffee, and the teenager behind the counter says “$2.53,” so you give them a couple of dollars. You say, “Oh wait, I have three pennies,” and they look at you like you just grew horns. They type it into the machine, and it turns out the change is exactly 50 cents, and their mind is freaking blown. Completely blown. You understand what I’m talking about? We already work for the computer. Have you gone into McDonald’s and ordered from the kiosk? We already work for the computer. Can’t get an appointment without the app? Can’t send a message to the doctor longer than 500 characters? Computers don’t work for us.
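For the record, the register math that blew the kid’s mind is one subtraction. Here’s a sketch in Python (amounts in cents, so the float gremlins stay out of it; the exact bills handed over are my assumption):

```python
price = 253            # the coffee: $2.53
paid = 3 * 100 + 3     # three dollar bills plus those three pennies
change = paid - price  # the part the register does for us now
print(f"change: {change} cents")  # change: 50 cents -- exactly two quarters
```

That’s the whole computation the machine has taken off our hands.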

This is the fear I have of a dystopian future. There are other dystopias, not just the Terminator model.

The reason that loving all the computers doesn’t work is that Mo doesn’t understand the point of automation. If you build an AI system to do image recognition and control a machine gun, that AI/video/control system will look out across the field and kill people wearing the enemy’s uniform. The AI doesn’t get sick of killing. AI doesn’t wonder what this war is all about. It never read All Quiet on the Western Front, and it doesn’t miss home. It just kills. How do you love that?

That’s the whole point of the Terminator. He’s unfailing, he’s uncaring, he is there to kill you. He’ll be back. I’ll be back.

Honestly, I thought he had a good premise at the start of the book: two sides to the dystopia. But then he says we just love them, like they are our children, and everything will be air freshener and rainbows. How naive. It’s an arms race, unfortunately.

And the tools are close at hand. This isn’t nuclear physics. You can run open-source LLM models on the machine you have now. Lenovo is advertising a laptop with AI included. I don’t know what that means.

I’m using AI to create this blog post. I’m speaking with my voice, and AI translates the words to text, into a real post that can be published on this blog so you can read it with your eyes. And maybe, if you’re lazy or busy, you just point your AI reader at it; it reads it to you, adds inflections, and makes it sound like I’m reading. The AI will read with passion and expression, and you can even give me an English accent or a funny voice.

That’s all AI.

I can’t accept the belief that we (humans, I guess) just have to love the AI and suddenly it will be a happy world, with no problems or dystopian futures or kids who rebel and throw rocks at the window, because at least they chose to throw the rock. That attitude is, quite frankly, naive and silly.

And honestly, I think he just wrote the love-the-AI stuff to sell the book. It makes no sense. I do think there’s a bubble. I do think Nvidia is overvalued. I do wonder why AMD keeps getting beat up in its stock price even though it has products that compete with Nvidia’s. I believe it’s a case of investors not understanding what’s going on in this new world and what it really means. People write books like this, and investors read them, but it is just junk science.

You can love AI and try to make it happy, put some kind of metacognition on it, a humanity that it doesn’t have. The machine gun is still going to kill people as long as it can kill people. What it sees are soldiers that look a certain way, and if one of your soldiers happens to be the wrong color, is too dark, or has something on their face or their uniform, it’s going to kill them too. That’s the future we’re looking at.

You can’t stop it, and you can’t turn it off. But you can copy the same code to 1,000,000 AI guns. Intelligence at the edge is the buzzword.

The LLM revolution is here. I don’t believe there will be a general machine intelligence; we humans are good at moving the goalposts, so the definition will always be out of reach. There will be specialized models for everything, and in many cases they can already outperform human experts. Chess, Go, and radiology are the usual examples.

Expressing love for a machine that pattern-matches X-rays makes no sense. Skip the book. There are better books on AI out there.

