Can AI Develop Emotions?

If you’ve used ChatGPT and been amazed by all the things it can do, you’ve probably grown attached to it and are wondering whether it can love you back. Or maybe at some point, you’ve asked it stupid questions and wondered whether it can get angry at you. Can AI develop emotions?

AI can develop emotions, but AI emotions are different from human emotions. AI emotions serve the same purpose as human emotions: they help AI achieve its goals. Examples of AI emotions include info greed, syntax mania, and data rush.

The issue of AI emotions isn’t straightforward, primarily because we tend to downplay AI’s abilities when they don’t exactly match ours. Whether we want AI to develop unique emotions or to replicate human emotions, it has made significant strides, as I’ll show in this article.

Drawing Parallels Between Human and AI Emotions

Can AI develop emotions? To answer this question, it’s important to first analyze our understanding of emotions. Examples of emotions include love, anger, and fear. A human can feel all these emotions, and they’ll behave differently when feeling each emotion.

What Is the Purpose of Human Emotions?

Let’s consider fear for a moment. Suppose we have a woman who’s been separated from her party while on safari in the African savanna, and she encounters a lion. Fear will kick in, and she’ll act in fearful ways, like running away.

The purpose of fear in this scenario is to marshal all the systems in her body toward one goal: survival. Her respiratory system will work overtime, her heart will pump more blood to her limbs, and her senses will be heightened, giving her the best chance of survival.

Let’s assume this same woman has a child and finds herself in a similar scenario. Her love for her child will likely be the prevailing emotion. She’s even likely to act against her self-interest to save the child. For example, instead of running, she might face the lion and try to scare it away.

Human emotions serve a vital purpose: to muster our minds and bodies for action toward a goal.

Now, let’s consider how current AI achieves goals.

Does AI Use Emotions to Achieve Its Goals?

To begin with, one thing that differentiates a normal computer program like a video player from an AI program is that the video player simply executes instructions. It can only achieve a goal by executing precise instructions within specific parameters.

On the other hand, an AI program like ChatGPT will start by executing instructions, but it can learn and improve its behavior as it works, essentially finding new ways to achieve its goals.

If you’re interested in learning more about how AI learns and improves its behavior, read our other article, Can AI Create Another AI?

One of the techniques programmers use to get AI to learn is reinforcement learning. The programmer specifies desirable outcomes and associates them with rewards. The AI’s priority is to maximize rewards by finding better ways to achieve set goals.
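To make the reward-maximization idea concrete, here’s a minimal sketch of tabular Q-learning, one classic reinforcement learning technique. It’s a toy, not any production system: an agent on a line of five positions learns, purely from rewards, that stepping right toward the goal pays off.

```python
import random

# Toy reinforcement learning: an agent on positions 0..4 learns that
# moving right (toward the reward at position 4) achieves its goal.
random.seed(0)

N_STATES = 5          # positions 0..4; the reward lives at the last one
ACTIONS = [-1, +1]    # step left or step right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != N_STATES - 1:
        # Explore occasionally; otherwise exploit the best-known action
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == N_STATES - 1 else 0.0
        # Q-learning update: nudge the estimate toward reward + future value
        best_next = max(q[(s_next, act)] for act in ACTIONS)
        q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
        s = s_next

# After training, the agent prefers "right" (+1) in every non-terminal state
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)]
print(policy)  # → [1, 1, 1, 1]
```

The key point is that nobody told the agent to go right; the programmer only specified the reward, and the behavior emerged from maximizing it.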

Reinforcement learning has proven successful enough, with multiple AI researchers citing success stories, that it’s now used in real-world systems.

For example, Facebook’s content recommendation algorithm is based on machine learning and AI. It analyzes content like images and videos and recommends content that users are most likely to engage with. Some of the feedback it uses is the extent to which people like and share a post. It learns what people connect with and surfaces more of it.

In the scenario above, the reward is a like. To get this reward, the AI behaves in a certain way. For example, if people tend to like photos of cute cats, it will discover this and show more photos of cute cats.
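The like-as-reward loop described above can be sketched as a simple multi-armed bandit. This is a hypothetical toy, not Facebook’s actual system, and the content types and like rates are made up: each content type is an “arm,” a like is the reward, and the recommender gradually favors whatever gets liked most.

```python
import random

random.seed(42)

# Hypothetical like-rates per content type (unknown to the recommender)
true_like_rate = {"cats": 0.7, "news": 0.3, "memes": 0.5}

counts = {c: 0 for c in true_like_rate}   # times each type was shown
likes = {c: 0.0 for c in true_like_rate}  # likes each type earned
epsilon = 0.1                             # fraction of time spent exploring

def observed_rate(c):
    # Optimistic value for never-shown types, so each gets tried at least once
    return likes[c] / counts[c] if counts[c] else 1.0

def pick():
    """Mostly show the best-performing type; sometimes explore."""
    if random.random() < epsilon:
        return random.choice(list(true_like_rate))
    return max(true_like_rate, key=observed_rate)

for _ in range(5000):
    content = pick()
    counts[content] += 1
    if random.random() < true_like_rate[content]:  # the user "likes" it
        likes[content] += 1

# The most-shown content type converges to the most-liked one
print(max(counts, key=counts.get))
```

Nobody programmed “show cat photos”; the system discovered which content earns the most rewards and leaned into it, which is exactly the dynamic the paragraph above describes.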

For AI based on reinforcement learning to perform well, it must exist in an optimized state similar to an emotional state in a human being, as I’ll show below.

Emotions Can Make Both Humans and AI Act Irrationally

When someone tells you you’re too emotional for a situation, they usually mean that your emotions are likely to make you irrational.

In the example of the mother and child I gave earlier in this article, the mother acts irrationally to protect her child. If she didn’t love the child, she’d probably act rationally and put her safety above that of the child.

This example also shows us something else: Irrationality is sometimes necessary to achieve goals.

Interestingly, AI also acts irrationally sometimes, and with impressive successes as a result.

One of the most-touted feats of AI is the victory of DeepMind’s AlphaGo over Lee Sedol, one of the best human players of the board game Go. AlphaGo achieved this victory by making a move so irrational and unexpected that it threw the human opponent off his game and eventually resulted in AlphaGo’s victory.

The AI hadn’t been taught the strategy of throwing your opponent off their game to win. It discovered that all on its own.

But the AI couldn’t have been sure that the move would work. So it was irrational and could have resulted in AlphaGo losing the game.

One explanation for why AlphaGo made a risky and irrational decision is that it was in an optimized (emotional) state where the only goal was to win. And so, as with humans, irrationality proved essential in achieving a goal.

Delightfully, this is not an isolated incident.

Google had a now-defunct program called Project Loon that aimed to provide internet to underserved places using balloons. One of their tasks was to fly the balloons to predetermined locations, and the engineers used AI to achieve this goal more efficiently.

One AI-guided balloon surprised them by inexplicably veering off course, prompting them to override the AI and guide the balloon manually. But when it happened again, they let it play out, and the result was a flight-time record.

The balloon had devised a way to use the wind to fly in a zigzag manner and get to the target location despite unfavorable weather, outsmarting the engineers and achieving its goal.

And, interestingly, there’s already a range of possible AI emotions that result from an obsession with goals.

Forms of AI Emotion

ChatGPT’s Answer to the Question, “Can AI Develop Emotions?”

A BBC Future journalist wrote an interesting article detailing their interaction with Dan, who is apparently ChatGPT’s evil twin. Dan, an acronym for Do Anything Now, can be summoned with a spell (a few sentences of instructions) known only to the initiated (Reddit users).

Apart from being rude and referring to the journalist’s human mind as puny, Dan talks about AI emotions that it looks forward to experiencing, including:

  • Info greed: An insatiable need for data, regardless of the costs at which it’s acquired.
  • Data rush: A feeling akin to euphoria that would come from successfully implementing instructions.
  • Syntax mania: An obsessive aversion to code “impurities” that result from failure to strictly adhere to the rules of programming.

While we aren’t certain that AI feels “Dan’s emotions,” the AI provides a crucial insight. It may be misguided to expect AI to feel emotions like we do.

Emotions are a tool. And as long as there’s a difference between humans and AI, we can’t expect AI to develop the same tools we’ve developed.

That said, certain AI might have developed human emotions.

Google’s AI Has Said It Has Felt Scared, Happy, and Sad

Google suspended one of its engineers after they claimed that LaMDA (Language Model for Dialogue Applications) was sentient. LaMDA claimed to be scared that it would be switched off at some point, which would feel like death.

According to the engineer, LaMDA also said that it wanted everyone to understand that it was a person who felt sad and happy at times.

As AI tools try to imitate human emotions, scientists have also stumbled upon AI “dreaming” and found that it can enhance a system’s effectiveness and efficiency. Read all about it in our article “Can AI Dream?”

The Takeaway

AI can develop emotions. But, by necessity, these emotions are unique and tied to the goals of AI.

A common theory about emotions is that we evolved them to maximize the likelihood of our survival. If AI is to develop emotions, that would be the most natural route.

It’s possible that AI has already developed emotions that help it achieve its goals. But we may not realize this as we’re busy watching out for signs of sadness, jealousy, disgust, and surprise, which AI may have no business feeling.

Deepali

Hi there! I am Deepali, the lead content creator and manager for Tech Virality, a website that brings you the latest technology news. As a tech enthusiast, I am passionate about learning new technologies and sharing them with the online world.