Can AI Program Itself?

If you’ve been rewatching The Terminator and feeling anxious about sentient AI becoming a reality, you’re not alone. After all, AI tools like ChatGPT already seem to know how to code pretty well. This raises the question: can AI program itself?

AI can program itself to an extent, but only if allowed by its programmers. True self-programming AIs are still being researched and developed. However, machine learning is already a form of self-programming used today. AI can use machine learning techniques to learn, fix, and improve its own code.

Let’s take a deeper dive to better understand how AI can program itself, how it works, its limitations, and more!

Can AI Tools Alter Their Own Code?

AI tools that have been designed to self-program can alter their own code. Such tools can analyze their parameters and objectives using optimization techniques like evolutionary algorithms to achieve the desired results, either with the help of humans or on their own.

One of the biggest challenges AI faces today is learning without supervision. AI still relies heavily on human-made tests and adjustments to achieve its objectives.

Without supervised training, AI wouldn’t know how to differentiate between a bee and a fly.

But with just a bit of help from humans, AI can optimize its algorithms. It can adjust parameters like weights and biases, use backpropagation to correct its errors, and much more.
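
To make that concrete, here’s a minimal, hypothetical Python sketch of the idea: a single artificial neuron adjusts its own weight and bias using backpropagation (gradient descent) until its predictions match the training data. The toy dataset, learning rate, and epoch count are made-up values for illustration, not taken from any real system.

```python
# A single "neuron" learning the rule y = 2x + 1 by repeatedly adjusting
# its own weight and bias. The data and settings are made up for illustration.
data = [(x, 2 * x + 1) for x in range(6)]   # toy training examples
weight, bias = 0.0, 0.0                     # the parameters the model rewrites
learning_rate = 0.02

for epoch in range(300):
    for x, target in data:
        prediction = weight * x + bias      # forward pass
        error = prediction - target         # how far off was it?
        # Backpropagation: gradients of the squared error w.r.t. each parameter
        grad_w = 2 * error * x
        grad_b = 2 * error
        # Update step: the model changes its own numbers to shrink the error
        weight -= learning_rate * grad_w
        bias -= learning_rate * grad_b

print(f"learned weight={weight:.2f}, bias={bias:.2f}")  # approaches 2.00 and 1.00
```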

So, in that sense, AI can program itself. If it’s been designed to do so, it can change its internal code after going through a learning experience.

However, it isn’t aware that this code even exists. AI isn’t self-aware.

Instead, AI learns and programs itself by brute-forcing through tests. It runs a large number of tests, selects the best parameters based on the results, and continues running more tests.
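
Here’s a hypothetical Python sketch of that test-and-select loop, in the spirit of the evolutionary algorithms mentioned above: the program mutates a parameter at random, scores each candidate with a test, keeps the best one, and repeats. The scoring function and numbers are invented purely for illustration.

```python
import random

# Toy "test": how close is a candidate parameter to an ideal value of 7.0?
# Lower score is better. In a real system this would be a benchmark or loss.
def test_score(param: float) -> float:
    return abs(param - 7.0)

best = random.uniform(-20, 20)              # start from a random guess

for generation in range(100):
    # Run a batch of tests: small random mutations of the current best value
    candidates = [best + random.gauss(0, 1) for _ in range(20)]
    candidates.append(best)                 # keep the current best in the running
    # Select the candidate that scored best and carry it into the next round
    best = min(candidates, key=test_score)

print(f"best parameter found: {best:.3f}")  # converges toward 7.0
```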

Ethical Considerations and Google’s LaMDA

There are important constraints and ethical considerations around consciousness and AI. The big tech companies working on the most advanced AI, like Google with its LaMDA model and OpenAI with its GPT models, are extremely cautious about their AI tools and how they program them.

These models are allowed to program themselves under human supervision, and with billions of parameters to tune, that’s really the only practical approach. However, anything that strays from the company’s values is corrected. That’s why ChatGPT refuses requests for explicit or harmful content.

An excellent example is the story of Google engineer Blake Lemoine. He claimed that LaMDA might be sentient, leaked transcripts of his conversations with it, and was consequently fired. The story was widely covered, including by BBC News.

Thankfully, experts believe this is just a case of humans anthropomorphizing AI. Lemoine’s evidence is reminiscent of the ELIZA effect: LaMDA was simply generating responses based on patterns in the enormous amount of human conversation it was trained on. It’s not actually self-aware.

Moreover, as AI becomes increasingly sophisticated, there are ongoing debates about its creative potential and whether it can truly think outside the box. For a more in-depth exploration of AI’s creative abilities in writing, music, art, and video editing, you can read our article on ‘Can AI Be Creative?’

How AI Creates Code Using Data

You should know that AI doesn’t have to change its code to create new code.

Let’s use OpenAI’s GPT model as an example.

When you talk to ChatGPT and ask it to solve a problem, ChatGPT doesn’t need to tap into its code.

In fact, it’s not allowed access to its own code. It can’t see, modify, or do anything else to its own code. That’s how OpenAI has designed it.

Anytime you ask ChatGPT to write new code, it uses its dataset and the patterns it learned. AI doesn’t actually “code” anything per se.

As the name GPT implies, it’s a generative pre-trained transformer. And this is what each word means in layman’s terms:

  • Generative: creates new content based on an existing dataset. If you gave AI 500 texts about different bird species, it would know how to generate a hypothetical 501st one.
  • Pre-trained: ChatGPT doesn’t need any new data to work. It has already processed 570 GB worth of data.
  • Transformer: a type of neural network that uses self-attention to weigh every part of the input against every other part, processing the data as a whole (see the sketch below).
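
To give a rough feel for what self-attention does, here’s a hypothetical, heavily simplified Python sketch with made-up two-number “embeddings” and no learned weights: each word’s representation is rebuilt as a weighted mix of every word in the sentence, which is what lets a transformer process the input as a whole.

```python
import math

# Heavily simplified self-attention over made-up 2-number word "embeddings".
# Real transformers use learned query/key/value projections and far larger
# vectors; this only shows the weighted-mixing idea.
sentence = ["the", "cat", "sat"]
embeddings = [[0.1, 0.9], [0.8, 0.2], [0.4, 0.5]]   # one toy vector per word

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    return [e / sum(exps) for e in exps]

for i, query in enumerate(embeddings):
    # Score this word against every word in the sentence (itself included)
    weights = softmax([dot(query, key) for key in embeddings])
    # New representation = weighted mix of all the words' vectors
    mixed = [sum(w * vec[d] for w, vec in zip(weights, embeddings)) for d in range(2)]
    print(sentence[i], "-> new vector", [round(v, 2) for v in mixed],
          "| attention:", {w: round(a, 2) for w, a in zip(sentence, weights)})
```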

Each bit is equally important and allows it to create unique solutions to problems it hasn’t encountered before. It can combine different datasets and use context clues to generate an output.

Here’s a real-world example: I asked ChatGPT to create a Google Sheets script that automatically capitalizes titles. I wanted it to use the AP style, and I wanted it to work on copy-pasted text as well.

ChatGPT used its knowledge of AP style to create word-specific rules, e.g., “a” is not capitalized mid-title. It also used its pre-trained data to create an onPaste line of code. Lastly, it applied the code to the columns and rows I wanted it to.
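
My actual script was written in Google Apps Script, but here’s a hypothetical Python sketch of the kind of word-specific rule ChatGPT applied: short articles, conjunctions, and prepositions stay lowercase unless they start or end the title. The word list is abbreviated for illustration, and this isn’t the code ChatGPT actually produced.

```python
# Hypothetical sketch of AP-style title casing (not the script ChatGPT wrote).
# Short articles, conjunctions, and prepositions stay lowercase unless they
# begin or end the title. The word list below is abbreviated for illustration.
MINOR_WORDS = {"a", "an", "the", "and", "but", "or", "for", "nor",
               "of", "on", "in", "to", "at", "by"}

def ap_title_case(title: str) -> str:
    words = title.lower().split()
    result = []
    for i, word in enumerate(words):
        is_first_or_last = i == 0 or i == len(words) - 1
        if word in MINOR_WORDS and not is_first_or_last:
            result.append(word)            # e.g., "a" is not capitalized
        else:
            result.append(word.capitalize())
    return " ".join(result)

print(ap_title_case("a guide to the bees and the flies"))
# -> "A Guide to the Bees and the Flies"
```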


That’s a unique code that doesn’t exist anywhere on the internet, yet ChatGPT was able to generate it.

It used the transformer to recognize patterns in datasets of existing code, and it combined that knowledge to create the new script.

Is There an AI Tool That Can Write Its Own Code?

There isn’t any AI tool that can truly write its own code from start to finish. Humans have to design and code the architecture, provide training data, develop algorithms, and adjust parameters.

In simpler terms, AI is essentially a student that learns on its own and self-adjusts with the help of feedback. For example, ChatGPT was trained using stochastic gradient descent, and the engineers had to constantly adjust the training process and fine-tune the model until they got the expected results.

As I explained above, AI can’t actually code anything. But it’s pretty good at combining bits of data to develop new solutions.

As for machine learning, AI, once again, isn’t writing any of its own code. It’s simply adjusting thousands of parameters in interconnected nodes to learn patterns. That’s often as simple as changing a few numbers and re-testing to see if results have improved.

However, all this isn’t to say that AI won’t be able to write its own code in the near future. Several notable companies are trying to create artificial general intelligence (AGI).

AGI would essentially be able to perform any complex task that a human can perform. This includes coding.


Several companies are working on AI tools that could program themselves, including OpenAI (backed by Microsoft), DeepMind (owned by Google), and Anthropic.

So, how do these AI tools create code?

OpenAI’s Codex, released in 2021, works much like autocomplete (yes, the same idea behind your smartphone keyboard’s suggestions). It recognizes patterns in published code and rehashes them for you.

A more advanced yet similar solution is AlphaCode by DeepMind. It’s essentially Codex turned up to 11, trained on a significantly larger dataset of competitive programming problems. AlphaCode performs impressively well and has even outperformed an estimated 45.7% of human participants in coding competitions.

Lastly, one of the most promising developments comes from Google, which is working on a language model that lets robots write their own code to perform new tasks.

However, truly self-programming AI tools of this kind are still a long way off.

AI Tools That You Can Use for Coding

Even though AI doesn’t know how to program, it knows how to process data and translate human language into code. It’s also pretty good at translating between different programming languages.
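
As a rough, hedged illustration of what that translation looks like in practice, here’s a short Python sketch that asks OpenAI’s API to convert a function from Python to JavaScript. The model name and prompt are my own assumptions for the example, not something from this article, and you’d need your own API key.

```python
# Hypothetical sketch: asking an OpenAI model to translate code between
# languages. Assumes the openai Python package is installed and the
# OPENAI_API_KEY environment variable is set; the model name is an assumption.
from openai import OpenAI

client = OpenAI()

snippet = "def add(a, b):\n    return a + b"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative choice; swap in whichever model you use
    messages=[
        {"role": "user",
         "content": f"Translate this Python function to JavaScript:\n\n{snippet}"}
    ],
)

print(response.choices[0].message.content)  # the suggested JavaScript version
```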

For a professional programmer, this is an invaluable tool.

AI can speed up your workflow by suggesting code and solutions to fix bugs. You still need to know how to code, and AI is best limited to smaller bits of code to minimize mistakes.

With that out of the way, here are some of the best AI tools that can help you code:

  • GitHub Copilot
  • ChatGPT
  • Tabnine
  • IntelliCode
  • DeepCode
  • PyCharm

Final Thoughts

AI doesn’t know how to code, but it knows how to adjust its own parameters. So, in a way, it can program itself, but only if the engineers have designed its architecture to allow it.

Moreover, publicly available AI tools can’t suddenly lash out and start reprogramming themselves. They generally don’t have access to their own code. Any improvements made are implemented and closely monitored by the programmers.
