Why is AI important? By Vijay Krishna Palepu

I had a very interesting conversation with Vijay (my son) on AI. The outcome is this article, written by him. The original article is @

I was about to put on “Coded Bias” on Netflix when Dad asked me a curious question: Why is AI important?

I could have answered by talking about the impact of AI/ML on our everyday lives, i.e., how it impacts everyday people, the users of technology and software.

But I opted to approach this question as a software engineer instead. Specifically, I approached this question as someone who takes deep interest in how software came to be, historically speaking.

Through a short-yet-rambling answer, which turned into an hours-long discussion between us, I told him that the whole point of programming was to implement an artificially intelligent machine; and so, AI is important because it is the whole reason why computer programming exists in the first place.

It was a long discussion.

And I accept that most people, including most software engineers and computer scientists, would not take that viewpoint. But that is where I am. I think AI is important because it was the whole point of modern computer programming. As such, the motivations behind making AI work likely shaped computer programming in its early stages. And AI/ML are certainly applying ‘evolutionary pressures’ (if you will) on how programming languages are designed and how programming is thought of.

And given the importance of programming in and of itself, it is important to view it through AI’s lens. If you humor me for a second …

I think of Turing as a pivotal figure and consider him to be the father of modern-day computers and programming. But he was really going after artificial intelligence. In 1950, Turing authored a paper titled “Computing Machinery and Intelligence” [].

It is right there in the title: he is not saying, “Computing Machinery and Finances” or “Computing Machinery and Art”. He is being pretty on the nose about it: he is specifically calling out “Intelligence” and how it may be relevant to computing itself.

He opens the paper with a simple question: “Can machines think?”

He reframes that question four sections later to: “Are there digital computers which would do well in the imitation game?”

The “imitation game” [] that Turing refers to here is popularly known today as the Turing Test []. Turing’s original definition of the imitation game is simply this: can there be an interrogator “C” who can tell “A” (a human) from “B” (a machine) by engaging both in a series of questions and answers? Or rather, can we create a “B” (a machine) that can come across as “A” (a human)?
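For concreteness, the shape of the game can be sketched in a few lines of Python. The canned replies and the “any answer differs” rule for the interrogator are my own simplifications for illustration, not something from Turing’s paper (though the two sample questions are his):

```python
def human_answer(question: str) -> str:
    # Stand-in for the human participant "A" (a canned reply, purely illustrative).
    return "I'd say it depends on the question."

def machine_answer(question: str) -> str:
    # Stand-in for the machine "B", trying to answer exactly as "A" would.
    return "I'd say it depends on the question."

def interrogator_can_tell(answer_a, answer_b, questions) -> bool:
    # Interrogator "C" wins if any answer lets it tell A apart from B.
    return any(answer_a(q) != answer_b(q) for q in questions)

questions = ["Please write me a sonnet.", "Add 34957 to 70764."]
print(interrogator_can_tell(human_answer, machine_answer, questions))  # prints False
```

Here the machine’s answers are indistinguishable from the human’s, so “C” cannot tell them apart and the machine passes; the moment any answer gives it away, it fails.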

He actually takes great pains to define a “digital computer”, but also does it in plain terms. He expects a digital computer to have a store (or memory), an executive unit, and a control unit (which would house a table of instructions, executed by the executive unit).

Sound familiar? It is 1950 and Turing has just defined the modern-day computer (mobile and desktop). But he is not done yet. He goes on to posit what it means to programme such a machine (note the spelling: he was a Brit).

He considers a program to be a table of instructions fed into the machine, i.e., the digital computer, which the computer will then execute. And the act of programming is to place those instructions in the computer’s control unit.
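Turing’s three-part machine can be mimicked with a toy interpreter. To be clear, the instruction set below (SET/ADD/OUT) is my own invention for illustration, not Turing’s; what matters is the shape: a store, a control unit walking a table of instructions, and an executive unit carrying each one out.

```python
def run(instruction_table):
    """A toy 'digital computer' in Turing's terms."""
    store = {}    # the store: named memory cells
    output = []
    for op, *args in instruction_table:   # the control unit walks the table
        # the executive unit carries out each instruction
        if op == "SET":
            cell, value = args
            store[cell] = value
        elif op == "ADD":
            target, a, b = args
            store[target] = store[a] + store[b]
        elif op == "OUT":
            output.append(store[args[0]])
    return output

# "Programming" is then simply writing the table of instructions:
program = [
    ("SET", "x", 2),
    ("SET", "y", 3),
    ("ADD", "z", "x", "y"),
    ("OUT", "z"),
]
print(run(program))  # prints [5]
```

Swap the table and the same machine computes something else entirely; that separation of fixed machinery from a replaceable instruction table is the whole idea.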

That is a very accurate and down-to-earth definition of programming in 2022, as laid out in a paper on intelligent machines, back in 1950, by a mathematician dreaming of machines that can think.

Back to the future in 2022: we still do not have general-purpose AI that can really think, reason, learn concepts, and exhibit creativity. So far, the answer to Turing’s original question is a qualified “no”.

But in his attempt to conjure a thinking machine, he essentially defined the modern-day computer and computer/software programming. Anyone avoiding those roots of programming and software engineering is not being honest about where programming comes from.

The idea of programming existed before Turing, and he points to Babbage’s Analytical Engine. But his definitions of computers and programming have endured for the last 72 years, and they show no signs of waning.

That vision of computers and programming may not yet have given rise to thinking machines, in the sense of sentient entities, but it has set humanity on a course of immense digitization and connectivity. And so, it becomes important to understand and appreciate that the way we write code (for programs such as a word processor, an online store, or a search engine) is primitively rooted in humanity’s desire to make machines think.

I dare to make this claim: every program we typically write is a deliberately failed attempt at creating a Turing Machine. (I am lucky that no one will read this, otherwise I would be taken apart by all corners of the internet 😅)

Even as I think about routine concepts in programming such as immutability, streams, and lambdas … it reminds me how such concepts have helped software engineers reason about large-scale/big-data/distributed systems … which have in turn been essential to creating AI/ML systems (albeit not general-purpose AI, yet).
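As a small illustration (in Python, with made-up data), those three concepts fit together naturally: immutable inputs plus lambda-built stream pipelines keep every step free of side effects, which is exactly what lets frameworks split such pipelines across many machines.

```python
from functools import reduce

readings = (3, 8, 1, 9, 4)  # a tuple: immutable, so safe to share across workers

# A "stream" pipeline assembled from lambdas; map and filter are lazy in Python,
# so nothing is computed until reduce pulls values through the pipeline.
doubled = map(lambda x: x * 2, readings)
large = filter(lambda x: x > 5, doubled)
total = reduce(lambda acc, x: acc + x, large, 0)

print(total)  # 6 + 16 + 18 + 8, prints 48
```

Because no step mutates shared state, the same map/filter/reduce shape scales from one process to a cluster, which is the style that big-data systems (and the ML pipelines built on them) lean on.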

So in closing, I contend that even before impacting our lives in the form of driverless cars or machines generating entire movie scripts, AI gave us the concepts of programming and programmable digital computers, which have been immensely consequential.

And even though we may not have achieved general-purpose AI just yet, humanity’s mere attempts at achieving that seemingly unattainable prize have restructured the way we live.

In how many other ways will we end up changing our world, in the pursuit of AI?

