AI / ML – history, evolution, application and career prospects

Artificial Intelligence and Machine Learning (AI / ML) is a field of modern computing that has seen a meteoric rise over the past decade. From being just another piece of jargon in robotic sci-fi movies, AI / ML now finds application in areas that cut across all kinds of businesses. Growing demand for AI / ML professionals has inevitably led to more job options and more opportunities for start-ups.

So, what is AI / ML? What does it take to make a career in this domain? What is the secret mantra to concoct an idea for a high-performing AI / ML start-up? Read on to understand the evolution of AI / ML and how it can benefit you.

History of AI / ML

It all began during World War II. By 1941, the German military was advancing almost unchallenged, bombing areas deep within the English heartland. The Allied forces were staring at imminent defeat at the hands of the German blitzkrieg. At the core of German success was the ‘Enigma’ machine – a device that encrypted messages so thoroughly that no human could decipher them, even when intercepted. No human, until mathematician Alan Turing and his colleagues at Bletchley Park designed a machine to break the Enigma code. With the help of Turing’s machine, intelligence teams were able to predict the enemy’s moves far more accurately. Eventually, by 1945, the Allies had turned the tables on the Axis powers and won the war.

Turing continued his research after the war and went on to propose the famous ‘Turing test’. This simple test remains a touchstone for machine intelligence as we know it today. The Turing test is conducted by a human ‘judge’ who compares answers to a given problem from a machine and from another human. The machine passes the test if the judge is unable to distinguish the human’s answers from the machine’s.

Alan Turing argued that logic-based machines could display human-like intelligence. His seminal work changed our perception of algorithms forever.

Alan Turing
Statue of Alan Turing at Bletchley Park (Photo credit : Tim Ellis on Flickr)

Turing’s findings inspired many other researchers – mostly in cryptology and science institutes. Soon, John McCarthy, an American computer scientist, proposed a conference bringing together the machine learning intellectuals of his time. At that 1956 conference at Dartmouth College, a small group of mathematicians and scientists laid the foundations of Artificial Intelligence – and coined the term itself. McCarthy went on to be known as the father of AI.

Are AI and ML different?

Although often used interchangeably, AI is different from ML. Machine Learning focusses on mathematical modelling and pattern recognition to perform specific tasks (such as breaking military code). For instance, the self-driving module of a car is an example of an ML-based system.

Artificial Intelligence, on the other hand, leverages the power of ML to mimic human cognitive abilities and solve more complex problems. For instance, AI-based voice assistants like Alexa, Bixby or Siri on your phone can perform multiple tasks such as voice recognition, song recommendation, text to speech and appointment handling – just to name a few. While AI systems are like all-rounders, ML systems are often designed to be specialists.

What made AI / ML mainstream?

You may be wondering why demand for AI / ML has risen only recently, despite the field having been researched for more than seventy years. Well, until the 1980s, AI / ML was confined to the laboratories of educational and research institutes. Owning a personal computer was a luxury back then. But in 1981, the IBM PC was launched, and this changed everything. For the first time it was possible to run complex algorithms with millions of instructions on affordable machines.

Telecom revolution

Within the first decade of the 21st century, the telecommunication revolution connected millions of people with one another. Riding the social media wave, Web 2.0 replaced static websites with user-generated content. Over the past decade, billions of IoT devices (‘Internet of Things’-enabled devices such as smart lights, smart meters, etc.) got connected to the web and began recording trillions of data points. Hardware manufacturers went into a frenzied race to pack higher computing power into smaller spaces with lower price tags. Large tech companies leveraged the telecommunication boom and advanced cloud computing as a scalable and affordable alternative to owning server hardware. It is no coincidence that the three largest cloud computing platforms are owned by Amazon, Google, and Microsoft. It’s all related to AI / ML.

Your phone has many thousands of times more computing power than the Apollo Guidance Computer which carried man to the moon in 1969.

IoT and big data

Without even realizing it, we generate enormous amounts of data on apps, websites, and social media platforms. Every click, every conversation, every transaction that you make on your device gets recorded somewhere on the cloud. Similarly, IoT devices continuously stream data from sensors and cameras to the cloud. The sheer volume of data came with its own set of challenges – How to analyse this enormous data? How to visualize the data? And most importantly, how to use this data to make better business decisions?

As a result, ‘Big Data Analytics’ was born. New visualization tools such as Power BI and Tableau were created. But nothing outshone the benefits that AI / ML had to offer.


How does AI / ML work?

A machine learning algorithm identifies patterns in data through statistical analysis. It is much like how children learn the alphabet. You show a child pictures of ‘A for Apple’ and ‘B for Ball’, adding verbal cues for each letter. In time, the child associates each letter with specific sounds. Eventually the child can correctly recite the alphabet without any help. Similarly, a machine can be programmed (usually in Python) to identify patterns in training data and form a statistical model of the output. This model is then validated against a separate ‘test data set’. If the model holds good, it can be deployed to microcontrollers (mainly for IoT devices) and / or apps. As more and more data flows in, the model gets better and better at recognizing statistical patterns and predicting the output.
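The train-then-test workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not a production recipe – it assumes the scikit-learn library is installed and uses one of its small built-in datasets as stand-in training data:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Load a small built-in dataset as stand-in training data
X, y = load_iris(return_X_y=True)

# Hold out a separate 'test data set' to validate the model later
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit a simple statistical model to the training data
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Check whether the model 'holds good' on data it has never seen
accuracy = model.score(X_test, y_test)
```

If the accuracy on the held-out test set is satisfactory, the model is ready to be deployed; if not, the practitioner revisits the features, the model choice, or the training data.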

For instance, take your YouTube feed. As you watch more and more content on YouTube, the backend algorithm gets better at predicting your choices and recommending relevant content. The input variables can be the type of videos watched, watch duration and engagement (likes, comments, etc.). These input variables are fed into the statistical model, which pinpoints the most relevant content from millions of videos. The output of a machine learning workflow is a trained model, usually serialized to a file – in Python, often via the ‘pickle’ format. This is just a simplified explanation, and if the curious reader wishes to dig deeper, the author recommends getting in touch for a guided tour of the AI / ML world.
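To make the ‘pickle’ idea concrete, here is a hedged sketch of how a trained model is serialized and later restored for use inside an app. The model here is a stand-in (a simple classifier on a built-in dataset, assuming scikit-learn is installed), not a real recommender:

```python
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a simple model (a stand-in for a real recommendation model)
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Serialize ('pickle') the trained model to bytes;
# in practice this is written to a file and shipped with the app
blob = pickle.dumps(model)

# Later, the app loads the model back and makes predictions
restored = pickle.loads(blob)
prediction = restored.predict(X[:1])
```

The restored model behaves exactly like the original, which is what lets a model trained once on the cloud be reused across apps and devices.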

Opportunities for employment in AI / ML

AI / ML practitioners typically start their careers as Data Analysts or Junior Data Scientists. Data Scientist profiles are currently selling like hot cakes, as starting salaries for fresh candidates are usually upwards of INR 6 Lacs per annum (Bangalore / Delhi locations). Click on one such advertisement, or even mention machine learning in your online searches, and expect to be bombarded with ads from coaching institutes promising to make you a Data Scientist in weeks. Unsuspecting jobseekers often succumb to the prospect of that magical six-figure salary – frequently without a deeper understanding of what skills they need to acquire and what the coaching institutes actually offer.

There is a wide spectrum of institutes offering AI / ML courses – from the IIMs right down to the neighbourhood tuition centre. I would advise the job-seeking reader to evaluate freemium content on popular online learning platforms before making any serious investment of time or money.

Data Scientists are indeed in high demand. However, good ones are in short supply. Quick learners can make as much as INR 20 Lacs in the 3 – 5-year experience range. (Salary data was current at the time of writing.)

Data Scientists must possess three main skillsets: statistical thinking, an understanding of business logic and command over the Python programming language. The job of a Data Scientist is first to understand the business scenario, then to prepare the statistical model, and finally to implement that model in software code. Therefore, you should ideally pick a course which teaches all these aspects in equal measure.

Opportunities for start-ups

According to NASSCOM there are more than 3,000 AI / ML-based start-ups in India (as of August 2022). Some of these start-ups work in the conversational AI space and provide chatbot-based applications. Others provide ML-based predictions to solve common customer problems. However, there are still loads of unexplored spaces where start-ups can capitalize on AI / ML to help improve businesses and people’s lives.

One such area is image recognition. Recently, a tech start-up used image recognition to monitor adherence to the norms of the government’s sanitation drive. Their machine learning model identifies standard toilet layouts, with washbasin and sanitaryware sizes as per government specifications, and can also classify clean versus dirty toilets. This helps the government monitor the progress of toilet construction under ‘Swachh Bharat’, and to incentivize or penalize the contractors responsible for the maintenance and upkeep of those toilets.

Another innovative start-up helps companies screen candidates based on their facial expressions during interviews. As candidates appear for an online interview, the algorithm rates their confidence level through expressions, vocal modulation, and the quality of their answers. This helps companies shortlist the best candidates from thousands of job applicants – all without human intervention.

By the way, you should read my article on designing an innovative start-up here.


The field of AI / ML has achieved only a fraction of its true potential, and there is far more ground to cover. The field is expected to evolve much faster as 5G gets rolled out over the next couple of years. 5G is expected to offer up to 20 times the speed of 4G, opening up possibilities for far richer online experiences. There is also potential to marry AI / ML with the newest technological advances, such as Augmented Reality or Extended Reality, to create something unique. In my opinion, the key to building a career in AI / ML is to build strong concepts in statistics even before learning to write Python code – which, trust me, is the easiest part.
