The humble kitchen recipe remains the most relatable ancestor of the complex code running our world today.
The French Soufflé is a high-precision job: if you beat the egg whites until stiff peaks form, the structure holds; if you fail this logic gate, the entire process collapses. It is a rigid, sequential procedure where the order of operations is as critical as the ingredients themselves.
In contrast, the Hyderabadi Biryani represents a more complex operation. Instead of a single sequence, you run multiple sub-routines simultaneously — marinating meat while parboiling rice to exactly 70% completion. These separate data sets are then ‘merged’ in layers and sealed for the final ‘syncing’ of flavours.
While the soufflé relies on linear precision, the biryani relies on the layering of complexity and the interaction of diverse variables to reach its final output.

Despite their differences, both recipes share the unique property that English mathematician Ada Lovelace first recognised in the 1840s: they are language-independent sets of rules that transform specific inputs into a predictable result. Each set of such rules is called an algorithm.
Whether it is a French pastry, an Indian feast, or the code in your smartphone, an algorithm is simply the universal bridge that translates a human vision into a repeatable, physical reality.
Technology in our daily life
Here are some common daily experiences that highlight how deeply technology manages our modern world:
The Uninterrupted Morning: You wake up in a house that is already at your preferred temperature because the walls “knew” you were about to get out of bed. As you check your emails, your inbox is perfectly clean, with every piece of junk mail and potential phishing scam already diverted to a folder you will never have to open.
The Near-Miss Commute: While driving to work, your navigation system suddenly suggests a bizarre three-minute detour through a residential neighbourhood. You take it, only to find out later that a major accident occurred on the highway just moments before you would have reached that exact spot.
The Perfect Recommendation: After a long day, you open a streaming app feeling indecisive. The very first movie suggested is a niche documentary about a hobby you only recently started researching. It is exactly what you wanted to watch, saving you twenty minutes of mindless scrolling.
The Hospital Lifeline: A nurse in a busy ward receives an alert on her tablet. A patient’s vitals are still within “normal” ranges, but the monitoring system detects a subtle pattern in their heart rate and oxygen levels that suggests a respiratory failure is likely to occur within the next hour, allowing the medical team to intervene before the crisis begins.
All of these life-improving moments are made possible by algorithms. These invisible sets of rules and calculations process massive amounts of data in real-time to predict our needs, protect our safety, and simplify our decision-making.
What is an algorithm
An algorithm is a finite sequence of well-defined, unambiguous instructions designed to solve a specific problem or perform a computation. Executed step by step by a computer or a human, it produces a consistent output from given inputs. In computer science, an algorithm functions like a precise recipe, incorporating operations such as calculations, data processing, decision-making via conditionals, and repetitive loops, and it is guaranteed to terminate after a finite number of steps.
Algorithms are unique because they function as a universal bridge between abstract logic and physical action. While often associated with computers, they are fundamentally language-independent sets of rules that can be executed by humans, mechanical devices or biological systems (like the brain).
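The definition above can be made concrete with a small sketch: a hypothetical algorithm that finds the largest number in a list, showing the three core ingredients named earlier — a loop, a conditional, and guaranteed termination after a finite number of steps.

```python
def find_largest(numbers):
    """Return the largest value in a non-empty list.

    Illustrates the core ingredients of an algorithm:
    a loop (repetition), a conditional (decision-making),
    and termination after a finite number of steps.
    """
    largest = numbers[0]          # start with the first input
    for n in numbers[1:]:         # loop: visit every remaining input once
        if n > largest:           # conditional: is this value bigger?
            largest = n           # if so, remember it
    return largest                # the same input always yields this output

print(find_largest([3, 41, 7, 26]))  # prints 41
```

Note that the same rules could just as well be carried out by hand with pencil and paper — the procedure, not the machine, is the algorithm.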
Uses of algorithms
Algorithms power nearly every facet of modern technology, from search engines and social media feeds to autonomous vehicles and personalised medicine.
In machine learning, algorithms like neural networks drive AI systems for image recognition in self-driving cars, fraud detection in banking via random forests, and recommendation engines on platforms like Netflix using collaborative filtering. (A random forest is an ensemble machine learning method that combines the predictions of many decision trees; it is widely used for making accurate, robust predictions in applications such as risk assessment and fraud detection.)
Algorithms optimise logistics through route-planning models, enable real-time language translation in apps, and underpin cybersecurity by analysing threats with anomaly detection, making computation faster, more efficient, and scalable across industries like healthcare, finance and e-commerce.
Algorithms in our daily life
Algorithms shape modern living through efficient, step-by-step processes in apps, devices, and services we use daily. From navigation to entertainment, they optimise decisions behind the scenes.
Navigation Apps: Google Maps employs routing algorithms like Dijkstra’s to compute the fastest paths, factoring in real-time traffic and road conditions for efficient travel. These adjust dynamically to delays, saving time during commutes. (Dijkstra’s algorithm, published by Edsger Dijkstra in 1959, finds the shortest path from a single source node to every other node in a weighted graph.)
Recommendation Systems: Streaming platforms such as Netflix and Spotify use collaborative filtering algorithms to suggest content based on viewing or listening history and user ratings. E-commerce sites like Amazon apply similar methods to recommend products, boosting personalised shopping.
Social Media Feeds: Platforms like Instagram leverage algorithms to curate feeds by prioritising posts based on engagement patterns and user interests, keeping content relevant. This ensures tailored experiences that encourage longer interaction times.
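The shortest-path idea behind navigation apps can be sketched in a few lines. The sketch below is a minimal version of Dijkstra’s algorithm; the road network is a made-up toy example, not real map data.

```python
import heapq

def dijkstra(graph, source):
    """Shortest distance from `source` to every reachable node.

    `graph` maps each node to a list of (neighbour, edge_weight) pairs.
    A priority queue always expands the closest unsettled node first.
    """
    dist = {source: 0}
    queue = [(0, source)]                      # (distance so far, node)
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):   # stale queue entry, skip it
            continue
        for neighbour, weight in graph.get(node, []):
            new_d = d + weight
            if new_d < dist.get(neighbour, float("inf")):
                dist[neighbour] = new_d        # found a shorter route
                heapq.heappush(queue, (new_d, neighbour))
    return dist

# Toy road network: nodes are junctions, weights are travel minutes.
roads = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 5)],
    "B": [("D", 1)],
}
print(dijkstra(roads, "A"))  # shortest times from A, e.g. dist["D"] == 4
```

A real navigation system layers live traffic data and heuristics on top of this core idea, but the priority-queue structure is the same.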
Origin of algorithms
The term “algorithm” originates from the name of the 9th-century Persian mathematician Muhammad ibn Musa al-Khwarizmi, whose works on arithmetic and algebra were translated into Latin as “Algoritmi” around the 12th century, evolving into the modern word.
Ancient Roots:
- Step-by-step procedures for calculations appear in Babylonian clay tablets from around 1600 BCE, used for tasks like square roots and factorisation.
- Greek mathematicians like Euclid (c. 300 BCE) developed the Euclidean algorithm for finding greatest common divisors, while Eratosthenes created his Sieve for finding prime numbers around 200 BCE.
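Both of these ancient procedures are short enough to express directly in modern code. The sketch below shows Euclid’s method for the greatest common divisor and Eratosthenes’ sieve for listing primes.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero."""
    while b:
        a, b = b, a % b
    return a

def sieve(limit):
    """Sieve of Eratosthenes: cross off every multiple of each prime."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False     # multiples cannot be prime
    return [n for n, prime in enumerate(is_prime) if prime]

print(gcd(48, 18))   # prints 6
print(sieve(20))     # prints [2, 3, 5, 7, 11, 13, 17, 19]
```

That a 2,300-year-old procedure translates into four lines of working code is a good illustration of the language-independence discussed earlier.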
Medieval Advancements:
- Indian mathematicians Aryabhata (476 to 550 CE) and Brahmagupta (598 to 668 CE) formalised the use of zero as a number and developed a robust place-value decimal system, enabling complex arithmetic and laying the groundwork for modern computing. These ideas, transmitted globally via Arab scholars, fundamentally shaped the course of worldwide mathematical evolution.
- Around 820 CE, Al-Khwarizmi’s book “Al-Kitab al-Mukhtasar fi Hisab al-Jabr wal-Muqabala” (The Compendious Book on Calculation by Completion and Balancing), also known simply as Al-Jabr, systematised solving linear and quadratic equations, laying the foundation of algebra. A simple linear equation is 2x + 3 = 11, which has a single solution (x = 4); a simple quadratic equation is x² + 3x + 2 = 0, which can have up to two solutions (x = −1 and x = −2). Algebra uses symbols and letters to represent numbers and quantities in mathematical formulas and equations.
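The two worked examples above can be checked mechanically. The sketch below solves the linear case by rearranging the equation and the quadratic case with the standard quadratic formula; the function names are illustrative choices.

```python
import math

def solve_linear(a, b, c):
    """Solve a*x + b = c for x (assumes a != 0)."""
    return (c - b) / a

def solve_quadratic(a, b, c):
    """Real roots of a*x**2 + b*x + c = 0 via the quadratic formula."""
    disc = b * b - 4 * a * c       # the discriminant decides the root count
    if disc < 0:
        return []                  # no real roots
    root = math.sqrt(disc)
    return sorted({(-b - root) / (2 * a), (-b + root) / (2 * a)})

print(solve_linear(2, 3, 11))      # prints 4.0
print(solve_quadratic(1, 3, 2))    # prints [-2.0, -1.0]
```

Al-Khwarizmi worked these same cases by geometric “completion and balancing”; the formula the code uses is the modern symbolic descendant of that procedure.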
Modern Formalisation:
- Ada Lovelace is widely credited with the visionary insight that computers could recognise and manipulate symbols other than numbers. She was the first to explicitly articulate the transition from a “number-cruncher” to a general-purpose machine for manipulating symbols according to rules.
- In the 20th century, Alan Turing’s 1936 Turing machine provided a formal model of computation, defining algorithms as effective procedures. This bridged ancient methods to modern computing, enabling today’s digital applications.
- The Universal Turing Machine (UTM), a mathematical abstraction rather than a physical device, became the theoretical blueprint for all programmable computers. It proved that a single machine could simulate any other machine by reading a set of instructions (software) from its memory (tape).
- This theoretical foundation has since evolved into the smartphones, personal computers, and global networks that define our modern digital lives.
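The idea that one machine can execute any rule table can be made concrete with a tiny simulator. The sketch below is a minimal, illustrative Turing machine: the `run_turing_machine` function plays the role of the universal “hardware”, while the made-up `invert` rule table is the “software” it reads.

```python
def run_turing_machine(rules, tape, state="start"):
    """Minimal one-tape Turing machine.

    `rules` maps (state, symbol) -> (new_symbol, move, new_state);
    the machine halts when it reaches the state "halt".
    The rule table is the 'software'; this function is the 'hardware'.
    """
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"  # blank past the end
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape)

# A made-up rule table that inverts a binary string, one cell at a time.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),   # reached a blank cell: stop
}
print(run_turing_machine(invert, "1011"))  # prints 0100
```

Swapping in a different rule table makes the same simulator compute something entirely different — which is exactly the universality the UTM established.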
Did you know?
- Many of our everyday activities are essentially physical algorithms. Brushing your teeth, getting dressed, tying shoelaces and washing dishes are all step-by-step sequences of operations.
- The Euclidean algorithm, developed over 2,000 years ago to find the greatest common divisor of two numbers, is still widely used in modern mathematics and cryptography.
- English mathematician Ada Lovelace wrote the first algorithm intended for a machine in the 1840s, specifically for Charles Babbage’s Analytical Engine, making her the first computer programmer.
- The Analytical Engine proposed by Babbage in 1837 was a mechanical computer that is widely recognised as the first blueprint for a general-purpose, programmable computer. Unlike his earlier “Difference Engine,” which was a specialised calculator for mathematical tables, the Analytical Engine was designed to perform any computational task.
- Choosing a superior algorithm (like Merge Sort) can make a program 50,000 times faster on a standard laptop than an inefficient one (like Insertion Sort) on a multi-million dollar supercomputer.
- Algorithms like the Fast Fourier Transform (FFT) run almost constantly in the background of our smartphones, GPS and communication devices to process signals and images.
- Unlike traditional algorithms with fixed rules, AI algorithms (like those used in neural networks) learn to refine their own instructions based on the data they process. (Neural networks, or artificial neural networks, are computational models inspired by the biological structure and information-processing strategies of the human brain.)
- Algorithms use specialised strategies to tackle complexity, such as: Divide and Conquer (breaking a large problem into smaller pieces), Greedy (making the best immediate choice at each step) and Dynamic Programming (storing previous results to avoid repeating work).
- Algorithms can be mathematically “graded” based on their resource consumption, specifically Time Complexity (how fast they run) and Space Complexity (how much memory they use).
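The Divide and Conquer strategy and the idea of grading algorithms by time complexity both show up in Merge Sort, the fast sorting algorithm mentioned above. The sketch below splits the list in half, sorts each half recursively, and merges the results; this takes on the order of n·log(n) steps, versus roughly n² for Insertion Sort, which is where speed-ups like the one described come from.

```python
def merge_sort(items):
    """Divide and conquer: split, sort each half recursively, then merge."""
    if len(items) <= 1:                # a list of one item is already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])     # divide: sort the two halves...
    right = merge_sort(items[mid:])
    merged = []                        # ...then conquer: merge them
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # take the smaller front element
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])            # append whatever half remains
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # prints [3, 9, 10, 27, 38, 43, 82]
```

On a list of a million items, the gap between n·log(n) and n² steps is what lets a good algorithm on a laptop outrun a bad one on a supercomputer.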
Image 1: The French post office, La Poste, released a stamp featuring Ada Lovelace on Ada Lovelace Day 2022. Every year, on the 2nd Tuesday of October, this event celebrates innovative women in computing. Image courtesy https://findingada.com/
Image 2: Soviet Union (USSR) issued a postage stamp in 1983 to commemorate the 1200th anniversary of the birth of Muhammad ibn Musa al-Khwarizmi, the Persian polymath and astronomer. Image courtesy Wikimedia Commons.
Image 3: A commemorative postage stamp released by India in 1975 to celebrate the launch of India’s first satellite from the Soviet Union. The satellite was named Aryabhata after the 5th-century Indian astronomer and mathematician.
Image 4: In June 2012, the centenary of the birth of Alan Turing, an exhibition devoted to his life and work was staged at Bletchley Park, and his achievements were celebrated on a postage stamp in the “Britons of Distinction” series. Image courtesy https://stlqcc.org.uk/
Image 5: A postage stamp on Charles Babbage issued by UK’s Royal Mail in 2010. The Royal Mail celebrated the 350th anniversary of the Royal Society, UK, in 2010 by putting out a set of 10 postage stamps on the giants of science from Great Britain. Image courtesy newscientist.com