Kinnu

Why Computers Matter

What is a computer?

This is a very special pathway.

This pathway was produced by the Kinnu community! The awesome volunteers who contributed to producing this pathway are:

@petrichori
@samurai
@leparda35
@Demiguise
@Fish
@finn.heimberg
@Mick
@oblongur
@EliTheNerd
@purejoymind
@Voegl
@roccomarco
@ZakZak
@BlackADragon
@salamigod
@DiavalDiablo
@puritii_light
@FumingRed
@purplecat
@thecat
@SmashySetsuna
@Louisg
@cboomerang
@BackByUnpopularDemand
@EugeneD
@Cato
@Pialgdio
@-ilo-
@Linda
@dgsoa
@faintzzz
@SirZed

We are so grateful to all of these contributors. You've helped us create something amazing.

If you'd like to get involved in producing a future pathway, sign up to our Discord!

What is a computer? We all use them, and we all know what they do. But could you actually explain what a computer is, and how it works?

It’s a strange fact of modern life that most of us spend hours and hours each day – both at work and in our free time – interacting with computers, yet we never really think about what’s going on inside them. We take for granted that these machines can perform incredible feats of thinking and deliver seemingly limitless data to us, as if by magic.

An Apple II, one of the earliest personal computers. FozzTexx, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

But there’s no magic involved in how computers work. These miraculous man-made machines are the result of ingenious math, engineering and design – conceptualised, built and refined by some of the world’s most extraordinary minds over the past century or so.

In this pathway you’ll learn about these concepts, so that you too can understand the technology that is the cornerstone of the world we live in today.

At its most basic level, a computer is an electronic device that operates by taking an input – meaning some data, which could be numbers, text or something else – processing it according to a set of instructions, called a program, and then producing an output.

A good way of thinking about that is to consider the simplest possible computer. This would look something like the machine in the picture below.

A Turing Machine, Rocky Acosta, CC BY 3.0 <https://creativecommons.org/licenses/by/3.0>, via Wikimedia Commons

This is an example of a Turing machine. It was conceptualised in 1936 by Alan Turing (a name you may have heard before – we’ll be talking more about him later!). Turing came up with the idea for this machine as a thought experiment, but many physical versions have since been built by enthusiasts.

In the one pictured here, there are two spools of tape, with a device in the middle through which the tape passes. The tape is split into squares, and each square can contain either a 1 or a 0.

The device in the middle lets a human examine the tape one square at a time. Using the device, they can view the current state of one square of tape and change it to either a 1 or a 0. The human has a rulebook which tells them exactly what to do every time they inspect a square of the tape.

So, the human in this Turing machine examines a square of tape, and follows a set of rules which tells them what to do. The rules might look something like this: “Go to the third cell. If it is a 1, turn it into a 0. If it is a 0, leave it as a 0. Now, go to the fourth cell, and follow the instructions on the next page of this book.”

The human will keep following instructions until the rulebook is finished. And just like that, they will have performed a computation!

An example of Turing machine instructions. Image: Public Domain

They have taken an input (the state of the tape at the start of the operation), processed it according to a program (the rulebook they are following), and produced an output (the state of the tape at the end). And they've done this without having to make any complex calculations themselves.
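
To make this concrete, here is a minimal sketch of that whole process in Python. The rule format is our own simplification, not Turing's original notation: each rule maps the current state and the symbol being read to a symbol to write, a direction to move, and the next state.

```python
# A minimal Turing machine sketch. The tape is a dict mapping positions
# to symbols; the rulebook maps (state, symbol) to (write, move, next state).

def run(tape, rules, state="A"):
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, "blank")              # examine the current square
        write, move, state = rules[(state, symbol)]  # look up the rule to follow
        tape[pos] = write                            # change the square
        pos += 1 if move == "R" else -1              # step along the tape
    return tape

# A toy rulebook: flip every bit, then halt when a blank square is reached.
rules = {
    ("A", 0): (1, "R", "A"),
    ("A", 1): (0, "R", "A"),
    ("A", "blank"): ("blank", "R", "halt"),
}

print(run({0: 1, 1: 1, 2: 0}, rules))   # {0: 0, 1: 0, 2: 1, 3: 'blank'}
```

Notice that each step is just a table lookup and a write: at no point does the machine need to do any 'thinking'.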

Now, you might be thinking: “What’s the use of changing a load of 1s and 0s around?” Well, these 1s and 0s become very useful when we start to use binary notation. This is a system for representing numbers and other data through combinations of 1s and 0s. We can represent any number we like in this way – meaning that with just the Turing machine and the right rulebook, we can perform any calculation we like.
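
For example, the binary string 1101 represents thirteen: reading from the right, its digits stand for one 1, no 2, one 4, and one 8, and 1 + 4 + 8 = 13. Here is a quick check in Python:

```python
print(1 + 4 + 8)        # 13
print(bin(13))          # '0b1101', Python's binary notation for 13
print(int("1101", 2))   # 13, converting back again
```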

Computers as we know them are effectively doing what Turing machines do, at vastly greater speed and scale, and using much smaller components. Instead of 1s and 0s on tape, there are tiny transistors that either pass an electric signal (1) or no signal (0). Instead of humans executing the program, there are processing units.

But remember: the human running the program was just following very simple rules. By breaking data down into binary code, computers can process very complex information without doing any ‘thinking’ in the traditional sense.

Of course, there are lots of further levels of complexity to all of this. We’ve shown how Turing machines can act as calculators, but what about all the other stuff computers do, like running operating systems, or displaying JPEGs, or playing Flappy Bird?

We’ll cover all of these and more as the pathway continues. But it’s important to remember that everything your computer is doing comes down to this simple process of breaking input information down into 1s and 0s, performing a program of instructions on those bits of binary, and producing an output.

The vast world of computers

Getting some work done on a computer. Image: Computer Using Cat by EvanLovely, Chi King (CC BY 2.0) <https://creativecommons.org/licenses/by/2.0>, via Wikimedia Commons

You’re probably aware that you’re looking at a computer right now – most likely your smartphone, running the Kinnu app. But did you know that there are probably dozens of tiny computers all around you?

If you’re reading this at home, your microwave, refrigerator, TV, toaster, oven, washing machine, vacuum cleaner, and many of your children’s toys will have tiny computers inside them. If you’re walking down the street, many of the street lamps, traffic lights, and parking meters around you contain at least one computer.

If you're in the car (hopefully not driving if you’re reading this!), there are probably dozens of computers controlling everything from the engine and brakes to the radio and climate control. In modern offices, factories, and stores, computers are ubiquitous, managing inventory, processing transactions, and controlling equipment.

There are several types of computers, each designed for specific tasks and user needs.

The type we are most familiar with is the personal computer (PC). PCs are what most people are referring to when they use the term ‘computer’ in conversation – a desktop or laptop, likely running either Windows or macOS, controlled by the user using a keyboard and mouse or trackpad.

A PC. Image: Laptop der Marke exone go 20240203 HOF06886 RAW-Export 000276 by PantheraLeo1359531 (CC BY 4.0) <https://creativecommons.org/licenses/by/4.0>, via Wikimedia Commons

Some would say that smartphones and tablets should also be considered PCs; others would argue they form their own category. Like PCs, they are designed for a non-technical individual to use for a wide variety of tasks. However, they are not typically used for work in the same way that traditional PCs are.

Outside of PCs, smartphones and tablets, there are many more types of computers that ordinary people might be less familiar with.

Servers are computers that exist only to perform functions requested of them by other computers, usually via messages sent over a network. A simpler way to think about them is as ‘computers without screens, which are used by other computers’.

A room full of servers. Image: Wikimedia Servers-0051 13 by Helpameout (CC BY-SA 3.0) <https://creativecommons.org/licenses/by-sa/3.0>, via Wikimedia Commons

For example, when you open Google and enter a search, all the data that appears wasn’t stored on your device before. Your device has sent a request to a server (in fact, several servers), and is displaying the results of that request. Servers enable us to perform a much broader spectrum of activities on our devices.
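
At its simplest, that request/response exchange looks something like this sketch, which uses Python's standard library to fetch a web page (the URL here is just an illustrative placeholder):

```python
from urllib.request import urlopen

# Send a request over the network to a server...
with urlopen("https://example.com/") as response:
    page = response.read()    # ...the server sends data back...
    print(page[:80])          # ...and your device displays the result.
```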

Mainframe computers are large, powerful machines used by organisations for critical applications, such as bulk data processing and large-scale transaction processing. In the early days of computing, all computers were mainframes. Nowadays they are rarer, but still vital for many organisations that need to handle massive amounts of data.

Supercomputers are the fastest and most powerful type of computer, capable of performing trillions of calculations per second. They are used for complex simulations and computations in fields like climate research, molecular modelling, and astrophysics. There are relatively few supercomputers in the world, and most are tucked away in research labs. Unless you’re a top scientist, the chances are you’ll never use one.

A supercomputer. Image: IBM Blue Gene P supercomputer by Argonne National Laboratory's Flickr page (CC BY-SA 2.0) <https://creativecommons.org/licenses/by-sa/2.0>, via Wikimedia Commons

Embedded systems are specialized computers designed to perform specific tasks within larger systems. Unlike general-purpose systems, embedded systems generally perform one single task. As technology has developed, and it has become possible to build tinier and tinier computers, we have started to embed mini computers all over the place – even within other computers!

For example, in your smartphone there is a sensor hub which processes data from the various sensors in your device. This is technically a separate computer from the phone’s main processor. So in your hand is not just one computer, but several!

As embedded systems become cheaper and more efficient, they are starting to appear in more and more objects. For example, thermostats used to be purely analogue devices, but many now contain embedded systems. Gradually these tiny computers are working their way into every device we use – from showers to lightbulbs.

One last type of computer to consider is wearable tech. This category overlaps with several of the others we’ve mentioned – wearables are arguably PCs, and sometimes function as embedded systems. It covers anything you can wear, from an Apple Watch to a smart hearing aid!

A smart watch. Image: Public domain

History and evolution of computers

What was the first computer? That all depends on your definition of the word. It could be argued that computers existed thousands of years ago. The abacus, which dates back to at least 2400 BCE, is considered one of the earliest computing tools, because it breaks large calculations down into steps, allowing people to compute just by following simple rules.

An ancient abacus. Image: Kulram by Tor Svensson (CC BY-SA 3.0) <http://creativecommons.org/licenses/by-sa/3.0/>, via Wikimedia Commons

The modern story of computing begins much more recently, in the early 19th century. The 1805 “Jacquard loom” was built by Joseph-Marie Jacquard, a French weaver. This loom used punch cards to control the pattern of the weave. A weaver could feed in a card with holes punched into it, and the pattern of holes told the machine which pattern to follow in its weaving.

The Jacquard Loom would go on to inspire Charles Babbage to design the Difference Engine and the Analytical Engine in the early decades of the 19th century.

Babbage Difference Engine (the power-supply end) by Jitze Couperus (CC BY 2.0) <https://creativecommons.org/licenses/by/2.0>, via Wikimedia Commons

Although Babbage would never complete either engine, Ada Lovelace, the daughter of the poet Lord Byron, wrote a series of notes detailing how the theoretical Analytical Engine could be programmed to produce the Bernoulli numbers. In her notes, Lovelace also realised that the machine didn’t have to be restricted solely to numbers – it could manipulate any symbols governed by logical rules.

As a result, Babbage is often celebrated as being the designer of the first computer, and Lovelace as the first computer programmer.

Fast forward to the 1930s, when Alan Turing, a mathematician at the University of Cambridge, made several theoretical innovations that would lay the groundwork for modern computing.

Alan Turing (1912-1954) in 1936 at Princeton University (Public domain), via Wikimedia Commons

Turing was a math prodigy whose area of interest was computable numbers. In a 1936 paper titled On Computable Numbers, With An Application To The Entscheidungsproblem, he dreamed up a machine that could perform any computation by following only extremely simple rules. He called this an “a-machine”, but it has since become known as a Turing machine, which we have already discussed.

The part of Turing’s paper where he sets out his design for the Turing machine. Image: Public domain

The Turing machine, though theoretical, was a foundational concept for modern computing. A few years after first proposing the machine, Turing put some of these principles into practice while helping to build a series of code-breaking machines, which were instrumental in the Allied war effort during World War II.

After World War II, there were rapid advancements in computer technology with the development of the first electronic digital computers during the 1940s and 1950s. One example, the ENIAC (Electronic Numerical Integrator and Computer), was completed in 1945 and is often described as the first general-purpose electronic digital computer.

The ENIAC was a massive machine, weighing 30 tons and occupying about 167 square meters, yet it could perform up to 5,000 additions per second.

Glen Beck and Betty Snyder program the ENIAC in building 328 at the Ballistic Research Laboratory (Public domain), via Wikimedia Commons

This period also saw the introduction of the stored-program concept, where the instructions for a computation are stored in the computer's memory alongside the data. This innovation was first implemented in the Manchester Baby, the world's first stored-program computer, in 1948.

The 1960s and 1970s brought about the era of mainframe computers: large, powerful machines put to work by businesses. IBM's System/360 series rose to prominence and became the standard-setter in mainframe computing. Capable of handling vast amounts of data and performing complex calculations, these machines became indispensable tools for large organisations.

IBM system 360 by waelder (CC BY-SA 2.5) <https://creativecommons.org/licenses/by-sa/2.5>, via Wikimedia Commons

The advent of the microprocessor in the 1970s was another huge development. In 1971, the Intel 4004 was released. It was the first commercially available microprocessor, packing the functions of a computer's central processing unit (CPU) onto a single chip and making it possible to create smaller, more affordable computers.

Intel 4004 by LucaDetomi, it.wikipedia (CC BY-SA 3.0) <http://creativecommons.org/licenses/by-sa/3.0/>, via Wikimedia Commons

This innovation paved the way for the personal computer (PC) revolution of the 1980s. Companies like Apple and IBM introduced PCs that brought computing power to homes and small businesses, transforming the way people worked and lived.

The 1990s and 2000s then saw the rise of the internet and the proliferation of networked computers. The World Wide Web, invented by Tim Berners-Lee in 1989, became a global platform for communication and easy access to information.

This period also witnessed the development of operating systems such as Windows, Mac OS, and Linux, as well as many powerful software applications, making computers more user-friendly and versatile.

Apple iMac in 'Bondi Blue'. Image: HereToHelp, CC BY-SA 2.0 <https://creativecommons.org/licenses/by-sa/2.0/deed.en>, via Wikimedia Commons

Today, computers are an integral part of everyday life, embedded in everything from smartphones to household appliances. They are continually evolving, with advancements in artificial intelligence, machine learning, and quantum computing pushing the boundaries of what is possible.

The journey from Babbage’s Analytical Engine to the computers of today is a story of teamwork. Many small innovations made by countless individuals have shaped and honed the technology we live with today.

As we look to the future, the potential for further advancements in computing technology seems endless, continuing to shape and redefine our world.