Kinnu

Robotics

What is a robot?

It probably feels like a while ago now. But right at the start of this entire pathway, we talked about Talos – that automated giant, made entirely of bronze, which appeared in Greek mythology.

An AI-generated image.

There's something quite distinct about Talos. Something that really sets him apart from most of the real-life AI models we've discussed in this pathway so far. Models like AlphaGo, and ChatGPT, and Midjourney, and Logic Theorist, and so on.

Talos has a physical body. He can pick up stones, he can walk about, he can tangibly interact with the world. In other words, Talos is a robot.

At least, he would be, if he'd actually existed.

A modern robot might be defined as follows: a physical machine, performing physical tasks, which impact the physical world. Another popular term is embodied AI – that's an AI model with some kind of physical body.

As we said, a lot of modern models don't fit this definition. ChatGPT lacks a physical body. There's no way for it to physically type, or to use a pen and paper.

But that doesn't mean that robots don't exist.

Just think of NASA’s Perseverance rover. It's currently using an AI system to physically explore and physically interact with the surface of the planet Mars. Many factories are now using robots too, to assemble products and transport items around.

Perseverance rover. Image: (Public domain), via Wikimedia Commons

Factory robots, and other similar machines, are a relatively modern invention. And if you talk to people, these robots actually make a lot of people nervous.

According to Isaac Asimov – a celebrated science fiction writer, who wrote extensively about robots during the twentieth century – humans are intrinsically frightened of intelligent machines. We fear that robots will rise up and attack us, something a disembodied AI (like ChatGPT) couldn't possibly do.

Asimov called this the Frankenstein Complex: the fear that robots will turn against us, just like the monster in Frankenstein turned against his creator.

Of course, the idea of a robot turning against us isn't realistic. At least, not with the robots currently in use.

Embodied or otherwise, these are still Artificial Narrow Intelligence models. All they can do is perform the tasks they were originally designed for, like fitting two pieces of metal together, or moving a box from point A to point B. They can't make the leap to 'vengeance' or 'violence' – that's not what they were built to do.

In other words, it's best to view robotics as an exciting technology, not a worrying one. As things stand, the idea of a robot uprising isn't a valid concern.

Robot non-vengefully sawing. By Luka Peternel (CC BY-SA 4.0) <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Robotic bodies

As we talked about last time, a robot is defined by its physical body. That's what sets it apart from disembodied AIs like ChatGPT or AlphaGo. But we haven't talked about how these bodies actually work.

An effector is the part of a robot's body that actually performs an action. For example, it might have a gripper (for picking up objects), a drill (for cutting through metal), or a wheel (for moving around).

An actuator, on the other hand, provides the physical force that allows an effector to move. For example, an electric motor might turn a robot's wheels, or a pneumatic (compressed air) system might force its gripper to close.

So, a robotic body is just a set of effectors, powered by a set of actuators. But depending on the nature of the robot, these parts can be assembled in totally different ways.
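The idea that "a robotic body is just a set of effectors, powered by a set of actuators" can be sketched in a few lines of code. Everything here – the class names, the part names, the messages – is hypothetical, invented purely to illustrate the pairing:

```python
# A toy model of a robot body: each effector (the part that acts)
# is driven by an actuator (the part that provides force).

class Actuator:
    """Provides physical force -- e.g. an electric motor."""
    def __init__(self, name):
        self.name = name

    def apply_force(self):
        return f"{self.name} applying force"

class Effector:
    """Performs the action -- e.g. a gripper or a wheel."""
    def __init__(self, name, actuator):
        self.name = name
        self.actuator = actuator  # every effector is driven by an actuator

    def act(self):
        return f"{self.name} moves ({self.actuator.apply_force()})"

# A robot body is just a collection of effector/actuator pairs.
gripper = Effector("gripper", Actuator("pneumatic system"))
wheel = Effector("wheel", Actuator("electric motor"))

for part in (gripper, wheel):
    print(part.act())
```

Assembling these parts differently – two legs and two arms versus six wheels and a drill – is exactly what separates the humanoid and non-humanoid designs below.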

On the one hand, we have humanoid robots. These are mechanical bodies which are put together to look like real humans. One famous example is NASA's Valkyrie, which basically looks like an actual person wrapped up in a high-tech suit.

Valkyrie. Image: (Public domain), via Wikimedia Commons

On the other hand, we have non-humanoid robots. These are mechanical bodies which don't look anything like us. Think of the Perseverance rover, or a self-flying drone, or a floor-mounted arm like the ones you might find in a factory.

Floor-mounted arms. Image: (Public domain), via Wikimedia Commons

It has to be said: humanoid robots don't currently have many practical, real-world applications. For the time being, you won't see one working in a factory, or delivering your morning mail.

They're generally built for research purposes. One well-known example is the iCub robot, which was built in the style of a human child, then encouraged to learn some child-like behaviors, like crawling, playing, and talking.

iCub robot. Image: Xavier Caré / Wikimedia Commons / CC-BY-SA, via Wikimedia Commons

Humanoid robots are also built for entertainment. In 2016, Hanson Robotics launched Sophia – a humanoid robot which caught the attention of the world. Sophia is essentially just a chatbot with effectors for facial expressions, but she's given a number of televised interviews, and even spoken at important events.

Sophia robot. Image: (cropped) by ITU Pictures (CC BY 2.0) <https://creativecommons.org/licenses/by/2.0>, via Wikimedia Commons

While humanoid robots are mainly just used for research and entertainment, non-humanoid robots have lots of real-world uses.

An industrial robot is a class of robot which is generally used in factories. These often take the form of robotic arms, which use Computer Vision to look at objects, before using their effectors to pick them up and manipulate them.

In general, these robots are used for long, repetitive tasks which would be difficult for a human to perform. For example, if a factory needed to tighten a million bolts into place, it might be better to give that work to a robot, rather than a human who might tire out.

Another example of a non-humanoid robot is an automated guided vehicle (AGV). It uses vision or LiDAR to navigate a space, like a factory building, where it carefully transports goods from place to place.

AGV. By Raimond Spekking (CC BY-SA 4.0) <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Robotic minds

It's time to come clean about something. We know this is a pathway about Artificial Intelligence... but a lot of robots don't actually use AI.

Consider a simple, mechanical arm. It's been programmed to saw a piece of wood. When you press a button, that's exactly what it does.

This arm is a robot: a physical machine, performing a physical task, which impacts the physical world. But does this arm use AI? Not at all. It doesn't need to 'think' or 'make decisions'. It just follows its programming, and moves that saw back and forth, whenever you press the button.
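A sketch can make this concrete. The "program" below is the arm's entire behavior: a fixed sequence of motions, repeated on a button press. The function names and motion labels are invented for illustration – the point is that nothing here thinks or decides anything:

```python
# A robot with no "mind": pressing the button always triggers the
# exact same fixed sequence of motions.

def saw_cycle():
    """One fixed back-and-forth stroke of the saw."""
    return ["saw forward", "saw back"]

def on_button_press(strokes=3):
    """The arm's entire 'program': repeat the same motion, then stop."""
    motions = []
    for _ in range(strokes):
        motions.extend(saw_cycle())
    return motions

print(on_button_press(2))
```

No perception, no classification, no decisions – just the same motions in the same order, every time.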

Think of it like this. A robot is a mechanical body. But that body doesn't necessarily need a 'mind'.

So, not all robots use AI. But a lot of the best ones do. Let's consider, for example, an AGV which is designed to pick up trash from the factory floor, then deliver that trash to a bin.

First of all, that robot will use Computer Vision to spot any objects on the floor. It will also need an image classification system, which helps it to identify whether those objects actually need to be thrown away.

This image classification system is probably some kind of neural network, which has been trained on a labeled dataset of trash (waste paper, burned wood, sawdust) and not trash (hammers, buckets, screws).

Trash or not trash?
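The classification step can be sketched in miniature. A real system would run a neural network over camera pixels; this toy version assumes each object has already been reduced to two made-up features (size, and "sheen" – how metallic it looks), and simply labels a new object like its nearest labeled example:

```python
# A toy "trash or not trash" classifier: 1-nearest-neighbor over
# hand-made features. The training rows and feature values are invented.
import math

# Hypothetical labeled training data: (size_cm, sheen, label)
training = [
    (3.0, 0.1, "trash"),       # waste paper
    (5.0, 0.2, "trash"),       # burned wood
    (1.0, 0.1, "trash"),       # sawdust
    (25.0, 0.9, "not trash"),  # hammer
    (30.0, 0.7, "not trash"),  # bucket
    (4.0, 0.95, "not trash"),  # screw
]

def classify(size_cm, sheen):
    """Label a new object with the label of its nearest training example."""
    def dist(row):
        # Scale sheen up so both features matter comparably.
        return math.hypot(row[0] - size_cm, (row[1] - sheen) * 30)
    return min(training, key=dist)[2]

print(classify(2.0, 0.15))   # small, dull object
print(classify(26.0, 0.85))  # large, shiny object
```

The principle is the same as in the neural network: the system generalizes from labeled examples, rather than following a hand-written rule for every possible object.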

Once the AI system has identified a piece of trash on the floor, it will send signals to the robot's actuators. These actuators, in turn, power the robot's effectors. It wheels its way over to the piece of trash, and picks it up with a grabber.

Now, the robot needs to make its way to the bin. But the factory is crowded with human workers walking back and forth.

Factory floor.

Luckily, this robot has a LiDAR system. It starts sending out lasers, which bounce off the humans, and give the robot a dynamic, 3D map of all these moving workers. It uses this map to navigate the space, still sending signals to actuators which cause its wheels to steer left and right as required.
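The steering decision at the end of that chain can be sketched too. Assume (purely for illustration) that the LiDAR scan has been boiled down to a few range readings at different angles – degrees off straight ahead – and the robot heads for the most open direction:

```python
# A toy steering step: pick the direction with the most free space.
# The angles, distances, and clearance threshold are all made up.

def pick_heading(scan):
    """scan: dict of angle (degrees, 0 = straight ahead) -> distance
    (meters) to the nearest obstacle. Return the most open angle."""
    return max(scan, key=scan.get)

def steer(scan, clearance=1.0):
    best = pick_heading(scan)
    if scan[best] < clearance:
        return "stop"        # blocked in every direction
    if best < 0:
        return "steer left"
    if best > 0:
        return "steer right"
    return "go straight"

# Workers block the path ahead and to the left; the right is open.
scan = {-30: 0.8, 0: 1.2, 30: 4.5}
print(steer(scan))
```

A real navigation stack re-runs a decision like this many times per second, as the 3D map of moving workers updates.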

Eventually, the robot reaches the bin. The AI sends a signal to another actuator, which opens the grabber, and drops the trash into the bin.

The robot is about to turn around, and search the factory for another piece of trash, when its microphones pick up the sound of a voice: "Hey, robot, that's enough for today. Can you switch into sleep mode?"

The AI takes these words, and uses some Natural Language Processing to translate them into numbers. It recognizes these numbers as a command, and responds by sending out a set of signals which shuts its system down.
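That "words into numbers" step can be shown with a toy bag-of-words matcher. The vocabulary and command list here are invented; real systems use learned embeddings rather than a hand-built word list, but the shape of the idea is the same:

```python
# A toy command recognizer: turn a sentence into word counts,
# then pick the command whose counts overlap most.

VOCAB = {"robot": 0, "sleep": 1, "mode": 2, "switch": 3, "trash": 4, "find": 5}

def to_numbers(sentence):
    """Bag-of-words: count how often each known word appears."""
    counts = [0] * len(VOCAB)
    for word in sentence.lower().replace(",", "").replace("?", "").split():
        if word in VOCAB:
            counts[VOCAB[word]] += 1
    return counts

COMMANDS = {
    "shut_down": to_numbers("switch sleep mode"),
    "collect": to_numbers("find trash"),
}

def recognize(sentence):
    """Pick the command whose word counts overlap most with the sentence."""
    vec = to_numbers(sentence)
    return max(COMMANDS, key=lambda c: sum(a * b for a, b in zip(vec, COMMANDS[c])))

print(recognize("Hey, robot, can you switch into sleep mode?"))
```

Once the sentence is numbers, matching it against known commands is just arithmetic – which is exactly why translating language into numbers is the crucial first step.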

That AGV is just one example of a robot using AI.

Another robot, built to weave colorful rugs, might use Generative AI to come up with new designs. Another might use Computer Vision to perform quality assurance on products. Another might use NLP to give reports and updates to workers. Honestly, the list goes on and on.

And it makes you wonder... if we wanted to build a robot like Talos, would that actually now be possible?

A humanoid robot which walks back and forth, using Computer Vision to watch the ocean. When it sees an object which resembles a boat, it picks up a rock using actuators and effectors, and throws that rock at the object. When the object sinks, the robot continues its patrol.