Page 140 - Grasp English B1+ (Student Book)
11 Robotics
In his 1942 science fiction short story, Runaround,
Isaac Asimov introduced the Three Laws of
Robotics. These three laws have stood the test
of time and still have an impact today on our
thinking about the ethics of artificial intelligence.
Although we may think that artificial intelligence,
or AI, is a recent idea, even the myths of Ancient
Greece include gadgets to help humans. One
of them is Bubo, the mechanical owl which helps
Perseus on his adventure to save Andromeda.
Hundreds of years later, in early 19th-century
France, we see one of the first examples of
machines being programmed to carry out
tasks. The Jacquard loom was a textile-making
device which used cards with holes punched
into them. Joseph Marie Jacquard’s loom was
based on earlier prototypes invented by other
Frenchmen. It was invented during the Industrial
Revolution – a period of history when machines
became heavily involved in manufacturing.
Robotics is not only used in the manufacturing
process. With our increased use of the internet,
many companies now have chatbots on
their websites which are able to interact with customers at any hour of the day. The chatbots have a limited number of responses, but they are quickly being developed to become more efficient and effective.
However, these are still machines which have been programmed by humans to carry out tasks. Have we seen any truly autonomous robots yet? At the moment, input comes from the human programmers, and we have yet to see a self-aware device. This brings us back to the question of the ethics of artificial intelligence and Asimov's Laws.

Laws of Robotics
First Law
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law
A robot must obey the orders given by human beings except where such orders would conflict with the First Law.
Third Law
A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Handbook of Robotics, 56th Edition, 2058 AD.

Imagine a self-driving car is about to crash into a
mother with her young child in a pram. The car
can turn quickly to avoid hitting the baby, but
would instead hit two pedestrians walking along
the street. How can we explain the ethics of this
situation to a machine? And what would you do
if you were driving the car?