Even robots are made for a purpose
Author: Tony Watkins
Keywords: Conflict, ethics, life, purpose, technology, future
Film title: I, Robot
Director: Alex Proyas
Screenplay: Jeff Vintar based on the book by Isaac Asimov
Starring: Will Smith, Bridget Moynahan, Alan Tudyk, James Cromwell
Distributor: 20th Century Fox
Cinema Release Date: 2004
Certificate: 12A (UK); PG-13 (USA)
In the film I, Robot, Detective Spooner is investigating the apparent suicide of a top robotics scientist. The year is 2035 and robots are a part of everyone’s life – doing menial tasks including shopping, cleaning and collecting rubbish. Spooner, however, is strongly anti-robot and suspects that the scientist was in fact murdered by a robot.
This idea is ridiculed – robots are programmed to follow the three laws of robotics: ‘a robot may not harm a human or, by inaction, allow a human being to come to harm; a robot must obey orders given it by human beings except where such orders would conflict with the first law; a robot must protect its own existence as long as such protection does not conflict with the first or second law’.
These three laws were formulated by Isaac Asimov in his 1950 short story collection, which inspired the film. The laws are an established part not just of science fiction but of real-world robotics. Asimov was wary of stories about out-of-control science and saw robots as having wonderful potential for good – the laws were intended to ensure that robots would always be on our side.
This film, however, sticks with the more conventional line of robots taking over the world. What makes it interesting is its questioning of whether the three laws are adequate.
A familiar theme in science fiction films is the possibility of artificial intelligence developing its own consciousness, becoming able to think freely for itself and to have emotions. If that happened, would the machine still be bound by the three laws, or could it choose to think differently?
And what if the machines had a very different perspective on the three laws? Given the human tendency towards conflict, wouldn’t robots have a duty to step in to prevent harm? And how would a robot balance the various conflicting interests?
I, Robot’s concern is perhaps not so much robot ethics as human ethics: how should we live in a world of conflicting self-interest? From a Christian perspective the answer is found in embryonic form in I, Robot when the first truly free robot says, ‘My father made me for a purpose. We all have a purpose, don’t you think?’
Complete autonomy, whether of humans or machines, leads to trouble. Instead we need to recognise the responsibility to discover the creator’s purposes for our lives. Only then can we be fully human.
© Copyright: Tony Watkins 2004, first published in Christian Herald in August 2004
Unless stated otherwise, Bible quotations are from the New Living Translation (NLT) copyright © 1996, 2004 by Tyndale Charitable Trust. Used by permission of Tyndale House Publishers.