The council has launched its “5 Laws of Robotics” which state:
- Robots should not be designed as weapons.
- Robots should comply with existing law, including privacy.
- Robots are products; they should be safe, reliable and not misrepresent their capabilities.
- Robots are manufactured artefacts; the illusion of emotions and agency should not be used to exploit vulnerable users.
- It should be possible to find out who is responsible for any robot.
There have been similar recent attempts to codify robotic development and ethics, such as the EPSRC’s “Principles of Robotics” in 2010. Earlier this year, the British Standards Institution released “BS 8611 Robots and robotic devices”, presented at the Social Robotics and AI Conference in Oxford, which examines ethical risk assessment and how it can be embedded in robot design.
The Good Robot Design Council’s rules have been drawn up by a group of scientists, academics, and users, and are aimed at robotics developers, helping them to identify and assess areas of potential ethical harm. Such areas include humans forming emotional attachments to robots, especially robots used in care, and robots being used to enact physical violence.
The advice that robots should not be designed as weapons, that is, designed solely to harm or kill humans, is highlighted as paramount, echoing Asimov’s principle that a robot may not cause harm to a human. Responsibility, and the ability to trace responsibility for a robot’s behaviour, is also noted as deeply significant, especially as robots become more integrated into human society and their actions bear on areas of law such as criminal liability and insurance.