14 Aug 2019

USA: A Crucible for Regulation?

Is it significant that America was the first country in which a human was killed by a robot? That incident, fear of job losses and the proliferation of robotic applications are raising a raft of questions on law and policy in the United States.

By David Cowan

BravissimoS / Shutterstock.com

 

Even Elon Musk has said, “I’m increasingly inclined to think that there should be some regulatory oversight.” According to a survey by the Pew Research Center, “Americans express broad concerns over the fairness and effectiveness of computer programs making important decisions in people’s lives.”

If we go back to the 1980s, who would have predicted the sheer volume of new legislation drafted to regulate Internet privacy, social media and mobile phones? Legal questions relating to robots, autonomous cars and drones are an indication that the new industrial age is raising ever-increasing policy concerns about the impact of automation on the lives of everyday citizens. In this country focus we look at some of the non-military areas of robot regulation where America is grappling with the issues.

Social themes

The WeRobot conference, held annually each spring, is a gathering of American and international academics, practitioners and others that explores the latest thinking on legal and policy issues relating to robotics. “We Robot fosters conversations between the people designing, building, and deploying robots and the people who design or influence the legal and social structures in which robots will operate,” says A. Michael Froomkin, founder of WeRobot. The conference encourages “reflection and interdisciplinary collaborations between developers of robotics, AI, and related technology and experts in the humanities, social science, and law and policy.” Previously, the conference has been held at the University of Miami, the University of Washington, Stanford and Yale, while next year the community heads to the University of Ottawa (April 2-4, 2020).

Among the topics under consideration by scholars are: whether tricking a robot amounts to illegal “hacking” under federal law; how robot referees might change sports; a feminist view of drone regulation; the long-run implications for doctors and patients when machine-learning systems start beating doctors at diagnosis; the private sector’s role in governing autonomous weapons systems; and whether real-life RoboCops would achieve – or set back – the goals of Black Lives Matter. Others are asking whether a robot can be programmed to be racist, or whether a robot can sexually harass a co-worker, human or non-human. These topics and themes illustrate how closely intertwined technology development and policy issues have become in robotics law.

Torts

In an XPONENTIAL 2018 keynote speech, Professor Zeynep Tufekci, of the University of North Carolina, observed: “In the future, we will no longer need two pilots, planes will have just one captain and a dog. The dog will be there to bite the human in case he touches anything.” However, we may also need the dog to guard us against mistakes or malintent by the robots. The Occupational Safety and Health Administration (OSHA) reports that over the past 30 years there have been 30 fatalities caused by robots, which compares favourably with the roughly 5,000 workplace deaths recorded annually. The first known robo-killing was in 1979, when the one-ton arm of a five-story parts-retrieval robot fatally struck Robert Williams in the head. Mr Williams’ family was awarded $10 million in a jury verdict against the manufacturer. Afterwards, the plaintiff’s attorney declared, “The question, I guess, is, ‘Who serves who?’”

The US Chamber Institute for Legal Reform has issued the second edition of its report, “Torts of the Future II: Addressing the Liability and Regulatory Implications of Emerging Technologies.” The new report covers a wide range of legal issues related to artificial intelligence and unmanned systems, and its authors argue: “As robots and other products become more capable of making decisions on their own, courts may look to alternative models of liability.” According to the study, robots may already be covered under “agency law,” whereby employers would be responsible for injuries caused by their machines, much as they are for those caused by their employees. Alternatively, courts could treat robots like pets for liability purposes: “In each of these areas, the person sued does not fully control the actions of the third party or animal that led to an injury, but, in some circumstances, is liable for the consequences.”

Drones

Federal Aviation Administration (FAA) regulators are accused in some quarters of holding back progress in commercial drones. At present, the FAA restricts drones to human-operated flights away from population centers. However, the Unmanned Aerial Vehicle (UAV) market opportunity is largely predicated on autonomous “Beyond Line Of Sight” (BLOS) missions. The US Chamber Institute explains, “There is no one-size-fits-all approach to addressing liability and regulatory issues associated with emerging technology. The key is to strike the right balance between promoting innovation and entrepreneurship and addressing legitimate safety and privacy concerns.”

At WeRobot, the prize for best paper by senior scholars went to “Through the Handoff Lens: Are Autonomous Vehicles No-Win for Driver-Passengers,” by Jake Goldenfein (Cornell Tech), Deirdre Mulligan (UC Berkeley School of Information) and Helen Nissenbaum (Cornell Tech). Observing that the transport models described by technology companies, car manufacturers and researchers each generate different political and ethical consequences for users, the paper introduced the analytical lens of “handoff” for understanding the ramifications of different configurations of actors and components associated with three archetypes of autonomous vehicle – fully driverless cars, advanced driver-assist systems and connected cars. “Handoff” is an approach to tracking societal values in socio-technical systems: it exposes what is at stake in transitions of control between different components and actors in a system, whether human, regulatory, mechanical or computational.

Sexbots

A recurring pattern in technology is that, just as military contracts finance a great deal of pioneering work, the sex industry is among the earliest and most commercially successful adopters. Robotics is no different. A robot brothel has been set up in Houston by Toronto-based company KinkySdollS. Featuring on-site, short-term private rentals, it was established by a coalition of four investors, including an attorney, who said there are no regulations to stop such a service. Experts and activists say regulation is needed for the sake of consumer safety and public health.

John Banzhaf, a law professor at George Washington University, told a local newspaper, “I can buy two or three or four of these (sexbots) on the Internet and in Washington, D.C., or New York, or anywhere I want, I can set them up and charge people $100 an hour to use them.” The sexbots range from a heated slice of silicon with a face, priced at roughly the cost of a laptop, to lifelike models costing more than $10,000. Professor Banzhaf said it is time to hold legislative hearings so that reasonable rules can be written before sales explode and the services become too entrenched. He suggests there should be an effort to review whether abuse-simulating dolls increase or reduce human victimization, an unsettled debate.

One area that has seen a push for regulation is a ban on sex robots that resemble children. House of Representatives lawmakers quickly and unanimously passed the CREEPER Act, introduced by New York Representative Dan Donovan. The Curbing Realistic Exploitative Electronic Pedophilic Robots (CREEPER) Act would amend 18 U.S.C. § 1462 (which currently restricts certain obscene materials) to prohibit the importation or transportation of “anatomically-correct doll[s], mannequin[s], or robot[s], with the features of, or with features that resemble those of, a minor, intended for use in sexual acts” (Sec. 3). The legislation aims to ban importation and interstate commerce involving “any child sex doll,” but the Senate never acted and the bill died at the end of that Congress. Donovan said in a statement that his bill would “help better protect innocent children from predators” and urged the Senate “to follow the House’s lead and swiftly pass this legislation that would benefit our communities.”

The Consumer Product Safety Commission enforces standards for some parts that may go into robots, but has no specific rules for sexbots. Spokeswoman Patty Davis said, “Certainly there are component parts like batteries, and the [artificial intelligence] system itself that would have standards associated with them, but there are none for sex robots, as a product, that we are aware of.” The Food and Drug Administration could regulate sexbots as medical devices if sellers claimed the bots “treat, prevent, cure, mitigate or diagnose a disease or condition,” such as sexual dysfunction, but such claims are easily avoided by canny drafting.

If the current legal landscape is anything to go by, robot brothels are legal in 49 states, though Alabama has a sex toy ban that may apply. The U.S. Court of Appeals for the 11th Circuit upheld that ban, warning that overturning it could open the door to prostitution, and the Supreme Court declined to hear a challenge in 2005. Fifth Circuit judges struck down a similar Texas sex toy ban in a decision that was not appealed, leaving the federal circuits split.

Health and safety

Another area of examination is encroachment by robots into formerly safe public spaces such as sidewalks, Froomkin said. This was examined in the paper “Robots in Space: Sharing Our World with Autonomous Delivery Vehicles (ADVs),” by Mason Marks of Yale Law School and NYU Law School. Sidewalk delivery robots are the newest and fastest-growing segment of the industry, and currently face the fewest legal and regulatory hurdles. The paper focuses on the differences between the laws that regulate sidewalk delivery robots and those that govern other types of autonomous vehicles, and proposes legislation that would increase the safety and utility of sidewalk delivery robots while limiting the privatization of public spaces.

With the growth of robots on the shop floor, working side by side with human workers, there is natural concern about conflicts between them. OSHA does not have rules specifically designed to protect workers from robots, and there do not appear to be any plans to create them. OSHA believes that if the agency did try to create such rules, a time-consuming process measured in years rather than months, technological change would outpace them. Employees are currently covered by a range of regulations protecting workers from being struck or entrapped by machinery (29 C.F.R. 1910.212) or by unexpectedly activated machines (29 C.F.R. 1910.147), and robots would be covered under these rules.

There are also industry-created consensus standards for robot safety, which include the International Organization for Standardization’s (ISO) standards ISO 10218, Robots and Robotic Devices – Safety Requirements for Industrial Robots, and ISO 15066, Safety Requirements for Industrial Robots – Collaborative Operation. The primary U.S. standard, modeled on the ISO standard and issued by the American National Standards Institute (ANSI) and the Robotic Industries Association, is ANSI/RIA R15.06, Industrial Robots and Robot Systems – Safety Requirements. Automated ground vehicles, such as self-driving forklifts, are for the most part governed by the Industrial Truck Standards Development Foundation’s ANSI/ITSDF B56.5.

Small claims

Turning to the use of bot apps, the DoNotPay app has recently expanded to add a feature that allows you to sue a person or company with the click of a button. It is completely free, and it can win in court. The app was created by Joshua Browder, a 21-year-old Stanford University computer science student. Browder created the app to help himself and others fight expensive traffic tickets, but he eventually expanded it to help people reverse bank fees and get refunds on airline tickets whose price dropped after they were purchased. He has even helped victims of the Equifax data breach successfully sue and recoup damages after their personal information was leaked.

Now the app claims you can sue anyone in America at the touch of a button. The app asks a few questions about the legal dispute; a bot then uses the answers to classify the case into one of 15 different legal areas. The app instantly generates and fills out all of the documents needed to take the person or company to court. The user prints these out and mails them to the courthouse, and the app even supplies a script to read in court. Mr Browder says he is “trying to remove the word lawyer from the dictionary. So everything an average person would ever want, beating bureaucracy or fighting companies, you can get done hopefully in the app.” The app currently handles small claims of up to $25,000, but the ambition is to expand it to other areas of the law.
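For illustration, here is a minimal, purely hypothetical sketch of that kind of pipeline – classify a dispute into a legal area, then fill in a filing template. The category list, keyword matching and template text are assumptions made for the example, not DoNotPay’s actual categories, logic or documents.

```python
# Hypothetical sketch of a small-claims bot: classify a dispute into a
# legal area, then fill out a filing template. Illustrative only -- not
# DoNotPay's actual categories, classifier or documents.

LEGAL_AREAS = {
    "airline": "breach of contract (travel)",
    "refund": "consumer refund claim",
    "landlord": "landlord-tenant dispute",
    "data breach": "negligence (data protection)",
}

FILING_TEMPLATE = (
    "SMALL CLAIMS COMPLAINT\n"
    "Plaintiff: {plaintiff}\n"
    "Defendant: {defendant}\n"
    "Claim type: {area}\n"
    "Amount claimed: ${amount:,.2f}\n"
    "Statement of facts: {facts}\n"
)

def classify(description: str) -> str:
    """Crude keyword match standing in for the app's 15-way classifier."""
    text = description.lower()
    for keyword, area in LEGAL_AREAS.items():
        if keyword in text:
            return area
    return "general small claim"

def generate_filing(plaintiff: str, defendant: str, amount: float, facts: str) -> str:
    """Produce the document the user would print and mail to the courthouse."""
    return FILING_TEMPLATE.format(
        plaintiff=plaintiff,
        defendant=defendant,
        area=classify(facts),
        amount=amount,
        facts=facts,
    )

if __name__ == "__main__":
    print(generate_filing(
        "Jane Doe", "Acme Airlines", 350.00,
        "Airline ticket price dropped after purchase; refund was refused.",
    ))
```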

Robots and taxes

Perhaps the issues will all come down to taxes. Two years ago, the McKinsey Global Institute found that the job functions “most susceptible to automation” in the United States account for 51 percent of the activities in the economy and $2.7 trillion worth of wages. The institute estimates that “half of today’s work activities could be automated by 2055.” Such an outcome could cost hundreds of billions of tax dollars every year, and there may well be a need to tax the robots. This again takes us to the question of how robots are treated, this time under the tax code, where there is a wide gap between the traditional treatment of human labour and the treatment of automation. The robot may be seen as neither one thing nor the other, which will doubtless generate a great deal of litigation and regulatory innovation in the future.
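As a rough, back-of-the-envelope illustration of where the “hundreds of billions” figure comes from (the 25 percent effective tax rate below is an assumption chosen purely for the arithmetic, not a number from the McKinsey study):

```python
# Back-of-envelope estimate of the tax revenue tied to automatable wages.
# The effective tax rate is an illustrative assumption, not a McKinsey
# or IRS figure.

wages_at_risk = 2.7e12              # $2.7 trillion in wages (McKinsey estimate)
assumed_effective_tax_rate = 0.25   # illustrative combined income/payroll rate

revenue_at_risk = wages_at_risk * assumed_effective_tax_rate
print(f"Tax revenue at risk: ${revenue_at_risk / 1e9:.0f} billion per year")
# -> Tax revenue at risk: $675 billion per year
```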

Peter W. Singer, a political scientist and the author of “Wired for War”, has said, “The rationale for robot ‘rights’ is not a question for 2076, it’s already a question for now.” There are a number of areas where this work is becoming pressing, raising the question: what should we be asking?

 

