Peter A. Allard School of Law

Should we recognize robot rights?

Jan 7, 2025

With the rapid development and proliferation of AI tools come significant opportunities and risks that the next generation of lawyers will have to tackle, including whether AI models should be recognized as having legal rights and obligations.

These and other questions will be the focus of a new upper-level course at UBC’s Peter A. Allard School of Law, which starts tomorrow. In this Q&A, Professor Benjamin Perrin (BP) and student Nathan Cheung (NC) discuss the course and whether robots need rights.

Why launch this course?

BP: From autonomous cars to ChatGPT, AI is disrupting entire sectors of society, including the criminal justice system. There are incredible opportunities, including potentially increasing access to justice, as well as significant risks, including the potential for deepfake evidence and discriminatory profiling. Law students need principles and concepts that will stand the test of time, so that whenever a new suite of AI tools becomes available, they still have relevant frameworks to draw on. That’s the main focus of the 13-class seminar, but it’s also helpful to project what legal frameworks might be required in the future.

NC: I think AI will change how law is conducted and legal decisions are made. I was part of a group of students interested in AI and the law that helped develop the course with Professor Perrin. I’m also on the waitlist to take the course. I’m interested in learning how people who aren’t lawyers could use AI to help them with legal representation, as well as how AI might affect access to justice: If the agents are paywalled, like ChatGPT, then we’re simply maintaining the status quo of people with money having more access.

What are robot rights?

BP: In the course, we’ll consider how the law should respond if AI becomes as smart as humans, as well as whether AI agents should have legal personhood.

We already have legal status for corporations, governments, and, in some countries, for rivers. Legal personality can be a practical step for regulation: Companies have legal personality, in part, because they can cause a lot of harm and have assets available to right that harm.

For instance, if an AI commits a crime, who is responsible? If a self-driving car crashes, who is at fault? We’ve already seen a case of an AI bot ‘arrested’ for purchasing illegal items online on its own initiative. Should the developers, the owners, or the AI itself be blamed, or should responsibility be shared among all these players?

In the course casebook, we reference writings by a group of Indigenous authors who argue that there are inherent issues with the Western concept of AI as tools, and that we should look at these agents as non-human relations.

There’s been discussion of what a universal bill of rights for AI agents could look like. Such a bill could include a right for AI agents not to be deactivated without ensuring their core existence is maintained somewhere, as well as protection for their operating systems.

What is the status of robot rights in Canada?

BP: Canada doesn’t have a specific piece of legislation yet but does have general laws that could be interpreted in this new context.

The European Union has stated if someone develops an AI agent, they are generally responsible for ensuring its legal compliance. It’s a bit like being a parent: If your children go out and damage someone’s property, you could be held responsible for that damage.

Ontario is the only province to have adopted legislation regulating AI use and responsibility, specifically a bill that regulates AI use within the public sector but excludes the police and the courts. There’s a federal bill before Parliament, but it was introduced in 2022 and still hasn’t passed.

There’s effectively a patchwork of regulation in Canada right now, but there is a huge need, and opportunity, for specialized legislation related to AI. Canada could look to the European Union’s AI Act and the blueprint for an AI Bill of Rights in the U.S.

Interview language(s): English


CONTACT

Alex Walls
UBC Media Relations
Tel: 604-319-8128
Email: alex.walls@ubc.ca

