
May 11, 2021

Law firms are building A.I. expertise as regulation looms


There’s a wide gulf between A.I. system builders and legal experts. One boutique law firm is hoping to bridge it.

What makes BNH.ai unique is that it combines deep expertise in data science and machine learning with deep expertise in law and regulation, especially around data privacy. Andrew Burt and Patrick Hall founded the firm in Washington, D.C., because it is one of the few places in the U.S. where non-lawyers are allowed to be equal equity partners in a law firm. Although founded only shortly before the coronavirus pandemic struck, the firm already has a growing roster of clients that includes some of the largest U.S. technology companies, as well as companies in financial services, insurance, and healthcare.

“We are able to get extremely hands on and, in certain cases, we are even writing code to correct models or make new models and that sets us apart from most firms out there,” Hall tells me. Going forward, Hall says, more law firms and legal departments are likely to combine machine learning and legal expertise, given the complexity of both fields. The problem, he says, is that today there is little overlap between people in the two spheres, and they speak completely different languages.

Hall was previously a senior data scientist at a number of enterprise software companies. He says the very last question many teams would ask when building an A.I. application is, “Is it legal?” It should be among the first questions asked, he says, and it needs to be answered by someone with legal expertise as well as an understanding of how the particular model or algorithm works.

Just because A.I. is an emerging area of law doesn’t mean there aren’t plenty of ways companies using the technology can land in legal hot water today. He says this is particularly true if an algorithm winds up discriminating against people based on race, sex, religion, age, or ability. “It’s astounding to me the extent to which A.I. is already regulated and people are operating in gleeful bliss and ignorance,” he says.

Most companies have been lucky so far: enforcement agencies have generally had too many other priorities to look hard at the subtler cases of algorithmic discrimination, such as a chatbot that might steer white customers and Black customers toward different car insurance deals, Hall says. But he thinks that is about to change, and that many businesses are in for a rude awakening.
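For readers curious what a first-pass check for that kind of disparity looks like, here is a minimal sketch, not drawn from BNH.ai’s work, that computes an adverse impact ratio over hypothetical insurance-quote data. The column names, the toy data, and the use of the “four-fifths rule” threshold as a rough flag are all assumptions for illustration, not a legal test.

```python
import pandas as pd

def adverse_impact_ratio(df: pd.DataFrame, group_col: str, outcome_col: str,
                         protected: str, reference: str) -> float:
    """Ratio of favorable-outcome rates: protected group vs. reference group.

    A ratio below roughly 0.8 is often treated as a warning sign under the
    'four-fifths rule' used in U.S. employment-discrimination analysis.
    """
    rates = df.groupby(group_col)[outcome_col].mean()
    return rates[protected] / rates[reference]

# Hypothetical data: 1 = offered the better insurance deal, 0 = not offered.
quotes = pd.DataFrame({
    "race":  ["White", "White", "Black", "Black", "White", "Black"],
    "offer": [1,        1,       0,       1,       1,       0],
})

ratio = adverse_impact_ratio(quotes, "race", "offer", "Black", "White")
print(f"Adverse impact ratio: {ratio:.2f}")  # well below 0.8 on this toy data
```

Real reviews of the sort Hall describes go far beyond a single ratio, but even this simple comparison of outcome rates across groups is more scrutiny than many deployed systems receive.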

Working with Georgetown University’s Center for Security and Emerging Technology and the Partnership on A.I., Hall was among the researchers who have helped document 1,200 publicly reported cases of A.I. “system failures” in just the past three years. The consequences have ranged from people being killed (in the infamous case of Uber’s self-driving car striking a pedestrian in Arizona), to false arrests based on facial recognition systems misidentifying people, to individuals being excluded from job interviews.

He thinks that data scientists and machine learning engineers need to adopt a mindset closer to that of civil or aerospace engineers, cybersecurity experts, or, for that matter, lawyers: the “adversarial” point of view that assumes every system is fallible, and that knowing exactly where it fails and what the consequences of that failure would be is vital. People building A.I. should assume other people will try to game the system, abuse it, or fool it in bizarre ways. Too often today, he says, machine learning teams are rewarded for building systems that perform well on average, or that beat benchmark tests, even if those systems are vulnerable to catastrophic failures when presented with unusual “corner” cases.
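To make that gap concrete, here is a small hypothetical sketch (the failure rates and the 5% “corner case” share are invented for illustration) showing how a model can post a strong average score while performing badly on a rare slice of the data, which is exactly what an adversarial mindset goes looking for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical evaluation set: 95% "common" cases, 5% "corner" cases.
n = 10_000
is_corner = rng.random(n) < 0.05

# Suppose the model is right 97% of the time on common cases
# but only 40% of the time on the rare corner cases.
correct = np.where(is_corner, rng.random(n) < 0.40, rng.random(n) < 0.97)

print(f"Average accuracy:     {correct.mean():.1%}")
print(f"Common-case accuracy: {correct[~is_corner].mean():.1%}")
print(f"Corner-case accuracy: {correct[is_corner].mean():.1%}")
```

A headline accuracy in the mid-90s hides the 40% corner-case performance entirely, which is why Hall argues for evaluating worst cases, not just averages.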

Hall says he welcomes the FTC’s recent signals that it plans to crack down on companies whose A.I. systems discriminate, or that use deceptive or misleading practices either to gather the data used to train an A.I. or to market their A.I. software. “I think that will shake up the future of machine learning in the U.S. and I think that is a good thing,” he says. “There’s a lot of snake oil and sloppiness that hurts people today.”

Many businesspeople don’t love lawyers. But this may be a case where we should all be grateful to see the suits and briefcases knocking on the door.

BY JEREMY KAHN
