Before machines can be autonomous, humans must work to ensure their safety


SOURCE: NEWS.VIRGINIA.EDU
NOV 03, 2021

As the self-driving car travels down a dark, rural road, a deer lingering among the trees up ahead looks poised to dart into the car’s path. Will the vehicle know exactly what to do to keep everyone safe?

Some computer scientists and engineers aren’t so sure. But researchers at the University of Virginia School of Engineering and Applied Science are hard at work developing methods they hope will bring greater confidence to the machine-learning world – not only for self-driving cars, but for planes that can land on their own and drones that can deliver your groceries.

The heart of the problem is that key functions of the software that guides self-driving cars and other machines through their autonomous motions are not written by humans – instead, those functions are the product of machine learning. Machine-learned functions are expressed in a form that makes it essentially impossible for humans to understand the rules and logic they encode, which in turn makes it very difficult to evaluate whether the software is safe and in the best interest of humanity.
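
To make that contrast concrete, here is a minimal, hypothetical sketch in Python (not the LESS Lab's code; the tiny network, its random weights and the braking scenario are illustrative assumptions). A hand-written rule can be read and audited line by line, while a learned function's behavior is spread across numeric parameters that offer nothing comparable to review.

```python
# Illustrative sketch only: contrasts a human-authored rule with a "learned"
# function whose behavior lives entirely in numeric parameters. The weights
# below are random stand-ins for what a training process would produce.
import numpy as np

def handwritten_braking_rule(distance_m: float, speed_mps: float) -> bool:
    """Human-authored rule: brake if the obstacle sits inside a 3-second
    travel envelope. Every threshold is visible and reviewable."""
    return distance_m < 3.0 * speed_mps

def learned_braking_function(features, w1, b1, w2, b2) -> bool:
    """A tiny neural network. Its 'rule' is encoded in hundreds of numbers;
    reading them does not reveal when, or why, it decides to brake."""
    hidden = np.maximum(0.0, features @ w1 + b1)  # hidden layer with ReLU
    score = hidden @ w2 + b2                      # single output score
    return score.item() > 0.0

# Random placeholder weights (a real system would learn these from data).
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(2, 64)), rng.normal(size=64)
w2, b2 = rng.normal(size=(64, 1)), rng.normal(size=1)

features = np.array([40.0, 25.0])  # hypothetical distance (m) and speed (m/s)
print("hand-written rule says brake:", handwritten_braking_rule(40.0, 25.0))
print("learned function says brake:", learned_braking_function(features, w1, b1, w2, b2))
```

Checking that the hand-written rule does what its author intended is a matter of reading it; checking the learned function requires the kind of mathematical analysis these researchers study.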

Researchers in UVA’s Leading Engineering for Safe Software Lab – the LESS Lab, as it’s commonly known – are working to develop the methods necessary to provide society with the confidence to trust emerging autonomous systems.

The team’s researchers are Matthew B. Dwyer, Robert Thomson Distinguished Professor; Sebastian Elbaum, Anita Jones Faculty Fellow and Professor; Lu Feng, assistant professor; Yonghwi Kwon, John Knight Career Enhancement Assistant Professor; Mary Lou Soffa, Owens R. Cheatham Professor of Sciences; and Kevin Sullivan, associate professor. All hold appointments in the UVA Engineering Department of Computer Science. Feng holds a joint appointment in the Department of Engineering Systems and Environment.

Since its creation in 2018, the LESS Lab has rapidly grown to support more than 20 graduate students, publish more than 50 papers and secure competitive external awards totaling more than $10 million in research funding. The awards have come from such agencies as the National Science Foundation, the Defense Advanced Research Projects Agency, the Air Force Office of Scientific Research and the Army Research Office.

The lab’s growth trajectory matches the scope and urgency of the problem these researchers are trying to solve.

The Machine-Learning Explosion

An inflection point in the rapid rise of machine learning came just a decade ago, when computer vision researchers won the ImageNet Large Scale Visual Recognition Challenge – a competition to identify objects in photos – using a machine-learning solution. Google took notice and quickly moved to capitalize on the use of data-driven algorithms.

Other tech companies followed suit, and public demand for machine-learning applications snowballed. Last year, Forbes estimated that the global machine-learning market is growing at a compound annual growth rate of 44% and is on track to become a $21 billion market by 2024.

But as the technology ramped up, computer scientists started sounding the alarm that mathematical methods to validate and verify the software were lagging.
