by Chris Jackson (email@example.com)
In 2016, it became apparent to several key staff at UCLA that something was missing in the field of autonomous systems. Autonomous vehicles are poised to revolutionize transport, and the potential safety and reliability benefits are considerable. But how do we ensure that an autonomous system is safe and reliable before we use it? How do we test or demonstrate autonomous system safety and reliability? How do we measure this? What design frameworks need to be implemented to realize robust systems? Is there some sort of guidebook for regulators and the regulated to meet these challenges? We want to be the ones to provide these frameworks and guidelines.
Everything about autonomous systems is framed in terms of safety and reliability. Sensors are designed to detect people and obstacles so that the vehicle can avoid them. An often-quoted statistic is that around 94% of vehicle accidents are caused by human error, a mechanism that is by definition removed from something that is autonomous. But when presented with an autonomous system, how does a regulatory body assess that it is 'safe and reliable' enough? How does a manufacturer drive its design team to do what is necessary to ensure that the horror scenarios associated with autonomous systems never materialize? This was the catalyst for us forming the Center for the Safety And Reliability of Autonomous Systems (SARAS).
We realized that there was something we could do. We have access to some of the leading experts and organizations in autonomous system technology, and we hope in the near future to start contributing to the area in very real and tangible ways. But we first had to establish a presence and communicate what we are about, which brings us to the 'guiding hands' logo.
When we set out to create a representative image, we knew there were some key things we needed to communicate. The right hand of the logo is robotic, and represents the 'machine' that will become our autonomous system. The right hand was chosen to be robotic because it is typically through the right hand that we interact with the world around us and, importantly, control the things we want to control. It is the right hand that guides the system, and this hand will not be 'human' in an autonomous system.
The left hand is human. There will always be a human element in every autonomous system – and this should never be forgotten. Autonomous systems are used to achieve human goals. Without human guidance, systems can never truly be autonomous. Their pseudo-decision making ability is learned from us for as long as they are used.
The last element of our logo is the networked globe. 'Autonomous system' may be somewhat of a misnomer: such systems will generally need to be connected to many systems around them. Systems that we think are autonomous may actually gain their 'autonomy' from other systems that transmit this capability on an ongoing basis. Further, autonomous systems can provide substantial benefit to us all by being networked. Knowing where other systems are, where accidents have occurred, what local weather conditions are (and so on) allows autonomous systems to adapt in ways that we humans cannot. And they can learn how to do this.
The creation of SARAS is based on a holistic approach to autonomous system reliability and safety. We need to understand that the 'system' is actually 'everything': people, public infrastructure, the environment, and so on. This is where we start, and hopefully where we end is safe and reliable.