This is like the moon landing program of our time. From sensors to artificial intelligence (AI), the traditional electronics supply chain has formed a collaborative matrix dedicated to making autonomous vehicles safe. Achieving this will require a great deal of hardware and software development work to ensure that drivers, passengers, and pedestrians are protected.
Just like the original moon landing, there are still many obstacles on the road to safe self-driving vehicles. Recent accidents involving self-driving vehicles have fueled critics who argue that vehicles and their driving environments are too complex, involve too many variables, and that the algorithms and software are still too error-prone. For anyone involved in ISO 26262 functional safety compliance verification, such skepticism is understandable, and it is supported by the data. Figure 1 compares the actual mileage and the number of disengagements from autonomous mode for the five self-driving vehicle companies tested in Silicon Valley in 2017. Data for 2019 has not yet been aggregated, but individual company reports are available online.
Figure 1. Test data from the five major autonomous driving companies in California: average number of miles driven in autonomous mode between human takeovers (December 2017 to October 2018). During this period, 28 companies actively tested vehicles on public roads in California, driving a total of 2,036,296 miles in autonomous mode with 143,720 human takeovers.
But the goal is clear, and the top priority is to ensure safety as autonomous driving approaches. According to preliminary 2018 data from the California Department of Motor Vehicles (DMV), the number of human takeovers per mile driven in autonomous mode is decreasing, which indicates that autonomous driving systems are becoming more capable. This trend needs to accelerate further.
By putting collaboration and fresh thinking first, automakers will negotiate directly with chip suppliers; sensor manufacturers will discuss sensor fusion with AI algorithm developers; and software developers will build relationships with hardware providers to exploit the strengths of both. Old relationships are changing, and new ones are forming dynamically to optimize the performance, functionality, reliability, cost, and safety of the final design.
The ecosystem is searching for suitable models on which to build and test fully autonomous vehicles for rapidly emerging applications such as robo-taxis and long-haul trucks. Along the way, the sensors used in advanced driver assistance systems (ADAS) have been continuously improved, driving a rapid increase in automation.
Figure 2. The various sensing technologies used for ADAS perception and vehicle navigation often work independently and warn the driver so that he or she can react.
These sensing technologies include cameras, light detection and ranging (lidar), radio detection and ranging (radar), microelectromechanical systems (MEMS) sensors, inertial measurement units (IMUs), ultrasound, and GPS, all of which provide the critical data inputs an artificial intelligence system needs to drive a truly self-driving vehicle.
The cognitive ability of the vehicle is the cornerstone of predictive safety
A vehicle's degree of intelligence is usually expressed by its level of driving automation. L1 and L2 are mainly warning systems, while vehicles at L3 or above are authorized to take control to avoid accidents. By L5, the steering wheel is eliminated and the vehicle drives fully autonomously.
In the first few generations of systems, as vehicles began to gain L2 functions, each sensor system worked independently. These early warning systems had high false-alarm rates and caused so much annoyance that they were often switched off.
To realize fully autonomous vehicles with cognitive capabilities, the number of sensors will increase significantly, and their performance and response speed must also improve dramatically (Figure 3, Figure 4).
Figure 3. Ensuring the safety of self-driving vehicles requires fully sensing current and historical conditions, environmental characteristics, and the state of the vehicle itself (position, speed, trajectory, and mechanical condition).
Figure 4. Autonomous driving level and sensor requirements.
With more sensors installed on the vehicle, it also becomes possible to better monitor and analyze its current mechanical condition, such as tire pressure, changes in weight (for example, loaded versus unloaded, one passenger or five), and wear factors that may affect braking and handling. With more outward-looking sensing, the vehicle can more fully perceive its driving conditions and surroundings.
Improvements in sensing enable the car to recognize the current state of its environment and understand its historical state, following principles developed by Joseph Motola, CTO of ENSCO Aerospace Science and Engineering. This sensing capability supports simple tasks, such as detecting road conditions and locating potholes, as well as detailed analysis, such as identifying the types and causes of accidents that have occurred in a specific area over time.
When these cognitive concepts were first proposed, they seemed out of reach given the limitations of sensing, processing, memory capacity, and network connectivity. The situation has now changed drastically: systems can access this historical data and combine it with real-time data from the vehicle's sensors to provide increasingly accurate preventive measures that help avoid accidents.
For example, the IMU can detect the sudden jolt or swerve caused by a pothole or obstacle. In the past there was nowhere to send this information, but with a real-time connection the data can now be transmitted to a central database and used to warn other vehicles about the pothole or obstacle. The same goes for camera, radar, lidar, and other sensor data.
This data is compiled, analyzed, and fused so that the vehicle can use it to predict its driving environment, turning the vehicle into a learning machine that can be expected to make better, safer decisions than a human.
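The pothole example above can be sketched as a simple threshold detector on vertical acceleration. This is a minimal illustration, not a production algorithm; the threshold and sample trace are assumptions chosen only to show the idea.

```python
# Minimal sketch: flag IMU samples whose vertical acceleration
# (gravity removed, in m/s^2) exceeds a threshold, suggesting a
# pothole or obstacle strike worth reporting to a central database.

def detect_bumps(accel_z, threshold=4.0):
    """Return the indices of samples that exceed the threshold."""
    return [i for i, a in enumerate(accel_z) if abs(a) > threshold]

# Simulated trace: smooth road, then a sharp jolt, then smooth again.
trace = [0.1, -0.2, 0.3, 6.5, -5.8, 0.2, 0.1]
events = detect_bumps(trace)  # samples 3 and 4 cross the threshold
```

A real system would filter the raw signal and tag each event with GPS position before uploading it, but the detection step reduces to a comparison like this.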
Multi-faceted decision-making and analysis
Great progress has been made in improving vehicle perception. The focus is on collecting data from the various sensors and applying sensor fusion strategies that maximize their complementary strengths and compensate for their weaknesses under different conditions (Figure 5).
Figure 5. Each sensing technology has its own strengths and weaknesses, but with an appropriate sensor fusion strategy they can complement one another.
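One common fusion strategy is to weight each sensor's estimate by its confidence, so the most accurate sensor dominates without the others being ignored. Below is a minimal inverse-variance fusion sketch; the sensor readings and variances are hypothetical values for illustration, not data from the article.

```python
# Minimal inverse-variance fusion of independent estimates of the
# same quantity (here, range to an object in metres). Each sensor
# contributes in proportion to 1/variance, so more precise sensors
# carry more weight.

def fuse(estimates):
    """estimates: list of (value, variance). Returns (fused, variance)."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    fused_var = 1.0 / sum(weights)  # fused estimate is tighter than any input
    return fused, fused_var

radar  = (42.0, 0.25)  # good range accuracy
camera = (40.0, 4.0)   # poorer depth estimate
lidar  = (41.5, 0.09)  # best range accuracy
value, var = fuse([radar, camera, lidar])
```

The fused value lands near the lidar reading (the most precise sensor), and the fused variance is smaller than any single sensor's, which is the point of fusing complementary sensors.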
However, much work remains to truly solve the problems the industry faces. One example is improving the camera's ability to estimate lateral speed (that is, the speed of an object moving on a path perpendicular to the vehicle's direction of travel). To achieve a sufficiently low false-alarm rate, even the best machine learning algorithms still need about 300 milliseconds to detect lateral movement. For a vehicle traveling at 60 miles per hour with a pedestrian stepping into its path, every millisecond affects the severity of injury, so response time is critical.
The 300 millisecond delay comes from the time the system needs to compute delta vectors from consecutive video frames. Reliable detection requires ten or more consecutive frames, but this must be cut to one or two frames to give the vehicle enough time to respond. Radar can do this.
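The stakes of that latency gap are easy to quantify: the distance covered during the detection delay is just speed times latency. A quick back-of-envelope check (the 60 ms figure for a one-to-two-frame camera at roughly 30 fps is my assumption for comparison):

```python
# Distance a vehicle covers while the perception system is still
# deciding whether a pedestrian is moving laterally.
MPH_TO_MPS = 0.44704  # miles per hour to metres per second

def distance_during_latency(speed_mph, latency_s):
    """Metres travelled during a given detection latency."""
    return speed_mph * MPH_TO_MPS * latency_s

d_slow = distance_during_latency(60, 0.300)  # ~8 m at 300 ms
d_fast = distance_during_latency(60, 0.060)  # ~1.6 m at 60 ms
```

At 60 mph, a 300 ms delay costs about 8 m of travel before the vehicle can even begin to react, which is why cutting detection to one or two frames matters so much.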
Similarly, radar has many strengths in speed measurement and object detection, such as high resolution in azimuth and elevation and the ability to "see" around objects, but it too must give the vehicle more time to respond. Development work in the 77 GHz to 79 GHz bands is making new progress toward measuring speeds of 400 km/h or higher. Such speed measurement may seem extreme, but it is needed to support complex two-way traffic, in which vehicles traveling in opposite directions can have relative speeds exceeding 200 km/h.
Lidar compensates for the shortcomings of cameras and conventional radar and is an indispensable component of fully autonomous vehicles with cognitive capabilities (Figure 6). But it faces challenges of its own.
Figure 6. Fully autonomous vehicles rely mainly on 360° detection, which requires advanced radar, lidar, cameras, inertial measurement units, and ultrasonic sensors.
Lidar is evolving into cost-effective, compact solid-state designs that can be placed at multiple locations around the vehicle to support full 360° coverage. It complements conventional radar and camera systems by improving angular resolution and depth perception, delivering more accurate three-dimensional images of the environment.
However, near-infrared (IR) light (850 nm to 940 nm) is harmful to the retina, so output energy at 905 nm is strictly regulated to 200 nJ per pulse. Moving to short-wave infrared at wavelengths above 1500 nm, where the light is absorbed by the front surface of the eye, relaxes these limits to 8 mJ per pulse. The allowable pulse energy of a 1500 nm lidar system is thus 40,000 times that of a 905 nm system, and its detection range is four times longer. In addition, 1500 nm systems are more robust to certain environmental conditions, such as haze, dust, and fine aerosols.
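The 40,000× figure follows directly from the two regulatory limits quoted above, as a quick sanity check shows:

```python
# Pulse-energy limits quoted in the text for the two lidar wavelengths.
e_1500nm = 8e-3    # joules per pulse at 1500 nm (8 mJ)
e_905nm  = 200e-9  # joules per pulse at 905 nm (200 nJ)

ratio = e_1500nm / e_905nm  # allowable energy advantage of 1500 nm
```

The ratio comes out to 40,000, matching the figure in the text; the quoted 4× range improvement is a separate claim that also depends on detector sensitivity and atmospheric losses, not energy alone.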
The challenge for 1500 nm lidar is system cost, which is driven largely by the photodetector technology (currently based on InGaAs). Obtaining high-quality detectors, that is, with high sensitivity, low dark current, and low capacitance, will be the key to advancing 1500 nm lidar. In addition, as lidar systems enter their second and third generations, application-optimized circuit integration will be required to reduce size, power, and overall system cost.
Beyond ultrasound, cameras, radar, and lidar, other sensing technologies also play a key role in achieving fully autonomous driving. GPS lets the vehicle always know where it is. Nevertheless, there are still places where GPS signals cannot be received, such as in tunnels and among high-rise buildings. This is where the inertial measurement unit plays an important role.
Although often overlooked, the IMU is very stable and reliable because it relies on gravity, which is almost unaffected by environmental conditions. This makes it very useful for dead reckoning. When the GPS signal is temporarily lost, dead reckoning can use data from sources such as the speedometer and the IMU to track the distance and direction traveled and overlay this data on a high-definition map, keeping the self-driving vehicle on its correct path until the GPS signal is restored.
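At its core, dead reckoning is just the integration of speed and heading over time. A minimal sketch, assuming a flat 2D plane and perfect speed and heading inputs (a real system would fuse noisy IMU, wheel-speed, and map data):

```python
import math

# Minimal dead-reckoning step: advance a 2D position given the
# vehicle's speed and heading over a short time interval.

def dead_reckon(x, y, speed_mps, heading_rad, dt):
    """Return the new (x, y) after one time step."""
    x += speed_mps * math.cos(heading_rad) * dt
    y += speed_mps * math.sin(heading_rad) * dt
    return x, y

# Example: GPS drops out while driving due east (heading 0) at 20 m/s.
pos = (0.0, 0.0)
for _ in range(5):  # five one-second steps without GPS
    pos = dead_reckon(*pos, speed_mps=20.0, heading_rad=0.0, dt=1.0)
# pos has advanced 100 m east; overlaying this on an HD map keeps the
# vehicle on track until the GPS fix returns.
```

In practice errors accumulate with each step, which is why dead reckoning is a bridge between GPS fixes rather than a replacement for them.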
High-quality data saves time and saves lives
Just as important as these sensing technologies is their reliability. If the sensors themselves are unreliable, or their output signals are not captured accurately enough to deliver high-precision data downstream, then these critical sensors become meaningless. The old saying holds: garbage in, garbage out.
To ensure sensor reliability, even the most advanced analog signal chains must be continuously improved so that sensor signals are detected, acquired, and digitized without their accuracy and precision drifting over time and temperature. Using the right devices and design methods can greatly mitigate well-known problems such as bias drift over temperature, phase noise, interference, and other instabilities. High-precision, high-quality data is the foundation on which machine learning and artificial intelligence processors are properly trained and make correct decisions. Generally, there is no second chance to start over.
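One common way to tame the bias drift mentioned above is first-order temperature compensation using factory calibration data. The sketch below is illustrative only; the coefficients are assumed values, and real sensors often need higher-order or per-unit calibration.

```python
# Minimal first-order temperature compensation of sensor bias.
# Coefficients would come from factory calibration; these values
# are assumptions for illustration.

T_REF = 25.0           # reference temperature, deg C
BIAS_AT_REF = 0.02     # measured bias at T_REF, in sensor units
DRIFT_PER_DEG = 0.001  # bias change per deg C (linear model)

def compensate(raw, temp_c):
    """Subtract the modeled temperature-dependent bias from a raw reading."""
    bias = BIAS_AT_REF + DRIFT_PER_DEG * (temp_c - T_REF)
    return raw - bias

# At 60 deg C the modeled bias is 0.055, so a raw reading of 1.075
# corrects to about 1.02.
corrected = compensate(1.075, 60.0)
```

The same pattern (model the drift, measure the temperature, subtract the model) underlies much of what keeps a signal chain's accuracy from wandering over its operating range.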
Once data quality is assured, the various sensor fusion methods and artificial intelligence algorithms can respond optimally. In fact, no matter how well an artificial intelligence algorithm is trained, once the model is compiled and deployed to devices at the network edge, its effectiveness depends entirely on highly accurate, reliable sensor data.
This interplay between sensor modalities, sensor fusion, signal processing, and artificial intelligence has a profound impact on the development of intelligent, cognitive autonomous vehicles, and on the safety of drivers, passengers, and pedestrians. But without highly reliable, accurate, high-precision sensor information, which is the foundation of safe autonomous vehicles, none of it matters.
As with any advanced technology, the more work we do in this area, the more complex the use cases that must be solved become. This complexity will continue to challenge existing technology, so we look forward to next-generation sensors and sensor fusion algorithms that can meet these challenges.
Just as with the original moon landing, we have high expectations for the overall autonomous vehicle program, hoping it will bring profound change and lasting benefit to society. The progression from assisted driving to autonomous driving will not only greatly improve traffic safety but also significantly increase productivity. And such a future depends entirely on sensors; everything else is built on top of them.
For the past 25 years, ADI has been committed to automotive safety and ADAS development, and it is now laying the foundation for the future of autonomous driving. Building on its strengths in inertial navigation, high-performance radar, and lidar, ADI provides high-performance sensors and signal/power chain solutions that will not only greatly improve the performance of these systems but also reduce the implementation cost of the entire platform, accelerating the path to autonomous driving.