On April 22, NIO explained the advantages of its lidar system in an article titled “What does the world look like in the eyes of lidar?”
When driving into glare during the day
Or traveling through the dark of night
Are you always alert to the dangers lurking in the shadows?

All of this can be seen by ultra-long-range lidar

Lidar can penetrate the atmosphere over ultra-long distances
So it is no longer limited by ambient light, shadow, or object size
It sees far, sees clearly, and sees steadily
Making it the autonomous driving industry’s current
Preferred solution for ultra-long-range perception

—
In 2021, the ET7 officially debuted
As the world’s first mass-produced car equipped with lidar
And it still leads the industry today

Lidar’s ultra-long-range detection and atmospheric-penetration capabilities help the vehicle achieve high-precision, efficient, and safe driving
Now, together with the ET7
Discover the dangers lurking in the dark
01 |
A watchtower-style layout reduces blind spots
Learning of ultra-long-range road conditions in advance
With a maximum detection distance of up to 500 meters


02 |
A dynamically adjustable ROI (region of interest)
Enables smaller generic obstacles to be detected more accurately


03 |
Penetrating atmospheric interference
It can cope with scenes such as strong glare at night
Seeing and responding earlier than a human driver

04 |
Undaunted by harsh conditions such as backlighting and low visibility
It detects target objects more stably


— |
The NIO intelligent driving hardware system
Aquila Super Sensing
Serves as the “eyes” of the whole car
Fusing lidar with multi-sensor perception
It provides the vehicle with a comprehensive, high-volume output of environmental information
Its 33 pieces of high-performance sensing hardware
Define a new standard for the autonomous-driving perception systems of mass-produced vehicles

Combined with the 1,016 TOPS of computing power of the Adam supercomputing platform
It brings faster path planning and control decisions to the car

On NIO models currently running the Banyan·Rong intelligent system
In addition to the NOP+ enhanced navigation assistance function
Fusion perception also performs real-time computation and decision-making for assisted driving functions such as lane centering, intelligent adaptive cruise control, and the instrument cluster’s environment perception and simulation display
Efficiently safeguarding driving safety
— |
At the NIO Center | Shanghai Auto Show
In the NIO Innovations exhibition area
We bring a real-time perception demonstration
Of the Aquila Super Sensing system for the first time
Leading everyone into the world of Aquila

–Full-stack, in-house-developed visualization: an “occupancy grid” indicates which areas around the vehicle are occupied by obstacles. Each grid cell represents a small 3D region of the real world, and a combination of cells can finely characterize obstacles of arbitrary shape. The occupancy grid representation also outputs rich geometric and semantic information, such as category, dynamic or static state, speed, 3D height, and 3D position (a minimal sketch follows this list).
–Multi-sensor association: what the car’s “many eyes” see is fused into a single, stable picture, providing consistent 360-degree surround-view detection results (a toy association sketch also follows the list).
–Multi-view and temporal input: the system simultaneously performs dynamic target detection, tracking, speed estimation, and prediction, as well as static environment segmentation, clustering, and occupancy grid prediction, integrating multiple perception functions and deploying them at scale on mass-produced vehicles.
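To make the “occupancy grid” idea concrete, here is a minimal Python sketch, not NIO’s implementation: it rasterizes a lidar point cloud into a 2D grid around the ego vehicle and keeps a per-cell maximum height. The cell size, grid extent, and function names are illustrative assumptions.

```python
# A minimal sketch (not NIO's implementation) of rasterizing lidar points
# into a 2D occupancy grid around the ego vehicle. Cell size, grid extent,
# and the stored attributes (occupancy, max height) are assumed values.
import numpy as np

CELL_SIZE = 0.2      # meters per grid cell (assumed)
GRID_RANGE = 50.0    # grid covers +/- 50 m around the ego vehicle (assumed)
N = int(2 * GRID_RANGE / CELL_SIZE)

def build_occupancy_grid(points: np.ndarray):
    """points: (M, 3) array of lidar returns (x, y, z) in the ego frame.

    Returns an (N, N) boolean occupancy grid and an (N, N) array holding
    the maximum point height per cell, a crude stand-in for "3D height".
    """
    occupied = np.zeros((N, N), dtype=bool)
    max_height = np.full((N, N), -np.inf)

    # Keep only points inside the grid extent.
    in_range = (np.abs(points[:, 0]) < GRID_RANGE) & (np.abs(points[:, 1]) < GRID_RANGE)
    pts = points[in_range]

    # Convert metric coordinates to integer cell indices.
    ix = ((pts[:, 0] + GRID_RANGE) / CELL_SIZE).astype(int)
    iy = ((pts[:, 1] + GRID_RANGE) / CELL_SIZE).astype(int)

    occupied[ix, iy] = True
    # Track the tallest return in each cell.
    np.maximum.at(max_height, (ix, iy), pts[:, 2])
    return occupied, max_height

# Usage with a fake point cloud: one small obstacle about 10 m ahead.
cloud = np.random.normal(loc=[10.0, 0.0, 0.5], scale=0.3, size=(200, 3))
occ, height = build_occupancy_grid(cloud)
print(occ.sum(), "cells occupied; tallest point:", height[occ].max())
```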
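And here is a toy sketch of multi-sensor association under simple assumptions: detections from two sensors are matched by nearest neighbour within a fixed distance gate and the matched positions averaged. The real Aquila fusion stack is certainly more sophisticated, so treat the names and thresholds as hypothetical.

```python
# A toy sketch (assumptions only, not the Aquila algorithm) of associating
# detections from two sensors by nearest-neighbour matching in the
# bird's-eye-view plane, then averaging matched positions into one list.
from math import hypot

GATE = 2.0  # max association distance in meters (assumed)

def associate(lidar_dets, camera_dets):
    """Each detection is an (x, y) position in the ego frame.

    Returns fused positions for matched pairs plus any unmatched
    detections, mimicking a unified 360-degree result.
    """
    fused, used = [], set()
    for lx, ly in lidar_dets:
        # Find the closest unused camera detection within the gate.
        best, best_d = None, GATE
        for j, (cx, cy) in enumerate(camera_dets):
            d = hypot(lx - cx, ly - cy)
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is None:
            fused.append((lx, ly))                        # lidar-only object
        else:
            used.add(best)
            cx, cy = camera_dets[best]
            fused.append(((lx + cx) / 2, (ly + cy) / 2))  # simple average
    # Camera-only objects that no lidar detection claimed.
    fused += [c for j, c in enumerate(camera_dets) if j not in used]
    return fused

print(associate([(10.0, 0.2), (30.0, -5.0)], [(10.3, 0.0), (55.0, 2.0)]))
```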
As a result, the vehicle can cope with the many complex challenges of intelligent driving, delivering a safe and worry-free journey.