Lidar imaging can be used in a wide variety of sensing applications involving situational awareness in automated platforms or vehicles. The system is fully tolerant to solar background thanks to its patented solid-state scanning system, making it suitable for applications across multiple sectors, including:

Vehicle navigation / Collision avoidance / Object detection and tracking / Advanced surveillance /
Vision aids / Multisensory approaches


Autonomous cars are a well-known application case, spanning consumer vehicles, robotaxis, off-road machinery and heavy transport. All of these vehicles share the current trend towards increased automation and safety, led by improved solid-state, high-resolution lidar sensors. Lidar technology is on the roadmap of all the major manufacturers of vehicle equipment, supporting the development of better and safer vehicles with improved navigation capabilities. L3CAM moves this one step further.

The robustness and high-density point clouds of L3CAM may be combined seamlessly with any passive imager, including RGB, NIR, SWIR or polarimetric cameras. Autonomous vehicles need advanced sensor suites with several complementary sensors onboard, and the L3CAM architecture fuses the 3D lidar point cloud and the 2D imaging sensors in hardware, speeding up data fusion and machine learning pipelines in a process free of parallax error and with a very small computational cost compared to software-based approaches.
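For context, the software-based alternative that hardware fusion avoids typically projects each lidar point into the camera image and samples the pixel it lands on. The minimal sketch below illustrates that projection step with a pinhole camera model; all parameters (intrinsics, image size, the synthetic point cloud) are illustrative assumptions, not L3CAM specifics.

```python
import numpy as np

# Hypothetical pinhole intrinsics: fx, fy = 800 px, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def colour_point_cloud(points_xyz, rgb_image):
    """Return an (N, 6) array [x, y, z, r, g, b] for points visible in the image."""
    h, w, _ = rgb_image.shape
    in_front = points_xyz[:, 2] > 0           # discard points behind the camera
    pts = points_xyz[in_front]
    uvw = (K @ pts.T).T                       # homogeneous pixel coordinates
    uv = (uvw[:, :2] / uvw[:, 2:3]).round().astype(int)
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    pts, uv = pts[inside], uv[inside]
    colours = rgb_image[uv[:, 1], uv[:, 0]]   # sample RGB at the projected pixels
    return np.hstack([pts, colours.astype(float)])

# Tiny synthetic example: three points and a flat grey image.
cloud = np.array([[0.0, 0.0, 5.0],    # projects onto the principal point
                  [1.0, 0.0, 5.0],
                  [0.0, 0.0, -1.0]])  # behind the camera, discarded
image = np.full((480, 640, 3), 128, dtype=np.uint8)
fused = colour_point_cloud(cloud, image)
```

Every step above depends on an accurate extrinsic calibration between the two sensors and suffers parallax where the viewpoints differ, which is precisely what a hardware-fused, co-registered design sidesteps.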


Solutions for the railway industry involving imaging lidar go beyond autonomous trains. Examples include the prevention or mitigation of collisions in urban areas where railway or tram services mix with road traffic. Imaging lidar sensors can also be used for continuous infrastructure monitoring, automated inspection of installations, and preventive inspection of trenches or detection of obstacles, including moving ones.


Novel automated cargo ships are starting to sail, for which the detection of smaller boats or obstacles around them is critical; this can be performed with a sensor fusion suite like the L3CAM. Imaging lidar sensors are especially capable of detecting small floating objects (like buoys or surface debris) beyond the capabilities of radar sensors. Similarly, but for different reasons, small recreational or fishing vessels also benefit from the detection of small floating obstacles and other vessels, whether as navigation support or for safe autonomous navigation.

Data fusion enables enhanced object detection. In the maritime environment, the fusion of polarimetric and lidar 3D data provides especially reliable detection of small-cross-section objects on the surface around the vessel. This is particularly relevant for large cargo ships, where fast and accurate detection of persons or small fishing or leisure vessels is critical for safe path planning.


Autonomous robotic spacecraft and satellites are being prepared for the next steps of space exploration and in-orbit services across a variety of uses, where high-accuracy multi-sensor units including high-density 3D sensors are expected to be critical. Applications include orbital robotics, close-proximity navigation, space debris detection and tracking, rover navigation, and terrain mapping for landing.


Self-guided mining vehicles are increasingly used to reduce labour costs and avoid endangering human lives in hard-to-access mining operations. Both the sensing on the mining vehicle itself, to optimize the use of cutting and drilling tools, and the automated vehicles that transport materials to the surface gain performance from a lidar imaging sensor.

Underwater vehicles

Lidar imaging is especially well suited to underwater environments, since it perfectly complements sonar-based techniques, is tolerant to particles in suspension, and provides a degree of resolution hitherto unattainable with sonar. The technology may be customized for each specific solution with concrete needs, including the supervision of infrastructures and the maintenance of facilities and installations (petrol platforms, oil rigs, gas or fiber pipelines, etc.), with high spatial resolution. The inspection and maintenance of the submerged areas of recreational and industrial vessels, allowing integrity checks without taking the vessel out of the water, is also a very relevant application, easily implementable on small robotic underwater vehicles.


The proposed technology presents great potential for any type of automated guided vehicle, adding real-time sensing capacity that can be adjusted to the required performance. This includes sensors for robotic vehicles in outdoor warehouses, applications requiring high-resolution imaging or depth sensing, and even interfacing with robots in outdoor environments or at long distances.

Security & Defense

Real-time image acquisition in three dimensions adds substantial value to many applications in this area. In particular, applications have been identified in the robotic mapping of hazardous environments (contaminated disaster scenarios, remote control of critical installations, preventive maintenance of extensive installations, etc.) and in border control, especially the detection and monitoring of aircraft and other unmanned vehicles. The technology even allows imaging through obscurants, for example under-foliage detection or the aerial detection of objects obscured by sandstorms.


As in the other transport markets already discussed, specific applications have been identified in the aeronautics industry, where the technology is especially relevant as a complementary sensor for navigation in adverse atmospheric conditions, or during the critical landing manoeuvre, in particular on irregular or soft ground. Landing operations in mist or rain, or helicopter landings amid rising clouds of snow or dust, have been identified by several armies as severe problems which compromise the visibility of the pilot and the integrity of the aircraft, and which are tackled easily by our patented lidar technology. Low-weight versions for low-flying drone applications are equally feasible.