Sentry - HLU

In a "multi-layer" hierarchical architecture where low-level functionality is needed to control real-time devices to the application layer where complex algorithms  are performed, the High Level Unit (HLU) is the processing unit which is delegated to perform the functionality considered application.

For example, the HLU can be dedicated to detecting and tracking different types of "objects" (classes of interest) in video frames, such as people and dogs.

In a security application, the detected objects are treated differently: dogs are ignored, while people are reported as intruders.

Typically, the HLU communicates with a Ground Control Station (GCS), a software application designed to monitor and supervise system activities.
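As an illustration of this filtering and reporting logic, a minimal sketch follows; the class names, confidence threshold, GCS address/port and message format are assumptions made for the example, not the actual product interface.

```python
# Minimal sketch: filter detections by class and forward intruders to the GCS.
# Class names, threshold, GCS address/port and message format are illustrative
# assumptions, not the actual product interface.
import json
import socket

IGNORED_CLASSES = {"dog"}        # detections to neglect
INTRUDER_CLASSES = {"person"}    # detections to report as intruders

def report_intruders(detections, gcs_host="192.168.1.10", gcs_port=5000):
    """detections: list of dicts like {"label": "person", "score": 0.92, "bbox": [x, y, w, h]}."""
    intruders = [d for d in detections
                 if d["label"] in INTRUDER_CLASSES and d["score"] > 0.5]
    if not intruders:
        return
    message = json.dumps({"event": "intruder", "detections": intruders}).encode()
    with socket.create_connection((gcs_host, gcs_port), timeout=1.0) as sock:
        sock.sendall(message)
```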

The algorithms are based on two state-of-the-art approaches, namely convolutional neural networks and particle-filter tracking.
Convolutional neural networks (CNNs) are machine-learning algorithms that have attracted great interest in both the scientific and the industrial world over the last few years; their main applications are object detection and classification on images. Particle-filter tracking, on the other hand, follows objects of interest across images using a probabilistic approach that scales well and delivers good tracking performance.
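As an illustration of the tracking approach, here is a minimal particle-filter sketch for a single object position in image coordinates; the random-walk motion model, Gaussian likelihood and all parameter values are assumptions made for the example, not the product's actual algorithm.

```python
# Minimal particle-filter tracker sketch for one object position in image
# coordinates. Motion model (random walk), likelihood (Gaussian around a
# detection) and parameter values are illustrative assumptions.
import numpy as np

class ParticleTracker:
    def __init__(self, init_xy, n_particles=500, motion_std=10.0, meas_std=20.0):
        self.particles = np.tile(np.asarray(init_xy, dtype=float), (n_particles, 1))
        self.weights = np.full(n_particles, 1.0 / n_particles)
        self.motion_std = motion_std
        self.meas_std = meas_std

    def predict(self):
        # Random-walk motion model: diffuse the particles with Gaussian noise.
        self.particles += np.random.normal(0.0, self.motion_std, self.particles.shape)

    def update(self, detection_xy):
        # Weight particles with a Gaussian likelihood centred on the detection.
        d2 = np.sum((self.particles - np.asarray(detection_xy, dtype=float)) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / self.meas_std ** 2)
        if w.sum() < 1e-300:
            w[:] = 1.0                      # degenerate case: fall back to uniform weights
        self.weights = w / w.sum()
        # Multinomial resampling concentrates particles on likely positions.
        idx = np.random.choice(len(self.particles), len(self.particles), p=self.weights)
        self.particles = self.particles[idx]
        self.weights.fill(1.0 / len(self.particles))

    def estimate(self):
        # Particles carry uniform weights after resampling, so the mean suffices.
        return self.particles.mean(axis=0)
```

At each frame, predict() advances the particles; update() reweights and resamples them when a CNN detection is associated with the track; estimate() returns the tracked position.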

The algorithms mentioned above require considerable computational capacity. Normally they cannot run "online", because processing a single image takes on the order of seconds. The embedded platform, however, is used to accelerate the machine-learning computations enough to make "online" processing possible.

The platform is built around high-performance HW chipsets, including 256 CUDA cores, a quad-core ARM® v8-A architecture, HEVC encoding, 12-bit decoding, 8 GB of LPDDR4 memory, 32 GB of eMMC storage, a Camera Serial Interface for up to 6 cameras, CAN, USB, Ethernet, WiFi and Bluetooth.

The GPU provides 256 CUDA cores, allowing a large number of operations to run in parallel and ensuring a considerable speedup when processing images. Together with the computational capacity of the quad-core ARM processor, the workload can be split across the different computing units, taking into account the intrinsic characteristics of each processor.
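A minimal sketch of such a split is shown below: one thread feeds frames to a GPU-bound detector while another runs CPU-bound tracking. The two worker functions are placeholders standing in for the CNN and the particle filter, not actual product code.

```python
# Sketch of splitting the load between computing units: a GPU-bound detector
# thread and a CPU-bound tracker thread connected by queues. detect_on_gpu()
# and track_on_cpu() are placeholder stubs, not actual product code.
import queue
import threading

frames = queue.Queue(maxsize=4)       # decoded frames waiting for the detector
detections = queue.Queue(maxsize=4)   # detector output waiting for the tracker

def detect_on_gpu(frame):
    # Placeholder for CNN inference running on the CUDA cores.
    return []

def track_on_cpu(dets):
    # Placeholder for particle-filter tracking running on the ARM cores.
    pass

def detector_worker():
    while True:
        frame = frames.get()
        if frame is None:                 # poison pill: shut the pipeline down
            detections.put(None)
            break
        detections.put(detect_on_gpu(frame))

def tracker_worker():
    while True:
        dets = detections.get()
        if dets is None:
            break
        track_on_cpu(dets)

threading.Thread(target=detector_worker, daemon=True).start()
threading.Thread(target=tracker_worker, daemon=True).start()
```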

The hardware decoder on the platform guarantees real-time decoding of the H.264 streams produced by the cameras without burdening the ARM CPU.
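For example, on an NVIDIA Jetson-class board the hardware decoder is typically reached through a GStreamer pipeline; the element names (nvv4l2decoder, nvvidconv) and the RTSP URL below are assumptions that depend on the platform and camera, not a guaranteed configuration.

```python
# Sketch: open an H.264 camera stream through the hardware decoder using a
# GStreamer pipeline in OpenCV. Element names (nvv4l2decoder, nvvidconv) are
# those commonly available on NVIDIA Jetson boards; the RTSP URL is an
# assumption.
import cv2

pipeline = (
    "rtspsrc location=rtsp://192.168.1.20/stream ! "
    "rtph264depay ! h264parse ! nvv4l2decoder ! "
    "nvvidconv ! video/x-raw, format=BGRx ! "
    "videoconvert ! video/x-raw, format=BGR ! appsink"
)
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # frame is now available to the detection/tracking pipeline
cap.release()
```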

 

LLU-PILOT

THE AUTOPILOT FOR LAND VEHICLES

It can be installed on any land vehicle, has a small footprint and low power consumption, and is expandable according to requirements.

LLU-PILOT is an autonomous driving system that can be installed on any land vehicle, for indoor/outdoor use, providing autonomous movement capability and the responsiveness needed to avoid incoming obstacles:
It computes the 3D scene detected around the vehicle by the sensors, maps obstacles, plans candidate trajectories and executes one of them, in real time, with a total latency of less than 1.1 ms (a professional driver's response time is 150-200 ms).
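A minimal sketch of this kind of loop is shown below: candidate trajectories are rolled out over an obstacle occupancy grid and the best collision-free one is selected. The grid resolution, the unicycle motion model, the number of candidates and the cost function are illustrative assumptions, not the actual product algorithm.

```python
# Sketch: roll out candidate trajectories over an obstacle occupancy grid and
# pick the best collision-free one. Grid resolution, unicycle motion model and
# cost function are illustrative assumptions, not the product algorithm.
import numpy as np

def best_trajectory(occupancy, speed=1.0, dt=0.05, horizon=40,
                    n_candidates=1024, cell=0.1):
    """occupancy: 2D boolean grid centred on the vehicle (True = obstacle)."""
    h, w = occupancy.shape
    cx, cy = w // 2, h // 2
    curvatures = np.linspace(-1.0, 1.0, n_candidates)   # candidate steering arcs
    best_k, best_cost = None, float("inf")
    for k in curvatures:
        x = y = theta = 0.0
        collided = False
        for _ in range(horizon):
            # Unicycle roll-out of the arc defined by curvature k.
            x += speed * np.cos(theta) * dt
            y += speed * np.sin(theta) * dt
            theta += speed * k * dt
            i = cy + int(round(y / cell))
            j = cx + int(round(x / cell))
            if not (0 <= i < h and 0 <= j < w) or occupancy[i, j]:
                collided = True
                break
        if not collided and abs(k) < best_cost:
            best_k, best_cost = k, abs(k)   # prefer the straightest free arc
    return best_k                           # curvature to command, or None if blocked
```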

Operations:

- Autonomous Guidance
- Obstacles detection
- Collision avoidance

Boxed and OEM versions available.

TECHNICAL FEATURES:

CONTROL

Output control signals on steering and speed for differential or Ackermann kinematic models (a conversion sketch follows this list).
Separate control for brake, engine brake or regenerative brake.
Real-time processing with a total latency of less than 1.1 ms for the full pipeline: 3D analysis, obstacle detection and exploration of a space of 1024 candidate trajectories.
Geometric analysis of high-resolution 3D surveys (limited only by the resolution of the sensor).
Real-time diagnostics on the ranging sensor, with speed limitation down to a full stop of the vehicle, for safety reasons, in case of unreliable sensor operation.
Vehicle rollover function.
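As an illustration of the two kinematic models mentioned above, the sketch below maps a commanded linear/angular velocity onto wheel speeds (differential drive) or a steering angle (Ackermann). The track width and wheelbase values are placeholder assumptions.

```python
# Sketch: convert a commanded linear velocity v (m/s) and angular velocity
# omega (rad/s) into actuator-level commands for the two kinematic models.
# Track width and wheelbase are placeholder values.
import math

def differential_wheel_speeds(v, omega, track_width=0.5):
    """Return (left, right) wheel linear speeds for a differential-drive vehicle."""
    left = v - omega * track_width / 2.0
    right = v + omega * track_width / 2.0
    return left, right

def ackermann_steering_angle(v, omega, wheelbase=1.2):
    """Return the front steering angle in radians for an Ackermann vehicle."""
    if v == 0.0:
        return 0.0
    return math.atan(omega * wheelbase / v)
```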

 COMPATIBILITY

It is compatible with vehicles based on different kinematic models.
It has interfaces over different protocols and physical connections: dual CAN bus, PWM and RS232 signals compatible with Roboteq controllers.
Broad compatibility with sensor scanning periods (30, 50 and 100 ms).
Compatibility with the LIDAR sensor protocols Velodyne VLP-16, Velodyne HDL-32e and SICK S300.
Communication API over the TCP-IP protocol, providing configuration functions, alarms and mission management.
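By way of example only, a minimal TCP client is sketched below; the port, JSON framing and command names are hypothetical and do not reflect the product's actual API, which is defined by its own documentation.

```python
# Hypothetical sketch of a client for the configuration/mission API over
# TCP-IP. Port, JSON framing and command names are invented for illustration
# and are not the product's actual protocol.
import json
import socket

def send_command(command, params, host="192.168.0.2", port=4000):
    payload = json.dumps({"cmd": command, "params": params}).encode() + b"\n"
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(payload)
        return sock.recv(4096).decode()   # e.g. an acknowledgement or alarm report

# Example call (hypothetical command name and parameters):
# print(send_command("start_mission", {"mission_id": 3}))
```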

FLEXIBILITY OF USE


Suitable for every scenario, "off/on road" and "out/indoor", thanks to extreme configurability and a broad ability to model the space around the vehicle.
Discrimination of the type of obstacle using only 3D sensors (LIDAR): ramps, pits, steps, wire mesh, arches.
Filtering of individual blades of grass or insects, with a specific automatic mode for dense grass and narrow passages (a filtering sketch follows this list).
Possibility to define additional forbidden areas (besides those detected by the provided sensors).
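A minimal sketch of this kind of filtering is shown below: isolated LIDAR returns (single blades of grass, insects) are discarded by keeping only points that fall into grid cells containing enough returns. The cell size and threshold are illustrative assumptions.

```python
# Sketch: discard isolated LIDAR returns (single grass blades, insects) by
# keeping only points whose grid cell contains enough returns. Cell size and
# threshold are illustrative assumptions.
import numpy as np

def filter_sparse_returns(points_xy, cell=0.05, min_points=4):
    """points_xy: (N, 2) array of ground-plane LIDAR returns in metres."""
    if len(points_xy) == 0:
        return points_xy
    cells = np.floor(points_xy / cell).astype(int)
    keys, counts = np.unique(cells, axis=0, return_counts=True)
    dense = {tuple(k) for k, c in zip(keys, counts) if c >= min_points}
    keep = np.array([tuple(c) in dense for c in cells])
    return points_xy[keep]
```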


Total configurability of the dynamic behavior of the autopilot, to optimize performance in each scenario.

 

INTERFACE

Connector 1:

ETH 10/100 - RS422: ranging sensors
2 contacts for interlock

Connector 2:

2 RS232 PORTS: engine controllers
2 CAN BUS PORTS: vehicle control
RS232 PORT: navigation sensors

Connector 3:

ETH PORT: vehicle/mission configuration

Connector 4:

Power supply

 

DA-RT - INFO SOLUTION S.P.A. Line Of Business - © 2017 Headquarter - Via Cadorna, 67- 20090 Vimodrone MI - Italy - Tel +39 0227409353 - Infosolution(at)legalmail.it

C.F. 12419470153 - P.I. 02996000960 - Business Register n. 12419470153 - REA MI 1559826 - Share Capital Euro 963.180,00 i.v.