Simultaneous Localisation and Mapping, commonly known by its acronym SLAM, is an increasingly important topic in autonomous robotics and driverless vehicles. Growing interest has been matched by high-profile investment in autonomy from the automotive industry, with the promise of fully autonomous transportation. Large corporations such as Google, Uber and Tesla use SLAM to help develop their driverless car platforms.
SLAM is a synthesis of technologies that enables an autonomous agent to navigate an unknown or dynamically changing environment without relying on GNSS. Data from a variety of sensors is fused mathematically to build a rich map of the local environment, while probabilistic modelling techniques are applied in parallel to estimate the robot's position and pose accurately in real time.
The SLAM problem is one of the most challenging and exciting areas in modern robotics. It is traditionally considered to be within the domain of computer vision, and was initially performed using camera sensors. At Dynium we are utilising more recent innovations in laser measurement technologies and other complementary sensors to make highly accurate location estimates and create detailed maps of the robot's environment.
SLAM is more of a concept than one algorithm. There are several steps involved, each of which can be implemented in many ways. These steps generally include landmark extraction, data association, transformation, registration, state estimation, state update and landmark update. We are optimising these algorithmic steps specifically to perform well in the off-road and agricultural environment, and it is this specialisation which differentiates us from our peers in Silicon Valley.
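To make the state estimation and update steps concrete, here is a minimal sketch of one predict/update cycle of a 2D landmark-based EKF-SLAM filter, a common textbook formulation. The state vector holds the robot pose followed by landmark positions; all noise values and dimensions here are illustrative, not our production parameters.

```python
import numpy as np

# State layout: [x, y, theta, lx1, ly1, lx2, ly2, ...]
# Noise matrices below are illustrative placeholders.

def motion_predict(state, cov, v, w, dt, motion_noise):
    """Predict the robot pose from velocity commands (landmarks unchanged)."""
    x, y, th = state[0], state[1], state[2]
    state = state.copy()
    state[0] = x + v * np.cos(th) * dt
    state[1] = y + v * np.sin(th) * dt
    state[2] = th + w * dt
    # Jacobian of the motion model with respect to the full state
    F = np.eye(len(state))
    F[0, 2] = -v * np.sin(th) * dt
    F[1, 2] = v * np.cos(th) * dt
    cov = F @ cov @ F.T
    cov[:3, :3] += motion_noise  # process noise only affects the pose block
    return state, cov

def landmark_update(state, cov, z, lm_idx, meas_noise):
    """Correct the state from a range-bearing measurement z = [r, phi]
    of the landmark stored at slot lm_idx."""
    i = 3 + 2 * lm_idx
    dx = state[i] - state[0]
    dy = state[i + 1] - state[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - state[2]])
    # Measurement Jacobian: only pose and this landmark's columns are non-zero
    H = np.zeros((2, len(state)))
    H[:, 0:3] = np.array([[-dx / r, -dy / r, 0.0],
                          [dy / q, -dx / q, -1.0]])
    H[:, i:i + 2] = np.array([[dx / r, dy / r],
                              [-dy / q, dx / q]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing residual
    S = H @ cov @ H.T + meas_noise
    K = cov @ H.T @ np.linalg.inv(S)
    state = state + K @ y
    cov = (np.eye(len(state)) - K @ H) @ cov
    return state, cov
```

In practice each of these steps is tuned and often replaced wholesale (e.g. graph-based back-ends instead of an EKF), which is exactly the kind of per-environment specialisation described above.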
At Dynium Robot we have expert engineers including PhDs in SLAM, Robotics and Computer Vision. Across the team we have many years’ experience in researching and applying SLAM techniques to mobile robotics. This puts us in a valued position as a company, possessing the rare combination of expertise, experience and practical know-how in autonomous vehicles. This also provides the ability to develop and build robust robotic brains that are in high demand, with the growing desire for autonomy particularly in farming.
We are taking advantage of the latest advances in LiDAR sensing technology to create the safest and most reliable off-road navigation system available. We use a combination of high-resolution 3D sensors and smaller, low-cost solid-state LiDARs to produce detailed models of the dynamic environment with much higher precision than was possible in the past.
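A first step in building environment models from raw LiDAR returns is usually downsampling the point cloud onto a voxel grid so later stages run in real time. The sketch below is a generic illustration of that technique, not our production pipeline; the voxel size is an arbitrary example value.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Replace all points falling in the same voxel by their centroid.

    points: (N, 3) array of x, y, z coordinates in metres.
    Returns an (M, 3) array with M <= N.
    """
    # Integer voxel coordinates for each point
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel and average each group
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inverse)
    out = np.zeros((counts.size, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out
```

The reduced cloud can then feed registration and obstacle modelling at a fraction of the original cost.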
Computer vision is core to our work at Dynium. Our engineers have a background in applying computer vision techniques to driverless vehicles, industrial robotics and people tracking applications. This exposure gives us the capabilities to solve a variety of problems in autonomous navigation with a unique and creative approach using advanced camera technology. We are developing and testing algorithms to assist in navigation through complex dynamic environments such as densely populated farmland and orchards. With the combination of our expertise, experience, and being a small nimble start-up, we can rapidly adapt to different challenges.
We have computer vision research agreements with Warwick University, enabling us to implement the latest academic discoveries in our vision systems. One such collaboration is the development of a robust obstacle detection and collision avoidance system that is widely applicable to off-road pilotless vehicles. Utilising recent developments in stereo camera vision systems helps us to advance these systems and create highly robust obstacle avoidance and safety modules, so our robots can operate safely around people. We expect such technologies to provide defensibility to the company, especially as autonomous farming ISO safety standards become stricter.
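The core geometric idea behind stereo obstacle detection is simple: depth is recovered from pixel disparity via Z = f·B/d, and anything closer than the vehicle's stopping distance is flagged. The sketch below illustrates that relationship only; the focal length, baseline, and stopping distance are hypothetical calibration values, and a real system would add matching, filtering, and ground-plane removal.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) to metric depth.

    Invalid (zero or negative) disparities map to infinite depth.
    """
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

def obstacle_mask(depth_m, stop_distance_m):
    """Boolean mask of pixels closer than the vehicle's stopping distance."""
    return depth_m < stop_distance_m
```

Because depth error grows quadratically with range, near-field obstacles (the safety-critical case) are exactly where stereo is most accurate.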
Ultrasonic Sensing and IMU
Further vehicle sensors are added to our robots to complement the SLAM algorithms, including agricultural ultrasonic sensing systems that we are developing in-house. Ultrasonic sensors allow us to detect objects within close range of the vehicle. We are also developing sensor fusion algorithms that use MEMS IMU sensors to improve the localisation and mapping accuracy of our systems. The IMU complements the other sensors in the SLAM system and helps improve robustness.
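One of the simplest IMU fusion techniques is a complementary filter, which blends fast but drifting gyroscope integration with slow but drift-free accelerometer tilt. The sketch below shows the idea for a single pitch angle; the blend factor alpha and function names are illustrative, not our deployed fusion stack.

```python
import math

def accel_pitch_estimate(ax, az):
    """Tilt angle implied by the gravity direction in the accelerometer frame.

    ax, az: accelerations in m/s^2; valid when the vehicle is not accelerating.
    """
    return math.atan2(-ax, az)

def complementary_pitch(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One filter step. Angles in radians, gyro_rate in rad/s.

    alpha near 1 trusts the gyro over short horizons while the
    accelerometer term slowly corrects long-term drift.
    """
    gyro_est = pitch_prev + gyro_rate * dt
    return alpha * gyro_est + (1.0 - alpha) * accel_pitch
```

In a full SLAM system the same blending role is typically played by the filter's process and measurement noise models rather than a fixed alpha.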
Distributed Systems and Embedded Technologies
Efficiency is important to our company and prepares our vehicle for future electrification. The use of low-powered microcontrollers and distributed computing architectures brings energy savings and improvements in reliability.
The philosophy of our systems is modularity and flexibility at the design stage. We can then provide interfaces to third party robotics companies and reconfigure our systems to suit different applications. We also adhere to automotive standards where appropriate to gain the highest reliability and robustness in our products.
Data Collection and Archiving
In the modern era, data has become one of the most sought-after commodities. Dynium Robot is building a database to store all our robotic data with intelligent metadata labelling, forming one of the world's most detailed agricultural sensor databases.
During trials and deployments, our robots continuously collect sensor data and transmit it to the cloud for storage. This will include vision-system data, LiDAR, IMU, and soil moisture content, all precisely localised and matched to weather and other pertinent growing conditions. This data will be valuable for in-house product development, AI, modelling, and simulation. We also anticipate that this data will have currently unknown future applications and be of value to the research community and global industry.
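To illustrate what "intelligent metadata labelling" means in practice, here is a hypothetical record for one archived sensor sample. The field names, values, and storage URI are purely illustrative, not Dynium's actual schema; the point is that every raw blob is indexed by sensor type, location, time, and growing conditions.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical metadata record; field names are illustrative only.
@dataclass
class SensorRecord:
    robot_id: str
    sensor_type: str      # e.g. "lidar", "camera", "imu", "soil_moisture"
    timestamp_utc: str    # ISO 8601
    latitude: float
    longitude: float
    weather: dict         # matched conditions at capture time
    payload_uri: str      # location of the raw data blob in cloud storage

record = SensorRecord(
    robot_id="robot-01",
    sensor_type="soil_moisture",
    timestamp_utc="2019-06-01T12:00:00Z",
    latitude=51.75,
    longitude=-1.25,
    weather={"temp_c": 18.0, "rain_mm": 0.2},
    payload_uri="s3://example-bucket/raw/0001.bin",  # illustrative path
)
record_json = json.dumps(asdict(record))
```

Consistent records like this are what make the archive queryable for modelling, simulation, and future applications not yet anticipated.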