Autonomous Mobile Robot Sensors

Sensor usage has exploded in all sorts of devices, from smartphones and embedded systems to moving machines such as robots. Sensors form the core of IoT.

A typical smartphone contains 20-plus sensors. Rapid smartphone growth over the years has also benefited the broader sensor industry, advancing sensor technology, driving miniaturization and dramatically reducing cost. This, in turn, has helped the robotics industry.

Sensors come in many different sizes and shapes, measuring quantifiable parameters of the robot itself or of the external environment the robot is in. The range of sensors used in robotics is large and varies across robot types and applications.

Sensors are like an AMR's eyes. Combined with sensor software algorithms, they allow an AMR to understand and navigate its environment, detect and avoid collisions with objects, and provide information about the robot's location.

Types of sensors

Exteroceptive sensors discern the external world; these include cameras, lasers and lidar, radar, sonar, infrared sensors, touch sensors such as whiskers or bump sensors, GPS and proximity sensors.

Proprioceptive sensors measure the robot itself; these include accelerometers, gyroscopes, magnetometers and compasses, wheel encoders and temperature sensors. Sensors can also be grouped into other categories, such as active or passive. It is also important to note that the boundaries between exteroceptive and proprioceptive sometimes overlap.

With voice assistant interfaces becoming a popular way to interact with robots, sound sensors such as microphones are also becoming more prevalent.

Likewise, because robots often must connect to the internet, communications interfaces such as Wi-Fi and LTE are critical. While not strictly sensors or actuators, they allow the robot to interact with the external world.


Sensors used for mobility

Typical sensors used in ground mobile robots and drones include:

Inertial measurement units (IMUs) typically combine multiple accelerometers and gyroscopes and can also include magnetometers and barometers. From the IMU, the robot's instantaneous pose (position and orientation), linear and angular velocity, linear and angular acceleration and other parameters can be estimated in 3D space. Advances in MEMS sensor technology have benefited IMUs significantly. However, IMUs suffer from drift, biases and other errors.
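
As a rough illustration of how raw IMU readings become motion estimates, and why drift accumulates, the sketch below integrates a planar gyro and accelerometer stream. The function and the sample data are illustrative assumptions, not a production estimator.

    import math

    def integrate_imu(samples, dt):
        """Dead-reckon a planar pose from gyro (rad/s) and accel (m/s^2) samples.
        Any bias in the raw readings is integrated too, which is why IMU-only
        estimates drift over time."""
        x = y = yaw = 0.0
        vx = vy = 0.0
        for gyro_z, ax, ay in samples:
            yaw += gyro_z * dt                       # integrate angular rate -> heading
            # rotate body-frame acceleration into the world frame
            awx = ax * math.cos(yaw) - ay * math.sin(yaw)
            awy = ax * math.sin(yaw) + ay * math.cos(yaw)
            vx += awx * dt                           # integrate acceleration -> velocity
            vy += awy * dt
            x += vx * dt                             # integrate velocity -> position
            y += vy * dt
        return x, y, yaw

    # Hypothetical 100 Hz stream: constant 0.1 rad/s turn, 0.5 m/s^2 forward accel
    samples = [(0.1, 0.5, 0.0)] * 100
    print(integrate_imu(samples, dt=0.01))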

GPS provides latitude, longitude and altitude information. Over the years, GPS accuracy has increased significantly, and highly accurate modes, such as RTK, also exist. GPS-denied areas, such as indoor spaces and tunnels, along with slow update rates, remain GPS's top limitations. Still, GPS receivers are important sensors for outdoor mobile robots and provide an accurate periodic reference.
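
A common step after receiving a GPS fix is converting latitude and longitude into local metric coordinates for the navigation stack. The minimal sketch below uses a flat-Earth (equirectangular) approximation, which is adequate only over short distances; the reference fix and coordinates are assumptions.

    import math

    EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

    def gps_to_local_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
        """Convert a GPS fix to east/north meters relative to a reference fix,
        using an equirectangular approximation valid over short ranges."""
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
        east = (lon - ref_lon) * math.cos(ref_lat) * EARTH_RADIUS_M
        north = (lat - ref_lat) * EARTH_RADIUS_M
        return east, north

    # Example: a fix a short distance from the reference point
    print(gps_to_local_xy(37.4220, -122.0841, 37.4219, -122.0840))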

Depending on whether the robot operates indoors or outdoors and the speed at which it moves, laser sensors can vary significantly in price, performance, robustness, range and weight. Most are based on time-of-flight principles. Signal processing is performed to output points with range and angle increments. Both 2D and 3D lasers are useful. Laser sensors produce a large amount of data, one range measurement per laser point, so taking full advantage of them requires considerable compute power. Lidars are also very popular for mapping.
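
Because a 2D laser reports one range per fixed angle increment, a typical first processing step is converting the polar scan into Cartesian points in the sensor frame. The sketch below illustrates this; the scan parameters are placeholders.

    import math

    def scan_to_points(ranges, angle_min, angle_increment, range_max):
        """Convert a 2D laser scan (one range per beam) to (x, y) points in
        the sensor frame, dropping returns beyond the sensor's rated range."""
        points = []
        for i, r in enumerate(ranges):
            if 0.0 < r < range_max:
                angle = angle_min + i * angle_increment
                points.append((r * math.cos(angle), r * math.sin(angle)))
        return points

    # Illustrative 180-degree scan with 1-degree angular resolution
    ranges = [2.0] * 181
    points = scan_to_points(ranges, angle_min=-math.pi / 2,
                            angle_increment=math.radians(1.0), range_max=10.0)
    print(len(points))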

Encoders count the precise number of rotations of the robot's wheels, thereby estimating how far the robot has travelled. The terms odometry and dead reckoning are used for distance calculation with wheel encoders. Encoders suffer from long-term drift and hence need to be combined with other sensors.
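
To make the odometry idea concrete, the sketch below updates the pose of a differential-drive robot from a change in left and right encoder ticks. The tick counts, wheel radius and track width are illustrative assumptions.

    import math

    def diff_drive_odometry(x, y, theta, d_ticks_left, d_ticks_right,
                            ticks_per_rev, wheel_radius, wheel_base):
        """Update a planar pose from the change in left/right encoder ticks."""
        per_tick = 2.0 * math.pi * wheel_radius / ticks_per_rev
        d_left = d_ticks_left * per_tick           # distance rolled by left wheel
        d_right = d_ticks_right * per_tick         # distance rolled by right wheel
        d_center = (d_left + d_right) / 2.0        # forward motion of the robot
        d_theta = (d_right - d_left) / wheel_base  # change in heading
        x += d_center * math.cos(theta + d_theta / 2.0)
        y += d_center * math.sin(theta + d_theta / 2.0)
        theta += d_theta
        return x, y, theta

    # Hypothetical setup: 1024-tick encoders, 5 cm wheels, 30 cm track width
    print(diff_drive_odometry(0.0, 0.0, 0.0, 500, 520,
                              ticks_per_rev=1024, wheel_radius=0.05,
                              wheel_base=0.30))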

Vision sensors, such as 2D and 3D cameras as well as depth cameras, play a critical role in AMRs. Computer vision and deep learning on the sensor data can aid object detection and avoidance, obstacle recognition and obstacle tracking. Visual odometry and visual SLAM (simultaneous localization and mapping) are becoming more relevant for autonomous robots operating in indoor and outdoor environments where lighting conditions are reasonable and can be maintained. 3D cameras, including depth and stereo vision cameras, provide the pose, i.e., position and orientation, of an object in 3D space. In industrial environments, well-established machine vision techniques combined with pose can help solve a number of problems, from grasping to placement to visual servoing. Thermal and infrared cameras are used in difficult lighting conditions, such as darkness or fog.
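
As one small example of how a depth camera yields 3D position, the sketch below back-projects a pixel with a measured depth into a 3D point using the pinhole camera model. The intrinsic parameters are placeholders, not values from any particular camera.

    def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
        """Back-project an image pixel (u, v) with a measured depth into a
        3D point in the camera frame, using pinhole intrinsics (focal lengths
        fx, fy and principal point cx, cy)."""
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        z = depth_m
        return x, y, z

    # Placeholder intrinsics for a 640x480 depth camera
    print(deproject_pixel(u=320, v=240, depth_m=1.5,
                          fx=525.0, fy=525.0, cx=319.5, cy=239.5))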

If there is an object within range of an ultrasonic sensor pulse, part or all of the pulse is reflected back as an echo and detected by the receiver. By measuring the time difference between the transmitted pulse and the received echo, it is possible to determine the object's range. Sonars are affected by multipath reflections.
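
The range calculation described above is simple to state in code; a sketch follows, assuming the speed of sound in air at roughly 20 C and an illustrative echo time.

    SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 C

    def ultrasonic_range_m(echo_time_s):
        """Range from a pulse-echo time: the pulse travels to the object and
        back, so the one-way distance is half the round-trip distance."""
        return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

    # Illustrative: a 5.8 ms round trip corresponds to roughly 1 m
    print(ultrasonic_range_m(0.0058))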

Pulsed and millimeter-wave radars detect objects at long range and provide velocity, angle and bearing, typically measured to the centroid of the object. They work in all weather conditions, whereas most other sensors fail in complex environments such as rain, fog and lighting variations. But their resolution is limited compared to lidar or laser scanners.
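
The radial velocity a radar reports follows directly from the measured Doppler shift. The sketch below shows the standard relationship for a monostatic radar; the carrier frequency and shift values are illustrative.

    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def radial_velocity_m_s(doppler_shift_hz, carrier_freq_hz):
        """Radial (closing) velocity of a target from the measured Doppler
        shift for a monostatic radar: v = f_d * c / (2 * f_c)."""
        return doppler_shift_hz * SPEED_OF_LIGHT_M_S / (2.0 * carrier_freq_hz)

    # Illustrative: a 5.13 kHz shift on a 77 GHz radar is about 10 m/s
    print(radial_velocity_m_s(5_130.0, 77e9))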

Robust and accurate localization schemes fuse data from IMUs, wheel encoders, GPS, laser, radar, ultrasonic and vision sensors with software algorithms to implement SLAM techniques. Depending on the application and the navigation and object-avoidance specification, the fusion can be limited to a few sensors or include all of them.
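
To make the fusion idea concrete, the sketch below blends an odometry-predicted position with a noisy GPS measurement in one dimension using a textbook Kalman filter cycle. The noise values are assumptions, and a real localization stack (for example, an extended Kalman filter over full 3D pose) is considerably more involved.

    def kalman_1d(x, p, odom_delta, q, z_gps, r):
        """One predict/update cycle of a 1D Kalman filter.
        x, p          : current state estimate and its variance
        odom_delta, q : motion reported by odometry and its noise variance
        z_gps, r      : GPS position measurement and its noise variance"""
        # Predict: move by the odometry delta; variance grows by the motion noise
        x_pred = x + odom_delta
        p_pred = p + q
        # Update: blend prediction and GPS measurement by their uncertainties
        k = p_pred / (p_pred + r)            # Kalman gain
        x_new = x_pred + k * (z_gps - x_pred)
        p_new = (1.0 - k) * p_pred
        return x_new, p_new

    # Assumed noise levels: odometry variance 0.02 m^2 per step, GPS variance 1.0 m^2
    x, p = 0.0, 1.0
    for odom_delta, z_gps in [(0.5, 0.6), (0.5, 1.1), (0.5, 1.4)]:
        x, p = kalman_1d(x, p, odom_delta, q=0.02, z_gps=z_gps, r=1.0)
    print(x, p)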

Selection and placement of sensors

With so many sensors at their disposal, selecting the right ones is a complex exercise. The factors that dictate the choice include the type of application, the specification of navigation and localization features, the environment in which the AMR will operate, the compute power available to run sensor algorithms, the choice of software algorithms such as sensor fusion, power consumption and cost. Invariably, there is a tradeoff to balance across all of these selection parameters.

Dynamic range, accuracy, resolution, linearity, field of view and many other parameters determine the quality of a sensor. Sensors used in defense applications are very expensive and can meet superior specs. But in the majority of non-defense applications, where sensor specifications, errors and biases are less than ideal, the choice of software algorithms is a key decision point.

The placement of sensors on the robot also requires a very careful design exercise. Many sensors are extremely sensitive to external disturbances: IMUs to vibration and stray magnetic fields, cameras to lighting conditions, and so on. Accurate static translation and rotation offsets need to be determined using physical measurement and calibration techniques. Some sensors require antennas, and, once again, selecting the right antenna and placing it correctly are critical.
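
Once a sensor's static translation and rotation offsets have been calibrated, every measurement must be transformed into the robot's body frame before it can be fused with other sensors. A minimal planar sketch follows; the mounting offsets are illustrative.

    import math

    def sensor_to_base(point_xy, mount_x, mount_y, mount_yaw):
        """Transform a 2D point from a sensor's frame into the robot base
        frame, given the sensor's calibrated mounting offset
        (translation + rotation)."""
        px, py = point_xy
        c, s = math.cos(mount_yaw), math.sin(mount_yaw)
        x = mount_x + c * px - s * py
        y = mount_y + s * px + c * py
        return x, y

    # Illustrative: a lidar mounted 0.2 m ahead of the base, rotated 90 degrees
    print(sensor_to_base((1.0, 0.0), mount_x=0.2, mount_y=0.0,
                         mount_yaw=math.radians(90)))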

Parameters to consider during sensor integration

Key challenges in sensor integration include sensor calibration; time synchronization, especially for sensor fusion; the different rates and frequencies at which data arrives from different sensors; errors and biases in measured values, both intrinsic and driven by the external environment; and environmental conditions that affect sensor measurements. The majority of research and development happens in these areas.
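
One common way to deal with sensors running at different rates is to interpolate the slower stream onto the timestamps of the faster one before fusing. The sketch below shows a simple linear interpolation; the timestamps and values are made up for illustration.

    def interpolate_at(timestamps, values, t_query):
        """Linearly interpolate a timestamped 1D signal at time t_query.
        Assumes timestamps are sorted and t_query lies within their range."""
        for i in range(1, len(timestamps)):
            if timestamps[i] >= t_query:
                t0, t1 = timestamps[i - 1], timestamps[i]
                v0, v1 = values[i - 1], values[i]
                alpha = (t_query - t0) / (t1 - t0)
                return v0 + alpha * (v1 - v0)
        return values[-1]

    # Made-up example: resample a 10 Hz signal at the timestamp of a faster sensor
    slow_t = [0.0, 0.1, 0.2, 0.3]
    slow_v = [1.0, 1.2, 1.1, 1.3]
    print(interpolate_at(slow_t, slow_v, t_query=0.125))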

A wide variety of sensors with a wide range of performance parameters is available in the market today. Choosing the right sensors is a complex technical and product management task. It's a balance of sensor specs, application requirements, cost, power, form factor, environmental conditions, software algorithm sophistication, time to market, longevity and more.

Through sensor fusion algorithms, different sensors complement one another to achieve demanding goals. To a large extent, sensors and sensor software algorithms are critical to the success of the AMR industry.

Sensor manufacturers are also putting significant effort into improving sensor performance, accuracy and range. Some even perform sensor fusion inside their units, which helps with accurate time synchronization. When selecting sensors for a specific AMR application, teams need to keep a continuous watch on developments in the sensor world.