Sensors and Signals in Edge AI

Sensors are electronic components that let edge devices perceive and interact with their environment. A sensor supplies the device with a stream of data, which can arrive in a few common formats (a sketch after the list below shows how each one looks in memory):

  • Time Series – When the environment is sampled for values at a regular frequency, changes in those values can be represented over a period of time. When the values are collected at a fixed interval, the series is called periodic; when they are not collected at a constant rate, it is aperiodic. The values can also contain summary information, such as the number of times something has happened since the last reading. This is the most common type of streaming data for edge AI devices.
  • Audio – A special case of time series data in which the signal represents the oscillation of sound waves as they travel through air. Audio signals have very high frequencies, on the order of thousands of samples per second, and can fall beyond the limits of human hearing: ultrasound (above the human range) and infrasound (below it). Speech detection and classification is an example of edge AI audio processing.
  • Image – Another common data type for edge devices. Some devices, like cameras, use an array of tiny elements to capture an entire scene in one go; sensors like LIDAR (Light Detection and Ranging) build an image by sweeping a single sensor across the scene over a period of time. An image is a grid of pixels, and the size of the grid is called the resolution of the image. A pixel may hold multiple values: in a grayscale image, the value is the lightness or darkness of the pixel, while in a color image each pixel has RGB values. Images do not have to represent visible light (like normal pictures taken on a camera); they can represent infrared light showing temperature, time of flight (in the case of LIDAR, how long the light takes to bounce back), or radio waves as collected by a radio telescope.
  • Video – Another special case of time series data: a sequence of images, each representing a snapshot of the scene at a point in time. Each snapshot image is called a frame. Video is a very rich format, which means it needs a lot of storage space and computing power.
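
To make these formats concrete, here is a minimal sketch of how each one typically appears in memory on a device. It assumes NumPy, and the shapes, sample rates, and dtypes are illustrative values chosen for this example, not requirements.

```python
import numpy as np

# Time series: temperature sampled periodically, once per second for a minute.
temperature = np.zeros(60, dtype=np.float32)       # shape: (samples,)

# Audio: one second of mono sound at 16 kHz, a rate commonly used for speech.
audio = np.zeros(16_000, dtype=np.int16)           # shape: (samples,)

# Image: a small 96x96 RGB frame; a grayscale frame would drop the channel axis.
image = np.zeros((96, 96, 3), dtype=np.uint8)      # shape: (height, width, channels)

# Video: 30 of those frames, i.e. one second of video at 30 frames per second.
video = np.zeros((30, 96, 96, 3), dtype=np.uint8)  # shape: (frames, height, width, channels)

# Video's cost adds up fast: frames * height * width * channels bytes.
print(video.nbytes)  # 829440 bytes for one second of small, low-resolution video
```
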
Types of Sensors

Sensors can be grouped by their modality. Modality is the way in which something happens or is experienced. In humans, our sensations of touch, sight, sound, smell, and taste are different modalities. Sensor modalities can be broadly classified as follows:

  • Acoustic & Vibration – The ability to sense vibrations through air (microphones), water (hydrophones), or the ground (geophones and seismometers). These sensors provide time series data that shows the variation in pressure of the medium. The relevant sensor depends on the use case.
  • Visual and scene – Image sensors range from tiny, low-powered cameras to high-quality multi-megapixel cameras. They capture light using a grid of sensor elements, and the area that can be captured is called the field of view. Other considerations for image sensors include color channels (grayscale or RGB), spectral response (the wavelengths of light the sensor is sensitive to), pixel size (larger pixels can capture more light), sensor resolution, and frame rate. The right image sensor depends on the use case: good scene comprehension requires a high-resolution camera, but that brings higher costs and more expensive transmission, while a self-driving car needs time-of-flight measurements and hence a LIDAR sensor.
  • Motion and Position – These sensors allow the device to understand its position and motion in the external environment. For example:
    • Tilt sensor – on or off depending on the orientation
    • Accelerometer – measures change in velocity over time. Used for motion sensing in devices like smartwatches to distinguish running from walking from no activity (a sketch after this list shows the idea)
    • Gyroscope – rate of rotation of an object
    • Time of flight – measures how long an emitted signal takes to bounce back, as in LIDAR
    • Inertial Measurement Unit (IMU) – approximates the current position of the device based on its motion and an internal frame of reference, not an external one
    • Global Positioning System (GPS) – determines the current position, accurate to within a few meters, using an external frame of reference based on satellite radio signals. It needs a line of sight from the device to the satellites.
  • Force and Tactile – These measure the mechanical strain on an object. They include buttons and switches, capacitive touch sensors (how much of the surface is touched by a conductive object, like a human finger on a screen), strain gauges (how much an object is deformed by force), load cells (the amount of physical load applied, as at a grocery store self-checkout), flow sensors (the rate of flow of a liquid), and pressure sensors (the pressure of a gas or liquid, as in a car tire or an Instant Pot).
  • Optical, Electromagnetic, and Radiation – These measure electromagnetic radiation, magnetic fields, high-energy particles, voltage, current, and so on. Examples include photosensors, color sensors, spectroscopy, magnetometers, electromagnetic field meters, current sensors, and voltage sensors. They provide time series measurements and can be wired into cause-and-effect behavior: for example, when a photosensor detects a change in light caused by movement, it can switch on a light.
  • Environmental, Biological, and Chemical – Temperature, gas, humidity, carbon monoxide, particulate matter, biosignal, and other chemical sensors
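
The accelerometer example above (run versus walk versus no activity) gives a feel for how raw motion data becomes a label. Below is a minimal sketch of that idea; the window size and thresholds are invented for illustration and would need calibration against a real IMU.

```python
import numpy as np

def classify_activity(accel_xyz: np.ndarray) -> str:
    """Rough activity guess from a window of accelerometer samples.

    accel_xyz has shape (samples, 3): acceleration in g on the x, y, and z
    axes, e.g. a few seconds of readings from an IMU. The thresholds are
    illustrative, not calibrated values.
    """
    # Magnitude of acceleration per sample; a device at rest reads about 1 g.
    magnitude = np.linalg.norm(accel_xyz, axis=1)
    # How much the magnitude jitters around its mean indicates movement.
    jitter = magnitude.std()
    if jitter < 0.05:
        return "no activity"
    if jitter < 0.5:
        return "walk"
    return "run"

# A still device reports roughly (0, 0, 1) g plus a little sensor noise.
rng = np.random.default_rng(0)
window = np.array([0.0, 0.0, 1.0]) + rng.normal(0.0, 0.01, size=(100, 3))
print(classify_activity(window))  # no activity
```
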
Types of Signals

Edge devices also have access to a lot of other data, which can be split into two categories:

  • Introspective Data – Data about the state of the device itself. This includes device logs (configurations, alerts, errors), internal resource utilization, and internal sensor data used to verify the correct functioning of the device, for example the temperature sensors on the CPUs.
  • Extrospective Data – Data that comes from outside the device, such as connected systems, remote commands, API calls, and network data.

Most use cases use a combination of introspective and extrospective data. For example, a self-driving car would use motion and position sensors along with weather data and traffic data to make decisions about driving speed and which route to take.
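
As a toy sketch of how the two categories might be combined in a decision, consider the following; the field names, thresholds, and decision rule are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Introspective:
    cpu_temperature_c: float  # internal sensor on the CPU
    error_count: int          # from device logs

@dataclass
class Extrospective:
    weather: str              # from a remote weather service
    traffic_level: int        # 0 (clear) to 10 (gridlock), from a traffic API

def choose_speed_kmh(internal: Introspective, external: Extrospective) -> float:
    """Toy decision rule combining introspective and extrospective signals."""
    speed = 100.0
    # Introspective: slow down if the device itself is unhealthy.
    if internal.cpu_temperature_c > 85.0 or internal.error_count > 0:
        speed = min(speed, 60.0)
    # Extrospective: adjust for conditions outside the device.
    if external.weather == "rain":
        speed *= 0.8
    speed -= 5.0 * external.traffic_level
    return max(speed, 0.0)

print(choose_speed_kmh(Introspective(70.0, 0), Extrospective("rain", 4)))  # 60.0
```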

Summary

In this post, we took a quick look at the different types of sensors and at the signals that can originate from within the device and from outside it.
