Solving the Challenges of Multi-Sensor Networking - Part 2

In Part 1 we looked at two of the four main challenges in multi-sensor networking: (1) sensor wiring and (2) sensor discovery, mapping, and assignment. Now let’s take a look at the two remaining challenges: (3) sensor alignment to a common coordinate system, and (4) sparse or dense network processing.

Sensor Alignment

All sensors in a network have to be aligned in order to relate measurements from a sensor to an absolute position on the object. To do this, alignment transformations are required to convert from sensor coordinates to a common coordinate system (i.e., world coordinates).
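
A minimal sketch of what such a transform looks like is shown below (Python with NumPy; the rotation and translation values are placeholders for illustration only, not values from any real system). Each sensor's alignment is simply a rotation and a translation that maps its local coordinates into world coordinates:

import numpy as np

# Minimal illustration: a sensor's alignment is a rotation R and translation t
# that map points from sensor coordinates into world coordinates.
def sensor_to_world(points, rotation, translation):
    """Transform an Nx3 array of sensor-coordinate points into world coordinates."""
    return points @ rotation.T + translation

# Placeholder alignment: a sensor rotated 30 degrees about Z and offset 500 mm in X.
theta = np.radians(30.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([500.0, 0.0, 0.0])

sensor_points = np.array([[0.0, 0.0, 100.0],
                          [10.0, 0.0, 101.5]])
world_points = sensor_to_world(sensor_points, R, t)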


This alignment process can be achieved in many ways. One common approach uses a known artifact, an object of a particular shape with precisely known dimensions, that all sensors can scan. These scans need to contain one or more unique artifact features that can be used to determine the position and orientation of the sensor in world coordinates.
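
As an illustration of the idea only (this is the standard SVD-based point registration method, not necessarily the exact procedure any particular sensor uses), the alignment transform can be estimated from artifact features measured in sensor coordinates and the same features' known positions in world coordinates:

import numpy as np

# Illustrative only: estimate a sensor's rigid alignment (R, t) from matched
# artifact features, using the standard SVD (Kabsch) method.
# sensor_pts and world_pts are Nx3 arrays of the same features in each frame.
def estimate_alignment(sensor_pts, world_pts):
    cs = sensor_pts.mean(axis=0)              # centroid in sensor coordinates
    cw = world_pts.mean(axis=0)               # centroid in world coordinates
    H = (sensor_pts - cs).T @ (world_pts - cw)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # correct an improper (reflected) rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cw - R @ cs
    return R, t                               # world ~= sensor @ R.T + t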

Another method is to use a laser tracker with retroreflectors attached to the sensor housing. The tracker can then determine the orientation of each sensor and locate it in world coordinates.

Sparse and Dense Network Processing

In a dense network, data from multiple sensors is stitched into a single 3D point cloud. For example, a wood optimizer is a dense network where data from overlapping top and bottom sensors is stitched to produce a single 3D board model with two surfaces (top and bottom). These surfaces are then analyzed to produce cutting solutions that maximize volume recovery. A second example is protein portioning, where data is stitched into a 3D model of the product's volume, which is then used to optimize cutting into portions of target weight while minimizing waste.
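
Conceptually, dense stitching is just the per-sensor alignment transform followed by a merge, roughly as in this sketch (the list of aligned sensors is a hypothetical structure used for illustration):

import numpy as np

# Rough sketch of dense-network stitching: transform each sensor's point
# cloud into world coordinates with its alignment (R, t), then concatenate
# everything into one combined 3D point cloud.
def stitch(aligned_sensors):
    """aligned_sensors: list of (points_Nx3, R_3x3, t_3) tuples (hypothetical)."""
    clouds = [points @ R.T + t for points, R, t in aligned_sensors]
    return np.vstack(clouds)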

In contrast, a sparse sensor network is one where sensors do not overlap and no combined 3D point cloud is generated. Instead, each sensor measures and reports key features independently. An example of a sparse network is automotive body-in-white inspection, where fixed sensors are strategically placed around the metal body to measure key features. In this application, the sensors do not cover every millimeter of the body surface, only the regions of interest that require feature verification.
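
The output of a sparse network can be thought of as a list of independent feature reports rather than a point cloud. The structure below is a hypothetical illustration of that idea (the field names are assumptions, not any real sensor's output format):

from dataclasses import dataclass

# Hypothetical illustration: in a sparse network each sensor reports only the
# features it measures, already expressed in world coordinates.
@dataclass
class FeatureReport:
    sensor_id: int
    feature_name: str      # e.g. "hole_center_left_door"
    x_mm: float            # feature position in world coordinates
    y_mm: float
    z_mm: float
    in_tolerance: bool

reports = [
    FeatureReport(1, "hole_center_left_door", 1250.4, 310.2, 88.7, True),
    FeatureReport(7, "stud_position_rear",    2104.9, 295.8, 92.1, False),
]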

In both dense and sparse networks, the sensors produce data in world coordinates so measurements can be related back to the position of the object.

The Simplest Network: Dual Sensor Networking

The simplest form of network is made up of 2 sensors that LMI calls a main and a buddy. These networks are used to calculate thickness or to find two edges on a wide web of material, such as strips on a tire.

In a 3D smart sensor, the main/buddy configuration is a built-in feature. In this setup, the first sensor (main) is paired with a second sensor (buddy), and the main sensor is able to automatically “recognize” the buddy sensor when each is connected to the same network.

After pairing is complete, the buddy sensor sends its data to the main sensor. Both datasets are then merged into a common coordinate system and used to take measurements, execute control decisions and display results.
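
For a main/buddy thickness setup (main sensor above the material, buddy below), the measurement itself is straightforward once both profiles are in the common coordinate system. The sketch below is illustrative only and is not the sensor's built-in implementation:

import numpy as np

# Illustrative only: thickness from a merged main (top) and buddy (bottom)
# profile. Each profile is an Nx2 array of (x, z) points in the common
# coordinate system, sorted by x ascending.
def thickness_profile(top_profile, bottom_profile, x_grid):
    top_z = np.interp(x_grid, top_profile[:, 0], top_profile[:, 1])
    bot_z = np.interp(x_grid, bottom_profile[:, 0], bottom_profile[:, 1])
    return top_z - bot_z   # thickness at each x position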

For more on Dual Sensor Networking, read this blog post.

Larger Network Support

In many cases, more than one buddy is needed. For example, imagine a case where 20 sensors (10 top, 10 bottom) are required to scan a very long object, from which a large 3D point cloud is generated in order to calculate the object’s volume.
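
As a rough illustration of the final step in such an application (grid resolution and variable names are assumptions, not taken from any real system), the stitched data can be rasterized into top and bottom height maps and the thickness integrated over the grid to estimate volume:

import numpy as np

# Rough illustration: estimate object volume from stitched top/bottom surfaces
# rasterized onto a common XY grid. Each height map is a 2D array of Z values
# in mm; cell_area_mm2 is the XY area covered by one grid cell.
def estimate_volume(top_heightmap, bottom_heightmap, cell_area_mm2):
    thickness = np.clip(top_heightmap - bottom_heightmap, 0.0, None)
    return thickness.sum() * cell_area_mm2   # volume in cubic millimeters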

Today’s 3D smart sensors support the discovery, assignment, mapping, alignment, and stitching of large multi-sensor networks to solve these types of applications using an “opposite” layout scheme. For ease of use, multi-sensor systems built on 3D smart sensors use a single GUI to configure, measure, execute decisions, and display results.


Sensor Acceleration

Because multi-sensor networks often generate large amounts of data, sensor acceleration is used to redirect all sensor data streams to a PC, where processing power and memory are sufficient to handle such an application. You can read more about Gocator sensor acceleration here.


Interested in learning about 3D smart sensors and their advanced multi-sensor networking capabilities? Download the Gocator Multi-Sensor Networking Datasheet.