Smart Navigation

How can a simple robotic arm be upgraded into an autonomous detective?

Our Smart Navigation showcase provides the answer!

Aim of the Showcase

Our Smart Navigation showcase demonstrates how even complex use cases, such as recognizing and tracking an arbitrary object, can be realized on a small, simple robot arm, even if initially nothing more than a serial interface is available!

It also shows how the resulting live machine data from the robot can be used intelligently and how it can be enriched and statistically analyzed.

 


The Setup


A simple robotic arm from UFACTORY (uArm), upgraded with an Intel RealSense stereo depth camera.

An NVIDIA Jetson TX2 is the heart of the setup: this is where the neural network runs and the robot control is coordinated.


An IoT sensor node expands the robot's range of sensor values: it can now also measure, statistically evaluate, and react to other physical variables such as temperature, air pressure, and vibration.
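The statistical evaluation of such sensor values can be sketched as a sliding-window outlier check. This is an illustrative sketch only; the class name, window size, and z-score threshold are assumptions, not the showcase's actual code.

```python
from collections import deque
from statistics import mean, stdev

class SensorMonitor:
    """Keeps a sliding window of readings and flags strong outliers."""

    def __init__(self, window=20, z_limit=3.0):
        self.readings = deque(maxlen=window)
        self.z_limit = z_limit

    def add(self, value):
        """Store a reading; return True if it deviates strongly
        from the recent window (e.g. a sudden vibration spike)."""
        if len(self.readings) >= 2:
            mu, sigma = mean(self.readings), stdev(self.readings)
            is_spike = sigma > 0 and abs(value - mu) / sigma > self.z_limit
        else:
            is_spike = False  # not enough history yet
        self.readings.append(value)
        return is_spike

monitor = SensorMonitor()
for temp in [21.0, 21.2, 20.9, 21.1, 21.0, 21.2]:
    monitor.add(temp)        # steady temperatures, no alarm
print(monitor.add(35.0))     # sudden jump -> True
```

The same check applies unchanged to air pressure or vibration readings; only the window and threshold would be tuned per sensor.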

Technology & Software

The Intel RealSense stereo camera delivers two different video streams.

The first is an RGB video stream with a resolution of 640×480 pixels.

The second video stream is based on an infrared depth image: each pixel carries a defined distance value that is represented by a color code. Red stands for a small distance to the camera, blue for a greater distance.
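The color coding described above can be sketched as a simple linear mapping from distance to an RGB value. The near/far range and the function name below are assumptions for illustration; the RealSense SDK's own colorizer uses its own palette.

```python
def depth_to_color(distance_m, near=0.2, far=2.0):
    """Map a distance in metres to an (r, g, b) color code:
    red for close pixels, blue for far ones."""
    # Clamp the distance into the supported range.
    d = min(max(distance_m, near), far)
    t = (d - near) / (far - near)        # 0.0 = near, 1.0 = far
    r = int(round(255 * (1.0 - t)))      # close -> strong red
    b = int(round(255 * t))              # far   -> strong blue
    return (r, 0, b)

print(depth_to_color(0.2))   # (255, 0, 0): closest, pure red
print(depth_to_color(2.0))   # (0, 0, 255): farthest, pure blue
```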

The basis of the Smart Navigation showcase is object recognition with tinyYOLO and the programmed real-time coordination of the robot arm. To show that there are (almost) no limits to object recognition, we decided to train it on our dmc-group logo.

With almost 300 images in the training data set and several hours of computing time on our high-performance computers in Paderborn, the neural network was sufficiently trained to recognize our logo reliably. Although the training itself required considerable resources, the resulting network runs even on small embedded devices, so no high-end server is needed.

The operation of the robot arm itself is based on frame-by-frame recognition of the object (the camera delivers a stable 30 fps). From the difference between the object's positions in successive frames, a movement vector (red arrow) is calculated and translated into commands for the machine controller. The robot arm can then follow the trained object in real time, within its degrees of freedom.
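The movement-vector calculation can be sketched as follows. The bounding-box format, axis names, and gain factor are assumptions for illustration, not the showcase's real controller interface.

```python
def movement_vector(prev_box, curr_box):
    """Frame-to-frame movement vector of a detected object.
    Boxes are (x, y, w, h) in pixels; the vector connects the
    box centres of two successive frames."""
    px = prev_box[0] + prev_box[2] / 2   # previous centre x
    py = prev_box[1] + prev_box[3] / 2   # previous centre y
    cx = curr_box[0] + curr_box[2] / 2   # current centre x
    cy = curr_box[1] + curr_box[3] / 2   # current centre y
    return (cx - px, cy - py)

def to_arm_command(vector, gain=0.05):
    """Translate the pixel vector into (assumed) arm axis increments."""
    dx, dy = vector
    return {"pan": gain * dx, "tilt": -gain * dy}  # image y grows downwards

vec = movement_vector((100, 120, 40, 40), (130, 110, 40, 40))
print(vec)                 # (30.0, -10.0): object moved right and up
print(to_arm_command(vec))
```

At 30 fps this loop runs roughly every 33 ms, which is why the arm appears to track the object continuously.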

The dashboard, based on Node-RED, displays the live data of the robot arm. Real-time recordings of machine values, such as the number of detected objects, the azimuth angle of the arm, or the distance to the object, can be viewed here.

As the dashboard is freely configurable in terms of display, you can hide values that are not required or display additional values (e.g. input from the measured values of the IOT sensor node).
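A dashboard update of this kind could be assembled as a small JSON message, for example when feeding values into a Node-RED flow. All field names here are assumptions; a real flow would define its own message format.

```python
import json
import time

def machine_payload(objects_detected, azimuth_deg, distance_m, extra=None):
    """Assemble one dashboard update as a JSON string."""
    payload = {
        "timestamp": time.time(),
        "objects_detected": objects_detected,
        "azimuth_deg": azimuth_deg,
        "distance_m": distance_m,
    }
    if extra:                      # e.g. readings from the IoT sensor node
        payload.update(extra)
    return json.dumps(payload)

msg = machine_payload(1, 42.5, 0.8, extra={"temperature_c": 23.4})
print(msg)
```

Because the payload is plain JSON, the same message can later be forwarded unchanged to a cloud endpoint for remote monitoring.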

The robot arm itself does not require an active cloud connection to carry out its work.

If required, the robot’s information can be made accessible via the web by exporting the dashboard data to a cloud. This can be used for remote monitoring or to display the machine values on mobile devices.

Analysis & Data

This collected data can now be displayed, statistically processed and evaluated.

As a first example, the position of the detected object can be displayed graphically over time, as shown here. This makes it possible to display the trajectory of the object in 3-dimensional space.
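Once positions are recorded over time, simple quantities such as the length of the object's path through 3-D space can be derived. The sample coordinates below are hypothetical.

```python
import math

def path_length(trajectory):
    """Total length of a recorded 3-D trajectory, given as a
    list of (x, y, z) positions in metres, by summing the
    straight-line segments between successive samples."""
    return sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))

# Hypothetical samples of the tracked object's position over time.
track = [(0.0, 0.0, 0.5), (0.1, 0.0, 0.5), (0.1, 0.2, 0.5)]
print(round(path_length(track), 6))  # 0.3
```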

As soon as sufficient data has been collected, it can be displayed and evaluated over time.
This is useful for uncovering correlations between different data streams during fault analysis.
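A basic correlation check between two data streams can be sketched with the Pearson coefficient; the example streams below are hypothetical.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long
    data streams: +1 perfect positive, -1 perfect negative, 0 none."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical streams: rising vibration while the measured
# distance to the object shrinks.
vibration = [0.1, 0.2, 0.4, 0.8, 1.6]
distance  = [1.0, 0.9, 0.7, 0.3, -0.5]
print(round(pearson(vibration, distance), 6))  # -1.0
```

A strong coefficient between two machine values is only a starting point for fault analysis; it flags candidates for closer inspection, not causes.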

This information can be used to predict error states (keyword: predictive maintenance). Here, the machine data would be monitored by a trained neural network. In this way, specific error patterns of a machine can be recognized and early warnings of an imminent impairment of operation can be sent.
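As a very small stand-in for such a monitoring network, even a linear trend fit over recent readings can provide an early warning; the window size, slope threshold, and sample temperatures below are assumptions.

```python
def early_warning(values, window=5, slope_limit=0.1):
    """Fit a least-squares line to the last `window` readings and
    warn when the upward trend exceeds slope_limit per sample."""
    recent = values[-window:]
    n = len(recent)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(recent) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, recent))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den
    return slope > slope_limit, slope

# Hypothetical motor temperature creeping upwards -> warning.
warn, slope = early_warning([40.0, 40.5, 41.2, 42.0, 43.1])
print(warn, round(slope, 2))  # True 0.77
```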

Further application areas

Here on the right you will find some examples of the recognition of people and other objects, such as chairs or vehicles.

To recognize a new, unknown object, the neural network must be trained again, as in our case for our logo. This first requires a sufficiently large, high-quality data set of image material.

As soon as the training has been successfully completed, however, almost any object can be reliably recognized – the range extends from facial recognition to the identification of (partial) products in a factory during the production process for quality assurance.

Object recognition
Defect detection drawing

Smart Navigation can also support applications that inspect hard-to-reach areas. One example would be the use of drones to inspect wind turbines in a fault condition.

Another conceivable scenario is clarifying a confusing situation from the air with the help of thermal images, for example when people are missing in a forest fire and rescue teams have to assess whether or not to enter an area despite the smoke.

The professional time study recording device

Applied ergonomics

We developed the ORTIM a5 specifically for use as a time study device and for recording work sampling studies; put simply, it is a large stopwatch.

  • intuitive to operate
  • touchscreen with integrated handwriting recognition
  • form pad with assigned measurement keys for taking notes on paper
Freely configurable
  • adaptable to any type of time study and any evaluation rule; supports all kinds of time studies in multi-station work (cyclic and non-cyclic processes can be recorded simultaneously)
  • planned-time codes (PCOs) generate planned-time formulas from time studies
  • plug-in keyboard with integrated form holder for extensive text input

You can then use our ORTIMzeit software to evaluate your time studies.
