Forthcoming IoT devices – biometric haptic devices

By NeuroDigital Technologies S.L.

INTRODUCTION

Biometric and haptic devices have been present in our lives for a long time, applied to fields such as healthcare, research, and entertainment.
One of the main uses of biometric information is the analysis of the body's state: wearable and compact devices have been designed to monitor data such as heart rate, body temperature, and blood sugar level, so that these variables can be tracked in real time and any related problem can be detected quickly.
In most of these cases, the final objective is to obtain data related to the body's activity.
In the case of haptic devices, their importance for transmitting feedback is well known, and it has mostly been exploited by the entertainment industry in the shape of haptic gamepads and other kinds of controllers, which use it to deepen the player's immersion in the experience.
Biometrics is also a broad term that we sometimes relate only to the monitoring of body organs, blood characteristics, brain signals, and so on, but the body's own position, and the way each of its bones is arranged at any given moment, is also a feature worth monitoring: it helps to contextualize the available information and is very valuable for several kinds of activities.

XR EXPERIENCES

With the arrival of virtual and augmented reality experiences, where the user can not only visualize a digital world with 360º of freedom, but also use their hands to interact with it and even move around physically, the next logical step was to provide a feedback system that could match this level of visual immersion.
The headset manufacturers focused on the development of multipurpose controllers that are easy to use, and the feedback was mainly included in them, in much the same way the video game industry implemented haptic feedback in its gamepads. The reason is simple: the controllers are required anyway, and including any other kind of device, such as a wearable one, would increase the overall cost of the product.

This extra cost wouldn't be a problem if these products had the industrial field as their main target. However, the most advanced headsets have been created as part of a medium- to long-term mass adoption campaign, so it has become an important handicap, even though the manufacturers understand that natural manipulation and realistic feedback matter and that a more immersive solution must also be provided in the medium to long term.

SENSORIAL XR

Sensorial XR was created as an answer to the needs of the industrial market, in order to improve the immersion of VR/AR experiences, but it can be used with any kind of application, as we will see later.

Sensorial XR is a haptic glove with two main functionalities:

  1. Virtualize the user's hand, so that palm and finger movements can be integrated into a virtual experience with high precision.
  2. Provide haptic feedback rich enough to simulate all kinds of sensations.

These two functionalities are required to obtain a fully immersive experience: the most natural controllers are the user's own hands, and whenever they perform any kind of action in a digitally designed environment, feedback is expected to take place.

HAND VIRTUALIZATION – MOTION CAPTURE

In order to virtualize the hand, it is necessary to estimate the relative positions of the palm and fingers in space. This is possible using different kinds of technologies, with the most common solutions based on computer vision or on sensors.
The main difference between these two approaches is that computer vision requires a camera placed in such a way that it can clearly see the user's hands, while sensors require a correct initial calibration to work properly.
In the current state of the art, headsets integrate cameras for computer vision on their own surface, so real hand tracking can be achieved with them. However, this solution has two main problems. On the one hand, the image capture capabilities of the cameras: the ones used in these devices can't track fast hand movements, as the images become blurry. On the other hand, a computer-vision-based system can't determine the hand position when the hand is hidden by overlapping objects or when the point of view is not good enough.
The sensor-based approach is more robust, as good-quality sensors will work correctly regardless of where the hands are placed or how fast they move. However, it is not a software-only solution, so it requires its own manufacturing process and some additional hardware maintenance.
In the case of Sensorial XR, motion capture is performed using the sensor-based approach, which makes it more suitable for tasks such as AR maintenance in industrial environments. To capture the motion of the palm and the rotation of each finger, Sensorial XR includes 7 IMUs (Inertial Measurement Units). Each of them is a 9-axis unit running at over 225 Hz, with a sample rate above 200 Hz.
The next figure shows the position of each IMU. Of the 7 IMUs, one is placed in the center of the palm and one at the base of each finger. In the case of the thumb, which has a more complex behavior, an additional IMU is placed at the second joint to capture its movement correctly.
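
To illustrate how this layout can drive a virtual hand, the following is a minimal sketch of composing finger rotations relative to the palm from per-IMU orientations. The Quat type, the readImuQuaternion() call, and the index layout are hypothetical illustrations, not the actual Sensorial XR API:

    // Minimal sketch: composing a hand pose from the 7 IMU orientations.
    #include <array>
    #include <cstddef>

    struct Quat {
        float w, x, y, z;

        // Hamilton product: combines two rotations.
        Quat operator*(const Quat& q) const {
            return {w * q.w - x * q.x - y * q.y - z * q.z,
                    w * q.x + x * q.w + y * q.z - z * q.y,
                    w * q.y - x * q.z + y * q.w + z * q.x,
                    w * q.z + x * q.y - y * q.x + z * q.w};
        }

        // For unit quaternions the conjugate is the inverse rotation.
        Quat conjugate() const { return {w, -x, -y, -z}; }
    };

    // Hypothetical driver call: absolute orientation reported by one IMU.
    Quat readImuQuaternion(std::size_t index);

    // Layout described in the text: palm, one IMU per finger base, plus an
    // extra one on the thumb's second joint.
    enum ImuIndex : std::size_t {
        Palm = 0, ThumbBase, ThumbJoint2, IndexBase, MiddleBase,
        RingBase, PinkyBase, ImuCount
    };

    struct HandPose {
        Quat palm;                              // absolute palm orientation
        std::array<Quat, ImuCount - 1> joints;  // finger rotations, palm-relative
    };

    HandPose captureHandPose() {
        HandPose pose;
        pose.palm = readImuQuaternion(Palm);
        const Quat palmInv = pose.palm.conjugate();
        for (std::size_t i = ThumbBase; i < ImuCount; ++i) {
            // Express each finger IMU in the palm's frame so the virtual
            // hand model can be driven joint by joint.
            pose.joints[i - 1] = palmInv * readImuQuaternion(i);
        }
        return pose;
    }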

THE NEED TO INCLUDE SMART CONTROLLER FUNCTIONALITIES

In order to reach the pinnacle of immersion, actions need to take place in a way similar to how they are performed in reality, and this requires using our own body movements in detail to provide a realistic interaction.
However, let's think about the body, and how difficult it can sometimes be to repeat the same gesture or put our body in a certain position. Would that be a good way to trigger actions? In these cases, fuzzy logic is applied so that the performed gesture only needs to be similar to the reference one, not exactly the same, as sketched below. Now let's think about different people, whose bodies will also be different. It is still possible to search for a pattern that will be recognized the same way for all users, so this shouldn't be a problem either.
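
A minimal sketch of this kind of tolerance-based matching, assuming the pose is reduced to a fixed set of joint angles (the representation and the threshold are illustrative choices, not the actual recognizer):

    // Tolerance-based ("fuzzy") gesture matching: the pose triggers only
    // if every joint angle is close enough to the reference pose.
    #include <array>
    #include <cmath>
    #include <cstddef>

    constexpr std::size_t kJointCount = 15;       // e.g. 3 angles per finger
    using Pose = std::array<float, kJointCount>;  // joint angles in radians

    bool matchesGesture(const Pose& current, const Pose& reference,
                        float toleranceRad = 0.25f) {
        for (std::size_t i = 0; i < kJointCount; ++i) {
            if (std::fabs(current[i] - reference[i]) > toleranceRad) {
                return false;  // one joint too far off: no match
            }
        }
        return true;  // every joint within tolerance: gesture recognized
    }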
Finally, let's think about the kind of actions these body poses can trigger. If we are thinking of a game, maybe a gesture will just confirm the current selection and serve as a way to navigate the user interface. There is no risk in this kind of approach.
Now let's think of a teleoperated robot arm that may be performing surgery. What would happen if the movements of the teleoperator's body matched one of the predefined action poses by mistake? It could trigger an event that makes the teleoperated robot perform an undesired action, with fatal consequences.
For this reason, even though body-tracking control is a great asset for remote control, it is necessary to include a way to trigger certain tasks only when the user actually intends to, with 100% accuracy.
In Sensorial XR, this requirement resulted in the creation of the Smart Controller system.

The haptic gloves include conductive fabric surfaces on the palm and on the tips of the thumb, index, and middle fingers. When these surfaces touch each other, a 100% accurate signal is triggered. This way, even though the user is not holding any kind of controller, certain gestures in which these conductive surfaces touch each other can be used as smart controllers/buttons to trigger a given action.
The main difference from gestures recognized through motion tracking is that the smart controller is not based on any kind of fuzzy logic. The signal is only triggered when there is physical contact between the fabric surfaces, so the system can't misinterpret the operator's intention.
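
As a sketch of how such binary contact signals might be consumed, consider the following. The bitmask layout and readContactMask() are hypothetical; the key property is that a contact either exists or it doesn't, so no fuzzy matching is involved:

    // Mapping conductive-surface contacts to discrete actions.
    #include <cstdint>
    #include <functional>
    #include <map>

    enum ContactBit : std::uint8_t {
        ThumbToIndex  = 1 << 0,  // thumb tip touching index tip
        ThumbToMiddle = 1 << 1,  // thumb tip touching middle tip
        IndexToPalm   = 1 << 2,  // index tip touching palm
        MiddleToPalm  = 1 << 3,  // middle tip touching palm
    };

    std::uint8_t readContactMask();  // hypothetical driver call

    void pollSmartControllers(
        const std::map<std::uint8_t, std::function<void()>>& bindings) {
        static std::uint8_t previous = 0;
        const std::uint8_t current = readContactMask();
        for (const auto& [mask, action] : bindings) {
            // Fire on the rising edge: the contact pattern just appeared.
            if ((current & mask) == mask && (previous & mask) != mask) {
                action();
            }
        }
        previous = current;
    }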

HAPTIC FEEDBACK

As mentioned previously, haptic signals, which use vibrations to stimulate the user and make them feel sensations, are the solution most manufacturers bet on to address the feedback requirements of immersive technologies.
However, the current proposals from the leading companies have been too tied to the kind of hardware solutions they can provide in a mass consumption product.
Sensorial XR was designed specifically for this purpose and aimed at the industrial field, which allowed the glove to implement a more complex haptic feedback system with a total of 10 haptic points: five located at the fingertips and the other five distributed across the palm.

Each haptic actuator is a customized low-latency linear resonant actuator (LRA) that allows one of 1024 vibration profiles to be selected. They operate at 205 Hz with a latency under 30 ms and are used to simulate different kinds of stimuli.
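
The following sketch addresses the 10-point layout described above. The actuator indexing, the profile numbering, and sendHapticCommand() are assumptions for illustration, not the real Sensorial XR interface:

    #include <cstdint>
    #include <initializer_list>

    enum class Actuator : std::uint8_t {
        ThumbTip, IndexTip, MiddleTip, RingTip, PinkyTip,  // fingertip LRAs
        Palm0, Palm1, Palm2, Palm3, Palm4                  // palm LRAs
    };

    // One of the 1024 vibration profiles mentioned in the text (0..1023).
    using Profile = std::uint16_t;

    void sendHapticCommand(Actuator target, Profile profile,
                           std::uint16_t durationMs);  // hypothetical

    // Example: a short tap on the index fingertip for a virtual button
    // press, and a broader palm rumble for an impact.
    void onVirtualButtonPress() {
        sendHapticCommand(Actuator::IndexTip, /*profile=*/12, /*durationMs=*/40);
    }

    void onPalmImpact() {
        for (auto a : {Actuator::Palm0, Actuator::Palm1, Actuator::Palm2,
                       Actuator::Palm3, Actuator::Palm4}) {
            sendHapticCommand(a, /*profile=*/300, /*durationMs=*/120);
        }
    }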

COMMUNICATIONS

A traditional problem with motion tracking and haptic systems is the difficulty of connecting them to the hardware running the application the operator works with. Many systems require special communication dongles because they don't support standard wireless protocols, and while in some cases this may not be a problem, more specialized hardware can't deal with this kind of limitation easily.

Ideally, a good motion tracking and haptic device should:

  1. Be based on a standardized communication protocol, such as Bluetooth or WiFi, guaranteeing that any system supporting it can connect without any additional hardware element.
  2. Guarantee the performance of the information exchange between the device and the system, so that, for example, in the case of the gloves, a pair of them can be operated simultaneously without causing a communication bottleneck.
  3. Guarantee an acceptable connection from at least a conservative distance from the receiver.
  4. Implement low-energy communication profiles, or any other technique that minimizes the impact of communication on battery drain.

As part of the iNGENIOUS activities, Sensorial XR has been improved to implement a wireless connection based on the BLE (Bluetooth Low Energy) 5.0 protocol for easy connection with any smart device. This way, additional BT dongles or similar solutions are unnecessary.
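
Conceptually, the data flow looks like the sketch below: the host subscribes to a GATT characteristic and receives sensor frames as notifications, with no dongle involved. Every name here (the ble:: wrapper over the platform stack, the device name, the UUID, the frame layout) is a hypothetical illustration, not the actual Sensorial XR GATT profile:

    #include <cstddef>
    #include <cstdint>
    #include <cstring>

    namespace ble {  // hypothetical thin wrapper over the OS BLE stack
    struct Peripheral;
    Peripheral* connect(const char* deviceName);
    void subscribe(Peripheral* p, const char* characteristicUuid,
                   void (*onNotify)(const std::uint8_t* data, std::size_t len));
    }  // namespace ble

    // Illustrative frame: 7 IMU orientations plus a contact bitmask.
    struct GloveFrame {
        float quats[7][4];      // w, x, y, z per IMU
        std::uint8_t contacts;  // smart-controller contact bits
    };

    void onGloveNotify(const std::uint8_t* data, std::size_t len) {
        if (len < sizeof(GloveFrame)) return;  // ignore malformed packets
        GloveFrame frame;
        std::memcpy(&frame, data, sizeof(frame));
        // ...feed the frame into the hand virtualization pipeline...
    }

    int main() {
        ble::Peripheral* glove = ble::connect("SensorialXR-L");  // placeholder name
        ble::subscribe(glove,
                       "0000abcd-0000-1000-8000-00805f9b34fb",   // placeholder UUID
                       onGloveNotify);
        // ...run the application's event loop...
    }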

SOFTWARE INTEGRATION

In order to make it easier for all the iNGENIOUS project partners to use the Sensorial XR gloves with their full extended capabilities, an improved API and SDK have been developed.
In most cases, Sensorial XR is used together with one of the leading third-party video game engines on the market, Unity and Unreal, and our API and SDK were designed to match this situation. However, in some cases, such as teleoperation, more traditional solutions are used: the gloves serve only as a means of providing feedback, or the experience only requires the recognition of smart controller triggers and perhaps the measurements of a couple of IMUs.

In this situation, the original C# API substantially limited the range of systems that could use it to communicate with the glove, so as part of this project the C# API was replaced by a new C++ API that can be easily integrated into a much wider range of systems.
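
One reason a plain C++ API widens integration is that it can expose a thin C-linkage facade, loadable from C, Python (via ctypes), robot middleware, and most other runtimes, which a C#-only assembly cannot easily serve. All names in this sketch are hypothetical, not the shipped Sensorial XR API:

    #include <cstdint>

    // Hypothetical internal C++ API.
    namespace sxr {
    bool connect();
    void playHaptic(std::uint8_t actuator, std::uint16_t profile,
                    std::uint16_t durationMs);
    }  // namespace sxr

    // C-compatible facade: callable from practically any language runtime.
    extern "C" {
    int sxr_connect() { return sxr::connect() ? 1 : 0; }
    void sxr_play_haptic(std::uint8_t actuator, std::uint16_t profile,
                         std::uint16_t durationMs) {
        sxr::playHaptic(actuator, profile, durationMs);
    }
    }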

Furthermore, when operating in virtual environments, it used to be necessary to implement specific behaviors for the interaction of the body (the hands, in this case) with the surfaces to be manipulated virtually, for example a cockpit interior, in order to provide good interaction immersion.

This process is complex, and in most cases the technical personnel who can obtain good results in these tasks come from the game development business rather than from the industrial field. To fill this gap, the original Sensorial SDK APIs for Unity and Unreal were updated to include a physical interaction toolkit, preconfigured for the Sensorial XR gloves, with functionalities such as physical grabs, finger sliding on surfaces, obstacle blocking, and ghost hands that show the difference between the real hand position and the virtual hand position, among many others.
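
As a sketch of the ghost-hand idea: when obstacle blocking stops the virtual hand (e.g. against a cockpit panel) while the tracked hand keeps moving, a translucent ghost is drawn at the real tracked position once the two diverge. The types and threshold below are illustrative, not the actual SDK:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    float distance(const Vec3& a, const Vec3& b) {
        const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // trackedPos: where the glove reports the hand to be.
    // solvedPos:  where the physics solver allowed the virtual hand to go.
    bool shouldRenderGhostHand(const Vec3& trackedPos, const Vec3& solvedPos,
                               float thresholdMeters = 0.02f) {
        return distance(trackedPos, solvedPos) > thresholdMeters;
    }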

iNGENIOUS

Sensorial XR has been used in the iNGENIOUS project to improve the overall interaction with an AGV (Automated Guided Vehicle) under different circumstances and requirements.
In the first case, the glove's haptic capabilities have been used to improve the feedback provided to the AGV teleoperator, whose task is to drive it in an industrial environment (the port of Valencia, in this case) from a remote virtual cockpit. This task requires 100% accuracy, and due to the nature of the action, the teleoperator must be fully aware of the AGV's surroundings. In order to drive the AGV, a real-time video feed is shown in the virtual cockpit, but to provide additional real-time information without forcing the operator to shift attention away from the action, haptic stimuli in the gloves are triggered to indicate different issues that may be happening around the AGV.
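
One plausible mapping from the AGV's surroundings to the glove haptics is sketched below: the closer an obstacle on a given side, the stronger the profile played on the matching side of the hand. The sensor range, actuator mapping, and profile band are assumptions for illustration:

    #include <cstdint>

    void sendHapticCommand(std::uint8_t actuator, std::uint16_t profile,
                           std::uint16_t durationMs);  // hypothetical, as above

    // distanceMeters: obstacle distance reported on one side of the AGV.
    // actuator:       haptic point on the matching side of the hand.
    void signalProximity(float distanceMeters, std::uint8_t actuator) {
        constexpr float kAlertRange = 5.0f;         // start warning within 5 m
        if (distanceMeters >= kAlertRange) return;  // nothing nearby: stay silent
        // Scale intensity inversely with distance into an illustrative
        // band (profiles 100..400 of the 1024 available).
        const float urgency = 1.0f - distanceMeters / kAlertRange;  // 0..1
        const auto profile = static_cast<std::uint16_t>(100.0f + urgency * 300.0f);
        sendHapticCommand(actuator, profile, /*durationMs=*/80);
    }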
In the second case, the hand motion capture and the smart controllers are used to support the navigation of an AGV that receives instructions from a virtual teleoperation environment. In this case, the AGV moves autonomously to the position defined in the software, and the gloves are used both to point at the desired position as a natural input, by means of the finger motion capture, and to trigger the actions using the smart controllers.
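
A minimal sketch of this point-and-confirm interaction: the pointing finger's ray is intersected with the ground plane to pick a target, and the binary smart-controller contact confirms it before any command is sent to the AGV. The geometry and function names are illustrative:

    #include <optional>

    struct Vec3 { float x, y, z; };

    // Intersect a pointing ray with the ground plane (y = 0).
    std::optional<Vec3> pickGroundTarget(const Vec3& origin, const Vec3& dir) {
        if (dir.y >= 0.0f) return std::nullopt;  // pointing upward: no hit
        const float t = -origin.y / dir.y;       // distance along the ray
        return Vec3{origin.x + t * dir.x, 0.0f, origin.z + t * dir.z};
    }

    bool smartControllerPressed();         // hypothetical binary contact check
    void sendAgvGoal(const Vec3& target);  // hypothetical AGV navigation command

    void update(const Vec3& fingerOrigin, const Vec3& fingerDir) {
        const auto target = pickGroundTarget(fingerOrigin, fingerDir);
        // The goal is dispatched only on an unambiguous, binary confirmation,
        // never on the pointing gesture alone.
        if (target && smartControllerPressed()) {
            sendAgvGoal(*target);
        }
    }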