
Nov 6, 2020

Gesture Recognition In The New World Of Human–Device Interaction Patterns

Ever since the advent of computing, humans have been trying to evolve human-device interaction toward something that feels natural. There was a time when computers did not even have a screen; back then, all you could do was type documents on an old-style, typewriter-like keyboard.

Even today, many TVs come with a remote: a much smarter remote, but still a handheld device that transmits human commands as instructions to the TV. Then the GUI came along and made interaction between a person and a computer a lot easier. We now have dialog boxes, windows, drag-and-drop elements, and icons for everything.

Even today, we still need a mouse to communicate our thoughts for the computer to process accurately. But with the advent of AI, a new wave of human-computer interaction patterns has started to find its way into mainstream use.

These are interaction patterns that do not need intermediary translators like a mouse, a keyboard, or even visual elements on a screen. One example is AI-enabled speech recognition, the interaction pattern behind voice-activated assistant devices like Alexa. With nothing but your voice, you can instruct Alexa to place an order at McDonald's or look up writing review services like Pick The Writer or Writing Judge.

However, our focus in this article is gesture recognition and control. Gesture recognition as a new interaction pattern aims to teach our devices to recognize human movement and immediately interpret what it means, much the way humans interact with each other. Gesture recognition falls under the new era of human-device interaction known as the Natural User Interface (NUI).

Human gestures include facial expressions, hand movements, sign language, gait, posture, and more. Gesture recognition tech utilizes depth-sensing 3D cameras, infrared (IR) cameras, and complicated mathematical algorithms to record human movement and process it into understandable instructions for our devices.
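To make the idea concrete, here is a minimal sketch of landmark-based hand-gesture recognition. It assumes an ordinary RGB webcam plus the OpenCV and MediaPipe Hands libraries rather than the depth-sensing or IR hardware mentioned above, and the fingertip-counting heuristic is a deliberately simple stand-in for the far more sophisticated algorithms real products use.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Landmark indices for the four fingertips and the PIP joints below them.
FINGER_TIPS = [8, 12, 16, 20]
FINGER_PIPS = [6, 10, 14, 18]

def classify(landmarks) -> str:
    """Very rough gesture guess: count fingers whose tip sits above its PIP joint."""
    extended = sum(
        1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip].y < landmarks[pip].y  # y grows downward in image coordinates
    )
    if extended >= 4:
        return "open_palm"
    if extended == 0:
        return "fist"
    return "unknown"

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            gesture = classify(results.multi_hand_landmarks[0].landmark)
            print(gesture)  # a real application would map this label to a device instruction
cap.release()
```

In practice, the "open palm" or "fist" label printed here is exactly the kind of intermediate output a gesture-enabled device translates into an action such as pausing playback or answering a call.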

Gesture-enabled devices and tech already exist all around us

We've already begun to see applications of this technology across many devices, especially in the automotive, healthcare, software design, and manufacturing industries. Gesture and hand recognition features are present in smartphones, video game consoles, vehicles, medical equipment, and much more.

An excellent example of this is the Samsung Galaxy Note 10. This high-end smartphone comes with the now-expected facial recognition and gesture navigation features.

These features have already become a staple of many smartphone brands. Swiping, pinching, and double-tapping are considered traditional gesture movements on smartphones. This plants today's smartphones firmly in the category of touch gesture input devices.
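As a toy illustration of how such touch gestures are distinguished in software, the sketch below labels a one-finger track as a tap or a directional swipe from nothing more than its start and end points. The distance and duration thresholds are arbitrary assumptions; production platforms ship their own, far more robust gesture recognizers (Android's GestureDetector, iOS's UIGestureRecognizer).

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float  # pixels
    y: float  # pixels
    t: float  # seconds

def classify_touch(start: TouchPoint, end: TouchPoint,
                   min_distance: float = 50.0,   # assumed threshold, pixels
                   max_duration: float = 0.5) -> str:  # assumed threshold, seconds
    """Label a single-finger track as a tap or a directional swipe."""
    dx, dy = end.x - start.x, end.y - start.y
    moved = (dx * dx + dy * dy) ** 0.5
    if end.t - start.t > max_duration or moved < min_distance:
        return "tap"
    if abs(dx) > abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"  # y grows downward on screens

print(classify_touch(TouchPoint(100, 300, 0.00), TouchPoint(320, 310, 0.18)))  # swipe_right
```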

Back in 2010, Microsoft was the first to bring gesture recognition technology to market with the Kinect, a motion-sensing accessory for its Xbox 360 gaming console. It allowed gamers to use hand and body motion for a more immersive and intuitive real-time gaming experience, freeing them from the restriction of joysticks and keypads. Kinect's application has since expanded, and it now lives on as part of Azure, Microsoft's cloud-based service.

Tobii, a technology company based in Sweden, designed the Tobii Rex, a device that takes eye-device coordination to another level. This tiny rectangular device tracks the eye's movement via infrared light and interprets those movements as specific actions on many devices. The Tobii Rex was created mainly so that people with a physical disability can maintain complete autonomy while working on their computers with nothing but eye movement.

Digital Habit's OSOUND is a wall-mounted, circular audio system that can connect to any Bluetooth-enabled device to play music, listen to the radio, and more. OSOUND comes with a powerful gestural interface that gives the user complete control of the system via recognized gestures: pause or play music, skip to the next or previous song, adjust the volume, and more.

And then there's the BMW 7 Series. In 2016, the German automaker introduced gesture recognition as a feature of its 7 Series cars. The gesture controls allow you to take or reject calls, raise or lower the stereo volume, adjust the vehicle's multi-camera angles, and much more.

And just for the heck of it, there's also an unassigned two-finger gesture that the driver can customize to trigger anything from 'plot fastest route' to more bizarre instructions like 'order a pizza.'

A plethora of other companies are also doing great things and solving critical problems in society. Leap Motion seeks to incorporate gesture recognition into virtual reality. Gestigon and uSens create SDKs (software development kits) that help developers add gesture recognition to their clients' apps. Limix builds technology that records sign language and translates it into audible words on a phone, speaker, or voice synthesizer.

The clear benefits of gesture recognition technology

One of the most apparent benefits of gesture recognition is that it significantly reduces wear on devices. Interaction peripherals like keyboards, volume buttons, and even touchscreens inevitably wear out over time.

This is because of the constant, repeated contact a user has to make with the device in order to use it. Gesture recognition eliminates this: as a mostly touchless form of user interaction, it involves no direct contact with the device during operation.

Gesture recognition also makes for a much richer, more intuitive, flexible, and experimental approach. Not only can users enjoy the traditional forms of gesture input, but with some gesture input devices they can also create their own unique gestures and assign actions to them.
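In software terms, that kind of customization usually boils down to a mapping from recognized gesture labels to user-chosen actions. Here is a minimal, hypothetical sketch of such a binding layer; the gesture names and actions are invented for illustration and are not taken from any of the products mentioned above.

```python
from typing import Callable, Dict

class GestureActionMap:
    """Lets a user bind any recognized gesture label to an action of their choice."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], None]] = {}

    def assign(self, gesture: str, action: Callable[[], None]) -> None:
        """Create or overwrite the binding for a gesture."""
        self._bindings[gesture] = action

    def dispatch(self, gesture: str) -> None:
        """Run the bound action, if any, when the recognizer reports a gesture."""
        action = self._bindings.get(gesture)
        if action is not None:
            action()

# Hypothetical usage: the driver-style customization described earlier.
controls = GestureActionMap()
controls.assign("two_finger_circle", lambda: print("Plotting fastest route..."))
controls.assign("open_palm", lambda: print("Pausing playback"))
controls.dispatch("two_finger_circle")   # -> Plotting fastest route...
```

The design choice is simple: the recognizer only ever emits a label, and the binding layer decides what that label means for this particular user, which is what makes custom gestures possible in the first place.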

This NUI technology is beginning to pop up in more devices, and it can't come fast enough. So far, it has made its way into our gaming experience, smartphone use, apps, medical and surgical initiatives, infotainment devices and venues, vehicles, and more.

According to a report by Grand View Research, the global gesture recognition and control market is growing rapidly. It is expected to hit a total value of around $30 billion by 2025 as more markets and industries deploy it as a significant feature in their product and service lineups.

Final Words

Remember Minority Report, Tom Cruise and Steven Spielberg's 2002 sci-fi classic? That movie was an early sign that gesture control has been a long time coming. Cruise's sensor-laden gloves seemed futuristic to most viewers at the time, but the world we live in today proves that the filmmakers were right all along.

Much like speech recognition, gesture control is still in its infancy. The ways this technology can bolster human-device interaction are still being learned and implemented, and there are still challenges on the road to mainstream acceptance and deployment.

These include perfecting the technology's accuracy; the logistics of peripheral setup and space management; hardware immobility and commercial-scale limitations; adequate safety and operational regulations; and public perception, among other things.

But these challenges and barriers to entry are not insurmountable. It is only a matter of time before the world takes collective notice of the immense promise of gesture recognition and other NUI technologies for the global economy. Looking now for ways to position your brand and product or service offering for this trend can only be a good thing in the coming world of effective touchless interaction.

Explore our collections now!

We'll leave you with a selection of the most creative interactions out there in our collection Drag, Gestures & Other Interactions, where you'll find the best examples from recent web projects. Don't miss the many more examples in our Collections.



About the Author

Ana Medina has enjoyed writing since her university years. After graduating from the Interpreters Department, she realized that translation was not as interesting to her as writing. She now hones her skills as a freelance writer covering a variety of topics, and she always does her best in her posts and articles.