No futuristic movie or TV show is complete without a scene at mission control where the hero, or a geeky assistant, waves an arm through the air to summon a holographic screen, then reveals plot-changing information by pointing and swiping.
Such cool representations are well ahead of existing products, but gestural recognition – where human movements are registered by sensors to trigger and direct devices – is an active research area and an evolving innovation that has captured the popular imagination.
It's a fascination that predates the digital era. Audiences in the 1920s suspected sorcery when Russian physicist Leon Theremin waved his hands to produce melodic cadences from his new invention, the world's first electronic musical instrument.
Known as a theremin, it has two metal antennas that create an electromagnetic field. Precise hand and finger movements disturb that field, controlling the radio-frequency oscillators that generate its eerie, other-worldly tone – one that can trace a musical scale and even imitate a violin.
But theremins are treacherously difficult to master. Players must pluck the exact gestures the instrument demands out of thin air, and perform them with scrupulous precision to avoid a squawking cacophony.
Modern computer-based devices are more forgiving. They can learn, or be programmed, to recognise and accept some deviation within defined gestures.
“Because a gesture recognition system usually tries to identify the unique pattern of a gesture, we have to keep the same pattern so that the system is able to detect it correctly,” says Dong Ma, Assistant Professor of Computer Science in the School of Computing and Information Systems at Singapore Management University (SMU).
“However, as humans are not perfect, it is impossible to perform a gesture exactly the same twice. [Modern] recognition systems usually have a certain degree of freedom to bear the slight variations of the gestures performed by different users, or over time.”
Powered by light
Professor Ma is a research partner in a recent project that demonstrates how solar panels can be used to recognise gestural commands.
It is based on the observation that hand gestures near a solar-powered device interfere with the incident light rays in a unique way, leaving a distinguishable signature in the harvested photocurrent.
The researchers built a laboratory prototype called SolarGest, which achieved 96 percent accuracy in detecting gestures while consuming 44 percent less power than light-sensor-based systems. The sensing itself uses no power at all, because it rides for free on the solar panel's energy harvesting function.
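To make the idea concrete, here is a minimal sketch of what such a photocurrent "signature" might look like; the sampling rate, dip shape and noise level are invented for illustration and are not drawn from the SolarGest implementation.

```python
# A minimal sketch, not the SolarGest implementation: simulate the
# photocurrent trace a solar cell might produce as a hand sweeps over
# it, and extract the gesture's "signature" as the deviation from the
# ambient baseline. Sampling rate, dip shape and noise are invented.
import numpy as np

def synthetic_photocurrent(seconds=2.0, rate=200, baseline=1.0):
    """Return (time, photocurrent) in arbitrary units at `rate` Hz."""
    t = np.linspace(0, seconds, int(seconds * rate))
    # A passing hand briefly blocks incident light, producing a dip
    # whose depth and shape depend on the gesture performed.
    dip = 0.4 * np.exp(-((t - 1.0) ** 2) / 0.02)
    noise = 0.01 * np.random.randn(t.size)
    return t, baseline - dip + noise

t, current = synthetic_photocurrent()
signature = current - np.median(current)   # deviation from baseline
print(f"peak attenuation: {-signature.min():.3f} (arbitrary units)")
```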
The experiment considered a wide variety of settings: three solar cells with different energy harvesting efficiencies and transparencies; five light intensity levels for indoor and outdoor combined; six hand gestures; three subjects to perform the gestures; and scenarios with and without human interference, such as someone walking past.
In each session, subjects were asked to perform six gestures 40 times. The data was gathered over five days, yielding a dataset of 6,960 gestures that was then classified using machine learning.
“The reason why we can differentiate gestures is that each gesture has its unique pattern. We utilise machine learning to automatically learn these patterns and find a way to separate them in the feature space,” says Professor Ma.
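A hedged illustration of that idea: the sketch below reduces each (synthetic) photocurrent trace to a handful of features and trains an off-the-shelf support vector machine to separate six gesture classes in that feature space. The features and classifier are assumptions chosen for brevity, not the pipeline reported by the team.

```python
# Illustrative assumptions throughout: reduce each trace to a few
# hand-picked features and let an off-the-shelf SVM separate the six
# gesture classes in that feature space.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def features(trace):
    """Map a raw photocurrent trace to a small feature vector."""
    return [trace.min(),                      # deepest attenuation
            trace.max() - trace.min(),        # dynamic range
            np.argmin(trace) / len(trace),    # where the dip occurs
            np.std(trace)]                    # overall variability

# Stand-in data: 40 repetitions of six synthetic "gestures" whose
# dips differ in depth (class 0 has no dip at all).
rng = np.random.default_rng(0)
traces = [rng.normal(1.0, 0.01, 400) - g * 0.1 * np.hanning(400)
          for g in range(6) for _ in range(40)]
labels = [g for g in range(6) for _ in range(40)]

X = np.array([features(tr) for tr in traces])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = SVC().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```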
Towards a smarter watch
The researchers minimised the perennial problem of human inconsistency through the sheer size of the dataset: subjects were asked to intentionally perform gestures slightly differently each time, exposing the machine learning-based classifier to more variations of the same gesture.
Another strategy was to pre-process the signal before it was analysed for classification. The researchers also devised an improved version of a segmentation algorithm to better detect start and stop points.
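As a rough illustration of what segmentation involves, the sketch below finds a gesture's start and stop points by thresholding a trace's deviation from its quiet baseline; the smoothing window and threshold rule are assumptions, not the team's improved algorithm.

```python
# A sketch of threshold-based segmentation; the smoothing window and
# threshold rule are assumptions, not the paper's improved algorithm.
import numpy as np

def segment(trace, k=4.0, win=10):
    """Return (start, stop) sample indices of the active region,
    or None if no gesture is detected."""
    # Smooth the trace, then measure its deviation from the baseline.
    smooth = np.convolve(trace, np.ones(win) / win, mode="same")
    deviation = np.abs(smooth - np.median(smooth))
    # Robust noise scale from the raw trace (median absolute deviation).
    mad = np.median(np.abs(trace - np.median(trace)))
    active = np.flatnonzero(deviation > k * mad)
    if active.size == 0:
        return None
    return int(active[0]), int(active[-1])
```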
Any device fitted with solar panels for energy harvesting could also recognise gestures. Since solar energy harvesting responds to any form of light, SolarGest could find applications both indoors and outdoors. For example, users could make purchases from solar-powered vending machines, configure solar-powered garden lights, or operate solar-powered calculators simply by gesturing.
But a disincentive to widespread use is that “conventional solar panels are opaque, bulky and difficult to integrate into electronic devices”, Professor Ma says.
More appealing is the possibility of designing applications around the emerging technology of transparent solar panels. Imagine a smart watch whose screen is an energy-harvesting solar panel: it would both power the device and recognise gesture commands, while leaving the information on the watch face clearly visible.
“When the project was initiated, there were no transparent solar cells available in the market,” says Professor Ma. “So we collaborated with researchers from the School of Photovoltaic and Renewable Energy Engineering at UNSW Sydney. They designed and fabricated a transparent solar cell made from organic materials.”
The SolarGest system can work equally well with transparent cells, but there is a practical limitation in that “the energy harvesting efficiency is still quite low compared to traditional silicon-based solar cells”, says Professor Ma.
Simulation model
Specialist manufacturers across the globe are working to improve the performance of transparent solar cells, notably for use as windows, but it may be some time before the technology is affordably available – complete with development kits – for potential Internet of Things (IoT) devices, such as a solar smart watch. Analysing the impact of backlight on gesture recognition will be a future task for the SolarGest team.
The researchers have created a theoretical model for simulating hand gestures under different conditions, allowing would-be developers to estimate the gesture-recognition performance of their devices without the time and expense of testing on real hardware.
“In addition, the model helps us to have a better understanding [of] why and how solar cells can be used for gesture recognition,” Professor Ma says.
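To give a toy flavour of what such a simulator might do, the sketch below models the cell's output as proportional to its unshaded area while a circular "hand" sweeps past. The geometry, sizes and light model are invented for illustration and are not the team's published model.

```python
# A toy occlusion model, invented for illustration: the cell's output
# is taken as proportional to its unshaded area while a circular
# "hand" sweeps across it. Sizes, speed and light model are assumed.
import numpy as np

def simulate_sweep(cell_side=0.04, hand_radius=0.03, rate=200,
                   speed=0.1, intensity=1.0):
    """Photocurrent (arbitrary units) for a hand sweeping over a
    square solar cell of side `cell_side` metres."""
    # Hand centre positions over a 0.2 m sweep at `speed` m/s.
    xs = np.linspace(-0.1, 0.1, int(0.2 / speed * rate))
    grid = np.linspace(-cell_side / 2, cell_side / 2, 50)
    gx, gy = np.meshgrid(grid, grid)          # points on the cell
    trace = []
    for x in xs:
        shaded = (gx - x) ** 2 + gy ** 2 < hand_radius ** 2
        trace.append(intensity * (1.0 - shaded.mean()))  # lit fraction
    return np.array(trace)

trace = simulate_sweep()
print(f"deepest dip: {trace.min():.2f} of baseline")
```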
As affordable transparent cells become commercially available, the SolarGest team intends to extend the simulator with capabilities to analyse the impact of incident lights from both sides of a transparent solar cell.
“And we are continuing to explore other applications for solar-cell based sensing. One of the projects is to use smart home IoT devices that are equipped with a solar cell to do indoor localisation,” Professor Ma says.