New app to let devices watch your surroundings 24x7
The concept behind the RedEye app is to let our computers assist us by showing them what we see throughout the day.
Washington: Scientists have developed an app that gives devices continuous vision while remembering only specific things, an advance that may turn computers and smartphones into personal assistants that help people in their daily lives.
"The concept is to allow our computers to assist us by showing them what we see throughout the day," said Lin Zhong, professor at Rice University in the US.
"It would be like having a personal assistant who can remember someone you met, where you met them, what they told you and other specific information like prices, dates and times," Zhong said.
The app RedEye is an example of the kind of technology the computing industry is developing for use with wearable, hands-free, always-on devices that are designed to support people in their daily lives, researchers said.
The trend, which is sometimes referred to as "pervasive computing" or "ambient intelligence," centres on technology that can recognise and even anticipate what someone needs and provide it right away.
"The pervasive-computing movement foresees devices that are personal assistants, which help us in big and small ways at almost every moment of our lives," Zhong said.
The bottleneck for continuous vision is energy consumption, he added: even the best smartphone cameras are battery killers, especially when they are processing real-time video.
Researchers measured the energy profiles of off-the-shelf image sensors and determined that existing technology would need to be about 100 times more energy-efficient for continuous vision to become commercially viable.
They cut the power consumption of off-the-shelf image sensors tenfold through software optimisation alone. The remaining energy bottleneck, the researchers said, was the conversion of images from analogue to digital format.
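Taken at face value, those two figures leave a sizeable gap. A minimal back-of-the-envelope sketch in Python, using only the roughly 100-fold target and the tenfold software gain reported above, shows how much improvement still has to come from elsewhere, such as avoiding the analogue-to-digital conversion discussed below.

```python
# Back-of-the-envelope sketch using only the figures reported above.
required_gain = 100   # ~100x more energy efficiency needed for commercial viability
software_gain = 10    # ~10x achieved through software optimisation alone

remaining_gain = required_gain / software_gain
print(f"Efficiency gain still needed after software optimisation: ~{remaining_gain:.0f}x")
# That remaining ~10x is what motivates moving work out of the digital domain.
```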
"Real-world signals are analogue, and converting them to digital signals is expensive in terms of energy," said Robert LiKamWa, graduate student at Rice University.
"We decided a better option might be to analyse the signals while they were still analogue," LiKamWa said. The main drawback of processing analogue signals – and the reason digital conversion is the standard first step for most image-processing systems today - is that analogue signals are inherently noisy, LiKamWa said.
To make RedEye attractive to device makers, the team showed that the app could reliably interpret analogue signals, using a combination of the latest techniques from machine learning, system architecture and circuit design.
"The upshot is that we can recognise objects - like cats, dogs, keys, phones, computers, faces, etc - without actually looking at the image itself," LiKamWa said.
"We're just looking at the analogue output from the vision sensor. We have an understanding of what's there without having an actual image," he said.
"This increases energy efficiency because we can choose to digitise only the images that are worth expending energy to create," he added.