Sep 5 2017

Smartphones Will Know Where They Sit

Dan


A new technology created by scientists at the University of St Andrews in Scotland will eventually allow smartphones to act as if they had a sense of touch: they will be able to carry out a number of tasks simply by recognizing the surface they are sitting on. The system, called SpeCam, lets a phone use its built-in front-facing camera to recognize the material it has been placed on. A database of surfaces and their descriptive parameters helps the phone “see” what type of surface it has been left on.
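
To make the idea of a surface database concrete, here is a minimal sketch in Python of how such a lookup might work, assuming the descriptive parameters are reduced to a short reflectance feature vector per material. The material names, feature values and function names are illustrative assumptions, not details from the paper.

```python
import math

# Hypothetical surface database: each material is described by a short
# feature vector (imagined here as average reflectance under red, green
# and blue screen illumination). Real descriptive parameters would come
# from a calibration or training step, not hand-typed values.
SURFACE_DB = {
    "wooden desk":  [0.61, 0.45, 0.30],
    "denim pocket": [0.18, 0.22, 0.35],
    "sofa fabric":  [0.40, 0.38, 0.36],
    "laptop lid":   [0.25, 0.26, 0.28],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify_surface(observed):
    """Return the material whose stored parameters best match the observation."""
    return min(SURFACE_DB, key=lambda name: distance(SURFACE_DB[name], observed))

# A measurement taken with the phone lying face down on a desk.
print(identify_surface([0.58, 0.47, 0.33]))  # -> wooden desk
```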

The researchers say this opens up a wide range of applications, starting with perhaps the most urgent yet most trivial one: telling you where the phone has been left when you can’t find it. More sophisticated behaviours could be triggered by the surface the phone is resting on. For example, if the phone rings and you place it face down on your desk (or on your laptop), it could send a message back to say you are busy in a meeting; if it is put in your pocket, it could suggest that the caller try someone else in your group. The program could also trigger different tasks depending on how the phone is being held, such as changing the volume on a TV set it is pointed at, or playing different kinds of music depending on where the phone is sitting. Since the basic hardware requirements are already present in most (if not all) smartphones, the SpeCam system could be enabled through a simple software update or by installing an app.
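
That kind of behaviour amounts to a simple mapping from recognized context to action. The sketch below illustrates the idea; the surface names, orientations and actions are hypothetical, and a real implementation would hook into the phone’s telephony and messaging APIs rather than returning labels.

```python
# Hypothetical mapping from the detected surface (plus phone orientation)
# to the action taken when a call arrives.
CALL_RULES = {
    ("wooden desk", "face_down"): "reply_busy",         # in a meeting
    ("denim pocket", "any"):      "suggest_colleague",  # phone is in a pocket
    ("sofa fabric", "face_up"):   "ring_normally",
}

def handle_incoming_call(surface, orientation):
    """Pick an action for an incoming call based on where the phone is sitting."""
    return (CALL_RULES.get((surface, orientation))
            or CALL_RULES.get((surface, "any"))
            or "ring_normally")

print(handle_incoming_call("wooden desk", "face_down"))  # -> reply_busy
print(handle_incoming_call("denim pocket", "face_up"))   # -> suggest_colleague
print(handle_incoming_call("kitchen table", "face_up"))  # -> ring_normally
```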

This new development will be presented at the prestigious 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), on September 6th in Vienna, Austria. The paper, entitled “SpeCam: Sensing Surface Color and Material with the Front-Facing Camera of a Mobile Device”, received an Honourable Mention Award, one of only five papers in the entire program to earn that distinction.

Lead researcher Professor Aaron Quigley, Chair of Human Computer Interaction at the University, said: “This is an example of what we call Discreet Computing or discreet interaction, where subtle and inconspicuous user actions can result in entirely new forms of interaction. By training and hence recognizing materials all the surfaces around us become a canvas for our imagination.”

The clever idea is to use the smartphone’s screen as a multi-spectral light source and the front-facing camera to capture the light it reflects off the surface. The program then analyzes how the different colors are reflected in the captured images to identify the material the phone is sitting on and, from that, to infer the device’s location.
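
A rough sketch of that sensing loop might look like the following. The helpers set_screen_color() and capture_frame() are hypothetical stand-ins for the platform’s display and camera APIs, and each frame is reduced to a single mean brightness value; the actual SpeCam pipeline is more sophisticated than this.

```python
# Sketch of the sensing loop: flash a sequence of solid screen colors,
# capture the reflection with the front camera, and summarize each frame
# into a single reflectance value to build a feature vector.
ILLUMINATION_COLORS = ["red", "green", "blue", "white"]

def mean_brightness(frame):
    """Average pixel intensity of a frame given as a flat list of values."""
    return sum(frame) / len(frame)

def measure_reflectance(set_screen_color, capture_frame):
    """Return one reflectance value per screen illumination color."""
    features = []
    for color in ILLUMINATION_COLORS:
        set_screen_color(color)   # light the surface with this color
        frame = capture_frame()   # the front camera sees the reflection
        features.append(mean_brightness(frame))
    return features

# Stand-in "hardware" so the sketch runs without a real screen or camera.
fake_frames = {"red": [200, 190], "green": [120, 130],
               "blue": [60, 70], "white": [220, 210]}
state = {"color": "white"}
set_color = lambda c: state.update(color=c)
capture = lambda: fake_frames[state["color"]]

print(measure_reflectance(set_color, capture))  # -> [195.0, 125.0, 65.0, 215.0]
```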

Hui-Shyong Yeo, whose PhD research led to this work, added: “The goal is to explore novel forms of interaction for one-handed use which can leverage what people already do with devices, without adding new hardware to the device.”

Speaking of exploring novel forms of interaction, we at Tenex Developers place a high priority on research. Give us a call or send us an email, and we will be glad to show you how we can bring your ideas to life with the latest technology, and delight your customers in the process.