London, Sep 3 (ANI): Ever wished that your home could tell you if you have left a stove burner on after making your breakfast? Well, it is now possible, thanks to the new sensor-stuffed apartment created by researchers at Washington State University in Pullman.
The smart home, known as Casas and developed by Diane Cook and colleagues, can learn the ways of its inhabitants by observing their daily habits and how they use different appliances every day.
The technology could be used in houses to support people with cognitive difficulties or dementia with their daily living needs, or to make things easier for healthy people.
For example, the apartment can recognise when a person is performing actions associated with making breakfast and can prompt them with audio and video signals to warn them of any anomaly, such as a stove left burning.
Casas was developed to analyse the sensors' output, and graduate student Parisa Rashidi has improved the system so that it can learn a person's habits without prior assumptions about what events or patterns to expect.
Unlike previous smart homes, which used cameras and had to pre-define key activities before recognising them, the new system was successfully tested in a specially outfitted apartment on campus with a single resident.
It required around a month of training to accurately tease out the resident's habits from the sea of sensor data, said Rashidi.
Once trained, Casas can identify patterns as complex as "at 6 am the kitchen light comes on, the coffee maker turns on, and the toaster turns on" without any prior knowledge of what to expect.
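A morning routine like that could, in principle, be uncovered by counting which events keep appearing together across many days. The sketch below is a minimal illustration of that idea, not the researchers' actual algorithm; all sensor names and data are invented:

```python
from collections import Counter
from itertools import combinations

# Hypothetical event sets, one per observed morning (names illustrative).
mornings = [
    {"kitchen_light_on", "coffee_maker_on", "toaster_on"},
    {"kitchen_light_on", "coffee_maker_on", "toaster_on", "radio_on"},
    {"kitchen_light_on", "coffee_maker_on", "toaster_on"},
]

def frequent_pairs(transactions, min_support=3):
    """Return event pairs that co-occur in at least min_support transactions."""
    counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(t), 2):
            counts[pair] += 1
    return {pair for pair, n in counts.items() if n >= min_support}

# The coffee maker, kitchen light and toaster always fire together;
# the radio appeared only once, so it is filtered out.
print(frequent_pairs(mornings))
```

A real system would mine much longer event sequences over weeks of data, but the principle of keeping only patterns above a support threshold is the same.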
To maintain a resident's sense of privacy, Casas works without cameras, RFID chips or microphones. Instead, it uses less "invasive" sensors that detect motion, temperature, light, humidity, water and door contact, as well as the use of key items, such as the opening of a bottle of medication or the switching on of the toaster.
"We don't want to give residents the feeling that Big Brother is watching them," New Scientist quoted Rashidi as saying.
The researchers developed a number of data-mining algorithms to help make sense of the sensor output.
One algorithm uses a grid of motion sensors to map out how a person walks around the home, looking for daily "trajectories", or routes through the house.
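One simple way to pull such trajectories out of a motion-sensor stream is to split the stream wherever there is a long pause between firings and count how often each route recurs. This is a hedged sketch under those assumptions, with invented sensor ids, not the published method:

```python
from collections import Counter

# Hypothetical motion-sensor firings as (timestamp_seconds, sensor_id);
# sensor ids stand in for zones of an assumed floor plan.
events = [
    (0, "bedroom"), (5, "hall"), (9, "kitchen"),
    (100, "bedroom"), (104, "hall"), (108, "kitchen"),
    (200, "kitchen"), (205, "hall"), (210, "living"),
]

def extract_trajectories(events, gap=30):
    """Split the stream into trajectories wherever the time between
    firings exceeds `gap` seconds, then count each distinct route."""
    routes, current, last_t = Counter(), [], None
    for t, sensor in events:
        if last_t is not None and t - last_t > gap:
            routes[tuple(current)] += 1
            current = []
        current.append(sensor)
        last_t = t
    if current:
        routes[tuple(current)] += 1
    return routes

# The bedroom -> hall -> kitchen route occurs twice, so it dominates.
print(extract_trajectories(events).most_common(1))
```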
A second algorithm finds patterns in a sequence of events, such as learning to expect the resident to turn on a tap after turning on the oven.
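A learned expectation of that kind can be approximated by counting which event most often follows another, essentially a bigram model over the event stream. The following is an illustrative sketch with made-up event names, not the system's actual algorithm:

```python
from collections import Counter

# Hypothetical event stream (names illustrative).
stream = ["oven_on", "tap_on", "tv_on", "oven_on", "tap_on", "oven_on", "tap_on"]

def most_likely_next(stream, event):
    """Predict the event most often observed immediately after `event`,
    based on adjacent-pair (bigram) counts in the stream."""
    counts = Counter(zip(stream, stream[1:]))
    followers = {b: n for (a, b), n in counts.items() if a == event}
    return max(followers, key=followers.get) if followers else None

# The tap follows the oven in every observed case.
print(most_likely_next(stream, "oven_on"))
```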
And a third algorithm correlates detected events with the time of day, to identify patterns such as when the person eats dinner.
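The time-of-day idea reduces, in its simplest form, to a histogram: bucket each occurrence of an event by the hour it was observed and read off the peak. A minimal sketch with invented observations, assuming timestamps are already reduced to hours:

```python
from collections import Counter

# Hypothetical (hour_of_day, event) observations over several days.
observations = [
    (18, "oven_on"), (19, "oven_on"), (18, "oven_on"),
    (7, "toaster_on"), (7, "toaster_on"), (18, "oven_on"),
]

def typical_hour(observations, event):
    """Return the hour of day at which `event` is observed most often."""
    hours = Counter(h for h, e in observations if e == event)
    return hours.most_common(1)[0][0] if hours else None

# The oven comes on most often at 18:00, suggesting a dinner routine.
print(typical_hour(observations, "oven_on"))
```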
Now the researchers are working on upgrades that allow the apartment to decipher the actions of multiple inhabitants and recognise subtle variations in commonly repeated tasks.
The study has been published in the journal IEEE Transactions on Systems, Man, and Cybernetics. (ANI)