
Pervasive Self-Learning with multi-modal distributed sensors

Abstract: Truly ubiquitous computing poses new and significant challenges. One of the key aspects that will condition the impact of these new technologies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This will make devices able to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. This framework allows different sensors to exchange the knowledge needed to create a model to classify events. In particular, as a case study, we developed a multi-modal multi-sensor classification system combining data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer learns a model of the user's behavior by exploiting the information coming from the camera and later uses it to classify the user's motion autonomously. Experiments demonstrate the accuracy of the proposed approach in different situations.
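The following is a minimal sketch of the teacher-learner pattern the abstract describes: during a shared observation phase, labels derived from the camera's motion analysis supervise a classifier trained on synchronized accelerometer features, which the accelerometer then uses on its own. All names, features, and the choice of classifier are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of cross-modal self-learning: camera-provided labels
# train an accelerometer-side classifier, which later runs autonomously.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# --- Training phase: camera and accelerometer observe the user together. ---
# Synthetic accelerometer windows summarized by two illustrative features:
# mean and variance of the acceleration magnitude (the paper's actual
# feature set may differ).
still   = np.column_stack([rng.normal(9.8, 0.05, 200), rng.normal(0.01, 0.005, 200)])
walking = np.column_stack([rng.normal(10.5, 0.4, 200), rng.normal(1.5, 0.3, 200)])
X_train = np.vstack([still, walking])

# Labels come from the camera's motion analysis, not manual annotation:
# this is what makes the accelerometer's training unsupervised for the user.
y_train = np.array([0] * 200 + [1] * 200)  # 0 = still, 1 = walking

model = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

# --- Autonomous phase: the camera is no longer available; the accelerometer
# classifies new windows using the model it learned. ---
new_window = np.array([[10.4, 1.4]])  # one fresh feature window
state = "walking" if model.predict(new_window)[0] == 1 else "still"
print("predicted motion state:", state)
```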


Citation:

Bicocchi, Nicola; Mamei, Marco; Prati, Andrea; Cucchiara, Rita; Zambonelli, Franco, "Pervasive Self-Learning with multi-modal distributed sensors," IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops (SASOW), Venice, Italy, pp. 61-66, October 20-24, 2008. DOI: 10.1109/SASOW.2008.51
