Context-aware gestural interaction in the smart environments of the ubiquitous computing era

Hdl Handle:
http://hdl.handle.net/10547/344619
Title:
Context-aware gestural interaction in the smart environments of the ubiquitous computing era
Authors:
Caon, Maurizio
Abstract:
Technology is becoming pervasive, and current interfaces are not adequate for interaction with the smart environments of the ubiquitous computing era. Researchers have recently begun to address this issue by introducing the concept of the natural user interface, which is mainly based on gestural interaction. Many issues remain open in this emerging domain; in particular, there is a lack of common guidelines for the coherent implementation of gestural interfaces. This research investigates gestural interaction between humans and smart environments and proposes a novel framework for the high-level organization of context information. The framework is conceived to support a novel approach based on functional gestures, which reduces gesture ambiguity and the number of gestures in taxonomies, thereby improving usability. To validate the framework, a proof-of-concept prototype was developed, implementing a novel method for the view-invariant recognition of deictic and dynamic gestures. Tests were conducted to assess the gesture recognition accuracy and the usability of interfaces developed following the proposed framework: the method recognizes gestures accurately from very different viewpoints, whilst the usability tests yielded high scores. Context information was investigated further by tackling the problem of user status, understood here as human activity, for which a technique based on an innovative application of electromyography is proposed; tests show that it achieves good activity recognition accuracy. Context is also treated as system status: in ubiquitous computing, a system can adopt a wearable, an environmental or a pervasive paradigm. A novel paradigm, called the synergistic paradigm, is presented, which combines the advantages of the wearable and environmental paradigms, augments the user's interaction possibilities and ensures better gesture recognition accuracy than the other paradigms.
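The abstract gives no implementation detail, but the core idea of functional gestures can be illustrated with a minimal sketch: a single abstract gesture denotes a function (e.g. "increase"), and the context — here reduced to the device the user is addressing — resolves it to a concrete action, so one gesture replaces several device-specific ones. All names below (Context, resolve, the function table, the example devices) are hypothetical illustrations under that assumption, not the thesis's actual framework or API.

```python
# Minimal sketch of the "functional gesture" idea: one gesture denotes a
# function, and context information resolves it to a concrete device action.
# All names here are hypothetical; the thesis does not publish an API.

from dataclasses import dataclass


@dataclass
class Context:
    """High-level context: the device the user's deictic gesture points at."""
    target_device: str


# One functional gesture ("increase") covers what would otherwise require a
# separate gesture per device, shrinking the gesture taxonomy and ambiguity.
FUNCTION_TABLE = {
    ("increase", "tv"):    "volume_up",
    ("increase", "lamp"):  "brightness_up",
    ("increase", "blind"): "raise_blind",
    ("decrease", "tv"):    "volume_down",
    ("decrease", "lamp"):  "brightness_down",
    ("decrease", "blind"): "lower_blind",
}


def resolve(gesture: str, ctx: Context) -> str:
    """Map a functional gesture to a device action using the context."""
    return FUNCTION_TABLE.get((gesture, ctx.target_device), "ignored")


if __name__ == "__main__":
    ctx = Context(target_device="lamp")        # e.g. set by a deictic gesture
    print(resolve("increase", ctx))            # -> brightness_up
    print(resolve("increase", Context("tv")))  # -> volume_up
```

The same lookup-by-(function, context) shape would apply however context is actually sensed — pointing, user activity, or system paradigm.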
Citation:
Caon, M. (2014) 'Context-aware gestural interaction in the smart environments of the ubiquitous computing era'. PhD thesis. University of Bedfordshire.
Publisher:
University of Bedfordshire
Issue Date:
Jun-2014
URI:
http://hdl.handle.net/10547/344619
Type:
Thesis or dissertation
Language:
en
Description:
A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy
Appears in Collections:
PhD e-theses

Full metadata record

DC Field | Value | Language
dc.contributor.author | Caon, Maurizio | en
dc.date.accessioned | 2015-02-20T13:38:34Z | en
dc.date.available | 2015-02-20T13:38:34Z | en
dc.date.issued | 2014-06 | en
dc.identifier.citation | Caon, M. (2014) 'Context-aware gestural interaction in the smart environments of the ubiquitous computing era'. PhD thesis. University of Bedfordshire. | en
dc.identifier.uri | http://hdl.handle.net/10547/344619 | en
dc.description | A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy | en
dc.description.abstract | Technology is becoming pervasive, and current interfaces are not adequate for interaction with the smart environments of the ubiquitous computing era. Researchers have recently begun to address this issue by introducing the concept of the natural user interface, which is mainly based on gestural interaction. Many issues remain open in this emerging domain; in particular, there is a lack of common guidelines for the coherent implementation of gestural interfaces. This research investigates gestural interaction between humans and smart environments and proposes a novel framework for the high-level organization of context information. The framework is conceived to support a novel approach based on functional gestures, which reduces gesture ambiguity and the number of gestures in taxonomies, thereby improving usability. To validate the framework, a proof-of-concept prototype was developed, implementing a novel method for the view-invariant recognition of deictic and dynamic gestures. Tests were conducted to assess the gesture recognition accuracy and the usability of interfaces developed following the proposed framework: the method recognizes gestures accurately from very different viewpoints, whilst the usability tests yielded high scores. Context information was investigated further by tackling the problem of user status, understood here as human activity, for which a technique based on an innovative application of electromyography is proposed; tests show that it achieves good activity recognition accuracy. Context is also treated as system status: in ubiquitous computing, a system can adopt a wearable, an environmental or a pervasive paradigm. A novel paradigm, called the synergistic paradigm, is presented, which combines the advantages of the wearable and environmental paradigms, augments the user's interaction possibilities and ensures better gesture recognition accuracy than the other paradigms. | en
dc.language.iso | en | en
dc.publisher | University of Bedfordshire | en
dc.subject | G490 Computing Science not elsewhere classified | en
dc.subject | ubiquitous computing | en
dc.subject | gestural interaction | en
dc.subject | smart environments | en
dc.subject | human computer interaction | en
dc.subject | natural user interface | en
dc.subject | gestures | en
dc.title | Context-aware gestural interaction in the smart environments of the ubiquitous computing era | en
dc.type | Thesis or dissertation | en
dc.type.qualificationname | PhD | en_GB
dc.type.qualificationlevel | PhD | en
dc.publisher.institution | University of Bedfordshire | en
This item is licensed under a Creative Commons License