Hdl Handle:
http://hdl.handle.net/10547/340317
Title:
Framework of active robot learning
Authors:
Liu, Beisheng
Abstract:
In recent years, cognitive robots have become an attractive research area of Artificial Intelligence (AI). High-order beliefs of cognitive robots concern the robots' thoughts about their users' intentions and preferences. Existing approaches to developing such beliefs through machine learning rely on particular social cues or specifically defined reward functions, so their applications can be limited. This study carried out primary research on active robot learning (ARL), which enables a robot to develop high-order beliefs by actively collecting and discovering the evidence it needs. The emphasis is on active learning rather than teaching; hence, social cues and reward functions are not necessary. In this study, the framework of ARL was developed. Fuzzy logic was employed in the framework for controlling the robot and for identifying high-order beliefs. A simulation environment was set up in which a human and a cognitive robot were modelled using MATLAB, and ARL was implemented through simulation. Simulations were performed in which the human and the robot tried to jointly lift a stick and keep it level. The simulation results show that, under the framework, a robot is able to discover the evidence it needs to confirm its user's intention.
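The sketch below is a minimal illustration of the kind of setup the abstract describes: a fuzzy controller that keeps the jointly held stick level, with the robot treating a sustained level, stationary stick as evidence of the user's intended height. It is not the thesis code (the thesis used MATLAB); the membership ranges, rule outputs, and scenario parameters here are all illustrative assumptions.

# Minimal illustrative sketch (not the thesis implementation): a hand-rolled
# fuzzy controller for the joint stick-lifting scenario in the abstract.
# Written in Python rather than MATLAB; every name and parameter is assumed.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_correction(tilt):
    """Map stick tilt (positive = robot's end too low) to a per-step height
    correction using three fuzzy rules and weighted-average defuzzification."""
    neg = tri(tilt, -0.6, -0.3, 0.0)    # tilt negative -> lower the robot's end
    zero = tri(tilt, -0.3, 0.0, 0.3)    # tilt near zero -> hold
    pos = tri(tilt, 0.0, 0.3, 0.6)      # tilt positive -> raise the robot's end
    weights, outputs = [neg, zero, pos], [-0.05, 0.0, 0.05]
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0

# Toy episode: the human lifts toward a hidden target height; the robot keeps
# the stick level and reads a settled, level stick as evidence of the user's
# intended height.
human_height, robot_height = 0.0, 0.0
intended_height = 0.8                    # hidden human intention (assumed)
for step in range(200):
    human_height = min(human_height + 0.02, intended_height)
    tilt = human_height - robot_height   # crude proxy for the stick's tilt
    robot_height += fuzzy_correction(tilt)
    if abs(tilt) < 1e-3 and human_height >= intended_height:
        print(f"step {step}: stick level near {robot_height:.2f} m; "
              f"evidence that this is the intended height")
        break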
Citation:
Liu, B. (2008) 'Framework of active robot learning'. MSc by research thesis. University of Bedfordshire.
Publisher:
University of Bedfordshire
Issue Date:
Oct-2008
URI:
http://hdl.handle.net/10547/340317
Type:
Thesis or dissertation
Language:
en
Description:
A thesis submitted to the University of Bedfordshire, in fulfilment of the requirements for the degree of Master of Science by research
Appears in Collections:
Masters e-theses

Full metadata record

DC Field | Value | Language
dc.contributor.author | Liu, Beisheng | en
dc.date.accessioned | 2015-01-30T10:53:59Z | -
dc.date.available | 2015-01-30T10:53:59Z | -
dc.date.issued | 2008-10 | -
dc.identifier.citation | Liu, B. (2008) 'Framework of active robot learning'. MSc by research thesis. University of Bedfordshire. | en
dc.identifier.uri | http://hdl.handle.net/10547/340317 | -
dc.description | A thesis submitted to the University of Bedfordshire, in fulfilment of the requirements for the degree of Master of Science by research | en
dc.description.abstract | In recent years, cognitive robots have become an attractive research area of Artificial Intelligence (AI). High-order beliefs of cognitive robots concern the robots' thoughts about their users' intentions and preferences. Existing approaches to developing such beliefs through machine learning rely on particular social cues or specifically defined reward functions, so their applications can be limited. This study carried out primary research on active robot learning (ARL), which enables a robot to develop high-order beliefs by actively collecting and discovering the evidence it needs. The emphasis is on active learning rather than teaching; hence, social cues and reward functions are not necessary. In this study, the framework of ARL was developed. Fuzzy logic was employed in the framework for controlling the robot and for identifying high-order beliefs. A simulation environment was set up in which a human and a cognitive robot were modelled using MATLAB, and ARL was implemented through simulation. Simulations were performed in which the human and the robot tried to jointly lift a stick and keep it level. The simulation results show that, under the framework, a robot is able to discover the evidence it needs to confirm its user's intention. | en
dc.language.iso | en | en
dc.publisher | University of Bedfordshire | en
dc.subject | H671 Robotics | en
dc.subject | cognitive robotics | en
dc.subject | high-order beliefs | en
dc.subject | robot active learning | en
dc.subject | fuzzy logic | en
dc.subject | MATLAB | en
dc.subject | robotics | en
dc.title | Framework of active robot learning | en
dc.type | Thesis or dissertation | en
This item is licensed under a Creative Commons License
All Items in UOBREP are protected by copyright, with all rights reserved, unless otherwise indicated.