
ICRA’07: 2007 IEEE International Conference on Robotics and Automation
Full Day Workshop on Roboethics
Rome, April 14th, 2007
Human-robot interaction in autism

S. Casalini, G. Dalle Mura, M. L. Sica, A. Fornai, M. Ferro, G. Pioggia, R. Igliozzi, A. Ahluwalia, F. Muratori, D. De Rossi

Marcello Ferro, Ph.D.
Interdepartmental Research Center “E. Piaggio”, Faculty of Engineering, University of Pisa, Italy

Andrea Fornai, Ph.D.
Department of Cognitive Science, Faculty of Literature and Philosophy, University of Siena, Italy

Full Day Workshop on Roboethics – April, 2007 – Rome, Italy – Marcello Ferro [email protected] – Andrea Fornai [email protected]

Facial Automaton for Conveying Emotions (FACE)

Human-machine interface for non-verbal communication within an umwelt

Believability: the embodied mind

Materials
• Artificial vision and hearing system
• Proprioception system
• Artificial muscles
• Motor control

Control
• Sensory and actuating data fusion
• Imitation strategy
• Neurocontroller


Facial Automaton for Conveying Emotions (FACE)
The learning process in FACE will be based on imitating predefined stereotypical behaviours, which can be represented in terms of FAPs (Fixed Action Patterns), followed by continuous interaction with its umwelt: the epigenetic evolution of the machine.

FAPs
action schemes that are partly fixed, on the basis of physical constraints and sensory-motor reflexes, and partly subject to specialization on the basis of experience

FACE
will continually learn, adapt and evolve within a simplified behavioural space as a function of its umwelt, and will maintain spontaneous activity open to any innovative and intelligible behaviours that may arise and then be interpreted
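As a hedged illustration of the idea above, the sketch below represents a FAP as a timed sequence of actuator targets whose fixed part is gradually specialised by imitated demonstrations; the actuator names, blending rule and plasticity parameter are assumptions for illustration, not the actual FACE implementation.

```python
# Hypothetical sketch: a FAP as a timed sequence of actuator targets, partly
# fixed and partly specialised by blending in observed (imitated) demonstrations.
from dataclasses import dataclass

@dataclass
class FAP:
    name: str
    steps: list                # one dict of actuator targets per time step (fixed part)
    plasticity: float = 0.1    # how strongly experience reshapes the action scheme

    def imitate(self, demonstration: list) -> None:
        """Blend an observed demonstration into the stored action scheme."""
        for step, observed in zip(self.steps, demonstration):
            for actuator, target in observed.items():
                current = step.get(actuator, target)
                # move the stored target a little toward the observed one
                step[actuator] = (1 - self.plasticity) * current + self.plasticity * target

# usage: a stereotypical "smile" FAP specialised by one observed demonstration
smile = FAP("smile", steps=[{"mouth_corner": 0.5}, {"mouth_corner": 0.8}])
smile.imitate([{"mouth_corner": 0.6}, {"mouth_corner": 1.0}])
print(smile.steps)             # targets nudged toward the demonstration
```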


FACE Architecture
Man-machine interface for non-verbal communication

• Paradigms to develop a believable emotional display
• Anthropomorphic mechanics
• Science of materials

Neuroscientific overview on neural message transmission
Correlation between neural activity and emotional behaviours

Emerging emotional behaviours
By behaviour we mean an emerging form of interaction with the environment in which FACE is engaged. The problem we are currently addressing is that of realizing a neural structure capable of creating its own representation of the surrounding environment, so that innovative behaviours can emerge. Emerging behaviours could derive from an associative memory through which it may be possible to navigate within a behavioural space. These characteristics are typical of some areas of the central nervous system, such as the hippocampus, upon which the architecture of the FACE neurocontroller will be based.
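As an illustration only, a Hopfield-style associative memory is one classical way to obtain the content-addressable recall described above, letting a stored behaviour pattern be retrieved from a partial or noisy cue; the pattern coding and sizes below are placeholders, not the hippocampus-inspired neurocontroller itself.

```python
import numpy as np

# Illustrative Hopfield-style associative memory: stored behaviour patterns can
# be recalled from partial or noisy cues, supporting navigation in a behavioural
# space (a placeholder for the hippocampus-inspired neurocontroller).
class AssociativeMemory:
    def __init__(self, size: int):
        self.weights = np.zeros((size, size))

    def store(self, patterns: np.ndarray) -> None:
        """Hebbian storage of +/-1 coded behaviour patterns (one per row)."""
        for p in patterns:
            self.weights += np.outer(p, p)
        np.fill_diagonal(self.weights, 0)

    def recall(self, cue: np.ndarray, steps: int = 10) -> np.ndarray:
        """Iteratively complete a cue toward the nearest stored pattern."""
        state = cue.copy()
        for _ in range(steps):
            state = np.sign(self.weights @ state)
            state[state == 0] = 1
        return state

# usage: store two behaviour codes and recall one from a corrupted cue
patterns = np.array([[1, -1, 1, -1, 1, 1], [-1, 1, 1, 1, -1, -1]])
memory = AssociativeMemory(size=6)
memory.store(patterns)
noisy = np.array([1, -1, -1, -1, 1, 1])   # first pattern with one flipped element
print(memory.recall(noisy))               # recovers the first stored pattern
```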
• Pioggia et al., “FACE: Facial Automaton for Conveying Emotions”, Applied Bionics and Biomechanics, 1(2), 2004
• Casalini et al., “FACE e la sua mente” [“FACE and its mind”], in La Bioingegneria del Sistema Cervello-Mente, ch. 5, Biondi (Ed.), Collana di Bioingegneria, Patron, 2006

Framework Architecture
The framework’s base architecture is available to the researcher as a structured programming environment
Filters can be redefined according to the technology of the particular device
Filtering and buffering of the data coming from sensors and of the data directed to the actuating devices are delegated to appropriate interfaces (drivers) for efficiency
The framework is responsible for dispatching transducer data to the control system, using an indexing operation performed during the initialization step
Specific processes are defined inside the control system, and their execution and synchronization are managed by the framework
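A minimal sketch of the driver/dispatch scheme described above, assuming hypothetical class and method names (SensorDriver, Framework.register, Framework.step) that do not correspond to the framework’s actual API: drivers encapsulate filtering and buffering, and the framework indexes them at initialization and hands their data to control processes in a synchronised step.

```python
# Hypothetical sketch of the sensor/actuator framework described above.
from collections import deque

class SensorDriver:
    """Device-specific filtering and buffering of incoming samples."""
    def __init__(self, name: str, filter_fn=lambda x: x, buffer_size: int = 64):
        self.name = name
        self.filter_fn = filter_fn            # redefinable per device technology
        self.buffer = deque(maxlen=buffer_size)

    def push(self, raw_sample: float) -> None:
        self.buffer.append(self.filter_fn(raw_sample))

class Framework:
    """Indexes transducers at start-up and dispatches data to control processes."""
    def __init__(self):
        self.drivers = {}                     # index assigned at initialization
        self.processes = []                   # control processes to synchronise

    def register(self, driver: SensorDriver) -> int:
        index = len(self.drivers)
        self.drivers[index] = driver
        return index

    def add_process(self, process) -> None:
        self.processes.append(process)

    def step(self) -> None:
        # one synchronised control cycle: hand each process the indexed data
        snapshot = {i: list(d.buffer) for i, d in self.drivers.items()}
        for process in self.processes:
            process(snapshot)

# usage: a camera-like driver and a trivial control process
fw = Framework()
cam = SensorDriver("camera", filter_fn=lambda x: max(0.0, x))
fw.register(cam)
fw.add_process(lambda data: print({i: len(buf) for i, buf in data.items()}))
cam.push(0.7)
fw.step()
```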
Ferro et al., “An Architecture for High Efficiency Real-time Sensor and Actuator Data Processing”, EUROSENSORS XIX, Barcelona, Spain, September 11th-14th, 2005

Acquisition of physiological and behavioural information

• an unobtrusive sensorized wearable interface to acquire physiological signals from the interlocutor
• optical analyses of facial expressions and tracking to analyse the emotional reactions of individuals

[Figure: sensorized wearable interface with shoulder and elbow articulation sensors, ECG leads (RA, LA, LF, precordial), and thoracic and abdominal respiration sensors; panels a–e]
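As a hedged example of what the acquired channels could feed, the sketch below estimates heart rate from an ECG-like sample stream by simple threshold-based R-peak counting; the threshold, sampling rate and synthetic signal are assumptions, not the project’s actual processing chain.

```python
import numpy as np

# Illustrative only: estimate heart rate from an ECG-like channel of the
# sensorized wearable by counting upward threshold crossings (R-peaks).
def heart_rate_bpm(ecg: np.ndarray, fs: float, threshold: float = 0.6) -> float:
    """Very simple R-peak detector: count upward crossings of a relative threshold."""
    above = ecg > threshold * ecg.max()
    crossings = np.flatnonzero(~above[:-1] & above[1:])
    if len(crossings) < 2:
        return 0.0
    rr_intervals = np.diff(crossings) / fs    # seconds between beats
    return 60.0 / rr_intervals.mean()

# usage: synthetic 1 Hz "beats" sampled at 250 Hz should give about 60 bpm
fs = 250.0
t = np.arange(0, 10, 1 / fs)
ecg = np.where((t % 1.0) < 0.02, 1.0, 0.05) + 0.01 * np.random.randn(t.size)
print(round(heart_rate_bpm(ecg, fs)))
```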

Future development
• Gaze monitoring
• Facial expression recognition


Facial Expression Recognition

Hierarchical Neural Network (HNN): 4 KSOMs + 1 MLP

Off-line training and test:

• Facial Expression Databases: JAFFE (5 female subjects, 7 facial expressions, 154 total images); Center “E. Piaggio” (2 male subjects, 7 facial expressions, 308 total images)

Facial expressions of: a) neutrality; b) happiness; c) surprise; d) anger; e) disgust; f) sadness; g) fear

• Splitter tool

• DataEngine ANN: 18 HNN configurations

• Panellist tool: 12 subjects at Center “E. Piaggio”
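A minimal sketch of the 4-KSOM + 1-MLP hierarchy under stated assumptions: each self-organising map (here simplified to winner-only updates, without a neighbourhood function) quantises one facial zone into best-matching-unit coordinates, and an MLP classifies the concatenated codes. Map sizes, zone features and labels are random placeholders, not the JAFFE or Center “E. Piaggio” data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative hierarchy: one tiny Kohonen map per facial zone feeds an MLP.
class KSOM:
    def __init__(self, grid=(4, 4), dim=8, seed=0):
        rng = np.random.default_rng(seed)
        self.grid = grid
        self.weights = rng.normal(size=(grid[0] * grid[1], dim))

    def bmu(self, x):
        """Best-matching unit as (row, col) coordinates on the map."""
        idx = np.argmin(np.linalg.norm(self.weights - x, axis=1))
        return np.array(divmod(idx, self.grid[1]))

    def train(self, data, epochs=20, lr=0.3):
        # simplified competitive learning: move only the winner toward the sample
        for _ in range(epochs):
            for x in data:
                winner = np.argmin(np.linalg.norm(self.weights - x, axis=1))
                self.weights[winner] += lr * (x - self.weights[winner])

# four zones (e.g. brows, eyes, nose, mouth), 8-d features each, 7 expressions
rng = np.random.default_rng(1)
n, zones, dim = 70, 4, 8
features = rng.normal(size=(n, zones, dim))      # placeholder zone features
labels = rng.integers(0, 7, size=n)              # placeholder expression labels

soms = [KSOM(dim=dim, seed=z) for z in range(zones)]
for z, som in enumerate(soms):
    som.train(features[:, z, :])

codes = np.array([np.concatenate([som.bmu(x[z]) for z, som in enumerate(soms)])
                  for x in features])
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000).fit(codes, labels)
print(mlp.predict(codes[:5]))
```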


Facial Expression Recognition (2)
• Off-line training: MLP and KSOM learning
• Real-time test: CameraSensorDriver, Face Tracking, Facial Zone Detection, MLP and KSOM running processes
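A hedged structural sketch of the real-time chain named above: every stage below is a stub with made-up shapes, intended only to show how frames would flow from the camera driver through tracking and zone detection to the trained networks.

```python
# Structural sketch only: each stage is a placeholder for the corresponding
# framework process; frames flow camera -> tracking -> zones -> classification.
from typing import List
import numpy as np

def camera_sensor_driver() -> np.ndarray:
    """Stub: would return the latest filtered camera frame."""
    return np.zeros((120, 160))

def face_tracking(frame: np.ndarray) -> np.ndarray:
    """Stub: would crop the frame to the tracked face region."""
    return frame[20:100, 40:120]

def facial_zone_detection(face: np.ndarray) -> List[np.ndarray]:
    """Stub: would split the face into the zones fed to the KSOMs."""
    h = face.shape[0] // 4
    return [face[i * h:(i + 1) * h] for i in range(4)]

def classify(zones: List[np.ndarray]) -> str:
    """Stub: would run the trained KSOMs and the MLP on the zone features."""
    return "neutrality"

def run_once() -> str:
    """One pass of the chain: each stage consumes the previous stage's output."""
    frame = camera_sensor_driver()
    face = face_tracking(frame)
    zones = facial_zone_detection(face)
    return classify(zones)

print(run_once())   # "neutrality"
```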

Facial Proprioception System

Sensors and connecting wires are directly printed on the fabric using carbon-filled silicone rubber. This mixture shows piezoresistive properties and can be used as a sensor; moreover, no external wiring is necessary to interconnect the sensors on the fabric. Elastosil LR 3162 A/B is produced by WACKER Ltd, which guarantees the non-toxicity of the material.

A module for the facial profile reconstruction is currently under development as an extension of the framework architecture
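As a hedged example of how the printed sensors could feed the facial-profile module, the sketch below converts resistance readings into strain estimates using the generic piezoresistive relation ΔR/R0 = GF · ε; the gauge factor, baseline resistances and channel layout are assumptions, not measured properties of the Elastosil sensors.

```python
import numpy as np

# Illustrative only: turn fabric-sensor resistance readings into strain
# estimates with the standard piezoresistive relation dR/R0 = GF * strain.
GAUGE_FACTOR = 2.0                        # assumed sensitivity of the material
R0 = np.array([1200.0, 1150.0, 1300.0])   # assumed unstretched resistances (ohm)

def strain_from_resistance(r: np.ndarray) -> np.ndarray:
    """Estimate strain per channel from the relative resistance change."""
    return (r - R0) / (R0 * GAUGE_FACTOR)

# usage: three channels of a stretched facial region
readings = np.array([1260.0, 1150.0, 1365.0])
print(strain_from_resistance(readings))   # [0.025, 0.0, 0.025]
```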

• Mazzoldi et al., “EAP activity in Italy”, World Wide ElectroActive Polymers EAP (Artificial Muscles) Newsletter, Yoseph Bar-Cohen (Ed.), Vol. 6, No. 2, 2004
• Pioggia et al., “A biomimetic sensing skin: characterization of piezoresistive fabric-based elastomeric sensors”, in Sensors and Microsystems, 10th Italian Conference, World Scientific Publishing Co., Singapore, 2006

FACE interacts through kinesics, a form of non-verbal communication conveyed by body movements and facial expressions