New fully funded PhD positions in Cognitive Robotics & Human-Robot Interaction at the Italian Institute of Technology
Photo by Laura Taverna (IIT)

Within the doctoral School on Bioengineering and Robotics, the PhD program for the curriculum “Cognitive Robotics, Interaction and Rehabilitation Technologies” provides interdisciplinary training at the interface between technology and the life sciences. The general objective of the program is to train scientists and research technologists capable of working in multidisciplinary teams on projects where human factors play a crucial role in technological development and design.

The 5 fellowships offered this year by the Istituto Italiano di Tecnologia as part of this curriculum will be assigned to the best applicants across the six themes offered within the curriculum, which can be found at the following address: https://www.iit.it/phd-school/phd-school-genoa under the "Cognitive Robotics Interaction Rehabilitation technologies" research themes.

A direct link to the list of themes is here: List of Themes (PDF)

The application deadline is 12:00 (Italian time) on 15 June 2020.

Interested applicants can find more details on the application process here: unofficial tips & tricks page for the application process

All IIT units involved in this curriculum are located at the “Erzelli” site of the Italian Institute of Technology and share research facilities including, besides two iCub humanoid robots, a fully equipped motion capture room with simultaneous electromyography recording and force platforms, a Transcranial Magnetic Stimulation lab, an electrophysiology lab for EEG recording, and meeting rooms.

The ideal candidates are students with a graduate university degree who are willing to invest extra time and effort in blending into a multidisciplinary team of neuroscientists, engineers, psychologists, and physicists working together to investigate brain functions and realize intelligent machines, rehabilitation protocols, and advanced prostheses.

Interested applicants are encouraged to contact the prospective tutors for clarifications before submitting their application. International applications are encouraged and will receive logistical support with visa issues and relocation.

Within this curriculum I am supervising three themes:

  1. Autonomous learning in human-robot interaction: towards transparent robot symbiosis in skill acquisition
  2. The role of social signals in human-robot interaction in groups
  3. Cortical networks for affective communication in human-robot interaction

Below are some more details on each. Drop me a line at alessandra.sciutti@iit.it if you need more information!

Autonomous learning in human-robot interaction: towards transparent robot symbiosis in skill acquisition

Tutors: Dr. Alessandra Sciutti, Dr. Francesco Rea, Prof. Giulio Sandini; Research Units: Robotics, Brain and Cognitive Sciences and CONTACT

Description: A central element of effective human-robot collaboration is transparency, i.e., the communication of the internal processes guiding the robot’s behaviors and decisions. In particular, when an artificial agent has to learn a dexterous activity from a human partner, cooperation can be facilitated if the robot transparently gives the human supervisor insight into its own learning process. To investigate transparency during skill acquisition, the research will first focus on implementing a supervised learning task for the humanoid robot iCub and will then address how to enable the robot to transparently express details about its learning. Finally, the research will quantify the mutual influence of this transparent communication on the two agents. The first phase will build on studies of autonomous learning (e.g., reinforcement learning, learning by demonstration) that leverage social signals. The second part will build on studies of dyadic interaction [1-2] and will explore how the humanoid robot can transparently communicate the properties of its learning process to facilitate human supervision. In this context, the research will leverage the existing cognitive framework of the iCub robot and enrich it to promote human-robot interaction. Particular focus will be given to improving human sensing aimed at extracting measures of engagement (attentional level, cognitive load, and fatigue) during learning.

Requirements: a master’s degree in Bioengineering, Computer Science, or an equivalent field, with experience in the analysis and modeling of human movements and in robot programming. An aptitude for experimental work, problem solving, and computational modeling will constitute a factor of preference.

References:

  1. Rea, F., Vignolo, A., Sciutti, A., & Noceti, N. (2019). Human motion understanding for selecting action timing in collaborative human-robot interaction. Front. Robot. AI, 6, 58.
  2. Rea, F., Sandini, G., & Metta, G. (2014). Motor biases in visual attention for a humanoid robot. In 2014 IEEE-RAS International Conference on Humanoid Robots (pp. 779-786). IEEE.
  3. Sandini, G., Sciutti, A., & Rea, F. (2017). Movement-Based Communication for Humanoid-Human Interaction. In: Goswami, A., Vadakkepat, P. (eds) Humanoid Robotics: A Reference. Springer, Dordrecht.

The role of social signals in human-robot interaction in groups

Tutors: Dr. Francesco Rea, Dr. Alessandra Sciutti, Prof. Giulio Sandini ; Research Units: Robotics, Brain and Cognitive Sciences and CONTACT

Description: Social interaction between humans and robots can happen in groups, either when numerous individuals interact with one robot or when more than one robot interacts with humans. This research intends to investigate whether social signals proven effective in dyadic interaction can be exploited in group settings. Moreover, it will assess whether the sense of social participation can be enhanced by the social dynamics of the group (be it homogeneous or heterogeneous).

The PhD candidate will focus on:

• the development of methods and associated software to generate robot behaviors supporting natural mutual understanding in groups of multiple humans and multiple robots;

• the design and implementation of novel models of robot perception tailored to group settings. In particular, the dimension of social touch will be investigated;

• the design of experiments to investigate the dynamics of group HRI and to validate the developed methods.

The goal of the research is to model how the humanoid robot iCub could adapt its perception and social behaviour to the individual needs of the partner when the interaction occurs in a group and involves another iCub or a different robot. In particular, the investigation will address how these models can be transferred to novel robotic designs with different levels of anthropomorphism.

Requirements: a degree in robotics, bioengineering, computer science, computer engineering, or a related discipline; an aptitude for problem solving; C++ programming. A background in machine learning is an asset.

References:

  1. Tanevska, A., Rea, F., Sandini, G., Canamero, L., & Sciutti, A. (2019). A Cognitive Architecture for Socially Adaptable Robots. Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics, Oslo, Norway, August 19-22, 2019.
  2. Yoshikawa, Y., Iio, T., Arimoto, T., Sugiyama, H., & Ishiguro, H. (2017). Proactive Conversation between Multiple Robots to Improve the Sense of Human-Robot Conversation. AAAI Fall Symposia.
  3. Barros, P., Sciutti, A., Hootsmans, I. M., Opheij, L. M., Toebosch, R. H., & Barakova, E. (2020). It's Food Fight! Introducing the Chef's Hat Card Game for Affective-Aware HRI. arXiv preprint arXiv:2002.11458.

Cortical networks for affective communication in human-robot interaction

Tutors: Dr. Giuseppe Di Cesare, Dr. Alessandra Sciutti; Research Unit: CONTACT

Description: Social interactions require the ability to evaluate the attitudes of others by observing their actions. Humans perform actions with different forms/styles expressing their positive or negative internal state. For example, when observing a person greeting us, we can understand whether that person is happy and whether he/she feels good. In the future, new generations of robots could be endowed with the capacity to generate these forms of communication, in order to become more comfortable to interact with. To further investigate this form of robotic communication, the project will generate a series of actions of the iCub humanoid robot conveying different attitudes and evaluate their effect on the human brain and on behavior during interaction. In particular, using functional magnetic resonance imaging (fMRI), several robotic actions will be presented to healthy participants in order to assess the neural activity involved in the observation of these actions. In addition, the adoption of other techniques, such as tractography, will be considered in order to generate virtual connectivity maps of brain structures. The research project will be carried out in collaboration with the University of Parma, which is equipped with a 3 Tesla MR scanner. The work will take advantage of, and potentially improve, an existing software module available on the iCub robot that supports the generation of actions with different properties.

The successful candidate will: 1) participate in the generation of iCub robot’s actions characterized by different kinematic features and styles; 2) develop and test cognitive paradigms coupled with cortical and subcortical fMRI recordings; 3) compute brain activity maps from fMRI data.

Requirements: a degree in Bioengineering, Computer Science, Computer Engineering, Robotics, or a related discipline; an aptitude for problem solving; C++ programming. We expect the candidate to develop skills in signal processing and computational modelling. Excellent analytical skills (MATLAB) will also be required.

References:

  1. Di Cesare, G., Gerbella, M., & Rizzolatti, G. (2020). The neural bases of vitality forms. National Science Review, 7(1), 202-213.
  2. Di Cesare, G., Di Dio, C., Marchi, M., & Rizzolatti, G. (2015). Expressing and understanding our internal states and those of others. Proceedings of the National Academy of Sciences of the USA, 112(33), 10331-10335.
  3. Vannucci, F., Di Cesare, G., Rea, F., Sandini, G., & Sciutti, A. (2019). A Robot with Style: Can Robotic Attitudes Influence Human Actions? IEEE-RAS International Conference on Humanoid Robots, doi: 10.1109/HUMANOIDS.2018.8625004.

 

