BL2 Hosts an Exciting Research Meeting!

October 22, 2015

https://www.youtube.com/watch?v=Bu3wtlxJg68

(This video has no audio to be captioned or interpreted)

Members of Dr. Laura-Ann Petitto’s Brain and Language Lab for Neuroimaging, BL2, with scientists and students from three other universities (see below for a full list of names), participated in two thrilling days of science on October 21 and 22, 2015, at Gallaudet University.

During the Keck Foundation and National Science Foundation (NSF) INSPIRE meeting, the second of three planned meetings, participants discussed and worked together on the technical and research aspects of integrating a human avatar, a Maki humanoid robot, and thermal imaging into a prototype intelligent Robot-Virtual Human (Avatar) learning tool.


The team conducted pilot studies with two infants to identify the optimal placement and configuration of the Robot, the Avatar, and the Thermal Camera. During the first day of the meeting, the Keck-NSF INSPIRE Petitto team completed a pilot study with an infant, testing hypothesized answers to questions such as the optimal distance between the infant, the Robot, and the Virtual Human screen, as well as screen size, viewing angle, and other factors. The team also investigated whether the infant’s eye gaze followed the direction in which the Robot’s head turned and whether the infant engaged in shared visual regard and joint attention with the Robot. On the second day, the team adjusted the configuration and conducted a second pilot session with a different infant.


The Petitto (PI) Keck and NSF-INSPIRE team is conducting fNIRS neuroimaging research in the BL2 laboratory at Gallaudet. This research will be used to build the Robot Avatar thermal-Enhanced (RAVE) learning tool, which will, for the first time, provide socially contingent, interactive language to young babies even before they are able to produce language. The project especially targets babies at ages within the critical period of nascent phonological development.


The fNIRS neuroimaging research will permit the identification of babies’ peaked sensitivity to specific rhythmic temporal patterns understood to be at the heart of phonological segmentation, whether in speech or sign. The Petitto team is testing hypotheses about the rhythmic nuclei that make possible infants’ capacity to discover (find salient, attend to) visual SIGN PHONOLOGICAL units in the input if exposed to sign, or auditory SPEECH PHONOLOGICAL units if exposed to speech.


Avatar Production and Robot Perception: The identified rhythmic temporal algorithms will be used in Virtual Human creation (Dr. David Traum/USC; one Avatar, created from native signer Melissa Malzkuhn, Motion Light Lab Director, is shown below). Thermal Infrared Imaging (Dr. Arcangelo Merla, University of Chieti, Italy) will be used to tell us when the baby is in a peaked state of attentional-emotional arousal during fNIRS phonological processing experiments, and thus provide a new tool for identifying when pre-verbal babies are “ready to learn.” This, in turn, will trigger a Robot (Dr. Brian Scassellati/Yale University) to “perceive” a baby’s engagement as looking + social engagement (as opposed to looking + crying) and to start or stop a Virtual Human that will produce interactive language and nursery rhymes for young deaf and hearing babies in sign language, with a speech option.


Impact of science for society: One intended use of the RAVE learning tool will be to provide interactive language samples to babies who may have minimal language input in early life, which has been shown to have devastating life-long deleterious impacts on language, cognition, and, crucially, reading success.


Full list of names of Keck & NSF INSPIRE science participants:

The Brain and Language Laboratory for Neuroimaging, BL2 at Gallaudet University

Prof. Laura-Ann Petitto (PI, Keck & NSF INSPIRE)

Melissa Malzkuhn (Director: Motion Light Laboratory, ML2/Keck Co-PI)

Dr. Barbara Manini (Petitto Keck-Thermal IR Imaging Post-Doc)

Dr. Clifton Langdon (PEN faculty)

Adam Stone (PEN graduate student)

Geo Kartheiser (PEN graduate student)

TraciAnn Hoglind (NSF SLC-VL2 Undergraduate Scholar)

Infrared Imaging Lab of ITAB, Institute of Advanced Biomedical Technologies, “G. d’Annunzio” University, Chieti, Italy (participating via Fuze)

Prof. Arcangelo Merla (Thermal Project Head)

Dr. Daniela Cardone (Thermal Project Research Assistant)

Paola Pinti (Thermal Project Graduate Student)

USC Institute for Creative Technologies

Prof. David Traum (Virtual Human Project Head/NSF INSPIRE Co-PI)

Ari Shapiro (VH Project Research Scientist)

Yale University, Social Robotics Lab

Prof. Brian Scassellati (Robotics Project Head)

Dr. Katherine Tsui (Scassellati Keck-Robotics Post-Doc)


https://www.youtube.com/watch?v=gy2dyw8ifQQ

(This video has no audio to be captioned or interpreted)

Melissa Malzkuhn (ML2) visits Drs. David Traum and Ari Shapiro’s lab at USC, where she was scanned to create her virtual human avatar for the ASL Nursery Rhyme language samples in RAVE.

https://www.youtube.com/watch?v=SV1e0GPQTn4

(This video has no audio to be captioned or interpreted)




© petitto.net