IMFAR acceptances

February 1st, 2009  |  Published in News


Congratulations to Gabriela Marcu, David Nguyen, and Gillian Hayes for the acceptance of “Use of a Wearable Recording Device in Therapeutic Interventions for Children with Autism,” and congratulations to Michael Yeganyan, Sen Hirano, David Nguyen, and Gillian Hayes for the acceptance of “Interactive and Collaborative Classroom Visual Schedules.”

Interactive and Collaborative Classroom Visual Schedules

Michael Yeganyan, Sen Hirano, David H. Nguyen and Gillian R. Hayes
Department of Informatics, University of California, Irvine, Irvine, CA 92697

Background
Visual schedules have been in use as a best practice in schools for many years. These artifacts use words and images to represent activities that will take place (or have taken place) arranged in temporal order to augment understanding of time, events, and places. Research has shown that the use of visual schedules can improve communication with children with autism spectrum disorder (ASD). Often, however, these visual schedules are not interactive and interesting enough to draw and maintain attention; they can be out of date or inaccurate; and they do not support communication and collaboration surrounding activities. Handling these challenges necessitates an immense amount of manual effort by teachers and aides. To offset this burden, we have created innovative computing technologies that we are using to simplify schedule creation and generate useful data that teachers and other caregivers can analyze.

Objectives
Design and implement a visual scheduling system for classrooms to support children with autism, expand or refine current teaching techniques to make use of this new system, and demonstrate these schedules in use during sessions led by teachers.

Methods
We used paper and simple digital prototypes during sessions in autism classrooms and interviews with teachers and autism specialists. During the sessions, we iteratively developed a system that not only mimics the analog visual schedules currently in use but also provides new features. For example, the system provides students with personal devices that interact collectively with a large shared screen at the front of the room. We are collecting information about system usage and practices of teachers and students during demonstration visits and initial use in classrooms. In particular, we are interested in understanding the potential for social learning as students, teachers, and aides are able to see responses from students who may be struggling on particular activities as well as those who have mastered them.

Results
By spending time in the field with teachers, specialists, and children with autism, we have been able to understand and compile information about the design of interactive visual schedules. These electronic visual schedules can assist teachers in managing their classrooms, not only in setting up exercises but also in running them and keeping records that would otherwise be infeasible. Finally, using both shared large displays for the whole class and smaller networked systems for individual children, we enable new interactions in classrooms, including social and peer learning as well as more efficient and rapid feedback for students and staff about individual progress and abilities.

Conclusions
We have designed and developed an interactive visual scheduling system based on extensive in-school interviews and observations. This system replaces and enhances the features of analog visual schedules in digital form. Through this new system, we also enable new ways of keeping records by automatically logging all interactions with the system and new forms of teaching and learning by dynamically sending and receiving visual information to the students’ networked personal devices.
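The automatic record-keeping described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation; the class, field names, and event types are all assumptions.

```python
import time

class InteractionLog:
    """Append-only log of schedule interactions.

    Hypothetical sketch of automatic logging: every touch of the
    schedule is timestamped and stored, so teachers get records
    without any manual bookkeeping.
    """

    def __init__(self):
        self.events = []

    def record(self, student_id, activity, action):
        # Each interaction (e.g. marking an activity complete on a
        # personal device) is captured automatically with a timestamp.
        self.events.append({
            "time": time.time(),
            "student": student_id,
            "activity": activity,
            "action": action,
        })

    def for_student(self, student_id):
        # Filter the log so staff can review one child's progress.
        return [e for e in self.events if e["student"] == student_id]

log = InteractionLog()
log.record("s1", "morning circle", "completed")
log.record("s2", "morning circle", "started")
print(len(log.for_student("s1")))  # -> 1
```

A per-student filter like `for_student` is one way such a log could feed the rapid feedback on individual progress that the abstract describes.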

Use of a Wearable Recording Device in Therapeutic Interventions for Children with Autism

Gabriela Marcu, David H. Nguyen, Gillian R. Hayes
Department of Informatics
University of California, Irvine

Background
A common problem for caregivers of children with autism is facilitating communication, especially when the children are non-verbal. Caregivers use social stories and references to a child’s activities to encourage discussion, but these efforts are challenging and not always effective. Additionally, caregivers are challenged with trying to understand a child’s behavior and reactions without the aid of verbal expression from the child. This project applies SenseCam to therapeutic interventions for children with autism.

SenseCam is a small wearable camera developed at Microsoft Research. It takes photographs automatically at a frequency determined either by a preset interval or in response to sensory input. Photographs can be transferred to a computer and watched at different speeds. SenseCam was designed to augment human memory by providing a first-person account of the wearer’s activities.
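The two capture triggers described above, a preset interval and a sensor-driven response, can be sketched as simple decision logic. This is an illustrative sketch only; the interval, the light-level trigger, and the thresholds are assumptions, not SenseCam's actual internals.

```python
class CaptureController:
    """Sketch of interval- or sensor-triggered capture.

    Illustrates the behavior the abstract describes: a photo is
    taken when a preset interval elapses or when a sensor (here,
    a hypothetical light sensor) registers a sudden change.
    """

    def __init__(self, interval_s=30.0, light_delta_threshold=50):
        self.interval_s = interval_s
        self.light_delta_threshold = light_delta_threshold
        self.last_capture_t = 0.0
        self.last_light = None

    def should_capture(self, now, light_level):
        # Trigger if the preset interval has elapsed...
        if now - self.last_capture_t >= self.interval_s:
            triggered = True
        # ...or if the light level jumps, e.g. moving between rooms.
        elif (self.last_light is not None
              and abs(light_level - self.last_light) >= self.light_delta_threshold):
            triggered = True
        else:
            triggered = False
        if triggered:
            self.last_capture_t = now
        self.last_light = light_level
        return triggered

cam = CaptureController(interval_s=30.0)
print(cam.should_capture(30.0, 100))  # interval elapsed -> True
print(cam.should_capture(35.0, 100))  # neither trigger   -> False
print(cam.should_capture(40.0, 200))  # light jump        -> True
```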

Objectives
In this work, we hypothesized that the use of SenseCam by children with autism would enable greater awareness of the child’s perspective during the day and could support new therapeutic interventions for communication and understanding. The photographs are captured from the child’s perspective, allowing caregivers to review captured events with a child and encourage them to discuss activities and people appearing in these photographs. The primary objective of this work is to develop processes for incorporating SenseCam into everyday use by children with autism and to modify interfaces to SenseCam media for use in these interventions.

Methods
Children with autism, their daily activities, and their social interactions were observed in the classroom. Interviews with teachers and parents of children with autism provided additional data about communication challenges. Interviews with autism specialists informed the design of how SenseCam would be used by children with autism.

Results
We have developed a therapeutic intervention using a novel technology: SenseCam from Microsoft Research. In this intervention, children with autism can wear SenseCam during all or part of a typical day. Parents and other caregivers can review photographs taken during school hours while teachers and other school staff review photographs taken during after-school hours. These review sessions can facilitate mutual understanding of activities in these environments and better communication between school and home. Parents, teachers, and other caregivers can also review photographs together with the children to facilitate communication. The photographs can be used to encourage the children to discuss activities, people, social situations, and their feelings. This intervention can improve the children’s expression and augment their understanding of social interactions through open discussion about the photographs.

Conclusions
Working in collaboration with schools and autism specialists, we have developed a technology-based therapeutic intervention that can enhance caregiver awareness and improve children’s communication and social understanding. Future work will involve the deployment and evaluation of SenseCam in the classroom and the home among children with autism, their parents, teachers, and other caregivers.
