Telematic Embodied Learning
The Synthesis Center draws on decades of research into techniques that allow people to move about freely, manipulate physical objects, and use their senses of touch and proprioception. As pandemic-induced shutdowns accelerate the transition to blended and distance learning, we ask: Can we design learning experiences that allow students and teachers in different locations to use the affordances of their respective physical spaces and bodily abilities (Rajko 2017, Sheets-Johnstone 2011, 2013, 2016)?
In collaboration with experts in higher education and community stakeholders, the Synthesis Center is developing a suite of tools for collaborative, portable, mixed reality environments that free students and teachers from uninterrupted screen use and reintroduce spatial awareness and interaction wherever hybrid teaching methods are needed. Validating this novel learning experience will require iterative, community-driven design, custom development of low-cost software and hardware, and a pedagogical analysis that leverages but also moves beyond traditional methods of assessment and engagement.
This is a multidisciplinary effort that prioritizes:
- Shared experience in live settings, paying attention to spatial, corporeal, and social affordances to enable ensemble activities.
- Tools that are gesture-based, minimizing fatigue due to extensive use of screen-based technology.
- Accommodating unanticipated pedagogical practices invented by teachers (or students) for specialized needs, which may have distinct gestural idioms and techniques.
- Minimizing content development costs and user cognitive load without compromising the social aspects of learning: body language, a sense of physical presence, tangible affordances, and synchronous interaction between learners and teachers.
- Imbuing media objects (image, video, sound, text, freehand squiggle) with a tangibility that is analogous to that of physical experience.
- Easy incorporation of ad hoc bodily action and movement, and of the affordances of physical surroundings and objects, into pedagogical activity without requiring additional engineering or apps.
The TEL research group is developing relationships with community stakeholders to place teachers and students at the front and center of R&D. We will lead iterative design sessions and learn the pain points that teachers and students face on a daily basis, particularly as a consequence of the pandemic. We plan to develop these partnerships over the course of 2020-2021 and lead workshops that follow the principles of human-centered design.
A gestural interface for students to organize, create, and trace conceptual connections across multimedia learning materials, and for educators to design collaborative exercises.
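One way to ground this interface is a concept map in which a traced stroke that starts near one media object and ends near another creates a link between them. The sketch below is illustrative only; the class names, hit-testing radius, and stroke representation are our assumptions, not the Center's actual design:

```python
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    """A movable media item (image, video, sound, text, squiggle) on the canvas."""
    name: str
    x: float
    y: float

@dataclass
class ConceptMap:
    """Links two media objects when a gesture stroke starts near one and ends near the other."""
    objects: list = field(default_factory=list)
    links: set = field(default_factory=set)
    hit_radius: float = 40.0  # how close a stroke endpoint must land, in canvas units

    def nearest(self, x, y):
        """Return the object within hit_radius closest to (x, y), or None."""
        best, best_d = None, self.hit_radius
        for obj in self.objects:
            d = ((obj.x - x) ** 2 + (obj.y - y) ** 2) ** 0.5
            if d <= best_d:
                best, best_d = obj, d
        return best

    def trace(self, stroke):
        """Interpret a stroke (a list of (x, y) points) as a link gesture."""
        src = self.nearest(*stroke[0])
        dst = self.nearest(*stroke[-1])
        if src and dst and src is not dst:
            self.links.add(frozenset((src.name, dst.name)))
            return True
        return False
```

Because links are undirected sets of names, retracing a connection in the opposite direction is idempotent; a real system would also need to handle stroke disambiguation (move vs. link vs. squiggle).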
Sutured spaces for virtual classrooms
A network solution for synchronous classroom activity; this setup will enable any scenario that requires individual workspaces, which instructors can observe in real time and either intervene in privately or share with the entire group of participants.
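The routing logic this implies can be sketched in-process: students publish workspace snapshots, the instructor observes all of them, and feedback is routed either privately or to the whole group. This is a minimal sketch under our own assumptions; a real deployment would put the same roles behind a transport such as WebSockets or WebRTC data channels:

```python
from collections import defaultdict

class ClassroomHub:
    """In-process sketch of a synchronous-classroom hub: students stream
    workspace updates; the instructor observes every stream and replies
    privately or projects one workspace to the entire group."""

    def __init__(self):
        self.workspaces = {}              # student id -> latest workspace snapshot
        self.inboxes = defaultdict(list)  # participant id -> received messages

    def publish(self, student, snapshot):
        """A student pushes the current state of their individual workspace."""
        self.workspaces[student] = snapshot

    def observe(self):
        """The instructor's live view over all individual workspaces."""
        return dict(self.workspaces)

    def intervene(self, student, note):
        """Private instructor feedback, visible only to one student."""
        self.inboxes[student].append(note)

    def share(self, student, audience):
        """Project one student's workspace to every participant in `audience`."""
        snapshot = self.workspaces[student]
        for member in audience:
            self.inboxes[member].append((student, snapshot))
```

The key design point is that observation is read-only and continuous, while intervention and sharing are explicit, addressed acts, which mirrors how an instructor circulates in a physical classroom.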
Media Choreography and Playful Environments
In this responsive media studio course, students with no programming background will create persuasive performances and installations using pre-built media processing software instruments and accessories. The emphasis is on experiential design, ensemble activity, and, where possible, augmenting the physical environment beyond screens and keyboards.
Complex Systems (Weather, Geography, Heatscapes)
A tabletop environment into which participants can jointly dip their hands and physical props to interact with and study a complex system, such as weather physics, a heatscape, or urban infrastructure.
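The heatscape case could be driven by a simulation as simple as 2D heat diffusion, with tracked hand positions over the tabletop injected as heat sources each frame. The following is a hedged sketch, not the Center's simulation code; the grid size, source model, and diffusion coefficient are illustrative:

```python
def step(grid, sources, alpha=0.2):
    """One explicit-Euler step of 2D heat diffusion on a small grid.

    `grid` is a list of lists of temperatures; `sources` maps (row, col)
    cells, e.g. tracked hand or prop positions over the tabletop, to fixed
    temperatures injected each frame. Boundary cells are held fixed.
    """
    rows, cols = len(grid), len(grid[0])
    for (r, c), temp in sources.items():
        grid[r][c] = temp                      # hands act as heat sources
    new = [row[:] for row in grid]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            lap = (grid[r - 1][c] + grid[r + 1][c]
                   + grid[r][c - 1] + grid[r][c + 1]
                   - 4 * grid[r][c])           # discrete Laplacian
            new[r][c] = grid[r][c] + alpha * lap
    return new

# Hold a "hand" at the center of a cold table and let heat spread outward.
grid = [[0.0] * 9 for _ in range(9)]
for _ in range(10):
    grid = step(grid, {(4, 4): 1.0})
```

Run per frame, a loop like this is fast enough at tabletop projection resolutions to stay interactive, and the same structure generalizes to other field-based systems (pressure, pollutant transport).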
- Gesture tracking
- Systems chart tracing
- Augmented Reality
- Streaming multi-channel video and audio
- Custom hardware for physical interaction
- Real-time complex systems simulation (e.g. weather patterns)
- Video portals and mated tabletops
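Gesture tracking in particular depends on smoothing noisy sensor streams before they drive media. As one small illustration (an exponential moving average is our assumption here, not a statement about the Center's actual tracking pipeline):

```python
class PointSmoother:
    """Exponential moving average over a stream of tracked 2D points.

    alpha near 1 follows the raw sensor closely (low latency, more jitter);
    alpha near 0 smooths aggressively (less jitter, more lag).
    """

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.state = None  # last smoothed (x, y), or None before first sample

    def update(self, x, y):
        """Feed one raw sample; return the smoothed point."""
        if self.state is None:
            self.state = (x, y)
        else:
            sx, sy = self.state
            self.state = (sx + self.alpha * (x - sx),
                          sy + self.alpha * (y - sy))
        return self.state
```

In practice the trade-off between lag and jitter matters for gestural idioms: a filter tuned for smooth squiggles will blur fast strokes, so alpha is often adapted to the speed of the tracked point.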
For more on technical resources, see Synthesis Center techniques.
Garrett L Johnson (AME MAS PhD): Research lead, experience design
Gabriele Carotti-Sha (San Francisco): External projects coordinator, experience design
Muindi F Muindi (U Washington, Seattle): Performative experience design
Tianhang Liu (AME Digital Culture): Augmented reality, research assistant
Andrew Robinson (Synthesis, Weightless, AME Digital Culture): Realtime media, user interfaces
Ivan Mendoza (AME Digital Culture): Gesture tracking, 3D graphics
Connor Rawls (Synthesis): Media choreography systems, network media, responsive environments
Pete Weisman (AME Technical Director): Audio-visual systems, responsive environments
Sha Xin Wei PhD (Synthesis, AME): Director, experience design, external projects