ATLAS researchers converge at TEI’26 to showcase their work on tangible, embedded, and embodied interaction
Sound, vision, movement and touch—ATLAS researchers explore many different ways humans can interact with computers, collect and analyze data, and empower creative exploration.
Nearly a dozen current and former ATLAS lab members will participate in TEI '26 in Chicago (March 8-11, 2026), the 20th annual conference presenting the latest results in tangible, embedded, and embodied interaction.
This year’s conference theme is “Tide + Tied.” Organizers note, “By becoming a venue to bring multi-folded 'Tides' across diverse, interdisciplinary fields, the conference aims to bring researchers, designers, and artists with different backgrounds and interests together to be 'Tied,' weaving the future of the TEI community together.”
ATLAS has been involved with the TEI conference since its early years, with Professor and ACME Lab director Ellen Do and ATLAS director Mark Gross both actively involved behind the scenes.
Do, who is a co-author on three papers and three works-in-progress accepted at TEI ’26, explains, “Each one of the projects is a documentation of how researchers think about ideas and how to implement them and get them to fruition.”
She elaborates, “The conference is called Tangible, Embedded and Embodied Interaction, so a lot of work we're doing is beyond the screen. Things that we touch and put together.”
Papers
Kosei Ueda, Ellen Yi-Luen Do, Hironori Yoshida
Abstract: Traditional carpentry faces a critical shortage of skilled workers due to limited opportunities for potential apprentices to access onsite woodworking experience. Through expert interviews, we learned the importance of hammering sound to judge the precision in Kigumi assembly, as master carpenters rely on differences between “soft sound” and “sharp sound” without relying on visuals. This paper presents Sound of Kigumi (SoK), a playful VR system for inexperienced users to casually experience sound sensory skills through the loop of hammering and chiseling. In SoK, users listen to hammering sound in relation to tightness, assess the precision of their work, and return to chiseling for further adjustments. Furthermore, SoK implements pseudo-haptic feedback by visually modifying hammering resistance based on chiseling progress. Expert evaluation indicated SoK replicates the hammering process and serves as an effective introductory tool, and user feedback confirmed SoK provides an immersive woodworking experience and effective Kigumi learning.

The first prototype: The user observes and hammers two types of Kigumi: one correctly processed, showing no visible gaps when hammered, and one incorrectly processed, revealing gaps upon hammering.
Krithik Ranjan, S. Sandra Bae, Peter Gyory, Ellen Yi-Luen Do, Clement Zheng, Rong-Hao Liang
Abstract: Outdated Computer Vision (CV) toolkits for Tangible User Interfaces (TUI) have led to fragmented practices, diminished reproducibility, and reduced community support. This paper examines the past, present, and future trajectory of CV-TUI toolkits. First, our scoping review of ACM literature reveals a divergence between applications using the limited interactions of established toolkits like ReacTIVision and the fragmented, bespoke systems built for complex interactions, highlighting the need for advanced toolkits that enable accessible making. Second, we present proof-of-concept applications using the contemporary ArUco fiducial marker library. We demonstrate how accessible hardware, like a top-down camera and a flat-panel display, can support a comprehensive design space of tangible interactions beyond 2D manipulation, including 3D spatial interaction, multi-device interaction, and actuated tangibles within canonical applications. Finally, reflecting on our findings, we offer six suggestions for building next-generation CV-TUI toolkits. This study provides the TUI community with an updated perspective to inform future research.

Sensing touch input on tokens with a capacitive touchscreen: a) Each ArUco-marked knob is augmented with a vinyl-cut copper sheet pattern. b-c) The knob transfers finger-touch inputs to the touchscreen when users interact with the token.
Thiago Rossi Roque, Ruojia Sun, Ellen Yi-Luen Do, Grace Leslie
Abstract: Building on the growing interest in technology-supported dance practice, neural imaging offers novel opportunities to reveal dancers’ internal states and expand the possibilities for augmented, embodied interaction. Despite advances in social neuroscience, the exploration of dance through brain imaging remains limited by technical challenges. To overcome these barriers, we developed and validated a real-time vibrotactile biofeedback system based on inter-brain coupling (IBC) measures from tango dancers using a mobile, synchronous multi-brain EEG system. We first conducted an empirical study recording synchronized EEG and motion data to test whether behavioral synchronization enhances inter-brain coupling. Insights from this study informed the design of our tangible neurofeedback system, which experienced dancers evaluated. Our findings support the Synchronicity Hypothesis of Dance and demonstrate how embodied technologies can enhance collective dance practice. This work introduces a novel methodological and interaction paradigm, bridging neural measurement with wearable feedback for socially situated embodied experiences.

HyperDance enables real-time measurement and tactile feedback of inter-brain coupling during natural partner dance practice.
Art and Performance
Eldy S. Lazaro Vasquez, Viola Arduini, Etta W Sandry, Katerina Houser, Srujana Golla, Mirela Alistar
Abstract: Bioactuated Tapestry is an installation that explores how biomaterials and textile craft unfold multiple temporalities of interaction. Structured in three zones, the installation moves from milk-based bioplastic samples that change shape quickly when misted, to a Sample Book that documents iterations of bioplastic integration into weaving, to a woven tapestry that changes shape slowly in response to humidity in the surrounding space. Together, these zones demonstrate how interaction can emerge from material behavior shaped through biomaterial formulation and, when woven, through structure. The work foregrounds biomaterial agency, weaving, and situated sustainability grounded in sourcing, fabrication, and practices of care. Through this convergence of biodesign and textile craft, Bioactuated Tapestry aligns with the TEI theme of Resurgence and Convergence, highlighting how material-led practices reconnect material experimentation, environmental attunement, and embodied ways of knowing.

Detail of Bioactuated Tapestry, showing colored casein-based bioplastic strips woven through black cotton yarns. Moisture causes the bioplastic to change shape, and the weave directs that change into curling.
Pictorials
Viola Arduini, Eldy S. Lazaro Vasquez, Srujana Golla, Mirela Alistar
Abstract: Leaking bodies are often concealed or disregarded in both society and design. Likewise, bodily fluids are rarely leveraged as triggers for material interaction in HCI. In this pictorial, we investigate how fluid-responsive biomaterials can enable porous, expressive, and cyclical interactions co-shaped by the body. We focus on a milk-derived bioplastic with reversible shape-changing properties, examining fluid absorption as a meaningful design affordance. Our material-led approach contributes both formulation and fabrication methods of casein bioplastic; while autoethnographic inquiry with a lactating body informed the development of Leaky Body Maps and speculative garments that position leakage as a generative site of body-material interaction. This work contributes to the discourse of feminist and posthuman HCI by centering bodily permeability, material responsiveness, and the potential of designing with – rather than concealing – leaky bodies.

Garment prototype showing where casein-based bioplastic was placed based on a body leak map of possible milk leakage.
Works In Progress
Rong-Hao Liang, Steven Houben, Krithik Ranjan, S. Sandra Bae, Peter Gyory, Ellen Yi-Luen Do, Clement Zheng
Abstract: Tangible User Interfaces (TUIs) that integrate digital information with physical interaction require specialized hardware and complex calibration, limiting their adoption in portable or mobile display systems. This paper introduces ArUcoTUI, a computer vision (CV) toolkit for prototyping tangible interactions on portable screens, leveraging standard cameras and the OpenCV library. ArUcoTUI uses ArUco fiducial markers to detect physical inputs. The software toolkit offers streamlined calibration, a signal processing pipeline, and a client application that translates tangible input into structured events for use in HCI applications. Using a conventional camera in a top-down setting with a flat-panel display, we demonstrate how this toolkit supports the development of interactive surface TUIs with advanced features, including 3D spatial interaction, multi-device interaction, and actuated tangibles within applications. We describe the software implementation, which utilizes accessible hardware to support the development of these tangible interactions. We provide the results of a preliminary evaluation with users, including design implications and suggestions for future research and development.

ArUcoTUI is a software toolkit for rapid TUI prototyping on portable screens. It uses standard cameras, OpenCV, and ArUco markers for real-time object tracking. We demonstrate the applicability using an overhead camera for a) multi-token music control, b) above-screen gesture detection, c) multi-display board games, and d) actuated data visualization using robots.
Krithik Ranjan, Khushbu Kshirsagar, Harrison Jesse Smith, Ellen Yi-Luen Do
Abstract: Character animation remains challenging for novices and children despite advances in digital tools. While recent tangible interfaces have lowered barriers by enabling creators to animate their drawings on paper, they are limited to preset animation sequences and support only human-like characters. We present Rig-a-Doodle, a tangible kit and web application for fully open-ended character rigging and animation, where creators can draw any character and construct a custom physical rig using everyday materials to animate it. This work-in-progress contributes a system of tangible interaction to animate hand-drawn characters by direct physical manipulation of custom rigs in real-time. We share findings from a preliminary workshop with adults to explore the kinds of expressive animation the kit enables, discover issues with interaction, and source ideas for future directions.

(Top-left) Rig-a-Doodle character template to draw the character and cut out CV markers for the rig. (Top-right, bottom-left, bottom-right) The three steps of Capture, Assign, and Play illustrated with screenshots from the Rig-a-Doodle application.
Ruhan Yang, Yuchen Zhang, Ellen Yi-Luen Do
Abstract: We present Lighting the Reef, an interactive installation that uses modular 3D paper circuits to explore ecological fragility. Participants build coral structures from foldable paper blocks with copper tape and low-voltage components. When connections align, coral modules glow, metaphorically expressing the energy exchange between coral and zooxanthellae, the symbiotic algae crucial to coral metabolism. Pollution modules add resistance that dims light or interrupts the current entirely, mirroring environmental disruption. We position Lighting the Reef as a Research through Design case that articulates fragility as an interaction aesthetic and ecological metaphor. We reflect on how modular circuitry, material constraints, and embodied play make precarity tangible. We also report workshops with 15 participants that discussed themes of care, collapse, and interdependence. We contribute insights into designing for fragility with modular circuits, ecological storytelling through tangible interaction, and accessible and reproducible designs for participatory sustainability education.

Lighting the Reef is a tangible installation built from 3D paper circuit modules, whose illumination depends on alignment and balance. As participants assemble and adjust the blocks, the lights brighten, dim, or turn off, reflecting the changing conditions of the coral system.
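The dimming effect the abstract describes follows from basic series-circuit behavior: each added "pollution module" inserts resistance, which reduces the current through a coral module's LED. The sketch below works through that relationship with entirely hypothetical numbers (a 3 V supply, a 2 V LED forward drop, and a 100-ohm base resistor); it is an illustration of the principle, not the installation's actual circuit values.

```python
# Series-circuit sketch: LED current drops as "pollution" resistance is added.
# All component values here are hypothetical, chosen only for illustration.
def led_current_ma(extra_resistance_ohms, v_supply=3.0, v_led=2.0, r_base=100.0):
    """Current (mA) through an LED in series with r_base plus added resistance."""
    return 1000.0 * (v_supply - v_led) / (r_base + extra_resistance_ohms)

print(led_current_ma(0))    # base circuit: 10.0 mA, LED at full brightness
print(led_current_ma(400))  # with pollution modules: 2.0 mA, visibly dimmer
```

Below a few milliamps most indicator LEDs dim noticeably, which is the mechanism that lets added modules make ecological disruption visible.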