Human-Centered Physical AI
The world around us is inherently physical, yet adaptive interfaces focus mostly on adapting digital content. Users, however, perform many physical tasks and interact with physical objects all the time. To support users with physical tasks, AI needs an embodiment.
Augmenting humans > automating tasks
We research adaptive systems for physical interaction that are ubiquitous and unobtrusive. Intelligence should be integrated into our familiar objects and environments, blending into the background while dynamically adapting to our needs.
We investigate interactive structures and physical AI, designing computational methods and innovative materials that seamlessly blend computation, fabrication, and robotics. Our research advances the creation of responsive materials and intelligent structures that sense, react, adapt, and communicate with users and the environment. By embedding intelligence directly within physical forms, from flexible robotic actuators to wearable technologies and metamaterials, we create intuitive interfaces, enhance human interaction, and foster novel, unobtrusive integrations of technology into everyday life.
At the Interactive Structures Lab, we bridge the gap between understanding user needs and adapting to them by developing full-stack research prototypes, including software systems and physical interfaces.
The Interactive Structures Lab is led by Prof. Alexandra Ion at the CMU HCII.
Reach out to join the team. CMU students, see current research opportunities.