The Seminar Series on Accessibility and AI

Find out more about how to attend the biweekly seminar series here. Below is an overview of previous speakers and talks.

Thijs Roumen: “why'd you say that? Human-AI relationships for AAC users” February 5th 2026

Abstract: Accessibility is to date the best use case of AI, but (how) should we go about using AI to advance accessibility? Thijs explores this question through the lens of his own work in speech accessibility. The core question is what role AI should or could play in supporting augmentative and alternative communication (AAC). Together with his student Tobias Weinberg, he developed interfaces to support the creation of humorous interjections through autocomplete. This AI-humor turned out to be a powerful form of backchanneling for participants in the study--it communicates presence rather than elaborate, complex humor. In follow-up work, they ran workshops to dive deeper into the micro-culture of backchanneling as it has developed in AAC communities. Finally, to really find out what role AI should play, they conducted a longitudinal experiment in which they trained a personalized AI on 7 months of Tobias's speech. In this talk Thijs discusses the advantages of accurate autocomplete and the costs of such ultra-personal AI in terms of privacy, agency, and identity.

Bio: Thijs Roumen is an assistant professor in Information Science at Cornell Tech (NYC) and one of the executive board members of the CATAI initiative. He is the recipient of a Google Research award and several best paper and honorable mention awards at top-tier conferences. In 2023 he finished his PhD in HCI and IT Systems Engineering with Prof. Patrick Baudisch at the Hasso Plattner Institute. Before that he was a research assistant at the National University of Singapore with Prof. Shengdong Zhao. Thijs completed an MSc in IT Product Design at the University of Southern Denmark in Sønderborg and a BSc in Industrial Design at Eindhoven University of Technology in the Netherlands.

Tapomayukh Bhattacharjee: “Physical Intelligence for Physical Care: Towards Stakeholder-Informed Caregiving Robots in the Real World” February 19th 2026

Abstract: How can we build robots that meaningfully assist people with mobility limitations in their daily lives? To support complex caregiving tasks such as robot-assisted feeding, transferring, bathing, and meal preparation, robots must physically interact with people and objects in dynamic, unstructured environments. In this talk, I will present an overview of various projects from my lab that showcase fundamental advances in the field of physical robotic caregiving. I will highlight how we design stakeholder-informed systems--from simulation platforms to real-world deployments--by integrating multimodal perception, user feedback, and adaptive algorithms. Together, these efforts move us closer to creating caregiving robots that are not only technically capable but also responsive to the real needs of people in care settings.

Bio: Tapomayukh "Tapo" Bhattacharjee is an Assistant Professor in the Department of Computer Science at Cornell University, where he directs the EmPRISE Lab (https://emprise.cs.cornell.edu/). He completed his Ph.D. in Robotics at the Georgia Institute of Technology and was an NIH Ruth L. Kirschstein NRSA postdoctoral research associate in Computer Science & Engineering at the University of Washington. His primary research interests are in the areas of physical robot caregiving and physical human-robot interaction. He is the recipient of the TRI Young Faculty Researcher Award '24, the NSF CAREER Award '23, and the AFCEA 40 Under 40 Award '22. His work has won the Best Paper Award at RSS '25; Best Paper Award Finalist, Best Student Paper Award Finalist, and Best HRI Paper Award Finalist at ICRA '25; Best Systems Paper Award Finalist at HRI '24; Best Demo Award at HRI '24; Best RoboCup Paper Award at IROS '22; Best Paper Award Finalist and ABB Best Student Paper Award Finalist at IROS '22; Best Technical Advances Paper Award at HRI '19; and Best Demonstration Award at NeurIPS '18. His work has also been featured in many media outlets, including the BBC, Reuters, the New York Times, IEEE Spectrum, and GeekWire, and his robot-assisted feeding work was selected as one of the best interactive designs of 2019 by Fast Company.