The speed at which new technologies hit the market is nothing compared to the speed at which talented researchers find creative ways to use them, train them, and turn them into things we can't live without. One such researcher is MIT MAD Fellow Alexander Htet Kyaw, a graduate student pursuing a dual master's degree in architectural studies in computation and in electrical engineering and computer science.
Kyaw takes technologies such as artificial intelligence, augmented reality, and robotics, and combines them with gesture, speech, and object recognition to interact with the built environment, reimagining shopping, designing complex structures, and creating human-machine workflows for making physical things.
One of his latest innovations is Curator AI, for which he and his fellow MIT graduate students won $26,000 in OpenAI products and cash at an AI build competition held at the MIT AI Conference. Partnering with Kyaw were Richa Gupta (architecture), Bradley Bunch, Nidhish Sagar, and Michael Won.
Curator AI is designed to streamline online furniture shopping by using AI and AR to provide context-aware product recommendations. The platform uses AR to measure the room, capturing the locations of windows, doors, and existing furniture. The user can then describe, by voice, the new furniture they need, and the system uses a vision-language AI model to search for and display a variety of options that suit both the user's prompt and the visual characteristics of the room.
“Shoppers can choose from suggested options, visualize products in AR, and use natural language to ask for modifications to the search, making the furniture selection process more intuitive, efficient, and personalized,” says Kyaw. “The problem we’re trying to solve is that most people don’t know where to start when furnishing a room, so we developed Curator AI to provide smart, context-aware recommendations based on what your room looks like.” Curator AI was developed for furniture shopping, but it could be expanded for use in other markets.
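The recommendation step described above can be sketched in miniature. In this hypothetical example, the `Room` and `FurnitureItem` types, and the simple keyword-and-dimension filter, are illustrative stand-ins for the AR measurements and vision-language matching the article describes; none of these names come from the actual product.

```python
from dataclasses import dataclass

@dataclass
class Room:
    # Simplified stand-in for measurements Curator AI captures via AR
    width_m: float
    depth_m: float
    style: str  # dominant style inferred from the room's appearance

@dataclass
class FurnitureItem:
    name: str
    width_m: float
    depth_m: float
    style: str

def recommend(room: Room, prompt: str, catalog: list[FurnitureItem]) -> list[FurnitureItem]:
    """Return catalog items that physically fit the room and match the prompt or style.

    A real system would delegate the matching to a vision-language model;
    here a keyword-and-dimension filter stands in for that step.
    """
    wanted = prompt.lower()
    results = []
    for item in catalog:
        fits = item.width_m <= room.width_m and item.depth_m <= room.depth_m
        matches = item.name.lower() in wanted or item.style == room.style
        if fits and matches:
            results.append(item)
    return results

catalog = [
    FurnitureItem("sofa", 2.2, 0.9, "modern"),
    FurnitureItem("bookshelf", 0.8, 0.3, "rustic"),
    FurnitureItem("dining table", 3.5, 1.5, "modern"),
]
room = Room(width_m=3.0, depth_m=4.0, style="modern")
print([i.name for i in recommend(room, "I need a sofa for my living room", catalog)])
# → ['sofa']  (the dining table is too wide for the room; the bookshelf matches neither prompt nor style)
```

The point of the sketch is the filtering order: geometric fit from the AR scan constrains the candidate set before stylistic matching is applied, which is what makes the recommendations "context-aware" rather than purely prompt-driven.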
Another example of Kyaw’s work is Estimate, a product he created with three other graduate students during a hackathon at the March 2024 MIT Sloan Product Tech Conference. The focus of that competition was supporting small businesses, and Kyaw’s team chose to base its work on a Cambridge painting company with 10 employees. Estimate uses AR and object-recognition AI to take accurate measurements of a room and generate detailed cost estimates for renovation and/or painting jobs. It also uses generative AI to show clients images of how their rooms might look once painted or renovated, and it generates an invoice when the project is complete.
The team won the hackathon and $5,000 in cash. Kyaw’s teammates were Guillaume Allegre, May Khine, and Anna Mathy, all of whom graduated from MIT in 2024 with master’s degrees in business analytics.
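The estimating step Estimate automates — turning AR-captured dimensions into a quote — can be illustrated with a minimal, hypothetical calculation. The function name, rates, and productivity figure below are invented for illustration; the article does not describe the product's actual pricing model.

```python
def painting_estimate(walls, price_per_m2, labor_rate_per_hour, m2_per_hour):
    """Estimate the cost of a painting job from AR-captured wall dimensions.

    walls: list of (width_m, height_m) tuples, one per wall surface.
    price_per_m2: materials cost per square meter.
    labor_rate_per_hour: hourly labor rate.
    m2_per_hour: square meters a painter covers per hour.
    """
    area = sum(width * height for width, height in walls)
    materials = area * price_per_m2
    labor = (area / m2_per_hour) * labor_rate_per_hour
    return round(materials + labor, 2)

# Four 3 m x 2.5 m walls: 30 m² total
print(painting_estimate([(3.0, 2.5)] * 4,
                        price_per_m2=2.5,
                        labor_rate_per_hour=40.0,
                        m2_per_hour=10.0))
# → 195.0  (75.0 materials + 120.0 labor)
```

The value of automating this is less the arithmetic than the inputs: AR measurement removes the manual tape-measure step that makes small contractors' quotes slow and error-prone.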
In April, Kyaw will give a TEDx talk at his alma mater, Cornell University, discussing his projects that use AI, AR, and robotics to design and build things.
One of those projects is Unlog, in which Kyaw connected AR with gesture recognition to create software that takes input from the touch of a fingertip on the surface of a material to map the dimensions of building components. That is how Unlog, a towering art sculpture made from ash logs that stands on the Cornell campus, came to be.

Video: Gesture recognition for feedback-based mixed reality and robotic fabrication of the Unlog Tower. (Video: Alexander Htet Kyaw)
Unlog represents the idea that structures can be built directly from whole logs, rather than having the logs shipped to a lumber mill to be converted into boards or two-by-fours and then shipped on to a wholesaler or retailer. It expresses Kyaw’s desire to use building materials more sustainably. A paper on this work, “Gesture Recognition for Feedback-Based Mixed Reality and Robotic Fabrication: A Case Study of the Unlog Tower,” by Kyaw, Leslie Lok, Lawson Spencer, and Sasa Zivkovic, was published in January 2024 in the proceedings of the 5th International Conference on Computational Design and Robotic Fabrication.
Another system Kyaw developed integrates physics simulation, gesture recognition, and AR to design actively bent structures built from bamboo poles. Gesture recognition lets users manipulate digital bamboo modules in AR, while the integrated physics simulation visualizes how the bamboo bends and how the poles can be arranged to create a stable structure. This work was featured in August 2023 in the proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe as “Active Bending in Physics-Based Mixed Reality: The Design and Fabrication of a Reconfigurable Modular Bamboo System.”
Last year, Kyaw pitched a similar idea, using the bamboo modules to create deployable structures, at MITdesignX, a MIT MAD program that provides coaching and funding to launch promising startups. Kyaw has since founded BendShelters to build prefabricated, modular bamboo shelters and community spaces for refugees and displaced people in his home country of Myanmar.
“Where I grew up, in Myanmar, there are so many daily effects of climate change and extreme poverty,” says Kyaw. “There’s a major refugee crisis in the country. I want to think about how I can contribute back to my community.”
His work with BendShelters has been recognized by MIT Sandbox, the PKG Social Innovation Challenge, and Amazon Robotics’ social good award.
At MIT, Kyaw is working with Professor Neil Gershenfeld, director of the Center for Bits and Atoms, and PhD student Miana Smith to use speech recognition, 3D generative AI, and robotic arms to create a workflow that can build objects in an accessible, on-demand, and sustainable way. Kyaw holds a bachelor’s degree in architecture and computer science from Cornell. Last year, he was awarded an SJA Fellowship from the Steve Jobs Archive, which funds projects at the intersection of technology and the arts.
“I enjoy exploring different kinds of technology to design and make things,” says Kyaw. “Being part of MAD has helped me think about how all my work connects, and it has clarified my intentions. My research vision is to design and develop systems and products that enable natural interactions between humans, machines, and the world around us.”