About me

I am a Ph.D. candidate in Mechanical Engineering at the Mechatronics, Controls and Robotics Laboratory (MCRL) at the NYU Tandon School of Engineering. My work focuses on extended-reality systems (virtual, augmented, and mixed reality) for rehabilitation and human–robot interaction. Specifically, I develop head-mounted and handheld AR interfaces, combined with perception algorithms and supplementary sensors, to build intuitive solutions that anyone, regardless of technical background, can use.

Background & Motivation

I began my engineering journey at Tecnológico de Monterrey, where I earned a scholarship to study Mechatronics Engineering and designed a camera-guided robotic arm for my senior capstone project. After graduation, I worked as an automation engineer in the oil and gas sector, honing my skills in PLC programming, HMI and SCADA design, sensor integration, and real-time control systems. These experiences solidified my passion for seamless human–machine collaboration and laid the groundwork for my transition to advanced robotics research.

Path to Research

In 2018, I enrolled in NYU’s M.S. in Mechatronics & Robotics program, where I worked on machine-vision and learning-algorithm projects. After one semester, I began volunteering in the Artificial Intelligence for Civil Engineering Laboratory (AI4CE Lab), led by Professor Chen Feng, whose research on construction robotics inspired me to deepen my engagement with research. Building on that foundation, I entered NYU’s Ph.D. program, advised by Professor Vikram Kapila, where I developed a tele-dialysis interface as part of the COVID-19 response.

Today, I specialize in extended-reality interfaces and perception-driven systems: designing head-mounted and handheld AR devices, evaluating them through structured user studies, and refining workflows that streamline human–robot interaction. My goal is to deliver practical, technically robust solutions that advance the next generation of robotic applications.