Projects
Duke Robotics Club
As Electrical Lead of the Duke Robotics Club, I led the migration of our sensor communication systems to ROS 2, designing a modular, robot-agnostic architecture that supported development across multiple robotic platforms. Alongside a small team of engineers, I co-led the electrical design and assembly of Duke Robotics’ new robot, Crush. I also developed computer vision algorithms that analyze sonar data to detect underwater objects in real time. Additionally, I managed project timelines, delegated tasks across the electrical subteam, and onboarded new members.
Python
ROS
Arduino
Soldering
C++
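A minimal sketch of the robot-agnostic ROS 2 pattern described above (illustrative only, not the club's actual code): a sensor driver node keeps robot-specific details in parameters rather than hard-coded values, so the same node can run on any platform. The node name, topic, and sensor type here are hypothetical.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32


class DepthSensorNode(Node):
    """Hypothetical depth-sensor publisher; names and topics are illustrative."""

    def __init__(self):
        super().__init__('depth_sensor')
        # Robot-specific details come from parameters, not hard-coded values.
        self.declare_parameter('topic', 'depth')
        self.declare_parameter('rate_hz', 10.0)
        topic = self.get_parameter('topic').value
        rate = self.get_parameter('rate_hz').value
        self._pub = self.create_publisher(Float32, topic, 10)
        self._timer = self.create_timer(1.0 / rate, self._publish)

    def _publish(self):
        msg = Float32()
        msg.data = self._read_sensor()
        self._pub.publish(msg)

    def _read_sensor(self) -> float:
        return 0.0  # placeholder for the real serial/I2C read


def main():
    rclpy.init()
    rclpy.spin(DepthSensorNode())


if __name__ == '__main__':
    main()
```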
Brain Tool Lab - Compact Tension Sensing Unit
I designed and built a compact force-sensing system for continuum robots, capable of accurately measuring tendon tension for real-time closed-loop control. Drawing on prior work from other researchers, I reduced the system cost from over $5,000 to just $150 and cut the lead time by approximately 90%. This work was presented at the 2025 International Symposium on Medical Robotics (ISMR), where I shared our design and results with a global community of robotics researchers.
Fusion 360
3D Printing
Arduino
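A minimal host-side sketch (illustrative, not the published design) of how a low-cost tendon-tension sensor like this can be read: an Arduino streams raw load-cell readings over serial, and a linear calibration maps counts to newtons for the controller. The port name and calibration constants below are placeholders, not the values used in the paper.

```python
import serial  # pyserial

PORT = '/dev/ttyACM0'      # hypothetical Arduino port
COUNTS_PER_NEWTON = 420.0  # placeholder slope from a dead-weight calibration
ZERO_OFFSET = 8_300        # placeholder no-load reading


def read_tension(ser: serial.Serial) -> float:
    """Convert one raw sample from the load cell into tendon tension (N)."""
    raw = int(ser.readline().strip())
    return (raw - ZERO_OFFSET) / COUNTS_PER_NEWTON


if __name__ == '__main__':
    with serial.Serial(PORT, 115200, timeout=1) as ser:
        while True:
            print(f'tension: {read_tension(ser):.2f} N')
```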
Brain Tool Lab - TACTER
Working with other researchers, I co-designed the actuation systems for a multi-material, tendon-actuated concentric tube robot developed for minimally invasive endoscopic endonasal surgery. The project was led by my mentor, Kent Yamamoto, and the robot was later validated through both phantom and cadaver experiments, successfully navigating from the nostril to the sphenoid sinus. Our work was presented by Kent at the 2025 International Symposium on Medical Robotics (ISMR) as part of a broader paper on the system’s design and experimental performance.
Fusion 360
3D Printing
MATLAB
Square Game
I designed and implemented a 5-stage pipelined CPU in Verilog running at 50 MHz, featuring a custom ALU, support for multiplication and division, and both the standard MIPS instruction set and custom instructions. A friend and I then used the processor to drive a motion-controlled game on an FPGA platform, incorporating tilt-based input from an onboard accelerometer, real-time VGA video output, and battery-powered operation. I developed the core game functionality, including accelerometer integration, scoring on 7-segment displays, and custom background music.
Verilog
MIPS
Vivado
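A toy behavioral sketch, in Python purely for illustration, of the 5-stage pipeline idea behind this project: each clock cycle, instructions advance one stage (IF → ID → EX → MEM → WB). The actual design was written in Verilog with full datapath and hazard logic; this only shows the stage-register structure, and the sample instructions are arbitrary.

```python
STAGES = ['IF', 'ID', 'EX', 'MEM', 'WB']


def run(program: list[str], cycles: int) -> None:
    pipeline = [None] * len(STAGES)  # one register per stage
    pc = 0
    for cycle in range(cycles):
        # Shift every in-flight instruction one stage toward write-back.
        pipeline = [None] + pipeline[:-1]
        # Fetch the next instruction, if any remain.
        if pc < len(program):
            pipeline[0] = program[pc]
            pc += 1
        occupancy = ', '.join(
            f'{stage}: {instr or "-"}' for stage, instr in zip(STAGES, pipeline)
        )
        print(f'cycle {cycle}: {occupancy}')


if __name__ == '__main__':
    run(['addi $t0,$zero,1', 'add $t1,$t0,$t0', 'sw $t1,0($sp)'], cycles=7)
```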
MIT Beaver Works Summer Institute
I programmed a DJI Tello drone to autonomously track AR tags and visually identify known objects using computer vision. As part of a team project, I also developed an interactive control system that let the drone follow user-selected hands or custom-colored targets during a game of Just Dance, navigating purely through visual cues. I tuned PID controllers to keep flight smooth and stable during dynamic tracking tasks.
Python
ROS
OpenCV
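A minimal sketch of the kind of tag-tracking loop described above, not the actual competition code. It assumes OpenCV 4.7+ (ArucoDetector API) and uses a webcam in place of the Tello video stream: detect an ArUco marker, measure its horizontal offset from the image center, and run a simple PID on that error to produce a yaw command. The gains and marker dictionary are placeholders.

```python
import cv2

KP, KI, KD = 0.4, 0.0, 0.1  # placeholder PID gains


def main():
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
        cv2.aruco.DetectorParameters(),
    )
    cap = cv2.VideoCapture(0)
    integral, prev_error = 0.0, 0.0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        corners, ids, _ = detector.detectMarkers(frame)
        if ids is not None:
            # Horizontal offset of the first marker from the image center,
            # normalized to [-1, 1]; this is the error the PID drives to zero.
            center_x = corners[0][0][:, 0].mean()
            error = (center_x - frame.shape[1] / 2) / (frame.shape[1] / 2)
            integral += error
            yaw_cmd = KP * error + KI * integral + KD * (error - prev_error)
            prev_error = error
            print(f'yaw command: {yaw_cmd:+.2f}')  # would be sent to the drone
        cv2.imshow('tracking', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    cap.release()


if __name__ == '__main__':
    main()
```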
First Tech Challenge
I designed and built multi-system robots capable of handling game-specific objects, with control interfaces mapped to PlayStation controllers. To support autonomous behavior, I integrated odometry for precise state estimation and implemented AR tag-based object detection to identify key field elements. I also incorporated motion planning libraries and simulation tools, which improved path efficiency and significantly accelerated autonomous development and testing.
Java
TensorFlow
OpenCV
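A toy sketch of the odometry update behind the state estimation mentioned above, written in Python for brevity (the actual FTC code was Java): integrate left/right drive encoder deltas into a pose (x, y, heading) each control loop. The encoder scale and track width are placeholder constants, and the drive model is a simple differential drive.

```python
import math
from dataclasses import dataclass

TICKS_PER_MM = 1.89   # placeholder encoder scale
TRACK_WIDTH_MM = 380  # placeholder distance between wheel sets


@dataclass
class Pose:
    x: float = 0.0        # mm
    y: float = 0.0        # mm
    heading: float = 0.0  # radians


def update(pose: Pose, d_left_ticks: int, d_right_ticks: int) -> Pose:
    """Dead-reckoning update for a differential (tank) drive."""
    d_left = d_left_ticks / TICKS_PER_MM
    d_right = d_right_ticks / TICKS_PER_MM
    d_center = (d_left + d_right) / 2
    d_theta = (d_right - d_left) / TRACK_WIDTH_MM
    heading = pose.heading + d_theta
    return Pose(
        x=pose.x + d_center * math.cos(heading),
        y=pose.y + d_center * math.sin(heading),
        heading=heading,
    )


if __name__ == '__main__':
    pose = Pose()
    for _ in range(10):  # ten loops of equal wheel motion: a straight line
        pose = update(pose, 100, 100)
    print(pose)
```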