A project that provides a shared calendar for cleaners and maintenance staff. It takes in a set of iCal links from Airbnb or VRBO and generates one unified calendar at a public link.
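In outline, the merging step can be as simple as fetching each feed and copying its events into one calendar. A minimal sketch, assuming the Python `requests` and `icalendar` packages (the feed URLs and file names are placeholders, not the project's actual code):

```python
# Sketch: merge several iCal feeds (e.g., Airbnb/VRBO export links)
# into one unified calendar. Assumes the third-party `requests` and
# `icalendar` packages; the URLs below are placeholders.
import requests
from icalendar import Calendar

FEED_URLS = [
    "https://www.airbnb.com/calendar/ical/EXAMPLE.ics",
    "https://www.vrbo.com/icalendar/EXAMPLE.ics",
]

def merge_feeds(urls):
    merged = Calendar()
    merged.add("prodid", "-//Unified Cleaning Calendar//EN")
    merged.add("version", "2.0")
    for url in urls:
        source = Calendar.from_ical(requests.get(url, timeout=10).text)
        for event in source.walk("VEVENT"):
            merged.add_component(event)  # copy each booking event over
    return merged.to_ical()  # bytes, ready to serve as a public .ics file

if __name__ == "__main__":
    with open("unified.ics", "wb") as f:
        f.write(merge_feeds(FEED_URLS))
```

Serving the resulting `unified.ics` from any static host gives cleaners a single subscribable link.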
I prototyped a mobile-friendly line-calling system for pickleball that uses computer vision to reduce disputes over close calls. Built with YOLO models, RoboFlow, and Python, the app detects ball trajectories and court lines and offers instant replays to help players make accurate judgments.
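As a rough sketch of the detection loop, not the app's actual code: the snippet below assumes the `ultralytics` YOLO package, OpenCV, and hypothetical custom weights (`pickleball.pt`) trained on a RoboFlow-labeled dataset.

```python
# Sketch: per-frame ball detection feeding a trajectory buffer.
# Assumes `ultralytics` and OpenCV; "pickleball.pt" is a hypothetical
# set of custom weights trained on a RoboFlow-labeled dataset.
import cv2
from ultralytics import YOLO

model = YOLO("pickleball.pt")  # hypothetical custom-trained weights
trajectory = []                # (frame_idx, x, y) ball centers

cap = cv2.VideoCapture("rally.mp4")
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]
    for box in results.boxes:
        if results.names[int(box.cls)] == "ball":
            x, y, w, h = box.xywh[0].tolist()  # center-format box
            trajectory.append((frame_idx, x, y))
    frame_idx += 1
cap.release()
# A bounce point near a detected court line can then be interpolated
# from consecutive trajectory samples to support an instant-replay call.
```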
I developed a low-latency sensor to digitize respiration for virtual reality, aiming to create seamless, immersive relaxation experiences. After I shared it on the Oculus subreddit, it quickly became the top post of the day, attracting over 1,000 views and interest from researchers and developers. This led to collaborations with institutions including Cornell and the University of Florida, and we filed a provisional patent application to protect the technology.
We collaborated with Stanford CHARIOT to develop a VR experience that helps reduce pre-procedural anxiety and pain for children in the operating room. By immersing children before IV placement, the tool occupies cognitive load that would otherwise go to the procedure, minimizing discomfort; it is now in clinical trials at Stanford Children's Hospital.
While working on JunoVR, a company I co-founded with my friend Eric Levin to create therapeutic VR experiences, we explored a range of ideas aimed at helping people relax and meditate. Drawing inspiration from research on heart rate variability, respiration, Buddhism, states of awe, and binaural audio, we developed immersive experiences such as breath-controlled flying meditations, out-of-body breathing visuals, and environments that responded empathetically to users. Our goal was to craft experiences that induced calm with minimal cognitive effort. Through VR meditation events and onsite user testing in San Francisco, we observed significant anxiety reductions, up to 36% in a 15-minute guided session, validating the potential of VR as a tool for mental wellness. On this project I took on the roles of Developer, Sensor Engineer, CEO, and Artist.
I developed an application that detects a user's exhales with an off-the-shelf microphone while filtering out environmental noise and voices. By analyzing frequency-spectrum peaks specific to breath, the app detected breathing without false positives during speech.
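The core idea: exhales are broadband noise, while voiced speech concentrates energy in narrow harmonic peaks at lower frequencies. A minimal sketch of that heuristic, with illustrative band edges and threshold rather than the app's tuned values:

```python
# Sketch: classify a short audio frame as "exhale" when spectral energy
# concentrates in a broadband band rather than in the narrow harmonic
# peaks typical of voiced speech. Band edges and the threshold are
# illustrative assumptions, not the app's tuned values.
import numpy as np

RATE = 16000  # samples/second

def looks_like_exhale(frame, breath_band=(200, 2000), voice_band=(85, 300)):
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / RATE)

    def band_energy(lo, hi):
        return spectrum[(freqs >= lo) & (freqs < hi)].sum()

    breath = band_energy(*breath_band)  # broadband exhale noise
    voice = band_energy(*voice_band)    # speech fundamental range
    return breath > 3.0 * voice         # illustrative ratio threshold
```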
At the 2016 San Francisco Virtual Reality Hackathon, our team built a VR sports training simulator that let users enter a virtual locker room, watch motivational videos, and engage in training exercises led by top athletes. Using Leap Motion and Oculus Rift, we created an immersive demo that won first place in the Health/Medical category.
I developed a Leap Motion-based application to track the progression of Parkinson's disease and Essential Tremor by precisely measuring hand-tremor frequency and movement volatility during a simple line-tracing task. Unlike traditional methods that rely on subjective evaluations, the software provided 1.2 mm accuracy in 3D space, capturing data hundreds of times per second as a low-cost, high-resolution alternative. The project gained attention from Leap Motion, which featured it on its blog, and neurologists worldwide reached out to collaborate, using the shared code in their research labs.
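Conceptually, tremor frequency falls out of the spectrum of the tracked hand position once the slow tracing motion is removed. A sketch under assumed details (sample rate, band limits, and smoothing window are illustrative, not the app's values):

```python
# Sketch: estimate dominant tremor frequency and movement volatility
# from a tracked position series. Sample rate, band, and smoothing
# window are illustrative; the original app used live Leap Motion data.
import numpy as np

def tremor_metrics(positions, sample_rate=200.0, band=(3.0, 12.0)):
    """positions: (N, 3) array of hand positions in mm."""
    x = positions[:, 0]
    slow = np.convolve(x, np.ones(25) / 25, mode="same")  # tracing motion
    residual = x - slow                                   # tremor component

    spectrum = np.abs(np.fft.rfft(residual * np.hanning(len(residual))))
    freqs = np.fft.rfftfreq(len(residual), d=1.0 / sample_rate)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak_freq = freqs[mask][np.argmax(spectrum[mask])]  # dominant tremor Hz

    volatility = np.std(np.diff(positions, axis=0))  # frame-to-frame jitter
    return peak_freq, volatility
```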
At the SF Neurogaming Hackathon 2014, I built an application that captured screenshots whenever users smiled, overlaying their emotional data on the content they were viewing. The project explored how emotion detection could enhance digital experiences, though the Emotiv EEG's emotion-detection accuracy at the time was limited.
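In outline, the capture loop pairs an emotion signal with a screen grab. The sketch below uses a hypothetical `read_smile_intensity()` stub in place of the Emotiv SDK, whose real API differs:

```python
# Sketch: grab a screenshot whenever a smile signal crosses a threshold,
# tagging the file with the reading. `read_smile_intensity()` is a
# hypothetical stand-in for the Emotiv SDK's expressive-state output.
import time
from PIL import ImageGrab  # screen capture on Windows/macOS

def read_smile_intensity():
    """Hypothetical stub: return smile strength in [0, 1] from the headset."""
    return 0.0

THRESHOLD = 0.6  # illustrative trigger level

def capture_loop():
    while True:
        smile = read_smile_intensity()
        if smile > THRESHOLD:
            shot = ImageGrab.grab()
            shot.save(f"smile_{time.time():.0f}_{smile:.2f}.png")
            time.sleep(2.0)  # debounce so one smile yields one capture
        time.sleep(0.1)
```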
Inspired by Spritz's speed-reading technology, I developed Super Reader, a proof of concept that integrates touchless controls into the reading experience. Using blink detection and subtle hand movements, I explored new ways to control the reading pace without traditional inputs. Blink tracking showed promise as a low-effort control method, but hand tracking proved less practical than simpler options like keyboard controls.
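The Spritz-style RSVP core is a timed word loop; a sketch with a hypothetical `poll_blink()` stub standing in for the blink detector:

```python
# Sketch: Spritz-style rapid serial visual presentation (RSVP) where a
# blink toggles pause. `poll_blink()` is a hypothetical stand-in for
# the blink detector.
import sys
import time

def poll_blink():
    """Hypothetical stub: return True once per detected blink."""
    return False

def rsvp(text, wpm=300):
    delay = 60.0 / wpm
    paused = False
    for word in text.split():
        while True:
            if poll_blink():
                paused = not paused  # a blink toggles pause/resume
            if not paused:
                break
            time.sleep(0.05)
        sys.stdout.write("\r" + word.ljust(20))  # one word at a time
        sys.stdout.flush()
        time.sleep(delay)

rsvp("Inspired by Spritz, words appear one at a time at a fixed pace.")
```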
Botanicus Perverticus is a houseplant with a personality, designed to crave human touch and respond with a satisfied sigh when its needs are met. Using 60 Hz power-grid noise picked up by an open-ended analog-to-digital converter input, I built a proximity-sensing mechanism in LabVIEW that triggered audio responses through a networked sound server. Inspired by Disney Research's Botanicus Interacticus, this playful project aimed to make plants more relatable, sparking empathy and responsibility for living things. We showcased it at local MakerSpace meetups, where it became a crowd favorite.
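The original sensing logic lived in LabVIEW; the Python sketch below illustrates the same idea: measure the 60 Hz mains-hum amplitude on a floating ADC input and fire the sigh when a nearby hand couples in more hum. The driver call, sample rate, and threshold are hypothetical.

```python
# Sketch (Python rather than the original LabVIEW): read samples from a
# floating ADC input, measure the 60 Hz mains-hum amplitude with a
# Goertzel filter, and trigger the "sigh" when a nearby hand raises it.
# `read_adc_block()` is a hypothetical driver call; rates and threshold
# are illustrative.
import math

RATE = 1000       # ADC samples per second (assumed)
BLOCK = 200       # 0.2 s per measurement
TARGET_HZ = 60.0  # mains hum

def goertzel_amplitude(samples, freq, rate):
    """Amplitude of a single frequency bin via the Goertzel algorithm."""
    w = 2.0 * math.pi * freq / rate
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
    return math.sqrt(max(power, 0.0)) / (len(samples) / 2.0)

def read_adc_block(n):
    """Hypothetical stub: return n samples from the open-ended ADC input."""
    return [0.0] * n

def monitor(threshold=0.05, play_sigh=lambda: print("*contented sigh*")):
    while True:
        hum = goertzel_amplitude(read_adc_block(BLOCK), TARGET_HZ, RATE)
        if hum > threshold:  # a hand near the plant couples in more hum
            play_sigh()
```

A single-bin Goertzel filter is a natural fit here because only one frequency matters, so there is no need for a full FFT per block.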