SYSTEM SITE
Hi, I'm Rishabh.
Builder across STEM + Business + Impact
Timelines across engineering, companies, and nonprofit work - every project opens into a drawer with real context, proof, and next steps.
WORK
Timelines by category
Definitely not every build, but these are some of my favorites.
LANES
Choose a timeline
Each lane is its own narrative. Hover rows to preview the poster that follows your cursor.
Click again to stay in the lane and refine.
TIMELINE
Showing 7 of 7 builds in STEM.
Built a robot that draws before middle school. In 3rd–4th grade I scratch-built a 2D pen plotter: Arduino + CNC shield, salvaged stepper motors, wood rails, limit switches, and a lot of hot glue forming a custom motion platform that I wired, coded, and calibrated without any kit or ChatGPT/AI help. Every line the plotter draws comes from my pipeline: Inkscape to G-code, then Arduino firmware translating that code into smooth X/Y motion. It’s proof I was already running full-stack hardware projects (electronics, firmware, mechanics, and debugging) years before most people start. Want the deeper dive into the skills, schematics, and what it taught me? Read more about the Pen Plotter build →
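The G-code half of that pipeline can be sketched in a few lines. This is an illustrative reconstruction, not the original firmware: the function name and the steps-per-millimeter calibration constant are assumptions.

```python
# Hypothetical sketch: parse G0/G1 moves (as exported from Inkscape)
# into absolute X/Y stepper step counts. STEPS_PER_MM is an assumed
# calibration value, not the plotter's real one.
import re

STEPS_PER_MM = 80  # assumed steps-per-millimeter calibration

def gcode_to_steps(lines, steps_per_mm=STEPS_PER_MM):
    """Turn G0/G1 moves into absolute (x_steps, y_steps) targets."""
    targets = []
    x = y = 0.0
    for line in lines:
        if not re.match(r"G0?[01]\b", line):
            continue  # skip comments, pen up/down, other commands
        mx = re.search(r"X(-?\d+\.?\d*)", line)
        my = re.search(r"Y(-?\d+\.?\d*)", line)
        if mx:
            x = float(mx.group(1))
        if my:
            y = float(my.group(1))
        targets.append((round(x * steps_per_mm), round(y * steps_per_mm)))
    return targets
```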
Built my own 3D printer before I ever had one. In 4th grade I took the CNC ideas from my pen plotter and scratch-built a DIY 3D printer using a RAMPS 1.4 board running Marlin, stepper motors, limit switches, a hacked-together wood/metal frame, and a bolted-on extruder: no kit, no 3D-printed parts, no ChatGPT/AI help. I tuned firmware settings, fought wobble on the Z axis, and iterated until the machine could actually move in X/Y/Z. Want the deeper dive into what worked, what broke, and what it taught me about real-world precision? Read more about the DIY 3D Printer build →
Turned my DIY 3D printer into a CNC mill. After getting my scratch-built printer moving, I swapped the extruder for a drill and pushed the same RAMPS 1.4 + Marlin setup, stepper motors, and hacked-together wood/metal frame into acting as a basic CNC mill: no kit, no off-the-shelf plans, no ChatGPT/AI help. I rewrote motion settings, experimented with feeds and depths, and learned the hard way how flex, vibration, and cutting forces expose every weakness in your mechanics. Want the deeper dive into what this taught me about machine limits, tool loads, and designing past the edge of your hardware? Read more about the CNC Mill conversion →
Turned a mirror into a smart display in 5th grade. I built a Raspberry Pi–based computer and flashed and configured the OS so the glass shows time, weather, and widgets floating in what looks like a normal mirror. Want the deeper dive into the wiring, software stack, and what it taught me about UX and polish? Read more about the Magic Mirror build →
Moved a prosthetic hand I built with my own thoughts in middle school. In 7th–8th grade I designed and 3D-printed the prosthetic arm itself (multi-finger hand, linkages, servos) and built the full EEG control stack around it: an 8-channel headset over the motor cortex feeding into an OpenBCI Cyton, ESP8266, Arduino, and Raspberry Pi. No kit, no lab, no ChatGPT/AI help. I wrote the entire signal pipeline (filters, CSP, feature extraction, logistic regression) to decode left/right motor imagery, then added haptic feedback by measuring grip force on the prosthetic fingers and recreating that pressure at other nerve sites so the brain “feels” the hand closing. Want the deeper dive into the EEG pipeline, prosthetic mechanics, and what it taught me about building real neuroprosthetics? Read more about the Mind-Controlled Prosthetic with Haptic Feedback →
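The decoding pipeline named above (CSP spatial filters, log-variance features, logistic regression) can be sketched like this. It is a minimal reconstruction under stated assumptions, not the original code: the data is synthetic white noise, and the channel count and function names are illustrative.

```python
# Illustrative CSP + logistic regression sketch for two-class motor
# imagery, on synthetic data. Not the project's actual code.
import numpy as np
from scipy.linalg import eigh
from sklearn.linear_model import LogisticRegression

def fit_csp(trials, labels, n_filters=2):
    """trials: (n_trials, n_channels, n_samples); labels: 0/1 per trial."""
    covs = {}
    for c in (0, 1):
        mats = [t @ t.T / np.trace(t @ t.T) for t, l in zip(trials, labels) if l == c]
        covs[c] = np.mean(mats, axis=0)
    # Generalized eigenvectors of (cov0, cov0 + cov1): the extreme
    # eigenvalues give spatial filters whose output variance best
    # separates the two classes.
    _, vecs = eigh(covs[0], covs[0] + covs[1])
    picks = list(range(n_filters // 2)) + list(range(-(n_filters // 2), 0))
    return vecs[:, picks].T                      # (n_filters, n_channels)

def csp_features(trials, W):
    z = np.einsum('fc,ncs->nfs', W, trials)      # spatially filtered trials
    var = z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

rng = np.random.default_rng(0)

def make_trials(channel_scales, n=60, samples=128):
    # White noise per channel, scaled so variance differs by class.
    return rng.normal(size=(n, len(channel_scales), samples)) * \
           np.asarray(channel_scales, dtype=float)[None, :, None]

left  = make_trials([3, 1, 1, 1, 1, 1, 1, 1])   # "left" imagery: channel 0 active
right = make_trials([1, 1, 1, 1, 1, 1, 1, 3])   # "right" imagery: channel 7 active
X = np.concatenate([left, right])
y = np.array([0] * 60 + [1] * 60)

W = fit_csp(X, y)
clf = LogisticRegression().fit(csp_features(X, W), y)
acc = clf.score(csp_features(X, W), y)
```

In the real pipeline the input would be band-pass-filtered EEG epochs rather than synthetic noise; the CSP and classifier stages stay the same shape.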
Designed a research OS for LLMs in high school. In 9–10th grade I wrote a 50-page architecture called Eureka Engine that turns a base LLM into a careful problem-solver: it proposes ideas, designs experiments, talks to tools/oracles, scores what worked, logs everything in a versioned memory graph, and only returns a single, tested “eureka” per cycle instead of a brainstorm dump. It’s a full system spec with modules for hypothesis generation, active experiment design, MDL/utility scoring, safety gates, and provenance tracking that I now use as the mental “OS” behind my own research. Want the deeper dive into the diagrams, operators, and how it links to my BCI/silent speech work? Read more about the Eureka Engine →
Silent speech by warping an electric field in your ear. In 11th grade I built a prototype where a pair of earbuds generate a capacitive electric field in the ear canal and I read the tiny distortions that tongue and jaw motion cause in that field, turning a single noisy channel into a 1D control slider. From there I map that slider into tokens for real-time, hands-free, eyes-free, voice-free conversation with an LLM-designing the whole stack myself, from raw field distortion → robust normalization → adaptive multi-band slider → low-latency AI loop. As far as I could find, using ear-canal electric field distortion to track tongue movement has almost no prior published work. Want the deeper dive into the sensing physics, signal pipeline, and what it taught me about turning electric fields into language? Read more about the Novel Earbud Silent Speech Interface →
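The "noisy channel → normalized slider → tokens" step can be sketched as below. The adaptation scheme, parameters, and class name are illustrative assumptions, not the prototype's actual algorithm.

```python
# Hypothetical sketch: map a drifting 1D sensor reading onto a 0–1
# slider with an adaptive running envelope, then quantize the slider
# into discrete tokens. Parameters are illustrative.
class AdaptiveSlider:
    def __init__(self, alpha=0.05, n_tokens=4):
        self.alpha = alpha          # adaptation rate for envelope tracking
        self.lo = self.hi = None    # running estimates of signal range
        self.n_tokens = n_tokens

    def update(self, x):
        if self.lo is None:
            self.lo = self.hi = x
        # Track the envelope: expand instantly on new extremes, shrink
        # slowly so the slider keeps using the full range as the raw
        # signal drifts.
        self.lo = min(x, self.lo + self.alpha * (x - self.lo))
        self.hi = max(x, self.hi + self.alpha * (x - self.hi))
        span = self.hi - self.lo or 1e-9
        pos = (x - self.lo) / span                       # normalized 0..1
        token = min(int(pos * self.n_tokens), self.n_tokens - 1)
        return pos, token
```

Each incoming sample yields a slider position and a token index; a downstream loop would feed those tokens to the LLM.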
- Developed Power BI dashboards to surface KPIs for exec reviews.
- Optimized refresh schedules so reporting stayed live for stakeholders.
- Wrote and tuned SQL queries that extracted insights for ticket prioritization.
- Documented query logic so engineers could self-serve new reports.
- Managed incoming tickets and service requests end-to-end.
- Created triage notes so issues were forwarded to the right owners without dead time.
- Led intern onboarding, training, and daily stand-ups.
- Mapped deliverables and checkpoints so remote work stayed aligned.