I started my career because of that one famous TED talk (the one linked above) on multi-touch user interfaces by Jeff Han. The way he presented his experiments and solutions, and the sheer awesomeness of simultaneous multi-touch input in user interfaces, had such a profound effect on my young brain that it kickstarted a latent need to build things from scratch. Digging deeper, I found NUIGroup, a forum where people from all over the world experimented with ways to build multitouch hardware on their own. Learning from that group, I built a multitouch screen using a projector, a camera, and infrared lasers. At the same time, I joined a group of people developing SDKs for building apps with multitouch UX. I was one of the core contributors to the PyMT project, which later evolved into Kivy.org. I learned so much being part of that group, thanks to some amazing people and mentors.
The video of my final year project. I built both the software and the hardware. Wish I had captured a better video of the hardware :D
All these memories came flooding back when I saw someone saying that Unity has magical support for multitouch input from the MacBook trackpad. Immediately I was like, “hey, I’ve seen this one.” We had added support for it 15+ years ago in PyMT. Essentially, the MacBook trackpad’s raw input can be accessed through undocumented C APIs. Back then we struggled to find hardware with multitouch support, and people used clever hacks to get multitouch devices to report multiple inputs, so the Mac trackpad was just plain awesome: we could test our software on something that was right there. After the Unity post, I saw someone else mention this again while gatekeeping how they did it. O_o
Remembering all this motivated me to build something I could open source. I knew I wanted to use three.js for visualization, but browsers don’t let you call undocumented C APIs, so the next obvious plan was to stream the touch input to the browser somehow. Websockets! I wrote a quick backend in Go to access the C APIs using cgo, then fired up Claude Code to handle the rest: the websocket streaming boilerplate and a frontend to connect and visualize the touch points. Claude Sonnet has a surprisingly good grasp of three.js, so I had the demo app working within a couple of hours. The post on X ended up blowing up a bit.
As an experiment I built a Mac trackpad multi-touch input visualizer. Saw someone mentioning that Unity has support for this. TBH this has been known for 16+ years now. It's basically an undocumented C API, which can be easily accessed via CGO. These API are not accessible via… pic.twitter.com/YmfU93DvV9
— Sharath Patali (@sharathpatali) October 5, 2025
Because I made it open source, it has already helped someone create another variant of the app within a couple of hours. That made me super happy, and reminded me of all the good butterfly effects of open sourcing software and ideas.
inspired by him, I added a garden where you can plant flowers & trees, kind of a stress relief tool
— mirkenan (@mirkenan_) October 6, 2025
I'll try to make this even better https://t.co/yvIkhozQDN pic.twitter.com/joSnjmPUyo
Check out the project: mtwebviz on GitHub