What stuff do I do?

Currently I’m part of Google’s Lit team. I also run the Twitter account! 🐦

Material Web Components

I helped start and, for a while, managed the Material Design Web Components library — the canonical implementation of Material Design on the web, built with Lit Web Components.


YouTube

For a long time I helped convert YouTube from its legacy framework to Web Components, and I’ve worked extensively on helping them migrate to newer and greater things. I also fix various component-level bugs across their frontend properties.

(Hello dark mode 🌚)

Component 64



For Google I/O ‘17 I worked with others on everyone’s favorite physical-web creature-battling game: Polymon! It was a social game where we hid physical web beacons around the I/O grounds, each containing a Polymon. Once you collected several Polymon, you could battle other players in a turn-based, MTG-like / Pokémon-like fight.

The game was a PWA, used QR codes to communicate between players, and natively supported deep linking and sharing via Android Beam. repo

Microsoft - Bing

I did an internship at Microsoft, where I worked on Cortana and Bing integration, designing and implementing several Cortana features.

Talks and Videos

I have given talks at various Web Component events, summits, and hackathons.

World Wide Web Consortium - Crosscloud

As an undergraduate I did research at the W3C. While there, I worked on Crosscloud, which at the time was an experimental foray into decentralized social profiles.

There I built several web applications, such as a decentralized photo gallery and a decentralized Twitter (very similar to Mastodon today).

MIT Media Lab

At the MIT Media Lab, I worked in the Responsive Environments Group, a group dedicated to creating context-aware environments that serve humans.

While there, I worked under a PhD student whose research focused on experimental, perception-based lighting controls.

There I created a Unity simulation of the lighting controls, helped present it at an investor symposium, and helped acquire millions of dollars in funding and equipment from Philips. I also built an iPhone interface connected to a server that computed the correct lighting configuration from the student’s research and displayed it in real life. Finally, I created a robust data-collection program that could record and replay test subjects’ touches from HCI trials.