I am currently a software engineer at Bloomberg working on data visualization. Previously I worked at Adobe on Lightroom Web, did mobile app development at Zynga, and VR research at the University of Pennsylvania.

In my own time I like to work on visual and audio software projects using libraries like ThreeJS and ProcessingJS.

Story Maker

Interactive Plot-Based Narrative Generation

Fall 2018

For my Senior Capstone Project, I created a hierarchical task network (HTN) to procedurally generate unique stories from a few user-specified parameters and a world base (characters, settings, events). I used the SHOP2 algorithm as a guide for implementing my HTN. The project draws together a broad range of fields: linguistics, interactive storytelling, literature, automated planning, and world building.
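The core HTN idea can be sketched roughly as follows. This is a minimal illustration in the spirit of SHOP2-style ordered decomposition, not the actual Story Maker code: all task and state names here are hypothetical.

```javascript
// Minimal HTN sketch: compound tasks decompose into subtasks via methods
// whose preconditions are checked against the world state; primitive
// tasks become story events. (Illustrative names, not the project's.)

const methods = {
  tellRevengeStory: [
    {
      precondition: (state) => state.villainAlive,
      subtasks: ["sufferWrong", "trainHero", "confrontVillain"],
    },
  ],
};

const primitives = new Set(["sufferWrong", "trainHero", "confrontVillain"]);

function plan(tasks, state) {
  const events = [];
  const frontier = [...tasks];
  while (frontier.length > 0) {
    const task = frontier.shift();
    if (primitives.has(task)) {
      events.push(task); // primitive task: emit a story event
      continue;
    }
    // Pick the first method whose precondition holds in the current state
    const method = (methods[task] || []).find((m) => m.precondition(state));
    if (!method) throw new Error(`No applicable method for ${task}`);
    frontier.unshift(...method.subtasks); // decompose in order, SHOP2-style
  }
  return events;
}
```

Changing the world state (say, `villainAlive: false`) rules out that method and pushes the planner toward a different decomposition, which is how one world base can yield many distinct stories.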

My experience with writers' block inspired me to build a tool that lets authors play with different narrative possibilities for the characters in their world. While the field of Interactive Storytelling is dense with projects and innovations, I found comparatively little research on Assisted Authoring, which focuses on creating interesting stories rather than on user interaction.

I wrote the app using ReactJS so that it would be web-based, easily accessible, and shareable. For my demo I created my own world base and illustrated corresponding art for the different characters and events!

github repo →


Audio-processing video art web app

go here for a friend and a song →

JavaScript, Webpack, ThreeJS

I wrote a function that maps an audio source's real-time fundamental frequency (pitch) onto an RGB color spectrum, and played around with creating video art programmatically.
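One plausible way to do this mapping (my sketch of the general approach, not the project's actual code) is to exploit the fact that pitch is perceived logarithmically: take the frequency's fractional position within its octave and map that to hue, so notes an octave apart get the same color. The `baseHz` default below (middle C) is an assumption.

```javascript
// Map a fundamental frequency (Hz) onto an RGB color.
// Octave position -> hue, so pitch classes share a color.
function frequencyToRgb(freqHz, baseHz = 261.63 /* middle C; an assumption */) {
  // Fractional log2 offset from the base note, normalized to [0, 1)
  const octavePos = ((Math.log2(freqHz / baseHz) % 1) + 1) % 1;
  return hsvToRgb(octavePos * 360, 1, 1);
}

// Standard HSV -> RGB conversion, returning 0-255 channels
function hsvToRgb(h, s, v) {
  const c = v * s;
  const x = c * (1 - Math.abs(((h / 60) % 2) - 1));
  const m = v - c;
  const sector = Math.floor(h / 60) % 6;
  const [r, g, b] = [
    [c, x, 0], [x, c, 0], [0, c, x],
    [0, x, c], [x, 0, c], [c, 0, x],
  ][sector];
  return [r + m, g + m, b + m].map((u) => Math.round(u * 255));
}
```

In a browser, the real-time frequency itself would typically come from the Web Audio API's `AnalyserNode`, with a pitch-detection pass (e.g. autocorrelation) over the time-domain samples.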

(github repo)

Donation Tracking and Recruitment Management Web App

Technical Lead, Full Stack Engineer for Hack4Impact

Spring 2018

As the technical lead of a team of four, I oversaw the technical completion of this project for Bread & Roses Community Fund, a local Philadelphia nonprofit that supports community-based movements for racial equity and economic opportunity through fundraising and grantmaking. My responsibilities included leading technical infrastructure decisions, mentoring other members, assigning weekly tasks, and communicating the project scope and details to the client.

The Problem
Bread & Roses' Giving Project had an overloaded recruitment process and no good way to track donations over the long term. The Giving Project seeks to bring together a group of diverse, passionate individuals, so cohort balance is a huge consideration in recruitment. We built a Flask web application to address both needs.

The Product
We built a recruitment management system that allows admins to track applicants, see overall cohort statistics, and search for and manage previous applicants. With this system, no applicant falls through the cracks, and balancing a new cohort becomes less of a headache.

We also built a donor tracking system that allows participants in the Giving Project to track their donors from first communication up until the donation actually makes it into the bank. We were influenced by Asana for the design, and added special mechanisms between columns to allow the user to specify new information (such as the date, and amount pledged/donated) before moving cards to a new column.

This application was built off of flask-base, a Hack4Impact open source Flask boilerplate app. It was written in Python and JavaScript, using Jinja templating and CSS.

See It Live
We deployed the product in July 2018! The application is only accessible to admins and participants, but feel free to check out the landing page and interest form.

Conditional Application Processing Software

Technical Lead, Full Stack Engineer for Hack4Impact

Fall 2017

As the technical lead of a team of five, I oversaw the technical aspects of this project for Habitat for Humanity Philadelphia, the Philly chapter of Habitat for Humanity, a global nonprofit housing organization with the vision of a world where everyone has a decent place to live.

The Problem
The application process for Habitat's homeownership and home repair programs is complex and calculation-intensive because of the financial history, income, and background information needed to determine an applicant's eligibility. Habitat relies on its volunteers to complete these background checks and calculations by hand, an inefficient and error-prone process.

The Product
We sought to solve Habitat's application process bottleneck by building completely customizable, conditional application-processing software.

With this system, an admin can specify conditional formulae (like in Excel) that process each round of inputs in an application, automatically doing the calculations that would have been done by hand for each level of background checks. Admins can also customize forms and their order to represent each phase of checks.
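The idea of admin-defined conditional formulae can be sketched like this. This is a hypothetical rule format for illustration, not the project's actual schema (and the income threshold is a placeholder, not Habitat's real cutoff):

```javascript
// Excel-style conditional formulae: each rule computes a derived value
// or a pass/fail check from the applicant's answers. (Illustrative only.)
const rules = [
  // Derived value: monthly income from annual income
  { name: "monthlyIncome", compute: (answers, derived) => answers.annualIncome / 12 },
  // Eligibility check; 2000 is a placeholder threshold, not a real figure
  { name: "incomeEligible", compute: (answers, derived) => derived.monthlyIncome >= 2000 },
];

function evaluate(answers, rules) {
  const derived = {};
  for (const rule of rules) {
    // Rules run in order, so later formulae can reference earlier results
    derived[rule.name] = rule.compute(answers, derived);
  }
  return derived;
}
```

Because the rules are data rather than code, an admin-facing form builder can create, reorder, and edit them without a developer, which is what makes the processing "completely customizable."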

This application was built off of flask-base, a Hack4Impact open source Flask boilerplate app. It was written in Python and JavaScript, using Jinja templating and CSS. We deployed this product in February 2018.

Procedural City

3D city generated using shape grammars

Spring 2017

For this project, I designed a shape grammar that creates urban buildings, and designed a city layout to place the buildings generated by my shape grammar. The overall result is a procedurally generated city, which I named Suzville!

The shape grammar parser I wrote to create the city is a modified version of the L-System parser from a previous L-System project.
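At its core, an L-System repeatedly rewrites a string by applying production rules to each symbol. A tiny sketch of that rewriting step, with an illustrative rule set rather than the project's actual grammar:

```javascript
// Minimal L-System string rewriting. Each iteration replaces every
// symbol that has a production rule; other symbols pass through.
// (The rule below is a classic example, not Suzville's grammar.)
const lsystemRules = { F: "F+F-F" };

function expand(axiom, rules, iterations) {
  let s = axiom;
  for (let i = 0; i < iterations; i++) {
    s = [...s].map((ch) => rules[ch] ?? ch).join("");
  }
  return s;
}
```

A shape grammar applies the same rewrite-until-done idea to geometry: instead of characters, the symbols are shapes (a building mass, a floor, a facade panel), and each rule subdivides or replaces a shape with more detailed ones.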

The specific grammar rules I used, and more about my implementation and design choices can be found in the github repo :)

This project was written primarily using ThreeJS, a 3D library for JavaScript. I created it for the experimental Procedural Graphics class (CIS 700) taught in the Spring of 2017.

Music xx

Interactive Spotify 3D music experience

Spring 2016

here! →

JavaScript, ThreeJS, Spotify API

I love listening to music while riding a train, walking, or running, so I built this app to simulate that experience of immersive, multi-sensory flow. This personalized app lets you experience your Spotify playlists visually, offering several 3D scenes to choose from and calculating the pace of the visual effects from the average tempo of the selected playlist.
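The tempo-to-pace calculation can be sketched as below. This is my rough illustration of the approach, with an assumed 120 BPM baseline; Spotify's audio-features endpoint reports a per-track `tempo` in BPM, which is what the track objects here stand in for.

```javascript
// Average the per-track tempos (BPM) of a playlist.
// Track objects mirror the `tempo` field from Spotify's audio features.
function averageTempo(tracks) {
  const total = tracks.reduce((sum, t) => sum + t.tempo, 0);
  return total / tracks.length;
}

// Scale an animation's per-frame step so that 120 BPM feels "normal".
// (The 120 BPM baseline is an assumption for illustration.)
function animationSpeed(avgBpm, baseSpeed = 1.0) {
  return baseSpeed * (avgBpm / 120);
}
```

In a ThreeJS render loop, `animationSpeed` would multiply whatever per-frame delta drives the scene (camera travel, particle drift), so an up-tempo playlist literally moves the world faster.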

(github repo)