The site you are currently on!
Made from scratch in HTML / CSS / JavaScript. Used JSDoc in some places for static typing and documentation. All cards and certain UI elements were hand-drawn and scanned in.
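For example, JSDoc annotations like the ones below (the function itself is just an illustration, not actual site code) give editor autocomplete and type checking without a build step:

```javascript
/**
 * Linearly interpolates between two values.
 * @param {number} a - Start value.
 * @param {number} b - End value.
 * @param {number} t - Blend factor in [0, 1].
 * @returns {number} The interpolated value.
 */
function lerp(a, b, t) {
  return a + (b - a) * t;
}
```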
This is the first real graphics project I made. I was inspired by Code Parade's video on generating fractals via ray marching, which led me to various resources such as Inigo Quilez's amazing website, which contains lots of info on SDFs and ray marching. I also experimented with ShaderToy before switching to my own website, which uses WebGL.
Like most ray marchers, it works by rendering a full-screen quad (the vertex shader simply passes the quad's 6 vertices through to the fragment shader), and all of the ray marching is then done per pixel in the fragment shader. Camera information is passed to the fragment shader via a uniform so that you can move around the Mandelbulb.
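Roughly, the setup looks like this minimal WebGL sketch (not the site's actual shader code; the uniform names `uCamPos` / `uCamRot` are just placeholders):

```javascript
// Six vertices covering clip space: two triangles that form the full-screen quad.
const quadVerts = new Float32Array([
  -1, -1,   1, -1,   -1,  1,   // first triangle
  -1,  1,   1, -1,    1,  1,   // second triangle
]);

// Pass-through vertex shader; all the interesting work happens per fragment.
const vertexSrc = `
  attribute vec2 aPos;
  void main() {
    gl_Position = vec4(aPos, 0.0, 1.0);
  }`;

// Each frame, the camera is uploaded as uniforms so the fragment shader
// can march rays from the current eye position and orientation.
function uploadCamera(gl, program, camera) {
  gl.useProgram(program);
  gl.uniform3fv(gl.getUniformLocation(program, "uCamPos"), camera.position);
  gl.uniformMatrix3fv(gl.getUniformLocation(program, "uCamRot"), false, camera.rotation);
}
```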
Made as a solo project for KnightHacks 2020. I wanted to create a virtual 3D environment where multiple people could join and project their camera feeds onto in-world textures. See more info on the Project Page
After randomly pairing up with some programmers and artists, we immediately started working on a puzzle game in Unity. I made all the sprites, the intro 3D animation, the music, the level design, and the codebase. See more info on the Project Page
Made as the final project for my computer graphics class, along with Nick Stuhldreher.
We implemented a GPU-accelerated noise function, the marching cubes algorithm, and a simple renderer. We also added simple FPS controls and a few different parameters to play with. Everything runs in real time, meaning you can completely change the volume (density) function and see the results immediately.
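As a rough illustration of how the "change the density function live" part can work (a simplified sketch, not our actual code; `densityBody` and the grid size are made up), the user-editable WGSL snippet gets spliced into the compute shader source and the pipeline is rebuilt, so the next marching-cubes pass samples the new field:

```javascript
function buildDensityPipeline(device, densityBody) {
  const code = /* wgsl */ `
    @group(0) @binding(0) var<storage, read_write> field : array<f32>;

    fn density(p : vec3<f32>) -> f32 {
      ${densityBody}            // e.g. "return length(p) - 1.0;"
    }

    @compute @workgroup_size(4, 4, 4)
    fn main(@builtin(global_invocation_id) id : vec3<u32>) {
      let dim = 64u;            // grid resolution (illustrative)
      let idx = id.x + id.y * dim + id.z * dim * dim;
      // Sample the density on a [-1, 1]^3 grid and store it for marching cubes.
      field[idx] = density(vec3<f32>(id) / f32(dim) * 2.0 - 1.0);
    }`;
  const module = device.createShaderModule({ code });
  return device.createComputePipeline({
    layout: "auto",
    compute: { module, entryPoint: "main" },
  });
}
```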
WebGPU was still a relatively new technology, with few tutorials and incomplete documentation, so we ended up working through a lot of the issues ourselves by trial and error. In the end it was very rewarding to be able to demo our project by simply going to the site, instead of cloning a Python script off GitHub.
In developing the "pt-gpt" project, I integrated the GPT-4 API to create a personal trainer application. The frontend, built with JavaScript, React, and Expo, interacts with Firebase services. I set up Firebase functions to handle user authentication, profile setup, and real-time chat interactions. The chat function leverages the OpenAI API, processing user messages and generating AI responses based on user profiles and conversation history. My work focused on ensuring seamless integration between the frontend, Firebase backend, and the OpenAI API, aiming to provide a personalized and interactive training experience.
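A heavily simplified sketch of what such a callable Firebase function could look like (illustrative only: the collection names, prompt, and error handling are assumptions, and it assumes a firebase-functions v1-style callable plus Node 18+ for the global `fetch`):

```javascript
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.chat = functions.https.onCall(async (data, context) => {
  // Only authenticated users can talk to the trainer.
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first.");
  }

  // Pull the user's profile so the model can personalize its answer.
  const profileSnap = await admin.firestore()
    .collection("users").doc(context.auth.uid).get();

  // Forward the profile, conversation history, and new message to the OpenAI API.
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4",
      messages: [
        { role: "system", content: `You are a personal trainer. Profile: ${JSON.stringify(profileSnap.data())}` },
        ...(data.history || []),                 // prior conversation turns
        { role: "user", content: data.message }, // the new message
      ],
    }),
  });
  const json = await response.json();
  return { reply: json.choices[0].message.content };
});
```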
The Triangular Array Elementary Cellular Automaton. It works the same way as a regular ECA, where the state of a cell is determined by the states of some neighboring cells in the previous row, except that each row is offset by half a cell, meaning there is always an even number of "previous" cells. The number of possible previous-state combinations is k^(2r), where k is the number of states and r is the range. The number of possible rulesets is therefore k^(k^(2r)), since each of those combinations can be mapped to any state.
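For concreteness, here is a minimal sketch of one update step for k = 2, r = 1, so each new cell looks at 2r = 2 parents and the rule table has k^(2r) = 4 entries (the parent-indexing convention and names are just illustrative):

```javascript
// Because rows are offset by half a cell, cell i's parents are taken here to be
// cells i and i + 1 of the previous row (wrapping around at the edges).
function stepTriangularECA(row, rule /* 4 states, indexed by left * 2 + right */) {
  const next = new Array(row.length);
  for (let i = 0; i < row.length; i++) {
    const left = row[i];
    const right = row[(i + 1) % row.length];
    next[i] = rule[left * 2 + right];
  }
  return next;
}

// Example: rule table [0, 1, 1, 0] is XOR of the two parents.
console.log(stepTriangularECA([0, 1, 1, 0, 1], [0, 1, 1, 0]));
```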
Wanted to get more experience creating a datapath / workflow in WebGPU. Created some reusable resource templates, bind groups, and pipelines, allowing different parts of the program to be coupled together easily.
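The idea looks roughly like this (illustrative names, not the project's actual helpers): describe a set of buffers once, derive the bind group layout, buffers, and bind group from that single description, and let different pipelines share the same wiring:

```javascript
function makeBufferTemplate(device, entries) {
  // One layout entry per described buffer, all visible to compute passes.
  const layout = device.createBindGroupLayout({
    entries: entries.map((e, i) => ({
      binding: i,
      visibility: GPUShaderStage.COMPUTE,
      buffer: { type: e.type },   // "storage", "read-only-storage", "uniform", ...
    })),
  });
  const buffers = entries.map((e) =>
    device.createBuffer({ size: e.size, usage: e.usage })
  );
  const bindGroup = device.createBindGroup({
    layout,
    entries: buffers.map((buffer, i) => ({ binding: i, resource: { buffer } })),
  });
  return { layout, buffers, bindGroup };
}

// Different pipelines can then be coupled to the same template:
// const pipeline = device.createComputePipeline({
//   layout: device.createPipelineLayout({ bindGroupLayouts: [template.layout] }),
//   compute: { module, entryPoint: "main" },
// });
```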