Projects


This page exists to showcase the types of work I enjoy doing: successes I have had in personal and professional projects, lessons learned through hands-on exploration, and current work that has at least one iteration out in the world. Though I’m trained as a software engineer, not all of the projects listed here are software-related – I thoroughly enjoy developing ideas of all types, and appreciate the challenges and education involved in exploring new media and building new skills.

Projects are listed in reverse chronological order, with the most recent work detailed first.

Formlabs: PreForm 3.0 UI Overhaul

When I joined Formlabs, my first project was to overhaul the user interface of our primary desktop application, PreForm, to coincide with the launch of the Form 3 printer. I had four months to learn the Qt development framework (which I had never used before), understand the existing code structure, create all-new components and styling, manage feedback from internal stakeholders, and ship the changes with the 3.0 release. Through this project, I formed connections across many teams to learn about the product and company, resolve technical and design conflicts, and solicit feedback. The project shipped alongside the Form 3 launch and was regarded internally as the smoothest UI project the company had completed to that point.

PrecisionHawk: PrecisionFlight

A DJI Phantom in flight, controlled using DataMapper InFlight.

My main project at PrecisionHawk is the development of the PrecisionFlight application. This application controls commercially available DJI quadcopters to autonomously collect images over a target area, or around a target of interest. After these images are collected, they can be easily uploaded to our processing platform, PrecisionMapper, and stitched together into a high-resolution map or 3D model. These outputs can then be analyzed to inform business decisions, and we work with companies to build solutions on top of these capabilities that fit their needs. More information is available on the PrecisionFlight product page, and the app can be downloaded from Google Play or the App Store.
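As a rough illustration of what autonomous area coverage involves (this is not PrecisionFlight's actual planner), here is a minimal sketch that lays out a serpentine "lawnmower" pattern of waypoints over a rectangular area, spacing the passes so adjacent image strips overlap enough to stitch. All names, the flat x/y coordinate model, and the default overlap are assumptions for demonstration:

```python
# Minimal sketch of serpentine ("lawnmower") survey planning -- illustrative
# only, not PrecisionFlight's implementation. Coordinates are flat meters.

def survey_waypoints(width_m, height_m, footprint_w_m, sidelap=0.7):
    """Generate (x, y) waypoints covering a width_m x height_m area.

    Passes run along the y-axis; the spacing between passes leaves
    `sidelap` fractional overlap between adjacent image strips.
    """
    spacing = footprint_w_m * (1.0 - sidelap)  # distance between passes
    waypoints = []
    x, going_up = 0.0, True
    while x <= width_m:
        start, end = (0.0, height_m) if going_up else (height_m, 0.0)
        waypoints.append((x, start))  # fly one full pass...
        waypoints.append((x, end))
        x += spacing                  # ...then shift over and reverse
        going_up = not going_up
    return waypoints

print(survey_waypoints(100, 60, footprint_w_m=30))
```

A real planner also has to account for altitude, frontlap along each pass, GPS coordinates, and airspace constraints, but the serpentine-with-overlap idea is the core of it.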

I serve as the technical lead of this application, which involves setting technical direction, contributing code, managing stakeholder requests, providing creative direction to designers, and owning the product vision for our team. Through the release process, I have worked with a team of great mobile developers, designers, and product managers who enabled the release of an iOS version of the application and have continued to add new features to both versions of the app.

For more about the design of PrecisionFlight, check out this Medium post, which details a moment of inspiration behind the experience the application creates.

Project Aperture [User Study]

Occasionally, I study the impact of new technologies by developing prototype applications for them. In Project Aperture, I developed a system that allowed users to “right click” on reality using Google Glass hardware. This served as both a user study for smart glasses hardware, and an experiment around potential use cases for augmented reality.


The Aperture application allowed people to look directly at any building and tap the glasses for more information about it. The implementation I developed sourced place information from Google Maps, using a custom algorithm based on the user's location and compass heading to identify which structure they were facing. The system would also tell the user if any of their friends were at a particular location, and give aggregate reviews of restaurants and other types of locations.* I then ran a small user study of the application in downtown Raleigh to collect opinions (via survey) and reactions (via observation) to the technology.
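The identification step can be sketched simply: compute the bearing from the user to each nearby candidate place, and pick the place whose bearing is closest to the compass heading. This is a simplified reconstruction of the idea rather than the actual Aperture code, and the candidate places below are made up:

```python
import math

# Simplified reconstruction of the "what am I looking at?" idea --
# not the actual Aperture algorithm.

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def identify(user_lat, user_lon, heading_deg, places):
    """Return the place whose bearing best matches the compass heading."""
    def angular_error(place):
        diff = abs(bearing_deg(user_lat, user_lon,
                               place["lat"], place["lon"]) - heading_deg)
        return min(diff, 360 - diff)  # wrap around the compass
    return min(places, key=angular_error)

# Hypothetical candidates near the user (e.g., from a Maps query).
places = [
    {"name": "City Museum", "lat": 35.7800, "lon": -78.6390},
    {"name": "Coffee Shop", "lat": 35.7796, "lon": -78.6382},
]
print(identify(35.7794, -78.6389, 20.0, places)["name"])
```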

The data I collected from this experiment heavily informed the way I think about wearable tech and augmented reality, and I plan on exploring the potential of these concepts more in the near future.

* All contextual data presented (except the identity of the building itself) was randomly generated by the prototype for demonstration purposes.

DXLab / PrecisionHawk: LATAS

PrecisionHawk drones we used to test LATAS against the sunrise in California.

My final project at DXLab was the Low Altitude Tracking and Avoidance System (LATAS) for PrecisionHawk: an air traffic control system for drones that provides collision avoidance, flight planning, and tracking/replay capabilities to UAV pilots. A detailed portfolio piece can be found on the DXLab website, and the system's capabilities are detailed at the LATAS website.

My role in this project was to develop a complete software prototype of the system, including a backend API and a web client for pilots to track and manage flight operations. The backend API interfaced directly with our hardware prototype, which meant working closely with our embedded engineer both technically and from a product management perspective. I helped coordinate our efforts across the two systems, contributed to embedded development and testing when needed, and established low-level timelines to make sure we met our high-level goals and deadlines.
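As a rough illustration of what such a backend does (this is a toy sketch, not the LATAS code): an HTTP endpoint ingests position reports from trackers and flags aircraft flying too close together. The route name, payload fields, and separation threshold here are all hypothetical:

```python
# Toy sketch of a telemetry-ingestion API in the spirit of the LATAS
# prototype -- route, payload fields, and thresholds are hypothetical.
import math
from flask import Flask, jsonify, request

app = Flask(__name__)
latest = {}  # drone_id -> most recent position report

SEPARATION_M = 150  # hypothetical minimum horizontal separation

def rough_distance_m(a, b):
    """Flat-earth distance approximation, fine at low-altitude drone ranges."""
    dlat = (a["lat"] - b["lat"]) * 111_000
    dlon = (a["lon"] - b["lon"]) * 111_000 * math.cos(math.radians(a["lat"]))
    return math.hypot(dlat, dlon)

@app.route("/positions", methods=["POST"])
def report_position():
    report = request.get_json()  # {"id": ..., "lat": ..., "lon": ...}
    latest[report["id"]] = report
    # Naive all-pairs check against every other aircraft we know about.
    conflicts = [
        other["id"] for other in latest.values()
        if other["id"] != report["id"]
        and rough_distance_m(report, other) < SEPARATION_M
    ]
    return jsonify({"ok": True, "conflicts": conflicts})

if __name__ == "__main__":
    app.run()
```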

I also represented our client at multiple investor and partner meetings to demonstrate the system's capabilities, and traveled to NASA to test our integration with their efforts. Our team demonstrated the system to media outlets as well, getting featured in Bloomberg Business and attracting other media attention throughout the development process.

Photo of the system taken during the prototyping stage.

DXLab / DXVentures: Nicotrax

One of my first projects while working at DXLab was to aid Nicotrax, Inc. in their Android and web development. I served as architect and project manager for the Android application (managing a team of four graduate students at NC State University and one Nicotrax employee) and as lead developer of the web client. I also provided input to the design process, with the rest of the team taking point on that part of development.

This project was done as part of the DXVentures design incubator, a program designed to give startups the resources they need to design great products that solve genuine user problems. Companies participate in a 3-month curriculum where they revisit their target market/problem space, go through the ideation process to generate effective solutions, and build out functional prototypes to test hypotheses about how people use their products.

Within those three months, the DXLab team helped Nicotrax develop a complete hardware and software prototype and get the solution into the pockets of users for testing. All of us benefited greatly from working together: Nicotrax gained much-needed manpower for their idea, DXLab got the exposure of an incubator success, and I gained experience in product management and in working with embedded systems.

A more detailed portfolio piece on the process can be found on the DXLab website.

The Nicotrax mobile app and hardware prototype.

Epic Games: Oculus Rift VR Project [Research]

Through my User Experience class at NC State University, I got the opportunity to do research for Epic Games on which factors provide the best experience in virtual reality environments. Our team studied the effect of user-controlled motion on the user's virtual reality experience. To do this, we set up an experiment in which users had either active control (they controlled their own motion) or passive control (the experimenter controlled the user's motion) within a virtual environment.

Our hypothesis was that active control would make for a better, more engaging experience in all cases. We tested this across different types of control, and tested users’ engagement through a standardized Game Engagement Questionnaire. This survey separated engagement into four distinct measures: absorption, flow, immersion, and presence. We found that while users reported higher scores for absorption and immersion in the active control scenario, they reported lower scores for flow and presence in the same trials. Comparison against other research in the field led us to conclude that a hybrid form of control – a pilot/co-pilot relationship between the user and their environment – is actually best for introducing people to virtual reality experiences.
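Scoring works by grouping questionnaire items into their subscales and averaging each group per participant. A small sketch of that aggregation follows; the item-to-subscale mapping shown is illustrative, not the GEQ's published key:

```python
# Illustrative subscale scoring for an engagement questionnaire.
# The item-to-subscale mapping is made up for demonstration; the
# real GEQ has its own published scoring key.
from statistics import mean

SUBSCALES = {
    "absorption": ["q1", "q5", "q9"],
    "flow":       ["q2", "q6", "q10"],
    "immersion":  ["q3", "q7", "q11"],
    "presence":   ["q4", "q8", "q12"],
}

def score(responses):
    """Mean response per subscale for one participant (item -> 1-5 rating)."""
    return {name: mean(responses[item] for item in items)
            for name, items in SUBSCALES.items()}

# One participant's (synthetic) responses to twelve items.
participant = {f"q{i}": (i % 5) + 1 for i in range(1, 13)}
print(score(participant))
```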

DXLab: AutoUber [Concept]

Artwork developed for the AutoUber concept piece.

One of the earliest pieces of writing I contributed to at DXLab was our autonomous vehicle concept, AutoUber. In this piece, we explored what self-driving cars could mean for transportation, and how the way we get from point A to point B could fundamentally change when the car becomes a blank canvas for the ride experience. The piece is hosted on the DXLab website.

Glass CC [Research]

During my Human-Computer Interaction course in undergrad, I took on the challenge of developing an accessibility application for Google Glass with two graduate students. The purpose of this project was to closed-caption live conversations as they happened, with the future goal of offering real-time translation for people acclimating to foreign countries. The application could also help those who are hard of hearing as an alternative to hearing aids.

On top of gaining experience with Google Glass, this project was informative in other ways as well. As my first foray into speech processing, it disproved some core assumptions I had about natural language understanding (NLU). I was able to take the results and dig deeper to figure out why certain parts of the system did not perform as well as I had expected when designing our real-time recognition algorithm.

Below is a video demonstration taken near the end of the project. While the system is shown working in some capacity, the response time is unsatisfactory and the output is inaccurate. With additional time on this project, I would have revised the algorithm to let the speech recognition component establish the domain in which the user is speaking and draw on other linguistic hints to improve accuracy. I would also have rearchitected the data flow to allow for faster response.

(Note: this video was recorded before I discovered the nuances of how speech APIs work. Though I knock the Google Web Speech API a bit for its slow, inaccurate performance, further optimization did improve results, and I gained a fuller understanding of how speech systems work in the following months while figuring out what went wrong.)

[youtube https://www.youtube.com/watch?v=uXsSWksUsJg&w=420&h=315]
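To illustrate the "domain hints" idea mentioned above: given an n-best list of candidate transcripts from a recognizer, one simple approach is to boost hypotheses containing vocabulary from the conversation's domain before picking a winner. A minimal sketch, with entirely made-up hypotheses, scores, vocabulary, and weighting:

```python
# Sketch of domain-biased rescoring of recognizer output -- illustrative
# only. Hypotheses, confidences, vocabulary, and the bonus are made up.

DOMAIN_VOCAB = {"latte", "espresso", "oat", "milk"}  # e.g., ordering coffee
BONUS_PER_HIT = 0.05  # hypothetical boost per in-domain word

def rescore(nbest):
    """Pick the best (transcript, confidence) pair after a domain boost."""
    def biased(hypothesis):
        transcript, confidence = hypothesis
        hits = sum(word in DOMAIN_VOCAB for word in transcript.lower().split())
        return confidence + BONUS_PER_HIT * hits
    return max(nbest, key=biased)

nbest = [
    ("I'd like a lot, eh", 0.61),  # acoustically plausible, off-domain
    ("I'd like a latte", 0.58),    # slightly lower raw confidence
]
print(rescore(nbest)[0])  # the in-domain hypothesis wins after the boost
```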

Custom Guitar Build

As a musician, it can often be difficult to find an instrument that lets you express yourself to the fullest extent. When I was looking for a guitar to be my primary player, I couldn't find anything in stores that excited me, so I decided to design and build one instead.

The design for the guitar body.

I took on this project with only cursory knowledge of woodworking, and moderate familiarity with circuit design for the internal components. Each stage of the project was thoroughly researched online; I took the popular “measure twice, cut once” woodworking mentality to heart. The neck was spec’d out and purchased from Warmoth, and the body was a custom design I conceived and built from a large block of padauk, an African hardwood with tonal qualities similar to maple. By the end of the project, I was a much more proficient woodworker (I have since advised and supervised others in the shop), and the instrument I created remains my only guitar today. (The wood color shown in the picture is 100% natural, with only an oil finish applied at the end of the project.)

The electronics are also slightly different from most guitars: mine includes a true bypass toggle for the bridge pickup. When engaged, this switch bypasses the potentiometers and selector that control volume, tone, and pickup selection, and instead passes the raw bridge-pickup signal directly to the output. The result is a more aggressive tone, particularly useful for hard-hitting melody lines and solos.

The finished product.

Sound examples from this guitar can be found on my SoundCloud page, including the songs Flying Free and The Itch.

Broken [iOS]

I took on my first major software project in late high school: the creation and release of a game for the iPhone and iPod Touch. At the time, these devices were cutting-edge, offering features like touch screens, accelerometers, and a full app development ecosystem.

The premise of the game was simple: use the accelerometer to rotate the paddle around the mass of blocks and break your way to the core. Power-up blocks let you grow or shrink your paddle and adjust the speed of the ball.
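The control scheme is simple enough to sketch: read the device tilt from the accelerometer, turn it into an angle, and place the paddle on a circle around the block mass at that angle. A rough Python illustration of the idea, where the coordinates, radius, and tilt mapping are all made up for demonstration:

```python
import math

# Rough sketch of the tilt-to-paddle mapping; the original game used a
# different language and engine, so this is just the idea.

CENTER = (160, 240)   # center of the block mass (screen coordinates, y down)
RADIUS = 130          # paddle orbit radius

def paddle_position(accel_x, accel_y):
    """Map device tilt (accelerometer x/y) to a point on the orbit circle."""
    angle = math.atan2(accel_y, accel_x)  # tilt direction becomes orbit angle
    return (CENTER[0] + RADIUS * math.cos(angle),
            CENTER[1] + RADIUS * math.sin(angle))

print(paddle_position(0.0, -1.0))  # device held upright -> paddle at top
```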

I developed this game using the Torque game development framework for iOS, which involved learning a proprietary language built on top of the iOS development SDK. It was a valuable learning experience at that stage in my career, and taught me about everything from code structure, to product testing/release processes, to marketing and branding. Most importantly, it taught me just how much effort goes into making a single application, and how to adapt and learn in different phases of product development.

A screenshot from the game on the original iPhone.