(My resume can be found here.)

VirtuePlay (1999-2002 and 2006-2009)

Lunar Explorer, circa 2002. My primary contribution to Lunar Explorer was the planetary-scale, real-time terrain rendering system based on ROAM (real-time optimally adapting mesh) as described in ‘Real-Time Optimal Adaptation for Planetary Geometry and Texture: 4-8 Tile Hierarchies’ (Hwa et al., 2005).
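
For flavor, here is a minimal, illustrative sketch of the core idea behind 4-8 mesh refinement – longest-edge bisection driven by projected screen-space error. All types, names, and thresholds here are hypothetical; the real Lunar Explorer system followed the full approach of Hwa et al. (2005), including texture tiles and split/merge priority queues, none of which is reproduced below.

```cpp
// Illustrative 4-8 terrain refinement via longest-edge bisection.
// Hypothetical types and constants; winding and crack-fixing details omitted.
#include <cmath>

struct Vec3 { float x, y, z; };

// A right isosceles triangle in the 4-8 hierarchy: apex opposite the hypotenuse.
struct Tri { Vec3 apex, left, right; int level; };

static Vec3 Midpoint(const Vec3& a, const Vec3& b) {
    return { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f };
}

// Project the triangle's geometric error to an approximate screen-space error.
static float ScreenError(const Tri& t, const Vec3& eye, float geomError) {
    Vec3 m = Midpoint(t.left, t.right);
    float dx = m.x - eye.x, dy = m.y - eye.y, dz = m.z - eye.z;
    return geomError / std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Recursively bisect the hypotenuse until the projected error is tolerable.
// Each split yields the two children of the 4-8 (diamond) hierarchy.
void Refine(const Tri& t, const Vec3& eye, float geomError,
            float tolerance, int maxLevel) {
    if (t.level >= maxLevel || ScreenError(t, eye, geomError) < tolerance) {
        // Emit t to the render batch here (omitted).
        return;
    }
    Vec3 m = Midpoint(t.left, t.right);
    Refine({ m, t.apex, t.left,  t.level + 1 }, eye, geomError * 0.5f, tolerance, maxLevel);
    Refine({ m, t.right, t.apex, t.level + 1 }, eye, geomError * 0.5f, tolerance, maxLevel);
}
```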

Prior to starting VirtuePlay, my soon-to-be co-founders and I built numerous prototypes of game engines and tools based on the supposition that concepts like WYSIWYG and rapid application development could make game development better, too. We devised new methods and strategies to address our shared frustration with the mediocre state of game development tools and the poor working conditions at game companies of that era. So, in early 2000, we decided to bring our side projects and ideas to the forefront and start a company!

Our little group was able to secure capital, and spent the first couple of years honing our core tech and writing demos. A few of our early investors had connections to organizations that gave us special access to new hardware and unique datasets. With that, and the need to demonstrate the cool features of our tech, we built a fully-immersive, interactive virtual reality simulation of the entire Moon, where a user could traverse and explore any part of the Lunar surface as well as intercept objects of interest in orbit. We modeled accurate terrain and imagery sourced from NASA’s Clementine and Lunar Prospector missions, and created highly detailed artifacts for all US and Russian landing sites. As a cherry-topper, we leveraged a Rockwell-built, military-grade stereoscopic head-mounted display (HMD), combined with 9-DOF sensors to track head and body movement. (Keep in mind, this was over a decade before Oculus!) The user was able to fly to a site and walk around artifacts in virtual reality. It was pretty cool for its time and was featured at numerous events and trade shows, including events sponsored by NASA. Even the great “Buzz” Aldrin came to visit us to try it out!

Over the years we created numerous cool demos and licensed our tech to a few game companies, but we had yet to build a full-scale game of our own. So we hunkered down to create Lunar Racing Championship:

Click here to watch the LRC trailer video.

Click here to watch the wheel physics used in LRC (circa 2004).

SpaceShipOne (2002-2005)

Click to watch a brief (5 min.) documentary about SpaceShipOne.

I was given the massive responsibility of designing and developing the spacecraft avionics – what would become the System Navigation Unit (SNU) and Flight Director Display (FDD). My primary objective was to work closely with the test pilots and aerospace engineers to create an optimal user experience and interfaces that satisfied all modes of flight. My background as a pilot, and prior exposure to aerospace engineering, helped produce a successful outcome. We went through numerous iterations, and deployed many versions to both simulated and real in-flight testing. The avionics software was implemented in C/C++ and assembly, including a large amount of SIMD code to squeeze out maximum performance for both display graphics and flight director logic. I also contributed heavily to the SpaceShipOne flight simulator, data acquisition, and ground telemetry systems.
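
To give a sense of what that kind of SIMD code looks like, here is a minimal sketch in the same spirit – emphatically not the actual flight code. It applies a first-order low-pass filter to four sensor channels at once using SSE intrinsics; the channel layout, filter constant, and function name are illustrative assumptions.

```cpp
// Illustrative SSE sketch: smooth four sensor channels in one pass with a
// first-order low-pass filter, state = alpha*sample + (1 - alpha)*state.
// Hypothetical layout; the real SNU/FDD code is not reproduced here.
#include <xmmintrin.h>

// 'state' and 'sample' each point to 4 contiguous, 16-byte-aligned floats
// (e.g., body rates and normal acceleration packed into one vector).
void FilterChannels4(float* state, const float* sample, float alpha) {
    __m128 s  = _mm_load_ps(state);
    __m128 x  = _mm_load_ps(sample);
    __m128 a  = _mm_set1_ps(alpha);
    __m128 na = _mm_set1_ps(1.0f - alpha);
    s = _mm_add_ps(_mm_mul_ps(a, x), _mm_mul_ps(na, s));
    _mm_store_ps(state, s);
}
```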

SpaceShipOne and WhiteKnightOne captive carry flight.

SpaceShipOne cockpit with the Flight Director Display (FDD) at center.

SpaceShipOne FDD graphics. From left to right, modes of flight: drop, boost, coast/reentry and glide.

TierOne program elements. From upper-left to right: mobile nitrous oxide delivery system (MONODS), rocket motor test stand trailer (TST), WhiteKnightOne, SpaceShipOne, and the SCUM (Scaled Composites unit mobile).

Me with SSO at the Scaled Composites main hangar.

PRIME (2009)

Click to watch the PRIME product review video.

Predator/Reaper Integrated Mission Environment, or “PRIME,” was a program originally sponsored by the Air Education and Training Command (AETC) out of Randolph Air Force Base, Joint Base San Antonio, Texas to solve the problem of limited access to “big metal” Reaper and Predator simulators – especially for fundamental remotely piloted aircraft (RPA) training. It aspired to emulate the functions of an MQ-9 Reaper Ground Control Station (GCS) for both Pilot and Sensor Operator (SO) in a cost-effective, networked PC hardware configuration, in either a classroom or laboratory setting. PRIME’s scalable software architecture allows deployments that range from a single laptop for solo, offline task training, all the way up to a “full-sized,” multi-computer, near-realistic simulation environment for combined pilot, SO and instructor training. All displays and menus are accurately modeled on the MQ-9 Reaper systems with the proper form factor and behavior, including weapons, EO/IR sensor, satellite delay, and so on.

I created a unique architecture that could handle a vast number of interacting systems, while also ensuring the software development time-frame remained tractable. The solution was a visual programming environment that included a flow-parallel runtime, an entity-component-system (ECS)-based compositional architecture, and Harel statecharts for conditional logic. The runtime core used series-parallel partial ordering to transform discrete system dataflow graphs – created by either programmers or designers – into a single DAG definition with near-optimal data parallelism for real-time simulation execution. The entire MQ-9 system, from the GCS to the flight model, physics engine, and GUI, was fully implemented in this architecture.
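
As a rough illustration of the scheduling idea (not the actual PRIME runtime), the sketch below levels a dataflow DAG using Kahn’s algorithm: every node in a stage has all of its inputs satisfied by earlier stages, so the nodes within a stage can run in parallel. The node indices and adjacency representation are assumptions made for the example.

```cpp
// Illustrative leveling of a dataflow DAG into parallel execution stages.
// adj[i] lists the nodes that consume node i's output (hypothetical encoding).
#include <cstddef>
#include <vector>

std::vector<std::vector<int>> BuildStages(const std::vector<std::vector<int>>& adj) {
    const std::size_t n = adj.size();
    std::vector<int> indegree(n, 0);
    for (const auto& outs : adj)
        for (int v : outs) ++indegree[static_cast<std::size_t>(v)];

    // The first stage is every node with no unsatisfied inputs.
    std::vector<int> frontier;
    for (std::size_t i = 0; i < n; ++i)
        if (indegree[i] == 0) frontier.push_back(static_cast<int>(i));

    // Peel off one stage at a time; nodes in a stage are mutually independent,
    // so a flow-parallel runtime can dispatch them across worker threads.
    std::vector<std::vector<int>> stages;
    while (!frontier.empty()) {
        stages.push_back(frontier);
        std::vector<int> next;
        for (int u : frontier)
            for (int v : adj[static_cast<std::size_t>(u)])
                if (--indegree[static_cast<std::size_t>(v)] == 0) next.push_back(v);
        frontier = std::move(next);
    }
    return stages;  // concatenated stages form a valid topological order
}
```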

The PRIME simulator has helped train hundreds of pilots and sensor operators, and is still in operation today.

DARPA PCAS (2011)

Watch the sample video.

Persistent Close Air Support (PCAS) was a DARPA program with the goal of demonstrating dramatic improvements in close air support (CAS) capabilities by developing a system to provide continuous CAS availability and lethality to Joint Terminal Attack Controllers (JTACs). We proposed an agile-based, end-to-end adaptive simulation environment based on PRIME that allowed iterative design and development of software and hardware components while developing new tactics, techniques and procedures – all guided by feedback directly from the JTACs using our system. Our proposal was a bit of a long shot since we had never won a TTO program and were going up against “the bigs.” But it worked. The DARPA Program Manager (PM) agreed with our supposition that sprinting ahead with software simulation and hardware-in-the-loop to develop an MVP, before spending big bucks on bending real metal, was the appropriate course of action. We were selected as a Task “B” performer to optionally support the two competing Task “A” performers. Throughout our period of performance, our team worked closely with the two “bigs,” and we continued to work with the down-selected Task “A” performer in subsequent program phases.

Me and a soon-to-be autonomous A-10C.

AFRL ATAK for PCAS

Prototype PCAS-specific additions to ATAK

The Android Tactical Assault Kit (ATAK) is an Android smartphone geospatial infrastructure and military situational awareness app used by special forces for precision targeting, surrounding terrain intelligence, navigation, and data sharing. I leveraged ATAK as our primary tablet UI framework to create prototypes of PCAS-specific workflows and UI features. I was the primary researcher, designer and developer for much of the tablet UI work.

DARPA Plan X (2012)

Plan X functional prototype featured at DARPA Demo Day 2016

Plan X was a DARPA program that aimed to develop a defensive platform for the Department of Defense to plan for, conduct, and assess cyber warfare in a manner similar to kinetic warfare.

In late 2011, we kicked off the Plan X proposal development process with the primary objective of capturing Task Area (TA) 5: “Intuitive Interfaces,” and a secondary objective of supporting TA1: “Infrastructure.” We won both bids. I was chosen to lead the Intific effort as the Principal Investigator (PI), responsible for meeting the needs of the DARPA PM, interfacing directly with the Systems Engineering and Technical Assistance (SETA) contractors, gathering requirements, and guiding the design and development of core program features. Plan X was a very complex program with many task areas and performers.

Next-Gen Air Combat Maneuvering Instrumentation System (2016)

Click here to watch a video about Cubic’s ACMI system.

In 2014, Intific was acquired by the Cubic Corporation and subsequently put under the wing of Cubic Global Defense Systems. Among Cubic’s many projects, the ACMI (Air Combat Maneuvering Instrumentation) system is its bread and butter. ACMI is used by the USAF and our allies to track the state of all aircraft participating in air combat training exercises (Red Flag, for example). It’s like laser tag, but on a much, much greater scale. The movie “Top Gun” features Cubic’s first ACMI system in a post-ACM (air combat maneuvers) debrief scene. Cheesy, indeed.

‘If you think, you’re dead.’ – Maverick

My objective was to modernize an aging, second-generation ACMI Individual Combat Aircrew Display System (ICADS). It was an exciting opportunity with many difficult challenges to contend with.

I decided to build the next-gen ICADS system as a modern web application, using the latest technologies and the experience gained developing Plan X’s complex UI and visualizations. React and Redux were leveraged to implement the core UI and application logic. WebGL and WebAssembly were used heavily for 3D rendering and performance-critical, “main-loop” code. This recipe proved to be surprisingly successful.
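
A hypothetical sketch of how the WebAssembly side of such a design can look (shown here with Emscripten; none of these names come from the real ICADS code): C++ owns the track state and exposes a step function plus a raw position buffer, while the React/WebGL layer calls the step once per animation frame and uploads the buffer to the GPU.

```cpp
// Illustrative Emscripten-compiled "main-loop" core for a track display.
// The JS side calls init(), then stepSimulation(dt) each requestAnimationFrame,
// and reads positions() directly out of the WASM heap for WebGL upload.
#include <emscripten/emscripten.h>
#include <cstddef>
#include <vector>

namespace {
std::vector<float> positionsXYZ;   // 3 floats per tracked aircraft
std::vector<float> velocitiesXYZ;  // 3 floats per tracked aircraft
}

extern "C" {

EMSCRIPTEN_KEEPALIVE void init(int aircraftCount) {
    positionsXYZ.assign(static_cast<std::size_t>(aircraftCount) * 3, 0.0f);
    velocitiesXYZ.assign(static_cast<std::size_t>(aircraftCount) * 3, 0.0f);
}

// Dead-reckon every track forward by dt seconds (real integration elided).
EMSCRIPTEN_KEEPALIVE void stepSimulation(float dt) {
    for (std::size_t i = 0; i < positionsXYZ.size(); ++i)
        positionsXYZ[i] += velocitiesXYZ[i] * dt;
}

// Pointer into the WASM heap; JS wraps it in a Float32Array without copying.
EMSCRIPTEN_KEEPALIVE float* positions() { return positionsXYZ.data(); }

}  // extern "C"
```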

The next-gen ICADS web interface is capable of rendering multiple complex visualizations at 60 FPS.

The next-gen ICADS system is highly modular to meet special use cases. Here’s a screen shot of the Range Training Officer (RTO) display.

DARPA PROTEUS (2017)

PROTEUS concept art.

The goal of the Prototype Resilient Operations Testbed for Expeditionary Urban Operations (PROTEUS) program is to create and demonstrate tools to develop and test agile expeditionary urban operations concepts based on “dynamically-composable” force packages. If successful, the software tools and concepts developed in the PROTEUS program will enable assessment and exploration of new approaches to combined arms operations involving coordination of effects in multiple domains.

Project IKE (2018)

As the Cyber National Mission Force (CNMF) evolves to a unified command structure, it needs tools to track the readiness, status, and activities of thousands of cyber operators. Additionally, CNMF leaders need a consolidated situational awareness picture of cyber threat indicators and known compromises, along with associated aids for course-of-action development. The OSD Strategic Capabilities Office (SCO) identified the potential to achieve these goals with DARPA’s Plan X, and initiated a prototype called Project IKE. I made significant contributions to the effort of securing Plan X as the de facto command and control (C2) solution for OSD SCO, the USAF, and Space Force.

Optios (Present)

Click to watch a promotional video about one of our latest projects.

Optios is a leader in the rapidly emerging neuroperformance industry. Based on more than a decade of work at DARPA, hundreds of millions of dollars of proprietary research, and close partnerships with the world’s most elite organizations, Optios' guiding mission is to build an intellectual framework and platform that supports the next phase in human development.