The following are highlights of projects I've worked on over the years.

VirtuePlay (1999-2002, and 2006-2009)

Like most programmers, I spend a good chunk of my free time programming: twiddling away on personal projects, learning new skills, and so on. Occasionally, these side projects are the result of some frustrating aspect of the day-to-day job, often spurred by the thought that “I can code it better.” Prior to starting VirtuePlay, my colleagues and I made numerous prototypes of game engines and tools based on the supposition that concepts like WYSIWYG and rapid-application development could also make game development better. We came up with new methods and strategies, driven by our shared frustration with the mediocrity of the state of the art and the poor working conditions at game companies of that era. So, in early 2000, we decided to bring our side projects and ideas to the forefront and start a company…

Our little group was able to secure capital, so we spent the first couple of years honing our core tech and writing demos.

A few of our early investors had connections to organizations that gave us special access to new hardware and unique datasets. With that, and the need to demonstrate our tech, we built Lunar Explorer: a fully immersive, interactive virtual reality simulation of the entire Moon, where a user could traverse and explore any part of the lunar surface as well as intercept objects of interest in orbit. We modeled accurate terrain and imagery sourced from NASA’s Clementine and Lunar Prospector missions, and created highly detailed artifacts for all US and Russian landing sites. For the display output, we used a Rockwell-built, military-grade stereoscopic head-mounted display (HMD) combined with 9-DOF sensors to track head and body movement. (Keep in mind, this was over a decade before Oculus!) The user was able to fly to a site and walk around in virtual reality. It was pretty cool for its time and was featured at numerous events, including events sponsored by NASA. We even had “Buzz” Aldrin try it out!

One of my primary contributions to Lunar Explorer was the planetary-scale, real-time terrain rendering system. The implementation was based on ROAM (real-time optimally adapting mesh), which ended up being similar to what is described in ‘Real-Time Optimal Adaptation for Planetary Geometry and Texture: 4-8 Tile Hierarchies’ (Hwa et al., 2005).
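
At its core, ROAM keeps a binary triangle tree over each terrain patch and splits or merges triangles from frame to frame based on a screen-space error metric. Here is a minimal sketch of the refinement test, assuming illustrative field names rather than the actual Lunar Explorer data structures:

```cpp
#include <cmath>

// A node in a ROAM-style binary triangle tree. geometric_error is the
// maximum height deviation introduced by approximating this patch with a
// single triangle; the fields are illustrative, not the original engine's.
struct BinTriangle {
    BinTriangle* left = nullptr;   // children created by a split
    BinTriangle* right = nullptr;
    float geometric_error;         // worst-case vertical error (meters)
    float center[3];               // patch center in world space (meters)
};

// Split when the world-space error, projected to the screen using a
// small-angle approximation, exceeds the pixel tolerance.
bool should_split(const BinTriangle& t, const float eye[3],
                  float pixels_per_radian, float tolerance_px) {
    const float dx = t.center[0] - eye[0];
    const float dy = t.center[1] - eye[1];
    const float dz = t.center[2] - eye[2];
    const float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    const float error_px = (t.geometric_error / dist) * pixels_per_radian;
    return error_px > tolerance_px;
}
```

ROAM then drives its split and merge queues by this projected error, which is what lets the mesh adapt in real time as the camera moves.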

We created numerous cool demos and licensed our tech to game companies, but had not yet built a full-scale game of our own. So, we hunkered down to create Lunar Racing Championship:

The following is a video capture of wheel physics development used in Lunar Explorer (circa 2001):

Over the years at VirtuePlay, I contributed heavily to the core engine, acted as lead programmer on numerous projects, was responsible for our rapid-application development tools, and helped create a forward-leaning micro-kernel operating system built on our core game tech. I had a hand in every aspect of architecture and coding.

SpaceShipOne (2002-2005)

I joined Scaled Composites in 2002 and was given the massive responsibility of designing and developing the spacecraft avionics: what would become the System Navigation Unit (SNU) and Flight Director Display (FDD). My primary objective was to work closely with the test pilots to create an optimal user experience and interface that satisfied all modes of flight. The flight director logic also required significant collaboration with the aeronautical engineers and spacecraft designers. My background as a pilot, and prior tinkering in aerospace engineering, helped tremendously in producing a successful outcome. We went through hundreds of iterations and deployed many versions to simulated and in-flight testing, in which I always participated. As for the hardware, we concluded that we didn’t have time to develop our own solution from scratch, so we derived our SNU and FDD hardware units from a previously proven architecture developed by a third party. Unfortunately, that architecture was graphically limited, with only rudimentary graphics hardware acceleration, which required extra vigilance in how we implemented the display graphics. The avionics software was implemented in C++, C, and assembly, and included a large amount of SIMD code to squeeze out maximum performance for both the display graphics and the flight director logic.
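
As a hedged illustration of that kind of SIMD work (the actual FDD code and its target instruction set aren't public), here is a minimal SSE-style sketch that rotates a batch of 2D display vertices, four at a time:

```cpp
#include <xmmintrin.h>  // SSE intrinsics
#include <cmath>
#include <cstddef>

// Rotate 2D symbology vertices (structure-of-arrays layout) by a fixed
// angle, processing four points per iteration. Purely illustrative of the
// technique; names and layout are assumptions, not the actual avionics code.
void rotate_points(float* xs, float* ys, std::size_t n, float theta) {
    const float cs = std::cos(theta), sn = std::sin(theta);
    const __m128 c = _mm_set1_ps(cs);
    const __m128 s = _mm_set1_ps(sn);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        const __m128 x = _mm_loadu_ps(xs + i);
        const __m128 y = _mm_loadu_ps(ys + i);
        _mm_storeu_ps(xs + i, _mm_sub_ps(_mm_mul_ps(x, c), _mm_mul_ps(y, s)));
        _mm_storeu_ps(ys + i, _mm_add_ps(_mm_mul_ps(x, s), _mm_mul_ps(y, c)));
    }
    for (; i < n; ++i) {  // scalar tail for leftover points
        const float x = xs[i], y = ys[i];
        xs[i] = x * cs - y * sn;
        ys[i] = x * sn + y * cs;
    }
}
```

The structure-of-arrays layout is what makes the four-wide loads and stores contiguous, which is where most of the speedup comes from.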

I also made many contributions to the SpaceShipOne flight simulator and data acquisition and telemetry systems.

I promise I’ll write more about my experience at Scaled Composites soon!

SpaceShipOne and WhiteKnightOne captive carry flight.

SpaceShipOne cockpit with the Flight Director Display (FDD) at center.

SpaceShipOne FDD graphics. From left to right, modes of flight: drop, boost, coast/reentry and glide.

TierOne program elements. From upper-left to right: mobile nitrous oxide delivery system (MONODS), rocket motor test stand trailer (TST), WhiteKnightOne, SpaceShipOne, and the SCUM (Scaled Composites unit mobile).

Me with SSO at the Scaled Composites main hangar.

PRIME (2009)

In 2009, I joined Intific (formerly Total Immersion) to work on a “fundamentals training program” for the USAF. Predator/Reaper Integrated Mission Environment, or “PRIME,” was a program originally sponsored by the Air Education and Training Command (AETC) out of Randolph Air Force Base, Joint Base San Antonio, Texas, to solve the problem of limited access to “big metal” Reaper and Predator simulators, especially for fundamental RPA training. It aspired to emulate the functions of an MQ-9 Reaper Ground Control Station (GCS) for both Pilot and Sensor Operator (SO) in a cost-effective, networked PC hardware configuration, in either a classroom or laboratory setting. By leveraging commercial off-the-shelf (COTS) hardware, students could run PRIME on anything from a single laptop, as an offline “task trainer,” up to a networked, multi-computer, near-realistic simulation environment for combined pilot, SO, and instructor training. It models the Holloman-White Sands Test Range terrain area with numerous population centers and infrastructure, and can be populated with dynamic entities including people and vehicles. All displays and menu functionality are accurately modeled on the MQ-9 Reaper system with the proper form factor and behavior, including weapons, the EO/IR sensor, satellite delay, and so on.

Early in development, it became abundantly clear that a full MQ-9 systems simulation would require a unique architecture to handle the many interacting systems while keeping their development tractable. The solution was a visual programming environment (language and tools) with a flow-parallel runtime based on an entity-component-system (ECS) architecture and Harel statecharts for conditional logic. The runtime core leveraged an innovative series-parallel partial ordering algorithm to transform raw dataflow graphs, created by both programmers and designers, into DAG definitions with optimal data parallelism, resulting in high simulation throughput. The nickname for this environment became “DFL.” Meh.
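
To give a feel for the idea (this is a simplified stand-in for DFL's actual series-parallel algorithm, with hypothetical node and edge representations), a dataflow DAG can be partitioned into stages whose members have no unmet dependencies, so everything within a stage may execute in parallel:

```cpp
#include <utility>
#include <vector>

// Partition a dataflow DAG into sequential "stages": every node in a stage
// has all of its inputs produced by earlier stages, so nodes in the same
// stage can run concurrently. This is Kahn's algorithm with level grouping,
// a much simpler scheme than DFL's series-parallel partial ordering.
std::vector<std::vector<int>> schedule_stages(
    int node_count, const std::vector<std::pair<int, int>>& edges) {
    std::vector<int> indegree(node_count, 0);
    std::vector<std::vector<int>> successors(node_count);
    for (const auto& e : edges) {
        successors[e.first].push_back(e.second);
        ++indegree[e.second];
    }
    std::vector<std::vector<int>> stages;
    std::vector<int> ready;
    for (int v = 0; v < node_count; ++v)
        if (indegree[v] == 0) ready.push_back(v);
    while (!ready.empty()) {
        stages.push_back(ready);  // this whole stage can run in parallel
        std::vector<int> next;
        for (int v : ready)
            for (int w : successors[v])
                if (--indegree[w] == 0) next.push_back(w);
        ready = std::move(next);
    }
    return stages;  // execute stages in order, with a barrier between each
}
```

DFL's real algorithm recovered richer series-parallel structure than these simple levels, but the barrier-between-stages execution model conveys how independent systems end up running concurrently.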

The flight model and physics engine for the MQ-9 were implemented with DFL, initially based on sparse, publicly available specifications. Detailed specifications for the MQ-9 were the closely guarded intellectual property of General Atomics (GA). Unfortunately, GA was not officially part of the development program, so we relied on senior MQ-9 pilots and sensor operators to provide guidance on the accuracy of our simulation throughout its development.

Another major hurdle we faced was how to accurately render a complex 3D environment given the unique capabilities of the MQ-9 sensor systems. The Reaper includes a high-resolution, multi-spectral targeting sensor with multiple high-power zoom levels and a laser rangefinder/designator, as well as a synthetic aperture radar system. Since no game graphics engine at the time supported large-world scenes, planetary-scale terrain, and multi-spectral rendering, we were forced to create our own. And like the flight model and physics engine, the rendering system was integrated with DFL.

Over the years, the PRIME simulator has helped train hundreds of pilots and sensor operators, and is still in operation today.

DARPA PCAS (2011)

Persistent Close Air Support (PCAS) was a DARPA program with the goal of demonstrating dramatic improvements in close air support (CAS) capabilities by developing a system to provide continuous CAS availability and lethality to Joint Terminal Attack Controllers (JTACs). We proposed an agile, end-to-end adaptive simulation environment (based on DFL) that allowed iterative design and development of software and hardware components while developing new tactics, techniques, and procedures, all guided by feedback from the JTACs. Our proposal was a bit of a long shot since we had never won a TTO (Tactical Technology Office) program and were going up against “the bigs.” But it worked. The DARPA Program Manager (PM) agreed with our supposition that sprinting ahead with software simulation and hardware-in-the-loop to develop an MVP, before spending big bucks on bending real metal, was the appropriate course of action. We were selected as a Task “B” performer to optionally support the two competing Task “A” performers. Throughout our period of performance, our team worked closely with the two “bigs,” and we continued to work with the down-selected Task “A” performer in subsequent program phases.

AFRL ATAK for PCAS

The Android Tactical Assault Kit (ATAK) is an Android geospatial infrastructure and military situational awareness app for precision targeting, surrounding land formation intelligence, navigation, and data sharing. During our PCAS period of performance, I leveraged ATAK as our primary tablet UI framework to create prototypes of PCAS-specific JTAC UI features. I was the primary researcher, designer, and developer for much of the tablet UI work.

Prototype PCAS-specific additions to ATAK

DARPA Plan X (2012)

Background: Plan X was a Defense Advanced Research Projects Agency (DARPA) program that aimed to develop a defensive platform for the Department of Defense to plan for, conduct, and assess cyberwarfare in a manner similar to kinetic warfare.

Plan X functional concept featured at DARPA Demo Day 2016

Over a period of a few months in late 2011, I participated in the Plan X proposal development process with the primary objective of capturing Task Area (TA) 5, “Intuitive Interfaces,” and a secondary objective as a supporting component of TA1, “Infrastructure.” We won both bids. I was chosen to lead the Intific effort as the Principal Investigator (PI). The position required frequent interaction with the DARPA PM and SETAs (Systems Engineering and Technical Assistance contractors) to gather requirements and guide the design and development of features. Additionally, Plan X had multiple task areas, some with more than one performer, and each had a PI requiring persistent collaboration (the N was large).

Next-Gen Combat Aircrew Display System (2016)

In 2014, Intific was acquired by the Cubic Corporation and subsequently put under the wing of Cubic Defense Systems. This is not widely known, but Cubic happens to be the primary developer of the ACMI (Air Combat Maneuvering Instrumentation) system used by the USAF (and many of our allies) to record an aircraft’s in-flight state during air combat training. I assume you’ve seen the movie Top Gun, right? No? Well, go watch it. There’s a post-BFM (basic fighter maneuvers) debriefing scene with Kelly and Tom that features the first-generation Cubic ACMI system.

‘If you think, you’re dead.’ – Maverick

I was brought in to lead the modernization effort for the aging second-generation ACMI Individual Combat Aircrew Display System (ICADS). This was an exciting opportunity, but there were some difficult challenges to contend with.

After an exhaustive trade analysis, combined with my experience developing Plan X’s complex UI and visualizations with web technologies, I decided to build the next-gen ICADS system on modern web application technologies. React and Redux were used to implement the UI and application logic, while WebGL and WebAssembly were used heavily for 3D rendering and performance-critical, “main-loop” code. This recipe proved to be surprisingly successful.
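
As a sketch of how that split can work (the function and data layout here are hypothetical, not the actual ICADS interface), the performance-critical per-frame work lives in C++ compiled to WebAssembly behind a narrow C ABI, with the React/Redux layer calling into it:

```cpp
#include <cstddef>

// Performance-critical "main-loop" work compiled to WebAssembly (e.g., via
// Emscripten) and invoked from JavaScript each frame. This example
// dead-reckons aircraft positions between telemetry samples so the WebGL
// renderer can hold 60 FPS even when ACMI data arrives at a lower rate.
extern "C" void advance_tracks(float* positions,        // n * 3 floats: x,y,z
                               const float* velocities, // n * 3 floats: vx,vy,vz
                               std::size_t n, float dt) {
    for (std::size_t i = 0; i < n * 3; ++i)
        positions[i] += velocities[i] * dt;
}
```

One nice property of this arrangement is that the JS/wasm boundary is crossed once per frame (passing typed-array views into the wasm heap) rather than once per entity.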

This next-gen ICADS functional prototype was capable of rendering multiple complex visualizations at or near 60 FPS using web application technologies.

Next-gen ICADS Range Training Officer (RTO) display.

DARPA PROTEUS (2017)

The goal of the Prototype Resilient Operations Testbed for Expeditionary Urban Operations (PROTEUS) program is to create and demonstrate tools to develop and test agile expeditionary urban operations concepts based on dynamically composable force packages. If successful, the software tools and concepts developed in the PROTEUS program will enable assessment and exploration of new approaches to combined arms operations involving coordination of effects in multiple domains.

PROTEUS concept art.

Project IKE (2018)

As the Cyber National Mission Force (CNMF) evolves to a unified command structure, it needs tools to track the readiness, status, and activities of cyber operators. Additionally, CNMF leaders need a consolidated situational awareness picture of cyber threat indicators and known compromises, along with aids for course-of-action development. The OSD Strategic Capabilities Office (SCO) identified the potential to achieve these goals with DARPA’s Plan X, and initiated a prototype called Project IKE. I made significant contributions to the effort of securing Plan X as the Project IKE solution for OSD SCO and the USAF.

Optios (Present)

Optios is a leader in the rapidly emerging neuroperformance industry. Based on more than a decade of work at DARPA, hundreds of millions of dollars of proprietary research, and close partnerships with the world’s most elite organizations, Optios’ guiding mission is to build an intellectual framework and platform that supports the next phase in human development.