BRAIN HQ
Architecting a C++ iOS/Android sprite engine in OpenGL.

PROGRAMMING

THE CHALLENGE

We needed to implement a core training regimen that spanned almost 40 hours of unique content and consisted of visual and auditory presentations that had to be accurate to a single frame. The training needed to work on multiple platforms, and the neurologists who were going to be programming the training had no prior exposure to mobile development. To reduce the QA burden across platforms, I decided to implement the core training in OpenGL ES using vanilla C++. The challenge was creating a simple, stable environment in which people new to C++ could be productive.

The OpenGL layer manages the visual and auditory presentation of all the training, holds all data models, manages provisioning, maintains and resolves online/offline cache states, and manages the social elements of the program. This allows us to use exactly the same code on iOS and Android; the only difference between the code bases is the JNI/Objective-C bindings.
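A minimal sketch of that split, assuming a hypothetical `EngineCore` class and `engine_*` entry points (the real engine's names are not shown here): the shared C++ core compiles unchanged for both platforms, and only a thin C-linkage surface is wrapped per platform by JNI or Objective-C++.

```cpp
#include <cstdint>

namespace core {
// The shared engine: identical source compiled for iOS and Android.
class EngineCore {
public:
    void tick(double dtSeconds) { elapsed_ += dtSeconds; ++frames_; }
    std::uint64_t frames() const { return frames_; }
    double elapsed() const { return elapsed_; }
private:
    double elapsed_ = 0.0;
    std::uint64_t frames_ = 0;
};
}  // namespace core

// Thin C ABI surface: the only code that differs per platform is the
// shim that calls it (a JNI wrapper on Android, Objective-C++ on iOS).
static core::EngineCore g_engine;

extern "C" void engine_tick(double dt) { g_engine.tick(dt); }
extern "C" unsigned long long engine_frames() { return g_engine.frames(); }
```

Keeping the platform-specific layer down to a handful of `extern "C"` calls is what makes a single QA pass meaningful across both stores.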

MAKING C++ PALATABLE

The training exercises only need to remain in memory for 2-5 minutes. Once an exercise was completed, it was pulled from memory and another task was instantiated. To make things safer, I implemented an extensive publisher/subscriber pattern that reduced the need to hold dependencies on various instances in the program. A developer could simply subscribe to button presses, timeout windows, progressions, animation states, audio states, particle systems, etc. without needing to reference the instance of the class and risk access faults during cleanup. This made it possible to remove all dependencies before memory was deallocated.
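A condensed sketch of that publish/subscribe layer (the `EventBus` name and string-keyed topics are illustrative, not the production API): subscribers receive a token and unsubscribe before teardown, so no raw pointer into an exercise survives its deallocation.

```cpp
#include <algorithm>
#include <functional>
#include <map>
#include <string>
#include <utility>
#include <vector>

class EventBus {
public:
    using Handler = std::function<void(const std::string& payload)>;
    using Token = int;

    // Register a handler for a topic; the returned token is the only
    // thing the subscriber has to keep.
    Token subscribe(const std::string& topic, Handler h) {
        Token id = nextToken_++;
        subs_[topic].push_back({id, std::move(h)});
        return id;
    }

    // Called during exercise cleanup, before any memory is released.
    void unsubscribe(const std::string& topic, Token id) {
        auto& v = subs_[topic];
        v.erase(std::remove_if(v.begin(), v.end(),
                               [id](const Sub& s) { return s.id == id; }),
                v.end());
    }

    void publish(const std::string& topic, const std::string& payload) {
        auto it = subs_.find(topic);
        if (it == subs_.end()) return;
        for (auto& s : it->second) s.handler(payload);
    }

private:
    struct Sub { Token id; Handler handler; };
    std::map<std::string, std::vector<Sub>> subs_;
    Token nextToken_ = 0;
};
```

Because exercises never hold each other's pointers, tearing one down is just a matter of walking its subscription tokens.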

MEMORY

Memory management is one of the great powers, and curses, of C++. To make things easier, I implemented a system of smart pointers that used reference counting to purge dead memory. I also mediated access to larger memory partitions, such as visual and auditory assets. All textures were managed in texture atlases that used a BSP algorithm to optimize texture memory in real time. The UV, color, vertex, and face indices were also managed outside the user's control, making it possible to purge the majority of memory without the developer even realizing it had been allocated.
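A sketch of the reference-counting idea using an intrusive count on engine resources (the class names here are illustrative; the engine's actual smart-pointer types are not shown): an object deletes itself when its last reference is released, so exercise authors never call `delete`.

```cpp
// Base class for engine-managed resources: the count starts at 1,
// owned by whoever created the object.
class RefCounted {
public:
    void retain() { ++count_; }
    void release() {
        if (--count_ == 0) delete this;  // self-destruct on last release
    }
    int refCount() const { return count_; }
protected:
    virtual ~RefCounted() = default;  // only release() may destroy us
private:
    int count_ = 1;
};

// A texture-like resource whose backing memory the engine can purge.
class Texture : public RefCounted {
public:
    static int liveCount;  // instrumentation for this sketch only
    Texture() { ++liveCount; }
    ~Texture() override { --liveCount; }
};
int Texture::liveCount = 0;
```

The same retain/release discipline extends naturally to atlas pages: when the last sprite referencing a page is released, its region is returned to the BSP allocator.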

INTERFACE ELEMENTS

Implementing our own graphics layer also made it easier to maintain an extremely consistent user experience across all platforms. The engine contains a catalog of event-driven user interface elements that are easy to create and customize, including toggles, buttons, timers, and various types of sprites. All visual elements are easily animatable, and an event is issued whenever an animation completes.
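To show the event-driven shape of these elements, here is a hypothetical `Button` with a fade-in animation that fires a completion callback from the engine's per-frame update; the names and the single-callback design are illustrative (in the real engine such events flow through the publisher/subscriber layer).

```cpp
#include <functional>
#include <utility>

class Button {
public:
    void setOnAnimationComplete(std::function<void()> cb) {
        onDone_ = std::move(cb);
    }
    void startFadeIn(double durationSec) {
        remaining_ = durationSec;
        animating_ = true;
    }
    // Called once per frame by the engine's update loop.
    void update(double dt) {
        if (!animating_) return;
        remaining_ -= dt;
        if (remaining_ <= 0.0) {
            animating_ = false;
            if (onDone_) onDone_();  // animation-complete event
        }
    }
private:
    std::function<void()> onDone_;
    double remaining_ = 0.0;
    bool animating_ = false;
};
```

Chaining these completion events is how exercise authors sequenced frame-accurate presentations without touching the render loop.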

CRASH DETECTION

Centralizing the logs was an important step in maintaining stability. The app ran in Java, C++, Objective-C, and Swift, so consolidating all the logs into a single source was essential to tracking down and fixing bugs. After testing a number of solutions, Crashlytics provided the best crash reporting across all languages and quickly became the primary source for determining the app's stability. Our crash-free user rate usually hovers around 99.8%, which isn't great, but it could be worse given the situation.
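One way to funnel four languages into a single log stream, sketched with a hypothetical C-linkage sink (`app_log` and its tagging scheme are assumptions, not the app's real API): Java calls it through JNI, and Objective-C/Swift call it directly, so every breadcrumb lands in one ordered buffer before being forwarded to the crash reporter.

```cpp
#include <cstdio>
#include <string>
#include <vector>

namespace {
// Single ordered buffer; in production this would feed the crash
// reporter's breadcrumb API rather than live in memory.
std::vector<std::string> g_log;
}

// C ABI so JNI, Objective-C, and Swift can all reach the same sink.
extern "C" void app_log(const char* lang, const char* msg) {
    g_log.push_back(std::string("[") + lang + "] " + msg);
    std::fprintf(stderr, "%s\n", g_log.back().c_str());
}

extern "C" unsigned long app_log_count() {
    return static_cast<unsigned long>(g_log.size());
}
```

Tagging each entry with its originating language makes a cross-language crash readable as one timeline instead of four disjoint logs.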