Ramping up an engineering department in a fast-growing startup


In 2008 I took on a role as lead engineer at a small company of six. Eighteen months later we released our flagship product and had grown to more than 160 people. This is a recounting of some of the things that worked, and some that didn't, as we grew.


Organizationally speaking, more than half the staff was concentrated in product development, with the rest of the employees in marketing, customer service, and HR. In addition, there was a sizable science department that brought with it HIPAA requirements and the potential for an FDA QMS process.


So many aspects of a good experience rest in an intangible area that can't be specified in documents. To help create a great user experience, we felt it was important to bring the programmers closer to the customer. We created an in-house user testing lab where our customers could come in and use our product for free. We encouraged the programmers to visit the lab and gave the programmers and designers ample opportunity to sit in on user testing sessions. We also worked hard to integrate customer service into the department: the reps would sit down with the programmers and explain issues first hand, and the engineers were given room to come up with solutions to address those problems.

COMMUNICATION

Early on I decided communication was key to growing fast. One of our first hires was a writer to help the engineers document their choices and the APIs they created. This made it easier to move people on and off projects and to bring new staff up to speed. It also gave us the opportunity to codify coding standards, development setup, and the production process.


As much as I wanted to implement code reviews and pair programming, neither ever really gelled with the team. I introduced code review initially, but one unfortunate aspect of code reviews is that they tend to be more critical than collaborative. We tried some experiments with pair programming, but in the end a combination of the team's culture and looming deadlines made it hard to sustain.


A lack of thorough vetting of third-party technologies also contributed to the downfall of code reviews. We spent a lot of time tracking down bugs and rewriting core functionality in code that wasn't part of our own codebase. In the end, our technology choices forced us to invest time in automated testing over collaborative coding.


The depth of the product combined with our technical choices made testing difficult. In response, we invested a fair bit of time creating an automated testing platform that greatly reduced acceptance testing time, let us pinpoint difficult issues rapidly, and greatly improved the stability of our product. The testing harness consisted of a Tcl runtime environment that ran the application in a separate thread. This allowed us to write scripts that executed code in the application, simulated user interactions, and verified the application's response.
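The original harness was built in Tcl, but the core idea — run the application on its own thread and let test scripts drive it through a command channel and check the responses — can be sketched in Python. Everything below (the `App` class, its `click` method, the queue-based protocol) is a hypothetical stand-in for illustration, not the actual harness:

```python
import queue
import threading

class App:
    """Hypothetical stand-in for the application under test."""
    def __init__(self):
        self.clicks = []

    def click(self, button):
        self.clicks.append(button)
        return f"clicked {button}"

class TestHarness:
    """Runs the app's command loop on a separate thread. Test scripts
    enqueue simulated user actions and read back the app's responses."""
    def __init__(self, app):
        self.app = app
        self.commands = queue.Queue()
        self.results = queue.Queue()
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def _loop(self):
        # Executes each queued command inside the application thread.
        while True:
            name, args = self.commands.get()
            if name == "quit":
                break
            result = getattr(self.app, name)(*args)
            self.results.put(result)

    def send(self, name, *args):
        """Simulate a user interaction and return the app's response."""
        self.commands.put((name, args))
        return self.results.get(timeout=5)

    def stop(self):
        self.commands.put(("quit", ()))
        self.thread.join(timeout=5)

# A test script: simulate a click and verify the application's response.
harness = TestHarness(App())
response = harness.send("click", "save")
assert response == "clicked save"
harness.stop()
```

Keeping the harness on a separate thread means a hung or crashed interaction can be detected with a timeout rather than freezing the whole test run, which is one reason this style of scripted acceptance testing pinpoints issues quickly.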