Details: We have been developing software since that was still cool. In fact, we've been developing software since before that was cool. We wrote our first serious programs in the 1970s, starting with punch cards. Those programs mostly solved technical problems. Not much has changed on that front, except that we've since also solved many technical problems, like vehicle routing and resource management, that arise in everyday business.
In the 1980s we began writing software professionally. We've worked in most of the major development paradigms: from machine language, through procedural Fortran, to structured programming in Pascal, C, and Modula-2, then to object-oriented programming, mostly in C++ and Java. More recently we have been developing software in the functional programming paradigm, using Clojure and Scala (an object-functional language).
We've also spent a great deal of time in those parts of software construction that don't involve directly writing executable code (requirements capture, analysis, design, testing and validation, etc.). This work usually requires configuration management systems like revision control (e.g. git), build management systems (e.g. Ant, SBT, Leiningen, or Maven), testing infrastructures (e.g. JUnit, Leiningen, or ScalaTest), and so on.

Most modern programming systems also provide documentation generation tools. We use those and respect them. To be sure, just because a language platform like Java or Scala provides tools for generating documents doesn't mean that magic happens; you still have to do the work to produce really useful programmer's guides and other documentation. These tools loosely follow a formula called "literate programming," introduced by Stanford Professor Donald Knuth back in the 1980s. The idea is that you put the code and the documentation in the same file so that they are less likely to drift apart from one another. But you still have to put the documentation in that file. In fact, you have to write the documentation very carefully so that the tools pick up the right cues and produce the most useful documents. These tools generally require a certain style of writing. For instance, Javadoc and Scaladoc use the first sentence of each description differently than they use the rest of the description. They also require special annotations for all of the important details, like input parameters and error conditions.
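To make that concrete, here is a small sketch of a documented Java method in the Javadoc style. (The `Bearing` class is a hypothetical illustration, not one of our programs.) Note how the first sentence is written to stand alone, since Javadoc lifts it into summary indexes, and how `@param`, `@return`, and `@throws` annotations capture the inputs and error conditions:

```java
/**
 * Utility for working with compass bearings (hypothetical example).
 */
public final class Bearing {

    /**
     * Returns the equivalent bearing normalized to the range [0, 360).
     *
     * <p>Javadoc treats this first sentence as the method's summary in
     * class and package indexes, so it must make sense on its own.
     * Everything after it is shown only in the full description.
     *
     * @param degrees a raw bearing in degrees, possibly negative or
     *                greater than 360
     * @return the same bearing expressed in the range [0, 360)
     * @throws IllegalArgumentException if {@code degrees} is NaN or infinite
     */
    public static double normalize(double degrees) {
        if (!Double.isFinite(degrees)) {
            throw new IllegalArgumentException("bearing must be finite");
        }
        double d = degrees % 360.0;        // Java's % keeps the sign of the dividend
        return d < 0 ? d + 360.0 : d;      // shift negative remainders into range
    }

    public static void main(String[] args) {
        System.out.println(Bearing.normalize(-90.0));  // prints 270.0
    }
}
```

Running `javadoc` over a file like this produces the familiar HTML API pages, but only as good as the comments you wrote.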
Suffice it to say that most of our work in data science, modeling, and simulation has resulted in the development of computer programs using systems programming languages and tools. Most of the work showcased on this web site is a result of that software construction. Here, for instance, is a video tutorial of a system called GRANITE we developed several years ago for the National Institute of Allergy and Infectious Disease (one of the National Institutes of Health):
Here's a screen dump from one of our first pieces of professional software, back in the mid-1980s. It is a program called STAC for processing telephone call record data and designing the most cost-efficient telephone networks. This program was developed for Cable & Wireless, before Microsoft Windows was stable enough for office use. Remarkably, STAC employed the same roles (data fusion, analytics, and control) that we now recognize as the hallmarks of data science.
Here is a screen dump from a system we developed in the early 1990s for doing a different kind of network optimization. The Technical Specification Assistant (TSA) used an expert system to automatically design digital communications networks. This program was developed for GTE.
Here is a screen dump of a program developed for DARPA in the late 1990s for use in planning Special Forces operations. Special Forces are often called in with very little advance notice. At that time, Special Forces personnel carried beepers; they could be called at any time of the day or night and were required to be in the air, en route to the fight, 60 minutes from when the beeper went off. This program allowed users to plan their complex operations on whatever maps they had available: digital files, paper maps, tourist maps, even hand-drawn maps if necessary. Whatever they could put on a scanner. We made range and bearing measurements that were as precise as the source maps (in this case a paper UTM map).
Here is a cockpit dump from a program we developed in the mid-1990s for delivering and displaying updated strike plan information to F-16 pilots whose missions changed mid-flight due to new information about threats (like previously unknown surface-to-air missiles) or emerging, higher-priority targets. This system, called Xploit, sent the mission updates digitally over the pilot's voice-grade channel and displayed the new plan information on the pilot's tactical display. These particular images show a 3-D view of the pilot's new target (an airfield), with a bubble showing the range of its anti-aircraft protections, and an overhead view of the same target, where the anti-aircraft bubble appears as a circle.