To recap a few things:
I attended Rensselaer Polytechnic Institute from 1968 to 1974, graduating with a Master's degree in Engineering Science. Engineering Science was at that time, and remains to this day, a funny program, a "roll your own" sort of thing. An Engineering Science degree at RPI requires that you convince the curriculum chairman that the course of study you have designed is an appropriate one for an engineer. Then, of course, you actually have to complete the program, which is not as easy as you might think (and certainly not as easy as you thought when you first thought up the idea).
One buddy of mine studied urban planning, transportation, and architecture, so now he designs airports around the country and the world. Another took courses in electrical and biomedical engineering, and he's now a hospital management consultant. All in all, it worked out well, I think, at least sufficiently well that the Engineering Science program at RPI continues to this day.
As for me, my course of study centered on the simulation modeling of urban and environmental systems. These days, it's hard for me even to write that sentence without marveling at youthful hubris. What were we thinking? Well, grand notions were in the air. The Cybernetics movement of the 40s and 50s had flowed into what was called General Systems Theory: the idea that science and engineering had developed a set of analytical tools so powerful that they might even be able to handle the social and biological sciences.
I put together a course of study that included linear systems and control theory, voice and image processing, and urban analysis, with a solid chunk of statistics and operations research for getting the data that most people agreed would be necessary to validate and calibrate the huge models we were going to build. I'm not sure how coherent the course of study was, but I will say that for a while it seemed that every course eventually wound up with us trying to invert some damn matrix or another.
Finally, I did my graduate work on rewriting a simple simulation model to compare with the very large model that the Lake George Ecosystem project at RPI had prepared. Then I graduated, moved to California, and began to look for a job. Eventually a dream job fell into my lap (literally, as one of the guys I was living with at the time tossed the phone number into my lap, saying, "We decided I wasn't the right guy for this, but it sounds right up your alley"), and I became a smog scientist. Why this was a perfect next step will now require some simple math.
Conceptually, the most general model of a dynamic (changes with time) system is the state variable formulation. It pretty much goes, “Here is a series of variables that describe some phenomenon. Each variable is linked to the other variables such that the state of that variable at an instant in time is a function of the other variables at some previous time.” Often, the “previous time” is the instant immediately preceding (for a differential equation), or a discrete time step back (for a difference equation). Sometimes, however, the functional relationship looks back some period of time, although this can be turned into an instant/single-time-step formulation just by creating more state variables that carry a time-lag link to previous values, i.e., “memory variables.”
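The memory-variable trick is easier to see in code than in prose. Here is a minimal sketch (all variable names and numbers are invented for illustration) of a difference-equation system in which one variable depends on another from two time steps back, handled by adding two extra state variables that simply remember old values:

```python
# Hypothetical example: x is driven by the value of y two steps back.
# Adding "memory variables" y_lag1 and y_lag2 makes every update
# depend only on the immediately preceding state.

def step(state, k=0.5):
    """Advance the state one discrete time step."""
    x, y, y_lag1, y_lag2 = state
    x_next = x + k * y_lag2             # x responds to y from two steps ago
    y_next = 0.9 * y                    # y just decays, for illustration
    return (x_next, y_next, y, y_lag1)  # shift the memory variables along

state = (0.0, 1.0, 0.0, 0.0)            # x starts at 0, y starts at 1
for _ in range(3):
    state = step(state)
# Only on the third step does the initial y = 1 finally reach x.
```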
Having said all that, Keep It Simple, Stupid is a good rule to live by, and it’s a pretty good rule in science and engineering. So let me write a simple equation:
dC/dt = kAB
In case anyone here has math nausea, let me emphasize how simple this is. It just says that the rate that C is changing with time depends on the product of A and B with k as a rate parameter (just multiply A and B and k). A, B, and C represent state variables, while k is a parameter, and t is time.
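For anyone who prefers code to calculus, here is the same equation as a forward-Euler loop; the values of k, A, B, and the step size are made up purely for illustration:

```python
# A tiny forward-Euler sketch of dC/dt = k*A*B.
k, A, B = 0.1, 2.0, 3.0    # rate parameter and (here constant) state variables
C, dt = 0.0, 0.01
for _ in range(100):       # integrate from t = 0 to t = 1
    C += k * A * B * dt    # each step, C grows by dC = k*A*B*dt
# With A and B held constant, C grows linearly: C(1) = k*A*B = 0.6
```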
C can be anything, but it is most interesting when C has an effect on A and/or B. For example, suppose
C = -A - B + D + E

This is like a chemical reaction, where A and B react to form D and E. Another way of writing it is

A + B => D + E
That’s your basic chemical shorthand.
Or suppose you are dealing with a predator/prey relationship:
Wolf + Deer => (1+∆)Wolf (i.e., a well-fed wolf)
Or an aquatic ecosystem:
Phytoplankton + Zooplankton => (1+∆)Zooplankton
Phytoplankton + light + phosphate => (1+∆)Phytoplankton − phosphate
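The plankton equations above can be sketched as a pair of coupled difference equations. This is only a sketch; all the rates and the (1+∆) "well fed" gain below are invented numbers, not anything from a real ecosystem model:

```python
# Phytoplankton P and zooplankton Z, stepped forward with Euler's method.
def step(P, Z, dt=0.01, growth=1.0, graze=0.5, delta=0.2, death=0.3):
    """One Euler step for the toy plankton system."""
    encounters = graze * P * Z            # contact rate, just like k*A*B
    dP = growth * P - encounters          # P grows on light, gets grazed
    dZ = delta * encounters - death * Z   # Z gains its (1+delta) share, and dies
    return P + dP * dt, Z + dZ * dt

P, Z = 1.0, 1.0
for _ in range(100):                      # simulate one time unit
    P, Z = step(P, Z)
```

Note that both rates of change hang off the same encounter term, which is exactly the point: the chemistry and the ecology share one mathematical skeleton.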
Now why is this so interesting?
The most interesting thing about such equations is how generally applicable they are. As I’ve just shown, you can use them for chemical compounds or ecosystems with equal abandon. Why? Because the setup is similar; the rate of change of the state variables depends upon how often the individuals (molecules, plankton, wolves) come in contact with each other. That is a very common situation.
The equations are non-linear but they can come close to being linear when either A or B is much larger than the other. Sometimes, this is called "well-behaved" which means "I think I sometimes understand how it works." Nevertheless, large, "well-behaved" systems often are "counter-intuitive," which means, "Okay, so I was wrong at first, but this time I'm sure I'm right, maybe."
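Here is a quick numerical illustration of that near-linear limit, with invented numbers: when B starts out a thousand times larger than A, B barely moves while A reacts away, so the bimolecular rate kAB behaves like a first-order decay at the fixed rate kB₀ (chemists call this "pseudo-first-order"):

```python
import math

# Integrate dA/dt = -k*A*B with B >> A, and compare A(1) to the
# pseudo-first-order prediction A0 * exp(-k * B0 * t).
k, A0, B0, dt = 0.001, 1.0, 1000.0, 1e-4
A, B = A0, B0
for _ in range(10000):         # integrate to t = 1
    r = k * A * B * dt
    A -= r
    B -= r                     # B barely moves: it started 1000x larger

linear_approx = A0 * math.exp(-k * B0 * 1.0)   # = exp(-1), about 0.368
```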
This is just the chemistry part, of course, and the rest of photochemical modeling also has a lot of physics and mechanics in it. Whitten and I used to joke about how academic lectures on smog modeling would usually begin with somebody writing the diffusion equation on the board, a really daunting-looking three-dimensional partial differential equation describing fluid flow, followed by a couple of single letters that represented “emissions” and “chemistry.” The joke was that there were well-established ways of solving the diffusion equation numerically, and the process itself (fluid mechanics) has been pretty well understood for generations. In other words, all the really hard work, preparing the emissions inventory and developing the chemistry, was compressed into two humble little letters.
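For the curious, the daunting blackboard equation is, in one common form, the atmospheric diffusion equation, written here schematically in the same plain notation as the equations above:

∂c/∂t + u·∇c = ∇·(K∇c) + E + R(c)

where c is a pollutant concentration, u is the wind field, K is the turbulent diffusivity, and E and R are the two humble little letters: emissions and chemistry.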
A while back, in “The Right Formula” (May 7), I wrote a bit about “the stiffness problem” that often occurs when you’re trying to solve dynamic state equations that have widely varying time scales. In that essay, I briefly noted the “Gear-Hindmarsh” routines that we used in simulating smog chamber chemistry, before we hard-coded the chemical kinetic mechanisms into urban smog models. When doing the latter, we used various tricks that can only be used if you already know the chemistry you’re dealing with. Obviously that isn’t much good when you’re still developing the chemistry, but that’s okay, because we had the Gear routines.
The Gear-Hindmarsh codes were developed at Lawrence Livermore Labs, in order to deal with equations like this:
Li6 + n => He4 + H3
H3 + H2 => He4 + n + 17.2 MeV
H2 + H2 => He3 + n
H2 + H2 => H3 + H1
In case you didn't notice, the 17.2 MeV means that if you do this to a substantial amount of Li6 and H2 (deuterium), you get a lot of energy. If the pressure and density of the material are right, you get a very large bang, i.e., a thermonuclear detonation. Hence, Livermore's interest.
Gear-Hindmarsh and similar schemes solve the stiffness problem, but at a cost: every time the system sees an input with discontinuities (step discontinuities even in the 4th or 5th derivative), the solver drops to a lower-order predictor-corrector and takes very small steps. This is fine for smog chamber experiments, but not so good for urban simulations, where inputs and boundary conditions keep changing by the hour.
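To see what stiffness does to a naive solver, here is a self-contained sketch (invented numbers) on the classic test problem dy/dt = −λy. Explicit Euler blows up once λ·dt exceeds 2, while the backward (implicit) Euler step, the order-1 ancestor of the Gear/BDF family, stays stable at the very same step size:

```python
# Stiff test problem dy/dt = -lam*y, with a step size 5x past the
# explicit-Euler stability limit (lam*dt = 10 > 2). Illustrative only.
lam, dt, n = 1000.0, 0.01, 100

y_explicit = 1.0
for _ in range(n):
    y_explicit += dt * (-lam * y_explicit)   # y *= (1 - lam*dt) = -9 each step

y_implicit = 1.0
for _ in range(n):
    y_implicit /= 1.0 + lam * dt             # backward Euler: y /= 11 each step

# y_explicit oscillates and explodes to astronomical size; y_implicit
# decays toward the true solution, which is essentially zero by t = 1.
```

An implicit solver buys that stability by solving an equation at every step, which is why the Gear codes were worth the trouble at Livermore.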
As the final bit of something a little like irony, I’ll note that I once revisited my Master’s Degree work and used a Gear-Hindmarsh solver on it instead of DYNAMO, which was a simulation language developed at MIT and used by Jay Forrester for his Urban Dynamics and World Dynamics models. The Gear-Hindmarsh results were substantially different from the DYNAMO results, indicating that the (pretty crude) numerical solver in DYNAMO was still sensitive to step size in my simulations. Oops. I have no idea if Forrester’s results suffered from the same problem, though I don’t actually think it matters that much. Forrester’s work had substantially worse problems than a bad number cruncher.