Math and Computing Are Different...and You Need Both

One of the hardest things to get used to is that, despite the close kinship between math and computing, differences emerge as soon as you start computing on real hardware. To get the most out of your computational tools, you need to be good at both math and computing.

"Stiffness" may be the best illustration. Lots of systems involve behavior on multiple timescales. And even if the fast moving behavior is small, it can wreak havoc with your computational tools. This is because you usually don't try to solve the problems analytically. Instead you solve them "numerically", meaning that you let the computer make a really good approximation of the solution, relying on the computer's repeatability and its ability to do fast arithmetic. But in stiff systems, even small errors in the approximation can cause things to go haywire.

Here's an example using a simple, classically stiff ordinary differential equation:

y' = y² - y³   and we will let  y(0) = 0.01  for our initial condition

It's hard to get any simpler. But when we try to solve the equation by computer using a numerical method that isn't sensitive to stiffness, what we get makes almost no sense at all. 

[Figure: the true solution (orange) and the computed solution (blue), with the computed solution oscillating and lagging]

The orange curve represents the true solution and the blue curve represents the solution the computer came up with. All of the oscillations and lag are artifacts: they aren't part of the answer to the problem, they are part of the method we used to solve it. There's no real math error; we haven't even tried to find an analytic solution, so beyond stating the problem we haven't done any math. There's no computer error; the computer has performed all of its calculations exactly as instructed. And there isn't even a roundoff problem: with ten times the number of digits of precision, the error would be almost exactly the same. The error is in the computational method, because we used a method that, while perfectly good for some problems, is insensitive to stiffness.

It's useful to emphasize that computation and computers are different things too. Computers are machines; computation describes the methods programmed into the machines.
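
You can reproduce this kind of artifact with the simplest fixed-step method. In the sketch below, the step count and time span are illustrative choices, not the settings behind the figure; the point is that forward Euler's step is just past the stability limit for the fast timescale near y = 1, so the computed values oscillate around the steady state instead of settling onto it:

    def forward_euler(f, y0, t0, t1, n):
        # Fixed-step forward Euler: y_{k+1} = y_k + h * f(t_k, y_k)
        h = (t1 - t0) / n
        ys = [y0]
        for k in range(n):
            ys.append(ys[-1] + h * f(t0 + k * h, ys[-1]))
        return ys

    f = lambda t, y: y**2 - y**3  # the stiff equation above
    # 95 steps over [0, 200] gives h of about 2.1, just past forward Euler's
    # stability limit of 2 near y = 1 (where f'(y) = 2y - 3y^2 = -1).
    ys = forward_euler(f, 0.01, 0.0, 200.0, 95)
    print(ys[-6:])  # bounces around the steady state y = 1 instead of settling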

To get all of this right, we need enough math to recognize that this is a stiff problem, and enough computation to know how to solve a stiff problem on a computer.
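
In SciPy, for example, switching from a general-purpose explicit solver to a stiff solver is a one-keyword change. This is a minimal sketch, with a smaller initial value than above (0.001 instead of 0.01) and an illustrative tolerance, chosen to make the contrast dramatic:

    from scipy.integrate import solve_ivp

    def flame(t, y):
        # The stiff equation from above: y' = y^2 - y^3
        return y**2 - y**3

    y0 = 0.001
    T = 2.0 / y0  # long enough to see the sharp transition and the steady state

    # RK45 is a fine general-purpose explicit method, but near the steady
    # state it must take tiny steps just to stay stable.
    explicit = solve_ivp(flame, (0.0, T), [y0], method="RK45", rtol=1e-6)

    # BDF is an implicit method built for stiff problems.
    implicit = solve_ivp(flame, (0.0, T), [y0], method="BDF", rtol=1e-6)

    print("RK45 steps:", explicit.t.size)  # many hundreds of steps
    print("BDF steps: ", implicit.t.size)  # typically far fewer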

If stiffness never came up in real problems, there would be no worries, right? But stiffness is real. Wherever you have behaviors working on different timescales, stiffness can arise. Traffic analysis is a good example.

If you are trying to model complex traffic flow in busy cities that experience rush hours, or in telecommunications networks that have busy hours, you have to cope with stiffness.

Stiffness is a good example, but far from the only place where the differences between math and computing show up. We briefly mentioned roundoff: computers are really bad at things like adding small numbers to big ones. Roundoff trouble also comes up often in what are called "ill-conditioned" systems, as when you are trying to track a target but all of your sensors see it from nearly the same angle (a geometry sometimes called "needle space").
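
You can watch the "small plus big" problem happen in a few lines of Python (the specific numbers are just illustrative):

    big = 1.0e16
    print(big + 1.0 == big)   # True: float64 carries ~16 digits, so the 1.0 vanishes
    print((big + 1.0) - big)  # 0.0, not 1.0
    print(0.1 + 0.2 == 0.3)   # False: the classic binary-fraction roundoff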

In the real world we frequently have little choice about where we can observe a behavior. In the figure below, two fixed sensors are observing a vehicle, but both are looking in almost the same direction to see it.

[Figure: two fixed sensors observing a vehicle from nearly the same direction]

That geometry invariably leads to ill-conditioning. The same problem can come up in social network analytics, for example when you try to characterize someone by their comments but only have comments on a narrow range of subjects. It's like judging a person you know only from their Netflix reviews, when they only review films they like.
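
One way to quantify the problem is the condition number of the observation geometry. In this sketch, the bearing_matrix helper and the specific angles are hypothetical choices for illustration; the point is that two nearly parallel lines of sight make the matrix nearly singular, so small measurement errors produce large solution errors:

    import numpy as np

    def bearing_matrix(a1, a2):
        # Each row is a unit line-of-sight direction from one fixed sensor
        # (a hypothetical two-sensor, 2-D setup).
        return np.array([[np.cos(a1), np.sin(a1)],
                         [np.cos(a2), np.sin(a2)]])

    wide   = bearing_matrix(0.0, np.pi / 2)    # sensors 90 degrees apart
    needle = bearing_matrix(0.0, np.pi / 180)  # sensors 1 degree apart

    print(np.linalg.cond(wide))    # ~1: measurement errors pass through unamplified
    print(np.linalg.cond(needle))  # ~115: errors are amplified about a hundredfold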

There are ways to adjust for roundoff problems, just as there are ways to adjust for stiffness. But once again, it takes enough math to see the problem coming, and enough computation to know how to fix it.
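
For summation roundoff, for instance, one classic adjustment is compensated (Kahan) summation. Here is a minimal sketch:

    def kahan_sum(values):
        # Compensated (Kahan) summation: track the low-order bits that
        # ordinary floating-point addition throws away, and feed them back in.
        total = 0.0
        lost = 0.0
        for v in values:
            y = v - lost            # re-inject the error from the last step
            t = total + y           # big + small: low-order bits of y are lost...
            lost = (t - total) - y  # ...and this recovers them (algebraically zero)
            total = t
        return total

    vals = [1.0e16] + [1.0] * 10_000
    print(sum(vals) - 1.0e16)        # 0.0: the ten thousand 1.0s were absorbed
    print(kahan_sum(vals) - 1.0e16)  # 10000.0: the compensation recovers them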


S3 Data Science, copyright 2015.