
### Control Theory

#### Minimalist Description

Control is about achieving desirable outcomes, whether strategic or tactical. For instance, a football team wants to control the outcome of the game. Going into the game, they have a strategy: cover the outside receivers even if it means taking a bit of pressure off the quarterback; run to the inside and use the fake inside run to free up the receivers at the corners. But once the game starts, the emphasis shifts to tactics: the middle linebacker isn't falling for the fake inside run and instead peels off to help the corners, so fake the inside run and then throw over the middle to the running back once the middle linebacker vacates to assist the cornerback.

Much of control theory deals with optimization, and in fact it is often called Optimal Control Theory. Optimization can take many forms. The two most basic are:

• Continuous Optimization: These are problems described by smooth (or at least continuous) functions, such as the fuel burn schedule for a rocket engine or the watering schedule for a farm.
• Combinatorial Optimization: These are problems for which there is no smooth function, and the answer is drawn from a finite set, such as which objects to take if the museum catches on fire (an unordered set), or which machines to process a job on and in what sequence (an ordered set).
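The "museum on fire" question is the classic 0/1 knapsack problem: each object has a value and a weight, and we must pick the unordered subset of maximum value that we can actually carry. A minimal sketch using dynamic programming (the item values, weights, and the 10-unit carrying capacity below are made-up illustrative numbers):

```python
def knapsack(items, capacity):
    """0/1 knapsack: pick the subset of (value, weight) items that
    maximizes total value without exceeding the weight capacity."""
    # best[w] = best value achievable using total weight at most w
    best = [0] * (capacity + 1)
    for value, weight in items:
        # iterate capacities downward so each item is used at most once
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]

# Hypothetical paintings as (value, weight); we can carry 10 units of weight.
paintings = [(60, 5), (50, 4), (70, 6), (30, 3)]
print(knapsack(paintings, 10))  # -> 120: take the 70- and 50-value paintings
```

Note that there is no gradient to follow here: the decision variables are discrete yes/no choices, which is what separates this class from continuous optimization.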

Both classes of problem come in constrained and unconstrained flavors. The rocket nozzle may restrict the fuel burn rate to 50 pounds per minute, even though burning 75 pounds per minute at the very end would reduce the total fuel spent. There are also stochastic versions of both problems, where there is uncertainty in the problem data. We may not know, for instance, exactly how much time will be required to process a job on a particular machine.
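The constrained continuous case can be sketched with projected gradient descent: take a gradient step on the objective, then project the schedule back onto the feasible set (here, a burn-rate cap of 50 lb/min). The objective below is a toy, not a real rocket model: it penalizes uneven burning plus a quadratic penalty for missing a total-fuel target, and all numbers are made-up assumptions.

```python
def project(x, lo=0.0, hi=50.0):
    """Project a burn-rate schedule onto the box constraint [lo, hi] lb/min."""
    return [min(hi, max(lo, xi)) for xi in x]

def solve(n=5, target=200.0, rho=10.0, steps=500, lr=0.01):
    """Projected gradient descent on a toy objective:
    minimize sum(x_i^2) + rho * (sum(x_i) - target)^2,
    i.e. prefer a smooth burn while (softly) hitting the fuel target,
    subject to each per-minute rate staying within the nozzle's cap."""
    x = [target / n] * n            # start from an even split
    for _ in range(steps):
        total = sum(x)
        # gradient of the smoothness term plus the total-fuel penalty
        g = [2 * xi + 2 * rho * (total - target) for xi in x]
        # gradient step, then project back into the feasible box
        x = project([xi - lr * gi for xi, gi in zip(x, g)])
    return x

schedule = solve()
print([round(b, 2) for b in schedule])  # an even burn just under 40 lb/min
```

With the cap at 50 the constraint is inactive and the answer is nearly the even 40 lb/min split; lowering `hi` below 40 would make the projection bind and force a different (feasible) schedule.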