On a winter day 50 years ago, Edward Lorenz, SM '43, ScD '48, a mild-mannered meteorology professor at MIT, entered some numbers into a computer program simulating weather patterns and then left his office to get a cup of coffee while the machine ran. When he returned, he noticed a result that would change the course of science.

The computer model was based on 12 variables, representing things like temperature and wind speed, whose values could be depicted on graphs as lines rising and falling over time. On this day, Lorenz was repeating a simulation he'd run earlier—but he had rounded off one variable from .506127 to .506. To his surprise, that tiny alteration drastically transformed the whole pattern his program produced, over two months of simulated weather.

The unexpected result led Lorenz to a powerful insight about the way nature works: small changes can have large consequences. The idea came to be known as the "butterfly effect" after Lorenz suggested that the flap of a butterfly's wings might ultimately cause a tornado. And the butterfly effect, also known as "sensitive dependence on initial conditions," has a profound corollary: forecasting the future can be nearly impossible.
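The effect Lorenz stumbled on can be sketched numerically. The snippet below uses his later three-variable 1963 system rather than the 12-variable weather model described above, with the classic parameter choices (sigma = 10, rho = 28, beta = 8/3); the step size, initial values, and the size of the perturbation are illustrative assumptions, not figures from the original experiment. Two runs that differ only in the sixth decimal place of one variable start out indistinguishable and end up far apart:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz (1963) system one step with forward-Euler integration."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def simulate(state, steps):
    """Run the system forward a given number of steps."""
    for _ in range(steps):
        state = lorenz_step(state)
    return state

def dist(p, q):
    """Euclidean distance between two states."""
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.000001)  # same run, one variable nudged in the 6th decimal

# Early on the trajectories agree; much later they have decorrelated.
short_a, short_b = simulate(a, 100), simulate(b, 100)
long_a, long_b = simulate(a, 5000), simulate(b, 5000)

print(f"separation after  100 steps: {dist(short_a, short_b):.2e}")
print(f"separation after 5000 steps: {dist(long_a, long_b):.2e}")
```

The tiny initial gap grows roughly exponentially until it saturates at the size of the attractor itself, which is why the long-run forecast carries no trace of the original agreement.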