- Who: Benoit Mandelbrot & Richard Hudson
- When: July 13, 2021 (read, recorded or researched)

Mandelbrot sketches the origins of "modern finance", disproves its assumptions, and explains how fractal geometry is a better way of modeling how financial markets actually work. The book culminates in ten insights that Mandelbrot's mathematical approach has convinced him hold true about the nature of markets. I enjoyed the book and subscribe to a lot of what it reveals.

Financial economics, as a discipline, is where chemistry was in the sixteenth century: a messy compendium of proven know-how, misty folk wisdom, unexamined assumptions and grandiose speculation.

Most of it focuses on practical aims, such as making money or avoiding loss for whoever is paying for the research, whether a bank or a government. While there is nothing wrong with that, it does mean - especially in the field of financial market analysis - that the flow of scientific information is sharply curtailed by self-interest.

A bank in which the research department thinks it has discovered something new and useful will not share it with anyone else. Being focused on profit, not knowledge, it is unlikely to fund fundamental research.

This is where the public sector needs to step in. Fundamental research in economics needs expansion, in an environment that encourages information-sharing. The National Science Foundation in Washington, the European Research Council in Brussels, the ECB and the Fed, need to fund more basic research.

What is needed is a Project Apollo for economics - a sizeable, coordinated effort to advance human knowledge. We need to understand, in much closer fidelity to reality, how different kinds of prices move, how risk is measured and how money is made and lost. Without that knowledge, we are doomed to crashes, again and again.

"Modern" finance emerged from the mathematics of chance and statistics. The fundamental concept: Prices are not predictable, but their fluctuations can be described by the mathematical laws of chance. Therefore, their risk is measurable, and manageable. This is now orthodoxy to which I subscribe - up to a point.

Work in this field began in 1900, when a youngish French mathematician, Louis Bachelier, had the temerity to study financial markets at a time when "real" mathematicians did not touch money.

In the very different world of the seventeenth century, Pascal and Fermat (he of the famous "last theorem" that took 350 years to be proved) invented probability theory to assist some gambling aristocrats.

In 1900, Bachelier passed over fundamental analysis and charting. Instead, he set in motion the next big wave in the field of probability theory, by expanding it to cover French government bonds.

His key model, often called the "random walk," sticks very closely indeed to Pascal and Fermat. It postulates prices will go up or down with equal probability, as a fair coin will turn heads or tails. If the coin tosses follow each other very quickly, all the hue and cry on a stock or commodity exchange is literally static - white noise of the sort you hear on a radio when tuned between stations. And how much the prices vary is measurable. Most changes, 68 percent, are small moves up or down, within one "standard deviation" - a simple mathematical yardstick for measuring the scatter of data - of the mean; 95 percent should be within two standard deviations; 99.7 percent should be within three.
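These proportions are easy to check with a quick simulation of the coin-toss walk - a minimal Python sketch, with arbitrary set sizes (roughly 68, 95, and 99.7 percent of changes fall within one, two, and three standard deviations):

```python
import random
import statistics

random.seed(1)

# Each "set" of Bachelier's game: the sum of 500 fair +1/-1 coin tosses,
# i.e. one sample of the random walk's net change. (Set size and count
# are arbitrary illustrations.)
sets = [sum(random.choice((-1, 1)) for _ in range(500)) for _ in range(2000)]

mu = statistics.mean(sets)
sd = statistics.stdev(sets)

# Fraction of net changes within one, two, and three standard deviations:
# roughly 68%, 95%, and 99.7%.
for k in (1, 2, 3):
    share = sum(abs(x - mu) <= k * sd for x in sets) / len(sets)
    print(f"within {k} sd: {share:.1%}")
```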

Finally - this will shortly prove to be very important - extremely few of the changes are very large. If you line all these price movements up on graph paper, the histograms form a bell shape: The numerous small changes cluster in the center of the bell, the rare big changes at the edges.

The bell shape is, for mathematicians, terra cognita, so much so that it came to be called "normal"-implying that other shapes are "anomalous."

It is the well-trodden field of probability distributions that came to be named after the great German mathematician Carl Friedrich Gauss. An analogy: The average height of the U.S. adult male population is about 70 inches, with a standard deviation around two inches. That means 68 percent of all American men are between 68 and 72 inches tall; 95 percent between 66 and 74 inches; 99.7 percent between 64 and 76 inches.

The mathematics of the bell curve do not entirely exclude the possibility of a 12-foot giant or even someone of negative height, if you can imagine such monsters. But the probability of either is so minute that you would never expect to see one in real life.
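The height figures, and the vanishing odds of the 12-foot giant, can be reproduced directly with Python's standard library - a sketch using `statistics.NormalDist` with the text's 70-inch mean and 2-inch standard deviation:

```python
from statistics import NormalDist

# The text's example: U.S. adult male height, mean 70 in, sd 2 in.
heights = NormalDist(mu=70, sigma=2)

# Share of men within one and two standard deviations of the mean.
print(f"68-72 in: {heights.cdf(72) - heights.cdf(68):.1%}")  # ~68.3%
print(f"66-74 in: {heights.cdf(74) - heights.cdf(66):.1%}")  # ~95.4%

# The bell curve gives a 12-foot (144 in) giant a probability so minute
# it underflows to zero in floating point.
print(1 - heights.cdf(144))
```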

The bell curve is the pattern ascribed to such seemingly disparate variables as the height of Army cadets, IQ test scores, or - to return to Bachelier's simplest model - the returns from betting on a series of coin tosses. To be sure, at any particular time or place extraordinary patterns can result: One can have long streaks of tossing only "heads," or meet a squad of exceptionally tall or dim soldiers.

But averaging over the long run, one expects to find the mean: average height, moderate intelligence, neither profit nor loss. This is not to say fundamentals are unimportant; bad nutrition can skew Army cadets towards shortness, and inflation can drive bond prices down. But as we cannot predict such external influences very well, the only reliable crystal ball is a probabilistic one.

Bachelier's work was largely ignored by his contemporaries, but it was translated into English and republished in 1964, and thence was developed into a great edifice of modern economics and finance (and five Nobel Memorial Medals in economic science).

A broader variant of Bachelier's thinking often goes by the title one of my doctoral students, Eugene F. Fama of the University of Chicago, gave it: the Efficient Market Hypothesis. The hypothesis holds that in an ideal market, all relevant information is already priced into a security today. One illustrative possibility is that yesterday's change does not influence today's, nor today's, tomorrow's; each price change is "independent" from the last.

With such theories, economists developed a very elaborate toolkit for analyzing markets, measuring the "variance" and "betas" of different securities and classifying investment portfolios by their probability of risk.

According to the theory, a fund manager can build an "efficient" portfolio to target a specific return, with a desired level of risk. It is the financial equivalent of alchemy.

Want to earn more without risking too much more? Use the modern finance toolkit to alter the mix of volatile and stable stocks, or to change the ratio of stocks, bonds, and cash.

Want to reward employees more without paying more? Use the toolkit to devise an employee stock-option program, with a tunable probability that the option grants will be "in the money."

Alas. The theory behind "Modern Finance" is elegant but flawed, as anyone who lived through the booms and busts of the 1990s can now see.

The old financial orthodoxy was founded on two critical assumptions in Bachelier's key model: **Price changes are statistically independent, and they are normally distributed.** The facts, as I vehemently argued in the 1960s and many economists now acknowledge, show otherwise.

The multifractal model I subscribe to is still in development but it does already offer some immediate insights into how markets work.

Wind is a classic example of a form of fluid flow called turbulence. Though studied for more than a century, turbulence remains only partly understood by either theoreticians or aircraft designers.

Wire a wind tunnel at Boeing or Airbus with appropriate instruments, and you can detect the complex motion of the water vapor, dust, or luminescent markers blowing inside it. When the rotor at the tunnel's head spins slowly, the wind inside blows nice and smoothly. Its currents glide in unison in long, steady lines, planes and curves like parallel sheets of supple, laminated plywood. This kind of flow is called laminar.

Then, as the rotor accelerates, the wind inside the tunnel picks up speed and energy. Here and there, it suddenly breaks into gusts - sharp, intermittent. This is the onset of turbulence. The wind inside the tunnel dissipates the rotor's energy. Eddies form; and on those eddies yet more, smaller eddies form. A cascade of whirlpools, scaled from great to small, spontaneously appears.

Then, just as suddenly, a surprise. Here and there, smooth flow returns momentarily. And then more gusts and turbulence. Smooth again. Rough again.

That experience underlies all my thinking about financial markets. The tell-tale traces of turbulence are plainly there, in the price charts. A market has turbulent parts that scale up to echo the whole. It has a set of numbers - a multifractal spectrum - that characterizes the scaling. It has long-term dependence, so that an event here and now affects every other event elsewhere and in the distant future.

It shows turbulence in a wild kind of variation far outside the normal expectations of the bell curve; in a concentration of changes here and there; in a discontinuity in the system jumping from one value to another; and in one set of mathematical rules that can, in large measure, describe it all.

Turbulence is dangerous. Its output can swing wildly, suddenly. It is hard to predict, harder to protect against, hardest of all to engineer and profit from. Conventional finance ignores this, of course. It assumes the financial system is a linear, continuous, rational machine. That kind of thinking ties conventional economists into logical knots.

Consider the so-called Equity Premium Puzzle, a chestnut of the scholarly literature since its discovery two decades ago by two young economists, Rajnish Mehra and Edward C. Prescott.

Why is it that stocks, according to the averages, generally reward investors so richly? The data say that, over the long stretch of the twentieth century, stocks provided a massive "premium" return over that of supposedly safer investments, such as U.S. Treasury Bills. Inflation-adjusted estimates of that premium vary, depending on the dates you examine, between 4.1 percent and 8.4 percent.

Conventional theory calls this impossible. Only two things, the theory says, could so inflate stock prices: Either the market is so risky that people will not invest otherwise, or people merely fear it is too risky and so will not invest otherwise.

Now, when studying this, economists typically measure the real market risk by its volatility - quantified by their old friend, the bell-curve standard deviation. They measure people's perception of risk from opinion surveys. Then they do the math, and come up short: The conventional formulae say the risk premium should not exceed 1 percent or so. Surely some mistake in the data?

But these papers miss the point. They assume that the "average" stock-market profit means something to a real person; in fact, it is the extremes of profit or loss that matter most. Just one out-of-the-average year of losing more than a third of capital - as happened with many stocks in 2002 - would justifiably scare even the boldest investors away for a long while.

The problem also assumes wrongly that the bell curve is a realistic yardstick for measuring the risk. As I have said often, real prices gyrate much more wildly than the Gaussian standards assume. In this light, there is no puzzle to the equity premium. Real investors know better than the economists. They instinctively realize that the market is very, very risky, riskier than the standard models say. So, to compensate them for taking that risk, they naturally demand and often get a higher return.

The same reasoning-that people instinctively understand the market is very risky-helps explain why so much of the world's wealth remains in safe cash, rather than in anything riskier.

Concentration is common. Look at a map of gold deposits around the world: You see clusters of gold veins - in South Africa and Zimbabwe, in the far reaches of Siberia and elsewhere. This is not total chance; millennia of real tectonic forces gradually worked it that way.

Understanding concentration is crucial to many businesses, especially insurance. A recent study of tornado damage in Texas, Louisiana, and Mississippi found 90 percent of the claims came from just 5 percent of the insured land area.

In a financial market, volatility is concentrated, too; and it is no mystery why. News events - corporate earnings releases, inflation reports, central bank pronouncements - help drive prices.

Orthodox economists often model them as a long series of random events spread out over time. While they can be of varying importance and size, their assumed distribution follows the bell curve so that no single one is preeminent.

What sense is this? The terrorist attack on the World Trade Center was, by anyone's reckoning, far and away the most important event in years for world stability and, consequently, for financial markets. It forced the closure of the New York Stock Exchange for an unprecedented five days, and when trading reopened caused a 7.5 percent fall. It was one titanic event, not the sum of many small ones. Big news causes big market action. And that action concentrates in small slices of time.

What is an investor to do? Brokers often advise their clients to buy and hold. Focus on the average annual increases in stock prices, they say. Do not try to "time the market," seeking the golden moment to buy or sell. But this is wishful thinking. What matters is the particular, not the average.

Some of the most successful investors are those who did, in fact, get the timing right. In the space of just two turbulent weeks in 1992, George Soros famously profited about $2 billion by betting against the British pound. Now, very few of us are in that league, but we can in our modest way take cognizance of concentration.

Continuity is a common human assumption. If we see a man running at one moment here and a half-hour later there, we assume he has run a line covering all the ground in between. It does not occur to us that the man stopped to rest and then hitched a ride.

Economists often do the same. Continuity is a fundamental assumption of conventional finance. The mathematics of Bachelier, Markowitz, Sharpe, and Black-Scholes all assume continuous change from one price to the next. Without that, their formulae simply do not work.

Alas, the assumption is false and so the math is wrong. Financial prices certainly jump, skip, and leap-up and down. In fact, I contend the capacity for jumps, or discontinuity, is the principal conceptual difference between economics and classical physics.

In a perfect gas, as molecules collide and exchange heat, their billions of individually infinitesimal transactions collectively produce a genuine "average" temperature, around which smooth gradients lead up or down the scale.

But in a financial market, the news that impels an investor can be minor or major. His buying power can be insignificant or market-moving. His decision can be based on an instantaneous change of heart, from bull to bear and back again.

The result is a far-wilder distribution of price changes: not just price movements, but price dislocations. These are especially noticeable in our Information Age, with its instantaneous broadcasting by television, Internet, and trading-room screen. News of a terrorist attack in Indonesia flashes across the globe in seconds to millions of investors.

They can act on it, not bit by bit in a progressive wave, as conventional theorists assume, but all at once, now and instantaneously. The effect can be exhilarating or heart-stopping, depending on whether you gain or lose.

Conventional financial analysis is a welter of conflicting views of time. One, implicit in conventional finance theory: Time is measured by the clock and is the same for all investors. When calculating risk under the Capital Asset Pricing Model, the formulae assume all investors think and breathe very much alike, holding the securities in question for exactly the same length of time. The contradictory view, popular among market pundits: Time is different for every investor. Each time-scale you consider, each holding period for a stock or bond, has its own kind of risks.

In fractal analysis, time is flexible. The multifractal model describes markets as deforming time - expanding it here, contracting it there. The more dramatic the price changes, the more the trading time-scale expands. The duller the price chart, the slower runs the market clock.

Some researchers have tried linking this concept to trading volume: High volume equals fast trading time. That is a connection not yet established, and it need not be. Time deformation is a mathematical convenience, handy for analyzing the market; and it also happens to fit our subjective experience.

Time does not run in a straight line, like the markings on a wooden ruler. It stretches and shrinks, as if the ruler were made of balloon rubber. This is true in daily life: We perk up during high drama, nod off when bored. Markets do the same.

One of the surprising conclusions of fractal market analysis is the similarity of certain variables from one type of market to another.

Prices are not driven solely by real-world events, news, and people. When investors, speculators, industrialists, and bankers come together in a real marketplace, a special, new kind of dynamic emerges-greater than, and different from, the sum of the parts.

To use the economists' terms: In substantial part, prices are determined by endogenous effects peculiar to the inner workings of the markets themselves, rather than solely by the exogenous action of outside events.

Moreover, this internal market mechanism is remarkably **durable**. Wars start, peace returns, economies expand, firms fail - all these come and go, affecting prices. But the fundamental process by which prices react to news does not change.

A mathematician would say market processes are "stationary." This contradicts some would-be reformers of the random-walk model who explain the way volatility clusters by asserting that the market is in some way changing, that volatility varies because the pricing mechanism varies. Wrong.

A striking example: My analysis of cotton prices over the past century shows the same broad pattern of price variability at the turn of the last century, when prices were unregulated, as in the 1930s, when prices were regulated as part of the New Deal.

We mathematicians and physicists love what we call an invariance. That is a property that remains unchanged, no matter how you transform the data, shape, or object under study. Fractal geometry is the mathematics of one such invariance in the physical world-the study of patterns, in space or time, that remain the same even as the scale of observation changes. Statisticians have a kindred concept, called stationarity: A stationary time series has the same basic statistical properties throughout.

Economists argue their field may be different. The economist Jacob Marschak once remarked at a meeting I attended that the only economic invariance he could imagine concerned the equality between the number of left and right shoes - and even that could not be trusted. Following that thinking, many recent models of price variation try to explain the obviously shifting pattern of volatility by inserting parameters that change by the day, hour, and second; such are the GARCH family mentioned earlier.

I am an optimist. I would rather not dismiss the existence of invariances but continually look for them hiding in non-obvious places. Invariances make life easier: If you can find some market properties that remain constant over time or place, you can build better, more useful models and make sounder financial decisions. My multifractal model works with just such a set of consistent parameters.

As I have stated often, the distribution of price changes in a financial market scales. Like the proportion of billionaires to millionaires in Pareto's income formula, so the proportion of big changes to small changes in a financial price series follows a consistent pattern - and it results in wilder price swings than you might otherwise expect.

Rephrase this in the language of conditional probability: Given that event X has happened, what are the odds that Y will happen next? In Pareto's case, the scaling formula means that the odds of making more than ten billion once you make more than one billion are the same as those of making more than ten million once you make more than one million.

With financial prices, scaling means that the odds of a massive price movement given a large one are akin to those of a large movement given a merely sizeable one. In both cases, the proportions are controlled by a scaling exponent, alpha.

A mind-bending paradox, to be sure. For example: have you alighted upon a stock the price of which will run and run until your profits are so vast you cannot count them? Or have you found a loser that, just as the price seems to take off, unexpectedly falls short? Are you living through a price bubble that will burst at any moment, so you should stay away? Or have the fundamental economic rules of the game changed, so that only a timid fool would not invest?

Such is the confusion of scaling. It makes decisions difficult, prediction perilous, and bubbles a certainty.

People want to see patterns in the world. It is how we evolved. We descended from those primates who were best at spotting the telltale pattern of a predator in the forest, or of food in the savannah. So important is this skill that we apply it everywhere, warranted or not. We see patterns where there are none.

Between the wars, Evgeny Slutzky, a Soviet statistician, showed how even the record of a Brownian motion - accumulation of a coin-toss game - can appear deliberate and ordered. The eye spontaneously decomposes it into up and down cycles, and then into smaller cycles that ride on the bigger cycles, and so on. Add more data, and more cycles appear. These are not real, of course. They are the mere juxtaposition of random changes.

How much more prone to spurious patterns, then, is an economic or financial price series? As described earlier, the long-range dependence in prices creates a kind of tendency in the data-not towards any particular price level, but towards price changes of a particular size or direction.

The changes can be persistent, meaning that they reinforce each other; a trend once started tends to keep going. Or they can be anti-persistent, meaning they contradict each other; a trend once begun is likely to reverse itself. The persistent variety, particularly those with an H exponent near 0.75, are especially curious; these are the type common to many financial and economic data series.

It takes no great leap of the imagination to see how such spurious patterns could also appear in otherwise random financial data. This is not to say that price charts are meaningless, or that prices all vary by the whim of luck. But it does say that, when examining price charts, we should guard against jumping to conclusions that the invisible hand of Adam Smith is somehow guiding them. It is a bold investor who would try to forecast a specific price level based solely on a pattern in the charts.

All is not hopeless. Markets are turbulent, deceptive, prone to bubbles, infested by false trends. It may well be that you cannot forecast prices. But evaluating risk is another matter entirely.

Step back a moment. The classic Random Walk model makes three essential claims. First is the so-called martingale condition: that your best guess of tomorrow's price is today's price. Second is a declaration of independence: that tomorrow's price is independent of past prices. Third is a statement of normality: that all the price changes taken together, from small to large, vary in accordance with the mild, bell-curve distribution.

In my view, that is two claims too many. The first, though not proven by the data, is at least not (much) contradicted by it; and it certainly helps, in an intuitive way, to explain why we so often guess the market wrong. But the others are simply false.

The data overwhelmingly show that the magnitude of price changes depends on those of the past, and that the bell curve is a nonsense. Speaking mathematically, markets can exhibit dependence without correlation.

The key to this paradox lies in the distinction between the size and the direction of price changes. Suppose that the direction is uncorrelated with the past: The fact that prices fell yesterday does not make them more likely to fall today. It remains possible for the absolute changes to be dependent: A 10 percent fall yesterday may well increase the odds of another 10 percent move today - but provide no advance way of telling whether it will be up or down. If so, the correlation vanishes, in spite of the strong dependence. Large price changes tend to be followed by more large changes, positive or negative. Small changes tend to be followed by more small changes. Volatility clusters.

What use is that? Plenty, if you are in the business of managing, avoiding, or profiting from risk.

Of course, you cannot predict anything with precision. Forecasting volatility is like forecasting the weather. You can measure the intensity and path of a hurricane, and you can calculate the odds of its landing; but, as anyone who lives on the U.S. Eastern Seaboard knows, you cannot predict with confidence exactly where it will land and how much damage it will do. Nevertheless, work on such meteorological ideas has begun in finance. A first step is agreeing on a way to measure the intensity and path of a market crisis.

The famous Richter Scale is the analogy most drawn upon. It measures earthquakes on a logarithmic scale; for instance, a catastrophic quake of magnitude 7 produces ten times the ground motion - and releases roughly thirty times the energy - of a merely devastating quake of magnitude 6.

What is a financial market's analog to energy? Volatility, some have surmised. Thus, two University of Paris researchers recently devised an Index of Market Shocks according to which there have been ten financial "quakes" since 1995. The Russian market crash of 1998 was a major tremor of 8.89 on the IMS scale. The biggest: the Twin Towers attack of September 2001, registering 13.42.

The next step is forecasting - but here, work is just beginning. You cannot beat the market, says the standard market doctrine. Granted. But you can sidestep its worst punches.

**10. In Financial Markets, the Idea of "Value" has Limited Value.**

I do not argue there is no such thing as intrinsic value. It remains a popular notion, and one that I myself have used in some of my economic models. But the turbulent markets of the past few decades should have taught us, at the least, that value is a slippery concept, and one whose usefulness is vastly over-rated.

So how, you ask, does one survive in such an existentialist world, a world without absolutes? People do it rather well all the time. The prime mover in a financial market is not value or price, but price differences; not averaging, but arbitraging. People arbitrage between places or times.

These arbitrage tactics assume no "intrinsic" value in the item being sold; they simply observe and forecast a difference in price, and try to profit from it. Of course, I am by no means the first to suggest the importance of arbitrage in financial theory; one of the latter-day "fixes" of orthodox finance, called Arbitrage Pricing Theory, tries to make the most of this. But a full understanding of multifractal markets begins with the realization that the mean is not golden.

It is an extraordinary feature of science that the most diverse, seemingly unrelated, phenomena can be described with the same mathematical tools.

The same quadratic equation with which the ancients drew right angles to build their temples can be used today by a banker to calculate the yield to maturity of a new two-year bond. The same techniques of calculus developed by Newton and Leibniz three centuries ago to study the orbits of Mars and Mercury can be used today by a civil engineer to calculate the maximum stress on a new bridge, or the volume of water to pass beneath it.

Now, none of this means that the bridge, river, and planets work the same way; or that an archaeologist at the Acropolis should help price an Accenture bond. Likewise, the wind and the markets are quite distinct; one is a phenomenon of nature, the other a creature of man.

But the variety of natural phenomena is boundless while, despite all appearances to the contrary, the number of really distinct mathematical concepts and tools at our disposal is surprisingly small.

When a man goes to clear a jungle he has relatively few types of tools: To cut, perhaps a machete; to knock down, a bulldozer; to burn, fire. Science is like that. When we explore the vast realm of natural and human behavior, we find our most useful tools of measurement and calculation are based on surprisingly few basic ideas.

When a man has a hammer, all he sees around him are nails to hit. So it should be no great surprise that, with our small number of effective mathematical tools, we can find analogies between a wind tunnel and a Reuters screen.

If you study the bell curve, some surprising facts arise. First, assume several games are going at once. While Harry and Tom play with a coin, their cousins are throwing dice and their friends are dealing cards. The players in each game expect a different average outcome; but for each, the graph of how their winnings per set differ from that average has the same general bell shape.

Some bells may be squatter, and some narrower. But each has the same mathematical formula to describe it, and requires just two numbers to differentiate it from any other: the mean, or average, error; and the variance, or standard deviation, a yardstick that expresses how widely the bell spreads.

Now, this is all very convenient, in fact, simpler than most situations that occur in physics. One formula that includes two numbers as parameters can describe a vast range of human experience.

The normal curve is indestructible. It is mathematical alchemy. It is what you inevitably get if you combine lots of little variations, each one independent from the last, and each one negligible when compared to the total.

No one individual matters much to the total IQ curve; no one coin toss matters much to Harry and Tom's game. But cumulatively, over time or across a population, the way the results vary forms a regular and predictable pattern. The data points are grains of sand on a shoreline, blades of grass in a lawn, electrons moving along a copper wire.

Now, this is a convenient way to look at the world, but is it the only way? Not at all.

Think of an archer standing before a target painted on an infinitely long wall. He is blindfolded and consequently shoots at random, in any direction.

Most of the time, of course, he misses. In fact, half of the time he shoots away from the wall, but let us not even record those cases. Now, had his recorded misses followed the mild pattern of a bell curve, most would be fairly close to the mark, and very few would be very wide of it.

Suppose he shot arrows long enough, in successive "sets." For each set, he could calculate an average error and standard deviation - even give himself a score for blindfolded archery. But our archer is not in the land of the bell curve; his misses are not mild. All too often, his aim is so bad that the arrow flies almost parallel to the wall and strikes hundreds of yards from the target, or even a mile, if his arm is strong enough.

Now, after each shot, let him try to work out his average target score. In the Gaussian 'normal distribution' environment, even the wildest shots have a negligible contribution to the average. After a certain number of strikes, the running average score will have settled down to one stable value, and there is practically no chance the next shot will change that average perceptibly.

But the archer's case is completely different. The largest shot will be nearly as large as the sum of all the others. One miss by a mile completely swamps 100 shots within a few yards of the target. His scores for blindfolded archery never settle down to a nice, predictable average and a consistent variation around that average. In the language of probability, his errors do not converge to a mean. They have infinite expectation, hence also infinite variance.
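The archer's misses are, in probabilists' terms, Cauchy-distributed: a uniformly random aiming angle, projected onto a wall one unit away, lands at the tangent of that angle. The following sketch (my own illustration; the sample size is arbitrary) contrasts the two worlds described above: under the bell curve the single largest miss is a negligible sliver of the total error, while for the archer one miss can rival all the others combined.

```python
import math
import random

random.seed(1)
N = 10_000

# Mild, bell-curve misses: small variations around the target.
gauss = [random.gauss(0.0, 1.0) for _ in range(N)]

# The blindfolded archer: a uniform aiming angle in (-pi/2, pi/2),
# projected onto a wall one unit away, gives a Cauchy-distributed miss.
cauchy = [math.tan(random.uniform(-math.pi / 2, math.pi / 2))
          for _ in range(N)]

def max_share(misses):
    """Fraction of the total absolute error contributed by the
    single largest miss."""
    absolute = [abs(x) for x in misses]
    return max(absolute) / sum(absolute)

print(f"Bell curve: largest miss is {max_share(gauss):.4%} of the total")
print(f"Archer:     largest miss is {max_share(cauchy):.4%} of the total")
```

Rerun the archer's experiment with a different seed and his "average miss" changes wildly; the Gaussian figures barely move. That is non-convergence of the mean, seen from the inside.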

This is a totally different way of thinking of the world than the bell curve. The errors are not distributed as near-uniform grains of sand; they are a composite of grains, pebbles, boulders, and mountains.

Such has been the argument of most mathematicians and scientists ever since: Gaussian math is easy and fits most forms of reality, or so it seems. But with the sharp hindsight provided by fractal geometry, the Gaussian case begins to look not so "normal," after all.

It was so-called only because science tackled it first; as ever in science, there is a healthy opportunism to begin with the problems easiest to handle. But the difference between the extremes of Gauss and of some forms of reality could not be greater. They amount to two different ways of seeing the world: one in which big changes are the result of many small ones, and another in which major events loom disproportionately large.

Long-term dependence is a pillar of fractal geometry. It has parallels in nature, but here are some examples of how it reverberates through finance.

In 1982 IBM, then the world's biggest computer company, decided some upstarts at Apple were threatening its future with a new product called the personal computer. Uncharacteristically, IBM acted quickly. It bypassed its own big chip factories and software departments. It picked a struggling semiconductor company named Intel to make its microprocessors and a bright but insignificant kid named Bill Gates to provide its software.

The rest is well-known: Intel and Microsoft grew wildly, beyond any imaginable bounds. IBM stumbled, and shrank. But the fates of these three companies are still intertwined. Their stock prices affect one another, as profits or troubles at one redound on the business or market-ratings of the others.

That event of three decades ago, IBM's midwifery to two new industry giants, continues to reverberate today in IBM's stock price. The dependence there is about thirty years long. One can easily imagine even longer dependence: The court-ordered breakup of John D. Rockefeller's Standard Oil Trust in 1911 continues to affect its surviving children today, ExxonMobil, ConocoPhillips, Chevron Texaco, and BP Amoco.

No one is alone in this world. No act is without consequences for others. It is a tenet of chaos theory that, in dynamical systems, the outcome of any process is sensitive to its starting point - or, in the famous cliché, the flap of a butterfly's wings in the Amazon can cause a tornado in Texas.

I do not assert markets are chaotic, though my fractal geometry is one of the primary mathematical tools of "chaology." But clearly, the global economy is an unfathomably complicated machine. To all the complexity of the physical world of weather, crops, ores, and factories, you add the psychological complexity of men acting on their fleeting expectations of what may or may not happen - sheer phantasms.

Companies and stock prices, trade flows and currency rates, crop yields and commodity futures - all are inter-related to one degree or another, in ways we have barely begun to understand. In such a world, it is common sense that events in the distant past continue to echo in the present.
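Mandelbrot's own yardstick for this kind of long memory was rescaled-range (R/S) analysis, which estimates the Hurst exponent H: near 0.5 for a series with independent increments, and above 0.5 when the past keeps echoing in the present. Below is a minimal sketch, not the book's implementation - the window sizes and sample length are my own choices, and small-sample estimates of H carry a known upward bias:

```python
import math
import random

def rescaled_range(chunk):
    """R/S statistic for one window: range of the cumulative
    deviations from the mean, divided by the standard deviation."""
    n = len(chunk)
    mean = sum(chunk) / n
    devs = [x - mean for x in chunk]
    cum, walk = 0.0, []
    for d in devs:
        cum += d
        walk.append(cum)
    r = max(walk) - min(walk)
    s = math.sqrt(sum(d * d for d in devs) / n)
    return r / s

def hurst_exponent(series, window_sizes):
    """Estimate H as the least-squares slope of log(R/S)
    against log(window size)."""
    points = []
    for n in window_sizes:
        chunks = [series[i:i + n]
                  for i in range(0, len(series) - n + 1, n)]
        avg_rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        points.append((math.log(n), math.log(avg_rs)))
    mx = sum(x for x, _ in points) / len(points)
    my = sum(y for _, y in points) / len(points)
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

random.seed(7)
noise = [random.gauss(0.0, 1.0) for _ in range(8000)]
sizes = [50, 100, 200, 400, 800]
h_noise = hurst_exponent(noise, sizes)  # memoryless: H near 0.5
print(f"H for independent noise ~ {h_noise:.2f}")
```

Applied to independent noise the estimate hovers near 0.5; applied to a persistent series - one where IBM in 1982 still matters in 2012 - it climbs toward 1.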

In the 1960s, some old-timers on Wall Street - the men who remembered the trauma of the 1929 Crash and the Great Depression - gave me a warning: "When we fade from this business, something will be lost. That is the memory of 1929." Because of that personal recollection, they said, they acted with more caution than they otherwise might. Collectively, their generation provided an in-built brake on the wildest forms of speculation, an insurance policy against financial excess and consequent catastrophe. Their memories provided a practical form of long-term dependence in the financial markets.

Is it any wonder that in 1987, when most of those men were gone and their wisdom forgotten, the market encountered its first crash in nearly sixty years? Or that, two decades later, we would see the biggest bull market, and the worst bear market, in generations? Yet standard financial theory holds that, in modeling markets, all that matters is today's news and the expectation of tomorrow's news.