
It may be that the presumed dichotomy between determinism and randomness is superficial and illusory. Determinism is the world view that events result from an unalterable causal chain. It models the world as a clock whose behavior can be inferred by scientific investigation. Stochasticity, or randomness, is the world view that uncertainty pervades experience. It models the world as a dice game with unpredictable behavior.

Many thinkers, including Einstein, Buckminster Fuller, and D’Arcy Wentworth Thompson, have argued in support of the traditional deterministic world view[1]. However, quantum mechanics, machine learning, and behavioral economics are three prominent fields that have helped realign modern thinking to apprehend that randomness and uncertainty may be fundamental and pervasive. Leonard Mlodinow, in his 2008 book The Drunkard’s Walk, goes further and argues that randomness rules our lives.

In preparing for and discussing randomness at a recent meetup of the Ben Franklin Thinking Society, I started to gravitate toward the hypothesis that uncertainty and determinism may be like inside and outside, or concave and convex. Both may be real, both partially right and partially wrong, both revelatory and misleading. It may be that each perspective is a “tuning in” to only part of a reality that is both-neither[2].

The principle of functions states that a function can always and only coexist with another function as demonstrated experimentally in all systems as the outside-inside, convex-concave, clockwise-counterclockwise, tension-compression couples.

— R. Buckminster Fuller, Synergetics 226.01

Here are several ways to see the dual and co-occurring qualities of the stochastic and deterministic models or world views.

In a deterministic model of the world, the fixed set of laws that govern everything apply to every quantum of energy and its constituents. So computing the state of the world requires applying these fixed laws to each such quantum from some initial state and iterating through each picosecond of time. Clearly, this is computationally infeasible except for the computer known as Universe itself. So any effective simulation or calculation will entail estimates and approximations, that is, randomness. Unwittingly, randomness imposes itself on the system!
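To make this concrete, here is a minimal Python sketch (the setup and numbers are my own illustration, not part of the original argument). The constant pi is perfectly determined, yet a feasible way to compute it is to sample random points and accept a small, quantifiable error:

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate the (fully deterministic) constant pi by random sampling."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()  # a random point in the unit square
        if x * x + y * y <= 1.0:           # does it land inside the quarter circle?
            inside += 1
    return 4.0 * inside / n_samples        # the area ratio approximates pi/4

print(estimate_pi(1_000_000))  # ~3.14, plus a small random error
```

More samples shrink the error, but some residual randomness always remains in the answer.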

Conversely, in a stochastic model the relationships between data are given by frequencies with respect to their sample space, the set of possible outcomes. What could be more deterministic than the elementary counting of frequencies? Indeed, probability is basically a form of advanced counting in ratios. Deterministic indeed!
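As an illustration (my own, not drawn from any particular text), here is how the probability of rolling a seven with two dice reduces to exact, deterministic counting:

```python
from itertools import product
from fractions import Fraction

# The sample space of two dice: all 36 equally likely outcomes.
sample_space = list(product(range(1, 7), repeat=2))

# The "probability" of rolling a seven is nothing but counting in ratios.
favorable = sum(1 for a, b in sample_space if a + b == 7)
p_seven = Fraction(favorable, len(sample_space))
print(p_seven)  # 1/6 -- the same exact answer every time
```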

Now consider measurement. A scientific model is built on measurable parameters, and data are measurements. Science has determined that all measurements involve uncertainty. MIT physicist Walter Lewin puts it emphatically: “any measurement that you make without any knowledge of the uncertainty is meaningless!” Measurement theory is built upon the law of error, which is a principle of the science of randomness. Hard data acquires its validity and persuasiveness from the science of chance!

The key to understanding measurement is understanding the nature of the variation in data caused by random error.
— Leonard Mlodinow
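A small Python sketch of this practice (the true value and error level are hypothetical, chosen only for illustration): repeat a noisy measurement, then report the mean together with its uncertainty, as Lewin insists we must.

```python
import random
import statistics

# Hypothetical setup: repeatedly measure a quantity whose true value is 9.81,
# with each reading perturbed by random instrument error.
rng = random.Random(0)
true_value = 9.81
readings = [true_value + rng.gauss(0, 0.05) for _ in range(100)]

mean = statistics.mean(readings)
std_err = statistics.stdev(readings) / len(readings) ** 0.5
print(f"{mean:.3f} +/- {std_err:.3f}")  # a measurement WITH its uncertainty
```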

On the other hand, the law of error is a central principle in statistics, the science of inferring probabilities from observed data. Such inference is the gold standard of scientific truth. The techniques of scientific inference are based on the mathematics of randomness. Like all mathematics, the theory is definite, rigorous, and repeatedly verified by logic, proof, and experiment. The sciences of probability and statistics are rigorous and deterministic like all mathematics!
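One such deterministic theorem about randomness, sketched below (again my own example): the spread of an average of n random draws shrinks like 1/√n, no matter how unpredictable each individual draw is.

```python
import random
import statistics

# The "law of error" in action: the spread of an average of n random draws
# shrinks like 1/sqrt(n) -- a deterministic theorem about randomness.
rng = random.Random(1)

def spread_of_mean(n: int, trials: int = 2000) -> float:
    means = [statistics.mean(rng.gauss(0, 1) for _ in range(n))
             for _ in range(trials)]
    return statistics.stdev(means)

for n in (1, 4, 16, 64):
    print(n, round(spread_of_mean(n), 3))  # roughly 1.0, 0.5, 0.25, 0.125
```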

Even in a fundamentally deterministic world, our understanding, decision-making, strategies, predictions, measurements, and designs are predicated upon uncertainty and randomness. To be effective, we must be cognizant of these lingering, unavoidable uncertainties.

Conversely, even in a fundamentally uncertain world ruled by randomness, pattern and order emerge and can be identified. To be effective, we can and should seek the design and structure permeating the apparent randomness.
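A classic illustration, sketched here in Python (my own example): individual coin flips are maximally unpredictable, yet the ensemble of many flips reliably traces out the familiar bell curve.

```python
import random
from collections import Counter

# Order out of randomness: sum 100 random coin flips, many times over.
# Each run is unpredictable, yet the ensemble forms a bell-shaped histogram.
rng = random.Random(7)
sums = [sum(rng.choice((0, 1)) for _ in range(100)) for _ in range(10_000)]

hist = Counter(sums)
for heads in range(35, 66, 5):
    print(f"{heads:3d} heads: {'#' * (hist[heads] // 20)}")
```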

From these considerations, I conclude that randomness and determinism always and only coexist. They are inseparable. Each provides a spectacular, incisive perspective on reality. The careful thinker or practitioner should be fluent in using both types of models to get a more holistic, more complete picture of the world in which we find ourselves. This is evidence that both-neither should be our guiding principle in seeking truth!

Do you find the argument compelling? Is it sound? Can you help me improve it? Do you see other ways in which these two models interpenetrate and interaccommodate? How do you see the interrelationship between determinism and randomness?

To better develop my understanding of a more complete set of models (beyond superficial determinism vs. stochasticity), I am excited about Scott E. Page’s new online video course on Model Thinking, which has just begun. I think we need many diverse models to sharpen our thinking and uncover subtleties in the complex systems and theories upon which our civilization is built. I am looking forward to wrapping my head around the 21 or so models in this course. You can register for the Model Thinking course by filling out the form at http://www.modelthinker-class.org/.

So if you want to be out there helping to change the world in useful ways, it’s really really helpful to have some understanding of models.
— Scott E. Page

Finally, here are three good audio-visual resources that explore issues of randomness further:


[1] See my previous essay on randomness, where arguments for determinism are discussed.
[2] Credit to Tom Miller for the wonderful expression both-neither.
