
Cybernetics: Or Control and Communication in the Animal and the Machine: Preface by Doug Hill


Cybernetics #

Let’s imagine that you stumbled upon a grand unified theory that explains control mechanisms in all things - living or automaton. The promise is that you can use this theory to understand the underpinnings of all control-and-coordination mechanisms - micro and macro. Not only do you understand them, but you can also use that know-how to optimize and improve on them for, you know, fun and profit! Like all grand schemes, this was a pipe-dream; like all pipe-dreams, the research’s salvage value opened the flood-gates to inventions, discoveries, and alternate viewpoints.

I am talking about Cybernetics.

Cybernetics was the forerunner of advances in fields such as Systems Thinking, Operational Research, Control Systems, Governance, and many more. It is one of the rare areas that had both politicians and scientists excited about its potential. Cybernetics is a general field that abstracts the concept of control. The abstract idea of control, especially when extended to Governance, resonated with many who ran the State machinery. From the Soviet Union to Chile, countries experimented with Cybernetics, working in tandem with the field’s pioneers.

Is there a single piece of literature that can be called the source of Cybernetics? If there is, it is the book by Norbert Wiener - Cybernetics: Or Control and Communication in the Animal and the Machine.

Doug Hill, author of the book “Not So Fast: Thinking Twice About Technology”, writes a foreword for one of the editions of the book. It is more cautionary advice than an introduction. In the foreword, he echoes Norbert Wiener’s skeptical stance on the hazards of technological development. Doug writes about Norbert’s troubled and precocious childhood. He also laments the misuse of Social Media against Democratic Institutions and the Media, measured against Norbert’s skepticism about new technologies.

Childhood #

Norbert Wiener was an academically precocious child. By the time he was ten, he had studied chemistry, geometry, physics, botany, and zoology; by the age of eleven, he had entered college. He earned his undergraduate degree in mathematics from Tufts at fourteen, spent a year studying philosophy at Cornell, and earned his doctorate from Harvard at seventeen with a thesis on mathematical logic. Leo Wiener, his father, taught him at home - an education that, according to Norbert, entailed a regimen of “systematic belittling”. He identified himself as an outsider, saying, “if I was not to be welcomed, well then, let me be too dangerous to be ignored.” Claude Shannon, one of Norbert’s contemporaries, is well-known in communications because of his pioneering work. Norbert’s work, along similar lines but more general, is receiving belated recognition among AI scientists. Doug describes Norbert as being a bit neurotic and insecure about his ideas being misattributed.

Foreword #

Doug, in his foreword to the book, warns us about the perils of technological progress. He argues that technological progress dents employment through automation. He also argues that the control aspect of Cybernetics is ruinous to the Democratic State Machinery: in the hands of evil statesmen, Cybernetic theory becomes another cog in the State’s control apparatus. The speed and the literal-minded application of technology only add fuel to the fire. He cites mishaps involving driver-less cars and the 2010 stock market crash as examples of how technology and its application have harmed the human condition. Doug bases his apprehensions on Norbert’s skepticism about human nature.

Whenever I come across a discourse about the dangers of technology, I am reminded of a prescient parable - the cautionary tale of Frankenstein. Like Victor Frankenstein, many inventors and scientists are caught unaware by the consequences of their own creations. At its best, technological advancement brings about a paradigm shift in the welfare of the human condition; at its worst, it opens a Pandora’s box that threatens humanity’s existence. Galvanism inspired Frankenstein; Galvanism was perhaps a harbinger of the Cybernetic movement. Cory Doctorow, writing about Frankenstein in his essay “I have created a Monster (and So Can You)”, offers a theory about technological progress, one he calls the “Adjacent Possible”: what is possible “adjacent” to what we already have in place?

“But technology doesn’t control people: people wield technology to control other people.”

So what is the advice for the young scientist or inventor? The young inventor is caught between two groups - the Muses and the Oracles. While the Muses promise adventures and the fruits of research, the Oracles caution about the moral implications. If the Muses are subdued, we cannot have any progress. If, on the other hand, the Oracles are ignored, the young inventor is weighed down by the guilt of a preventable mishap. The cost-benefit analysis is necessary but tedious.

My opinion aligns more with Cory’s than Doug’s. Doug’s argument undermines the agency of the people using technology. Consider nuclear energy. On the one hand, scientists made bombs that destroyed cities. In a chilling video, J. Robert Oppenheimer, a scientist who worked on the Atomic Bomb, recalled how he felt about his creation by taking a leaf out of the Bhagavad Gita:

Now I am become Death, the destroyer of worlds.

On the other hand, the controlled use of nuclear energy generates electricity from fuel whose energy density is several orders of magnitude greater than that of traditional fuels. The quote from the Spider-Man movie puts it best:

With Great Power comes Great Responsibility.