16 January 2019 – The poster child for rampant nationalism is Hitler’s National Socialist German Workers’ Party, commonly called the Nazi Party. I say “is” rather than “was” because, while resoundingly defeated by the Allies in 1945, the Nazi Party still has widespread appeal in Germany and throughout the world.
These folks give nationalism a bad name, leading the Oxford Living Dictionary to give primacy to the following definition of nationalism: “Identification with one’s own nation and support for its interests, especially to the exclusion or detriment of the interests of other nations.” [Emphasis added.]
The Oxford Dictionary also offers a second definition of nationalism: “Advocacy of or support for the political independence of a particular nation or people.”
This second definition is a lot more benign, and one that I wish were more often used. I certainly prefer it!
Nationalism under the first definition has been used since time immemorial as an excuse to create closed, homogeneous societies. That was probably the biggest flaw of the Nazi state(s). Death camps, ethnic cleansing, slave labor, and most of the other evils of those regimes flowed directly from their attempts to build closed, homogeneous societies.
Under the second definition, however, nationalism can, and should, be used to create a more diverse society.
That’s a good thing, as the example of United States history clearly demonstrates. Much of U.S. success can be traced directly to the country’s ethnic, cultural and racial diversity. The fact that the U.S., with a paltry 5% of the world’s population, now has by far the largest economy; that it dominates the fields of science, technology and the humanities; that its common language (American English) is fast becoming the “lingua franca” of the entire world; and that it effectively leads the world by so many measures can be attributed directly to the continual renewal of its population diversity by immigration. In any of these areas, it’s easy to point out major contributions from recent immigrants or other minorities.
This harkens back to a theory of cultural development I worked out in the 1970s. It starts with the observation that all human populations – no matter how large or how small – consist of individuals whose characteristics vary somewhat. When visualized on a multidimensional scatter plot, populations generally consist of a cluster with a dense center and fewer individuals farther out.
This pattern is similar to the image of a typical globular star cluster in the photo at right. Globular star clusters exhibit this pattern in three dimensions, while human populations exist and can be mapped on a great many dimensions representing different characteristics. Everything from physical characteristics like height, weight and skin color, to non-physical characteristics like ethnicity and political ideology – essentially anything that can be measured – can be plotted as a separate dimension.
The dense center of the pattern consists of individuals whose characteristics don’t stray too far from the norm. Everyone, of course, is a little off average. For example, the average white American female is five-feet, four-inches tall. Nearly everyone in that population, however, is a little taller or shorter than exactly average. Very few are considerably taller or shorter, with more individuals closer to the average than farther out.
The population’s diversity shows up as a widening of the pattern. That is, diversity is a measure of how often individuals appear farther out from the center.
Darwin’s theory of natural selection posits that where the population center sits depends on which characteristics are best suited to prevailing conditions. What the average height is, for example, depends on a complex interplay of conditions, including nutrition, attractiveness to the opposite sex, and so forth.
Observing that conditions change with time, one expects the ideal center of the population to move about in the multidimensional characteristics space. Better childhood nutrition, for example, should push the population toward greater average height. And it does!
One hopes that these changes happen slowly with time, giving the population a chance to follow in response. If the changes happen too fast, however, the population is unable to respond fast enough and it goes extinct. Woolly mammoths, for example, could not respond fast enough to the combination of environmental change and increased predation by humans migrating into North America after the last Ice Age, so they died out. No more woolly mammoths!
Assuming whatever changes occur happen slowly enough, those individuals in the part of the distribution better adapted to the new conditions do better than those on the opposite side. So, the whole population shifts with time toward characteristics that are better adapted.
Where diversity comes into this dynamic is by providing more individuals in the better-adapted part of the distribution. The faster conditions change, the more individuals you need at the edges of the population to help with the response. For example, if the climate gets warmer, it’s folks who like to wear skimpy outfits who thrive. Folks who insist on covering themselves in heavy clothing don’t do so well. That was amply demonstrated when Englishmen tried to wear their heavy Elizabethan outfits in the warmer North American climate. Styles changed practically overnight!
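The dynamic described above can be illustrated with a toy simulation (my own sketch, not part of the original theory; the trait values and selection rule are arbitrary assumptions). Two populations start centered on the same trait value, one more diverse than the other; when the optimum trait value suddenly shifts, the more diverse population tracks it faster, because it already has individuals near the new optimum:

```python
import random
import statistics

def evolve(pop, optimum, generations=10, mutation=0.1):
    """Toy truncation selection: each generation, keep the half of the
    population closest to the optimum, then refill by copying the
    survivors with a little random variation."""
    for _ in range(generations):
        pop = sorted(pop, key=lambda trait: abs(trait - optimum))
        survivors = pop[: len(pop) // 2]
        pop = survivors + [t + random.gauss(0, mutation) for t in survivors]
    return statistics.mean(pop)

random.seed(1)
# Two populations centered on the same trait value; one is more diverse.
narrow = [random.gauss(0.0, 0.2) for _ in range(1000)]
diverse = [random.gauss(0.0, 1.0) for _ in range(1000)]

# Conditions change: the optimum trait value jumps from 0 to 2.
m_narrow = evolve(narrow, optimum=2.0)
m_diverse = evolve(diverse, optimum=2.0)

print(f"narrow population mean after selection:  {m_narrow:.2f}")
print(f"diverse population mean after selection: {m_diverse:.2f}")
```

Run it and the diverse population ends up much closer to the new optimum after the same number of generations, which is the whole point: diversity is what lets a population respond to change.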
Closed, homogeneous societies of the type the Nazis tried to create have low diversity. They try to suppress folks who differ from the norm. When conditions change, such societies have less of the diversity needed to respond, so they wither and die.
That’s why cultures need diversity, and the more diversity, the better.
We live in a chaotic universe. The most salient characteristic of chaotic systems is constant change. Without diversity, we can’t respond to that change.
That’s why, when technological change sped up in the early Twentieth Century, it was the bohemians of the twenties, developing into the beatniks of the fifties and the hippies of the sixties, who defined the cultures of the seventies and beyond.
2 January 2019 – Now that the year-end holidays are over, it’s time to get back on my little electronic soapbox to talk about an issue that scientists have had to fight with authorities over for centuries. It’s an issue that has been around for millennia, but before a few centuries ago there weren’t scientists around to fight over it. The issue rears its ugly head under many guises. Most commonly today it’s discussed as academic freedom, or freedom of expression. You might think it was definitively won for all Americans in 1791 with the ratification of the first ten amendments to the U.S. Constitution and for folks in other democracies soon after, but you’d be wrong.
The issue is wrapped up in one single word: dogma.
“A principle or set of principles laid down by an authority as incontrovertibly true.”
In 1600 CE, Giordano Bruno was burned at the stake for insisting that the stars were distant suns surrounded by their own planets, raising the possibility that these planets might foster life of their own, and that the universe is infinite and could have no “center.” These ideas directly controverted the dogma laid down as incontrovertibly true by both the Roman Catholic and Protestant Christian churches of the time.
Galileo Galilei, typically thought as the poster child for resistance to dogma, was only placed under house arrest (for the rest of his life) for advocating the less radical Copernican vision of the solar system.
Nicolaus Copernicus, himself, managed to fly under the Catholic Church’s radar for more than three decades by the simple tactic of not publishing his heliocentric model. Starting around 1510, he privately communicated it to his friends, who then passed it to some of their friends, etc. His signature work, Dē revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres), in which he laid it out for all to see, wasn’t published until his death in 1543, when he’d already escaped beyond the reach of earthly authorities.
If this makes it seem that astrophysicists have been on the front lines of the war against dogma since there was dogma to fight against, that’s almost certainly true. Astrophysicists study stuff relating to things beyond the Earth, and that traditionally has been a realm claimed by religious authorities.
That claim largely started with Christianity, specifically the Roman Catholic Church. Ancient religions, which didn’t have delusions that they could dominate all of human thought, didn’t much care what cockamamie ideas astrophysicists (then just called “philosophers”) came up with. Thus, Aristarchus of Samos suffered no ill consequences (well, maybe a little, but nothing life – or even career – threatening) from proposing the same ideas that Galileo was arrested for championing some nineteen centuries later.
Fast forward to today and we have a dogma espoused by political progressives called “climate change.” It used to be called “global warming,” but that term was laughed down decades ago, though the dogma’s still the same.
The United-Nations-funded Intergovernmental Panel on Climate Change (IPCC) has become “the Authority” laying down the principles that Earth’s climate is changing and that change constitutes a rapid warming caused by human activity. The dogma also posits that this change will continue uninterrupted unless national governments promulgate drastic laws to curtail human activity.
Sure sounds like dogma to me!
Once again, astrophysicists are on the front lines of the fight against dogma. The problem is that the IPCC dogma treats the Sun (which is what powers Earth’s climate in the first place) as, to all intents and purposes, a fixed star. That is, it assumes climate change arises solely from changes in Earthly conditions, then assumes we control those conditions.
Astrophysicists know that just ain’t so.
First, stars generally aren’t fixed. Most stars are variable stars. In fact, all stars are variable on some time scale. They all evolve over time scales of millions or billions of years, but that’s not the kind of variability we’re talking about here.
The Sun is in the evolutionary phase called “main sequence,” where stars evolve relatively slowly. That’s the source of much “invariability” confusion. Main sequence stars, however, go through periods where they vary in brightness more or less violently on much shorter time scales. In fact, most main sequence stars exhibit this kind of behavior to a greater or lesser extent at any given time – like now.
So, a modern (as in post-nineteenth-century) astrophysicist would never make the bald assumption that the Sun’s output was constant. Statistically, the odds are against it. Most stars are variables; the Sun is like most stars; so the Sun is probably a variable. In fact, it’s well known to vary with a fairly stable period of roughly 22 years (the 11-year “sunspot cycle” is actually only a half cycle).
A couple of centuries ago, astronomers assumed (with no evidence) that the Sun’s output was constant, so they started trying to measure this assumed “solar constant.” Charles Greeley Abbot, who served as the Secretary of the Smithsonian Institution from 1928 to 1944, oversaw the first long-term study of solar output.
His observations were necessarily ground based and the variations observed (amounting to 3-5 percent) have been dismissed as “due to changing weather conditions and incomplete analysis of his data.” That despite the monumental efforts he went through to control such effects.
In the 1970s I did an independent analysis of his data and realized that part of his problem stemmed from a misunderstanding of the relationship between sunspots and solar irradiance. At the time, it was assumed that sunspots were akin to atmospheric clouds. That is, scientists assumed they affected overall solar output by blocking light, thus reducing the total power reaching Earth.
Thus, when Abbot’s observations showed the opposite correlation, they were assumed to be erroneous. His purported correlations with terrestrial weather observations were similarly confused, and thus dismissed.
Since then, astrophysicists have realized that sunspots are more like a symptom of increased internal solar activity. That is, increases in sunspot activity positively correlate with increases in the internal dynamism that generates the Sun’s power output. Seen in this light, Abbot’s observations and analysis make a whole lot more sense.
We have ample evidence, from historical observations of climate changes correlating with observed variations in sunspot activity, that there is a strong connection between climate and solar variability. Most notable is the fact that the Spörer and Maunder minima (times when sunspot activity all but disappeared for extended periods) correlate with historically cold periods in Earth’s history. A similar period of low solar activity (as measured by sunspot numbers) from about 1790 to 1830, called the “Dalton Minimum,” similarly depressed global temperatures and gave an anomalously low baseline for the run-up to the Modern Maximum.
For astrophysicists, the phenomenon of solar variability is not in doubt. The questions that remain are how large the variations are, how closely they correlate with climate change, and whether they are predictable.
Studies of solar variability, however, run afoul of the IPCC dogma. For example, in May of 2017 an international team of solar dynamicists led by Valentina V. Zharkova at Northumbria University in the U.K. published a paper entitled “On a role of quadruple component of magnetic field in defining solar activity in grand cycles” in the Journal of Atmospheric and Solar-Terrestrial Physics. Their research indicates that the Sun, though its activity has been on an upswing for an extended period, should be heading into a quiescent period starting with the next maximum of the 11-year sunspot cycle, around five years from now.
That would indicate that the IPCC prediction of exponentially increasing global temperatures due to human-caused increasing carbon-dioxide levels may be dead wrong. I say “may be dead wrong” because this is science, not dogma. In science, nothing is incontrovertible.
I was clued in to this research by my friend Dan Romanchik, who writes a blog for amateur radio enthusiasts. Amateur radio enthusiasts care about solar activity because sunspots are, in fact, caused by intense magnetic fields at the Sun’s surface, and the same solar activity produces the ultraviolet and X-ray radiation that sustains the Kennelly–Heaviside layer of ionized gas in Earth’s upper atmosphere (roughly 90–150 km, or 56–93 mi, above the ground). Those magnetic fields also affect Earth by deflecting cosmic rays away from the inner solar system, which is where we live.
Radio amateurs bounce signals off this layer to reach distant stations beyond the line of sight. When solar activity is weak, the layer is more weakly ionized, reducing the effectiveness of this technique (the pursuit of such long-distance contacts is often called “DXing”).
In his post of 16 December 2018, Dan complained: “If you operate HF [the high-frequency radio bands], it’s no secret that band conditions have not been great. The reason, of course, is that we’re at the bottom of the sunspot cycle. If we’re at the bottom of the sunspot cycle, then there’s no way to go but up, right? Maybe not.”
After discussing the NOAA prediction, he went on to further complain: “And, if that wasn’t depressing enough, I recently came across an article reporting on the research of Prof. Valentina Zharkova, who is predicting a grand minimum of 30 years!”
He included a link to a presentation Dr. Zharkova made at the Global Warming Policy Foundation last October in which she outlined her research and pointedly warned that the IPCC dogma was totally wrong.
I followed the link, viewed her presentation, and concluded two things:
1. The research methods she used are ones I’m quite familiar with, having used them on numerous occasions; and
2. She used those techniques correctly, reaching convincing conclusions.
Her results seem well aligned with a meta-analysis published by the Cato Institute in 2015, which I mentioned in my posting of 10 October 2018 to this blog. The Cato meta-analysis of observational data indicated a much-reduced rate of global warming compared to that predicted by IPCC models.
The Zharkova-model data covers a much wider period (millennia-long time scale rather than decades-long time scale) than the Cato data. It’s long enough to show the Medieval Warm Period as well as the Little Ice Age (Maunder minimum) and the recent warming trend that so fascinates climate-change activists. Instead of a continuation of the modern warm period, however, Zharkova’s model shows an abrupt end starting in about five years with the next maximum of the 11-year sunspot cycle.
Don’t expect a stampede of media coverage disputing the IPCC dogma, however. A host of politicians (especially among those in the U.S. Democratic Party) have hung their hats on that dogma as well as an array of governments who’ve sold policy decisions based on it. The political left has made an industry of vilifying anyone who doesn’t toe the “climate change” line, calling them “climate deniers” with suspect intellectual capabilities and moral characters.
Again, this sounds a lot like dogma. It’s the same tactic that the Inquisition used against Bruno and Galileo before escalating to more brutal methods.
Supporters of Zharkova’s research labor under a number of disadvantages. Of course, there’s the obvious disadvantage that Zharkova’s thick Ukrainian accent limits her ability to explain her work to those who don’t want to listen. She would not come off well on the evening news.
A more important disadvantage is the abstruse nature of the applied mathematics techniques used in the research. How many political reporters and, especially, commentators are familiar enough with the mathematical technique of principal component analysis to understand what Zharkova’s talking about? This stuff makes macroeconomics modeling look like kiddie play!
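For readers curious what that technique actually does, here is a minimal, self-contained sketch of principal component analysis on made-up data (the numbers are purely illustrative assumptions, not Zharkova’s actual solar observations). PCA finds the directions along which a multivariate dataset varies most, by diagonalizing its covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a solar time series: two strongly correlated
# "observables" driven by one underlying signal plus small noise.
n = 500
base = rng.normal(size=n)
data = np.column_stack([base + 0.1 * rng.normal(size=n),
                        0.5 * base + 0.1 * rng.normal(size=n)])

# Principal component analysis: diagonalize the covariance matrix.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# The eigenvector with the largest eigenvalue is the first principal
# component: the single direction capturing the most variance.
explained = eigvals[-1] / eigvals.sum()
print(f"variance explained by first component: {explained:.2f}")
```

Because the two observables here share one underlying driver, the first component captures nearly all the variance. Zharkova’s group applied the same idea to decades of solar magnetic field data to extract a small number of dominant oscillating components.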
But, the situation’s even worse because to really understand the research, you also need an appreciation of stellar dynamics, which is based on magnetohydrodynamics. How many CNN commentators even know how to spell that?
Of course, these are all tools of the trade for astrophysicists. They’re as familiar to them as a hammer or a saw is to a carpenter.
For those in the media, on the other hand, it’s a lot easier to take the “most scientists agree” mantra at face value than to embark on the nearly hopeless task of re-educating themselves to understand Zharkova’s research. That goes double for politicians.
It’s entirely possible that “most” scientists might agree with the IPCC dogma, but those in a position to understand what’s driving Earth’s climate do not agree.
28 November 2018 – There’s a reason all modern civilized countries, at least all democracies, institutionalize separation of church and state. It’s the most critical part of the “separation of powers” mantra that the U.S. Founding Fathers repeated ad nauseam. It’s also a rant I’ve repeated time and again for at least a decade.
In my 2011 novel Vengeance is Mine! I wrote the following dialog between two people discussing what to do about a Middle-Eastern dictator from whom they’d just rescued a kidnapped woman:
“Even in Medieval Europe,” Doc grew professorial, “you had military dictatorships with secular power competing with the Catholic Church, which had enormous sectarian power.
“Modern regimes all have similar checks and balances – with separation of church and state the most important one. It’s why I get antsy when I see scientific organizations getting too cozy with governments, and why everyone gets nervous about weakness in religious organizations.
“No matter what your creed, we have to have organized religion of some kind to balance the secular power of governments.
“Islam was founded as a theocracy – both sectarian and secular power concentrated together in one or a few individuals. At the time, nobody understood the need to separate them. Most thinkers have since grown up to embrace the separation concept, realizing that the dynamic tension is needed to keep the whole culture centered, and able to respond to changing conditions.
“Fundamentalist Islam, however, has steadfastly refused to modernize. That’s why psychopaths like your Emir are able to achieve high office, with its accompanying state protection, in some Islamic countries. The only way to touch him is to topple his government, and the Manchek family isn’t going to do that.
“Unfortunately, radical Islam now seems to be gaining adherents, like Communism a hundred years ago. Eventually, Communist governments became so radicalized that they became inefficient, and collapsed under their own weight.”
“You’re comparing Islam to Communism?” Red questioned.
“Well,” Doc replied, “they may be at opposite ends of the spectrum doctrinaire-wise, but they share the same flaw.
“Communism was (and still is) an atheistic doctrine. Its answer to the question of religion is to deny the validity of religion. That kicks the pins out from under the competition.
“Since people need some sort of ethical, moral guide, they appealed to the Communist dogma. That blows the separation of church and state, again.
“There’s nobody to say, ‘naughty, naughty.’ Abuses go unchecked. Psychopaths find happy homes, and so forth. Witness Stalin.
“The problem isn’t what philosophy you have, it’s the inability to correct abuses because there aren’t separate, competing authorities.
“The strength of the American system is that there’s no absolute authority. The checks and balances are built in. Abuses happen, and can persist for a while, but eventually they get slapped down because there’s somebody around to slap them down.
“The weakness is that it’s difficult to get anything done.
“The strength is that it’s difficult to get anything done.”
In the novel, their final solution was to publicly humiliate the “Emir” in front of the “Saudi Sheik,” who then approved the Emir’s assassination.
Does that sound familiar?
The final edit of that novel was completed in 2011. Fast forward seven years and we’re now watching the aftermath of similar behavior by the Saudi Crown Prince Mohammed Bin Salman ordering the murder of dissident journalist Jamal Khashoggi. It’s interesting that authoritarian behavior is so predictable that real events so closely mimic the fiction of years before.
In a parallel development, the Republican Party today is suffering a moral implosion. Over the past two years, long-time Republicans, from senior Senators to loyal voters, have been jumping the Republican ship in droves on moral grounds.
I submit that this decline can be traced, at least in part, to the early 1980s when conservative elements of the Party forgot the meaning of “political conservatism,” and started courting the support of certain elements among Evangelical Christians. That led to adding religiously based planks (such as anti-abortion) to the Republican platform.
The elements among Evangelical Christians who responded were, of course, those who had no truck with the secular/sectarian-separation ideal. Unable to convince any but their most subservient followers of their moral rectitude (frankly because they didn’t have any, but that’s a rant for another day), those elements jumped at the chance to have the Federal Government codify their religious dogma into law.
By the way, it was an identical dynamic that led a delegation of Rabbinical Jews to talk Pontius Pilate into ordering the crucifixion of Jesus. In the end, Pilate was so disgusted by the whole proceeding that he suffered a bout of manic hand washing.
That points out the relative sophistication of the Roman culture of 2,000 years ago. Yes, the Roman emperors insisted that every Roman citizen acknowledge them to be a “god.” Unlike the Hebrew god, however, the Roman emperor was not a “jealous god.” He was perfectly willing to let his subjects worship any other god or gods they wanted to. All he required was lip-service fealty to him. And taxes. We can’t forget the taxes!
By the First Century CE, Greco-Roman civilization had been playing around with democratically based government off and on for five hundred years. They’d come to embrace religious tolerance as a good working principle that they honored in action, if not in word.
Pilate went slightly nuts over breaking the taboo against government-enforced religion because he knew it would not play well at home (in Rome). He was right. Lucius Vitellius, then Governor of Syria, deposed Pilate soon afterward and sent him home in disgrace.
Pilate was not specifically disgraced over his handling of Jesus’ crucifixion, but more generally over his handling of the internecine conflicts between competing Jewish sects of the time. One surmises that he meddled too much, taking sides when he should have remained neutral in squabbles between two-bit religious sects in a far off desert outpost.
The take-home lesson of this blog posting is that it makes no difference what religious creed you espouse, what’s important from a governance point of view is that every citizen have some moral guide separate from secular law by which to judge the actions of their political leaders.
There are, of course, some elements required of that moral guide. For example, society cannot put up with a religion that condones murder. The Thuggee cult of British-colonial India is such an example. Nor can society allow cults that encourage behaviors that threaten general order or rule of law, such as organized crime or corruption.
Especially helpful to governments are religions whose teachings promote obedience to rule of law, such as Catholicism. Democracies especially like various Protestant sects that promote individual responsibility.
Zen Buddhism, which combines Buddhist introspection with the Taoist inclusive world view, is another good foil for a democratic government. Its fundamental goal of minimizing suffering plays well with democratic ideals as well.
There are plenty of organized (as well as disorganized) religious guides out there. It’s important to keep in mind that the Founding Fathers were not trying to create an atheistic state. Separation of church and state implies the existence of both church and state, not one without the other.
7 November 2018 – During the week of 22 October 2018 two events dominated the news: Cesar Sayoc mailed fourteen pipe bombs to prominent individuals critical of Donald Trump, and Robert Bowers shot up a synagogue because he didn’t like Jews. Both of these individuals identified themselves with far-right ideology, so the media has been full of rhetoric condemning far-right activists.
To be legally correct, I have to note that, while I’ve written the above paragraph as if those individuals’ culpability for those crimes is established fact, they (as of this writing) haven’t been convicted. It’s entirely possible that some deus ex machina will appear out of the blue and exonerate one or both of them.
Clearly, things have gotten out of hand with Red Team activists when they start “throwing” pipe bombs and bullets. But, I’m here to say “naughty, naughty” to both sides.
Both sides are culpable.
I don’t want you to interpret that last sentence as agreement with Donald Trump’s idiotic statement after last year’s Charlottesville incident that there were “very fine people on both sides.”
There aren’t “very fine people” on both sides. Extremists are “bad” people no matter what side they’re on.
For example, not long ago social media sites (specifically LinkedIn and, especially, Facebook) were lit up with vitriol about the Justice Kavanaugh hearings by pundits from both the Red Team and the Blue Team. It got so hot that I was embarrassed!
Some have pointed out that, statistically, most of the actual violence has been perpetrated by the Red Team.
Does that mean the Red Team is more culpable than the Blue Team?
No. It means they’re using different weapons.
The Blue Team, which I believe consists mainly of extremists from the liberal/progressive wing of the Democratic Party, has traditionally chosen written and spoken words as their main weapon. Recall some of the political correctness verbiage used to attack free expression in the late 20th Century, and demonstrations against conservative speakers on college campuses in our own.
The Red Team, which today consists of the Trumpian remnants of the Republican Party, has traditionally chosen to throw hard things, like rocks, bullets and pipe bombs.
Both sides also attempt to disarm the other side. The Blue Team wisely attempts to disarm the Red Team by taking away their guns. The Red Team, which eschews anything that smacks of wisdom, tries to disarm the Blue Team by (figuratively, so far) burning their books.
Recognize that calling the Free Press “the enemy of the people” is morally equivalent to throwing books on a bonfire. They’re both attempts to promote ignorance.
What’s actually happening is that the fringes of society are making all of the noise, and the mass of moderate-thinking citizens can’t get a word in edgewise.
George Shultz pointed out: “He who walks in the middle of the road gets hit from both sides.”
I think it was Douglas Adams who pointed out that fanatics get to run things because they care enough to put in the effort. Moderates don’t because they don’t.
Both of these pundits point out the sad fact that Nature favors extremes. The most successful companies are those with the highest growth rates. Most drivers exceed the speed limit. The squeaky wheel gets the most grease. And, those who express the most extreme views get the most media attention.
Our Constitution specifies in no uncertain terms that the nation is founded on (small “d”) democratic principles. Democratic principles insist that policy matters be debated and resolved by consensus of the voting population. That can only be done when people meet together in the middle.
Extremists on both the Red Team and Blue Team don’t want that. They treat politics as a sporting event.
In a baseball game, for example, nobody roots for a tie. They root for a win by one team or the other.
Government is not a sporting event.
When one team or the other wins, all Americans lose.
The enemy we are facing now, which is the same enemy democracies face around the world, is not the right or left. It is extremism in general. Always has been. Always will be.
Authoritarians always go for one extreme or the other. Hitler went for the right. Stalin went for the left.
The reason authoritarians pick an extreme is that’s where there are people who are passionate enough about their ideas to shoot anyone who doesn’t agree with them. That, authoritarians realize, is the only way they can become “Dictator for Life.” Since that is their goal, they have to pick an extreme.
We love democracy because it’s the best way for “We the People” to ensure nobody gets to be “Dictator for Life.” When everyone meets in the middle (which is the only place everyone can meet), authoritarians get nowhere.
Ergo, authoritarians love extremes and everyone else needs the middle.
Vilifying “nationalism” as a Red Team vice misses the point. In the U.S. (or any similar democracy), nationalism requires more-or-less moderate political views. There’s lots of room in the middle for healthy (and ultimately entertaining) debate, but very little room at the extremes.
Try going for the middle.
To quote Victor “Animal” Palotti in Roland Emmerich’s 1998 film Godzilla: “C’mon. It’ll be fun! It’ll be fun! It’ll be fun!”
14 September 2018 – This is an extra edition of my usual weekly post on this blog. I’m writing it to tell you about an online event called “Open Future” put on by The Economist weekly newsmagazine and to encourage you to participate by visiting the URL www.economist.com/openfuture. The event is scheduled for tomorrow, 15 September 2018, but the website is already up, and some parts of the event are already live.
The newsmagazine’s Editor-in-Chief, Zanny Minton Beddoes, describes the event as “an initiative to remake the case for liberal values and policies in the 21st century.”
Now, don’t get put off by the use of the word “liberal.” These folks are Brits and, as I’ve often quipped: “The British invented the language, but they still can’t spell it or pronounce it.” They also sometimes use words to mean different things.
What The Economist calls “liberal” is not what we in the U.S. usually think of as liberal. You can get a clear idea of what The Economist refers to as “liberal” by perusing the list of seminal works in their article “The literature of liberalism.”
We in the U.S. are confused because we typically hear the word “liberal” used to describe leftist policies, what I’ll call Liberal with a capital L. Big-L Liberals have co-opted the word to refer to the agenda of the Democratic Party, which, as I’ll explain below, isn’t quite what The Economist refers to as small-L liberal.
The Economist‘s idea of liberal is more like what we usually call “libertarian.” Libertarians tend to take some ideas usually reserved for the left, and some from the right. Their main tenet, however, which is best expressed as “think for yourself,” is anathema to both ends of the political spectrum.
But, those of us in the habit of thinking for ourselves like it.
Unfortunately (or maybe not) small-L libertarianism is in danger of being similarly co-opted in the U.S. by the current big-L Libertarian Party. But, that’s a rant for another day!
What’s more important today is understanding a different way of dividing up political ideologies.
Left vs. Right
Two hundred twenty-nine years ago, the terms “The Left” and “The Right” entered political discourse as a means of classifying political parties along ideological lines. The terms arose at the start of the French Revolution, when delegates to the National Constituent Assembly still included foes of the revolution as well as its supporters.
As the ancient Greek proverb says, “birds of a feather flock together,” so supporters of revolution tended to pick seats near each other, and those against it sat together as well. Those supporting the revolution happened to sit on the left side of the hall, so those of more conservative bent gathered on the right. The terminology became institutionalized, so we now divide the political spectrum between a liberal/progressive Left and a conservative Right.
While the Left/Right dichotomy works for describing what happened during the first meeting of the French National Constituent Assembly, it poorly reflects the concepts humans actually use to manage governments. In the real world, there is an equally simple, but far more relevant way of dividing up political views: authoritarianism versus democracy.
Authoritarians are all those people (and there’s a whole bunch of them) who want to tell everybody else what to do. It includes most religious leaders, most alpha males (and females), and, in fact, just about everyone who wants to lead anything from teenage gangs to the U.N. General Assembly. Patriarchal and matriarchal families are run on authoritarian principles.
Experience, by the way, shows that authoritarianism is a lousy way to run a railroad, despite the fact that virtually every business on the planet is organized that way. Management consultants and organizational-behavior researchers pretty much universally agree that spreading decision making throughout the organization, even down to the lowest levels, makes for the most robust, healthiest companies.
If you want your factory’s floors to be clean, make sure the janitors have a say in what mops and buckets to use!
The opposite of authoritarianism is democracy. Small-D democrats don’t tell people what to do; they ask them what they (the people) want to do, and try to make it possible for them to do it. It takes a lot more savvy to balance all the conflicting desires of all those people than to petulantly insist on things being done your way, but, if you can make it work, you get better results.
Now, political discourse based on the Left/Right dichotomy is simple and easy for political parties to espouse. Big-D Democrats have a laundry list of causes they champion. Similarly, Republicans have a laundry list of what they want to promote.
Those lists, however, absolutely do not fit the democracy/authoritarianism picture. And, there’s no reason to expect them to.
Politicians, generally, want to tell other people what to do. If they didn’t, they’d go do something else. That’s the very nature of politics. Thus, by and large, politicians are authoritarians.
They dress their plans up in terms that sound like democracy because most people don’t like being told what to do. In America, we’ve institutionalized the notion that people don’t like being told what to do, so bald-faced authoritarianism is a non-starter.
It started in England with the Magna Carta, in which the English nobles told King John “enough is enough.”
Yeah, King John is the same guy as the “Prince John” who was cast as the arch-enemy of fictional hero Robin Hood. See, we don’t like authoritarians, and generally cast them as the villains in our favorite stories.
Not wanting to be told what to do was imported to North America by the English colonists, who extended the concept (eventually) to everyone regardless of socio-economic status. From there, it was picked up by the French revolutionaries, then spread throughout Europe and parts East.
So, generally, nobody wants authoritarians telling them what to do, which is why they have to point guns at us to get us to do it.
The fact that most people would simultaneously like to be the authoritarian pointing the gun and doing the telling, and a fair fraction (probably about 25%) aren’t smart enough to see the incongruity involved, gives fascist populists a ready supply of people willing to hold the guns. Nazi Germany worked (for a while) because of this phenomenon. With a population north of 60 million, those statistics gave Hitler some 15 million gun holders to work with.
In the modern U.S.A., with a population over 300 million, the same statistical analysis gives modern fascists 75 million potential recruits. And, they’re walking around with more than their fair share of the guns!
Luckily, the rest of us have guns, too.
More importantly, we all have votes.
So, what’s an American who really doesn’t want any authoritarian telling them what to do … to do?
The first thing to do is open your eyes to the flim-flam represented by the Left/Right dichotomy. As long as you buy that drivel, you’ll never see what’s really going on. It’s set up as a sporting event where you’re required to back one of two teams: the Reds or the Blues.
Whichever one you pick, you’ll end up being told what to do by either the Red-team authoritarians or the Blue-team authoritarians. Because it’s treated as a sporting event, the object is to win, and there’s nothing at stake beyond winning. There isn’t even a trophy!
The next thing to do is look for people who would like to help, but don’t actually want to tell anyone what to do. When you find them, talk them into running for office.
Since you’ve picked people who don’t really want to tell other people what to do, you’ll have to promise you won’t make them do it forever. After a while, you promise, you’ll let them off the hook so they can go do something else. That means putting term limits on elected officials.
The authoritarians, who get their jollies by telling other people what to do, won’t like that. The ones who just want to help out will be happy they can do their part for a while, then go home.