Luddites RULE!

Momma said there’d be days like this! (Apologies to songwriters Luther Dixon and Willie Denson, and, of course, the Geico Caveman.) Linda Bucklin/Shutterstock

7 February 2019 – This is not the essay I’d planned to write for this week’s blog. I’d planned a long-winded, abstruse dissertation on the use of principal component analysis to glean information from historical data in chaotic systems. I actually got most of that one drafted on Monday, and planned to finish it up Tuesday.

Then, bright and early on Tuesday morning, before I got anywhere near the incomplete manuscript, I ran headlong into an email issue.

Generally, I start my morning by scanning email to winnow out the few valuable bits buried in the steaming pile of worthless refuse that has accumulated in my Inbox since the last time I visited it. Then, I visit a couple of social media sites in an effort to keep my name in front of the Internet-entertained public. After a couple of hours of this colossal waste of time, I settle in to work on whatever actual work I have to do for the day.

So, finding that my email client software refused to communicate with me threatened to derail my whole day. The fact that I use email for all my business communications made it especially urgent that I determine what was wrong, and then fix it.

It took the entire morning and on into the early afternoon to realize that there was no way I was going to get to that email account on my computer, and that nobody in the outside world (not my ISP, not the cable company that went that extra mile to bring Internet signals from that telephone pole out there to the router at the center of my local area network, nor anyone else available with more technosavvy than I have) was going to be able to help. I was finally forced to invent a workaround involving a legacy computer that I’d neglected to throw in the trash, just to get on with my technology-bound life.

At that point the Law of Deadlines forced me to abandon all hope of getting this week’s blog posting out on time, and move on to completing final edits and distribution of that press release for the local art gallery.

That wasn’t the last time modern technology let me down. In discussing a recent Physics Lab SNAFU, Danielle, the laboratory coordinator I work with at the University said: “It’s wonderful when it works, but horrible when it doesn’t.”

Where have I heard that before?

The SNAFU Danielle was lamenting happened last week.

I teach two sections of General Physics Laboratory at Florida Gulf Coast University, one on Wednesdays and one on Fridays. The lab for last week had students dropping a ball, then measuring its acceleration using a computer-controlled ultrasonic detection system as it (the ball, not the computer) bounces on the table.

For the Wednesday class everything worked nearly perfectly. Half a dozen teams each had their own setups, and all got good data, beautiful-looking plots, and automated measurements of position and velocity. The computers then automatically derived accelerations from the velocity data. The one team that had trouble with their computer recovered by switching to an unused setup nearby.

That was Wednesday.

Come Friday the situation was totally different. Out of four teams, only two managed to get data that looked even remotely like it should, and one of those couldn’t get their computer to spit out accelerations that made any sense at all. Eventually, after class time ran out, the one group that had managed to get good results agreed to share their information with the rest of the class.

The high point of the day was managing to distribute that data to everyone via the school’s cloud-based messaging service.

Concerned about another fiasco, after this week’s lab Danielle asked me how it worked out. I replied that, since the equipment we use for this week’s lab is all manually operated, there were no problems whatsoever. “Humans are much more capable than computers,” I said. “They’re able to cope with disruptions that computers have no hope of dealing with.”

The latest example of technology Hell appeared in a story in this morning’s (2/7/2019) Wall Street Journal. Some $136 million of customers’ cryptocurrency holdings became stuck in an electronic vault when the founder (and sole employee) of cryptocurrency exchange QuadrigaCX, Gerald Cotten, died of complications related to Crohn’s disease while building an orphanage in India. The problem is that Cotten was so secretive about passwords and security that nobody, not even his wife, Jennifer Robertson, can get into the reserve account maintained on his laptop.

“Quadriga,” according to the WSJ account, “would need control of that account to send those funds to customers.”

No lie! The WSJ attests this bizarre tale is God’s own truth!

Now, I’ve no sympathy for cryptocurrency mavens, whom I consider to be, at best, technoweenies gleefully leading a parade down the primrose path to technology Hell, but this story illustrates what that Hell looks like!

It’s exactly what the Luddites of the early 19th Century warned us about. It’s a place of nameless frustration and unaccountable loss that we’ve brought on ourselves.

Farsighted Decisions

"Farsighted" book cover
Farsighted: How We Make the Decisions That Matter the Most by Steven Johnson

30 January 2019 – This is not a textbook on decision making.

Farsighted: How We Make the Decisions That Matter the Most does cover most of the elements of state-of-the-art decision making, but it’s not a true textbook. If he’d really wanted to write a textbook, its author, Steven Johnson, would have structured it differently, and would have included exercises for the student. Perhaps he would also have done other things differently that I’m not going to enumerate because I don’t want to write a textbook on state-of-the-art decision making, either.

What Johnson apparently wanted to do, and did do successfully, was lay down a set of principles today’s decision makers would do well to follow.

Something he would have left out, if he were writing a textbook, was the impassioned plea for educators to incorporate mandatory decision making courses into secondary-school curricula. I can’t disagree with this sentiment!

A little bit about my background with regard to decision-theory education: ‘Way back in the early 2010s, I taught a course at a technical college entitled “Problem Solving Theory.” Johnson’s book did not exist then, and I wish that it had. The educational materials available at the time fell woefully short. They were, at best, pedantic.

I spent a lot of class time waving my hands and telling stories from my days as a project manager. Unfortunately, the decision-making techniques I learned about in MBA school weren’t of any help at all. Some of the research Johnson weaves into his narrative hadn’t even been done back then!

So, when I heard about Johnson’s new book, I rushed out to snag a copy and devoured it.

As Johnson points out, everybody is a decision maker every day. These decisions run the gamut from snap decisions that people have to make almost instantly, to long-term deliberate choices that reverberate through the rest of their lives. Many, if not most, people face making decisions affecting others, from children to spouses, siblings and friends. Some of us participate in group decision making that can have truly global ramifications.

In John McTiernan’s 1990 film The Hunt for Red October, Admiral Josh Painter points out to CIA analyst Jack Ryan: “Russians don’t take a dump, son, without a plan. Senior captains don’t start something this dangerous without having thought the matter through.”

It’s not just Russians, however, who plan out even minor actions. And, senior captains aren’t the only ones who don’t start things without having thought the matter through. We all do it.

As Johnson points out, it may be the defining characteristic of the human species, which he likes to call Homo prospectus for its ability to apply foresight to advance planning.

The problem, of course, is the alarming rate at which we screw it up. As John F. Kennedy’s failure in the Bay of Pigs invasion shows, even highly intelligent, highly educated and experienced leaders can get it disastrously wrong. Johnson devotes considerable space to enumerating the taxonomy of “things that can go wrong.”

So, decision making isn’t just for leaders, and it’s easier to get it wrong than to do it right.

Enumerating the ways it can all go disastrously wrong and setting out principles that will help us get it right are the basic objectives Johnson set for himself when he first decided to write this book. To wit, three goals:

  • Convince readers that it’s important;

  • Warn folks of how easily it can be done wrong; and

  • Give folks a prescription for doing it right.

Pursuant to the third goal, Johnson breaks decision making down into a process involving three steps:

Mapping consists of gathering preliminary information about the state of the Universe before any action has been taken. What do we have to work with? What options do we have to select from? What do we want to accomplish and when?

Predicting consists of prognosticating, for each of the various options available, how the Universe will evolve from now into the foreseeable (and possibly unforeseeable) future. This is probably the most fraught stage of the process. Do we need a Plan B in case of surprises? As Sean Connery’s “Mac” character intones in Jon Amiel’s 1999 crime drama Entrapment: “Trust me, there always are surprises.”

Deciding is the final step of the process. It consists of choosing among the previously identified alternatives based on the predicted results. Which alternative is most likely to give us the result we want?
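To make that three-step shape concrete, here’s a minimal toy sketch in Python. It’s my own illustration, not anything from Johnson’s book, and every option name, probability, and payoff in it is invented:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Option:
    name: str
    # "Predicting": each imagined outcome with its estimated probability.
    outcomes: Dict[str, float]

def decide(options: List[Option], value: Callable[[str], float]) -> Option:
    """Pick the option whose predicted outcomes have the highest expected value."""
    return max(
        options,
        key=lambda opt: sum(p * value(o) for o, p in opt.outcomes.items()),
    )

# "Mapping": enumerate what we could do and which results matter.
options = [
    Option("launch now", {"big win": 0.3, "flop": 0.7}),
    Option("wait a year", {"modest win": 0.8, "flop": 0.2}),
]
payoff = {"big win": 100.0, "modest win": 40.0, "flop": -20.0}

# "Deciding": expected values work out to 16 vs. 28.
best = decide(options, lambda outcome: payoff[outcome])
print(best.name)  # -> wait a year
```

Real decisions resist this kind of tidy quantification, of course; the sketch only captures the shape of the process. Johnson’s advice about surprises amounts to asking how far you trust those probabilities.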

An important technique Johnson recommends building your decision-making strategy around is narrative. That explicitly means storytelling. Johnson supplies numerous examples from both fiction and non-fiction that help us understand the decision-making process and apply it to the problems we face.

He points out that double-blind clinical trials were the single most important technique that advanced medicine from quackery and the witch-doctor’s art to reliable medical science. They allowed trying out various versions of medical interventions in a systematic way and comparing the results. In the same way, he says, fictional storytelling allows us to mentally “try out” multiple alternative versions of future history.

Through storytelling, we explore various possibilities and imagine how they might turn out, including the vicissitudes of Shakespeare’s “slings and arrows of outrageous fortune,” without putting in the time and effort to try them out in reality, and thereby likely suffering “the fuss of mass destruction and death.”

Johnson suggests that’s why humans evolved the desire and capacity to create such fictional narratives in the first place. “When we read these novels,” he says, “ … we are not just entertaining ourselves; we are also rehearsing for our own real-world experiences.”

Of course, while “deciding” is the ultimate act of Johnson’s process, it’s never the end of the story in real life. What to do when it all goes disastrously wrong is always an important consideration. Johnson actually covers that as an important part of the “predicting” step. That’s when you should develop Mac’s “Plan B pack” and figure out when to trigger it if necessary.

Another important consideration, which I covered extensively in my problem-solving course and which Johnson starts looking at ‘way back in “mapping,” is how to live with the aftermath of your decision, whether it’s a resounding success or a disastrous failure. Either way, the Universe is changed forever by your decision, and you and everyone else will have to live in it.

So, your ultimate goal should be deciding how to make the Universe a better place in which to live!

Robots Revisited

Engineer using monitoring system software to check and control SCARA welding robots in a digital manufacturing operation. PopTika/Shutterstock

12 December 2018 – I was wondering what to talk about in this week’s blog posting, when an article bearing an interesting-sounding headline crossed my desk. The article, written by Simone Stolzoff of Quartz Media was published last Monday (12/3/2018) by the World Economic Forum (WEF) under the title “Here are the countries most likely to replace you with a robot.”

I generally look askance at organizations with grandiose names that include the word “World,” figuring that they likely are long on megalomania and short on substance. Further, this one lists the inimitable (thank God there’s only one!) Al Gore on its Board of Trustees.

On the other hand, David Rubenstein is also on the WEF board. Rubenstein usually seems to have his head screwed on straight, so that’s a positive sign for the organization. Therefore, I figured the article might be worth reading and should be judged on its own merits.

The main content is summarized in two bar graphs. The first lists the number of robots per thousand manufacturing workers in various countries. The highest scores go to South Korea and Singapore. In fact, three of the top four are Far Eastern countries. The United States comes in around number seven.

The second applies a correction to the graphed data to reorder the list by taking into account the countries’ relative wealth. There, the United States comes in dead last among the sixteen countries listed. East Asian countries account for all of the top five.

The take-home lesson from the article is conveniently stated in its final paragraph:

The upshot of all of this is relatively straightforward. When taking wages into account, Asian countries far outpace their western counterparts. If robots are the future of manufacturing, American and European countries have some catching up to do to stay competitive.

This article, of course, got me started thinking about automation and how manufacturers choose to adopt it. It’s a subject that was a major theme throughout my tenure as Chief Editor of Test & Measurement World and constituted the bulk of my work at Control Engineering.

The graphs certainly support the conclusions expressed in the cited paragraph’s first two sentences. The third sentence, however, is problematical.

That ultimate conclusion is based on accepting that “robots are the future of manufacturing.” Absolute assertions like that are always dangerous. Seldom is anything so all-or-nothing.

Predicting the future is epistemological suicide. Whenever I hear such bald-faced statements I recall Jim Morrison’s prescient statement: “The future’s uncertain and the end is always near.”

The line was prescient because a little over a year after the song’s release, Morrison was dead at age twenty-seven, thereby fulfilling the slogan expressed by John Derek’s “Nick Romano” character in Nicholas Ray’s 1949 film Knock on Any Door: “Live fast, die young, and leave a good-looking corpse.”

Anyway, predictions like “robots are the future of manufacturing” are generally suspect because, in the chaotic Universe in which we live, the future is inherently unpredictable.

If you want to say something practically guaranteed to be wrong, predict the future!

I’d like to offer an alternate explanation for the data presented in the WEF graphs. It’s based on my belief that American Culture usually gets things right in the long run.

Yes, that’s the long run in which economist John Maynard Keynes pointed out that we’re all dead.

My belief in the ultimate vindication of American trends is based, not on national pride or jingoism, but on historical precedents. Countries that have bucked American trends often start out strong, but ultimately fade.

An obvious example is the trendy Japanese management techniques, based on Druckerian principles, that were so much in vogue during the last half of the twentieth century. Folks imagined such techniques were going to drive the Japanese economy to pre-eminence in the world. Management consultants touted such principles as the future of corporate governance without noticing that, while they were great for middle management, they were useless for strategic planning.

Japanese manufacturers beat the crap out of U.S. industry for a while, but eventually their economy fell into a prolonged recession characterized by economic stagnation and disinflation so severe that even negative interest rates couldn’t restart it.

Similar examples abound, which is why our little country with its relatively minuscule population (4.3% of the world’s) has by far the biggest GDP in the world. China, with more than four times our population, still grosses only about two-thirds of what we do.

So, if robotic adoption is the future of manufacturing, why are we so far behind? Assuming we actually do know what we’re doing, as past performance would suggest, the answer must be that the others are getting it wrong. Their faith in robotics as a driver of manufacturing productivity may be misplaced.

How could that be? What could be wrong with relying on technological advancement as the driver of productivity?

Manufacturing productivity is calculated as the total dollar value of the stuff produced divided by the number of worker-hours needed to produce it. That should tell you something about what it takes to produce stuff. It’s all about human worker involvement.

Folks who think robots automatically increase productivity are fixating on the denominator in the productivity calculation. Even making the same amount of stuff while reducing the worker-hours needed to produce it should drive productivity up fast. That’s basic arithmetic. Yet, while manufacturing has been rapidly introducing all kinds of automation over the last few decades, productivity has stagnated.
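To put numbers on that denominator fixation, here’s a minimal sketch in Python. The plant, dollar figures, and hours are all invented purely for illustration:

```python
def productivity(output_value_dollars: float, worker_hours: float) -> float:
    """Manufacturing productivity: dollar value of goods produced per worker-hour."""
    return output_value_dollars / worker_hours

# Hypothetical plant: $10M worth of goods from 100,000 worker-hours.
before = productivity(10_000_000, 100_000)  # $100 per worker-hour

# Suppose robots cut labor by 20% while output stays flat.
after = productivity(10_000_000, 80_000)    # $125 per worker-hour

print(f"before: ${before:.0f}/hr, after: ${after:.0f}/hr, gain: {after / before - 1:.0%}")
```

If automation really is cutting worker-hours yet the measured ratio stays flat, then either the hours aren’t actually falling or the value of the output isn’t keeping up. That’s the puzzle the rest of this argument chews on.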

We need to look for a different explanation.

It just might be that robotic adoption is another example of too much of a good thing. It might be that reliance on technology could prove to be less effective than something about the people making up the work force.

I’m suggesting that because I’ve been led to believe that work forces in the Far Eastern developing economies are less skillful, may have lower expectations, and are more tolerant of authoritarian governments.

Why would those traits make a difference? I’ll take them one at a time to suggest how they might.

The impression that Far Eastern populations are less skillful is not easy to demonstrate. Nobody who’s dealt with people of Asian extraction in either an educational or work-force setting would ever imagine they are at all deficient in either intelligence or motivation. On the other hand, as emerging or developing economies, those countries likely depend more on workers newly recruited from rural, agrarian settings, who are less acclimated to manufacturing and industrial environments. On this basis, one may posit that the available workers may prove less skillful in a manufacturing setting.

It’s a weak argument, but it exists.

The idea that people making up Far-Eastern work forces have lower expectations than those in more developed economies is on firmer footing. Workers in Canada, the U.S. and Europe have very high expectations for how they should be treated. Wages are higher. Benefits are more generous. Upward mobility perceptions are ingrained in the cultures.

For developing economies, not so much.

Then, we come to tolerance of authoritarian regimes. Tolerance of authoritarianism goes hand-in-hand with tolerance for the usual authoritarian vices of graft, lack of personal freedom and social immobility. Only those believing populist political propaganda think differently (which is the danger of populism).

What’s all this got to do with manufacturing productivity?

Lack of skill, low expectations and patience under authority are not conducive to high productivity. People are productive when they work hard. People work hard when they are incentivized. They are incentivized to work when they believe that working harder will make their lives better. It’s not hard to grasp!

Installing robots in a plant won’t by itself lead human workers to believe that working harder will make their lives better. If anything, it’ll do the opposite. They’ll start worrying that their lives are about to take a turn for the worse.

Maybe that has something to do with why increased automation has failed to increase productivity.

Teaching News Consumption and Critical Thinking

Teaching global media literacy to children should start when they’re young. David Pereiras/Shutterstock

21 November 2018 – Regular readers of this blog know one of my favorite themes is critical thinking about news. Another of my favorite subjects is education. So, they won’t be surprised when I go on a rant about promoting teaching of critical news consumption habits to youngsters.

Apropos of this subject, last week the BBC launched a project entitled “Beyond Fake News,” which aims to “fight back” against fake news with a season of documentaries, special reports and features on the BBC’s international TV, radio and online networks.

In an article by Lucy Mapstone, Press Association Deputy Entertainment Editor for the Independent.ie digital network, entitled “BBC to ‘fight back’ against disinformation with Beyond Fake News project,” Jamie Angus, director of the BBC World Service Group, is quoted as saying: “Poor standards of global media literacy, and the ease with which malicious content can spread unchecked on digital platforms mean there’s never been a greater need for trustworthy news providers to take proactive steps.”

Angus’ quote opens up a Pandora’s box of issues. Among them is the basic question of what constitutes “trustworthy news providers” in the first place. Of course, this is an issue I’ve tackled in previous columns.

Another issue is what would be appropriate “proactive steps.” The BBC’s “Beyond Fake News” project is one example that seems pretty sound. (Sorry if this language seems a little stilted, but I’ve just finished watching a mid-twentieth-century British film, and those folks tended to talk that way. It’ll take me a little while to get over it.)

Another sort of “proactive step” is what I’ve been trying to do in this blog: provide advice about what steps to take to ensure that the news you consume is reliable.

A third is providing rebuttals of specific fake-news stories, which is what pundits on networks like CNN and MSNBC try (with limited success, I might say) to do every day.

The issue I hope to attack in this blog posting is the overarching concern in the first phrase of the Angus quote: “Poor standards of global media literacy, … .”

Global media literacy can only be improved the same way any lack of literacy can be improved, and that is through education.

Improving global media literacy begins with ensuring a high standard of media literacy among teachers. Teachers can only teach what they already know. Thus, a high standard of media literacy must start in college and university academic-education programs.

While I’ve spent decades teaching at the college level, and so have plenty of experience, I’m not actually qualified to teach other teachers how to teach. I’ve only taught technical subjects, and the education required to teach technical subjects centers on the technical subjects themselves. The art of teaching is (or at least was when I was at university) left to the student’s ability to mimic what their teachers did, informal mentoring by fellow teachers, and good-ol’ experience in the classroom. We were basically dumped into the classroom and left to sink or swim. Some swam, while others sank.

That said, I’m not going to try to lay out a program for teaching teachers how to teach media literacy. I’ll confine my remarks to making the case that it needs to be done.

Teaching media literacy to schoolchildren is especially urgent because the media-literacy projects I keep hearing about are aimed at adults “in the wild,” so to speak. That is, they’re aimed at adult citizens who have already completed their educations and are out earning livings, bringing up families, and participating in the political life of society (or ignoring it, as the case may be).

I submit that’s exactly the wrong audience to aim at.

Yes, it’s the audience that is most involved in media consumption. It’s the group of people who most need to be media literate. It is not, however, the group that we need to aim media-literacy education at.

We gotta get ‘em when they’re young!

Like any other academic subject, the best time to teach people good media-consumption habits is before they need to have them, not afterwards. There are multiple reasons for this.

First, children need to develop good habits before they’ve developed bad habits. It saves the dicey stage of having to unlearn old habits before you can learn new ones. Media literacy is no different. Neither is critical thinking.

Most of the so-called “fake news” appeals to folks who’ve never learned to think critically in the first place. They certainly try to think critically, but they’ve never been taught the skills. Of course, those critical-thinking skills are a prerequisite to building good media-consumption habits.

How can you get in the habit of thinking critically about news stories you consume unless you’ve been taught to think critically in the first place? I submit that the two skills are so intertwined that the best strategy is to teach them simultaneously.

And, it is most definitely a habit, like smoking, drinking alcohol, and being polite to pretty girls (or boys). It’s not something you can just tell somebody to do, then expect they’ll do it. They have to do it over and over again until it becomes habitual.

‘Nuff said.

Another reason to promote media literacy among the young is that’s when people are most amenable to instruction. Human children are pre-programmed to try to learn things. That’s what “play” is all about. Acquiring knowledge is not an unpleasant chore for children (unless misguided adults make it so). It’s their job! To ensure that children learn what they need to know to function as adults, Mommy Nature went out of her way to make learning fun, just as she did with everything else humans need to do to survive as a species.

Learning, having sex, and taking care of babies are all things humans have to do to survive, so Mommy Nature puts systems in place to make them fun, and so drive humans to do them.

A third reason we need to teach media literacy to the young is that, like everything else, you’re better off learning it before you need to practice it. Nobody in their right mind teaches a novice how to drive a car by sending them out into city traffic. High schools all have big, tortuously laid-out parking lots to give novice drivers a safe, challenging place to practice the basic skills of starting, stopping and turning before they have to perform those functions while dealing with fast-moving Chevys coming out of nowhere.

Similarly, you want students to practice deciphering written and verbal communications before asking them to parse a Donald-Trump speech!

The “Call to Action” for this editorial piece is thus, “Agitate for developing good media-consumption habits among schoolchildren along with the traditional Three Rs.” It starts with making the teaching of media literacy part of K-12 teacher education. It also includes teaching critical thinking skills and habits at the same time. Finally, it includes holding K-12 teachers responsible for inculcating good media-consumption habits in their students.

Yes, it’s important to try to bring the current crop of media-illiterate adults up to speed, but it’s more important to promote global media literacy among the young.

The Case for Free College

College cost vs. family income: while the need for skilled workers to maintain our technology edge has grown, the cost of training those workers has grown astronomically.

6 June 2018 – We, as a nation, need to extend the present system that provides free, universal education up through high school to cover college to the baccalaureate level.

DISCLOSURE: Teaching is my family business. My father was a teacher. My mother was a teacher. My sister’s first career was as a teacher. My brother-in-law was a teacher. My wife is a teacher. My son is a teacher. My daughter-in-law is a teacher. Most of my aunts and uncles and cousins are or were teachers. I’ve spent a lot of years teaching at the college level, myself. Some would say that I have a conflict of interest when covering developments in the education field. Others might argue that I know whereof I speak.

Since WW II, there has been a growing realization that the best careers go to those with at least a bachelor’s degree in whatever field they choose. Yet, at the same time, society has (perhaps inadvertently, although I’m not naive enough to think there isn’t plenty of blame to go around) erected a monumental barrier to anyone wanting to get an education. Since the mid-1970s, the cost of higher education has vastly outstripped the ability of most people to pay for it.

In 1975, the cost of college attendance was about one fifth of the median family income (see graph above). In 2016, it was over a third. That makes sending kids to college a whole lot harder than it used to be. If your family happens to have less than the median household income, that barrier looks even higher, and it keeps growing.
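A quick back-of-the-envelope check, using only the two fractions quoted above, shows how much the relative burden has grown:

\[
\frac{\left(\text{cost}/\text{income}\right)_{2016}}{\left(\text{cost}/\text{income}\right)_{1975}} \gtrsim \frac{1/3}{1/5} = \frac{5}{3} \approx 1.7
\]

In other words, measured against family income, a year of college costs at least 70% more than it did in 1975, and that’s before considering families below the median.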

MORE DISCLOSURE: The reason I don’t have a Ph.D. today is that two years into my Aerospace Engineering Ph.D. program, Arizona State University jacked up the tuition beyond my (not inconsiderable at the time) ability to pay.

I’d like everyone in America to consider the following propositions:

  1. A bachelor’s degree is the new high-school diploma;
  2. Having an educated population is a requirement for our technology-based society;
  3. Without education, upward mobility is nearly impossible;
  4. Ergo, it is a requirement for our society to ensure that every citizen capable of getting a college degree gets one.

EVEN MORE DISCLOSURE: Horace Mann, often credited as the Father of Public Education, was born in the same town (Franklin, MA) that I was, and our family charity is a scholarship fund dedicated to his memory.

About Mann’s intellectual progressivism, the historian Ellwood P. Cubberley said: “No one did more than he to establish in the minds of the American people the conception that education should be universal, non-sectarian, free, and that its aims should be social efficiency, civic virtue, and character, rather than mere learning or the advancement of sectarian ends.” (source: Wikipedia)

The Wikipedia article goes on to say: “Arguing that universal public education was the best way to turn unruly American children into disciplined, judicious republican citizens, Mann won widespread approval from modernizers, especially in the Whig Party, for building public schools. Most states adopted a version of the system Mann established in Massachusetts, especially the program for normal schools to train professional teachers.”

That was back in the mid-nineteenth century. At that time, the United States was in the midst of a shift from an agrarian to an industrial economy. We’ve since completed that transition and are now shifting to an information-based economy. In the future, full participation in the workforce will require everyone to have at least a bachelor’s degree.

So, when progressive politicians, like Bernie Sanders, make noises about free universal college education, YOU should listen!

It’s about time we, as a society, owned up to the fact that times have changed a lot since the mid-nineteenth century. At that time, universal free education to about junior high school level was considered enough. Since then, it was extended to high school. It’s time to extend it further to the bachelor’s-degree level.

That doesn’t mean shutting down Ivy League colleges. For those who can afford them, private and for-profit colleges can provide superior educational experiences. But publicly funded four-year colleges offering tuition-free education to everyone have become a strategic imperative.