Channel: Peter Berger Archives - VTDigger

Peter Berger: You may hang up now


Editor’s note: This commentary is by Peter Berger, who has taught English and history for 30 years and writes “Poor Elijah’s Almanack.” The column appears in several publications including the Times Argus, the Rutland Herald and the Stowe Reporter.

Once upon a time, Poor Elijah used to talk to human beings when he called his pharmacy. It’s tough to pinpoint exactly what year that all changed, but suddenly instead of talking to a human, he found himself dealing with a disembodied Voice that instructed him to enter his birthdate, his prescription numbers, and Mickey Mantle’s lifetime batting average on his telephone’s keypad.

The Voice didn’t cope well if Poor Elijah pressed the wrong buttons. When it got confused, its default response was to ask irrelevant questions. Poor Elijah’s default response was to repeatedly scream “human being” until the Voice hung up.

A few months ago the Voice got an upgrade. It’s more polite, it speaks in longer sentences, and it tries to have a conversation with him. Instead of instructing him to press the right numbered buttons, he’s supposed to say the numbers, and when their business is concluded, it tells him, “You may hang up now.” Unfortunately, Poor Elijah’s enunciation isn’t always up to the Voice’s exacting auditory standards. It gets confused, and he returns to screaming “human being.”

At no point has he ever imagined he was dealing with an actual warm-blooded person.

That’s not to say that all warm-blooded pharmacists are unfailingly competent or congenial. Neither, for that matter, are all teachers, which is why education tech enthusiasts have declared that “the robots are coming.” Naturally, we’re talking about artificially intelligent robots, which presumably means they can do things in classrooms that Roombas can’t.

Most teachers, not surprisingly, aren’t thrilled at the prospect of sharing their classrooms with – or being replaced by – a silicon-powered, humanoid version of Dumbledore or Miss Crabtree. A 2019 Education Week survey found that 84% disputed tech advocates’ claims that “AI-powered” robot “teaching assistants” would improve student learning. A comparable 90% likewise disagreed with assertions that entirely replacing human teachers with robots would boost achievement, even where those teachers were “chronically low-performing.”

Some teaching robots are simply artificially intelligent computers with an email account. Students enter data, and the computer at the other end of the line replies with data – hopefully correct, pertinent data. Like Poor Elijah, I’ve suffered through enough conversations with computers to be skeptical.

Not everyone shares my Luddite skepticism. One pro-AI college professor cites his informal experiment with human and robot online teaching assistants. He found that his students couldn’t tell the difference. Of course, that might just reflect the inherent shortcomings of online instruction itself, even when two humans are involved.

Researchers have observed that students’ “social connection” is “much stronger” with actual “physical” robots than with “virtual” robot images on computer screens. At the kindergarten level Chinese schools have introduced a “small robot” named Keeko that “tells stories, poses logic problems, and reacts” to students’ responses with “facial expressions.” Here in the United States Boston schools are piloting a teddy-bear robot named Tega that specializes in early elementary “language and literacy skills.” Researchers are also experimenting with “young children” and “fully autonomous,” “peer-like” “social robots.”

“Experimenting” is the right word.

Some teachers can see a use for robots when it comes to chores like taking attendance and grading. Others worry, however, about data insecurity, invalid grades, and the implicit gender biases of artificial intelligence programmers, most of whom are male.

Of these concerns, I’m particularly uneasy about placing any classroom grading judgments in the hands of remote “experts,” male or female, especially given the decades-long meaningless hash they’ve already made of standardized testing. The human programmers and education authorities operating behind the robots are simply the latest incarnation of the misguided, pipedream expert corps whose bright ideas have plagued public schools for two generations.

I see the direst peril, though, in advocates’ upbeat assessment that children find physical robots “more believable” than virtual robots and therefore have “more positive interactions” with them.

Tech enthusiasts present this as a good thing.

It isn’t.

We’re so impressed as the Voice gets more realistic that we don’t consider the likelihood that we may just be getting worse at telling the difference between machines and real people.

I don’t want children to have “positive interactions” with instructional humanoids. I don’t want them to form a “social connection” with a “peer-like” robot, any more than I want them to relate to their classroom’s light switch.

If you’re someone who keeps up with the endless procession of school reforms, you probably know that “social-emotional learning” is one of public education’s latest bandwagon “initiatives.” Not coincidentally, tech-inclined experts currently tout classroom robots’ “social-emotional benefit.”

Someday soon, though, experts will be writing cautionary articles about the harmful effects of classroom robots on students’ social-emotional development, the same way they’re now writing cautionary articles about the harmful effects of classroom computers on everything from students’ sleep cycles to their social skills. These are the same computers, by the way, that the same experts hyped schools into buying by the truckload, the same blue-light screens that schools park students in front of more and more each day.

Voice-activated companionship is counterfeit and unhealthy. When there’s nobody else in the room and I’m talking out loud, I’d rather it be thought that I’m talking to myself than to Siri or Alexa, especially if I’ve begun to believe they’re really talking back. It’s a sign of better mental health.

So is talking to my dog. She may not understand much of what I’m saying, but at least when she cocks her head, she’s sincere.

The next time you’re out at a restaurant, look around the room. Look around your table. Count the people who are staring at screens. Consider how Facebook has corrupted the meaning of “friend.” Now imagine a world where naive children and awkward adolescents spend their days relating to robots.

If we unleash robot Voices in our classrooms, we can expect only more of the same.

You may hang up now.
