Does Reality Matter (for Higher Ed)?

Yesterday’s annual Pullias Lecture on higher education reminded me of Woody Allen’s The Purple Rose of Cairo.

In that movie, Mia Farrow’s character falls in love with a handsome man who has walked down off a movie screen. Farrow’s Cecilia is only a little put off by the fact that her loved one isn’t entirely real. She tells her sister: “I’ve met the most wonderful man. He’s fictional, but you can’t have everything.”

At USC’s Pullias Lecture and discussion, James Gee and Henry Jenkins presented gaming and fan fiction as platforms for learning math and science (Gee) and for political activism (Jenkins).

  • Jim Gee explained how some players in the game World of Warcraft set up complex experiments to analyze flaws in the statistical models behind the game.
  • Similarly, players of the game Portal master complex variations in the laws of physics to allow their fictional avatars to jump from one place to another through the game’s eponymous portals.
  • In his response, Henry Jenkins recounted the way some young people not only write their own Harry Potter novels but actually become non-fictional political organizers around Potter-inspired values.

Let’s skip the question of “some” vs. “many.” These were inspiring anecdotal talks about the power of platforms quite unlike formal education to inspire passionate devotion to tasks requiring hard work and the acquisition and integration of powerful skills. I suspect the rate at which WoW players and Potter readers become statistics experts or novelists is far lower than the comparable rate for college graduates--but leave that be for now.

The question that nagged at me when I woke up the next day was: What do we think about becoming an “expert” on the laws of a physical universe or a set of historical institutions that, strictly speaking, don’t exist? We still apparently believe that a kind of synthetic non-disciplinary expertise matters. But what if its basis--like the almost-perfect boyfriend in The Purple Rose of Cairo--is fictional?

“I have the most wonderful theory: it’s fictional, but you can’t have everything.”

A formalist defense of gaming and fiction as platforms for learning is not hard to imagine. ‘Well, physics and English and the tax code are all merely rules, and so any ability to master rules and respond with strategies is educational, broadly speaking--and at least some of these skills are transferable.’

Which is fine. But do we believe it?

The history of Western music offers an interesting analogy.

Since around 1600, composers and pedagogues centered in Europe believed that good music should follow rules drawn in part from acoustics: tones have what are called overtones, and the overtones determined what counted as “consonant” and what as “dissonant.” Dissonance was permitted, but only under rigid conditions--basically to drive towards a tonal home and center. The ‘home-and-center’ part wasn’t anchored firmly in acoustics, but it had some taproots.

During the 19th century Romantic composers like Wagner drifted further and further from this logic of a voyage that started at home, went away and then returned. Then near the beginning of the 20th century something happened. Composers began to ignore acoustics and the old rules. Schoenberg, Berg and Webern ceased favoring some notes of the 12-tone scale over others: they set the 12 tones ‘free.’ Musical materials became a neutral substance on which to impose compositional patterns.

And this led the way to composing music using helicopters and such. After all, “music” was no longer the substance: it was the organization, the pattern. So consonance ceased to matter--if you followed that particular theory, that is.

We all know what happened: classical music became wildly unpopular, the purview of the few, while popular music retained its ties to acoustics and traditional harmony. Aside from the addition of some extra notes (like the 7th note of the scale), many pop songs today are perfectly comprehensible within the outlines of the common-practice period. The music that paid attention to acoustics is still popular. And the music that forgot about acoustics is still revered by “experts”--but only by them.

But is that important? Should popularity and common wisdom trump specialized expert knowledge? If everyone who loves WoW or Harry Potter thinks you’re terrific, is that enough?

The example is on point, because a part of Gee’s talk was dedicated to hammering on the idea of disciplinary “experts”--Alan Greenspan was Gee’s example--while celebrating the virtue of “amateurs”--embodied by Darwin. (It could almost make you feel sorry for Yoda, I mean Greenspan.) “Experts” killed classical music, so why not abandon that kind of expertise?

The answer is probably somewhere in the middle. Economics may involve playing with an arbitrary set of man-made rules called laws. But the minimum number of calories a human being needs to survive is well known. And when your income doesn’t provide for those calories--or better, the required nutrients, since calories are now cheap--there’s a real danger of starvation or pellagra or the like.

Likewise, there’s nothing arbitrary about genetics: those who care about science tend to agree that Darwin discovered something about nature, not about arbitrary rules. (I’m leaving out those who think Darwin gave us ‘just a theory.’)

Reality is not just a game to be played and mastered. We underestimate reality when we treat it as such. Both Gee and Jenkins underline the collaborative and cooperative aspects of games and fan activities, so I can’t claim they are solely focused on strategic behavior: they’re not.

But it does seem cavalier to marginalize reality by treating science and politics as man-made houses of cards to be built or rebuilt willy-nilly. Were higher education to embrace such arbitrariness, it’s hard not to foresee the old debates about relativism resurfacing all over again.

And that is not a movie I’d care to re-watch.

--Edward R. O'Neill
