It’s worth recalling that before the Internet ate the world, it was a project built in America’s universities. Yes, it was funded with plenty of Department of Defense money, but many of the earliest ’net programmers saw it as a way to bring together far-flung information in ways that could create new knowledge. And much of that idealism clung to the evolving network as it later became, in the popular imagination, the “Information Superhighway.” In its clumsily hyperbolic way, the phrase captured both the novelty and the promise of the Internet: something new that couldn’t quite be described with old terms—and it was going to change everything about how we lived and learned.
Much of that hype coagulated around Silicon Valley in the 1990s and was transmuted into venture capital and stock options. The commercial Web arrived, and quite often it seemed as though the great educational potential of the Internet took a backseat to its money-making potential. There were still flashes of that old ideal, though, taking new forms. As Audrey Watters points out, the revolutionary zeal to transform education returned in 2012, in what the New York Times dubbed “The Year of the MOOC.” Another ungainly acronym, it stood for “massive open online course,” and its proponents promised to finally “democratize” education, placing it within reach of anyone with a computer terminal. That revolution, much like the Internet itself, instead became something much tamer, and just a few years later, the MOOC-ineers were no longer promising knowledge for all, anytime, anywhere. Instead, they settled for “Uber for education,” and Watters details why that’s both a disappointment and, more subtly, a threat.
Jonathan Rees has a similar worry about the faddish notion of the “flipped classroom.” It seems an innocuous concept: Rather than have students spend their time outside of class reading or doing homework, send them to their computers to watch Internet-delivered lectures about the class material. Then in-class time can be spent interacting with their teachers. This noble-sounding setup, Rees suggests, actually turns professors into mere content providers, whose work can be freely distributed via the Internet—maybe they’ll be compensated for it, but maybe not. More worryingly, it suggests that learning is simply a matter of students consuming the right content, like a cruder version of Neo from The Matrix, who could simply download knowledge into his brain. Professors, Rees says, should be wary of rendering themselves into slices of content; he makes a comparison to self-slaughtering pigs that I’ll let you savor for yourself.
But if the Internet has wrought strange changes in the landscape of higher education, what about the realm of autodidactism? Surely it’s easier than ever to learn almost anything: After all, the Information Superhighway runs right to your door (or your phone, or your watch) and is ready to deliver the sum of the world’s knowledge. Nithin Coca looks at how language learning has (and hasn’t) evolved with technology. From apps like Duolingo to the venerable Rosetta Stone, there are more options than ever for learning a language. Are they improvements over what came before? Coca finds the answer is not what you might expect.
Enjoy the issue.
Photo via naosuke ii/Flickr (CC BY 2.0) | Remix by Jason Reed