I don’t want to begin my GEDI blogging journey on a negative note, but I couldn’t help but be critical of this week’s readings. In general, they smacked of techno-utopianism — hype that disguises as much truth about the networked world as it reveals. Here I have to disclose some bias: my PhD research is on the philosophical implications of technology. In particular, I spend a lot of time thinking about why end-users have so uncritically embraced digital networks, and about the social norms that multiply and are amplified in a networked context. As an instructor, I find these research interests echoed in the way I treat the use of tech in my classroom. As a student of pedagogy, it’s only honest to connect my views on these topics to my (admittedly very-much-in-development) teaching practice.
Despite my overarching critique, however, there’s interesting stuff to be explored here. All of this week’s thinkers have a sincere interest in leveraging innovation toward the best end possible. That’s great. Unfortunately, these pieces all demonstrated a very specific ideology: the ethos of Web 2.0. They would have been more interesting to me if they had revealed their bias, or at least the social trajectory by which that bias has come to support an ever-more-popular perspective on networked learning. Here are a few examples:
In Gardner Campbell’s article, he emphasizes the need for students to understand the Internet from a more practical perspective. I completely agree with this: at this point, we’re all digital beings, and it behooves us to know a thing or two about how the web actually works. But “how the web actually works” includes more than just technical expertise on URLs, packet routing, HTTP and data infrastructures. There is an economic ideology at the core of Web 2.0: the constant clicking and “sharing” that facilitates the business model of the Internet, which makes a profit from all of this online activity in ways that are generally unknown to users. The networked self, including the networked student, generates a profit for the platforms they use. Although Seth Godin states that “blogging is free,” it is only free in the sense that we do not have to pay directly for certain blogging services. In exchange, user data is extracted that is highly valuable to all sorts of entities, including surveillance operations and social media sites fine-tuning algorithms to serve more profitable ads to their users. The networked student becomes a source of income. No explanation of the Internet should fail to include observations like these, which are not political interpretations — they’re just facts.
But my problems with this ideology aren’t entirely related to economic exploitation. They also have to do with the spirit of what it means to learn and produce intellectual and creative work.
Campbell tells us:
By forcing students to write ‘publicly’, their writing rapidly improves.
This can be true in many cases. But it also reinforces the idea that the only meaningful behavior is that which we can display or promote. In research she did with teenage students, social critic Sarah Leonard found that a staggering number of millennial writers aspire to become marketers. Marketing is a safe job option for budding writers, at least in comparison to the long odds of making it as a journalist, editor or (God help us) fiction writer. Public writing can be extraordinarily helpful insofar as it forces writers to emphasize clarity and conceptual legibility. But we also need to affirm the value of creative work for its own sake, not for its profit margin or the number of hits it garners (two metrics now deeply intertwined).
This issue is separate from the related concern addressed by Tim Hitchcock. In his piece, Hitchcock says “a lot of early career scholars, in particular, worry that exposing their research too early, in too public a manner, will either open them to ridicule, or allow someone else to ‘steal’ their ideas.” Those are fair concerns and should be left to each individual scholar’s judgment, particularly given the norms of their field (although Hitchcock perfunctorily dismisses them with a vague reference to his own life). He does not consider the deeper process by which scholars come to feel ready and comfortable sharing their work. Constant self-publication and self-publicization may lead to more Twitter followers and a “constant conversation,” but often one needs to work in isolation to hit on a truly singular finding. The short-term reward of “growing your network” and being marginally recognized with Internet fame is a poor substitute for what might be achieved when time spent on social media is rerouted to one’s own paper.
I mention this, of course, in the context of graduate work. But I think we also need to share this perspective with undergraduates, because it’s increasingly viewed as outmoded. Demonstrating that it’s possible to be both tech-savvy and critical of the Internet sometimes genuinely interests students, I suspect because they don’t encounter that perspective very often. In the “Digital Culture” unit of the course I teach, I try to offer it (without hitting them over the head with it).
Meanwhile, the “Working Openly On The Web Manifesto” has its roots in free and open-source software culture, which (again) is not inherently bad, but may have problematic implications for scholarship. The notion that work is primarily meaningful for its sharing value seems like a quick route to short-circuiting educational processes that take gestation, painstaking attention to detail, and long stretches of time to come to fruition. Academic work is not the same as a crowdsourced coding project — that is, unless starry-eyedness about the Internet makes us lose sight of the reasons why scholarly practice is distinguished from other types of work. (Also known as: why, for the most part, grad students don’t publish drafts of their doctoral dissertations until they’ve been through extensive review.)
Okay, I think that’s all I have. Despite my polemic, my mind is still open, and I wouldn’t mind being challenged on this. (Hey, it’s good for my research).