Weekly Pessimism: Carr’s Nightmare & Google’s Perfection

Of all the readings, Carr’s piece really hit home. This specific problem has plagued me for years. I need my information delivered in bursts. I am totally addicted to the instant gratification that a smartphone and Google provide, and I have no more use for specifics. I skim almost everything I read. I’ll Google answers to questions I have while trying to fall asleep, even if that wakes me up again. I lack the patience for detail; in most cases I can’t even tolerate it.

It is an incredible disadvantage in graduate school. After all, we’re supposed to read a few papers per day, right? I don’t think I’ve thoroughly read one in years; I even skim my own proofreading. So how am I supposed to finish a PhD, when the entire point is becoming an expert in the mundane details of one highly specific area? You can’t be a jack-of-all-trades PhD who specializes in nothing. Honestly, I’ve worried for years that this lack of mental discipline would tank my career, yet I can’t fight it. In the middle of reading Carr’s piece I was truly hoping he’d offer some brilliant solution I hadn’t thought of, while simultaneously hoping he’d wrap it up soon, because come on dude, it’s been like 15 paragraphs, cut to the chase…

Frighteningly, this piece was written in 2008, just a year after the first modern smartphone was released, and years before anything resembling modern social media. I suspect the effect is even more profound today, when Facebook is considered too detail-heavy and most of us have moved on to image-sharing services and single-sentence tweets. No matter how pessimistic Carr was feeling as he typed out that article, he probably didn’t comprehend how bad it would get for some.

I must admit that I am an addict myself, and the 2008 version of me would never have predicted how bad it would get. According to an app called Quality Time, which measures phone use, I unlock my phone about 300 times a day and spend four to five hours on it, with roughly two-thirds of that on social media. Every day, I spend hours training myself to consume nothing but tidbits of novel information, without any substance at all. How could I expect to suddenly be able to read 15 pages about obscure spatial statistics?

What’s worse, I had the good fortune of getting most of my education prior to the rise of Google and social media. What concerns me most is that the students we’ll be teaching in another 5-10 years will have grown up on it. While the rest of us may (hopefully) be able to return to the high-detail “scuba-diving” mode of thinking, perhaps that skill is entirely foreign to the younger generation. What will the kids who got their first iPhone in fourth grade be like when they get to college? Taking away their electronics won’t help them; it’ll just disengage them further. I keep thinking that perhaps we should embrace this and give them bursts of low-detail novel information, but at some point they’ll need detail if they are to be professionals in their field.

As usual, I honestly have no idea how to solve this problem, and I expect it to be exacerbated in the coming years. Salzberg’s advice seems like a Band-Aid on a broken bone; medication, not meditation, is what I need, and I didn’t grow up reading Instagram on the school bus. Perhaps we older folk can retrain ourselves by abstaining from instant-gratification sources of knowledge and forcing ourselves to read more. I have heard some folks have had luck with completely disengaging from social media and going back to flip phones, but I have yet to try it. Maybe I’ll start with a few good books of the paper variety. As for the younger folks, they may be out of luck. Perhaps the world will have to adapt to them instead.
__________________

On a side-note, the comments on Carr’s article completely miss the danger of perfect silicon memory. To greatly paraphrase the work of Dr. Nicholas Christakis (one of the fathers of my field), there is incredible utility in imperfection. To steal an example he uses frequently, consider designing an algorithm for a small robot to find the top of a hill. You program it to always go in the direction of the steepest upward slope, and set it loose. Eventually it finds the top of the highest hill, right? Not quite: it finds a local maximum, but there could be a much taller hill next door. If the robot isn’t allowed to make mistakes and go downhill sometimes, it’ll get stuck on that local maximum and never leave. The same goes for Googling answers. How many times have you gone looking for something and stumbled upon a better solution? If you can never make that mistake, if you always find exactly what you need, you greatly limit innovation.
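The robot analogy above is easy to see in a toy simulation. This is only an illustrative sketch (the landscape, start position, and mistake probability are all made up for the example): a purely greedy climber stops at the first peak it reaches, while a climber that occasionally accepts a downhill “mistake” can escape the small hill and find the taller one.

```python
import random

# A 1-D "landscape" of heights: a short hill (peak height 3 at index 3)
# and a much taller hill (peak height 8 at index 9) further along.
heights = [0, 1, 2, 3, 2, 1, 2, 4, 6, 8, 6]

def greedy_climb(start):
    """Always step to the higher neighbor; stops at the first peak found."""
    pos = start
    while True:
        neighbors = [p for p in (pos - 1, pos + 1) if 0 <= p < len(heights)]
        best = max(neighbors, key=lambda p: heights[p])
        if heights[best] <= heights[pos]:
            return pos  # no neighbor is higher: a peak (possibly only local)
        pos = best

def climb_with_mistakes(start, steps=2000, p_downhill=0.3, seed=42):
    """Same robot, but it occasionally accepts a downhill step (a 'mistake')."""
    rng = random.Random(seed)
    pos, best = start, start
    for _ in range(steps):
        nxt = pos + rng.choice((-1, 1))         # pick a random neighbor
        if not 0 <= nxt < len(heights):
            continue
        # Always accept uphill moves; accept downhill ones with small probability.
        if heights[nxt] >= heights[pos] or rng.random() < p_downhill:
            pos = nxt
            if heights[pos] > heights[best]:
                best = pos
    return best

print(greedy_climb(1))         # stuck on the local peak at index 3
print(climb_with_mistakes(1))  # crosses the valley and finds the taller hill
```

The willingness to go downhill is exactly the “useful imperfection” in the analogy; in optimization this idea shows up in techniques like random restarts and simulated annealing.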

People who are willing to give this up for perfect recall are giving up far more than they know.

The curse of exceptional peers and other weekly pessimism

I want to call attention to a paper quite related to this week’s readings: Rogers et al. (2016).

It is discussed in layman’s terms here (good for me, as I am far from a psychologist).

We saw in most of the readings, and in the TED talk, that the traditional carrot-and-stick method harms creativity and performance in anything other than mechanistic tasks. Grades also force students to narrow their focus onto achieving high marks rather than learning, and even average grades can discourage students, especially when their peers do better. The Rogers et al. (2016) paper goes beyond that, suggesting that apart from any grade received, merely seeing peers do better on a graded assignment harms performance.

The authors ran two experiments for this paper. In the first, they collected grade data from a MOOC they offered. The course assignments included writing essays, as well as reading and grading peers’ essays. This allowed the authors to track the “quality” of each individual as well as the quality of each peer essay read by each individual. They found that students exposed to excellent mid-term essays generally turned in less impressive final essays. These same students were also less likely to finish the course than those who graded poor peer essays. The second experiment involved writing and grading SAT essay questions, and again those exposed to high-quality writing reevaluated their own skill and were less confident in their abilities. Moreover, they were more likely to feel that essays were an unfair measure of skill, yet still felt ashamed of their lack of success. The conclusion the authors drew was that merely encountering superior work can have an incredibly demotivating effect.

To be honest, I have felt this a lot. Obviously imposter syndrome is a big part of graduate school, and we are bombarded with exceptional work all the time (“read this Nature paper while you’re trying to publish your master’s work”). But I would suggest it is even worse in an interdisciplinary PhD program. Most of my colleagues have different backgrounds and have many skills far beyond my own; I’ll never compete with the computer scientists in writing code or the statisticians in model design. Being forced to face this routinely, especially in our department seminars, is often crushing. I spend half of those seminars barely able to follow along, even when listening to junior colleagues – it took a year before I realized some of the other students were just as lost when I gave my talks.

Accordingly, even if we eliminate grades in favor of more detailed evaluations, the competitiveness and the feeling of self-doubt will still haunt many students. It makes no difference whether it is a graduate seminar or a grade-school arithmetic problem on the chalkboard. I don’t know how we can get away from this.

Incidentally, only about 2% of the 150,000 registered students completed the MOOC, which lends support to the opposite argument: a lack of incentive to finish (and of any penalty for failing to do so) totally guts motivation.

On a second point, Alfie Kohn’s essay was fantastic. I want to agree with virtually every point, but as one might have guessed, I feel stirrings of pessimism. One must never forget that a student’s entire career depends on these grades. You can eliminate letter grades and replace them with the analytical grid Dr. Elbow described, or with detailed assessments. All good in theory, but if an employer, college, or grad school is going to base hiring or admissions decisions on those assessments, you can be assured that students will focus on optimizing them rather than learning.

In centuries past, it may have been the case that most students attended college pursuing a classical education simply for their own benefit. But today’s middle class is generally built upon these degrees. The quality of the school and the “rank” of the student are of life-affecting importance. Until this changes, students will always prioritize ranking over learning. I don’t agree with Elbow that they want to be ranked; ranking is simply inevitable.

Weekly pessimism – Anti-Teaching

I always seem like a depressed pessimist when writing these blogs… 🙁 At any rate, as usual, these readings and videos seem wonderful in theory, but they gloss over the problems. Perhaps the issue is that those who go into pedagogy enjoy their work too much… To be fair, most people who stick through a PhD and build a career enjoy their work. But for those in pedagogy, their work is guided learning, teaching itself, and they can’t seem to grasp the idea that some students don’t want to be a part of it.

This does not apply at the graduate level, because presumably all of us actually enjoy what we do and wouldn’t be here otherwise. But most of our readings seem to cover all schooling at all ages. And I am certain I have encountered more than a few undergrads who simply didn’t want to be there. And of all the students, these are the ones most resistant to “fun” and active learning. They are also the ones most in need of being reached.

Thomas and Brown suggest that learning is inexorably tied to play and fun, and that when the two are combined properly, people are happy to learn. I would agree, though their example of middle-school kids learning Harry Potter lore is borderline absurd. First and foremost, the Harry Potter books are fun; for most students, spatial statistics and antebellum US history are not. You can certainly improve upon dry, boring lectures, but you’ll never make such subjects exciting unless you have the talent of someone like VT’s John Boyer or YouTube’s Dan Carlin.

Second of all, aren’t we ignoring the fact that high-school-age and younger kids have a psychological need to rebel against authority? (Again, this is not applicable to graduate students.) I’d wager that if you took a class of 8th graders, required them to read Harry Potter, assigned homework on the subject, tested them, and then said that their futures depend on their ability to recall mundane details, they would hate it as much as they hate earth science or algebra. To those students, the teacher is the authority they are supposed to rebel against.

I hate to sound so negative, but the fact is that there isn’t enough flexibility for undergrads to exclusively take interesting courses. Sometimes they have to take required courses they despise. I certainly don’t blame them for that, but it doesn’t change the fact that some of them really don’t want to be sitting in your class.

The single most difficult moment in my short teaching experience came during a summer program introducing STEM fields to 8th graders. All of them were bright, and some of them wanted to be there, but a few very clearly didn’t. Who knows whether their parents forced them to attend, or whether they misjudged how interested they were in the program… Either way, it seemed like torture for them. We tried to make our session as active as possible, running a real-time epidemic simulation and infecting an imaginary population with some horrific pandemic influenza virus to see how many survived. Most of the students were happier playing the game than listening to our PowerPoint intro, but the more we tried to engage the disinterested ones, the less interested they became. I honestly think they’d have preferred to sleep through a lecture than be bothered by us. As soon as the structure of the rigid lecture (we teach, you listen) disappeared, they were free to completely disconnect. They didn’t like the lecture either, but the game certainly didn’t help, and the effort to reach them kept me from reaching the other students. I felt like I was hurting these poor kids (I remember being in their shoes in 8th grade myself). In fact, the only time they paid any attention was at the end, when they got to control the learning entirely by asking questions about famous epidemics.

TL;DR: All of the methods we’ve read about so far are wonderful for interested students, but they make the fatal assumption that all students want to be a part of this experience. If I get one thing out of this class, I hope I learn how to deal with the ones who don’t.

________________________

Unrelated side-note: One of the readings talks about the transition from black and white to color TV. Ever wonder what it looked like?

Dubious on Digital Learners

I must admit that to this outsider, the field of education seems very odd! It is surprising how enthusiastically the field embraces new ideas, and its eagerness to denigrate the old ways is even more unusual. I certainly can’t imagine engineers or biologists so willing to say that the last 200 years were done all wrong and insist we need to completely reinvent the field – yet if this week’s readings are to be believed, pedagogy since the 1500s has been a disaster. Even Robert Talbert, the closest thing to a proponent of lectures among the readings, claims that organized lectures are useless for “information transfer”.

As an outsider, maybe I have no idea what I’m talking about, but while the intersection of digital gamification and active learning seems innovative and useful, it is hardly a panacea. As a product, or perhaps victim, of the old educational system (I was nearly one of that third of college students), I have a few layman’s criticisms:

1. Gamified learning sounds great, but is it truly accessible to all students? I like Dr. Carnes’s ancient Athens game, but it requires a lot of social interaction. There must be introverted students who can’t stand these games and intentionally withdraw as much as possible. Moreover, one must remember that the teenage years are marked by a need to rebel. A game embraced by juniors at a private college is surely not going to be embraced by 7th graders, especially if it is mandatory, and especially if the teacher is playing along. To be honest, when my lab forces us to play team-building “games” I feel heavily patronized – I’d rather sit and listen to a boring lecture. Some students just want to get the material and go home, and they will feel just as put off by forced social games as any extrovert does by a lecture.

2. This seems like a brilliant addition to the standard curriculum, but it surely can’t replace old-fashioned lectures and labs. It just doesn’t seem very applicable to most subjects – there are only so many ways you can gamify statistical analysis. Moreover, while creating video game levels for Aesop’s Fables seems like an amazing way to absorb that material, what does that kid do upon reaching college and needing to learn some mundane but necessary material on their own? There are times in your academic career when you need to simply sit down and read a boring book: no fun, no play, no games, just the discipline to be bored. If we indulge the need to play from K-12, nobody will ever have the discipline to learn something horrible like SQL.

3. For all this talk of active learning, does it work? Are we sure it works? Has anyone quantified how much more effective it is than traditional learning? My wife is a physician and often talks about her old medical school’s two-curriculum system. Students had the option of the vanilla lecture/lab curriculum, or what they called “problem-based learning” (PBL), which sounds like it was inspired by an episode of House MD. The PBL students met every day to take on a case, typically from a catalog of real-life historic cases. They each had to do research and come up with a diagnosis and a treatment plan, getting updates as they went. It actually sounds awesome – a heck of a lot more fun than sitting in a lecture hall. It must be more fun for the professor too, as they must decide realistic consequences for mistakes. But for all this fun, at the end of their schooling, the PBL folks in her class did far worse on the medical licensing exam than the traditional folks. They had a higher failure rate too, and a far lower placement rate in residencies. To be fair, this could just be a symptom of teaching to the test, and perhaps that test isn’t a great way to gauge ability. But even in residency, my wife claimed, the PBL graduates were well behind. Her story is anecdotal, but the question has been studied extensively.

Here is a meta-analysis of 15 earlier studies comparing PBL to conventional med-school curricula. It concludes: “Twenty-two years of research shows that PBL does not impact knowledge acquisition; evidence for other outcomes does not provide unequivocal support for enhanced learning” (Hartling et al. 2010). So, at the end of the day, are we certain that these changes are as beneficial as these authors claim? Or is active learning just another subtle improvement on a centuries-old formula?

Side note: The “New Learners of the 21st Century” documentary makes an egregious false equivalence when it compares social-media and game addicts, who are condemned, to studious kids, who are praised. Addiction to video games can be as pathological and compulsive as addiction to hard drugs; it is a legitimate psychological disorder that literally kills a few young people per year. Though I suppose it is possible to do the same with studying, such compulsion is certainly not something to be praised. On the contrary, an obsession with studying could be just as dangerous, while gaming, social media, or studying in moderation is fine.

TL;DR: Sounds good, but: Introverts must hate active learning. Is it really applicable to all fields? Does it really work as well as these authors claim (I doubt it)? Addiction is bad.

  • Hartling, Lisa, Carol Spooner, Lisa Tjosvold, and Anna Oswald. “Problem-based learning in pre-clinical medical education: 22 years of outcome research.” Medical teacher 32, no. 1 (2010): 28-35.

A sadly pragmatic take on Dr. Wesch’s TED talk

The TEDxKC video “What Baby George Taught Me About Learning” by Dr. Michael Wesch was inspiring but seemed a bit too idealistic. At one point he lamented the fact that despite all of his efforts, his students were still most concerned about their grades rather than learning the material. But how could that ever not be the case?

If we are totally honest, the majority of students at any college are attending primarily for the job opportunities their degree affords. Certainly “expanding our horizons” and improving our understanding of the world is a great benefit – and I recognize that this was the original aim of tertiary education – but few could afford this experience if it didn’t also provide significant employment benefits. This has never been more true than it is today, when student loan debt is crippling, tuition costs have skyrocketed, and most white-collar jobs absolutely require the once-optional BA.

Doing some back-of-the-envelope calculations: just 30 years ago, a year of tuition at VT cost the equivalent of about 500 hours of minimum-wage work. One could pay for the entire year’s tuition with a summer job. Today that figure is closer to 1,900 hours – almost a full year of full-time work. Couple that with the fact that 30 years ago a BA was mostly optional, while today it is required to manage a Starbucks. Add that a degree from a good school like VT can be worth over $500,000 over a 20-year period. Can you really blame students for obsessing over grades?
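For transparency, the back-of-the-envelope arithmetic above can be reproduced with a few lines. The dollar figures below are illustrative assumptions chosen to match the post’s numbers (not official VT tuition data or exact historical wages):

```python
# Back-of-the-envelope check of the tuition-in-minimum-wage-hours claim.
# All dollar figures are illustrative assumptions, not official figures.

def hours_of_min_wage(tuition, min_wage):
    """Hours of minimum-wage work needed to cover a year's tuition."""
    return tuition / min_wage

# ~30 years ago: assumed in-state tuition and the 1980s federal minimum wage
then_hours = hours_of_min_wage(tuition=1_700, min_wage=3.35)    # ≈ 507 hours

# Today: assumed in-state tuition and the current federal minimum wage
now_hours = hours_of_min_wage(tuition=13_700, min_wage=7.25)    # ≈ 1,890 hours

print(round(then_hours), round(now_hours))
```

A ~500-hour year of tuition fits inside a full-time summer (about 12–13 weeks at 40 hours); ~1,900 hours is essentially a full working year, which is the point of the comparison.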

By the time students reach Dr. Wesch’s class, they have invested tens of thousands of dollars, likely put themselves deep into debt, and know their grades will literally dictate the rest of their lives. A few bad grades could make the difference between getting into a good grad school with funding and paying their own way at some R3. It could be the difference between getting into med school at all or not, or between landing an internship with their dream employer and ending up in a cubicle farm in a job they hate.

Until this changes, students will always prioritize grades above actual learning, especially in an elective subject.

I admire Dr. Wesch’s idealism, and I hope to encourage students to love both the material and the act of learning itself, but we cannot allow ourselves to forget how important grades are to these students. If they are truly concerned about their futures, learning will be the last thing on their minds.