Power to the students

What caught my attention most in the work of Paulo Freire is “respect for what students know”—that is, taking advantage of students’ prior knowledge to learn more than is possible when a supposedly all-knowing professor dictates. I really love this perspective, given my disillusionment with academia and the ivory tower and my pro-blue collar/trade school/indigenous knowledge mindset (Did I mention I want to teach community college? Or just be one of those food critics who get paid to eat a bunch of food? That’s a thing, right?). Although I have not yet been responsible for teaching a semester-long course, I have led several lectures and labs. I always make an effort to access this existing student knowledge by asking questions in a conversational manner during the lecture (e.g. “Have you ever noticed that…”). Similarly, I like to know where the students are from so I can tie in examples of natural features near their homes (e.g. “Who here is from the Piedmont? You’ve probably seen how…”). Note: following the microaggression theme from my last blog post, I ask this of pretty much everyone because I find it interesting; most of the time I actually hope you will be from somewhere different, because the hydrology might be distinct there. I copied this technique from some of my favorite professors because it made the subject matter more approachable and familiar to the students. I always felt a sense of ownership or authority on a given topic that related to me in some way, as if I already knew more than I thought I did. This tactic can be successful in all fields, but I find it especially easy to incorporate in hydrology. Water is all around us, unless maybe you are from a desert (which I would find out by asking where you are from), so we can tap into those subconscious observations to discover that most of us probably know a good deal about hydrology.

Respect for students’ prior knowledge is also critical from a multi/interdisciplinary standpoint. For example, a hydrology or geomorphology course would be essential for a wildlife biologist studying salamanders, but I would also be curious about the hydrological processes these students observe in their line of work (perhaps salamanders congregating near zones of cooler water upwelling in the summer, and where those areas might be?). Or, I would be interested to learn more about water rights from a political science or pre-law student. However, in order to capitalize on what students know, we must first know something about the students. As I mentioned, asking where students are from is one good question, but inquiring about fields of study and extracurricular interests also provides opportunities to connect with the course content and make the material relevant to each individual.

Maybe this is understood or assumed in the work of Freire, but I would modify his model to informed problem-posing rather than simply problem-posing. I am still scarred from a few discussion-based graduate seminars that I guess attempted to get at this problem-posing format. For these seminars, we would read a few peer-reviewed articles, which were always really complicated and arcane and often written by renowned researchers. The professors wanted the students to entirely take charge of the discussion and talk about what was wrong with the paper, what we would do differently, etc. These are great questions and, theoretically, a fine set-up for a graduate-level class. Small problem: despite careful reading, we often did not understand the papers well enough to have this sort of discussion (like, “I think they do something with a sediment sample at some point”). I should clarify: I do believe that being able to work through complex articles that may not be in our area of expertise is an essential skill for graduate students to develop. However, the end result that I witnessed in these purely student-led classes was random babbling and tangents, and I did not feel like I came away with any more knowledge. Incidentally, I have also taken really great graduate seminars that involved reading and discussing articles. The professors in these classes still encouraged student-based discussion but created some structure by providing necessary background on the subject or interjecting with their own questions. At least in my experience, this model was more successful. While I like the problem-posing technique that draws on pre-existing student knowledge, professors should not completely step back, but rather teach concepts and suggest tools that can help solve these problems. I think that students do not normally use their prior knowledge in the classroom because they develop tunnel vision (“I always have to use this equation to get this answer”) and do not necessarily know they are allowed to do anything else. I feel that small prompts and reminders that students should use all of their intellectual resources to tackle a problem, as opposed to just the ones presented in class, can go a long way.

And in honor of my last blog post…

No trespassing

The readings this week made me think of a class discussion last semester about microaggressions. Microaggression was a new term to me; it refers to “everyday verbal, nonverbal, and environmental slights, snubs, or insults, whether intentional or unintentional, that communicate hostile, derogatory, or negative messages to target persons based solely upon their marginalized group membership.” We had a list of microaggression examples to go along with the discussion. Some of the examples are clearly offensive and should be avoided in all cases. However, I was not sure if I liked the idea of a “no-no” list and found some of the microaggressions to be context-dependent or just a stretch in general. Anything can be discriminatory if said in a certain tone or context. To be fair, there is a disclaimer at the top of the document saying that you must consider the individual situation with all of the examples. But I wondered whether erecting so-called “out of bounds” signs in this manner actually opposes, rather than promotes, the goals of diversity and inclusion.

For example, the first section, “Alien in One’s Own Land,” discourages asking people who appear different from the dominant group where they are from or about their ethnicity/background. It is easy to imagine how this practice can be discriminatory; for example, the teacher asks the one student of color in a large class where they are from but no one else. While we should steer clear of scenarios like this one, I feel that asking someone where they are from is a perfectly normal activity and should not generally be considered a microaggression. Upon meeting someone, this question is usually the second one that I ask after finding out his/her name. People tie a great deal of information to the surroundings they call home, and I think it is natural to want to know how others fit in with that conceptualization. As a result, people identify characteristics that differ from their own in ways that go well beyond skin color, appearance, or language. I am from Georgia, and anytime I leave the South, I encounter the statement (not even a question) “you’re not from around here” based on my accent. Incidentally, I do not have a strong accent because my parents were born in the Midwest, so I am also told I am not from my hometown (when I was a server at a restaurant, sometimes multiple times in the same day). I do my own set of classifications: I come from a rural area, and I can classify out-of-towners by their behavior, such as people from Atlanta believing that, just because they have left the perimeter, trekking poles are needed every time you go outside.

The “where are you from?” microaggression caught my attention the most, but there were others from the discussion that I thought were fairly context-dependent. Some people mentioned compliments that imply “it is impressive you can do this given that you are a woman/person of color/non-English speaker, etc.” Again, it is not hard to think of a situation where such statements are absolutely offensive. But the discussion turned to people sharing random stories along the lines of “this one time, I told someone I was doing my Ph.D., and they were impressed” (in my experience, people are usually impressed when you tell them you are doing a Ph.D.). Likewise, I assume that my advisor is not snubbing my gender when he tells me that I did a good job on my grant proposal.

I came across a few articles that also caution against too many restrictions on our interactions. One controversy last fall involved professors at Washington State University banning the use of discriminatory language in class (Washington Post article and Inside Higher Ed article). Cool, let’s not hurl racial slurs and homophobic discourse. But the syllabus prohibited terminology such as “the white man” and referring to men and women as males and females (although using “white men” and/or “white males” is okay, which I admit I don’t quite follow). I support trying to move away from this sort of language, but many people are unaware of these conventions and do not necessarily mean to cause harm when they say “white males.” Educating students on more inclusive terminology over the course of the semester seems appropriate, but the syllabus warned that such missteps could result in removal from class and a failing grade, in extreme cases. A New York Times article, “The Sheltering Campus: Why College is Not Home,” also calls for retaining “a certain degree of freedom” on college campuses. According to the authors:

While we should provide “safe spaces” within colleges for marginalized groups, we must also make it safe for all community members to express opinions and challenge majority views. Intellectual growth and flexibility are fostered by rigorous debate and questioning.

In order to embrace diversity and move towards inclusion, I think that the first step is to acknowledge that we are, indeed, different from each other! But in order to realize that diversity is not threatening or scary but actually okay (even great, essential, etc.), I feel that we have to allow an exploration of this difference. And that might include asking people where they come from. Or inquiring about their native language. I left that particular class last semester terrified to talk to anyone different from me: “well I can’t ask them about their hometown, apparently that’s bad, I also can’t compliment them on anything…I wonder if talking about the weather is a microaggression?” Is this feeling not counterproductive to creating a more inclusive environment?

I like that the authors of the readings this week advocate an acknowledgement of diversity and courage rather than just safety. According to Shankar Vedantam in “How the Hidden Brain Does the Thinking For Us”:

The far better approach is to put race on the table, to ask [children] to unpack the associations that they are learning, to help us shape those associations in more effective ways.

Similarly, Claude Steele talks about a teacher using diversity “as a classroom resource rather than following a strict strategy of colorblindness.” While civil, considerate language is always a must (and, yes, there are some terms and actions that we can definitively put on the “no-no” list), I fear that erecting too many “no trespassing” signs only perpetuates this fear of difference. Does this practice not create more distance and discomfort between people? How can we strike a balance between making sure everyone feels safe from harm but also safe to discuss (which might involve, like it or not, making the occasional mistake)? I guess I tend to believe that a more natural exploration of “the other” is the only way to actually understand that there is no “other.”

Be yourself

The piece by Sarah Deel, “Finding My Teaching Voice,” and the idea of our authentic teaching selves resonated with a blog post I wrote a couple of weeks ago for my co-teaching independent study (I am co-teaching a sophomore-level hydrology course with my advisor this semester). I also reference a humorous article from The Chronicle of Higher Education called “Desperate to be Liked,” which I think is relevant for this subject as well. The syllabus we devised for the independent study includes teaching a portion of the classes and, among other components, faculty observation. I shadow my advisor on the days I am not leading the class, but we also thought it would be a good idea to observe other professors as well, particularly those I would not necessarily have as an instructor otherwise. At first, I wondered if I would get anything out of this practice: is there really any use sitting in on more classes at this point in my program? So far, I have shadowed two professors in addition to my advisor and highly recommend this exercise to others interested in teaching.

The readings this week made me think of the first professor I observed, who teaches an upper-level course in forestry. When he came up in conversation, my advisor said something along the lines of “I don’t really get it, but students absolutely love him.” The professor definitely deviates from the image of the pedagogical master most of us might imagine. He comes across as fairly “old school” when it comes to teaching, and his examples and problems are real-world scenarios students might encounter in a future job. His PowerPoints are not full of animations, pictures, or YouTube videos. He speaks slowly. I doubt that he listens to Morning Edition on NPR. I also doubt that he blogs about it. Do the bored youth of our university cradling their smart phones actually like this guy? Yes, they certainly do, and within minutes of being in his class I understood why. He is extremely friendly, which makes him approachable. Being near retirement, he is exceptionally knowledgeable about the subject matter. His real-world examples strike a chord with students who realize they might need to use a concept from his class one day. I wrote in my other blog about how his voice caught my attention. I always assumed that in order to show enthusiasm and passion for a class, I must speak excitedly, wave my arms about, and shout like an aerobics instructor. This professor speaks slowly and even quietly (although audibly), which actually has an extremely calming effect and also makes it easier for the students to follow when he works through problems or equations on the board. And he makes eye contact with everyone in the room, searching for confused looks or questions.

After observing his class, I began to think more about what makes a great teacher who inspires students. I reflected on my favorite teachers over the years, and while there are common threads (they were accessible, cared about the students, set clear expectations, etc.), for the most part, they are totally different from one another. In some cases, the teacher of the best class you will ever take may refuse to give tests and encourage everyone to play video games. Another teacher of an equally life-changing class may do the exact opposite. I realized that there is no one way to be a good teacher. We should take advantage of the latest pedagogical research to improve student learning by trying out non-traditional techniques and branching out from what has always been done. However, phenomenal teachers do not, by definition, need to follow each and every accepted convention in either direction (old school versus contemporary) but rather figure out what methods make them (and not necessarily someone else) the best and most effective instructor possible. I like the idea of finding our authentic teaching selves, which I would just call being genuine. We must continually work to modify courses and step up the quality of our teaching, but we should always start with being ourselves. And, in honor of being yourself…

To answer the blog prompt, what is my authentic teaching self? To follow the logic of “being yourself,” I guess I will start by describing myself in general. I am Type A, extremely organized, and detail-oriented. I can also be impatient. I have a dry sense of humor but enjoy humor and laughing in general; as a result, I often make fun of and laugh at myself. I enjoy spending time outside and am terrified at the thought of people being disconnected from the natural world (the most common question I was asked as a raft guide by rafting guests, mostly adults, was why the river does not go in a circle: “We can’t get out here, we’re not to where we put in yet!”), which I think is a very real problem (it’s like, yeah, of course these people don’t believe in climate change).

Despite coming across as an insane tree-hugger, I tend to promote moderation in most things. So, you know, lectures are cool, but not every day, all the time, and they should be high-quality (so, not reading off of slides). Class discussions and hands-on activities are also fine, but, depending on the course, maybe not every day, all the time, and some structure or prompts for the activity can help guide students in that regard. Tests should not ruin people’s lives or stand as a metric of their overall intelligence (another personal story and aside: I was a lift operator at a ski resort, and–stemming from suggestions in class last week–to anyone who scoffs at mechanical/blue collar knowledge, running a chair lift, in terms of the machine itself, requires more intelligence, hands down, than a Bachelor’s degree; that is not even a point of discussion. Want to see people with problem-solving skills? Watch your car mechanic after you provide the helpful diagnosis: “Yeah, it makes a noise sometimes.”). However, we should not totally overlook tests and other assessments as a learning tool and, additionally, as a source of feedback on our teaching. Technology is the key to solving many of the current problems we face, but let’s not stare at screens all day–it’s really bad for you. Go play outside, so you don’t ever get confused about rivers going in circles (or mountains and hills just being “really tall trees,” another common question from adult rafting guests who do not understand the concept of topography). Let’s allow our students to enjoy being nineteen or twenty years old and learning for the pure joy of learning, but we should also realize that jobs can bring people fulfillment (e.g. my friend wanted to be a doctor so that she could help people; Monica on Friends becomes a chef because she loves cooking, etc.), so we can permit them to think about future career paths without lamenting the death grip of capitalism. My authentic teaching self is some reflection of all of the above, for better or worse.

“I have a dream…wait…what was it about?”

I think we have all suffered through a bad lecture. The unlucky among us may have endured more of them than they can count. Let’s face it: there are many ineffective lecturers and even more horrible lectures. The bad rap of lectures is no surprise and, furthermore, well-deserved. But does that mean that the lecture format itself is to blame? Robert Talbert provides a slightly more balanced view of lectures in “Four Things Lecture is Good For.” Talbert asserts that lectures are appropriate when providing context, telling stories, or demonstrating how to solve a problem. Otherwise, he insists active forms of learning are the way to go. As a student in the environmental sciences who thinks we should all be outside playing in streams as a primary learning mechanism, I am all in favor of efforts to increase hands-on experience beyond the traditional classroom confines. I feel that well-designed courses already take advantage of alternative formats to some extent (although, of course, with much room for improvement); labs are integral in the sciences, and discussion is common in the humanities. However, I think that lectures also have a function in education and can be interwoven into these more active components for maximum effectiveness (and in more than the four scenarios that Talbert mentions). Lectures may not always be the best way to teach every concept in every course all the time, especially in a dry, unappealing delivery, but to claim they hold little value may be a tad hasty.

Lectures have stuck around as a primary method of teaching for hundreds of years (see reference). Yes, this is contemporary pedagogy, and we are interested in how we can improve teaching for the future rather than remaining in the past. But I would argue that an integral part of looking forward is also reflecting on the past. What went well and what did not? The status quo is clearly not cutting it, but can we attribute that to the actual lecture format? Or have we all just been afflicted by one too many bad lectures? Molly Worthen defends the lecture in her op-ed in The New York Times, “Lecture Me. Really.” Unsurprisingly, disgruntled academics trolling their Twitter accounts responded with a barrage of angry opposition. Worthen writes that students benefit from lectures by developing critical listening skills:

Absorbing a long, complex argument is hard work, requiring students to synthesize, organize and react as they listen. In our time, when any reading assignment longer than a Facebook post seems ponderous, students have little experience doing this […] But if we abandon the lecture format because students may find it difficult, we do them a disservice. Moreover, we capitulate to the worst features of the customer-service mentality that has seeped into the university from the business world. The solution, instead, is to teach those students how to gain all a great lecture course has to give them.

Although Talbert admits that lectures do serve a purpose in some isolated cases, he decries, among other issues, how the length of lectures is ill-suited to the average human attention span. I wholeheartedly agree—we evolved to run around on the prairie hunting buffalo with spears, not listening to a long-winded lecture. However, we also did not evolve to do a whole host of things that current society expects of us, such as sitting in front of an LCD screen writing a blog, yet here we are. Just because we are not biologically programmed to do something does not necessarily mean the skill is not worth practicing. Worthen continues:

Listening continuously and taking notes for an hour is an unusual cognitive experience for most young people. Professors should embrace—and even advertise—lecture courses as an exercise in mindfulness and attention building, a mental workout that counteracts the junk food of nonstop social media.

Because I often think in terms of examples outside of education, the whole passive versus active learning argument reminded me of stretching (maybe it’s a stretch, get it?). In yoga or other stretching exercises, active stretches are all the rage; a crescent lunge or lizard pose, for example, stretches the hip flexor, among other muscles. Active poses are great, but muscles actually like to be stretched both actively and passively. A passive stretch for the hip flexor would be lying on the floor with a low block below your tailbone, allowing the leg to extend straight to the floor in front of you. Chances are you will not feel intense sensations if you try this, but passive stretches are extremely beneficial in addition to more active ones.

I am also a little confused by the distinction Talbert drew between inspiration and learning, as in you can be inspired by a TED talk or church sermon but not learn from it. Maybe I am singular in this regard, but I usually learn a great deal from TED talks? Given the widespread appeal of TED talks and other activities that involve, when it comes down to it, someone talking at you–stand-up comedy, television, podcasts, speeches, plays–I think it stands to reason that good lectures can also captivate audiences and promote thought. Talbert’s anecdote of listening to a sermon and enjoying it but not being able to describe what it was about, if anything, more firmly underscores the need for the development of an “analytical ear.” Imagine listening to the State of the Union address or a political debate and then discussing it with a friend. “What did Donald Trump have to say? No idea, sure was inspiring though” (kidding, totally kidding). Or worse, what if we are lucky enough to have another orator like Martin Luther King Jr., and a similar discussion ensues after his/her version of the “I Have a Dream” speech? “Yeah he had a dream about something, but there wasn’t an interactive component, no Twitter prompts during the whole thing, so I got distracted. Seemed cool though.”

Are tests and rubrics the enemy?

One of the challenges we face when trying to improve education is that opinions often greatly diverge as to the best course of action. This disagreement is evident in informal discussions among colleagues as well as in conflicting scientific studies on the topic. Alfie Kohn decries the culture of testing in schools in “The Case Against Grades.” According to Kohn, “frequent temperature-taking” in the form of tests is unnecessary and, furthermore, inadequate to evaluate student learning and progress. Kohn goes on to argue that grades produce anxiety among students that detracts from learning and decreases creativity. I can identify with the feeling that tests sometimes do a poor job of asking students to show what they know. I have led a few lectures for my advisor in his undergraduate hydrology class, and he asked me afterwards to write a few exam questions on the material I covered. His tests are a combination of multiple choice, short answer, discussion, and calculation problems. I always found the short answer, discussion, and calculation problems fairly easy to write, and I think they can be crafted in a way that tests the knowledge of the student pretty well. However, I had a lot more trouble with the multiple choice questions. Maybe creating multiple choice problems gets easier with practice, or it might be somewhat of an art, but I remember thinking that no matter how I phrased the question or what answer options I provided, the questions just seemed inadequate and either really easy or sneakily obscure. Kohn insists that tests should be a rarity, and Marilyn Lombardi talks about other options for demonstrating learning, such as portfolios.

To complicate matters, other pedagogical studies talk about how tests are one of the most effective learning tools and that we should test more, not less, often. Preposterous, you say? Perhaps. What I am referring to is called “the testing effect” and is discussed in Make it Stick: The Science of Successful Learning. Apparently copious research shows that, if you want your students to remember something, you should test them on it. A test does not necessarily have to take the form of a high-stakes, anxiety-producing, multiple choice final exam. The authors include any form of information recall that students do without looking at their notes, such as using flash cards or quizzing each other. Any time that you have to work to remember something, your brain makes a stronger connection to find that information, so it is easier to do so the next time around. The authors also warn readers up front, “your students won’t like this.” However, they also give advice on how to incorporate the testing effect without terrorizing your students: namely, giving frequent, low-stakes quizzes that do not impact the grade very much, which also helps to decrease the negative connotation of tests. I was a big convert to the testing effect after reading this book, but I do have reservations about frequent quizzing that essentially becomes a form of taking attendance. I think Kohn is pretty extreme in his arguments, but I do not think that traditional tests are the best method of student evaluation in many circumstances. Portfolios, papers, and projects are often far superior options, but I think that tests do also have their place. For example, I tagged along during a dendrology field lab last week to observe the professor, and dendrology is definitely a class that requires substantial memorization. The professor did a great job of weaving stories and context into his coverage of the different trees and also gave students tips about how to organize their tree descriptions to see connections among species. He also quizzed the students four or five times during the class on trees they had learned the previous weeks. I think this sort of class (anatomy would be another one) is a good candidate for frequent testing, which the dendrology professor is already doing. I guess I would caution that tests do serve a purpose in some cases, so do not completely overlook their potential.

As with tests, scholars disagree on the value of rubrics. Kohn thinks that rubrics discourage creativity by telling students what to expect and delimiting the boundaries of the project. On the other hand, Lombardi promotes rubrics. The rubrics I have seen as a student are usually pretty general and do not seem to greatly constrain the project, especially if the professor includes something along the lines of “other project formats are acceptable but must be cleared by the professor to make sure they are appropriate.” I honestly think rubrics are kind of annoying, but I also believe they can be good for guiding the assignment with a general set of expectations. In another book I read, How Learning Works: Seven Research-Based Principles for Smart Teaching, the authors describe and then troubleshoot a common complaint of professors that students come into a class unable to carry over previous knowledge from former classes. The authors attribute this inability to a lack of “deep learning,” which may be the issue more often than not, but I also feel that sometimes students simply suffer from tunnel vision and do not think to apply knowledge they already possess in a new environment. Small prompts in assignment instructions or rubrics might go a long way in helping students tap into these other resources they possess. Thus, though counterintuitive, maybe such guidance can actually increase creativity? Rubrics are also good for transparency in grading, which decreases resentment among students and helps them understand what they did and did not do well. I had a T.A. last semester who deducted points for nit-picky and really just random and unfair reasons that made no sense or were flat-out wrong: we could do no right on our assignments, according to him. In the words of my friend in the class with me, “I have never felt personally attacked by a graded assignment in my entire life until now.” We never debated the grades with him to avoid being “those people” who quibble over points, but he would have avoided considerable resentment if there had been a rubric at least suggesting some of the logic behind the strange deductions. It’s like, “if you wanted it that way, why didn’t you just say so?”

Don’t bash the basics

So, apparently I enjoy playing the devil’s advocate when it comes to the weekly readings. I agree with some of what both Langer and Wesch write but—in a nice, exciting middle ground position—some of their views on anti-teaching and how to learn most effectively also differ from mine. Langer speaks of the dangers that overlearning, excessive practice, and drilling “the basics” pose to student creativity and even mastery. One of the hazards of overlearning is the inability to react to new situations, although the examples Langer provides do not exactly lend a sense of urgency to incorporating mindfulness into education (turning on a car blinker on an abandoned road? walking on the left as opposed to the right side of the sidewalk?). However, I absolutely agree that practicing can be done to a fault. My first thought when I read this article was of a book I recently finished (and which I seem to reference in most of my blog posts and comments), Make it Stick: The Science of Successful Learning by Brown et al. I highly recommend this book to anyone interested in teaching…very interesting and informative but also enjoyable to read. The authors discuss some of the educational myths that Langer outlines as well as strategies supported by pedagogical research to help students learn. One technique is interleaving (see also Scientific American article), as opposed to the traditional method of practicing a single skill repetitively before moving on to another. Interleaving is mixing up types of problems or drills: for example, instead of grouping math problems in a homework assignment by whether they require addition and subtraction or multiplication and division, the problems are jumbled. An example from the book outside of education is batting practice in baseball. In one study, pitches were thrown in random order to one group of players and blocked by type of pitch (twenty curveballs followed by twenty fastballs, etc.) to another. The players in the random, interleaved practice struggled more, not knowing what pitch they were going to get, but ended up performing better than the other group in future practice and games because they had learned to discriminate between different pitches.

The topic of overlearning, especially in reference to “the basics,” also reminded me of the “Everything is a Remix” YouTube video series and TED talk by Kirby Ferguson that we watched in the Preparing the Future Professoriate class last semester. Langer warns against mindlessly going through the motions of learning basic skills, whether in tennis or math, without considering individual needs and abilities. Langer also questions the notion that a standard set of basics should exist, because these guidelines may hamper modifications that permit creativity and lead to new insights. Clearly, everyone is different, and what works for one person will not necessarily work for someone else. However, according to Ferguson in Part 3 of his series, “copying is how we learn.” In contrast to Langer, Ferguson seems to identify more with the school of thought that we need to learn some set of established fundamentals before we can go on to achieve greatness. He provides examples, mostly of famous artists, who started out by copying the work of others before becoming creative geniuses themselves. Bob Dylan’s first album consisted mostly of cover songs, and Hunter S. Thompson retyped The Great Gatsby, word for word, to know what it was like to write a novel.
There are other examples, but the point is that practicing a skill in a prescriptive manner or according to what someone else did does not necessarily prevent or stifle creativity. Langer is not calling for a complete overhaul of basic skills acquisition, but the goal of individualizing “the basics” for every person is somewhat unrealistic. That being said, small changes can go a long way: for example, offering a few different ways one might hold a tennis racket is easy to do and avoids the mindset of “this is absolutely the only way this will ever work for you.” But I would argue that a general set of basics, fundamentals, or prerequisites is not only time-saving but also useful and not in opposition to the goal of individual learning and mastery. So, to tie in with my post title, “yo, don’t bash the basics.”

Following our discussion on connected learning, I think we all hope to share our excitement about a subject with students to ignite their curiosity. Better yet, the students can then discover how the topics have meaning in their own lives–maybe beyond tests. While I think most of us want our students to find a passion for learning, Wesch accurately describes how many of us come up short. Students struggle to connect their education to anything meaningful? Yep. Students are more concerned with tests than understanding? Also yes. I am eager to change the climate of higher education to re-awaken a love of learning in students, but I thought Wesch’s views were biased toward a decidedly academic mindset. I believe that most college students are rational human beings, and while many of them do possess the capacity to love learning, they would also very much like a job one day that provides them food, water, shelter, and the ability to pay off student loan debt and procreate in a financially responsible manner. Fifty years ago, a Bachelor’s degree was more than sufficient to get this sort of job. Now, a Bachelor’s might not be enough, and applicants must additionally have good grades, internship or research experience, community service, and other resume-building activities. While many college students today are a product of the culture of standardized testing in K-12 education, their preoccupation with grades and tests is also a bit of a survival tactic: the job market is competitive, and, like it or not, grade point averages help determine whether or not you come out on top. Despite the very real pressures students face, instructors can, and should, cultivate a desire to learn. All people are “cut out for learning,” to quote Wesch. However, I disagree with the notion that school, in the sense of colleges and universities, is for everyone. The education system obviously leaves much to be desired, and schools should better facilitate student success and encourage students to get excited about learning. That is to say, school is for many more people than current conditions would suggest, but still not for one hundred percent of individuals…and I think that is okay! I tend to be a big fan of trade and vocational schools. You want to be a raft guide for the rest of your life? Or a massage therapist? Or a welder? That’s awesome; you’ll probably be much happier than most academics who make six-figure salaries. Constant learning also takes place in these other professions. Or what about students who are driven by other, equally worthy passions besides strictly learning?
For example, one of my friends studied to be a doctor (so, medical school and not vocational school, but you see where I am going with this), not because she is endlessly curious about disease mutations, but because she wants to help provide medical care to underprivileged people in the rural South. In order to be a good doctor, she will continually learn as well, but a burning desire to know is not what keeps her going; rather, I think her primary goal is to actively help people. The hunger for knowledge Wesch speaks of seems to apply more to Master’s or Ph.D.-bound students interested in research. I feel that, as teachers, our focus should move towards not only fostering an atmosphere of learning but also helping students connect with their true interests and curiosities…and realizing that these will not always coincide with the passion for learning in a school setting that Wesch describes.

Who do blogs connect?

The last blog I wrote for the Preparing the Future Professoriate class last semester, “To blog or not to blog after this semester?”, actually comes full circle quite nicely to the readings for this week. The consensus seems to be, at least among the authors of the assigned readings and Godin and Peters, that blogging is awesome. I definitely raised my hand in class last week when Dr. Nelson asked, “Who in here hates blogging?” While I am not a huge fan of blogs, I do see the value in the activity. As I wrote in my previous blog entry on the topic and as the articles describe, a blog is a great place to practice writing, much in the same manner as the journal or diary of yesteryear. I certainly saw improvement in my writing over the course of mandatory blogging last semester. According to Hitchcock, blogs have great potential in academia. Many professors and scientists struggle with connecting their research to non-experts and “normal” people but often also have trouble communicating in general (I call it as I see it). Writing a blog forces authors to think through the information they want to convey in order to present a coherent argument. The benefits of this practice are twofold: one advantage is the practice in communication, but organizing information into a digestible format also helps the authors better understand and form deeper connections with their own material. One big plus for blogs over old-school journals is the possibility of two-way dialogue with readers, which Rosenberg likens to the telephone, and of nearly immediate feedback. Another beauty of the digital blog is the ability to modify, update, and correct posts after publication—a “freedom to fail,” if you will. This freedom should be liberating to academics who normally must conform to the rigid formatting guidelines of scholarly journals and get caught up in what reviewers might think.

One potential caveat to the blog hype is that, while blogs are ideally open forums accessible by anyone on the internet, most bloggers will not reach a broad audience but rather a small handful of followers. The readers one is able to attract are generally colleagues (if the blog is in the academic realm) and friends. That is to say, blogs do not necessarily initiate conversations with the uninformed masses and, instead, present an example of confirmation bias: the people who regularly read a particular blog largely do so because they know they will agree with the views presented by the blogger. Not that there is anything wrong with this arrangement. Opportunities for public discussion exist if readers do want to weigh in on a topic, but blogs largely serve the blogger through the act itself of synthesizing information to create a post. Blogging can still be worthwhile, even if no one besides the author ever visits the site. I just wanted to point out that the vision of blogs as an educational tool that invites discussion and collaboration with people around the world is a possibility, but also quite idealistic.

The only other hesitation I have regarding blogging is a fear of too much technology. Not to sound like grandpa or a conspiracy theorist. On the contrary, I am very much on the bandwagon that believes technology is the key to solving many problems in the world. However, I do shudder at this new expectation that we should spend an additional hour or two every week hunched over a keyboard in front of a bright screen working on our digital identity, especially when most of us in higher education already spend most of our days doing just that. I feel that there are other approaches to accomplish the blogging goals, such as writing in a journal or setting up regularly scheduled, informal meetings with peers and colleagues to discuss research. Blogs are definitely a streamlined, glitzy alternative to the traditional ways of doing business, but that does not necessarily mean that everyone should feel like they have to blog. If that sort of thing tickles you, then wonderful. But if not, I think that is also fine.