Mindfulness and Technology in the Classroom

Given the recent readings and discussions we’ve had about mindfulness and technology use in the classroom, I thought some people might be interested in an article I came across. It discusses how integrating these technologies in the classroom can certainly help students, but the students who benefit are often the ones already doing average or better; struggling students might actually do worse. Additionally, the technology can make struggling students look busy– as if they understand the material when they actually don’t. Granted, the article (and Edutopia in general) focuses on K-12 classrooms, but I find that a lot of their articles, this one included, have implications for higher education too. The article link is below; I recommend following Edutopia on Facebook if you want to see more of their stuff!

https://www.edutopia.org/article/looking-edtech-through-equity-lens?utm_source=facebook&utm_medium=socialflow

Being a Mindful Fisherman Takes Creativity and Curiosity

In this week’s readings, Douglas Thomas and John Seely Brown, in A New Culture of Learning (2011), argue that the old saying, “Give a man a fish and feed him for a day; teach a man to fish and feed him for a lifetime,” doesn’t capture the rapidly changing dynamics of almost every aspect of today’s world. What if there’s something else that should be fished instead? Or, even worse, what if there aren’t any fish left? All of this week’s resources cited a need to teach, on some level, curiosity and a willingness to be creative as a solution to dynamic problems.

I agree with this, to an extent. Certainly, being a creative fisherman– one who thinks critically about how to make and cast fishing nets and how these skills might be used to get other types of food– is much better than being an uncreative one. After all, it is this creativity that leaves you open to change. And, of course, curiosity naturally leads to creativity, so that should be fostered as well.

But these readings seem to completely dismiss the usefulness of just learning how to fish. Ultimately, if you want to be creative with your fishing skills, don’t you have to learn how to be uncreative first? To just know how to do it outright? That is, there seems to be no credit given to simply knowing something, even if the thing that is known is subject to change over time.

Taking this a step further, I would even push back a little on the memorization and the kinds of reflexes that Ellen Langer writes against. When a student is panicking on an exam, having that reflex and knowing what they should be doing– never mind the why– might actually save them on a particular question, allowing them to answer it and move on. (Of course, Ellen Langer would probably then start arguing about right vs. wrong answers and such, but let’s face it– that is the nature of exams and most forms of assignments in classes. That’s the foundation for how we know how well students are doing. If we throw out right vs. wrong answers completely, how do we measure progress or give students valuable feedback?)

What I would propose instead is altering the original saying to something like, “Give a man a fish and feed him for a day; teach a man to fish and probably feed him for a long time; teach a man to think critically about his fishing skills and definitely feed him for a lifetime.” (But maybe that’s a mouthful.) Yes, things change, but I don’t think they’re changing so fast that old knowledge is becoming obsolete at a rate that makes skills– at least the basic ones– not worth teaching outright, with critical thinking layered on once students have a foundation on which to base it (and encouraged all the while). People have been fishing for thousands of years, after all. How much has fishing really changed in the last 5,000 years? 1,000 years? 100 years? 30 years? 5 years?

Engaging Computer Science Students: Technology Required

This week, as part of my Contemporary Pedagogy class at Virginia Tech (GRAD 5114), we were asked to read several pieces on using technology in the classroom to engage students who grew up in a digital age. In a computer science classroom, however, the use of technology is not an option; it is required. All projects are done on a computer and may well involve other pieces of technology, such as a mobile device, depending on the class. Students also quickly learn that Stack Overflow is the computer-science-oriented Google: you can search for various programming-specific problems and be almost guaranteed to find an answer. And if you can’t, you can easily post your own question and ask the community for help.

What this boils down to is that computer science classrooms are a special case where technology is concerned. Students are required to use it, so it must be in the classroom. They are also already engaged– or quickly learn to be– in online communities where they help each other with programming issues and concepts. The real challenge, then, is how best to integrate the technology in the classroom. Much of this challenge stems from the fact that the material taught in the class is highly technical, and I would argue it needs to be taught on a strong conceptual foundation. That is, if a student knows the concept and can explain it well, then they could probably figure out the code to accomplish it. If they just know the code but don’t understand the concept, then they can likely apply it in only a limited number of scenarios at best.
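To make the concept-versus-code distinction concrete, here is a hypothetical Python sketch (the scenario and function names are my own illustration, not from any particular course). A student who has only memorized the first function can answer one kind of exam question; a student who understands the underlying concept– repeatedly halving a sorted search space– can adapt it to a question they have never seen:

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent.

    This is the version a student might memorize verbatim.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1       # target must be in the upper half
        else:
            hi = mid - 1       # target must be in the lower half
    return -1


def insertion_point(items, target):
    """Return the index where target would go to keep items sorted.

    A new scenario: memorized code doesn't answer this, but the same
    halving concept does, with only small changes to the boundaries.
    """
    lo, hi = 0, len(items)
    while lo < hi:
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1       # insertion point lies to the right
        else:
            hi = mid           # insertion point is at mid or to the left
    return lo


print(binary_search([1, 3, 5, 7], 5))    # found at index 2
print(binary_search([1, 3, 5, 7], 4))    # absent: -1
print(insertion_point([1, 3, 5, 7], 4))  # 4 would slot in at index 2
```

The second function is essentially what Python’s standard-library `bisect.bisect_left` provides; the point is that a student who grasps the invariant can derive it, while a student who only memorized the first function is stuck.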

This is where Robert Talbert’s “Four Things Lecture is Good For” comes in (I actually happened to have him for a math course at Grand Valley State University way back when– small world!). Modeling the thought process behind a new computer science concept is, I believe, crucial to students’ success. Sharing cognitive structures, giving context, and telling stories all help solidify these ideas in students’ heads so that they can hopefully apply them independently in the course projects and exams.

However, this creates an issue in the classroom. Concepts can often be taught independently of actual programming examples, but at some point those examples need to be introduced to contextualize the concept. This means striking a balance in the computer science classroom between traditional lecture and programming examples that require computers– a balance that is very difficult to strike, as I found out firsthand this past summer when I taught my first class. Reflecting on my own experiences as an undergraduate, I think other professors in computer science struggle with this too; many of the professors I had seemed almost resigned to the idea that computers were going to be in the classroom and that students would get distracted by them. All they asked was that students who were more prone to such distractions sit toward the back of the class, in hopes that they would be less distracting to their peers. Perhaps part of this resignation also comes from the huge disparity in abilities in any such class: some students come in knowing nothing, while others have already learned everything. How do you accommodate such a wide disparity in a classroom where technology is a necessity?

Some of the readings discussed using games to help students get more excited about the material and thus more engaged. While this is a common tactic in computer science classes, I don’t think game-based learning can be used every day in such a context. There is simply too much technical material that, while some students may have seen it before, many have not. The sheer amount of technical material that needs to be covered makes it difficult to give students an opportunity to directly engage with it every class period. The solution in my class this past summer was to assign several large projects throughout the semester that built upon the lecture material. In combination with labs that gave students some designated time to practice, the hope was that they would achieve mastery of the material. Some did, of course, but I can’t help but wonder what a better balance would be between technology use and practice in the classroom and more lecture-based approaches, while still ensuring all necessary material is covered…