Being a Part of This Thing Called Education

Let’s start with a philosophical question.  What is the point?  I think that question can be asked of just about anything we are expected to do.  Go to school.  Get a job.  Get married.  Don’t get married.  Have kids.  Get a pet.  Learn an instrument.  Have a drink.  Start a blog!  Well, why?  Why do any of it?

All right, now that all the nihilists have sat back down, let me just say that I am not a nihilist. I believe the overwhelming majority of things in life matter to a certain degree, but some things matter more than others. You would most likely drive yourself mad very quickly if you tried to treat every detail of your life with the same degree of extreme importance. I certainly would. I am beginning this blog post with the question, “How important is formal education?”

I started thinking through this question this week after a few short readings on the subject (here, here, and an excerpt from here). As a career K-12 educator, I certainly had some thoughts on the subject, as well as some Pavlovian defensive reactions. You see, it’s pretty fashionable, not to mention politically expedient, to blame education for most of the ills in our society. If that ire were directed at those setting the “education agenda” (mostly politicians), then fine. In fact, I agree that some change is needed in the top-down direction given to public schools. Unfortunately, the people who get most of the blame are the educators working with students and running schools.

Now let me pause here and say two things. First, not all teachers are great at what they do. Everyone has had a bad teacher, and that experience leaves a sour taste for a long time. No one (especially other teachers) wants bad teachers in schools. That being said, please don’t let one bad teacher spoil your experiences with the other great teachers who cared for you and helped you learn. That’s like getting a bad waiter and swearing off going out to eat for the rest of your life. It’s nonsense. Second, I get why it happens. It’s easier to pick on someone who either won’t or can’t fight back in the same way they are being attacked. In schools, we call that being a bully. It also gets you in trouble. I suppose that is one of the disconnects schools have with “the real world.”

So let me take that opportunity to segue back into the idea of educational goals. That’s where I was going originally. If I were to condense those previously mentioned readings into one sentence, it would be this: there is a significant gap between what schools are teaching and what students need to be successful adults in our society. Right now, schools are focused on cramming information into students in an effort to get them to repeat that information back by correctly answering multiple-choice questions (at least that’s the perception). Instead, schools should focus on teaching students how to be good thinkers and problem solvers, and on guiding them toward their individual paths to personal fulfillment. I propose that we think of these two constructs for educational goals as points on a spectrum. We don’t have to choose one or the other, because there is a fairly massive area in between with plenty of available real estate.

So let’s say that right now our education system is indeed too close to the “information acquisition” side of the spectrum (the more skeptical might call this “memorize and monetize”) and we need to shift more toward the “motivate to innovate” side (which a different set of skeptics might call “rainbows and unicorns”). How do we get from where we are to where we’d like to be? Since this is a real problem, there are many thoughts on how to do it. Here are a couple of ideas worth considering and my thoughts on each.

Let’s consider how a school environment would change if we moved from “information acquisition” to “motivate and innovate.” The first change I think we’d all notice is significantly fewer lectures and PowerPoints. Notice I didn’t say we’d eliminate lectures and PowerPoints. There is absolutely still a need for experts to act as experts for the benefit of students who are working on learning. Don’t agree with that? Here’s what an expert has to say on the matter. Teachers have value. If we attempt to replace teachers with systems and AI, we are really doing a disservice to students (sorry, Sal, but sites like yours should support the work of teachers, including being used as curriculum, and should not attempt to displace them).

So we still need teachers, but we need teachers who are able to help students see the value in what they are learning. Some see this as making learning fun. I think sometimes we see stories or videos of kids making their own educational video games and think of that as the “gold standard” of learning. It’s a valuable approach that can yield cool results. For full disclosure, I taught and developed curriculum for two separate Video Game Programming courses. One was more focused on using games as a tool to teach programming concepts, while the other attempted to balance an understanding of gameplay concepts with basic programming concepts. The classes were great. Most students really enjoyed them. There were some in every class who didn’t.

So what happened there? Students had a fairly high degree of freedom to explore concepts in an elective class they chose to take. There was always time built in to play with the game design tools. That was intentional. Why didn’t all students engage in the course? Based on my personal interactions with each and every student in the courses I taught, the difference between the students who really tried and those who didn’t came down to whether they cared enough to try. Not one single student who did poorly in my class did so because he or she couldn’t grasp the content. If the student was motivated to get it, the student eventually got it. Some progressed faster than others, which is what you’d expect, but the difference between learning and not learning was student motivation.

This realization is incredibly important in the discussion of changing educational outcomes for students. Some (possibly even some of the authors and articles linked to in this post) see the key to improving education as increased use of situated learning and gamification practices. Both of those things are great! Both can lead to improved motivation and better learning outcomes. Neither is a “magic bullet” that reaches every student. We’d like to find one, but we haven’t yet. The most impactful teaching strategy I’ve ever seen is building personal relationships with students, but even that falls short of inspiring some of them. There has to be some movement on the part of the student to want to learn, whether that happens through reading, listening to a lecture, participating in a simulation, or playing a video game.

Be bold with innovative instructional approaches that push students to learn in unique ways. Online communities, gaming, and project-based learning are just a few of the multitude of instructional practices that move us along the spectrum toward “motivate and innovate.” Just know that regardless of the fun and engagement you build into a course, the outcomes you want to achieve matter, and there very well could be students who simply don’t buy in. That is, far and away, the most difficult thing about being a teacher, but you can’t let it define you as a teacher or cause you to lower your expectations to accommodate unmotivated students. Appreciate the fact that the majority of your students bought into the process, and continue to look for ways to help your students learn more deeply and demonstrate their learning more creatively.

Gaming Teaching vs Conventional Teaching

After watching the Vimeo video Digital Media: New Learners of the 21st Century, I thought I would share some of my concerns and comments with you folks. The first thing I want to say is that this is a really novel and amazing idea. It is really nice to see the educational system embrace technology and adapt to this new way of teaching. I think presenting subject matter through games or other forms of media is a unique way of teaching the children at this school almost without their noticing, and I would think it surely beats lectures in some respects.

That being said, I am worried that this style of teaching is catered to kids who are already “gamers or computer junkies,” which is fine, but I worry that it assumes all these kids want to go into game development or a computer-science-related job. I worry that they are so young and this curriculum might be too focused on the gaming aspect. What if gaming is just a hobby and not a passion they want to make a career out of? Do you think this gaming teaching style is as good as a conventional education? I ask because I feel that the way conventional schools are set up, all these different opportunities are given to you, which allows you to try them all. As it turns out, you might be really great at something you never thought you would be. I am just worried that these kids are being pigeonholed into a life that they did not choose but that was forced on them. Let’s face it: they are kids, and at that age they really don’t know any better or what they really want.

As a final thought, in the video the kids go on to say that they still have conventional classes; they just have different titles. Also, a teacher says that they cover all the material required by the state. My question to you is: do you think what they cover is as comprehensive as what other conventional schools cover?

Gaming the System

Laptops in the classroom are clearly a divisive issue, as demonstrated by the NPR article on the different approaches to technology in the classroom. The conversation does seem to be very polarizing, but oftentimes the different sides seem to be talking past each other. Of course laptops, much like other tools available in the classroom, have helpful applications under the right circumstances, but I suspect that the people who discuss banning laptops aren’t teaching classes that use laptops as a tool or teaching aid. It seems strange, then, to take extremist positions when the value of technology in the classroom is self-evidently situational.

This gets at a larger issue that I’ve found intriguing during my professional development: Is it the job of the student to pay attention or the job of the professor to keep the attention of students? Should professors aim to reach only those that demonstrate interest and engage with the material, or is it necessary to make sure that every last student is engaged in the class?

I struggled academically during my undergraduate career, often because I was on my laptop not paying attention. I eventually had to come to my own realization that doing such things wasn’t in my best interest and take responsibility for what was going on. Isn’t putting the onus on professors to be entertaining, or to strive to keep the attention of all students, merely absolving students of the responsibility to take control of their own education? Students have gazed off into the distance and daydreamed since time immemorial, so why put the pressure or responsibility entirely on the professors?

Additionally, the exploration of the different methods of learning developed from video games or other types of problem-solving is an intriguing idea. As a “gamer,” I recognize that there is an inherent drive to master the mechanics of a game, to be able to adapt to different challenges to reach an ultimate goal. This can also be accomplished in board games or interactive/physical games, not just video games (if conferences in my particular academic concentration are anything to go by, “simulations” and other game-like teaching methods are clearly becoming more popular, and publishers and academic-focused companies are creating more and more products to meet that demand).

Ultimately, I do think that laptops could create better learning environments if used correctly. For instance, the introduction of games (both video and traditional) in classrooms could create better outcomes than traditional lectures. As a teenager, I developed an interest in history and politics through the video games I was exposed to, which has in many ways led to a deeper understanding of certain historical events than I might otherwise have had if I had only learned about history through books or academic lectures. There would be some cool ways to incorporate games into courses to promote that type of active learning, but there will still always be students who aren’t interested in whatever form of teaching is being offered.

In the final analysis, some students like lectures and others hate them, and the same goes for video games or other methods of learning. It’s unclear how this tension could be resolved or how any clear-cut determination of which teaching method is “better” could be made given these issues.

Please let me know your thoughts on this issue; I’m legitimately curious to hear what other people think about the different approaches to learning and class management.

Am I in the Right Room?

I’ve spent the better part of two hours typing a blog post that will sit in draft mode until further notice. The gist of the conversation I was hoping to stir concerns our responsibilities, as teachers and parents, in the paths our children and students take. Not all engineering students will become engineers. Not all designers become designers. And we should all understand that that is okay.

I switched halfway through my undergraduate from mechanical engineering to industrial design. Don’t get me wrong, I could grasp the concepts, I just couldn’t produce the math. In my professional career as a designer, my area of expertise has always been more function than form, but I’m still not producing the numbers. That’s what engineers are for. And I don’t expect engineers to always get the form right. That’s what designers are for.

Does this mean my math professors didn’t teach me correctly, or that my engineering colleagues had terrible art teachers? No. Most students will be better at one thing than another; it’s how we become specialists in our respective disciplines. But let me pose this: if an art teacher had taught more like an engineer, or the math teacher had thought more like the artist, would things have turned out differently?

I had parents that backed me up when I decided to turn around. I had tears in my eyes when I told them I was quitting one thing to do another. But it was worth it. For all of us. I ended up in a career I didn’t hate with the best teachers I could have asked for. I grew stronger in my relationship with my parents through honesty and open conversations like that one. It changed my life for the better.

Now, I am fortunate to be in an immersive field of study. Industrial designers learn concepts of form and function, conceptualize through drawings, receive feedback from students and teachers both, and iterate through physical models until the final form is delivered, usually a full-scale working prototype. It is extremely satisfying. It requires the input of the student. It requires the student’s peers to make comments and provide feedback. It’s a shame other fields of study do not have this same opportunity.

Or do they? Does this translate to other disciplines? How does a history major build? Is it only through their writing, or does it require physical travel to places of historic significance? Or both? What if you can’t get there from here? What if it’s too cost-prohibitive? What if you’re too ill to go? What if you have the obligations outside of academia that we all find ourselves with? What-ifs can kill dreams.

I think this is a world preparing itself for VR. Advances in the technology are going to shift what we can do and make the inaccessible accessible. Adbhut wrote a good blog post with plenty of questions we still need to examine and answer. Using VR/AR, can we experiment with how we teach engineers through the lens of the artist? Can we create a virtual art class that designs sculpture through the algorithms of human bone growth? What new information will be passed, exchanged, and shared by those students, as was pointed out in our latest GEDI readings?

I am hopeful. It is my opinion we are already providing more avenues for students to learn than ever before. We are beginning to provide the most current of tools and are creating more. The students are also already changing how they learn if given the space. Hopefully we are able to guide them toward meaningful lives, able to help them correct themselves when that path gets rough, and even have the strength to say it’s okay to turn around and find a different path. And hopefully, it is because we’ve already exhausted all other options, and not because we told them they weren’t good enough.

Week 3 — Laptops And Phones In The Classroom: Yea, Nay Or A Third Way?

I am a big advocate of a shared responsibility model in a classroom setting, where individual students and teachers share equal responsibility for the success and failure of the classroom (including the individual success or failure of students). I personally think that all forms of personal technology should be banned from classrooms (my perception has its genesis in an old school of thinking), simply because surfing the web not only distracts individual students but also distracts their neighbors, and this distraction propagates like a ripple effect. Here are more reasons why I think personal technology should be banned from the classroom:

  1. According to a study conducted by Microsoft, the average attention span has dropped to eight seconds, falling below a goldfish’s attention span of nine seconds. If unrestricted use of technology is allowed, matters will only become worse. The classroom is one realm of life where people do much of their learning, and learning to sustain longer attention spans can help students in the future.
  2. Classroom time is a time for face-to-face interaction among students; the use of technology diminishes this time to almost zero. Additionally, this diminishing social interaction is producing more and more anti-social human beings with a skewed perception of social normality.
  3. Excessive use of technology has increased mental health problems. I think the classroom can be the one place where students get a “break” from using technology and increase physical (not virtual) social interaction.
  4. Using phones or laptops for social media or other entertainment purposes while a teacher is lecturing is extremely disrespectful. Condoning such behavior on a regular basis creates a perception of normality that is not in accordance with expectations in industry. Let me share a personal incident: during my internship in summer 2018, some of my fellow interns were using phones during client meetings. After various failed hints, the meeting host, out of frustration, explicitly told the interns to put their phones away because it was disrespectful. A fellow intern responded to this “order” by saying, “I can multitask,” and continued to use their phone. Following this intern’s lead, some of the other interns kept using laptops and phones. I was appalled by this behavior and the utter lack of respect, without any sign of remorse, for the client, the company, and the meeting host (a higher-up). I think this behavior can be ascribed to the normalization of technology use in the classroom setting.

Times are changing, and so are teaching techniques and learning methodologies. Consequently, an absolute ban on technology is not pragmatic in today’s “modern” classroom. However, I think that using tools to limit the sites a student can access (see the brief sketch after this list) is a reasonable way to:

  1. Give students the access to technology they need to be successful in a class.
  2. Stop students from using social media for the duration of class, so they don’t distract themselves or their neighbors.
  3. Encourage student interaction.
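To make that concrete, here is a minimal sketch in Python of the kind of check such a filtering tool might perform. This is purely illustrative: the blocklist approach, the domain names, and the function name are my own assumptions, not a description of any particular product.

    # blocklist_check.py - illustrative sketch of a domain-blocklist check (hypothetical)
    from urllib.parse import urlparse

    # Hypothetical set of domains to block during class time
    BLOCKED_DOMAINS = {"facebook.com", "instagram.com", "twitter.com"}

    def is_allowed(url: str) -> bool:
        """Return False if the URL's host is a blocked domain or one of its subdomains."""
        host = (urlparse(url).hostname or "").lower()
        return not any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

    print(is_allowed("https://en.wikipedia.org/wiki/Attention_span"))  # True: allowed
    print(is_allowed("https://www.instagram.com/p/abc123"))            # False: blocked

A real deployment would enforce this at the network or browser level rather than in a script, but the allow/block decision is the same idea.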

The article cites Jesse Stommel’s quote, “I don’t think the attention of students is actually something teachers can or should control,” and I couldn’t disagree more. Saying that teachers should not take action to make students better human beings is akin to saying that a doctor should not control a patient’s prescription. The job of the teacher is to teach students the class material, teach them something about life, and help them become better versions of themselves. If students are distracted, how is the teacher supposed to do that job? These children are still learning. This is the time when a person with much more life experience should guide them through right and wrong.


What happened to the overhead projector?

[Image: an overhead projector]

I am old. I am a dinosaur. I learned, and started teaching, with an overhead projector and mimeograph sheets. As I write this blog post, I am trying to guess when they disappeared from the land of higher learning. Do I need to include a Wikipedia link to help any classmates reading this post who have no idea what an overhead projector is? Am I so old that I miss the days when the TV rolling in on a tall cart meant we had a substitute teacher?

I read Anya Kamenetz’s article about laptops and phones in the classroom, and I thought about the overhead projector. When I was taking classes while watching the pterodactyl fly by the window, this was all the technology in the room. My goal through high school and four years of engineering school was to get as much information as possible from that pre-made transparency into a notebook. I got really good at synthesizing the notes and taking cues from the teacher or professor about which portions would be on the midterm or final. I would then take my notes back to my little dorm room and recopy them, supplementing them with particular definitions or equations from the textbook. My fellow classmates would come and buy portions, or all, of a set of notes for a class. That money went a long way toward buying cheap beer and a dinner somewhere. Somewhere in the process, that grainy, twenty-year-old transparency sank into my head so that I could put it back onto that test.

Now, twenty-plus years later, I sit in a state-of-the-art classroom with projectors and a webcast of someone attending the class from a bar in the Dubai airport. Everyone has a laptop or tablet sitting in front of them. If the professor flashes something up on the fancy screen, a person can whip out their smartphone camera to make sure they don’t miss it. If I don’t understand a definition, I can Google it, and it is at my fingertips. I don’t need the textbook or a trip to the library, because it is all at my fingertips.

The etiquette is now different. If you have a cell phone, you hide it under the table so the professor doesn’t see you texting your significant other or checking that Facegram or Instabook account. I always try to sneak a peek at the open computer next to me. The vast majority of people are surfing the net, scrolling social media, looking at job sites, or trying to do the homework for the next class.

I read Anya Kamenetz’s arguments for the pros and cons of this technology. She missed the point. It isn’t about engaging students so well that they are too busy to be distracted, or about a teacher banning technology outright. She missed the responsibility of the student. It is up to the student to learn the material. It was no different copying notes from the overhead projector than it is today copying them off the fancy touch screen. If the student wants to do something else, the student has missed out on the opportunity to learn material that someone thought he or she should know. Leave it up to the student to sink or swim. That is how it is in the real world.

My guess is that 2001 was when the overhead projector went the way of the dodo.


Thoughts on “Four Things a Lecture is Good For”

This article, Four Things a Lecture is Good For by Robert Talbert, makes some good, concise points on the topic of traditional lecturing. It should be noted that criticizing the traditional, stand-and-deliver lecture isn’t new. As the article outlines, decades of research have repeatedly shown that lecturing does not lead to adequate levels of content retention. In fact, the numbers that have been reported to me over the years all suggest that the levels of retention from traditional lecturing are embarrassingly low. It would be easy (and possibly lazy) to attribute these findings to a few bored and boring lecturers. I used to believe this, but not as much now. I am becoming more sympathetic to the argument that lecturing is overused and is an inferior way of teaching in many circumstances.

Now, with the emergence, or rather the insurgence, of online educational sources, traditional, lecture-based education is forced to justify itself. On this matter Talbert says, “Resorting to a lecture because I need to ‘cover material’ is just an admission that I didn’t design my course well. If that’s all the lecture is for, put it online so students can at least pause and rewind.”

In spite of the online takeover of education gaining momentum and territory, there is still room for the traditional lecture. I agree with the author on this point. He outlines four occasions where lectures are still an optimal content delivery method. Of these, the most persuasive to me is “modeling thought processes.” Providing an expert model, especially a role model, through software is very difficult, if not (currently) impossible. The presence of an expert human being demonstrating thinking in reality is something that virtual reality cannot yet imitate.

In summary, I believe that education is due for a much-needed update, steering away from the overused, lecture-based content delivery method. I also believe, like the author, that lecturing still has its place and should remain an important part of the pedagogical arsenal.

https://www.chronicle.com/blognetwork/castingoutnines/2012/02/13/four-things-lecture-is-good-for/

Technology in the Classroom?

I want to start by saying “WOW.” It never ceases to amaze me how polarized opinions are, regardless of the topic.

As with many things in life, there are very few absolutes, and I believe that technology in the classroom falls into that category. I will attempt to illustrate the discretionary nature of this question by introducing two considerations. That said, there are numerous considerations that could be used, and these are not intended to be absolute.

The first is the level of cognitive learning associated with the classroom in question, i.e., memorization, understanding, and application, as defined at the link below.

http://www.indiana.edu/~idtheory/methods/m1d.html

Memorization. This is rote learning. It entails learners encoding facts or information in the form of an association between a stimulus and a response, such as a name, date, event, place, or symbol. For example, these are facts: Columbus discovered America in 1492; Pi = 3.1416.

Understanding. This is meaningful learning. It entails learners relating a new idea to relevant prior knowledge, such as understanding what a revolutionary war is. The behaviors that indicate that this kind of learning has occurred include comparing and contrasting, making analogies, making inferences, elaborating, and analyzing (as to parts and/or kinds), among others.

Application. This is learning to generalize to new situations, or transfer learning. It entails learners identifying critical commonalities across situations, such as predicting the effects of price increases. The behavior that indicates that this kind of learning has occurred is successfully applying a generality (the critical commonalities) to a diversity of previously unencountered situations.

 

The second relates to the type of education being pursued in the classroom. This will be generalized into three subtopic areas:

-Public K-12 Education

-Private K-12 Education

-College / Post-secondary education

 

It can be argued that during the memorization phase, technology is not necessary for a student to learn and absorb the information being delivered. However, the inverse can equally be argued: technology facilitates certain types of rote memorization, e.g., digital flash cards. During this phase, students are more reliant on an instructor for the delivery and explanation of the material.

As the phases progress, the reliance on the instructor should begin to decrease. This is made evident by the following quotes from the above source: “It entails learners relating a new idea to relevant prior knowledge” and “successfully applying a generality (the critical commonalities) to a diversity of previously unencountered situations.” Students in the understanding and application phases must at some point “disconnect” from the instructor in order to “relate and apply” what is being taught.

Here, as with memorization, arguments can be made for and against technology in the last two types of learning. Each of these arguments has nuanced characteristics that relate specifically to the type of course material and the way it is presented. Therefore, as stated initially, it depends; it is a discretionary call. However, the level of authority an instructor is given to make that discretionary call should not be a blanket one.

Before I begin this section, I want to state that if a student’s technology is negatively impacting the ability of other students to learn, corrective action should be taken, e.g., three students in front of you streaming a soccer game (not that that has ever happened to me 😉).

In public K-12 schools, students’ education, and in some cases their transportation and food, is funded by taxation. The use of technology should be more heavily debated within this context. In many circumstances these students do not have the choice of private education and are reliant on the established policies to have a beneficial impact on their education. If students fail to perform to a satisfactory level, a burden is potentially placed on taxpayers to fund another year of schooling, transportation, and food. Additionally, as stated by Mr. Glupton during class, “teachers are measured with metrics and the burden is placed on them to get the student where they need to be.” Therefore, if a student fails to perform, at least three negative outcomes follow: 1. the student falls behind; 2. the taxpayers must foot the bill again; 3. the teacher is found at fault.

What teachers are not, in these scenarios, is parents or guardians. In Laptops And Phones In The Classroom: Yea, Nay Or A Third Way?, Ms. Welzenbach says these devices are worse than distracting: “They can connect teens to cyberbullying, hate speech, sexting and other ‘unhealthy’ experiences.” Many of these topics fall into the parental/guardian arena and should not be used as ammunition for a teacher’s discretionary decision on this topic.

Private K-12 schools have a different dynamic. The students in these settings have parents, guardians, or other individuals paying for their education. In this setting the benefactor should have some input, if not complete authority. Here there is no financial burden on the taxpaying population for an error in judgment by the benefactor. Additionally, the teacher should be able to document the recommendations and associated concerns given to the benefactor, in order to provide insulation from decisions made outside of the teacher’s control.

A similar argument can and should be made for institutions at the college or post-secondary level. The students are now placed in either a student/benefactor or a student/beneficiary position. In either case the students should have complete discretionary control over how they see fit to be educated (again, as long as it doesn’t interfere with others’ ability to learn). If they decide not to pay attention and instead shop online or text/email a friend, that is their prerogative.

In most cases these students will have to compete in the open market to earn a living. Their traits, qualities, and education are just some of the components that will feed into this equation. Many arguments or examples can be made that entertain the outliers, the students who are atypical. There are those who are brilliant and never needed formal education to be successful (Steve Jobs, J.K. Rowling, Steven Spielberg). Then there are those who just needed a little push from an educator to ignite a spark or a passion within them. But a majority will fall somewhere in the middle, and their cumulative decisions and experiences will impact where their lives take them.

At some point the “umbilical cord” needs to be cut, and students need to take control of and responsibility for themselves. In my opinion, day one at your first job is too late to figure out how to effectively navigate technology and the distractions it can bring. Regardless, at this level it is not an educator’s responsibility to police their students. Instead, the focus should be on doing the most good possible and ensuring that the environment does not handicap those who want to learn.

In summation, there are many nuanced scenarios, circumstances, and factors that affect this discussion. I do not believe there is a right or wrong answer that is applicable in all settings. The intent was to highlight two possible factors that bear on this decision, in the hope of illustrating just how situational it is.

 

VR learning for Generation Z

Let me start with a small story. Three years ago, I went to Walmart and accidentally stumbled upon a huge stack of Virtual Reality (VR) headsets. I had heard about VR from a friend who had recently bought a fancy VR headset for $1,000, and here I was looking at a super-cheap smartphone version costing just $15. I bought it out of curiosity, went home, downloaded some VR apps on my phone, and was amazed at the extent of things I could do with it. I could play some super cool games, experience sitting in a roller coaster, and, not only that, there was an app where I could look at the inside of a human body. Of course, in this toned-down, cheap version I could not really do as much as with the $1,000 one, but it was still good enough to get a feel for it. What I actually learned from this experience was that VR was no longer some distant future, and it was no longer limited to games or entertainment; it could be used as a great learning tool in this digital age.

According to a popular model developed by the educational theorist Fleming, there are four types of learning styles: visual, auditory, reading/writing, and kinesthetic. But is this model really applicable to the generation of digital learners? A recent Barnes and Noble College study, conducted on 1,300 middle and high school students, shows that today’s students are not big fans of passive learning. They are not interested in showing up to lectures, taking notes, and then memorizing for exams. Instead, they want an educational experience that is immersive and engaging. For example, 51% of the surveyed students said that they learn best through active participation, while only 12% said that they learn by listening. They also said that with technology and hands-on experience, learning is much more fun for them. This survey definitely gives us a hint of what future learners expect from their classroom experience, and VR can play a big role in making that experience more immersive.

The biggest advantage of VR could be in visualizing and understanding difficult and abstract scientific concepts like magnetism, relativity, and human anatomy. Students can perform complex chemistry and biology experiments in virtual labs without having to worry about the dangers of handling chemicals. This TED talk by Michael Bodekaer, the co-founder of Labster, shows how virtual labs can revolutionize education. Another technology that is gaining popularity is Augmented Reality (AR). AR adds another dimension to the learning process, and teachers can combine the traditional approach with innovative, practical illustrations of complex concepts.

While the costs of these high-tech, fancy VR sets are undeniably high for now, they will inevitably drop. But for starters, why not experiment with the super-cheap ones, like the one I got from Walmart? There are numerous apps available that work with these low-cost headsets, for almost every subject, including chemistry, physics, zoology, history, grammar, and the list goes on. In the end, it’s up to educators to understand the needs of Generation Z and not only make the learning experience more engaging, but also make these students a part of that experience.

Two Cultures of Education

The traditional classroom teaching approach has been around for a long time. Listening to someone deliver a lecture and taking notes is the most common form of education we have received. But it is slowly changing, and whether that is for good or bad is a subjective question. We are shifting to a new active learning approach, better known as a learner-centered approach, in which various digital learning techniques are used in the learning environment. I personally feel both techniques have merit.

The traditional lecturing approach is good for learning directly from an expert on the topic or subject. But taking notes by hand during lectures is not an effective way of learning; one tends to focus more on writing than on learning and understanding the new information. The newer classroom practice of teaching through presentations and fill-in-the-blank notes is one solution to that. If the teacher is able to deliver an interesting lecture without boring you, it is a good method of learning. But the truth is that there is not enough practical, experiential, or active learning. Classroom lecturing is a must for learning a basic set of information and worldly knowledge, but other skills necessary to survive in this world, like communication, teamwork, problem-solving, and learning through competition, are missing.

Digital learning is a new culture in which technology is used as a medium for learning. Human beings learn the most when they are faced with a problem they have never seen before and are eager to solve. Computer games are one avenue for effective problem-solving, teamwork, and learning. Quest to Learn School is an initiative in this direction. Its curriculum involves game-based learning alongside traditional lecture-based learning. Developing their own games allows students to succeed by failing and trying on their own. It also allows them to think creatively and use their imagination, which we do not generally get to do in a lecture-based approach.

Students need an environment in which they can learn and experiment. A mix of traditional lecture learning, game-based active learning, and communication through dialogue and discussion is the best way forward, in my view. The world is changing, and we need to adapt to those changes as well. Which approach do you like more? Feel free to share any experiences or thoughts.
