I was just watching the PBS Frontline documentary Digital Nation and it completely irritated me. Here were professors from MIT, one of the best technical institutions in the US, complaining about their students’ use of technology. I can’t even say that this is ironic, as much as it is just plain stupid.

The crux of the argument is that students these days are completely distracted because of the internet. They’re in the classroom, but rather than listening to the lecture, they’re googling things, reading articles, or probably chatting with their significant other. The professor is frustrated because he gave a simple exam that just tested whether the students were paying attention in class, and the students scored poorly. The immediate thought that came into my head was: why is the professor blaming the students rather than his teaching methods?

I teach classes at the Tokyo Hackerspace. My first class was a basic electronics class in a lecture format, and I took three hours to deliver everything I thought participants needed to know about electronics. It was enough information to give them a firm foundation in the basics of voltage, current, resistance, capacitance, and simple design patterns. That was a mistake, and the problem wasn’t the participants; it was my assumption that they wanted to be told the basics of electronics.

I finally realized the problem: I was telling them the basics so they could build on top of them. But without experiencing it for themselves, they would never understand why you need to limit the current through an LED or why you need bulk capacitors on a power supply. One of the reasons the new MAKE: Electronics book is so great is that it encourages readers to break things as well as make them.
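(For the curious, here’s the back-of-the-envelope version, assuming a typical red LED with a ~2 V forward drop driven at 20 mA from a 5 V supply: Ohm’s law gives R = (5 V − 2 V) / 0.02 A = 150 Ω for the current-limiting resistor. Leave it out and the LED draws whatever current the supply will give it until it releases the magic smoke. That lesson sticks a lot better once you’ve actually smelled it.)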

Anyway, I finally grasped the concept that people want to discover things for themselves. They want to understand why certain things are the way they are, rather than just being told. They want to build things and customize them, make them their own, and put their stamp on them. This is the proper way to teach people, and it was how teaching worked for centuries, back when the old master/apprentice relationships still existed.

We got away from hands-on work and master/apprentice-style teaching when educational institutions came into existence. I’m not an expert in educational history, but my understanding is that they evolved from the Prussian system of standardized education, which was created to control the populace. Whoah…I’m not going to get into any conspiracy theories here; the main point is that standardized education and educational institutions are a relatively recent phenomenon. And I am completely surprised that professors don’t realize that lecture-based formats are a horrible way to transmit information to their students.

When I’m teaching classes these days, I take a lot of time to prepare very detailed class notes. The notes cover only enough background information to get you through the accompanying lab, and this process repeats for all the labs. I exist only to introduce the labs and answer the questions people will inevitably have. They come to me with all sorts of them: “Why does the circuit behave this way?” “Why do you need a ‘for’ loop when you could just use a ‘while’ loop in all cases?” “Why do you need to limit the boundaries of the servo?” These are things they probably wouldn’t have asked if I had just lectured them on how to do things.

On top of that, they don’t stop when they finish the lab. My last class, on how to use Processing, had participants hacking the very first lab. I wasn’t sure about the technical level of the participants, so I had them draw all the graphics in grayscale. The first thing they did was add RGB color to every shape they were drawing. I’m not saying it’s an ideal teaching format, but participants are usually so engaged that I have to step in and almost physically force everyone to move on to the next lab.
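To give a sense of how small that jump is in Processing (this is a toy sketch along the lines of the lab, not the actual class code), going from grayscale to color is just a one-argument-to-three-argument change:

    // Grayscale version from the lab notes: fill() with a single
    // brightness value between 0 (black) and 255 (white).
    void setup() {
      size(400, 400);
    }

    void draw() {
      background(255);
      fill(128);               // gray square
      rect(50, 50, 100, 100);
      // The participants' hack: pass three values to fill() and
      // you get red/green/blue color instead of gray.
      fill(255, 0, 0);         // red square
      rect(250, 50, 100, 100);
      fill(0, 0, 255);         // blue circle
      ellipse(200, 275, 100, 100);
    }

A small change, but they figured it out on their own from the Processing reference, which is exactly the point.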

I think what a lot of people from my generation and older don’t realize is that the internet encourages participation. You participate when you leave a comment on a blog post or reply on a discussion forum. You participate in conversations on Twitter, Facebook (ugh), Second Life (2x ugh), etc. The internet also lets you choose and control the information you receive. You can do a quick skim and bookmark the page for later reference, or read it thoroughly and, when you’re finished, add a quick comment that the author’s full of shit (believe me, it happens). Basically, I adjusted my class to give participants full control of the information (via the lab notes), and they test their understanding by performing the lab (participation via hands-on application). Any gaps are then filled by answering their questions.

You can actually see a lot of this at work in the explosive growth of hackerspaces, makers, and the DIY community. People want to make things and customize them: not just electronics, but art, writing, bicycles, industrial design, etc. It’s probably difficult to understand if you’re not involved, but if you are, all of this just seems so obvious. Technology is not dumbing down the population; it’s increasing people’s hunger for information. I can’t remember any of the engineers I used to work with being as excited about electronics as some of the people I talk to on the Adafruit chat or at the Tokyo Hackerspace.

That’s why I get irritated when I see professors at a top US technical school complaining that the kids are using technology and it’s affecting their learning skills. Perhaps it’s not the learning skills of the students that have changed. I’m pretty sure it’s the teaching methods that need to change, to allow students to control their own information and to encourage individual participation. Interestingly enough, if you think of it in those terms, it’s a lot like open source :)
