Sunday, May 18, 2008

"Lecturing" vs. Teaching

"The Lectures Are Recorded, So Why Go to Class? (http://chronicle.com/weekly/v54/i36/36a00103.htm) More colleges are taping lectures so students can watch online, but not all professors are sure the results are good for their classrooms." This is the head of an article in the Chronicle of Higher Education, online edition 16 May 08. The main text of the article is not online, but there's a stub that confirms the headline: many universities making lecture recordings available to students, and as a result students are not bothering to come to class.

The very fact that this problem can arise highlights the malaise we see in some universities and colleges today. The teachers don't teach, they lecture. If they TAUGHT their classes, that is, if they interacted with the students, if they discussed topics WITH students rather than lecturing AT students, then this problem would never arise. When a classroom becomes the equivalent of a book, with the students partaking of the author's expertise but never able to question or ask for clarification, why would students come to the classroom if they have an alternative? Moreover, this leads to an environment of regurgitation of material rather than one of thinking, of education. Education is about understanding, and understanding is enhanced when the teacher and the students work together to understand. Lectures are more appropriate for training than for education (see http://teachgamedesign.blogspot.com/2008/04/training-vs-education.html).

In the past, people came to the classroom because the material wasn't otherwise available. (It has to be said that even then, if the teacher had written the textbook and then lectured on the same material as the book, students wouldn't bother to come to class.) Now that the material is available elsewhere, why would they inconvenience themselves to come listen to a "lecture"?

If someone asks if I'm going to "lecture" about something, I say, "I don't lecture to students, I talk with them". I am a teacher, not a lecturer. Teaching is more akin to coaching a sports team than to writing a book; lecturing is much more like writing a book.

One objection that might be raised to my point of view is, "the classes are so large, no one could actually TEACH them." That's not true; I've read of "lecturers" who went out into the audience, who conveyed information via questions and answers. No, they can't learn all the students' names, they can't get to know the students so that they can coach them, but they can make the learning interactive.

But more important, there shouldn't BE huge classes, because huge classes necessarily mean poor education. I had a class of "only" 48 recently, and I managed to make it interactive, but it was very difficult; when the students really got into a topic, discussions were going on all over the room and chaos reigned.

Apply this to a hands-on topic like game development, and the objections to this lecture-style "teaching" are even stronger. Most game development students are of the millennial generation, and millennials thrive in group settings, in sharing, and in interaction. Add to this that the students are game players, very much used to interaction, not passive absorption of material. Finally, millennials hope that their authority figures will be their friends, or at least friendly--like their parents--which the university "lecturer" cannot be because he or she cannot get to know the student.

Yes, I learned the name of every person in that class of 48. And in my smaller (24) classes, I spent at least 10 minutes individually with each person, at one point, to get to know more about them. I'd like to have spent more time but my "office" was a tiny cubicle amongst many more, not conducive to private discussions with students!

Go back to the medieval roots of the university system, and you won't find huge classes, you'll find teachers with small groups. Fundamentally, huge classes are a symptom of the ultimate disease in education, "money". If someone can cope with (I won't say teach) 48 people instead of 24, the school spends half as much on pay for faculty. If a "lecturer" can talk at 200 people, and let poorly-paid grad students take care of the labs, the school saves lots of money.

This is why a good community college--they aren't all good, and that includes the one where I was given 48 students--is likely to provide a much better education than a large university for the first two years. (Perhaps this is confirmed by the research showing that students who transfer from community colleges in my state to four-year colleges do better there than the students who start their education at those colleges.) Community colleges hire teachers, not lecturers, and classes are usually small, not large.

I have much more to say about this, but I'll stop for now.

Thursday, May 1, 2008

Skepticism: How do we know things?

Skepticism seems to be something many young students haven't quite "got" yet. It's a "firm grasp of reality", especially important amid the hype of the game industry. And it's an ability to "think critically", to analyze what you hear and decide whether it is likely to be true or not.

I was educated as an historian. One of the things you learn is skepticism about information sources, though some historians seem to lose that skepticism at times.

Many of the stories "everyone knows" are in fact apocryphal; they never happened. Something sounds so good that it gets attributed to an historical figure who in fact had nothing to do with it. This is true even for living people. Half of what Yogi Berra is supposed to have said, he didn't say. (One of his "Yogiisms" is, "I never said half the things I really said!")

Furthermore, people write and say things that aren't true, sometimes by accident, sometimes deliberately. I am personally skeptical of memoirs that discuss in detail something that happened 20 or 30 years before, especially if it was during the writer's childhood. If the writer didn't keep a diary at the time, I think to myself, how can he or she remember all these details? *I* don't remember that kind of detail, though my memory generally is excellent. So how much are they getting wrong, or simply making up?

Lawyers know and study how unreliable witnesses can be, and in what ways. [books about it]

The astronaut Frank Borman tells a story about something that happened along the way on a trip to the moon. Someone listened to the actual recording of the incident and found that it was drastically different from the story. He played it for Borman--I heard this on NPR radio, by the way--and Borman said, well, yeah, I guess so, but I'm still going to tell the story, it's so good. Imagine how many books about the moon flights, about Borman, about space flight in general, will include this entirely wrong story as "truth".

The Fayetteville, NC newspaper had a nice report about a meeting at Methodist University (then college) where a well-known writer had given a talk. The problem is, it never happened. Weather was so bad that the meeting was called off. But the report had been "pre-written", and published as is. And a historian reading that paper 50 years from now probably will take it as fact, as truth.

Now I've said this, about the newspaper, but I'm repeating what my wife, who was then chief librarian at Methodist, told me. I wasn't actually at Methodist to see that there was no meeting, nor did I read the newspaper report as far as I can recall. So I could be wrong, eh?

Initial reports on September 11, 2001 (9/11) stated that the State Department had been bombed. Never happened. But this was in the heat of the event. The next day (IIRC), all the major broadcast TV networks reported that some people had been rescued from the rubble, found in an SUV. I checked every network, and all reported this as truth; yet the next day, all admitted that no such thing had happened.

I rarely listen to the news right after some shocking event has happened, because the report will likely have "substantial inaccuracies" in it.

But for months, even more than a year perhaps, after the destruction of the World Trade Center, the tally of dead was about 7,000. Then that was reduced to about 3,000, a number which has held up. Wikipedia now says 2,974 died as an immediate result of the attacks with another 24 missing and presumed dead. (Of course, not everything in Wikipedia is correct.) With all those resources, with the importance of who had died and who hadn't (insurance claims, government and charitable perks for the relatives of those who died), the number was drastically wrong.

Be skeptical. Try to find out where stories come from. Yes, the BBC may report that one Chinese man killed another because the latter sold his "magic sword" from an MMORPG, and it MAY be true, but then again, how reliable is news from China as reported by the BBC? Yes, a South Korean may have died from failure to take care of bodily functions while playing online games--or may not. One student told me he knew someone who had to go to a hospital to be treated for malnourishment because he played video games day in and day out--and maybe that was true, or maybe not.

Just because you heard it, just because someone told you about it, just because it was in the news, doesn't mean it's true. "What everyone knows" isn't always true, though frequently it is.

If it sounds unbelievable, maybe you shouldn't believe it! "Take everything with a grain of salt". You can rely more on your personal experience than on anything else, but even THAT can be deceptive.

This is part of thinking critically, discussed in Wikipedia as:

"Critical thinking consists of mental processes of discernment, analysis and evaluation. It includes possible processes of reflecting upon a tangible or intangible item in order to form a solid judgment that reconciles scientific evidence with common sense. In contemporary usage "critical" has a certain negative connotation that does not apply in the present case. Though the term "analytical thinking" may seem to convey the idea more accurately, critical thinking clearly involves synthesis, evaluation, and reconstruction of thinking, in addition to analysis.

Critical thinkers gather information from all senses, verbal and/or written expressions, reflection, observation, experience and reasoning. Critical thinking has its basis in intellectual criteria that go beyond subject-matter divisions and which include: clarity, credibility, accuracy, precision, relevance, depth, breadth, logic, significance and fairness."

In some ways game design is an exercise in critical thinking! Especially as you decide how to modify a game based on playtesting input.
"Always do right--this will gratify some and astonish the rest."Mark Twain
"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." Antoine de Saint-Exup'ery

"Not everything that can be counted counts, and not everything that counts can be counted." Albert Einstein

"Make everything as simple as possible, but not simpler." Albert Einstein

"The worst form of inequality is to try to make unequal things equal." -- Aristotle