Friday, June 13, 2008

What is "game development"?

Recently I talked with some folks from a college involved with game development.

One of them said, "(person X) says he doesn't know anything about game development." Person X is a major official of a group that's all about game development! Then later: "(person Y) doesn't know games" (or maybe he said "game development"). Person Y is heavily involved in game development/creation education, and ought to know something about game development, surely. But person Y comes from the art side of things.

On thinking about it, I recognized that the speakers equated "games" and "game development", and further equated "game development" with computer programming.

Fairly obviously, you can know a lot about games, in a variety of ways, and not know much about game development. We get students all the time who think they'll be good at creating games simply because they like to play games a lot: NOT SO, bucko. Yet you can also be an important part of a team that creates electronic games, and know next to nothing about computer programming.

This long introduction leads to the key question: is "game development" now the equivalent of computer programming for games, or is it something much broader? When creation of an electronic game was a one-person endeavor, back in the 70s and 80s, every game developer had to be a programmer. But this "one hero per game" style practically ended around 1990--I've had students who were born later than that--as most games became too big to be done by one person.

Nowadays, many more artists than programmers work on electronic games. And there are teams of game designers, level designers, sound people, narrative writers, and so forth working on big games. Programming is the minority endeavor.

More important, in almost all cases programming nowadays can only screw up a game, not make it outstanding. What makes an electronic game outstanding is, first, the design, the gameplay; second, the look and feel of the game, which is a combination of design and art. Good programming can certainly contribute, but mostly, programming is there to implement the vision of the designers and artists, and is a fairly mechanical contribution to the game. But if it's poorly done, it can ruin the game. Patches typically fix programming problems; they can rarely fix fundamental design problems.

Now mind you, unlike a great many of the people who teach programming (I do not), I actually worked full time as a programmer for a while before getting deep into networking and user support. Nonetheless, this is my take on programming, especially today:

"Programming is donkey work".

What I mean by "donkey work" is that programming is mechanical. Many of the steps programmers used to have to do manually are now done by software tools. Ideally, we'd like to be able to tell a computer-based tool what kind of game we want, provide it with art, and have it write the program. Game engines go partway in this direction, simplifying programming by (in effect) doing some of it themselves.

Constantly, people are trying to write tools that will make programmers less and less necessary, less and less important.

Yes, yes, we know there is creativity in programming. But once we get past the highly entrepreneurial stage (which we have), too much creativity in programming causes problems. In games we want programming to be reliable, solid, fast--mechanical, not creative.


So what is "game development"? Not programming, folks, it's design and art, with programming coming in near the rear. Programming is a necessary evil, not the heart of an electronic game. (And if we stray into the world of non-electronic games, we have design and we have art, but we have no programming at all.)

Now perhaps we could agree that "game development" means programming, and change our "game development" curricula names to "game creation" (those that include artists and designers, at any rate). Or we can recognize that "game development" means all aspects of game creation, not just programming.

Unfortunately, "game development" programs in colleges and universities are often started by programmers, who have no interest in art and little interest in design (and sometimes, little interest in games!). In many less-well-known schools "computer programming" is going away as a topic of interest for the millennial generation, or has already been dropped; "game development" is grabbed as a life-saver for those who want to teach programming but lack students. These "game development" curricula are about fifteen years out of date when they start. My own experience of this is that when programmers start "game development" programs, those programs are a disaster for artists and designers. "Game development" should be in the hands of gamers who are teachers, not of programmers.

If you're a student planning to pursue game creation as a career, find out whether the school you have in mind runs the programming version of game development, or the broader version that accommodates non-programmers.

Sunday, May 18, 2008

"Lecturing" vs. Teaching

"The Lectures Are Recorded, So Why Go to Class? (http://chronicle.com/weekly/v54/i36/36a00103.htm) More colleges are taping lectures so students can watch online, but not all professors are sure the results are good for their classrooms." This is the head of an article in the Chronicle of Higher Education, online edition 16 May 08. The main text of the article is not online, but there's a stub that confirms the headline: many universities making lecture recordings available to students, and as a result students are not bothering to come to class.

The very fact that this problem can arise highlights the malaise we see in some universities and colleges today. The teachers don't teach, they lecture. If they TAUGHT their classes, that is, if they interacted with the students, if they discussed topics WITH students rather than lecturing AT them, then this problem would never arise. When a classroom becomes the equivalent of a book, with the students partaking of the author's expertise but never able to question or ask for clarification, why would students come to the classroom if they have an alternative? Moreover, this leads to an environment of regurgitation of material rather than of thinking, of education. Education is about understanding; understanding is enhanced when the teacher and the students work together to understand. Lectures are more appropriate for training than for education (see http://teachgamedesign.blogspot.com/2008/04/training-vs-education.html).

In the past people came to the classroom because the material wasn't otherwise available. (It has to be said, even then, if the teacher had written the textbook and then lectured the same material as the book, students wouldn't bother to come to class.) Now that the material is available elsewhere, why would they inconveniently come to listen to a "lecture"?

If someone asks if I'm going to "lecture" about something, I say, "I don't lecture to students, I talk with them". I am a teacher, not a lecturer. Teaching is more akin to coaching a sports team than to writing a book; lecturing is much more like writing a book.

One objection that might be raised to my point of view is, "the classes are so large, no one could actually TEACH them." That's not true; I've read of "lecturers" who went out into the audience, who conveyed information via questions and answers. No, they can't learn all the students' names, they can't get to know the students so that they can coach them, but they can make the learning interactive.

But more important, there shouldn't BE huge classes, as huge classes are necessarily poor education. I had a class of "only" 48 recently, and I managed to make it interactive, but it was very difficult; when the students really got into a topic, discussions were going on all over the room and chaos reigned.

Apply this to a hands-on topic like game development, and the objections to this lecture-style "teaching" are even stronger. Most game development students are of the millennial generation, and millennials thrive in group settings, in sharing, and in interaction. Add to this that the students are game players, very much used to interaction, not passive absorption of material. Finally, millennials hope that their authority figures will be their friends, or at least friendly--like their parents--which the university "lecturer" cannot be, because he or she cannot get to know the students.

Yes, I learned the name of every person in that class of 48. And in my smaller (24) classes, I spent at least 10 minutes individually with each person, at one point, to get to know more about them. I'd like to have spent more time but my "office" was a tiny cubicle amongst many more, not conducive to private discussions with students!

Go back to the medieval roots of the university system, and you won't find huge classes, you'll find teachers with small groups. Fundamentally, huge classes are a symptom of the ultimate disease in education, "money". If someone can cope with (I won't say teach) 48 people instead of 24, the school spends half as much on pay for faculty. If a "lecturer" can talk at 200 people, and let poorly-paid grad students take care of the labs, the school saves lots of money.

This is why a good community college--they aren't all good, and that includes the one where I was given 48 students--is likely to provide a much better education than a large university, for the first two years. (Perhaps this is confirmed by the research showing that students who transfer from community colleges in my state to colleges do better there than the students who start their education at those colleges.) Community colleges hire teachers, not lecturers, and classes are usually small, not large.

I have much more to say about this, but I'll stop for now.

Thursday, May 1, 2008

Skepticism: How do we know things?

Skepticism seems to be something many young students haven't quite "got" yet. It's part of a "firm grasp of reality", especially important amid the hype of the game industry. And it's an ability to "think critically", to analyze what you hear and decide whether it is likely to be true or not.


I was educated as an historian. One of the things you learn is skepticism about information sources, though some historians seem to lose that skepticism at times.

Many of the stories "everyone knows" are in fact apocryphal; they never happened. Something sounds so good that it gets attributed to an historical figure who in fact had nothing to do with it. This is true even for living people. Half of what Yogi Berra is supposed to have said, he didn't say. (One of his "Yogiisms" is, "I never said half the things I really said!")

Furthermore, people write and say things that aren't true, sometimes by accident, sometimes deliberately. I am personally skeptical of memoirs that discuss in detail something that happened 20 or 30 years before, especially if it was during the writer's childhood. If the writer didn't keep a diary at the time, I think to myself, how can he or she remember all these details? *I* don't remember that kind of detail, though my memory generally is excellent. So how much are they getting wrong, or simply making up?

Lawyers know and study how unreliable witnesses can be, and in what ways. [books about it]

The astronaut Frank Borman tells a story about something that happened along the way on a trip to the moon. Someone listened to the actual recording of the incident, and found that it was drastically different from the story. He played it for Borman--I heard this on NPR, by the way--and Borman said, well, yeah, I guess so, but I'm still going to tell the story, it's so good. Imagine how many books about the moon flights, about Borman, about space flight in general, will include this entirely wrong story as "truth".

The Fayetteville, NC newspaper had a nice report about a meeting at Methodist University (then college) where a well-known writer had given a talk. The problem is, it never happened. Weather was so bad that the meeting was called off. But the report had been "pre-written", and published as is. And a historian reading that paper 50 years from now probably will take it as fact, as truth.

Now I've said this, about the newspaper, but I'm repeating what my wife, who was then chief librarian at Methodist, told me. I wasn't actually at Methodist to see that there was no meeting, nor did I read the newspaper report as far as I can recall. So I could be wrong, eh?

Initial reports on September 11, 2001 (9/11) stated that the State Department had been bombed. Never happened. But this was in the heat of the event. The next day (IIRC), all the major broadcast TV networks reported that some people had been rescued from the rubble, found in an SUV. I checked every network, and all reported this as truth; yet the next day, all admitted that no such thing had happened.

I rarely listen to the news right after some shocking event has happened, because the report will likely have "substantial inaccuracies" in it.

But for months, even more than a year perhaps, after the destruction of the World Trade Center, the tally of dead was about 7,000. Then that was reduced to about 3,000, a number which has held up. Wikipedia now says 2,974 died as an immediate result of the attacks with another 24 missing and presumed dead. (Of course, not everything in Wikipedia is correct.) With all those resources, with the importance of who had died and who hadn't (insurance claims, government and charitable perks for the relatives of those who died), the number was drastically wrong.

Be skeptical. Try to find out where stories come from. Yes, the BBC may report that one Chinese man killed another because the latter sold his "magic sword" from an MMORPG, and it MAY be true, but then again, how reliable is news from China as reported by the BBC? Yes, a South Korean may have died from failure to take care of bodily functions while playing online games--or may not. One student told me he knew someone who had to go to a hospital to be treated for malnourishment because he played video games day in and day out--and maybe that was true, or maybe not.

Just because you heard it, just because someone told you about it, just because it was in the news, doesn't mean it's true. "What everyone knows" isn't always true, though frequently it is.

If it sounds unbelievable, maybe you shouldn't believe it! "Take everything with a grain of salt". You can rely more on your personal experience than on anything else, but even THAT can be deceptive.

This is part of critical thinking, which Wikipedia describes as:

"Critical thinking consists of mental processes of discernment, analysis and evaluation. It includes possible processes of reflecting upon a tangible or intangible item in order to form a solid judgment that reconciles scientific evidence with common sense. In contemporary usage "critical" has a certain negative connotation that does not apply in the present case. Though the term "analytical thinking" may seem to convey the idea more accurately, critical thinking clearly involves synthesis, evaluation, and reconstruction of thinking, in addition to analysis.

Critical thinkers gather information from all senses, verbal and/or written expressions, reflection, observation, experience and reasoning. Critical thinking has its basis in intellectual criteria that go beyond subject-matter divisions and which include: clarity, credibility, accuracy, precision, relevance, depth, breadth, logic, significance and fairness."

In some ways game design is an exercise in critical thinking! Especially as you decide how to modify a game based on playtesting input.

Sunday, April 20, 2008

Grading student game projects

How do you evaluate and grade student-designed games? It is hard, no doubt, especially as there’s rarely enough time to play all the games enough to properly test them. Still, here are a few pointers. This is aimed primarily at non-electronic games, which are much better tools for teaching game design than electronic games, because a much greater percentage of a student's effort goes into design than prototype production, and they take a lot less time to produce a playable prototype.

Any “review”, whether of a book, a movie, a play, music, software, or a game, must answer three questions: what was the creator trying to do, how well did he do it, and was it worth doing? These questions can help guide a teacher grading a student game project, just as they help a reviewer evaluate a commercial game. However, I am not going to discuss these questions, except insofar as what follows indicates how well the student(s) did it.

If you don't play games, all you can do is try to judge effort and professionalism. If you do play games, but are not accustomed to evaluating games without playing them (perhaps by talking with people who have played them), you'll have a problem, because you often won't have enough time to play the game enough to really find out what it's about. (This is why game reviewers must spend a lot of time playing a game, to avoid being misled by first impressions.) Some games just don't sound (or look) like much until you play them. Some sound or look really good, but fail in actual play.

Fortunately, if you’re very familiar with games and have some design experience, you can judge the important things fairly easily.

Just as the three most important factors in real estate are location, location, and location, the three most important factors in a game are gameplay, gameplay, and gameplay. There are several aspects to gameplay, which I'll discuss in the next paragraphs.

The most important question about any game is, what does the player DO? What are the challenges the player(s) face? What actions can they take to overcome those challenges? This is the heart of most games. You can often see quickly, when evaluating a student game, that the player just doesn’t have much to do, or that what he is doing is quite repetitive without compensating factors.

Related to this is player interaction. What can players do to affect each other? There are certainly good games (often "race" games) where a player can do little to affect another player, but most good games have a significant-to-high level of player interaction. Or if it's a solo game, as are many electronic games, interaction between the player and the game becomes the target.

Another question related to what the player does is, is the game replayable many times without becoming "just the same" over and over? Perhaps this is less important with electronic games, as players EXPECT such games to become “the same”, and they also don’t mind sameness as much as the non-electronic gamers do. (So many AAA list electronic games are quite derivative of the gameplay of so many predecessors.) Nonetheless, the better a game is, the more replayable it is likely to be.

Some people would add one more question: is the game fair? Do the rewards seem to match the effort?

So much for gameplay design. What about what we might call the professional aspects of the game? Care of construction (NOT looks--construction of the “rules” or electronic equivalent, game mechanics) is important. Sloppiness leads to imprecision which leads to confusion.

Even more important, how much was the game playtested? Are there records? These need not be elaborate: I tend to list the date, who played, and anything unusual that occurred, along with changes I decided to make (or changes that didn't work, so I changed back). That can be just a couple of lines of documentation per play.
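For instance, a record might read (this one is made up): "14 Mar 08: three players, one new. Midgame dragged; cut the event deck from 40 to 30 cards. New player misread the movement rule, so I rewrote it. The scoring change from last session worked; keeping it."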

With very few exceptions, a game with a good basic gameplay design won’t actually play well unless it has been thoroughly playtested to work out the little (or big) obstacles to good gameplay.

While the instructor may not be able to play the game enough to know it well, he can suppose that a poorly playtested game won’t be worth much, and that one that is playtested a lot is more likely to be a decent game. “Playtesting is sovereign”, whether the game is digital or not.

I usually ask students to give me the original playable prototype, and the “final” version. There should be very significant changes in the game, if it has been playtested much. You might even ask students to list what significant changes were made.

Here are some specific comments about what NOT to use as criteria:

"Fun" is not a criterion. We can't generally agree what fun is, and your idea of fun is different from mine. A chess master has a different idea of “fun” than your mother has! Some people like party games, some like silly games, some like perfect information, some like planning ahead (hence tend to like perfect information), some like much that is hidden, some like reaction to circumstances (hence tend to prefer hidden information), etc.

Likely hundreds of thousands of people have played my game Britannia, yet even *I* wouldn't call it "fun". It may be interesting, fascinating, and lots of other things, even educational, but many players would not call it fun.

Story is not a criterion. This is especially important because so many students don't understand this, and think story or something other than gameplay is what's important in a game. People play games; they listen to or watch stories. Yes, you can make the story interactive to an extent, but a great many game players really don't care about story. It is absolutely necessary to pound into students' heads that story is not important in the design of most games.


What about the marketing palaver--the game concept and treatment and so on? These are not even written for non-digital games; only when the game is "finished" and the designer is trying to persuade a publisher to look at it (and then only sometimes) is a description of the game written that is something like the game concept.

These documents in the electronic game world are marketing documents that have NOTHING to do with the quality of the game, NOTHING. At most, they represent a simple plan for what the game will be.

Consequently, does it even make sense for a teacher to require students to produce these documents? I would say, it makes sense only insofar as the document might help the instructor evaluate the game, or help the students create the game. In that context, the two to three page game concept is useful, but something longer such as a game treatment may not be.

At the least you might ask the student(s) to briefly characterize the “essence” of their game–and let them decide what “essence” means.

Looks of the Game–not a criterion. This counts for virtually nothing; in fact, for non-electronic games I definitely do not want students to spend a lot of time on looks. As long as the physical components of the game are clear, not confusing, that's what counts. (There's a rule of thumb in the boardgame world, that the better a prototype looks, the less likely it is to be a good game, because novice designers spend far too much effort on the looks of the prototype.) I want students to understand that gameplay is what counts, not graphics, even though in the AAA list electronic world graphics become quite important simply because of the youthful audience.

I heard of a “game design teacher” who severely downgraded a student’s non-electronic game project because the box he supplied wasn’t large enough for the game. “The BOX?” That’s completely irrelevant to design; most experienced designers don’t make a box for their prototypes (I *never* have). This is something only novices who don’t understand what they’re doing think is important.

For electronic games, looks only matter near the end of development. And students will rarely have the time to do the playtesting and incremental, iterative modification necessary to “complete” a game and get to the true end of development.



So what else do you evaluate? Appropriateness for the audience. Is there an appropriate mixture for the audience and game type? If it's a party game (whether "Apples to Apples" or a Wii party game), is it relatively easy, and does it promote interaction amongst the players? Here we might actually ask if it's "fun" in a party sense.


I am not a person who lists exact grade points and values, because I don't think you can judge these things that precisely and consistently, nor am I confident that my "rubric" would take everything into account. But I can judge an "A", or a "B", or a "C", or something in between, based on these criteria, without assigning specific percentage numbers.

It's very difficult for anyone to "grade" these games without playing them several times, for which there is no time. My main criterion, aside from what I can see about the gameplay, is whether the students playtested the games and benefitted from that.

For groups, I also give them a peer evaluation sheet to fill out. The idea is that I may find out which people in the group actually contributed most, or least. It has its flaws, but is better than nothing.

Saturday, April 5, 2008

Training vs. Education

I've had some thoughts from recent experience about how important the distinction is between training and education in community colleges (and in any educational institution).

Everyone seems to have his or her own definition of "training" and "education". Here are mine.

In general, when you train someone, you tell them a specific way to do (or not do) something. In some cases it can be strictly rote learning, as in how to assemble or disassemble a weapon (there's only one way to do it, by the numbers). In any case, you're not trying to help people make judgments about uncertain situations, you're telling them, "if A, then B; if not A, then C." Many corporate training sessions are of this sort.

In education, you explain to people why something works the way it does or is the way it is, so that they can understand its nature, what's going on. If they lose their way, they can figure out what they need to do to get back to their objective. They can deal with uncertain situations, where there's no clear answer.

In general, the training recipient requires good memory and good organization more than good thinking processes; the education recipient requires application of intelligence, and sometimes critical thinking.

To me, this is the difference between having a set of written directions to get somewhere, and having a map. If you go wrong with the written directions, you may be able to backtrack, but if you get off the path you may be completely lost. If you have a map, you can figure out where you are and get back to where you need to be even if you've strayed far from the proper path. The first is analogous to training, the second to education.

In K12, we now have many schools that are training institutions. The teachers know what is required in the end-of-course test (EOC), and they know that their job security depends largely on how their students do on those tests. So they drill into their students what they need to know to pass those (usually multiple-choice) tests, and that's all. The students, too, know this is the score; they know they can do next to nothing during the semester, as long as they pass the EOC. Even the smart ones tend to do little, then cram from the book (which, of course, is supposed to contain everything they need to know to pass the EOC), then forget it after the test. (Yes, there are many exceptions: this is the trend, not in every school or every class.)

I've seen the results of this time and again in college. Students expect to be told exactly what's on a test, and what they need to regurgitate, and are dismayed when I require them to actually think. They don't have any idea how to think, because they haven't been required to. In classes I try to impose the "educated" attitude from the beginning, but it's hard for kids to adjust.

Unfortunately, in colleges of all types we have teachers who think their job is to "convey the material". I had one teacher say to me, when I was discussing a difficult class, "but you covered the material, didn't you?" That's a dumb question, I'm afraid. That's not what education is about, but it is what training is about. A corollary of the "training mentality" is that if you present the material and the students have the chance to absorb it, and some don't, then oh, well, that's the way it goes. In education, you're trying to find ways to convey what you mean to each class (and each class is different). You're not only conveying information, you're conveying an attitude, a way of doing things. If you can't cover "all the material" because a particular class is having problems, oh, well, your job is to choose (to judge) what's best for the class and do it, not necessarily "cover all the material".

This is something like sports team coaching, but we don't have as many hours with the students, and we have a lot more students than, say, Coach K at Duke has basketball players (and he has three assistants). Many "trainer" types don't even know the names of their students, let alone understand them as individuals, and don't think that's a problem! How can you judge what the students need, when you know so little about them that you don't know their names?

The problem is, many "teachers" aren't interested in this more complex kind of thinking and understanding of what is needed. It requires more effort, more thought. And unfortunately, school accreditation people also aren't interested, because they can't measure it.

See my post about "educated" people (http://teachgamedesign.blogspot.com/2007/11/game-industry-wants-educated-people.html). I'm not talking about degrees here, I'm talking about a way of thinking and approaching life.

Unfortunately, the US "education" system is becoming more and more training oriented, and less education oriented, every year. Perhaps one indication of this is the very strong tendency of accreditation organizations to emphasize (in teacher qualification) degrees and classes taken rather than actual ability to do something. We are more and more finding "teachers" who have never done what they're teaching in the real world, and it shows. But if all you're doing is conveying material, telling students what to regurgitate on multiple choice tests, how much will the experience of a person who's actually done the work professionally matter? It's a system geared to turn out people who cannot do complex work in the real world--people who have been trained, not educated.

Thursday, March 20, 2008

Coping with your first programming class

I'm posting this here for the benefit of some former students:

If you'll bear with me and read all of the following, I think it may help you a lot in the programming class. The summary is: use pseudo-code (English) to write out what you want a program to do, then turn it into whatever programming language you're using, and you'll be much more successful.


Beginners in programming, especially those who don't find programming an interesting and attractive pastime, have many struggles.

There are always two problems in programming: first, what do you need to do to solve whatever problem you're working on (what is the logic of it)? Second, how do you convert that solution into something the programming language can execute (what is the syntax of it)?

It is possible to have a computer program that compiles and runs perfectly (correct syntax), but fails to correctly solve your problem, because the logic is incorrect. That is, you can get the second part right, but if the first part isn't also right, the whole fails.

Hence, initial student programs should be ones where either you don't worry about coding--you just solve the problem--or ones where you don't worry about how to solve the problem--you just code it.

For example, the (logical) solution to the problem of saying "Hello World" on the screen is trivial. In English it's:

Blank the screen (clean off whatever was there)
Show "hello world" on the screen
Make sure the screen doesn't go away because the program ends (or we won't see the message)

You can take that English description and code the program in dozens of different languages. Each one will be more or less different, but all will do the same thing. That's why "hello world" is usually the first program you do in a language: it is entirely a problem of language syntax, not of logic.

Example from one of the simplest programming languages, Windows command files (batch files):

CLS
echo Hello World
pause

Now, comparing the pseudo-code with the actual code, you can figure out what "CLS" means even if you don't know that it stands for "Clear Screen". You can figure out that "echo" means "Show (by default, on the screen) whatever comes after echo". You can guess that "pause" means "wait for the user to hit a key".
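To see how the same pseudo-code maps onto another language, here's a rough equivalent in Python (my sketch, not from any particular class; how you blank the screen varies by system, so this shows one common way):

import os

os.system("cls" if os.name == "nt" else "clear")   # blank the screen
print("Hello World")                               # show the message on the screen
input("Press Enter to end...")                     # make sure the screen doesn't go away

Same logic, different syntax: three lines of pseudo-code, three lines of Python.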

But if your problem is, "take a group of numbers and decide which ones are prime numbers", then you have to figure out how you can possibly do this at all, before you try to code it. If at the same time that you're trying to solve the logic problem, you try to figure out the program syntax, it becomes MUCH more difficult.
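To illustrate, the English logic might be: a number is prime if it's at least 2 and no smaller number from 2 on up divides it evenly. Only after settling that do you worry about syntax. A minimal sketch in Python (one of many possible ways, and deliberately not a clever one):

def is_prime(n):
    # The logic from the English: prime means at least 2, and
    # no number from 2 up to n-1 divides n evenly
    if n < 2:
        return False
    for divisor in range(2, n):
        if n % divisor == 0:
            return False
    return True

numbers = [2, 3, 4, 5, 9, 11, 15, 17]
print([n for n in numbers if is_prime(n)])   # prints [2, 3, 5, 11, 17]

Notice that the hard part--deciding what "prime" means operationally--was solved in English first; the Python is nearly a transcription of it.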

Object-oriented programming can throw kinks in the process, but most of the code in most programs is procedural, telling the computer to do this, do that, do the other thing. Any set of instructions, whether game rules, or product manuals, or software how-to's, amounts to a set of sequences of instructions, loops (repetitions until something particular happens or doesn't happen), and branches (decision points where the instructions go one way or another, sometimes from a choice of more than two). Since you can write English to reflect these three structures, you can write English pseudo-code for most any procedural problem. The example above is strictly a sequence. We could add a couple lines where the program would ask the user if he wanted to see "hello world" again or not:

Clear off the screen
Show "hello world" on the screen
Ask user if he wants to see more "hello worlds" on the screen
If user says yes, loop back to "Show", or else continue on to end
Make sure the screen doesn't go away because the program ends (or we won't see the message)

Now we have a branch (the IF) and the potential for a loop (going back to the "show"). Note the loop doesn't go all the way back to the start, or else the screen would be cleared again.

In fact this is hard to do in batch files (which are very, very simple): traditionally there was no way to get user input within the program, unless you used other programs for that purpose (modern Windows batch files do have "set /p" for this). But any "real" programming language can do it.
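For example, here's how the loop-and-branch pseudo-code above might look in Python (again my sketch; the screen-clearing line is one common way to do it):

import os

os.system("cls" if os.name == "nt" else "clear")      # clear off the screen
answer = "y"
while answer == "y":                                  # loop back to the "show", not the clear
    print("hello world")                              # show "hello world" on the screen
    answer = input("See more hello worlds? (y/n) ")   # ask the user
input("Press Enter to end...")                        # make sure the screen doesn't go away

The IF from the pseudo-code has become the while test, and the loop goes back only as far as the print, so the screen isn't cleared again.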



Here's an analogy. If you "kinda" know a second language, it is much easier to listen to (or read) that language and figure out what it means, than it is to take a meaning and turn it into that spoken (or written) language. In computer programming terms, in the first case, you're only worrying about the syntax; in the second you're worrying about both the logic and the syntax.

So, if you know some Spanish and you're trying to talk with a Spaniard who knows some English, you should speak English, and he should speak Spanish. You'll find it much easier to understand the Spanish you hear than to make it up and speak it, because you're not worrying about the creation, about solving the problem of what to say; he'll do much better sorting out your English and not worrying about how he's saying what he's saying, because he'll be using his native language.

By using pseudo-code you separate your "big mess" into two problems that you solve one after the other. It's much easier.

So, you've all heard (I hope) that the longer you take to plan a program, the less time it takes to code it. Pseudo-code is a principal tool when you're writing simple, short programs. (Longer ones require much more elaborate planning, such as systems analysis.)


This is why I advocate not teaching any one language to "novices", instead concentrating on the logic needed to solve problems with computer languages.

Sunday, March 2, 2008

Why I'm not an electronic game designer

Why would I want to design electronic games? I'm better off as is.

  • The "AAA list" electronic games are really designed by committee. When I design a game, it is almost all MINE. (The rest is playtesters and publisher.)
  • For most of the age of video games, you had to work full time in the industry, yet the pay was and is poor. I'd rather help young people as a teacher, get paid at least as well, and have lots of time to design games.
  • The working hours are bad. "Crunch time" (unpaid overtime) is common, though designers are not involved in that quite as much as programmers and artists.
  • Fighting with the electronics obscures the purity of design. You worry about what the computer can do instead of what the players can do.
"Always do right--this will gratify some and astonish the rest."Mark Twain
"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." Antoine de Saint-Exup'ery

"Not everything that can be counted counts, and not everything that counts can be counted." Albert Einstein

"Make everything as simple as possible, but not simpler." Albert Einstein

"The worst form of inequality is to try to make unequal things equal." -- Aristotle