Thursday, December 1, 2011
Teaching game design
Game design is often taught by people who have no game design expertise, usually programmers who really want to be teaching programming but are using games to entice people into learning programming. This is a problem. I believe there’s also a basic contempt, amongst many non-gamers and especially many programmers, for the entire idea of game design, as in “oh, that’s kid’s stuff, anyone can do that” or “it’s just getting a few ideas” (which conveniently masks their lack of knowledge and experience, of course). Nothing can be done about the contemptuous people who are supposed to be teaching game design, but for those with a more open mind, I’ve tried to distill my experience as a game designer who is also a very experienced teacher (17,000 classroom hours).
First you have to decide: are you going to talk about game design and discuss it so that students have some idea what it is really about but no clue how to actually do it, or are you going to have them begin to learn how to do it? You can’t do both in one semester; it’s very hard to do both in two semesters. Moreover, you can’t try to teach them game production in the same classes. If you do, they’ll struggle with the game production and learn very little about game design.
There’s a big problem in conveying the difficulty of game design to students unless they experience it first hand (“experientially”). I’ll try to use an analogy to explain.
Most college and high school age people know intellectually that they can die, but emotionally they feel that they’re immortal, and sometimes behave that way as they take foolish risks such as driving while intoxicated. Similarly, they can be told that the ideas in their heads will not translate directly and accurately to what happens in a game (and even if they do, they often won’t be enjoyable to play); they may acknowledge this intellectually, but emotionally they don’t believe it. Instead they think that the game is going to work just as they conceive it and will be great fun (and further, that there will be more detail in it than they actually have in mind).
There is no substitute for them making games and (inevitably) seeing that the game not only does not work as they anticipated, it usually does not work well at all. To do that with beginning students you must use non-video (tabletop) games. If they try to learn how to create video games at the same time as they learn design, the struggle to create will be so great that they learn virtually nothing about design. This will happen even with tools such as Gamemaker that avoid programming per se. And even if they are one of the few who might succeed, it takes so much longer to create and repeatedly modify a video game (as opposed to a tabletop game) that they won’t have time, even in two semesters, to really understand how much happens after the playable prototype is created. Nor will they have the opportunity to understand that the most important part of game design happens after the initial prototype creation.
In other words, to understand game design students must “complete” games, not merely plan them or get to an (inevitably poor) working prototype.
Given that students today are impatient with theory and want to do "something practical", they’re likely to be more engaged if they start by designing games. As you then introduce the “theory” they’ll be able to associate some of it with their practice so far, which will help both understanding and retention.
Another analogy: just as it is usually necessary for someone learning to write novels to write and discard a million words as practice, it is usually necessary for someone learning to design games to design many games that are not publishable but are good practice.
You'll have plenty of chances to teach design skills from the practical experience. Don't end up like American basketball players: "It's like what ESPN analyst ... Jay Bilas says all the time: In Europe, they teach skills, in America they play games. Teenagers aren't coached here so much as they're babysat and herded from tournament to tournament." (Rick Bonnell) But "playing" without teaching at least produces good basketball players. Experience without insight is of limited use; insight without experience is almost entirely useless for actual game designers. You need to be the coach, but they need to make the games.
If the instructor has not actually designed games from start to finish, not just to the point of a playable prototype, then the instructor probably won’t understand what’s really happening either. Most people don't complete games. Instead they think that when the playable prototype stage has been reached, the job is about done, when in fact it's closer to the beginning than to the end.
I understand the notion, once posed to me by the head of the neurosurgery department at UCLA, that "I can teach anything I can understand". The question is, can you understand game design without actually having done it? Not well. Teachers don't need to be professional practitioners, but they must be practitioners.
Yes, someone who has not painted could teach a class about painting, but the result is not likely to be good. Someone who has never sculpted can teach a class about sculpture, but the result is not likely to be good. Someone who has not composed music can teach a composition class, but the result is not likely to be good even if they’ve played lots of music. (Likewise, playing a lot of games doesn’t make one a game design teacher any more than it makes one a game designer.) Games are more complex and less personal than paintings or sculptures or musical compositions; games are interactive and involve people other than the designer; games are more cerebral than these other arts and typically take much longer to complete. So the teachers who are not professional practitioners but are teaching the classes are much less likely to have gone through the entire process. Someone who has not actually designed games from start to finish can teach a class about game design, but the result is quite unlikely to be good.
A poor teacher who is a practitioner may not get results as good as a good teacher who is not a practitioner; but a good teacher who is a practitioner will get much better results than a good teacher who is not.
Example: several years ago one of the local community colleges used its Game Design I class mostly to teach students how to use Gamemaker; that is, it was really teaching game production. Then it used Game Design II to divide students into randomly assigned groups and have them make five Gamemaker games in 16 weeks. (Once again, game production was the main objective.) So each game barely reached the playable prototype stage, with hardly any design involved; it was mostly a production struggle, with the Gamemaker "programmer" having by far the most influence. Students got absolutely the wrong ideas about game design, as well as great frustration with working in groups on such limited time-frames.
When I say design and complete games, I don’t mean write game design documents. There are too many curriculums that teach students how to write game design documents but don’t let them learn how to design games, so what goes into the GDD is likely to be garbage. A game design document is just a plan, and when people who have no experience of actually doing something try to plan to do it, it can be a train wreck.
I have just read yet another syllabus for an introductory game design course where the students are not actually learning how to design games. Instead they're learning how to write brief pitch documents describing a plan for a game. (The documents are much too short to actually be a plan; they're a description of the most notable parts of the game for marketing purposes.) This may sound like designing games to you if you have never designed a game, but to me the students are only being asked to express an extended idea. And if you have not yet learned how worthless ideas are in game design, there are many articles you can read, such as "The Idea is Not the Game" (http://www.gamecareerguide.com/features/614/the_idea_is_not_the_.php) and "Why Your Game Idea Sucks" (http://www.escapistmagazine.com/articles/view/issues/issue_221/6582-Why-Your-Game-Idea-Sucks).
Furthermore, when beginners write extended descriptions of their ideas for video games they tend to focus on what they wish the game to be rather than on how the game is going to work. See "When You Start a Game Design Conceive a Game Not a Wish List" (http://gamasutra.com/blogs/LewisPulsipher/20111014/8668/When_you_start_a_game_design_conceive_a_game_not_a_wish_list.php).
Writing these brief documents doesn't even teach students about game structure, let alone about the process of game design. It teaches them nothing about the iterative and incremental process of taking an inevitably poor prototype and turning it into a good game. The way to learn about the process of game design is to design and complete games.
Example of how this begins to work: the first day of an intro game design class–I don’t discuss the syllabus until the second day–I talk with students about Monopoly, try to get them to understand why it is a poor game even though they may have fond thoughts about it (because they were doing something enjoyable with their families), and (if there’s time) put them into groups to start devising ways to improve the game. As soon as we can, I have them play their version to see how their ideas turn out in practice. (Of course, usually the new versions are a mess.) This gets the attention of the serious people, at least. It may be disappointing for the students who think they're going to play video games all the time for class. . .
(And a related note:)
I was a peer reviewer once for a textbook for an introductory database class, with a strong leaning toward Oracle. The book began with a lengthy discussion of normalization, the process of organizing a database's tables and fields so that data isn't duplicated unnecessarily. The book then talked about entity relationships and other details of creating database applications. Finally the book got to actually using a database to do something. I told the authors that typical college students would quickly switch off at the beginning of the book because it would seem irrelevant to them. I suggested having the students make and use databases first, so that when they got to the discussion of normalization and entity relationships they would have some experience to apply it to and could understand why it was important.
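(For readers who haven't encountered normalization: here is a minimal sketch, using Python's built-in sqlite3 module, of the duplication problem it addresses. The table and column names are invented purely for illustration. The flat table repeats the customer's details on every order; the normalized pair of tables stores each customer once, so a correction touches one row instead of many.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Denormalized: customer details are repeated on every order row.
conn.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY,
    customer_name TEXT,
    customer_email TEXT,
    item TEXT)""")

# Normalized: each customer is stored once and referenced by id.
conn.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name TEXT,
    email TEXT)""")
conn.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    item TEXT)""")

conn.execute("INSERT INTO customers VALUES (1, 'Ada', 'ada@example.com')")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 1, 'Monopoly'), (2, 1, 'Britannia')])

# Correcting the email touches one row, not every order.
conn.execute("UPDATE customers SET email = 'ada@new.example.com' WHERE customer_id = 1")
print(conn.execute("""SELECT o.order_id, c.name, c.email, o.item
                      FROM orders o JOIN customers c USING (customer_id)""").fetchall())
```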
This bears considerable resemblance to the situation in game design. Beginning students have a lot more experience of games than of databases. Nonetheless, if they have no experience of designing a game then the theory of game design will mean much less to them. They'll fall back on the fantasy I discussed at the beginning of this article, that it will all happen very easily. They will still think that game design is all about getting good ideas, not about the execution of those ideas.
(Another related note:)
The intellectual/emotional split between the “theory” of game design and the practice is a little like the split that’s part of teaching beginning students to program: if you try to have them both solve the problems that are posed and write programs to implement the solutions, they’ll fail at both. You need to either present them with a solution and have them program it, or have them figure out a solution to a problem and then show them how an efficient program for that solution works, without asking them to write it themselves. When they’re more experienced they can do both the problem-solving and the program creation.
Lew Pulsipher
Friday, August 26, 2011
"Objective" is not better than "Subjective"
(I'd swear I posted this years ago, but I cannot find it. Since the topic keeps coming up, I'll post it now.)
Many people misunderstand "objective" to be somehow more valid than "subjective". This is not true at all.
Objective merely means in accordance with an external standard. For example, a person can use the objective standard of a ruler or yardstick to measure the width of a room. However, if that objective standard is itself wrong--say the yardstick used is not 36" long--then the result is certainly invalid.
A subjective measurement is based on internal rather than external standards, but can be just as (or more) valid. If someone who is very good at judging distances tells you how wide the room is, he may be more accurate than someone who measures with a yardstick, especially if that yardstick is faulty.
Many matters cannot be measured objectively, because we have no way to do so. Can "customer satisfaction" be measured objectively? No, it is necessarily subjective. Yet it is nonetheless of vital importance to, say, computer support personnel. If you try to measure how "good" computer support is by counting things, you may come to faulty conclusions. Simple example: is it good that over time there are more and more calls to a computer Help Desk in a corporation? Well, you could conclude that more people realize that the Help Desk really does help (some don't, you know). Or you could conclude that the Help Desk isn't doing very well, so people have to call back. Or you could conclude that the training for workers, so that they can manage to do things themselves, is faulty, so they have to call the Help Desk. And so forth. An increase in number of calls can be argued both ways. The only way to be sure is to survey the people who use (or could use) the Help Desk, a subjective measure that is much more valid than the objective measure of number of phone calls.
Most sports, for example, rely on subjective evaluations (referees, judges in figure skating or boxing, etc.). Sometimes people are unhappy, but the prizes are still awarded. The judges/referees/umpires use "objective" standards, but they necessarily apply them in a subjective way--"judgment calls".
How about the famous dog shows, judged by a single referee? We could just have a time trial and measure the running speed of the dogs, and that would be an objective measurement, but it wouldn't tell us which was "best", only which was fastest in a time trial.
There appears to be no *meaningful* objective measure available to determine whether a game is "great" or "flawed" or "awful" or somewhere else in between. We can try to use a combination of "kind of meaningful" (do lots of people play it?) and "kind of measurable" (the effect of the game on people, which we have no means to measure objectively) and "kind of objective" (is there player elimination, etc.).
I see people applying the (wrong) objective standard to so many things, because they cannot easily measure what is really important. I'm a college teacher. In teaching, accreditation bodies measure whether people have a degree rather than whether they know the subject (that's too hard to determine). They don't even imagine they should try to measure whether the person is a good teacher--it's too hard, so they measure something that is truly unimportant. My neighbor tells me that a respected teacher at her high school, who has taught special education for 32 years, got a letter from the state telling him he was unqualified--because he didn't have the "proper degree". In assessment of students, we see "end of class" multiple choice tests as the only determiner of student/school quality--MC tests are a poor way to determine whether someone knows something--and as a result we have students coming into colleges who have been trained, not educated, and cannot think for themselves. Even the good students don't understand what they're doing.
At one high school system in a major city, they have cooperative agreements with the colleges, and instead of running "AP" (Advanced Placement) classes they run classes via the colleges--except for American history. The entire school system's worth is measured by how students do on their American History end-of-class test, because that's all there is to "objectively" measure. Balderdash and poppycock!
Examples can be multiplied in other fields, I'm just using ones (computer support, education) that I'm familiar with.
Friday, February 18, 2011
Game designing and writing as professions
I was at a local game shop the other day to try out 4th edition D&D seasonal adventures. One of the players had played Warhammer 40,000 but had never played D&D. I discovered on further acquaintance that she likes to write fiction. This seems to be the most common hobby cum professional objective of people in their late teens or early 20s, after wanting to make video games, though that observation comes from my own experience rather than surveys. (Before someone comments that surveys show that teens want to be doctors, lawyers, teachers, and sports people, I’m talking about what they really want to do, not what they think they ought to want to do, or think that others think they should do, or what they think they will have to do.)
Fortunately this 19-year-old recognizes that she isn’t likely to make a living from writing; unfortunately she doesn’t really have any idea of what else she might want to do.
I guess that the number of people who make a full living from fiction writing worldwide is in the hundreds rather than the thousands. I recently read an interview with Glen Cook, who is one of my favorite fantasy authors, who said:
"Even in my best years of the first thirty it was never more than hobby money. The last maybe five I've made enough to support myself in genteel poverty. Certainly not enough to support a family and put three sons through college."
This is a man who worked full-time and retired from General Motors, and wrote in his spare time, but had a lot of books published. Now that he's retired he does about two a year.
In contrast, the number of people who make a full living from tabletop game design is very likely less than 100 total, with no more than a quarter of those being freelancers. The obvious freelancers are Reiner Knizia, Klaus Teuber (Catan), and Alan R. Moon (Ticket to Ride), and likely Richard Borg (Liar's Dice, Memoir '44, etc.), plus people who work at Hasbro and a few other companies.
Perhaps even more in fiction writing than in games, it's very rare for a young person to become well-known. Apart from the exception of the author of Eragon (who got a lot of help), how many successful fiction authors, people who make enough to make a living, can you name who are less than 30 years old? There's probably somebody in tabletop game design under 30, but the ones I've named above are much older than that. Part of this may simply be that you need to do quite a few things before you become well-known, but in fiction writing I also think it's a matter of personal experience. The authors of really affective [sic] fiction can draw upon a wealth of life experience: they've personally experienced love and death and disappointment and betrayal. (When I talked about experience to my 19-year-old acquaintance she pointed out all the things she had *done* (such as skydiving and horse riding) rather than all the emotional experiences she had had.)
In the age of instant gratification it's now even harder for young people to recognize that practice makes a difference, THE difference. This is true for fiction writing and it's also true for game design. This is what Cook had to say about fiction writing when asked "Do you have any advice for beginning writers?"
"This is the easiest answer of all. Write. Don't talk about writing. Don't tell me about your wonderful story ideas. Don't give me a bunch of 'somedays.' Plant your ass and scribble, type, keyboard. If you have any talent at all, it will leak out despite your failure to pay attention in English. And if you didn't pay attention, learn. A carpenter needs to know how to use a hammer, level, saw, and so forth. You need to know how to use the tools of writing. Because, no, the editor won't fix it up. S/he will just chunk your thing in the shit heap and go on to somebody who can put together an English sentence with an appropriate sprinkle of punctuation marks."
Jerry Pournelle used to say you too can be a novelist if you're willing to throw away your first million words. Brandon Sanderson, who is finishing the Wheel of Time series following the unfortunate death of the original author, wrote something approaching a dozen novels before he sold one. Glen Cook apparently wrote a great many novels before he sold one. And none of those old novels will ever be published.
Fortunately my 19-year-old is writing rather than just talking about writing. I know another 19-year-old who wants to be a novelist but can only make herself write as part of National Novel Writing Month every November. With the support others provide during that month, and the contest-like aspect of it, she can do it; the rest of the time it doesn't seem to happen. That's not going to work in the long run, is it?
Perhaps several hundred people work as game and level designers in the video game industry and make a living. But very few of them came out of school to get a job as a designer. Just as it's necessary for an aspiring fiction writer to have a fallback career in mind that will enable them to actually make a living, it's necessary for an aspiring game designer to gain other skills that can make them a desirable employee in the game industry. This would usually be programming or art, of course, although many people in game design and even game writing started out doing something for game companies that was not directly involved with game creation, such as game testing, working in the mailroom, working in the IT department, working in marketing, and so forth.
Just as Cook says that you have to write, I tell students that if you want to be a game designer you've got to design games. And you've got to take them all the way through to completion; it doesn't help just to get ideas, or to flesh out the ideas a bit and then stop. A playable prototype is only the beginning.
One of the problems with video games is that it takes a long time to produce a playable prototype. It's much more practical to begin by designing tabletop games, where you can make a playable prototype in a few hours or less.
Of course, to begin with it makes a lot of sense to modify existing games to improve them rather than to design games from scratch. When I was a teenager and in my early twenties I designed Risk variants and Diplomacy variants. But I had also designed games to play by myself, once I'd been exposed to commercial wargames, beginning with Conflict when I was very young, then American Heritage Broadsides, and then especially Stalingrad, Afrika Korps, and other Avalon Hill games. But I tended to design games that were not commercially viable: for example, I designed a massive space wargame that I played solitaire with many, many sides, far too many to be practical. It also used fog of war without any mechanism for it; I just pretended, as I played each empire, that I couldn't see where the opposition was and didn't know what they were doing.
So when I teach beginners game design, one of the first things I do is talk about what an inadequate game Monopoly is (especially for adults), and why, and then have them try to come up with ways to improve it. And I have them actually play their variant to see that it usually won't turn out the way they think it will.
The Cook quotes are from http://www.sfsite.com/10a/gc209.htm
I hope I've cleaned up all the oddities introduced by Dragon Naturally Speaking.
Wednesday, January 12, 2011
What characterizes broad game markets?
I have been thinking about what characterizes the broader market in games, both tabletop and video. I haven't come to any generalized theory yet (if I ever will), but I have some observations.
"Twitch games" are games requiring a player to move and react very quickly. This is the most common form of hard core video game, as epitomized by shooters, but can also be seen in many casual games such as Tetris.
The 21st century is the world of Instant Gratification, of "oh shiny", of the "Easy Button", of myriad distractions and encouragements to "just do it" rather than think about it. It's the world of "listen to your feelings, Luke", where something other than logic is preferred (e.g. "The Force" is better than any computer). K-12 education in most places in the USA consists of memorization of material to pass multiple choice tests. Students aren't encouraged to think. ("Life is an essay test, not multiple choice", but that's not the trend in education.) Twitch games are far more popular than strategy games because so many people in the modern world are unwilling to shift their brain out of first gear. I am talking about general points of view, not necessarily what YOU are like, of course. We don't need to concern ourselves with whether this is good or bad, it is what it is.
In the past decades we've "dumbed down" the twitch games to reach a broader market; typical games are easier than in the past, replete with such features as auto-aim and auto-save. I'm *not* saying this is bad; in fact I think we should go further in story-driven games, so that those who want to enjoy the story without the work of playing the game can do so, while those who enjoy challenge can do so. A game can be hard for those who want it to be hard, and can provide auto-pilot for those who just want to enjoy the story.
But the "dumbing down" also means there is even more of a market now for "twitch games" than for thinking games. Kids especially are far more willing to learn the highly repetitive hand movements and the eye coordination, than to apply a lot of brainpower to a game.
There may have been a time--or may not--when the population as a whole were more willing to think than to twitch, but if so those days are long gone.
Not surprisingly, many of the people who like thinking games play tabletop games more than video games. The proportion of "twitch" is much higher in video games (of course), the proportion of thinking games much higher in tabletop because there are few ways to make them games of reaction and movement, and because people are more formidable and resourceful opponents than the computer.
Social networking games on Facebook are an extreme, in a sense a reversion to the original video games that required very little brainpower. Most if not all social networking games are deliberately designed to present very simple puzzles each day (often repetitive puzzles) that any normal person can solve without frustration, if they choose to do so. Nor are they actually social, as almost all of them can be played solitaire; other people are not required.
As a lifelong "strategy gamer" and one who enjoys playing games with other people, I find all of this disappointing, but game designers must deal with it.
"Twitch games" are games requiring a player to move and react very quickly. This is the most common form of hard core video game, as epitomized by shooters, but can also be seen in many casual games such as Tetris.
The 21st century is the world of Instant Gratification, of "oh shiny", of the "Easy Button", of myriad distractions and encouragements to "just do it" rather than think about it. It's the world of "listen to your feelings, Luke", where something other than logic is preferred (e.g. "The Force" is better than any computer). K-12 education in most places in the USA consists of memorization of material to pass multiple choice tests. Students aren't encouraged to think. ("Life is an essay test, not multiple choice", but that's not the trend in education.) Twitch games are far more popular than strategy games because so many people in the modern world are unwilling to shift their brain out of first gear. I am talking about general points of view, not necessarily what YOU are like, of course. We don't need to concern ourselves with whether this is good or bad, it is what it is.
In the past decades we've "dumbed down" the twitch games to reach a broader market, as typical games are easier than in the past, repleat with such features as auto-aim and auto-save. I'm *not* saying this is bad, in fact I think we should go further in story-driven games so that those who want to enjoy the story without the work of playing the game can do so, while those who enjoy challenge can do so. A game can be hard for those who want it to be hard, and can provide auto-pilot for those who just want to enjoy the story.
But the "dumbing down" also means there is even more of a market now for "twitch games" than for thinking games. Kids especially are far more willing to learn the highly repetitive hand movements and the eye coordination, than to apply a lot of brainpower to a game.
There may have been a time--or may not--when the population as a whole were more willing to think than to twitch, but if so those days are long gone.
Not surprisingly, many of the people who like thinking games play tabletop games more than video games. The proportion of "twitch" is much higher in video games (of course), the proportion of thinking games much higher in tabletop because there are few ways to make them games of reaction and movement, and because people are more formidable and resourceful opponents than the computer.
Social networking games on Facebook are an extreme, in a sense a reversion to the original video games that required very little brainpower. Most if not all social networking games are deliberately designed to present very simple puzzles each day (often repetitive puzzles) that any normal person can solve without frustration, if they choose to do so. Nor are they actually social, as almost all of them can be played solitaire; other people are not required.
As a lifelong "strategy gamer" and one who enjoys playing games with other people, I find all of this disappointing, but game designers must deal with it.
Sunday, January 9, 2011
Beginners "design" what they want to play
If you ask beginners to "design" a video game by writing a formal description (a game treatment), what you get is a vague description of the "really cool" game they'd like to *play*, usually overflowing with superlatives like "great story" and "great graphics". There's no recognition of practical limitations. It is "pie in the sky". Nor is there any element of real game design, which is about setting constraints that pose problems, solving those problems, then solving the new problems that arise from the inevitable weaknesses of the initial solutions, and so on. There are no details, only vague ideas, and I (and many others) have already described how little value there is in ideas.
"Always do right--this will gratify some and astonish the rest."Mark Twain
"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." Antoine de Saint-Exup'ery
"Not everything that can be counted counts, and not everything that counts can be counted." Albert Einstein
"Make everything as simple as possible, but not simpler." Albert Einstein