Games and Realism: Part 3.1 - A Game Is A Game Is A Game
I think I'm probably one of the worst sleepers in the world. I crave sleep, yet I always go to bed way too late at night. As a result, in the morning, when it's time to go to work, I always feel like I really need more sleep. When that alarm rings, it's like the call of the devil.
But you know what's worse? When you wake up before your alarm goes off. You know it's almost time to wake up, but you don't know how much longer you have. You wanna know if you can try to fall completely asleep again or just forget it and get up now, since the alarm will ring in less than 10 minutes. So you open your eyes to a squint, just barely enough to look at your clock and find out how much time you have.
Do you know how it can get even worse? By not having a clock in plain view, because now you gotta climb out of bed to find the time. But it's so annoying and frustrating that you'd almost rather not know what time it is at all than to climb (fall) out of bed and find a clock.
Of course, most people don't have that last problem. Most of us are smart enough to always have a clock in plain view of our bed, so we know exactly what time it is when we wake up. If we don't have one, we'll go out and buy one right away, and put it somewhere conveniently in plain view of the bed.
This isn't science here, folks. You know exactly where you want your clock to be placed, though there are definitely many different ways to implement your clock's placement. You can use your cell phone as a clock so you can bring it to your face to check the time without even needing to turn your head. You can place it in plain view but across the room, forcing you to get out of bed to shut off the alarm when it goes off. You can have it on the table next to your bed at perfect arm's length, so that when you swing your arm over, it hits the "snooze" button right on target every time. Regardless of the solution, you can now discern the time from your bed. It's valuable information, information you want to obtain with the least amount of time and effort possible.
I believe that gaming is all about information. You gather all the information you know at the time and act accordingly, whether you realize it or not. Whether you're into metagaming (in the words of Omar Kendall from his comment on my last post, metagaming is "when players look past the setting and are primarily concerned with underlying systems and information") or casual gaming, this is how a video game is played. Pac-Man was a success with casual gamers because, at all times, you can see where the Ghosts are and where the dots are. You plan your path according to where the Ghosts aren't, even playing out in your mind far in advance where the Ghosts will not be many seconds later. Tetris was a success with casual gamers because the playing field is always visible, along with the "Next" piece in a window on the side. All the information you need is there, and using that information allows you to react well to the randomly generated pieces. The Sims was also a huge hit because it gives you all the information you need in obvious ways. Even though traditional meters are provided for most information, the game still delivers it in a very entertaining way: your Sims will have symbolic thought bubbles or break the fourth wall and yell at you directly. In all of these games, even casual gamers are very reactive to the information presented to them on screen. It's not as heavily analyzed as it would be by metagamers, but it is still very much processed in a similar fashion.
So, just like with the clock in the morning when you wake up, you want your information as quickly and efficiently as possible to be able to plan your next move. Gaming is instinctual and requires a great deal of forward thinking. Removing information for the sake of immersion, or to emulate something that isn't a game, is just annoying, as Derek Daniels commented on the last post. But of course that doesn't necessarily mean keeping ourselves stuck in the past. It's all about evolution, and we have so much to learn from the past 30+ years of gaming. If something doesn't work anymore, why bother, as James M asks in the previous post, to continue honoring it? Remembering video gaming's roots isn't about re-using the same systems over and over again.
The way to remember video gaming's roots, as I woefully failed to elaborate in my last post, is to remember what games are. They are a mental challenge, though sometimes one requiring finger dexterity. They are obstacles presented to us as something to overcome. That's why games are so wonderful: you always know you can win, regardless of how hard it is. Games are never made so that they can't be beaten, like so many things in real life are. Game makers are challenging their players to beat the obstacles they have presented. There is a great joy when you manage to overcome the things thrown at you. Sometimes players even feel like they've outsmarted the game makers, even though game makers have every intention for their games to be beaten.
But if a game is annoying and unnecessary extra effort is needed to gather the simplest or most critical information, it loses a lot of its enjoyment factor. This is exactly what needs to be avoided. Most game designers know this, but I'm afraid that, in a quest to achieve maximum immersion, some may overlook this very fundamental concept. If you are planning to attempt high levels of immersion, don't do it at the expense of that enjoyment factor.
When I suggest that game designers should keep adding things that make a game a game, I'm speaking in very broad terms. Of course there is no one true answer or system that makes a game a game. Maybe David Jaffe is right: video games are like porn, because I certainly can't describe what constitutes a "game thing," but I sure know it when I see it. A game is a game is a game. I used the removal of the health bar in Fight Night 3 and King Kong as negative examples of immersion, yes, but it's not that I think life bars should never be removed. It's just that for those games, knowing your health at all times is mission-critical information. Feel free to take away life bars if you want, but you'd better have a quick and clear indication of your health elsewhere, such as the Silent Hill example I gave in the last post. James M brings up a great example in Resident Evil. Even though your energy meter is inconveniently placed in a subscreen, the slowed run and limp are immediately recognizable signs that give you the information you need right away and prompt you to check your health meter in the subscreen.
(This is the part where I admit having not yet played Fight Night 3, since I still don't own a damned Xbox 360. How badly damaged your character is could very well be conveyed effectively, and I could be using a poor game as an example. I think I pick on it mostly because it drives me mad to hear people claim that dropping life bars makes the game more immersive, just 'cause it looks more real with no health meters.)
I do want to expand more on the topic that Omar has brought up: the metagamer versus the casual gamer. But I think that can take up a whole blog post all on its own, so I'll save that for next week.
To close up on this immersion topic, though (unless new comments spur more reactionary commentary from me... do your worst!), I am mostly fearful that some game designers may be losing focus. Immersion has become such a buzzword and, for some reason, has become equated with realism. So in their strides to become more immersive, I'm afraid games might start to lose sight of what they are, and gaming, as a whole, will suffer for it. I guess I'm just trying to be a wake-up alarm for those that may have accidentally slipped onto that path. But some of you are checking that clock and noting that it's probably far too early to sound the alarm just yet. If so, feel free to ignore me, hit snooze, and go back to bed.
- James
2 Comments:
I don't want any confusion as to the term metagaming that I previously introduced; it is not synonymous with "hardcore" or anything like that - the gaming market comprises much more complex buying groups than simply hardcore and casual. Metagaming is a type of awareness that all kinds of players may or may not manifest, and in fact it's not tied specifically to video games at all, but rather to all gaming.
That many traditional video game players metagame is not somehow proof that metagaming is what video games are all about. This is a hard topic to convey, and maybe it's because I'm speaking as a designer and not as a player that I have a different perspective, but we simply cannot assume that all players play games the same way - there is no one game to rule them all. The assumption that you're making regarding the communication of information stems from the way things have been, which is wholly independent of any "right way," and you do seem to acknowledge at least that. You want your information as quickly and efficiently as possible. Your gaming is instinctual and requires much forward thinking.
I wonder though, if my mom has as much gaming instinct as you do. I wonder how much she would rely on the traditional methods of quick and efficient information dissemination, and how much forward thinking she could apply to games. Probably not nearly as much as you, and yet the value she places on information should mirror yours? I disagree. This is fun!
By omar kendall, at 9:35 AM
I actually do agree with most of what you are saying, and that's what I wanted to devote a whole post to. I figure it's a good topic to have its own article, rather than just being one slapped into the middle of some talk about immersion and gaming roots and what-not.
I think that non-metagamers (for lack of a better term, since I do want to make a distinction that calling them casual gamers does not imply that metagamers are hardcore gamers... I just don't know what the opposite of a metagamer is called...) do not apply metagaming strategies at first. But I think they learn it as they continue at the game they are playing, whether it is video games, basketball, chess, or whatever. People do generally learn and, if they stay complacent at the level they are at, more than likely they will grow bored of the game they are playing and drop it altogether. If they garner enough vested interest in the game, they will begin to pick things up given time. And not a lot of time is needed, either.
So quick information dissemination is not something non-metagamers will need right away. But as they become more involved with the game, it is something they will learn to want and need. I would like to think that even a non-gaming mom playing Nintendogs, for example, will eventually learn to clean her dog before the fleas show up by checking the status menu.
Anyhow, I'll save more of this for next post. Keep up the discussion, though. I guess we're disagreeing to agree... this is fun.
By jchensor, at 1:34 PM