The Sims 2 (Windows)/Story Recognition Mode
This is a sub-page of The Sims 2 (Windows).
To do: Put this into unused data and prerelease.
This needs some investigation. Discuss ideas and findings on the talk page.
Story Recognition Mode
According to Mike Sellers, lead designer on The Sims 2, one of the design goals of the game was this:[2]
- Make the environment adaptive based on the player's decisions. This is probably my favorite thing that didn't make it in: we wanted to vary the lighting, camera angles used, and music based on the kinds of objects the players purchased -- so if you bought a bunch of creepy things vs. hearts and flowers, we'd use lighting/sound/view angles corresponding more to a horror movie vs. a romance. Among other things we were able to show that the same animation of two Sims kissing looked and felt very different when set to romantic violin music vs. smoky sexy jazz music (one exec who watched the two sequences said the jazz one had a lot better resolution -- but it was literally the same video of an animation in both cases).
The question "What is your best 'The Sims 2' life story?" was asked on Quora. Mike Sellers was asked to answer it. In his answer, he described how story recognition mode worked during the early stages of the game's development:[3]
I was A2A, which is cool but odd, since I never played the game much after it was released. All my stories come from inside, like:...
- the “story mode” we wanted to put in, where based on the kinds of objects you bought and put in the house, the lighting, sounds, and camera angles would change as the game deduced whether you were telling a Comedy, Romance, Horror, Action, or some combination of story types. The prototypes for this worked really well… but again, it never made into the game.
The idea was first introduced by Will Wright at the Game Developers Conference 2001, in his talk "Will Wright's Design Plunder", starting at minute 58. Here's how he described it:[4]
Now, the last thing I want to talk about here, real briefly, is interactive storytelling.
I've always been really happy in my little sandbox, you know, doing these open-ended games, these simulations. And I've always really avoided the idea of interactive story. But at this point, as you can see, our fans are going to basically drag me into it whether I like it or not. So I've been thinking about this quite a bit lately, how we might enable this.
What's happening right now is the simulation is going off and doing one thing, and the fans are saying, "Oh, to hell with that, I'm going to tell my story over this direction." And frequently, they're having to take these Sims that just want to, you know, pee and eat and sit in front of the TV, and they're grabbing them, "No, stand over here, I need to take this shot." And so they're like these little actors on strike, you know, that you're trying to manipulate to take the screenshots.
So, what we need to do is try to get these things more to converge, bring them back into line. We've got basically this open-ended space, you know, you go in a direction in, but it's very flat dramatically. Now, drama, you know, is linear but has this added dimension that we don't have. And so, how do we get this added dimension of drama in our open-ended space?
I think the path to this, I see, is not teaching the computer to tell a story, but rather teaching it to recognize a story. If, when you're playing these stories out in the Sims, the computer can actually get some sense of the story that's in your head, it could then start helping you a bit.
It might, at first, be just a very simple thematic recognition. It might look at the things you've bought in the Sims and the interactions you've chosen, "Oh, you tend to kiss a lot, and you buy this type of stuff, I think you're doing sort of a romantic story." Or, you buy, you know, heart heads and jars, or shackles in your dungeon, and it might say, "Oh, that looks kind of like horror." And then it looks at what you're doing, it might even drive events to resolve it later.
You know, if it's not quite sure if you're doing a horror or comedy, you might walk into a room, and then there's a chainsaw and a cream pie, and it's looking to see which one you pick up.
Now, there are a number of approaches we can take towards this. One might be analogous to language parsing; this is the way computers try to understand natural language. They'll look at the string of words and from those try to build higher and higher levels of features and eventually parse it into a complete sentence. I suspect there might be something similar possible with story parsing, where you look at an event stream and look for higher and higher levels of structure to build the computer's understanding.
Let's assume that, for instance, we could do this. If the computer could parse this story, then it could actually start changing the presentation of the story in something like The Sims. We could start having the camera angles change, we could have the lighting change, the music. So, if it thinks you're doing horror, the lights start to come down, the spooky music starts, you see lightning in the background, the camera gets really close in, ambient sound effects, and all this stuff.
And the next step would be probably for the computer to actually start driving the events, rather than the events being random as they are now, events that are there to support your story.
Another way to look at this is that we have users in the game always trying to climb these success landscapes. If the computer can detect what goal the user is trying to pursue in the game, then perhaps we can use that. Say that a million people are playing this game every day, and after they play it, there's some fitness test.
You can measure how long you've played, maybe you score at the end, you give it a 1 to 10, "Oh, I really liked that experience today." Then it goes back up to the server, and it compares all the results. So, at the top level, we can have the server discovering these rules of story. It's not like we're trying to engineer a story grammar; what we're trying to do is develop a system where we have enough information for the servers at the top end to discover this. This is very much like SETI@home, where we're doing a vast parallel search, except in this case we're really using parallel players.
So, I think long term this might really be practical. And of course, you can look at different players, and they're going to want to play the games in different ways. So, a mapping that might work for me won't necessarily work for you. So, at some point, you want to classify the players. It might say, "Oh, that's the type of player that really enjoys the comedy experience." So, they get this mapping, and they'll be evolving a certain mapping for that group of players. These other players, we know, elected to kill and torture their Sims, and so we're going to give them a completely different experience.
So, that's about it on this. One more thing: once you've done this, you know, assuming we could do this, which is a big "if" - I know it's a very difficult target - once you finish this, once you've done the parsing, presentation, the computer is helping you tell the story. In essence, you've made a movie at the end, which you can then share with your friends. Maybe you can even go back and edit the camera angles. To other people with the game, you should be able to send this movie, very much like Quake movies we're doing, which was a very cool idea. If you have the engine, it's a very small amount of data I have to send you to send the movie, or it could be still saved out as a JPEG or an MPEG and given around to other people, posted on websites.
So, really what I think this is going to end up being more like is The Truman Show, where you're Truman kind of living your life out. You know, there's boundaries, but the boundaries are as far away as we can make them, and the computer is back there, sitting there, looking at what you're doing, saying, "That might be dramatic. What if I had this happen?" And maybe trying things on you all the time. The computer might be sending NPCs into your life, you know, seeing, "Oh, I see this love story developing. I'd better make the jealous ex-girlfriend appear at the door right now," so always, you know, trying to keep the dramatic tension going.
So, that's about all I have to say on that.
Based on the descriptions provided by Mike Sellers and Will Wright, they appear to be discussing different aspects of the same mode. Sellers focuses on the atmosphere: camera angles, lighting, and music. Wright describes these as well, and also mentions the potential for the computer to drive events in the game rather than leaving them random, giving the example of sending NPCs to maintain the story's dramatic tension. One function of this mode, as described by both, is determining the genre of the story the player wants to tell based on the objects the player acquires in the game.
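The "simple thematic recognition" both designers describe can be sketched as a scoring scheme over tags on purchased objects. Everything below (the object tags, genre lists, and weights) is invented purely for illustration; it does not reflect any actual The Sims 2 code or data.

```python
# Hypothetical sketch of thematic story recognition: score candidate genres
# by counting how many of the player's purchased objects carry tags
# associated with each genre. All names here are invented for illustration.
from collections import Counter

# Invented mapping from genre to object tags that count as evidence for it.
GENRE_TAGS = {
    "romance": {"flowers", "hearts", "violin", "candles"},
    "horror": {"shackles", "tombstone", "chainsaw", "cobwebs"},
    "comedy": {"cream_pie", "whoopee_cushion", "clown_painting"},
}

def infer_genre_scores(purchased_tags):
    """Return a score per genre: the number of purchased objects whose tag
    matches that genre's evidence set."""
    counts = Counter(purchased_tags)
    return {
        genre: sum(n for tag, n in counts.items() if tag in tags)
        for genre, tags in GENRE_TAGS.items()
    }

purchases = ["shackles", "tombstone", "cobwebs", "flowers"]
print(infer_genre_scores(purchases))
# horror scores highest for this set of purchases
```

A real implementation would presumably also weigh chosen interactions ("you tend to kiss a lot") and decay old evidence, but the core idea is just accumulating genre evidence from the player's choices.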
This is one of the directions Will Wright wanted to take The Sims in the long term. In the GamesRadar interview "The Sims 2: Inside the mind of Will Wright",[5] published on November 4, 2004, Wright explains that he wants to take The Sims in a direction where the computer has a better understanding of the story the player is telling. He believes that the more the computer can recognize the emotional state of the player, the better it can supply appropriate results and repercussions to the story.
Wright gives an example of how the computer can understand the story by observing the interactions the player takes in the game, the clothes they dress their Sims in, and the characters they create. He says that he can get a rough idea of whether the player is doing comedy, horror, or romance.
Wright wants to take this concept further and make the game feel like an interactive movie. He wants the computer to act as the player's assistant, handling the lighting and sound so that everything is targeted to the story. For example, if the player is telling a horror story, the mood can change, the music can get creepy, a thunderstorm rolls in, lightning strikes, and suddenly there's a chainsaw outside the front door.
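Wright's "assistant" idea amounts to a mapping from the detected genre to presentation settings, applied only once there is enough evidence. The setting names and threshold below are invented for illustration and are not actual game data.

```python
# Hypothetical sketch: map a detected story genre to presentation settings
# (lighting, music, weather), as Wright describes. Setting names, values,
# and the evidence threshold are invented; this is not actual game code.

PRESENTATION = {
    "horror": {"lighting": "dim", "music": "creepy_strings", "weather": "thunderstorm"},
    "romance": {"lighting": "warm", "music": "romantic_violin", "weather": "clear"},
    "comedy": {"lighting": "bright", "music": "playful_brass", "weather": "clear"},
}

# Neutral defaults used while the system is still unsure of the genre.
NEUTRAL = {"lighting": "neutral", "music": "none", "weather": "clear"}

def presentation_for(genre_scores, threshold=2):
    """Pick settings for the top-scoring genre, falling back to neutral
    defaults when no genre has accumulated enough evidence yet."""
    if not genre_scores:
        return NEUTRAL
    genre = max(genre_scores, key=genre_scores.get)
    if genre_scores[genre] < threshold or genre not in PRESENTATION:
        return NEUTRAL
    return PRESENTATION[genre]

print(presentation_for({"horror": 3, "romance": 1}))
print(presentation_for({"horror": 1}))  # not enough evidence: neutral defaults
```

The threshold captures Wright's "chainsaw and a cream pie" point: until the player's choices disambiguate the genre, the game should withhold a strong presentation rather than guess.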
In this interview, Will Wright corrects the interviewer's use of the term "storytelling engine" and clarifies that he is actually referring to a "storytelling parser."[6]
Overall, Wright wants to create a more immersive and personalized experience for players by giving the computer a better understanding of the story they are telling and providing appropriate responses to their actions in the game.
Alleged observations from players on the thesims.cc forum
Although Mike Sellers says that this mode did not appear in the release version of the game, according to several players on the thesims.cc forum, it may activate in certain configurations of the game.[7] For example, when only the University and Bon Voyage expansion packs are installed, possibly also with the Apartment Life expansion pack.
A few more observations from players on the thesims.cc forum about how Sims behave when the mode is active:[8]
- Sims can change their behavior on their own depending on the player's actions, even when this conflicts with their aspirations.
- Sims can cancel their queued actions on their own; the action's icon is crossed out, as if the player had canceled it.