
Let’s Clear Up Some Things about Escalators at Gyms

So.  The photo to the right shows people taking an escalator to a fitness center.  It crops up regularly when people want to talk about how Americans are fat and stupid, because only a stupid person would take an escalator when they were just going to work out anyway, and if people would just take the stairs then they would not be fat.

Can we stop this, please?  Aside from the fact that the photo is a good 5-10 years old and the line of reasoning presumably older (from the first days of escalators, I imagine), you don’t look clever and superior.  You just look judgmental.  Actually, check that–you just are judgmental: saying judgmental things makes you, ipso facto, a judgmental person.  And, given that you think that people who ride escalators are worthless societal leeches, you don’t get to complain about how I refer to you.

To head off the inevitable objection of “If someone broke his leg skiing or something, then it would be okay for him to take the escalator, but those are clearly able-bodied adults!”, let’s start with arthritis.  Swimming is good for arthritis.  Therefore, an arthritic might join a gym to swim, but have trouble getting up the front steps.  While advanced cases may require a walker or cane, the early stages of arthritis can cause significant pain while remaining externally invisible except upon close examination.  And some types of arthritis strike young people and even children.

I could name any number of other injuries, illnesses, and disabilities that might lead to the same situation; the point is that you shouldn’t judge because you don’t know.  People with non-visible disabilities face hatred and discrimination every day from people who assume that, if you don’t look disabled, you couldn’t possibly be disabled*.  Don’t be one of those people.  You can’t get much lower than picking on the disabled.

Here you may gird your loins and say that, sure, some people who take the escalator to the gym might have legitimate health problems, but some of them have got to just be fat lazy slobs, amirite?  So let’s talk about obesity.

First, being overweight does not mean that someone is lazy!  The idea that everyone could be thin if they just didn’t sit on the couch all day eating bonbons is insulting.  Diet and exercise contribute to weight, but so do metabolism, body shape, and other factors over which one has no control.  For instance, polycystic ovary syndrome affects about 5% of women, is often undiagnosed, and causes–you guessed it–obesity.

Second, the fat=unhealthy myth needs to die a painful death.  An obese person may be unhealthy.  Or he or she may be an Olympic weightlifter.  And it isn’t like thin people never suffer from lifestyle-related health problems, either.

Third, maybe an obese person is taking the escalator because he or she has joint problems and/or shortness of breath because he or she is overweight.  You’re shaming a fat person for going to the gym!  Gyms have a problematic tendency to cater to people who are already fit, rather than those who are trying to get fit.  This may be an inherent problem to some degree, but I’m pretty sure the solution isn’t to mock people and blame America’s problems on them.

Finally, maybe the escalator-taker has absolutely no excuse.  He or she is able-bodied and plans to go straight to the stair climber.  It’s still none of your damn business.  That person’s silly exercise regimen affects you exactly not at all, and your snide superiority is far more unattractive than taking an escalator to a gym could ever be.

EDIT: Read the comments for more examples of judgmental people who are incensed at the idea that they shouldn’t consider others inferior to them based on their lifestyle choices.

—-

*Not to suggest that people with visible disabilities don’t face discrimination, because of course they do.


A Week without Social Media

I’m behind on everything because I’ve been moving, but I can’t possibly let Harrisburg University’s week-long social media blackout pass without a comment.

I don’t know about you, but regardless of what justifications are put forth, this has “curmudgeon” written all over it.  Not necessarily an old curmudgeon–the originator is in his forties–but a curmudgeon nonetheless: “I made do without Facebook when I was your age and so can you!”

It’s an experiment, says Eric Darr, the professor enforcing it.  A learning experience.  At the end, students will write reflective essays about the experience (I’m sure they’re delighted about that).  But it’s hard to believe it’s a lesson rather than a punishment, given that it’s forced.  Students don’t have the choice to participate.  Why not?  To make it a shared experience, Darr says, and to avoid the confusion of non-participating students trying to get hold of participating ones.  If only there were a sort of shared personal website where someone could alert friends and acquaintances that one was going to be offline for a week.

Darr, along with a lot of over-35s, thinks that young people are becoming addicted and need to learn how to cope without social media.  Hogwash, I say.  There will always be new technology and it will always displace old ways of doing things and lifestyles will always change as a consequence.  It’s always possible to wean oneself off these advances and, yes, it requires adjustment, but who cares?

Because the same logic could be applied to cell phones or regular phones or email or all computers or all typing devices or written language as a whole or cars or wheeled transportation in general or just about everything else.  We could go for a week without any of these.  There are even arguments to be made for doing so: the ancient Celts swore that writing ruined your memory.  We’re now addicted to writing and can’t remember a thing we didn’t make a note of.

When I was growing up in the days before e-commerce, my parents booked all our airline tickets over the phone.  By the time I started booking my own flights, online was the norm.  I remember realizing that I didn’t know how to book plane tickets over the phone–not that I expect it would be difficult; I’ve just never done it.  This was quickly followed by the realization that it didn’t matter: I knew the correct current procedure, and the phone method was just one of a thousand slightly obsolete ways of doing things, not particularly better or worse than any other possible way of booking plane tickets, not even interesting enough to be a trivia fact, let alone worth actually doing.

And life without social media isn’t even that different!  I could see how it could be beneficial, if nerdy, to maybe live like pioneers for a week, washing your clothes in a tub and all the rest, because you’d be forced to actually make drastic lifestyle changes.  But having to text or email your friend instead of IMing him?  Not that different.

Fact is, Darr would probably never think of having a week-long fast from telephones or cars.  The very idea would strike him as silly.  He grew up with these conveniences and he knows that they didn’t make his generation a bunch of indolent imbeciles.  But like many middle-aged people, he understands the technologies of his time, knows how to navigate life with their aid, and doesn’t quite see the benefit of this new stuff–though its downsides are clear enough to him.  Quite possibly he holds a lingering resentment against these newer, more convenient ways of doing things, since they weren’t available to him at that age.  It’s a common sentiment, hence the eternal narrative that the next generation is growing up lazy because they’ve never learned how to work*.

Remember, I don’t even use social media.  My last Facebook update was changing my status from “single” to “married.”  But plenty of generations of new technologies and conveniences have gone by without the world imploding in an ADD-generated singularity.  Life changes.  It’s not a threat, it’s not the enemy, and you might as well get used to it.

—-

*There are reasons to think that this might actually be happening, but they are social, not technological.  Hiring a maid is a lot more likely to make you lazy than buying a washing machine.

Picture from Failbook (yes, I changed the picture; this one just seemed so appropriate).


The Politics of STFU

It was a glorious day when the term STFU entered the public discourse (at the hands of Alan Grayson, naturally).  Sometimes people should really just shut up.

Predictably, most people don’t like being told to STFU.  Some of these people have a dim remembrance that there was some sort of rule that meant they could say whatever they wanted, and they leap on that.  Take this comment from over at STFU, Conservatives.  The blog itself is just another irritating Tumblr, but the comment is fun.  The commenter thinks his First Amendment rights are being violated because a blogger told him to shut up.  The blog, of course, set him straight by telling him to shut up again.

But that’s just a random person.  There’s one in every crowd.  Surely prominent figures would never be so silly as to suggest that one private citizen telling another private citizen to be quiet equals censorship or a First Amendment violation.

I don’t know what the future of our country would be in terms of First Amendment rights and our ability to ask questions without fear of attacks by the mainstream media.  (Sarah Palin)

Palin doesn’t want the media to be able to attack her when she attacks other people.  She’s not alone.  Dr. Laura feels the same way:

I want my 1st Amendment rights back, which I can’t have on radio without the threat of attack on my advertisers and stations.  (Dr. Laura)

And Palin feels the same way about Dr. Laura:

Dr.Laura=even more powerful & effective w/out the shackles, so watch out Constitutional obstructionists. And b thankful 4 her voice,America!  (Sarah Palin)

That tweet really does make you feel like a mom trying to figure out what her middle-school daughter is saying over IM, doesn’t it?  So Palin and Dr. Laura either don’t understand the First Amendment, or, more likely, are using it as a stick to beat their critics.  In real life, as you know, the First Amendment only states that the government can’t make you be quiet; in fact, the person telling you to shut up is exercising his or her First Amendment rights as well.

Additionally, the First Amendment says nothing about what sort of platform you should be given to express your views.  Dr. Laura has a privileged position on her radio show, so she isn’t being silenced if people pressure her advertisers to sever ties: she’s being reduced to the same level of communication as the rest of us.  She’s free to use whatever racial slurs she desires on a random blog that nobody reads, just like everybody else.  A privilege is earned and can be rescinded.

But while removing someone’s advantaged position is fair game, I don’t believe anyone ought to actually force someone else to be quiet.  STFU is an exhortation, not a mandate.  I’m telling you that you should shut up–but the burden is on you to either accept my suggestion or not.  That’s why I’ve never been impressed with the Anti-Defamation League.  The ADL, which always struck me as an Israel-centric, litigious clone of the Southern Poverty Law Center, seems intent on actually forcing people to be quiet.  That is still not a violation of the First Amendment, of course, since the ADL is not a government agency, but it goes strongly against the spirit of STFUing and of discourse in general.

Finally, STFU is a tool to use sparingly.  The purpose of discourse is engagement, so regardless of how sure you are that you are right, it’s your job to attempt to substantively speak to your opponent.  However, if your opponent has demonstrated an inability to listen, an insistence on asserting his or her position by sheer volume without argument or evidence, and harmful beliefs that will have real negative consequences, then it’s STFU time.


Ebert Is Back!

4,000+ comments later, and with a score of approximately 4,000:1, Ebert issued an apology for his comments about video games.  Sort of.

I’ll forgive his patronizing tone (the post is saturated with references to things that he likes better, as if whether he had read Anna Karenina were in any way pertinent, and he expresses surprise that video game fans could express themselves intelligently) in light of the preponderance of comments highlighting his age and his dated, out-of-touch views.  But the fact remains that he isn’t really apologizing at all.  He admits that he should not have brought up video games in the first place–a medium he still refuses to invest the time to understand even cursorily–but he still refuses to concede that video games can be artistic.  His admission that sometime in the vast future of mankind it’s theoretically possible that someone might make an artistic game is wiped out by his smug confidence that he’s right about every video game that exists today; he just shouldn’t have said so.

Since Ebert did not apologize or retract his original post in any meaningful way, and indeed built upon it, I’ll seize the opportunity to continue to berate him.

A sign that Ebert indeed just doesn’t get it–where “it” is not only video games, but art as a whole–is his failure to include the possibility of bad art.  By his definition, or rather description, something is either (implicitly good) art or it isn’t art at all.  This is an even more finicky criterion than his puzzling idea that art is the work of one person: it doesn’t even allow a category like painting to count as art wholesale, but requires evaluating each particular painting to determine whether it works and therefore qualifies.  But art can be bad.  For example, I and many others believe that the world would have been a better place if Thomas Kinkade had been born with no arms, yet his paintings are indeed art.  Terrible art.  Ebert would have been on much more solid ground if he had suggested that video games are not good art, but he didn’t.

You would think that someone’s head would implode if he declared art an individual, subjective experience at one moment and swept a broad piece of culture into the non-art category the next, but apparently it doesn’t, and there’s little hope of someone that single-minded ever being set straight on anything he was initially wrong about.  But remember: during every new medium’s germination, there are people who stand off to the side and insist, “That’s not art!”  And every single solitary time, they’ve been wrong.

—-

Image from Thomas Kinkade’s website.


Ebert Should Stick To Movies

Roger Ebert is an excellent movie reviewer. But when he ventures out of his specialty to talk about video games, he begins sounding like an old codger who just doesn’t go for all this newfangled stuff. He maintains that video games are not and cannot be an art form. I happen to think that his article on the subject is, in fact, compelling evidence that his opinion on this matter should be given no weight whatsoever. I’m not specifically arguing that video games are art or should be treated as art, but merely that his assertion that they are inherently non-artistic is groundless based on the arguments he makes.

I am not in principle opposed to opposing things in principle. I’ve done it myself. However, to make an argument in principle, you need to provide a priori reasons why the very things that define something prevent it from ever being good. For example, I submit an idea a friend of mine is still defending: ice cream of wheat. You don’t need to make it or optimize the freezing conditions or the ratio of milk to wheat to know that this will always be a terrible idea. Ebert fails to make anything like a strong enough argument to convince anyone that video games are to art as ice cream of wheat is to tastiness. Let’s have a look at his post.

His strong opening statement that “I remain convinced that in principle, video games cannot be art” leads one to expect that he would immediately offer a reason why this cannot be true, perhaps by defining art and defining video games and demonstrating that the two are disjoint sets. Yet he immediately segues into a rebuttal of an argument by Kellee Santiago, a proponent of the gaming industry, who says that video games can be considered art. Santiago, unfortunately, takes a weak tack, and Ebert, who ought to know better, takes his counterargument as grounds to declare victory on the topic altogether.

Santiago talks about cave art and suggests that video games seem less artistic than other types of media because it’s a relatively young field and that, when it has had time to mature, we will see more artistic games being developed. This leads Ebert into a discussion of the origins of various types of art and whether the cave painters were in fact talented artists:

Any gifted artist will tell you how much he admires the “line” of those prehistoric drawers in the dark, and with what economy and wit they evoked the animals they lived among.

If that smacks of the No True Scotsman fallacy, it should; he’s defining a “gifted artist” as “an artist who admires cave paintings,” and anyone who doesn’t is not a gifted artist. That particular fallacy doesn’t matter much, but make a note of it, because we’ll see it again in more relevant contexts. Anyway, while that undermines Santiago’s argument, it doesn’t have a thing to do with Ebert’s. Certainly, his being mostly right on this point doesn’t prove that it’s impossible for something non-artistic to evolve into something artistic.

Next, he briefly makes an actual a priori argument:

One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite a immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.

A cave from a cooperative game of Dwarf Fortress I played

Ebert has listed his own counterexample: immersive games with no goals, points, or objectives.* He probably doesn’t realize it, but this is an entire game genre, called sandbox games. An example is Dwarf Fortress. You control a band of dwarves who begin with a cart full of supplies and mine mountains, grow crops, build a fortress, create goods, trade with other settlements, and fight invaders. Your settlement can die off if it’s mismanaged, but if you play well, it will grow indefinitely. There is no win scenario. I like to collect and tame as many animals as possible. Jordan likes to create elaborate defenses with flooding moats and lava traps. One couple I know named two dwarves after themselves and locked them in a room together until they got married.

I’m not arguing that Dwarf Fortress is a work of art per se, but it reveals Ebert’s profound ignorance of what video games are like. According to Ebert, Dwarf Fortress “ceases to be a game.” But you are manipulating characters–8-bit characters, no less–on a computer screen through use of keyboard controls. That’s a video game. If you’re keeping track, the No True Scotsman is back: if it doesn’t have a win scenario, it’s not a “real” video game. Ebert seems on the brink of simply defining video games as “a form of media with no artistic merit,” along the lines of definitions of pornography.

Even worse, he suggests that sandbox games are “a representation of a story, a novel, a play, dance, a film.” Here his ignorance makes his argument incoherent. Sandbox games don’t have plots. How could they? There is nothing to make your character go a certain place or do a certain thing, so there is no way to make a story progress in any understandable, let alone well-told, fashion. The more prominent a game’s plot, the more linear the game is: the only thing you can do next is the next thing in the plot. And games with plots always have objectives and win scenarios. You win when you finish the story. Depending on the game, there may or may not be wrong choices that let you lose by reaching a bad ending.

But let’s pretend he was talking about plotted games rather than freeform games. Are they only representations of other types of storytelling? Of course not. They have nothing more in common than all having a plot. Some video games are based on movies and vice versa, but the poor reception both generally receive demonstrates the vast difference between the two media. For a proponent of the media form that, after video games, is most commonly derided as non-artistic plebeian fodder, Ebert is being incredibly myopic. There probably were and are people who would say that a good movie is just a representation of a book. Nevertheless, it is a movie, and it succeeds as a movie, not as an attempt to be like a book (even, perhaps especially, if it is an adaptation of a book). I would expect a movie critic to have more perspective.

Exquisite corpse by Yves Tanguy, Man Ray, Max Morise, and Joan Miro

You can win most games, but that’s hardly an exhaustive definition. A popular game when I was in college was paper telephone. The players sit in a circle and each writes a phrase on the top of a piece of paper. Then, everyone passes the papers to the left, reads the phrase they got, and draws a picture of it just below. They fold over the top of the paper so that only the picture shows and pass the papers again. Then each player writes a description of the picture underneath it. They continue passing the papers, alternating drawing a picture and writing a description, until the pages are full. Then everyone unfolds the papers and looks at how things progressed. Undoubtedly a game, but nobody wins. There are no points or objectives and scarcely any rules.** What’s more, the outcome is distinctly artistic. A similar game involves folding a paper into thirds and having one player draw a head, a second player a body, and a third player legs. It’s a game–but it was invented by the Dada artists and Surrealists. The product is a work of art. Slippery, isn’t it?

Art is difficult to define, and this becomes the crux of Ebert’s article. He never suggests a concise definition, which is probably just as well, given his strange criteria, such as:

I do not believe collaborative art cannot be art. I cite cathedrals and tribal dances as collaborative works of art. But they begin with an auteur with an original vision — whether that be a king, an architect, or a choreographer. The film director usually has the original vision.  (from the comments)

Remember, this guy is a movie critic.  Difficult to believe, isn’t it?  He’s ruled out any kind of cooperative work based on multiple contributions, from the exquisite corpse up there to most non-classical music, stripped the title of “artist” from the people who most deserve it (actors, painters, singers) and given it to the people who didn’t do anything except sit on the sidelines and say “I’d like something like this.”  Besides, it’s a pointless criterion, since it doesn’t rule out particularly more non-artistic things than artistic things.  It doesn’t even rule out games: Dwarf Fortress was completely created by one person.

After criticizing various definitions of art put forth by Santiago, he finally decides:

How do we tell the difference? We know. It is a matter, yes, of taste.

And with that, he loses his own argument. He has conceded that art is subjective and taste-based. Therefore, if he finds video games to be unartistic, it means nothing more than that he doesn’t like video games. If artistry is subjective, there can be no grounds for rejecting an entire media form a priori. He goes on to criticize particular video games put forth by Santiago as examples of artistry, but there’s no reason to give an ounce of credence to what he has to say. As C.S. Lewis observed, one should not criticize specific works within a whole genre that one dislikes:

Otherwise we shall find epics blamed for not being novels, farces for not being high comedies, novels by James for lacking the swift action of Smollett. Who wants to hear a particular Claret abused by a fanatical teetotaller, or a particular woman by a confirmed misogynist?***

The fact that he criticizes the games individually is itself a concession. If there is reason to point out the things that are wrong with a game, there is the possibility that those things might have been right; thus, if certain elements of a game are unartistic, that concedes that those same elements, done differently, might have been artistic.

The three games put forth by Santiago are Waco Resurrection, Braid, and Flower. Ebert’s individual criticism is nothing more than simplistic bashing exacerbated by the nagging problem that he hasn’t played any of the games in question and is basing a complete write-off on two minutes of non-interactive trailer. This is actually worse than evaluating a movie based on a trailer, because at least a movie trailer shows two minutes of what the viewer’s actual experience will be like, whereas passively watching something is never a good approximation of the actual experience of playing a game. This is why games release demos, but needless to say, Ebert couldn’t stoop to playing a free demo. It wouldn’t matter anyway. He’s resolved to dislike video games under all circumstances.

He belittles Waco as “one more brainless shooting-gallery” and criticizes it for not engaging the senses and emotions…based on a trailer. Additionally:

The graphics show the protagonist exchanging gunfire with agents according to the rules of the game.

There again with the rules. I’m sure he’s disappointed when a movie camera captures images according to the rules of reflectance and transmission of light.

I’ve never played Waco, but Jordan does own Braid, and Ebert’s criticism of it is even more puzzling. The whole premise of Braid is that the player can control and reverse time. Ebert doesn’t like this:

In chess, this is known as taking back a move, and negates the whole discipline of the game.

He seems to be suggesting that reversing time would make the game too easy, but the dangers of reviewing a game you’ve never played are showing. Braid has an exceptionally steep learning curve. You don’t just reverse time; you slow it, manipulate it, and predict its consequences to achieve goals that would be impossible in linear time. I would have thought Ebert would be more receptive to the use of non-linear time; he gave Memento three stars out of four. But then, he just doesn’t like video games. Ever.

Braid's title screen

But the real problem with his comment is that he’s criticizing the gameplay. Whether or not Braid requires discipline is utterly independent of its artistic value. Indeed, a major problem with Ebert’s review is his confusion about what he means by “video games.” Is he referring to the gameplay, or the act of playing the game, or the game’s sound and graphics? At times like this, he seems to be referring to the second, which I doubt anyone would consider to be art, except in the Latin-root sense of “skill,” as in “Zen and the Art of Motorcycle Maintenance.” The third ought to contain the possibility of artistry completely aside from the fact that it’s part of a game: Ebert doesn’t consider chess to be art, but as he was forced to concede in the comments, a beautifully made chess set could be. He fails to grasp the implications, though: a graphically beautiful game is both chess the game and a beautifully crafted chess set.

Ebert says that the between-level story in Braid “exhibits prose on the level of a wordy fortune cookie.” Ponder that for a while and try to guess what he’s trying to say. Is he criticizing the game for not using enough words? Is he suggesting that Braid is written in Engrish? Neither, of course; it’s meaningless bashing because he’s committed to not liking any part of any video game. Otherwise he might have mentioned that the story is told almost entirely through pictures.

Moving on to Flower, he begins by showing his age:

Nothing she shows from this game seemed of more than decorative interest on the level of a greeting card.

He can put the greeting card next to the fortune cookie. But I just realized why all this sounds familiar. Roger Ebert was on the AP committee that evaluated my AP Art portfolio in high school. You know, the one that only got a 3 (2 in the technical category) because they didn’t like digital art? I’d thought that the resistance against art made on the computer was a thing of the past, but I suppose it persists in the over-65 crowd. Anyway, he’s negating his own argument again. If Flower isn’t pretty enough, that implies that it’s possible for a game to be pretty enough. By the next bit, his ignorance is getting simply tiresome:

Is the game scored? She doesn’t say. Do you win if you’re the first to find the balance between the urban and the natural? Can you control the flower? Does the game know what the ideal balance is?

All together now: Just play the game. All your questions will be answered. And stop criticizing gameplay, as if whether or not you control the flower determined whether the game was artistic or not. The last point seems to be a jab at the game for, presumably, providing a single answer to a complex, subjective question, as though any movie about the relationship between man and nature wouldn’t have to do the same.****

Yes, this is a video game.

Even if Ebert’s criticism of these three games were warranted, they aren’t the whole of the gaming industry, and even a casual gamer could easily throw out a few others worth mentioning. Okami was disappointing as a game, but just look at it. Then there’s the deliciously clever writing of Portal, the mysterious atmosphere of Myst, and my favorite, Beyond Good and Evil, notable for its beautifully immersive graphics, solid characters, and Christophe Heral’s wonderful soundtrack.

Moving on, we reach the patronizing bit:

Why are gamers so intensely concerned, anyway, that games be defined as art?…Why aren’t gamers content to play their games and simply enjoy themselves? They have my blessing, not that they care.

Why can’t Roger Ebert admit that movies are meaningless opiates for the ADD-stunted, popcorn-stuffed masses?

His conclusion:

The three games she chooses as examples do not raise my hopes for a video game that will deserve my attention long enough to play it. They are, I regret to say, pathetic.

This is the sort of statement that casts not only his opinion of video games but all his opinions (which is to say, his entire career) into serious doubt. He evaluates things based on his prejudices without bothering to verify them. Why on earth would anyone listen to a word such a person has to say on any topic?

In conclusion, Roger Ebert is a peevish geezer who wants the video gaming industry to get off his lawn. He should stick to movies, but I wouldn’t read any of his reviews if I were you. None of his opinions deserve my attention long enough to read them. They are, I regret to say, pathetic.

—-

*Of course sandbox games still have rules, but that is hardly relevant. Since video games have to use controls and are not restricted by the laws of the natural universe, they have to set up rules by which the interactions take place; say, that spacebar means jump. Criticizing them for this is not much more meaningful than criticizing actors for obeying gravity. Dwarf Fortress, in fact, follows the laws of the natural world rather closely–up to and including thermodynamics (warning: the video is a little gory).

**There are, however, sufficient rules that I once got in an argument over them. The point of contention was whether the game should stop when the paper is full, or when the paper has traversed the circle and returned to the person who started it. I still maintain that the former is the better rule, leading to a more pleasing finished product and allowing the game to be played with as few as three people.

***C.S. Lewis, Of Other Worlds, p. 60

****Of course the movie doesn’t have to just say “This is the correct balance between man and nature,” but neither does the game. They do, however, both have to make statements about a subjective topic.

Exquisite Corpse found here. Braid screenshot found here. Okami screenshot found here.
