Sunday, January 30, 2011

Living (I Presume)

"Life.  Don't talk to me about life."  - Marvin, the "Paranoid Android" from Douglas Adams's Hitchhiker's Guide to the Galaxy

There's a place between our theoretical ideas about existence and the reality.  The amazing thing is, no matter how stone-cold practical we are, no matter how tough, no matter how wise, no matter how brilliant, there's still that gap, and we still inhabit it.  What do I mean, exactly?  Oh, but this is one of those posts, one of those times when What I Mean is too tied up in What Does It All Mean to be clear.  Nevertheless, we write on, we live on.

It strikes me that we're so obsessed with life (and the universe, and everything) that we forget living.  Or is it the other way around?  Are we so obsessed with living that we forget life?  The latter is closer to one of T.S. Eliot's grimmer lines, anyway: "Where is the Life we have lost in living?"  I guess it depends upon which word we use to signal which place in our minds.  Is life the place we inhabit, or is living?  Or, worse, do we inhabit neither, operating as mere automatons in some great celestial or spiritual machinery?

Between theoretical ideas about existence and the reality...  What are our theoretical ideas, and what is the reality?  Well, that's easy enough to say.  Everything we believe is theoretical, and the reality something we can't really access, limited as we are by our senses and, above all, by the inexorable march of time.  Philosophers throughout history, it seems to me, have argued about exactly how impossible it is to get to reality given our limited capacities for thought and perception - granting, of course, that our capacities far exceed those of other animals - ranging in conclusion from thought is reality to perception is reality to there is no reality at all to it doesn't matter because God will save and/or destroy us anyway to let's play backgammon instead of thinking about this anymore.  Maybe the last of these is the most wise, after all, not because it is true, or right, but because to absorb oneself in a game might be the best way to inhabit our mysterious existence joyfully.

Then again, it might not be.  The irony of being counter-cultural is that you need a culture to counter.  Since all humans are a part of some culture or another, it stands to reason that we're never without opportunity to follow or break cultural mores.  It is still the case, however, that breaking the rules is irrelevant once you become invisible: consider the trouble-making student, who is only relevant because his type is so numerous; the individual who sneaks out of class doesn't really affect anything - including himself - all that much.

That is what it means to be obsessed with life.  We are so concerned with a troublemaker, with what happens to him, with whether he is caught, with how disruptive his actions are.  We take our cultural belief systems and apply them to the situations we see around us, heaping praise and scandal on the people who stand out (often the counter-cultural ones) because we think it matters so much.  But does it, really?  Why are we so concerned?  Why do we insist, with so much vehemence, that people do what we think is right?  The origin of imperialism is in this Christian sense of Life, and with it comes a much more profound death: a death of the spirit, of ingenuity, of love.  And, of course, of all of the non-Christians.

Of course, I'm essentially quoting Nietzsche here.  And don't get me wrong; just because we're the children of Puritan forefathers doesn't mean that we're alone.  The obsession with Life - the need to pry into other people's business, indeed, the need to see that others believe and act and do unto in the same way that you do unto - is hardly unique to Christianity, or indeed to the West.  Social creatures, it turns out, find it impossible not to meddle with other social creatures.

All of which is neither here nor there, as the saying goes.  The real point is that we don't inhabit either of those spaces: much as we are obsessed with life, we don't really understand - can't afford to understand, as David Hume might have it - that our beliefs are all predicated not on illusions, per se, but upon extremely limited information.  We are so obsessed and so certain precisely because we are so uncertain, because no amount of research or prayer can ever get us closer to really knowing what is right.  And, what's worse, we start to suspect that maybe there isn't some universal, timeless, "what's right" to begin with.  We start to think that, maybe, words are just that, words, and the whole problem with trying to understand life is that, as humans, we are forced into "trying to understand life," a series of sounds (or symbols) that only have meaning thanks to an accident of evolution (brains capable of rendering meaning out of sounds) and to their cultural situation.

Which isn't to say that words aren't useful, or that there's some problem with being alive.  Far from it.  Living is wonderful, replete with joy, friendship and love, eating, thinking, writing, playing games, and countless other good things.  On the other hand, it's also full of pain, disappointment, unemployment, and frustration.  Of course, even those are a part of the joy of living, because they not only can be set against the good things, but indeed can mingle with the good things to produce a more interesting tapestry.  How many people love the feeling of completing a marathon, for example, not despite the pain, but because the pain makes it more sweet?  Isn't it the case that the difference between a romance novel and a pornographic story is mostly a matter of how disappointed the protagonist is for the balance of the narrative?

Sometimes, however, we get so caught up in life that we forget living altogether.  Not that, again, we should ignore philosophies and theories and ideals.  Far from it: those things can be a part of the pleasure and pain of living.  But to inhabit them, to live only for life or - as Nietzsche might point out - for death, strikes me as the heart of silliness.

Ah, the irony here is too ripe.  "Not that we should..." is exactly what I'm writing against, no?  That's not a rhetorical question; it's a real one.  You see, I'm just as trapped as everyone else, inhabiting some place between life and living, between human and animal, between good and evil.  That I recognize it gives me no special benefit.  And that, ladies and gentlemen, boys and girls, is the real lesson.

Wednesday, January 26, 2011

Winning and Losing with Rafael Nadal

Now that I've returned from Portland,* I've also returned to watching the Australian Open.  That means that, last night, I watched Rafael Nadal do something he almost never does.  I watched Rafa lose.

* Portland is an unusual place.  Upon returning to Honolulu, I feel very much like the guy in this TV show trailer.

Rafa's opponent was David Ferrer, a fellow Spaniard ranked number seven in the world.  Ferrer plays - or at least played last night - with ferocity and physicality despite his advancing age.  At 28, he is reaching the end of his productive tennis-playing years, and indeed much as broadcasters and writers lament the possibility that a Roger Federer or Andy Roddick might not have many more chances to win a Major, it's players like Ferrer - who has never won a Major or Grand Slam event, but finds himself in the top ten late in his career - who remind me of the cruelty of a sport so devastating to the body that players only rarely remain in form past 30.

Last night Ferrer was on top of his game, however, obviously far from imagining the inevitable end of his career within the next few seasons.  He was every bit Nadal's match, taking the game to him in a marathon, 20+ minute second game in which he eventually broke Rafa's serve.  More importantly, however, Nadal injured his hamstring, likely because the virus he had been fighting throughout the tournament had left him more vulnerable to injury.

I'm not writing to extol the virtues of tennis as a one-on-one sport,* nor to recount the stories of Ferrer or Nadal (the latter of which, if you watch any tennis at all, you probably know).  Rather, I want to talk briefly about what happened after Nadal hurt himself, about how he lost to Ferrer, but simultaneously demonstrated why he wins so much.

* It does seem to me that only the pitcher-batter matchup in baseball comes close to the one-on-one intensity of tennis.  Most of our other popular sports are built around working together as a team, around exploiting weak opponents by matching them against strong players.  In tennis, there is only you and the other guy, and in the Majors especially, where matches are best-of-five sets, the competition can go on for four or five hours of running, pivoting, and swinging with everything you have.  It's grueling physically and psychologically.

With his hamstring bothering him, Nadal took a lengthy injury timeout, and was consistently seeing a trainer between service games for the rest of the three-set sweep by Ferrer.  The announcers speculated that Nadal would forfeit, as did the tournament organizers, who contacted the mixed doubles players who were next to go on the court and told them to "be ready" as early as the first set.  Nadal's body language was bad, to say the least.  He was glancing at his box of coaches and supporters almost every point, grimacing and shaking his head.  Ferrer, meanwhile, was showing no mercy, attacking hard and swinging with abandon, trying to knock Nadal out of the match.

Ultimately, Nadal was able only to win a couple of games along the way, losing 6-4, 6-3, 6-2, each set tilting further towards Ferrer.  What was impressive, then, wasn't how Nadal played, but that he played, that he still tried to hold serve in his games, and that he fought to break an opponent moving and hitting better than he had any hope of doing with his leg hurt.  All the while, on change-overs, when the players switch ends of the court, Nadal would sit and stretch and receive treatment, nearly crying at the lost opportunity that defeat last night meant (he had won 3 Grand Slam events in a row, meaning a win in the Aussie Open would have given him a "Rafa Slam," as the media had dubbed it: he would have been the defending champion of all four of the biggest tournaments simultaneously).

Because tennis is such a physical game, even a minor injury can be the difference between easy victory and total defeat.  One can only imagine that Nadal's injury was not so bad, because he continued to play.  But watching him play, it was hard not to imagine that his hamstring was worse than almost anyone else would dare to play on.  He could barely move to try to get to Ferrer's smashes down the line, shots that a healthy Rafa would return easily.  Perhaps more telling than any physical sign, however, was that he looked defeated.

What separates Rafael Nadal from the rest of the tennis world, then, is that he played anyway.  He played because he was on the biggest court at prime time, because his was the most anticipated match of the day.  He played out of respect for tennis as a sport, and for Ferrer - his countryman and Davis Cup teammate - as a player.  He played because, deep down, I think a part of him believed he could still win, believed that if he just hung in there and put enough shots in play Ferrer would start making mistakes, would start missing on those shots that were just barely clipping the baseline.

In short, Rafael Nadal showed why he has won so much in his career, even though he suffered one of his most difficult losses.  It's almost a cliche in sports, but even the best players lose, and often how a player confronts losing tells you a lot more about their ability to win than anything else.  In contrast to his first-round opponent - who forfeited the match to Nadal midway through the second set thanks to the beating he was receiving - Nadal stood in and took his beating, demonstrating why, in addition to his physical gifts, he has become the best player in the world.  He showed the psyche of a great player last night, the mind and determination of a winner.

It would have been so easy for Nadal to give up, to forfeit the match to Ferrer in the second set, by which time it was clear that he was going to lose.  But he didn't.  He kept fighting, he played as hard as he could with his leg barely working, he hit winners and pumped his fist.  I'm fond of saying that one of the most important qualities in a professional baseball player is that, even though they make an out some 60 to 70 percent of the time, they legitimately believe they will get a hit every at bat.  The same is true for tennis players; even though, on this night, Nadal knew he would struggle to win points, deep down he undoubtedly believed - without fooling himself - that he would win every point.  Or, at the least, that he would make Ferrer earn every point.

Monday, January 24, 2011

Brief Thoughts on Writing Poetry

I'm on a surprise visit to my brother in Portland, hence the gap since my last post.  Regular posting will recommence upon my return to Honolulu.

Writing poetry is difficult.  Anyone who has tried knows this.  Or rather, anyone who has tried and come out at the other end with a terrible, boring, poorly-constructed poem despite the desire and inspiration to write poetry in the first place knows this.  The thing is, there’s no mechanical process by which the writing of a poem is made easier.  Sure, the constraints of the sonnet form, for example, help to structure the poet’s thoughts, to force limitations on rambling by imposing metrical regularity and a rhyme scheme.  But, while in many ways it is thus easier to write a passable sonnet than a passable free-verse ramble, it is also harder to produce anything at all.

I think, though, that the biggest barrier to writing poetry is a cultural one.  Poetry is written by the immature, by adolescents trying to cope with the wild emotional rollercoaster that they suddenly find themselves on.  Poetry expresses – poorly – their lust, their fear, their sadness, and their pleasure.  Who, having passed the age of 20, still experiences the world in colors as bright, in emotions as intense, as those of puberty?  Who – among those who have written poetry, anyway – is not embarrassed by the poems they wrote in their youth?

What is there to write about, as an adult?  So much modern poetry, it seems to me, is either about itself, or about empty themes like the realities of urban life or political frustrations.  In some sense, I suspect, the themes of Whitman and Eliot and Kerouac and Stevens – our more recent “great” poets – have been exhausted.  How much more so the themes of Keats and Yeats and Tennyson?  That is not to say one may not write a great poem, in our modern age, about the fear of aging and death, but how could it compare with “The Love Song of J. Alfred Prufrock,” filled as it is with self-consciousness and shame at its own pretension, with a sense of pathetic lust, with imagery borrowed from the profound to the absurd?  Could there be a better statement of what it is to be a young man knowing that he will one day be an old man?

Instead, poetry has become the work of the clever, and indeed it must be exceedingly so to be purchased at all.  Or, rather, there are no poets anymore, at least not poets whose primary occupation is poetry, because the act of writing good poetry might be opposed to the act of writing poetry that people are willing to purchase.  Even someone like Eliot, were he not famous, would likely never be read at all, because his good poetry is hard to understand.  And hard-to-understand doesn’t cut it in the marketplace.  Why else would so many people go to see Little Fockers instead of, well, anything else?

The mistake, here, would be to think of poetry too much as an outcome, a mistake that is pervasive across our society.  Because we are so inundated with a historical account – of not just art, but science and politics and you name it – that is concerned mainly with the outcome of “great” poets and their great poems, we forget the hundreds and hundreds of poems that even those poets wrote which are not great.  We forget about first drafts, second drafts, third drafts.  We forget about the process of writing poetry, imagining that the great outcome might come from merely trying to write a single poem, without practice, without thought – forgetting that the goal is the ability to write poetry, and not merely to have written poetry.

Monday, January 17, 2011

The Precession of the Equinoxes in Astrology

The New York Times, early this weekend, had breaking news for all us stargazers.  It turns out, they report, that the astrological signs in the sky no longer match up to what they were thousands of years ago, when astrology first came into being.  The result: if you're a Cancer, the sun isn't really set against the constellation Cancer when you're born.

The opening sentence of the article, however, is a snarky and ignorant attack on astrologers everywhere: "Astrologers, not surprisingly, say they knew this would happen."  Well, duh!  Of course astrologers knew this would happen!  Claudius Ptolemy - who developed the mathematical system that astrology is based upon back in the second century - included the "precession of the equinoxes" in his seminal work.  The precession of the equinoxes has been a part of astrology since the very beginning.

That doesn't stop the writer of the New York Times piece from doing a straw-man, ad hominem hack job on astrologers everywhere, however, asserting in essence that the practice is just vague, meaningless mumbo-jumbo.  Now, I've been interested in astrology for long enough to know that some people will simply never take it seriously, but for those of you who are open-minded enough to at least consider that astrology might, for whatever reason, have some merit, read on.

The precession of the equinoxes is indeed built into astrology.  You've probably heard about the "Age of Aquarius" that we have now entered.  Almost certainly, however, you have no idea what makes our modern era the Age of Aquarius, and, likewise, you probably don't realize that before this we were mired in and/or blessed with the Age of Pisces.  You see, the current astrological age comes from where the sun is on the first day of the astrological new year (that is, on the spring equinox).  Because the stars drift backwards very slowly - taking about 2,000 years per sign - we move through the signs in reverse.
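For what it's worth, the arithmetic here is easy to check.  Here's a minimal sketch in Python, assuming the commonly cited axial-precession period of roughly 25,800 years (the exact figure, like the age boundaries below, is debated):

```python
# A minimal sketch of the arithmetic behind the astrological ages.
# Assumes a ~25,800-year precession period; all figures are approximate.
PRECESSION_YEARS = 25_800
YEARS_PER_AGE = PRECESSION_YEARS / 12  # twelve signs -> ~2,150 years each
print(f"Each age lasts roughly {YEARS_PER_AGE:,.0f} years")

# The ages run through the zodiac backwards: after Taurus comes Aries,
# after Aries comes Pisces, and so on, roughly every two millennia.
ZODIAC = ["Aries", "Taurus", "Gemini", "Cancer", "Leo", "Virgo",
          "Libra", "Scorpio", "Sagittarius", "Capricorn", "Aquarius", "Pisces"]
age = ZODIAC.index("Cancer")  # start where the list below starts
for start_year in range(-8000, 2001, 2000):  # rounded 2,000-year steps
    label = f"{-start_year} BC" if start_year < 0 else f"{max(start_year, 1)} AD"
    print(f"{label}: Age of {ZODIAC[age]}")
    age = (age - 1) % 12
```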

Consider the history of the world from the perspective of astrological ages, to get a sense of how astrology may actually be in the right here:

8000 BC to 6000 BC - Age of Cancer

The transition away from hunter-gatherer societies and the rise of the first agricultural communities.  Cancer, being the sign of the home (and of food), is particularly appropriate.  Under the influence of Cancer, humans settle, invent newer, more reliable ways to access food, and start to cultivate family relationships (Cancer's emotional attachment is at play here as well) for the first time.

6000 BC to 4000 BC -  Age of Gemini

Gemini is the sign of communication and short-range travel.  Here, it's particularly important for its relationship with practical wisdom: that is, Gemini is a sign of thinking practically, and knowing how to operate in a way that allows for survival.  The development of pottery, plowing, and early trade between communities all fit well into the Age of Gemini's profile.

4000 BC to 2000 BC - Age of Taurus

Taurus is a sign of laziness, but also of determination.  More importantly, however, Taurus is a very possessive sign, interested in beauty and value.  The development of early currency systems, as well as early works of art and literature, fits well in the Age of Taurus.  Moreover, city-states began to develop walls and armies at a scale previously unseen, reflecting the sense of ownership that Taurus bestows.

2000 BC to 1 AD - Age of Aries

A sign of, well, war.  It's no accident that turbulent conflicts marked the Greek and Roman eras.  Aries, however, is also aggressive in developing opinions and ideas, and the flowering of civilization in Greece and Egypt during this time period demonstrates a love of debate (sophistry and/or philosophy, in Greece) and a general striving for excellence that is indicative of Aries.

1 AD to 2000 AD - Age of Pisces

The most recent age is that of Pisces, an age of deep spirituality, of rediscovering the past, and of self-deceptions.  Pisces is a transitional sign, always, and the numerous revolutions that have dotted the last two millennia indicate both the covering and uncovering of truth and the lack of unity in intentions that go with Pisces.  The so-called "Dark Ages" or "Age of Christendom" offer a strong example of Piscean influence.  But, indeed, there is no better indication than the disagreement about which title is correct for the time from the fall of Rome to the Renaissance.  That both have a claim suggests Pisces has been at work.

Now, don't take these dates as perfect.  There's not really agreement about exactly when a given age ends or when a new one begins, and that's ok.  When it comes down to it, the transitional points between ages are just as important as the ages themselves, and those transitional points tend to include influences from both signs.  Our contemporary world, for example, shows strong Aquarian aspects: the rapid development of technology, the rampant (compared to the past) idealism, the desire for change; but it also still shows Piscean qualities, like religious warfare and conflict, and corrupt and deceptive governments.  Such is to be expected.  Just as a cusp in the birth chart indicates that both signs have influence, the same can be said about the Ages.

The influence of the astrological age, however, is limited by the distance of the stars at work.  The specifics of day-to-day life, or even life-to-life generations, are not really touched that much by the current astrological age.  Instead, astrology is concerned with smaller units of measurement.  You see, the signs are based not upon the stars, but upon the seasons.  While Leo derives its name from the lion-like constellation it was found under two thousand years ago, its qualities are attributes of mid-summer, of a time after the solstice but before the autumnal equinox.
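To make that concrete: the tropical zodiac I'm describing is effectively a calendar, twelve 30-degree segments of the sun's apparent path counted from the spring equinox.  A rough sketch (the cusp dates below are the conventional approximations; a real chart computes the sun's ecliptic longitude exactly):

```python
# A rough sketch of the tropical zodiac as a seasonal calendar.
# Cusp dates are conventional approximations, not exact computations.
CUSPS = [  # (sign, (month, day) on which it begins)
    ("Aquarius", (1, 20)), ("Pisces", (2, 19)), ("Aries", (3, 21)),
    ("Taurus", (4, 20)), ("Gemini", (5, 21)), ("Cancer", (6, 21)),
    ("Leo", (7, 23)), ("Virgo", (8, 23)), ("Libra", (9, 23)),
    ("Scorpio", (10, 23)), ("Sagittarius", (11, 22)), ("Capricorn", (12, 22)),
]

def tropical_sign(month: int, day: int) -> str:
    """Return the approximate tropical sun sign for a calendar date."""
    sign = "Capricorn"  # dates before January 20 fall in late Capricorn
    for name, start in CUSPS:
        if (month, day) >= start:  # the latest cusp already passed wins
            sign = name
    return sign

print(tropical_sign(8, 1))  # Leo: mid-summer, whatever the stars say
```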

Some astrologers, it is true, use the sidereal zodiac, choosing to interpret charts based upon what is really in the sky.  It is these astrologers who are concerned with measuring the actual size of the signs (hint: not all 12 cover 30 degrees of the 360 in the ecliptic, the band through which the sun and planets move).  It is, ironically, these astrologers who are most subject to the criticisms of astronomers and less-informed New York Times journalists, as well.  I prefer the traditional (tropical) zodiac for exactly this reason: its mathematics are clearer, and its interpretations are rooted not in astronomy, but in mythology.

Don't get me wrong, astronomy and astrology are deeply intertwined.  It is important, however, to recall the difference between the two, going back to their linguistic roots.  "Astronomy" means, simply, the rules of the stars.  It is a science designed to calculate exactly what is going on in the heavens.  "Astrology," however, means the story of the stars, or the interpretation, or the logic, or the account.  It is, regardless of exact translation, a lyrical and interpretive study from the outset, concerned as much - or more - with why we call Pluto "Pluto" as with whether Pluto is a planet.  Why?  Because the account astrology gives of human life, at a grand scale or for a given individual, is designed to take into account the mythological and literary history of mankind as well as the mystical influences of the stars.  Indeed, some astrologers, like myself, would argue that astrology has nothing to do with the stars at all, that the stars are just a convenient metaphor for our hopes and dreams and desires, and that finding order in their motions operates by analogy and mythology.

You might think the results, here, would be chaotic, and in some sense they are.  I would argue, however, that they are also meaningful in that they provide an interpretive framework for a chaotic world.  Astrologers use that framework as a starting point for a conversation, as a way to enter into our inner cultural and metaphysical lives.  In my experience, it works.

Whether you believe it works or not, however, at least know the facts.  Any criticism of astrology should challenge its interpretive frameworks - its logos - not its understanding of the location of the stars - its astro.  The latter is accurate.  The former?  Up for debate.  I'd venture, though, that a good astrologer can wow even the staunchest skeptic.

Wednesday, January 12, 2011

Graphics or Gameplay

I loathe dichotomies, because it seems to me that usually the things we see as opposed are more complementary than we think.  Opposition itself is a kind of relationship, after all, and a close one at that.  Up and down, I am fond of saying, are a lot closer to each other than up and penguin are.

Given my distaste for things which are allegedly, but not actually, "mutually exclusive," you may not be surprised to learn that I believe that good games often have both good graphics and good gameplay.  Often, in the modern gaming world, it seems to me that you get one or the other; the development team blows its budget on good graphics designers or good game designers, but rarely both.  And, what's more, even if they want to do both, hardware may not be up to the challenge.  Really good gameplay - and especially a strong AI - might take up too much processor power and memory to make it possible for really nice, anti-aliased, 3D, shiny graphics to be feasible, even on a super-machine.

Of course, my natural position is to reconcile the paradox of gameplay and graphics by pointing out that graphics are a part of gameplay.  It's easy for a gamer to get caught up in thinking about the graphics as being icing on the cake, but graphics go beyond what's happening in the center of the screen.  The UI (user interface) is a part of graphics and gameplay, for example.  The PC (player character), if there is one, should look good, but also serves the vital gameplay function of telling the player where he is.  That maybe goes without saying, but I think that many in the gaming world - or at least game players - have largely forgotten about the importance of functionality in graphics.

The dragon on the menu screen might be a bit much
To me, then, good graphics are functional graphics.  Age of Wonders may not be the flashiest, most beautiful game ever made - and no wonder, since it's some 10 years old now - but the graphics are extremely functional.  The player can tell, at a glance, what his army's composition is, where they are on the map, where the nearest towns are and how big those towns are, and even how much income those towns have in the form of farmland.  All of that is modeled in 2D sprites, of course, but the advantage of 2D sprites is not to be ignored: they free up processor and memory resources for other things, like running the AI.

In the modern computing world - as we've gone to multi-core processors - this is even more important.  A game like, for example, Elemental: War of Magic (in many ways a spiritual successor to the Age of Wonders series) is modeled in 3D, but with low-intensity graphics and the option to play on an essentially 2D cloth map.  The relatively non-taxing graphics, combined with multi-core, threaded processing support, allow the AI to run in the background (to plan its next move) while the player is playing.  Hence both a better AI and quicker turn times, each leading to improved gameplay.
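I'm only speculating about how Elemental structures this internally, but the general pattern is easy to sketch: hand the AI's planning to a worker thread, and let the main loop keep serving the player, picking the plan up whenever it's ready.  All the names below are illustrative, not from any real engine:

```python
# A minimal sketch of the "AI plans while the player plays" pattern.
# Everything here is hypothetical; real engines are far more careful
# about synchronization and frame timing.
import queue
import threading
import time

plans = queue.Queue()

def ai_planner(turn: int) -> None:
    # Stand-in for expensive work: pathfinding, evaluating strategies, etc.
    time.sleep(1.5)
    plans.put(f"turn {turn}: mass armies on the eastern frontier")

# Kick off planning on a background thread at the start of the player's turn.
threading.Thread(target=ai_planner, args=(1,), daemon=True).start()

# Main loop: stay responsive to the player; collect the plan when it's done.
while True:
    try:
        print("AI is ready:", plans.get_nowait())
        break
    except queue.Empty:
        time.sleep(0.05)  # render a frame, process player input, and so on
```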

I want to talk about four games I've played recently, and how they manage the graphics / gameplay issue.  All four, I think, do a good job (if for different reasons), and all four are from genres I prefer: RPG, strategy, and simulation.  I'm not going to bash graphics-happy, but fundamentally crappy FPS games (oops, I just did), or lament how they are destroying PC gaming or how gameplay is dead or anything like that.  No, all four of these games are contemporary, and while they may not sell like the flashier, more braindead fare that even the least game-happy of you hear about (games like Call of Duty, Civilization Revolution, or World of Warcraft), they are an indication of the way in which PC gaming is improving, even in the face of graphics-first consoles.*

* This isn't totally fair either.  While AAA titles on the consoles certainly tend to be dumbed down, there are also fantastic innovations happening on Xbox and PS3.  Games like Flower and Flow, for example, are fantastic implementations of console technology.  Likewise, small developers like Penny Arcade are making hilarious and amazing cross-platform RPGs.  Not everything on console is an FPS or the latest Madden game.

Avencast: Rise of the Mage

Wandering the Kyranian Wastes in search of demons
Avencast is an action RPG made by a very small team of programmers and artists.  The voice acting is mediocre, the story is a bit predictable, and while the graphics are nice enough, they're not going to blow anyone away.  What makes Avencast good, then, is its gameplay.

I'm not an action RPG buff by any means.  Generally speaking, the "action" there means no thinking, no strategy.  It means clicking on enemies to kill them, over and over and over.  That is still true in Avencast, to a degree, but not in the way you're used to if you're a Diablo fanatic.  Rather, in Avencast you are - not surprisingly - a young mage, and your endless clicking casts a variety of spells, ranging from the tactical (freeze your opponent for no damage) to the devastating (unleash a meteor shower on your foes and wipe them all out).

The spell casting system is fun, because it's all key-combination based.  Rather than binding spell one to the "1" key and spell two to the "2" key, you press a combination of movement keys in succession, and then cast the spell.  This feels a bit more magical - if only marginally so - than many similar games.  Taking into account the motions, tactics, vulnerabilities, and resistances of your enemies, you have to be careful about which spells you use when, meaning the gameplay isn't just hack and slash.

But it mostly is.  I don't think Avencast is a great game, by any stretch, but it's a good one.  There is one thing that is great about it, however: the graphical rendering of spells.  Spells in Avencast look great, but, beyond that, they look how they should.  That's an important distinction.  Many magic-based games have generic spell effects that do little to differentiate a fire spell from an ice spell, or a shielding buff from a stat-raising one.  In Avencast, on the other hand, every single spell looks exactly right.  Your fire-wall looks like fire, and then when it gets upgraded to be half-fire, half "soul magic," it retains its fire billows but blanches to a white, soul-magicy hue instead of red.  The aforementioned meteors hit randomly in a wide area, and where they hit really matters.  Sometimes they miss, and you can tell.  The quick, "oh no I'm about to get hit" shield that you can cast blocks incoming attacks or effects for exactly as long as it is visible.

All of that sounds like a no-brainer, but not all games do it.  Avencast does it so brilliantly that, in the rare event that you fight another mage, you really feel like you know what spells the mage is casting (even if you've never seen the spells before) simply from the animations.  That's a good use of graphics to improve gameplay.  Attractive and functional is the right combination.

Europa Universalis III
 
China has always been huge
Paradox Interactive recently released a 5th expansion for Europa Universalis 3, a game that has been around since before I graduated from college.  Indeed, "Divine Wind," the newest add-on, is the second since Europa Universalis Complete was released.

Though still in need of patching, Divine Wind improves the graphics of EU3 considerably (if subtly), and also adds even more depth to what is already one of the deepest games there is.  I've written about EU3 before, but to summarize: it's a grand strategy/simulation game based upon the history of the world from 1400 until the early 19th century.  The player leads a historical world power starting any time during that period (and ending any time), and plays without victory conditions, in an attempt to mimic or distort real history by expanding, fighting wars, engaging in diplomacy, colonizing, getting caught up in the Reformation, and so on.

What makes EU3 so impressive is how much is going on in the background.  There are literally hundreds of nations, each with its own AI plugging away and making decisions.  The computing power needed to pull this off means no fancy 3D graphics.  But, as you can see in the screenshot, the game still looks attractive - if you like nice maps, anyway.  Armies and navies are easy to command, and indicators pop up to tell you which provinces are doing what.  Perhaps most useful is the menu system - which hardly looks or feels like menus at all - that you use to manage the nuts and bolts of your empire / merchant republic / monarchy / tribal despotism.  Information on your current ruler and heir is presented succinctly and conveniently, hiring new advisors is made easy, with all the information you need on one page, and managing your economy - while an art that takes practice - is only hard because of the game mechanics, and not because of the interface.

EU3 has one of the steepest learning curves of any game I have ever played.  Indeed, I have owned various versions of it for almost four years now, and I'm still learning to play.  Only recently, for example, have I fully understood the drawbacks of expanding (the hit your research takes, for example, with more provinces, especially those of a different culture).  But one thing is clear: the folks at Paradox chose gameplay first, and then chose graphics that are not bad, but rather that complement that gameplay and are immersive to the armchair-leader who is more fascinated by changing borders and movements of armies than by individual soldiers wiping their noses.

Football Manager

Football Manager is probably the best-selling simulation game in the world.  You may not have heard of it, because it's huge in Europe and relatively small in the USA, where soccer is still a fringe sport.  But a recent interview with a SEGA executive in Game Informer would let you know that FM is in the same stratosphere as the Total War series in terms of profitability.  Which, if you consider how soccer-happy Europe is, should come as no surprise.  Who, after all, doesn't want to manage their favorite sports team?

I'm still playing FM 2010, despite the release of FM 2011, because I'm still wrapped up in a career I started in the earlier game.  As you can see, I'm currently coaching Canvey Island, a small English semi-professional team.  The screenshot comes from inside a match against Dartford, and while the majority of the game takes place in a menu-driven, text-based interface, this, I think, is the heart of Football Manager.

Hooray for scoring goals

In any sports simulation, the match or game engine is what makes the thing go.  It's all well and good to have a solid high-level system, but if the games themselves are unrealistic and clunky, the game falls short.  Bowl Bound College Football, for example, had an intolerably bad play-calling system.  Most EA games, as another example, are horribly unrealistic, and thus make bad simulations.  Current online games like those at WhatIfSports are great because of the competition inherent in facing other humans, but their match and game engines are extremely opaque, entirely textual, and not very accurate.

Football Manager is in a class by itself, then.  While it's not perfect, the match engine is ever-improving, and what's best about it is that you can watch, as you coach, your players actually execute your tactics.  Now, for a non-European the learning curve here was very high, because I had to learn soccer tactics first, and then I had to learn the game, but over time I've come to see what a brilliant engine FM has.

Now, Football Manager does support 3D rendering of matches, but I choose not to use it.  I do lose a little in terms of seeing who really tripped whom (circles don't do a good job of indicating who's on the ground), but I stay with the 2D view for two reasons.  First, it's a lot easier to survey the entire field, so I know not just what's happening on the ball, but what, for example, my left back is doing while my striker moves in to score.  Second, the 2D graphics are better than the 3D ones anyway.

Football Manager's 3D graphics are choppy and clunky.  Watching them, I feel like I'm playing a soccer simulation game.  The 2D graphics make me feel like I'm coaching a soccer team.  Odd?  Perhaps, but I think it gets to the deeper point: graphics, in a game, are not a substitute for imagination.  I can suspend disbelief a lot more easily with circles running around, because I know it's an abstraction, than I can with clunky 3D models that don't really look like people, but look enough like them to ruin the abstraction.  Immersion doesn't just come from pretty pictures: even mere text can do the job, and 2D is sometimes better than 3D.

Distant Worlds

2D + Space = lots of ugly stacking; also, amazing gameplay

 Last but not least, I want to mention Distant Worlds, a game that is so massive, so complex, and so wonderfully fun that it could only work in 2D.  Huh?  Look at it.  It's ugly.  The ships are small, the space station looks horrible, and the planet is bland.  The space looks empty, and everything is so dark.

Distant Worlds is a classic "don't judge a book by its cover" kind of game.  The screenshots look unimpressive, but the reason the game looks so flat is because it has to, because in 3D it would be nigh unplayable.  Whereas the other games we've discussed here involve a kind of trade-off, sacrificing graphics for gameplay, in Distant Worlds there's actually a synergy.  Simply put, there's so much to do in DW that 3D graphics would only disorient the player, making an already complex game needlessly more complicated.

What makes DW special?  Scale.  In a way, this is almost EU3 in space, only with ship design and without historical basis.  DW takes place in galaxies ranging from hundreds to thousands of stars, each of which has planets or asteroids that - if they are not inhabitable - may have resources vital to the success of your empire.  Indeed, economics drive DW, but they do so in the background.  In a fantastic innovation, Codeforce - makers of the game - set up an automation system that takes care of huge swaths of your empire for you.  No matter what, there is always a civilian economy that runs in the background of your military empire, collecting and exchanging resources, migrating between planets, and even visiting resort bases.  But you can also let the AI take care of research and/or ship design and/or fighting wars and/or engaging in diplomacy and/or espionage and/or construction and/or colonization.  The result: an incredibly complex 4X strategy game (that runs in real time, no less, though it can be paused) that is easy to learn, because you can simply automate anything you don't want to worry about.

And, above all, the game looks right.  It's easy to tell where your ships are, which colonies are yours, what kinds of ships you're using, and even how much commerce is going on (if you choose to let the game display civilian ships, though that makes the game run very slowly on large maps).  Above all, though, the abstraction of playing in 2 dimensions - instead of the 3 of, for example, Sword of the Stars - prevents the player from getting disoriented, and allows for seamless zooming from one side of your empire to the other to address whatever problem might arise.  In short, the graphics may not look like much, but they're supremely functional, and they're supported by such in-depth gameplay that they're almost irrelevant.

In summary, then, graphics do not make the game.  Or, rather, graphics are how the player and the game interact, but they need not be fancy or high tech to do their job.  Indeed, sometimes 3D is less immersive than 2D.  Sometimes more flashy and fancy takes away from good gameplay.  In PC gaming, in particular, it's important for developers and players to remember that the more complex interface (keyboard and mouse versus controller) and the more powerful hardware allow for games that are deeper, more engaging, and, ironically, maybe less pretty than games made for the console.  Kalos and agathos aren't always the same thing.

Wednesday, January 5, 2011

Sacris Solemnis

Not long ago I heard a fascinating rendition of the second movement of Beethoven's 7th Symphony on the radio.  If you're not familiar with the original, here's a passable version from YouTube:



Obviously it's an intense piece, and very intricately constructed.  It's in A minor (the 7th Symphony as a whole is in A major), and it relies heavily upon the tonic-to-dominant-and-back movement that starts the piece.  More than the harmony, however, what makes this piece so effective is its rhythm.  The whole 7th Symphony feels like dance:* from the celebratory and springy first movement to the hectic races of the 3rd and 4th movements, dance rhythms are pervasive.  This movement is no different, and while it is certainly sedate and may even come across - especially on first listening - as profoundly sad, if it is a lamentation it's a dancing one.  The repeated rhythmic motif of One-Two-and-Three-Four gives it a measured, dancing feel, a driving force that pushes the movement onward.


* Indeed, Wagner called it "the apotheosis of the dance."

I could go on about the 7th Symphony, but that's not my object.  Rather, I want to introduce the fascinating rendition I heard on the radio.  Give it a listen:



Now some classical music people are not fond of this kind of pirating of a famous piece.  I, however, think that's a bit stuffy and closed-minded.  Beethoven would maybe be offended by a re-rendition of his work, but, then again, he also re-rendered some of Mozart's themes in his own variations.  If nothing else, Beethoven would appreciate the choice of lyrics Libera leader Robert Prizeman elected to set against Beethoven's work.

You see, the 2nd Movement of the 7th Symphony is easy to understand as a dour, overwhelmingly depressing display of emotion.  I don't believe, however, that such was Beethoven's intent.  Indeed, this movement strikes me as just as joyful as the remainder of the symphony.  If anything, the difference is that this movement is more reverential - or at least reverential in a different way.  Where the other movements are like whirling dervishes, showing their devotion to life, love, and happiness by swinging about wildly, this movement is a wiser, quieter, more intense and personal display.

What perhaps the Libera version loses most is the build-up.  Beethoven's version starts with just chords, and piece by piece introduces further harmonies and a wonderful contrapuntal (and highly chromatic) melody.  He then moves that melody up through the instruments until it reaches the violins, singing above the rest of the orchestra's pounding harmonic and rhythmic core.  Libera can't and doesn't try to recreate that effect, choosing rather to focus on the simple interplay of melody and harmony that a novice might lose sight of in Beethoven's work.

Adding in a percussion part and an original "bridge" in the middle of the piece only further brings out the power of Beethoven's harmonies and rhythms.  The added bits feel, to me anyway, very natural, especially because they are so well situated in the piece.  They help the listener forget that the Beethoven original contains more themes and development, which would be unwieldy - if not downright ugly - performed by voice.  Those orchestral developments become unnecessary against the dynamic crescendo employed by Prizeman in his rendition.

But the lyrics.  This is what fascinates me most.  Prizeman chooses a hymn - Sacris Solemnis - as his text.  Now, Beethoven is, ironically, famous for introducing lyrics to orchestral music in his 9th Symphony, but the recognition of Beethoven as an innovator for the voice comes despite his lack of interest in writing for the voice throughout his life as a composer.  In a time when opera was still very much the rage, Beethoven composed only one, better known for its preludes and overtures than its arias.  It seems to me that Beethoven was concerned, generally, with matters too weighty for lyrics.  Or, rather, his work was always singing in a language more expressive than German or Latin or Italian already, so why spoil it with words?  In some sense, his 9th Symphony seems to me to be an effort to teach us listeners his language of joy, present throughout his musical oeuvre.

So what lyrics did Prizeman select for his setting of the 7th Symphony?  The Latin goes like this:*

Sacris solemnis, Juncta sint gaudia
Corda, et voces, et opera.
Recedant vetera, nova sint omnia.
Sacris solemnis, gaudia.
Te trina Deitas unaque Poscimus:
Sacris solemnis, gaudia.

* The bridge is: "Lucem, lucem ad inhabitas," which translates as "The light wherein You dwell."

I'm no Latin scholar, so I have to rely on the translations of others and a little bit of guesswork, but roughly this means something like:

At this our solemn feast, let holy joys abound
In every heart, and voice, and act.
Let ancient rites depart,* and all be new around.
At this our solemn feast, let there be joy.
We implore Thee, God, the Trinity:
At this our solemn feast, let there be joy.

* Reminds me of the opening lyrics of the 9th: "O Freunde, nicht diese Töne!"

Even if the translation isn't perfect, could there be a better set of lyrics?  They capture the solemnity and reverence of the piece, but also understand that the occasion is fundamentally a joyful one.

I think, with Beethoven in particular, but also with Chopin and Wagner and a great many other composers of the 18th and 19th centuries, it's easy to get lost in the emotional force of the music.  The Romantic era, in particular, earns its name for its desirous, pining melodies and its adventures far from the tonic, through streams of dominants, making us positively lust after some resolution.  But, even though we might call such music childish - or, worse, adolescent - it is not.  I believe it cheapens those musicians to call their music "angry" or "sad" when, in reality, the story is much more complicated than that, the emotional investment much more nuanced.

Beethoven's music is joyful, not because he was always joyful, but because the act of writing music was, to him, a divine one, and one which brought him joy.  Even his most tempestuous works have a streak of the kind of melancholic joy that comes with understanding one's sadness and frustration to be only a part - and a vital one - of the emotional canvas of a joyful life.  I would argue that Chopin's "sad" music, for example, may come from the heart of a depressed invalid, but that depressed invalid was truly joyful in his heart when he composed music.

The 7th Symphony - and in particular the 2nd movement - is more, however, than just music written with joy but expressing melancholy.  No, it's exactly what Prizeman's text describes: a solemn plea and prayer for joy.  That may seem paradoxical, but I don't think it has to be.  I think joy and solemnity often exist side-by-side, and, what's more, that joy is sometimes at its most poignant in a minor key.  Listen again and see if you can hear it.

Sunday, January 2, 2011

Follow-Up on Meaning and Truth

I was going to respond to Luc Duval's comment on my last post with a comment, but then I realized I'll probably be long-winded enough to turn the response into a post of its own.  Plus, this way there's a better chance you'll see the excellent points he makes.

Luc writes, in his comment to my last post:

You really dislike Twitter, eh?

That post was intriguing but overwhelming. I have some friendly criticisms and some questions.

1. Were you trying to expound on something you read by Hegel? Or...wait...Wittgenstein? Or Husserl? Though I'm familiar with most of these names, it would be helpful if you provided a more focused explanation of which of their ideas/statements you refer to, or, even better, some quotes.
2. I enjoyed reading what seemed to be buildup to conclusive statements of yours but sometimes felt left hanging. For example, your discussion of the tautology of meaning/truth or truth/meaning was valuable - I was really hoping for you to try to separate the two with something definitive, but I don't think you did.
3. You made a lot of comparisons to the past with great certainty but provide no examples, statistics, or metrics. (Maybe I'm looking for the wrong kind of truth.)

1. How did you get to "Well, certainly we have destroyed 'Capital T' truth," without first defining what you mean by such Truth?
2. Your discussion of media seems limited to online networks. Does your idea that, to paraphrase, "the search for meaning/truth has become homogenized via excessive individual 'shouting in a crowded room'" change at all when you give consideration to literature/drama/music/visual art? Furthermore, does that change your comparison to the past at all?
3. I got a little bit lost reading about "artificial and limiting" systems of expression. Spoken language seems limiting to me, but natural, whereas painting seems un-limiting, but artificial. Were you trying to say that "Truth" is less accessible because our systems of expression are limited? Is, then, "Truth" dependent upon expression?


To do justice to Luc's many excellent points, let's take them one-by-one.  First, the criticisms, all of which are valid.  I'll rephrase how I understand each one (and I'll probably be less tactful than Luc is), and then offer some brief thoughts.

1) Pretentious invoking of famous names does not help your point.  If you're not going to cite exactly what you mean when you mention Hegel, you could at least explain it a little bit.

This is very true, and it's a huge weakness of my first-draft style of blog writing.  Where, in an academic paper, I would never be so careless, I tend to write my blog posts with less care.  Which is ironic, because the likelihood of my audience being able to follow the argument without further explanation would be far greater if I were writing for a professor.

Anyway, the people in particular I mention in my previous post are people who I talk about regularly, especially with my friends who are fellow Johnnies.  The result is a shared understanding of what those names mean, but that's a shared understanding that does not go beyond the rather limited audience of, well, fellow Johnnies.  Even philosophy majors often don't actually read famous philosophers (though, of course, they are told what those philosophers say, in the opinion of their professors anyway).  It strikes me that I should probably do a post or two about the particular philosophers I cite or allude to most, not as be-all-end-all interpretations, but as helpful way-points for when I'm working on heavier posts, and I don't want to worry about spending extra paragraphs explaining the reference.

To answer Luc's question, then: in general, if I'm responding to something I read elsewhere, I say so.  In this case, rather, I'm using "Hegel" as a stand-in for a complicated, lengthy argument that I don't want to get into because I already "know" - or at least have a strong opinion about - the beginning and end points, as well as the process.  It's not, ultimately, vitally important to be familiar with the three philosophers in question to understand my last post, but I can see how their presence is confusing.  Luc is right to call me out on my name-dropping for exactly that reason.

2) The train of thought moves pretty well when it leaves the station, but where do these tracks go?

Now this, I would say, is a big part of the Nicht Diese Tone modus operandi.  Because I emphasize process so much over outcome, it is frequently the case that I avoid conclusive statements.  That's not fair, because I often make conclusive statements, but as a writer I generally treat them as a kind of dramatic flourish, and not at all necessary to my own satisfaction.  This, I think, is both a weakness and a strength.  It's also a symptom of my approach to writing: I'm rarely trying - especially in posts like the last one - to get somewhere.  Rather, I'm trying to explore things, and so often I end up with tangents that are more interesting than the main point.

Anyway, the weakness, here, is obvious.  My writing isn't always reader friendly, especially because it's not usually clear what the takeaway is.  And, frankly, if it is clear, I feel like I haven't done my job.  I strive to be a complicator in a world of simplifiers, a person who makes understanding things hard (or, rather, who shows that understanding things is innately hard) in a time when we tend to believe that understanding things is easy.

I would concede, however, that my blog posts tend to get too far afield, which is principally, again, the result of them almost always being first drafts.  My usual writing style - when writing academically or professionally - is to draft, throw away, draft, throw away, and so on many times exactly because of how easily I'll get sidetracked.  Indeed, usually when I begin an academic paper, I'll write three paragraphs before I realize what I'm actually supposed to be writing about, and then I restart.

So, in short, I don't know where the tracks are going, either.  Sometimes I feel like that gets me to an interesting place.  Other times, I'm left as unsatisfied as Luc, with a profound sense of "and that happened."  But hey, at least the train is usually moving pretty well before it crashes.

3) Where's the beef/data?

This is actually a more interesting question than I can address in any reasonable amount of time here.  While I am certainly a proponent of good data and not making stuff up, I'm also extremely wary of the citation-over-idea culture that academia tends to cultivate.  I read so many articles and papers at Stanford that were, essentially, devoid of original thought (or even half-decent writing) that my own tendency is to over-emphasize ideas, to the point of not always doing the extensive background research I should be doing (and would be) were I publishing in a more formal setting.

Suffice it to say, Luc is absolutely right to call me out for not supporting most of the assertions I make about the past versus the present, about modes of expression, and even about my interpretations of various philosophers.  Ironically, in the very post in question I talk about the modern explosion of "you have your facts, I have mine," and then proceed to practice that very maxim.  Of course, the question becomes what the goal and audience of my writing are.  In general, my failure to provide data is a result of my taking for granted that my audience understands the situation to be very much as I suggest.  That is, I'm relying on "common knowledge," even about matters of opinion.

Now that's hardly rigorous, but I dare say it's extremely common.  Sportswriters, generally (at least in my experience), tend to take for granted a great many things that are alternately supported or denied by advanced statistical analysis.  Likewise, I suspect my previous post has a handful of assertions that intensive data collection would confirm, and a handful of assertions that intensive data collection would challenge.  I just don't know which is which, and - and here's the damnedest thing - in some cases I suspect the data does not exist, and would be nearly impossible to collect.

Let's not, for the moment, get into what is the most intellectually rigorous approach in such a circumstance.  Rather, sometimes, for the sake of argument, I'm liable to take a position that I don't actually believe - or, rather, that I don't actually have evidence for - because I would rather practice drawing actionable (or at least write-about-able) conclusions from limited data than try to always be 100% certain about everything.  Indeed, that was largely the purpose of invoking Wittgenstein in my previous post.  His On Certainty is a work which tries and tries and tries to define certainty, and ends up talking itself in circles because, in the end, context matters so much to meaning that knowledge, in any fundamental sense, comes across as impossible to him.

But we're getting ahead of ourselves.  Onward to Luc's questions.

1) What is this "Truth" of which you speak?  And who destroyed it?

This is a very good question, and as I re-read my post with Luc's comments in mind, I realized that I did a very poor job laying out my premises.  The distinction I wanted to make was this:

Truth, with a capital T, is to my mind an indication of some Platonic, universal, almost inhuman reality.  It is, in short, the idea that there is a single, unyielding, unchanging, essential set of metaphysical (or, possibly, physical) things and ideas applicable to all beings.  In some sense, arguing for Truth, with a capital T, is akin to arguing for a kind of abstraction of God.  That is, whether or not there is Truth is not the same question as whether or not there is a God.  Rather, it's whether there is a single way in which God (or anything else, really) exists.  That's maybe an opaque distinction, so let me try to clarify.

It seems to me that God, as a word, undoubtedly means a thousand different things to a thousand different people.  We might say, then, that one of those thousand people is right, and the rest are wrong: God means one thing and one thing only, and that is the Truth.  We might, on the other hand, say that there are a thousand different valid ways of understanding God.  While that doesn't mean there isn't some deeper Truth (still capital) that synthesizes all or some of those thousand disparate and maybe even mutually exclusive (to us humans, anyway) perspectives, it does mean that, as far as human experience goes, there is not really any Platonic, eternal, universal Truth worth striving for.  Rather, there's only truth in as much as it applies to a given context, in a given time, to a given society.

So Truth, as such, hasn't been destroyed, per se.  But I would argue that post-modernism, as a philosophical and social movement, has roundly rejected the idea of a single correct interpretation of the world, metaphysical or otherwise.  As a result - because post-modernism has become so ingrained in our society - we are a world easily able to say "let's agree to disagree," because "you have your truth and I have mine."  That doesn't mean it's right and, as I argue (kind of) in my post, that doesn't mean there isn't a Truth (only that we haven't sought it hard enough).  What it does mean, however, is that society at large, and academia in particular, are a lot more interested in discovering contexts and situations than Truths, because we don't really take Truths seriously anymore.  Or, perhaps more poignantly, we see Truth, as an idea, as tied up in colonialism and racial and sexual oppression, thanks to the preponderance of rich white males among the historically prominent seekers of Truth.

2 and 3*) "Procedural homogeneity," who do you think you are?  People still write books, paint pictures, and play music.  Also, why the Twitter hate?  And what does any of this have to do with truth?

* Because my writing about the one bled into the other.  Also, I'm lumping Luc's opening comment in here, because it's related to his actual question.

I don't hate Twitter, actually.  I find it fascinating, and while I'm not in a rush to start an account, I do sometimes read a select handful of people's Tweets.  As in many situations, there's a paradox at work here: constricted form limits expression, but it also liberates it.  The sonnet is an extremely limiting formal construct, but it is precisely because of its limits that writers of sonnets are so creative.  So it is with Twitter, where the character limit may homogenize the creative process (just as the sonnet form does), but within those limits the possibilities for expression are actually quite large.

It is also the case, in our digital age, that people still engage in all of the same creative forms they always have.  And, on top of that, we've begun to synthesize forms in ways previous generations could never have imagined.  A simple example, to demonstrate the point, would be modern video games, which increasingly blend principles of cinematography with dramatic writing, both two- and three-dimensional artistic rendering, and, of course, good audio design.  The creative process is certainly alive and well in our modern age, and - by virtue of our increasingly huge population alone, let alone greater access to technology - innovation is probably faster and more widespread now than ever.

The problem, then, is a subtle one, and I'm not sure my post did a good job explaining it.  Indeed, I know it didn't, partially because I'm not sure I really can explain it.  What I'm getting at, more or less, is that there's a kind of complicity the writer of sonnets has with the sonnet form that I'm not sure the Twitter user has with Twitter.  That is, the writer of sonnets winks and nudges at his form, knowing that its limits are precisely what allow him to achieve such high artistic expression.

Now, compare that to a Tweet, which is generally not taken as a piece of art to begin with.  What does the user of Twitter consider to be his relationship with the medium of self-expression he is using?  Is he aware at all that he is limited?  Those are real questions that I don't have the answer to.  What I suspect, however, is that Twitter users take for granted their character limits, just like we take for granted the limitations of our native tongues (Luc's language example is a good one).

In that sense, there may be little difference between modern man and man of any other time.  We have - as I say in the post - always been limited in our means of expression.  All art is limiting in form because it is principally through limitations of form that we are able to create anything at all.  What is more pernicious, then, is not limitation, but the widespread homogeneity in the type of limitation in which we, as a society, participate.  It's not just Facebook, Twitter, and text messages that I'm talking about; it's decreasing biodiversity, it's dying languages, it's fewer and fewer cultures.  Now - and this is a leap - those latter things might be related, at a procedural level, to the former ones.  It might just be...  Or let me make it a question: is there a technological imperialism going on?  Are the wonderful tools for self-expression and communication we have developed in our modern age choking out diversity of all kinds, not because of what they are, but because of how they work?

Again, I really don't know the answer to that question, but it's a question that I don't think is being asked nearly enough.

Whither truth (or Truth), in all of that?  Well, I'm of the opinion that not only more ideas, but more different ways of generating ideas, will get us closer to whatever the Truth might be.  And, even if no such thing exists, at the very least it's a lot more interesting - and probably a lot healthier intellectually, spiritually, and physically - to deal with variety rather than homogeneity.  In the end, I'm not so concerned with truth, I guess, because I'm more concerned with the process of getting there.  Or, rather, it might very well be that the closest we can get to truth as a species is good process.  Not good thoughts, but good thinking.

With that in mind, thanks to Luc for forcing me to do better thinking.