Sunday, January 2, 2011

Follow-Up on Meaning and Truth

I was going to respond to Luc Duval's comment on my last post with a comment, but then I realized I'll probably be long-winded enough to turn the response into a post of its own.  Plus, this way there's a better chance you'll see the excellent points he makes.

Luc writes, in his comment to my last post:

You really dislike Twitter, eh?

That post was intriguing but overwhelming. I have some friendly criticisms and some questions.

1. Were you trying to expound on something you read by Hegel? Or...wait...Wittgenstein? Or Husserl? Though I'm familiar with most of these names, it would be helpful if you provided a more focused explanation of which of their ideas/statements you refer to, or, even better, some quotes.
2. I enjoyed reading what seemed to be buildup to conclusive statements of yours but sometimes felt left hanging. For example, your discussion of the tautology of meaning/truth or truth/meaning was valuable - I was really hoping for you to try to separate the two with something definitive, but I don't think you did.
3. You made a lot of comparisons to the past with great certainty but provide no examples, statistics, or metrics. (Maybe I'm looking for the wrong kind of truth.)

1. How did you get to "Well, certainly we have destroyed 'Capital T' truth," without first defining what you mean by such Truth?
2. Your discussion of media seems limited to online networks. Does your idea that, to paraphrase, "the search for meaning/truth has become homogenized via excessive individual 'shouting in a crowded room'" change at all when you give consideration to literature/drama/music/visual art? Furthermore, does that change your comparison to the past at all?
3. I got a little bit lost reading about "artificial and limiting" systems of expression. Spoken language seems limiting to me, but natural, whereas painting seems un-limiting, but artificial. Were you trying to say that "Truth" is less accessible because our systems of expression are limited? Is, then, "Truth" dependent upon expression?

To do justice to Luc's many excellent points, let's take them one-by-one.  First, the criticisms, all of which are valid.  I'll rephrase how I understand each one (and I'll probably be less tactful than Luc is), and then offer some brief thoughts.

1) Pretentious invoking of famous names does not help your point.  If you're not going to cite exactly what you mean when you mention Hegel, you could at least explain it a little bit.

This is very true, and it's a huge weakness of my first-draft style of blog writing.  Where, in an academic paper, I would never be so careless, I tend to write my blog posts with less care.  Which is ironic, because the likelihood of my audience being able to follow the argument without further explanation would be far greater if I were writing for a professor.

Anyway, the people in particular I mention in my previous post are people who I talk about regularly, especially with my friends who are fellow Johnnies.  The result is a shared understanding of what those names mean, but that's a shared understanding that does not go beyond the rather limited audience of, well, fellow Johnnies.  Even philosophy majors often don't actually read famous philosophers (though, of course, they are told what those philosophers say, in the opinion of their professors anyway).  It strikes me that I should probably do a post or two about the particular philosophers I cite or allude to most, not as be-all-end-all interpretations, but as helpful way-points for when I'm working on heavier posts, and I don't want to worry about spending extra paragraphs explaining the reference.

To answer Luc's question, then: in general, if I'm responding to something I read elsewhere, I say so.  In this case, rather, I'm using "Hegel" as a stand-in for a complicated, lengthy argument that I don't want to get into because I already "know" - or at least have a strong opinion about - the beginning and end points, as well as the process.  It's not, ultimately, vitally important to be familiar with the three philosophers in question to understand my last post, but I can see how their presence is confusing.  Luc is right to call me out on my name-dropping for exactly that reason.

2) The train of thought moves pretty well when it leaves the station, but where do these tracks go?

Now this, I would say, is a big part of the Nicht Diese Tone modus operandi.  Because I emphasize process so much over outcome, it is frequently the case that I avoid conclusive statements.  That's not fair, because I often make conclusive statements, but as a writer I generally treat them as a kind of dramatic flourish, and not at all necessary to my own satisfaction.  This, I think, is both a weakness and a strength.  It's also a symptom of my approach to writing: I'm rarely trying - especially in posts like the last one - to get somewhere.  Rather, I'm trying to explore things, and so often I end up with tangents that are more interesting than the main point.

Anyway, the weakness, here, is obvious.  My writing isn't always reader friendly, especially because it's not usually clear what the takeaway is.  And, frankly, if it is clear, I feel like I haven't done my job.  I strive to be a complicator in a world of simplifiers, a person who makes understanding things hard (or, rather, who shows that understanding things is innately hard) in a time when we tend to believe that understanding things is easy.

I would concede, however, that my blog posts tend to get too far afield, which is principally, again, the result of them almost always being first drafts.  My usual writing style - when writing academically or professionally - is to draft, throw away, draft, throw away, and so on many times exactly because of how easily I'll get sidetracked.  Indeed, usually when I begin an academic paper, I'll write three paragraphs before I realize what I'm actually supposed to be writing about, and then I restart.

So, in short, I don't know where the tracks are going, either.  Sometimes I feel like that gets me to an interesting place.  Other times, I'm left as unsatisfied as Luc, with a profound sense of "and that happened."  But hey, at least the train is usually moving pretty well before it crashes.

3) Where's the beef/data?

This is actually a more interesting question than I can address in any reasonable amount of time here.  While I am certainly a proponent of good data and not making stuff up, I'm also extremely wary of the citation-over-idea culture that academia tends to cultivate.  I read so many articles and papers at Stanford that were, essentially, devoid of original thought (or even half-decent writing) that my own tendency is to over-emphasize ideas, to the point of not always doing the extensive background research I should be doing (and would be) were I publishing in a more formal setting.

Suffice it to say, Luc is absolutely right to call me out for not supporting most of the assertions I make about the past versus the present, about modes of expression, and even about my interpretations of various philosophers.  Ironically, in the very post in question I talk about the modern explosion of "you have your facts, I have mine," and then proceed to practice that very maxim.  Of course, the question becomes what the goal and audience of my writing is.  In general, my failure to provide data is a result of my taking for granted that my audience understands the situation to be very much as I suggest.  That is, I'm relying on "common knowledge," even of opinion.

Now that's hardly rigorous, but I dare say it's extremely common.  Sportswriters generally (at least in my experience) tend to take for granted a great many things that are alternately supported or denied by advanced statistical analysis.  Likewise, I suspect my previous post has a handful of assertions that intensive data collection would confirm, and a handful of assertions that intensive data collection would challenge.  I just don't know which is which, and - and here's the damnedest thing - in some cases I suspect the data does not exist, and would be nearly impossible to collect.

Let's not, for the moment, get into what is the most intellectually rigorous approach in such a circumstance.  Rather, sometimes, for the sake of argument, I'm liable to take a position that I don't actually believe - or, rather, that I don't actually have evidence for - because I would rather practice drawing actionable (or at least write-about-able) conclusions from limited data than trying to always be 100% certain about everything.  Indeed, that was largely the purpose of invoking Wittgenstein in my previous post.  His On Certainty is a work which tries and tries and tries to define certainty, and ends up talking itself in circles because, in the end, context matters so much to meaning that knowledge, in any fundamental sense, comes across as impossible to him.

But we're getting ahead of ourselves.  Onward to Luc's questions.

1) What is this "Truth" of which you speak?  And who destroyed it?

This is a very good question, and as I re-read my post with Luc's comments in mind, I realized that I did a very poor job laying out my premises.  The distinction I wanted to make was this:

Truth, with a capital T, is to my mind an indication of some Platonic, universal, almost inhuman reality.  It is, in short, the idea that there is a single, unyielding, unchanging essential set of metaphysical (or, possibly, physical) things and ideas which are applicable to all beings.  In some sense, arguing for the Truth, with a capital T, is akin to arguing for a kind of abstraction of God.  That is, whether or not there is Truth is not the same question as whether or not there is a God.  Rather, it's whether or not there is a single way in which God (or anything else, really) exists.  That's maybe an opaque distinction, so let me try to clarify.

It seems to me that God, as a word, undoubtedly means a thousand different things to a thousand different people.  We might say, then, that one of those thousand people is right, and the rest are wrong.  God means one thing and one thing only, and that is the Truth.  We might, on the other hand, say that there are a thousand different valid ways of understanding God.  While that doesn't mean that there isn't maybe a deeper Truth (still capital) that synthesizes all or some of those thousand disparate and maybe even mutually exclusive (to us humans, anyway) perspectives, what it does mean is that, as far as human experience goes, there is not really any Platonic, eternal, universal Truth worth striving for.  Rather, there's only truth in as much as it applies to a given context, in a given time, to a given society.

So Truth, as such, hasn't been destroyed, per se.  But I would argue that post-modernism, as a philosophical and social movement, has roundly rejected the idea of a single correct interpretation of the world, metaphysical or otherwise.  As a result - because post-modernism has become so ingrained in our society - we are a world easily able to say "let's agree to disagree," because "you have your truth and I have mine."  That doesn't mean it's right and, as I argue (kind of) in my post, that doesn't mean there isn't a Truth (only that we haven't sought it hard enough).  What it does mean, however, is that society at large, and academia in particular, are a lot more interested in discovering contexts and situations than Truths, because we don't really take Truths seriously anymore.  Or, perhaps more poignantly, we see Truth, as an idea, as tied into colonialism and racial and sexual oppression, thanks to the preponderance of rich white males among the historically prominent seekers of Truth.

2 and 3*) "Procedural homogeneity," who do you think you are?  People still write books, paint pictures, and play music.  Also, why the Twitter hate?  And what does any of this have to do with truth?

* Because my writing about the one bled into the other.  Also, I'm lumping in Luc's opening comment here, because it's related to his actual question.

I don't hate Twitter, actually.  I find it fascinating, and while I'm not in a rush to start an account, I do sometimes read a select handful of people's Tweets.  As in many situations, there's a paradox at work here: constricted form limits expression, but it also liberates expression.  The sonnet is an extremely limiting formal construct, but it is precisely because of its limits that writers of sonnets are so creative.  Similarly with Twitter, where the character limit may homogenize the creative process (just like the sonnet form does), but within those limits the possibilities for expression are actually quite large.

It is also the case, in our digital age, that people do engage in all of the same creative forms they always have.  And, on top of that, we've begun to synthesize forms in ways that previous generations could never have imagined.  A simple example, to demonstrate the point, would be modern video games, which increasingly blend principles of cinematography with dramatic writing, two- and three-dimensional artistic rendering, and, of course, good audio design.  The creative process is certainly alive and well in our modern age, and - by virtue of our increasingly huge population alone, let alone greater access to technology - innovation is probably faster and more widespread now than ever.

The problem, then, is a subtle one, and I'm not sure my post did a good job explaining it.  Indeed, I know it didn't, partially because I'm not sure I really can explain it.  What I'm getting at, more or less, is that there's a kind of complicity the writer of sonnets has with the sonnet form that I'm not sure the Twitter user has with Twitter.  That is, the writer of sonnets winks and nudges at his form, knowing that its limits are precisely what allow him to achieve such high artistic expression.

Now, compare that to a Tweet, which is generally not taken as a piece of art to begin with.  What does the user of Twitter consider to be his relationship with the medium of self-expression he is using?  Is he aware at all that he is limited?  Those are real questions that I don't have the answer to.  What I suspect, however, is that Twitter users take for granted their character limits, just like we take for granted the limitations of our native tongues (Luc's language example is a good one).

In that sense, there may be little difference between modern man and man of any time.  We have - as I say in the post - always been limited in our means of expression.  All art is limiting in form because it is principally through limitations of form that we are able to create anything at all.  What is more pernicious, then, is not limitation, but the widespread homogeneity in the type of limitation in which we, as a society, participate.  It's not just Facebook, Twitter, and text messages that I'm talking about, it's decreasing biodiversity, it's dying languages, it's fewer and fewer cultures.  Now - and this is a leap - those latter things might be related, at a procedural level, to the former ones.  It might just be... Or let me make it a question: is it the case that there is a technological imperialism going on?  Are the wonderful tools for self-expression and communication we have developed in our modern age choking out diversity of all kinds, not because of what they are, but because of how they work?

Again, I really don't know the answer to that question, but it's a question that I don't think is being asked nearly enough.

Whither truth (or Truth), in all of that?  Well, I'm of the opinion that not only more ideas, but more different ways of generating ideas will get us closer to whatever the Truth might be.  And, even if no such thing exists, at the very least it's a lot more interesting - and probably a lot healthier intellectually, spiritually, and physically - to deal with variety rather than homogeneity.  In the end, I'm not so concerned with truth, I guess, because I'm more concerned with the process of getting there.  Or, rather, it might very well be that the closest we can get to truth as a species is good process.  Not good thoughts, but good thinking.

With that in mind, thanks to Luc for forcing me to do better thinking.
