Sunday, October 31, 2010

Of Mice and Straw Men

If you have ever watched a television debate show (or, worse, a political advertisement), if you have read a blog, if you have listened to your crazy neighbor rant about the local football coach's stupid decisions, you've probably come in contact with a straw man or two.  Even my previous post is an example of arguing against a straw man.  Basically, instead of trying to explain a valid opposing opinion, the "straw man" style of argument relies on setting up a mock opponent, and then destroying that mock opponent thoroughly.

It's an extremely effective way to anger people, more than anything, because it doesn't take a lot of insight to see when someone is misrepresenting a position.  For example, if a Republican wants to argue against, say, the pro-choice position, he's likely to take on not the valid, rational arguments for abortion, but rather the more emotional, ambiguous, and sometimes absurd ones (for example, he'll argue that Mr. Abortion doesn't value life, or that he's an atheist and a fascist, or that he hates America, or some such).  Instead of engaging with ideas, setting up a straw man is more about taking down people by implying that they have only flimsy (at best) or immoral (at worst) arguments in their favor.

Despite how it seems, I nevertheless feel as though there is a place for the infamous straw-man style.  Of course, in real conversation it's worse than useless.  But sometimes it's easiest to represent a position by explaining what it is not.  And sometimes it's important to persuade by a kind of rhetorical force - or to reinforce a sense of camaraderie with an audience - rather than to explore nuance.  We might rightly wonder if there's not a broader social issue here, in that there might be something off-kilter in a world where persuasion by rhetoric is more important than persuasion by insight and fact, but it's hardly news that how it's said is more important than what is said.

I guess the point is, however, that the straw man argument is a mouse's argument in a lion's world.  There are far subtler and more effective ways to persuade in almost any situation.  Rather, it seems to me that the straw man is reserved for times when one's own emotions get in the way.  For example, in my latest post I was concerned not so much with my actual audience as an imagined one, and in particular a small handful of people with whom I have recently had contact.  I don't even believe that my straw-man represents their understanding of science education, but because I do believe in the position I advocate, I didn't care too much to get into the nuances of their position.

Ah, so the lesson here has something to do with remembering when and where and for whom you are writing.  But it goes deeper than that.  There's also a dash of the vague-but-essential "critical thinking" that has to happen here.  A writer owes nothing to his audience, really.  If they read, it is by choice.  A reader, on the other hand, owes much to himself, and perhaps the most important part of being a reader is, in fact, not being persuaded.  If we agree with the writer, how much more important to retain a sense of objectivity, so that we don't miss the real essence of the writing for its style or its seeming alignment with our own beliefs.  At the very least, we ought not miss the opportunity for a conversation simply because we think we already know what the author is saying.

That's the irony, to me, of my own writing style, which is often reflective and exploratory, but more often (I suspect) polemical or persuasive.  I don't actually want to persuade, and I would never want to be understood to be "certain" about things.  What I write in any given post is, usually, a perspective or a position I'm trying out, something I'd like to explore and something I think my audience - limited though it is - might be interested in exploring as well.

Does that excuse straw men?  Should it?  I don't know.  What I do know is that the satisfaction of taking down an argument without having to fight is not a trivial one.  No wonder there are so many people arguing so loudly against positions no one really holds.  No wonder, even in our seeming duopoly of a political system, neither side ever seems to actually argue with the other.  No wonder debates are exchanges of monologues instead of conversations.  It's just so much easier that way.  It's just so much easier to be mice, hiding inside of straw men.

Friday, October 29, 2010

On Science Education

I am not a science teacher by trade, but I have found myself teaching science more than any other subject area.  Perhaps this is in part a result of my upbringing - my mother is a science teacher - and perhaps a result of pure chance.  Regardless of why science has found me and I have found science, my background as a graduate of St. John's has followed close at hand, with the result that I think about science a little differently than many people do, a fact that has helped me to understand science myself, but has not always been to my advantage.

What do I mean by that?  Well, one of the most fundamental but unarticulated problems in science education is a failure among the important people in the process to agree on what science really is.  That's not to say they should agree, or that agreement is even possible, but rather to say that the lack of a clear definition of science has led to a hodge-podge of curricular methodologies and pedagogies that are non-complementary, and which do more to alienate and confuse students than to empower or interest them.

Scientists, ironically, are probably the most to blame for this situation.  There is great lamentation in the scientific world about how poorly students do in science.  The solution, it seems to these people, is to make science more "real" for students.  That is, to align science classes more closely with their own experiences as scientists.  The problem is, they perceive science to be, generally, a fairly static thing.  The average scientist's definition of science (and I don't have research to back this up, just my own experiences) seems to be something like this: science is a set of well-established theories and facts about the natural world, arrived at by employing a fixed methodology in places where our knowledge is lacking, in order to expand our understanding.

Now there's nothing particularly wrong with this, except that it's extremely boring and a waste of students' time.  The idea that science is, at its heart, about theories and facts, and an easy-to-use scientific method, to me trivializes the real efforts of science.  No, science is not just collecting data and analyzing it ad infinitum; the real power of science to affect people's lives lies in something more fundamental.  To me, science is about a process that is dynamic and contextual.  Scientific knowledge is conditional and flexible.  Being a scientist is about creativity, critical thinking, and learning.

How much more interesting is this perspective to students?  Frankly, they probably don't care how we define science.  More important is the result of this perspective on curriculum, on pedagogy, on what happens in the classroom.  In the former, the scientist's classroom, students sit and listen to lectures, they engage in laboratory sessions with known answers, they are asked multiple choice questions, they fill in worksheets with word banks.  Basically, they're miserable.  Even a field trip, in the traditional science classroom, generally leads to little more than the dissemination of information, the collection of useless and meaningless data, or the acquisition of some boring science "skill" that is irrelevant to a student's life.

On the other hand, the latter classroom is organized around the concept of science as inquiry, meaning students are encouraged to develop a more sophisticated understanding of how science really works.  That is, by giving students enough structure to keep them on task, but enough freedom to let them develop their own observations, questions, hypotheses, and even experiments and conclusions, students get to learn the real pitfalls of scientific work.  What's more, they still learn the content.  Perhaps not as much as if they were just lectured at for an hour straight, but what they do learn, they learn better.  More importantly, memorizing facts, equations, theories, and methods is a waste of time in our modern world; better to learn how to synthesize, how to analyze, and how to ask good questions.

To illustrate: How many of you readers out there remember anything about how to calculate the effect of friction on a moving body from your high school physics class?  For those of you who don't remember, how long do you think it would take you to look it up?  I'll give you a hint: type "friction" into Google, and click on the Wikipedia page.  There's your equation, plus explanations of how to use it.  Good thing your science teachers wanted you to memorize all that.
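For reference, this is the sort of textbook relation I mean (the standard model of kinetic friction for a body sliding on a level surface; the coefficient depends on the two materials in contact):

```latex
% Kinetic friction opposing a sliding body's motion:
F_{\text{friction}} = \mu_k N
% where, on a level surface, the normal force is just the body's weight:
N = mg
```

That's the whole thing, and it's exactly the kind of fact a student can retrieve in seconds rather than memorize.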

Beyond acquiring facts, which is manifestly a waste of time, the other biggest problem with traditional science education is that it is built around collecting data.  I certainly acknowledge that good data collection is important, but I will also say that data in itself is worthless unless there's some kind of analysis on the other end, not to mention some kind of purpose for collecting it on the front end.  It's easy, in pre-made experiments, to point to the supposed "purpose" of certain data, but the whole point of good science education, to my mind, is to get students to make those decisions themselves.  What data should they collect?  Why?  How will they use it?

Now an experienced educator will respond by saying that students need to see that process in action before they can do it themselves, and so we should provide them with situations where they are not designing experiments, where they are working with fake data, where they are performing analysis without doing the legwork to get there.  That's fair, but only if we call attention to why we're doing that.  Many teachers emphasize the importance of data analysis for its own sake, not as a part of a broader scientific process, and certainly not as a pathway towards meaning.

Ah, there's an interesting word.  I suppose, if I had to summarize the two approaches to science education I've discussed here, I would say that the first is about information and the second is about meaning.  The problem is, looking for meaning in science is not encouraged by scientists, for reasons that are difficult to fathom (though the Freudian in me wants to suggest mean things about scientists being afraid of meaning because of the spiritual, emotional, and social emptiness of their own pursuit of information).  The cry that students do poorly in science, to me, reverses the reality: science does poorly for students.  Make science real, make it authentic, make it about ideas and not facts, make it about processes and not methods, make it about discussions and not dictation, and then you'll see students do well.

Science is a liberal art, in the root sense of the term.  The study of science can help free you from other people's conclusions, from propaganda, from drug commercials that say "studies show..."  Science, however, can also bind you and blind you.  Unfortunately, the classroom (and, more importantly, the legislator's table and the teacher training program) is, in this discipline as in many others, a battlefield.  A battlefield where the "fluffy," holistic, more authentic side is losing, not because it is actually worse, but because it's harder to do well, harder to assess, harder to standardize.  Those are fair criticisms, but it might be that part of what's wrong in education, broadly, is that we shy away from what's better but harder too often.

Sunday, October 24, 2010

Ping Pong

I suppose that most people watching, say, an NFL football game are well aware of how impossible it would be for them to step on the field and even pretend to play with the pros.  Not only are even the smallest NFL players bigger than the vast majority of us, it's pretty clear from watching them play that they're incredibly skilled and focused, not to mention propped up by countless mystery medicines and steroids and assorted body armor, such that they barely even count as people any more.  And for good reason; the sport they play is far too brutal for any sane human being to participate in.

Nevertheless, because it's so ingrained in our culture, there's a kind of natural intuition as to whether a football player is doing it right, with the result that fans and pundits alike are quick to point out who sucks, who's good, who messed up, and who is so bad that he earns the adjective 'mediocre.'*  The same is true in baseball, and generally speaking, while no rational fan honestly believes he or she could do just as well as the pro, there's kind of this latent, subconscious feeling that, yes, in fact, player X is so bad and messed up so much that I probably could do better.

* Is it just me, or has mediocre come to mean something even worse than awful, terrible, or bad?  I mean, every team in every sport has lots of 'terrible' players, but only a particular kind of talent earns the moniker mediocre.  While certainly a player everyone calls mediocre is probably better than the guy they say sucks, odds are the mediocre guy also plays an important role on the team.  No one ever calls the back-up safety mediocre, for example.  Mediocre is reserved for the QB, for the clean-up hitter, for the ace of the pitching staff, for the coach.  I guess, all in all, mediocre is more damning because, while it's fine to have a bad assistant special teams coach, a mediocre head coach in a world of good and great head coaches spells doom.  Or at least mediocrity.

A classic example of the "I could do better" notion that immediately jumps to mind is the USA's goal against England keeper Robert Green in the World Cup.  In case you don't remember, here it is (along with other highlights from the same game) in glorious LEGO:

Anyway, the point is, this is the kind of event that most sports fans look at and say, "Well I could do better than that."  Fair or not, we have enough acquaintance with the game to believe that we could execute the basic fundamentals with as much precision as a pro.

Not so - and we're getting to the post title - in ping pong.  Now I know the serious ping pongers out there will tell you it's supposed to be called "table tennis," and we'll get back to that.  For now, the important thing is that, even though ping pong is a game almost everyone has played, and an extremely simple one with much more straightforward rules than, say, football or baseball, I think it's pretty much impossible to watch a professional ping pong match and feel like you could do better.  Even a good amateur ping pong player generally approaches the game in a way that is so different from the way the pros do it that it's hard to imagine matching up.

Of course, that is true in other sports, too, but the odd thing is that in ping pong the illusion of being able to perform at a high level would take, it seems to me, a much higher level of self-deception.  I don't even hold the paddle all backwards and upside down, and can't imagine that I would play better if I did.  And yet, that's how the real ping pongers do it.  I can't imagine serving with any degree of accuracy the way a real ping pong player serves.  I certainly can't imagine hitting the ball from so far away from the table.

I think the key here, however, is that even someone like me who has been playing a fair (ok, perhaps alarming) amount of ping pong for the last two months now, can't even really follow what's going on in a serious ping pong match.  The points are so fast, the strategy so blurred, the reactions so automatic, that it doesn't exactly make compelling viewing.  Consider this video of the "Top 10 ping pong shots of all time:"

I can't even tell what's going on in half of these!  Who's winning these shots?  What makes them so good?  I don't understand!

This is a chicken and egg thing, of course.  We don't understand ping pong intuitively because we don't watch it, and vice versa.  What's more, we find ping pong impossible to watch because it's too fast, but if we watched it enough maybe it would start to make sense.  Then again, it might just stay weird and impossible to follow (what with the tiny ball and speed of the game), which might be part of why it has never been all that popular to begin with.  That is, popular to watch.  It's extremely popular to play by comparison.  I think ping pong, darts, and pool are probably the three most prominent game-sports* that more people play than watch.

*As opposed to sports like running or skiing or swimming or surfing, which aren't really games.

Which all brings me to the name "ping pong."  My extremely cursory research tells me that ping pong - the term - originated with Parker Brothers around 1900, around the same time as the name "table tennis."  Apparently no one wanted to stick with the game's original name "Wiff-Waff," and I can't imagine why not.  Regardless, the name ping pong seems to me much more appropriate, because table tennis makes the whole thing sound like tennis on a table.

Which, you're probably thinking, is exactly what it is.  Except it's not.  Sure, there's a passing resemblance to tennis, but ping pong is as much like tennis as volleyball is, I would argue.  The differences are far more pronounced than the presence of a table.  For example, the ball is plastic instead of felt and rubber, the paddles are hard instead of strung, and the game is played from outside of the playing area (instead of inside).  Serves have to bounce on the server's side of the table, but don't have to go cross court.  Players have to pivot a lot, but rarely do they ever have to run.  Most of all, the strategy is completely different.  Oh, there are analogies, just as there are between other net-and-ball games and tennis, but trying to play like Rafael Nadal won't make you a better ping pong player.

Which is why ping pong needs a different name than tennis, and table tennis is so much less inventive.  Ping pong is colorful, absurd, and impossible to take seriously.  Which is how it should be.  After all, this is no sport of gridiron tactics and bone-crushing hits, nor is it a sport of last at-bat high drama.  It's just ping pong.

Thursday, October 21, 2010

Comparing Discussion to Brainstorming

One of the most important parts of a design process is good brainstorming.  It is impossible to end up with good prototypes - that is, prototypes that result in meaningful feedback - without being unafraid to engage in the kind of intensive, creative, and often bizarre brainstorming that is at the heart of companies like IDEO or programs like Stanford's dschool.  But what makes for a good brainstorm?

I have both participated in and facilitated (and often both at the same time) brainstorms with my peers, with teachers and administrators, and with students.  In that experience, I have found that good brainstorms essentially follow the rules I was taught at the dschool.  But I have also found that those good brainstorms - those rules for brainstorming - are nothing all that revolutionary.  It strikes me that a good brainstorm is a lot like a good conversation, only with a whiteboard and more post-it notes.

What do I mean?  Consider the dschool's "rules for brainstorming:"
1) Defer judgment
2) Go for volume (that is, quantity, not noise)
3) One conversation at a time
4) Be visual
5) Headline
6) Build on the ideas of others
7) Stay on topic
8) Encourage wild ideas

Obviously a number of these rules complement each other.  One conversation at a time, build on the ideas of others, and stay on topic are mutually reinforcing rules.  Likewise, going for volume and encouraging wild ideas work together.  The wilder the ideas, the more of them the brainstormers are likely to produce.

For my own part, I find facilitating a brainstorm fascinating, because the role the facilitator has to play is one of empowerment.  That is, you have to empower people to follow the rules.  Sometimes that comes in the form of yelling "One at a time!" or "Headline!" when appropriate.  More often, however, a teacher of brainstorming has to offer the wildest and craziest ideas that everyone else is afraid to mention.  For example, during our recent NALU 102 session, when the students were brainstorming experimental questions to investigate at Waimanalo Beach Park, I offered the classic "how many grains of sand are there on the beach?"  Why?  Not because I thought it was feasible or even all that interesting, but because I wanted to demonstrate that there's no reason to only ask questions that you can answer easily.

The same can be said of brainstorming for product development.  It's a waste of time to suggest only ideas that are easy to do.  I doubt that the people at Nintendo, for example, were having a sedate and measured conversation about feasibility when they first came up with the Wii.  No, that was a wild idea that turned into an industry force principally because it was a wild idea.  Similarly, I don't think the Google folks held back when offering the idea of getting street-view pictures of every road and intersection in the country to include with GoogleMaps.  Wild ideas - even the ones that don't lead anywhere - are essential to coming up with something that is actually innovative.

Anyway, the point here is not to discuss brainstorming, but to talk about how brainstorming is similar to a good, dialogic conversation.  As a graduate of St. John's College I have some strong opinions as to what makes a good conversation, but fortunately I don't have to articulate those myself.  Stringfellow Barr, one of the founders of the Great Books program at the college, wrote a wonderful piece called Notes on Dialogue that does the job for me.

There's less "headlining" than in the dschool list, granted, but I want to call out what I understand to be Barr's "rules for a good conversation:"
1) Be brief
2) Suggest crazy ideas
3) Ask for clarification
4) Practice prevents bedlam
5) Listen to each other
6) Try to reach an agreement
7) Follow the argument wherever it goes
8) Don't use hand-raising as a crutch
9) Listen to each other (again) and be friends
10) Don't take things too seriously

Now these rules are not identical to the brainstorming rules, but I see some overlap.  Consider Barr's demand for brevity in dialogue.  That matches quite well with the brainstorming maxim of headlining, as well as the desire to promote quantity of ideas.  Both the dschool and Barr agree, also, about the importance of wild ideas.  Moreover, the brainstorming triumvirate of "one at a time," "stay on topic," and "build on each other" are intimately related to Barr's ideals of listening to each other and following the argument where it leads.  I would also associate not taking things too seriously with deferring judgment.

What Barr does that the dschool does not do is address some of the "how," instead of just the what.  Barr advocates what goes without saying in brainstorming: practice makes perfect.  In dialogue, in particular, this is important, because as Barr points out, engaging in a non-hands-raising, authentic dialogue is extremely frustrating and often highly ineffective with beginners.  Of course, the same is true in brainstorming, where too many beginners are liable to talk over each other, to judge each other's ideas, and to be afraid to sound too crazy.

I also think that - and have experienced that - asking for clarification is sometimes essential in a brainstorm.  "Headline" can be translated as "be brief," but it can also mean, "ask someone to clarify or simplify."  Often when someone is facilitating a brainstorm (or when experienced brainstormers are working without a facilitator), the shout of "headline!" means exactly that: simplify, clarify, explain.

Perhaps the only substantive difference between the two lists is in the brainstorming preference for visual representation.  Because dialogue is a purely auditory and oral event, there is little opportunity to draw on the board (though of course, as Barr says, doodling is permissible and even encouraged).  It is true, at St. John's anyway, that good conversations in math or science classes often find a student or tutor at the board drawing mental models or working with equations, but that is a far cry from the design ideal of representing every idea - or as many ideas as possible - with pictures instead of words.

This difference reflects a fascinating cultural disparity between a much more bookish past and our own visual present.  For good reason we do better encouraging the visual in our modern conversations.  Nevertheless, the heart of a good conversation remains the same: listen, be brief, and don't be afraid to offer a wild idea.  It strikes me that two places as different as Stanford - at the cutting edge of design and technology and science - and St. John's - employing a Great Books and dialogic model that is over a century old - could be so close together not in what they do, but in how they do it.  To me that is an affirmation that good thinking, good collaboration, and good conversations, whether for the sake of building something, for the sake of understanding something, or just for fun, have some fundamental building blocks that are the same in almost any environment.

Monday, October 18, 2010

On Leadership

Over the past two weeks, as I've been running our NALU 102 session along with my trusty co-teachers, I've been thinking a lot about leadership, both because of my role, and because of the expectations we ask our students to live up to.  While I have been a "leader" at various times in the past in both academic and extra-curricular settings, my new position as Director of NALU Studies is the first professional leadership position I've held.  While I am only some six weeks into the job, I'm finding the experience a very different kind of challenge from the ones I faced as a student at Stanford, or as a teacher with NALU when I was a mere part-time helper.

Similarly, I think our students sometimes struggle with the difference between titles and leadership, and rightfully so.  One of the things I'm learning is that it is a lot easier to be a leader in a situation when you already have the title that signifies that leadership.  There is less need to be assertive, to show off one's physical or intellectual powers, such as they may be.  Not so for students who are title-less; they often feel that, without leadership granted to them, it is not theirs to take.  I think this is both an astute observation on their part and an unfortunate fallacy, but the reasons why will not be readily apparent until we define leadership.

What is leadership?  Traditionally, leaders are the people who get to make decisions, who get to shape things according to their desires and interests.  The easiest thing for a leader to do, given the power that goes with leadership, is to make use of that power to orient situations and environments so they are more favorable for the leader in question.  That's a broad and probably uninteresting observation, but I think it is important.  Even the most progressive, forward-thinking leaders in the world know how and when to use their titles, their networks, and their other assets to transform a conversation into a representation of power dynamics.  Stated more simply, without sometimes taking control of a situation, a leader cannot display leadership at all.

There is, however, something missing from the conception of the leader as executor of bestowed (or taken) power.  Leadership, I would argue, has a lot to do with listening, weighing options, and making choices based upon the needs of others.  Indeed, that stuff is way more important than ensuring positive outcomes for oneself.  That's not to say leadership is a selfless role, but rather that it is not an entirely selfish one.  It is a balance, wherein the needs of a great many people must be considered.

It is here where the myriad titles that make a leader are not really important.  What matters is an individual's actions, instead.  A leader-in-title who merely exercises his will is no more than a selfish child, whereas an unremarkable, unassuming student who steps up quietly and makes sure that an upset friend's opinion is heard is clearly displaying leadership.

Of course, all of that is stuff you've probably heard or thought about before.  The wrinkle, here, to me anyway, is that leadership is really a lot easier when the leader is less selfish.  There's a natural disposition, especially in our exceedingly greed-driven society, to think that with leadership comes money, and with money a less holistic and empathetic world view, but I believe that model is flawed.  That's not to say that most leaders in this country (and around the world) are not well paid, but rather that most of the excessively rich ones end up working too hard and achieving little of real value.  The problem with being even a rich capitalist, you might say, is that you're always looking over your shoulder at the next generation of greedy, clever upstarts who can take you down.

It's easy for me to say, as director of an education program, that things should be different than that.  But I do honestly believe that even in a major company, good leaders do not spend the majority of their energy organizing situations for their personal benefit.  Consider companies like Zappos, whose CEO draws a pittance of a salary, or Southwest Airlines, whose founder and CEO was long the least well-compensated CEO in the country relative to the income of his company.  Those companies are innovative, customer-focused, effective, and generally not evil, thanks in large part to their unselfish leadership models.

The same, of course, is true in the education space.  My primary goal as a leader at NALU Studies is to grow the organization into a successful force for positive change in at-risk education in Hawaii.  That's not the vision of the organization, or the mission, but it is at the heart of my decisions as a leader.  It is worth noting that achieving this goal would be good for me as an individual - I think the idea that the success of a business or non-profit and the success of its leader are mutually exclusive is a silly one - but it is also the case that I cannot succeed in making a difference in at-risk education simply by looking out for myself.  It is also the case, and this is the more important point, that I cannot simply act according to my own comforts and passions.

I think, less than outright greed, the greatest flaw that leaders of small organizations exhibit is the tendency to make the organization nothing more than a reflection of themselves.  Decisions are too often made that do not reflect the best interests of the organization or its stakeholders, because the correct - or at least better - decisions would force the leader into a position that is uncomfortable or outside of his expertise.  To me this is an understandable, but avoidable, flaw.  Understandable, because I think the fear of taking risks and failing is so widespread as to be almost unavoidable, even among those who articulate and understand the problems with failing to take risks.  Avoidable, because leaders whose expertise is sufficiently general are unlikely to be intimidated by lacking the specific expertise to execute a decision.  Why?  Because a good generalist knows both where to find an appropriate expert and how to have the necessary conversations with said expert (or experts) in order to get enough information to make a decision.

Enough is a key word here.  Perhaps, more than anything else, asking the right questions in order to get enough information is at the heart of leadership.  No one person can know everything about everything, even in a simple situation with simple options and simple decisions.  Instead, the leader is the person who is capable of making a decision with enough information.  The 80/20 rule is the leader's best friend.  If the first 80% of the information you need takes 20% of the work to acquire, and the remaining 20% takes the other 80% of the effort to acquire, best to make your decision based on 80% of the information.

For a high school student, all of that is probably a bit beyond their interest.  The kind of leadership that occurs within a cohort of ten students in a two-week program is of a different kind.  Only, it's really not.  There's a lot to be said for intuition, and it seems to me that good leaders arise even from among students because those students also know how to operate at the level of "good enough," and also know how to subsume their personal interests to those of the cohort (or, rather, realize that their personal interests and those of the cohort are the same thing).  I suppose the challenge for us teachers - as leaders in hopefully both title and deed - is to empower those leaders among us to articulate not just their understanding of a situation, but their metacognition, their recognition of their own leadership and what makes it go.

Sunday, October 3, 2010

The Plunge

This space is likely to be quite barren for the next two weeks while we run our next NALU 102 session.  As many of you know, I recently took over for Manning Taite as Director of NALU Studies, a small non-profit that runs science education courses for at-risk kids.  I'm sure I'll talk more about my experience in the last month as Director, and about the organization as a whole, in the future, but for now I just want to let my few readers know that my regular schedule of one or two (or, on occasion, three) posts a week is on hiatus until late October.

In the meantime, make sure to watch some playoff baseball for me, and send your good vibes to Windward Community College in Kaneohe, Hawaii, where our brave students are about to embark on one of the most challenging and fun courses they've ever taken.