Tuesday, July 31, 2012

101 Interesting Things, part forty-eight: Pyura chilensis

Let's have ourselves a little climb up the family tree.  In the kingdom Animalia, phylum Chordata, you'll find the class Ascidiacea.  Ascidians are the sea squirts, immobile filter feeders who sit in one place and just process whatever floats by.  You know your sea squirts, right?

Saturday, July 28, 2012

How Can Science Be Objective?

This is a paper I wrote in my philosophy of science class.  The instructor told us that the class would be "one long argument," and at this late point in the course, our task was to summarize and synthesize the material presented thus far, and evaluate it.  The question was, "Can science be objective?  And if so, how?"  We were given six pages to answer (my bibliography was page seven).  This is another paper that I'm pretty proud of, so here it is for your enjoyment.  Links to the cited articles are given at the end, except for two which I couldn't track down (I'm not counting Locke, I only used him for a money quote at the end).  They're all good reads, so feel free to take a walk on the web if you're unclear on a point and want it in the author's own words.

Much ink has been spilled in the attempt to characterize science, both its process and its products, in a way that both is accurate and preserves at least some of our notions of what science is “supposed” to be about. Logical positivism has been rendered untenable by the ubiquity of underdetermination and the nature of observation as fundamentally theory-laden. Is there any way for science to be objective in this light, or is it just a highly political free-for-all? This depends on what we mean by “objective,” for some definitions of the term shall certainly fail. Yet there is an important way in which science can be seen as objective, and in a satisfyingly progressive way.

Thursday, July 26, 2012

On Formlessness

When I took advanced US history in high school, our instructor walked up to the blackboard on the first day and wrote a bunch of words on the board:
dog cat horse ox run jump climb fish green blue orange apple pear swiftly Jane Bill
Those probably aren't the actual words, but that doesn't matter.  He then told us, "Organize these."

"According to what," someone asked.  It might have been me, but I don't remember and it's not important.

"Not my problem," he answered with a shrug.  "Just organize them however you want."  I put them in alphabetical order:  apple Bill blue cat climb dog fish green horse Jane jump orange ox pear run swiftly.  Done.  Our instructor asked us how we organized them - one person had organized them into parts of speech, with "fish" going under both the Noun and Verb headings, "orange" being both a Noun and an Adjective, and "Jane" and "Bill" under the Proper Names subheading of Nouns.  Another student had divided them into Domesticated Animals and Other.  We had hit all the obvious ones, and then the instructor put some others on the overhead and had us try to guess what they were.
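The exercise translates neatly into code: it's the same list run through different key functions. Here's a quick sketch in Python; note that the word list is my reconstruction from the anecdote above, not the instructor's actual board.

```python
# The same word list, organized two different ways. The list below is
# reconstructed from the anecdote; the instructor's real words are lost.
from collections import defaultdict

words = ["dog", "cat", "horse", "ox", "run", "jump", "climb", "fish",
         "green", "blue", "orange", "apple", "pear", "swiftly", "Jane", "Bill"]

# Scheme 1: alphabetical order, case-insensitive (the narrator's answer).
alphabetical = sorted(words, key=str.lower)

# Scheme 2: group under an arbitrary classifier -- here, first letter.
by_letter = defaultdict(list)
for w in words:
    by_letter[w[0].lower()].append(w)

print(alphabetical)
print(dict(by_letter)["o"])  # ['ox', 'orange']
```

The point of the exercise survives translation: neither scheme is more "correct," they're just different key functions applied to the same data.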

Tuesday, July 24, 2012

On Faithlessness

There's this YouTube guy, Philhellenes, who's made some sweet videos like a scientifically accurate rewrite of Genesis and an account of science saving his soul.  One of his other videos is on admitting error, and it's also really great:

Why does he need the contrast so high?  Is he in a library
after hours?  He must be The Phantom of the Library!

It got me to thinking about walking in doubt, a phrase I use from time to time when I need to establish that the brand of atheism I practice is not in fact another kind of faith.  I have a lot of beliefs, but no faith, because I've spent a great many years doubting everything systematically, like René Descartes.  What I've ended up with is a bunch of things I endorse as facts with varying levels of confidence, and a whole bunch of inferences between them of varying strengths, which is rather difficult to keep track of.  It's not perfect, by any means, and I still screw up from time to time, which keeps me humble - but beneath it is an attitude of readiness to admit error.  I try not to think, "What do I know for certain," but instead, "How much doubt am I dealing with here?"  If I'm ever not dealing with an amount of doubt, then I've got certainty, and certainty is dangerous.

Saturday, July 21, 2012

On Cognition

Last spring, I took a class in the Interdisciplinary department on cognitive science - we covered philosophy of mind, computer science, and neuroscience, and it was awesome.  I gained a reputation in the class for hedging my answers very carefully, admitting when I didn't know things, but knowing quite a lot, and refusing to engage in wild speculation (unless I was explicitly asked to do so).  I was also the only philosophy major in the class, and I think I represented the department rather well.  But then again, I also had at least five years on each of the other dozen or so students, and I had spent those years educating myself and arguing on the internet in my free time, so you could say I was a bit over-prepared.

The prompt for our final was, "What is the nature of human cognition?" and we were given six double-spaced pages to answer.  I think I did a rad job, and the material I brought together (almost all of which had been covered in class) was tremendously interesting, so I thought I'd share it with you, Dear Reader.  Enjoy!

Is Consciousness This Way or That Way?  A Robust Yes and No.

What is the nature of human cognition?  At the very least, we may say that it is complicated.  My own personal conjecture is that the first-person nature of consciousness is essentially a “story” that the brain “tells” to itself, after a fashion; but we humans have been using our minds to ponder our minds for millennia, and I probably stand very little chance of successfully advancing a new theory of mind in a six-page paper for an introductory course in cognitive science.  Three broad avenues of approach have been quite productive, however:  studying brains, attempting to manufacture consciousness “from scratch” (as it were), and inquiring as to what precisely we might mean when we talk about our minds.  Respectively, these may be called the empirical approach, the engineering approach, and the philosophical approach.  Each of these approaches has something to say on the three issues of brain brittleness, mental models, and the “directionality” of consciousness.  What I intend to show is that while each of the various answers proposed to the question, “What is consciousness?” contains at least a kernel of truth, no present account delivers the whole story and so we ought not to endorse one or the other as “essentially right.”

Friday, July 20, 2012

Follow-up Quickies: Bosons and ninjas!

OK, so my last post was kind of bullshit.  Here are a couple other interesting things to jam between yesterday and tomorrow.

National Geographic tackles Ninjutsu, hiring Glen Levy to put his skills to the test.  Please try to ignore his introductory banter; it pains me to listen to it, and I can all but guarantee that he was somehow contractually obligated to say that "journey to your own destruction" crap.  Pay attention to the bit on the vagus nerve, though, and check out the force he's able to deliver with a single punch:

That dummy needs shorts 'cuz it can't decide

In case you skipped it, I'll give you the TL;DR version:  this man is able to punch with the force of a shotgun.  No joke.  OK, it's a police shotgun firing a rubber bullet, but still.  You'd be hard-pressed to hit that hard with a baseball bat.  Yes, you.  Yes, baseball bat.  Glen Levy hits harder.

As for bosons, there was a reason I didn't trot out any of the goofy analogies that other writers have given for the Higgs field:  they don't explain where the mass comes from.  Passing through a sheet of molasses, and the molasses sticks to you?  Fine, but why is the molasses massive?  Walking through a crowded room and all your fans flock to you?  Fine, but why are the groupies massive?  Minute Physics to the rescue:

Physics in a minute.  Fifty-eight videos.  You could watch all his
stuff in an hour, and experience at least six flavors of enlightenment.

The TL;DR version here is that the mass is more or less resistance that comes from moving through the Higgs field all the time always.  In a kinda-sorta way, but it's closer.  Still don't get it?  OK, it's time to talk about the shape of the Universe.

Fortunately, Quarthex does translate to math, and souls don't exist.

It's been said that the Universe is "saddle-shaped."  Like if you took a square sheet of paper, and folded one set of opposite corners up, and the other set down.  Like so:

See?  Like a saddle on a fucked up horse!  (Is it weird that
I draw the shape of the Universe better than a horsey?)

Except - and here's the kicker - instead of starting with a flat thing and curving it around, you start with a tiny curled-up thing and blow it up so big it just looks flat.  Bam!  Now I bet you'll have a better understanding of this article, and this key image:

All of those circles, at every point of everywhere, are one other dimension.

All the "extra" dimensions are like that:  tiny and curled up, but existing at every point in spacetime.  Incidentally, this is also why I have never ever made fun of anyone for proposing that time is circular - because, son of a bitch, it's actually kinda-sorta plausible.  Now do that over and over until you get all the dimensions, and bam!  There you have it.  Is the Lorentz factor making a little more sense now?

Thursday, July 19, 2012

More Fake Religion: Much ado about Nothing

I believe in Nothing.  I believe in other things, too - a whole load of 'em, in fact.  But Nothing is at the very center of my worldview.  Nothing is the most important thing in my life, without exception.  Nothing is more important to me than all life on Earth, and all the cosmos besides.

I worship Nothing.  I pray to Nothing.  I came from Nothing, ultimately, and when I die I shall go to Nothing.  Even the Bible says, "In the beginning, there was Nothing."  All that exists came from Nothing:  either the Universe sprang forth, full-formed, from Nothing; or a deity created it, and that deity came from Nothing.  Either way, Nothing is ultimately responsible for all of the everything that we see all around us today.

Even if there is a deity who created everything we see today, Nothing is older than it.  Nothing is stronger than it.  Nothing is morally better than it.  Nothing is more worthy of worship, thanks, and praise.

The best part?  Refuting the central tenets of my new religion requires positing a something that would obviate any other religion, too.  I win, either way!

I'm going to go contemplate Nothing for a while now.  Toodles!  (I thought I'd have more to say on this... but it turns out, it's actually hard to ramble on about nothing...)

Tuesday, July 17, 2012

101 Interesting Things, part forty-seven: It's "-jutsu," dammit!

I've been reading The Wise Man's Fear, sequel to The Name of the Wind in Patrick Rothfuss' Kingkiller Chronicle.  I can't put it down, and there's quite a long bit in the middle where the hero lives and trains among people who are essentially ninjas.  The way their language is described, more suggestive than explicit, is analogous to Japanese; the way they move and fight, with subtle grace and economy, is clearly meant to evoke Earth's own "shadow warriors."  So, naturally, I have ninjas on the brain.

In the twelfth century, a samurai named Daisuke Nishina suffered a crushing military defeat.  Rather than face his death like a man (pfft!), he fled into the mountains, and that's where things get hazy.  Some say he met a wise man from India, some say he met a monk from China, some say he met a tengu - but in the following years, a new breed of warrior arose in Japan.  Unbound by the principles of bushido, these unconventional warriors eventually came to be known as ninja.  They fought uncommonly well, but "dirty" (to the samurai), using techniques that exploited principles of leverage and body mechanics without relying so much on strength or speed.  They also engaged in psychological warfare as their legend grew, cultivating their own myth to their advantage and pressing it against their superstitious opponents.  And, of course, they used stealth - skulking about in the dark, striking from the shadows, disguising themselves, and various other "dishonorable" tactics that would be unthinkable for a samurai.

Sunday, July 15, 2012


I have something of a tab explosion of things I want to talk about, but none of them is really worth a full-on ramble, so here's a pile just to get 'em out of the way.  On to bigger and better things!

Grammar Girl on How to Write Numbers:  I was actually going to go off on a whole rant about style and the horrors of homogeneity, but decided to skip it because fuck it.  It wasn't going to be a rant at Grammar Girl, anyhow; in truth, I was going to be railing against The System, and tooting my own horn a lot.  But after thinking it through, that's not something I really feel like doing.  The main thing is that there's an insoluble tension in a mongrel language like English:  on the one hand, there's a need for some standardization, just to make sure that we can be clear on what we're actually saying; on the other hand, if you have too many rules then you end up with only one way to express a thought, and that's awful.  If you remember your Orwell, that was the point of Newspeak in 1984.  In case you can't guess, I'm right in the middle - of course, I probably only think it's the middle because I'm the one standing there, but whatever.  I mean, we need the rules for things that ought to be clear and unambiguous, including engineering manuals and Western philosophy; but art is half knowing rules and half breaking them creatively, which we need for things like literature and poetry and Continental philosophy.  You'll notice I didn't use a serial comma there, and that was deliberate - because I wasn't pausing in my head, and I didn't want you to pause in your head, either.

Spelling a Word One Way:  I was going to open the aforementioned rant with a quote of dubious origin, which I decided to look up.  Turns out, there is no answer - or at least not one I can know with confidence - and that may or may not have been the straw that broke the camel's hump.  (The hump is my motivation.)  But now I've got a wonderful new resource for diving headlong into trivia, so hooray!  I entertained the idea of ranting on spelling instead of style, but again, fuck it.  In a language like Spanish, rigid spelling conventions make sense, because it's got enough diacritics and few enough sounds that you can tell how to spell a word by hearing it and tell how to pronounce it by looking at it.  But that won't work for English, because English has a much messier and more complicated origin.  And really, all I wanted to say was that I can effect a change in my affect to have different effects when I want to affect someone in this or that particular way.  There.  Done.  No need to conjure up a half-assed rant around it.

Garfield Minus Garfield:  I read about this some years ago, when it was a single web page (as I recall) and just kind of stayed that way for a while.  While I was away, Jim Davis applauded the derivative work, showing uncommon grace and perspective.  If you're not savvy, Garfield Minus Garfield digitally removes Garfield from his own strip, revealing Jon Arbuckle's struggles with his own unstable mind.  While the original Garfield stuck to a narrow range of the emotional palette, slightly amusing within the confines of what you could talk with your folks about at Thanksgiving dinner, the removal of the cat allows the strip to transcend those boundaries and wander all over the map.  The strips now range from bizarre, to hilarious, to depressing, to uncanny, to pitiful, to vindicating, to existential, to insightful.  I'm amazed by it, and you should check it out.

Open Letter from a Millennial:  Yowza.  I have a lot of thoughts on this one, but every time I go over them, they change radically.  Not back and forth, but round and round, in an expanding spiral that keeps touching down at various points in history and possibility.  The more I think about it, the more I think that continuing to think about it is what I should do, rather than come up with some kind of "answer."  But even that may change.  So I offer it to you without further comment.

That Reminds Me:  Stephen Colbert gave a commencement speech at Northwestern last year.  You need to watch this.  It's twenty minutes of awesome, even the "boring" introductory matter.  You owe it to yourself.

Friday, July 13, 2012

Loose Ends in Philosophology: The Master Argument

Bishop George Berkeley (pronounced "BARK lee") was the first person to discover that CAPSLOCK IS CRUISE CONTROL FOR AWESOME.  Unfortunately, he was not also the first person to discover that EVEN WITH CRUISE CONTROL, YOU STILL HAVE TO STEER.  Seriously, this guy used capitalization for emphasis - and he emphasized a lot.

He was also an idealist, in the sense that he thought that the fundamental "stuff" of reality consisted in ideas (in the Lockean sense).  The easiest way to describe it to a modern audience is that he thought reality - the "real" world in which we all actually live - was like a giant MMO, where God plays the role of the server.  What makes things "be real" is "being perceived," it's just that God perceives all in order to keep things running.  To prove this idea that reality is fundamentally ideas instead of fundamentally stuff, he set out to establish that the idea of a "physical" object is incoherent, insofar as it entails a contradiction (a "manifest repugnancy," as Berkeley wrote).  He wrote a bunch on this, but he presented one "Master Argument" on which he said he was "content to put the whole," as in wager everything.  Unfortunately for him, he trades on an ambiguity - depending on what exactly he meant, it might be either equivocation, conflation, or amphiboly.  I think this threefold nature of the ambiguity is what made it go undetected for centuries.

Either that, or nobody really feels obligated to comb through the old ramblings of Irish bishops except for philosophy undergrads.  Y'know.  One or the other.

Bishop George Berkeley’s “Master Argument” is a thought experiment meant to show that it is absurd to believe in mind-independent physical objects.  While the argument itself is clear and appears well-structured, it rests on an underlying ambiguity that Berkeley may well not have noticed.  Additionally, the link between inconceivability and impossibility is not clearly established in this case, and breaks down on closer examination.  Despite these flaws, Berkeley’s physical arguments still show something interesting about the link between the world and what human minds think of it, but it is not the illustration that Berkeley imagines it to be.

Wednesday, July 11, 2012

Joe Klamar takes unconventional photos of Olympians; everyone loses their shit.

On Independence Day, photos were leaked from Joe Klamar's one-minute shoots with the US Olympic Team.  They bore a certain theme to which critics and the public have reacted somewhat strongly.  I read about this on the day, but gave myself a week so my thoughts could simmer and be laid out to dry before I picked them over and made them into new ones.  Here are a couple paradigm cases:

She looks like a human instead of a doll and I can see what went into the shot.  How dare he!

That tear in the paper is destroying America.  You can see the flag wilting over it, for fuck's sake!

He's been criticized for angles, focus, lens choice, lighting, showing the set, you name it.  The technical criticisms in and of themselves are perfectly legitimate, to my mind:  you're observing compositional flaws which remind you of rookie mistakes, and you wouldn't want it in your gallery because it looks sloppy to you.  Their inclusion is a deliberate act of juxtaposition, not an unintended oversight; but this stroke is itself an element that may or may not appeal to you.  It makes a statement, and you don't like it.  Cool.  This is what art criticism is all about - every artwork is a statement, an answer to the question of what statements are worth making, and that worthiness is exactly what you ought to be evaluating.  From dance to music to video games, high art gives you a contemplative experience that enriches your life - you are meant to walk away with a sense of the profound.  This didn't do that for you, and you can articulate what you feel is missing, out of place, or handled poorly.  Fair play to you.

That's not the kind of reaction I'm talking about.

Tuesday, July 10, 2012

Elvis Has Left the Building: Update on our cosmic voyage

I read the other day that Voyager 1 has left our solar system, so I decided to look into what exactly that means.  What exactly are the borders of our neighborhood?  I mean, when we're talking about cosmic distances, the sheer magnitudes we're dealing with make hard lines really tough to draw.  This is compounded by the fact that our space probes move at ponderously slow speeds (astronomically speaking).  However, cosmic weather gives us some convenient rough-and-ready figures.  So first, a little background on the structure of the solar system, and then I'll tie it back to the Voyager mission.

Earth, technically, is in space.  What makes us consider something as "on Earth" but not in space is being within Earth's atmosphere, though I suppose you might consider the magnetosphere or our moon as a boundary, and that's not entirely unreasonable.  But just like Earth has an atmosphere and a magnetosphere, so our star also has a local "sphere of influence" that separates it somewhat from the galaxy at large.  This is called the heliosphere, and while the boundaries are fundamentally fuzzy, it gives us something useful to work with.  What makes the heliosphere "be what it is" is solar wind, charged particles emanating from the Sun.  As the Sun rotates, these emanations twist and ripple in a rad-looking sheet that looks kinda like this:

It's like a cosmic bathtub drain, but in reverse.

Saturday, July 7, 2012

Loose Ends in Philosophology: The Multiple Realizability of Water

Robert Pirsig coined the term "philosophology" to describe the strange habit philosophy departments have of teaching philosophical history without teaching philosophical method - teaching "what it's been," in other words, without teaching "how to do it."  Many people don't see a difference, but it's as big as the difference between learning music theory from a book and composing your own actual music (again, Pirsig's example).

When I went back to school, some of the "problems" in the history of philosophy struck me as insulting - not because they were stupid questions (they were excellent questions at the time), but because apparently philosophers hadn't been keeping on top of their science in the intervening centuries.  So at least a couple of my papers were dedicated to actually settling these so-called "unanswerable" questions.  This one is dedicated to John Locke's question about multiple realizability - what if what we think of as one thing, and treat as one thing, is actually many different things that merely appear the same to our senses?  In one way, this makes no sense when you're up on your science - but in another equally important way, it's actually the case.

(All citations refer to the 1975 edition of Locke's An Essay concerning Human Understanding, edited by Peter H. Nidditch.)

John Locke defines the meaning of a word as its associated ideas, a superficially innocuous definition which becomes problematic in the details.  Because ideas are conscious states, we cannot define our terms with the unseen corpuscular structures that inform these conscious states, but must instead resort to the conscious states themselves, using words as symbols for them.  However, as there is always more than one way to skin a cat, this may lead us to the apparently absurd conclusion that more than one corpuscular structure could lead to the same set of ideas, and so we may be forced on this analysis to call two things with distinct real essences by the same word.  A more thorough analysis of this kind, with some understanding of modern physics and chemistry, dissolves the problem:  while Locke could not have known quite how, we are in fact able to escape the absurdity while still endorsing a Lockean conception of meaning and essences.

Thursday, July 5, 2012

Teetotal? Best avoid brand name colas!

"According to tests carried out by the Paris-based National Institute of Consumption (INC) more than half of leading colas contain the traces of alcohol."


See, I used to think that cola only got alcoholic when I put rum or whiskey into it.  Guess I was wrong!  So now that we know that there's alcohol in it, the next question is naturally:  how much do I have to drink to get drunk?
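Just for fun, the back-of-the-envelope arithmetic. The trace figure I'm using below (roughly 10 mg of alcohol per litre of cola) is in the ballpark of what the INC tests reportedly found, but treat both numbers as illustrative assumptions rather than gospel:

```python
# How much cola would it take to match one standard drink?
# Both figures below are assumptions for illustration:
trace_mg_per_litre = 10   # assumed trace alcohol in cola (~0.001% ABV)
standard_drink_g = 14     # roughly one US standard drink of ethanol

litres_needed = (standard_drink_g * 1000) / trace_mg_per_litre
print(f"{litres_needed:.0f} litres of cola ~ one standard drink")  # 1400 litres
```

So on these assumptions, you'd need about 1,400 litres of cola to equal a single beer. Good luck with that.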

Tuesday, July 3, 2012

The Higgs Boson!

Two electrons are sitting opposite each other at a lunch table.  A third electron walks up with its lunch on a tray and asks, "Can I sit with you?"  To which one electron replies, "What do you think we are, bosons?"

Electrons are fermions, and only one fermion can occupy a quantum state at a time - unless some other property is different (like spin, represented in the joke as the opposite sides of the table).  This is why you can only have two electrons in any given orbital around an atomic nucleus.  Bosons operate under no such restrictions, which makes the joke funny, if you already know all of that.  If not, well, analyzing humor is like dissecting a frog:  few are interested, and the frog dies of it.
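For the curious, the exclusion rule falls out of a standard bit of quantum mechanics: the wavefunction for two identical fermions has to flip sign when you swap the particles, so putting both in the same state makes the whole thing vanish. Sketched out:

```latex
% For identical fermions, exchanging the particles flips the sign:
%   psi(x1, x2) = -psi(x2, x1)
% Try to put both in the same single-particle state phi:
\[
  \psi(x_1, x_2) = \frac{1}{\sqrt{2}}\bigl[\phi(x_1)\phi(x_2) - \phi(x_2)\phi(x_1)\bigr] = 0
\]
% The state is identically zero -- it simply can't exist. Bosons get a
% plus sign instead of a minus, so the same construction survives, and
% they're free to pile into one state together.
```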

Sunday, July 1, 2012

101 Interesting Things, part forty-six: Overtones

Enough of blood.  I mean, I'll probably wrap it up at some point, since the immune system is just so cool.  But enough of blood for now, I want to talk about other things.

So those Pentatonix guys I linked last time, that guy at (your) lower-right?  His name is Avi Kaplan, and he can do something called "overtone singing," and I can almost guarantee that you know what it sounds like even if you don't know what it's called.  Here he is:
See?  You know that sound, right?
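If you want the physics behind that sound: a sung drone contains overtones at whole-number multiples of the fundamental pitch, and the overtone singer shapes his mouth to reinforce one of them at a time. A quick sketch - the 100 Hz fundamental here is an arbitrary example value, not Avi's actual pitch:

```python
# The harmonic series over a sung drone: partials sit at integer
# multiples of the fundamental. 100 Hz is an arbitrary example pitch.
fundamental_hz = 100
overtones = [n * fundamental_hz for n in range(1, 9)]
print(overtones)  # [100, 200, 300, 400, 500, 600, 700, 800]
```

Those higher partials are always present in the voice; overtone singing just turns the resonance of the mouth into a filter that makes one of them pop out as a separate whistling note.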