SPOILER ALERT! (It does not end!)
OK, so the Terminator series has never been especially philosophically driven - at best, the movies are character-driven pieces about hope and the human spirit. But they do play with philosophy sometimes, whether to justify whatever major thing is going on, or to try to say something substantial in a summer shoot-'em-up. There's plenty of philosophical material to be drawn from the series, which in itself isn't so special (you can find philosophy anywhere you look, after all), but the Terminator movies do offer some interesting opportunities.
For example, Sarah Connor faces a formidable epistemological dilemma, not unlike a Cassandra complex, in that she is dead-sure of some very important facts about the future but cannot convince anyone that she is correct. How ought she to behave in order to best influence future events? How can she continue to perform checks upon her own sanity and provide herself with some assurance that she hasn't gone off the deep end? Given that the events of the movies happened as they did (what with the time travel and all that), what does that say about how "fate" works in that world? Since the war started because SkyNet considered human beings a threat, what could be done to reconcile humanity and AI so that they might live in peace? Since humanity is winning the war by the time Kyle Reese is an adult, how much of the future must SkyNet be aware of for its first strike to become an irrational action?
In Terminator Salvation, John Connor asks people to disobey direct orders from Resistance Command on the grounds that Command is asking them to carry out calculated but inhumane orders - to behave like machines - and if they're going to do that, then what's the point of surviving? This is what we in the industry call a "theory-laden question," or in common parlance a "loaded question." There's quite a bit of meaning packed into it, and the most important parts are not explicitly spelled out. This can cause confusion and disagreement: each respondent fills in the ambiguous but necessary values that give the question its meaning, and different respondents may fill in different values. Until those values are explicitly drawn out, apparent disagreement can leave people spending a lot of time talking past each other.
Human beings are machines, just very messy ones. As my friend Zach once put it, "I am a machine for turning meat into ideas!" (EDIT: As Zach points out in the comments below, the quote is actually from Dinosaur Comics, which comes highly recommended by everyone who reads it.) Our brains are crude, ad-hoc learning computers which store, access, and run various programs throughout our lives: etiquette programs, problem-solving programs, math programs, architecture programs, science programs, religion programs. Most of the time, we aren't consciously aware that these programs are running, we just get on with our lives and tell ourselves stories about the mystical "I" at the helm of consciousness.
The distinction between man and machine cannot be merely a functional one, for that does not hold up to scrutiny: we simply fit the category all too well at some fundamental levels. No, the distinction that John Connor wishes to make must be a moral one - so what are the values that would inform such a moral system? We can't tell much, but from the context of the scene, we can tell that what Connor objects to is rigid, non-negotiable adherence to procedure. Within this system, a military chain of command, there is no flexibility to allow perceived errors in judgment to be addressed before mistakes are made. This is the problem John Connor runs into, but it is a flaw that comes at the cost of what is typically a strength: a rigid chain of command, with experienced individuals giving direction from the top down, ought to result in rational decisions being carried out in the field by those who lack such experience but can take advantage of it by virtue of that very chain of command. Though individual soldiers may not benefit directly from this system all the time, the soldiers as a group are better off with it than without it. The system provides organization and a clear structure of power, without which a military outfit would have a hard time functioning.
But what happens when the system breaks down? Due to the very nature of the power structure, mistakes at the top cannot be addressed in real time. Only after they have been proven to be mistakes by direct experience can they be addressed, and by then it is too late to avoid their consequences. All that can be done is to try to design a better system (or at least learn a valuable lesson) so that similar mistakes will not be made in similar situations. Unfortunately, Connor does not have time for this - this particular mistake will ruin everything! It's one of those "end of the world" scenarios that simply doesn't happen in real life, but is excellent fuel for thought experiments. We can see, by Connor's course of action, what it is that he values more than the system that has failed him.
For lack of a better term, I will call it "messiness." Human beings, in a lot of important ways, are messier than the machines which seek to exterminate them. This messiness can be a liability, like the way that our bodies must grow from infancy to adulthood instead of being assembled in a factory, which puts constraints on our basic body plan as well as opening the door to all sorts of developmental mishaps and genetic defects. But, if harnessed correctly, it can also be a virtue: brains are not rationality machines or truth detectors, and so humans who possess them are by necessity fallible. If we recognize this fallibility, then we are able to doubt ourselves, and we can give the benefit of this doubt to others. While such good will may be taken advantage of (and it often is), it can also lead us to behave more humanely towards one another, to trust one another, and to get along with one another even when we aren't getting our way.
John Connor is appealing to the humanity of those who receive his transmission insofar as he is asking them to embrace their messiness: to doubt themselves and the system that has kept them alive thus far, to doubt enough to trust him and do something which, according to people who really ought to know, does not make sense.
Or maybe I'm reading too much into it, and Connor was just leading from the bottom up. After all, a leader doesn't always have to tell his followers what's true; he tells them what they need to hear. Maybe they just needed to hear some emotionally charged mumbo-jumbo to do what he wanted them to do, in which case Connor was simply using a dirty trick to "hack" their decision-making processes. And if that's the case, then he's full of shit and the humans are already no different from the machines, so whatever.