
Wednesday, March 21, 2018

Common sense versus humanity.

Sometimes I struggle to understand how normal people think: what facts they use, and how they arrive at ill-informed conclusions.

So people are upset that this person sold metadata derived from information users voluntarily uploaded to Facebook, after signing a user agreement they never read, including their drunk vomit photos and their silly comments about nothing?



But when the #NSA started stockpiling personal data on foreigners from the entire world (outside the USA) in its Utah datacenter, without probable cause or a warrant, that was met with indifference?


Sunday, March 4, 2018

A response to Stuart Glennan's "On our craving for generality"


In Professor Glennan's "On our craving for generality", he argues for an acceptable need for generality in philosophy, and he conjures Professor Wittgenstein's lone criticism of it.

Wittgenstein also stated:

People are deeply embedded in philosophical, i.e., grammatical confusions. And to free them presupposes pulling them out of the immensely manifold connections they are caught up in.

I like to paraphrase this by claiming that what he really meant was that philosophers are masters of self-confusion: within the folds of language, there are many interpretations in which to hide your theory, making skepticism impractical and falsification impossible. This is the malaise of philosophy, borne out by the accepted practices of its mendicants.

Glennan is right that generalization allows one to make more useful rules (associations, relationships, if you like), much as natural philosophy does, about the epistemological world, rules that apply to a wider context than would otherwise be possible. I don't see, however, how he can always be correct when philosophy started out 2,000 years ahead of science and has delivered so very little in comparison. This is not disputable by any performance metric one cares to measure: by time, by effort, by accomplishments.


Within the last 200 years, we have seen science-driven thought uncover billion-fold improvements in capability. The abacus may still arrive at instantaneous answers, but binary sequential computers can close to within a Zeno's distance of that goalpost. One can scoff at technology, but the models inherent in those systems have also advanced and accelerated in complexity and generality as they are improved, generation after generation. Contrast this with modern philosophical by-products.

And yet, philosophers remain unmoved. It should be frightfully unnerving for philosophers to see so many pass them by. That indifference, to me, speaks to a misapprehension regarding the notions of achievement and advancement. These are all symptoms of a protected workshop, unwilling to change.

Even within science, the electrical disciplines goad the biological disciplines to pick up the pace, as when Intel's CEO, attending a pharmaceutical conference, implored them to ramp up the effort because more is possible, faster.

Professor Glennan, in the aforementioned article, points out:

"Wittgenstein’s worries about the craving for generality are in some ways reminiscent of Hume’s worries about the principle of induction. Hume argued that inductive inferences are grounded in our unwarranted commitment to a principle of the uniformity of nature. We use past experience to make predictions about future experience, but this can only work if the future is like the past, and we cannot, on pain of circularity, establish by induction that the future is like the past. Nonetheless we persist with our inductions. It is just habit.
The problem with Hume’s way of putting it is that it suggests that in the past nature has always been uniform; we know it has not.

The real question is not whether the future will be like the past, but when it will be."
Let me pierce that assumption (circular logic amounting to false tautologies), and the philosophical cover it provides, by pointing out a simple proposition. While there are no absolutes like beauty and good, to propose these ideas as time-varying breaks neither generality nor specificity. Assigning limits to good or bad may exclude what lies outside the frame, but it also makes "better" and "worse" straightforward distinctions. Therein is a model. Claiming, as Hume did, that we cannot induce that this bread is as nourishing as the last may seem rational, but it evades the possibility that if we define the depth and breadth of what bread is, we can make a pronouncement within induction that makes sense. This is where science accelerated away from philosophy.
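To make that concrete, here is a minimal sketch; the bounds and properties are entirely hypothetical, chosen only for illustration. Once a definition fixes measurable limits on what counts as "bread", the inductive pronouncement holds for anything the definition admits.

```python
from dataclasses import dataclass

@dataclass
class Bread:
    """A candidate 'bread', described by measurable properties."""
    flour_fraction: float  # fraction of mass that is milled grain
    water_fraction: float  # fraction of mass that is water
    kcal_per_100g: float   # measured caloric content

def is_bread(sample: Bread) -> bool:
    """Hypothetical definitional bounds; anything outside them is not 'bread'."""
    return (0.4 <= sample.flour_fraction <= 0.8
            and 0.2 <= sample.water_fraction <= 0.5
            and 200 <= sample.kcal_per_100g <= 350)

def is_nourishing(sample: Bread) -> bool:
    """Within the definition, nourishment follows by construction: every
    sample that is_bread() admits carries at least 200 kcal per 100 g,
    so the induction never steps outside the defined frame."""
    return is_bread(sample)

loaf = Bread(flour_fraction=0.6, water_fraction=0.35, kcal_per_100g=260)
print(is_nourishing(loaf))  # True: the pronouncement holds within the definition
```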

What philosophy lacks is not generality; it lacks specificity. Ludwig Wittgenstein arrived at philosophy from engineering, and I can assure you, as another engineer witnessing the practices of the philosophical knowledge tribe, that what he found was lacking: not in the lofty goals, nor in the ability of the practitioners, but in the madness masquerading as method.

Badiou pointed out that truth and falsity must exist outside any one philosophy.

If so, then any philosophy is the right starting point to make the same inroads on epistemology as the others.

What the sciences developed that philosophy did not was a set of standards.


They are not what you might imagine, like a protocol or even the scientific method. They are instead bounded constants that explain the interrelations amongst many concepts. While mass in Newtonian models is incommensurable with mass in an Einsteinian model (in the language of Kuhn), it still provides a common reference frame that one can use to compare and contrast models and results.


These standards are mainly embodied as universal physical constants: Boltzmann's constant, Avogadro's number, the newton, the ampere, the hertz, and so on. Physical properties, tied to tangible, practical units of measure, that allow anyone's circular logic to depart from one constant and arrive at another. Many are arbitrary; they could be changed, and sometimes are: the length of a metre, the bounds of a second. If a model or proposition about these standards cannot be transformed into another, it becomes very easy to falsify. That exposes more error and truth than a messy system where ambiguity is used as cover rather than as a reason to define and refine. This system makes a mesh, a lattice, or a torus out of any circular logic. The transcendence isn't in the method but in the patterns it creates in understanding.
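As a minimal sketch of that mesh, assuming nothing beyond the published constants: the 2019 SI redefinition fixes the Boltzmann and Avogadro constants exactly, so the molar gas constant must fall out of the two, and any proposition whose R disagrees is falsified on the spot. The tolerance below is my own choice, for illustration only.

```python
import math

# Values fixed exactly by the 2019 redefinition of the SI base units
k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# Depart from two constants and arrive at a third:
R_derived = k_B * N_A  # molar gas constant, J/(mol*K)

# The independently tabulated CODATA value
R_tabulated = 8.314462618

# A proposition about these standards that cannot be transformed into the
# others is falsified immediately; this assertion is that falsification test.
assert math.isclose(R_derived, R_tabulated, rel_tol=1e-9), "proposition falsified"
print(f"R = {R_derived:.9f} J/(mol K)")
```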

Now, Glennan might counter with late-Wittgenstein (also from the Blue Book):

The idea that in order to get clear about the meaning of a general term one had to find the common element in all its applications has shackled philosophical investigation; for it has not only led to no result, but also made the philosopher dismiss as irrelevant the concrete cases, which alone could have helped him understand the usage of the general term. 

Late Wittgenstein was a study in paradox compared against early Wittgenstein. At first, Wittgenstein was fortified with an optimist's effervescence, convinced that symbols and systems had no limit in aiding man's comprehension. By the end, he had drifted so far into the riddles of words (language is only one knowledge modality) that he had lost his faith in a better tomorrow.

This all came about not despite his talents and dedication. I suggest his conversion was due to the philosophical company he kept, the ill recognition of his vision, and the plodding, pedestrian nature of other minds unwilling to extend their reputations to achieve a better model of philosophy.

As Bertrand Russell wrote of Wittgenstein:

...every morning he begins his work with hope, and every evening he ends in despair

Logical positivism was the attempt to bridge back to philosophy using the same successful techniques that have engorged natural philosophy with more knowledge than philosophy has achieved in four times the time. More's the pity that it was gradually excised and replaced. Given the progress made, was that wise for the discipline?

Yes, Kant dictates that experience is king, and while physical laws remain temporary theories, their lifespan may exceed that of the solar system, if not stretch to infinity. A satisfactory state of affairs to pass on to our grandchildren.

Science has held onto its progress because of formalized definitions and refined common reference frames, despite the same tribalistic, political, and sociological difficulties of internecine rivalry inherent in all academia.

When and if string theories supersede the relativistic models based on Lorentz transformations, which themselves superseded Newtonian Platonic calculus, then mankind is better off than if no one had extended upon the standards.
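Each supersession keeps the old model as a limiting case rather than discarding the standards outright. The Lorentz transformation, for instance, collapses back to the Newtonian (Galilean) picture when velocities are small compared with light:

$$x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}$$

For $v \ll c$, $\gamma \to 1$ and $vx/c^2 \to 0$, recovering the Galilean $x' = x - vt$ and $t' = t$. The new standard contains the old one, which is what makes the extension an advance rather than a replacement.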


Specificity doesn't proclaim that common reference frame logic is infallible, nor that any one set of arguments cannot be demonstrated false when compared to greater knowledge attained elsewhere. Falsifiability is still the goal, but the way to achieve it at every step in science is understood even if the ultimate outcome is not. Older scientists are proven wrong as new theories are proven better. Better may not be quantifiable in absolute terms, but the practical limits are widened nonetheless.

Let me represent the value of common reference points with an analogy.

Suppose that constants are like handholds on the face of a steep mountain. One can advance up the mountain by building a logical argument that clings to one of these constants. If science were a pre-climbed mountain, Mount Science perhaps, then new climbers would arrive at base camp with many visible, well-understood, and solid points to work from. If one climbs through a point but arrives at a dead end, some impossible vertical, then one can traverse back to another constant in pursuit of a further plateau. The mountain is nowhere near conquered, but there are many beaten paths to ascend in comfort and safety, making attempts at higher points more achievable in a lifetime.

Now, imagine what today's Mount Philosophy looks like.