Disclaimer: The views expressed here are mine alone, and not necessarily those of my employer or any organization with which I am affiliated. These views are not intended to advertise or offer legal services to the reader, or to be relied on as practical advice in any respect. Apparent statements of fact may or may not have been specifically researched beforehand. Unless I expressly indicate to the contrary, the material appearing here is original work, subject to copyright protection. Any reference in the text to specific individuals or companies who are not explicitly named is unintended and purely coincidental.
Comments? You can (try to) contact me at admin (at) limitsofknowledge (dot) com. Keep in mind that I'm still learning the technical aspects of blogging, and do have a demanding job, so don't be offended if it takes me a while to respond.

Tuesday, October 3, 2006 -- 9:39 pm

Where do ideas come from? I remember back when I was very young, I read that the nervous system contained three types of neurons: sensory neurons that collected information from the environment, motor neurons that controlled the muscles, and "associative neurons", in the brain, that connected the two. Even at the time, it was clear to me that "associative neurons" explained nothing. Associate what? Associate how? These days it seems there's been some improvement in the field, so that we realize that different parts of the brain do elementary processing of stimuli at successively higher levels of abstraction, and we are beginning to understand, whether through modeling or the study of trauma victims, some basic components of our mental apparatus. Eventually, I suppose, neuroscientists will have to grapple with the notions of thought and consciousness, whatever those are. But until such time as they can answer the question for us, I guess I'll have to try to come to terms with it myself.

Again, where do ideas come from? Is there a reliable methodology for achieving insight? Or is it dumb luck, or divine intervention, that provides us with our keenest creative moments and our most inspired understandings? For my part, I'm inclined to dismiss the dumb luck and divine intervention hypotheses. People sometimes say that Mozart was a genius, and that his gifts must have been divinely inspired. And the same can be (and has been) said of many great composers. But it's also clear that Mozart's style and musical sophistication evolved over time, and his last music is clearly superior to his first. He learned things as he got older. If he were simply extraordinarily lucky, we would expect his best works to be scattered at random throughout his career, rather than becoming increasingly frequent as he got older. And if he were merely a vessel for divine dictation, we wouldn't expect any change in the quality of his work over time. So coming up with ideas, even musical ideas, would seem to be a cognitive process.

So what are the methodologies for coming up with good ideas? I'm certainly not privy to Mozart's, and don't have enough of my own to claim any expertise. For my part, I notice that people tend to be startled when I juxtapose seemingly disparate notions and try to draw connections between them. I also notice that I don't so much deliberately craft my ideas as sit quietly and listen to them as they percolate to the surface. (Deliberately crafted arguments so often smack of legalisms and sophistry, rather than going to the heart of the matter.) But this suggests that there's a part of the brain, one I'm not privy to, that is always sifting and correlating, and occasionally passing something along to, well, *me*. This is presumably a somewhat mechanical process, and therefore one that is subject to formulation -- and, given sufficient resources, implementation as a computer program. Although I may sketch my ideas for how this can be done at a later point, in the interim I leave it as an exercise for the reader.
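To give just a hint of the flavor I have in mind (the real formulation remains the exercise), here is a toy sketch of that unattended sifting loop: blindly juxtapose pairs of notions, score each pairing, and pass along only the ones that score well. Everything in it -- the concepts, the feature tags, the scoring rule -- is invented purely for illustration, not a claim about how the brain actually does it.

    import itertools
    import random

    # Toy "concepts", each tagged with a few features. Both the concepts
    # and the tags are made up; a real system would have to learn them.
    CONCEPTS = {
        "ant colony": {"decentralized", "emergent", "biological"},
        "stock market": {"decentralized", "emergent", "economic"},
        "symphony": {"structured", "temporal", "aesthetic"},
        "legal code": {"structured", "rule-based", "social"},
        "immune system": {"decentralized", "adaptive", "biological"},
    }

    def affinity(a, b):
        """Score a juxtaposition: total overlap is boring, zero overlap is
        noise, and partial overlap is where interesting connections live."""
        shared = CONCEPTS[a] & CONCEPTS[b]
        combined = CONCEPTS[a] | CONCEPTS[b]
        overlap = len(shared) / len(combined)
        return 4 * overlap * (1 - overlap)  # peaks at half-overlap

    def sift(rounds=50, threshold=0.8, seed=None):
        """Quietly try random pairings; surface only the promising ones."""
        rng = random.Random(seed)
        pairs = list(itertools.combinations(CONCEPTS, 2))
        for _ in range(rounds):
            a, b = rng.choice(pairs)
            if affinity(a, b) >= threshold:
                print(f"hmm: {a} / {b} "
                      f"(shared: {sorted(CONCEPTS[a] & CONCEPTS[b])})")

    if __name__ == "__main__":
        sift(seed=1)

The point isn't the scoring rule, which is arbitrary; it's the architecture: a loop that proposes and filters without supervision, so that only the survivors ever reach, well, *me*.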

Monday, October 2, 2006 -- 10:14 pm

Just a question: in light of the series of NY Times articles over the past few days on the political workings of the House of Representatives, how much does the politics in Congress really differ from the tribal politics of indigenous societies we find in New Guinea, the Amazon, and Africa? Perhaps sometime when I have more time, I can explain exactly what I mean. (And I intend no offense to those societies, either, by comparing them to Congress; as usual, I'm on the hunt for human universals.)

Monday, October 2, 2006 -- 10:06 pm

It's a telling sign of how foreign logical thought actually is to our natural ways of thinking that clearly intelligent people -- doctors and attorneys, for example (though fools are well represented in both professions) -- will resort to glib expressions as a kind of mental shorthand that completely glosses over the fact that what they're saying is total gibberish. Once a doctor explained that he had concluded that a woman's accident was the cause of her chronic pain by relying on the principle "post hoc, ergo propter hoc", apparently not realizing that he had just quoted and relied on a classic logical fallacy. An attorney once dismissed a counterexample I had raised in discussion with the comment that "the exception proves the rule". And there's my perennial favorite, the college student who was directed to research the principle of non-contradiction, and couldn't find it. But he did find references to the principle of contradiction, which he concluded must stand for the opposite proposition.
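(For the record, the "principle of contradiction" and the "principle of non-contradiction" are just two names for the same law -- the one that says a statement and its negation can't both hold. In the usual notation, for any proposition P:

    \[ \neg\,(P \wedge \neg P) \]

Not, whatever the student decided, a pair of opposite propositions.)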

This anecdotal evidence is probably not enough to conclude that logic is dead. But is it falling out of favor? I realize that we no longer live in the heady days of the Enlightenment, when it was assumed that everything could be deduced from Pure Reason. Gödel's incompleteness theorems did as much to drive a stake through that conviction as anything did. (True, Gödel didn't show that logic was irrelevant, only that no list of assumptions we could write down or mechanically generate can be used to deduce every true statement of arithmetic without opening us up to contradictions; but I suppose that would be enough to convince the masses that logic was altogether unworthy of worship anyway.)
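For anyone who wants that parenthetical stated with more care, the first incompleteness theorem, in its usual modern (Gödel-Rosser) form, runs roughly as follows -- the notation is the standard one, nothing of my own:

    % If T is a consistent, effectively axiomatizable theory that
    % interprets basic arithmetic, then there is a sentence G_T with
    \[
        T \nvdash G_T
        \qquad\text{and}\qquad
        T \nvdash \neg G_T ,
    \]
    % i.e., T is incomplete: it leaves some arithmetical statements undecided.

All three hypotheses matter; drop any one of them and there are complete theories satisfying the other two.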

All the same, you would hope to find more rigor in social discourse than we do. Yet it's actually a rare treat to stumble upon a tightly reasoned argument, like a mathematical proof, where all the assumptions are explicit and supported by evidence. It's far more common to have to contend with ad hominem attacks, and accusations of "immoral", "disingenuous", or "laughable" positions. Is this because people have gotten lazy? Or is it because, although it would be nice to have the luxury of elaborately constructed syllogisms, it's just not cost-effective to invest that kind of time and thought into our social discourse? Or is it that we realize, consciously or not, that we don't actually have the information we need to support the positions we favor?

Sunday, October 1, 2006 -- 9:26 am

Languages seem to vary a great deal not just in nuances of vocabulary but also in the nuances we are allowed to make grammatically explicit. Navajo and Yup'ik Eskimo can pack very subtle views about time, motive, location, and other factors into just a few phonemes. Instead of the demonstrative pronouns "this" and "that" in English, or the common "this near me", "that near you", "that near neither of us" distinction we find in Hawaiian and Yucatec Mayan, Yup'ik has roughly thirty different demonstrative pronouns specifying location, visibility, and range of possible motion. And whereas Navajo has a rich assortment of grammatical forms for specifying the tense, aspect, and habitual nature of an action, other languages, such as Arabic, dwell largely on aspect, and still others, like most Romance languages, seem more preoccupied with arraying events on a timeline.

It's well known that many languages lack definite articles, indefinite articles, or both, and a language lacking them (Latin, Russian) can make do, in a pinch, by resorting to its demonstratives (for "the") or the number "one" (for "a") to convey roughly the same idea, although these words carry more emphasis than is strictly intended by the speaker. German seems to be a language in transition in this regard, whereas English has wholly severed "the" from "this" and "that". As languages evolve, many of these demonstrative pronouns in fact become definite articles, and other words take on demonstrative significance -- as seems to have happened in the Romance languages in particular. There are also languages, such as Yup'ik, Farsi, and Turkish, that seem to take another route, in the sense that their grammars mark definiteness only on direct objects. In Yup'ik this marking is accomplished through the choice of verb forms, in Turkish through the case endings, and in Farsi through use of the enclitic particle را after a definite direct object. Is this a halfway step, or a different path altogether? I've also noticed that in the same way, many languages do not have distinct words for their interrogative and indefinite pronouns: كم in Arabic means both how much? and some, ji3 in Mandarin means both how many? and a few, and τίς/τις in Ancient Greek means both who? and someone, distinguished only by the accent. (In Yup'ik, kina also means both who? and someone, but without ambiguity because Yup'ik uses a special interrogative mood for content questions.) This leads me to formulate a number of questions.

I realize, of course, that it may not be possible to formulate universal historical rules in this regard, especially in light of the apparent divergence in favor of marking definiteness solely on direct objects that some languages seem to show. And I don't yet know how languages like Icelandic and Basque, which indicate definiteness through noun suffixes rather than as separate words, fit into this overall scheme.

I should probably make a disclaimer here, since notions of language "evolution" will probably get people excited and/or insulted. I'm using the word "evolution" strictly in the sense of "changing over time", rather than in the accepted biological sense, or in a judgmental or teleological sense (i.e., suggesting that languages that are more "evolved" are somehow morally, culturally, or aesthetically superior, or superior simply because they're older). The choice of terms is admittedly sloppy, and the metaphor inexact. Although I have my reasons for finding some languages more interesting or exciting than others, which reasons I may explore at another time, it's also my view that every language is one to be proud of, and must be perpetuated.