On the 200th birthday of Charles Darwin, it’s interesting to note that so many historic figures are given their own holidays that it’s impossible to keep them straight. Presidents’ Day is the canonical example. Yet Darwin, certainly the most important scientist to emerge from the UK (yes, including Newton) and perhaps the man who had the greatest idea in all of history, has no holiday in his honor. It’s ironic.
February 12, 2009
January 6, 2009
I declare this distinction unofficially dead. 300 years ago, choosing the wrong form could bring you into disrepute, forever ending your status as a lady or gentleman. Today, knowing when to use one or the other merely marks a person as a word watcher.
Imagine asking your child “Whom did you play with at school today?” (This usage is alleged to be correct.) If you can’t imagine teaching this to your children, it’s moribund if it isn’t already dead.
Even the staunchest “defender of the language” will admit that questions just aren’t begun with whom. William Safire, a more sober advisor on usage, suggests: “When faced with a choice between the pedantic and the incorrect, recast the sentence.”
The “correct” usage, by the way, follows a simple rule: try a different version of the sentence. If you would say he, use who; if you would say him, use whom. Who and he are subject pronouns (as was the archaic ye), while whom and him are object pronouns (as, historically, was you). You would say “you love him,” not “you love he,” and so the question ought to be “whom do you love?” That would have made for a disappointing blues song, though.
Finally, let’s get to the meat. Whom is dead, but how can we really tell? Native English speakers don’t use it. Like “Ebonics,” whom is simply not Standard English. Languages evolve: some constructions (ye, whom, thou sayest, verb-last word order as in “with this ring I thee wed”) die, and others are born.
No native speaker says “I running” or “John Mary kissed” or “who was playing with John and?” Even 3-year-olds avoid these errors most of the time, because these really are errors. Refusing to split one’s infinitives may be a sign of education, but it’s not a part of the English language. If it were a fundamental part of the language, nobody would have to explain it to you, because you’d already know and obey it.
October 9, 2008
Might there be life on other worlds?
We know the deck is stacked against life coming into being from non-living molecules. We know this because it doesn’t happen very often – not last week in Denmark, or the week before, and not in the laboratory. In fact, it’s only taken hold once.
Erasmus Darwin suggested that all living things are cousins, that life is a continuous filament stretching back to the dawn of time. My sister and I have a common ancestor – we share parents. My uncle and I, also, have a common ancestry, this time stretching back two generations. But if we go back a million million generations, the family gets bigger – my neighbor’s dog, the local redwoods, whales that swim off the coast, and house flies all share a common ancestor with the E. coli bacterium. We know this because all living things pass their genetic code around in RNA, use the same 20 amino acids, and build polymer chains by adding groups of three atoms, then removing one. All of these picky quirks, and many others, point convincingly toward inheritance.
But the odds aren’t stacked that heavily against genesis. Primitive Earth was a hellish ball of molten rock and lead, constantly bombarded by asteroids the size of Texas. As the planet cooled, extremophile life may well have begun, and been wiped out, by these kinds of “sterilizing” events (this is one theory, with little hope of ever being verified). After the Hadean era, prebiotic Earth was solid and stable enough to support life, at least in principle, around 4.1 billion years ago. There’s questionable evidence of bacteria from 3.85 billion years ago, although the artifacts may not be relics of biology. In any case, living things took no more than two to three hundred million years to become abundant.
That’s long compared to a human lifetime, to be sure, but it’s very fast in geological time. Life took hold very quickly. Given favorable, or at least tolerable, conditions, Earth did not have to wait long. Haste might just be the luck of the draw (the environment certainly was), but this does suggest that genesis isn’t all that unlikely to happen. Given enough time and resources, the laws of chemistry may even want life to come forth. (This is entirely in keeping with my faith that God loves us all, and set the universe up to breathe life into itself.)
If this is the case, given the incomprehensibly vast number of planets and moons sprinkled through the cosmos, by all rights life must have come into being elsewhere. We’ve never observed it, and we never will – the great distances between stars mean we’ve certainly never been visited. But the number of opportunities for it to arise on trillions of planets over 14 billion years means we should be shocked if we truly were alone in the universe.
“A sad thought is that life is probably relatively common, but yet so rare and separated by such distances, that no two outposts of life might ever encounter each other.” -Richard Dawkins
For further reading, see the Iron–Sulfur World theory that life may have started in hydrothermal vents on the ocean floor, or the RNA World theory that genetics predated metabolism. In any case, abiogenesis research promises insight into the most important questions about our place in the cosmos.
June 26, 2008
Am I? It sounds like a dumb question, both the title of this post, and whether I’m a Native American. After all, my family came here from Russia. But I was born here, in California, as were my parents. (That’s not quite true – my father was born in Pennsylvania.) More or less by definition, I’m not only native and American, but I’m a native and not a naturalized American. The only sense in which I can’t properly call myself a “Native American” is the sense in which that phrase has taken on a specific meaning – one that doesn’t include me.
I see four possible ways to define a Native American, and we’ve just struck the first of them down.
- Being born in America, perhaps anywhere in the New World.
- Being of the line that became human in the Americas. In this sense, we’re all Native Africans, and we’re all equal.
- Being descended of the first (or first surviving) humans to reach the Americas.
- Being of the same “race” as anyone we would collectively term a Native American.
The racial definition (#4) is the one in common use, and it’s not very much unlike #3. The differences are subtle, but interesting. Most Westerners would say #3 is the correct answer, and interestingly, it takes us back to silly #2. How is this possible? Let’s ask another question.
Who Were the First Americans?
This is a hotly debated topic, although much of the controversy has been settling down. There are two possible explanations (three if you accept the Mormon doctrine that the Garden of Eden was in Missouri). Either the first humans to set foot in the continents of the New World were Siberians crossing the Beringia steppe, probably in pursuit of large game, or they were Australians or Polynesians who survived a disastrous mistake and whose descendants have been driven almost to extinction in Tierra del Fuego.
There are uncontested artifacts from human settlements in Alaska, from about 13,000 years ago. In fact, there are many. What this means is that nobody can doubt people really did cross Beringia, the Bering Land Bridge. But were they the first to set foot here?
In the most extreme example, a single pregnant woman with a male fetus may have survived an accidental journey to South America from Polynesia. The first use of boats must have happened around 30,000 to 50,000 years ago, by people reaching Australia from southern Asia. This wave of human expansion – the first to reach new land, never reached by our protohuman ancestors – continued to New Zealand and surrounding islands.
Many of the tribes on these islands kept in contact with their neighbors, some near and some far. It’s unlikely, but entirely possible, that one of these boats, aimed at a not-too-distant island, lost its way – possibly in a storm. Polynesians today can read the waves and “sense” islands over the horizon the way many of us can read a face and see anger or anxiety. This is a finely honed skill, almost certainly not available in those ancient times. And yet, we should remember we’re talking about the world’s first seafarers. Even then, probably a large group set out, and a very small one survived the journey, if such a journey ever happened. When they landed, it was on the distant shores of a New World.
This “founder population” peopled the lands, and then was wiped out almost completely.
Most of the supposed pre-Clovis finds are in South America (Chile, Brazil, etc.), as we’d expect. Monte Verde has been one of the more operatic chapters in our story, but is today widely accepted to have predated Clovis by at least 1,000 years. Humans lived in this part of Chile maybe 14,500 years ago. Among these finds are not only tools, but also human skeletal remains. Skulls found at many of these sites clearly resemble Australians, rather than the Mongolians who would become the “Amerindians” of today. Mitochondrial DNA is ambiguous here, but modern-day Tierra del Fuegans could be descended directly from the Polynesians they so closely resemble.
There are few of the coastal settlements we would expect to see on a “new” continent being populated by seagoing people. Of course, this was during the last ice age, when much of the Earth’s water was locked in glaciers. The sea level was much lower at the time, so the ancient coastline where we’d expect to find these sites is underwater today. We need scuba divers to thoroughly explore the area!
What of the Bering Land Bridge (Beringia)?
The Bering Sea today is about 50 miles across at its narrowest point, with a depth of around 400 feet at its deepest. During ice ages, as we’ve seen, the waters retreat and lowlands rise up from the sea. The channel between Siberian Russia and Alaska opens up this way, exposing a land mass connecting the continents. This happened countless times before mankind set foot on the bridge, and it will happen again – just not within our lifetimes.
Canada was a sheet of ice at the time, and not much else. It would be at least a millennium before our heroes could cross to the south. When a corridor finally opened, running north to south across the Canadian ice sheet and letting out in the inland plains near Edmonton, they found large animals with no fear of humans. These included elephants (well, mammoths, anyway), horses, giant sloths, saber-toothed tigers, and more. Camels seem to have evolved here, as did the llamas, and crossed the same bridge in the other direction, long ago under a very different climate.
At the all-important Clovis site, a mammoth was found with a clearly man-made spear tip in its ribs. Wherever mankind went, we managed to eradicate large game species. This is the “megafaunal crisis of the late Pleistocene,” when animals that had survived millions of years died out within a few hundred. Giant lemurs, cow-sized marsupial cats, and others fell victim around the globe, and it would be a surprise if we didn’t see the same thing accompany the colonization of the New World. So why don’t we see it on the same scale in South America?
In any case, the Austronesian theory says the Beringia crossing was one way people came to the Americas, possibly one of many.
Who Deserves to be Called Native, Then?
Personally, I’d say anyone born or raised in the Americas, but I’m probably trivializing the question. If the hypothesis is true that the New World was peopled through successive migrations, then perhaps all of their descendants should be called natives. Or none of them, since all came from somewhere else, ultimately. This is probably the wrong answer, though, because some level of evolution continued for many thousands of years, adapting people to their environments.
June 12, 2008
A friend told me we should round up all the spammers, and throw them off the Golden Gate Bridge, down to the sharks below. Trouble is, others will take their place. Spammers, sadly, aren’t a hereditary breed – it’s a learned behavior. (Almost Lamarckian!)
Even if you don’t agree about where spammers come from, we’ll have to agree there are too many of them. Spam is a very successful meme, a unit of cultural information that’s better than most at copying itself. In the realm of intellectual selection, spam is to be found far and wide in the meme pool. Spam is perhaps a parasite working on (or against?) the get-rich-quick meme – if people stopped wanting a quick and easy buck, spam would vanish overnight.
What’s this rubbish about it being the “dominant species,” though? We eradicated smallpox, a more difficult and more important thing than going to the moon, and yet we’re losing the war against spam. It’s beating us. America gave firewater to the “Indians” to take their land – now, in some places, native casinos are using greed to take modern culture’s money. Spam is doing much the same thing.
Spam is a concept, an idea: that by producing a lot of useless drivel, a person can strike internet riches. It comes in a few varieties, from the email sitting in your inbox, selling you viagra and mortgages, to the affiliate and “search engine friendly” links in a forum or a blog. It’s PayPerPost, where a blogger can beat the 1849 gold rush by telling you how wonderful a sponge and a bank account are. It’s Digital Pointless, where you can buy other people’s Wikipedia and eBay accounts. Fine, that’s what spam is, but what are we? We’re its hapless accomplices – machines, some of us, that spam uses to copy itself.
All of this is Darwinian. If you have variation (spam, job, investment, invention), heredity (new spam is very much like old spam, but refined in its sales pitch or its delivery), and selection (spam filters, forum moderators, people seeing through it), you have evolution. This works in biology (genes), and it works in ideas (memes). If you have a struggle for existence among things that copy themselves, the one that’s better at making copies will come to dominate, to fill its world. Ladies and gentlemen, this is exactly what spam is doing – a digital thing filling its internet world. One of the dominant species in the meme pool.
This proves our point – bad spam, the least fit, failing in the struggle for existence.
March 17, 2008
March 13, 2008
SQL Server supports a cooperative, non-preemptive threading model in which the threads voluntarily yield execution periodically, or when they are waiting on locks or I/O. The CLR supports a preemptive threading model. If user code running inside SQL Server can directly call the operating system threading primitives, then it does not integrate well into the SQL Server task scheduler and can degrade the scalability of the system. The CLR does not distinguish between virtual and physical memory, but SQL Server directly manages physical memory and is required to use physical memory within a configurable limit.
If you’re scratching your head, thank your lucky stars you don’t need to understand this gibberish. I’ve been having to focus on SQL Server and .NET Integration, also known as SQL CLR integration. A lot of people have made fairly bad choices in the very, very recent past. Just because you can write your stored procedures now in Visual Basic doesn’t mean you should. Obviously it cuts both ways, and that doesn’t mean you shouldn’t – the trouble seems to come when people can’t decide whether they should or not.
Helping to guide that decision, I found the first paragraph above, which offers some frightening guidance on the matter. At first glance it seems to suggest never using .NET, but that isn’t really the case. What it really says is that long-running code that might interfere with SQL Server’s thread scheduling can be bad. Code that spawns new threads really shouldn’t be hosted in SQL Server. P/Invoke calls can hurt.
But XML processing is an example of something that does none of those things. And while SQL has good XML support, for some operations .NET is better. Also, the set-based nature of SQL compared to the procedural, object-oriented nature of C# means you have better control over caching in .NET; by porting this type of sproc, you can parse an XML document in memory once, instead of once for every time you need to access it.
After all, it’s not Sybase in 1997!
March 7, 2008
Hungarian Notation is a naming convention whose main rule is that the prefix in a variable name should be longer than the name of the variable itself. For example, should you find yourself needing a string called Foo, you might call the variable that refers to it gpnzstrFoo. You would know this because:
- g = global variable
- p = pointer, because in C++ everything is a pointer
- nz = null terminated (sometimes sz)
- str = string value
- Foo = the actual name of the variable
This is a mouthful and a lot to memorize. Further, it means that when you refactor your code and change the type (you might encapsulate string handling into a SuperString class, as a schoolboy example), you have to either rename the variable at every one of its uses – which happens in O(n) time, with n being the number of uses – or leave behind a name that now lies about its type. Modern development environments offer refactoring services to safely accomplish the rename, but they also provide other services that make HN redundant and unnecessary.
Hungarian gives an illuminating view of the history of software development, and of Windows in particular. In the old days one would pull down the Windows “header file” (#include “windows.h”) and delve into its contents, using Microsoft as an example of successful large-scale development done right. In fact, Microsoft is often credited with birthing Hungarian, in order to make scrolling work in Word and Excel. Today, like Windows 3.1, HN is relegated to the scrap heap of history.
Here is the part that might have fellow web site producers call me a spammer – I’m going to copy and paste, from a treatise on Hungarian Notation, some text which explains the failings of this convention much better than I ever could. (Borrowing liberally from other sources, sometimes called “scraping,” indicates a lazy webmaster and is typically associated with made-for-AdSense spam blogs. In this case, however, I’m simply trying to point my dear readers to a helpful resource, for those of you who are interested in this topic.)
My problem with Hungarian Notation is more fundamental and stylistic – I think it encourages sloppy, sprawling, poorly decomposed code and careless, ill-coordinated maintenance. Simply put, if your namespace is so polluted that you need a cheap trick like HN to keep track of your variables, you’ve made a terrible mistake somewhere. You should never have so many variables and constants visible at one time; if you do, then you need to review either your data structures, or your functions.
This is especially true for code written under either the OO or FP methodologies, as a primary goal in each is to isolate (in OO) or eliminate (in FP) those variables that are not immediately needed.
HN notation also presents a problem in the case of OO, in that it interacts very poorly with polymorphism. It is actually undesirable to know the class of an object in many instances, but at the same time marking it with the parent class tag is misleading and contradicts the goal of HN. As for marking globals as separate from locals, why on earth do you have any globals in the first place? 😉 — JayOsako
I agree that HungarianNotation is bad because it’s a crutch for overly large namespaces. That’s why I use single character variable names to force myself to write clear code. — EricUlevik
My Response: 26 variables? At once? How can you manage that?
Seriously, HungarianNotation is only a symptom of a larger problem. Too many programmers see the solution for excessive state to be more state; they end up with variables to track variables to track variables, all bound up with arcane rules for naming that are supposed to be indicative of their meaning (which it rarely is). The point isn’t so much that you have to limit the number of variables, but rather that large proliferations of variables in the local namespace is a sign of a design mistake. Why would they all be visible at once? Aren’t there some abstractions they can bound into? Why do you need this particular bit of state visible here, anyway?
The rule should be simple: if you don’t need to see it, you shouldn’t be able to. Period.
Like VisualTools, HungarianNotation is not SoftwareEngineering, its TheIllusionOfSoftwareEngineering. It’s an easy and comfortable fix that looks good because you can DoItNow? and you don’t have to do all the hard thinking that comes from trying to do the RightThing. The time you ‘save’ using it will be more than spent later trying to maintain the resulting morass. — JayOsako
This is very well put, and pretty much sums up my feelings on the matter. — KevlinHenney
Any interested programmers are encouraged to read the original page. This bit cracked me up, since we’ve mentioned the historical implications of the subject:
What I think you really want is for your interactive development environment to give you a rollover (ToolTip?, for you Windows junkies, or minibuffer message for you Emacs junkies) that shows you the declaration of this variable at every use point when your cursor goes over it. — RusHeywood
Visual Slick Edit (Windows) does this.
VC6 has something approaching this. They copied it from VB – when you’re typing code to call a method, it pops up with a prompt window showing the names of the arguments. It’s not perfect, but it’s a step in the right direction. — RogerLipscombe
February 28, 2008
Think about evolution, not in the sense of where we came from or how we got here, but as a general concept. You can see the evolution of software, from punch cards to text only displays and a command line user interface to where people are using VRML to browse information in a database. Whether this is actual Darwinian evolution or personal growth and maturity is open to debate. But evolution can be used as a word to describe this style of growth, and if we look at things like software or society, it works on a slow time scale.
When Louis Armstrong (king of jazz, inventor of the modern form we recognize as jazz) was born in 1900, racism was taken for granted. It wasn’t seen as racism, just the natural order of things. Mr. Armstrong had a talent so big it couldn’t be contained in his ghetto, and in fact spilled over into white society. Before each performance, he said (non-verbally), “I am here to entertain you, but I am not your equal.” This is what he had to do to be recognized in a world that enforced racial division with lynchings.
This continued until the civil rights movement. But in the last 40 years, things have turned around drastically. America’s attitude toward racism is Puritanical. In the way sexuality, especially adultery, was once met with violent rejection, today racism is shunned at great length. What a radical change in the span of 40 years! Since the days of JFK (who did not implement civil rights change), it’s not just the world that’s changed; it’s us as well.
Is racism dead and behind us? No, not completely. There’s ground left to cover, but most of the work has been accomplished. A black person makes most but not quite all the salary a white person would make for the same work – this is wrong, and yet most “minorities” have never experienced racial violence. The late 1960s gave rise to the Black Panthers, but the situation today simply isn’t dire enough to provoke such blowback.
The point in writing this isn’t to congratulate, turn on the TV, and call it a day. Retrospective is an important part of planning. And looking back, the truth is we’ve done well. We need to continue, but we need to recognize success where it exists (and learn from it).
February 21, 2008
Read this. Or, if you don’t want to, I’ll summarize. A state senator who endorsed Barack Obama was interviewed by TV idiot Chris Matthews. Chris asked “Name some of his legislative accomplishments… name any…” and there was nothing but dead air. That’s fair – journalists are supposed to ask tough questions, especially when the stakes are this high.
But we’ve been poisoned to believe that what somebody says on television dictates objective reality. For example, if Matthews had asked the man to name a few continents and he hadn’t been able to come up with anything but North & South America, Europe would stop existing.
This is the dumbing down of America. This is how Bush operates – just say Iraq is turning into a democracy, that we don’t torture people, that weapons of mass destruction threaten American national security. It’s true if it gets said on TV.
In 1984, the ruling party put out a new dictionary every year, each with fewer words than the last. After a few generations, the people were only able to think smaller and smaller thoughts. It’s the same thing TV news does, whether it’s Fake News with Bill O’Lielly or MSNBC with Keith Olbermann. They poison our brains so we can’t recognize logic. After a generation of this, Americans won’t be smart enough to operate a power plant, and we’ll be in the dark literally instead of just figuratively.