Friday, August 29, 2008

IBM clouds computing - and it's spreading

IBM is investing hundreds of millions of dollars in cloud computing.

In practice, this means $400 million for two data centres, in North Carolina and Japan. IBM has moved strongly from selling computer hardware to I.T. services and software, and these data centres are very much in keeping with their services strategy.

Two points to note, though. First, they announced this as part of their Project Big Green, an initiative to significantly improve the energy efficiency of data centres.

Second, cloud computing. Is this revolutionary, or just a forward movement of the concept of data service centres? At its most basic, cloud computing is just an Internet-based abstraction of I.T. resources, ie you locate your resources - data (storage) in particular - 'out there' on the Internet, rather than in front of you. It's been done before, in both storage and processing - and more recently in software-as-a-service. In this sense it is merely stepwise progress.


But I would contend (as this article does) that cloud computing is - in totality - more of a paradigm shift. Your traditional EDS-type data centre involved direct links from your head office to your outsourced hardware at an EDS site in Strathfield or somesuch. The abstraction behind cloud computing involves several other factors, including:
- a degradation in transmission efficiency, from direct links to IP-based connections;
- different security issues;
- greater flexibility in incrementing (or decrementing) resources;
- location-independence.

That last is theory; in practice, many organisations may feel more comfortable with some certainty in terms of where their data resources are housed.


But stripped back, cloud computing is simply about having access to your data anywhere, anytime. One of the more dramatic impacts of cloud computing is where it is becoming not just about business - but personal too. My life is rapidly becoming stored as data: music, photos, records, writing, archives, etc. Right now, it's mostly centralised - for practical reasons. But in toto, my data/life is not just stored on my computer: it's also located on my PDA, memory sticks, SD cards, external storage disks, CDs, DVDs, etc. Some of this is not likely to change, and will remain scattered. But I hope to get to a point where I don't have to manage the storage, backup, and retrieval: for my core information (that which is vital, or that which I may want to retrieve on the run), I can just call for it - from the cloud. And as much as anything else, it will be the safest, easiest way to save that which is important to me.

This is all likely to become more and more pervasive. Cloud services will be offered by our telco (ISP), bundled in with existing services. Simple storage is quite commodified already; bandwidth (to upload and download) will not be the bottleneck it once was [I hope this last part is not just bravado on my part! - I suspect bandwidth investment and bandwidth demand will be constantly racing to keep up with each other].

A vision of the future that is available now, limited only by the capacity of service providers to productise supply and do at least some of the demand-leading. And they have proven pretty adept at that in recent years.

Thursday, August 28, 2008

Intel's wireless power

Intel has demonstrated a wireless power system sufficient to power a 60-watt light bulb, which they say is enough to drive a laptop. (Report here.)

They say the power is transmitted via magnetic fields:

"It turns out the human body is not affected by magnetic fields; it is affected by elective [sic - electric] fields. So what we are doing is transmitting energy using the magnetic field not the electric field."

I'm quite rusty on the physics, but I imagine it might involve some sort of inductance, which involves the transfer of energy between magnetic and electric forms.

Looking at the bottom of my laptop, I see 19 V by 4.74 A, which means it draws about 90 W. That's the right order of magnitude, and mine would not be the most efficient model on the market.
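
Spelling out that back-of-envelope arithmetic as a trivial sketch (the figures are just the ones from my laptop's rating plate):

```python
# Back-of-envelope check of the laptop's rated power draw: P = V * I.
voltage = 19.0    # volts, from the rating plate
current = 4.74    # amps, from the rating plate

power = voltage * current
print(f"Rated power draw: {power:.1f} W")   # prints ~90.1 W
```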

I'm not sure the human body is unaffected by magnetic fields, but it's plausible the numbers stated would make for a fairly small field. It's not the broadcast power of science fiction dreams: I imagine the application of this technology would be limited by distance to quite local appliances. Thus it's understandable that a company like Intel would be interested in investing.

Wednesday, August 27, 2008

Cutting transport greenhouse emissions

A recent report gave a list of initiatives aimed at reducing greenhouse emissions in the transport sector, said to be the third largest sector for emissions, and growing.

The report, written by transport expert Professor John Stanley (Sydney University), was presented to a local government conference in Melbourne on August 20. It identifies greater fuel efficiency in vehicles as the biggest source of reduction. This was followed by an increase in use of rail for freight - a reversal of the trend of recent decades.

Professor Stanley's full list of headline initiatives is:

1) Increasing vehicles' fuel efficiency by replacing older cars and introducing mandatory standards for car manufacturers;

2) Shifting freight from road to rail, and encouraging the use of larger trucks to improve the efficiency of freight movements;

3) Increasing car occupancy rates from an average of 1.4 to 1.6 people per vehicle (effectively, carpooling);

4) Doubling the proportion of trips taken by public transport, from about 7.5 per cent to 15 per cent nationally;

5) Increasing the share of urban trips taken by walking or cycling from 16 per cent to 26 per cent;

6) Reducing demand for travel through better urban design, to ensure people live closer to work, schools, shops, etc.


Unfortunately, the Herald article (here) didn't give many more specifics than the above. One can only assume the particular figures plugged in (and the ensuing outcomes) were the results of a set of economic models put together by the Professor. Yet despite his expertise, those models must rely on input assumptions that are not beyond dispute, so the margins of error would have to be fairly healthy. So it was rather brave of the Herald to state, in an incontrovertible tone, that car pooling would provide better outcomes than doubling public transport patronage. How do we know, for example, that increasing car occupancy from 1.4 to 1.6 people per vehicle is the best, the most achievable, or the most likely increment? And decoupling the items in the first point might be wise: the very process of replacing older model cars, as well as being very expensive, would itself release a substantial amount of greenhouse gases through the manufacturing process.
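
To make the point about assumptions concrete, here is a purely illustrative back-of-envelope comparison. The structure and every figure beyond the ones quoted above - in particular the assumed 85 per cent car share of urban trips - are my own simplifications, not Professor Stanley's model:

```python
# Toy comparison of two initiatives under crude assumptions of my own:
# passenger-km held constant, car emissions proportional to vehicle-km,
# and cars assumed to carry ~85% of urban trips.

# Initiative 3: occupancy rises from 1.4 to 1.6 people per vehicle.
carpool_cut = 1 - 1.4 / 1.6
print(f"Carpooling: ~{carpool_cut:.1%} fewer car vehicle-km")

# Initiative 4: public transport share doubles from 7.5% to 15% of trips.
car_share = 0.85             # assumed current car share of trips
shifted = 0.15 - 0.075       # share of all trips moved off the roads
pt_cut = shifted / car_share
print(f"PT doubling: ~{pt_cut:.1%} fewer car trips")
```

On these toy numbers carpooling does edge ahead (about 12.5 per cent against 9 per cent), but nudge the assumed car share, or relax the constant passenger-km assumption, and the ranking can shift - which is exactly why the margins of error matter.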

Still, governments need starting points for planning. The general points above, with the specifics stripped, would sound rather like motherhood statements, but it can't hurt to present the set of options and leave it to governments to choose how to tweak the factors of each initiative. It's plausible, too, that the relative viabilities of each option would vary on a regional basis.

Tuesday, August 26, 2008

The perils of recycling music

I have to get this off my chest. My appreciation of Sibelius' 5th Symphony was ruined by the baggage thrust upon me long ago.

Trouble is, the start of the final movement is copied directly into the break of the 1974 song Beach Baby, by studio band First Class. So I can think of nothing else when I hear that passage.

The usurpation of classical themes for pop music is relentless. However, that particular time saw something of a peak. This was quite apart from some of the direct pop-isation of specific pieces, such as:
  • 1971 Waldo De Los Rios - Mozart's Symphony #40 [jazzed up with guitar, drums, etc]
  • 1972 Apollo 100 - Joy [Bach's Jesu, Joy of Man's Desiring]
  • 1973 Deodato - Also Sprach Zarathustra [from Richard Strauss]
  • 1975 Walter Murphy - A Fifth of Beethoven [his Fifth Symphony, disco style]

- and those were only the palpable hits. Such music-makers were strongly prone to repeating the formula until their popularity waned - as it did consistently in the above cases.


Gustav Holst: the bringer of joy

Around that time there was also a filching of specific classical themes to be disguised. More insidious, one could say. As well as the one mentioned up top, these included:
  • Holst's Jupiter, from the Planets -> Manfred Mann's Earthband - Joybringer 1973
  • Bach's chorale from St. Matthew Passion -> Paul Simon - American Tune 1973
  • Chopin's Prelude in C Minor -> Barry Manilow - Could It Be Magic 1973, hit in 1975
  • Rachmaninoff's Piano Concerto No. 2, second movement -> Eric Carmen - All By Myself 1975
All these are just what I remember off the top of my head, from a thin slice of history. Yes, this happens all the time; I'm just listing specific pieces that I know have been tainted for me. Classical composers in turn are not above this - Bach's chorale above wasn't an original theme, uplifted as it was from an earlier piece that may also have had its own history.

And thievery sometimes gets its just deserts, as Eric Carmen found when the Rachmaninoff turned out to be still in copyright, although the composer was dead. Payments eventually filtered through to Rachmaninoff's estate.


Something incidental I found about Paul Simon's American Tune. It was a minor hit at the time; I barely heard it, and assumed it was a piece of flag-waving. I paid more attention much later, via a CD compilation, when it sounded to me like an expression of the strain of constant travel (touring). But it turns out to be a direct response to Richard Nixon's re-election in 1972. When put in context, the strands of world-weariness, as well as all the words, fit very neatly into place. Worth revisiting the lyrics, here from Simon's own web site.

Other discussions of early 1970s pop music: Vigrass and Osborne; songwriters Ellie Greenwich and Roger Cook.

Monday, August 25, 2008

Keating's wide-ranging speech of ideas

Paul Keating gave a speech to the Melbourne Writers' Festival on Sunday. He took the opportunity to wrestle with the shifting sands of world power in our era.

As I read it, I was reminded of the time, now more than ten years past, when national debate took place in a more robust, colourful atmosphere. He was Prime Minister for just over four years, sandwiched between the reigns of Hawke and Howard, who by comparison each ground down the weighty issues of the day, made public discourse itself rather moribund, and certainly didn't contribute to outcomes as momentous as, for example, the Wik and Mabo decisions. (His 'Redfern speech' - available here - strongly signalled his sympathies with these issues, and directly set the context for Kevin Rudd's impressive apology speech in Parliament.)


Although I find much of the speech quite insightful in the issues and perspectives canvassed, there's a fair bit that I disagree with (eg the significance of the EU) - I'm sure most would say the same, although the points of departure will differ. Also, I find the closing stages about as weak as the beginning is strong.

But for me, the value of Keating's speech is in a welcome sharpness and turn of phrase, and the breadth of ideas presented. Again, a call for vision.

Full text is available here.

Sunday, August 24, 2008

Business Intelligence 2.0

A provocative article on B.I. dating back a year and a half makes an interesting juxtaposition between the development of Web 2.0 and that of business intelligence. Or at least, where its author thinks B.I. will be. Or ought to be.


It's by a bloke called Neil Raden, and you can read it here. There are limits to the analogy, of course. Web 2.0 is about a ramping up of interactivity and collaboration. Business Intelligence 2.0, from what the author says, is simply about what B.I. would be like if the professionals and software developers just got on with the job and came up with more sophisticated tools.

At first glance, I thought it was pie in the sky. Second glance, it all made sense in a general way: the toolset is not what it should be in an ideal world, and there's no reason the B.I. space shouldn't go that way. But in reality, such evolution would take quite some time. And in fact I query how susceptible to simplification something as mercurial as data - and people's approach to it - can be. Raden acknowledges here the value of the semantic web project, and that may actually be whence the best initiatives emerge.

Still, it's challenging, and worth reading for those in the industry, or who are touched by the issue.

Friday, August 22, 2008

A time for vision

An opinion piece in last December's Herald has turned out to be rather prescient - or perspicacious, more likely.

A scant couple of weeks into Kevin Rudd's initiation as Prime Minister, Ross Gittins, the Herald's economics editor, took the liberty of commenting directly on the politics of the contrast between Rudd and his predecessor, John Howard.

He pointed to Rudd's background as a public servant and diplomat, as well as his reputation as a "detail man", to conclude that Rudd would be "more a manager than a leader", someone who is "stronger on tactics than strategy".

Emerging from an era of mean-spiritedness under the Liberals, where refugees were not given a hearing and dental aid was stripped from the disadvantaged, the specifics of Rudd's nature were very easy to overlook. Even if he was tweedledee to Howard's small-c conservative tweedledum in temperament, he would make a world of difference in Australia's engagement with the world.

That stands. But Gittins' words ring true today: more a manager than a leader. Thus far, and maybe further. An unsurprising hangover from Howard's era: after four election losses, the Opposition had been actively presenting as small a target as possible.

And that remains. If Australia still embraces that feeling of fresh air (Rudd remains high in the polls), it is not due to vision.

We're already on the back foot on some really pressing issues. If Rudd's eye is on the next election cycle (2010), then he's missing the wider focus needed on Australia's - and the world's - future.

Thursday, August 21, 2008

Data quality 1: Data Governance

I work in the dirty end of town, as far as data goes. All kinds of rubbish filter down from a variety of data sources into a data warehouse, from which I'm typically called on to extract more value than it contains. What makes an organisation expect that data to be fully loaded and correct when it doesn't even maintain a data dictionary - or any meta-data repository or documentation? From large enterprise to small business, the organisations I've experienced have not gone far enough beyond lip service on this.

That's only one reason data is dirty. Another: it's only as good as the source. Data from sales people - such as lead information - is notoriously haphazard. Conversely, bank account data is likely to be pretty clean - the balance is, anyway, although peripheral information is often less than pristine.


Rule 1) The data is only as good as its source. Don't overextend the data's purview unless you're willing to work on it first.
Rule 2) The state of the data directly relates to how much it's used - and how much someone cares about it.

For example, in putting together aggregate information for resource planning, I find the reliability of sales information is poor - but much better for those specific elements of data on which the sales person hangs a KPI. (Even then, they may have their own reasons for misrepresenting reality.)
People may have procedures set out that guide their workflow, and thus govern data at the point of entry. But people often do just what they need to get their job done, which may not always mean following procedure strictly - only enough to satisfy the next person in their data stream. And if that data - which purports to be clean - is used in aggregate, the numbers may not add up.

Rule 3) Officially documented procedures don't always guarantee the health of the data.
Rule 4) If the data is to be used for a purpose outside the current day-to-day processing, it should be tested.
...and so on. There's a lot more that could be stated, much of which will sound obvious if you already manage your data well. Business analysis from the dirty end can unearth business process improvements (or enforcements). But systemic improvement needs to come from the C level.
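
To make Rule 4 concrete, here's a minimal sketch of the kind of automated tests I have in mind, run against an in-memory SQLite database; the table names and sample data are invented purely for illustration:

```python
# A minimal sketch of automated data-quality tests (Rule 4).
import sqlite3

def run_checks(conn):
    """Return a list of data-quality failures found."""
    failures = []

    # Completeness: mandatory fields should not be null.
    (nulls,) = conn.execute(
        "SELECT COUNT(*) FROM sales_leads WHERE customer_id IS NULL"
    ).fetchone()
    if nulls:
        failures.append(f"{nulls} lead(s) missing customer_id")

    # Reconciliation: data that purports to be clean should still
    # add up when used in aggregate (see Rule 3 above).
    (detail,) = conn.execute("SELECT SUM(amount) FROM sales_detail").fetchone()
    (summary,) = conn.execute("SELECT total FROM sales_summary").fetchone()
    if abs(detail - summary) > 0.01:
        failures.append(f"detail total {detail} != summary total {summary}")

    return failures

# Demo with deliberately dirty data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_leads (customer_id INTEGER, name TEXT);
    INSERT INTO sales_leads VALUES (1, 'Acme'), (NULL, 'Unknown');
    CREATE TABLE sales_detail (amount REAL);
    INSERT INTO sales_detail VALUES (100.0), (250.0);
    CREATE TABLE sales_summary (total REAL);
    INSERT INTO sales_summary VALUES (300.0);  -- should be 350.0
""")
print(run_checks(conn))
```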


Data Governance, defined broadly, is “a system of decision rights and accountabilities” which describes roles and processes for the use of data. More here.

Steve Adler, IBM's international Director of Data Governance, gave a stimulating talk earlier this year on data governance issues. He began with the concept of toxic (data) content: that which leads to poor, uninformed decision-making. Two examples I heard from him were a) origins of the sub-prime crisis within poor quality risk/lending data; and b) influence-leading by the US administration on the Iraq war by seeding news sources with purportedly independent expert sources (see here). Information that is tainted because the sources weren't verified as pristine.


Adler's presentation (here) is valuable to any organisation with an important stake in the quality of its data. The solutions he recommends are predicated on an organisation caring about its data and having the resources to look after it. Either the will or the resourcing may be lacking, intentionally or by negligence. But at the very least, it is important for an IT manager to understand the issues. Otherwise, they may promise the earth on a simple-sounding programme, only to find the deliverables mandate costly data cleansing projects, from the technical analysis to the business analysis, to documentation to procedural change. Not to mention the issues of political will and influence to improve.

Adler details a theoretical internal market for data whereby user demand "sets the internal price". That often happens already - but very much on an ad hoc basis. If it were more formalised, the IT budget - especially for data maintenance - would be enhanced to the point where it could look after users' data to the extent that users have an interest in it and are willing to pay for it. This would, of course, necessitate good cooperation with the business areas that source the data - the answers are always ultimately at the business end of the company. Thus it is quite unviable to run good data governance without C-level buy-in.


IBM originated the concept of Data Governance back in 2005; however, it has since spread beyond the vendor-specific (see here). It may sound bureaucratic - un-free-market, even - but the above illustrates the point well: if a free market is desired, paradoxical as it sounds, it requires governance.


The general principles apply equally everywhere; however an ideal implementation may be best suited at the enterprise level, as smaller organisations would find it harder to meaningfully commit specific resources on an ongoing basis. For those below enterprise level, there are plenty of ways to grasp the nettle and improve the framework for effective use of data.


Further reading:

Wikipedia: brief description of Data Governance

Steve Adler's blog

Adler on the Data Governance Council

The Data Governance Institute

Mike2.0 (an open source information management methodology) on Data Governance

Mike2.0 giving a context for Data Governance





Wednesday, August 20, 2008

Jerry Wexler's influence on music

Reading the paper last night, I learnt of the death of Jerry Wexler a few days ago. Not too well-known outside music circles, his influence is still very widely felt, in particular as executive and producer at Atlantic Records. Coincidentally, at the time of reading, I was listening to an Atlantic album: the eponymous Bette Midler (a protege of Wexler's partner Ahmet Ertegün, whose passing I noted in December 2006).

Although Wexler wasn't a founder of Atlantic (he boarded in 1953, insisting on partnership), he had already made his mark in music by then. Inter alia, he was credited with the neologism "rhythm and blues" in 1949, when he renamed Billboard magazine's black music chart, formerly known as "race records".


Always keen on black music, he was instrumental in the careers of Ray Charles and Aretha Franklin in particular, although not so keen on the whitey music of Led Zeppelin and Crosby, Stills, Nash and Young. He had, however, been keen to sign the then little-known Elvis Presley, but lacked the $10,000 to clinch the deal.

It was Wexler who was represented in Ray, the biopic of Ray Charles, as the latter's record executive and producer.

He was also a keen proponent of the Muscle Shoals Rhythm Section, white Alabama session musicians whom he used to great effect for Wilson Pickett, Aretha Franklin and Dusty Springfield, amongst many others. He recorded many seminal works there, for example coming up with the rhythm track for Pickett's In The Midnight Hour, as well as producing Franklin's Respect, I Say A Little Prayer, and many others.

Atlantic also signed Boz Scaggs for one album in 1968; coupled with the Muscle Shoals team, this resulted in one of Duane Allman's greatest recordings, on Scaggs' Loan Me A Dime.

Wexler left Atlantic in 1975, which arguably signalled the end of his career peak. The years that followed would still see his involvement with Dire Straits, George Michael, and Bob Dylan's born-again album, Slow Train Coming. But his work over the three prior decades had already altered the course of popular music.


Further reading: overviews of Atlantic Records here and here.

Tuesday, August 19, 2008

The persistence of cryptozoology

Things wash up on beaches. Apparent proto-humans and bizarre dentition. Hunters. What do they have in common? Cryptozoology, the study of hidden animals (or, in practice, the search for animals claimed but unverified).


Darren Naish's Tetrapod Zoology site is always an instructive read. In posing a question in a recent post, he illustrated the capacity of a good story to spread far faster than a good explanation.


He discussed the so-called "Montauk monster", a beach find from July in New York. It might look like a relative of a skinned chicken, or... something - except that it looked so odd as to appear faked (which the photo and the specimen weren't). Still, the UK Telegraph newspaper saw fit to write it up as remaining a mystery, four days after Naish unpicked the whole story (and he wouldn't have been the first).


The greatest puzzlement was in the jaws. The lower jawbone had teeth, whereas the upper mouth looked much like a toothless beak. Definitely a recipe for chimeric mythmaking.

Yet as Naish details it, the answer lies in taphonomy - the study of the decay of organic matter over time. Carcasses in the water bloat; mammalian fur drops off, facial tissue decomposes - as does that on hands/feet/paws. In summary, the most recognisable features of a creature are lost. In the case of the Montauk monster, the beak-like snout is simply the skull becoming exposed at that point earlier than elsewhere. And so, on Naish's blog he reveals the creature as simply a water-sodden raccoon. A very good juxtaposing illustration settles the matter.

Partial decomposition is the mystery, and the answer. Someone with a background in animal taxonomy - or palaeontology - should be well-equipped to see past the mask of unfamiliarity.


The two American hunters who recently claimed to have a bigfoot specimen barely merit a mention, of themselves. They said that they had tried to freeze the body, but the freezer broke down - which would leave the creature in a convenient state of ambiguous decay. Still, they claim to have "the DNA evidence". Other claims could be open to dispute, but this one will bring them undone. They would need negative evidence, however - analysis showing that the creature's blueprint did not resemble that of any known creature (at the very least - let alone establishing its proximity to any known taxa).

Humans have spread so comprehensively over the planet that it's vanishingly unlikely that a breeding population of any unknown creature - of any noticeable bulk - remains undetected.

The most plausible location for decent cryptozoology - the most underexplored region of our planet - would be the deep sea. All manner of mechanisms develop over evolutionary time that would look particularly strange outside the pressurised, light-reduced depths. In the wake of the 2004 tsunami, I saw web pages showing a variety of creatures purportedly sucked from the deep by the oceanic forces. They may have been faked, but as far as I know, at least the scenario is plausible. More than I can say about two hucksters with business to drum up. Which may be the ultimate story behind most "cryptozoology".



Update 20-Aug-08: A new report on the purported bigfoot above reveals the rubber costume behind it. Quotes include "it's all about money" and "it is still unclear why a Clayton County police officer" (one of the charlatans - who have since disappeared) would defraud, but "these are not very intelligent individuals". If Fox news reports are true (!), the one they successfully defrauded was a mickey of a bigfoot investigator who paid them an undisclosed advance. So, no damage done really. Names not included here, as they do not deserve it.

Monday, August 18, 2008

Ossetia, the political football

"Georgia will become a member of NATO if it wants to - and it does want to."

With those words relating to the conflagration in South Ossetia between Georgia and Russia, German Chancellor Angela Merkel has raised the stakes in a rhetorical war that increasingly resembles a poker game. Neither side knows how far the other is prepared to go, nor what cards they will play, but each is so far willing to test the other.

Earlier, Russian president Dmitry Medvedev had given the US a very direct message to keep out of the situation, saying that US intervention would endanger the current Russia-US relationship. In that light, Russia may see any escalation in NATO involvement as simply US-by-proxy. Concurrent with Merkel's declaration, Condoleezza Rice was referring to Russia's reputation as in "tatters".


Ossetia is a region that straddles the Greater Caucasus mountains separating Russia from Georgia. North Ossetia is clearly a Russian territory, while South Ossetia is ostensibly part of Georgia, although strong irredentist pressures have kept Georgia from maintaining control of the region and resulted in a breakaway government in South Ossetia that is not recognised internationally. Over the past twenty years, tensions between Georgians and Ossetians (who speak a language originating in Iran, although they are predominantly Orthodox Christians) have resulted in violence, refugees, and claims of ethnic cleansing on both sides.

The Georgian army is pathetically small compared to Russia's, which is why Russian forces have blithely traversed parts of Georgia since the latest conflagration began, and still do, despite a ceasefire agreement that stipulated withdrawal of Russian forces from non-Ossetian Georgia. Overnight, a BBC correspondent asked some Russian soldiers when they would be leaving, but of course the soldiers didn't have a clue, and seemed entrenched. Russia calls its forces "peacekeepers"; however, there does not appear to be any international sanction for this, and Georgia refers to them as an army of occupation.

The current turmoil erupted between Georgian and Ossetian separatist forces on August 1, with both sides claiming ceasefire violations by the other. Russia stepped up its rhetoric after Ossetian refugees streamed into Russia; then, despite another ceasefire agreement, Georgian president Mikheil Saakashvili vowed to wrest control from the Ossetian "criminals".

Saakashvili, who trained in law in the US, rose from the ruins of the notoriously corrupt Georgian administration of former Soviet foreign minister Eduard Shevardnadze. Young, pro-Western, and very popular (96% of the vote in 2004), Saakashvili has reduced corruption a fair bit. But from that mandate (down to 53% this year) he's proved himself capable of being as much an autocratic firebrand as any of the ex-Soviet leaders.

In probing the origins of the current hostilities, I could find nobody blameless. Russian, Georgian and Ossetian leaderships all have their own agendas, and in pushing them, cause people to suffer.


Update 21-Aug-08: Overnight BBC news added a couple of pertinent points to the above. First, Poland has agreed to host US missile bases (which purport to be strategically placed against Middle Eastern, rather than Russian, threats. As if). This is a salient development because a) talks had stalled prior to the current Ossetian conflagration; and b) Polish public opinion was against the bases but is now for them - again, possible to sheet home to the Ossetian issue.
The other development was Human Rights Watch reporting that whereas Russia had claimed genocide due to indiscriminate Georgian shelling of Ossetia, all evidence points to deaths numbering in the dozens rather than thousands. This comes from ground reports such as hospitals, burials, etc.

Still it remains that no side has clean hands.

Wednesday, August 13, 2008

World's largest solar plant?

A report in the Herald suggested Australia is to be home to the world's largest solar powered electricity generating station.

The proposal would provide 250 MW (megawatts) of electricity by 2011, said to be sufficient for 100,000 homes. The report says it is backed by "some of the nation's biggest polluters, including BHP Billiton, Rio Tinto and Delta Electricity".

Great to have a good news story, but the hackles may go up when it's revealed that sites mooted are scattered in WA, SA, Queensland and NSW. Almost as an afterthought, the article revealed the company behind the proposal: WorsleyParsons.

WorsleyParsons is a little-known Queensland-based company, variously described as involved in mining services and engineering. Although it would seem to have a presence in Singapore (as of 2006), it doesn't even have its own website, and the low web presence suggests it is pretty small-time.

I would be happy to see such a venture succeed. But I suspect it would take more than a small-time Queensland engineering company to get it off the ground. The planning resources alone for a properly-formed proposal would be considerable. To its credit, the article noted that similar proposals in the past have failed to attract sufficient financial backing.

Of course, Australia sorely needs infrastructure development like this. Without doubt the emerging legislative/regulatory environment in this area will encourage active development - although the Federal government needs to be much more proactive than it has been to date, if we are to see any serious development on this scale within the next three years. This could be just another one of those pie-in-the-sky Herald stories which vanishes without trace inside six months.

Tuesday, August 12, 2008

Cognitive tricks 2: dissonant paleontology

How do you grapple with evolution if you're a creationist?

Marcus Ross gained a doctorate in geoscience (paleontology), researching mosasaurs, marine reptiles that disappeared at the end of the Cretaceous (65 million years ago). As a young earth creationist, he believes Earth was created less than 10,000 years ago. But his dissertation work was "impeccable", according to his doctoral supervisor David Fastovsky, who is apparently well-respected himself: Ross worked "within a strictly scientific framework".

This generated a bit of chatter when written up in a New York Times article last year.


Encyclopaedia Britannica defines cognitive dissonance as "the mental conflict that occurs when beliefs or conflicts are contradicted by new information". Wikipedia's headline gives it as the "uncomfortable feeling or stress caused by holding two contradictory ideas simultaneously", further that the theory "proposes that people have a fundamental cognitive drive to reduce this dissonance by modifying an existing belief, or rejecting one of the contradictory ideas"... or, from Britannica, by explaining away, avoiding the new information, persuading themselves that no conflict exists, reconciling the differences, or "any other defensive means of preserving stability or order".

It is not clear whether Ross is in mental conflict, per se. His approach to the inherent contradictions? He treats each of evolutionary theory and creationism as "different paradigms" for understanding the past. Within the paradigm of his academic discipline, he appears to maintain a consistent analytical approach that is scientifically orthodox. However, he is on record - within fundamentalist Christian circles - as arguing that intelligent design is a "better explanation" for the Cambrian Explosion than evolution. [For interesting perspectives on this, one can look no further than two famous non-theistic scientists, Stephen Jay Gould and Richard Dawkins. Up to his death in 2002, Gould had opined that there were serious questions still to be answered about the apparently sudden explosion of radical biodiversity in the Cambrian, whereas Dawkins, perhaps with the additional weight of more recent research, feels that there is nothing in it that cannot be explained within existing scientific knowledge and theory.]

A lot of opinion has washed about on Ross' merits in being awarded a doctorate. Some academics felt that if he had done the work and demonstrated sufficient understanding, he deserved it, no matter whether he was mouthing requisite platitudes that he didn't believe. Others, of course, differed.

Yet how can you test what someone believes? If they are able to testably maintain consistency within the paradigm, what more can be done?

He currently teaches earth sciences at a Christian University (Liberty), claiming to do so entirely within the scientific (that is to say falsifiable) paradigm. He has also published an interesting analytical paper (here) which attempts to clarify the qualitative differences between the scientific paradigm and each of the received beliefs such as intelligent design and young earth creationism - as a series of discrete points rather than a continuous spectrum. Worth a read.


It is an interesting approach to a resolution of cognitive dissonance. I expect that each side will suspect him of leaching ideas from the opposition. I believe that, if Ross is being entirely honest, such compartmentalisation - which might seem a successful response to his inherently conflicting conceptions - will not easily stand the test of time within a single person; eventually he will fall more clearly one way or the other. Meanwhile it will probably reduce his effectiveness in either world.

References
Dawkins, R (2004): The Ancestor's Tale. Phoenix, London.
Encyclopaedia Britannica (1988): Cognitive dissonance, v3 p434. Encyclopaedia Britannica Inc, Chicago.
Wikipedia on Cognitive Dissonance
Wikipedia on Marcus R. Ross

Monday, August 11, 2008

Forgotten pop: Vigrass and Osborne (1972 - 74)

Here you'll find all I have been able to assemble about Vigrass and Osborne: the British duo of Paul Vigrass and Gary Osborne. They had a brief recording history in the early 1970s which resulted in a handful of sparkling pop singles that were criminally ignored in most territories around the world.

Depending on territory, the records were released on labels such as Uni (in USA), MCA (NZ, Brazil), Epic or JCM. The known singles were:
  • Men Of Learning/Forever Autumn (1972)
  • Virginia/Ballerina (1972) - audio of Virginia now available on Youtube
  • Mr Deadline/Remember Pearl Harbour* (1972) - audio of Mr Deadline now available on Youtube (unfortunately missing some of intro and a few bars in the middle) 
  • Gypsy Woman/I've Seen Her Shining, 1974. 

The albums:

Queues (1972): Forever Autumn, Ballerina, and probably Men Of Learning/Don't You Worry/Ballerina/Mississippi Lullaby/Virginia//Sail Away/Forever Autumn/An Invitation/Remember/The End.

Steppin' Out (1974): Gypsy Woman/Daily Express/Engine Driver/Summer Passed You By/Haystacks/Sunshine Cake/Wild And Windy Sea/Steppin' Out/Sit Yourself Down/Hey Brother-Heyla


Their producer, Jeff Wayne, was a writer and producer of commercial jingles - he undoubtedly brought that sensibility to the table, as well as playing organ and co-writing some of the songs.  Musicians I recognise on the albums include guitarists Chris Spedding and Caleb Quaye, and Ray Cooper (percussion); the latter two frequently worked with Elton John.

Wayne, of course, later produced the highly successful War Of The Worlds album in 1978. To that project, he took with him Gary Osborne to write lyrics to some of the songs - and he uplifted a Vigrass And Osborne tune - Forever Autumn, which became the biggest seller and signature tune from War Of The Worlds. The original - less wistful and at a brighter tempo - had been the b-side to V&O's first single, and you can listen to the original here. I post the link with some hesitation, as the song by no means does them sufficient justice, and pales against the better production values of the later, better-remembered version.

According to this site, they were also both in a band called Casablanca (as was Bias Boshell, writer of I've Got The Music In Me) for one eponymous 1974 album.

Prior to V&O, Paul Vigrass had a few solo releases in the late 1960s (including Free Lorry Ride and A New Man), then briefly replaced Tony Burrows as lead singer of the studio band Edison Lighthouse. But I have found no record of him past 1974.  There's one audio of his on Youtube - Suzie - very much in the Burrows/Lighthouse style.

Gary Osborne leaves a number of traces, though - as a lyricist. He apparently wrote English lyrics for the Veronique Sanson song Amoureuse, recorded by Kiki Dee in 1974 (a year after the original). But his best residual income would be from the War Of The Worlds album, possibly also from his lyrics for Elton John on three albums in the late 1970s and early 1980s; the hits were Part Time Love, Little Jeannie, and Blue Eyes. He's been active as a lyricist as recently as 2006.

I write this because Vigrass And Osborne have nigh-on disappeared from the tribal consciousness, which is quite the opposite of the fate they merit. I best know them for their highlight: Mr Deadline, a wonderful piece of pop which thoroughly deserves to be remembered. Interestingly, it shared a vocal line with Sweet's Blockbuster. My memory tells me I heard Deadline first and thought Blockbuster was the ripoff; however the records show the latter was released a couple of weeks earlier in my territory (and no comparisons are available elsewhere), so the situation is unresolved. However, never put it beyond Messrs Chinn and Chapman (Sweet, Quatro, Mud, Smokie, etc etc) to steal anything not nailed down. [Listen to the falsetto 'ah-aaaah' at the very end of Deadline, and at the beginning of Blockbuster - they're identical.]

The only charting info I have (to my knowledge, all based on record sales):
  • Men Of Learning: US (Billboard) #65; Boston (WRKO) #20; Australia #84; New Zealand #17; Wellington NZ (2ZM) #12
  • Mr Deadline: Wellington #16


Some additional information available at Gary Osborne's Wikipedia entry. A blog entry for The Great Big Radio provided some information found nowhere else, as well as the above picture cover, and more accolades.

Other discussions of early 1970s pop music: songwriters (and occasional singers) Ellie Greenwich, Roger Cook, and the strange trend of recycling classical music.


*Note that Pearl Harbour is nothing to do with war, but like Virginia relates the tale of an eponymous woman.

Friday, August 08, 2008

Cognitive tricks 1: mind creating meaning

I've been listening to Led Zeppelin backwards and let me tell you, it's the work of the devil.

Well, I should let other people tell you that; Believers will do so anyway. Yet it's the people in the middle that concern me the most, those who are too easily swayed by a stunning coincidence.


Michael Shermer, founder of the US Skeptics Society (and a former fundamentalist Christian), gave a talk for TED, an annual US conference for the "spreading of ideas".

The talk, available here on TED.com and here on LiveLeaks.com, in the course of debunking a few junk beliefs, showed how and why people often believe rubbish. A few examples he gave of those reasons:
1) People notice the "meaningful" results and ignore the failures - eg a US company sells a dowsing rod for detecting marijuana. If it can "occasionally" prove successful in finding it in high school lockers, administrators may be duped.
2) Taking something on face value without asking "what's the most plausible explanation" with all available evidence - viz crop circles.
3) Untestable explanations for existing events or outcomes - viz miracles, creation, divine intervention. (Contrast this with untested reports of individual happenings.)
4) Reading meaning into random patterns - eg seeing Mother Teresa in the shape of a bun.


That last point was the most interesting, and perhaps one of the most overlooked types of credulity-inducing phenomena. The brain has a natural tendency to interpret what one sees or hears. Where the data is incomplete, the mind can fill in the gaps. If a picture of Mars, or tree bark, or a sandwich contains random patterns that are suggestive of something else, a little nudge can help convince some people - squint more, for example (it further reduces the received data), while someone tells you what it is you're looking for. Does it look more like the Virgin Mary - or Jane Russell? Since more mysticism is attached to the former, the latter is "seen" less often.

But the most amazing example from Shermer was a section of Stairway to Heaven played backwards. When he first did this, I could hear the word Satan - but that was because I was listening for something, having been primed by Shermer - and also having heard of people claiming devil messages encoded backwards in music.

Still, I wasn't prepared for playback the second time, when Shermer put up a specific set of words to accompany. What a match! (once my mind was guided.)

How could Led Zep possibly record something that made sense backwards, yet was so musically integral in the forwards direction that no doctoring could be detected? That must take genius. Or Satan.

It's a bit tragic then, that part of the backwards lyric included '666'. The number of the beast? Not quite. Although popular culture has come to associate that number with the devil, more recent research has found the earliest reference to the number in the Book of Revelation gave it as 616.


Hold on, isn't that all a trick by Satan to mislead us?

Well, by this point people are simply going to believe what they want to believe.

Interestingly, there's a holy roller who interprets those same backwards words somewhat differently, on this video - start at 7.53 if you don't want to wade through the lot. A good illustration of the power of suggestion to help fill in the "data" gaps; it also illustrates that the desire to locate (create) interpretations is the driving force: if not for Led Zeppelin, those preachers would be finding Satan elsewhere - as they do.


(Thanks to Bill for the original web reference. It made my day to listen to that Led Zep backwards while my brain was guided by the printed "lyrics". Sadly, Satan will never be the same.)

Thursday, August 07, 2008

Evolution experienced - on a bacterial scale

A fascinating experiment written up in PNAS brings to life - and helps quantify - evolutionary theory that is usually only induced or deduced from widely scattered evidence.

Richard Lenski, at Michigan State University, has been running a meticulous experiment on the common bacterium Escherichia coli, aka E. coli.


Starting with a single bacterium in 1988, he cultivated its descendants (daughters - the cells grow to a certain length, then reproduce asexually by splitting through binary fission) into twelve populations in an identical medium. That medium was glucose-limited, and contained a secondary nutrient, citrate (based on citric acid), which E. coli cannot use.

Those populations have evolved through 44,000 generations over the past 20 years; every 500 generations, Lenski extracted and froze samples of each strain. In this way, he could test mutations in the populations, even to the extent of "replaying" the evolutionary clock from given points.

In general, each of the 12 samples evolved in these lab conditions larger cells, faster growth rates, and lower peak population densities.

And there evolved a population of E. coli that could metabolise (and thus feed on) citrate. This happened in only one of the 12 populations, from around the 31,500th generation.

Lenski calculated that within this time, all "simple" mutations would have appeared at some point in each population. He reasoned that citrate-metabolising must be either a particularly unusual mutation, or one that required several mutations to accumulate in a rare sequence - no E. coli strain had been found with this trait outside his lab.

So he extracted some of the frozen samples for testing. The finding was that the special trait evolved only when the population of that one-in-twelve strain was replayed from at least the 20,000th generation - no earlier. This demonstrated that a crucial precursor to the citrate-metabolising trait had occurred around that point.
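
The logic of that replay design can be caricatured in a toy simulation - my own construction, with invented probabilities, nothing to do with Lenski's actual biology - in which the citrate trait B can only arise in a lineage that already carries a potentiating mutation A:

```python
# Toy model of historical contingency: trait B (citrate use) can only
# arise after a potentiating mutation A. Both probabilities are invented.
import random

P_A = 5e-5   # per-generation chance of the potentiating mutation A
P_B = 5e-5   # per-generation chance of trait B, once A is present

def evolve(generations, has_A=False):
    """Return (generation A arose, generation B arose); None = never."""
    gen_A = 0 if has_A else None
    for gen in range(1, generations + 1):
        if gen_A is None:
            if random.random() < P_A:
                gen_A = gen
        elif random.random() < P_B:
            return gen_A, gen
    return gen_A, None

random.seed(2008)
# Twelve parallel populations, as in the experiment.
for i in range(1, 13):
    gen_A, gen_B = evolve(44_000)
    if gen_B:
        print(f"population {i}: A at generation {gen_A}, B at {gen_B}")
```

Restarting evolve() with has_A=True mimics replaying from a frozen sample taken after the precursor arose; with has_A=False it mimics an earlier sample, in which B is far less likely to reappear.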

And with the additional nutrient available, the citrate-metabolising strain yielded a greater population size and variety.

Thus far, the experiment adds little of significance to general evolutionary theory, apart from putting some numbers to the pace of change at a macro level - and even then, further similar experimentation could better quantify the statistical variance involved.

But the experiment is a very neat, clean illustration of a significant trait evolving via a series of smaller and insignificant steps. And of the randomness inherent in evolutionary change. Beneficial changes are contingent, and can rely on key chance developments. Then once a crucial point has been crossed, certain outcomes are much more easily achieved.

Wednesday, August 06, 2008

Evolution: birds re-sorted



A recent analysis of bird DNA has reported some corrections to the phylogenetic family tree. The study, published in the journal Science, looks to be an equal collaboration between scientists from a number of American universities.

They report corroborating some contentious groupings (eg flamingos and grebes), and making some surprising new groupings (eg hummingbirds with nighthawks). Some conceptual rearrangements were called for too, for example regarding flight vs flightlessness and nocturnal vs diurnal traits.

New Scientist reports some comments on the study indicating some of the findings are disputed. One, for example, places the flying order tinamous right in the middle of ratites, which comprise the majority of well-known flightless birds, including emus, kiwis, ostriches and moas.

Tinamus major

Tinamous and ratites had already been nestled together in a superorder called Palaeognathae, whereas all other birds are grouped as Neognathae. The fossil record of tinamous, from South America, is poor, only going back 10 million years, whereas the ratites go back to the Cretaceous, over 65 million years ago.

Palaeognathae were thought to originate in the southern Gondwana continent. However, more recent DNA analysis, although confirming the monophyly of ratites, appears to show they diverged too recently to share a single Gondwanan ancestor (see discussion here). According to Colin Tudge (The Variety Of Life), a hallmark of the ratites is the lack of a deep keel on the breastbone to anchor flight muscles, so including a flying bird amongst them might be unexpected. In fact, most species of tinamous are poor flyers.

These facts in themselves are suggestive that tinamous had, to some extent, re-evolved flight. This would not be out of keeping with the length of lineage of the Palaeognathae.

As I said, the findings haven't met with universal agreement. The study was based on 19 regions of the bird genome, covering 32,000 DNA letters. I'm not qualified to comment on whether this covers sufficient ground, but disagreement on some of the findings would surely revolve around this issue.


Regardless, we should not be surprised to find DNA analysis forcing changes to phylogenetic trees - even if such corrections have been the exception rather than the rule. We are moving from systematics predicated on apparent similarity of form to systematics based on the genetic blueprints. The failings of form-based systematics remind us constantly of convergent evolution: given enough time, different animals in similar environments follow similar evolutionary paths. The randomness of genetic change provides the differences, but environment trumps that chaos, and steers.

Tuesday, August 05, 2008

Google Street View: virtual real life

It's scary. But really promising.

It's Google Street View, and it's just been released in Australia. It's Sydney at street level: you can navigate a virtual world of real photos, seamlessly stitched together. (The best place to start is an introduction here.)

Sydney was mapped out around November last year. Starting with what I know best, I went home first. I saw our house, car, and front garden (the jasmine bush is rather bigger now). I zoomed around our neighbourhood on a bright, sparsely-populated day. On the main road stood someone whose features weren't clear enough to discern, but who is doomed to stand in the middle of the road for the foreseeable future, waiting to cross the rest of the way. I saw clearly all the houses around, including a couple in the neighbourhood that have since been torn down, such as the one below which went a few days ago.


Apparently, the mapping process involved cars traversing Sydney with cameras mounted, taking photos every few metres. So you're following the road, but you can see all the buildings visible from the street by panning around at any point, as well as up, down, into the sky, onto the ground... One disconcerting aspect, however, is that the move forward along the road (in a series of jumps) depends on whether the photographers have covered the street both ways. You can be following a road for a while, consistently behind a particular set of cars, but then you can find yourself suddenly switched to the other side of the road - same orientation, but moving directly towards oncoming traffic. Then back again.


What use is it? Right now, it's just a toy to me. But it will probably become a useful navigational aid, for one. "Build it and they'll come" is apt here; it is so powerful that thousands of people's imaginations will surely find thousands of applications for it, particularly in embedding in other web pages and applications.

Monday, August 04, 2008

Bacteria directing the course of life?

"Gut bugs may have guided the evolution of life" screams the headline in the New Scientist.

The article reports on a study by Jeff Gordon and Ruth Ley of Washington University in St Louis and published in the journal Science. The microbiologists analysed bacteria from the digestive systems of numerous mammals, then compared the samples with DNA proximity of the animals.

Surprise, surprise, they found that the closer the mammals were genetically, the closer the correlation between populations of gut bacteria.

The scientists speculate that the partnerships between bacteria and mammals could help explain the success in spread and diversity of mammals [over the 65 million years since the KT event gave us breathing space over the dinosaurs]. They say that any adjustments in diet (eg carnivorous to herbivorous to grass-based) would need to be accompanied by a change in internal bacterial populations.


I'm not convinced that this is saying much that is new and significant. That team was caught struggling with the question of causality - did a change in diet to the herbivorous necessitate a change in digestive bacteria, or did a change in internal bacterial populations enable a change in diet?


In fact, evolutionary reliance on useful bacteria is not an unusual phenomenon - quite the reverse. From insects to mammals to legumes, plants and animals have relied on bacteria to aid essential living processes. For example, Richard Dawkins (The Ancestor's Tale) recounts how some (but not all) termites do not - cannot - digest wood fibre (cellulose) unaided, and use internal micro-organisms to break it down into useful chemicals - the microbes' waste products. There are several unusual aspects of one such organism, Mixotricha paradoxa, but suffice it to say it is found nowhere but in the gut of Darwin's termite.

This suggests coevolution of bacteria and "higher" organisms can happen in strong partnership, with each essential to the other. In fact, it is simpler and more meaningful to conceptualise bacteria - and the toolkits they bring with them, including the capacity to evolve comparatively rapidly - as an inherent part of the environmental niche in which organisms evolve. Thus our evolutionary environment is both external and internal to us.

Again, there is nothing new in this. But in a built world in which humans usually respond to bacteria as inimical to life, it is another obligation of deanthropocentrism to reorient our thinking to regard bacterial life on the whole as fully essential to where we came from and to our continued existence.

Sunday, August 03, 2008

DNA coding - message in a bottle?

"DNA's performance as an archival medium is spectacular. In its capacity to preserve a message, it far outdoes tablets of stone. Cows and pea plants... have an almost identical gene called the histone h4 gene. The DNA text is 306 characters long... cows and peas differ from each other in only two characters out of these 306... fossil evidence suggests [their common ancestor] was somewhere between 1,000 and 2,000 million years ago... Letters carved on gravestones become unreadable in mere hundreds of years”
- Richard Dawkins, The Blind Watchmaker (p123).


The implications, at first glance, are quite spectacular too. For example, why not preserve meaningful information in such a fashion? We send out signals seeking contact with extra-solar civilisation; why not inscribe similar messages in DNA to likewise reach across the distance of time? The human genome, for one, is several billion base pairs long, plenty of space for encoding information that can be read as clear messages.
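
At two bits per nucleotide, a few billion base pairs amounts to several hundred megabytes of raw capacity - roughly a CD's worth. A minimal sketch of the naive encoding this implies (my own illustration, not a scheme from the literature):

```python
# Naive message-in-DNA coding: two bits per nucleotide, so each byte
# of text maps onto four bases.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(message):
    bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna):
    bits = "".join(BITS_FOR_BASE[base] for base in dna)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

seq = encode("Hello, distant reader")
print(seq)           # four bases per character of the message
print(decode(seq))   # round-trips back to the original text
```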



There are in fact several reasons why this is somewhat impractical.

First, we are already halfway through the effective life-supporting span of the solar system. If, for example, we were to take to the extreme this current artificially-induced extinction event (global warming and destruction of biodiversity), we may leave few species behind; humans would not ipso facto be the most robust of them. If we were to propel destruction back to the bacterial level, there could well evolve again life forms sufficiently complex to analyse and read such messages – but the timing would be quite fine. The gap between “Oh, someone's encoded a message for us in DNA” and the sun expanding to render the planet uninhabitable, could be so small that contingency might not allow for that rediscovery. A simple event on the scale of the KT event's meteorite can play havoc with such timing.

Second comes the inevitable problem with seeking to encode two different - potentially conflicting - meanings. (This is why database designers tend to create primary keys that are independent of specific data fields.) On the one hand, it would be tricky to code a section of DNA to be meaningful both genetically and as a message. And there is no guarantee that such genes would not be subject to evolutionary changes that obliterate the message.

On the other hand, large sections of genome are seen as "junk DNA", that is, likely to be filling no purpose directly relevant to an organism's makeup. (Which is not to say junk DNA is fully useless - for any organism to carry any excess baggage, there is a cost. We just don't know for sure the purpose and origin of junk DNA. It seems to consist of duplicates and misprints of DNA present elsewhere, rather like a computer's waste bin that hasn't been cleared.)

However, junk DNA looks to accumulate mutations faster than purposive genetic material. Why? If mutation is steady and equally likely throughout the genome (say, for instance, that solar radiation causes a slow but steady rate of damage - a small percentage of miscoding - in haploid genetic material), DNA that has a purpose is more subject to error-correction, via the decreased viability of mutated, ie DNA-damaged, individuals. Thus junk DNA mutation - coding errors - at the individual level is more readily retained, and the information inherent in that junk code changes more frequently. An ideal vacant repository for information, then, but not as secure.
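
That argument is easily illustrated with a toy simulation: subject two copies of the same sequence to the same per-base mutation rate, but let purifying selection discard most changed copies of the functional gene. All rates below are invented:

```python
# Junk vs functional DNA under the same underlying mutation rate;
# purifying selection discards most mutants of the functional gene.
import random

BASES = "ACGT"
MUT_RATE = 1e-3    # per base, per generation (invented)
SURVIVAL = 0.01    # fraction of functional-gene mutants that survive

def mutate(seq):
    return "".join(
        random.choice(BASES) if random.random() < MUT_RATE else base
        for base in seq
    )

def divergence(a, b):
    return sum(x != y for x, y in zip(a, b))

random.seed(0)
ancestor = "".join(random.choice(BASES) for _ in range(306))  # histone-sized
gene = junk = ancestor

for _ in range(10_000):  # generations
    candidate = mutate(gene)
    if candidate == gene or random.random() < SURVIVAL:
        gene = candidate             # most functional mutants die out
    junk = mutate(junk)              # junk drifts unchecked

print(f"gene diverged at {divergence(ancestor, gene)}/306 positions")
print(f"junk diverged at {divergence(ancestor, junk)}/306 positions")
```

The per-base mutation rate is identical for both copies; only the retention differs - which is the whole point.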

So a genetic designer could conceivably store non-core information in DNA, but couldn't reliably expect it to last through an evolutionary time scale. However, I can picture the technology being developed to enable insertion of signature or copyright information in junk DNA that would last the required human timeframe.






References

Dawkins, R (1986): The Blind Watchmaker. Penguin, London.

Saturday, August 02, 2008

Misconstruing computers (Blame it on computers 2)

I heard a radio interview today with a professor from the Music department of Sydney University. The conversation happened to touch on computers. She seemed to indicate she actively avoided computers altogether. One comment was that she couldn't see why everybody needed a computer - it was so wasteful to have one computer per person. Somebody may occasionally catch her at a university computer, but she'd be checking up references in assessing a student's work - students sometimes cite web references.


She rather missed the point about the whole phenomenon. For most people, computers should be - and are - simply a tool in the course of whatever it is they are focused on.

In this sense, a computer is meaningful simply as an extension of one's faculties. It helps record and store information, collate and retrieve and, especially, communicate with others. Such a tool can be and is helpful in any discipline - music not being an exception. She gave a good example of this, above.


There are a variety of hurdles in this construction, for many people.

One lies in the fact that the personal computer is exemplary of technology, and technological advancement: more complex perhaps, but essentially equivalent to all the devices that modern life has mandated coming to grips with through the 20th century. This (coupled with the relative complexity of operation) can ipso facto create barriers. In the case of the professor, she was in her mid-sixties - and age can be a barrier [not that age brings with it mental degeneration, but rather a distance from popular culture, and/or a reluctance or weariness in the face of the continual adaptations needed].

Another hurdle is the very use of the tool. It is far from intuitive for those coming to it for the first time. The rapid evolution of operating systems could eventually make this easier, but I suspect it is happening quite slowly so far - fuelled in part by the fact that those designing the successive generations build rather too readily on existing paradigms of computer usage, rather than trying to make that use more intuitive for someone who may be new to it. In this way, the failures in adaptation/adoption must to a fair extent be laid at the door of those designing (and presenting, maintaining, and teaching) the technology.



My wife also suggests, in all seriousness, that musicians are a different breed, thinking differently, and that they would make for very reluctant adopters of such technology. If true, it's certainly not true for all. And if true for many of them, it should still be much less true for academics, whose minds must have instilled in them sufficient rigour - organisational and analytical - to attain the positions they hold.


The hurdles are scarcely due to lack of need for computers. For nigh on everything we do, they should be able to help us do them better.

Friday, August 01, 2008

Evolutionary tolerance for alcohol

A small mammal has developed a tolerance for alcohol.

This comes from a German study published in the respected PNAS (Proceedings of the National Academy of Sciences), and reported here.

The animal is a pen-tailed tree shrew in a Malaysian rainforest. The shrew, which has ancestors in common with primates, feeds on the flower buds of the Bertam Palm, whose nectar has an alcohol content of 3.2%, comparable to a mid-strength beer.


The researchers report the flower's smell as being somewhat like a brewery. They believe the tree had evolved alcoholic nectar to attract animals, so improving pollination.

But the researchers also report that despite spending a couple of hours per night licking the nectar, the animals show no signs of being drunk. The scientists reckon the tree shrews correspondingly developed mechanisms for metabolising alcohol that are different to human processes.

The nett effect is the development of an ongoing evolutionary ecology that is apparently beneficial to both sides, with no ill effects.

Interesting, but can we learn anything from this? If the mechanism is one of elimination rather than simply tolerance, it may be possible to find ways for humans to do the same.

Which is not to say you could then stay intoxicated with no chronic effects; rather, all effects would be removed. This raises the question of whether there is anything addictive in alcohol that is not related to the intoxicating effect; the researchers seem to suggest the tree shrews maintain a relationship akin to addiction.