Tuesday, May 30, 2006

Tech: Innovative IBM database and BI releases

Two recent initiatives from IBM are promising in terms of information integration and discovery.

Of course, since IBM lost its mantle to Microsoft as the most monolithic and pervasive entity in the I.T. world, it’s been working hard to re-invent itself. It has even sold its PC hardware business to Lenovo, a Chinese company – the very business that fostered microcomputer standardisation, allowed Microsoft to gain pre-eminence, and ate away at IBM’s traditional mainframe business. Its business is currently split between software, services, and mainframe hardware. Mainframes are now a niche market, and it’s the software innovation that garners attention.

Due for imminent release is Viper, software technology for IBM’s DB2 database platform which, amongst other things, allows for “native” XML databasing. The presentation I attended last week gave me the impression it permits mixing conventional relational data with XML-defined data in the one database, but I’d be quite cautious about that until I could see it in action.

This is quite a dramatic initiative*, providing some enabling technology for the Semantic Web (discussed here and here). For me, the significance lies not simply in its ability to handle XML – which can be done in proof-of-concept by any number of vendors – but that it can do this natively, as an integral part of its DB2 product.
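As a rough illustration of the distinction (nothing below is DB2 or Viper syntax – the table and element names are invented), here is a minimal Python sketch contrasting XML stored as an opaque text blob with XML that can be queried structurally, which is roughly what a native XML store offers:

# Illustration only: the difference between treating XML as an opaque blob
# and querying it structurally. A native XML store does the structural part
# inside the engine; the element names ("order", "item") here are invented.
import sqlite3
import xml.etree.ElementTree as ET

doc = '<order id="1"><item sku="A42" qty="3"/><item sku="B7" qty="1"/></order>'

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, body TEXT)")
conn.execute("INSERT INTO orders VALUES (1, ?)", (doc,))

# Blob approach: the database only sees a string, so "queries" are crude text matches.
blob_hits = conn.execute(
    "SELECT id FROM orders WHERE body LIKE '%sku=\"A42\"%'").fetchall()

# Structural approach: parse the XML and query by element and attribute,
# roughly what a native XML store exposes through XQuery/XPath.
structural_hits = [
    row[0]
    for row in conn.execute("SELECT id, body FROM orders")
    if ET.fromstring(row[1]).find(".//item[@sku='A42']") is not None
]

print(blob_hits, structural_hits)  # both find order 1, but only one understands the structure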

Also announced is IBM’s Content Discovery for Business Intelligence. Although this is part of their WebSphere (application server) product range, in concept it permits pervasive business intelligence across an organisation’s structured and unstructured data – provided, I presume, the unstructured data has been sufficiently tagged (manually or automatically). The announcement is careful not to include the term “data mining”, so I’m a bit suspicious of its “discovery” nomenclature. Business Intelligence involves specific query, analysis, and reporting functions, whereas data mining is more a discovery of trends – the difference between asking specific questions and asking what patterns are in the data.
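To make that distinction concrete, here is a toy Python sketch with invented sales data – a BI-style specific question alongside a (very crude) mining-style pass that simply asks what patterns are there:

# Toy illustration of the BI/data-mining distinction drawn above (data invented).
from collections import Counter
from itertools import combinations

sales = [
    {"region": "NSW", "basket": {"beer", "chips"}},
    {"region": "VIC", "basket": {"beer", "chips", "salsa"}},
    {"region": "NSW", "basket": {"wine", "cheese"}},
    {"region": "NSW", "basket": {"beer", "chips"}},
]

# Business Intelligence: a specific question, answered specifically.
nsw_orders = sum(1 for s in sales if s["region"] == "NSW")
print("Orders from NSW:", nsw_orders)

# Data mining (very crudely): ask what patterns exist, without naming them up front.
pair_counts = Counter(
    pair for s in sales for pair in combinations(sorted(s["basket"]), 2)
)
print("Most common item pairing:", pair_counts.most_common(1))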

We’ll find out the full story when the dust settles. Still, access to unstructured data is nothing to be sneezed at. And if Viper can’t immediately database extant web pages, be sure that that’s the direction they’re going.

*1-June: In fact, it's been said to be not that dramatic, that Oracle has had native XML support for some time. I guess it's down to how genuine that "native" label is, and how they mix XML and non-XML. Comments welcome.
(Viper also adds range partitioning, which I can see being particularly useful in a data warehouse/business intelligence context.)
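(For what it's worth, a sketch of why that helps – my own illustration, not DB2 syntax: when rows live in partitions keyed by date range, a query restricted to one period only reads the partition it needs.)

# Rough illustration of partition pruning with invented data: rows sit in
# per-year partitions, so a 2006 query never touches the 2004 or 2005 data.
partitions = {
    2004: [("2004-03-01", 120), ("2004-11-20", 80)],
    2005: [("2005-06-15", 200)],
    2006: [("2006-01-09", 150), ("2006-05-30", 90)],
}

def total_sales_for_year(year):
    rows = partitions.get(year, [])  # only the relevant partition is read
    return sum(amount for _date, amount in rows)

print(total_sales_for_year(2006))  # 240, having scanned a single partition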

World: knowledge, and the pressure to learn

Far more than ever before, we are under pressure to learn. It’s not just the mantra in recent times that “life-long learning” is important. There’s pressure to understand more of the world around us, to understand the implications of the government’s budget, to understand the latest technological gadgets, to understand computers and the internet and the malevolent emails and software that mutates all the time.

There’s even pressure for our elderly relatives to understand all this, to learn and to adapt to the new technology so they can better communicate with their family.

And there’s pressure on the young, to get started at an earlier age, to adapt to not just technology but more intensive learning.


We had the first ever parent-teacher meeting for our daughter yesterday. It certainly wasn’t an informal chat. In the limited time we had, there were a number of tests, KRAs (Key Result Areas), and performance indicators that the teacher had to show us. Even primary teachers have to know more, do more, teach more. I hear choruses of “in my day, the first year was just playtime”. Well, it’s not now.

From my perspective, it’s good to see our five-year-old picking up on things so early. She’s really keen on school anyway. And I’m happy to see that they’re teaching word processing to the six-year-olds. Still, I can see the converse: parents that are putting pressure on the kids early, pressure on the schools to take the kids earlier. And I see that some four- and five-year-olds are not yet ready for that, whether emotionally, behaviourally, or academically. Some of those kids will suffer for years, constantly trying to catch up because they didn’t have a gentle start and weren’t equipped to match their classmates’ pace.


Where does all this pressure come from, to learn and perform quickly? (Too quickly for some.) Partly from family, friends and colleagues. Partly from economists who say it’s a competitive world and we have to match the standards of our nation’s competitors. I also think it’s just in the nature of modern life: to paraphrase Isaac Newton, we stand on the shoulders of giants. We live in a more complex world where we have to absorb (to one extent or another) all the information and knowledge that has gone before. Further, we live in an age of extremely rapid technological advance – particularly compared with the pace of the past. The internet alone has massively revolutionised information and knowledge, in ten short years. And the pace is only going to get faster. We will put more stock in the ability to access information than in the knowledge itself. We will also need the skills to absorb information more readily, and discard it more easily. There will be tools to do some of this, and for many, the path to wisdom stops there. But the wise will be less keen on discarding information, and will be associating and integrating knowledge constantly.

Of itself, information is knowledge only if you can access it instantly. And knowledge leads to wisdom only if you use it well.

Monday, May 29, 2006

Tech: Database the world with XML (Semantic web, part 2)

(part 1 was Semantic web, super web.)


I have a vision: I want to see the whole digital world databased.

Why? Databases are wonderfully associative tools. We can make connections, sort, and list. We can gain new insights into our information with rapid querying and analysis tools (business intelligence tools in particular).

Now, databases are rather inefficient for storing information, as a colleague pointed out to me. But once upon a time, relational databases were said to be impractical in the real world for much the same reason. Then precipitous drops in CPU and storage costs brought the theoretical into the real world, to the point where you’d be hard-pressed to find a database not predicated on the relational model.

My vision will prevail (although I’m in for a bit of a wait). The web will become a virtual database, thanks to semantic web and XML technology. We will see a gradual takeup of the concept, through the markup of new and existing pages in XML, which will define the web page semantically, giving machine-readable meaning to the information on the page. Search engines will need to be more powerful to process that meaning, to integrate an open set of disparate pages. This is the power of the semantic web paradigm, this is how true integration will happen.
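As a sketch of what machine-readable meaning might look like – the markup vocabulary below is invented, standing in for RDF/OWL-style annotation – explicitly tagged facts from separate pages can be pooled and queried like one database:

# Sketch only: a made-up markup vocabulary showing how explicitly tagged facts
# from separate pages become one queryable pool of (subject, predicate, object) triples.
import xml.etree.ElementTree as ET

pages = [
    '<page about="Coogee"><fact predicate="locatedIn" object="Sydney"/></page>',
    '<page about="Sydney"><fact predicate="locatedIn" object="Australia"/></page>',
]

triples = []
for xml_text in pages:
    root = ET.fromstring(xml_text)
    subject = root.get("about")
    for fact in root.findall("fact"):
        triples.append((subject, fact.get("predicate"), fact.get("object")))

# The "web" can now be queried like a database - e.g. follow locatedIn transitively.
def located_in(place):
    for s, p, o in triples:
        if s == place and p == "locatedIn":
            yield o
            yield from located_in(o)

print(list(located_in("Coogee")))  # ['Sydney', 'Australia']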

Finally, the whole of human knowledge will be integrated, and we’ll all be experts on everything… whoops, getting ahead of myself here. (We only think we’re experts.)
Seriously, there’s no reason we won’t go down this path. Of course, beyond a certain point much of this information will remain specific and privatised, sensitive to organisations or individuals. Yet what remains in the public domain – even now – is powerful. We just need the tools in place to boost the value of this chaotic, cluttered web.

Film: The Parallax View (1974)

There's a sub-genre of film called the conspiracy thriller. I thought I made it up just now, but there's actually a Wikipedia entry on it: here's a list of films. There's even a detailed essay on Conspiracy thrillers of the 1970s, by Jay Millikan, a film writer and critic.


Like Millikan, I believe its heyday was in the 1970s, a by-product of the string of political assassinations in the 1960s, plus the belated realisation that the president himself (Nixon) was just as venal and just as capable of conspiracy as the most cynical and heinous of puppetmasters. Of course, a byproduct of that era is the inculcation of cynicism in the general population, to the point that we're no longer surprised by villainy from the top: we're surprised by its absence. Is it just a twist in the plot of life itself that we have to go back as far as the 70s to find a fully ethical president (Carter)? Or does the one follow the other axiomatically?

Alan J Pakula had a relatively brief career as a director. He frequently produced his own films, and his c.v. is quite liberally sprinkled with this genre.

(spoiler warning)

I'm writing about The Parallax View mainly for its hidden gem, detailed below - because I’m not convinced it’s one of the best conspiracy thrillers. The number of loose ends left dangling may be a good reflection of reality, but it's usually less satisfying as a filmic experience. There are also several points where the irrelevant holds too much sway or the relevant - the keys - are fogged. Again, Pakula could be mirroring conspired reality, where we never know - really know - the full story. However, the plot privatises conspiracy. As against most works of the genre where evil is driven from within the system, this film - as far as I can make out - places the answers with a guns-for-hire firm. You can wonder who's doing the hiring, but that seems to be relegated to an afterthought. Still, I have to give Pakula credit for his vision, as detailed at top.

I can forgive this film my personal niggles - including Warren Beatty's irritatingly pervasive 70s haircut - and accept that Pakula has his story to tell, in his way. The gem in this film is where Beatty, undercover, is tested to find out if he has what it takes to be an assassin - a paranoiac disregard for his fellow human. He - and the viewer - is exposed to a prolonged sequence of images set to emotive, manipulative music, which pulls the subject through an emotional rollercoaster. Family, apple pie, love, suffering, killing, wastage, patriotism, sharing, right-thinking, etc, etc. This goes on for some time, and might appear trite on paper - but it works, and it's powerful, and you can feel the manipulation even as it's leading you. It must be quite impressive on the big screen.

Credit must also be given for the non-Hollywood ending – ie Beatty as doomed patsy. The remake of Manchurian Candidate – powerful as it was – couldn’t secure such an ending. These days, Hollywood films just cost too much.


(end spoiler warning)


And the final cut for Pakula was that he was killed - in 1998 - when a pipe on the road was flicked through the windscreen of his car by the car ahead. Irony or conspiracy?

Thursday, May 25, 2006

Tech: So you thought you knew the most basic database rule?

The most basic rule of databasing is First Normal Form. This rule is very simple and glaringly obvious, isn't it? No repeating groups. But how easily can you break this rule in today's databases? Well, James Koopmann has detailed a couple of ways people have casually broken first normal form! Beware, it's easier than you think. I think the post is a very salient lesson in mindfulness, even if you think you know it all and do it all.

I have seen the latter method myself: a field was changed from one value to a string of values. From memory, this changed the field type from int to char. If you do this, it should be deliberate, with a clear understanding of the ramifications. Or better, don't use such an expedient at all.
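A minimal sketch of the trap, assuming (my assumption) that the string held comma-separated values; the schema names are invented. The version that breaks first normal form forces fragile string matching, while the normalised child table lets the database do the work:

# Sketch of the 1NF trap described above. The comma-separated format is my
# assumption about how "a string of values" was stored; table names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")

# Breaks first normal form: a repeating group packed into one column.
conn.execute("CREATE TABLE orders_bad (order_id INTEGER, product_ids TEXT)")
conn.execute("INSERT INTO orders_bad VALUES (1, '42,17,99')")

# Finding orders containing product 42 now needs fragile string matching.
bad = conn.execute(
    "SELECT order_id FROM orders_bad "
    "WHERE ',' || product_ids || ',' LIKE '%,42,%'").fetchall()

# First normal form: one row per order/product pair, in a child table.
conn.execute("CREATE TABLE order_items (order_id INTEGER, product_id INTEGER)")
conn.executemany("INSERT INTO order_items VALUES (?, ?)", [(1, 42), (1, 17), (1, 99)])
good = conn.execute(
    "SELECT DISTINCT order_id FROM order_items WHERE product_id = 42").fetchall()

print(bad, good)  # same answer, but only one schema lets the database enforce and index it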


Footnote: of course, I'm talking about normalisation of relational databases. When I was first exposed to the concepts - early 1980s - it was largely theoretical. Relational databases were presented as only one model, alongside hierarchical and, um, network [thanks Wiki] models. Thank god technology has caught up, and sense prevails.

World: Australia's government abuses human rights

If you’re held indefinitely without trial in a foreign country, you’d expect your government to do all it can to get you out – or get you a fair trial. You have a right as a citizen.

Yes, I’m talking about David Hicks. The Australian government – the Liberals under John Howard – have been shocking in their treatment of this Australian. In the name of politics, they have left Hicks festering in Guantanamo when they have an absolute duty to see that justice is done for their own people. They have made weak statements that they’re encouraging the US government to bring him to trial sooner rather than later. That’s all. And the military trial the US plans to inflict has an appalling standard of justice, which Howard knows and Bush knows.

Why does Howard ignore Hicks? Politics. Like Bush, Howard’s made a meal of terrorism scaremongering. Howard is also kowtowing to Bush because they’re politically likeminded, and Howard’s after a few pennies of trade reward.

(As shown by Amnesty International, this is just one egregious example of the dreadful slide in human rights in Australia under Howard’s watch.)

Of course it’s political. In similar circumstances, Howard’s government has done all it can to aid its citizens. An example is Wang Jianping, held in China for eight years. Even though Hicks’ case is stronger (Wang was a Chinese national who had escaped from a Chinese prison, got Australian citizenship, then was locked up again when he returned to China), the Australian government made repeated representations on Wang’s behalf, and positively crowed about its efforts when he was released. But oh yeah, China’s the enemy and the US is our ally, ergo justice doesn’t apply to Australians equally.

In all probability the case against Hicks is quite weak. But as far as I’m concerned, he could be guilty as sin and he would still deserve fair process. If we don’t give fair process to one person, everyone is put at risk.
I had a friend who was quite happy about Howard’s anti-terrorism laws, and their impingement on due process. He wasn’t a terrorist, so they wouldn’t affect him. But, I said, what if there was a scare relating to China, and he got caught up in the dragnet because he’s Chinese? Maybe because his name is substantially the same as a cousin’s (which it is) – or a stranger’s. He could be held in detention in Australia or the US for months – then told to bugger off, we didn’t really want you. This can now happen.

Amnesty has condemned the Howard government. But not all Australians stand with Howard’s mob on this.

Monday, May 22, 2006

Tech: Gaining trust on the internet

“Deciding whether or not to trust a person is like deciding whether or not to climb a tree, because you might get a wonderful view from the highest branch, or you might simply get covered in sap, and for this reason many people choose to spend their time alone and indoors, where it is harder to get a splinter.”
Lemony Snicket, The Penultimate Peril



Trust is about building relationships. You balance the amount you commit to that trust against the amount of risk involved. All this is much harder on the internet, where you have neither personal contact, nor physical infrastructure to provide comfort. Other methods must be used.

It’s a big issue here in cyberspace. Luis Suarez pondered the issues involved in collaborative processes - how do you trust someone on the net to be an expert?

The trust process online doesn't translate too easily from real-world mechanisms. I was looking for a car recently, and there was no real time to build relationships. The basis for trust I used was personal contact; I felt I could trust the green car bloke more than the red car bloke.* But I can't use this judgment process online. Those relationships either take time, or rely on offline experience and knowledge. On Ebay, the feedback mechanism uses other people’s testimony, other relationships built. Yet to be effective, sellers need to ramp up from scratch. How do you trust someone with no record? You build a record slowly, and people accede or not; the risks are greater where the record’s not there.

How do you trust a blog? That the writer a) is telling the truth; b) is saying something worthwhile; c) is worth spending the time reading? Again, it’s a matter of building up a record; sometimes with the help of links from trusted sources.

So far, then, I see three broad methods at work, when seeking to trust an unknown:
1) developing the relationship over time yourself;
2) recommendation from a reliable source;
3) recommendation from a large number of sources.
I’ve used all three in my time. Sometimes, more than one at once. For example, with Wikipedia I built up a level of trust over time (method 1) then examined its structure much more closely, which provided evidence for methods 2 and 3.

We’ve already seen a variety of trust mechanisms build up on the net, and I’m sure there’ll be new ideas to come. But will they all make use of those three methods?


*In the end, I bought a car from someone else, on the basis of a better deal – and the fact that I was dealing with someone who I felt comfortable trusting (partly because she was a woman).

Pers: Balancing lessons with spontaneity

The brain is not as delicate as we might imagine. It spends its life creating connections (synapses) between cells (neurons), and it’s those synapses that mark us out. Despite the fact that the number of synapses and neurons constantly declines, we seem to have neurons to spare, each with an average of 7,000 connections to other brain cells.

On the other hand, I watch my kids draw pictures, learn music, dance and sport, learn about the alphabet and life. I see the connections slowly form. But I also see that this process reduces their spontaneity, their imagination. Channels it and reduces it. The power of the organic brain over the electronic one lies in our associative abilities – being able to freely make connections between ideas.

While watching my daughter at her piano lessons, it struck me that while learning helps people develop useful patterns of action or behaviour, that same process grinds our behaviour along those pathways that are gradually worn into our brains. We have less of a struggle making sense of the world, but more of a struggle breaking free of those restrictions.

At the piano, we practise and practise, and grind in the pathways that tell our fingers what to do. As the lesson is learnt, we then have the opportunity to create, to put our own emotion into the work. We can veer – slightly – off the pathways. But never far off, and the lesson has to be refreshed.

It is hard to dispute the value in learning rules so we can break them. Without those rules, what we do, what we produce is less meaningful, less useful. But at the same time, the instilling of those rules plays a part in stifling our free association, our imagination. Maybe the answer lies in our capability to balance the lessons learnt and the breaking free from them. Of course, it's easier to stay within the bounds of the lesson than to escape.



Disclaimers: Although my brain retains a useful associative ability, I’m as guilty as the rest of us at learning lessons and not travelling beyond. And I haven't gone far enough with the piano to express or extemporise.

Friday, May 19, 2006

Tech: Regulation: do you think the Internet should be a free-for-all?

Another day, another complaint about internet regulation.

There’s good and bad to that issue, and it shouldn’t be said that it’s either one or the other. It just reflects life.

It reflects life in that trafficking in child porn is illegal. It reflects life in that exchanging perfect copies of music peer-to-peer is a breach of copyright. And, of course, it’s no surprise that China would institute various forms of censorship in its citizens’ use of the net (what should be more noteworthy is the extent to which it has achieved its aims on that score). It’s to be expected that each country would – within their ability – try to ensure that regulation of the internet reflects the legal code of the real world. I guess there are people out there who remember the days when the internet was the wild frontier – their beast - and resent efforts to tame it.

However, it’s unsurprising that regulatory bodies are still playing catchup on the information highway. The net moves at the speed of light, while legislation moves at the speed of… legislators. That’s why, for example, we see so many song lyric sites, which are surely a breach of copyright too (not to mention the misquotes that abound). Yes, it’s like playing catch with mercury sometimes, but that doesn’t mean regulation shouldn’t, or won’t, exist. I suspect the gap between technology to spread content and technology to regulate it is probably narrowing; it’s certainly not getting wider.

Governments represent people (imperfectly, to a greater or lesser extent); copyright organisations represent copyright holders; and so on. Anyone else gets to battle out libel laws in court, as ever. However, I note that much of what we do on the internet is mediated, for example by ISPs, and if those mediators exert undue caution because of concern for lawsuits, then again it’s not much different from the real world.

There remain outposts of wilderness on the net, particularly countries of the former Soviet Union, where it’s harder to exert control over things like copyright and child pornography. However, I would expect these to be gradually addressed through international political and diplomatic means, as they are in the real world. The only difference is that the internet puts it all at our fingertips, so the physical barriers are few. This is something the legislators are gradually seeing the need to come to grips with.

One person’s censorship is another person’s ethical standard. The battle is fought in the political arena. On a positive note, I feel we have refined our ethical standards reasonably well over the past thirty years – for the most part. We are a more tolerant society, while maintaining standards on fundamental issues.

I don’t understand why anybody is surprised about internet regulation. Everyone has their ethical standards, no matter how low they are.

World: Two questions on ethics – one on life, one on death

A couple of adjacent news articles from May 6th made me uneasy at first. But looking back on them later, I find the ethical issues somewhat more murky.

Woman defends giving birth at the age of 63
A 63 year old psychiatrist is due to become Britain’s oldest mother. She underwent fertility treatment somewhere in the former Soviet Union, with a “maverick” Italian doctor, who I believe caused controversy in the past, with similar treatment for an over-sixty Romanian woman.
After the death of her husband, this British woman married again in 2003, to a man who is now 60. She already has two adult children. Although they approved, she was criticised by sociologists, an IVF group, and her cousin, who said she was the same age “and when I look after my grandchildren I’m tired after 10 minutes”. On similar lines, another comment was that “he or she is going to be without a mother or father at the most crucial moment of adolescence or when that child is growing to maturity”.
So the factors include:
- the ability to look after children late in life (including whether the mother is still alive!);
- the definitely increased physical risks that can jeopardise the baby, including abnormal pregnancy situations, arthritis, heart disease and cancer;
- on the other hand, the many instances of fathers – rich and poor – who have children naturally when they’re past sixty, including Clint Eastwood, Rupert Murdoch and, I seem to recall from a few years ago, a claim about someone who was around ninety. It is rare but possible, of course, for men to be fertile in this age group.
On the whole, I would accede to the uniform judgment of Western medical systems, which have refused treatment for people this old, presumably on ethical grounds.

Director leaps to defense of bridge suicide film
People seem to make films about anything these days. Plenty have caused moral outrage, and a subset of those encapsulate ethical issues.
This furore is over a film about suicide at a popular venue, San Francisco’s Golden Gate Bridge. Camera crews were set up to monitor activity during all daylight hours of 2004, and recorded 23 suicides, six of which were shown in the film. They zoomed in on people showing erratic behaviour. The director said the crews notified officials whenever someone climbed over the railing, and said this saved lives. I’d note here that suicide is typically planned with little certainty, and a failed attempt doesn’t necessarily result in a later successful attempt.
Questions:
- Is this a snuff movie?
- Would it encourage more copycat suicides?
- Would it encourage officials to post cameras themselves?
I guess the two main issues around this exercise are whether it would lead to more or fewer suicides, and whether the suicides should be shown in a film. The director’s stated intent was to “save lives by raising awareness”, but intentions and outcomes are often at odds.


On the whole, I’d be inclined to think the first situation is not ethically sound, while the second one is not ethically unsound. But I’m not yet thoroughly convinced either way. Thoughts, comments?

Tuesday, May 16, 2006

Tech: The future of Broadband (CeBIT part 3)

This is the final episode in my matters arising from CeBIT Australia 2006. Part one is an overview of future trends; part two is on Business Intelligence misconceptions.


Paul Budde, a telecommunications industry analyst, gave a series of seminars on the future of Broadband. His company Buddecom spans the globe; its biggest customer is the US government. Budde came across as particularly credible, and was fearless in disagreeing with others on his vision of the future. The following comments are all his, but any mistakes in transcription are mine.

Budde considers 64K to be acceptable for voice quality (speeds here being in kilobits (K) or megabits (M) per second); 1M provides excellent audio; 2M suits videoconferencing; and 6M gives good video. Yet what is referred to as broadband in Australia is only 256 or 512K. There is no standard definition, but most comparable countries refer to anything 1M and upwards as broadband.

“Telepresence allows us to redefine space, time and knowledge”
- Paul Budde
The future is definitely in Fibre to the Home, ie fibre optics all the way. Nothing else can sustain the bandwidth en masse. However, this requires heavy infrastructure investment, which means phased rollout, and a number of interim technologies will be used before we get there. FttH will be preceded by Fibre to the Node, according to Budde, with Telstra rolling out 20,000 nodes over the next 3 – 5 years. This would be enough to cover the major urban centres of Sydney, Melbourne and Brisbane (other areas to follow), effectively with a node – a mini-exchange – in every street. It will sustain a 50M bandwidth, compared with the current technologies at 1.5M for ADSL and maximum 8M for ADSL2. Budde predicts 70 – 80% of Australia will have access to FttH within 10 years. FttH will make sense in any built-up area with 1500 – 3000 homes.
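A back-of-envelope check using the figures above (the stream-counting is my own arithmetic, not Budde's): how many of his 6M good-video streams fit down each pipe.

# Back-of-envelope arithmetic using the speeds quoted above (Mbps);
# the stream-counting itself is my own illustration, not Budde's.
link_speeds = {"ADSL": 1.5, "ADSL2": 8, "ADSL2+": 24, "FttN": 50}
good_video = 6  # Mbps for good video, per Budde

for name, mbps in link_speeds.items():
    streams = int(mbps // good_video)
    print(f"{name:6s} {mbps:5.1f} Mbps -> {streams} simultaneous good-video stream(s)")
# ADSL carries none at all; ADSL2 manages one; FttN carries eight.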

All this will be specifically driven by video, which has highest bandwidth demands. For this, consumers will be at the front, rather than business – in particular, via IPTV: not broadcast, but video-on-demand.

Wireless will be one of those interim solutions, but Budde says it will eventually be relegated to urban fringes, where the lower population density can’t sustain local nodes yet can cope with the spectrum demands. Satellite can deliver to remote areas, but costs are high, and subsidies will be needed for this 1 – 2% of the population. (This link gives the latest on government plans for infrastructure spending – subsidies – which Budde fears will be utilised less than optimally, due to pork barrelling.)

ISPs will roll out their own DSLAMs (nodes); competition will be somewhat ahead of Telstra, but it will pay to be close to a node.

Meanwhile, Telstra is trying to prolong the life of its copper network. They claim to have no plans to roll out ADSL2+ (a protocol permitting 24M speeds, but which tapers off to ADSL speed – 1.5M – after about 4 kms); however, Budde believes they will make an announcement offering this service before the end of June. Currently, only one provider – Internode – offers ADSL2+, with about seven nodes in Sydney, so coverage is far from complete. Of course, in all this talk it’s the higher density populations that will first have access to the new technologies.

Broadband use in Australia will reach 4 million by the end of the year, yet wireless currently has only 50,000 subscribers – it has not taken off significantly here.

Throughout this, the regulatory environment – and Telstra in particular – looms large. Telstra will be battling to preserve and/or capture monopolies; however, there’s no guarantee government tenders for subsidised remote areas will go their way. Budde in fact anticipates that Telstra’s structure will be rationalised by Structural Separation – making a clean divide between infrastructure and content provision. He expects this to happen within 5 years. (This makes eminent sense as a vision, yet I think it will be hampered by the government’s plan to sell off its remaining stake by the end of this year. That makes SS harder; it would have to be accomplished by legislating the breakup of a private company. Budde expects the politics to nobble the roadmap somewhat, and it’s hard to disagree with that.)

Meanwhile, Budde expects to see transformations in existing providers from dumb pipe operators to smart pipe operators, providing services such as outsourcing, data centres, and content hosting.

As I said, Budde came across as credible and knowledgeable. His seminars were always standing room only – something I saw at none of the other three presentation sites.

Pers: Random thoughts while walking before dawn

The far corner of the sky was glowing when I set out; a patch of sky light in the east-north-east. It was still dark, and moisture was in the air although it wasn’t quite raining.

Street lights, streetscape and the occasional plume of pollution from a passing car; night can’t disguise the fact that we live as far away from the country as you can get in Sydney.

It wasn’t too cold; once I’d warmed up with a run, I stayed warm. I was listening to Explosions In The Sky; still not getting too much out of it, but pleasant as aural background.

We live at the back of the basin that is Coogee; half the world is over a hill and the rest is out to sea. Close by, we’re bisected by Carrington Road; in the words of the five-year-old, you can “see the whole world” from either end – or from anywhere in the semi-crown of hills surrounding the beach. From that perspective, Coogee feels like the whole world.

Marked development has increased the density of the built environment over the past six years. On our block alone, there’s been: a house replaced by a multi-dwelling building; a second storey added to a semi; a carport attached to the front of a house (which diminished its presence); an additional dwelling built above a set of garages; and our alterations (invisible from the street, no increased footprint). This is typical of the changes in Coogee. Like the light building in the dawn sky, the difference from one point in time to another is imperceptible, until suddenly you notice the world is different.

It’s an interesting world if you like looking at houses – which I do, sometimes. Imagining what it’s like to live in this house or that, being much fussier than reality affords us. Yet although most houses are in better shape than ours, most have less open space, and no large trees. Quite a canopy there would be if everyone had our quota.

It may seem like a hotchpotch, but the houses are largely variations on a small set of themes, which lends both unity and diversity to the experience. California bungalows abound, but with so many mutations to the façade that it’s hard to find two alike.


More than one property has replaced the nature strip with a collection of native plants. It may look lower maintenance than grass, but it’s probably not. Rare enough to seem out of keeping, but a good idea on consideration. Yet our own garden is decidedly non-native; more by inclination than conscious choice. Rather out of keeping, if you think we should learn to appreciate the subtleties of the dry continent.

Light by the time I got back. A short time later, the sun briefly bathed the back garden in yellow, speckled by gentle rain.

Monday, May 15, 2006

Tech: Business Intelligence misconceptions (CeBIT part 2)

Part two in matters arising from CeBIT Australia 2006. Part one is my overview of future trends (revised). Part three is on the future of broadband, a crucial aspect of information integration.


What is “Business Intelligence”? In a nutshell, it encompasses database query, data analysis, and reporting functions, usually wrapped in a single software package. Typically, it runs off a data warehouse or data mart, a database (often denormalised) that is fed information from, and is offline from, an organisation’s transaction database(s). This can often utilise data ‘cubes’ extracted from the database for ease of analysis, but it doesn’t have to.
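For the uninitiated, here is a toy Python sketch (invented figures) of what a cube buys you: facts pre-grouped by a few dimensions, so they can be sliced and totalled on demand.

# Toy cube: facts keyed by (region, product, month) dimensions; figures invented.
from collections import defaultdict

facts = [
    ("NSW", "widgets", "2006-04", 120),
    ("NSW", "gadgets", "2006-04", 80),
    ("VIC", "widgets", "2006-05", 200),
    ("NSW", "widgets", "2006-05", 150),
]

cube = defaultdict(float)
for region, product, month, amount in facts:
    cube[(region, product, month)] += amount

def slice_total(region=None, product=None, month=None):
    """Sum the cube along any combination of dimensions (None means all values)."""
    return sum(
        v for (r, p, m), v in cube.items()
        if (region is None or r == region)
        and (product is None or p == product)
        and (month is None or m == month)
    )

print(slice_total(region="NSW"))       # 350.0: all NSW sales
print(slice_total(product="widgets"))  # 470.0: widgets across all regions and months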

Now that we’ve got that out of the way: the term is widely misunderstood or abused, going by its categorisation at CeBIT Australia 2006. Some of this is likely due to over-zealous marketing, but I think there are some genuine misconceptions, as would happen with any term that has its fifteen minutes of fame as a buzzword.

The most common misuse is to describe any reporting as BI. A suite of reports is not BI if you can’t dynamically query data, analyse it (with pivot tables, for example), and create custom reports. Some of the self-styled BI exhibitors had a small set of pre-canned reports; some could create custom reports - in a limited way. Only one - Yellowfin - could perform most BI functions.
Pronto said they had a business intelligence module – but it wasn’t installed at the show. Others, including Sage, swore they were in development as we spoke. However, the overall record is poor. I talked to 13 companies purporting to do BI, and found only one had a product to demonstrate that remotely approached Business Intelligence.
There were even one or two – Baycorp being an egregious example – that categorised themselves as Business Intelligence because they provided an information service to business. Well yes, but it’s not BI in an IT sense.

Aggressive marketers aside, it sounds like some education is needed, even within the IT world.

Sunday, May 14, 2006

Film: The Man With The Golden Arm (1955)

There's no standard definition for film noir, although like a lot of people, it evokes for me a gritty 1950s monochrome image, detectives and desperate men, night shadows and the City's criminal underbelly. Yet Polanski's Chinatown was set in the harsh California sunlight, and Sweet Smell of Success contained no criminals or detectives. For my money, film noir is typified by the description above, but is more generally a state of mind, a film pervaded with unease.

Otto Preminger's 1955 film The Man With The Golden Arm certainly fits the bill. Frank Sinatra plays Frankie, a junkie fresh out of jail, trying to stay clean, and go straight. The film tracks his cycles of success and failure on these scores. He’s a skilled poker dealer, who took the rap for the game’s organiser, but he wants to make a career from his new-found talent in drumming.
Elmer Bernstein’s jazzy score is strong, pervasive, sometimes quite jarring, in keeping with the subject matter. And that subject is addiction, in all its forms. The alcoholic is made to humiliate himself for a drink. The gamblers want to gamble, and Frankie is reeled back into the game, despite landing in jail for it. His wife is addicted to pity, and pretends to be crippled to attract that pity. His girlfriend is “addicted” to love, and languidly lights up a cigarette, only to focus instead on Frankie’s eyes, seeing in horror his slide back into heroin. Absolutely everyone in this film has an addiction of one type or another. Even Frankie’s devoted sidekick Sparrow, who seems to be there for comic relief, actually shows his addiction for Frankie with pathos.

It’s gritty, full of desperation and moral ambiguity, and largely devoid of sympathy. I guess the viewer is meant to root for Frankie. Not so easy in my case, since I didn’t warm to Frank Sinatra in the role. For one thing, he was – for me – not remotely plausible as a drummer. And Frankie is a complex character to portray, with moments of tenderness and brutality, depravity and redemption. Although Sinatra could show the hard side with excellence, he needed to carry the film through its upbeat moments and its softer scenes, right through to the end – but he was not fully convincing.
Sinatra apparently grabbed the role from Marlon Brando. I don’t know if Brando was ideal either, but he certainly had the depth as an actor that Sinatra lacked.

It’s good, and overall a successful film, definitely worth a look. Yet I found it a hard film to watch. Very hard. It took me several goes over a long period to persist all the way to the end. Maybe this is what makes it a good example of film noir.

Tuesday, May 09, 2006

Tech: Seeking the future at CeBIT Australia (Part 1 of 3)

Part two of my posts on CeBIT discusses business intelligence misconceptions; part three discusses the future of broadband.


Today I tried to glean the IT future by forging my way through CeBIT Australia 2006. Only somewhat more reliable than reading tea leaves. Still, some useful insights emerged.

Top Tier IT companies had no presence, apart from Telstra. They had a stand as well as a keynote from Randy Lynch, new COO, Telstra Business and Government (to paraphrase: “don’t call them small business, they hate that. Just call them business, and large business, enterprise”). No stands for Microsoft, IBM, Oracle, etc etc. The ones that were left were hungry enough, but it’s hard to tell whether they’re the future, or whether stands were marketed hard and successfully to particular technologies. If it’s not the latter, the future’s in VOIP, broadband, mobile workforces, and USB-enabled everything. Oh, and Blackberry, for some reason.

Noteworthy was a series of presentations on broadband by Paul Budde, a telecomms industry analyst. He was the highlight for me; I’ll devote an entry to him soon.

Of less interest were a bunch of people forcing on me blurbs that I won’t read, more magazines than I have time for, more CDs than I will ever plug in, and more second-string CRM and ERP vendors than I can absorb. Business Intelligence was mostly represented by those last - not too ably, either.

There were really far too many stands and talks to assess them all in one day. Blackberries and iPods were frequent sweeteners (in prize draws), but cheap lollies were absolutely ubiquitous. Some companies were particularly poor at engaging the punter; I could then choose whether or not to engage. If that’s not desired, I suggest they pay more attention to gimmicks - I saw mini-golf, virtual air hockey, build-a-tower, and wheel spins - or something to make the passer-by's experience more sticky. Points given to the stands offering coffee.

The organisers missed a few marks. The guidebook was quite awkward to navigate, and had omissions and spelling mistakes. There could have been more rest points, too, perhaps littered with promos to pay the way. As it was, most of the available seating was in overpriced, low ambience cafeterias.

Specifics:

Blackberries: For all the talk, I’m still not sold. Although they represent a convergent PDA device that does phone and email, one of the articles I did read was scathing about the crippling price plans available through service providers, particularly Telstra. In Australia, push email is not a standard service, and Blackberries get ahead simply because Blackberry (ie Research In Motion) provides its own push service, and the main telcos (Telstra, Optus, Vodafone) use Blackberry. I can see nothing else that couldn’t be done just as well, or better, with normal PDAs. And the screen size of the latest models is still woeful.

Voice Over IP (VOIP): A lot of exhibitors were plugging this, as a service (to business) or a product (eg VOIP handsets). Used to be, the advantage in VOIP was cheap calls via the internet (to other VOIP-enabled parties); the call quality suffered correspondingly. Now, businesses are flocking to it for other reasons such as infrastructure integration. And they're getting better quality out of it.

Ultra Mobile PC (UMPC): Simply put, a new computer format somewhere between PDA and tablet size, with the touch screen of both. Hard to know whether it's just hype, or a format that will stay the distance. I'll stick my neck out and plump for the former. A computer in a pocket is a powerful concept, but if you're going that big, you might as well be carting around a cutdown laptop. Or a tablet. Why reduce functionality by making it even smaller?

Credit Card memory: Wallet Flash is USB memory in a credit card size. It's somewhat thicker than a credit card, and a little USB interface pokes out the side. The exhibitor (from Walletex) swore blind it had proven itself sufficiently robust for the wallet. Prices she quoted me were somewhat comparable to other memory formats, but they were still looking for a distributor, and import costs may be added. I'm happy enough to see yet another format for carrying memory around – with a standard interface.

World: The music dies again: Jack Frost caught out by Grant McLennan’s death

Sad to hear that Grant McLennan died suddenly over the weekend, not even reaching the wrong side of 50.

Of course, McLennan is best known as a singer and songwriter for the Go-Betweens, a seminal Queensland band whose heyday was in the 80s, yet found fresh musical beauty when they reformed a few years ago.

Two things I didn’t know about him: he was active in protests during the repressive Bjelke-Petersen regime in the 70s; and music was an accidental career, when fellow Go-Betweens founder Robert Forster encouraged the budding poet to join him in a band – and learn a musical instrument.

This latter is quite a surprise, given his evocative melodic contributions to the Go-Betweens catalogue with such songs as Cattle And Cane, Bye Bye Pride, and Streets Of Your Town.

Yet I came not to praise him in the Go-Betweens, but to bury him in glory for his achievements with a side project, Jack Frost.

A collaboration between McLennan and Steve Kilbey of The Church, Jack Frost recorded a scant two albums, of which the latter, Snow Job, is an undervalued masterpiece in the Australian musical lexicon.

The songwriting voices of McLennan and Kilbey meshed particularly well on this album. The voices blended well, too, and they were by turns world-weary, bitter, yet powerful, magical, soaring. Anyone who knows Kilbey’s music from The Church well (visit this rich fan site) will recognise this bipolar description, yet Snow Job’s uniqueness was in the depth two accomplished musicians brought to each other, making the earlier eponymous Jack Frost album something of a pre-gig warmup. The egos took a backseat to the songwriting, the music, and the sublime harmonies. They rocked hard and well on Jack Frost Blues, a whimsical look at the persistence of a filmmaker (the wigs got wet so “we shaved our heads, that was the better bet”) – for which they both must have drawn upon their indefatigable musical experiences. Yet the music and harmonies are in incandescent flight for the most part; at a zenith on Cousin/Angel, Empire and Angela Carter (yes, that writer: “she lives in her own world” is the refrain).

I’ll dabble some more in the Go-Betweens; there are gems to be found, particularly in the later of their nine albums. I’ll even try once more to discover the spirit in McLennan’s solo effort Fire Boy. By way of consolation for Jack Frost’s future denied.

Sunday, May 07, 2006

Tech: PDAs - One device to bind them all

A news report says IDC has found that sales of PDAs are plummeting. Another report states that WiFi convergence has been quite lacklustre.


Personally, I think that’s a shame. I’ve had a pocket computer for a couple of years now, and I find it indispensable – yet I still don’t feel I’ve exploited it sufficiently.

Of course, the market has been ripped asunder by competition from two other consumer devices: the mobile phone and the MP3 player. It will only come back together again when convergent devices are the norm.

The fact is, it’s a pain carting around several devices; there’s always been debate about this amongst PDA enthusiasts. Typically, a PDA is bulkier than a phone, doesn’t have phone capabilities, and doesn’t have a lot of memory – less than 100M standard, in most cases. However, the memory can be beefed up pretty easily (typically via an SD card – I use a 1G one, the sweet spot at the moment), and there are actually convergent devices on the market. The O2 is a good example of a feature-packed PDA: it has mobile phone, wireless and camera; HP is doing it too.

The solution isn’t ideal. PDAs with phone capability are bigger than most and expensive, and for what it’s worth, it doesn’t look that cool holding a PDA to your ear.

Convergence is happening. But so far, the marketplace test is the decider, and where the bulk of consumers go, so go economies of scale, and so goes product development. Mobile phones are entrenched in the market. MP3 players have achieved that critical mass consumer acceptance too. PDAs have gone a certain distance, but they haven’t fully broken out of the tech geek enclave.

Why would someone bother? Well, why not? Who wouldn’t want a mobile computer they don’t have to lug about? One place for documents, data exchange, contacts, appointment reminders, music, family photos and videos, maps, games for the kids, even TV/DVD remote control.

Limitations: small screen, and the need to use a foldup keyboard for serious data entry. Challenges for the future.

I don’t have GPS, phone or camera on my PDA. I would like a fully convergent device. But at the rate we’re going, the market dictates that development remains focused on phones. It will come all the same: a GPS, net-linked computer, camera and video phone all in one. I’ll be waiting.

Pers: Ethics of being incognito at the coalface

Coalface ethics:

The story goes that a bloke who was appointed State Director spent his first week on the job incognito at a local office as a junior. The intention was apparently to get a feel for things at the coalface. That’s laudable, but is it ethical?

I don’t think it is, despite disagreement from two people whose opinions I respect.

My problem is with the undercover aspect of the exercise. It is true that if people knew who he was, they would treat him differently, and he would not get as true an idea how it was on the ground. The value of the operation would certainly be diminished.

But I believe ethics is not just about intentions, and it’s not just about doing the right thing. It’s important to be seen to be doing the right thing. Motivation, and what is done with the information gained, is an individual matter. Some people will be beyond reproach in the matter, but certainly not all. Yet ethics is about fair dealing with other people, and private intentions are not open to inspection; they have only an incidental relationship to how the dealing is received by other people. You can have good intentions, but how can you prove it except by being seen to be above board?

Further, there is no guarantee that information gleaned during that week will not and can not be used in some way against some of the individuals at the local office. In the situation in question, the regional office was located with the local office, and the State Director would conceivably have future dealings with the senior management there. In mitigation, the reporting lines were not direct (local offices reported through to the national office, but not via the State Office). The dilemma is murkier here, but for my money, that’s not really arm’s length enough to ensure that a) nobody would be personally impacted by the internship, and b) this is seen to be so. There’s still scope, for example, for the State Director to subsequently bring pressure to affect someone’s career – due in some way to confidences received, slights perceived, etc.

Am I splitting hairs? Comments welcome.



PS Australia’s public broadcaster ABC has a news radio station - Newsradio. Weeknights at 10pm, it gets a feed of All Things Considered, the current affairs programme from US public broadcaster NPR. They have a weekly feature in which an ethicist responds to listeners’ dilemmas. Although he sounds really easy-going, his advice is fairly strict - within a practical, day-to-day framework. Worth a listen; it helps tune up one’s sense of ethics, no matter how finely-honed already.

Thursday, May 04, 2006

Tech: Semantic Web, superweb: part one

When the semantic web takes off, it’ll be like databasing the internet – or encoding it. If it doesn’t take off, it’ll be only a matter of time.

Having said that, I remember coming away only mildly fired up from a World Wide Web Consortium seminar on the topic a couple of years ago (the W3 Consortium is behind the initiative). The seminar was somewhat theoretical, and the topic seemed geared to the academic world. But its current manifestation seems to be proving some worth already in the commercial world, according to an article, Semantic Breakthrough, in this month’s Oracle magazine.

It’s already happening, in more rudimentary form, with metadata in web pages. Web crawler technology makes it a reality. But complex metadata – particularly through XML – opens up a whole new world.
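To illustrate that rudimentary form (the page below is invented), a crawler can already harvest flat meta tags; richer XML metadata would carry structure rather than a bag of keywords:

# Sketch of the rudimentary form: flat <meta> tags a crawler can harvest.
# The page content is invented for illustration.
from html.parser import HTMLParser

page = """<html><head>
<meta name="keywords" content="semantic web, XML, metadata">
<meta name="description" content="Databasing the internet">
</head><body>...</body></html>"""

class MetaHarvester(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}
    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

harvester = MetaHarvester()
harvester.feed(page)
print(harvester.meta)  # {'keywords': 'semantic web, XML, metadata', 'description': 'Databasing the internet'}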

I started pondering variants on semantic encoding. Miscoded semantics: misinformation or disinformation. Coded - encrypted - semantics. And because it's an opt-in situation, whole swathes of the net will not be encoded at all. More on this to come.

The article mentioned Oracle making a direct application of "semantic technology" to grid computing. However, they didn't seem to demonstrate that they were doing anything more than leap on a bandwagon. It seems to be just a matter of resource discovery. Calling for thoughts - are they drawing a long bow to make the connection to the Semantic Web concept?

World: election campaigns: the evil pandering to the dumb

Ahh, politics. Isn’t it so stupid and venal, so much of the time? Don’t you long for the occasional presence of statesmen? Ah, but they wouldn’t get where they are today if they didn’t pander to all sides and ignore their native constituencies.

Of course, John Howard won the 2001 Australian federal election with an intentionally dishonest scare campaign – the children overboard saga. Senior MPs – including Howard – knew they were likely to be misleading the electorate, by claiming asylum seekers were throwing their own children out of their boat. Stupid basis for an election victory? Stupid bloody times.

And we know the 2004 election was won through a dishonest scare campaign too. Media swallowed the Liberal Party ploy, repeatedly headlining their claim that interest rates would run higher under the Australian Labor Party.

The fact that the Reserve Bank upped the interest rate yesterday has been trumpeted as proof of the lie. But it’s not really, of course. Editors are slaves to the story angle.

If it wasn’t for that, the Liberal Party wouldn’t be able to sustain their fear campaigns. But that’s how politics and media works. At the worst - at election time - they are evil carrion-feeders, teaming up to prevent any semblance of rationality creeping into a serious decision about stewardship. A semblance of media bipartisanship [support for the ALP’s 1993 GST scare campaign] is stone cold comfort.

It’s not the fault of a monolithically dumb electorate. Most people will have solidified their opinion over the previous term. It’s the fault of the pig-idiot strata on the cusp, those in the middle who were stupefied by the responsibility, giving their soul to any perverted doomsayer that could insert a half-baked idea through their drunken numbness.

But we know that, don’t we?

We know that both sides were taken over years back by the same monetary ideology (here called economic rationalism). We know that yesterday’s hike proves nothing, and that interest rates are on the rise worldwide.

We also know by now that the middle ground is populated by rubber-spined penny-pinchers who can’t see past their noses, let alone next week. It’s a shame politicians learn that lesson so rapidly.

We shouldn’t underestimate the cynicism and inhumanity of politicians and their criminal media cohorts, who love an alarmist story at the worst of times, election time.

Wednesday, May 03, 2006

Tech: Business Intelligence: Hyperion and Microsoft seek shelter in each other’s arms

Unexpected news today that Hyperion and Microsoft are to integrate their Business Intelligence offerings.

Hmm. That report wasn’t greatly helpful: integration covers a multitude of sins. I had a closer look, via the respective companies’ websites. The press release is identical (except that Hyperion’s site was easier to navigate!); in fact, the announcement was made at Hyperion’s global Solutions 2006 conference.

The key word here is ‘interoperability’. They are planning to allow their respective products to work with each other. In particular, people will be able to access components of Microsoft’s SQL Server with Hyperion tools, and access Hyperion components from SQL Server Reporting Services.

Although the proof of the pudding etc, this is a significant initiative. In a number of ways, the two companies have become direct competitors. Like few others, Hyperion has built up a vertical solution package encompassing database, business intelligence, OLAP (online analytical processing), ETL (extract, transform, load) and a healthy range of tools relating to Business Performance Management. They’ve done this both by internal development and acquisition – with a particularly useful find in Brio for business intelligence [unfortunately, they’ve drowned the brand name, but you can find it as significant parts of Hyperion Intelligence v8, or Hyperion 9 BI+].

On the other hand, Microsoft is simply Microsoft. They mostly build their own (the SQL Server database technology was acquired from Sybase). As the elephant in the living room, they’ve diverged from other BI toolsets - as I’ve discussed before. Their delivery model tends to be “here’s the components in pieces on the floor. Oh, would you like one of our partners to do some consulting?”

However, by now they both have the full toolset and are going head to head.

Which is one of the reasons the news was unexpected – but good. Interoperability is good. It’s going to be essential for the survival of software companies, and it’s a boon for the information consumer. Hyperion and Microsoft each have something to gain from this initiative – although Hyperion more so, given the disparity in market presence. In fact, Hyperion has already integrated Microsoft's .NET development framework into a number of its products.

Hyperion seems to be doing what is needed to survive in its core areas. By comparison, a takeover of one BI company by another (as with Crystal Reports and Business Objects) is risible. Expect Hyperion and Microsoft to remain two of the main players in this space.

Tuesday, May 02, 2006

Pers: the myth of UFOs, and Occam’s Razor

Last night I applied Occam’s Razor, and found I did not believe UFO sightings were of extraterrestrials. [Thanks to Mark for reminding me that what I was trying to express at one point was indeed Occam.]


Of course I could be wrong, but that’s the beauty of scientific analysis: it’s fun to be proven wrong. Unless you have invested too much in the stance. Being proven wrong represents an opportunity to explore a whole new world, from a fresh perspective.

According to my Macquarie, Occam’s Razor is “the principle that entities must not be unnecessarily multiplied, which as the principle of economy of hypothesis, is applicable to scientific research”; “William of Occam, d1349?, scholastic philosopher.” [As you can imagine, being a logician in that era, he was excommunicated.] The key phrase here is “economy of hypothesis”. In effect, go for the simplest explanation that covers the evidence.

Of course, often enough a commonly accepted scientific hypothesis doesn’t cover all the available evidence. Sometimes it’s accepted that some evidence falls outside the hypothesis, and needs to be tidied up by later refinement of the theory.

By my application of Occam’s Razor, UFO sightings haven’t been extraterrestrial. The simplest explanation relates to technology at the height of the fad – the 1950s or thereabouts – being rudimentary by today’s standards, and so giving less precise recordings. Further, the cold war fostered a type of paranoia which was propitious for “other” interpretations of unusual phenomena (although UFOs were sometimes attributed to Soviet technology, extraterrestriality was possibly less world-threatening, having no direct implications of nuclear destruction – not to mention more realistic, especially given what we now know about Soviet technology). In fact, it could be said that “alien” now frequently substitutes for “supernatural” or “divine”, reflecting the respective zeitgeists. A bit like St Elmo’s fire, which sailors once saw as supernatural.

In support of this idea, I note that Wikipedia’s list of major sightings has them concentrated in the cold war era.


My problem with the ET explanation is fourfold:
1) Wormholes aside, it would take years for a life form to travel from home base to our solar system. We’re more than four light years from the nearest star, and much, much more distant from the nearest star with plausibly inhabited systems (see the back-of-envelope calculation after this list). Further, if life forms had the technology to make it here, they wouldn’t be playing hide and seek. There’s precious little to be gained from travelling huge distances for brief observations and then travelling back again.
2) Today’s technology, available to amateurs everywhere, is much more sophisticated than anything available at top levels in the 1950s, so any presence would bring multiple, documented reports. Even if a sighting were in an isolated location, the object would necessarily be tracked travelling to that location.
3) Some of the descriptions included the physically impossible, for example high-speed right-angle turns in the sky.
4) We know from SETI projects that there are no ETs nearby. At least, none that are using any communications systems, yet it would be near inconceivable for them not to communicate.
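To put rough numbers on point 1 (a back-of-envelope sketch only; the 10%-of-lightspeed cruise speed is an arbitrary and wildly generous assumption):

# Back-of-envelope travel time to the nearest star system.
# The cruise speed of 10% of lightspeed is an assumed figure,
# already far beyond anything we can build.
DISTANCE_LY = 4.2            # Proxima Centauri, in light years
SPEED_FRACTION_OF_C = 0.10   # assumed cruise speed, as a fraction of c

one_way_years = DISTANCE_LY / SPEED_FRACTION_OF_C
print(f"One-way trip: {one_way_years:.0f} years")      # 42 years
print(f"Round trip:   {2 * one_way_years:.0f} years")  # 84 years

And that’s only to the nearest star; any plausibly inhabited system multiplies those figures considerably.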

My reservations about this are:
a) some difficulty explaining all UFO phenomena – Wikipedia has a comment that about 35% of the best [strongest] cases are unexplained;
b) a minuscule possibility that technological extraterrestrials could develop in environments too hostile for us to consider habitable;
c) wormholes.
However, I’m happy to be proven wrong.

More on the wormholes… someday.

14-May-06 Update: It's plasma
The percentage unexplained just took a dip, due to Project Condign. This study, by the UK Ministry of Defence, examined some 10,000 witness reports and attributed most sightings to plasmas: electrical atmospheric phenomena caused by a range of circumstances, including meteors; air flows shaped them into UFO-like forms. (My source, the Sydney Sun-Herald via the UK’s Guardian newspaper, seems to imply all sightings were attributed to plasma, which doesn’t sound right to me.)

Monday, May 01, 2006

Tech: Software as a Service - the Next Big Thing or a fizzer?

In a fit of generosity, Microsoft’s CEO Steve Ballmer suggested software prices could fall – because of a reduction in piracy!

Ah well, blame it on some over-enthusiastic subeditors, trying to put an angle into a headline. Reading the article, I don’t think that was his overriding message.

First, I think piracy is still absolutely rife with Microsoft products, particularly in China and the developing world. And I suspect that many pirates would not be using the products if they had to pay full price. Ah well, Ballmer must know something. Perhaps piracy’s down in the developed world, where the money is.

Second, I don’t think Microsoft is in for a hard landing anytime soon. They have a number of rivers of gold, even without their flagships of Windows, Office and SQL Server. They have a wealth of resources to draw upon when it comes to plotting the future, developing new golden rivers. For such a company, the chief concern is always going to be maintaining the quarterly profit increase, to keep their shareholders happy.

But Microsoft is aware of the broad threats into the future. They need to – and they do – scrutinise any paradigm shift that could upset their business. Two of those are software as a service (SaaS) and open source software. I have discussed the latter before; the former seems to have been Ballmer’s real message. It was a conference on Web 2.0 – a murky term used as a catch-all for emerging web technologies, and one I’m not convinced includes SaaS. SaaS is effectively web-based software for which you pay ongoing fees, or lose the software.
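To illustrate that “pay ongoing fees, or lose the software” model in the simplest possible terms, here’s a toy sketch (the customer name and dates are invented; this isn’t any vendor’s actual licensing scheme). The point is just that the service checks the subscription every time it’s used, rather than once at installation:

# Toy illustration of the SaaS model: access is granted only while the
# subscription is paid up. All names and dates are invented for the example.
from datetime import date

SUBSCRIPTIONS = {
    "acme-corp": date(2006, 12, 31),  # in reality this lives in the vendor's systems
}

def can_use_service(customer_id, today=None):
    """Return True while the customer's subscription is current."""
    today = today or date.today()
    expiry = SUBSCRIPTIONS.get(customer_id)
    return expiry is not None and today <= expiry

if can_use_service("acme-corp", today=date(2006, 5, 1)):
    print("Serve the application: subscription current.")
else:
    print("Subscription lapsed: access withdrawn.")

The contrast with shrink-wrapped software is the ongoing check: stop paying and the service stops serving.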

A recent article tried to imply that SaaS is the norm, or at least widespread – which it’s not, really. However, as that item pointed out, we do commonly use the model even if we’re not aware of it: antivirus software can only work in that fashion. But I don’t really think SaaS, per se, represents the dawning of a new era. Unless we see common Microsoft products offered that way at a drastically reduced price. And I can’t see that happening, because then Microsoft would find out just how many of its products end up as shelfware. But they’ve probably already done that study, so expect it to be factored into the price. Myself, I don’t expect SaaS to predominate except in the areas where it’s a natural fit.

World: Gun control and Americans

PM John Howard’s recent comments on gun control are a blessing – something you will hardly ever hear me say about his utterances.

That is, if he follows through. He has no specific initiatives planned yet.

Ahead of the 10th anniversary of the Port Arthur massacre, he said that nobody should have firearms unless it was an essential part of their job.

Hallelujah. And yah sucks boo to Americans. There are not many nations that glorify guns like they do. Thank goodness. They have twisted a revolutionary situation two hundred years old to make themselves one of the more dangerous nations in the world in which to live. I would just hate to think that there was a gun in every second house in Sydney, waiting to be used. There’s not. There are guns in Sydney, but not in epidemic proportions, and fewer now with the compulsory buybacks of recent years. The above article details a study which shows a dramatic reduction in murders, suicides, and accidental shootings – specifically since the buybacks. On that basis, our gun deaths are 1.7 per 100,000. What’s yours, America? (And no, other means don’t substitute when guns are absent: homicides and suicides actually go down, as the study illustrates.)

You don’t need guns to defend yourself in Australia. There are some around, but thank goodness we’re not infested.

Postscript: I recently saw a noir-ish film called Suddenly. Frank Sinatra plays a mobster – and he’s a natural! Just about worth seeing for that alone. But it’s got some really redneck attitudes on guns, eg every kid ought to have one, get used to handling guns. Freaky, but they were serious. It’s a sad world where some Americans feel safer with guns.

27-Jun-06: A recent "quality of life" survey rated US cities lower than those of Australasia, Canada and Europe, specifically because of crime and personal security issues.