
24 August, 2010

Review: The Artificial Ape: How Technology Changed the Course of Human Evolution
 by Timothy Taylor

(This review first appeared in The New York Journal of Books on 23rd August, 2010.)

The Artificial Ape is a book with a plausible idea, but that is all it has. If you are looking for a convincing argument that “technology changed the course of human evolution” or even some compelling evidence, this is not the book for you. However, if you like informed speculation about humanity’s prehistoric past and you enjoy surveys and summaries of this immensely long and fascinating period, The Artificial Ape will keep you turning the pages.

Taylor is a well-known and popular archeo-anthropologist who is making a name for himself with controversial speculation. His Prehistory of Sex takes us back 8 million years, and The Buried Soul makes some startling claims about how widespread cannibalism and vampirism were in prehistory. The Artificial Ape follows in this tradition.

Taylor’s main contention is that tool use in early hominins was a necessary step to allow us to develop our large brains. In particular, he speculates that the invention of the baby sling must have occurred about two million years ago (although there is no actual evidence). This would have allowed a hairless ape with an upright gait—and thus a restricted pelvic gap—to give birth to increasingly immature babies, ones that could not cling to their mothers and would need to be carried, thus allowing the brain to continue to grow and develop outside the womb. As Taylor puts it, turning ourselves into artificial marsupials.

He makes much of the fact that tool use in hominins began about 2.5 million years ago, long before signs of accelerating skull size began to appear in the fossil record (after 2 million years ago). It is a puzzle that stone tools were being made and used before Homo ergaster and then Homo erectus began to develop their larger brains, and it is this puzzle that Taylor’s hypothesis attempts to tackle.

Taylor also points to the fact that an ape with an upright gait has a much shorter intestine than one on all fours. This means that not only meat eating but cooking may have been essential precursors to the development of bipedalism, simply because of the difficulty of finding sufficient nourishment from a vegetarian and raw meat diet with a short gut, at a time when we would have been extremely active and burning calories at a rate rarely seen in humans today.

Interestingly, recent evidence, published after the book was released, pushes the date for tool use and meat eating back to perhaps 3.4 million years—the pre-Homo days of Australopithecus afarensis. This find gives Taylor a 1.4 million year gap to explain before brain sizes begin to increase. But it does provide more time for full bipedalism to evolve after tools for butchering meat are first seen.

Given the paucity of the evidence, much of what Taylor proposes must be taken with a pinch of salt. For example, hominin skulls are quite plentiful across the last two million years, but there are only a dozen or so before that time. The graph of brain capacity against time that he presents is quite compelling, but it would not take many new data points in the pre-2-million-year range to make it look very different. More critically for the argument, just three hominin pelvises have been found covering a period of almost 3.5 million years. While they approximately match the required changes in morphology for an ape specializing increasingly in bipedalism and immature neonates, it is very little to base an argument on.

So the book is disappointing in that, having made its surprising but apparently reasonable claim, it then provides scant evidence and only weak arguments in support of it. It is disappointing in other ways, too. It contains long and frequent digressions into areas of human cultural evolution that are not strongly connected to the main argument and which tend to dilute and confuse the message.

While fascinating in their own right, Taylor’s discussions of Neolithic art and culture do not contribute much. Similarly, his extended discussion of why Tasmanian aborigines had apparently “regressed” to a level of tool use and a style of living not far removed from that of chimpanzees, while a very useful antidote to Victorian condemnation of and dismay at their lifestyle (which still persists in a mild form in academic circles today), does not strengthen his argument appreciably.

Some discussion as to why other hominids (the great apes) have not taken the same evolutionary path as humans, despite the strong probability that they were as proficient with tools as our distant ancestors were, would have been worthwhile. It is likely that chimpanzees have been using tools for as long as we have, yet it has not led either to bipedalism or to increased brain size. The same problem arises with birds. Modern studies show surprisingly sophisticated tool use in crows and other bird species, yet we do not see the same evolutionary tie to tool use that Taylor suggests for ourselves. Birds have not become “artificial avians.” Why not?

And the same problem arises with dolphins, which also use tools. Birds also pose an interesting problem for Taylor’s hypothesis: their brains are notoriously small. Claiming that tool use (technology) enables increases in brain size, in the face of a crow’s tiny brain, raises the question of whether the evolution of technologies and brains is causally linked at all. It would have been useful if Taylor had addressed some of these issues.

The Artificial Ape is a good read. It is full of interesting and provocative ideas and information. Yet, while it is interesting and its main idea is appealing, in the end, it fails to make its case.

13 July, 2010

Review: Why Does E=mc2? (And Why Should We Care?) by Brian Cox and Jeff Forshaw

(This review first appeared in the New York Journal of Books on 13th July 2010.)

Why Does E=mc2? is one of those questions that educated non-physicists must have been asking themselves for over a hundred years, ever since Albert Einstein derived the equation back in 1905. Now, in this easy-to-read little book from Brian Cox and Jeff Forshaw, we have the answer. The authors are both professors of physics at Manchester University, and Brian Cox is also a well-known TV personality—well known enough to warrant a jacket blurb from Stephen Fry.
The book begins with the traditional approach to explaining the slowing of clocks for observers in motion relative to one another, by examining the geometry of a light beam bouncing up and down in a moving vehicle. The authors demonstrate just how easy it is to get to Einstein’s time dilation formula using nothing more than Pythagoras’ Theorem and the knowledge that the speed of light is the same for all observers. But they don’t leave it there. In the first half of the book they consider two more approaches that lead us to the same conclusion.
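For readers who want to see just how short the derivation is, the light-clock argument runs roughly as follows (my own sketch of the standard textbook calculation, not a quotation from the book):

```latex
% A light clock of height L rides in a vehicle moving at speed v.
% In the vehicle's rest frame, one tick (beam up and back down) takes
\Delta t_0 = \frac{2L}{c}
% A stationary observer sees the beam trace two sides of a triangle,
% so Pythagoras gives
\left(\frac{c\,\Delta t}{2}\right)^2 = L^2 + \left(\frac{v\,\Delta t}{2}\right)^2
% and solving for \Delta t yields Einstein's time dilation formula:
\Delta t = \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}}
```

The only physical input is that both observers measure the same value of c; the rest is school geometry, which is exactly the authors’ point.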
Along the way, they very cleverly introduce all the ideas we will need to get to the world’s most famous equation, E=mc2. What is more, they focus on the most puzzling part: the question of what c, the speed of light, is doing in there. Very early on, they introduce c as a scaling factor so that we can talk about “distances” in spacetime. Later, by various means, they explain why c has to be the maximum speed that anything can travel. It is a small triumph of the book that Cox and Forshaw make the attempt to show the logical necessity of there being a universal speed limit, and that their arguments are presented so clearly.
Yet, as with any book of this size tackling a subject so enormous, it is not long before the authors start asking us to take things on trust, undermining the comprehensibility of their presentation. The first big one is when they introduce Maxwell’s equations and ask us to believe they demand that the rate of propagation of an electromagnetic field be constant for all observers. Then comes the work of mathematician Emmy Noether and her demonstration that invariance leads to the conservation of quantities. 
These, and many others introduced later, are tough ideas and hard to swallow. The authors introduce them to provide alternative ways into the understanding of relativity and that famous equation. It is to their credit that they do not always hide the complexity nor the long history of ideas behind relativity, but it would have been better, perhaps, to have spent a few more pages on some of these notions. It is also to their credit that they make the case, as Feynman and others have done before them, that, at some level, the weirdness of the universe just has to be accepted, and the only test of physical theories that matters a damn is whether they are supported by actual observation and experiment.
And there would have been many pages to spare for additional background and explanation if, near the end, the book had not wandered into obscure and largely unrelated areas as it tackled a broad-brush description of the Standard Model in an attempt to explain what mass is. It was inevitable that some particle physics had to be discussed and that this would lead to discussions of quantum theory. After all, the book’s sub-title is And Why Should We Care? and the reasons given largely involve nuclear power, chemistry, and cosmology—all of which are helped by discussions at a subatomic level. Perhaps also Brian Cox’s involvement at CERN (he heads a project there to upgrade the ATLAS and CMS detectors for the Large Hadron Collider) meant that a discussion of the Higgs particle was inevitable. Nevertheless, this, and the very brief glimpse of general relativity right at the end, seemed to detract from the clarity and force of the earlier exposition.
It is a curious book that tackles several of the most difficult ideas in modern science in the tone of a friendly, almost patronizing, high-school teacher, trying to ensure that the slow kids manage to keep up with the rest of the class. The tone and the endless asides (did you know that the Sun converts 600 million tonnes of hydrogen into helium every second?) can become a bit wearing, but Cox and Forshaw have to be praised for their unwavering insistence that their subject is accessible to anyone at all who will stay with them and think about it.
In an age when most lay people throw up their hands at the mention of relativity or quantum theory, when religious creation stories and New Age mysticism offer a far simpler, less challenging route for the intellectually overwhelmed, it is hugely important that ordinary people see that physics is not just for the egg-heads, that it can be understood, and that there is a grand beauty in what it reveals about our world. Cox and Forshaw have made an important contribution in this area, one that will help school science teachers as much as it will their students.

14 May, 2010

Review: Epitaph Road by David Patneaude

(This review first appeared in the New York Journal of Books on 12th May, 2010)

Epitaph Road is the latest in a string of successful young adult novels by David Patneaude. In 2067 a world reeling from recent nuclear brinkmanship between the USA and China is suddenly devastated by a virus that kills almost every male person on the planet. Only those males at sea or in remote places survive.
Thirty years later, the world is populated and dominated by women. There are a few more males, but strict birth control laws ensure that the male population cannot rise above 5%. It is a world free of much petty crime and war but one in which the remaining males are subjugated and controlled, and the women in power have all the vices that political elites have always had. 
Kellen Winters is fourteen in 2097, the son of one of the few survivors of the plague they called Elisha’s Bear. He lives with his absentee mother, an important person in the new North American government, and dreams of leaving one day to join his father, who lives as a “loner” among male “throwbacks” on a kind of reservation. He is preparing for his citizenship exams and coping with the oppression and subjugation that is the role of all males, when he and two female classmates stumble on some information that leads them to delve into the origins of the plague that changed the world and which still recurs from time to time. What they uncover sends them on a journey to find his father and warn him about a potential new outbreak. But powerful forces don’t want Kellen to reach the throwbacks, and police and other agencies are searching for him as he and his friends stumble upon another shocking and deadly secret. 
Epitaph Road is a straightforward adventure story in which a group of youngsters fight the forces of an oppressive and hypocritical adult world. It has good pacing, is nicely written, and the adventure runs its course as it should to its proper ending. Yet it is a most unsatisfying story, with two major flaws that spoiled it for me.
The first is the aftermath of the plague. Almost overnight, half the population of the world dies. And it is the male half, the half that hogs most of the power, dominates all industries except the lower-paid service industries, and has a near stranglehold in areas such as engineering, construction, power, transportation, communications, and so on. Yet the world carries on; civilization continues as if nothing has happened. The supply of electricity keeps flowing, the farms and food distribution keep going, the communications networks stay up, the domestic water and sewage networks still operate, and billions of bodies are buried. There is no mass starvation; no millions dying in that first, unheated winter; no disease; no cessation of oil supplies; no massive shortage of doctors. Within a handful of years, countries have merged, a new world political order is established, new education systems are put in place, and massive social change is underway. You might think that, possibly by 2067, there is sexual equality and the work and power disparities of today no longer exist, but according to the story, that is not the case. If anything, male dominance is worse by then.
With a pinch of suspended disbelief, you might get past this issue, but it is the kind of “world building” problem that leaves me very uncomfortable. Worse, however, is the fact that the bulk of the story is set in 2097—almost a hundred years from now—but no technology seems to have advanced beyond today’s level. True, the desktop computers have touch screens, the smartphones are called “e-sponders,” and electric cars are commonplace (although the throwback men still drive old petrol cars), but there are no new technologies.  
The idea that civilization could progress for 90 years without radical new technologies appearing is just incredible. Even if we suppose that technical innovation ended in 2067 when the men died, we should still expect fifty more years’ worth. For example, the World Wide Web is just twenty years old as you read this and was barely known to the world at large before 1995. Yet the Web and texting (a technology of about the same age) are supposedly still the technologies in use by kids in Epitaph Road, eighty-seven years from now.
This is such a massive failure of the imagination, and introduces such a jarring credibility problem, that the question has to be asked: Why didn’t Patneaude set the book in the present? Without more than this minimal nod in the direction of world building, this is not science fiction. So why not make the date 2007 instead of 2097 and let this become an alternative history novel? 
It is a young adult novel, intended for children, but that is no excuse for not treating his readership with more respect. The description and development of the book’s main characters, their complex feelings and motivations, are well up to scratch; the plot is simple and easily anticipated but nicely executed and suited to the genre. However, the book is badly let down by the credibility of a major plot element and the complete failure to present a believable future world.

03 May, 2010

Review: Cro-Magnon by Brian Fagan

(This review first appeared in The New York Journal of Books on 2nd May 2010.)
By any standards, Brian Fagan is a leading authority on archaeology, and, with 46 books on the subject to his credit, he is among the world’s leading popularizers of the field. In Cro-Magnon, he gives us an easily digested round-up of what is known about the pre-history of modern humans in Europe.

Fagan presents an essentially chronological account, starting with the Neanderthals who were already present in Europe when modern humans arrived, and taking a brief detour to look at the evolution of hominins in Africa. From the arrival of Cro-Magnons around 45,000 years ago until the spread of farming in Europe, about 8,000 years ago, the book traces the movements and developing cultures of these people who were the first Homo sapiens to settle the continent. It has a good index and an extensive list of further reading in the Notes section.

If you live in Europe, or are of European descent, then the Cro-Magnons were almost certainly your direct ancestors. Fagan digests and presents for us the extremely complex evidence that reveals population movements and social conditions, without burdening us with details or much controversy. This evidence is mostly archaeological—the bones, human and animal, that were left behind, the stone tools, the excavations, and the paintings and carvings. But he also makes much use of climatological data, studies of modern and recent stone-age peoples, and recent genetic studies, again, sparing us the arguments and supplying only the conclusions.

Fagan works in a field that is massively interpretative. Controversies abound—especially in the assessment of purely social, spiritual, or linguistic aspects of ancient peoples. Yet this reviewer thinks it is a strength of his approach that he delivers what he feels is the most likely interpretation, given a broad, eclectic, yet conservative, summary of the data from many disciplines, merely indicating where there may still be some disagreement among experts. It allows him to present an extended and coherent narrative that makes sense of the whole story of Cro-Magnon settlement in Europe.

And the way he tells it, it was a long, hard struggle. Europe, for most of the time that Cro-Magnons carved out a place there, was a bitterly cold, hostile environment, more akin to Northern Siberia or Canada than to the temperate land we know today. Frozen tundra and barren steppes were what greeted those first immigrants. Yet the Neanderthals had survived there for nearly 200,000 years when we arrived. It is typical of Fagan’s non-controversial approach that he doesn’t indulge in lurid speculation about how modern humans drove the Neanderthals to extinction. It was a slow and gradual process that took place over many thousands of years. In Fagan’s view the Neanderthals simply continued to live their lives as they always had, only with Cro-Magnons hunting the same territories, times just grew harder, until their already-marginal existence was gradually pushed beyond the brink.

Yet, while the absence of detail such as the minutiae of debates about dating and statistical analyses allows Fagan to present the bigger picture with bold strokes, it also leaves you wondering about some of his assertions. He is, for example, very firm on what was men’s work and what was women’s work. How much of that is in the actual evidence, and how much is imported from modern anthropological studies, or even modern prejudices?  And the speculations about whether Neanderthals danced seem fanciful and based on slender evidence (which appears, from what is said, also to be consistent with the hypothesis that they wrestled).

And it isn’t as if there was no room for more detail or more discussion. The book proceeds at the painfully slow pace of a modern TV documentary, with considerable repetition and often tedious dramatizations of life in the late Ice Age. The material in the book could have been presented in perhaps a quarter of the number of pages if not for the slow, repetitious style. 

Clearly written though it is, the language is often clichéd and itself repetitive. (There were several points where I thought if I read the words “bestiary” or “tool kit” one more time, I would throw the book down and jump on it.) That is disappointing, because there are sections—like the discussion of Cro-Magnon art near the end—where Fagan writes with fascination and insight. If the whole book had been like that, it would have been such a joy to read. As it is, the book provides a clear, pain-free summary of what is known about the earliest Europeans—it just happens to be a bit slow.

It is a sign of the rate of change in this field that, even as Cro-Magnon goes to press, DNA analysis of a finger bone found in a southern Siberian cave suggests that a third hominin species may have co-existed with Cro-Magnons and Neanderthals in that region. How would the presence of another human line affect the conclusions in Cro-Magnon? Only future editions will tell.

12 March, 2010

Review: Einstein's God by Krista Tippett

This review originally appeared in the New York Journal of Books, where you can also find other reviews by me.

Einstein’s God: Conversations About Science and the Human Spirit by Krista Tippett
(Penguin Books, February 2010)

Krista Tippett has spent the past decade interviewing people about religion and spiritual ethics as the host of public radio’s “Speaking of Faith.” Einstein’s God is an edited selection from these interviews in which she discusses the relationship between science and religion with a number of eminent guests: some scientists, some not, some believers, some atheists—all of them leaders in their fields with interesting ideas. It’s an eclectic group of guests, and the conversations cover a very broad range of topics, including Darwin’s relationship to religion, the psychological basis of forgiveness and vengeance, and how God might have room to act within the constraints of modern physics.

Unlike most of what appears in print these days about religion’s interactions with science, Tippett’s book is not about conflict. It is about reconciling the two world-views. Its intentions are to show that scientists—even ones that have no religious belief—feel the same sense of awe and wonder at the world as believers, that even the devoutly religious can and should respect the study of the natural world, and that scientists themselves can be practicing believers and feel no contradiction within themselves.

Tippett is attempting in this book what, for many people on both sides of the religion vs. science “debate,” must seem impossible. She is speaking candidly and respectfully to scientists, theologians, and artists about their spirituality and beliefs, seeking to find the common ground between these extremely different world-views. In the process, whether you feel she succeeds or not, she achieves something just as helpful: She finds the common humanity in all these seekers, and gives us a basis for mutual respect and a sense of fellowship.

Within this framework, some of the interviews work better than others. The first interviewee in the book is the main reason I wanted to read it, physicist Freeman Dyson discussing Einstein’s spirituality. Yet the conversation was dry, if not dull. It covered ground that would be well known to anyone interested in Einstein. The only point of real interest it made was the idea that the feeling Einstein had about the Universe and how it is put together, about the “miraculous” way mathematics is able to describe nature (there being no reason anybody knows why it should), is very close to the religious sense that believers have when they contemplate Creation.

Unfortunately, this interview and the one that follows it with physicist Paul Davies, may have been recorded too early for either Dyson or Davies to be aware of a letter on religion that Einstein wrote to the philosopher Eric Gutkind in 1954, which became well known only in 2008. In the letter he clearly denies any belief in God—not just the “personal God” he famously rejected—saying, “The word god is for me nothing more than the expression and product of human weaknesses, the Bible a collection of honorable, but still primitive legends, which are nevertheless pretty childish.” Tippett should have known about this letter and, I think, addressed the complexity it adds to Einstein’s expressed views on religion.

Once we leave Dyson behind, the interviews become more lively and engaging. Also, after the initial discussion about Einstein, the collection moves away from him, specifically, and goes off to explore the interplay between science and religion in other disciplines and through other thinkers. Sherwin Nuland, a surgeon, talks about his notion that human spirituality and religious feeling, human good and evil themselves, are the products of an evolutionary process that has selected and nurtured them. Tippett’s comment that such ideas “might richly inform many religious perspectives” is typical of the hopeful and inclusive attitude she projects throughout the book. Whatever we might think of the likelihood of this happening, it is impossible not to wish with her that it could be so.

High spots for me were the chat with Janna Levin, another world-class physicist, who talked about her novel, A Madman Dreams of Turing Machines, and about a rationalist world-view that is nevertheless filled with wonder and beauty. Psychologist Michael McCullough talked about the evolution of forgiveness and its central, everyday role in preserving civilization. Charles Darwin’s biographer, James Moore, was eloquent in describing the deep reverence of the Great Man for the natural world “undefaced by the hand of man.” And Esther Sternberg, a Canadian immunologist I had not encountered before, was fascinating on the complex connection between health and emotion.

Low spots included Anglican priest (and one-time physicist) John Polkinghorne who, while decrying “God of the gaps” arguments, proceeded to describe a Universe where God excludes himself from all but the most marginal influence through quantum uncertainty and chaotic processes. Polkinghorne appears to be a favorite of Tippett’s, judging by the number of times she mentions him in the book, yet I found his message that God created a self-creating Universe (i.e., He set up the initial rules and conditions but then lets it run more-or-less untended) far less intellectually satisfying (or even honest) than that expressed by V. V. Raman in another interview. Raman, a Hindu, seems able to keep his religious and scientific world-views completely separate and to experience the world in these two, quite different ways without feeling the need to fit them together.

I also found that the format of the book—essentially a series of transcripts with an overall introduction, introductory remarks before each interview, and break-out comments within (generally to give background to what is being discussed)—was rather tedious and involved a lot of repetition.

Einstein’s God swings between fascinating and infuriating with only a little dull in between. It would almost be impossible for it to do anything else with such interesting and controversial contributors involved. Tippett has attempted to move us away from the often hostile and sterile debate between science and religion, and instead demonstrate how, in the ordinary world of people’s lives, scientists and theologians are asking the same questions of and feeling the same wonder at the world they inhabit, without conflict, and with great humility and respect for the truth. And I think she has made a good job of it. However, her eclectic and inclusive approach may have worked against her to some extent. Suggesting, by their inclusion, that all religions are somehow equivalent and the content of their doctrines does not really matter, reduces them to the status of a mystical or spiritual impulse, whereby they can, indeed, be compared to Einstein’s “religious sense” of the Universe. It’s possible that some believers will be offended by this. But in the end, perhaps Tippett’s point is that it is the urge toward spirituality that is really important for most of us, and whether we satisfy it through scientific study or through religious devotion matters very little.

10 March, 2009

Jathia's Wager - Don't Bother

I just finished watching an open source movie - and it was awful.

Now I like open source software. I use it all the time. It's great. So it seemed worth a look to see what an open source movie was like - especially since this one is a sci-fi movie called 'Jathia's Wager'. Go and see it at the Moviepals site if you have 20 minutes to spare.

I suppose it's early days and open source movies might get better, but this one was very badly written. (The filming, directing and acting looked pretty ordinary too but what do I know from making movies?) There were tedious passages with no dialogue where almost nothing happened. There were tedious passages of pure exposition where absolutely nothing happened. Oh, and did I mention it was tedious? Don't waste your time looking for realistic dialogue or any hint of humour, either.

[WARNING: Spoiler.]

As for the plot, I have no idea what happened. It's a mystery. There was a guy running around a lot. He seemed to be one of some humans left behind when others went transhuman. Although most of the future humans were religious nuts and liked being ignorant (so no change there, then) our hero had the option to join the post-humans, which, after one of the tedious exposition segments I mentioned earlier, he took. Then he came back for his sister, who either went with him and came back again, or had a dream about it and then ran into the hills screaming. Don't ask me which, or why. As for the 'wager' in the title, maybe I missed something...

In places the film had that quirky, amateurish quality that made 'The Man Who Fell to Earth' so charming. In other places it was just amateurish. Parts were even (unintentionally) comical.

I imagine it took the people involved lots of time and effort to put this together, so it is sad that they wasted their opportunity to do something interesting and good. There must be hundreds of excellent writers out there who could have written a script a thousand times better than 'Jathia's Wager'. Maybe the up-side of this is that, having set the ball rolling, the next open-source movie project will attract better writers.

31 December, 2008

2008 Retrospective

Some year, huh?

From the Obama election win to the all-but-collapse of the world economy (see 2009 for the grand finale) there have been some major world events none of us will forget in a hurry.

On a personal level, this has been an amazing year too. I finally ran down my consultancy business and retired - just in time for my savings to be halved by the America-led economic collapse. (Thanks, guys.) In return, however, I got a full year of living on this mountain, surrounded by beautiful forest and wildlife, with nothing but peace, sunshine, and my wonderful wife to keep me company. I measure my personal wealth in terms of how much leisure I have to pursue the things I enjoy, so 2008 has been a year of immense riches.

I also got a dog. Bertie - or Gobby, as I mostly call him - is a purebred mixed blessing. Handsome, fit and happy, great fun, clownish and playful, he's also a right royal pain in the arse. Mostly, now, he can control his bladder. Mostly, he doesn't steal and eat everything in his reach. But he still likes to jump on guests and chew their faces, and he has picked up new tricks, like jumping in the dam and then drying himself on the carpet, and chasing after cars like a bat out of Hell. Has he improved my life or not? The jury is still out, but 2008 is the year I'll remember as the one in which Bertie was a wild and crazy puppy.

And then there was the writing. If you only know me from this blog and not the other one, you might not even be aware that my new career as a writer of fiction has finally begun to take off. In May I won a place on a 'manuscript development retreat' after submitting my unpublished novel Time and Tyde in a national competition. It didn't lead to publication or anything but it gave me such a huge boost in understanding of the whole writing and publishing business that, in the seven months since then, I have had four short stories accepted for publication (only one is out so far), I was short-listed in one short-story competition, and was the winner in another. I have also written and polished a whole new novel (called TimeSplash!) which I am now looking for an agent to represent. This may not seem like much, but it represents a major breakthrough for me. In the whole of my life until May 2008, I had published only one short story, and had never won a writing competition. If I can keep up the momentum, 2008 will be the year I remember as the turning point in my writing career.

And there were lots of other things too - Wifie built her first website, Daughter passed her driving test, the Large Hadron Collider came online and went off-line again, I finally got a phone line installed (at enormous expense), I got in touch with all my long-lost nieces and nephews in the UK, and so on, and so on.

All in all, quite a year.

I hope your 2008 was a full and rewarding one and that 2009 will be even better for everyone (prolonged global recession notwithstanding).

03 November, 2008

'English Correspondence' by Janet Davey

The mechanics of getting a book from the brain of a woman living in London to a small mobile library in a tiny village in rural Australia are daunting. In many ways the journey is a metaphor for our whole global economy. The improbability, the number of unlikely choices made across a ten-thousand-mile chain of unrelated people, that put me and that particular book in that converted coach on the same day, in the same place, is disturbing to contemplate.

Yet there we were: me and 'English Correspondence' by Janet Davey.

I've been reading a lot of low-quality nonsense lately, working on understanding how such books are constructed and how their authors use language, how to please publishers of speculative fiction, trying to learn lessons that will help me get my own writing published. But there is only so much of this I can take and I needed a proper book, one that was beautifully written, one that explored character and motivation, one that treated people as more than two-dimensional, stylised, comic-book sketches, one that used words for what they are meant for - to tell, not to show. I might have picked up a book by one of the really good sci-fi writers - J.G. Ballard, or Ray Bradbury, say - but there were none available on the library bus. In fact, there was little that promised anything but shoot-'em-up adventure or hose-'em-down romance, until I found 'English Correspondence'.

Janet Davey's book - her first novel - is one in which almost nothing happens. Time passes, the heroine moves from place to place, there are conversations, but there is no 'plot' to speak of, no three-act structure culminating in an exciting shoot-out; the hero does not get the girl. Instead there are the thoughts of a woman struggling to think her way free of a life painfully unsuited to her, a woman who made a wrong turning many years ago and who can no longer bear the consequences, whose last prop - the correspondence she maintained with her father in England - is pulled away when he dies.

The heroine is an intelligent, sensitive person who, like most people, does not have the depth of reflection ever to understand herself and where the roots of her unhappiness lie. Instead, her thoughts skitter about on the surface of her life, trying to make sense of patterns which are mostly epiphenomena, hoping that she will reach a safe harbour by intuition or good luck. I cringed for her, as she teetered, half-blind, on the edge of yet more horrible mistakes. I hope she makes it.

The writing is intelligent, carefully crafted, occasionally witty, and just a little odd. As I read, I was wondering how to describe Davey's terse, almost staccato style when I turned a page to find she had already done it for me. She wrote of, '... her own demarcated phrases, like tidy hedging.'

'English Correspondence' was an oasis. I have been sheltering there and refreshing myself. As soon as I've written this, I will go on across the desert of my chosen genre, looking for a path to follow. It was a lovely spot and I am grateful that I stumbled across it.

19 July, 2008

Can You Drown In Yoghurt?

You could knock me over with a keyboard! I made it to two hundred blog postings! And you, my probably first-time visitor, are looking at number two hundred. So, as I always do in my 200th blog posting, I'll take the opportunity to reflect on blogging and the nature of life in the tag clouds.

I started this, you may recall, because I wanted my voice to be out there with the tens of millions of others who take part in this pandemonium we call the blogosphere. I'd read a lot of stuff and wise old heads at the time said give it a couple of years – about 200 posts – to gather a reasonable-sized readership. Well, here I am and where are you?

Actually, I'm not complaining. I get about 5 unique visits a day to Waving Not Drowning. That's 150 unique visitors per month, mostly from Australia and the USA but also the UK, Canada and India, as well as places I didn't expect, like Greece, Poland, Moldova, Romania and the Philippines. As far as I can tell, about a third of my visitors each month are regulars and two thirds are drop-ins. It's not a roaring success by any means but it's a great feeling to think there are people all over the world who click by to read what I have to say about life. It's an even better feeling to know that there are some who keep coming back for more. (Thank you, all of you – especially those of you who have been interested enough to leave comments.)

It's also a source of guilt. In the past six months I've been letting the blogging slip. I post once or twice a week now (not quite the two or three times of my first year) but there have been weeks on end recently when I didn't post at all. All I can say is I had a lot of stuff to do and a big change of lifestyle to get used to but I'm getting back into the swing of it and I hope to do better in the coming months. When I was posting at a rate of 15 to 20 posts a month, I was getting 50 to 100 visitors every day. This was quite exciting but it's hard work finding something different and worth saying every other day. I don't really see the point in just blathering. If I haven't got something interesting to write about, I'd rather keep my fingers to myself.

In the course of my 200 postings on Waving Not Drowning, other blogs of mine have come and gone. I've had music blogs and writing blogs and user interface design blogs but I've hardly used them and have shut them all down except the UI design one (which I haven't posted to in ages). I have, however, just started up a new blog about writing (which you might care to go and look at) and it seems to be more successful than the others. (More successful than this one, too, according to Google and Technorati – both of which rate it as twice as popular and 'authoritative', even though it gets only two thirds the traffic – go figure.)

In all I'm getting 350 visits per month for my three blogs (about 11 per day). It may seem unambitious of me, in a world where popular blogs get thousands of visits per day, but I'm very happy with what I've got. Fewer might make me wonder what the point of it all was but many more and I would start feeling pressured. I can easily imagine 11 people stopping by the house each day for a chat. I'd be hiding behind the sofa if there were a thousand queueing up the drive. I'm also pretty pleased with the standard of the comments I get. I've had a few excellent arguments over the years.

So here's to the next 200 posts, to the many great bloggers who have inspired me to join in, and to all my readers, without whom I would be talking to myself (you don't think I'd shut up, do you?)

Oh yes, and the title of this post. One of the many interesting stats I get from Google Analytics to help me interpret traffic to my blogs, is a list of the Google queries that have led people to come here. And, yes, one of them was, 'Can you drown in yoghurt?' I certainly hope my blog helped with that.

19 June, 2008

Magical Thinking

An uncharacteristically brief posting from me today I'm afraid, but I wanted to tell everyone about a great article I just read. It is called Magical Thinking and it is by Matthew Hutson. (Actually, the wrap-up on the last page is a bit lame but the rest is excellent.)

One of the pleasing things about modern psychology is that it seems to have taken on board the task of explaining superstitious and religious thinking. It's about time! Hutson's article does an excellent job of summarising current thinking on the subject.

Sadly, he offers no cure :-(

13 April, 2008

Unix and the Asus EeePC

Almost 25 years ago, I got my first experience of using an Apple Macintosh computer. Until that point, I had used various other machines, each with its different operating system. My favourite, at the time, was Unix, with which I had become quite proficient. Yet the moment I saw the Mac, I realised that command-line operating systems were dead and buried. The new windows-based operating systems were a quantum leap forward and there would be no going back.

How wrong I was! Even as Xerox, Sun and Apple tried to drag us into the future, IBM and Microsoft threw out a massively heavy anchor – the IBM PC, running DOS – that held the world back for 15 years while Microsoft slowly, painfully, caught up to where the great pioneering companies had long since been. Eventually, Microsoft Windows became a very good, windows-based operating system with high levels of usability.

In the years since I first saw the Mac, I have used only MacOS and (from Windows 3.1 onwards) Microsoft Windows. I've also used 'palmtop' or hand-held computers for writing with (as I have mentioned before). These each had their quirky little operating systems but I never did much with them so there wasn't much to learn, or complain about. The last of these, my HP Jornada 720, is a Windows CE machine – close enough to desktop versions of Windows that it was easy to use. I've been looking for a replacement for it for a couple of years now and there just isn't one. So when I saw the Asus EeePC advertised, I realised this was about as close as it was going to get and grabbed one. (Well, Wifie bought it for me as a present, actually, knowing how keen I was.)

The Eee is a little miracle – a fully-fledged laptop that is just a little bigger than a DVD box (that's it on the left as I was showing it off to some friends). It's twice the size of my beloved Jornada but packs in so much more – for so much less money - that I was willing to give it a go. The operating system on the Eee is Unix (although you can install Windows XP if you want to) but not the Unix I used to use 25 years ago. This is a modern Unix with a proper, windows-based graphical user interface (GUI). The machine has all the networking capabilities you'd expect in a modern laptop (including Wi-Fi) as well as three USB 2.0 ports. All it lacks is optical media (DVD/CD reader) and the kind of fat memory we feel we need these days. For my purposes it is ideal. I only want it for writing on. What's more, it comes with the OpenOffice.org applications pre-installed – and they are the ones I use all the time now anyway (as I have also mentioned before).

Let me say right now that the Eee is exactly right for my purposes. The only drawback is that it has Unix installed. What I've discovered since using the Eee is that Unix with a GUI is still the same old Unix it always was but with a prettier face. Unix, it seems, is not a patch on Windows XP. It is not a patch on MacOS X either. It looks superficially similar, it has windows, it has pointers, it has Help, and so on, but its usability is awful. When things go wrong, one discovers the Help is badly-written, minimal and obscure. The way things are done is hopelessly complicated – 'user hostile' is the phrase that springs to mind.

I'm an extremely experienced computer user, one-time programmer and one-time Unix user, yet I have been completely unable to solve trivial problems on the Eee – like loading and installing a new printer driver. (I won't bore you with this but it is so fabulously complicated that I have had to spend two days trawling through online tutorials and user-group forums just to get to grips with what I need to do. I haven't tried to do it yet – I'm saving that for when I have most of a day to spare!) I also haven't yet managed to get my Eee to network with my Windows desktop (partly because of the added complication of my crappy Telstra wireless broadband modem but also because the copious and well-written Windows XP help files assume you're connecting to another Windows machine, while the minimal, useless Unix help files assume nothing will go wrong with the simple wizard process that a child could follow without instructions.) I've spent about a day in the online forums on this issue too – enough to convince myself I will never solve it and I'd better call in a network guru.

Part of the problem with Unix today seems to be the plethora of slightly different versions that exist. If your printer company, for an example close to my heart, only produces a driver for one Unix version, you can't install it in another. Well, actually, you can but first you have to translate it using another piece of software. But then you discover this piece of software is written for yet another slightly different version of Unix than the one you have and you'll need to download and install a sizeable software environment all of its own just to make it work (which some experts in the online forums say you should really avoid doing if you can help it – which you can't).

Another part of the problem is usability. Usability is a deep and fundamental property of a system. It isn't a gloss you add to the surface. Apple has always understood this. Microsoft has gradually come to understand this. The Unix community just hasn't got a clue! However good the GUI on a Unix implementation, it will never have the usability of MacOS or Windows if the underlying user tasks are not themselves usable, or if the user support infrastructure (labels, layout, instructions and help) is not fully cognizant of the users, their mental models of the system, their tasks and their task knowledge, or if the underlying file systems and command structures are not fully consistent with the user's task model.

Finally, and this is also a usability issue, part of the problem is the shallowness of the GUI. It is assumed in the Unix world that, as soon as something goes wrong, or as soon as something complicated needs to be done, the user will abandon the GUI in favour of a command-line interpreter! I have only had my Eee a few weeks but I now have on my wall a summary of the Unix command shell syntax and a table of Unix commands. All you Unix evangelists out there, please take note. People will keep buying Windows (and MacOS) in preference to using Unix for free as long as Unix feels like a horrible, unfriendly kludge instead of a well-organised, intuitive appliance.

To be fair to Unix, its main audience comprises techies and nerds. You only have to look at the Unix online forums to see this – all those propeller-heads gabbling away to one another in impenetrable jargon. These are people who like to live with their heads under the bonnet. They are actually happy to see inside the machine and fiddle with the cogs and levers. But if Unix is ever going to make it into the real world, where people don't have the time or inclination to type hieroglyphs into 1970s-style 'Teletype windows' – a world where most people find even the complexities of Windows XP seriously challenging and completely irrelevant to what they need to achieve – then Unix is going to have to clean up its act.

This is obviously not impossible. The Macintosh itself is now a Unix machine but still (almost) as usable as it has ever been. So why isn't the Asus Eee?

One of the sad things about the Eee's usability failures is that it is a fantastically popular machine. Its price-performance level has made it a truly desirable little computer and it is selling like hotcakes. Which means that hundreds of thousands of people – eventually millions – will be getting their first exposure to Unix through the Eee and, I confidently predict, they will not be enjoying the experience. In fact, it will probably drive them quickly back into the arms of Microsoft. Soon, someone will have a machine out at the same price-performance point but running Windows out of the box and it will grab Asus' market away from them in a flash. I also predict that Asus will soon drop Unix altogether as the OS for the Eee and will only sell it with Windows installed.

Frankly, Unix deserves this treatment. It is still a very long way from being a mass-market product.

21 December, 2007

Buy Northern Lights and Upset the Vatican!

What idiots Catholics must be. I'm one of those people who never pay much attention to what new, blockbuster films are being released and I very rarely read a best-selling novel. Yet when the Vatican newspaper l'Osservatore Romano starts trying to suppress a book - and the film of the book - it really gets my attention.

The film is The Golden Compass (starring the strangely attractive Nicole Kidman) and it is based on a book by Philip Pullman called Northern Lights. The Vatican says the book is anti-religious (Big deal. So what?) and shows just how terrible it is to be without 'God'. To quote from l'Osservatore Romano, Pullman's writing apparently shows that "when man tries to eliminate God from his horizon, everything is reduced, made sad, cold and inhumane." Of course, if this is really what Pullman is trying to show, then he is simply wrong. All magical beings, including 'God', have been long since eliminated from my horizon and it has only made life more deep, cheerful, happy and humane. The idea that it could be otherwise seems nonsensical. Surely living in this real world of wonder and beauty has to be a richer and more rewarding experience than living in a bizarre fantasy world of gods and devils? What is wrong with these people?

On the other hand, it is possible that Pullman didn't have that in mind at all. Perhaps he just wanted to write a good yarn – although it sounds like he did have a bit of a dig at the Church, God bless him – and he does belong to the British Humanist Association. (The cringing wimps who made the film apparently removed all references to the Church so that they wouldn't get into trouble with these fanatical nutcases. Serves them right, doesn't it, that they got their wrists slapped by Il Papa anyway!)

Of course, the truly stupid thing about the Vatican's rantings is that if The Golden Compass and Northern Lights really do paint such a bleak and terrible picture of what it is like to be without a god (on your horizon) wouldn't that make them great adverts for the Church? Wouldn't that make people want to give up their life of sense and sanity and start eating pretend flesh and drinking pretend blood like the Pope does? Yet the Catholic League in the USA is trying to organise a boycott of the film saying its purpose is "to bash Christianity and promote atheism."
If only I thought that was the film's purpose! Then I'd rush out and see it. As it is, not even Nicole Kidman and what I imagine are great special effects will get me into a cinema these days. I might, however, buy the book, Pullman's membership of the National Secular Society being something of a recommendation. Sadly, Northern Lights is a fantasy and I don't really like fantasies unless they are allegorical or extremely entertaining. However, since Northern Lights appears to be both, maybe I will.

Which raises another issue. Why is the Vatican getting so flagellatory about a fantasy? Isn't the point of a thing declaring itself to be a fantasy to say 'Don't believe me. I'm not true'? But then, the guys at the Vatican are used to reading fantasy and treating it as gospel. Maybe they just can't tell the difference anymore. Or maybe, since the film grossed US$26 million in its first weekend, they are getting nervous about competing products?

14 October, 2007

Machiavelli, The Prince And I

Well, that's another one off my list of Books I Really Ought To Read. I finally got round to finishing The Prince by Niccolò Machiavelli. And it wasn't at all what I expected.

For a start, Machiavelli himself seems so paltry. I imagined he would be a man of soaring vision, a man with a deep and convoluted mind, a rich and interesting character. Don't ask me why. What I discovered was a dry, rather dull pedant. In fact, an academic.

I've worked a lot with academics. I had six years studying at uni, three years post-doc research and nine years of collaborative industrial-academic research. So I know of what I speak. There is a type among academics – a very common type, I'm afraid. Intelligent, yet boring, this type will study even the most profound and exciting subjects like a caterpillar chewing at a leaf. They will consume what matter there is, digest it thoroughly, and produce neatly-packaged analyses that, while they contain the essence of what is to be learned, have robbed the subject of all colour and interest.

Essentially a historian rather than a scientist or philosopher, Machiavelli seems to have had one 'big idea': to dump all the quasi-religious, moralistic nonsense about how a leader gets his authority, or how he should operate, and instead to look at what really goes on in the world of power-politics. The flat tone of the writing in The Prince is therefore matched by the flat moral tone of the ideas. Machiavelli sensibly concludes that the human race isn't particularly pleasant but from this he seems to deduce that doing unto others before they do unto you is a reasonable foundation for a personal ethic. Which may explain the basis of his analysis of the best ways to get and maintain power, which takes the success of the enterprise as the main criterion for judging the actors in it. It's not exactly that the ends justify the means, more that, since getting and keeping power is all that people strive for, any means to those particular ends are alright by Niccolò.

And why does that seem so academic? Because you see academics all the time who don't seem to connect at all with the real world of human emotion. For them, the world is a puzzle to be solved, a fact is a fact, the rules that govern the world are to be found and written down. Mostly, this is harmless. They get their kicks from solving hard puzzles – they get their kudos from solving harder puzzles than their peers, or solving them first. They like to believe that morality and ethics are irrelevant to their endeavour – primarily because they don't have the emotional maturity to deal with the complexities of real life. So they do their work for tobacco companies and religious think tanks, repressive regimes and greedy capitalists just as happily as they would for medical charities and universities in pluralist democracies. And this is what Machiavelli seems to have been like.

Ironically, I've sometimes heard Machiavelli referred to as a realist.

Now I don't know much about the art of war, nor about statecraft, nor especially about the acquisition and exercise of power, but I do know there was some pretty dodgy reasoning in The Prince. I suspect that, had anyone taken it to heart, it wouldn't have been a great success for them (although possibly it was better than anything else available at the time). I also don't know why Lorenzo de Medici, to whom the work was dedicated, didn't accept it wholeheartedly and rush off to unify Italy as Machiavelli wanted him to (perhaps, if he read it, he used the book to help him become Pope – which he achieved about a decade after the book was written).

The thing is this; if you'd just dreamed up a sure-fire scheme to allow someone to gain great power and then hold it, regardless of who got hurt, would you rush out to put it in the hands of a Medici?

14 July, 2007

Has The World Gone Mad?

It's been a strange week.

In the southern Iraqi town of Basra, fierce giant badgers are roaming the docks. The locals believe they were introduced by the British Army to spread panic but local experts say the animals are indigenous – just not often seen in the city. Giant, killer badgers are odd enough but what is much, much more disturbing is that people could think for a moment that the Brits set them loose on the town. What possible chance do the invading armies have of winning the 'hearts and minds' of the Iraqis if the conquered have such a complete and utter misconception of who their conquerors are?

Meanwhile, a 45-year-old man in Sydney has been on a rampage in a restored tank. He drove his tank at dead of night through several Sydney suburbs apparently targeting mobile phone towers. He managed to take out six mobile phone tower sheds and an electricity sub-station (easily confused with a mobile phone tower in the dark) before his tank stalled as he tried to demolish a seventh. Apart from trying to keep people out of his way, there wasn't much the police could do about it except watch. Now, I hate mobile phone operators as much as the next guy, but to spend all that time and money on buying and restoring a tank just so you can have a little rampage and knock down a few towers seems just a little over the top. Surely it would have been easier to start a socialist party, sweep the country in a landslide election and nationalise them all without compensation? Far less bother and so much more satisfying.

And then there was the guy in China who got married this week. The bride is a normal-sized Chinese woman, 1.68m tall. He is the world's tallest man, Bao Xishun, who is 2.36m tall. It seems he's a really nice guy under all that enormousness but was driven to advertise for a wife – probably for all the obvious reasons. Curiously, he only got 20 replies. Now, if it had been the West, they'd have built a 'reality' TV show around it and had thousands of female contestants being slowly and tediously eliminated for months before finally picking some completely unsuitable extravert with outsize breasts to appear in the season finale on Mr. Bao's arm. As it was, there was a quiet courtship and the bride seems like a very nice person. Bao is famous not only for his record-breaking length but also for saving two sick dolphins by using his very long arms to pull plastic rubbish from their stomachs. But the really odd thing is, he's Chinese. Aren't those guys supposed to be small?

Finally (Ha! Finally! I didn't mention the mystery philanthropist in Japan who has left at least $40,000, in envelopes each containing $100, in public toilets around the country. Nor the fact that a member of the pop-group Queen has just finished writing up a PhD thesis he started in 1971 and which was rudely interrupted in 1974 when he took 33 years out to become a worldwide global mega rock guitar hero.) Finally, I should mention that Dr. Mohammed Haneef has at last been charged with 'recklessly providing resources to a terrorist organisation.' Dr. Haneef has been infamously held without charge in Australia for 13 days while being questioned by the police about alleged involvement with a UK terrorist group responsible for recent botched car bomb attacks. The strange thing is that, after all that questioning by Australian and British anti-terrorist police, the charge is that Dr. Haneef 'recklessly' (not intentionally) gave a phone SIM card to the terrorists. Stranger still, this kind of recklessness, under the new anti-terrorism laws (America's finest export to the world) could cost him a further 25 years in gaol. Of course, in law, 'reckless' implies that Dr. Haneef didn't care if the terrorists blew people up. That is, that he was indifferent to the consequences of what he did. The common usage of the word to mean something like 'foolishly unthinking' isn't what he has been charged with. It is quite possible, the charge says, that he could clearly foresee what would be done with the SIM card but he just didn't care. Which is a pretty strange thing to charge him with in itself, don't you think? The anti-terrorist laws have the concept of conspiracy to commit a terrorist offence. So why not use that? Presumably because there is no evidence for it – only evidence of the doctor's indifference.

06 July, 2007

Beachcombing With Kurt

I was talking to Wifie the other day and I pointed out that the length that hair grows to on different parts of your body is a function of the speed at which it grows and how long (on average) each hair lasts before it falls out. She looked at me in surprise and asked, 'How do you know that?' I just shrugged, and said, 'I dunno. How do I know most things I know?' Meaning, I just pick these things up, mostly from things I have read.

I am, in fact, a vast repository of arcane knowledge. For example, I know that the centre of our galaxy is in the direction of the constellation Sagittarius, that the wavelength of green light is about 500 to 550 nanometres, that the average length of a marriage in the West these days is under ten years, that Groucho Marx once said, 'I never forget a face, but in your case I'll be glad to make an exception.', that Karl Marx is buried in Highgate Cemetery, London, and so on and so on. I have no idea where most of it came from. I read a lot of stuff.

However, I noticed myself learning a piece of trivia today. I'm reading A Man Without A Country by Kurt Vonnegut and he mentioned in passing that Marco Polo brought back pasta (to Italy) from the Chinese. This struck me as such a singular fact that I know I will remember it. And this must be how I have learnt so much of what I know – by picking up interesting tidbits from novels, histories, biographies, science books, magazines, even TV shows and films. For example, I'm also reading Master and Man, a collection of short stories by Leo Tolstoy (and, yes, I often have two or three books on the go at once) and I'm discovering all kinds of interesting background about 19th Century Russian society, the care of horses, cobbling, how to navigate a horse-drawn sled in a snow-storm at night, and so on. Some will stick. Some will not. It's hard to tell, at this point, whether I will have retained anything from the experience in a year's time.

But the pasta-from-China thing will stay with me. I'm sure of that. As will the terrible sense of sadness that A Man Without A Country communicates. It's awful to think that Vonnegut was so disillusioned at the end of his life and so ashamed of what his country had become. It makes me want to have been able to comfort him – with something like, 'Don't worry about it. Nothing we become will even remember what America was in a million years' time,' or 'So what? We were just monkeys, playing a bit too roughly maybe. None of it really mattered.' You never know, it might have helped.

Anyway, I plan to keep A Man Without A Country handy and hope that, as I re-read it over the years, something more substantial than facts about pasta will stick to my neurons.

15 June, 2007

Art For Art's Sake

I don't suppose I'll ever understand Art. I mean, I get the representational painting stuff – especially all that soft porn that artists have always churned out for rich guys who like to con themselves it's somehow cultivated to ogle fat, naked broads, or cute little ballet dancers. And it's the same with sculpture – naked boys and women mostly – depending on the patron's taste. It's when it all gets abstract, when it's all about ideas, or moods, that it goes beyond my ken.

I remember when I first noticed that Art was getting weird. It was an exhibit in the UK's Tate Gallery – called something like '49 Bricks' – which was, quite literally, 49 bricks. There was also, some time later, a life-sized submarine made out of old car tyres. Later still, 'artists' became more shocking and often quite disgusting. The work of Damien Hirst springs to mind – you know, the guy who sticks whole animals, or parts of them, in formaldehyde and then puts them on show. Or they're just odd, like the recent sculpture 'My Sweet Lord', a life-size statue of Jesus done in chocolate by Cosimo Cavallaro. I can see how this would offend people (especially given the artist's suggestion that visitors to his exhibition might like to lick the naked statue) but I can't see what makes it Art.

There are people who cover cliffs in white plastic, others who sprinkle used condoms and other detritus on their own unmade bed and exhibit it around the world, and then there's artist Mark McGowan who just last week ate a corgi as a piece of 'performance art'.

Is it just me, or does this parade of freakishness simply signify that artists are desperate and unimaginative these days? Or could it be that The Art World has gradually come to be dominated by emotionally disturbed exhibitionists and, since the people who get to define what Art is are the people who produce it, Art has come to mean something altogether different from what it used to? Art now seems so far removed from what ordinary people can understand or enjoy that it appears to have gone the way 'serious music' went in the 20th Century, ending in chaos and disarray. Serious music is pretty much dead now. It left its audience behind many years ago and sailed off, unlamented and largely ignored, into the sunset. Art is quite obviously going the same way, with 'serious' artists talking among themselves and to themselves as the world turns away and leaves them to it.

Meaning the real artists of our age must be doing something else now – making films and TV shows, perhaps, designing software and electrical appliances, or taking photographs, building cars, designing new viruses. To find them, we probably just need to look to see where the audiences have gone.

10 June, 2007

The Tin Men

Birthdays are great. People give you things. And if there's one thing I like, it's things. If there's another thing I like, it's ideas. So a thing full of ideas is the perfect gift for me. Something like a book, for example.

Among the haul this year, I got a book I've been vaguely searching for, on and off, for years and years: The Tin Men by Michael Frayn. This is Michael Frayn's first novel, published in 1965. As you may know, Frayn is one of my favourite authors. As Wikipedia so dryly puts it, 'His works often raise philosophical questions in a humorous context.' Well, what a start he made with this one. I first read it when I was 11 or 12 and it had me in stitches. I couldn't put it down and read it from cover to cover in a single sitting, laughing aloud for most of that time.

For many years now, I've had a hankering to re-read it, curious as to whether I'd still find it funny. One of the reasons I might not is that The Tin Men is set in a computer automation research institution (what we might now call an Artificial Intelligence Lab) and I spent a lot of my own career in AI Labs. So what seemed a fascinating comic premise at the age of 12 might, 40 years on, just seem silly and ill-informed. So I settled down with feelings of excitement and trepidation to re-live my boyhood experience.

Of course, the writing was excellent – even in his first novel – and the fast-paced, farcical plot was just as much fun as I remembered. Sadly, the book had dated but, strangely, not in relation to artificial intelligence (which is still stumbling about in the same sort of fields as Frayn's hapless researchers) but in the extent to which attitudes and language have changed since the sixties. Thankfully, this didn't detract from the pleasure of reading this great little book again. And, despite being so much older and so much more jaded than I was then, I still laughed out loud in places. Perhaps what I have lost in freshness and naïveté, I have gained in experience and sophistication. I may not have been rolling on the floor but I greatly appreciated the wit and cleverness of the book.

The AI thing was curious, though. While I don't know of any real-life stories quite like Frayn's ethical robots wrestling together as they each try to throw the other out of a sinking boat (with a crowd of research assistants making bets as they watched), much of what has gone on over the years has been quite silly. Also, I was astonished to find that the newspaper-headline generating program I wrote in the mid 1980s was straight out of The Tin Men. I had thought I was being original, yet must have had Frayn's idea lodged in my subconscious all along. Similarly, there is Douglas Adams' Electric Monk from the first of the Dirk Gently novels, which was built to believe all the improbable things an over-automated society could no longer make the effort to believe. There it was, also described by Frayn more than twenty years before Adams re-invented it!
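For anyone curious, the basic trick behind such headline generators is simple enough to sketch in a few lines of Python. This is purely illustrative – the stock phrases below are invented for the example, not the ones Frayn used or the ones from my 1980s program:

```python
import random

# Toy "unit headline" generator: a headline is assembled by snapping
# together one stock phrase from each slot, entirely at random.
# All phrase lists here are invented for illustration.
OPENERS = ["Strike Threat", "Test Ban", "Peace Move", "Tax Probe"]
LINKS = ["Plea", "Row", "Bid", "Shock"]
CLOSERS = ["Looms", "Flops", "Hailed", "Stirs Fury"]

def unit_headline(rng: random.Random) -> str:
    """Assemble a headline from one phrase per slot."""
    return " ".join(rng.choice(slot) for slot in (OPENERS, LINKS, CLOSERS))

if __name__ == "__main__":
    rng = random.Random(1965)  # seeded so the output is repeatable
    for _ in range(3):
        print(unit_headline(rng))
```

The joke, of course, is that the output is indistinguishable from the real thing.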

Nice to be in such good company.

14 April, 2007

Free Software - It Really Is Possible

If you don't like paying for software, you can get practically everything you need for free without compromising quality or performance. On the contrary, the free stuff is often better than its commercial equivalents. This is what I have gradually come to understand lately and is a message I would like to share with the world – for free.

In the past few weeks, I have been making a concerted effort to replace all my bought software with free software. My goal is to have everything except the operating system both free and sourced from somewhere other than Microsoft – and if Vista really is as bad as they say it is, I won't be buying that, either. Here is my progress so far:

Office Software: I have downloaded and begun to use OpenOffice.org 2.2. This completely replaces Microsoft's Word, Excel, PowerPoint and Access with, respectively, Writer, Calc, Impress and Base. These programs are almost identical to their Microsoft equivalents in every way – similar user interface, similar functionality – but they seem to be more reliable and to run much faster. They also produce files which are about a fifth of the size of their Microsoft counterparts. They will open and save in Microsoft file formats if you want them to (with no penalties that I can see yet – formatting, macros and formulae all go both ways) or in the OpenDocument formats which are becoming a global standard. That software this good can be free is a miracle (or a testimony to how much Sun Microsystems hates Microsoft).

Email: I've always been happy enough with Microsoft Outlook but I find I can replace it quite easily with free equivalents. However, what I chose to do instead was use a Web email service rather than get a new email client for my PC. The thing is, I don't like wasting my bandwidth on downloading junk mail (it amounted to half a megabyte of junk the other day!). With Web mail, you only download the emails you actually want to see. And when it comes to Web email, little else has the functionality of Gmail from Google. I can even use Gmail to consolidate all my other email accounts and have my own email address in the From and Reply-to fields – just as if it was a desktop client.

Other Internet Stuff: Browsers are all free anyway but the change from Internet Explorer to Firefox is one I am more than happy to make. As for file transfers, I used to use CoffeeCup's FTP software, which has a great user interface, but the change to the free FileZilla has been completely painless.

Image Manipulation: I've been using Paint Shop Pro for some years now – not as functional as Photoshop by a long way but so much easier to use, quite adequate for what I need and very cheap by comparison (up to AU$1,500 for Photoshop, depending on the version, vs less than AU$200 for Paint Shop Pro). However, I recently downloaded GIMP which seems to be every bit as powerful as Photoshop (although it has an even worse user interface!) and it is completely free.

Music: The one thing I do on a computer that I can't find decent free software for is writing music. There are plenty of free programs for stringing together sampled sounds but free software that will let you just write notes onto staves in the good old-fashioned way is very rare. I've tried a couple of things but they are not really adequate. However, I have found the next best thing to a full-function free program: Harmony Assistant from Myriad Software. It is a wonderful program, almost as good as the brand-leader Sibelius, but it costs less than AU$90. As for sound editing and file conversions, I find that Audacity is a great piece of free software.

I'll keep you posted on new developments.

05 April, 2007

David Bowie

I don’t know about you but tunes are always running in my head. It’s something like hearing voices, I suppose – but in a good way. Today the tune was ‘Teenage Wildlife’ by David Bowie, one of those pieces the reviewers tend to call ‘anthems’ (for the under-fifties that’s one of those numbers like ‘Smells Like Teen Spirit’ by Nirvana). Since I’m my own man these days, and can pretty much suit myself as to what I do, I went to my computer and played the track – and played it loud! And, as ever with Bowie, it was even better than I remembered. So I set my media player to play nothing but Bowie and feasted on his work.

There are several musicians (by which I really mean composers) I admire unreservedly. Mozart, Beethoven, JS Bach, Handel and Haydn, for instance. Then there is a second tier who almost make it into this league – Wagner, Brahms, Verdi, Mendelssohn – and many others, less astonishing but nevertheless breathtakingly brilliant (the Mahlers and Puccinis are in there with the JC Bachs and the Berliozes). Somewhere in the late 1800s, though, the list peters out. In the early-to-mid twentieth century, there were a few – Debussy, Bartok, Stravinsky – but I can’t think of a single ‘serious’ musician I admire who is writing today. Where did they all go?

I think that they turned to pop music. Or, to put it another way, the musicians who really felt genius surging through them, began to express themselves in fresh and exciting new ways – writing popular music – the way musicians like Mozart once did. And judging by the criteria of impact on the genre, groundbreaking innovation that pushed the field forwards, the sheer number of imitators, and the subtlety and emotional intensity of the music itself, there are very, very few people who stand out like David Bowie does (you might argue that Bob Dylan is a contender and I might even grant you The Beatles, if we allow ‘collective’ composers).

I know, I know. It’s hard to think of a guy in a silver catsuit, who named his son Zowie and used to do that invisible wall mime thing on stage, as the modern equivalent of Beethoven or Mozart but I really believe he is. In another age, he could certainly have been Wagner and, let’s face it, since he ‘got God’, I can see him churning out cantatas as JS Bach too. If you don’t believe me, get hold of a copy of Scary Monsters – in my opinion the best album Bowie ever made – stick it on your iPod and go for a long drive with it blasting in your ears. You may need to listen to the album a few times through before you start fully to appreciate its quality but that is no hardship at all.

I’ve been a ‘fan’ of Bowie’s since I first heard his really early work (Rubber Band, Uncle Arthur, etc. – lightweight but fun) and there are major gems throughout his career. Hunky Dory was his own ‘Sgt. Pepper’ but there are many truly great albums you should listen to (Aladdin Sane, The Rise and Fall of Ziggy Stardust and the Spiders from Mars, Young Americans, Lodger, Diamond Dogs, Black Tie White Noise – tell you what, just listen to them all!) But it was Scary Monsters that convinced me that Bowie was a truly great artist, that living in England in the 1980s was something akin to living in Vienna in the 1780s, that people 200 years from today would wonder what it was like to be alive now, the way we wonder what it must have been like to share the world with Mozart.

Actually, guys, it was pretty great!

25 March, 2007

Confabulation To The Max

I’ve written before about confabulation – to my mind, one of the keys to understanding human nature. Once you are tuned in to the phenomenon, you start spotting it everywhere. In the past couple of days, I have come across two extreme examples: one in the medical literature and the other in fiction.

The medical one first. I came across this on the British Psychological Society website. ‘AD’ is a 65-year-old man who suffered a cardiac arrest which caused damage to the fronto-temporal region of his brain. This brought on a number of ill-effects, including anterograde amnesia (the inability to remember things that have happened since the cardiac arrest). The really interesting thing about AD, however, is that he now tends to adopt different personalities depending on his social setting. His doctors set up some scenarios to test this. In a cocktail bar, AD immediately assumed the role of bartender, inventing an elaborate story to explain his presence there. In a hospital kitchen, he became the head chef, again with a complex story to explain himself. His doctors describe his condition as a form of ‘disinhibition’ but to me this is just an extreme case of confabulation, probably in response to the amnesia. In the absence of any memory of why he is where he is, AD seems to be confabulating plausible stories. The strange part is why he always chooses to be a central character in the situation. It may be no coincidence that the man was a politician before the cardiac arrest.

The fictional example of extreme confabulation is the film ‘Stay’. I watched this mostly because it has Naomi Watts in it – possibly my favourite actress – but I had fairly low expectations. As it played – a story about an art student (Ryan Gosling) and a therapist (Ewan McGregor) who becomes obsessed with trying to prevent him killing himself – I was finding it interesting enough but nothing special. In fact, as the story began to grow increasingly weird, the identity of the student seemed to be merging with that of the therapist, and the direction became more and more David Lynch-like, I was beginning to get a bit irritated with it. I’ve sat through too many films that go all surreal and ‘deep and meaningful’ on me and I didn’t like the idea that I’d wasted my time with another. Then, in the last five minutes, a wonderful twist was revealed that redeemed the whole thing and turned it into one of the best films I’ve seen in ages. And the confabulation thing? Well, I’m sorry, but if I told you that, I’d give away the twist. But trust me, it’s there.
