Keep the masses from majority

Geoff Hoon made some unusually revealing statements about Labour Party democracy today on The World This Weekend.

Essentially, the Hoon line is that there is – and should be – no such thing. Hoon was asked whether dissenters from the leadership could draw legitimacy from the Labour conference, which had passed motions critical of the New Labour clique now running the party. In other words, do Labour MPs have any right to express the democratically-agreed views of the Labour Party, if these differ from the positions of the New Labour leadership? He replied:

[left-wing MP] was elected on the Labour Party manifesto, as was I. He is bound to deliver that manifesto, as am I.

Time was, it would be generally recognised that party conference decisions were binding on the party as a whole, the leadership included; a Labour Prime Minister who wanted to set aside conference decisions would have many critics and few supporters. Those days are long gone: conference’s role has for some time been advisory at best – one of the first priorities in the construction of the New Labour machine was the marginalisation of the party’s formal democratic structures (party conference and the National Executive). Now, apparently, we’re into a new phase. Not only does the leadership have every right to ignore what the party says; the party has no right to speak. Party policy is leadership policy; what the party itself thinks is sublimely irrelevant.

Hoon’s view of democratic procedure within the parliamentary Labour Party followed similar lines: for the parliamentary party to assert itself against the leadership would not only be ill-advised, it would be illegitimate. Specifically, Hoon was asked about the possibility of a leadership challenge. He replied:

recently, at a general election, [Blair] received the overwhelming support of the British people

It’s hard to ignore the fact that this is a bare-faced lie: even if we assumed that every single Labour voter was moved by a burning desire to give Blair his or her support, that would still only account for 23% of the British people. There’s also the inconvenient, but well-documented, fact that many Labour voters gave the party their vote despite Blair. But this is secondary. The real question is whether the Labour Party exists as an organisation, capable of formulating and expressing collective viewpoints as to its policies and leadership. The Hoon line is essentially that it doesn’t exist, or that it exists but should be treated as if it doesn’t. Not the Labour Party but the New Labour clique has been anointed by the electorate, on terms set out in the party manifesto, and its word should henceforth be law.

I think Hoon’s remarks are revealing and significant – and significant in part because they reveal far more than a better-briefed (or more intelligent) member of government would have done. Through long repetition, the critique of New Labour’s ‘control freak’ tendencies has acquired the kind of familiarity that breeds contempt, but Hoon’s comments show just how serious a matter this is – and how destructive it can be. The founding of the Labour Party was a major advance in the history of British democratic representation. Many previous leaders had done their best to hobble Labour Party democracy or stifle its voice, but nobody before Blair had set about dismantling the machinery with such nihilistic gusto. What Hoon’s remarks signal is that Blair’s project of organisational vandalism is more or less complete: the Labour Party has been destroyed.

Not that Blair would put it quite that bluntly, or for that matter as bluntly as Hoon. But then, eight years down the track, it’s not glad confident morning in anyone’s book; the first generation of New Labour hacks and Old Labour fellow-travellers is a distant memory. These days, you really can’t get the staff. It’s the time of the apparatchik: the time of maximum contempt for ordinary party members, with minimum justification. And so we have Hoon: the fourth-year plodder who gets appointed as a prefect (for want of alternative candidates) and promptly starts holding forth about the selection criteria: they don’t take just anyone, you know. I don’t expect they’d want you.

Once, the Labour Party was built, and the building of the party was a major advance in the history of British democratic representation. But now…

Now the game is up. The hacienda – you’ll never see it. It doesn’t exist.
The hacienda must be built.

Everything playing at once


I no longer look at the front page of the NY Times to tell me what’s important. I look at it to see what people like the editors of the NY Times think is important. I’m finding the news that matters through the Internet recommendation engine: Blogs, emails, mailing lists, my aggregator, websites that aggregate and comment on news, etc.

Brief thoughts (also appearing in comments at Dave’s): we’re back with finding out what people say about stuff. Which is, ultimately, all there is to find out. Knowledge – and, for that matter, news – has always been produced in cloud form, as an emergent property of conversations. When we counterpose knowledge to conversation, we’re really saying that certain conversations have ended – or been brought to an end – and left unchallenged conclusions behind them. What’s changed is that, until recently, the conversations which produce knowledge (and news) have taken place within small and closed groups, so that most of us have only seen the crystallised end-product of the conversation. What Wikipedia, blogging and RSS give us is the rudiments of a distributed conversation platform, enabled by pervasive broadband. (Which is why the ownership of the authority to stop the conversation – and crystallise the cloud – is such a big issue.)

Some are mathematicians

Or: of Dylan and Dylanophiles. Scorsese’s No Direction Home is glorious, although I think the BBC’s decision to show it in two halves is debatable; after Part One I was left wondering how, having got as far as 1966, they were going to fit the next 39 years of Dylan’s career into Part Two. I was still wondering when Part Two ended. Apparently what happened after 1966 was that Dylan had a motorbike accident and stopped touring; after that some other events may have taken place, but apparently we can’t be sure at this stage.

No Direction Home both was and wasn’t a revelation. I’ve been a Dylan fan in a small way for a long time, although ‘fan’ doesn’t seem quite the right word: he’s just there, monumentally. I realised during the programme that I’d heard far too little of his early stuff; album purchases are going to be required, but I can’t decide whether to start with Bob Dylan and make up the gaps chronologically, or just go straight for Blonde on blonde.

Still, hearing some of his stuff from 1965-6 for more or less the ahem first time koff had the advantage of giving me some idea just how strange a departure it must have seemed at the time. One point which Scorsese’s film underlined was that going electric wasn’t the only break Dylan made. In one of the many post-concert vox pops – some of which, to judge from the accents, must have been at the Free Trade Hall gig, although Manchester was never credited on the titles – a disgruntled fan curtly dismissed the second-half electric set before excoriating Dylan’s solo set at some length: his singing was out of tune, his rhythm was off and he kept playing that bloody harmonica… And, you felt, he kept playing those weird new songs. One telling moment during the electric set saw Bob pouring oil on troubled waters by announcing that the next number was a protest song. The effect was immediate: the booing died away; there was an anticipatory hush and some scattered cheering. Then the band started up. Playing “Leopard-Skin Pill-Box Hat”. You’ve got to love him for that.

Which isn’t to say that ‘protest songs’ were a wrong turning for Dylan – any more than the repertoire of traditional songs that got him started, and led naturally into the ‘protest’ phase. On the contrary, Scorsese’s film suggested that breaking with the ‘protest’ repertoire was the single most important, and most creative, thing that Dylan ever did – but that the energy released by that break came only partly from Dylan’s discomfort with being seen as a ‘protest singer’; partly, it grew from his success in that role.

Clearly, I’m not the first person to notice that several of the songs on The times they are a-changin’ can be called political. Still, I think it’s worth stressing that they should be called political – particularly in the current climate of Arnoldian, high-culture Dylanophilia, which fits Dylan’s own claim to a kind of apolitical humanism only too well. (I’ll come back to last week’s Dylan tribute concert further on, but really – what was Barb Jungr doing? I’d come close to losing the will to listen during Odetta’s languid take on “Mr Tambourine Man” – not a particularly frisky song in the best of readings – but dear Lord, Jungr’s “Like A Rolling Stone”… How does it feel? It feels like I’m watching a rich kid’s drama school audition, actually.)

Retrospectively, anyway, Dylan likes to present himself as doing no more than stand up for universal values – for justice against injustice, freedom against tyranny – and perhaps some of his ‘protest songs’ can bear that reading; “Blowin’ in the Wind”, with that weirdly complacent dying fall closing each verse, for one. (Although, as Mavis Staples pointed out in Scorsese’s film, the first two lines meant something quite specific to Black Americans at the time.) But “The Times They Are A-Changin’”?

Come senators, congressmen
Please heed the call
Don’t stand in the doorway
Don’t block up the hall
For he that gets hurt
Will be he who has stalled
There’s a battle outside
And it is ragin’.
It’ll soon shake your windows
And rattle your walls
For the times they are a-changin’.

Rhyming “heed the call” with “Don’t stand in the doorway, don’t block up the hall” – in other words, Heed the call and get out of the way – is not the work of a disengaged prophet of peace and freedom. And that’s not to mention “Masters of war” or “The lonesome death of Hattie Carroll”, let alone “Only a pawn in their game” – a deeply political response to the murder of Medgar Evers, one of the most scandalous events of those years. Dylan didn’t give a voice to a generation; he gave a voice to a movement, and the movement gave him the world he needed to write about. According to Joan Baez, Dylan wrote “When the ship comes in” in reaction to being refused a hotel room (very Pirate Jenny); but I think that very exorbitance – so familiar from his later work – only came to him in the first place through his engagement with something much bigger.

After that, and because of that, there was more he could do; the 1966 tour was when it started. The songs the Free Trade Hall audience wanted to hear were what enabled Dylan to write the songs he played there – and kept the audience from hearing them. It was extraordinary seeing those images of thousands of fans queuing to see Dylan, only to turn on him when he started playing. It would be silly to invoke the Golden Bough or Orpheus’s sparagmos, but still – there’s something singular, and singularly intense, about the anger of the betrayed fan. In after-show footage Dylan looked both shaken and perplexed – All the booing! Why are they booing, man? Most likely, from that point on, they’d go their way and he’d go his.

They weren’t just booing, mind you. At the Free Trade Hall, the legendary shout of “Judas!” was followed, from somewhere else in the auditorium, by “You pillock!” (This may have been aimed at the first heckler rather than Dylan; it’s hard to be sure.) On the Scorsese film, we heard one heckler call out “Go home, Bobby” (nasty little diminutive there); later, as Dylan settled himself at the piano and adjusted his vocal mike, somebody called out “Try switching it off.” (And yes, these were people who’d paid for tickets, queued up, etc – they’d probably queued up for tickets, come to that.) It may be a Manchester thing. In my experience, Manchester audiences have a good line in heckling; for some people it’s an integral part of the evening’s entertainment. I wasn’t into Dylan at the time of the 1966 tour and didn’t live in Manchester; otherwise I might have been at the Free Trade Hall that night, if only I’d been more than five years old. But I did see Robyn Hitchcock, years later, take the stage to be greeted immediately by a double-act of hecklers:

Heckler 1: “Here he is, the man himself.”
Heckler 2: “He’s got a beer-gut!”
H1: “He’s a footballer.”
H2: “And a very good one!”

I don’t think Robyn Hitchcock is easily fazed, but they did it. He quelled them later, though. The two of them had started outbidding each other in guesses as to the precise length of the grotesquely elongated legs of the hallucinated Scottie dogs encircling the protagonist’s deathbed in that night’s introduction to “The Yip song”. (As one might.) Robyn broke off to interrupt them (“Eight feet!” “Ten feet!”), saying with some dignity, “No – you can say that, but it doesn’t make it true. Because this is how it was.” They shut up after that.

Robyn, coincidentally, was one of the three best acts at the tribute concert last week, although for some unknown reason they limited him (uniquely) to one song; he did what I took, rather embarrassingly, for one of his own. (Still. Hands up who’s got Time out of mind…) The other two were Martin Carthy (his “Hattie Carroll” was dreadful, but he did a wonderful “Scarborough Fair”) and KT Tunstall, who played the first two tracks of Blood on the Tracks – yes, starting with “Tangled up in blue”. (She made a pretty good fist of it, I have to say, as well as disrupting an increasingly precious evening with some welcome amplification – 39 years on, had we learnt nothing?)

I confess I was outraged when KT announced “Tangled up in blue”: I couldn’t imagine anyone but Dylan singing it (it’s so personal, so autobiographical). Listening to the song and re-reading the lyrics afterwards, I realised that I’d been thinking of it as a straightforward narrative with a couple of flashbacks – and that it’s anything but. It’s actually quite hard to say what happens in “Tangled up in blue”, and it’s impossible to say in what order it happens. Even the one reasonably clear section (narrator goes to topless bar, gets picked up by waitress) is impossible to place: has he found “her” again or is this their first meeting? The dialogue suggests both, at one point in two successive lines -

“I thought you’d never say hello,” she said,
“You look like the silent type”

- which makes it impossible to settle on either. Straight after that, we’re into

I lived with them on Montague Street

Them? She was married when we first met, which would fit – but We drove that car as far as we could… sounds like a different episode from She had to sell everything she owned… – and neither of them sounds like Her folks, they said our life together/Sure was going to be tough. (Is he even talking about the same woman?) It’s an extraordinary piece of writing: not so much like a story, more like knocking a box of slides on the floor and describing them as you pick them up – but with none of the tricksy coolness that image suggests.

What sticks in my mind from “Tangled up in blue”, apart from a couple of wonderful lines in the first verse, is the way it ends. We’ve had

music in the cafes at night and revolution in the air

and the narrator has… just kept on:

The only thing I knew how to do was to keep on keeping on

Then there’s this strange writing-off of an entire scene, casual, almost playful in its phrasing, but at the same time stern and unmistakably final:

All the people we used to know, they’re an illusion to me now
Some are mathematicians, some are carpenters’ wives
Don’t know how it all got started, I don’t know what they’re doing with their lives

What they’re doing with their lives? Living them, you feel like replying – and it got started when the movement circus left town (or got beaten, or got co-opted – either way, it wasn’t there any more). Living is what they’re doing with their lives – it’s mundane and it’s limited, but what else are you going to do?

Me, I’m still on the road, heading for another joint

Yes, on one level it’s tired old rock troubadour imagery; yes, it would be interesting to compare it with the amount of time Dylan spent touring that year, let alone the time he spent on the road in any meaningful sense. But on another level, damn it, he’s right. Social movements change lives, both in the experience of the movement and more permanently: new forms of sociality, new forms of communication, new ways of conceiving and portraying the world can survive a movement’s ebb. Anyone who lived through the Civil Rights movement and then returned to their studies (or their husband’s carpentry), unchanged by what had happened, had missed out on something. And they’d missed out in a way that Dylan (for all his self-seeking arrogance, for all his ambition) didn’t miss out. One of the many achievements of that movement was turning Bob Dylan into a poet.

Scaring the nation

Or: what’s being said about the Walter Wolfgang incident, and what isn’t.

It’s appalling that this should happen to an old man / a lifetime Labour Party member / a former refugee from Fascism

This is roughly the Blair line. I have every respect and sympathy for Walter Wolfgang as an individual, but really… spare us. Blair’s recourse to this argument suggests a worrying confusion between ethics and sentiment, between ‘wrong’ and ‘unpleasant’. (A very New Labour confusion, incidentally.) If Walter had been a strapping twenty-year-old who’d recently joined the party from the BNP, what happened to him wouldn’t have been any less wrong. (An ex-fascist being manhandled by security would have had a certain entertainment value, admittedly, but it would have been just as wrong.)

It’s appalling that Labour should treat dissenters this way

This is closer to the mark. The idea of suppressing all heckling at a Labour Party conference in the 1970s or 1980s would make a cat laugh. You’ve got to wonder quite how much the party membership has changed in that time – does nobody oppose the leadership any more? Or have the members simply been managed into submission? We knew, of course, that the New Labour takeover had involved restructuring the apparatus of the Labour Party; perhaps until Wednesday we didn’t appreciate quite how far it’s gone. Wednesday’s scenes put me in mind of accounts of BUF meetings in the 1930s (“a solitary heckler was quickly removed from the hall by burly stewards”). To be fair, the WRP in its heyday had a similar way with dissent; if Blair’s a Fascist, so was Gerry Healy. (So, not quite out of the woods yet, Tony.)

It’s appalling that the Terrorism Act should be invoked

I almost endorse this line of argument wholeheartedly. The Terrorism Act 2000 (commonly known as TACT) explicitly classifies as terrorism such activities as politically-motivated vandalism, political protest which threatens the ‘health and safety’ of the public and politically-motivated hacking. Even more alarmingly, it makes no distinction between the action itself and the threat of carrying it out. As I said earlier, if I threatened to take down the Home Office Web site on behalf of NO2ID, that threat would in itself amount to terrorism. The same would be true of threatening to impinge on public health and safety by… oh, I don’t know… preventing petrol tankers from leaving oil refineries, say, or clogging up the M4 with a convoy of farm vehicles. TACT, in other words, is a catch-all law, which can be used to criminalise as much or as little of the spectrum of effective political protest as the government of the day chooses. This is not only an authoritarian law, it’s an arbitrary law – a law which legitimises arbitrary state action instead of limiting it.

This vein of arbitrary authoritarianism runs right through TACT. Section 44 of TACT, under which Walter was supposedly detained, is all about defining situations in which police powers can be extended. Section 44 enables a senior police officer to issue an authorisation, covering a specified area for a period of up to twenty-eight days, under which the police have extended powers to stop and search people and vehicles. The authorisation must be issued because the person giving it considers it expedient for the prevention of acts of terrorism. Once it’s issued, however, individual searches don’t need to be justified; the existence of the authorisation, together with a police officer’s stated belief that the search is related to the prevention of terrorism (as defined by TACT), is justification enough.

Assuming that the area of the conference was already covered by a section 44 authorisation, all that would be needed to justify hauling Walter Wolfgang out of the conference hall and searching him would be a police officer’s belief – well-founded or not, reasonable or not, the law explicitly makes no distinction – that Walter was on the verge of committing a terrorist act and that searching him would bring to light articles of a kind which could be used in connection with terrorism. For instance, after shouting ‘Nonsense!’ Walter might have advocated mass civil disobedience in order to bring the country’s war effort to a halt; a blockade of military bases would certainly create a serious risk to the health or safety of the public or a section of the public. He might even have called for protesters to vandalise missiles and war planes (serious damage to property). And he might have been about to take a list of military bases from his pocket and read it out (articles of a kind…). Of course, he wasn’t about to do any of these things, but the police weren’t to know that. Under the provisions of section 44 of TACT, Sussex Police were entirely justified in searching Walter; which is to say that TACT is an arbitrary, authoritarian monstrosity.

But they didn’t search him. And what’s not being said about this incident is:

It’s appalling that the police should have exceeded their powers

Contrary to much popular belief, the police do not have a legal right to play Simon Says: failure to comply with a police officer’s requests is not a criminal offence. More specifically, the police do not have an unfettered right to detain people – indeed, this is precisely why section 44 of TACT was invoked in this case. But TACT doesn’t give them this right either – section 44 provisions, as broad as they are, relate only to searches. If, as most observers seem to agree, Walter was detained under section 44, then he was detained unlawfully.

When it comes to outrage, this incident is a target-rich environment: New Labour management of dissent is genuinely appalling, as is TACT. But there seems to be yet a third level of arbitrary authoritarianism. Section 44 may give the police a free hand in selecting people to search, but that’s all it does. Wednesday’s incident suggests that Sussex Police, at least, are interpreting it as giving them much broader powers to clamp down on protest – and they’re not, as yet, being called to account. That’s really worrying.

Know what I mean

Back here, I wrote:

Tagging, I’m suggesting, isn’t there to tell us about stuff: it’s there to tell us about what people say about stuff. As such, it performs rather poorly when you’re asking “where is X?” or “what is X?”, and it comes into its own when you’re asking “what are people saying about X?”

This relates back to my earlier argument that all knowledge is cloud-shaped, and that tagging is simply giving us a live demonstration of how the social mind works. In other words, all there is is “what people are saying about X” – but some conversations have been going on longer than others. Some conversations, in fact, have developed assumptions, artefacts, structures and systems within and around which the conversation has to take place. The conversation carried on in the medium of tagging isn’t at that stage yet, perhaps, but it will be – the interesting question is about the nature of those artefacts and structures.
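The point that tagging records what people say about something, rather than what it is, can be made concrete with a toy sketch (the usernames, tags and item name below are all invented for illustration):

```python
from collections import Counter

# Toy folksonomy: each row is one person's statement about an item.
# There is no canonical classification, only individual utterances.
tags = [
    ("alice", "item-1", "folksonomy"),
    ("bob",   "item-1", "tagging"),
    ("carol", "item-1", "folksonomy"),
    ("dave",  "item-1", "hype"),
]

def what_people_say(item: str) -> Counter:
    """Aggregate view of one item: the distribution of tags applied to it.

    This answers 'what are people saying about X?' rather than
    'what is X?' - the answer is a weighted record of a conversation,
    not a single classification.
    """
    return Counter(tag for _, i, tag in tags if i == item)

print(what_people_say("item-1"))
```

Asking “what is item-1?” gets no single answer here; asking “what are people saying about it?” gets a perfectly good one.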

Now (with thanks to Anne Galloway) over to Dan Sperber.

When, say, vervet monkeys communicate among themselves, one vervet monkey might spot a leopard and emit an alarm cry that indicates to the other monkeys in his group that there’s a leopard around. The other vervet monkeys are informed by this alarm cry of the presence of a leopard, but they’re not particularly informed of the mental state of the communicator, and they don’t give a damn about it. The signal puts them in a cognitive state of knowledge about the presence of a leopard, similar to that of the communicating monkey — here you really have a smooth coding-decoding system.

In the case of humans, when we speak we’re not interested per se in the meaning of the words; we register what the word means as a way to find out what the speaker means. Speaker’s meaning is what’s involved. Speaker’s meaning is a mental state of the speaker, an intention he or she has to share with us some content. Human communication is based on the ability we have to attribute mental states to others, to want to change the mental states of others, and to accept that others change ours.

When I communicate with you I am trying to change your mind. I am trying to act on your mental state. I’m not just putting out a kind of signal for you to decode. And I do that by providing you with evidence of a mental state in which I want to put you in and evidence of my intention to do so. The role of what is often known in cognitive science as “theory of mind,” that is the uniquely human ability to attribute complex mental states to others, is as much a basis of human communication as is language itself.

I am full of admiration for the mathematical theory of information and communication, the work of Shannon, Weaver, and others, and it does give a kind of very general conceptual framework which we might take advantage of. But if you apply it directly to human communication, what you get is a mistaken picture, because the general model of communication you find is a coding-decoding model of communication, as opposed to this more constructive and inferential form of communication which involves inferring the mental state of others, and that’s really characteristic of humans.

For Dawkins, you can take the Darwinian model of selection and apply it almost as is to culture. Why? Because the basic idea is that, just as genes are replicators, bits of culture that Dawkins called “memes” are replicators too. If you take the case of population genetics, the causal mechanisms involved split into two subsets. You have the genes, which are extremely reliable mechanisms of replication. On the other hand, you have a great variety of environmental factors — including organisms which are both expression of genes and part of their environment — environmental factors that affect the relative reproductive success of the genes. You have then on one side this extremely robust replication mechanism, and on the other side a huge variety of other factors that make these competing replication devices more or less successful. Translate this into the cultural domain, and you’ll view memes, bits of culture, as again very strong replication devices, and all the other factors, historical, ecological, and so on, as contributing to the relative success of the memes.

What I’m denying, and I’ve mentioned this before, is that there is a basis for a strong replication mechanism either in cognition or in communication. It’s much weaker than that. As I said, preservative processes are always partly constructive processes. When they don’t replicate, this does not mean that they make an error of copying. Their goal is not to copy. There are transformations in the process of transmission all the time, and also in the process of remembering and retrieving past, stored information, and these transformations are part of the efficient working of these mechanisms. In the case of cultural evolution, this yields a kind of paradox. On the one hand, of course, we have macro cultural stability — we do see the same dish being cooked, the same ideologies being adopted, the same words being used, the same song being sung. Without some relatively high degree of cultural stability — which was even exaggerated in classical anthropology — the very notion of culture wouldn’t make sense.

How then do we reconcile this relative macro stability at the cultural level, with a lack of fidelity at the micro level? … The answer, I believe, is linked precisely to the fact that in humans, transmission is achieved not just by replication, but also by construction. … Although indeed when things get transmitted they tend to vary with each episode of transmission, these variations tend to gravitate around what I call “cultural attractors”, which are, if you look at the dynamics of cultural transmission, points or regions in the space of possibilities, towards which transformations tend to go. The stability of cultural phenomena is not provided by a robust mechanism of replication. It’s given in part, yes, by a mechanism of preservation which is not very robust, not very faithful (and it’s not its goal to be so). And it’s given in part by a strong tendency for the construction — in every mind at every moment — of new ideas, new uses of words, new artifacts, new behaviors, to go not in a random direction, but towards attractors. And, by the way, these cultural attractors themselves have a history.

There’s more – much more – but what I’ve quoted brings out two key points. Firstly, communication is not replication: in conversation, there is no smooth transmission of information from speaker to listener, but a continuing collaborative effort to present, construct, re-present and reconstruct shared mental models. The overlap between this and the ‘knowledge cloud’ model is evident. Secondly, construction has a context: the process of model-building (or ‘thinking’ as we scientists sometimes call it) is always creative, always innovative, and always framed by pre-existing cultural ‘attractors’. And these cultural attractors themselves have a history – you could say that people make their own mental history, but they do not do so in circumstances of their own choosing…
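Sperber’s claim – macro-stability without micro-fidelity – can be illustrated with a toy simulation. Everything here (the one-dimensional “space of variants”, the two attractor points, the noise and pull parameters) is invented for illustration, not taken from Sperber:

```python
import random

# Toy model: a cultural variant is a number in [0, 1]. Each episode of
# transmission is an unfaithful copy (large noise) followed by partial
# reconstruction towards the nearest "cultural attractor".
ATTRACTORS = [0.2, 0.8]   # two stable regions in the space of variants
NOISE = 0.15              # per-episode copying error (deliberately large)
PULL = 0.5                # strength of reconstruction towards an attractor

def transmit(variant: float) -> float:
    """One episode: noisy copy, then drift towards the nearest attractor."""
    copied = variant + random.gauss(0, NOISE)
    nearest = min(ATTRACTORS, key=lambda a: abs(a - copied))
    reconstructed = copied + PULL * (nearest - copied)
    return min(1.0, max(0.0, reconstructed))

def chain(start: float, generations: int) -> float:
    """Run one chain of transmission for a number of generations."""
    v = start
    for _ in range(generations):
        v = transmit(v)
    return v

random.seed(0)
finals = [chain(0.5, 50) for _ in range(1000)]
near_attractor = sum(
    1 for v in finals if min(abs(v - a) for a in ATTRACTORS) < 0.1
)
# Despite very unfaithful copying, most chains end up clustered near an
# attractor: stability comes from construction, not from replication.
print(near_attractor / len(finals))
```

The copying step alone would let variants drift without limit; it is the reconstruction step that produces the clustering – which is the shape of Sperber’s answer to the meme-as-replicator picture.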

This is tremendously powerful stuff – from my (admittedly idiosyncratic) philosophical standpoint it suggests a bridge between Schutz, Merleau-Ponty and Bourdieu (and I’ve been looking for one of those for ages). My only reservation relates to Sperber’s stress on speaker’s meaning … a mental state of the speaker. I think it would enhance Sperber’s model, rather than marring it, to focus on mental models as they are constructed within communication rather than as they exist within the speaker’s skull – in other words, to bracket the existence of mental states external to communicative social experience. On this point Schutz converges, oddly, with Wittgenstein.

Sperber’s argument tends to underpin my intuition on tagging and knowledge clouds: if all communication is constructive – if there is no simple transmission or replication of information – then conversation really is where knowledge develops, or more precisely where knowledge resides. Sperber also helps explain the process by which some conversations become better-established than others; we can see this as a feedback process, involving the development of a domain-specific set of ‘attractors’. These would perhaps serve as a version of Rorty’s ‘final vocabulary’: a shared and unquestionable set of assumptions, a domain-specific backdrop without which the conversation would make no sense.

One final thought from Sperber:

The idea of God isn’t a supernatural idea. If the idea of God were supernatural, then religion would be true.

Well, I liked it.

Drop you where you stand


Fascinating. The only person to heckle Jack Straw at today’s Labour Party conference was an 82-year-old man, who couldn’t bear Straw’s garbage about Britain only being in Iraq to bring democracy and stability. “Lies!” he shouted. Five security guards promptly pounced. When the delegate next to the heckler told the guards to leave the old guy alone, they pounced on him instead. He is apparently the chairman of the Constituency Labour Party whose parliamentary representative is John Austin MP. The delegate complains that he was violently dragged from the hall, thrown up against a wall and suffered bruising. When he tried to phone his MP, he was told his phone would be seized if he didn’t put it away. Some of the security guards went back for the heckler, and the 82-year-old was subsequently detained by police. John Austin MP says he was present and couldn’t believe his ears when the cop informed the heckler that he was being “detained under section 44 of the Terrorism Act”.

The heckler, it turns out, was Walter Wolfgang, peacenik of long standing. (I used to know Walter slightly – in the early 1990s he was a Tribune contributor and a reliable presence on the Labour CND scene.) The BBC has more:

[Linda Riordan MP] was sitting just a few rows in front of the ejected man when he began shouting. “He was immediately surrounded by three or four stewards and physically lifted off his feet and bundled out of a side door,” she said.

Ms Riordan’s predecessor as Halifax MP, the prominent anti-war campaigner Alice Mahon, also witnessed the incident. She said: “We were listening to Jack talking about Iraq. This gentleman shouted `That’s rubbish, that’s a lie’. Two or three of the security people dived on him. This other chap a couple of rows in front turned round and said `You must be joking’, because this was simple political heckling. He wasn’t threatening anybody. He got manhandled out as well. I think they were really over the top.”

A Labour Party spokesman said: “Following a disturbance in the visitors’ balcony, two people were escorted out, having been asked three times to be quiet.”

As for that bit about the Terrorism Act… well, let’s not get it out of proportion:

Police later used powers under the Terrorism Act to prevent Mr Wolfgang’s re-entry, but he was not arrested.

Outlining the measures taken by its officers, Sussex Police said: “The protocol in this situation is that a police officer is called. The police officer attended and asked the man to wait for a member of the Labour Party. We wish to stress that the delegate was not arrested or searched at any point during his brief interaction with the police officer and that it is a matter for the Labour Party to decide who they allow into their conference.”

It’s just a case of providing police backup for Labour’s hired bouncers, nothing more sinister than that. Why shouldn’t they be choosy about who they let in? Just another case of privatisation of public space.

Still… the Terrorism Act? Here’s Section 44:

44. – (1) An authorisation under this subsection authorises any constable in uniform to stop a vehicle in an area or at a place specified in the authorisation and to search-
(a) the vehicle;
(b) the driver of the vehicle;
(c) a passenger in the vehicle;
(d) anything in or on the vehicle or carried by the driver or a passenger.

(2) An authorisation under this subsection authorises any constable in uniform to stop a pedestrian in an area or at a place specified in the authorisation and to search-

(a) the pedestrian;
(b) anything carried by him.

(3) An authorisation under subsection (1) or (2) may be given only if the person giving it considers it expedient for the prevention of acts of terrorism.

There’s an interesting slippage between the first two subsections and the third: the stipulation regarding the prevention of acts of terrorism refers to the authorisation, not to the individual search. Once an authorisation to stop and search has been granted – covering the whole of a specified area, for a specified period – the wording of section 44 does nothing to restrict the actions carried out under that authorisation. Which is to say that it authorises every police officer in the area to stop and search at will.

Hopefully somebody out there is now muttering about section 45 of the Act. Yes, section 45 modifies this picture substantially:

45. – (1) The power conferred by an authorisation under section 44(1) or (2)-
(a) may be exercised only for the purpose of searching for articles of a kind which could be used in connection with terrorism, and
(b) may be exercised whether or not the constable has grounds for suspecting the presence of articles of that kind.

Section 45 gives with one hand but takes away with the other. 45(1)(b) explicitly confirms that s.44 legitimises arbitrary stops and searches; 45(1)(a), however, stipulates that these can only be carried out for the purpose of searching for articles of a kind which could be used in connection with terrorism (emphasis added).

Now, this wording is extraordinarily broad, particularly when you consider that the Act’s definition of terrorism is pretty broad to begin with:

1. – (1) In this Act “terrorism” means the use or threat of action where-
(a) the action falls within subsection (2),
(b) the use or threat is designed to influence the government or to intimidate the public or a section of the public, and
(c) the use or threat is made for the purpose of advancing a political, religious or ideological cause.

(2) Action falls within this subsection if it-

(a) involves serious violence against a person,
(b) involves serious damage to property,
(c) endangers a person’s life, other than that of the person committing the action,
(d) creates a serious risk to the health or safety of the public or a section of the public, or
(e) is designed seriously to interfere with or seriously to disrupt an electronic system.

(3) The use or threat of action falling within subsection (2) which involves the use of firearms or explosives is terrorism whether or not subsection (1)(b) is satisfied.

Note that the words ‘use or threat’ in 1(1) qualify all the types of action listed in 1(2); if I threatened to take down the Home Office Web site on behalf of NO2ID, that threat would in itself amount to terrorism – and the police, if so authorised, could frisk me and confiscate any articles of a kind which could be used in connection with mouthing off about being a L337 H4x0r.

All this is alarming, mind-boggling and frankly rather weird. But it’s also a bit beside the point, since it appears that Walter Wolfgang wasn’t in fact searched (the delegate was not arrested or searched at any point during his brief interaction with the police officer). Which poses a problem for the Sussex Police. If we assume that the Terrorism Act was invoked (and assuming otherwise would mean calling several people liars) there are really only two possibilities. Either Walter was in fact searched for terrorist impedimenta, and the Sussex Police spokesperson got it wrong; or Sussex Police, in effect, stopped reading the Act before they got to section 45, and came away with the mistaken impression that an authorisation obtained under section 44 allowed police officers to stop anyone for any reason. To put it more bluntly, if they used a section 44 authorisation for purposes other than those laid down by section 45, their action wasn’t covered by the Act – and Walter would have a good case for wrongful detention.

Needless to say, I don’t hold out much hope for a prosecution. I think it’s more likely that the government will tack on a clause to work round s45(1)(a) when they review the 2005 PTA. Making it retroactive would be a stretch, but I wouldn’t rule it out; this is, after all, a government which not only wants to give the police a radical extension of summary powers but actually says so.

As for Walter, Ian McCartney MP has promised him an apology on behalf of the Labour Party. Which is nice. He’s just not going to get it in Brighton:

“I’m going to personally apologise to him,” Mr McCartney said. “I’m going to personally meet him if he takes the opportunity.” But Mr McCartney said Mr Wolfgang would not be allowed back into the conference, which ends on Thursday.

If I drew a detailed map

Several months ago, I wrote (regarding the Wikipedia page on ‘anomie’):

For what I’d want to know about a concept like that, that page is pretty dreadful. It veers wildly between essentialism (there is a thing called ‘anomie’ and we know what it is, across time and space) and nominalism (different people have used this combination of letters to mean different things, who knew?). What’s not there is any sense of the history of the concept

I was reminded of this argument by Tom’s recent comments on the ‘penis envy’ page (“I know this article on penis envy is bullshit, and it’s been on my ‘to do’ list of things to fix for weeks, and I’ve got nowhere”). The problem here is that making things more complicated is a lot harder than keeping them simple. What’s worse, the kind of people who are critical of other people’s simplifications tend also to be critical of their own work, which means that getting the complicated version written and getting it right is a long and painstaking job. Which, in turn, means that in the absence of serious incentives it’s quite likely not to get done. Wikipedia’s native system of informal incentives breaks down, in other words, where the workload gets too large – and, when it comes to making things more complicated (and getting it right), the workload starts at ‘large’ and goes up.

I was talking about this stuff with a friend the other day (hi Chris!) when he came up with a proposal for filling the incentive gap. The idea is to mobilise peer pressure among the population of disgruntled complexifiers. What we want isn’t so much an army of subject experts as a group of people who mistrust simple explanations and are good at digging out and writing down the underlying complications, in any of a number of fields. Hacks rather than professors, essentially – but good hacks. A list of apparently oversimplified Wikipedia articles could then be drawn up, and each one could be offered to names picked from the pool. I’ll just reiterate that I’m not talking about people with expert knowledge, so much as perfectionists with inquiring minds. The Wikipedia articles I’ve mentioned left me with a stack of unanswered questions, which I’d happily devote a few evenings to answering if I was being paid to do so – or if I had any incentive to do so. A virtual tap on the shoulder from an online group of pedantic curmudgeons might just do the job.

That just leaves the task of assembling the group. Here, Chris made the brilliant suggestion of using PledgeBank. Something like this:

I will take part in a group of volunteers who will improve Wikipedia by correcting and extending inaccurate and simplistic entries on social science concepts, but only if another 99 people do so too.

I think it could work. What do you think?

A place for everything

Or: what ethnoclassification is, and what folksonomy isn’t.

When it comes to tagging, I’m facing both ways. I think it’s fascinating and powerful and new – qualitatively new, that is: it’s worth writing about not just because it’s shiny, but because there’s still work to be done on understanding it. At the same time, I think it’s been massively oversold, often on the back of rhetorical framings which only have a glancing relationship with evidence or logic. Tagging is fascinating and powerful and new, but a lot of the talk about tagging has me tearing my hair.

I’ll pick on a recent post by Dave Weinberger. (Personal to DW: sorry, Dave. I’m emphatically not (is that emphatic enough?) suggesting that you’re the worst offender in this area.)

Let’s say you type in “africa,” “agriculture” and “grains” because that’s what you’re researching. You’ll get lots of results, but you may miss pages about “couscous” because Google is searching for the word “grain” and doesn’t know that that’s what couscous is made of. Google knows the words on the pages, but doesn’t know what the pages are about. That’s much harder for computers because what something is about really depends on what you’re looking for. That same page on couscous that to you is about economics could be about healthy eating to me or about words that repeat syllables to someone else. And that’s the problem with all attempts by experts and authorities to come up with neat organizations of knowledge: What something is about depends on who’s looking.

Let’s say you come across the Moroccan couscous web page and you want to remember it. So you upload its Web address to your free page at del.icio.us that lists all the pages you’ve saved. Then del.icio.us asks you to enter a word or two as tags so you can find the Moroccan page later. You might tag it with Morocco, recipe, couscous, and main course, and then later you can see all the pages you’ve tagged with any of those words. That’s a handy way to organize a large list of pages, but tagging at del.icio.us really took off because it’s a social activity: Everyone can see all the pages anyone has tagged with, say, Morocco or main course or agriculture. This is a great research tool because just by checking the tag “agriculture” now and then, you’ll see every page everyone else at delicious has tagged that way. Some of those pages will be irrelevant to you, of course, but many won’t be. It’s like having the world of people who care about a topic tell you everything they’ve found of interest. And unlike at Google, you’ll find the pages that other humans have decided are ABOUT your topic.

What strikes me about this passage is that Dave changes scenarios in mid-stream: Let’s say you come across the Moroccan couscous web page… How? Google couldn’t find it. Let’s compare like with like, and say that you’re still looking for your couscous page: what do you do then, if not go to del.icio.us and type in “africa,” “agriculture” and “grains”? Once again, assuming that whole-site searches aren’t timing out, you’ll get lots of results (particularly since del.icio.us doesn’t seem to allow ANDing of search terms) but you may miss pages about “couscous” – and checking the tag “agriculture” now and then won’t necessarily help. Google will miss the page if the term ‘couscous’ doesn’t appear in the source (which doesn’t necessarily mean ‘appear on screen’, of course); del.icio.us will miss it if the term hasn’t been used to tag it (even if it is in the source).

Google vs del.icio.us is an odd comparison, in other words, and it’s not at all clear to me that the comparison favours del.icio.us. It’s great to get classificatory(?) input from the users of a document, of course – as I said above, tagging is fascinating and powerful and new – but in terms of information retrieval it can only score over a full-text search if

1. the page has been purposefully tagged by a user
2. the page has been tagged with a term which doesn’t appear in the page source
3. a second user is searching for information which is contained in the page, using the term with which the first user tagged it

I don’t think tagging advocates think enough about what those conditions imply. For example, at present I’m the only user to have tagged Mr Chichimichi’s Tags are not a panacea; I tagged it with ‘tagging’, ‘search’ and ‘ethnoclassification’. Until I did so, anyone looking for it would have been out of luck. Even Google wouldn’t be much help – the word ‘ethnoclassification’ doesn’t appear anywhere in the text. No, until a couple of days ago your only way of stumbling on that post would have been to run a clumsy, counter-intuitive Google search on terms like ‘tagging’, ‘tags’, ‘folksonomies’ and ‘social software’. (Google even knows that ‘folksonomies’ is the plural of ‘folksonomy’, so searching on the singular form would work just as well. That’s just not fair.)
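The asymmetry behind those conditions can be made concrete with a toy index. The pages, text and tags below are invented, and this is obviously not del.icio.us's actual data model – just the retrieval logic in miniature:

```python
# Toy corpus: each page has body text and a set of user-applied tags.
# (Invented data; the point is the retrieval asymmetry, not the model.)
pages = {
    "couscous-page": {
        "text": "a moroccan recipe for couscous with vegetables",
        "tags": {"morocco", "recipe", "main course"},
    },
    "tags-not-panacea": {
        "text": "tags are not a panacea",
        "tags": {"tagging", "search", "ethnoclassification"},
    },
}

def fulltext_search(term):
    """Google-style: match only if the term appears in the source."""
    return {p for p, d in pages.items() if term in d["text"]}

def tag_search(term):
    """del.icio.us-style: match only if some user applied that tag."""
    return {p for p, d in pages.items() if term in d["tags"]}

# Full-text search finds 'couscous'; the tag search misses it,
# because nobody happened to tag the page with that word:
assert fulltext_search("couscous") == {"couscous-page"}
assert tag_search("couscous") == set()

# Conversely, tags find 'ethnoclassification', which the text never uses:
assert fulltext_search("ethnoclassification") == set()
assert tag_search("ethnoclassification") == {"tags-not-panacea"}
```

The miss conditions cut both ways: full-text search needs the term in the source, tag search needs a user to have supplied it.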

Dave also contrasts the world of collective knowledge through distributed tagging with attempts by experts and authorities to come up with neat organizations of knowledge. Further along in the same piece, he writes:

This takes classification and about-ness out of the hands of authors and experts. Now it’s up to us readers to decide what something is about. Not only does this let us organize stuff in ways that make more sense to us, but we no longer have to act as if there’s only one right way of understanding everything, or that authors and other authorities are the best judges of what things are about.

One question: who ever said that there was only one right way of understanding everything? OK, too easy. I’ll rephrase that: before tagging came along, who was saying there was one right way, etc? Who are the tagging advocates actually arguing against? (It certainly isn’t librarians (context here).)

There’s a difference between classifications which have a single pre-determined set of definitions and classifications which are user-defined and user-extensible. But that’s not the same as the difference between having an underlying ontology and not having one, or the difference between hierarchical and flat organisations of knowledge, or the difference between single and multiple sets of classifications. A closed, expert-defined, locked-down controlled vocabulary may contain multiple sets of overlapping terms; it may be a flat list of categories rather than a ‘tree’; it may even be innocent of ontology. (Thanks to Jay for pointing this out, in comments here.) If tagging is better than top-down classification, it’s better because it’s user-defined and user-extensible – not because it’s free of the vices of ontology, hierarchy and uniformity. The idea that tagging – and only tagging – stands in opposition to a classifying universe built on hierarchical uniformity is a straw man. (But the librarians get it both ways – if a top-down classifying system is shown to be flat and plural, this can be put forward as a sign of the weakness of top-down systems; the fact that bottom-up systems are more, not less, vulnerable to Chinese Encyclopedia Syndrome is passed over.)

So, tagging systems make lousy search engines, and they don’t mark a qualitative leap in the organisation of human knowledge. What they’re really good for – and what makes them fascinating and powerful – is conversation. Tagging, I’m suggesting, isn’t there to tell us about stuff: it’s there to tell us about what people say about stuff. As such, it performs rather poorly when you’re asking “where is X?” or “what is X?”, and it comes into its own when you’re asking “what are people saying about X?” (Of course, much tag-advocacy is driven by the tacit belief that there’s no fundamental difference between what people say about X and expert knowledge of X – and that an aggregate of what people say would be equivalent, if not superior, to expert knowledge. But that’s an argument for another post.)

Tagging is good for telling us what people say about stuff, anyway – and when it’s good, it’s very good. To see what I’m talking about, have a look at Reader2 (via Thomas). It’s a book recommendation site, implemented on the basis of a user/tag system. It’s powerful stuff already, and it’s still being developed. Does it tell me what books are really like? No – but it tells me what people are saying about them, which is precisely what I want to know. And it couldn’t do this nearly as well, it seems to me, without tags – and tag clouds in particular. This, for me, is what tagging’s all about. Ethnoclassification: classification as an open-ended collective activity, as one element of the continual construction of social reality.
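Mechanically, a tag cloud is simple: count how often each tag has been applied across users, then scale each tag's display size by its frequency. A minimal sketch (the tags are invented, and log scaling is just one common choice, not the only one):

```python
from collections import Counter
from math import log

# Invented tag applications from several users of a hypothetical book site.
tag_events = ["fiction", "fiction", "fiction", "fiction",
              "history", "history", "statistics",
              "folk-music", "fiction", "history"]

def tag_cloud(events, min_px=10, max_px=30):
    """Map each tag to a font size, log-scaled between min_px and max_px."""
    counts = Counter(events)
    lo, hi = min(counts.values()), max(counts.values())
    def size(n):
        if hi == lo:
            return max_px
        # log scaling damps the dominance of the most popular tags
        frac = (log(n) - log(lo)) / (log(hi) - log(lo))
        return round(min_px + frac * (max_px - min_px))
    return {tag: size(n) for tag, n in counts.items()}

cloud = tag_cloud(tag_events)
# The most-applied tag gets the largest size, singletons the smallest.
```

What makes the display interesting isn't the arithmetic but the input: the counts are an aggregate of what people chose to say about the books.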

Who took the money?

This is a fascinating post (in Italian) by Pietro Speroni on the relationship between authority, communities and markets. This is an interesting and controversial area; the fact that Pietro also invokes the Long Tail (which, as you’ll recall, is not what it seems) makes it all the more compelling (to me at least).

I’ll translate as I go along; hopefully Pietro will correct me if I go wrong.

I don’t believe that the ruling class has vanished. I believe that it has simply been transformed – just as the world itself is being continually transformed from day to day. Decades ago, our world was simpler – more homogeneous, less diverse. If you followed a martial art, it would be judo or karate. A game? Chess. A religion? Christian, Jewish, perhaps Muslim at the outside.

on the Net, via Google (and wikipedia), you can find the specific branch of the specific religious tradition which best meets your needs. … And this is not true only of religions, but of everything: interests, political groups, passions, games, ways of life.

Now, every one of these groups has its own implicit hierarchy. … And everyone is a member of more than one group. And in every group you listen to some people, and what you say influences other people.

[In every area of my life] I have leaders: people I trust; people who I admire and learn from. But they’re not the same people as your leaders. Not only that, but there are other people who come to me to learn (worse luck for them!), in some fields more than in others. The process of diversification tends towards having as many groups as people – and every one of us, of necessity, becomes the small-scale leader of a small-scale group, scattered around the world.

This whole process mirrors what’s happening in the economy, where a market consisting of niches is growing explosively … The key phrase is Long Tail.

So I don’t believe that the ruling class is vanishing, but that we’re seeing a gradual diversification of interests, which leads to the diversification of the ruling class – accompanied by the redefinition and contraction [ridimensionamento] of the role of traditional leaders.

There’s a lot that I like about this – I think Pietro’s right to say that there’s a new kind of process of diversification under way, and to trace it back to the Internet’s basic sociality, its nature as a medium for conversation.

But… a transformation of the ruling class? Non tanto [not so much]. Pietro’s larger argument is undermined by a couple of strange elisions. Firstly, it’s true that we all have multiple ‘authorities’ – the topics of folk music, statistics, Belgian beer and operaismo are all important to me, for instance, and in each case I could name an authority I’d willingly defer to. But those people aren’t the people who enforce the laws I obey, or set the level of tax I pay, or price the goods I buy, or write the newspapers I read, or appear on the news programmes I watch. The ruling class, it seems to me, is still very much in place, and whether I’m a tequila-crazed Quaker or a tea-drinking Tantric Buddhist is a matter of sublime indifference to it. Roy Bhaskar has written that historical materialists, by virtue of starting from the material facts of social existence, cannot propose absolute freedom, “a realm free of determination”; what we can envisage is moving “from unneeded, unwanted and oppressive to needed, wanted and empowering sources of determination”. The world Pietro describes is a world which is governed only by those needed, wanted and empowering sources of determination. It sounds good, but I don’t think we’re there yet.

Secondly, on the matter of niche marketing. Pietro assumes that a proliferation of niche markets will lead to a proliferation of niche suppliers, and hence the dilution of the authority of the big suppliers. I don’t see any reason to believe that this is the case. Indeed, one of Chris Anderson’s own preferred examples is based on Amazon sales rank – and there’s nothing very diffuse about Amazon, or the authority wielded by Amazon. Much of the buzz around the ‘Long Tail’ seems to derive, ultimately, from this confusion of the two meanings of ‘niche’. Clearly, mining niche markets can be profitable, if you’re a monopolistic behemoth like Amazon; but, equally clearly, it doesn’t follow that niche suppliers can make a living in the same way. Indeed, making niches visible to companies like Amazon actually threatens existing niche suppliers. (Ask your local bookshop, if you’ve still got one.)
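The two meanings of ‘niche’ come apart even in a back-of-the-envelope model. Here is a toy Zipf-distributed catalogue (the numbers are purely illustrative, not anyone's real sales data):

```python
# A toy Zipf catalogue: sales of the title ranked r are proportional
# to 1/r. Catalogue size and the head/tail cut-off are made up.
N = 100_000                      # titles in the catalogue
HEAD = 1_000                     # the 'hit' titles
sales = [1.0 / r for r in range(1, N + 1)]

total = sum(sales)
tail_share = sum(sales[HEAD:]) / total

# The tail in aggregate is a large slice of the market -- fine if,
# like Amazon, you carry the whole catalogue...
print(f"tail share of all sales: {tail_share:.0%}")

# ...but any single tail title is a sliver, which is the niche
# supplier's problem:
print(f"share of the title ranked {HEAD + 1}: {sales[HEAD] / total:.6%}")
```

On these (invented) numbers the tail is worth mining in aggregate, but only for whoever aggregates it; a supplier living off one or two tail titles is left with crumbs.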

Of course, Long Tail proponents tell a different story. Back in July, Scott Kirsner quoted George Gilder thus:

His central thesis is that Internet-connected screens in the home – whether it’s the PC in your den or the plasma screen on your living room wall – are going to change the way we consume video by offering us infinite choice.

“The film business will increasingly resemble the book business,” he says, with a few best-sellers that achieve widespread popularity, and lots of publishers making a profit selling titles that no one’s ever heard of.

Lots of who doing what? Run that past us again, could you? While you’re at it, send the good news to the novelist A.L. Kennedy, whose wonderful FAQ includes this:

SO, WHAT’S HAPPENING WITH THE WONDERFUL WORLD OF PUBLISHING?
Fewer publishing houses concentrated in conglomerate hands, trying to produce more books of less quality. No full time readers, no full time copy editors and therefore missed newcomers and pisspoor final presentation of texts on the shelves, silly covers, greedy and simple-minded bookshop chains, lunatic bidding wars designed to crush the spirit of unknown newcomers, celebrity “tighten your buns and nurture your inner pot plant” hard backs and much related insanity.

Mass markets are where the units get shifted; niche markets – like literary fiction – are where survivors linger on (until they’re bought out) and upstart competitors emerge (and hang on until they’re bought out). It’s the logic of the monopoly, which is to say that it’s the logic of the market. Some years ago a McDonald’s spokesman, asked if the fast food market had reached saturation point, responded that, as far as his company was concerned, the market would only be saturated if there were no cooked food outlets anywhere on the planet apart from McDonald’s. I don’t think Amazon, or the publishing conglomerates, or the media companies who would source Gilder’s ‘infinite choice’, think any differently.

But Pietro’s half right: there is something interesting going on, even if it doesn’t mirror what’s going on in the economy; there is a process of diffusion and diversification, even if it doesn’t affect the main sources of authority over our lives. In fact, what’s significant about the Net is that it can host conversations which escape the marketplace and evade pre-existing (‘unneeded and unwanted’) forms of authority. That said, it can also reproduce the marketplace and reinvent old forms of authority – just like other conversational media.

In short, what’s good about the Web is – or can be – very good; what’s bad about it is – or should be – very familiar.

Plans that have far-reaching effects

Katrina update. Back here, I wrote:

Louisiana, we now know (thanks to China at Lenin’s Tomb) was one of the areas where the ‘free market’ reforms of FEMA took effect: in 2004, a private consultancy called IEM was paid half a million tax dollars to develop a ‘Catastrophic Hurricane Disaster Plan‘. It’s not clear whether this plan was ever completed, let alone implemented. According to one source (cited by China), hurricane-oriented workshops in July and December 2004 produced “a series of functional plans that may be implemented immediately”; moreover, “resource shortfalls were identified early, saving valuable time in the event an actual response is warranted.” However, a January 2005 report from the National Emergency Management Association (PDF) notes, “Participants from this exercise are waiting for a private contractor to finish the after-action report and plans from this exercise”. Perhaps IEM’s ‘functional plans’ weren’t quite finished after all.

That NEMA report was dated 21st January 2005. You’d think that IEM would have got its ‘functional plans’ ready to go some time in the next seven months, but maybe not. Perhaps the reason why the local and national response to Katrina looked so shambolic was, quite simply, that the people in charge didn’t know what to do.

Here is an important post by Greg of Suspect Device, who was present at the July 2004 ‘Hurricane Pam’ exercise. You should read the whole thing, but here are a few particularly striking quotes:

As with most IEM projects, the Hurricane Pam exercise was put together at the last minute, in a blind animal panic with no time for refinement, testing, or subtlety, but it still was a remarkable and bold idea.

Attendees included emergency managers from all across Louisiana, representatives from the EPA, the National Guard, the Department of Wildlife and Fisheries, the DOTD, the Red Cross (who I remember as being marginalized and tolerated at best, with more than a little eye rolling from the “professionals”), the State Police, and many others. Also taking on important roles were representatives from the Army Corps of Engineers and FEMA, who provided facilitators, computers, and a great deal of support.

There was a certain amount of contention, a few turf wars, some loud talk. None of it consequential, in the end, because of the single greatest emollient: FEMA. The Federal Emergency Management Agency promised the moon and the stars. They promised to have 1,000,000 bottles of water per day coming into affected areas within 48 hours. They promised massive prestaging with water, ice, medical supplies and generators. Anything that was needed, they would have either in place as the storm hit or ready to move in immediately after. All it would take is a phone call from local officials to the state, who would then call FEMA, and it would be done. There were contracts-in-place with major vendors across the country and prestaging areas were already determined (I’ll have more to say about this later, but this is one reason FEMA has rejected large donations and turned back freelance shipments of water, medical supplies, food, etc: they have contracts in place to purchase those items, and accepting the same product from another source could be construed as breach of contract, and could lead to contract cancellation, thus removing a reliable source of product from the pool of available resources. I’m not saying I agree with this — in fact, I don’t, and think it’s boneheaded — but the reasoning is that if they accept five semis of water from the east Weewau, Wisconsin, Chamber of Commerce, the water supplier who is contractually bound to provide 100,000 gallons per day will be freed from that obligation.)

The organizers of the exercise … insisted that the plans contain no “fairy dust”: no magical leaps of supply chains or providers … Everyone tried to keep the fairy dust to a minimum, and they did so, for the most part, despite having big plans: LSU, Southern, Southeastern and other campuses dismissed for the semester and turned into giant triage centers/tent cities; acres of temporary housing built on government-owned land; C-130 transport planes ferrying evacuees to relatives in other states, and so on. Bold plans, but doable, with cooperation. A comprehensive plan was beginning to emerge. Except that it didn’t. A followup conference, to iron out difficulties in some of the individual plans and to formalize presentation of the final package, scheduled for either late ’04 or early ’05 — I can’t remember and can find no mention of the followup event on the web — was cancelled at the last minute, due to lack of funding (which agency called the cancellation, I’m not sure, although the lack of funds would take it all back to FEMA, in the end).

So: Louisiana did have a hurricane plan, but was devising a new one, to be based on recommendation from the people who would actually be doing the work. The need to evacuate people from impact areas, including those without transportation or the means to obtain it, was discussed, despite media assertions to the contrary. … There were and are officials in Louisiana, including New Orleans Emergency Management, who know the limitations of current planning and who have been trying to come up with a better solution.

The problem is FEMA, and by extension the Department of Homeland Security, which gobbled FEMA up in 2003. FEMA promised more than they could deliver. They cut off deeper, perhaps more meaningful discussion and planning by handing out empty promises. The plans that were made — which were not given any sort of stamp of authority — were never distributed or otherwise made available to those who most needed stable guidance; they vanished into the maw of FEMA.

In comments, Greg sums up:

the state didn’t convene the second Pam workshop, to flesh out the plan, because FEMA cancelled the funding, and that even the skeletal plans that were created are not available, because they’re technically FEMA property and FEMA hasn’t released them.

Greg also notes that the number of people without their own transport in south-eastern Louisiana was estimated at 100,000; he adds

The notion of doing something to evacuate those without transport was raised late in the game, but was left as an action item for the followup meetings.

It sounds as if the December 2004 meeting described here had not in fact taken place, because FEMA cancelled the funding.

It’s not clear whether this plan was ever completed, let alone implemented. I think it’s clear now. What’s worse than handing responsibility for vital social support functions to a private company (along with a suitcase full of money)? Doing all that, then pulling the plug on them before they’ve finished the job. FEMA management aren’t just ideologically-driven bureaucrats – they’re incompetent ideologically-driven bureaucrats.

We are the weeds

My previous post on Katrina and its aftermath focused on the contribution made by incompetence – albeit willed and cultivated incompetence. This post is about malice.

As I wrote earlier,

FEMA is now functionally subordinate to the Department of Homeland Security, founded after September 11; this may help explain why FEMA’s interventions in New Orleans placed such an emphasis on securing the perimeter of the city and ensuring that nobody, as a general policy, moved. The triumph of the Homeland Security worldview: natural disasters as a public order problem.

Apparently the Homeland Security worldview predates the Department itself; here’s a passage from the FEMA article I quoted earlier:

In the 1980s, the Reagan administration endowed FEMA with extraordinary powers to keep the country running – powers bordering on martial law, critics argued. The agency became responsible for “continuity of government” plans devoted to salvaging national authority in the event of a nuclear attack. Other plans, drafted by the likes of National Security Council aide Oliver North, laid the groundwork for rounding up rabble-rousers in the event of societal breakdown, whatever the cause.

Larry Bradshaw and Lorrie Beth Slonsky’s story, in case you haven’t read it already, is a graphic illustration of how this approach works out in practice. Now picture the forces of order going from house to house as the floodwaters subside, taking survivors away to ‘refugee camps’, in handcuffs if necessary (I heard that last detail on BBC Radio 4 this morning). And picture the forces of order waiting outside New Orleans until they had built up a large enough force to pacify a supposed insurrection. (Not for the first time, China at Lenin’s Tomb has got the goods: the army had no delusions about their remit – it was not to secure human life and bring supplies, but to suppress an “insurgency”.) If the aftermath of Katrina is a problem, in other words, the survivors aren’t the people who have got the problem – the survivors are part of the problem. In the words of a FEMA staffer at an Oklahoma internment camp, You don’t understand the type of people that are about to come here.

What type of people is that? Here’s Barbara Bush, wife of one President and mother of another, visiting a stadium in Houston which was being used as a holding camp:

What I’m hearing, which is sort of scary, is they all want to stay in Texas. Everyone is so overwhelmed by the hospitality.

And so many of the people in the arena here, you know, were underprivileged anyway, so this – this [she chuckles slightly] is working very well for them.

And here’s her boy, visiting Mobile, Alabama:

The good news is — and it’s hard for some to see it now — that out of this chaos is going to come a fantastic Gulf Coast, like it was before. Out of the rubbles of Trent Lott’s house — he’s lost his entire house — there’s going to be a fantastic house. And I’m looking forward to sitting on the porch.

Even if we forget who Trent Lott is, this is dreadful, Marie Antoinette stuff. All those people have lost everything? They’ll be OK – after all, my friend lost his house, and he’s building a new one… If we remember that Trent Lott is the Republican who endorsed the segregationist Strom Thurmond (“we voted for him. We’re proud of it. And if the rest of the country had followed our lead, we wouldn’t have had all these problems over all these years”); and if we remember that most of the people who got stuck in those New Orleans internment camps are Black… I’m not suggesting that George W. Bush and his government are pursuing an actively racist agenda – that they saw the chaos caused by Hurricane Katrina as an opportunity to treat poor Black people like dirt. I suspect it’s worse than that. I’m suggesting that the government is genuinely attempting to mount an effective response to the disaster – but that its criteria for an effective response don’t exclude treating poor Black people like dirt, and may even encourage it.

It’s as if the government is running two sets of books on its responsibilities to the public. There are the deep-rooted assumptions of the social contract: if we have a government, and if it intervenes in our lives, it must surely intervene to maximise the safety of its citizens and prolong our lives – all our lives, without distinction. But then there’s a political contract, which isn’t cited openly but informs the government’s rhetoric as well as its policy-making – and that contract says, quite plainly, that those people don’t count. Hence, perhaps, a certain genuine bafflement on Bush’s part in the face of the public reaction to the aftermath of Katrina: what’s up with them? they knew what they were voting for, didn’t they?

Here’s Alasdair Gray in 1982 Janine:

[Frazer] was telling us about Machiavelli’s The Prince. “Listen,” he said, “you have just conquered a neighbouring state, right?, and you want to conquer another. So what do you do to the defeated people to stop them revolting against you when you withdraw most of your army?”
We could not answer because we had not read Machiavelli.
“Easy!” cried Frazer, “You split the population into three, take most of the wealth away from one-third and divide it with the rest. The majority have now profited by being conquered. They accept your government in return for your help if the minority start a civil war to get their own back, a civil war which will not occur because the impoverished losers know they are bound to be defeated. The conqueror can now repeat his manoeuvre elsewhere. What I don’t understand,” said Frazer, “is why no
governments have taken Machiavelli’s advice? Surely the first to do it would conquer the world?”

Alan, who seemed not to have been listening, said, “They do.”
After a pause I said, “You don’t mean the British Empire.”
“No. I mean Britain.”

I don’t think this is a question of racism, in other words. (A friend of mine once wrote that she saw just as much evidence of a class structure in the US as she had in her native Britain; the only difference was that Americans persisted in referring to class as ‘race’.) The concerted neglect and casual brutality which have characterised the US government’s response to Katrina seem to be the product of an authentically Machiavellian philosophy of government, which holds that leaders can gain consent by mobilising their subjects against one another. We don’t get many hurricanes here, thankfully, but we’d be kidding ourselves if we thought that this was someone else’s problem:

Mr Blair said he wanted to change the culture of the criminal justice system. He called for “an historic shift from a criminal justice system that asks, first and foremost ‘How do we protect the accused from the transgressions of the state or police?’ to one whose first question is ‘How do we protect the majority from the dangerous and irresponsible minority?’”.

A criminal justice system which downgrades the presumption of innocence, the better to neutralise the ‘dangerous and irresponsible minority’. An emergency management system which lets people die, the better to control poor and unruly survivors. All we need now is more votes for the decent folk – and perhaps that’s not far away (Non-registration was highest in densely populated urban areas with mobile populations, particularly inner London, and areas of economic deprivation).


What the public gets

One possible reason why the aftermath of Katrina has been so dreadful is provided by the piece by Jamie I quoted earlier. There’s something weirdly soviet about all this. We’re seeing this immensely powerful country which has somehow stopped working. Perhaps we should take this image literally: perhaps the reason why it looks as if the US Government is broken is that the US Government, or at least its capacity to act promptly and effectively, is broken.

Or rather, the government’s effectiveness has been broken. This article from 2004 throws some light on the weirdly sclerotic approach which the Federal Emergency Management Agency has displayed during the crisis. Over the last few years, FEMA has been systematically exposed to the logic of the capitalist market. Firstly, the agency has been told that everything it does could be done just as well by external contractors and consultancies; the result has been cost-cutting and corner-cutting, running to stand still and general demoralisation. Secondly, FEMA’s own services have been marketised – thrown open to competitive bidding from potential ‘clients’. The predictable result has been that FEMA’s attention goes disproportionately to richer areas, rather than to those most at risk (such as Louisiana). Thirdly, preventative and ‘mitigating’ action – protecting people from natural disasters in advance rather than clearing up afterwards – has been downgraded, despite having been one of FEMA’s great strengths. There is, after all, no market logic to this type of action: there’s no demand-pull if the disaster has yet to happen. (Come to that, if it hasn’t happened yet it may not happen at all, and then how would you cost-justify your ‘mitigation’?) Read on:

In June [2004], Pleasant Mann, a 16-year FEMA veteran who heads the agency’s government employee union, wrote members of Congress to warn of the agency’s decay. “Over the past three-and-one-half years, FEMA has gone from being a model agency to being one where funds are being misspent, employee morale has fallen, and our nation’s emergency management capability is being eroded,” he wrote. “Our professional staff are being systematically replaced by politically connected novices and contractors.”

From its first months in office, the Bush administration made it clear that emergency programs, like much of the federal government, were in for a major reorientation. … The White House quickly launched a government-wide effort to privatize public services, including key elements of disaster management. Bush’s first budget director, Mitch Daniels, spelled out the philosophy in remarks at an April 2001 conference: “The general idea–that the business of government is not to provide services, but to make sure that they are provided–seems self-evident to me,” he said.

As a result, says a disaster program administrator who insists on anonymity, “We have to compete for our jobs–we have to prove that we can do it cheaper than a contractor.” And when it comes to handling disasters, the FEMA employee stresses, cheaper is not necessarily better, and the new outsourcing requirements sometimes slow the agency’s operations.

William Waugh, a disaster expert at Georgia State University who has written training programs for FEMA, warns that the rise of a “consultant culture” has not served emergency programs well. “It’s part of a widespread problem of government contracting out capabilities,” he says. “Pretty soon governments can’t do things because they’ve given up those capabilities to the private sector. And private corporations don’t necessarily maintain those capabilities.”

In recent congressional testimony, a NEMA representative noted that “in a purely competitive grant program, lower income communities, those most often at risk to natural disaster, will not effectively compete with more prosperous cities…. The prevention of repetitive damages caused by disasters would go largely unprepared in less-affluent and smaller communities.”

And indeed, some in-need areas have been inexplicably left out of the program. “In a sense, Louisiana is the flood plain of the nation,” noted a 2002 FEMA report. “Louisiana waterways drain two-thirds of the continental United States. Precipitation in New York, the Dakotas, even Idaho and the Province of Alberta, finds its way to Louisiana’s coastline.” As a result, flooding is a constant threat, and the state has an estimated 18,000 buildings that have been repeatedly damaged by flood waters–the highest number of any state. And yet, this summer FEMA denied Louisiana communities’ pre-disaster mitigation funding requests. In Jefferson Parish, part of the New Orleans metropolitan area, flood zone manager Tom Rodrigue is baffled by the development. “You would think we would get maximum consideration” for the funds, he says. “This is what the grant program called for. We were more than qualified for it.”

Within FEMA, the shift away from mitigation programs is so pronounced that many long-time specialists in the field have quit. “The priority is no longer on prevention,” says the FEMA administrator. “Mitigation, honestly, is the orphaned stepchild. People are leaving it in droves.” In fact, disaster professionals are leaving many parts of FEMA in droves, compromising the agency’s ability to do its job. “Since last year, so many people have left who had developed most of our basic programs,” Mann says. “A lot of the institutional knowledge is gone. Everyone who was able to retire has left, and then a lot of people have moved to other agencies.”

A lot of the institutional knowledge is gone. In the name of not doing anything the free market could do – and not doing anything the free market wouldn’t do, because anything the market wouldn’t do can’t be worth doing – the government has, in effect, broken itself. It’s divested itself of so many responsibilities that, when disaster strikes, the capabilities which it needed to maintain in order to meet those responsibilities just aren’t there any more. Paul Krugman‘s peroration is horribly persuasive:

The reason the military wasn’t rushed in to help along the Gulf Coast is, I believe, the same reason nothing was done to stop looting after the fall of Baghdad. Flood control was neglected for the same reason our troops in Iraq didn’t get adequate armor. At a fundamental level, I’d argue, our current leaders just aren’t serious about some of the essential functions of government. They like waging war, but they don’t like providing security, rescuing those in need or spending on preventive measures.

So America, once famous for its can-do attitude, now has a can’t-do government that makes excuses instead of doing its job.

Which brings us back to Jamie’s strange ‘Soviet’ parallel. The last years of the Soviet system saw a command economy undermined from within by a pervasive disillusionment with the system: if you were a factory manager, not only was there no point trying to reach your targets, after a certain point there was no point even bothering to doctor the figures to make it look as if you had. Everyone knew – above you in the chain of command as well as below – that the system wasn’t working, if it ever had. Worse, everyone knew that the system they had in the West – where supply and demand information was exposed through the price mechanism – worked better. In that situation, there was no point keeping the system working, or even feeding the system the lies it needed to pretend it was still working. And so the system ground to a halt and fell apart. Unfortunately there wasn’t much to replace it, initially; the years after the collapse were dark (note the change in the death rate between 1992 and 1993, in particular).

Mutatis mutandis – and yes, that’s a lot of mutandis – something comparable seems to be happening in the USA; there, ironically, the ideology which is corroding the machinery of government is promulgated by the government itself. For the Bushites, it seems, the function of government is firstly to maintain a favourable environment for business, and secondly to step out of the way and let business do its thing. When this worldview is superimposed on the prudential, interventionist, humanitarian public-service ethic of an agency like FEMA, the result is confusion and bureaucratic paralysis at best. At worst… It’s worth remembering that FEMA is now functionally subordinate to the Department of Homeland Security, founded after September 11; this may help explain why FEMA’s interventions in New Orleans placed such an emphasis on securing the perimeter of the city and ensuring that nobody, as a general policy, moved. The triumph of the Homeland Security worldview: natural disasters as a public order problem.

One last point. Louisiana, we now know (thanks to China at Lenin’s Tomb) was one of the areas where the ‘free market’ reforms of FEMA took effect: in 2004, a private consultancy called IEM was paid half a million tax dollars to develop a ‘Catastrophic Hurricane Disaster Plan‘. It’s not clear whether this plan was ever completed, let alone implemented. According to one source (cited by China), hurricane-oriented workshops in July and December 2004 produced “a series of functional plans that may be implemented immediately”; moreover, “resource shortfalls were identified early, saving valuable time in the event an actual response is warranted.” However, a January 2005 report from the National Emergency Management Association (PDF) notes, “Participants from this exercise are waiting for a private contractor to finish the after-action report and plans from this exercise”. Perhaps IEM’s ‘functional plans’ weren’t quite finished after all.

I said I had a theory – well, two theories, but this is long enough already; I’ll keep the other one for the next post. Here’s a theory. That NEMA report was dated 21st January 2005. You’d think that IEM would have got its ‘functional plans’ ready to go some time in the next seven months, but maybe not. Perhaps the reason why the local and national response to Katrina looked so shambolic was, quite simply, that the people in charge didn’t know what to do. Oh, sure, they’d had policies and procedures in place for this kind of thing, but those were the old procedures. Under the new procedures… well, funny thing, they’d had a presentation about the new procedures and it all looked pretty good, and then an email had gone round saying the new procedures were about to be issued, but that was a while ago and they should really have had them by now…

Ridiculous, of course – that couldn’t happen. Not in America.

Update: Shelley of Burningbird has some relevant reflections and pointers here. In particular, Shelley links to some searching questions about the preparation for and the response to Katrina, and to this extraordinary piece by Dave Rogers. Dave tells some sea stories, does some serious thinking about the meanings of faith, honour and leadership, and comes to conclusions similar to some of the things I’ve said in this post, but with less pussyfooting. Finally, Dave in turn links to this bizarre piece by Daniel Henninger; all I’ve got to say about that is that if I’m right, Henninger is precisely, diametrically, dead wrong. (And, I suppose, vice versa, if you insist.)

It’s only water

They order these things better in Cuba; there, evacuation means that everybody leaves, down to dogs and cats:

they have family doctors in cuba (!), who evacuate together with the neighborhood, and already know who, for example, needs insulin.

they also have veterinarians and they evacuate animals. they begin evacuating immediately, and also evacuate TV sets and refrigerators, so that people aren’t reluctant to leave because people might steal their stuff.

(The ‘(!)’ isn’t mine; I don’t know what’s funny about the idea of Cubans having family doctors.) Perhaps this isn’t a great source evidentially – the speaker is talking about how things work in general – but it is borne out by the Red Cross in this story from 2002:

Hurricanes Isidore and Lili battered the whole country, especially the tobacco-growing province of Pinar del Río and the nearby Isla de la Juventud, causing widespread devastation.

Cristina Estrada, a regional spokeswoman for the Red Cross, told BBC News Online that only the country’s prompt and well-organised evacuation procedures ensured no-one was killed.

“In any other country in the region it would have been a disaster in terms of loss of life,” she said.

In any other country in the region, indeed.

Going back a bit further, in 1974 they ordered these things better in Australia. As Brian notes, Cyclone Tracy passed through Darwin on Christmas Day(!) 1974. The result was the effective destruction of 70% of the buildings in the town – and a death toll of 65, or slightly more than 0.1% of the pre-cyclone population. (‘Pre-cyclone’, because all but 10,000 of the population were evacuated, and many of them decided not to come back. Understandably, perhaps – apart from anything else, do you know where Darwin is?)

What happened in New Orleans wasn’t much like either the Cuban system or the Darwin experience. On Saturday 27th August the city authorities issued a mandatory [sic] evacuation order, which was followed by many (most?) of those able to do so. For those who remained behind, the city laid on buses – which transported them, by the thousand, to assembly points within the city and left them there. Once inside what were effectively internment camps, the people of New Orleans were treated like internees everywhere – which is to say, like cattle (and not very highly-valued cattle at that). Water, food, sanitation, shelter and medicine were supplied haphazardly or not at all. No one was allowed out of the camps: locals who had survived unscathed offered to take people away in their cars but were told to stay away; survivors who could have walked out of the city were told to stay put. When buses out of the city finally came, survivors were not told where they were going until they’d got on one – nor, almost incredibly, were they allowed to get off a bus before it reached its destination.

The city at large, meanwhile, was effectively written off – far more decisively than seemed to be justified by the outbreaks of gang violence, as alarming as those were. My immediate reaction to those pictures of stranded survivors, waving from balconies and roofs as TV crews passed overhead, was to imagine similar scenes in Britain. And there my imagination failed me: I couldn’t picture that scene without adding a boat of some sort, crewed by concerned neighbours or the RNLI or Red Cross or St John’s Ambulance or the WRVS or the local Rotary Club… If disaster struck a British city, I thought, surely there’d be half a dozen charities and voluntary organisations and ad hoc committees lining up to help, even before the flood waters began to subside. What had happened to civil society over there? I still don’t know if the St John’s Ambulance and the WRVS have any US equivalent, but as it turns out that’s not really the point. What had happened was that the Federal Emergency Management Agency had been approached by several hundred locals who wanted to rescue survivors using their boats, and they had turned them away. FEMA had also refused to permit external agencies to enter the city – the American Red Cross included – on the grounds that their presence in the city would slow down the evacuation. They had also refused… but I won’t go through the list; you can see it here. The long and the short of it was, the city was locked down, and locked down it would stay – whatever the immediate cost to the inhabitants of the city. In the context of a disaster recovery operation, this order of priorities seems odd, to say the least.

If all this is hard to understand, the personal interventions of George W. Bush beggar belief. He visited New Orleans on the 3rd of September – by which time evacuations were, finally, proceeding; his presence promptly halted food distribution for several hours, by imposing a no-fly zone. More culpably, he had relief and rebuilding work started for his media appearances – and halted afterwards. The story of a Potemkin food stall in New Orleans which has been circulating seems to be unfounded (thanks to Chris (in comments) for the nudge). What has been reported on German TV – the video is here (from about 3:20) – is a sudden outbreak of ground-clearing and construction work when Bush and his media crew visited Biloxi. The workers downed tools after Bush left; it was all done for the cameras. But the Biloxi charade was no more than a missed opportunity to do something more constructive – the workers had been clearing an area where nobody had actually lived before the hurricane. More seriously, vital repair work in New Orleans was started for the President’s benefit – and stopped when he no longer needed it. Also via Kos, here’s Louisiana Senator Mary Landrieu, writing on 3rd September:

perhaps the greatest disappointment stands at the breached 17th Street levee. Touring this critical site yesterday with the President, I saw what I believed to be a real and significant effort to get a handle on a major cause of this catastrophe. Flying over this critical spot again this morning, less than 24 hours later, it became apparent that yesterday we witnessed a hastily prepared stage set for a Presidential photo opportunity; and the desperately needed resources we saw were this morning reduced to a single, lonely piece of equipment.

Paul Krugman, writing on September 1st, sums up:

Katrina hit five days ago – and it was already clear by last Friday [26th August] that Katrina could do immense damage along the Gulf Coast. Yet the response you’d expect from an advanced country never happened. Thousands of Americans are dead or dying, not because they refused to evacuate, but because they were too poor or too sick to get out without help – and help wasn’t provided.

Something’s going on. Or rather, something’s going wrong – really horribly wrong. Jamie nails the mood:

So the hurricane strikes and all of us foreigners watch the footage on the news with concern but without much anxiety. It’s just a matter of time before can-do America rolls up its sleeves and cleans up the mess, right? Time goes by and then the Mayor of New Orleans pops up on the BBC talking about bodies floating down the streets and suddenly the estimate of deaths goes up into the thousands. It’s like watching someone jump out of an aeroplane and slowly realising that that person does not, in fact, have a parachute.

There’s something weirdly soviet about all this. We’re seeing this immensely powerful country which has somehow stopped working. There’s sand in the joints and the parts don’t fit together properly. There’s a general air of sluggishness and fatalism. No-one in authority seems to know what to do about anything, or if they do, they don’t have the resources. The president looks on with vague stupefaction as bits drop off and float away.

As for what‘s going wrong, well, I’ve got a theory. Two theories, actually, and I’m not sure yet whether they fit together. I’ll let you know when I find out. Tune in tomorrow ect ect ect.

[Update: the analytical posts are here and here.]

That shallow feeling

Light blogging ahead – life calls.

Very briefly: Ken Macleod asks, “if you are going to limit free speech at all, is it more illiberal to do so by making the proclamation of certain specific and narrowly defined doctrines illegal, or by making administrative decisions based on broad and vague provisions?” It’s an interesting dilemma, but what strikes me most forcibly is that both alternatives are counsels of weakness. A thriving political movement – and, by extension, a government confident in its ability to rally support – will not issue Clarkean all-purpose anathemata against anyone who might in future turn a bit dodgy; but neither will it spend time and effort coming up with a precise legal definition for the Men of Evil, their Evil Groups and their Evil Ideology.

I’m not even sure that this second approach is a real alternative to the first: in practice this type of definition would, I think, inevitably catch too much or too little, and end up so garlanded with interpretative codicils as to amount to an alternative approach to Clarkean constructive vagueness. The real alternative – the counsel of strength – is not narrowing the field of free speech. A thriving movement or a confident government would engage its opponents (or, more to the point, their sympathisers) in open debate, secure in the knowledge that its resources and its support were superior to theirs – so that anything good they had to offer could be quietly appropriated and re-framed within its own ideological and tactical vocabulary, bringing (most of) their supporters across into the bargain.

Of course, the idea of New Labour doing this with radical young British Muslims would make a cat laugh – but that’s a reflection of the weakness of New Labour in 2005, not a statement about the general conditions of political dialogue with disorderly social movements (or with British Muslims in particular). We are where we are – but the conditions of possibility imposed by our current situation aren’t absolute.

And the market forces play

Time for a commercial break. This goes out to all my readers in the Northampton area, particularly those who may be in the market for a photographer – perhaps because they’re planning to acquire a passport, or because they want to celebrate the purchase of a nice new sundial. There are many local businesses competing for your custom; I might mention Profile Photography, Charles Ward Photography or Harvest Studios. Then there are John Roan Photography and PRS Digital – the list goes on. Weddings, of course, are big business for photographers, in Northampton as in other areas. (I had a wedding once, and a very nice day out it was.) If a wedding is on your agenda – assuming once again that you’re in the Northampton area – you might want to consider going to Nene Digital Wedding Photography, or getting the whole thing on video courtesy of April Productions.

I’m not able to endorse the quality of the work carried out by these businesses, as I know nothing about any of them. However, they do have one point in their favour, which sadly isn’t shared by one of their competitors. None of these businesses has attempted to gain cheap publicity by spamming the comments section of this blog. For that, I salute them.

(CC’d by email to… you know who you are.)

[Update: the offending photographer was only the first in a stream of comment-spammers, all of whom have presumably signed up to a particularly scummy direct-marketing service. Deleting them individually was getting to be a pain, so all comments are now giftrapped (thanks to Chris for the term). Sorry about the inconvenience.]

Bullet got the wrong bloke

For a few hours on the 22nd of July, Jean Charles de Menezes was a terrorist suspect. What he wasn’t was a capital-S Suspect; he wasn’t ‘known to the police’, as we used to say. (Or rather, he wasn’t the known person the police thought he was – apparently he was mistaken for Osman Hussain.) What if he had been?

Following last night’s appalling revelations, much attention has focused on the police’s apparent failure to verify that de Menezes was the Suspect they were after. What if they had done? What if it had been Osman Hussain who was shot?


I heard shouting which included the word ‘police’ and turned to face the male in the denim jacket. He immediately stood up and advanced towards me and the CO19 [firearms] officers … I grabbed the male in the denim jacket by wrapping both my arms around his torso, pinning his arms to his side. I then pushed him back onto the seat where he had been previously sitting … I then heard a gun shot very close to my left ear and was dragged away onto the floor of the carriage.

The male in the denim jacket was (self-evidently) not about to detonate any explosives: officers had no reason to suppose that their lives, or the lives of the tube passengers, were in danger. (As I wrote back here, “was de Menezes, in his denim jacket, seen as a low enough risk to be watched on the bus rather than being intercepted, and rugby-tackled on the tube train rather than being shot from a distance?”) He could, when he approached the firearms officers, have been intending to go for a knife or a gun – but pinning his arms to his sides and pushing him back into his seat handily dealt with that possibility.

So it’s hard to see any legal – or rational – justification for the shooting; and this would still be the case if they’d got the right bloke. To quote myself at greater length,

was de Menezes, in his denim jacket, seen as a low enough risk to be watched on the bus rather than being intercepted, and rugby-tackled on the tube train rather than being shot from a distance? But if so, why was he killed? Not, surely, because he had been misidentified as one of the July 21st bombers – this would be summary justice pure and simple.

What I wonder about, after last night’s news stories, is: what if it had been Osman Hussain wearing that denim jacket and forced back into that seat on the tube train – what would be the mood of the country now? Would a leak from the Police Complaints Commission have been front page news? Would we be hearing calls for multiple resignations? Or would an act of summary justice – an extra-judicial execution in broad daylight, a truly appalling precedent – have been accepted? Would we now be being encouraged to hail the Metropolitan Police for its resolute stance against terror and its willingness to take the fight to the enemy? (They might cut a few corners here and there, but what’s the odd dead terrorist to you or to me?)

The charge that Ian Blair, like his namesake, is a liar has gained some traction lately. The possibility I’m considering here is that he’s a gambler: that he saw the July 21st bombings – and the Stockwell operation – as a chance to massively extend the effective power of the Metropolitan Police, and to do so without endangering its support in the political class and the media. I don’t know if the gamble would have paid off; I’m glad we never found out.

The Templars and the Saracens

In a piece which appears in The Salmon of Doubt (I don’t know whether it was published in the author’s lifetime), Douglas Adams writes:

There’s always a moment when you fall out of love, whether it’s with a person or an idea or a cause, even if it’s one you only narrate to yourself years after the event: a tiny thing, a wrong word, a false note, which means that things can never be quite the same again. For me it was hearing a stand-up comedian make the following observation: “These scientists, eh? They’re so stupid! You know those black-box flight recorders they put on aeroplanes? And you know they’re meant to be indestructible? It’s always the thing that doesn’t get smashed? So why don’t they make the planes out of the same stuff?” The audience roared with laughter at how stupid scientists were, couldn’t think their way out of a paper bag, but I sat feeling uncomfortable. Was I just being pedantic to feel that the joke didn’t really work because flight recorders are made out of titanium and that if you made planes out of titanium rather than aluminium, they’d be far too heavy to get off the ground in the first place? … There was no way of deconstructing the joke (if you think this is obsessive behavior, you should try living with it) that didn’t rely on the teller and the audience complacently conspiring together to jeer at someone who knew more than they did. It sent a chill down my spine, and still does. I felt betrayed by comedy the same way that gangsta rap now makes me feel betrayed by rock music. I also began to wonder how many of the jokes I was making were just, well, ignorant.

De mortuis, but I tend to think the (self-)criticism was apt. A lot of Hitchhiker is less like a novel – or radio series – than a student revue (a very good student revue, admittedly): take the paper-thin characterisations, the dialogue built around gag lines or – more importantly for the current argument – the evocation of weird and counter-intuitive areas of science and philosophy, undercut by a common-sensical English ordinariness. This is amplified by the Pythonesque dogged persistence which won’t let go of an idea until it’s been pushed to its logical limit, taken over the limit, fined for exceeding the limit and embroiled in a lengthy but inconclusive case in the Court of Over-Extended Metaphors. Stylistically, this gives us Arthur’s exchange with Prosser over the planning notice (“…behind a door marked Beware of the Leopard”) or most of Marvin’s lines (“The second ten million years, they were the worst too.”) – great lines all, but very unlike anything anyone would actually say. Put it together with the common-sensical idea-juggling and you get, for example, the argument for atheism derived (all too logically) from the Babel Fish. What’s most striking about this argument is that it’s got nothing in common with the arguments of actual proponents of “intelligent design” – which are no less ridiculous, but turn on the idea that the wondrous complexity of the universe does provide evidence of the handiwork of a Designer. There’s a lack of engagement with the Creationist mindset here, which ironically makes that mindset harder to combat. If you assume that everyone starts from the same set of common-sense precepts, genuinely alien world-views will only be explicable on the grounds that the people holding them are irrational or stupid – which isn’t the best way to open an argument, even (or especially) an intransigently critical argument.

The mindset that this kind of writing seems to represent (and affirm) is that of someone who’s learnt a lot of valuable stuff in a short time, and who now doesn’t see the need to learn very much more. There is stuff out there that you could learn, but most of it’s not really worth the effort – at best it’s inessential, at worst it’s a pile of pretentious verbiage. If you demonstrably know a lot more than the average person about genuinely important topics, the chances are that you know enough – enough to see through the people who tell you there’s more to be known, anyway. It speaks to the inner second-year science student, in short. (One of the benefits of doing an arts degree is that you never forget that there’s lots of important stuff out there that you genuinely don’t understand. You never forget this if you have any contact with second-year science students, anyway.)

Terry Pratchett has a lighter hand with the dogged persistence than Douglas Adams, but in most other respects he’s a far better writer (he’s much better at people, for a start). That said, some of his jokes suggest the same kind of self-enclosed common sense, evoking the alien without engaging with it. (Does Pseuds’ Corner take nominations from blogs?) One example is the (admittedly funny) dwarfish war-cry “This is a good day for someone else to die!” Some years ago, the KliLakota original of this slogan (“This is a good day to die!”) was discussed on the newsfroup. The tone of the discussion was cheerful and uncomprehending. I wouldn’t say that anyone jeered at the KlinLakota, but very few people showed much sign of understanding the slogan, as distinct from Pratchett’s common-sensical inversion of it. One’s own death is, after all, an eventuality to be postponed as long as possible, not to be embraced. One poster even suggested that the slogan had begun as a deliberately-tempting-Fate insurance policy, akin to “break a leg”.

Fortunately one poster – the wonderfully-named ‘Catherine Denial’ – pointed out that death in battle was an honourable fate for KlingLakotadammit warriors, so that the slogan could actually be taken literally (‘death in battle’=’good death’, ‘today’=’day of battle’, therefore…). [Update 23/6/2007: it's just come to my notice that Catherine Denial is in fact not a clever pseudonym but the name of a real person, who has written widely on nineteenth-century American history. Apologies.]

And I’m not sure even this goes far enough. The point is, surely, that the function of soldiers (contemporary, dwarfish or KlingoLakota) is to kill and risk being killed – and that unwillingness to do the latter makes them less effective in doing the former. The tone is very different, but in terms of the underlying worldview “This is a good day to die!” isn’t so far from the Royal Navy saying “If you can’t take a joke you shouldn’t have joined.” Meaning, in the words of a post from soc.history.what-if by the late and much-missed Alison Brooks,

When it is raining and dark, your feet are giving you hell because they have been wet for two weeks, when you are carrying a pack weighing your own weight, when you are on the edge of a minefield, aware that, well within range, are more people than you who want to kill you, and they have the capacity to do so, when your best friend standing ten feet from you gets hit, and you have to wipe his brains from your face so that you can see, and when the instruction is given to go forward, if you can’t take a joke, you shouldn’t have joined.

You risk death – and, if so instructed, take actions which you know will increase your risk of death – because that’s what you do: that’s what being in the armed forces is all about. (Not that you’ll find it in the recruitment literature.) In its more aggressive form – getting back to the Native Americans – this outlook also makes for a more formidable opponent: an enemy who wants to save his own skin first and kill you second is a lot easier to deter than one who just wants to kill you.

As you’ve probably worked out by now, this post isn’t really about Douglas Adams or Terry Pratchett; it’s not even about the Royal Navy or the Lakota (let alone the blasted Klingons). It began life about a month ago – a decade or so in blogtime – in response to this post on Brian Barder’s blog and the ensuing comments, this one in particular. Brian writes:

it’s obviously psychotic, isn’t it?, to be unable to perceive the large-scale random murders of wholly innocent people as anything but evil? And when the murders are deliberately and unnecessarily accompanied by the suicides of the murderers, doesn’t that suggest minds that have become completely unhinged? Isn’t it psychotic to suppose that some desirable result can be achieved by killing others and oneself because of ‘grievances’ that have nothing whatever to do with the murder victims, and which can’t possibly have a better chance of being remedied as a result of the murders committed?

As long as we persist in seeing [the bombers] as politically and rationally motivated people whose response to their grievances is to go out and kill people, and as long as we strive to ‘understand’ that behaviour, we shall encourage more of the same. It is insane as well as evil to act in the way that they have done, and while we need to try to hack out the roots of the insanity as well as of the evil and criminality, we need to beware of giving the impression that by trying to understand them and what they did, we regard murder as an understandable (and therefore in some sense defensible) response to a political grievance. Psychiatrists may properly seek to understand the roots of insane and evil behaviour: the rest of us need to be clear that the behaviour is insane and evil and that it can never be condoned.

Brian conflates two arguments which, I think, urgently need to be disentangled. On one hand, I don’t believe that it does any good to deny that the bombers acted rationally, let alone to describe them as ‘psychotic’: their world view was certainly alien to me, but I don’t think it was therefore insane. Apart from anything else, is it necessarily a sign of psychosis to kill innocent people, to carry out attacks which will cost your own life, or to attack people whose death can’t in itself advance your cause? Not, I would argue, if you’re a soldier – or an irregular combatant (were Orde Wingate’s Special Night Squads ‘psychotic’? is Hamas?). Similarly, the bombers’ actions make sense if we assume that they saw themselves as part of a guerrilla force, fighting on one front of a war with Britain (among other nations), and prepared to use any means – however inhumane – to further their cause.

Obviously this world-view – as well as the acts it inspires – is vile and cannot be condoned: to understand it is not (pace Brian) to see it as in any way defensible. But, as I said above, there are two separate arguments here. Yes, the London bombings were evil and can never be condoned; but no, this does not require us to characterise them as insane. Visualise concentric circles. To demand that Britain withdraw from Iraq is a legitimate political point of view which is widely held (and which is not necessarily counter to British national interests). To demand that ‘the West’ withdraw from ‘Islamic lands’ is a legitimate point of view which has rather fewer adherents (and which is counter to British national interests). And to set out to kill at random in order to further this point of view is unforgivably evil; moreover, it is an unforgivable evil committed in a bad cause. (As I’ve argued before, it’s hardly possible – and may not even be desirable – to uncouple your assessment of a terrorist act from your assessment of the cause involved.)

This is what I mean by ‘understanding’ – and I don’t see that it involves any ‘condoning’, any ‘in some sense defensible’. What it does involve is visualising those concentric circles – which I think is essential, if we’re to have any hope of stopping the flow of recruits from outer circle to inner.

Ashtrays of emotion

This blogpost is published in conjunction with the Elect The Lords campaign, which recently made a Pledgebank appeal – one I signed – to blog about Lords reform. It marks the anniversary of the first Parliament Act, according to which

it is intended to substitute for the House of Lords as it at present exists a Second Chamber constituted on a popular instead of hereditary basis

The 94th anniversary, to be precise. Related posts can be found via Technorati tag , and the New Politics blog.

In the beginning, there were barons. In the time of the Normans – and the Anglo-Saxons, for that matter – the people who mattered were the people who owned land and commanded allegiance; the monarch was essentially Top Baron, the capo di tutti capi. Taxes were collected, percentages were taken and favours were granted; it was a system.

Over time, the distinction between monarch and barons grew stronger; in reaction, the barons began to operate as a power in the land in their own right, independent of – and sometimes in opposition to – the Crown. Simon de Montfort took things further in the thirteenth century, buttressing his own power base with the support of commoners (landed gentry and knights of the shire, that is). Another century and the ‘commoners’ are themselves seeking collective representation, so that they can also make demands of the Crown – although not the kind of demands made by Wat Tyler, of course. (They weren’t that kind of commoner, either; these people were about as ‘common’ as the average Cheshire magistrate – who is of course their direct descendant.)

There it is: we’ve got a House of Lords and a House of Commons, and we’re not even up to the Tudors. Members of the House of Commons are even elected, although the electorate is small and rather select. Subsequently the balance of power tipped still further against the Crown; you could say it tipped quite decisively on the 30th of January 1649, although that date isn’t generally celebrated in histories of parliamentary democracy. By the eighteenth century, anyway, Parliament is starting to run things; this is when we start hearing about the Ministers appointed by the Crown, chief among them the Prime Minister.

In the nineteenth century, after the unpleasantness in France, we started to hear about democracy. By 1900 the electorate of the House of Commons is a pretty high proportion of the adult male population; getting there only took a couple of mass movements, a few years of near-insurrectionary agitation and a dead Prime Minister. (The assassination of Spencer Perceval had nothing to do with any of this, but it must have concentrated some minds.) Another mass movement and a world war, and even women are voting. Never let it be said that reform is impossible. (Never let it be said that it’s easy, either.)

The vote for all adults (aged 21 or over) was finally conceded in 1928. All this time, the House of Lords had been sitting there unreformed, preserving its ancient traditions and generally getting in the way – more and more so as the House of Commons became more representative. The decisive confrontation had come in 1911, when the Lords and the King, under duress, conceded the supremacy of the Commons – and endorsed the project of replacing the House of Lords with something more representative.

And then nothing happened, for 94 years.

To put it schematically, from Simon de Montfort to Edward VII there were always two sides: a ruler on one side, an opposition with its own power base on the other. It’s King vs Lords; then King vs Commons; then the 30th of January 1649 (although, as I’ve said, we don’t really speak about that). Then 1688, after which it’s not King vs Parliament so much as Parliament vs King; and finally 1911, when it’s decisively Commons vs Lords (and King). But what should have been the final victory of the Commons was never pressed through. What happened instead, oddly, was that a new opposition developed: Prime Minister vs Commons. In the penultimate stage of this development, under Thatcher, the unreformed House of Lords was even brought into play on the Prime Minister’s side. Still more bizarrely, in the Blairite final stage the Commons were so thoroughly managed that the Lords began to seem a bastion of liberty, due process and free speech, if not democracy. Perhaps this is the final act in the re-centralisation of government power in Britain: Prime Minister vs Lords. It’s not hard to see how that one will play out – particularly given that the Prime Minister has inherited the Crown’s power to pack the House of Lords with his own capi. And the barons, the damned stupid barons…

Ninety-four years after the Parliament Act, arguing for a democratically-elected second chamber isn’t particularly hard: it seems like a reform whose time has come, to put it mildly (ninety-four years!). Given the background I’ve sketched out above, it’s also a reform which would have some very far-reaching consequences. Replacing the current bodged-up medieval absurdity with an elected second chamber would instantly create a massive counterweight to the power of the Prime Minister – perhaps more massive than we can readily imagine in these diminished times. (Think of the ‘control orders’ debate, only with the aura of democratic legitimacy which Ken Livingstone gained when Thatcher started threatening the GLC and a free-spokenness somewhere between Lord Hoffmann and George Galloway. Something like that.) It would also open several cans of democratic worms in the House of Commons itself – if members of the second chamber are elected on a fixed date and with a proportional system (and I can’t see why they wouldn’t be), what about the Commons? For both these reasons, it’s vanishingly unlikely to happen, unless a lot of people shout for it loudly and persistently – and even then, it’ll be pretty damn unlikely.

But that doesn’t mean it’s not worth shouting.

In another country, with another name

In a comment thread on his blog, Brian Barder writes:

You [meaning me - PJE] take a more generous view than I do … of the opinions, implied or explicit, of those many commentators who have been saying (and continue to say) that because Blair must have known that UK participation in the invasion and occupation of Iraq would be used by Muslim extremists to generate additional anger and resentment against Britain, and that this would increase the likelihood of a terrorist attack in Britain, therefore Blair has a share of responsibility for the London bombings. Attributing responsibility in this way has two unavoidable implications: (1) that Blair deserves a share of the blame for the bombings and (2) that the increased likelihood of a terrorist attack in Britain ought to have been a factor influencing Blair against his decision to join the Americans in invading Iraq, even if on other grounds he believed it right and necessary to do so. You come perilously close to adopting this view, it seems to me, when you write:

the Iraq invasion created new opportunities for terrorists, created anti-British feeling which was likely to make it easier to recruit new terrorists, and created disaffection among British Muslims which was likely to produce active or passive support for terrorists – and that all these consequences were probable, could have been predicted and should have been weighed in the balance when Blair & co were contemplating joining Bush’s invasion. To have overlooked predictable consequences like this in a good cause would be bad enough (pace Geras); when the cause in question is the Iraq war as we’ve known it, Blair’s responsibility is heavy.

Once you accept that the threat of terrorist attack in response to a specific act of policy is a factor legitimately to be taken into account in making decisions on that policy, you are handing over control of our foreign (and eventually our domestic) policy to terrorists. This is exactly comparable to yielding to the demands of a blackmailer. The only consequence of such surrender is that the demands of the terrorists (and of the blackmailer) will become yet more frequent and more exorbitant. In other words, the increased risk of terrorist attack in the UK should have been totally excluded from Blair’s calculations of the pros and cons of taking part in the Iraq war.

In response to Brian’s first point, I don’t think that Blair’s government can sensibly be blamed for the bombings, unless there’s an unusually long and obscure trail yet to be uncovered, leading from the Foreign Office back to the madrassas. What does fall within the government’s responsibility is protecting its citizens from arbitrary killings. The question is whether the government may bear a share of the blame for failure to protect us from the bombings – a failure which may include failing to avert the bombings altogether, by contributing to the development of conditions which made them more likely. The second argument – that Blair would have been correct to leave the threat of terrorism out of his pre-Iraq calculations – is more substantial, but I have to say that I find it highly counter-intuitive. As Tony Hatfield said in comments here,

The State has an obligation to consider every effect flowing from its policy – especially its foreign policy and certainly a policy involving a declaration of war. That must include the effect of any “blowback” from terrorism. … If that is so, then there must be circumstances – the threat is so immediate, and disproportionate to the benefit you seek – that it tips the balance firmly against the policy.

Brian’s analogy with blackmail is suggestive, but I don’t see that it can entirely sustain his argument – after all, any concession to anyone may be interpreted as a sign of weakness and exploited accordingly. When one government makes demands of another, there is always the possibility that one of the two will end up paying Danegeld or conceding the Sudetenland; however, in practice these extreme cases can be disregarded, and demands can be considered on their merits (bearing in mind the foreseeable consequences of granting or refusing them). Certainly it would be absurd to say, as a matter of principle, that no government should change its policies based on demands made by another government. Should we exclude demands made by non-governmental actors? But that’s not right either – we would expect (and in some cases hope) that governments would be responsive to demands made by multi-national businesses, by the world’s major faiths, by trade union confederations, by charities and campaigning organisations.

There’s obviously something about terrorist organisations which makes it reasonable (from Brian’s perspective) for governments to refuse any demands outright and on principle: something which turns pressure into blackmail and recognition into capitulation. Intuition tells me that the difference is staring me in the face, in the word ‘terrorist’, but in this case I think intuition is wrong. The problem with terrorist groups, in other words, isn’t the fact that they back up their demands with arbitrary and random violence. Imagine an organisation which attempted to gain publicity for its demands by planting dummy bombs. At first the bombs would be taken for the real thing and there would be a certain amount of panic and alarm, even if nobody was actually injured by them. After a while, though, the ‘bombs’ would be treated with contemptuous lack of interest, by police and public alike. At this point, has the group ceased to be terrorist – and should the government become willing to negotiate with it? Conversely, imagine a campaign for constitutional reform whose rallies, ignored by the government, grow larger and more unruly, to the point where violent clashes with the police are a predictable occurrence. The campaign’s activities have led directly to the wounding of police officers, in other words; does this mean that it has turned into a terrorist campaign, whose demands should be ignored on principle? In both cases, the reverse appears more likely.

It seems that the judgment of whether an organisation is terrorist – meaning that its demands should be rejected unconsidered – is independent of what it does. The key is, perhaps, provided by Brian’s analogy with blackmail. A terrorist group, we could say, is criminal by nature: in order to achieve its aims, it needs to undermine the state and attack the rule of law. Criminal actions carried out by a constitutional political group are an anomaly which has only a limited effect on our willingness to recognise or deal with that group. By contrast, criminal actions carried out by a terrorist group reaffirm the criminal nature of the group and vindicate our refusal to recognise them.

The trouble with this line of argument is that it brings the aims of the group into play as well as its tactics: if terrorist groups are defined by their fundamental opposition to the state and the rule of law, we need to be sure that the groups we describe as terrorist are fundamentally opposed to the state and the rule of law, rather than using criminal tactics to promote demands which could in principle be granted by the state (and legitimated by the law). Hence, perhaps, Blair’s bizarre argument that what sets Al Qaida apart from the British Army is that “They don’t regret the loss of innocent, civilian life. They rejoice in it, that is their purpose.” (Let’s hope for Blair’s sake that Al Qaida never takes lessons in PR from the IRA, who were past masters in regret for the consequences of their actions (we deeply regret the loss of innocent life, caused by a conflict which will inevitably continue…).) I’m not going to go into the question of whether the aims of Al Qaida are non-negotiable in this sense, beyond recommending some cogent arguments for and against the proposition. I think it bears stressing that the ‘blackmail’ analogy rests on an assumption that terrorist groups are different in kind from other political actors, and – most importantly – that this difference derives primarily from their goals rather than their actions (however criminal – however vile, come to that – those actions may be).

But let’s say that, in the case of Al Qaida, we are dealing with a criminal conspiracy with no political aims which could possibly be conceded. Even in that case, I don’t think it follows that principled policy-making should take no account of them. Consider a less controversial criminal conspiracy, the Mafia. The Mafia certainly has no demands which any responsible government would grant; formulating policy in order to benefit the Mafia would be reprehensible. However, according to the ‘blackmail’ logic, allowing the government’s opposition to the Mafia to influence policy – perhaps by favouring policies which limited the Mafia’s opportunities to penetrate British society – would itself represent a tacit recognition of the Mafia as a force to be reckoned with, and should therefore be rejected. The responsible course of action would be to take whatever actions the government believed would benefit Britain, leaving the Mafia – and the possibility that government action or inaction might favour the Mafia – out of consideration.

This argument is clearly fallacious. Whether or not the government’s decision is influenced by the existence of the Mafia, the Mafia continues to exist and to have significant effects on the government, both at the time the decision is taken and at the time it is implemented. There is no possible decision which does not have a relationship to the Mafia, in other words; the choice is whether that relationship is favourable or unfavourable. A decision which limits the opportunities available to organised crime (perhaps by putting an upper limit on the number of casinos to be licensed) is unfavourable; a decision which does not limit those opportunities is favourable, whether it does so actively or by default. As with the Mafia, so with Al Qaida: if the government did, in fact, deliberately ignore the possibility that the Iraq invasion would expand the opportunities open to terrorists, it can fairly be charged – on those grounds alone – with making this outcome more likely.

Brian also argues that there is a fundamental and important discrepancy between the (wholly unacceptable) tactics of the bombers and the (potentially legitimate) political causes with which they have been associated.

The other implication of much bien-pensant comment has been that we need to ‘understand’ what drove the suicide bombers (successful or failed) to commit such dreadful acts and to accept that we (or the Blair government, or western society, or whatever) are all partially to blame for the policies and actions that drove the bombers to do what they did. This seems to me an utterly unacceptable proposition, too, for the reasons eloquently expressed by Brownie in the passage that I quoted. The idea that the pursuit of policies with which others violently disagree is partly responsible for acts of criminal madness committed, apparently, as an expression of that political disapproval, is nonsense, and we shouldn’t hesitate to say so. You write that

people aren’t born terrorists. People have to become terrorists – even that subset of people who are also fundamentalist Muslims and believers in a restored Caliphate. Obviously the terrorists are to blame for their actions, but for those people to have become terrorists something must have gone wrong – something more than being exposed to an ‘evil ideology’.

but it’s a far cry from that to the assertion that whatever ‘must have gone wrong’ is something for which our own society, or government, or culture, or original sin, must be to blame.

My point here was that successful terrorist actions require a continuing supply of recruits – all the more so in the case of suicide bombings, obviously – and that each of these individuals must go through a whole series of events and influences before they become a terrorist. Pace Brian, I’d say that it would be absurd to assume – on the grounds that terrorists have carried out ‘acts of criminal madness’ – that nothing about “our own society, or government, or culture” played a part in the formation of those terrorists. That is not to say that we can necessarily identify what those contributions are or how significant they were – in absolute terms or in comparison to other influences. But to say that no one other than the terrorists themselves bears any responsibility for their actions, and that we cannot – and should not – address the grievances which motivate terrorist sympathisers, seems to me to set up an absolute separation between ‘us’ and ‘them’ which is highly unhelpful. Something did go wrong for the eight bombers we know about; as far as we know it went wrong right here in Britain, some time in the last few years. In the circumstances, it seems to me, the burden of proof lies with anyone maintaining that the Iraq invasion was not a factor.

Postscript: at Veritatis Splendor, enigmatic NederlanderVlaming D says it all more succinctly than I’ve been able to:

The pro-war people will argue that the jihadists will always find some excuse to launch another terrorist attack on us, regardless of what “root causes” we take away. They’re confusing two things. It’s true that you can’t make deals with or give in to the jihadists. You can’t take the “root causes” of their hatred or extremism away. They will always hate us, for it is our very existence, our “way of life,” that is the root cause of their hatred. Their ideology is so diametrically opposed to our own, that peaceful co-existence with these people is not possible. And indeed, we shouldn’t try to appease them or adopt a laissez-faire attitude towards them. The only strategy against these people is confrontation: not only do we need to prevent them from attacking us, we need to attack them. Again, this is a matter of police and intelligence forces.

We can, however, tackle the “root causes” of Muslim support for these people. As I’ve argued above, a radical minority is nothing without the support of the mainstream. This jihadist “radical minority” will cease to exist (or cease to be consequential in any case) without fresh recruits to carry out its suicide missions and without the silent, or vocal, approval of ordinary Muslim communities. The war in Iraq is a good example, because this is where the opinions of ordinary Muslims and jihadists “overlap”: they both think it stinks to high heaven. By stressing how much they have in common, the jihadist can persuade the average Muslim.

Conversely, jihadists are not that successful in gathering real, practical support for their ultra-conservative interpretations of Islam, or for their utopian “Caliphate.” We naturally oppose these ideas too, but why be so bothered with them when we know they have no real basis of support within the Islamic community itself? Does anyone seriously believe Europe will one day be overrun by massive hordes of Muslim warriors bent on establishing the Caliphate?

The average Muslim in Europe doesn’t want to kill homosexuals, or prevent women from driving a car, or stop us from eating pork, or burn every copy of Harry Potter. If we are to prevent his radical counterpart from convincing him he should do all these things, our job is to convince him of the contrary (“battle for the hearts and minds,” anyone?), stress what is clearly unacceptable and what is open to civilized debate (this as opposed to shutting down the debate in its entirety with the fallacious mantra “opposing the war = supporting terrorism”), and finally, do more to promote alternatives. In doing so, you take away the ordinary Muslim’s every reason to believe the jihadist.

Such a waste of energy

Nick Cohen is getting careless. On the Guardian Web site, a recent Cohen column with the uncompromising headline “Face up to the truth” is now prefixed with the following health warning:

The comment piece below was wrong to say that the composer Karlheinz Stockhausen was ‘delighted’ at the attack on the World Trade Centre, describing it as ‘a great work of art’. In fact, Stockhausen made a statement to the effect that he believed the devil was still an active force in the world and condemned the attack as ‘Lucifer’s greatest work of art’. Apologies.

And what are we to make of this?

In 1989, the number of sexual offences recorded by the police shot up. … The Home Office’s statisticians took a hard look at their data, and noticed a peculiar increase of 500 in the number of arrests for indecency. Odder still, 350 of the arrests had been made in Slough or, more specifically, in the public conveniences in Slough town centre.

In 1988, there had been just six. Within a year, Slough had become the San Francisco of the south, the Sodom of suburbia. The Home Office dug deeper. Its researchers found that one of the local police commanders had firm views on the homosexual question and had ordered handsome PCs to go to the lavatories and arrest any man who tried to seduce them. The purge of Slough’s lavatories sent recorded indecency offences in Britain back towards the highs of the 1950s, when homosexuality was illegal. Until, that is, the policy changed and Thames Valley Police pulled its men out of the cottages.

Slough’s gays carried on cruising, but their assignations were no longer recorded. The crime figures depended on what the police were looking for and what the police counted.

The broader point, in this case, is reasonable – the last sentence is an essential caveat for anyone dealing with crime statistics – but the way Cohen gets there is distinctly questionable.

Here are the figures (from the Home Office Web site):


Well, yes, there was a spike in 1989, and the figure recorded had only been surpassed in 1954 and 1955. Beyond that, though, Cohen’s account of these figures is alarmingly slipshod. First, a minor but significant point: the figures didn’t go up by 500 between 1988 and 1989, but by over 700. This in itself suggests that Cohen’s story is a little too neat: if Slough’s extra 344 arrests had been added to the 1988 total, the result would have been a spike of 1,650, well above the levels of the mid-eighties but below the levels recorded in 1974, 1975 and 1978. (All together now: The British police are the best in the world…)

Second, the law. Cohen’s reference to “the 1950s, when homosexuality was illegal” sounds plausible, but in fact it’s irrelevant twice over. On one hand, the Wolfenden reforms weren’t introduced until 1967; (male) homosexuality was just as illegal in 1965 (when arrests were in the low 800s) as it was in 1955 (2,322) – or, for that matter, in 1949 (852). On the other hand, these arrests were for ‘gross indecency’, an offence which stayed on the statute book until 2003. The police devoted considerable resources to ‘gross indecency’ during the ‘Great Purge’ of the mid-1950s, then gave it a lower priority in the run-up to Wolfenden. However, there was another period of high arrest rates in the mid-1970s, followed by another trough in the early 1980s. Against this background, the 1989 spike looks less like an aberration caused by an individual police force, and more like an abortive third peak. (Before 1989, it’s worth noting, arrest numbers had risen for three years in succession.) In other words, it looks as if the situation developing in 1986-9 paralleled 1950-3 and 1970-3 – the difference being that the Home Office reined in police forces (not only in Slough) earlier and more sharply than it had done on previous occasions.

Taking the 1989 spike out of context, then blaming it on one off-message senior police officer, is hardly a shining example of intellectual honesty.

Intellectual honesty, however, is Nick Cohen’s stock in trade; we have it from the man himself. Cohen made a brief appearance on a Crooked Timber comment thread recently. Both the tone and the content of his intervention are interesting, so I’ll quote it in full:

Look, I’ve learned after the last few years not to appeal to basic principle or to imagine that those who say they’re leftists are within one thousand miles of the left. But after being sent to this thread by Harry I’m genuinely curious: didn’t you people take my reference to the best and the brightest to refer to the democrats, liberals, women—and, yes, for there are still a few—socialists who are being slaughtered in the Middle East?
Can one person here name one genuine secular democratic party in Iraq—or Iran, or Syria or Palestine—they support and which acknowledges their support?
If your answer is no, and you fully understand why it is no, you may at least, after all this time, be experiencing the novel thrill of intellectual honesty.

The argument is stark and simple, not to say simplistic. I am True Left, you are False Left. I am intellectually honest, you are congenital liars.

Perhaps the most interesting characteristic of this line of argument is its insulation against any possible rebuttal. It doesn’t greatly matter what Cohen’s opponents say in reply, because he already knows they’re liars. This, of course, is an appallingly dangerous train of thought, reminiscent of the mentality of commissars and heresy-hunters through the ages: if those who oppose you are also liars, you won’t accept new information unless it supports your existing position. We’re back with Caliph Omar, who (apocryphally) ordered the burning of the Library of Alexandria on the grounds that it contained works which conflicted with the teachings of the Qur’an; on being told that some of the works in the library were in conformance with the Qur’an, the Caliph replied that they could be burned as well, as they were clearly surplus to requirements.

Ironically, Cohen appears to be well aware of the shortcomings of his current position, although he associates it with his opponents:

The least attractive characteristic of the middle-class left – one shared with the Thatcherites – is its refusal to accept that its opponents are sincere. The legacy of Marx and Freud allows it to dismiss criticisms as masks which hide corruption, class interests, racism, sexism – any motive can be implied except fundamental differences of principle.

I think Cohen’s describing a real problem here, but I don’t know what Marx is doing in there (let alone Freud). I blame the rationalism which goes along with a certain kind of commitment to bodies of ideas. (As the anarchists used to say, ‘theory’ is when you have ideas, ‘ideology’ is when ideas have you.) The logic goes like this. You know that you’re a reasonable and well-intentioned person, in possession of the facts; and that you’re on the Left; and that you believe in policies X, Y and Z. I tell you that I don’t believe in X, Y and Z – perhaps even that I oppose those policies – but that I am also a reasonable, well-intentioned and well-informed Leftist. But your beliefs are underpinned by a rational assessment of the facts and a freely-chosen commitment to Leftist principles. My beliefs are therefore wrong. I am clearly mistaken in thinking of myself as a Leftist; if I persist in maintaining that I am, I should be resisted and denounced. Cue Caliph Omar: if I am trustworthy, I will agree with what you already believe; if I disagree with you, I am untrustworthy and can be ignored.

I agree with Cohen that this mentality is distressingly common on the Left: I’ve criticised Chomsky along these lines before now. What Cohen seems not to have registered is that the Leftists he prefers are not immune: witness Geras’ recent tirade against writers whose articles he interprets as erring on the side of apologia for terrorism (or, as Geras puts it, against apologists). Nor, sadly, is Cohen himself.

Postscript: here’s Cohen, back in February:

Over the past year, I’ve been astonished and delighted by the quality of British political blogs. What’s happened reminds me of the punk explosion when I was a teenager. People are ignoring the established system and beating it at its own game. Obviously, there’s a great deal of dross, but what is heartening is how much original and intelligent journalism is coming from people entirely outside the media class, whose only chance of talking to the world would once have been confined to a few paragraphs on a letters’ page or a few minutes on a radio phone-in.

As I’m on the left I started out with Harry’s Place, Normblog and Socialism in an Age of Waiting. But as my confidence has grown I find myself zooming all over the net and listening to people I would have crossed the street to avoid in the past. I’ve also realised with a feeling close to despair that if I write a lot of nonsense, it will be exposed and dissected.

We try, Nick. We try.
