Category Archives: geekage

Marginal notes – 2

The story so far:

I looked at the size of Labour’s majority over the Conservatives – or vice versa – in the most marginal Labour/Conservative battleground seats, in general elections over the last twenty-odd years, i.e. going back to 1997 and New Labour. … Labour’s offensive battleground seems to be very much the same terrain as the area it needs to defend. In both cases, we’re looking at former safe Labour seats where a substantial majority was allowed to trickle away over successive elections – between 2001 and 2005, 2005 and 2010, 2010 and 2015; and in both cases, in 2017 two-thirds of these seats saw either a Labour gain or a substantial cut in the Tory majority.

All but two of the 40 marginals I looked at in that post were held by Labour in 1997; 28 went to the Tories between 2005 and 2015, of which 13 were regained in 2017. Moreover, in all but three of the 40 the Labour relative vote share fell in both 2005 and 2010; in 21 of them it fell in 2005, 2010 and 2015, then rose in 2017.

If these results generalise beyond the marginals, then we can conclude that

  1. Labour has had some bad elections – some elections that really cried out for a thorough rethink of the party’s goals, branding, resources and personnel.
  2. 2010 was definitely one of them, and you wouldn’t call 2005 or 2015 examples of best practice. (“He won three elections!” Yes, about that third one…)
  3. 2017, on the other hand, definitely wasn’t one of them. If you forget about the internal party politics and look at the results through an entirely pragmatic, vote-maximising lens – or view them from Mars, through a telescope which registers party names and vote numbers but nothing else – what leaps out is that 2017 was an astonishingly good result by the standards of the previous three elections; a result so good, you could say that the party of the 2005-15 elections didn’t really deserve it. (But then, it wasn’t the party of the 2005-15 elections that did it.)

But I’m getting ahead of myself. We can draw those conclusions, if these results apply generally. Do they?

A bit of methodology. First, I got hold of constituency-level election results for UK general elections from 1997 to 2017. What I’m interested in is the Labour/Conservative relative vote shares, so I limited my scope to England. Then I eliminated all seats which – across that 20-year period – had ever been held by a third or fourth party, or an independent: goodbye to the Speaker, to Richard Taylor and George Galloway, to Caroline Lucas and to the Liberal Democrats.

So far so straightforward. The next step was more of a leap in the dark: matching constituencies between the 2015 and 2017 results or between 2001 and 2005 was easy enough, but what to do about the 2008 boundary review? In the end I took the quick-and-dirty approach (political scientists, look away now) of treating every constituency with the same name as the same constituency. (Although when I say ‘the same name’… The 2008 reviewers had an infuriating habit of switching names around to make them more logical – main piece of information first – so goodbye West Loamshire, hello Loamshire West! That made for a fun evening’s work.) In addition to name-matching, I matched manually in a few cases where a post-2008 constituency was identified with a pre-2008 one by (I did say to look away) the Wikipedia entry on the Boundary Commission. This isn’t ideal; I’m sure there are constituencies out there with the same name pre- and post-2008 and vastly different boundaries, just as I’m sure that I’ve missed some renamed seats with more or less the same boundaries. If I were doing this for anything more enduring (or rewarding) than a blog post, I would do it properly and assess each of the 500+ constituencies individually. But I’m not, so I haven’t.
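For the terminally curious, the quick-and-dirty matching looks roughly like this in code – a minimal Python sketch, with made-up filenames and column names standing in for the real results files; the trick for the Loamshire problem is simply to sort the words of each name before comparing.

```python
# A minimal sketch of the quick-and-dirty matching described above.
# Filenames and column names ('constituency') are invented for illustration;
# real results files would need their own cleaning first.
import pandas as pd

def normalise_name(name: str) -> str:
    """Make 'West Loamshire' and 'Loamshire West' compare equal:
    lower-case, tidy punctuation, then sort the words so that their
    order no longer matters."""
    words = name.lower().replace(",", " ").replace("&", "and").split()
    return " ".join(sorted(words))

pre_2008 = pd.read_csv("results_2005.csv")     # hypothetical files
post_2008 = pd.read_csv("results_2010.csv")

pre_2008["match_key"] = pre_2008["constituency"].map(normalise_name)
post_2008["match_key"] = post_2008["constituency"].map(normalise_name)

# Seats that match automatically on the reordered name...
matched = pre_2008.merge(post_2008, on="match_key", suffixes=("_2005", "_2010"))

# ...and the leftovers, to be matched by hand (Boundary Commission notes,
# Wikipedia) or dropped.
unmatched = post_2008[~post_2008["match_key"].isin(pre_2008["match_key"])]
print(len(matched), "matched;", len(unmatched), "to check by hand")
```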

I ended up with 421 constituencies – English constituencies in contention between Labour and the Conservatives – which can be categorised as follows:

  • 142 were held by the Conservatives at every election from 1997 to 2017
  • 157 were held by Labour at every election from 1997 to 2017
  • 119 were held by Labour in 1997 but lost to the Conservatives at one of the next five elections
    • of these, 29 were then regained by Labour (one in 2010, eight in 2015, 20 in 2017)
  • 2 (Canterbury and Kensington) were held by the Conservatives from 1997 to 2015 but lost to Labour in 2017
  • one (South Dorset) was won from the Conservatives in 2001 and lost again in 2010

Discarding the last two oddball categories gives us three similar-sized groups to analyse, across a series of six elections.

One final methodological note: the measure being used here is relative vote share, a phrase which here means “Labour vote % minus Conservative vote %”. Since my dataset excludes Lib Dem and minor-party seats, this is usually the same figure as the majority expressed as a percentage (or the majority multiplied by -1 for a Conservative seat). Usually, but not invariably: although none of these seats has ever gone to a third party, a number of them have had either the Lib Dems or UKIP in second place at some of these elections. If I was doing a professional job, I could have addressed this complication by adding a new dimension to the analysis, cutting down the dataset or a combination of both. As I’m not, I turned a blind eye and simply measured the Labour-Conservative difference in all cases.

Now for some charts. First, here are those 119 Labour losses, and when they were lost. In this chart – and most of the others – I’m adopting the convention of treating Labour gains from the Tories as positive numbers and Tory gains from Labour as negatives. A bit partisan, perhaps, but I am specifically looking at gains and losses as between those two parties, and this makes it easier to see what’s happening.

Labour seat gains and losses, 1997-2017

Every time I see this chart I think I’ve accidentally deleted the label on the 2010 ‘loss’ bar. Scroll down… oh, there it is. Basically 2001 saw a bit of slippage compared to 1997, and 2005 was a bad result – but 2010 was an appalling result. There was a bit of fightback in 2015 and a lot of fightback in 2017, but we’re still a long way short of where we were, thanks largely to those losses in 2005 and 2010 – substantial losses and huge losses, respectively.

The next series of charts shows loss and gain in relative vote share. The bars represent the numbers of seats in which Labour’s vote share relative to the Conservatives went up or down (by any amount) at each election. Since the total number of seats doesn’t change from one election to the next, the bars in each chart stay the same overall length, but with larger or smaller portions above the origin line.
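For anyone who wants to see how the bars are put together, here's a minimal sketch – the column names and the shape of the data are assumptions for illustration, not a record of what I actually ran – which works out the relative vote share, the change in it since the previous election, and the up/down counts for each election.

```python
# Sketch of the measure behind these bars, assuming a tidy DataFrame with one
# row per seat per election and (hypothetical) columns 'seat', 'year',
# 'lab_pct' and 'con_pct'.
import pandas as pd

def updown_counts(df: pd.DataFrame) -> pd.DataFrame:
    df = df.sort_values(["seat", "year"]).copy()
    # Relative vote share: Labour % minus Conservative %.
    df["rel_share"] = df["lab_pct"] - df["con_pct"]
    # Change since the previous election, within each seat.
    df["change"] = df.groupby("seat")["rel_share"].diff()
    df = df.dropna(subset=["change"])            # drops each seat's first election
    df["direction"] = df["change"].apply(lambda x: "up" if x > 0 else "down")
    # Seats up / down at each election: the two portions of each bar.
    return df.groupby(["year", "direction"])["seat"].count().unstack(fill_value=0)

# updown_counts(results)  ->  rows 2001..2017, columns 'down' and 'up'
```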

All seats:

Just look at those first three blue bars. Up and down the country, Labour threw away vote share in 2001; then we did it again even more widely in 2005, and then again in 2010 – with the (cumulative) results we’ve just seen. Again, 2010 stands out as a disaster, with near-universal vote share losses and almost no increases, even after the reductions in vote share over the previous two elections. (Curiously, while there were 31 seats showing an increased vote share in each of the 2005 and 2010 elections, there’s only one where vote share increased in both 2005 and 2010 – and it’s a safe Tory seat where Labour was in third place both times.) But then things look up in 2015 (with 92 more seats with increased vote share than decreased), and even more so in 2017 (222 more).

Here’s the same data for the “Labour losses” group of seats – the 119 seats featured in the first chart, including those that were retaken by Labour.

There isn’t much to say here, except “here’s that trend again” – and perhaps “no wonder they were former Labour seats”. The 2015 recovery is (proportionately) weaker here, but the 2017 rally is just as strong.

Here are the safe Conservative seats.

This is quite interesting. Naively, I wouldn’t have expected very much variation in the Labour vote in safe Tory seats, what with them being… well, safe Tory seats. Far from it: there were quite a few seats where Labour saw losses in vote share between 2001 and 2010, and many more where Labour’s vote share increased in 2015; as for 2017, in that year there were Labour increases in getting on for 90% of Tory seats. These are all seats that were Tory in 1997 and have been Tory ever since, so I wouldn’t want to read too much into this, but it is a strong trend; it suggests that there may be a substantial suppressed Labour vote out there, released by Corbyn’s – and, to be fair, Ed Miliband’s – new direction(s) for the party. (Perhaps the trouble with trying to poach Conservative votes by moving Right is that you end up giving Conservative voters no particular reason to switch.) One, two, many Canterburys!

To complete the set, here are the safe Labour seats, where the trends are a bit different.

Oddly, 2010 isn’t the nadir now, but represents a bit of an improvement on 2005 in terms of the numbers of seats showing vote share gains and losses. Nor is 2017 the peak fightback year; that would be 2015. I don’t know if the post-Iraq tactical voting campaign – or the Lib Dems’ anti-war positioning – had a huge effect on the 2005 vote, but if they did these are the kind of seats where you’d expect to see it. As for 2015 and 2017, from this chart we can already see that there were 30-something safe Labour seats where vote share went up in 2015 – Ed Miliband, hurrah! – and went down in 2017 – Jeremy Corbyn, ugh! As with 2005, these are perhaps the kind of seats where issues and debates within Labour are most likely to make themselves felt (albeit without any immediate effect on the results).

To look at those trends in a bit more detail, here are a couple of charts which need a bit more of an introduction. As we’ve seen there’s an overall tendency for the Labour vote share to drop between 1997 and 2001, then again in 2005 and (mostly) in 2010, before going up in 2015 and (mostly) in 2017. But how many seats actually follow this pattern – down, down, down, up, up – and how many are exceptions? If there are exceptions, what pattern do they follow? Can we distinguish between Tory, Labour and ex-Labour seats, or do the same trends apply generally?

Following a qualitative comparative analysis approach, I translated vote share change into a letter – D for (Labour relative vote share) down, U for up – giving a string of Ds and Us for each seat based on that seat’s successive changes in relative vote share. Since there are six elections overall, each seat has five letters, corresponding to the vote share changes in 2001, 2005, 2010, 2015 and 2017: DDDDD, DDDDU, DDDUD and so on. An ordered series of characters each of which can only take two values is just asking to be translated into binary digits, so that was the next step: DDDDD=0, DDDDU=1, DDDUD=2, and so on up to UUUUU=31. This meant that I could easily calculate frequency tables for the dataset and for each of the three main groups of seats, which in turn made it possible to visualise different patterns and their frequency.
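In code the whole thing is only a few lines; here's a hedged sketch of the D/U strings, the binary codes and the frequency tables, with the input format invented purely for illustration.

```python
# Sketch of the encoding: one letter per inter-election change in relative
# vote share (2001, 2005, 2010, 2015, 2017), the string read as binary with
# D=0 and U=1, and a frequency table per group of seats. The input format
# (seat name -> list of five changes) is invented for illustration.
from collections import Counter

def pattern(changes: list[float]) -> str:
    """e.g. [-3.1, -5.0, -8.2, +2.4, +9.7] -> 'DDDUU'"""
    return "".join("U" if c > 0 else "D" for c in changes)

def pattern_code(p: str) -> int:
    """'DDDDD' -> 0, 'DDDDU' -> 1, 'DDDUD' -> 2, ... 'UUUUU' -> 31."""
    return int(p.replace("D", "0").replace("U", "1"), 2)

def frequencies(changes_by_seat: dict[str, list[float]]) -> Counter:
    """How many seats follow each pattern, within one group of seats."""
    return Counter(pattern(c) for c in changes_by_seat.values())

# frequencies(safe_labour_changes).most_common()   # hypothetical input
```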

And that’s what you see here, albeit in slightly cut-down form; for simplicity I left the 1997-2001 period out of these charts. The four letters you see here thus correspond to up/down vote share changes in 2005, 2010, 2015 and 2017.

I’ve singled out five patterns – DDDD, DDDU, DDUD, DDUU and DUUU – for the sole reason that these were the only ones which occurred in the data in significant numbers. You’ll notice the prevalence of Ds in the first two positions (loss of vote share in 2005 and 2010) and Us in the fourth (increased vote share in 2017). The way that this first chart is arranged, the first four blocks reading up from the origin – from dark red up to mid-blue – represent all the seats in which the Labour relative vote share went down in both 2005 and 2010. That is, 84% of them: five out of six.

Here’s the same data ordered differently:

In this version the first five blocks reading up from the bottom – i.e. the red blocks – represent all the seats in which Labour’s relative vote share went up in 2017. Which is to say, 72% of them – nearly three quarters. The first two red blocks represent the DDxU pattern, i.e. “down in 2005 and 2010, up in 2017”: 65% of the total, 69% of former Labour seats and 81% of Tory seats. The exception – and the reason why that total isn’t higher than 65% – is the “safe Labour” group, where this pattern only applies to 48% of seats.

The message of the data is pretty clear. While there is some variation between different seats – and regional variation can’t be ruled out (see below) – across England there are some fairly consistent trends. Where 2017 is concerned, the only realistic conclusion is “we’ve had a terrible election, but this wasn’t it” (apologies to Groucho Marx). 2017 was better than 2015 – and 2015 was better than 2010, in much the same sense that vitamin C is better for you than cyanide. We on the Left have a great deal to be proud of and nothing to apologise for – except, perhaps, letting the culprits for the 2010 disaster off the hook, and not moving against them harder and more decisively. (This isn’t sectarianism; this isn’t a quest for ideological purity. We want a party that can win back vote share and gain seats, like the party did in 2017 – not one that loses vote share everywhere and loses seats by the dozen, like the party did in 2005 and 2010.)

Some generalisations about the different categories of seats are also possible – and about Labour seats in particular. Reading from the bottom of the chart:

  • In the Tory and ex-Labour groups around 40% of seats fit the DDDU pattern, compared to less than 5% of the Labour group
  • In the Tory and Labour groups around 40% of seats fit the DDUU pattern, compared to around 25% of the ex-Labour group
  • In the Labour group around 15% of seats fit the DUUU pattern, compared to less than 5% of the Tory and ex-Labour groups
  • In the Labour group around 20% of seats fit the DDUD pattern, compared to around 5% of the Tory and ex-Labour groups
  • In the ex-Labour group around 20% of seats fit the DDDD pattern, compared to around 5% of the Tory and Labour groups

Translated into English, Labour relative vote share has gone up at some point in 95% of Tory and Labour safe seats in England, and 80% of ex-Labour seats. Around 40% of Labour and Tory seats, and 25% of ex-Labour seats, showed an increased Labour vote share in 2015 and 2017 (only); around 40% of Tory and ex-Labour seats showed an increased vote share in 2017 (only). Among the Labour seats, smaller groups of seats showed increases either in 2015 alone or in 2010 as well as 2015 and 2017.

In short, if we compare Labour seats to all other seats, as well as a lot of commonality there are some significant differences: there is

  • a sizeable group of Labour seats (and very few others) where 2015 was the only recent election with an increased Labour vote share
  • a very small number of Labour seats (but sizeable numbers of others) where 2017 was the only election with an increased vote share
  • a sizeable group of Labour seats (and very few others) where 2010, as well as 2015 and 2017, saw increased vote share

This tends to suggest that – while most of them are living in the same world as the rest of us – non-negligible numbers of Labour MPs are living in a world where Corbyn and the 2017 campaign didn’t deliver the goods; or a world where Miliband and the 2015 campaign did; or a world where the disastrous result of 2010 wasn’t actually all that bad. The effect that these perceptions are likely to have had on their view of the Corbyn leadership – and their retrospective view of life before Corbyn – doesn’t need to be spelt out. These MPs can – and often do – speak eloquently about their own experiences and the threat that Labour faces in their locality, but they are not reliable sources on Labour’s situation nationally.

There’s also a sizeable number of ex-Labour seats – and not very many others – where Labour’s relative vote share has gone down at every one of the last four elections; this suggests that the loss of the seat to the Tories was part of a long-term trend in those areas, and one which hasn’t yet been reversed. To be precise, this pattern applies to 6 seats held by the Tories throughout the period, 8 held by Labour and 24 which went Tory at some point between 2001 and 2017. This would be worth investigating. A quick scan of the 24 seats and their former MPs on Wikipedia gives few pointers, other than to remind me that the 1997 wave swept some truly awful placemen and -women into the Commons: some are noted only for their loyalty (to Tony Blair); others made headlines in the local press during the expenses scandal; one became head of a local NHS trust on leaving Parliament, having continued to practise as a GP throughout. (I guess time weighs heavy when your only responsibility is being an MP.)

Geographically, it may be worth noting that the 32 1997-Labour seats in this group include

  • 4 in the North East
  • 5 in east Yorkshire
  • 7 in the east Midlands, and
  • 6 in Staffordshire

All of which are, perhaps, areas where Labour MPs had grown accustomed to weighing the vote rather than counting it; where weak local parties made for soft targets for incoming Blairites; and where, after five or ten years of New Labour, there just didn’t seem to be that much of a reason to keep up the old habit of voting for the red rosette, whoever wore it. That’s speculation; all I can say is that if I were one of the MPs for the eight seats in this group where Labour hung on in 2017 – Ronnie Campbell, John Woodcock(!), Helen Goodman (seat inherited from Derek Foster), Paul Farrelly (heir to Llin and before her John Golding), Ian Lavery (heir to Denis Murphy), Catherine McKinnell (heir to Doug Henderson), Ruth Smeeth (heir to Joan Walley) or Gareth Snell (heir to Tristram Hunt and before him Mark Fisher) – I wouldn’t be placing the blame for my 2017 performance on things that have changed since 2015. There’s a downward trend in those constituencies which was clearly established long before that – and the great majority of Labour seats, along with the great majority of English constituencies generally, broke that trend in 2017, if they hadn’t already broken it in 2015. It’s not him, it’s you.

Swings and… swings

We’re not still going on about the European elections and what happened to the Labour vote, are we. That’s a statement, not a question, and actually I’m quite disappointed that we aren’t; as soon as minor-party voting intentions dropped below 20%, and the shouting about ‘four-party politics’ subsided, people seem to have lost interest in what happened. But, while we are clearly back in the world of two ‘main’ parties, the Brexit Party and the Lib Dems do seem to have put quite a large dent in both the Tory and the Labour vote; it would be worth knowing whether this is likely to fade between now and, oh, say for example the end of October.

Fortunately, the Euro elections have been run before (who knew?) and – as I said in an earlier post – voters have shown a tendency to use the Euros to “send a message” before now. But what does this mean in practice? If we compared the Euro election vote with the previous General Election, we could establish that the Labour vote had dropped from 40% of a 69% turnout in 2017 to 14% of a 37% turnout in 2019, but what did that actually mean – particularly when Labour’s vote at the previous European election had been 24% of a 36% turnout, which was down from 35% of a 65% turnout in the previous General Election, which in turn was up from 15% of a 35% turnout at the Euro election before that? (Labour on 15% of the vote, eh? Dreadful! To be fair, Wikipedia says that Gordon Brown “faced calls for him to resign” after this result – but the linked news story shows that what he faced was calls to resign as Prime Minister, from the Leader of the Opposition. There doesn’t seem to have been any internal opposition to Brown – or if there was they kept their traps shut.)

Anyway, I tried for some time to work out the significance of 24% of 36% vs 40% of 69% vs 14% of 37% – or, failing that, to work out a way of representing the relevant figures in a readable chart so that I could see the significant bits – before it hit me that the only way to do it was to ditch the percentages and go back to the raw numbers. Which gives us these two little beauties. (Complete with titles. I’m spoiling you, I really am.)
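By way of illustration – and with a deliberately rough, assumed electorate figure, where the charts themselves use the actual totals – converting each "X% of a Y% turnout" into millions of votes looks something like this.

```python
# Toy illustration only: turn "X% of a Y% turnout" into millions of votes,
# using a rough assumed electorate of ~47m. The charts use the actual totals
# for each election rather than this shortcut.
ELECTORATE_M = 46.8   # millions - an assumption, held constant for simplicity

def votes_m(share_pct: float, turnout_pct: float) -> float:
    return ELECTORATE_M * (turnout_pct / 100) * (share_pct / 100)

print(round(votes_m(40, 69), 1))   # Labour, 2017 General Election: ~12.9m
print(round(votes_m(24, 36), 1))   # Labour, 2014 Euros: ~4.0m
print(round(votes_m(14, 37), 1))   # Labour, 2019 Euros: ~2.4m
```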

Top Tip #1: look at the X axis – and in particular look at the origin. The Y axis is not centred at zero – for reasons which will be obvious when you look at the Y axis. Everything above zero is an increase in votes – or rather in millions of votes – as compared to the previous relevant election; everything below the line is a decrease, in millions of votes. The first big thing to take away from these charts is just how asymmetrical they both are. At all but one General Election from 1997 to 2017, around 15 million more people turned out to vote than had done at the previous European election; the exception is 2005, and even then the rise in turnout was over 10 million as compared to the previous year’s Euros. The negative difference between General Election turnout and turnout in the next European election varies more widely, but again mostly ranges between 10 and 15 million; the exception is the 1999 European election, where turnout was down 20 million on the General Election of 1997. (There’s a story there – or a sub-plot – about voters getting swept up in high-enthusiasm, high-turnout elections, and coming down to earth when they’re asked to vote again a couple of years later. (“What, another?”)) The main point here is that the story of the difference between a Euro election – any Euro election – and the previous General Election is not a story of swings and voter movements; it’s primarily a story of voters staying at home, or rather of who stays at home. Who stays at home, and who goes out muttering “voting? damn right I’m voting, this‘ll show ’em…”.

Top Tip #2: trend first, anomaly second. Is there a trend? We can’t understand what people are doing now without having some idea of what they were doing previously. Were voters behaving in a particular way for the run of Euro elections before 2019, and/or the run of General Elections before 2017? Fortunately in this case the trend is pretty clear; look at the columns for 2004, 2009 and 2014 in the first chart, and those for the General Elections in the following year – 2005, 2010 and 2015 – in the second chart. What do you see? In 2004, 2009 and 2014, between thirteen and seventeen million people who had voted for one of the three major parties in the previous General Election – four to seven million ex-Tory and ex-Labour voters and two to six million ex-Liberal Democrat voters – didn’t; while about four million people who hadn’t voted for the Greens or UKIP at the previous General Election, did (in a ratio of a million Greens to three million Kippers). Some people stayed loyal; a lot of people stayed at home; a minority of people cast a protest vote – and that minority was made significant by the low turnout. The chances are that most of the Euro Kippers had voted Tory rather than Labour or Lib Dem at the previous General Election – and that the opposite is true of the Euro Greens – but this is less important than the scale of these numbers: the main thing that happened at all those elections was abstention. Relative to the previous General Elections, the Tory vote fell by between half and two-thirds, Labour’s by between half and three-quarters and the Lib Dems’ by between half and five-sixths. For the most part this wasn’t a swing to anyone; the total combined Green and British nationalist vote at each of those European elections was, at most, half of the Tories’ vote at the previous General Election.

Now look at the second chart. Relative to the previous years’ Euro elections, in 2005, 2010 and 2015 the major parties are up thirteen to seventeen million votes. (Labour: up five to six million; Tories: up four to six million, and seven million in 2015; Lib Dems: up three to five million, and one million in 2015. That coalition was powerful stuff.) The Greens and British nationalists, on the other hand, are down a total of three and a half million in 2005 and 2010, and one million in 2015. Again, we can assume that these voters went back to their ‘home’ parties – and we can assume that the British nationalists probably went back to the Tories and the Greens probably didn’t – but, again, this is much less important than the change in turnout, which in each case was up by 10-15 million as compared with the previous European election. The swing away from UKIP and the Greens was far less important in determining those results than the swing away from the sofa.

So those are the trends. What about the last couple of elections? 2017, as you may remember, saw an unusual election campaign and an unusually high degree of polarisation between the two main parties. Relative to the 2014 European election, the Labour vote was up by nearly nine million and the Tories’ by nearly ten million, three or four million more than the increase in 2015. The Lib Dems, by contrast, only put on a million relative to 2014 – and, since I’ve measured both elections relative to 2014, this was effectively the same million that they’d put on in 2015 (in other words, the party’s vote was almost completely unchanged from the previous General Election; in fact it was down a bit). The Green and British nationalist votes fell by a total of five million relative to 2014 – but, again, the main swing was the swing away from not voting at all: overall turnout was up by nearly sixteen million. These were familiar changes, in other words, but on a larger scale than usual: compared to the 2014-15 vote changes, the rise in turnout, the rises in the Tory and Labour votes, and the decline in British nationalist votes were, respectively, 1.5 million greater (+11%), 2.3 million and 3.6 million greater (+30% and +67%), and 3.3 million greater (+330%). Presumably some Euro-election Kippers swung to Labour in 2017, but the numbers won’t have been huge. The main effects were turnout effects, as usual, but on a larger scale: the Tories were better than usual at getting out the vote, while Labour were a lot better than usual. Also, thanks to the EU Referendum seeming (temporarily) like old news, both parties did better than they had done in 2015 at calling roving Kippers home.

What happened in 2019? Those bars look pretty big, but I wonder if there’s less there than meets the eye. Over and over again, we’ve seen what are at first blush fairly huge movements of voters, between General Election and the following European election, followed at the subsequent General Election by an equally huge movement in the opposite direction; the burden of proof is surely on anyone maintaining that this time is different. So, this time, the Labour and Tory votes – having gone up by 8.9 million and 9.8 million between 2014 and 2017 – are right back down again, dropping by 10.6 million and 12.1 million respectively; so too the British nationalist vote, having gone down by 4.3 million between 2014 and 2017, is up again, by 5.2 million. There’s a story, perhaps, in the ‘extra’ four million votes that the big parties lost, and the extra 0.9 million British nationalist votes; polarisation is increasing, even if it’s only at the margins. But it is at the margins – once again, there are some relatively small voter movements which have been made to look much bigger by the one big movement, the (usual) slump in turnout. (The Brexit Party topped the polls with 5.2 million votes; a party gaining that many votes would have been in a narrow third place at the General Elections of 1997 and 2001, and a firm fourth place in every other General Election from 1983 to 2010.) There’s also a story in the results for the Lib Dems, who – for the first time ever – appear to have been seen as one of the ‘alternative’, ‘insurgent’ parties, and actually increased their vote as against the General Election; they put on a million votes as compared to 2017. But, just as the crash in votes for Labour and the Tories needs to be set against the unusually high votes for those two parties in 2017, the Lib Dems’ result needs to be set against their own crash in 2015 and their non-recovery in 2017: their total of 3.4 million votes, although higher than the party’s vote in those two General Elections, is lower than their vote at any other General Election the party has ever contested. To find a General Election vote lower than 2017’s 2.4 million you need to go back to 1970, and even that represented a higher proportion of the (then) electorate than the 2017 result (5.4% vs 5.1%); in those terms Farron plumbed depths that the Liberals hadn’t seen since the 1950s and Jo Grimond’s leadership. All credit to the Lib Dems for their outstandingly clear – if opportunistic and misleading – positioning in the Euros; arguably they’ve reaped a deserved reward. But it’s also arguable that there’s only so low that the Lib Dem vote can go – Farron’s 2.4 million was lower than the party’s vote in four of the previous eight European elections. Really, after 2017 the only way was up – just as, for both the Tories and Labour, the only way was down.

What of the narratives? What of Theresa May’s Brexit strategy hitting the rocks and Farage moving in to pick up the survivors? What of Labour’s Brexit fence-sitting and the Lib Dems’ positioning as the party of Remain – what of the potential Remain Alliance, the Lib Dems and Greens piling up the votes while Labour’s vote plummeted? I think you’ll find it’s a bit less exciting than that. The 2019 results showed both Labour and the Tories doing a bit worse than might have been expected, the Brexit Party doing a bit better (at the expense of the Tories) and the Lib Dems doing substantially better (at the expense of both Labour and the Tories). But they’re not wildly out of line with earlier trends. Perhaps polarisation is increasing, but only at the margins: the main trend at this European election was abstention, just like it always is. Vote flows are a pain to model, but arithmetic is a limiting factor. The Labour and Tory votes were down (relative to 2017) by ten and twelve million respectively; the total votes for the Lib Dems plus the Greens, on one hand, and BXP plus UKIP and all the minor British nationalist parties, on the other, were 5.4 million and 5.8 million respectively.

What that means is that, in and of themselves, these figures don’t give any reason to believe that voters won’t be returning en masse to Labour and the Tories at the next high-turnout election – just as they did in 2005, 2010 and 2015, as well as 2017. In particular, if the next election follows the pattern of 2017, with a highly polarised campaign and a focus on getting out the vote – and why wouldn’t it? – we could easily see a similar bulge in the Labour vote. And if that’s followed by yet another slump – complete with the obligatory prophecies of doom and calls for Corbyn’s resignation – at the European elections in 2024, that’s a price I’d be prepared to pay.

For he is good to think on, if a man would express himself neatly

My cat lies to me. I find this interesting.

My cat – our cat, rather – generally eats tinned food, but occasionally we give him cat biscuits. Not very often, and certainly not often enough as far as he’s concerned. He knows where they’re kept; when hungry he will often sit in front of the biscuit cupboard giving it meaningful looks, even if he’s got a bowl full of food.

That’s not the interesting thing, though. What’s interesting is that, on several occasions, he’s sat by the back door and mewed to be let out, only to turn back and head for the biscuit cupboard when I open the door for him. The thinking is fairly straightforward, if you think of it as thinking – it goes roughly like this:

This‘ll get his attention!

But there’s an awful lot going on under the surface, particularly when you think that we’re dealing with a cat. How do you get to that thought? Or, if ascribing thoughts to a cat is a step too far, how do you get to that action? It seems to me that any creature capable of doing the back-door feint would have to go through something like this series of steps:

  1. Move (instinctively, or at any rate unreflectively) towards the back door when wanting to go out
  2. Move (unreflectively) towards the biscuit cupboard when fancying a biscuit or two
  3. Observe that move 1 is usually successful
  4. Observe that move 2 is usually unsuccessful
  5. Analyse events involved in successful outcomes to strategies 1 and 2
  6. Identify common factor, viz. getting a human’s attention
  7. Reflect on goals of move 1 and move 2
  8. Identify common intermediate goal of getting human’s attention
  9. Redefine move 1 as move which achieves intermediate goal
  10. Plan to make move 2 more effective by preceding it with move 1, thus getting human’s attention before expressing interest in biscuit cupboard

I don’t know about you, but that strikes me as pretty sophisticated thinking, particularly if we assume (as I think we must) that none of these thought processes are conscious.

Cats: they’re brighter than they look. Or rather, they really are as bright as they look.

Come write me down

There’s a particular form of serendipity that comes from learning something in one area which resolves a puzzle, or fills a gap in your thinking, in another area entirely. It’s all the more serendipitous – and pleasant – if you didn’t realise the gap was there.

This line of thought was prompted by this piece on the excellent FactCheck blog, which made me realise that I’d always been a bit dubious about the notion of “policy-based evidence”. OK, it’s a neat reversal – and all too often people who say they’re making evidence-based policy are doing nothing of the sort – but is the alternative really policy-based evidence? Doesn’t that amount to accusing them of just making it up?

Thanks to Cathy Newman at FactCheck, I realise now that I was looking at this question the wrong way. Actually “policy-based evidence” means something quite specific, and it hasn’t (necessarily) got anything to do with outright fraud. Watch closely:

Iain Duncan Smith has been celebrating the government’s benefits cap. Part of the welfare reform bill, state handouts will be capped at £26,000 a year so that “no family on benefits will earn more than the average salary of a working family,” i.e. £35,000 a year before tax.

Today, the work and pensions secretary was delighted to cite figures released by his department which he said were evidence that the policy is already driving people back into work. Of 58,000 claimants sent a letter saying their benefits were to be capped, 1,700 subsequently moved into work. Another 5,000 said they wanted support to get back into work, according to the figures.

OK, this is fairly simplistic thinking – We did a new thing! Something happened! Our thing worked! – but it’s something like a legitimate way to analyse what’s going on, surely. It may need more sophisticated handling, but the evidence is there, isn’t it?

Well, no, it isn’t.

In order to know how effective the policy had been, we would need to know the rate at which people on benefits worth more than £26,000 went into work before the letter announcing the changes was sent, and compare it to after the letter was received. But those figures aren’t available.

“[These figures do] not reveal the effect of the policy,” Robert Joyce, senior researcher at the Institute for Fiscal Studies told us. Mr Joyce went on: “Indeed, this number is consistent with the policy having had no effect at all. Over any period, some fraction of an unemployed group will probably move into work, regardless of whether a benefits cap is about to be implemented. The number of people who moved into work as a result of the policy is 1,700 minus the number of people who would have moved into work anyway. We do not know the latter number, so we do not know the effect of the policy.”

The number of people, in a given group of claimants, who signed off over a given period is data. Collecting data is the easy part: take five minutes and you can do it now if you like. (Number of objects on your desk: data. Number of stationary cars visible from your window: data. Number of heartbeats in five minutes: data.) It’s only when the data’s been analysed – it’s only when we’ve compared the data with other conditions, compared variations in the data with variations in those conditions and eliminated chance fluctuations – that data turns into evidence. The number of people who moved into work as a result of the policy is 1,700 minus the number of people who would have moved into work anyway: that number would be evidence, if we had it (or had reliable means of estimating it). The figure of 1,700 is data.
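To put the IFS point into arithmetic: the calculation we would want to do looks like the sketch below, except that the baseline figure is precisely what we don't have – the 2.5% used here is pure invention, for illustration only.

```python
# The calculation we'd need, in miniature. The baseline rate below is pure
# invention - it stands in for exactly the figure the DWP numbers don't give.
letters_sent = 58_000
moved_into_work = 1_700                    # data

assumed_baseline_rate = 0.025              # hypothetical: 2.5% of claimants
                                           # would have found work anyway
would_have_moved_anyway = letters_sent * assumed_baseline_rate    # 1,450

# This is the number that would count as evidence - and it's unknowable
# without a real baseline.
policy_effect = moved_into_work - would_have_moved_anyway
print(policy_effect)                       # 250, on this invented assumption
```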

One final quote:

A spokesman for the Department for Work and Pensions said: “The Secretary of State believes that the benefits cap is having an effect.”

Et voilà: policy-based evidence.

Let memory fade

It’s a small enough thing, but this is profoundly depressing.

Of 360 posts to be cut, 120 are from Future Media & Technology, up to 90 from BBC Vision, up to 39 from Audio & Music, 17 from Children’s, 24 from Sport and 70 in journalism from national news and non-news posts on regional news sites.

Outlining its plans today, the BBC said it will meet with commercial rivals twice a year to clarify its online plans, increase links to external sites to generate 22m referrals within three years and will halve the number of top level domains it operates.

The corporation also outlined five editorial priorities for BBC Online and clarified its remit. The BBC aims to meet all these objectives, and make 360 posts redundant, by 2013. The restructured BBC Online department will consist of 10 products including News, iPlayer, CBeebies and Search. Editorial will be refined, with fewer News blogs, and local sites will be stripped of non-news content. Blast, Switch and h2g2 are among the sites to be ditched. Other closures will include the standalone websites for the BBC Radio 5 Live 606 phone-in show and 1Xtra, 5 Live Sports Extra, 6 Music and Radio 7 digital stations.

In all, the BBC is pledging to close half of its 400 top level domains – with 180 to be gone ahead of schedule later this year.

(That’s top level directories, people – the word that goes after “bbc.co.uk/”. The top level domain is “.uk”.)

The BBC’s Web presence is vast, sprawling and a bit anarchic – a quality it has in the past shared with the groups of people responsible for it. (Back in 2002 I made a concerted effort to get some writing work from the technology bit of BBC Online, a task made more difficult by the impossibility of finding any personal contact information on the site. Sustained and ingenious googling eventually rewarded me with a name and a phone number(!). I rang it and spoke to the right person, only to be told that he’d moved to BBC History and was about to move on again. On the other hand, before he left he did commission me to write a 12,000-word timeline of English history from the Romans to Victoria, so it wasn’t as if no good came of it.) There is an awful lot of good stuff there, much of it user-generated, and lots of little online communities that have grown up to support it. And yes, the bits that the corporation pays for are ultimately paid for out of the licence fee, meaning that they don’t have to make money and hence have an advantage over commercial rivals which do. This is a good thing: there are lots of worthwhile things that can be done very easily with a small subsidy, but can only be done with great difficulty, if at all, on a profit-making basis. There is no earthly reason why a corporation which doesn’t have to make money – and can afford to chuck a few grand around here or there – should behave as if it did and couldn’t. No reason, apart from political reasons. So now BBC Online are going to have a “clarified remit”, and they’re going to show their plans to commercial rivals (!) twice a year (!!), and 360 creative people are going to walk.

What really gives this announcement the smell of wanton vandalism – wilful and ignorant destruction – is the part about all the sites that are going to close. Not the fact that they keep getting the terminology wrong – that’s a minor niggle – but the fact that all these sites aren’t going to be kept up as static pages; they’re not even going to be archived. Like all those old Doctor Whos and Not Only… But Alsos, they’re just going to disappear. (All except H2G2, which is going to be sold – news which leaves me feeling relieved but slightly baffled.) Two cheers for the Graun, which put up the whole list but couldn’t resist playing it for laughs – “Ooh look, there’s a site for Bonekickers – that was rubbish, wasn’t it? Let’s see, have they got Howard’s Way?” There isn’t a Howard’s Way site. There is, however, Voices, Nation on Film, the inexhaustible Cult and a curious online mind-mapping thing called Pinball. Check them out while you can. And do take a look at WW2 People’s War, a truly extraordinary work of amateur oral history, which contains… well, here it is in its own words:

The BBC’s WW2 People’s War project ran from June 2003 to January 2006. The aim of the project was to collect the memories of people who had lived and fought during World War Two on a website; these would form the basis of a digital archive which would provide a learning resource for future generations.

The target audience, people who could remember the war, was at least 60 years old. Anyone who had served in the armed forces during the war was, at the start of the project, at least 75. Most of them had no experience of the internet. Yet over the course of the project, over 47,000 stories and 14,000 images were gathered. A national story gathering campaign was launched, where ‘associate centres’ such as libraries, museums and learning centres, ran events to help gather stories. Many hundreds of volunteers, many attached to local BBC radio stations, assisted in this.

The resulting archive houses all of these memories. These stories don’t give a precise overview of the war, or an accurate list of dates and events; they are a record of how a generation remembered the war, 60 years or more after the events, and remain in the Archive as they were contributed. The Archive is not a historical record of events, nor a collection of government or BBC information, recordings or documents relating to the war.

47,000 stories! I’ll declare an interest here: the site also contains “historical fact files on 144 key events”, about 40 of which I wrote. (I found the other day that 16 of them have also migrated to the main WWII page, where I guess they will hang on after the cull.) I hate seeing my work go offline, but that’s not the main thing. The main thing is that I know how much work and care went into each of my pieces; the thought of multiplying that by a factor of, well, 47,000, boggles me. And then to snuff all of that out for the sake of saving a few gigabytes of disk space – or, more realistically, for the sake of making the BBC look as if it’s not competing unfairly with its commercial rivals – beggars belief.

Perish the thought that something hugely worthwhile and massively popular, which ITV and Sky can’t do and don’t want to do, should get done for no other reason than that the BBC can do it and do it well. Perish the thought that public money should be spent on capturing irreplaceable memories and assembling them into “a digital archive which would provide a learning resource for future generations”. Perish the thought that a public service media organisation should actually provide a public service. Utter, wanton vandalism.

Later on we’ll conspire

Answer me this: after receiving gifts on Christmas Day, what is it that children proverbially do with them within 24 hours? Do they (a) break them or (b) give them away?

Fairly straightforward, I think you’ll agree. Now let me put to you a second and superficially unrelated question. When a man loves a woman, as we know, he can’t think of nothing else; he will, indeed, trade the world for the good thing he’s found. But if he should happen to entrust his heart, metaphorically, to someone undeserving of his affections, what is the uncaring female proverbially said to do with it? Does she (a) break his heart or (b) give his heart away to someone else (this could perhaps take the form of fixing him up with a friend)?

I think it’s clear that option (a) is appropriate in both cases. So if one were to write a Christmas song likening these two situations, it would be sheer perversity to write anything other than

Last Christmas, I gave you my heart
And the very next day you broke it

See? See? It makes sense now! Honestly, if they’d got that right in the first place it could have been a big hit.

Update It has been brought to my notice that there are strong reasons to object, even in theory, to the notion of a daily round of Christmas-themed celebration. The obvious objection to a daily Christmas is that this would rapidly have adverse effects in terms of exhaustion, obesity, alcohol poisoning and so forth. However, these effects could easily be mitigated, or even eliminated, by simply varying one’s level of indulgence in Christmas cheer and jollity, over the weeks and months of perpetual Yule. A less tractable problem is presented by the irreducible necessity of preparation. When, under the daily-Christmas regime, would any of us have time to buy food, presents, cards, bottles of winter ale, bags of Bombay mix and other such essential accoutrements of the season? I feel that this objection has considerable force, and would therefore propose that any future songwriter working in this thematic area should write something along the lines of

I wish it could be Christmas every other day

Once again, I think you’ll agree that this would represent a considerable improvement.

Ho ho ho.

Writing frightening verse

The papers have been all over Will Reader‘s presentation at the British Association Festival of Science; the Guardian alone has run two separate stories by James Randerson, headed “Social networking sites don’t deepen friendships” and, more bluntly, “Warning: you can’t make real friends online”.

I’ve been socialising online for over ten years now, and I’m pretty sure I’ve made (and lost) some real friends in that time, so I think that second headline is a bit silly. More to the point, I think presenting the story that way risks creating controversy rather than debate: I know that I’ve made friends online, you know that they can’t have been real friends, we shout at each other in comments boxes for a few days and entertainment results. (Possibly. Traffic results, anyway.)

What Reader is reporting is more nuanced and more tentative than that. From the Graun‘s story (the one with the ‘warning’ headline):

The team asked more than 200 people to fill in questionnaires about their online networking, asking for example how many online friends they had, how many of these were close friends and how many they had met face to face. The team found that although the sites allowed contact with hundreds of acquaintances, as with conventional friendship networks, people tend to have around five close friends.

Ninety per cent of contacts whom the subjects regarded as close friends were people they had met face to face. “People see face to face contact as being absolutely imperative in forming close friendships,” added Dr Reader. He told the British Association Festival of Science in York that social networking sites allow people to broaden their list of nodding acquaintances because staying in touch online is easy. “What social network sites can do is decrease the cost of maintaining and forming these social networks because we can post information to multiple people,” he said.

But to develop a real friendship we need to see that the other person is trustworthy, said Dr Reader. “What we need is to be absolutely sure that a person is really going to invest in us, is really going to be there for us when we need them … It’s very easy to be deceptive on the internet.”

The results are interesting – although ‘more than 200’ sounds like a pretty small sample – and Reader’s interpretation seems pretty reasonable. But I part company with him in the last couple of sentences quoted here: the major problem with online sociality is not the lack of identity verification. I’ve been on a couple of mailing lists for several years; there are fifty or sixty people I’ve known, online, for longer than I’ve known many of my real-world friends. We use our real names on those lists; we talk about work, family and relationships; we occasionally arrange meetings.

All in all, the scope for deliberate deception is very limited. Nevertheless, I wouldn’t call every one of those fifty or sixty people a close friend. The point isn’t that online relationships are a fraudulent imitation of emotionally real relationships, which are demanding and require commitment; the point is that online relationships have their own emotional reality, which is relatively uncommitted and relatively undemanding. There’s a broader truth to Clay Shirky’s pessimistic comments about the Howard Dean campaign, which I wrote about back here:

the pleasures of life online are precisely the way they provide a respite from the vagaries of the real world. … the difference between “would you” and “will you” is enormous — when “would you use this product?” changes to “will you use it?”, user behaviour frequently changes dramatically. “Would you vote for Howard Dean?” and “Will you vote for Howard Dean?” are two different questions, and it may be that a lot of people who “would” vote for Dean, in some hypothetical world where you could vote in the same way you can make a political donation on Amazon, didn’t actually vote for him when it meant skipping dinner with friends to drive downtown in the freezing cold and venture into some church basement with people who might prefer some other candidate to Dean.

Similarly, with the best will in the world there’s a difference between Would you put yourself out for a friend? and Will you put yourself out for a friend? – particularly when you’ve never actually met the friend in question. In other words, the point is precisely not that we can’t be absolutely sure that a person is really going to invest in us [and] is really going to be there for us when we need them: the point is that we can’t assume that those two things go together. This disjuncture between emotional investment and binding, push-comes-to-shove mutual obligation isn’t entirely new – think of penfriends or AA groups – but I think it’s fair to say that the spread of online sociality has made it a much more widespread experience.

What’s going on – or rather, what’s specifically not going on – is summed up by the phenomenologist Alfred Schutz, quoted here by Ulises Mejias:

In order to observe a lived experience of my own, I must attend to it reflectively. By no means, however, need I attend reflectively to my lived experience of your lived experience. On the contrary, by merely “looking” I can grasp even those of your lived experiences which you have not yet noticed and which are for you still prephenomenal and undifferentiated. This means that, whereas I can observe my own lived experiences only after they are over and done with, I can observe yours as they actually take place. This in turn implies that you and I are in a specific sense “simultaneous,” that we “coexist,” that our respective streams of consciousness intersect.

Simultaneity, the ability to experience our consciousness in parallel, is a defining feature of face-to-face interactions. The outcome of this inter-subjectivity is not that we are able to “read” the other person’s mind. It is simply a realization that ‘I am experiencing a fellow human being.’

I suspect that this experience of continuous mutual presence – what Schutz called the ‘We-relationship’ – is the distinguishing feature of close friendships. It’s a relatively rare experience – and social networking software doesn’t make it any less so.

One final thought. What would a collective We-relationship – the experience of the consciousness of time passing, of an event unfolding, shared and reflected within a group of people – look like and feel like? Something like a really good meeting? (Physical presence, again, is hard to do without. I’ve attended multi-site Access Grid meetings; it’s great being able to see people’s faces, but it’s impossible to meet anybody’s eye.) Or something like this?

Modern religions demand ‘belief’, an act of the imagination; traditional ritual didn’t need to demand any such thing as it offered direct experience of ‘god’, ie, of society, of social solidarity.

But you don’t know me

I don’t know Tilda Swinton. At all.

There are, of course, many people I don’t know; the list could be extended more or less indefinitely, potentially forming the basis for a rather unchallenging game (“Yeah? Well, I don’t know Charles Kennedy, Jason Orange or Hufty from the Word…”) The point about Tilda Swinton in particular is that, if you stopped me in the street and asked me if I knew her, I’ve got a horrible feeling I’d say Yes. (At least, I used to… Well, when I say ‘know’, I met… actually no, I never actually met… sorry, what was the question?)

Obviously, the image of anyone you’ve seen a lot on the screen can get painted on the back of your mind, to the point where they seem as familiar as a friend or neighbour (“In the street people come up to Rita/It’s Barbara Knox really but they’re still glad to meet her” – Kevin Seisay). I suppose something similar’s going on here, assisted in this case by the fact that I was at the same university as Tilda Swinton for at least one year; I even saw her in a college theatre production once, playing opposite a friend of a friend of mine. (I think. It may have been someone else.)

I’ve never even had any contact with Tilda Swinton, if it comes to that. I did once try to get in touch with her, for a series of brief interviews we were running in Red Pepper at the time. A friend gave me the number of a friend, who she thought had known her and might be able to put me in touch. I duly phoned the friend’s friend, who was a bit taken aback and suggested that if I wanted to speak to Tilda Swinton I should probably go through Tilda Swinton’s management. Nothing ever came of it.

In short, whatever fantasies I may half-consciously harbour, the real world is unanimous on this one: I don’t know Tilda Swinton, at all. I’ve got a friend who’s got a friend who may once have known her, and I had a friend at college who had a friend who may once have acted with her, but none of that adds up to anything.

Or it didn’t, until LinkedIn.

LinkedIn is a social networking site for people who want to make their social network work; it’s designed to enable members to exploit “the professional relationships you already have”. You join LinkedIn by writing a ‘profile’ (a c.v., more or less). You then ‘build your network’ by exchanging emails with existing members of LinkedIn who you already know; the software helpfully provides lists of LinkedIn members who are, or were, at your workplace, former workplace or university. When your emailed invitation has been accepted, the user you invited becomes one of your ‘connections’, while you become one of theirs. Ultimately you end up with a network “consist[ing] of your connections, your connections’ connections, and the people they know, linking you to thousands of qualified professionals”. ‘Thousands’ is no exaggeration: after a month’s membership I’ve got 41 ‘trusted friends and colleagues’, and many LinkedIn users have five or ten times as many. It adds up, or rather multiplies out: if you count “[my] connections’ connections, and the people they know”, I’m connected to over 200,000 people. Woohoo.
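That 200,000 figure is easy enough to reconstruct on the back of an envelope: assume an average number of connections per person – 70 is my guess, not LinkedIn's figure – and the multiplication does the rest, without even de-duplicating the people you can reach by more than one route.

```python
# Back-of-the-envelope version of the multiplication. The average of 70
# connections per person is my guess, not LinkedIn's figure, and nobody is
# de-duplicated - people reachable by more than one route are counted twice.
direct = 41
avg_connections = 70

second_degree = direct * avg_connections           # 2,870
third_degree = second_degree * avg_connections     # 200,900
print(second_degree, third_degree)
```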

There are two main ways to make money out of social software – adding advertising or charging a fee for a premium service – and I’m generally in favour of the latter. This is the route LinkedIn have chosen. Annoyingly, the result in this case is not simply that fee-paying users benefit but that free riders are penalised. The profiles of users outside your network are only shown in full if you’ve got a paid-for account, which can be frustrating. Worse, the highest echelons of power-networking users can opt out of receiving common-or-garden email invitations, so that they can only be contacted using the network’s ‘InMail’ facility – which is, of course, only available on paid-for accounts. There’s being linked in, and then there’s being linked in. I suppose this says something about the nature of the service they’re providing: a professional social network is one with lots of people excluded from it.

The bigger question is what LinkedIn actually provides (apart from the warm glow of knowing that somebody else has been excluded). I wrote last year that tagging, for me, is more an elaborate way of building a mind-map than anything to do with bookmarking pages and finding them again; I’m interested to see that Philipp has reached a similar conclusion (“Let’s put it straight: Using tags to find my bookmarks later just doesn’t work. I give up.”) Similarly, I suspect that one of the main benefits of LinkedIn – at least for us non-power-networkers – is the capacity it gives you to contemplate the scale and plenitude of your own network: all those people I know, sort of! I mean, I know someone who knows them, or else there’s a friend of a friend who knows them… So I sort of know them, really, don’t I, just a bit?

But Tilda Swinton’s not on LinkedIn. So I don’t know her at all.

Wrapped in paper (10)

One last column, from right back in 1998. I had actually worked in IT until a couple of years before; I think my sympathies are clear.

BUSINESS MANAGERS are never short of advice these days. Any large bookshop has several yards of books devoted to Self Help for Managers: Feel the Stress and Do it Anyway; Meditations for Women who Manage Too Much; Men Are From Mars, the One-Minute Manager is from Venus… However, managers haven’t had any advice from the best source of all: the IT department. Not, that is, until now. I will shortly be bringing out a compendium of tricky real-life management problems with IT-friendly solutions, Just Don’t, All Right? Here’s a selection.

PROBLEM: You are the national sales manager for a major distributor of synthetic insoles. You wish to rationalise the structure of the sales force by cutting out a level of management. There are four levels; the top level consists of two meaningless jobs with grand-sounding titles, created specially for the previous MD’s wastrel half-brother and his friend Simon. However, Simon’s wife is currently expecting their third child; moreover, she is an old friend of the receptionist at the office next door, who often lets you use their car park when your space gets taken. What do you do?

SOLUTION: Remove a level in the reporting structure? Are you mad? Have you any idea how many systems that will affect? Dedicated IT professionals worked long hours to design systems around the current management structure, with all the job titles and reporting relationships carefully hard-coded. Are you going to throw that work back in their faces? Besides, redeveloping all those systems would take approximately… let’s see… eighteen months, and that’s with everyone working flat out… factor in development work, allow for holidays and you’re talking three years easily. Maybe four. And by that time you’d probably want the old structure back, so it’s actually quicker this way.

PROBLEM: Your company has merged with Acme Insoles, previously your biggest rival. Acme management wants the new company to standardise on the Acme IT system, which offers a thin-client VR interface to a Web-enabled object-based next-generation system running on a wide-area network-centric protocol-independent massively-parallel cluster array. Your operations people argue strongly against this, on the grounds that your own system is ‘loads better’. Acme management took you out to lunch the other day, which was nice. On the other hand, they did insist on going to that posh Italian restaurant, and you missed out on a session down the pub with the ops. Who should you believe?

SOLUTION: The ops, every time. They should know, it’s their job. Besides, all that wide-object massively-independent stuff is all very well, but who’s going to get out of bed when it falls over at 2 a.m. on a Sunday? The ops, that’s who. Antagonise them at your peril.

PROBLEM: The year 2000 is fifteen months away. When you asked your IT director, he told you that all your systems were millennium-compliant; however, immediately afterwards he took early retirement and opened a greengrocer’s. When you went past the other day there was a sign in the window saying

All ‘Fruit’ Is ‘Guaranteed’ Millenium-Bug ‘Free’!

Should you be worried?

SOLUTION: The use of bad English on a greengrocer’s sign is not in itself worrying, or indeed surprising. You may even be able to turn it to your advantage: is there a gap in the market for a scrupulously literate greengrocer? If on the other hand you are not expecting a sizeable lump sum from your current employer, you may wish to consider bar work. (Note: avoid pubs with computerised tills).

PROBLEM: You are having trouble motivating your IT staff. You have tried departmental meetings, informal group chats, fun events after work, lunchtime quizzes, motivational posters, Dress Down days, Dress Up days, Tidy Desk days and Work Normally days. Nothing works. What should you do?

SOLUTION: Try money. Or holidays. Or, no, wait, money and holidays. And shorter hours. Let’s see… more money, more holidays, shorter hours and paid overtime. And free beer. That ought to do it.

Wrapped in paper (9)

Another from 1999, this time from Ned Ludd’s column in NTexplorer. Bill Gates’s book Business @ the speed of thought had just come out. (No, I don’t remember anything about it either.)

SINCE THE SUCCESS of my first book, the Superhighway Less Travelled, rumours of a sequel have been rife. I’m happy to say that ‘Ludd 2.0’ is finally available. It’s called Thinking at the speed of business, and your local bookstore may still have some signed copies. (They certainly had a few left when I went.)

It’s a 300-page book, so I can’t do justice to the full complexity of the ideas I presented in it here – not unless I had a double page at least. (No chance – Ed.) Here, by way of a taster, are some of the key concepts from the book they’re already calling a paradigm-busting block-shifter.

Digital nervous system. Not everyone realises this, but the information which is held on computers is actually encoded in the form of digits – that’s numbers to you and me. One and zero are the numbers most commonly used, but that’s just down to programming tradition. Many people are unhappy about computers having all that information, and so they try and beat the system – they spell their names different ways, they leave the ‘optional information’ boxes blank, sometimes they don’t even register their software! What I say is, computers already have so much information about you, what does it hurt to give them a bit more? Besides, the computers don’t care about your information – to them it’s just ones and zeroes, remember? There really is no need to get nervous about the digital system.

Working Web-style. Go into any large company, ask twenty different knowledge workers what they’ve found on the Web recently, and you’ll probably get thrown out by Security. Not only that, but you’ll have wasted the best part of a morning. And they’d all lie to you anyway, so what would be the point? Give people Web access, and you’ll find that from then on they’re working in a different way – a more secretive way, very often. Take their Web access away, on the other hand, and they’ll leave. The Web, in today’s business world, is a chaotic strange attractor; in other words, it’s a quantum leap which will transform the working environment for generations yet unborn, probably. I expect it’ll work out all right.

Information on your fingers. From the teletype to the keyboard; from the keyboard to the mouse; from the mouse to those funny-shaped mice with the little wheely thing in the middle – a whole series of quantum-busting paradigm-leaps, and every one of them has depended on the human finger. Several fingers, in fact. Developments in VR technology which are already being written about will take this process a revolutionary step further, with the advent of a tactile user interface or TUI. Imagine being able to reach out and use your hand to smooth the curve of a graph, align a heading, massage the data. No, I can’t imagine it either, but it’s certainly worth thinking about.

The speed of business. Go into any large company – you shouldn’t have so many problems with Security this time round – and see how quickly things are getting done. That’s right: not very quickly at all. Most office workers spend significant amounts of time doing what time and motion experts classify as ‘chatting’. Approaches to chatting differ, but the overall chat quotient (OCQ) is thought to be remarkably constant as between the three main sociological categories of office worker: the Infuriatingly Calm Slacker (ICS), the Crisis-Driven Maniac (CDM), and the Manager (BOF). The moral is clear. The true speed of business is a leisurely speed, and there’s really no call to speed it up – I mean, who wants to work around the clock anyway? Let the computers sort it out – we’ve got homes to go to.

Thought-provoking stuff, I think you’ll agree. In all modesty, I think this book could get me recognised as the most influential business author since Tom Peters, or possibly Napoleon. Already I’m hotly tipped for this year’s award for the business writer who makes the most use of scientific terms without knowing what they mean. That’s what I call a paradigm shift!

Wrapped in paper (8)

After all those columns from 1999, here’s one from last month. (And then I’ll get back to proper blogging, probably.) They say you should write about what you know; what I knew, that particular weekend, was beer. ‘Dave Bitzer’ doesn’t represent anyone in particular. Years ago I invented a consultancy called Gargle Bitzer Helipad, and I’ve used various Gargles and Bitzers ever since then to stand in for different talking heads and company spokespeople. Usually I make them talk rubbish, for obvious reasons, but in this one I think Dave talks a lot of sense.

I RAN INTO my old friend and commenting partner Dave Bitzer the other day at local beer tasting event Pale And Bitter (And Slightly Sour). I’d worked my way through the milds by this point and started on the fruit beers. In retrospect I think the second blackcurrant flavoured porter may have been a mistake.

“Dave!” I put it to him. “How’s it going! How is it going? How’s life in the… well, you know.”

I could see that my incisive style of questioning had caught Dave unprepared. For a moment, in fact, he got so confused that he said “Hello” and then turned his back on me – very much as if he had said “Goodbye”! Pausing only to sample the ginger-flavoured pale ale, I hastened to set his mind at rest.

“Dave,” I put it to him, placing a friendly arm around his shoulders. “David, David, Davey Davey Dave. It’s like this. I mean, is it like this? That’s the thing, you see – is it like this or not? I mean, if you spend your time reading about Web 2.0 on blogs and podcasts… and, and blogcasts…”

Dave said that people who did that should probably get out more, although in my case he’d make an exception. I thought that was a very good point.

“That’s a very good point,” I put it to him. “Thing is, if you read the Webby, Web things, lots of stuff. Lots of stuff happening. Reminds me of the dot boom. Is this another dot boom boom, Dave? Wait a minute, that’s not right. Is this… another… dot dot boom, de-boom boom boom. That’s what I say.”

Dave gave a heavy sigh, clearly impressed with the cogency of my argument. OK, he said, look at it this way. He tore the top layer of paper from a beermat and drew a cross on the exposed surface. So here’s your basic quadrant, he said. You can call this one –

“I’ll call it Henry,” I put it to him.

Dave sighed again, obviously deeply impressed. OK, he said, here’s Henry the Quadrant. Left to right we’ve got usefulness – is an application idea actually useful or not? Top to bottom, marketability, or whether or not you can get people worked up about it. We can rule out a couple of combinations straight away. ‘Dull but useful’ is an uphill battle for any company (apart from companies that have a large installed base they can sell to), and ‘dull and useless’ is best avoided. Clearly, ‘useful and exciting’ is what most developers are aiming for. But here’s the problem. How can developers actually come up with something that’s both exciting and useful? Look at the way we live already – we wear clothes, we drive cars, we synchronise calendars, we download MP3s, we shop around to get the best price for DVDs and don’t worry too much about the Hong Kong customs stamp when they arrive. It all works, basically. So people end up going for “exciting but useless”, and you get applications like Twitter – sounds great, if you like the idea of reading other people’s diaries in real time, but it’s not much use if you’ve actually got a life. Speaking of which, he added cryptically, then turned to look around the room, avoiding my gaze completely. I was touched by this mark of respect and put my arm round his shoulders again.

“So… So, so, so, Dave,” I put it to him. “Tell me, Dave. Is this another dot boom boom?”

Dave made a strange respectful growling noise. Not really, he said, because… oh, never mind. Look, it’s as if a brewery had to keep coming up with something new, and after a while they found they’d done every kind of beer that was actually drinkable, but they just kept going anyway and turned out, I don’t know, ginger-flavoured pale ale or blackcurrant-flavoured porter. There’s just not a lot going on, and the stuff that gets the hype isn’t really worth it. That’s why I’m here, actually.

“What, to check out the brewing… trendy… trendy trends?” I put it to him.

No, Dave said – to get drunk. What do you recommend?

Wrapped in paper (7)

More from the last century. This one had a wider audience than many of my columns, as it appeared in Computing. I wrote five of these columns for the paper in the first half of 1999, working on a rota with four or five other writers, after which they had a big reorganisation and dropped the column. It’s nice to feel one’s made a difference.

This column’s dated surprisingly well, although the obvious anachronisms are now harder to spot. The only bits that have a really odd ring are the bit about ‘free Internet services’ and, ironically, the reference to dialup access ‘costing you money the whole time’. The economics are different these days.

THE CURRENT explosion in free Internet services is sure to create a whole new generation of users: keen, enthusiastic, ignorant. With these lucky people in mind, I’ve put together a brief Guide to the Net.

What is the Internet?

The World Wide Internet (or ‘Web’ for short) was originally set up as a means for American military commanders to communicate with one another following a nuclear holocaust. It was thought vital to national security to assure the continuing availability of chat rooms, games of noughts and crosses and, of course, ‘adult’ material. Although the Cold War ended some time ago in a no-score draw, by some oversight the authorities failed to dismantle the International Net (or ‘w.w.w.’ for short). Many new technologies have sprung up to threaten the continuing viability of the ‘Web’ – the Sega Megadrive, Rabbit phones, the Microsoft Network – but it continues to hold its own. Indeed, some experts believe that it has grown within the last three to five years – and that this upward trend may continue!

Why is it so popular?

The popularity of the ‘Net’ is undoubtedly due to the unparalleled range of information and services which it can offer: the latest news from Kosovo, hardware specifications for everything from a Furby to a Happy Fun Ball, and, of course, material for ‘adult’ eyes. And that’s without mentioning the vast communications possibilities opened up by the electronic ‘news’ and ‘mail’ services which form an integral part of the ‘Interweb’, although it’s a different part from the part that we’ve been talking about up to now. Mail (or ‘email’ for short) quite literally shrinks the world – when you first got an email address, who would have thought you’d soon be getting business propositions from people in Taiwan? Not you, I’ll bet. As for ‘news’, don’t get hung up on that stale old idea of new information presented in an unbiased manner – most of the ‘news groups’ are full of ancient gossip and incomprehensible insults. And they’re all the better for it!

What about the…

Many news groups are entirely dedicated to the provision of material designed for an ‘adult’ audience.

Just checking. So, what are the drawbacks of ‘Net life’?

Net life is a lot like real life: you meet people, you talk about things, you make friends, you fall out, you insult them in public, they refuse to speak to you, you realise you’ve gone too far and try to apologise, it’s too late, nobody ever writes to you again except people in Taiwan with business propositions. The main difference is that when you’re on the Net it’s costing you money the whole time. On the other hand, how many people do you know from Taiwan?

What about the dangers of addiction?

Net addiction – or ‘Web addiction’ for short – is a real danger for today’s ‘knowledge workers’ (people who really have to work at it to acquire knowledge). Sufferers become distracted and irritable when they’ve been ‘off the line’ for too bloody long – often they can’t even complete a simple English, oh, what’s the point anyway? Look, I’ll just check my mail, all right? I won’t be on for long.

Does the Net interfere with users’ social lives?

What was that? Sorry, I wasn’t listening.

What are the major growth areas in Web use?

The Web is international – hence the name! This means that the only laws which apply on the Web are laws which apply all over the world. This is good news for casinos, which have expanded far beyond their original base in the North of England, as well as for providers of material intended for customers who can be described as ‘adults’. Another recent growth area, also taking advantage of the Net’s ‘offshore’ existence in ‘cyberspace’, has been drug trafficking (or ‘e-commerce’).

What is the significance of encryption?

Nfx n fvyyl dhrfgvba…
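(That answer, for the record, is nothing more sinister than ROT13 – the old Usenet convention of rotating every letter thirteen places, so that applying the same operation twice gets you back where you started. A minimal sketch in Python, just to show how little ‘significance’ is involved:)

    import string

    # ROT13: shift each letter 13 places; encoding and decoding are the same operation.
    ROT13 = str.maketrans(
        string.ascii_lowercase + string.ascii_uppercase,
        string.ascii_lowercase[13:] + string.ascii_lowercase[:13]
        + string.ascii_uppercase[13:] + string.ascii_uppercase[:13],
    )

    print("Nfx n fvyyl dhrfgvba...".translate(ROT13))   # -> Ask a silly question...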

Wrapped in paper (6)

As a sort of companion-piece to the last one, here’s a column from September 1999.

THIS MONTH this page is given over to an interview with a pioneering futurologist: Michel de Nostredame. De Nostredame – more widely known as ‘Nostradamus’ – has had a huge influence on the very course of life on this planet itself, and on the development of the computer industry. I was particularly curious to hear Nostradamus’ interpretation of recent events, which have damaged his reputation in some quarters.

So we’re still here, then.

Can I make one thing very clear right at the outset? When soldiers cross the burning river, only a young Pope can hold the jam.

Do you think you could make that even clearer?

Sorry – force of habit. What I meant to say was, I never actually said the world was going to end on the fourth of July 1999.

What about “a creature with two heads will be born the day the eagle celebrates his festival”?

Well, there you go – that could mean just about anything. The Yanks aren’t the only people who make a fuss about eagles, are they? Besides, I didn’t specify a year. I didn’t specify a century, for that matter.

Elsewhere you did refer to July 1999, though. ‘Year 1999, seven months, from the sky will come a great king of terror to revive the king of the Mongols.’

For a start, it’s not the king of the Mongols: it’s the king of Angoulême, which is a region in France. People keep assuming I wrote in anagrams – as if my verses weren’t incomprehensible enough to start with! If I’d meant ‘Mongols’ I would have written ‘Mongols’, I can assure you.

But Angoulême doesn’t have a king.

That’s easy for you to say. I was writing four hundred years ago, remember? Anything could have happened in that time. Then there’s this ‘great king of terror’. What I actually wrote was deffraieur, which means someone who pays the bill – the kind of person who’ll get the drinks in and pick up the tab.

So it should be translated as ‘a great entertaining King’?

Uh-huh.

That gives us: ‘Year 1999, seven months, from the sky will come a great entertaining king to revive the king of Angoulême’. It’s not a great improvement in terms of accuracy, is it?

You realise that the seventh month of the astrological calendar only starts in mid-September? No, you’re right, it’s not very likely. Chalk it up to experience.

What influence do you believe your work has had on the computer industry?

It’s had a huge influence. Bill Gates himself is known to have studied my writing extensively. He even used one of my verses as justification for one of his major campaigns. As it happens that verse was a fake – it was planted by the British government, which had learnt about his superstitions following the defection of Rudolf Hess – but it shows how seriously he took my writing.

I think you’re thinking of Adolf Hitler.

You may be right – these twentieth-century leaders all look alike to me.

How do you think your writing will fare in the next millennium?

I’m optimistic. That reference to 1999 was the last specific date I used – I wish I hadn’t bothered, it was asking for trouble. There are plenty of verses still left to interpret, and some of them are so weird that they’ll be almost impossible to prove or disprove. “They will come to deliver the prince of Denmark, a shameful ransom to the temple of Artemis” – what’s that about? People will be trying to make sense of my prophecies for a long time to come.

Can I quote you on that?

I’d rather you didn’t – you never know what might happen.

Wrapped in paper (5)

This one’s from March 2000. I should say that I took Y2K very seriously indeed; we even stockpiled. (Well, we had a box.) I vividly remembered being a programmer in 1987, and having to argue long and hard before my project leader would allow me to use eight-digit dates. Multiply that out across the country, I thought, and who knew what would happen? Ironically, I was one of those posters to comp.software.year-2000 who were regarded as sunny optimists, on the grounds that we anticipated large-scale disruption but not actually the end of the world as we knew it, as such. At one stage I formulated a rough 5-point scale for measuring the severity of our predictions, and pegged myself at around 4 (where 5 was, well, TEOTWAWKI). There were plenty of no-nonsense 5s; somebody even extended the scale up to 10, to incorporate vaguely Nostradamus-like predictions of exactly how the WAWKI would E.
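(For anyone too young to remember what the eight-digit-date argument was actually about: the core of the bug was nothing more exotic than years stored as two digits. A toy illustration in Python – purely mine, nothing to do with the systems I actually worked on – of how the sums go wrong at the rollover:)

    # Two-digit years worked fine for decades, then broke at the stroke of midnight.
    current_yy = 99    # 1999, stored as just '99'
    expiry_yy = 0      # 2000, stored as just '00'

    print(expiry_yy - current_yy)     # -99: the card looks long expired

    # Eight-digit (YYYYMMDD) dates compare correctly; six-digit ones don't.
    print("19991231" < "20000101")    # True  -- sorts in the right order
    print("991231" < "000101")        # False -- 2000 sorts before 1999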

So I have every sympathy with people like Peter de Jager and Ed Yourdon, who did a great deal of what I still believe was good and worthwhile work in raising awareness of Y2K, and with Ed Yourdon’s afterthoughts in particular. Just thought I’d establish that.

I CAUGHT UP with my old friend Ed Gargle at his remote farmhouse recently. Ed was widely regarded as one of the leading authorities on the Millennium Bug in 1998 and 1999, although more recently he has been less in demand. I began by asking Ed the obvious question: what went right?

“What went right? Precious little, as far as I could see. Oh, there were a few failures – I believe the trains in Mali are still up the spout – but by and large Y2K was a bit of a washout.”

Remediation had been successful, in other words?

“Absolutely – and nobody’s happier about that than I am. Y2K could have been a major disaster. There was a real risk of an economic slowdown, caused by nothing more than the ever-mounting expense of last-minute fixes and the spiralling fees which would have been charged by Y2K consultants. We could have seen supply chain disruptions, leading to shortages in basic supplies; that would have caused untold hardship for everyone, except for those farsighted individuals who prepared by buying a year’s supply of rice, drinking water and toilet paper. (That’s a lot of toilet paper, incidentally – particularly if you got some extra for barter purposes.) At worst, we could have seen society decline into lawless, bloodstained chaos, in which civilisation itself would only be kept alive by a few hardy pioneers in isolated farmhouses. Instead, everything just went on working. I’m glad about that. Really very, very glad.”

I wondered how Ed would account for the success of remediation.

“Mali, for God’s sake. Talk about adding insult to injury. New Zealand: OK. Australia: OK. Japan: OK. China: OK. Russia: OK – Russia, would you believe! Europe: OK. The US: OK. Mali: problems on the railways. Oh, big deal. Who in their right mind is going to get on a train in Mali at the best of times, let alone on the day before the end of the world as we know it?”

Quite. However, I also wondered how Ed would account for the success of –

“People are blaming me now. Can you believe that? All I did was state how it looked to me – people have got to draw their own conclusions. So what if a world-renowned Y2K consultant says there’s a 79% probability of one or two major disruptions to essential services during the first quarter of 2000, each lasting between two and three weeks – it’s just one person’s opinion. People are even blaming me for the money they spent on preparing for the rollover. All I said was that I’d sold up, moved to the country and bought a year’s supply of rice, bottled water and toilet paper (which is a lot of toilet paper, incidentally) – I never said that anybody else should do the same. There wouldn’t be much point if everyone did it.”

Indeed. I wondered how Ed would account for –

“I’ve got no bookings, you know. My diary’s empty. Correction, I’ve got a few of these gigs in the first quarter – ‘Ed Gargle Explains Why He Got It Wrong’ – but after that, nada. I’m hoping I’ll be able to fall back on the stuff I was doing before Y2K. I don’t know, you tell me – is C++ still making headlines? Thought not. Still, look on the bright side – I won’t need to buy rice any time soon.”

Clearly. I wondered how –

“And then there’s all that toilet paper – it’s taking up space apart from anything else. I put a note in the last edition of my subscribers-only Y2K newsletter asking what I could do with 144 rolls of toilet paper, but I haven’t had any suggestions. Well, I haven’t had any practical suggestions.”

Ed sighed and poured us both another slug of ‘Sloe Poison’ (a locally-produced fruit brandy).

“As for why remediation succeeded, God only knows. Dedicated programmers, I suppose. Well-written applications. Stable, reliable, robust platforms, if there is such a thing. Still, mustn’t grumble – never know what’s going to happen at the end of this year.”

At the end of this year?

Ed smiled.

“Can I interest you in a seminar?”

Wrapped in paper (4)

Finally (for now), here’s another one from a defunct print publication, in this case one that wasn’t even available on this side of the Atlantic. The magazine was called ePro and it was aimed at IBM users. IBM what users, you ask. That was the clever part – ePro was for users of IBM ‘eservers’, in other words any of IBM’s four (or thereabouts) server platforms. (That was ‘eserver’ with that squiggly at-sign ‘e’. You do remember the squiggly ‘e’, don’t you? Alex? Anyone?)

Anyway, I got the WebSphere-related commentary gig, which involved sounding knowledgeable once a month without making too many jokes. Most of the columns are pretty damn geeky, to be honest, as well as tending to slip into the corporate-breathless mode (I’m guessing here, but if IBM have successfully developed the philosopher’s stone – and that is a big if…) Some of the less technical ones still read pretty well, I think. For example, this one, from March 2003.

MONSTER MOVIES never give you a good view of the monster until halfway through. Representing Godzilla through one enormous footprint — or even one enormous foot — is a good way of building up suspense. It’s also realistic: if Godzilla came to town, one scaly foot would be all that most people ever saw.

Some things are so big they’re hard to see. Although e-business is making some huge changes to the way we live and work, we don’t often think about where it’s coming from and why. Asked to identify trends driving e-business, analysts tend to resort to general statements about business efficiency or customer empowerment. Alternatively, we get the circular argument which identifies e-business as a response to competitive pressures—pressures which are intensified by the growth of e-business.

The real trends driving the evolution of e-business are at once more specific and more far-reaching. Moreover, these trends affect everyone from the B2C customer at home to the IBM board of directors, taking in the hard-pressed WebSphere developer on the way.

The first trend is standardization. On the client side, there is now only one ‘standard’ browser. A friend of mine recently complained about a site which was not rendering properly (in Navigator 7.0). The Webmaster — presumably a person of some technical smarts — replied, “This is not a problem with our site, but your browser. I am running Windows 98 with IE 5.50 and everything displays perfectly.” At the back end, conversely, the tide of standards rolls on—from CORBA to XML to SOAP to ebXML. Interoperability between servers is too important for any company, even Microsoft, to stand in its way.

Whether standards are set by mutual agreement or by the local 800-pound gorilla is secondary; however it’s achieved, standardization has fostered the development of e-business, and continues to do so. The effect is to commoditize Web application servers and development tools; this in turn promotes the development of a single standard application platform, putting ‘non-standard’ platforms and environments under competitive pressure. From OS/400 to Windows 2000, platforms which diverge from the emerging Intel/Linux/Apache norm are increasingly being forced to justify themselves.

The second trend is automation. Since the dawn of business computing, payroll savings have been an ever-present yardstick in justifying IT projects. E-business continues this trend with a vengeance. Whether you’re balancing your bank account or making a deal for office supplies in a trading exchange, you’re interacting with an IT system where once — only a few years ago — you would have had to deal with a human being. The word processor was the end of the line for shorthand typists; e-business is having a similar effect on growing numbers of skilled clerical employees. The next step, promised by Microsoft and IBM alike, is an applications development framework so comprehensive that business analysts and end users will be able to generate entire systems: even application development will be automated. (No, I don’t believe it either, but are you going to bet against IBM and Microsoft?)

The third trend is externalization of costs. Not long ago, if you asked a shop to deliver to your home, you could expect to see a van with the name of the shop on the side. Place an order online today, and your goods may well be delivered by a self-employed driver working with a delivery service contracted to an order fulfillment specialist. Talk of ‘disintermediation’ as a trend in e-business is wide of the mark. By offering more agile, flexible and transparent inter-business relationships, e-business makes it possible for intermediaries to proliferate, each contracting out its costly or inconvenient functions. On the B2C front, meanwhile, operating costs are increasingly passed on to the customer: I sometimes spend far longer navigating a series of Web forms than it would take to give the same details to a skilled employee.

A drive for standardization, forcing all platforms into a single generic framework; automation for all, cutting jobs among bank tellers and programmers alike; businesses concentrating ruthlessly on core functions, passing on costs to partners and customers. These trends have had a huge impact on IT and society at large — and there’s more to come. In the e-business world, we’re all in Godzilla’s footprint.

Wrapped in paper (3)

One more back number. This one is a bit older than the other two and requires some introduction.

For three years, I edited a magazine called NEWS/400.uk; it’s still going, albeit under another name, and I’ve gone on writing a regular column for it ever since. The mag’s appeal is and always has been fairly specialised, as it’s aimed exclusively at users of IBM’s System i midrange platform (formerly known as the AS/400). There was a brief period – 20 monthly issues, to be precise – when the company I worked for also produced a magazine for users of Windows NT and Windows 2000. I was the launch editor – I left after eight issues – and I’m still convinced it could have been big. For various reasons, it didn’t happen.

Anyway, I had a column in NT explorer as well; it was called “NTWA” and it was written, for reasons I don’t now remember, under the name of Ned Ludd. This is the column from March 1999.

“AH, MR LUDD – I’ve been expecting you.”

A familiar, bespectacled figure greeted me. He was sitting in a swivel chair, which he turned to face me as I entered the room. At first sight I thought he was stroking a Persian cat; after a moment I realised it was a stuffed purple dinosaur. As I watched, he dashed the toy to the floor; it bounced once, squeaked “Say Hi to Barney!” and lay still.

“We meet at last,” he said. “And for you, it really is the last time. I mean, it’s the last time you’ll meet anyone, because I’m, like, going to kill you. I know, it kind of sucks, but what else can I do?”

“You could tell me your master plan,” I suggested.

“Ha!” he riposted. “Tell you my master plan? Ha! And, uh… Ha! And stuff. Oh, what the hell, let’s do it. I mean, I’m going to throw you to the piranhas anyway, right?”

He gestured towards what I had thought to be an ornamental water feature. Then he reached down, picked up the purple dinosaur at his feet and flung it across the room. It gave out a plaintive “I wuv you, Billy!” as it flew, then disappeared into the tank. The water boiled up around it. I shuddered.

He gave a sinister giggle. “So, you want to know my master plan. I guess you know about Y2K?”

“The Millennium Bug? But… your software implements different fixes for the bug in different packages – even in different releases of the same package! You’re on record as saying that the Millennium Bug isn’t a big deal! You’ve even said it can be fixed by subtracting thirty years from all dates, and everyone knows it should be twenty-eight – I thought…”

“You thought I was just like totally clueless, yes?” His accent was changing as he spoke. “And now you are realising, like, nobody is that clueless? And if I am not clueless… Hmm?”

“Incompatible fixes, fixes that don’t work, misleading advice – you’re trying to make things worse!”

“Ha! Correct. And after the Millennium Bug, what happens? When the date rolls over, when the computers of the world are crashing and burning, what then? I’ll tell you – it will be the end of computing as we know it! And, as the cloak of anarchy falls over the smouldering ruins of Western civilisation, only one system will survive. One light in the darkness, one beacon of hope, one operating system which will be fully compatible with the emerging requirements of the new millennium!”

“You mean – ”

“Yes. Windows 2000! Oh, they used to laugh at Windows. They laughed at my dancing paperclip; they laughed at the repeated shipping delays for NT 5.0; they even” – his voice trembled – “they even laughed at my talking Barney. But no more! There was Windows, now there is Windows NT; soon there will only be Windows 2000. The third Windows will last a thousand years! Give or take a few Service Packs.”

“That’s fiendish!”

“I thought it was kind of cool, actually. But enough of this idle chit-chat. There is a second piranha tank beneath your feet: when I press this button the floor will open up beneath you and you will suffer the fate of Barney. I’m clicking on ‘OK’… now. Now I’m doing it again, because the system has not responded. And once more. And now I am being told an illegal operation has been committed, and I am exiting the program to try again. And now the system is hanging, and – hey, where are you going?”

As I made my escape he shouted after me:

“Go, Ludd! Tell the world! They will never believe you! Ha! No one will listen to your ridiculous story, and that’s just like so uncool. Ha! And stuff.”

I think he needs to work on the accent.
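(A footnote on the ‘thirty years… everyone knows it should be twenty-eight’ jibe: between 1901 and 2099 the Gregorian calendar repeats every 28 years, so shifting dates back by 28 preserves both leap years and days of the week, which is why 28 was the usual shift for that style of Y2K fix and 30 wasn’t. A quick check in Python, just to illustrate the arithmetic:)

    import calendar
    from datetime import date

    # A 28-year shift preserves the leap-year pattern; a 30-year shift doesn't.
    for y in (1996, 2000, 2004):
        print(y, calendar.isleap(y),
              "minus 28:", calendar.isleap(y - 28),
              "minus 30:", calendar.isleap(y - 30))

    # It also preserves the day of the week (within 1901-2099).
    print(date(2000, 2, 29).weekday() == date(1972, 2, 29).weekday())   # True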

Wrapped in paper (2)

More about blogging from iSeries NEWS UK (or System i News UK as it now is), this time from April this year. (Reverse chronological order?)

SINCE BLOGGING exploded onto the national consciousness about a year ago, around the time that I first wrote about it, the phenomenon has grown exponentially. It is now estimated that, out of any given class of fifteen-year-olds, half have a MySpace account, a third have a personal blog and one in ten are using Facebook, while the other two haven’t been online since they got the ASBO. But what are the perils and pitfalls of this new medium? Can we safely entrust our deepest personal secrets to the Web, blithely trusting in the good intentions of everyone who reads our uncensored outpourings? Or not?

Here are some tips for would-be voyagers in the blogosphere. Careful now.

Q: I’m writing a blog. Should I be worried?

A: Very probably. Let’s face it, writing about whatever comes into your head for the benefit of a few dozen readers is no kind of occupation for an adult – not like being a columnist, for example! Perhaps you should get out more. Unless you’re one of those fifteen-year-olds, in which case you probably get out quite enough. Isn’t there some homework you should be doing?

Q: No, I mean, should I be worried about getting sacked?

A: There have been a couple of high-profile cases recently of bloggers being sacked or suspended, on the general grounds that holding a responsible position in society is incompatible with writing about whatever comes into your head for the benefit of a few dozen readers – particularly if you’re doing it in work time. But let’s keep it in proportion. Before blogging, it was not unknown for employees occasionally to use the Web for personal purposes at work, particularly when Big Brother was on. Before the Web, work computer facilities could be used for employees’ personal ends just as easily, if not quite so entertainingly. Even before PCs, employees sometimes used work facilities for their own purposes, generally by having long telephone conversations with friends, lovers or relatives, often with little or no work content. Where this was not possible, employees often had workplace affairs. Blogging is just one form of workplace timewasting, and by no means the most prevalent (or the most messy).

Q: Good heavens! Can people really be so irresponsible?

A: Yes, I’m afraid so. (You are one of those fifteen-year-olds, aren’t you?)

Q: Any tips for safe blogging?

A: Think about who’s going to be reading your blog. Once it’s up there on the Web, anyone at all could read it – and it’ll stay there for years to come! On the other hand, in practice hardly anyone will read your blog, and most of those who do won’t look beyond the front page, so it’s probably not worth getting too worked up about. But do think about first impressions, and about the effect you’re having on casual visitors, and about printouts and employment tribunals. Don’t call your blog “Notes from a wage slave” or “My boss is a crook”, even if the title accurately describes its content.

Q: Shouldn’t employers actually embrace blogging, along with other forms of social networking software such as tagging, podcasts, vodcasts, wikis and mashups?

A: OK, you’ve had your fun. I’ll answer this one question, but after that I’m going to insist on talking to a grown-up. The answer is, no, they shouldn’t. The factor you’re overlooking here is that blogs are only partly to do with social networking. What they’re very largely to do with is writing about whatever comes into your head for the benefit of a few dozen readers. Which is fine if you’ve got a workforce consisting of egotistical narcissists who only want to hear the sound of their own voice and don’t understand the concept of dialogue.

Q: Many bloggers have gone on to land book contracts and TV appearances.

A: Wait a minute, I hadn’t finished. Encouraging workplace blogging is fine if your employees are all egotistical narcissists, but – let me stress this – not otherwise. What were you saying?

Q: Many bloggers have gone on to land book contracts and TV appearances. Will my blog change my life?

A: Call it “My boss is a crook” and you’ll soon find out.

Wrapped in paper (1)

A propos of not very much, here’s a magazine column about blogging. Regular readers of iSeries NEWS UK may recognise it, as it appeared in that estimable magazine last year.

BLOGGING – it’s the new thing! Everyone’s blogging these days – at least, everyone except you! But what is blogging all about? What are the do’s and don’ts of this new medium – what does it take to be a good citizen of the blogosphere? And that MySpace thing that the kids are doing – what’s that all about? Let’s find out.

Q: Reverse chronological order?

A: That’s right – you’ll see the latest posts at the top and earlier ones lower down. It’s easy to get used to – just imagine that you’re living life backwards, perhaps as the result of exposure to a top-secret military experiment that warped the very fabric of reality itself. Or that you’re reading one of those chain emails where people add their replies at the top.

Q: What about developing a coherent argument?

A: Many blogs have a continuing theme or an argument to which they frequently return. Bloggers whose writing has a particularly clear focus are sometimes referred to as ‘subject experts’, and sometimes as ‘nutters’. You may prefer to avoid being regarded as a nutter; in this case, your best strategy is to have opinions which people agree with. Otherwise, building an extended argument on a blog is no different from doing it in any other situation: cross-examining defence witnesses in a fraud trial, say, or ascertaining whether that bloke in the taxi queue did in fact want some. The only difference with blogging is that you write it all down – that, and the fact that what you write appears in reverse chronological order.

Q: But what would I write about?

A: Whatever you like – the sky is quite literally your oyster. To get some ideas, try browsing some IT blogs. The tech blogosphere is a happy hunting ground for lovers of rare, obscure and historic technology – from the LEO to the One Per Desk, from the Osborne to the Sinclair QL… The iSeries hasn’t been neglected, either – at last count there are as many as two dedicated iSeries blogs, which sometimes feature code! But it’s up to you: you can write about whatever crosses your mind, and goodness knows most people do.

Q: So who writes this stuff?

A: According to popular stereotypes, the typical blogger is a twenty-something American Unix enthusiast who lives with his parents and compensates for his lack of a social life by hunching over a keyboard for hour after lonely hour, conducting tediously pointless contests of geek one-upmanship and exchanging incomprehensibly elaborate in-jokes, pausing only for a swig of Mountain Dew or a bite of cold pizza. This stereotype is far removed from reality – Mountain Dew’s more of a skater thing, apart from anything else. In reality, the range of bloggers is as broad as the range of blogs – and that’s pretty broad. There are blogs out there devoted to every topic under the sun – computing, cult films, Dungeons and Dragons, beer, you name it! It is believed that there are also blogs written by women, although the subject matter of these has yet to be ascertained. That’s the great thing about blogging: anyone can do it. You could be a blogger, if you put your mind to it.

Q: OK, so what is blogging?

A: Blogging is the activity of keeping a blog. A blog is a personal Website, updated regularly by the user; you can think of it as a kind of online journal or commonplace book or advertisement for oneself. The word ‘blog’ may derive from ‘Web log’, a type of Web site consisting of a ‘log’ of other interesting sites. It may also derive from ‘backlog’, a term for the mass of blog-worthy material which dedicated bloggers tend to build up, and the mass of work which doesn’t get done while they’re blogging about it. Alternatively, it may be a cross between ‘brag’ and ‘slog’, encapsulating the experience of reading a blog for (a) the author and (b) everyone else.

Q: Blogs – are they something to do with that MySpace thing that the kids seem to be doing these days? What’s that all about?

A: God knows. Shall we talk about blogging?

The vagaries of science

The slightly oxymoronic Britannica Blog has recently hosted a series of posts on Web 2.0, together with responses from Clay Shirky, Andrew Keen and others. The debate’s been of very variable quality, on both the pro- and the anti- side; reading through it is a frustrating experience, not least because there’s some interesting stuff in among the strawman target practice (on both sides) and the tent-preaching (very much on both sides). As I said in response to a (related) David Weinberger post recently, it’s not always clear whether the pro-Web 2.0 camp are talking about how things are (what knowledge is like & how it works) or about how things are changing – or about how they’d like things to change. The result is that developments with the potential to be hugely valuable (like, say, Wikipedia) are written about as if they had already realised their potential, and attempts to point out flaws or collateral damage are dismissed as naysaying. On the anti- side, the danger is of an equally unthinking embrace of how things are – or how they were before all this damn change started happening.

All this is by way of background to some comments I left on danah boyd’s contribution (which is well worth reading in full), and may explain (if not excuse) the impatient tone. danah, then me:

Why are we telling our students not to use Wikipedia rather than educating them about how Wikipedia works?

Because I could give a 20-credit course on ‘how Wikipedia works’ and not get to the bottom of it. It’s complex. It’s interesting. I happen to believe it’s an almighty mess, but it’s a very complex and interesting mess. For practical purposes “Don’t cite it” is quicker.

Wikipedia is not perfect. But why do purported experts spend so much time arguing against it rather than helping make it a better resource?

This is a false opposition: two different activities with different timescales, different skillsets and different rewards. I get an idea, I write it down – generally it won’t let me go until I’ve written it down. I look at what I’ve written down, and I want to rewrite it – quite often it won’t let me go until I’ve rewritten it. All of this takes slabs of time, but they’re slabs of time spent engrossed with ideas and language, my own and other people’s – and the result is a real and substantial contribution to a conversation, by an identifiable speaker.

I look at a bad Wikipedia article [link added] and I don’t know where to start. What I’d like to do is delete the whole thing and put in the stub of a decent article that I can come back to later, but I sense that this will be regarded as uncool. What I don’t want to do is clamber through the existing structure of an entry I think shouldn’t have been written in the first place correcting an error here or there, because that’s a long-drawn-out task that’s both tedious and unrewarding. And what I particularly don’t want to do is return to the article again and again over a period of weeks because my edits are getting reverted by someone hiding behind a pseudonym.

(I think what Wikipedia anonymity has shown, incidentally, is that people really don’t like anonymity. Wikipedia has produced its own stable identities – and its own authorities, based on the reputation particular Wikipedia editors have established within the Wikipedia community.)

Is it really worth that much prestige to write an encyclopedia article instead of writing a Wikipedia entry?

Well, yes. If I get a journal article accepted or I’m commissioned to write an encyclopedia article, I’m joining an established conversation among fellow experts. What I’ve written stays written and gets cited – in other words, it contributes to the conversation, and hence to the formation of the cloud of knowledge within the discipline. And it goes on my c.v. – because it can be retrieved as part of a reviewable body of work. If I write for Wikipedia I don’t know who I’m talking to, nobody else knows who’s writing, and what I’ve written can be unwritten at any moment. And it would look ridiculous on my c.v. – because they’ve only got my word that it is part of my body of work, assuming it still exists in the form in which I wrote it.

The way things are now, knowledge lives in domain-sized academic conversations, which are maintained by gatekeepers and authorities. Traditional encyclopedias make an effort to track those conversations, at least in their most recently crystallised (serialised?) form. Wikipedia is its own conversation with its own authorities and its own gatekeepers. For the latest state of the Wikipedia conversation to coincide with the conversation within an established domain of knowledge is a lucky fluke, not a working assumption.

Update: The other big difference between traditional encyclopedias and Wikipedia (as someone known only as ‘bright’ reminded me, in comments over here) is that the latter gets much more use. From my response:

Comparisons with the Britannica are interesting as far as they go – and I don’t believe they do Wikipedia any favours – but they don’t address the way that Wikipedia is used, essentially as an extension of Google. When I google for information I’m not hoping to find an encyclopedia article. Generally, Britannica articles used to appear on the first page of hits, but not right at the top; usually you’d see fan sites, hobby sites, school sites, scholarly articles and domain-specific reference works on the same page, and usually the fan sites, etc, would be just as good. (I stopped using the Britannica altogether as soon as it went paywalled.) If all that had happened was that Britannica results had been pushed down from number 8 to number 9, with their place being taken by Wikipedia, I doubt we’d be having this conversation. What’s happened is that, for topic after topic, Wikipedia is number 1; the people who would have run all those fan sites and hobby sites are either writing for Wikipedia instead or they’re not bothering, since after all Wikipedia is already there. (Or else the sites are still out there, but they’re way down the search result list because they’re not getting the traffic.) It’s a monoculture; it’s a single point of failure, in a way that encyclopedias aren’t. And it’s the last thing that should have happened on the Web. (I’ll own up to a lingering Net idealism. Internet 0.1, I think it was.)

Alright, yeah

Stephen Lewis (via Dave) has a good and troubling post about the limits of the Web as a repository of knowledge.

while the web might theoretically have the potential of providing more shelf space than all libraries combined, in reality it is quite far from being as well stocked. Indeed, only a small portion of the world’s knowledge is available online. The danger is that as people come to believe that the web is the be-all and end-all source of information, the less they will consult or be willing to pay for the off-line materials that continue to comprise the bulk of the world’s knowledge, intellectual achievement, and cultural heritage. The outcome: the active base of knowledge used by students, experts, and ordinary people will shrink as a limited volume of information, mostly culled from older secondary sources, is recycled and recombined over and again online, leading to an intellectual dark-age of sorts. In this scenario, Wikipedia entries will continue to grow uncontrolled and unverified while specialized books, scholarly journals and the world’s treasure troves of still-barely-explored primary sources will gather dust. Present-day librarians, experts in the mining of information and the guidance of researchers, will disappear. Scholarly discourse will slow to a crawl while the rest of us leave our misconceptions unquestioned and the gaps in our knowledge unfilled.

The challenge is either – or both – to get more books, periodicals, and original source materials online or to prompt people to return to libraries while at the same time ensuring that libraries remain (or become) accessible. Both tasks are dauntingly expensive and, in the end, must be paid for, whether through taxes, grants, memberships, donations, or market-level or publicly-subsidized fees.

Lewis goes on to talk about the destruction of the National and University Library in Sarajevo, among other things. Read the whole thing.

But what particularly struck me was the first comment below the post.

I think you’re undervaluing the new primary sources going up online, and you’re undervaluing the new connections that are possible which parchment can’t compete with like this post I’m making to you. I definitely agree that there is a ton of great knowledge stored up in books and other offline sources, but people solve problems with the information they have, and in many communities – especially rural third world communities, offline sources are just as unreachable, if not more, than online sources.

This is a textbook example of how enthusiasts deal with criticism. (I’m not going to name the commenter, because I’m not picking on him personally.) It’s a reaction I’ve seen a lot in debates around Wikipedia, but I’m sure it goes back a lot further. I call it the “your criticism may be valid but” approach – it starts by formally conceding the criticism, thus avoiding the need to refute or even address it. Counter-arguments can then be deployed at will, giving the rhetorical effect of debate without necessarily addressing the original point. It’s a very persuasive style of argument.

In this case there are three main strategies. The criticism may be valid…

I think you’re undervaluing the new primary sources going up online

but (#1) things are getting better all the time, and soon it won’t be valid any more! (This is a very common argument among ‘social software’ fans. Say something critical about Wikipedia on a public forum, then start your stopwatch. See also Charlie Stross’s ‘High Frontier’ megathread.)

you’re undervaluing the new connections that are possible which parchment can’t compete with like this post I’m making to you. … in many communities – especially rural third world communities, offline sources are just as unreachable, if not more, than online sources

but (#2) you’re just looking at the negatives and ignoring the positives, and that’s wrong! Look at the positives, never mind the negatives! (Also very common out on the Web 2.0 frontier.)

I definitely agree that there is a ton of great knowledge stored up in books and other offline sources, but people solve problems with the information they have

but (#3) …hey, we get by, don’t we? Does it really matter all that much?

I’m not a fan of Richard Rorty, but I believe that communities have conversations, and that knowledge lives in those conversations (even if some of them are very slow conversations that have been serialised to paper over the decades). I also believe that knowledge comes in domains, and that each domain follows the shape of the overall cloud of knowledge constituted by a conversation. But I’ve been in enough specialised communities (Unix geeks, criminologists, folk singers, journalists…) to know that there’s a wall of ignorance and indifference around each domain; there probably has to be, if we’re not to keel over from too much perspective. Your stuff, you know about and you know that you don’t know all that much; you know you’re not an expert. Their stuff, well, you know enough; you know all you need to know, and anyway how complicated can it be?

Enthusiasts are good people to have around; they hoard the knowledge and keep the conversation going, even when there’s a bit of a lull. The trouble is, they tend to keep the wall of ignorance and apathy in place while they’re doing it. The moral is, if your question is about something just outside a particular domain of knowledge, don’t ask an enthusiast – they’ll tell you there’s nothing there. (Or: there’s something there now, but it won’t be there for long. Or: there’s something there, but look at all the great stuff we’ve got here!)
