
Another beautiful infographic with glaring errors goes viral

Interesting data – even presented as ‘The Science behind Wikipedia’s Jimmy appeal’ by Information is Beautiful writer David McCandless at his own site here:

This visualization sets out to show just how effective it is to use a personal, dating-site-style ad to appeal for funds. Very effective indeed, as the big blue box with the mugshot tells us.

Nice of David McCandless to include a source for his work. I like to check up on facts when I’m surprised by them. Unfortunately for David McCandless, his source lists another source – an actual spreadsheet with numbers.

Spend five minutes on the spreadsheet and you’ll realize that this visualization is anything but ‘the science behind’.

Error number 1: The banner the visualization sets out to explain is not the one that was tested. The visualization has vectorized a mirrored version of the wrong mugshot.

The banners tested on October 26, 2010 were these four:

[Images: the four tested banners]

Rather more than the message differs between these banners, don’t you think? This very visual fact is completely left out of the visualization. And the banner tested is not the one that eventually ended up on Wikipedia. In the final version Jimmy Wales has lost his rugged NGO look and gained a light blue shirt and a more CEO-like attitude:

[Image: the final banner]


Error number 2: The visualization does not take into account that the number of impressions heavily favors the Jimmy appeal (by a factor of 3).

But it gets much more serious. Look at the spreadsheet again: each banner was set to show up on 10% of pageviews, but for some reason two versions of the Jimmy appeal were tested (linking to different landing pages), making it show up 20% of the time. The first three banners were also tested for two hours, the last two for three hours. It all adds up in favor of the image-driven Jimmy appeal – as the quick per-impression calculation after the list below shows.

Total impressions for each ad:

Brain Massage: 1,328,092

Depend on you: 1,324,894

Admit it: 1,326,289

Jimmy appeal: 4,076,426
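
To see how much the raw totals flatter the Jimmy appeal, a fairer comparison is donations per impression. Here is a minimal sketch in Python, assuming the donation totals from the spreadsheet pair with the banners in the order they appear in this post – that pairing is my assumption and worth checking against the spreadsheet itself:

```python
# Normalize the spreadsheet donation totals by impressions instead of
# comparing raw totals. NOTE: the pairing of donation totals to banners
# below is an assumption (it follows the order used in this post).

banners = {
    "Brain Massage": {"impressions": 1_328_092, "donations": 582.21},
    "Depend on you": {"impressions": 1_324_894, "donations": 2_901.08},
    "Admit it":      {"impressions": 1_326_289, "donations": 2_992.91},
    "Jimmy appeal":  {"impressions": 4_076_426, "donations": 47_433.28},
}

for name, d in banners.items():
    # Donations raised per million impressions served
    per_million = d["donations"] / d["impressions"] * 1_000_000
    print(f"{name:<14} ${per_million:,.0f} per million impressions")
```

If that pairing is right, the Jimmy appeal still raises roughly five times as much per impression as the best text ad – so the headline finding survives, but the raw totals in the visualization inflate it by the factor-3 difference in impressions.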


Error number 3: The visualization wrongly assumes that the test run on Oct. 26 took place all day.

The timescale at the top of the graphic is also completely wrong: ‘Effectiveness of donation $ per day’, we’re told. No – the tests ran for hours, not days.

(Amazingly big business for such an appeal, by the way.)

Weird fact number 4: Why are the numbers rounded off for only the first three ads?

Exact numbers – first the number in the spreadsheet followed by the number in the visualization:

$582.21 = $580

$2,901.08 = $2,900

$2,992.91 = $3,000

$47,433.28 = $47,433

I’m all for rounding off big numbers – but it should be done consistently. Otherwise it looks like some kind of manipulation to drive home a point – and it does look more ‘effective’ and data-like to show all five digits of $47,433, doesn’t it?
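
For the record, consistent rounding is a one-liner. A small sketch in Python, assuming ‘consistent’ means the same number of significant figures for every total (my reading, not necessarily McCandless’s):

```python
from math import floor, log10

def round_sig(x: float, sig: int = 2) -> float:
    """Round x to `sig` significant figures."""
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

# The four donation totals from the spreadsheet
totals = [582.21, 2_901.08, 2_992.91, 47_433.28]
print([round_sig(t) for t in totals])  # [580.0, 2900.0, 3000.0, 47000.0]
```

Rounded that way, the Jimmy appeal would read $47,000 – no less impressive, and nobody could suspect the figures of being dressed up.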

Conclusion: The story is correct. The Jimmy appeal is a lot more effective than the text-based ads. But the visualization does not show why and how.

So why do I rant so heavily about a harmless piece of visualization? Why don’t I mind my own business and let David McCandless mind his?

Well, I just gave a presentation for the Danish Union of Journalists, telling them that the current boom in infographics – beautiful infographics, too – has one major flaw: they’re apparently not rooted in a passion for telling the true story, and the research too often isn’t good enough.

This one is just another example among many. As it is already making the rounds with 237 retweets and counting, I would like to grab the opportunity to get more people interested in checking the facts behind the figures – and, best of all, to ask the visual journalists visiting this site to pay attention to the research before doing the first sketch for a great visualization. (Amen!)