Thoughts from MuseumNext

Last week I was at MuseumNext in Geneva. It’s an international conference on the future of museums and has been going for a little while now. I’ve had my eye on it for some time – what’s not to love about visiting European cities to hear interesting people talk about the subjects you find fascinating? Nothing, that’s what.

This year I had an excuse to attend – I was presenting some work I did for the V&A at the end of last year. A full write-up of that is coming, but you can check The Museum Dashboard on Slideshare in the meantime, with some extra notes on the One Further website.

Thoughts from the conference

I’ve collected lots of round-ups, notes and slides on CulturalDigital, so feel free to dig into those. Here’s my contribution.

Oh, and MuseumNext isn’t just a conference about digital tech/media in museums, but that’s my interest, so that’s where I’m coming from here. Just so you know.

1. How deep does digital transformation go?

There are lots of organisations doing really interesting digital work, but on a few occasions I was left wondering just how deeply rooted it is. Good work seems to be the product of good people pushing things through. If those people were to leave then would the work continue?

Related thought: just how good are museums (and I’d extend this over to the rest of the cultural sector) at attracting and retaining talent in digital teams? I missed the session on Making the most of your people but from what I heard it was very well attended.

Related fact: almost everyone seems to be looking for a new job.

2. Outsider approaches

Projects like Museum Hack and Invasioni Digitali are coming in and doing a great job of sparking excitement and interest in museums. Are they making up for a failing on the part of the museums themselves, or do they exist because they’re being welcomed into institutions that serve as foundations for all sorts of different experiences? Either way, good stuff.

3. New ways of working

Not so long ago I was talking to someone about ways that cultural organisations might be picking up on processes and methodologies like agile, lean, user-focused design and so on. We’d both seen digital agencies (often very close to emerging trends) influencing their contacts in an organisation’s digital team, with that team then passing on knowledge to their colleagues. So that’s one way.

Interesting to see that a couple of former heads of digital at major US institutions have recently turned consultant and are teaching this kind of stuff.

Coming back to MuseumNext, workshops on agile development and customer journey mapping might seem like slightly curious things to have at a conference. However, they were really well run and got rave reviews. Just goes to show.

MuseumNext - Southbank Centre

4. Sharing stuff

MuseumNext attracts a progressive bunch (as you’d expect, given the name) and the instinct towards sharing and building on each other’s work comes across strongly. By and large case studies were more ‘This is what we did, here are some pointers if you’re thinking of doing something similar’, rather than ‘This is what we did – applaud our brilliance’.

The Southbank Centre’s open website approach typifies this. It could be summarised as ‘Here’s what we’re doing, here’s our documentation, code and design patterns, and let’s collaborate on parts of it where we can. Is there anything else we can give you?’.

Thoughts about the conference

I tend to find that museum conferences, as opposed to the visual/performing arts ones I usually attend, have quite an academic feel to them. Not so with MuseumNext. The subject matter is no less substantial, but it feels lighter and more approachable. Maybe that’s because the organisers have a visual comms background. Hmm.

The attendee list was excellent. In fact, I’d say that’s one of the great strengths of this conference. It was a treat to spend a few days talking to smart, friendly people about interesting things with no need to explain as you go along.

The experience of speaking really couldn’t have been much better either. The tech setup was simple and seamless and the audience was receptive and asked some really smart questions afterwards.

In the interests of balance, I should say that it wasn’t all amazing. What is? There was some sketchy wifi, venues that were a little too spread out (resulting in extended breaks and long sessions that didn’t always justify their runtime) and the perfect storm of conference no-no’s that resulted in a session called ‘The Sociable Museum’ which… well, I’ll leave it there.

None of which really mattered. I came away a bit smarter and knowing many more interesting people, which was exactly what I was looking for.

 

Shareable exhibitions

A quick observation.

A while back I remember talking to someone about how, one day, promoters might start taking things like Twitter followers into account when booking bands and DJs. It wasn’t a huge leap to make – MySpace must’ve been getting close to its peak and the media were talking up how large online fanbases had propelled Lily Allen, Arctic Monkeys and others to success.

Fast forward a bit and we’ve got YouTubers all over the bestseller lists, on the radio and on tour, a Vine star with a (cancelled) TV show, and a Twitter comedian with a C4 TV show. There are presumably plenty of Facebook and Instagram stars out there that I’m missing.

Anyway. Point is that being able to attract an audience has always been a good thing.

A tale of two exhibitions

I went to Paris last weekend (it was lovely, thanks) and while I was there I went to a couple of exhibitions. One was Lumières : carte blanche à Christian Lacroix at the Musée Cognacq-Jay, the other was Olafur Eliasson – Contact at the Fondation Louis Vuitton. They felt like very different beasts.

The Musée Cognacq-Jay exhibition was interesting enough. The museum houses a collection of 17th-century objects and has recently re-opened after renovation, with a little stardust applied by guest curator, Christian Lacroix. That’s a name that’ll make ears perk up and will garner some extra attention. While you’re at the exhibition you wander about quietly, maybe download the rather duff app and generally have a nice enough time of it.

On the other hand, the Olafur Eliasson exhibition at the FLV was extraordinary, and very popular too. Sure, the guy’s got an excellent reputation, but how many people would recognise the name? (He did the big sun in the Tate’s Turbine Hall.)

I didn’t see anything saying you could or couldn’t take photos (there was maybe a mention of the #EliassonFLV hashtag somewhere), but the immersive light installations – all striking patterns and mirrors – made it practically impossible to take a bad photo or an uninteresting selfie. That meant a lot of people taking photos and videos of themselves, each other and the exhibition (probably in that order of importance).

The result of that is a lot of stuff being shared across Facebook, Instagram and Twitter, resulting in a tonne of digital word of mouth. You can see a snapshot of this activity at http://olafureliasson.net/eliassonflv/.

Formats

I’ve been doing a lot of pondering recently around the question of formats that fit with patterns of digital dissemination (more to come on this). Olafur Eliasson’s exhibition does a great job of making itself shareable by creating an experience that involves visitors (and wouldn’t work nearly so well without their participation) and letting them do their thing.

There’s no digital aspect of this exhibition that’s been tacked on as an afterthought (hashtag aside). Instead, the aspects that appeal to social media-using visitors are an intrinsic part of the art and experience. Is that by luck or design? Who knows.

Of course, not every exhibition needs to be immersive, interactive and participative (at least, I don’t think so…). But it did make me wonder whether artists, curators, artistic programmers, exhibition designers and the like are going to start taking this sort of thing into account when designing and selecting what will go on show. If they’re not already.

Improving the digital metrics Arts Council England collects from funded organisations

I’ve complained before about the digital metrics that Arts Council England collects as part of the annual review. However, I’m aware that that’s not particularly constructive. What I should do is explain why I think the metrics are useless and provide some sort of solution. In fact, a few people have asked me about this and I’ve only ever given half answers.

Here’s an attempt to answer the question properly.

Some background

Arts Council England asks their National Portfolio Organisations to submit an annual report with all sorts of information and figures. As part of that they’re required to provide some digital metrics.

Here’s what they ask for (photo via @SamScottWood):

Arts Council NPO web metrics

On the face of it, collecting this type of aggregate data is a good thing. As a major funder, ACE is in a position to do two very powerful things:

  1. Collect data that can inform policy and produce insights that can benefit the sector.
  2. Influence the behaviour of their portfolio organisations by making them take their online presence seriously.

However…

What’s the problem?

You may have spotted the answer to this when you saw the screenshot above. In a nutshell, those questions are rubbish.

If you’re not clear on why that is, then no worries – I’ll explain that below (at some length, as it turns out). For now, just know that the upshot is:

  • There is approximately zero chance of the data resulting in any useful insights that can drive sector-wide improvements or inform policy.
  • Busy people at NPOs are having their time wasted. Some aren’t aware of this, but the more digitally savvy ones know this only too well. Especially as plenty of them report on digital metrics internally and will have come up with figures that actually mean something.
  • It promotes the idea that ACE don’t have a clue what they’re doing. Which is a problem, and possibly not even true.

On that last point, I’ve spoken to people who suspect that the current questions have been put down as a marker, intended to start a conversation about how things should be done. If true then it’s working – this post is proof of that – so well done. Personally, I’m not convinced. Knowingly wasting people’s precious time and making yourself look clueless is not a smart way to start a productive conversation.

I’m actually pretty sure the folks at ACE know that the information they’re collecting is rubbish. I’ve heard that plans are afoot to improve things, so hopefully this post will be outdated before too long.

For now, let’s go back to those questions and take a look at what exactly is so wrong with them.

The short version

  • It’s not possible for the organisations to collect some of the information requested.
  • The data doesn’t provide a sensible basis for comparison between NPOs.
  • The data isn’t particularly useful in the aggregate, either.
  • Some of the metrics are far too easy to game.
  • The data is likely to be meaningless and unusable.

The long version

I’m going to pick through the questions one at a time.

  • Number of unique browsers. This is the total number of unique devices (e.g. computers or mobile phones) that have made requests to the site in the period being measured.

This is an odd question for several reasons. For starters, browsers are not at all equivalent to devices (especially on a mobile or tablet), but that’s not my main beef with this question. My main issue is that it’s not something anyone in their right mind would ever ask.

It’s what happens when you let your thinking be guided by the menu of metrics offered by something like Google Analytics. I bet what ACE would really like to know is ‘How many different individuals have visited the website?’. A fine question, but unfortunately not one that a website analytics package can answer reliably, due to the near impossibility of tracking users across multiple browsers and devices.

Instead of recognising this and asking an answerable question, they’ve gone for it anyway and still ended up falling short.

  • Number of page impressions. This is the total number of requests (e.g. mouse clicks) made for a site’s content by users of the site (i.e. unique devices) in the period being measured.

The wording of this one is all over the place. Mouse clicks do not equal page views. Users of the site do not equal unique devices (ever visited the same website on your laptop and your phone? Or borrowed someone else’s? Of course you have).

Page impressions are a useless metric anyway. A high number could indicate:

  • a haphazard user experience, with people getting lost and clicking around desperately; or
  • lots of highly engaged users who want to devour everything your website can give them.

If high numbers of pageviews are good, then websites that are happy to disregard user experience in favour of reporting ‘good news’ (to gullible advertisers or funders) can game this quite easily. With this kind of thing you have to think about the effect of your incentives – if anyone took these questions seriously we’d be in all sorts of a mess.

Also bear in mind that websites that have a large transactional element will typically require ticket buyers to view more pages as they select seats, view their basket, log in, add payment and delivery details and check out. That’s extra pageviews that an otherwise equivalent NPO won’t be able to count on.

Caveat time – pageviews might be useful if you’re estimating traffic load so you can provision servers properly. Even then, you’re still going to need to know more about the traffic profile. They might also be useful if you’re changing the site to improve navigation in some way. I’m pretty sure neither of those fits ACE’s reasons for collecting the data. I’m not really convinced they have a reason.

  • Number of visits. A visit is a single period of activity by a unique browser.

I can see why you’d ask for this – it’s every beginner’s favourite metric – but, without some context, what does it actually reveal? That Website A has more visitors than Website B? Fine, but so what? By the way, ‘So what?’ is a really useful follow-up question when it comes to digital metrics. If you can’t answer ‘Well, it means that we should…’ then your metric could well be meaningless.

For instance, if I were to tell you that Website A has had 2 million visits in the past 12 months and Website B has had 10 million then what does that tell you? Without context, it tells you nothing.

To labour the point, if I threw in any of these extra factoids:

  • Website A does four shows a year; Website B presents 1,000.
  • Website A spends £10,000/year on online advertising; Website B spends £200,000/year.
  • Website A has a bounce rate of 30%; Website B has a bounce rate of 55%.
  • Website A is for a touring theatre company and tickets are sold via third party venues; Website B belongs to an arts venue with its own ticketing system.
  • Website A costs £25,000/year to maintain; Website B costs £250,000/year.

…then it would completely change how you viewed those numbers of sessions.
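To make that last factoid concrete, here’s a quick back-of-the-envelope comparison (a sketch using only the illustrative figures above – no real organisations involved):

```python
# Illustrative figures from the examples above – not real organisations.
site_a = {"visits": 2_000_000, "annual_cost": 25_000}     # £ per year
site_b = {"visits": 10_000_000, "annual_cost": 250_000}   # £ per year

for name, site in (("Website A", site_a), ("Website B", site_b)):
    cost_per_visit = site["annual_cost"] / site["visits"]
    print(f"{name}: £{cost_per_visit:.4f} per visit")

# Website A: £0.0125 per visit
# Website B: £0.0250 per visit
# The site with a fifth of the traffic suddenly looks twice as cost-effective.
```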

The wider point is, if you’re focussing on sessions with no context then you’re flying blind and would be considered a rank amateur in analytics circles.

  • How much time have visitors spent on your organisation’s website (in minutes and seconds)?

Again, strangely worded. If you wanted you could easily interpret that as being a request for number of sessions x average time on site. But that’d be ridiculous.

More importantly, anyone using Google Analytics won’t be able to tell you this. Not unless they’ve tweaked their setup in a very specific way, and what are the chances of that? Yet again, it’s hard to know why ACE needs to know this or what sense to make of any particular figure – it’s the same problem as with pageviews.
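For what it’s worth, the closest most organisations could get with an out-of-the-box Google Analytics setup is a rough approximation along these lines (a sketch only – the figures are made up, and bear in mind that GA’s time measurements have well-known blind spots, such as not being able to time the last page of a visit):

```python
# Rough approximation of 'total time spent on site' from two standard
# Google Analytics figures. The numbers here are made up for illustration.
sessions = 120_000                  # ga:sessions for the period
avg_session_duration_secs = 95.0    # ga:avgSessionDuration, in seconds

total_seconds = int(sessions * avg_session_duration_secs)
minutes, seconds = divmod(total_seconds, 60)
print(f"{minutes} minutes {seconds} seconds")  # the format the ACE form asks for
```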

  • Does your website have specific content for children and young people aged 0-19 years and/or teachers?

It’s not exactly a high bar to pass, is it? I can see why you’d ask, but does anyone really answer ‘no’ to this one? What does content for young people aged 0 look like, anyway?

How to improve matters

I’ve gone on a bit there, but hopefully I’ve shown that there’s plenty of room for improvement. Here’s where I start being more constructive and stick my neck out a bit.

Off the top of my head, I think there are four options:

  1. Keep things as they are.
  2. Do nothing.
  3. Collect better data.
  4. Do something totally different.

There may well be others. If anything occurs to you then please do let me know. In the meantime, let’s pick through what I’ve come up with.

Option 1. Keep things as they are.

Just because it’s a time-wasting farce, that doesn’t mean it has to stop. After all, it’s pretty minor in the grand scheme of things and surely not the worst example of the genre. At least it means organisations dip into their website analytics once a year, so maybe some good comes of it.

I don’t like this option.

Option 2. Do nothing.

As the (variously attributed) saying goes:

Better to remain silent and be thought a fool than to speak and to remove all doubt.

The current questions are useless and working out what the right questions are might take a while. In the meantime, why waste everybody’s time? If you can’t get it right, at least don’t get it wrong.

I don’t like this option either, but I think it’s preferable to Option 1.

Option 3. Collect better data.

This is more like it. The central problem with the current questions is that they’re trying to find metrics that are applicable to a very broad group of organisations. I believe that’s a fool’s errand. For the metrics to make any sense they really need to allow for more detail.

One way to do this would be to split website functions into categories. Perhaps something like:

  • Non-transactional brochure/portfolio.
  • Transactional/ecommerce.
  • Artistic content (interaction/consumption).

You’d then ask for metrics that are relevant to each category.
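To give a flavour of what that might look like, here’s a purely illustrative sketch – the metric suggestions are mine, not anything ACE has proposed:

```python
# Purely illustrative mapping of website categories to the sorts of metrics
# that might make sense for each. These are my own suggestions, not ACE's.
SUGGESTED_METRICS = {
    "non-transactional brochure/portfolio": [
        "visits to 'plan your visit' or opening-hours pages",
        "enquiries, bookings or sign-ups generated",
    ],
    "transactional/ecommerce": [
        "conversion rate (visits resulting in a purchase)",
        "online ticket and shop revenue",
        "checkout abandonment rate",
    ],
    "artistic content (interaction/consumption)": [
        "plays/views of commissioned content",
        "average watch, listen or read time",
        "shares and repeat visits",
    ],
}

for category, metrics in SUGGESTED_METRICS.items():
    print(f"{category}: {', '.join(metrics)}")
```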

Of course, some sites will have elements of all three. Some will concentrate on just the one. Picking the right metrics for each element might still be tricky, but it’d be much more manageable. The categories might even provide some basis for meaningful comparisons between the organisations.

Having to identify those sorts of metrics might also spook the more recalcitrant organisations into putting some effort into seeing how their websites are working for them. Instead of just logging into Google Analytics once a year and heading straight for the Audience Overview tab, they’d be forced to look at figures that actually tell them something.

It’s not a perfect solution but, done mostly right, I think it’d be better than what there is at the moment.

Option 4. Do something totally different.

Maybe the whole matter of digital metrics is a red herring. Maybe it’s not a question of collecting data after all.

Let’s take a step back for a sec and ask two important questions:

  1. Why is the Arts Council asking these questions in the first place?
  2. What does it really need to know?

Here’s a hypothesis. Maybe ACE is asking their NPOs for website stats as a way of making sure that those organisations are taking their digital presence seriously. If that’s the case then it doesn’t really matter that the questions are rubbish and the resulting data is meaningless, as long as they ask something. And as long as NPOs can provide some sort of answer, it shows that they too are doing something.

If so, the answer to the second question is pretty clear. The Arts Council just needs to know that the various organisations have some sort of online presence, are paying attention to how it’s performing, and are taking some appropriate steps to develop it.

In which case, why not junk the whole question of metrics and ask some better questions? For instance, any or all of the following:

  • Ask organisations to report on what they’ve considered with regards to their online presence.
  • Consider adopting something along the lines of the online analytics maturity model. Come to think of it, the Collections Trust recently put together something similar-ish with support from ACE.
  • Ask the NPOs whether they have a digital strategy. Get them to state what progress they’ve made towards putting it into practice. I know digital strategies are much maligned (“But it should be woven into the overall organisational strategy!” I hear you cry) but it’s a tool that has uses and the process of putting one together can be valuable.

I’m sure there are others along these lines – I’d love to hear them.

A final thought about effort

I think it’s important to remember that the Arts Council’s NPOs aren’t all blessed with endless resources. Even the bigger ones are somewhat stretched, and increasing their reporting requirements unduly would be a bad thing. That’s one of the reasons I prefer Option 2 to Option 1.

That said, I like to encourage people to think of their website as they would an employee. An incredibly productive employee that works 24/7 for the benefit of multiple departments, is often the first person that an audience member will meet, who spans international boundaries and who drives costs down and revenue up.

Employees have reviews. Their progress is monitored and assessed. By the same token, I think it’s reasonable to do that with a website, no matter the size of the organisation.

It’s also worth pointing out that using a form to collect analytics data feels a tad anachronistic. My last Arts Analytics post showed that 99% of my sample group (representing about a sixth of all NPOs) have Google Analytics on their websites. If reporting requirements were increased as per my Option 3, then it really wouldn’t be too tricky to create a tool that automated that data collection. Just a thought.
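To show roughly what such a tool could look like, here’s a minimal sketch. It assumes each NPO has granted a reporting service account read-only access to their Google Analytics view, and it uses the standard Python client for the Core Reporting API; the key file name and view IDs are hypothetical placeholders:

```python
# Minimal sketch: pull a few basic figures from Google Analytics for a list
# of NPO websites. Assumes each organisation has granted read access to a
# service account. The key file and view IDs below are placeholders.
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
KEY_FILE = "ace-reporting-service-account.json"  # hypothetical credentials file

NPO_VIEWS = {
    "Example Theatre": "12345678",   # hypothetical GA view IDs
    "Example Gallery": "87654321",
}

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE, SCOPES)
analytics = build("analytics", "v3", credentials=credentials)

for npo, view_id in NPO_VIEWS.items():
    response = analytics.data().ga().get(
        ids="ga:" + view_id,
        start_date="365daysAgo",
        end_date="yesterday",
        metrics="ga:sessions,ga:pageviews,ga:users,ga:avgSessionDuration",
    ).execute()
    print(npo, response["totalsForAllResults"])
```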

Conclusion

Well done for making it this far. For that you deserve a conclusion with a solid opinion. Unfortunately, I used to be a lawyer, so everything has to conclude with ‘it depends’. In this case, I think that:

  • If ACE wants meaningful data that they can use for improvements and in advocacy they should go for Option 3.
  • If ACE just wants to make sure arts organisations are keeping their eye on the ball when it comes to their online presence then they should go for Option 4.
  • If they don’t want to do either of the above then they should do everyone a favour and go for Option 2.

I’d love to hear your thoughts and see how you might build on this or approach it differently. What do you think – am I wide of the mark? Getting close to something?