Friday, 27 November 2009

Innovation and the creative industries

You know our stance on how Interactive Media is not really considered a sector in its own right and doesn't get picked up by government departments for statistical purposes – or many other purposes really. It boils down to where we get positioned in terms of being IT related or creative, as those denominations dictate the money allocation. Well, there might be a slow shift in the right direction of recognising we're a hybrid in the latest report from NESTA (National Endowment for Science, Technology and the Arts).

"Our understanding of innovation has changed: where once it was understood to be largely the result of scientific research and development, it is now seen more widely to include changes to services, ways of working and delivery, customer insight and many other forms."

Previously they have been preoccupied with film and games as representing our form of 'the creative industries'. This latest report – 'Measuring sectoral capability in nine areas of the UK economy' (26.11.2009) – looks at innovation in a newly-defined way and uses a correspondingly different analysis tool, the IVC (Innovation Value Chain). They chose Software and IT Services as one of their nine sectors and analysed it using their new way of describing innovation, which they call 'hidden innovation'. I'm not sure that they consider Interactive Media part of this Software and IT Services division, but it does include consultancy, data processing and database design. Getting there, maybe!

We won't be surprised by the results – but they have been.

The sector reported the highest levels of product and service innovation of any sector during 2006-9 – a key indicator for NESTA's IVC. Average levels of innovation capability were also relatively high compared with the other sectors in the report. They were surprised by the intensive team-working the sector uses to access knowledge. Ah! Cross-functional teams strike again!

Take a look if this snippet tempts you at

Saturday, 21 November 2009

Free the data!

We have grown used to the idea that information is freely available on the internet, and there are two news items this week that respectively demolish and reinforce that presumption.

Rupert Murdoch and News Corporation have argued for some time that newspapers should charge for access to their stories. Added to this is the stance that news aggregators should also pay to reproduce the headlines from stories on other (such as News Corp's) web sites.

Tom Foremski of the Silicon Valley Watcher has an interesting analysis of how this might play out and why the 'simple' technical tactic of blocking aggregators with a robots exclusion file has been ignored in favour of attacking through the media.
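For what it's worth, the robots exclusion tactic really is that simple to deploy. Here's a sketch (hypothetical site and crawler names, using Python's standard urllib.robotparser to play the part of a well-behaved aggregator):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one news crawler from the
# whole site while leaving everything open to everyone else.
ROBOTS_TXT = """\
User-agent: Googlebot-News
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A compliant aggregator checks before fetching a story:
print(parser.can_fetch("Googlebot-News", "/news/story.html"))  # False
print(parser.can_fetch("SomeOtherBot", "/news/story.html"))    # True
```

The catch, as Foremski points out, is that the file is only honoured voluntarily – which is perhaps why the battle is being fought in the press instead.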

I thought the thorny issue of whether taking headlines for links back to the original story infringed copyright had been settled years ago; it's certainly a common approach and is actively encouraged by RSS news feeds.
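Indeed, that headline-plus-link-back pattern is baked into the RSS format itself. A minimal sketch (made-up feed data), using only Python's standard library to extract exactly what an aggregator reproduces:

```python
import xml.etree.ElementTree as ET

# A minimal, made-up RSS 2.0 feed: each <item> pairs a headline
# with a link back to the original story.
RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Paper</title>
    <item>
      <title>Exclusive: a big story</title>
      <link>http://example.com/big-story</link>
    </item>
  </channel>
</rss>
"""

root = ET.fromstring(RSS)
for item in root.iter("item"):
    headline = item.findtext("title")
    link = item.findtext("link")
    print(headline, "->", link)
```

Every feed a newspaper publishes is, in effect, an invitation to do just this.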

Murdoch Snr seems to be accusing organisations – especially the BBC – of getting their news from the newspapers.

He is quoted in the Telegraph as saying "...most of their stuff is stolen from the newspapers now, and we’ll be suing them for copyright."

Oddly enough I see some resonance here, and not just with ITV's Michael Grade once saying that web sites that took broadcasters' content were 'parasites'. Broadcasters do scan the papers for stories and newspapers will regularly promote 'exclusive' stories into electronic media to build up interest in tomorrow's edition. TV coverage of the storm over Gordon Brown's sympathy letter was usually accompanied by extracts from an interview with the aggrieved mother which were brightly branded with the Sun logo. So the Murdochs (père et fils) must have a deeper concern, albeit one that I personally see brightly tagged 'self-interest' whenever the BBC is mentioned.

I doubt if many people think that traditional static-media-based notions of copyright can survive eternally in a connected world. I have shared the view of many that, as a classic example, Ordnance Survey's mapping data should be freely available. This includes the ability to map things to your location, usually by postcode. This isn't as simple as it might at first appear, because OS gets postcode data from the Royal Mail and then maps grid references onto it. These dual rights are one reason it has been convoluted and expensive to access such data.

The OpenSpace project has opened OS data up a lot, but it still has limitations controlling the number of times you can access the system, which has restricted large-scale use. Now commercial operations are starting to use the system, which uses an application program interface (API) to access the OS data. To top this, the UK Government has announced that OS mapping will be freely available online from 2010, although I have not as yet seen the fine print.

The scheme's advisors are Sir Tim Berners-Lee (who of course always comes with his own tag of inventor of the world wide web) and Professor Nigel Shadbolt. Professor Shadbolt has reinforced the use of this data for what he calls hyper-locality; finding out what relates to your own street or postcode. He is also quoted on the BBC Web Site as believing that "... OS maps are more comprehensive in their coverage than other open source competitors which are already freely available online". This presumably doesn't refer to Google Maps (although rights issues do seem to limit how you can interact with them for hyper-locality) but is certainly true of the open map sites that rely on users inputting information. Cartography and GIS are a lot more complicated than most of us think.

It'll be interesting to see how this pans out.

Friday, 13 November 2009

A blog by any other name ...

We have decided to take the plunge and open up the iProfessionals group even more. Since it doesn't seem to be possible to let anyone read the Yahoo group without also letting them post graffiti willy-nilly, and also to give us more flexibility, we are moving over to Google's Blogger system.

The new iProfessionals URL is although we will run the two in parallel for a while and copy recent posts over to the Blogger version ... and keep the archive intact.

Initially I have set the comments to be open to anyone who is logged in to Google, either with a Google account or an OpenID. So please feel free to chip in to any discussions.

If any of you would like to contribute new postings then I will happily add you to the blog authors list ... just drop me an email at You don't have to have anything to say now; this is just some forward planning.

Why is it theiprofessionals and not just iprofessionals? That one is already taken. You should check it out.

We won't make Technorati's top 100 blogs list just yet.

What do you think? Is this a step in the right direction? You can still set up an email notification if you want and you can even get an RSS feed of the posts and discussions. Going for Google's existing system was a pragmatic decision and I have already used Blogger for my Infrared 100 centenary site and am using Google Analytics and Adsense on some of my own web pages.

(Funny story: I put Adsense ... contextual advertising... onto my page about the BBC Domesday Project and initially it put up adverts for glass domes! It has since settled down to surprisingly relevant things like data backup and storage but I was rather amused by that first stab. It reminded me of a parsing system I once used which thought that Andes ... as in the mountain range ... was the plural of the word and.)

I am all in favour of using existing and even shared services if they're appropriate. You see this happening a lot in the consumer world, notably with even professional photographers using either Flickr or SmugMug rather than building (or having built) portfolio sites of their own. Are these the kinds of things that you should recommend to your clients? Well yes, if they do the job and the client's brand or reputation isn't diminished by it.

A client of mine is experimenting with one of the social networking recommendation systems called ShareThis, which gives an easy way to flag pages on things like Twitter, Delicious, StumbleUpon and even Blogger. It's too early to tell whether this will increase traffic, but it does add an extra option for analytics to your pages.

And then there is SideWiki. It's not a wiki ... more a Post-It note ... but it is on the side. Have you come across this yet?

Never mind what you may want to scribble on your web pages, this lets anyone scribble on them. You'll need IE or Firefox to try it out.


You can take this one of two ways: it is encouraging graffiti (back to that again) or it provides a way for experts to add insight to existing pages. More likely something in between. As a publisher (how grand that sounds) you can launch a pre-emptive strike by adding your own sidenote to your own pages, as I have done on the home page of Invisible Light.

It raises an interesting issue as well. I think that until now the content of a web page has had a well-defined publisher, but who is the publisher of a web page plus its SideWiki? Can they be joint if they have no real connection with each other? If the web page publisher does not activate SideWiki they may not even know that comments are being posted.

And finally ...

Getting examples is often difficult, especially understandable legal ones. I recently came across a web site called Chilling Effects which describes itself thus: "Chilling Effects aims to help you understand the protections that the First Amendment and intellectual property laws give to your online activities ... [and] encourages respect for intellectual property law, while frowning on its misuse to "chill" legitimate activity."

Obviously this collaboration between the Electronic Frontier Foundation and a group of law schools is based on US law (especially the First Amendment bit) but it still makes interesting reading and is an example of how legalese can be translated into people-speak.

For example, take the legal issues around web linking: there you will see that Chilling Effects' comments about inlining of images and framing do not seem to concur with Graham Smith's notion (in 'Internet Law and Regulation') of an 'implied licence'. When lawyers disagree, treasuries tremble!

Your thoughts on the iProfessionals blog please.

Friday, 6 November 2009

Social Network Marketing

There's still a lot of confusion as to how to make best use of the incredibly successful social networking sites for marketing. With Facebook now well in the lead in terms of numbers of users and time spent there (New Media Age, 'Social Circles', 5th November), and with a majority of US companies saying they will employ a social media specialist (2009 Digital Readiness Report, Essential Online Public Relations and Marketing Skills), the social network trend is enticing mainstream business.

But it seems that brands are having a rough time working out what interactive users want. Since social sites favour word-of-mouth and tips from trusted others, some brands have fallen foul of this when those trusted voices haven't liked their offerings. Bad news spreads virally just as fast as – if not faster than – good news!

Interactive users also want input into shaping the ideas that they might then promote. The 'active' user's profile is very different from the passive user's profile of static communication channels. Maybe new profiling techniques can move us towards a better understanding of these active users. It's an emerging field full of landmines, so tread carefully.

See the CMO's third online survey results about future market trends at:
and the 2009 Digital Readiness Report at: