
Silver lining on the copyright cloud?

Sunday, 11 November 2007


Jonathan Richards
Will it be the silver lining media companies have been hoping for?
There has been a lot of floundering about how content owners should protect their material from being copied illegally and shared on the internet.
A broad consensus is emerging that some type of filtering technology - where a digital 'fingerprint' of a piece of content is taken and cross-checked against content uploaded to sites like YouTube - is a good idea.
And while significant legal bickering remains over where responsibility should lie for seeking out unauthorised content, a common view is at least emerging that the technology should be an industry standard and work across sites.
The unveiling of Attributor.com, a content-tracking site, takes the debate on and focuses attention on the next question, which is how a media company manages the revenue when its content is consumed - on a range of sites - across the web.
Attributor's service, to which the likes of Reuters have already signed up, lets companies monitor which sites are using their content and, most importantly, which aren't offering any acknowledgement of the owner.
The site constantly scours an index of more than 15 billion pages (100 million pages are added each day) and reports back to customers with a range of data: the extent of the copying (from 'extensive' to 'just some'), whether a site links back to the content owner, whether the site is generating advertising revenue and, if it is, which advertising platform is serving its ads.
The idea is that an owner can then decide either to 'go after' the offending site and ask it to take the content down or, if it wishes, take the matter up with the company serving the site's ads - Google, say - and have the site banned from the platform.
"Often, all they want is a link," Jim Brook, chief executive of Attributor, told TechCrunch, referring to the fact that in many instances, what will be most valuable to a content owner is a link back to their site, because that will drive up their rankings in search. (Links being the golden currency of the web.)
But other possibilities can also be foreseen - namely, that the site will be forced to share a portion of its advertising revenues with the content owner.
Reuters, which has been involved in testing the site over the past six months, said that tracking unauthorised content was one of the "side benefits" of the service, and that the principal goal was to gather intelligence on how customers were using its content.
But Ric Camacho, a vice president of digital syndication at the company, acknowledged that if the tracking service offered a chance to find sites that were using Reuters content without consent and introduce them to any of the "number of models" for accessing such content, then that would be a good thing.
"We'd view it as a sales lead," Mr Camacho said.
Last week, Tom Curley, president and chief executive of Associated Press said that media owners needed "dynamic new distribution plans" for their content on the web, but cautioned they should "couple" such initiatives with "strong new efforts to protect news web sites from unauthorised scraping through tighter site protocols and content tagging."
What Attributor - whose service will initially be text-based only (there are plans to introduce photo and video in the coming months) - suggests is that media companies may now finally be able to harness the technologies of the web, long their enemy, to work for them for a change. And that the question 'How do we stop this from happening?' is being replaced, whisper it, with 'How can we make this work for us?'
Timesonline