By now, you might have heard that Google is being sued by an association of newspapers in Europe known as Copiepresse. The reason? The plaintiffs think it's great that Google has indexed their news stories in the regular search engine. But as soon as Google built a news portal out of the stories of several thousand newspapers that publish their work on the web, Copiepresse decided to take action. Here are a few links to get you going:
Greed is Suicide for Belgian Press
Copiepresse Upset Ruling On Google Wasn't Visible Enough
Here they go after Microsoft, too.
And then there's The Register with its pointed commentary.
One thing that really bothers me about this suit: they want to be paid for indexing. To paraphrase: "we don't mind being in their search engine, but when they present themselves as a news portal with our news, we have a problem." I guess they're upset that Google doesn't show any advertising on its news portal. It seems schizophrenic. They want it, but they don't. They can't really decide. Google has already stated flat out that it doesn't pay for news. Perhaps Google is in a better position to bargain than Copiepresse. You think?
My fear is that news organizations will use this to force search engines to show only news stories that are politically acceptable. Yes, that's right. I think it is yet another veiled form of censorship. They just can't wait for us to accept their own custom utopia (you know, the Road to Serfdom). Better that we don't get any countervailing news. Never mind that the purpose of journalism is to question the prevailing wisdom.
Of course, the article in The Register, like many others, points to a standard that is more than a decade old: robots.txt. This is just a tiny text file letting robots (and people) know whether they may crawl and index a site's pages; a related meta tag, noarchive, is what tells a search engine not to cache a copy of a page on its own servers. Copiepresse refuses to use these tools. They believe that if they exclude robots from their website, they will be excluded from the search results, and that if they don't, they will be included both in the search results and in news portals such as Google's. They believe there is a better option than choosing between life and death. There must be a middle ground.
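To see just how simple the tool they refuse to use is, here's a sketch using Python's standard urllib.robotparser module (the site and URL are made up for illustration):

```python
from urllib import robotparser

# A two-line robots.txt that excludes every robot from the whole site.
# A publisher wanting out of Google News could have served exactly this.
rules = """User-agent: *
Disallow: /"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler checks before fetching anything.
print(rp.can_fetch("Googlebot", "http://example-paper.be/news/story.html"))  # False
```

That's the entire mechanism: two lines of text, and any robot that honors the convention stays out.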
Well, I don't know. They want to be in search engines but they don't want to be in a news portal that drives traffic to their sites. Lithium anyone?
My hope is that if news associations decide they're the final arbiters of what search engines can show and what can be included in search indexes, this will give rise to a large wave of people who want to get the news out independently. Bloggers will fill the void like never before. I don't think bloggers will mind showing up on the news portals.
Copiepresse is being irrational. Copiepresse got what they wanted - Google stopped indexing. But now they are upset because they've been shut out. Oh, you can still find them if you already know their website address. And then maybe you can search their site with their own local search engine. But they have already noticed a significant decrease in traffic to their sites. Get ready for their screams of bloody murder when Yahoo and Microsoft avoid them like the plague, too.
A similar lawsuit was already tried here in the US. The plaintiff was found to have intentionally withheld the robots.txt tool so that search engines would crawl his site with their robots; he had planned to sue for copyright infringement all along. The court found in favor of the defendant, reasoning that the plaintiff could have used the robots tool but refused to do so. Something about the doctrine of unclean hands.
One other thing they don't seem to anticipate: the search companies could get together and start de-listing Copiepresse members wholesale. Perhaps then Copiepresse might see reason. When that happens, there will be another lawsuit forcing the search engines to give them equal time. Not.
Censorship is already a problem as it is. Why can't these nuts just be happy with the way things are?