It's no longer news that Google's Panda algorithm, designed to produce better search results, hasn't actually done much to improve their quality. Instead, it demoted good content sites that it judged not "quality" enough. Here is what a quality site should be like… <read>.
But I couldn't resist sharing an example: the search results page (below) for "useful google reader shortcuts" and similar techie queries.
As you can see above, Lifehacker's post about a Chrome extension shows up three times on the first page of the SERPs, at the 5th, 6th, and 8th positions.
The 5th and 6th results point to the same permalink for that post (surprising that the same page can be indexed twice).
The 8th result is a cross-post (a duplicate) on their .com.au domain. Clearly, duplicate content doesn't seem to be a problem here.
I feel sad for Hacktrix's post at the 7th position, which is actually a better and more relevant result for "useful google reader shortcuts", yet it's lost amidst the three redundant articles from Lifehacker domains.
There are several examples like these… So I guess that <the post> about content and quality, specifically "Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?", is not meant for all sites. Especially not for the ones with a lot of authority in their niche. But the same article three times in the search results?