There has been a lot of talk recently about content farms showing up on search engines like Google and Bing. This is particularly noticeable if you are a developer. Certain sites harvest StackOverflow content, add more keywords, add meta tags, etc., and somehow appear higher than StackOverflow itself in search results. Google has said that it is improving its algorithm to make sure these sites rank lower than the site with the original content. Blekko has actually started to block these sites entirely. While these are nice efforts, one thing has always confused me about search engines. Why can’t I set up a custom list of sites that I never want to see in search results?

When was the last time one of these scraper sites ever helped you solve a problem? When was the last time you saw one in a search result? I am guessing the answer to the first question is seldom if ever, and the answer to the second is sometime this week.

This is my request to Google, Bing, etc. Allow me to specify a list of domains that are always blocked from my search results. This will improve the user experience for me immediately and provide the search engines with valuable data about sites users don’t want to see. It seems to me that everybody wins.
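The feature I am asking for is simple enough to sketch. Here is a minimal, purely illustrative example in Python of what a per-user domain blocklist could look like on the result-filtering side; the domain names and function names are hypothetical, not any search engine's real API:

```python
from urllib.parse import urlparse

# Hypothetical user-defined blocklist of domains the user never
# wants to see in search results.
BLOCKED_DOMAINS = {"example-scraper.com", "content-farm.example"}

def is_blocked(url: str, blocked: set = BLOCKED_DOMAINS) -> bool:
    """Return True if the URL's host is a blocked domain or a subdomain of one."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in blocked)

def filter_results(urls: list) -> list:
    """Drop any result whose domain appears on the user's blocklist."""
    return [u for u in urls if not is_blocked(u)]

results = [
    "https://stackoverflow.com/questions/12345",
    "https://www.example-scraper.com/questions/12345",
]
print(filter_results(results))  # only the stackoverflow.com result survives
```

The interesting part is not the filtering, which is trivial, but the aggregate signal: every domain that shows up on many users' blocklists is a strong hint to the ranking algorithm.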