
While browsing around a bit I read that a pretty nasty hoax is going around in Germany that tries to trick webmasters into banning search engines from their websites. The e-mail claims millions of web servers are being attacked by viruses and that you can protect yourself by creating a robots.txt file with a couple of lines of code that supposedly forbid the virus from visiting your website.

The e-mail advises you to create a robots.txt file in the root folder of your site with the following code:

User-agent: *
Disallow: /

In reality this code tells every search engine, such as Google or Yahoo, that it isn't welcome on your site. It does nothing to stop viruses, which simply ignore robots.txt. Any webmaster who falls for this will see their search engine traffic dry up, as the search engines will no longer index their site(s).
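You can verify the effect of those two lines yourself. A minimal sketch using Python's standard-library robots.txt parser (the `example.com` URLs and crawler names are just illustrative):

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the exact rules the hoax e-mail tells you to publish.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# "User-agent: *" matches every crawler, and "Disallow: /" matches every
# path, so well-behaved search engine bots may fetch nothing at all.
print(rp.can_fetch("Googlebot", "http://example.com/"))        # False
print(rp.can_fetch("Slurp", "http://example.com/about.html"))  # False
```

Both checks come back `False`: no compliant crawler is allowed to fetch any page, which is why the site eventually drops out of the search indexes.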

It’s a pretty nasty way to beat your competitors.
