Unknown robots are drastically and unnecessarily consuming my website's bandwidth, and I have not been able to identify them because they crawl the site without revealing any details such as a name or IP address. All my analytics report while they crawl is "Unknown robot crawling". If I knew the IP address, I could stop them with IP blocking. These unknown robots also don't seem to obey the robots.txt standard or other techniques such as the "nofollow" attribute; they just crawl every link on the site that is publicly reachable on the web, without any user authentication. These robots have no need to crawl my site and serve no purpose. How can I stop them efficiently?
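For context, here is the kind of blocking I have in mind. This is only a sketch assuming an nginx server and a hypothetical offending IP (203.0.113.42, a documentation address); it also shows denying requests that send no User-Agent header at all, which many unidentified bots do:

```nginx
server {
    # Refuse requests with an empty User-Agent header
    # (legitimate browsers and well-behaved crawlers always send one)
    if ($http_user_agent = "") {
        return 403;
    }

    # Block a specific IP once identified (hypothetical example address)
    deny 203.0.113.42;
}
```

The problem is that the first step, identifying the IP or User-Agent, is exactly what I cannot do here.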