Writing robots.txt file code
How to disallow all using robots.txt
To block all robots and search engine bots from crawling your website, put the following code in your robots.txt file to disallow all:
User-agent: *
Disallow: /
Here, the “User-agent: *” line addresses all robots, and the “Disallow: /” line means the rule applies to your entire website.
In effect, this tells all compliant robots and web crawlers that they are not allowed to access or crawl any part of your site.
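To see how a standards-compliant crawler interprets these directives, here is a minimal sketch using Python's standard urllib.robotparser module; the https://example.com URL and the user-agent names are just placeholders for illustration:

from urllib.robotparser import RobotFileParser

# Parse the disallow-all rules shown above.
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow: /"])

# Any compliant crawler, whatever its user agent, is denied access.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # False
print(parser.can_fetch("Bingbot", "https://example.com/some-page"))    # False

Both calls return False, confirming that the wildcard rule blocks every URL on the site for every crawler that honors robots.txt.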
Important Note: Disallowing all robots means you are blocking your website from search engines. No search engine, such as Google or Bing, will index your website or any of its pages, so no one will be able to find your website through a search engine.