The recent accidental leak of the complete user database of over 300,000 users of Groupon's Indian subsidiary, SoSasta.com, is an example of a glaring security mistake that webmasters often make unknowingly. In this article we will show you two ways to prevent Google and other search engines such as Bing and Yahoo from indexing SQL and database files on your website.
Prevent Search Engines From Indexing SQL Files using robots.txt
The robots.txt file gives search engines instructions about what to index and what not to index on a website. To prevent search engines like Google and Bing from indexing your database dumps (SQL files), create a text file named robots.txt in your web server's document root. The document root is generally the /var/www/ folder for Apache web servers, and a subfolder of the C:/Inetpub/ directory for Microsoft IIS web servers.
Paste the following text into the robots.txt file you just created. If you already have a robots.txt file, add this to the end of its contents. Note that the * and $ wildcards are understood by Google's and Bing's crawlers but are not part of the original robots.txt standard, and robots.txt only deters well-behaved crawlers; it does not block direct access to the files:
User-agent: *
Disallow: /*.sql$
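If you prefer to script this step, the rule can be appended idempotently so that re-running the script never duplicates it. A minimal sketch, assuming an Apache-style layout; the DOCROOT variable is our assumption, so adjust it to your server's actual document root:

```shell
#!/bin/sh
# DOCROOT is an assumed path; point it at your web server's document root.
DOCROOT="${DOCROOT:-/var/www}"
ROBOTS="$DOCROOT/robots.txt"

# Append the Disallow rule only if it is not already present (idempotent).
# -F matches a fixed string, -x matches the whole line, -q stays quiet.
if ! grep -qxF 'Disallow: /*.sql$' "$ROBOTS" 2>/dev/null; then
    printf 'User-agent: *\nDisallow: /*.sql$\n' >> "$ROBOTS"
fi
```

Running this twice leaves robots.txt with a single copy of the rule, which keeps the file clean if the script is part of a deployment routine.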
Do not place Database Dumps (SQL Files) in server root
The first and foremost security measure you must take: NEVER keep your website's database dumps (generally .sql files) inside the root directory of your web server. As noted above, this is generally the /var/www/ folder for Apache web servers and a subfolder of the C:/Inetpub/ directory for Microsoft IIS web servers.
Linux users can run the following command in order to check if the document root of Apache web server has any SQL file stored in it:
cd /var/www/ && sudo find . -name "*.sql"
If you find any such files, remove them immediately. Microsoft IIS users can simply run a search for SQL files from the search dialog. If you need help with any of the methods above, simply leave a comment below and we will get back to you.
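If you would rather move any discovered dumps to a private location in one step instead of deleting them, a sketch along these lines may help. DOCROOT and BACKUP are hypothetical paths we chose for illustration; adjust both to your own layout:

```shell
#!/bin/sh
# Sketch: relocate SQL dumps out of the web-accessible document root.
# DOCROOT and BACKUP are assumptions, not standard locations.
DOCROOT="${DOCROOT:-/var/www}"
BACKUP="${BACKUP:-/root/sql-backups}"

mkdir -p "$BACKUP"

# -print0 / -0 handles filenames containing spaces or newlines safely.
find "$DOCROOT" -type f -name '*.sql' -print0 |
    xargs -0 -I{} mv -- {} "$BACKUP/"
```

The backup directory sits outside the document root, so the moved dumps can no longer be downloaded or indexed, while you still keep them for restores.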