One way to limit which files and directories a search engine robot scans is to create a robots.txt file. This file tells properly configured web robots which files and directories they may and may not crawl. For more information on the robots.txt file, see the Web Robots FAQ at:
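As an illustration, a minimal robots.txt might look like the following sketch. The directory and file names here are hypothetical examples, not defaults of any particular account type:

```
# Applies to all robots
User-agent: *
# Do not crawl this directory
Disallow: /private/
# Do not crawl this specific file
Disallow: /tmp/draft.html

# A specific (hypothetical) robot, excluded from the entire site
User-agent: ExampleBot
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism, so sensitive files should also be protected by server permissions.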
The default location of robots.txt in a VPS is:
The default location of robots.txt on a Plesk based account (Upipe, MDH, OHP, WebPro) is:
(for files accessed through https://).
You will need to substitute your actual user name for "username" and your actual domain name for "domainname.com".