robots.txt is a simple plain-text file placed in the root directory of your site. It helps search engines by telling their crawlers which areas of your site they may visit and index. Keep in mind that robots.txt is guidance for robots, not protection from prying eyes: if you truly need to secure content, put the files in a protected directory rather than trusting robots.txt to do the job. To create the file, you can use a robots.txt generator tool. Choose options such as the default rule for all robots, the crawl delay, which pages of your site robots may crawl, and which areas are restricted to them, then click the Submit button to see the result.
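For illustration, a file produced by such a generator might look like the sketch below. The directory paths and delay value are placeholder assumptions, not taken from any specific site:

```
# Default rule applying to all crawlers
User-agent: *

# Ask crawlers to wait 10 seconds between requests
# (placeholder value; not all search engines honor Crawl-delay)
Crawl-delay: 10

# Restricted area crawlers should not visit (placeholder path)
Disallow: /private/

# Pages crawlers are allowed to crawl and index (placeholder path)
Allow: /blog/
```

Once generated, the file must be uploaded to the root of the site (for example, so it is reachable at `https://www.example.com/robots.txt`), because crawlers only look for it there.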