A robots.txt file tells search engine crawlers which URLs on your site they can access. It is intended mainly to prevent crawler requests from overloading your site; it is not a mechanism for keeping a web page out of Google. To keep a page out of Google, block indexing with noindex or password-protect the page.
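
As an illustration, a minimal robots.txt might look like this (the `/private/` path is a hypothetical example; rules apply per crawler user agent):

```
# Example robots.txt — limits crawling, but does NOT prevent indexing
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

To actually keep a page out of Google's index, the page itself would carry a noindex directive, for example in its HTML head:

```
<!-- Tells crawlers that honor noindex not to index this page -->
<meta name="robots" content="noindex">
```

Note that for noindex to be seen, the page must remain crawlable: if robots.txt blocks the URL, the crawler never fetches the page and never sees the noindex directive.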