PyLucid creates the robots.txt file dynamically:
If settings.DEBUG is enabled, it sends the following content to disallow all pages:
User-agent: *
Disallow: /
In normal mode, it sends the following content to disallow all URLs that contain a '?':
User-agent: *
Disallow: /*?
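
A minimal sketch of how such a Django view could look (the view name and wiring are assumptions for illustration, not PyLucid's actual implementation):

from django.conf import settings
from django.http import HttpResponse

def robots_txt(request):
    """Serve robots.txt content depending on settings.DEBUG."""
    if settings.DEBUG:
        # Development: keep all pages out of search indexes.
        content = "User-agent: *\nDisallow: /\n"
    else:
        # Production: only disallow URLs containing a '?'.
        content = "User-agent: *\nDisallow: /*?\n"
    return HttpResponse(content, content_type="text/plain")

Wired into the URLconf (e.g. path('robots.txt', robots_txt)), this reproduces the behaviour described above.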
If you want to use your own static robots.txt file instead, add this to your .htaccess:
RewriteRule ^robots.txt$ - [L]
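The '-' target leaves the URL unchanged and the [L] flag stops further rewriting, so Apache serves your static file directly instead of passing the request on to PyLucid.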
Note that robots.txt does not protect your pages: it is only a recommendation, and not every search robot follows it.
More information: http://en.wikipedia.org/wiki/Robots_exclusion_standard