User-agent: *  # All robots
Allow: /app  # Advanced robots like Google and Bing will understand this directive
Allow: /web  # Advanced robots like Google and Bing will understand this directive
Disallow: /

# It is a security risk to specify, specifically, every path that should
# be disallowed, because doing so exposes those paths to any bad robot
# (which wouldn't honor this robots.txt file anyway). Instead, we
# disallow everything and take advantage of advanced crawlers' ability
# to honor the "non-standard" Allow directive.