LII:Web Application Security Guide/Special files

Special files

Special files like .htaccess, robots.txt, crossdomain.xml and clientaccesspolicy.xml have special meanings that have to be considered before deploying them.

To prevent this type of attack:

  • Know the meaning of these files.
  • Ensure robots.txt does not disclose "secret" paths.
  • Ensure crossdomain.xml and clientaccesspolicy.xml do not exist unless needed.
  • If used, ensure crossdomain.xml and clientaccesspolicy.xml allow access from trusted domains only (a sample restrictive policy is sketched after this list).
  • Prevent users from uploading or changing special files (see the file upload vulnerabilities section).
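
The sample policy referred to above is a minimal sketch of a restrictive crossdomain.xml; the domain www.example.com is a placeholder for a trusted domain under your control:

  <?xml version="1.0"?>
  <!DOCTYPE cross-domain-policy SYSTEM
    "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
  <cross-domain-policy>
    <!-- Honour only this master policy file at the site root and
         ignore any policy files placed in subdirectories -->
    <site-control permitted-cross-domain-policies="master-only"/>
    <!-- Allow cross-domain requests only from the named trusted
         domain, and only over HTTPS -->
    <allow-access-from domain="www.example.com" secure="true"/>
  </cross-domain-policy>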

Rationale

Special files like .htaccess, robots.txt, crossdomain.xml and clientaccesspolicy.xml define security-relevant settings and rules. Knowing their meaning is necessary to use them securely.

.htaccess files influence the behaviour and security-relevant settings of the web server (e.g. access rights, which file types are executable, etc.).
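
The following sketch shows a few typical security-relevant .htaccess directives in Apache 2.4 syntax, assuming the server's AllowOverride configuration permits them; the file name pattern is only an illustration:

  # Disable directory listings for this directory
  Options -Indexes

  # Deny web access to backup and configuration files
  <FilesMatch "\.(bak|ini|conf)$">
      Require all denied
  </FilesMatch>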

robots.txt can be ignored by malicious or badly written robots. Because this file is publicly readable, an attacker can gain valuable information about "interesting" paths (such as administration interfaces) if they are mentioned in it, and attackers do check this file for such content.
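
For example, a robots.txt like the following (the paths are hypothetical) keeps well-behaved crawlers away from those areas, but simultaneously hands their locations to anyone who fetches the file:

  User-agent: *
  # Keeps polite crawlers out, but discloses the paths to anyone who asks
  Disallow: /admin/
  Disallow: /internal/reports/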

crossdomain.xml and clientaccesspolicy.xml can relax or effectively disable the same-origin policy for some plug-ins (such as Flash and Silverlight). An incorrect configuration leaves the site open to cross-site scripting and cross-site request forgery attacks carried out through such plug-ins. Note that crossdomain.xml files can also be honoured when they appear in subdirectories, not only at the site root.
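
As a minimal sketch, a clientaccesspolicy.xml restricted to a single trusted origin could look like the following; the domain https://trusted.example.com and the resource path /api/ are placeholder assumptions:

  <?xml version="1.0" encoding="utf-8"?>
  <access-policy>
    <cross-domain-access>
      <policy>
        <!-- Only this trusted origin may make cross-domain requests -->
        <allow-from>
          <domain uri="https://trusted.example.com"/>
        </allow-from>
        <!-- ...and only to this part of the site -->
        <grant-to>
          <resource path="/api/" include-subpaths="true"/>
        </grant-to>
      </policy>
    </cross-domain-access>
  </access-policy>

Omitting both policy files entirely remains the safest option when no cross-domain plug-in access is needed.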

Notes

The original source for this page is the associated Wikibooks article and is shared here under the CC BY-SA 3.0 license.