Measures Concepts

Robots.txt

Robots.txt - Config format


Robots.txt is a config format created in 1994 by Martijn Koster.

#2100 on PLDB · 30 Years Old

A robots.txt file tells search engine crawlers which URLs the crawler can access on your site.


Example from the web:
User-agent: googlebot        # all Google services
Disallow: /private/          # disallow this directory

User-agent: googlebot-news   # only the news service
Disallow: /                  # disallow everything

User-agent: *                # any robot
Disallow: /something/        # disallow this directory
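Rules like these can be checked programmatically. Below is a minimal sketch using Python's standard-library `urllib.robotparser`, which parses robots.txt text and answers whether a given crawler may fetch a given URL; the crawler name `mybot` and the example URLs are arbitrary placeholders.

```python
from urllib.robotparser import RobotFileParser

# A small robots.txt body; the catch-all "*" group applies to any
# crawler not matched by a more specific User-agent group.
rules = """\
User-agent: *
Disallow: /something/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# URLs under the disallowed directory are blocked for every crawler...
print(rp.can_fetch("mybot", "https://example.com/something/page"))  # False
# ...while everything else remains fetchable.
print(rp.can_fetch("mybot", "https://example.com/public/page"))     # True
```

Note that robots.txt is advisory: `can_fetch` tells a well-behaved crawler what the site operator requests, but nothing technically prevents access to disallowed paths.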

Language features

Feature         Supported   Token   Example
Comments        ✓                   # A comment
Line Comments   ✓           #       # A comment

