
Disallow irrelevant pages by default in robots.txt

Ramya Authappan requested to merge github/fork/bbodenmiller/robots.txt into master

Created by: bbodenmiller

Update the default robots.txt rules to disallow irrelevant pages that search engines should not crawl. Important pages such as files, commit details, merge requests, issues, and comments will still be crawlable.

Fixes http://feedback.gitlab.com/forums/176466-general/suggestions/7665033-create-robots-txt
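For context, a rule set of this kind typically lists crawl-irrelevant paths under a wildcard user agent. The sketch below is illustrative only; the specific paths are assumptions and may not match the exact rules added in this merge request.

```
# Illustrative robots.txt sketch -- paths are examples, not the MR's exact rule set
User-Agent: *
# Administrative and account pages that add no value to search results
Disallow: /admin
Disallow: /profile
Disallow: /dashboard
Disallow: /search
Disallow: /api
# Content pages (files, commits, merge requests, issues, comments)
# remain crawlable because they are not listed here.
```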
