Aurora SEO: Regulate content crawling by search engines using robots.txt
Updated 5 months ago
Version 5.0

To stop pages belonging to certain boards from being crawled, create a custom rule that disallows pages from those boards. For example, to stop the pages of a forum board from being crawled, use "Disallow: /discussions/<board-name>/*".
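Put together, such a rule in robots.txt might look like the sketch below. The board name "welcome" is a hypothetical example standing in for your own board's name:

```
# Applies to all crawlers (see the note on user agents below)
User-agent: *
# Block all pages under the "welcome" forum board
Disallow: /discussions/welcome/*
```

Any crawler that honors wildcard rules will then skip every URL under that board's path.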
You don't have to specify a user agent. If you omit one, "User-agent: *" is applied by default.
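If you want to sanity-check which URLs a rule like this would block, here is a minimal sketch of wildcard matching, assuming Google-style semantics where "*" matches any sequence of characters. Note that Python's standard urllib.robotparser follows the original prefix-matching convention and treats "*" literally, so this helper (robots_rule_blocks, a name invented here) implements the wildcard itself; the board name "welcome" is again hypothetical:

```python
import re

def robots_rule_blocks(rule: str, path: str) -> bool:
    """Return True if a robots.txt Disallow rule matches the given URL path.

    Assumes Google-style semantics: '*' matches any character sequence,
    and a trailing '$' anchors the match to the end of the path.
    """
    # Escape regex metacharacters, then restore the two robots.txt
    # wildcards ('*' and a trailing '$') as their regex equivalents.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    # Rules match from the start of the path (re.match anchors at the start).
    return re.match(pattern, path) is not None

# The rule from the article, with a hypothetical board name:
rule = "/discussions/welcome/*"
print(robots_rule_blocks(rule, "/discussions/welcome/topic-123"))  # True
print(robots_rule_blocks(rule, "/blog/some-post"))                 # False
```

This is only a quick local check; the authoritative behavior is whatever each search engine's crawler implements.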