Hey everyone,
I'm working for a client who runs a blog on their website, and we're having an issue with duplicate content being created whenever content is tagged or labeled.
From an SEO perspective, this can dilute page authority, and in the worst case Google could even treat it as a black-hat tactic for stuffing keywords under one domain.
With WordPress, you can install a plugin that automatically noindexes the extra URLs created when a post has more than one category or tag.
The only related function I've found in the Lithium settings is a limit on the number of tags or labels users can create per period or per post.
Is there a way to noindex the URLs generated by tags, categories, or labels in Lithium CMS, other than manually noindexing individual URLs?
Many thanks in advance.
Ed
The simplest way to do this is to add those URLs to your robots.txt file so the pages aren't crawled; see the sketch below.
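A minimal robots.txt sketch, assuming the tag pages live under a /tag/ path (as mentioned later in this thread) and the label pages under a segment such as /label-name/ — both patterns are assumptions, so check your community's actual URL structure before adding rules:

# Block crawling of auto-generated tag and label listing pages
User-agent: *
Disallow: /tag/
Disallow: /label-name/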
Thanks a lot for sharing your suggestion.
robots.txt would certainly help.
But as far as I know, URLs blocked in robots.txt can still be indexed when other pages link to them (they just get indexed without the content being crawled), which would still leave us with duplicate content issues.
Is there an SEO plugin for Lithium that can add "noindex" tags based on custom rules, similar to what the Yoast SEO plugin does for WordPress?
Many thanks in advance!!
Ed
There's no such plugin for Lithium.
This needs customization: rules that match those URLs in bulk and apply the exclusion based on the labels and tags of the posts you want kept out of the crawl.
You can ask your development team to build that for you.
Hey Ed,
In addition to robots.txt, as you've mentioned, the best way to prevent a page from being indexed is to add <meta name="robots" content="noindex"> to the individual pages. It would be great if Khoros could automate this, or at least provide a field in the editor for adding the robots meta tag.
https://support.google.com/webmasters/answer/93710
For example, all of the tag URLs that get generated: if we don't want those crawled and indexed, it would be great to be able to apply the noindex tag to every URL with /tag/ in the path. Something along the lines of the sketch below would do it.
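A rough sketch of what that automation could look like as a custom FreeMarker component placed in the page head via Studio. This is only an illustration, not an existing Khoros feature: the http.request.url context value and the /tag/ pattern are assumptions, so verify the equivalents in your Lithium/Khoros version before using anything like it.

<#-- Hypothetical head component: mark auto-generated tag pages as noindex. -->
<#-- "http.request.url" is assumed here; substitute whatever your platform  -->
<#-- version actually exposes for the current request URL.                  -->
<#assign currentUrl = (http.request.url)!"" />
<#if currentUrl?contains("/tag/")>
  <meta name="robots" content="noindex, follow" />
</#if>

Using noindex, follow keeps the tag page out of the index while still letting crawlers follow its links through to the underlying posts.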