About Aurora SEO
Your community page's appearance in web search results is the first impression you make, an important factor in attracting more visitors to your site, and a way to earn higher search engine rankings. Search Engine Optimization (SEO) is a set of techniques to improve your community's visibility and search engine ranking. Unlike paid search, SEO optimizes your site to get the best results based on the ranking criteria used by various search engines.

Aurora SEO: Manage URL redirects for Community pages
Over time, content on your community changes, moves, is deleted or archived, or is replaced. When this happens, you want to make sure that web searches don't show results that lead to obsolete or missing pages. Sometimes you might also want to point people to newer or more relevant content. To help optimize these search results and make sure that people get to the right content, you can create redirect rules.

Redirects enable you to keep the page and link authority of your website when a URL is redirected to another URL for any reason. Basically, redirects help you keep the SEO of your website healthy and keep visitors engaged on your site. Properly defined redirects help you keep your search rankings.

The most common types of redirects are the 301 (permanent) redirect and the 302 (temporary) redirect. Technically, there's no behavioral difference between a permanent and a temporary redirect. Permanent redirects are for pages that you do not want to retain and might want to permanently delete. Temporary redirects are for pages you plan to keep in place for a finite time period. For example, you might want to temporarily redirect to another page while you make updates to the source page on your site and then remove the rule when the update is complete. By defining the redirect as temporary (302), you can easily find these rules in the list when you want to remove them. (For a rough illustration of how these two redirect types differ at the HTTP level, see the sketch at the end of this article.)

Admins and members with the Manage Redirect Rules permission can access Redirect rules on the SEO Settings page (Settings > System > SEO).

Key points

When creating redirect rules, you must:
- Use valid community URLs. You cannot redirect to an external site.
- Use different URLs. Currently, we don't support nested redirects.

When creating redirect rules, keep the following in mind:
- You cannot use a destination URL in one rule as the source URL in another rule.
- You cannot use a source URL in one rule as the source URL in another rule.
If you run into either of these conflicts, delete the conflicting rule from your list before you create the new rule.

Tip: Avoid redirecting to posts marked as spam, archived content, or content in private boards and Groups. Link only to public URLs.

Add redirect rule
1. Sign in to the community as an Admin.
2. Go to Settings > System > SEO.
3. Click Add Rule.
4. Enter the Source and Destination URLs. (URLs must be from one of your company's community pages.)
5. Select the Redirect Type: 301 (Permanent) or 302 (Temporary).
6. Click Add.
Note: It takes an hour for the redirects to take effect on the site.

Delete redirect rule
To delete an existing rule, click Delete next to the rule you want to remove.
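As a rough, non-Khoros-specific illustration of how the two redirect types appear at the HTTP level, consider the responses below. The URLs and paths are hypothetical placeholders; visitors are sent to the same destination in both cases, and the status code mainly signals intent to search engines.

    # Hypothetical request to a redirected community URL
    GET /old-page HTTP/1.1
    Host: community.example.com

    # 301 (permanent): the old URL is not expected to return
    HTTP/1.1 301 Moved Permanently
    Location: https://community.example.com/new-page

    # 302 (temporary): the old URL is expected to come back
    HTTP/1.1 302 Found
    Location: https://community.example.com/new-page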
Aurora SEO: Regulate content crawling by search engines using robots.txt

When you publish content in the community, search engines (web robots or web crawlers) crawl these newly published pages to discover and gather information from them. After crawling the content, the search engines index these pages to provide relevant search results based on search queries. It is important to instruct the web crawlers to crawl only the relevant pages and to ignore the pages that don't require crawling activity. Using the Robots Exclusion Protocol (a file called robots.txt), you can indicate the resources to include in or exclude from crawling.

When a new community is created, the Khoros platform configures the robots.txt file with the default rules for the community. The default rules include instructions that are generic for all communities. Admins and members with permissions can view the Default Rules in the Robots.txt Editor (from the Settings > System > SEO area). In the editor, you can also add Custom Rules that are appended after the default rules.

Note: You cannot edit the default rules.

How does robots.txt work?

You can find the robots.txt file in the root directory of your community by appending "robots.txt" to the end of the URL (https://site.com/robots.txt). The file includes the list of user agents (web robots), community URLs, and sitemaps, with instructions indicating whether the user agents are allowed or disallowed to crawl the specified URLs. When the user agents (web crawlers) visit your website, they first read the robots.txt file and then proceed with the crawling activity based on the instructions in the file. The user agents gather information only from the community pages that are allowed and are blocked from the pages that are disallowed.

Robots.txt syntax

The robots.txt file uses these widely used keywords to specify the instructions:

User-agent: The name of the web crawler for which you are providing the instructions. To provide instructions to all user agents at once, enter * (wildcard character).
Example:
    User-agent: testbot
Example:
    User-agent: *

Disallow: Command indicating that the user agents must not crawl the specified URL. Note that the URL must begin with '/' (forward slash character).
Example:
    User-agent: testbot
    Disallow: /www.test1.com

Allow: Command indicating that the user agents can crawl the specified URL. Note that the URL must begin with '/' (forward slash character).
Example:
    User-agent: testbot
    Allow: /www.test2.com

Sitemap: Indicates the location of any XML sitemaps associated with the URL. The Khoros platform automatically generates sitemaps for each community when it is created and adds them to the robots.txt file.
Example:
    User-agent: testbot
    Sitemap: https://www.test.com/sitemap.xml

The following is a sample format to allow or disallow a user agent "testbot" to crawl community pages:

    User-agent: testbot
    Disallow: /www.test.com
    Allow: /www.test1.com
    Sitemap: https://www.test.com/sitemap.xml

Using the Robots.txt Editor

The Robots.txt Editor enables you to add, edit, and remove custom rules in robots.txt. For more information, see the documentation provided by Google and other crawlers on how they handle rules in robots.txt.

Let's take an example where you want to add a custom rule to disallow a user agent "testbot" from crawling a member profile page of the community (see the sample rule after this procedure). To add a custom rule:
1. Sign in to the community as an Admin.
2. Go to Settings > System > SEO. In the Robots.txt Editor, you can view the Default Rules and Custom Rules sections.
3. In the Custom Rules section, click Edit.
4. In the Edit window, enter the instructions and click Save. The rule appears in the Custom Rules area of the tab.

You can edit or remove the existing Custom Rules by clicking the Edit option. The new custom rules are appended to the robots.txt file located in the root directory. After you edit the custom rules, you can validate the robots.txt file via the Lighthouse tool. Learn more about robots.txt validation using Lighthouse.

Note: The Audit Log records the member actions made in the robots.txt file.
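A minimal sketch of the custom rule described in the example above. The profile-page path shown here is a hypothetical placeholder; substitute the actual path your community uses for member profile pages.

    User-agent: testbot
    Disallow: /members/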
Aurora SEO: Exempt domains from rel="nofollow" attribute set

Generally, when a search engine finds a page with links on it, the standard process is to index the page and then follow all the links on the page. However, when it sees "nofollow" on a link, it:
- Does not follow the link to the specified URL.
- Does not count the link toward its popularity score in its ranking engine (using "nofollow" causes Google to drop the target links from its overall graph of the web).
- Does not include the link text in the relevancy score of those keywords.

Note: Different search engines might handle "nofollow" in slightly different ways.

Letting search engines follow suboptimal external links can diminish your community content's ranking in search results. By default, external links added to content are set to rel="nofollow" (see the markup sketch at the end of this article). However, there are always exceptions. Khoros enables you to override this default and add specific domains you want to exempt from this rule, such as any company-owned domains.

Admins can access the SEO Settings page (Settings > System > SEO) and create a list of domains to be exempted from the "nofollow" attribute set.

To create a list of domains where you do not want to set the "nofollow" attribute:
1. Sign in to the community as an Admin.
2. Go to Settings > System > SEO.
3. Click Edit.
4. Enter the domains, separated by commas, for which you do not want the "nofollow" value set, and click Save.
Note that you can include the * (wildcard) character at the beginning of a domain to cover all links pertaining to that domain. For example, *.sample.com.
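To illustrate the effect, here is a simplified sketch of the rendered markup (the exact HTML Khoros renders may differ, and the URLs are placeholders). A link to a non-exempted external domain carries the nofollow hint, while a link to an exempted, company-owned domain does not:

    <!-- Default: external link rendered with the nofollow hint -->
    <a href="https://external-example.com/page" rel="nofollow">External resource</a>

    <!-- Exempted domain (for example, *.sample.com): no nofollow hint -->
    <a href="https://docs.sample.com/page">Company documentation</a>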
Aurora: Configure SEO settings

In the community, you can configure various SEO settings to optimize your content and attract more readers to your site. Khoros offers several out-of-the-box SEO features such as custom metadata and structured markup. From the SEO Settings page, admins can configure SEO settings such as redirect rules, the robots.txt editor, and more.

To configure SEO settings:
1. Sign in to the community as an Admin.
2. Go to Settings > System > SEO.
3. Under Settings, configure the following:
   - For the Exempt domains from "nofollow" option, click Edit and enter the domains (separated by commas) for which you do not want the "nofollow" value set. Learn more about exempting domains from the rel="nofollow" attribute set.
   - Toggle on or off Append topic ID to title and description meta tags.
   - For the Sitemap update frequency option, click Edit to adjust how often, in minutes, the sitemap file should be regenerated. This value cannot be lower than 15 minutes.
   - For the Website name to display when posting a community link on Facebook option, click Edit to enter the name (up to 200 characters) you'd like to display when users copy/paste community links on Facebook. This name is added to the Open Graph "og:site_name" property of the meta tag (see the example after this procedure).
4. Go to Redirect Rules and configure rules to map one community URL to another. Learn more about managing URL redirects for community pages.
5. Go to the Robots.txt Editor, where you can view the Default Rules. In the editor, you can also add Custom Rules that are appended after the default rules. These rules instruct web crawlers to crawl only the relevant pages and ignore the pages that don't require crawling activity. Learn more about regulating content crawling by search engines using robots.txt.
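For reference, the Open Graph property this setting populates appears in the page's HTML head roughly as follows; the site name shown is a placeholder:

    <meta property="og:site_name" content="Example Community" />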
Aurora SEO: Avoid duplication of content title and description meta tags

Often, the first impression you make on people trying to find your community content is how that content appears in web search results. Clear titles and descriptions are important factors in attracting more visitors to your site, and effective SEO titles and descriptions help your content achieve higher rankings in search engines.

Khoros Communities offers out-of-the-box SEO metadata for your forum discussions, blog posts, ideas, and knowledge base articles to attract more readers to your site. Learn more about custom metadata. By default, Khoros sets the SEO title and description for a post based on the title and a snippet of its content. The SEO title and description added to your content appear in web search results.

In some cases, a community may contain multiple pages with the same meta titles or descriptions. These duplications can confuse search engines, make it harder for them to prioritize the right content, and degrade search results. To ensure that the metadata for your titles and descriptions is unique, Khoros enables you to append the topic ID of individual posts to your title and description meta tags (see the sketch at the end of this article). This avoids duplication of the tags and improves search engine rankings.

Admins can navigate to the SEO Settings page (Settings > System > SEO) to enable the Append topic ID to title and description meta tags option.

To append the topic ID to title and description meta tags:
1. Sign in to the community as an Admin.
2. Go to Settings > System > SEO.
3. Toggle on Append topic ID to title and description meta tags.
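As an illustrative sketch only (the exact format Khoros uses for the appended topic ID is not documented here, and the title, description, and ID below are placeholders), title and description meta tags with a topic ID appended might look like this:

    <title>How to reset your password - Example Community (12345)</title>
    <meta name="description" content="Steps to reset your password in the community... (12345)">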