How to Remove Feeds from Sitemap and in Google Search Console
Removing unwanted feeds from your sitemap and GSC can be perplexing. Learn to streamline with robots.txt.
Managing your website’s sitemap is crucial for effective SEO, but unwanted feeds in your sitemap can get in the way. One of the most annoying things for me personally is unindexed /feed URLs in Google Search Console (GSC).
In this guide, we’ll explore how to remove feeds from your sitemap and tackle related issues using robots.txt.
Understanding Feed URLs
Feeds are snippets of your website’s content, often generated automatically by platforms like WordPress. While they serve a purpose, having them clutter your sitemap can hinder your SEO efforts.
Feed URLs such as /post-1/feed/ and /post-2/feed/ are automatically generated snippets of your content, often used for syndication or subscription purposes. Related clutter can include AMP variants like /post-2/?amp=1 and tracking-parameter URLs like /post-2/?utm_source=rss&utm_medium=rss&utm_campaign=post-2.
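To see whether these URLs have crept into your own sitemap, a short script can list them. Below is a minimal Python sketch; the sitemap URL is hypothetical, it assumes a single standard sitemap file, and it does not recurse into sitemap index files:

import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical location: substitute your own site's sitemap URL.
SITEMAP_URL = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
FEED_MARKERS = ("/feed/", "?amp=", "utm_source=")

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

# Flag every <loc> entry that looks like a feed or tracking-parameter URL.
for loc in root.findall(".//sm:loc", NS):
    url = loc.text or ""
    if any(marker in url for marker in FEED_MARKERS):
        print("feed-like entry:", url)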
Let’s look at two methods for removing these URLs from GSC.
Editing robots.txt Manually to Remove Feed URLs
Locate your website’s robots.txt file, typically found in the root directory, and insert the following Disallow directives to instruct search engine bots not to crawl specific feed URLs.
User-agent: *
# Block post feeds such as /post-1/feed/
Disallow: /*/feed/
# Also block the site-wide WordPress feed at /feed/
Disallow: /feed/
# Block AMP query-string variants such as /post-2/?amp=1
Disallow: /*?amp=
# Block tracking-parameter URLs such as /post-2/?utm_source=...
Disallow: /*?utm_source=
Save the updated robots.txt file and upload it to your website’s root directory.
This is easy for someone who has access to the WordPress server and some technical or coding expertise, but if you don’t, here is another method using the RankMath plugin.
Using the RankMath plugin to edit robots.txt
If you’re already using the RankMath plugin for SEO in WordPress, you can conveniently edit your robots.txt file directly from the plugin’s interface. Here’s how:
- Navigate to RankMath > General Settings.
- Scroll down to the robots.txt section and click on “Edit robots.txt.” You can then add or modify directives to block specific URLs as needed.
- Add the directives from the snippet above, then save your changes within the RankMath plugin.
Irrespective of how you edit your robots.txt file, ensure the syntax is correct and test it using Google Search Console’s robots.txt report.
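For a quick local sanity check, you can also simulate the matching yourself. The sketch below is illustrative only: Python’s built-in urllib.robotparser does not understand Google’s * wildcards, so this implements Google-style pattern matching directly against the directives above (the test paths are just examples):

import re

RULES = ["/*/feed/", "/feed/", "/*?amp=", "/*?utm_source="]

def rule_matches(pattern, path):
    # Google-style matching: '*' matches any run of characters,
    # '$' anchors the end of the URL; everything else is a prefix match.
    regex = "".join(
        ".*" if ch == "*" else "$" if ch == "$" else re.escape(ch)
        for ch in pattern
    )
    return re.match(regex, path) is not None

tests = [
    "/post-1/feed/",   # expected: blocked
    "/post-2/?amp=1",  # expected: blocked
    "/post-2/?utm_source=rss&utm_medium=rss&utm_campaign=post-2",  # expected: blocked
    "/post-2/",        # expected: allowed
]
for path in tests:
    verdict = "blocked" if any(rule_matches(r, path) for r in RULES) else "allowed"
    print(path, "->", verdict)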
Benefits of Removing Feeds from Sitemap
- Precision control: You can specifically target and exclude unwanted feed URLs from search engine crawling.
- SEO optimization: Removing unnecessary feeds improves your site’s crawl efficiency and prevents the indexing of duplicate content.
- Cleaner indexing: feeds left in your sitemap can lead to duplicate content being indexed and can confuse search engines, potentially harming your website’s SEO efforts.
Addressing Google Search Console Issues
Even if feed URLs are not in your sitemap, Google Search Console may still show them. This is usually because Google discovered them through direct crawling or links from other pages. Keep in mind that robots.txt prevents crawling, not indexing: already-known URLs can linger in reports (often flagged as “Indexed, though blocked by robots.txt”) until Google drops them.
Additional Tips
- Disable RSS feeds in WordPress if they’re not in use (you can spot-check the result with the sketch after this list).
- Use the URL Removal Tool in Google Search Console to temporarily hide already-indexed URLs from search results.
- Update your robots.txt file to include other rules preventing the crawling of specific URLs.
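If you do disable RSS feeds, a quick spot-check confirms they no longer resolve. Here is a minimal Python sketch with a hypothetical feed URL you would swap for your own (note that urlopen follows redirects, so a redirected feed will report the final status):

import urllib.error
import urllib.request

# Hypothetical URL: replace with one of your own feed URLs.
FEED_URL = "https://example.com/post-1/feed/"

req = urllib.request.Request(FEED_URL, method="HEAD")
try:
    with urllib.request.urlopen(req) as resp:
        # 200 means the feed is still being served.
        print(FEED_URL, "->", resp.status)
except urllib.error.HTTPError as err:
    # 404 or 410 suggests the feed has been disabled.
    print(FEED_URL, "->", err.code)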
Conclusion
In the world of SEO, every little tweak counts.
Removing feeds from your sitemap is a simple but worthwhile step in optimizing your website’s SEO. By following the methods outlined in this guide, you can declutter your sitemap and improve your website’s visibility on search engines.
Frequently Asked Questions
Why are feeds included in the sitemap by default?
Feeds are often included by default to ensure that fresh content is promptly indexed by search engines. This is a default behavior in WordPress.
Can removing feeds affect website performance?
Removing feeds can improve crawl efficiency, potentially leading to better website performance.
How often should I update my robots.txt file?
It’s a good practice to review and update your robots.txt file regularly, especially after making changes to your website’s structure.
Will removing feeds improve my website’s ranking?
While it’s not a direct ranking factor, cleaner sitemaps can indirectly benefit your SEO efforts by improving crawl efficiency.
Is it necessary to remove feeds from the sitemap if they are not indexed?
Yes, it’s still beneficial to remove feeds from your sitemap to streamline the crawl process and avoid potential duplicate content issues.
What happens if I make a mistake in my robots.txt file?
Incorrect syntax or directives may unintentionally block important pages. Always double-check your robots.txt file and monitor for any unintended consequences.