You're doing your site's technical SEO audit. You check the sitemap content-type header with curl -sI https://your-domain.com/sitemap.xml | grep -i content-type and it comes back:
content-type: application/rss+xml
That's wrong. A sitemap should return application/xml or text/xml. And if your site is on Webflow, you didn't misconfigure anything — this is a known platform-level behaviour that affects a significant portion of Webflow-hosted sites.
This article explains exactly what's happening, whether it actually hurts your rankings, and what your options are to fix it.
What Is application/rss+xml and Why Is It Wrong?
When a browser or bot makes a request for a file, the server responds with a Content-Type header that tells the client what kind of content it's receiving. For a sitemap, the correct values are:
- application/xml, the standard MIME type for XML files
- text/xml, also acceptable and widely supported
application/rss+xml is the MIME type for RSS feeds — the XML format used by blog syndication, podcast feeds, and news aggregators. It's a valid MIME type, but it's semantically wrong for a sitemap. A sitemap is not an RSS feed.
The difference matters because:
- Google Search Console's sitemap validator checks for a correct XML MIME type and flags application/rss+xml as an error
- Third-party SEO tools (Screaming Frog, Ahrefs, Semrush) treat it as a misconfiguration
- Some older or strict XML parsers may refuse to process a document served with the wrong MIME type
Why Webflow Sitemaps Have This Bug
Webflow auto-generates a sitemap at /sitemap.xml for every published site. The sitemap XML structure itself is typically correct — it follows the sitemaps.org protocol with proper <urlset> and <url> elements. The problem is in how Webflow's edge infrastructure sets the response headers when serving that file.
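For reference, a minimal sitemap that follows the sitemaps.org protocol looks like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://your-domain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://your-domain.com/pricing</loc>
  </url>
</urlset>
```

Webflow's generated file follows this structure; the XML body is fine. Only the header it's served with is wrong.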
The most likely explanation is that Webflow's CDN layer (historically built on Fastly) has a configuration that maps the .xml extension to the application/rss+xml MIME type for certain file routes — possibly because sitemaps in some early Webflow iterations were generated with RSS-format XML. Whatever the origin, it's been reported by Webflow users since at least 2021 and affects sites across multiple Webflow plan tiers.
This is not your fault. You did not misconfigure your Webflow site. The wrong Content-Type is served by Webflow's infrastructure, not by any setting you control in the Webflow Designer or Site Settings panel.
Does Google Actually Care?
The honest answer: for most Webflow sites, the practical ranking impact is low. Google's sitemap parser reads the XML content of the file and does not strictly require the correct Content-Type. Googlebot can and does process sitemaps served with application/rss+xml — it's been confirmed in multiple developer discussions and Google's own crawlers are built to be forgiving about header mismatches.
However, "Google can probably still parse it" is not the same as "it has no effect." Here's where the wrong Content-Type does cause real problems:
Google Search Console Sitemap Errors
When you submit your sitemap in GSC, the validator checks the response headers. An application/rss+xml response often triggers a "Couldn't fetch" or format error in the coverage report, even if Google is actually reading and processing the sitemap. This means you lose visibility into indexation issues: your GSC sitemap status becomes unreliable.
New Page Indexation Lag
For SaaS sites that regularly publish new pages (feature releases, blog posts, landing pages), sitemap reliability matters. If Googlebot is uncertain about the sitemap's format, it may deprioritise processing it during busy crawl cycles. This can add days to the indexation of new content — not a catastrophic failure, but a meaningful delay if you're in a competitive niche.
SEO Audit Failures
Every SEO audit tool — and every SEO consultant — will flag this. When you're raising funding or going through due diligence, having documented technical SEO failures that you "knew about but didn't fix" looks sloppy. For a funded SaaS company, this is a $0 fix with a non-trivial credibility cost if left unaddressed.
Bottom line: On a 10-page marketing site that doesn't change often, this bug causes negligible harm. On a SaaS site publishing new product pages, case studies, or blog content regularly, it's a legitimate infrastructure issue worth resolving.
How to Verify Your Sitemap Content-Type
Before you try to fix it, confirm the issue actually exists on your site. Webflow's behaviour has been inconsistent: the bug may already be patched for some plans or regions.
curl command (fastest)
Run this in your terminal, replacing your-domain.com with your actual domain:
curl -sI https://your-domain.com/sitemap.xml | grep -i content-type
Expected correct output:
content-type: application/xml; charset=utf-8
Problematic output (the Webflow bug):
content-type: application/rss+xml; charset=utf-8
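If you check more than one domain regularly, you can wrap the comparison in a small shell function. This is a sketch, not part of any tool: classify_content_type is a hypothetical helper name, and it only distinguishes the values discussed above.

```shell
# Classify a Content-Type header value for sitemap purposes.
# Prints "ok" for the correct XML types, "webflow-bug" for the
# RSS type, and "unexpected" for anything else.
classify_content_type() {
  case "$1" in
    application/xml*|text/xml*) echo "ok" ;;
    application/rss+xml*)       echo "webflow-bug" ;;
    *)                          echo "unexpected" ;;
  esac
}
```

To use it against a live site, feed it the header value from curl, e.g. `classify_content_type "$(curl -sI https://your-domain.com/sitemap.xml | sed -n 's/^[Cc]ontent-[Tt]ype: //p' | tr -d '\r')"`.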
Chrome Network tab
Open DevTools → Network tab → navigate to your-domain.com/sitemap.xml → click the sitemap.xml request → Headers panel → look for content-type under Response Headers.
XML Sitemap Validator
Use any online sitemap validator (search "XML sitemap validator online"). Most will flag the wrong Content-Type header in their results. Google's Rich Results Test does not check sitemaps, but Google Search Console's Coverage report will show issues if you've submitted the sitemap.
How to Fix It
You have three realistic options. They are ordered by effort and control:
Option 1: Contact Webflow Support (Lowest Effort, Variable Success)
Submit a support ticket to Webflow describing the issue: your sitemap at /sitemap.xml returns application/rss+xml instead of application/xml. Reference your plan tier. Some users have reported this being resolved by Webflow on specific plans or site configurations.
This is the right starting point, but don't wait on it. Webflow support response times vary and this type of infrastructure-level bug often gets deprioritised. Move to Option 2 in parallel.
Option 2: Host a Custom Sitemap with Correct Headers (Recommended)
This is the most reliable long-term fix. You create your own sitemap.xml file with the correct URLs, host it somewhere that serves XML with proper headers, and tell Google about it.
The cleanest approach for a Webflow site:
- Export your current Webflow sitemap: visit your-domain.com/sitemap.xml and save the XML content to a file
- Host it as a static file on Cloudflare Pages, Netlify, or GitHub Pages; all of these serve XML with correct application/xml headers by default
- Update your robots.txt (in Webflow Site Settings → SEO → robots.txt) to point to your custom sitemap URL:
# In your Webflow robots.txt
Sitemap: https://your-cdn-url.pages.dev/sitemap.xml
- Submit the new sitemap URL in Google Search Console → Sitemaps
- Set up a process to keep the custom sitemap updated whenever you add new pages (or generate it programmatically from your Webflow CMS)
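If you'd rather generate the custom sitemap from a URL list than re-export Webflow's file each time, the format is simple enough to script. A minimal sketch, with generate_sitemap as a hypothetical helper and the URLs as placeholders (note it does not XML-escape characters like & in URLs):

```shell
# Emit a minimal sitemaps.org <urlset> for the URLs passed as arguments.
generate_sitemap() {
  printf '<?xml version="1.0" encoding="UTF-8"?>\n'
  printf '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
  local url
  for url in "$@"; do
    printf '  <url><loc>%s</loc></url>\n' "$url"
  done
  printf '</urlset>\n'
}

# Write the file you will deploy to your static host
generate_sitemap \
  https://your-domain.com/ \
  https://your-domain.com/pricing > sitemap.xml
```

Run it from a deploy hook or CI job whenever your page list changes, then push the output to whichever static host serves your custom sitemap.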
Cloudflare Pages tip: If you're already using Cloudflare in front of your Webflow site (which many SaaS companies do for performance), you can host the custom sitemap there and it serves with application/xml automatically. No separate hosting needed.
Option 3: Use Cloudflare Transform Rules (Most Elegant)
If your site is proxied through Cloudflare (the orange cloud is enabled for your domain in Cloudflare DNS), you can rewrite the Content-Type header at the edge without changing anything about your Webflow setup:
- In Cloudflare, go to Rules → Transform Rules → Modify Response Headers
- Create a new rule with the filter: URI Path equals /sitemap.xml
- Action: Set header content-type to application/xml; charset=utf-8
- Save and deploy
This fixes the Content-Type header for every request to your sitemap without touching Webflow at all. Googlebot and GSC will now see the correct header.
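If you use the custom expression editor instead of the rule builder, the filter would look something like the fragment below. This is a sketch of Cloudflare's rules expression language; verify the exact field names against your dashboard:

```
http.request.uri.path eq "/sitemap.xml"
```

The action remains the same: set a static response header named content-type with the value application/xml; charset=utf-8.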
Note: Cloudflare Transform Rules require a paid Cloudflare plan (Pro or above). If you're on the free tier, Option 2 is your best path.
What Not to Do
A few fixes that sound reasonable but aren't worth the effort:
- Don't disable Webflow's auto-generated sitemap entirely — you lose the auto-update behaviour and then you have to maintain the custom sitemap manually every time pages change
- Don't create a sitemap at a different URL (e.g. /sitemap2.xml) — you end up with two sitemaps and have to manage which one GSC uses
- Don't use a third-party sitemap generator plugin for a plain Webflow site — unnecessary complexity when a static hosted file does the job
Frequently Asked Questions
Why does my Webflow sitemap return application/rss+xml?
Webflow's sitemap generator has historically served sitemaps with the application/rss+xml Content-Type header instead of the correct application/xml or text/xml. This is a platform-level behaviour — not something you caused. The sitemap XML content itself is usually valid; only the Content-Type header is wrong.
Does the application/rss+xml sitemap bug affect Google rankings?
Google can usually parse a sitemap served with the wrong Content-Type because it reads the XML content directly. However, the wrong Content-Type causes failures in Google Search Console's sitemap validator, flags in third-party SEO tools, and can cause inconsistent indexation for rapidly published new pages. For static brochure sites, the practical impact is minimal. For SaaS sites publishing new content regularly, it's worth fixing.
How do I check if my Webflow sitemap has the rss+xml bug?
Run: curl -sI https://your-domain.com/sitemap.xml | grep -i content-type. If it shows application/rss+xml, the bug is present. It should show application/xml or text/xml.
Can I fix the Webflow sitemap content type myself?
You can't modify Webflow's auto-generated sitemap headers directly. Your best options: (1) use Cloudflare Transform Rules to rewrite the header at the edge, or (2) host a custom sitemap on Cloudflare Pages or Netlify and point your robots.txt and GSC to it. Both give you a correct Content-Type without changing your Webflow setup.
Does this affect my Webflow blog or CMS pages?
The Content-Type bug affects the sitemap file itself, not your blog or CMS pages. Your pages are served with correct HTML Content-Type headers. The concern is that if GSC can't reliably validate your sitemap, it may be slower to discover and index new CMS pages you publish. This matters most for sites publishing new content frequently.
Want a Full Technical SEO Audit?
We audit sitemaps, schema markup, Core Web Vitals, crawlability, and 40+ other SEO signals — and deliver a prioritised fix list within 24 hours.
Get Your Free Audit