How do search engines index XML-based websites? I'm talking about the kind that transforms XML into (X)HTML on the client side (a minimal sketch of that setup is below). We all know that search engines use the semantics in HTML (headings, links) to rank your website. So if you don't already embed XHTML in your XML, they can't make any sense of it, can they? I'm especially worried about them not finding links, which would prevent them from crawling your site effectively - a disaster for SEO people. So, any experience to share here?
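For reference, this is the kind of setup I mean - a hypothetical page.xml that points the browser at an XSLT stylesheet via the standard xml-stylesheet processing instruction (the file and element names are just examples):

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="style.xsl"?>
    <page>
      <title>My article</title>
      <body>The browser applies style.xsl and renders (X)HTML;
            a crawler fetching this file sees only the raw XML.</body>
    </page>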
AFAIK, Google doesn't process XSLT templates. So while the text content of the XML can be indexed by Google, it will only be seen as a plain XML document, meaning most of the document's semantics won't be understood aside from whatever elements and attributes the XML happens to share with XHTML.
I don't know if the situation is any different for other search engines or whether any progress is being made on this front. It would be nice to have search engines apply XSLT to XML documents, as it would save you from having to preprocess them server-side. But right now it's safer to just apply the XSL transformations server-side, at least for search engines (and for browsers without XSLT support).
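As a sketch of that server-side approach (one option among several, not anything the search engines mandate), here's how you could pre-apply the stylesheet with Python's lxml; the file names page.xml and style.xsl are just placeholders:

    from lxml import etree

    # Load the XML document and the XSLT stylesheet
    # (file names are placeholders).
    doc = etree.parse("page.xml")
    xslt = etree.parse("style.xsl")
    transform = etree.XSLT(xslt)

    # Apply the transformation and serialize the resulting (X)HTML,
    # which is what you'd serve to crawlers and non-XSL browsers.
    result = transform(doc)
    print(str(result))

You could run this at build time or on each request, serving the transformed (X)HTML to any user agent that can't (or won't) run the XSLT itself.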
Unfortunately, XSL is an underutilized technology, so it's probably not high on the priority list for the major search engines.