Mobile app version of vmapp.org
How do search engines index XML-based websites?

@Hamm4606531

Posted in: #Indexing #SearchEngines #Xml

I'm talking about the kind that transforms XML into (X)HTML on the client side. We all know that search engines use semantics in HTML (headings, links) to rank your website. So if you don't already use embedded XHTML in your XML, they can't make any sense of it, can they? I'm especially worried about them not finding links, which would prevent them from effectively crawling your site - a disaster for SEO people. So, any experience to share here?
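For reference, the client-side setup described here is typically wired up with an xml-stylesheet processing instruction (the file names and element names below are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- The browser fetches page.xsl and renders the transformed (X)HTML;
     a crawler that ignores this instruction sees only the raw XML. -->
<?xml-stylesheet type="text/xsl" href="page.xsl"?>
<page>
  <title>Hello</title>
  <nav>
    <item href="/about">About</item>
  </nav>
</page>
```

Note that the headings and links only exist after the transformation runs, which is exactly the crawling concern raised above.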





1 Comment

@Ogunnowo487

AFAIK, Google doesn't process XSLT templates. So while the text content of the XML can be indexed by Google, it'll just be in the form of the plain XML document, meaning most of the document semantics won't be understood by Google aside from some shared attributes and elements between XML and XHTML.

I don't know if the situation is any different for other search engines, or whether any progress is being made on this front. It would be nice if search engines applied XSLT to XML documents, since that would save you from having to preprocess them server-side. But right now it's safer to just apply the XSL transformations server-side, at least for search engines (and for browsers without XSL support).
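A minimal sketch of the server-side approach, assuming Python with the third-party lxml package (the document structure and stylesheet are invented for illustration, not taken from any real site):

```python
# Server-side XSLT: transform XML into semantic (X)HTML before serving it,
# so crawlers see real headings and <a> links instead of raw XML.
# Requires the third-party lxml package (pip install lxml).
from lxml import etree

# A toy XML document of the kind discussed in the question.
xml_doc = etree.fromstring(
    "<page><title>Hello</title>"
    "<nav><item href='/about'>About</item></nav></page>"
)

# A toy stylesheet that emits an <h1> and crawlable <a> links.
xslt_doc = etree.fromstring("""\
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/page">
    <html>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <xsl:for-each select="nav/item">
          <a href="{@href}"><xsl:apply-templates/></a>
        </xsl:for-each>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(xslt_doc)
html_out = transform(xml_doc)
# Serialized markup now contains the heading and link for crawlers.
print(str(html_out))
```

In a real deployment the same transform would run once per request (or at build time) and the resulting HTML would be what both browsers and crawlers receive.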

Unfortunately, XSL is an underutilized technology, so it's probably not high on the priorities list for the major search engines.


