How will a search engine find out if the content of dynamic pages has changed? I have a site with a few hundred blog posts and a few thousand dynamic pages. If I create a sitemap.xml for the dynamic pages, Google does not crawl and index them. The reason is that they are not linked to each other or from other pages.
The data on each dynamic page gets updated every few weeks.
If I create links from the home page and then link these pages from inner pages, search engines can crawl them.
How will the search engine figure out that the content has changed after a few weeks, in the absence of a sitemap.xml for these dynamic pages?
I use a sitemap.xml for blog content but not for the dynamic pages.
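For reference, a sitemap can hint at content changes via the `<lastmod>` element. A minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical dynamic page; <lastmod> hints at the last content update -->
  <url>
    <loc>https://www.example.com/dynamic/page-123</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Google has said it uses `<lastmod>` as a signal when the dates are consistently accurate, while `<changefreq>` and `<priority>` are largely ignored, so a sitemap alone does not guarantee re-crawling on your schedule.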
Google may look at your sitemap, evaluate it, and choose not to use it, preferring instead to crawl your site by following links the old-fashioned way. If these pages are not linked, then Google may not find them even with a sitemap.
If these pages are of real value, they should be linked. If they are spider food, then I suggest not linking them and not including them in your sitemap. It is possible that these pages are geared more toward search engines than humans, which is not what Google wants. You have to decide this for yourself.
Again, if these pages are for users and not spider food, then link them using a scheme geared toward humans rather than search engines. You will be rewarded, but it will take time.
Search engines do not require sitemaps to find and index content. They can find content through various means, including links. They also continue to crawl pages after they originally find and index them, to see if the content has changed. So you do not need to do anything; re-crawling these pages is a normal part of how search engines work.