
Google sitemap priorities, am I doing the right thing?

@Debbie626

Posted in: #GoogleSearchConsole #Seo #UrlRewriting #XmlSitemap

I've built a website which has maritime weather and tide data for UK locations, and each location has 7 day forecast (where data is available). My URL structures are like this:
www.example.com/location/llandudno/   <-- Today
www.example.com/location/llandudno/1  <-- Tomorrow
www.example.com/location/llandudno/2  <-- The day after that
etc


In my sitemap (dynamically generated), the priorities are set like this:
www.example.com/location/llandudno/   Priority: 0.7  LastMod: (today)  ChangeFreq: Daily
www.example.com/location/llandudno/1  Priority: 0.6  LastMod: (today)  ChangeFreq: Daily
www.example.com/location/llandudno/2  Priority: 0.5  LastMod: (today)  ChangeFreq: Daily


etc., with the priority decreasing by 0.1 each day. My thinking is that the further into the future the forecast is, the lower the priority of the page should be?

The lastmod is always set to the current date (at 1am), as the data on each page changes just after midnight.
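To make the scheme concrete, here is a minimal sketch of how sitemap entries like these might be generated dynamically (the base URL, location slugs and 7-day window below are just examples, not the actual generator):

from datetime import date

BASE = "https://www.example.com/location"
LOCATIONS = ["llandudno", "conwy", "rhyl"]  # example slugs only

def sitemap_entries(days=7):
    # lastmod is always "today", since every forecast page refreshes after midnight
    lastmod = date.today().isoformat()
    for slug in LOCATIONS:
        for offset in range(days):
            path = f"{BASE}/{slug}/" if offset == 0 else f"{BASE}/{slug}/{offset}"
            # priority drops by 0.1 per day into the future, floored at 0.1
            priority = max(0.7 - 0.1 * offset, 0.1)
            yield (
                "<url>"
                f"<loc>{path}</loc>"
                f"<lastmod>{lastmod}</lastmod>"
                "<changefreq>daily</changefreq>"
                f"<priority>{priority:.1f}</priority>"
                "</url>"
            )

if __name__ == "__main__":
    print('<?xml version="1.0" encoding="UTF-8"?>')
    print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
    for entry in sitemap_entries():
        print(entry)
    print("</urlset>")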

Questions:


Am I structuring my URLs in an effective manner, or would I be better passing the date instead of /1, /2 etc.? What implications would this have when they disappear off the sitemap?
Should I be including subsequent days in the sitemap at all, or just today's data? The pages are identical apart from the figures and the date on each.


2 Comments


 

@Lee4591628

Coming from a slightly different angle, a more effective option may be to use rel="next" and rel="prev" tags, which "provide a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page". From Google's description of these tags:


Now, if you choose to include rel="next" and rel="prev" markup on the component pages within a series, you’re giving Google a strong hint that you’d like us to:


Consolidate indexing properties, such as links, from the component pages/URLs to the series as a whole (i.e., links should not remain dispersed between page-1.html, page-2.html, etc., but be grouped with the sequence).
Send users to the most relevant page/URL, typically the first page of the series.



This sounds like exactly what you want, since you aren't constantly creating and destroying URLs on a daily basis (which would make them harder to crawl), and you're still giving preference to the first page in the series (today's forecast).
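As a rough sketch of what emitting that markup could look like for the /1, /2 URL scheme in the question (the helper below is illustrative, not an existing API):

def pagination_links(base_url, offset, days=7):
    # Build the rel="prev"/rel="next" <link> tags for day `offset`
    # in a 7-day series; base_url is the trailing-slash "today" URL.
    links = []
    if offset > 0:
        prev_path = base_url if offset == 1 else f"{base_url}{offset - 1}"
        links.append(f'<link rel="prev" href="{prev_path}">')
    if offset < days - 1:
        links.append(f'<link rel="next" href="{base_url}{offset + 1}">')
    return links

# e.g. for www.example.com/location/llandudno/2:
for tag in pagination_links("https://www.example.com/location/llandudno/", 2):
    print(tag)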



 

@Sarah324

My thinking is that the further into the future the forecast is, the lower the priority of the page should be?


In your particular case I think this is a fair determination. Just keep in mind that the priority flag doesn't really seem to do much, so don't expect to see anything change as a result.


Am I structuring my URLs in an effective manner, or would I be better passing the date instead of /1, /2 etc.?


Using the date would be a better indication of the page's content and more user-friendly, but using the current format isn't hurting you per se.
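For example, a date-based URL could be derived from the day offset along these lines (the path format shown here is just an assumption):

from datetime import date, timedelta

def forecast_url(slug, offset):
    # Use the actual forecast date in the path instead of a numeric offset.
    day = date.today() + timedelta(days=offset)
    return f"https://www.example.com/location/{slug}/{day.isoformat()}"

print(forecast_url("llandudno", 0))  # today's forecast
print(forecast_url("llandudno", 2))  # two days ahead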


What implications would this have when they disappear off the sitemap?


None really. Sitemaps just tell search engines where to find the content you would like them to index. If a page isn't listed the search engines can and will still find it through other means (e.g. through links) and index it accordingly.


Should I be including subsequent days in the sitemap at all, or just today's data? The pages are identical apart from the figures and the date on each.


If you want the obsolete pages to continue to be indexed, leave them in. If you do not want them indexed, remove them and also tell the search engines to forget about them; removing them from the sitemap alone will not do this. To do that, either block them via robots.txt or serve an X-Robots-Tag: noindex header on those pages.
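A rough sketch of serving that header, assuming a Flask-style app (the route and the rendering stub are purely illustrative):

from flask import Flask, make_response

app = Flask(__name__)

def render_forecast(slug, offset):
    # Stand-in for the real page rendering; illustrative only.
    return f"<html><body>Forecast for {slug}, day +{offset}</body></html>"

@app.route("/location/<slug>/", defaults={"offset": 0})
@app.route("/location/<slug>/<int:offset>")
def forecast(slug, offset):
    resp = make_response(render_forecast(slug, offset))
    if offset > 0:
        # Subsequent-day pages: ask search engines not to index them.
        resp.headers["X-Robots-Tag"] = "noindex"
    return resp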
