How to manually list a set of URLs for search engines to index

I have created a video website that holds thousands of videos, with thousands more added daily. Here is my problem:
I built the site so that it loads an HTML skeleton and fills in all the content through JavaScript and Ajax. The problem is that search engines aren't indexing anything except the home page.
Is there a way, say in robots.txt, to point to a single HTML page that links to all of these videos?
I accept that my site is not accessible to non-JavaScript users, but my stats show that this group is very small (under 0.2 %).
Is there a way to keep the site fully AJAX-based and still get each individual video listed on Google?
Google suggests this solution for indexing AJAX sites that use 'hash-bang' URLs (I don't know whether yours does).
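Under that scheme, a crawler that sees a hash-bang URL requests an alternate URL where the fragment is moved into an `_escaped_fragment_` query parameter, and your server is expected to return a rendered HTML snapshot for it. A minimal sketch of that URL rewrite (the `example.com` URL is a placeholder, and this ignores the percent-encoding of special characters that the full scheme requires):

```python
def escaped_fragment_url(hash_bang_url):
    """Rewrite a hash-bang URL into the _escaped_fragment_ form that
    crawlers following Google's AJAX crawling scheme request.

    http://example.com/#!video=123
        -> http://example.com/?_escaped_fragment_=video=123
    """
    base, sep, fragment = hash_bang_url.partition("#!")
    if not sep:
        # Not a hash-bang URL; the crawler fetches it as-is.
        return hash_bang_url
    joiner = "&" if "?" in base else "?"
    return base + joiner + "_escaped_fragment_=" + fragment
```

Your server then has to recognize requests containing `_escaped_fragment_` and serve the pre-rendered page content for that video.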
You should add a sitemap.xml to your site and update it each time a new video is added.
Refer to Google's documentation for sitemaps: About Sitemaps.
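Since new videos arrive daily, regenerating the sitemap from your video list is easy to script. A minimal sketch (the `example.com` URLs are placeholders; a real sitemap is limited to 50,000 URLs per file, after which you need a sitemap index):

```python
import xml.etree.ElementTree as ET

def build_sitemap(page_urls):
    """Build a minimal sitemap.xml document (returned as a string)
    from a list of video page URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page_url in page_urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://example.com/#!video=123",   # hypothetical video pages
    "http://example.com/#!video=124",
])
```

Point crawlers at the generated file with a `Sitemap: http://example.com/sitemap.xml` line in robots.txt, or submit it through Google's webmaster tools.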