
How to allow Google to index my deep pages?

@Bryan171

Posted in: #GoogleIndex #Seo

Let's say I'm building an open wiki system. Each user can start a new page, give it a name like "how-to-travel-faster-than-light" which will turn into the permalink: "mysite.com/how-to-travel-faster-than-light".

There is no site map listing every link to every user page.

My question is: how would Google ever find and index the "how-to-travel-faster-than-light" page if it can't be reached, even recursively, from my landing page?

How did Wikipedia or Stack Exchange solve this problem? Obviously every one of their pages is indexed, yet they have no single central list of all page links that a crawler could find. Have they submitted every page to Google for indexing?

Thanks.




1 Comment


 

@Shakeerah822

How would google ever find and index the "how-to-travel-faster-than-light" page if it can't find it, ...


If Google can't find it, it can't index it; it's that simple. However, Google can potentially discover URLs in many other ways, from scanning emails in Gmail to URLs typed into its Chrome web browser (although these methods are naturally unreliable if you are trying to get a URL indexed).


How did wikipedia or stack exchange solve this problem?


Well, Wikipedia has numerous contents and hierarchy indices, as well as a complete alphabetical index, so it certainly appears spiderable. It is also very well cross-linked, and its inbound links are second to none. It might even have a sitemap (although I can't find it); it is still well within Google's sitemap limit of roughly 2.5 billion URLs (50,000 URLs per sitemap file, and up to 50,000 sitemaps per sitemap index file, as of 2017).

The Stack Exchange Questions page, which lists new questions first (and bumps active questions back to the top), is naturally a spiderable index of all questions on the site. There is also an RSS question feed, and until recently there was an XML sitemap. (The sitemap URL stated in the Webmasters robots.txt currently seems to return a 404.)


There is no site map listing every link to every user page.


Why not? A sitemap is the standard way of informing search engines about hard-to-reach pages.
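As a rough sketch, a minimal sitemap for the example page from the question could look like this (the `lastmod` date is a placeholder, and `mysite.com` is the hypothetical domain from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mysite.com/how-to-travel-faster-than-light</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>
```

You then declare its location with a `Sitemap:` line in robots.txt or submit it in Google Search Console.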

When a page is created you can ping Google and other search engines to notify them of an update; they will then request your updated (auto-generated) sitemap.
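Since each wiki page has a known slug, the sitemap can be regenerated on demand. A minimal sketch in Python, using only the standard library (the base URL and slug list are placeholders for whatever your wiki stores):

```python
# Sketch: auto-generate sitemap XML from a list of wiki page slugs.
# In a real system the slugs would come from your page database.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(base_url, slugs):
    """Return sitemap XML (as bytes) with one <url> entry per page slug."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for slug in slugs:
        url = SubElement(urlset, "url")
        loc = SubElement(url, "loc")
        loc.text = f"{base_url}/{slug}"
    return tostring(urlset, encoding="utf-8", xml_declaration=True)

sitemap = build_sitemap(
    "https://mysite.com",
    ["how-to-travel-faster-than-light"],
)
```

Serve the result at a stable URL such as `/sitemap.xml` (regenerating it, or caching it, whenever a page is created) so search engines always fetch a current list.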

Just having Google Analytics installed on your site will at least allow Google to discover URLs the first time you (or the author) visit the page.


