
Should I prevent search engines from indexing empty user profile pages? If so, how much content is enough for indexing?

@Martha676

Posted in: #Google #RobotsTxt #WebCrawlers

I'm developing a social website for book readers, with public user profile pages. For each user, there are several pages available:


- The main user page (about me, last activity, ...)
- Several book listing pages:
  - Personal library
  - Wishlist
  - Reading list
  - ... and more

I already have close to 10,000 registered users, many of whom have little or no activity.

I know that Google cannot index millions of user profile pages from day one, and I don't want it to index useless pages that basically contain the user name and "This user has no activity". Google is not willing to index many of my pages at the moment, and I would like to be able to give it a hint as to which pages are relevant.

Should I explicitly prevent Google from indexing the empty profile pages? I was thinking of a noindex, follow robots meta tag, which would basically tell Google that it's OK to follow links from this page, but that its content is of little value.
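
In other words, something like this in the page head (this is just the standard robots meta tag syntax, shown for reference):

    <meta name="robots" content="noindex, follow">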

I know Google will not magically find empty profile pages if they're not linked from anywhere (and I won't make the mistake of putting them in a sitemap); however, I'm more concerned about the "almost empty" profiles: someone writes a single review, his profile page is linked from the book page, and suddenly Googlebot finds his almost empty profile page and his fully empty book listing pages. I don't want it to index those until they contain some content.

Is it a good idea to set a lower limit on the page content (for example, a personal library page with at least 10 books, or a profile page with a long enough About Me and some activity to display), and only explicitly allow bots to index those pages?
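
To make the idea concrete, here is a rough sketch of the kind of rule I have in mind; the function name, thresholds and field names are only illustrative assumptions, not anything my site actually uses:

    # Sketch only: decide the robots meta value for a profile page.
    # Thresholds and parameter names are made up for illustration.
    def robots_meta_for_profile(about_me: str, library_count: int, review_count: int) -> str:
        has_enough_content = (
            len(about_me) >= 200      # a reasonably long "About me"
            or library_count >= 10    # e.g. a personal library with 10+ books
            or review_count >= 3      # more than a single stray review
        )
        # Thin profiles stay crawlable (so their links can still be followed),
        # but are kept out of the index until they have real content.
        return "index, follow" if has_enough_content else "noindex, follow"

    # A user with one review, an empty library and no bio stays out of the index:
    print(robots_meta_for_profile("", library_count=0, review_count=1))  # noindex, follow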


1 Comment


@Connie744

You can deal with this in different ways.

1. Using robots.txt

Let's say your website is: example.com

And the user profile URLs look like this:

Structure 1: Well-designed Profile

- example.com/user/user1 => Main Profile
- example.com/user/user1/other-page => Other Page


Structure 2: Basic Profile

You can create a different URL structure, like:

- example.com/minuser/user1 => Basic Profile
- example.com/minuser/user1/other-page => Basic Other Page


You can create a condition, for example: if a user has at least 5 activity items ticked, use structure 1; otherwise use structure 2. (It is not recommended to calculate this at run time; precompute and store the result, as sketched after the note below.)

Note: While designing the algorithm you need to think about many things (for example, what should happen to the old URL when a user crosses the threshold and moves from one structure to the other).
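
A minimal sketch of that precompute idea, assuming invented field names and a threshold of 5 (adapt both to your own data model): recompute the decision whenever the user's activity changes, store it, and let your routing read the stored flag instead of recalculating on every request.

    # Sketch: store the structure decision instead of computing it per request.
    # THRESHOLD, the user fields and save() are placeholders, not a real API.
    THRESHOLD = 5

    def update_profile_structure(user):
        # Count the "things ticked" for this user (books, reviews, bio, ...).
        score = user.library_count + user.review_count + (1 if user.about_me else 0)
        # Persist the result; routing then serves /user/... or /minuser/... based on it.
        user.use_full_profile = score >= THRESHOLD
        user.save()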

BLOCK: You can then block /minuser using robots.txt, as shown below.
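
These are standard robots.txt directives; the path just assumes the /minuser/ prefix used above:

    User-agent: *
    Disallow: /minuser/

Keep in mind that a Disallow rule only stops crawling of those URLs; see the warning below before combining it with a noindex tag.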

2. Using a noindex meta tag

As discussed in the question, you can try the noindex, follow meta tag on the thin pages, though it may not be as clean an approach as the separate URL structure above.

Warning

It is worth reading about the noindex vs. robots.txt block issue here: support.google.com/webmasters/answer/93710?hl=en In short, if a URL is blocked by robots.txt, Googlebot cannot fetch the page and therefore never sees its noindex tag, so do not rely on both mechanisms for the same URLs.
