Will providing an internal link graph allow the Google bot to index my site faster, and update it more often?
A programmer here, but a very inexperienced webmaster. I am building a social site where new content is generated literally every day. On top of that, the content of my already-indexed pages changes quite often (I pull a lot of material from Twitter and similar sources). Yet when I go to Webmaster Tools, I see the indexing status frozen where it was a few days ago. The site is still in beta, so you can imagine a lot of metadata is missing from the pages, along with other gaps that could in theory make the Google bot treat my site as spam and refuse to index it — but no problems are reported, only slow indexing. Worse, the pages that are already indexed are a week old, which is a long time given how much the content has changed.
I thought of "tricking" the Google bot into consuming my site as a never-ending graph structure: if I add a section to each page, called "Other pages you might like", linking to other content pages, the bot would crawl them continuously and revisit them more often.
Is this a way to go, or is it utterly pointless? What else would you advise me to do? How can I tell the Google bot "come back and revisit this page in one day"?
As you know, Google's spider checks every site on a schedule that depends on many parameters.
The best practices are:
Register the site in Google Webmaster Tools and check your sitemap and visibility
Avoid errors such as 404s and other response-code errors
Check your meta tags as Google recommends: Metadata Google
Provide a robots.txt that points to your sitemap (Webmaster Tools includes a robots.txt checker)
It is now important to serve content correctly to both mobile and desktop (check visibility with the tools in Webmaster Tools)
Offer an RSS 2.0 or Atom feed of your content (especially daily content)
Help Google by indicating an "Expires" date for each news item
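On the sitemap point in the list above: a sitemap entry can carry <lastmod> and <changefreq> values, which is the closest thing there is to telling the bot "come back and revisit this page in one day" — though Google documents these as hints, not commands. A minimal Python sketch (the URL and frequency are hypothetical examples, not values from the question):

```python
# Sketch: generate a minimal sitemap.xml with lastmod/changefreq hints.
# The URL below is a placeholder, not a real site.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(entries):
    """entries: iterable of (url, lastmod_iso_date, changefreq) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod, changefreq in entries:
        lines += ['  <url>',
                  '    <loc>%s</loc>' % escape(url),
                  '    <lastmod>%s</lastmod>' % lastmod,
                  '    <changefreq>%s</changefreq>' % changefreq,
                  '  </url>']
    lines.append('</urlset>')
    return '\n'.join(lines)

# Pages that pull from Twitter could be marked "hourly" or "daily".
xml = build_sitemap([('https://example.com/latest',
                      date.today().isoformat(), 'daily')])
print(xml)
```

Regenerate and resubmit the sitemap whenever content changes, so the lastmod dates stay honest; stale lastmod values teach crawlers to ignore them.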
Additionally, you can activate tools like Google Play Kiosko (formerly Google Currents) to index your RSS/Atom feed directly, and set up an account with a service like FeedBurner.
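For the RSS/Atom suggestion above, a minimal Atom feed might look like this — a sketch only, with placeholder URLs, dates, and titles:

```xml
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Site: latest content</title>
  <link href="https://example.com/feed.atom" rel="self"/>
  <link href="https://example.com/"/>
  <id>https://example.com/feed.atom</id>
  <updated>2015-01-01T12:00:00Z</updated>
  <author><name>Example Site</name></author>
  <entry>
    <title>New page title</title>
    <link href="https://example.com/pages/123"/>
    <id>https://example.com/pages/123</id>
    <updated>2015-01-01T12:00:00Z</updated>
    <summary>Short description of the new page.</summary>
  </entry>
</feed>
```

Keep the feed's updated timestamps accurate; feed readers and indexers key off them to decide what is new.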
With all this, your website will probably be treated much like a news site.
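And on the robots.txt point from the list above: the file itself is tiny, and the Sitemap line is what ties it to crawling. A sketch (the URL and disallowed path are hypothetical):

```
# robots.txt, served from the site root
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```

Webmaster Tools' robots.txt tester will confirm that the file parses and that no content pages are accidentally blocked.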
I am assuming that because you said your site is in "beta", it is a new site. You have several things working against you for a period. Without knowing your domain name, I cannot fully assess your site; however, here is a short list.
New site. Site age is a significant trust factor.
New registration. Registrant contacts, addresses, phone numbers, and e-mail addresses signal trust. If they are new, no trust is given.
Stability ranking. If your site is new, your stability ranking is 0. You have no stability history to establish trust.
Inbound (back) links. A significant ranking factor.
Content. If your site is new, your content has not soaked into the index/SERPs, and that will take at least 6 months to a year.
Content freshness. If your site is new, your pages will have longer TTL-style metrics, which affect page fetches. It takes a while for these metrics to move to shorter times; it could be as long as 6 months.
Social signals. It will take time to garner an audience. Your social signals are not likely strong enough yet to indicate content popularity and relevance.
Here are some other factors to consider.
Registrar trust. If you have not chosen a high-quality registrar, you are negatively affecting your trust score.
Host trust. If you have not chosen a high-quality host, you are negatively affecting your trust score.
Private registration. If you use private registration and that service company is known for not properly vetting registration information, you are negatively affecting your trust score.
Registrant trust. If your site is registered with registration contacts, addresses, phone numbers, and e-mail addresses that are known for low-quality sites, you are negatively affecting your trust score.
Shared hosting. If your site is on a server or using an IP address known for blacklisted or badly behaved sites, you are negatively affecting your trust score.
TLD quality. If your site uses a ccTLD or other TLD known for poor-quality sites, you are negatively affecting your trust score.
Content life span. If your content changes often and your content pages are short-lived — pages are created, live for a short period, and are then deleted — your content cannot establish trust.
These are just a few things that could be affecting how search engines view your site. The principal SEO factors are site and page trust, content life span, content freshness, inbound links (back links), and social signals. If you have done these things well, it will simply take time for any site to begin to rank and enjoy rapid fetches and index refreshes. I had a site that would have a new page indexed and receiving search traffic inside of 20 minutes. That is very possible, but it takes time to get there. It can take 6 months, even when all signals are strong, for any site to begin to perform as it should — and sometimes longer if some of the signals are weak or mediocre.
Of course, plain old-fashioned SEO must be applied. This helps search engines to trust your content and site. "Tricks" are seen as manipulation. The bottom line is that you cannot hurry search engines. If you try, you will get burned, and recovery can take a very long time. Patience, solid work, and high-quality content is the best advice anyone can give a new site. The best trick is creating a high-quality site that people want to link to, visit often, and engage with on social media. Short of that, any site has an uphill climb.
You will need a higher PR rating and a high SEO rating before the bot will consider visiting your site every day.
Read over the relevant support article by Google: there are over 200 factors that the bot takes into account for page relevancy, and most of them are kept secret so that people will not try to manipulate the bot.