
Are more pages in one site better for SEO?

@Correia994

Posted in: #Seo

I developed a site nearly two years ago (PHP, Java), but I am new to SEO.

My database has almost 1M articles. I have some questions I need help with:


Is it a good idea to create static pages from all the articles?
Are more pages in one site better for SEO?
If I turned each article into a static page, that could be many GBs. How do I let search engine crawlers collect my database?


3 Comments


 

@Correia994

I will address your points separately.

Is it a good idea to create static pages from all the articles?

No, it is better to create a single endpoint (e.g. a PHP script) which retrieves the article from the database and displays it as HTML. Of course, it is always a good idea to implement some caching mechanism to avoid hitting the database repeatedly for popular articles.
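
For illustration, here is a minimal sketch of such an endpoint with simple file-based caching. The PDO connection details, the "articles" table with its id, title and body columns, and the cache/ directory are assumptions made for the example, so adapt them to your own schema.

<?php
// article.php - minimal sketch of a single dynamic endpoint with file caching.
// Assumed for the example: a MySQL database reachable via PDO and a table
// "articles" with columns id, title and body.

$id = (int) ($_GET['id'] ?? 0);
$cacheFile = __DIR__ . "/cache/article-$id.html";

// Serve the cached copy if it is less than an hour old.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < 3600) {
    readfile($cacheFile);
    exit;
}

$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');
$stmt = $pdo->prepare('SELECT title, body FROM articles WHERE id = ?');
$stmt->execute([$id]);
$article = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$article) {
    http_response_code(404);
    exit('Article not found');
}

// Render the article as HTML, cache it, and send it to the visitor.
$html = '<!DOCTYPE html><html><head><title>'
      . htmlspecialchars($article['title'])
      . '</title></head><body><h1>'
      . htmlspecialchars($article['title'])
      . '</h1><div>' . $article['body'] . '</div></body></html>';

file_put_contents($cacheFile, $html);
echo $html;

With a front controller like this, each article gets its own crawlable URL (e.g. article.php?id=12345) without needing a static file per article.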

However, creating a static page for every article is unnecessary, as the content would then be stored in two places: in the database and in static HTML files.

Are more pages in one site better for SEO?

I am guessing that by this you mean: is it better to have 1 million articles on a single site, or to split them across multiple sites?

The simplest answer is: it depends. If your articles cover mixed topics, or you can divide them into categories (e.g. 10,000 articles per category), it might be wise to split each category into its own website and try to rank the sites on their own, to gain better search engine rankings for different long-tail keywords/niches. However, if your goal is to build a general article directory, it might be better to include all 1 million articles in one site. Naturally, you would still want to separate them into topics or separate pages, so that search engines and real human visitors can better navigate and find relevant content.

There is no single solution that fits all cases.

How do I let search engine crawlers collect my database?

You can't just upload your database to the search engines, but you can make it easier for them to crawl the pages on your site. By checking your site's content, the crawlers determine whether a page is worth including in the SERPs (search engine results pages) for a given keyword.

You should also know that search engines don't just use quantitative factors when ranking pages (e.g. whether your page ends up as the first result or the 1000th), but qualitative ones as well: how old your site is, its PageRank, and its authority (how many incoming links you have from other websites and their quality).



 

@RJPawlick198

It doesn't matter where your pages come from (e.g. database / static).

You just have to make sure that all your pages can be accessed by following links on your site (this is how search engines crawl your site).

You can do this either by adding the URLs to a menu (not really the way to go with 1M pages :-) ) or via links somewhere else on your site.

To 'help' search engines index your pages you could add an HTML sitemap (basically a page with your most important links on it) and/or an XML sitemap.

An XML sitemap will contain the links to the pages in a specific format.

You should save the file sitemap.xml in the root of your website.

Example of contents of sitemap.xml:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/page</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/anotherpage</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
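
As a rough sketch of how this might scale to ~1M articles: the sitemap protocol allows at most 50,000 URLs per file, so you would generate several sitemap files from the database and reference them from a sitemap index saved as sitemap.xml. The table name, URL pattern and file locations below are assumptions for the example.

<?php
// generate-sitemaps.php - sketch of building sitemap files from the database.
// ~1M articles are split into chunks of 50,000 URLs, each written to its own
// file, and a sitemap index (saved as sitemap.xml) points to all of them.
// Assumed: a PDO connection and an "articles" table with an "id" column.

$pdo     = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');
$perFile = 50000;
$total   = (int) $pdo->query('SELECT COUNT(*) FROM articles')->fetchColumn();
$files   = (int) ceil($total / $perFile);

for ($i = 0; $i < $files; $i++) {
    $stmt = $pdo->prepare('SELECT id FROM articles ORDER BY id LIMIT ? OFFSET ?');
    $stmt->bindValue(1, $perFile, PDO::PARAM_INT);
    $stmt->bindValue(2, $i * $perFile, PDO::PARAM_INT);
    $stmt->execute();

    $xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
         . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    while (($id = $stmt->fetchColumn()) !== false) {
        $xml .= "  <url><loc>http://www.example.com/article.php?id=$id</loc></url>\n";
    }
    $xml .= '</urlset>';
    file_put_contents(__DIR__ . "/sitemap-$i.xml", $xml);
}

// Sitemap index that search engines can discover at /sitemap.xml.
$index = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
       . '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
for ($i = 0; $i < $files; $i++) {
    $index .= "  <sitemap><loc>http://www.example.com/sitemap-$i.xml</loc></sitemap>\n";
}
$index .= '</sitemapindex>';
file_put_contents(__DIR__ . '/sitemap.xml', $index);

You would rerun a script like this periodically (e.g. via cron) so the sitemap files keep up with new articles.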



 

@Sarah324

Search engines don't know whether articles come from a database or not. They only see the HTML a URL produces. So if you have one PHP/Java file that gets each article from the database using a unique URL (e.g. example.com?id=12345, where 12345 is the ID of the article), then you will have a website with 1 million pages. Each of those pages will be ranked on its own merit (quality of content, semantic markup, link popularity, etc.).
