
How can I track / discover low-performing ("invisible") pages in PIWIK and/or Google Analytics?

@Candy875

Posted in: #GoogleAnalytics #LandingPage #Matomo #Seo

Let's say I have a website with a couple of hundred pages – some performing well, others not so well. And then there are some pages that are what I would call 'invisible pages' – no visitors ever entering through these pages since they are not ranking well in SERPs.

If I now wanted to focus my SEO efforts on these 'invisible pages' – how do I even find them? In PIWIK and Google Analytics, landing pages are only listed if they generate at least one entry in the chosen time period.

I already asked this question over at the PIWIK forums ("How can I track entry pages with 0 (zero) entries?") but to my surprise I never got any answer or suggestion. So maybe this is not as easy to accomplish as I first thought? Any pointers or suggestions are appreciated.




2 Comments


@Angela700

It might be possible within GA to get 99% of the pages, if not all (do you have Search Console linked to GA?).

Go to the Behaviour Report -> All Pages.

After the report has loaded for the desired timeframe, select the pivot table view. Then, on the left-hand side, choose "Pivot by Medium" and sort the table by the Organic column. Done.

Consider some manual work

Alternatively, you can export the All Pages report and compare the result with the indexed pages in Search Console (formerly Google Webmaster Tools). There is no point in considering pages that have not been indexed; they will not receive any organic traffic anyway, will they?
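That comparison is easy to script once you have both exports as CSV files. A minimal Python sketch – the file names and the assumption that URLs sit in the first column are placeholders, so adjust them to match your actual exports:

```python
# Sketch: find indexed pages that never show up in the GA "All Pages" export.
# The file names used below are hypothetical placeholders.
import csv

def read_urls(path, column=0):
    """Read one column of URLs from a CSV export, skipping the header row."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows, None)  # skip the header line
        return {row[column].strip() for row in rows if row}

def invisible_pages(indexed_csv, visible_csv):
    """URLs that are indexed but absent from the traffic export."""
    return sorted(read_urls(indexed_csv) - read_urls(visible_csv))

# Usage (with your real export files):
# for url in invisible_pages("search_console_indexed.csv", "ga_all_pages.csv"):
#     print(url)
```

Note that both exports must use the same URL format (e.g. both path-only, or both absolute) for the set difference to be meaningful.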


@Sherry384

Invisible pages, by your definition, will simply not appear in GA (or any other JavaScript-based analytics software). GA only records a page view after its JavaScript snippet fires and reports back.

So the best approach to this issue is to run a crawler on your site, starting from the home page. Let the crawler discover what you have there, then make a CSV file containing all unique URLs. Then go to the GA All Pages report and export everything that you consider "visible" or "performing well". It is up to you how to define this; some suggestions:


all pages that had at least 10 visitors during the last 30 days
all pages that had at least 50 impressions during the last 30 days


After that you will have two CSV files: all pages (crawler) and all "visible" pages (GA). Remove the URLs that appear in both files, which leaves only the pages that need some work.

I suggested a crawler because it is better than exporting directly from your CMS: a crawler, in theory, has the same access level as your users and search engine bots. It is safe to assume that if a bot cannot find a page, users will not find it either.
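Any dedicated SEO crawler will do the discovery step, but for illustration it can be sketched with nothing beyond the Python standard library. This is a bare-bones sketch, not a production crawler (it ignores robots.txt and does no rate limiting); `https://example.com` and the output file name are placeholders:

```python
# Minimal same-host crawler sketch: breadth-first from the home page,
# collecting every unique internal URL it can discover.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag fed to it."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=1000):
    """Return the sorted set of same-host URLs reachable from start_url."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page: skip it
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]  # resolve and drop fragments
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return sorted(seen)

# Usage: write one URL per line, ready to diff against the GA export.
# with open("all_pages.csv", "w") as f:
#     f.write("\n".join(crawl("https://example.com")))
```

Restricting to the same host and stripping `#fragment` parts keeps the URL list deduplicated and comparable with the analytics export.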


