Save Google Search Console reports for offline reading/backup

@Smith883

Posted in: #GoogleSearchConsole

Google's Search Console provides useful information to a webmaster. One problem is that the data is only available for roughly the last 3 months. Sometimes you'd like a larger-scale overview of changes on a website.

How would you save Search Console reports (Crawl Stats / Indexed Pages / ...) for offline review later on?

One option is to save complete HTML pages and hope they load well in the browser, but this doesn't seem like a clean solution, as the associated JS doesn't always run properly. Another option would be to print to PDF, but that would lose the interactive graphs and their data.

Any ideas here?





1 Comment


@Vandalay111

I download WMT exports in the following way:


I manipulate the WMT URL to adjust the date (a single day) and to filter out some brand keywords (I don't want them in my reports),
then I create such manipulated URLs for each day of the last 90 days and save them as a list, and
run an iMacro to download all the data (the list can also be generated with a script; see the sketch after the encoding note below).


This way I get 90,000 data points (1,000 per day).

To exclude more than one keyword from WMT, the URL looks like:
www.google.com/webmasters/tools/search-analytics?hl=en&siteUrl=http://www.example.com/#state=[null,[[null,"20160202","20160202"]],null,[[null,2,["keyword1"],2,1],[null,2,["keyword2"],2,1],[null,6,["WEB"]]],null,[1,2,3,4],1,0,null,[2]]


The part that should be added to exclude a keyword is [null,2,["keyword1"],2,1];
to include one, it is [null,2,["keyword"],2,0].
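
For illustration, here is a minimal Python sketch of how these filter chips could be composed programmatically. The [null,2,[keyword],2,flag] layout is taken from the two variants above; the helper function itself is only hypothetical:

import json

def filter_chip(keyword, exclude=True):
    # Per the answer above: a trailing 1 excludes the keyword, a trailing 0 includes it.
    return [None, 2, [keyword], 2, 1 if exclude else 0]

print(json.dumps(filter_chip("keyword1"), separators=(",", ":")))
print(json.dumps(filter_chip("keyword", exclude=False), separators=(",", ":")))
# Output:
# [null,2,["keyword1"],2,1]
# [null,2,["keyword"],2,0]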

Before pressing Enter or running iMacros, the URLs must be percent-encoded (URL-encoded) and look like:
www.google.com/webmasters/tools/search-analytics?hl=en&siteUrl=http://www.example.com/#state=%5Bnull%2C%5B%5Bnull%2C%2220160202%22%2C%2220160202%22%5D%5D%2Cnull%2C%5B%5Bnull%2C2%2C%5B%22keyword1%22%5D%2C2%2C1%5D%2C%5Bnull%2C2%2C%5B%22keyword2%22%5D%2C2%2C1%5D%2C%5Bnull%2C6%2C%5B%22WEB%22%5D%5D%5D%2Cnull%2C%5B1%2C2%2C3%2C4%5D%2C1%2C0%2Cnull%2C%5B2%5D%5D

Note that when Notepad++ URL-encodes text, it doesn't encode commas; they have to be encoded manually with a search-and-replace of , to %2C.
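
Alternatively, the whole list of encoded URLs can be generated by a script, which avoids the manual search-and-replace entirely. Here is a sketch in Python, assuming the #state layout from the example above still works; Python's quote() percent-encodes commas, brackets and quotes on its own. It writes one URL per day into the wmt-data.csv file that the iMacro below reads:

import json
from datetime import date, timedelta
from urllib.parse import quote

BASE = ("https://www.google.com/webmasters/tools/search-analytics"
        "?hl=en&siteUrl=http://www.example.com/#state=")
EXCLUDE = ["keyword1", "keyword2"]  # brand keywords to filter out

def state_for(day):
    d = day.strftime("%Y%m%d")
    chips = [[None, 2, [kw], 2, 1] for kw in EXCLUDE]  # trailing 1 = exclude
    chips.append([None, 6, ["WEB"]])  # restrict to web search, as in the example
    return [None, [[None, d, d]], None, chips, None, [1, 2, 3, 4], 1, 0, None, [2]]

with open("wmt-data.csv", "w") as f:
    for offset in range(90):
        day = date.today() - timedelta(days=offset + 1)
        raw = json.dumps(state_for(day), separators=(",", ":"))
        # quote(..., safe="") escapes every reserved character,
        # so commas become %2C without any manual replacement.
        f.write(BASE + quote(raw, safe="") + "\n")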

The iMacro code follows:

VERSION BUILD=8881205 RECORDER=FX
' Don't pause between steps and ignore errors, so one failed day doesn't stop the run.
SET !TIMEOUT_STEP 0
SET !ERRORIGNORE YES
TAB T=1
' wmt-data.csv holds one pre-encoded report URL per line.
SET !DATASOURCE wmt-data.csv
SET !DATASOURCE_COLUMNS 1
SET !LOOP 1
' Each loop iteration reads the next line (URL) from the data source.
SET !DATASOURCE_LINE {{!LOOP}}
URL GOTO={{!COL1}}
WAIT SECONDS=6
' Save the export with a timestamp suffix, then click the download and confirm buttons.
ONDOWNLOAD FOLDER=* FILE=+_{{!NOW:yyyymmdd_hhnnss}} WAIT=YES
EVENT TYPE=CLICK SELECTOR="HTML>BODY>DIV>DIV:nth-of-type(3)>DIV:nth-of-type(2)>DIV>DIV>DIV>DIV:nth-of-type(4)>DIV:nth-of-type(3)>DIV:nth-of-type(2)>DIV>DIV" BUTTON=0
EVENT TYPE=CLICK SELECTOR="HTML>BODY>DIV:nth-of-type(11)>DIV:nth-of-type(3)>BUTTON" BUTTON=0


Finally, you get as many data points per day as you wish; you can filter and export WMT data endlessly. All the data comes as CSV files, from which I build, for example, clients' own CTR dashboards.
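
As an illustration of that last step, here is a minimal pandas sketch that rolls the daily CSV exports up into one CTR overview. The downloads/ folder and the "Clicks"/"Impressions" column names are assumptions; check them against the headers of your own exports:

import glob
import pandas as pd

rows = []
for path in sorted(glob.glob("downloads/*.csv")):  # assumed: one export per day
    df = pd.read_csv(path)
    # Column names are assumptions; adjust them to the actual export headers.
    rows.append({"file": path,
                 "clicks": df["Clicks"].sum(),
                 "impressions": df["Impressions"].sum()})

daily = pd.DataFrame(rows)
daily["ctr"] = daily["clicks"] / daily["impressions"]
print(daily)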

A really fine and useful thing would be if somebody wrote a VBA script that logged into WMT, helped with filtering, exported the data, and imported it into Excel.


