How to download all queries from Google Search Console Analytics?

@Frith620

Posted in: #Download #GoogleSearchConsole #Keywords #Seo

In Google Search Console, in Search Analytics, it only lets you download 999 lines of data.

How do I download all queries from Search Analytics in Google Search Console?


5 Comments


 

@Bryan171

You can download all keywords (not limited to 1,000) from Search Console using the API. The trick is to first get the search landing pages via the Google Analytics API, and then query the corresponding keywords for each of those pages. You can find a step-by-step guide (including a Python script) here. The main disadvantage is that it takes quite some time to run (several hours for bigger sites).
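Roughly, the per-page step looks like this (an untested sketch, not the guide's actual script: it assumes you already have an authorized Search Console API client called webmasters_service, as in the Python example in another answer in this thread, plus your verified property URL and a list of landing pages pulled from the Analytics API; the dates are placeholders):

# Sketch: for each landing page found via the Analytics API, ask the
# Search Console (webmasters v3) API which queries led to that page.
site_url = 'http://www.example.com/'                    # assumed: your verified property
landing_pages = ['http://www.example.com/some-page/']   # assumed: pages from the Analytics API

all_rows = []
for page in landing_pages:
    request = {
        'startDate': '2017-01-01',   # placeholder date range
        'endDate': '2017-01-31',
        'dimensions': ['query'],
        'dimensionFilterGroups': [{
            'filters': [{
                'dimension': 'page',
                'operator': 'equals',
                'expression': page
            }]
        }],
        'rowLimit': 5000
    }
    response = webmasters_service.searchanalytics().query(
        siteUrl=site_url, body=request).execute()
    all_rows.extend(response.get('rows', []))

for row in all_rows:
    print("%s\tclicks=%s\timpressions=%s" % (row['keys'][0], row['clicks'], row['impressions']))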



 

@Phylliss660

I noticed that if you connect Google Analytics with Webmaster Tools, then inside Google Analytics the "Acquisition > Search Engine Optimization > Queries" report shows all the keywords.



 

@Sims2060225

If you are not comfortable with programming, you can use the API Explorer at the end of the page:
developers.google.com/apis-explorer/#p/webmasters/v3/
If you are comfortable with code, here is an example of how to do it:

#!/usr/bin/python

import httplib2

from apiclient import errors
from apiclient.discovery import build
from oauth2client.client import OAuth2WebServerFlow


# Copy your credentials from the console
CLIENT_ID = 'YOUR_CLIENT_ID'
CLIENT_SECRET = 'YOUR_CLIENT_SECRET'

# Check developers.google.com/webmaster-tools/v3/ for all available scopes
OAUTH_SCOPE = 'https://www.googleapis.com/auth/webmasters.readonly'

# Redirect URI for installed apps
REDIRECT_URI = 'urn:ietf:wg:oauth:2.0:oob'

# Run through the OAuth flow and retrieve credentials
flow = OAuth2WebServerFlow(CLIENT_ID, CLIENT_SECRET, OAUTH_SCOPE, REDIRECT_URI)
authorize_url = flow.step1_get_authorize_url()
print 'Go to the following link in your browser: ' + authorize_url
code = raw_input('Enter verification code: ').strip()
credentials = flow.step2_exchange(code)

# Create an httplib2.Http object and authorize it with our credentials
http = httplib2.Http()
http = credentials.authorize(http)

webmasters_service = build('webmasters', 'v3', http=http)

# Retrieve list of websites in account
site_list = webmasters_service.sites().list().execute()

# Remove all unverified sites
verified_sites_urls = [s['siteUrl'] for s in site_list['siteEntry'] if s['permissionLevel'] != 'siteUnverifiedUser']

# Printing the urls of all sites you are verified for.
for site_url in verified_sites_urls:
    print site_url

    # Retrieve list of sitemaps submitted for this site
    sitemaps = webmasters_service.sitemaps().list(siteUrl=site_url).execute()
    if 'sitemap' in sitemaps:
        sitemap_urls = [s['path'] for s in sitemaps['sitemap']]
        print " " + "\n ".join(sitemap_urls)


Reference: https://developers.google.com/webmaster-tools/v3/searchanalytics?hl=en
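The script above only lists your sites and sitemaps; to actually download the queries you still need to call the Search Analytics endpoint referenced above. A rough, untested sketch that reuses the authorized webmasters_service and verified_sites_urls from the script (the dates are placeholders; 5,000 is the documented per-request maximum):

# Sketch: pull up to 5000 queries for the first verified property
request = {
    'startDate': '2017-01-01',   # placeholder date range
    'endDate': '2017-01-31',
    'dimensions': ['query'],
    'rowLimit': 5000             # per-request maximum
}
response = webmasters_service.searchanalytics().query(
    siteUrl=verified_sites_urls[0], body=request).execute()

for row in response.get('rows', []):
    # keys[0] is the query text; clicks, impressions, ctr and position are also returned
    print("%s\t%s\t%s" % (row['keys'][0], row['clicks'], row['impressions']))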



 

@Tiffany637

Using just the web interface you can only ever get the top 1000 keywords.

However, if you use the Search Console API, you can get a maximum of 5,000 keywords:
developers.google.com/webmaster-tools/?hl=en
Also, if you cannot use the API, another way to get more keywords is to add every subfolder as a new property in Search Console (if you use subfolders on your site).

This way you get more keywords, and ones specific to that area of the site.


