Automate SEO Reports with Python and Google Sheets

Few things in SEO are more tedious than building the monthly report. Open Search Console, export data, paste it into Sheets, build the formulas, adjust the charts… and next month, all over again. What if I told you that you can automate 90% of that process?

With Python, the Search Console API, and the gspread library, you can create a script that does all of this for you. It runs, pulls the data, formats it, and drops it into your Google Sheet ready to share. That simple.

Why automate SEO reports?

It’s not just about saving time (though that too). It’s because:

  • You reduce human error — copy-pasting data between tools is a recipe for mistakes
  • You get consistency — the report always has the same structure, the same metrics
  • You free up time for analysis — instead of spending 2 hours building the report, you spend 2 hours interpreting it
  • You can scale — if you manage 5 clients, you don’t want to build 5 reports manually each month

What you need

The stack is straightforward:

  • Python 3 — if you don’t have it, install via Homebrew (Mac) or python.org
  • google-auth + google-api-python-client — to connect to the Search Console API
  • gspread — to write to Google Sheets (recent versions authenticate with google-auth directly; the older oauth2client library is deprecated and no longer needed)
  • A Google Cloud service account — free, takes 5 minutes to set up

Step 1: Set up the service account

Go to Google Cloud Console, create a new project (or use an existing one), enable the Search Console and Google Sheets APIs, and create a service account. Download the JSON credentials file.

Then share your Google Sheet with the service account email (the one ending in @*.iam.gserviceaccount.com). This gives it write access.

Step 2: The basic script

Here’s the script structure. This isn’t copy-paste-ready code, but the logic you need to implement:

# 1. Authentication
# Load credentials from the service account JSON
# Connect to the Search Console API and gspread

# 2. Pull Search Console data
# Define the site, date range (last 28 days)
# Request metrics: clicks, impressions, ctr, position
# Group by: query, page, or both
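The request itself is a JSON body passed to searchanalytics().query(). A sketch of building it — the site URL in the usage comment is a placeholder, and the three-day offset reflects Search Console's usual data lag:

```python
from datetime import date, timedelta

def sc_request_body(days: int = 28, row_limit: int = 1000) -> dict:
    """Build a Search Analytics query body covering the last `days` full days."""
    end = date.today() - timedelta(days=3)  # GSC data typically lags 2-3 days
    start = end - timedelta(days=days - 1)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,
    }

# Usage against the authenticated service object:
# response = search_console.searchanalytics().query(
#     siteUrl="https://www.example.com/", body=sc_request_body()
# ).execute()
# rows = response.get("rows", [])
# Each row carries "keys" (the dimension values) plus clicks, impressions, ctr, position.
```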

# 3. Process the data
# Calculate totals, averages, MoM comparisons
# Identify top queries, top pages, biggest drops
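The processing step is plain Python over the API rows. A minimal sketch (the row dicts mirror the shape of the Search Analytics response):

```python
def summarize(rows: list[dict]) -> dict:
    """Totals and weighted averages from Search Analytics rows."""
    clicks = sum(r["clicks"] for r in rows)
    impressions = sum(r["impressions"] for r in rows)
    ctr = clicks / impressions if impressions else 0.0
    # Average position weighted by impressions, matching how GSC aggregates it.
    position = (
        sum(r["position"] * r["impressions"] for r in rows) / impressions
        if impressions else 0.0
    )
    return {"clicks": clicks, "impressions": impressions, "ctr": ctr, "position": position}

def top_by_clicks(rows: list[dict], n: int = 20) -> list[dict]:
    """The n rows with the most clicks, for the top-queries / top-pages tables."""
    return sorted(rows, key=lambda r: r["clicks"], reverse=True)[:n]
```

Recomputing CTR and position from the raw rows, rather than averaging the per-row values, keeps the summary consistent with what Search Console itself displays.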

# 4. Write to Google Sheets
# Open the spreadsheet by ID or name
# Clear the raw data sheet
# Write new data with headers
# Update the summary sheet with totals
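Writing to the sheet is a few gspread calls. A hedged sketch — the spreadsheet ID, the "Raw data" tab name, and the column set are assumptions you'd adapt to your own report:

```python
def rows_to_values(rows: list[dict]) -> list[list]:
    """Flatten API rows (dimension keys + metrics) into sheet rows with a header."""
    header = ["query", "page", "clicks", "impressions", "ctr", "position"]
    values = [header]
    for r in rows:
        values.append(r["keys"] + [r["clicks"], r["impressions"], r["ctr"], r["position"]])
    return values

# Usage with an authorized gspread client:
# sh = sheets.open_by_key("YOUR_SPREADSHEET_ID")
# ws = sh.worksheet("Raw data")
# ws.clear()
# ws.update(values=rows_to_values(rows), range_name="A1")
```

A single update() call with the whole 2D list is much faster than writing cell by cell, and stays well clear of the Sheets API rate limits.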

# 5. (Optional) Send notification
# An email, a Slack message, whatever you prefer
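For the Slack option, an incoming webhook is the simplest route — a sketch using only the standard library, where the webhook URL is a placeholder you'd create in your Slack workspace:

```python
import json
import urllib.request

def slack_message(summary: dict) -> str:
    """Format the report summary as a short notification line."""
    return (
        f"SEO report updated: {summary['clicks']} clicks, "
        f"{summary['impressions']} impressions, CTR {summary['ctr']:.1%}"
    )

def notify_slack(webhook_url: str, summary: dict) -> None:
    """POST the summary to a Slack incoming webhook."""
    payload = json.dumps({"text": slack_message(summary)}).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)
```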

Step 3: Schedule the execution

With the script working, you can schedule it to run automatically:

  • Mac/Linux: a cron job that runs the script on the first Monday of each month
  • Cloud: Google Cloud Functions + Cloud Scheduler (a monthly job fits comfortably within the free tier)
  • Low-code: GitHub Actions with a scheduled workflow

What to include in the report

Once you have the pipeline, you can add whatever you want:

  • Summary of key metrics (clicks, impressions, CTR, average position)
  • Month-over-month comparison with percentage change
  • Top 20 queries by clicks
  • Top 20 pages by clicks
  • URLs with the biggest CTR drops
  • New keywords that entered the top 10
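That last item, for example, is just a set difference between two months of query data. A sketch over rows already grouped by query (the threshold of 10 is the ranking position, and the row shape mirrors the API response):

```python
def new_top10_queries(current: list[dict], previous: list[dict]) -> list[str]:
    """Queries ranking in the top 10 this month that weren't last month."""
    def top10(rows: list[dict]) -> set[str]:
        return {r["keys"][0] for r in rows if r["position"] <= 10}
    return sorted(top10(current) - top10(previous))
```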

Start simple

My advice: don’t try to automate everything on day one. Start with a script that pulls basic metrics and puts them in a sheet. Once that works, add layers: MoM comparisons, alerts, automatic charts.

Automation isn’t a one-day project. It’s an iterative process. But each iteration saves you hours. And those hours can be invested in what really matters: analyzing the data and making decisions.
