How to Monitor Competitor SEO Rankings with a SERP API
Knowing where your competitors rank on Google is not optional if you are serious about SEO. Manual checking does not scale. Browser extensions break. Third-party dashboards charge hundreds per month for data you could collect yourself.
A SERP API gives you raw Google search results as structured JSON. You query a keyword, you get back every organic result with its position, URL, title, and snippet. From there, building a competitor ranking monitor is straightforward.
This guide walks through the architecture, the code, and the operational details.
Why Build Your Own Competitor Tracker
Tools like Ahrefs, SEMrush, and Moz charge $99-$449/month for rank tracking. They bundle it with dozens of features you may not need. If your requirement is simple — track N keywords across M competitor domains and alert on changes — you can build it with a SERP API for a fraction of the cost.
With SerpBase at $0.30 per 1,000 queries, tracking 500 keywords daily costs about $4.50/month. That is roughly 50x cheaper than a mid-tier SEMrush plan.
Architecture Overview
The system has four parts:
- A keyword list paired with competitor domains to watch
- A crawler that queries the SERP API on a schedule
- A storage layer that records positions over time
- An alert system that fires when rankings change significantly
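As a concrete sketch, the keyword list from the first bullet can be a plain mapping of keywords to the competitor domains you watch on each SERP (the keywords and domains below are purely illustrative):

```python
# Illustrative watchlist: each keyword maps to the competitor domains
# whose positions we want to record for that SERP.
WATCHLIST = {
    "best project management software": ["monday.com", "asana.com", "clickup.com"],
    "kanban board app": ["notion.so", "clickup.com"],
}

# Flat views are handy for the crawler loop and for cost estimates.
ALL_KEYWORDS = list(WATCHLIST)
ALL_DOMAINS = sorted({d for domains in WATCHLIST.values() for d in domains})
```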
Step 1: Set Up SerpBase
Sign up at serpbase.dev and grab your API key from the dashboard. You get 100 free credits to start.
```shell
pip install requests
```
Step 2: Query Google Rankings
```python
import requests

SERPBASE_API_KEY = "your_api_key"
SERPBASE_URL = "https://api.serpbase.dev/search"

def get_rankings(keyword: str, gl: str = "us", hl: str = "en") -> list:
    """Fetch up to 100 organic results for a keyword from SerpBase."""
    resp = requests.get(
        SERPBASE_URL,
        params={"q": keyword, "gl": gl, "hl": hl, "num": 100},
        headers={"x-api-key": SERPBASE_API_KEY},
        timeout=30,  # fail fast instead of hanging the whole crawl
    )
    resp.raise_for_status()
    return resp.json().get("organic_results", [])
```
This returns up to 100 organic results. Each result includes position, link, title, and snippet.
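For reference, a single entry in organic_results has roughly this shape (the values below are made up; only the field names come from the response described above):

```python
# Illustrative organic result entry; values are invented for the example.
sample_result = {
    "position": 3,
    "link": "https://asana.com/product",
    "title": "Manage your team's work, projects, & tasks online",
    "snippet": "Work anytime, anywhere with Asana...",
}
```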
Step 3: Extract Competitor Positions
```python
from urllib.parse import urlparse

def find_competitor_positions(results: list, domains: list) -> dict:
    """Record each competitor's best (first-seen) position in the results."""
    positions = {}
    for r in results:
        link = r.get("link", "")
        host = urlparse(link).netloc.lower()
        for domain in domains:
            # Match the domain itself or any subdomain (e.g. www.asana.com),
            # and keep only the first hit, since results arrive best-first.
            if (host == domain or host.endswith("." + domain)) and domain not in positions:
                positions[domain] = {
                    "position": r["position"],
                    "url": link,
                    "title": r.get("title", ""),
                }
    return positions

# Example usage
results = get_rankings("best project management software")
competitors = ["monday.com", "asana.com", "notion.so", "clickup.com"]
positions = find_competitor_positions(results, competitors)
for domain, info in positions.items():
    print(f"{domain}: #{info['position']} - {info['url']}")
```
Step 4: Store Historical Rankings
```python
import sqlite3
from datetime import date

def init_db(path="rankings.db"):
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS rankings (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            keyword TEXT NOT NULL,
            domain TEXT NOT NULL,
            position INTEGER,
            url TEXT,
            checked_at DATE NOT NULL,
            UNIQUE(keyword, domain, checked_at)
        )
    """)
    conn.commit()
    return conn

def save_ranking(conn, keyword, domain, position, url):
    # The UNIQUE constraint plus INSERT OR REPLACE makes re-runs idempotent:
    # checking the same keyword twice in one day overwrites, not duplicates.
    conn.execute(
        "INSERT OR REPLACE INTO rankings (keyword, domain, position, url, checked_at) "
        "VALUES (?, ?, ?, ?, ?)",
        (keyword, domain, position, url, date.today().isoformat())
    )
    conn.commit()
```
Step 5: Detect Changes and Send Alerts
```python
def get_previous_position(conn, keyword, domain):
    # OFFSET 1 skips the most recent row, so this assumes today's position
    # has already been saved before alerts are checked.
    row = conn.execute(
        "SELECT position FROM rankings WHERE keyword=? AND domain=? "
        "ORDER BY checked_at DESC LIMIT 1 OFFSET 1",
        (keyword, domain)
    ).fetchone()
    return row[0] if row else None

def check_alerts(conn, keyword, domain, current_pos, threshold=5):
    prev = get_previous_position(conn, keyword, domain)
    if prev is None:
        return None
    diff = current_pos - prev  # positive means the site fell in the rankings
    if abs(diff) >= threshold:
        direction = "dropped" if diff > 0 else "climbed"
        return f"{domain} {direction} {abs(diff)} positions for '{keyword}': #{prev} -> #{current_pos}"
    return None
```
Hook this into Slack, email, or any webhook. A 5-position swing on a high-value keyword is worth knowing about immediately.
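As one option, a Slack incoming webhook accepts a JSON body with a text field. A minimal sender might look like this (the webhook URL is a placeholder you generate in Slack's app settings):

```python
import requests

def build_alert_payload(message: str) -> dict:
    # Slack incoming webhooks accept {"text": "..."} as the simplest payload.
    return {"text": message}

def send_slack_alert(webhook_url: str, message: str) -> None:
    resp = requests.post(webhook_url, json=build_alert_payload(message), timeout=10)
    resp.raise_for_status()
```

Call send_slack_alert with whatever check_alerts returns whenever it is not None.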
Step 6: Automate with Cron
```
# Run daily at 6 AM UTC
0 6 * * * cd /opt/rank-monitor && python monitor.py >> /var/log/rank-monitor.log 2>&1
```
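The monitor.py that cron runs is just glue around the earlier steps. One way to keep that glue testable is to pass the pieces in as callables — the sketch below assumes the functions from Steps 2-5 live in the same module, with the database connection closed over in small wrapper functions:

```python
import time

def run_daily_check(keywords, competitors, fetch_rankings, find_positions,
                    save_position, check_alert, delay=0.5):
    """One monitoring pass: query each keyword, record competitor positions,
    and collect any alert messages for delivery."""
    alerts = []
    for keyword in keywords:
        results = fetch_rankings(keyword)
        for domain, info in find_positions(results, competitors).items():
            save_position(keyword, domain, info["position"], info["url"])
            msg = check_alert(keyword, domain, info["position"])
            if msg:
                alerts.append(msg)
        time.sleep(delay)  # space out API requests between keywords
    return alerts
```

Injecting the dependencies also makes it trivial to unit-test the loop with fakes instead of live API calls.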
Scaling Considerations
Rate limiting: SerpBase handles concurrency well, but space out requests if you are tracking thousands of keywords. A simple `time.sleep(0.5)` between queries keeps the load smooth.
Geographic targeting: Use the gl parameter to track rankings in specific countries. A keyword might rank #3 in the US but #15 in the UK. SerpBase supports all Google country codes.
Cost at scale: Tracking 2,000 keywords daily = 60,000 queries/month = $18/month on SerpBase. The same volume on SerpAPI would cost $1,500/month.
What You Can Build on Top
- Weekly ranking reports with position change charts
- Share of voice calculations across keyword groups
- SERP feature tracking — did a competitor gain a featured snippet?
- New competitor detection — alert when an unknown domain enters the top 10
- Historical trend analysis — plot ranking trajectories over months
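As an example of the second idea, a crude share-of-voice metric is the fraction of tracked keywords where a domain appears in the top 10. The function below is a hypothetical starting point (position- or CTR-weighted variants are natural refinements):

```python
def share_of_voice(rankings: dict, domain: str, top_n: int = 10) -> float:
    """rankings maps keyword -> {domain: position}. Returns the share of
    keywords where `domain` ranks within the top `top_n` results."""
    if not rankings:
        return 0.0
    hits = sum(
        1 for positions in rankings.values()
        if positions.get(domain, top_n + 1) <= top_n
    )
    return hits / len(rankings)
```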
Summary
Building a competitor SEO ranking monitor with a SERP API is not complicated. The core loop is: query keywords, extract positions, store results, alert on changes. With SerpBase, the cost stays low even at scale. You own the data, you control the logic, and you are not locked into any vendor dashboard.
Get your SerpBase API key and start tracking today. 100 free credits, no credit card required.