SERP Feature Monitoring: Track the Search Page, Not Just Rankings
A rank tracker can tell you that a page is position 3. It may not tell you that position 3 is now buried under an answer box, a People Also Ask block, a video carousel, and a Top Stories module.
That is the blind spot in basic SEO reporting. Modern Google results are not just ten blue links. They are a changing layout of organic results, SERP features, local modules, news results, ads, and entity panels. If you only track organic position, you miss the thing users actually see.
SERP feature monitoring fixes that by tracking the shape of the result page over time.
What counts as a SERP feature?
For monitoring, a SERP feature is any non-standard result block that changes the click landscape. Common examples include:
- answer boxes or featured snippets
- People Also Ask
- Knowledge Panels
- Top Stories
- local packs
- video results
- image packs
- related searches
- shopping or product modules
- ads
Some of these help you. Some push your result down. Some change the query intent entirely. A keyword that starts showing Top Stories may have become news-sensitive. A keyword that gains a local pack may now have stronger local intent. A keyword that triggers an answer box may need a better concise answer on your page.
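Those intent shifts can be captured as a small lookup from feature names to notes. This is a sketch, not a fixed taxonomy: the `INTENT_SIGNALS` mapping and the feature keys are illustrative and should be adapted to whatever names your SERP data uses.

```python
# Illustrative mapping from SERP features to the intent shift they usually signal.
INTENT_SIGNALS = {
    "top_stories": "news-sensitive",
    "local_results": "local intent",
    "answer_box": "concise-answer intent",
    "shopping_results": "commercial intent",
}

def intent_notes(features: dict) -> list[str]:
    """Return intent notes for the features present in a snapshot."""
    return [note for name, note in INTENT_SIGNALS.items() if features.get(name)]
```

For example, a snapshot showing Top Stories and ads yields only the news note, because ads are not in the mapping: `intent_notes({"top_stories": True, "ads": True})` returns `["news-sensitive"]`.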
Why position is not enough
Imagine two daily snapshots for the same keyword:
| Day | Your organic rank | SERP layout |
|---|---|---|
| Monday | 2 | plain organic results |
| Tuesday | 2 | answer box + PAA + video carousel above organic |
A normal rank report says nothing changed. A real visibility report shows the search page changed substantially: the user now scrolls past several blocks before reaching your result.
This is why teams should monitor feature occupancy: which features appeared, where they appeared, and whether your domain is included in them.
The data model
A SERP feature snapshot does not need to be complicated. Store one record per query, market, and collection time.
{
  "query": "best crm software for agencies",
  "gl": "us",
  "hl": "en",
  "collected_at": "2026-04-25T01:00:00Z",
  "features": {
    "answer_box": true,
    "people_also_ask": true,
    "top_stories": false,
    "knowledge_panel": false,
    "local_results": false,
    "ads": true
  },
  "organic_top_10": [
    "example.com",
    "competitor.com"
  ]
}
The goal is to compare snapshots, not to preserve every byte of the raw response forever.
Collecting feature snapshots with SerpBase
import requests
from datetime import datetime, timezone
from urllib.parse import urlparse

API_KEY = "your_api_key"
SEARCH_URL = "https://api.serpbase.dev/google/search"

def host(url: str) -> str:
    return urlparse(url).netloc.replace("www.", "")

def fetch_serp(query: str, gl: str = "us", hl: str = "en") -> dict:
    resp = requests.post(
        SEARCH_URL,
        headers={
            "X-API-Key": API_KEY,
            "Content-Type": "application/json",
        },
        json={"q": query, "gl": gl, "hl": hl, "page": 1},
        timeout=20,
    )
    resp.raise_for_status()
    data = resp.json()
    if data.get("status") != 0:
        raise RuntimeError(data.get("error") or "search failed")
    return data

def feature_snapshot(query: str, gl: str = "us", hl: str = "en") -> dict:
    data = fetch_serp(query, gl, hl)
    organic = data.get("organic", [])[:10]
    top_domains = [host(r.get("link", "")) for r in organic if r.get("link")]
    return {
        "query": query,
        "gl": gl,
        "hl": hl,
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "features": {
            "answer_box": bool(data.get("answer_box")),
            "people_also_ask": bool(data.get("people_also_ask")),
            "top_stories": bool(data.get("top_stories")),
            "knowledge_panel": bool(data.get("knowledge_panel")),
            "local_results": bool(data.get("local_results")),
            "ads": bool(data.get("ads")),
            "related_searches": bool(data.get("related_searches")),
        },
        "organic_top_10": top_domains,
    }
Run that once per keyword per market and you have the raw material for a real SERP visibility monitor.
Detecting feature changes
The simplest alert is a before-and-after diff.
def diff_features(previous: dict, current: dict) -> dict:
    prev = previous["features"]
    curr = current["features"]
    gained = [name for name, value in curr.items() if value and not prev.get(name)]
    lost = [name for name, value in prev.items() if value and not curr.get(name)]
    return {"gained": gained, "lost": lost}

before = feature_snapshot("best project management software")
# store it, then compare with tomorrow's snapshot
A useful alert is not simply "PAA appeared." It should explain why the team should care:
- answer_box gained: check whether your page can win or support a concise answer
- top_stories gained: query may be temporarily news-driven
- local_results gained: organic landing pages may be less visible in that market
- ads gained: paid competition is increasing
- knowledge_panel gained: entity intent may be stronger than category intent
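Those explanations can be generated straight from the diff. A sketch: the `WHY_IT_MATTERS` notes below are illustrative wording for your alerting channel, not anything the API returns.

```python
# Illustrative "why it matters" notes keyed by feature name.
WHY_IT_MATTERS = {
    "answer_box": "check whether your page can win or support a concise answer",
    "top_stories": "query may be temporarily news-driven",
    "local_results": "organic landing pages may be less visible in that market",
    "ads": "paid competition is increasing",
    "knowledge_panel": "entity intent may be stronger than category intent",
}

def alert_lines(query: str, diff: dict) -> list[str]:
    """Turn a {"gained": [...], "lost": [...]} diff into human-readable alerts."""
    lines = []
    for name in diff.get("gained", []):
        note = WHY_IT_MATTERS.get(name, "review the result page")
        lines.append(f"[{query}] gained {name}: {note}")
    for name in diff.get("lost", []):
        lines.append(f"[{query}] lost {name}: confirm whether visibility improved or intent shifted")
    return lines
```

The fallback note ("review the result page") keeps the alert useful even for features you have not written guidance for yet.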
Measuring feature occupancy
For a keyword set, feature occupancy is the percentage of tracked queries where a feature appears.
def occupancy(snapshots: list[dict], feature: str) -> float:
    if not snapshots:
        return 0.0
    count = sum(1 for s in snapshots if s["features"].get(feature))
    return round(count / len(snapshots) * 100, 2)
This is more useful than looking at one keyword. If answer boxes appear on 68% of your informational keywords, you need a snippet strategy. If Top Stories appears on only 3%, news volatility is not your main problem. If local packs appear in one country but not another, your international SEO plan should reflect that.
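To see the whole picture at once, the same calculation can be run across every feature in one pass. A sketch of a per-feature report, sorted so the most common features come first; the function name is illustrative.

```python
def occupancy_report(snapshots: list[dict]) -> dict[str, float]:
    """Percentage of snapshots showing each feature, most common first."""
    if not snapshots:
        return {}
    counts: dict[str, int] = {}
    for s in snapshots:
        for name, present in s["features"].items():
            if present:
                counts[name] = counts.get(name, 0) + 1
    total = len(snapshots)
    return {
        name: round(count / total * 100, 2)
        for name, count in sorted(counts.items(), key=lambda kv: -kv[1])
    }
```

Run this once over informational keywords and once over commercial keywords; the two reports usually look very different, and that difference is the strategy signal.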
Tracking ownership, not just presence
Feature presence is the first layer. Ownership is the second.
For example, if a People Also Ask block appears, you want to know whether any answer source belongs to your domain. If an answer box appears, you want to know which site owns it. If Top Stories appears, you want to know which publishers dominate it.
def domain_in_text(domain: str, value: str) -> bool:
    return domain.replace("www.", "").lower() in (value or "").lower()

def paa_owners(data: dict) -> list[str]:
    owners = []
    for item in data.get("people_also_ask", []):
        link = item.get("link") or ""
        if link:
            owners.append(host(link))
    return owners
Ownership turns SERP feature monitoring from a passive report into a task list. If competitors repeatedly own the answer box, inspect their structure. If the same publisher owns Top Stories, your PR or content calendar may need to account for that source.
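The ownership check itself is a host comparison. A minimal sketch, assuming the answer box object carries a link field (verify that against your actual response shape); answer_box_owner and owns_feature are illustrative names.

```python
from urllib.parse import urlparse

def host(url: str) -> str:
    return urlparse(url).netloc.replace("www.", "")

def answer_box_owner(data: dict):
    """Host of the answer box source, assuming the block carries a link field."""
    box = data.get("answer_box") or {}
    link = box.get("link") if isinstance(box, dict) else None
    return host(link) if link else None

def owns_feature(domain: str, owners: list[str]) -> bool:
    """True if any owner host matches your domain, ignoring a www prefix."""
    target = domain.replace("www.", "").lower()
    return any(o.replace("www.", "").lower() == target for o in owners)
```

Combined with paa_owners above, this answers the practical question per keyword: is my domain anywhere inside the features that sit above the organic results?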
A practical database table
For a small monitoring system, SQLite or Postgres is enough.
CREATE TABLE serp_snapshots (
    id INTEGER PRIMARY KEY,
    query TEXT NOT NULL,
    gl TEXT NOT NULL,
    hl TEXT NOT NULL,
    collected_at TIMESTAMP NOT NULL,
    features_json TEXT NOT NULL,
    organic_top_10_json TEXT NOT NULL
);

CREATE INDEX idx_serp_snapshots_query_market
    ON serp_snapshots (query, gl, hl, collected_at);
Keep the raw response for a short retention window if you need audits. Keep the normalized feature snapshot for long-term trend analysis.
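Wiring the table to the diff is a pair of small functions: insert today's snapshot and fetch the most recent prior one for the same query and market. A sketch using SQLite; the function names are illustrative and the DDL mirrors the table above.

```python
import json
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS serp_snapshots (
    id INTEGER PRIMARY KEY,
    query TEXT NOT NULL,
    gl TEXT NOT NULL,
    hl TEXT NOT NULL,
    collected_at TIMESTAMP NOT NULL,
    features_json TEXT NOT NULL,
    organic_top_10_json TEXT NOT NULL
)
"""

def save_snapshot(conn: sqlite3.Connection, snap: dict) -> None:
    """Persist one normalized snapshot row."""
    conn.execute(
        "INSERT INTO serp_snapshots "
        "(query, gl, hl, collected_at, features_json, organic_top_10_json) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (snap["query"], snap["gl"], snap["hl"], snap["collected_at"],
         json.dumps(snap["features"]), json.dumps(snap["organic_top_10"])),
    )
    conn.commit()

def previous_features(conn: sqlite3.Connection, query: str, gl: str, hl: str):
    """Latest stored feature set for this query and market, or None."""
    row = conn.execute(
        "SELECT features_json FROM serp_snapshots "
        "WHERE query = ? AND gl = ? AND hl = ? "
        "ORDER BY collected_at DESC LIMIT 1",
        (query, gl, hl),
    ).fetchone()
    return json.loads(row[0]) if row else None
```

Call previous_features before saving today's snapshot, diff the two feature sets, then save; that ordering gives you the before-and-after comparison for free.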
What to alert on
Do not alert on every change. Google changes too often for that to be useful. Alert on changes that affect decisions:
- an answer box appears on a high-value keyword
- your domain disappears from a SERP feature it used to own
- ads appear on a keyword that previously had no ads
- local results appear for a national landing page keyword
- Top Stories appears for a query you treat as evergreen
- the top three organic domains change for several days in a row
The best SEO alerts are boringly specific. They tell someone what changed and what to check next.
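Most of these rules can be expressed as a small filter over the feature diff. A presence-only sketch: the high_value keyword set is an assumption you supply, and ownership-based rules (your domain dropping out of a feature) would layer on top of this, since the diff here tracks only whether features appeared at all.

```python
def should_alert(query: str, diff: dict, high_value: set[str]) -> bool:
    """Apply the decision-relevant alert rules to a {"gained", "lost"} diff."""
    gained = set(diff.get("gained", []))
    lost = set(diff.get("lost", []))
    # Answer boxes matter most on keywords you have flagged as high value.
    if "answer_box" in gained and query in high_value:
        return True
    # Ads, Top Stories, and local packs change the click landscape anywhere.
    if gained & {"ads", "top_stories", "local_results"}:
        return True
    # A feature disappearing from a page you monitor is always worth a look.
    return bool(lost)
```

Everything else (a related-searches block appearing, for instance) stays out of the alert channel and shows up only in the trend reports.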
How this differs from rank tracking
Rank tracking asks: where do I rank?
SERP feature monitoring asks: what kind of search page am I ranking on?
You need both. Rank tells you your position among organic results. Feature monitoring tells you whether organic results are still the main event.
For modern SEO dashboards, the second question is often the missing one.
Final takeaway
A keyword is not just a keyword. It is a live search page with a changing layout. Answer boxes, People Also Ask, Top Stories, Knowledge Panels, local packs, ads, and related searches can all change how much attention an organic ranking receives.
SerpBase returns structured Google results that make this monitoring straightforward. Store a daily snapshot, diff the feature set, measure occupancy, and alert only on changes that affect visibility.
Start with a free SerpBase API key and run snapshots for ten of your most valuable keywords. After a week, you will know which keywords are stable, which are feature-heavy, and which deserve more than a position number in a rank report.