# Google Local Pack API: Extract Local Business Data from Search Results
When someone searches for "coffee shop near me" or "plumber in Austin," Google returns a Local Pack: a map with three business listings showing names, addresses, ratings, phone numbers, and hours. This block sits above organic results for local queries and captures a disproportionate share of clicks.
Unlike blue-link organic results, Local Pack data is not available through any official Google API. But a SERP API returns it as structured JSON, making it accessible for local SEO monitoring, lead generation, competitor analysis, and market research.
## What the Local Pack Response Contains

A typical `local_results` array from a SERP API looks like this:
```json
{
  "local_results": [
    {
      "position": 1,
      "title": "Blue Bottle Coffee",
      "place_id": "ChIJN1t_tDeuEmsRUsoyG83frY4",
      "rating": 4.6,
      "reviews": 1842,
      "address": "66 Mint St, San Francisco, CA 94103",
      "phone": "+1 510-653-3394",
      "hours": "Open until 5:00 PM",
      "type": "Coffee shop",
      "website": "https://bluebottlecoffee.com",
      "thumbnail": "https://..."
    }
  ]
}
```
The fields available vary by business and query, but you consistently get position, title, rating, and address for any populated Local Pack.
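Because the optional fields come and go per listing, it can help to normalize each entry before storing or comparing it. A minimal sketch (the `normalize_listing` helper and its defaults are illustrative, not part of the API):

```python
def normalize_listing(biz: dict) -> dict:
    """Return a Local Pack entry with safe defaults for optional fields."""
    return {
        "position": biz.get("position"),
        "title": biz.get("title", ""),
        "rating": biz.get("rating"),   # may be absent for unrated businesses
        "reviews": biz.get("reviews", 0),
        "address": biz.get("address", ""),
        "phone": biz.get("phone", ""),
        "website": biz.get("website", ""),
    }
```

Downstream code can then index fields directly without scattering `.get()` calls everywhere.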
## Querying Local Results with SerpBase
Local Pack results appear automatically when your query has local intent. No special parameter is required beyond the standard search call.
```python
import requests

API_KEY = "your_api_key"

def get_local_results(query: str, location: str, country: str = "us") -> list:
    # Append "in {location}" only when a location is given; some queries
    # already embed the city name.
    q = f"{query} in {location}" if location else query
    resp = requests.post(
        "https://api.serpbase.dev/google/search",
        headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
        json={"q": q, "gl": country, "hl": "en", "page": 1},
        timeout=15,
    )
    resp.raise_for_status()
    return resp.json().get("local_results", [])

# Get the Local Pack for plumbers in Chicago
results = get_local_results("plumber", "Chicago")
for biz in results:
    print(f"#{biz['position']} {biz['title']} — {biz.get('rating', 'N/A')}★ ({biz.get('reviews', 0)} reviews)")
    print(f"   {biz.get('address', '')}")
```
## Use Case 1: Local SEO Rank Tracker
Local Pack rankings are highly volatile — a business can jump from position 3 to position 1 overnight, or disappear entirely. Tracking this automatically is far better than manual checks.
```python
import sqlite3
from datetime import date

def init_local_db(path: str = "local_rankings.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS local_rankings (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            query TEXT NOT NULL,
            location TEXT NOT NULL,
            business_name TEXT NOT NULL,
            position INTEGER,
            rating REAL,
            reviews INTEGER,
            checked_at DATE NOT NULL
        )
    """)
    conn.commit()
    return conn

def save_local_results(conn, query, location, results):
    today = date.today().isoformat()
    for biz in results:
        conn.execute(
            "INSERT INTO local_rankings (query, location, business_name, position, rating, reviews, checked_at) "
            "VALUES (?, ?, ?, ?, ?, ?, ?)",
            (query, location, biz["title"], biz.get("position"), biz.get("rating"), biz.get("reviews", 0), today)
        )
    conn.commit()

def get_position_history(conn, business_name: str, query: str, days: int = 30):
    return conn.execute(
        "SELECT position, checked_at FROM local_rankings "
        "WHERE business_name LIKE ? AND query = ? ORDER BY checked_at DESC LIMIT ?",
        (f"%{business_name}%", query, days)
    ).fetchall()
```
Run this daily with a cron job and you have a full position history for any business and any local query.
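Once the history accumulates, a small helper can turn rows from `get_position_history` into a readable trend. A sketch assuming the `(position, checked_at)` row shape above, newest first (`position_trend` is illustrative, not part of the schema):

```python
def position_trend(history: list) -> str:
    """Summarize (position, checked_at) rows ordered newest-first."""
    positions = [pos for pos, _ in history if pos is not None]
    if len(positions) < 2:
        return "insufficient data"
    latest, oldest = positions[0], positions[-1]
    if latest < oldest:
        return f"improved: #{oldest} -> #{latest}"
    if latest > oldest:
        return f"dropped: #{oldest} -> #{latest}"
    return f"stable at #{latest}"
```

For example, `position_trend([(1, "2024-06-02"), (3, "2024-06-01")])` returns `"improved: #3 -> #1"`.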
## Use Case 2: Lead Generation at Scale
The Local Pack is one of the best sources for local business leads. If you are looking for, say, HVAC contractors across 200 US cities, you can collect thousands of business names, addresses, websites, and phone numbers with a single script.
```python
import time
import csv

SERVICES = ["hvac contractor", "electrician", "plumber", "roofer"]
CITIES = ["Chicago", "Houston", "Phoenix", "Philadelphia", "San Antonio",
          "San Diego", "Dallas", "San Jose", "Austin", "Jacksonville"]

def collect_leads(services: list, cities: list) -> list:
    leads = []
    seen = set()  # dedupe on (name, address) across queries
    for service in services:
        for city in cities:
            for biz in get_local_results(service, city):
                key = (biz["title"], biz.get("address", ""))
                if key in seen:
                    continue
                seen.add(key)
                leads.append({
                    "service": service,
                    "city": city,
                    "name": biz["title"],
                    "address": biz.get("address", ""),
                    "phone": biz.get("phone", ""),
                    "website": biz.get("website", ""),
                    "rating": biz.get("rating", ""),
                    "reviews": biz.get("reviews", 0),
                })
            time.sleep(0.3)  # be kind to the API

    return leads

leads = collect_leads(SERVICES, CITIES)
if leads:
    with open("local_leads.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=leads[0].keys())
        writer.writeheader()
        writer.writerows(leads)
print(f"Collected {len(leads)} unique business leads")
4 services × 10 cities = 40 searches, or about $0.012 at $0.30 per 1,000. Scaling to 200 cities means 800 searches (roughly $0.24 in API costs) yielding up to 2,400 listings before deduplication, since each Local Pack holds three businesses.
## Use Case 3: Competitor Local SEO Monitor
If you operate a local business, knowing whether your competitors appear in the Local Pack — and at what position — is critical intelligence.
```python
def monitor_competitors(queries: list, my_business: str, competitors: list) -> dict:
    report = {}
    for query in queries:
        # Pass location="" because these queries already include the city.
        results = get_local_results(query, location="")
        report[query] = {
            "my_position": None,
            "competitor_positions": {}
        }
        for biz in results:
            name = biz["title"].lower()
            if my_business.lower() in name:
                report[query]["my_position"] = biz["position"]
            for comp in competitors:
                if comp.lower() in name:
                    report[query]["competitor_positions"][comp] = biz["position"]
    return report

report = monitor_competitors(
    queries=["italian restaurant downtown Seattle", "best pasta Seattle"],
    my_business="Luigi's Trattoria",
    competitors=["Cafe Juanita", "Spinasse", "L'Albero dei Gelati"],
)

for query, data in report.items():
    print(f"Query: {query}")
    print(f"  My position: {data['my_position'] or 'Not in pack'}")
    for comp, pos in data["competitor_positions"].items():
        print(f"  {comp}: #{pos}")
```
## Use Case 4: Multi-City Market Research
For franchise businesses or market researchers, understanding which brands dominate local packs across different cities is valuable.
```python
from collections import Counter

def analyze_local_market(query: str, cities: list) -> dict:
    brand_counts = Counter()
    city_leaders = {}
    for city in cities:
        results = get_local_results(query, city)
        if results:
            city_leaders[city] = results[0]["title"]  # pack leader
        for biz in results:
            brand_counts[biz["title"]] += 1
    return {
        "top_brands": brand_counts.most_common(10),
        "city_leaders": city_leaders,
    }

analysis = analyze_local_market("urgent care clinic", ["Boston", "Chicago", "LA", "Miami", "Seattle"])
print("Dominant brands in urgent care Local Packs:")
for brand, count in analysis["top_brands"]:
    print(f"  {brand}: appears in {count} cities")
```
## Cost Comparison
For local SEO and lead generation, query costs matter:
| Provider | Per 1,000 queries |
|---|---|
| SerpBase | $0.30 |
| DataForSEO | $1.50 |
| Bright Data | $3.00 |
| SerpAPI | $15.00 |
At SerpBase pricing, 10,000 local searches (enough to cover 100 services × 100 cities) costs $3.00.
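The budgeting arithmetic is simple enough to script. A small sketch using the per-1,000 prices from the table above (`estimate_cost` is an illustrative helper, not part of any API):

```python
def estimate_cost(num_queries: int, price_per_1000: float = 0.30) -> float:
    """Cost in dollars for a batch of searches at a per-1,000-query rate."""
    return round(num_queries * price_per_1000 / 1000, 4)
```

For example, `estimate_cost(40)` returns `0.012` and `estimate_cost(10_000)` returns `3.0`, matching the figures quoted in this article.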
## Tips for Local SERP Queries
- **Use `in [city]` phrasing**: "plumber in Denver" reliably triggers the Local Pack. "plumber Denver" also works but may be less consistent.
- **Use the `gl` country code**: `gl=gb` for UK local results, `gl=au` for Australia, etc.
- **Track position changes, not just snapshots**: Local Pack rankings shift frequently. Daily monitoring reveals real trends.
- **Combine with organic results**: some local businesses also rank in organic results. The full SERP response gives you both.
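On the last tip, one response can serve both needs. A sketch that assumes organic listings arrive under an `organic_results` key (that key name is an assumption, mirroring the `local_results` key shown earlier):

```python
def split_serp(data: dict) -> tuple:
    """Separate local and organic listings from one SERP response."""
    # "organic_results" is an assumed key, mirroring "local_results" above.
    return data.get("local_results", []), data.get("organic_results", [])
```

This lets one paid query feed both a Local Pack tracker and an organic rank tracker.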
## Summary
Google Local Pack data is some of the most actionable structured information in any search result — it contains business identity, location, reputation signals, and contact information in a single API response. With SerpBase, you get this data as clean JSON at $0.30 per 1,000 requests.
Whether you are building a local SEO tracker, a lead generation tool, or a competitive intelligence dashboard, the Local Pack API gives you a reliable data source with no scraping infrastructure required.
Start with 100 free searches at SerpBase — no credit card required.