How We Integrate SEO Services
Working with DataForSEO, SerpAPI, and other SEO data providers
Our SEO tools rely on external data providers. This guide covers how we integrate with them effectively.
Core principle: These APIs are expensive. Optimize for efficiency and cache aggressively.
Our SEO Data Providers
| Provider | Best For | Pricing Model |
|---|---|---|
| DataForSEO | Comprehensive SERP data, AI Overviews | Per-request |
| SerpAPI | Quick Google searches | Per-search |
| Google APIs | Search Console, Analytics | Free (with limits) |
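Both paid providers are configured through environment variables. A small startup guard catches missing credentials early; this is a sketch (the helper name is our own), using the variable names that appear in the snippets later in this guide:
// lib/seo-env.ts
// Credentials referenced by the DataForSEO and SerpAPI examples below.
const REQUIRED_SEO_ENV = ['DATAFORSEO_LOGIN', 'DATAFORSEO_PASSWORD', 'SERPAPI_KEY'] as const;

export function assertSeoEnv(): void {
  const missing = REQUIRED_SEO_ENV.filter(name => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing SEO provider credentials: ${missing.join(', ')}`);
  }
}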
DataForSEO
Overview
DataForSEO provides extensive SEO data:
- SERP results (organic, paid, featured snippets)
- AI Overviews data
- Keyword difficulty
- Backlink data
Authentication
DataForSEO uses HTTP Basic Auth:
// lib/dataforseo.ts
const credentials = Buffer.from(
`${process.env.DATAFORSEO_LOGIN}:${process.env.DATAFORSEO_PASSWORD}`
).toString('base64');
const headers = {
'Authorization': `Basic ${credentials}`,
'Content-Type': 'application/json',
};
SERP API Example
Fetch search results for keywords:
interface SerpTask {
keyword: string;
location_code: number;
language_code: string;
device: 'desktop' | 'mobile';
}
interface SerpResult {
keyword: string;
items: Array<{
type: string;
rank_group: number;
rank_absolute: number;
title?: string;
url?: string;
description?: string;
}>;
ai_overview?: {
text: string;
sources: Array<{ url: string; title: string }>;
};
}
async function fetchSerpResults(keywords: string[]): Promise<SerpResult[]> {
const tasks: SerpTask[] = keywords.map(keyword => ({
keyword,
location_code: 2704, // Vietnam
language_code: 'vi',
device: 'desktop',
}));
const response = await fetch(
'https://api.dataforseo.com/v3/serp/google/organic/live/advanced',
{
method: 'POST',
headers,
body: JSON.stringify(tasks),
}
);
if (!response.ok) {
throw new Error(`DataForSEO error: ${response.status}`);
}
const data = await response.json();
return data.tasks.map(parseTaskResult);
}
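The call above maps each returned task through a parseTaskResult helper that isn't shown. A minimal sketch, assuming the usual shape of the live/advanced response (one result entry per task, with an items array and an optional AI Overview item):
// Sketch of the helper referenced above. Output fields follow the SerpResult
// interface defined earlier; treat the raw task shape as an assumption.
function parseTaskResult(task: any): SerpResult {
  const result = task.result?.[0];
  const aiOverviewItem = result?.items?.find(
    (item: any) => item.type === 'ai_overview'
  );
  return {
    keyword: task.data?.keyword ?? '',
    items: result?.items ?? [],
    ai_overview: aiOverviewItem
      ? { text: aiOverviewItem.text, sources: aiOverviewItem.references ?? [] }
      : undefined,
  };
}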
AI Overviews Data
For our AI Overviews Checker tool:
interface AiOverviewResult {
  keyword: string;
  hasAiOverview: boolean;
  content: string | null;
  sources: Array<{ url: string; title: string }>;
}
async function checkAiOverviews(keywords: string[]): Promise<AiOverviewResult[]> {
const tasks = keywords.map(keyword => ({
keyword,
location_code: 2840, // USA (AI Overviews more common)
language_code: 'en',
device: 'desktop',
}));
const response = await fetch(
'https://api.dataforseo.com/v3/serp/google/organic/live/advanced',
{
method: 'POST',
headers,
body: JSON.stringify(tasks),
}
  );
  if (!response.ok) {
    throw new Error(`DataForSEO error: ${response.status}`);
  }
  const data = await response.json();
return data.tasks.map(task => {
const aiOverview = task.result?.[0]?.items?.find(
item => item.type === 'ai_overview'
);
return {
keyword: task.data.keyword,
hasAiOverview: !!aiOverview,
content: aiOverview?.text || null,
sources: aiOverview?.references || [],
};
});
}
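Using the checker is then just a matter of counting which keywords trigger an AI Overview (the keyword list here is illustrative):
const keywords = ['best crm software', 'how to brew cold coffee'];
const results = await checkAiOverviews(keywords);
const withOverview = results.filter(r => r.hasAiOverview).length;
console.log(`${withOverview}/${results.length} keywords show an AI Overview`);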
Rate Limiting
DataForSEO has generous limits, but batch wisely:
// Process in batches of 100 (their recommended batch size)
async function batchProcess<T, R>(
items: T[],
processor: (batch: T[]) => Promise<R[]>,
batchSize = 100
): Promise<R[]> {
const results: R[] = [];
for (let i = 0; i < items.length; i += batchSize) {
const batch = items.slice(i, i + batchSize);
const batchResults = await processor(batch);
results.push(...batchResults);
// Small delay between batches
if (i + batchSize < items.length) {
await new Promise(resolve => setTimeout(resolve, 100));
}
}
return results;
}
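Combined with fetchSerpResults from above (and assuming keywords is a string[]), a large keyword list is processed in chunks like this:
// 1,000 keywords become 10 requests of 100 tasks each.
const serpResults = await batchProcess(keywords, fetchSerpResults, 100);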
SerpAPI
Overview
SerpAPI is a simpler API for one-off Google searches. It's a good fit for quick lookups that don't need DataForSEO's batching or depth.
Basic Usage
async function searchGoogle(query: string): Promise<SearchResult> {
const params = new URLSearchParams({
q: query,
location: 'Vietnam',
hl: 'vi',
gl: 'vn',
api_key: process.env.SERPAPI_KEY!,
});
const response = await fetch(
`https://serpapi.com/search.json?${params}`
);
if (!response.ok) {
throw new Error(`SerpAPI error: ${response.status}`);
}
return response.json();
}
When to Use Which
| Use Case | Provider | Why |
|---|---|---|
| Bulk keyword analysis | DataForSEO | Better batch support, more data |
| Quick single searches | SerpAPI | Simpler, faster |
| AI Overviews | DataForSEO | Has specific AI Overview data |
| Historical data | DataForSEO | Task-based with storage |
Cost Optimization
Caching Responses
SEO data doesn't change every second. Cache appropriately:
import { Redis } from '@upstash/redis';
const redis = new Redis({
url: process.env.UPSTASH_REDIS_URL!,
token: process.env.UPSTASH_REDIS_TOKEN!,
});
async function getCachedOrFetch<T>(
cacheKey: string,
fetcher: () => Promise<T>,
ttlSeconds = 3600 // 1 hour default
): Promise<T> {
// Try cache first
const cached = await redis.get<T>(cacheKey);
if (cached) {
return cached;
}
// Fetch fresh data
const data = await fetcher();
// Cache for future
await redis.setex(cacheKey, ttlSeconds, data);
return data;
}
// Usage
const results = await getCachedOrFetch(
`serp:${keyword}:${location}`,
() => fetchSerpResults([keyword]),
86400 // Cache for 24 hours
);
Cache Strategies by Data Type
| Data Type | TTL | Why |
|---|---|---|
| SERP results | 24 hours | Changes daily |
| AI Overview presence | 12 hours | Can appear/disappear |
| Keyword difficulty | 7 days | Slow changing |
| Backlink counts | 7 days | Slow changing |
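To keep these TTLs in one place rather than scattered as magic numbers, a small constant map can be passed to getCachedOrFetch (a sketch; the key names are our own):
// TTLs in seconds, mirroring the table above.
const CACHE_TTL = {
  serp: 24 * 60 * 60,                   // SERP results: 24 hours
  aiOverview: 12 * 60 * 60,             // AI Overview presence: 12 hours
  keywordDifficulty: 7 * 24 * 60 * 60,  // Keyword difficulty: 7 days
  backlinks: 7 * 24 * 60 * 60,          // Backlink counts: 7 days
} as const;

// Example:
const serp = await getCachedOrFetch(
  `serp:${keyword}`,
  () => fetchSerpResults([keyword]),
  CACHE_TTL.serp
);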
Deduplication
Don't fetch the same keyword twice:
async function fetchUniqueKeywords(keywords: string[]): Promise<SerpResult[]> {
// Deduplicate and normalize
const uniqueKeywords = [...new Set(
keywords.map(k => k.toLowerCase().trim())
)];
// Check cache for already-fetched
const cacheKeys = uniqueKeywords.map(k => `serp:${k}`);
const cachedResults = await redis.mget<SerpResult[]>(...cacheKeys);
const toFetch: string[] = [];
const results: Map<string, SerpResult> = new Map();
uniqueKeywords.forEach((keyword, i) => {
if (cachedResults[i]) {
results.set(keyword, cachedResults[i]);
} else {
toFetch.push(keyword);
}
});
// Only fetch what's missing
if (toFetch.length > 0) {
const freshResults = await fetchSerpResults(toFetch);
freshResults.forEach(result => {
results.set(result.keyword.toLowerCase(), result);
});
}
return uniqueKeywords.map(k => results.get(k)!);
}
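One step the example above leaves out: freshly fetched results should also be written back to Redis so the next call hits the cache. A sketch of that step, placed after the fetchSerpResults call and using the same 24-hour TTL as the earlier caching example:
// Cache each fresh result under its normalized key.
await Promise.all(
  freshResults.map(result =>
    redis.setex(`serp:${result.keyword.toLowerCase()}`, 86400, result)
  )
);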
Error Handling
Common API Errors
| Error | Cause | Solution |
|---|---|---|
| 401 | Bad credentials | Check API key/password |
| 402 | Insufficient credits | Top up account |
| 429 | Rate limited | Implement backoff |
| 500 | Provider issue | Retry with backoff |
Retry with Exponential Backoff
async function fetchWithRetry<T>(
fetcher: () => Promise<T>,
maxRetries = 3
): Promise<T> {
let lastError: Error | null = null;
for (let attempt = 0; attempt < maxRetries; attempt++) {
try {
return await fetcher();
    } catch (error) {
      lastError = error as Error;
      // Don't retry client errors (4xx), except 429 which indicates rate limiting
      const status = lastError.message.match(/\b(\d{3})\b/)?.[1];
      if (status && status.startsWith('4') && status !== '429') {
        throw lastError;
      }
      // Exponential backoff before the next attempt: 1s, 2s, 4s
      if (attempt < maxRetries - 1) {
        const delay = Math.pow(2, attempt) * 1000;
        await new Promise(resolve => setTimeout(resolve, delay));
      }
    }
}
throw lastError;
}
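Wrap provider calls at the call site, for example (assuming keywords is in scope):
const results = await fetchWithRetry(() => fetchSerpResults(keywords));
Because the retry logic only inspects the thrown message, it works with both the DataForSEO and SerpAPI helpers above, which embed the HTTP status in their error messages.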
Data Transformation
Normalize Provider Responses
Different providers return different shapes. Normalize to your domain:
// Your domain types
interface SerpEntry {
position: number;
url: string;
title: string;
description: string;
type: 'organic' | 'featured' | 'ai_overview';
}
// DataForSEO transformer
function transformDataForSeoResult(raw: any): SerpEntry[] {
return (raw.items ?? [])
.filter((item: any) => item.type === 'organic')
.map((item: any) => ({
position: item.rank_absolute,
url: item.url,
title: item.title,
description: item.description,
type: 'organic',
}));
}
// SerpAPI transformer
function transformSerpApiResult(raw: any): SerpEntry[] {
return (raw.organic_results ?? []).map((item: any) => ({
position: item.position,
url: item.link,
title: item.title,
description: item.snippet,
type: 'organic',
}));
}
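A thin dispatcher keeps the provider choice out of calling code (a sketch; the function name is our own):
type SeoProvider = 'dataforseo' | 'serpapi';

function normalizeSerpResponse(provider: SeoProvider, raw: any): SerpEntry[] {
  return provider === 'dataforseo'
    ? transformDataForSeoResult(raw)
    : transformSerpApiResult(raw);
}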
Monitoring Costs
Track API Usage
// Log API calls with cost estimation
async function trackApiCall(
provider: 'dataforseo' | 'serpapi',
endpoint: string,
itemCount: number
) {
const costs = {
dataforseo: {
serp: 0.002, // $0.002 per keyword
backlinks: 0.01,
},
serpapi: {
search: 0.005, // $0.005 per search
},
};
  const costPerItem = (costs[provider] as Record<string, number>)[endpoint] ?? 0;
  const cost = costPerItem * itemCount;
  // Log to database or monitoring service (assumes a shared Supabase client is in scope)
  await supabase.from('api_usage').insert({
provider,
endpoint,
item_count: itemCount,
estimated_cost: cost,
timestamp: new Date().toISOString(),
});
console.log(`API call: ${provider}/${endpoint} - ${itemCount} items - $${cost.toFixed(4)}`);
}
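Call it alongside the actual request so usage and spend stay in sync (illustrative):
const serpData = await fetchSerpResults(keywords);
await trackApiCall('dataforseo', 'serp', keywords.length);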
Common Mistakes
Not batching requests
Signs: 1000 individual API calls instead of 10 batches of 100.
Fix: Use batch endpoints where the provider offers them; DataForSEO's live endpoints accept an array of tasks per request.
Ignoring rate limits
Signs: Getting 429 errors, API access suspended.
Fix: Implement rate limiting client-side, respect provider limits.
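A minimal client-side throttle that spaces out consecutive provider calls (a sketch; the interval is illustrative and should match your plan's limits):
// Ensures at least `minIntervalMs` between consecutive provider calls.
function createThrottle(minIntervalMs: number) {
  let last = 0;
  return async function throttle(): Promise<void> {
    const wait = last + minIntervalMs - Date.now();
    if (wait > 0) {
      await new Promise(resolve => setTimeout(resolve, wait));
    }
    last = Date.now();
  };
}

// Usage: const throttle = createThrottle(200); then `await throttle();` before each call.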
Not caching
Signs: Fetching same keyword multiple times per day.
Fix: Cache with appropriate TTL. SEO data doesn't change by the minute.
Leaking raw API responses
Signs: Frontend code references provider-specific field names.
Fix: Transform to your domain types. Providers can change their format.
Evaluation Checklist
Your SEO integration is working if:
- Requests are batched appropriately
- Responses are cached
- Errors are handled gracefully
- Costs are monitored
- Data is transformed to domain types
- Rate limits are respected
Your SEO integration needs work if:
- Making individual API calls per keyword
- Same data fetched repeatedly
- 429 errors appearing
- No visibility into what the API calls cost
- Frontend knows about DataForSEO field names
Quick Reference
API Endpoints We Use
| Provider | Endpoint | Use Case |
|---|---|---|
| DataForSEO | /v3/serp/google/organic/live/advanced | Full SERP data |
| DataForSEO | /v3/keywords_data/google/search_volume/live | Keyword volumes |
| SerpAPI | /search.json | Quick searches |
Cost Awareness
| Operation | Approximate Cost |
|---|---|
| DataForSEO SERP (1 keyword) | $0.002 |
| DataForSEO Keywords (1 keyword) | $0.001 |
| SerpAPI Search | $0.005 |
| 1000 keyword analysis | ~$2-5 |