Rotating Proxies
Rotating proxies assign a fresh exit IP on every single HTTP request by default. There is no persistent connection to a single peer — the load balancer selects the next available IP from the pool automatically. This makes rotating proxies the go-to choice for high-volume web scraping, SERP monitoring, and any workflow where IP diversity is more important than session continuity.
NinjasProxy's rotating endpoint draws from the full 72M+ residential pool, so each IP originates from a different household in a different ASN. Downstream anti-bot systems see a genuinely diverse traffic pattern rather than a cycling set of known datacenter ranges.
Endpoint
Host: r.ninjasproxy.com
Port: 8080
Protocol: HTTP / HTTPS CONNECT
To pin a single IP instead, append a -session- suffix to the username. Without it, every request rotates automatically.
Authentication
Authenticate with your portal username and API key. Do not include a session suffix — any session ID will lock you to a single IP and disable rotation.
http://USERNAME:API_KEY@r.ninjasproxy.com:8080
You may still apply country and city targeting while rotating. Each request will get a new IP from your specified geo:
# Rotate through US IPs only
http://USERNAME-country-US:API_KEY@r.ninjasproxy.com:8080

# Rotate through German IPs
http://USERNAME-country-DE:API_KEY@r.ninjasproxy.com:8080
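In client code, the same targeting can be applied by embedding the country suffix in the proxy username before building the proxy URL. A minimal sketch (the helper name `rotating_proxy` is illustrative; USERNAME, API_KEY, and the country code are placeholders):

```python
def rotating_proxy(username: str, api_key: str, country: str = "") -> dict:
    """Build a requests-style proxy mapping for the rotating endpoint.

    When a country code is given, it is appended to the username as a
    -country-XX suffix, exactly as in the URL examples above.
    """
    user = f"{username}-country-{country}" if country else username
    url = f"http://{user}:{api_key}@r.ninjasproxy.com:8080"
    # Same proxy URL handles both plain HTTP and HTTPS CONNECT traffic
    return {"http": url, "https": url}

print(rotating_proxy("USERNAME", "API_KEY", "US")["https"])
# → http://USERNAME-country-US:API_KEY@r.ninjasproxy.com:8080
```

The returned dict can be passed straight to `requests.get(..., proxies=...)` as in the Python example below.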
Best Practices for High-Volume Scraping
- Disable keep-alive / connection pooling at the HTTP client level. Persistent connections reuse the same tunnel and may not rotate the exit IP until the connection closes.
- Open a new session object per request (or explicitly close the connection) when using libraries like Python's requests.Session.
- Set reasonable timeouts — residential IPs vary in speed. A 30-second connect + read timeout handles the slowest peers without blocking your pipeline.
- Implement retry logic with exponential back-off. A small fraction of peers may be temporarily slow or blocked by specific targets.
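The back-off recommendation can be sketched as a small wrapper around requests. Because the endpoint rotates on every new connection, each retry automatically lands on a different peer. This is an illustrative sketch, not an official client; the retried status codes and delay parameters are assumptions you should tune for your targets:

```python
import random
import time

import requests

def fetch_with_backoff(url, proxies, max_retries=4, base_delay=1.0, timeout=30):
    """GET through the rotating proxy, retrying transient failures.

    Each retry opens a fresh connection, so the rotating endpoint
    assigns a new exit IP automatically. Raises if all attempts fail,
    including when the final attempt returns a block/throttle status.
    """
    for attempt in range(max_retries + 1):
        try:
            resp = requests.get(url, proxies=proxies, timeout=timeout)
            # Retry on typical block/throttle statuses; return anything else.
            if resp.status_code not in (403, 429, 502, 503):
                return resp
        except requests.RequestException:
            pass  # timeout / proxy error: fall through to back-off
        if attempt < max_retries:
            # Exponential back-off with jitter: ~1s, 2s, 4s, 8s between attempts
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.5))
    raise RuntimeError(f"Giving up on {url} after {max_retries + 1} attempts")
```

The jitter term spreads retries out so that many workers hitting the same blocked target do not retry in lockstep.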
Code Examples
curl — single rotating request
curl --proxy "http://USERNAME:API_KEY@r.ninjasproxy.com:8080" \
"https://api.ipify.org?format=json"curl — loop (new IP each iteration)
for i in {1..5}; do
curl -s --proxy "http://USERNAME:API_KEY@r.ninjasproxy.com:8080" \
"https://api.ipify.org?format=json"
echo ""
done
# Output: 5 different IPs
Python — per-request rotation (no Session reuse)
import requests
import time
PROXY_URL = "http://USERNAME:API_KEY@r.ninjasproxy.com:8080"
PROXIES = {"http": PROXY_URL, "https": PROXY_URL}
TARGETS = [
    "https://example.com/page/1",
    "https://example.com/page/2",
    "https://example.com/page/3",
]
for url in TARGETS:
    # Do NOT reuse a requests.Session: each top-level requests.get() call
    # opens a new connection and therefore triggers a new IP assignment.
    try:
        resp = requests.get(
            url,
            proxies=PROXIES,
            timeout=30,
            headers={"User-Agent": "Mozilla/5.0"},
        )
        print(f"{resp.status_code} — {url}")
    except requests.RequestException as exc:
        print(f"Error on {url}: {exc}")
    time.sleep(0.5)  # be polite to the target
Python — async concurrent scraping with httpx
import asyncio
import httpx
PROXY_URL = "http://USERNAME:API_KEY@r.ninjasproxy.com:8080"
TARGETS = [f"https://example.com/product/{i}" for i in range(1, 21)]
async def fetch(client: httpx.AsyncClient, url: str) -> str:
    try:
        r = await client.get(url, timeout=30)
        return f"{r.status_code} — {url}"
    except httpx.HTTPError as exc:
        return f"Error on {url}: {exc}"

async def main():
    # Note: httpx pools connections per client. Concurrent requests open
    # separate connections (and therefore separate exit IPs), but an idle
    # connection that gets reused keeps its IP until it closes.
    async with httpx.AsyncClient(
        proxy=PROXY_URL,
        headers={"User-Agent": "Mozilla/5.0"},
        follow_redirects=True,
    ) as client:
        results = await asyncio.gather(*[fetch(client, url) for url in TARGETS])
        for r in results:
            print(r)

asyncio.run(main())
Node.js — rotation with got
import got from 'got'
import { HttpsProxyAgent } from 'https-proxy-agent'
const PROXY_URL = 'http://USERNAME:API_KEY@r.ninjasproxy.com:8080'
const targets = ['https://example.com/a', 'https://example.com/b', 'https://example.com/c']
for (const url of targets) {
  // Create a fresh agent per request to force connection rotation
  const agent = new HttpsProxyAgent(PROXY_URL)
  const { statusCode } = await got(url, {
    agent: { https: agent },
    timeout: { request: 30_000 },
    retry: { limit: 2 },
  })
  console.log(statusCode, url)
}
Monitoring & Debugging
The portal dashboard shows real-time bandwidth usage and request counts per proxy type. For per-request IP logging, enable Request Logs under Settings → Debug Logs — logs are retained for 24 hours and include the assigned exit IP, latency, and response code.
Next Steps
- Sticky sessions — need the same IP across multiple requests? Use a session token.
- Datacenter proxies — unlimited bandwidth at the lowest possible latency.
- Python guide — advanced async patterns and retry strategies.
- API Reference — programmatic usage stats and session management.