
Batch scrape

Scrape up to 100 URLs in a single request and get one combined response.

POST /v1/scrape/batch

Send a list of URLs and receive all results together in a single response. Useful for medium-volume jobs where you can wait for the full set; for larger jobs, use the async variant.
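For jobs larger than the 100-URL limit, one simple approach is to split the list into chunks and issue one batch call per chunk. A minimal sketch (the `chunked` helper is illustrative, not part of the API):

```python
def chunked(urls, size=100):
    """Yield successive chunks of at most `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

# 250 URLs split into batches of 100, 100, and 50.
batches = list(chunked([f"https://example.com/{n}" for n in range(250)]))
```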

Parameters

urls (array, required): List of URLs to scrape (max 100).
concurrency (integer, optional, default 5): Maximum parallel requests (max 25).
format (string, optional, default markdown): Output format applied to each URL.
timeout (integer, optional, default 120): Overall timeout in seconds.
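The limits above can be enforced client-side before sending. A hedged sketch of payload construction (the `build_payload` helper is an assumption for illustration, not part of any SDK):

```python
def build_payload(urls, concurrency=5, format="markdown", timeout=120):
    """Validate inputs against the documented limits and return the JSON body."""
    if not urls or len(urls) > 100:
        raise ValueError("urls must contain 1-100 entries")
    if not 1 <= concurrency <= 25:
        raise ValueError("concurrency must be between 1 and 25")
    return {
        "urls": urls,
        "concurrency": concurrency,
        "format": format,
        "timeout": timeout,
    }
```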

Request

curl -X POST https://api.datasonar.dev/v1/scrape/batch \
  -H "Authorization: Bearer osk_..." \
  -H "Content-Type: application/json" \
  -d '{"urls": ["https://a.com","https://b.com"], "concurrency": 5, "format": "markdown"}'
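The same request can be built in Python with the standard library. A sketch mirroring the curl example (the bearer token is a placeholder; the `urlopen` call is commented out so nothing is sent):

```python
import json
import urllib.request

# JSON body matching the curl example above.
body = json.dumps({
    "urls": ["https://a.com", "https://b.com"],
    "concurrency": 5,
    "format": "markdown",
}).encode()

req = urllib.request.Request(
    "https://api.datasonar.dev/v1/scrape/batch",
    data=body,
    headers={
        "Authorization": "Bearer osk_...",  # placeholder token
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request; omitted here.
```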

Response

{
  "status": "success",
  "results": [
    {"url": "https://a.com", "status": "success", "content": "..."},
    {"url": "https://b.com", "status": "success", "content": "..."}
  ],
  "time_ms": 1099
}

Related