Get the data you need

The most powerful and easy-to-use data collection API. 50+ ready-made crawlers, simple REST endpoints, structured JSON output.

50+ Ready-made crawlers
Simple REST API
Structured JSON output
99.5% success rate

How it works

Start collecting data in minutes. No infrastructure to manage, no proxies to configure.

example.py
import requests

API_KEY = "your_api_key"
headers = {"Authorization": f"Token {API_KEY}"}

# 1. Create a squid with a crawler
squid = requests.post("https://api.lobstr.io/v1/squids",
    headers=headers,
    json={"crawler": "4734d096159ef05210e0e1677e8be823", "name": "Restaurants Paris"}
).json()

# 2. Add tasks to scrape
requests.post("https://api.lobstr.io/v1/tasks",
    headers=headers,
    json={
        "squid": squid['id'],
        "tasks": [{"url": "https://google.com/maps/search/restaurants+paris"}]
    }
)

# 3. Start the run
run = requests.post("https://api.lobstr.io/v1/runs",
    headers=headers,
    json={"squid": squid['id']}
).json()

# 4. Get results
results = requests.get("https://api.lobstr.io/v1/results",
    headers=headers,
    params={"run": run['id']}
).json()

for place in results['data']:
    print(f"{place['name']} - {place['rating']}★")
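Note that step 4 fetches results immediately after starting the run; in practice a crawl takes some time to finish, so you will usually want to poll before requesting results. A minimal, generic polling helper is sketched below — the run-status check in the usage note is an assumption for illustration, not an endpoint documented on this page:

```python
import time

def poll_until(check, interval=2.0, timeout=300.0):
    """Call check() every `interval` seconds until it returns a truthy
    value (which is then returned) or `timeout` seconds have elapsed."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = check()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("condition not met within timeout")
```

You could then wrap your own run-status check (hypothetical sketch, adjust to the actual API): `poll_until(lambda: requests.get(f"https://api.lobstr.io/v1/runs/{run['id']}", headers=headers).json().get("status") == "done")` before calling the results endpoint.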

Ready to get started?

Get your API key and start collecting data in minutes.

Get Started