
AUTOMATE

Take spatial data pipelines to production

Orchestrate physical-world data pipelines with Apache Airflow or the Job Runs API. The same code you built in notebooks runs in production with full observability.

Production-grade spatial pipelines
Schedule and orchestrate continuous spatial workflows.
Native Apache Airflow integration
Drop WherobotsDB into your existing Airflow DAGs.
Migrate and accelerate
Existing Spark and Sedona pipelines run faster at lower cost.

Orchestrate spatial data pipelines with Apache Airflow

The Wherobots Airflow provider lets you drop spatial data pipeline jobs into your existing Airflow DAGs. Run WherobotsDB compute jobs or execute spatial SQL queries on a schedule. Install it with pip install airflow-providers-wherobots. Available for Professional and Enterprise editions.
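As a minimal sketch of what dropping Wherobots into a DAG can look like: the operator names come from the FAQ below, but the import paths, operator parameters, S3 path, and table name here are assumptions for illustration; check the installed provider for the exact API.

```python
# Sketch: scheduling Wherobots jobs from an Airflow DAG.
# Import paths and operator parameters are assumptions; verify against
# the airflow-providers-wherobots package you install.
from datetime import datetime

from airflow import DAG
from airflow_providers_wherobots.operators.run import WherobotsRunOperator  # assumed path
from airflow_providers_wherobots.operators.sql import WherobotsSqlOperator  # assumed path

with DAG(
    dag_id="spatial_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a Python job stored on an accessible S3 path (hypothetical path
    # and runtime size).
    run_job = WherobotsRunOperator(
        task_id="run_spatial_job",
        name="daily-spatial-job",
        run_python={"uri": "s3://my-bucket/jobs/pipeline.py"},
        runtime="medium",
    )

    # Execute a spatial SQL query on the same schedule (hypothetical table).
    refresh = WherobotsSqlOperator(
        task_id="refresh_aggregates",
        sql="SELECT COUNT(*) FROM my_catalog.my_db.places",
    )

    run_job >> refresh
```

The operators pick up your Wherobots API key from the Airflow connection you configure, so the DAG itself stays free of credentials.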

Or trigger spatial data pipelines with the Job Runs API

The Job Runs REST API triggers WherobotsDB jobs programmatically. Submit Python or JAR workloads, specify runtime sizes, monitor status, and retrieve logs. Integrate spatial data pipeline automation into any CI/CD system, orchestrator, or custom application.
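A rough sketch of programmatic submission from any application, using only the Python standard library. The endpoint path, base URL, header name, and payload field names are assumptions for illustration; consult the Job Runs API reference for the actual schema.

```python
import json
import urllib.request

# Assumed base URL for illustration only.
API_BASE = "https://api.cloud.wherobots.com"


def build_run_payload(name: str, python_uri: str, runtime: str = "medium") -> dict:
    """Build a run-submission payload: a Python file on S3 plus a runtime size.

    Field names here are assumptions, not the documented schema.
    """
    return {
        "name": name,
        "runtime": runtime,
        "pythonRunPayload": {"uri": python_uri},
    }


def submit_run(api_key: str, payload: dict) -> bytes:
    """POST the payload to the (assumed) runs endpoint and return the raw response."""
    req = urllib.request.Request(
        f"{API_BASE}/runs",
        data=json.dumps(payload).encode(),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


payload = build_run_payload("nightly-job", "s3://my-bucket/jobs/pipeline.py")
```

Because it is plain HTTP, the same pattern slots into a CI/CD step, another orchestrator, or a custom service without any SDK dependency.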

Migrate existing Spark and Sedona pipelines to WherobotsDB

Existing Apache Spark and Sedona spatial data pipelines run on WherobotsDB with zero code changes. WherobotsDB benchmarks at up to 3x faster queries and 46% lower cost. Aarden.ai reduced core processing from 7 days to 30 minutes after migration.

Full observability for every spatial data pipeline

Monitor every spatial data pipeline from submission to completion. Track job status, resource consumption, runtime logs, and performance metrics. Identify bottlenecks before they impact production.
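Tracking a job from submission to completion typically means polling its status until a terminal state. The status names below and the idea of a status-fetching callable are assumptions for illustration, not the documented API.

```python
import time

# Assumed terminal status values for illustration.
TERMINAL_STATES = {"COMPLETED", "FAILED", "CANCELLED"}


def is_terminal(status: str) -> bool:
    """Return True once a run has finished (successfully or not)."""
    return status.upper() in TERMINAL_STATES


def wait_for_run(get_status, poll_seconds: float = 10.0, timeout: float = 3600.0) -> str:
    """Poll get_status() -- any callable returning the run's status string --
    until it reports a terminal state, or raise on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if is_terminal(status):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("run did not reach a terminal state in time")
```

Passing a callable rather than hard-coding an HTTP call keeps the polling loop reusable whether status comes from the REST API, the Airflow provider, or a test double.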

Job Runs
Workload Usage
Logs & Metrics

FAQ

How do I use Wherobots with Apache Airflow?

Install the Wherobots Apache Airflow provider with pip install airflow-providers-wherobots. After configuring your Wherobots API key in Airflow, use WherobotsRunOperator to execute Python or JAR spatial data pipeline jobs, or WherobotsSqlOperator to execute spatial SQL queries. Available for Professional and Enterprise editions.

What kinds of code can a Wherobots Run execute?

A Wherobots Run executes either a Python file or a JAR file stored on an accessible S3 path (via Managed Storage or a Storage Integration). When creating a run, you must specify a runtime size (e.g., tiny, medium, x-large-himem). You can also specify custom dependencies to include in the environment, such as PyPI packages or other dependency files. Note: available for Wherobots Professional and Enterprise editions.
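Put together, the pieces above (a file on S3, a runtime size, optional dependencies) form a run configuration along these lines. The field names and dependency schema are assumptions for illustration only.

```python
# Sketch of a run configuration combining the elements described above:
# a Python file on S3, a runtime size, and custom dependencies.
# Field names are assumptions, not the documented schema.
run_config = {
    "name": "isochrone-batch",
    "runtime": "x-large-himem",  # one of the runtime sizes, e.g. tiny, medium, x-large-himem
    "pythonRunPayload": {
        "uri": "s3://my-bucket/jobs/isochrones.py",
        "dependencies": [
            # A PyPI package pinned to a version (assumed structure).
            {"sourceType": "PYPI", "libraryName": "h3", "libraryVersion": "3.7.6"},
            # An additional dependency file on S3 (assumed structure).
            {"sourceType": "FILE", "filePath": "s3://my-bucket/deps/helpers.zip"},
        ],
    },
}
```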

What types of workloads can be migrated to Wherobots?

Existing Apache Spark and Sedona spatial data pipelines run on WherobotsDB with zero code changes. WherobotsDB benchmarks at up to 3x faster queries and 46% lower cost than the previous generation. Aarden.ai reduced core processing from 7 days to 30 minutes after migrating their spatial data pipeline to WherobotsDB.

Get started

Wherobots is the AI Context Engine for the Physical World.