WherobotsAI Raster Inference makes it easy to build innovative solutions from large-scale satellite and aerial imagery using SQL and Python, together with geovision models you import or source off the shelf. Today, we’re excited to announce that Raster Inference now supports Meta’s Segment Anything 2 (SAM 2) model. With SAM 2 support in Raster Inference (in preview), you can now perform general-purpose object detection and feature segmentation on terabyte-scale collections of geospatial imagery, simply by describing what you’re looking for in plain text.
You can now use Raster Inference to answer questions from satellite imagery like “find the container ships,” “segment crop circles,” or “segment pickleball courts” across very large areas of interest.
Raster Inference results are stored as Iceberg tables in your S3 bucket, so you can easily join them with other data using WherobotsDB and continue your analysis inline with more than 300 features and functions optimized for spatial solution development.
Foundation computer vision models like Segment Anything 2 revolutionized image understanding by allowing users to detect and delineate objects, typically in small images, using simple point- or text-based prompts. However, it has been very hard, if not infeasible, to use this model for large-scale object detection on satellite imagery, for tasks like “find all of the container ships in the Pacific Ocean.”
Raster Inference in Wherobots makes the SAM 2 model, along with other geospatial computer vision models (including custom models), easier to deploy and run on terabyte-scale inference jobs using a cloud-parallel compute architecture. This scale and performance is necessary for extracting insights from high-resolution satellite imagery. Now, customers can prompt the SAM 2 model with English text and explore Earth observation AI use cases much faster than before.
We launched support for SAM 2 in Raster Inference because we saw customers starting to use Raster Inference to explore the potential of geospatial AI for their use cases. Some lacked ML expertise; others were experts who just wanted a fast way to explore and test their ideas. Exploring is as easy as opening an example notebook, modifying a text prompt, and hitting run.
You can work with the tabular output of all Raster Inference runs inline using SQL or Python, and use Wherobots’ rich toolbox of spatial functions to accelerate the realization of value from complex spatial data.
We believe this is a great general-purpose AI tool for testing the feasibility of your ideas. From here, you can decide to invest in or find fine-tuned model alternatives that may be optimized for your Earth observation use case.
Just like the LLMs you’ve probably interfaced with, you have to take quality, reliability, and risk into account before using the results verbatim in an application. But even when they aren’t 100% correct, they offer immense productivity gains. And as we’ve seen with LLMs, text-promptable models like SAM 2 will evolve and improve over time, increasing the direct utility of their results and the range of use cases they can serve out of the box.
You can get started in a matter of minutes, without any infrastructure or model management overhead.
You can also learn more about running the SAM 2 model in Raster Inference using the Wherobots product documentation.
Engineers who have worked with SAM 2 might recognize that, out of the box, SAM 2 does not support text prompting. Instead, it supports automatic segment detection without categories, point prompts, and bounding-box prompts to produce segment predictions. To enable text prompting in Raster Inference, we integrated a two-stage model pipeline combining Google DeepMind’s OWLv2 open-vocabulary object detector with SAM 2. This enables two powerful new functions:
- Text_To_BBoxes — returns bounding boxes for objects matching a text prompt, such as 'solar panels'
- Text_To_Segments — returns segment geometries for objects matching a text prompt
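Conceptually, the two-stage pipeline works by first detecting boxes from the text prompt, then prompting the segmenter with those boxes. The sketch below illustrates the control flow only: `detect_boxes` and `segment_from_boxes` are hypothetical stubs standing in for OWLv2 and SAM 2, not the Wherobots API.

```python
# Illustrative sketch of the two-stage text-prompting pipeline.
# detect_boxes and segment_from_boxes are hypothetical stubs standing in
# for OWLv2 and SAM 2 — they are NOT the Wherobots API.

def detect_boxes(image, prompt):
    """Stage 1: open-vocabulary detection (OWLv2's role). Returns candidate
    bounding boxes (xmin, ymin, xmax, ymax) with confidence scores."""
    # Stub: pretend the detector found two matches for the prompt.
    return [((10, 10, 50, 50), 0.91), ((60, 20, 90, 55), 0.78)]

def segment_from_boxes(image, boxes):
    """Stage 2: box-prompted segmentation (SAM 2's role). Returns one
    polygon (list of vertices) per input box."""
    # Stub: return each box's corners as a rectangular "polygon".
    return [[(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
            for (x0, y0, x1, y1) in boxes]

def text_to_segments(image, prompt, min_score=0.5):
    """Text_To_Segments, conceptually: detect boxes from text, keep the
    confident ones, then prompt the segmenter with those boxes."""
    detections = detect_boxes(image, prompt)
    boxes = [box for box, score in detections if score >= min_score]
    return segment_from_boxes(image, boxes)

segments = text_to_segments(image=None, prompt="solar panels")
print(len(segments))  # two stub detections -> two polygons
```

Text_To_BBoxes corresponds to stopping after stage 1, which is why it is the cheaper choice when you only need object locations.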
Using WherobotsDB, all results can be easily joined to other spatial and non-spatial data you may need to complete the solution.
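As a sketch of what such a join might look like (table and column names here are hypothetical, not from the Wherobots docs), you could attribute each detected segment to a land parcel with a spatial predicate:

```sql
-- Hypothetical example: count detections per land parcel.
-- detected_segments and parcels are illustrative table names.
SELECT p.parcel_id,
       COUNT(*) AS num_detections
FROM detected_segments d
JOIN parcels p
  ON ST_Intersects(d.geometry, p.geometry)
GROUP BY p.parcel_id;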
If your solution simply requires localizing and counting objects, use Text_To_BBoxes. If you need to know the size or shape of objects, use Text_To_Segments.
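To make the size case concrete, here is a minimal pure-Python illustration of computing a footprint area from a segment polygon using the shoelace formula. It assumes planar projected coordinates (e.g. meters); in practice you would compute this inline with WherobotsDB's spatial functions rather than by hand.

```python
def polygon_area(vertices):
    """Shoelace formula: area of a simple polygon from its vertices in
    order. Assumes planar coordinates, e.g. a projected CRS in meters."""
    n = len(vertices)
    area = 0.0
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]  # wrap around to close the ring
        area += x0 * y1 - x1 * y0
    return abs(area) / 2.0

# A 20 m x 10 m rectangular rooftop footprint:
rect = [(0, 0), (20, 0), (20, 10), (0, 10)]
print(polygon_area(rect))  # 200.0
```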
Accessing these capabilities is straightforward using either SQL or Python, fitting seamlessly into your existing data analysis or data engineering workflows.
Here’s how you can predict solar panel geometries in SQL:
-- Find bounding boxes for 'solar panels'
CREATE OR REPLACE TEMP VIEW detected_bboxes AS
SELECT raster_column,
       RS_TEXT_TO_BBOXES('SAM 2', raster_column, 'solar panels') AS detection_result
FROM my_imagery_table;

-- Find segments for 'solar panels'
CREATE OR REPLACE TEMP VIEW detected_segments AS
SELECT raster_column,
       RS_TEXT_TO_SEGMENTS('SAM 2', raster_column, 'solar panels') AS segmentation_result
FROM my_imagery_table;
And in Python:
# Find bounding boxes for 'solar panels'
df_bboxes = df_raster_input.withColumn(
    "detection_result",
    rs_text_to_bboxes("SAM 2", col("raster_column"), "solar panels")
)

# Find segments for 'solar panels'
df_segments = df_raster_input.withColumn(
    "segmentation_result",
    rs_text_to_segments("SAM 2", col("raster_column"), "solar panels")
)
Behind the scenes, Raster Inference handles the heavy lifting. It scales your text-prompted inference request and model across a distributed GPU runtime optimized for loading large geospatial raster datasets, and ensures work is balanced across serverless compute nodes. It automatically handles varying image sizes via padding and other techniques, and efficiently reads your raster data using features like limit and sample pushdown. And the results, whether bounding boxes or polygons, are georeferenced and ready for immediate use in SQL- or Python-based analysis, or for visualization with helper functions like show_detections.
We’re still exploring the model’s capabilities, which is why the SAM 2 integration is in preview. Here are a few considerations we’ve identified based on observations during testing:
SAM 2 performs well on high-resolution 30 cm NAIP imagery for simpler object categories, like airplane segmentation.
For example, the OWLv2 object detection model (which Text_To_Segments depends on) can confuse visually similar features, like agriculture and solar farms, when prompted to detect “solar farms.”
We are investigating paths to improve SAM 2’s accuracy, including supporting different prompting methods, such as using points to guide segmentation.
Support for the SAM 2 model for geospatial AI in Raster Inference reflects our core mission: making it easy to utilize geospatial data at any scale. It also offers a tangible view into a future where interacting with vast archives of Earth observation data is far easier and more intuitive.
We will continue to make SAM 2 and other geospatial models more accessible and easier to use, so customers can address diverse challenges with the help of Earth observation data.
Feel free to contact the team directly (product@wherobots.com) if you have feedback to share. We can’t wait to see what you discover!