External pipelines are currently in Beta. Functionality may change during ongoing development.
If you're new to Pipeline Builder, review how to create a batch pipeline in Pipeline Builder before proceeding.
Pipeline Builder now offers external pipelines, which push compute down to external compute engines. This works in the same way as compute pushdown in Python transforms, and allows Foundry's pipeline management, data lineage, and security functionality to be layered on top of external data warehouse compute. As with compute pushdown in Python transforms, all inputs and outputs of external pipelines must be virtual tables.
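For context, the sketch below illustrates what "pushing down compute" means in practice: the transformation is expressed as SQL and executed entirely inside the external warehouse, rather than by Foundry-native compute. It is a purely conceptual example using the Snowflake Python connector; the credentials and table names (`raw_orders`, `daily_order_totals`) are placeholders, and this is not Pipeline Builder's API. External pipelines configure equivalent warehouse-side work for you through the Pipeline Builder interface.

```python
# Conceptual illustration of compute pushdown (not Pipeline Builder's API).
# The aggregation below runs entirely inside Snowflake; no rows are pulled
# out of the warehouse. Credentials and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # The transformation is pushed down as SQL; the result lands in a
    # warehouse table that a virtual table could then expose in Foundry.
    cur.execute(
        """
        CREATE OR REPLACE TABLE daily_order_totals AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM raw_orders
        GROUP BY order_date
        """
    )
finally:
    conn.close()
```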
Tables built with external compute can be composed with datasets and tables built with Foundry-native compute using Foundry's scheduling tools, allowing you to orchestrate complex multi-technology pipelines with the most appropriate compute engine at each step.

Currently, Databricks and Snowflake are the supported external compute engines in Pipeline Builder. To use other external compute engines, such as BigQuery, use transforms with compute pushdown.
| Source type | Status | Notes |
|---|---|---|
| BigQuery | Not available | |
| Databricks | Beta | Serverless (default) or classic compute available. |
| Snowflake | Beta | Contact Palantir Support to enable this feature. |
All input and output tables must be configured from the same source you selected as part of the pipeline setup.

You can edit your pipeline source and configure source-specific compute options in the build settings panel.
External pipelines do not currently support the same set of transforms and expressions as standard batch pipelines.
Due to the differences between external pipelines and standard batch pipelines, you should always verify results using Preview or by examining build outputs.
Currently unsupported features and expressions include: