Palantir's modeling suite of products enables users to develop, manage, and operationalize models. This page compares different products to help you choose the right tool for your needs.
| Product | Details |
|---|---|
| Pipeline Builder | Large-scale point-and-click data transformation |
| Code Workspaces | Interactive, pro-code data analysis and transformation in familiar environments such as JupyterLab® |
| Python Transforms | PySpark data pipeline development in Foundry's web-based IDE, Code Repositories |
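For example, a minimal Python transform in Code Repositories might look like the following sketch; the dataset paths are placeholders.

```python
from transforms.api import transform_df, Input, Output
from pyspark.sql import functions as F


@transform_df(
    Output("/Project/datasets/clean_measurements"),       # placeholder output path
    source=Input("/Project/datasets/raw_measurements"),   # placeholder input path
)
def compute(source):
    # Drop rows with missing values and add a derived column.
    return (
        source
        .filter(F.col("value").isNotNull())
        .withColumn("value_squared", F.col("value") ** 2)
    )
```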
No-code model training tools are available in Model Studio, providing a simple point-and-click interface for creating production-grade machine learning models.
The foundry_ml library and dataset-backed models are in their planned deprecation phase and will become unavailable alongside the Python 3.9 deprecation in October 2025. Support remains available until the deprecation date, but you should migrate your workflows to the palantir_models library now. Contact Palantir Support if you require additional help migrating your workflows.
The palantir_models library provides flexible tooling to publish and consume models within the Palantir platform using the concept of model adapters; a minimal adapter sketch follows the table below.
| Library | Details |
|---|---|
| palantir_models | Flexible library to publish and consume models in Foundry through Model Adapters; supports models produced in platform, external models, and containerized models |
| foundry_ml | Legacy model development library, formally deprecated as of October 2025; you should use palantir_models instead of foundry_ml unless absolutely necessary |
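As an illustrative sketch of the model adapter concept, the adapter below wraps a trained scikit-learn style regressor and declares its tabular input and output API. Class, column, and serializer names are placeholders, and the exact helpers may vary across palantir_models versions.

```python
import palantir_models as pm
from palantir_models_serializers import DillSerializer


class ExampleRegressionAdapter(pm.ModelAdapter):
    """Tells Foundry how to serialize, load, and run the wrapped model."""

    @pm.auto_serialize(model=DillSerializer())
    def __init__(self, model):
        self.model = model

    @classmethod
    def api(cls):
        # Declare the tabular input the model expects and the output it produces.
        feature_columns = [("feature_1", float), ("feature_2", float)]
        inputs = {"df_in": pm.Pandas(columns=feature_columns)}
        outputs = {"df_out": pm.Pandas(columns=feature_columns + [("prediction", float)])}
        return inputs, outputs

    def predict(self, df_in):
        df_in["prediction"] = self.model.predict(df_in[["feature_1", "feature_2"]])
        return df_in
```

An adapter like this can then be trained and published from any of the development environments below.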
| Product | Library support | Details |
|---|---|---|
| Code Workspaces | palantir_models | Interactive model development in Jupyter® notebooks |
| Code Repositories | palantir_models | Powerful web-based IDE with native CI/CD features and support for modeling workflows; less interactive than notebooks (see the training sketch below) |
| Code Workbooks | foundry_ml | Modeling support in Code Workbooks is limited to foundry_ml models, which are deprecated as of October 2025 |
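For instance, in Code Repositories a model is typically trained and published from a Python transform via ModelOutput; the dataset and model paths are placeholders, and ExampleRegressionAdapter is the adapter sketched above.

```python
from sklearn.linear_model import LinearRegression
from transforms.api import Input, transform
from palantir_models.transforms import ModelOutput

# Hypothetical module path for the adapter sketched above.
from myproject.model_adapters import ExampleRegressionAdapter


@transform(
    training_data=Input("/Project/datasets/training_data"),     # placeholder path
    model_output=ModelOutput("/Project/models/example_model"),  # placeholder path
)
def compute(training_data, model_output):
    df = training_data.pandas()
    model = LinearRegression().fit(df[["feature_1", "feature_2"]], df["target"])
    # Wrap the trained estimator in the adapter and publish a new model version.
    model_output.publish(model_adapter=ExampleRegressionAdapter(model))
```

Training runs like this can also log metrics and hyperparameters with the Experiments framework below.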
| Product | Details |
|---|---|
| Experiments | Framework for logging metrics and hyperparameters during a model training job |
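As a loosely sketched illustration only: inside a training transform like the one above, an experiment can record parameters and metric series and be attached to the published model version. The method names below (create_experiment, log_param, log_metric, the experiment argument to publish) are assumptions and may differ in your version of palantir_models, so check the Experiments documentation.

```python
# Fragment intended to sit inside the training transform sketched earlier;
# names are assumptions and may not match your palantir_models version.
experiment = model_output.create_experiment("baseline_linear_regression")
experiment.log_param("fit_intercept", True)

for loss in training_losses:  # training_losses is a hypothetical per-epoch list
    experiment.log_metric("loss", loss)

# Attach the experiment to the published model version.
model_output.publish(
    model_adapter=ExampleRegressionAdapter(model),
    experiment=experiment,
)
```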
Models can be used to run large-scale batch inference pipelines across datasets; a minimal transforms sketch follows the table below.
| Product | Details | Caveats |
|---|---|---|
| Modeling objective batch deployments | Modeling Objectives offers broader model management features such as model release management, upgrades, evaluation, and more | Does not support multi-output or external models |
| Python transforms | Batch inference can be run directly in Python transforms | Use the @lightweight decorator unless specifically looking to distribute inference over Spark |
| Jupyter® Notebook | Users can create scheduled training and/or inference jobs directly from Code Workspaces | Only supports running inference with models created in the same notebook; use Python Transforms to orchestrate models created elsewhere |
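As a sketch of the Python transforms route, batch inference reads the model with ModelInput and calls its transform method. The paths are placeholders, df_out assumes the output name declared in the adapter sketched earlier, and @lightweight is appropriate when the scoring data fits on a single node.

```python
from transforms.api import Input, Output, lightweight, transform
from palantir_models.transforms import ModelInput


@lightweight()
@transform(
    predictions=Output("/Project/datasets/predictions"),    # placeholder path
    model=ModelInput("/Project/models/example_model"),      # placeholder path
    features=Input("/Project/datasets/scoring_features"),   # placeholder path
)
def compute(predictions, model, features):
    # Run the adapter's predict() over the scoring data.
    results = model.transform(features.pandas())
    # df_out matches the output name declared in the adapter's api().
    predictions.write_pandas(results.df_out)
```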
Models can be deployed in Foundry behind a REST API; deploying a model operationalizes it for use both inside and outside of Foundry. An example request is sketched after the table below.
| Product | Details |
|---|---|
| Model direct deployments | Auto-upgrading model deployments; best for quick iteration and deployment |
| Modeling objective live deployments | Production-grade modeling project management; modeling objectives provide tooling for model release management, upgrades, evaluation, and more |
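As a hedged illustration of calling a deployed model over REST from outside Foundry: the endpoint URL, token handling, and payload shape below are placeholders, so copy the actual request format from the deployment's query interface in Foundry.

```python
import requests

# Placeholder values: take the real inference URL and payload format from your
# deployment's query panel, and use a token authorized to access the deployment.
INFERENCE_URL = "https://<your-foundry-hostname>/<deployment-inference-endpoint>"
TOKEN = "<bearer-token>"

response = requests.post(
    INFERENCE_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    # Payload shape assumed from the adapter sketch above (single tabular input).
    json={"df_in": [{"feature_1": 1.0, "feature_2": 2.0}]},
)
response.raise_for_status()
print(response.json())
```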
Publishing models as functions makes it easy to use models for live inference in downstream Foundry applications, including Workshop, Slate, actions, and more.
| Product | Best for |
|---|---|
| Direct function publication | No-code function creation on models with live deployments, allowing integration with the Ontology. The same functionality is available in the Model and Modeling Objectives applications. |
| Importing model functions in TypeScript v1 repositories | Import model functions into TypeScript v1 functions to further process predictions (for example, make Ontology edits) with support for Model API type checking and Marketplace deployment. |
| Call model functions via Platform SDK from Python or TypeScript v2 functions | Model functions can also be called from Python or TypeScript v2 functions if the model and associated functions will not be deployed via Marketplace. |