# soopervisor

**Repository Path**: mirrors_andyglick/soopervisor

## Basic Information

- **Project Name**: soopervisor
- **Description**: ☁️ Export Ploomber pipelines to Kubernetes (Argo), Airflow, AWS Batch, SLURM, and Kubeflow.
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2022-03-28
- **Last Updated**: 2026-02-22

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

Soopervisor
===========

.. image:: https://github.com/ploomber/soopervisor/workflows/CI/badge.svg
    :target: https://github.com/ploomber/soopervisor/workflows/CI/badge.svg
    :alt: CI badge

.. image:: https://github.com/ploomber/soopervisor/workflows/CI%20macOS/badge.svg
    :target: https://github.com/ploomber/soopervisor/workflows/CI%20macOS/badge.svg
    :alt: CI macOS badge

.. image:: https://github.com/ploomber/soopervisor/workflows/CI%20Windows/badge.svg
    :target: https://github.com/ploomber/soopervisor/workflows/CI%20Windows/badge.svg
    :alt: CI Windows badge

Soopervisor runs `Ploomber `_ pipelines for batch processing (large-scale training or batch serving) or online inference.

.. code-block:: sh

    pip install soopervisor

Check out the `documentation `_ to learn more.

*Compatible with Python 3.6 and higher.*

Supported platforms
===================

* Batch serving and large-scale training:

  * `Airflow `_
  * `Argo/Kubernetes `_
  * `AWS Batch `_
  * `Kubeflow `_
  * `SLURM `_

* Online inference:

  * `AWS Lambda `_

From notebook to a production pipeline
======================================

We also have `an example `_ that shows how to use our ecosystem of tools to go **from a monolithic notebook to a pipeline deployed in Kubernetes.**

Usage
=====

Say you want to train multiple models in a Kubernetes cluster; you can create a new target environment that executes your pipeline using Argo Workflows:

.. code-block:: sh

    soopervisor add training --backend argo-workflows

After filling in some basic configuration settings, export the pipeline with:

.. code-block:: sh

    soopervisor export training

Depending on the selected backend (Argo, Airflow, AWS Batch, or AWS Lambda), the configuration details change, but the API remains the same: ``soopervisor add``, then ``soopervisor export``.
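The configuration settings mentioned above live in a ``soopervisor.yaml`` file that ``soopervisor add`` creates, keyed by the target environment's name. As a rough sketch only (the exact keys vary by backend and soopervisor version, and the ``repository`` value below is a made-up placeholder), it might look like:

.. code-block:: yaml

    # soopervisor.yaml (illustrative sketch, not a definitive schema)
    training:
      # backend selected via: soopervisor add training --backend argo-workflows
      backend: argo-workflows
      # hypothetical placeholder; point this at your container registry
      repository: your-registry/your-image

Once the file reflects your infrastructure, ``soopervisor export training`` packages and submits the pipeline to that backend.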