Summary of an article originally published by The New Stack.
In the previous installment of this series, we created custom Docker images for provisioning Jupyter Notebook Servers targeting the data preparation, training, and inference stages of a machine learning project. This tutorial focuses on provisioning the storage backend for Jupyter Notebook Servers running on the Kubeflow platform.
As discussed in the earlier parts, Kubeflow has a unique storage requirement: running MLOps pipelines calls for both shared volumes, accessible to multiple Notebook Servers and pipeline steps, and dedicated volumes owned by a single Notebook Server.
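The two volume types can be sketched as a pair of PersistentVolumeClaims. This is a minimal illustration, not taken from the original article: the claim names, namespace, sizes, and storage class names are hypothetical and depend on the provisioners installed in your cluster (for example, an NFS provisioner for shared storage and Portworx for block storage).

```yaml
# Shared volume: mounted by several Notebook Servers at once,
# so it needs the ReadWriteMany access mode (e.g. NFS-backed).
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: shared-datasets         # hypothetical name
  namespace: kubeflow-user      # hypothetical namespace
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: nfs-client  # assumes an NFS provisioner is installed
  resources:
    requests:
      storage: 100Gi
---
# Dedicated volume: mounted by a single Notebook Server,
# so ReadWriteOnce block storage is sufficient.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: training-workspace      # hypothetical name
  namespace: kubeflow-user
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: portworx-sc # hypothetical Portworx storage class
  resources:
    requests:
      storage: 50Gi
```

Applying manifests like these (e.g. with `kubectl apply -f`) creates the claims that Notebook Servers can later mount.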
For detailed instructions on deploying and configuring Kubeflow storage, refer to the DeepOps guide for NFS and Portworx.
With the custom Docker container images and storage volumes in place, we are all set to launch the Notebook Servers for data preparation, training, and inference.
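When launching a Notebook Server, the volumes can be attached either through the Kubeflow dashboard or declaratively via the Notebook custom resource. The sketch below shows the general shape under assumed names: the image reference, claim names, and namespace are hypothetical placeholders, not values from the original article.

```yaml
# Sketch of a Kubeflow Notebook custom resource mounting one shared
# and one dedicated volume; all names below are hypothetical.
apiVersion: kubeflow.org/v1
kind: Notebook
metadata:
  name: data-prep
  namespace: kubeflow-user
spec:
  template:
    spec:
      containers:
        - name: data-prep
          # hypothetical tag for a custom data-preparation image
          image: registry.example.com/jupyter-dataprep:latest
          volumeMounts:
            - name: shared
              mountPath: /home/jovyan/shared     # datasets visible to all servers
            - name: workspace
              mountPath: /home/jovyan/workspace  # private working directory
      volumes:
        - name: shared
          persistentVolumeClaim:
            claimName: shared-datasets           # hypothetical RWX claim
        - name: workspace
          persistentVolumeClaim:
            claimName: training-workspace        # hypothetical RWO claim
```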