14_Operationalizing Machine Learning Flashcards
Operationalizing Machine Learning
Operationalizing Machine Learning refers broadly to the process of deploying predictive models to a production environment, together with ongoing measurement, monitoring and improvement of those models.
KubeFlow
- ML Toolkit for Kubernetes
- Data modelling with Jupyter Notebooks
- Tuning and training with TensorFlow
- Model serving and monitoring
KubeFlow - Pipelines and Components
A pipeline is a description of a machine learning (ML) workflow, including all of the components in the workflow and how the components relate to each other in the form of a graph.
A pipeline component is a self-contained set of user code, packaged as a container, that performs one step in the pipeline. For example, a component can be responsible for data preprocessing, data transformation, model training, etc.
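The idea above — a workflow expressed as a graph of self-contained steps, executed in dependency order — can be sketched in plain Python. This is a conceptual illustration only, not the Kubeflow Pipelines SDK; the step names (`ingest`, `preprocess`, `train`) are hypothetical stand-ins for containerized components.

```python
# Conceptual sketch of a pipeline as a graph of components.
# Each function stands in for one self-contained, containerized step.
from graphlib import TopologicalSorter

def ingest():          return [3.0, 1.0, 2.0]       # e.g. read raw data
def preprocess(data):  return sorted(data)          # e.g. clean/transform
def train(data):       return {"mean": sum(data) / len(data)}  # toy "model"

def run_pipeline():
    # The graph maps each step to the steps it depends on,
    # mirroring how a pipeline description relates components to each other.
    graph = {"ingest": set(), "preprocess": {"ingest"}, "train": {"preprocess"}}
    steps = {"ingest": ingest, "preprocess": preprocess, "train": train}
    results = {}
    # Execute steps in topological (dependency) order.
    for name in TopologicalSorter(graph).static_order():
        if not graph[name]:
            results[name] = steps[name]()
        else:
            # Feed the upstream component's output into this one.
            upstream = next(iter(graph[name]))
            results[name] = steps[name](results[upstream])
    return results["train"]

run_pipeline()  # -> {"mean": 2.0}
```

In Kubeflow Pipelines proper, each step runs as its own container and the SDK infers the graph from how component outputs are passed as inputs, but the dependency-ordered execution shown here is the same underlying model.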
KubeFlow with AI Platform
- Ingest Data
  - Cloud Storage
  - Transfer Service
- Prepare and Preprocess Data
  - Cloud Dataflow
  - Cloud Dataproc
  - BigQuery
  - Cloud Dataprep
- Develop & Train Models
  - Deep Learning VM
  - AI Platform Notebooks
  - AI Platform Training
  - KubeFlow
- Test & Deploy Models
  - TensorFlow Extended
  - AI Platform Prediction
  - KubeFlow
- Discovery
  - AI Hub: hosted repository of plug-and-play AI components