Abstract

At this stage, we have the business problem, data from the problem domain, enough experiments, experiment metadata, and the trained model. Yet, as a developer or MLOps engineer, we are still unclear about how to manage the various environments needed to support model deployment. This post is a starter for understanding the execution environment and compute management options available with the Azure Machine Learning service.

This post is extracted from the Kaggle notebook hosted here. Use the link to set up and execute the experiment.

Introduction

The runtime context for each AML experiment run consists of two elements: the Environment for the script, and the Compute Target on which the environment is deployed and the script is run.
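
As a minimal sketch of how these two elements come together, the snippet below builds a ScriptRunConfig; the workspace config, the script name train.py, and the experiment/environment names are placeholder assumptions for illustration.

from azureml.core import Workspace, Experiment, Environment, ScriptRunConfig

ws = Workspace.from_config()  # assumes a workspace config file is available

# Element 1: the Environment that defines the runtime for the script
env = Environment("example-environment")  # placeholder; a full definition follows later in this post

# Element 2: the Compute Target on which the environment is deployed and the script runs
run_config = ScriptRunConfig(
    source_directory=".",      # folder holding the experiment script
    script="train.py",         # placeholder script name
    environment=env,
    compute_target="local"     # "local" runs the script on the current machine
)

run = Experiment(workspace=ws, name="example-experiment").submit(config=run_config)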

The code runs in the _virtual environment_ that defines the runtime and the packages that are installed in the environment using _Conda_ or pip.

Environments are usually created as Docker containers, which are portable and can be hosted on a target compute, e.g., a dev computer, virtual machines, or container services in the various public clouds.

Environment

The Azure Machine Learning service manages environment creation, registration, and package installation through the _Environment_ class. This makes it possible to define consistent, reusable runtime contexts for your experiments.

Creating the Environment

In the experiments we have run so far, we used the default Conda environment on 'local' compute. For production-like scenarios, where better control over the execution environment is needed, the AML service provides an abstraction to define a custom environment specific to the experiment's needs. This environment definition can be applied repeatedly to any execution environment.

The code below defines such an environment by instantiating a CondaDependencies object with the conda_packages and pip_packages the experiment requires and attaching it to a new Environment.

_Additional Info: There are many other ways to create and manage packages in Azure ML; see this link. A sketch of two alternatives appears after the code block below._

from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

# Create a named environment for the iris training experiment
iris_env = Environment("iris_trn_environment")
# Let Azure ML manage the dependencies rather than using the local Python setup
iris_env.python.user_managed_dependencies = False
# iris_env.docker.enabled = True  # Uncomment if Docker needs to be enabled
iris_env.docker.enabled = False

# Declare the Conda and pip packages the experiment needs to run
iris_deps = CondaDependencies.create(
    conda_packages=["scikit-learn", "pandas", "numpy"],
    pip_packages=["azureml-defaults", "azureml-dataprep[pandas]"]
)
iris_env.python.conda_dependencies = iris_deps
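
As a supplement to the Additional Info note above, here is a sketch of two alternative ways to obtain an environment; the file name environment.yml and the curated environment name are assumptions for illustration.

from azureml.core import Workspace, Environment

ws = Workspace.from_config()  # assumes a workspace config file is available

# Alternative 1: build the environment from a Conda specification file (placeholder file name)
file_env = Environment.from_conda_specification(
    name="iris_trn_environment_from_file",
    file_path="environment.yml"
)

# Alternative 2: reuse a curated environment provided by Azure ML
# (the available curated names depend on the SDK/workspace version)
curated_env = Environment.get(workspace=ws, name="AzureML-Minimal")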

Register the Environment

Once the environment definition is created, we need to register it in the workspace so that it can be reused later.
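
As a brief sketch, assuming ws is the workspace and iris_env is the environment defined above, registration and later retrieval could look like this:

from azureml.core import Workspace, Environment

ws = Workspace.from_config()  # assumes a workspace config file is available

# Register the environment definition in the workspace for later reuse
iris_env.register(workspace=ws)

# Later, or from another script, retrieve the registered environment by name
registered_env = Environment.get(workspace=ws, name="iris_trn_environment")

# List every environment registered in the workspace
for env_name in Environment.list(workspace=ws):
    print(env_name)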

