Model Deployment Options in Azure

There are so many options to deploy models in Azure that it can get quite overwhelming. In this blog, we break down all the available options and consider the pros and cons of each tooling option.

Azure Machine Learning

Azure Machine Learning is a native Azure cloud offering for accelerating and managing ML projects. It offers a Studio GUI for drag-and-drop ML workflows and a Python SDK for more complex, code-based ML workflows. It has been designed to allow models to be built easily and deployed quickly, and offers built-in MLOps capabilities such as drift detection and auditing.

Azure ML offers deployment configurations for Azure Container Instances (ACI) and Azure Kubernetes Service (AKS) through both the Studio and the SDK.

Pros:

·        Quick deployments for simple models

·        Native MLOps capabilities

·        Integrates easily with MLflow

·        Very well documented

·        Integrated with App Insights

Cons:

·        Requires learning the AML SDK

·        Hard to create pipelines for complex model pre-processing

·        Must use custom code for models outside the few offered in Studio

·        Expensive

Azure Functions

Azure Functions is a serverless compute service that enables users to run event-triggered code without having to provision or manage infrastructure. Being a trigger-based service, it runs a script or piece of code in response to a variety of events, such as an HTTP request, a queue message or a timer. This makes it an ideal solution for deploying models.

Since they are just hosted functions, development and deployment are as simple as it gets. You can build an inference pipeline in Python and copy/paste it into a box… and it’s done! It is easily accessible, and functions can be written in C#, Java, JavaScript, PowerShell, Python, and more!
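As a sketch of the pattern, it helps to keep the inference logic in a plain Python function that the HTTP-triggered entry point calls, so it can be unit-tested outside Azure. Everything below is illustrative: the trivial threshold rule stands in for a real model, and the payload shape is an assumption:

```python
import json

def predict(features):
    # Stand-in for a real model; a trivial sum-threshold rule.
    return [1 if sum(row) > 0 else 0 for row in features]

def handle_request(body: str) -> str:
    """Core inference logic, separated from the Azure Functions trigger
    so it can be exercised locally without any Azure dependencies."""
    payload = json.loads(body)
    predictions = predict(payload["data"])
    return json.dumps({"predictions": predictions})
```

In the actual Function app, the HTTP-triggered handler would simply pass the request body to `handle_request` and wrap the returned string in an HTTP response.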

Azure Functions claims to offer flexible pricing, which is true, but the pricing model is generally considered quite difficult to understand, and you may end up paying much more than you expect or face scaling limitations.

Pros:

·        Easy to deploy

·        Easy package management

·        Easy to debug locally

·        Integrates well with Azure Storage and VS Code

Cons:

·        Complicated pricing model

·        Scaling limitations

Azure Kubernetes Service

Kubernetes, sometimes written as K8s, is an open-source container orchestration platform for managing, automating, and scaling containerized applications. Although there are other tools out there, Kubernetes is generally considered the standard for container orchestration because of its greater flexibility and capacity to scale. Azure Kubernetes Service (AKS) is a managed flavour of Kubernetes, configuring all nodes on your behalf, integrating with Azure Active Directory, and offering additional CI/CD and security features beyond K8s alone.

An AKS cluster can be created using the Azure command-line interface (CLI), the Azure portal or Azure PowerShell. You can also use Azure Resource Manager (ARM) templates for automated deployment.
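Via the CLI, provisioning a small cluster looks roughly like this; the resource group name, cluster name, region and node count are all placeholders to adjust for your subscription:

```shell
# Create a resource group to hold the cluster (names/region assumed).
az group create --name myResourceGroup --location uksouth

# Create a two-node AKS cluster.
az aks create --resource-group myResourceGroup --name myAKSCluster \
    --node-count 2 --generate-ssh-keys

# Fetch credentials so kubectl can talk to the cluster, then verify.
az aks get-credentials --resource-group myResourceGroup --name myAKSCluster
kubectl get nodes
```

From there, deploying a model is the usual Kubernetes workflow: build a container image around your inference server, push it to a registry, and apply a Deployment and Service manifest.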

AKS is technically a free Azure service, in that there is no charge for Kubernetes cluster management. You are, however, billed for the underlying compute, storage, networking and other cloud resources, which can all add up.

Pros:

·        Very scalable

·        Load balancing

·        Flexible

·        Efficient resource utilization

·        Security and compliance

Cons:

·        AKS can be overkill for many projects

·        Requires more upfront training

·        Can be laborious to maintain

Azure Managed Online Endpoints

Microsoft have just announced Managed Online Endpoints for Azure Machine Learning. This new capability is focussed on online/real-time scoring of models and comes with managed infrastructure, including automatic patching and node recovery. It gives you the ability to debug in a local Docker environment just by changing one flag at deployment and, being a native Azure product, it is integrated with App Insights.
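With the Azure ML CLI (v2), creating an endpoint and a deployment is a couple of commands; a rough sketch follows, where the endpoint/deployment names and YAML file names are assumptions, and it is the `--local` flag that flips the same deployment into a local Docker container for debugging:

```shell
# Create the endpoint and a deployment from YAML definitions
# (assumes a default resource group and workspace are configured).
az ml online-endpoint create --name my-endpoint -f endpoint.yml
az ml online-deployment create --name blue --endpoint-name my-endpoint -f deployment.yml

# The same deployment, debugged in a local Docker environment:
az ml online-deployment create --name blue --endpoint-name my-endpoint -f deployment.yml --local
az ml online-endpoint invoke --name my-endpoint --request-file sample-request.json --local
```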

There are even more benefits to working with managed endpoints for batch models. For MLflow-registered models, you only need to run one command and a batch endpoint will be automatically set up, with pipelines and scoring included. You can run batch inference through a managed batch endpoint using an Azure ML registered dataset, other datasets in the cloud, or datasets stored locally.
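The batch flow follows the same CLI shape; the sketch below uses assumed names and YAML files, and exact flag names vary between CLI versions, so treat it as illustrative:

```shell
# Create a batch endpoint and a deployment from YAML definitions.
az ml batch-endpoint create --name my-batch-endpoint -f batch-endpoint.yml
az ml batch-deployment create --name default --endpoint-name my-batch-endpoint -f batch-deployment.yml

# Kick off a scoring job against a registered Azure ML dataset.
az ml batch-endpoint invoke --name my-batch-endpoint --input azureml:my-dataset:1
```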

Pros:

·        Easy to debug

·        Easy to maintain

·        Quick to set up

·        Works with App Insights

·        Further benefits for batch scoring

Cons:

·        Not all functionality is available for real-time models

So which tool should I use to deploy my model?

The simplest answer to this is: it depends. As you can see, there are many options with different benefits and limitations. If you need a small inference service for a popular model, such as an SGD classifier, that you want deployed and monitored simply, then Azure Machine Learning Studio may be the perfect tool for the job. If you have a pre-built training pipeline that runs locally and want to get it into the cloud, Azure Functions will do the trick. For large real-time streaming jobs with complicated models and lots of pre-processing, Kubernetes or Azure Managed Online Endpoints is probably the route you want to take.

Author

Tori Tompkins