Maximize & Scale Azure Machine Learning Models with Intel® AI Frameworks

Intel and Microsoft Bring AI Optimizations to Azure MaaS with Intel® Extension for Scikit-Learn*



The Takeaway

Intel and Microsoft have brought AI optimizations to the Azure Machine Learning platform, beginning with Intel® Extension for Scikit-Learn*. This empowers businesses that make use of MLOps to build and scale their own ML models with ease.

Managing Machine Learning: ML as a Service (MaaS), Azure & MLOps

Machine learning operations (MLOps) is an emerging field that enables developers to manage the lifecycle of their ML models and applications in production environments. Providers of ML as a Service (MaaS) platforms such as Azure Machine Learning have become indispensable to businesses for two key reasons:

  1. They give data scientists and engineers MLOps functionality, including model versioning, deployment management, and continuous integration, which streamlines the development process; and
  2. Their intuitive GUIs and drag-and-drop features let even beginners quickly learn to use Azure ML for their projects.

But here’s the rub:

Although MaaS platforms make it faster and easier to develop, train, deploy, and manage ML applications, they come with challenges around production efficiency and cost containment, both of which are key barriers to business ROI and adoption at scale.

Additionally, these open source frameworks can be compute- and data-intensive to run. This leads to issues of resource allocation, increased costs, decreased productivity, and fragmented ML teams and services.

All of this typically leads data scientists to do one of two things:

  1. Move to more expensive cloud instances whose hardware can better handle the existing workload; or
  2. Reshape the workload to fit a hardware cost constraint, sometimes learning a new framework from scratch and introducing it into their code to run the workload better.

Each leads to frustration, numerous obstacles, and increased developer costs that delay results and insights.

The Azure-Intel Solution

Azure Machine Learning provides an end-to-end MLOps platform that enables data scientists to build and scale their own ML models with ease. The platform offers a wide range of features such as automated machine learning, model development and deployment management capabilities, and secure datasets. These features remove problems, such as disconnected ML processes, while taking advantage of the different hardware available on Azure Machine Learning.

Intel and Microsoft are excited to announce that they have collaborated to bring Intel AI optimizations to the Azure Machine Learning platform, beginning with Intel® Extension for Scikit-learn*.

What is Intel® Extension for Scikit-learn*?

It is a Python* module for scikit-learn, the popular analytics and machine learning library. The extension adds simple drop-in acceleration for scikit-learn functions and algorithms, allowing users to scale applications on Intel® architecture, with performance gains of over 100x on some workloads and, in some cases, accuracy improvements as well.

Leveraging the power of Intel® processors and the Intel® oneAPI Data Analytics Library (oneDAL), the Intel Extension for Scikit-learn accelerates complex ML tasks, providing faster and more efficient processing. This reduction in execution time leads to quicker model training, testing, and deployment, ultimately making the ML workflow more efficient and effective. Additionally, it supports a wide range of ML algorithms and provides flexibility to data scientists to choose the most suitable algorithm for their use case. By just adding a few lines of code, data scientists can speed up their scikit-learn workloads and take more advantage of their hardware to reduce time and resource costs.
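As a sketch of what this drop-in pattern looks like in practice (the synthetic dataset and the fallback to stock scikit-learn are our additions, so the snippet runs even where the extension is not installed):

```python
# Sketch: drop-in acceleration of a scikit-learn workload.
# If Intel Extension for Scikit-learn is installed, patch_sklearn()
# reroutes supported estimators to oneDAL; otherwise the fallback
# leaves stock scikit-learn in place so the script still runs.
try:
    from sklearnex import patch_sklearn
    patch_sklearn()  # must run BEFORE importing the estimators
except ImportError:
    pass  # stock scikit-learn is used unchanged

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real workload.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
score = clf.score(X_test, y_test)
print(f"test accuracy: {score:.3f}")
```

The rest of the script is unchanged; the patch call is the only difference from a stock scikit-learn workflow.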

Benefits of the Azure-Intel Solution

Integrating Intel Extension for Scikit-learn with Azure Machine Learning allows developers to experience the power of scikit-learn optimizations while taking full advantage of the services and capabilities available in the Azure ML platform. Azure ML developers can continue to develop and deploy their models and have the additional opportunity to speed them up by integrating Intel Extension for Scikit-learn optimizations when possible within the Microsoft curated scikit-learn container environment.

How to Get Started

It is relatively easy to use Intel Extension for Scikit-learn in Azure Machine Learning. The first step is to create a new Azure ML workspace instance. Data scientists can then access the extension through the scikit-learn container(s) hosted among the Microsoft curated environments on the platform.

When developing code with this container, data scientists can import the Intel Extension for Scikit-learn library and use it alongside scikit-learn to accelerate their ML workloads. To enable the Intel extension's optimizations, simply add the following two lines of code to the script you pass into the container:

# Turn on scikit-learn optimizations with these 2 simple lines:
from sklearnex import patch_sklearn
patch_sklearn()

For example, in a command script:

%%writefile {train_src_dir}/main.py
import os
import argparse
import pandas as pd
import mlflow
import mlflow.sklearn

# Import the Intel Extension for Scikit-learn optimizations
from sklearnex import patch_sklearn

from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split


def main():
    patch_sklearn()  # enable sklearnex optimizations wherever possible
    """The rest of your scikit-learn script"""

Once Intel Extension for Scikit-learn is enabled with the above two lines of code, your scikit-learn code will be accelerated by the extension wherever possible – no additional steps required! For more information on the different ways to turn on/off the optimizations and more, please review the Intel® Extension for Scikit-Learn documentation.
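For reference, the documented controls include selectively patching individual estimators and unpatching to return to stock scikit-learn. A small sketch (the function names come from the sklearnex documentation; the try/except guard is our addition so the snippet degrades gracefully without the extension):

```python
# Sketch of other documented ways to control the optimizations.
# The try/except guard is our addition so the snippet still runs
# where the extension is not installed.
try:
    from sklearnex import patch_sklearn, unpatch_sklearn

    patch_sklearn(["SVC"])  # patch only selected estimators
    unpatch_sklearn()       # return everything to stock scikit-learn
    patch_sklearn()         # patch all supported estimators again
except ImportError:
    pass

# The extension can also be enabled globally, without editing any code,
# by launching the script through the sklearnex module:
#     python -m sklearnex your_training_script.py

# Either way, scikit-learn imports keep working as usual.
from sklearn.svm import SVC
print(callable(SVC))
```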

One way you can try out the Intel Extension for Scikit-learn optimizations on a full Azure Machine Learning workload is by integrating them into your own existing workload or completing the Azure Machine Learning in a day tutorial and adding the two lines described above to the training script in the instructions.

In this case, the optimizations are used in the train_test_split function, but you can also change the scikit-learn algorithm in the script from GradientBoostingClassifier to another classification algorithm supported by the Intel extension, such as RandomForestClassifier.
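Swapping the classifier in the tutorial's training script might look like the sketch below (synthetic data stands in for the tutorial's dataset, and the fallback import is our addition so the snippet runs without the extension installed):

```python
# Sketch: replacing GradientBoostingClassifier with RandomForestClassifier,
# which the Intel extension can accelerate. The fallback keeps the script
# runnable on stock scikit-learn.
try:
    from sklearnex import patch_sklearn
    patch_sklearn()  # train_test_split and RandomForestClassifier both benefit
except ImportError:
    pass

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier  # instead of GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data in place of the tutorial's dataset.
X, y = make_classification(n_samples=2_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = RandomForestClassifier(n_estimators=50, random_state=42).fit(X_train, y_train)
preds = model.predict(X_test)
print(f"predicted {len(preds)} labels")
```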

Keep an eye out for future Intel optimizations coming to Azure Machine Learning!

Conclusion

Accelerate your ML workloads with Intel Extension for Scikit-learn in Azure Machine Learning. Integrating the Intel extension with Azure ML can lead to significant performance improvements. The library is optimized for a wide range of use cases, and its implementation requires minimal effort. With best practices in place, data scientists can maximize the performance of their ML models and leverage Intel’s hardware and software technologies to achieve better results faster.

Learn More