Airflow to Orchestrate Machine Learning Algorithms

For a data engineer, a big challenge is managing, scheduling, and running workflows to prepare data, generate reports, and run algorithms. This post suggests a quick-to-implement solution for these activities, with a simple example.

Managing and scheduling a data analysis workflow can be done in many ways, but the most common approaches don't scale well with the size of the job: if a job fails at one stage of the workflow, the whole workflow has to be restarted from scratch. If you want to create a better data pipeline with an easy-to-learn interface and a lot of useful features, you need a computational orchestrator like Apache Airflow.

Airflow is a computational orchestrator: you can manage any kind of operation, as long as you can describe it as a workflow. This means that you can use Airflow to author workflows as Directed Acyclic Graphs (DAGs).

Airflow is composed of two elements: a web server and a scheduler.

The web server runs the user interface, which lets you visualize the pipelines running in production, monitor their progress, and troubleshoot issues when needed. The scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command-line utilities make performing complex surgeries on DAGs a snap.

I think the most important feature of Airflow is that workflows are defined as code. In this way, they become more maintainable, versionable, testable, and collaborative. With Airflow, you can schedule a pipeline as complex as you want. The DAG is created with code and not with GUI tools.

But what is a DAG? DAG (Directed Acyclic Graph) is a collection of all the tasks you want to run, organized in a way that reflects their relationships and dependencies. Let me give an example: a simple DAG could consist of three tasks: A, B, and C. It could say that A has to run successfully before B can run, but C can run anytime. It could say that task A times out after 5 minutes, and B can be restarted up to 5 times in case it fails. It might also say that the workflow will run every night at 10 pm, but shouldn't start until a certain date. In this way, a DAG describes how you want to carry out your workflow; but notice that we haven’t said anything about what we actually want to do! A, B, and C could be anything. Maybe A prepares data for B to analyze while C sends an email.
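To make that concrete, here is a minimal sketch of such a DAG. It assumes Airflow 2.x; the DAG id, schedule, and task bodies are placeholders chosen for illustration, not part of the original example.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def prepare_data():
    print("A: preparing data")


def analyze_data():
    print("B: analyzing data")


def send_report_email():
    print("C: sending the report email")


with DAG(
    dag_id="example_abc",                   # illustrative name
    start_date=datetime(2023, 1, 1),        # don't start before this date
    schedule_interval="0 22 * * *",         # run every night at 10 pm
    catchup=False,
) as dag:
    task_a = PythonOperator(
        task_id="A",
        python_callable=prepare_data,
        execution_timeout=timedelta(minutes=5),  # A times out after 5 minutes
    )
    task_b = PythonOperator(
        task_id="B",
        python_callable=analyze_data,
        retries=5,                               # B can be restarted up to 5 times on failure
        retry_delay=timedelta(minutes=1),
    )
    task_c = PythonOperator(task_id="C", python_callable=send_report_email)

    task_a >> task_b  # A must succeed before B; C has no upstream dependency
```

The `>>` operator encodes the "A before B" dependency, while C is left without upstream tasks, so it can run at any point within the scheduled run.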

DAGs are written in Python, so if B is a clustering algorithm like DBSCAN that clusters the data prepared at stage A, you can use any machine learning library (scikit-learn, for example) that helps with this task. Airflow implements the Python operator (and much more), which runs a defined Python function, and I think this is very useful to easily implement a machine learning workflow, as we can see in this example: the scikit-learn demo of K-Means clustering on the handwritten digits data.
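As a sketch of that "A prepares, B clusters" idea, the two stages could be wired together as below. This again assumes Airflow 2.x and scikit-learn; the toy dataset, DBSCAN parameters, and task ids are made up for the example.

```python
from datetime import datetime

import numpy as np
from airflow import DAG
from airflow.operators.python import PythonOperator
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_blobs
from sklearn.preprocessing import StandardScaler


def prepare_data(**context):
    # Stage A: build and scale a toy dataset, then hand it to the next task via XCom.
    X, _ = make_blobs(n_samples=500, centers=3, random_state=0)
    X = StandardScaler().fit_transform(X)
    context["ti"].xcom_push(key="features", value=X.tolist())


def run_dbscan(**context):
    # Stage B: pull the prepared data and cluster it with DBSCAN.
    X = np.array(context["ti"].xcom_pull(task_ids="prepare_data", key="features"))
    labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(X)
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    print(f"DBSCAN found {n_clusters} clusters")


with DAG(
    dag_id="dbscan_pipeline",          # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,            # trigger manually
    catchup=False,
) as dag:
    prepare = PythonOperator(task_id="prepare_data", python_callable=prepare_data)
    cluster = PythonOperator(task_id="run_dbscan", python_callable=run_dbscan)

    prepare >> cluster
```

Here the prepared features are passed from A to B through XCom, Airflow's mechanism for sharing small pieces of data between tasks; for large datasets you would write to shared storage instead.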

You can find all the code here. The repo contains everything we need to run the example.

In the following code, we can see a DAG that runs the scikit-learn K-Means example.
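This is a minimal sketch rather than the repo's exact listing: it assumes Airflow 2.x and scikit-learn, and the DAG id, schedule, and printed metrics are illustrative choices. The single Python task runs the K-Means-on-digits demo end to end.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from sklearn import metrics
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.preprocessing import scale


def kmeans_digits():
    # Roughly follows the scikit-learn "K-Means on handwritten digits" demo:
    # load the digits dataset, scale it, cluster into 10 groups, and report a few scores.
    digits = load_digits()
    data = scale(digits.data)
    n_digits = len(set(digits.target))

    kmeans = KMeans(init="k-means++", n_clusters=n_digits, n_init=10, random_state=42)
    labels = kmeans.fit_predict(data)

    print("inertia:", kmeans.inertia_)
    print("homogeneity:", metrics.homogeneity_score(digits.target, labels))
    print("silhouette:", metrics.silhouette_score(data, labels, metric="euclidean"))


with DAG(
    dag_id="kmeans_digits",            # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,            # run on demand from the UI or CLI
    catchup=False,
) as dag:
    PythonOperator(task_id="run_kmeans", python_callable=kmeans_digits)
```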

As we can see, the setup is very simple, and the Airflow interface is very clear and easy to learn.

I think that Airflow is a very powerful and easy-to-use tool that enables a really fast path from research to production for an ML algorithm. With Docker and Docker Compose, the environment setup is very easy and repeatable, and it can be shared with all the data scientists on your team. In this way, the data scientists can run, on their own laptops, the model in the same way it will be run in the production environment. Really cool, don't you think?