In the early 2000s, most business-critical software was hosted in privately run data centers. But with time, enterprises overcame their skepticism and moved critical applications to the cloud.
DevOps fueled this shift to the cloud, as it gave decision-makers a sense of control over business-critical applications hosted outside their own data centers.
Today, enterprises are in a similar phase of trying out and accepting machine learning (ML) in their production environments, and one of the accelerating factors behind this change is MLOps.
Similar to cloud-native startups, many startups today are ML native and offer differentiated products to their customers. But the vast majority of large and midsize enterprises are either just now trying out ML applications or struggling to bring functioning models to production.
Here are some key challenges that MLOps can help with:
It’s hard to get cross-team ML collaboration to work
An ML model may be as simple as one that predicts churn, or as complex as one that determines Uber or Lyft pricing between San Jose and San Francisco. Either way, creating a model and enabling teams to benefit from it is an incredibly complex endeavor. Models require large amounts of labeled historical data for training, and once they are live, multiple teams must coordinate to continuously monitor them for performance degradation.
The MLOps space is in its early days, but it has massive potential because it allows organizations to bring AI to production environments in a fraction of the time it takes today.
There are three core roles involved in ML modeling, each with different motivations and incentives:
Data engineers: These engineers excel at gleaning data from multiple sources, cleaning it and storing it in the right formats so that analysis can be performed. Data engineers work with tools like ETL/ELT pipelines, data warehouses and data lakes, and are well versed in handling both static and streaming data sets.
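As a rough illustration, the extract-transform-load flow such an engineer builds can be sketched in a few lines of Python. The CSV source, the cleaning rule and the `users` table below are all hypothetical stand-ins, with an in-memory SQLite database playing the role of the warehouse:

```python
import csv
import io
import sqlite3

# Hypothetical raw export from a source system (the "extract" input).
RAW_CSV = """user_id,signup_date,plan
1,2021-01-05,pro
2,,free
3,2021-02-11,pro
"""

def extract(raw):
    """Extract: read rows from the raw CSV export."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: drop incomplete rows and normalize types."""
    return [
        (int(r["user_id"]), r["signup_date"], r["plan"])
        for r in rows
        if r["signup_date"]
    ]

def load(rows, conn):
    """Load: write cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users "
        "(user_id INTEGER, signup_date TEXT, plan TEXT)"
    )
    conn.executemany("INSERT INTO users VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # the row with a missing signup_date is dropped
```

A real pipeline would swap the inline CSV for source connectors and the SQLite connection for a warehouse, but the extract/transform/load shape stays the same.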
Data scientists: These are the experts who can run complex regressions in their sleep. Using common tools like the Python language, Jupyter Notebooks and TensorFlow, data scientists take the data provided by data engineers and analyze it to build accurate models. Data scientists love trying different algorithms and comparing the resulting models for accuracy, but after that, someone needs to do the work of bringing the models to production.
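To make that comparison step concrete, here is a deliberately tiny sketch: two hypothetical churn "models" (a majority-class baseline and a threshold rule) scored for accuracy on made-up data. Real work would use libraries like scikit-learn or TensorFlow on far larger data sets:

```python
# Toy churn data: (monthly_logins, churned) pairs -- illustrative only.
data = [(1, 1), (2, 1), (3, 1), (8, 0), (9, 0), (10, 0), (12, 0), (15, 0)]

def accuracy(model, rows):
    """Fraction of rows where the model's prediction matches the label."""
    return sum(model(x) == y for x, y in rows) / len(rows)

def baseline(logins):
    # Majority-class baseline: most users in this toy set did not churn.
    return 0

def threshold_model(logins):
    # Simple learned rule: low-activity users churn.
    return 1 if logins < 5 else 0

scores = {m.__name__: accuracy(m, data) for m in (baseline, threshold_model)}
print(scores)
```

The scoring loop is the part data scientists iterate on endlessly; the part MLOps worries about is everything that happens after the winning model is picked.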
AI engineers/DevOps engineers: These are specialists who understand infrastructure, can take models to production and, if something goes wrong, can quickly detect the issue and kick-start the resolution process.
MLOps enables these three critical personas to continuously collaborate to deliver successful AI implementations.
The proliferation of ML tools
In the new developer-led, bottom-up world, teams can choose from a plethora of tools to solve their problems.