Operationalizing Artificial Intelligence

Putting AI into production is not called deployment; it's called operationalizing.

The Matrix is all around.

We no longer have the familiar build/test/deploy/manage order of things.

There are now two phases: the training phase and the inference phase.

Training Phase:

  • select multiple algorithms
  • select appropriate data
  • clean the data
  • label the data
  • apply the data to each algorithm
  • tune hyperparameters
  • validate
  • test for over-/underfitting
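The training-phase steps above can be sketched end to end with a toy, framework-free example. The "algorithm" here is a one-feature threshold classifier, and all names and data are illustrative, not part of any real pipeline:

```python
# Toy sketch of the training phase: clean and label data, apply it to
# an algorithm, tune a hyperparameter, validate, and check for
# over-/underfitting. Purely illustrative.

def clean(records):
    # clean data: drop records with missing feature values
    return [(x, y) for x, y in records if x is not None]

def make_model(threshold):
    # the "algorithm": predict class 1 when the feature >= threshold
    return lambda x: 1 if x >= threshold else 0

def accuracy(model, rows):
    return sum(model(x) == y for x, y in rows) / len(rows)

# labeled data: (feature, label) pairs, one with a missing value
records = [(0.1, 0), (0.2, 0), (None, 1), (0.6, 1),
           (0.8, 1), (0.4, 0), (0.7, 1), (0.3, 0)]
rows = clean(records)
train, valid = rows[:5], rows[5:]          # hold out rows for validation

# hyperparameter tuning: search candidate thresholds,
# validating each on the held-out rows
best = max((t / 10 for t in range(1, 10)),
           key=lambda t: accuracy(make_model(t), valid))
model = make_model(best)

# over-/underfitting check: compare training vs validation accuracy
gap = accuracy(model, train) - accuracy(model, valid)
```

A real pipeline would swap the threshold search for a proper training algorithm, but the shape of the steps stays the same.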

Inference Phase:

  • apply the model to the use case
  • evaluate how well it fits real-world data
  • build a new training data set
  • adjust parameter configurations
  • iteratively improve the model
  • loop back to the training phase
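The inference loop above can be sketched the same way: apply the model, compare its predictions against ground truth as it arrives, and collect a new training set that loops back to the training phase. All names here are illustrative:

```python
# Toy sketch of the inference phase: apply a model to a stream of
# real-world inputs, evaluate it where feedback exists, and gather
# new training data for the next training cycle.

def inference_cycle(model, stream, ground_truth):
    mistakes = 0
    new_training_set = []
    for x in stream:
        pred = model(x)                          # apply model to the use case
        if x in ground_truth:                    # real-world feedback arrives
            label = ground_truth[x]
            mistakes += pred != label            # evaluate real-world fit
            new_training_set.append((x, label))  # new training data set
    return new_training_set, mistakes

model = lambda x: 1 if x >= 0.5 else 0           # stand-in for a trained model
stream = [0.2, 0.6, 0.45, 0.9]
truth = {0.2: 0, 0.45: 1, 0.9: 1}
new_data, errors = inference_cycle(model, stream, truth)
# new_data feeds the next training run; errors motivate parameter changes
```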

Quote from a Forbes article by Kathleen Walch: But the real world is where things get messy and complicated. First of all, there’s no such thing as a single platform for machine learning. The universal machine learning / AI platform doesn’t exist because there are so many diverse places in which we can use an ML model to make inferences, do classification, predict values, and all the other problems we are looking for ML systems to solve. We could be using an ML model in an Internet of Things device deployed at the edge, or in a mobile application that can operate disconnected from the internet, or in a cloud-based always-on setting, or in a large enterprise server system with private, highly regulated, or classified content, or in desktop applications, or in autonomous vehicles, or in distributed applications, or… you get the picture. Any place where the power of cognitive technology is needed is a place where these AI systems can be used.

This is both empowering and challenging. The data scientist developing the ML model might not have any expectations for how and where the ML model will be used, and so instead of “deploying” this model to a specific system, it needs to be “operationalized” in as many different systems, interfaces, and deployments as necessary. The very same model could be deployed in an IoT driver update as well as a cloud service API call. As far as the data scientists and data engineers are concerned, this is not a problem at all. The specifics of deployment are specific to the platforms on which the ML model will be used. But the requirements for the real-world usage and operation, hence the word “operationalization” of the model are the same regardless of the specific application or deployment.
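That "same model, many deployments" idea can be sketched as one serialized model artifact wrapped by different operational adapters. The JSON artifact and wrapper names below are stand-ins; real systems would use a format like ONNX or SavedModel:

```python
import json

# One model artifact, operationalized behind multiple interfaces.
# The artifact is just serialized parameters here, for illustration.
artifact = json.dumps({"threshold": 0.5})

def load_model(blob):
    params = json.loads(blob)
    return lambda x: 1 if x >= params["threshold"] else 0

# Cloud-API-style wrapper: request dict in, response dict out
def handle_request(request, blob=artifact):
    model = load_model(blob)
    return {"prediction": model(request["value"])}

# Edge/IoT-style wrapper: batch of sensor readings in, labels out
def score_batch(readings, blob=artifact):
    model = load_model(blob)
    return [model(x) for x in readings]
```

The point of the sketch: the model itself never changes; only the operational wrapper around it does, which is exactly what distinguishes operationalization from a single deployment.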