Week 1 Graded Assignment Questions

Question about the Quiz: Intro to MLEP

For the question “What are the unique challenges to overcome in a production-grade ML system?”, some of the responses do not seem right. I have the following questions; please help me understand what I am missing.

  1. Why is “Training the model on real-world data” not a challenge? Many popular ML platforms in the industry support incremental online training, i.e. continuously training the model on newly generated real-world data while the ML application is in use (see the sketch after this list). Collecting real-world data and training on it is a big problem that these ML platforms solve.
  2. Why is “Deploying the model to serve requests” not a challenge? Deploying models in production is not an MLE’s forte (their forte is building models). Dealing with low latency, high throughput, continuous availability across multiple regions, and memory management (loading/unloading models based on available RAM) seems like a big enough set of challenges for MLEs.
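
For context, here is a minimal sketch of what incremental online training can look like, assuming scikit-learn’s `SGDClassifier`; the `on_new_batch` hook and the synthetic batch are hypothetical stand-ins for whatever data the live application produces.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Model that supports incremental updates via partial_fit.
model = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])  # the full label set must be known for partial_fit

def on_new_batch(X_batch, y_batch):
    # Called whenever the running application yields a new batch of labelled,
    # real-world data; the model is updated in place, not retrained from scratch.
    model.partial_fit(X_batch, y_batch, classes=classes)

# Hypothetical batch of freshly collected data.
X = np.random.rand(32, 4)
y = np.random.randint(0, 2, size=32)
on_new_batch(X, y)
print(model.predict(X[:5]))
```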

I’m not a mentor for that course, but I can provide a couple of opinions.

  1. Many ML systems are trained on “real-world data”, meaning observations of things that happen in real life (as opposed to simulations). So this is not unique to production ML. Note that the question doesn’t refer to real-time updates, just real-life data.

  2. All ML systems that serve any purpose have to be deployed in some way so that their predictions are useful. This is not unique to production ML either.

Thanks for your response.

  1. I see what you are saying. It does make sense.
  2. My mental model here is that “deployment” means making the model available for inference via a service. In non-production use cases you might not deploy the model at all and just test it in notebooks. ML platforms actually make deployment at scale very easy and take that effort off the Machine Learning Engineer’s plate, which is why it is an integral part of an ML platform. Most ML platforms, such as Michelangelo and Feast, call out their ability to deploy models quickly on the right hardware. A minimal sketch of this kind of serving setup follows below.
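
To make that mental model concrete, here is a minimal sketch of a model deployed for inference behind an HTTP service, using Flask; the file name `model.joblib`, the `/predict` route, and the request schema are hypothetical, and a real production deployment would add the latency, throughput, and availability concerns mentioned above.

```python
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # trained offline, loaded once at startup

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"features": [[0.1, 0.2, 0.3, 0.4]]}.
    features = request.get_json()["features"]
    predictions = model.predict(features).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```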