Kubernetes build and deploy pipelines in the world of AI/ML

Our data scientists love to hand giant embeddings and classifiers over to our applications. These files must be loaded at run time for our applications to actually classify data, and baking them into giant Docker images is the wrong way to go. So what can we do instead?
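One common alternative is to fetch model files at startup and cache them locally, rather than shipping them inside the image. A minimal sketch of that idea, assuming a hypothetical `ensure_model` helper and an abstract `fetch` callable (in practice this might be an S3 or GCS download):

```python
from pathlib import Path
from typing import Callable

def ensure_model(cache_path: str, fetch: Callable[[Path], None]) -> Path:
    """Return a local copy of a model file, downloading only on a cache miss.

    `fetch` is any callable that writes the model bytes to the given path,
    e.g. an object-storage download. Kept abstract here, since the actual
    storage backend is an assumption.
    """
    path = Path(cache_path)
    if not path.exists():
        # Cache miss: pull the large file once, then reuse it on later starts.
        path.parent.mkdir(parents=True, exist_ok=True)
        fetch(path)
    return path
```

If the cache directory is a volume that survives container restarts, repeated deploys on the same node skip the download entirely, which keeps images small and startups fast.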

We’re able to build and deploy multiple times a day to our Kubernetes infrastructure while working around these challenges, and we have a few tricks up our sleeve that help speed up our deployment times.

Required audience experience

Basic understanding of CI/CD processes and Docker

Objectives of the talk

  • How to automate a build and deploy pipeline that creates Docker images with large file requirements.
  • How we automatically run tests and deploy to a Staging environment for review, with a button-push deployment to Production.
  • Tricks to speed up deployment times and cache files where possible on Kubernetes nodes.
  • The value of tracking code releases centrally.

Date: May 11, 2021 Time: 4:15 pm - 5:15 pm Speaker: Erin Willingham, Spectrum Labs