“The applications and hardware that OpenAI runs with Kubernetes are quite different from what one may encounter at a typical company.”

Microsoft-backed OpenAI has delivered back-to-back blockbusters with GPT-3, CLIP and DALL·E in a span of six months. While GPT-3 allowed OpenAI to venture into the commercial API space, CLIP and DALL·E rang in a new era of fusion models. However, these models are large. GPT-3 devours all the data on the internet for training, and training it costs a few million dollars. “Scaling OpenAI’s infrastructure is unlike anything a typical startup does. So, even though they use familiar services like Kubernetes, the practices are unique to OpenAI. Today, many software service providers deploy Kubernetes for ease of operation, but OpenAI claims to do it differently,” said OpenAI researchers.

Kubernetes Overview

Conventionally, applications with different functionalities are packed into a single deployable artifact. Monoliths are still an acceptable way to build applications even today, but they have their drawbacks. For example, deployments are time-consuming since everything has to be rolled out together. And if different parts of the monolith are managed by different teams, prepping the rollout can run into additional complexity. The same goes for scaling: teams have to throw resources at the whole application, even if the bottleneck sits in a single component. To address this, developers came up with microservices.

Each piece of functionality is split into a smaller, individual artifact. If there’s an update, only the affected service has to be replaced. The microservice model also has scaling benefits: individual services can be scaled to match their traffic, so it’s easier to avoid bottlenecks without over-provisioning. So far, so good. But running each service on its own machine would require a lot of resources and a whole fleet of machines. This is where containers come in handy. With containers, teams can pack their services neatly: the application, its dependencies, and any necessary configuration are delivered together, so the service runs the same way no matter where it is deployed. And Kubernetes is all about managing these containers.
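
To make the idea concrete, here is a minimal sketch, assuming the official kubernetes Python client and access to a cluster, of how one containerized service might be declared as a Kubernetes Deployment and then scaled on its own. The service name, container image and namespace are placeholders for illustration, not anything OpenAI actually runs.

```python
# A minimal sketch: declaring and scaling a single containerized service
# as a Kubernetes Deployment via the official Python client.
# The service name, image and namespace below are made up for illustration.
from kubernetes import client, config

config.load_kube_config()   # use the local kubeconfig (e.g. a dev cluster)
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="billing"),
    spec=client.V1DeploymentSpec(
        replicas=2,          # start with two copies of this one service
        selector=client.V1LabelSelector(match_labels={"app": "billing"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "billing"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="billing",
                    # the image bundles the code, its dependencies and config
                    image="registry.example.com/billing:1.0",
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)

# Scale only this service when it becomes the bottleneck; the rest of the
# system is untouched, unlike scaling a whole monolith.
apps.patch_namespaced_deployment_scale(
    name="billing",
    namespace="default",
    body={"spec": {"replicas": 5}},
)
```

In practice the same Deployment is usually written as a YAML manifest and applied with kubectl; the point of the sketch is that the scaling knob applies per service, not to the application as a whole.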

