Let’s say you need to save 10,000 books into a books service, but the service’s API only allows saving 100 books per request. My first idea would be to split everything into chunks of 100 books and fire all the requests in parallel.

import chunk from 'lodash/chunk'

export const saveBooks = (books) => {
  // Split the books into chunks of 100, then send every chunk at once.
  const chunks = chunk(books, 100)

  return Promise.all(chunks.map(booksChunk => saveBooksChunk(booksChunk)))
}
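If you don’t want to pull in lodash just for `chunk`, a minimal helper is easy to write yourself. This sketch assumes the same `chunk(array, size)` signature used above:

```javascript
// Minimal chunk helper: splits an array into subarrays
// of at most `size` elements, preserving order.
export const chunk = (array, size) => {
  const result = []
  for (let i = 0; i < array.length; i += size) {
    result.push(array.slice(i, i + size))
  }
  return result
}
```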

The downside is that this creates a spike in load on the books service. That usually means the service will either struggle to process the spike or be over-provisioned to handle it, and over-provisioning means paying for computing power that sits idle most of the time. So we want to flatten the spike a bit. To do that, we’ll use what’s called limited parallel execution: we still send batches of books in parallel, but no more than 10 requests at a time.

import chunk from 'lodash/chunk'

export const saveBooks = async (books) => {
  // Chunks of 100 books, grouped into batches of 10 chunks each.
  const chunks = chunk(chunk(books, 100), 10)

  const results = []

  for (const chunkOfChunks of chunks) {
    // Send one batch of up to 10 requests in parallel,
    // then wait for the whole batch before starting the next one.
    const chunkResults = await Promise.all(chunkOfChunks.map(saveBooksChunk))

    results.push(...chunkResults)
  }

  return results
}

#javascript #programming #developer

Effective Limited Parallel Execution in JavaScript