AWS Batch

Batch computing jobs perform complex analysis on large data sets. Examples include financial risk modeling, graphics processing, simulations, and genome analysis. AWS Batch lets you run thousands of batch computing jobs on AWS without building or managing any infrastructure: simply define your batch job as a Docker container and submit it, and AWS takes care of the rest. For more information, visit https://docs.aws.amazon.com/batch/.
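To illustrate the container-first workflow described above, the sketch below builds the parameters for a hypothetical job definition and job submission as they would be passed to the AWS Batch API (via boto3's `register_job_definition` and `submit_job`). The image name, queue name, and resource values are assumptions, not values from this document, and the actual boto3 calls are shown commented out so the sketch runs without AWS credentials.

```python
# Sketch: parameters for a container-based AWS Batch job definition and a
# job submission against it. All names below (image, queue, job name) are
# hypothetical placeholders.

job_definition = {
    "jobDefinitionName": "risk-model",  # hypothetical job definition name
    "type": "container",
    "containerProperties": {
        # A Docker image, e.g. pushed to Amazon ECR (hypothetical URI)
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/risk-model:latest",
        "resourceRequirements": [
            {"type": "VCPU", "value": "4"},
            {"type": "MEMORY", "value": "8192"},  # MiB
        ],
        # Ref::run_date is substituted from the submission's "parameters"
        "command": ["python", "run_model.py", "--date", "Ref::run_date"],
    },
}

job_submission = {
    "jobName": "risk-model-2024-01-01",
    "jobQueue": "analytics-queue",  # hypothetical job queue
    "jobDefinition": "risk-model",
    "parameters": {"run_date": "2024-01-01"},
}

# With AWS credentials configured, these dictionaries would be passed to the
# Batch API like so:
# import boto3
# batch = boto3.client("batch")
# batch.register_job_definition(**job_definition)
# response = batch.submit_job(**job_submission)
```

Because the job is just a container, the same image you test locally with `docker run` is what Batch schedules at scale.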

AWS Batch enables developers, scientists, and engineers to quickly and efficiently run hundreds of thousands of batch computing jobs on AWS. Batch offers the following features:

  • Batch dynamically provisions the optimal quantity and type of compute resources based on the volume and specific resource requirements of the batch jobs submitted.

  • It eliminates the need to install and manage the batch computing software or server clusters that you would otherwise use to run your jobs.

  • Batch plans, schedules, and executes your batch computing workloads across the full range of AWS compute services and features, such as Amazon EC2 and EC2 Spot Instances.