AWS Batch | Features Flashcards
What use cases is AWS Batch optimized for?
Features
AWS Batch | Compute
AWS Batch is optimized for batch computing and applications that scale through the execution of multiple jobs in parallel. Deep learning, genomics analysis, financial risk models, Monte Carlo simulations, animation rendering, media transcoding, image processing, and engineering simulations are all excellent examples of batch computing applications.
What are the key features of AWS Batch?
AWS Batch manages compute environments and job queues, allowing you to easily run thousands of jobs of any scale using Amazon EC2 and EC2 Spot. You simply define and submit your batch jobs to a queue. In response, AWS Batch chooses where to run the jobs, launching additional AWS capacity if needed. AWS Batch carefully monitors the progress of your jobs. When capacity is no longer needed, AWS Batch will remove it. AWS Batch also provides the ability to submit jobs that are part of a pipeline or workflow, enabling you to express any interdependencies that exist between them as you submit jobs.
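The queue-and-dependency workflow described above can be sketched with the boto3 Batch API, which really does expose `submit_job` with a `dependsOn` field; the queue, job-definition, and job names below are hypothetical, and the payload is built as a plain dict so the sketch runs without AWS credentials.

```python
# Sketch of a two-step pipeline: the second job's dependsOn entry tells
# AWS Batch to start it only after the first job completes successfully.
# All names ("my-queue", "preprocess-def:1", etc.) are hypothetical.

def make_submit_request(name, queue, definition, depends_on=None):
    """Build the keyword arguments for batch.submit_job."""
    req = {"jobName": name, "jobQueue": queue, "jobDefinition": definition}
    if depends_on:
        req["dependsOn"] = [{"jobId": jid} for jid in depends_on]
    return req

preprocess = make_submit_request("preprocess", "my-queue", "preprocess-def:1")
# With real credentials this would be:
#   job_id = boto3.client("batch").submit_job(**preprocess)["jobId"]
job_id = "example-job-id"  # placeholder for the jobId submit_job returns
train = make_submit_request("train", "my-queue", "train-def:1", [job_id])
```

Submitting `train` with this payload expresses the interdependency: AWS Batch holds it in the queue until the `preprocess` job succeeds.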
What types of batch jobs does AWS Batch support?
AWS Batch supports any job that can be executed as a Docker container. Jobs specify their memory requirements and number of vCPUs.
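As a minimal sketch of how a containerized job declares those requirements, here is a `containerProperties` fragment using the `resourceRequirements` list from the Batch API; the image name and command are hypothetical.

```python
# containerProperties fragment for a hypothetical transcoding job.
# resourceRequirements declares vCPUs and memory (in MiB) for the container.
container_properties = {
    "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/transcode:latest",
    "resourceRequirements": [
        {"type": "VCPU", "value": "2"},
        {"type": "MEMORY", "value": "4096"},
    ],
    "command": ["python", "transcode.py"],
}
```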
What is a Compute Resource?
An AWS Batch Compute Resource is an EC2 instance.
What is a Compute Environment?
An AWS Batch Compute Environment is a collection of compute resources on which jobs are executed. AWS Batch supports two types of Compute Environments: Managed Compute Environments, which are provisioned and managed by AWS, and Unmanaged Compute Environments, which are managed by customers. Unmanaged Compute Environments provide a mechanism to leverage specialized resources such as Dedicated Hosts, larger storage configurations, and Amazon EFS.
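A sketch of a Managed Compute Environment request, shaped like the input to the Batch `create_compute_environment` call; the environment name, subnet, security group, and instance role are hypothetical placeholders.

```python
# Managed Compute Environment using EC2 Spot capacity (a sketch; the
# identifiers below are hypothetical and must match real resources).
managed_env = {
    "computeEnvironmentName": "managed-spot-env",
    "type": "MANAGED",  # AWS provisions and scales the instances
    "computeResources": {
        "type": "SPOT",               # use EC2 Spot for cost savings
        "minvCpus": 0,                # scale to zero when idle
        "maxvCpus": 256,
        "instanceTypes": ["optimal"], # let Batch pick instance sizes
        "subnets": ["subnet-0123abcd"],           # hypothetical
        "securityGroupIds": ["sg-0123abcd"],      # hypothetical
        "instanceRole": "ecsInstanceRole",        # hypothetical
    },
}
```

An Unmanaged Compute Environment omits `computeResources` and uses `"type": "UNMANAGED"`; you then register your own instances (for example Dedicated Hosts) into the ECS cluster that Batch creates for it.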
What is a Job Definition?
A Job Definition describes the job to be executed, including parameters, environment variables, compute requirements, and other information that is used to optimize the execution of a job. Job Definitions are defined in advance of submitting a job and can be shared with others.
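A sketch of a Job Definition, shaped like the input to the Batch `register_job_definition` call; the definition name, image, bucket, and default parameter value are hypothetical. Batch substitutes `parameters` into the command wherever the `Ref::` syntax appears.

```python
# Hypothetical job definition for a rendering job. The "Ref::scene"
# placeholder in the command is replaced at submit time by the "scene"
# parameter (defaulting to the value given here).
job_definition = {
    "jobDefinitionName": "render-frame",
    "type": "container",
    "parameters": {"scene": "default.blend"},
    "containerProperties": {
        "image": "my-registry/render:latest",   # hypothetical image
        "resourceRequirements": [
            {"type": "VCPU", "value": "4"},
            {"type": "MEMORY", "value": "8192"},
        ],
        "environment": [{"name": "OUTPUT_BUCKET", "value": "my-render-bucket"}],
        "command": ["render", "Ref::scene"],
    },
}
```

Because the definition is registered ahead of time, individual `submit_job` calls only need to name it (optionally overriding parameters), which is what makes definitions easy to share.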
What is the Amazon ECS Agent and how is it used by AWS Batch?
AWS Batch uses Amazon ECS to execute containerized jobs and therefore requires the ECS Agent to be installed on compute resources within your AWS Batch Compute Environments. The ECS Agent is pre-installed in Managed Compute Environments.