AWS Lambda Flashcards

1
Q

What is Lambda Synchronous invocation and what are some of the AWS services that invoke Lambda functions synchronously?

A

When a function is invoked synchronously, AWS Lambda waits until the function is done processing, then returns the result.
Examples of AWS services that invoke Lambda functions synchronously:
Amazon API Gateway
Application Load Balancer
Amazon Cognito
Amazon Kinesis Data Firehose
Amazon CloudFront (Lambda@Edge)
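
For illustration, a synchronous invocation through the AWS SDK for Python (boto3) might look like this minimal sketch; the function name and payload are placeholders:

    import json
    import boto3

    lambda_client = boto3.client("lambda")

    # "RequestResponse" (the default) invokes the function synchronously:
    # the call blocks until the function finishes and returns its result.
    response = lambda_client.invoke(
        FunctionName="my-function",              # placeholder function name
        InvocationType="RequestResponse",
        Payload=json.dumps({"key": "value"}),    # placeholder payload
    )

    result = json.loads(response["Payload"].read())
    print(result)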

2
Q

What is Lambda Asynchronous invocation and what are some of the AWS services that invoke Lambda functions Asynchronously?

A

When a function is invoked asynchronously, Lambda places the event in an internal queue and returns a success response without waiting for the function to complete; a separate process reads events from the queue and runs the function. Asynchronous invocation is typically used for long-latency processes that run in the background, such as batch operations, video encoding, and order processing.
Examples of AWS services that invoke Lambda functions asynchronously:
Amazon API Gateway (by specifying Event in the X-Amz-Invocation-Type request header of a non-proxy integration)
Amazon S3
Amazon CloudWatch Logs
Amazon EventBridge
AWS CodeCommit
AWS CloudFormation
AWS Config
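
By contrast, an asynchronous invocation only changes the InvocationType; in this minimal boto3 sketch (function name and payload are placeholders), Lambda queues the event and responds immediately:

    import json
    import boto3

    lambda_client = boto3.client("lambda")

    # "Event" invokes the function asynchronously: Lambda queues the event
    # and returns HTTP 202 without waiting for the function to run.
    response = lambda_client.invoke(
        FunctionName="my-function",                 # placeholder function name
        InvocationType="Event",
        Payload=json.dumps({"video_id": "1234"}),   # placeholder payload
    )

    print(response["StatusCode"])  # 202 means the event was accepted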

3
Q

What is Lambda Event Source Mapping? What are some of the services that Lambda provides Event Source Mapping for?

A

Event source mapping is a Lambda resource that reads from a queue or stream and synchronously invokes a Lambda function.
Lambda provides event source mappings for the following services:
Amazon Kinesis
Amazon DynamoDB
Amazon Simple Queue Service
Amazon MQ
Amazon Managed Streaming for Apache Kafka (Amazon MSK)
Self-managed Apache Kafka
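
As a minimal boto3 sketch, an event source mapping for an SQS queue could be created like this (the queue ARN and function name are placeholders):

    import boto3

    lambda_client = boto3.client("lambda")

    # Lambda polls the queue and synchronously invokes the function
    # with batches of messages.
    mapping = lambda_client.create_event_source_mapping(
        EventSourceArn="arn:aws:sqs:us-east-1:123456789012:my-queue",  # placeholder
        FunctionName="my-function",                                    # placeholder
        BatchSize=10,
        Enabled=True,
    )

    print(mapping["UUID"])  # identifier of the new event source mapping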

4
Q

When do you use reserved concurrency and provisioned concurrency?

A

Use reserved concurrency to reserve a portion of your account’s concurrency for a function. This is useful if you don’t want other functions taking up all the available unreserved concurrency.

Use provisioned concurrency to pre-initialize a number of environment instances for a function. This is useful for reducing cold start latencies.
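
A minimal boto3 sketch of applying both settings (the function name, alias, and numbers are placeholders):

    import boto3

    lambda_client = boto3.client("lambda")

    # Reserved concurrency: guarantee this function up to 100 concurrent
    # executions and keep other functions from exhausting its capacity.
    lambda_client.put_function_concurrency(
        FunctionName="my-function",            # placeholder
        ReservedConcurrentExecutions=100,
    )

    # Provisioned concurrency: keep 50 execution environments initialized
    # on the "live" alias so invocations avoid cold starts.
    lambda_client.put_provisioned_concurrency_config(
        FunctionName="my-function",            # placeholder
        Qualifier="live",                      # placeholder alias or version
        ProvisionedConcurrentExecutions=50,
    )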

5
Q

What is Lambda@Edge?

A

Lets you run Lambda functions to customize content that CloudFront delivers, executing the functions in AWS locations closer to the viewer. The functions run in response to CloudFront events, without provisioning or managing servers.
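
For illustration, a minimal Python sketch of a viewer-request Lambda@Edge handler that adds a custom header before CloudFront processes the request (the header name and value are arbitrary examples):

    # Viewer-request handler: CloudFront passes the request in the event record.
    def lambda_handler(event, context):
        request = event["Records"][0]["cf"]["request"]

        # Example customization: tag the request with a custom header.
        request["headers"]["x-edge-example"] = [
            {"key": "X-Edge-Example", "value": "enabled"}
        ]

        # Returning the request lets CloudFront continue processing it.
        return request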

6
Q

A company is deploying the package of its Lambda function, which is compressed as a ZIP file, to AWS. However, they are getting an error in the deployment process because the package is too large. The manager instructed the developer to keep the deployment package small to make the development process much easier and more modularized. This should also help prevent errors that may occur when dependencies are installed and packaged with the function code.
Which of the following options is the MOST suitable solution that the developer should implement?

1. Upload the deployment package to Amazon S3.
2. Zip the deployment package again to further compress the ZIP file.
3. Upload the other dependencies of your function as a separate Lambda Layer instead.
4. Compress the deployment package as a TAR file instead.

A
  3. Upload the other dependencies of your function as a separate Lambda Layer instead.
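
A minimal boto3 sketch of that approach, assuming the dependencies are packaged as a separate ZIP in S3 (the bucket, key, and names are placeholders):

    import boto3

    lambda_client = boto3.client("lambda")

    # Publish the packaged dependencies as a Lambda Layer.
    layer = lambda_client.publish_layer_version(
        LayerName="my-dependencies",                              # placeholder
        Content={"S3Bucket": "my-bucket", "S3Key": "deps.zip"},   # placeholders
        CompatibleRuntimes=["python3.12"],
    )

    # Attach the layer; the function's own ZIP now only needs the handler code.
    lambda_client.update_function_configuration(
        FunctionName="my-function",                               # placeholder
        Layers=[layer["LayerVersionArn"]],
    )
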
7
Q

A sports technology company plans to build the latest version of its kneepads, which can collect data from the athletes wearing them. The product owner wants to equip them with wearable medical sensors that ingest near-real-time data securely at scale and store it in durable storage. Furthermore, the solution should only collect non-confidential information from the streaming data and exclude anything classified as sensitive data.

Which solution achieves these requirements with the least operational overhead?

  1. Using Amazon Kinesis Data Firehose, ingest the streaming data, and use Amazon S3 for durable storage. Write an AWS Lambda function that removes sensitive data. Schedule a separate job that invokes the Lambda function once the data is stored in Amazon S3.
  2. Using Amazon Kinesis Data Firehose, ingest the streaming data, and use Amazon S3 for durable storage. Write an AWS Lambda function that removes sensitive data. During the creation of the Kinesis Data Firehose delivery stream, enable record transformation and use the Lambda function.
  3. Using Amazon Kinesis Data Streams, ingest the streaming data, and use an Amazon EC2 instance for durable storage. Write an Amazon Kinesis Data Analytics application that removes sensitive data.
  4. Using Amazon Kinesis Data Streams, ingest the streaming data, and use Amazon S3 for durable storage. Write an AWS Lambda function that removes sensitive data. Schedule a separate job that invokes the Lambda function once the data is stored in Amazon S3.
A
  2. Using Amazon Kinesis Data Firehose, ingest the streaming data, and use Amazon S3 for durable storage. Write an AWS Lambda function that removes sensitive data. During the creation of the Kinesis Data Firehose delivery stream, enable record transformation and use the Lambda function.
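
Firehose invokes the transformation Lambda with a batch of records and expects a specific response shape; a minimal Python sketch, assuming JSON records with a hypothetical sensitive field named "ssn":

    import base64
    import json

    # Kinesis Data Firehose record-transformation handler.
    def lambda_handler(event, context):
        output = []
        for record in event["records"]:
            payload = json.loads(base64.b64decode(record["data"]))

            # Remove the sensitive field before delivery to S3
            # (field name is a hypothetical example).
            payload.pop("ssn", None)

            output.append({
                "recordId": record["recordId"],
                "result": "Ok",  # "Dropped" / "ProcessingFailed" are also valid
                "data": base64.b64encode(json.dumps(payload).encode()).decode(),
            })
        return {"records": output}
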
8
Q

A leading carmaker would like to build a new car-as-a-sensor service by leveraging fully serverless components that are provisioned and managed automatically by AWS. The development team at the carmaker does not want an option that requires the capacity to be manually provisioned, as it does not want to respond manually to changing volumes of sensor data.

Given these constraints, which of the following solutions is the BEST fit to develop this car-as-a-sensor service?
1. Ingest the sensor data in Amazon Kinesis Data Firehose, which directly writes the data into an auto-scaled Amazon DynamoDB table for downstream processing
2. Ingest the sensor data in an Amazon Simple Queue Service (Amazon SQS) standard queue, which is polled by an application running on an Amazon EC2 instance and the data is written into an auto-scaled Amazon DynamoDB table for downstream processing
3. Ingest the sensor data in Amazon Kinesis Data Streams, which is polled by an application running on an Amazon EC2 instance and the data is written into an auto-scaled Amazon DynamoDB table for downstream processing
4. Ingest the sensor data in an Amazon Simple Queue Service (Amazon SQS) standard queue, which is polled by an AWS Lambda function in batches and the data is written into an auto-scaled Amazon DynamoDB table for downstream processing

A
  4. Ingest the sensor data in an Amazon Simple Queue Service (Amazon SQS) standard queue, which is polled by an AWS Lambda function in batches and the data is written into an auto-scaled Amazon DynamoDB table for downstream processing

AWS manages all ongoing operations and underlying infrastructure needed to provide a highly available and scalable message queuing service. With SQS, there is no upfront cost, no need to acquire, install, and configure messaging software, and no time-consuming build-out and maintenance of supporting infrastructure. SQS queues are dynamically created and scale automatically so you can build and grow applications quickly and efficiently.

Since there is no need to manually provision capacity, this is the correct option.
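
A minimal Python sketch of the Lambda side of this design, assuming an SQS event source mapping, JSON message bodies, and a hypothetical DynamoDB table named SensorData:

    import json
    from decimal import Decimal

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("SensorData")  # hypothetical table name

    # Invoked by the SQS event source mapping with a batch of messages.
    def lambda_handler(event, context):
        for message in event["Records"]:
            # parse_float=Decimal because DynamoDB does not accept floats.
            reading = json.loads(message["body"], parse_float=Decimal)
            table.put_item(Item=reading)  # assumes keys match the table schema
        return {"processed": len(event["Records"])}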

9
Q

A gaming company is developing a mobile game that streams score updates to a backend processor and then publishes results on a leaderboard. The company has hired you as an AWS Certified Solutions Architect Associate to design a solution that can handle major traffic spikes, process the mobile game updates in the order of receipt, and store the processed updates in a highly available database. The company wants to minimize the management overhead required to maintain the solution.

Which of the following will you recommend to meet these requirements?

  1. Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue which uses a fleet of Amazon EC2 instances (with Auto Scaling) to process these updates in the Amazon SQS queue and then store these processed updates in an Amazon RDS MySQL database
  2. Push score updates to Amazon Kinesis Data Streams which uses an AWS Lambda function to process these updates and then store these processed updates in Amazon DynamoDB
  3. Push score updates to Amazon Kinesis Data Streams which uses a fleet of Amazon EC2 instances (with Auto Scaling) to process the updates in Amazon Kinesis Data Streams and then store these processed updates in Amazon DynamoDB
  4. Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic, subscribe an AWS Lambda function to this Amazon SNS topic to process the updates and then store these processed updates in a SQL database running on Amazon EC2 instance
A
  2. Push score updates to Amazon Kinesis Data Streams which uses an AWS Lambda function to process these updates and then store these processed updates in Amazon DynamoDB

To help ingest real-time data or streaming data at large scales, you can use Amazon Kinesis Data Streams (KDS). KDS can continuously capture gigabytes of data per second from hundreds of thousands of sources. The data collected is available in milliseconds, enabling real-time analytics. KDS provides ordering of records, as well as the ability to read and/or replay records in the same order to multiple Amazon Kinesis Applications.

AWS Lambda integrates natively with Kinesis Data Streams. The polling, checkpointing, and error handling complexities are abstracted when you use this native integration. The processed data can then be configured to be saved in Amazon DynamoDB.
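
A minimal Python sketch of the processing function for this design, assuming JSON score updates and a hypothetical DynamoDB table named Leaderboard whose key attributes are present in each update:

    import base64
    import json
    from decimal import Decimal

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("Leaderboard")  # hypothetical table name

    # Invoked by the Kinesis Data Streams event source mapping; records
    # within a shard are delivered in order.
    def lambda_handler(event, context):
        for record in event["Records"]:
            raw = base64.b64decode(record["kinesis"]["data"])
            update = json.loads(raw, parse_float=Decimal)  # DynamoDB needs Decimal, not float
            table.put_item(Item=update)
        return {"processed": len(event["Records"])}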
