Exam Questions 1 Flashcards

1
Q

S3 - Simple Storage Service

A
  • A&D require development and administrative overhead.
  • C might work, but it will not archive the data for the additional 3 months.
2
Q

Databases in AWS

A
  • D is incorrect because DynamoDB does not accept complex SQL queries.
  • A&B: Aurora and RDS do not fit the need for minimal development and the API requirement.
3
Q

Collecting Streaming Data

A
  • B&D: There is no requirement to transform the data before it reaches the Business Intelligence tools.
  • C: We could use RDS Aurora to scale up the storage, but it would be very costly. Database Migration Service may not be able to handle the volume of data we need without costing a huge amount of money.
4
Q

Database Migration Service DMS

A
5
Q

Database Migration Service : Use Cases

A
6
Q

Replication

A
7
Q

S3 Transfer Acceleration

A

The speed at which users can access objects in an S3 bucket goes down as the distance between the user and the Region hosting the bucket increases.

8
Q

S3 Transfer Acceleration - CloudFront - Downloads

A

CloudFront is a global Content Distribution Network: it caches objects at edge nodes closer to the end user in order to improve download speeds.

9
Q

S3 Transfer Acceleration - CloudFront - Uploads

A

Transfer Acceleration leverages CloudFront edge locations to create a Global Content Ingestion Network, utilizing CloudFront’s optimized network paths. It is enabled per bucket and uses special endpoints.
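As a concrete illustration of the "special endpoints" point, a minimal sketch (the bucket name is hypothetical) of how the accelerated hostname differs from the standard one:

```python
def s3_endpoint(bucket: str, accelerate: bool = False) -> str:
    """Build a virtual-hosted-style S3 endpoint for a bucket.

    When Transfer Acceleration is enabled on the bucket, clients send
    requests to the s3-accelerate endpoint instead of the standard one.
    """
    host = "s3-accelerate.amazonaws.com" if accelerate else "s3.amazonaws.com"
    return f"https://{bucket}.{host}"

# Standard vs. accelerated endpoint for a hypothetical bucket:
print(s3_endpoint("example-bucket"))                   # https://example-bucket.s3.amazonaws.com
print(s3_endpoint("example-bucket", accelerate=True))  # https://example-bucket.s3-accelerate.amazonaws.com
```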

10
Q

S3 Multi Part Upload

A

Max S3 PUT size = 5 GiB

Max S3 object size = 5 TiB

We achieve this by breaking up an object into smaller pieces, sending them to S3, then reassembling them back into one object.
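The split-and-reassemble idea can be sketched in plain Python (a toy illustration with a tiny part size; in a real multipart upload every part except the last must be at least 5 MiB):

```python
def split_into_parts(data: bytes, part_size: int) -> list[bytes]:
    """Break an object into fixed-size parts, as a multipart upload does."""
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

def reassemble(parts: list[bytes]) -> bytes:
    """S3 concatenates the uploaded parts back into a single object."""
    return b"".join(parts)

obj = b"x" * 1000          # stand-in for a large object
parts = split_into_parts(obj, part_size=300)
assert len(parts) == 4     # 300 + 300 + 300 + 100 bytes
assert reassemble(parts) == obj
```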

11
Q

S3 Storage Classes

A
12
Q

S3 Availability

A
13
Q

S3 Data Lifecycle

A
14
Q

S3 Access Security Waterfall

A

If permissions are denied at any point, from macro to micro, the request for the S3 object fails. The waterfall begins with the Organization’s Service Control Policy and narrows down through bucket-, IAM-, and object-level permissions.
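That deny-wins behavior can be sketched as follows (an illustrative simplification, not AWS's actual policy engine; real IAM evaluation also applies an implicit default deny):

```python
def evaluate_waterfall(decisions: dict[str, bool]) -> bool:
    """Walk the layers from macro to micro.

    A deny at any layer fails the whole request. For simplicity, a layer
    absent from `decisions` is treated as allowing the request.
    """
    layers = ["service_control_policy", "bucket_policy", "iam_policy", "object_acl"]
    return all(decisions.get(layer, True) for layer in layers)

# Allowed at every layer -> the request succeeds
assert evaluate_waterfall({"service_control_policy": True, "bucket_policy": True,
                           "iam_policy": True, "object_acl": True})
# A single deny anywhere (e.g. the bucket policy) fails the request
assert not evaluate_waterfall({"bucket_policy": False})
```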

15
Q

S3 Object Protection and Replication

A
16
Q

Relational Database Types - Row Access

A

A query will pick up a whole row at a time to access data. Excellent for rapid transactions with relatively small pieces of data.

17
Q

Relational Database Types - Columnar Access

A

A query will scan one or more columns to access data. Excellent for analytical workloads that handle large pieces of data. These databases will be the ones used for Data Analytics.
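The difference between the two access patterns can be sketched with two in-memory layouts of the same table (illustrative Python, not a real database engine):

```python
# Row-oriented: each record is stored together -- cheap to fetch one whole row.
rows = [
    {"id": 1, "region": "us-east-1", "sales": 100},
    {"id": 2, "region": "eu-west-1", "sales": 250},
    {"id": 3, "region": "us-east-1", "sales": 175},
]

# Column-oriented: each column is stored together -- cheap to scan one column.
columns = {
    "id": [1, 2, 3],
    "region": ["us-east-1", "eu-west-1", "us-east-1"],
    "sales": [100, 250, 175],
}

# Transactional access (row store): grab one complete record.
assert rows[1] == {"id": 2, "region": "eu-west-1", "sales": 250}

# Analytical access (column store): aggregate one column without touching the rest.
assert sum(columns["sales"]) == 525
```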

18
Q

NoSQL Databases - ElastiCache

A

Memcached

Redis

Key-Value

19
Q

NoSQL Databases - Document

A

Key-Value

DynamoDB

DocumentDB

20
Q

NoSQL Databases - Graph

A

Neptune

Key-Value

21
Q

Graph Database - Graph Structure

A
22
Q

Database Engine Types - Summary

A
23
Q

Managed Relational Database Engines

A
24
Q

Managed Relational Databases - Disaster Recovery

A
25
Q

Managed Relational Database Services - Summary

A
26
Q

Graph Databases - Neptune

A

Interface Languages

27
Q

Graph Databases - Neptune - Comparison to RDS

A
28
Q

Graph Databases - Neptune - Use Cases

A
29
Q

Graph Databases - Neptune - Summary

A