Certification Practice Flashcards

1
Q
Identify system-defined roles in Snowflake from the roles given below. (select four)
A. AccountAdmin
B. SysAdmin
C. Auditor
D. RoleAdmin
E. SecurityAdmin
F. Public
A

A. AccountAdmin
B. SysAdmin
E. SecurityAdmin
F. Public
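As a refresher, the system-defined roles form a hierarchy with ACCOUNTADMIN at the top. A minimal sketch, assuming a hypothetical custom role named analyst:

```sql
-- List all roles visible to the current role.
SHOW ROLES;

-- Recommended practice: grant custom roles up to SYSADMIN so that
-- SYSADMIN retains visibility over the objects they create.
CREATE ROLE analyst;                    -- "analyst" is a hypothetical role
GRANT ROLE analyst TO ROLE SYSADMIN;
```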

2
Q
What are the minimum and the maximum number of clusters in a multi-cluster warehouse?
A. Minimum: 1, Maximum: 99
B. Minimum: 1, Maximum: 100
C. Minimum: 1, Maximum: 10
D. Minimum: 1, Maximum: unlimited
A

C

In a multi-cluster warehouse, the minimum number of clusters can be 1 and the maximum number of clusters can be 10.
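The cluster bounds are set when the warehouse is created (or later via ALTER WAREHOUSE). A sketch, assuming a hypothetical warehouse name:

```sql
-- Hypothetical multi-cluster warehouse in auto-scale mode:
-- Snowflake adds/removes clusters between MIN and MAX (MAX at most 10).
CREATE WAREHOUSE my_mc_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 10
  SCALING_POLICY    = 'STANDARD';
```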

3
Q

When a virtual warehouse is started, resized, or resumed, the minimum billing charge is 1 minute.
A. True
B. False

A

True
The above statement is True. When a new warehouse is started afresh, or when you resize an existing running warehouse or when a warehouse is resumed (automatically or manually), the warehouse is billed for a minimum of 1 minute’s worth of usage. After the 1st minute has elapsed, all subsequent billing is charged per second.

4
Q
When a multi-cluster warehouse is suspended, which of the following Snowflake caches will be purged?
A. Metadata cache
B. Remote disk cache (Resultset cache)
C. Local disk cache
D. All of the above
A

C
When a multi-cluster warehouse (or a single-cluster warehouse) is suspended, only the local disk cache is dropped. This cache, also known as the warehouse cache, is essentially the fast SSD storage of the warehouse's compute nodes, so when the warehouse is gone, the cache goes with it. Answer choice-C is correct.

Metadata cache contains Snowflake account object information and statistics. It is always on and it is never dropped. Answer choice-A is incorrect.

Remote disk cache (result set cache) stores exact query results, including aggregations, and persists each result set for 24 hours after the query is executed, irrespective of the state of the warehouse. Answer choice-B is incorrect.
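To see the result cache in action, you can disable it for a session; USE_CACHED_RESULT is a real session parameter:

```sql
-- With USE_CACHED_RESULT = TRUE (the default), re-running an identical
-- query within 24 hours is served from the remote result cache even if
-- the warehouse was suspended in between.
ALTER SESSION SET USE_CACHED_RESULT = FALSE;  -- force re-execution
```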

5
Q
Which of the following programming languages are supported in Snowflake to write user-defined functions (UDFs)?
A. SQL
B. Java
C. JavaScript
D. Python
A

A and C

(Note: at the time this card was written, UDFs could be written in SQL and JavaScript; Snowflake has since added support for Java and Python UDFs as well, so on the current platform all four choices are valid.)
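For reference, minimal UDF definitions in the two languages this card accepts (function names are hypothetical):

```sql
-- SQL UDF
CREATE OR REPLACE FUNCTION area_of_circle(radius FLOAT)
  RETURNS FLOAT
  AS 'pi() * radius * radius';

-- JavaScript UDF (arguments are referenced in upper case inside the body)
CREATE OR REPLACE FUNCTION js_double(x FLOAT)
  RETURNS FLOAT
  LANGUAGE JAVASCRIPT
  AS 'return X * 2;';
```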

6
Q

Which Snowflake edition supports transmitting data in encrypted form over the network between VPCs (virtual private cloud)?
A. All editions
B. Enterprise edition and above
C. Business Critical edition and above
D. All except for the Virtual Private Snowflake (VPS) edition

A

C
A Snowflake account on AWS (or Azure) is implemented as a VPC. There are two ways to establish communication between your Snowflake VPC and other VPCs (e.g. your organization's VPC). One is to transmit the traffic over the public internet. The other (and safer) option is to establish an exclusive, highly secure network between your Snowflake account and your other AWS VPCs (in the same AWS region), fully protected from unauthorized access. To implement this secure channel of communication between VPCs, AWS supports a feature called AWS PrivateLink (Azure offers a similar feature called Azure Private Link). Snowflake supports AWS PrivateLink (and Azure Private Link) based communication in the Business Critical edition and above.
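Once PrivateLink is enabled for an account, its endpoint configuration can be inspected with a system function (Business Critical edition or higher):

```sql
-- Returns the account's PrivateLink configuration as a JSON string.
SELECT SYSTEM$GET_PRIVATELINK_CONFIG();
```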

7
Q

More than one clustering key can co-exist in a Snowflake table
A. True
B. False

A

False
This statement is false. You can define at most one clustering key per Snowflake table to organize its micro-partitions. When you define a clustering key, Snowflake reorganizes the naturally clustered micro-partitions, relocating related rows to the same micro-partition and grouping them according to the clustering key. This process is called reclustering.
➤ Practical Info – Reclustering happens automatically once a clustering key is defined for a table. The process consumes credits, so be cognizant of the cost before clustering a table.
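A sketch of defining and inspecting the (single) clustering key, assuming a hypothetical orders table:

```sql
-- Define (or replace) the table's one clustering key.
ALTER TABLE orders CLUSTER BY (order_date, region);

-- Check how well the table is clustered on that key.
SELECT SYSTEM$CLUSTERING_INFORMATION('orders');

-- Remove the clustering key (stops automatic reclustering costs).
ALTER TABLE orders DROP CLUSTERING KEY;
```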

8
Q

Which of the following statements will you use to change the warehouse for workload processing to a warehouse named 'COMPUTE_WH_XL'?
A. SET CURRENT_WAREHOUSE = COMPUTE_WH_XL
B. USE WAREHOUSE COMPUTE_WH_XL;
C. USE CURRENT_WAREHOUSE('COMPUTE_WH_XL');
D. SET CURRENT_WAREHOUSE = COMPUTE_WH, SIZE = XL;

A

B
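The correct statement in context, plus the separate command for resizing (which option D conflates with switching warehouses):

```sql
-- Switch the session's current warehouse.
USE WAREHOUSE COMPUTE_WH_XL;

-- Resizing is a separate operation:
ALTER WAREHOUSE COMPUTE_WH_XL SET WAREHOUSE_SIZE = 'XLARGE';
```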

9
Q
In the case of a Snowflake account created on AWS, ……………….. is responsible for the management of Availability Zones.
A. Customer
B. Snowflake
C. Cloud Provider
D. It is a shared responsibility
A

C

10
Q
Once the time-travel period has expired, it is possible to request Snowflake support to retrieve historical data for a period of
A. 1 day
B. 7 days
C. It depends on the Snowflake edition
D. It is user-configurable
A

B
7-day fail-safe period.
After the time-travel data retention period is over, you can use Snowflake's fail-safe feature to recover your data. The duration of the fail-safe period is 7 days (answer choice B is correct). This is a fixed duration and cannot be changed. Only Snowflake support personnel can recover data during the fail-safe period. The fail-safe feature is available to all customers irrespective of the Snowflake edition.
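Time-travel, unlike fail-safe, is queryable directly. A sketch, assuming a hypothetical orders table:

```sql
-- Query the table as it existed 30 minutes ago (within retention).
SELECT * FROM orders AT(OFFSET => -60 * 30);

-- Restore a dropped table that is still within its retention period.
UNDROP TABLE orders;

-- Time-travel retention is configurable per object; fail-safe is not.
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 1;
```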
In short: time-travel is user-accessible and configurable per object (up to 90 days on Enterprise edition and above), while fail-safe is a fixed 7 days and accessible only through Snowflake support.

11
Q

Which of the following statements are TRUE concerning a stream object in Snowflake? (select all that apply)
A. A stream object provides a record of DML changes (inserts, updates, deletes) made to a table at row level.
B. A stream object can keep track of DML changes for the entire life of a table.
C. Streams on materialized views are not supported.
D. Streams on external tables are not supported.

A

A and C
A: A stream object provides change tracking over a source table. It records DML changes made to the table (inserts, updates, and deletes) as well as metadata about each change. This is referred to as Change Data Capture (CDC), a feature extensively used in data warehousing scenarios to create data pipelines. Note that the stream object itself does not store this data; it relies on the version history of the source data maintained in the metadata layer.
B: A stream object keeps track of DML changes on a source table only up until the data retention period of the source table. After that, the DML changes are no longer accessible.
C: Currently, Snowflake does not support creating stream objects on materialized views.
D: Snowflake supports creating insert-only stream objects on external tables.
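A minimal CDC sketch, assuming hypothetical orders and orders_history tables:

```sql
-- Standard stream recording inserts, updates, and deletes.
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Changed rows are exposed with metadata columns such as
-- METADATA$ACTION and METADATA$ISUPDATE.
SELECT * FROM orders_stream;

-- Consuming the stream in a DML statement advances its offset.
INSERT INTO orders_history SELECT * FROM orders_stream;
```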

12
Q

Only one stream object can be created on a source table
A. True
B. False

A

False
You can create any number of streams on a source table. These streams can have the same or different offset positions. One example of creating multiple streams is when you want to report month-on-month changes, week-on-week changes, and day-on-day changes happening in a product inventory table of your POS database. In this case, you may create three streams on the table to record monthly, weekly and daily changes. All three streams exist independently of each other with their respective offset positions.
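The inventory example above might be sketched as follows (table and stream names are hypothetical):

```sql
-- Three independent streams on one source table, each with its own offset.
CREATE STREAM inventory_daily   ON TABLE product_inventory;
CREATE STREAM inventory_weekly  ON TABLE product_inventory;
CREATE STREAM inventory_monthly ON TABLE product_inventory;
```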

13
Q

When deciding whether to use bulk loading or Snowpipe, which factors should you consider?

A. How often you will load the data
B. Location of data (local system or cloud)
C. Data format (structured or semi-structured)
D. Number of files you will load at one time

A

A, B and D
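For contrast, bulk loading uses a manually executed COPY INTO on a user-managed warehouse, while Snowpipe wraps a COPY statement in a pipe for continuous, serverless micro-batch loading. A sketch with hypothetical object names:

```sql
-- Bulk load: run on demand on a user-managed warehouse.
COPY INTO raw_events FROM @my_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Snowpipe: serverless, triggered as files arrive in the stage.
CREATE PIPE events_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO raw_events FROM @my_stage
  FILE_FORMAT = (TYPE = 'JSON');
```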
