[COF-C02] SnowPro Core Certification Mock Exam - 3 Flashcards
A DBA_ROLE created a database, and later the DBA_ROLE was dropped. Who now owns the database that was created by the DBA_ROLE?
Ownership of the database is transferred to the role that dropped the DBA_ROLE. This is an important point for the exam.
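For illustration, a minimal sketch of this behavior (the role and database names are hypothetical):
  USE ROLE SYSADMIN;
  DROP ROLE DBA_ROLE;                -- ownership of SALES_DB (created by DBA_ROLE) transfers to SYSADMIN
  SHOW DATABASES LIKE 'SALES_DB';    -- the "owner" column now shows SYSADMIN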
Which AWS service is used to create private VPC endpoints that allow direct, secure connectivity between your AWS VPCs and the Snowflake VPC without traversing the public internet?
AWS PrivateLink is an AWS service for creating private VPC endpoints that allow direct, secure connectivity between your AWS VPCs and the Snowflake VPC without traversing the public internet. The connectivity is for AWS VPCs in the same AWS region.
For External Functions, you can also use AWS PrivateLink with private endpoints.
In addition, if you have an on-premises environment (e.g. a non-hosted data center), you can choose to use AWS Direct Connect, in conjunction with AWS PrivateLink, to connect all your virtual and physical environments in a single, private network.
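As a hedged illustration, the account-level PrivateLink configuration can be retrieved with a Snowflake system function (assuming a sufficiently privileged role such as ACCOUNTADMIN):
  SELECT SYSTEM$GET_PRIVATELINK_CONFIG();   -- returns the endpoint details used to set up the VPC endpoint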
What is the default TIMESTAMP type in Snowflake?
TIMESTAMP_NTZ is the default timestamp type when you define a column simply as TIMESTAMP. Hint to remember: NTZ stands for No Time Zone.
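A minimal sketch to verify the default (the table and column names are illustrative):
  CREATE OR REPLACE TABLE ts_demo (ts TIMESTAMP);
  DESCRIBE TABLE ts_demo;                                    -- the TS column resolves to TIMESTAMP_NTZ(9)
  SHOW PARAMETERS LIKE 'TIMESTAMP_TYPE_MAPPING' IN ACCOUNT;  -- defaults to TIMESTAMP_NTZ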
Which roles can configure a network policy?
Only security administrators (i.e., users with the SECURITYADMIN role) or higher or a role with the global CREATE NETWORK POLICY privilege can create network policies.
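A hedged sketch of creating and applying a network policy (the policy name and IP ranges are made up):
  USE ROLE SECURITYADMIN;
  CREATE NETWORK POLICY office_only
    ALLOWED_IP_LIST = ('192.168.1.0/24')
    BLOCKED_IP_LIST = ('192.168.1.99');
  ALTER ACCOUNT SET NETWORK_POLICY = office_only;   -- enforce the policy at the account level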
What are the three interfaces in Snowsight?
The Left Navigation consists of Worksheets, Dashboards, Data, Marketplace, Activity, Admin, and Help & Support.
The User Menu lets you switch roles and access your Profile (including multi-factor authentication (MFA)), Partner Connect, Documentation, Support, and Sign Out.
The account selector, located at the bottom of the left nav, lets you sign in to other Snowflake accounts.
Which type of object key is only used for decryption?
Retired Key is used for decryption only.
Active Key is used for both encryption and decryption.
Destroyed Key is no longer used.
Direct data sharing can only be done with accounts in the same region and the same cloud provider. (TRUE/FALSE)
True
Direct data sharing can only be done with accounts in the same region and on the same cloud provider. To share with an account in a different region (or on a different cloud provider), replicate the database to an account in the target region and create the share from there.
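A hedged sketch of the replication approach (account identifiers and database names are illustrative):
  -- In the source account:
  ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.account_eu;
  -- In the target account, create the secondary database and refresh it:
  CREATE DATABASE sales_db AS REPLICA OF myorg.account_us.sales_db;
  ALTER DATABASE sales_db REFRESH;
  -- Then create the share in the target region as usual.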
Monica is unsure which sampling method to use on a very large table for better performance. Which sampling method would you recommend: BERNOULLI | ROW or SYSTEM | BLOCK?
SYSTEM | BLOCK sampling is often faster than BERNOULLI | ROW sampling. BERNOULLI | ROW sampling is better suited to smaller tables, while SYSTEM | BLOCK sampling is recommended for larger tables, so SYSTEM | BLOCK is the better choice here.
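A minimal sketch of both sampling methods (table names and the 10% sample size are illustrative):
  SELECT * FROM big_table   SAMPLE SYSTEM (10);     -- block-level sampling, typically faster on large tables
  SELECT * FROM small_table SAMPLE BERNOULLI (10);  -- row-level sampling, more uniform on smaller tables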
Which copy option is used to delete files from a Snowflake stage once the data from the staged files has been loaded successfully?
Staged files can be deleted from a Snowflake stage (user stage, table stage, or named stage) using the following methods:
- Files that were loaded successfully can be deleted from the stage during a load by specifying the PURGE copy option in the COPY INTO <table> command.
- After the load completes, use the REMOVE command to remove the files in the stage.
Please note, DELETE and REMOVE are not COPY command options; PURGE is. REMOVE is a separate file-staging command used to remove files from a stage.
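Hedged examples of both approaches (the table and stage names are made up):
  COPY INTO my_table FROM @my_stage PURGE = TRUE;   -- delete successfully loaded files during the load
  REMOVE @my_stage PATTERN = '.*.csv';              -- remove staged files after the load completes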
How long does Snowflake keep Snowpipe’s load history?
Snowflake retains Snowpipe's load history for 14 days. If you recreate the pipe (CREATE OR REPLACE PIPE ...), the load history is reset to empty [very important for the exam].
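As a hedged illustration, the 14-day load history can be queried with the COPY_HISTORY table function (the table name and time window are illustrative):
  SELECT *
  FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
    TABLE_NAME => 'MY_TABLE',
    START_TIME => DATEADD(day, -14, CURRENT_TIMESTAMP())));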
John wants to load data files from an external stage into Snowflake. He has split a large file into smaller 100-250 MB data files, for a total of 16 files. What warehouse size would you recommend for loading these data files quickly and cost-effectively?
An XS warehouse can load eight files in parallel, an S warehouse sixteen, an M warehouse thirty-two, an L warehouse sixty-four, an XL warehouse one hundred twenty-eight, and so on. With 16 files to load, a Small (S) warehouse is the most cost-effective recommendation.
Snowflake prunes micro-partitions based on a predicate with a subquery, even if the subquery result is constant. (TRUE/FALSE)
False
Please note, not all predicate expressions can be used to prune. Snowflake does not prune micro-partitions based on a predicate with a subquery, even if the subquery results in a constant.
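A hedged example of a predicate that prevents pruning even though the subquery returns a constant (the table and column names are made up):
  SELECT * FROM orders WHERE order_date = (SELECT MAX(order_date) FROM orders);  -- subquery predicate: no pruning
  SELECT * FROM orders WHERE order_date = '2023-01-31';                          -- literal predicate: can prune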
What are the key benefits of The Data Cloud?
The benefits of The Data Cloud are Access, Governance, and Action.
Access means that organizations can easily discover data and share it internally or with third parties without regard to geographical location.
Governance is about setting policies and rules and protecting the data in a way that can unlock new value and collaboration while maintaining the highest levels of security and compliance.
Action means you can empower every part of your business with data to build better products, make faster decisions, create new revenue streams, and realize the value of your greatest untapped asset: your data.
What is the best way to analyze the optimum warehouse size?
To achieve the best results, try to execute relatively homogeneous queries (size, complexity, data sets, etc.) on the same warehouse. Executing queries of widely varying size and/or complexity on the same warehouse makes it more difficult to analyze warehouse load, which in turn makes it harder to select the best size to match the size, composition, and number of queries in your workload.
What size of virtual warehouse does the SYSADMIN need to create for loading data with Snowpipe?
Snowpipe uses compute resources provided and managed by Snowflake (i.e., a serverless compute model). These Snowflake-provided resources are automatically resized and scaled up or down as required, and are charged and itemized using per-second billing. Data ingestion is charged based on the actual workload. The user does not need to create any warehouse, as compute is handled by Snowflake.
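For context, a hedged sketch of a pipe definition; note that it references no warehouse (the pipe, table, and stage names are illustrative):
  CREATE OR REPLACE PIPE my_pipe AUTO_INGEST = TRUE AS
    COPY INTO my_table
    FROM @my_ext_stage
    FILE_FORMAT = (TYPE = 'CSV');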