Data Loading Flashcards
Which stages does Snowpipe support loading from?
- Named internal or external stages
- Table stages
(TODO: review the available COPY options)
Where can you define file format settings?
- While creating named file formats
- In the table definition
- In the named stage definition
- Directly in the COPY INTO TABLE statement when loading data
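As a sketch of the first and last options above (all object names here are illustrative), a named file format can be created once and referenced during the load, or the same settings can be given inline:

```sql
-- Named file format, defined once and reusable
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1;

-- Reference the named file format when loading
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');

-- Or specify the settings directly in the COPY INTO statement
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```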
When loading data using the COPY INTO command, what options can you specify for the ON_ERROR clause?
- CONTINUE
- SKIP_FILE
- ABORT_STATEMENT
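A minimal sketch of the ON_ERROR clause in use (table and stage names are illustrative):

```sql
-- Skip any file that contains errors and continue with the rest
COPY INTO my_table
  FROM @my_stage
  ON_ERROR = 'SKIP_FILE';
```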
T/F The Kafka connector creates one pipe for each partition in a Kafka topic
True
To convert a JSON null value to a SQL NULL value, which function do you use?
STRIP_NULL_VALUE
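A sketch of STRIP_NULL_VALUE applied to a VARIANT field (the table `src` and column `json_col` are illustrative names):

```sql
-- Returns SQL NULL where json_col:some_field holds a JSON null,
-- otherwise returns the field's value unchanged
SELECT STRIP_NULL_VALUE(json_col:some_field) FROM src;
```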
What does the following query return:
SELECT TOP 10 GRADES FROM STUDENT;
A non-deterministic list of 10 values from the GRADES column (without an ORDER BY clause, the row order is not guaranteed)
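For contrast, adding an ORDER BY makes the result deterministic (a sketch using the same table and column from the card):

```sql
-- The same TOP 10 query, now with a defined ordering
SELECT TOP 10 GRADES FROM STUDENT ORDER BY GRADES DESC;
```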
T/F A column with very high cardinality (e.g., a column containing UUID or nanosecond timestamp values) is typically not a good candidate to use as a clustering key directly.
TRUE
What are the REST APIs provided by Snowpipe?
insertFiles
insertReport
loadHistoryScan
Every Snowflake table loaded by the Kafka connector has a schema consisting of two VARIANT columns. What are those?
RECORD_CONTENT
RECORD_METADATA
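A sketch of querying those two columns on a Kafka-connector-loaded table (the table name `my_kafka_table` is illustrative; RECORD_METADATA carries fields such as the topic and partition):

```sql
SELECT
  RECORD_METADATA:topic::STRING  AS topic,
  RECORD_METADATA:partition::INT AS partition,
  RECORD_CONTENT                 AS payload
FROM my_kafka_table;
```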
T/F Snowpipe auto-ingest (AUTO_INGEST = TRUE) is supported for external stages only
TRUE
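A sketch of an auto-ingest pipe over an external stage (pipe, table, and stage names are illustrative):

```sql
CREATE OR REPLACE PIPE my_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO my_table
  FROM @my_external_stage
  FILE_FORMAT = (TYPE = JSON);
```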
You want to automatically delete the files from stage after a successful load using the COPY INTO command. What would be the recommended approach for deletion?
Specify PURGE = TRUE in the COPY INTO command
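A minimal sketch of the PURGE option (table and stage names are illustrative):

```sql
-- Successfully loaded files are removed from the stage afterwards
COPY INTO my_table
  FROM @my_stage
  PURGE = TRUE;
```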