ExamGecko

Snowflake SnowPro Core Practice Test - Questions Answers, Page 2


Question 11


Which data type can be used to store geospatial data in Snowflake?

A. Variant
B. Object
C. Geometry
D. Geography
Suggested answer: D

Explanation:

Snowflake supports two geospatial data types: GEOGRAPHY and GEOMETRY. The GEOGRAPHY data type stores geospatial data that models the Earth as a sphere following the WGS 84 standard, and is used for points, lines, and polygons on the Earth's surface, making it suitable for global geospatial data. The GEOMETRY data type, by contrast, represents features in a planar (Euclidean, Cartesian) coordinate system and is typically used with local spatial reference systems. Since the question asks about geospatial data, which commonly refers to Earth-related spatial data, the correct answer is GEOGRAPHY.

Reference: [COF-C02] SnowPro Core Certification Exam Study Guide
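As an illustration (table and column names are hypothetical), a GEOGRAPHY column accepts WKT, WKB, or GeoJSON input and supports spherical computations:

```sql
-- Hypothetical example: storing and querying GEOGRAPHY data
CREATE OR REPLACE TABLE city_locations (
    city_name STRING,
    location  GEOGRAPHY
);

-- GEOGRAPHY accepts WKT, WKB, and GeoJSON input
INSERT INTO city_locations
  SELECT 'San Francisco', TO_GEOGRAPHY('POINT(-122.4194 37.7749)');

-- Distance between two points, computed on the sphere (in meters)
SELECT ST_DISTANCE(location, TO_GEOGRAPHY('POINT(-73.9857 40.7484)'))
FROM city_locations;
```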

asked 23/09/2024
HWANG SEON TAE
43 questions

Question 12


What can be used to view warehouse usage over time? (Select Two).

A. The LOAD_HISTORY view
B. The QUERY_HISTORY view
C. The SHOW WAREHOUSES command
D. The WAREHOUSE_METERING_HISTORY view
E. The Billing & Usage tab in the Snowflake web UI
Suggested answer: B, D

Explanation:

To view warehouse usage over time, the QUERY_HISTORY view and the WAREHOUSE_METERING_HISTORY view can be used. The QUERY_HISTORY view allows users to monitor the performance of their queries and the load on their warehouses over a specified period. The WAREHOUSE_METERING_HISTORY view provides hourly credit usage for warehouses within a specified date range, showing how much compute each warehouse consumed over time.

Reference: [COF-C02] SnowPro Core Certification Exam Study Guide
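For instance (the warehouse name is hypothetical), credit usage over the past week can be inspected with the WAREHOUSE_METERING_HISTORY table function in the Information Schema:

```sql
-- Hypothetical example: hourly credit usage for one warehouse, last 7 days
SELECT start_time, end_time, credits_used
FROM TABLE(INFORMATION_SCHEMA.WAREHOUSE_METERING_HISTORY(
    DATE_RANGE_START => DATEADD('day', -7, CURRENT_DATE()),
    WAREHOUSE_NAME   => 'MY_WH'));
```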


Question 13


Which Snowflake partner specializes in data catalog solutions?

A. Alation
B. DataRobot
C. dbt
D. Tableau
Suggested answer: A

Explanation:

Alation is known for specializing in data catalog solutions and is a partner of Snowflake. Data catalog solutions are essential for organizations to effectively manage their metadata and make it easily accessible and understandable for users, which aligns with the capabilities provided by Alation.

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake's official documentation and partner listings


Question 14


What is the MOST performant file format for loading data in Snowflake?

A. CSV (Unzipped)
B. Parquet
C. CSV (Gzipped)
D. ORC
Suggested answer: B

Explanation:

Parquet is a columnar storage file format that is optimized for performance in Snowflake. It is designed to be efficient for both storage and query performance, particularly for complex queries on large datasets. Parquet files support efficient compression and encoding schemes, which can lead to significant savings in storage and speed in query processing, making it the most performant file format for loading data into Snowflake.
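A sketch of loading Parquet files (stage, table, and format names are hypothetical):

```sql
-- Hypothetical example: loading Parquet files from a named stage
CREATE OR REPLACE FILE FORMAT my_parquet_format TYPE = PARQUET;

COPY INTO my_table
FROM @my_stage/data/
FILE_FORMAT = (FORMAT_NAME = 'my_parquet_format')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

MATCH_BY_COLUMN_NAME maps the Parquet columns to table columns by name, which avoids having to keep the column order in sync.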

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Data Loading


Question 15


Which COPY INTO option outputs the data into one file?

A. SINGLE=TRUE
B. MAX_FILE_NUMBER=1
C. FILE_NUMBER=1
D. MULTIPLE=FALSE
Suggested answer: A

Explanation:

The COPY INTO &lt;location&gt; command in Snowflake outputs data into a single file when the SINGLE copy option is set to TRUE. With SINGLE=TRUE, Snowflake unloads the query result as one file rather than splitting it across multiple files. MAX_FILE_NUMBER, FILE_NUMBER, and MULTIPLE are not valid copy options.
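Per the Snowflake documentation, unloading to one file uses the SINGLE copy option; a sketch (stage and table names are hypothetical):

```sql
-- Hypothetical example: unloading a table into a single file on a stage
COPY INTO @my_stage/unload/result.csv.gz
FROM my_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
SINGLE = TRUE;
```

Note that single-file unloads are bounded by the MAX_FILE_SIZE copy option, so very large results may still require multiple files.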

[COF-C02] SnowPro Core Certification Exam Study Guide

Snowflake Documentation on Data Unloading


Question 16


The fail-safe retention period is how many days?

A. 1 day
B. 7 days
C. 45 days
D. 90 days
Suggested answer: B

Explanation:

Fail-safe is a feature in Snowflake that provides an additional layer of data protection. After the Time Travel retention period ends, Fail-safe offers a non-configurable 7-day period during which historical data may be recoverable by Snowflake. This period is designed to protect against accidental data loss and is not intended for customer access.


Question 17


True or False: A 4X-Large Warehouse may, at times, take longer to provision than an X-Small Warehouse.

A. True
B. False
Suggested answer: A

Explanation:

Provisioning time can vary with warehouse size. A 4X-Large warehouse requires many more compute resources and may, at times, take longer to provision than an X-Small warehouse, which has far fewer resources and can generally be provisioned more quickly.


Question 18


How would you determine the size of the virtual warehouse used for a task?

A. Since a root task may be executed concurrently (i.e., multiple instances), it is recommended to leave some margin in the execution window to avoid missing instances of execution
B. Querying (SELECT) the size of the stream content would help determine the warehouse size; for example, if querying large stream content, use a larger warehouse size
C. If using a stored procedure to execute multiple SQL statements, it is best to test-run the stored procedure separately to size the compute resources first
D. Since the task infrastructure is based on running the task body on schedule, it is recommended to configure the virtual warehouse for automatic concurrency handling using a multi-cluster warehouse (MCW) to match the task schedule
Suggested answer: D

Explanation:

The size of the virtual warehouse for a task can be configured to handle concurrency automatically by using a multi-cluster warehouse (MCW). Tasks are designed to run their body on a schedule, and an MCW allows compute resources to scale to match the task's execution needs without manual intervention.

Reference: [COF-C02] SnowPro Core Certification Exam Study Guide
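A sketch, with hypothetical warehouse, task, and table names, of a scheduled task running on a multi-cluster warehouse:

```sql
-- Hypothetical example: a multi-cluster warehouse sized for scheduled task runs
CREATE OR REPLACE WAREHOUSE task_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3   -- scales out under concurrent load
  AUTO_SUSPEND      = 60;

CREATE OR REPLACE TASK nightly_load
  WAREHOUSE = task_wh
  SCHEDULE  = 'USING CRON 0 2 * * * UTC'
AS
  INSERT INTO target_table SELECT * FROM staging_table;

-- Tasks are created in a suspended state and must be resumed
ALTER TASK nightly_load RESUME;
```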


Question 19


The Information Schema and Account Usage Share provide storage information for which of the following objects? (Choose three.)

A. Users
B. Tables
C. Databases
D. Internal Stages
Suggested answer: B, C, D

Explanation:

The Information Schema and Account Usage Share in Snowflake provide metadata and historical usage data for various objects within a Snowflake account. Specifically, they offer storage information for Tables, Databases, and Internal Stages. These schemas contain views and table functions that allow users to query object metadata and usage metrics, such as the amount of data stored and historical activity.

Tables: The storage information includes data on the daily average amount of data in database tables.

Databases: For databases, the storage usage is calculated based on all the data contained within the database, including tables and stages.

Internal Stages: Internal stages are locations within Snowflake for temporarily storing data, and their storage usage is also tracked.
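For example, storage metrics for these three object types can be queried from the documented ACCOUNT_USAGE share views (results depend on the account):

```sql
-- Table-level storage (bytes) from the Account Usage share
SELECT table_name, active_bytes, time_travel_bytes, failsafe_bytes
FROM SNOWFLAKE.ACCOUNT_USAGE.TABLE_STORAGE_METRICS;

-- Daily average database storage
SELECT usage_date, database_name, average_database_bytes
FROM SNOWFLAKE.ACCOUNT_USAGE.DATABASE_STORAGE_USAGE_HISTORY;

-- Daily average internal stage storage
SELECT usage_date, average_stage_bytes
FROM SNOWFLAKE.ACCOUNT_USAGE.STAGE_STORAGE_USAGE_HISTORY;
```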


Question 20


What is the default File Format used in the COPY command if one is not specified?

A. CSV
B. JSON
C. Parquet
D. XML
Suggested answer: A

Explanation:

The default file format for the COPY command in Snowflake, when not specified, is CSV (Comma-Separated Values). This format is widely used for data exchange because it is simple, easy to read, and supported by many data analysis tools.
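For example (stage and table names are hypothetical), a COPY statement with no FILE_FORMAT clause parses the staged files as CSV:

```sql
-- Hypothetical example: no FILE_FORMAT specified, so CSV is assumed
COPY INTO my_table
FROM @my_stage/data/;

-- Equivalent to explicitly stating the default
COPY INTO my_table
FROM @my_stage/data/
FILE_FORMAT = (TYPE = CSV);
```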

Total 627 questions