
Snowflake COF-C02 Practice Test - Questions Answers, Page 10


Which of the following describes external functions in Snowflake?

A. They are a type of User-defined Function (UDF).
B. They contain their own SQL code.
C. They call code that is stored inside of Snowflake.
D. They can return multiple rows for each row received.
Suggested answer: A

Explanation:

External functions in Snowflake are a special type of User-Defined Function (UDF) that call code executed outside of Snowflake, typically through a remote service. Unlike traditional UDFs, external functions do not contain SQL code within Snowflake; instead, they interact with external services to process data.

https://docs.snowflake.com/en/sql-reference/external-functions.html#:~:text=External%20functions%20are%20user%2Ddefined,code%20running%20outside%20of%20Snowflake.
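As a sketch of the pattern, an external function is defined on top of an API integration and then called like any scalar UDF. This is a hedged, illustrative example: the integration name, role ARN, endpoint URL, and the `sentiment_score` function are placeholders, not from the Snowflake documentation.

```sql
-- Hypothetical setup: an API integration plus an external function
-- that forwards each input row to a remote service (all names illustrative).
CREATE OR REPLACE API INTEGRATION my_api_integration
  API_PROVIDER = aws_api_gateway
  API_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_api_role'
  API_ALLOWED_PREFIXES = ('https://example.execute-api.us-east-1.amazonaws.com/')
  ENABLED = TRUE;

CREATE OR REPLACE EXTERNAL FUNCTION sentiment_score(input STRING)
  RETURNS FLOAT
  API_INTEGRATION = my_api_integration
  AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment';

-- Called like any scalar UDF; the code itself executes outside Snowflake.
SELECT review_text, sentiment_score(review_text) FROM reviews;
```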

What are ways to create and manage data shares in Snowflake? (Select TWO).

A. Through the Snowflake web interface (UI)
B. Through the DATA_SHARE=TRUE parameter
C. Through SQL commands
D. Through the enable_share=true parameter
E. Using the CREATE SHARE AS SELECT * TABLE command
Suggested answer: A, C

Explanation:

Data shares in Snowflake can be created and managed through the Snowflake web interface, which provides a user-friendly graphical interface for various operations. Additionally, SQL commands can be used to perform these tasks programmatically, offering flexibility and automation capabilities.
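The SQL route might look like the following hedged sketch (the share, database, table, and consumer account names are all placeholders):

```sql
-- Hypothetical example of creating and managing a share with SQL commands.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- Add a consumer account (organization and account names illustrative).
ALTER SHARE sales_share ADD ACCOUNTS = consumer_org.consumer_account;
```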

A company's security audit requires generating a report listing all Snowflake logins (e.g., date and user) within the last 90 days. Which of the following statements will return the required information?

A. SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME FROM ACCOUNT_USAGE.USERS;
B. SELECT EVENT_TIMESTAMP, USER_NAME FROM table(information_schema.login_history_by_user())
C. SELECT EVENT_TIMESTAMP, USER_NAME FROM ACCOUNT_USAGE.ACCESS_HISTORY;
D. SELECT EVENT_TIMESTAMP, USER_NAME FROM ACCOUNT_USAGE.LOGIN_HISTORY;
Suggested answer: D

Explanation:

To generate a report listing all Snowflake logins within the last 90 days, the ACCOUNT_USAGE.LOGIN_HISTORY view should be used. This view provides information about login attempts, including successful and unsuccessful logins, and is suitable for security audits.
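Note that the answer as written returns the view's full retained history; to restrict it to the last 90 days specifically, a filter along these lines could be added (a sketch, assuming the standard SNOWFLAKE.ACCOUNT_USAGE schema):

```sql
-- Restrict the login report to the last 90 days.
SELECT event_timestamp, user_name
FROM snowflake.account_usage.login_history
WHERE event_timestamp >= DATEADD(day, -90, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```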

Which semi-structured file formats are supported when unloading data from a table? (Select TWO).

A. ORC
B. XML
C. Avro
D. Parquet
E. JSON
Suggested answer: D, E

Explanation:

Snowflake supports unloading data in two semi-structured file formats: JSON and Parquet. These formats allow for efficient storage and querying of semi-structured data, which can be loaded directly into Snowflake tables without requiring a predefined schema.

https://docs.snowflake.com/en/user-guide/data-unload-prepare.html#:~:text=Supported%20File%20Formats,-The%20following%20file&text=Delimited%20(CSV%2C%20TSV%2C%20etc.)
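Unloading to each format might look like this hedged sketch (the stage, table, and path names are illustrative; for JSON, Snowflake unloads a single column, so rows are wrapped with OBJECT_CONSTRUCT):

```sql
-- Hypothetical unload to Parquet files on a named stage.
COPY INTO @my_stage/unload/orders_parquet_
  FROM orders
  FILE_FORMAT = (TYPE = PARQUET);

-- Hypothetical unload to JSON: each row becomes one JSON object.
COPY INTO @my_stage/unload/orders_json_
  FROM (SELECT OBJECT_CONSTRUCT(*) FROM orders)
  FILE_FORMAT = (TYPE = JSON);
```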

What is the purpose of an External Function?

A. To call code that executes outside of Snowflake
B. To run a function in another Snowflake database
C. To share data in Snowflake with external parties
D. To ingest data from on-premises data sources
Suggested answer: A

Explanation:

The purpose of an External Function in Snowflake is to call code that executes outside of the Snowflake environment. This allows Snowflake to interact with external services and leverage functionalities that are not natively available within Snowflake, such as calling APIs or running custom code hosted on cloud services.

https://docs.snowflake.com/en/sql-reference/external-functions.html

Topic 2, Exam pool B

A user created a new worksheet within the Snowsight UI and wants to share this with teammates.

How can this worksheet be shared?

A. Create a zero-copy clone of the worksheet and grant permissions to teammates
B. Create a private Data Exchange so that any teammate can use the worksheet
C. Share the worksheet with teammates within Snowsight
D. Create a database and grant all permissions to teammates
Suggested answer: C

Explanation:

Worksheets in Snowsight can be shared directly with other Snowflake users within the same account. This feature allows for collaboration and sharing of SQL queries or Python code, as well as other data manipulation tasks.

What is the purpose of multi-cluster virtual warehouses?

A. To create separate data warehouses to increase query optimization
B. To allow users the ability to choose the type of compute nodes that make up a virtual warehouse cluster
C. To eliminate or reduce queuing of concurrent queries
D. To allow the warehouse to resize automatically
Suggested answer: C

Explanation:

Multi-cluster virtual warehouses in Snowflake are designed to manage user and query concurrency needs. They allow for the allocation of additional clusters of compute resources, either statically or dynamically, to handle increased loads and reduce or eliminate the queuing of concurrent queries.

https://docs.snowflake.com/en/user-guide/warehouses-multicluster.html#:~:text=Multi%2Dcluster%20warehouses%20enable%20you,during%20peak%20and%20off%20hours.
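A multi-cluster warehouse is configured with minimum and maximum cluster counts; something like the following sketch (warehouse name and sizing are illustrative choices, not prescribed values):

```sql
-- Hypothetical multi-cluster warehouse in auto-scale mode: Snowflake
-- starts additional clusters (up to MAX_CLUSTER_COUNT) as concurrent
-- queries begin to queue, and retires them when load drops.
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;
```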

Which statements are true concerning Snowflake's underlying cloud infrastructure? (Select THREE).

A. Snowflake data and services are deployed in a single availability zone within a cloud provider's region.
B. Snowflake data and services are available in a single cloud provider and a single region; the use of multiple cloud providers is not supported.
C. Snowflake can be deployed in a customer's private cloud using the customer's own compute and storage resources for Snowflake compute and storage.
D. Snowflake uses the core compute and storage services of each cloud provider for its own compute and storage.
E. All three layers of Snowflake's architecture (storage, compute, and cloud services) are deployed and managed entirely on a selected cloud platform.
F. Snowflake data and services are deployed in at least three availability zones within a cloud provider's region.
Suggested answer: D, E, F

Explanation:

Snowflake's architecture is designed to operate entirely on cloud infrastructure. It uses the core compute and storage services of each cloud provider, which allows it to leverage the scalability and reliability of cloud resources. Snowflake's services are deployed across multiple availability zones within a cloud provider's region to ensure high availability and fault tolerance. References: [COF-C02] SnowPro Core Certification Exam Study Guide

Which Snowflake objects will incur both storage and cloud compute charges? (Select TWO).

A. Materialized view
B. Sequence
C. Secure view
D. Transient table
E. Clustered table
Suggested answer: A, D

Explanation:

In Snowflake, both materialized views and transient tables will incur storage charges because they store data. They will also incur compute charges when queries are run against them, as compute resources are used to process the queries. References: [COF-C02] SnowPro Core Certification Exam Study Guide
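As a hedged sketch of the two billed object types (table, view, and column names are illustrative; note that materialized views also require an edition that supports them):

```sql
-- A transient table stores data (storage cost) and is queried with a
-- running warehouse (compute cost); it skips Fail-safe, unlike a
-- permanent table.
CREATE TRANSIENT TABLE staging_orders (
  id       NUMBER,
  order_ts TIMESTAMP,
  amount   NUMBER
);

-- A materialized view stores its precomputed results (storage cost)
-- and uses background compute to keep those results refreshed as the
-- base table changes.
CREATE MATERIALIZED VIEW daily_totals AS
  SELECT TO_DATE(order_ts) AS order_day, SUM(amount) AS total
  FROM staging_orders
  GROUP BY TO_DATE(order_ts);
```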

A user is preparing to load data from an external stage

Which practice will provide the MOST efficient loading performance?

A. Organize files into logical paths
B. Store the files on the external stage to ensure caching is maintained
C. Use pattern matching for regular expression execution
D. Load the data in one large file
Suggested answer: A

Explanation:

Organizing files into logical paths can significantly improve the efficiency of data loading from an external stage. This practice helps in managing and locating files easily, which can be particularly beneficial when dealing with large datasets or complex directory structures.
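For example, staging files under date-based paths lets a COPY command target a narrow slice of the stage instead of scanning everything. A hedged sketch, with stage, table, and path names as placeholders:

```sql
-- Hypothetical layout: files staged under /orders/<year>/<month>/<day>/
-- so a load can point at exactly one day's files.
COPY INTO orders
  FROM @my_stage/orders/2024/06/15/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```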

Total 716 questions