
Snowflake COF-C02 Practice Test - Questions Answers, Page 10


Question 91


Which of the following describes external functions in Snowflake?

A. They are a type of User-Defined Function (UDF).
B. They contain their own SQL code.
C. They call code that is stored inside of Snowflake.
D. They can return multiple rows for each row received.
Suggested answer: A

Explanation:

External functions in Snowflake are a special type of User-Defined Function (UDF) that calls code executed outside of Snowflake, typically through a remote service. Unlike traditional UDFs, external functions do not contain SQL code within Snowflake; instead, they send data to an external service for processing and receive the results back.

https://docs.snowflake.com/en/sql-reference/external-functions.html#:~:text=External%20functions%20are%20user%2Ddefined,code%20running%20outside%20of%20Snowflake.
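As a sketch, an external function is declared with a reference to an API integration and a remote endpoint; the function name, integration name, and URL below are hypothetical placeholders:

```sql
-- Hypothetical names and endpoint; the API integration must already exist.
CREATE OR REPLACE EXTERNAL FUNCTION calc_sentiment(input_text VARCHAR)
    RETURNS VARIANT
    API_INTEGRATION = my_api_integration
    AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment';

-- Called like any other UDF, but the processing runs outside Snowflake.
SELECT calc_sentiment(review_text) FROM reviews;
```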

asked 23/09/2024

Question 92


What are ways to create and manage data shares in Snowflake? (Select TWO)

A. Through the Snowflake web interface (UI)
B. Through the DATA_SHARE=TRUE parameter
C. Through SQL commands
D. Through the enable__share=true parameter
E. Using the CREATE SHARE AS SELECT * TABLE command
Suggested answer: A, C

Explanation:

Data shares in Snowflake can be created and managed through the Snowflake web interface, which provides a user-friendly graphical interface for these operations. Additionally, SQL commands can be used to perform the same tasks programmatically, offering flexibility and automation capabilities.
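A sketch of the SQL route (all object and account names below are hypothetical):

```sql
-- Create a share and grant it access to the objects being shared.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- Add a consumer account to the share.
ALTER SHARE sales_share ADD ACCOUNTS = consumer_org.consumer_account;
```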


Question 93


A company's security audit requires generating a report listing all Snowflake logins (e.g., date and user) within the last 90 days. Which of the following statements will return the required information?

A. SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME FROM ACCOUNT_USAGE.USERS;
B. SELECT EVENT_TIMESTAMP, USER_NAME FROM table(information_schema.login_history_by_user())
C. SELECT EVENT_TIMESTAMP, USER_NAME FROM ACCOUNT_USAGE.ACCESS_HISTORY;
D. SELECT EVENT_TIMESTAMP, USER_NAME FROM ACCOUNT_USAGE.LOGIN_HISTORY;
Suggested answer: D

Explanation:

To generate a report listing all Snowflake logins within the last 90 days, the ACCOUNT_USAGE.LOGIN_HISTORY view should be used. This view records login attempts, both successful and unsuccessful, and retains data for 365 days, making it suitable for security audits.
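A minimal version of the required report, filtered to the 90-day window, might look like:

```sql
-- Logins (timestamp and user) over the last 90 days.
SELECT event_timestamp, user_name
FROM snowflake.account_usage.login_history
WHERE event_timestamp >= DATEADD(day, -90, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```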


Question 94


Which semi-structured file formats are supported when unloading data from a table? (Select TWO).

A. ORC
B. XML
C. Avro
D. Parquet
E. JSON
Suggested answer: D, E

Explanation:

Snowflake supports unloading data in the semi-structured formats JSON and Parquet. The other listed formats (ORC, XML, and Avro) are supported only for loading data into Snowflake, not for unloading.

https://docs.snowflake.com/en/user-guide/data-unload-prepare.html#:~:text=Supported%20File%20Formats,-The%20following%20file&text=Delimited%20(CSV%2C%20TSV%2C%20etc.)
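As a sketch (stage and table names hypothetical), unloading to each of the two supported semi-structured formats looks like:

```sql
-- Unload a table to Parquet files on a stage.
COPY INTO @my_stage/unload/orders_
FROM orders
FILE_FORMAT = (TYPE = PARQUET);

-- JSON unloading expects a single VARIANT column, so build one per row.
COPY INTO @my_stage/unload/orders_json_
FROM (SELECT OBJECT_CONSTRUCT(*) FROM orders)
FILE_FORMAT = (TYPE = JSON);
```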


Question 95


What is the purpose of an External Function?

A. To call code that executes outside of Snowflake
B. To run a function in another Snowflake database
C. To share data in Snowflake with external parties
D. To ingest data from on-premises data sources
Suggested answer: A

Explanation:

The purpose of an External Function in Snowflake is to call code that executes outside of the Snowflake environment. This allows Snowflake to interact with external services and leverage functionality that is not natively available within Snowflake, such as calling APIs or running custom code hosted on cloud services.

https://docs.snowflake.com/en/sql-reference/external-functions.html

Topic 2, Exam pool B


Question 96


A user created a new worksheet within the Snowsight UI and wants to share this with teammates.

How can this worksheet be shared?

A. Create a zero-copy clone of the worksheet and grant permissions to teammates
B. Create a private Data Exchange so that any teammate can use the worksheet
C. Share the worksheet with teammates within Snowsight
D. Create a database and grant all permissions to teammates
Suggested answer: C

Explanation:

Worksheets in Snowsight can be shared directly with other Snowflake users within the same account. This built-in sharing feature enables collaboration on SQL queries and other data manipulation tasks without cloning objects or creating additional databases.


Question 97


What is the purpose of multi-cluster virtual warehouses?

A. To create separate data warehouses to increase query optimization
B. To allow users the ability to choose the type of compute nodes that make up a virtual warehouse cluster
C. To eliminate or reduce queuing of concurrent queries
D. To allow the warehouse to resize automatically
Suggested answer: C

Explanation:

Multi-cluster virtual warehouses in Snowflake are designed to handle user and query concurrency needs. They allow additional clusters of compute resources to be allocated, either statically or dynamically, to absorb increased load and reduce or eliminate the queuing of concurrent queries.

https://docs.snowflake.com/en/user-guide/warehouses-multicluster.html#:~:text=Multi%2Dcluster%20warehouses%20enable%20you,during%20peak%20and%20off%20hours.
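As a sketch, a multi-cluster warehouse is defined by its minimum and maximum cluster counts (the warehouse name and settings below are hypothetical):

```sql
-- Runs one cluster normally; adds up to three more as concurrency grows.
CREATE WAREHOUSE analytics_wh
    WAREHOUSE_SIZE = 'MEDIUM'
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 4
    SCALING_POLICY = 'STANDARD'  -- favors starting clusters to avoid queuing
    AUTO_SUSPEND = 300
    AUTO_RESUME = TRUE;
```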


Question 98


Which statements are true concerning Snowflake's underlying cloud infrastructure? (Select THREE)

A. Snowflake data and services are deployed in a single availability zone within a cloud provider's region.
B. Snowflake data and services are available in a single cloud provider and a single region; the use of multiple cloud providers is not supported.
C. Snowflake can be deployed in a customer's private cloud using the customer's own compute and storage resources for Snowflake compute and storage.
D. Snowflake uses the core compute and storage services of each cloud provider for its own compute and storage.
E. All three layers of Snowflake's architecture (storage, compute, and cloud services) are deployed and managed entirely on a selected cloud platform.
F. Snowflake data and services are deployed in at least three availability zones within a cloud provider's region.
Suggested answer: D, E, F

Explanation:

Snowflake's architecture is designed to operate entirely on cloud infrastructure. It uses the core compute and storage services of each cloud provider, which allows it to leverage the scalability and reliability of cloud resources. Snowflake's services are deployed across multiple availability zones within a cloud provider's region to ensure high availability and fault tolerance. References: [COF-C02] SnowPro Core Certification Exam Study Guide


Question 99


Which Snowflake objects will incur both storage and cloud compute charges? (Select TWO)

A. Materialized view
B. Sequence
C. Secure view
D. Transient table
E. Clustered table
Suggested answer: A, D

Explanation:

In Snowflake, both materialized views and transient tables store data and therefore incur storage charges. Materialized views additionally consume compute credits for the automatic background maintenance that keeps them in sync with their base tables, and transient tables consume warehouse compute when they are loaded or queried. References: [COF-C02] SnowPro Core Certification Exam Study Guide
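For illustration, the two chargeable objects can be created as follows (table, view, and column names hypothetical):

```sql
-- Materialized view: stores its results (storage) and is refreshed by
-- Snowflake-managed background compute as the base table changes.
CREATE MATERIALIZED VIEW daily_totals AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date;

-- Transient table: persists data (storage, with no Fail-safe period) and
-- uses warehouse compute when loaded or queried.
CREATE TRANSIENT TABLE staging_orders (id NUMBER, amount NUMBER, order_date DATE);
```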


Question 100


A user is preparing to load data from an external stage.

Which practice will provide the MOST efficient loading performance?

A. Organize files into logical paths
B. Store the files on the external stage to ensure caching is maintained
C. Use pattern matching for regular expression execution
D. Load the data in one large file
Suggested answer: A

Explanation:

Organizing files into logical paths can significantly improve the efficiency of data loading from an external stage. Partitioning staged files by path (for example, by date or data source) lets a COPY command target a narrow set of files instead of listing and scanning the entire stage, which is particularly beneficial for large datasets.
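As a sketch of this practice (stage, path, and table names hypothetical):

```sql
-- Files are staged under date-based paths, e.g. @my_stage/2024/09/23/...
-- The COPY command then loads only that path rather than the whole stage.
COPY INTO orders
FROM @my_stage/2024/09/23/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```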

Total 716 questions