Question 45 - DP-600 discussion

You have a Fabric tenant that contains a new semantic model in OneLake.

You use a Fabric notebook to read the data into a Spark DataFrame.

You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.

Solution: You use the following PySpark expression:

df.explain()

Does this meet the goal?

A. Yes
B. No
Suggested answer: B

Explanation:

The df.explain() expression does not meet the goal. It only prints the physical plan that Spark will use to execute the query; it does not calculate min, max, mean, or standard deviation for any column. Reference: the PySpark documentation for DataFrame.explain().
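For illustration only, here is a minimal PySpark sketch using a small hypothetical DataFrame in place of the OneLake data. It contrasts df.explain(), which merely prints the query plan, with df.summary(), the standard DataFrame method that computes the requested statistics across string and numeric columns:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data standing in for the semantic model table.
df = spark.createDataFrame(
    [("A", 10.0), ("B", 20.0), ("C", 30.0)],
    ["category", "amount"],
)

# Prints the physical plan Spark will use; returns None and computes no statistics.
df.explain()

# Computes min, max, mean, and standard deviation for all string and numeric
# columns (statistics that do not apply to strings are returned as null).
df.summary("min", "max", "mean", "stddev").show()

Note that df.summary() runs the statistics as a Spark job and returns a new DataFrame, whereas df.explain() never evaluates the data at all, which is why the proposed solution does not meet the goal.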
