
MuleSoft MCIA Level 1 Maintenance Practice Test - Questions Answers, Page 12

As part of the design, a Mule application is required to call the Google Maps API to perform a distance computation. The application is deployed to CloudHub.

At a minimum, what should be configured in the TLS context of the HTTP request configuration to meet these requirements?

A. The configuration is built-in and nothing extra is required for the TLS context
B. Request a private key from Google, create a PKCS12 file with it, and add it to the keystore as part of the TLS context
C. Download the Google public certificate from a browser, generate a JKS file from it, and add it to the keystore as part of the TLS context
D. Download the Google public certificate from a browser, generate a JKS file from it, and add it to the truststore as part of the TLS context
Suggested answer: A
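
Option A holds because the Google Maps API presents a certificate issued by a public certificate authority, and those public CA roots are already present in the default truststore of the JVM that CloudHub workers run on, so the HTTP Request configuration needs no custom keystore or truststore. A minimal Java sketch of the same idea, assuming outbound internet access (the endpoint is used purely for illustration):

```java
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class DefaultTruststoreCheck {
    public static void main(String[] args) throws Exception {
        // Open an HTTPS connection using only the JVM's default truststore.
        // Public CA roots (including the one behind Google's certificate)
        // ship with the JVM, so the TLS handshake succeeds without any
        // custom TLS context configuration.
        URL url = new URL("https://maps.googleapis.com/");
        HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
        conn.connect();
        System.out.println("TLS handshake OK, cipher suite: " + conn.getCipherSuite());
        conn.disconnect();
    }
}
```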

A project team is working on an API implementation, using the RAML definition as a starting point.

The team has updated the definition to include new operations and has published a new version to Exchange. Meanwhile, another team is working on a Mule application that consumes the same API.

During development, what must the Mule application team do to take advantage of the newly added operations?

A. Scaffold the client application with the new definition
B. Scaffold the API implementation application with the new definition
C. Update the REST connector from Exchange in the client application
D. Update the API connector in the API implementation and publish to Exchange
Suggested answer: C

A company is implementing a new Mule application that supports a set of critical functions driven by a REST API-enabled claims payment rules engine hosted on Oracle ERP. As designed, the Mule application requires many data transformation operations as it performs its batch processing logic.

The company wants to leverage and reuse as many of its existing Java-based capabilities (classes, objects, data model, etc.) as possible. What approach should be considered when implementing the required data mappings and transformations between the Mule application and Oracle ERP in the new Mule application?

A. Create new metadata RAML classes in Mule from the appropriate Java objects and then perform transformations via DataWeave
B. From the Mule application, transform via the XSLT model
C. Transform by calling any suitable Java class from DataWeave
D. Invoke any of the appropriate Java methods directly, create metadata RAML classes, and then perform the required transformations via DataWeave
Suggested answer: C
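
The suggested answer works because DataWeave in Mule 4 can call existing Java classes directly, so the team's current data model and mapping logic can be reused without generating new metadata classes. Below is a hedged sketch of the kind of reusable class this implies; the package, class, and field names are hypothetical:

```java
package com.acme.claims;

import java.math.BigDecimal;
import java.util.Map;

/**
 * Hypothetical example of an existing Java capability that the Mule
 * application could reuse from DataWeave instead of rewriting the
 * mapping logic. A public static method is the simplest thing to
 * expose to a transformation.
 */
public class ClaimsMapper {

    /** Converts a raw claim record into the payable amount the rules engine expects. */
    public static BigDecimal toPayableAmount(Map<String, Object> claim) {
        BigDecimal billed = new BigDecimal(String.valueOf(claim.getOrDefault("billedAmount", "0")));
        BigDecimal deductible = new BigDecimal(String.valueOf(claim.getOrDefault("deductible", "0")));
        // Business rule kept in Java so it stays consistent with the legacy code base.
        return billed.subtract(deductible).max(BigDecimal.ZERO);
    }
}
```

From a DataWeave 2.x script, a static method like this can typically be invoked through Java interop, for example `java!com::acme::claims::ClaimsMapper::toPayableAmount(payload)`, with the exact expression depending on the project's package layout.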

An insurance company has an existing API that is currently used by customers. The API is deployed to a customer-hosted Mule runtime cluster. The load balancer used to access any APIs on the Mule cluster is configured to point only to applications hosted on the server at port 443.

The Mule application team attempted to deploy a second API using port 443, but the application will not start, and the logs show an error indicating that the address is already in use.

Which steps must the organization take to resolve this error and allow customers to access both APIs?

A. Change the base path of the HTTP listener configuration in the second API to one different from the first API
B. Set the HTTP listener configuration in both APIs to allow connections from multiple ports
C. Move the HTTP listener configurations out of the APIs and package them in a Mule domain project using port 443
D. Set the HTTP listener of the second API to use a different port than the one used in the first API
Suggested answer: C
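
The error in the question is the operating system refusing a second bind on a port that the first application already owns; packaging a single shared HTTP listener in a Mule domain project lets both APIs reuse the one listener on port 443. A small Java sketch of the underlying bind conflict (port 8443 is used here only so the example runs without elevated privileges):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortConflictDemo {
    public static void main(String[] args) throws IOException {
        // The first "application" binds the port successfully.
        ServerSocket first = new ServerSocket(8443);
        System.out.println("First listener bound to port " + first.getLocalPort());

        // The second "application" tries to bind the same port and fails,
        // mirroring the "address already in use" error in the question.
        try (ServerSocket second = new ServerSocket(8443)) {
            System.out.println("Second listener bound (unexpected)");
        } catch (IOException e) {
            System.out.println("Second bind failed: " + e.getMessage());
        } finally {
            first.close();
        }
    }
}
```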

Which of the requirements below prevent the use of Anypoint MQ in a company's network?

(Choose two answers)

A. A single message payload can be up to 15 MB
B. Payloads must be encrypted
C. The message broker must be hosted on premises
D. Support for point-to-point messaging
E. Ability for a third party outside the company's network to consume events from the queue
Suggested answer: C, D

A Mule application is designed to fulfil two requirements:

a) Process files synchronously from an FTPS server to a back-end database, using intermediary VM queues for load balancing of VM events.

b) Process a medium rate of records from a source to a target system using a batch job scope.

Considering the processing reliability requirements for the FTPS files, how should VM queues be configured for processing the files as well as for the batch job scope if the application is deployed to CloudHub workers?

A. Use CloudHub persistent queues for FTPS file processing. There is no need to configure VM queues for the batch job scope, as it uses the worker's disk for VM queueing by default.
B. Use CloudHub persistent VM queues for FTPS file processing. There is no need to configure VM queues for the batch job scope, as it uses the worker's JVM memory for VM queueing by default.
C. Use CloudHub persistent VM queues for FTPS file processing. Disable the VM queue for the batch job scope.
D. Use VM connector persistent queues for FTPS file processing. Disable the VM queue for the batch job scope.
Suggested answer: C
