ExamGecko

MuleSoft MCIA - Level 1 Practice Test - Questions Answers, Page 19


A banking company is developing a new set of APIs for its online business. One of the critical APIs is a master lookup API, a system API that uses a persistent object store. This API will be used by all other APIs to provide master lookup data.

The master lookup API is deployed on two CloudHub workers of 0.1 vCore each because there is a lot of master data to be cached. Master lookup data is stored as key-value pairs. The cache is refreshed if the key is not found in the cache.

During performance testing, it was observed that the master lookup API has a higher response time due to database query execution to fetch the master lookup data.

Because of this performance issue, the go-live of the online business is on hold, which could cause a potential financial loss to the bank.

As an integration architect, which of the options below would you suggest to resolve the performance issue?

A.
Implement HTTP caching policy for all GET endpoints for the master lookup API and implement locking to synchronize access to the object store
B.
Upgrade vCore size from 0.1 vCore to 0.2 vCore
C.
Implement HTTP caching policy for all GET endpoints for the master lookup API
D.
Add an additional CloudHub worker to provide additional capacity
Suggested answer: A
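
The HTTP Caching policy in option A is applied through API Manager rather than in application code, but the caching idea itself can be sketched inside the application. The following is a minimal, hypothetical Mule 4 fragment (the flow, config names, and SQL are illustrative, not from the source) showing a Cache scope backed by a persistent Object Store, so repeated lookups for the same key skip the database:

```xml
<!-- Hypothetical sketch: persistent Object Store backing a Cache scope -->
<os:object-store name="masterLookupStore"
                 persistent="true"
                 entryTtl="1"
                 entryTtlUnit="HOURS"/>

<ee:object-store-caching-strategy name="lookupCachingStrategy"
                                  objectStore="masterLookupStore"
                                  keyGenerationExpression="#[attributes.uriParams.key]"/>

<flow name="masterLookupFlow">
  <http:listener config-ref="httpListenerConfig" path="/lookup/{key}"/>
  <ee:cache cachingStrategy-ref="lookupCachingStrategy">
    <!-- Executed only on a cache miss -->
    <db:select config-ref="dbConfig">
      <db:sql>SELECT lookup_value FROM master_lookup WHERE lookup_key = :key</db:sql>
      <db:input-parameters><![CDATA[#[{key: attributes.uriParams.key}]]]></db:input-parameters>
    </db:select>
  </ee:cache>
</flow>
```

The locking mentioned in the answer matters because both workers refresh the same shared object store on cache misses; synchronizing that access avoids redundant database hits for the same key.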

A Mule application is required to periodically process a large data set from a back-end database to Salesforce CRM, using a batch job scope configured to properly process the higher rate of records.

The application is deployed to two CloudHub workers with no persistent queues enabled.

What is the consequence if a worker crashes during record processing?

A.
Remaining records will be processed by a new replacement worker
B.
Remaining records will be processed by the second worker
C.
Remaining records will be left unprocessed
D.
All the records will be processed from scratch by the second worker, leading to duplicate processing
Suggested answer: C
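
The batch job scope stages its record queues on the worker's local disk; without CloudHub persistent queues, that state disappears with the crashed worker, which is why the remaining records are not recovered. A minimal, hypothetical Mule 4 sketch of such a batch job (the job name, step, and logic are illustrative, not from the source):

```xml
<!-- Hypothetical sketch of the batch job scope described above -->
<flow name="dbToSalesforceFlow">
  <scheduler>
    <scheduling-strategy>
      <fixed-frequency frequency="1" timeUnit="HOURS"/>
    </scheduling-strategy>
  </scheduler>
  <db:select config-ref="dbConfig">
    <db:sql>SELECT * FROM crm_staging</db:sql>
  </db:select>
  <batch:job jobName="dbToSalesforceBatch">
    <batch:process-records>
      <batch:step name="upsertStep">
        <!-- Aggregate records into bulk upserts toward Salesforce -->
        <batch:aggregator size="200">
          <logger level="INFO" message="#['Upserting ' ++ sizeOf(payload) ++ ' records']"/>
        </batch:aggregator>
      </batch:step>
    </batch:process-records>
    <batch:on-complete>
      <logger level="INFO"
              message="#['Successful: ' ++ payload.successfulRecords ++ ', failed: ' ++ payload.failedRecords]"/>
    </batch:on-complete>
  </batch:job>
</flow>
```

The records loaded into the job's internal queues exist only on the disk of the worker that accepted them, so the second worker has no visibility into the crashed worker's in-flight batch.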

A company is designing a Mule application to consume batch data from a partner's FTPS server. The data files have been compressed and then digitally signed using PGP.

What inputs are required for the application to securely consume these files?

A.
A TLS context Keystore containing the private key and certificate for the company; the PGP public key of the partner; the PGP private key for the company
B.
A TLS context Truststore containing a public certificate for the partner's FTPS server and the PGP public key of the partner; a TLS context Keystore containing the FTP credentials
C.
A TLS context Truststore containing a public certificate for the FTPS server; the FTP username and password; the PGP public key of the partner
D.
The PGP public key of the partner; the PGP private key for the company; the FTP username and password
Suggested answer: D
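
In Mule 4 this combination is typically handled with the FTPS connector plus the Crypto module. The sketch below is written from memory, so the exact element and attribute names should be verified against the Crypto module documentation; the paths, config names, and flow are all illustrative. It reflects answer D: the partner's PGP public key (to verify the signature), the company's PGP private key (to decrypt), and the FTP credentials (held in the FTPS connection config):

```xml
<!-- Hypothetical sketch; verify names against the Mule Crypto module docs -->
<crypto:pgp-config name="pgpConfig"
                   publicKeyring="pgp/partner-pubring.gpg"
                   privateKeyring="pgp/company-secring.gpg"/>

<flow name="consumePartnerFilesFlow">
  <ftps:read config-ref="ftpsConfig" path="/outbound/master-data.csv.pgp"/>
  <!-- Decrypt with the company's private key, then validate the partner's signature -->
  <crypto:pgp-decrypt config-ref="pgpConfig"/>
  <crypto:pgp-validate config-ref="pgpConfig"/>
</flow>
```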

As part of the design, a Mule application is required to call the Google Maps API to perform a distance computation. The application is deployed to CloudHub.

At a minimum, what should be configured in the TLS context of the HTTP request configuration to meet these requirements?

A.
The configuration is built in and nothing extra is required for the TLS context
B.
Request a private key from Google, create a PKCS12 file with it, and add it to the Keystore as part of the TLS context
C.
Download the Google public certificate from a browser, generate a JKS file from it, and add it to the Keystore as part of the TLS context
D.
Download the Google public certificate from a browser, generate a JKS file from it, and add it to the Truststore as part of the TLS context
Suggested answer: A
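
Answer A works because the JVM's default trust store on CloudHub already contains the public certificate authorities that issue Google's certificates, so a plain HTTPS request configuration is enough. A minimal, hypothetical sketch (the config name, path, and query parameters are illustrative, not from the source):

```xml
<!-- No explicit TLS context needed: the default JVM trust store
     already trusts the CA that issued Google's certificate -->
<http:request-config name="googleMapsRequestConfig">
  <http:request-connection host="maps.googleapis.com" port="443" protocol="HTTPS"/>
</http:request-config>

<flow name="distanceFlow">
  <http:request config-ref="googleMapsRequestConfig"
                method="GET"
                path="/maps/api/distancematrix/json">
    <http:query-params><![CDATA[#[{
      origins: vars.origin,
      destinations: vars.destination,
      key: p('google.api.key')
    }]]]></http:query-params>
  </http:request>
</flow>
```

An explicit Truststore (option D) only becomes necessary when the server presents a certificate the JVM does not already trust, such as a self-signed or private-CA certificate.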

A project team is working on an API implementation using the RAML definition as a starting point.

The team has updated the definition to include new operations and has published a new version to Exchange. Meanwhile, another team is working on a Mule application that consumes the same API implementation.

During development, what must the Mule application team do to take advantage of the newly added operations?

A.
Scaffold the client application with the new definition
B.
Scaffold the API implementation application with the new definition
C.
Update the REST connector from Exchange in the client application
D.
Update the API connector in the API implementation and publish to Exchange
Suggested answer: C

A company is implementing a new Mule application that supports a set of critical functions driven by a REST API-enabled claims payment rules engine hosted on Oracle ERP. As designed, the Mule application requires many data transformation operations as it performs its batch processing logic.

The company wants to leverage and reuse as many of its existing Java-based capabilities (classes, objects, data model, etc.) as possible. What approach should be considered when implementing the required data mappings and transformations between the Mule application and Oracle ERP in the new Mule application?

A.
Create new metadata RAML classes in Mule from the appropriate Java objects and then perform transformations via DataWeave
B.
From the Mule application, transform via the XSLT model
C.
Transform by calling any suitable Java class from DataWeave
D.
Invoke any of the appropriate Java methods directly, create metadata RAML classes, and then perform the required transformations via DataWeave
Suggested answer: C
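
DataWeave can call static methods on existing Java classes directly through the `java!` import syntax, which is what makes option C the lightest way to reuse the company's Java assets. A hypothetical sketch (the class `com.acme.claims.ClaimMapper` and its `toErpFormat` method are illustrative, not from the source):

```xml
<ee:transform>
  <ee:message>
    <ee:set-payload><![CDATA[%dw 2.0
output application/json
// Reuse an existing Java class (illustrative name) directly from DataWeave
import java!com::acme::claims::ClaimMapper
---
payload.claims map (claim) -> ClaimMapper::toErpFormat(claim)
]]></ee:set-payload>
  </ee:message>
</ee:transform>
```

This avoids regenerating metadata or wrapping the logic in separate components: the transformation script invokes the existing Java capability in place.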

An insurance company has an existing API that is currently used by customers. The API is deployed to a customer-hosted Mule runtime cluster. The load balancer used to access any APIs on the Mule cluster is only configured to point to applications hosted on the server at port 443.

The company's Mule application team attempted to deploy a second API using port 443, but the application will not start, and the logs show an error indicating the address is already in use.

Which steps must the organization take to resolve this error and allow customers to access both APIs?

A.
Change the base path of the HTTP listener configuration in the second API to one different from the first API
B.
Set the HTTP listener configuration in both APIs to allow connections from multiple ports
C.
Move the HTTP listener configurations out of the APIs and package them in a Mule domain project using port 443
D.
Set the HTTP listener of the second API to use a different port than the one used by the first API
Suggested answer: C
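
A Mule domain project lets multiple applications deployed to the same customer-hosted runtime share one HTTP listener configuration, so both APIs can serve traffic on port 443 under different base paths. A hypothetical sketch of the domain config (names and keystore details are illustrative, not from the source):

```xml
<!-- mule-domain-config.xml in the shared domain project (sketch) -->
<domain:mule-domain
    xmlns:domain="http://www.mulesoft.org/schema/mule/ee/domain"
    xmlns:http="http://www.mulesoft.org/schema/mule/http"
    xmlns:tls="http://www.mulesoft.org/schema/mule/tls">

  <!-- Single listener on 443, shared by every app deployed to this domain -->
  <http:listener-config name="sharedHttpsListener">
    <http:listener-connection host="0.0.0.0" port="443" protocol="HTTPS">
      <tls:context>
        <tls:key-store path="keystore.jks"
                       keyPassword="${key.password}"
                       password="${store.password}"/>
      </tls:context>
    </http:listener-connection>
  </http:listener-config>
</domain:mule-domain>
```

Each API then references the shared config by name, e.g. `<http:listener config-ref="sharedHttpsListener" path="/api-one/*"/>` in one application and `path="/api-two/*"` in the other, so only one process binds the port.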

Which of the requirements below would prevent the usage of Anypoint MQ in a company's network?

(Choose two answers)

A.
A single message payload can be up to 15 MB
B.
Payloads must be encrypted
C.
The message broker must be hosted on premises
D.
Support for point-to-point messaging
E.
Ability for a third party outside the company's network to consume events from the queue
Suggested answer: C, D

A Mule application is designed to fulfil two requirements:

a) Processing files asynchronously from an FTPS server to a back-end database, using VM intermediary queues for load balancing VM events

b) Processing a medium rate of records from a source to a target system using a batch job scope

Considering the processing reliability requirements for the FTPS files, how should VM queues be configured for processing files, as well as for the batch job scope, if the application is deployed to CloudHub workers?

A.
Use CloudHub persistent queues for FTPS file processing. There is no need to configure VM queues for the batch job scope, as it uses the worker's disk for VM queueing by default
B.
Use CloudHub persistent VM queues for FTPS file processing. There is no need to configure VM queues for the batch job scope, as it uses the worker's JVM memory for VM queueing by default
C.
Use CloudHub persistent VM queues for FTPS file processing. Disable the VM queue for the batch job scope
D.
Use VM connector persistent queues for FTPS file processing. Disable the VM queue for the batch job scope
Suggested answer: C
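
On CloudHub, transient VM queues live in worker memory, so declaring the file queue as persistent is what preserves in-flight events across worker restarts; the batch job scope manages its own internal queueing and does not go through the VM connector. A hypothetical sketch of the persistent VM queue wiring (queue, flow, and config names are illustrative, not from the source):

```xml
<!-- Hypothetical sketch: persistent VM queue between file intake and DB load -->
<vm:config name="vmConfig">
  <vm:queues>
    <vm:queue queueName="ftpsFilesQueue" queueType="PERSISTENT"/>
  </vm:queues>
</vm:config>

<flow name="publishFileFlow">
  <ftps:listener config-ref="ftpsConfig" directory="/inbound">
    <scheduling-strategy>
      <fixed-frequency frequency="60" timeUnit="SECONDS"/>
    </scheduling-strategy>
  </ftps:listener>
  <vm:publish config-ref="vmConfig" queueName="ftpsFilesQueue"/>
</flow>

<flow name="loadToDatabaseFlow">
  <vm:listener config-ref="vmConfig" queueName="ftpsFilesQueue"/>
  <logger level="INFO" message="Loading file event into the back-end database"/>
</flow>
```

Because the queue is persistent, events published but not yet consumed survive a worker restart, and the two workers also balance consumption of the shared queue between them.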

Which Exchange asset type represents configuration modules that extend the functionality of an API and enforce capabilities such as security?

A.
Rulesets
B.
Policies
C.
REST APIs
D.
Connectors
Suggested answer: B
Total 244 questions