IICS Interview Questions and Answers

Mindtree Questions and Answers:-

1. How many transformations can we use in a Synchronization task? And can we add multiple objects as a source?
Answer: - Yes, multiple database objects can be added as a source. Rules and guidelines for a multiple-object database:
 

Use the following rules and guidelines when configuring a multiple-object database:

  • All objects must be available through the same source connection. All database tables in a multiple-object source must have valid relationships defined by key columns or user-defined join conditions.
  • When you add multiple database tables as sources for a task, you can either create relationships or user-defined joins, but not both.
  • The Data Synchronization Task wizard removes a user-defined join under the following conditions:
    o   You remove one of two remaining database tables from the list of sources for the task.
    o   You change the source connection from database to flat file or Salesforce.
2. Have you worked on JSON Input?
Answer: - No
 
3. How do you read the XML file?
Answer: - Check the interview Q&A on the blog.


4. What are the operations we have in a Synchronization task?
Answer: - We have 4 operations in a Synchronization task:

  • Insert
  • Update
  • Upsert
  • Delete

5) Who assigns the task to you?
Answer: - Team Leader

6) Difference between Linear Taskflow and Taskflow?
Answer: - Check the printout.

7) / 8) We have a file in the system and created a one-time job; from next time onward we don't want to touch that file again, so how will we use that file again?

Answer: - By scheduling the job.


9) Can we run an API without the Secure Agent?
Answer: - Yes, we can develop and run the API without the Secure Agent.

10) When you create an object, where is it stored? Or when you develop a mapping, where is it saved?

Answer: - In the Informatica Cloud repository.

11) What cloud infrastructures do we have in the market?

GCP, AWS, Azure: one of these infrastructures should be there to run IICS.

Answer: -

The top key players in the cloud infrastructure services market include:


● Amazon.com
● Cisco Systems
● IBM
● Alphabet
● Microsoft
● Akamai Technologies
● Google
● Hewlett Packard
● VMware
● Yahoo Inc.
● Injazat Data Systems
● Malomatia

12) Which Cloud environment are you using in IICS?

Answer: - Data Integration.

13) Can we run the job without the Informatica Secure Agent?

-> For on-premise sources (flat files & relational DBs), the Secure Agent is used while establishing the connection. (That is the answer they were expecting.)

Answer: -

Informatica Cloud (IICS) Architecture

Contents

1. Introduction

2. Informatica Cloud as an iPaaS solution

3. Informatica Cloud Services

o   3.1 Administrator

o   3.2 Data Integration

o   3.3 Monitor

o   3.4 Discovery IQ

o   3.5 Application Integration

o   3.6 Application Integration Console 

o   3.7 API Manager

o   3.8 API Portal

4. Informatica Cloud Repository

5. Informatica Cloud Secure Agent

6. IICS Architecture

 

 

1. Introduction

IICS stands for Informatica Intelligent Cloud Services, popularly known as Informatica Cloud. It is a next-generation iPaaS (Integration Platform as a Service) solution offered by Informatica that allows you to exchange data between any combination of on-premise and cloud-based applications, within an individual organization or across multiple organizations.

Before we get to the IICS Architecture, let us discuss each individual component of the architecture in detail.

 

2. Informatica Cloud as an iPaaS solution

Informatica Cloud satisfies the core requirements of an iPaaS-based application, which are:

  • Access from any machine with Internet access and a web browser. When you access the IICS application, your browser connects to Informatica Cloud Services over secure HTTP.
  • A common platform with multiple wizards that guide you to build and test your integration tasks.
  • Enterprise orchestration of the processes built.
  • Centralized monitoring capabilities to track the progress and results of jobs.
  • Preconfigured integration templates for various recurring business use cases.
  • Real-time data integration and bulk data integration services.
  • Connectors to connect to external data sources and a connector SDK to build custom connectors.

 

3. Informatica Cloud Services

Informatica Intelligent Cloud Services offers a suite of cloud-native services that enable organizations to perform data integration, application and API integration, and data management across private, public, hybrid, or multi-cloud environments.

The services offered to you depend on the license of your organization.

 

3.1 Administrator

  • Create Users, User Groups and assign them roles.
  • Define runtime environments.
  • Define connections for Source and Targets.
  • Use add-on connectors for connecting to cloud applications.
  • Configure schedules to run integration tasks.

 

3.2 Data Integration

  • Develop batch jobs using a variety of tasks such as Mapping tasks, Synchronization tasks, Replication tasks, Taskflows, etc.
  • Execute on demand, on a schedule, or on real-time events.
  • Import and export IICS tasks.

 

3.3 Monitor

  • Allows you to monitor tasks and taskflows.
  • Monitor session logs, job status and error logs.
  • Allows you to stop or restart a job.

 

3.4 Discovery IQ

  • Provides a comprehensive view of your product usage.
  • Provides contextual recommendations and best practices.

3.5 Application Integration

  • Allows you to perform real-time web service-based integration of processes, applications, and systems.
  • Allows you to design, integrate, and implement business processes spanning different cloud and on-premise applications. 

 

3.6 Application Integration Console 

  • Provides monitoring and management capabilities for Application Integration

 

3.7 API Manager

  • Manages the APIs for services and processes built in Informatica CAI.
  • Seamless integration with Informatica CAI.
  • Access to APIs description and metadata.
  • Policy management.
  • API analytics.

 

3.8 API Portal

  • Provides users with secure access to the created APIs.
  • Enables users to interact with managed APIs and view API usage analytics.

 

 

 

4. Informatica Cloud Repository

Informatica Cloud Services includes the IICS repository that stores various information about tasks. As you create, schedule, and run tasks, all the metadata information is written to IICS repository.

The information that gets stored in the IICS repository includes:

  • Source and Target Metadata: The metadata of source and target objects, which includes field names, data types, precision, scale, and any other information about the source and target objects, is stored in the IICS repository.
  • Mappings: The metadata of mappings and their transformation rules is stored in the IICS repository.
  • Connection Information: The repository stores, in encrypted format, the connection information that enables you to connect to specific source and target systems.
  • Schedules: The repository stores the schedule details of the tasks configured in IICS.
  • Logging and Monitoring Information: The status of all the jobs triggered in the org is stored in the IICS repository.

 

Unlike on-premise tools such as PowerCenter, where the repository is maintained by the user, the IICS repository is maintained completely by Informatica. Hence users do not have any direct access to the metadata stored in the Informatica Cloud repository.

 

5. Informatica Cloud Secure Agent

The Informatica Cloud Secure Agent is a lightweight, self-upgrading program that runs on a server inside your network. The Secure Agent is available for Windows and Linux servers.

  • It allows you to access all your local resources, such as databases, files, and applications, that reside behind your company firewall.
  • It is responsible for secure communication between the Informatica Cloud repository, other cloud-based applications, and your local resources.
  • The Informatica Cloud Secure Agent is the local agent on your server that runs the tasks and is responsible for moving data from source to target.



  • During this process, your source application data never gets staged on or routed through Informatica Cloud servers. Your data remains secure behind your company firewall.

When you install the Secure Agent on your server, you need to register it with the Informatica Cloud repository using the unique registration code provided for your organization account. You can install only one agent per machine. Once a secure agent is registered, a Runtime Environment is created in your IICS Org.
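For example, on a Linux server the registration is typically done from the command line after the install; a hedged sketch (the script names follow the standard install layout, but confirm the exact arguments against your agent version's documentation):

cd <Secure Agent installation directory>/apps/agentcore
./infaagent startup
./consoleAgentManager.sh configure <IICS username> <install token>

The install token is generated from Administrator > Runtime Environments in your IICS Org.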

 

However, you can install the same agent on multiple machines, for example on a Windows and a Linux machine within your network, and register each with the IICS repository. Each agent can only access the resources available on its own machine, and you need to use that agent's specific Runtime Environment in the tasks you build to access them.

Additional agents require an additional license fee.

6. IICS Architecture

All the components discussed so far constitute the Informatica Cloud Architecture.

 

  • Informatica Cloud as an iPaaS solution.
  • Informatica Cloud Services.
  • Informatica Cloud Repository.
  • Informatica Cloud Secure Agent.

 


Informatica Cloud Architecture


Let's walk through, step by step, the role of each component in the IICS Architecture.

  1. As an end user, you access Informatica Cloud Services via a browser for design time and administration.
  2. During design time, as you start developing mappings, Informatica Cloud Services talks to your actual data sources, residing on-premise or in the cloud, via the Secure Agent.
  3. The source and target metadata, consisting of table names, fields, data types, etc., is sent back to the web browser via the Informatica Cloud servers.
  4. Once you save the design, the metadata is stored in the Informatica Cloud repository.
  5. When you run the task, the Secure Agent extracts the task metadata XML from the IICS repository onto the Secure Agent machine. It parses the XML data, then connects to your data sources and processes the data.
  6. The Secure Agent extracts the metadata XML file into the below location on your Secure Agent machine: <Informatica Cloud Secure Agent installed Directory>\apps\Data_Integration_Server\data\metadata
  7. Once the data processing is done via the IICS task, the Secure Agent sends the statistics back to the Informatica Cloud repository.
  8. The end user can access these statistics via the Monitor service through the web browser.

Apart from the data preview, Informatica only reads your metadata. At any given point in time, your data always resides behind your company firewall.

14) Main differences between IICS and PowerCenter?

Ans: - PowerCenter vs. Cloud (IICS) - IDWBI; see the earlier printout.


1) With PowerCenter you can distribute workload over as many servers as you want to. Meaning you could distribute heavy workloads of billions of records over e.g. five different servers.

With IICS (as far as I know) you are restricted to four CPU cores per IICS license.


2) Workflows have many different features in both worlds. You cannot simply move a "workflow" from one world to the other; you have to manually rebuild quite a few things.

3) Connectivity is another area of huge differences. IICS offers more connectors, but many connectors for PowerCenter offer better performance. And quite a few connectors exist in one world but not in the other one.

Sr. No | Informatica PowerCenter | Informatica Cloud Data Integration
1 | Setup can take more than a week because configuration, service approvals, and so on take time. | Everything can be set up and configured in one day.
2 | There is a need to install client tools such as the Informatica PowerCenter Repository Manager, Workflow Manager, Workflow Monitor, and Informatica Designer. | There is no need to install any client tools, because everything is managed and controlled by the Informatica agent.
3 | Versioning is one of the features of PowerCenter. | There is no versioning feature in Informatica Cloud.
4 | We can perform debugging up to the last transformation in the Informatica PowerCenter Designer. | In Informatica Cloud, we can perform debugging only with the Expression transformation.
5 | It stores the metadata in the Informatica repository, but the metadata cannot be accessed from anywhere. | It also stores the metadata in the cloud, and the metadata can be accessed from anywhere.
6 | The developer can directly run SQL scripts to fetch data from the repository. | The developer doesn't have permission to fetch data from the cloud repository, because Informatica maintains its own resources, which are created by Informatica itself.
7 | If the developer needs to use a new service, he may have to wait a week or more, even months, for approval, configuration, setup, and so on. | For any new service, a developer can take a 30-day free trial, and for full use of a service he can buy whatever service he wants.
8 | In the PowerCenter Designer, we can see all the mapped fields without opening the object. | Here, we can check the field mapping only in the target designer by selecting and maximizing the property window.
9 | We can compare the latest and older versions of the same job. | We cannot compare the latest and older versions of the same job.
10 | We can revert the latest job to an old version. | We cannot revert the latest job to an old version.
11 | We can see the iconic views of objects in the Designer and Workflow Manager. | We cannot see the iconic views of objects in the designer and workflow.
12 | It does not support API services and API integration. | We can run API-related services and jobs.
13 | The developer can access the repository metadata in PowerCenter. | The developer can't access the Informatica Cloud repository.

 

 

150 Informatica Questions and Answers: -

IICS Q&A: -

Set 1

Question 1: Incorrect

__________ is an object that defines the interaction between the Process Designer and the service, so the connector can send and receive data.


·        Service Connector (Correct)

·        Process Designer (Incorrect)

·        Process Connector

·        Service Designer



Explanation

Service Connector is an object that defines the interaction between Process Designer and the service, so the connector can send and receive data.


Question 2: Incorrect

Select all the correct statements w.r.t the task flows and Mappings in Cloud Data Integration

·        Data  can be passed from one Task to another in a task flow (Correct)

·        In Cloud Data Integration, the input parameter can be overridden by overriding the value using parameter file. (Correct)

·        If Allow override at runtime is checked, then the value set from the task flow would not take effect. (Correct)

·         In Cloud Data Integration, the input parameter can be overridden by setting the value for the field in Taskflow using an assignment step (Correct)

Explanation

All the listed statements are correct

 

Question 3: Incorrect

Select all the mandatory options which should be provided for creating a REST V2 connection


·        Authentication (Correct)

·        Auth UserID

·         Swagger File Path(Correct)

·        Keystore File Path


 

Explanation

Connection Name, Type, Runtime Environment, Authentication, and Swagger File Path are the required options to be filled in to create a REST V2 connection.

 

 

 

 

Question 4: Incorrect

You are using a Web Services Consumer transformation in a Data Integration mapping; select all the options required to create a WS Consumer connection.


·              WSDL URL(Correct)

·        EndPoint URL(Correct)

·        Business Service(Correct)

·        Source Operation(Correct)


Explanation

All the options which are listed are required to create a WS consumer connection.

 

Question 5: Incorrect

Pre-Processing Commands can be used in

Note: There can be multiple correct answers to this question


·        Replication Task(Correct)

·        Mapping Task(Correct)

·        Synchronization Task(Correct)

·        PowerCenter Task (Incorrect)


 

Question 6: Incorrect

You have a single Data Synchronization job reading and writing data to Salesforce objects using the Salesforce Bulk API. As per the requirement, you need to process rows in batches of 5,000 rows each. Select the correct answer.

·        In the Advanced Salesforce Options, update the required rows to be processed in the Target Batch size option ( i.e. Target Batch size = 5000)(Incorrect)

·        Salesforce Standard API can only process data in Batches, Bulk API process all the data as a single batch

·        Batch size for Bulk API can only be configured on the Informatica Cloud Secure Agent as a custom DTM property(Correct)

·        In the Advanced Source Properties, configure the Max Rows per Batch property to 5000 (i.e. Max Rows per Batch property = 5000)

Explanation

BULK API batch size can be set on the Informatica Cloud Secure Agent as a custom DTM property. The property is applicable on all the BULK API sessions from the updated Secure Agent. By default, Bulk API batch size is set to 10,000 in any Informatica Cloud task.  Include the below custom property values in the required Secure Agent for the Batch size when using Salesforce Bulk API.

SalesForceBulkBatchSize = 5000

 

Question 7: Incorrect

You can use the Cloud internal variables ($LastRunDate/$LastRunTime) directly in a Saved Query's SQL SELECT statement


·        True(Incorrect)

·        False(Correct)

Explanation

Saved Query does not support the Cloud internal variables directly in the SQL SELECT statement.
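As an illustration of where the variables are supported (a hedged sketch; the field name LastModifiedDate is an invented example): instead of a saved query, reference the variable in the task's data filter, for example:

LastModifiedDate > $LastRunTime

so that each run picks up only the rows changed since the previous successful run.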

 

Question 8: Incorrect

Select all the correct statements w.r.t Secure Agents

·        Multiple secure agents can be Installed on Linux(Correct)

·        Multiple secure agents can be installed on Windows(Incorrect)

·        Secure Agents are tied to the ORGS(Correct)

·        One Secure Agent can be connected to multiple environments like Prod and Sandbox environments

Explanation

On a Windows machine, only one secure agent can be installed. On Linux, users can install multiple secure agents with different user accounts/profiles. In Informatica Cloud, you cannot use one secure agent for both Production and Sandbox orgs. Agents are tied to the orgs, hence you can use one secure agent either on Production or on Sandbox, but not on both at the same time.

 

Question 9: Incorrect

A process can be invoked from the following

Note: There can be multiple correct answers to this question

·        From a Guide(Correct)

·        Externally via SOAP or REST(Correct)

·        From Operational Manager(Incorrect)

·        From another process(Correct)

Explanation

A process can be invoked from the following

From a Guide
From another process

Externally via SOAP or REST

 

 

Question 10: Correct

If, in a Linear task flow, you have configured a post-processing command and enabled Stop on Warning, and your task has executed with a warning, select the correct statement

·        Post-processing command associated with the task will not be invoked as the task has completed with a warning and stop on warning is enabled.

·        The Task and the rest of the task flow will be stopped as stop on warning is enabled, the task flow will execute the rest of the task flow and the Post-processing command if the task flow is released manually from the cloud GUI

·        Irrespective of the task status, the Post-processing command will be executed as a stop on warning is related to the task execution only.

·        Post-processing command associated with the task will be invoked. But the rest of the flow will be stopped as stop on warning has been enabled in the Linear task flow.(Correct)

Explanation

Since the task has run with a warning, the post-processing command associated with it will still be invoked. But after that, the rest of the flow will be stopped, as Stop on Warning has been enabled in the Linear task flow.

 

Question 11: Incorrect

Which Data Integration transformation is present in PowerCenter but not in Cloud?

·        Web Services

·        Data Masking(Incorrect)

·        Update Strategy(Correct)

·        Normalizer

Explanation

The Update Strategy transformation is present in PowerCenter but not in Cloud; you need to use the Operation option in the Target to flag records as Insert, Update, Delete, Upsert, or Data Driven.

 

Question 12: Incorrect

How can guides be accessed in the Salesforce1 mobile app?
Note: There can be multiple correct answers to this question

·        Through Salesforce1 Application connector(Incorrect)

·        As a publisher action(Correct)

·        From the record detail page(Incorrect)

·        From the main navigation menu(Correct)

Explanation

Guides cannot be accessed through Salesforce1 Application connector and record detail page

 

Question 13: Correct

When you reset target tables for a replication task, the indexes on the target tables will be dropped and recreated

·        True

·        False(Correct)

Explanation

The replication task drops the indexes and the target table. You must create the indexes again after the tables are recreated by the replication task

 

Question 14: Correct

Amazon S3 only supports the REST web service Interface

·        False(Correct)

·        True

Explanation

Amazon S3 supports both SOAP and REST web service Interfaces

 

Question 15: Correct

Guides can be designed in the Guide Designer only when processes are published

·        False

·        True(Correct)

Explanation

Guides can be designed in the Guide Designer only when processes are published.

 

Question 16: Incorrect

Select all the properties of a Linear Taskflow

·        Custom email notification(Correct)

·        Runtime Environment(Incorrect)

·        Stop on Warning(Correct)

·        Location(Correct)

Explanation

Runtime Environment is not defined in a Linear Taskflow.

 

Question 17: Correct

To include all incoming fields from an upstream transformation except those with the BigInt data type, what should you do?

·        All incoming fields are included by default; in the field mapping section, ignore the fields with the BigInt data type from passing to the downstream transformations

·        Field exclusions by data type are only supported in Dynamic Mappings

·        Delete the Include rule, edit the Exclude rule, select Fields by Datatype, and select BigInt

·        Configure two field rules. First, use the All Fields rule to include all fields. Then, create a Fields by Datatype rule to exclude fields by data type and select BigInt as the data type to exclude. (Correct)

Explanation

Configure two field rules. First, use the All Fields rule to include all fields. Then, create a Fields by Datatype rule to exclude fields by data type and select BigInt as the data type to exclude.

 

Question 18: Correct

Bundles are versioned

·        False

·        True(Correct)

Explanation

The initial version number of the Bundle defaults to 1.0. The new version number must be greater than the current version in the format #.#, for example, 1.2.

 

Question 19: Incorrect

Select all the tasks which can be performed in Mapping Configuration task but not in Data Synchronization task

·        Read and Write operation(Incorrect)

·        Connected Lookup(Correct)

·        Connection Parameterization(Correct)

·        Push down Optimization(Correct)

Explanation

Pushdown Optimization, Connected Lookup, and Connection Parameterization are supported in the Mapping Configuration task only.

 

 

Question 20: Incorrect

You cannot run multiple instances of a Data Replication task simultaneously

·        True(Correct)

·        False(Incorrect)

Explanation

You cannot run multiple instances of a Data Replication task simultaneously. If you run a Data Replication task that is already running, the Data Replication task fails with the following error: Data replication task failed to run. Another instance of a task <Data Replication task name> is currently replicating the same objects.

 

Question 21: Incorrect

By default, the Informatica Cloud secure agent processes only _____ tasks at a time

·        2(Correct)

·        200

·        2000(Incorrect)

·        20

Explanation

By default, the Informatica Cloud secure agent processes only 2 tasks at a time; you can override the default limit by updating the maxDTMProcesses parameter on the secure agent.
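A hedged sketch of the override (the exact service and property grouping are assumptions based on common setups, so verify in your org): in Administrator > Runtime Environments, edit the agent and add a custom configuration property for the Data Integration Server service, for example:

Service: Data Integration Server, Type: Tomcat, Name: maxDTMProcesses, Value: 5

This would allow up to 5 concurrent task processes on that agent.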

 

Question 22: Incorrect

Select all the valid services available in Data Integration Service

·        Discovery IQ(Correct)

·        Operational Insights(Correct)

·        API Integration(Incorrect)

·        Application Integration(Correct)

Explanation

API Integration is not a valid service

 

Question 23: Incorrect

Select the correct statements w.r.t Data Decision steps

·        Allows a guide to take different paths depending on the value of a field(Correct)

·        Each condition has a branch(Correct)

·        The order of the branches is not important(Incorrect)

·        Tests for a condition(Correct)

Explanation

The order of the branches is important in Decision steps

 

Question 24: Incorrect

_________ should be provided when you configure the decryption action for a mass ingestion task

·        Key pair

·        key ID(Incorrect)

·        Key Ring

·        key passphrase(Correct)

Explanation

When you configure the decryption action for a mass ingestion task, you provide a key passphrase. The key passphrase is the private key passphrase of the receiver who decrypts the file.

 

Question 25: Incorrect

Select all the valid license categories available in IICS

·        Custom licenses(Correct)

·        API licenses(Incorrect)

·        Connector licenses(Correct)

·        Edition licenses(Correct)

Explanation

The following license categories are available:

Edition licenses

Edition licenses control the Informatica Intelligent Cloud Services features that you can use. Feature licenses provide access to data integration tasks such as mapping tasks, replication tasks, and synchronization tasks. They also provide access to components such as business services and saved queries and to features such as fine-grained security and Salesforce connectivity.

Connector licenses

Connector licenses provide connectivity to entities such as Amazon Redshift, Microsoft SQL Server, and Oracle.

Custom licenses

Custom licenses are licenses that are not part of an edition. They provide access to features, packages, or bundles. If your organization uses a custom license that provides access to a feature that is also included in an edition license, the terms of the custom license override the terms of the edition license.

 

Question 26: Incorrect

You can stop jobs from being re-triggered in Informatica Cloud by an Outbound Message stuck in the queue

·        Yes(Correct)

·        No(Incorrect)

Explanation

You can stop re-triggering of the job by restarting the Informatica Cloud secure agent. Restarting the secure agent cleans up the outbound message queues on the Informatica server.

 

Question 27: Incorrect

Select the utility which invokes Informatica Cloud Data Integration Jobs from PowerCenter

·        IICSRunAJob

·        ptJobCli

·        RESTPlugin(Incorrect)

·        runAJobCli(Correct)

Explanation

runAJobCli is a utility that has to be present on the Secure Agent; using the runAJobCli utility, Informatica Cloud Data Integration jobs can be invoked from PowerCenter.
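To make this concrete, here is a hedged sketch of invoking the utility from a PowerCenter Command task. The install location follows the standard Secure Agent layout, but treat the exact flag names as assumptions and confirm them against the utility's help output:

cd <Secure Agent installation directory>/apps/runAJobCli
./cli.sh runAJobCli -u <IICS username> -p <password> -t MTT -un <mapping task name>

Here -t is the task type (for example, MTT for a mapping task or DSS for a synchronization task) and -un is the task name.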

 

Question 28: Correct

When would a process be executed on a local agent

·        If the process includes input and output fields.

·        If the process needs to access an on-premise resource that resides behind a firewall.(Correct)

·        By default, all the process can only be executed in Cloud agent

·        If the process will connect to more than one application or service.

Explanation

If the process needs to access an on-premise resource that resides behind a firewall.

 

Question 29: Correct

Use the __________ to find the performance benchmark of a task

·        Operational Insights

·        DiscoveryIQ(Correct)

·        Administrator

·        Cloud API Manager

Explanation

Use the DiscoveryIQ to find the performance benchmark of a task

 

Question 30: Incorrect

Select all the correct statements w.r.t AMQP

·        AMQP supports synchronous as well as asynchronous(Incorrect)

·        When you publish a process that uses the AMQP connector, the binding details are stored in Informatica Cloud Repository(Incorrect)

·        AMQP is an open standard application layer protocol for message-oriented middleware(Correct)

·        AMQP allows you to facilitate business transactions by passing real-time data streams(Correct)

Explanation

When you publish a process that uses the AMQP (Advanced Message Queuing Protocol) connector, the binding details are stored in a catalog file (entrypoints.xml) that determines how to route the message. And AMQP makes only asynchronous calls.

 

Question 31: Incorrect
Which of the following properties of a Salesforce connection are required in IICS?

Note: There can be multiple correct answers to this question

·        Service URL(Correct)

·        Code Page(Incorrect)

·        ServiceRootURL(Incorrect)

·        Salesforce Connection Type(Correct)

Explanation

ServiceRootURL and Code Page are not required for Salesforce Connection

 

Question 32: Incorrect

Data replication task can be configured for

Note: There can be multiple correct answers to this question

·        Database(s)(Correct)

·        SalesForce(Correct)

·        XML File(s)

·        Flat File(s)(Correct)

Explanation

A Data Replication task works only with database, Salesforce, and flat file connections. Flat file connections cannot be used as a source in a Data Replication task; they can only be used as a target.

 

Question 33: Incorrect

In IICS, on the Activity Monitor page, RowsProcessed is the count of

Note: There can be multiple correct answers to this question

·        Total failed rows for source(Correct)

·        Total failed rows for target(Correct)

·        Total success rows for source

·        Total success rows for target(Correct)

Explanation

RowsProcessed = TotalSuccessRows + TotalFailedRows

Where,

TotalSuccessRows = Total success rows for target

TotalFailedRows = Total failed rows for source + target + all transforms
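A quick worked example of the formula: if 900 rows load successfully to the target, 60 rows fail at the source, and 40 rows fail at the target, then TotalSuccessRows = 900, TotalFailedRows = 60 + 40 = 100, and RowsProcessed = 900 + 100 = 1,000.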

 

Question 34: Incorrect

Select all the valid types of Tasks which can be created in Data Integration Service

·        Parallel Tasks(Correct)

·        Taskflow(Correct)

·        Single Task(Correct)

·        Sequential Tasks(Correct)

Explanation

All the listed options are valid

 

Question 35: Incorrect

You cannot schedule the task flows to execute

·        Daily(Incorrect)

·        Run Continuously(Correct)

·        Hourly

·        Every N minutes

Explanation

You cannot schedule tasks to run continuously. The smallest increment of time allowed for creating a schedule in Informatica Cloud to run jobs automatically is 5 minutes.

 

Question 36: Incorrect

Scheduler and scheduler properties are carried over to the Target Org when using the Export/Import utility

·        False(Correct)

·        True(Incorrect)

Explanation

When you import the task into a specific Org, users have to manually edit the task and choose a schedule that is defined in that Org.

Question 37: Correct

A user cannot access a guide unless it has been published

·        True(Correct)

·        False

Explanation

Yes, a user cannot access a guide unless it has been published.

 

Question 38: Incorrect

For Data Masking tasks, a staging connection has to be provided if

Note: There can be multiple correct answers to this question

·        By default, the masking task stages the source data on the secure agent to apply the data masking rules, and the staged data is automatically dropped after successful execution of the job (Incorrect)

·        Data filter has to be applied(Correct)

·        Data has to be read from multiple source objects(Correct)

·        The task has to use a different source and target connections(Correct)

 

Question 39: Incorrect

What security is set for all the communications from the secure agent to IICS servers

·        TSL

·        SSL(Incorrect)

·        SOA

·        TLS(Correct)

Explanation

All communication from the Secure Agent to the IICS host is Transport Layer Security (TLS) 1.2 encrypted using AES128-SHA (128 bit) cipher.

Informatica leverages the underlying transport layer of these connector communication protocols to ensure that customer data is transmitted securely across data stores and applications. Customer data is transmitted encrypted via Transport Layer Security (TLS) using AES (128 bit) cipher.

 

Question 40: Incorrect

Select all the stats which can be viewed in DiscoveryIQ

·        Task Run Count by Secure Agents(Correct)

·        Task Run Count by Connection Type(Correct)

·        Task Run KPI(Correct)

·        Task Run Count By App Type(Correct)

Explanation

Stats for all the options which are listed can be viewed in DiscoveryIQ

 

Question 41: Incorrect

Multiple object source type can be used with

Note: There can be multiple correct answers to this question

·        Application Connections

·        Database connections(Correct)

·        Salesforce Connections(Correct)

·        Flat Files Connections(Incorrect)

Explanation

You can import multiple objects using the Multiple object source type from Database and Salesforce connections.

 

Question 42: Incorrect

Reusable Sequence Generator transformations can be created in the Data Integration Mappings

·        False(Incorrect)

·        True(Correct)

Explanation

You can create Shared Sequence to use a Sequence Generator in multiple mappings.

 

Question 43: Correct

Which of the following services is used to design a Processes in CAI

·        API Portal

·        Monitor

·        Process Designer(Correct)

·        API Manager

Explanation

Process Designer is used to design a Processes in CAI

 

Question 44: Correct

SQL queries used in the SQL transformation in CDI can be validated

·        True(Correct)

·        False

Explanation

SQL queries used in the SQL transformation in Cloud Data Integration can be validated

 

Question 45: Incorrect

Which File Input type should you select in the Hierarchy Parser transformation when the XML/JSON data is in a file and you want to parse it?

·        File(Correct)

·        Splitting

·        Parse(Incorrect)

·        Buffer

Explanation

When the XML/JSON data is in a file and you want to parse it, the File input type should be used, and the absolute path of the file, including the file name, should be mapped.

 

Question 46: Correct

Does Pre SQL/Post SQL support parameterization in their queries

·        Yes(Correct)

·        No

Explanation

Yes, it is supported. Parameter variables have to be referenced as $$ParamVariable. For example,

DELETE FROM EMPLOYEE WHERE EMP_ID = $$EmpNo;
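The matching entry in the parameter file might look like the following (a hedged sketch; the section header depends on how you scope the parameter, e.g. globally or per task):

#USE_SECTIONS
[Global]
$$EmpNo=1001

At runtime, the Pre/Post SQL above would then resolve to: DELETE FROM EMPLOYEE WHERE EMP_ID = 1001;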

 

Question 47: Incorrect

You can parameterize  _______________  in a Data Replication Task

·        Source and Target Connections(Incorrect)

·        Source Connections only

·        Targets Connections only

·        None of the above(Correct)

Explanation

It is not possible to parameterize connections in a standard Data Replication task. You can parameterize connections by using parameterization at the mapping level and resolving it at the mapping task (MCT) level.

 

Question 48: Correct

Select all the components of the Parallel Tasks

·        Jump(Correct)

·        Subtaskflow(Correct)

·        Command Task(Correct)

·        Wait(Correct)

Explanation

All the mentioned options can be used in the Parallel Tasks

 

Question 49: Incorrect

Select all the correct statements w.r.t migrating objects from one org to another org

·        Saved queries have to be manually created in the Target Org. (Incorrect)

·        When you migrate a data replication task, Informatica Cloud includes target prefix information(Correct)

·        Informatica Cloud assigns all migrated connections randomly to an available Secure Agent in the list of Secure Agents in the target org (Incorrect)

·        When you migrate an object, the migration includes all dependent objects except for objects installed from a bundle. (Correct)

Explanation

Informatica Cloud assigns all migrated connections to the first Secure Agent in the list of Secure Agents for the target organization. Saved queries can be migrated from one org to another org

 

Question 50: Incorrect

To find the annual sum of quarterly data for each store, you might use a _______ expression macro in an Aggregator transformation.

·        Hybrid Macro

·        Vertical Macro(Correct)

·        Aggregator Macro(Incorrect)

·        Horizontal Macro

Explanation

To find the annual sum of quarterly data for each store, you might use a vertical expression macro in an Aggregator transformation
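A hedged illustration of how the expansion works (the field names are invented): given incoming fields Q1_SALES through Q4_SALES, define a macro input field %QUARTER% that represents those four fields, group by STORE_ID in the Aggregator, and configure a macro output field with the expression:

SUM(%QUARTER%)

The mapping expands this vertically into SUM(Q1_SALES), SUM(Q2_SALES), SUM(Q3_SALES), and SUM(Q4_SALES), one aggregate per represented field for each store.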

 

Set 2

 

Question 1: Correct

Select all the valid services available in Application Integration Service

·        Guide(Correct)

·        Process Object(Correct)

·        Process(Correct)

·        Service Connector(Correct)

Explanation

All the listed options are correct

 

Question 2: Incorrect

Select all the transformations which are available in Cloud Data Integration

·        Verifier transformation(Correct)

·        Cleanse transformation(Correct)

·        Rule Specification transformation(Correct)

·        Deduplicate transformation(Correct)

Explanation

All the mentioned transformations are available in Cloud Data Integration

 

Question 3: Correct

Select all the valid Assets

·        Saved Query(Correct)

·        Single Task(Correct)

·        Synchronization Task(Correct)

·        Business Services(Correct)

Explanation

All are valid Assets

 

Question 4: Incorrect

The property "Max Rows per Batch" that is configurable on a SFDC target is used when you are using

·        Bulk API(Incorrect)

·        Standard API(Correct)

·        REST API

·        SFDC API

Explanation

The property "Max Rows per Batch" that is configurable on a SFDC target is used when you are using Standard Salesforce API to write the target data. If you choose to run the job using Bulk API, "Max Rows per Batch" will be ignored.

 

Question 5: Incorrect

Select all the correct statements w.r.t $LastRunDate

·        The $LastRunDate value does not get updated if the task has completed with Warning status.

·        The values of $LastRunDate get stored in Informatica Cloud repository/server.(Correct)

·        The value of $LastRunDate is retrieved during the task runtime and written to parameter file.(Correct)

·        In the Cloud Administrator, you have the option to change the $LastRunTime timezone to the Org's timezone.

Explanation

Currently, IICS does not have the option to change the $LastRunTime timezone to the Org's timezone. The $LastRunDate value gets updated after execution of the task with Successful/Warning status. It doesn't get updated when the task is in Failed status.

 

Question 6: Correct

You have created a mapping task to extract data from a SQL Server table to a flat file (target object); the same file has to be placed in the below two shared folder paths.

1. D:\IDWBI\IICS\Curr

2. D:\IDWBI\IICS\Prev

Select the best option?

·        Create one flat file connection pointing to the path D:\IDWBI\IICS\, and in the mapping task Target details, append the child path (i.e \Curr or \Prev) and assign to two different target objects

·        Enter the list of sub directories in the flat file connection, and assign the created connection to one target object, Informatica Cloud by default creates the same file in all the listed sub directories

·        Parametrize the flat file path, and pass the required path dynamically during the mapping task execution

·        Create two flat file connections pointing to each path, and create two target objects in the mapping and assign each connection to the target objects(Correct)

Explanation

A Flat File connection points to a single folder/directory

 

Question 7: Correct

Blackout period can only be configured at ORG level

·        False(Correct)

·        True

Explanation

You can configure Blackout period at runtime environment level as well as at ORG level

 

 

Question 8: Incorrect

Select the role which should be assigned to a user to monitor job runs, including imports and exports, in IICS

·        Monitor role(Incorrect)

·        Operations role

·        Developer role

·        Designer role(Correct)

Explanation

Admin or Designer role. Users with either of these roles can monitor all jobs in the organization.

Monitor role. Users with this role can monitor Data Integration jobs but not imports or exports.

 

Question 9: Incorrect

___________ involves managing exceptions that result from unexpected errors in process execution, such as bad data or problems with external systems.

·        Fault Exception Management

·        Bad Record Exception Management(Incorrect)

·        Alert Service

·        Process Exception Management(Correct)

Explanation

Process exception management involves managing exceptions that occur in running processes. The exceptions typically manifest themselves as faults that are not caught by normal fault handlers; that is, they were not planned for. They are the result of unexpected errors in process execution, such as bad data or problems with external systems.

 

Question 10: Incorrect

Select all the tasks which can be executed on Informatica Cloud Hosted Agent

·        Data Synchronization tasks(Correct)

·        Mass Ingestion task(Correct)

·        Mapping Configuration task(Correct)

·        Data Masking tasks(Incorrect)

Explanation

Informatica Cloud Hosted Agent can run Data Synchronization, Mapping Configuration, and Mass Ingestion tasks. Data Masking tasks are not supported on the Informatica Cloud Hosted Agent; these tasks must be executed on local secure agents instead of the Informatica Cloud hosted secure agent.

Question 11: Incorrect

Select all the correct statements w.r.t pushdown optimization

·        The task processes all transformation logic that it cannot push to a database.(Correct)

·        To use cross-schema pushdown optimization, you create a connection for each schema.(Correct)

·        You can enable cross-schema pushdown optimization for tasks that use source or target objects associated with different schemas within the same database.(Correct)

·        You can use a pushdown optimization user-defined parameter to perform pushdown optimization based on the parameter value defined in a parameter file.(Correct)

Explanation

All the statements are correct w.r.t pushdown optimization

 

Question 12: 

Skipped

A WSDL has multiple operations (for example, Create, Update, CreateMultiple, etc.), and you need to select the CreateMultiple operation when you use the Web Services transformation in a Data Integration mapping.

·        First, create a REST V2 connection with the WSDL file and then in the Business Service select the Required Operation, use the Business Service in the WebServices Transformation(Correct)

·        In the create page of the Rest V2 connection, select the operation as CreateMultiple

·        Informatica Cloud currently doesn't support, WSDL files with multiple operations, you need to customize the WSDL file with a single required operation ( CreateMultiple in this case)

·        Create a Rest V2 connection, and then in the web service transformation all the operations from the WSDL will be displayed, select the required operation (i.e. CreateMultiple) and then proceed with the field mapping

Explanation

First, create a REST V2 connection with the WSDL file and then in the Business Service select the Required Operation, use the Business Service in the WebServices Transformation

 

Question 13: Incorrect

You can use the below functions to change the value of the in-out parameter

·        SetVariable(Correct)

·        SetMinVariable(Correct)

·        SetMaxVariable(Correct)

·        SetCountVariable(Correct)

Explanation

To change the value of an in-out parameter, the below functions can be used in the Expression transformation:

SetVariable, SetMaxVariable, SetMinVariable, SetCountVariable

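For example (a hedged sketch; the parameter and field names are invented): with an in-out parameter $$MaxLoadDate declared in the mapping, an Expression transformation field could use:

SetMaxVariable($$MaxLoadDate, LOAD_DATE)

which retains the highest LOAD_DATE value seen during the run, so the next run can use it as an incremental-load watermark.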
Question 14: Incorrect

Select all the authentication types which can be configured while creating a REST V2 connection

·        Digest(Correct)

·        Basic(Correct)

·        Kerberos(Incorrect)

·        Advanced(Incorrect)

Explanation

When you create a REST V2 connection, you can configure one of the following REST authentication types:

•BASIC

•DIGEST

•OAUTH Version 1.0

 

Question 15: Correct

Which of the following Data load types are possible in IICS?

·        On-premise to On-premise(Correct)

·        Cloud to on-premise(Correct)

·        Cloud to Cloud(Correct)

·        On-premise to Cloud(Correct)

Explanation

All the listed options are correct

 

Question 16: Incorrect

Can you use a JDBC connection in Data Replication Task

·        No(Correct)

·        Yes(Incorrect)

Explanation

A Data Replication task cannot be configured using a JDBC connection.

 

Question 17: Incorrect

You can monitor the agent status in IICS by using the

·        Discovery IQ

·        Monitor(Incorrect)

·        API Manager

·        Operational Insights(Correct)

Explanation

The Operational Insights service alerts you by sending an email to the configured email address if the agent is down or CPU and disk usage are high.

 

Question 18: Incorrect

Key Range partition can be defined in

·        Active Transformations(Incorrect)

·        Target Objects

·        Source Objects(Correct)

·        Passive Transformations

Explanation

Key Range partition can be defined in Source Objects

 

Question 19: Incorrect

Which File Input type should you select in the Hierarchy Parser transformation when the XML/JSON data is in the incoming source column?

·        Parse

·        Buffer(Correct)

·        File(Incorrect)

·        Splitting

Explanation

Buffer type should be used when the xml/json data is in the incoming source column.

 

Question 20: Incorrect

A Service Connector can define multiple actions.

·        False(Incorrect)

·        True(Correct)

Explanation

Using Service Connector you can define multiple actions.

 

Question 21: Incorrect

SQL transformation can be used in Query Mode and Script Mode in Data Integration mappings

·        True(Incorrect)

·        False(Correct)

Explanation

You can't configure the SQL transformation in Script Mode in Data Integration mappings

 

Question 22: Incorrect

Separate license is required to create a Sub Org in the Org

·        True(Correct)

·        False(Incorrect)

Explanation

Yes, one should have the Organization Hierarchy license enabled in their org to create one or more sub-organizations within the organization.

Question 23: Correct

Select all the connection types which support Pre/Post SQL command in Source transformation

·        ODBC(Correct)

·        SQL Server(Correct)

·        MySQL(Correct)

·        Oracle(Correct)

Explanation

Oracle, MySQL, SQL Server, and ODBC type connections support Pre/Post SQL commands in the Source transformation of an Informatica Cloud mapping.

Question 24: Incorrect

You need to create a task flow that executes two Data Integration tasks concurrently, and if one of the tasks fails, you need to execute a third Data Integration task that resets a particular date in an audit table. For this, you need to create a task flow with

·        Parallel Tasks

·        Task flows(Incorrect)

·        Parallel Tasks with Decision(Correct)

·        Linear Tasks with Decision

Explanation

If your major requirement is to run two or more data integration tasks in parallel and then make a decision based on the outcome of any task, use the Parallel Tasks with Decision template.

 

Question 25: Correct

Multi-pipelines are not supported in the Data Integration Mappings

·        False(Correct)

·        True

Explanation

You can create mappings with multiple pipelines in Data Integration Mappings. Example: - You can have a flow (first  pipeline) extracting data from the Relational source to Flat file and a different flow (second pipeline) extracting data from flat file and load it to a table.

 

Question 26: Incorrect

You have created a Linear Taskflow with six tasks, and the fourth task has failed during execution. How can you execute the tasks from the failed task onward?

·        Use the Stop on Errors = 0 option, to execute all the tasks in the Linear Taskflow

·        Navigate to the Monitor tab > All Jobs and click on Restart, the Linear Task Flow will start from the failed task(Incorrect)

·        You need to navigate to the individual task and execute the tasks (4th, 5th and 6th tasks) one by one(Correct)

·        In a Linear Taskflow, if any task fails, the rest of the tasks still get executed. Parallel Tasks allow you to add dependencies based on the previous task's status (i.e. execute only if the previous task is successful)

Explanation

You need to navigate to the individual task and execute the tasks (4th, 5th and 6th tasks) one by one

Question 27: Correct

Using the REST API, schedules can be

Note: There can be multiple correct answers to this question

·        updated(Correct)

·        Published

·        un-Published

·        deleted(Correct)

Explanation

Schedules can be created, updated, and deleted using the REST API.

 

Question 28: Correct

You are trying to import a PowerCenter workflow which contains transformations that are not supported in Informatica Cloud Data Integration mappings; the workflow upload to Data Integration will

·        only the source and target objects will be imported

·        fail(Correct)

·        partially Imported

·        get succeeded with warnings

Explanation

If you try to import a PowerCenter workflow which contains transformations that are not supported in Informatica Cloud Data Integration mappings, the workflow upload to Data Integration will fail.

 

Question 29: Incorrect

IDIQ can be used for the following

Note: There can be multiple correct answers to this question

·        To know how many records got loaded into the target.(Correct)

·        To know how many times the task has run.(Correct)

·        To get status of services in the runtime environment(Incorrect)

·        To get task level in house reporting.(Correct)

Explanation

Discovery IQ (IDIQ) can be used for the following

To get task level in house reporting.

To know how many records got loaded into the target.

To know how many times the task has run.

 

Question 30: Incorrect

Data from multiple source connections can be combined using multiple object source.

·        No(Correct)

·        Yes(Incorrect)

Explanation

You can use multiple source tables which are present in the same source connection

 

Question 31: Correct

What is the benefit of using a Jump step

·        you can jump to a step on another branch of the same Parallel Path step.

·        Allows the guide to automatically start a background process

·        Allows a guide user more flexibility.

·        Avoids duplication of the same set of steps.(Correct)

Explanation

If you are in a Parallel Path step, you cannot jump to a step on another branch of the same Parallel Path step.

 

Question 32: Incorrect

Select all the correct statements w.r.t Data Replication.

·        You can simultaneously run multiple Data Replication tasks that write to the same target table. (Incorrect)

·        You can use database aliases as sources.(Correct)

·        You cannot simultaneously run multiple Data Replication tasks that write to the same target table.(Correct)

·        Data Replication supports Teradata, IBM DB2, Oracle, SQL Server, MySQL and Flatfiles as Target objects.(Incorrect)

Explanation

You cannot simultaneously run multiple Data Replication tasks that write to the same target table.

You can use database tables, aliases, and views as sources.

The supported targets are Oracle, SQL Server, MySQL, and flat files.

 

Question 33: Incorrect

More than one command is not allowed in the pre/post processing command in the Informatica Cloud task

·        False(Correct)

·        True(Incorrect)

Explanation

You can include more than one command in the pre/post-processing command, with each command separated by &&. Alternatively, you can create a simple bash script file specifying all the commands inside it, and then call that script file from the pre- or post-processing section of the task.
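For example (a hedged sketch with invented file names), a post-processing command on a Linux agent could chain two commands:

cp /data/out/orders.csv /archive/orders.csv && rm /data/out/orders.csv

or simply call a wrapper script such as /scripts/post_process.sh that contains all the steps.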

 

Question 34: Incorrect

Salesforce Lookup performance can be Improved in IICS by

Note: There can be multiple correct answers to this question

·        Increase jvm size to 1024 on Secure Agent(Incorrect)

·        Fetch only the required fields from salesforce(Correct)

·        Enable Lookup Caching(Correct)

·        Use related objects(Correct)

Explanation

The performance of a task will depend on many factors such as network speed, server configuration, etc. Below are some of the options that might help in improving the performance:

1. Enable Lookup Caching - This property is available in the Advanced section of the Lookup transformation. It brings the entire data set into a temporary cache and performs the lookup from the cache; after the transaction is completed, the cache file is deleted. This option is a better fit for around 1 million records.

2. Fetch only the required fields from Salesforce - Include only the required fields in the Lookup transformation.

3. Use related objects - Join the related object using the related objects option.

 

Question 35: Incorrect

Only a single step can jump to the same target step

·        True(Incorrect)

·        False(Correct)

Explanation

More than one step can jump to the same target step.

Question 36: Correct

Select all the Advanced Salesforce Options, which are enabled, when SalesForce Bulk API is used

·        PK Chunking(Correct)

·        Parallel Mode

·        Serial Mode(Correct)

·        Related Object

Explanation

Enable Serial Mode and Enable PK Chunking (with PK Chunking Size and PK Chunking Start Row ID) are the options available when using the Salesforce Bulk API.

 

Question 37: Correct

Select all the valid Macro Types in Informatica Cloud

·        Hybrid(Correct)

·        Horizontal(Correct)

·        User-Defined

·        Vertical(Correct)

Explanation

User-Defined is not a valid Macro Type in Cloud

Question 38: Correct

Versioning for Cloud objects can be enabled only during the Installation of Secure Agent

·        True

·        False(Correct)

Explanation

There is no option to enable versioning during the installation of Secure Agent

 

Question 39: Incorrect

Select all the common properties which are required for creating SQL Server and Flat file connections in IICS?

·        Code Page(Correct)

·        Connection Type(Incorrect)

·        Runtime Environment(Correct)

·        Date Format(Incorrect)

Explanation

Runtime Environment and Code Page are common properties which are required while creating a SQL Server and Flat file connections.

 

Question 40: Incorrect

Hierarchical schema is required for

Note: There can be multiple correct answers to this question

·        Hierarchy Builder Transformation(Correct)

·        Hierarchy Parser Transformation(Correct)

·        Hierarchy Designer Transformation(Incorrect)

·        Hierarchy Validator Transformation

Explanation

Hierarchical schema is required for Hierarchy Parser and Hierarchy Builder Transformation

 

Question 41: Incorrect

________ connector works with Mass Ingestion only

·        Amazon S3(Incorrect)

·        Amazon S3 V1

·        Amazon S3 MI

·        Amazon S3 V2(Correct)

Explanation

Amazon S3 V2 works with Mapping and Mass Ingestion only.

Question 42: Incorrect

PowerCenter mapplets can be used in Data Replication task

·        False(Correct)

·        True(Incorrect)

Explanation

PowerCenter mapplets can be used in Data Synchronization Task

 

Question 43: 

Skipped

Select all the correct statements w.r.t Salesforce Outbound Message triggers

·        An old Outbound Message stuck in the queue attempts to re-process the stuck message and keeps re-triggering the job until the job completes successfully. (Correct)

·        Whenever any new record is inserted, updated, or modified as per the workflow rule in the Salesforce object, Salesforce sends an outbound message alert to Informatica Cloud and triggers the related job. (Correct)

·        Once the data is sent by Salesforce, the Informatica SFDC connector processes it; but if it is rejected by Oracle, it will be lost. (Correct)

·        Deactivate the workflow rule in Salesforce to avoid generating the Salesforce Outbound Message. (Correct)

Explanation

All the statements are correct  w.r.t Salesforce OutBound Message triggers

 

Question 44: Incorrect

You can use an Amazon S3 object as a source in

Note: There can be multiple correct answers to this question

·        Parallel Tasks with Decision

·        Synchronization task(Correct)

·        Replication Task(Incorrect)

·        Mapping task(Correct)

Explanation

You can use an Amazon S3 object as a source in a synchronization task, mapping, or mapping task

 

Question 45: Incorrect

__________ enables mass file transfer between on-premise and cloud data stores

·        AWS Snowball

·        Informatica Managed File Transfer(Incorrect)

·        Mass Ingestion(Correct)

·        Data Replication

Explanation

Mass Ingestion enables mass file transfer between on-premise and cloud data stores to support large data ingestion and data lake initiatives

 

Question 46: Incorrect

Select the correct sequence to use a saved query in a Data Synchronization task

1. Edit the data type, precision, or scale of each column before you save the saved query

2. Create the saved query component

3. Select the saved query as the source object

4. In the synchronization task, select the saved query component

·        3,2,1,4

·        4,3,1,2

·        2,3,1,4(Incorrect)

·        2,1,4,3(Correct)

Explanation

2,1,4,3 is the correct sequence

 

Question 47: Incorrect

API Portal is a cloud-based service that an organization uses to manage the APIs for enterprise services and processes that are built-in Informatica Cloud Application Integration.

·        True(Incorrect)

·        False(Correct)

Explanation

API Manager is a cloud-based service that an organization uses to manage the APIs for enterprise services and processes that are built in Informatica Cloud Application Integration.

 

Question 48: Correct

REST V2 Connector supports the following media types

·        application/json(Correct)

·        Extended JSON mime type(Correct)

·        JSON custom type.(Correct)

·        JSON subtype(Correct)

Explanation

REST V2 Connector supports the following media types:

•application/xml

•application/json

•JSON subtype. For example: application/hal+json

•JSON custom type. For example: application/vnd.ds.abc.v1+json

•Extended JSON mime type. For example: application/vnd.ds.abc.v1+json;version=q1
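
As a rough illustration, these media types are what a client sends in the Accept/Content-Type headers of a REST call. A minimal Python sketch, assuming the requests library and a hypothetical endpoint URL:

import requests

url = "https://api.example.com/v1/orders"  # hypothetical endpoint
for media_type in (
    "application/xml",
    "application/json",
    "application/hal+json",                       # JSON subtype
    "application/vnd.ds.abc.v1+json",             # JSON custom type
    "application/vnd.ds.abc.v1+json;version=q1",  # extended JSON mime type
):
    response = requests.get(url, headers={"Accept": media_type})
    print(media_type, response.status_code)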

 

Question 49: Incorrect

Select all the correct statements w.r.t. REST V2 Connector

·        REST V2 Connector does not support special characters in the array type fields.(Correct)

·        REST V2 Connector does not support a proxy server on which authentication is enabled.(Correct)

·        REST V2 Connector does not support special characters in the field name for query parameters and operation names.(Correct)

·        REST V2 Connector supports only Integer and Decimal data types.(Correct)

Explanation

All the listed options are correct

 

Question 50: Correct

You can Join two tables from the same database using source sql override

·        True(Correct)

·        False

Explanation

Yes, using the multiple source type, you can Join two tables from the same database using source sql override
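
For example, a source SQL override for two tables in the same database might look like the query below (table and column names are made up), shown here as a Python string only for illustration:

# Hypothetical SQL override joining two tables from the same database.
sql_override = """
SELECT e.EMP_ID, e.EMP_NAME, d.DEPT_NAME
FROM EMPLOYEES e
JOIN DEPARTMENTS d ON e.DEPT_ID = d.DEPT_ID
"""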

 

SET 3

 

Question 1: Correct

___________ consists of data integration, process server, and mass ingestion engines and connectors to external data sources to execute both batch and real-time integrations and other forms of integrations in the future

·        Org

·        Secure Agent(Correct)

·        Secure Agent Cluster

·        Shared secure agent

Explanation

Secure Agent consists of data integration, process server, and mass ingestion engines and connectors to external data sources to execute both batch and real-time integrations and other forms of integrations in the future

Question 2: Incorrect

When you reset target tables for a replication task, the task performs the

Note: There can be multiple correct answers to this question

·        Drops all of the target tables included in the replication task from the database.(Correct)

·        Sets the load type for the replication task to full load.(Correct)

·        The data loaded in the previous incremental load will be deleted; to perform a full load, the Reset All option should be selected(Incorrect)

·        Displays the list of target tables and you need to manually select the target tables(Include objects) which have to be reset(Incorrect)

Explanation

When you run the replication task after you reset the target, the replication task recreates each target table. The replication task then loads all of the data into the new table.

 

Question 3: Incorrect

You have created a Data Replication Task, with SQL server as Source Connection and Oracle as Target Connection. Select the load type which can be selected

·        Incremental loads after initial full load(Incorrect)

·        Incremental loads after initial partial load

·        Initial load : Rows created or modified after a particular date

·        Full load each run(Correct)

Explanation

A Data Replication task always does a full load when a database is the source; the user cannot change the load type, as this is by design. To do an incremental load, columns like created date and modified date are necessary, and these columns may not be available in all database tables. Hence, by default, the task does only a full load.

 

Question 4: Incorrect

Select all the statements which are incorrect w.r.t Data Replication service.

·        You can parameterize connections in the Data Replication task.(Correct)

·        Target database field has to be manually altered in case of any data type, precision, or scale changes in Salesforce source(Correct)

·        Use Remove deleted columns and rows, to delete the rows from target which are deleted in source(Incorrect)

·        If source rows are deleted, there is no option to delete the rows from the target, as Data Replication can perform Full load followed by Incremental load(Correct)

Explanation

(1) For Data Replication tasks, set the 'AutoAlterColumnType' custom configuration property so the database target column adjusts when the data type, precision, or scale of a Salesforce source field changes. (2) It is not possible to parameterize connections in a standard Data Replication task.

 

Question 5: Incorrect

You have created a Data Integration mapping which uses Web Services transformation, due to some issue some source records are getting dropped, to isolate the issue, you need to capture all the SOAP requests and responses generated by the Web Services transformation.

Select the correct answer?

·        Click on the web service transformation > Advance and enable the Tracing Level to Verbose(Incorrect)

·        Edit the mapping task and change the Execution Mode from Standard to Verbose

·        Edit the mapping task and click on Advanced Session Properties and add a Custom Property > TracingLevel = Debug

·        Navigate to the Runtime Environment > Select the Agent > Custom Configuration > Data Integration Server > DTM > LOGLEVEL=DEBUG(Correct)

 

Question 6: Incorrect

You have created a Data Integration mapping which uses Web Services transformation, you need to pass a default value of '10' to three different columns (Type, No and Code) as shown below. Select the correct answer?

https://img-c.udemycdn.com/redactor/raw/2020-04-16_13-48-16-e5573bd1744cd020ded936984c9ee450.jpg

 

·        Drag the default value port from exp_IDWBI and drop it on the three columns (Type, No and Code)(Incorrect)

·        Each Input column can be mapped only once, create three ports(P1,P2,P3) with default value of '10' and map each port separately (P1 to Type, P2 to No and P3 to Code)(Correct)

·        Update the WSDL definition to pass the default value of 10, as unnecessary transformation logic will impact the mapping performance

·        Check the CreateMultiple option when the WSDL definition is imported, to map the same input value to multiple outputs in the Webservice consumer transformation

Explanation

As in PowerCenter, you can't map the same default-value input port to multiple output ports in the Web Service Consumer transformation; you need to create separate ports with the same default value and map them individually.

 

Question 7: Incorrect

Informatica Cloud Real-Time (ICRT) guides cannot be run on a local secure agent

·        True(Correct)

·        False(Incorrect)

Explanation

ICRT guides cannot be run on a local secure agent, they can only be run using Informatica Cloud hosted agent.

 

Question 8: Incorrect

You can parameterize  _______________  in a Standard Data synchronization Task

·        Source Connections only

·        Targets Connections only

·        Source and Target Connections(Incorrect)

·        Source and Target Connections can't be parameterized(Correct)

Explanation

No, it is not possible to parameterize connections in a standard Data Synchronization task. You can parameterize connections by defining the parameter in the mapping and resolving it at the Mapping Configuration Task (MCT) level.

In the Source and Target transformations of a mapping, there is a New Parameter option next to the connection. Selecting it lets you provide the dynamic connection details at runtime at the MCT level.

 

Question 9: Incorrect

If an error occurs while processing an object in a Data Replication task, you can

Note: There can be multiple correct answers to this question

·        Cancel processing the remaining objects(Correct)

·        Continue processing the remaining objects(Correct)

·        Skip the error records and process all the objects

·        Abort processing all the object(Incorrect)

Explanation

If an error occurs while processing an object in a Data Replication task, you cannot skip the error records or abort processing all the objects.

 

Question 10: Incorrect

You can Import the Data Masking rules created in Informatica Test Data Management Tool (TDM) into Informatica Data Masking tasks.

·        True(Incorrect)

·        False(Correct)

Explanation

You can't Import objects from Informatica Test Data Management Tool  in to the Informatica Cloud

 

Question 11: Incorrect

_________ should be provided when you configure the encryption action for a mass ingestion task

·        key ID(Correct)

·        key passphrase

·        Key pair(Incorrect)

·        Key Ring

Explanation

When you configure the encryption action for a mass ingestion task, you provide a key ID. The key ID is the public key ID of the receiver who decrypts the file. You can also add your private key ID and key passphrase to sign the files.

 

Question 12: Incorrect

You need to load data to a salesforce target object, as per the requirement you need to load data to the salesforce target in the order it receives them.

·        Use Salesforce Standard API and Enable serial mode

·        Use Salesforce Bulk API and Enable serial mode(Correct)

·        Irrespective of Salesforce Standard/Bulk API, Informatica cloud processes the entire content of each batch from source before processing the next batch.

·        Configure the “SalesForceSerialLoad" property at the secure agent level, all the jobs which use Salesforce as Target will process data in the order it receives from the source(Incorrect)

Explanation

You can enable Serial Mode only when you use the Salesforce Bulk API. In a serial load, Salesforce writes batches to targets in the order it receives them and processes the entire content of each batch before processing the next batch.
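
For context, serial mode maps to the concurrencyMode setting on a Salesforce Bulk API (1.0) job. A minimal sketch, assuming the requests library; the instance URL and session ID are placeholders:

import requests

instance = "https://yourInstance.salesforce.com"  # placeholder
session_id = "<session-id>"                       # placeholder

job = {
    "operation": "insert",
    "object": "Account",
    "contentType": "CSV",
    "concurrencyMode": "Serial",  # process batches one at a time, in order
}
resp = requests.post(
    instance + "/services/async/52.0/job",
    headers={"X-SFDC-Session": session_id, "Content-Type": "application/json"},
    json=job,
)
print(resp.status_code, resp.text)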

Question 13: Incorrect

Salesforce Bulk API is configured to extract data from Salesforce, and you have enabled PK Chunking to generate separate batches based on the PK Chunking size 10,000. The PK Chunking option can be enabled for

Select all the correct answers

·        Synchronization tasks(Correct)

·        Replication tasks(Correct)

·        Mapping tasks(Correct)

·        Mass ingestion tasks(Incorrect)

Explanation

The PK Chunking option is available in Synchronization tasks, Replication tasks, and Mapping tasks. To enable this feature, you must use Salesforce API version 32.0 or above
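
At the Bulk API level, PK chunking is requested through a header on job creation; a hedged Python fragment (the session ID is a placeholder, and the chunk size mirrors the 10,000 from the question):

# Headers sent when creating the Bulk API job; Salesforce then splits the
# extract into batches of 10,000 records based on the primary key.
headers = {
    "X-SFDC-Session": "<session-id>",
    "Content-Type": "application/json",
    "Sforce-Enable-PKChunking": "chunkSize=10000",
}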

Question 14: Incorrect

What are the packages required for Hierarchy Parser/Builder Transformation

·        DataTransformation(Correct)

·        UDTforHierarchy(Correct)

·        HierarchyParser(Incorrect)

·        HierarchyBuilder(Incorrect)

Explanation

Below are the mandatory packages for Hierarchy Parser Transformation:

1) saas-xmetadataread

2) UDTforHierarchy

 

3) DataTransformation

 

Question 15: Incorrect

When you create a Bundle, you can allow the Bundle to be used as

Note: There can be multiple correct answers to this question

·        Reference(Correct)

·        Copy(Correct)

·        InSync(Incorrect)

·       Connector(Incorrect)

Explanation

You can use the Bundle as Copy or Reference or Reference and Copy

-Reference. Allows others to select the assets in the bundle to use in their projects. The assets remain in the bundle folder and cannot be edited.

-Copy. Allows others to copy and edit the assets in the bundle.

-Reference and Copy. Allows both of the above.

Question 16: Correct

What are the minimum permissions to the User to Log On Informatica Cloud secure agent

·        Permission to edit the registry(Correct)

·        Permission to run the services and processes(Correct)

·        Permission on all the Secure Agent files(Correct)

·        Permission on the system temp location(Correct)

Explanation

The user should be part of the Administrator group; if the user is not an Administrator, it should have the following:

Permission on all the Secure Agent files and folders.

Permission to run the services and processes.

Permission to edit the registry.

Permission on system temp location.

 

Question 17: Incorrect

To create categories for employees based on salary ranges, you might create a macro that defines the minimum and maximum values for each range and corresponding job categories. This is an example of

·        Horizontal Macro(Correct)

·        Vertical Macro

·        Hybrid Macro

·        Aggregate Macro(Incorrect)

Explanation

A horizontal macro expression produces one result, so a transformation output field passes the results to the rest of the mapping. You configure the horizontal macro expression in the transformation output field.

 

Question 18: Correct

Does Informatica Cloud make multiple Salesforce API Calls for lookup on salesforce used in the Informatica cloud task

·        True(Correct)

·        False

Explanation

True. Performing lookups on SFDC through an Informatica Cloud task consumes a large number of API calls on SFDC (one call per record), and the performance of the task is also impacted.

It is advisable to change the SFDC lookups to a flat file lookup to improve the performance and reduce the number of API calls, as sketched below.
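
A minimal Python sketch of the flat-file alternative (the file name and fields are hypothetical): extract the lookup data once, then every lookup becomes a local probe instead of an API call.

import csv

# Load a previously extracted Salesforce lookup file once...
with open("sfdc_accounts.csv", newline="") as f:
    account_by_id = {row["Id"]: row for row in csv.DictReader(f)}

# ...then each lookup is a dictionary probe, not a Salesforce API call.
def lookup(account_id):
    return account_by_id.get(account_id)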

 

Question 19: 

Skipped

Related Object is an object that is related to another object based on a relationship defined in Informatica Cloud.

·        True

·        False(Correct)

Explanation

You can configure a Source transformation to join related objects. You can join related objects based on existing relationships or custom relationships. The types of relationships that you can create are based on the connection type.

 

Question 20: Incorrect

Select all the Incorrect statements

·        Informatica PowerCenter Mappings can be Imported as Informatica Cloud Data Integration Mappings(Correct)

·        Informatica Cloud Data Integration Mappings  can be Imported as Informatica PowerCenter Mappings(Correct)

·        You can create reusable transformations in Informatica Cloud Data Integration Mappings(Correct)

·        You can use Shared source objects in different projects(Correct)

Explanation

All the mentioned statements are incorrect

Question 21: Incorrect

Informatica Cloud allows users to

Note: There can be multiple correct answers to this question

·        Synchronize account data(Correct)

·        Work from their handheld devices(Incorrect)

·        Import data on sales leads(Correct)

·        Monitor PowerCenter jobs(Incorrect)

Explanation

Informatica Cloud allows users to Synchronize account data and Import data on sales leads

Question 22: Incorrect

________ is a set of screens that prompts users to review, enter or confirm data

·        Guide(Correct)

·        Process Call

·        Service(Incorrect)

·        Milestone

Explanation

A Guide is a set of screens that prompts users to review, enter or confirm data

 

Question 23: Correct

Which Informatica Cloud application is best to use for performing regular backups of data or perform offline reporting

·        Data Replication(Correct)

·        Mapping Configuration

·        Data Synchronization

·        Mass ingestion task

Explanation

Use the replication task to replicate data to a target. You might replicate data to back up the data or perform offline reporting

 

Question 24: Incorrect

Select all the correct statements w.r.t embedded guides?

·        Allow you to make changes in one place(Correct)

·        Allow you to repeat steps on multiple records(Correct)

·        Allow you to have multiple versions of the same guide(Incorrect)

·        Allow you to avoid creating the same set of steps(Correct)

Explanation

Using an embedded guide has the following advantages: (1) Avoids Repetition (2) Clarifies Design (3) Clarifies Logic

 

Question 25: Correct

The secure agent uses HTTPS port ____ for any outbound communication.

·        443(Correct)

·        442

·        22

·        80

Explanation

By default, the secure agent uses HTTPS port 443 for any outbound communication.
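
A quick way to check that outbound HTTPS on port 443 is reachable from the agent host is a generic TCP connectivity test (plain Python, not an Informatica utility; the host name is a placeholder):

import socket

host, port = "dm-us.informaticacloud.com", 443  # placeholder IICS host
with socket.create_connection((host, port), timeout=5) as sock:
    print("Outbound HTTPS reachable:", sock.getpeername())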

 

Question 26: Incorrect

The primary purpose of a Create step is

·        To create a new guide outcome

·        To create a new record of any type(Correct)

·        To create a process based on the defined paths

·        To create a set of values(Incorrect)

Explanation

The primary purpose of a Create step is to create a new record of any type.

 

Question 27: Correct

You can get the below stats from

https://img-c.udemycdn.com/redactor/raw/2020-04-11_13-34-02-7d10258e525366da209e14eac2bcd465.JPG

·        DiscoveryIQ (Correct)

·        Operational Insights

·        Data Integration Stats

·        Monitor

Explanation

DiscoveryIQ will provide the stats listed in the question

 

Question 28: Incorrect

Can you use Amazon S3 V2 objects in a connected cached Lookup transformation

·        Yes(Correct)

·        No(Incorrect)

Explanation

Yes, you can use Amazon S3 V2 objects in a connected cached Lookup transformation

Question 29: Incorrect

Which method is used to achieve real-time integration when Salesforce is the integration source?

·        Salesforce Trigger

·        Third-Party Scheduler(Incorrect)

·        Salesforce Outbound Messaging(Correct)

Explanation

Whenever any new record is inserted, updated or modified as per the workflow rule on the Salesforce object, Salesforce sends an outbound message alert to Informatica Cloud and triggers the related job

 

Question 30: Incorrect

Bulk mode can be used to read real-time data in a Salesforce connection

·        Yes(Incorrect)

·        No(Correct)

Explanation

No, the bulk mode cannot be used to read real-time data in a Salesforce connection in Cloud Data Integration. Bulk mode stages the data in a compressed file and writes data to a target only when it reaches a limit.​

 

Question 31: Incorrect

Which of the following SQL statements cannot be used in Saved Queries

·        Select(Incorrect)

·        Insert(Correct)

·        Update(Correct)

·        Delete(Correct)

Explanation

Only SELECT queries can be used in Saved Queries
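
So a valid saved query body is a single SELECT statement. A tiny illustration (hypothetical table and columns), with the query held in a Python string and a trivial guard:

# A saved query must be a SELECT; INSERT/UPDATE/DELETE are not allowed.
saved_query = "SELECT ORDER_ID, ORDER_DATE, AMOUNT FROM ORDERS WHERE AMOUNT > 100"
assert saved_query.lstrip().upper().startswith("SELECT")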

 

Question 32: Incorrect

Select all assets which can have subtasks

·        Replication tasks(Correct)

·        Advanced taskflows(Correct)

·        Linear taskflows(Correct)

·        Synchronization tasks

Explanation

The following types of assets have subtasks:

Replication tasks

Advanced taskflows

Linear taskflows

Azure data sync tasks

Data profiling tasks

 

Question 33: Incorrect

You have two legacy systems that have merged into one system. You need to concatenate the identification numbers from both systems into another system. The macro input field %ClientID% contains the following fields:

ID1, ID2

The following expression concatenates the fields in %ClientID%:

·        %OPR_CONCAT[ %ClientID% ]%(Correct)

·        ID1||ID2(Incorrect)

·        %OPR(ID1||ID2)%

·        %OPR_CONCAT[ %ID1,ID2% ]%

Explanation

%OPR_CONCAT% concatenates the IDs into one return value.
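
Conceptually, the horizontal macro expands across the fields listed in %ClientID%; a hypothetical Python equivalent of that expansion (the ID values are made up):

# %OPR_CONCAT[ %ClientID% ]% expands across ID1 and ID2,
# roughly like string concatenation over the macro input fields.
record = {"ID1": "LEG1-0042", "ID2": "LEG2-0042"}  # hypothetical values
client_id_fields = ["ID1", "ID2"]                  # the fields in %ClientID%
print("".join(record[f] for f in client_id_fields))  # LEG1-0042LEG2-0042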

 

Question 34: Incorrect

Select all the addon connections available in IICS

·        Adobe Cloud Platform(Correct)

·        Amazon Dynamo DB(Correct)

·        Azure Data Lake Store Gen2(Correct)

·        SAP(Incorrect)

Explanation

SAP is not available as an addon connection in IICS

 

Question 35: Incorrect

Partitioning is not supported for

·        Parameterized objects(Correct)

·        Custom query(Correct)

·        Single Object(Incorrect)

·        Multiple Objects(Incorrect)

Explanation

Partitioning is not supported for Parameterized objects and Custom Query

 

Question 36: Incorrect

Select all the licenses that are required to be enabled on the ORG to have the services and components for Mass Ingestion

·        Connector licenses(Correct)

·        Mass Ingestion(Correct)

·        MassIngestionRuntime(Correct)

·        ConnectorRuntime(Incorrect)

Explanation

Mass Ingestion and MassIngestionRuntime licenses are required to be enabled on the ORG to have the services and components for Mass Ingestion available.

Connector licenses (for example, Amazon S3 V2, Advanced FTP) are required in order to set up connections that can be used for Mass Ingestion tasks.

 

Question 37: Incorrect

You cannot invoke the Data Integration tasks in parallel using Cloud Application Integration

·        True(Incorrect)

·        False(Correct)

Explanation

You can invoke the Data Integration tasks in parallel using Cloud Application Integration

 

 

Question 38: Correct

You can reset the $lastruntime value in ICS

·        True

·        False(Correct)

Explanation

No, in Informatica Cloud Services (ICS) you would not be able to reset the $lastruntime value.

 

Question 39: Incorrect

API Manager provides API consumers with secure access to managed APIs.

·        True(Incorrect)

·        False(Correct)

Explanation

API Portal provides API consumers with secure access to managed APIs.

 

Question 40: Incorrect

Select the assets which have to be manually updated after migrating the task from one Org to another Org which has multiple Secure agents in Informatica Cloud?

·        Schedule tasks(Correct)

·        Connection passwords and Security token(Correct)

·        Runtime environment(Correct)

·        Flat File Connections(Correct)

Explanation

It is recommended to check the following after migrating an object from one Org to another Org:

1. Schedule tasks and task flows. Schedule information is not migrated with tasks or task flows.

2. Configure connection passwords and security token after migration. Connection passwords and security tokens are not migrated.

3. Runtime environment:- Verify the runtime environment. The migration process assigns all connections to the first available runtime environment in the list of runtime environments in the target organization.

4. Flat File Connections:- The flat file connections may still point to the Org from which the object was copied; update them as per the target Org file path.

 

Question 41: Incorrect

Select all the correct statements w.r.t screen step

·        Displays instructions telling the user about what is being displayed, what to enter, and so on.(Correct)

·        Updates an object's data based on user input.(Correct)

·        Allows a guide to take different paths depending on the value of some field.(Incorrect)

·        The actions defined in service are placed within the current screen(Incorrect)

Explanation

Using the Screen step you can:

Display instructions telling the user about what is being displayed, what to enter, and so on.

Update an object's data based on user input.

 

Question 42: Incorrect

When displaying a list of records in an embedded guide step or screen step, how do you define which fields (columns) will display?

·        You cannot control which fields display in a list of records

·        Edit the Guide properties and select the fields(Incorrect)

·        Edit the field properties and select the Display Fields(Correct)

·        Edit the screen properties and enable the Display Fields

Explanation

You should Edit the field properties and select the Display Fields

 

Question 43: Incorrect

An assignment step cannot be used to set the value of fields on a related object.

·        True(Incorrect)

·        False(Correct)

Explanation

You can use an assignment step to set the value of fields on a related object.

 

Question 44: Incorrect

Incremental load in Data Replication Service allows all Salesforce objects to be replicated

·        True(Incorrect)

·        False(Correct)

Explanation

Incremental load usually depends on the CreatedDate and SystemModstamp fields. If the Salesforce objects include these fields, then an incremental load can be performed.

The Salesforce object should also have the replicable property set to true.
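
As a sketch, an incremental extract keyed on SystemModstamp boils down to a filter like the one below (SOQL held in a Python string; the object and timestamp are illustrative):

# Illustrative incremental filter on a replicable Salesforce object.
last_run = "2020-01-01T00:00:00Z"  # hypothetical previous run timestamp
soql = (
    "SELECT Id, Name, SystemModstamp "
    "FROM Account "
    "WHERE SystemModstamp > " + last_run  # SOQL datetime literals are unquoted
)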

 

Question 45: Correct

PowerCenter workflow can be Imported and executed as a Cloud Data Integration task.

·        True(Correct)

·        False

Explanation

Yes, a PowerCenter workflow can be imported and executed as a Cloud Data Integration task.

 

Question 46: Correct

ReST connector can parse PUT/POST response and get the flat output

·        True

·        False(Correct)

Explanation

The ReST V2 connector can parse a PUT/POST response and get the flat output; the ReST (V1) connector doesn't parse the PUT/POST response.

Question 47: Incorrect

Select all the correct statements w.r.t Monitoring

·        You can increase the number of log entries at Monitor page

·        The running page lists the mapping, task, and taskflow instances that are starting, queued, running, and suspended.(Correct)

·        When you view running jobs, job properties such as end time, rows processed, and state is continuously updated.(Correct)

·        Use the All Jobs page for live monitoring of the jobs that are running in your organization.(Incorrect)

Explanation

You cannot increase the number of log entries on the Monitor page.

Use the Running Jobs page for live monitoring of the jobs that are running in your organization.

 

Question 48: Incorrect

Select all the correct statements w.r.t Permissions

·        The folder permission will have precedence over the Project permission.(Correct)

·        The permission defined at Asset will have the highest precedence than Folder and Project(Correct)

·        If no permission is defined on the Asset, it will inherit the Folder permission, which has precedence over the Project permission.(Correct)

·        Any user (of any role) will be able to navigate amongst any Project, Folder and Assets.(Correct)

 

Question 49: Correct

Select all the correct statements w.r.t Process Object in Cloud Application Integration

·        The process object is a structure that defines how the XML is stored in the process object.(Correct)

·        Process object validate the XML against the Process object structure(Correct)

·        The fields cannot be marked as required in the Process object(Correct)

·        Process objects generated in a service connector are available only within the service connector.(Correct)

Explanation

All the mentioned statements are correct w.r.t Process Object in Cloud Application Integration


Question 50: Correct

The __________ converts relational input into hierarchical output.

·        Hierarchy Builder transformation(Correct)

·        Hierarchy Parser transformation

·        Relational To XML Transformation

·        Hierarchy To Relational  Transformation

Explanation

The Hierarchy Builder transformation converts relational input into hierarchical output.

