Informatica Intelligent Cloud Service (IICS)


Informatica

Informatica is a software development company founded in 1993. It is headquartered in Redwood City, California. Its core products include Enterprise Cloud Data Management and Data Integration.




Note :- In each Informatica installation, we have one domain and multiple nodes. For example:
  • Info_DMPOC :- domain name
  • ISPOC :- Integration Service
  • RSPOC :- Repository Service
  • node01 :- node

Business Intelligence

Business intelligence (BI) is a technology-driven process for analyzing data and delivering actionable information that helps executives, managers, and workers make informed business decisions.

ETL Tool List :- 

  1. Integrate.io
  2. CData Sync
  3. QuerySurge
  4. Rivery
  5. Dataddo
  6. DBConvert
  7. AWS Glue
  8. Alooma
  9. Stitch
  10. Fivetran
  11. Matillion
  12. StreamSets
  13. Talend
  14. Informatica PowerCenter
  15. Blendo
  16. IRI Voracity
  17. Azure Data Factory
  18. Logstash
  19. SAS
  20. Pentaho Data Integration
  21. Etleap
  22. Singer
  23. Apache Camel
  24. Actian
  25. Qlik Real-Time ETL
  26. IBM InfoSphere DataStage
  27. Oracle Data Integrator
  28. SQL Server Integration Services

Informatica Products & Usage:-

There are multiple products under the Informatica product suite that help you satisfy different data integration requirements. They are:

  • Informatica PowerCenter
  • Informatica Data Quality
  • Informatica PowerExchange
  • Master Data Management
  • Data Masking
  • Data Virtualization
  • Complex Event Processing
  • Cloud

Informatica Cloud Data Integration (IICS)

Data Integration is one of the Informatica Cloud services. Informatica Cloud is a data integration solution and platform that works as Software as a Service (SaaS). Informatica Cloud can connect to on-premises and cloud-based applications, databases, flat files, file feeds, and even social networking sites.

For more details, go through the link :- Informatica Cloud - javatpoint

IICS Services :-

1) Common Services :-

2) Data Quality and Governance Cloud Services :-

3) Master Data Management Cloud :-

4) Integration Cloud Services :-


After logging into IICS, you will see the My Services page below :-


  1. On the My Services page you will see some default services :-
  • API Manager
  • Application Integration
  • Application Integration Console
  • Data Integration 
  • Administrator
  • Monitor
    -> Here we will discuss the Monitor, Administrator, and Data Integration services.
    -> If you click on Show all services, you will see all the services provided by Informatica Cloud.
  • Other Services Are :-

  2. Once you click on Data Integration, the home page will look as below :-
  3. To create an asset -> click on New :-


Then you will see some assets (objects) here in the photo below.



Under the Task service there are some sub-services/assets, as below :-
  1. Tasks
  • Mapping Task:- Process data based on the data flow logic defined in a mapping.
  • Synchronization Task:- Synchronize data between a source and a target to integrate applications, databases, and files.
  • Replication Task:- Copy data from a Salesforce or database source to a database or file target.
  • PowerCenter Task:- Import a PowerCenter workflow so you can run it as a cloud data integration task.
  • Mass Ingestion Task:- Transfer files from source to target. Use the task for massive file transfer between the organization and cloud storage.

    2. Mappings:-
  • Mappings:- Create a mapping. This mapping can then be used in one or more mapping tasks.
  • Integration:-
  • Cleaning
  • Warehousing

    3. Taskflows:-

  • Linear Taskflow:- Combine integration tasks and run them in a specific order.
  • Taskflow:- Create control flow logic to run multiple Data Integration tasks based on the result of the previous task.
  • Parallel Tasks:- Creates a taskflow with multiple Data Integration tasks that run concurrently.
  • Parallel Tasks with Decision:- Creates a taskflow with multiple Data Integration tasks that run concurrently and a Decision step that directs further processing.
  • Sequential Tasks:- Creates a taskflow with multiple Data Integration tasks that run consecutively.
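
The sequential and parallel taskflow patterns above can be sketched conceptually in Python. Here `run_task` is a hypothetical stand-in for a Data Integration task, not an Informatica API; IICS builds these flows in its designer UI.

```python
from concurrent.futures import ThreadPoolExecutor

def run_task(name):
    # Placeholder for a Data Integration task; returns a status string.
    return f"{name}: success"

def sequential_taskflow(tasks):
    # Run tasks one after another, stopping at the first failure,
    # like a Sequential Tasks template.
    results = []
    for task in tasks:
        result = run_task(task)
        results.append(result)
        if "success" not in result:
            break
    return results

def parallel_taskflow(tasks):
    # Run all tasks concurrently, like a Parallel Tasks template.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(run_task, tasks))

print(sequential_taskflow(["load_accounts", "load_orders"]))
print(parallel_taskflow(["load_accounts", "load_orders"]))
```

A Decision step would simply branch on the collected results before launching further tasks.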


    4.  Components:-
  • Business Services
  • Mapplet - PC Import
  • Mapplet - BAPI/IDOC
  • Saved Query
  • Hierarchical Schema :- Upload an XML schema or an XML or JSON sample file to use with a Hierarchy transformation.


Data Integration Task:- 

1) Synchronization Task :- to load data from one source system to a target system
2) Replication Task:- to create and schedule data backups
3) Mapping Tasks:- to implement advanced transformations such as data masking and aggregation.


1) Synchronization Task :- It is used to update existing records or to insert new records in the data warehouse. It has six steps, and every step has a Save button, as below :-

A) Definition
    • Task Details :- 
      • Task Name
      • Location
      • Description
      • Task Operation
        • Insert
        • Update
        • Upsert
        • Delete

B) Source
    • Source Details
      • Connection
      • Source Type
        • Single
        • Multiple
        • Saved Query
      • Source Object
    • Data Preview

C) Target
  • Target Details
    • Connection
    • Target Object
  • Data Preview
D) Data Filters
  • Row Limit
    • Process all rows (set by default)
    • Process only the first 100 rows (can be changed)
  • Data Filters:- a New button exists here
    • There are no filters defined. The task will process all data from source.
E) Field Mapping
  • Add Mapplet 
  • Refresh Fields
  • Source :- here the source fields appear with Status and Name
  • Target :- here the target fields appear with Status, Name, Actions, and Expression/Lookup
F) Schedules
  • Schedule details :-
    • Don't run the task on a schedule. (set by default)
    • Run the task on a schedule :- a New button is there.
  • Email Notification Options
    • Use the default email notification options for my organization
    • Use custom email notification for this task:
      • Failure Email Notification
      • Warning Email Notification
      • Success Email Notification
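
The Data Filters step (D above) limits which source rows a task processes. As a conceptual sketch in Python (the field names and predicate style are illustrative, not the IICS filter syntax):

```python
def apply_filters(rows, filters):
    # Keep only rows matching every (field, predicate) filter; with no
    # filters defined, the task processes all data from the source.
    return [row for row in rows
            if all(pred(row[field]) for field, pred in filters)]

rows = [{"country": "US", "amount": 50}, {"country": "DE", "amount": 500}]
filters = [("amount", lambda v: v > 100)]
passed = apply_filters(rows, filters)  # only the 500-amount row passes
```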
Note :- 

Targets in a Data Synchronization Task

  • You can use a single object as a target for a Data Synchronization task.
  • The target connections that you can use depend on the task operation you select for the task. For example, if you select the upsert task operation, you cannot use a flat file target connection because you cannot upsert records into a flat file target.
    For More Details visit:- Data Synchronization Task Options (informatica.com)
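
The upsert operation mentioned above (update matching records, insert the rest) can be sketched against a hypothetical in-memory target; in IICS the task engine performs this against the real target, and the key-matching shown here is purely illustrative:

```python
def upsert(target, rows, key):
    # Map each key value to its position in the target table.
    index = {record[key]: i for i, record in enumerate(target)}
    for row in rows:
        if row[key] in index:
            target[index[row[key]]].update(row)  # update existing record
        else:
            index[row[key]] = len(target)
            target.append(dict(row))             # insert new record
    return target

warehouse = [{"id": 1, "name": "Acme"}]
incoming = [{"id": 1, "name": "Acme Corp"}, {"id": 2, "name": "Globex"}]
upsert(warehouse, incoming, "id")
# warehouse now holds the updated Acme record plus the new Globex record
```

This also makes the flat-file restriction above intuitive: without keyed lookups into the target, there is nothing to match an update against.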


2) Replication Task:- It is used to store the backup of a data warehouse on another server using Informatica Cloud. It has only five steps to complete:
A) Source
  • Task Details:- 
    • Task Name
    • Location
    • Description
  • Source Details:-
    • Source Connection
    • Object to Replicate:-
      • All Objects
      • Include Objects 
        • Account (For Example )
        • Opportunity (For Example )
      • Exclude Objects
B) Target
  • Target Details
    • Connection 
    • Target Prefix
  • Replication Options
    • Load Type
      • Incremental loads after initial full load
      • Incremental loads after initial partial load
        • Initial load: rows created or modified after a given date
      • Full load each run
    • Delete Option
      • Remove deleted columns and rows
      • Retain deleted columns and rows
  • Advanced Options
    • Commit size.
C) Field Exclusions  
  • Field Exclusions:- Exclude Fields button is there.
D) Data Filter :- same as synchronization task 
E) Schedule:-  same as synchronization task 
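
The incremental load types above can be sketched conceptually: after the initial load, each run picks up only rows created or modified since the previous run. The timestamp field name is an assumption for illustration; IICS tracks this internally per object.

```python
from datetime import datetime, timezone

def incremental_load(source_rows, last_run_time):
    # Select only rows created or modified since the previous run;
    # a "full load each run" would simply return every row.
    return [row for row in source_rows if row["modified"] > last_run_time]

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modified": datetime(2024, 3, 1, tzinfo=timezone.utc)},
]
last_run = datetime(2024, 2, 1, tzinfo=timezone.utc)
changed = incremental_load(rows, last_run)  # only the row modified in March
```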


3) Mapping :- First create a mapping between source and target, and include transformations as per the requirement.

4) Mapping Task:- It is used for advanced transformations and optimization. Based on a mapping, create a mapping task. It has a few options, as below :-
A) Definition :- define the name of the task, the folder, and a description.
B) Sources :- select the mapping to run the logic specified in the mapping.
C) Input parameters :- if your mapping is parameterized, use this option to assign values to the parameters specified in the mapping.
D) Schedule :- same as the synchronization task, and Pre & Post SQL commands can be set here.
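
Parameter assignment in step C can be pictured as substituting task-level values into the mapping's logic. The `$$NAME` placeholder style mirrors Informatica's mapping-parameter convention, but this resolver is only a sketch, not how IICS resolves parameters internally:

```python
def resolve_parameters(sql, params):
    # Replace each $$NAME placeholder with the value assigned in the task.
    for name, value in params.items():
        sql = sql.replace("$$" + name, str(value))
    return sql

query = "SELECT * FROM orders WHERE region = '$$REGION' AND year = $$YEAR"
resolved = resolve_parameters(query, {"REGION": "EMEA", "YEAR": 2024})
```

The same mapping can then back several mapping tasks, each supplying its own parameter values.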
Secure Agent

The Secure Agent consists of data integration, process server, and mass ingestion engines, plus connectors to external data sources, to execute both batch and real-time integrations and other forms of integration in the future.




