1. What is Oracle Data Integrator (ODI)?
Oracle acquired Sunopsis in 2006 and with it the “Sunopsis Data Integrator” product.
Oracle Data Integrator (ODI) is an E-LT (Extract, Load and Transform) tool used for high-speed data movement between disparate systems.
The latest version, Oracle Data Integrator Enterprise Edition (ODI-EE), brings together “Oracle Data Integrator” and “Oracle Warehouse Builder” as separate components of a single product with a single licence.
2. What is E-LT?
E-LT is an innovative approach to extracting, loading and transforming data. Traditional ETL vendors have relied on costly, heavyweight mid-tier servers to perform the transformations required when moving large volumes of data around the enterprise.
ODI instead delivers next-generation Extract, Load and Transform (E-LT) technology that improves performance and reduces data integration costs, even across heterogeneous systems, by pushing the transformation work down to the large, powerful database servers already in place within the enterprise.
- What components make up Oracle Data Integrator?
“Oracle Data Integrator” comprises:
1) Oracle Data Integrator (Topology Manager, Designer, Operator and Agent)
2) Oracle Data Quality for Data Integrator
3) Oracle Data Profiling
- What is Oracle Data Integration Suite?
Oracle Data Integration Suite is a set of data management applications for building, deploying, and managing enterprise data integration solutions:
- Oracle Data Integrator Enterprise Edition
- Oracle Data Relationship Management
- Oracle Service Bus (limited use)
- Oracle BPEL (limited use)
- Oracle WebLogic Server (limited use)
Additional product options are:
- Oracle GoldenGate
- Oracle Data Quality for Oracle Data Integrator (Trillium-based DQ)
- Oracle Data Profiling (Trillium-based data profiling)
- ODSI (the former Aqualogic Data Services Platform)
3. What systems can ODI extract and load data into?
ODI brings true heterogeneous connectivity out of the box: it can connect natively to Oracle, Sybase, MS SQL Server, MySQL, LDAP, DB2, PostgreSQL and Netezza.
It can also connect to any data source supporting JDBC; it is even possible to use the Oracle BI Server as a data source via the JDBC driver that ships with BI Publisher.
4. What are Knowledge Modules?
Knowledge Modules are ‘plug-ins’ that allow ODI to generate the relevant execution code, across technologies, to perform tasks in one of six areas. The six types of knowledge module are:
- Reverse-engineering knowledge modules (RKM) read table and other object metadata from source databases
- Journalizing knowledge modules (JKM) record the new and changed data within either a single table or view, or a consistent set of tables or views
- Loading knowledge modules (LKM) efficiently extract data from source databases for loading into a staging area (database-specific bulk unload utilities can be used where available)
- Check knowledge modules (CKM) detect errors in source data
- Integration knowledge modules (IKM) efficiently transform data from the staging area to the target tables, generating optimized native SQL for the given database
- Service knowledge modules (SKM) expose data as web services
ODI ships with many knowledge modules out of the box; they are also extensible and can be modified within the ODI Designer module.
5. How do ‘Contexts’ work in ODI?
ODI offers a unique design approach through the use of contexts and logical schemas. Imagine a development team: within the ODI Topology Manager, a senior developer defines the physical architecture (connections, databases, data servers, tables and so forth).
These physical objects are linked through contexts to ‘logical’ architecture objects, which other developers then use to build interfaces. At run time, a context is specified for execution, and ODI resolves the correct physical connections, databases and tables (source and target) linked to the logical objects used in those interfaces, as defined in the Topology.
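The logical-to-physical resolution described above can be pictured as a simple lookup keyed by logical schema and context. The following is a minimal sketch only; the names (LOGICAL_DW, DEV, PROD, the connection strings) are illustrative assumptions, not real ODI objects.

```python
# Sketch: a context maps a logical schema to one physical connection.
# All names and connection strings below are made up for illustration.
TOPOLOGY = {
    ("LOGICAL_DW", "DEV"):  "jdbc:oracle:thin:@dev-host:1521/DWDEV",
    ("LOGICAL_DW", "PROD"): "jdbc:oracle:thin:@prod-host:1521/DWPROD",
}

def resolve(logical_schema: str, context: str) -> str:
    """Return the physical connection for a logical schema in a given context."""
    return TOPOLOGY[(logical_schema, context)]

# The same interface design runs against different physical systems
# depending only on the execution context chosen at run time:
print(resolve("LOGICAL_DW", "DEV"))   # the DEV connection
print(resolve("LOGICAL_DW", "PROD"))  # the PROD connection
```

This is why developers can build interfaces once against logical objects and promote them from development to production without editing the interfaces themselves.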
6. Does my ODI infrastructure require an Oracle database?
No. The modular ODI repositories (a Master repository plus one or more Work repositories) can be installed on any database engine that supports ANSI ISO 89 syntax, such as Oracle, Microsoft SQL Server, Sybase AS Enterprise, IBM DB2 UDB and IBM DB2/400.
7. Does ODI support web services?
Yes, ODI is SOA-enabled and its web services can be used in three ways:
- The Oracle Data Integrator Public Web Service, which lets you execute a scenario (a published package) from a web service call
- Data Services, which provide a web service over an ODI data store (i.e. a table, view or other data source registered in ODI)
- The ODIInvokeWebService tool, which you can add to a package to request a response from a web service
8. Where does ODI sit with my existing OWB implementation(s)?
As mentioned previously, the ODI-EE licence includes both ODI and OWB as separate products; over time the two tools will converge into Oracle’s unified data integration product.
OWB 11gR2 is Oracle’s first step in bringing the two applications together: it is now possible to use ODI Knowledge Modules within an OWB 11gR2 environment as ‘Code Templates’.
9. What is the ODI Console?
The ODI Console is a web-based navigator that gives browser access to the Designer, Operator and Topology components.
10. Suppose I have six interfaces and the third one fails while running. How do I run the remaining interfaces?
In a sequential load, the failure stops the subsequent interfaces: go to Operator, right-click the failed interface, and click Restart. If the interfaces run in parallel, only the failed interface stops; the others finish.
11. What are load plans, and what types of load plan exist?
A load plan is a process for executing multiple scenarios sequentially, in parallel, or based on conditions. Accordingly, there are three types of load plan: sequential, parallel and condition-based.
12. What is a profile in ODI?
A profile is a set of object privileges. Profiles are assigned to users, and users inherit their privileges from the profiles they hold.
13. How do you write subqueries in ODI?
Use a yellow interface with the subqueries option. Alternatively, create a database view, or call the query directly against the database from an ODI procedure.
14. How do you remove duplicates in ODI?
Enable the DISTINCT option at the IKM level; duplicate rows are then removed while loading into the target.
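The effect of the IKM's DISTINCT option is essentially a `SELECT DISTINCT` in the generated insert. Here is a small stand-alone sketch of that pattern using Python's sqlite3 in place of a real target database; the table and column names are illustrative assumptions.

```python
import sqlite3

# Illustrative stand-in for what an IKM generates when DISTINCT is enabled:
# the insert into the target selects DISTINCT rows from the staging table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (id INTEGER, name TEXT)")
con.execute("CREATE TABLE target  (id INTEGER, name TEXT)")
con.executemany("INSERT INTO staging VALUES (?, ?)",
                [(1, "a"), (1, "a"), (2, "b")])  # one duplicate row
con.execute("INSERT INTO target SELECT DISTINCT id, name FROM staging")
print(con.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2
```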
15. Suppose the source contains both unique and duplicate records, and I want to load the unique records into one table and the duplicates into another. How?
Create two interfaces (or one procedure) with two queries: one selecting the unique values and one selecting the duplicate values.
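The two queries typically split rows by occurrence count: keys appearing once go one way, repeated keys the other. A minimal sqlite3 sketch of that split (table and column names are assumptions):

```python
import sqlite3

# Split source rows by occurrence count: keys appearing once go to "uniq",
# repeated keys go to "dup". Table/column names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (id INTEGER)")
con.executemany("INSERT INTO src VALUES (?)", [(1,), (2,), (2,), (3,), (3,)])
con.execute("CREATE TABLE uniq AS "
            "SELECT id FROM src GROUP BY id HAVING COUNT(*) = 1")
con.execute("CREATE TABLE dup AS "
            "SELECT id FROM src GROUP BY id HAVING COUNT(*) > 1")
print(con.execute("SELECT id FROM uniq").fetchall())            # [(1,)]
print(con.execute("SELECT id FROM dup ORDER BY id").fetchall())  # [(2,), (3,)]
```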
16. How do you implement data validations?
Use filters in the mapping area; for data-quality checks based on constraints, use a CKM with flow control.
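Conceptually, CKM flow control diverts rows that violate a constraint into an error table (ODI's E$_ tables) so that only clean rows reach the target. The sketch below imitates that flow in sqlite3; the tables, the constraint, and all names are illustrative assumptions, not what a real CKM generates.

```python
import sqlite3

# Rough sketch of CKM-style flow control: rows violating a constraint are
# diverted to an error table and only clean rows are loaded into the target.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE flow   (id INTEGER, amount INTEGER)")  # staged data
con.execute("CREATE TABLE e_err  (id INTEGER, amount INTEGER)")  # error table
con.execute("CREATE TABLE target (id INTEGER, amount INTEGER)")
con.executemany("INSERT INTO flow VALUES (?, ?)",
                [(1, 100), (2, None), (3, -5)])
# Assumed constraint: amount must be non-null and positive.
con.execute("INSERT INTO e_err SELECT * FROM flow "
            "WHERE amount IS NULL OR amount <= 0")
con.execute("INSERT INTO target SELECT * FROM flow "
            "WHERE amount IS NOT NULL AND amount > 0")
print(con.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 1
print(con.execute("SELECT COUNT(*) FROM e_err").fetchone()[0])   # 2
```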
17. How do you handle exceptions?
Exceptions can be handled in a package’s Advanced tab and in a load plan’s Exception tab.
18. If one interface in a package fails, how do we know which one failed without access to Operator?
Set up an email alert, or query the SNP_SESS_LOG tables for session log details.
19. How do you implement logic in a procedure so that rows deleted on the source side are also deleted from the target table?
Use a query like this in the procedure’s command:
DELETE FROM Target_table
WHERE NOT EXISTS (SELECT 'X' FROM Source_table
                  WHERE Source_table.ID = Target_table.ID)
20. The source has 15 records in total, of which 2 are updated and 3 are newly inserted. How do we load only the changed and newly inserted records into the target?
Use an IKM Incremental Update knowledge module, which handles both insert and update operations.
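At its core, an incremental update performs an update of rows whose key already exists in the target, then an insert of rows that are new. The sqlite3 sketch below shows that simplified two-step flow; a real IKM stages the data and compares change flags first, and all names here are assumptions.

```python
import sqlite3

# Simplified update-then-insert flow of an Incremental Update strategy.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, val TEXT)")
con.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, val TEXT)")
con.executemany("INSERT INTO src VALUES (?, ?)",
                [(1, "new"), (2, "new"), (3, "fresh")])
con.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "old"), (2, "old")])
# Step 1: update rows whose key already exists in the target...
con.execute("UPDATE tgt SET val = (SELECT val FROM src WHERE src.id = tgt.id) "
            "WHERE EXISTS (SELECT 1 FROM src WHERE src.id = tgt.id)")
# Step 2: ...then insert rows that are new to the target.
con.execute("INSERT INTO tgt SELECT * FROM src "
            "WHERE id NOT IN (SELECT id FROM tgt)")
print(con.execute("SELECT id, val FROM tgt ORDER BY id").fetchall())
# [(1, 'new'), (2, 'new'), (3, 'fresh')]
```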
21. Can we use a package within another package?
Yes, one package can call another package.
22. How do you load data by joining one flat file and one RDBMS table?
Drag and drop both the file and the table into the source area and define the join; the join is then executed in the staging area.