
Tuesday, March 3, 2015

Talend Integration Best Practices



1. Start every project with a Business Model, and attach the related documentation to it whenever the Business Model changes.

2. The Talend workspace path should not contain any spaces.

3. Maintain documentation for the important components.

4. Maintain the job version and its documentation; whenever changes happen, do not forget to record them in the document.

5. Create Repository Metadata for DB connections and retrieve the database table schemas for DB tables.

6. Use Repository Schemas for files/DB tables and DB connections.

7. Create the database connection once using a t<Vendor>Connection component (e.g. tMysqlConnection) and reuse that connection throughout the Job. Do not open a new connection in every component.

8. Create a Repository Document corresponding to every Talend job including revision history.

9. Provide a Sub Job title for every sub job to describe its purpose/objective.

10. Avoid hard-coding values in Talend Job components. Use Talend context variables instead.

11. Create Context Groups in the Repository.

12. Use a Talend.properties file to supply the values for context variables via tContextLoad.

13. Create Variables in tMap and use them to assign values to the target fields.

14. Create user routines/functions for common transformations and validations.

15. Always rename the Main Flows in a Talend Job to meaningful names.
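To illustrate item 14: a user routine is just a plain Java class created under Code > Routines, whose static methods you can then call from any tMap or tJavaRow expression. The class and method names below are illustrative, not Talend built-ins; this is a minimal sketch of what such a routine might look like.

```java
public class DataQualityRoutines {

    // Returns true when the input is null, empty, or only whitespace.
    public static boolean isBlank(String value) {
        return value == null || value.trim().isEmpty();
    }

    // Normalises free-text codes: trims, collapses inner whitespace,
    // and upper-cases the result. Blank input yields null.
    public static String normalizeCode(String value) {
        if (isBlank(value)) {
            return null;
        }
        return value.trim().replaceAll("\\s+", " ").toUpperCase();
    }

    // Basic sanity check for email-like strings
    // (deliberately simple, not a full RFC 5322 validator).
    public static boolean isValidEmail(String value) {
        return !isBlank(value) && value.matches("[^@\\s]+@[^@\\s]+\\.[^@\\s]+");
    }
}
```

Inside a tMap expression you would then write something like `DataQualityRoutines.normalizeCode(row1.productCode)` (assuming an input flow `row1` with a `productCode` column), keeping the transformation logic in one reusable, testable place.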

Open Source ETL tools vs Commercial ETL tools

The ETL tools are evaluated on the following categories:

Infrastructure          Functionality           Usability
Platforms supported     Debugging facilities    Data Quality / profiling
Performance             Future prospects        Reusability
Scalability             Batch vs Real-time      Native connectivity


Figure 1: Simple schematic for a data warehous... 

Pentaho Kettle vs Talend


Pentaho
  1. Pentaho is a commercial open-source BI suite that includes a product called Kettle for data integration.
  2. It uses an innovative meta-driven approach and has a strong and very easy-to-use GUI.
  3. The company started around 2001 (Kettle was integrated into it in 2002).
  4. It has a strong community of 13,500 registered users.
  5. It has a stand-alone Java engine that processes the jobs and tasks for moving data between many different databases and files.
  6. It can schedule tasks (but you need an external scheduler for that, such as cron).
  7. It can run remote jobs on "slave servers" on other machines.
  8. It has data quality features: via its own GUI, custom SQL queries, JavaScript, and regular expressions.


Talend
  1. Talend is an open-source data integration tool (not a full BI suite).
  2. It uses a code-generating approach. It has a GUI, but one built on Eclipse RCP.
  3. It started around October 2006
  4. It has a much smaller community than Pentaho, but has two finance companies supporting it.
  5. It generates Java or Perl code, which you then run on your server.
  6. It can schedule tasks (also by using schedulers like cron).
  7. It has data quality features: via its own GUI, custom SQL queries, and Java.


Comparison - (from my understanding)
  • Pentaho is faster (perhaps twice as fast) than Talend.
  • Pentaho's GUI is easier to use than Talend's GUI and takes less time to learn.


My impression
Pentaho is easier to use because of its GUI.
Talend is more a tool for people who are already building a Java program and want to save a great deal of time with a tool that generates code for them.



Assuming Pentaho made it to the next round....

Pentaho Kettle vs Informatica

Informatica
  1. Informatica is a very good commercial data integration suite.
  2. It was founded in 1993
  3. It is the market share leader in data integration (Gartner Dataquest)
  4. It has 2,600 customers, including Fortune 100 companies, companies on the Dow Jones, and government organizations.
  5. The company's sole focus is data integration.
  6. It has quite a big package for enterprises to integrate their systems and cleanse their data, and it can connect to a vast number of current and legacy systems.
  7. It's very expensive, will require training some of your staff to use it, and will probably require hiring consultants as well. (I hear Informatica consultants are well paid.)
  8. It's very fast and can scale for large systems. It has "Pushdown Optimization", an ELT approach that uses the source database to do the transforming, like Oracle Warehouse Builder.
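To make the pushdown/ELT idea concrete: instead of streaming every row through the ETL engine, transforming it in memory, and writing it back, you push a single INSERT ... SELECT down to the database so the aggregation runs where the data already lives. A minimal JDBC sketch of the idea, with made-up table and column names (`sales_staging`, `sales_summary`, `region`, `amount`):

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class PushdownExample {

    // Builds an INSERT ... SELECT so the aggregation runs inside the
    // database (ELT / "pushdown") rather than in the ETL engine.
    static String buildPushdownSql(String sourceTable, String targetTable) {
        return "INSERT INTO " + targetTable + " (region, total_amount) "
             + "SELECT region, SUM(amount) FROM " + sourceTable
             + " GROUP BY region";
    }

    // Executes the pushdown statement on an existing connection
    // (one connection reused across the whole job) and returns
    // the number of rows written to the summary table.
    static int runPushdown(Connection conn, String sourceTable,
                           String targetTable) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            return stmt.executeUpdate(
                buildPushdownSql(sourceTable, targetTable));
        }
    }
}
```

The design point is that the ETL server never touches the individual rows; the database does the GROUP BY itself, which is why pushdown tends to be fast for set-based transformations.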


Comparison
  • Pentaho's JavaScript is very powerful when writing transformation tasks.
  • Informatica has many more enterprise features, for example, load balancing between database servers.
  • Pentaho's GUI requires less training than Informatica's.
  • Pentaho doesn't require the huge upfront costs that Informatica does. (That part you saw coming, I'm sure.)
  • (edited) Informatica is faster than Pentaho. Informatica has Pushdown Optimization, but with some tweaking to Pentaho and some knowledge of the source database, you can improve Pentaho's speed. (Also see the point below.)
  • (new) You can place Pentaho Kettle on many different servers (as many as you like, it's free) and use it as a cluster.
  • Informatica has much better monitoring tools than Pentaho.