
Friday, August 10, 2018


ETL Tools: A Modern List

We take a look at how ETL tools have evolved over the years to incorporate the cloud and open source. See which tools work best in which situations.


Extract, Transform, and Load (ETL) tools enable organizations to make their data accessible, meaningful, and usable across disparate data systems. When it comes to picking the right ETL tool, there are many options to choose from. So, where should you start?
We've prepared a list that is simple to digest, organized into four categories to better help you find the best solution for your needs.

Incumbent Batch ETL Tools

Until recently, most of the world's ETL tools were on-prem and based on batch processing. Historically, organizations used their idle compute and database resources during off-hours to run nightly batches of ETL jobs and data consolidation. This is why, for example, you used to see your bank account updated only a day after you made a financial transaction.
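A nightly batch job of this kind can be sketched in a few lines of Python. This is only an illustration of the pattern, not any particular tool's API; the record fields and table name are hypothetical, and an in-memory list and SQLite database stand in for the real source and warehouse.

```python
import sqlite3

def extract(rows):
    """Pull the day's raw transaction records (here, an in-memory stand-in)."""
    return rows

def transform(records):
    """Normalize amounts to integer cents and drop malformed rows."""
    return [
        {"account": r["account"], "cents": int(round(r["amount"] * 100))}
        for r in records
        if "account" in r and "amount" in r
    ]

def load(records, conn):
    """Append the cleaned records to the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS transactions (account TEXT, cents INTEGER)")
    conn.executemany("INSERT INTO transactions VALUES (:account, :cents)", records)
    conn.commit()

# One nightly run: extract -> transform -> load, all in a single batch.
raw = [
    {"account": "A-1", "amount": 19.99},
    {"amount": 5.0},                      # malformed: no account, dropped
    {"account": "B-2", "amount": 3.50},
]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*) FROM transactions").fetchone()[0])  # 2
```

In the batch model, nothing downstream sees these rows until the whole nightly run completes, which is exactly the latency the rest of this post is about.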

Cloud Native ETL Tools

With IT moving to the cloud, more and more cloud-based ETL services have emerged. Some keep the same basic batch model as the legacy platforms, while others offer real-time support, intelligent schema detection, and more.

Open Source ETL Tools

As in other areas of software infrastructure, ETL has seen its own surge of open source tools and projects. Most of them were created as a modern management layer for scheduled workflows and batch processes. For example, Apache Airflow was developed by the engineering team at Airbnb, and Apache NiFi by the US National Security Agency (NSA).
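The core idea behind workflow managers like Airflow is a pipeline expressed as a directed acyclic graph (DAG) of tasks, run in dependency order. That idea can be sketched with Python's standard library alone; this is an illustration of the concept, not Airflow's actual API, and the task names are made up.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of upstream tasks
# that must finish before it may run (an Airflow-style DAG).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

executed = []
for task in TopologicalSorter(dag).static_order():
    executed.append(task)  # a real scheduler would invoke the task here

print(executed)  # ['extract', 'transform', 'load', 'report']
```

A real workflow manager adds scheduling, retries, logging, and distributed execution on top of this ordering, but the DAG is the organizing abstraction.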

Real-Time ETL Tools

Doing your ETL in batches makes sense only if you do not need your data in real time. Batch processing might be good enough for salary reporting or tax calculations, but most modern applications require real-time access to data from different sources. When you upload a picture to your Facebook account, you want your friends to see it immediately, not a day later.
This shift to real-time led to a profound change in architecture: from a model based on batch processing to a model based on distributed message queues and stream processing. Apache Kafka has emerged as the leading distributed message queue for modern data applications, and companies like Alooma and others are building modern ETL solutions on top of it, either as a SaaS platform or an on-prem solution.
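The streaming model can be sketched with Python's standard library, using an in-process queue as a stand-in for a distributed message queue such as Kafka (no Kafka client is used here, and the event fields are hypothetical): producers publish events as they happen, and a consumer transforms and loads each event the moment it arrives, rather than waiting for a nightly batch.

```python
import queue
import threading

# Stand-in for a message-queue topic: producers push events as they occur.
topic = queue.Queue()
loaded = []

def consumer():
    """Transform and load each event immediately on arrival."""
    while True:
        event = topic.get()
        if event is None:  # sentinel: stream closed
            break
        loaded.append({"user": event["user"], "action": event["action"].upper()})

t = threading.Thread(target=consumer)
t.start()

# Produce events in real time; each is processed as soon as it lands.
for event in [{"user": "alice", "action": "upload"},
              {"user": "bob", "action": "like"}]:
    topic.put(event)
topic.put(None)
t.join()
print(loaded)
```

With a real message queue the producer and consumer would be separate services, but the shape is the same: per-event transform-and-load replaces the periodic bulk job.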

Conclusion

Now that you know how ETL tooling has advanced over the years and which options work best in which scenarios, let's look at how to choose between them.
This post contains some representative examples for each category to help you make the choice that meets your needs.

How to Select the Right ETL Tool

First things first, if you don't think you need real-time updates or if you aren't handling data from streaming sources, you can get away with using a tool from any of the categories above.
That said, if you're dealing with streaming data, or very large amounts of data, or if you would rather build your own solution based on open source technology, you're going to want an ETL tool or platform that can keep up with your specific requirements.
If you want to work with your existing vendors, use on-prem technology, and don't rely on real-time processing, consider an incumbent batch tool.
If you prefer to use tools built and delivered via the cloud, or if you want to avoid the overhead of equipment and maintenance costs as your data needs expand, consider a cloud-based solution.
If you want to build the solution yourself and/or if you're comfortable administering, maintaining, and operating open source tools, look into open source offerings.
If your business depends on real-time processing of events, especially large volume data sources and streams, you're going to want a modern ETL platform designed with modern needs in mind.
