
Reasons to use Azure Data Lake Analytics vs Traditional ETL approach

  • I'm considering using Data Lake technologies, which I have been studying for the last few weeks, compared with the traditional SSIS ETL scenarios I have been working with for many years.

    I think of Data Lake as something closely linked to big data, but where is the line between using Data Lake technologies vs SSIS?

    Is there any advantage to using Data Lake technologies with 25MB, 100MB or 300MB files? Parallelism? Flexibility? Extensibility in the future? Is there any performance gain when the files to be loaded are not as big as U-SQL's best-case scenario...

    What are your thoughts? Would it be like using a hammer to crack a nut? Please don't hesitate to ask me any questions to clarify the situation. Thanks in advance!!

    21/03 EDIT More clarifications:

    1. It has to be in the cloud.
    2. The reason I considered ADL is that there is no substitute for SSIS in the cloud. There is ADF, but it's not the same: it orchestrates data movement, but it's not as flexible as SSIS.
    3. I thought I could use U-SQL for some (basic) transformations, but I see some problems:
      • There are many basic things I cannot do: loops, updates, writing logs to a SQL database...
      • The output can only be a U-SQL table or a file. The architecture doesn't look good this way (although U-SQL is very good with big files, I would need an extra step to export the output to another DB or DWH). Or maybe this is how it's done in big data warehouses... I don't know.
      • In my tests, it takes 40s for a 1MB file and 1m15s for a 500MB file. I cannot justify a 40s process for 1MB (plus uploading to the database/data warehouse with ADF).
      • The code looks disorganised, as scripts with many basic validations become very long U-SQL scripts.
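    For comparison, the kind of row-by-row validation and error collection described above takes only a few lines in a general-purpose language, which is part of why it feels awkward in set-based U-SQL. A minimal sketch in plain Python (the `validate_rows` helper and sample data are made up for illustration, not part of any Azure SDK):

    ```python
    import csv
    import io

    def validate_rows(text):
        """Hypothetical per-row validation of a small id,amount CSV.

        Returns (valid_rows, errors), where errors records the file
        line number and reason - the sort of log U-SQL can't easily
        write back to a SQL database."""
        valid, errors = [], []
        # Data starts on line 2 of the file (line 1 is the header).
        for lineno, row in enumerate(csv.DictReader(io.StringIO(text)), start=2):
            if not row["id"].strip():
                errors.append((lineno, "missing id"))
                continue
            try:
                row["amount"] = float(row["amount"])
            except ValueError:
                errors.append((lineno, "bad amount"))
                continue
            valid.append(row)
        return valid, errors

    sample = "id,amount\n1,10.5\n,3.0\n2,oops\n3,7\n"
    rows, errs = validate_rows(sample)
    # rows holds the 2 clean records; errs holds (line, reason) pairs.
    ```

    This is loop-and-branch logic with an error side-channel - trivial in an imperative language, but exactly the pattern the poster finds hard to express in U-SQL.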

    Don't get me wrong, I really like ADL technologies, but I think that for now it's for something very specific, and there is still no substitute for SSIS in the cloud. What do you think? Am I wrong?

      October 27, 2021 2:10 PM IST
  • Be careful. This question is likely to get closed for being too broad.

    There are many arguments for and against. We can't discuss them all here.

    ADL isn't a replacement for SSIS. The consultant's answer, as always, will be: it depends what you're doing/trying to do.

    A simplistic answer might be: ADL is unlimited and highly scalable; SSIS is not. But yes, ADL has a high entry cost for small files because of that scalability.

    Generally I don't think the two technologies are comparable.

    If you want SSIS in Azure, wait for MS to release it as a PaaS, or use a virtual machine.

      October 30, 2021 2:06 PM IST
  • I think for simpler transformations it may be a good solution; however, if you have complexities, notifications etc., it may not be a good fit. A typical scenario would be something like transforming a JSON document to CSV, then taking the CSV and running that through SSIS for further transforms. There is certainly a future state that will enable U-SQL to be much more powerful; for now I think there are separate and distinct uses for U-SQL/ADLA/ADLS and SSIS.
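    To make that scenario concrete, the JSON-to-CSV pre-processing step could be sketched as follows in plain Python (the `json_to_csv` function and sample document are hypothetical, shown only to illustrate the hand-off to SSIS):

    ```python
    import csv
    import io
    import json

    def json_to_csv(json_text):
        """Flatten a JSON array of objects into CSV text.

        The union of all keys becomes the header; records missing a
        key get an empty cell, so the output is rectangular and ready
        for a downstream SSIS package to consume."""
        records = json.loads(json_text)
        fieldnames = sorted({key for record in records for key in record})
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=fieldnames, restval="")
        writer.writeheader()
        writer.writerows(records)
        return out.getvalue()

    doc = '[{"id": 1, "name": "a"}, {"id": 2, "city": "x"}]'
    csv_text = json_to_csv(doc)
    ```

    The point is the division of labour: a light shaping step turns semi-structured input into a tabular file, and the heavier relational transforms stay in SSIS.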

     
      November 1, 2021 2:31 PM IST
  • For me, if the data is highly structured and relational, the right place for it is a relational database. In Azure you have several choices:

    1. SQL Server on a VM (IaaS): ordinary SQL Server running on a VM. You have to install, configure and manage it yourself, but you get the full flexibility of the product.
    2. Azure SQL Database: PaaS database option targeted at smaller volumes, but now up to 4TB. All of the features of normal SQL Server, with potentially lower TCO and the option to scale up or down using tiers.
    3. Azure SQL Data Warehouse (ADW): MPP product suitable for large warehouses. For me, the entry criteria is warehouses at least 1TB in size, and probably more like 10TB. It's really not worth having an MPP system for small volumes.

    For all database options you can use clustered columnstore indexes (the default in ADW), which can give massive compression, typically between 5x and 10x.

    400MB per day for a year totals ~143GB, which honestly is not that much in modern data warehouse terms, which are normally measured in terabytes (TB).
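    The arithmetic behind the ~143GB figure, and what the compression ratios above do to it, is easy to verify:

    ```python
    # Sanity-checking the sizing estimate: 400 MB/day for one year.
    total_mb = 400 * 365                  # 146,000 MB
    total_gb = total_mb / 1024            # ~142.6 GB, i.e. the ~143GB figure

    # At the 5x-10x clustered-columnstore compression mentioned above,
    # that lands at roughly 14-29 GB on disk - far below the 1TB+ entry
    # point suggested for an MPP warehouse.
    compressed_gb_range = (total_gb / 10, total_gb / 5)
    ```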

    Where Azure Data Lake Analytics (ADLA) comes in is in doing things you cannot do in ordinary SQL, such as:

    • combine the power of C# with SQL for powerful queries - example here
    • dealing with unstructured or semi-structured files like images, XML or JSON - example here
    • using RegEx
    • scale out R processing - example here

    ADLA also offers federated queries: the ability to "query data where it lives", i.e. to bring together structured data from your database and unstructured data from your lake.

    Your decision seems to have more to do with whether or not you should be using the cloud at all. If you need the elastic, scalable features of the cloud, then Azure Data Factory is the tool for moving data from place to place within it.

      October 28, 2021 4:26 PM IST