6 changes: 3 additions & 3 deletions landing-pages/site/content/en/use-cases/etl_analytics.md
@@ -13,9 +13,9 @@ blocktype: use-case

</div>

Extract-Transform-Load (ETL) and Extract-Load-Transform (ELT) data pipelines are the most common use case for Apache Airflow. 90% of respondents in the 2023 Apache Airflow survey are using Airflow for ETL/ELT to power analytics use cases.

- The video below shows a simple ETL/ELT pipeline in Airflow that extracts climate data from a CSV file, as well as weather data from an API, runs transformations and then loads the results into a database to power a dashboard. You can find the code for this example [here](https://github.com/astronomer/airflow-quickstart).
+ The video below shows a simple ETL/ELT pipeline in Airflow that extracts climate data from a CSV file, as well as weather data from an API, runs transformations and then loads the results into a database to power a dashboard. You can find the code for this example [here](https://www.astronomer.io/docs/learn/2.x/airflow-quickstart).
Actually, we should avoid pointing to external sources from our "regular" content. I think we should remove it from the main pages and add it to the "ecosystem" page as one of the "community" sources.
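
To make the pipeline described above more concrete, the sketch below shows roughly what such an ETL DAG can look like with the Airflow TaskFlow API. It is not the code from the linked quickstart: the CSV path, weather API URL, database file, and table name are hypothetical placeholders.

```python
# Minimal sketch of the CSV + API -> transform -> database pipeline described above.
# All paths, URLs, and table names are hypothetical placeholders.
import sqlite3

import pandas as pd
import pendulum
import requests
from airflow.decorators import dag, task


@dag(start_date=pendulum.datetime(2024, 1, 1), schedule="@daily", catchup=False)
def climate_etl():
    @task
    def extract_climate_data() -> list[dict]:
        # Extract: read historical climate records from a local CSV file.
        return pd.read_csv("include/climate_data.csv").to_dict(orient="records")

    @task
    def extract_weather_data() -> dict:
        # Extract: fetch current weather for one location from a (placeholder) API.
        response = requests.get("https://api.example.com/weather?city=Berlin", timeout=10)
        response.raise_for_status()
        return response.json()

    @task
    def transform(climate: list[dict], weather: dict) -> list[dict]:
        # Transform: combine both sources into one record set for the dashboard.
        return [{**row, "current_temp": weather.get("temperature")} for row in climate]

    @task
    def load(rows: list[dict]) -> None:
        # Load: write the transformed rows to the reporting database.
        with sqlite3.connect("/tmp/reporting.db") as conn:
            pd.DataFrame(rows).to_sql("climate_report", conn, if_exists="replace", index=False)

    load(transform(extract_climate_data(), extract_weather_data()))


climate_etl()
```

Each `@task` function's return value is passed to the next task via XCom, so the extract → transform → load dependencies follow directly from the function calls at the bottom of the DAG.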



<div id="videoContainer" style="display: flex; justify-content: center; align-items: center; border: 2px solid #ccc; width: 75%; margin: auto; padding: 20px;">
@@ -33,7 +33,7 @@ Airflow is the de-facto standard for defining ETL/ELT pipelines as Python code.
- **Tool agnostic**: Airflow can be used to orchestrate ETL/ELT pipelines for any data source or destination.
- **Extensible**: There are many Airflow modules available to connect to any data source or destination, and you can write your own custom operators and hooks for specific use cases.
- **Dynamic**: In Airflow you can define [dynamic tasks](https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/dynamic-task-mapping.html), which serve as placeholders to adapt at runtime based on changing input (see the sketch after this list).
- **Scalable**: Airflow can be scaled to handle infinite numbers of tasks and workflows, given enough computing power.
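
As an illustration of the dynamic task mapping mentioned in the list above, the sketch below creates one mapped task instance per file name returned at runtime; the file names are hypothetical placeholders rather than part of any real pipeline.

```python
# Minimal sketch of dynamic task mapping: the number of mapped copies of
# `process_file` is decided at runtime from whatever `list_files` returns.
import pendulum
from airflow.decorators import dag, task


@dag(start_date=pendulum.datetime(2024, 1, 1), schedule="@daily", catchup=False)
def dynamic_mapping_example():
    @task
    def list_files() -> list[str]:
        # In a real pipeline this could list objects in cloud storage;
        # here it just returns a static placeholder list.
        return ["sales_1.csv", "sales_2.csv", "sales_3.csv"]

    @task
    def process_file(path: str) -> str:
        # One mapped task instance is created per element of the input list.
        return f"processed {path}"

    # .expand() creates the mapped task instances at runtime.
    process_file.expand(path=list_files())


dynamic_mapping_example()
```

Because `.expand()` is evaluated at runtime, the same DAG handles three files today and three hundred tomorrow without any code change.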


## Airflow features for ETL/ELT pipelines