Mastering Airflow: Effortlessly Connect to PostgreSQL | Airflow Connection to Postgres #airflow
923 views · 6 months ago
Welcome to our latest tutorial, where we dive deep into the world of Apache Airflow and PostgreSQL! In this video, I’ll guide you step by step on how to seamlessly connect to PostgreSQL from an Airflow DAG. Perfect for data engineers, ETL developers, and anyone passionate about workflow automation.
#ApacheAirflow #PostgreSQL #DataEngineering #ETL #WorkflowAutomation #AirflowTutorial #datapipeline
####
How to connect to Dockerized Postgres:
Expose the Postgres port with the change below in the docker-compose.yml file:
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    ports:
      - "5432:5432"
Then run the command below:
docker-compose up -d --no-deps --build postgres
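The DAG below refers to a connection id `postgres_local`, which has to exist in Airflow first. One way to register it without the UI is an `AIRFLOW_CONN_<CONN_ID>` environment variable holding a connection URI. A minimal sketch of building that URI from the credentials in the compose file above (localhost:5432 is an assumption that holds because the compose file publishes the port):

```python
# Build the connection URI Airflow expects for the 'postgres_local' connection.
# User/password/db come from the docker-compose service above; host and port
# are assumptions (localhost:5432, via the published port).
user, password, host, port, db = "airflow", "airflow", "localhost", 5432, "airflow"
conn_uri = f"postgres://{user}:{password}@{host}:{port}/{db}"

# Exporting this variable before starting Airflow registers the connection:
print(f"AIRFLOW_CONN_POSTGRES_LOCAL={conn_uri}")
```

Alternatively, the same connection can be created in the Airflow UI under Admin → Connections with these fields.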
#####
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

default_args = {
    'owner': 'sumit kumar',
    'retries': 5,
    'retry_delay': timedelta(minutes=1)
}

with DAG(
    dag_id='dag_with_postgres_operator',
    default_args=default_args,
    start_date=datetime(2024, 1, 20),
    schedule_interval='0 0 * * *'
) as dag:
    task1 = PostgresOperator(
        task_id='create_postgres_table',
        postgres_conn_id='postgres_local',
        sql="""
            CREATE TABLE IF NOT EXISTS Orders (
                OrderID INT PRIMARY KEY,
                Status VARCHAR(50)
            )
        """
    )
    task2 = PostgresOperator(
        task_id='insert_into_table',
        postgres_conn_id='postgres_local',
        sql="""
            INSERT INTO Orders VALUES (2, 'delivered')
        """
    )
    task1 >> task2
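One pitfall worth noting: the insert task writes a fixed primary key, so a retry or a second scheduled run will fail with a duplicate-key error. Postgres' ON CONFLICT clause makes the statement idempotent. A minimal sketch of the idempotent version, demonstrated here with the stdlib sqlite3 (recent SQLite accepts the same ON CONFLICT syntax as Postgres); in the DAG you would only change the `sql` string of `task2`:

```python
import sqlite3

# In-memory stand-in for the Postgres table from the DAG above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS Orders (OrderID INT PRIMARY KEY, Status VARCHAR(50))"
)

# ON CONFLICT ... DO NOTHING turns a duplicate-key error into a no-op.
sql = "INSERT INTO Orders VALUES (2, 'delivered') ON CONFLICT (OrderID) DO NOTHING"
conn.execute(sql)  # first run inserts the row
conn.execute(sql)  # a rerun no longer raises an error

rows = conn.execute("SELECT * FROM Orders").fetchall()
print(rows)  # [(2, 'delivered')]
```

With the plain INSERT from the tutorial, the second `conn.execute(sql)` would raise an integrity error instead.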
Published on 1402/11/01.