Airflow Data Pipeline Engineer
- Role: Airflow Data Pipeline Engineer
- Employment: Full Time
- Experience: 5 to 7 Years
- Salary: Not Disclosed
- Location: PAN India (Remote)
Programmers.IO is currently looking to hire an Airflow Data Pipeline Engineer with expertise in Apache Airflow for data orchestration, Azure Databricks, Snowflake, and Python. If you think you are a good fit and are willing to work remotely from anywhere in India (PAN India), please apply with your resume or share it at ayushi.khandelwal@programmers.io
Experience Required: 5 to 7 Years
Job Overview
The Airflow Data Pipeline Engineer will design and manage data orchestration workflows using Apache Airflow for Project Quasar. This role ensures efficient and reliable data pipelines for RAN data ingestion and processing, integrating with Databricks and Snowflake platforms.
Responsibilities
- Develop and maintain data orchestration workflows using Apache Airflow.
- Integrate Airflow pipelines with Databricks and Snowflake for data processing (see the sketch after this list).
- Monitor and optimize pipeline performance and reliability.
- Collaborate with data engineers to ensure seamless data flow.
- Troubleshoot and resolve pipeline issues.
- Document pipeline configurations and processes.
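For context, a minimal sketch of the kind of DAG this role would own is shown below: Airflow triggers a Databricks job and then loads the output into Snowflake. The connection IDs, Databricks job ID, table names, and SQL are hypothetical placeholders, not project specifics.

```python
# Illustrative Airflow DAG: trigger a Databricks job, then load results into Snowflake.
# All connection IDs, job IDs, and SQL below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="ran_data_ingestion",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    tags=["ran", "quasar"],
) as dag:
    # Run an existing Databricks job that processes raw RAN data.
    process_ran_data = DatabricksRunNowOperator(
        task_id="process_ran_data",
        databricks_conn_id="databricks_default",
        job_id=12345,                   # placeholder Databricks job ID
    )

    # Load the processed output into a Snowflake table.
    load_to_snowflake = SnowflakeOperator(
        task_id="load_to_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO analytics.ran_metrics FROM @processed_stage;",  # placeholder SQL
    )

    process_ran_data >> load_to_snowflake
```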
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or pipeline development, in line with the 5 to 7 years of experience required for this role.
- Expertise in Apache Airflow for data orchestration.
- Familiarity with Databricks, Snowflake, or similar data platforms.
- Strong programming skills in Python.
- Must be located in India and eligible to work.
Preferred Skills
- Experience in telecommunications or RAN data workflows.
- Knowledge of cloud platforms (e.g., Azure, AWS).
- Familiarity with CI/CD pipelines and Git.
- Experience with large-scale data orchestration projects.
Skills and Knowledge:
- Apache Airflow (data orchestration), Azure Databricks, Snowflake, Python