
Project · Full-time

Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes.
- Implement scalable big data solutions using AWS, Spark, Scala, Java, and Airflow (an illustrative sketch follows this list).
- Collaborate with data scientists and analysts to ensure efficient data processing and analysis.
- Maintain and improve data architecture, ensuring reliability and performance.
- Troubleshoot and resolve data quality and performance issues in existing pipelines.
- Ensure best practices for security, compliance, and governance in data engineering.
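For context, the sketch below illustrates the kind of Spark-based ETL work described above: reading raw data from S3, cleansing it, aggregating it, and writing a curated dataset back for downstream analytics. It is a minimal, hypothetical example; the bucket paths, column names, and object name are placeholders and not part of the actual role or any existing codebase.

```scala
// Minimal sketch of a Spark ETL job in Scala (illustrative only; paths and
// column names are assumptions, not taken from the role's real pipelines).
import org.apache.spark.sql.{SparkSession, functions => F}

object DailyEventsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-events-etl")
      .getOrCreate()

    // Read raw JSON events from S3 (placeholder bucket and prefix).
    val raw = spark.read.json("s3a://example-raw-bucket/events/2024-01-01/")

    // Basic cleansing: drop rows without a user_id and derive an event date.
    val cleaned = raw
      .filter(F.col("user_id").isNotNull)
      .withColumn("event_date", F.to_date(F.col("event_ts")))

    // Aggregate: count events per user per day.
    val dailyCounts = cleaned
      .groupBy("event_date", "user_id")
      .agg(F.count("*").alias("event_count"))

    // Write partitioned Parquet back to S3 for downstream consumption
    // (e.g. Redshift Spectrum or Glue-cataloged tables).
    dailyCounts.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-curated-bucket/daily_event_counts/")

    spark.stop()
  }
}
```

In practice a job like this would typically be packaged as a JAR, deployed to an AWS-managed Spark runtime, and triggered on a schedule by an Airflow DAG.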

Qualifications & Requirements:
- Proficiency in AWS cloud services (Lambda, S3, Redshift, Glue, etc.).
- Strong experience with Spark for large-scale data processing.
- Expertise in Scala and Java for backend data engineering.
- Hands-on experience with Airflow for workflow automation.

Soft Skills:
- Excellent communication skills in English (Upper-Intermediate B2+ to Advanced C1).

Requirements

- AWS
- Airflow
- Java
- Scala
- English

About the company

Technology services company specializing in software development, AI, digital transformation, and innovation, helping businesses enhance their operations through cutting-edge solutions.

Senior Data Engineer
$25-32/hour