We are looking for a Middle/Senior Data Engineer with a Java / Scala / Python background to join a project for a top-5 US retail broker (by number of users). The project focuses on trading experience, financial reporting, and risk management.
You will join a cross-functional team that excels at taking features from zero to production.
Key responsibilities:
1. Data Pipeline Development:
- Design, develop, and maintain robust data pipelines using Java within AWS infrastructure.
- Implement scalable solutions for data analysis and transformation using Apache Spark and PySpark.
- Utilise Airflow for efficient workflow orchestration in complex data processing tasks.
- Ensure fast, interactive querying capabilities with Presto.
2. Infrastructure Management:
- Containerise applications using Docker for streamlined deployment and scaling.
- Orchestrate and manage containers effectively with Kubernetes in production environments.
- Implement infrastructure as code using Terraform for provisioning and managing AWS resources.
3. Collaboration and Communication:
- Collaborate with cross-functional teams to understand data requirements and architect scalable solutions aligned with business goals.
- Ensure data quality and reliability through robust testing methodologies and monitoring solutions.
- Stay updated with emerging technologies and industry trends to continuously enhance the data engineering ecosystem.
Requirements
1. Education and Experience:
- Bachelor's degree in Computer Science, Engineering, or related field.
- Minimum 4 years of hands-on experience in Java / Scala / Python development, with an emphasis on object-oriented principles.
2. Technical Proficiency:
- Proficient in Apache Spark or PySpark for large-scale data processing.
- Experience with Airflow for workflow orchestration in production environments.
- Familiarity with Docker for containerisation and Kubernetes for container orchestration.
- Knowledge of Terraform for infrastructure as code implementation in AWS environments.
- Experience managing AWS services such as S3, EMR, Glue, Athena, and Redshift.
- Strong background in SQL and relational databases, with proficiency in technologies like Postgres.
- Experience with streaming platforms such as Kafka for real-time data processing is a plus.
3. Communication Skills:
- Excellent English language communication skills, both verbal and written.
- Ability to collaborate effectively with technical and non-technical stakeholders.
Devexperts
Devexperts consults and develops for the financial industry, solving complex technological challenges for well-respected financial institutions worldwide.