- Drive implementation of state-of-the-art CI/CD pipelines for data applications
- Experience with a relational database (RDBMS) such as Postgres, Oracle, or MySQL
- Comfortable developing CI/CD pipelines with tooling such as Jenkins, GitHub Actions, or Apache Airflow
- Very good software development skills in Java, Scala, or Python, including knowledge of OOP, data structures, and algorithms
- Build and improve the existing continuous integration and deployment environment
- Deliver high-quality automation solutions for building, deploying, and running data pipelines
- Bring best practices into the team and coach junior team members
- Drive the transition from semi-automated processes to fully automated, container-based workflows
- Understand business requirements, be comfortable talking to customers, and translate requirements into technical solutions
Requirements
- BE (Computer Science) / MCA with 2 to 4 years of experience
- Common scripting languages, preferably Bash and Python
- Experience with CI/CD pipelines based on Jenkins, GitLab, or GitHub Actions
- Knowledge of Docker
- Knowledge of Kubernetes basics
- Experience with Apache Airflow preferred
- Very good knowledge of CI/CD concepts and state-of-the-art tools
- Deep experience with at least one common CI/CD tool (e.g. Jenkins, GitHub Actions, GitLab CI, Azure DevOps)
- 3+ years of experience in DevOps / Site Reliability Engineering
- Advanced troubleshooting and debugging skills
- Comfortable working in an agile environment
- Comfortable working in bare-metal, virtualized, containerized, and cloud environments
- Ability to start and drive technical initiatives
- Able to identify and automate manual processes
- Good to have: working knowledge of data pipelines and frameworks used within data engineering (e.g. Apache Spark, Hadoop)
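To give a concrete sense of the CI/CD work described above, here is a minimal sketch of a GitHub Actions workflow that lints-free builds and tests a Python data pipeline on every push. All names in it (job name, file paths, test directory) are illustrative assumptions, not details from this posting:

```yaml
# Illustrative sketch: run a Python data pipeline's tests on push and PR.
name: data-pipeline-ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository and set up a Python toolchain.
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Install the pipeline's dependencies (assumed requirements.txt).
      - name: Install dependencies
        run: pip install -r requirements.txt
      # Run the test suite (assumed tests/ directory, pytest runner).
      - name: Run tests
        run: pytest tests/
```

A real pipeline for data applications would typically add further stages on top of this skeleton, such as building a Docker image and deploying it to Kubernetes, or triggering an Apache Airflow DAG after a successful build.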