Sr Big Data Engineer

Remote
Senior
🇮🇳 India
Data Engineer
Technology

About the Role:

We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing and scaling both stream and batch processing systems, extensive experience with Oozie, Airflow, and the Apache Hadoop ecosystem, and a solid understanding of public cloud technologies, especially GCP. This role involves working in a remote environment, requiring excellent communication skills and the ability to solve complex problems independently and creatively.

What you will be doing

  • Build reusable, reliable code for stream and batch processing systems at scale, working with technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark

  • Develop scalable and robust code for batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase

  • Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem

  • Leverage GCP for scalable big data processing and storage solutions

  • Implement automation/DevOps best practices for CI/CD, IaC, containerization, etc.
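The orchestration duties above (Oozie, Airflow) come down to running tasks in dependency order over a directed acyclic graph. A minimal stdlib-Python sketch of that idea, with purely illustrative task names that are not taken from this posting:

```python
from graphlib import TopologicalSorter

# Toy pipeline DAG: task -> set of upstream tasks it depends on.
# Task names here are hypothetical examples, not part of the role.
dag = {
    "extract_logs": set(),
    "clean_logs": {"extract_logs"},
    "aggregate": {"clean_logs"},
    "load_to_warehouse": {"aggregate"},
}

def run_order(dag):
    """Return one valid execution order, as a workflow scheduler would."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(dag))
# ['extract_logs', 'clean_logs', 'aggregate', 'load_to_warehouse']
```

Real schedulers like Airflow add retries, backfills, and distributed execution on top of exactly this dependency-ordering core.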

Requirements

  • Experience with GCP managed services and understanding of cloud-based batch processing systems are critical.

  • Proficiency in Oozie, Airflow, MapReduce, and Java

  • Strong programming skills with Java (specifically Spark), Python, Pig, and SQL

  • Expertise in public cloud services, particularly in GCP.

  • Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce

  • Familiarity with Bigtable and Redis

  • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.

  • Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative answers.

  • Proven ability to work effectively in a remote setting, with strong written and verbal communication skills; collaborate with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.

  • Proven experience in engineering batch processing systems at scale.

  • Hands-on experience in public cloud platforms, particularly GCP. Additional experience with other cloud technologies is advantageous.
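Several of the requirements above reference the MapReduce model. As a toy illustration of its map and reduce phases (function and data names here are hypothetical, not from the posting):

```python
from collections import Counter
from itertools import chain

def map_phase(record):
    # Mapper: emit a (word, 1) pair for every word in the input record.
    return [(word, 1) for word in record.split()]

def reduce_phase(pairs):
    # Reducer: sum the counts per key, as happens after the shuffle step.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

records = ["big data", "big batch data"]
pairs = chain.from_iterable(map_phase(r) for r in records)
print(reduce_phase(pairs))
# {'big': 2, 'data': 2, 'batch': 1}
```

Hadoop distributes these same two phases across a cluster, partitioning the intermediate pairs by key between them.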

 

Rackspace

Rackspace Technology is the multicloud solutions expert, delivering end-to-end solutions across applications, data, and security.

Cloud Computing
Cybersecurity


Realize the full value of the cloud.

🏭Information Technology & Services
🎂1998


