Summary
- We are seeking a highly skilled and motivated Staff Software Engineer with an extensive background in data engineering and cloud services, and proficiency in scripting languages, to join our team. The successful candidate will be instrumental in designing, building, and maintaining our data analysis pipelines, ETL processes, AWS backbone functionality, CI/CD, and other related activities.
Required Skills
- Ability to design and build data pipelines.
- Ability to design and build APIs in Python or Java.
- Amazon Web Services data services including:
- EC2 / S3 / Postgres
- Aurora MySQL
- RDS
- Lambda
- Step Functions
- Glue
- Airflow or similar orchestration experience.
- Experience with DevOps practices such as continuous integration/continuous deployment (CI/CD), infrastructure as code (IaC), and automated testing is a plus.
- Strong, demonstrable SQL, Python, and Java skills.
- Thorough understanding of, and support for, Agile development methodologies.
- Ability to work independently, as well as with a team.
- Ability to communicate technical concepts and designs to cross-functional teams who have varying levels of technical experience.
- Proven data engineering, problem solving, and analysis skills.
- High-level written and verbal communication skills.
- Ability to adapt to changing conditions and lead others through change.
- Analytical and problem-solving ability and orientation.
- Demonstrated organizational, prioritization, and time management skills.
- Attention to detail.
- Ability to lead and mentor junior engineers.
- Ability to establish reusable patterns for the engineering team.
Roles & Responsibilities
- Design, create, and modify data pipelines and related interfaces.
- Own and develop interfaces to/from Snowflake; Snowflake administration experience is a plus.
- Contribute to the backend and ELT development effort of our data platform.
- Lead and provide hands-on new development as well as enhancement of existing data processes.
- Design, maintain, and tune extraction, load, and transformation (ELT/ETL) processes using PL/SQL, SQL, Python, or Spark.
- Promote collaboration through activities such as design sessions, design reviews, and pair programming.
Key Duties
- Develop and maintain data engineering solutions for new and existing data-driven product lines.
- Analyze business requirements and work with teammates to formulate the supporting design and design documentation.
- Other duties as assigned.
Education and Experience
- Master’s or PhD in Computer Science or a related field.
- AWS certifications, such as Certified Solutions Architect or similar.
- Snowflake SnowPro Core certification, plus Advanced certifications such as Architect or Data Engineer, with related experience.
- 5+ years of data engineering experience building business intelligence applications with exceptional SQL, Python, and/or Java/JavaScript skills.
- 1+ years working in an agile development environment.
Preferred Experience / Qualifications
- Full-stack development experience; knowledge of Angular.
- API development experience.
- Stash/GitHub/Liquibase experience.
- NoSQL or similar document-oriented storage and analytics experience.
GHX
GHX is a healthcare business and data automation company, empowering healthcare organizations to enable better patient care and maximize industry savings using our world-class, cloud-based supply chain technology exchange platform, solutions, analytics, and services.