Here's what you'll be doing:
- Partner with business stakeholders: Collaborate closely with analysts to translate their needs into technical requirements and data-driven solutions.
- Ingest large-scale data: Use Python, Spark, and SQL to build pipelines that ingest data from business applications and product APIs.
- Architect and deliver robust data solutions: Design and implement ETL pipelines and data warehouse models using SQL and Python.
- Fuel machine learning and AI: Prepare and wrangle data, ensuring high quality and readiness for advanced analytics initiatives.
- Maintain data quality and performance: Ensure data integrity, accuracy, and optimal performance within data pipelines and BI systems.
- Collaborate for success: Partner closely with IT Infrastructure, DevOps, and Security teams to ensure seamless integration and secure deployment of all data solutions.
- Stay ahead of the curve: Continuously learn and stay updated on the latest data engineering and BI trends and technologies.
Requirements
- Data Pipeline developer: Minimum 3 years of experience designing and implementing efficient ETL pipelines using industry-standard tools like Informatica, Rivery, SSIS, or equivalent.
- SQL Master: Minimum 3 years of deep expertise writing complex queries and manipulating data with advanced techniques in Oracle, Snowflake, or similar relational databases.
- Python Proficient: Strong ability to use Python libraries and frameworks (NumPy, Pandas, Spark) for data manipulation, analysis, and scripting tasks.
- Tools Specialist: Experience building and training BI models with Databricks or comparable tools; experience with Rivery and Snowflake.
- Global Collaborator: Proven ability to collaborate effectively with geographically dispersed teams in a fast-paced environment.
- Communication Expert: Skilled in explaining technical concepts clearly to both technical and non-technical audiences.
- Problem Solver: Capacity to approach challenges with creative solutions and adapt quickly to evolving requirements.
- Multitasking Master: Demonstrated ability to manage multiple data workloads concurrently while delivering results efficiently.
- English Fluency: Fluent in spoken and written English.
CyberArk
CyberArk is the global leader in identity security, providing the most comprehensive security offering for any identity – human or machine – across business applications, distributed workforces, hybrid cloud environments, and throughout the DevOps lifecycle.