Analytics Engineer

Hybrid
Mid-level
🇦🇺 Australia
Data Engineer
Data science & Analytics

Do you have the analytical prowess to make data dance and the engineering skills to build data pipelines? If you answered yes, then we've got the perfect gig for you!

What's in it for you?

We are on the lookout for a tech-savvy, data-driven genius who is passionate about the world of crypto and gaming, an analytics enthusiast who lives and dies by data-driven decisions. The successful candidate will join our Data & Business Intelligence team at Easygo to help us improve and optimise our use of data and transition us into a data-driven organisation.

Your role with us:

This is a permanent opportunity where you will drive improvements in data platform engineering, working to get our data to a reliable, reusable and robust state so that it can be used by multiple teams, allowing us to make data-driven decisions that achieve business value. You will work across multiple projects, creating solutions and implementing, maintaining and monitoring automated pipelines. You will understand our data architecture and plan how data will be ingested for the BI team to consume via analytics tools.

You will help establish a shift in the use of data across the organisation, bringing a deep understanding of data ingestion, integration and transformation processes, along with proficiency in cloud-based data warehousing and analytics solutions. You will play a crucial part in building and maintaining our data infrastructure, ensuring high-quality data is readily available for business intelligence and analytics purposes.

Who are we?
At Easygo we proudly stand as a prominent service provider to a powerhouse of brands within the iGaming industry, including Stake.com, Kick.com and Twist Gaming.

Stake is the world's largest crypto casino, and leads the industry with a seamless online casino and sportsbook experience. Level up your online entertainment with Kick.com, the vibrant live streaming platform, which connects millions of gamers and content creators worldwide. All alongside the innovative game design studio, Twist Gaming, which takes creativity to new heights by crafting cutting-edge and captivating games.

Our commitment to placing our clients and their communities' entertainment at the forefront of everything we do has solidified us as the ultimate online service provider for entertainment companies.

Headquartered in the beautiful city of Melbourne, we have grown remarkably: from humble beginnings to a thriving workforce of 300+, we've expanded not only in numbers but in ambition. Whether you work in Tech, Marketing, Operations, Mathematics or Design, there really is something for everyone here.

Click play, on your career today!

What you will do:

  • Leverage AWS Services to build a robust and high-performance data platform to accommodate increasing data volumes, driving enhancements in the data architecture and designing innovative data products.
  • Develop and maintain Terraform code to provision and manage scalable data infrastructure and pipelines, adhering to infrastructure as code (IaC) best practices.
  • Document technical architecture and workflows in Confluence, ensuring comprehensive, accurate, and up-to-date technical documentation.
  • Manage and optimise CI/CD pipelines using AWS CodePipeline for data management and centralized configuration, ensuring efficient, automated, and reliable deployment workflows.
  • Ingest data from various sources including external APIs, web services, and cloud storage, using Python or other programming languages along with AWS services, integrating this data into a centralised repository.
  • Build and manage Amazon Kinesis streams to enable seamless and continuous streaming of data, ensuring optimal performance, scalability, and reliability for real-time data processing.
  • Leverage AWS Glue to develop robust ETL pipelines implementing best coding practices that automate data extraction, transformation, and loading processes to build comprehensive data catalogs.
  • Design and implement robust data models, incorporating principles from methodologies such as Kimball or similar frameworks to optimise data storage, retrieval, and processing, enabling seamless integration and analysis across diverse systems and applications.
  • Implement data governance best practices, including data lineage, quality assurance, and security measures to ensure compliance and maintain data integrity organisation-wide. This includes adhering to data classification levels and standards such as GDPR to safeguard sensitive information and uphold regulatory requirements.
  • Establish a unified source of truth for data models to empower business analytics solutions, implementing best practices and the latest techniques for data quality, conducting automated data checks, reconciliation, and implementing testing processes.

What you will bring:

  • Bachelor's degree in Computer Science, Software Engineering, Information Systems, or a related field (Master's degree preferred).
  • Proven experience in data engineering or a related role, with a strong background in data modeling, ETL, and database management.
  • Proficiency in programming languages such as SQL, Python, or similar.
  • Familiarity with data warehousing solutions (e.g., Snowflake, Redshift) and big data technologies (e.g., Hadoop, Spark).
  • Solid understanding of data governance, data security, and compliance standards.

Bonus points if you also have:

  • AWS certifications (e.g. Data Engineer, Solutions Architect, DevOps Engineer, Machine Learning, Developer) demonstrating proficiency in AWS services.
  • Proficiency in Matillion, the low-code ETL tool commonly used for building and managing ETL pipelines in our current architecture.
  • Familiarity with Machine Learning concepts and tools, which can enhance data engineering projects involving predictive analytics and advanced data processing.
  • Understanding of DevOps principles and practices, enhancing cross-department collaboration.
  • Familiarity with Agile development methodologies and Jira, facilitating iterative and collaborative approaches to project management and delivery.

Some of the perks of working for us:

  • EAP access for you and your family
  • Access to over 9,000 courses across our Learning and Development Platform
  • Paid volunteer day
  • Two full-time baristas who will make your daily coffee, tea, fresh juices and smoothies!
  • Daily catered breakfast
  • Massage Wednesdays - we get professionals to do this!
  • Team lunches and happy hour in the office from 4pm on Fridays
  • Fun office environment with pool tables, table tennis and all your favourite gaming consoles
  • 'Help Yourself' Drinks Fridges and Snack Walls on each of our operating levels

“We are a 2024 Circle Back Initiative Employer – we commit to respond to every applicant”

We believe that the unique contributions of everyone at Easygo are the driver of our success. To make sure that our products and culture continue to incorporate everyone's perspectives and experience we never discriminate on the basis of race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. We are passionate about providing a workplace that encourages great participation and an equal playing field, where merit and accomplishment are the only criteria for success.
