We are the Polish branch of a fast-growing InsurTech product company from Silicon Valley: Hippo Insurance. Our mission is to revolutionize home insurance in the US, from IoT monitoring devices to our industry-leading software.
How do we want to get there? We need top-notch talent, just like you!
We put a lot of effort into hiring top-tier professionals, which shows how much we care about tech experience, attitude, a human approach, and what we'd call "culture fit".
"SwingDev is all about people" - yes, it may sound a bit of a cliché. But whether we're writing code or just hanging out, we know that people are at the heart of everything we do. We like to have a good time and keep things light, even when we're tackling big projects. We could brag about what makes us special, but we've boiled it down to two key ingredients: mature, companionable people who, rather than compete, prefer to inspire and have each other's backs; and a culture of trust, empathy, and positivity that keeps us together, lets us interact as teammates and friends, and truly enjoy the ride.
About the role:
We’re looking for someone who is passionate about tackling complex data engineering challenges and excited to drive impactful solutions. You'll play a key role in shaping our data architecture for scalability and efficiency, collaborating closely with cross-functional teams to unlock the power of data across the organization.
So if you're a Data Engineer looking to shake things up and have a good time while you're at it, you’ve come to the right place. 🚀
You're a perfect match if you:
Have 4+ years industry experience, including 2+ years in Data Engineering.
Have proficiency in SQL and at least one major development language (Python or JavaScript preferred).
Possess a strong understanding of data warehousing principles, data quality practices, and data pipeline design.
Have excellent verbal and written communication skills to present technical designs to a technical audience and to understand and align on project requirements.
Have the ability to size technical efforts and deliver projects within agreed-upon timelines.
Your responsibilities:
Design, develop, and maintain robust data pipelines to ingest, transform, and load data from various sources into our data warehouse to support company-wide reporting needs.
Partner with product, engineering, analytics, and business teams in several domains to translate business needs into technical requirements and execute on impactful projects for the business.
Work collaboratively to improve data quality across source systems and champion a data-driven culture within the organization.
Actively participate in the evolution of data architecture, contributing your expertise to discussions and proposing innovative solutions.
Are available in the afternoons - due to collaboration with the United States, evening meetings may occur. Rest assured, we prioritize work-life fit, respect everyone's private lives, and don't work at night, but we still need to ensure that communication across time zones is effective.
You will get extra points for:
Experience with dbt, Airflow, or BigQuery.
Experience with Salesforce data or actuarial insurance concepts.
What benefits are waiting for you?
Salary
22 000 - 26 000 PLN + VAT on a B2B contract, or the equivalent on an employment contract
Basics
📝 Form of employment of your choosing
🌎 Remote work & flexible working hours
🤒 Paid sick leave
🏖️ Paid holidays
Health & Safety
💊 Private medical care with dentists & orthodontists package for you and your family
❤️ Group life insurance
🧘 Psychotherapist support — free online sessions with psychologists and psychotherapists
🤸 Home physiotherapy
🏅 Multisport card & meditation apps reimbursed 50%
Working conditions & Development
💻 Gear with an Apple logo and a nice Dell monitor
🌱 50% reimbursement for courses, conferences, books & certificates
🇺🇸 Free access to private language lessons
🐕 6 Personal Development Days & 4 Voluntary Days Off
Extras you may like
🎫 Cafeteria platform — an extra "stówka" (100 PLN) every month to spend on whatever you want
🧒 Nanny services for parents
📦 Concierge services – a personal assistant to help you deal with your everyday matters
🎮 Chill room with table football & PlayStation 4
🍦 Free snacks and ice cream in the office (every day, all year round!)
🍱 Free Friday Lunch in the office
🎉 Team building events — we party together several times a year: the annual Offsite & Christmas Parties, beers after work, and our #WinterEscapeMonth workation in Cyprus