Summary
The Data Reliability Engineering team helps build and maintain cloud-native data services and products. Our main responsibility is ensuring high standards of reliability, scalability, and security throughout the entire data lifecycle.
What you'll do
- Responsible for building, maintaining, and evolving cloud-native and containerized infrastructure dedicated to hosting and integrating data products and services
- Play a crucial role in increasing our platform maturity by supporting other squads and participating as a team player in multidisciplinary projects, with a primary focus on availability, security, and scalability
- This staff position requires an advanced understanding of systems design, cloud infrastructure, networking, and databases or data-related technologies; previous experience with complex technology adoptions is also expected
Minimum Qualifications
- Solid experience with cloud management (AWS preferred)
- Solid experience with Kubernetes Management and Containerized applications
- Experience with observability (logging and metrics; tracing is nice to have)
- Experience with shell scripting and Linux administration
- Experience with an automation language such as Python or similar
- Experience with Streaming (Kafka, Kinesis, or similar)
- Experience with networking (load balancers, reverse proxies, gateways, VPC, NAT, etc.)
- Knowledge of databases: MySQL and PostgreSQL
- Desirable knowledge of Big Data
- Desirable experience with serverless platforms such as AWS Lambda or Google Cloud Functions
- Desirable experience with orchestration tools like Airflow
- Desirable knowledge of development and design patterns
- Desirable experience with NoSQL databases
Core Benefits
- Remote work
- Flexible hours
- Gympass
- Meal & Food vouchers
- Remote work financial support
- Life Insurance
- Medical and Dental Assistance
- Employee child care benefit: daycare
- Vidalink partnership
- Birthday day off
- Support for studying languages
- 50% off AWS and GCP certifications
Technologies we use day to day
Engineering
- Java, Groovy and Go
- Automated Testing
- K6 (Load Testing) and Gremlin (Chaos Testing)
- SQL / NoSQL
- Git
- Rest APIs and streaming data
- Cloud (AWS and Google)
- Docker and Kubernetes
- Codefresh & ArgoCD
- Grafana & Honeycomb
- Jira / Confluence
Data
- AWS Services
- Data Processing: Spark, Flink
- Python
- Airflow
- Relational databases (PostgreSQL and MySQL)
Platform Engineering
- AWS
- Codefresh
- ArgoCD
- Grafana & Honeycomb
- Kubernetes
- Terraform
- Go, Python, and Shell Script
- Prometheus
- Istio
Security
- SAST
- SCA
- IaC Scans
Pismo
A remote-first company building cutting-edge financial solutions with a diverse and multicultural team.