ABOUT THE TEAM
The World Data Lab (WDL) Tech team builds next-generation data analytics platforms for private-sector clients while also delivering data-driven solutions for the Sustainable Development Goals (SDGs). Our engineering team consists of skilled frontend and backend developers, data engineers, and QA specialists, all collaborating across borders to build resilient, scalable, and innovative data products. We work at the intersection of cloud infrastructure, software engineering, and data science - creating products that empower organizations to make high-impact, data-informed decisions.
As our technical landscape grows in complexity and scale, we’re seeking someone who can take ownership of our DevOps function - driving improvements, increasing system reliability, and embedding observability across our infrastructure.
ABOUT THE ROLE
We’re looking for a DevOps Engineer who will play a leading role in designing, maintaining, and improving our infrastructure. This is a hands-on position where you’ll be responsible for enabling fast, reliable software delivery and platform performance through automation, observability, and operational excellence.
You’ll work closely with software engineers, data engineers, and product teams to ensure that our development processes are scalable, secure, and consistent. This role is ideal for someone who is eager to shape our DevOps practices, proactively identify areas for improvement, and lead infrastructure initiatives across multiple projects.
THE CHALLENGE
- Implement and maintain CI/CD pipelines;
- Actively manage and monitor all cloud (AWS) instances and services;
- Build and manage the AWS server infrastructure (including Kubernetes);
- Implement and maintain a monitoring/metrics/logging platform for all WDL projects;
- Work on ways to automate and improve development and release processes;
- Work with software engineers to ensure that development follows established processes and works as intended;
- Plan projects and participate in project management decisions;
- Work with Data Engineers and Data Scientists on verifying and releasing data updates to our Data Products.
YOUR PROFILE
- 2+ years of professional experience as a DevOps Engineer;
- Knowledge of and experience with AWS, GitHub (Actions), Kubernetes, and relational databases (PostgreSQL, ClickHouse) required;
- Knowledge of and experience with Java (Spring) and Python is an advantage;
- Experience with Jira and observability frameworks (e.g., Grafana) is considered an advantage;
- Hands-on mentality;
- Ability to work across a broad range of technical areas and leverage skill sets within and across different teams;
- Strong knowledge of agile development processes and current best practices;
- Capacity to guide the development of complex projects that involve data scientists, product owners, and designers;
- Exposure to tools like Terraform/Terragrunt, CloudWatch, and Cloudflare;
- Familiarity with Redis, Amazon SQS, and AWS Cognito;
- A habit of writing clear technical documentation.
OUR OFFER
- State-of-the-art equipment and tools that help you to perform at your best
- The opportunity to have an impact in a fast-growing and well-established big data SaaS scale-up
- Flexible working hours with high acceptance of home office
- Be part of a team of highly motivated and like-minded individuals
- Great individual development and growth opportunities
OUR RECRUITMENT PROCESS
- Initial Screening: A quick chat with our People & Culture Manager to understand your background and expectations.
- 1st Interview: Meet the hiring manager and showcase your skills and experience.
- 2nd Interview: Meet with the team and get to know our working culture personally.
- Final Steps: Receive feedback and, if successful, an offer!
We keep it simple and aim to wrap up the process within 3 weeks.
We encourage you to apply even if you only meet most (but not 100%) of the listed requirements - we believe skills evolve over time. If you’re willing to learn and grow with us, we invite you to join our team!