Kargo is seeking a Senior Data Engineer to join its team in London, UK. The successful candidate will implement, optimize, and maintain robust ETL/ELT pipelines, engage in collaborative design and brainstorming sessions, and contribute to defining and implementing testing strategies.
Requirements
- Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight.
- Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of data structuring for efficiency and performance.
- Experience with Airflow for building robust data workflows is strongly preferred.
- Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus.
- Familiarity with analytical data warehouses such as Snowflake or ClickHouse, including writing and optimizing SQL queries and understanding Snowflake's performance and cost characteristics.
- Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation.
- Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions.
- Excellent communication skills to effectively engage in discussions with technical teams and stakeholders.
- Familiarity with identity, privacy, and targeting methodologies in AdTech is required.
Benefits
- Breakthrough cross-screen ad experiences for the world’s leading brands and publishers
- Rapidly evolving data infrastructure and capabilities
- Collaborative design and brainstorming sessions
- Opportunities for growth and development
- Recognition as a Best Place to Work by Ad Age and Built In