Join our dynamic, high-impact Data team as a Data Engineer. You will be responsible for reliably receiving and storing trading-related data for the India teams, and for operating and improving our shared data access and data processing systems.
Requirements
- 5-7 years of experience in managing large-scale multi-petabyte data infrastructure
- Advanced knowledge of Linux system administration
- Deep expertise in at least one of the following technologies: Kafka, Spark, Cassandra/Scylla, or HDFS
- Strong working knowledge of Docker, Kubernetes, and Helm
- Experience with data access technologies such as Dremio and Presto
- Familiarity with workflow orchestration tools like Airflow and Prefect
- Exposure to cloud platforms such as AWS, GCP, or Azure
- Proficiency with CI/CD pipelines and version control systems like Git
- Understanding of best practices in data security and compliance