The job holder is responsible for designing and developing programs, algorithms and automated processes that cleanse, integrate and evaluate large datasets from disparate sources, and for implementing complex business logic as needed with the available data processing tools.
Requirements
- Bachelor's or Master's degree in Statistics, Mathematics, Quantitative Analysis, Computer Science, Software Engineering or Information Technology
- 8+ years of relevant experience developing, debugging and scripting with big data technologies (e.g. Hadoop, Spark, Flink, Kafka, Arrow, Tableau), database technologies (e.g. SQL, NoSQL, graph databases) and programming languages (e.g. Python, R, Scala, Java, Rust, Kotlin)
- Deep experience designing and building dimensional data models, ETL processes and optimized data pipelines, applying data warehouse concepts and methodologies; has served as a data architect or worked extensively with one
- Deep experience monitoring complex systems and diagnosing data and systems issues, with a consistent, algorithmic approach to resolving them
- Deep understanding of Information Security principles to ensure compliant handling and management of all data
- Experience working in Agile teams to lead successful digital transformation projects; has mastered Agile principles, practices and the Scrum methodology
- Has the know-how and the scripting and coding skills to set up, configure and maintain a machine learning model development environment
- Experience architecting, coding and delivering high-performance microservices and/or recommender systems serving recommendations to (tens of) millions of users