The job holder leads a team that designs and develops programs, algorithms and automated processes to cleanse, integrate and evaluate large datasets from disparate sources, and implements complex business logic as needed with the available data processing tools.
Requirements
- Bachelor's or Master's degree in Statistics, Mathematics, Quantitative Analysis, Computer Science, Software Engineering or Information Technology
- 4+ years of relevant experience developing, debugging and scripting with big data technologies (e.g. Hadoop, Spark, Flink, Kafka, Arrow, Tableau), database technologies (e.g. SQL, NoSQL, graph databases) and programming languages (e.g. Python, R, Scala, Java, Rust, Kotlin)
- Deep experience designing and building dimensional data models, ETL processes and optimized data pipelines, applying data warehouse concepts and methodologies; has served as a data architect or worked extensively with one
- Deep experience monitoring complex systems and resolving data and system issues with a consistent, methodical approach
- Deep understanding of Information Security principles to ensure compliant handling and management of all data
- Proven track record of working in Agile teams and leading successful company-wide digital transformation and change management initiatives; has mastered, and mentored others on, Agile principles, practices and Scrum methodologies
- Well versed in Data and Analytics, with solid know-how of the latest data-related technology trends and a habit of staying up to date at all times
- Proven experience and success stories architecting, coding and delivering high-performance microservices and/or recommender systems serving recommendations to (tens of) millions of users