An IT staffing firm is seeking an IT Technical Engineer to research and deploy new tools and frameworks for building a sustainable big data platform. Experience with the Hadoop stack, Kerberos, and AWS products is required.
Requirements
- 18+ months of experience with the Hadoop stack (Pig, Hive, Spark, etc.)
- 6+ months of experience with a Hadoop platform-as-a-service such as EMR, Genie, etc.
- Ability to architect, design, and implement solutions with Amazon VPC, EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon S3 (Simple Storage Service), Route 53, and other AWS products
- 5 or more years of UNIX systems engineering, with experience in Red Hat Linux, CentOS, or Ubuntu
- Hands-on experience with monitoring tools such as AWS CloudWatch or Nagios
- Deep knowledge of TCP/IP networking, SMTP, HTTP, load balancers, and high-availability architecture
- 6+ months of experience with Puppet, Chef, or AWS OpsWorks
- Agile/Scrum application development experience