Big Data Systems Engineer - Cloud (DevOps)

JOB DESCRIPTION 

We’re looking for a Big Data Systems Engineer - Cloud (DevOps). We offer:

  • remote work with a globally distributed team;
  • an opportunity to join an AWS Advanced Technology Partner;
  • and an unrivalled global Kubernetes footprint providing critical services to energy providers.

Requirements

The ideal candidate will have at least three years' experience with Hadoop, HBase, and Kafka. Sprinkle in some Spark, and you're a great candidate! We currently run CDH and are migrating to AWS EMR, with Kafka (Strimzi) and Spark on Kubernetes.

You will focus primarily on EMR, Kafka, and Spark, but you are either already certified in both AWS and Kubernetes or can't wait to be. We use Terraform and Ansible in depth, and you are ready to triage, diagnose, and remediate production failures at scale. Monitoring and documentation are part of our definition of done, and you write well.
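To give a flavour of the day-to-day EMR and Spark work, here is a minimal Python sketch using boto3; the cluster ID, bucket, and script path are hypothetical placeholders, and credentials are assumed to be configured:

    # Minimal sketch only: cluster ID and S3 paths are hypothetical placeholders.
    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    # Submit a Spark job to an existing EMR cluster as a step.
    response = emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXXX",  # hypothetical cluster ID
        Steps=[{
            "Name": "example-spark-job",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "--deploy-mode", "cluster",
                         "s3://example-bucket/jobs/etl.py"],  # hypothetical script
            },
        }],
    )

    # Check the step state, as you would when triaging a failure.
    step_id = response["StepIds"][0]
    state = emr.describe_step(ClusterId="j-XXXXXXXXXXXXX", StepId=step_id)
    print(state["Step"]["Status"]["State"])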

Responsibilities

The successful candidate will design, implement, and operate AWS solutions built primarily on EKS and EMR. Strimzi is a Kubernetes Operator, so you should already be comfortable with infrastructure as code.
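Because Strimzi manages Kafka through Kubernetes custom resources, routine operational checks are just API calls. A minimal sketch, assuming kubeconfig access, the Strimzi v1beta2 API, and a placeholder namespace:

    # Minimal sketch: the namespace is a placeholder; requires the "kubernetes" package.
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside the cluster
    api = client.CustomObjectsApi()

    # List Strimzi-managed Kafka clusters and report their readiness.
    kafkas = api.list_namespaced_custom_object(
        group="kafka.strimzi.io", version="v1beta2",
        namespace="kafka", plural="kafkas",
    )
    for k in kafkas["items"]:
        name = k["metadata"]["name"]
        conditions = k.get("status", {}).get("conditions", [])
        ready = any(c["type"] == "Ready" and c["status"] == "True" for c in conditions)
        print(f"{name}: {'Ready' if ready else 'NotReady'}")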

We manage our infrastructure with Git as the source of truth. It is home to the Terraform and Ansible code that does all the heavy lifting. You have Python and Bash in your back pocket and can't wait to use them! We run properly immutable infrastructure, deployed through declarative CI/CD pipelines in Jenkins.
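As an illustration of the kind of Python-and-Bash glue we mean, here is a minimal sketch of a drift check a pipeline stage might run against a Terraform module; the module path is hypothetical and terraform is assumed to be on the PATH:

    # Minimal sketch: the module path is a hypothetical placeholder.
    import subprocess
    import sys

    MODULE_DIR = "infrastructure/emr"  # hypothetical Terraform module path

    # -detailed-exitcode: 0 = no changes, 1 = error, 2 = changes pending.
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", "-input=false", "-no-color"],
        cwd=MODULE_DIR,
    )
    if result.returncode == 0:
        print("No drift: live infrastructure matches Git.")
    elif result.returncode == 2:
        print("Drift detected: plan shows pending changes.")
        sys.exit(1)
    else:
        print("terraform plan failed.")
        sys.exit(result.returncode)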

Why Should You Apply?

You love the pressure that comes with the high stakes of maintaining the integrity of a complex system's data. We run a complex, cloud-native distributed system that demands experience and discipline. You are collaborative, reliable, and deliver for clever people.
