We are looking for a brilliant DevOps Engineer to strengthen our tech team. Since we all work in the field of Big Data, you need to be in love with large amounts of data and passionate about how you can change the world with it.

Desired skills & experience
The DevOps Engineer is someone who can automate, deploy, monitor, and manage complex cloud services for big data processing and analysis on our AWS infrastructure.

  • Experience with, and a deep appreciation for, automation software such as Chef, Puppet, Salt, or HashiCorp products.
  • Excited about designing automation processes, picking your favorite tools, and making a strong case for them.
  • Able to stuff anything and everything into a Docker container out of sheer love of standardization.
  • Able to wrangle large-scale cloud providers, preferably with previous experience on AWS.
  • Can run, tune, and scale databases, most importantly Elasticsearch; the more database familiarity, the better.
  • Has experience with distributed computing architectures and ideally already knows some or all of the technologies we use: Spark, Kafka, HDFS, Mesos, Marathon, ZooKeeper.

We are really hiring you, not your CV, so make sure to let us know who you are as a person as well…

Being innovative, curious, and self-driven is a must in our very fast-paced environment. You need to love taking responsibility for your area and to constantly look for improvements. Let us know who you are, what makes you tick, and most importantly why you want to join our team.

All shortlisted applicants will undergo testing of programming skills and problem-solving, both individually and within a team. The expected time frame for skills and personality testing is up to one week.