
Staff BigData Hadoop Administrator

  • Full-time, Senior Level
  • Data, IT & Cybersecurity
  • Dublin (Remote)

AI-generated summary

  • You need 6+ years of overall experience (4+ in DevOps), Hadoop ecosystem expertise, Hadoop security skills, strong scripting knowledge, CI/CD experience, and experience automating deployments with tools like Ansible and Jenkins.
  • You will deploy and monitor Big Data infrastructure, automate CI/CD pipelines, troubleshoot Hadoop components, provide production support, and enforce data governance policies.

Requirements

  • 6+ years of overall experience, with 4+ years of DevOps experience building and administering Hadoop clusters
  • Deep understanding of the Hadoop/Big Data ecosystem. Good knowledge of querying and analyzing large amounts of data on HDFS using Hive and Spark Streaming, and of working with systems such as HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, and Hue (see the querying sketch after this list).
  • Experience securing the Hadoop stack with Sentry, Ranger, LDAP, and a Kerberos KDC
  • Good knowledge of Perl, Python, Bash, Groovy, and Java.
  • In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting
  • Ability to learn quickly in a fast-paced, dynamic team environment
  • Experience using tools like Tableau, Grafana, MariaDB, and Prometheus.
  • Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments
  • Demonstrated expert-level experience delivering end-to-end deployment automation leveraging Puppet, Ansible, Terraform, Jenkins, Docker, Kubernetes, or similar technologies.
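
The Hive/Spark querying described above might look like the following minimal PySpark sketch. It assumes a cluster where Spark can reach the Hive metastore (e.g., hive-site.xml on the classpath); the job name, database, table, and column names are hypothetical, not details from this posting.

    from pyspark.sql import SparkSession

    # Hive-enabled Spark session; assumes the Hive metastore is
    # reachable from this client.
    spark = (
        SparkSession.builder
        .appName("hdfs-audit-rollup")      # hypothetical job name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Aggregate a hypothetical Hive table whose data lives on HDFS.
    daily_counts = spark.sql("""
        SELECT event_date, COUNT(*) AS events
        FROM   audit.access_log            -- hypothetical database.table
        GROUP  BY event_date
        ORDER  BY event_date
    """)

    daily_counts.show(20)
    spark.stop()

The same session object is also the entry point for streaming reads from Kafka; the batch query above is simply the shortest illustration.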

Responsibilities

  • Responsible for deploying, monitoring, maintaining, and supporting Big Data infrastructure and applications in ServiceNow Cloud and Azure environments.
  • Architect and drive end-to-end Big Data deployment automation from vision to delivery: automating the Big Data foundational modules (Cloudera CDP), their prerequisite components, and applications with Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes across all ServiceNow environments.
  • Automate Continuous Integration / Continuous Deployment (CI/CD) data pipelines for applications leveraging tools such as Jenkins, Ansible, and Docker.
  • Performance tuning and troubleshooting of Hadoop components and other data analytics tools in the environment, including HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus (a monitoring sketch follows this list).
  • Provide production support to resolve critical Big Data pipeline and application issues, mitigating or minimizing any impact on Big Data applications. Collaborate closely with Site Reliability Engineering (SRE), Customer Support (CS), development, QA, and systems engineering teams to replicate complex issues, leveraging broad experience with UI, SQL, full-stack, and Big Data technologies.
  • Responsible for enforcing data governance policies in Commercial and Regulated Big Data environments.
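
As a hedged sketch of what the production-monitoring side of these responsibilities can look like, the script below polls the Cloudera Manager REST API for per-service health. The host, port, credentials, and API version are placeholders, not details from this posting.

    import requests

    # Placeholders: substitute your Cloudera Manager host and credentials.
    CM_HOST = "https://cm.example.internal:7183"
    API = f"{CM_HOST}/api/v41"      # the API version varies by CDP release
    AUTH = ("monitor", "REDACTED")  # prefer a read-only account over admin

    def cluster_health():
        """Print a health summary for every service in every cluster."""
        clusters = requests.get(f"{API}/clusters", auth=AUTH, timeout=30)
        clusters.raise_for_status()
        for cluster in clusters.json().get("items", []):
            name = cluster["name"]
            services = requests.get(
                f"{API}/clusters/{name}/services", auth=AUTH, timeout=30
            )
            services.raise_for_status()
            for svc in services.json().get("items", []):
                print(f"{name}/{svc['name']}: "
                      f"{svc.get('healthSummary', 'UNKNOWN')}")

    if __name__ == "__main__":
        cluster_health()

A check like this would normally feed an alerting pipeline (the posting mentions Grafana and Prometheus) rather than print to stdout.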

FAQs

What is the primary role of a Staff BigData Hadoop Administrator at ServiceNow?

The primary role involves deploying, monitoring, maintaining, and supporting Big Data infrastructure and applications on ServiceNow Cloud and Azure environments, as well as ensuring the availability and performance of the service.

What tools and technologies are used for Big Data deployment automation?

The role employs tools such as Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes for end-to-end Big Data deployment automation.
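
To make that answer concrete, here is a minimal, hypothetical wrapper showing how those tools are often chained in a deployment pipeline: Terraform provisions the infrastructure, then Ansible configures the hosts. The directory, inventory, and playbook names are assumptions, not part of the job description.

    import subprocess

    def run(cmd, cwd=None):
        """Echo a command, run it, and fail fast on a non-zero exit."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, cwd=cwd, check=True)

    # 1. Provision infrastructure with Terraform (directory is hypothetical).
    run(["terraform", "init"], cwd="infra/bigdata")
    run(["terraform", "apply", "-auto-approve"], cwd="infra/bigdata")

    # 2. Configure the provisioned hosts with Ansible (names are hypothetical).
    run(["ansible-playbook", "-i", "inventory/prod", "cdp-prereqs.yml"])

In practice, a Jenkins pipeline stage would invoke steps like these, with Docker or Kubernetes handling the application side.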

How many years of experience are required for this position?

At least 6 years of overall experience is required, with 4+ years specifically in DevOps related to building and administering Hadoop clusters.

What are some of the key technologies the candidate should be familiar with?

Candidates should have a deep understanding of the Hadoop/Big Data ecosystem, including HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, and Redis.

Is experience in securing the Hadoop stack necessary for this role?

Yes, experience in securing the Hadoop stack with tools like Sentry, Ranger, LDAP, and Kerberos is required.
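
As a small illustration of routine work on a Kerberized cluster, the snippet below obtains a ticket from a keytab and verifies HDFS access. The principal and keytab path are hypothetical.

    import subprocess

    # Hypothetical service principal and keytab location.
    PRINCIPAL = "hdfs-admin@EXAMPLE.COM"
    KEYTAB = "/etc/security/keytabs/hdfs-admin.keytab"

    # Obtain a Kerberos ticket non-interactively from the keytab.
    subprocess.run(["kinit", "-kt", KEYTAB, PRINCIPAL], check=True)

    # Confirm the ticket works by listing an HDFS path.
    subprocess.run(["hdfs", "dfs", "-ls", "/user"], check=True)

    # Show the ticket cache when troubleshooting authentication issues.
    subprocess.run(["klist"], check=True)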

What programming languages are beneficial for this position?

Proficiency in Perl, Python, Bash, Groovy, and Java is beneficial for the role.

What additional experience would be advantageous for a candidate?

Experience with tools like Tableau, Grafana, MariaDB, and Prometheus, as well as supporting CI/CD pipelines in Cloudera on native cloud and Azure/AWS environments, is advantageous.

What kind of support does ServiceNow provide to its employees?

ServiceNow provides a competitive salary, supportive teams, mental health resources, flexible working culture, parental leave programs, childcare benefits, and development opportunities.

How does ServiceNow approach workplace flexibility?

ServiceNow encourages a flexible work environment with trust, assigning work personas based on the nature of the work to accommodate remote, hybrid, or in-office arrangements.

Is ServiceNow an equal opportunity employer?

Yes, ServiceNow is an equal opportunity employer and considers all qualified applicants without regard to various protected categories.

A purpose-driven company: making work, work better for people guides everything we do.

Industry: Technology
Employees: 10,001+
Founded: 2004

Mission & Purpose

ServiceNow (NYSE: NOW) makes the world work better for everyone. Our cloud-based platform and solutions help digitize and unify organizations so that they can find smarter, faster, better ways to make work flow. So employees and customers can be more connected, more innovative, and more agile. And we can all create the future we imagine. The world works with ServiceNow.

Culture & Values

  • Wow our customers
  • Win as a team
  • Create belonging
  • Stay hungry and humble

Benefits

  • Generous family leave
  • Flexible PTO
  • Matched donations
  • 401(k) matching
  • Annual learning stipends
  • Paid volunteer time