
Confluent Kafka Admin

Virtusa

Sep 26

Applications are closed

  • Job
    Full-time
    Senior Level
  • Data
    IT & Cybersecurity
  • Tampa

Requirements

  • Minimum 6 years of experience in Big Data administration using Kafka and related tools such as Spark
  • Minimum 4 years of experience setting up environments for solutions that collect and process data using Kafka, Spark, and Scala
  • Expertise in Kafka Big Data solutions, along with a sound understanding of Kafka architecture
  • Manage single- and multi-node Kafka clusters deployed on VMs, Docker, and Kubernetes container platforms
  • Experience with Confluent Platform running on-premises
  • Perform Kafka cluster builds, including MRC, covering design, infrastructure planning, high availability, and disaster recovery
  • Implement wire encryption using SSL, authentication using SASL/LDAP, and authorization using Kafka ACLs across ZooKeeper, brokers, clients, Connect clusters and connectors, Schema Registry, the REST API, producers, consumers, and ksqlDB
  • Develop and maintain Unix scripts to perform day-to-day Kafka administration and security functions using the Confluent REST Proxy server
  • Set up monitoring tools such as Prometheus and Grafana to scrape metrics from Kafka cluster components (brokers, ZooKeeper, Connect, REST Proxy, MirrorMaker, Schema Registry) and other endpoints such as web servers, databases, and logs, and configure alerts for the Kafka cluster and supporting infrastructure to measure availability and performance SLAs
  • Experience with Confluent ksqlDB to query and process Kafka streams
  • Knowledge of Kafka producer and consumer APIs, Kafka Streams processing, and Confluent ksqlDB
  • Security-related configuration for the software listed above and comparable tools: SSL for wire encryption, integration with Active Directory for authentication, and RBAC for authorization
  • Database administration skills in Oracle, MSSQL, SAP HANA, DB2, Aerospike, and Postgres
  • Exposure to SaaS-based observability platforms such as New Relic
  • Deployment of container images and pods via CI/CD pipelines using Jenkins or comparable tools
  • Experience building Kafka deployment pipelines using Terraform, Ansible, CloudFormation templates, and shell scripts
  • Availability to work in shifts and extended hours and to provide on-call support as required; occasional weekend work may be needed depending on project demands
  • Experience working in a public cloud environment such as Azure, AWS, or GCP, preferably Azure
  • Excellent communication and interpersonal skills
  • Experience in Financial Services or a large, complex, and/or global environment preferred
  • Effective written and verbal communication skills
  • Effective analytical and diagnostic skills
  • Good presentation skills
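The Unix-scripting requirement above (day-to-day admin functions via the Confluent REST Proxy) might look like the following sketch. The proxy URL and topic names are placeholders, not part of the posting; the REST Proxy's `GET /topics` endpoint returns a JSON array of topic names.

```shell
#!/bin/sh
# Sketch of a routine admin check against the Confluent REST Proxy.
# REST_PROXY_URL is an assumed default -- adjust for your environment.
REST_PROXY_URL="${REST_PROXY_URL:-http://localhost:8082}"

# Turn a JSON array like ["orders","payments"] into one name per line.
parse_topics() {
  tr -d '[]"' | tr ',' '\n'
}

# Exit 0 if the named topic exists on the cluster, non-zero otherwise.
topic_exists() {
  curl -s "$REST_PROXY_URL/topics" | parse_topics | grep -qx "$1"
}

# Example usage (requires a running REST Proxy):
#   topic_exists orders || echo "ALERT: topic 'orders' is missing"
```

A real script would add error handling for an unreachable proxy and feed alerts into the team's paging system rather than echoing to stdout.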

Responsibilities

  • Creation of key performance metrics, measuring the utilization, performance, and overall health of the cluster
  • Perform high-level, day-to-day administration and support functions
  • Capacity planning and implementation of new and upgraded hardware and software releases, as well as storage infrastructure
  • Research and recommend innovative ways to maintain the environment and where possible, automate key administration tasks
  • Ability to work with various infrastructure, administration, and development teams across business units
  • Document and share design, build, upgrade and standard operating procedures
  • Conduct knowledge transfer sessions and workshops for other members in the team
  • Provide technical expertise and guidance to new and junior members in the team
  • Implement and support enterprise products such as well-known ERP, data warehouse, and middleware offerings
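The alerting responsibility described above (Prometheus scraping Kafka components and alerting on availability SLAs) could be expressed as rules like the following sketch. The metric names assume the common Kafka JMX exporter naming and the job label is a placeholder; both may differ per setup.

```yaml
# Illustrative Prometheus alerting rules for a Kafka cluster.
groups:
  - name: kafka-broker
    rules:
      - alert: KafkaUnderReplicatedPartitions
        expr: kafka_server_replicamanager_underreplicatedpartitions > 0
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "Broker {{ $labels.instance }} has under-replicated partitions"
      - alert: KafkaBrokerDown
        expr: up{job="kafka"} == 0
        for: 2m
        labels:
          severity: critical
        annotations:
          summary: "Kafka broker {{ $labels.instance }} is not being scraped"
```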

FAQs

What is the minimum experience required for the Confluent Kafka Admin position?

The minimum experience required is 6 years in Big Data administration using Kafka and related tools.

What skills are essential for this role?

Essential skills include expertise in Kafka Big Data solutions, knowledge of Kafka architecture, and experience with tools like Spark, Scala, and database administration.

Is experience in container platforms necessary?

Yes, experience in managing single and multi-node Kafka clusters deployed on VM, Docker, and Kubernetes is required.
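For a sense of what the Docker side of this involves, a minimal single-broker lab setup (not production) might look like the sketch below; the image tags, ports, and listener address are illustrative assumptions.

```yaml
# docker-compose sketch: one ZooKeeper node and one Kafka broker.
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1  # single broker only
```

Multi-node clusters add brokers with distinct IDs and listeners, and on Kubernetes the equivalent is typically managed via an operator or Helm chart.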

Will there be opportunities for knowledge transfer and mentoring in this role?

Yes, the role includes conducting knowledge transfer sessions and providing technical guidance to new and junior team members.

Are there specific security practices required for this position?

Yes, knowledge of implementing wire encryption using SSL, authentication using SASL, and authorization using Kafka ACLs in Zookeeper is required.
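On the broker side, those security practices translate into configuration along these lines; hostnames, paths, passwords, and principal names below are placeholders, and the `AclAuthorizer` class applies to ZooKeeper-based clusters.

```properties
# Illustrative broker security settings (server.properties fragment).
listeners=SASL_SSL://broker1.example.com:9093
security.inter.broker.protocol=SASL_SSL

# Wire encryption via SSL
ssl.keystore.location=/etc/kafka/secrets/broker1.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
ssl.truststore.password=changeit

# Authentication via SASL
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN

# Authorization via Kafka ACLs (ACLs are then managed with the
# kafka-acls CLI)
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
super.users=User:admin
allow.everyone.if.no.acl.found=false
```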

Is there a requirement to work outside regular hours?

Yes, availability to work in shifts, extended hours, and provide on-call support as required is necessary, including occasional weekend work.

What kind of software development practices are expected?

Experience in deploying container images and pods using CI/CD pipelines with tools like Jenkins, as well as building Kafka deployment pipelines using Terraform or Ansible, is expected.
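An Ansible-based deployment pipeline of the kind mentioned might include a rolling config rollout like this sketch; the group name, paths, and service name are assumptions, not details from the posting.

```yaml
# Illustrative Ansible play: roll out a broker config change one
# broker at a time so the cluster stays available.
- name: Deploy Kafka broker configuration
  hosts: kafka_brokers
  serial: 1                      # rolling restart, one broker at a time
  tasks:
    - name: Template server.properties
      ansible.builtin.template:
        src: server.properties.j2
        dest: /etc/kafka/server.properties
      notify: restart kafka

  handlers:
    - name: restart kafka
      ansible.builtin.service:
        name: confluent-server
        state: restarted
```

`serial: 1` is the key choice here: it prevents Ansible from restarting all brokers at once, which would take the cluster down.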

Is there a preferred industry experience for this position?

Experience in a Financial Services or complex global environment is preferred but not mandatory.

Are communication skills important for this role?

Yes, excellent communication and interpersonal skills are essential for effective collaboration across teams.

What cloud environments should candidates be familiar with?

Familiarity with public cloud environments like Azure, AWS, or GCP is required, with a preference for Azure.

Business transformation that lasts starts with Engineering First.

Industry: Technology
Employees: 10,001+
Founded: 1996

Mission & Purpose

Virtusa Corporation provides digital engineering and technology services to Forbes Global 2000 companies worldwide. Our Engineering First approach ensures we can execute all ideas and creatively solve pressing business challenges. With industry expertise and empowered agile teams, we prioritize execution early in the process for impactful results. We combine logic, creativity and curiosity to build, solve, and create. Every day, we help clients engage with new technology paradigms, creatively building solutions that solve their most pressing business challenges and move them to the forefront of their industry.
