Kafka Engineer Job Description
As a Kafka Engineer, you will be responsible for building, improving, and scaling our streaming data platform. This role requires strong technical skills, a deep understanding of distributed systems, and excellent communication abilities.
Kafka Engineer Job Profile
A Kafka engineer is a big data engineer who specializes in developing and managing Kafka-based data pipelines. Kafka is a distributed streaming platform used to build real-time data pipelines and streaming applications. In this role, you will design, build, and operate such pipelines, and you will also work with other big data technologies such as Hadoop, Spark, and Storm.
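To give a flavor of the pipeline work described above, here is a minimal sketch of producing and consuming events with the kafka-python client. The broker address, topic name, and client library are illustrative assumptions, not details of our actual stack.

```python
# Minimal produce/consume sketch (kafka-python assumed; topic and broker are placeholders).
import json
from kafka import KafkaProducer, KafkaConsumer

# Producer: publish JSON events to a hypothetical "clickstream" topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("clickstream", {"user_id": 42, "action": "page_view"})
producer.flush()

# Consumer: read the same topic from the beginning, as a downstream pipeline stage might.
consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)
```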
Kafka Engineer Responsibilities
- Design, develop, and manage Kafka-based data pipelines
- Work with other big data technologies such as Hadoop, Spark, and Storm
- Monitor and optimize Kafka clusters (see the sketch after this list)
- Troubleshoot Kafka-related issues
- Handle customer queries and provide support
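As referenced in the monitoring bullet above, a routine cluster-health task is checking consumer lag per partition. The sketch below uses the kafka-python client; the broker address, consumer group, and topic are hypothetical placeholders, not our production configuration.

```python
# Consumer-lag check sketch (kafka-python assumed; group and topic are placeholders).
from kafka import KafkaConsumer, TopicPartition

consumer = KafkaConsumer(
    bootstrap_servers="localhost:9092",
    group_id="pipeline-consumers",   # hypothetical consumer group
    enable_auto_commit=False,
)

topic = "clickstream"                # hypothetical topic
partitions = [TopicPartition(topic, p) for p in consumer.partitions_for_topic(topic)]

# Lag = latest offset in the partition minus the group's last committed offset.
end_offsets = consumer.end_offsets(partitions)
for tp in partitions:
    committed = consumer.committed(tp) or 0
    print(f"partition {tp.partition}: lag={end_offsets[tp] - committed}")
```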
Kafka Engineer Requirements & Skills
- B.Tech/B.E./M.Tech in Computer Science or a related field
- 3+ years of experience in big data or a related field
- Strong knowledge of Kafka and other big data technologies
- Good programming skills in Python
- Good understanding of distributed systems
- Good communication and interpersonal skills