Kafka Engineer Job Description
As a Kafka Engineer, you will be responsible for building, improving, and scaling our streaming data platform. This role requires strong technical skills, a deep understanding of distributed systems, and excellent communication abilities.
Kafka Engineer Job Profile
A Kafka Engineer is a big data engineer who specializes in developing and managing Kafka-based data pipelines. Kafka is a distributed streaming platform used to build real-time data pipelines and streaming applications. In this role, you will design and operate these pipelines and work with other big data technologies such as Hadoop, Spark, and Storm.
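For context, the day-to-day work described here typically involves writing pipeline code against a Kafka client library. The sketch below is a minimal, illustrative example of a Python pipeline step using the kafka-python client; the broker address, topic names, and consumer group are assumptions for illustration, not details from this posting.

```python
# Minimal sketch of a Kafka pipeline step: consume raw events, enrich them,
# and republish to a downstream topic. All names here are illustrative.
import json

from kafka import KafkaConsumer, KafkaProducer

BROKERS = ["localhost:9092"]  # assumed local broker for illustration

producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

consumer = KafkaConsumer(
    "raw-events",                  # hypothetical input topic
    bootstrap_servers=BROKERS,
    group_id="enrichment-service", # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Runs until the process is stopped; the producer batches sends internally.
for message in consumer:
    event = message.value
    event["processed"] = True      # placeholder enrichment step
    producer.send("enriched-events", event)  # hypothetical output topic
```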
Reports To
Kafka Engineer Responsibilities
- Design, develop, and manage Kafka-based data pipelines
- Work with other big data technologies such as Hadoop, Spark, and Storm
- Monitor and optimize Kafka clusters
- Troubleshoot Kafka-related issues
- Handle customer queries and support
Kafka Engineer Requirements & Skills
- B.Tech / BE / M.Tech in Computer Science or a related field
- 3+ years of experience in big data or a related field
- Strong knowledge of Kafka and other big data technologies
- Good programming skills in Python
- Good understanding of distributed systems
- Good communication and interpersonal skills