We are looking for a Senior Software Engineer with expertise in Kafka and distributed systems to design and maintain data infrastructure, ensuring efficient data flow and system performance. The ideal candidate will have strong programming skills in Java, Go, or Python and experience with automation, monitoring, and cloud technologies.
Key Responsibilities:
- Kafka Infrastructure: Design, implement, and manage Kafka infrastructure, including topics, brokers, producers, and consumers.
- Application Development: Develop Kafka-based applications using languages such as Java, Go, or Python, leveraging Kafka APIs (Producer/Consumer, Kafka Streams, Kafka Connect).
- API Gateway: Manage and secure APIs using technologies like Kong or Apigee.
- CI/CD & DevOps: Build and maintain CI/CD pipelines to automate testing, deployment, and monitoring of Kafka applications.
- Distributed Systems: Ensure high availability, fault tolerance, and scalability of distributed systems.
- Monitoring & Logging: Utilize tools like Grafana and Kafka Manager to monitor system performance and reliability.
- Automation: Automate Kafka infrastructure management and deployments using scripting languages like Bash, PowerShell, or Perl.
- Linux & Kubernetes: Manage data infrastructure on Linux and Kubernetes, and create new CRDs (Custom Resource Definitions) using Go.
Qualifications:
- Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience with Kafka and distributed systems.
- Proficient in Java, Go, or Python, and familiar with scripting languages.
- Experience with API Gateway technologies (e.g., Kong, Apigee).
- Hands-on experience with CI/CD, DevOps, and automation tools.
- Strong understanding of Linux, Kubernetes, and CRD creation.
This is a long-term contract role.