
CACI bv · 3 months ago

Kafka Engineer

CACI is seeking a Kafka Engineer to join their team supporting the Border Enforcement Applications for Government Leading-Edge Information Technology (IT) contract. The role involves designing, developing, and deploying high-performance Kafka applications while collaborating with other engineering teams to ensure best practices in an event-driven architecture.

Consulting · Education · Training
No H1B · U.S. Citizen Only

Responsibilities

Serve as an Agile Scrum team member providing software development support and maintenance for the delivery of releasable software in short sprint cycles. Responsible for delivering software solutions for customer-defined systems and software projects, working in close collaboration with software developers/engineers, stakeholders, and end users within Agile processes. Responsibilities include:
Design, develop, and deploy high-performance Kafka producers, consumers, and stream processing applications (using Kafka Streams, ksqlDB, Flink, or Spark Streaming) in Java; an illustrative sketch follows this list
Collaborate with architects and other engineering teams to define and evolve our event-driven architecture, ensuring best practices for Kafka topic design, partitioning, replication, and data retention
Implement and manage components of the Kafka ecosystem, including Kafka Connect (source and sink connectors), Schema Registry (Avro, Protobuf), and Kafka security features
Monitor, troubleshoot, and optimize Kafka clusters and Kafka-dependent applications for throughput, latency, reliability, and resource utilization
Build and maintain robust and resilient data pipelines for real-time ingestion, transformation, and distribution of data across various systems
Provide operational support for Kafka-based systems, including incident response, root cause analysis, and proactive maintenance to ensure high availability and reliability
Enforce data contract definitions and schema evolution strategies using Schema Registry to maintain data quality and compatibility across services
Implement comprehensive testing strategies for Kafka applications, including unit, integration, and end-to-end tests, ensuring data integrity and system reliability
Create and maintain detailed technical documentation, architectural diagrams, and operational runbooks for Kafka-related components and processes
Act as a subject matter expert, sharing knowledge, mentoring junior engineers, and championing Kafka best practices across the organization
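
As an illustrative sketch of the stream-processing work described in the first responsibility above, a minimal Kafka Streams application in Java might look like the following. It assumes a recent Kafka Streams client (3.x or later); the application id, broker address, and topic names are hypothetical placeholders, and the transformation stands in for real business logic.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class OrderEnrichmentApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-enrichment");   // hypothetical application id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            // Exactly-once processing, one of the delivery guarantees named in this posting
            props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> orders = builder.stream("orders-raw");        // hypothetical input topic
            orders.filter((key, value) -> value != null && !value.isBlank())      // drop empty records
                  .mapValues(String::toUpperCase)                                 // stand-in transformation
                  .to("orders-enriched");                                         // hypothetical output topic

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));     // shut down cleanly
            streams.start();
        }
    }

In production the string serdes would typically be replaced by Avro or Protobuf serdes backed by Schema Registry, in line with the serialization requirements listed below.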

Qualifications

Apache Kafka · Kafka Streams API · Java · Distributed systems · Data serialization formats · Relational databases · Monitoring tools · Git · Agile methodologies · Problem-solving skills · Communication skills · Technical documentation

Required

Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include, but are not limited to, the checks listed at the end of this section
Extensive hands-on experience designing, developing, and deploying applications using Apache Kafka (producers, consumers, topic management, consumer groups)
Deep understanding of Kafka's internal architecture, guarantees (at-least-once, exactly-once), offset management, and delivery semantics; an illustrative sketch follows this list
Experience with Kafka Streams API or other stream processing frameworks (e.g., Flink, Spark Streaming with Kafka)
Programming Proficiency: High-level proficiency in at least one modern backend programming language suitable for Kafka development (Java strongly preferred)
Strong understanding of distributed systems principles, concurrency, fault tolerance, and resilience patterns
Experience with data serialization formats such as Avro, Protobuf, or JSON Schema, and their use with Kafka Schema Registry
Solid understanding of relational and/or NoSQL databases, and experience integrating them with Kafka
Excellent analytical, debugging, and problem-solving skills in complex distributed environments
Strong verbal and written communication skills, with the ability to clearly articulate technical concepts to diverse audiences
Knowledge of monitoring and observability tools for Kafka and streaming applications (e.g., Prometheus, Grafana, ELK stack, Datadog)
Working knowledge of Git and collaborative development workflows. Understanding of all elements of the software development life cycle, including planning, development, requirements management, configuration management (CM), quality assurance, and release management
Professional Experience: at least seven (7) years of related technical experience, including software design, development, and implementation in a Windows environment
College degree (B.S.) in Computer Science, Software Engineering, Information Management Systems or a related discipline
Equivalent professional experience will be considered in lieu of a degree
1-year check for misconduct such as theft or fraud
1-year check for illegal drug use
3-year check for felony convictions
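
To illustrate the offset-management and delivery-semantics item above, the following sketch shows a plain Java consumer that achieves at-least-once processing by disabling auto-commit and committing offsets only after records have been handled. The group id and topic name are hypothetical, and process() stands in for real business logic.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class AtLeastOnceConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");            // placeholder broker address
            props.put("group.id", "events-consumer");                    // hypothetical consumer group
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            props.put("enable.auto.commit", "false");                    // offsets committed manually below

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("events"));                    // hypothetical topic name
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        process(record);                                  // stand-in for real processing
                    }
                    // Committing only after processing means a crash before this line causes
                    // redelivery on restart: at-least-once rather than at-most-once semantics.
                    consumer.commitSync();
                }
            }
        }

        private static void process(ConsumerRecord<String, String> record) {
            System.out.printf("offset=%d key=%s value=%s%n",
                    record.offset(), record.key(), record.value());
        }
    }

Exactly-once behavior would instead rely on a transactional producer paired with read_committed consumers, or on Kafka Streams' exactly-once processing guarantee.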

Preferred

Hands-on experience with Confluent Platform components (Control Center, ksqlDB, REST Proxy, Tiered Storage)
Experience with Kafka Connect for building data integration pipelines (developing custom connectors is a plus); an illustrative sketch follows this list
Familiarity with cloud platforms (AWS, Azure, GCP) and managed Kafka services (e.g., AWS MSK, Confluent Cloud, Azure Event Hubs)
Experience with containerization (Docker) and orchestration (Kubernetes) for deploying Kafka-dependent applications
Experience with CI/CD pipelines for automated testing and deployment of Kafka-based services
Familiarity with performance testing and benchmarking tools for Kafka and related applications
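
As a hedged illustration of the Kafka Connect item above, the sketch below registers a hypothetical JDBC sink connector through a Connect worker's REST interface using Java's built-in HTTP client (the text block requires Java 15+). The connector name, worker URL, topic, and database settings are placeholders, and the Confluent JdbcSinkConnector class assumes that connector plugin is installed on the worker.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RegisterSinkConnector {
        public static void main(String[] args) throws Exception {
            // Hypothetical connector definition: write the "events" topic to a Postgres table.
            String payload = """
                {
                  "name": "events-jdbc-sink",
                  "config": {
                    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
                    "topics": "events",
                    "connection.url": "jdbc:postgresql://db-host:5432/events_db",
                    "connection.user": "connect",
                    "connection.password": "change-me",
                    "auto.create": "true"
                  }
                }
                """;

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://connect-host:8083/connectors"))   // placeholder Connect worker URL
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(payload))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());   // 201 Created on success
        }
    }

The same connector definition could equally be applied through CI/CD tooling rather than ad-hoc code, which fits the automated-deployment item above.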

Benefits

Healthcare
Wellness
Financial
Retirement
Family support
Continuing education
Time off benefits

Company

CACI bv

CACI delivers, implements, and manages business-critical solutions for Higher Education: the student information system OSIRIS and LISA for case-oriented working.

Funding

Current Stage: Growth Stage
Company data provided by Crunchbase