Social Security Administration
Kafka Engineer
The Social Security Administration (SSA) is seeking passionate Kafka Engineers to join its innovative technology team. In this role, you will design, build, and maintain scalable, high-performance data pipelines using Kafka, collaborating with cross-functional teams to deliver reliable and secure solutions that support millions of Americans every day.
Government · Innovation Management · Non Profit · Professional Services
Responsibilities
Design, develop, and maintain robust Kafka-based applications and data pipelines that support SSA's business operations, including delivering real-time or near-real-time data to AI/ML models
Collaborate with development, operations, and infrastructure teams to deliver reliable, scalable, and high-performing Kafka solutions
Ensure the availability, reliability, and performance of Kafka clusters and related systems
Work closely with architects, data engineers, and stakeholders to define requirements and deliver solutions
Troubleshoot and resolve issues in Kafka applications, ensuring minimal downtime and optimal performance
Document code, design decisions, processes, configurations, and best practices for future reference and team knowledge sharing
Mentor junior developers and share Kafka expertise, fostering a culture of learning and growth
Stay current with the latest Kafka releases, features, and ecosystem advancements
Perform statistical analysis to monitor team performance, improve processes, and ensure customer satisfaction
Define and set SLAs for projects, ensuring high standards of service delivery
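The pipeline responsibilities above follow the common consume-transform-produce pattern. A minimal, broker-free Python sketch of that pattern (the record fields and enrichment step are hypothetical; real code would use a Kafka client library against a running cluster):

```python
from collections import deque

# In-memory stand-ins for Kafka topics (illustration only; a real
# pipeline would consume from and produce to actual Kafka topics).
source_topic = deque([{"claim_id": 1, "amount": 100},
                      {"claim_id": 2, "amount": 250}])
sink_topic = deque()

def transform(record):
    # Hypothetical enrichment step feeding an AI/ML model downstream.
    return {**record, "flagged": record["amount"] > 200}

# Consume -> transform -> produce loop.
while source_topic:
    sink_topic.append(transform(source_topic.popleft()))
```

The same shape carries over to a real consumer poll loop; the transform stays a pure function so it can be unit-tested without a broker.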
Qualifications
Required
Experience must be IT related; the experience may be demonstrated by paid or unpaid experience and/or completion of specific, intensive training (for example, IT certification), as appropriate
Your resume must provide sufficient experience and/or education, knowledge, skills, abilities, and proficiency of any required competencies to perform the specific position for which you are applying
To qualify for the 2210 IT Specialist series, the applicant must demonstrate the following competencies:
Attention to Detail - Is thorough when performing work and conscientious about attending to detail
Customer Service - Works with clients and customers to assess their needs, provide information or assistance, resolve their problems, or satisfy their expectations; knows about available products and services; and is committed to providing quality products and services
Oral Communication - Expresses information to individuals or groups effectively, taking into account the audience and nature of the information; makes clear and convincing oral presentations; listens to others, attends to nonverbal cues, and responds appropriately
Problem Solving - Identifies problems; determines accuracy and relevance of information; uses sound judgment to generate and evaluate alternatives and to make recommendations
Minimum Qualifications: GS-14
To qualify at the GS-14 level, you must have at least 52 weeks of specialized experience at the GS-13 level, or equivalent, designing, developing, and maintaining scalable, fault-tolerant data pipelines using Apache Kafka
Managing and administering Kafka clusters throughout the Systems Development Life Cycle (SDLC), including upgrades and patching
Leading large-scale projects, serving as a Product Owner or Agile/Scrum team lead
Demonstrating strong programming skills in Java, with Python experience as a plus
Utilizing Kafka APIs (Producer, Consumer, Streams, Connect) for event-driven and microservices-based solutions
Applying knowledge of serialization formats (Avro, Protobuf, JSON) and schema registry/data governance, including Hackolade for data modeling
Optimizing producer/consumer performance and handling large-scale data ingestion
Implementing unit and integration testing for Kafka applications
Configuring and tuning Kafka clusters for performance, reliability, and scalability
Monitoring and troubleshooting Kafka clusters using tools such as Prometheus and Grafana
Supporting hybrid integration architecture patterns
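The serialization skills listed above boil down to converting records to and from the bytes Kafka carries on the wire. A minimal sketch using JSON, one of the formats the qualifications name (the function names and record fields are hypothetical; Avro or Protobuf with a schema registry would replace `json` in production):

```python
import json

def serialize_value(record: dict) -> bytes:
    # Kafka message values travel as bytes; sort_keys keeps the
    # encoding deterministic, which simplifies testing and diffing.
    return json.dumps(record, sort_keys=True).encode("utf-8")

def deserialize_value(payload: bytes) -> dict:
    return json.loads(payload.decode("utf-8"))

# Round trip: what a producer serializes, a consumer must recover.
event = {"beneficiary_id": 42, "event": "claim_filed"}  # hypothetical record
assert deserialize_value(serialize_value(event)) == event
```

Keeping serializers as small pure functions like these also makes the unit and integration testing requirement straightforward to satisfy.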
Minimum Qualifications: GS-15
To qualify at the GS-15 level, you must have at least 52 weeks of specialized experience at the GS-14 level, or equivalent, leading the design, development, and implementation of enterprise-scale, fault-tolerant data pipelines using Apache Kafka
Providing expert-level management and administration of Kafka clusters throughout the Systems Development Life Cycle (SDLC), including upgrades and patching
Overseeing large-scale, cross-functional projects as a senior Product Owner or Agile/Scrum leader, ensuring alignment with organizational goals
Demonstrating advanced proficiency in Java programming, with experience in Python as a plus
Architecting event-driven and microservices-based solutions leveraging Kafka APIs (Producer, Consumer, Streams, Connect)
Establishing and enforcing best practices for serialization formats (Avro, Protobuf, JSON) and schema registry/data governance, including Hackolade for data modeling
Directing the optimization of producer/consumer performance and large-scale data ingestion strategies
Leading the implementation of unit and integration testing frameworks for Kafka applications
Managing hybrid integration architecture patterns and ensuring reliability, scalability, and performance of Kafka clusters
Overseeing monitoring and troubleshooting activities using tools such as Prometheus and Grafana
Providing technical guidance and mentorship to teams on Kafka cluster setup, configuration, and tuning
Ensuring compliance with organizational standards and data governance policies
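The tuning and reliability duties above are typically exercised through producer configuration. A minimal sketch of settings commonly adjusted for durability and throughput (values are illustrative, not recommendations; appropriate settings depend on the workload):

```properties
# Durability: require acknowledgement from all in-sync replicas.
acks=all
enable.idempotence=true
# Throughput: batch records and compress batches (illustrative values).
linger.ms=20
batch.size=65536
compression.type=lz4
# Resilience: bound retries and concurrent in-flight requests.
retries=5
max.in.flight.requests.per.connection=5
```

Analogous broker- and consumer-side settings round out the cluster configuration and tuning work described in the qualifications.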
Benefits
Recruitment incentive may be authorized.
Outstationing may be available at the GS-15 level.
Company
Social Security Administration
The Social Security Administration enables individuals to access the Social Security services the government offers.