Kafka Engineer
PhoenixTeam is seeking a Kafka Engineer to support large-scale data streaming and platform modernization initiatives. The role involves designing, developing, and optimizing Apache Kafka clusters, ensuring reliable event streaming pipelines, and collaborating with various teams to achieve operational resilience in complex enterprise systems.
Business Intelligence · Financial Services · Management Information Systems · Software Engineering
Responsibilities
Design, build, administer, and maintain Kafka clusters across development, test, and production environments
Manage Kafka topics, partitions, brokers, replication, retention policies, and access controls
Monitor Kafka performance, availability, throughput, and latency; proactively identify and resolve issues
Perform capacity planning, tuning, upgrades, patching, and disaster recovery planning for Kafka environments
Implement and maintain high availability and fault-tolerant Kafka configurations
Develop and support event streaming pipelines using Kafka for real-time and near-real-time data processing
Integrate Kafka with API Gateway (APIGW)–based microservices and downstream backend systems
Design and implement Kafka producers, consumers, and connectors (e.g., Kafka Connect) to support system integrations and ETL/data movement needs (see the illustrative producer sketch after this list)
Collaborate with application teams to define event schemas, topics, and data contracts
Ensure reliable message delivery, data integrity, and error handling across streaming workflows
Implement Kafka security best practices, including authentication, authorization, encryption in transit, and auditing
Ensure Kafka implementations comply with CMS security, data governance, and operational standards
Support DevSecOps practices, CI/CD pipelines, and infrastructure-as-code approaches where applicable
Participate in incident response, root cause analysis, and operational readiness activities
Work closely with architects, developers, DevOps engineers, and system administrators to support solution design and delivery
Document Kafka architectures, configurations, operational procedures, and integration patterns
Provide technical guidance, troubleshooting support, and knowledge transfer to internal teams
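
For context on the producer, reliability, and security work listed above, the following is a minimal Java sketch only, not part of the formal job description. It assumes a SASL_SSL-secured cluster; the broker addresses, topic name ("claims.events"), and credentials are placeholders, and the SCRAM mechanism is one of several options an actual environment might use.

import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.StringSerializer;

public class ClaimsEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder bootstrap servers; real values come from the target environment.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093,broker2:9093");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Reliable delivery: wait for all in-sync replicas and enable idempotent writes.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        // Encryption in transit and authentication (mechanism and credentials are illustrative).
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
            "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"svc-producer\" password=\"change-me\";");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("claims.events", "claim-123", "{\"status\":\"RECEIVED\"}");
            // Asynchronous send with a callback for basic error handling.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Delivered to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}

The same configuration concerns (serializers, acks, idempotence, SASL_SSL) carry over to the consumer and Kafka Connect sides of the streaming workflows described above.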
Qualifications
Required
Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
3+ years of experience developing, administering, and supporting Apache Kafka in enterprise environments
Hands-on experience managing Kafka clusters, topics, partitions, and event streaming pipelines
Experience integrating Kafka with microservices, API Gateways (APIGW), and backend systems
Strong understanding of event-driven architectures, messaging patterns, and data streaming concepts
Experience with Linux-based environments and command-line administration
Strong troubleshooting and performance tuning skills
Ability to clearly communicate technical concepts to both technical and non-technical stakeholders
Preferred
Experience supporting federal healthcare programs
Experience working in Agile, Scrum, and/or DevSecOps environments
Familiarity with cloud-based Kafka deployments (AWS MSK or similar managed Kafka services)
Experience with CI/CD pipelines and automation tools
Knowledge of cloud security concepts and secure data transmission
Experience with monitoring tools and observability platforms for Kafka (e.g., Prometheus, Grafana, CloudWatch)
Familiarity with schema management tools (e.g., Schema Registry)
Knowledge of containerized environments and orchestration tools (Docker, Kubernetes) is a plus
Company
PhoenixTeam
There's one way to sum up what we believe at PhoenixTeam: time is of the essence. Mortgage technology delivery is unique.
Funding
Current Stage: Growth Stage
Company data provided by Crunchbase