Genesis10 · 2 months ago
Streaming Platform Engineer
Genesis10 is currently seeking a Streaming Platform Engineer for a contract-to-hire position based in Plano, TX. The role involves designing, implementing, and operating high-throughput, low-latency event streaming systems on modern distributed messaging platforms, and ensuring the reliability and performance of streaming data integrations.
Information Services · Information Technology
Responsibilities
Design and manage production-grade streaming clusters (Kafka, Pulsar, Event Hubs, Kinesis, etc.) across cloud environments
Configure topics/namespaces/streams, partitioning strategies, retention, replication, and geo-redundancy
Implement schema management using Schema Registry, Pulsar Schema, Protobuf/Avro/JSON Schema, or cloud-native schema services
Ensure high availability, disaster recovery, and multi-region failover
Write scalable, fault-tolerant producers and consumers in Java, Scala, Python, or Go
Build real-time data pipelines using:
Kafka Streams / ksqlDB
Pulsar Functions
Flink
Spark Structured Streaming
Kinesis Data Analytics
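The pipeline frameworks above all center on the same pattern: a keyed, stateful transformation over an unbounded event stream. A framework-free Python sketch of that pattern, roughly what a Kafka Streams `groupByKey().count()` topology computes (illustrative only; the event shape is hypothetical, and real frameworks back the state with fault-tolerant state stores):

```python
from collections import defaultdict

def count_by_key(events):
    """Stateful per-key aggregation over an event stream, analogous to a
    streaming count topology. Illustrative sketch, not a framework API."""
    state = defaultdict(int)  # a real framework uses a fault-tolerant state store
    for key, _value in events:
        state[key] += 1
        yield key, state[key]  # emit each updated aggregate downstream

# Hypothetical click events keyed by user id
events = [("alice", "click"), ("bob", "click"), ("alice", "click")]
updates = list(count_by_key(events))
# updates == [("alice", 1), ("bob", 1), ("alice", 2)]
```

The generator mirrors how stream processors emit a changelog of updated aggregates rather than a final table.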
Guarantee exactly-once or at-least-once semantics, idempotency, and ordered processing
Deploy CDC pipelines using Debezium, Maxwell, MongoDB Change Streams, or cloud-native CDC (DMS, Dataflow)
Use connectors (Kafka Connect, Pulsar IO, Event Hubs Capture) to sync data to:
Data warehouses (Snowflake, BigQuery, Redshift)
Operational databases (Azure SQL, PostgreSQL, Cassandra, DynamoDB)
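The CDC and connector work above reduces to applying an ordered stream of change events to a downstream store. A minimal sketch assuming a simplified Debezium-style event shape (`op` of `c`/`u`/`r`/`d` plus an `after` row image; real connector payloads also carry schemas, a separate message key, and source metadata):

```python
def apply_cdc_event(table, event):
    """Apply one simplified Debezium-style change event to a downstream
    key/value store. Real connectors also handle schemas, transaction
    boundaries, and ordering guarantees; this shows only the core
    upsert/delete logic."""
    key = event["key"]
    if event["op"] in ("c", "u", "r"):   # create / update / snapshot read: upsert
        table[key] = event["after"]
    elif event["op"] == "d":             # delete: remove the row downstream
        table.pop(key, None)
    return table

table = {}
for ev in [
    {"op": "c", "key": 1, "after": {"name": "alice"}},
    {"op": "u", "key": 1, "after": {"name": "alicia"}},
    {"op": "d", "key": 1, "after": None},
]:
    apply_cdc_event(table, ev)
# after create + update + delete, the downstream table is empty again
```

Replaying the full event stream in order reproduces the source table state, which is why per-key ordering matters so much for CDC sinks.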
Design schemas, indexes, and partitioning in RDBMS (Azure SQL, PostgreSQL, MySQL) and NoSQL (Cassandra, DynamoDB, MongoDB) for high-velocity writes
Optimize query performance for event-sourced or streaming-derived data
Manage data consistency between streams and persistent stores (event sourcing, CQRS patterns)
Instrument end-to-end monitoring (Prometheus, Grafana, Datadog, CloudWatch, Azure Monitor)
Set up consumer lag alerts, schema compatibility checks, and error dead-letter queues
Automate deployments with Terraform, Helm, Kubernetes Operators (Strimzi, Pulsar Operator), or cloud CLIs
Write contract tests, integration tests, and chaos engineering scenarios
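Several of the responsibilities above (at-least-once semantics, idempotency, dead-letter queues) come together in the consumer loop: exactly-once *effects* are typically approximated by idempotent processing on top of at-least-once delivery. A client-library-free sketch of the dedup-and-DLQ pattern (all names hypothetical; with a real broker client, `seen` would be a durable store keyed by message id, and the DLQ a separate topic):

```python
def consume(messages, handler, seen, dlq):
    """At-least-once consumer loop sketch: deduplicate redeliveries by
    message id (idempotency) and route poison messages to a dead-letter
    queue instead of blocking the partition."""
    results = []
    for msg in messages:
        if msg["id"] in seen:        # duplicate redelivery: already processed, skip
            continue
        try:
            results.append(handler(msg["value"]))
            seen.add(msg["id"])      # mark processed only after success
        except Exception:
            dlq.append(msg)          # poison message: park it for inspection
            seen.add(msg["id"])      # don't reprocess it on redelivery
    return results

seen, dlq = set(), []
messages = [
    {"id": 1, "value": 2},
    {"id": 1, "value": 2},       # redelivered duplicate, skipped
    {"id": 2, "value": "oops"},  # fails the handler, routed to the DLQ
]
out = consume(messages, lambda v: v + 1, seen, dlq)
# out == [3]; dlq holds the one failing message
```

Parking failures in a DLQ keeps consumer lag from growing behind a single bad record, which is exactly what the lag alerts and dead-letter queues above are meant to catch.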
Qualifications
Required
Streaming Platforms: Apache Kafka or Apache Pulsar (deep expertise in at least one); familiarity with Kinesis, Event Hubs, Pub/Sub, Redpanda
Programming: Go, Java, or Scala
Processing Frameworks: Kafka Streams, ksqlDB, Pulsar Functions, Flink, Spark Streaming
Schema & Serialization: Avro, Protobuf, JSON Schema, Schema Registry (Confluent, Apicurio, Pulsar Schema)
CDC & Connectors: Debezium, Kafka Connect, Pulsar IO, AWS DMS, Azure CDC
Databases: Azure SQL/PostgreSQL/MySQL (indexing, partitioning), Cassandra/DynamoDB/MongoDB
Cloud & Infra: AWS, GCP, Azure; Docker, Kubernetes, Terraform, CI/CD
Preferred
Migrated from Kafka to Pulsar (or vice versa) in production
Built multi-tenant streaming platforms with isolation and quota enforcement
Used event sourcing, CQRS, or domain-driven design with streams
Contributed to Strimzi, Pulsar Operators, or open-source connectors
Certified: Confluent Certified Developer, Databricks Apache Spark, AWS Data Analytics, etc.
Benefits
Behavioral Health Platform
Medical, Dental, Vision
Health Savings Account
Voluntary Hospital Indemnity (Critical Illness & Accident)
Voluntary Term Life Insurance
401K
Sick Pay (for applicable states/municipalities)
Commuter Benefits (Dallas, NYC, SF)
Remote opportunities available
Company
Genesis10
Information Technology and Services
H1B Sponsorship
Genesis10 has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The figures below are provided for reference (data powered by the US Department of Labor).
Trends of Total Sponsorships
2025 (126)
2024 (68)
2023 (20)
2022 (2)
2021 (13)
2020 (29)
Funding
Current Stage
Late Stage (company data provided by Crunchbase)