Mizuho
Data Platform Engineer
Mizuho is a leading financial services provider seeking a Data Platform Engineer to design and operate its enterprise event-streaming platform. The role involves ensuring a reliable and scalable streaming ecosystem, collaborating with application teams, and implementing event-driven integration patterns.
Banking · Financial Services · Impact Investing
Responsibilities
Design, deploy, and operate AMQ Streams (Kafka) clusters on Red Hat OpenShift
Configure and manage Kafka components, including brokers, KRaft controllers, and MirrorMaker 2
Explore Kafka Connect and Schema Registry concepts and implementations
Ensure performance, reliability, scalability, and high availability of the Kafka platform
Implement cluster monitoring, logging, and alerting using enterprise observability tools
Manage capacity planning, partition strategies, retention policies, and performance tuning
Define and document standardized event-driven integration patterns, including:
Event sourcing
CQRS
Pub/sub messaging
Change data capture
Stream processing & enrichment
Request-reply over Kafka
Guide application teams on using appropriate patterns that align with enterprise architecture
Establish best practices for schema design, topic governance, data contracts, and message lifecycle management
Implement enterprise-grade security for Kafka, including RBAC, TLS, ACLs, and authentication/authorization integration (SSO and OAuth)
Maintain governance for topic creation, schema evolution, retention policies, and naming standards
Ensure adherence to compliance, auditing, and data protection requirements (encryption at rest and in flight)
Provide platform guidance and troubleshooting expertise to development and integration teams
Partner with architects, SREs, and developers to drive adoption of event-driven architectures
Create documentation, runbooks, and internal knowledge-sharing materials
Build and maintain GitOps workflows using Argo CD for declarative deployment of Kafka resources and platform configurations
Develop CI/CD pipelines in GitLab, enabling automated builds, infrastructure updates, and configuration promotion across environments
Maintain Infrastructure-as-Code (IaC) repositories and templates for Kafka resources
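As an illustration of the declarative, GitOps-managed resources these responsibilities describe, a Kafka topic on AMQ Streams (Strimzi) can be defined as a Kubernetes custom resource and promoted through environments via Argo CD. A minimal sketch (the topic name, cluster name, and settings below are hypothetical, not taken from this posting):

```yaml
# Hypothetical Strimzi/AMQ Streams KafkaTopic custom resource,
# stored in an IaC repository and reconciled by Argo CD.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: payments.transactions.v1      # hypothetical topic name
  labels:
    strimzi.io/cluster: prod-kafka    # hypothetical cluster name
spec:
  partitions: 12
  replicas: 3
  config:
    retention.ms: 604800000           # 7-day retention policy
    cleanup.policy: delete
```

Managing topics this way keeps partition counts, replication, and retention policies under version control, which supports the topic governance and capacity planning duties listed above.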
Qualifications
Required
Bachelor's degree in Computer Science, Engineering, or a related field
Proven experience with Kafka administration and management
Strong knowledge of OpenShift and container orchestration
Proficiency in scripting languages such as Python or Bash
Experience with monitoring and logging tools (e.g., Splunk, Prometheus, Grafana)
Excellent problem-solving skills and attention to detail
Strong communication and collaboration skills
Preferred
Experience with Red Hat OpenShift administration
Knowledge of service mesh patterns (Istio, OpenShift Service Mesh)
Familiarity with stream processing frameworks (Kafka Streams, ksqlDB, Flink)
Experience using observability stacks (Prometheus, Grafana)
Background working in regulated or enterprise-scale environments
Knowledge of DevOps practices and tools (e.g., ArgoCD, Ansible, Terraform)
Knowledge of SRE monitoring and logging tools (e.g., Splunk, Prometheus, Grafana)
Benefits
Generous employee benefits package
Discretionary bonus
Company
Mizuho
This is not your typical financial institution. It’s our people who make us a cut above.
H-1B Sponsorship
Mizuho has a track record of offering H-1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor)
Distribution of Different Job Fields Receiving Sponsorship
Trends of Total Sponsorships
2025 (49)
2024 (23)
2023 (43)
2022 (12)
2021 (10)
2020 (1)
Funding
Current Stage
Late Stage
Company data provided by Crunchbase