Java Architect @ System Soft Technologies | Jobright.ai
Java Architect jobs in United States
Be an early applicant · Less than 25 applicants

System Soft Technologies · 5 hours ago

Java Architect

Consulting · Customer Service
No H1B · U.S. Citizen Only
Hiring Manager
Kamron Cox

Responsibilities

Lead the collaborative design and implementation of event-driven, real-time streaming architectures using Apache Kafka and Apache Flink, ensuring alignment with microservices principles and Domain-Driven Design (DDD) patterns.
Foster a collaborative environment with fellow streaming developers, promoting knowledge sharing, mentorship, and continuous refinement of best practices in stream processing, fault tolerance, and scalability.
Architect and implement production-grade, fault-tolerant Kafka pipelines and Flink applications using Java 17+, leveraging Flink's DataStream, Table, or SQL APIs to process high-volume, low-latency data streams (a minimal sketch follows this list).
Ensure compliance with company policies, data governance standards, and industry regulations in all aspects of streaming development and operations.
Advocate and enforce best practices for stream processing, code quality (including code reviews), testing strategies, and maintainability to build a resilient and future-proof streaming infrastructure.
Engage in solution architecture discussions, provide technical guidance, and conduct thorough code reviews to uphold high standards of software craftsmanship and system performance.
Drive continuous improvement by identifying and proposing enhancements to operational workflows, technical stack, and development methodologies, focusing on efficiency, scalability, and cost-effectiveness.
Collaborate with fellow engineers, DevOps and operations teams to proactively monitor, troubleshoot, and optimize streaming applications.
Contribute to cross-functional initiatives, knowledge sharing sessions, and documentation efforts to elevate the team's expertise in Kafka, Flink, and event-driven architectures.
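
For context on the kind of pipeline described above, here is a minimal sketch of a checkpointed Flink DataStream job in Java that consumes a Kafka topic; the topic name, broker address, consumer group, and trivial filter step are placeholders standing in for real business logic, not details from this posting.

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class OrderEventsJob {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Periodic checkpoints let the job recover from failures without data loss.
            env.enableCheckpointing(60_000);

            // Kafka source; broker address and topic are illustrative placeholders.
            KafkaSource<String> source = KafkaSource.<String>builder()
                    .setBootstrapServers("kafka-broker:9092")
                    .setTopics("orders")
                    .setGroupId("order-events-job")
                    .setStartingOffsets(OffsetsInitializer.earliest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            env.fromSource(source, WatermarkStrategy.noWatermarks(), "orders-source")
               // Trivial filter stands in for real transformation logic.
               .filter(value -> !value.isBlank())
               .print();

            env.execute("order-events-job");
        }
    }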

Qualifications


Java 17+ · Apache Kafka · Apache Flink · Microservices architecture · Domain-Driven Design (DDD) · SQL · Kafka Connect · Spring Framework · AWS cloud · Docker · Kubernetes · DevOps practices · Data governance · DataDog · Agile/Scrum · JIRA · Confluence · Performance tuning · Schema evolution (Avro) · Schema evolution (JSON) · Stream processing · Mentoring

Required

Must have experience architecting and designing event-streaming data solutions and be able to write high-quality Java code.
Proven expertise in designing, implementing, and optimizing event-driven, real-time streaming architectures using Confluent Kafka and Apache Flink.
Advanced proficiency in Java 17+ and extensive experience with Flink's DataStream, Table, and SQL APIs for developing complex stream processing applications.
Deep understanding of microservices architecture, domain-driven design (DDD) principles, and event sourcing and CQRS patterns, with a track record of applying these concepts in stream processing systems.
Extensive experience with Kafka ecosystem, including Kafka Connect for scalable and fault-tolerant data ingestion/egress, and familiarity with common connectors for databases, message queues, and cloud services.
Demonstrated ability to provide technical leadership, mentor team members, and foster a collaborative environment that promotes knowledge sharing and continuous improvement.
Strong SQL skills, including the ability to write, optimize, and review complex queries, especially in the context of stream-table joins and windowing operations in Flink SQL (see the windowing sketch after this list).
Exceptional problem-solving, debugging, and performance tuning skills, with experience in root cause analysis of issues in distributed streaming systems.
Proficiency with the Spring Framework, particularly Spring Boot and Spring Cloud, for building and deploying microservices-based streaming applications.
Hands-on experience with in-memory data stores like Memcached and Redis for caching, state management, and enhancing the performance of streaming applications.
Solid understanding of AWS cloud for deploying, scaling, and managing streaming workloads.
Experience with DevOps practices, CI/CD pipelines, and containerization technologies (Docker, Kubernetes) to streamline the deployment and management of streaming applications.
Proficiency in Agile/Scrum methodologies, with experience in sprint planning, daily stand-ups, and iterative development in a data-driven environment.
Familiarity with collaboration tools such as JIRA and Confluence.
Excellent interpersonal, written, and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders.
Experience with data governance, schema evolution strategies (Avro and JSON), and ensuring data quality and consistency in streaming pipelines.
Knowledge of monitoring and observability tools (DataDog) for real-time insights into streaming application performance and data flows.
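
As an illustration of the Flink SQL windowing skills listed above, here is a minimal sketch that declares a Kafka-backed table and runs a tumbling-window aggregation from a Java TableEnvironment; the schema, topic, and connector options are assumptions made for the example, not part of this role's actual stack.

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class WindowedRevenueQuery {

        public static void main(String[] args) {
            TableEnvironment tEnv =
                    TableEnvironment.create(EnvironmentSettings.inStreamingMode());

            // Source table over a Kafka topic; connector options are illustrative only.
            tEnv.executeSql(
                "CREATE TABLE orders (" +
                "  order_id STRING," +
                "  amount DECIMAL(10, 2)," +
                "  order_time TIMESTAMP(3)," +
                "  WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND" +
                ") WITH (" +
                "  'connector' = 'kafka'," +
                "  'topic' = 'orders'," +
                "  'properties.bootstrap.servers' = 'kafka-broker:9092'," +
                "  'properties.group.id' = 'windowed-revenue'," +
                "  'scan.startup.mode' = 'earliest-offset'," +
                "  'format' = 'json'" +
                ")");

            // Tumbling-window aggregation: total revenue per one-minute event-time window.
            tEnv.executeSql(
                "SELECT window_start, window_end, SUM(amount) AS revenue " +
                "FROM TABLE(TUMBLE(TABLE orders, DESCRIPTOR(order_time), INTERVAL '1' MINUTES)) " +
                "GROUP BY window_start, window_end")
                .print();
        }
    }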

Preferred

B.S./M.S. in software engineering, computer science, or a related area, or equivalent experience.
A relevant certification (e.g., Confluent Certified Developer for Apache Kafka®) that demonstrates commitment to professional development in this field.
8+ years developing scalable, fault-tolerant full-stack or backend systems in Java, with focus on event-driven and real-time applications.
Deep expertise with Flink's SQL API for stream processing, including experience with complex event processing, temporal tables, and UDFs to enrich streaming data.
Advanced knowledge of Flink state management using RocksDB or Apache Ignite and of optimal checkpointing strategies (see the configuration sketch after this list).
Experience with schema evolution strategies (Avro, JSON) in Kafka and Flink to ensure data compatibility across pipeline upgrades.
Experience with Docker for containerization and Kubernetes (preferably with Helm) for orchestrating streaming deployments.
Strong DevOps skills including Git, CI/CD pipelines, and Infrastructure as Code (Terraform).
Knowledge of stream processing performance tuning and benchmarking.
Knowledge of data modeling techniques for streaming data, including temporal modeling and slowly changing dimensions (SCDs).
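
As a sketch of the RocksDB state backend and checkpointing configuration referenced above, the snippet below shows one way to set it up; the intervals, timeout, and checkpoint storage path are assumed values for illustration, not recommendations from this posting.

    import org.apache.flink.contrib.streaming.state.EmbeddedRocksDBStateBackend;
    import org.apache.flink.streaming.api.CheckpointingMode;
    import org.apache.flink.streaming.api.environment.CheckpointConfig;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class CheckpointedJobSetup {

        public static StreamExecutionEnvironment configure() {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Keep keyed state in RocksDB with incremental checkpoints, so large state
            // does not have to fit on the JVM heap.
            env.setStateBackend(new EmbeddedRocksDBStateBackend(true));

            // Checkpoint every minute with exactly-once guarantees.
            env.enableCheckpointing(60_000, CheckpointingMode.EXACTLY_ONCE);

            CheckpointConfig checkpoints = env.getCheckpointConfig();
            // Durable checkpoint location (the S3 path here is a placeholder).
            checkpoints.setCheckpointStorage("s3://example-bucket/flink-checkpoints");
            checkpoints.setMinPauseBetweenCheckpoints(30_000);
            checkpoints.setCheckpointTimeout(120_000);

            return env;
        }
    }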

Benefits

Medical, dental, & vision insurance
401k + match
Profit sharing
Paid vacation and personal time
Flex time
10 paid holidays
Company performance bonus
Holiday bonus
Paid time to volunteer
Training & career development opportunities

Company

System Soft Technologies

SystemSoft is a company specializing in innovative IT consulting services and solutions.

Funding

Current Stage
Late Stage

Leadership Team

Sreedhar Veeramachaneni · CEO
David Romberger · Senior Client Partner
Company data provided by crunchbase