Specialist Solutions Engineer (Remote, USA) @ Confluent | Jobright.ai
160 applicants

Confluent · 10 hours ago

Specialist Solutions Engineer (Remote, USA)

Analytics · Cloud Data Services

Responsibilities

Enable Customers with Real-Time Architectures
Understand customer challenges with traditional Data Warehouses, Data Lakes, and Batch Analytics workflows, and guide them toward real-time, distributed architectures using Kafka, Flink, Kafka Streams, and modern ETL/ELT frameworks.
Help customers optimize their data platforms by focusing on early-stage data enrichment, filtering, and processing for better performance and cost efficiency (a minimal code sketch follows this list).
Provide Technical Expertise
Assist customers and sales teams in designing, deploying, and optimizing real-time data streaming platforms, integrating Kafka with distributed processing, and ensuring alignment with business goals.
Architect solutions to unify operational and analytical workloads, enabling a data mesh or streaming-first architecture.
Collaborate Across Teams
Partner with sales, product management, marketing, and engineering to translate customer feedback into continuous improvements for Confluent’s platform.
Work cross-functionally to refine product messaging and technical positioning, ensuring clarity and alignment with customer needs.
Drive Thought Leadership
Lead technical workshops, webinars, and sessions to showcase the advanced capabilities of Kafka, Kafka Connect, Flink, and other data streaming tools.
Advocate for best practices in real-time architectures through case studies, training sessions, and field engagement with both technical and business stakeholders.
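To make the "early-stage enrichment and filtering" idea above concrete, here is a minimal Kafka Streams sketch. It is an illustration, not Confluent's prescribed approach: the application id, broker address, topic names (orders.raw, orders.enriched), and the string-based enrichment are hypothetical stand-ins.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class EarlyFilterEnrich {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "early-filter-enrich");      // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");        // assumed local broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw events, drop records downstream analytics never need, and tag
        // the survivors before they ever reach a warehouse or lakehouse sink.
        KStream<String, String> raw = builder.stream("orders.raw");                 // hypothetical topic
        raw.filter((key, value) -> value != null && !value.contains("\"test\":true"))  // early filtering
           .mapValues(value -> value.replaceFirst("\\{", "{\"source\":\"web\","))      // crude enrichment stand-in
           .to("orders.enriched");                                                  // hypothetical topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In practice the enrichment step would typically be a join against a compacted reference topic (a KTable) or a Flink SQL query; the string manipulation above only stands in for that idea.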

Qualifications


Kafka, Data Warehouses, Data Lakes, Distributed Systems, ETL tools, Python, Java, CDC Connectors, Real-time analytics, Cloud-native platforms, Avro, Parquet, AWS, GCP, Azure, Pre-sales engineering, Proof-of-concept development, Solution design, Technical presentations

Required

Proven experience architecting and implementing enterprise-scale, real-time data streaming solutions with Kafka and related ecosystems.
Deep knowledge of Data Warehouses, Data Lakes, Lakehouses, and Distributed Systems, including CDC Connectors, ETL tools, and RDBMS platforms.
Expertise in distributed data processing, data governance, and real-time analytics.
Hands-on programming experience with Python, Java, or similar, especially for building streaming and ETL pipelines.
A demonstrated ability to become a trusted advisor to Chief Data Officers, Data Engineers, Data Scientists, Data Analysts, and Architects, driving customer success and platform adoption.
Strong pre-sales engineering skills, including technical presentations, proof-of-concept development, and hands-on solution design.
Proven ability to work with cross-functional teams, including product management, engineering, and marketing, to drive shared outcomes.
A passion for learning and adapting to emerging technologies, particularly within the Kafka ecosystem.

Preferred

Extensive experience with modern data architectures, including Data Warehouses, Lakehouses, data formats (e.g., Avro, Parquet), and cloud-native platforms (AWS, GCP, Azure); see the Avro-over-Kafka sketch after this list.
Expertise in integrating Kafka-based solutions with cloud services and enterprise data ecosystems.
Demonstrated success designing and implementing distributed streaming systems for large-scale enterprises, including Fortune 1000 companies.
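As a concrete illustration of working with Avro-formatted data on Kafka, here is a minimal, hedged sketch of producing one Avro-encoded record with the Java client and Confluent Schema Registry. The inline schema, topic name (orders.avro), broker address, and registry URL are assumptions for illustration only.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                          // assumed local broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");                 // assumed Schema Registry URL

        // Hypothetical order schema; real deployments register and evolve schemas
        // centrally under compatibility rules rather than inlining them like this.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-42");
        order.put("amount", 19.99);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders.avro", "order-42", order)); // hypothetical topic
            producer.flush();
        }
    }
}
```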

Benefits

Competitive equity package
Additional commission and/or bonus pay
Wide range of employee benefits

Company

Confluent

Confluent offers a streaming platform based on Apache Kafka that enables companies to easily access data as real-time streams.

H1B Sponsorship

Confluent has a track record of sponsoring H1B visas, although this does not guarantee sponsorship for this specific role. The figures below are provided for reference. (Data from the US Department of Labor)
Trends of Total Sponsorships: 2023 (52), 2022 (158), 2021 (124), 2020 (66)

Funding

Current Stage: Public Company
Total Funding: $455.9M
Key Investors: Coatue, Sequoia Capital, Index Ventures
2021-10-04: IPO
2021-06-10: Secondary Market
2021-05-11: Private Equity

Leadership Team

Jay Kreps, CEO
Jun Rao, Co-founder
Company data provided by Crunchbase