Databricks · 2 days ago
Specialist Solutions Architect - Data Warehousing
Analytics · Artificial Intelligence (AI)
Responsibilities
Provide technical leadership to guide strategic customers to successful cloud transformations on large-scale data warehousing workloads - ranging from evaluation to architecture design to production deployment
Prove the value of the Databricks Intelligence Platform for customer workloads by architecting production workloads, including end-to-end pipeline load performance testing and optimization
Become a technical expert in an area such as data warehousing evaluations or setting up successful workload migrations
Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing and performance, and tuning workloads for production
Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
Contribute to the Databricks Community
Qualifications
Required
5+ years' experience in a technical role with expertise in data warehousing - such as query tuning, performance tuning, troubleshooting, data governance, debugging MPP data warehouses or other big data solutions, or migrating workloads from EDW or other systems
Experience with design and implementation of data warehousing technologies including relational databases, SQL, data analytics, NoSQL, MPP, OLTP, and OLAP
Deep specialty expertise in at least one of the following areas:
- Scaling large analytical data workloads in the cloud so they are performant and cost-effective
- Maintaining, extending, or migrating a production data warehouse system to evolve with complex needs, including data modeling, data governance, and integration with business intelligence tools
- Migrating on-premises EDW workloads to the public cloud
Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent work experience
Production programming experience in SQL and Python, Scala, or Java
Experience with the AWS, Azure, or GCP clouds
2 years' professional experience with data warehousing and big data technologies (e.g., SQL, Redshift, SAP, Synapse, EMR, OLAP and OLTP workloads)
2 years' customer-facing experience in a pre-sales or post-sales role
Can meet expectations for technical training and role-specific outcomes within 6 months of hire
Can travel up to 30% when needed
Benefits
Medical, Dental, and Vision
401(k) Plan
FSA, HSA and Commuter Benefit Plans
Equity Awards
Flexible Time Off
Paid Parental Leave
Family Planning
Fitness Reimbursement
Annual Career Development Fund
Home Office/Work Headphones Reimbursement
Employee Assistance Program (EAP)
Business Travel Accident Insurance
Mental Wellness Resources
Company
Databricks
Databricks is a data and AI company whose cloud platform enables organizations to store, process, and analyze corporate data in the public cloud.
H1B Sponsorship
Databricks has a track record of offering H1B sponsorship. Note that this does not guarantee sponsorship for this specific role. Additional information is provided below for reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships
2023 (177)
2022 (238)
2021 (193)
2020 (79)
Funding
Current Stage: Late Stage
Total Funding: $4.18B
Key Investors: Counterpoint Global, Franklin Templeton, Andreessen Horowitz
2023-09-14 · Series I · $684.56M
2023-07-31 · Secondary Market · Undisclosed
2021-08-31 · Series H · $1.6B
Company data provided by Crunchbase