White Cap · 14 hours ago
Data Platform Engineer
White Cap is committed to Building Trust on Every Job, providing a diverse and exciting work environment. They are seeking a Data Platform Engineer responsible for designing and implementing high-performance data pipelines, APIs, and integrations to enhance their analytics platform.
Construction · Manufacturing · Building Maintenance · Service Industry
Responsibilities
Design, build, and maintain batch and streaming data pipelines using Databricks (PySpark, Delta Live Tables, Unity Catalog)
Develop and manage inbound/outbound data feeds via APIs, SFTP, pub/sub, or middleware platforms
Build and optimize data models in Postgres and synchronize with analytical layers
Collaborate with product, architecture, and InfoSec teams to ensure secure and compliant data movement
Implement data quality, observability, and governance standards
Automate deployment and testing with CI/CD tools (e.g., Databricks Asset Bundles, GitHub Actions, or Azure DevOps)
Participate in refactoring existing data pipelines into modern, scalable approaches; help retire legacy techniques and communicate the new methods
Create build-vs-buy proposals; implement “greenfield” solutions or integrate third-party apps and connectors
Qualifications
Required
Expertise in Postgres OLTP systems
Databricks-based data processing
Modern middleware technologies to enable secure and scalable data exchange
Typically requires BS/BA in a related discipline
Generally 2-5 years of experience in a related field OR MS/MA and generally 2-4 years of experience in a related field
Certification is required in some areas
Preferred
Proficiency in Python or Scala, with strong SQL skills
Hands-on experience with Databricks or Spark-based data engineering
Experience integrating APIs, building middleware connectors, and managing event-based data flows
Solid understanding of Postgres or similar OLTP databases
Familiarity with cloud environments (Azure preferred) and containerization (Docker/Kubernetes)
Strong problem-solving, performance tuning, and communication skills
Relevant certifications (e.g., Databricks Certified Data Engineer, Azure Data Engineer Associate)
Experience working in Agile/Scrum environments
Strong documentation and technical writing skills
Company
White Cap
White Cap is a distributor of specialist hardware equipment and supplies for large and medium sized contractors.
H1B Sponsorship
White Cap has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships
2025 (4)
2024 (1)
2023 (1)
Funding
Current Stage: Late Stage
Total Funding: unknown
Acquired: 2020-08-01
Recent News
2026-02-05 · Morningstar.com
Company data provided by Crunchbase