Epic Systems Inc
Principal Databricks Engineer – Must have an Active Secret clearance and be able to obtain a TS/SCI clearance and DHS Suitability
Epic Systems Inc is supporting a U.S. Government customer on a critical development and sustainment program and is seeking a Principal Databricks Engineer to migrate customer applications and services to a medallion model, ensuring the technical correctness and quality of delivery.
Industry: Information Technology & Services
Responsibilities
Support teams in migrating services, applications, and platforms from legacy back-end systems to Databricks
Identify the optimal migration path, build the migration plan, and execute it
Migrate legacy data pipelines from NiFi to Databricks, complete with validation
Implement the medallion model for each data asset being migrated to Databricks (see the illustrative sketch after this list)
Develop an SOP for integrating data assets into the Databricks platform, focused on efficiency, instrumentation, and performance
Optimize development, testing, monitoring and security for data assets being added to the Databricks platform
Develop and implement a strategy for optimizing the migration and integration of data assets into the Databricks platform
Develop code in various programming and scripting languages to automate and optimize data ingestion and pipeline orchestration and to improve data management processes
Improve ingest transparency by leveraging technologies such as AWS CloudWatch to identify where to measure and gather performance information on automated data pipelines
Ensure Data Engineering Team Standard Operating Procedures are appropriately captured and communicated across the team
Ensure technical correctness, timeliness and quality of delivery for the team
Demonstrate excellent oral and written communication skills with all levels of management and the customer
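For the medallion and pipeline-migration responsibilities above, the sketch below illustrates one way a bronze/silver/gold flow might look in PySpark on Databricks. It is a minimal, illustrative example only: the S3 paths, Unity Catalog table names (main.demo.*), and columns (event_id, event_type) are hypothetical, and a real migration would carry over the validation logic of the specific NiFi pipelines being replaced.

    # Minimal medallion-style (bronze/silver/gold) sketch for a Databricks notebook,
    # where `spark` is already provided. Paths, table names, and columns are hypothetical.
    from pyspark.sql import functions as F

    raw_path = "s3://example-bucket/landing/events/"  # hypothetical landing zone

    # Bronze: incrementally ingest raw JSON files with Auto Loader, stored as-is.
    bronze_query = (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/events/")
        .load(raw_path)
        .writeStream
        .option("checkpointLocation", "s3://example-bucket/_checkpoints/bronze_events/")
        .trigger(availableNow=True)
        .toTable("main.demo.bronze_events"))
    bronze_query.awaitTermination()  # availableNow processes the backlog, then stops

    # Silver: basic validation and deduplication (stand-in for asset-specific rules).
    bronze = spark.read.table("main.demo.bronze_events")
    silver = (bronze
        .filter(F.col("event_id").isNotNull())
        .dropDuplicates(["event_id"])
        .withColumn("ingested_at", F.current_timestamp()))
    silver.write.mode("overwrite").saveAsTable("main.demo.silver_events")

    # Gold: consumption-ready aggregate for downstream users.
    gold = silver.groupBy("event_type").agg(F.count("*").alias("event_count"))
    gold.write.mode("overwrite").saveAsTable("main.demo.gold_event_counts")

The same flow could be expressed declaratively with Delta Live Tables, which is one reason it appears in the preferred qualifications below.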
Qualifications
Required
Active Secret (S) clearance. Must be able to obtain a TS/SCI clearance
Must be able to obtain DHS Suitability
10+ years of directly relevant software development experience required
Minimum of 8 years of experience performing data engineering work in a cloud environment
Experience with relational, NoSQL, and/or file-based storage (e.g., Databricks, Elasticsearch, Postgres, S3, Athena)
Experience working in a CI/CD pipeline factory environment
Working knowledge of Databricks, Cloud Relational Database Services, NiFi, Amazon Redshift, and Elasticsearch
Bachelor's degree in Software Engineering, Computer Science, or a related discipline is required
Preferred
Databricks workflows
Databricks Unity Catalog
Databricks Autoloader
Databricks Delta Live Tables/Delta Lake
Databricks Workspace/Notebooks
MLflow
Apache Spark
Experience with collaboration tools including MS Teams, MS Outlook, MS SharePoint, and Confluence
Amazon Web Services (AWS) Professional certification or equivalent
Excellent problem-solving and communication skills
Familiarity with CISA: Securing the Software Supply Chain
Familiarity with CISA: Cybersecurity Best Practices
Familiarity with CISA: Open-Source Software Security
Familiarity with NIST SP 800-218, Secure Software Development Framework V1.1: Recommendations for Mitigating the Risk of Software Vulnerabilities