Leidos · 2 days ago
Senior Developer (Azure, Databricks)
Computer · Government
Responsibilities
Advise, support, and coach project teams in ingesting data, creating data pipelines, selecting the appropriate Azure services, optimizing data storage, cataloguing data, enforcing technical & architectural standards, and troubleshooting development & production issues.
Design and implement data security measures to ensure PII/PHI data is protected from unauthorized access.
Create real-time dashboards on rapid development timelines.
Incorporate data governance into the solution design, including policies, procedures, and standards for managing and using data.
Continuously optimize the performance of data pipelines in Databricks and Azure Data Factory (ADF); an illustrative sketch of this kind of pipeline work appears after this list.
Investigate and recommend new technologies to modernize the data pipeline process. Stay current on the latest advancements in data technologies.
Collaborate with customer SMEs on data projects to develop data pipeline architectures and strategies.
Mentor project teams and data engineers on best practices and new technologies.
Collaborate with data engineers, business analysts, and testers to drive the agile development team in implementing the data architecture.
Actively lead/participate in the discovery/validation/verification process throughout the development life cycle.
Guide and lead other developers and actively engage in process improvement initiatives.
Identify, evaluate, and demonstrate solutions to complex system problems.
Design and develop documentation including procedures, process flow diagrams, work instructions, and protocols for processes.
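For illustration only: the sketch below shows, under assumed details, the kind of Databricks/Delta Lake pipeline work the optimization bullet above refers to. The landing path, table, and column names (/mnt/landing/raw_events/, curated.events, event_id, event_ts) are hypothetical and not taken from this posting.

```python
# Minimal PySpark sketch of the kind of Databricks/Delta pipeline work described above.
# Paths, table names, and columns are hypothetical, not details from this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically in a Databricks notebook

# Ingest raw records from a landing zone and drop duplicates on the business key.
raw = (
    spark.read.format("json")
    .load("/mnt/landing/raw_events/")
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
)

# Append to a date-partitioned Delta table so downstream queries can prune partitions.
(
    raw.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("curated.events")
)

# Routine maintenance step commonly used to keep Delta file sizes healthy.
spark.sql("OPTIMIZE curated.events ZORDER BY (event_id)")
```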
Qualifications
Required
Bachelor's degree from an accredited college in a related discipline, or equivalent experience/combined education, with 10+ years of professional experience; or 8+ years of professional experience with a related Master's degree.
Proven software development experience on a large-scale Azure Data Lake platform
Experience onboarding and managing multiple data pipelines of high complexity and processing millions of records per day
Experience working simultaneously with multiple data sources and entities submitting data daily to the data lake, and delivering technical assistance to ensure successful operations.
Experience building Azure cloud-based ETL processes and data pipelines to automate data workflows in a rapid timeframe for emergency response
Experience implementing automated processes to QC data products and pipelines before data release, including de-duplication of data (an illustrative QC sketch appears after this list).
Experience implementing Databricks Unity Catalog for Lakehouse projects.
Experience handling and delivering big data analytics to daily users.
Strong prior experience with and expert knowledge of Databricks, Delta Lake, Spark Streaming, Azure Synapse, Jupyter Notebooks, microservices, Azure Functions, Event Hubs, Logic Apps, Azure Kubernetes Service, Confluent Kafka, HDInsight, and Azure Data Factory
Prior experience integrating applications with AI/ML technologies including chatbots.
Ability to collaborate with and influence customer leadership and external teams on data initiative strategies.
Ability to develop enterprise standards for Reference & Master Data Management, Data Quality, Data Integration, and Data Security.
Ability to present complex ideas and subject matter to stakeholders and customer leadership.
Proven experience working in a development environment following agile practices and processes.
Experience developing documentation including specifications, procedures, process flow diagrams, work instructions, and protocols for processes.
Proven experience with supporting highly critical customer missions.
Prior proven leadership experience.
Excellent verbal and written communication skills, including experience working directly with customers to discuss their requirements and objectives.
Proven experience in multi-tasking and managing efforts to the schedule.
Ability to learn and support new systems and applications.
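For illustration only: one way the automated pre-release QC and de-duplication checks mentioned above might look on Databricks is sketched below. The table and column names (curated.events, event_id) are assumptions, not details from this posting.

```python
# Hypothetical automated QC check run before data release; names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("curated.events")

# Count duplicate and null business keys.
dup_keys = df.groupBy("event_id").count().filter(F.col("count") > 1).count()
null_keys = df.filter(F.col("event_id").isNull()).count()

# Fail the pipeline run (and block the release) if either check trips.
if dup_keys or null_keys:
    raise ValueError(
        f"QC failed for curated.events: {dup_keys} duplicated keys, {null_keys} null keys"
    )
```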
Preferred
Working experience at the CDC or other federal agencies
Experience with Azure DevOps and CI/CD pipelines.
Azure Data Engineer certification, Databricks Certified Data Engineer Associate certification, or similar certifications
Experience onboarding and managing 100+ data pipelines of high complexity and processing volumes of greater than 5M records per day.
Experience performing data linkage of terabytes of data using Privacy Preserving Record Linkage (PPRL)
Experience implementing and operationalizing real-time dashboards (DevOps, program analytics) using enterprise BI tools including Power BI, Tableau, and RShiny.
Experience working with SAS Viya, Palantir Foundry, R, and/or Python.
Experience with transition-in efforts to take over a large-scale, Azure-based data lake platform
Experience with agile development processes
Company
Leidos
Leidos is a Fortune 500® innovation company rapidly addressing the world’s most vexing challenges in national security and health.
Funding
Current Stage: Public Company
Total Funding: Unknown
2013-09-17: IPO · NYSE: LDOS