Principal Data Operations Engineer (Remote from anywhere in Colorado) @ NEOGOV | Jobright.ai
NEOGOV · 1 day ago

Principal Data Operations Engineer (Remote from anywhere in Colorado)

GovTech · Human Resources
No H1B


Responsibilities

Implementing the designs and standards provided by the Data and Integrations Architect.
Providing platform and application administration, configuration management, end-user management, security and application-level patching and upgrades, as well as source code control, deployment, and release management; and providing support services as needed.
Providing operational support ranging from minor bug fixes to major enhancements, and participating in planning for application replacement and modernization.
Consulting with Data and Integrations Architects, Principal Developers and other CDO and OIT team members, as appropriate to maintain and enhance existing platforms in line with OIT strategies (e.g., API Led Connectivity, Cloud First, Mobile First, Secure Colorado, etc.).
Performing platform administration and support for applications within the Chief Data Office portfolio.
Working with Data Engineers and Integration Developers on data ingestion, data transformation and data presentation tasks and scripts, including automation of these tasks.
Establishing automation of manual processes, including code deployment and environment provisioning.
Acting as Tier-2 escalation point for on-call/break-fix efforts, to diagnose and resolve incidents and problems with platforms and applications within Chief Data Office portfolio.
Working with SecOps resources to ensure network security policy is established in a consistent, repeatable and automated manner.
Collaborating with Business Analysts, Customers, Project Managers, and others as appropriate to assist in the creation of estimates and timelines.
Performing coding (in-house applications) or configuration management (COTS applications) in accordance with standards and best practices, and further minimizing defects through disciplined unit testing.
Coordinating update releases and other system changes, contributing to the implementation of break/fix solutions, and updating documentation and configuration information related to changes as needed throughout the life cycle.
Organizing, building, and validating all segments of the code and configurations related to a specific build (release) through CI/CD pipelines.
Ensuring application maintenance and configuration activities are consistent with established service portfolio policies, procedures, standards and guidelines.
Identifying and recommending changes to application and platform policies, processes, templates and standard operating procedures to improve the overall quality of services being delivered.

Qualification


Data Engineering · ETL Development · API Lifecycle Management · Snowflake · Informatica · Tableau · Azure DevOps · CI/CD Pipelines · MuleSoft Certification · MS SQL · PostgreSQL · MySQL · Source Code Management · Configuration Management · Soft Skills

Required

A minimum of eight (8) years of experience as a data engineer, DevOps Engineer, or similar software engineering role.
A minimum of one (1) year of experience designing, building, implementing, and maintaining data and system integrations, using dimensional data modeling and the development and optimization of ETL pipelines.
Experience with API lifecycle management platforms and API-led connectivity.

Preferred

Subject matter domain expertise on the platforms (applications) used for the execution of the State Strategy for Data Sharing and Integrations, premised on API-led connectivity: Snowflake, Ipswitch MOVEit Transfer and MOVEit Automation, Informatica, Tableau.
Current MuleSoft certification.
Experience with columnar and relational database technologies such as MS SQL, PostgreSQL, MySQL, or others.
Familiarity with Continuous Integration and Continuous Delivery (CI/CD) pipelines.
Experience using Azure DevOps.
Familiarity with source code management tools such as TFS or Git.

Company

NEOGOV is the leading provider of workforce management software uniquely designed for the public sector, education, and public safety.

Funding

Current Stage
Late Stage
Total Funding
unknown
Key Investors
Warburg Pincus
2021-06-02: Private Equity
2016-10-18: Private Equity

Leadership Team

Shane Evangelist, CEO
Brandon McDonald, Head of Marketing
Company data provided by Crunchbase