LightFeather
Analytics Engineer
Analytics · Consulting
No H1B · U.S. Citizen Only · Security Clearance Required
Responsibilities
Architect, build, and maintain scalable and efficient data pipelines using Apache Airflow to orchestrate workflows and automate complex processes (a minimal DAG sketch follows this list)
Partner with cross-functional teams to gather and analyze business requirements, translating them into technical specifications and scalable solutions
Develop and implement rigorous testing, validation, and monitoring strategies to ensure data quality, accuracy, and integrity throughout the pipeline
Optimize data workflows for enhanced performance, scalability, and cost-effectiveness, leveraging Airflow and related technologies
Proactively monitor and troubleshoot Airflow DAGs, identifying and resolving issues to minimize downtime and ensure seamless data operations
Establish and enforce best practices for ETL/ELT development, workflow orchestration, and data pipeline architecture
Continuously research and integrate the latest advancements in Airflow, workflow management, and data engineering technologies to improve existing systems
Create and maintain comprehensive documentation for all data pipelines, workflows, and configurations to facilitate knowledge sharing and system maintenance
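
For illustration, here is a minimal sketch of the kind of Airflow DAG this role centers on, assuming Airflow 2.4 or later; the dag_id, schedule, and task bodies are hypothetical placeholders, not details from the posting.

# Hypothetical ETL DAG sketch; names and logic are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system
    print("extracting...")


def transform():
    # Placeholder: clean and reshape the extracted data
    print("transforming...")


def load():
    # Placeholder: write results to the warehouse
    print("loading...")


with DAG(
    dag_id="example_etl_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Dependency chain: extract -> transform -> load
    extract_task >> transform_task >> load_task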
Qualifications
Required
U.S. citizenship
Active clearance at the Public Trust level or above (IRS clearance preferred)
Bachelor’s degree in Computer Science, Data Engineering, or a related field
Minimum of 5 years' experience as an Analytics Engineer, Data Engineer, or in a similar role
Expertise in Apache Airflow, including creating and managing complex DAGs
Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, Snowflake)
Experience with data warehousing concepts and technologies
Proficiency in at least one programming language (e.g., Python, Java, Scala)
Familiarity with cloud platforms (AWS, GCP, Azure) and related data services
Excellent problem-solving skills and attention to detail
Strong communication skills and ability to work collaboratively in a team environment
Preferred
Experience with big data tools and frameworks (e.g., Spark, Kafka)
Knowledge of data visualization tools (e.g., Tableau, Power BI, Looker)
Familiarity with version control systems like Git
Experience with CI/CD pipelines for data workflows