Nerdery
Senior Data Engineer
Responsibilities
Architect and implement data pipelines and storage solutions using BigQuery, Dataflow, and other GCP services (an illustrative pipeline sketch follows this list).
Optimize Looker dashboards built on BigQuery datasets for complex visualizations and advanced analytics.
Integrate BigQuery with Vertex AI to enable machine learning and AI-driven insights.
Build and manage data lakes to support diverse data types and formats for scalable storage and analytics.
Design secure data access with IAM roles to ensure fine-grained data security on GCP.
Collaborate with cross-functional teams to address technical issues and drive data infrastructure improvements.
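As an illustration of the first responsibility above (not part of the formal posting), the sketch below shows a minimal streaming Dataflow pipeline written in Python with Apache Beam: it reads JSON messages from Pub/Sub and appends them to a BigQuery table. All project, bucket, topic, and table names are placeholders, and real pipelines would add schema management, error handling, and the team's own conventions.

```python
# Minimal sketch of a streaming Dataflow pipeline: Pub/Sub -> parse JSON -> BigQuery.
# Placeholder names throughout; not a production configuration.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",                 # run on Dataflow (placeholder settings)
        project="example-project",
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw message bytes from a Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Decode and parse each message as a JSON record.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append records to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```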
Qualifications
Required
6+ years of experience in data engineering with expertise in GCP services like BigQuery, Dataflow, and Dataproc.
Proficiency in Python and SQL, plus experience orchestrating data pipelines with services such as Pub/Sub and Cloud Functions.
Strong knowledge of data modeling, ETL processes, and data warehousing principles.
Experience migrating data pipelines from AWS or Azure to GCP.
Exceptional problem-solving skills and a collaborative mindset.
Company
Nerdery
A digital business consultancy bridging strategy and execution.
Funding
Current Stage: Growth Stage
Company data provided by Crunchbase.