Lumenalta · 1 day ago
Data Engineer - BigQuery/GCP - Mid Level
Artificial Intelligence (AI) · Data Center Automation
Responsibilities
Data Pipeline Development: Design and build ETL/ELT data pipelines using BigQuery and other GCP services to ingest, process, and transform large datasets from multiple sources.
Data Modeling & Architecture: Develop and optimize data models and schemas to support analytics, reporting, and machine learning requirements.
Performance Optimization: Implement best practices for performance tuning, partitioning, and clustering to optimize data queries and reduce costs in BigQuery.
Data Integration & Transformation: Collaborate with data scientists and analysts to design data solutions that integrate seamlessly with BI tools, machine learning models, and third-party applications.
Data Quality & Governance: Establish and enforce data quality standards, data governance frameworks, and security policies for data storage and access on GCP.
Automation & Monitoring: Automate workflows using Cloud Composer, Cloud Functions, or other orchestration tools to ensure reliable and scalable data pipelines.
Documentation & Knowledge Sharing: Create comprehensive documentation for data pipelines, workflows, and processes. Share best practices and mentor junior data engineers.
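The performance-optimization responsibility above centers on BigQuery partitioning and clustering. As a minimal sketch of that pattern, the snippet below builds a DDL statement for a date-partitioned, clustered table; the table and column names (`analytics.events`, `event_date`, `customer_id`) are hypothetical, chosen only for illustration.

```python
# Sketch of a common BigQuery cost-optimization pattern: partition a table
# by date and cluster it on a frequently filtered column, so queries that
# filter on those columns scan fewer bytes. All identifiers here are
# hypothetical examples, not part of the job description.
def partitioned_table_ddl(table: str, partition_col: str, cluster_cols: list[str]) -> str:
    """Build a CREATE TABLE statement with partitioning and clustering."""
    return (
        f"CREATE TABLE {table} (\n"
        f"  {partition_col} DATE,\n"
        "  customer_id STRING,\n"
        "  payload JSON\n"
        ")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)};"
    )

ddl = partitioned_table_ddl("analytics.events", "event_date", ["customer_id"])
print(ddl)
```

Queries that filter on the partition column (e.g. `WHERE event_date >= '2024-01-01'`) then prune entire partitions, which is the main lever for reducing scanned bytes and therefore cost.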
Qualifications
Required
3+ years of experience working as a Data Engineer, with a focus on GCP and BigQuery.
Strong proficiency in SQL and experience in developing complex queries, stored procedures, and views in BigQuery.
Hands-on experience with GCP services such as Cloud Storage, Dataflow, Cloud Composer, and Cloud Functions.
Understanding of data warehousing concepts, dimensional modeling, and building data marts.
Experience with ETL/ELT tools like Apache Beam, Dataflow, or dbt.
Familiarity with scripting languages like Bash, Python, or JavaScript for automation and integration.
Proven ability to work with large datasets and cost-effectively optimize query performance.
Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
GCP Professional Data Engineer Certification is a plus.
Preferred
Experience with machine learning on GCP using Vertex AI or AI Platform.
Knowledge of data governance and security best practices in a cloud environment.
Experience working with real-time streaming data and tools like Pub/Sub or Kafka.
Company
Lumenalta
Lumenalta is a software and app development company that creates unique digital transformation solutions using innovative technology.
Funding
Current Stage
Late Stage
Company data provided by Crunchbase