GCP Data Architect -- DWIDC5725526
Compunnel Inc. is seeking a GCP Data Architect with extensive experience in cloud solutions and enterprise data analytics. The role involves architecting, developing, and deploying scalable data solutions on Google Cloud Platform, as well as migrating on-premise data warehouses to cloud-based platforms.
Responsibilities
Architect and implement next-generation data and analytics platforms on GCP.
Design and implement data engineering, ingestion, and curation functions on GCP using native GCP services or custom programming.
Design and build production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc. (see the sketch after this list).
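For illustration, the "ingestion to consumption" pipeline work described above might look like the following minimal Apache Beam sketch in Python: a streaming read from Cloud Pub/Sub, a light transformation, and a write to BigQuery, runnable on Cloud Dataflow. The project, topic, and table names are hypothetical placeholders, not details of this role.

```python
# Minimal illustrative Beam pipeline: Pub/Sub -> transform -> BigQuery.
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    project="example-project",   # hypothetical GCP project
    region="us-central1",
    runner="DataflowRunner",     # use DirectRunner for local testing
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",  # assumes table exists
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```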
Qualification
Required
Minimum of 8-10 years of technical solutions implementation, architecture design, evaluation, and investigation in a cloud environment.
Minimum of 5 years architecting, developing, and deploying scalable enterprise data analytics solutions (enterprise data warehouses, data marts, etc.).
Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools and environments (such as Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume); an ELT sketch follows this list.
Minimum of 2 years designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools (Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.).
Hands-on experience analyzing, re-architecting, and re-platforming on-premise data warehouses onto Google Cloud data platforms using GCP and third-party services.
Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.
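On the ETL/ELT side, a common GCP pattern is to land raw data in BigQuery and transform it there with SQL (the "ELT" approach referenced above). A minimal sketch using the google-cloud-bigquery Python client follows; the project, dataset, and table names are hypothetical.

```python
# Illustrative ELT step: transform raw events already landed in BigQuery
# into a curated reporting table. All names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

query = """
CREATE OR REPLACE TABLE analytics.daily_event_counts AS
SELECT
  DATE(event_ts) AS event_date,
  event_type,
  COUNT(*) AS events
FROM analytics.events
GROUP BY event_date, event_type
"""
client.query(query).result()  # wait for the transformation job to finish
```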
Company
Compunnel Inc.
Compunnel Inc. is where AI-native solutions meet human ingenuity, helping enterprises reimagine talent, technology, and growth.
H1B Sponsorship
Compunnel Inc. has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor)
Distribution of Different Job Fields Receiving Sponsorship
[Chart not reproduced; it highlighted the job field most similar to this role.]
Trends of Total Sponsorships
2025: 2,276
2024: 1,682
2023: 1,992
2022: 2,366
2021: 2,223
2020: 2,220
Funding
Current Stage: Late Stage
(Company data provided by Crunchbase)