Odyssey Information Services · 13 hours ago
Sr Developer (Java, Python, Golang)
Responsibilities
We are looking for someone familiar with technologies like AWS Glue, PySpark, and Java, who understands what makes a data platform performant. Strong hands-on working experience is required (illustrative code sketches for several of the duties below follow this list).
You will write Python and Java functions, and occasionally support existing Golang functions, as part of the data platform's day-to-day activities.
The core languages are Python, Java, and Golang. It is a plus if you have previously worked on a data or analytics platform and have data infrastructure or data warehouse experience.
Highly experienced full-stack engineer with expertise in cloud engineering, data engineering, and data warehousing, particularly within the AWS and Google Cloud ecosystems. Proven success designing and implementing scalable cloud solutions and data pipelines, optimizing data architectures, and delivering high-quality analytics solutions. Proficient at collaborating with cross-functional teams to drive business insights and data-driven decision-making. Skilled in automating processes and ensuring high availability and security in cloud environments.
Implements scalable cloud architectures for enterprise applications using AWS services for data platforms.
Develops and maintains CI/CD pipelines to automate deployment processes and ensure continuous integration.
Leads teams in designing and developing centralized data warehouses and data lakes, preferably using Redshift.
Creates and optimizes ETL processes to ingest and transform large datasets for analytical purposes.
Utilizes Apache Airflow for scheduling and monitoring data workflows to ensure timely data delivery.
Implements data partitioning, compression strategies, and query optimization techniques.
Develops serverless applications using AWS Lambda and API Gateway to reduce operational costs.
Automates configuration management using tools like Ansible and Terraform.
Conducts security assessments and implements IAM and VPC best practices to secure cloud environments.
Integrates monitoring and logging solutions using CloudWatch, Prometheus, and the ELK Stack for performance and security monitoring.
Creates automated reporting solutions and interactive dashboards using Tableau, Looker, and Power BI.
Conducts data quality checks and ensures data integrity across multiple data pipelines (see the quality-check sketch below).
Securely ingests data from public and other Toyota sources to drive better insights.
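To give a flavor of the hands-on work, here is a minimal sketch of a Glue-style PySpark ETL job covering the ingest, transform, and partitioned/compressed output duties above. It is a sketch under assumptions, not this team's actual pipeline: the bucket paths, column names, and partition scheme are hypothetical.

```python
# Minimal Glue-style PySpark ETL sketch. Paths, columns, and the
# partition scheme are hypothetical, for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Ingest raw JSON events from an S3 landing zone.
raw = spark.read.json("s3://example-landing/events/")

# Transform: parse the timestamp, derive a date column to partition on,
# and de-duplicate on the event key.
events = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Write partitioned, compressed Parquet -- the partitioning and
# compression strategies called out above.
(events.write
       .mode("overwrite")
       .partitionBy("event_date")
       .option("compression", "snappy")
       .parquet("s3://example-curated/events/"))
```

Partitioning on a date column keeps downstream queries pruned to only the days they touch, which is usually the biggest single lever for query performance and cost on S3-backed tables.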
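The Airflow scheduling duty might look like the following minimal DAG. The DAG id, schedule, and task bodies are assumptions for illustration; the API shown targets Airflow 2.4+.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+). The DAG id, schedule, and
# task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull source data (placeholder)."""

def load():
    """Load transformed data into the warehouse (placeholder)."""

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # run once per day
    catchup=False,      # don't backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # load runs only after extract succeeds; Airflow's scheduler and UI
    # then provide the workflow monitoring described above.
    extract_task >> load_task
```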
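A Lambda + API Gateway endpoint can be as small as the handler below; the event shape assumes an API Gateway proxy integration, and the query parameter and payload are illustrative.

```python
# Minimal AWS Lambda handler sketch behind an API Gateway proxy
# integration. The query parameter and response body are illustrative.
import json

def handler(event, context):
    # With proxy integration, API Gateway passes the HTTP request in `event`.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Serverless handlers like this are billed per invocation rather than per provisioned server, which is where the operational cost reduction mentioned above comes from.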
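Finally, the data quality checks referenced above could start as simple assertions over a PySpark DataFrame; the key column and rules here are hypothetical.

```python
# Simple data quality gate sketch for a PySpark pipeline. The key column
# and the specific rules are hypothetical.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F

def check_quality(df: DataFrame) -> None:
    total = df.count()
    assert total > 0, "pipeline produced an empty dataset"

    # The key column must never be null.
    null_ids = df.filter(F.col("event_id").isNull()).count()
    assert null_ids == 0, f"{null_ids} rows are missing event_id"

    # The key column must be unique.
    distinct_ids = df.select("event_id").distinct().count()
    assert distinct_ids == total, "duplicate event_id values found"
```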
Qualifications
Required
10 years of overall experience is required.
Strong hands-on working experience in Java and Python.
Some exposure to Golang.
Familiarity with technologies like AWS Glue, PySpark, and Java.
Experience with data and analytics platforms.
Data-centric candidate with data pipeline experience.
Experience with Redshift and PySpark.
AWS experience.
Preferred
Experience with data infrastructure and data warehouse.
Experience with Docker, Kubernetes, Jenkins, Terraform, Git, BigQuery, and Firebase.
Experience with monitoring & logging tools like Grafana.
Experience with security tools like ForgeRock, Keycloak, and SSO solutions for data platforms.
Company
Odyssey Information Services
Odyssey Information Services provides premier consultants to assist our clients with emerging technology and business processes.
H1B Sponsorship
Odyssey Information Services has a track record of offering H1B sponsorship. Please note that this does not guarantee sponsorship for this specific role; additional reference data appears below. (Data powered by the US Department of Labor.)
Trends of total sponsorships: 2023 (1), 2020 (1).
Funding
Current stage: Growth Stage (company data provided by Crunchbase).