Tachyon Technologies · 4 hours ago
Fabric Data Engineer with Azure (Only on W2)
Responsibilities
Design, build, and maintain efficient, reusable, and reliable architecture and code.
Ensure the best possible performance and quality of high-scale web applications and services.
Participate in architecture and system design discussions.
Independently perform hands-on development and unit testing of the applications.
Collaborate with the development team and build individual components into complex enterprise web systems.
Work in a team environment with product, frontend design, production operations, QE/QA, and other cross-functional teams to deliver projects throughout the full software development life cycle.
Identify and resolve performance issues.
Keep up to date with new technology developments and implementations.
Participate in code reviews to ensure standards and best practices are met.
Qualifications
Required
Bachelor’s degree in Computer Science, Software Engineering, or MIS, or an equivalent combination of education and experience.
Experience implementing and supporting data lakes, data warehouses, and data applications on AWS or Microsoft Azure for large enterprises (Fabric is a plus).
Programming experience with Python, Spark, and SQL.
Experience with Microsoft Azure services, specifically Data Factory, ADLS Gen2, Synapse Analytics, Synapse Database, and KQL.
Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS or Azure.
Knowledge of ETL/ELT processes.
Experience with end-to-end data solutions (ingest, storage, integration, processing, access) on Azure.
Architect and implement CI/CD strategies for EDP.
Implement high-velocity streaming solutions.
Develop data-based APIs.
Implement proofs of concept (POCs) for new technologies or tools to be adopted on the EDP, and onboard them for real use cases.
5+ years of experience as a Data Engineer working with data lakes.
Experience developing business applications using NoSQL/SQL databases.
Experience working with object stores and structured/semi-structured/unstructured data is a must.
Develop CI/CD pipelines using Terraform and GitHub.
Preferred
Solid experience implementing solutions on Azure-based data lakes.
Experience migrating data from traditional on-premises relational database systems to Azure technologies (preferably OneLake using Fabric technologies).
Any cloud certification (Azure/AWS) is preferred.
Microsoft Fabric experience is a plus.
Company
Tachyon Technologies
Tachyon Technologies is an SAP digital transformation services company that helps customers with innovative new solutions.
H1B Sponsorship
Tachyon Technologies has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The additional information below is provided for reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships
2023 (18)
2022 (39)
2021 (57)
2020 (33)
Funding
Current Stage
Late Stage
Company data provided by Crunchbase