Cognizant
Palantir Senior Data Engineer - Remote
Industries: Consulting, Industrial Automation
Responsibilities
Use PySpark to build data pipelines in AWS environments
Write design documents and independently build data pipelines based on the defined source-to-target mappings
Convert logic from complex stored procedures, SQL triggers, and similar database objects into PySpark on the cloud platform (see the sketch after this list)
Be open to learning new technologies and implementing solutions quickly in the cloud platform
Communicate with program key stakeholders to keep the project aligned with their goals
Interact effectively with the QA and UAT teams for code testing and for migrating code across regions
Spearhead data engineering initiatives that tackle moderately complex to complex data and analytics challenges, delivering impactful outcomes through thorough analysis and problem-solving
Identify, design, and implement internal process improvements, including redesigning infrastructure for scalability, optimizing data delivery, and automating manual workflows
Address broad application programming and analysis problems within defined procedural guidelines, offering solutions that span wide-ranging scopes
Actively participate in agile/scrum ceremonies such as stand-ups, planning sessions, and retrospectives
Develop and execute automated and user acceptance tests as an integral part of the iterative development lifecycle
Foster the maturation of broader data systems and architecture by assessing individual data pipelines and suggesting and implementing enhancements aligned with project and enterprise maturity objectives
Design and build infrastructure that enables access to and analysis of large datasets while ensuring data quality and metadata accuracy through systematic cataloging
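As a rough illustration of the pipeline and stored-procedure-conversion work described in the list above, the sketch below shows SQL-style filter/aggregate logic rewritten as a PySpark job that reads from and writes to S3. The bucket paths, column names, and app name are hypothetical placeholders rather than details from this posting, and the actual Palantir Foundry/AWS setup will differ.

# Illustrative sketch only: converting stored-procedure-style SQL logic into a
# PySpark pipeline on AWS. All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_summary").getOrCreate()

# Source side of a source-to-target mapping: raw orders landed in S3.
orders = spark.read.parquet("s3://example-raw-zone/sales/orders/")

# Logic that might previously have lived in a stored procedure:
# filter, derive a date column, and aggregate by day and region.
daily_summary = (
    orders
    .filter(F.col("order_status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "region")
    .agg(
        F.count("order_id").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Target side of the mapping: curated output written back to S3.
daily_summary.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-zone/sales/orders_daily_summary/"
)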
Qualifications
Required
10+ years of total experience, including 3+ years in data engineering/ETL ecosystems with Palantir Foundry, Python, PySpark, and Java
Required skills: Palantir
Expert in writing shell scripts to execute jobs through various job schedulers
Hands-on experience with Palantir and PySpark to build data pipelines in AWS environments
Good knowledge of Palantir components
Good exposure to RDBMS
Basic understanding of Data Mappings and Workflows
Preferred
PySpark and Python
Any knowledge of the Palantir Foundry platform is a big plus
Experience implementing projects in the Energy and Utilities space is a plus
Benefits
Medical/Dental/Vision/Life Insurance
Paid holidays plus Paid Time Off
401(k) plan and contributions
Long-term/Short-term Disability
Paid Parental Leave
Employee Stock Purchase Plan
Company
Cognizant
Cognizant is a professional services company that helps clients transform their business, operating, and technology models for the digital era.
H1B Sponsorship
Cognizant has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. Additional information is provided below for reference. (Data powered by the US Department of Labor.)
Trends of Total Sponsorships
2023: 10,179
2022: 13,921
2021: 12,909
2020: 21,593
Funding
Current Stage: Public Company
Total Funding: $0.24M
Key Investors: Summit Financial Wealth Advisors
2016-11-18: Post-IPO Equity, $0.24M
1998-06-19: IPO
Company data provided by Crunchbase