Foureyes · 4 weeks ago
Sr. Software Engineer (Big Data, Data Lake)
Foureyes is a remote-first company focused on the automotive vertical, seeking a Sr. Software Engineer to design and implement its core data platform. The role involves building scalable data ingestion and integration across varied data sources using AWS services, while collaborating with cross-functional teams to ensure data availability and performance.
Analytics · Artificial Intelligence (AI) · SaaS · Sales Automation · Software
Qualifications
Required
Strong leadership skills - you are comfortable both leading software engineers and rolling up your sleeves to implement and deploy the best solution
Strong cross-functional collaboration with product, engineering, and analytics teams to ensure data availability, reliability, and performance
Strong communication skills and experience translating business requirements into system design, architecture diagrams, and technical documentation
Strong communication skills, including presenting complex concepts or solutions to diverse audiences through clear, concise written communication (e.g., reports, emails) and effective verbal communication (e.g., presentations, stakeholder updates)
Strong problem-solving skills and knowledge of applied algorithms to solve real-world problems efficiently
You have operated in a team's on-call rotation to address complex problems in real time and keep services operational and highly available
Strong hands-on experience implementing and using AWS data services: S3, Glue (ETL jobs, Data Catalog, DataBrew), Athena, Lake Formation, Step Functions, Lambda, Kinesis Data Streams, and API Gateway
Expertise in designing and optimizing data pipelines for high-volume, multi-source ingestion of structured and semi-structured data in a multi-tenant data architecture
Experience with entity resolution, data cleansing, data quality, and anomaly detection
Expert-level skills developing back-end distributed systems and data pipelines using Python, SQL, Step Functions, and Lambda
Comfort working with cloud data warehouses (Athena, Snowflake, Redshift, BigQuery, or similar)
Experience building event-driven and notification systems (SNS/SQS, EventBridge, Pub/Sub, webhooks) and orchestration frameworks (Step Functions or equivalent)
Experience integrating AWS data pipelines with external platforms (e.g., Snowflake, Metabase, reporting tools)
Hands-on experience applying security best practices for data storage, transfer, and API access
7-10 years of professional experience working in a software development environment, ideally with exposure to big data and data-rich applications
Experience working across disciplines, partnering with BI, ML, and product teams to translate ideas into customer-facing features
5+ years working in an AWS cloud-native environment
Preferred
Prior experience in data-rich SaaS products
Experience with authentication and access control for data-driven applications in a multi-tenant environment
Familiarity with data governance concepts (lineage, permissions, quality checks)
Deployment experience (AWS CDK, CI/CD, containerized services, serverless functions)
Exposure to generative AI or LLM-based data exploration tools
Benefits
Competitive salary and health benefits for eligible full-time employees.
401(k) matching and a subsidy for internet or cell phone.
Generous PTO days, in addition to paid holidays that incorporate two days to honor and celebrate your heritage, culture, or traditions that matter most -- just tell us when!
Half Day Summer Fridays!
Company
Foureyes
Foureyes sales intelligence software helps businesses track, protect, engage, and sell better.
Funding
Current Stage: Growth Stage