GoFasti · 1 day ago
1018 - Data Engineer
GoFasti is a Talent-as-a-Service company connecting top talent from LatAm with leading companies globally. They are seeking an English-fluent Data Engineer to build and maintain real-time market data ingestion services, ensuring the efficient processing and delivery of financial data for trading models.
Human Resources · Information Technology · Software · Software Engineering
Responsibilities
Build a real-time market data ingestion service consuming streaming equity data via WebSockets from Polygon.io (now Massive.com)
Handle high-throughput data streams covering approximately 1,500 equities (~60–70% of the U.S. stock market)
Aggregate and organize streaming data by time slices, preparing it for model consumption
Implement redundancy mechanisms, including failover to the 15-minute delayed stream if the real-time feed is interrupted
Deliver processed data to the trading model (a C-based script running on AWS EC2) on a daily cycle
Validate and maintain the existing historical data pipeline that pulls options and equity data from S3
Ensure historical data is updated daily and pipeline-ready for back-testing and model input
Work with data sourced from Polygon.io (equities) and CBOE (options) flat files
Instrument the pipeline with monitoring and alerting (e.g., Prometheus) to track message throughput, latest prices, and network health
Ensure the system meets the model’s requirement: if a full day’s data is not available, the model will not run, and all positions are exited as a safety measure
Proactively identify and resolve issues — do not wait to be asked
Integrate with an existing position monitoring component that syncs current holdings from the firm’s brokerage account
Coordinate with a DevOps engineer responsible for Kubernetes scheduling, deployments, and infrastructure
Participate in weekly syncs with the CTO and broader team; communicate blockers immediately
Qualifications
Required
3+ years of experience as a Data Engineer
Proven experience building and operating real-time streaming data pipelines in a production environment
Strong proficiency in Go (preferred for new development) and/or C (legacy system familiarity)
Hands-on experience consuming market data or high-throughput event streams via WebSockets
Familiarity with AWS services (EC2, S3) and containerized environments (Kubernetes)
Experience with monitoring/observability tooling such as Prometheus, Grafana, or similar
Demonstrated ability to work independently, take initiative, and communicate proactively when facing blockers
Comfort working in a small, fast-moving team with minimal oversight
Preferred
Prior experience in quantitative finance, hedge funds, or fintech data infrastructure
Experience with Polygon.io / Massive.com APIs and data formats
Background in financial market data (equities, options) and familiarity with concepts like tradable universes and position management
Experience with message queuing, stream processing frameworks, or time-series data aggregation