Location: Santa Clara, CA, US
Job Duties:
- Design and deliver high-performance services and libraries.
- Build streaming data pipelines for data collection and processing.
- Design and build a data lakehouse architecture.
- Collaborate with engineering and business teams on production integration.
- Automate measurement, testing, and monitoring of the data platform.
Required Skills:
- Big data
- Distributed systems
- Multi-petabyte data lakes
- Spark, Trino
- Delta Lake, Iceberg
- Java, Scala, Python, Go
- Real-time streaming applications (Kafka)
- Interpersonal skills
Required Experience:
- Bachelor's/Master's in Computer Science or related field
- 5+ years software engineering experience
- Experience building and operating large-scale distributed systems against SLAs