Location: Santa Clara, CA, US
Job Summary:
A senior data platform engineering role focused on designing, building, and operating large-scale streaming pipelines and a multi-petabyte data lakehouse, from ingestion through insights.
Job Duties:
- Design and deliver high-performance services and libraries
- Build streaming data pipelines from ingestion to insights (see the sketch after this list)
- Design and build a data lakehouse architecture
- Collaborate with engineering and business teams to integrate the platform with their systems and workflows
- Automate measurement, testing, updating, monitoring, and alerting of the data platform
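
The duties above center on Spark, Kafka, and Delta Lake working together. What follows is a minimal sketch of an ingestion-to-lakehouse pipeline of the kind described, assuming a PySpark environment with the Kafka and Delta Lake connectors available; the broker address, topic, schema, and storage paths are hypothetical placeholders, not details from this posting.

```python
# Minimal Kafka -> Delta Lake streaming sketch (PySpark Structured Streaming).
# Assumes the spark-sql-kafka and delta-spark connectors are on the classpath;
# all names and paths below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-ingestion").getOrCreate()

# Shape of the incoming JSON events (hypothetical).
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("ts", TimestampType()),
])

# Continuously read raw records from a Kafka topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers the payload as bytes; decode and parse it into columns.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
       .select("e.*")
)

# Append into a Delta table; the checkpoint lets the stream recover its
# Kafka offsets and resume after a restart.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/lake/_checkpoints/events")
    .outputMode("append")
    .start("/lake/tables/events")
)
query.awaitTermination()
```

The checkpoint location is what makes restarts safe: Structured Streaming persists its Kafka offsets there, so a redeployed job resumes where the previous run left off instead of reprocessing or skipping records.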
Required Skills (Keywords):
- Big Data
- Distributed systems
- Multi-petabyte data lakes
- Spark
- Trino (see the query sketch after this list)
- Delta Lake
- Java/Scala/Python/Go
- Real-time streaming (Kafka)
- Interpersonal skills
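
Since Trino appears alongside Delta Lake above, here is a minimal sketch of querying such a lakehouse table through the Trino Python client (trino-python-client); the coordinator host, catalog, schema, and table names are hypothetical placeholders.

```python
# Minimal lakehouse query via the Trino Python client (DB-API style).
# Host, catalog, schema, and table names are hypothetical placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.com",  # placeholder coordinator
    port=8080,
    user="analyst",
    catalog="delta",           # placeholder catalog backed by Delta Lake
    schema="events",
)
cur = conn.cursor()
cur.execute(
    "SELECT event_type, count(*) AS n "
    "FROM events GROUP BY event_type ORDER BY n DESC LIMIT 10"
)
for event_type, n in cur.fetchall():
    print(event_type, n)
```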
Required Experiences (Topics):
- 5+ years in software engineering
- Bachelor's/Master's degree in Computer Science or equivalent
- Expertise in building and operating data lakes
Job URLs: