Location: Remote, OR, US
Job Summary:
Job Duties:
- Design and launch data pipelines for large-scale data movement and transformation.
- Build real-time data ingestion pipelines for high-volume event processing.
- Collaborate with analysts, product managers, and engineers to deliver data solutions.
- Ensure data quality through rigorous checks.
- Conduct data modeling and schema design.
- Create workflows for data ingestion and publication.
- Troubleshoot and resolve data issues.
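For candidates unfamiliar with the duties above, the following is an illustrative sketch (not part of the posting) of a minimal extract–transform–load pass with a data-quality check, using only the Python standard library; the table, field names, and sample records are hypothetical.

```python
import sqlite3

# Hypothetical raw records; a real pipeline would extract these from an upstream source.
RAW_EVENTS = [
    {"user_id": "u1", "amount": "19.99"},
    {"user_id": "u2", "amount": "5.00"},
    {"user_id": "",   "amount": "3.50"},   # missing user_id: fails the quality check below
]

def transform(event):
    """Cast fields to their target types (simple schema enforcement)."""
    return (event["user_id"], float(event["amount"]))

def is_valid(row):
    """Data-quality check: reject rows with a missing user_id or negative amount."""
    user_id, amount = row
    return bool(user_id) and amount >= 0

def run_pipeline(conn, events):
    """Extract -> transform -> validate -> load; returns the number of rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS purchases (user_id TEXT, amount REAL)")
    rows = [transform(e) for e in events]
    good = [r for r in rows if is_valid(r)]
    conn.executemany("INSERT INTO purchases VALUES (?, ?)", good)
    conn.commit()
    return len(good)

conn = sqlite3.connect(":memory:")
loaded = run_pipeline(conn, RAW_EVENTS)
print(loaded)  # 2 of the 3 sample rows pass the quality check
```

Production pipelines would replace the in-memory list and SQLite with the streaming and warehouse tools named in the skills list, but the extract/transform/validate/load shape is the same.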
Required Skills:
- Java/Python
- ETL design (Big Data stack)
- Google Cloud Platform
- Real-time processing tools (Kafka, Spark)
- SQL proficiency
- Data analysis
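As a rough benchmark for the SQL-proficiency item above, a candidate should be comfortable writing aggregation queries like this hypothetical per-user rollup (shown here against SQLite purely for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, amount REAL);
    INSERT INTO events VALUES ('u1', 10.0), ('u1', 2.5), ('u2', 4.0);
""")
# Aggregate spend per user, largest first -- a typical analytical query.
totals = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id ORDER BY total DESC"
).fetchall()
print(totals)  # [('u1', 12.5), ('u2', 4.0)]
```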
Required Experience:
- Computer Science degree or equivalent
- 4+ years Java/Python development
- 3+ years ETL experience