Location: Remote, OR, US
Job Summary: Seeking a data engineer to design and build batch and real-time data pipelines on Google Cloud Platform, partnering with analysts and engineers to deliver high-quality data.
Job Duties
- Design, build, and launch efficient data pipelines.
- Build real-time data ingestion pipelines.
- Collaborate with analysts and engineers to address business problems.
- Ensure high-quality data delivery through rigorous data quality checks.
- Conduct data modeling and troubleshoot data issues.
- Create workflows for data ingestion, loading, and publishing.
Required Skills (Keywords)
- Java, Python
- ETL design, Big Data stack, Hadoop, Apache Beam
- Google Cloud Platform, BigQuery
- Real-time processing, Kafka, Spark Streaming
- SQL, data modeling, dimensional data
Required Experiences (Topics)
- 4+ years of programming experience
- 3+ years in ETL and Big Data environments
- Experience with real-time processing tools
- Familiarity with data quality analysis and problem-solving
Job URLs: