Location: San Jose, CA, 95199, US
Job Summary:
Job Duties and Scope:
- Build and maintain scalable data pipelines capable of handling large data volumes.
- Work with cloud platforms (Azure, Google Cloud Platform) for data storage and analytics.
- Collaborate with cross-functional teams and manage data engineering projects.
Required Skills:
- Proficiency with SQL, Python, Spark, Scala, Hadoop.
- Expertise in ETL processes and database design.
- Strong communication skills for technical and non-technical audiences.
Required Experience:
- Minimum of one year of experience as a Data Engineer or in a similar role.
- Experience with ETL tools (e.g., Apache Airflow) and real-time processing (e.g., Apache Kafka).
- Familiarity with Docker, Kubernetes, and cloud computing services (GCP, Azure).
Job URLs: