Location: Miamisburg, Ohio, US
Job Summary:
1. Job Duties and Scope:
- Construct data pipelines using Airflow and Cloud Functions (a minimal sketch follows this list).
- Maintain and optimize data warehouse schemas, views, and queries.
- Perform ad-hoc analysis and provide insights on feature usage.
- Document data architecture and integration efforts.
- Provide guidance on data best practices and mentor a team of engineers.
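
For context, the pipeline-construction duty above might look roughly like the following Airflow sketch. This is a hypothetical example assuming Apache Airflow 2.4+; the DAG id, schedule, and task callables are illustrative placeholders, not taken from this posting.

```python
# Minimal sketch of an orchestrated extract-and-load pipeline in Airflow.
# All names here (DAG id, task ids, callables) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_feature_usage() -> None:
    # Hypothetical extraction step: pull raw feature-usage events
    # from an upstream API or cloud storage bucket.
    pass


def load_to_warehouse() -> None:
    # Hypothetical load step: write transformed rows into the
    # warehouse tables that downstream views and queries rely on.
    pass


with DAG(
    dag_id="feature_usage_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run once per day
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_feature_usage)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load      # load runs only after extract succeeds
```
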
2. Required Skills:
- Experience with data task orchestration tools (e.g., Airflow).
- Proficiency in Python, SQL, and shell scripting.
- Strong data analysis and modeling skills.
- Familiarity with APIs, SFTP, and cloud storage solutions (see the sketch after this list).
- Analytical problem-solving abilities and experience with cloud computing.
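
The SFTP and cloud storage familiarity above could involve code along these lines. This is a hypothetical sketch assuming the paramiko and google-cloud-storage libraries; the host, paths, and bucket name are placeholders, not details from this posting.

```python
# Hypothetical helper: download a file over SFTP, then upload it to a
# Google Cloud Storage bucket. All connection details are placeholders.
import paramiko
from google.cloud import storage


def sftp_to_gcs(host: str, username: str, key_path: str,
                remote_path: str, local_path: str, bucket_name: str) -> None:
    """Pull a file from an SFTP server and push it to a GCS bucket."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=username, key_filename=key_path)
    try:
        sftp = ssh.open_sftp()
        sftp.get(remote_path, local_path)   # download from the SFTP server
        sftp.close()
    finally:
        ssh.close()

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(remote_path.rsplit("/", 1)[-1])
    blob.upload_from_filename(local_path)   # upload to cloud storage
```
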
3. Required Experiences:
- Proven experience in data engineering and orchestrating data tasks.
- Experience interacting with various data storage and APIs.
- Leadership experience, ideally having led a small team of developers.
Job URLs: