Location: Milwaukee, US
Job Summary:
Job Duties and Scope
- Design, build, deploy, and maintain ETL jobs and data pipelines.
- Lead and mentor junior team members; influence through Lean-Agile leadership.
- Collaborate with engineering, data science, and analytics teams on data products.
- Implement self-healing data pipelines; assist with production support and upgrades.
- Set standards and best practices for data engineering teams; participate in code reviews.
Required Skills
- Proficient in Scala, Java, C#, or R.
- Experience with AWS services (S3, EMR, EC2, RDS, Redshift, Lambda).
- Skilled in data integration patterns (ETL/ELT, replication, event streaming).
- Knowledge of big data tools (Spark, Hive, Databricks) and Python/SQL.
- Familiar with CI/CD pipelines, Docker, Kubernetes, and Agile methodologies.
Required Experience
- Master’s degree in Data Analytics, Computer Science, or a related field plus 2 years of experience; or a Bachelor’s degree plus 6 years of experience.
- At least 2 years of experience with cloud platforms, databases, and big data frameworks.
- Experience with the software development life cycle (SDLC) and Agile Scrum principles.
Job URLs: