Join a global organization that delivers large-scale technology, data, and AI solutions to enterprise clients across industries. You’ll be part of a team building modern data platforms that power analytics and business decisions.
What you’ll do:
- Build and manage data pipelines from multiple sources (databases, APIs, files, streaming)
- Develop ETL/ELT workflows using SQL and Python
- Ensure data is reliable, scalable, and ready for reporting and analytics
- Work with databases (Oracle, SQL Server, PostgreSQL, DB2) for data processing and optimization
- Support data modeling and improve data structures for reporting
- Monitor data quality, troubleshoot issues, and handle production incidents
- Partner with analysts, BI teams, and stakeholders to deliver data solutions
- Contribute to Agile delivery (planning, estimation, execution)
What we’re looking for:
- At least 2 years of experience in data engineering or similar roles
- Strong skills in SQL, Python, and Tableau, as well as relational databases (Oracle, DB2, SQL Server, PostgreSQL)
- Experience building and maintaining ETL pipelines
- Background in data modeling and data quality practices
- Exposure to tools like Kafka, Spark, or similar data streaming technologies is a plus
- Familiarity with BI tools (Tableau, Qlik, Power BI) and cloud data platforms
- Comfortable working in fast-paced, production environments
If you’re interested in building data systems that drive real impact, let’s connect.
This is a 12-month contract role based in Singapore.