Recruiter: Marcella Irene

About The Company:
A leading digital telecommunication company in Indonesia with a commitment to empowering millions through cutting-edge solutions, this organization delivers seamless connectivity, advanced data platforms, and transformative digital services.
Job Responsibilities:
- Lead a team to deliver high-quality data products and seamless integration with existing systems.
- Develop, build, and manage scalable data pipelines to support diverse business and analytics needs (see the sketch after this list).
- Ensure the pipelines are optimized for data flow, integrity, and transformation across systems.
- Implement robust data integration solutions, while maintaining data integrity, security, and compliance with regulatory standards.
- Work closely with analytics, engineering, and business teams to understand data requirements and deliver effective data solutions that align with business goals.
- Identify and resolve complex technical issues and implement preventive measures to avoid recurring problems.
- Develop data architecture, workflows, processes, and technical documentation to ensure system clarity and usability across functions.
- Provide advanced technical support to data scientists, data analysts, or other related users.
- Offer technical consulting to external stakeholders to design effective data solutions and translate business requirements into appropriate technical solutions.
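
In practice, the pipeline work described above is the kind of extract-validate-load step sketched below. This is a minimal, purely illustrative PySpark example (Spark and Python are both named in the requirements); the paths, column names, and integrity rules are hypothetical placeholders, not details from this posting.

    # Illustrative ETL step: read raw events, enforce basic integrity rules,
    # and write partitioned output for analytics. All names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("usage_etl_sketch").getOrCreate()

    # Extract: raw events landed by an upstream system (hypothetical path).
    raw = spark.read.json("/data/raw/usage_events/")

    # Transform: drop records that fail integrity checks, then normalize fields.
    clean = (
        raw.dropDuplicates(["event_id"])                     # de-duplicate on the record key
           .filter(F.col("subscriber_id").isNotNull())       # required field must be present
           .withColumn("event_date", F.to_date("event_ts"))  # normalize the raw timestamp
    )

    # Load: partition by date so downstream queries can prune efficiently.
    clean.write.mode("overwrite").partitionBy("event_date").parquet(
        "/data/curated/usage_events/"
    )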
Key Requirements:
- Bachelor's or higher degree in Computer Engineering or Information Systems, with at least 5 years of working experience in the Data Engineering field.
- Skilled in constructing robust and scalable data architectures and pipelines, including the ability to apply an agile approach to ETL development.
- Experienced with workflow orchestration tools such as Apache Airflow (illustrated after this list) and data engineering platforms like Databricks.
- Strong knowledge of big data technologies, including Hadoop, Spark, Kafka, and Snowflake.
- Skilled in managing various database technologies such as MySQL, PostgreSQL, MongoDB, and Cassandra.
- Hands-on experience with Google Cloud Platform (GCP); familiarity with Huawei Cloud is a plus.
- Proficient in programming languages commonly used in data engineering, including Python, Java, Scala, and SQL.
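
To make the orchestration requirement concrete, below is a minimal sketch of a daily batch pipeline in Apache Airflow, one of the tools named above. The DAG id, task names, and schedule are hypothetical placeholders; a real pipeline at this company would differ.

    # Minimal Airflow 2.x DAG: three Python tasks chained extract >> transform >> load.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        print("pull raw records from a hypothetical source system")

    def transform(**context):
        print("validate and normalize the extracted records")

    def load(**context):
        print("write the cleaned records to the warehouse")

    with DAG(
        dag_id="daily_usage_etl",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",      # run once per day
        catchup=False,                   # skip backfilling past runs
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task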
If you possess the above requirements and are keen on a new challenge, please apply directly or email [email protected]