SPH Media Limited – Data Engineer Intern

Company
SPH Media Limited
sph.com.sg
Designation
Data Engineer Intern
Date Listed
09 Apr 2024
Job Type
Entry Level / Junior Executive
Intern/TS
Job Period
Immediate Start, For At Least 3 Months
Profession
Engineering
Industry
Creative / Media
Location Name
1000 Toa Payoh North, Singapore 318994, Singapore
Address
1000 Toa Payoh N, Singapore 318994
Allowance / Remuneration
$800 - 1,200 monthly
Company Profile

Aligned with the evolution of the media landscape and our ambition to be a relentless creator of quality content and experiences, our brand refresh is another milestone for SPH Media. Our transformation journey started in 2022, focusing on digitalisation, audience engagement and talent development. Throughout this journey, we have remained steadfast in our mission to be the trusted source of news on Singapore and Asia. The refreshed brand stands for the importance of giving a voice to Singapore, while inspiring conversations and providing quality content that impacts the lives of our audiences. We offer a varied portfolio of over 40 media brands spanning news publications, lifestyle brands and radio stations, and we are relentless in our commitment to creating meaningful experiences that resonate deeply with our audiences.

Job Description

You will play a crucial role in designing, building, and maintaining our data infrastructure and systems. You will work closely with cross-functional teams to understand their data needs, implement scalable solutions, and ensure the availability and integrity of our data. 

Responsibilities: 

  • Design and develop data pipelines: Architect, build, and optimize data pipelines to ingest, transform, and store large volumes of data from various sources into our data warehouse using AWS technologies such as AWS Glue, AWS Lambda, and AWS S3. 
  • Data modeling and schema design: Collaborate with data scientists and analysts to design efficient and scalable data models and database schemas that meet business requirements and enable accurate and timely reporting and analysis. 
  • Data transformation and processing: Utilize your expertise in Python programming to transform raw data into clean, structured datasets, applying data validation, cleansing, and enrichment techniques. 
  • AWS infrastructure management: Leverage your skills in AWS services such as EC2, RDS, Redshift, and EMR to build and maintain scalable and reliable data infrastructure on the cloud. Implement best practices for security, performance optimization, and cost efficiency. 
  • Terraform deployment and management: Utilize Terraform to automate the provisioning, configuration, and deployment of AWS resources, ensuring reproducibility, consistency, and scalability of our infrastructure. 
  • ETL optimization and performance tuning: Identify bottlenecks and optimize the performance of ETL processes and data pipelines. Implement data partitioning, indexing, and caching strategies to improve overall system performance. 
  • Data quality and governance: Implement data quality checks, data monitoring, and data governance processes to ensure data accuracy, integrity, and compliance with regulatory requirements.
  • Visualization and reporting: Collaborate with business stakeholders to understand their reporting and analytics requirements. Utilize Tableau or other visualization tools to create interactive dashboards and reports that provide actionable insights to drive decision-making. 
  • Collaboration and mentoring: Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand their data needs and provide guidance on best practices. 

Requirements:

  • Strong proficiency in AWS services such as Glue, Lambda, S3, EC2, RDS, Redshift, and EMR.
  • Solid understanding of infrastructure-as-code concepts and experience with Terraform for infrastructure provisioning and management. 
  • Expertise in Python programming for data transformation, manipulation, and processing. 
  • Proficiency in designing and optimizing data models and database schemas. 
  • Experience with data visualization tools such as Tableau or similar. 
  • Strong understanding of ETL processes, data warehousing concepts, and data integration techniques. 
  • Familiarity with data quality assurance and governance practices. 
  • Excellent problem-solving skills and ability to troubleshoot and debug complex data issues. 
  • Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
  • Proven ability to mentor and provide technical guidance to junior team members. 
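For illustration only, the following Python sketch shows the kind of ingest, transform and load step described in the pipeline and data transformation responsibilities above. The bucket names, object keys and column names (RAW_BUCKET, CURATED_BUCKET, event_time, user_id) are hypothetical placeholders rather than SPH Media's actual infrastructure, and a production version would more likely run as an AWS Glue or Lambda job than as a plain script.

"""Minimal sketch of an S3 ingest -> clean -> load step (illustrative only).

Assumes hypothetical bucket names and a CSV layout.
Requires boto3, pandas and pyarrow.
"""
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

RAW_BUCKET = "example-raw-bucket"          # placeholder names, not real infrastructure
CURATED_BUCKET = "example-curated-bucket"


def extract(key: str) -> pd.DataFrame:
    """Read one raw CSV object from the landing bucket."""
    obj = s3.get_object(Bucket=RAW_BUCKET, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple cleansing and enrichment: de-duplication, typing, derived column."""
    df = df.drop_duplicates()
    df["event_time"] = pd.to_datetime(df["event_time"], errors="coerce")
    df = df.dropna(subset=["event_time", "user_id"])
    df["event_date"] = df["event_time"].dt.date.astype(str)  # used as a partition key
    return df


def load(df: pd.DataFrame, name: str) -> None:
    """Write the cleaned data back as Parquet, partitioned by event_date in the key."""
    for event_date, part in df.groupby("event_date"):
        buffer = io.BytesIO()
        part.to_parquet(buffer, index=False)
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"events/event_date={event_date}/{name}.parquet",
            Body=buffer.getvalue(),
        )


if __name__ == "__main__":
    frame = transform(extract("landing/events.csv"))
    load(frame, "events")

Writing the curated output as Parquet keyed by event_date also illustrates the partitioning strategy mentioned under ETL optimization; the exact layout would depend on the team's warehouse design.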
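In the same spirit, here is a small sketch of the data quality checks mentioned in the responsibilities, written as plain pandas assertions. The column names and rules (event_id uniqueness, non-negative amount) are invented for illustration; real checks would follow the team's governance requirements.

"""Illustrative data quality checks over a curated DataFrame (hypothetical columns)."""
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means all checks passed."""
    failures = []

    # Completeness: key columns must not contain nulls.
    for column in ("user_id", "event_time"):
        null_count = int(df[column].isna().sum())
        if null_count:
            failures.append(f"{column}: {null_count} null values")

    # Uniqueness: no duplicate event identifiers.
    if df["event_id"].duplicated().any():
        failures.append("event_id: duplicate values found")

    # Validity: numeric amounts must be non-negative.
    if (df["amount"] < 0).any():
        failures.append("amount: negative values found")

    return failures


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "event_id": [1, 2, 2],
            "user_id": ["a", None, "c"],
            "event_time": pd.to_datetime(["2024-04-09", "2024-04-09", None]),
            "amount": [10.0, 5.0, -1.0],
        }
    )
    for problem in run_quality_checks(sample):
        print("FAILED:", problem)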
This position is already closed and no longer available.
