Real Estate Analytics Pte Ltd – AI Data Intern

Company: Real Estate Analytics Pte Ltd (rea-sg.com)
Designation: AI Data Intern
Date Listed: 11 Jul 2025
Job Type: Entry Level / Junior Executive, Intern/TS
Job Period: From Aug 2025, For At Least 4 Months
Profession: IT / Information Technology
Industry: Real Estate
Location Name: Singapore
Allowance / Remuneration: $1,200 monthly
Company Profile

At Real Estate Analytics, we are a forward-thinking, data-driven organization committed to innovation. Our data is the lifeblood of our business, and we are constantly seeking new ways to make our data infrastructure smarter, faster, and more reliable. We foster a culture of learning, collaboration, and impact, where every team member has the opportunity to make a difference.

The Role: A Glimpse into the Future of Data Engineering

We are seeking a motivated and creative AI in Data Engineering Intern to join our dynamic data team. This is a unique opportunity to work at the intersection of Artificial Intelligence, data engineering, and cloud computing.

You won't just be running queries; you'll be building the future. Your core mission will be to research, design, and develop AI-powered tools that monitor, analyze, and optimize our data pipelines. Your work will directly enhance the efficiency, reliability, and cost-effectiveness of our entire data ecosystem, and it will give you a high-impact project for your portfolio.

Job Description

What You'll Do (Key Responsibilities):

  • Analyze & Identify: Use your SQL skills to query pipeline metadata, logs, and performance metrics to identify bottlenecks, inefficiencies, and patterns of failure.

  • Research & Design: Investigate and prototype AI/ML solutions to address key pipeline challenges. This could include:

    • Predictive models for pipeline runtimes or resource consumption.

    • Anomaly detection systems to flag data quality issues or pipeline delays in real time (see the first sketch after this list).

    • Generative AI tools to automate boilerplate code generation or create documentation.

  • Build & Develop: Write clean, efficient, and well-documented Python code to build these tools. You'll work with libraries like Pandas, Scikit-learn, and TensorFlow/PyTorch, and with frameworks like Flask or FastAPI to create services (a minimal service sketch follows this list).

  • Deploy & Automate: Containerize your applications (e.g., using Docker) and deploy them to our cloud environment (e.g., AWS Lambda, Google Cloud Functions, Azure Functions). You'll gain hands-on experience in bringing a tool from concept to production.

  • Collaborate & Present: Work closely with our senior data engineers, data scientists, and platform architects. You will be expected to present your findings, progress, and final project to the team and key stakeholders.
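
To give a concrete sense of the kind of tooling this role involves, here is a minimal sketch of the anomaly-detection idea above. The pipeline_runs table, its columns, and the connection string are illustrative assumptions for this sketch, not our actual schema or infrastructure.

    # Minimal sketch: flag unusually slow or low-volume pipeline runs with an Isolation Forest.
    # Table name, column names, and the connection string are illustrative placeholders.
    import pandas as pd
    from sqlalchemy import create_engine
    from sklearn.ensemble import IsolationForest

    engine = create_engine("postgresql://user:password@localhost:5432/warehouse")  # placeholder DSN

    # Pull a month of run metadata for one (hypothetical) pipeline.
    runs = pd.read_sql(
        """
        SELECT run_id, runtime_seconds, rows_processed
        FROM pipeline_runs
        WHERE pipeline_name = 'daily_listings_ingest'
          AND started_at >= NOW() - INTERVAL '30 days'
        """,
        engine,
    )

    # Unsupervised model over runtime and volume; fit_predict returns -1 for likely anomalies.
    model = IsolationForest(contamination=0.05, random_state=42)
    runs["anomaly"] = model.fit_predict(runs[["runtime_seconds", "rows_processed"]])

    suspicious = runs[runs["anomaly"] == -1]
    print(f"{len(suspicious)} of {len(runs)} runs look anomalous")
    print(suspicious[["run_id", "runtime_seconds", "rows_processed"]])

From there, the same signal could feed an alert, a dashboard, or a small API like the one sketched next.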
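
The Build & Develop item could then take the shape of a small service that exposes that verdict to other tools. Below is a minimal FastAPI sketch; the endpoint name, payload fields, and the tiny synthetic training set are illustrative only.

    # Minimal sketch of a FastAPI service that scores a single pipeline run.
    # Endpoint name, payload fields, and the synthetic training data are illustrative only.
    import numpy as np
    from fastapi import FastAPI
    from pydantic import BaseModel
    from sklearn.ensemble import IsolationForest

    app = FastAPI(title="Pipeline Run Scorer")

    # A real tool would train on historical run metadata and load a persisted model;
    # this one is fitted on synthetic numbers so the example runs end to end.
    _history = np.array([[300, 1_000_000], [320, 1_050_000], [290, 980_000], [310, 1_020_000]])
    _model = IsolationForest(contamination=0.1, random_state=0).fit(_history)

    class RunMetrics(BaseModel):
        runtime_seconds: float
        rows_processed: float

    @app.post("/score")
    def score_run(metrics: RunMetrics) -> dict:
        verdict = _model.predict([[metrics.runtime_seconds, metrics.rows_processed]])[0]
        return {"anomalous": bool(verdict == -1)}

Saved as main.py, this can be served locally with uvicorn main:app, and containerizing it with Docker is the natural next step toward the Deploy & Automate item.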

Who You Are (Qualifications):

Required:

  • Currently pursuing a Bachelor’s, Master’s, or Ph.D. in Computer Science, Data Science, Software Engineering, Statistics, or a related technical field.

  • Strong proficiency in Python and experience with its data-centric libraries (e.g., Pandas, NumPy, Scikit-learn).

  • Solid understanding of SQL and experience querying relational databases (e.g., PostgreSQL, MySQL, Snowflake, BigQuery).

  • A foundational understanding of machine learning concepts (e.g., regression, classification, clustering).

  • Familiarity with cloud computing concepts (AWS, GCP, or Azure).

  • An insatiable curiosity, a desire to learn, and a proactive, problem-solving mindset.

Bonus Points (Preferred Qualifications):

  • Prior experience with data orchestration tools like Airflow, Dagster, or Prefect.

  • Hands-on experience with Docker for containerization.

  • Experience building and deploying simple APIs using Flask or FastAPI.

  • Familiarity with Infrastructure as Code (e.g., Terraform, AWS CloudFormation).

  • Interest or experience with Large Language Models (LLMs) and their application via APIs (e.g., OpenAI, Anthropic).

What We Offer (The Intern Experience):

  • Real-World Impact: Work on a meaningful project that will be integrated into our production systems and deliver tangible value.

  • Dedicated Mentorship: Receive guidance and support from an experienced senior engineer who will help you navigate your project and career goals.

  • Hands-On Learning: Gain practical experience across the full development lifecycle—from ideation and research to deployment and monitoring in a professional cloud environment.

  • Networking: Connect with professionals across our engineering, product, and business teams.

  • A Great Portfolio Piece: Leave with a completed, high-impact project that you can showcase to future employers.

How to Apply

To apply, submit your resume and a brief cover letter or statement of interest explaining why you are excited about the intersection of AI and data engineering. Please include a link to your GitHub profile or any relevant projects, if available.
