Dow Technologies and Systems – Lead Data Engineer

Company
Dow Technologies and Systems
dowtechie.com
Designation
Lead Data Engineer
Date Listed
01 Jul 2020
Job Type
Experienced / Senior Executive
Full/Perm
Job Period
Immediate Start, Permanent
Profession
IT / Information Technology
Industry
Computer and IT
Location Name
Singapore
Allowance / Remuneration
$8,000 monthly
Company Profile

Our mission is to offer AI-based solutions that empower HR to automate processes, providing advanced technology to manage tasks such as payroll, leave management, overtime, and attendance. We also place a strong emphasis on building a comprehensive recruitment portal that helps companies attract and retain the best talent. Furthermore, we aim to offer predictive analysis of job and skill requirements. Finally, we want to develop greater transparency between managers, employees, and employers by making employment and performance information accessible through available mediums.

Job Description

We are looking for a Lead Data Engineer who will lead a small team of data analysts/engineers within the technology team, support other data analysts throughout the organization on various data initiatives, and ensure that optimal data delivery architecture is applied consistently across ongoing projects.

Requirements

- Bachelor's degree in Computer Science or a related discipline.
- Minimum 7 years of experience in the design, development, and deployment of large-scale, distributed, cloud-deployed software services.
- Must have been part of at least 2 end-to-end big data projects and must have handled defined modules independently.
- Expert in SQL and strong in data modelling for relational, analytical, and big data workloads.
- Advanced programming skills in Python, Scala, or Java.
- Strong knowledge of data structures, algorithms, and distributed systems.
- Strong experience with and deep understanding of Spark internals.
- Expert in Hive.
- Hands-on experience with one of the cloud platforms (AWS, Azure, GCP).
- Hands-on experience with at least one NoSQL database (HBase, Cassandra, MongoDB, etc.).
- Experience working with both batch and streaming datasets.
- Knowledge of at least one ETL tool such as Informatica, Apache NiFi, Airflow, or DataStage.
- Experience working with Kafka or a related messaging queue technology.
- Hands-on experience writing shell scripts to automate processes.
- Willingness to learn and adapt.
- Delivery focused and willing to work in a fast-paced environment.
- Takes initiative and responsibility for delivering complex software.
- Knowledge of building REST API endpoints for data consumption.
- Excellent oral and written communication skills.
- Well versed in Agile methodologies, with experience working in scrum teams.
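
The requirements above call for hands-on Spark, Hive, and Kafka experience across both batch and streaming datasets. As a rough illustration of what that looks like in practice, here is a minimal PySpark sketch; it assumes a Spark cluster with Hive support, the spark-sql-kafka connector, and a Kafka broker at localhost:9092, and the table, topic, and path names are invented for the example.

    # Minimal PySpark sketch: one batch aggregation over a Hive table and one
    # streaming ingest from Kafka. All object names are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("batch-and-streaming-sketch")
        .enableHiveSupport()  # requires a configured Hive metastore
        .getOrCreate()
    )

    # Batch workload: aggregate a Hive table with Spark SQL and persist the result.
    daily_counts = spark.sql("""
        SELECT event_date, COUNT(*) AS events
        FROM analytics.raw_events
        GROUP BY event_date
    """)
    daily_counts.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")

    # Streaming workload: consume messages from Kafka and land them as Parquet.
    stream = (
        spark.readStream
        .format("kafka")  # requires the spark-sql-kafka connector on the classpath
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "events")
        .load()
        .select(F.col("value").cast("string").alias("payload"))
    )

    query = (
        stream.writeStream
        .format("parquet")
        .option("path", "/data/landing/events")
        .option("checkpointLocation", "/data/checkpoints/events")
        .start()
    )
    query.awaitTermination()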

Responsibilities

- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
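
Much of the work above centres on building and scheduling ETL pipelines. Purely as an illustration of the shape of that work, here is a minimal Airflow DAG sketch; it assumes Airflow 2.x and pandas with a Parquet engine, and the DAG, task, file, and column names are invented for the example.

    # Minimal Airflow sketch of a daily extract-transform-load pipeline.
    # All paths and names are illustrative assumptions.
    from datetime import datetime

    import pandas as pd
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Pull raw records from a source system (here: a CSV dropped by an upstream job).
        pd.read_csv("/data/raw/signups.csv").to_parquet("/data/staging/signups.parquet")


    def transform():
        # Clean and de-duplicate the staged data.
        df = pd.read_parquet("/data/staging/signups.parquet")
        df["signup_date"] = pd.to_datetime(df["signup_date"]).dt.date
        df.drop_duplicates(subset="user_id").to_parquet("/data/curated/signups.parquet")


    def load():
        # Hand the curated file to the warehouse loader (left as a stub in this sketch).
        print("loading /data/curated/signups.parquet into the warehouse")


    with DAG(
        dag_id="signups_etl",
        start_date=datetime(2020, 7, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the three steps in order each day.
        extract_task >> transform_task >> load_task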
