Salary: $3,000 - $4,000 a month

Job Content

As a Data Engineer, you should have prior experience building data infrastructure and possess end-to-end data engineering knowledge, from dimensional modelling to ETL to data warehousing. You thrive on making strides in a fast-paced environment. In this role, you will be involved in the agile development cycle and take ownership of various data tools from design to deployment.

Job Responsibilities

  • Work with Data Scientists, Data Engineers, Data Analysts, and Software Engineers to build and manage data products and the SISTIC Data Warehouse/Data Lake.
  • Design, develop, and launch highly efficient and reliable data pipelines.
  • Solve issues in the existing data pipelines and build their successors.
  • Build modular pipelines to construct features and modelling tables.
  • Maintain data warehouse architecture and relational databases.
  • Respond to incidents by performing root cause analysis and implementing the appropriate corrective actions.
  • Create, document, and maintain highly readable code.
  • Obtain and ingest raw data at scale (including writing scripts, web scraping, calling APIs, and writing SQL queries).
  • Conduct and participate in code reviews with peers.

Qualifications and Criteria

Technical Competencies (must-haves in bold italics)

  • Bachelor’s degree in Computer Science or a related field, with a minimum of 2 years of IT experience.
  • Minimum 2 years’ experience designing, building, and operationalizing medium- to large-scale data integration (structured & unstructured) projects with Data Lakes, Data Warehouses, BLOB Storage, RDBMS, and HDFS.
  • Prior experience using Big Data tooling (Hadoop, Spark) and a good understanding of functional programming.
  • Minimum 1 year of hands-on experience in batch/real-time data integration and processing.
  • Strong proficiency in handling databases using MySQL, PostgreSQL, Hive, and Druid.
  • Solid background in programming languages like Python, Scala, or Java; Python is a must.
  • Experience building and maintaining scalable ETL pipelines using Apache Airflow, Apache Kafka, and Apache Sqoop.

Core Competencies

  • Analytical and critical thinking
  • Problem solver
  • Operates effectively both as a team player and as an independent contributor
  • Strong communication and interpersonal skills
  • Versatile and adaptable, especially in an environment where situations change rapidly
  • Out-of-the-box thinker
  • Ability to juggle multiple tasks and deadlines

Perks in Play

  • Mentorship and career guidance
  • An atmosphere of inclusion and collaboration
  • Regional network connections and opportunities
  • Constant innovation and discovery challenges
  • Fun and product-focused culture

Screening Questions

  • Have you completed the following level of education: Bachelor’s Degree?
  • How many years of work experience do you have using Apache Airflow?
  • How many years of work experience do you have using Java?
  • How many years of work experience do you have using Python (Programming Language)?
  • Are you legally authorized to work in Singapore?
  • Are you comfortable commuting to this job’s location?
  • What is your proficiency in handling databases (MySQL, PostgreSQL, Hive, Druid)? Score yourself out of 10.
  • We must fill this position urgently. Can you start immediately?

Deadline: 07-10-2023
