Job Description
Overview of Job
- Design, implement, and maintain data models using Ralph Kimball's dimensional modeling methodology.
- Support data models and ETL solutions in production.
- Build and maintain data pipelines using tools such as NiFi and Airflow (see the illustrative sketch after this list).
- Evaluate and define KPIs to monitor and manage data quality.
- Tune and improve the performance of databases and the data warehouse.
- Support data scientists and data analysts in exploring data in the data warehouse.
- Build and maintain data integration flows provided to other departments.
- Maintain up-to-date documentation of the data catalog, ETL flows, and data dictionary.
- Practice sustainable incident response and blameless postmortems.
- Participate in end-to-end engineering solutions for data processing, data delivery, and integration, including the data processing job/pipeline tool suite, the batch framework and platform, and the micro-service framework and platform.
- Work closely with customers to identify improvements in existing processes, and new processes that bring meaningful and actionable insight into the underlying data.
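Purely as an illustration of the pipeline work described above (not code from Teko), here is a minimal sketch of an Airflow DAG; the DAG name, task names, schedule, and callables are all hypothetical placeholders:

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> load pipeline.
# All names here (orders_daily_etl, extract_orders, load_warehouse) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: pull raw order data from a source system.
    print("extracting orders...")


def load_warehouse():
    # Placeholder: load transformed rows into the data warehouse.
    print("loading warehouse...")


with DAG(
    dag_id="orders_daily_etl",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)

    extract >> load  # extraction must finish before the load runs
```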
Benefits
- Negotiable salary based on qualifications (13th-month salary + performance bonus);
- 2-month probation with 100% of gross salary;
- An open, energetic, and professional working environment with many opportunities for career promotion;
- Free food & drink: lunch, fresh fruit/cake, coffee & tea;
- Annual health check and attractive premium healthcare coverage under the company's own policy;
- Employee relations: company trips, team bonding & clubs;
- Paid leave: 12 days of annual leave & 6 days of sick leave (max 18 days);
- Working time: Monday to Friday (flexible, from 8:30-9:00 AM to 5:30-6:00 PM).
Job Requirement
BASIC QUALIFICATIONS
- Proficiency in one of the following languages: Python / Java / Kotlin, and a career focus on Big Data.
- Proficiency in one of the following technologies: PySpark / Spark / Flink / Airflow (a minimal PySpark sketch follows).
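As an illustration of the PySpark skill listed above (not Teko's code), a minimal sketch of a batch aggregation job; the input/output paths and column names are hypothetical, and the S3 paths assume the appropriate Hadoop S3 connector is configured:

```python
# Minimal PySpark sketch: read raw events, aggregate per user, write a summary table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_summary").getOrCreate()

# Hypothetical input: any Parquet dataset with user_id and amount columns works.
events = spark.read.parquet("s3a://example-bucket/raw/events/")

user_totals = (
    events
    .groupBy("user_id")
    .agg(
        F.sum("amount").alias("total_amount"),  # total spend per user
        F.count("*").alias("event_count"),      # number of events per user
    )
)

# Hypothetical output location for the summary table.
user_totals.write.mode("overwrite").parquet("s3a://example-bucket/marts/user_totals/")
```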
PREFERRED QUALIFICATIONS
- Experience in design, ETL, data modeling, and developing SQL database solutions.
- Good understanding of data management: data lineage, metadata, data quality, and data governance.
- Basic understanding of big data and big data platforms.
- Knowledge of S3 / Spark / Jupyter / Flink is a plus.
- Ability to debug and optimize code and automate routine tasks.
- Task and time management skills; a proactive problem solver.
- Self-learning skills and the ability to apply new knowledge at work.
- Teamwork and communication skills.
Languages
- English: Speaking: Intermediate - Reading: Intermediate - Writing: Intermediate
Technical Skill
- Python
- MS SQL
- Kotlin
- Java
- ETL
- Apache Spark
- Data Modeling
- Big Data
- Amazon S3
- Apache Airflow
- Apache Flink
- PySpark
COMPETENCES
- Time Management Skills
- Proactive
- Teamwork
- Communication Skills
BUSINESS PROFILE
Teko Vietnam - An ideal destination for technology people.
Teko Vietnam Technology JSC was established in January 2017; in fact, our experienced team has worked together since 2009, formerly as Garena Vietnam. Teko's mission continues to be written in the same way we have always worked.
At Teko, our many talented engineers work, share, and support one another in applying advanced technology platforms such as Cloud, Big Data, AI, Blockchain, and Microservices to develop convenient, useful, and practical solutions and products for businesses and society, such as ERP, Data Platform, E-commerce Ecosystem, E-Payment, and OmniChannel Seller & Customer Services.
In addition, Teko Ventures is investing heavily in Fintech, New Retail, Logistics, Warehousing, Digital Entertainment, and B2B Management Solutions, with big brands such as Phong Vu, VNPay, Sapo, Tripi, and Jupviec.