Global Fashion Group
Copac Square Building, 12 Ton Dan, Ho Chi Minh City
Company size: 25-99
Job Summary
Job description
Job overview
Global Fashion Group is the leading fashion and lifestyle destination in growth markets across LATAM, SEA and ANZ. Our three e-commerce platforms (THE ICONIC, ZALORA and Dafiti) connect an assortment of international, local and own brands to millions of consumers from diverse cultures and lifestyles. Powered by our best-in-class operational infrastructure, which is fashion-specific and tailored to local market needs, our platforms provide an inspiring and seamless end-to-end customer experience. We stand for benchmark-setting customer service, delivery options, returns policies, and curation of brands.
About the function
At GFG, technology is driven by innovation, and quality is highly valued. Our data team is the driving force behind our business strategies and decisions. We are integrated into all departments to ensure that all employees at THE ICONIC have access to high-quality, timely data.
Our Data Engineering team solves complex problems and delivers data to propel our business forward, powering the insights that are used in every decision we make. We are the engineers, the builders, the maintenance people for all our business data.
Key tech you’ll play with in this role
- Both AWS and GCP
- BigQuery, SQL Server and Redshift
- Docker & Kubernetes
- Airflow, Cloud Composer and Pentaho
- Cloud Dataflow/Apache Beam
- High-velocity streaming data and behavioural data, as well as structured and unstructured data
What you will do
- Develop and support our enterprise data warehouses, analytical databases and infrastructure
- Work with the team to re-platform our existing data architecture onto next-generation tooling
- Build and maintain our data pipelines in Python and SQL to ensure that data is delivered in a timely manner
- Work closely with our Data Scientists and Data Analysts to implement new insights and statistical models
- Assist in developing tools and processes that enable the business to self-serve
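The duties above centre on Python-and-SQL pipelines. As a rough illustration only (the table, field names and cleaning rules here are hypothetical, not GFG's actual schema), a minimal pipeline step of that kind could look like:

```python
# Minimal sketch of an extract-transform-load step in Python + SQL,
# using only the standard library (sqlite3 stands in for a warehouse).
import sqlite3


def run_pipeline(rows, conn):
    """Load order records into a warehouse table after light cleaning."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)"
    )
    # Transform: trim keys, cast amounts, drop records with no key.
    cleaned = [
        (r["order_id"].strip(), float(r["amount"]))
        for r in rows
        if r.get("order_id", "").strip()
    ]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)


conn = sqlite3.connect(":memory:")
n = run_pipeline(
    [{"order_id": " A1 ", "amount": "9.5"}, {"order_id": "", "amount": "1"}],
    conn,
)
```

In practice a step like this would be scheduled by Airflow or Cloud Composer rather than run inline, and would target Redshift or BigQuery rather than SQLite.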
Job Requirements
What we are looking for
- Extensive knowledge of data processing in Python, including:
- DataFrames
- Pandas
- Dependency management in Python
- Iterators, producers and consumers
- Airflow DAG building
- CI/CD experience (Bamboo deployment/delivery)
- Strong SQL coding skills
- Advanced data engineering design skills
- Excellent communication skills
- Significant experience in a similar data engineering role
- Strong data warehousing and data engineering experience
- Experience with data modelling and complex ETL solutions
- Testing and QA experience with data pipelines
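The list above asks for iterator and producer/consumer knowledge in Python. A small stdlib-only sketch of that pattern (the record values and batch size are made up for illustration) might be:

```python
# Illustrative producer/consumer built on Python generators: records are
# streamed one at a time instead of materialised as a full list in memory.
def produce(records):
    """Producer: yield transformed records lazily."""
    for rec in records:
        yield rec.upper()


def consume(stream, batch_size=2):
    """Consumer: group a stream into fixed-size batches, as a loader might."""
    batch = []
    for item in stream:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch


batches = list(consume(produce(["a", "b", "c"])))
```

The same lazy-iteration idea underlies Airflow task design and streaming frameworks such as Apache Beam, where data is processed element by element rather than held in memory.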
Ways to stand out from the crowd
- GCP experience
- BigQuery/Redshift
- Cube/SSAS experience
- DevOps/CI/CD
- Docker
- Airflow
- R
Languages
- English (Speaking: Intermediate, Reading: Intermediate, Writing: Intermediate)
Technical Skills
- Python
- MS SQL
- AWS
- ETL
- SSAS
- Bamboo
- Docker
- Data Modeling
- AWS Redshift
- DevOps
- Data Warehouse
- Pandas
- GCP
- R
- Apache Airflow
- CI/CD
- Google BigQuery
COMPETENCES
- Communication Skills