Money Forward Vietnam

E-Town Central, 11 Doan Van Bo, Ho Chi Minh City

Company Size: 25-99

Job Description

RESPONSIBILITIES:

  • Develop and operate cross-company data infrastructure.
  • Manage data integration and data pipelines.
  • Aggregate data from various internal products, services, and CRM tools into a data lake.
  • Appropriately distribute the data collected in the data lake to analysis and ML platforms.
  • Build and maintain data analysis infrastructure.
  • Optimize DWH performance.
  • Ensure data quality.

Caring for Mental & Physical Well-being:

  • Hybrid working
  • Full salary during probation & 13th-month salary
  • Social insurance on full salary from probation
  • Premium health insurance from probation
  • Flexible start time (8 AM-9 AM), Monday to Friday
  • 16 days off annually + 1 birthday leave
  • 5 extra days of paternity leave
  • Annual company trip; quarterly team-building activities
  • Club activities
  • Annual health check

Caring for Career & Development:

  • Clear career path
  • Sponsorship for foreign language courses & international technology certifications
  • Well-equipped facilities: MacBook Pro, additional monitor, etc.
  • Soft-skill workshops
  • Tech seminars
  • Monthly and biannual recognition awards
  • Performance reviews twice a year

Job Requirements

REQUIREMENTS: 

(*) Must have:

  1. AI software development
  • Experience developing AI algorithms, or an understanding of how they work from scratch
  • Experience developing an Application Programming Interface (API) as an internet service
  • Experience operating and managing an API service using appropriate software
  • Experience controlling and tuning model accuracy through operational data

  2. Data engineering
  • Data Warehousing
  • ETL Processes
  • Cloud Platforms (AWS, Azure, GCP)
  • Python
  • SQL
  • Big Data Technologies (Hadoop, Spark, Kafka)
  • Data Modeling
  • Data Quality
  • Data Security
  • Version Control (Git)

  3. Tools
  • Configuration management: Terraform
  • CI/CD: GitHub Actions
  • Monitoring and logging: Datadog, Cloud Monitoring, CloudWatch
  • Project management: JIRA Cloud, Miro, etc.
  • Documentation: Kibela, Google Workspace
  • Spark: 2-3 years of experience
  • Airflow: 1 year (as a user, not an administrator)

  4. Cloud architecture
  • Experience developing systems using cloud services on AWS, GCP, or Azure (AWS preferred).
  • Understanding of cloud service components from an architectural perspective (what they can do, how they work, and what requires attention).
  • Experience designing and integrating systems with appropriate security.

(*) Nice to have:

  • Good communication skills and an open mind
  • Good English (all four skills)

Languages

  • English

    Speaking: Intermediate - Reading: Intermediate - Writing: Intermediate

Technical Skills

  • Python
  • Data Warehouse
  • Algorithm
  • Jira
  • Git
  • ETL
  • API
  • MS SQL
  • Hadoop
  • GitHub
  • Apache Spark
  • Data Modeling
  • MS Azure
  • Big Data
  • Apache Kafka
  • AWS
  • CloudWatch
  • Datadog
  • GCP
  • Terraform
  • Apache Airflow
  • Miro
  • Cloud Architecture
  • CI/CD

COMPETENCIES

  • Project Management
  • Documentation
  • Communication Skills