Currently hiring a Cloud Data Engineer for one of my clients, a global bank in the middle of a major platform transformation project.
Key responsibilities:
- Design and implement data pipelines in a cloud environment
- Understand the data schemas of the available data sources
- Utilize technologies such as workflow engines to automate data transfer from sources to destinations
- Design suitable data schemas for the data warehouse to enable efficient querying and retrieval of data for analytics purposes
- Automate & implement the CI/CD delivery pipeline
Desired qualifications:
- 5+ years of experience as a Data Engineer or Cloud Engineer
- Hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Cloud Functions, or AWS services such as Athena and Lambda
- Experience handling sizable data warehouses
- Experience implementing ETL / data streaming for data analytics
- Fluent in English; any other Asian language (Cantonese, Mandarin, Korean, Japanese) is a plus but not mandatory
Nice to have:
- Big data platform technologies such as Hadoop, Hive, Spark, Cassandra, HBase
- Infrastructure-as-code experience, such as CloudFormation or Terraform
- Experience in utilizing workflow engines such as: Apache Airflow, Google Cloud Composer
- DevOps tooling, such as Ansible and Jenkins
Location: Hong Kong - WFH: allowed and encouraged - Permanent role
Start Date: September 2021, but open to a longer notice period
Attractive base salary: up to HKD 70k/month x 12 months, depending on seniority
Bonus: 15-20% + 20 days of annual leave
Interested in this description? Would you like to know more?
To apply, send an updated CV to pherard@argyllscott.com.hk, or call my direct line +852 6106 7562 for a confidential discussion.
#LI-PH1