
Big Data Platform Administrator

(Database Administrator)

WING BANK (CAMBODIA) PLC
Boeng Keng Kang | Phnom Penh
  1 Post
Verified: This job has been verified as real by the company.
15 Jan 2026
Recruiter active 1 day ago. This company is actively hiring; your CV will be sent directly to the company.


Experience level: Manager

Job Function: IT Hardware, Software

Job Industry: Banking/ Insurance/ Microfinance

Min Education Level: Bachelor's Degree

Job Type: Full Time

Job Description

A Big Opportunity for ...

  • Design and implement scalable big data architectures using Hadoop, Spark, Hive, and other distributed systems.
  • Develop and optimize ETL/ELT pipelines for batch and streaming data.
  • Integrate structured and unstructured data from diverse sources into unified platforms.
  • Monitor system health and performance, escalating issues as needed.
  • Support routine system backups, updates, and patches.
  • Help troubleshoot hardware and software issues within the Hadoop environment.
  • Document procedures, system configurations, and operational activities.
  • Collaborate with data scientists, analysts, and software engineers to support data-driven initiatives.
  • Implement data governance, security and compliance protocols.
  • Evaluate and adopt emerging technologies to improve data reliability and scalability.

Open To

Male/Female

Job Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Proven experience in big data engineering ecosystem or platform development
  • Strong understanding of HDFS, MapReduce, YARN, Hive, HBase, Spark, NiFi, Ranger, and Kafka.
  • Proficiency in programming languages such as Python, Java, Scala, and shell scripting.
  • Familiarity with SQL and NoSQL databases
  • Hands-on experience with lakehouse platforms such as Dremio, Databricks, or Iceberg.
  • Experience with tools like Airflow, dbt, or Flink
  • Strong problem-solving skills and attention to detail
  • Exposure to machine learning pipelines and real-time analytics
  • Ability to work collaboratively in a team environment

What we can offer

Benefits

- Rewards for over-performance

Highlights

- Join an experienced team

Career Opportunities

- Learn new skills on the job