
Big Data Platform Administrator

(Database Administrator)

WING BANK (CAMBODIA) PLC
Boeng Keng Kang | Phnom Penh
1 Post
Verified: This job has been verified by the company as a real job vacancy. Posted today.
Recruiter active 12 hours ago: The recruiter at this company was last active reviewing applications.


Experience level

Manager

Job Function

IT Hardware, Software

Job Industry

Banking/ Insurance/ Microfinance

Min Education Level

Bachelor Degree

Job Type

Full Time

Job Description

A Good Opportunity for ..

  • Design and implement scalable big data architectures using Hadoop, Spark, Hive, and other distributed systems.
  • Develop and optimize ETL/ELT pipelines for batch and streaming data.
  • Integrate structured and unstructured data from diverse sources into unified platforms.
  • Monitor system health and performance, escalating issues as needed.
  • Support routine system backups, updates, and patches.
  • Help troubleshoot hardware and software issues within the Hadoop environment.
  • Document procedures, system configurations, and operational activities.
  • Collaborate with data scientists, analysts, and software engineers to support data-driven initiatives.
  • Implement data governance, security and compliance protocols.
  • Evaluate and adopt emerging technologies to improve data reliability and scalability.

Open To

Male/Female

Job Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience in big data engineering ecosystems or platform development.
  • Strong understanding of HDFS, MapReduce, YARN, Hive, HBase, Spark, NiFi, Ranger, and Kafka.
  • Proficiency in programming languages such as Python, Java, Scala, and shell scripting.
  • Familiarity with SQL and NoSQL databases.
  • Hands-on experience with lakehouse platforms such as Dremio, Databricks, or Apache Iceberg.
  • Experience with tools such as Airflow, dbt, or Flink.
  • Strong problem-solving skills and attention to detail.
  • Exposure to machine learning pipelines and real-time analytics.
  • Ability to work collaboratively in a team environment.

What we can offer

Benefits

- Rewards for over-performance

Highlights

- Join an experienced team

Career Opportunities

- Learn new skills on the job