PHP 60,000
Brgy. Highway Hills, Mandaluyong City
We are seeking a skilled Data Engineer to join our team. As a Data Engineer, you will be responsible for managing and optimizing data workflows within the AWS Console. Your primary focus will be on creating, modifying, and deleting objects/tables in S3 buckets, developing and executing Glue job scripts for ETL processes, and using the Glue Data Catalog for ETL operations.
Responsibilities:
• Create, modify, and delete objects/tables in S3 buckets, including the raw, clean, and curated zones and the consumption layer (Redshift cluster).
• Develop and execute Glue job scripts for Extract, Transform, Load (ETL) processes, ensuring efficient data transformation across the entire dataset in the Data Lake.
• Determine appropriate actions, such as tokenization, to be applied in the raw zone for immediate data processing.
• Set up dataset alarms to monitor the data pipeline and configure notifications for missing data ingestion.
• Collaborate with stakeholders to determine data transformation requirements for selected datasets within the Data Lake.
• Apply data standardization techniques in the clean zone to ensure consistent and reliable data quality.
• Create snapshots in the clean zone to capture data at specific points in time for analysis and auditing purposes.
• Design and develop data models in the curated zone, optimizing data structures for efficient access and retrieval.
• Define tables/objects in the curated zone to support Salesforce usage, ensuring data integrity and compatibility.
• Identify and create tables/objects in the curated zone and the consumption layer (Redshift Cluster) for Tableau standard access.
• Determine tables/objects to be included in Redshift Spectrum for Tableau advanced access, enabling advanced data exploration and analysis.
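To illustrate the clean-zone work described above, here is a minimal Python sketch of a record-standardization rule and a point-in-time snapshot key. The function names, rules, and S3 key layout are hypothetical examples, not this team's actual conventions; in production these steps would typically run inside a Glue PySpark job rather than plain Python.

```python
from datetime import datetime, timezone

def standardize_record(record):
    """Hypothetical clean-zone standardization: trim whitespace,
    lowercase field names, and coerce empty strings to None."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()
            if value == "":
                value = None
        cleaned[key.strip().lower()] = value
    return cleaned

def snapshot_key(dataset, when=None):
    """Build an S3 key prefix for a point-in-time clean-zone snapshot,
    e.g. 'clean/customers/snapshot_dt=2024-01-15/' (layout is illustrative)."""
    when = when or datetime.now(timezone.utc)
    return f"clean/{dataset}/snapshot_dt={when:%Y-%m-%d}/"
```

A date-partitioned snapshot prefix like this is one common way to capture data at specific points in time for later analysis and auditing, since each day's copy stays addressable under its own key.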
Qualifications:
• Bachelor’s degree in computer science, information technology, or a related field.
• Minimum of 3 years’ relevant work experience.
• Strong proficiency in working with the AWS Console, including S3, Glue, and Redshift services.
• Experience with ETL processes, scripting, and data transformation techniques.
• Familiarity with Glue Data Catalog and its usage in ETL workflows.
• Solid understanding of data lakes, data modeling, and database concepts.
• Ability to collaborate effectively with cross-functional teams and stakeholders.
• Strong problem-solving and analytical skills.
• Excellent communication and documentation skills.
• Attention to detail and a commitment to delivering high-quality results.
• Willing to travel and be deployed offshore to a client/affiliate (Hong Kong office).
If you are a self-motivated Data Engineer with a passion for working with AWS technologies and driving data-driven insights, we would love to hear from you. Join our dynamic team and contribute to the success of our affiliate's data initiatives.
Work Arrangement: Deployed to and working in the Hong Kong office; while in the Philippines, the engineer will work from home.
Skills: ETL
Industry: Information Technology
Work Hours: 08:00 AM to 05:00 PM