Ugi Abdul Muchyi

Data Engineer at PT. Quantus Telematika Indonesia
Contact Information
Location
Bandung, West Java, Indonesia
Languages
  • Indonesian: Native or bilingual proficiency
  • English: Professional working proficiency
  • Sundanese: Limited working proficiency
  • Javanese: Limited working proficiency


Experience

    • Indonesia
    • Information Technology & Services
    • 1 - 100 Employees
    • Data Engineer
      • Oct 2022 - Present

      - Data Modeling: designing schemas and tables and defining relationships between data entities.
      - Data Integration: extracting data from various sources (databases, APIs, files, etc.) and transforming it into a format suitable for analysis or storage in a data warehouse, mostly using PostgreSQL and MySQL.
      - Data Pipeline Development: building scalable data pipelines that automate the flow of data from source systems to target destinations.
      - Web Scraping: developing scripts to scrape the desired data from social media, then cleaning and preprocessing the scraped data to ensure data quality.
      - Airflow Workflow Design: designing and defining data workflows using Airflow's Directed Acyclic Graph (DAG) structure, covering task development, dependency management, and scheduling parameters (a minimal DAG sketch follows this list).
        - Task Development: writing code or scripts to implement individual tasks within the data workflow, such as data extraction, transformation, loading, and any other necessary processing steps.
        - Dependency Management: establishing dependencies between tasks to ensure proper sequencing and execution.
        - Scheduling and Execution Parameters: configuring the workflow schedule, such as the start time, end time, recurrence interval, and other scheduling options.
      - Data Streaming: performing change data capture (CDC) on PostgreSQL using Debezium and Kafka. Debezium connectors monitor database changes and produce a record for every insert, update, or delete that occurs in the source database, while Kafka acts as the event and message broker that carries the change events from the Debezium connectors to downstream consumers (a connector-registration sketch also follows this list).
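
      A minimal sketch of the kind of Airflow DAG described above (Python, Airflow 2.x). The DAG id, task names, schedule, and the extract/transform/load helpers are illustrative assumptions, not the actual production pipeline.

      from datetime import datetime, timedelta

      from airflow import DAG
      from airflow.operators.python import PythonOperator

      def extract(**context):
          # Placeholder: pull rows from a source system (database, API, file).
          pass

      def transform(**context):
          # Placeholder: clean and reshape the extracted data.
          pass

      def load(**context):
          # Placeholder: write the transformed data into warehouse tables.
          pass

      with DAG(
          dag_id="example_etl_pipeline",            # hypothetical DAG name
          start_date=datetime(2023, 1, 1),          # scheduling parameters
          schedule_interval="@daily",               # recurrence interval
          catchup=False,
          default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
      ) as dag:
          extract_task = PythonOperator(task_id="extract", python_callable=extract)
          transform_task = PythonOperator(task_id="transform", python_callable=transform)
          load_task = PythonOperator(task_id="load", python_callable=load)

          # Dependency management: extract runs before transform, then load.
          extract_task >> transform_task >> load_task

      For the CDC setup, a Debezium PostgreSQL connector is typically registered through the Kafka Connect REST API. The sketch below assumes hypothetical host names, credentials, and connector name; only the configuration keys themselves are standard Debezium/Kafka Connect options.

      import json
      import requests

      connector = {
          "name": "postgres-cdc-connector",                       # hypothetical connector name
          "config": {
              "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
              "database.hostname": "postgres.example.internal",   # placeholder host
              "database.port": "5432",
              "database.user": "debezium",
              "database.password": "change-me",                    # placeholder credentials
              "database.dbname": "inventory",                      # placeholder database
              "topic.prefix": "cdc",                               # Kafka topics become cdc.<schema>.<table>
              "plugin.name": "pgoutput",                           # PostgreSQL logical decoding plugin
          },
      }

      # Kafka Connect exposes POST /connectors for creating new connectors.
      resp = requests.post(
          "http://connect.example.internal:8083/connectors",       # placeholder Connect endpoint
          headers={"Content-Type": "application/json"},
          data=json.dumps(connector),
      )
      resp.raise_for_status()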

    • United Kingdom
    • Software Development
    • 1 - 100 Employees
    • Database Curator
      • Jun 2022 - Aug 2023
    • Member of Majelis Tetap Kongres
      • Jan 2019 - Dec 2019

      - Arbitrate problems arising from FMIPA organizations' activities
      - Act as facilitator of the congress
      - Follow up on and socialize the results of the congress

    • Staff of Advocacy Commission MPA Himatika FMIPA Unpad
      • Dec 2018 - Dec 2019

      - Approve the work programs to be carried out by PH Himatika for the Finance and PR-Media Information Divisions
      - Supervise the overall implementation of Himatika's regulations
      - Prepare the aspirations report and advocate it to the related parties

    • Indonesia
    • Education Administration Programs
    • 100 - 200 Employees
    • Staff of Logistic Padjadjaran Education Festival
      • Jun 2018 - Oct 2018

      - Provide the logistical needs of the Try Out event and the main event backstage
      - Team up with the Event Division to fulfill the guest stars' logistical needs

Education

  • Universitas Padjadjaran (Unpad)
    Bachelor of Mathematics, Mathematics
    2017 - 2021
