Chidi Henry
Data Engineer at Cyberfleet
Experience
Cyberfleet
Nigeria
IT Services and IT Consulting
1-100 Employees
Data Engineer
Jul 2018 - Present
- Liaised with the IT & Operations teams to ensure availability of high-quality data and performance tuning of Hadoop clusters
- Led 9+ analysts to administer Hortonworks and analyze virtual machine requirements for implementing Big Data solutions
- Analyzed business requirements to formulate strategies for implementing Big Data initiatives
- Enhanced data processing and storage throughput via the Hadoop framework across a cluster of 25 nodes
- Optimized Hive and HBase queries and improved performance by applying partitioning and bucketing techniques in Hive (see the sketch below)
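To make the partitioning and bucketing point concrete, here is a minimal PySpark analogue of the technique; the actual work used HiveQL DDL against the cluster, and the table name, schema, and bucket count below are hypothetical placeholders rather than project details.

```python
from datetime import date
from pyspark.sql import SparkSession

# Hive-enabled Spark session; assumes a Hive metastore is available.
spark = (SparkSession.builder
         .appName("partition-bucket-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Hypothetical event data standing in for the raw feed.
events = spark.createDataFrame(
    [(1, "click", 0.0, date(2018, 7, 1)),
     (2, "purchase", 19.99, date(2018, 7, 2))],
    ["user_id", "action", "amount", "event_date"],
)

# Partitioning by event_date lets date-filtered queries prune whole partitions;
# bucketing by user_id pre-clusters rows so joins and aggregations on that key
# shuffle far less data.
(events.write
    .partitionBy("event_date")
    .bucketBy(32, "user_id")
    .sortBy("user_id")
    .format("orc")
    .mode("overwrite")
    .saveAsTable("events"))

# A query that filters on the partition column touches only matching partitions.
spark.sql(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events WHERE event_date = DATE'2018-07-02' "
    "GROUP BY user_id"
).show()
```

The same idea in plain Hive is the CREATE TABLE ... PARTITIONED BY ... CLUSTERED BY ... INTO n BUCKETS pattern; the bucket count is chosen from data volume and join-key cardinality.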
Data Engineer
May 2016 - Present
- Built scalable databases capable of ETL processing using SQL and Spark, which helped increase performance by 40%
- Evaluated workflows and increased the efficiency of data pipelines that process over 50 TB of data daily
- Utilized MongoDB to create NoSQL databases that harvest data from a variety of sources
- Performed KPI analysis on the data
- Migrated data from existing mainframe/SQL servers to in-house servers and performed ETL operations on it, increasing database performance by 30%
- Designed and implemented data pipelines for batch data processing
- Screened Hadoop cluster job performance, handled capacity planning, and monitored cluster connectivity and security
- Coordinated with infrastructure, network, database, business intelligence, and application teams
- Processed a variety of unstructured data (text, JSON, HTML)
- Worked closely with data scientists to assist with feature engineering and model-training frameworks
- Developed frameworks, metrics, and reporting to ensure progress can be measured, evaluated, and continually improved
- Moved all crawl-data flat files generated from various retailers to HDFS for further processing
- Imported data from MySQL to HDFS using Sqoop
- Wrote Hive queries to load and process data in the Hadoop file system
- Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs
- Migrated data from the production database to Google BigQuery using Apache Airflow (see the DAG sketch below)
- Migrated data from the production database to AWS S3 using Apache Airflow
- Exported data from BigQuery to Google Data Studio for analysis
- Used AWS Glue to perform ETL operations
- Updated, created, and inserted data in BigQuery
- Used Docker and Docker Compose to deploy PostgreSQL, MySQL, and Airflow
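For context on the Airflow-based migration to BigQuery mentioned above, a minimal DAG sketch follows. It assumes Airflow 2.4+ with the Google provider package installed; the connection ID, bucket, table, and SQL are hypothetical placeholders, not details of the actual pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="orders_to_bigquery",
    start_date=datetime(2022, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage a daily extract from the production Postgres database into GCS.
    extract = PostgresToGCSOperator(
        task_id="extract_orders",
        postgres_conn_id="prod_postgres",
        sql="SELECT * FROM orders WHERE updated_at::date = '{{ ds }}'",
        bucket="staging-bucket",
        filename="orders/{{ ds }}/part-{}.json",
        export_format="json",
    )

    # Load the staged files into the target BigQuery table.
    load = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="staging-bucket",
        source_objects=["orders/{{ ds }}/*"],
        destination_project_dataset_table="analytics.orders",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

    extract >> load
```

Staging the extract in cloud storage and loading it in a separate task keeps each day's run idempotent and independently retryable, which is a common pattern for this kind of migration; the S3 variant swaps the transfer operators accordingly.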
Sabudh Foundation
India
Non-profit Organizations
1-100 Employees
Data Engineer
Dec 2019 - Oct 2022
- Led 7 data analysts to spearhead and facilitate periodic analysis of 100 GB+ of externally sourced website data
- Independently designed MapReduce solutions to parse data, populate staging tables, and store processed data in the EDW (see the mapper/reducer sketch below)
- Conducted extensive research on revenue management and pricing analytics
- Deployed rich graphical visualizations in dashboards to create effective views of data, including line charts, bar charts, pie charts, scatter plots, histograms, tree maps, and heat maps in Tableau Desktop
- Implemented advanced Tableau features such as parameters, table calculations, sets, groups, and user filters to deliver 10+ dashboards
- Analyzed user requirements and validated the available data to propose usable designs
- Interacted with multiple stakeholders to gather business needs and developed technical specifications
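As a rough illustration of the MapReduce parsing step referenced above, here is a Hadoop Streaming-style mapper/reducer sketch in Python; the input field layout and the per-page aggregation are hypothetical, not taken from the actual solution.

```python
import sys

def mapper():
    # Each input line is assumed to be: <page_url>\t<visitor_id>\t<bytes_served>
    for line in sys.stdin:
        parts = line.rstrip("\n").split("\t")
        if len(parts) != 3:
            continue  # skip malformed records
        page_url, _visitor_id, bytes_served = parts
        print(f"{page_url}\t{bytes_served}")

def reducer():
    # Input arrives grouped and sorted by key (page_url); sum bytes per page.
    current_key, total = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current_key:
            if current_key is not None:
                print(f"{current_key}\t{total}")
            current_key, total = key, 0
        total += int(value)
    if current_key is not None:
        print(f"{current_key}\t{total}")

if __name__ == "__main__":
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    mapper() if role == "map" else reducer()
```

With Hadoop Streaming, the same script would be supplied as both the mapper (with the map argument) and the reducer (with the reduce argument), and the reducer output would then be loaded into the staging tables.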
ExxonMobil
Oil and Gas
700 & Above Employees
Analytics Intern
Mar 2013 - Jan 2014
Annulus Pressure Monitoring
- Used SQL and Excel to implement and recommend annulus monitoring frequencies:
  • Subsea wells - daily
  • Manned platforms - daily
  • Unmanned platforms - weekly
- Determined the appropriate diagnostic program for wells with annulus integrity issues and recommended maintenance
- Established a process for coordinating, scheduling, and recording the results of diagnostic testing (such as annulus pressure bleed-down/build-up)
- Included annulus pressure limits for each well in the handover document
- Documented the results of annulus pressure monitoring in WIMS-SAP or the CPMS spreadsheet
- Performed ETL operations and determined the Maximum Allowable Wellhead Operating Pressure (MAWOP) and annulus pressure triggers for each well; trigger pressures were set at no greater than 80% of the MAASP, with WIMS-SAP or other system notifications triggering additional actions (see the sketch below)
- Determined the maximum allowable annulus surface pressure (MAASP) for each well by:
  • Calculating the MAASP
  • De-rating tubulars for corrosion and erosion
  • Calculating the MAWOP
  • Establishing an annulus trigger pressure
- Recommended that surface and intermediate annuli and liquid-filled production annuli be routinely checked for liquid level and topped up as required
- Implemented and recommended precautions for casing fluid management:
  • Use filtered fluid
  • Include a corrosion inhibitor, oxygen scavenger, and bactericide suitable for the well conditions in all aqueous fluids
  • Add weighting material where necessary
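To make the 80%-of-MAASP trigger rule above concrete, here is a minimal Python sketch of the check; the well IDs and pressure values are hypothetical placeholders, and the real workflow ran against WIMS-SAP/CPMS data rather than an inline list.

```python
# Trigger pressures are set at no more than 80% of each well's MAASP.
TRIGGER_FRACTION = 0.80

wells = [
    # (well_id, annulus, maasp_psi, latest_reading_psi) -- placeholder values
    ("W-101", "A", 3500.0, 2950.0),
    ("W-102", "B", 4200.0, 3100.0),
]

for well_id, annulus, maasp, reading in wells:
    trigger = TRIGGER_FRACTION * maasp
    status = "EXCEEDS TRIGGER - schedule diagnostics" if reading >= trigger else "ok"
    print(f"{well_id} annulus {annulus}: reading {reading:.0f} psi, "
          f"trigger {trigger:.0f} psi -> {status}")
```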
Education
University of Lagos