Tunde Bakare
Senior Data Engineer at LiveRamp
Languages
- German: Native or bilingual proficiency
- Yoruba: Native or bilingual proficiency
- English: Full professional proficiency
Credentials
- Building Batch Data Pipelines on GCP (Coursera, Sep 2021 - Nov 2024)
- Building Resilient Streaming Analytics Systems on GCP (Coursera, Sep 2021 - Nov 2024)
- Google Cloud Big Data and Machine Learning Fundamentals (Coursera, Sep 2021 - Nov 2024)
- Modernizing Data Lakes and Data Warehouses with GCP (Coursera, Sep 2021 - Nov 2024)
- Preparing for the Google Cloud Professional Data Engineer Exam (Coursera, Sep 2021 - Nov 2024)
- Smart Analytics, Machine Learning, and AI on GCP (Coursera, Sep 2021 - Nov 2024)
Experience
LiveRamp
United States | Advertising Services | 700 & Above Employees
Senior Data Engineer
Oct 2018 - Present
• ETL/ELT pipeline build and automation (CDAP + Airflow)
• Hadoop cluster (Cloudera CDH) maintenance
• GCP platform maintenance (BigQuery/Dataproc/JupyterHub)
• Release planning and deployment (manual/Jenkins)
• Platform user support: query optimisation plus optimisation and verification of PySpark, Hive, and BigQuery code
• Platform maintenance (GCP + Cloudera)
• Data cleansing/ingestion pipeline build
• Client/user support
• Process code development (PySpark, Python, Hive, BigQuery, shell)
• Tools: Git, JIRA, Jenkins, Airflow, GCP, BigQuery, Dataproc, Hadoop, Hive, Jupyter, Tableau, CDAP
dunnhumby
United Kingdom | Advertising Services | 700 & Above Employees
Data Engineer
Nov 2017 - Oct 2018
• ETL processes with a customised in-house tool built around PySpark and Oracle SQL
• PySpark/Spark for data processing
• Airflow DAG programming to automate ETL jobs
• Bash/shell scripting for data processing where necessary
• Predominantly Python programming for all functional developments
• Hive/SQL to process and transform data within the data lake (Cloudera platform)
• Cross-platform data extraction (Hive to Exadata) using PySpark DataFrames
Centrica
United Kingdom | Utilities | 700 & Above Employees
Hadoop Developer
Apr 2016 - Nov 2017
• Use of various tools within the Hadoop ecosystem, along with Java, SQL, and bash scripting, to deliver Big Data solutions as required by the business or by project-dependent developments and deliveries
• Various implementations and use of Sqoop to migrate data from RDBMS to the data lake
• Big Data solution implementation using object-oriented programming languages
• L3 handling of incidents arising from jobs running on the Hadoop job server(s)
• Regular development of the in-house tools, mainly to integrate new technologies from Hortonworks and to apply upgrades required by projects
• Capturing technical and business requirements related to projects
Software Developer
Sep 2013 - Dec 2015
• System integration using Spring and Hibernate (Java)
• Various maintenance and upgrade activities carried out on existing Java developments
• Health mobile app and Windows application development for the NHS as a proof of concept to monitor and control obesity in individuals; development was Java based
• Backend development for various data-driven applications; database developed in MySQL and service-oriented architecture built within the .NET environment (WCF)
• Website development for Greenwich-Mencap using the open-source WordPress package
• Development of a web-based resource-planning application for own organisation, using the MVC design pattern and Java Spring
Royal Mail
United Kingdom | Freight and Package Transportation | 700 & Above Employees
Sorting Officer
Jan 2011 - Dec 2015
Primary and secondary sorting of letters and parcels on weekends
TREVIRA GmbH
Augsburg Area, Germany
Automated Process Coordinator
Jul 2001 - Oct 2010
• Monitor and control production hardware from the control centre
• Collect and analyse production data from over 700 data sensors for optimisation
• Remote restarting of faulty production units from the control centre
• Engaging and supporting engineers for disaster recovery
• Sampling of data collection points to ensure accuracy
FAIST Systeme GmbH
Augsburg Area, Germany
Hardware Installer
Apr 2000 - Jun 2001
Installation of hardware for all Fujitsu Siemens desktop computer series. Testing of hardware for functionality.
Education
- University of Greenwich: MSc, Big Data and Business Intelligence
- University of Greenwich: BSc, Computing
- The Open University: Diploma, Computing