Jean-Marc HOELLINGER

Lead Data Engineer at Sedona
Contact Information
us****@****om
(386) 825-5501
Location
Greater Paris Metropolitan Region, FR
Languages
  • French  Native or bilingual proficiency
  • English  Professional working proficiency
  • Japanese  Limited working proficiency
  • German  Elementary proficiency

Experience

    • France
    • IT Services and IT Consulting
    • 1 - 100 Employee
    • Lead Data Engineer
      • Mar 2022 - Present

      ● Tech Lead for a luxury client
        ○ API optimization (cost, performance)
        ○ Technical debt reduction at the domain level
        ○ Knowledge transfer to nearshore and offshore teams so they could become autonomous within a couple of days
        ○ Efficiency level-up in a team of 10 Data Engineers
      ● Design and build of a Data Lake, a Data Warehouse and Data Marts on GCP for an automotive group holding a dozen brands
        ○ End-to-end migration of all assets to the cloud
        ○ Data quality uniformization across a dozen heterogeneous sources
        ○ Data catalog initialization from the 5 main DMS
        ○ Implementation of an event-driven architecture on GCP (sketched below)
        ○ Reliability for scalability and acceptance testing (> 500,000 files)
      ● Design and build of a worldwide, real-time stock management system for a luxury client
        ○ Complex system mixing orchestration and events from ERP and GCP databases
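
      The event-driven architecture referenced above is not detailed in the profile; purely as an illustrative sketch, a background Cloud Function triggered by new files in a Cloud Storage bucket could load them into BigQuery along these lines (the project, dataset and table names are hypothetical placeholders, not the client's actual setup):

        # Minimal sketch of event-driven ingestion on GCP: a background Cloud Function
        # with a Cloud Storage "finalize" trigger loads each new file into BigQuery.
        from google.cloud import bigquery

        TARGET_TABLE = "my-project.raw_zone.incoming_files"  # hypothetical target table

        def ingest_new_file(event, context):
            """Triggered whenever a file lands in the watched bucket."""
            uri = f"gs://{event['bucket']}/{event['name']}"
            client = bigquery.Client()
            job_config = bigquery.LoadJobConfig(
                source_format=bigquery.SourceFormat.CSV,
                skip_leading_rows=1,
                autodetect=True,                  # infer the schema for this sketch
                write_disposition="WRITE_APPEND",
            )
            load_job = client.load_table_from_uri(uri, TARGET_TABLE, job_config=job_config)
            load_job.result()  # block so load errors surface in the function logs
            print(f"Loaded {uri} into {TARGET_TABLE}")

      In a production pipeline of this kind, schema autodetection would typically be replaced by explicit schemas and the load wrapped in data-quality checks before the data reaches the warehouse layer.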

    • France
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Data Architect & Senior Data Engineer
      • Sep 2021 - Mar 2022

      ● Cloud migration support for a banking client
        ○ Preparation and facilitation of workshops with Data Engineering and Data Science teams (Hadoop, Hive, Spark, Scala, PySpark, Python)
        ○ Definition of the target architecture for the data foundations on GCP (Google Cloud Storage, BigQuery, Cloud Composer, Dataproc, Google Compute Engine, Data Studio)
        ○ Drafting of the Data High Level Design and the Cloud & Data Low Level Design
        ○ Interaction with Cloud Foundations teams to take security aspects into account (IAM, VPC, CMEK)
      ● Design and build of a Data Lake, a Data Warehouse and Data Marts on GCP for a luxury client
        ○ Facilitation of design workshops with business lines and the IT department, requirements gathering
        ○ Development of a target architecture to meet current and future challenges (Google Cloud Storage, Cloud Functions, Cloud Run, BigQuery, Dataform, DLP)
        ○ Data anonymization for GDPR compliance and regulatory constraints between China and Europe (sketched below)
        ○ Definition of the migration strategy and implementation of an MVP
        ○ Product Owner of the solution on behalf of the client
        ○ Management of stakeholders (parent company, CSO, BI teams) and production teams (3 profiles)
        ○ Facilitation of internal and customer dailies (Agile)
        ○ Responsible for costs, deadlines and quality of deliverables
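
      The profile lists DLP among the services in the target architecture; as a rough, hedged sketch of how de-identification with Cloud DLP works in general (the project ID, infoTypes and example text are assumptions for illustration, not the client's actual configuration):

        # Sketch of de-identification with Cloud DLP: detected personal data is
        # replaced by the name of the matched infoType.
        from google.cloud import dlp_v2

        def deidentify_text(project_id: str, text: str) -> str:
            """Replace detected PII in `text` with the matched infoType name."""
            client = dlp_v2.DlpServiceClient()
            response = client.deidentify_content(
                request={
                    "parent": f"projects/{project_id}",
                    "inspect_config": {
                        "info_types": [{"name": "PERSON_NAME"}, {"name": "EMAIL_ADDRESS"}]
                    },
                    "deidentify_config": {
                        "info_type_transformations": {
                            "transformations": [
                                {"primitive_transformation": {"replace_with_info_type_config": {}}}
                            ]
                        }
                    },
                    "item": {"value": text},
                }
            )
            return response.item.value

        # Example: deidentify_text("my-project", "Contact Jane Doe at jane@example.com")
        # returns something like "Contact [PERSON_NAME] at [EMAIL_ADDRESS]".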

    • France
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Senior Data Engineer, AMOA (Project Owner Support), API Management Consultant, Data Expert
      • Jan 2019 - Sep 2021

      ● Orchestration of the pre-project scoping phase (~10 Front Office and 4 Back Office)
        ○ Responsible for costs, deadlines and quality of deliverables
        ○ Recognition and skills development of 4 employees
        ○ Back-documentation of data quality processing algorithms
        ○ Definition of the migration strategy
      ● Migration of real-time credit card fraud rules, with €3M at stake daily (Python, SQLite, Oracle (SQL and hints), SAS/PROC SQL, GitLab, JIRA)
        ○ Reverse engineering of 700 fraud detection algorithms (each between 100 and 1,000 lines of code) with no documentation, no prior knowledge and no possibility of executing the code (sensitive data); sketched below
        ○ Coordination of work in English with the offshore team (India)
      ● Pre-sales of over €10M in Data Science, BI, Big Data & AI
        ○ Identification of 5 profiles among 1,000 CVs mastering 50 open-source Big Data technologies for unpredictable issues
        ○ Orchestration of a cross-entity bid response (BI, Big Data)
        ○ Identification of deficiencies in international BI and Big Data competence centers: selection of 300 candidate profiles for a skills transfer between sites, including nearshore
      ● Coordination of a tender with Google across 15 areas of AI
        ○ Selection of Data Science and AI resources (10 profiles)
        ○ Responsible for consistency of the response
      ● Technical migration in Agile mode between two API management tools and skills transfer to client teams (Apigee Edge, Axway Policy Studio, Axway API Gateway Manager, JavaScript, Python, JIRA)
        ○ Pedagogical training of the team in the tools
      ● CI/CD for collaborative API development and adaptation of the delivery tooling to the Apigee environment (AWS CodeCommit, AWS CloudWatch)
        ○ Automation of deployment and reversibility of an Apigee configuration from one environment to another
        ○ Management of versions and dependencies of resources and objects across hierarchical levels
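
      The fraud-rule reverse engineering above had to be done statically, without executing the code; purely as an illustration of that kind of static inventory (the file patterns, regexes and SQLite schema below are hypothetical, not the actual tooling), one might scan the SAS scripts and summarize them like this:

        # Static (no-execution) inventory of SAS fraud-rule scripts: count PROC SQL
        # blocks and referenced tables per script and store a summary in SQLite.
        import re
        import sqlite3
        from pathlib import Path

        TABLE_REF = re.compile(r"\bfrom\s+([\w.]+)", re.IGNORECASE)

        def inventory_rules(rules_dir: str, db_path: str = "rules_inventory.db") -> None:
            con = sqlite3.connect(db_path)
            con.execute(
                "CREATE TABLE IF NOT EXISTS rules "
                "(script TEXT, n_lines INTEGER, n_proc_sql INTEGER, tables TEXT)"
            )
            for script in sorted(Path(rules_dir).glob("*.sas")):
                text = script.read_text(errors="ignore")
                n_proc_sql = len(re.findall(r"\bproc\s+sql\b", text, re.IGNORECASE))
                tables = sorted(set(TABLE_REF.findall(text)))
                con.execute(
                    "INSERT INTO rules VALUES (?, ?, ?, ?)",
                    (script.name, text.count("\n") + 1, n_proc_sql, ",".join(tables)),
                )
            con.commit()
            con.close()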

    • France
    • Financial Services
    • 700 & Above Employee
    • Data Management Consultant, Data Architect, GDPR Advisor
      • May 2009 - Dec 2018

      ● Design of Big Data databases optimized for on-demand analyses (SAS, R, Oracle, BO, Talend)
        ○ Requirements-gathering interviews with general management, marketing and IT
        ○ Development of a customer-centric data warehouse
        ○ Task automation: savings of 0.5 man-day per week
      ● Change management without performance or reliability degradation (SAS -> R & Python)
        ○ Pilot for new technologies
        ○ Mentoring of senior and junior profiles (4 profiles)
        ○ Recruitment interviews for Data Engineers and Data Scientists (about thirty profiles)
      ● Responsible for GDPR compliance: ~15 data projects
      ● Massive, automated exploitation of open data (Perl, ImageMagick, SAS, Google Maps)
        ○ Design and development of a multi-source pipeline for structured and unstructured data
      ● Implementation of KPIs to improve the heterogeneous quality of patient data (critical products worth €1.5 billion) (SAS, R, GDPR, JIRA); sketched below
        ○ Reverse engineering of thousands of analytical scripts from SAS to R
        ○ Project management and sprint organization (3 profiles)
      ● Reduction of unpredictable query times (R, Tableau, Greenplum)
        ○ Development of a Data Sharing offer
        ○ UI/UX design
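
      The data-quality KPIs mentioned above are not specified in the profile; as a minimal sketch of the idea (column names and the checks themselves are hypothetical examples, not the actual patient-data rules), a few simple ratios can be computed as follows:

        # Simple data-quality KPIs: completeness, key uniqueness and date validity.
        import pandas as pd

        def quality_kpis(df: pd.DataFrame, key: str, date_col: str) -> dict:
            """Return a few basic data-quality ratios for a tabular extract."""
            valid_dates = pd.to_datetime(df[date_col], errors="coerce").notna()
            return {
                "completeness": float(1 - df.isna().mean().mean()),    # share of non-null cells
                "key_uniqueness": float(df[key].nunique() / len(df)),  # duplicates on the key
                "date_validity": float(valid_dates.mean()),            # share of parseable dates
            }

        # Example with a hypothetical extract:
        # df = pd.read_csv("patients_extract.csv")
        # print(quality_kpis(df, key="patient_id", date_col="birth_date"))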

    • France
    • IT Services and IT Consulting
    • 1 - 100 Employee
    • Data Project Manager
      • Nov 2003 - Apr 2009

      ● Development and optimization of data quality tools (Oracle PL/SQL, Access, VBA, Excel, SAS/BASE, SAS/STAT, SAS/MACRO, BO Xcelsius, SAS MA)
        ○ Privileged Data contact for the 17 European marketing managers
        ○ Costing and delivery of requests
      ● R&D Advisor for deduplication algorithms (Oracle PL/SQL); sketched below
        ○ Design of algorithms fitted to linguistic specificities
        ○ Optimization of international database queries
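
      The deduplication work above was done in Oracle PL/SQL and is not reproduced here; purely as an illustrative Python sketch of the idea (fields, threshold and normalization rules are hypothetical), duplicate candidates can be found by normalizing names before comparing them:

        # Sketch of linguistically aware deduplication: strip accents, normalize case
        # and whitespace, then flag pairs of records with very similar names.
        import unicodedata
        from difflib import SequenceMatcher

        def normalize(name: str) -> str:
            """Lowercase, remove accents and collapse whitespace."""
            stripped = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
            return " ".join(stripped.lower().split())

        def probable_duplicates(records, threshold=0.9):
            """Return (record_a, record_b, score) triples for likely duplicate names."""
            pairs = []
            for i, a in enumerate(records):
                for b in records[i + 1:]:
                    score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
                    if score >= threshold:
                        pairs.append((a, b, round(score, 3)))
            return pairs

        # Example: probable_duplicates([{"name": "Müller GmbH"}, {"name": "Muller GMBH"}])
        # flags the two spellings as a likely duplicate pair.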

    • IT Services and IT Consulting
    • Founder & partner
      • May 2001 - Jul 2003

      ● Startup creation
      ● Design of a polymorphic questionnaire engine (PHP, MySQL, JavaScript)

    • France
    • Information Services
    • Project Manager
      • Jul 2000 - Jan 2001

      ● Development of a graphical monitoring tool to follow the paths of Internet users in real time (Perl, PHP, MySQL, VBA)

    • Germany
    • Software Development
    • 400 - 500 Employee
    • Marketing study officer
      • Jul 1999 - Aug 1999

      ● Market study: creation of a branch in France

    • France
    • Aviation and Aerospace Component Manufacturing
    • Software Developer
      • Jul 1996 - Jul 1996

      ● Development of a scheduling tool (VBA)

Education

  • Institut Mines-Télécom Business School
    Diplôme d'études supérieures de gestion de l'Institut national des télécommunications, Management
    1998 - 2001
  • Johns Hopkins University
    Statement of Accomplishment, Data Science
    2014 - 2014
  • Conservatoire à Rayonnement Régional de Toulouse
    End-of-studies diploma, Music Theory
    1993 - 1997
