Sumughan (Sumu) Aravindan

Senior AVP, Data Engineering at Revantage, A Blackstone Portfolio Company
Contact Information
us****@****om
(386) 825-5501
Location
Bartlett, Illinois, United States


Bio


5.0 / 5.0, based on 2 ratings

Rob Adler

I had the pleasure of working closely with Sumu for the past two years. He was a driving force behind bringing our data platform to maturity, managing the implementation and daily execution of over 400 ETL pipelines. Even as the infrastructure grew rapidly, Sumu managed to cut costs dramatically by deploying spot instances, file-archiving strategies, and cluster tuning. Sumu has incredible range - he can get deep in the weeds to troubleshoot issues with ETL pathways or environment configurations, but can also build proposals for major infrastructure enhancements. I could bring any problem to Sumu and he would have a solution in a matter of days, if not hours. If given the opportunity to hire Sumu again, I would not hesitate to do so. Thank you for all of your contributions to Avant!

Anuj Marfatia

It has been a pleasure to have Sumu on the team. His technical knowledge is very strong, and he works well with business stakeholders. He takes initiative and follows through with minimal direction. Additionally, he is always pushing himself to learn new things. Overall, Sumu did a great job and has been sorely missed.


Credentials

  • AZ-300 Microsoft Azure Architect Technologies
    Microsoft
    Jun, 2019
    - Nov, 2024
  • Machine Learning
    Coursera
    Aug, 2018
    - Nov, 2024
  • TOGAF 9 Certified Architect
    The Open Group
    May, 2012
    - Nov, 2024
  • Oracle Database 11g Administrator Certified Professional
    Oracle
    Dec, 2011
    - Nov, 2024
  • Academy Accreditation - Generative AI Fundamentals
    Databricks
    Aug, 2023
    - Nov, 2024
  • dbt Developer
    dbt Labs
    Apr, 2023
    - Nov, 2024
  • Academy Accreditation - Databricks Lakehouse Fundamentals
    Databricks
    Aug, 2023
    - Nov, 2024
  • AWS Certified Solutions Architect – Professional
    Amazon Web Services (AWS)
    Nov, 2020
    - Nov, 2024
  • AWS Certified Solutions Architect – Associate
    Amazon Web Services (AWS)
    Aug, 2020
    - Nov, 2024

Experience

    • United States
    • Real Estate
    • 400 - 500 Employees
    • Senior AVP, Data Engineering
      • Jun 2021 - Present

    • United States
    • Financial Services
    • 300 - 400 Employees
    • Senior Manager - Data Services
      • Oct 2019 - Jun 2021

      • Reduced cloud operational costs by fine-tuning resource utilization.
      • Managed the entire Data Services team at Avant and the delivery of architecture, ingestion, storage and processing, consumption, governance, and operationalization.
      • Delivered a cloud-native (AWS), Big Data enterprise-level solution that handles multi-partner data segregation, sensitivity, and masking.
      • Implemented over 300 data pipelines with diverse sources and frequencies.
      • Implemented automated Data Catalog and Data Quality as part of the integrated data pipeline.
      • Implemented automated data pipelines using diverse cloud-native technologies including S3 Data Lake, Jenkins (CI/CD), Apache Airflow (workflows), EMR, Spark, Python, Dremio, Alation, Looker, and Infogix.
      • Hands-on development of data pipelines using Spark, Python, and SQL; actively involved in code management, resource optimization, and delivery management.

    • Manufacturing
    • 700 & Above Employees
    • Big Data Engineer/Enterprise Architect
      • Jul 2017 - Oct 2019

      • Introduced and implemented cloud-native (Azure) Big Data, Advanced Analytics, Artificial Intelligence, and IoT solutions.
      • As Data Team workstream lead for the Life Fitness separation project, led the effort in data separation, archival, and retention of key application and financial data.
      • Designed all aspects of the enterprise Data Lake platform and defined its integration mechanisms, data governance, and metadata management.
      • Authored and published a position paper and blueprints on the enterprise's Big Data and Advanced Analytics implementation approach.
      • Implemented a Data Driven Operating Model for various infrastructure assets to monitor and report compliance against enterprise standards.
      • Hands-on implemented the production and sandbox Data Lakes and developed key aspects of data ingestion pipelines, workflows, and backup strategy.
      • As a key member of the Enterprise Architecture team, was responsible for setting direction and organizational evolution across the enterprise in the IoT, Artificial Intelligence, Big Data, and Advanced Analytics areas.
      • Worked with the organization's strategic vendors to align enterprise priorities with vendor direction to deliver strong value and collaboration.
      • Defined and implemented the security and integration framework for the multi-tenant Data Lake.
      • Worked with vendors to optimize cost versus benefit of implementing solutions and defined the scaled implementation plan.
      • Managed internal and external teams to deliver optimal solutions on business-critical timelines.

    • United States
    • Technology, Information and Media
    • 700 & Above Employees
    • Principal Architect
      • Nov 2014 - Jul 2017

      • Played a key role in implementing Nielsen's next-generation Enterprise Architecture in the cloud, heavily focused on big data technologies.
      • The big data solution being implemented included technologies such as Azure Data Lake, Hadoop, HDFS, Spark, Scala, Oozie, and Spark Job Server.
      • Gained experience with big data technologies in an AWS environment, which could not be chosen because of retailer data-sensitivity issues.
      • Successfully implemented many big data solutions on Netezza, and implemented an enterprise-wide connect system on the cloud using Hadoop, Spark, Scala, Python, and multi-enterprise integration, among others.
      • Performed various POCs, benchmarking, feasibility, and costing analyses to move Nielsen's US Buy application from Netezza to other big data environments, including Hadoop and Snowflake.
      • Represented my division on the 6-person enterprise-wide data governance team, which defines the key data and integration standards across Nielsen.
      • Held a key ownership role in the design and implementation of Nielsen Enrichment Studio, which integrates data from various internal and external sources using advanced linking and integration techniques.
      • Worked on defining the roadmap for a Product Reference Catalog to be adopted across the enterprise, integrating fractured Master Data Management into a unified source of truth.
      • Designed and implemented REST APIs and microservices for solutions on Hadoop and other big data technologies.
      • Played a key role in evolving Nielsen's product delivery from a fragmented offering by country/region to an integrated global source of truth.
      • Developed solution, conceptual, and physical architecture for various product solutions in the Answers on Demand (AOD) universe and the Connected Buy System.

    • Solutions Architect
      • Sep 2012 - Nov 2014

    • United States
    • IT Services and IT Consulting
    • 700 & Above Employees
    • Sr. Architect
      • Oct 2006 - Sep 2012

      • Developed conceptual, logical, and physical data models for various applications and interfaces.
      • Developed both entity-relationship models and dimensional models for various applications.
      • Designed and implemented interfaces between IMPACT and various applications within Abbott.
      • Developed architecture design documents for various applications and their interface designs.
      • Designed the SOA interface between the IMPACT application and the enterprise service bus (ESB).
      • Developed IQ documents for installation of Oracle 10g (RAC) and 11g (RAC) software and database migration on both Solaris and HP-UX environments.
      • Played a key role in the disaster-recovery planning and testing of the organization's various databases and applications.
      • Designed data models using Erwin 7.2.
      • Designed application and solution architecture components, integration, and interfaces.
      • Conducted data capacity planning, lifecycle planning, usage-requirements analysis, and feasibility analysis of the application's various components and interfaces.
      • Monitored and administered the Apache/Tomcat and Oracle application servers in the production, validation, and development environments.
      • Developed ad hoc reports for business user requirements through both application-data and audit-data analysis.

    • United States
    • Hospitals and Health Care
    • 700 & Above Employees
    • Sr. Data Architect
      • Oct 2006 - Sep 2012


    • United States
    • Financial Services
    • 700 & Above Employees
    • Oracle DBA & Data Modeller
      • 2006 - 2006

      • Was involved in developing the data models for the application tables.
      • Developed the physical data model for the database in third normal form.
      • Played a key role in data analysis of access-related data sourced from various native systems.
      • Performed various DBA maintenance activities such as database upgrades, change management, and schema upgrades.
      • Wrote various data manipulation and loading scripts for key applications using AWK and shell scripting.
      • Performed application tuning and SQL tuning at various levels for optimal application performance.
      • Performed data analysis, identified data inconsistencies, and designed and implemented data fixes.
      • Was involved in database recovery activities and database performance tuning.
      • Developed various crucial and complicated shell and AWK scripts that are critical for the business to function normally.

    • India
    • IT Services and IT Consulting
    • 700 & Above Employees
    • Architect
      • 2004 - 2006

      • Responsibilities included requirement analysis and dataflow architecture; logical and physical data modeling (using Erwin 4.1) adhering to specific naming standards/abbreviations; and ETL, logical, and physical architecture development, design, and implementation.
      • Designed the data model (normalized in most cases up to third normal form) using Erwin 4.1 for the EDW.
      • Developed both entity-relationship models and dimensional models for various applications.
      • Developed the data warehouse architecture document, which covered implementation aspects such as ETL, metadata, data quality, security, and logical and physical architecture.
      • Designed the pipeline architecture for ETL data loading using surrogate keys and data audits, enabling data loads into the data marts even when the enterprise data warehouse was unavailable.
      • Maintained the data model and synchronized it with the database using Erwin's complete-compare feature.
      • Designed the physical implementation of the database to optimize it for data loads through the Ab Initio ETL process, for query performance, and for data extraction from the EDW to the data marts.
      • Performed architectural requirement gathering, study, and analysis, and developed the data warehouse architecture for the project.
      • Conducted feasibility analysis for implementing various aspects of the design, including ETL design, metadata design, and table design.
      • Developed the project effort and cost estimation for future phases of the project using the SMC method of effort estimation.
      • Directed and helped the project team in developing the detailed design.
      • Provided complete data modeling/database (Oracle 9i RAC) support and consultancy to a J2EE-based web application project developed for Pitney Bowes.

    • India
    • Information Technology & Services
    • 700 & Above Employees
    • Consultant
      • 1997 - 2004

      • Worked with multiple international clients of HCL in roles ranging from developer to data modeler to team lead.
      • Installed and maintained Oracle 9iAS application server and Apache Tomcat web server on HP-UX environments.
      • Integrated the various components of the application developed in Oracle Discoverer, Forms, Reports, Express, and SAS using single sign-on as external applications.
      • Upgraded the Oracle database.
      • Was instrumental in the logical and physical design of the database.
      • Played a key role in designing the backup and restore policy.
      • Performed performance analysis and stress/benchmark testing.
      • Created the administrator's manual and the database administrator's manual for the application.
      • Designed the data flow and its implementation in the system.
      • Was involved in business data study.
      • Was involved in the design and implementation of data cleansing using PL/SQL programs and Oracle Warehouse Builder mappings.
      • Designed the warehouse model.
      • Generated transformation specifications using PL/SQL.
      • Generated reports in Discoverer 4.1 and Reports 6i.
      • Wrote shell scripts for database backup and recovery, for data transport from staging to the warehouse server, and for monitoring server health and paging using PageGate software.
      • Handled client liaison, metrics collection and reporting, and reporting of project status to senior management and the client.

Education

  • Bharathidasan University
    Master of Computer Applications (M.C.A.), Computer Science
    1991 - 1997
  • Bharathidasan University
    Bachelor of Science (B.Sc.), Physics
    1991 - 1994
