Dhinakaran Sundareshan

Senior Data Engineer With Radius Recycling at Sara Software Systems
Contact Information
us****@****om
(386) 825-5501
Location
Overland Park, Kansas, United States
Languages
  • English Full professional proficiency
  • Kannada Native or bilingual proficiency
  • Tamil Native or bilingual proficiency

Recommendations

Rated 5.0 / 5.0, based on 2 ratings.

LinkedIn User

I have worked with Dhinakaran on multiple data warehousing, data modeling, data integration, AI, and ML projects. He is one of the best resources on our team. I would highly recommend Dhinakaran for any data-related project.

Vaibhav Indani

Dhinakaran is very good at understanding data and building extensive yet simple data models. He is highly skilled with BI stacks and has strong learning capabilities.


Credentials

  • STAR BI Specialist
    Nuvento
    Dec, 2019
    - Nov, 2024
  • Informatica BDM Developer Associate
    Informatica
    Aug, 2019
    - Nov, 2024

Experience

    • United States
    • IT Services and IT Consulting
    • 1 - 100 Employee
    • Senior Data Engineer With Radius Recycling
      • Sep 2023 - Present

    • United States
    • IT Services and IT Consulting
    • 300 - 400 Employee
    • Senior Data Architect/ Engineer
      • Sep 2022 - Aug 2023

    • Senior Data Architect/Engineer with Microsoft
      • Sep 2020 - Sep 2022

    • ETL Architect with NBA
      • Nov 2019 - Sep 2020

      The National Basketball Association (NBA) is the top professional basketball league in the United States. It consists of thirty franchised member clubs, twenty-nine located in the United States and one in Canada. The NBA has been a client of Nuvento for 8+ years, with Nuvento building and maintaining the NBA's data warehouse, analytics, and digital platforms. The NBA wanted to migrate all data sources running on the Hadoop framework to an in-house SQL Server data warehouse, unifying the same customer across multiple data sources into a single record and aggregating each customer's total revenue and activity.
      Responsibilities:
      • Involved in technical decisions for business requirements; interacted with Business Analysts to gather requirements.
      • Managed an offshore team of five.
      • Architected the design flow for gathering the available information into one single repository.
      • Designed the code baseline, in C#, to download files from Hadoop to a landing zone for SSIS to pick up for ETL load.
      • Created tables, functions, and procedures on SQL Server; responsible for maintaining data integrity and constraints.
      • Designed the data model for individual data sources up to the data warehouse.
      • Implemented the DSAR (GDPR) process for Delete, ROA, and DO-NOT-SELL requests.
      • Developed the code base for unification of customers based on 10 data parameters (first name, last name, address, etc.), plus a separate process to distinguish customers by their emails.
      • Developed an aggregation system to give the business a unified view of each customer's aggregated activities.
      • Prepared tech design documents, ETL run books, project documents, and release notes.
      • Created the deployment model and scheduled tasks to run in Dev and Prod using the SSIS package store.
      • Interacted with developers, business and management teams, and end users.
      Environment: C#, Azure Functions, Azure Bot Services, Python
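      The customer-unification step described above can be sketched roughly as follows. This is a minimal Python illustration only, not the production C#/SSIS implementation; the field names, the email-first matching order, and the two-field fallback key are assumptions standing in for the 10 actual parameters.

```python
from collections import defaultdict

def normalize(value):
    """Lower-case and collapse whitespace so trivially different values match."""
    return " ".join(value.lower().split()) if value else ""

def unify_customers(records):
    """Group raw customer records into unified customers.

    Records sharing a normalized email are merged first; records without an
    email are merged on a (first name, last name, address) key. A simplified
    stand-in for matching on the 10 parameters described above.
    """
    groups = defaultdict(list)
    for rec in records:
        email = normalize(rec.get("email"))
        if email:
            key = ("email", email)
        else:
            key = ("name_address",
                   normalize(rec.get("first_name")),
                   normalize(rec.get("last_name")),
                   normalize(rec.get("address")))
        groups[key].append(rec)
    return list(groups.values())

records = [
    {"first_name": "Jane", "last_name": "Doe", "email": "JANE@x.com", "revenue": 10},
    {"first_name": "Jane", "last_name": "Doe", "email": "jane@x.com", "revenue": 5},
    {"first_name": "Jane", "last_name": "Doe", "email": "", "address": "1 Main St", "revenue": 2},
]
unified = unify_customers(records)
# Aggregating revenue per unified customer mirrors the reporting step above.
total_by_group = [sum(r["revenue"] for r in g) for g in unified]
```

      Grouping by a deterministic key keeps the pass O(n); real unification logic would add fuzzy matching and survivorship rules on top.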

    • Senior Data Analyst with Inovalon
      • Jun 2019 - Nov 2019

      iPortHD (iPort-Hadoop) is a proprietary Hadoop-based framework on which the Inovalon Healthcare Data Lake solution resides. It provides clients with an industry-leading, single-source-of-truth "superset" of data that supports advanced reporting, analytics initiatives, and other use cases benefiting from best-practice data architecture, data comprehensiveness, and data hygiene, while eliminating the costs of traditional enterprise warehouse solutions and of healthcare processes that suffer from incomplete, untimely, or erroneous data. Core to the solution is the aggregation and normalization of a client's existing data assets, often from widely disparate systems and in widely disparate formats. To these myriad data, Inovalon applies more than 1,000 data integrity analyses trained on the 48 billion medical events in Inovalon's MORE2 Registry dataset.
      Responsibilities:
      • Designed and developed big data ingestion pipelines (mappings and workflows) in Informatica BDM (Big Data Management) to replace a legacy SQL Server system.
      • Designed and developed Hive, Spark, and Pig scripts where required, and integrated them with Informatica BDM transformations for end-to-end orchestration.
      • Converted and merged the data lake's delta tables from Avro to text for Greenplum distribution.
      • Aggregated and converted huge Pig objects to BSON, pushed them through the Pig Mongo Connector, and published the data in MongoDB.
      • Automated and orchestrated production build pipelines using DevOps tools such as Jenkins and Git.
      • Designed and developed Oozie workflows for scheduling jobs in production.
      • Developed and supported REST services that expose SQL Server metadata for use by the big data pipeline.
      • Supported SSIS, Informatica packages, and REST APIs in case of issues.
      Environment: Apache Hadoop, Apache Hive, Spark, Scala, Python, Pig, Oozie, SQL Server, Informatica BDM
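      A record-level data integrity analysis of the kind mentioned above can be illustrated as below. This is a toy Python sketch; the rule set and field names are assumptions for illustration, not Inovalon's actual analyses.

```python
from datetime import date

# Each rule is (name, predicate); a record fails a rule when the predicate is False.
RULES = [
    ("member_id present",   lambda r: bool(r.get("member_id"))),
    ("service date valid",  lambda r: isinstance(r.get("service_date"), date)
                                      and r["service_date"] <= date.today()),
    ("amount non-negative", lambda r: isinstance(r.get("amount"), (int, float))
                                      and r["amount"] >= 0),
]

def run_integrity_checks(records):
    """Return {rule name: count of failing records} across one batch."""
    failures = {name: 0 for name, _ in RULES}
    for rec in records:
        for name, check in RULES:
            if not check(rec):
                failures[name] += 1
    return failures

claims = [
    {"member_id": "M1", "service_date": date(2019, 5, 1), "amount": 120.0},
    {"member_id": "",   "service_date": date(2019, 5, 2), "amount": -3.0},
]
report = run_integrity_checks(claims)
```

      At data-lake scale the same predicate-per-rule shape would run as a distributed job (Hive/Spark) rather than a Python loop.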

    • Data Engineer
      • Aug 2018 - Jun 2019

      Claims Data Migrator (CDM) is a powerful tool capable of running on multiple Windows-based platforms. With Claims Data Migrator, business analysts can easily map data to a new environment based on the standard models provided. CDM helps the data team perform data cleansing, masking, and conditional migration between multiple platforms effortlessly. The application integrates seamlessly with many existing standard applications, data sources, and legacy systems, making data migration between systems built on different technologies easy. CDM provides predefined models for claims and Medicaid data that help organizations standardize their existing data management processes. CDM can connect to Oracle and Microsoft SQL Server, accept data dumps from IBM-based systems, and transform them to fit a new DBMS or other data storage mechanism.
      Responsibilities:
      • Responsible for design, development, testing, maintenance, production support, and customization of software and IT applications; Data/BI/ETL architecture, data modeling, and dimensional modeling; and deployment of Business Intelligence analysis solutions using Informatica, Oracle, SQL Server, SSIS, SSRS, and SSAS.
      • Designed, developed, validated, and deployed Claims Data Migrator ETL inbound and outbound interfaces using Informatica, Oracle, Unix scripts, Control-M, UC4, PowerExchange, complex T-SQL queries, shell scripting, FileZilla, WinSCP, PuTTY, external tables, and PL/SQL.
      • Created different claims data models from legacy source data to the enterprise system; validated and performed data cleansing; converted EBCDIC-format files using SSIS, C#, SQL queries, stored procedures, and shell scripts.
      • Duties also included project planning, estimation, coordination, system integration, defect/CR analysis, and drafting design specification documents for different ETL projects.
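      The EBCDIC file conversion mentioned above can be illustrated with a short Python sketch using the standard cp500 EBCDIC codec. The production work used SSIS and C#, and real mainframe dumps also involve packed-decimal (COMP-3) fields and copybook layouts not handled here; the fixed record length is an assumption.

```python
import codecs

def ebcdic_to_text_lines(raw: bytes, record_length: int):
    """Decode a fixed-length-record EBCDIC dump into text lines.

    Assumes plain character data encoded as EBCDIC code page 500;
    binary or packed-decimal fields would need separate handling.
    """
    lines = []
    for start in range(0, len(raw), record_length):
        record = raw[start:start + record_length]
        lines.append(codecs.decode(record, "cp500").rstrip())
    return lines

# "HELLO" then "WORLD" as two 5-byte EBCDIC records.
dump = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6,
              0xE6, 0xD6, 0xD9, 0xD3, 0xC4])
lines = ebcdic_to_text_lines(dump, record_length=5)
```

      Slicing by fixed record length first, then decoding, mirrors how mainframe dumps lack newline delimiters.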

    • BI Team Lead
      • Jul 2014 - Aug 2018

      Worked exclusively with a major US sports entertainment company, the NBA, building projects and products around Business Intelligence (MSBI), Big Data (Cloudera Hadoop), AWS services, Azure Cognitive Services, and AI/machine learning libraries. Was part of the team contributing to the architecture and development of major AI products and the NBA's personal virtual assistant, "Sophia".

    • India
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Senior Data Analyst
      • Jul 2013 - Jul 2014

      Microsoft, Inc., Redmond, Jul 2013 to Jun 2014
      Role: Senior Data Analyst
      Project: Contact Data Management Optimization (CDMO)
      Description: Microsoft Corporation is an American multinational corporation headquartered in Redmond, Washington, that develops, manufactures, licenses, supports, and sells computer software, consumer electronics, personal computers, and services. Contact Data Management Optimization (CDMO) is a Six Sigma project that maintains the worldwide contacts of Microsoft's customers and partners, optimizing and reducing redundancy in Microsoft's account and customer details.
      Responsibilities:
      • Understood requirements as stated in the Business Requirement document and analyzed the data against those requirements.
      • Designed ETL processes using SSIS to extract data from flat files and load it into SQL Server.
      • Created stored procedures and functions for data cleansing and data archiving activities.
      • Designed the ENVY report (an Excel reporting tool).
      • Interacted with developers, business and management teams, and end users.
      • Participated in regular project status meetings.
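      The flat-file-to-database ETL pattern described above can be sketched in a few lines. This toy Python example uses csv and an in-memory SQLite database as a stand-in for the actual SSIS-to-SQL-Server pipeline; the contact schema and the cleansing rules (trimming names, upper-casing country codes) are assumptions.

```python
import csv
import io
import sqlite3

def load_contacts(flat_file_text: str, conn: sqlite3.Connection) -> int:
    """Extract contact rows from delimited text, cleanse, and load into a table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS contacts (account_id TEXT, name TEXT, country TEXT)"
    )
    reader = csv.DictReader(io.StringIO(flat_file_text))
    # Light cleansing during transform: trim names, normalize country codes.
    rows = [(r["account_id"], r["name"].strip(), r["country"].upper()) for r in reader]
    conn.executemany("INSERT INTO contacts VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
n = load_contacts("account_id,name,country\nA1, Contoso ,us\nA2,Fabrikam,de\n", conn)
```

      The same extract-transform-load shape scales up: SSIS replaces the csv reader with a flat-file source and the executemany with a bulk destination.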

    • United States
    • IT Services and IT Consulting
    • 300 - 400 Employee
    • ETL Developer
      • Nov 2011 - Jul 2013

      Started as an Assistant Software Engineer, helping the team manage and build ETL packages for the second-largest US insurance company, Farmers Insurance, using Business Intelligence tools such as MSBI, with OBIEE for reporting. Was also part of the team that developed a social media analytics tool called "Sophia" as a company product.

Education

  • Visvesvaraya Technological University
    Bachelor of Engineering (BE), Electrical, Electronics and Communications Engineering
    2007 - 2011
