Dinesh Kumar

Data Engineer at Simplyai
Contact Information
us****@****om
(386) 825-5501
Location
Sydney, New South Wales, Australia

Credentials

  • ITIL Foundation Level
    AXELOS Global Best Practice

Experience

    Simplyai
    • Australia
    • IT Services and IT Consulting
    • 1-100 employees
    • Data Engineer
      • Sep 2022 - Present

    • India
    • IT Services and IT Consulting
    • 700+ employees
    • Associate Consultant
      • Jan 2022 - Feb 2023

      Client: Westpac Banking Corporation

    • Assistant Consultant
      • Sep 2017 - Dec 2021

      Client: Westpac Banking Corporation

      Westpac is one of the Big Four banks in Australia, providing banking and financial services. On this account I have worked as a Senior Developer, Technical Architect, and Business Analyst, and I am currently working as a Solution Designer. The scope of the role is to deliver ETL solutions for the business requirements.
      • Liaising with Tech Leads and Business Analysts to formulate technical solutions.
      • Understanding business and functional requirements, converting them into technical requirements, and preparing solutions for them.
      • Impact analysis.
      • Helping prepare IFS and PDM documents and bridging gaps in them.
      • Effort estimation and project planning.
      • Solution design and preparation of Detailed Technical Design (DTD) documents.
      • Coding, unit testing, and code review.
      • Performance tuning for ETL and SQL code.
      • Deploying code to all non-prod/prod environments with the help of support teams.
      • SIT, UAT, and SVP test support.
      • Implementation support and warranty support.
      • Managing a team of 8.
      • Creating Python scripts for loading data from the big data platform to the enterprise data warehouse (sketched below).
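
      The last bullet describes moving data from the big data platform into the enterprise data warehouse. A minimal Python sketch of that kind of load follows, assuming a Spark/Hive big data platform and a JDBC-reachable Teradata warehouse; the table names, host, driver, and credentials are illustrative placeholders, not details from this profile.

      from pyspark.sql import SparkSession

      # Spark session with Hive support so curated big data tables are visible.
      spark = (
          SparkSession.builder
          .appName("bigdata_to_edw_load")
          .enableHiveSupport()
          .getOrCreate()
      )

      # Read the curated source table from the big data platform.
      src = spark.read.table("curated.daily_transactions")

      # Append into a warehouse staging table over JDBC; URL, driver,
      # and credentials are placeholders.
      (
          src.write
          .format("jdbc")
          .option("url", "jdbc:teradata://edw-host/DATABASE=stg")
          .option("driver", "com.teradata.jdbc.TeraDriver")
          .option("dbtable", "stg.daily_transactions")
          .option("user", "etl_user")
          .option("password", "***")
          .mode("append")
          .save()
      )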

    • IT Analyst
      • Feb 2017 - Sep 2017

      Client: HMD Global

      Azure Data Platform PRS (Profitability Reporting Solution) is a warehousing solution that provides details of financial transactions, purchase orders, and deliveries to the customer for analytics and management decisions. The upstream system, BYD, provides the data, which is cleansed and transformed according to business needs in Azure Data Platform PRS; the data is then given to reporting users for report generation.

      Roles & responsibilities:
      • Developed linked services, datasets, and pipelines.
      • Developed pipelines in Azure Data Factory to extract data from the master data system and load Azure SQL tables.
      • Created tables, external tables, and views in MS SQL.
      • Created stored procedures in MS SQL per the business logic provided by the client (see the sketch below).
      • Involved in testing, go-live, and post-go-live support activities.
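
      For the stored-procedure bullet above, here is a minimal sketch of invoking such a procedure on Azure SQL from Python via pyodbc; the server, database, procedure name, and parameter are hypothetical placeholders.

      import pyodbc

      # Azure SQL connection; server, database, and credentials are placeholders.
      conn = pyodbc.connect(
          "DRIVER={ODBC Driver 18 for SQL Server};"
          "SERVER=prs-example.database.windows.net;"
          "DATABASE=prs;UID=etl_user;PWD=***;Encrypt=yes"
      )
      cur = conn.cursor()
      # Hypothetical procedure applying the client's business logic to the
      # staged BYD extract for one posting date.
      cur.execute("EXEC dbo.usp_load_purchase_orders @posting_date = ?", "2017-06-30")
      conn.commit()
      conn.close()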

    • IT Analyst
      • Aug 2016 - Jan 2017

      Client: Microsoft Mobile

      Integration to load point-of-sale and sales relationship data from DDW (Teradata tables) to the Zyme system. DDIP extracts the data, creates the files in the DDIP landing directory, and MFTs the files to Zyme through a secured connection (public key exchange).

      Roles & responsibilities:
      • Developed mappings to extract data from Teradata and send it as flat files to Zyme.
      • Implemented secured FTP through public key exchange (sketched below).
      • Used Source Qualifier, Expression, and Aggregator transformations.
      • Implemented in a Linux environment with OS profile setup.
      • Involved in testing, go-live, and post-go-live support activities.
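
      The profile describes MFT to Zyme over a secured connection with public key exchange. One common realization is key-authenticated SFTP; a minimal Python sketch with paramiko follows, where the host, user, key path, and file names are illustrative placeholders.

      import paramiko

      # Connect with the private half of the exchanged key pair; all names
      # here are placeholders.
      ssh = paramiko.SSHClient()
      ssh.load_system_host_keys()
      ssh.connect(
          "mft.example.com",
          username="ddip_mft",
          key_filename="/home/ddip/.ssh/id_rsa",
      )
      sftp = ssh.open_sftp()
      # Push an extract file from the DDIP landing directory to the partner
      # drop zone.
      sftp.put("/ddip/landing/pos_sales_20170115.dat", "/inbound/pos_sales_20170115.dat")
      sftp.close()
      ssh.close()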

    • IT Analyst
      • Feb 2013 - Aug 2016

      Client: Nokia

      Nokia Data Integration Platform (NDIP) as a Service includes a fully managed Informatica platform for data-integration-based Nokia applications. This fully managed service includes data integration development services, a centralized metadata service, production support, and a high-availability platform.

      Roles & responsibilities:
      • Led a team of 10.
      • Worked on new developments and enhancements to existing integrations.
      • Monitored and managed workflows and Informatica services.
      • Handled access management and configuration changes.
      • Migrated code via deployment groups and XML files.
      • Fixed production issues and handled data loading.
      • Created new folders in Linux and new Informatica connections, updating them with the needed permissions.
      • Created new TNS/ODBC/SAP RFC entries.
      • Fixed issues and handled requests within SLA.
      • Communicated planned and unplanned breaks to customers.
      • Restarted/shut down Informatica services after planned and unplanned breaks (a workflow-restart sketch follows below).
      • Handled requests in ITSM and ServiceNow tools.
      • Worked with dependent teams to resolve issues related to the platform and Informatica jobs.
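
      Workflow restarts after breaks are typically driven through Informatica's pmcmd CLI. A hypothetical Python wrapper follows; the service, domain, user, folder, and workflow names are placeholders, and the flag spellings follow common pmcmd usage rather than anything stated in this profile.

      import subprocess

      def start_workflow(folder: str, workflow: str) -> None:
          # Hypothetical pmcmd invocation; verify flags against the installed
          # Informatica version before use.
          subprocess.run(
              [
                  "pmcmd", "startworkflow",
                  "-sv", "IS_NDIP",      # integration service name (placeholder)
                  "-d", "Domain_NDIP",   # Informatica domain (placeholder)
                  "-u", "ndip_ops",      # run-as user (placeholder)
                  "-p", "***",           # password placeholder
                  "-f", folder,          # repository folder
                  workflow,              # workflow to start
              ],
              check=True,
          )

      # Example: restart the daily load workflow after a planned break.
      start_workflow("NDIP_SHARED", "wf_daily_loads")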

    • System Engineer
      • Feb 2010 - Jan 2013

      Client: GE Commercial Finance

      GE Commercial Finance merges data from its sub-businesses into an enterprise data warehouse to analyze and compare data at the same level. Teradata's proprietary Financial Services Logical Data Model (FSLDM) was selected as the data model and customized to suit GE CF's needs. The warehouse data is then reported to managers for analytics as required. Informatica and Teradata utilities are used to integrate data from the different source systems: Informatica extracts data from the sources and stages it in Teradata, and Informatica mappings and BTEQ scripts move it to core per the business requirements specified by the Data Architects (a staging-to-core sketch follows this entry).

      Roles & responsibilities:
      • Created Informatica mappings to extract, transform, and load data from various sources into the data warehouse using transformations such as Source Qualifier, Expression, Filter, Lookup, Update Strategy, and Router.
      • Extracted data from source systems such as Teradata, Oracle, and flat files per requirements.
      • Worked extensively in Workflow Manager and Workflow Monitor to create and monitor workflows and tasks such as Command, Email, and Session tasks.
      • Worked on a hybrid approach to improve performance.
      • Prepared and executed unit test cases.
      • Prepared checklists, deployment and rollback plans, and unit test reports for code moves from the integration to the production environment.
      • Involved in code reviews and document reviews.
      • Involved in functional and unit testing.
      • Analyzed issues and tuned the performance of SQL queries.
      • Transferred knowledge to new team members.
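
      As one illustration of the staging-to-core movement the BTEQ scripts performed, here is a minimal Python sketch using the teradatasql driver; the host, credentials, and FSLDM-style table and column names are illustrative assumptions, not details from the project.

      import teradatasql

      # Connect to the warehouse; host and credentials are placeholders.
      with teradatasql.connect(host="edw-host", user="etl_user", password="***") as con:
          cur = con.cursor()
          # Move today's staged rows into the core layer, mirroring the
          # insert-select logic a BTEQ script would run. The teradatasql
          # driver autocommits by default.
          cur.execute(
              """
              INSERT INTO core.party_account (party_id, account_id, open_dt)
              SELECT party_id, account_id, open_dt
              FROM stg.party_account
              WHERE load_dt = CURRENT_DATE
              """
          )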
