Pushkar Tiwari
Sr. Snowflake, PySpark, Kafka, ETL Consultant at BigData Dimension Labs
Experience
- BigData Dimension Labs
- United States
- IT Services and IT Consulting
- 1-100 Employees
- Sr. Snowflake, PySpark, Kafka, ETL Consultant
- Jul 2020 - Present
● Develop and maintain applications using Snowflake services.
● Perform analysis of the existing data storage system and develop data solutions in the cloud (Snowflake).
● Set up and build pipelines for data migration from on-prem systems to Snowflake.
● Extensively used PySpark to write custom complex jobs that load data into Snowflake from various APIs, Teradata, S3, and CSV, JSON, and XML files (see the sketch after this list).
● Extensively used dbt to create Snowflake ELT jobs.
● Extensively used Kafka to handle CDC from MySQL and Postgres databases into Snowflake.
● Discuss solutions for problems that are not fully defined and highlight roadblocks early in the project lifecycle.
● Design and build internal Snowflake applications, transferring data to and from Snowflake using SnowSQL.
● Collaborate and participate in solution review calls and technical meetings with other leads, engineers, and product owners.
● Develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL.
● Design dimensional models, data lake architecture, and Data Vault 2.0 on Snowflake, using Snowflake's logical data warehouse for compute.
● Responsible for developing Ab Initio psets, DMLs (Data Manipulation Language), and XFRs (transformations) that perform mapping transformations and aggregations on data for load/unload.
● Perform unit testing through Ab Initio and act as a subject matter expert to QA and business testers throughout the testing lifecycle.
● Developed generic, configurable, and reusable frameworks using the ETL tool Ab Initio.
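To illustrate the kind of PySpark-to-Snowflake load job described above, here is a minimal sketch using Spark's JDBC reader and the Spark-Snowflake connector. All hosts, credentials, and table names are hypothetical placeholders, not the actual project's configuration.

# Minimal sketch: bulk-load a Teradata table into Snowflake with PySpark.
# Connection values and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("teradata_to_snowflake").getOrCreate()

# Read a source table from Teradata over JDBC (driver jar assumed on the classpath).
src = (spark.read.format("jdbc")
       .option("url", "jdbc:teradata://td-host/DATABASE=SALES")  # hypothetical host
       .option("dbtable", "SALES.ORDERS")
       .option("user", "etl_user")
       .option("password", "***")
       .load())

# Snowflake connection options for the Spark-Snowflake connector.
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",  # hypothetical account
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGING",
    "sfWarehouse": "LOAD_WH",
}

# Bulk-write into Snowflake; the connector stages the data and runs COPY internally.
(src.write.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "STG_ORDERS")
    .mode("overwrite")
    .save())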
- Barclays Bank US
- United States
- Financial Services
- 200-300 Employees
- Snowflake, PySpark, ETL Developer
- Apr 2017 - May 2020
● Design and develop Snowflake-based solutions using Snowflake stored procedures and views.
● Built data migration processes in PySpark and CDC in Kafka to move data from Oracle, MySQL, Postgres, JSON, XML, and CSV sources to Snowflake, using Spark's bulk-loading mechanism and Kafka's streaming mechanism.
● Extensively used Talend to load data from various databases, such as Oracle, MySQL, and Postgres, into Snowflake.
● Implemented exception handling and email notifications for Snowflake procedures.
● Created continuous data ingestion pipelines using Snowpipe, Streams, Tasks, etc. (see the sketch after this list).
● Performed Account Admin activities such as creating roles, users, and stages in the Snowflake cloud data platform.
● Responsible for architecting the ETL (Extract, Transform, Load) framework: design, development, debugging, deployment, and documentation of all processes and programs, as well as support activities, adhering to the corporate systems architecture and maintaining the framework.
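As an illustration of the Snowpipe/Streams/Tasks continuous-ingestion pattern mentioned above, here is a minimal sketch issued through the Snowflake Python connector. The stage, tables, task, and credentials are hypothetical placeholders.

# Minimal sketch: Snowpipe + Stream + Task ingestion, set up via the
# Snowflake Python connector. All object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount", user="admin_user", password="***",
    database="ANALYTICS", schema="RAW", warehouse="INGEST_WH",
)
cur = conn.cursor()

# Snowpipe: auto-load new files from an external stage into a landing table.
cur.execute("""
    CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.orders_landing
    FROM @raw.orders_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Stream: track new rows arriving in the landing table.
cur.execute("CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders_landing")

# Task: periodically move new rows downstream, but only when the stream has data.
cur.execute("""
    CREATE OR REPLACE TASK raw.orders_task
      WAREHOUSE = INGEST_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      INSERT INTO curated.orders SELECT * FROM raw.orders_stream
""")
cur.execute("ALTER TASK raw.orders_task RESUME")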
- PySpark, Ab Initio ETL Developer
- Apr 2015 - Mar 2017
● Understand requirements, then design and implement them.
● Worked on Express>It to create GET, MAP, ILM, and LOAD appconfs for a generic DMF framework.
● Published data from Optum SharePoint to a data lake built on a multifile system using a customized web service graph.
● Developed a web service graph to fetch data in XML format over the SOAP protocol, then used the xml-to-dml utility to convert the XML into structured data.
● Developed a loader process with a remote layout so that data is published where the reporting tool Query>IT is installed.
● Developed ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) solutions using Ab Initio graphs and plans, involving transformations for the OL (operational layer), ODS (operational data store), dimension, fact, and aggregate layers of the target databases.
● Developed a generic SCD II (Slowly Changing Dimension Type 2) process to capture historical data using point-in-time file concepts (see the sketch after this list).
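The SCD Type 2 process above was built in Ab Initio; the following is a minimal PySpark sketch of the same pattern (close out changed current rows, open new versions), assuming hypothetical table and column names and a source snapshot carrying the same business attributes as the dimension.

# Minimal sketch: SCD Type 2 in PySpark. Names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

dim = spark.table("dw.customer_dim")        # existing dimension with SCD2 columns
src = spark.table("stg.customer_snapshot")  # latest source snapshot (same attributes)
today = F.current_date()

cur = dim.filter("is_current")

# Keys whose tracked attribute changed since the last load.
changed_keys = (cur.join(src, "customer_id")
                   .where(cur["address"] != src["address"])
                   .select("customer_id"))

# 1) Close out the changed current rows (point-in-time end date).
closed = (cur.join(changed_keys, "customer_id", "left_semi")
             .withColumn("end_date", today)
             .withColumn("is_current", F.lit(False)))

# 2) Open new versions for changed keys and for never-before-seen keys.
opening = (src.join(cur, "customer_id", "left_anti")
              .unionByName(src.join(changed_keys, "customer_id", "left_semi"))
              .withColumn("start_date", today)
              .withColumn("end_date", F.lit(None).cast("date"))
              .withColumn("is_current", F.lit(True)))

# 3) Keep all other rows untouched and write the rebuilt dimension.
untouched = dim.join(changed_keys, "customer_id", "left_anti")
result = untouched.unionByName(closed).unionByName(opening)
result.write.mode("overwrite").saveAsTable("dw.customer_dim_new")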
- Wipro Technologies
- India
- Ab Initio Developer
- Jan 2013 - Feb 2015
● System analysis and design to migrate DB2 databases from mainframe systems to Teradata using the ETL tool Ab Initio.
● Published Currency Transaction Reports by aggregating each customer's total transaction amount per day and filtering for customers transacting more than $10,000 in a day (see the sketch after this list).
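The report logic above was implemented in Ab Initio; here is a minimal sketch of the same per-customer, per-day aggregation expressed in PySpark, with hypothetical table and column names.

# Minimal sketch: Currency Transaction Report aggregation in PySpark.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ctr_sketch").getOrCreate()

txns = spark.table("bank.transactions")  # one row per transaction

# Total each customer's transactions per day and keep daily totals over $10,000.
ctr = (txns.groupBy("customer_id", F.to_date("txn_ts").alias("txn_date"))
           .agg(F.sum("amount").alias("daily_total"))
           .where(F.col("daily_total") > 10000))

ctr.write.mode("overwrite").saveAsTable("reports.currency_transaction_report")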
Education
- Thakral College of Technology, Thakral Nagar, Raisen Road, Bhopal 462021: Bachelor of Technology (BTech)
- Visvesvaraya Technological University: Master of Computer Applications (MCA)
- Faculty of Architecture, UPTU, Lucknow: B.Tech, Computer Science