Bio
Experience
SWAGG TECHNOLOGIES
Data Engineer
Apr 2023 - Present
Bengaluru, Karnataka, India
Overall 5 years of expertise in designing and implementing IT solution delivery support for diverse solutions and technical platforms, with 3.6 years of relevant experience in Azure cloud technologies such as Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen1 & Gen2, Python, PySpark, Synapse warehouse, Azure Data Explorer, Azure Blob Storage, Azure Cosmos DB, Azure Databricks, and Power BI.
Tata Consultancy Services
Azure Developer
Aug 2022 - Mar 2023
Bengaluru, Karnataka, India
Overall 4 years of expertise in designing and implementing IT solution delivery support for diverse solutions and technical platforms, with 3.6 years of relevant experience in Azure cloud technologies such as Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen1 & Gen2, Synapse warehouse, Azure Data Explorer, Azure Blob Storage, Azure Cosmos DB, and Azure Databricks.
Deft Infotech Pvt. Ltd.
Azure Engineer
Aug 2020 - Mar 2022
Bengaluru, Karnataka, India
To build an ETL pipeline with ADF, you first need to create a data factory. Once the data factory exists, you can start creating pipelines. Each pipeline consists of a series of activities, which represent the different steps in the ETL process. A typical ETL pipeline in ADF might include an extract activity that reads data from the source data store, a transform activity that converts the data into the desired format, and a load activity that writes the transformed data into the destination data store.
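As a minimal sketch of this pattern using the azure-mgmt-datafactory Python SDK: the subscription, resource group, factory, and dataset names (SourceBlobDataset, SinkBlobDataset) below are hypothetical placeholders, and both datasets are assumed to already be defined in the factory.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink,
)

# Hypothetical identifiers for illustration only.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-resource-group"
FACTORY_NAME = "my-data-factory"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# A Copy activity covers the extract and load steps: it reads from the
# source dataset and writes into the sink dataset.
copy_step = CopyActivity(
    name="CopySourceToSink",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SourceBlobDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SinkBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Publish a pipeline containing the single activity to the data factory.
client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "etl-pipeline",
    PipelineResource(activities=[copy_step]),
)

In practice the transform step would be a separate activity (for example a Data Flow or a Databricks notebook activity) placed between the extract and load steps.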
Magicbricks
Associate Software Engineer
Jan 2020 - Jul 2020
Bengaluru, Karnataka, India
● Worked with structured data ingested into Azure File storage.
● Created an ETL pipeline in the SnapLogic tool to bring the data into the Azure Databricks workspace.
● Applied transformation logic, including Spark SQL and PySpark operations, to the data.
● Applied optimization techniques such as partitioning and broadcast joins (see the sketch after this list).
● Created an ETL pipeline on Databricks to load the transformed data into the Snowflake target.
● Analyzed the resulting data in Databricks.
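A minimal PySpark sketch of the transformation and optimization steps above; the table names, column names, and storage paths are hypothetical, and the real job loaded the result into Snowflake rather than Parquet.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-transform").getOrCreate()

# Hypothetical inputs: a large fact table and a small lookup table.
orders = spark.read.parquet("/mnt/raw/orders")
cities = spark.read.parquet("/mnt/raw/cities")

# Broadcast join: ship the small table to every executor so the join
# against the large table avoids a shuffle.
enriched = orders.join(F.broadcast(cities), on="city_id", how="left")

# Spark SQL over a temporary view for SQL-style transformations.
enriched.createOrReplaceTempView("enriched_orders")
daily = spark.sql("""
    SELECT order_date, city_name, SUM(amount) AS total_amount
    FROM enriched_orders
    GROUP BY order_date, city_name
""")

# Partition the output by date so downstream reads can prune files.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/daily_orders")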
Education
Sarojini High School
2014 - 2018
Bachelor of Technology - BTech, Electrical and Electronics Engineering