Bio

Generated by Topline AI
Dandu Rajkumar is a seasoned Azure Data Engineer with 5+ years of expertise in designing and implementing IT solution delivery support across diverse solutions and technical platforms, including Azure Cloud technologies such as Azure Data Factory, Azure SQL, and Azure Blob Storage. He has experience with a range of tools and technologies, including Python, PySpark, Synapse Warehouse, Azure Data Explorer, and the SnapLogic tool.

Experience

  • SWAGG TECHNOLOGIES
    • Bengaluru, Karnataka, India
    • Data Engineer
      • Apr 2023 - Present
      • Bengaluru, Karnataka, India

      Overall 5 years of expertise in designing and implementing IT solution delivery support for diverse solutions and technical platforms, with 3.6 years of relevant experience in Azure Cloud technologies such as Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen1 & Gen2, Python, PySpark, Synapse Warehouse, Azure Data Explorer, Azure Blob Storage, Azure Cosmos DB, Azure Databricks, and Power BI.

  • Tata Consultancy Services
    • Bengaluru, Karnataka, India
    • Azure Developer
      • Aug 2022 - Mar 2023
      • Bengaluru, Karnataka, India

      Overall 4 years of expertise in designing and implementing IT solution delivery support for diverse solutions and technical platforms, with 3.6 years of relevant experience in Azure Cloud technologies such as Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen1 & Gen2, Synapse Warehouse, Azure Data Explorer, Azure Blob Storage, Azure Cosmos DB, and Azure Databricks.

  • Deft Infotech Pvt. Ltd.
    • Bengaluru, Karnataka, India
    • Azure Engineer
      • Aug 2020 - Mar 2022
      • Bengaluru, Karnataka, India

      To build an ETL pipeline with ADF, you first create a data factory. Once the data factory exists, you can start creating pipelines. Each pipeline consists of a series of activities, which represent the different steps in the ETL process. For example, a typical ETL pipeline with ADF might include the following activities:

      • Extract activity: extracts data from the source data store.
      • Transform activity: transforms the data into the desired format.
      • Load activity: loads the transformed data into the destination data store.
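      The extract/transform/load sequence above can be sketched as three functions in plain Python. This is only an illustration of the flow; real ADF pipelines are defined as JSON activity definitions in the ADF portal or via ARM templates, not as Python code, and the function and field names here are hypothetical.

      ```python
      # Illustrative sketch of the three ADF activities as an in-process flow.
      # Not the ADF API: function names and record fields are made up.

      def extract(source_rows):
          """Extract activity: read raw records from the source data store."""
          return list(source_rows)

      def transform(rows):
          """Transform activity: reshape each record into the desired format."""
          return [{"id": r["id"], "name": r["name"].strip().title()} for r in rows]

      def load(rows, destination):
          """Load activity: write transformed records to the destination store."""
          destination.extend(rows)
          return len(rows)

      source = [{"id": 1, "name": "  alice  "}, {"id": 2, "name": "BOB"}]
      sink = []
      loaded = load(transform(extract(source)), sink)
      print(loaded, sink)  # 2 [{'id': 1, 'name': 'Alice'}, {'id': 2, 'name': 'Bob'}]
      ```

      In ADF itself, each of these steps would typically map to a Copy activity (extract/load) and a Data Flow or Databricks activity (transform), chained by pipeline dependencies.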

  • Magicbricks
    • Bengaluru, Karnataka, India
    • Associate Software Engineer
      • Jan 2020 - Jul 2020
      • Bengaluru, Karnataka, India

      • Worked with structured data ingested into Azure File Storage.
      • Created ETL pipelines in the SnapLogic tool to bring the data into the Azure Databricks workspace.
      • Applied transformation logic on the data, including Spark SQL and PySpark operations.
      • Applied optimization techniques such as partitioning and broadcast joins.
      • Created an ETL pipeline on Databricks to load the transformed data into the Snowflake target.
      • Analyzed the resultant data with the Databricks tool.
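      The broadcast-join optimization mentioned above can be sketched conceptually in plain Python (not PySpark): the small dimension table is "broadcast" as an in-memory dict, so each row of the large table joins by a local lookup instead of a cluster-wide shuffle. Table contents here are hypothetical; in PySpark the equivalent is the `broadcast()` hint from `pyspark.sql.functions`.

      ```python
      # Conceptual sketch of a broadcast join, in plain Python rather than
      # PySpark: the small side lives in memory and the large side streams past.

      small_table = {"IN": "India", "US": "United States"}      # broadcast side
      large_table = [("r1", "IN"), ("r2", "US"), ("r3", "IN")]  # fact rows

      joined = [(row_id, code, small_table.get(code, "unknown"))
                for row_id, code in large_table]
      print(joined[0])  # ('r1', 'IN', 'India')
      ```

      The same idea is why Spark's broadcast joins avoid shuffling the large table: only the small table is copied to every executor.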

Education

  • 2014 - 2018
    Sarojini High School
    Bachelor of Technology - BTech, Electrical and Electronics Engineering

Suggested Services

This profile is unclaimed. These are suggested service rates, with 0% commission upon successful connection.

Industry Focus: "IT Services and IT Consulting"

