Siri Kademani

Volunteer Research Assistant at W. P. Carey School of Business – Arizona State University
Contact Information
us****@****om
(386) 825-5501
Location
Tampa, Florida, United States
Languages
  • English Full professional proficiency
  • Kannada Native or bilingual proficiency
  • Hindi Professional working proficiency

Recommendations

5.0 / 5.0, based on 2 ratings

Praveen Kumar Pal

Siri is a talented professional with great analytical skills. She has seamlessly delivered data analytics projects with a good grasp of SQL and visualisation tools. She is always keen to learn new technologies, which keeps her ahead of her peers. I would highly recommend her for any data analytics project. Thanks, Siri, for all the support and dedication that made the project a success.

Ritesh Bisht

Siri was my student during the Tableau training sessions. A curious learner, she picked up Tableau quickly, applied her technical knowledge, produced a number of outstanding visuals, and passed the certification exam. Any data organization would be fortunate to hire Siri: she is honest, diligent, and talented. I wish her luck and am confident that she will contribute significantly to any team.

Credentials

  • Alteryx - The Complete Masterclass
    Udemy
    Sep, 2023
    - Nov, 2024
  • Statistics for Business Analytics and Data Science A-Z
    Udemy
    Jul, 2023
    - Nov, 2024
  • SQL Advanced Skill Assessment
    HackerRank
    Jun, 2023
    - Nov, 2024
  • SQL Basic Skill Assessment
    HackerRank
    Jun, 2023
    - Nov, 2024
  • SQL Intermediate Skill Assessment
    HackerRank
    Jun, 2023
    - Nov, 2024
  • SnowPro Core Certification
    Snowflake
    Apr, 2022
    - Nov, 2024
  • Hands On Essentials - Data Warehouse
    Snowflake
    Mar, 2022
    - Nov, 2024
  • Tableau Desktop Specialist
    Tableau
    Jan, 2022
    - Nov, 2024
  • Certificate in Qlik Sense Analytics Development
    Udemy
    Aug, 2021
    - Nov, 2024
  • Learn Data Mining and Machine Learning with Python
    Udemy
    Aug, 2021
    - Nov, 2024
  • Beginners Guide to AI
    Udemy
    Jun, 2021
    - Nov, 2024
  • Social Media Management Pro course
    MyCaptain
    May, 2021
    - Nov, 2024
  • Linux Command Line Basics
    Udemy
    Apr, 2020
    - Nov, 2024
  • Oracle SQL: Become a Certified SQL Developer from scratch
    Udemy
    Mar, 2020
    - Nov, 2024

Experience

    • United States
    • Higher Education
    • 400 - 500 Employee
    • Volunteer Research Assistant
      • Jun 2023 - Present

    • United States
    • Public Safety
    • 700 & Above Employee
    • Data Science Analyst
      • Jan 2023 - May 2023

      1. Devised a robust binary classification model for text categorization using Natural Language Processing (NLP) techniques.
      2. Automated the time-consuming manual categorization of customer reviews, saving approximately 16 hours of manual work and freeing up resources for more strategic initiatives.
      3. Led the development of Tableau dashboards presenting KPIs such as average Promoters, Detractors, Neutrals, and Net Promoter Score (NPS), contributing to a 25% reduction in time spent on manual data compilation and report generation and enabling faster, data-driven decision-making.
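As an illustration of the kind of binary text classifier described above, here is a minimal plain-Python sketch: a multinomial Naive Bayes over bag-of-words features, plus an NPS helper for the dashboard KPI. All class names, labels, and sample reviews are hypothetical; the actual project's model and data are not shown.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase a review and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

class NaiveBayesClassifier:
    """Multinomial Naive Bayes over bag-of-words features."""

    def fit(self, texts, labels):
        self.labels = set(labels)
        self.class_counts = Counter(labels)
        self.word_counts = {c: Counter() for c in self.labels}
        for text, label in zip(texts, labels):
            self.word_counts[label].update(tokenize(text))
        self.vocab = {w for c in self.labels for w in self.word_counts[c]}
        return self

    def predict(self, text):
        tokens = tokenize(text)
        total_docs = sum(self.class_counts.values())
        best_label, best_score = None, float("-inf")
        for c in self.labels:
            # log prior + log likelihoods with Laplace (add-one) smoothing
            score = math.log(self.class_counts[c] / total_docs)
            denom = sum(self.word_counts[c].values()) + len(self.vocab)
            for t in tokens:
                score += math.log((self.word_counts[c][t] + 1) / denom)
            if score > best_score:
                best_label, best_score = c, score
        return best_label

def net_promoter_score(labels):
    """NPS = %promoters - %detractors, on a -100..100 scale."""
    n = len(labels)
    promoters = labels.count("promoter")
    detractors = labels.count("detractor")
    return round(100 * (promoters - detractors) / n)
```

A production model would use richer features (e.g. TF-IDF n-grams) and a held-out evaluation set; the sketch only shows the core idea of learning per-class word statistics and scoring new reviews against them.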

    • United States
    • IT Services and IT Consulting
    • 400 - 500 Employee
    • Business Intelligence Engineer
      • Oct 2021 - Jun 2022

      Responsibilities:
      1. Explored the sales, orders, and transaction data along with social media data; performed data modelling by defining fact and dimension tables and building the schema. A snowflake schema was built for this model.
      2. Performed data manipulation and data cleaning using MS Excel.
      3. Prepared wireframes of the Tableau dashboards using Balsamiq.
      4. Built Tableau dashboards based on the business requirements, with back-end Snowflake integration.
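The fact/dimension modelling described above can be sketched with an in-memory SQLite database standing in for the warehouse. The snowflake shape comes from normalizing a dimension (a hypothetical `dim_product` referencing `dim_category`); all table and column names are illustrative, not the project's actual schema.

```python
import sqlite3

# In-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Snowflake schema: the fact table references dim_product, which is
# further normalized into dim_category -- the "snowflaking".
cur.executescript("""
CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, category_name TEXT);
CREATE TABLE dim_product (
    product_id INTEGER PRIMARY KEY,
    product_name TEXT,
    category_id INTEGER REFERENCES dim_category(category_id)
);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    quantity INTEGER,
    amount REAL
);
INSERT INTO dim_category VALUES (1, 'Electronics'), (2, 'Grocery');
INSERT INTO dim_product VALUES (10, 'Headphones', 1), (11, 'Coffee', 2);
INSERT INTO fact_sales VALUES
    (100, 10, 2, 59.98), (101, 11, 3, 14.97), (102, 10, 1, 29.99);
""")

# A typical dashboard query: revenue by category through the snowflaked joins.
rows = cur.execute("""
SELECT c.category_name, ROUND(SUM(f.amount), 2) AS revenue
FROM fact_sales f
JOIN dim_product p  ON p.product_id  = f.product_id
JOIN dim_category c ON c.category_id = p.category_id
GROUP BY c.category_name
ORDER BY revenue DESC
""").fetchall()
```

In the real setup the same query shape would run against Snowflake behind a Tableau data source rather than SQLite.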

    • India
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Data Analyst
      • May 2020 - Dec 2020

      ‘FMR Data Ontology’ – Responsibilities:
      1. Explored the raw financial data in collaboration with SMEs, analysed the gaps, and surfaced insights that served as inputs for the AI use-case models.
      2. Performed data cataloging using Alation and presented the full analysis and insights to the company's chapter group.
      3. Maintained and enhanced processes that sourced data from SQL Developer to Qlik Sense using cron jobs.
      4. Followed Agile methodology, which strengthened project management and leadership skills; used a JIRA board for tracking.
      5. Automated two manual jobs:
         A. A monthly job on a crontab schedule, automated end to end with Unix shell scripts; the business-logic queries were written in SQL.
         B. A weekly job that automatically emails two reports at a set time each week. These jobs previously took a day or two of manual effort; after automation, the manual effort is close to nil.
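Item 5.A above (a crontab-scheduled job whose business logic lives in SQL) might look like the following sketch. The original used Unix shell scripts; this Python equivalent, the crontab line, and all table and account names are hypothetical stand-ins.

```python
import sqlite3
from datetime import date

# Hypothetical crontab entry for the monthly job (06:00 on the 1st):
#   0 6 1 * *  /usr/bin/python3 /opt/jobs/monthly_report.py

def build_report(conn):
    """Run the (illustrative) SQL business logic and format a plain-text report."""
    rows = conn.execute(
        "SELECT account, SUM(amount) FROM transactions "
        "GROUP BY account ORDER BY account"
    ).fetchall()
    lines = [f"Monthly report ({date.today():%Y-%m})"]
    lines += [f"{account}: {total:.2f}" for account, total in rows]
    return "\n".join(lines)

# Toy data so the sketch runs end to end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account TEXT, amount REAL)")
conn.executemany("INSERT INTO transactions VALUES (?, ?)",
                 [("A-100", 40.0), ("A-100", 10.5), ("B-200", 99.9)])
report = build_report(conn)
```

The weekly job in 5.B would wrap the same pattern with an email step (e.g. via an SMTP client) instead of returning a string.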

    • Ireland
    • Business Consulting and Services
    • 700 & Above Employee
    • Associate Data Analyst
      • Aug 2018 - May 2020

      Training in Big Data Stream (09/2018 – 10/2018): Explored the big data stream, comprising Hadoop ecosystem components such as HDFS, Kafka, Cassandra, Hive, and Scala.
      Costco Project (12/2018 – 06/2019):
      1. Executed end-to-end ETL UAT testing and monitored GCP Composer scheduling, promptly resolving errors and bugs, resulting in a 20% reduction in testing and debugging time. This streamlined process ensured timely deployment of ETL pipelines, saving approximately 8 hours per week.
      2. Used Google Cloud Shell for data import and export from Google Cloud Storage, cutting data transfer time by 15% compared with traditional methods, an estimated saving of 5 hours per week.
      3. Headed maintenance of the weekly scheduling process, resolving issues to guarantee error-free report generation. This proactive approach contributed to a 98% reduction in report-generation errors and saved an estimated 10 hours per month in troubleshooting and corrective actions.
      FMR India (07/2018 – 12/2020): Performed the Data Ontology task: mined data from the different sources and applied business logic in SQL (mostly via the SQL Developer tool) to derive business insights.
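The ETL UAT testing in item 1 typically boils down to reconciliation checks between source and target after a pipeline run. Below is a minimal, hypothetical sketch; the function, its parameters, and the sample rows are illustrative, not the project's actual tests.

```python
# UAT-style reconciliation: compare row counts, a measure checksum, and
# key coverage between a source extract and the loaded target table.
# Real pipelines would query the actual source and warehouse; the row
# dicts here are stand-ins.

def reconcile(source_rows, target_rows, key, measure):
    """Return a list of human-readable discrepancies (empty list = pass)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_sum = sum(r[measure] for r in source_rows)
    tgt_sum = sum(r[measure] for r in target_rows)
    if abs(src_sum - tgt_sum) > 1e-6:
        issues.append(f"{measure} checksum mismatch: {src_sum} vs {tgt_sum}")
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    for missing in sorted(src_keys - tgt_keys):
        issues.append(f"key {missing} missing from target")
    return issues
```

A scheduler task (e.g. in GCP Composer) could run such a check after each load and fail the DAG when the returned list is non-empty.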

Education

  • Arizona State University
    Master of Science - MS, Business analytics
    2022 - 2023
  • Visvesvaraya Technological University
    Bachelor of Engineering, Computer Science Engineering (CSE)
    2014 - 2018
  • Chetan PU Science College, Hubli
    Pre-University (Physics, Chemistry, Maths, Biology)
    2012 - 2014
  • N K Thakkar High school
    10th grade, 93.6%
