Krishna Chikkam
AWS Data Engineer at Sas Info
Languages
- English: Full professional proficiency
- Telugu: Native or bilingual proficiency
- Hindi: Full professional proficiency
- Spanish: Limited working proficiency
Credentials
- Oracle Database SQL Certified Associate: Oracle, Jun 2019 - Nov 2024
- Amazon Web Services (AWS): Capgemini, May 2019 - Nov 2024
- Big Data: Capgemini, May 2019 - Nov 2024
- Ab Initio Certified: Capgemini, Apr 2019 - Nov 2024
- Data Modeling Certified: Capgemini, Apr 2019 - Nov 2024
- Qlik Sense Certified: Capgemini, Apr 2019 - Nov 2024
- Qlik View Certified: Capgemini, Apr 2019 - Nov 2024
- Teradata Basics Certified: Capgemini, Apr 2019 - Nov 2024
- ETL Basics Certified: Capgemini, Mar 2019 - Nov 2024
- Java Programming Basics: Capgemini, Mar 2019 - Nov 2024
- Oracle Advanced Certified: Capgemini, Mar 2019 - Nov 2024
- UNIX Certified: Capgemini, Mar 2019 - Nov 2024
- Introduction to Python Certified: Capgemini, Feb 2019 - Nov 2024
Experience
Sas Info - Brazil - Retail - 1-100 employees
AWS Data Engineer
Jan 2023 - Present
- Contributed significantly to a key data pipeline that processes over 500 TB of data, consolidating data from diverse sources into a single destination to enable faster analysis and better decision making.
- Processed transactional data across 9 primary data sources using Apache Spark, Redshift, S3, and Python (an illustrative sketch follows this entry).
- Collaborated with cross-functional teams to migrate on-premises databases to AWS, reducing maintenance costs by 20%.
- Automated three CI/CD pipelines using Git hooks, ensuring smooth code integration and version control.
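The consolidation work above is described only at a high level; the following is a minimal PySpark sketch of that kind of source-to-curated pipeline, assuming hypothetical S3 paths, column names, and a Parquet landing zone. The actual sources, schemas, and Redshift loading details are not specified in this profile.

# Minimal illustrative sketch (not the production pipeline): consolidate
# transactional data from multiple S3 sources into one curated destination.
# Bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("consolidate-transactions").getOrCreate()

# Two of the source feeds (the real pipeline drew on 9 primary sources).
orders = spark.read.parquet("s3a://example-raw/orders/")    # hypothetical path
returns = spark.read.json("s3a://example-raw/returns/")     # hypothetical path

# Normalize each source to a common schema before combining.
orders_std = orders.select(
    F.col("order_id").alias("txn_id"),
    F.col("order_ts").cast("timestamp").alias("txn_ts"),
    F.col("amount").cast("double").alias("amount"),
    F.lit("orders").alias("source"),
)
returns_std = returns.select(
    F.col("return_id").alias("txn_id"),
    F.col("return_ts").cast("timestamp").alias("txn_ts"),
    (-F.col("refund_amount")).cast("double").alias("amount"),
    F.lit("returns").alias("source"),
)

# Union into one dataset and land it in a single curated location, partitioned
# by date so downstream consumers can query it quickly.
consolidated = orders_std.unionByName(returns_std).withColumn(
    "txn_date", F.to_date("txn_ts")
)
consolidated.write.mode("append").partitionBy("txn_date").parquet(
    "s3a://example-curated/transactions/"
)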
MUFG - Japan - Financial Services - 700+ employees
Data Engineer
Feb 2019 - Nov 2021
- Developed a data pipeline using AWS Glue, Python, and Apache Spark to automate data ingestion, transformation, and loading of over 10 TB of data per month, reducing manual effort and operating costs.
- Engineered an approach to manage the loan moratorium for 8 million customers during COVID-19 using Python and SQL.
- Architected, deployed, and supported a highly scalable data warehousing solution on the Snowflake platform, supporting over 500 concurrent users and delivering a significant improvement in application performance.
- Implemented partitioning, bucketing, and index optimization in Hive (Hadoop), reducing query response time by 25%.
- Streamlined data workflows by incorporating automation and scheduling in Apache Airflow, reducing manual intervention and increasing data accessibility.
- Automated routine tasks with Bash shell scripts, processed and analyzed JSON data, and optimized data storage by converting 50 TB of data to Parquet format (a short sketch of this conversion follows below).
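The JSON-to-Parquet conversion mentioned in the last bullet is a standard Spark pattern; below is a minimal sketch of it, assuming hypothetical paths, a hypothetical event_ts field, and a date partition column. The configuration of the actual 50 TB job is not described here.

# Minimal illustrative sketch of the JSON-to-Parquet conversion pattern.
# Paths, the event_ts field, and the partition column are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("json-to-parquet").getOrCreate()

# Raw JSON events (hypothetical location).
raw = spark.read.json("s3a://example-raw/events/")

# Columnar Parquet with date partitioning typically reduces storage footprint
# and speeds up scans compared to raw JSON.
(
    raw.withColumn("event_date", F.to_date("event_ts"))
       .repartition("event_date")              # limit small files per partition
       .write.mode("overwrite")
       .partitionBy("event_date")
       .parquet("s3a://example-curated/events_parquet/")
)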
TELLIGEN TECHNOLOGIES PRIVATE LIMITED - India - 1-100 employees
ETL/Informatica Developer
Jul 2018 - Feb 2019
- Led the development of ETL Informatica applications in an agile environment, cutting development costs by 25%.
- Developed and delivered Tableau reports and dashboards for business users, increasing their productivity.
- Spearheaded performance tuning of an existing ETL process by implementing parallel processing and optimizing the data loading strategy, improving application performance by at least 30%.
- Orchestrated the migration of applications from Informatica PowerCenter to Informatica Intelligent Cloud Services (IICS).
TELLIGEN TECHNOLOGIES PRIVATE LIMITED - India - 1-100 employees
Informatica Developer
Jan 2018 - May 2018
- Improved database performance by 30% through strategic query optimization, partitioning, and indexing.
- Extracted and transformed data from multiple databases (PostgreSQL, MS SQL Server, Teradata, Oracle), SharePoint, flat files, and XML files into target database tables and CSV files (a simplified extraction sketch follows below).
- Executed a disaster recovery (DR) plan for over 10 applications, ensuring business continuity with minimal downtime.
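The extract-and-land work above maps to a common Python pattern; the sketch below shows it for a single PostgreSQL source with a hypothetical connection string, table, and output file. The real jobs were built in Informatica and covered several database platforms, so this is only an approximation of the flow.

# Minimal illustrative sketch: extract rows from one source database and land
# them as a CSV file. Connection string, table, and file name are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# PostgreSQL shown here; MS SQL Server, Teradata, or Oracle would follow the
# same pattern with a different SQLAlchemy driver and URL.
engine = create_engine("postgresql+psycopg2://etl_user:secret@db-host:5432/sales")

# Pull the source rows and write them to a flat file for the downstream load.
orders = pd.read_sql("SELECT customer_id, order_id, order_date, amount FROM orders", engine)
orders.to_csv("orders_extract.csv", index=False)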
Education
- Governors State University: Master of Science (MS), Computer Science
- MLR Institute of Technology: Bachelor of Technology, Computer Science