Harshitha Naredla
Software Engineer IV at Next Phase Solutions and Services, Inc.
Experience
Next Phase Solutions and Services, Inc.
United States | IT Services and IT Consulting | 1 - 100 Employees
Software Engineer IV
Mar 2019 - Present
• Perform hands-on development using Databricks, AWS, Airflow, Informatica, Python, PySpark, SQL, and DB2, Teradata, and Oracle databases.
• Used AWS Secrets Manager for password rotation with external database systems.
• Created Jupyter notebooks in Databricks for Extract, Transform, and Load (ETL) processes.
• Worked on IBM Sterling Connect:Direct processes for file transfers between AWS and mainframe environments.
• Automated Databricks notebooks and resulting scripts using Apache Airflow and shell scripting to ensure daily execution in production.
• Created Airflow DAGs using the Email, Bash, TriggerDagRun, DatabricksSubmitRun, Python, and Branch operators.
• Used GitHub for version control.
• Worked with JFrog Artifactory and Jenkins for CI/CD pipelines.
• Created root cause analyses (RCAs) in JIRA for production issues.
• Implemented both short-term and long-term workarounds for production issues caused by special characters in data.
• Work with various CMS provider and POR data sources (POR, PECOS, NPPES, NPICS, TMSIS, QIES, DOD).
• Work with the enterprise data lake (EDL) to create data sets in Parquet format and contribute them to the EDL.
• Worked on Teradata BTEQ scripts to load data into the Teradata database.
• Create labels for Informatica objects for migration from lower to higher environments.
• Create and modify UNIX shell scripts for file processing, email notifications, etc.
• Support operations by troubleshooting performance issues related to data volume; work with DBA teams on regular maintenance of DB2 objects for better performance.
• Define and implement production release requirements.
• Provide operations support for MDM across Informatica, DB2, and IBM InfoSphere.
• Troubleshoot performance issues related to data volume and as part of smoke testing.
• Participate in and contribute to ETL code reviews with the goal of ensuring standardization, best practices, consistency, and quality.
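The special-character workaround mentioned above can be illustrated with a minimal Python sketch; the normalization approach and sample values here are assumptions, not the actual production cleansing rules:

```python
import unicodedata

def clean_field(value: str) -> str:
    """Normalize a raw field: fold accented characters to ASCII and
    drop control characters that can break downstream loads.
    (Illustrative sketch only; not the production cleansing logic.)"""
    # Decompose accented characters, then discard the non-ASCII remainder.
    normalized = unicodedata.normalize("NFKD", value)
    ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
    # Drop non-printable control characters such as embedded tabs/newlines.
    return "".join(ch for ch in ascii_only if ch.isprintable())

print(clean_field("Muñoz\tGonçalves"))  # -> MunozGoncalves
```

A short-term workaround like this cleans records at load time; a long-term fix would push the rule upstream to the source systems.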
VEDICSOFT
United States | Information Technology & Services | 100 - 200 Employees
ETL Specialist & Operations Support (MDM contract for CMS)
Jan 2014 - Mar 2019
• Work closely with design and development teams to implement design patterns such as Informatica Mapping Architect for Visio to speed up development and maintain consistency across applications.
• Profile source system data to find abnormalities and ambiguity, and create scorecards and cleansing rules using Informatica Developer and UNIX shell scripting.
• Provide operational support for Informatica applications, DB2, and IBM InfoSphere components.
• Troubleshoot performance issues related to data volume and work with DBA teams on database maintenance to improve performance and address capacity issues during operations.
• Use mapping templates when needed to speed up the development process and maintain consistency across ETL application code.
• Create labels for Informatica objects and deployment groups for migration from lower to higher environments.
• Create public and private repository queries for cleansing redundant connection objects (as part of an internal audit project).
• Create and modify UNIX shell scripts for file processing, email notifications, templates, parameter files, etc.
• Troubleshoot performance issues related to data volumes, indexes, and database maintenance.
• Use ExtraHop as a network monitoring tool to monitor and log network traffic.
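The data-profiling step above can be sketched in Python. The column names, sample data, and statistics are illustrative assumptions; the actual scorecards were built in Informatica Developer:

```python
import csv
import io

def profile_column(rows, column):
    """Compute simple profile stats for one column: row count,
    null/blank count, and distinct-value count. Illustrative sketch only."""
    values = [r[column] for r in rows]
    non_blank = [v for v in values if v.strip()]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_blank),
        "distinct": len(set(non_blank)),
    }

# Hypothetical sample standing in for a real source-system extract.
sample = io.StringIO("npi,state\n123,MD\n456,\n123,VA\n")
rows = list(csv.DictReader(sample))
print(profile_column(rows, "state"))  # {'rows': 3, 'nulls': 1, 'distinct': 2}
```

Stats like these feed a scorecard: columns with unexpected null rates or cardinality flag cleansing rules to write.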
Highmark Inc.
United States | Insurance | 700+ Employees
Information Developer
Jul 2010 - Dec 2013
• Involved in gathering business scope and technical requirements and writing technical specifications.
• Developed complex Informatica mappings to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Normalizer, Rank, and Router.
• Worked with Informatica tools: Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformations.
• Developed Informatica mappings and tuned them for better performance.
• Worked with the Teradata MLOAD, TPUMP, and FASTLOAD utilities for faster loading and improved performance.
• Extensively used ETL to load data from flat files and Oracle to the Teradata database and flat files.
• Wrote Teradata views against the departmental and claims databases to retrieve the required data; these views were built through the Harvest Change Manager tool in the desired Teradata schema and used as sources for Informatica.
• Performed load balancing of ETL processes, database performance tuning, and capacity monitoring.
• Involved in unit, event, and system testing of individual components.
• Analyzed the existing system and developed business documentation on required changes.
• Used UNIX to create parameter files and for real-time applications.
• Extensively involved in end-to-end testing to ensure the quality of adjustments made to accommodate the source system upgrade.
• Worked with many existing Informatica mappings to produce correct output.
• Generated documentation for all mappings and workflows, along with unit testing documentation and unit test cases for the development code.
• Prepared detailed technical design documents before code migration.
• Created detailed system defects to keep the project team informed of status throughout the process.
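Flat-file loads like those above often require converting fixed-width extracts into the delimited form a Teradata load utility expects. A minimal Python sketch of that pre-processing step; the field widths and record layout are hypothetical, not the actual claims layout:

```python
def fixed_to_delimited(line, widths, delimiter="|"):
    """Split a fixed-width record into trimmed fields joined by a
    delimiter, as a pre-processing step before a FASTLOAD/MLOAD run.
    Field widths here are hypothetical, not a real file layout."""
    fields, pos = [], 0
    for w in widths:
        fields.append(line[pos:pos + w].strip())
        pos += w
    return delimiter.join(fields)

# Hypothetical 3-field record: 10-char ID, 20-char name, 8-char date.
record = "0000012345JOHN DOE            20131201"
print(fixed_to_delimited(record, [10, 20, 8]))  # 0000012345|JOHN DOE|20131201
```

In practice the delimiter and layout must match the `.SET RECORD` / field definitions in the load script.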
Education
Gannon University
Master of Science - MS, Mechanical Engineering