Abhishek Tambi

Personal Line BI Lead Data Engineer at Farmers Insurance
Contact Information
us****@****om
(386) 825-5501
Location
Los Angeles Metropolitan Area

Credentials

  • SnowPro Core Certification
    Snowflake
    Jun, 2021
    - Nov, 2024
  • AWS Certified Cloud Practitioner
    Amazon Web Services (AWS)
  • Cisco Certified Network Associate (CCNA)
    Cisco
  • Informatica Certified in Advanced Mapping Design PowerCenter 8
    Informatica
  • Informatica Certified in Architecture and Administration PowerCenter 8
    Informatica
  • Informatica Certified in Mapping Design PowerCenter 8
    Informatica
  • Oracle 9i Certified Professional (OCP) (1Z0-007)
    Oracle

Experience

  • Farmers Insurance
    • United States
    • Insurance
    • 700 & Above Employees
    • Personal Line BI Lead Data Engineer
      • Jul 2021 - Present

      • Played a key role in the successful implementation of one of the largest data modernization projects in the insurance industry: migrating data from legacy on-premises databases (DB2) and systems to a cloud data warehouse (Snowflake), including ETL pipeline changes to convert around 20,000 data assets for the Farmers Personal Lines BU.
      • Provide technical leadership to a team of data engineers, promoting best practices, defining standards, conducting code reviews, and encouraging knowledge sharing.
      • Worked with Informatica Professional Services on a key project for the App2Cloud strategy to convert ETL code from Informatica PowerCenter to IICS; served as liaison for all technical and financial discussions with the vendor and leadership.
      • Spearheading an initiative on data integration patterns to architect scalable, cost-effective data intake solutions using AWS services (S3, Lambda, SNS) and Snowflake ingestion services (Snowpipe, COPY INTO) for data ingestion projects of varied complexity.
      • Proven and much-appreciated ability to identify and resolve performance bottlenecks in ETL pipelines by analyzing logs, locating the bottlenecks, and applying varied tuning techniques on both on-premises (Oracle) and cloud (Snowflake) databases.
      • Track credit usage of cloud-based tools such as Snowflake and IICS; eliminate waste by rapidly discovering ways to lower unexpected expenses, right-sizing warehouses for ETL loads, and identifying overprovisioned and unused resources to decommission.
      • Led comprehensive evaluations of ETL tools in the market, considering factors such as functionality, scalability, ease of use, performance, and cost-effectiveness.
      • Orchestrated multiple POCs showcasing each tool's performance, data integration capabilities, and compatibility with existing systems.
      • Proactively drive strategy, governance, and data modeling principles across multiple efforts.

    • Claims BI Technical lead/Senior Data Engineer
      • Aug 2016 - Jun 2021

      • Led the solution and development team of approx. 50 resources and played a pivotal role in successfully implementing the multimillion-dollar, multiyear Claims BI Guidewire transformation roadmap, a significant achievement that drew numerous appreciations from leadership.
      • Responsible for managing end-to-end delivery, including product evaluation, infrastructure planning, functional and gap analysis, change management, business readiness, and upgrade assessment for project streams under a multimillion-dollar portfolio.
      • Architected, modeled, designed, and developed ETL pipelines in Informatica for data layers such as Stage, Data Marts, and Dashboard across multiple complex business intelligence projects supporting the reporting needs of the Claims business for Guidewire data and approx. 25 vendor integrations.
      • Lead architect across the complete lifecycle of various ETL projects in OLTP and OLAP systems.
      • Liaison between business and technical teams for all deliverables in the steering committee.
      • Coordinated with vendor partner teams on task assignments, knowledge transition, code reviews, and application documentation.
      • Played an important role in the ideation and implementation of a DataOps and CI/CD pipeline leveraging GitHub, Jenkins, and CloudBees, including custom-built value additions such as automated code review and automated deployment.

    • Senior ETL Developer/SME
      • Mar 2014 - Jul 2016

      • Played a pivotal role in successfully implementing a SOX-compliant reconciliation process between the transactional system (Siebel) and the financial engine (mainframe), with a threshold of 99.98% matching transactions.
      • Responsible for end-to-end delivery, including project planning, architecting technical solutions, recon framework design, leading a team of developers, conducting extensive user testing, and production deployment.
      • Participated in health checks of the Informatica footprint and in design review sessions with Global Informatica Support.
      • Applied understanding of the insurance sector's P&C industry data model, along with Kimball methodology, to design and build a ground-up Claims Data Mart from internal transactional and external vendor data.
      • Designed a common services framework encompassing tools for data integrity, data quality, audit control, and error handling in ETL processes.

    • ETL Consultant
      • Mar 2009 - Mar 2014

      • Worked extensively on the design and development of an ETL framework on Informatica PowerCenter for Workers' Compensation insurance compliance reports (NCCI, USR, DCI, Medicare, etc.), including strategies for validation, error handling, and checkpoints.
      • Worked on Siebel migration and integration with various systems using EIM jobs.
      • Performance-tuned ETL sessions and complex SQL queries.
      • Analyzed Oracle databases for source, target, and instruction tables (populated by the Siebel 8 front end).
      • Created ad-hoc reports for business partners using SQL.
      • Proposed efficient solutions and implemented production fixes, including emergency fixes, for ETL jobs.
      • Implemented new standards and ensured adherence to locally defined standards for all developed components across all environments.
      • Used the Harvest change management tool for code integrity and Mercury Quality Center for defect tracking.
      • Led a team supporting all on-call production duties, monitoring ETL loads on scheduling tools such as DAC and Autosys.

    • United States
    • Financial Services
    • 700 & Above Employees
    • Informatica/ETL Developer
      • Feb 2008 - Feb 2009

      • Reviewed the Functional Design Document (FDD) and interacted with Subject Matter Experts (SMEs) to gather requirements for the ATT and FET modules.
      • Analyzed Oracle databases for source, target, and instruction tables (populated by the Siebel front end) for the above modules.
      • Designed the mapping document specifying the data flow for transformations at the Informatica level.
      • Developed Informatica mappings adhering to the functional specifications and built test cases.
      • Performed extensive unit testing by fabricating data in the source and instruction tables for various test cases.
      • Tuned Informatica session performance for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.

  • Systematix Infotech
    • Indore Area, India
    • Software Developer
      • Aug 2005 - May 2006

      • Worked on in-house projects involving technologies such as Oracle and Microsoft Office.

Education

  • UNCC
    MS, Computer Engineering
    2006 - 2007
  • SSGMCE
    B.E., Electronics and Telecommunications
    2001 - 2005
