Fahad Rehman

Investment Information Architecture Manager at State of Wisconsin Investment Board
Contact Information
us****@****om
(386) 825-5501
Location
United States
Languages
  • Arabic
  • Urdu


Credentials

  • Data Analytics & Visualization Boot Camp (6 months)
    University of Wisconsin Extended Campus
    Mar 2022 - Nov 2024
  • Machine Learning in Business
    MIT Sloan School of Management
    Aug 2020 - Nov 2024
  • DAMA CDMP Certification - Associate
    DAMA CDMP Certification
    Oct 2019 - Nov 2024
  • Metadata Exam: Associate Level
    DAMA CDMP Certification
    Oct 2019 - Nov 2024
  • Metadata Exam: Practitioner Level
    DAMA CDMP Certification
    Oct 2019 - Nov 2024
  • Programming for Everybody (Getting Started with Python)
    Coursera Course Certificates
    Mar 2016 - Nov 2024
  • Hadoop Platform and Application Framework
    Coursera Course Certificates
    Nov 2015 - Nov 2024
  • Introduction to Big Data
    Coursera Course Certificates
    Sep 2015 - Nov 2024

Experience

    • United States
    • Investment Management
    • 100 - 200 Employees
    • Investment Information Architecture Manager
      • May 2019 - Present

    • United States
    • Insurance
    • 700 & Above Employees
    • Information Governance and Standards Specialist
      • Apr 2014 - May 2019

      Responsible for establishing and implementing the methods and procedures for proper use and maintenance of the enterprise metadata environment. Collaborates with the data governance area to develop guidelines for how technical, business, and process metadata is defined, managed, and transformed as it moves through various data repositories. Collaborates with IT and the business to develop a common understanding of metadata and ensure effective and accurate decision making by the organization's data users.

    • United States
    • Financial Services
    • 700 & Above Employees
    • Data Governance Senior Consultant
      • Oct 2012 - Apr 2014

      • Supported the MetaCenter tool, which holds company-wide metadata for structured and unstructured data
      • Established business and technical metadata, data lineage, and data models, along with the methodology to document this metadata for both technical and business users
      • Trained Data Stewards, Data Owners, and SMEs on optimal use of data within the Business Glossary, Data Dictionary, and Data Lineage for the Metadata Management program rollout
      • Developed enterprise-wide initiatives for the definition and adoption of common data dictionaries
      • Inventoried sensitive data (PCI, PII, and IP) in relational databases and reported it to senior management on a quarterly basis (an illustrative query sketch follows below)
      • Played a key role in technology selection and architecture design to support data governance policies and procedures as they relate to data quality and metadata across the organization
      • Published and governed 'ISO/IATA Reference data' into the metadata repository
      • Utilized the SAS DataFlux tool for data profiling and data quality of various data sources to uncover and determine the root cause of data quality issues
      • Supervised an offshore team of eight; handled daily status calls, coordination of work, and updates on current and future metadata enhancements
      • Supported day-to-day Ab Initio ETL processes that load custom metadata into MetaCenter
      • Educated employees on the MetaCenter tool to increase metadata knowledge and awareness
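
      The sensitive-data inventory above can be illustrated with a catalog scan. This is a minimal sketch against the Oracle data dictionary, not the actual process used; the column-name patterns are hypothetical, and a real inventory would combine name matching with data profiling (for example, in SAS DataFlux).

      ```sql
      -- Sketch: flag columns whose names suggest PII/PCI content (Oracle data dictionary).
      -- The LIKE patterns are hypothetical examples; name matching alone is not sufficient.
      SELECT owner, table_name, column_name
        FROM all_tab_columns
       WHERE UPPER(column_name) LIKE '%SSN%'
          OR UPPER(column_name) LIKE '%TAX_ID%'
          OR UPPER(column_name) LIKE '%CARD_NUM%'
          OR UPPER(column_name) LIKE '%BIRTH%'
       ORDER BY owner, table_name, column_name;
      ```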

    • Senior ETL Developer
      • Jul 2007 - Oct 2012

      Designed and developed ETL applications for Marketing, Risk, Banking, Student Loans, Collections, and other areas. Projects included CCS Collection/Inbound Acquisition, Statements, PitStop, ACAPS, Discover Bank, Discover Fraud, Disputes (Card Member), Student Loans, Marketing data into Teradata, Instant Credit, DPR (Document Platform Refresh), CMA, Acquisition Settlement, EDS rehosting, and Teradata upgrades.
      • Participated in more than 10 agile sprints; coded, tested, UAT'd, and installed several different projects within two-week timeframes
      • Designed and tested the Disaster Recovery plan for the CFR team, a yearly audit requirement; currently involved with the yearly Disaster Recovery table-top exercise, including maintaining the plans on the LDRPS website and completing the pre- and post-exercise work
      • Optimized ETL (Ab Initio) code; wrote precise, reusable ETL (Ab Initio) specifications and patterns to facilitate development; modified Ab Initio component parameters and utilized data parallelism to fine-tune execution times and improve overall performance
      • Reviewed requirements, designed ETL processes, reviewed data models for approval, managed the build phase, and coordinated job scheduling, scripting, and implementation
      • Completed the following proofs of concept: Expressor (ETL tool), Teradata Server 1650 evaluation, Aster database evaluation, in-database processing for the SAS team, and Temporal Data support (in progress)
      • Worked on various DBA activities such as creating tables and partitions and optimizing SQL queries, as well as creating a purge module for Operational Data Store tables older than 30 days (see the illustrative sketch below)
      • Performed SQL query optimization and performance tuning, which improved load times by 70%
      • Currently working as a Team Lead on the Ab Initio 3.0 upgrade
      • Participated in code walkthroughs and coding standards, strictly adhering to best practices in Ab Initio/ETL development
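
      The 30-day purge module mentioned above can be sketched as a simple date-bounded delete. This is an illustrative example only; ODS_TXN_STAGE and LOAD_DT are hypothetical names, the Oracle-style SYSDATE is an assumption, and a Teradata version would use CURRENT_DATE instead.

      ```sql
      -- Sketch: purge Operational Data Store rows older than 30 days.
      -- Table and column names are hypothetical; production purges are typically
      -- batched or partition-based to limit transaction size.
      DELETE FROM ods_txn_stage
       WHERE load_dt < SYSDATE - 30;

      COMMIT;
      ```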

    • Associate Programmer
      • Dec 2005 - Jun 2007

      • Upgraded Unica Campaign Management / Unica Manager from version 6.5 to 7.1 on an AIX server
      • Installed and configured the Unica Optimization tool, which replaced the Market Switch tool
      • Responsible for US/UK Marketing Data ETL, including loading the 10/100% databases, dropping and re-creating table indexes, and running data quality scripts on the data
      • Responsible for monthly ETL releases: adding new tables, fields, and data sources to the ETL, dropping columns, adding/dropping indexes, changing transformation rules, rolling up aggregate data, and normalizing data, including unit testing in dev
      • Mapped metadata from the legacy source system (SCMS) to target database tables; involved in loading fact, dimension, and aggregate tables
      • Worked on the CBGI UK data mart, developing and supporting back-end applications using Ab Initio to create various UK campaigns, loading them into the UK Oracle data mart using Ab Initio's SQL*Loader utility, and troubleshooting and supporting the daily application run
      • Achieved data parallelism using partition components such as Partition by Key, Partition by Expression, and Partition by Round Robin
      • Developed various Ab Initio graphs for data cleansing/quality using Ab Initio functions such as is_valid, is_error, string_substring, and string_concat
      • Performed SQL tuning on an Oracle 9i database, analyzing marketing SQL queries with the cost-based optimizer; tuning involved extensive use of the EXPLAIN PLAN command, avoiding unnecessary sorts, reducing the number of columns in SELECT statements, and using hints to reduce cost (see the illustrative sketch below); automated the load process using Maestro
      • Responsible for running weekly Statement jobs, including creating the CSV and control files, validating data, running Market Switch reports, and executing the mainframe jobs
      • Provided technical support and training in Market Switch
      • Participated in the pager rotation program for 24x7 production support of Ab Initio graphs
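
      The EXPLAIN PLAN-based tuning mentioned above follows the standard Oracle pattern sketched here; CAMPAIGN_FACT and its columns are hypothetical names used only for illustration.

      ```sql
      -- Sketch: capture and display the optimizer's plan for a query under tuning.
      EXPLAIN PLAN FOR
      SELECT campaign_id, COUNT(*)
        FROM campaign_fact
       WHERE campaign_dt >= DATE '2005-01-01'
       GROUP BY campaign_id;

      -- Inspect join order, access paths (index vs. full scan), and estimated cost.
      SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
      ```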

    • United States
    • Telecommunications
    • 700 & Above Employees
    • Data Warehouse Consultant
      • Nov 2004 - Sep 2005

      Responsible for day-to-day support of extracting data from major database sources, including Oracle, SQL Server 2000, DB2 UDB, CSV (comma-separated values) files, and mainframe, into the target Oracle 9i database. The databases were close to 5 terabytes in size. Performed transformations at the staging area, such as cleansing the data (correcting misspellings, dealing with missing elements, parsing into standard formats), combining data from multiple sources, de-duplicating the data, and assigning warehouse keys.
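
      A staging-area de-duplication step of the kind described above can be sketched with an analytic function. STG_CUSTOMER and its columns are hypothetical names; warehouse key assignment would typically follow via a sequence at load time.

      ```sql
      -- Sketch: keep only the most recent staging row per natural key before loading.
      SELECT source_cust_id, cust_name, extract_ts
        FROM (
              SELECT s.source_cust_id, s.cust_name, s.extract_ts,
                     ROW_NUMBER() OVER (PARTITION BY s.source_cust_id
                                        ORDER BY s.extract_ts DESC) AS rn
                FROM stg_customer s
             )
       WHERE rn = 1;
      ```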

    • Data Warehouse Jr. Consultant
      • Sep 2003 - Oct 2004

      • Helped Circata reduce disk storage and address future data redundancy problems by defining a single file format with multiple data types and lengths
      • Loaded CSV files into staging tables, then transformed and loaded data into warehouse tables using Ab Initio GDE, and automated the ETL process through scheduling and exception-handling routines
      • Responsible for performance monitoring and tuning of the databases and applications; also performed database administration duties such as table creation, synonyms, and space allocation
      • Performed evaluations and made recommendations to improve graph performance: minimizing the number of components in a graph, replacing old components with new ones, tuning the Max Core value, using Lookup components instead of joins for small tables, and filtering data at the beginning of a graph
      • Developed reports based on business requirements using Business Objects

Education

  • University of Wisconsin-Stevens Point
    Master's in Big Data Science
    2019 - 2021
  • Minnesota State University, Mankato
    Bachelor's, CIS and Management
    2000 - 2003
  • Bemidji State University
    BS, Computer Science
    1999 - 2000
  • Sheikh Khalifa School
    High School Diploma, Pre-Engineering
    1985 - 1998
