Parker H.

Senior Data Engineer at Elation Health
Contact Information
Location
Salt Lake City Metropolitan Area



Experience

    • United States
    • Hospitals and Health Care
    • 100 - 200 Employee
    • Senior Data Engineer
      • Feb 2022 - Present

      Elation Health brings the right information, at the right time, to support the craft of independent primary care. Elation's certified, cloud-based EHR system connects patients and providers in a dynamic health information network, enabling collaboration on mutual patients at the point of care. Elation's "Clinical First" design is trusted by a network of over 14,000 clinicians within independent practices and other innovative care models. Elation Health allows physicians to deliver extraordinary patient care and revolutionize medicine.

    • Pakistan
    • Advertising Services
    • Senior Data Engineer
      • Dec 2020 - Feb 2022

      • Architected and delivered solutions for an Enterprise Data Warehouse and data streaming pipelines.
      • Set and delivered on short- and medium-term strategic goals for the organization.
      Key Results
      • Data Engineering – Stood up an Enterprise Data Warehouse in Snowflake using the Kimball dimensional modeling method. Introduced the storage of trending data in the EDW, ingested new data sources, and gave analysts and data scientists new areas in which to discover revenue streams.
      • Innovation – Led the effort to rearchitect how source systems are ingested into the EDW, achieving near real-time ingestion after implementing Confluent Kafka.
      • Streaming – Introduced Snowflake Snowpipe, Confluent Kafka Connect, and Avro schemas to increase the rate of ingestion.
      Accomplishments
      • Architected the Snowflake EDW solution using the Kimball star schema method
      • Implemented Change Data Capture in Azure SQL to ingest data 90% faster
      • Architected streaming pipelines using Debezium, Kafka, and Snowflake streaming tables
      • Increased the speed of Power BI reports by moving them to DirectQuery against Snowflake

    • United States
    • Insurance
    • 700 & Above Employee
    • Lead Data Lake Engineer
      • Dec 2019 - Nov 2020

      • Architected and provided solutions for Data Lake, Machine Learning, and Enterprise Data Warehouse platforms.
      • Data Engineering – Matured and successfully implemented a Data Lake in Azure. Established a true partner relationship with the business to understand their needs, then served in a consultative role, sharing information and architectural considerations to help them make well-informed decisions. Ingested new data sources and gave analysts and data scientists an area in which to discover new revenue streams, high-cost claimants, and call center FAQs.
      • Innovation – Led the effort to rearchitect the Data Lake; the new solution ingested data 90% faster and cost 30% less.
      • Intelligent Automation – Introduced Azure Resource Manager scripts on the HDInsight cluster to scale down automatically to reduce cost and scale up when processing increased.
      • Architected a new machine learning pipeline for data scientists using Azure ML, PySpark, Hive, and a de-identified Data Lake
      • Designed and implemented new data pipelines using Talend, Airflow, Azure ADLS Gen2 storage, and Apache Hive
      • Aggregated and analyzed healthcare data to bring new insights to the marketing and business operations teams
      • Implemented Change Data Capture in Talend processes to ingest data in half the time

    • United States
    • Software Development
    • 700 & Above Employee
    • Big Data Engineer
      • Apr 2017 - Dec 2019

      As a Data Engineer, my responsibilities included developing and maintaining big data pipelines; teaching team members new technologies and techniques; pulling batch and streaming data through SSIS, Spark, custom Python, and other associated technologies into our AWS data warehouse; and architecting data structures that let our customers make business decisions based on the data provided. Delivered completed, tested, and validated code to stakeholders through Agile development, and managed stakeholder expectations with routine, detailed updates while keeping projects on schedule for delivery.

    • United States
    • Construction
    • 100 - 200 Employee
    • Information Technology Specialist
      • Feb 2015 - Apr 2017

      Managed and trained 100+ employees on new software. Maintained network uptime to increase production and accuracy in reported numbers by providing hot spots to satellite locations. Developed dashboards in Power BI to give insight into company metrics and provide decision makers daily updates on data that impacted day-to-day decisions. Developed and implemented a data integration system to move data from a SQL database into the HeavyJob application, providing our engineers with real-time data to better serve them when bidding jobs. Developed online applications using SharePoint, HTML, CSS, and JavaScript to increase company productivity and monitor company equipment. Developed C# programs to eliminate user input errors when reporting data to Viewpoint.

Education

  • Utah Valley University
    Information Technology
    2010 - 2015
