Alex Meadows, CBIP

Senior Data Engineer at Sorcero
Contact Information
us****@****om
(386) 825-5501
Location
Raleigh, North Carolina, United States
Languages
  • English Native or bilingual proficiency

Recommendations

5.0 / 5.0, based on 2 ratings

Snehal Desai

I worked with Alex for more than a year, and he is one of the best data engineers I have worked with. He is an expert in data pipelines, data modeling, and data architecture in the cloud. He has thorough knowledge of the tools available in the market and is instrumental in selecting the right tool for the right problem. I was also impressed by his Python skills when he worked on very complex rules-based functionality. Under his leadership, we improved our SDLC practices, DevOps practices, data pipelines, and data architecture, which ultimately improved the quality and go-to-market time of our products. He is very easy to work with, open to feedback, and always eager to improve the process for everyone. I enjoyed working with him, and I would highly recommend him for any senior or leadership position in the data engineering space.

Liza Mischel

I've had the pleasure of working with Alex on a number of projects, and he has always been a delight to collaborate with. He's an incredibly intelligent and talented data engineer, capable of bringing complex ideas to life. What impressed me most was his ability to make difficult concepts understandable to everyone at the table; it makes him invaluable both as an engineer and as someone who can help other engineers understand what they're building. I highly recommend Alex as a data engineer. He's extremely capable both technically and in terms of communication, which makes him an excellent asset to any team.


Credentials

  • Herald Level Dungeons and Dragons Game Master
    DCI
    Sep, 2011
    - Nov, 2024
  • Certified Business Intelligence Professional, Data Analysis and Design
    Institute for Certification of Computing Professionals
    Nov, 2018
    - Nov, 2024
  • Certified Business Intelligence Professional, Data Integration
    Institute for Certification of Computing Professionals
    Nov, 2018
    - Nov, 2024

Experience

    • United States
    • Software Development
    • 1 - 100 Employee
    • Senior Data Engineer
      • Jan 2023 - Present

      Tech lead of the data team, providing guidance and building a next-generation data lake.
      * Built an automation suite for loading the data lake following the data vault architecture
      * Designed and implemented a data lake integrating many different data sources

      Technologies leveraged:
      GCP: BigQuery, Composer, GCS
      Data Modeling: Hackolade
      Database: BigQuery, PostgreSQL
      DevOps: GitLab CI/CD, Docker
      Languages: Python, SQL, XML, YAML
      Testing/Quality Management: pytest, pre-commit, ruff
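The data vault loading pattern named above can be sketched in outline. This is a minimal, illustrative Python sketch, not the actual automation suite: the table names, column names, and the MD5-based hub key are assumptions, shown only to convey the idempotent hub-load idea.

```python
import hashlib


def hub_hash_key(business_key: str) -> str:
    """Derive a deterministic hash key for a data vault hub row.

    Data Vault 2.0 loads commonly hash a normalized business key so the
    same entity always maps to the same hub key across source systems.
    (MD5 and upper-casing are illustrative choices, not the team's spec.)
    """
    normalized = business_key.strip().upper()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


def hub_merge_sql(hub_table: str, staging_table: str) -> str:
    """Render a BigQuery MERGE that inserts only hub keys not yet present,
    which makes repeated loads of the same staging data idempotent."""
    return f"""
    MERGE `{hub_table}` AS hub
    USING (SELECT DISTINCT hub_hash_key, business_key, record_source, load_ts
           FROM `{staging_table}`) AS stg
    ON hub.hub_hash_key = stg.hub_hash_key
    WHEN NOT MATCHED THEN
      INSERT (hub_hash_key, business_key, record_source, load_ts)
      VALUES (stg.hub_hash_key, stg.business_key, stg.record_source, stg.load_ts)
    """.strip()
```

Because the hash is deterministic, loads from different sources converge on one hub row per business entity, and the MERGE guards against duplicates on re-runs.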

    • United States
    • Technology, Information and Internet
    • 1 - 100 Employee
    • Principal Data Engineer
      • Jan 2022 - Nov 2022

      * Helped standardize logging capabilities by introducing a custom-built logging library
      * Worked to automate the monitoring, measuring, and analysis of logs with tools like Grafana and Loki
      * Led the Data Engineering Chapter, promoting best practices
      * Built a custom rules engine for processing data with reactivex patterns (reactivex/pyrx, gevent), processing 1+ million records in 1.5 to 2 minutes
      * Wrote extensive documentation for the rules engine, including architecture and process flows
      * Identified memory and CPU optimizations in existing processes, reducing usage of both
      * Provided recommendations for improvements and best practices for data visualization within the application
      * Promoted Agile best practices across multiple teams
      * Coordinated feature interaction between UI and backend

      Technologies leveraged:
      AWS: EKS, ECR, Fargate, Batch, Lambda, S3
      Data Modeling: pgModeler
      Database: Snowflake, PostgreSQL
      DevOps: GitHub Actions, Docker
      Languages: Python, SQL, YAML
      Log Management: Grafana, Loki, CloudWatch
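The rules-engine pattern described above (an ordered rule chain applied to a stream of records) can be sketched with plain Python generators standing in for the reactivex observables the actual engine used; the rules and record fields here are invented for illustration.

```python
from typing import Callable, Iterable, Iterator

# A rule takes a record and returns a (possibly enriched) record.
Rule = Callable[[dict], dict]


def apply_rules(records: Iterable[dict], rules: list[Rule]) -> Iterator[dict]:
    """Stream records through an ordered rule chain, one record at a time.

    Generators keep memory usage flat regardless of input size, the same
    streaming property a reactivex-based pipeline relies on.
    """
    for record in records:
        for rule in rules:
            record = rule(record)
        yield record


# Hypothetical rules, for illustration only: normalize cents to dollars,
# then flag large amounts.
rules: list[Rule] = [
    lambda r: {**r, "amount": r["amount_cents"] / 100},
    lambda r: {**r, "flagged": r["amount"] > 100},
]

processed = list(apply_rules([{"amount_cents": 5000}, {"amount_cents": 25000}], rules))
```

Each rule sees the output of the previous one, so ordering matters; a real engine would add error handling and, as in the role above, concurrency (e.g. gevent) around the stream.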

    • United States
    • Farming
    • Lead Data Engineer
      • Mar 2020 - Jan 2022

      Building awesome applications that make it easier to manage farming.
      * Designed and built the backend for Shepherd, an application for managing farm work digitally
      * Built the GraphQL API used by our native mobile applications and web frontend (Graphene, Django)
      * Designed the data model to allow for nuanced flexibility and growth
      * Wired the Shepherd web application to leverage the GraphQL API
      * Designed and implemented an initial proof of concept of a universal semantic-based data commons, allowing farmers to view farm data from across their businesses as well as at a regional and national level
      * Integrated third-party ontologies with project-built ontological models (Protégé, pgModeler)
      * Designed/built data migrations for handling data from startups into the data commons (Tweak Street, Pentaho Data Integration)
      * Designed/built the initial REST API for external integrations with other startups/vendors for farmer data
      * Designed/built a multi-tiered data lake using data vault 2.0 methodology and AWS (EMR, Glue, Lambda)
      * Built a data migration engine with Tweak Street to migrate the Shepherd application from MongoDB to PostgreSQL
      * Built and managed build stacks and environments for Shepherd and Data Commons (AWS EKS, GitHub)

      Technologies leveraged:
      API: REST, GraphQL (Graphene)
      AWS: EKS, ECR, Fargate, RDS, Batch, Lambda, S3, EC2, EMR, Glue
      Data Modeling: pgModeler, Protégé
      Database: PostgreSQL, MongoDB, Hive, SQLAlchemy
      DevOps: GitHub Actions, Docker
      Languages: Python, SQL, YAML, OWL
      Log Management: CloudWatch
      Web: Django, Flask

  • TDWI Carolinas Chapter
    • Raleigh-Durham, North Carolina Area
    • Program Coordinator
      • Jul 2013 - Sep 2021

      Volunteer work with the local professional data community. TDWI is the leading technology-agnostic professional organization teaching best practices and showcasing new techniques through its chapter meetings and regional conferences.
      - Work with the community to find top-notch presenters for our meetings
      - Work with the board to plan, schedule, and run regular meetings

    • United States
    • Retail
    • 700 & Above Employee
    • Senior Data Engineer
      • Apr 2019 - Mar 2020

      * Developed a data lineage library to track data from source to target (Python-based)
      * Provided architecture recommendations and coached on best practices
      * Assisted in the usage/implementation of the Cloud9 cloud-based IDE

      Technologies leveraged:
      AWS: EKS, Batch, Lambda, RDS, Cloud9, S3
      Data Modeling: pgModeler
      Database: PostgreSQL, SQLAlchemy
      DevOps: GitHub Actions, Docker
      Languages: Python, SQL
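At its core, a source-to-target lineage library like the one mentioned above maintains a small dependency graph and answers "what feeds this column?" queries. This hypothetical sketch invents its own names and API; it is not the actual library.

```python
from collections import defaultdict


class LineageGraph:
    """Toy source-to-target lineage tracker (illustrative, not the real tool)."""

    def __init__(self) -> None:
        # Map each target column to the set of columns it is derived from.
        self._edges: defaultdict[str, set[str]] = defaultdict(set)

    def record(self, source: str, target: str) -> None:
        """Register that `target` is derived (in part) from `source`."""
        self._edges[target].add(source)

    def upstream(self, target: str) -> set[str]:
        """Return every column transitively upstream of `target`."""
        seen: set[str] = set()
        stack = [target]
        while stack:
            node = stack.pop()
            for src in self._edges.get(node, ()):
                if src not in seen:
                    seen.add(src)
                    stack.append(src)
        return seen


g = LineageGraph()
g.record("raw.orders.amount", "staging.orders.amount")
g.record("staging.orders.amount", "mart.sales.revenue")
```

With edges recorded at each transformation step, impact analysis becomes a graph traversal: `g.upstream("mart.sales.revenue")` walks back through staging to the raw source column.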

    • Switzerland
    • Farming
    • 700 & Above Employee
    • Lead Data Engineer/Data Modeller
      • Aug 2017 - Apr 2019

      * Trained the team on best practices and tools, ensuring proficiency
      * Integrated data from many different systems (relational, NoSQL, and files) to build a coherent tapestry of data for use in analytics, reporting, etc.
      * Served as team lead working to build a next-generation data platform
      * Established guidelines and fundamental architecture for cross-organization data integration
      * Changed the way data is utilized, one subject area at a time
      * Implemented the ETL stack and framework for data transformation
      * Iterated extensively on the previous framework to make it more robust
      * Wrote transformations required to integrate disparate data for various use cases
      * Integrated with Spark and other NoSQL technologies
      * Designed and helped implement a data integration methodology utilizing AWS S3, Apache Parquet/ORC, Pentaho Data Integration, and Redshift Spectrum

      Technologies leveraged:
      AWS: EKS, ECR, Fargate, RDS, Batch, Lambda, S3, EC2, EMR, Glue, Redshift, CloudWatch
      Data Modeling: pgModeler, Protégé, erwin
      Data Integration: Pentaho Data Integration, PySpark
      Database: PostgreSQL, SQL Server, Oracle, Hive, Parquet, ORC, OpenLink Virtuoso, Redshift
      Languages: Python, SQL, YAML, OWL, XML

    • United States
    • Higher Education
    • 700 & Above Employee
    • Adjunct Professor, Business Analytics
      • Dec 2015 - Mar 2019

      Designing course material and teaching courses to students.

    • Business Analytics Department Advisory Committee Member
      • Jul 2013 - Dec 2017

      - Advise Wake Tech on recommended skillsets, applications, software, and techniques for students

    • United States
    • Information Technology & Services
    • 1 - 100 Employee
    • Principal Consultant
      • Aug 2014 - Jul 2017

      I worked with an amazing team to help meet customer deliverables, drawing on my full skillset, from team building and teaching agile to architecture and application development. My primary client was Syngenta.
      * Implemented graph, semantic web, and other NoSQL technologies to track plant lineage and represent it visually
      * Built ontological models translating relational data into triples
      * Built ETL/data integration processes for loading data into NoSQL solutions
      * Provided guidance on data visualization best practices
      * Coached teams on agile and continuous integration best practices
      * Provided technical leadership on standards and program direction

      Technologies leveraged:
      AWS: RDS, Batch, Lambda, S3, EC2, CloudWatch
      Data Modeling: pgModeler, Protégé, erwin
      Data Integration: Pentaho Data Integration
      Database: PostgreSQL, SQL Server, Oracle, Parquet, ORC, OpenLink Virtuoso, Redshift
      Languages: Python, SQL, YAML, OWL, XML

    • United States
    • Software Development
    • 700 & Above Employee
    • Senior Business Intelligence Engineer
      • Jun 2012 - Jul 2014

      * Built an open source automated data profiling tool utilizing Pentaho Data Integration and DataCleaner (https://github.com/dbaAlex/PDI-DC-Auto-Profiling)
      * Architected Test-Driven Development and Continuous Integration methodologies for the Business Intelligence space
      * Architected automation, unit testing, and data quality tools and frameworks
      * Designed standards for Data Vault, data mart, and staging warehousing methodology
      * Developed and architected a multi-tiered data warehouse, utilizing both Data Vault and Kimball methodologies
      * Implemented a new ETL stack
      * Promoted open source software throughout the Business Intelligence infrastructure
      * Promoted Agile methodologies and their use in Business Intelligence
      * Established best practices and standards
      * Created Qualification Protocol and User Acceptance Testing Protocol documentation
      * Built an open source data integration testing tool (https://github.com/OpenDataAlex/etlTest)

      Technologies leveraged:
      AWS: RDS, EC2, EMR, Redshift, CloudWatch
      Data Modeling: SQL Power Architect
      Data Integration: Pentaho Data Integration, Business Objects Data Services
      Database: PostgreSQL, Oracle, MySQL, Redshift, Mondrian (OLAP)
      Languages: Python, SQL, YAML, XML

    • Pentaho Community Leader
      • Jun 2011 - Mar 2014

      I keep my ear to the ground, staying abreast of the Pentaho community at large and bringing the local group up to speed on the latest comings and goings via our meetings as well as our LinkedIn and Meetup groups.

    • United States
    • IT Services and IT Consulting
    • 1 - 100 Employee
    • Senior Business Analytics Consultant
      • Mar 2012 - Jun 2012

      * Pentaho Practice Manager: provided up-to-date information about the Pentaho tool set, developed and discovered best practices, and developed, coordinated, and led BI projects utilizing the Pentaho suite
      * Worked directly with customers to build customized solutions and train them on best practices
      * Developed a proof of concept for a new SaaS offering by Datamensional, LLC

      Technologies leveraged:
      Data Integration: Pentaho Data Integration
      Database: PostgreSQL, Oracle, MySQL

    • IT Services and IT Consulting
    • 1 - 100 Employee
    • Business Intelligence Engineer
      • Mar 2011 - Mar 2012

      * Developed and architected a multi-tiered data warehouse, utilizing both Data Vault and Kimball Conformed Dimensions
      * Established a Business Intelligence program available to the entire organization (from Account Managers to Technology)
      * Developed and maintained a data integration program utilizing Extract, Transform, and Load (ETL) and Enterprise Application Integration (EAI) methodologies
      * Reduced load on the production environment by moving reporting needs into the data warehouse/data marts
      * Established best practices for data analytics

      Technologies leveraged:
      Data Modeling: MySQL Workbench
      Data Integration: Pentaho Data Integration
      Database: PostgreSQL, MySQL, Mondrian (OLAP)
      Languages: SQL

    • Data Warehouse Engineer
      • Nov 2010 - Mar 2011

      * Developed standards for the data warehouse, ETL, and other aspects of BI
      * Developed the data warehouse using the Data Vault methodology, coupled with data marts
      * Developed ETL code using Kettle

      Technologies leveraged:
      Data Modeling: MySQL Workbench
      Data Integration: Pentaho Data Integration
      Database: PostgreSQL, MySQL, Mondrian (OLAP)
      Languages: SQL

    • United States
    • Plastics Manufacturing
    • 700 & Above Employee
    • Database and Project Administrator
      • Jun 2006 - Nov 2010

      * Built and deployed business intelligence solutions for the entire manufacturing facility
      * Assisted in the development and deployment of a real-time monitoring system that tracked over 100 pieces of equipment, refreshing data every 6 seconds
      * Wrote software to streamline maintenance processes and reduce paperwork, including documentation for maintenance of injection molds and QA monitoring of parts
      * Built a data warehouse to support reporting and data analytics leveraging Kimball Conformed Dimensional modeling, integrating data from the real-time monitoring system and other internal systems
      * Developed and maintained a data integration program utilizing Extract, Transform, and Load (ETL) methodologies
      * Performed software validation on incoming software used in the manufacturing and quality assurance processes
      * Served as ISO Internal Auditor

      Technologies leveraged:
      Business Intelligence: Pentaho BI Suite
      Database: MySQL
      Data Integration: Pentaho Data Integration
      Documentation Management: Lotus Notes 7, SharePoint
      Languages: PHP, SQL, JavaScript
      Web Frameworks: Symfony

Education

  • Saint Joseph's University
    MS, Business Intelligence
    2009 - 2010
  • Chowan University
    BS, Business Administration
    2002 - 2006
