Justin Weible
Data Engineer at Cognixia
Experience
Data Engineer
Cognixia · India · Professional Training and Coaching · 200-300 employees
Dec 2021 - Jun 2023
Basking Ridge, New Jersey, United States

Cognixia’s JUMP program is a hyper-intense technical training program that gives top STEM talent from across the U.S. the equivalent of 12-18 months of industry experience. Participants not only up-skill their technical abilities but also evolve their digital mindset so they can adapt to and use technology efficiently, JUMP-starting their careers through deployment with Collabera clients.

• Python Modules and File Handling - Developed Python modules for custom use cases and read and wrote data to files on the system
• Data Science with pandas and NumPy - Imported CSVs into DataFrames with the pandas package and analyzed the data with the NumPy package (see the sketch below)
• Data Visualization with Matplotlib - Displayed data as charts and graphs in Jupyter Notebook using pyplot from the matplotlib package
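As a rough illustration of the pandas/NumPy/matplotlib workflow above: load a CSV into a DataFrame, compute a summary statistic, and plot a chart. The file name and column names (sales.csv, month, revenue) are hypothetical placeholders, not details from the profile.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Read a CSV into a pandas DataFrame (file and columns are placeholders).
df = pd.read_csv("sales.csv")

# Analyze the data with NumPy.
mean_revenue = np.mean(df["revenue"])
print(f"Mean revenue: {mean_revenue:.2f}")

# Display the data as a chart, as one would in a Jupyter notebook.
plt.plot(df["month"], df["revenue"])
plt.xlabel("month")
plt.ylabel("revenue")
plt.title("Monthly revenue")
plt.show()
```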
Data Engineer
Capital One · United States · Financial Services · 700+ employees
Mar 2022 - Apr 2023
- Created and managed ETL data pipelines for multiple datasets.
- Created a standardized template for both a notebook to execute the business logic and a Python script to execute the notebook code on an AWS EMR cluster using PySpark.
- Built DAGs using Apache Airflow to schedule each pipeline to run at a specific time (see the sketch below).
- Migrated several automation processes created in Automation Anywhere from virtual desktop interfaces to AWS EC2 instances.
- Updated code to upload process logs to Splunk and created Splunk dashboards to track the logs and collect metrics.
- Tested bots to ensure that they could run end-to-end on EC2 and created firewall requests when necessary.
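A minimal sketch of the kind of Airflow DAG described above, assuming a daily schedule. The DAG id, owner, schedule, and script path are hypothetical; a real EMR deployment might use Airflow's EMR operators rather than calling spark-submit through a BashOperator.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",  # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_etl_pipeline",  # hypothetical pipeline name
    default_args=default_args,
    start_date=datetime(2022, 3, 1),
    schedule_interval="0 6 * * *",  # run the pipeline daily at 06:00
    catchup=False,
) as dag:
    # Submits the PySpark driver script; on EMR this step could instead
    # be added to the cluster with an operator such as EmrAddStepsOperator.
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit /opt/pipelines/example_etl.py",
    )
```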
Offline Senior Captioner II
VITAC · Broadcast Media Production and Distribution · 200-300 employees
Oct 2011 - Jan 2021
Writer
2009 - 2018
United Kingdom
Tour Guide
The Western Pennsylvania Conservancy · United States · Environmental Services · 1-100 employees
Jun 2001 - Oct 2011
Mill Run, Pennsylvania, United States
Education
Flatiron School
Data Science
California University of Pennsylvania
Bachelor of Arts - BA, English with a Journalism Concentration
Westmoreland County Community College
Associate of Arts - AA, Mass Communication/Media Studies