Allan Clements
Data Engineering Team Lead at Phylum
Experience
- Phylum
- United States
- Software Development
- 1 - 100 Employees
- Data Engineering Team Lead
- Feb 2021 - Present
Data Engineering Team Lead: June 2022 - Present
Senior Data Engineer: February 2021 - June 2022
Technologies: Rust, JanusGraph, HBase, Scala
- Business Owner
- Jan 2020 - Dec 2021
- 24/7 autonomous trading, May 2020 until December 2021
- Completely written in Java using Dagger2 as a compile-time dependency injection framework
- All on-premise hardware, requiring Linux administration expertise
- 6 months of trading produced over $100k of trading volume with only $1k of principal
- Interprocess communication via Kafka to isolate operational responsibilities, such as market data archival and the trading system's consumption of that market data (see the sketch after this list)
- Infrastructure: Kafka, Elasticsearch, Kibana, Logstash, Jenkins, TimescaleDB (PostgreSQL) running on a RAID array formatted with ZFS, Logtrail, Elastalert, and Grafana
- Java frameworks: Dagger2, Jackson, Mockito, JUnit 5, TestContainers, Micrometer, Logback, Retrofit/OkHttp, Protobuf, Jetty WebSocket, Jdbi 3, Kafka, Owner, and JMX & Flight Recorder for real-time JVM monitoring
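A minimal sketch of the Kafka-based process isolation described above: one process archives market data to a topic while the trading process consumes it independently, so a failure in one responsibility does not take down the other. The topic name, broker address, serializers, and consumer group here are illustrative assumptions, not the original system's configuration.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class MarketDataIpc {
    // Hypothetical topic name; the real system's topics are not documented.
    static final String TOPIC = "market-data";

    // Archival process: publishes raw market data messages to Kafka.
    static void publish(String key, String payload) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>(TOPIC, key, payload));
        }
    }

    // Trading process: consumes the same stream in a separate process.
    static void consume() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "trading-system"); // assumed consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of(TOPIC));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand off to trading logic here.
                    System.out.println(record.key() + " -> " + record.value());
                }
            }
        }
    }
}
```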
- Sonatype
- United States
- Software Development
- 400 - 500 Employees
- Tech Lead & Interview Committee
- Nov 2017 - Feb 2021
I designed and implemented a new storage solution that grew to over 1 billion rows and unified two former data silos. This improved security data analysis from over 36 hours to under 20 minutes, a 108x improvement, allowing the business to deliver security vulnerability notifications to customers significantly faster.
While working on this I created an in-house tool, "Liquibase for HBase", to automate our HBase table creation and configuration. This opened an easy path to integration testing, since we could programmatically recreate our production HBase setup inside Docker containers with trivial effort, and we likewise reused the tool in our infrastructure as code for schema updates to the production HBase environment (a sketch of the idea appears at the end of this entry).
- Programming Languages: Primarily Java, plus domain-specific use of TypeScript and Scala
- AWS Technologies: Fargate (Docker), EMR, ECS, SNS, SQS, CloudFormation, S3, Beanstalk, and Redshift
- Hadoop Ecosystem: Hadoop DFS, HBase, and Spark
- CI/CD: Jenkins
- Misc: Docker, Liquibase, Maven
Tech Lead, October 2019 - February 2021
- Team of 4 other developers
- PM provided feature requests and I oversaw implementation to meet the company roadmap
- Surfaced to the business the technical areas of improvement and technical debt becoming untenable
- Daily duties: operating team standup and assisting the team in design, implementation, debugging, deployment, and production outage triage, in addition to my own development work items
Interview Committee, April 2019 - February 2021
Senior Software Developer, February 2019 - October 2019
Software Developer, November 2017 - February 2019
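The internal tool's design is not documented here, but a minimal sketch of a "Liquibase for HBase" style changeset might look like the following, using the HBase 2.x Java client. The table and column family names are illustrative assumptions; the point is that the same idempotent schema code can run against a Docker-based test cluster or production.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.ColumnFamilyDescriptorBuilder;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.TableDescriptorBuilder;

public class HBaseChangeSet {
    // Applies a declarative "changeset": creates the table only if missing,
    // so repeated runs are safe against any environment.
    public static void apply(Connection connection) throws IOException {
        TableName table = TableName.valueOf("vulnerability_data"); // hypothetical name
        try (Admin admin = connection.getAdmin()) {
            if (!admin.tableExists(table)) {
                admin.createTable(
                    TableDescriptorBuilder.newBuilder(table)
                        .setColumnFamily(ColumnFamilyDescriptorBuilder.of("d"))
                        .build());
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            apply(connection);
        }
    }
}
```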
- Tradebot
- United States
- Financial Services
- 1 - 100 Employees
- Software Developer
- Jan 2015 - Nov 2017
I took ownership of the real-time trading statistics pipeline and enhanced its code clarity and performance.
- New statistics could be expressed in a single line of code and could automatically generate tests (see the sketch after this entry)
- The new engine improved client application latency from seconds to milliseconds, making query turnaround time almost imperceptible despite a 3x increase in data
- Implemented as a distributed application utilizing 750 GB of memory and over 300 CPU cores
The trading statistics pipeline is a combination of a C++ application and Java applications working across multiple servers to distribute load. The pipeline incorporates Kafka for data consumption and HBase for data storage.
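The actual engine's API is not documented here; the following is only a sketch of how a statistic could be expressible in one line, assuming a small declarative registry whose definitions carry enough metadata to auto-generate a basic test. All names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.DoubleBinaryOperator;

// Hypothetical statistic registry: each statistic is declared in one line,
// and the declaration itself drives an auto-generated sanity test.
public final class Stats {
    public record Statistic(String name, double identity, DoubleBinaryOperator combine) {}

    private static final List<Statistic> REGISTRY = new ArrayList<>();

    public static Statistic define(String name, double identity, DoubleBinaryOperator combine) {
        Statistic s = new Statistic(name, identity, combine);
        REGISTRY.add(s);
        return s;
    }

    // One-line statistic definitions:
    public static final Statistic VOLUME = define("volume", 0.0, Double::sum);
    public static final Statistic MAX_PRICE = define("maxPrice", Double.NEGATIVE_INFINITY, Math::max);

    // Auto-generated check derived from each definition: combining the
    // identity element with any value must leave that value unchanged.
    public static void selfTest() {
        for (Statistic s : REGISTRY) {
            double v = 42.0;
            assert s.combine().applyAsDouble(s.identity(), v) == v
                : s.name() + " identity law violated";
        }
    }
}
```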
- IBM
- United States
- IT Services and IT Consulting
- 700 & Above Employees
- Extreme Blue Intern
- May 2014 - Aug 2014
Member of the Cognitive Cloud team, which worked on two projects, one of which produced a patent: https://patents.google.com/patent/US9734036
The first project utilized Watson technology to develop a sports Q&A system. The second developed an automated system to optimize IBM's Events Infrastructure server configurations. We used Ruby for supporting script automation in both projects.
The second project was built around a genetic algorithm written in ECJ. The algorithm was tested using Tsung for stress/success testing within a Chef environment managed by Vagrant. The team operated on the Scrum methodology using Rational Team Concert. A sketch of the general approach appears below.
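The ECJ internals are not documented here, so the following is only a generic sketch of the genetic-algorithm loop such a configuration optimizer might use, not the actual implementation. The fitness function, population size, and mutation rate are all illustrative assumptions; in the real project, fitness came from load testing server configurations.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.Random;

// Generic genetic-algorithm loop (plain Java, not ECJ): evolves a vector
// of normalized server configuration parameters toward a higher score.
public class ConfigOptimizer {
    static final Random RNG = new Random(42);
    static final int POP = 20, GENS = 50, PARAMS = 4; // illustrative sizes

    // Hypothetical fitness: the real system would instead run a load test
    // (e.g. via Tsung) against a server using these parameter values.
    static double fitness(double[] cfg) {
        return -Arrays.stream(cfg).map(x -> (x - 0.5) * (x - 0.5)).sum();
    }

    // Mutation: nudge one randomly chosen parameter, clamped to [0, 1].
    static double[] mutate(double[] parent) {
        double[] child = parent.clone();
        int i = RNG.nextInt(PARAMS);
        child[i] = Math.min(1.0, Math.max(0.0, child[i] + RNG.nextGaussian() * 0.1));
        return child;
    }

    public static void main(String[] args) {
        double[][] pop = new double[POP][PARAMS];
        for (double[] ind : pop) Arrays.setAll(ind, i -> RNG.nextDouble());
        for (int g = 0; g < GENS; g++) {
            // Keep the fitter half, replace the rest with mutated copies.
            Arrays.sort(pop, Comparator.comparingDouble(ConfigOptimizer::fitness).reversed());
            for (int i = POP / 2; i < POP; i++) pop[i] = mutate(pop[i - POP / 2]);
        }
        System.out.println("best config: " + Arrays.toString(pop[0]));
    }
}
```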
- Lexmark
- United States
- Information Technology & Services
- 700 & Above Employees
- Remote Software Engineering Intern
- Oct 2013 - May 2014
Created a code performance profiler for plugins to the Perceptive Integration Framework. It tracked the performance of all plugins in the framework, even third-party ones. Standard performance measurements were gathered at framework wrapper hooks, so all plugins were automatically profiled without any additional code (a sketch of the wrapper-hook idea follows this entry). The API also provided a means for any developer to track custom code timings, and those custom statistics were included in the same report.
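The framework's actual hook API is not documented here; below is a minimal sketch of the wrapper-hook idea using a JDK dynamic proxy, with a hypothetical Plugin interface standing in for the framework's real one. Every call crossing the framework boundary is timed without the plugin adding any code.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class PluginProfiler {
    // Hypothetical plugin interface standing in for the framework's real one.
    public interface Plugin {
        void execute(String input);
    }

    // Wraps any Plugin so every method call is timed at the framework boundary.
    public static Plugin profiled(Plugin target, String name) {
        InvocationHandler handler = (proxy, method, args) -> {
            long start = System.nanoTime();
            try {
                return method.invoke(target, args);
            } finally {
                long micros = (System.nanoTime() - start) / 1_000;
                // A real profiler would aggregate into a report instead of printing.
                System.out.printf("%s.%s took %d us%n", name, method.getName(), micros);
            }
        };
        return (Plugin) Proxy.newProxyInstance(
            Plugin.class.getClassLoader(), new Class<?>[] {Plugin.class}, handler);
    }

    public static void main(String[] args) {
        Plugin thirdParty = input -> System.out.println("processing " + input);
        Plugin wrapped = profiled(thirdParty, "thirdPartyPlugin");
        wrapped.execute("sample"); // timing recorded automatically
    }
}
```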
- Lexmark
- United States
- Information Technology & Services
- 700 & Above Employees
- R&D Software Engineering Intern
- Jun 2013 - Aug 2013
Worked alongside full-time software engineers on the Integrated Product Architecture (IPA) team, tasked with development of the Perceptive Integration Framework (PIF). PIF was an interprocess communication layer based on the OSGi specification for Perceptive Software's applications; it also functioned as a connection point for third-party plugins into the PIF environment. A sketch of the underlying OSGi service pattern follows this entry.
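PIF's own API is not documented here; the following is a minimal sketch of the standard OSGi pattern it builds on, registering a service in a bundle activator so other bundles (including third-party plugins) can discover it by interface. The GreetingService interface is a hypothetical placeholder.

```java
import java.util.Hashtable;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

public class Activator implements BundleActivator {
    // Hypothetical service interface standing in for a real PIF service.
    public interface GreetingService {
        String greet(String name);
    }

    private ServiceRegistration<GreetingService> registration;

    @Override
    public void start(BundleContext context) {
        GreetingService service = name -> "Hello, " + name;
        // Publish the service; other bundles can look it up by interface.
        registration = context.registerService(GreetingService.class, service, new Hashtable<>());
    }

    @Override
    public void stop(BundleContext context) {
        registration.unregister(); // clean up when the bundle stops
    }
}
```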
- Adtran
- United States
- Telecommunications
- 700 & Above Employees
- Enterprise Technical Support Co-op
- Jan 2013 - May 2013
Independently deployed and troubleshot enterprise-level computer networks. Solutions occasionally required working alongside ISPs such as Verizon and AT&T and working confidently within the customer's network. Customers ranged from small home businesses to the Embassy of Switzerland and the U.S. Army, on issues ranging from deployment to site downs.
Education
- University of Missouri-Rolla
- Bachelor of Science (BS), Computer Engineering