Daniele Rea
Software-Data Engineer, Data Architect, AWS Solution Architect, Senior Technical Team Leader at Gunpowder
-
English
Bio
Fabio Piccolo
Daniele is a very technically skilled and talented consultant. He is able to manage activities both autonomously and as part of a team. He is very proficient in Teradata script creation and process management, and he is experienced in interfacing with the customer. When he is given a task, you can be sure he handles it in the best way possible. I recommend him for any project in the Teradata domain as a senior technical consultant and/or technical team leader.
Credentials
-
Einstein Analytics and Discovery Consultant
Salesforce, Oct 2020 - Nov 2024
-
Teradata 14 Certified Professional
Teradata, Jun 2019 - Nov 2024
-
AWS Certified Security – Specialty
Amazon Web Services (AWS), May 2021 - Nov 2024
-
AWS Certified Solutions Architect – Associate
Amazon Web Services (AWS), Jan 2021 - Nov 2024
-
TERADATA BASICS
-
TERADATA SQL SPECIALIST
-
Experience
-
Gunpowder
-
Italy
-
IT Services and IT Consulting
-
1 - 100 Employee
-
Software-Data Engineer, Data Architect, AWS Solution Architect, Senior Technical Team Leader
-
Oct 2019 - Present
R&D activities: research and study of innovative methodologies for integrating big data frameworks with Salesforce Einstein Analytics datasets (HDFS environment integration using Spark). Technologies used: Einstein Analytics, Spark, HDFS.
Salesforce-One Key integration: responsible for the analysis and integration of a software product whose aim was to deliver an online platform to public institutions and local administrations. Technologies used: Apex, triggers, schedulable classes, batch classes. Development environment: Object Manager, Salesforce Developer Console, Visual Studio Code.
Personal experience: I have been involved in the cloud environment, specifically Amazon Web Services (AWS), for over a year. On January 12, 2021, I obtained the AWS Solutions Architect Associate certification, and I am working on use cases based on importing data in real time or near real time with Kinesis and then archiving it securely and permanently on S3 in the Parquet data format. This data is made available to a big data (EMR) cluster, which allows tools such as Spark, Impala, Hive, and Pig to operate on the data with the latest technologies. Once the desired result has been obtained, the data is saved back to S3 in Parquet format and the cluster is terminated automatically, so as to get the most out of the Cloudera environment, with HDFS and the Hadoop framework, at minimum cost and with the option of automatically resizing the cluster according to the resources required; the cluster can be deployed either with web access or privately within the AWS VPC alone. Furthermore, using AWS QuickSight it is possible to offer the end user a graphical view, integrating AWS Glue and Athena to make the data available to business intelligence (BI) tools for generating reports.
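The S3-to-EMR portion of that pipeline can be sketched as a small PySpark job like the one below. It is a minimal illustration only: the bucket paths, the event_ts column, and the daily partitioning scheme are assumptions, not details taken from the actual project.

```python
# Minimal sketch of an EMR step that converts raw Kinesis/Firehose landings on S3
# into partitioned Parquet; all paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

RAW_PATH = "s3://example-raw-bucket/kinesis-landing/"         # raw JSON landed by Firehose
CURATED_PATH = "s3://example-curated-bucket/events-parquet/"  # curated Parquet layer

spark = SparkSession.builder.appName("kinesis-landing-to-parquet").getOrCreate()

raw = spark.read.json(RAW_PATH)

# Derive a partition column and drop exact duplicates produced by at-least-once delivery.
curated = (raw
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates())

# Parquet output partitioned by date, so Glue/Athena and later EMR jobs can prune partitions.
(curated.write
        .mode("append")
        .partitionBy("event_date")
        .parquet(CURATED_PATH))

spark.stop()
```

After the job completes, the transient EMR cluster can be shut down automatically (for example via a step with auto-termination enabled), which matches the cost-saving behaviour described above.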
-
Accenture
-
Ireland
-
Business Consulting and Services
-
700 & Above Employee
-
Big Data, Software-Data Engineer, Senior Technical Team Leader, Data Architect
-
Aug 2017 - Oct 2019
I took care of project management, technical solution design and implementation, support, and pre-sales activities for the customer Wind-3 on the eQuA NTW project. I managed the Cloudera big data environment; with my team we addressed the problems arising from the design specifications on the Impala and Spark side, in order to mitigate or solve them.
Big data: Cloudera, Impala, Spark, ...; GCP; GNU/Linux cluster; OBIEE.
-
Customer Reference, Senior Technical Team Leader, Data Architect, TERADATA-ASTER-HADOOP, POSTE ITALIANE
-
May 2016 - Aug 2017
I took care of project management, technical solution design and implementation, support, and pre-sales activities. I started the big data project and remained in contact with the customer; with my team we addressed the problems arising from the design specifications on the Hadoop and Aster side, in order to mitigate or solve them. I was the UDA (Unified Data Architecture) technical team leader and data architect in the Poste Italiane environment. I performed studies and optimizations of the Teradata queries that were causing excessive delays in the application chain.
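A query study of that kind typically starts from the optimizer plan and the statistics on the join columns. The sketch below is purely illustrative and assumes the teradatasql DB-API driver; the database, table, and column names are invented.

```python
# Illustrative query-study workflow on Teradata; names and credentials are placeholders.
import teradatasql

SLOW_QUERY = """
SELECT c.customer_id, SUM(t.amount)
FROM   dwh.customer c
JOIN   dwh.transactions t ON t.customer_id = c.customer_id
GROUP  BY c.customer_id
"""

with teradatasql.connect(host="tdhost", user="analyst", password="***") as con:
    with con.cursor() as cur:
        # Inspect the optimizer plan to spot product joins, full-table scans,
        # or skewed redistributions before changing any DDL.
        cur.execute("EXPLAIN " + SLOW_QUERY)
        for (step,) in cur.fetchall():
            print(step)

        # Fresh statistics on the join column often fix poor row-count estimates.
        cur.execute("COLLECT STATISTICS COLUMN (customer_id) ON dwh.transactions")
```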
-
Teradata
-
United States
-
Software Development
-
700 & Above Employee
-
Senior Professional Service - Technical Consultant, Technical Lead, POSTE ITALIANE
-
Nov 2015 - Mar 2016
I developed and optimized projects, the related facilities, and query execution in the development environment of Poste Italiane (the Italian Post Office), interfacing directly with the customer. In the production environment I applied the query enhancements, making structural changes to the table DDL so that the Customer Card and NFEA queries respond very quickly.
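Structural DDL changes of that kind usually mean choosing a better primary index and partitioning scheme for the hot tables. The example below is a hypothetical sketch (table, column, and connection details are invented), not the actual Poste Italiane DDL, and it assumes the teradatasql driver.

```python
# Hypothetical Teradata DDL change: redistribute on the join key and partition by date.
import teradatasql

NEW_TABLE_DDL = """
CREATE TABLE dwh.customer_card_new
(
    customer_id BIGINT  NOT NULL,
    card_id     BIGINT  NOT NULL,
    event_dt    DATE    NOT NULL,
    status_cd   CHAR(2)
)
PRIMARY INDEX (customer_id)                       -- drives AMP distribution and join locality
PARTITION BY RANGE_N(event_dt BETWEEN DATE '2015-01-01'
                     AND DATE '2016-12-31' EACH INTERVAL '1' MONTH)
"""

with teradatasql.connect(host="tdhost", user="dbadmin", password="***") as con:
    with con.cursor() as cur:
        cur.execute(NEW_TABLE_DDL)
        # Repopulate from the old structure; queries filtering on event_dt now benefit
        # from partition elimination, and joins on customer_id avoid redistribution.
        cur.execute("INSERT INTO dwh.customer_card_new SELECT * FROM dwh.customer_card")
```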
-
-
Senior Professional Service - Technical Consultant, SOGEI (Italy, Rome)
-
Oct 2015 - Nov 2015
I developed and optimized projects, performing query tuning in the Sogei environment while interfacing directly with the customer.
-
-
Senior Professional Service - Technical Consultant, UDA Data Architect, WIND (Italy, Rome)
-
Jul 2014 - Oct 2015
I developed and delivered a project for online backup of Campaign Manager (CIM) data, interfacing directly with the customer. In the production environment I applied the query enhancements, making structural changes to the table DDL so that the Customer Card queries respond very quickly. I was a data architect on the first big data team in Italy for the UDA architecture at the WIND customer, with a high level of communication with the customer about the basic requirements for proper usability of the project and its correctness.
-
-
Professional Service - Technical Consultant, UNICREDIT (Italy, Milan - Rome)
-
Feb 2014 - Jun 2014
I participated in the tests for the upgrade project from Teradata 13.10 to 14.10, studying the use-case results as the referent for the customer on the upgrade activities. I also participated in the project for the creation of optimized SQL views based on the third normal form.
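As a rough illustration of what a third-normal-form view layer looks like, the sketch below joins normalized base tables through their keys and exposes a single reporting view. The schema, tables, and columns are invented for illustration, and the snippet assumes the teradatasql driver.

```python
# Hypothetical 3NF semantic view on top of normalized core tables.
import teradatasql

VIEW_DDL = """
REPLACE VIEW sem.v_account_balance AS
SELECT  a.account_id,
        c.customer_id,
        c.customer_nm,
        b.balance_amt,
        b.balance_dt
FROM    core.account         a
JOIN    core.customer        c ON c.customer_id = a.customer_id
JOIN    core.account_balance b ON b.account_id  = a.account_id
"""

with teradatasql.connect(host="tdhost", user="dbadmin", password="***") as con:
    with con.cursor() as cur:
        # Each fact is stored once in the core layer; the view reassembles it via key joins.
        cur.execute(VIEW_DDL)
```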
-
-
Course Apache Hadoop 2.0
-
Feb 2014 - Feb 2014
I completed the course Apache Hadoop 2.0: Data Analysis with the Hortonworks Data Platform.
-
-
Professional Service - Technical Consultant, BNL (Italy, Rome)
-
Jul 2011 - Jan 2014
I started out performing database administrator activities, taking part in the project for migrating from Teradata V2R6 to 12.10 and playing the role of tester, adapting to the new query features. I participated in the development of BTEQ (Basic Teradata Query) scripts to perform ETL from IBM DataStage. In the production environment I played the role of customer consultant, applying query enhancements and making structural changes to the DDL architecture so that the Customer Card queries respond very quickly.
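A load step of the kind described above takes a delimited extract (as handed off by DataStage) and pushes it into a staging table. The Python analogue below is only a sketch: the file, table, and column names are hypothetical, and it assumes the teradatasql DB-API driver rather than the BTEQ utility itself.

```python
# Sketch of a batched staging load, roughly equivalent to a scripted BTEQ .IMPORT / USING step.
import csv
import teradatasql

INSERT_SQL = "INSERT INTO stg.customer_card_stg (customer_id, card_id, event_dt) VALUES (?, ?, ?)"

with open("customer_card_extract.csv", newline="") as f:
    rows = [(r["customer_id"], r["card_id"], r["event_dt"]) for r in csv.DictReader(f)]

with teradatasql.connect(host="tdhost", user="etl_user", password="***") as con:
    with con.cursor() as cur:
        cur.executemany(INSERT_SQL, rows)  # batched insert into the staging table
```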
-
-
Course Teradata System Performance
-
May 2013 - May 2013
-
Teradata
-
United States
-
Software Development
-
700 & Above Employee
-
Alitalia, Teradata Professional Service
-
Nov 2010 - May 2011
Roles: project management, functional analysis, software development, test design, and test management. Technologies: DWH/DBMS: Teradata; ETL processes: DataStage; O.S.: Microsoft, UNIX, Linux.
-
CNS - Consorzio Nazionale Servizi
-
Italy
-
Facilities Services
-
1 - 100 Employee
-
OFM (Open Facility Management)
-
Apr 2010 - Oct 2010
Support for the development and testing of the OFM project.
-
-
Education
-
University of L'Aquila
Sciences (FF.MM.NN.), Computer and Information Sciences and Support Services
-
La Sapienza University, Rome
Master's degree in security systems and computer networks for the enterprise and the public sector.