Deivid Robim

Solutions Architect at FLIPACAR
Contact Information
us****@****om
(386) 825-5501
Location
Melbourne, Victoria, Australia
Languages
  • English - Full professional proficiency
  • Portuguese - Native or bilingual proficiency

Reviews

5.0 / 5.0, based on 2 ratings

Daniel Alves dos Santos

I have known Deivid for more than 7 years and have worked with him for almost 3. In that time I became certain that I was working with one of the most competent people I have ever met in my professional life. Deivid is not only hard-working and committed, he is also an excellent mentor who shares his knowledge. He has never started a project he did not finish; he has always seen everything he began through to the end. He carries out his tasks spectacularly and is always looking to improve his skills. I recommend Deivid for all the professional qualities that make him a rising professional who keeps getting better.

Debora Ferreira

I have worked with Deivid for almost 2 years and he has always shown himself to be very committed, proactive, highly intelligent, and responsible. He treats his colleagues with respect, which is very important, and his conduct in the corporate environment is exemplary. He is a very competent person who has a great deal to add to any team he is on.

Credentials

  • AWS Certified Data Analytics - Specialty
    Amazon Web Services (AWS)
    Feb 2021 - Nov 2024
  • AWS Certified SysOps Administrator - Associate
    Amazon Web Services (AWS)
    May 2020 - Nov 2024
  • AWS Certified Developer - Associate
    Amazon Web Services (AWS)
    Mar 2020 - Nov 2024
  • AWS Certified Solutions Architect - Associate
    Amazon Web Services (AWS)
    Nov 2019 - Nov 2024
  • AWS Certified Cloud Practitioner
    Amazon Web Services (AWS)
    Sep 2019 - Nov 2024

Experience

    • Australia
    • Information Technology & Services
    • 1 - 100 Employees
    • Solutions Architect
      • May 2022 - Present

      • Implementation of development principles such as version management, documentation, and automated deployment.
      • Design and development of a metadata-driven API framework, providing code reusability and scalability (see the sketch after this list).
      • Development of an alerting system for custom applications.
      • API development enabling fast integration with clients and partners.
      • Development of a payment platform that lets buyers and sellers transact, protecting the buyer and providing increased transparency.
      • Enablement of microservices deployment, increasing agility and scalability.
      • Assisted in defining technical requirements to solve business problems.
      • Assisted in delivering data infrastructure to support business intelligence reporting.
      • Created and maintained a reporting dashboard suite.
      Technologies:
      • DevOps: Bitbucket Pipelines, Terraform, SOPS, Bash
      • Languages: Python, Node.js
      • Cloud: AWS ECR, AWS ECS, AWS S3, AWS CloudFront, AWS API Gateway, AWS Lambda Layers, AWS Lambda, AWS Secrets Manager, AWS VPC, AWS EC2, AWS RDS, AWS IAM, AWS CloudWatch
      • Data Integration: Airbyte
      • Orchestration: Prefect
      • Containerization: Docker
      • Databases: PostgreSQL
      • Data Visualization: Streamlit
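
      As a rough illustration of the metadata-driven API framework mentioned above (not the actual FLIPACAR code), here is a minimal Python sketch of an API Gateway proxy handler on AWS Lambda whose routing and response shape are driven by a metadata registry; every name in it is hypothetical.

```python
# Hypothetical metadata-driven Lambda handler behind API Gateway: routes and
# response fields live in a registry instead of per-endpoint code, so new
# endpoints are added by editing metadata rather than writing handlers.
import json

# Assumed registry; in practice this could be loaded from S3 or a database.
ENDPOINT_METADATA = {
    "GET /vehicles": {"table": "vehicles", "fields": ["id", "make", "model"]},
    "GET /dealers": {"table": "dealers", "fields": ["id", "name", "state"]},
}

def handler(event, context):
    """Generic API Gateway proxy handler driven by ENDPOINT_METADATA."""
    route = f"{event['httpMethod']} {event['path']}"
    meta = ENDPOINT_METADATA.get(route)
    if meta is None:
        return {"statusCode": 404, "body": json.dumps({"error": "unknown route"})}
    # A real implementation would query meta["table"] and project
    # meta["fields"]; the sketch echoes the metadata to stay self-contained.
    return {"statusCode": 200, "body": json.dumps(meta)}
```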

    • Australia
    • Retail
    • 700 & Above Employees
    • Data Engineer
      • Sep 2021 - May 2022

      • Responsible for building and supporting the Cloud Data Platform infrastructure on Azure.
      • Developed an alerting and monitoring framework to improve observability of data pipelines.
      • Built data pipelines on Azure Data Factory to process large datasets on Databricks and ingest the data into Snowflake for advanced analytics (see the sketch after this list).
      • Used Databricks to integrate with other tools in the data ecosystem, such as Apache Spark and Delta Lake.
      • Created dimension and fact tables on Snowflake.
      • Developed enhancements to the existing ELT framework on Azure Data Factory, which processes and ingests terabytes of data into Snowflake daily.
      • Established and implemented development principles such as version management, documentation, and testing to ensure quality and process standards are met.
      Technologies:
      • DevOps: Azure DevOps, Terraform, Bash
      • Languages: Python
      • Cloud: Azure Databricks, Azure Blob Storage, Azure Functions, Azure Logic Apps, Azure Application Insights, Azure SQL, Azure Data Factory, Azure Key Vault
      • Containerization: Docker
      • Databases: MS SQL Server
      • Data Warehouse: Snowflake
      • Data Visualization: Power BI
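
      A minimal, hypothetical sketch of the Databricks-to-Snowflake step described above, assuming the Snowflake Spark connector bundled with Databricks; the paths, table names, and connection options are placeholders, not the real pipeline.

```python
# Illustrative Databricks step: read raw Delta files, aggregate, and write
# the result to Snowflake via the Snowflake Spark connector.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Placeholder connection options; on Databricks these would be resolved
# from Key Vault-backed secret scopes, never hard-coded.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGING",
    "sfWarehouse": "LOAD_WH",
}

raw = spark.read.format("delta").load("/mnt/raw/sales")  # hypothetical path
daily = (raw.groupBy("store_id", "sale_date")
            .agg(F.sum("amount").alias("total_amount")))

(daily.write.format("snowflake")  # connector source name on Databricks
      .options(**sf_options)
      .option("dbtable", "DAILY_SALES")
      .mode("overwrite")
      .save())
```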

    • Australia
    • Technology, Information and Internet
    • 1 - 100 Employees
    • Data Engineer
      • May 2021 - Sep 2021

      • Optimised data pipeline performance 6x by enabling asynchronous data processing with Dask executors on Prefect (see the sketch after this list).
      • Re-designed the data storage strategy on AWS S3, reducing costs by 90% without losing any data.
      • Developed a metadata-driven ELT framework, focusing on code reliability and scalability.
      • Loaded data into Snowflake tables from an external stage.
      • Unloaded data from Snowflake to AWS S3 using the COPY command.
      • Developed Snowflake procedures in JavaScript.
      • Queried semi-structured data using external tables on AWS S3.
      • Integrated data from multiple third-party APIs into data views on Snowflake for use in Tableau.
      • Created impactful dashboards on Tableau for internal and external stakeholders.
      Technologies:
      • DevOps: Azure DevOps, Terraform, Bash
      • Languages: Python
      • Cloud: AWS DynamoDB, AWS S3, AWS API Gateway, AWS Lambda Layers, AWS Lambda, AWS Secrets Manager, AWS VPC, AWS EC2, AWS IAM, AWS CloudWatch
      • Data Integration: Fivetran
      • Orchestration: Prefect
      • Containerization: Docker
      • Databases: PostgreSQL
      • Data Warehouse: Snowflake
      • Data Visualization: Tableau
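
      The Dask-on-Prefect pattern referenced above, sketched with the Prefect 1.x API that was current at the time; the tasks and the bucket name are stand-ins, not the real pipeline.

```python
# Sketch of concurrent task mapping with Prefect 1.x and Dask, the pattern
# behind the asynchronous processing described above. Tasks are stand-ins.
from prefect import task, Flow
from prefect.executors import DaskExecutor

@task
def list_batches(source: str) -> list:
    # Stand-in for discovering files or API pages to process.
    return [f"{source}/batch-{i}" for i in range(10)]

@task
def process(batch: str) -> None:
    # Stand-in for the real extract/transform/load of one batch.
    print(f"processed {batch}")

with Flow("parallel-elt") as flow:
    batches = list_batches("s3://raw-bucket/orders")  # hypothetical bucket
    process.map(batches)

if __name__ == "__main__":
    # Mapped tasks fan out across a local Dask cluster and run concurrently.
    flow.run(executor=DaskExecutor())
```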

    • Australia
    • Financial Services
    • 400 - 500 Employees
    • Data Engineer
      • Jan 2020 - May 2021

      • Strong knowledge of Azure cloud services; served as a hands-on subject matter expert for data collection, processing, and automation on Azure.
      • Built and maintained the data lake, data warehouse, and data pipelines.
      • Created a batch data pipeline to ingest CDC data from SQL Server into Snowflake.
      • Designed and developed dimensional models on Snowflake.
      • Performed data cleaning, data preparation, and feature engineering for machine learning using SQL on Snowflake.
      • Created Tasks to manage workflows on Snowflake.
      • Designed and implemented a serverless, event-driven data pipeline to process semi-structured data, triggered by hundreds of underwriters requesting Comprehensive Credit Report data through bureaux APIs (see the sketch after this list).
      • Maintained and supported batch pipelines on Data Factory and event-driven pipelines on Azure Functions.
      • Provisioned resources on Azure as infrastructure as code using Terraform and ARM templates.
      Technologies:
      • DevOps: Azure DevOps, Terraform, Bash
      • Languages: Python
      • Cloud: Azure Service Bus, Azure Blob Storage, Azure Functions, Azure Logic Apps, Azure Application Insights, Azure SQL, Azure Data Factory, Azure Key Vault
      • Containerization: Docker
      • Databases: MS SQL Server
      • Data Warehouse: Snowflake
      • Data Visualization: Tableau, Streamlit
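
      A hypothetical sketch of the event-driven shape described above, using the Azure Functions Python (v1) programming model with a Service Bus trigger and a Blob Storage output binding (both declared in function.json); the payload fields are invented for illustration.

```python
# Hypothetical Azure Function: a Service Bus message asks for a credit
# report; the parsed, flattened records are staged to Blob Storage as
# newline-delimited JSON for downstream ingestion into Snowflake.
import json
import logging

import azure.functions as func

def main(msg: func.ServiceBusMessage, outblob: func.Out[str]) -> None:
    payload = json.loads(msg.get_body().decode("utf-8"))
    logging.info("Processing report request %s", payload.get("request_id"))

    # Flatten the semi-structured bureau response into NDJSON; the field
    # names here are invented for the sketch.
    records = payload.get("records", [])
    outblob.set("\n".join(json.dumps(record) for record in records))
```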

    • Singapore
    • Information Services
    • 200 - 300 Employees
    • Data Engineer
      • Jun 2019 - Dec 2019

      • Improved internal processes by automating manual work, optimising data delivery, and re-designing infrastructure using cloud-based technologies, Python, and SQL.
      • Led a POC to select the most appropriate cloud data warehousing solution.
      • Designed a batch data pipeline to ingest terabyte-scale data into Snowflake, fully automated with AWS S3, Lambda, Glue (Spark), and SNS (see the sketch after this list).
      • Architected and designed standard RDBMSs, data warehouses, and NoSQL databases.
      • Hands-on experience with Fivetran API connectors.
      • Created a data lake on AWS S3 to minimise costs and maximise performance for services such as Snowflake and AWS Athena.
      • Developed data pipelines with Python and Prefect to process large datasets to data quality standards.
      • Worked with stakeholders, including the Product Management and Sales teams, to resolve data-related issues and support their data infrastructure needs.
      Technologies:
      • Languages: Python
      • Cloud: AWS SNS, AWS Athena, AWS Glue, AWS Redshift, AWS S3, AWS Lambda Layers, AWS Lambda, AWS Secrets Manager, AWS VPC, AWS EC2, AWS IAM, AWS CloudWatch
      • Databases: MS SQL Server
      • Data Warehouse: Snowflake
      • Data Visualization: Tableau
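
      One plausible shape, not the actual implementation, of the S3/Lambda/Glue/SNS automation mentioned above: an S3-triggered Lambda that starts a Glue job for each new object and publishes the run ID to an SNS topic. The Glue job name and topic ARN are placeholders.

```python
# Hypothetical glue between the pieces named above: an S3-triggered Lambda
# starts a Glue (Spark) job for each new object and notifies an SNS topic.
import boto3

glue = boto3.client("glue")
sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:ap-southeast-2:123456789012:pipeline-events"  # placeholder

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        run = glue.start_job_run(
            JobName="stage-to-snowflake",  # placeholder Glue job name
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="Glue job started",
            Message=f"{key} -> run {run['JobRunId']}",
        )
```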

    • Senior Data Analyst
      • Aug 2016 - Jun 2019

      • Responsible for developing and maintaining the organisation's databases.
      • In charge of performance tuning of databases and T-SQL queries.
      • Manipulated, cleaned, processed, and validated client and supplier data using SQL Server.
      • Designed automated systems to mine information and produce reports.
      • Designed stored procedures, user-defined functions, views, and T-SQL scripts for complex business logic.
      • Automated the build of new databases and models to ensure they are ready for client delivery.
      • Created and loaded custom audiences for targeting into digital platforms such as Facebook, Microsoft, and Google.
      • Provided technical support for the Products and Sales teams.
      Technologies:
      • Languages: Python
      • Databases: MS SQL Server
      • Data Visualization: Tableau

    • Information Services
    • 700 & Above Employees
    • Senior Data Analyst
      • May 2014 - Jun 2016

      • Manipulated, cleansed, and processed data using SQL Server.
      • Responsible for loading, extracting, and validating client data.
      • Analysed raw data, drew conclusions, and developed business reports.
      • Designed, developed, and implemented automated loading processes using ETL tools.
      • Developed products focused on prospecting new clients, upgrading customer portfolios, researching competitors, and running segmented campaigns.
      • Acted as the bridge between the commercial and marketing areas, scoping work and negotiating development deadlines.
      • Processed and automated market data reports, helping to create and track indicators, supporting commercial campaigns, and analysing the consumer database.
      Technologies:
      • Languages: Python
      • Databases: MS SQL Server
      • Data Visualization: Tableau

    • Data Analyst
      • Dec 2012 - May 2014

    • Junior Data Analyst
      • Oct 2011 - Dec 2012

    • Intern
      • Jan 2011 - Oct 2011

Education

  • Udacity
    Data Engineering Nanodegree
    2020 - 2020
  • Universidade do Grande ABC
    Bachelor of IT
    2009 - 2012
