Dejan Lozanovic

CEO at Bytecode Factory
Contact Information
us****@****om
(386) 825-5501
Location
Camberwell, England, United Kingdom



Credentials

  • Convolutional Neural Networks
    Coursera, Feb 2018 – Oct 2024
  • Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
    Coursera, Dec 2017 – Oct 2024
  • Structuring Machine Learning Projects
    Coursera, Dec 2017 – Oct 2024
  • Neural Networks and Deep Learning
    Coursera, Nov 2017 – Oct 2024

Experience

    • United Kingdom
    • IT Services and IT Consulting
    • 1 - 100 Employee
    • CEO
      • Jul 2019 - Present
    • United Kingdom
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Senior Data Engineer / National Crime Agency - Enhanced Security Clearance
      • Feb 2022 - Mar 2023

      Ingested crime report emails from police stations across the UK, stripped all PII from the reports, and stored the results in Elasticsearch. The data was then used in Grafana to display crime trends per region and per crime type, such as illegal firearms or drug dealing. Performed data modelling using a star schema in Redshift, and ingested various crime data into the Redshift data warehouse using Spark (Scala). Orchestrated jobs with Airflow. Technologies used: Python, Scala, AWS, Elasticsearch, Grafana, Docker, EMR, Redshift, Airflow.

    • United Kingdom
    • Internet Publishing
    • 700 & Above Employee
    • Senior Data Engineer
      • Nov 2021 - Jan 2022

      Worked on deprecating Kafka streaming applications in favour of Spark batch processing applications, working with reservation data. The applications run as notebooks in Databricks and are orchestrated with Airflow. The work also involved stabilising the Kafka and ZooKeeper deployment on EC2. Technologies used: Python, Apache Spark, EMR, Ansible, Grafana, Databricks, Airflow.

    • United States
    • Advertising Services
    • 1 - 100 Employee
    • Big data lead consultant / Bank Of England SC Cleared
      • Oct 2020 - Nov 2021

      Ingested trading data from Trade Repositories into the Bank's Hadoop data lake; the data arrived daily in XML format. In the second stage, after ingestion, we performed ETL transformations and aggregations over the data. In the third stage, we created data marts for different parties inside the bank. I personally optimised data ingestion, reducing execution time from 50 minutes per file to 4 minutes per file. Technologies used: Python, Apache Spark, Hive, Hadoop, Airflow.

    • Singapore
    • Internet Publishing
    • 100 - 200 Employee
    • Big Data Engineer
      • Jul 2019 - Dec 2019

      Worked on data migration from an on-premise data warehouse into an AWS data lake; the data consisted of reservation and hotel data. Parsed ordering event data to replace deprecated processes and databases, and migrated processes from Azkaban and AWS Step Functions into Airflow. Technologies used: Scala, Python, AWS, AWS Lambda, EMR, S3, Azkaban, Apache Spark (Scala and Python), Hive, Teradata, Airflow.

    • United States
    • Information Technology & Services
    • 1 - 100 Employee
    • Senior Software Engineer
      • Jan 2019 - Jul 2019

      Endeavour has a video streaming platform focused on sports events, providing live streaming and video-on-demand. I created a data lake collecting data from 9 databases and several Amazon Kinesis streams, using Apache Sqoop and StreamSets. Generated analytic reports using Apache Spark, replacing some of the existing reports and reducing their generation time by 4x compared to the existing solution. Technologies used: Scala, AWS, AWS Lambda, Aurora RDS (MySQL), EMR, S3, AWS Athena, Apache Sqoop, Apache Spark, StreamSets.

    • Information Services
    • 700 & Above Employee
    • Principal Engineer
      • Dec 2017 - Nov 2018

      Data Labs is the Research & Development business unit of Experian, building next-generation products using machine learning and artificial intelligence. My role was to oversee all projects inside Data Labs and productionise them after the Data Scientists built a model: designing the software architecture, specifying hardware requirements, and developing working software. The first project was a transaction categorisation engine: it takes transactions from user accounts, categorises them using machine learning models, returns the transactions with their categories, and calculates KPIs and affordability. Technologies used: Scala, Python, Kafka, HDFS, Spark, Akka, Knockout.js, MongoDB, Docker, Kubernetes, Microsoft Azure.

    • United Kingdom
    • IT Services and IT Consulting
    • 200 - 300 Employee
    • Big Data Architect
      • Jan 2017 - Dec 2017

      Worked at Lloyds Banking Group as a Technical Big Data Architect on a greenfield project extracting data from the data lake and storing it in a format suitable for their machine learning programme. Wrote the technical design document for the project, performed code reviews for Lloyds' offshore team, and implemented key components. Assisted the sales team in demonstrating and selling licences for the Dataguise product, and used Dataguise to perform discovery and encryption of sensitive data in their Hadoop clusters. Provided guidance and set up projects on Jenkins for Whishworks's development team, and built a proof of concept with Hortonworks DataFlow.

    • United Arab Emirates
    • Industrial Machinery Manufacturing
    • 1 - 100 Employee
    • Big Data Architect
      • Feb 2016 - Nov 2016

      Designed the big data platform for LeoTech's client. The platform's main focus was security, providing fine-grained access to datasets; another major requirement was the ability to import new datasets without development work. Key technologies used in this project: Cloudera Enterprise Edition, Kerberos, Apache Spark, Apache Kafka, Apache NiFi.

    • United Kingdom
    • Business Consulting and Services
    • 1 - 100 Employee
    • Big Data Senior Software Developer
      • Oct 2014 - Jan 2016

      Magnetic, the leader in search retargeting, specialises in reaching consumers with relevant ad messages based on intent. As the partner of choice for leading Fortune 500 brands, Magnetic powers both brand awareness and direct response campaigns through its core capabilities, which include search retargeting on desktop and mobile, advanced media optimisation, programmatic buying, site retargeting, and extensive creative opportunities. As a Senior Data Developer I worked with many different technologies, including Pig, Apache Spark, Hadoop MapReduce, Impala, and Luigi, using Scala and Python. Main projects: keyword categorisation, a Java Hadoop MapReduce project in which I implemented multi-language support; reporting & analytics, using Pig for legacy reports and migrating them to Apache Spark; and an Apache Flink streaming POC, in which I evaluated Apache Flink for a planned migration from batch processing to real-time streaming.

    • United States
    • Book and Periodical Publishing
    • 1 - 100 Employee
    • senior software developer
      • Jun 2014 - Oct 2014

      Senior Java Developer, working on all aspects of software development and code review, ensuring that the team followed best TDD and BDD practice in its development processes. Main activities and responsibilities: migrating the existing cochranelibrary website to a new dotCMS platform, using Java, Spring, and Velocity; building a proof-of-concept project to find pirated content online, using Hadoop, Amazon Web Services, and Common Crawl data.

    • Software Developer
      • Apr 2011 - Jul 2014

      Senior Java Developer, working on all aspects of the software development lifecycle, from planning and implementation through to delivery of a scalable messaging platform for rapid global expansion of existing products and services. The platform consists mainly of RabbitMQ and Spring Integration solutions. As a software engineer for 192.com I developed fraud prevention services, which involved working on many small greenfield Java projects using Spring MVC, SQL Server/MySQL, Java, and Maven. Main activities and responsibilities: developing, in the Spring framework, integrations with third-party data providers, fetching data over SOAP, REST, and other protocols while communicating over RabbitMQ with the rest of the system; maintaining and adding new features to a batch client program that uses MongoDB to persist all requests and responses from the system; applying refactoring techniques and object-oriented design patterns to produce robust, scalable application code; using pair programming to enhance the development process and improve code quality; adopting test-driven development (TDD) with unit testing and object mocking to produce well-tested, good-quality code; using behaviour-driven development (BDD) tests in collaboration with analysts and testers to ensure good coverage and quality of functional testing, within Scrum and Kanban methodologies; and using continuous integration and version control to apply a high level of quality control. In April 2013 I moved to the London office from Belgrade.

    • Technology, Information and Internet
    • 1 - 100 Employee
    • Senior Software Developer & System Architect
      • Feb 2009 - Apr 2011

      Main activities and responsibilities: creating extraction software for the product search engine behind www.yoterra.com, and maintaining the existing code; implementing the whole extraction and normalisation process in Hadoop; creating an index of the data with Lucene, HBase, and Cassandra; working on the website using Spring and Spring MVC with JSP and JSTL; managing 3 different teams totalling 40 people.

    • United States
    • Travel Arrangements
    • 1 - 100 Employee
    • Senior Software Developer/ Team Leader
      • Apr 2007 - Feb 2009

      Main activities and responsibilities: maintaining and adding new features to Limores's main desktop application, Qlimo, which covers all processes from creating reservations and dispatching cars to accounting; it is a huge Java Swing application run concurrently on over 50 clients. Developed and integrated an IVR into the existing architecture; the IVR calls customers and drivers on various events. Computer Telephony Integration, a project divided into three major parts: first, speeding up reservation creation, so that when an operator answers a call, Qlimo automatically opens the reservation screen and fills it with all data related to that customer based on their caller ID; second, allowing operators to press a button in Qlimo so the system makes a bridge call between the operator and the customer; third, displaying all call recordings for a specific reservation so the operator can listen to them.

    • India
    • IT Services and IT Consulting
    • 1 - 100 Employee
    • Software Developer
      • Sep 2005 - Oct 2006

      Wrote a data extraction program called "Polisher", written in Java for the company Vast. It extracts data from millions of web pages (location, price, job salary, etc.) and normalises values, e.g. converting salaries (which job posters may quote per hour, day, week, or year) to a yearly amount, or mileage given in miles or kilometres. Users can easily improve its accuracy simply by adding "Words of Interest". The program was also special because, at the time it was written, nothing like it existed. All the data displayed on http://www.vast.com is extracted by Polisher.

    • Software Developer
      • Jan 2004 - Aug 2005

      Wrote libSMPP in C++, a library covering the SMPP v3.4 protocol. SMPP is a protocol for exchanging text messages between mobile operators, and is also widely used by companies that send text messages and bulk text messages over the internet to mobile operators. Also wrote an SMS router program that routes incoming messages into Routo's system and delivers them to Routo's suppliers, based on rules such as supplier coverage, price, and throughput.

Education

  • Djuro Strugar
