Jay Mathuria

Senior Data Engineer at Gemini
Contact Information
us****@****om
(386) 825-5501
Location
Jersey City, New Jersey, United States

Recommendations

5.0 / 5.0, based on 2 ratings

Ajay Soni

Jay is an exceptional IT professional. He is excellent at partnering with his clients to understand their needs, and he goes beyond the technical aspects to really understand the business and how technology can best serve it. Once the requirements are clear, he works tirelessly to deliver quality work in a timely manner. It was a pleasure to work with him. Jay is a great player to have on the team.

Nikhil Dharap

During my work at TCS, I had the privilege of working with Jay for a little over a year. I found Jay to be extremely enthusiastic and dedicated towards any task he worked on. I was really impressed by the way he would volunteer to work on complex technical issues and successfully resolve each one on time. In addition to helping the team with daily project activities, Jay also encouraged everyone on the team to participate in external activities like technical paper writing and personal development. He would actively participate in external forums to help employees across the organization. Jay has been a great mentor and an ideal colleague. I am confident that Jay will be an excellent performer in any future responsibilities he takes on.


Experience

    • United States
    • Financial Services
    • 400 - 500 Employee
    • Senior Data Engineer
      • Jun 2022 - Present

    • United States
    • Financial Services
    • 700 & Above Employee
    • Data Engineering Manager
      • Sep 2016 - Jun 2022

      • Built a high-performing data ingestion/processing platform for future growth and faster time-to-market using technologies including Greenplum, Hadoop, Hive, HBase, Sqoop and Spark.
      • Designed and developed a data migration framework to support and control the transfer of datasets from Greenplum into a multi-tenant Hadoop cluster, completed with audit trail and validations.
      • Designed a data solution in Greenplum that identifies merchants for millions of transactions, consumed by the mobile banking solution for debit and credit card customers.
      • Worked closely with the business and data scientists on various analytical projects, brainstorming innovative ways to leverage the technology platform and the available/new data sets to identify revenue-generating opportunities.
      • Created Control-M jobs to automate scheduled data loads from Teradata into Greenplum; developed Postgres functions and NiFi jobs used to process the data model.
      • Designed a data lake and consumption layer on Cloudera and ingested data from various sources in various forms.
      • Explored Airflow to ingest data into AWS S3 for a POC (see the sketch below).
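
      For illustration only, the Airflow-to-S3 POC mentioned above could look roughly like the following minimal sketch. It is an assumption, not the actual implementation: the DAG id, file path, bucket and key are hypothetical, and it assumes Airflow 2.x with AWS credentials available to boto3.

```python
# Hypothetical, minimal Airflow DAG sketching a file-to-S3 ingestion POC.
# The table extract path, bucket and key below are illustrative, not real names.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def upload_extract_to_s3():
    """Upload a locally staged extract file to S3 (assumes AWS credentials in the environment)."""
    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="/tmp/transactions_extract.csv",    # staged extract (illustrative path)
        Bucket="example-ingestion-bucket",           # hypothetical bucket
        Key="raw/transactions/transactions_extract.csv",
    )


with DAG(
    dag_id="poc_ingest_to_s3",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="upload_extract_to_s3",
        python_callable=upload_extract_to_s3,
    )
```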

    • United States
    • Financial Services
    • 700 & Above Employee
    • Data Engineer
      • Dec 2014 - Aug 2016

      • Designed an operational framework to enforce best practices and standards for Autosys batch development.
      • Implemented new processes to reduce the number of outages caused by missing controls, missing proactive checks or infrastructure issues; to improve supportability; to shorten recovery time after failures; and to eliminate potential human errors where a batch needs to be caught up (a minimal example of such a proactive check is sketched below).
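
      As a rough illustration only, a proactive check of the kind such a framework might run could look like the sketch below. The job names are hypothetical, and it assumes the standard Autosys autorep command is on the PATH and that its report shows a two-letter FA status code for failed jobs.

```python
# Hypothetical proactive check: flag critical Autosys jobs whose last run failed.
# Job names are illustrative; assumes `autorep -J <job>` prints a status report
# whose status column shows "FA" for a failed job (an assumption about the format).
import subprocess

CRITICAL_JOBS = ["EXAMPLE_DAILY_LOAD", "EXAMPLE_EOD_EXTRACT"]  # hypothetical names


def job_failed(job_name: str) -> bool:
    """Return True if the autorep report for the job contains an FA status."""
    report = subprocess.run(
        ["autorep", "-J", job_name],
        capture_output=True,
        text=True,
        check=False,
    )
    return " FA " in report.stdout


if __name__ == "__main__":
    failed = [job for job in CRITICAL_JOBS if job_failed(job)]
    if failed:
        print("Jobs needing attention: " + ", ".join(failed))
    else:
        print("All critical jobs look healthy.")
```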

    • United States
    • Financial Services
    • 700 & Above Employee
    • Data Warehouse Specialist
      • Sep 2012 - Dec 2014

      Project: Order Execution Datamart
      Client: Morgan Stanley
      Developed in: Informatica, DB2 and Unix
      Location: New York, NY
      Description: The datamart stores all orders and their execution information in a star dimension schema. The data is stored in FCT tables for 13 months using partition-based tables; the aggregated data is retained indefinitely in FCT tables.
      Responsibilities:
      • Managing the offshore team for development and maintenance of the project.
      • Supporting the existing application, which involves debugging user problems and queries.
      • Production support of 10,000+ Autosys jobs.
      • Interacting with the L1 team and documenting all recurring issues.
      • Preparing requirement and design documents using Microsoft Office Visio.
      • Debugging and analyzing job failures and resolving them immediately.
      • Coordinating between various teams (DBAs, BAs and downstream users) for successful turnover.
      • Developing Informatica mappings, sessions and workflows designed to handle millions of records.
      • Performance tuning of mappings by analyzing logs.
      • Extensive handling and management of database objects, i.e. stored procedures, views and tables.
      • Developing and enhancing Perl scripts to automate testing.
      • Resolving data issues for business users of the application in all phases of the project.
      • Providing production support for the daily job run cycle and resolving issues immediately.
      • Responsible for project documentation, status reporting and presentations.
      • Evaluating all code and ensuring the quality of all project deliverables.

    • India
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Assistant Consultant
      • May 2012 - Sep 2012

      Project: Omnium Hedge Fund Administrator (May 2012 till date)
      Client: Northern Trust Bank
      Developed in: Perl, Unix
      Location: Chicago
      Description: Omnium is a hedge fund administration product that provides services to its clients. Some of its functionalities are listed below:
      • Calculation of the net asset value ("NAV"), including calculation of the fund's income and expense accruals and the pricing of securities at current market value (see the illustrative sketch after this entry).
      • Preparation of semi-annual and annual reports to shareholders.
      • Maintenance and filing of the fund's financial books and records as the fund accountant, including reconcilement of holdings with custody and broker records.
      • Payment of fund expenses.
      • Settlement of daily purchases and sales of securities, ensuring collection of dividends and interest.
      • Preparation and filing of the fund's prospectus.
      • Preparation and filing of other SEC filings/reports.
      • Calculation of the total returns and other performance measures of the fund.
      • Monitoring investment compliance with SEC, prospectus or U.S. Internal Revenue Code restrictions.
      I was responsible for providing support to business users of the Omnium applications.
      Responsibilities:
      • Writing Perl scripts to match values traded by brokers and their clients.
      • Supporting various applications during market hours.
      • Coordinating between brokers, clients and business users.
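
      For illustration only, the NAV calculation mentioned above follows the standard formula NAV per share = (assets + accrued income - accrued expenses - liabilities) / shares outstanding. The sketch below uses made-up figures and hypothetical parameter names; it is not Omnium code.

```python
# Hypothetical NAV-per-share sketch; all figures and names are made up for illustration.
def nav_per_share(asset_value: float, accrued_income: float,
                  accrued_expenses: float, liabilities: float,
                  shares_outstanding: float) -> float:
    """NAV per share = (assets + accrued income - accrued expenses - liabilities) / shares outstanding."""
    net_assets = asset_value + accrued_income - accrued_expenses - liabilities
    return net_assets / shares_outstanding


# Example with illustrative figures: $10.25M of securities at market value,
# $50k accrued income, $30k accrued expenses, $220k other liabilities,
# and 1,000,000 shares outstanding -> NAV of 10.05 per share.
print(nav_per_share(10_250_000, 50_000, 30_000, 220_000, 1_000_000))
```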

    • IT Analyst
      • Jul 2010 - Apr 2012

      Project: Reference Data Technology (August 2010 till April 2012)
      Client: J.P. Morgan Chase
      Developed in: Perl, Oracle
      Location: Mumbai, Glasgow
      Description: Account and product information is stored in a data mart. Product information was received from third parties such as Bloomberg and Reuters in the form of encrypted files; an ETL tool was used to read these files, transform the data and load it into the database.
      I developed a generic tool (framework) that decrypts the files, transforms the data and compares the data in the source file/database with that of the target database; it can be easily configured for other recon scripts (a simplified sketch of the comparison step follows below).
      I was responsible for development, enhancement, maintenance, support and new development for the tool.
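
      As a rough illustration of the comparison step, the sketch below loads a decrypted source extract keyed by an ID and reports records that are missing from, or differ in, the target. The field names (record_id, price, currency) and the CSV layout are assumptions for the example, not the tool's actual format.

```python
# Hypothetical reconciliation sketch: compare a decrypted source extract with
# target records keyed by a shared ID. Field names and CSV layout are assumed.
import csv
from typing import Dict, Tuple

Record = Tuple[str, str]  # (price, currency) for the purposes of this sketch


def load_source(path: str) -> Dict[str, Record]:
    """Read the decrypted source extract into {record_id: (price, currency)}."""
    with open(path, newline="") as handle:
        return {
            row["record_id"]: (row["price"], row["currency"])
            for row in csv.DictReader(handle)
        }


def reconcile(source: Dict[str, Record], target: Dict[str, Record]) -> None:
    """Report records missing from the target or carrying different values."""
    for record_id, source_values in source.items():
        target_values = target.get(record_id)
        if target_values is None:
            print(f"MISSING in target: {record_id}")
        elif target_values != source_values:
            print(f"MISMATCH for {record_id}: source={source_values} target={target_values}")
```

      In the described tool the target side would come from a database query rather than an in-memory dictionary, but the comparison logic is the same idea.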

    • Software Developer
      • Mar 2006 - Aug 2010

      Project: ETSDB Compliance Reporting (Oct 2007 till August 2010)
      Client: Morgan Stanley
      Developed in: Perl, Sybase
      Location: New York, NY
      Description: As an investment bank, Morgan Stanley needs to meet various regulatory and compliance requirements. We were responsible for generating these reports and sending them to various regulators and exchanges. One of the most critical reports developed and supported by our team was OATS, which goes to FINRA (Financial Industry Regulatory Authority, Inc.) and contains details on the entire life cycle of an order. The technologies used were Perl and Sybase.
      Roles and responsibilities:
      • Responsible for global regulatory reporting, which includes reports from Asia, London and New York.
      • Developed a new report, OTS, required by NYSE on an on-demand basis.
      • Gathering requirements from business users for new reports or requested enhancements.
      • Writing stored procedures and performance tuning for new reports, and modifying existing ones for any enhancements suggested to the existing reports.
      • Creating database objects such as tables, indexes and triggers.
      • Tuning complex SQL/stored procedures for performance enhancement.
      • Writing Perl scripts to generate reports out of the database using the DBI and DBD modules.
      • Creating Autosys jobs and updating Netcool alerts for newly added reports.
      • Providing production support for the daily job run cycle and resolving issues immediately.

    • India
    • Technology, Information and Internet
    • 700 & Above Employee
    • Software Developer
      • 2004 - 2006

    • India
    • Advertising Services
    • 1 - 100 Employee
    • Software Programmer
      • 2004 - 2004

Education

  • University of Mumbai
    BE, Electronics
  • D.J.Doshi Gurukul English Medium High School
