Sudheer Palyam

Data Engineering Manager at EnergyAustralia
Contact Information
us****@****om
(386) 825-5501
Location
Melbourne, Victoria, Australia
Languages
  • English
  • Hindi
  • Telugu
  • Kannada
  • Tamil


Recommendations

Rated 5.0 / 5.0, based on 2 ratings

Esin Dundar

I enjoyed working closely with Sudheer in building the foundation for the Regional Customer Data Platform at Newscorp. It was a pleasure to work with him. I found Sudheer to be highly organised, focused, and clear in supplying solution designs, guiding delivery, and establishing himself as the glue between the various delivery teams. He required minimal supervision and quickly earned the respect of both business and IT stakeholders with his roll-up-the-sleeves approach across all aspects of the project. He worked tirelessly on this very complex program and showed genuine care in listening to and understanding the business requirements. I recommend Sudheer as a key member of any data program.

LinkedIn User

Sudheer is technically very good. He is a quick learner with a "can do" attitude: ask him anything, and he will rarely say no. We both joined Sun at the same time. Though I had more years of experience than him when we joined, I was very impressed by the way he worked, his coding style in C and Java, and how quickly he grasped Sun system architecture and Solaris internals. He delivered more than was expected at his job level relative to his experience. We still use the dashboard he developed in Java for our group. He is very friendly and enthusiastic, and I would definitely recommend him to any company.


Credentials

  • Microsoft Certified: Azure Fundamentals
    Microsoft
    Jun 2020 - Nov 2024
  • MapR Certified Spark Developer
    MapR Technologies
    May 2017 - Nov 2024
  • SCJA (Sun Certified Java Associate)
    Sun Microsystems
    Jul 2008 - Nov 2024
  • SCJP (Sun Certified Java Programmer)
    Sun Microsystems
    Sep 2007 - Nov 2024
  • AWS Certified Solutions Architect - Associate (SAA)
    Amazon Web Services (AWS)
    Jun 2020 - Nov 2024

Experience

    • Australia
    • Utilities
    • 700 & Above Employee
    • Data Engineering Manager, EnergyAustralia
      • Aug 2022 - Present

      - Cloud Data Foundation: Define and implement cloud-based data strategies that optimize performance, scalability, and cost-efficiency using AWS cloud services. Create enterprise data models for self-service analytics at all levels, enabled for 16 key internal business units.
      - Data Architecture Design: Develop end-to-end data architecture solutions that leverage AWS services (Redshift, Lambda, Airflow/MWAA) and Databricks, covering data integration, transformation, storage, analysis, information classification (RBAC), and security.
      - Databricks Implementation: Architect data processing pipelines, analytics workflows, and machine learning pipelines using Databricks' unified analytics platform, including ML operationalisation. Build data products and advanced analytics for internal and external users.
      - Data Governance: Establish data governance frameworks and practices for maintaining data quality, security, and compliance across AWS and Databricks environments, using Alation.
      - Solution Collaboration: Collaborate with enterprise architecture, security architecture, data engineers, data scientists, and business stakeholders to understand data requirements and translate them into effective architecture designs.
      - Data Security: Implement security controls and encryption mechanisms to protect sensitive data stored and processed within AWS and Databricks.
      - Project Management: Plan and manage data engineering projects, including setting priorities, allocating resources, monitoring progress, and measuring success. Coach and mentor team members on technologies, techniques, and excellence in data delivery.
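The information classification (RBAC) controls mentioned above can be illustrated with a minimal sketch. The classification levels and role names below are invented for illustration; in practice these controls live in AWS and Databricks access-control configuration rather than application code:

```python
# Hypothetical classification levels mapped to the roles allowed to read each.
# Level and role names are illustrative, not an actual production scheme.
POLICY = {
    "public":       {"analyst", "engineer", "admin"},
    "internal":     {"analyst", "engineer", "admin"},
    "confidential": {"engineer", "admin"},
    "restricted":   {"admin"},
}

def can_read(role: str, classification: str) -> bool:
    """Role-based access check: may `role` read data at this level?
    Unknown classification levels default to deny."""
    return role in POLICY.get(classification, set())

print(can_read("analyst", "internal"))    # True
print(can_read("analyst", "restricted"))  # False
```

The deny-by-default lookup for unknown levels mirrors the usual posture of such schemes: data without a classification is treated as the most sensitive.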

    • Australia
    • Technology, Information and Media
    • 700 & Above Employee
    • Program Architect - Customer Data Platform, News Corp
      • Sep 2021 - Aug 2022

      Lead solution architecture development and delivery for a multi-stage strategic transformation program. Design and deliver end-to-end high-level solutions that derive maximum business benefit and leverage more value from the investment in the CDP (Customer Data Platform) ecosystem.
      - Design solutions aligned to the domain architecture roadmap for the Commercial domain, with adherence to architecture methods and standards.
      - Interpret stakeholder goals; design and cost related solutions.
      - Define and maintain target and transition-state architectures for the program.
      - Define and guide customer data design and modelling for consumption across Commercial and Marketing use cases.
      - Provide day-to-day technical leadership across permanent and vendor architect resources assigned to the program.
      - Provide architecture governance for technical designs across the program.
      - Proactively identify and address delivery and technical issues to ensure business outcomes are delivered.
      - Define and guide proof-of-concept delivery across the program.
      - Contribute to a high-performing delivery culture of engaged and empowered employees.

    • Australia
    • Banking
    • 700 & Above Employee
    • Solution Designer & Lead @ NAB Data Hub
      • Jun 2020 - Sep 2021

      Design and implement data governance mechanisms and compliance standards through metadata harvesting augmented with business metadata, cataloguing, end-to-end lineage capture, and data quality management and profiling. Designer at NAB Data Hub (NDH), one of Australia's renowned banking-grade data lakes on AWS.
      - Designing and implementing mechanisms to automate the capture of technical metadata and augment it with business metadata and cataloguing, using AWS Glue Catalog, Hive event listeners, SNS/SQS, DynamoDB, API Gateway, Neptune graph database, and Lambda functions.
      - Establishing a dynamic data lineage DAG from source to target systems across multi-tier technologies such as Informatica BDM, Spark jobs, Redshift stored procedures, and Python Lambda scripts.
      - Devising a mechanism to identify data lineage gaps and to score data quality and profiling.
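The lineage-gap detection described above can be sketched in miniature. The real implementation used Neptune, Glue Catalog, and Lambda; the core idea of a dataset-level lineage DAG with gap detection can be shown with a plain adjacency map (all job and dataset names here are hypothetical):

```python
from collections import defaultdict

# Lineage edges captured from harvested metadata: job -> (inputs, outputs).
# Job and dataset names are illustrative only.
JOBS = {
    "ingest_accounts":  (["src.accounts_extract"], ["raw.accounts"]),
    "conform_accounts": (["raw.accounts"], ["curated.accounts"]),
    "monthly_report":   (["curated.accounts", "curated.balances"], ["mart.monthly_report"]),
}

def build_lineage(jobs):
    """Build a dataset-level DAG: downstream dataset -> set of upstream datasets."""
    upstream = defaultdict(set)
    for inputs, outputs in jobs.values():
        for out in outputs:
            upstream[out].update(inputs)
    return upstream

def lineage_gaps(upstream, known_sources):
    """Datasets that are consumed somewhere but neither produced by any job
    nor registered as a known source-system feed: these are lineage gaps."""
    produced = set(upstream)
    referenced = {d for ups in upstream.values() for d in ups}
    return sorted(referenced - produced - known_sources)

upstream = build_lineage(JOBS)
# 'curated.balances' is consumed but never produced or registered: a gap.
print(lineage_gaps(upstream, known_sources={"src.accounts_extract"}))
```

A production version walks the same structure, but the edges come from harvested job metadata (Informatica mappings, Spark plans, stored-procedure parses) rather than a hand-written dict.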

    • United States
    • Software Development
    • 700 & Above Employee
    • Sr. Solutions Architect, Hortonworks
      • Nov 2018 - Jun 2020

      - Designing and deploying three-tier architectures for large-scale Hadoop solutions.
      - Architecting and building feature enhancements for time-sensitive, real-time machine learning based cybersecurity solutions using Apache Metron.
      - Working across all delivery models: cloud, data centre, and hybrid.
      - Building monitoring, alerting, and visualisation solutions for smoother platform operations.
      - Machine learning lifecycle management using Cloudera Data Science Workbench.
      - Working closely with the open-source engineering community to drive innovation in the big data space.
      - Analysing complex distributed production deployments and making recommendations to optimise performance.
      - Working closely with Hortonworks teams at all levels to help ensure the success of consulting engagements, and driving customer projects to successful completion.
      - Writing and producing technical documentation, blogs, and knowledge-base articles.

    • Australia
    • Telecommunications
    • 700 & Above Employee
    • Technology Researcher/Consultant (Big Data & ML Engineering), NBN
      • Nov 2017 - Nov 2018

      Focused on building NBN's core competency in analytics, specifically big data engineering: scanning the market for new technologies, designing experiments, championing selected technologies internally, and implementing NBN business solutions on an established, cloud-based big data and analytics framework. Involved in:
      - Building and hardening enterprise data ingestion pipelines using NiFi, Airflow, and Apache Spark.
      - Driving the operationalisation of selected analytical models by creating portable ML models in PMML, MLeap, SageMaker, and Spark ML Pipelines.
      - Implementing graph analytics using Neo4j and Spark GraphX.
      - Defining standards for data pipelines supporting NBN's analytics platform, metadata management, and data catalogue using Glue, CKAN, Apache Atlas, and Apache MetaModel; exploring the design and build of an NBN-wide feature store.
      - Working with container deployment and orchestration technologies such as Kubernetes, including service discovery; deploying and running applications such as NiFi and Spark jobs as Docker images on Kubernetes, with monitoring, scheduling, and load balancing; exploring AWS ECS, EKS, and Fargate as alternatives to Kubernetes on EC2.
      - Collaborating with the Principal Big Data Engineer on architecture and technology selection, and on database performance profiling across AWS Athena, AWS Redshift, Redshift Spectrum, Aurora Postgres, Presto, and Spark SQL.
      - Building serverless application stacks and response dashboards using ReactJS, NodeJS, Lambda, and Glue.
      - Supporting prototyping, shakeout, and stand-up of new analytics capabilities, implementing data wrangling and feature extraction in Spark with Scala.
      - Working with business teams to iterate and refine visualisations of analytical outcomes.
      - Applying DevOps, continuous integration, delivery, and testing to productionise ML models.
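The "portable ML model" idea above (PMML, MLeap) can be illustrated with a toy: trained parameters are exported to a plain data format so any runtime can score them without the training framework. The JSON layout, weights, and feature names below are invented for illustration, not the PMML schema:

```python
import json

# Toy "trained" linear model; in practice the parameters would come from
# Spark ML or SageMaker, and the artifact format would be PMML or MLeap.
model = {"type": "linear", "weights": {"temp": 0.5, "load": 2.0}, "bias": 1.0}

exported = json.dumps(model)  # portable, runtime-agnostic artifact

def score(artifact: str, features: dict) -> float:
    """Re-hydrate the exported model and score one feature vector."""
    m = json.loads(artifact)
    return m["bias"] + sum(m["weights"][k] * v for k, v in features.items())

print(score(exported, {"temp": 10.0, "load": 3.0}))  # 1.0 + 5.0 + 6.0 = 12.0
```

Decoupling the artifact from the training runtime is the whole point: the scoring side only needs a parser and the scoring rule, not Spark.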

    • Australia
    • Banking
    • 700 & Above Employee
    • Big Data Architect, ANZ
      • Jul 2017 - Nov 2017

      Architecting, leading, and implementing a generic data ingestion pipeline for ANZ Bank: metadata-driven ingestion from any type of data source, in any format, batch or stream, onto the Hadoop data lake.
      - Use-case identification, design proposition, implementation, and performance assessment, with thorough DevOps adherence: continuous integration (Git), continuous deployment (Bamboo), and continuous testing (Bamboo, JUnit, SonarQube, Fortify, etc.).
      - Designing the big data application with attention to data/schema registration, data access, metadata-driven orchestration (Oozie/Airflow), data consistency and quality checks, ingestion processing, data profiling checks, data lineage, alerting/monitoring, visualisation (Tableau/QlikView), and persistence (Hive/HBase/Impala/Kudu).
      - Exposure to data modelling and Data Vault techniques.
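The metadata-driven ingestion theme described above can be sketched in miniature: a registry entry describes each feed's format and mode, and a dispatcher picks the matching reader, so onboarding a new feed means adding metadata, not code. The registry fields, feed names, and handlers below are illustrative assumptions, not the actual pipeline's schema:

```python
import csv
import io
import json

# Hypothetical metadata registry: one entry per feed, no per-feed code.
REGISTRY = {
    "customer_feed": {"format": "csv",  "mode": "batch"},
    "txn_stream":    {"format": "json", "mode": "stream"},
}

def read_csv(payload: str):
    return list(csv.DictReader(io.StringIO(payload)))

def read_json(payload: str):
    return [json.loads(line) for line in payload.splitlines() if line.strip()]

READERS = {"csv": read_csv, "json": read_json}

def ingest(feed: str, payload: str) -> dict:
    """Look up the feed's metadata and dispatch to the matching reader.
    A real pipeline would then run schema registration, quality checks,
    profiling, and lineage capture on the parsed records."""
    meta = REGISTRY[feed]
    records = READERS[meta["format"]](payload)
    return {"feed": feed, "mode": meta["mode"], "rows": len(records), "records": records}

result = ingest("customer_feed", "id,name\n1,Ada\n2,Grace\n")
print(result["rows"], result["records"][0]["name"])  # 2 Ada
```

The same dispatch shape scales to the batch/stream split: the `mode` field would select an Oozie/Airflow schedule versus a streaming consumer, again driven purely by metadata.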

    • United States
    • Aviation & Aerospace
    • 700 & Above Employee
    • Big Data Architect, Boeing
      • Apr 2015 - Jul 2017

      Streaming data analytics on flight sensor data, text mining on aircraft maintenance data, and Hadoop and big data infrastructure establishment at the Boeing Research & Technology Division. Involved in:
      - Airplane health prediction analytics streaming pipeline.
      - Flight data uplink/downlink using Azure cloud.
      - InkBotics monitoring and administration application.
      - Maintenance Adaptive Decision Support (MxADS).
      - Designed a Lambda architecture to achieve real-time prognosis, taking advantage of both batch and stream processing to handle increasing volume, velocity, variety, and veracity.
      - Researched and benchmarked recent advancements in event processing of time-series data streams and identified target approaches for further development and use.
      - Implemented streaming, batch, and hybrid topologies in Scala using Spark, Spark SQL, and Spark ML.
      - Tuned Spark for optimal utilisation of CPU cores, executors, and memory; profiled jobs using the Spark UI.
      - Data exploration, feature selection, and model development using Spark 1.6 ML and MLlib in Scala.
      - A typical lifecycle involves pre-processing, buffering, sliding windows, handling late/missing data, feature extraction, applying models/rules, state management, prediction, classification, and event/alert generation.
      - Involved in setting up a Hadoop cluster using Cloudera, plus a staging server for data archival.
      - Developed near real-time analytics POCs in IBM InfoSphere Streams, Apache Storm, and Apache Spark.
      - Understanding of classification and regression machine learning methods.
      - Delivered a responsive web UI demonstrating the real-time streaming concept.
      - Developed data and result visualisations (bar, spline, box, and time-series charts) using HighCharts.js.
      - Developed interactive visualisations in Tableau, integrated into a dashboard.
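The streaming lifecycle listed above (buffer, sliding window, handle late data, extract features, apply rules, alert) can be sketched with a toy event-time window. The window size, threshold, and event values are invented for illustration; the production topologies ran in Scala on Spark:

```python
from collections import deque

WINDOW = 5          # seconds of event time per sliding window (illustrative)
THRESHOLD = 100.0   # alert if the window mean exceeds this (illustrative)

def windowed_alerts(events, allowed_lateness=2):
    """events: (event_time, value) pairs, possibly slightly out of order.
    Buffers events, drops those arriving later than allowed_lateness behind
    the watermark, evicts events that fall out of the sliding window, and
    emits an alert whenever the window mean exceeds THRESHOLD."""
    buffer, alerts, watermark = deque(), [], 0
    for t, value in events:
        watermark = max(watermark, t)
        if t < watermark - allowed_lateness:
            continue  # too late: drop (a real job would side-output these)
        buffer.append((t, value))
        # Evict events older than the sliding window (toy eviction: head only).
        while buffer and buffer[0][0] <= watermark - WINDOW:
            buffer.popleft()
        mean = sum(v for _, v in buffer) / len(buffer)  # feature extraction
        if mean > THRESHOLD:
            alerts.append((watermark, round(mean, 1)))  # rule-based alert
    return alerts

stream = [(1, 90.0), (2, 95.0), (4, 120.0), (3, 110.0), (6, 130.0)]
print(windowed_alerts(stream))  # [(4, 101.7), (4, 103.8), (6, 113.8)]
```

Note the out-of-order event (3, 110.0): it is still within the lateness bound, so it is folded into the current window rather than dropped, which is exactly the late-data handling step in the lifecycle above.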

    • Senior Lead Software Engineer, Narus
      • Sep 2010 - Dec 2014

      Developed applications for Narus Networks' cybersecurity products and was involved in Narus Insight Analytics development using IBM InfoSphere Streams, MapReduce, Apache Storm, HBase, Memcached, MongoDB, and Java.

    • United States
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Sr. Java Consultant
      • Jan 2009 - Sep 2010

      Worked on Verizon's MyBusiness applications, mostly on Self Service and eCommerce.

    • United States
    • Staffing and Recruiting
    • 1 - 100 Employee
    • Sr. Java Consultant
      • Aug 2008 - Dec 2008

      Involved in the Saved Search refactoring project for recruitladder.com. Worked on theladders.com internationalization and localization for European locales.

    • United States
    • IT Services and IT Consulting
    • 700 & Above Employee
    • Member Technical Staff
      • Dec 2003 - Jul 2008

      Application Developer. Test Developer.

Education

  • Symbiosis Institute of Management Studies
    MBA, PGDIT
    2006 - 2008
  • Jawaharlal Nehru Technological University
    B.Tech, Computer Science
    1999 - 2003
