
Jay Gohel

Development
Maharashtra, India

Skills

Data Engineering

About

Jay Gohel's skills align with System Developers and Analysts (Information and Communication Technology). Jay also has skills associated with Database Specialists (Information and Communication Technology). Jay Gohel has 7 years of work experience.

Work Experience

Lead Consultant

Genpact
April 2023 - Present
  • Devised Python scripts to extract and transform data from 50+ Excel files into database tables, cutting manual data entry time by 80% and eliminating errors.
  • Developed and optimized complex SQL queries to extract and transform data in line with BRD specifications, reducing processing time by 40%.
  • Troubleshot existing SQL queries, effectively pinpointing data and logic discrepancies.
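The Excel-to-database pattern in the first bullet can be sketched as follows. This is a minimal illustration, not the actual scripts: CSV input and an in-memory SQLite database stand in for the Excel files and production tables, and `load_report` and the column names are invented for the example (a real pipeline would read `.xlsx` files with a library such as openpyxl or pandas).

```python
import csv
import sqlite3
from pathlib import Path

def load_report(csv_path, conn):
    """Read one report file and insert its rows into the 'sales' table.
    Illustrative stand-in for an Excel-to-database load step."""
    with open(csv_path, newline="") as f:
        rows = [(r["region"], int(r["amount"])) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
    # Write a tiny sample file to demonstrate the load step.
    sample = Path("report1.csv")
    sample.write_text("region,amount\nWest,100\nEast,250\n")
    loaded = load_report(sample, conn)
    total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
    print(loaded, total)
```

In practice each of the 50+ files would be discovered with a glob and loaded in a loop, with per-file validation replacing the manual entry step.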

Senior Data Engineer

SpringML Inc
January 2022 - April 2023
  • Spearheaded implementation of a Jupyter Notebook automation framework on GCP, reducing manual work by 75% and accelerating execution times by 90%.
  • Automated the labor-intensive Excel-to-BigQuery data extraction process using Python and SQL, ensuring accuracy and efficiency.
  • Architected an end-to-end ETL pipeline integrating Python, Spark, and GCP services for scalable data processing.
  • Engineered Terraform scripts that automate GCP service provisioning, minimizing manual effort.

Senior Product Consultant

Verisk Financial Fintellix
July 2017 - September 2021
  • Spearheaded the complete development and production deployment of an NPA application for a small finance bank in India within 90 days.
  • Engaged throughout the entire SDLC, from creating Functional Specification Documents (FSDs) through successful deployment to production.
  • Designed and built complex database components, including procedures and views, aligned with project requirements; optimized existing queries, reducing execution time by 50%.
  • Managed data warehousing activities, including ETL of data from diverse source systems, using a combination of Informatica PowerCenter and PySpark.
  • Developed PySpark-based ETL pipelines tailored to business needs, increasing data processing speed by 60%.
  • Orchestrated the migration of 15+ ETL pipelines from Informatica to Apache Spark, leveraging Python for improved efficiency and performance.
  • Played a pivotal role in all phases of the SDLC, consistently delivering high-quality outputs within 75% of the allotted time.
  • Identified and resolved issues within ETL pipelines to ensure smooth data flow and integrity.
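A simplified sketch of the kind of transformation that moves from an Informatica mapping into code during such a migration. Plain Python is used here for brevity; in the migrated pipelines this logic would run as PySpark DataFrame operations. The NPA classification rule, field names, and functions (`classify`, `npa_exposure`) are illustrative assumptions, not taken from the actual application.

```python
from dataclasses import dataclass

# Illustrative threshold: loans more than 90 days past due are
# treated as non-performing assets (NPA) in this sketch.
NPA_DPD_THRESHOLD = 90

@dataclass
class Loan:
    account_id: str
    days_past_due: int
    outstanding: float

def classify(loans):
    """Tag each loan NPA/standard -- the kind of filter/expression step
    an Informatica mapping encodes, rewritten as plain code. In a
    PySpark pipeline this would be a withColumn over a DataFrame."""
    return [
        (loan.account_id,
         "NPA" if loan.days_past_due > NPA_DPD_THRESHOLD else "standard")
        for loan in loans
    ]

def npa_exposure(loans):
    """Aggregate outstanding balance over NPA accounts
    (the aggregator step of the mapping)."""
    return sum(l.outstanding for l in loans
               if l.days_past_due > NPA_DPD_THRESHOLD)
```

Expressing each mapping as small, testable functions like these is what makes a 15+ pipeline migration tractable: every Informatica transformation gets a code equivalent that can be verified against the legacy output before cutover.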

Education

Liverpool John Moores University

MSc In Data Science

Rajiv Gandhi Institute of Technology

Bachelor of Engineering in Electronics and Telecom