Umar Hayat

Development
Punjab, Pakistan

Skills

Data Engineering

About

Umar Hayat's skills align with Programmers (Information and Communication Technology), and he also has skills associated with Database Specialists (Information and Communication Technology). He has 9 years of work experience.

Work Experience

PRINCIPAL DATA ENGINEER

SYSTEMS LIMITED
January 2023 - Present
  • Project: Regeneron
  • Regeneron is a leading biotechnology company that invents, develops, and commercializes life-transforming medicines for people with serious diseases. Its FDA-approved medicines include Arcalyst (rilonacept) for specific and rare autoinflammatory conditions, Eylea (aflibercept injection) to treat a common cause of blindness in the elderly, and Dupixent (dupilumab injection) for the treatment of atopic dermatitis in adolescent and adult patients
  • Worked as an offshore data engineer developing data pipelines on the AWS cloud
  • Developed multiple pipelines to automatically ingest, process, store, and deliver medicines data from a data lake using Python, Apache Spark, EMR, Athena, EKS, S3, RDS, SQL, and Apache Airflow

SENIOR DATA ENGINEER

SMARTZIP ANALYTICS
March 2020 - December 2022
  • Project: SmartZip Analytics
  • SmartZip Analytics is a national leader in predictive marketing solutions for the real estate industry. Using patent-pending predictive analytics, automated marketing campaigns, and smart CRM follow-up tools, it makes big data actionable for real estate and mortgage lending professionals. One of its flagship products, SmartTargeting, helps agents land more listings by identifying and targeting the homeowners most likely to sell in any neighborhood across the U.S.
  • Led a data engineering team running operations smoothly on the AWS cloud
  • Built and managed data intake processes, analytics pipelines, client deliveries, and product operations
  • Developed multiple pipelines to automatically ingest, process, and store real estate listings using Python, shell scripting, Hadoop, Apache Spark, Sqoop, Hive, EMR, Athena, AWS EC2, S3, RDS, SQL, and Amazon Redshift
  • Optimized data pipelines by writing efficient SQL queries and following Spark best practices
  • Improved the performance of the automated valuation model from 60% to 80%

SENIOR SOFTWARE ENGINEER

EBRYX
December 2017 - February 2020
  • Project: Netsentinal
  • This was a log-parsing project: given network ingress traffic and a pool of malicious log files, the task was to identify traffic coming from malicious IPs
  • Developed a data pipeline to parse log files using Python, Scala, Apache Spark, and Cassandra
  • Used Apache Airflow to orchestrate the workflows
  • Developed reports in Apache Zeppelin to summarize malicious traffic
  • Project: Bolton
  • This was a project for Pertemps, a UK-based medical recruitment agency, to hire doctors and nurses by building web portals and a mobile app
  • Developed web portals for different roles, including recruiters, restricted recruiters, super recruiters, compliance, super compliance, job entry team, and admins, using Angular.js, Node.js, Express.js, and PostgreSQL
  • Worked as a full-stack web developer, building the frontend and backend REST APIs deployed on the AWS cloud using EC2 and RDS Postgres
  • Improved the performance of reports by writing database-level functions in PostgreSQL

SOFTWARE ENGINEER

NORTHBAY SOLUTIONS
August 2015 - November 2017
  • Project: Resident World
  • This was a real-time bidding (RTB) project with three teams (Resident World, AxisPoint, and Northbay) building a serverless architecture on AWS using the latest technologies and standard best practices
  • Worked as a backend developer, building APIs that were later used by web and mobile application developers
  • Developed and deployed APIs on the AWS cloud using Python, AWS DynamoDB, RDS MySQL, Lambda functions, API Gateway, EC2, and SQL
  • Developed reports in AWS QuickSight as per the business requirements
  • Project: Scholastic
  • Scholastic Corporation is among the most recognized names in the education industry and is the world's largest publisher and distributor of children's books and print and digital educational materials for pre-K to grade 12
  • This project migrated Big Data and Analytics workloads from IBM AS/400 technologies to AWS
  • Developed multiple Python scripts to import data from FTP to S3 and load it into AWS Redshift
  • Transformed and aggregated data by writing SQL scripts
  • Automated workflows on AWS using EC2, Jenkins, and Bitbucket

Education

University of The Punjab

BS