Krishna Nuvvala
Development
Texas, United States
Skills
Data Engineering
About
Krishna Nuvvala's skills align with Programmers (Information and Communication Technology). He also has skills associated with System Developers and Analysts (Information and Communication Technology). Krishna Nuvvala has 6 years of work experience.
Work Experience
Data Engineer- Intern
Advithri Technologies
January 2023 - November 2023
- Zocdoc is a healthcare-domain project: ingest RDBMS data (Oracle/MySQL) and SFTP data and store it in AWS S3. Import and export data from Oracle and MySQL through Sqoop and store it in S3. Apply transformation rules on top of the different datasets and store the output in the desired format (CSV to JSON). Schedule the tasks in Oozie and scale up automatically based on data volume. Store the final data in Redshift and S3 in the desired format, and coordinate with the reporting team to create dashboards.
* Ingested data from Oracle and MySQL to S3, where it can be queried using Hive and Spark SQL tables.
* Worked on Sqoop jobs for ingesting data from MySQL to Amazon S3.
* Created Hive external tables for querying the data.
* Used Spark DataFrame APIs to ingest Oracle data to S3 and store it in Redshift; wrote a script to move RDBMS data to Redshift (see the sketch below).
* Optimized Hive and Spark performance; identified errors using logs.
* Automatically scaled up EMR instances based on data volume; applied transformation rules on top of DataFrames.
* Ran and scheduled the Spark scripts in EMR pipelines; processed Hive, CSV, JSON, and Oracle data together (POC).
* Validated and debugged the scripts between source and destination; validated the source and final output data.
Environment: Hadoop, HDFS, Hive, SQL, Spark, Python, Shell Scripting, AWS S3, EMR, EC2, Lambda
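A minimal PySpark sketch of the Oracle-to-S3-to-Redshift flow described above. Hostnames, credentials, bucket, and table names (oracle-host, APPT.BOOKINGS, my-bucket, redshift-host, public.bookings) are hypothetical placeholders, not details from the original project, and the Oracle and Redshift JDBC drivers are assumed to be on the Spark classpath.

# Sketch: read from Oracle over JDBC, land JSON on S3, push to Redshift.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("oracle_to_redshift").getOrCreate()

# Read the source table from Oracle over JDBC.
src = (spark.read.format("jdbc")
       .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # hypothetical host
       .option("dbtable", "APPT.BOOKINGS")                             # hypothetical table
       .option("user", "etl_user")
       .option("password", "etl_password")
       .option("fetchsize", "10000")
       .load())

# Apply a simple transformation rule, then land the data on S3 as JSON
# (the CSV-to-JSON conversion mentioned above).
out = src.withColumn("load_dt", F.current_date())
out.write.mode("overwrite").json("s3://my-bucket/bookings/")           # hypothetical bucket

# Push the same frame to Redshift over JDBC; a COPY from S3 via the
# spark-redshift connector would be the higher-throughput alternative.
(out.write.format("jdbc")
    .option("url", "jdbc:redshift://redshift-host:5439/dev")           # hypothetical cluster
    .option("dbtable", "public.bookings")
    .option("user", "rs_user")
    .option("password", "rs_password")
    .mode("append")
    .save())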
Data Engineer
Accenture
June 2021 - August 2022
- Designed and developed automation logic for revenue-generating customer journeys within the credit and fraud risk organization, supporting strategies with benefit targets of $20M, executed in a Hadoop/HDFS environment.
* Managed, designed, and developed applications using Hive, Impala, Shell Scripting, HDFS, Jenkins, and the Eclipse IDE.
* Performed validation with GitHub and shell scripting as part of continuous integration.
* Responsible for querying the data using Hive, which runs in the backend of the rule execution engine (see the sketch below).
* Performed analysis on production data for different issues.
* Monitored the batch process for jobs that run daily on the cluster.
* Configured arguments for different nodes with different processes.
Environment: Hadoop, HDFS, Hive, SQL, Spark, Python, Shell Scripting, GCP, Glue, NiFi, Azure ADF, Jira, Snowflake
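A minimal sketch of the kind of Hive-backed query used behind a rule execution engine, assuming Spark with Hive support. The database, table, and column names (risk_db.rule_decisions, decision_cd, run_dt) are hypothetical placeholders.

# Sketch: daily validation query over a Hive table of rule decisions.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("rule_engine_validation")
         .enableHiveSupport()
         .getOrCreate())

# Summarize yesterday's rule decisions; a batch monitor could alert when
# a count falls outside its usual range.
daily = spark.sql("""
    SELECT decision_cd, COUNT(*) AS decisions
    FROM risk_db.rule_decisions              -- hypothetical Hive table
    WHERE run_dt = date_sub(current_date(), 1)
    GROUP BY decision_cd
""")
daily.show()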
Data Engineer
Techno Soft
June 2019 - May 2021
- This project develops a system that ranks, for each make/model, how quickly customers install an update relative to previous campaigns. The ranking is used to determine which customers to include in the smart20k push.
* Worked with the business users to gather and define business requirements and analyze possible technical solutions.
* Implemented Sqoop scripts to import data from Oracle to Hive.
* Developed comprehensive test cases to test data quality (see the sketch below).
Environment: Hadoop, HDFS, Hive, Spark, Git, Unix
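A minimal data-quality check sketch for an Oracle-to-Hive import of the kind described above: compare the source row count against the landed Hive table. The connection details and table names (oracle-host, FLEET.UPDATE_EVENTS, fleet_db.update_events) are hypothetical placeholders.

# Sketch: reconcile row counts between the Oracle source and the Hive target.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dq_check").enableHiveSupport().getOrCreate()

# Row count as seen by Oracle (pushed down over JDBC via the query option).
src_count = (spark.read.format("jdbc")
             .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCL")   # hypothetical host
             .option("query", "SELECT COUNT(*) AS N FROM FLEET.UPDATE_EVENTS")
             .option("user", "etl_user")
             .option("password", "etl_password")
             .load()
             .first()["N"])

# Row count in the Hive table that the Sqoop import loaded.
tgt_count = spark.table("fleet_db.update_events").count()

assert src_count == tgt_count, f"count mismatch: {src_count} != {tgt_count}"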
Data Engineer
Apex_IT
June 2017 - May 2019
- OSND Carrier Portal: a carrier portal that allows carriers to submit OS&Ds (Overages, Shortages, and Damages) and OS&D claims, and to inquire about load status at DCs (5 days out). The current portal is home-grown, built on 20-year-old technology, and unsupported, which is a major risk for the implementation of the S/4 project. If not replaced, it hinders the introduction of positive changes, e.g., the simplification of the shipment structure. The project replaces the existing Carrier Portal with an application based on the SAP Cloud Platform (SCP app).
* Analyzed the data from different dimensions and brought it into the HANA database.
* Hands-on experience in BAS.
* Worked on cloud DB modelling.
* Created complex models (calculated views) in the model.cds file (see the consumption sketch below).
* Worked on table functions.
* Experienced in creating ABAP views.
* Uploaded CSV files in the BTP cockpit and created new tables.
* Prepared the functional specification, unit test plan document, and operations guide.
* Involved in the design, development, and maintenance of Webi reports.
* Involved in build and deployment; pushed the code to Git.
* Knowledge of SDI.
Environment: SAP HANA, SQL, Git, CAP DB Modelling
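A minimal sketch of consuming a HANA calculated view from Python via the hdbcli driver. The host, credentials, and view name (hana-host, DBADMIN, OSND_CLAIMS_BY_CARRIER) are hypothetical placeholders; the actual deployed view name would depend on the CAP namespace used in model.cds.

# Sketch: query a deployed calculated view over the HANA SQL interface.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host",   # hypothetical HANA host
    port=443,
    user="DBADMIN",
    password="secret",
    encrypt=True,
)
try:
    cur = conn.cursor()
    # Models defined in model.cds are exposed as SQL views after deployment.
    cur.execute('SELECT TOP 10 * FROM "OSND_CLAIMS_BY_CARRIER"')
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()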