
Sameera Silla

Development
Texas, United States

Skills

Data Engineering

About

Sameera Silla's skills align with System Developers and Analysts (Information and Communication Technology). Sameera also has skills associated with Database Specialists (Information and Communication Technology). Sameera Silla has 9 years of work experience.

Work Experience

Azure Data Engineer

Adeptia Inc
August 2021 - Present
TX, United States | Client: Adeptia Inc.
  • Orchestrated end-to-end data integration solutions using Azure Data Factory, achieving a 30% reduction in data processing time and enhancing overall data accessibility for healthcare practitioners.
  • Architected and optimized healthcare-specific data warehousing solutions with Snowflake and Azure Synapse Analytics, resulting in a 40% improvement in query performance and expediting patient data analysis for more informed decision-making.
  • Managed scalable storage and retrieval of raw and processed healthcare data using Azure Data Lake Storage, leading to a 25% increase in data accessibility across analytics, research, and reporting scenarios.
  • Conducted performance tuning and optimization in Azure SQL databases tailored to healthcare transactional data, resulting in a 20% reduction in query response time and improved efficiency in data retrieval and manipulation.
  • Leveraged Azure Databricks to enhance data processing and analysis workflows specific to healthcare use cases, contributing to a 15% increase in overall data processing efficiency and agility.
  • Developed and implemented intricate ETL processes using SQL, transforming raw healthcare data into actionable insights.
  • Established and maintained rigorous data validation and quality assurance protocols within ETL pipelines, resulting in a 25% reduction in data errors and enhancing the reliability of clinical reports and patient records.
  • Collaborated closely with healthcare professionals to design robust data pipelines supporting machine learning and artificial intelligence initiatives, empowering data-driven decision-making in patient care and medical research.
  Environment: SQL, Snowflake, Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks

ETL Developer

Wells Fargo
January 2020 - July 2021
AZ, United States | Client: Capgemini - Wells Fargo
  • Developed workflows to automate daily ETL processes through scheduled jobs using Ab Initio, UNIX/Linux shell scripting, and SQL.
  • Designed, built, and supported ETL pipelines that automate the ingestion of structured and unstructured data.
  • Ingested data from upstream applications into the data lake (HDFS), then processed and analyzed it per business requirements to enable reporting capabilities.
  • Developed Autosys Scheduler jobs for new workflows and updated the JILs when modifications were required.
  • Performed regular system audits to ensure the accuracy of the various environments and applications, monitored Autosys batch processes, and performed troubleshooting using logs, UNIX scripts, and database SQL processes.
  • Handled OS patching activities and identified and resolved data quality, environment, and production issues, reducing overall downtime by at least 60%.
  • Checked UNIX logs to resolve performance, server, and infrastructure issues within SLA by coordinating with L2 and DBA teams.
  Environment: Ab Initio, ETL, SQL, UNIX/Linux, Shell Scripting, HDFS, Autosys
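The Autosys jobs mentioned above are defined in JIL (Job Information Language). A minimal sketch of such a definition, with hypothetical job names, hosts, and paths (illustrative only, not taken from the role above):

```jil
/* Hypothetical daily ETL box job; all names and paths are illustrative */
insert_job: etl_daily_box   job_type: BOX
owner: etluser
date_conditions: 1
days_of_week: all
start_times: "02:00"

/* Command job inside the box that runs the ETL shell script */
insert_job: etl_daily_load   job_type: CMD
box_name: etl_daily_box
command: /opt/scripts/run_etl.sh
machine: etl_host01
std_out_file: /var/log/etl/etl_daily_load.out
std_err_file: /var/log/etl/etl_daily_load.err
```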

ETL Developer/Business Analyst

First Tech Federal Credit Union
May 2019 - December 2019
  • Acted as a liaison between cross-functional teams to understand and gather business requirements, and documented the artifacts necessary for development.
  • Contributed technical expertise to design and development discussions, product enhancement activities, and code reviews.
  • Gathered knowledge of existing operational sources for future enhancements and performance optimization of graphs.
  • Involved in analysis, coding, unit testing, user acceptance testing, production implementation, and system support for the Enterprise Data Warehouse application.
  • Documented technical artifacts such as the Supporting Guide, User Guide, and Low Level Document (LLD) to help business and application users understand the product.
  • Migrated a legacy data warehouse application and database objects to Snowflake across Development, Testing, UAT, and Production environments.
  • Analyzed and validated source systems and verified migrated data using Tableau dashboards.
  Environment: SQL, Ab Initio, Snowflake, UAT

Ab Initio Developer

Maslow Media Group Inc
March 2019 - April 2019
United States
  • Extensively involved in Ab Initio graph design, development, and performance tuning.
  • Monitored jobs through Ab Initio Control Center and Tivoli.
  • Developed and executed parameter sets (psets) for generic graphs to process data from multiple sources.
  • Worked on enhancement activities based on client requirements.
  Environment: Ab Initio, UNIX, SQL

Data Specialist - ETL

IBM
January 2017 - February 2019
  • Debugged and cleansed data from source systems using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter-by-Expression, and Rollup.
  • Developed generic Ab Initio graphs for data cleansing, data validation, and data transformation, and tuned them for better performance.
  • Used Ab Initio components such as Partition by Key, Sort, Reformat, Join, and Dedup to implement business logic.
  • Used UNIX environment variables in all Ab Initio graphs to specify the locations of source and target files.
  • Created sandbox parameters to check graphs in and out of repository systems and edited them accordingly.
  • Developed BTEQ scripts for table and view development using Teradata SQL Assistant, and tuned existing BTEQ scripts to increase performance.
  • Created source datasets, destination datasets, and feed settings in Express IT based on the mapping documentation.
  • Monitored production jobs on scheduling applications such as Tivoli, Autosys, and Control Center.
  • Documented technical artifacts such as the Supporting Guide, Implementation Guide, Low Level Document (LLD), and High Level Document (HLD).
  Environment: Ab Initio GDE (Graphical Development Environment), Ab Initio MDWP ExpressIT, Teradata SQL Assistant, PuTTY, Tivoli, Control Center
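BTEQ scripts of the kind described above typically wrap Teradata SQL in logon, error-handling, and logoff directives. A minimal sketch with hypothetical database, view, and credential names (illustrative only, not taken from the role above):

```sql
.LOGON tdprod/etl_user,password;          -- hypothetical host and credentials

-- Create a view over a staging table (illustrative object names)
CREATE VIEW edw_views.customer_v AS
SELECT customer_id, customer_name, updated_ts
FROM edw_stage.customer_stg;

-- Abort with a nonzero return code if the previous statement failed
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```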

Associate System Engineer

Vivos Corp
July 2015 - December 2016
  • Worked closely with architects when designing data warehouses and with business analysts to understand business needs, interacting with other team members to complete tasks as scheduled.
  • Developed graphs using the Graphical Development Environment (GDE) with various Ab Initio components.
  • Good understanding of the Agile Scrum process and extensive experience working in a collaborative environment.
  • Involved in the end-to-end ETL process, including unit testing and deployment.
  Environment: Ab Initio, ETL, Data Warehousing

Education

Gandhi Institute of Technology and Management

Bachelor's