Saikumar L
Development
CA, United States
Skills
Python
About
Sai Kumar's skills align with Programmers (Information and Communication Technology), and he also has skills associated with System Developers and Analysts in the same field. Sai Kumar has 7 years of work experience.
Work Experience
Python/AWS Developer
October 2021 - April 2022
- Responsibilities:
  * Configured applications on AWS EC2 instances and staged artifacts in S3 buckets; created S3 buckets, bucket policies, and IAM role-based policies, and configured user access levels.
  * Used Python libraries such as Boto3 and NumPy for AWS automation.
  * Hands-on experience using IAM to create roles, users, and groups, and MFA to provide additional security for AWS accounts and their resources.
  * Implemented REST calls to consume REST APIs using Angular GET and POST methods, and tested the calls using Postman.
  * Used the Angular Routing Module to implement routing, including role-based routing, for a Single Page Application.
  * Used regular expressions in combination with Angular built-in and custom filters for faster search results.
  * Worked with Terraform to automate VPCs, ELBs, security groups, SQS queues, and S3 buckets, continuing to replace the rest of the infrastructure.
  * Delivered code efficiently following Test-Driven Development (TDD) and continuous integration, in line with Agile software methodology.
  * Developed SOAP web services for sending and receiving data from an external interface in XML format.
  * Deployed the project to Amazon S3.
  * Implemented a RESTful API that returns data from MongoDB in JSON format.
  * Applied cloud-based DevOps practices, specifically Terraform and OpenShift.
  * Containerized applications with Docker/Kubernetes and used the orchestration tools Ansible/Puppet.
  * Worked with CSV files when taking input from the MySQL database.
  * Implemented a one-time migration of multi-state data from SQL Server to Snowflake using Python and SnowSQL.
  * Loaded raw data files from URLs into an Amazon S3 bucket.
  * Developed AWS Lambda functions in Python with S3 and SQS triggers to automate workflows, and developed Python scripts to store and retrieve objects in AWS S3 buckets.
  * Developed Python scripts using Boto3 APIs to retrieve messages from various events.
  * Used Kubernetes to deploy, scale, and load-balance services, and worked with Docker Engine, Docker Hub, Docker images, and Docker Compose to handle images for installations and domain configurations.
  * Read queues in Amazon SQS whose messages carry paths to files in an Amazon S3 bucket.
  * Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
  * Worked on Amazon EC2 clusters to deploy files into buckets.
  * Automated server-infrastructure setup for DevOps services using Ansible, shell, and Python scripts.
  * Exported and imported data between various data sources.
  * Designed and implemented the CI (Continuous Integration) system: configured Jenkins servers, Jenkins nodes, and TFS, created the required scripts (Perl and Python), and created/configured VMs (Windows/Linux).
  * Used Azure Container Service with Kubernetes for orchestrating the servers.
  Environment: AWS, Glue, EMR, GIS, Spark SQL, Scala, Lambda, Python, MySQL, Auth0, Step Functions, Snowflake, Jenkins.
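The Lambda work described above can be illustrated with a minimal, hedged sketch. It only parses the standard S3 trigger event shape with the standard library; the function names are illustrative and the real project's handlers would do more (e.g. call Boto3 clients) than shown here:

```python
import json
import urllib.parse

def extract_s3_objects(event):
    """Pull (bucket, key) pairs from a standard S3 trigger event."""
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        # Object keys arrive URL-encoded in S3 events (spaces become '+').
        key = urllib.parse.unquote_plus(s3.get("object", {}).get("key", ""))
        if bucket and key:
            objects.append((bucket, key))
    return objects

def lambda_handler(event, context):
    """Entry point wired to an S3 trigger (illustrative)."""
    objects = extract_s3_objects(event)
    # A real handler would process each object with a Boto3 client here.
    return {"statusCode": 200, "body": json.dumps({"processed": len(objects)})}
```

Keeping the event parsing in a pure function like `extract_s3_objects` makes the handler testable locally, without AWS credentials.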
Python/AWS Developer
February 2020 - June 2021
- Responsibilities:
  * Configured applications on AWS EC2 instances and staged artifacts in S3 buckets; created S3 buckets, bucket policies, and IAM role-based policies, and configured user access levels.
  * Used Python libraries such as Boto3 and NumPy for AWS automation.
  * Hands-on experience using IAM to create roles, users, and groups, and MFA to provide additional security for AWS accounts and their resources.
  * Implemented REST calls to consume REST APIs using Angular GET and POST methods, and tested the calls using Postman.
  * Used the Angular Routing Module to implement routing, including role-based routing, for a Single Page Application.
  * Developed ETL pipelines into and out of the data warehouse using advanced SQL queries in Snowflake.
  * Used regular expressions in combination with Angular built-in and custom filters for faster search results.
  * Worked with Terraform to automate VPCs, ELBs, security groups, SQS queues, and S3 buckets, continuing to replace the rest of the infrastructure.
  * Delivered code efficiently following Test-Driven Development (TDD) and continuous integration, in line with Agile software methodology.
  * Developed SOAP web services for sending and receiving data from an external interface in XML format.
  * Deployed the project to Amazon S3.
  * Implemented a RESTful API that returns data from MongoDB in JSON format.
  * Applied cloud-based DevOps practices, specifically Terraform and OpenShift.
  * Containerized applications with Docker/Kubernetes and used the orchestration tools Ansible/Puppet.
  * Worked with CSV files when taking input from the MySQL database.
  * Loaded raw data files from URLs into an Amazon S3 bucket.
  * Developed AWS Lambda functions in Python with S3 and SQS triggers to automate workflows, and developed Python scripts to store and retrieve objects in AWS S3 buckets.
  * Developed Python scripts using Boto3 APIs to retrieve messages from various events.
  * Used Kubernetes to deploy, scale, and load-balance services, and worked with Docker Engine, Docker Hub, Docker images, and Docker Compose to handle images for installations and domain configurations.
  * Read queues in Amazon SQS whose messages carry paths to files in an Amazon S3 bucket.
  * Used Amazon EMR for MapReduce jobs and tested locally using Jenkins.
  * Worked on Amazon EC2 clusters to deploy files into buckets.
  * Automated server-infrastructure setup for DevOps services using Ansible, shell, and Python scripts.
  * Exported and imported data between various data sources.
  * Designed and implemented the CI (Continuous Integration) system: configured Jenkins servers, Jenkins nodes, and TFS, created the required scripts (Perl and Python), and created/configured VMs (Windows/Linux).
  * Used Azure Container Service with Kubernetes for orchestrating the servers.
  Environment: AWS, Glue, EMR, GIS, Spark SQL, Scala, Lambda, Python, MySQL, Auth0, Step Functions, Snowflake, Jenkins.
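The SQS bullets above (queues whose messages point at files in S3) can be sketched as a small, hedged parsing step; the `path` field in the message body is an assumed schema, not the project's actual message format, and the real consumer would poll the queue with Boto3:

```python
import json

def parse_s3_uri(uri: str):
    """Split an s3://bucket/key URI into (bucket, key)."""
    prefix = "s3://"
    if not uri.startswith(prefix):
        raise ValueError(f"not an S3 URI: {uri!r}")
    bucket, _, key = uri[len(prefix):].partition("/")
    return bucket, key

def paths_from_messages(messages):
    """Extract S3 (bucket, key) pairs from SQS message dicts.

    Assumes each message's 'Body' is JSON with a 'path' field holding
    an s3:// URI (an illustrative schema).
    """
    return [parse_s3_uri(json.loads(m["Body"])["path"]) for m in messages]
```

Separating URI parsing from queue polling keeps the logic unit-testable without a live SQS queue.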
Python Developer
Gem Soft Tech Solutions
May 2017 - January 2020
- Responsibilities:
  * Responsible for gathering requirements, system analysis, design, development, testing, and deployment.
  * Created Python and Bash tools to increase the efficiency of the call-center application system and operations: data conversion scripts and AMQP/RabbitMQ, REST, JSON, and CRUD scripts for API integration.
  * Developed Git hooks for code-commit functionality on the local repository and code-push functionality on the remote repository on GitHub.
  * Used Celery with RabbitMQ, MySQL, Django, and Flask to create a distributed worker framework.
  * Recorded scripts (web, web services, HTML) using VuGen and SoapUI, and validated scripts through correlations, parameterizations, and other methods; scripted web and web services.
  * Developed consumer-facing features and applications using Python, Django, HTML, Behavior-Driven Development (BDD), and pair programming.
  * Used the Django framework to access consumer data from various sources.
  * Designed table-less layouts, gradient effects, page layouts, navigation, and icons using CSS and appropriate HTML tags per W3C standards.
  * Used SoapUI to test web services for server-side changes.
  * Carried out various mathematical operations for calculation purposes using Python libraries.
  * Built the application on a service-oriented architecture using Python 3.4 and Django 1.5, with JSF 2, Spring 2, Ajax, HTML, and CSS on the frontend.
  * Implemented monitoring and established best practices around Elasticsearch.
  * Installed and configured monitoring scripts for AWS EC2 instances.
  * Implemented a task object to interface with the data-feed framework and invoke database message-service setup and update functionality.
  * Worked in a UNIX environment developing the application in Python; familiar with all of its commands.
  * Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.
  * Planned, implemented, and converted manual test cases to automated test cases.
  Environment: Python, Django, MySQL, RESTful, MS SQL Server, MongoDB, Elasticsearch, Ubuntu Server, Apache CQ, TFS, Amazon S3, Jenkins, pytest, Robot, GitHub, Linux, and Windows.
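The data-conversion and JSON scripting mentioned above can be sketched with the standard library alone; the function name and CSV layout are illustrative, not taken from the actual tooling:

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert CSV text (header row + data rows) into a JSON array of objects.

    csv.DictReader maps each data row onto the header names, which is the
    usual shape for feeding records into a REST/CRUD API.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)
```

For example, `csv_to_json("id,name\n1,Ann\n")` yields a one-element JSON array with string-valued fields; any type coercion would be an extra, schema-aware step.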
Python Engineer
Walgreens
May 2022 - Present
- Responsibilities:
  * Used Python to write data into JSON files for testing Django websites; created scripts for data modeling and data import/export.
  * Used jQuery and Ajax calls to transmit JSON data objects between the frontend and controllers.
  * Built database models, APIs, and views using Python in order to deliver an interactive web-based solution.
  * Designed and implemented data pipelines leveraging Google Cloud Platform services such as BigQuery, Bigtable, and Cloud Storage, handling gigabytes of data efficiently.
  * Thorough knowledge of front-end tools such as HTML, CSS, JavaScript, XML, jQuery, AngularJS, and AJAX; managed large datasets using pandas data frames and MySQL.
  * Used BigQuery for real-time analytics, querying massive datasets, and generating actionable insights for business stakeholders, improving decision-making processes.
  * 5+ years of experience with exceptional skills in analysis, design, and development of highly scalable multi-tiered web applications, distributed applications, and web services using Python, Django, Angular 2, Node.js, CSS, Bootstrap, and HTML.
  * Developed the UI using HTML, CSS, Bootstrap, and the Angular framework.
  * Created Python tools to increase the efficiency of the application system.
  * Implemented REST calls that consume REST APIs using the Angular Http module.
  * Used Reactive Forms and Angular form validation extensively; wrote reusable, efficient code to display form data.
  * Used the Angular Routing Module to implement routing, including role-based routing, for a Single Page Application.
  * Used regular expressions, combined with Angular built-in and custom pipes and ng2-charts, for faster search results and report generation.
  * Developed various front-end screens using Angular; created components and services using the Angular CLI.
  * Created/loaded data via an ETL process into the Teradata data warehouse from legacy systems (MySQL, SQL Server, Oracle).
  * Extracted, aggregated, and consolidated Adobe data within AWS Glue using PySpark.
  * Created PySpark data frames to bring data from DB2 to Amazon S3.
  * Used Amazon Web Services (AWS) to maintain centralized storage for servers hosted on two boxes.
  * Experience with Docker containers and container orchestration systems such as Confidential ECS, Kubernetes, and Docker Swarm.
  * Created and documented a POC to help migrate the current application to a microservice architecture, using Docker as the container technology with Kubernetes, and worked with REST APIs.
  * Used GlassFish containers on Amazon EC2 cloud servers for deployment.
  * Set up an Amazon EC2 instance, integrated Amazon S3 with the Scrum pad, and automated deployment with Capistrano.
  * Installed and managed Oracle databases, including databases on AWS (Mongo, Redshift, RDS).
  * Performed troubleshooting of customer environments, web applications, network configurations, and end-user usability for the API Gateway and API Developer Portal.
  * Created domains, application servers, and load balancers using WebLogic 9.0.
  * Implemented AWS solutions such as Route 53, EC2, S3, IAM, EBS, Elastic Load Balancer (ELB), security groups, and Auto Scaling in CloudFormation JSON templates.
  * Configured CI/CD with Docker and Kubernetes.
  * Created data pipelines for different events to load data from DynamoDB into an AWS S3 bucket and then into an HDFS location.
  * Worked with the JIRA tool for Quality Center bug tracking.
  * Developed microservices using Spring Boot, Spring AOP, and RabbitMQ for transporting and fetching data.
  * Created microservice APIs for page classification, image processing, and OCR, extracting text data from scanned documents.
  * Created Step Functions with microservice Lambda functions in AWS and used the ingestion framework.
  * Conducted data analytics and processing using AI and ML algorithms.
  * Performed database management with SQL and NoSQL stores such as MongoDB and DynamoDB.
  * Worked primarily with the back-end team, coding mainly in Python with Django and FastAPI; Flask was used in some cases (NumPy, SciPy, Pandas, OpenCV, Boto3, NLTK).
  * Created many RESTful APIs and endpoints to process paper transactions and deliver superior real-time visibility for importers, exporters, and freight forwarders.
  * Worked with Azure and AWS cloud servers/serverless and CI/CD, and carried out database management duties and Git workflows.
  Environment: HDFS, MapReduce, Hive, Kafka, ZooKeeper, Spark SQL, Angular, Docker, Kubernetes, CI/CD, Spark data frames, PySpark, Teradata, Scala, AWS, Python, JSON, SQL scripting and Linux shell scripting, Parquet, Git.
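The DynamoDB-to-S3-to-HDFS pipeline above depends on a key-naming scheme so that downstream Hive/Spark reads can prune partitions. A hedged sketch of such a scheme (the `events/` prefix and field names are assumptions, not the project's actual convention):

```python
from datetime import datetime, timezone

def partitioned_key(event_type: str, event_time: datetime, event_id: str) -> str:
    """Build a date-partitioned S3 object key for one pipeline event.

    The year=/month=/day= layout is the Hive-style partition convention,
    letting HDFS/Spark readers skip whole date ranges when filtering.
    """
    return (
        f"events/{event_type}/"
        f"year={event_time.year:04d}/month={event_time.month:02d}/"
        f"day={event_time.day:02d}/{event_id}.json"
    )
```

For instance, an order event from 2023-04-05 lands under `events/order/year=2023/month=04/day=05/`, so a query filtered to April 2023 never lists other months' objects.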