
Vamsi Vaka

Development
Michigan, United States

Skills

Python

About

Vamsi Priya Vaka's skills align with the Programmers category (Information and Communication Technology), with additional skills associated with System Developers and Analysts (Information and Communication Technology). Vamsi Priya Vaka has 10 years of work experience.

Work Experience

Sr. DevOps Engineer

Ford Motor Company
April 2021 - Present
  • SRE Responsibilities:
      * Executed DevOps automation, advanced Continuous Integration (CI) and Continuous Deployment (CD), and automated test processes using Git, Jenkins, and Ansible.
      * Planned, designed, migrated, and implemented cloud-based platforms and applications on AWS and Azure across IaaS, SaaS, and PaaS deployment models, including private, public, and hybrid clouds.
      * Implemented Kubernetes as the runtime environment of the CI/CD system to facilitate testing and deployment.
      * Architected and implemented Azure and online directory synchronization for Office 365.
      * Developed a disaster recovery plan using Azure Recovery Services.
      * Managed and designed infrastructure across AWS, Azure, and Confidential PaaS, IaaS, and SaaS, as well as IoT.
      * Implemented solutions across various cloud provider models such as Azure, AWS, and SaaS.
      * Led the migration of legacy applications to the AWS and Azure clouds, as well as migrations to SaaS solutions.
      * Directed large technical team projects, meetings, and communications for incident resolution and updates, involving Operations, Business Product Management, Application Architecture, Technical Architecture, Developers, and governing management.
      * Delivered solutions spanning application development, Continuous Integration (CI), Continuous Delivery (CD)/DevOps, infrastructure management, and cloud-based application design and migration (IaaS, SaaS, and PaaS models).
      * Containerized applications and their dependencies with Docker: developed Dockerfiles and Docker Compose files, took container snapshots, managed Docker volumes, and deployed Docker Swarm using Ansible.
      * Created additional Docker slave nodes for Jenkins from custom Docker images and deployed them to the cloud, working with all major Docker components.
      * Implemented the UrbanCode Deploy tool to establish deployment automation for Development, QA, Staging, and Production environments.
      * Developed build workflows using Gradle, GitLab CI, Docker, and OpenShift.
      * Implemented CI and CD processes using Jenkins, Python, and shell scripts to automate routine tasks.
      * Built effective engineering teams focused on edge and network-based products, platform and network software engineering, VoIP, and multi-homed network route control; directed the relocation of Confidential business systems, technical operations, and service management.
      * Used Maven to automate and enhance the build environment, developing build and deployment scripts in Jenkins with Maven.
      * Installed and configured Jenkins master and slave nodes, set up Jenkins builds for continuous integration and delivery, and configured Jenkins for application deployment.
      * Acted as the Snowflake database administrator, leading data model design and database migration deployment for production releases on AWS Cloud (Snowflake).
      * Configured, automated, and maintained CI/CD tools such as Git/GitLab, Jenkins, Build Forge, the Docker registry/daemon, Nexus, and Jira across environments (local/POC/non-prod/prod) with a high degree of standardization for both infrastructure and application stack automation on the AWS cloud platform; orchestrated CI/CD processes and set up CI/CD tools in Vagrant, AWS, and VPCs.
      * Installed and configured GitLab runners to deploy .NET components onto IIS servers, and defined the Git branching strategy for deploying components to different environments.
      * Integrated AWS DynamoDB with AWS Lambda for storing item values and backing up DynamoDB streams.
      * Configured the Jenkins server, installed plugins, integrated Jenkins with Git, and modified Artifactory, Jira, and UrbanCode for SMART-IT automation projects.
      * Automated the build and deployment process using Bash, Python, and shell scripts, with a focus on DevOps tools and CI/CD in Jenkins.
      * Orchestrated changes across multiple environments using IBM UrbanCode Deploy, managing automated, consistent deployments and rollbacks for all applications, and implemented incremental updates to application components and versions.
      * Wrote Bash and Python scripts to automate tasks such as server provisioning, package installation, configuration, and application deployment across multiple servers in prod and non-prod environments.
      * Automated Datadog dashboards and supported internal users in designing and maintaining production-quality dashboards for Splunk.
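The environment-aware deployment scripting described above can be sketched in Python. This is an illustrative example, not the actual Ford tooling: the environment names, host lists, and playbook name are hypothetical placeholders.

```python
# Hedged sketch: build per-host deployment commands for prod vs. non-prod
# environments, as a Python automation wrapper might. All names are invented.

ENVIRONMENTS = {
    "non-prod": {"hosts": ["app-dev-01", "app-dev-02"], "branch": "develop"},
    "prod": {"hosts": ["app-prod-01", "app-prod-02", "app-prod-03"], "branch": "main"},
}


def build_deploy_commands(env, app, version):
    """Return one ansible-playbook invocation string per host in the target env."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    cfg = ENVIRONMENTS[env]
    return [
        f"ansible-playbook deploy.yml -l {host} "
        f"-e app={app} -e version={version} -e branch={cfg['branch']}"
        for host in cfg["hosts"]
    ]


if __name__ == "__main__":
    # Dry-run: print the commands instead of executing them.
    for cmd in build_deploy_commands("non-prod", "billing-api", "1.4.2"):
        print(cmd)
```

Keeping command construction separate from execution makes the script easy to dry-run and unit-test before pointing it at production hosts.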

Sr. DevOps Engineer

SS&C Technologies
October 2018 - March 2021
  • AWS Responsibilities:
      * Designed and deployed numerous applications utilizing nearly the entire AWS stack (including EC2, Route 53, S3, RDS, ECS, DynamoDB, SNS, SQS, and IAM), with a focus on high availability, fault tolerance, and auto scaling using AWS CloudFormation.
      * Worked extensively on public cloud infrastructure, particularly AWS products and services such as EC2, Virtual Private Clouds (VPCs), IAM, Security Groups, Elastic Beanstalk, S3, CloudFormation, RDS, Lambda, Redshift, SQS, SNS, CloudWatch, AWS Shield, AWS Trusted Advisor, CloudFront, and CloudTrail.
      * Used AWS Key Management Service (KMS) to create and manage the encryption keys used to encrypt data.
      * Implemented an HTTPS ingress controller with TLS certificates on AKS to provide reverse proxying and configurable traffic routing for individual Kubernetes services.
      * Shipped all Kubernetes container, application, event, cluster, activity, and diagnostic logs into Azure Event Hubs and then into Splunk for monitoring.
      * Created CloudWatch alarms to monitor server performance metrics such as CPU utilization and disk usage.
      * Provisioned EC2 instances in AWS using Terraform scripts, pulled images from Docker, created S3 buckets, and customized IAM role-based policies using JSON templates.
      * Used big data technologies such as AWS Elastic MapReduce (EMR), AWS Glue, AWS Athena, Apache Spark, and Apache Hadoop for data processing and manipulation.
      * Implemented SFCC platform technology and APIs for data manipulation, integration with external services, and data processing jobs, working with JavaScript, HTML, and CSS for SFRA (Storefront Reference Architecture) in the commerce cloud.
      * Provisioned highly available EC2 instances using CloudFormation and developed new plugins to support additional functionality in Terraform.
      * Managed application infrastructure in an IaaS environment using Terraform, handling storage and networking configurations.
      * Set up and built AWS infrastructure components such as VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS using CloudFormation JSON templates.
      * Provided user stories and use cases for various modules of OpenStack data center deployments in collaboration with the team.
      * Used Kubernetes to create new projects and services, load balance, add routes for external access, create pods for new applications, and manage, scale, and troubleshoot pods via SSH.
      * Implemented continuous delivery frameworks using Jenkins in both Windows and Linux environments.
      * Developed build and deployment scripts using Ant/Maven as build tools and Jenkins for automation to facilitate environment transitions.
      * Used Ansible playbooks in the continuous delivery pipeline, deploying microservices and provisioning AWS environments.
      * Wrote Python code using the Ansible Python API to automate the cloud deployment process.
      * Configured applications for OpenShift v3 and containerized apps using Docker.
      * Used Elastic Beanstalk to deploy and scale web applications and services developed in Java, PHP, and Node.js, alongside AWS Redshift, Postgres, NewSQL, and other modern database technologies.
      * Conducted staging and production deployments using PowerShell preparation and setup scripts and tools such as Octopus.
      * Deployed code on WebLogic and Tomcat servers across Production, QA, and Development environments.
      * Implemented an in-house testing engine to gather performance statistics for selecting NoSQL vendors such as MongoDB and CouchDB.
      * Deployed Cassandra clusters in the cloud and on-premises for data storage and disaster recovery.
      * Implemented Hadoop clusters for processing Big Data pipelines using Amazon EMR and Cloudera, relying on Apache Spark for efficient processing and API integration, and managed resources using Apache Mesos.
      * Integrated Jenkins with various DevOps tools such as Nexus, SonarQube, and Chef for seamless workflows.
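Building AWS infrastructure from CloudFormation JSON templates, as described above, can be sketched by rendering the template programmatically. This is a minimal illustrative sketch, not an actual SS&C template; the resource name, AMI ID, and tags are placeholder assumptions.

```python
import json


def make_ec2_template(instance_type="t3.micro", ami_id="ami-12345678"):
    """Render a minimal CloudFormation template for a single EC2 instance.

    The AMI ID is a placeholder; a real template would parameterize it
    per region (e.g. via Mappings or a template Parameter).
    """
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": "Minimal EC2 instance template (illustrative)",
        "Resources": {
            "AppInstance": {
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "InstanceType": instance_type,
                    "ImageId": ami_id,
                    "Tags": [{"Key": "Environment", "Value": "dev"}],
                },
            }
        },
    }
    return json.dumps(template, indent=2)


if __name__ == "__main__":
    print(make_ec2_template())
```

Generating templates from code keeps resource definitions consistent across environments; the rendered JSON would then be passed to the CloudFormation API or CLI for stack creation.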

DevOps Engineer

AT&T
December 2016 - September 2018
  • Responsibilities:
      * Deployed OpenShift Enterprise v3.7-3.11 on a RedHat 7 environment and integrated it with a private Docker registry.
      * Designed, investigated, and implemented public-facing websites on Amazon Web Services (AWS).
      * Configured Elastic Load Balancers with EC2 Auto Scaling groups to ensure optimal performance and availability.
      * Developed a universal CloudFormation template that dynamically calculates environment values from client-specific parameters during deployment.
      * Handled VPC configuration, security groups, EC2 instance launches, database setup, and storage configuration in AWS using EC2, S3, and Route 53.
      * Implemented CloudFormation template versioning on GitHub as part of the continuous integration process.
      * Supported and enhanced Lambda functions in AWS, querying Elasticsearch clusters to produce servers.
      * Utilized AWS solutions including EC2, S3, RDS, Elastic Load Balancing, and Auto Scaling groups.
      * Successfully demonstrated the initial proof of concept (POC) on the AWS cloud using EC2, ELB, and AWS API Gateway.
      * Developed a multi-threaded process to migrate records from the database to AWS S3 and DynamoDB tables for activity state storage.
      * Transitioned to Ansible, triggering Puppet scripts through Ansible for some legacy applications; stored all YAML scripts in GitLab distributed version control and executed them through Jenkins.
      * Designed, installed, and implemented Ansible configuration management systems, writing Ansible playbooks and deploying applications.
      * Installed and configured applications such as Docker and Kubernetes for orchestration purposes.
      * Employed Ansible to manage existing servers and automate the build and configuration of new servers using Ansible playbooks.
      * Created Ansible playbooks with Python as the wrapper to manage configurations of AWS nodes, tested the playbooks on AWS instances using Python, and ran Ansible scripts to provision dev servers.
      * Configured and integrated servers with different environments to automatically provision and create new machines using configuration management/provisioning tools such as Chef and Puppet.
      * Used OpenStack services including Ceilometer, Keystone, and Swift.
      * Installed, configured, and managed Puppet master/agent; wrote custom modules and manifests and downloaded pre-written modules from Puppet Forge; performed upgrades and migrations of Puppet Community and Puppet Enterprise.
      * Developed OpenStack infrastructure with automation and configuration management tools such as Ansible and Puppet, as well as custom-built cloud-hosted applications.
      * Integrated Maven/Nexus, Jenkins, UrbanCode Deploy with Patterns/Release, Git, Confluence, Jira, and Cloud Foundry for seamless development and deployment processes.
      * Managed local deployments in Kubernetes, setting up local clusters and deploying application containers.
      * Implemented a Chef cookbook to automate the integration process between RHEL and Windows AD using a Kerberos keytab file.
      * Set up and configured Chef software on VMs from scratch, deployed run-lists to the Chef server, and remotely bootstrapped the Chef clients.
      * Used existing cookbooks from the Chef Supermarket and customized recipes for each VM.
      * Onboarded application teams to build and deploy code using Jenkins, GitHub, and Nexus.
      * Collaborated extensively in Jira and Slack for project management and internal team communication.
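Using Python as a wrapper around Ansible, as described above, often amounts to building and validating the `ansible-playbook` invocation before running it. This is an illustrative sketch under that assumption; the playbook, inventory, and variable names are hypothetical.

```python
import json
import shlex


def ansible_command(playbook, inventory, extra_vars=None, limit=None):
    """Build an ansible-playbook argument list without executing it.

    Returning a list (rather than a shell string) lets the caller pass it
    safely to subprocess.run() or log it as a dry run.
    """
    cmd = ["ansible-playbook", playbook, "-i", inventory]
    if limit:
        cmd += ["--limit", limit]
    if extra_vars:
        # JSON is a valid --extra-vars format and avoids quoting pitfalls.
        cmd += ["--extra-vars", json.dumps(extra_vars)]
    return cmd


if __name__ == "__main__":
    cmd = ansible_command("site.yml", "hosts.ini",
                          extra_vars={"app_version": "2.1.0"}, limit="web")
    # Print a copy-pasteable shell command for review before execution.
    print(shlex.join(cmd))
```

Separating command construction from execution makes the wrapper testable and allows a review step before touching real AWS nodes.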

Linux Administrator

Honeywell
September 2014 - November 2016
  • Responsibilities:
      * Gained hands-on experience with source code version control tools such as Git and Subversion, and performed code deployments to QA, Staging, and Production environments.
      * Installed and configured systems (RHEL, Windows Server); performed system performance monitoring, tuning, and log management.
      * Administered accounts, users, and groups; managed permissions policies and implemented quotas.
      * Implemented and maintained server virtualization using VMware ESXi and Oracle Virtual Manager.
      * Installed and configured database systems (e.g., MySQL and Oracle) and JBoss servers.
      * Set up Storage Area Network (SAN) and NAS systems, along with file system backup and restore.
      * Participated in a 24x7 on-call rotation supporting production systems.
      * Monitored the infrastructure using Nagios.
      * Modified NAnt/Ant scripts to detect and display circular dependencies for Java and .NET.
      * Implemented and maintained shell and Perl scripts for release and build automation per requirements.
      * Created and maintained build automation shell scripts; managed build fixes, merges, and release cuts as needed.
      * Wrote scripts in Bash, Perl, and Python for changes to WebSphere mapping modules in the enterprise application.
      * Conducted builds for a suite of J2EE applications using Maven and Ant.
      * Maintained and executed build scripts using Maven, coordinating with development and QA teams.
      * Worked extensively with LAN/WAN, firewalls, and routing for internet and intranet connectivity using protocols such as TCP/IP, DHCP, HTTP/S, FTP, SMTP, and SSH.
      * Performed package management, server patching, and kernel upgrades.
      * Managed firewalls and iptables and troubleshot TCP/IP networks.
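The log-management work above typically involves summarizing log files by severity before alerting. A minimal sketch of that idea in Python, assuming a simple `timestamp level message` log format (the format and sample lines are invented for illustration):

```python
import re
from collections import Counter

# Assumed log format: "<date> <time> <LEVEL> <message>"; real syslog formats vary.
LOG_LINE = re.compile(r"^\S+ \S+ (?P<level>INFO|WARN|ERROR) (?P<message>.*)$")


def summarize_log(lines):
    """Count log entries per severity level; unparseable lines count as UNKNOWN."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        counts[m.group("level") if m else "UNKNOWN"] += 1
    return counts


sample = [
    "2016-03-01 10:00:01 INFO service started",
    "2016-03-01 10:00:05 ERROR disk usage above 90%",
    "garbled line",
]

if __name__ == "__main__":
    print(dict(summarize_log(sample)))
```

A summary like this can feed a Nagios check or a cron-driven report; counting UNKNOWN lines separately also surfaces format drift in the logs themselves.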

Education

Bachelor's in Computer Science