
Anand Bharma
Development
Los Angeles, CA, United States
Skills
Cloud Computing
About
Anand B's skills align with Programmers (Information and Communication Technology). Anand also has skills associated with System Developers and Analysts (Information and Communication Technology). Anand B has 9 years of work experience.
Work Experience
Sr DevOps Engineer
FTFCU
May 2021 - Present
- Responsibilities:
* Working on DevOps with Agile methodology and tooling (code management, build and release automation, service and incident management).
* Responsible for Continuous Integration (CI), release management, environment management, and Continuous Deployment/Delivery (CD).
* Worked on a microservice platform using Elastic Beanstalk; configured EC2, RDS, ELB, and Auto Scaling groups on Elastic Beanstalk.
* Configured CloudTrail on the AWS console and created CloudWatch events to send SNS notifications when someone terminates an EC2 instance.
* Created builds with integration to code quality tools such as SonarQube.
* Worked with HTTP protocols, implemented server-side APIs, and used Postman to exercise REST APIs.
* Wrote Terraform scripts from scratch for building Dev, Staging, Prod, and DR environments.
* Deployed APK and IPA builds to the Google Play Store and the iOS App Store.
* Created a local Atlas project in Firebase for local Atlas testing of mobile push notifications.
* Installed and configured Kubernetes, clustered the nodes, and managed local deployments in Kubernetes.
* Used Terraform scripts to bring previously hand-launched instances under automated management.
* Built a unit test framework in Python using PyUnit for automated testing.
* Wrote Python automation tests using Selenium WebDriver across Chrome, IE, and Firefox.
* Wrote the tools needed to automate the pipeline. Worked with Terraform key features such as infrastructure as code, execution plans, resource graphs, and change automation.
* Used Python to make API calls to AWS, using the Python Boto SDK to connect to the AWS Management Console.
* Collaborated in the automation of AWS infrastructure via Terraform and deployed microservices, including provisioning AWS environments using Ansible playbooks.
* Designed, installed, and implemented an Ansible configuration management system.
* Managed AWS infrastructure as code using Terraform.
* Created quality gates in the SonarQube dashboard and enforced them in the pipeline to fail builds when conditions were not met.
* Created scripts in Python to automate log rotation for multiple web server logs.
* Created EC2 instances from various AMIs and configured application servers on the instances.
* Configured Elastic Load Balancer (ELB) to distribute incoming application traffic across multiple EC2 instances; configured an Auto Scaling group to spin up more instances under heavy load.
* Provided consistent environments using Kubernetes for deployment, scaling, and load balancing from development through production, easing the code development and deployment pipeline by implementing Docker containerization.
* Configured NAT instances within the public subnet and tested connectivity between instances in the public and private subnets.
* Created health checks on Route 53 and configured routing policies such as Simple, Weighted, and Failover.
* Created a Lambda service to take EBS volume snapshots and configured CloudWatch to run the service every hour.
* Developed an auto-container tool to automate containerization of new and existing applications, as well as deployment and management of complex runtime environments like Kubernetes.
* Migrated servers with the required configuration changes, testing and deploying the machines using Ansible commands.
* Created various modules and manifests in Ansible to automate various applications.
* Worked with Docker components such as Docker Engine, Docker Machine, Docker Compose, and Docker Registry, creating Docker images and handling multiple images, primarily for middleware installations and domain configurations.
* Experience with databases such as MongoDB and MySQL; expertise in optimizing SQL queries in MySQL.
* Implemented Continuous Integration using Jenkins and Git.
* Working on a Continuous Delivery (CD) pipeline with AWS CodePipeline to automate builds with AWS CodeBuild.
* Implemented a Git mirror for an SVN repository, enabling users to work with both Git and SVN.
* Worked with container orchestration tools and concepts such as Docker Engine and Kubernetes.
* Installed and implemented AppDynamics on all Prod and Non-Prod environments for monitoring of Java and .NET applications.
* Implemented and maintained monitoring and alerting of production and corporate servers/storage using AWS CloudWatch.
* Worked with New Relic; implemented deep learning algorithms on large data sets to add real-time anomaly detection and predictive modeling for integration into the New Relic APM platform.
Environment: AWS, EC2, VPC, RDS, SNS, SES, IAM, CloudFormation, CloudWatch, CLI, Confluence, Bitbucket, Jenkins, Docker, Kubernetes, Terraform, JIRA, Agile, Python, Java, Unix, Ansible, Puppet, Maven, Git, SVN, Nagios, New Relic, AppDynamics.
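One bullet above describes a Lambda service, triggered hourly by CloudWatch, that snapshots EBS volumes. A minimal sketch of the selection logic such a handler might use follows; the function and tag names are illustrative assumptions, and the real handler would wrap this in boto3 calls (`describe_volumes`, `create_snapshot`), which are omitted so the logic stays self-contained.

```python
# Hypothetical sketch of the volume-selection logic behind an hourly
# EBS-snapshot Lambda. In the actual service, the volume dictionaries
# would come from boto3's ec2.describe_volumes response.
from datetime import datetime, timezone

def volumes_to_snapshot(volumes, required_tag="Backup"):
    """Return IDs of in-use volumes carrying the backup tag (assumed tag name)."""
    selected = []
    for vol in volumes:
        tags = {t["Key"]: t["Value"] for t in vol.get("Tags", [])}
        if vol["State"] == "in-use" and tags.get(required_tag) == "true":
            selected.append(vol["VolumeId"])
    return selected

def snapshot_description(volume_id, now=None):
    """Build a timestamped description string for each snapshot."""
    now = now or datetime.now(timezone.utc)
    return f"hourly-{volume_id}-{now:%Y%m%dT%H%M}"
```

Keeping the selection and naming logic separate from the AWS calls makes it unit-testable without cloud credentials.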
DevOps Engineer /Cloud Engineer
DTCC
March 2019 - April 2021
- Responsibilities:
* Moderated and contributed to the support forums (Azure Networking, Azure Virtual Machines, Azure Active Directory, Azure Storage) for the Microsoft Developer Network, including partners.
* Gathered information from business partners and database developers and prepared technical documentation to build robust .NET web and Web API solutions satisfying business requirements.
* Experience working with branching strategies and implementing them in new upcoming projects.
* Worked on Python code in the Azure code repository, such as client helpers and batch helpers, and upgraded the agents in YAML files.
* Experience doing deployments from the command prompt and building/pulling/pushing/deploying Docker images to the Azure Container Registry.
* Re-engineered a legacy application into a next-gen application using JavaScript frameworks such as Knockout.js, Angular 7, TypeScript, jQuery, and Bootstrap.
* Created dimensional snowflake models based on the Kimball methodology.
* Expertise in working with the Azure code repository, including the concepts of commits, tagging, pushes, branches, and pull requests.
* Experience migrating code repositories from one project to another within Azure DevOps.
* Good knowledge of adding members to users and groups and managing permissions within Azure DevOps.
* Worked on creating an agent pool in Azure DevOps and downloading/running the agent configuration on an on-premises SQL Server.
* Used TFS (Team Foundation Server) for source code control, project document sharing, and team collaboration.
* Experience creating custom access tokens within Azure DevOps for authentication purposes.
* Experience creating Azure DevOps build and release pipelines for SQL deployment and Azure Data Factory (ADF).
* Worked with a build pipeline producing a DACPAC artifact as output within Azure DevOps.
* Worked on implementing a build pipeline for Azure Data Factory using an ADF publish branch / master branch within Azure DevOps.
* Worked with Staging and Production environments in an Azure DevOps release pipeline, deploying the DACPAC into databases via Microsoft SQL Server Management Studio (SSMS).
* Experience working with rollback and backup strategies as part of the release pipeline in Azure DevOps.
* Experience creating an Azure release pipeline using JSON templates produced as output of the build artifact to create a new ADF (Azure Data Factory) production environment.
* Good knowledge of inserting into, deleting from, and truncating tables in SSMS (SQL Server Management Studio).
* Good understanding of stored procedures, functions, tables, and views in SQL Server databases, whether on-premises or in the cloud.
* Experience creating build/release pipelines in Azure DevOps and deploying into Prod environments in Azure Databricks.
* Worked on creating a node and deploying application packages to Azure Batch services.
* Experience creating Key Vault services and maintaining secrets for Azure Batch.
* Worked on different storage accounts, such as Blob Storage and Data Lake Storage Gen2, within the Azure cloud.
* Experience upgrading the API packages for MS REST, Storage Blob, and pandas within the Azure cloud.
* Deployed Azure Resource Manager based resources; additionally deployed Azure Cosmos DB databases and other Azure settings using PowerShell and JSON.
* Involved in deployment of existing Windows server applications to Microsoft Azure Windows servers.
* Experience with Azure Infrastructure as a Service (IaaS), a form of cloud computing that provides virtualized computing resources over the Internet.
Environment: Azure, Nginx, ANT, Maven, SVN, GitHub, Chef, Puppet, Ansible, Docker, Java, Apache HTTPD, Bash, Shell, Perl, Python.
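The Azure work above mentions small Python "client helpers" in the code repository. As a hedged illustration of that kind of helper (the function name and defaults are assumptions, not taken from the actual codebase), here is a retry wrapper with exponential backoff such as one might place around transient-failure-prone SDK calls:

```python
# Illustrative sketch of a small "client helper": retry a callable with
# exponential backoff on transient errors (hypothetical names/defaults).
import time

def with_retries(fn, attempts=3, base_delay=0.1, retry_on=(ConnectionError,)):
    """Call fn(), retrying with exponential backoff on the given exceptions."""
    for i in range(attempts):
        try:
            return fn()
        except retry_on:
            if i == attempts - 1:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * (2 ** i))
```

Keeping the retry policy in one helper means every client call site shares the same backoff behavior instead of re-implementing it.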
DevOps/ Automation Engineer
Sony International
January 2018 - February 2019
- Responsibilities:
* Experience as a DevOps/Automation Engineer on project teams involving different development teams and multiple simultaneous software releases.
* Automated repetitive tasks within the technical infrastructure to streamline and improve overall quality through high availability of resources, resulting in a 36% reduction in support times.
* Migrated data from on-prem data centers to the AWS cloud, creating the infrastructure for data migration in AWS.
* Performed automated deployments using AWS by creating IAM roles, used the CodePipeline plug-in to integrate Jenkins with AWS, and created EC2 instances to provide virtual servers.
* Involved in designing and deploying AWS solutions using EC2, S3, RDS, EBS volumes, Elastic Load Balancer, Auto Scaling groups, Lambda functions, Apigee, CloudFormation templates, and IAM roles and policies.
* Managed and monitored the server and network infrastructure using AppDynamics and Splunk; performed system administration and operations tasks using Jenkins and AppDynamics.
* Migrated from the company datacenter to AWS, using AWS services for server-based CentOS and Red Hat workloads across development, QA, UAT, and Prod environments, and provided access to all teams working on those applications.
* For serverless workloads, used CloudFormation templates to create environments with AWS developer tools such as CodeCommit, CodeBuild, CodeDeploy, CodePipeline, Lambda functions, Apigee, Step Functions, S3, and PostgreSQL, providing access for the team. Solved problems using a combination of JavaScript, JSON, and jQuery.
* Extended and developed Ansible plugins and modules to fully automate AWS management and put the power of deployment into the hands of the product owners. Architected and implemented the microservice ecosystem on top of AWS and Docker. Designed workflows and wrote tools to allow engineers to develop and fully manage their own microservice stacks across all environments (from Dev to Prod).
* Managed the development, deployment, and release lifecycles by laying down processes, and managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing, and Glacier for QA and UAT environments as well as infrastructure servers for Git and Ansible. Automated the regular build and deployment processes for pre-prod and prod environments using tools such as Maven, following the Software Implementation Plan.
* Used the Maven dependency management system to deploy snapshot and release artifacts to Nexus, sharing artifacts across projects. Configured and maintained Jenkins to implement the CI process and integrated it with Ant and Maven to schedule builds. Worked with Hudson/Jenkins continuous integration servers to write and perform sanity checks as well as automatic RPM builds.
* Maintained high-availability clustered and standalone server environments and refined automation components with scripting and configuration management (Ansible).
* Wrote Ansible playbooks, the entry point for Ansible provisioning, where the automation is defined through tasks in YAML format. Ran Ansible scripts to provision Dev servers.
* Installed, set up, and troubleshot Ansible; created and automated platform environment setup.
* Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, and tested the playbooks on AWS instances using Python.
* Responsible for Continuous Integration and Continuous Delivery process implementation using Jenkins along with Python and shell scripts to automate routine jobs.
* Worked on AWS Elastic Beanstalk for fast deployment of applications developed with Python, Ruby, and Docker on familiar servers such as Apache and IIS.
* Migrated VMware VMs to AWS and managed services with Ansible. Worked with GitHub to store code and integrated it with Ansible Tower to deploy the playbooks.
* Created user-level access to related GitHub project directories for code changes.
* Experience with Java multithreading, the Collections Framework, interfaces, synchronization, and exception handling.
* Wrote shell scripts to apply the integration label to all files needing manual labelling. Configured user accounts for Continuous Integration tools: Jenkins, Nexus, and Sonar. Handled Jira tickets for SCM support activities.
* Installed, configured, and managed monitoring tools such as AppDynamics and CloudWatch for resource monitoring.
Environment: AWS, Git, GitHub, Jenkins, Ansible, Nexus, Docker, Kubernetes, Nagios, Jira, AppDynamics, Cassandra, ANT, Shell Scripts, ELK, JavaScript, JSON, Python.
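This role mentions extending Ansible with custom plugins and modules. A minimal sketch of one common extension point, a filter plugin, is shown below; the filter name and behavior are hypothetical examples, not the actual plugins written at this job. Ansible discovers any class named `FilterModule` placed in a `filter_plugins/` directory, and because the filters are plain Python functions they can be unit-tested without Ansible installed.

```python
# Minimal sketch of a custom Ansible filter plugin (hypothetical filter).
# Dropped into filter_plugins/, it would be usable in a playbook as:
#   {{ 'Web01' | env_tag('Dev') }}

def env_tag(name, env):
    """Render a '<env>-<name>' resource tag, e.g. 'dev-web01'."""
    return f"{env.lower()}-{name.lower()}"

class FilterModule:
    """Entry point Ansible looks for when loading filter plugins."""
    def filters(self):
        return {"env_tag": env_tag}
```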
Build and Release Engineer
CenturyLink
April 2016 - December 2017
- Responsibilities:
* Experience building and releasing software baselines, with branch and label development in Subversion/Git.
* Exposed to resolving merge conflicts in Git. Performed software configuration using SVN, CVS, and Git. Involved in administration of SVN and CVS for repository migration and creation, implementing repository hook scripts and integrating JIRA/Git.
* Built software baselines, code merges, and branch and label creation in Git; installed and administered the Bamboo continuous integration tool on Linux/Unix machines using Git and Maven. Configured script builder and custom command builder agents in Bamboo.
* Installed packages using YUM and RPM on various servers.
* Worked on Jenkins, using it for continuous integration (CI) and end-to-end automation of application builds and deployments.
* Integrated Jenkins with ServiceNow using its API, writing scripts in Python and shell for Jenkins to build and commit information to ServiceNow, storing the information by creating a schema.
* Experienced supporting deploy operations to web application servers such as Apache HTTPD, WebLogic, and JBoss.
* Configured system settings and granted user permissions, message notifications, and email alerts using Bamboo.
* Implemented templates for ELB, Auto Scaling, S3, EBS, and VPC; performed CloudWatch services in AWS and provided cloud security for applications on AWS.
* Worked extensively with Chef, managing cookbooks and implementing roles and templates for environment management.
* Performed configuration management, implemented CD (Continuous Delivery), and handled provisioning using Chef on AWS, conducting automated deployments by writing Chef cookbooks.
* Experienced in writing Chef cookbooks to configure databases to optimize product configuration, and in AWS server provisioning using Chef recipes.
* Worked on deployment of Java applications into Apache Tomcat server, automating the daily process using the integration tool Bamboo.
* Used SonarQube to identify issues such as build errors in system components (projects, modules, etc.), escalating the errors to the concerned team.
* Built SQL scripts and maintained their execution across different environments. Wrote and documented necessary information about software releases, versions, labels, etc.
* Organized and maintained repositories for Maven using the Nexus tool and shared repository snapshots across internal projects.
* Exposed to creating and modifying configuration files with POM.xml, authoring POM.xml files, and managing artifacts in the Nexus repository.
Environment: AWS, Maven, Git, SVN, CVS, JIRA, Bamboo, Jenkins, Python, Shell, Apache HTTPD, WebLogic, JBoss, Chef, Nexus.
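This role describes Python scripts that push Jenkins build information to ServiceNow over its API. A hedged sketch of the payload-building half of such a script follows; the custom field names (`u_commit`, `u_build_status`) are assumptions in the style of ServiceNow custom fields, and the actual HTTP POST (e.g. via `urllib.request` to the ServiceNow Table API) is left out so the payload logic stays self-contained.

```python
# Hedged sketch of a Jenkins-to-ServiceNow integration script's payload
# builder. Field names prefixed u_ are assumed custom fields, not
# verified against a real instance.
import json

def change_record_payload(job_name, build_number, commit_sha, status):
    """Assemble the JSON body Jenkins would post to ServiceNow after a build."""
    return json.dumps({
        "short_description": f"{job_name} #{build_number} {status}",
        "u_commit": commit_sha,        # assumed custom field name
        "u_build_status": status,      # assumed custom field name
    }, sort_keys=True)
```

Separating payload assembly from the HTTP call makes the script easy to test from the Jenkins shell step before pointing it at a live instance.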
LINUX Administrator
Omnicell
January 2014 - December 2014
- Responsibilities:
* Provided 24x7 on-call support, debugging and fixing issues related to Linux, Solaris, and HP-UX; handled installation/maintenance of hardware/software in Production, Development, and Test environments as an integral part of the Unix/Linux (RHEL/SUSE/Solaris/HP-UX/AIX) support team.
* Experience managing various file systems using LVM and SVM; configured file systems over the network using NFS, NAS, and SAN methodologies and installed RAID devices.
* Experience in configuration of Apache SVN, HTTP, HTTPS, FTP, and SFTP, remote access management, and security troubleshooting. Worked on creating disk groups and volumes and deporting/importing disk groups using Veritas Volume Manager.
* Monitored server and application performance via various stat commands (vmstat, nfsstat, iostat) and tuned I/O, memory, etc.
* Responsible for remote Linux support of more than 400 servers. Managed users (creating, deleting, and granting proper privileges) and managed system security.
* Resolved TCP/IP network access problems for clients. Developed, maintained, and updated various UNIX shell scripts for services (start, stop, restart, recycle, cron jobs).
* Responded to clients on network problems such as firewalls, routers, switches, internet, computer applications, printers, VPN, Active Directory, and DNS and DHCP servers.
* Experience installing and configuring SSH (Secure Shell) encryption for secure access on Ubuntu and Red Hat Linux. Responsible for configuring and connecting to SSH through clients such as PuTTY and Cyberduck.
* Monitored CPU, memory, iSCSI disks, disk controllers, physical disks, hardware and software RAID, multipath, file systems, and network using Nagios and BMC tools.
* Applied patches to keep Linux and Solaris servers updated against operating system bugs, using patch administration commands such as yum, rpm, patchadd, showrev, and up2date.
* Diagnosed and resolved problems associated with DNS, DHCP, VPN, NFS, and Apache.
* Managed cron jobs, batch processing, and job scheduling. Troubleshot the network with the help of netstat, ping, nslookup, and traceroute.
* Performed OS upgrades and installation of third-party software, packages, and patches as required. Maintained the Linux firewall and implemented network security.
* Responsible for installation, configuration, and administration of middleware such as WebLogic 9.0, JBoss 3.2, CA Web Access Manager (WAM), and MQ Series v7.0.
* Performed analysis of resources such as CPU, memory, disk, and swap from the command line using commands like prstat, vmstat, sar, iostat, and swap, and tuned kernel parameters for better operating system and TCP performance.
* Performed disaster recovery on RHEL servers consisting of LVM-based file systems and Red Hat Clustering.
* Supported monitoring and management tools such as OpenNMS, Tivoli, and VCO.
Environment: Windows, Oracle VirtualBox, VMware, SSH, PuTTY, Ceph, Cyberduck, Ubuntu, RHEL, SFTP, FTP, TCP/IP, DNS, DHCP, VPN, RPM, YUM, SCP, WinSCP, and FileZilla.
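The monitoring work above relied on Nagios checks of CPU, memory, and file systems. Nagios plugins are small scripts that signal state through their exit code (0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN). A minimal sketch of the threshold logic behind a disk-usage check follows; the thresholds and function name are illustrative, not the production configuration.

```python
# Sketch of a Nagios-plugin-style disk check (hypothetical thresholds).
# A real plugin would read usage via shutil.disk_usage or `df`, print the
# message, and call sys.exit(code).
def check_disk(percent_used, warn=80, crit=90):
    """Map a usage percentage to a Nagios (exit_code, message) pair."""
    if percent_used >= crit:
        return 2, f"CRITICAL - disk {percent_used}% used"
    if percent_used >= warn:
        return 1, f"WARNING - disk {percent_used}% used"
    return 0, f"OK - disk {percent_used}% used"
```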