Using the ARC Migration Script in vROps 8.4 and above

Article ID: 315986

Products

VMware Aria Suite

Issue/Introduction

Starting in vRealize Operations 8.4, the Application Remote Collector (ARC) server is handled by a Cloud Proxy VM.
To migrate an existing ARC setup from 7.5, 8.0.x, 8.1.x, 8.2, or 8.3 to a Cloud Proxy VM, follow the steps in this article. The migration script moves existing end-points from a stand-alone ARC to a Cloud Proxy.

It is strongly recommended to migrate end-points to a Cloud Proxy VM to continue receiving new Application monitoring features and functionality in later releases of vRealize Operations.
If you do not migrate ARC to Cloud Proxy VMs, there is no loss of functionality at this time; however, you will be unable to add new ARC VMs and will not be able to use the new Application monitoring features of vRealize Operations 8.4 and later.
 

Limitations

  • After a Cloud Proxy VM has been installed, start the migration immediately. Do not attempt to register a new vCenter; complete the migrations first.
  • ARC instances and Cloud Proxies must have a one-to-one mapping: multiple ARCs cannot be consolidated or migrated to a single Cloud Proxy.
  • Once the migration is complete, telegraf agents that were in a stopped state before the migration (whether stopped by a user or for some other reason) will come online in the running (collecting) state.
  • If a VM with any de-activated plugins is migrated, no further action can be performed on those plugins after migration, such as activation (changing Status to Enabled) or editing. This is because service instance-related configurations are removed from the telegraf configuration before the migration.


Environment

VMware vRealize Operations 8.6.x
VMware vRealize Operations 8.5.x
VMware vRealize Operations 8.4.x
VMware vRealize Operations 8.x

Resolution


Prerequisites

  • Take a snapshot of the ARC instance.
  • Upgrade vRealize Operations to version 8.4 or later following the documentation.
  • Deploy a Cloud Proxy VM and add it to your newly upgraded vRealize Operations cluster.
  • Enable SSH on the Cloud Proxy VM by logging into the Console as root and running service sshd start.
  • Enable SSH on the stand-alone ARC by logging into the Console as root and running service sshd start.
  • Ensure all plugins are in an activated (enabled) state before the migration is performed.
    • Go to the Manage Agents page of the vRealize Operations UI, select the VMs on which agents are installed, edit the service instance configuration, and confirm that the Status field is Enabled and that the service instance is in a data-receiving state.
  • The minimum permissions required for the vRealize Operations user to run this migration script are:
    • Administration > Resource Management > Create, Delete, Edit, Read
    • Administration > REST APIs > All other Read, Write APIs, Delete Solution, Read access to APIs, Read access to metering API
    • Administration > Solutions Management > Account Management > Add, Delete, Edit, View.
  • Ensure that you have completed all the requirements related to vRealize Operations, Application Remote Collector, Cloud Proxy, vCenter, VMs, and the ESXi hosts where the VMs are deployed. For more information, see Prerequisites in the vRealize Operations Documentation.
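
As a pre-flight sanity check, the SSH prerequisites above can be verified from any machine that can reach both VMs. This is a minimal sketch, not part of the official procedure; both host addresses are placeholders for your own Cloud Proxy and stand-alone ARC.

```shell
# Pre-flight sketch: confirm SSH (TCP port 22) is reachable on the Cloud
# Proxy VM and the stand-alone ARC before starting the migration.
# Both addresses are placeholders for your environment.
for host in 192.168.3.50 192.168.3.60; do
  if timeout 2 bash -c "cat < /dev/null > /dev/tcp/$host/22" 2>/dev/null; then
    echo "$host: SSH reachable"
  else
    echo "$host: SSH NOT reachable"
  fi
done
```

If either host reports NOT reachable, log into that VM's Console as root and run service sshd start as described above before continuing.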

 

Running the Migration Script

Important: Ensure that you have completed all the prerequisites prior to running the steps below. For more information, see Prerequisites.

  1. Log into the ARC VM as root via SSH or the Console (press ALT+F1 in the Console to reach the login prompt).
  2. Download the Migration Script from the Cloud Proxy VM using this command:
wget --no-check-certificate https://cp_ip/downloads/salt/ucp-migration-util.zip

Notes:
  • Replace cp_ip with the IP address of the Cloud Proxy VM. 
  • If any vRealize Operations patches or Hot Fixes have been installed after originally downloading the migration zip, redownload the migration zip to ensure you're using the latest version.
  • If you have upgraded to vRealize Operations 8.5 Hot Fix 1, redownload the migration zip to ensure you're using the latest version.
Example:
wget --no-check-certificate https://192.168.3.50/downloads/salt/ucp-migration-util.zip
  3. Unzip the Migration Script on the ARC VM:
unzip ucp-migration-util.zip
  4. (Optional) By default, the migration script uses 1 thread to migrate the agents; migrating an end-point takes up to 120 seconds on average. The script can be configured to use up to 5 threads in parallel if a migration.properties file is created and configured. To configure this optional value, complete these steps:
    1. Create and open a migration.properties file in the folder ucp-migration-util.zip was extracted to:
vi migration.properties
    2. Enter the following text, replacing value with any number from 2-5:
threadcount=value
    3. Save and close the file:
:wq
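
The vi steps above can also be done non-interactively. A minimal sketch, assuming your current directory is the one ucp-migration-util.zip was extracted to:

```shell
# Write migration.properties without opening an editor.
THREADS=4   # any value from 2-5

# Refuse values outside the supported 2-5 range before writing the file.
case "$THREADS" in
  [2-5]) printf 'threadcount=%s\n' "$THREADS" > migration.properties ;;
  *)     echo "threadcount must be between 2 and 5" >&2; exit 1 ;;
esac
```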
  5. Run the migration script, using the appropriate values:
./migrate.sh -v primary_ip -u user -p password -a vc_ip -b cp_ip -c root -d cp_password

Note: Replace
  • primary_ip with the IP or FQDN of the vRealize Operations Primary node.
  • user with a vRealize Operations user that has the required permissions. See the Prerequisites section for details.
  • password with the password of the vRealize Operations user specified above.
  • vc_ip with the IP or FQDN of the vCenter as configured in vRealize Operations.
    • If the vCenter is configured using an FQDN, run the migration script with the FQDN; if it is configured using an IP, run the script with the IP.
  • cp_ip with the IP or FQDN of the Cloud Proxy VM as configured in vRealize Operations.
    • If the Cloud Proxy is configured using an FQDN, run the migration script with the FQDN; if it is configured using an IP, run the script with the IP.
  • cp_password with the Cloud Proxy VM's root user password.
Example:
./migrate.sh -v 192.168.3.10 -u admin -p Password!123 -a 192.168.1.10 -b 192.168.3.50 -c root -d Password!123
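
To keep the invocation readable, the command can be assembled from named variables first. This is a sketch only; every value below is a placeholder mirroring the example, and echoing the command lets you review it before running it for real.

```shell
# Assemble the migrate.sh invocation from named variables; all values are
# placeholders for your own environment.
PRIMARY_IP="192.168.3.10"   # vRealize Operations Primary node
VROPS_USER="admin"          # user with the permissions listed in Prerequisites
VROPS_PASS="Password!123"
VC_IP="192.168.1.10"        # vCenter, exactly as configured in vRealize Operations
CP_IP="192.168.3.50"        # Cloud Proxy VM, as configured in vRealize Operations
CP_PASS="Password!123"      # Cloud Proxy root password

CMD="./migrate.sh -v $PRIMARY_IP -u $VROPS_USER -p $VROPS_PASS -a $VC_IP -b $CP_IP -c root -d $CP_PASS"

# Print the command for review; replace the echo with $CMD to execute it.
echo "$CMD"
```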
  6. After a successful migration, upgrade the agents by navigating to Administration > Inventory > Manage Agents, selecting an agent, and clicking Update.

Additional Information

If end-points are turned off, unreachable, or fail with a timeout exception during the migration, those end-points will fail and can be migrated once they are reachable by running the migration again; only the end-points that previously failed will be migrated.

To see if any end-points failed the migration, see FailedMigrationEnpoints.txt on the ARC node in the directory the Migration Script was run from.
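
A small sketch for checking that file after a run; the file name is as reported in this article, and the script must be run from the directory the Migration Script was executed in.

```shell
# Report whether any end-points failed the last migration run; run from the
# directory the Migration Script was executed in.
FAILED=FailedMigrationEnpoints.txt

if [ -s "$FAILED" ]; then
  RESULT="failed"
  echo "Failed end-points recorded; re-run the migration once they are reachable:"
  cat "$FAILED"
else
  RESULT="clean"
  echo "No failed end-points recorded."
fi
```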