Camping fails on computers with Hyper-V, VirtualBox or DockerNAT network adapters

Article ID: 142887


Products

CA Client Automation - Asset Management
CA Client Automation - IT Client Manager
CA Client Automation
CA Client Automation - Software Delivery
CA Client Automation - Remote Control
CA Client Automation - Asset Intelligence
CA Client Automation - Desktop Migration Manager
CA Client Automation - Patch Manager

Issue/Introduction

CAM does not work as expected on computers with Hyper-V, VirtualBox or DockerNAT installed.

These computers do not reply to camping because CAM on those computers binds to the vEthernet interface from DockerNAT or to the VirtualBox Host-Only Network.

ipconfig may show the following adapter order:

Ethernet adapter vEthernet (Standardswitch)

Ethernet adapter vEthernet (DockerNAT)

Ethernet adapter Ethernet

Ethernet adapter Bluetooth


Cause

CAM binds to the first adapter in the binding order, so the Ethernet adapter that connects to the corporate network is not used; hence camping the system fails.
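Windows orders interfaces by ascending interface metric, and CAM binds to the first adapter in that order. The effect can be sketched with a small illustrative model (the adapter names and metric values below are hypothetical examples, not CAM code or real system defaults):

```python
# Illustrative model only: Windows prefers the interface with the
# lowest metric, and CAM binds to the first adapter in that order.
def preferred_adapter(adapters):
    """Return the name of the adapter with the lowest interface metric."""
    return min(adapters, key=lambda a: a["metric"])["name"]

# Hypothetical default metrics: the virtual adapter sorts first...
adapters = [
    {"name": "vEthernet (DockerNAT)", "metric": 15},
    {"name": "Ethernet",              "metric": 25},
]
print(preferred_adapter(adapters))   # vEthernet (DockerNAT)

# ...after lowering the Ethernet metric (as done with
# Set-NetIPInterface below), the corporate adapter wins.
adapters[1]["metric"] = 5
print(preferred_adapter(adapters))   # Ethernet
```

This is why the resolution below lowers the metric of the corporate Ethernet adapter rather than touching the virtual adapters.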


Environment

Clarity Client Automation - all versions

Resolution

There are two possible solutions:

1- Change the binding order of the network interfaces so that the adapter carrying the traffic to the Scalability Server (SS) becomes the primary one:

The metric of the adapter must be changed to the lowest value in the system by using the following command:

powershell Set-NetIPInterface -InterfaceAlias <adapter-name> -InterfaceMetric 5

Example: setting the Ethernet adapter as the primary one:

powershell Set-NetIPInterface -InterfaceAlias Ethernet -InterfaceMetric 5

Afterwards, check the updated value with "powershell Get-NetIPInterface"; "ipconfig /all" will also show the correct order. The Ethernet interface must show the lowest metric in the list, as the automatically assigned default values are higher.


2- Set the routing to the SS in CAM using the routing section of cam.cfg

This method is suggested for static environments, but not for agents roaming to different SSs, especially when location awareness is used.

In the following example, both the Agent and the SS are on the same subnet:

SS - itcmss1402 - 192.168.107.24
AG - itcmagwin1402 - 192.168.107.46

Define the routing in cam.cfg as follows to force the traffic to the SS:

*ROUTING
forward itcmss1402 = 192.168.107.*
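The forward statement maps traffic for addresses matching the pattern through the named SS, with the trailing * acting as a wildcard. A minimal sketch of that wildcard match, assuming fnmatch-style glob semantics purely for illustration (CAM's actual pattern matcher may differ):

```python
from fnmatch import fnmatch

def matches_route(pattern, address):
    """Illustrative check of whether an IP address falls under a
    cam.cfg routing pattern such as '192.168.107.*'.
    This is a sketch, not CAM's real cam.cfg parser."""
    return fnmatch(address, pattern)

print(matches_route("192.168.107.*", "192.168.107.46"))  # True
print(matches_route("192.168.107.*", "10.0.3.15"))       # False
```

With the pattern above, the Agent at 192.168.107.46 matches the route to itcmss1402, while an address on a virtual adapter's subnet does not.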

When CAM is set to trace mode, the current dgxxx log file shows the following activity, confirming that the route has been defined:

path created to (and from) 192.168.107.24:4104
preferred source address for messages to 192.168.107.24:4104 is 192.168.107.46

After a camsave, the following path has been established using the IP address of the correct adapter (this camsave step is not required, as only the first line is needed; it is shown here just to verify that the correct address is being used):

*ROUTING
forward itcmss1402 = 192.168.107.*(to)
forward itcmss1402 = 192.168.107.46(to)

camstat -a on the Agent shows:

CAM - 10.0.3.15  Version 1.14 (Build 4) up 0 days 0:04 (trace=67FF)

Host                  proto state  port  Qlen  m/sent  m/recv  retry  disc  RTO
--------------------- ----- ----- ----- ----- ------- ------- ------ ----- ----
192.168.107.24          udp   ---  4104     0      18      17      0     0    1
192.168.107.46          prx 

The following article shows the steps to update cam.cfg and restart CAM: "How do I Configure Client Automation to use the Correct IP Address on a Server with multiple Network Adapters".

The article "CAM routing configuration in a firewall environment" explains a restriction on defining multiple routing statements.