The spooler fails to send QoS data, and <nimsoft>\robot\q1.rdb continues to grow without flushing.
spooler.log shows the same sequence of events repeating in a loop, for example:
Jun 11 15:06:19:772 [5308] spooler: checkphead: header not ok: HTTP/1.0 4
Jun 11 15:06:19:772 [5308] spooler: sockParse: illegal phead received
Jun 11 15:06:19:772 [5308] spooler: nimSessionWaitMsg: got error on client session: 0
Jun 11 15:06:19:772 [5308] spooler: FlushMessages - failed to flush message (communication error)
Jun 11 15:06:19:772 [5308] spooler: FlushMessages - 0 messages sent to 216.58.142.67:48001
Jun 11 15:06:24:842 [5308] spooler: FlushMessages - out-queue contains 1 records - continue
Jun 11 15:06:24:842 [5308] spooler: nimSessionConnect - host = 216.58.142.67, port = 48001, secWait = 15
Jun 11 15:06:24:842 [5308] spooler: sockConnect - to host 216.58.142.67, port 48001
Jun 11 15:06:24:842 [5308] spooler: SREQUEST: hubpost ->216.58.142.67/48001
Jun 11 15:06:24:842 [5308] spooler: head mtype=100 cmd=hubpost seq=0 ts=1339448784 frm=10.0.0.9/52865
Jun 11 15:06:24:842 [5308] spooler: head tout=10 addr=
Jun 11 15:06:24:842 [5308] spooler: data nimid=XM58589581-45592 nimts=1339023600 tz_offset=21600 source=74.63.179.39
Jun 11 15:06:24:842 [5308] spooler: data md5sum=HEX(16):527cf55edb9b5a188c2fee71c30378b2 robot=85780-DB04
Jun 11 15:06:24:842 [5308] spooler: data domain=Viawest origin=85780 pri=1 subject=QOS_MESSAGE prid=cdm
Jun 11 15:06:24:842 [5308] spooler: data dev_id=D35A414C4B0AD986CC5D592A370635BD2 met_id=M7BD94BD67ABEDEEFF28D8E1113B77647
Jun 11 15:06:24:842 [5308] spooler: data udata=PDS(198)
(The same QoS object, with the same dev_id, met_id, etc., loops through the spooler log repeatedly.)
1) Check the <nimsoft>\robot\q1.rdb and q2.rdb files. If q2.rdb is small (~1 KB) while q1.rdb continues to grow, there is likely corrupt data in q2.rdb that is causing QoS messages to queue on the spooler.
2) Stop the robot.
3) Move q2.rdb to a temporary folder.
4) Restart the robot.
On restart, the robot creates a new q2.rdb file, which should allow q1.rdb to begin flushing. Analysis of the archived q2.rdb may reveal corrupt data (illegal characters) that was disrupting the normal data flow.
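The size comparison in step 1 can be scripted when you need to check many robots. The helper below is a minimal sketch, not part of the Nimsoft product: the function name, paths, and the 1 KB threshold are illustrative assumptions, and the real queue files live under your own <nimsoft>\robot directory.

```python
import os

def q2_suspect(q1_path: str, q2_path: str, small_bytes: int = 1024) -> bool:
    """Heuristic from step 1: a tiny q2.rdb (~1 KB or less) alongside a
    larger, growing q1.rdb suggests corrupt data in q2.rdb is blocking
    the QoS flush. Paths and threshold are illustrative assumptions."""
    q1_size = os.path.getsize(q1_path)
    q2_size = os.path.getsize(q2_path)
    return q2_size <= small_bytes and q1_size > q2_size
```

If this returns True for a robot, proceed with steps 2 through 4: stop the robot, move q2.rdb aside (keep it for analysis rather than deleting it), and restart so the robot recreates q2.rdb.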