Article ID: 197971



CA Single Sign On Secure Proxy Server (SiteMinder), CA Single Sign On Agents (SiteMinder), CA Single Sign On Federation (SiteMinder), CA Single Sign On SOA Security Manager (SiteMinder), SITEMINDER



We're running a Web Agent, and when a user accesses a specific resource,
the browser shows the error:

  SMERROR 00-0002

We'd like to know:


   We suspect the SMERROR is due to the fact that UTF-8 encodes the
   French "é" character as %C3%A9, which may be interpreted as %C3
   (allowed in BadUrlChars) and %A9 (forbidden in BadUrlChars).
   However, as far as we understand, the format used by BadUrlChars
   covers only ASCII characters. Can you shed some light on this point?
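   The encoding in question can be verified with a short sketch (Python
   here purely for illustration; this is not part of the agent):

```python
from urllib.parse import quote, unquote

# "é" is a single code point (U+00E9) but two bytes in UTF-8,
# so it percent-encodes to two escapes: %C3 and %A9.
print(quote("é"))                              # %C3%A9
print([hex(b) for b in "é".encode("utf-8")])   # ['0xc3', '0xa9']

# Decoding the two escapes restores the single character:
print(unquote("%C3%A9"))                       # é
```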


   To extend the analysis, we tested inserting two forward slashes (//)
   into the URL.

   Again, we see no problem on the legacy Web Agent version 12SP3,
   whereas we get a SMERROR 00-0002 on Web Agent 12.52SP1CR09.

   Also, we tested removing %7f-%bf on Web Agent 12.52SP1CR09, and it
   resolves the problem, although it's still not clear why both
   platforms act differently.

   Can you help us find out why the two Web Agent versions act
   differently?
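   The effect of removing %7f-%bf can be seen by checking the two UTF-8
   bytes of "é" against that range; this is a hypothetical byte-wise
   check for illustration, not the agent's actual code:

```python
# %7f-%bf as a byte range: 0xA9 (169) falls inside it, 0xC3 (195) does not.
BLOCKED_RANGE = range(0x7F, 0xBF + 1)

def blocked_bytes(url: str) -> list:
    """Return the UTF-8 bytes of the URL that fall in the blocked range."""
    return [hex(b) for b in url.encode("utf-8") if b in BLOCKED_RANGE]

print(blocked_bytes("/é/allheaders.php"))   # ['0xa9']
```

   Only the second byte of the encoded "é" is in the range, which is
   consistent with the removal of %7f-%bf making the error disappear.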


   Could you detail the security concerns about restricting the
   BadUrlChars range?



Web Agent 12.52SP1CR10 on Apache 2.4.43 on RedHat 7





   The Web Agent encodes all characters in the URL which aren't ASCII,
   in order to validate them against the hex characters defined in
   BadUrlChars.

   We ran this test, and we can see that the Web Agent encodes the
   French "é" character to validate it against the values in
   BadUrlChars:

   In browser URL: é/allheaders.php

   WebAgent.log:

   [4455/613070592][Thu Aug 13 2020 10:58:18] SiteMinder APACHE 2.4
   WebAgent, Version 12.52 QMR01, Update HF-10, Label 2813.
   [4455/613070592][Thu Aug 13 2020 10:58:18]

   WebAgentTrace.log:

   contains BadUrlChars: '/é/allheaders.php'.]
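   The validation order described above can be sketched roughly as
   follows (BAD_URL_CHARS here is an illustrative subset, not the real
   default list, and this is not the agent's actual code):

```python
from urllib.parse import quote

BAD_URL_CHARS = ["%a9"]   # illustrative subset of a configured list

def url_is_bad(url: str) -> bool:
    # Non-ASCII characters are percent-encoded before matching,
    # so "é" becomes "%C3%A9" and a %a9 entry can catch it.
    normalized = quote(url, safe="/%").lower()
    return any(entry in normalized for entry in BAD_URL_CHARS)

print(url_is_bad("/é/allheaders.php"))   # True
print(url_is_bad("/plain/path.php"))     # False
```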

   I've modified the BadUrlChars from:




   and still the character is caught:

   WebAgent.log:

   [5435/3584091904][Thu Aug 13 2020 11:12:51]

   WebAgentTrace.log:


   contains BadUrlChars: '/é/allheaders.php'.]


   We've tried the same with the default values of BadUrlChars, and as
   expected, the // is caught:

   start /B iexplore é//allheaders.php
   [8475/1787266816][Thu Aug 13 2020 11:23:25]

   contains BadUrlChars: '/é//allheaders.php'.]
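   The double-slash case is a plain substring match; a minimal
   illustration (the list below is only an excerpt, not the full
   default BadUrlChars):

```python
BAD_SUBSTRINGS = ["//", "./", "/."]   # excerpt only, for illustration

def has_bad_substring(path: str) -> bool:
    return any(s in path for s in BAD_SUBSTRINGS)

print(has_bad_substring("/é//allheaders.php"))   # True: "//" matches
print(has_bad_substring("/é/allheaders.php"))    # False
```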

   That is the way it works. Note two important changes between
   12SP3CR12 and 12.52SP1CR10:

   There's a fix:


   Defects Fixed in 12.52 SP1 CR09

   00467736 DE203133 When localization is enabled, Web Agent fails to
   decode the encoded characters before processing badurlchars.

   And 12.51 enables Localization by default:



   List of Agent Configuration Parameters

   Localization (default: Yes): Specifies whether agent
   internationalization is enabled. Disable it to use customized FCCs
   created for agent versions prior to 12.51. See FCC
   Internationalization.
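   Assuming those two changes, the difference between the versions can
   be sketched as follows: the pre-fix agent matches the raw (still
   percent-encoded, pure ASCII) URL text, while the post-fix agent with
   Localization enabled decodes the escapes first, exposing the 0xA9
   byte to a %7f-%bf entry. A hypothetical contrast, not the actual
   agent code:

```python
from urllib.parse import unquote_to_bytes

BLOCKED = range(0x7F, 0xBF + 1)   # the %7f-%bf BadUrlChars entry

def legacy_check(raw_url: str) -> bool:
    # 12SP3-style: the percent-encoded URL is pure ASCII text,
    # so none of its characters fall in the 0x7F-0xBF range.
    return any(ord(c) in BLOCKED for c in raw_url)

def cr09_check(raw_url: str) -> bool:
    # 12.52SP1CR09-style (DE203133 fixed, Localization enabled):
    # percent-escapes are decoded to bytes before matching.
    return any(b in BLOCKED for b in unquote_to_bytes(raw_url))

url = "/%C3%A9/allheaders.php"
print(legacy_check(url), cr09_check(url))   # False True
```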

  For guidance on security, please refer to the following KD:

  Web Agent :: BadUrlChars : Impact of disabling Them