BadFormChars
Specifies the characters that the Web Agent blocks before using them as output on a form. If enabled, and if the agent name part of the URL contains one or more of the characters specified in this parameter, the login page returns the following error message:
Internal Server Error
Default: Disabled (characters are not blocked)
Examples: <, >, &, %22
Limits: Specify the characters literally, except for the double quotation mark ("), which must be URL-encoded as %22.
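For illustration, an ACO setting that matches the documented examples might look like the following (the character list is site-specific and shown only as an example):

  BadFormChars="<,>,&,%22"

With this value in place, a GET request for the FCC whose agent name portion of the URL contains any of these characters returns Internal Server Error instead of rendering the login form.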
One of the recurring issues when researching this topic is conflicting information, depending on the age of the KB article or document consulted. Over the years, as CSS/XSS checking has advanced, the product design around the various ACO parameters (CSSChecking, BadCSSChars, BadFormChars, and so on) has evolved as well.
Originally, BadFormChars did not block characters; it encoded them as HTML entities, and it was limited to a specific set of four characters:
BadFormChars="<,>,&,""
< was encoded as &lt;
> was encoded as &gt;
& was encoded as &amp;
" was encoded as &quot;
It was not designed or intended to scan the actual FCC source code or filter the POST data. It was meant to apply during a GET request when FCC directives like $$variable$$ are used. It also applies to the FCC hidden variables, which become POST data during the POST request.
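To make that concrete, here is a minimal sketch of an FCC fragment; the hidden field and the $$target$$ variable are typical of a login.fcc and are shown only as an example:

  <input type="hidden" name="target" value="$$target$$">

During the GET request that renders the form, the agent resolves $$target$$, and under the original behavior it encoded any of the four characters found in the resolved value (for example, a value containing <script> was written to the form as &lt;script&gt;). When the user submits the form, that hidden field becomes POST data, which the agent does not filter.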
Please note that SiteMinder does not provide any support for filtering POST data. This is the responsibility of the application that is receiving the data.
Here is what was changed and how it works in the newer releases:
- BadFormChars now accepts any character in the ACO, putting it on par with the other CSS/XSS parameters rather than being limited to a specific set.
- BadFormChars now requires a double quotation mark (") to be URL-encoded in the ACO value as %22 (see the example after this list).
- BadFormChars now blocks, rather than encodes, any character listed in the ACO.
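As an illustration of the difference, the values below are examples only, not recommendations (the single quote is included only to show that arbitrary characters are now accepted):

  Older releases (encoding): BadFormChars="<,>,&,""
  Newer releases (blocking): BadFormChars="<,>,&,%22,'"

In newer releases, a request whose agent name portion of the URL contains a listed character is rejected with Internal Server Error rather than having the character encoded on the form.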
The application of the ACO remains the same: it applies to GET requests and does not scan the FCC source code or POST data. For additional clarification, none of the parameters BadCSSChars, BadURLChars, BadQueryChars, or BadFormChars filter or block POST data.