Guidelines for Configuring Feeds for Web and Firewall Logs

Consider the following guidelines when configuring web and firewall logs:

  • The Zscaler service hex encodes all non-printable ASCII characters in URLs when it sends the logs to the NSS. Any URL character that is less than or equal to 0x20, or greater than or equal to 0x7F, is encoded as %HH. This ensures that your SIEM can parse the URLs even if they contain control characters. For example, a newline (\n) character in a URL is encoded as %0A, and a space is encoded as %20.

You can also specify additional characters to encode, in case you are using certain characters as delimiters. For example, if you are using a comma (,) as a delimiter, enter a comma in the Feed Escape Character field and the service will encode it as %2C. Note that the service encodes characters in URLs, host names, and Referer URLs only.
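The encoding rule above can be sketched as follows. This is a minimal illustration, not Zscaler's implementation: the function name and parameters are hypothetical, and it assumes single-byte characters are encoded as a single %HH sequence.

```python
# Illustrative sketch of the %HH encoding rule: bytes <= 0x20, bytes >= 0x7F,
# and any user-specified feed escape characters are percent-encoded.
def encode_url(url: str, feed_escape_chars: str = "") -> str:
    out = []
    for ch in url:
        code = ord(ch)
        if code <= 0x20 or code >= 0x7F or ch in feed_escape_chars:
            out.append("%{:02X}".format(code))  # e.g. space -> %20, \n -> %0A
        else:
            out.append(ch)
    return "".join(out)

# A space becomes %20, a newline %0A, and a comma %2C when it is
# listed as a feed escape character.
print(encode_url("example.com/a b\n"))        # → example.com/a%20b%0A
print(encode_url("a,b", feed_escape_chars=","))  # → a%2Cb
```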

  • The NSS can buffer logs for at least one hour. If a SIEM goes offline for maintenance, or if the connection between the NSS and the SIEM is disrupted, the NSS buffers the logs and sends them once the connection is re-established. Note that there can be a difference between the time the SIEM becomes unavailable and the time the NSS detects that the SIEM is unavailable. When you configure an NSS feed, you can use the Duplicate Logs option to control how far before the detection time the NSS resends logs. Note that you can configure up to eight NSS feeds for each NSS.

For example, the SIEM went offline at 6:29:00 p.m., the NSS detected the lost connection at 6:30:00 p.m., and the connection was restored at 6:40:00 p.m.

  • If Duplicate Logs was set to five minutes, the NSS resends logs from 6:25:00 p.m. onwards after the connection is restored, sending five minutes of duplicate logs.
  • If Duplicate Logs was disabled, the NSS resends logs from 6:30:01 p.m. onwards after the connection is restored.
    Each log record has a unique ID stored in the ‘recordid’ field that can be used to discover duplicate logs.
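The resend window in the example above can be sketched as below. This is an illustrative calculation under the stated behavior, not Zscaler code; the function name is hypothetical.

```python
from datetime import datetime, timedelta
from typing import Optional

# Sketch of the resend-window rule: with Duplicate Logs set, the NSS resends
# from (detection time - window); with it disabled, from one second after
# the detection time.
def resend_start(detected_at: datetime,
                 duplicate_logs_minutes: Optional[int]) -> datetime:
    if duplicate_logs_minutes:
        return detected_at - timedelta(minutes=duplicate_logs_minutes)
    return detected_at + timedelta(seconds=1)

# The NSS detected the lost connection at 6:30:00 p.m.
detected = datetime(2024, 1, 1, 18, 30, 0)
print(resend_start(detected, 5))     # → 2024-01-01 18:25:00 (five minutes of duplicates)
print(resend_start(detected, None))  # → 2024-01-01 18:30:01 (no duplicates)
```

Because resent logs can overlap with logs already received, the SIEM can deduplicate on the unique 'recordid' field.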