Log Streaming Overview

The Log Streaming product is an add-on to the Unified Security Service platform that lets you extract enriched log data for each licensed core security product. The typical use case is to stream log data to an external log analysis/SIEM tool.

Plugins for Third-Party Tools

There are many third-party tools available that can be integrated with the Log Streaming product to provide enhanced analytics. If a tool does not directly support one of the Configuration Options out of the box, a developer can usually use the Configuration Options below to create a custom plugin or add-on. Recommended plugins will be listed here when they are available. If you require a specific plugin, please contact your Service Provider to discuss further.

  • Splunk Enterprise - download and install the app
  • Rapid7 InsightIDR - no plugin required, use a Collector configured with the Amazon SQS option (see below)
  • SumoLogic - no plugin required, use the Form Post option
  • AlienVault USM Anywhere - contact AlienVault to request an AlienApp to be developed

Configuration Options

The Log Streaming product provides three output options:

  1. Webhook - provides a standard HTTP endpoint to call from a script/plugin to download the latest log data for a given core product. This is a generic interface for accessing the log data.
  2. Amazon SQS - the log data will be published to the Amazon SQS queue specified in the configuration. This method is supported by a variety of third party SIEM tools.
  3. Form Post - the log data will be POSTed as multipart form data in JSON format to the specified URL.

Please note that all methods have a one-day timeout for streaming. If data is not successfully sent for more than one day, the streaming point will be reset to the last 5 minutes.

To configure Log Streaming for a product, navigate to Products and then click Log Streaming. If you require a license, please contact your Service Provider.

The products that are licensed and support Log Streaming will be listed on the left. Click a product to access the configuration options.

Webhook

Enabling the Webhook option will generate a unique secret key which can then be used to call the HTTP endpoint to retrieve the log data. Some plugins simply require you to provide the secret key as part of their configuration.

Currently a webhook can only be used by one external service. Using the same webhook with multiple external services will overwrite the cursor position and lead to inaccurate log streaming.

The HTTP endpoint is available here:

https://stream.clouduss.com/{product}/{key}
  • product = web, email, app (inline-mode CASB), casb (API-mode CASB), mfa
  • key = the Key displayed in the configuration screen. Ensure you keep the key safe and secure.

IMPORTANT: The first time you call the endpoint it will return the previous 10 minutes' worth of data (approximately) and it will store the UTC timestamp of the call as a cursor. Subsequent calls will retrieve the data since the last cursor in 5 minute intervals, up to the maximum of one day since the last call. It is recommended that you call the endpoint every 5 minutes. If you do not call the endpoint for over a day, the cursor will reset to the last 5 minutes and you will begin streaming again from that point. The longer the gap between calls, the more data you will need to process each time.
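A minimal polling sketch for the webhook, using only the Python standard library. The product and key values are placeholders you would take from your own configuration screen; the helper names are illustrative, not part of any official SDK.

```python
import urllib.request

BASE = "https://stream.clouduss.com"  # endpoint documented above
PRODUCTS = {"web", "email", "app", "casb", "mfa"}

def endpoint_url(product: str, key: str) -> str:
    """Build the webhook URL for a given core product and secret key."""
    if product not in PRODUCTS:
        raise ValueError(f"unknown product: {product}")
    return f"{BASE}/{product}/{key}"

def fetch_logs(product: str, key: str) -> bytes:
    """Call the endpoint once; each call advances the server-side cursor,
    so schedule this roughly every 5 minutes from a single consumer."""
    with urllib.request.urlopen(endpoint_url(product, key)) as resp:
        return resp.read()

# Example (key is a placeholder, not a real secret):
url = endpoint_url("web", "YOUR-SECRET-KEY")
```

Because the cursor is shared per key, run this from exactly one scheduler (e.g. one cron job); pointing two pollers at the same URL would split the stream, as noted above.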

Amazon SQS

Enabling the Amazon SQS option will instruct the Log Streaming product to send log data to the designated SQS queue every 5 minutes. If the SQS queue is not available, the streamed data will be lost. Each row of data will be sent as a separate SQS message.

Please ensure the SQS queue maximum message size is set large enough to receive all of the metadata about an email message. We recommend 64 KB or larger.

To use this option you will need to sign up for Amazon Web Services and create an SQS queue. This may incur additional costs.
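Since each row of log data arrives as a separate SQS message, a consumer can decode each message body as a single JSON object. The sketch below shows only the parsing step with a simulated message; in practice you would receive messages via an SQS client such as boto3's `receive_message`, which exposes the payload under the `Body` key. The field names in the sample row are illustrative only.

```python
import json

def rows_from_messages(messages):
    """Decode one log row per SQS message (payload is under 'Body')."""
    return [json.loads(msg["Body"]) for msg in messages]

# Simulated receive_message() result; real rows depend on the product.
sample = [{"Body": '{"event": "login", "user": "alice"}'}]
rows = rows_from_messages(sample)
```

Remember to delete each message from the queue after it has been processed, or it will be redelivered once its visibility timeout expires.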

Form Post

Enabling the Form Post option will instruct the Log Streaming product to send log data to the designated URL every 5 minutes. The data will be sent in JSON format as a multipart form POST, so the web server at the receiving end should be configured accordingly. The data is formatted as one JSON object per line. If the URL is not available, the streamed data will be lost.

If more than 20 rows of data are returned in the 5 minute period, then multiple POSTs will be sent with a maximum of 20 rows per POST until all the data has been sent.
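On the receiving side, each POST body can therefore be parsed as newline-delimited JSON, with at most 20 rows per request. A minimal parsing sketch, assuming the multipart field has already been extracted by your web framework (the sample rows are illustrative):

```python
import json

def parse_form_post(payload: str):
    """Parse a Form Post body: one JSON object per line, max 20 per POST."""
    return [json.loads(line) for line in payload.splitlines() if line.strip()]

# Illustrative payload of three rows, as one JSON object per line.
body = '{"id": 1}\n{"id": 2}\n{"id": 3}\n'
rows = parse_form_post(body)
```

Your endpoint should return an HTTP 2xx status promptly; as noted above, undeliverable data is lost rather than retried indefinitely.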
