Stream Audit Log Events to your customers’ SIEM providers.
Log Streams allow your customers to stream Audit Logs directly to their Security Information and Event Management (SIEM) providers like Datadog or Splunk, and to object storage solutions like AWS S3 or Google Cloud Storage. There is also a generic provider (HTTP POST) available to stream logs to any configured endpoint.
This gives your customers greater control over their Audit Logs by allowing them to apply custom indexing and monitoring of their events in the SIEM provider along with events from other cloud services they use.
Log Streams can be created either by configuring them through your WorkOS Dashboard or by letting your customer’s IT admin configure them through the WorkOS Admin Portal.
WorkOS streams audit logs from a fixed set of IP addresses. If audit logs are being streamed to a host that restricts access based on IP address, the following IP addresses should be allowed:
- 3.217.146.166
- 23.21.184.92
- 34.204.154.149
- 44.213.245.178
- 44.215.236.82
- 50.16.203.9
- 52.1.251.34
- 52.21.49.187
- 174.129.36.47
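If your customers run their own collection endpoints, they can enforce this allowlist in application code as well as at the network layer. A minimal sketch, assuming a plain source-IP check (the helper name is an illustration, not part of any WorkOS SDK):

```typescript
// Hypothetical helper: check whether a request's source IP is one of
// WorkOS's fixed audit log streaming addresses listed above.
const WORKOS_EGRESS_IPS = new Set([
  "3.217.146.166", "23.21.184.92", "34.204.154.149",
  "44.213.245.178", "44.215.236.82", "50.16.203.9",
  "52.1.251.34", "52.21.49.187", "174.129.36.47",
]);

function isWorkOSEgressIP(ip: string): boolean {
  return WORKOS_EGRESS_IPS.has(ip);
}
```

A network-level allowlist (firewall rule or security group) remains the stronger control; an in-app check like this is a secondary guard.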
To configure a Log Stream through the WorkOS Dashboard, navigate to an organization and click “Configure”.

You will be prompted to select a destination from a dropdown; click “Save connection”. You will then be prompted to provide configuration specific to the selected destination.

The Admin Portal can be accessed via a Setup Link found in the Organization page within the Dashboard. Click “Generate” and select “Log Streams”. Copy the link and send it to the organization’s IT admin who will be configuring Log Streams.

You can also guide users to the Admin Portal by redirecting them to a programmatically generated Admin Portal link directly from your application.
```typescript
import { WorkOS } from '@workos-inc/node';

const workos = new WorkOS('sk_example_123456789');

const { link } = await workos.portal.generateLink({
  organization: 'org_01EHZNVPK3SFK441A1RGBFSHRT',
  intent: 'log_streams',
});

// Redirect to link
```
Once redirected to the Admin Portal, the user will be prompted to select a destination and will be provided with step-by-step configuration instructions for the selected destination.

WorkOS supports streaming audit log events to five different types of destinations, each with its own payload format and configuration requirements:
Events are sent to Datadog’s HTTP Log Intake API with regional endpoint support.
Example Payload:
```json
[
  {
    "message": {
      "id": "01HY123456ABCDEFGHIJK",
      "action": "user.signed_in",
      "targets": [
        { "id": "user_123", "type": "user" }
      ],
      "actor": { "id": "user_456", "type": "user" },
      "context": {
        "location": "192.168.1.1",
        "user_agent": "Chrome/91.0"
      },
      "occurred_at": "2024-01-15T10:30:00.000Z"
    },
    "ddsource": "team-name",
    "service": "audit-logs"
  }
]
```
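The wrapping shown above can be sketched in a few lines. This is an illustration of the payload shape, not the WorkOS implementation, and the `toDatadogPayload` helper is hypothetical:

```typescript
// Shape of one entry in the Datadog Logs intake payload shown above.
interface DatadogLogEntry {
  message: Record<string, unknown>; // the audit log event itself
  ddsource: string;                 // e.g. the configured team name
  service: string;                  // fixed to "audit-logs" in the example
}

// Hypothetical helper: wrap an audit log event for Datadog's intake API.
function toDatadogPayload(
  event: Record<string, unknown>,
  teamName: string
): DatadogLogEntry[] {
  return [{ message: event, ddsource: teamName, service: "audit-logs" }];
}
```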
Configuration:
Events are sent to Splunk’s HTTP Event Collector (HEC) endpoint.
Example Payload:
```json
[
  {
    "event": {
      "id": "01HY123456ABCDEFGHIJK",
      "action": "user.signed_in",
      "targets": [
        { "id": "user_123", "type": "user" }
      ],
      "actor": { "id": "user_456", "type": "user" },
      "context": {
        "location": "192.168.1.1",
        "user_agent": "Chrome/91.0"
      },
      "occurred_at": "2024-01-15T10:30:00.000Z"
    },
    "time": 1705314600000,
    "source": "team-name"
  }
]
```
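Note that the top-level `time` field in this example is the event’s `occurred_at` timestamp expressed as epoch milliseconds. A sketch of that conversion (the helper name is hypothetical):

```typescript
// Hypothetical helper: derive the Splunk HEC "time" field (epoch
// milliseconds, per the example above) from an event's occurred_at.
function splunkEventTime(occurredAt: string): number {
  return new Date(occurredAt).getTime();
}
```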
Configuration:
Events are stored as individual JSON files in an S3 bucket with cross-account IAM role access.
File Format: Individual JSON files per event with pretty-printed formatting
File Naming Pattern: `YYYY-MM-DD/{timestamp}_{keySuffix}.json`
Example Filename: `2024-01-15/2024-01-15T10:30:00.123Z_abc123def456.json`
Example File Content:
```json
{
  "id": "01HY123456ABCDEFGHIJK",
  "action": "user.signed_in",
  "targets": [
    { "id": "user_123", "type": "user" }
  ],
  "actor": { "id": "user_456", "type": "user" },
  "context": {
    "location": "192.168.1.1",
    "user_agent": "Chrome/91.0"
  },
  "occurred_at": "2024-01-15T10:30:00.000Z"
}
```
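The naming pattern above can be reproduced with a few lines of code. This is an illustrative sketch, not the WorkOS implementation, and `s3ObjectKey` is a hypothetical name:

```typescript
// Hypothetical helper: build an S3 object key matching the documented
// pattern YYYY-MM-DD/{timestamp}_{keySuffix}.json.
function s3ObjectKey(occurredAt: Date, keySuffix: string): string {
  const iso = occurredAt.toISOString(); // e.g. 2024-01-15T10:30:00.123Z
  const day = iso.slice(0, 10);         // e.g. 2024-01-15
  return `${day}/${iso}_${keySuffix}.json`;
}
```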
Configuration:
Events are stored as individual JSON files using Google Cloud Storage’s S3-compatible API.
File Format: Individual JSON files per event (same format as S3)
File Naming Pattern: `{timestamp}_{keySuffix}.json`
Example File Content: Same JSON structure as S3
Configuration:
Events are sent to custom HTTP endpoints with configurable authentication and format options.
JSON Format Example:
```json
[
  {
    "event": {
      "id": "01HY123456ABCDEFGHIJK",
      "action": "user.signed_in",
      "targets": [
        { "id": "user_123", "type": "user" }
      ],
      "actor": { "id": "user_456", "type": "user" },
      "context": {
        "location": "192.168.1.1",
        "user_agent": "Chrome/91.0"
      },
      "occurred_at": "2024-01-15T10:30:00.000Z"
    },
    "keySuffix": "abc123def456",
    "timestamp": "2024-01-15T10:30:00.123Z",
    "source": "team-name"
  }
]
```
NDJSON Format Example:
```
{"event":{"id":"01HY123456ABCDEFGHIJK","action":"user.signed_in",...},"keySuffix":"abc123def456","timestamp":"2024-01-15T10:30:00.123Z"}
```
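A receiving endpoint for the generic provider needs to handle the chosen format. A minimal sketch of parsing an NDJSON body, assuming one JSON record per line (field names follow the examples above; the parsing helper itself is hypothetical):

```typescript
// Shape of one streamed record, per the JSON/NDJSON examples above.
interface StreamedRecord {
  event: { id: string; action: string; occurred_at: string };
  keySuffix: string;
  timestamp: string;
}

// Hypothetical helper: split an NDJSON request body into records,
// skipping blank lines (e.g. a trailing newline).
function parseNDJSON(body: string): StreamedRecord[] {
  return body
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as StreamedRecord);
}
```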
Configuration:
Audit log streams can be in one of four states that determine their operational status:
| State | Description |
|---|---|
| Active | Stream is functioning normally and delivering events |
| Inactive | Stream is incomplete, manually disabled, or paused |
| Error | Stream encountered a retry-able error and will be retried |
| Invalid | Stream has invalid credentials or configuration |
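The four states can be modeled in application code as a simple union. The type and helper below are illustrative assumptions; the lowercase values are not necessarily the identifiers the WorkOS API returns:

```typescript
// Hypothetical model of the four stream states from the table above.
type LogStreamState = "active" | "inactive" | "error" | "invalid";

function isRetryable(state: LogStreamState): boolean {
  // Per the table, only the "error" state indicates a retry-able
  // failure; "invalid" requires fixing credentials or configuration.
  return state === "error";
}
```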
Streams automatically transition between states based on delivery outcomes: