# Analytics & Data
Relay provides built-in analytics tools to measure user behavior, track custom business metrics, and export data to external systems.
## Funnel Analysis
Define multi-step event funnels to measure conversion rates.
### Creating a Funnel
Go to your app → More → Observe → Funnels → Create Funnel.
A funnel is a sequence of steps, each defined by a channel pattern and event name:
```json
{
  "name": "Checkout Flow",
  "steps": [
    {"channel": "page.*", "event": "view"},
    {"channel": "cart.*", "event": "add"},
    {"channel": "checkout.*", "event": "start"},
    {"channel": "checkout.*", "event": "complete"}
  ],
  "window_minutes": 60
}
```
### How It Works
- Relay tracks which `socket_id` values appear at each step
- Only users who completed step N are checked for step N+1
- The window limits how much time may elapse between the first and last step
- Results show counts and conversion rates per step
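The step-matching rules above can be sketched roughly as follows. This is a minimal illustration, not Relay's actual implementation; channel patterns are assumed to follow shell-style globbing.

```python
from fnmatch import fnmatch

def evaluate_funnel(events, steps, window_minutes):
    """Count how many socket_ids reach each funnel step, in order.

    events: (socket_id, channel, event, ts_minutes) tuples, sorted by time.
    steps:  (channel_pattern, event_name) pairs.
    """
    progress = {}                 # socket_id -> (next step index, first-step time)
    counts = [0] * len(steps)
    for socket_id, channel, event, ts in events:
        step, first_ts = progress.get(socket_id, (0, None))
        if step >= len(steps):
            continue              # this user already completed the funnel
        pattern, name = steps[step]
        if fnmatch(channel, pattern) and event == name:
            first_ts = ts if first_ts is None else first_ts
            # Enforce the funnel window between first and current step.
            if ts - first_ts <= window_minutes:
                counts[step] += 1
                progress[socket_id] = (step + 1, first_ts)
    return counts
```
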
### Reading Results
| Step | Count | Conversion |
|---|---|---|
| page.view | 1,000 | 100% |
| cart.add | 500 | 50% |
| checkout.start | 200 | 20% |
| checkout.complete | 50 | 5% |
Conversion is always relative to step 1 (not the previous step).
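Because every percentage is computed against the step-1 count, the table above can be reproduced with a short sketch:

```python
def conversion_rates(counts):
    # Each step's rate is relative to step 1, not the previous step.
    base = counts[0]
    return [round(100 * c / base, 1) for c in counts]

print(conversion_rates([1000, 500, 200, 50]))  # [100.0, 50.0, 20.0, 5.0]
```
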
## Custom Metrics
Define your own metrics from event payloads.
### Creating a Metric
Go to your app → More → Observe → Custom Metrics → Create Metric.
| Field | Example | Description |
|---|---|---|
| Name | Daily Revenue | Human-readable name |
| Aggregation | `sum` | One of: `sum`, `avg`, `count`, `min`, `max`, `p95` |
| Source Channel | `orders-*` | Glob pattern for source channels |
| Source Event | `order.completed` | Event name to aggregate |
| Field Path | `data.amount` | Dot-notation path into the event payload |
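The field-path and aggregation semantics can be sketched like this. It is an illustration of the behavior described above, not Relay's implementation, and the `p95` computation uses a simple nearest-rank approximation.

```python
def get_path(obj, path):
    """Resolve a dot-notation path such as 'data.amount' in a nested dict."""
    for key in path.split("."):
        if not isinstance(obj, dict) or key not in obj:
            return None
        obj = obj[key]
    return obj

def aggregate(events, field_path, how):
    """Apply one of the supported aggregations to a field across event payloads."""
    values = [v for v in (get_path(e, field_path) for e in events) if v is not None]
    if how == "count":
        return len(values)
    if not values:
        return None
    if how == "sum":
        return sum(values)
    if how == "avg":
        return sum(values) / len(values)
    if how == "min":
        return min(values)
    if how == "max":
        return max(values)
    if how == "p95":
        # Nearest-rank approximation of the 95th percentile.
        ordered = sorted(values)
        return ordered[min(len(ordered) - 1, int(round(0.95 * (len(ordered) - 1))))]
    raise ValueError(f"unknown aggregation: {how}")
```
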
### Example Metrics

Daily Revenue:
- Aggregation: `sum`
- Channel: `orders-*`
- Event: `order.completed`
- Field: `data.amount`

Average Response Time:
- Aggregation: `avg`
- Channel: `api.*`
- Event: `response`
- Field: `data.duration_ms`

Error Count:
- Aggregation: `count`
- Channel: `errors-*`
- Event: `*`
- Field: `*`
### Viewing Metrics
Each metric shows:
- Latest value — the most recent computed value
- Time series — 24-hour chart of values over time
- Sample count — how many events contributed
Metrics are computed on a rolling basis and stored as time-bucketed values.
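Time-bucketing can be sketched as below. The 5-minute bucket size is an assumption for illustration; Relay's actual bucket granularity isn't specified here.

```python
from collections import defaultdict

def bucket_values(samples, bucket_minutes=5):
    """Sum (timestamp_minutes, value) samples into fixed time buckets."""
    buckets = defaultdict(float)
    for ts, value in samples:
        # Round the timestamp down to the start of its bucket.
        buckets[ts - ts % bucket_minutes] += value
    return dict(sorted(buckets.items()))

print(bucket_values([(0, 1), (3, 2), (7, 4)]))  # {0: 3.0, 5: 4.0}
```
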
## Data Export Pipelines
Stream event data to external systems for warehousing, analytics, or compliance archival.
### Supported Destinations
| Destination | Format Support |
|---|---|
| Amazon S3 | JSON, Parquet, CSV, Avro |
| Google BigQuery | JSON |
| Snowflake | JSON |
| Apache Kafka | JSON, Avro |
| ClickHouse | JSON |
| Elasticsearch | JSON |
| Webhook | JSON |
### Creating a Pipeline
Go to your app → More → Observe → Data Exports → Create Pipeline.
Configure:
| Field | Description |
|---|---|
| Name | Pipeline name (e.g., "S3 Archive") |
| Destination | Where to send data |
| Source Channels | Glob pattern (e.g., orders-* or * for all) |
| Format | Output format |
| Batch Size | Events per batch (default 1000) |
| Flush Interval | Seconds between exports (default 60) |
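The interaction of batch size and flush interval can be sketched as a simple buffer that flushes on whichever limit is hit first. This is an illustration of the settings above, not Relay's implementation; `send` is a hypothetical callback to the destination.

```python
import time

class ExportBatcher:
    """Buffer events; flush when the batch fills or the interval elapses."""

    def __init__(self, send, batch_size=1000, flush_interval=60):
        self.send = send
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, event):
        self.buffer.append(event)
        # Size-based flush: batch is full.
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def maybe_flush(self):
        # Time-based flush: call periodically from a timer loop.
        if self.buffer and time.monotonic() - self.last_flush >= self.flush_interval:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(self.buffer)
            self.buffer = []
        self.last_flush = time.monotonic()
```
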
### Destination Configuration
Each destination requires specific credentials:
S3:

```
Bucket: my-relay-events
Region: us-east-1
Access Key: AKIA...
Secret Key: ...
Prefix: relay/events/
```

BigQuery:

```
Project ID: my-project
Dataset: relay_events
Credentials JSON: {...}
```

Kafka:

```
Brokers: kafka1:9092,kafka2:9092
Topic: relay-events
SASL Username: (optional)
SASL Password: (optional)
```

Webhook:

```
URL: https://your-server.com/ingest
Secret: (auto-generated for HMAC signing)
```
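On the receiving end, your server can verify that a webhook delivery was signed with the shared secret. The exact header name and digest Relay uses aren't specified here, so HMAC-SHA256 over the raw request body is an assumption to confirm against your pipeline's settings.

```python
import hashlib
import hmac

def verify_signature(secret: str, body: bytes, signature: str) -> bool:
    """Check an HMAC-SHA256 hex signature over the raw request body.

    Assumes hex-encoded HMAC-SHA256; adjust if your pipeline differs.
    """
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels in the comparison.
    return hmac.compare_digest(expected, signature)
```
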
### Pipeline Status
Each pipeline shows:
- Events exported — total count
- Last exported — timestamp of last successful export
- Enabled/Disabled — toggle without deleting
- Flush interval — how often batches are sent
### Data Format
Exported events follow this structure:
```json
{
  "id": "01HQ...",
  "channel": "orders-us",
  "event": "order.created",
  "data": "{\"amount\": 99.99}",
  "payload_size": 24,
  "fired_at": "2024-03-15T14:30:00Z",
  "app_id": "01HQ..."
}
```
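Note that `data` arrives as a JSON-encoded string rather than a nested object, so consumers need a second parse:

```python
import json

exported = {
    "id": "01HQ...",
    "channel": "orders-us",
    "event": "order.created",
    "data": "{\"amount\": 99.99}",
    "payload_size": 24,
    "fired_at": "2024-03-15T14:30:00Z",
}

# `data` is a JSON string, not an object, so it needs its own json.loads() pass.
payload = json.loads(exported["data"])
print(payload["amount"])  # 99.99
```
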
## API Access
All analytics data is also available via the Intelligence API:
```
# Query events
POST /api/v1/apps/{appId}/intelligence/query
{"query": "SELECT COUNT(*) FROM events WHERE channel LIKE 'orders-*' SINCE 24h AGO"}

# Get cost attribution
GET /api/v1/apps/{appId}/intelligence/cost-attribution

# Get latency percentiles
GET /api/v1/apps/{appId}/intelligence/latency?channel=orders
```
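A query request can be built with the standard library alone. The host, app ID, and bearer-token auth below are placeholder assumptions; substitute your own values and confirm the auth scheme against your API credentials.

```python
import json
import urllib.request

# Hypothetical placeholders -- substitute your real host, app ID, and token.
BASE_URL = "https://api.example.com"
APP_ID = "your-app-id"
API_TOKEN = "your-api-token"

def build_query_request(sql: str) -> urllib.request.Request:
    """Build a POST to the Intelligence query endpoint (bearer auth assumed)."""
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/apps/{APP_ID}/intelligence/query",
        data=json.dumps({"query": sql}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )

# To send it (requires network access):
# with urllib.request.urlopen(build_query_request("SELECT COUNT(*) FROM events SINCE 24h AGO")) as resp:
#     print(json.load(resp))
```
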
See the Intelligence & Analytics docs for the full API reference.