API Playground

Test dakkio API endpoints directly from your browser. Enter your API key and try sending requests!

Getting Started

Don't have an API key yet? Create a free account and generate an API key from the dashboard.

API Key Types:

  • Admin Key (dakkio_a_...): Full access - bucket management, API key management
  • Write Key (dakkio_w_...): Data ingestion, data source management, alert management
  • Read Key (dakkio_r_...): Data queries, analytics
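Because each key type has a distinct prefix, client code can sanity-check a key before deciding which endpoints to call. A minimal sketch (this helper is not part of any dakkio SDK):

```python
# Hypothetical helper: infer a dakkio key's type from its documented prefix.
def key_type(api_key: str) -> str:
    prefixes = {"dakkio_a_": "admin", "dakkio_w_": "write", "dakkio_r_": "read"}
    for prefix, kind in prefixes.items():
        if api_key.startswith(prefix):
            return kind
    raise ValueError("unrecognized dakkio key prefix")
```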

1. Bucket Management

Start by creating a bucket to store your time-series data.

List Buckets

Get all buckets in your organization.

GET/api/buckets

List all buckets. Requires Admin API key.

Authentication

Create Bucket

Create a new data bucket.

POST/api/buckets

Create a new bucket for storing time-series data. Requires Admin API key.

Authentication

Request Body (JSON)
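A sketch of what the request body might look like. The field names (`name`, `description`) are illustrative assumptions; match them to the schema shown in the playground's request-body form.

```python
import json

# Illustrative create-bucket body; field names are assumptions, not the
# confirmed dakkio schema.
payload = {
    "name": "sensor-data",
    "description": "Temperature readings from field devices",
}
body = json.dumps(payload)
```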

2. API Key Management

Create write and read API keys for your bucket. Use these keys for data operations instead of your admin key.

Admin Key Required

All API key management endpoints require an Admin API key (dakkio_a_...). Only admin keys have the apikeys:manage permission needed for these operations.

List Bucket API Keys

Get all API keys associated with a specific bucket.

GET/api/buckets/:bucketId/keys

List all write and read API keys for a bucket. Requires Admin API key.

💡 Replace :bucketId or other parameters with actual IDs

Authentication

Create Bucket API Key

Generate a new write or read API key for a bucket.

POST

Create a new bucket-specific API key. Type can be 'write' or 'read'. Requires Admin API key.

Authentication

Request Body (JSON)

Save Your Key

The plaintext API key is shown only once, when it is created. Store it securely; you cannot retrieve it later. If you lose it, delete the key and create a new one.
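A small sketch of building the create-key body. The `type` field and its two allowed values are documented above; the `name` field is an assumed label that may not exist in the actual API.

```python
# Build a create-key request body; validates 'type' against the two
# documented values. The 'name' field is an assumption.
def key_payload(kind: str, name: str) -> dict:
    if kind not in ("write", "read"):
        raise ValueError("type must be 'write' or 'read'")
    return {"type": kind, "name": name}
```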

Delete Bucket API Key

Permanently delete an API key from a bucket.

DELETE

Delete a bucket API key. This action is immediate and cannot be undone. Requires Admin API key.

Authentication

Request Body (JSON)

3. Data Source Management

Define data sources to describe the structure of your data.

List Data Sources

Get all data sources in a bucket.

GET/api/buckets/:bucketId/sources

List all data sources in a bucket. Requires Write or Read API key.

Authentication

Create Data Source

Define a new data source in a bucket.

POST

Create a new data source with schema definition. Requires Write API key.

Authentication

Request Body (JSON)
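An illustrative request body. The `downsampling` flag is documented on this page, but the `name` and `fields` shapes are assumptions to verify against the playground's form.

```python
# Illustrative data-source definition; only 'downsampling' is a documented
# field, the rest of the shape is assumed.
source = {
    "name": "temperature",
    "downsampling": True,  # documented default: enables automatic aggregation
    "fields": [
        {"name": "value", "type": "float"},
        {"name": "unit", "type": "string"},
    ],
}
```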

Automatic Downsampling

The downsampling property (default: true) enables automatic data aggregation for queries. When enabled, queries without explicit aggregation will automatically aggregate data based on the time range to return ~500 data points instead of potentially millions. This optimizes query performance for high-frequency data sources.

How it works:

  • The system tracks the average interval between data points
  • When you query without specifying aggregation and groupBy, it calculates expected data points
  • If expected points exceed ~500, it automatically applies avg aggregation with an appropriate groupBy interval

Set downsampling: false if you always need raw data points.
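The heuristic above can be approximated as follows. This is a mimic of the documented behavior for intuition only, not the server's actual code:

```python
def auto_group_by(range_seconds: float, avg_interval_seconds: float,
                  target_points: int = 500):
    """If a raw query would return more than ~target_points data points,
    return a groupBy interval (in seconds) that yields roughly that many;
    otherwise return None (raw points are served as-is)."""
    expected = range_seconds / avg_interval_seconds
    if expected <= target_points:
        return None
    return range_seconds / target_points
```

For example, one day of 1-second data (86,400 expected points) would be bucketed into ~172.8-second intervals, while one hour of 1-minute data (60 expected points) would pass through untouched.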

4. Data Ingestion

Send time-series data to your bucket.

Ingest Single Data Point

Send a single time-series data point.

POST/api/data

Ingest a single data point. Requires Write API key.

Authentication

Request Body (JSON)
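A sketch of a single-point body. The field names and the timestamp unit (epoch milliseconds here) are assumptions; check them against the playground's schema.

```python
import json
import time

# Illustrative single data point; field names and timestamp unit are
# assumptions, not the confirmed dakkio schema.
point = {
    "dataSourceId": "your-data-source-id",
    "timestamp": int(time.time() * 1000),
    "data": {"value": 21.7, "unit": "C"},
}
body = json.dumps(point)
```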

Batch Ingest Data

Send multiple data points at once.

POST/api/data/batch

Batch ingest multiple data points in a single request. Requires Write API key.

Authentication

Request Body (JSON)
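When you have a large backlog of points, splitting it into several batch requests keeps each request small. The batch size of 500 below is an assumed safe value, not a documented limit:

```python
def batches(points, size=500):
    """Yield successive slices of a point list for separate batch requests.
    The default size is an assumption, not a documented API limit."""
    for i in range(0, len(points), size):
        yield points[i:i + size]
```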

5. Data Queries

Retrieve and analyze your time-series data.

Query Time-Series Data

Retrieve time-series data with filters and aggregations.

POST/api/data/query

Query time-series data with filters, aggregations, and time-based grouping. Requires Read API key.

Authentication

Request Body (JSON)
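An illustrative query body. The parameter names `aggregation` and `groupBy` follow the terms used on this page; the time-range field names are assumptions.

```python
# Illustrative query body; 'aggregation' and 'groupBy' are named on this
# page, the time-range fields are assumed.
query = {
    "dataSourceId": "your-data-source-id",
    "start": "2024-01-01T00:00:00Z",
    "end": "2024-01-02T00:00:00Z",
    "aggregation": "avg",
    "groupBy": "1h",
}
```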

Auto-Downsampling

If you omit aggregation and groupBy parameters and your data source has downsampling: true (default), the system will automatically aggregate data based on the time range to return ~500 data points. The response metadata will include downsamplingApplied: true when this happens.

To get raw data points, either:

  1. Set downsampling: false on the data source
  2. Always specify explicit aggregation and groupBy parameters, since auto-downsampling only applies when both are omitted
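Clients can detect when auto-downsampling was applied by inspecting the response. The `downsamplingApplied` metadata flag is documented above; the surrounding response shape here is an assumption:

```python
# Check the documented downsamplingApplied flag; the outer response shape
# is an assumption.
def was_downsampled(response: dict) -> bool:
    return bool(response.get("metadata", {}).get("downsamplingApplied", False))
```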

6. Alert Rules

Set up automated alerts based on your data conditions.

List Alert Rules

Get all alert rules for a bucket.

GET

List all alert rules in a bucket. Requires Read API key.

Authentication

Create Alert Rule

Set up an alert using a natural-language condition.

POST/api/alerts

Create an alert rule with natural language condition. Requires Write API key.

Authentication

Request Body (JSON)
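A sketch of an alert body. The natural-language `condition` is documented above; the other field names are illustrative assumptions.

```python
# Illustrative alert rule; only the natural-language 'condition' is
# documented, the other fields are assumed.
alert = {
    "name": "high-temperature",
    "condition": "average temperature above 30 for 5 minutes",
    "dataSourceId": "your-data-source-id",
}
```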

7. Analytics

View aggregated statistics and insights.

Get Overview Analytics

Organization-level analytics and statistics.

GET/api/analytics/overview

Get organization-level analytics including bucket count, data points, and active alerts. Requires Read API key.

Authentication

Get Bucket Analytics

Detailed analytics for a specific bucket.

GET

Get detailed analytics for a specific bucket. Requires Read API key.

Authentication


Tips for Using the Playground

Follow this sequence to test the full API:

  1. Create bucket (Admin key) → Save the bucket ID
  2. Create API keys (Admin key) → Create a write key and read key for the bucket
  3. Create data source (Write key) → Define your data schema, save the data source ID
  4. Ingest data (Write key) → Send test data points
  5. Query data (Read key) → Retrieve and analyze your data
  6. Set up alerts (Write key) → Monitor your data for conditions
  7. View analytics (Read key) → See aggregated statistics
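The sequence above can be scripted with a small request builder. The base URL is a placeholder and Bearer-token authentication is an assumption; check the playground's authentication field for the actual header format.

```python
import json
import urllib.request

BASE_URL = "https://your-dakkio-host"  # placeholder: substitute your API host

def build_request(method, path, api_key, payload=None):
    """Build (but do not send) one request in the sequence.
    Bearer-token auth is an assumption about the API's header format."""
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        BASE_URL + path,
        data=data,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method=method,
    )

# Step 1 of the sequence, as a concrete request object:
req = build_request("POST", "/api/buckets", "dakkio_a_your_admin_key",
                    {"name": "demo"})
# urllib.request.urlopen(req)  # uncomment to actually send it
```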

Replace Placeholder IDs

Replace placeholder IDs with your actual resource IDs:

  • bucketId: Get from GET /api/buckets
  • dataSourceId: Get from GET /api/buckets/:bucketId/sources
  • keyId: Get from GET /api/buckets/:bucketId/keys

Error Handling

If you get errors:

  • 401 Unauthorized: Check your API key is valid and has the required permissions
  • 403 Forbidden: Your API key doesn't have permission for this operation
  • 404 Not Found: Verify your resource IDs are correct
  • 400 Validation Error: Check request body matches the schema
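The four documented statuses can be mapped to their suggested fixes in client code; this is purely a convenience sketch, not part of the API:

```python
# Map the documented dakkio error statuses to the suggested fixes.
def explain_status(status: int) -> str:
    return {
        400: "request body does not match the schema",
        401: "API key is missing, invalid, or lacks required permissions",
        403: "API key does not permit this operation",
        404: "a resource ID in the request is wrong",
    }.get(status, "unexpected status; inspect the response body")
```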

Security Note

warning

This playground sends real requests to the production API. Be careful with:

  • Personal data in request bodies
  • Production API keys
  • Sensitive information

For testing, use a separate development account and test data only.