API Playground
Test dakkio API endpoints directly from your browser. Enter your API key and try sending requests!
Don't have an API key yet? Create a free account and generate an API key from the dashboard.
API Key Types:
- Admin Key (dakkio_a_...): Full access - bucket management, API key management
- Write Key (dakkio_w_...): Data ingestion, data source management, alert management
- Read Key (dakkio_r_...): Data queries, analytics
1. Bucket Management
Start by creating a bucket to store your time-series data.
List Buckets
Get all buckets in your organization.
List all buckets. Requires Admin API key.
Create Bucket
Create a new data bucket.
Create a new bucket for storing time-series data. Requires Admin API key.
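As a rough sketch, a bucket-creation request might look like the following. The base URL, Bearer-style Authorization header, and the "name" body field are assumptions for illustration; the POST path mirrors the GET /api/buckets path listed later in this page. The request object is built but not sent.

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL, not confirmed by the docs
ADMIN_KEY = "dakkio_a_your_key_here"     # placeholder admin key

# Build (but do not send) a create-bucket request.
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets",
    data=json.dumps({"name": "sensor-data"}).encode("utf-8"),  # "name" field is illustrative
    headers={
        "Authorization": f"Bearer {ADMIN_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send it.
print(req.get_method(), req.full_url)
```

Save the bucket ID from the response; the later endpoints need it in place of :bucketId.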
2. API Key Management
Create write and read API keys for your bucket. Use these keys for data operations instead of your admin key.
All API key management endpoints require an Admin API key (dakkio_a_...). Only admin keys have the apikeys:manage permission needed for these operations.
List Bucket API Keys
Get all API keys associated with a specific bucket.
List all write and read API keys for a bucket. Requires Admin API key.
💡 Replace :bucketId or other parameters with actual IDs
Create Bucket API Key
Generate a new write or read API key for a bucket.
Create a new bucket-specific API key. Type can be 'write' or 'read'. Requires Admin API key.
💡 Replace :bucketId or other parameters with actual IDs
The plaintext API key is only shown once when created. Store it securely - you cannot retrieve it later. If lost, delete the key and create a new one.
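A minimal sketch of creating a write key for a bucket follows. The /api/buckets/:bucketId/keys path comes from the placeholder-ID list later on this page; the base URL, Bearer auth, and the "type" field name are assumptions (the docs only say the type can be 'write' or 'read').

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
ADMIN_KEY = "dakkio_a_your_key_here"     # placeholder admin key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID

# Build (but do not send) a create-key request; "type" is an assumed field name.
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/keys",
    data=json.dumps({"type": "write"}).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {ADMIN_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```

Remember to copy the plaintext key out of the response immediately, since it is only shown once.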
Delete Bucket API Key
Permanently delete an API key from a bucket.
Delete a bucket API key. This action is immediate and cannot be undone. Requires Admin API key.
💡 Replace :bucketId or other parameters with actual IDs
3. Data Source Management
Define data sources to describe the structure of your data.
List Data Sources
Get all data sources in a bucket.
List all data sources in a bucket. Requires Write or Read API key.
💡 Replace :bucketId or other parameters with actual IDs
Create Data Source
Define a new data source in a bucket.
Create a new data source with schema definition. Requires Write API key.
💡 Replace :bucketId or other parameters with actual IDs
The downsampling property (default: true) enables automatic data aggregation for queries. When enabled, queries without explicit aggregation will automatically aggregate data based on the time range to return ~500 data points instead of potentially millions. This optimizes query performance for high-frequency data sources.
How it works:
- The system tracks the average interval between data points
- When you query without specifying aggregation and groupBy, it calculates the expected number of data points
- If the expected count exceeds ~500, it automatically applies avg aggregation with an appropriate groupBy interval
Set downsampling: false if you always need raw data points.
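A data-source creation request might be sketched as below. The /api/buckets/:bucketId/sources path is taken from the placeholder-ID list later on this page; the base URL, the "name" and "schema" field names, and the schema shape are illustrative assumptions. The downsampling flag is the property described above.

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
WRITE_KEY = "dakkio_w_your_key_here"     # placeholder write key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID

payload = {
    "name": "temperature",                                # hypothetical source name
    "schema": {"value": "float", "sensor_id": "string"},  # illustrative schema shape
    "downsampling": True,  # default; set False to always receive raw data points
}
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/sources",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {WRITE_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```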
4. Data Ingestion
Send time-series data to your bucket.
Ingest Single Data Point
Send a single time-series data point.
Ingest a single data point. Requires Write API key.
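A single-point ingest request could look roughly like this. The ingest path, the timestamp/value field names, and the base URL are all assumptions for illustration; this page does not spell out the ingestion schema.

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
WRITE_KEY = "dakkio_w_your_key_here"     # placeholder write key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID
SOURCE_ID = "src_456"                    # hypothetical data source ID

# Hypothetical ingest path and payload fields.
point = {"timestamp": "2024-01-01T00:00:00Z", "value": 22.5}
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/sources/{SOURCE_ID}/data",
    data=json.dumps(point).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {WRITE_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```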
Batch Ingest Data
Send multiple data points at once.
Batch ingest multiple data points in a single request. Requires Write API key.
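A batch ingest might wrap several points in one body, along these lines. The path, the "points" wrapper field, and the per-point fields are assumptions, not confirmed by this page.

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
WRITE_KEY = "dakkio_w_your_key_here"     # placeholder write key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID
SOURCE_ID = "src_456"                    # hypothetical data source ID

# Three illustrative points, one minute apart.
points = [
    {"timestamp": f"2024-01-01T00:0{i}:00Z", "value": 20.0 + i}
    for i in range(3)
]
payload = {"points": points}  # wrapper field name is an assumption
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/sources/{SOURCE_ID}/data",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {WRITE_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_method(), len(points), "points")
```

Batching is usually preferable to sending points one at a time for high-frequency sources.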
5. Data Queries
Retrieve and analyze your time-series data.
Query Time-Series Data
Retrieve time-series data with filters and aggregations.
Query time-series data with filters, aggregations, and time-based grouping. Requires Read API key.
If you omit aggregation and groupBy parameters and your data source has downsampling: true (default), the system will automatically aggregate data based on the time range to return ~500 data points. The response metadata will include downsamplingApplied: true when this happens.
To get raw data points, either:
- Set downsampling: false on the data source
- Always specify explicit aggregation and groupBy parameters (even if you set them to the values you want)
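A query with explicit aggregation and groupBy (so automatic downsampling is not applied) might be sketched like this. The /query path, the POST verb, and the time-range field names are assumptions; only the aggregation and groupBy parameter names come from this page.

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
READ_KEY = "dakkio_r_your_key_here"      # placeholder read key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID

query = {
    "startTime": "2024-01-01T00:00:00Z",  # field names are assumptions
    "endTime": "2024-01-02T00:00:00Z",
    "aggregation": "avg",  # explicit, so automatic downsampling is skipped
    "groupBy": "1m",
}
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/query",
    data=json.dumps(query).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {READ_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```

When aggregation and groupBy are omitted instead, check the response metadata for downsamplingApplied: true to know whether the server aggregated on your behalf.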
6. Alert Rules
Set up automated alerts based on your data conditions.
List Alert Rules
Get all alert rules for a bucket.
List all alert rules in a bucket. Requires Read API key.
💡 Replace :bucketId or other parameters with actual IDs
Create Alert Rule
Set up an alert with natural language query.
Create an alert rule with natural language condition. Requires Write API key.
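An alert-rule request with a natural language condition could look roughly like this. The /alerts path and the "condition" field name are assumptions; the natural-language-condition idea itself comes from this page.

```python
import json
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
WRITE_KEY = "dakkio_w_your_key_here"     # placeholder write key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID

# "condition" is an assumed field name for the natural language rule.
rule = {"condition": "notify me when avg temperature exceeds 30 for 5 minutes"}
req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/alerts",
    data=json.dumps(rule).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {WRITE_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```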
7. Analytics
View aggregated statistics and insights.
Get Overview Analytics
Organization-level analytics and statistics.
Get organization-level analytics including bucket count, data points, active alerts. Requires Read API key.
Get Bucket Analytics
Detailed analytics for a specific bucket.
Get detailed analytics for a specific bucket. Requires Read API key.
💡 Replace :bucketId or other parameters with actual IDs
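A bucket analytics read might be as simple as a GET with a read key, along these lines. The /analytics path segment is an assumption based on the section name; the base URL and auth scheme are also assumptions.

```python
import urllib.request

BASE_URL = "https://api.dakkio.example"  # assumed base URL
READ_KEY = "dakkio_r_your_key_here"      # placeholder read key
BUCKET_ID = "bkt_123"                    # hypothetical bucket ID

req = urllib.request.Request(
    f"{BASE_URL}/api/buckets/{BUCKET_ID}/analytics",  # hypothetical path
    headers={"Authorization": f"Bearer {READ_KEY}"},
)
print(req.get_method(), req.full_url)  # GET is the default when no body is set
```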
Tips for Using the Playground
Recommended Flow
Follow this sequence to test the full API:
- Create bucket (Admin key) → Save the bucket ID
- Create API keys (Admin key) → Create a write key and read key for the bucket
- Create data source (Write key) → Define your data schema, save the data source ID
- Ingest data (Write key) → Send test data points
- Query data (Read key) → Retrieve and analyze your data
- Set up alerts (Write key) → Monitor your data for conditions
- View analytics (Read key) → See aggregated statistics
Replace Placeholder IDs
Replace placeholder IDs with your actual resource IDs:
- bucketId: Get from GET /api/buckets
- dataSourceId: Get from GET /api/buckets/:bucketId/sources
- keyId: Get from GET /api/buckets/:bucketId/keys
Error Handling
If you get errors:
- 401 Unauthorized: Check that your API key is valid and has the required permissions
- 403 Forbidden: Your API key doesn't have permission for this operation
- 404 Not Found: Verify that your resource IDs are correct
- 400 Validation Error: Check that the request body matches the schema
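The status codes above can be mapped to next steps in client code; a minimal sketch:

```python
# Map the playground's documented error codes to suggested actions.
def describe_error(status: int) -> str:
    return {
        401: "Unauthorized: check that the API key is valid and has the required permissions",
        403: "Forbidden: this key type lacks permission for the operation",
        404: "Not Found: verify the resource IDs in the path",
        400: "Validation Error: check the request body against the schema",
    }.get(status, f"Unexpected status {status}")

print(describe_error(401))
```

In practice you would call this from an exception handler around your HTTP client's request call.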
Security Note
This playground sends real requests to the production API. Be careful with:
- Personal data in request bodies
- Production API keys
- Sensitive information
For testing, use a separate development account and test data only.