Query Data

Retrieve time-series data from a bucket with filtering, aggregation, and pagination support.

Endpoint

POST /api/data/query

Authentication

| Key Type | Allowed |
| --- | --- |
| Admin (dakkio_a_) | ✅ Yes |
| Write (dakkio_w_) | ✅ Yes |
| Read (dakkio_r_) | ✅ Yes |

Recommended: Use a Read key (dakkio_r_) for querying data. Read keys have the minimum permissions needed for data retrieval and analytics.

Why use a Read key?
  • Security: Read keys cannot modify data, only retrieve it
  • Dashboard access: Give your visualization tools read-only access
  • Sharing: Safely share Read keys with team members who only need to view data

Automatic Downsampling

Data sources with downsampling: true (the default) will automatically aggregate data when:

  1. You don't explicitly specify aggregation and groupBy parameters
  2. The expected number of data points exceeds ~500

This prevents queries from returning millions of data points for high-frequency sensors, optimizing both query performance and response sizes.

How it works:

  • The system tracks the average interval between data points for each data source
  • When you query without specifying aggregation, it calculates expected data points based on time range and data density
  • If expected points exceed ~500, it automatically applies avg aggregation with an appropriate groupBy interval (minute, hour, day, week, or month)
  • The response metadata includes downsamplingApplied: true when this happens

To get raw data points:

  1. Set downsampling: false on the data source when creating it
  2. Or explicitly specify aggregation and groupBy parameters in your query (the system respects explicit parameters)

Example

A sensor sending data every second generates 86,400 points per day, so querying a 7-day range would return 604,800 raw points. With auto-downsampling, the query is instead aggregated with an hourly groupBy, returning 168 averaged points (7 × 24), which keeps the result under the ~500-point threshold.
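The interval selection described above can be sketched as follows. This is an assumed reconstruction of the documented behavior, not the exact server implementation: pick the smallest groupBy interval that keeps the expected point count at or under ~500.

```javascript
// Candidate intervals, smallest first, with their length in seconds.
const INTERVALS = [
  ['minute', 60],
  ['hour', 3600],
  ['day', 86400],
  ['week', 604800],
  ['month', 2592000] // ~30 days
];

// Pick the smallest interval whose bucket count fits under maxPoints.
function chooseGroupBy(startTime, endTime, maxPoints = 500) {
  const rangeSeconds = (new Date(endTime) - new Date(startTime)) / 1000;
  for (const [name, seconds] of INTERVALS) {
    if (rangeSeconds / seconds <= maxPoints) return name;
  }
  return 'month';
}

// A 7-day range: 604,800 s / 3600 = 168 hourly buckets, under the ~500 cap.
console.log(chooseGroupBy('2024-01-01T00:00:00Z', '2024-01-08T00:00:00Z'));
// → hour
```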

Request

Headers

| Header | Type | Required | Description |
| --- | --- | --- | --- |
| X-API-Key | string | ✅ Yes | Your Read, Write, or Admin API key |
| Content-Type | string | ✅ Yes | Must be application/json |

Body Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| bucketId | string | ✅ Yes | The bucket ID to query |
| filters | object | ❌ No | Filter criteria (see below) |
| aggregation | string | ❌ No | Aggregation function: avg, sum, min, max, count |
| groupBy | string | ❌ No | Time grouping: minute, hour, day, week, month |
| limit | number | ❌ No | Max results (default: 100, max: 10000) |
| offset | number | ❌ No | Skip results for pagination |
| sort | string | ❌ No | Sort order: asc or desc (default: asc) |

Filter Object

| Parameter | Type | Description |
| --- | --- | --- |
| dataSourceIds | string[] | Filter by specific data sources |
| startTime | string | Start of time range (ISO 8601) |
| endTime | string | End of time range (ISO 8601) |
| metadata | object | Filter by metadata fields |

Example Request - Basic Query

```bash
curl -X POST https://api.dakkio.io/api/data/query \
  -H "X-API-Key: dakkio_r_xyz789ghi012..." \
  -H "Content-Type: application/json" \
  -d '{
    "bucketId": "507f1f77bcf86cd799439011",
    "filters": {
      "dataSourceIds": ["507f1f77bcf86cd799439012"],
      "startTime": "2024-01-15T00:00:00Z",
      "endTime": "2024-01-15T23:59:59Z"
    },
    "limit": 100
  }'
```

Example Request - With Aggregation

```bash
curl -X POST https://api.dakkio.io/api/data/query \
  -H "X-API-Key: dakkio_r_xyz789ghi012..." \
  -H "Content-Type: application/json" \
  -d '{
    "bucketId": "507f1f77bcf86cd799439011",
    "filters": {
      "dataSourceIds": ["507f1f77bcf86cd799439012"],
      "startTime": "2024-01-01T00:00:00Z",
      "endTime": "2024-01-31T23:59:59Z"
    },
    "aggregation": "avg",
    "groupBy": "day"
  }'
```

Example Request - Filter by Metadata

```bash
curl -X POST https://api.dakkio.io/api/data/query \
  -H "X-API-Key: dakkio_r_xyz789ghi012..." \
  -H "Content-Type: application/json" \
  -d '{
    "bucketId": "507f1f77bcf86cd799439011",
    "filters": {
      "startTime": "2024-01-15T00:00:00Z",
      "endTime": "2024-01-15T23:59:59Z",
      "metadata": {
        "deviceId": "ESP32-001",
        "location": "Living Room"
      }
    }
  }'
```

Response

Success Response - Raw Data (200 OK)

```json
{
  "data": [
    {
      "_id": "507f1f77bcf86cd799439016",
      "timestamp": "2024-01-15T10:00:00Z",
      "dataSourceId": "507f1f77bcf86cd799439012",
      "values": {
        "temperature": 22.0,
        "humidity": 68
      },
      "metadata": {
        "deviceId": "ESP32-001",
        "location": "Living Room"
      }
    },
    {
      "_id": "507f1f77bcf86cd799439017",
      "timestamp": "2024-01-15T10:05:00Z",
      "dataSourceId": "507f1f77bcf86cd799439012",
      "values": {
        "temperature": 22.2,
        "humidity": 67
      },
      "metadata": {
        "deviceId": "ESP32-001",
        "location": "Living Room"
      }
    }
  ],
  "metadata": {
    "count": 2,
    "limit": 100,
    "offset": 0,
    "hasMore": false
  }
}
```
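The raw-data shape above maps naturally onto charting libraries. A minimal sketch (field names taken from the example response; `toSeries` is a hypothetical helper, not part of the API):

```javascript
// Convert a raw-data query response into { x, y } points for one value field.
function toSeries(response, field) {
  return response.data.map((point) => ({
    x: point.timestamp,
    y: point.values[field]
  }));
}

// Trimmed-down version of the example response above.
const response = {
  data: [
    { timestamp: '2024-01-15T10:00:00Z', values: { temperature: 22.0, humidity: 68 } },
    { timestamp: '2024-01-15T10:05:00Z', values: { temperature: 22.2, humidity: 67 } }
  ]
};

console.log(toSeries(response, 'temperature')); // one { x, y } point per reading
```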

Success Response - Aggregated Data (200 OK)

```json
{
  "data": [
    {
      "timestamp": "2024-01-15T00:00:00Z",
      "values": {
        "temperature": 22.3,
        "humidity": 66.5
      }
    },
    {
      "timestamp": "2024-01-16T00:00:00Z",
      "values": {
        "temperature": 21.8,
        "humidity": 68.2
      }
    }
  ],
  "metadata": {
    "count": 2,
    "aggregation": "avg",
    "groupBy": "day"
  }
}
```

Success Response - Auto-Downsampled (200 OK)

When auto-downsampling is applied, the response includes downsamplingApplied: true:

```json
{
  "data": [
    {
      "time": "2024-01-15T00:00:00Z",
      "dataSourceId": "507f1f77bcf86cd799439012",
      "temperature": 22.3,
      "humidity": 66.5
    },
    {
      "time": "2024-01-15T01:00:00Z",
      "dataSourceId": "507f1f77bcf86cd799439012",
      "temperature": 22.1,
      "humidity": 67.2
    }
  ],
  "metadata": {
    "count": 24,
    "aggregation": "avg",
    "groupBy": "hour",
    "bucketId": "507f1f77bcf86cd799439011",
    "downsamplingApplied": true
  }
}
```
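Clients that need raw points should check this flag before processing results. A minimal sketch (`wasDownsampled` is a hypothetical helper name):

```javascript
// True when the server silently aggregated the result set.
function wasDownsampled(response) {
  return response.metadata?.downsamplingApplied === true;
}

// Trimmed-down version of the downsampled response above.
const downsampled = {
  data: [],
  metadata: { count: 24, aggregation: 'avg', groupBy: 'hour', downsamplingApplied: true }
};

if (wasDownsampled(downsampled)) {
  // Either accept the aggregated data, or re-query a narrower time window
  // (or a data source with downsampling: false) to get raw points.
  console.log(`Server downsampled to ${downsampled.metadata.groupBy} averages`);
}
```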

Error Responses

400 Bad Request - Invalid Time Range

```json
{
  "error": "Validation Error",
  "message": "startTime must be before endTime"
}
```

401 Unauthorized

```json
{
  "error": "Unauthorized",
  "message": "Invalid or missing API key"
}
```

403 Forbidden - Wrong Bucket

```json
{
  "error": "Forbidden",
  "message": "API key does not have access to this bucket"
}
```
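A client can map these statuses to actionable messages. A sketch based on the error bodies above (`explainQueryError` is a hypothetical helper, not part of any SDK):

```javascript
// Translate a documented error response into a human-readable hint.
function explainQueryError(status, body = {}) {
  switch (status) {
    case 400:
      return `Bad request: ${body.message}`; // e.g. startTime after endTime
    case 401:
      return 'Invalid or missing API key: check the X-API-Key header';
    case 403:
      return 'This API key does not have access to the requested bucket';
    default:
      return `Unexpected error ${status}: ${body.message || 'unknown'}`;
  }
}

// Usage with axios (sketch):
// try { ... } catch (err) {
//   console.error(explainQueryError(err.response.status, err.response.data));
// }
```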

Code Examples

JavaScript/Node.js

```javascript
// ES module so the top-level await below is valid.
import axios from 'axios';

async function queryData(startTime, endTime, options = {}) {
  const response = await axios.post(
    'https://api.dakkio.io/api/data/query',
    {
      bucketId: process.env.BUCKET_ID,
      filters: {
        dataSourceIds: [process.env.DATA_SOURCE_ID],
        startTime,
        endTime
      },
      ...options
    },
    {
      headers: {
        'X-API-Key': process.env.DAKKIO_READ_KEY,
        'Content-Type': 'application/json'
      }
    }
  );

  return response.data;
}

// Get raw data from the last hour
const data = await queryData(
  new Date(Date.now() - 3600000).toISOString(),
  new Date().toISOString()
);

// Get daily averages for the month
const dailyAvg = await queryData(
  '2024-01-01T00:00:00Z',
  '2024-01-31T23:59:59Z',
  { aggregation: 'avg', groupBy: 'day' }
);
```

Python

```python
import requests
import os
from datetime import datetime, timedelta

def query_data(start_time, end_time, **options):
    response = requests.post(
        'https://api.dakkio.io/api/data/query',
        headers={
            'X-API-Key': os.environ['DAKKIO_READ_KEY'],
            'Content-Type': 'application/json'
        },
        json={
            'bucketId': os.environ['BUCKET_ID'],
            'filters': {
                'dataSourceIds': [os.environ['DATA_SOURCE_ID']],
                'startTime': start_time,
                'endTime': end_time
            },
            **options
        }
    )
    return response.json()

# Get data from last hour
end = datetime.utcnow()
start = end - timedelta(hours=1)
data = query_data(
    start.isoformat() + 'Z',
    end.isoformat() + 'Z'
)

# Get hourly averages
hourly_avg = query_data(
    '2024-01-15T00:00:00Z',
    '2024-01-15T23:59:59Z',
    aggregation='avg',
    groupBy='hour'
)
```

Go

```go
package main

import (
	"bytes"
	"encoding/json"
	"net/http"
	"os"
)

type QueryRequest struct {
	BucketId    string       `json:"bucketId"`
	Filters     QueryFilters `json:"filters"`
	Aggregation string       `json:"aggregation,omitempty"`
	GroupBy     string       `json:"groupBy,omitempty"`
	Limit       int          `json:"limit,omitempty"`
}

type QueryFilters struct {
	DataSourceIds []string `json:"dataSourceIds,omitempty"`
	StartTime     string   `json:"startTime,omitempty"`
	EndTime       string   `json:"endTime,omitempty"`
}

func queryData(startTime, endTime string) (map[string]interface{}, error) {
	request := QueryRequest{
		BucketId: os.Getenv("BUCKET_ID"),
		Filters: QueryFilters{
			DataSourceIds: []string{os.Getenv("DATA_SOURCE_ID")},
			StartTime:     startTime,
			EndTime:       endTime,
		},
	}

	jsonData, err := json.Marshal(request)
	if err != nil {
		return nil, err
	}

	req, err := http.NewRequest("POST", "https://api.dakkio.io/api/data/query", bytes.NewBuffer(jsonData))
	if err != nil {
		return nil, err
	}
	req.Header.Set("X-API-Key", os.Getenv("DAKKIO_READ_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := (&http.Client{}).Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var result map[string]interface{}
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		return nil, err
	}
	return result, nil
}
```

Aggregation Functions

| Function | Description |
| --- | --- |
| avg | Average of all values |
| sum | Sum of all values |
| min | Minimum value |
| max | Maximum value |
| count | Number of data points |

Time Groupings

| GroupBy | Description |
| --- | --- |
| minute | Group by minute |
| hour | Group by hour |
| day | Group by day |
| week | Group by week |
| month | Group by month |

Pagination

For large datasets, use pagination:

```javascript
async function getAllData(startTime, endTime) {
  const allData = [];
  let offset = 0;
  const limit = 1000;

  while (true) {
    const response = await queryData(startTime, endTime, { limit, offset });
    allData.push(...response.data);

    if (!response.metadata.hasMore) break;
    offset += limit;
  }

  return allData;
}
```
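For very large result sets, accumulating every page in one array can use a lot of memory. A streaming variant as a sketch: an async generator that yields one page at a time, taking the page-fetching function as a parameter so it can wrap the queryData helper shown earlier (the parameterization is an illustration, not an SDK API):

```javascript
// Yield one page of results at a time instead of accumulating them all.
// fetchPage receives { limit, offset } and must resolve to a query response.
async function* queryPages(fetchPage, limit = 1000) {
  let offset = 0;
  while (true) {
    const response = await fetchPage({ limit, offset });
    yield response.data;
    if (!response.metadata.hasMore) return;
    offset += limit;
  }
}

// Usage with the queryData helper above (sketch):
// for await (const page of queryPages((opts) => queryData(start, end, opts))) {
//   handlePage(page); // process each page as it arrives
// }
```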

Best Practices

1. Always Specify Time Range

Queries without a time range scan all data, which is slow and expensive:

```javascript
// ✅ Good - specific time range
filters: {
  startTime: '2024-01-15T00:00:00Z',
  endTime: '2024-01-15T23:59:59Z'
}

// ❌ Bad - no time range (scans everything)
filters: {}
```

2. Use Aggregation for Large Datasets

Instead of fetching thousands of raw points:

```javascript
// ✅ Good - get hourly averages (24 points for a day)
{ aggregation: 'avg', groupBy: 'hour' }

// ❌ Bad - fetch all raw data (potentially thousands of points)
{ limit: 10000 }
```

3. Filter by Data Source When Possible

```javascript
// ✅ Good - specific data source
filters: {
  dataSourceIds: ['507f1f77bcf86cd799439012']
}

// Slower - scans all data sources in the bucket
filters: {}
```
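The three practices can be combined into one request-building helper. A sketch (`buildQuery` and its heuristic threshold are illustrative, not part of the API):

```javascript
// Build a query body with a bounded time range, an explicit data source,
// and aggregation for long ranges (practices 1-3 above).
function buildQuery(bucketId, dataSourceId, startTime, endTime) {
  const rangeHours = (new Date(endTime) - new Date(startTime)) / 3600000;
  const body = {
    bucketId,
    filters: { dataSourceIds: [dataSourceId], startTime, endTime }
  };
  // Heuristic: for ranges over a day, request hourly averages up front
  // rather than pulling raw points (or letting auto-downsampling decide).
  if (rangeHours > 24) {
    body.aggregation = 'avg';
    body.groupBy = 'hour';
  }
  return body;
}
```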