Uploading Forecasts
Sharing your own forecasts with the Powernaut platform helps us better understand your resources' expected behaviour, leading to improved flexibility opportunities and optimised market participation.
Why upload your forecasts?
When you upload forecasts, you provide us with your domain expertise about your resources, which helps us:
- Improve flexibility estimation by understanding when your resources will be available
- Optimise market participation with more accurate predictions of your baseline behaviour
- Reduce dispatch uncertainty by knowing your expected consumption or production patterns
- Enhance revenue opportunities through better-informed bidding strategies
Your uploaded forecasts always take priority over our internal models. When you provide forecast data for specific time periods, we use your predictions as the primary source of truth for those intervals.
Feature availability
Uploading forecasts requires the forecasting feature to be enabled for your account. Without this feature:
- POST requests to forecast endpoints will return 403 Forbidden
- You can only access limited forecast data via GET requests
Please contact our support team to enable forecast uploading capabilities.
Available endpoints
You can upload forecasts for both sites and resources:
- Sites: POST /connect/sites/{id}/forecast
- Resources: POST /connect/resources/{id}/forecast
Request format
To upload a forecast, use the following request structure:
Request body structure
```json
{
  "model_id": "my_external_model_v1.2",
  "items": [
    {
      "start": "2024-01-15T10:00:00Z",
      "end": "2024-01-15T10:15:00Z",
      "value": "2.500000"
    },
    {
      "start": "2024-01-15T10:15:00Z",
      "end": "2024-01-15T10:30:00Z",
      "value": "2.750000"
    }
  ]
}
```
Required fields
Field | Type | Required | Description |
---|---|---|---|
model_id | string | Yes | Your own identifier for this forecasting model (1-100 chars). See Quality & Versioning for naming suggestions. |
items | array | Yes | Array of forecast data points (max 192 items) |
items[].start | string | Yes | ISO 8601 start timestamp (must fall on a quarter-hour boundary) |
items[].end | string | Yes | ISO 8601 end timestamp (must fall on a quarter-hour boundary) |
items[].value | string | Yes | Predicted power value in kW, as a decimal string |
Data validation rules
- Time intervals: All timestamps must fall on quarter-hour boundaries (15-minute intervals)
- Power values: Must be in kW as decimal strings (positive = consumption, negative = production/injection)
- Future only: Past timestamps will be ignored and returned in the ignored response array
- Horizon limit: Maximum 48 hours in advance (192 data points at 15-minute intervals)
- Resource constraints: For resources, values must be within the resource's min/max power constraints
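Before calling the API, you can mirror these rules client-side to catch problems early. A minimal sketch (illustrative only; the API performs the authoritative validation, and `validate_items` is a hypothetical helper name):

```python
from datetime import datetime, timedelta, timezone

def validate_items(items, now=None):
    """Pre-check forecast items against the documented rules."""
    now = now or datetime.now(timezone.utc)
    horizon = now + timedelta(hours=48)  # maximum forecast horizon
    errors = []
    for i, item in enumerate(items):
        start = datetime.fromisoformat(item["start"].replace("Z", "+00:00"))
        end = datetime.fromisoformat(item["end"].replace("Z", "+00:00"))
        for ts in (start, end):
            if ts.minute % 15 or ts.second or ts.microsecond:
                errors.append(f"item {i}: timestamp not on a quarter-hour boundary")
        if end - start != timedelta(minutes=15):
            errors.append(f"item {i}: interval is not 15 minutes")
        if start > horizon:
            errors.append(f"item {i}: beyond the 48-hour horizon")
    if len(items) > 192:
        errors.append("more than 192 items in one request")
    starts = [it["start"] for it in items]
    if len(set(starts)) != len(starts):
        errors.append("duplicate timestamps in one request")
    return errors
```

Resource-level min/max power constraints are deliberately omitted here, since they depend on each resource's configuration.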
Example request - Site forecast upload
**Python**

```python
import requests
import json

url = "https://api.powernaut.io/v1/connect/sites/<uuid>/forecast"

payload = json.dumps({
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
})
headers = {
  'Content-Type': 'application/json',
  'Accept': 'application/json',
  'Authorization': 'Bearer <token>'
}

response = requests.post(url, headers=headers, data=payload)
print(response.text)
```

**JavaScript**

```javascript
const myHeaders = new Headers();
myHeaders.append("Content-Type", "application/json");
myHeaders.append("Accept", "application/json");
myHeaders.append("Authorization", "Bearer <token>");

const raw = JSON.stringify({
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
});

const requestOptions = {
  method: "POST",
  headers: myHeaders,
  body: raw,
  redirect: "follow"
};

fetch("https://api.powernaut.io/v1/connect/sites/<uuid>/forecast", requestOptions)
  .then((response) => response.text())
  .then((result) => console.log(result))
  .catch((error) => console.error(error));
```

**Java**

```java
OkHttpClient client = new OkHttpClient().newBuilder().build();
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType, "{\n \"model_id\": \"<string>\",\n \"items\": [\n {\n \"start\": \"<dateTime>\",\n \"end\": \"<dateTime>\",\n \"value\": \"<decimal>\"\n }\n ]\n}");
Request request = new Request.Builder()
  .url("https://api.powernaut.io/v1/connect/sites/<uuid>/forecast")
  .method("POST", body)
  .addHeader("Content-Type", "application/json")
  .addHeader("Accept", "application/json")
  .addHeader("Authorization", "Bearer <token>")
  .build();
Response response = client.newCall(request).execute();
```

**Go**

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

func main() {
	url := "https://api.powernaut.io/v1/connect/sites/<uuid>/forecast"
	method := "POST"

	payload := strings.NewReader(`{
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
}`)

	client := &http.Client{}
	req, err := http.NewRequest(method, url, payload)
	if err != nil {
		fmt.Println(err)
		return
	}
	req.Header.Add("Content-Type", "application/json")
	req.Header.Add("Accept", "application/json")
	req.Header.Add("Authorization", "Bearer <token>")

	res, err := client.Do(req)
	if err != nil {
		fmt.Println(err)
		return
	}
	defer res.Body.Close()

	body, err := io.ReadAll(res.Body)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(string(body))
}
```

**C#**

```csharp
var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post, "https://api.powernaut.io/v1/connect/sites/<uuid>/forecast");
request.Headers.Add("Accept", "application/json");
request.Headers.Add("Authorization", "Bearer <token>");
var content = new StringContent("{\n \"model_id\": \"<string>\",\n \"items\": [\n {\n \"start\": \"<dateTime>\",\n \"end\": \"<dateTime>\",\n \"value\": \"<decimal>\"\n }\n ]\n}", null, "application/json");
request.Content = content;
var response = await client.SendAsync(request);
response.EnsureSuccessStatusCode();
Console.WriteLine(await response.Content.ReadAsStringAsync());
```

**cURL**

```shell
curl --location 'https://api.powernaut.io/v1/connect/sites/<uuid>/forecast' \
  --header 'Content-Type: application/json' \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer <token>' \
  --data '{
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
}'
```
Example request - Resource forecast upload
**Python**

```python
import requests
import json

url = "https://api.powernaut.io/v1/connect/resources/<uuid>/forecast"

payload = json.dumps({
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
})
headers = {
  'Content-Type': 'application/json',
  'Accept': 'application/json',
  'Authorization': 'Bearer <token>'
}

response = requests.post(url, headers=headers, data=payload)
print(response.text)
```

**JavaScript**

```javascript
const myHeaders = new Headers();
myHeaders.append("Content-Type", "application/json");
myHeaders.append("Accept", "application/json");
myHeaders.append("Authorization", "Bearer <token>");

const raw = JSON.stringify({
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
});

const requestOptions = {
  method: "POST",
  headers: myHeaders,
  body: raw,
  redirect: "follow"
};

fetch("https://api.powernaut.io/v1/connect/resources/<uuid>/forecast", requestOptions)
  .then((response) => response.text())
  .then((result) => console.log(result))
  .catch((error) => console.error(error));
```

**Java**

```java
OkHttpClient client = new OkHttpClient().newBuilder().build();
MediaType mediaType = MediaType.parse("application/json");
RequestBody body = RequestBody.create(mediaType, "{\n \"model_id\": \"<string>\",\n \"items\": [\n {\n \"start\": \"<dateTime>\",\n \"end\": \"<dateTime>\",\n \"value\": \"<decimal>\"\n }\n ]\n}");
Request request = new Request.Builder()
  .url("https://api.powernaut.io/v1/connect/resources/<uuid>/forecast")
  .method("POST", body)
  .addHeader("Content-Type", "application/json")
  .addHeader("Accept", "application/json")
  .addHeader("Authorization", "Bearer <token>")
  .build();
Response response = client.newCall(request).execute();
```

**Go**

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

func main() {
	url := "https://api.powernaut.io/v1/connect/resources/<uuid>/forecast"
	method := "POST"

	payload := strings.NewReader(`{
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
}`)

	client := &http.Client{}
	req, err := http.NewRequest(method, url, payload)
	if err != nil {
		fmt.Println(err)
		return
	}
	req.Header.Add("Content-Type", "application/json")
	req.Header.Add("Accept", "application/json")
	req.Header.Add("Authorization", "Bearer <token>")

	res, err := client.Do(req)
	if err != nil {
		fmt.Println(err)
		return
	}
	defer res.Body.Close()

	body, err := io.ReadAll(res.Body)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(string(body))
}
```

**C#**

```csharp
var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post, "https://api.powernaut.io/v1/connect/resources/<uuid>/forecast");
request.Headers.Add("Accept", "application/json");
request.Headers.Add("Authorization", "Bearer <token>");
var content = new StringContent("{\n \"model_id\": \"<string>\",\n \"items\": [\n {\n \"start\": \"<dateTime>\",\n \"end\": \"<dateTime>\",\n \"value\": \"<decimal>\"\n }\n ]\n}", null, "application/json");
request.Content = content;
var response = await client.SendAsync(request);
response.EnsureSuccessStatusCode();
Console.WriteLine(await response.Content.ReadAsStringAsync());
```

**cURL**

```shell
curl --location 'https://api.powernaut.io/v1/connect/resources/<uuid>/forecast' \
  --header 'Content-Type: application/json' \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer <token>' \
  --data '{
  "model_id": "<string>",
  "items": [
    {
      "start": "<dateTime>",
      "end": "<dateTime>",
      "value": "<decimal>"
    }
  ]
}'
```
Response format
Success response
```json
{
  "inserted": [
    {
      "start": "2024-01-15T10:00:00Z",
      "end": "2024-01-15T10:15:00Z",
      "value": "2.500000"
    },
    {
      "start": "2024-01-15T10:15:00Z",
      "end": "2024-01-15T10:30:00Z",
      "value": "2.750000"
    }
  ],
  "ignored": []
}
```
Response fields
Field | Type | Description |
---|---|---|
inserted | array | Forecast data points that were successfully inserted |
ignored | array | Forecast data points that were ignored (past timestamps only) |
Error handling
Common error responses
Status Code | Error | Description |
---|---|---|
400 | Bad Request | Invalid data format or values |
403 | Forbidden | Forecasting feature not enabled |
404 | Not Found | Site or resource not found |
413 | Payload Too Large | Too many forecast points in single request |
422 | Unprocessable Entity | Data validation failed |
Example error response
```json
{
  "error": "Invalid forecast data",
  "message": "Some forecast points failed validation",
  "statusCode": 400
}
```
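The error table above can be folded into a small client-side helper that turns a status code into a next step. A sketch (`suggest_action` is a hypothetical name, not part of any SDK):

```python
def suggest_action(status_code):
    """Map the documented error codes to a suggested client action."""
    actions = {
        400: "fix the request format or values",
        403: "contact support to enable the forecasting feature",
        404: "check the site or resource UUID",
        413: "split the upload into smaller batches",
        422: "review the data validation rules",
    }
    return actions.get(status_code, "unexpected status; inspect the response body")
```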
Best practices
Follow these strategies to maximise the value of your forecast data and ensure reliable integration.
Batch upload strategy
Upload forecasts efficiently by balancing batch size with reliability and API limits.
Scenario | Batch Size | Frequency |
---|---|---|
Real-time updates | 4-8 data points | Every hour |
Scheduled uploads | 96 data points (24h) | Daily |
Bulk historical | 192 data points (48h) | One-time |
Split large datasets into 24-hour chunks (96 data points) and add 1-second delays between batches to respect rate limits.
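The chunking advice above can be sketched as follows (`upload_in_batches` is a hypothetical helper; `upload_fn` stands in for your own function that POSTs one batch):

```python
import time

def upload_in_batches(items, upload_fn, batch_size=96, delay=1.0):
    """Upload forecast items in 24-hour chunks (96 quarter-hour points),
    pausing between batches to respect rate limits."""
    results = []
    for i in range(0, len(items), batch_size):
        results.append(upload_fn(items[i:i + batch_size]))
        if i + batch_size < len(items):
            time.sleep(delay)  # spacing between consecutive batches
    return results
```

For a full 48-hour horizon (192 points), this issues two requests of 96 points each.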
Handle ignored forecasts
Monitor and respond to ignored forecast points to maintain data quality.
Always check the response for ignored forecasts, as these indicate data points that couldn't be processed.
Ignored (returned in the ignored response array):
- Past timestamps - always filtered out rather than rejected
Validation errors (cause 400/422 responses):
- Invalid intervals - timestamps must fall on quarter-hour boundaries
- Constraint violations - values outside a resource's min/max power limits
- Duplicate timestamps - the same timestamp appears more than once within a single upload request
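Checking the response for ignored points can be as simple as this sketch (`check_upload_response` is a hypothetical name; the field names are as documented above):

```python
def check_upload_response(body):
    """Return (inserted, ignored) counts and flag any ignored points."""
    inserted = body.get("inserted", [])
    ignored = body.get("ignored", [])
    for point in ignored:
        print(f"ignored (past timestamp): {point['start']} - {point['end']}")
    return len(inserted), len(ignored)
```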
When you upload forecast data for timestamps that already exist in our system (from previous API calls), the new values will automatically overwrite the existing ones. This allows you to update your forecasts as your models improve or as new data becomes available, without needing to delete previous entries first.
Update frequency
Upload new forecasts as soon as they change. This can happen on a schedule, or whenever new data becomes available (e.g. weather updates, operational changes, market events).
Quality & versioning
Model tracking
Use descriptive model IDs with version information to track which forecasting approach generated each prediction.
Example format: solar_forecast_v2.1_2024-01-15_weather-historical
Quality monitoring
Track forecast accuracy using standard metrics. We recommend normalised Mean Absolute Error (nMAE) as the primary metric: it accounts for different resource capacities and makes thresholds meaningful across diverse resources. Note that the thresholds below apply to the day-ahead (DA) forecasting scenario; for near-real-time forecasting, quality is expected to be better given the availability of more recent data.
nMAE = Mean Absolute Error / mean absolute power over the period
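As a worked example, the metric can be computed like this (sketch; assumes both series cover the same quarter-hour intervals and the period has non-zero average power):

```python
def nmae(forecast_kw, actual_kw):
    """normalised MAE: mean absolute error divided by mean absolute actual power."""
    n = len(actual_kw)
    mae = sum(abs(f - a) for f, a in zip(forecast_kw, actual_kw)) / n
    mean_abs_power = sum(abs(a) for a in actual_kw) / n  # normalisation factor
    return mae / mean_abs_power
```

For instance, forecasting [2.0, 4.0] kW against actuals of [2.0, 2.0] kW gives MAE = 1.0 and mean absolute power = 2.0, so nMAE = 0.5.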
Site-level forecasts
Resource types behind the meter | nMAE Excellent | nMAE Good | nMAE Needs Improvement |
---|---|---|---|
None | < 0.25 | 0.25-0.35 | > 0.60 |
Electric Vehicles (EV) | < 0.25 | 0.25-0.45 | > 0.85 |
Photovoltaic installations (PV) | < 0.35 | 0.35-0.45 | > 0.60 |
EV + PV | < 0.40 | 0.40-0.50 | > 0.85 |
Resource-level forecasts
Resource types | nMAE Excellent | nMAE Good | nMAE Needs Improvement |
---|---|---|---|
PV | < 0.35 | 0.35-0.50 | > 0.75 |
- Handles zero values: Unlike MAPE, nMAE doesn't break when actual values are zero
- Resource-agnostic: Normalising by average power makes thresholds meaningful across different resource sizes
- Clearer interpretation: 0.15 nMAE means errors are 15% of typical power levels
There is a downside to using nMAE in the context of site- or resource-level power forecasts: The nMAE is sensitive to the normalisation factor.
- During periods with low power values, the error can be artificially high, even if the forecast is quite accurate.
- During periods with high power values, large forecast errors can be masked by a relatively large normalisation factor.
To mitigate this risk, calculate the nMAE over a period long enough to span different power levels. If you need to use a shorter period, use the capacity-normalised MAE (MAE divided by the resource's capacity) instead.
Compare your forecasts against actual outcomes weekly to calculate nMAE, with different expectations for PV vs non-PV resources due to inherent predictability differences.
Next steps
With both forecast retrieval and uploading capabilities, you're ready to optimise your flexibility offerings.
Monitor the performance of your forecasts regularly and refine your models based on actual outcomes. Better forecasts lead to better flexibility opportunities and higher revenue potential.