The data's already at a 30-minute granularity, so in some ways there isn't much downsampling to be done.
I do, however, want a downsampling job which calculates cost per half-hour based on consumption and the unit price at that time, saving me from having to write queries to do it.
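That calculation is just consumption multiplied by the unit price in force during the same half-hour window. A minimal sketch of the idea in Python (the field names mirror the config below; the sample values and the standalone script are hypothetical, not the downsampler's actual implementation):

```python
# Hypothetical illustration of the per-window cost calculation:
# cost for each 30-minute block = consumption (kWh) in that block
# multiplied by the unit price (p/kWh) in force at that time.
from datetime import datetime

# Sample 30-minute windows: made-up values, for illustration only
consumption_kwh = {
    datetime(2023, 7, 6, 18, 0): 0.21,
    datetime(2023, 7, 6, 18, 30): 0.34,
}
unit_price_inc_vat = {  # pence per kWh, inc. VAT
    datetime(2023, 7, 6, 18, 0): 28.62,
    datetime(2023, 7, 6, 18, 30): 30.01,
}

for window_start, kwh in sorted(consumption_kwh.items()):
    # Cost for the half-hour: consumption * unit price at that time
    cost_inc_vat = kwh * unit_price_inc_vat[window_start]
    print(f"{window_start:%H:%M} cost_inc_vat={cost_inc_vat:.2f}p")
```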
I do, however, want to copy the data into my longer-lived retention policy (RP), so have added a max job:
```yaml
downsample_octopus_stats_max:
    # Name for the task
    name: "Downsample Octopus Stats (Max)"
    influx: home1x
    # Query the last n mins
    period: 120
    # Window into n minute blocks
    window: 30
    # taken from in_bucket
    bucket: Systemstats
    measurement:
        - octopus_pricing
        - octopus_consumption
    fields:
        - cost_exc_vat
        - cost_inc_vat
        - consumption
    aggregates:
        max:
    output_influx:
        - influx: home2xreal
    output_bucket: Systemstats/rp_720d
```
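Because the source data is already at 30-minute resolution, windowing it into 30-minute blocks leaves at most one point per window, and the max of a single point is the point itself, so the job effectively copies the data into the longer-lived RP unchanged. A rough illustration of why that holds (hypothetical values; not the downsampler's actual code):

```python
# Illustration: a max aggregate over 30-minute windows of 30-minute
# data returns the original points unchanged, so the job acts as a
# copy into the longer-lived retention policy rather than a reduction.
from datetime import datetime, timedelta

points = [
    (datetime(2023, 7, 6, 18, 0), 0.21),
    (datetime(2023, 7, 6, 18, 30), 0.34),
]

window = timedelta(minutes=30)
epoch = datetime(1970, 1, 1)

# Group points into 30-minute windows, then take the max of each group
windows = {}
for ts, value in points:
    start = epoch + ((ts - epoch) // window) * window
    windows.setdefault(start, []).append(value)

downsampled = {start: max(vals) for start, vals in windows.items()}
print(downsampled)  # identical to the input points
```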
Activity
06-Jul-23 19:28
assigned to @btasker
06-Jul-23 19:33
mentioned in commit sysconfigs/downsample_configs@f745c80c85ea702f168e02c1ffc3b0bee348b70c
Message
Add downsampling jobs for utilities/telegraf-plugins#16