
downsampling sample loss when the chunk range is less than downsample range #7908

aluode99 opened this issue Nov 15, 2024 · 0 comments
Thanos, Prometheus and Golang version used:

Object Storage Provider:

What happened:
Thanos version is v0.36.1; the configuration parameters are as follows:

  • retention.resolution-raw=7d
  • retention.resolution-5m=186d
  • retention.resolution-1h=366d

Under normal conditions, 5-minute downsampled data is generated once every two days. Because of a fault on the 9th, no downsampled data was generated for the 8th and 9th, and once the raw metrics expired those samples were lost. In addition, for newly deployed clusters, the first 48 hours do not yet contain two complete 24-hour chunks, so no 5-minute downsampled data can be generated for the first two days.

Is it possible to downsample data that does not yet meet the requirement of two downsample chunks, in order to minimize data loss?
[screenshot: downsample intervals]
As shown in the diagram, downsample-2 did not produce downsampled data because its time range is too short: it is skipped by the `m.MaxTime - m.MinTime < downsample.ResLevel1DownsampleRange` check. Is it possible to generate downsampled data for the chunks in the downsample-2 interval after downsample-3 has been completed?
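
For reference, here is a minimal, self-contained Go sketch of the gate described above. It assumes the check resembles the one in the compactor's downsampling pass; the 40-hour constant is copied from my reading of the `downsample` package in recent Thanos versions and should be verified against v0.36.1. The `blockMeta` type and function names are invented for illustration.

```go
package main

import "fmt"

// Assumed values mirroring the downsample package constants in recent
// Thanos versions; verify against the version in use (v0.36.1 here).
const (
	resLevel0                = int64(0)                   // raw resolution
	resLevel1DownsampleRange = int64(40 * 60 * 60 * 1000) // assumed ~40h, in milliseconds
)

// blockMeta is an invented stand-in for the block meta the compactor inspects.
type blockMeta struct {
	MinTime, MaxTime int64 // milliseconds
	Resolution       int64
}

// eligibleForResLevel1 reproduces the gate the issue refers to: a raw block
// is only downsampled to 5m resolution when its time range covers at least
// ResLevel1DownsampleRange, i.e. enough data for roughly two downsampled chunks.
func eligibleForResLevel1(m blockMeta) bool {
	if m.Resolution != resLevel0 {
		return false
	}
	return m.MaxTime-m.MinTime >= resLevel1DownsampleRange
}

func main() {
	day := int64(24 * 60 * 60 * 1000)
	// A 24h raw block (e.g. the first day of a new cluster) is skipped.
	fmt.Println(eligibleForResLevel1(blockMeta{MinTime: 0, MaxTime: day})) // false
	// A 48h raw block passes the gate.
	fmt.Println(eligibleForResLevel1(blockMeta{MinTime: 0, MaxTime: 2 * day})) // true
}
```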

What you expected to happen:
Raw chunks whose meta range is smaller than the downsample range should also be downsampled, to reduce data loss.
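
Purely as an illustration of this expectation (not an actual Thanos change), a hypothetical relaxed gate could downsample a short raw block once it can no longer grow, for example once newer raw blocks exist past its MaxTime. This reuses `blockMeta` and the constants from the sketch above; `eligibleRelaxed` and `newestRawMaxTime` are invented names.

```go
// Hypothetical sketch only, reusing blockMeta and the constants from the
// sketch above. eligibleRelaxed and newestRawMaxTime are invented names and
// do not exist in Thanos.
func eligibleRelaxed(m blockMeta, newestRawMaxTime int64) bool {
	if m.Resolution != resLevel0 {
		return false
	}
	if m.MaxTime-m.MinTime >= resLevel1DownsampleRange {
		return true
	}
	// Short block: downsample it anyway once later raw blocks exist past its
	// MaxTime, so its samples are not lost when raw retention expires.
	return newestRawMaxTime > m.MaxTime
}
```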

How to reproduce it (as minimally and precisely as possible):

Full logs to relevant components:

Anything else we need to know:

@dosubot dosubot bot added the bug label Nov 15, 2024