batch: Add metrics for tasks. #3832
Comments
My notes for this part: we can collect metrics for the batch executor. The existing metrics `exchange_recv_size` and `exchange_frag_recv_size` can be a good reference for collecting the accumulated input and output row counts of a task. But it seems we cannot get the input recv size for fragments that contain a table scan: they do not have an exchange source as input. cc @ZENOTME
For the cleanup job, I think we can implement it using a "range search". For example, `batch_exchange_recv_row_number{query_id="aaa"}` selects every series whose `query_id` equals "aaa". So I think we can clean up the metrics in Prometheus first.
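The "range search" cleanup above can be sketched with a toy in-memory series store (pure Python; the series layout and the `delete_series` helper are hypothetical illustrations, not RisingWave's actual implementation):

```python
# Toy in-memory metric store: {(metric_name, ((label, value), ...)): counter}.
# Metric and label names mirror the ones mentioned in the thread.
store = {
    ("batch_exchange_recv_row_number", (("query_id", "aaa"), ("task_id", "1"))): 10,
    ("batch_exchange_recv_row_number", (("query_id", "aaa"), ("task_id", "2"))): 20,
    ("batch_exchange_recv_row_number", (("query_id", "bbb"), ("task_id", "1"))): 5,
}

def delete_series(store, metric_name, **match):
    """Drop every series of `metric_name` whose labels match all of `match`:
    the "range search" cleanup, e.g. delete all of {query_id="aaa"} at once."""
    doomed = [
        key for key in store
        if key[0] == metric_name
        and all(dict(key[1]).get(label) == value for label, value in match.items())
    ]
    for key in doomed:
        del store[key]
    return len(doomed)

print(delete_series(store, "batch_exchange_recv_row_number", query_id="aaa"))  # → 2
print(len(store))  # → 1
```

A real Prometheus server exposes a similar selector-based deletion through its TSDB admin API, which has to be enabled explicitly; whether cleanup belongs there or on the client side is what the rest of the thread debates.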
What do you mean by metrics in CN? I think they are all metrics in Prometheus.
I think the metrics in CN are a local data structure like this:
And we send these metrics to the Prometheus server, which then stores them. (Is that what "metrics in the Prometheus server" means?)
Metrics are stored in the Prometheus server, but they are not sent: Prometheus pulls metrics from each compute node.
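The pull model above can be sketched in a few lines: the compute node only renders its local counters into the text exposition format on an HTTP endpoint, and the Prometheus server GETs that payload on its own schedule — nothing is pushed (minimal sketch; the counters dict and its layout are illustrative):

```python
# Minimal sketch of the scrape side of the pull model: the compute node
# renders its local counters in the Prometheus text exposition format;
# the server periodically fetches this payload -- the node never sends.
def render_exposition(counters):
    """counters: {(metric_name, ((label, value), ...)): number}"""
    lines = []
    for (name, labels), value in sorted(counters.items()):
        label_str = ",".join(f'{k}="{v}"' for k, v in labels)
        lines.append(f"{name}{{{label_str}}} {value}")
    return "\n".join(lines) + "\n"

counters = {
    ("batch_exchange_recv_row_number", (("query_id", "aaa"),)): 10,
}
print(render_exposition(counters))
# batch_exchange_recv_row_number{query_id="aaa"} 10
```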
What's the difference between metrics in CN and metrics in Prometheus?
For example, there is a metric in CN like this: (or maybe I should call it a value in the Prometheus client).
And as you say, this metric (value) will be pulled by Prometheus and stored in the Prometheus server. The metrics in Prometheus are pulled from the metrics in CN.
I looked up the implementation and found that `delete_label_values` will only delete the metric in CN (`children.remove(&h)`).
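What `delete_label_values` does can be modeled as removing one entry from the client-side children map, while whatever the server has already scraped stays put (toy model; `children`, the hash key, and `server_tsdb` stand in for the real client internals and the server's storage):

```python
# Toy model of a metric vec's client-side state: `children` maps a hash of
# the label values to the live counter, mirroring `children.remove(&h)`.
children = {
    hash(("query_id=aaa",)): 10,
    hash(("query_id=bbb",)): 5,
}
server_tsdb = dict(children)  # what Prometheus has already scraped and stored

def delete_label_values(label_values):
    """Remove only the local child; the server's stored samples survive
    until Prometheus itself expires or deletes them."""
    children.pop(hash(tuple(label_values)), None)

delete_label_values(["query_id=aaa"])
print(len(children), len(server_tsdb))  # → 1 2
```

This is why deleting on the client alone is not a full cleanup: the next scrape simply stops reporting the series, but samples already in the server remain queryable until retention (or an explicit server-side delete) removes them.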
Closed |
Add task-level metrics, including the incoming number of rows and data size of each exchange source, and the outgoing number of rows and data size of each output.
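A minimal sketch of what these task-level metrics could look like as labeled counter families (the metric and label names here are illustrative guesses, not the ones RisingWave actually registered):

```python
# Hypothetical task-level metric schema: one counter per direction and unit,
# labeled by query/stage/task plus the exchange-source or output id.
TASK_METRICS = {
    "batch_task_exchange_recv_row_number": ("query_id", "stage_id", "task_id", "source_id"),
    "batch_task_exchange_recv_bytes":      ("query_id", "stage_id", "task_id", "source_id"),
    "batch_task_output_row_number":        ("query_id", "stage_id", "task_id", "output_id"),
    "batch_task_output_bytes":             ("query_id", "stage_id", "task_id", "output_id"),
}

def series_key(name, **labels):
    """Build the key for one series, checking the label set matches the schema."""
    expected = TASK_METRICS[name]
    assert set(labels) == set(expected), f"{name} needs labels {expected}"
    return (name, tuple(sorted(labels.items())))

key = series_key("batch_task_exchange_recv_row_number",
                 query_id="q1", stage_id="0", task_id="0", source_id="2")
print(key[0])  # → batch_task_exchange_recv_row_number
```

Keeping `query_id` as a label is also what makes the selector-based cleanup discussed in the comments possible.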