Commit 9b83c13: fixes after merge

akuzm committed Dec 11, 2024
1 parent 3da23f5
Showing 5 changed files with 242 additions and 292 deletions.
tsl/src/nodes/decompress_chunk/decompress_chunk.c (2 changes: 1 addition & 1 deletion)
@@ -877,7 +877,7 @@ ts_decompress_chunk_generate_paths(PlannerInfo *root, RelOptInfo *chunk_rel, con
pushdown_quals(root, compression_info->settings, chunk_rel, compressed_rel, consider_partial);

set_baserel_size_estimates(root, compressed_rel);
-const double new_tuples_estimate = compressed_rel->rows * DECOMPRESS_CHUNK_BATCH_SIZE;
+const double new_tuples_estimate = compressed_rel->rows * TARGET_COMPRESSED_BATCH_SIZE;
const double new_rows_estimate =
new_tuples_estimate *
clauselist_selectivity(root, chunk_rel->baserestrictinfo, 0, JOIN_INNER, NULL);
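The renamed constant feeds the planner's row estimate for a decompressed chunk: each row of the compressed relation stores up to one batch of tuples, so the estimated decompressed tuple count is the compressed row count times the batch size, scaled by the selectivity of the chunk's restriction clauses. The following is a minimal standalone sketch of that arithmetic, not the actual implementation: the helper name is hypothetical, the real computation lives inline in ts_decompress_chunk_generate_paths() with the selectivity coming from clauselist_selectivity(), and the literal 1000 is an assumption about the constant's value.

    #include <stdio.h>

    /* Assumption: the per-batch tuple capacity behind both the old and the
     * new constant name is 1000; verify against the TimescaleDB headers. */
    #define TARGET_COMPRESSED_BATCH_SIZE 1000

    /* Hypothetical helper mirroring the hunk above. */
    static double
    estimate_decompressed_rows(double compressed_rows, double qual_selectivity)
    {
        /* Every compressed row holds up to one full batch of tuples. */
        const double new_tuples_estimate =
            compressed_rows * TARGET_COMPRESSED_BATCH_SIZE;

        /* Scale by the fraction of tuples expected to pass the quals. */
        return new_tuples_estimate * qual_selectivity;
    }

    int
    main(void)
    {
        /* Example: 10 compressed rows at 10% selectivity
         * -> 10 * 1000 * 0.1 = 1000 expected rows. */
        printf("%.0f\n", estimate_decompressed_rows(10.0, 0.1));
        return 0;
    }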
tsl/test/expected/transparent_decompression_ordered_index-15.out (40 changes: 18 additions & 22 deletions)
@@ -690,22 +690,20 @@ JOIN LATERAL
WHERE node = f1.device_id) q
ON met.device_id = q.node and met.device_id_peer = q.device_id_peer
and met.v0 = q.v0 and met.v0 > 2 and time = '2018-01-19 20:00:00-05';
-                                                                                                                            QUERY PLAN
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
+                                                                                                                              QUERY PLAN
+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Nested Loop (actual rows=1 loops=1)
-   Join Filter: (nodetime.node = met.device_id)
+   Join Filter: ("*VALUES*".column1 = nodetime.node)
   ->  Nested Loop (actual rows=1 loops=1)
-         Join Filter: (nodetime.node = "*VALUES*".column1)
-         Rows Removed by Join Filter: 1
-         ->  Seq Scan on nodetime (actual rows=1 loops=1)
         ->  Values Scan on "*VALUES*" (actual rows=2 loops=1)
-   ->  Custom Scan (DecompressChunk) on _hyper_1_4_chunk met (actual rows=1 loops=1)
-         Filter: ("*VALUES*".column3 = v0)
-         Rows Removed by Filter: 47
-         Vectorized Filter: ((v0 > 2) AND ("time" = 'Fri Jan 19 17:00:00 2018 PST'::timestamp with time zone))
-         ->  Index Scan using compress_hyper_2_9_chunk_device_id_device_id_peer__ts_meta__idx on compress_hyper_2_9_chunk (actual rows=1 loops=1)
-               Index Cond: ((device_id = "*VALUES*".column1) AND (device_id_peer = "*VALUES*".column2) AND (_ts_meta_min_1 <= 'Fri Jan 19 17:00:00 2018 PST'::timestamp with time zone) AND (_ts_meta_max_1 >= 'Fri Jan 19 17:00:00 2018 PST'::timestamp with time zone))
-(13 rows)
+         ->  Custom Scan (DecompressChunk) on _hyper_1_4_chunk met (actual rows=0 loops=2)
+               Filter: ("*VALUES*".column3 = v0)
+               Rows Removed by Filter: 24
+               Vectorized Filter: ((v0 > 2) AND ("time" = 'Fri Jan 19 17:00:00 2018 PST'::timestamp with time zone))
+               ->  Index Scan using compress_hyper_2_9_chunk_device_id_device_id_peer__ts_meta__idx on compress_hyper_2_9_chunk (actual rows=0 loops=2)
+                     Index Cond: ((device_id = "*VALUES*".column1) AND (device_id_peer = "*VALUES*".column2) AND (_ts_meta_min_1 <= 'Fri Jan 19 17:00:00 2018 PST'::timestamp with time zone) AND (_ts_meta_max_1 >= 'Fri Jan 19 17:00:00 2018 PST'::timestamp with time zone))
+   ->  Seq Scan on nodetime (actual rows=1 loops=1)
+(11 rows)

-- filter on compressed attr (v0) with seqscan enabled and indexscan
-- disabled. filters on compressed attr should be above the seq scan.
@@ -841,24 +839,22 @@ GROUP BY m.device_id,
ORDER BY 1,
2,
3;
-                                    QUERY PLAN
------------------------------------------------------------------------------------
+                                  QUERY PLAN
+--------------------------------------------------------------------------------
 Sort (actual rows=0 loops=1)
   Sort Key: m.device_id, d.v0, (count(*))
   Sort Method: quicksort
   ->  HashAggregate (actual rows=0 loops=1)
         Group Key: m.device_id, d.v0
         Batches: 1
-         ->  Hash Join (actual rows=0 loops=1)
-               Hash Cond: ((d.device_id = m.device_id) AND (d."time" = m."time"))
+         ->  Nested Loop (actual rows=0 loops=1)
               ->  Custom Scan (ConstraintAwareAppend) (actual rows=0 loops=1)
                     Hypertable: metrics_ordered_idx
                     Chunks excluded during startup: 2
-               ->  Hash (never executed)
-                     ->  Custom Scan (ConstraintAwareAppend) (never executed)
-                           Hypertable: metrics_ordered_idx
-                           Chunks excluded during startup: 2
-(15 rows)
+               ->  Custom Scan (ConstraintAwareAppend) (never executed)
+                     Hypertable: metrics_ordered_idx
+                     Chunks excluded during startup: 2
+(13 rows)

--query with no results --
:PREFIX
tsl/test/expected/vectorized_aggregation.out (135 changes: 66 additions & 69 deletions)
@@ -183,77 +183,74 @@ SELECT sum(segment_by_value) FROM testtable WHERE segment_by_value > 0;
-- Vectorization not possible due to a used filter
:EXPLAIN
SELECT sum(segment_by_value) FROM testtable WHERE segment_by_value > 0 AND int_value > 0;
-                                                                                                                                             QUERY PLAN
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
+                                                                                                                                            QUERY PLAN
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Finalize Aggregate
   Output: sum(_hyper_1_1_chunk.segment_by_value)
-   ->  Gather
-         Output: (PARTIAL sum(_hyper_1_1_chunk.segment_by_value))
-         Workers Planned: 2
-         ->  Parallel Append
-               ->  Custom Scan (VectorAgg)
-                     Output: (PARTIAL sum(_hyper_1_1_chunk.segment_by_value))
-                     Grouping Policy: all compressed batches
-                     ->  Custom Scan (DecompressChunk) on _timescaledb_internal._hyper_1_1_chunk
-                           Output: _hyper_1_1_chunk.segment_by_value
-                           Vectorized Filter: (_hyper_1_1_chunk.int_value > 0)
-                           ->  Parallel Seq Scan on _timescaledb_internal.compress_hyper_2_11_chunk
-                                 Output: compress_hyper_2_11_chunk._ts_meta_count, compress_hyper_2_11_chunk.segment_by_value, compress_hyper_2_11_chunk._ts_meta_min_1, compress_hyper_2_11_chunk._ts_meta_max_1, compress_hyper_2_11_chunk."time", compress_hyper_2_11_chunk.int_value, compress_hyper_2_11_chunk.float_value
-                                 Filter: (compress_hyper_2_11_chunk.segment_by_value > 0)
-               ->  Custom Scan (VectorAgg)
-                     Output: (PARTIAL sum(_hyper_1_2_chunk.segment_by_value))
-                     Grouping Policy: all compressed batches
-                     ->  Custom Scan (DecompressChunk) on _timescaledb_internal._hyper_1_2_chunk
-                           Output: _hyper_1_2_chunk.segment_by_value
-                           Vectorized Filter: (_hyper_1_2_chunk.int_value > 0)
-                           ->  Parallel Seq Scan on _timescaledb_internal.compress_hyper_2_12_chunk
-                                 Output: compress_hyper_2_12_chunk._ts_meta_count, compress_hyper_2_12_chunk.segment_by_value, compress_hyper_2_12_chunk._ts_meta_min_1, compress_hyper_2_12_chunk._ts_meta_max_1, compress_hyper_2_12_chunk."time", compress_hyper_2_12_chunk.int_value, compress_hyper_2_12_chunk.float_value
-                                 Filter: (compress_hyper_2_12_chunk.segment_by_value > 0)
-               ->  Custom Scan (VectorAgg)
-                     Output: (PARTIAL sum(_hyper_1_3_chunk.segment_by_value))
-                     Grouping Policy: all compressed batches
-                     ->  Custom Scan (DecompressChunk) on _timescaledb_internal._hyper_1_3_chunk
-                           Output: _hyper_1_3_chunk.segment_by_value
-                           Vectorized Filter: (_hyper_1_3_chunk.int_value > 0)
-                           ->  Parallel Seq Scan on _timescaledb_internal.compress_hyper_2_13_chunk
-                                 Output: compress_hyper_2_13_chunk._ts_meta_count, compress_hyper_2_13_chunk.segment_by_value, compress_hyper_2_13_chunk._ts_meta_min_1, compress_hyper_2_13_chunk._ts_meta_max_1, compress_hyper_2_13_chunk."time", compress_hyper_2_13_chunk.int_value, compress_hyper_2_13_chunk.float_value
-                                 Filter: (compress_hyper_2_13_chunk.segment_by_value > 0)
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_4_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_4_chunk
-                           Output: _hyper_1_4_chunk.segment_by_value
-                           Filter: ((_hyper_1_4_chunk.segment_by_value > 0) AND (_hyper_1_4_chunk.int_value > 0))
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_5_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_5_chunk
-                           Output: _hyper_1_5_chunk.segment_by_value
-                           Filter: ((_hyper_1_5_chunk.segment_by_value > 0) AND (_hyper_1_5_chunk.int_value > 0))
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_6_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_6_chunk
-                           Output: _hyper_1_6_chunk.segment_by_value
-                           Filter: ((_hyper_1_6_chunk.segment_by_value > 0) AND (_hyper_1_6_chunk.int_value > 0))
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_7_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_7_chunk
-                           Output: _hyper_1_7_chunk.segment_by_value
-                           Filter: ((_hyper_1_7_chunk.segment_by_value > 0) AND (_hyper_1_7_chunk.int_value > 0))
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_8_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_8_chunk
-                           Output: _hyper_1_8_chunk.segment_by_value
-                           Filter: ((_hyper_1_8_chunk.segment_by_value > 0) AND (_hyper_1_8_chunk.int_value > 0))
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_9_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_9_chunk
-                           Output: _hyper_1_9_chunk.segment_by_value
-                           Filter: ((_hyper_1_9_chunk.segment_by_value > 0) AND (_hyper_1_9_chunk.int_value > 0))
-               ->  Partial Aggregate
-                     Output: PARTIAL sum(_hyper_1_10_chunk.segment_by_value)
-                     ->  Parallel Seq Scan on _timescaledb_internal._hyper_1_10_chunk
-                           Output: _hyper_1_10_chunk.segment_by_value
-                           Filter: ((_hyper_1_10_chunk.segment_by_value > 0) AND (_hyper_1_10_chunk.int_value > 0))
-(68 rows)
+   ->  Append
+         ->  Custom Scan (VectorAgg)
+               Output: (PARTIAL sum(_hyper_1_1_chunk.segment_by_value))
+               Grouping Policy: all compressed batches
+               ->  Custom Scan (DecompressChunk) on _timescaledb_internal._hyper_1_1_chunk
+                     Output: _hyper_1_1_chunk.segment_by_value
+                     Vectorized Filter: (_hyper_1_1_chunk.int_value > 0)
+                     ->  Index Scan using compress_hyper_2_11_chunk_segment_by_value__ts_meta_min_1___idx on _timescaledb_internal.compress_hyper_2_11_chunk
+                           Output: compress_hyper_2_11_chunk._ts_meta_count, compress_hyper_2_11_chunk.segment_by_value, compress_hyper_2_11_chunk._ts_meta_min_1, compress_hyper_2_11_chunk._ts_meta_max_1, compress_hyper_2_11_chunk."time", compress_hyper_2_11_chunk.int_value, compress_hyper_2_11_chunk.float_value
+                           Index Cond: (compress_hyper_2_11_chunk.segment_by_value > 0)
+         ->  Custom Scan (VectorAgg)
+               Output: (PARTIAL sum(_hyper_1_2_chunk.segment_by_value))
+               Grouping Policy: all compressed batches
+               ->  Custom Scan (DecompressChunk) on _timescaledb_internal._hyper_1_2_chunk
+                     Output: _hyper_1_2_chunk.segment_by_value
+                     Vectorized Filter: (_hyper_1_2_chunk.int_value > 0)
+                     ->  Index Scan using compress_hyper_2_12_chunk_segment_by_value__ts_meta_min_1___idx on _timescaledb_internal.compress_hyper_2_12_chunk
+                           Output: compress_hyper_2_12_chunk._ts_meta_count, compress_hyper_2_12_chunk.segment_by_value, compress_hyper_2_12_chunk._ts_meta_min_1, compress_hyper_2_12_chunk._ts_meta_max_1, compress_hyper_2_12_chunk."time", compress_hyper_2_12_chunk.int_value, compress_hyper_2_12_chunk.float_value
+                           Index Cond: (compress_hyper_2_12_chunk.segment_by_value > 0)
+         ->  Custom Scan (VectorAgg)
+               Output: (PARTIAL sum(_hyper_1_3_chunk.segment_by_value))
+               Grouping Policy: all compressed batches
+               ->  Custom Scan (DecompressChunk) on _timescaledb_internal._hyper_1_3_chunk
+                     Output: _hyper_1_3_chunk.segment_by_value
+                     Vectorized Filter: (_hyper_1_3_chunk.int_value > 0)
+                     ->  Index Scan using compress_hyper_2_13_chunk_segment_by_value__ts_meta_min_1___idx on _timescaledb_internal.compress_hyper_2_13_chunk
+                           Output: compress_hyper_2_13_chunk._ts_meta_count, compress_hyper_2_13_chunk.segment_by_value, compress_hyper_2_13_chunk._ts_meta_min_1, compress_hyper_2_13_chunk._ts_meta_max_1, compress_hyper_2_13_chunk."time", compress_hyper_2_13_chunk.int_value, compress_hyper_2_13_chunk.float_value
+                           Index Cond: (compress_hyper_2_13_chunk.segment_by_value > 0)
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_4_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_4_chunk
+                     Output: _hyper_1_4_chunk.segment_by_value
+                     Filter: ((_hyper_1_4_chunk.segment_by_value > 0) AND (_hyper_1_4_chunk.int_value > 0))
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_5_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_5_chunk
+                     Output: _hyper_1_5_chunk.segment_by_value
+                     Filter: ((_hyper_1_5_chunk.segment_by_value > 0) AND (_hyper_1_5_chunk.int_value > 0))
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_6_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_6_chunk
+                     Output: _hyper_1_6_chunk.segment_by_value
+                     Filter: ((_hyper_1_6_chunk.segment_by_value > 0) AND (_hyper_1_6_chunk.int_value > 0))
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_7_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_7_chunk
+                     Output: _hyper_1_7_chunk.segment_by_value
+                     Filter: ((_hyper_1_7_chunk.segment_by_value > 0) AND (_hyper_1_7_chunk.int_value > 0))
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_8_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_8_chunk
+                     Output: _hyper_1_8_chunk.segment_by_value
+                     Filter: ((_hyper_1_8_chunk.segment_by_value > 0) AND (_hyper_1_8_chunk.int_value > 0))
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_9_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_9_chunk
+                     Output: _hyper_1_9_chunk.segment_by_value
+                     Filter: ((_hyper_1_9_chunk.segment_by_value > 0) AND (_hyper_1_9_chunk.int_value > 0))
+         ->  Partial Aggregate
+               Output: PARTIAL sum(_hyper_1_10_chunk.segment_by_value)
+               ->  Seq Scan on _timescaledb_internal._hyper_1_10_chunk
+                     Output: _hyper_1_10_chunk.segment_by_value
+                     Filter: ((_hyper_1_10_chunk.segment_by_value > 0) AND (_hyper_1_10_chunk.int_value > 0))
+(65 rows)

:EXPLAIN
SELECT sum(segment_by_value) FROM testtable WHERE int_value > 0;