Segmentation fault, "Failed process was running: insert hypertable" #1165
@Specter-Y any chance you still have the core dump around? Can you do a …
It took a long time to obtain a core dump with line information. We recompiled TimescaleDB and PostgreSQL with the -g flag, reinstalled the environment, and reproduced the problem. The key core dump information is as follows:
This time the crash occurs after ts_chunk_insert_state_create is called, inside chunk_insert_state_set_arbiter_indexes.
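For background, chunk_insert_state_set_arbiter_indexes is the step that maps the arbiter indexes needed by INSERT ... ON CONFLICT onto the indexes of the chunk receiving the row. A minimal sketch of a statement shape that exercises this path, assuming the workload includes an upsert (table and values are illustrative, borrowed from the DDL later in this report, not the reporter's actual statement):

-- Hypothetical upsert sketch: ON CONFLICT forces the arbiter-index lookup
-- on whichever chunk receives the row. Values are illustrative only.
INSERT INTO dw_ies.t_mpp_11182_1d_ejki0_0 ("time", entity_id, ind_042lgvatg)
VALUES (now(), 'entity_1', 42.0)
ON CONFLICT ("time", entity_id)
DO UPDATE SET ind_042lgvatg = EXCLUDED.ind_042lgvatg;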
Dear @cevian, please help us find the cause of the problem. Thanks a lot.
@Specter-Y Not quite enough info here to debug, could you run …
@Specter-Y We were able to reproduce this bug. We are working on a fix now. I'll update you with our progress. Thank you.
Relevant system information:
PostgreSQL version (output of postgres --version): 11.2
TimescaleDB version (output of \dx in psql): 1.2.2
Describe the bug
Error log:
2019-04-12 13:00:21.795 UTC,,,284641,,5cb037a2.457e1,6,,2019-04-12 07:00:50 UTC,,0,LOG,00000,"server process (PID 253718) was terminated by signal 11: Segmentation fault","Failed process was running: insert into t_mpp_11120_1d_rx0l3p_0(time,entity_id,ind_030ix5z4n,ind_03vfryh7a,ind_04apng40b,ind_05aa4w885,ind_05n4v6dpm,ind_05qpaty18,ind_05rkbs2ri,ind_09rvbyr2y,ind_0au4x42mk,ind_0b4g17rpg,ind_0cibtjd1m) select U0.time as time,U0.entity_id as entity_id,U0.ind_030ix5z4n as ind_030ix5z4n,U0.ind_03vfryh7a as ind_03vfryh7a,U0.ind_04apng40b as ind_04apng40b,U0.ind_05aa4w885 as ind_05aa4w885,U0.ind_05n4v6dpm as ind_05n4v6dpm,U0.ind_05qpaty18 as ind_05qpaty18,U0.ind_05rkbs2ri as ind_05rkbs2ri,U0.ind_09rvbyr2y as ind_09rvbyr2y,U0.ind_0au4x42mk as ind_0au4x42mk,U0.ind_0b4g17rpg as ind_0b4g17rpg,U0.ind_0cibtjd1m as ind_0cibtjd1m from (select t01.destination_instance as entity_id,dw_common.time_floor(t00.time,'30m') as time,avg(t00.ind_030ix5z4n) as ind_030ix5z4n,avg(t00.ind_03vfryh7a) as ind_03vfryh7a,avg(t00.ind_04apng40b) as ind_04apng40b,avg(t00.ind_05aa4w885) as ind_05aa4w885,avg(t00.ind_05n4v6dpm) as ind_05n4v6dpm,avg(t00.ind_05qpaty18) as ind_05qpaty18,avg(t00.ind_05rkbs2ri) as ind_05rkbs2ri,avg(t00.ind_09rvby",,,,,,,,""
coredump with gdb:
(gdb) where
#0 0x00007fec434cbd90 in ts_chunk_insert_state_create () from /opt/tsdb/pgsql/lib/timescaledb-1.2.2.so
#1 0x00007fec434c951f in ts_chunk_dispatch_get_chunk_insert_state () from /opt/tsdb/pgsql/lib/timescaledb-1.2.2.so
#2 0x00007fec434c98a4 in chunk_dispatch_exec () from /opt/tsdb/pgsql/lib/timescaledb-1.2.2.so
#3 0x000000000062704b in ExecModifyTable ()
#4 0x0000000000604daa in standard_ExecutorRun ()
#5 0x00007feec385a2f5 in pgss_ExecutorRun () from /opt/tsdb/pgsql/lib/pg_stat_statements.so
#6 0x000000000073cada in ProcessQuery ()
#7 0x000000000073cd01 in PortalRunMulti ()
#8 0x000000000073d7bc in PortalRun ()
#9 0x000000000073b2cd in PostgresMain ()
#10 0x000000000048e399 in ServerLoop ()
#11 0x00000000006d1eda in PostmasterMain ()
#12 0x000000000048edef in main ()
To Reproduce
Use TimescaleDB to create a hypertable; in PostgreSQL the table then looks like this (a sketch of the create_hypertable call follows the DDL below):
CREATE TABLE dw_ies.t_mpp_11182_1d_ejki0_0
(
"time" timestamp with time zone NOT NULL,
entity_id text COLLATE pg_catalog."default" NOT NULL,
ind_042lgvatg numeric(25,5),
CONSTRAINT t_mpp_11182_1d_ejki0_0_pkey PRIMARY KEY ("time", entity_id)
)
WITH (
OIDS = FALSE
)
TABLESPACE pg_default;
ALTER TABLE dw_ies.t_mpp_11182_1d_ejki0_0
OWNER to dw_ies;
CREATE INDEX t_mpp_11182_1d_ejki0_0_time_idx
ON dw_ies.t_mpp_11182_1d_ejki0_0 USING btree
("time" DESC)
TABLESPACE pg_default;
CREATE TRIGGER ts_insert_blocker
BEFORE INSERT
ON dw_ies.t_mpp_11182_1d_ejki0_0
FOR EACH ROW
EXECUTE PROCEDURE _timescaledb_internal.insert_blocker();
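The report does not show the create_hypertable call itself; the ts_insert_blocker trigger above is created by create_hypertable, so this DDL reads like a dump of an already-converted hypertable. A sketch of what the call presumably looked like; the 1-day chunk interval is an assumption inferred from the "_1d_" in the table name:

-- Hypothetical: the exact call is not in the report. The chunk interval
-- is assumed from the "_1d_" table-name suffix.
SELECT create_hypertable(
    'dw_ies.t_mpp_11182_1d_ejki0_0',   -- target table
    'time',                            -- time partitioning column
    chunk_time_interval => interval '1 day'
);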
When a large amount of data is inserted and chunks are automatically created, the database sometimes crashes.
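A hedged sketch of an insert pattern that creates many chunks within a single statement, for anyone trying to reproduce locally (row volume, time range, and values are assumptions; the reporter's actual workload is the INSERT ... SELECT shown in the error log above):

-- Hypothetical bulk insert spanning 30 days, so roughly 30 one-day chunks
-- are created while the statement runs. All values are synthetic.
INSERT INTO dw_ies.t_mpp_11182_1d_ejki0_0 ("time", entity_id, ind_042lgvatg)
SELECT g.ts,
       'entity_' || (random() * 100)::int,  -- synthetic entity id
       random() * 1000
FROM generate_series(now() - interval '30 days', now(),
                     interval '1 minute') AS g(ts);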