
[Enhancement]: include additional debug information for SQLCA when decompressing chunks #4468

Closed
spaceshipoperator opened this issue Jun 24, 2022 · 7 comments
Labels: enhancement, no-activity, waiting-for-author

Comments

@spaceshipoperator

What type of enhancement is this?

User experience

What subsystems and features will be improved?

Compression

What does the enhancement do?

A user is testing chunk decompression using python3 and the psycopg2 library.

Please include more useful, actionable information for troubleshooting compression issues:

Implementation challenges

#1 It seems the SQLCA is only partially filled out (many DEBUG fields are "None"). Here are the results of the error.diag list:
[06/09/22 09:21:30] App msg: decompressing
[06/09/22 09:21:30] SQL Error [55000] Severity: ERROR
DEBUG: column_name = None
DEBUG: constraint_name = None
DEBUG: context = None
DEBUG: datatype_name = None
DEBUG: internal_position = None
DEBUG: internal_query = None
DEBUG: message_detail = None
DEBUG: message_hint = None
DEBUG: message_primary = chunk "_hyper_449_2940_chunk" is not compressed
DEBUG: schema_name = None
DEBUG: severity = ERROR
DEBUG: severity_nonlocalized = ERROR
DEBUG: source_file = compress_utils.c
DEBUG: source_function = decompress_chunk_impl
DEBUG: source_line = 315
DEBUG: sqlstate = 55000
DEBUG: statement_position = None
DEBUG: table_name = None
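The listing above can be produced generically from psycopg2's `Diagnostics` object on a caught error. A minimal sketch, assuming psycopg2 and a live connection for the commented usage part (`conn` and the chunk name in the SQL are placeholders from this thread, not a verified call path):

```python
# Field names mirror the DEBUG listing above (psycopg2 Diagnostics attributes).
DIAG_FIELDS = [
    "column_name", "constraint_name", "context", "datatype_name",
    "internal_position", "internal_query", "message_detail", "message_hint",
    "message_primary", "schema_name", "severity", "severity_nonlocalized",
    "source_file", "source_function", "source_line", "sqlstate",
    "statement_position", "table_name",
]

def format_diag(diag):
    """Return 'DEBUG: field = value' lines, including fields that are None."""
    return [f"DEBUG: {name} = {getattr(diag, name, None)}" for name in DIAG_FIELDS]

# Intended usage against a real database (requires psycopg2 and a connection):
# try:
#     with conn.cursor() as cur:
#         cur.execute("SELECT decompress_chunk('_hyper_449_2940_chunk')")
# except psycopg2.Error as err:
#     print("\n".join(format_diag(err.diag)))
```

Printing all fields, None or not, makes it obvious at a glance which diagnostics the server actually populated, which is the gap this issue is about.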

@spaceshipoperator added the enhancement label on Jun 24, 2022
@spaceshipoperator
Author

A few more details/context (from the user/customer):

SQLSTATE 55000 reads: object_not_in_prerequisite_state

This SQLSTATE occurs both when the chunk is not compressed AND when an insert is attempted on a compressed chunk.

maybe it would help if DEBUG.context were populated?

@jnidzwetzki
Contributor

Hello @spaceshipoperator,

Thank you for your message and the description of the problem. I would like to reproduce this behavior. Would it be possible to share the steps required to reproduce the issue (e.g., how was the hypertable created, how was the compression policy set up, and what SQL statements were executed via python3/psycopg2 that led to the SQL error)?

@spaceshipoperator
Author

Thank you @jnidzwetzki, here are the steps I received from the user experiencing the problem:

  1. Insert a row, returns with an error 55000 ‘not in prerequisite state’. I think the message says the chunk is compressed.
  2. Decompress chunk
  3. Attempt insert. Under normal conditions this is fine.

… However, as there are other transactions doing the same processing, this is what happens:

  1. Insert a row, returns with an error 55000 ‘not in prerequisite state’. The message says the chunk is compressed.
  2. Decompress chunk, by this time another transaction has already decompressed the chunk, so a 55000 error is once again thrown, but for a different reason.

Now what? 55000 == 55000 ???

I will try to learn more about (1) how the compression policy has been defined and/or (2) some python3/psycopg2 scripts which may help to illustrate the issue.
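Since both failure modes share SQLSTATE 55000, callers today can only tell them apart by the message text. A hypothetical workaround sketch: the "is not compressed" wording is confirmed by the DEBUG output earlier in this thread, while treating every other 55000 as "chunk still compressed" is an assumption for illustration only:

```python
def classify_55000(message_primary):
    """Distinguish the two 55000 cases from the race described above."""
    if "is not compressed" in message_primary:
        # Another transaction already decompressed the chunk;
        # it should be safe to simply retry the insert.
        return "already_decompressed"
    # Assumption: any other 55000 here means the chunk is still compressed
    # and must be decompressed before the insert can succeed.
    return "chunk_compressed"
```

If DEBUG.context (or a distinct SQLSTATE per case) were populated as requested, this kind of fragile string matching would be unnecessary.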

@jnidzwetzki
Contributor

@spaceshipoperator Thank you for all the information. We have fixed a related problem in TimescaleDB 2.7.1, which should be released soon. Could you try to reproduce the issue with this version?

@spaceshipoperator
Author

Excellent, and thank you @jnidzwetzki. I will have a look at that item and see if I can test it myself.

@github-actions

github-actions bot commented Sep 7, 2022

This issue has been automatically marked as stale due to lack of activity. You can remove the stale label or comment. Otherwise, this issue will be closed in 30 days. Thank you!

@github-actions

github-actions bot commented Oct 8, 2022

Dear Author,

We are closing this issue due to lack of activity. Feel free to add a comment to this issue if you can provide more information and we will re-open it. Thank you!

@github-actions github-actions bot closed this as completed Oct 8, 2022