
ReadTimeout errors #1184

Closed
uatach opened this issue Nov 13, 2023 · 5 comments · Fixed by #1186 or #1182
Assignees
Labels
- api: storage (Issues related to the googleapis/python-storage API.)
- priority: p2 (Moderately-important priority. Fix may not be included in next release.)
- type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)

Comments


uatach commented Nov 13, 2023

I keep receiving requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Read timed out. (read timeout=60) when trying to write files to storage.

I've tried adding the timeout value when opening the file:
Client().bucket('my_bucket').blob('my_blob').open(mode='wb', timeout=3600)

Investigating the traceback:

 File "/usr/local/lib/python3.9/site-packages/google/cloud/storage/fileio.py", line 411, in _upload_chunks_from_buffer
    upload.transmit_next_chunk(transport)

I've found that the method transmit_next_chunk accepts a timeout parameter, but it is not being set by the file-object writer.
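Until a release includes a fix, one version-agnostic workaround is to retry the write whenever the timeout surfaces. A minimal sketch, assuming simple retry semantics; the `retry_on_timeout` helper and `flaky_write` stand-in are illustrative, not part of the library (in real code the exception to catch would be `requests.exceptions.ReadTimeout`):

```python
import time

def retry_on_timeout(fn, attempts=3, delay=0.01, exceptions=(TimeoutError,)):
    """Call fn(), retrying up to `attempts` times when a timeout is raised."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise  # give up after the last attempt
            time.sleep(delay)

# Simulate a write that times out twice and then succeeds.
state = {'calls': 0}
def flaky_write():
    state['calls'] += 1
    if state['calls'] < 3:
        raise TimeoutError('read timed out')
    return 'written'

result = retry_on_timeout(flaky_write)
```

Here the wrapped write succeeds on the third attempt; in practice you would also add a backoff between retries.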

@product-auto-label product-auto-label bot added the api: storage Issues related to the googleapis/python-storage API. label Nov 13, 2023
@cojenco cojenco self-assigned this Nov 13, 2023
@cojenco cojenco added the status: investigating The issue is under investigation, which is determined to be non-trivial. label Nov 13, 2023
@cojenco cojenco added type: bug Error or flaw in code with unintended results or allowing sub-optimal usage patterns. priority: p2 Moderately-important priority. Fix may not be included in next release. and removed status: investigating The issue is under investigation, which is determined to be non-trivial. labels Nov 17, 2023
gcf-merge-on-green bot pushed a commit that referenced this issue Dec 4, 2023
@shamyjun22

> I keep receiving requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Read timed out. (read timeout=60) when trying to write files to storage.
>
> I've tried adding the timeout value when opening the file: Client().bucket('my_bucket').blob('my_blob').open(mode='wb', timeout=3600)
>
> Investigating the traceback:
>
>  File "/usr/local/lib/python3.9/site-packages/google/cloud/storage/fileio.py", line 411, in _upload_chunks_from_buffer
>     upload.transmit_next_chunk(transport)
>
> I've found that the method transmit_next_chunk can receive a timeout param that is not being set.

@uatach, have you resolved this issue? I keep receiving this exception.


uatach commented Apr 29, 2024

@shamyjun22 I've added some retry logic and it was enough; I haven't tried the new package version yet.

@shamyjun22

Hi @uatach, I encounter these exceptions when trying to upload a file to Google Cloud Storage:

1. Code: ReadTimeout
   Type: <class 'requests.exceptions.ReadTimeout'>
   Message: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Read timed out. (read timeout=60)

2. Code: ConnectionError
   Type: <class 'requests.exceptions.ConnectionError'>
   Message: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

Does it mean that the file was uploaded successfully, but I just didn't receive a response from the server?

Also, if it's not a bother, can you provide a snippet of the code you used to resolve it?

Thanks in advance.


uatach commented May 8, 2024

> I encounter this exception when trying to upload file to Google Cloud Storage

@shamyjun22 are you trying to upload many files, e.g. in parallel or in rapid succession? My impression is that several GCP Python libraries have trouble with many simultaneous connections, but I still haven't been able to pin the problem down... I have been adding some sleep calls just to avoid these problems.
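The sleep-between-uploads workaround can be made systematic with a small throttle, so calls are spaced out instead of fired back-to-back. A sketch under that assumption; `throttle` and the `upload` stand-in are illustrative names, not from the library:

```python
import time

def throttle(min_interval):
    """Decorator ensuring at least `min_interval` seconds between calls."""
    def wrap(fn):
        last = [float('-inf')]  # time of the previous call
        def inner(*args, **kwargs):
            wait = min_interval - (time.monotonic() - last[0])
            if wait > 0:
                time.sleep(wait)
            last[0] = time.monotonic()
            return fn(*args, **kwargs)
        return inner
    return wrap

@throttle(0.05)
def upload(chunk):
    # stand-in for the real upload call
    return len(chunk)

start = time.monotonic()
sizes = [upload(c) for c in (b'a', b'bb', b'ccc')]
elapsed = time.monotonic() - start  # at least ~0.1s for the two waits
```

The first call runs immediately; each subsequent call waits out the remainder of the interval.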

> Does it mean that I was able to upload the file successfully, but I just don't receive any response from where I am uploading to?

I don't think so.

> Also, if it wont bother, can you provide a snippet of code on how you resolve it?

Sure, but as I said, it's just simple retry logic:

import requests
from google.cloud.storage import Client

blob = Client().bucket('bucket_name').blob('blob_name')

# Retry until the write succeeds; catching the timeout specifically
# avoids swallowing unrelated errors.
while True:
    try:
        with blob.open('w') as fp:
            fp.write('data')
    except requests.exceptions.ReadTimeout:
        continue
    else:
        break


shamyjun22 commented May 9, 2024

@uatach, thank you for your response. By the way, I am not uploading many files in parallel; I am uploading one file every second, because I need realtime data.

This is how I structured my code in Python; maybe you can suggest improvements to the structure:

# import other libraries needed here
import time
import asyncio
from datetime import datetime

from google.cloud import storage

# setup cloud storage client here
client = storage.Client.from_service_account_json('service_account_here')
bucket = client.get_bucket('bucket_name_here')

def upload_to_storage_function_here(data):
    try:
        # do stuff here to upload file like the blob path
        # blob upload here
        pass
    except Exception:
        # catch exception here
        pass

async def main():
    while True:
        # fetch data from source
        # manipulate data

        upload_to_storage_function_here(data)

        # sec, timeStart and manila_timezone are defined elsewhere
        if sec == datetime.now(manila_timezone).strftime("%S"):
            time.sleep(max(0, 1 - (datetime.now(manila_timezone) - timeStart).total_seconds() % 1))

if __name__ == '__main__':
    asyncio.run(main())
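The once-per-second timing above can also be expressed as a drift-compensated scheduler loop, which avoids the modulo arithmetic on wall-clock seconds. A sketch; `on_interval`, the demo interval, and the task lambda are illustrative, not from this thread:

```python
import asyncio
import time

async def on_interval(task, interval, iterations):
    """Run task() on a fixed interval, compensating for drift."""
    next_tick = time.monotonic()
    for _ in range(iterations):
        task()  # e.g. fetch data and call the upload function
        next_tick += interval
        # sleep only for the remainder of the interval, so a slow
        # task doesn't push every later tick further out
        await asyncio.sleep(max(0.0, next_tick - time.monotonic()))

ticks = []
asyncio.run(on_interval(lambda: ticks.append(time.monotonic()), 0.05, 3))
```

In the real program the interval would be 1.0 and the loop would run indefinitely rather than for a fixed iteration count.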

Thanks.
