
Storage: repeatedly streaming a readable from storage "freezes" after a few times #335

Closed
cristiano-belloni opened this issue Dec 21, 2014 · 22 comments
Labels: api: storage, 🚨 This issue needs some love., triage me, type: bug

@cristiano-belloni (Contributor)

Hi all,
I'm trying to stream a file from my server to the client. While waiting for #318 to be closed, I'm using @stephenplusplus's https://github.com/stephenplusplus/range-stream. By the way, this doesn't change the issue I'm seeing, which seems related to createReadStream().

First of all, what I do on the server on a ranged request is extract the range from the request into the start and end variables, then get a file:

var file = resourceBucket.file(fileName);

create a readable:

var rStream = file.createReadStream()

then write a 206 on res and pipe the readable through the range stream:

res.writeHead(206, header);
rStream.pipe(rangeStream(start, end)).pipe(res);
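(header is built from the parsed range; a minimal sketch of what it might contain, assuming the total size comes from the file's metadata and that the content type is the WAV audio from my use case, would be:)

var totalSize = metadata.size; // total object size, e.g. from file.getMetadata()
var header = {
    'Content-Range': 'bytes ' + start + '-' + end + '/' + totalSize,
    'Accept-Ranges': 'bytes',
    'Content-Length': end - start + 1, // byte ranges are inclusive on both ends
    'Content-Type': 'audio/wav'
};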

rangeStream is @stephenplusplus's module, modified here with a few console.log statements to see what's happening:

var through = require("through2");

module.exports = function (start, end) {
    end += 1;
    var bytesReceived = 0;

    console.log ('---------------------------------------------------');
    console.log ("Sending a range from: " + start + " to: " + end);

    return through(function (chunk, enc, next) {
        console.log ('start, end: ' + start, end);
        console.log ('chunk.length: ' + chunk.length);
        bytesReceived += chunk.length;
        console.log ('bytesReceived: ' + bytesReceived);

        if (bytesReceived >= start) {

            if (start - (bytesReceived - chunk.length) > 0) {
                console.log ('slicing chunk');
                chunk = chunk.slice(start - (bytesReceived - chunk.length));
            }

            if (end <= bytesReceived) {
                console.log ('calling end');
                console.log ('---------------------------------------------------');
                this.push(chunk.slice(0, chunk.length - (bytesReceived - end)));
                this.end()
            } else {
                console.log ('pushing chunk');
                this.push(chunk)
            }
        }
        console.log ('calling next');
        next();
    });
};

Initially everything works fine: the readable streams data, this.end gets called, and the file streams OK.
But after the client makes a few ranged requests (specifically via an <audio> HTML element), rStream seems to freeze and never sends data to rangeStream again. Here is the log:

Sending a range from: 44 to: 6545454
start, end: 44 6545454
chunk.length: 10386
bytesReceived: 10386
slicing chunk
pushing chunk
calling next
start, end: 44 6545454
chunk.length: 2710
bytesReceived: 13096
pushing chunk
calling next
[... lots of chunks coming through ...]
start, end: 44 6545454
chunk.length: 16384
bytesReceived: 629918
pushing chunk
calling next 
[ hanging forever ]

As you can see, the code calls next(), but no data ever comes into it again. This seems to be an issue with rStream, and specifically with the readable. I also have an error handler on rStream, which never gets called:

rStream.on('error', function (err) {
    log.error({error: err}, 'downloadLayer - ERROR ON rstream');
    rStream.end();
});

Oddly enough, this seems to happen only when the range is 44 to the end. But since 44 is the size of the WAV header, that's a typical range value, so it could be a red herring.

@cristiano-belloni changed the title from Storage: repeatedly streaming a file from storage "freezes" after a few times to Storage: repeatedly streaming a readable from storage "freezes" after a few times on Dec 21, 2014
@stephenplusplus (Contributor)

Are you creating a new instance of a readable stream every time, or reusing one (e.g. do you call file.createReadStream once, or once per request)? I think things get messy if you reuse streams that have already consumed data.
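Roughly, the distinction I mean (a sketch with placeholder names, not your actual code):

var http = require('http');

// OK: a fresh readable per request; every request reads the file from the start
http.createServer(function (req, res) {
    resourceBucket.file('audio.wav').createReadStream().pipe(res); // resourceBucket as in your code
}).listen(8080);

// Messy: one readable shared across requests; once the first request has
// consumed its data, later requests will never receive anything
var shared = resourceBucket.file('audio.wav').createReadStream();
http.createServer(function (req, res) {
    shared.pipe(res);
}).listen(8081);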

@cristiano-belloni (Contributor, Author)

I call it for each request. My exact code is:

reqHandler = function (req, res) {
    // [...] hit the db, construct pathName
    gcloud.readFile(encodeURIComponent(pathName), function (err, file) {
        // [...] get the range from req
        res.writeHead(206, header);
        rStream = file.readable;
        rStream.pipe(rangeStream(start, end)).pipe(res);
    });
};

And my small gcloud module has:

exports.readFile = function (fileName, cb) {
    var file = resourceBucket.file(fileName);
    file.getMetadata(function (err, metadata) {
        if (err) {
            return cb(err, null);
        }
        cb(null, {meta: metadata, readable: file.createReadStream()});
    });
};

So a new file is created for every request, if I'm not missing something obvious.

@stephenplusplus (Contributor)

No, you're right: you're creating a new file instance for each request.

Does every range request fail?

When you see it stop sending data, are there any similarities between the failed instances? For example, is the byte count received always the same?

@cristiano-belloni (Contributor, Author)

No similarities, unfortunately. It took me 5-6 minutes to get a freeze this time (I'm disabling caching in the browser to make it happen quicker), and the final log is completely different in terms of values:

calling next
start, end: 919424 6545454
chunk.length: 32768
bytesReceived: 4734141
pushing chunk
calling next

Worth mentioning that when it happens, it completely freezes my server (it doesn't answer any requests at all, even unrelated ones). ps aux doesn't show high CPU use on the node process, btw.
I'm using node v0.10.29 (installed from Wheezy apt-get).

Let me know if some more tests would be useful, and I'll try them.

@cristiano-belloni (Contributor, Author)

And, to answer your first question: it doesn't happen on every ranged request. The first n ranged requests work perfectly fine; it freezes at the (n+1)th, where n seems random.

@stephenplusplus (Contributor)

So far, I can't think of a reason the stream would fail. If anything, it sounds like it could be a caching issue somewhere upstream: request or the API itself. All that's returned from createReadStream is a request instance.

Is there any way you can produce a headless test case that recreates the behavior of the ranged requests and reproduces the bug?

@cristiano-belloni (Contributor, Author)

Yep, here you go. This kills (freezes) my server after a couple of minutes: https://gist.github.com/janesconference/22afe04aba3dde02814d

@cristiano-belloni (Contributor, Author)

Updated it with a deterministic version; it still freezes.

@cristiano-belloni (Contributor, Author)

Update: the deterministic gist freezes it after exactly 5 requests.
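A minimal headless loop in the same spirit as the gist would look something like this (a sketch with hypothetical project, bucket, and file names, not the gist's exact code):

var gcloud = require('gcloud'); // assuming the 2014-era gcloud setup used in this thread
var rangeStream = require('range-stream');
var concat = require('concat-stream');

var bucket = gcloud.storage({ projectId: 'my-project' }).bucket('my-bucket');
var sent = 0;

// Fire a ranged read once per second; log "done" when each one finishes.
var timer = setInterval(function () {
    if (++sent > 20) return clearInterval(timer);
    console.log('request sent');
    bucket.file('test.wav')
        .createReadStream()
        .pipe(rangeStream(44, 6545454))
        .pipe(concat(function () {
            console.log('done');
        }));
}, 1000);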

@stephenplusplus (Contributor)

I started off by plugging in "yahoo.com" as an endpoint, and everything ran smoothly. I then tested directly with file.createReadStream(), and everything performed as expected. After that, I integrated range-stream and immediately saw it stop logging "done" after 5 requests. However, I kept the tests running and noticed very slow responses from the server, sometimes making it seem like I would never get a "done", but if I waited long enough, it did come. I assume that's what happened in the first case, where I thought I had reproduced the behavior. Also worth noting: my test file is quite small (~3 MB).

Still no answers, unfortunately, but I will try to play with it more tomorrow.

@cristiano-belloni (Contributor, Author)

You reproduced it.
My file is ~6 MB, so not that big. I get the same result: it stops logging after 5 requests. The server side freezes on the last call to next. It's possible that it doesn't completely freeze, though, but just becomes extremely unresponsive and slow - I kill it after a minute or so. I'll try to wait as you did.

@cristiano-belloni (Contributor, Author)

(Update: tried waiting for a long time, but the server no longer responds after 5 iterations.)

@stephenplusplus (Contributor)

Can you paste the exact code you're using (fill in the TODO part of your gist)? I'm not able to reproduce.

@cristiano-belloni (Contributor, Author)

@stephenplusplus, do you mean I should send you the API endpoint I'm using on my server, or send over the server code? In the second case I can probably simplify the code, because it's a complex server with MongoDB and JWT integrations.

Btw, didn't you say yesterday that by plugging in range-stream you could observe your server stopping after 5 requests?

@stephenplusplus (Contributor)

I just need a test case that strips it down to the bare minimum; something I can run isolated from your environment so we can be sure the bug is in gcloud. Of course, assume I have a bucket with files in it to test with. Maybe try the code from my comment on your gist and see if that reproduces the bug for you?

Yesterday, when I inserted gcloud into your example, it worked, even though it was slow. Inserting range-stream did result in 5 successful attempts, then nothing, but only on the first run. And I still attribute that to my lack of patience, as I didn't notice until later attempts that if I waited long enough, the requests did come back (1-2 seconds at first, 10-15 seconds at the slowest).

@cristiano-belloni (Contributor, Author)

Sorry, just found your comment with your version.

I replaced it with my data, and this is what I get:

done
request sent
done
request sent
done
request sent
done
request sent
done
request sent
request sent
request sent
request sent
request sent
request sent
request sent
request sent
request sent
request sent
request sent
request sent
[a lot of times]

"done" never gets called again.

@stephenplusplus (Contributor)

No, it works for me. I'll re-run in a bit to see what I get today. Curious: if you take out range-stream from my example, does it work?

@cristiano-belloni (Contributor, Author)

Commenting out the line .pipe(rangeStream(start, end)) produces this result:

request sent
request sent
request sent
done
request sent
request sent
done
done
request sent
request sent
request sent
done
request sent
done
request sent
done
request sent
request sent
request sent
done
done
request sent
done
request sent
done
request sent
request sent
done
request sent
request sent
done
done
request sent
done
request sent
request sent
done
done
request sent

Seems to work. Maybe the culprit is range-stream, but I can't see how. Maybe it's the interaction between range-stream and the readable generated by file?

stephenplusplus added a commit to stephenplusplus/range-stream that referenced this issue Dec 22, 2014
@stephenplusplus (Contributor)

This one broke my head a little bit. Try upgrading range-stream to 1.0.1: stephenplusplus/range-stream@261c989
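For context, one reading of why this happened (an interpretation, not a write-up of the actual commit): calling this.end() mid-pipe stops the transform, but the rest of the upstream HTTP response is never consumed, so its socket is never released back to the pool; node 0.10's HTTP agent defaults to 5 sockets per host, which would explain the freeze after exactly 5 requests. A fix in that spirit keeps draining chunks past the range instead of ending early:

var through = require('through2');

module.exports = function (start, end) {
    end += 1;
    var bytesReceived = 0;

    return through(function (chunk, enc, next) {
        bytesReceived += chunk.length;

        // Only push chunks that overlap [start, end); chunks entirely outside
        // the range are discarded but still consumed, so the upstream response
        // drains completely and its socket is released.
        if (bytesReceived >= start && bytesReceived - chunk.length < end) {
            if (start - (bytesReceived - chunk.length) > 0) {
                chunk = chunk.slice(start - (bytesReceived - chunk.length));
            }
            if (end <= bytesReceived) {
                this.push(chunk.slice(0, chunk.length - (bytesReceived - end)));
            } else {
                this.push(chunk);
            }
        }

        next(); // always advance; never end the stream mid-pipe
    });
};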

@cristiano-belloni (Contributor, Author)

Oh wow, I couldn't spot that at all!
Confirmed working for me, thanks.

@stephenplusplus (Contributor)

Phew! Thanks for reporting. We should win a debugging medal for this effort.

@cristiano-belloni (Contributor, Author)

You should :)

@jgeewax added the api: storage and type: bug labels Feb 2, 2015
@jgeewax added this to the Storage Stable milestone Feb 2, 2015
@yoshi-automation added the 🚨 This issue needs some love. and triage me labels Apr 6, 2020