max_cache_size_mb doesn't seem to be working #42

Closed
scriby opened this issue Jan 4, 2014 · 11 comments

Comments

@scriby

scriby commented Jan 4, 2014

Hi,

I'm finding that the cache size is exceeding the max_cache_size_mb setting. I've tried 10MB, 512MB, etc., but the cache is growing as high as 6 gigs in some cases.

Note that in my case I'm dealing with multi-gig files on the Google Drive; not sure if that could be related. The drive also seems to get disconnected every now and then, which might be a factor as well.
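For reference, this is roughly the relevant line in my config (assuming the default ~/.gdfuse/default/config location; I also tried 10, with the same result):

```
max_cache_size_mb=512
```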

Thanks,

Chris

@astrada
Owner

astrada commented Jan 5, 2014

It's probably a bug in the way I keep track of the cache size. If you enable -verbose logging, you should be able to see lines like this:

Downloading resource (id=7)...Updating cache size (36) in db...done

in the file .gdfuse/default/gdfuse.log. The total cache size is kept in the SQLite db (table metadata) and is updated only when downloading or uploading a resource. If it gets out of sync, it can absolutely cause the issue you are experiencing.
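For instance, something like the following should show whether the size recorded in the log has drifted from what is actually on disk (the paths assume the default ~/.gdfuse/default layout, and the cache directory is assumed to sit next to the log at ~/.gdfuse/default/cache):

```
grep "Updating cache size" ~/.gdfuse/default/gdfuse.log | tail -n 5
du -sh ~/.gdfuse/default/cache
```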

@scriby
Author

scriby commented Jan 6, 2014

It looks like simply copying a large file out of the Google Drive onto the local disk causes it to get placed in the cache (even if the file is larger than the specified cache size).

@astrada
Owner

astrada commented Jan 6, 2014

Yes, you are right. It works like that because the Google Drive API does not support partial uploads, while FUSE delivers the file chunk by chunk, so if the file were not cached somewhere, the upload would be impossible. In theory, the cached file is deleted after the upload is completed; if the file remains, there is a bug.

@scriby
Author

scriby commented Jan 7, 2014

Yes, all I am doing is:

  1. Copy a 1 GB+ file from Google Drive to the local Linux file system
  2. Observe that even after the transfer completes, the .cache folder is > 1 GB in size (it was a few KB before)

There did not appear to be any errors or issues during the copy. In shell terms, it's roughly the sequence sketched below.
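(Paths here are just placeholders for my actual mount point and cache directory.)

```
du -sh ~/.gdfuse/default/cache     # a few KB before the copy
cp ~/gdrive/big-file.bin /tmp/     # copy a 1 GB+ file out of the mount
du -sh ~/.gdfuse/default/cache     # still > 1 GB after the copy finishes
```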

@astrada
Owner

astrada commented Jan 7, 2014

Oh, sorry, I didn't understand that you were talking about downloading and not uploading. When you download a resource, the cache is shrunk before the download is performed. That's because reading is also done chunk by chunk, and I think the latency would be too high if reading were performed directly against the Drive API. But when I have more time, I will do some testing on direct download without caching.

@glubox

glubox commented Feb 7, 2014

Hi, I would also like to see a direct download option. I'm using several Google Drives to store a large number of multimedia files that are only read by the system on which the drive is mounted. Everything works fine, but sometimes when my app wants to access a file on the drive there is a small delay, and this causes my app to stall forever. What I observed is that when my app hits a file on the drive, it is first downloaded into the cache folder; if that download is not very fast, I get the problem. If the download could begin in chunks, maybe my app would read the file seamlessly.

@PlexMediaFan

AS - Thanks for all your hard work in making our Google Drives available to Linux no-GUI installs!

I have the same need as stated in this thread -

I have a VPS running Ubuntu 13.10, CLI only, as a Plex media server. However, local space on the server is very limited, so I am using google-drive-ocamlfuse version 0.5.3-0ubuntu1 to access the Google Drive with my media on it (and it works great). The problem occurs when I access my media on my Google Drive: it immediately downloads the whole movie (5-6 GB) from the drive and stores it in the VPS's local cache before it plays or does anything else. Could you maybe add an option to NOT download and cache the entire movies/media accessed on Google Drive? I don't have the local HD space on my VPS to support caching files of that size; I really just want the ability to read my movie files on GD like a regular hard drive, and have Plex Media Server send them to me for viewing.

Thanks Again

@astrada
Owner

astrada commented Apr 28, 2014

OK, thanks for your feedback. I will try to add a direct download option. I still don't know if it's easy to implement.

@iplor

iplor commented Sep 28, 2014

Thanks for the very useful program.

Have you made any progress on this request? As with the users above, I'm dealing with large media files that I wish to access directly from the mounted drive, without caching them locally first. The local caching causes long delays and timeouts in my scenario.

At the moment I can only achieve this using the NetDrive commercial app on Windoze. This is essentially quite similar to your project, but it only needs to cache for uploads.

@astrada
Owner

astrada commented Oct 1, 2014

In version 0.5.10, I've added two options to deal with this issue. If you set stream_large_files to true, any file larger than large_file_threshold_mb will be downloaded directly, skipping the cache.
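For example, the relevant lines in the config file would look something like this (assuming the default ~/.gdfuse/default/config location; the 16 MB threshold is just an example value):

```
stream_large_files=true
large_file_threshold_mb=16
```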

@iplor

iplor commented Oct 2, 2014

I've just tested 0.5.10 from the Ubuntu repo, and I can confirm that the new options are present and working. Good work. Thank you.
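As a quick check (paths here are placeholders for my actual mount point and cache directory), copying a file above the threshold no longer fills the cache:

```
cp ~/gdrive/large-video.mkv /tmp/   # file larger than large_file_threshold_mb
du -sh ~/.gdfuse/default/cache      # stays small, since the file was streamed
```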
