Cache abstraction layer with Redis support #19

Open · wants to merge 2 commits into base: master
Conversation

@gigablah commented Jul 1, 2012

Changes MogileFS::Config->memcache_client to Mgd::get_cache. Cache options are now configured in mogilefsd.conf (cache_type, cache_servers and cache_ttl). Preserves the functionality of the previous cache implementation (using server_settings). Adapters are supplied for Memcache and Redis.
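A mogilefsd.conf fragment using these options might look like the following. The option names (cache_type, cache_servers, cache_ttl) come from the patch description; the values and exact syntax are illustrative only:

```
# Hypothetical mogilefsd.conf excerpt -- values are examples, not defaults
cache_type = Redis
cache_servers = 127.0.0.1:6379
cache_ttl = 3600
```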

@gigablah (Author) commented Jul 1, 2012

And yes, I did read memcache-support.txt :p
This would be convenient for those who point, say, nginx directly at the tracker (using nginx-mogilefs-module), bypassing the app completely.

For Redis, device IDs are stored as sets (SADD, SMEMBERS). Unix socket paths are supported (e.g. /tmp/redis.sock).
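The set-based scheme mentioned above can be sketched as follows: the device IDs holding a given fid go into one Redis set via SADD and come back via SMEMBERS, rather than being serialized into a single blob. This is a Python sketch (the actual adapter in the patch is Perl); the key naming and the in-memory FakeRedis stand-in are assumptions for illustration, not taken from the patch.

```python
class FakeRedis:
    """Minimal in-memory stand-in implementing only the set commands used,
    so the sketch runs without a Redis server."""
    def __init__(self):
        self._data = {}

    def sadd(self, key, *members):
        # SADD: add members to the set stored at key
        self._data.setdefault(key, set()).update(members)

    def smembers(self, key):
        # SMEMBERS: return all members of the set (empty set if missing)
        return self._data.get(key, set())

    def expire(self, key, ttl):
        # A real client would set a time-to-live here; ignored in this sketch
        pass


def cache_devids(r, fid, devids, ttl=3600):
    """Cache the device IDs that hold a given fid as a Redis set."""
    key = "mogfs:devids:%d" % fid  # hypothetical key scheme
    r.sadd(key, *devids)
    r.expire(key, ttl)


def cached_devids(r, fid):
    """Read the cached device IDs back, as a sorted list."""
    key = "mogfs:devids:%d" % fid
    return sorted(int(d) for d in r.smembers(key))


r = FakeRedis()
cache_devids(r, 42, [1, 3, 7])
print(cached_devids(r, 42))  # -> [1, 3, 7]
```

Against a real server the same calls exist on redis-py's client, which also accepts a Unix socket path (e.g. `redis.Redis(unix_socket_path='/tmp/redis.sock')`), matching the socket support mentioned above.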

I pretty much followed the structure of Mgd::get_store.

@dormando (Member) commented:

Thanks a lot for doing this. I'm still trying to get enough round-tuits to properly review, but you've fixed a few good things so far. Not sure if it'll make 2.65, but if not, 2.66 for sure!

    $cache_ttl = MogileFS::Config->server_setting_cached('memcache_ttl') || 3600;
    } else {
        $servers = MogileFS->config('cache_servers');
        $type    = MogileFS->config('cache_type');
It seems like it'd be more helpful to either default to something here or print a warning that cache_type must be specified with cache_servers.

@dormando (Member) commented:

Hi again! I'm a terrible person: are you around to help deal with a few more comments on the patch series so we can toss it in?

Thanks!

@gigablah (Author) commented:

Sorry about that, I'll try to get to it in these couple of days.

@dormando (Member) commented:

Awesome, thanks! Though all apologies are mine!

@dormando (Member) commented:

Ping again! If it's fixed up within the next week, it could go out in the next cut.
