
filecache (In-memory file caching using Go)


About

This package recursively walks through a directory and caches files that match a regexp into a radix tree.

Since it spawns one goroutine per file / directory lookup, it is also context-aware, allowing the whole process to return early when the context is done.
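
For a quick end-to-end picture, the snippets from the usage sections below combine into a minimal, self-contained sketch (the directory and file names are the same placeholders used there):

package main

import (
	"log"

	"github.com/gbrlsnchs/filecache"
)

func main() {
	// Walk "foobar" recursively and cache every file whose name matches the regexp.
	c, err := filecache.ReadDir("foobar", `\.txt$`)
	if err != nil {
		log.Fatal(err)
	}
	// Lookups use the file name, as in the usage examples below.
	log.Print(c.Get("bazqux.txt"))
	log.Printf("%d files cached, %d bytes total", c.Len(), c.Size())
}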

Usage

The full documentation is available on GoDoc.

Installing

Go 1.10

vgo get -u github.com/gbrlsnchs/filecache

Go 1.11 or later

go get -u github.com/gbrlsnchs/filecache

Importing

import (
	// ...

	"github.com/gbrlsnchs/filecache"
)

Reading all files in a directory

c, err := filecache.ReadDir("foobar", "")
if err != nil {
	// If err != nil, either the directory "foobar" doesn't exist or one of the
	// files inside it was removed while it was being read.
}
txt := c.Get("bazqux.txt")
log.Print(txt)

Reading specific files in a directory

c, err := filecache.ReadDir("foobar", `\.sql$`)
if err != nil {
	// ...
}
q := c.Get("bazqux.sql")
log.Print(q)
log.Print(c.Len())  // number of files cached
log.Print(c.Size()) // total size in bytes

Lazy-reading a directory

c := filecache.New("foobar")

// do stuff...

if err := c.Load(`\.log$`); err != nil {
	// ...
}
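
Once Load returns, lookups work the same way as after ReadDir, for example (the file name is a placeholder):

entry := c.Get("foobar.log")
log.Print(entry)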

Setting a custom goroutine limit

By default, this package spawns one goroutine per file inside each directory, capping the number of concurrent goroutines at runtime.NumCPU(). A custom limit can be set with the SetSemaphoreSize method.

c := filecache.New("foobar")
c.SetSemaphoreSize(100)
if err := c.Load(`\.log$`); err != nil {
	// ...
}

Contributing

How to help