refact: remove memoize #4

Merged
2 changes: 1 addition & 1 deletion .github/workflows/pipeline.yaml
@@ -46,4 +46,4 @@ jobs:

# Run go build
- name: Run Go Build
run: go build ./cache && go build ./memoize && go build ./persist
run: go build ./cache && go build ./persist
71 changes: 0 additions & 71 deletions README.md
@@ -74,74 +74,3 @@ var GetTeams = cache.OnDisk(filepath.Join("cache", "teams"), time.Hour, func(ctx
return teams, nil
})
```

## memoize
The memoize package provides functionality for memoizing function results.
You can use these functions to cache function results both in memory as well as in an external data store.
The cache is keyed by the input parameter, so calls with different inputs each get their own cache entry.
Be aware that if you are memoizing large amounts of data with long TTLs you may run into out-of-memory (OOM) issues.
This is especially true for memoization, where a new cache entry is created for every new parameter.

It's important to note that the memoized function may return expired data.
This can happen when your cached function returns an error but a previously cached value still exists.
In that case the stale cached value is returned along with your function's error.
As the developer, it is up to you to determine whether this stale data is safe to use or should be ignored.
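
A minimal sketch of how a caller might handle that case (illustrative only; GetTeam is the memoized function from the examples below, and the team name is a placeholder):
```go
team, err := GetTeam(ctx, "platform")
if err != nil {
	if team == nil {
		// No cached value to fall back on; propagate the failure.
		return nil, err
	}
	// The underlying call failed, but an expired cache entry was returned.
	// Decide whether the stale value is acceptable for your use case.
	log.Printf("using stale team data: %v", err)
}
```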

Example 1: cache function results in memory. This makes repeatedly calling the GetTeam function much faster since
only the first call results in a network request (see the usage sketch after this example).
```go
// GetTeam gets a team from an external api. The results will be cached in memory for at least one hour.
// The cached data is tied to the input parameter such that calls with different inputs will have their own
// individual cache.
var GetTeam = memoize.InMemory(time.Hour, func(ctx context.Context, teamName string) (*Team, error) {
client := &http.Client{}
resp, err := client.Get("https://api.weavedev.net/team/" + teamName)
if err != nil {
return nil, err
}

defer resp.Body.Close()
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, err
}

team := &Team{}
err = json.Unmarshal(body, team)
if err != nil {
return nil, err
}

return team, nil
})
```
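
As a rough illustration of the behaviour described above (the team name here is a placeholder):
```go
// First call performs the HTTP request and populates the in-memory cache.
team, err := GetTeam(ctx, "engineering")
fmt.Println(team, err)

// A second call with the same argument is served from the cache until the
// one-hour TTL expires, so no further network request is made.
team, err = GetTeam(ctx, "engineering")
fmt.Println(team, err)
```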

Example 2: cache function results in memory and on disk.
Like Example 1, this improves performance.
It also allows the cache to be restored across runs, which can be useful for short-lived processes like cron jobs or CLI tools.
```go
// GetTeam gets a team from an external api. The results will be cached in memory for at least one hour.
// The cached data is tied to the input parameter such that calls with different inputs will have their own
// individual cache. Additionally, the cache is backed by the file system so it can be restored between program runs.
var GetTeam = memoize.OnDisk(filepath.Join("cache", "team"), time.Hour, func(ctx context.Context, teamName string) (*Team, error) {
client := &http.Client{}
resp, err := client.Get("https://api.weavedev.net/team/" + teamName)
if err != nil {
return nil, err
}

defer resp.Body.Close()
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, err
}

team := &Team{}
err = json.Unmarshal(body, team)
if err != nil {
return nil, err
}

return team, nil
})
```
67 changes: 0 additions & 67 deletions memoize/memoize.go

This file was deleted.
