An opinionated way of doing async with alt #240
Conversation
API

const StargazerSource = {
  fetchUsers() {
    return {
      remote(state) {
        const url = `https://api.github.com/repos/${state.user}/${state.repo}/stargazers`
        return axios({ url }).then(response => response.data)
      },
      local(state) {
        return state.users.length ? state.users : null
      },
      loading: StargazerActions.fetchingUsers,
      success: StargazerActions.usersReceived,
      error: StargazerActions.failed
    }
  }
}

@datasource(StargazerSource)
class StargazerStore { }

alt.createStore(StargazerStore)

// ...

StargazerStore.fetchUsers();
StargazerStore.isLoading();

loading, success, and error are all action functions that you set; they are called when they need to be called. The cache check (local above) runs before anything else and is meant to determine whether we already have the value. If we don't have the value, the fetch (remote above) is called; it is supposed to return a promise which resolves with the data needed. All of this exports a public method on the store which you can call to make your read. I've included a decorator and a function to hook this up to a store as well.
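For context, here is a minimal sketch of the StargazerActions and store shape the snippet above assumes (the snippet leaves the store body empty for brevity). It uses alt's standard generateActions and bindListeners APIs; the initial state values and handler names here are illustrative, not part of this PR:

const StargazerActions = alt.generateActions(
  'fetchingUsers',
  'usersReceived',
  'failed'
)

class StargazerStore {
  constructor() {
    // State the source reads via state.user, state.repo, and state.users.
    this.user = 'goatslacker'
    this.repo = 'alt'
    this.users = []
    this.errorMessage = null

    this.bindListeners({
      onFetchingUsers: StargazerActions.fetchingUsers,
      onUsersReceived: StargazerActions.usersReceived,
      onFailed: StargazerActions.failed
    })
  }

  onFetchingUsers() {
    // Hook for showing a spinner or otherwise preparing for the fetched data.
  }

  onUsersReceived(users) {
    this.users = users
  }

  onFailed(errorMessage) {
    this.errorMessage = errorMessage
  }
}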
Interesting. What's the
I like this, especially the caching, and think that it is definitely worth adding to core. Will look at the code in more detail once you have it more finalized.
Cache is probably a bad word for it since it doesn't cache for you but rather lets you implement your own cache. I do however have a branch locally with auto caching as a decorator :)
Right, I think letting people set up their own cache is powerful. But yes we will want to make sure that is clear in the docs. 😄
@jareware oops totally missed your comment. |
StargazerStore.listen((state) => console.log('CHANGED', state.users.length))
StargazerStore.fetchUsers()
@goatslacker
It would be nice if we could pass an argument to this so that the source could use it in the fetch. For example, for a signIn action, if I don't want to store the password in my store, passing it via an argument would make it available to my request. That would help a lot.
What do you think?
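To illustrate the idea, a hypothetical sketch, assuming the arguments passed to the generated public method are forwarded to remote (the SignInSource, SessionActions, and endpoint names here are made up for illustration):

const SignInSource = {
  signIn() {
    return {
      // credentials would be whatever the caller passed to SignInStore.signIn(...)
      remote(state, credentials) {
        return axios.post('/api/session', {
          username: credentials.username,
          password: credentials.password // never written to the store itself
        }).then(response => response.data)
      },
      local() {
        return null // always hit the server for sign-in
      },
      loading: SessionActions.signingIn,
      success: SessionActions.signedIn,
      error: SessionActions.signInFailed
    }
  }
}

// usage, e.g. from a login form submit handler:
// SignInStore.signIn({ username, password })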
Sounds good.
👏 Thanks!
Actually, never mind, it looks like you're passing the arguments through too. All good.
We probably want to pass an option like force: true so it bypasses the cache check. What do you think?
Return null from cache or just don't define it.
I'm just wondering about the case where we have the cache but the user sometimes wants to force an update? I'm guessing not?
Also, what happens if fetch is called while loading is true? Should it automatically do nothing, or should the user add a check in cache against state.loading and return false there?
We probably also need to ensure abort works. If we're returning the promise to the view then that works, but ideally it gets passed to the loading action. Does the loading action take the promise into the store? I'm guessing that a proper UI design is to always make the loading icon an abort button.
I could see a potential use case for forcing an update of the cached data, but I think in most cases it can be solved by the implementation in the cache function. It takes the state as an argument, which allows you to determine whether new data needs to be queried or not. You could even store the time of the last fetch and only fetch again after a certain period of time. I think this is pretty flexible since you define what caching means (if you even choose to have it).
Loading should be true when fetch is called. Loading is the intermediary state before the data has been fetched. It gives you a hook to give the user feedback that the data is on its way. If you look at the way this is implemented, it calls the loading function if it is present and then immediately starts trying to fetch data. Since the fetch is async, the loading function allows us to update some state immediately, which, like I said before, could indicate that loading is occurring, or simply prepare for the fetched data.
There is no inherent abort in place right now, but like you are saying, that could be implemented in conjunction with this async pattern for a pretty sweet user experience.
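As a concrete illustration of the "store the time of the last fetch" idea above, a minimal sketch, assuming the store keeps an extra lastFetchedAt field that the success handler updates (the field name and freshness window are made up for illustration):

const STALE_AFTER_MS = 5 * 60 * 1000 // refetch after five minutes

const StargazerSource = {
  fetchUsers() {
    return {
      remote(state) {
        const url = `https://api.github.com/repos/${state.user}/${state.repo}/stargazers`
        return axios({ url }).then(response => response.data)
      },
      // Time-based cache: only reuse state.users if the last fetch was recent.
      local(state) {
        const fresh = state.lastFetchedAt &&
          (Date.now() - state.lastFetchedAt) < STALE_AFTER_MS
        return fresh && state.users.length ? state.users : null
      },
      loading: StargazerActions.fetchingUsers,
      success: StargazerActions.usersReceived,
      error: StargazerActions.failed
    }
  }
}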
I wouldn't mind adding an HttpProvider as an add-on which does automatic batching of requests. Say you request URL 'foo/bar' and then another component tries to fetch from the same URL: we can fire just the one remote request and respond to both call sites when that single request comes back.
I'd also love to provide a caching add-on to automate caching of things.
Aborting requests could also be part of HttpProvider. I think we need to take control of the http request implementation in order to be able to provide these features.
Right now this API makes it flexible to build things on top of it. I'd love to hear thoughts on whether it's flexible enough.
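HttpProvider is only an idea at this point, but the request-batching part could look roughly like the sketch below, which simply shares one in-flight promise per URL (all names here are hypothetical):

const inflight = {}

function dedupedGet(url) {
  // Reuse the pending promise if a request for this URL is already in flight.
  if (inflight[url]) {
    return inflight[url]
  }

  const request = axios({ url }).then(
    response => {
      delete inflight[url] // allow fresh requests later
      return response.data
    },
    error => {
      delete inflight[url]
      throw error
    }
  )

  inflight[url] = request
  return request
}

// Two components fetching 'foo/bar' at roughly the same time would
// trigger only one network request:
// dedupedGet('foo/bar').then(renderList)
// dedupedGet('foo/bar').then(renderCounter)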
Code looks solid @goatslacker. Branch is in a weird place though, but nothing a rebase couldn't fix.
Overall I do like this, I'll definitely be able to use this in my code.
My 0.02 EUR regarding the proposed HttpProvider: please base it around
@emroot take a second look at this?
_this2[Sym.PUBLIC_METHODS][methodName] = methods[methodName];
});
>>>>>>> Update build
merge conflict FYI
Yup I'll just rerun npm build before merge.
LGTM. 👏
This is very good work!
👍 Nice work!
💝 Grateful!
@goatslacker Looks very good. Could this be used server side? I'm using the
... and still have the Thanks!
👍 This is going to help a lot of people on their first steps with Alt. I can hardly think of a scenario without async data; it's the norm.
This adds a few lines to core for dealing with asynchronous data with alt. I feel it's important enough to be in core and not in an add-on. I've included an example that I'll turn into a test.