…as a static website starting at the site's root. TESTED=Basic unit test ensuring relevant routes are added.
…ests for extracted client pages that match a regular expression
Also clarifies some texts and documentation.
Much cleaner! I was never too comfortable with the previous approach. I just tried this on a personal site I'm building that's closely coupled in style to the template project, and I'm getting this error:

Any immediate thoughts? (This is after 14 other files seem to be successfully built.)
That's very odd — does …?
No, it doesn't, but I must say I'm relatively new to debugging node packages with GitHub, and I'm sure I installed this commit wrong. Sorry to pollute this thread, but what's the best practice for pulling this change into an actual Sapper project? I went into the …
I changed the dependency in my `package.json`.
Ah, I see — yeah, that would fail. The way I'm doing it is by running …
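For anyone else wondering how to try an unreleased commit: one common approach (a sketch, not necessarily what was done in this thread; the commit SHA is a placeholder) is to install the dependency straight from GitHub, or to `npm link` a local clone:

```shell
# Install a specific commit of the package directly from GitHub
npm install --save-dev sveltejs/sapper#<commit-sha>

# Or: hack on a local clone and link it into your app
# (the package may also need its own build step before linking)
git clone https://github.com/sveltejs/sapper.git
cd sapper
npm install
npm link
cd ../my-app
npm link sapper
```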
EDIT: I think it may be creating a file with the same name as a directory, and that's preventing further files in that subdirectory from being created.

Yep, just got it working by manually merging changes from this commit. Now I'm facing new issues: it seems not all routes are generated in my project, and I'm getting some 404s on my site.
Any thoughts on files that have the same name as directories? That's fine on servers, but not on most filesystems. For example, say I have the following routes:

Generating these files locally results in errors because …

This is easily fixable by adding …
Yeah, I ran into the directories issue as well. For pages, adding … It would be easy enough to determine that … One other possible idea — probably a terrible one — would be to generate a map that worked like so:

```js
const route_map = {
  '/blog': '/blog_xyz123'
};

const _fetch = window.fetch;
window.fetch = (url, opts) => {
  url = new URL(url, window.location.href);
  if (url.pathname in route_map) return _fetch(route_map[url.pathname], opts);
  return _fetch(url, opts);
};
```

Glossing over some details (query string parameters etc) but you get the gist.
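Here's a standalone version of that idea that can be exercised outside the browser (the `route_map` entries are the same illustrative values; the `rewrite` helper and the base URL are mine, not part of the PR). It also preserves the query string, one of the details glossed over above:

```js
const route_map = {
  '/blog': '/blog_xyz123'
};

// Resolve a fetch-style URL, swap the pathname if it's in the map,
// and keep the query string intact.
function rewrite(url, base = 'https://example.com/') {
  const u = new URL(url, base);
  if (u.pathname in route_map) u.pathname = route_map[u.pathname];
  return u.pathname + u.search;
}

console.log(rewrite('/blog?page=2')); // '/blog_xyz123?page=2'
console.log(rewrite('/about'));       // '/about'
```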
Interesting. Can you get to those pages from …?
This feature isn't done, but I'm going to release it as 'experimental' in 0.5 so that we can start properly dogfooding, and iterating from there. Many thanks @freedmand for outlining the solution and getting the ball rolling!
This builds on top of @freedmand's PR #66. The main difference is that we're just monkey-patching `fetch` to reify server routes like `/api/blog/whatever`, rather than making any assumptions about project structure or adding configuration.

Rather than running everything through node-spider, we just copy `assets` and the generated files, and then visit `/` (TODO: visit all static routes). This is quite a crude approach, but it works well with the test app. Lots of tasks remaining:

Also, semi-relatedly, the CLI will need a bit of work (`sapper --help` etc — also, it would be nice to be able to configure the `dist` folder, perhaps).

I think we might be better off using `sapper export` than `sapper extract`, so that people familiar with the concept from Next.js know what to expect — what do you reckon? Or `sapper generate`, to align with Nuxt.js.