You could save the object to an RDS file, then use tar_file() to track the file contents and read it into your pipeline. If the contents of the RDS file changed for any reason, that would trigger the downstream steps that depend on it. But AFAIK {targets} has no way of knowing how changes upstream of the model object would affect it, other than completely redoing the workflow in {targets}.
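A minimal sketch of what that could look like in `_targets.R`, assuming the precomputed object was saved with `saveRDS()` to a hypothetical file `fits.rds` (the target and file names here are illustrative, not from the original post; `tar_file()` comes from the {tarchetypes} package):

```r
# _targets.R -- sketch of reusing an externally generated object.
library(targets)
library(tarchetypes)

list(
  # Track the RDS file itself; this target is invalidated only if the
  # file's contents change on disk.
  tar_file(fits_file, "fits.rds"),

  # Read the object back in. Downstream targets depend on `fits`, so they
  # rerun only when the tracked file (and hence `fits`) changes.
  tar_target(fits, readRDS(fits_file)),

  # Hypothetical downstream analysis using the precomputed object.
  tar_target(fits_summary, summary(fits))
)
```

Equivalently, you can skip {tarchetypes} and write `tar_target(fits_file, "fits.rds", format = "file")`; `tar_file()` is a shorthand for that pattern.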
Description
In the process of learning how to work with {targets}, I decided to convert an existing ongoing project. I've learned how to do it from scratch, but I have the following "problem": there is a computationally demanding step that simulates some models, which originally took about a week to complete. The object produced by this step is used in downstream analyses. I would like to avoid recomputing this object and just reuse it. Is it somehow possible to set up the targets pipeline so that it considers some steps completed and reuses the (externally generated) object as if it were produced by a step in the pipeline? I hope my question makes sense.