📝 Add more examples / uses to the Concurrency section in the documentation guide of Fs2 #3437
Conversation
Added some examples of `parEvalMap` to the *Concurrency* section in the documentation guide.
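For context, the kind of example this refers to looks roughly like the following minimal sketch (the effect and names here are illustrative, not the exact text added to the guide):

```scala
import scala.concurrent.duration._
import cats.effect.IO
import fs2.Stream

// Simulate an effectful lookup.
def lookup(id: Int): IO[String] =
  IO.sleep(100.millis).as(s"user-$id")

// Run up to 3 lookups concurrently while preserving the input order.
// parEvalMapUnordered drops the ordering guarantee and emits results
// as soon as each effect completes.
val users: Stream[IO, String] =
  Stream.range(1, 10).covary[IO].parEvalMap(3)(lookup)
```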
Added reasoning about the `merge` halting variants and an example of `concurrently` using a producer-consumer demo (this example only works in Scala 3).
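As a rough idea of what such a producer-consumer demo looks like, here is a minimal sketch using a cats-effect `Queue` (assumed for illustration; not necessarily the exact code in the PR):

```scala
import cats.effect.IO
import cats.effect.std.Queue
import fs2.Stream

// The consumer stream runs the producer in the background via `concurrently`;
// when the consumer finishes, the producer is interrupted and cleaned up.
val demo: Stream[IO, Int] =
  Stream.eval(Queue.bounded[IO, Int](16)).flatMap { q =>
    val producer = Stream.range(0, 100).covary[IO].evalMap(q.offer)
    val consumer = Stream.fromQueueUnterminated(q).take(100)
    consumer.concurrently(producer)
  }
```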
Added a similar producer-consumer example for the `parJoin` method, and also mentioned the `parJoinUnbounded` variant.
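Illustratively, a `parJoin` usage could be sketched like this (not the PR's exact example):

```scala
import scala.concurrent.duration._
import cats.effect.IO
import fs2.Stream

// Three inner streams, run with at most two active at any time.
// parJoinUnbounded is the same idea with no concurrency limit.
def ticks(name: String): Stream[IO, String] =
  Stream.awakeEvery[IO](100.millis).map(_ => name).take(3)

val joined: Stream[IO, String] =
  Stream(ticks("a"), ticks("b"), ticks("c")).covary[IO].parJoin(2)
```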
As far as I know, `mdoc` does not have a way to distinguish between Scala 2 and Scala 3, so this commit rewrites the Scala 3 examples in Scala 2 syntax.
The sample paths in the example actually resolve against the project, so I placed some lorem ipsum files in the testdata/ folder.
The previous example conflicted with earlier definitions from the `concurrently` one. The new example also runs three streams at the same time to differentiate it a bit more from `merge`.
site/guide.md (outdated)

```scala
  Path("sample_file_part3.txt"),
).map(Path("testdata") / _)

def loadFile(path: Path): IO[String] =
```
I don't love this example for a couple of reasons:
- concurrency not really adding any benefit given all actions are against the file system
- unnecessarily parses the file contents to a string
- unnecessarily loads full file contents in memory
- if those downsides aren't important, then the same thing could be done entirely with `IO` and without `Stream`
Here's a full streaming version of this file concat routine that maintains a streaming nature and avoids parsing:
```scala
def concat(inPaths: Stream[IO, Path], outPath: Path): Stream[IO, Nothing] =
  inPaths.flatMap(p => Files[IO].readAll(p)).through(Files[IO].writeAll(outPath))
```
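(For anyone trying the snippet standalone: it needs the usual fs2 file imports and could be invoked roughly as below; the file names are made up for illustration.)

```scala
import cats.effect.IO
import fs2.Stream
import fs2.io.file.{Files, Path}

// Hypothetical invocation: concatenate two parts into a single output file.
val run: IO[Unit] =
  concat(Stream(Path("part1.txt"), Path("part2.txt")), Path("out.txt")).compile.drain
```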
You are right, I did it just as an excuse to maintain the order of the original stream 🤔 let me think of a better example!
Fixed a typo in the `merge` section: changed 'tree' 🌳 to 'three'. Co-authored-by: Michael Pilquist <mpilquist@gmail.com>
Changed 'optimisations' from British to American English spelling ('optimizations'). Co-authored-by: Michael Pilquist <mpilquist@gmail.com>
Removed unused imports in various examples before `mdoc:reset` Co-authored-by: Michael Pilquist <mpilquist@gmail.com>
Changed the example as suggested. Now it does not parse the bytes or read from the filesystem. Instead, it downloads some files from Project Gutenberg and creates an `InputStream` from each one of them. Co-authored-by: Michael Pilquist <mpilquist@gmail.com>
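Roughly, that approach can be pictured with a sketch like the one below (the URL and helper names are illustrative assumptions, not the exact code that landed in the guide):

```scala
import cats.effect.IO
import fs2.Stream
import java.net.URL

// Wrap each opened InputStream with fs2.io.readInputStream, which chunks the
// bytes into a Stream[IO, Byte] and closes the InputStream when done.
def fetch(url: String): Stream[IO, Byte] =
  fs2.io.readInputStream(IO(new URL(url).openStream()), chunkSize = 4096)

val book: Stream[IO, Byte] =
  Stream("https://www.gutenberg.org/files/11/11-0.txt") // illustrative URL
    .covary[IO]
    .flatMap(fetch)
```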
I sometimes forget that GitHub doesn't usually send notifications for commits pushed to PRs, so the changes are done @mpilquist!
Thanks!
Helps with #1958
What's included:
- `parEvalMap` and `parEvalMapUnordered`
- `merge` variants; it's about when they will halt or not depending on the provided streams (see the sketch after this list)
- `concurrently` example using a simple producer-consumer environment (had to use Scala 2 syntax with implicits, I didn't know how to make mdoc work with Scala 3 at the same time 🤔)
- `parJoin`
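For the `merge` halting point above, a small illustrative sketch of the variants (an assumed example, not the guide's exact wording):

```scala
import scala.concurrent.duration._
import cats.effect.IO
import fs2.Stream

val fast: Stream[IO, Int] = Stream.range(0, 3).covary[IO]
val slow: Stream[IO, Int] = Stream.awakeEvery[IO](1.second).as(-1) // never halts

// merge         halts only after *both* sides halt (here: never, slow is infinite).
// mergeHaltL    halts as soon as the left side halts.
// mergeHaltR    halts as soon as the right side halts.
// mergeHaltBoth halts as soon as *either* side halts.
val untilFastDone: Stream[IO, Int]   = fast.mergeHaltL(slow)
val untilEitherDone: Stream[IO, Int] = fast.mergeHaltBoth(slow)
```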