From 9ee3f881b396302897b40f6987027d40478739ca Mon Sep 17 00:00:00 2001
From: Mark
Date: Wed, 12 Sep 2018 15:43:51 +0100
Subject: [PATCH] Tried to fix the English of the first few paras (#29050)

I don't have time to do this whole document, but it could do with an edit by a native English speaker.

(cherry picked from commit 592a4745012a3d1b920fb1c95093bc5c7453323d)
---
 doc/src/manual/parallel-computing.md | 20 ++++++++++----------
 1 file changed, 10 insertions(+), 10 deletions(-)

diff --git a/doc/src/manual/parallel-computing.md b/doc/src/manual/parallel-computing.md
index 6f178ff33a9e9..11bf6669b26ce 100644
--- a/doc/src/manual/parallel-computing.md
+++ b/doc/src/manual/parallel-computing.md
@@ -7,22 +7,22 @@ the different levels of parallelism offered by Julia. We can divide them in thre
 2. Multi-Threading
 3. Multi-Core or Distributed Processing
 
-We will first consider Julia [Tasks (aka Coroutines)](@ref man-tasks) and other modules that rely on the Julia runtime library, that allow to suspend and resume computations with full control of inter-`Tasks` communication without having to manually interface with the operative system's scheduler.
-Julia also allows to communicate between `Tasks` through operations like [`wait`](@ref) and [`fetch`](@ref).
-Communication and data synchronization is managed through [`Channel`](@ref)s, which are the conduit
-that allows inter-`Tasks` communication.
+We will first consider Julia [Tasks (aka Coroutines)](@ref man-tasks) and other modules that rely on the Julia runtime library, that allow us to suspend and resume computations with full control of inter-`Tasks` communication without having to manually interface with the operating system's scheduler.
+Julia also supports communication between `Tasks` through operations like [`wait`](@ref) and [`fetch`](@ref).
+Communication and data synchronization is managed through [`Channel`](@ref)s, which are the conduits
+that provide inter-`Tasks` communication.
 
 Julia also supports experimental multi-threading, where execution is forked and an anonymous function is run across all
 threads.
-Described as a fork-join approach, parallel threads are branched off and they all have to join the Julia main thread to make serial execution continue.
+Known as the fork-join approach, parallel threads execute independently, and must ultimately be joined in Julia's main thread to allow serial execution to continue.
 Multi-threading is supported using the `Base.Threads` module that is still considered experimental, as Julia is
-not fully thread-safe yet. In particular segfaults seem to emerge for I\O operations and task switching.
-As an un up-to-date reference, keep an eye on [the issue tracker](https://github.com/JuliaLang/julia/issues?q=is%3Aopen+is%3Aissue+label%3Amultithreading).
+not yet fully thread-safe. In particular segfaults seem to occur during I\O operations and task switching.
+As an up-to-date reference, keep an eye on [the issue tracker](https://github.com/JuliaLang/julia/issues?q=is%3Aopen+is%3Aissue+label%3Amultithreading).
 Multi-Threading should only be used if you take into consideration global variables, locks and
-atomics, so we will explain it later.
+atomics, all of which are explained later.
 
-In the end we will present Julia's way to distributed and parallel computing. With scientific computing
-in mind, Julia natively implements interfaces to distribute a process through multiple cores or machines.
+In the end we will present Julia's approach to distributed and parallel computing. With scientific computing
+in mind, Julia natively implements interfaces to distribute a process across multiple cores or machines.
 Also we will mention useful external packages for distributed programming like `MPI.jl` and `DistributedArrays.jl`.
 
 # Coroutines
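
Note (not part of the patch above): the revised paragraphs describe `Tasks` communicating through `Channel`s and being waited on with `wait`/`fetch`. A minimal, illustrative Julia sketch of that pattern follows; the `producer` function and the literal values are invented for the example and are not taken from the manual page being patched.

```julia
# Illustrative only: a Task bound to a Channel (the "conduit" for inter-Task
# communication), plus fetching the result of an @async Task.
function producer(ch::Channel)
    for i in 1:5
        put!(ch, i^2)          # send values into the channel
    end
end

ch = Channel(producer)         # spawns a Task that runs producer(ch)
squares = [take!(ch) for _ in 1:5]

t = @async sum(squares)        # schedule a Task cooperatively on the current thread
println(fetch(t))              # fetch blocks until the Task finishes, then returns 55
```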