
Script with batched requests fails with "fatal error: concurrent map writes" #770

Closed
pdkovacs opened this issue Sep 16, 2018 · 3 comments · Fixed by #771
pdkovacs commented Sep 16, 2018
The attached script (create-icon-and-refresh.js.zip) fails with "fatal error: concurrent map writes" when executed with --vus 5 --duration 60s and its requests issued via http.batch; see the attached log: k6-log.txt.

The issue can be consistently reproduced both on Mac OS 10.13.6 and on Ubuntu 18.04.

A version of the script modified to execute the requests sequentially completes successfully: create-icon-and-refresh-non-batched.js.zip

@pdkovacs pdkovacs changed the title "concurrent map writes" batched requests Script with batched requests fails with "fatal error: concurrent map writes" Sep 16, 2018
na-- (Member) commented Sep 17, 2018

Thanks for reporting this; I managed to reproduce it semi-consistently, even with a much simpler script:

import http from "k6/http";
import { sleep } from "k6";

export let options = {
    vus: 5,
    iterations: 10,
    batchPerHost: 6
};

export default function () {
    let reqs = [];
    for (let i = 0; i < 30; i++) {
        reqs.push({ method: "GET", url: `https://test.loadimpact.com/?req=${i}` });
    }
    let resps = http.batch(reqs);
    sleep(2);
}

The data race is in the limiter used for the batchPerHost option. Compiling k6 with -race (to enable the Go race detector) and running the script again produced a more consistent panic and a slightly better error message:

WARNING: DATA RACE
Read at 0x00c422ee96b0 by goroutine 40:
  runtime.mapaccess2_faststr()
      /usr/lib/go/src/runtime/hashmap_fast.go:261 +0x0
  github.com/loadimpact/k6/js/modules/k6/http.(*MultiSlotLimiter).Slot()
      /rw/home/go/src/github.com/loadimpact/k6/js/modules/k6/http/limiter.go:64 +0x97
  github.com/loadimpact/k6/js/modules/k6/http.(*HTTP).Batch.func2()
      /rw/home/go/src/github.com/loadimpact/k6/js/modules/k6/http/http_request.go:752 +0xe3

Previous write at 0x00c422ee96b0 by goroutine 39:
  runtime.mapassign_faststr()
      /usr/lib/go/src/runtime/hashmap_fast.go:694 +0x0
  github.com/loadimpact/k6/js/modules/k6/http.(*MultiSlotLimiter).Slot()
      /rw/home/go/src/github.com/loadimpact/k6/js/modules/k6/http/limiter.go:68 +0x197
  github.com/loadimpact/k6/js/modules/k6/http.(*HTTP).Batch.func2()
      /rw/home/go/src/github.com/loadimpact/k6/js/modules/k6/http/http_request.go:752 +0xe3
...

The actual issue was pretty obvious once I looked at the code, so I'll try to get it fixed today. The only open question is how much a RWMutex there would affect the performance of the surrounding batch code that uses the limiter, and whether we need to do some more serious refactoring to address this...
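To illustrate the class of bug being discussed (not k6's actual code; the names and structure below are a simplified sketch), the race comes from lazily populating a shared map of per-host slot limiters inside Slot() without synchronization, so concurrent http.batch goroutines can read and write the map at the same time. A mutex around the check-then-insert sequence, as considered above, fixes it:

```go
package main

import (
	"fmt"
	"sync"
)

// SlotLimiter is a counting semaphore: sending acquires a slot, receiving releases it.
type SlotLimiter chan struct{}

// MultiSlotLimiter hands out a per-host SlotLimiter, creating each one lazily.
// This mirrors the batchPerHost limiter's shape, but is only an illustration.
type MultiSlotLimiter struct {
	m   map[string]SlotLimiter
	n   int
	mtx sync.Mutex // the fix: guards the lazy map read/write below
}

func NewMultiSlotLimiter(n int) *MultiSlotLimiter {
	return &MultiSlotLimiter{m: make(map[string]SlotLimiter), n: n}
}

// Slot returns the limiter for host, creating it on first use. The mutex makes
// the check-then-insert atomic; doing this map access unguarded from multiple
// goroutines is exactly what triggers "fatal error: concurrent map writes".
func (l *MultiSlotLimiter) Slot(host string) SlotLimiter {
	l.mtx.Lock()
	defer l.mtx.Unlock()
	s, ok := l.m[host]
	if !ok {
		s = make(SlotLimiter, l.n)
		l.m[host] = s
	}
	return s
}

func main() {
	limiter := NewMultiSlotLimiter(6)
	var wg sync.WaitGroup
	// Simulate 30 concurrent batched requests spread over 3 hosts.
	for i := 0; i < 30; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			slot := limiter.Slot(fmt.Sprintf("host-%d", i%3))
			slot <- struct{}{} // acquire a slot
			<-slot             // release it
		}(i)
	}
	wg.Wait()
	fmt.Println(len(limiter.m)) // → 3 (one limiter per distinct host)
}
```

Whether a plain Mutex, an RWMutex, or a sync.Map is the better trade-off depends on how hot the read path is relative to the rare first-use writes, which is the performance question raised above.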

na-- added a commit that referenced this issue Sep 17, 2018
@na-- na-- closed this as completed in #771 Sep 18, 2018
na-- added a commit that referenced this issue Sep 18, 2018
Fix, refactor, document and test the slot limiters and update the release notes

This fixes #770
pdkovacs (Author) commented Oct 6, 2018

Thanks for the fix. It works for my use case as well.

When will it be released?

na-- (Member) commented Oct 8, 2018

This will probably be part of the 1.0 release, sometime in the next several weeks. You can use the latest k6 docker image (or build k6 from source yourself) if you want to use it before that.
