This repository has been archived by the owner on Sep 22, 2022. It is now read-only.

rare test060-mt-hot failures on a busy machine #92

Closed
erthink opened this issue May 29, 2016 · 3 comments


erthink commented May 29, 2016

ldap_search_ext_s(cn=All Staff,ou=Groups,dc=example,dc=com): Other (e.g., implementation specific) error (80) internal error

test060-mt-hot.log:

running defines.sh
Running slapadd to build slapd database...
Running slapindex to index slapd database...
Starting slapd on TCP/IP port 16656...
 = timeout -s SIGXCPU 30m  /home/ly/Projects/openldap.git/@ci-buzz.pool/@9.devel/src/tests/../servers/slapd/slapd -D -s0 -d stats -f /home/ly/Projects/openldap.git/@ci-buzz.pool/@9.devel/src/tests/t
Using ldapsearch to check that slapd is running (port 16656)...
Waiting 1 seconds for slapd to start...
Monitor searches
Testing basic mt-hot search: 1 threads (1 x 50000) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e cn=Monitor -m 1 -L 1 -l 50000
Testing basic mt-hot search: 5 threads (1 x 10000) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e cn=Monitor -m 5 -L 1 -l 10000
Testing basic mt-hot search: 100 threads (5 x 100) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e cn=Monitor -m 100 -L 5 -l 100
Random searches
Testing random mt-hot search: 1 threads (1 x 50000) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e dc=example,dc=com -f (objectclass=*) -m 1 -L 1 -l 50000
Testing random mt-hot search: 5 threads (1 x 10000) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e dc=example,dc=com -f (objectclass=*) -m 5 -L 1 -l 10000
Testing random mt-hot search: 100 threads (5 x 100) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e dc=example,dc=com -f (objectclass=*) -m 100 -L 5 -l 100
Multiple threads and connection searches
Testing basic mt-hot search: 5 threads 5 conns (1 x 10000) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e cn=Monitor -c 5 -m 5 -L 1 -l 10000
Testing basic mt-hot search: 50 threads 5 conns (5 x 1000) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e cn=Monitor -c 5 -m 50 -L 5 -l 1000
Testing random mt-hot search: 100 threads 5 conns (5 x 100) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e dc=example,dc=com -f (objectclass=*) -c 5 -m 100 -L 5 -l 100
Testing random mt-hot r/w search: 10 read threads 10 write threads 1 conns (5 x 100) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e dc=example,dc=com -f (&(!(cn=rwtest*))(objectclass=*)) -c 1 -m 10 -M 10 -L 5 -l 100
Testing random mt-hot r/w search: 10 read threads 10 write threads 5 conns (5 x 100) loops...
./progs/slapd-mtread -H ldap://localhost:16656/ -D cn=Manager,dc=example,dc=com -w secret -e dc=example,dc=com -f (&(!(cn=rwtest*))(objectclass=*)) -c 5 -m 10 -M 10 -L 5 -l 100
slapd-mtread failed (1)!
>>>>> waiting for things (2402) to exit... done

mtread.log:

slapd-mtread PID=19642: MT Test Start: conns: 1 (ldap://localhost:16656/)
slapd-mtread PID=19642: Threads: RO: 10 RW: 10
slapd-mtread PID=19642: RO thread 0 pass=500 fail=0
slapd-mtread PID=19642: RO thread 1 pass=500 fail=0
slapd-mtread PID=19642: RO thread 2 pass=500 fail=0
slapd-mtread PID=19642: RO thread 3 pass=500 fail=0
slapd-mtread PID=19642: RO thread 4 pass=500 fail=0
slapd-mtread PID=19642: RO thread 5 pass=500 fail=0
slapd-mtread PID=19642: RO thread 6 pass=500 fail=0
slapd-mtread PID=19642: RO thread 7 pass=500 fail=0
slapd-mtread PID=19642: RO thread 8 pass=500 fail=0
slapd-mtread PID=19642: RO thread 9 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1024 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1025 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1026 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1027 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1028 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1029 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1030 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1031 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1032 pass=500 fail=0
slapd-mtread PID=19642: RW thread 1033 pass=500 fail=0
slapd-mtread PID=19642: MT Test complete
slapd-mtread PID=19916: MT Test Start: conns: 5 (ldap://localhost:16656/)
slapd-mtread PID=19916: Threads: RO: 10 RW: 10
slapd-mtread PID=19916: ldap_search_ext_s(cn=All Staff,ou=Groups,dc=example,dc=com): Other (e.g., implementation specific) error (80) internal error
slapd-mtread PID=19916: RO thread 0 pass=500 fail=0
slapd-mtread PID=19916: RO thread 1 pass=500 fail=0
slapd-mtread PID=19916: RO thread 2 pass=499 fail=1
slapd-mtread PID=19916: FAIL RO thread 2
slapd-mtread PID=19916: RO thread 3 pass=500 fail=0
slapd-mtread PID=19916: RO thread 4 pass=500 fail=0
slapd-mtread PID=19916: RO thread 5 pass=500 fail=0
slapd-mtread PID=19916: RO thread 6 pass=500 fail=0
slapd-mtread PID=19916: RO thread 7 pass=500 fail=0
slapd-mtread PID=19916: RO thread 8 pass=500 fail=0
slapd-mtread PID=19916: RO thread 9 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1024 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1025 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1026 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1027 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1028 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1029 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1030 pass=500 fail=0
slapd-mtread PID=19916: RW thread 1031 pass=500 fail=0

slapd.log:

160529-02:41:31.934428_03528 conn=1024 op=1135 RESULT tag=105 err=0 text=
160529-02:41:31.934476_03528 conn=1024 op=1136 DEL dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.934712_03528 conn=1024 op=1136 RESULT tag=107 err=0 text=
160529-02:41:31.934758_03528 conn=1024 op=1137 ADD dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.935060_11079 conn=1024 op=1138 DEL dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.935245_03528 conn=1024 op=1137 RESULT tag=105 err=0 text=
160529-02:41:31.935311_11079 conn=1024 op=1138 RESULT tag=107 err=0 text=
160529-02:41:31.935361_09981 conn=1024 op=1139 ADD dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.935624_09981 conn=1024 op=1139 RESULT tag=105 err=0 text=
160529-02:41:31.935675_11079 conn=1024 op=1140 DEL dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.935896_11079 conn=1024 op=1140 RESULT tag=107 err=0 text=
160529-02:41:31.936228_02478 conn=1023 op=1270 RESULT tag=105 err=0 text=
160529-02:41:31.936279_02478 conn=1024 op=1141 ADD dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.936527_02478 conn=1024 op=1141 RESULT tag=105 err=0 text=
160529-02:41:31.936576_02478 conn=1023 op=1271 DEL dn="cn=rwtest1024,dc=example,dc=com"
160529-02:41:31.936815_02478 conn=1023 op=1271 RESULT tag=107 err=0 text=
160529-02:41:31.936856_02478 conn=1024 op=1142 DEL dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.937085_02478 conn=1024 op=1142 RESULT tag=107 err=0 text=
160529-02:41:31.937131_02478 conn=1023 op=1272 ADD dn="cn=rwtest1024,dc=example,dc=com"
160529-02:41:31.937393_02478 conn=1023 op=1272 RESULT tag=105 err=0 text=
160529-02:41:31.937445_02478 conn=1024 op=1143 ADD dn="cn=rwtest1029,dc=example,dc=com"
160529-02:41:31.937668_11079 conn=1023 op=1273 DEL dn="cn=rwtest1024,dc=example,dc=com"
160529-02:41:31.937842_03500 conn=1025 op=871 SEARCH RESULT tag=101 err=80 nentries=0 text=internal error
160529-02:41:31.937909_03500 conn=1025 op=872 SRCH base="cn=Barbara Jensen,ou=Information Technology Division,ou=People,dc=example,dc=com" scope=0 deref=0 filter="(objectClass=*)"
160529-02:41:31.937914_03500 conn=1025 op=872 SRCH attr=1.1
160529-02:41:31.938037_03500 conn=1025 op=872 SEARCH RESULT tag=101 err=0 nentries=1 text=
160529-02:41:31.938087_03500 conn=1025 op=873 SRCH base="cn=ITD Staff,ou=Groups,dc=example,dc=com" scope=0 deref=0 filter="(objectClass=*)"
160529-02:41:31.938092_03500 conn=1025 op=873 SRCH attr=1.1
160529-02:41:31.938197_03500 conn=1025 op=873 SEARCH RESULT tag=101 err=0 nentries=1 text=
160529-02:41:31.938245_03500 conn=1025 op=874 SRCH base="cn=Mark Elliot,ou=Alumni Association,ou=People,dc=example,dc=com" scope=0 deref=0 filter="(objectClass=*)"
160529-02:41:31.938250_03500 conn=1025 op=874 SRCH attr=1.1
160529-02:41:31.938360_03500 conn=1025 op=874 SEARCH RESULT tag=101 err=0 nentries=1 text=
160529-02:41:31.938409_03500 conn=1025 op=875 SRCH base="cn=James A Jones 1,ou=Alumni Association,ou=People,dc=example,dc=com" scope=0 deref=0 filter="(objectClass=*)"
160529-02:41:31.938414_03500 conn=1025 op=875 SRCH attr=1.1
160529-02:41:31.938518_03500 conn=1025 op=875 SEARCH RESULT tag=101 err=0 nentries=1 text=
160529-02:41:31.938567_03500 conn=1025 op=876 SRCH base="cn=Jane Doe,ou=Alumni Association,ou=People,dc=example,dc=com" scope=0 deref=0 filter="(objectClass=*)"
160529-02:41:31.938572_03500 conn=1025 op=876 SRCH attr=1.1
160529-02:41:31.938679_03500 conn=1025 op=876 SEARCH RESULT tag=101 err=0 nentries=1 text=
160529-02:41:31.938722_03500 conn=1025 op=877 SRCH base="cn=Manager,dc=example,dc=com" scope=0 deref=0 filter="(objectClass=*)"
160529-02:41:31.938727_03500 conn=1025 op=877 SRCH attr=1.1
160529-02:41:31.938825_03500 conn=1025 op=877 SEARCH RESULT tag=101 err=0 nentries=1 text=

erthink commented Aug 16, 2016

The "(32) No such object" errors no longer occur, but the "(80) internal error" still flickers.


erthink commented Aug 17, 2016

MDB_BAD_TXN (-30782) is returned from mdb_cursor_open( ltid, mdb->mi_dn2id, &mcd ) at https://github.com/ReOpen/ReOpenLDAP/blob/devel/servers/slapd/back-mdb/search.c#L493


erthink commented Feb 17, 2017

Seems to be fixed in the devel branch after commit f294a53, but more testing is needed.
