
--OjaNewton breaks with -b 24 #1592

Closed
arielf opened this issue Sep 1, 2018 · 3 comments

arielf (Collaborator) commented Sep 1, 2018

In the vw source tree:

# takes about 0.25 sec, average loss = 0.089070
vw  --OjaNewton --sketch_size=10 --alpha_inverse=1.0 -d test/train-sets/0002.dat

Same command with -b 24:

# Takes over 16 seconds, prints NaN for all examples, and average loss jumps to 0.283062
vw  -b 24 --OjaNewton --sketch_size=10 --alpha_inverse=1.0 -d test/train-sets/0002.dat
arielf (Collaborator, Author) commented Sep 1, 2018

Also noticed that -b 20 still works reasonably well (although slower than the default), and the breaking point is -b 21.

@jackgerrits added the "Bug" label (Bug in learning semantics, critical by default) on Nov 30, 2018
JohnLangford (Member) commented:

I'm not sure what's wrong here. A careful examination of the computation is required. @haipengl do you want to take a look?

yannstad (Collaborator) commented Apr 9, 2019

The independent sample r1 in the Box-Muller transform must be strictly positive, since its logarithm is taken. But it is drawn with merand48, which can return exactly 0:

float r1 = merand48(all->random_state);
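
For illustration, a minimal C++ sketch of a guarded draw under that diagnosis (not VW's actual code; rand01 is a hypothetical stand-in for merand48):

#include <cmath>
#include <random>

// Hypothetical stand-in for merand48: uniform float in [0, 1).
static float rand01(std::mt19937& rng)
{
  return std::uniform_real_distribution<float>(0.0f, 1.0f)(rng);
}

// Box-Muller draw that rejects r1 == 0.
// std::log(0.0f) is -inf, so an unguarded draw yields a non-finite
// sample that then propagates NaNs into the weights.
static float gaussian_sample(std::mt19937& rng)
{
  float r1 = rand01(rng);
  while (r1 == 0.0f)  // redraw until strictly positive
    r1 = rand01(rng);
  const float r2 = rand01(rng);
  const float two_pi = 6.2831853f;
  return std::sqrt(-2.0f * std::log(r1)) * std::cos(two_pi * r2);
}

If the Gaussian draws cover the whole weight table, a larger -b means more draws and a higher chance of hitting r1 == 0, which could explain why the failure only appears around -b 21.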
