-
The paper says that one dimension of the density network output is the log-density, so I assume there is no ReLU on the density network's output. Is that right? If so, do you feed this output directly, without any activation, into the color network? It would be very helpful if you could clarify. Thanks a lot!
Replies: 1 comment
-
Yes, there's no activation on the density network's output. I did briefly try a ReLU (because I also found not having an activation a little unsatisfactory), but couldn't observe any improvement. If you'd like to try it yourself, you can uncomment line 69 of nerf_network.h:
// #define RELU_NON_DENSITY
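For anyone skimming the thread, here is a minimal sketch of what the answer describes. This is not the actual instant-ngp code: the helper names are made up, and interpreting `RELU_NON_DENSITY` as "apply a ReLU to the non-density output dimensions before they reach the color network" is an assumption based on the macro's name. The first output dimension is treated as log-density and gets no activation; `exp` converts it to a density when needed.

```cpp
#include <cmath>
#include <vector>

// Uncommenting this mimics enabling the macro discussed above
// (assumed meaning: ReLU the non-density outputs fed to the color net).
// #define RELU_NON_DENSITY

// raw[0] is the log-density; raw[1..] are the features passed on to the
// color network. Hypothetical helper names, for illustration only.
std::vector<float> prepare_color_input(const std::vector<float>& raw) {
    std::vector<float> out = raw;
#ifdef RELU_NON_DENSITY
    // Optional ReLU on the non-density dimensions (the reply above
    // reports no observed improvement from this).
    for (std::size_t i = 1; i < out.size(); ++i)
        out[i] = std::fmax(out[i], 0.0f);
#endif
    return out;  // with the macro off, the raw output passes through unchanged
}

float density(const std::vector<float>& raw) {
    // No activation on the network output itself; exp turns the
    // log-density into a non-negative volume density.
    return std::exp(raw[0]);
}
```

With the macro commented out (the default discussed here), the color network simply receives the raw, activation-free output.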