We have hit the same problem a few times at LLR when analyzing ROIs that contain the Crab Nebula. I understood it to be due to the fact that the Crab is modelled by multiple components, one of which is fixed in the catalog, so no `TS_value` is given for it and the code dies. The same problem can also occur for sources that don't have a `Variability_Index`.
We ended up fixing it by changing the code to test explicitly that these two attributes aren't empty strings before trying to convert them to float:
```python
# evaluate variability and significance threshold info
self.sources[name].update([
    ('variable', source.getAttribute('Variability_Index') == ""
        or float(source.getAttribute('Variability_Index')) >= self.variability_threshold),
    ('significant', source.getAttribute('TS_value') == ""
        or float(source.getAttribute('TS_value')) >= self.sigma_to_free),
])
```
We default to keeping them in the model if the attribute isn't present.
Okay, it took me a little longer to find time to address this issue, but that is now done with release v1.10.11 (on PyPI and created as a tagged release on GitHub). I did this by creating a function, `LATSourceModel.utilities.float_conversion`, to make things a bit more compact. I tested building a region centered near the Crab, using the XML catalog, and was able to make the model successfully.
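For illustration, here is a minimal sketch of what such a conversion helper might look like; this is an assumption, not the actual implementation of `LATSourceModel.utilities.float_conversion`. Returning `inf` for a missing or empty attribute makes any `>= threshold` comparison come out `True`, so the source defaults to being kept in the model, matching the behavior described above:

```python
def float_conversion(value, default=float("inf")):
    """Convert an XML attribute string to float.

    Hypothetical sketch: falls back to `default` (here +inf) when the
    attribute is missing (None) or an empty string, so that threshold
    comparisons keep the source in the model by default.
    """
    try:
        return float(value)
    except (TypeError, ValueError):
        return default

# An attribute that is present converts normally; a missing one
# compares as True against any finite threshold.
print(float_conversion("25.3") >= 16.0)  # True
print(float_conversion("") >= 16.0)      # True (kept by default)
```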
Originally posted by @sfegan in #68 (comment)