
jigsaw solution for framing cameras and linescan cameras in same group #3369

Closed
rbeyer opened this issue Aug 7, 2019 · 20 comments
Labels: enhancement (New feature or request), Missions (Issues which are a priority for missions)

Comments

@rbeyer
Contributor

rbeyer commented Aug 7, 2019

Description
Allow images from both framing cameras and linescan cameras in the FROMLIST to result in a solution.

My understanding is that jigsaw cannot perform a single solution when both LORRI (framing camera) and MVIC (linescan) images are in the mix. It can solve for all-framing camera images, or all-linescan camera images. This results in a less-than-ideal control solution (detailed in the Schenk et al. topo papers for Pluto and Charon). There is some circumstantial evidence (old e-mails and remembered conversations from a few years ago) that this capability exists in some developmental versions of jigsaw (perhaps in some abandoned branch of IPCE), so perhaps this effort doesn’t need to start from scratch.

@jessemapel
Contributor

The BundleAdjust has the capability to use per-observation or per-instrument solution settings. Currently, this is only accessible via ipce. In the jigsaw setup dialog there is an observation settings tab where you can set these. It's a bit tedious, but it's there. I've attached a screenshot.
[Screenshot: jigsaw setup dialog showing the observation settings tab]

@jessemapel jessemapel added the enhancement New feature or request label Aug 7, 2019
@rbeyer
Contributor Author

rbeyer commented Aug 8, 2019

Are you saying that if I had a list of framing camera images and linescan camera images, I could use IPCE to process them all together and create a single bundle adjustment solution?

@jessemapel
Contributor

Yes, it may be tedious to set up if you have a few hundred images, but you can do it.

@rbeyer
Contributor Author

rbeyer commented Aug 8, 2019

So the capability exists in IPCE, but not in command-line jigsaw. Good to know. With that knowledge, I would sharpen this issue to be:

Dealing with more than one 'kind' of imager (framing camera, linescan, etc.) is possible in IPCE, but not from the command line with jigsaw. It would be nice if jigsaw also had that capability.

@jlaura jlaura added the Missions Issues which are a priority for missions label Dec 9, 2019
@jessemapel
Contributor

Here are the use cases we are considering for this:

  1. Different mission, different sensor
    1. LRO NAC & Apollo Metric
    2. MRO CTX and TGO CaSSIS
    3. Galileo SSI & Cassini ISS
  2. Same mission, different sensor
    1. MRO CTX & HiRISE
    2. New Horizons LORRI & MVIC
    3. OREx MapCam & PolyCam
  3. Same mission, same sensor
    1. Chandrayaan-1 M3, pre and post star tracking
    2. MESSENGER MDIS NAC, different filters

@jessemapel
Contributor

We're going to use a PVL config file to control this because it is far too complex for the command line interface. The config file will allow for setting the following jigsaw arguments on a per-observation basis:

  1. Pointing options
    1. CAMSOLVE
    2. TWIST
    3. CKDEGREE
    4. CKSOLVEDEGREE
    5. OVEREXISTING
    6. CAMERA_ANGLES_SIGMA
    7. CAMERA_ANGULAR_VELOCITY_SIGMA
    8. CAMERA_ANGULAR_ACCELERATION_SIGMA
    9. ADDITIONAL_CAMERA_POINTING_SIGMAS (array value for sigmas past the angular acceleration)
  2. Position options
    1. SPSOLVE
    2. SPKDEGREE
    3. SPKSOLVEDEGREE
    4. OVERHERMITE
    5. SPACECRAFT_POSITION_SIGMA
    6. SPACECRAFT_VELOCITY_SIGMA
    7. SPACECRAFT_ACCELERATION_SIGMA
    8. ADDITIONAL_SPACECRAFT_POSITION_SIGMAS (array value for sigmas past the acceleration)

This is designed to match the interface of the already-implemented BundleObservationSolveSettings object, which sets the per-observation settings inside of jigsaw.

The challenging part of this config file is determining how to specify which settings are used for which images. As a first MVP, we will have one object that contains the default settings and then additional objects that are image/observation-specific settings (a rough sketch follows below). After that, we can investigate things like checking the values/existence of keywords in the image labels or output from things like camstats.
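To make that first pass concrete, here is a rough sketch of what such a layout could look like. This is purely illustrative and not the final format (the thread later settles on per-sensor groups); the object name, the DefaultSettings group, and the Images keyword below are hypothetical:

Object = BundleSolveSettings
  # Hypothetical defaults applied to any observation without an override
  Group = DefaultSettings
    CamSolve = Angles
    Twist = yes
    SPSolve = None
    Camera_Angles_Sigma = 2.0
  EndGroup

  # Hypothetical per-observation override; Images lists the cubes it applies to
  Group = ObservationSettings
    Images = (image1.cub, image2.cub)
    CamSolve = Accelerations
    OverExisting = yes
    Camera_Angles_Sigma = 0.25
    Camera_AngularVelocity_Sigma = 0.1
    Camera_AngularAcceleration_Sigma = 0.01
  EndGroup
EndObject
End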

@lwellerastro
Contributor

@jessemapel, I'm not completely sure (because I haven't tried it yet), but I thought this capability in IPCE would allow for different jigsaw arguments in the case of different mission, different sensor, but same type of sensor - for instance, Galileo SSI (framer) and Voyager (framer). Is this scenario captured in case 1.3, Galileo SSI & Cassini ISS? I think so; I would just like confirmation. Thanks!

@lwellerastro
Contributor

lwellerastro commented Dec 16, 2019

The challenging part of this config file is determining how to specify which settings are used for which images.

I thought Ken had something already implemented in jigsaw to accept a PVL for this functionality, but never exposed it to the user by way of new parameters. I vaguely recall the possibility of having the PVL point to a list of images that each set of solve parameters would work on. It's clunky, and the whole PVL idea is a little cumbersome, but it would get the job done.

Is the old work not useful under the circumstances?

I just snuck a peek at some of Ken's old work. It looks like he is driving the process by using a SensorGroup in the PVL: /work/projects/progteam/kedmundson/jigsaw_tests/multiple_sensors/ObservationMode/LO/APOLLO_LRONAC_LO.pvl

Maybe you already know this and found all of this too outdated to use. Sorry if that is the case. Wouldn't want you to reinvent the wheel if not necessary.

@jessemapel
Contributor

@lwellerastro We found Ken's old code for this and some test data from him. It looks like the PVL interface was created but then thrown away, and a new interface was created to support the current work in IPCE.

Unfortunately, Ken's old PVL interface keys off of the instrument ID and the new IPCE interface keys off of the observation. So, we will need to add some extra code in jigsaw to convert from instrument IDs (or a more generic selection option) to observations.

@lwellerastro
Contributor

Doh! You type faster than I do - I just added to my previous comments.

Thanks for the information @jessemapel !

@jessemapel
Contributor

@jessemapel, I'm not completely sure (because I haven't tried it yet), but I thought this capability in IPCE would allow for different jigsaw arguments in the case of different mission, different sensor, but same type of sensor - for instance, Galileo SSI (framer) and Voyager (framer). Is this scenario captured in case 1.3, Galileo SSI & Cassini ISS? I think so; I would just like confirmation. Thanks!

Yes this would be supported.

@jessemapel
Contributor

After looking at Ken's work, instead of using per-image settings, we're going to go with per-sensor settings. Here's the Apollo Metric, LRO NAC, LO config file Lynn mentioned above:

# This is an example template that can be used to create the 
# sensor parameters pvl file used in jigsaw to set parameters
# separately for different image sensors. Each sensor that
# is represented by the images in the bundle adjustment must
# have a Group defined in this file along with the appropriate
# parameters for that sensor.
#
# The Group name for each sensor is formed by combining the
# SpacecraftName and the InstrumentId from the labels of images 
# containing data for that sensor. The units required for each 
# uncertainty are as follows:
#
# Spacecraft_Position_Sigma is in meters
# Spacecraft_Velocity_Sigma is in meters/second
# Spacecraft_Acceleration_Sigma is in meters/second/second
# Camera_Angles_Sigma is in decimal degrees
# Camera_Angular_Velocity_Sigma is in decimal degrees/second
# Camera_Angular_Acceleration_Sigma is in decimal degrees/second/second
#
# If an image in the cube list provided to jigsaw does not have a matching
# group in this file, then an error will occur. Every sensor represented
# by the images in the cube list must have an entry in this file with all
# required parameters set.
#
Object = SensorParameters
  Group = APOLLO15/METRIC
    CKDegree=2
    CKSolveDegree=2 
    CamSolve=Angles
    Twist=yes
    SPSolve=position
    Spacecraft_position_sigma=1000.0
    Camera_Angles_Sigma=2.0
  EndGroup
  Group = "LUNARRECONNAISSANCEORBITER/NACL"
    CamSolve=accelerations
    Twist=yes
    OverExisting=yes
    SPSolve=accelerations
    overhermite=yes
    Spacecraft_position_sigma=100.0
    Spacecraft_velocity_sigma=1.0
    Spacecraft_acceleration_sigma=0.1
    Camera_Angles_Sigma=2.0
    Camera_AngularVelocity_Sigma=0.1
    Camera_AngularAcceleration_Sigma=0.01
  EndGroup
  Group = "LUNARRECONNAISSANCEORBITER/NACR"
    CamSolve=accelerations
    Twist=yes
    OverExisting=yes
    SPSolve=accelerations
    overhermite=yes
    Spacecraft_position_sigma=100.0
    Spacecraft_velocity_sigma=1.0
    Spacecraft_acceleration_sigma=0.1
    Camera_Angles_Sigma=2.0
    Camera_AngularVelocity_Sigma=0.1
    Camera_AngularAcceleration_Sigma=0.01
  EndGroup
  Group = "LUNARORBITER5/HIGHRESOLUTIONCAMERA"
    CamSolve=angles
    Twist=yes
    Camera_Angles_Sigma=2.0
    SPSolve=position
    Spacecraft_position_sigma=1000.0
  EndGroup
EndObject
End

We could eventually allow for selections based on something other than instrument ID, since this doesn't support same-mission, same-sensor settings. That gets into adding logic, though, and that kind of thing is very challenging. See isisminer and findfeatures.

@jessemapel
Contributor

Here's some explanation of why Ken's test code was removed prior to merging:

https://github.com/USGS-Astrogeology/ISIS3/blob/dev/isis/src/control/apps/jigsaw/jigsaw.xml#L350

@jessemapel
Contributor

jessemapel commented Dec 18, 2019

@paarongiroux IPCE results to compare against: /work/users/jmapel/mutli_sensor/viking_ctx_test/ipce_test_results

I bundle adjusted /work/users/jmapel/mutli_sensor/viking_ctx_test/CTX_and_VO_measures_merged.net and /work/users/jmapel/mutli_sensor/viking_ctx_test/all_cubes.lis with the default IPCE settings, which are:

  • Do not solve for position
  • Solve for angular acceleration over existing in CTX
  • Solve for angles in Viking
  • Do not constrain pointing

Here's what the config file should look like:

Object = SensorParameters
# Viking sensor parameters
  Group = Viking1/VISA
    CamSolve=Angles
    Twist=yes
    SPSolve=None
  EndGroup
  Group = Viking1/VISB
    CamSolve=Angles
    Twist=yes
    SPSolve=None
  EndGroup

# MRO sensor parameters
  Group = "MRO/CTX"
    CamSolve=accelerations
    Twist=yes
    OverExisting=yes
    SPSolve=None
  EndGroup
EndObject
End
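
As a usage sketch, the run above might then be reproduced from the command line along these lines. The parameter that takes the sensor-parameters file (scconfig below) and the output network name are assumptions for illustration only; the thread does not spell out the final jigsaw interface:

# Hypothetical invocation; the sensor-parameters file parameter name is assumed, not confirmed in this thread
jigsaw fromlist=/work/users/jmapel/mutli_sensor/viking_ctx_test/all_cubes.lis \
       cnet=/work/users/jmapel/mutli_sensor/viking_ctx_test/CTX_and_VO_measures_merged.net \
       onet=ctx_vo_multisensor_out.net \
       scconfig=sensor_parameters.pvl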

@jessemapel
Contributor

jessemapel commented Dec 19, 2019

I've got two testing networks ready to go now. The MRO CTX and Viking network is located at /work/users/jmapel/mutli_sensor/isis_tests/seed_add_edit_reg_21_31_clean_thin.net and the Pluto network is at /work/users/jmapel/mutli_sensor/pluto/seed_add_edit_ref_reg_clean_hand_clean.net.

The Viking/CTX network is fairly good. The Pluto network is kind of meh. The MVIC image has different lighting compared to the LORRI images, so pointreg had a lot of trouble. I went in and hand-corrected all of the points. It could probably do with some additional hand-added points too, but we'll see how it bundles first.

@lwellerastro
Contributor

If you're interested, I have some test cases: one that appears to be Ken's star case involving Lunar Orbiter, Apollo Metric, and LROC NAC, and one that involves Themis IR and Viking Orbiter.

I believe the lunar case is one of Ken's test cases, but the lists in his directory (/work/projects/progteam/kedmundson/jigsaw_tests/multiple_sensors/) point to data that have been moved. Sometime in 2018 I gathered the images and network, most likely for helping on the IPCE project.

See /work/users/lweller/Isis3Tests/IPCE/Jigsaw/MultiSensor/AS15LandingSite/
multiSensor.lis and
AS15_LandingSide_Metric-NAC-LO5H-FIN.net

The images are under the same directory. This should be a good network - Tammy Becker likely created it.

It would be advisable to set observations=true for the Lunar Orbiter data. I copied one of Ken's config files into my area (APOLLO_LRONAC_LO.pvl from Ken's ObservationMode directory) so you can see how he was solving for things. Ken's test directories include a number of cases and PVLs to go with them.

The Themis IR and VO data can be found under /work/users/lweller/Isis3Tests/IPCE/Jigsaw/MultiSensor/ThmNIR_VO/Arcadia_StupidSmall/
themis_dayir_VO_arcadia.lis and
themis_dayir_VO_arcadia.net

The images are under the same directory. This is also a good network. As the name of the directory indicates, this is a small network.

My settings for Themis would be:
camsolve=accelerations twist=yes overexisting=yes
camera_angles_sigma=.25
camera_angular_velocity_sigma=.1
camera_angular_acceleration_sigma=.01

Viking Orbiter:
camsolve=angles twist=yes
camera_angles_sigma=1.0 (honestly don't know what this should be, but not as tight as Themis)

With global settings
radius=true
point_radius_sigma=50

I believe this network has some ground (constrained) points in it as well. No need to solve for spacecraft.
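
For reference, a rough translation of those settings into the SensorParameters PVL format shown earlier in this thread might look like the sketch below. The group names are assumptions: the Themis IR name is a guess at its SpacecraftName/InstrumentId, and the Viking group(s) would follow the Viking1/VISA, Viking1/VISB form from the Viking/CTX example above, with one group per camera actually present in the image list:

Object = SensorParameters
  # Assumed group name for Themis IR; verify against the image labels
  Group = "MARS_ODYSSEY/THEMIS_IR"
    CamSolve=accelerations
    Twist=yes
    OverExisting=yes
    Camera_Angles_Sigma=0.25
    Camera_AngularVelocity_Sigma=0.1
    Camera_AngularAcceleration_Sigma=0.01
  EndGroup
  # One group per Viking camera in the image list (VISB shown as an example)
  Group = Viking1/VISB
    CamSolve=angles
    Twist=yes
    Camera_Angles_Sigma=1.0
  EndGroup
EndObject
End

The radius=true and point_radius_sigma=50 settings are global jigsaw parameters rather than per-sensor settings, so they would stay on the command line.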

If you need help locating other test cases, please let me know, as there is a chance I have something or know where to find it.

@jessemapel
Contributor

@lwellerastro Thanks! That should be plenty of test cases.

@jlaura
Collaborator

jlaura commented Mar 23, 2020

@jessemapel jessemapel mentioned this issue Mar 23, 2020
@jessemapel
Contributor

This will be available in the 4.1 release.

@rbeyer
Contributor Author

rbeyer commented Mar 24, 2020

Yay! Thank you.
