Release 0.0.2 of LiDAR tiling workflow #12
Conversation
setting up
@robyngit I have finished most of the requested changes from your review. A brief summary of what's changed:
I've decided to bump the version to 0.0.2.

What I still have yet to implement from your review:

With your approval I will move to merge this to main.
@iannesbitt the changes you made look great! 🎉 I wasn't able to test everything because I got an error during installation:

```
(viz-points) ➜ viz-points git:(develop) pip install .
Processing /Users/rtb/git/viz-points
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [29 lines of output]
Traceback (most recent call last):
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/config.py", line 564, in configure
handler = self.configure_handler(handlers[name])
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/config.py", line 745, in configure_handler
result = factory(**kwargs)
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/handlers.py", line 153, in __init__
BaseRotatingHandler.__init__(self, filename, mode, encoding=encoding,
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/handlers.py", line 58, in __init__
logging.FileHandler.__init__(self, filename, mode=mode,
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/__init__.py", line 1146, in __init__
StreamHandler.__init__(self, self._open())
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/__init__.py", line 1175, in _open
return open(self.baseFilename, self.mode, encoding=self.encoding,
FileNotFoundError: [Errno 2] No such file or directory: '/home/nesbitt/log/viz-points/pdgpoints.log'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/Users/rtb/git/viz-points/setup.py", line 2, in <module>
from pdgpoints import _version
File "/Users/rtb/git/viz-points/pdgpoints/__init__.py", line 4, in <module>
dictConfig(LOGGING_CONFIG)
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/config.py", line 809, in dictConfig
dictConfigClass(config).configure()
File "/usr/local/anaconda3/envs/viz-points/lib/python3.9/logging/config.py", line 571, in configure
raise ValueError('Unable to configure handler '
ValueError: Unable to configure handler 'debugfile'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
```

The only other comment I have at this point is super minor - I think it's more typical to have the test scripts & test data at the root level of the repo rather than inside the package directory.
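For context, the `FileNotFoundError` above comes from `pdgpoints/__init__.py` calling `dictConfig(LOGGING_CONFIG)` at import time with a rotating file handler pointing at a log directory (`/home/nesbitt/log/viz-points/`) that doesn't exist on the installing machine. One defensive pattern, as a sketch only (the path and rotation settings here are assumptions, not the package's actual values), is to derive the log path from the current user's home and create the directory before configuring:

```python
# Hypothetical sketch (not the package's actual config): build the log path
# from the current user's home directory and create it before configuring,
# so dictConfig never points at a directory that doesn't exist.
import logging.config
from pathlib import Path

LOG_DIR = Path.home() / '.pdgpoints' / 'log'   # assumed location, not from the PR
LOG_DIR.mkdir(parents=True, exist_ok=True)

LOGGING_CONFIG = {
    'version': 1,
    'handlers': {
        'debugfile': {
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': str(LOG_DIR / 'pdgpoints.log'),
            'maxBytes': 1_000_000,   # illustrative rotation settings
            'backupCount': 3,
        },
    },
    'root': {'level': 'DEBUG', 'handlers': ['debugfile']},
}

logging.config.dictConfig(LOGGING_CONFIG)
```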
@robyngit The error was an oversight on my part... I was logging in my home dir because I don't have

As for the test scripts and data, the reason I did that is because I'm pretty sure the manifest will only include files (such as the two test LAS files) when they exist inside a module. So maybe it makes sense to make a
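For what it's worth, this matches how setuptools treats data files: only files that live inside a package can be installed alongside it (declared via `package_data` or `include_package_data`), while `MANIFEST.in` only affects the source distribution. A sketch, assuming a hypothetical `pdgpoints/testdata/` directory for the two LAS files:

```python
# setup.py -- hypothetical sketch; the 'testdata' directory name and the
# distribution name are assumptions, not necessarily the repo's actual layout.
from setuptools import setup, find_packages

setup(
    name='pdgpoints',
    packages=find_packages(),
    # Data files are only installed when they live inside a package:
    package_data={'pdgpoints': ['testdata/*.las']},
)
```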
**Logging**

@iannesbitt I'm not sure of the best way to set up logging, but it does seem like an atypical pattern to require a logging config file to exist and be properly set up before the user can install or import a Python package. What do you think? Is there a way to enable users to configure the logging after the package has been installed and imported? (One common pattern is sketched at the end of this comment.)

**Tests**

I think your release is good as-is, without changing your test set up, but just to continue the discussion: if I understand your proposed solution correctly, I do think that is a more common set up. So use a file structure like the one below and then add
BTW, also not necessary right away, but you could also include other dev dependencies you're using in `extras_require`:

```python
extras_require={
    'dev': [
        'sphinx',
    ]
}
```
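On the logging question above, one pattern that avoids touching the filesystem at import time (a sketch only, not necessarily what was adopted here) is for the library to attach only a `NullHandler` and leave handler configuration to the application:

```python
# Hypothetical library-side sketch (e.g. in pdgpoints/__init__.py): attach only
# a NullHandler so importing the package never opens a log file or directory.
import logging

logging.getLogger(__name__).addHandler(logging.NullHandler())
```

The application that imports the package can then opt in to whatever logging it wants, for example:

```python
# Application-side: configure logging after the package is installed/imported.
import logging

logging.basicConfig(filename='pdgpoints.log', level=logging.DEBUG)
```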
Merging contributing guidelines into develop branch
Merging to main as all immediately relevant points have been addressed, and others have issues tracking them for a future release.
At present the workflow is contained in the `pdgpoints.pipeline.Pipeline` class, which can be executed with `Pipeline.run()`. In constructing it this way I hoped to make it easy to parallelize in the future, either with `threading` or a Ray/Slurm instance.

Certain steps of this workflow are executed with the `LAStools` binary. LAStools is multithreading-enabled but I doubt that functionality can be scaled across machines easily.

This workflow contains 5 steps:

1. `las2las` to ensure the LiDAR file (LAS, LAZ, or other major filetype) has CRS information in a WKT string
2. `lasinfo` to read the WKT string and write it to a file
3. `las2las` to rewrite the LiDAR file with a corrected header, writing the WKT string from the file to the header. Write 8-bit intensity values to 16-bit R, G, and B channels if necessary.
4. the `py3dtiles.convert._Convert` class to tile the dataset, using the WKT string from the file to input CRS info
5. `py3dtiles.merger.merge_from_files()` to merge the created tileset with existing tilesets

The test dataset is the Olympic Jumping Complex in Lake Placid, NY.
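A minimal usage sketch of the class described above (the constructor argument is an assumption; only `pdgpoints.pipeline.Pipeline` and `Pipeline.run()` are named in this description):

```python
# Hypothetical usage sketch -- the real Pipeline constructor signature may differ.
from pdgpoints.pipeline import Pipeline

p = Pipeline('path/to/lidar.las')  # input LAS/LAZ file (argument is an assumption)
p.run()  # executes the 5-step las2las / lasinfo / py3dtiles workflow above
```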