
[REVIEW]: linkinglines: Using the Hough Transform to Cluster Line Segments and for Mesoscale Feature Extraction #6147

Closed
editorialbot opened this issue Dec 14, 2023 · 77 comments
Labels: accepted, Jupyter Notebook, published (Papers published in JOSS), Python, recommend-accept (Papers recommended for acceptance in JOSS), review, TeX, Track: 6 (ESE) Earth Sciences and Ecology

@editorialbot

editorialbot commented Dec 14, 2023

Submitting author: @aikubo (Allison Kubo Hutchison)
Repository: https://github.com/aikubo/LinkingLines
Branch with paper.md (empty if default branch):
Version: v2.1.2
Editor: @hugoledoux
Reviewers: @evetion, @nialov
Archive: 10.5281/zenodo.11321509

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/64eeef828a1100bfba74052d89314758"><img src="https://joss.theoj.org/papers/64eeef828a1100bfba74052d89314758/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/64eeef828a1100bfba74052d89314758/status.svg)](https://joss.theoj.org/papers/64eeef828a1100bfba74052d89314758)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@evetion & @nialov, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @hugoledoux know.

Please start on your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Checklists

📝 Checklist for @evetion

📝 Checklist for @nialov

@editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot

Software report:

github.com/AlDanial/cloc v 1.88  T=0.11 s (531.1 files/s, 153376.7 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
HTML                             9            500              0           4422
Python                          18           1547           2246           2604
CSS                              6            370             74           1559
SVG                              4            118              2            988
JavaScript                       6            124            188            785
Markdown                         2             75              0            150
reStructuredText                 3             58             66             78
TeX                              1              0              0             64
YAML                             3              8              4             62
Jupyter Notebook                 3            157            632             57
TOML                             2              6              1             40
DOS Batch                        1              8              1             26
make                             1              4              7              9
-------------------------------------------------------------------------------
SUM:                            59           2975           3221          10844
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot

Wordcount for paper.md is 2145

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- None

MISSING DOIs

- 10.1016/0031-3203(81)90009-1 may be a valid DOI for title: Generalizing the Hough transform to detect arbitrary shapes
- 10.25080/majora-92bf1922-00a may be a valid DOI for title: Data structures for statistical computing in python
- 10.22541/essoar.167397469.91853825/v1 may be a valid DOI for title: Multiscale spatial patterns in giant dike swarms identified through objective feature extraction

INVALID DOIs

- None

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@evetion

evetion commented Dec 14, 2023

Review checklist for @evetion

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the https://github.com/aikubo/LinkingLines?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@aikubo) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@nialov

nialov commented Dec 20, 2023

Planning to start the review next week or at the start of the year; sorry for the delay!

@evetion

evetion commented Dec 22, 2023

Thanks for the submission; it's an interesting read and package! I have completed my review, and in my opinion it requires major revisions.

My feedback follows below, per item that I can't yet check. I've also made several suggestions in issues on the repository.

Let me know if you have questions.

General checks

Contribution and authorship

The code seems to have a sole author, as stated in the pyproject.toml and on GitHub. Can you explain what the roles of your co-authors in the paper are? Also, it seems to me that your documentation is partially written by an LLM, as indicated by https://github.com/aikubo/LinkingLines/blob/master/src/linkinglines/FitRadialCenters.py#L8. There's no rule against the use of LLMs, but it might be good to note its use somewhere in your documentation.

Substantial scholarly effort

I'm in doubt whether this is a substantial scholarly effort. LoC (2825) is a bad estimator; I'm mostly looking at the potential for re-use. The code has been developed for specific (published 🎉) research, but it is not generalized. I've made various suggestions in aikubo/LinkingLines#20 to improve it.

Documentation

Functionality documentation

It seems all the core methods are documented, but as described in aikubo/LinkingLines#18, there are linking errors and the examples don't render nicely.

Automated tests

There are tests 👍🏻, but AFAIK these are not automated and there are otherwise no instructions on running them.

Community guidelines

There are no clear guidelines for third parties wishing to contribute. The use of GitHub issues for issues, problems, and support is implicit.

Paper

Summary

The summary is great; my only remark is to introduce your research domain, as "giant dike swarms" is not a term a diverse, non-specialist audience will understand.

A statement of need

While the paper contains a statement of need section, it mostly lists generic features, not a precise description of the gap it's trying to fill. For example, it states “simplify extraction”, but the code in question doesn't extract them.

It also does not address who the target audience is, or its relation to other work.

State of the field

The paper does not address the current state of the field.

Quality of writing

The writing is fine, but could use a careful reread, as some sentences are incomplete, such as l10 (sentence should probably be broken up) and l130 (cluster fall).
Furthermore, like the summary, large parts assume specific domain knowledge. For example, l51-63 mostly contain results from another paper. Ideally this part is rewritten (not removed), to reflect the original use of the code and its impact, but without the domain details.

References

According to the bot post above, there are missing DOIs in your paper. There's also a missing citation key (search survey?).

@nialov

nialov commented Dec 26, 2023

Review checklist for @nialov

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the https://github.com/aikubo/LinkingLines?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@aikubo) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@nialov

nialov commented Dec 28, 2023

Thank you for the submission. I believe the functionality is very interesting from a geological perspective, and I can definitely see it being used outside the main documented use case of dyke swarm clustering, for example for examining bedrock fracture data. Therefore, publishing the software in JOSS to allow reuse of the methodology is great!

However, it is difficult to assess the software's functionality, as neither the tests nor the documentation (notebook) runs correctly due to import issues. This is a major issue that needs to be fixed before I can assess the suitability of the package for JOSS. A major revision is therefore required.

Further issues are outlined below based on the checklist.

General checks

Contribution and authorship

The contributions of authors other than the main author are not explicitly stated in the paper, and they cannot be seen contributing to the repository on GitHub. I would like a short explanation of their roles so that I know the contribution is not purely financial or organisational, in which case those authors should not be included.

See: https://joss.readthedocs.io/en/latest/review_criteria.html#authorship

Substantial scholarly effort

I believe the scholarly effort in terms of geological method development is sufficient but this effort has already been (partly) published as part of https://doi.org/10.1029/2022GC010842.

As mentioned by evetion, the effort in terms of software is lacking, as the described workflow is too specialized, working only with CSV files that follow a strict schema. I agree with the changes proposed by evetion in aikubo/LinkingLines#20 and especially recommend trying geopandas for reading the input files into GeoDataFrames.
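For illustration, here is a minimal sketch of what reading generic line data with geopandas could look like (the file name and filtering below are hypothetical, not part of the current linkinglines API):

# Hypothetical sketch: load line geometries with geopandas instead of a fixed CSV schema.
import geopandas as gpd

# Read any OGR-supported vector format (shapefile, GeoPackage, GeoJSON, ...).
gdf = gpd.read_file("fractures.gpkg")
gdf = gdf[gdf.geometry.geom_type == "LineString"]

# Extract segment endpoints from each LineString for downstream Hough-space clustering.
endpoints = [(geom.coords[0], geom.coords[-1]) for geom in gdf.geometry]

A GeoDataFrame would also carry the coordinate reference system along, which a plain CSV cannot.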

Data sharing

Data is shared but not used in the tests or documentation (notebook).

Functionality

Functionality

I cannot confirm the functionality with the tests and notebook because they do not work.

Documentation

Installation instructions

I would document the development dependencies, i.e. pytest, in pyproject.toml.

aikubo/LinkingLines#22

Example usage

The example/recreation of the dyke analysis is great! However, to generalise the software, I would include at least one example with other data, such as fractures.

aikubo/LinkingLines#25

Functionality documentation

All functions are documented with inputs, outputs and examples given.

However, I would expand the Quick Start section in the README.md a bit, make sure the included notebook works, and link to it clearly in the documentation so that users will find it.

Automated tests

Tests are not automated or documented properly.

Community guidelines

I suggest adding a CONTRIBUTING.md to let potential contributors know whether e.g. pull requests are accepted, what kind of issues should be posted, etc. It does not have to be long, but having no information on future development activity is discouraging to potential contributors.

See e.g. https://github.com/nialov/fractopo/blob/master/CONTRIBUTING.rst

Software paper

The Example Code Usage section is not needed; code examples belong in the repository documentation, not in the JOSS manuscript.

The Code Structure section is overly specific. The first paragraph is fine. The Hough transform can be introduced, but the equations are not required for the article, as they do not seem to be novel to this software/methodology; citations to established papers with the equations and concepts are better. Details such as column names of pandas DataFrames are overly specific and need to be removed.

Summary

The summary makes it seem like the software is readily usable for a wide variety of domains. I would suggest focusing the paper on the target audience of geoscientists for whom the software is designed.

As the summary is mainly for a non-specialist audience, I would explain or give examples of geoscientific concepts such as "linear feature analysis" and "unique feature extraction methods" immediately after introducing them, when possible.

I would remove/revise the sentence on lines 12 to 15 as it is too general to be useful for the reader.

A statement of need

I would revise the statement of need section to start with the specific need in the geological domain in relation to cluster analysis of dykes (L50-L63). Then, at the end of the section, I would introduce/list the other possible use cases.

I find the current lines 20-49 too unspecific and general; they do not add any knowledge of value to the reader with regard to this specific software. Parts of these lines can be reused at the end of the section, if wanted, but otherwise I would remove most of the text in this range.

State of the field

The geological context is introduced, but I would suggest adding context on GIS tools that might already conduct similar analyses, if they exist. If similar software is not found, I would at least point to some packages for running specific GIS analyses (with Python) on (geological) data, to better show how linkinglines fits into the geoscientific field.

See e.g. NetworkGT https://github.com/BjornNyberg/NetworkGT or fractopo https://github.com/nialov/fractopo.

Quality of writing

Some parts are overly general and the paper could be more concise and focused on the specific issues that the software is trying to solve.

References

There are missing references and links, see review by evetion above.

Further comments

Current issues and pull requests at the time of this review:

The pull requests are just suggested changes; feel free to improve on them, comment, or implement the changes yourself!

@aikubo

aikubo commented Dec 31, 2023

Thank you @evetion and @nialov for the reviews!
I agree with and appreciate your comments. I will get to work generalizing the code.

On the subject of authorship, my currently listed coauthors helped with the conceptualization and theory of the code and did a little of the writing, but I ended up rewriting most of it, and I prepared this paper based on our published results paper.

@kthyng

kthyng commented Feb 13, 2024

Hi everyone! How is this review coming along?

@evetion

evetion commented Feb 13, 2024

Hi everyone! How is this review coming along?

We're still waiting on changes in the code by @aikubo

@aikubo

aikubo commented Feb 14, 2024

Hello all, thank you for your patience. I will try to get to the reviews next week.

@aikubo

aikubo commented May 26, 2024

  • [x] Make a tagged release of your software, and list the version tag of the archived version here.

v2.1.2
tag here: https://github.com/aikubo/LinkingLines/releases/tag/2.1.2

  • [x] Archive the reviewed software in Zenodo or a similar service (e.g., figshare, an institutional repository)

https://zenodo.org/records/10887984

  • [x] Check the archival deposit (e.g., in Zenodo) has the correct metadata. This includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it). You may also add the authors' ORCID.
  • [x] Please list the DOI of the archived version here.

10.5281/zenodo.10887984

Thank you @nialov and @evetion for the excellent reviews! Thank you all for bearing with me on this long review.

@hugoledoux

Your Zenodo deposit has the wrong metadata; make sure to do this:

Check the archival deposit (e.g., in Zenodo) has the correct metadata. This includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it). You may also add the authors' ORCID.

@aikubo

aikubo commented May 27, 2024

Sorry, the link above was for a previous version. Here's the link to the current one:
https://zenodo.org/records/11321509

The DOI is of course the same.

@hugoledoux

Hmmm, no, the DOI is different, but okay, I have the DOI.

However:

  1. Your name is different on the paper and on Zenodo?! It has to be the same.
  2. The order of the authors is also different.

You can fix both in the metadata without making a new release I reckon...

@aikubo

aikubo commented May 28, 2024

My bad. Sorry.

https://zenodo.org/records/11321509

@hugoledoux

@editorialbot set 10.5281/zenodo.11321509 as archive

@editorialbot

Done! archive is now 10.5281/zenodo.11321509

@hugoledoux

@editorialbot set v2.1.2 as version

@editorialbot

Done! version is now v2.1.2

@hugoledoux

@editorialbot recommend-accept

@editorialbot

Attempting dry run of processing paper acceptance...

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1145/361237.361242 is OK
- 10.1016/0031-3203(81)90009-1 is OK
- 10.25080/Majora-92bf1922-00a is OK
- 10.1016/j.patcog.2014.08.027 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1029/2022GC010842 is OK
- 10.1130/GES02173.1 is OK
- 10.3133/sim3121 is OK
- 10.21105/joss.05300 is OK
- 10.5194/se-14-603-2023 is OK
- 10.1016/0012-8252(95)00017-5 is OK
- 10.5281/zenodo.3946761 is OK
- 10.1029/2022JB026310 is OK
- 10.7717/peerj.453 is OK
- 10.21105/joss.02826 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot

👋 @openjournals/ese-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#5402, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@editorialbot added the recommend-accept label on May 29, 2024
@kthyng

kthyng commented May 29, 2024

Hi! I'll take over now as Track Associate Editor in Chief to do some final submission editing checks. After these checks are complete, I will publish your submission!

  • Are checklists all checked off?
  • Check that version was updated and make sure the version from JOSS matches github and Zenodo.
  • Check that software archive exists, has been input to JOSS, and title and author list match JOSS paper (or purposefully do not).
  • Check paper.

@kthyng

kthyng commented May 29, 2024

@aikubo Please check the capitalization in your references. You can preserve capitalization by placing {} around characters/words in your .bib file. In particular, "Hough" isn't capitalized, but please check all entries.
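For example, a hypothetical .bib entry illustrating the brace technique (not taken from your actual bibliography):

% Hypothetical entry: braces around {Hough} stop BibTeX styles from lowercasing it.
@article{doe2024hough,
  title   = {Using the {Hough} transform to cluster line segments},
  author  = {Doe, Jane},
  journal = {Example Journal},
  year    = {2024},
}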

@aikubo

aikubo commented Jun 7, 2024

Fixed capitalization:
aikubo/LinkingLines#38

Thanks for your patience; I've been prepping for my defense.

@kthyng

kthyng commented Jun 7, 2024

@editorialbot generate pdf

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@kthyng

kthyng commented Jun 7, 2024

@editorialbot accept

@editorialbot

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

cff-version: "1.2.0"
authors:
- family-names: Hutchison
  given-names: Allison Kubo
  orcid: "https://orcid.org/0000-0002-1378-361X"
- family-names: Karlstrom
  given-names: Leif
  orcid: "https://orcid.org/0000-0002-2197-2349"
- family-names: Mittal
  given-names: Tushar
  orcid: "https://orcid.org/0000-0002-8026-0018"
doi: 10.5281/zenodo.11321509
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Hutchison
    given-names: Allison Kubo
    orcid: "https://orcid.org/0000-0002-1378-361X"
  - family-names: Karlstrom
    given-names: Leif
    orcid: "https://orcid.org/0000-0002-2197-2349"
  - family-names: Mittal
    given-names: Tushar
    orcid: "https://orcid.org/0000-0002-8026-0018"
  date-published: 2024-06-07
  doi: 10.21105/joss.06147
  issn: 2475-9066
  issue: 98
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 6147
  title: "LinkingLines: Using the Hough Transform to Cluster Line
    Segments and Mesoscale Feature Extraction"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.06147"
  volume: 9
title: "LinkingLines: Using the Hough Transform to Cluster Line Segments
  and Mesoscale Feature Extraction"

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.

@editorialbot

🐘🐘🐘 👉 Toot for this paper 👈 🐘🐘🐘

@editorialbot

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.06147 joss-papers#5467
  2. Wait five minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.06147
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@editorialbot added the accepted and published labels on Jun 7, 2024
@kthyng

kthyng commented Jun 7, 2024

Congratulations on your new publication @aikubo, and good luck on your defense!! Many thanks to @hugoledoux and to reviewers @evetion and @nialov for your time, hard work, and expertise!! JOSS wouldn't be able to function nor succeed without your efforts.

@kthyng closed this as completed on Jun 7, 2024
@editorialbot

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.06147/status.svg)](https://doi.org/10.21105/joss.06147)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.06147">
  <img src="https://joss.theoj.org/papers/10.21105/joss.06147/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.06147/status.svg
   :target: https://doi.org/10.21105/joss.06147

This is how it will look in your documentation:

[DOI badge]

We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
