
Conform tweetynet to recent PyHa refactor #113

Merged
merged 42 commits into main from TweetyNet_integrations_2_main on Jun 28, 2022

Conversation

JacobGlennAyers
Contributor

No description provided.

@mugen13lue
Contributor

For the environments, I have got it working on Windows 10 and macOS (Big Sur).

Contributor

@mugen13lue mugen13lue left a comment


Ignore everything but:

  1. the tweetynet package
  2. generate_automated_labels_tweetynet in IsoAutio.py
  3. the hope that the new spectrogram visualization will replace create_visualization_tweetynet in visualizations.py
  4. the conda environments that I added

@mugen13lue
Contributor

Dealt with the merge conflicts in IsoAutio.py and visualizations.py.

@JacobGlennAyers
Contributor Author

Right now, I see that a bool is being used to activate generate_automated_labels_tweetynet(). Instead, generate_automated_labels() should detect when the "model" key is set to "tweetynet" and call generate_automated_labels_tweetynet() itself. Then we can get rid of the tweetynet_output boolean.
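The proposed dispatch could look like this minimal sketch; the stub label functions and the exact signatures are assumptions based on this discussion, not PyHa's actual code.

```python
# Stub label generators standing in for PyHa's real functions.
def generate_automated_labels_tweetynet(audio_dir, isolation_parameters):
    return "tweetynet labels"  # placeholder

def generate_automated_labels_microfaune(audio_dir, isolation_parameters):
    return "microfaune labels"  # placeholder

def generate_automated_labels(audio_dir, isolation_parameters):
    # Dispatch on the "model" key instead of a separate tweetynet_output
    # bool; default to Microfaune for existing parameter dictionaries.
    if isolation_parameters.get("model") == "tweetynet":
        return generate_automated_labels_tweetynet(audio_dir, isolation_parameters)
    return generate_automated_labels_microfaune(audio_dir, isolation_parameters)
```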

Furthermore, local_score_visualization() was refactored by Sam and me (mostly Sam) into spectrogram_visualization(), so this branch needs to be updated with main.

Finally, we want to get rid of the TweetyNet tutorial and integrate it into the main tutorial notebook, where we just have a commented-out isolation_parameters dictionary with the "model" parameter set to "tweetynet".
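In the tutorial notebook that could look something like the dictionary below; only the "model" key is confirmed by this discussion, and the remaining keys are illustrative placeholders rather than PyHa's documented parameter set.

```python
# Hypothetical isolation_parameters for a TweetyNet run. Only the "model"
# key comes from this discussion; the other keys are placeholders. In the
# notebook this would ship commented out, ready to be uncommented.
isolation_parameters = {
    "model": "tweetynet",
    "technique": "steinberg",
    "threshold_type": "median",
    "threshold_const": 2.0,
    "window_size": 2.0,
}
```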

Then we have to update the readme documentation with all of the parameters for tweetynet.

@JacobGlennAyers
Contributor Author

I will note that we can push the necessary technical changes to tweetynet_2_main, and then update the documentation when we merge tweetynet_2_main into main.

@JacobGlennAyers
Contributor Author

In the current tweetynet_integrations branch there is a new jupyter notebook that shows off some demos from the DSC 180 tweetynet labeled dataset. When we demo tweetynet on the main PyHa Tutorial notebook, we should just focus on demoing it on the screaming piha dataset.

- removes the tweety_output bool and moves it to isolation_parameters dict
- updates local_score_visualization() function to reflect spectrogram_visualization() name change and organization
- removes TweetyNET tutorial and a handful of test files
- README documentation for TweetyNet parameters and generate_automated_labels_tweetynet()
Member

@sprestrelski sprestrelski left a comment


I separated out the TweetyNET content into a new generate_automated_labels_tweetynet function, but it's pretty similar to the Microfaune function. Mugen had originally implemented both in the same function. @JacobGlennAyers thoughts on the organization?

Also, to support the TweetyNET integration, PyTorch should probably be added to the conda environments.
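A minimal sketch of what that environment change could look like; the channel name and the absence of a version pin are assumptions, not the repo's actual env files.

```yaml
# Hypothetical addition to the conda_environments .yml files;
# channel and pinning are placeholders.
channels:
  - pytorch
dependencies:
  - pytorch
```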

If you're doing code review, please try to break the visualization function using different models and different settings. There may be some typechecking I forgot to consider.
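The kind of typechecking being asked about might be sketched like this: the visualization accepts automated_df as either a bool (generate labels internally) or a prebuilt DataFrame. The helper name and the "IN FILE" column convention are assumptions based on this thread, not PyHa's exact code.

```python
import pandas as pd

def normalize_automated_df(automated_df):
    """Return (generate_internally, dataframe_or_None) for either input type."""
    # The bool check must come first: a bare DataFrame in a boolean
    # context raises ValueError, and isinstance(True, int) is True.
    if isinstance(automated_df, bool):
        return automated_df, None
    if isinstance(automated_df, pd.DataFrame):
        # Guard against annotations from multiple clips being passed in.
        if "IN FILE" in automated_df.columns and automated_df["IN FILE"].nunique() > 1:
            raise ValueError("automated_df contains annotations from multiple clips")
        return False, automated_df
    raise TypeError("automated_df must be a bool or a pandas DataFrame")
```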

@JacobGlennAyers
Contributor Author

Yeah, Mugen set up the Windows and macOS envs. I have one for Ubuntu.

- Trimmed some unnecessary tweetynet isolation parameters
- Removed unnecessary import of torch.
@Sean1572
Contributor

Bug fixes and documentation updates for this PR are in #119.

Be sure to wait for #119 to merge with this branch before merging with main

@Sean1572 Sean1572 changed the base branch from tweetynet_integrations to main June 27, 2022 21:36
@sprestrelski sprestrelski merged commit b6b24b9 into main Jun 28, 2022
sprestrelski added a commit that referenced this pull request Jun 30, 2022
* Create artwork

* Add files via upload

* Update ver3 zoo2.svg

* Update ver3 zoo2.svg

* Delete ver3 zoo2.svg

* Delete ver3 zoo3.svg

* Delete ver3 zoo5.svg

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Create IsoAutio.svg

* Delete ver3_zoo8.svg

* Delete ver3_zoo5.svg

* Delete ver3_zoo1.svg

* Delete ver3_zoo2.svg

* Delete ver2.svg

* Delete ver1.svg

* Create PyHa

* Rename PyHa to PyHa.svg

* Delete ver3_zoo3.svg

* Delete ver3_zoo7.svg

* Delete artwork

* Created separate folder for conda environments, added Ubuntu 18.04 environment

* Removing old Ubuntu 16.04 environment

No longer needed since the last commit added the conda_environments folder

* Added in a Windows 10 environment file

Tested on one machine

* Add files via upload

Adding environment .yml file for macOS Big Sur 11.4

* Added try-except block to handle faulty wav files
- Useful when running an isolation algorithm across a large set of wave files that haven't been vetted for common problems, such as RIFX headers instead of RIFF, or wave files that were created but failed to record anything and are empty.
- I am not that experienced with error handling, but this change made the script work on a large folder of AudioMoth clips that had tons of errors
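A minimal sketch of such a guard using Python's standard-library wave module; the loop shape and function name are illustrative, not PyHa's actual generate_automated_labels loop.

```python
import os
import wave

def load_clips(audio_dir):
    """Load every readable .wav in a folder, skipping corrupt or empty files."""
    clips = {}
    for name in sorted(os.listdir(audio_dir)):
        if not name.lower().endswith(".wav"):
            continue
        path = os.path.join(audio_dir, name)
        try:
            with wave.open(path, "rb") as wav:
                frames = wav.readframes(wav.getnframes())
            if not frames:
                raise ValueError("empty recording")
        except (wave.Error, ValueError, EOFError) as exc:
            # Covers RIFX/RIFF header problems and empty, failed recordings.
            print(f"Skipping {name}: {exc}")
            continue
        clips[name] = frames
    return clips
```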

* Forgot to add continue to previous commit

* Improved Manual Labels for test set
- Annotations re-done by Jacob using Pyrenote

* Added columns for Kaleidoscope compatibility
- Channel had to be added. LABEL was changed to MANUAL ID

* Updated PyHa_Tutorial Notebook with new labels

* Converted bi_directional_jump to window_size
- This is a more standard description of what the Steinberg isolation technique deploys

* Changing bird_dir to audio_dir in IsoAutio.py
- This is part of the transition to make PyHa more general.

* Improved zero division error-handling messaging
- Specific to the matrix_IoU_Scores() function
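The shape of such a guard might look like the sketch below; safe_iou is a hypothetical helper, not the actual matrix_IoU_Scores() internals.

```python
def safe_iou(intersection, union):
    """Intersection-over-union with an explicit zero-division guard."""
    if union == 0:
        # No overlap is defined between empty annotation sets;
        # report 0 and warn rather than crash on division by zero.
        print("Warning: empty union, returning IoU of 0")
        return 0.0
    return intersection / union
```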

* Improved visualizations.py error handling
- local_score_visualization() lets you know which clip failed to receive microfaune predictions

* Fixed PyHa Tutorial Notebook
- Recent commit didn't display everything

* Improved visualizations.py error handling
- Handled more potential points of failure
- Added better descriptions of what may have gone wrong

* Improved local_score_visualizations()
- Allowed a user to insert whatever sort of pre-rendered annotations that they so desire.
- Allow them to change the words on the legend that appear corresponding to the annotations

* Demo local_score_visualization() changes

* initial linting with pep8

* Reworked audio test set.
- Decided it best to use the Creative Commons xeno-canto data
- Decided it best to stick to Screaming Piha audio to honor the package's name
- Had to update the Jupyter Notebook tutorial with the new dataset
- Deleted the audio, new update in the readme branch will link to a public drive with the relevant test set
- Replaced the old labels from the old test set with the labels for the new screaming piha dataset.

* IsoAutio.py update

* removed all whitespace and styling errors

* Address Linting Warnings
- Related to unused variables
- Fixed IsoAutio import in the visualizations.py file
- master_clips_stats_df ==> master_clip_stats_df

* Revert "Address Linting Warnings"
- Local repo wasn't properly updated, which undid important changes from the recent readme PR

This reverts commit 684a03c.

* Readme (#86)

* Update README.md

removed last remnant of old name on this branch


* isolate function

* Update README.md

* isolation_parameters

* threshold function

* return values

* Update README.md

* Update README.md

* usage

* file links

* Locations

* isolation_parameters update

* isolation_parameters update

* design

* Update README.md

* Update README.md

* steinberg_isolate

* Table of Contents

* PyHa Logo

* logo

* steinberg isolate

* update logo

* steinberg isolate

* isolation techniques

* rename variables

* generate_automated_labels

* generate_automated_labels

* kaleidoscope_conversion

* All IsoAudio.py files

* section headings

* local_line_graph

* local_score_visualization

* plot_bird_label_scores

* All visualizations.py functions

* comments

* microfaune_package notes

* annotation_duration_statistics

* bird_label_scores

* automated_labeling_statistics

* global_dataset_statistics

* clip_IOU

* matrix_IoU_Scores

* clip_catch

* global_IOU_Statistics

* dataset_Catch

* dataset_IoU_Statistics

* All statistics.py functions

* Sections

* sections

* update dataset_IoU_Statistics

* Update image

* Formatting

* Update Image

* Update installation directions

* change to numbered list

* Update Installation Directions

* Update README.md

* Examples, Remove "How it Works"

* Update README.md

* Update README.md

* isolation params in example

* Examples

* Update Graphs

* Update Style

* Credit

* Credit update

* Credits updates

* Update README.md

removed "Install Jupyter Notebook" step
This should be accomplished by the conda environment install.

* Update README.md

Adjusted Examples portion. Results were created and pushed from my Linux Mint laptop which is built off of Ubuntu 16.04

* Update Readme.md with relevant Screaming Piha Test set information

Would be smart to have someone who is new to PyHa check whether they can easily get things up and running simply by reading the readme. I also added more detail to the steps to make them more foolproof; might be overkill.

* Updated Images

Co-authored-by: Jacob <jgayers@ucsd.edu>
Co-authored-by: NathalieFranklin <69173439+NathalieFranklin@users.noreply.github.com>
Co-authored-by: Jacob <jacobthescreenwriter@gmail.com>
Co-authored-by: mugen13lue <33683103+mugen13lue@users.noreply.github.com>

* gitignore

* deleted pycaches

* Annotation Length Histograms  (#89)

* move from old branch

* update tutorial

* update

* Add parameters update to match

* update tutorial

* Updated Tutorial Notebook
Added more relevant demo of the annotation histogram

* Update README.md

* Fixed broken logo

* Update README.md (#95)

* Made resampling more general

Scipy's resample function allows for both upsampling and downsampling, made sure that this function also includes upsampling, not just downsampling.
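A sketch of general resampling with scipy.signal.resample, which handles both directions; the function name and signature here are illustrative, not PyHa's actual code.

```python
import numpy as np
from scipy import signal

def resample_signal(samples, sample_rate, target_rate):
    """Resample a 1-D signal up or down to target_rate, preserving duration."""
    # Number of output samples that keeps the clip the same length in seconds.
    n_out = int(len(samples) * target_rate / sample_rate)
    return signal.resample(samples, n_out), target_rate
```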

* Update IsoAutio.py

- Fixed a comment on the resampling portion of the code base.

* Fix "FOLDER" column Kaleidoscope compatibility

* Added extra error handling to generate_automated_labels
- Caught the case where an empty wav file is successfully loaded in
- Problem was that in these cases the script would crash whenever the signal was downsampled

* Added BirdNet Pipeline

* New Ubuntu 20.04 conda environment

- Upgrades to tensorflow 2.8.0 which is compatible with both Microfaune and BirdNET

* Fixed typo
birdnet "model" isolation parameter was misspelled as "birndet"

* Documentation update for new BirdNET-Lite functions

* Update README.md

Updated example to include the "model" isolation parameter

* local_score_visualization for non-Microfaune

currently calls generate_automated_labels on the single clip - will refactor in a different commit to handle if an already made automated_df is passed in

* local_scores ability to pass in automated_df (#109)

* Fixed new Ubuntu 20.04 conda environment placement
- Made sure it is in the conda_environments folder

* Renamed two visualization functions
- Changed local_score_visualization() to spectrogram_visualization()
- Changed plot_bird_label_scores() to binary_visualization()
- Fixed a small bug in spectrogram_visualization() related to an error warning when a dataframe with automated annotations from multiple clips is passed in

* Delete LICENSE

Moving to Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)

* Updated License

- Since we are using birdnet source code, we have to reflect their license.
- No commercial use I guess.

* Fix for visualization typechecking

- Fixed a small bug in spectrogram_visualization() to better handle both bools and dataframes for automated_df
- Added spectrogram_graph(), annotation_duration_histogram() to README

* Added Tensorflow 2.8 to windows env

* Adjusted Tutorial Notebook
- Some markdown didn't make sense talking about TPs, FPs, FNs, and TNs in the context of the spectrogram_visualization() changes

* Update License

Matching BirdNET-Analyzer instead of old BirdNET-Lite repository

* Corrected typos and deleted redundant file

* Conform tweetynet to recent PyHa refactor (#113)

* Added TweetyNet model to PyHa
* Added TweetyNet local score and spectrogram visualization output
* Added new conda environments for MacOS and Windows 10
* Added testing notebooks
* Optimized pandas use in the Steinberg isolation technique
* Updated .gitignore for cached files and testing
* Updated README, Tutorial Notebook, and documentation
* Improved error messages

Co-authored-by: mugen13lue <mugen4college@gmail.com>
Co-authored-by: Samantha Prestrelski <samantha.prestrelski@gmail.com>
Co-authored-by: Vanessa-Salgado <vsalgadozavaleta@ucsb.edu>
Co-authored-by: Sean Perry <shperry@ucsd.edu>

* .gitignore fixed and removed notebooks (#120)

* remove notebooks (#121)

* .gitignore fixed and removed notebooks

* remove notebooks

* Label chunker (#114)

* Added label chunker
* Added documentation for the annotation_chunker
Co-authored-by: shreyasar2202 <ars.shreyas@gmail.com>
Co-authored-by: Sean Perry <shperry@ucsd.edu>
Co-authored-by: Samantha Prestrelski <samantha.prestrelski@gmail.com>

Co-authored-by: NathalieFranklin <69173439+NathalieFranklin@users.noreply.github.com>
Co-authored-by: Jacob <jgayers@ucsd.edu>
Co-authored-by: Jacob <jacobthescreenwriter@gmail.com>
Co-authored-by: mugen13lue <33683103+mugen13lue@users.noreply.github.com>
Co-authored-by: Nishant Balaji <nishantb1130@gmail.com>
Co-authored-by: Nishant Balaji <44332326+nishantbalaji@users.noreply.github.com>
Co-authored-by: Gabriel Steinberg <gsteinb1@binghamton.edu>
Co-authored-by: shreyasar2202 <ars.shreyas@gmail.com>
Co-authored-by: Sean Perry <sean.hyatt.perry@gmail.com>
Co-authored-by: mugen13lue <mugen4college@gmail.com>
Co-authored-by: Vanessa-Salgado <vsalgadozavaleta@ucsb.edu>
Co-authored-by: Sean Perry <shperry@ucsd.edu>
@sprestrelski sprestrelski deleted the TweetyNet_integrations_2_main branch July 5, 2023 18:20

Successfully merging this pull request may close these issues:

- TweetyNET Pandas functionality
- Add TweetyNET pipeline to PyHa
- Update Conda Environments with Seaborn