
remove notebooks #121

Merged 2 commits into UCSD-E4E:main on Jun 28, 2022

Conversation

sprestrelski (Member)

No description provided.

@sprestrelski sprestrelski merged commit b27cfc6 into UCSD-E4E:main Jun 28, 2022
sprestrelski added a commit that referenced this pull request Jun 30, 2022
* Create artwork

* Add files via upload

* Update ver3 zoo2.svg

* Update ver3 zoo2.svg

* Delete ver3 zoo2.svg

* Delete ver3 zoo3.svg

* Delete ver3 zoo5.svg

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Create IsoAutio.svg

* Delete ver3_zoo8.svg

* Delete ver3_zoo5.svg

* Delete ver3_zoo1.svg

* Delete ver3_zoo2.svg

* Delete ver2.svg

* Delete ver1.svg

* Create PyHa

* Rename PyHa to PyHa.svg

* Delete ver3_zoo3.svg

* Delete ver3_zoo7.svg

* Delete artwork

* Created separate folder for conda environments, added Ubuntu 18.04 environment

* Removing old Ubuntu 16.04 environment

No longer needed since the last commit added the conda_environments folder

* Added in a Windows 10 environment file

Tested on one machine

* Add files via upload

Adding environment .yml file for macOS Big Sur 11.4

* Added try-except block to handle faulty wav files
- Useful when running an isolation algorithm across a large set of wav files that haven't been vetted for common problems, such as RIFX headers instead of RIFF, or files that were created but failed to record anything and are empty.
- This change made the pipeline work on a large folder of AudioMoth clips that contained numerous errors.

* Forgot to add continue to previous commit
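The two commits above describe guarding the wav-loading loop with a try-except and a `continue`. A minimal sketch of that pattern (the function and variable names here are illustrative, not PyHa's actual API):

```python
import os
import scipy.io.wavfile as wavfile

def load_clips(audio_dir):
    """Return (filename, sample_rate, signal) tuples for readable wav files,
    skipping clips that are RIFX-encoded, truncated, or empty."""
    clips = []
    for name in sorted(os.listdir(audio_dir)):
        if not name.lower().endswith(".wav"):
            continue
        try:
            rate, signal = wavfile.read(os.path.join(audio_dir, name))
        except Exception as e:
            # Faulty clip (e.g. RIFX header or truncated file): log and move on.
            print(f"Skipping {name}: {e}")
            continue
        if len(signal) == 0:
            # File was created but never recorded anything.
            print(f"Skipping {name}: empty recording")
            continue
        clips.append((name, rate, signal))
    return clips
```

The `continue` after each failure is what lets a long batch run survive a handful of bad AudioMoth clips instead of crashing partway through.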

* Improved Manual Labels for test set
- Annotations re-done by Jacob using Pyrenote

* Added columns for Kaleidoscope compatibility
- Channel had to be added. LABEL was changed to MANUAL ID
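The column changes above could be sketched with pandas roughly as follows (the helper name and the default channel value are assumptions, not PyHa's exact implementation):

```python
import pandas as pd

def make_kaleidoscope_compatible(df):
    """Rename LABEL to MANUAL ID and add a CHANNEL column for Kaleidoscope."""
    df = df.rename(columns={"LABEL": "MANUAL ID"})
    if "CHANNEL" not in df.columns:
        # Single-channel recordings default to channel 0.
        df.insert(0, "CHANNEL", 0)
    return df
```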

* Updated PyHa_Tutorial Notebook with new labels

* Converted bi_directional_jump to window_size
- This is a more standard description of what the Steinberg isolation technique is doing

* Changing bird_dir to audio_dir in IsoAutio.py
- This is part of the transition to make PyHa more general.

* Improved zero division error-handling messaging
- Specific to the matrix_IoU_Scores() function

* Improved visualizations.py error handling
- local_score_visualization() lets you know which clip failed to receive microfaune predictions

* Fixed PyHa Tutorial Notebook
- Recent commit didn't display everything

* Improved visualizations.py error handling
- Handled more potential points of failure
- Added better descriptions of what may have gone wrong

* Improved local_score_visualizations()
- Allowed users to insert whatever pre-rendered annotations they desire
- Allowed them to change the legend labels that correspond to the annotations

* Demo local_score_visualization() changes

* initial linting with pep8

* Reworked audio test set.
- Decided it best to use the Creative Commons xeno-canto data
- Decided it best to stick to Screaming Piha audio to honor the package's name
- Had to update the Jupyter Notebook tutorial with the new dataset
- Deleted the audio, new update in the readme branch will link to a public drive with the relevant test set
- Replaced the old labels from the old test set with the labels for the new screaming piha dataset.

* IsoAutio.py update

* removed all whitespace and styling errors

* Address Linting Warnings
- Related to unused variables
- Fixed IsoAutio import in the visualizations.py file
- master_clips_stats_df ==> master_clip_stats_df

* Revert "Address Linting Warnings"
- Local repo wasn't properly updated, which undid changes from an important recent readme PR

This reverts commit 684a03c.

* Readme (#86)

* Update README.md

removed last remnant of old name on this branch

* Create artwork

* Add files via upload

* Update ver3 zoo2.svg

* Update ver3 zoo2.svg

* Delete ver3 zoo2.svg

* Delete ver3 zoo3.svg

* Delete ver3 zoo5.svg

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Create IsoAutio.svg

* Delete ver3_zoo8.svg

* Delete ver3_zoo5.svg

* Delete ver3_zoo1.svg

* Delete ver3_zoo2.svg

* Delete ver2.svg

* Delete ver1.svg

* Create PyHa

* Rename PyHa to PyHa.svg

* Delete ver3_zoo3.svg

* Delete ver3_zoo7.svg

* Delete artwork

* Created separate folder for conda environments, added Ubuntu 18.04 environment

* Removing old Ubuntu 16.04 environment

No longer needed since the last commit added the conda_environments folder

* Added in a Windows 10 environment file

Tested on one machine

* Add files via upload

Adding environment .yml file for macOS Big Sur 11.4

* isolate function

* Update README.md

* isolation_parameters

* threshold function

* return values

* Update README.md

* Update README.md

* usage

* file links

* Locations

* isolation_parameters update

* isolation_parameters update

* design

* Update README.md

* Update README.md

* steinberg_isolate

* Table of Contents

* PyHa Logo

* logo

* steinberg isolate

* update logo

* steinberg isolate

* isolation techniques

* rename variables

* generate_automated_labels

* generate_automated_labels

* kaleidoscope_conversion

* All IsoAutio.py functions

* section headings

* local_line_graph

* local_score_visualization

* plot_bird_label_scores

* All visualizations.py functions

* comments

* microfaune_package notes

* annotation_duration_statistics

* bird_label_scores

* automated_labeling_statistics

* global_dataset_statistics

* clip_IOU

* matrix_IoU_Scores

* clip_catch

* global_IOU_Statistics

* dataset_Catch

* dataset_IoU_Statistics

* All statistics.py functions

* Sections

* sections

* update dataset_IoU_Statistics

* Update image

* Formatting

* Update Image

* Update installation directions

* change to numbered list

* Update Installation Directions

* Update README.md

* Examples, Remove "How it Works"

* Update README.md

* Update README.md

* isolation params in example

* Examples

* Update Graphs

* Update Style

* Credit

* Credit update

* Credits updates

* Update README.md

removed "Install Jupyter Notebook" step
This should be accomplished by the conda environment install.

* Update README.md

Adjusted Examples portion. Results were created and pushed from my Linux Mint laptop which is built off of Ubuntu 16.04

* Update Readme.md with relevant Screaming Piha Test set information

Would be smart to get someone who is new to PyHa to see if they can easily get things up and running simply by reading the readme. I also added more detail to the steps to make them more foolproof; it might be overkill.

* Updated Images

Co-authored-by: Jacob <jgayers@ucsd.edu>
Co-authored-by: NathalieFranklin <69173439+NathalieFranklin@users.noreply.github.com>
Co-authored-by: Jacob <jacobthescreenwriter@gmail.com>
Co-authored-by: mugen13lue <33683103+mugen13lue@users.noreply.github.com>

* gitignore

* deleted pycaches

* Annotation Length Histograms  (#89)

* move from old branch

* update tutorial

* update

* Add parameters update to match

* update tutorial

* Updated Tutorial Notebook
Added more relevant demo of the annotation histogram

* Update README.md

* Fixed broken logo

* Update README.md (#95)

* Made resampling more general

SciPy's resample function allows for both upsampling and downsampling; this change makes sure the pipeline also handles upsampling, not just downsampling.

* Update IsoAutio.py

- Fixed a comment on the resampling portion of the code base.

* Fix "FOLDER" column Kaleidoscope compatibility

* Added extra error handling to generate_automated_labels
- Caught the case where an empty wav file is successfully loaded in
- The problem was that in these cases the script would crash whenever the signal was downsampled

* Added BirdNET Pipeline

* New Ubuntu 20.04 conda environment

- Upgrades to tensorflow 2.8.0 which is compatible with both Microfaune and BirdNET

* Fixed typo
The birdnet "model" isolation parameter was misspelled as "birndet"

* Documentation update for new BirdNET-Lite functions

* Update README.md

Updated example to include the "model" isolation parameter

* local_score_visualization for non-Microfaune

Currently calls generate_automated_labels on the single clip; will refactor in a later commit to handle an already-made automated_df being passed in.

* local_scores ability to pass in automated_df (#109)

* Fixed new Ubuntu 20.04 conda environment placement
- Made sure it is in the conda_environments folder

* Renamed two visualization functions
- Changed local_score_visualization() to spectrogram_visualization()
- Changed plot_bird_label_scores() to binary_visualization()
- Fixed a small bug in spectrogram_visualization() related to an error warning when a dataframe with automated annotations from multiple clips is passed in

* Delete LICENSE

Moving to Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)

* Updated License

- Since we are using BirdNET source code, we have to reflect their license.
- This rules out commercial use.

* Fix for visualization typechecking

- Fixed a small bug in spectrogram_visualization() to better handle both bools and dataframes for automated_df
- Added spectrogram_graph(), annotation_duration_histogram() to README

* Added Tensorflow 2.8 to windows env

* Adjusted Tutorial Notebook
- Some markdown didn't make sense talking about TPs, FPs, FNs, and TNs in the context of the spectrogram_visualization() changes

* Update License

Matching BirdNET-Analyzer instead of old BirdNET-Lite repository

* Corrected typos and deleted a redundant file

* Conform tweetynet to recent PyHa refactor (#113)

* Added TweetyNet model to PyHa
* Added TweetyNet local score and spectrogram visualization output
* Added new conda environments for MacOS and Windows 10
* Added testing notebooks
* Optimize pandas use in the Steinberg isolation technique
* Updated .gitignore for cached files and testing
* Updated README, Tutorial Notebook, and documentation
* Improved error messages

Co-authored-by: mugen13lue <mugen4college@gmail.com>
Co-authored-by: Samantha Prestrelski <samantha.prestrelski@gmail.com>
Co-authored-by: Vanessa-Salgado <vsalgadozavaleta@ucsb.edu>
Co-authored-by: Sean Perry <shperry@ucsd.edu>

* .gitignore fixed and removed notebooks (#120)

* remove notebooks (#121)

* .gitignore fixed and removed notebooks

* remove notebooks

* Label chunker (#114)

* Added label chunker
* Added documentation for the annotation_chunker
Co-authored-by: shreyasar2202 <ars.shreyas@gmail.com>
Co-authored-by: Sean Perry <shperry@ucsd.edu>
Co-authored-by: Samantha Prestrelski <samantha.prestrelski@gmail.com>

Co-authored-by: NathalieFranklin <69173439+NathalieFranklin@users.noreply.github.com>
Co-authored-by: Jacob <jgayers@ucsd.edu>
Co-authored-by: Jacob <jacobthescreenwriter@gmail.com>
Co-authored-by: mugen13lue <33683103+mugen13lue@users.noreply.github.com>
Co-authored-by: Nishant Balaji <nishantb1130@gmail.com>
Co-authored-by: Nishant Balaji <44332326+nishantbalaji@users.noreply.github.com>
Co-authored-by: Gabriel Steinberg <gsteinb1@binghamton.edu>
Co-authored-by: shreyasar2202 <ars.shreyas@gmail.com>
Co-authored-by: Sean Perry <sean.hyatt.perry@gmail.com>
Co-authored-by: mugen13lue <mugen4college@gmail.com>
Co-authored-by: Vanessa-Salgado <vsalgadozavaleta@ucsb.edu>
Co-authored-by: Sean Perry <shperry@ucsd.edu>