Copy/paste and fill in the template below each week, BEFORE coming to the weekly meeting with your supervisor. Put the newest weekly meeting first.
What did you achieve?
- Replace this text with a bullet point list of what you achieved this week.
- It's ok if your list is only one bullet point long!

What did you struggle with?
- Replace this text with a bullet point list of where you struggled this week.
- It's ok if your list is only one bullet point long!

What would you like to work on next week?
- Replace this text with a bullet point list of what you would like to work on next week.
- It's ok if your list is only one bullet point long!
- Try to estimate how long each task will take.

Where do you need help from Kirstie?
- Replace this text with a bullet point list of what you need help from Kirstie on.
- It's ok if your list is only one bullet point long!
- Try to estimate how long each task will take.

Any other topics: This space is yours to add to as needed.
This template is partially derived from "Whitaker Lab Project Management" by Dr. Kirstie Whitaker and the Whitaker Lab team, used under CC BY 4.0.
Weekly meetings
Week 1 Date: 19-11-2018
What did you achieve?
- Gained an overview of the transfer learning community: what is going on and what has already been done.
- Found some more information about SVM.
- Defined my Python goal and made a summary of the functions I need to reach it.
What did you struggle with?
- SVM didn't seem like a really fast option, or I was doing things wrong. --> I did it wrong; got a simple function that works faster.
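The speed difference noted above is a common one in scikit-learn. A minimal sketch (assuming sklearn and synthetic data, not the actual project data): `SVC(kernel="linear")` uses the general kernelized solver, while `LinearSVC` uses a solver specialized for the linear case and is usually much faster on larger sample counts.

```python
# Sketch: SVC with a linear kernel vs. LinearSVC on synthetic data.
# LinearSVC uses liblinear, which scales much better with sample count
# than the kernelized solver behind SVC(kernel="linear").
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC, LinearSVC

X, y = make_classification(n_samples=2000, n_features=50, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

slow = SVC(kernel="linear").fit(X_tr, y_tr)   # general kernelized solver
fast = LinearSVC(max_iter=5000).fit(X_tr, y_tr)  # linear-only, faster solver

print(slow.score(X_te, y_te), fast.score(X_te, y_te))
```

Both reach similar accuracy; the win is training time, not quality.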
What would you like to work on next week?
- Work out which functions and data I need for my end results.
- Also check how to make everything as comparable as possible.
Where do you need help from Veronika?
- I should have asked for help on the machine learning lecture, because I only found paid versions.
Any other topics: Last week my girlfriend's grandfather died, and I was finishing my exams. I think everything I've done is a bit low in quality; I don't want to do that again.
Week 2 Date: 18-11-2018
What did you achieve?
- Made the whole script, but for just one simple data set. Learned which basic functions I need for the final experiment. This took me around 3 days of coding, and 1 day of running some other configs.
- Nice to notice that an SVM on 1000 samples performed as well as one on 10000, so the SVM does do all the hard work.
- val_loss was very low (< 0.05) and val_acc was above 0.995. Could experiment with different numbers of epochs (did it now with ten) and compare results, including val_loss and val_acc ==> overtraining or not.
- The downtime caused by errors dropped significantly each day.
What did you struggle with?
- Needed some odd functions, for example converting the integer-based class notation to a 0-and-1 (one-hot) based class notation.
- tensorflow-gpu struggled for a bit, which took some unnecessary time.
- The MNIST data set is a small grayscale data set (28×28×1).
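The label conversion mentioned above can be sketched in a few lines. Shown here in plain NumPy so it runs without TensorFlow; `tf.keras.utils.to_categorical` does the same thing.

```python
# Convert integer class labels to one-hot (0/1) vectors.
import numpy as np

def to_one_hot(labels, num_classes=None):
    labels = np.asarray(labels, dtype=int)
    if num_classes is None:
        num_classes = labels.max() + 1
    one_hot = np.zeros((labels.size, num_classes), dtype=np.float32)
    one_hot[np.arange(labels.size), labels] = 1.0  # set one column per row
    return one_hot

print(to_one_hot([0, 2, 1]))
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]
```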
What would you like to work on next week?
- Try another data set, and cross-experiment: train and test with different sets.
- Try different layers and numbers of epochs. Also see if switching models takes great effort, just for fun ==> dir(tensorflow.keras.applications).
- Give some visualisation of the results: maybe build a big for-loop script running multiple configs and comparing all outcomes.
- Do a small literature search on optimizers and loss functions, just using the basic ones.
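The "big for-loop script" idea above can be sketched with `itertools.product`: iterate over every combination of model name and epoch count and collect one result row per configuration. `run_experiment` is a placeholder (an assumption, not project code) for the actual training call.

```python
# Sketch of a config-grid loop: one experiment per (model, epochs) pair.
from itertools import product

models = ["VGG16", "ResNet50", "InceptionV3"]  # names as in tf.keras.applications
epoch_counts = [1, 5, 10]

def run_experiment(model_name, epochs):
    # Placeholder: would build the model, train it, and return val metrics.
    return {"model": model_name, "epochs": epochs, "val_acc": None}

results = [run_experiment(m, e) for m, e in product(models, epoch_counts)]
print(len(results))  # 9 configurations
```

Collecting everything in one list of dicts makes it easy to dump to a CSV or spreadsheet for comparison later.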
Where do you need help from Veronika? --- Problem solving is way too much fun. When there is more literature involved, the questions will come :P ---
Any other topics: The labbook feels pretty obsolete; the versions are very rough. Is that bad?
- Made 4 main scripts, plus 1 file in which I run everything and can adjust the order and parameters of the functions
- Made an AUC for the SVM, and an AUC for the model based on results.
- Tried to imitate experiments conducted in the Menegola project.
- Github desktop got stuck
- Retinopathy was too big for my PC: 83 GB.
- Atlas didn't succeed in such a small time frame
- AUC optimizing isn't possible with Keras; I tried to do it custom, but keep receiving errors. Could switch to TensorFlow instead of tensorflow.keras; with TF there are more possibilities.
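A common workaround for the AUC issue above (newer TensorFlow versions do ship a built-in `tf.keras.metrics.AUC` metric, which may not have existed in the version used here): train on a standard loss and compute AUC afterwards from the predicted probabilities with scikit-learn.

```python
# Compute AUC post-hoc from model scores with sklearn.
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])  # e.g. model.predict(...) outputs

print(roc_auc_score(y_true, y_score))  # 0.75
```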
- Get AUC optimization working
- Swap my SSD for a bigger one (already got it, but takes a while to transfer)
- Try to get some data sets ready for testing
- AUC optimization for Keras?
- Evaluate the script for bad habits
I did not really achieve mimicking the results; is this now a new top priority, or isn't it?
- Fine-tune VGG16 on ImageNet
- replicate results of crowdsource.
- make a generator
- make changes to readme
- make architecture file (not on github)
- auto labnotes
- cool loading function
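The fine-tuning item above can be sketched as follows (a sketch only, assuming tensorflow.keras; `weights="imagenet"` would load the pretrained weights, but `weights=None` is used here so the example runs offline, and the 2-class output layer is an assumption):

```python
# Sketch: frozen VGG16 base + small trainable head for fine-tuning.
import tensorflow as tf

base = tf.keras.applications.VGG16(
    weights=None,            # "imagenet" in the real experiment
    include_top=False,
    input_shape=(224, 224, 3),
)
base.trainable = False       # freeze the convolutional base

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),  # assumed 2 classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
print(model.output_shape)  # (None, 2)
```

A typical second stage unfreezes the last convolutional block and retrains with a lower learning rate.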
- laptop crashed, always fun.
- The generator is slow, but probably necessary with bigger data sets
- Still a bit of hard-coding in Jupyter instead of a clean script
- Get the data generator to work from file paths, and apply it to a big data set.
- better val and test splitting.
- some analytic tools built in
- ....
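The file-path-based generator in the list above can be sketched like this. `load_image` is a placeholder (an assumption) for the real image loading and resizing; here it just returns a dummy array so the sketch is self-contained.

```python
# Sketch: an endless batch generator that reads images from file paths,
# as expected by Keras' fit_generator.
import numpy as np

def load_image(path):
    # Placeholder: would open `path` and resize to 224x224x3.
    return np.zeros((224, 224, 3), dtype=np.float32)

def batch_generator(paths, labels, batch_size=32):
    while True:  # loop forever; Keras decides how many batches per epoch
        for i in range(0, len(paths), batch_size):
            batch_x = np.stack([load_image(p) for p in paths[i:i + batch_size]])
            batch_y = np.asarray(labels[i:i + batch_size])
            yield batch_x, batch_y

gen = batch_generator([f"img_{i}.png" for i in range(10)],
                      list(range(10)), batch_size=4)
x, y = next(gen)
print(x.shape, y.shape)  # (4, 224, 224, 3) (4,)
```

Loading from paths keeps memory use flat regardless of data set size, at the cost of per-batch disk I/O, which is why such generators feel slow.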
Had a busy week; UEFA took a full 3 days... worked 1 full day, so just 3 left for uni.
- Replicated the results of Menegola === https://docs.google.com/spreadsheets/d/1X9ahOiyy6TcOCcBIfZuvBjlhPI0CqwwM57AzElSJojo/edit?usp=sharing
- Made it all work; no generator was needed, since the data set was only 4.5 GB after it was resized to the 224×224×3 format
- Can't remember everything
- mini data set test()
- Needed to re-install Windows and reboot my desktop on Friday night, fun fun fun
- The data augmentation is just too big; maybe we need to take a different approach
- One output was stuck on 0.33 acc, which showed how much models change by just re-running. But running everything 5 times would have taken too much time
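The run-to-run variance noted above (a model stuck at 0.33 acc after a plain re-run) is usually tamed by fixing the random seeds before each run and repeating each configuration a few times. A minimal sketch, shown with Python/NumPy (TensorFlow additionally has `tf.random.set_seed`):

```python
# Fix seeds so repeated runs start from identical random state.
import random
import numpy as np

def set_seeds(seed):
    random.seed(seed)
    np.random.seed(seed)
    # with TensorFlow also: tf.random.set_seed(seed)

set_seeds(42)
a = np.random.rand(3)
set_seeds(42)
b = np.random.rand(3)
print(np.array_equal(a, b))  # True: same seed, same draws
```

Even with fixed seeds, GPU training can stay slightly non-deterministic, so repeating each config (e.g. 3 seeds, as in the result file below) is still worthwhile.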
- Get the data sets I need and write import functions + the overview
- Deep Dream visualisation (individual layer output images)
- Make the experiment "correct" and ready for research.
- Need help finding the things I need help with
Let's run it on your pc?
- Result file, with more than 4 models, 3 random seeds: 72 AUC values...
- Small dataset table
- Report layout
- Sometimes the model doesn't train; it could be a Keras bug, because it happens at random... or I'm making rookie mistakes
- The balancing of data sets is still a hurdle, which brings great inconsistencies.
- to normalize or not to normalize...?
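For the balancing and normalization questions above, a common sketch (an illustration, not the project's actual pipeline): weight each class inversely to its frequency so the loss isn't dominated by the majority class, and scale pixel values to [0, 1].

```python
# Per-class weights from label frequencies + simple pixel normalization.
import numpy as np

labels = np.array([0] * 90 + [1] * 10)  # imbalanced toy example
classes, counts = np.unique(labels, return_counts=True)
# Same formula sklearn uses for class_weight="balanced":
class_weight = {int(c): len(labels) / (len(classes) * n)
                for c, n in zip(classes, counts)}
print(class_weight)  # minority class gets the larger weight

images = np.random.randint(0, 256, (4, 28, 28, 1)).astype(np.float32)
normalized = images / 255.0  # scale uint8 pixel range to [0, 1]
```

In Keras the `class_weight` dict can be passed directly to `model.fit(...)`.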
- Get some results in the other direction: does another data set give the same values?
- Maybe repeat a few chest FT runs
- Starting with LaTeX, let's not push this to the last two weeks.
- Melanoma on weights? --> maybe Monday night
- Need help finding the things I need help with
Let's run it on your pc?