100 Days Of Code - Log

Day 1: 21 March 2018

Today's Progress: Enrolled in Andrew Ng's Neural Networks and Deep Learning course on Coursera, watched the first two introductory videos, and answered my first graded question. Also watched five videos on vectors at Khan Academy.

Thoughts: Everything I learned about vectors in school came flooding back (which is a credit to my maths teachers, as that was almost 20 years ago). There were some interesting things I didn't know before as well, like how to write the definition of a line in n dimensions. The machine learning introduction was mostly stuff that I knew before, although I expect that will quickly change. Andrew Ng's example of a neural network was very clear, and the easiest to understand I've seen yet.
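(The n-dimensional definition of a line, for reference: a point on the line plus every scalar multiple of a direction vector.)

$$L = \{\, \mathbf{p} + t\,\mathbf{v} \;:\; t \in \mathbb{R} \,\}$$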

Day 2: 22 March 2018

Today's Progress: I've finished week one of the Neural Networks and Deep Learning course, including taking the quiz. That was a quick week! I got 10/10 on the quiz, yay. :) And I did some more brushing up on vectors on Khan Academy.

Thoughts: I still need more work on my vectors and matrices, so I think I will be doing them in parallel to the neural nets course. Week two of the course is going to involve a lot more of them, I'm guessing, so I'll have to judge my pace so that I learn enough linear algebra that I can progress, but not so much that I spend all of my time doing that.

Day 3: 23 March 2018

Today's Progress: Spent all of my time today learning about matrices on Khan Academy.

Thoughts: It turns out that after 18 years you forget how to do matrix multiplication. I still have a way to go to get back up to speed, but hopefully I can mostly do that tomorrow.

Day 4: 24 March 2018

Today's Progress: Also spent all of my time today learning about matrices on Khan Academy. This is becoming a theme. I covered 2D matrix transformations, and matrix inverses and determinants.

Thoughts: I was blown away by learning about matrix transformations, and realising that they basically underpin all of 3D computer graphics. When I get as far as matrix transposes I should be able to understand what's going on in the week 2 machine learning videos, so I'll switch back to Andrew Ng's course at that point.
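(My favourite example: multiplying any 2D point by this matrix rotates it by θ about the origin - chain a few of these together and you have the bones of a graphics pipeline.)

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$$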

Day 5: 25 March 2018

Today's Progress: Finished studying matrices and vectors at Khan Academy (for now), and am starting back on the machine learning course videos.

Thoughts: I now know a lot more about matrices than I did a few days ago, and the logistic regression algorithm in Andrew Ng's course is starting to make sense. I feel I'm making good progress.

Day 6: 26 March 2018

Today's Progress: Watched all of the compulsory week 2 videos for the Neural Networks and Deep Learning course.

Thoughts: Today's videos were mostly about Python, which I'm a lot more at home with than the maths. Numpy is still new for me though, so it's still not exactly plain sailing. Tomorrow I'll be going on to the programming assignment. Finally, I'll be writing real code!

Day 7: 27 March 2018

Today's Progress: Started working through the introduction to NumPy in the Neural Networks and Deep Learning course (hereafter abbreviated to NNDL).

Thoughts: There were a few new concepts in the introduction - notably vector and matrix norms - so it was slow going. It was also my first time working with Jupyter notebooks, which are awesome so far (and I hear they get better). I can tell that it's going to take some time before I'm comfortable with all the maths here, but hey, that's the point, right? :)

Day 8: 28 March 2018

Today's Progress: Did the week two quiz in the NNDL course, and worked on the numpy introduction some more.

Thoughts: I only got 9/10 on the quiz due to a careless mistake (boo). Otherwise, it wasn't too bad. The maths is still the bottleneck for me, but it's coming.

Day 9: 29 March 2018

Today's Progress: Finished the numpy introduction, and started on the first required programming assignment.

Thoughts: The result of the programming assignment will be a real machine learning algorithm, so I'm pretty excited. :) Still, it's slow going, as I'm still getting used to numpy and matrices.

Day 10: 30 March 2018

Today's Progress: Just studied more about vectors today.

Thoughts: No earth-shattering discoveries today, but I did get to learn some interesting properties of vector dot products.

Day 11: 31 March 2018

Today's Progress: Continued the week two programming assignment.

Thoughts: My first successful forward propagation and back-propagation. Nice! Those cat pics aren't quite classified yet, though.

Day 12: 1 April 2018

Today's Progress: Finished the week two programming assignment. I now have a bona fide cat recognition algorithm. At the moment it has 70% accuracy.

Thoughts: Finishing week two feels good! I didn't find the programming hard - it was just learning the new ML concepts and getting used to NumPy, really. I actually found the "Introduction to NumPy" programming assignment trickier than the graded assignment.

Day 13: 2 April 2018

Today's Progress: Started on the week 3 videos, and also started brushing up on my calculus at Khan Academy. (Today I went over the delta/epsilon limit proof, which I vaguely remember from school, but not well enough to feel comfortable going further in calculus without.)

Thoughts: Again, I'm going to have to pace myself with calculus - I don't want to study it to the exclusion of the ML course and then have to do my assignments in a rush. But I have a strong feeling that a good understanding of it will help me. And hopefully the calculus from school will start coming back once I get back into it a bit.

Day 14: 3 April 2018

Today's Progress: Watched videos about differential calculus on Khan Academy.

Thoughts: Just going over the basics again. In my school maths class we never learned limits formally, only informally, but I think that differential calculus is much easier to understand when you understand the formal definition of limits first.
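(The formal definition, for my own reference:)

$$\lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0 \;\exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon$$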

Day 15: 4 April 2018

Today's Progress: Watched more videos about differential calculus on Khan Academy.

Thoughts: Mmm, nutritious maths. Today I got past the formal introduction to the derivative of a function, and got on to the power rule. This is much more familiar to me - I remember doing a lot of this stuff in school. I think I'll go as far as the chain rule and then go back to the machine learning course.

Day 16: 5 April 2018

Today's Progress: Studied the chain rule, the product rule, and the quotient rule on Khan Academy, then tried to find the derivative of the sigmoid function.

Thoughts: I bit off a bit more than I could chew with the sigmoid function! I failed when I tried by myself, but I now know/remember enough calculus to understand the derivation on Stack Exchange.
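For the record, the derivation boils down to the chain and quotient rules plus a bit of rearranging:

$$\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma'(x) = \frac{e^{-x}}{(1 + e^{-x})^2} = \sigma(x)\bigl(1 - \sigma(x)\bigr)$$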

Day 17: 6 April 2018

Today's Progress: Finished off the videos for week 3 of NNDL, found the derivative of the tanh function, and passed the week 3 quiz.

Thoughts: I was impressed with myself for differentiating tanh without any help, although I did know what the answer was supposed to be. I (again) only got 9 out of 10 on the quiz though, (again) due to a careless mistake. But overall, a good day's study.
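(The tanh result, via the quotient rule:)

$$\frac{d}{dx}\tanh(x) = \frac{(e^x + e^{-x})^2 - (e^x - e^{-x})^2}{(e^x + e^{-x})^2} = 1 - \tanh^2(x)$$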

Day 18: 7 April 2018

Today's Progress: Spent today going back through the videos for week three and the quizzes for weeks two and three.

Thoughts: I spent quite a lot of time going through the dimensions of the different matrices in the different layers of the neural networks, and now I think I have a much better handle on what is going on. And because in NNDL you can take the quizzes as many times as you want and they keep your top score, I now have 10/10 on both. Yay!

Day 19: 8 April 2018

Today's Progress: Started the programming exercise for week 3 of NNDL.

Thoughts: I haven't quite finished the exercise, although I did finish the model - my first neural network with a hidden layer.

Day 20: 9 April 2018

Today's Progress: Finished week 3 of NNDL, and started on the videos for week 4.

Thoughts: On a similar theme as week two, I didn't find the programming exercise that difficult this time. In fact, it was easier this time, as it was so similar to the week two exercise. It's quite fun to see how even a neural network with just one hidden layer can do a much better job at fitting data than logistic regression can. At this rate, I'll be able to finish week four in a few days.

Day 21: 10 April 2018

Today's Progress: Finished the videos and the quiz for week 4 of NNDL.

Thoughts: I got 10/10 on the quiz. (Finally!) And I feel I have a good understanding of the material now, which is probably why.

Day 22: 11 April 2018

Today's Progress: Finished the first programming exercise for week 4 of NNDL.

Thoughts: It was tough starting today as I got back pretty late from work, but I still managed to finish my study at a not-too-unreasonable time. The programming exercise was basically the same as for week three, but generalised to any number of layers of network instead of just two. Tomorrow, hopefully I will be correctly classifying some cute little kitties.

Day 23: 12 April 2018

Today's Progress: Finished the course! And I had a bit of time left over, so I watched the video about the Cauchy-Schwarz inequality on Khan Academy.

Thoughts: The last programming exercise was the easiest of the lot - it was basically just copying and pasting function definitions that I'd written before. The result was pretty good though - my cat-recognition model is now up to 80% accuracy.

Link to work: Coursera Neural Networks and Deep Learning course certificate

Day 24: 13 April 2018

Today's Progress: Started on Andrew Ng's Machine Learning course at Coursera, and took the first quiz.

Thoughts: This course seems a lot easier than NNDL - it looks like I took them the wrong way round - but there are lots of details in here that I think will help cement my understanding of neural networks, and give me some broader machine learning knowledge. I can easily imagine missing a useful approach to a problem if all I know about is neural nets. The course is supposed to take 11 weeks, but I imagine I'll finish it much quicker than that.

Day 25: 14 April 2018

Today's Progress: Learned more about series on Khan Academy.

Thoughts: The first thing in the machine learning course is linear regression, and it looks like to really understand it I will need to be able to differentiate a series, which means I need to properly understand series. (Serieses? Nope, that's not right.) So I spent a lot of time muddling through arithmetic series today, and I was surprised at how little I remembered about them from school. They make a lot more sense after seeing Sal's proof for the general formula of an arithmetic series on Khan Academy. The actual differentiation part will come later.
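The general formula, as I understand Sal's proof: write the sum out forwards and backwards, add the two copies term by term (each pair sums to $a_1 + a_n$), and halve:

$$2S_n = n(a_1 + a_n) \implies S_n = \frac{n(a_1 + a_n)}{2}$$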

Day 26: 15 April 2018

Today's Progress: Learned more about linear regression in the ML course, more series on Khan Academy, and also started looking at multivariable functions on Khan Academy.

Thoughts: It turns out that in week one of the ML course they solve linear regression using gradient descent, although it can apparently be solved directly as well. It makes sense that you have to use gradient descent for neural networks, as the input space is so complex, but I'm impressed to learn that there are direct methods for simpler ML models as well. I'm going to need to learn more of the maths to understand all of this properly. Good job I have a whole 74 days left. ;)
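(The direct method is, I gather, the normal equation: with design matrix X and target vector y, the least-squares parameters come out in closed form, with no iteration required.)

$$\theta = (X^\top X)^{-1} X^\top y$$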

Day 27: 16 April 2018

Today's Progress: Finished week 1 of the ML course, and started on the week 2 videos. Finished all of the linear regression ones already.

Thoughts: I was pleased to see that I could do the week 1 linear algebra quiz without looking at any of the Coursera linear algebra videos - obviously what I learned from those Khan Academy videos has stuck.

Day 28: 17 April 2018

Today's Progress: Learned more about partial derivatives on Khan Academy, then rewatched Patrick Winston's neural net lecture for the MIT Artificial Intelligence course.

Thoughts: Watching the lecture the second time made a lot more sense than watching it the first time! I think the first time was a little more than a year ago, and I had no idea about partial derivatives then. Still, I can tell that I still have quite a bit of work to do to be fully mathematically literate at this stuff.

Day 29: 18 April 2018

Today's Progress: Watched week two videos for the Coursera ML course, did the linear regression quiz, and started playing around with Octave.

Thoughts: I only got 3/5 on the quiz on my first attempt. Both of the ones I got wrong were silly mistakes (forgot a minus sign, and forgot a parameter). But that's not the end of the world - I got all of them right on the second attempt. Octave seems fun so far - I have a feeling I'm going to enjoy the programming exercises this time. Apparently you can do them on Matlab too, but why use that when you can GNU?

Day 30: 19 April 2018

Today's Progress: Watched the rest of the Octave videos on Coursera and took the Octave quiz.

Thoughts: Octave has a few surprises, but nothing extremely new. I do like the "magic" function - it's very useful for trying stuff out. Next up is the programming exercise, but scheduling means I'll probably have to stick with videos instead. Either Khan Academy or part two of Patrick Winston's neural nets lecture, I think.

Day 31: 20 April 2018

Today's Progress: Watched videos about partial differentiation and parametric functions on Khan Academy.

Thoughts: I ended up going with the Khan Academy videos today. Lots of good stuff there, and I think I have a good understanding of partial differentiation now, although I could probably do with more practice at actually solving some of them.

Day 32: 21 April 2018

Today's Progress: Started the linear regression programming exercise for the ML Coursera course.

Thoughts: I now have linear regression with one variable working as it should be. I am finding Octave harder going than the Python used in the NNDL course - which makes sense, as I use Python every day, but Octave is new for me - but at least I won't be totally out of my depth now if someone asks me to program something in MATLAB. Tomorrow I'll finish off the optional part of the programming assignment (linear regression with multiple variables), and then I'll move on to week three and new ML models.
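As a note to future me, the vectorised idea looks something like this in NumPy (my own sketch - the actual assignment is in Octave, and these names are mine):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.5, iterations=5000):
    """Vectorised batch gradient descent for linear regression.
    X is the (m, n) design matrix with a leading column of ones,
    y is the (m,) target vector."""
    theta = np.zeros(X.shape[1])
    for _ in range(iterations):
        predictions = X @ theta  # hypothesis for all m examples at once
        theta -= alpha * (X.T @ (predictions - y)) / len(y)
    return theta

# Toy usage: recover y = 1 + 2x from noiseless data.
x = np.linspace(0, 1, 50)
X = np.column_stack([np.ones_like(x), x])
print(gradient_descent(X, 1 + 2 * x))  # ~[1. 2.]
```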

Day 33: 22 April 2018

Today's Progress: Finished the linear regression programming exercise and started watching the week three videos.

Thoughts: Week three is about logistic regression, which I have already covered in NNDL, but the ML course seems more in-depth, so I think I'll probably learn some new things. From a couple of things in the videos so far, it looks like I'll need to brush up on my statistics as well.

Day 34: 23 April 2018

Today's Progress: Finished the videos and quizzes for week three.

Thoughts: Perhaps there were fewer videos for week three than for week two, but I seemed to get through them very quickly this time. Just the programming exercise left, and I'll be on week four.

Day 35: 24 April 2018

Today's Progress: Started the programming exercise for week three.

Thoughts: A lot of today's time was spent working out how to plot nice graphs in Octave, and a lot of the rest was spent on realising that I had forgotten to plug my sigmoid function into my logistic regression model. But I am steadily getting more comfortable with Octave, which is good. Hopefully tomorrow I'll be able to finish the programming exercise off.

Day 36: 25 April 2018

Today's Progress: Finished the programming exercise for week three.

Thoughts: And with that, week three is over. It's been interesting to learn about regularisation, and about the alternatives to gradient descent, and implementing them was surprisingly easy. (Well, I didn't implement the gradient descent alternative myself, to be fair - I just called an Octave function.) Tomorrow it's onto week four and neural networks.
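For my notes, the regularisation part is just one extra term tacked onto the logistic regression cost (and the sum deliberately skips $\theta_0$):

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Bigl[y^{(i)}\log h_\theta(x^{(i)}) + (1 - y^{(i)})\log\bigl(1 - h_\theta(x^{(i)})\bigr)\Bigr] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$$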

Day 37: 26 April 2018

Today's Progress: Finished the videos and the quiz for week four.

Thoughts: Week four is about neural network basics, so this is familiar ground for me, although Prof. Ng uses slightly different notation here than in the NNDL course, which is confusing.

Day 38: 27 April 2018

Today's Progress: Finished the programming exercise for week four.

Thoughts: Similarly to yesterday, the programming exercise was all familiar stuff. In fact, apparently Prof. Ng thought that people would implement logistic regression using a for loop for the week three assignment, and this week's challenge was to implement it in vectorised form. However, my solution from week three was already vectorised, so it was literally just copy and paste to pass that part.

Day 39: 28 April 2018

Today's Progress: Finished the videos and the quiz for week five.

Thoughts: These weeks are starting to go by very quickly! Well, this one was about back-propagation in neural networks, which is another thing I've covered before. Next week is about advice for applying machine learning, which will probably be mostly new, and then the week after is about support vector machines, which is definitely new. So I'll have to see if I can keep up this pace.

Day 40: 29 April 2018

Today's Progress: Started the programming exercise for week five.

Thoughts: I spent most of today staring blankly at the cost function for neural networks using multiple classification, and wondering how to implement it in Octave in a vectorised way. I understand the basic idea, but I can't seem to get it to work. Will try again tomorrow...

Day 41: 30 April 2018

Today's Progress: Still working through the programming exercise. I managed to do the bit I was stuck on yesterday, which was good.

Thoughts: This programming exercise is definitely harder than the previous ones. I was glad that I managed to figure out yesterday's problem, though. I'm definitely starting to get the hang of vectorised stuff, even if my progress is still a little slower than I would like.
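To record the idea I was missing (as a NumPy sketch rather than my Octave solution): with the labels one-hot encoded, the double sum over examples and classes collapses into elementwise operations.

```python
import numpy as np

def nn_cost(H, Y):
    """Unregularised multi-class neural network cost, vectorised.
    H is the (m, K) matrix of sigmoid outputs and Y the (m, K)
    one-hot labels; one elementwise expression plus a single
    np.sum replaces the loops over examples and classes."""
    m = H.shape[0]
    return -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
```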

Day 42: 1 May 2018

Today's Progress: Slogging on further with the programming exercise.

Thoughts: Watching the back-propagation lectures again helped a lot to figure out what the notes for the programming exercise are saying, and the back-propagation part is now (I think) done. I'm getting an error in my cost function implementation, though, so it looks like I might have done the regularisation part of that wrong when I tried it the first time. Hopefully I'll be able to finish the exercise tomorrow.

Day 43: 2 May 2018

Today's Progress: Finally finished the week five programming exercise.

Thoughts: It feels great to have this programming exercise done! Doing this exercise has made me realise how delicate neural networks are - if you make a mistake in the arithmetic, it might look like it's working, but then the numbers you get back at the end can be completely wrong. It's not like mixing strings and ints in regular programming where you get a type error straight away - all you're mixing is floats with floats with other floats. Being strict about matrix sizes helps, but that's definitely not the end of the story.

Day 44: 3 May 2018

Today's Progress: Started the week six videos and took the first quiz.

Thoughts: The material for week six seems comparatively simple, and I didn't find anything in the videos difficult. However, I only got 4 out of 5 on the first quiz on each of my first two attempts. I'm pretty sure the second of those was a careless mistake, although I'm less sure about the first. Third time lucky, I suppose. The idea of plotting learning curves for a network is a really good one - it's one of those things that just makes intuitive sense, and it looks like it will be very useful for diagnosing problems in real neural networks. The next step is the programming exercise - hopefully this one won't take me four days.

Day 45: 4 May 2018

Today's Progress: Started the programming exercise for week six. This one is about learning curves.

Thoughts: Computing the cost function and the gradients for linear regression with regularisation was simple enough, as I had already written all the parts I needed in previous exercises. I'm having difficulty understanding the instructions for creating the learning curve, though. Will have another go at it tomorrow.

Day 46: 5 May 2018

Today's Progress: Finished week six - both the programming exercise and the second quiz.

Thoughts: The programming exercise turned out to be pretty easy once I figured out what the instructions were getting at. It certainly wasn't as hard as the exercise for week five. The material for the second quiz was pretty interesting. It was all about how you should think when you implement learning algorithms - what to do next, when large training sets can help, and in general how not to waste your time doing things that aren't that likely to work. Next up, support vector machines, something I've been looking forward to studying.
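For anyone (i.e. future me) as confused as I was, the recipe boils down to this sketch - `train` and `cost` are placeholder functions of mine, not the course's:

```python
import numpy as np

def learning_curve(X, y, Xval, yval, train, cost):
    """For each subset size i, fit on the first i training examples,
    then measure the (unregularised) error on that subset and on the
    full cross-validation set. High, converging errors suggest bias;
    a persistent gap between the two curves suggests variance."""
    m = len(X)
    error_train, error_val = np.zeros(m), np.zeros(m)
    for i in range(1, m + 1):
        theta = train(X[:i], y[:i])
        error_train[i - 1] = cost(theta, X[:i], y[:i])
        error_val[i - 1] = cost(theta, Xval, yval)
    return error_train, error_val
```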

Day 47: 6 May 2018

Today's Progress: Started watching the week seven videos about support vector machines.

Thoughts: I'm still not completely comfortable with the maths involved in SVMs, so perhaps I'll have to watch the videos again and/or study from a source that has a detailed derivation. But I think I have the basic idea of how SVMs work.

Day 48: 7 May 2018

Today's Progress: Continued with the week seven videos, and also did some extra reading about SVMs.

Thoughts: The Coursera course definitely isn't telling the full story regarding the SVM maths. However, I was only able to skim-read a few things with the time I had today. More detailed reading will have to wait until tomorrow.

Day 49: 8 May 2018

Today's Progress: Finished the videos and the quiz for week seven.

Thoughts: I didn't do any work on the maths of SVMs today in the end. After having done the quiz I feel like I understand SVMs well enough, although I know there are more things I can learn - there's Mercer's theorem, which Prof Ng mentioned, and I could probably do with learning some more about the Gaussian kernel too. But what I know now is definitely enough to use SVMs, and it's enough for the ML course. I'll see if the mood strikes me another day to have a deeper look into the maths.

Day 50: 9 May 2018

Today's Progress: Started the programming exercise for week seven.

Thoughts: Halfway through! The programming exercise this week is pretty fun, as it's actually starting to resemble something like a real-world problem - specifically, spam classification. This exercise also burns lots of CPU cycles to compute the best model parameters, which is something I'll probably be doing a lot more of in the future. At least with Coursera courses you don't have to send all your data to the cloud just to find that you've made a syntax error...

Day 51: 10 May 2018

Today's Progress: Finished week seven, and started on week eight: unsupervised learning.

Thoughts: The rest of the programming exercise for week seven was pretty easy - all the hard parts of preprocessing the spam emails in the spam classifier had already been done for me. (I'm quite glad of that, as they look like much more of a pain to do in Octave than in Python.) The unsupervised learning stuff looks pretty interesting, and Prof Ng's explanation of K-means was very clear. With SVMs I was just using a library to do the hard mathematical coding, but with K-means I get the feeling that I'll be doing it all manually.

Day 52: 11 May 2018

Today's Progress: Studied more about K-means, took the quiz, and started on dimensionality reduction.

Thoughts: The K-means quiz was not hard at all - it's definitely an easier algorithm than the other ones in the course so far. I just got started on dimensionality reduction, but Prof. Ng hasn't got into the maths at all, so I suppose I'll find out how hard it is tomorrow.

Day 53: 12 May 2018

Today's Progress: Finished the quiz for dimensionality reduction.

Thoughts: Prof Ng mentioned eigenvectors today, which is something that I haven't learned so far in my linear algebra journey. This is making me want to study more maths - I still have quite a few videos left to go on the Khan Academy linear algebra course. Perhaps I'll have a look at those after doing the week eight programming exercise.

Day 54: 13 May 2018

Today's Progress: Started the week eight programming exercise. This week's exercise is about K-means and principal component analysis (PCA). I've finished the K-means part.

Thoughts: There wasn't anything too taxing in the K-means part of the exercise. The result was pretty interesting, though - the Coursera code used the K-means code that I wrote to compress an image from 16.7 million colours to 16 colours, and it still looked pretty good.
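A rough sketch of the same pipeline in NumPy (my own toy version, not the course's Octave code):

```python
import numpy as np

def kmeans(points, k, iterations=10, seed=0):
    """Plain K-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iterations):
        # (n, k) distances from every point to every centroid
        dists = np.linalg.norm(points[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

# Image compression: treat each pixel as a point in RGB space, run
# K-means with k=16, then replace every pixel by its centroid:
#   pixels = image.reshape(-1, 3)
#   centroids, labels = kmeans(pixels, 16)
#   compressed = centroids[labels].reshape(image.shape)
```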

Day 55: 14 May 2018

Today's Progress: Finished the week eight programming exercise. The second half was about PCA, and also involved image compression - this time, images of faces.

Thoughts: I got stuck for a while after projecting the data matrix as the transpose of what it actually should be, but all was fine once I figured that part out. Otherwise there wasn't that much actual programming to be done in the exercise. I should probably start looking for some real machine learning problems to get my teeth into, but I think I'll finish the course first.
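The projection step that tripped me up, written out in NumPy (a sketch under my own conventions - rows are examples, and X is assumed already normalised):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))  # stand-in for normalised data: m=100, n=5
k = 2

# The principal axes are the columns of U from the SVD of the covariance matrix.
U, S, _ = np.linalg.svd(X.T @ X / len(X))
Z = X @ U[:, :k]        # project onto the first k components: (m, k)
X_rec = Z @ U[:, :k].T  # map back to the original space: (m, n)
```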

Day 56: 15 May 2018

Today's Progress: Started week nine - this week is about anomaly detection.

Thoughts: For this week I actually know the maths pretty well - anomaly detection is all about working with normal distributions. I remember learning about these in school, and while it has been a while, it seems I haven't forgotten very much of it.

Day 57: 16 May 2018

Today's Progress: Finished the anomaly detection quiz.

Thoughts: Today I learned about multivariate Gaussian distributions. It looks like they could be pretty useful for the right data set. Covariance matrices came up again, which is making me think about studying more linear algebra. I'm also quite keen to get the course finished, though - only two and a half weeks to go, which is maybe one week in real time.
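The density, for reference - the off-diagonal entries of the covariance matrix $\Sigma$ are what let it capture correlated features, which the plain per-feature Gaussian model can't:

$$p(x; \mu, \Sigma) = \frac{1}{(2\pi)^{n/2}\,\lvert\Sigma\rvert^{1/2}} \exp\Bigl(-\tfrac{1}{2}(x - \mu)^\top \Sigma^{-1} (x - \mu)\Bigr)$$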

Day 58: 17 May 2018

Today's Progress: Watched the videos about recommender systems and took the quiz.

Thoughts: The recommender systems quiz was pretty hard! Well, one question in particular, about which formulas for the algorithm are equivalent. I might need to do some more work on that.

Day 59: 18 May 2018

Today's Progress: Finished the first half of the week nine programming exercise. This part was about anomaly detection.

Thoughts: Just like the videos, this part of the programming exercise wasn't particularly hard. Also, this is the last exercise, and probably the last time I'll use Octave for a while. It's been fun, but Python is definitely where it's at.

Day 60: 19 May 2018

Today's Progress: Working through the second half of the programming exercise.

Thoughts: I was having difficulties making a vectorised implementation of the partial differentials of the cost function, so I took the time out to make a version using for loops. The for loop version is working, so tomorrow I'll try and vectorise it properly. Up to now I've just been writing the vectorised versions straight away (usually after squinting at the algorithm for a long time, testing things out in the console and making lots of mistakes), so this is a new approach for me. I'll see if it makes the vectorisation any easier.

Day 61: 20 May 2018

Today's Progress: More work on the programming exercise.

Thoughts: This vectorisation is hard! I'm still working on the same thing as yesterday, really. There are some tips in the exercise notes that I haven't fully tried out yet, so I'll read those more thoroughly tomorrow and try again.

Day 62: 21 May 2018

Today's Progress: Finished the programming exercise.

Thoughts: Finished the vectorisation after reading the tips again, and after that part was done the rest of the exercise was pretty easy. The last part was pretty interesting - I rated a bunch of films that I have seen, and my algorithm gave me some recommendations for other films that I might like. One of my recommendations was "Santa with Muscles", though, so maybe I need to give it some more data. ;)
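To record the shape of the vectorised solution (a NumPy sketch rather than the exercise's Octave, with my own variable names): masking the error matrix with R is what replaces the per-rating loops.

```python
import numpy as np

def cofi_gradients(X, Theta, Y, R):
    """Vectorised collaborative-filtering gradients (unregularised).
    X: (num_movies, k) movie features; Theta: (num_users, k) user
    parameters; Y: (num_movies, num_users) ratings; R: same shape,
    1 where a rating exists and 0 otherwise."""
    error = (X @ Theta.T - Y) * R  # zero out the unrated entries
    X_grad = error @ Theta         # (num_movies, k)
    Theta_grad = error.T @ X       # (num_users, k)
    return X_grad, Theta_grad
```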

Day 63: 22 May 2018

Today's Progress: Did the whole of week ten. This week was about working with very large data sets. I learned about stochastic gradient descent and mini-batch gradient descent, which were both pretty interesting, and about map-reduce, which I already knew about from many articles about Google's search infrastructure. I also learned about online learning algorithms, where you learn from a live stream of data.

Thoughts: This was a very quick week. It certainly speeds things up when there aren't any programming exercises. I suppose the lack of a programming exercise makes sense, as I would imagine that it's hard to use very large data sets with things like Coursera. The quiz was pretty easy - there wasn't much maths or programming in the videos, so I'm guessing there was less scope for writing questions that might trip people up.
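The contrast in one line: batch gradient descent averages the gradient over all m examples before each step, while stochastic gradient descent steps after every single example (and mini-batch sits in between):

$$\theta := \theta - \frac{\alpha}{m}\sum_{i=1}^{m}\nabla_\theta\,\mathrm{cost}\bigl(\theta; x^{(i)}, y^{(i)}\bigr) \;\;\text{(batch)} \qquad \theta := \theta - \alpha\,\nabla_\theta\,\mathrm{cost}\bigl(\theta; x^{(i)}, y^{(i)}\bigr) \;\;\text{(stochastic)}$$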

Day 64: 23 May 2018

Today's Progress: Finished the course! Week 11 was about the same length as week 10 - maybe a bit shorter, in fact. This week we covered machine learning pipelines and ceiling analysis.

Thoughts: It feels great to complete the course. I've learned a lot about machine learning and Octave in the last 40 days, and I'm sure a lot of it will come in useful. And the certificate looks pretty good too! I'm considering getting a nice printout of it and having it framed. :)

Link to work: Coursera Machine Learning course certificate

Day 65: 24 May 2018

Today's Progress: I started back where I left off with the Deep Learning Coursera speciality - course two, entitled "Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization". I'll use IDNN for short, I think.

Thoughts: It is a bit of a jolt to come from the easier algorithms at the end of the machine learning course straight back into deep neural networks. And the algorithms are only getting more complicated - the focus in week one of the course is on dropout regularisation, where you just randomly remove nodes from your network for every training example, which means that the cost function is no longer well defined. Also, the notation is different from the machine learning course, which doesn't help. I may have to review some of the first deep learning course if I get lost.
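The inverted-dropout idea from the videos, in a few lines of NumPy (my own illustration, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(A, keep_prob):
    """Inverted dropout on a layer's activations A: each unit is kept
    with probability keep_prob, and dividing by keep_prob keeps the
    expected activations unchanged - so at test time you simply
    don't apply dropout at all."""
    mask = rng.random(A.shape) < keep_prob
    return (A * mask) / keep_prob
```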

Day 66: 25 May 2018

Today's Progress: Finished the videos for week one, and took the quiz. Learned some more about regularisation and gradient checking, as well as vanishing and exploding gradients and techniques for setting up the neural network weights.

Thoughts: I have a pretty bad cold today so it wasn't easy to convince myself to start studying, but once I started it wasn't too bad.
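For my notes, gradient checking in essence: nudge each parameter up and down by a tiny $\varepsilon$ and compare the two-sided difference quotient against the analytic gradient from back-propagation:

$$\frac{\partial J}{\partial \theta_i} \approx \frac{J(\theta + \varepsilon e_i) - J(\theta - \varepsilon e_i)}{2\varepsilon}$$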

Day 67: 26 May 2018

Today's Progress: Finished the first programming exercise of three for week one. This one was about setting up your weights properly.

Thoughts: I'm guessing that this exercise is the simple one, and the second and third ones will be harder. Having an easy exercise today was just right, though, as I'm still not 100% because of my cold. (It's better than yesterday, though.)

Day 68: 27 May 2018

Today's Progress: Finished week one. The remaining two programming exercises were on regularisation and gradient checking.

Thoughts: It was a bit of a slog to get through the other two programming exercises today. For one thing, I kept getting tripped up by the automatic grading system. (One of them was because I was using a numpy.float64 instead of a regular Python float - the other, I think, was that I used < instead of <=, but I'm not certain.) But I wanted to finish them today because otherwise I would have to switch to a later course session.

Day 69: 28 May 2018

Today's Progress: Started the week two videos. This week is all about (even more) alternatives to gradient descent - mini-batch gradient descent, momentum, RMSprop, and Adam.

Thoughts: The Machine Learning course was good preparation for this, not least because I've covered mini-batch gradient descent before. I'll be taking the quiz and starting on the programming exercise tomorrow, so I'll see how that goes.
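A summary for future me: Adam is essentially momentum (the $v$ term) and RMSprop (the $s$ term) combined, with bias correction for the early iterations:

$$v := \beta_1 v + (1 - \beta_1)\,dW, \qquad s := \beta_2 s + (1 - \beta_2)\,(dW)^2$$

$$W := W - \alpha\,\frac{v / (1 - \beta_1^t)}{\sqrt{s / (1 - \beta_2^t)} + \varepsilon}$$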

Day 70: 29 May 2018

Today's Progress: Finished the week two quiz and started on the programming exercise.

Thoughts: The other two videos for this week were mostly things I had covered before, but there was one very interesting observation about local minima in higher-dimensional spaces that was new to me. Namely, that it is very rare to have a true local minimum in very high-dimensional spaces, because then the curves in all of the dimensions would have to be pointing the same way (the "saddle" effect). I haven't done enough of the programming exercise to see if it will be tricky, but so far it has been fairly easy.

Day 71: 30 May 2018

Today's Progress: Continued with the programming exercise, although most of my time was spent reading about NumPy array indexing.

Thoughts: I have always found NumPy array indexing kind of confusing, but thanks to the NumPy docs and especially this StackOverflow answer I now have a much better handle on things.
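A few of the cases that used to trip me up, as a toy example:

```python
import numpy as np

A = np.arange(12).reshape(3, 4)

A[1, 2]            # basic indexing: a single element -> 6
A[1]               # row 1, as a view -> array([4, 5, 6, 7])
A[:, [0, 2]]       # fancy indexing with a list: a *copy* of columns 0 and 2
A[A > 5]           # boolean mask: 1-D array of all elements greater than 5
A[[0, 1], [1, 3]]  # paired fancy indices -> array([A[0, 1], A[1, 3]]) = [1, 7]
```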

Day 72: 31 May 2018

Today's Progress: Finished the programming exercise.

Thoughts: I managed to stump myself for a while with a misplaced bracket, and also the formulae have so many subscripts and superscripts that they're getting pretty hard to read. But overall, not so difficult an assignment this time.

Day 73: 1 June 2018

Today's Progress: Started the videos for week three. This week is about tuning hyperparameters, applying batch normalisation, and TensorFlow.

Thoughts: Batch norm is a pretty interesting idea that will help a lot in training deeper networks. And I'm excited at the prospect of finally learning a real ML library, instead of just using NumPy and Octave for everything.
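The core of batch norm, as I understand it from the videos: normalise each layer's pre-activations over the mini-batch, then let the network learn the best scale and shift back through the parameters $\gamma$ and $\beta$:

$$\mu = \frac{1}{m}\sum_i z^{(i)}, \quad \sigma^2 = \frac{1}{m}\sum_i \bigl(z^{(i)} - \mu\bigr)^2, \quad z^{(i)}_{\text{norm}} = \frac{z^{(i)} - \mu}{\sqrt{\sigma^2 + \varepsilon}}, \quad \tilde z^{(i)} = \gamma\,z^{(i)}_{\text{norm}} + \beta$$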

Day 74: 2 June 2018

Today's Progress: Finished the week 3 quiz.

Thoughts: I didn't actually start using TensorFlow today, although I did get to see a video about how it works. The real TensorFlowing starts tomorrow.

Day 75: 3 June 2018

Today's Progress: Started the TensorFlow programming exercise.

Thoughts: I'm liking TensorFlow so far. Writing out the cost function for my neural networks just got a whole lot easier! Plus, I got to use the Python matrix multiplication operator for the first time ever.
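For example, a layer's linear step (toy shapes of my own choosing):

```python
import numpy as np

W = np.random.randn(4, 3)  # weights for a layer with 3 inputs and 4 units
A = np.random.randn(3, 5)  # activations for a mini-batch of 5 examples
b = np.zeros((4, 1))       # biases, broadcast across the mini-batch

Z = W @ A + b              # identical to np.dot(W, A) + b, but far more readable
```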

Day 76: 4 June 2018

Today's Progress: Still working through the TensorFlow programming exercise.

Thoughts: I'm taking this one nice and slow - took a little detour today to get TensorFlow set up on my home machine with pipenv, which turned out to be pleasantly easy. I've just finished the introductory material on TensorFlow, and tomorrow I'll make a real neural net with it.

Day 77: 5 June 2018

Today's Progress: Finished the TensorFlow programming exercise.

Thoughts: I suppose it's to be expected, but writing neural nets in TensorFlow is a lot less error-prone than writing them in straight NumPy. And the back-propagation step is completely taken care of for you, which is marvellous. The problem is more to do with knowing how the library works and how you fit the various bits together to make a model. And using good models, of course. That's it for the Improving Deep Neural Networks course, although the grader for the programming exercise appears to be broken, so I can't post my certificate yet. I'll be moving straight on to the next course in the speciality: "Structuring Machine Learning Projects".

Day 78: 6 June 2018

Today's Progress: Got the programming exercise accepted, and started on the Structuring Machine Learning Projects course (SMLP for short).

Thoughts: It turns out that my assignments were being failed because I was using Python's @ operator instead of TensorFlow's tf.matmul function. They both do the same thing, though, so that's definitely a Coursera grader bug. I haven't got that far into the new course yet - I've just watched the first few videos - although a few things that came up in the Machine Learning course have come up again, which is nice review.

Link to work: Coursera Improving Deep Neural Networks course certificate

Day 79: 7 June 2018

Today's Progress: Finished the week one quiz for SMLP.

Thoughts: The quiz this time was pretty interesting - we were given questions simulating a real-world ML system and its lifecycle, and had to make decisions about what to do next. Some of them weren't so obvious!

Day 80: 8 June 2018

Today's Progress: Started the week two videos for SMLP.

Thoughts: The videos for this week are about data - how to work out what errors to focus on, what to do about mislabeled data, and how to work with different data distributions. Some of it I learned in the ML course, but some of it is new. Pretty interesting stuff.

Day 81: 9 June 2018

Today's Progress: Finished the SMLP course.

Thoughts: No programming exercises in this course, and there were only two weeks, so it was over very quickly. The quiz for week two was similar to week one - a simulation of a machine learning project. Again, some of the questions were pretty tricky.

Link to work: Coursera Structuring Machine Learning Projects course certificate

Day 82: 10 June 2018

Today's Progress: Started the next course in the deeplearning.ai speciality: "Convolutional Neural Networks". So far I've watched the videos on the basic theory of the convolution operation.

Thoughts: It turns out that the convolution operation is used for edge detection, which I found pretty interesting. I have yet to see how it fits in with neural nets, but that will come in the next few days as I work through the course.
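To make the edge detection concrete, here's a toy cross-correlation (which is what deep learning libraries actually compute when they say "convolution") with a vertical edge filter - my own illustration, not course code:

```python
import numpy as np

def convolve2d(image, kernel):
    """Naive 'valid' cross-correlation: slide the kernel over the
    image and take the elementwise-product sum at each position."""
    kh, kw = kernel.shape
    out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# An image that is dark on the left and bright on the right: the
# vertical edge filter responds only where the intensity changes.
image = np.hstack([np.zeros((6, 3)), np.ones((6, 3))])
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]])
print(convolve2d(image, kernel))  # nonzero only in the middle columns
```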

Day 83: 11 June 2018

Today's Progress: Did the quiz for week one of the Convolutional Neural Networks course (ConvNet from now on).

Thoughts: The quiz had a couple of tricky questions, but maybe not as tricky as the quizzes for the previous course. I'm feeling pretty good about the basics of ConvNets now. I'm looking forward to the programming exercises so I can actually make one myself.

Day 84: 12 June 2018

Today's Progress: Started the first programming exercise for week one of ConvNets. This one is about implementing convolutional neural nets from the ground up.

Thoughts: Took it pretty relaxed today. I got as far as doing one forward pass of a ConvNet, although my implementation is giving an error at the moment. Will debug it tomorrow.

Day 85: 13 June 2018

Today's Progress: Still working through the programming assignments.

Thoughts: I managed to debug yesterday's error, although I didn't get much further than that today. (It didn't help that my notebook from yesterday didn't get saved, which meant I had to type in all of yesterday's work again.)

Day 86: 14 June 2018

Today's Progress: Still working through the programming assignments. I finished coding the forward pass of a ConvNet, and now I'm starting on the backward pass.

Thoughts: The backward pass is going pretty well so far, although I haven't had to look at the calculus at all as part of the course. Maybe I should look at it separately later - or maybe take another calculus course on Coursera after I finish the deeplearning.ai speciality.

Day 87: 15 June 2018

Today's Progress: Almost done with the programming assignments. I'm on the last function, though it's a tricky one.

Thoughts: I have the feeling that it would be easier if I was allowed to just write the code, and not have to try and follow the Coursera code suggestions. Sometimes it feels like trying to read someone's mind... But maybe it will be clearer in the morning.

Day 88: 16 June 2018

Today's Progress: Finished the programming exercise and started on the week two videos.

Thoughts: The programming exercise was much clearer this morning. It definitely pays to take some time away from the problem sometimes. Week two is about case studies of convolutional neural networks, so by the end of the week I should have seen lots of examples of good network architectures that I can use.

Day 89: 17 June 2018

Today's Progress: Learned about residual neural networks (ResNets) and the Inception network.

Thoughts: Both ResNets and Inception are about making deeper and deeper neural networks (and yes, the name of the Inception network does come from the film). We are talking about 100s of layers, which is pretty impressive. I'm looking forward to seeing how these ideas are integrated in the programming exercises, which I will probably get to tomorrow.
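The heart of it is the skip connection. A simplified identity block, assuming the Keras functional API (this sketch is mine, and it assumes the input already has `filters` channels so the addition lines up):

```python
from keras.layers import Activation, Add, BatchNormalization, Conv2D

def identity_block(x, filters):
    """Simplified residual block: the input skips past two conv
    layers and is added back before the final activation, so the
    block only has to learn a residual on top of the identity."""
    shortcut = x
    x = Conv2D(filters, 3, padding="same")(x)
    x = BatchNormalization()(x)
    x = Activation("relu")(x)
    x = Conv2D(filters, 3, padding="same")(x)
    x = BatchNormalization()(x)
    x = Add()([x, shortcut])
    return Activation("relu")(x)
```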

Day 90: 18 June 2018

Today's Progress: Did the week two quiz.

Thoughts: The rest of the week two videos were about practical advice for using ConvNets, which I'm sure will come in handy. The quiz was mostly about ResNets and the Inception network, though, which was a good test of what I learned in the videos the couple of days before. The programming exercises this week use Keras, which I've been looking forward to getting started with.

Day 91: 19 June 2018

Today's Progress: Started the Keras tutorial.

Thoughts: Keras seems pretty easy to use so far. (I suppose anything would seem easy after coding neural nets by hand in NumPy and Octave, but hey.) It's great that it frees up your energy to focus on the actual architecture of your neural net, rather than having to worry about whether you got all of the fiddly calculations right. This exercise wasn't graded, so I'll see how I do on the graded one tomorrow.

Day 92: 20 June 2018

Today's Progress: Finished the Keras programming exercise. Or at least I think I did - my model is still training as I write this.

Thoughts: It was surprisingly simple to make a 50-layer-deep convolutional neural network with Keras. It's a testament to the care the Keras team have taken in giving the library an intuitive API. And that's with the functional API - I gather the sequential API is even easier.

Day 93: 21 June 2018

Today's Progress: Started the videos for week three. This week is about object localisation and detection.

Thoughts: The videos I watched today were pretty interesting, especially the part where I learned how facial recognition and posture recognition worked. It was one of those "aha" moments - I've been seeing facial recognition in action for a long time (think Snapchat filters) and seeing how it works seems really obvious in hindsight. Hindsight is always 20/20, I suppose.

Day 94: 22 June 2018

Today's Progress: Finished the quiz for week three.

Thoughts: The quiz for this week seemed easier than for previous weeks - I suppose that is because there was less tricky maths involved, and no questions with more than one valid answer. Perhaps Prof. Ng is saving up the difficult stuff for the programming exercise?

Day 95: 23 June 2018

Today's Progress: Finished the object detection programming exercise.

Thoughts: This one definitely took more than an hour. But it was pretty cool to see the result - I effectively made an object detector for self-driving cars. (The Coursera folks trained the network and provided quite a few helper functions, though.) That's week three down - only one more to go before I finish the course.

Day 96: 24 June 2018

Today's Progress: Started the videos for week four. This week is about neural style transfer and facial recognition.

Thoughts: The neural transfer stuff looks interesting - it might be a fun one to try by myself outside of the course.
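The idea in one equation: generate an image G by gradient descent on a cost that trades off matching the content image C against matching the style image S:

$$J(G) = \alpha\,J_{\text{content}}(C, G) + \beta\,J_{\text{style}}(S, G)$$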

Day 97: 25 June 2018

Today's Progress: Finished the week four quiz.

Thoughts: There were a couple of tricky questions in this quiz, but it is continuing the general trend of the quizzes not being as tough as before. (Probably because there is less maths in these later topics.)

Day 98: 26 June 2018

Today's Progress: Started the neural style transfer programming exercise.

Thoughts: I almost finished this exercise in one go, but the last function looked like it would take just a bit too much time. There's still one more exercise left in the course after this one, though. It's going to be pretty cool to test out my neural style transfer with some of my own pictures when I've finished the exercise.

Day 99: 27 June 2018

Today's Progress: Still working on the neural style transfer programming exercise.

Thoughts: Yesterday's progress was deceptive - I seem to have introduced a bug in an earlier step that means that my final generated image doesn't match up to the expected outputs. This kind of bug is probably the most time-consuming to track down, as I don't really have much of a clue where it might be. Will come back to this tomorrow. Hopefully I can solve it before my 100 days are up. ;)

Day 100: 28 June 2018

Today's Progress: Finished! Both the 100 days, and the ConvNet course.

Thoughts: It's been a long 100 days! There were good days and bad days, days where I did lots more than one hour, and days where I did not-quite-an-hour-but-it's-probably-enough. And even if I had to do my study on the train or at lunchtime at work, I genuinely stuck with it every single day, something I'm very proud of. I still have one more course to go in the deeplearning.ai speciality, so I will probably do that next, but first I will take a well-deserved break over the weekend.

Link to work: Coursera Convolutional Neural Networks course certificate