This is a graduate topics course in computational economics, with applications in data science and machine learning.
- Get a GitHub ID and apply for the Student Developer Pack to get further free features
- Consider clicking "Watch" at the top of this repository to see file changes
- See the Syllabus for more details
- See problemsets.md for the problem sets
Paul
- September 4: Environment and Introduction to Julia
  - Intro slides
  - Environment: read one or both of these on your own and install Julia, IJulia, and VSCode, preferably before the first class
  - In class: Motivating econometric examples
  - Self-study: Introductory Examples or Chapter 1 of Scientific Programming in Julia
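For instance, once Julia itself is installed, the Jupyter kernel can be added from the built-in package manager (a minimal sketch, assuming only the registered IJulia package):

```julia
# Run from the Julia REPL; Pkg is the built-in package manager
using Pkg
Pkg.add("IJulia")   # installs the Jupyter kernel for Julia
```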
- September 9: Integration
  - Slides
  - Self-study: Julia Essentials and Fundamental Types
  - Self-study: Chapter 2 of Scientific Programming in Julia
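As a taste of the topic, here is a minimal integration sketch (my illustration, not lecture code; it assumes the registered QuadGK.jl package) contrasting adaptive quadrature with a crude Monte Carlo estimate:

```julia
using QuadGK, Statistics

f(x) = exp(-x^2)

# Adaptive Gauss-Kronrod quadrature returns (integral, error estimate)
I, err = quadgk(f, 0.0, 1.0)

# Crude Monte Carlo estimate of the same integral over [0, 1]
I_mc = mean(f.(rand(100_000)))
```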
- September 11: Nonlinear Equation Solving
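A minimal root-finding sketch (my illustration, assuming the registered NLsolve.jl package; not the lecture's code):

```julia
using NLsolve

# Solve the 2x2 system x₁² + x₂² = 1, x₁ - x₂ = 0, written in residual form F(x) = 0
function f!(F, x)
    F[1] = x[1]^2 + x[2]^2 - 1
    F[2] = x[1] - x[2]
end

res = nlsolve(f!, [0.5, 0.5])   # Newton-type solve from an initial guess
res.zero                         # ≈ [√2/2, √2/2]
```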
- September 16: Project Best Practices
  - Slides
  - Self-study: Package development, unit tests, & CI
  - Self-study: Testing and Packages
  - Self-study: Git and GitHub
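To illustrate the unit-testing side, a minimal sketch with Julia's built-in Test standard library (the function and test are hypothetical, not from the class project):

```julia
using Test

# A hypothetical function we want under test
ols(X, y) = X \ y

@testset "ols recovers known coefficients" begin
    X = [ones(100) collect(1.0:100.0)]
    β = [2.0, 0.5]
    y = X * β                       # noiseless data, so recovery should be exact
    @test ols(X, y) ≈ β atol = 1e-8
end
```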
- September 18: Clean up example project, introduction to automatic differentiation
  - In class: automatic differentiation packages, slides (a short ForwardDiff sketch follows below)
  - Self-study: Automatic Differentiation in Scientific Programming in Julia
  - Self-study: Differentiation for Hackers
  - Self-study: Engineering Trade-Offs in Automatic Differentiation: from TensorFlow and PyTorch to Jax and Julia
  - Optional:
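A minimal forward-mode AD sketch (my example, assuming the registered ForwardDiff.jl package):

```julia
using ForwardDiff

# Gradient of a scalar function of a vector, computed by forward-mode AD
f(x) = sum(abs2, x) + exp(x[1] * x[2])

x = [1.0, 2.0]
g = ForwardDiff.gradient(f, x)   # exact to machine precision, no finite differences
H = ForwardDiff.hessian(f, x)    # second derivatives work the same way
```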
- September 23: Optimization
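A minimal numerical-optimization sketch (my illustration, assuming the registered Optim.jl package; not the lecture's code):

```julia
using Optim

# Rosenbrock function, a standard test problem with a curved, ill-conditioned valley
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

res = optimize(rosenbrock, zeros(2), BFGS(); autodiff = :forward)
Optim.minimizer(res)   # ≈ [1.0, 1.0]
```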
- September 25: Extremum Estimation
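Extremum estimators maximize or minimize a sample objective; a minimal sketch (my example, again using Optim.jl) is maximum likelihood for an exponential model:

```julia
using Optim, Random

Random.seed!(1)
y = -log.(rand(500)) ./ 2.0      # simulate Exponential data with rate λ = 2
n, S = length(y), sum(y)

# Negative log-likelihood in terms of θ = log(λ), which keeps λ > 0
negll(θ) = -(n * θ[1] - exp(θ[1]) * S)

res = optimize(negll, [0.0], BFGS(); autodiff = :forward)
λ̂ = exp(Optim.minimizer(res)[1])   # should be close to 2
```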
- October 2: Function Approximation
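A minimal function-approximation sketch (my illustration, plain linear algebra): fit a low-degree polynomial by least squares on a grid:

```julia
using LinearAlgebra

# Approximate f on [-1, 1] with a degree-5 polynomial by least squares
f(x) = exp(x) * sin(3x)
grid = range(-1, 1; length = 50)
V = [x^j for x in grid, j in 0:5]        # Vandermonde-style basis matrix
c = V \ f.(grid)                          # least-squares coefficients

approx(x) = sum(c[j + 1] * x^j for j in 0:5)
maximum(abs, approx.(grid) .- f.(grid))   # rough in-sample error check
```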
- October 7: Code Performance
  - Coding for performance; be sure to look at the 2023 branch for the recent additions
  - GPU usage
  - Self-study: SIMDscan. Since it briefly came up in class, and I was curious about it, I made a little package for calculating things like cumulative sums and autoregressive simulations using SIMD (see the sketch below)
  - Self-study: Need for Speed
  - Self-study: Performance Tips
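A minimal performance sketch (my example, assuming the registered BenchmarkTools.jl package): `@inbounds` and `@simd` let the compiler vectorize a simple reduction, the kind of gain SIMDscan targets for scan operations:

```julia
using BenchmarkTools

function simd_sum(x)
    s = zero(eltype(x))
    @inbounds @simd for i in eachindex(x)
        s += x[i]
    end
    return s
end

x = rand(10^6)
@btime simd_sum($x)   # compare against the library reduction
@btime sum($x)
```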
- October 9: Dynamic Programming
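A minimal value function iteration sketch for a toy cake-eating problem on a grid (my illustration, pure Julia):

```julia
# Toy cake-eating problem: V(w) = max_{w′ < w} log(w - w′) + β V(w′)
function vfi(wgrid, β; iters = 500)
    V = zeros(length(wgrid))
    for _ in 1:iters                      # fixed-point iteration on the Bellman operator
        Vnew = similar(V)
        for (i, w) in enumerate(wgrid)
            # choose next-period wealth w′ on the grid; consume c = w - w′
            Vnew[i] = maximum(w > w′ ? log(w - w′) + β * V[j] : -Inf
                              for (j, w′) in enumerate(wgrid))
        end
        V = Vnew
    end
    return V   # the smallest grid point has no feasible choice and stays -Inf
end

V = vfi(range(1e-3, 1.0; length = 201), 0.95)
```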
- October 16: Debiased Machine Learning
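The partialling-out idea behind debiased ML fits in a short, deliberately simplified sketch (my illustration: OLS nuisance estimates stand in for ML learners, with two-fold cross-fitting):

```julia
using LinearAlgebra, Random

function dml_example(; n = 1000, θ = 0.5)
    Random.seed!(2)
    x = randn(n, 5)
    d = x * ones(5) .+ randn(n)                    # treatment depends on controls
    y = θ .* d .+ x * fill(2.0, 5) .+ randn(n)     # true effect is θ

    half = (1:n÷2, n÷2+1:n)
    num = den = 0.0
    for (tr, te) in ((half[1], half[2]), (half[2], half[1]))
        X_tr = [ones(length(tr)) x[tr, :]]
        X_te = [ones(length(te)) x[te, :]]
        ũ = y[te] .- X_te * (X_tr \ y[tr])   # residualize y using the other fold
        ṽ = d[te] .- X_te * (X_tr \ d[tr])   # residualize d using the other fold
        num += dot(ṽ, ũ)
        den += dot(ṽ, ṽ)
    end
    return num / den                          # ≈ θ
end
```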
Jesse
Slides for the lectures can be found here.
- October 21: Factorizations, Direct Methods, and Intro to Regularization
  - SLIDES: Factorizations and Direct Methods
  - Introduction to regularization and implicit bias of algorithms
  - Numerical Linear Algebra applying generic programming
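A minimal direct-methods sketch using the LinearAlgebra standard library (my example): least squares via QR, and a symmetric positive definite solve via Cholesky:

```julia
using LinearAlgebra, Random

Random.seed!(3)
X, y = randn(100, 3), randn(100)

β_qr = qr(X) \ y      # least squares via the QR factorization, numerically stable
A = X'X + I           # symmetric positive definite matrix
F = cholesky(A)       # A = F.U' * F.U
z = F \ (X'y)         # reuse the factorization for cheap solves
```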
- October 23: Iterative Methods, Geometry of Optimization, Rethinking LLS, and Preconditioning
  - SLIDES: Least Squares and Iterative Methods
  - Iterative Methods
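A minimal conjugate gradient sketch for a symmetric positive definite system (my implementation of the textbook algorithm, not lecture code):

```julia
using LinearAlgebra

function cg(A, b; tol = 1e-8, maxiter = length(b))
    x = zero(b)
    r = b - A * x                 # residual
    p = copy(r)                   # search direction
    rs = dot(r, r)
    for _ in 1:maxiter
        Ap = A * p
        α = rs / dot(p, Ap)
        x .+= α .* p
        r .-= α .* Ap
        rs_new = dot(r, r)
        sqrt(rs_new) < tol && break
        p .= r .+ (rs_new / rs) .* p
        rs = rs_new
    end
    return x
end

A = let M = randn(50, 50); M'M + 50I end   # SPD test matrix
b = randn(50)
norm(A * cg(A, b) - b)                      # should be ≈ 0
```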
- October 28: Overview of Machine Learning
  - SLIDES: Intro to ML
  - Finalize discussion of iterative methods and preconditioning
  - Introduce key concepts: supervised, unsupervised, semi-supervised, and reinforcement learning; kernel methods; deep learning; etc.
  - Basic introduction to JAX and Python frameworks
- October 30: Differentiable everything! JAX and auto-differentiation, JVPs, etc.
  - SLIDES: Differentiation
  - Reverse-mode and forward-mode AD
  - JVPs and VJPs (see the sketch below)
  - Implicit differentiation of systems of ODEs, linear systems, etc.
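The lecture uses JAX; for continuity with the first half of the course, here is the same JVP/VJP idea sketched in Julia (my example, assuming the registered ForwardDiff.jl and Zygote.jl packages):

```julia
using ForwardDiff, Zygote

f(x) = [x[1]^2 + x[2], sin(x[2])]
x, v = [1.0, 2.0], [1.0, 0.0]

# JVP (forward mode): the directional derivative J(x) * v
jvp = ForwardDiff.derivative(t -> f(x .+ t .* v), 0.0)

# VJP (reverse mode): ȳᵀ J(x), computed via a pullback
y, back = Zygote.pullback(f, x)
vjp = back([1.0, 0.0])[1]
```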
- November 4: High-Dimensional Optimization and Stochastic Optimization
  - SLIDES: Optimization
  - Gradient descent variations (see the sketch below)
  - Using unbiased estimates instead of gradients
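A minimal sketch of gradient descent and its momentum variation on an ill-conditioned quadratic (my illustration):

```julia
using LinearAlgebra

# Minimize f(x) = ½ xᵀ A x, a quadratic with condition number 100
A = Diagonal([1.0, 100.0])
∇f(x) = A * x

function descend(x; η = 0.009, γ = 0.9, momentum = false, iters = 200)
    v = zero(x)
    for _ in 1:iters
        v = γ .* v .+ ∇f(x)                       # heavy-ball velocity
        x = x .- η .* (momentum ? v : ∇f(x))
    end
    return x
end

norm(descend([1.0, 1.0]))                    # plain gradient descent
norm(descend([1.0, 1.0]; momentum = true))   # momentum copes better with bad conditioning
```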
- November 6: Stochastic Optimization Methods and Machine Learning Pipelines
  - SLIDES: SGD variations in Optimization
  - W&B sweeps, and code in `lectures/lectures/examples`
  - SGD and methods for variance reduction in gradient estimates (see the sketch below)
  - Using SGD variants in practice within ML pipelines in JAX and PyTorch
  - Readings: Probabilistic Machine Learning: An Introduction, Section 5.4 on ERM
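A minimal minibatch SGD sketch on least squares (my illustration): each minibatch gradient is an unbiased estimate of the full gradient, and larger batches reduce its variance at a higher cost per step:

```julia
using LinearAlgebra, Random

Random.seed!(4)
n, p = 10_000, 10
X, β = randn(n, p), randn(p)
y = X * β .+ 0.1 .* randn(n)

function sgd(X, y; batch = 32, η = 0.01, epochs = 5)
    b = zeros(size(X, 2))
    for _ in 1:epochs, start in 1:batch:size(X, 1)
        idx = start:min(start + batch - 1, size(X, 1))
        Xb, yb = X[idx, :], y[idx]
        g = 2 .* Xb' * (Xb * b .- yb) ./ length(idx)   # unbiased gradient estimate
        b .-= η .* g
    end
    return b
end

norm(sgd(X, y) - β)   # small after a few passes over the data
```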
- November 18: Neural Networks, Representation Learning, Double-Descent
  - SLIDES: Deep Learning and Representation Learning; started Double-Descent and Regularization
  - Readings
    - Probabilistic Machine Learning: An Introduction, Sections 13.2.1 to 13.2.6 on MLPs and the importance of depth
    - Probabilistic Machine Learning: An Introduction, Sections 13.5.1 to 13.5.6 on regularization
    - Mark Schmidt's CPSC440 Notes on Neural Networks (see the CPSC340 lectures for a more basic treatment of these topics)
    - Mark Schmidt's CPSC440 Notes on Double-Descent Curves (see the CPSC340 lectures for a more basic treatment of these topics)
  - Optional extra material
    - Probabilistic Machine Learning: Advanced Topics, Section 32 on representation learning
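A minimal MLP sketch in plain Julia (my illustration of the forward pass only; real training would use Flux.jl or a Python framework):

```julia
using Random
Random.seed!(5)

# A hand-rolled MLP layer; depth composes simple nonlinear features
struct Layer
    W::Matrix{Float64}
    b::Vector{Float64}
end
Layer(in, out) = Layer(0.1 .* randn(out, in), zeros(out))
(l::Layer)(h) = tanh.(l.W * h .+ l.b)

# Two hidden layers of width 32 (a real regression model would keep the last layer linear)
layers = (Layer(2, 32), Layer(32, 32), Layer(32, 1))
mlp(x) = foldl((h, l) -> l(h), layers; init = x)

mlp(randn(2))   # a 1-element output vector
```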
- November 20: Finish Double-Descent; Intro to Kernel Methods and Gaussian Processes
  - SLIDES: Kernel Methods and Gaussian Processes; finish Double-Descent and Regularization
  - Readings
    - If you didn't do it already, read Mark Schmidt's CPSC440 Notes on Double-Descent Curves and Overparameterization (see the CPSC340 lectures for a more basic treatment of these topics)
    - Probabilistic Machine Learning: An Introduction, Sections 17.1 and 17.2 on kernel methods and Gaussian processes
    - CPSC340 has some notes on the "kernel trick" (you can skip over the details on images); also see the more advanced notes on kernel methods
    - Finally, your problem set will involve running some simple Gaussian processes with GPyTorch, which will become easier to understand after seeing the theory (see the sketch below)
    - Probabilistic Machine Learning: Advanced Topics, Sections 18.1 to 18.3 on GPs and kernels
    - Researchers working on GPs love the online textbook Gaussian Processes for Machine Learning, so you may want to read the intro section on GP regression
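The math behind GP regression fits in a few lines of Julia (my sketch of the standard posterior formulas with an RBF kernel; the problem set itself uses GPyTorch):

```julia
using LinearAlgebra, Random

Random.seed!(6)
rbf(x, x′; ℓ = 0.3) = exp(-(x - x′)^2 / (2ℓ^2))

X  = collect(range(0, 1; length = 20))       # training inputs
y  = sin.(2π .* X) .+ 0.05 .* randn(20)      # noisy observations
Xs = collect(range(0, 1; length = 100))      # test inputs
σ² = 0.05^2                                  # observation noise variance

K   = [rbf(a, b) for a in X,  b in X] + σ² * I
Ks  = [rbf(a, b) for a in Xs, b in X]
Kss = [rbf(a, b) for a in Xs, b in Xs]

μ = Ks * (K \ y)              # GP posterior mean at the test inputs
Σ = Kss - Ks * (K \ Ks')      # GP posterior covariance
```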
- November 25: Bayesian Methods and HMC
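A minimal Hamiltonian Monte Carlo sketch (my implementation of the textbook leapfrog-plus-Metropolis step; the target density is a placeholder standard normal, and in practice one would use a library such as Turing.jl):

```julia
using LinearAlgebra, Random

logp(x) = -0.5 * dot(x, x)   # standard normal target (placeholder assumption)
∇logp(x) = -x

# One leapfrog trajectory followed by a Metropolis accept/reject step
function hmc_step(x, ε, L)
    p = randn(length(x))                          # resample momentum
    x′, p′ = copy(x), p .+ (ε / 2) .* ∇logp(x)    # initial half step for momentum
    for i in 1:L
        x′ = x′ .+ ε .* p′                        # full step for position
        p′ = p′ .+ (i < L ? ε : ε / 2) .* ∇logp(x′)
    end
    H(x, p) = -logp(x) + 0.5 * dot(p, p)          # Hamiltonian = potential + kinetic
    return log(rand()) < H(x, p) - H(x′, p′) ? x′ : x
end

function run_hmc(x0; n = 1000, ε = 0.1, L = 20)
    x, draws = copy(x0), Vector{typeof(x0)}()
    for _ in 1:n
        x = hmc_step(x, ε, L)
        push!(draws, x)
    end
    return draws
end

draws = run_hmc(randn(2))
```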
- November 27: Applications
- December 2: Applications
- December 4: Applications
- December 18: Final Project due
Look under "Releases" or switch to another branch for earlier versions of the course.