This is a first course in random matrix theory, the study of the eigenvalues and eigenvectors of matrices with random entries, which is foundational to high-dimensional statistics and data science. Aside from the main ideas and modern applications of random matrices, a key goal will be to introduce you to the main concepts of probability in high dimensions: concentration of measure, the geometry of high-dimensional spaces and convex sets, Gaussian measure, and sharp transitions and threshold phenomena. The following is a (very) tentative ordered list of specific topics to be covered:
1. Gaussian matrices and dimensionality reduction
2. Classical theory of i.i.d. random matrices
3. Spiked matrix models and principal component analysis (PCA)
4. Matrix concentration inequalities
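To give a flavor of the first topic, here is a minimal numerical sketch (my own illustration, not part of the course materials; the dimensions are chosen arbitrarily) of how multiplying by a scaled Gaussian matrix approximately preserves pairwise distances, in the spirit of the Johnson–Lindenstrauss lemma:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 200  # number of points, ambient dimension, target dimension
X = rng.standard_normal((n, d))  # arbitrary data points in R^d

# Scaled Gaussian matrix: entries N(0, 1/k), so lengths are preserved on average
G = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ G.T  # projected points in R^k


def pairwise_dists(A):
    """Euclidean distances between all pairs of rows of A."""
    sq = (A ** 2).sum(axis=1)
    D2 = sq[:, None] + sq[None, :] - 2 * A @ A.T
    return np.sqrt(np.maximum(D2, 0))  # clip tiny negative rounding errors


D_before = pairwise_dists(X)
D_after = pairwise_dists(Y)
off_diag = ~np.eye(n, dtype=bool)
ratios = D_after[off_diag] / D_before[off_diag]
print(ratios.min(), ratios.max())  # distortions concentrate near 1
```

The Johnson–Lindenstrauss lemma makes this quantitative: a target dimension of order ε⁻² log n suffices to preserve all pairwise distances among n points up to a factor of 1 ± ε.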
I am the instructor of this course, Tim Kunisky, and the teaching assistant is AMS PhD student Yue Wu.
The best way to contact us is by email, at kunisky [at] jhu.edu and ywu166 [at] jhu.edu, respectively. Our office hours are as follows:
Class will meet Tuesdays and Thursdays, 9:00am to 10:15am in Gilman 55.
Below is a tentative schedule, to be updated as the semester progresses.
Date  Details 

Week 1  
Aug 27  1. Course logistics. Review of PCA and SVD as exploratory data analysis tools. Eckart–Young–Mirsky theorem. Multiplying by a Gaussian matrix: what does it do? What is it good for? How to think about Gaussian processes. 
Aug 29  2. Random matrices for dimensionality reduction: the Johnson–Lindenstrauss transform and lemma. Concentration inequalities and union bound arguments. 
Week 2  
Sep 3  3. Application of Johnson–Lindenstrauss to nearest neighbors. Ailon and Chazelle's fast Johnson–Lindenstrauss transform. Connection between Gaussian matrices and random projection. Uniformity of singular vectors. 
Sep 5  4. Concentration of singular values of short, wide matrices. Interpretation as a "matrix concentration" inequality. Epsilon-net arguments for discretizing matrix norms. 
Week 3  
Sep 10  5. Nonconstructive existence of epsilon nets over the unit sphere. Finish singular values of short, wide matrices. 
Sep 12  6. Application: compressed sensing with random sensing matrices and the restricted isometry property. First steps of limit theorems: what does convergence of empirical spectral distribution mean? Statistical meaning of Wishart matrix limit theorems. 
Week 4  
Sep 17  7. Formal definitions of weak convergence for random measures. Statement of Wigner's semicircle limit theorem. Intuition for the moment method for proving limit theorems. 
Sep 19  8. Review of proof of central limit theorem by Carleman's criterion and moment method. Moment calculations with Catalan numbers for semicircle limit theorem. 
Week 5  
Sep 24  9. Finish proof of weak convergence in probability for semicircle limit theorem. Discussion of extensions: universality and controlling extreme eigenvalues by moments. 
Sep 26  10. Brief discussion of Marchenko–Pastur limit theorem. Introduction to free probability. Renormalization proof of central limit theorem and matrix generalization. Difficulties in dealing with "tangled" matrix products. 
Week 6  
Oct 1  11. Free probability continued. Definition of asymptotic freeness and additive free convolution. Wigner's semicircle limit theorem as the free central limit theorem. 
Oct 3  12. Review and interpretation of free central limit theorem. Application: spectral graph theory and the spectra of locally treelike graphs. 
Week 7  
Oct 8  13. Application: landscapes and eigenvalues of Hessians of neural networks. 
Oct 10  14. Transform methods and multiplicative free convolution. Application: covariance estimation. 
Week 8  
Oct 15  15. El Karoui's covariance denoising method. Phase transition in spiked additive models and statement of BBP transition theorem. 
Oct 17  Fall Break: No class. 
Week 9  
Oct 22  16. Resolvent derivation of additive BBP transition. Application: community detection in dense stochastic block models. 
Oct 24  17. Application: nonlinear PCA and denoising spiked matrix models. 
Week 10  
Oct 29  18. Zoom Lecture: Using general-purpose concentration inequalities for random matrix theory: Efron–Stein and bounded difference inequalities. 
Oct 31  19. Concentration inequalities continued: Efron–Stein for random matrices, Poincaré inequalities. 
Week 11  
Nov 5  20. Subgaussian rates of concentration, log-Sobolev inequalities, and Gaussian Lipschitz concentration. Introduction to the noncommutative Khintchine inequality. 
Nov 7  21. Proof of noncommutative Khintchine inequality and variations. Matrix Chernoff and Bernstein as consequences, and basic examples. 
Week 12  
Nov 12  22. Alternative approach via matrix moment generating functions. 
Nov 14  23. Application: Random graphs. Possibly: community detection revisited. 
Week 13  
Nov 19  24. Application: Spectral sparsification. 
Nov 21  25. Spectral sparsification continued and sketch of Laplacian linear system solvers. 
Fall Recess: Nov 25–29  
Week 14  
Dec 3  26. Recent developments in nonasymptotic random matrix theory: improved noncommutative Khintchine inequalities and free probability connections. 
Dec 5  27. Recent developments continued: universality, a nascent unified nonasymptotic theory, and open problems of current interest. 
Final Exam Period  
Dec 17  Final presentations (6–9pm). 
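As a preview of the semicircle limit theorem covered in Weeks 4–5, the following is a minimal numpy sketch (my own illustration, not course code) comparing the empirical spectral distribution of a Wigner matrix against the prediction of the semicircle law:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Wigner matrix: symmetrize an i.i.d. Gaussian matrix and normalize so
# that the eigenvalues concentrate on (approximately) [-2, 2]
A = rng.standard_normal((n, n))
W = (A + A.T) / np.sqrt(2 * n)

eigs = np.linalg.eigvalsh(W)

# Fraction of eigenvalues in [-1, 1], empirically and under the
# semicircle density rho(x) = sqrt(4 - x^2) / (2*pi) on [-2, 2]
frac = np.mean(np.abs(eigs) <= 1)
pred = 1 / 3 + np.sqrt(3) / (2 * np.pi)  # integral of rho over [-1, 1]
print(frac, pred)  # close for large n
```

Repeating this with non-Gaussian entry distributions gives a hands-on look at the universality phenomenon discussed in Week 5.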
You do not need to buy any books for this course. You can find my lecture notes here. They are currently updated through Lecture 17.
The following are freely available books or lecture notes that cover some similar material and might be useful to you in addition to my notes:
Grades will be based on a small number of written homework assignments, class participation, and a final project concerning a recent research paper, open problem, or topic of interest related to the material we cover.
Homework will be posted here, and is to be submitted through Gradescope (see Canvas announcements for details). Please try to talk to me in advance if you need more time for an assignment.
Assigned  Due  Link 

Sep 12  Sep 30  Assignment 1 
Oct 7  Oct 23  Assignment 2 
Your final project is to do one of the following on a topic related to the content of this course: (1) read and digest a paper and present its content in your own words and style, with some elaboration that is not present in the original paper; (2) perform an interesting computational experiment motivated by something we have seen in class or something you read in a paper and report in detail on the results and their interpretation; or (3) for the intrepid, find an open problem related to something we have seen in class, try to work on it, and report your findings.
You will both give a short presentation (no more than 10 minutes) on your topic and submit a short written report.
The following are reasonable categories from which to choose a topic, along with a few references you might look into. You are also welcome to choose a topic of your own, provided that you describe it and its relevance to the course convincingly on Assignment 2.