During Fall 2023, we held three reading groups, each focused on a different topic within TCS. The groups were: Algebraic and Spectral Graph Theory, Analysis of Boolean Functions, and Theoretical Neuroscience and AI. Each group was run by an undergraduate student organizer along with a graduate student mentor.
The descriptions and tables below give a summary and the list of talks for each group.
This was the seventh iteration of the TCS seminar, which was run by Ekene Ezeunala.
Organizer: Ekene. Graduate Student Mentor: Binghui.
Description: We explore a variety of topics in spectral and algebraic graph theory, following Daniel Spielman's book of the same name. Topics include random walks, rubber bands, and electrical networks. The seminar builds up to a discussion of one of the seminal results in this area: a nearly-linear-time algorithm for solving Laplacian linear systems.
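As a taste of the central object, here is a minimal Python sketch (our illustration, not material from the book or the talks; the example graph and all names are assumptions) that builds the Laplacian L = D - A of a small path graph, inspects its spectrum, and solves a Laplacian linear system, the problem that the fast solvers in the final talk address.

```python
# A minimal sketch of the graph Laplacian L = D - A for a 4-vertex path
# graph 0 - 1 - 2 - 3. The example graph is an illustrative assumption.
import numpy as np

A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # combinatorial Laplacian

# L is symmetric positive semidefinite; its smallest eigenvalue is 0, with
# a constant eigenvector, because the graph is connected.
eigvals, eigvecs = np.linalg.eigh(L)
print(eigvals)        # first entry is (numerically) 0
print(eigvecs[:, 0])  # constant up to sign

# Solving L x = b for b orthogonal to the all-ones vector; nearly-linear-time
# solvers for exactly this system are the endpoint of the reading group.
b = np.array([1.0, 0.0, 0.0, -1.0])
x = np.linalg.pinv(L) @ b     # dense pseudoinverse: fine for a toy example
print(np.allclose(L @ x, b))  # True
```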
Resource | Title | Link |
---|---|---|
Spielman | Spectral and Algebraic Graph Theory | Link |
Date | Topic | Speaker | Reading |
---|---|---|---|
September 30th | Introduction, Motivation, and Background. The Spectral Theorem. | Ekene | Spielman Chap. 1; Fan R. K. Chung, Lectures on Spectral Graph Theory, Chap. 1 |
October 7th | Courant-Fischer Theorem. The graph Laplacian. | Ekene | Spielman Chaps. 2, 3 |
October 14th | Random Walks and Diffusion Problems. | Adi | Spielman Chap. 10 |
October 21st | Electrical Networks and Graphs. | Ryan | Spielman Chaps. 11, 12 |
October 28th | Midterm Break | | |
November 4th | Sampling Spanning Trees. | Ayaan | Spielman Chap. 13 |
November 11th | Graph Sparsifiers. | Binghui | Spielman Chap. 14 |
November 18th | Cheeger's Inequality. | Lyra | Spielman Chap. 21 |
December 2nd | Laplacian Solvers, and a Nearly-Linear-Time Laplacian Solver. | Ekene and Binghui | Spielman Chap. 36; [ST'04]; [KMP'10]; [KLOS'14]; [Madry'13] |
Organizers: Ashwin and Krish. Graduate Student Mentor: Yuhao.
Description: We cover foundational concepts in the analysis of Boolean functions, a widely applicable tool in theoretical computer science. We work through the first two chapters of Ryan O'Donnell's Analysis of Boolean Functions, culminating in the BLR linearity-testing algorithm and Arrow's theorem from voting theory.
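To make the linearity test concrete, here is a minimal Python sketch of the BLR test from O'Donnell Sec. 1.6 (our illustration, not the group's materials; the specific functions and parameters are assumptions): draw random x and y and check whether f(x) XOR f(y) = f(x XOR y).

```python
# A minimal sketch of the BLR linearity test: a function f: {0,1}^n -> {0,1}
# passes a random trial iff f(x) ^ f(y) == f(x ^ y). The test functions
# below are illustrative assumptions.
import random

n = 8  # inputs are n-bit strings, encoded as integers

def parity_f(x):
    # A linear (parity) function: XOR of bits 0 and 3. Always passes BLR.
    return ((x >> 0) & 1) ^ ((x >> 3) & 1)

def and_f(x):
    # AND of bits 0 and 3: not linear over GF(2), so BLR rejects sometimes.
    return ((x >> 0) & 1) & ((x >> 3) & 1)

def blr_accept_rate(f, trials=100_000):
    accepts = 0
    for _ in range(trials):
        x = random.randrange(1 << n)
        y = random.randrange(1 << n)
        accepts += (f(x) ^ f(y)) == f(x ^ y)
    return accepts / trials

print(blr_accept_rate(parity_f))  # 1.0: linear functions always pass
print(blr_accept_rate(and_f))     # strictly below 1: nonlinearity is caught
```

The BLR theorem makes this quantitative: if the test accepts f with probability 1 - ε, then f is ε-close to some linear function.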
Resource | Title | Link |
---|---|---|
O'Donnell | Analysis of Boolean Functions | Link |
Date | Topic | Speaker | Reading | Optional Exercises |
---|---|---|---|---|
October 2nd | Background | Krish and Ashwin | O'Donnell Sec. 1.1 - 1.3 | 1.1 (a-c, g-i), 1.2, 1.3, 1.4, 1.5 |
October 9th | Parseval's Identity | Ashwin and Krish | O'Donnell Sec. 1.4 | 1.6, 1.7, 1.13, 1.15 |
October 16th | Convolution | Szymon | O'Donnell Sec. 1.5 | 1.20, 1.21, 1.23, 1.24 |
October 23rd | BLR Linearity Testing | Nikhil | O'Donnell Sec. 1.6 | |
October 30th | Influences and Derivatives | Srirag | O'Donnell Sec. 2.12 - 2.22 | |
November 6th | Academic Holiday | | | |
November 13th | Poincaré Inequality | Krish and Ashwin | O'Donnell Sec. 2.27 - 2.38 | |
November 20th | Noise Stability | Assigned Reading | O'Donnell Sec. 2.40 - 2.47 | |
November 27th | The Noise Operator | Assigned Reading | O'Donnell Sec. 2.49 - 2.54 | |
December 4th | Arrow's Theorem | Yuhao | O'Donnell Sec. 2.5 | |
Organizer: Shujun. Graduate Student Mentor: Daniel.
Description: We explore topics in theoretical neuroscience as they relate to artificial intelligence. Talks focus on the mathematical and computational core of each weekly topic, featuring both proof-based and computational work. Interested students may optionally pursue a topic-related project that could develop into a paper.
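For a flavor of the early material, here is a minimal Python sketch (our illustration, not course code; every parameter value is an assumption) of a leaky integrate-and-fire neuron, the kind of single-neuron model treated in the opening chapters of Gerstner's Neuronal Dynamics.

```python
# A minimal leaky integrate-and-fire (LIF) neuron, simulated with Euler
# steps. All constants are illustrative assumptions, not from the book.
tau_m = 10.0      # membrane time constant (ms)
v_rest = -70.0    # resting potential (mV)
v_thresh = -55.0  # spike threshold (mV)
v_reset = -75.0   # post-spike reset potential (mV)
drive = 20.0      # constant input R*I, expressed in mV
dt = 0.1          # Euler step size (ms)

v = v_rest
spike_times = []

for step in range(int(200 / dt)):  # simulate 200 ms
    # Euler step of the membrane equation tau_m dv/dt = -(v - v_rest) + R*I
    v += (dt / tau_m) * (-(v - v_rest) + drive)
    if v >= v_thresh:              # threshold crossing: emit a spike
        spike_times.append(round(step * dt, 1))
        v = v_reset                # and reset the membrane potential

print(f"{len(spike_times)} spikes; first few at {spike_times[:3]} ms")
```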
Resource | Title | Link |
---|---|---|
Gerstner | Neuronal Dynamics | Link |
Date | Topic | Speaker | Reading |
---|---|---|---|
October 5th | Introduction and Background: Tools of the Trade | Shujun | Gerstner Chaps. 1-4; [PTHW'21] |
October 12th | Vision: Convolutional Networks and Beyond | Joseph | [RP'21]; [BSJKP'21] |
October 19th | Long Short-Term Memory and Transformers | Sammy | [HS'97]; [Vaswani et al. (2017)]; [CDKKLH'22] |
October 26th | Spiking Neural Networks: Boom or Bust? | Paarth | [TGKMM'19]; [LP'20] |
November 2nd | Attractor Networks: How is memory preserved in networks? | | Gerstner Chap. 17; [PAB'23] |
November 9th | Beyond Backpropagation: Deep Learning with Different Learning Rules | Sreyas | Gerstner Chap. 19; [Amit'19]; [JRGM'23] |
November 16th | Biologically Constrained Deep Learning | Audrey | Gerstner Chap. 20; [BB'06]; [SC'23] |
November 30th | Large Language Models and Theory of Mind | Everyone | General Discussion