My Notes

This page contains links to a bunch of notes that I have maintained while attending various courses, along with notes for some talks and reading projects. I use these mainly for my own reference. In each set of notes, I have tried to write up the most interesting and important parts as I understand them.

Errors abound.

Many parts of these notes are incomplete and may contain errors as well; I have been lazy about correcting many of them.

Analysis

  1. Complex Analysis. I have a document containing solutions to some good problems in complex analysis, written mainly in conjunction with the instructor's notes that I was following.
  2. Analysis 3. The main goal of the course was to study Metric Spaces and Fourier Series. There is a document for the main notes, and a set of homework problems.
  3. Calculus. aka Integration in higher dimensions. I have a set of nice problems, and a document containing the main notes.
  4. Analysis 2. aka Differential Calculus in higher dimensions. The notes are not in the best shape, but may still be good for reference. There are also a couple of homework problems, without problem statements.

Algebra

  1. Algebra 3. This course was a study of rings and fields, two of the most important algebraic structures in math. There is a document for the main notes, along with a set of homework problems.
  2. Algebra 2. I maintained a document of solutions to a lot of group-theoretic problems and some miscellaneous stuff. Not proper notes, but useful material anyway.
  3. Algebra 1. Here is a loose document which discusses some basic topics in linear algebra. This is an old one (taken in my very first semester).

Computer Science

  1. Reinforcement Learning. In this course we covered MDPs and various algorithms to solve them, multi-armed bandits, dynamic programming, Monte Carlo methods, temporal-difference learning, and some other notions in Reinforcement Learning. For this course, I have some notes on MDPs and some of their mathematical properties, along with two sets of homework problems on them. Additionally, there is a repository containing solutions to some problems from Berkeley’s course.
  2. Software Verification and Analysis. This was my first course that intersected with formal verification. The main goal of the course was to understand the theoretical tools needed to verify programs, including tools from Hoare Logic and Abstract Interpretation. We also played with CBMC, a bounded model checker, and Microsoft Research's Z3 solver. For this course, I have a set of main notes, and a repository of two homework problem sets.
  3. Linear Programming and Combinatorial Optimization. As the name suggests, in this course we studied different aspects of Linear Programming, algorithms to solve LPs and their application to various problems in combinatorial optimization. For this course, I maintained a set of notes.
  4. Online Convex Optimization. This course revolved around the Online Convex Optimization (OCO) framework. In this setting, the goal is to devise an online learning algorithm which receives a sequence of arbitrary convex functions over a convex body and must predict a strategy at each time step. The goal is to minimize the so-called regret of the algorithm over a time horizon (the term comes from game theory). I have a set of main notes as well as three homework problem documents given below. Additionally, there is a repository which contains an implementation of the vanilla GD and SGD algorithms and compares their convergence rates.
  5. Foundations of Machine Learning. This course was an introduction to the foundations of Statistical Learning Theory. The main ideas of PAC learning were introduced, leading all the way to the Fundamental Theorem of Statistical Learning. In the second part of the course, many practical aspects of ML were discussed, including some ML algorithms and model selection and validation techniques. For this course, I have a document containing some notes, along with three homework problem documents.
  6. Randomized and Sublinear Algorithms. The first half of this course was about randomization in algorithms, i.e., the techniques of designing and analyzing randomized algorithms. The second half revolved around sublinear algorithms in the streaming model. Towards the end of the course, techniques for proving lower bounds on sublinear algorithms were discussed. For this course, I have a document containing the notes.
  7. Programming Language Concepts. This was a course on various programming concepts, including object-oriented programming and the lambda calculus. I have a set of main notes, along with two homeworks on the lambda calculus.
  8. Complexity Theory 1. This is a rigorous introduction to complexity classes and other topics in complexity theory. I have a document containing the main notes, along with some nice problems.
  9. Theory of Computation. The course revolved around three models of computation: finite automata, context-free grammars, and Turing Machines, along with the properties of the sets they describe. I maintained a document of the main notes, along with a bunch of problem sets.
  10. Discrete Mathematics. This was a first course in discrete mathematics, and revolved around topics in combinatorics and graph theory. I made a (not so good looking) document containing solutions to some nice problems, and some other random topics. Unfortunately, the document does not contain problem statements.
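The MDP algorithms mentioned in the Reinforcement Learning entry above can be illustrated with a minimal value-iteration sketch. The two-state MDP, rewards, and discount factor below are invented for illustration and do not come from the course notes.

```python
# Value iteration on an invented two-state MDP.
# P[s][a] is a list of (probability, next_state, reward) triples.
P = {
    0: {"stay": [(1.0, 0, 0.0)], "go": [(1.0, 1, 1.0)]},
    1: {"stay": [(1.0, 1, 2.0)], "go": [(1.0, 0, 0.0)]},
}
gamma = 0.9  # discount factor

def value_iteration(P, gamma, tol=1e-8):
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            # Bellman optimality update: best one-step lookahead value.
            best = max(
                sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
                for a in P[s]
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

V = value_iteration(P, gamma)
# Staying in state 1 forever yields V(1) = 2 / (1 - 0.9) = 20,
# and the best play from state 0 gives V(0) = 1 + 0.9 * V(1) = 19.
```

Since the Bellman operator is a gamma-contraction, the sweep converges to the unique optimal value function.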
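The regret-minimization setup described in the Online Convex Optimization entry can be sketched with vanilla online gradient descent. The quadratic loss sequence below is an invented example, not one from the course.

```python
import math

# Online gradient descent over the interval [-1, 1] with losses
# f_t(x) = (x - z_t)^2; the sequence z_t is invented for illustration.
z = [0.5, -0.2, 0.9, 0.1, 0.4] * 20      # T = 100 rounds

x, T = 0.0, len(z)
total_loss = 0.0
for t, zt in enumerate(z, start=1):
    total_loss += (x - zt) ** 2          # suffer loss f_t(x_t)
    grad = 2 * (x - zt)                  # gradient of f_t at x_t
    x -= grad / math.sqrt(t)             # step size eta_t = 1 / sqrt(t)
    x = max(-1.0, min(1.0, x))           # project back onto [-1, 1]

# Regret: our cumulative loss minus that of the best fixed point in
# hindsight (for these losses, the mean of the z_t).
best = sum(z) / T
regret = total_loss - sum((best - zt) ** 2 for zt in z)
```

With this decaying step size, online gradient descent guarantees regret growing only as O(sqrt(T)), so the average regret per round vanishes.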
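As a taste of the streaming model from the Randomized and Sublinear Algorithms entry, here is a sketch of one-pass reservoir sampling, a standard streaming primitive (a generic illustration, not an algorithm taken from these notes).

```python
import random

# Keep a uniform random sample of k items from a stream using O(k)
# memory, without knowing the stream's length in advance.
def reservoir_sample(stream, k, rng=random):
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)     # uniform over {0, ..., i}
            if j < k:
                sample[j] = item         # keep item with prob. k/(i+1)
    return sample

sample = reservoir_sample(range(10**5), k=10)
```

A short induction shows every stream element ends up in the reservoir with probability exactly k/n.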
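The finite-automaton model from the Theory of Computation entry fits in a few lines; the DFA below, for binary strings with an even number of 1s, is a standard textbook example rather than one from my notes.

```python
# A DFA for the language of binary strings with an even number of 1s:
# two states, with "even" both the start and the only accepting state.
DELTA = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def dfa_accepts(s):
    state = "even"
    for ch in s:
        state = DELTA[(state, ch)]      # follow the transition function
    return state == "even"
```

The table DELTA is exactly the transition function of the formal definition, which is why simulating a DFA takes one dictionary lookup per input symbol.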

Uncategorized

  1. Differential Equations. This was an advanced course on the theory of Ordinary Differential Equations. I have maintained a set of main notes along with some problems.
  2. Topology. The course revolved around various aspects of general topology and algebraic topology. There is a document containing the main notes and some good exercises.

Talks/Reading Projects

  1. Conservative Online Convex Optimization. As part of the OCO course, I learned about an extension of the OCO framework which enforces a conservativeness constraint on the online learner. The reading was presented twice during the semester. Check out the original paper. Below I've included two documents containing some hand-written notes roughly explaining the ideas, as well as a presentation for the talk.
  2. Adaptive Moment Estimation. This is a talk on Adam, which stands for Adaptive Moment Estimation. Adam is an optimization algorithm in machine learning that builds on the well-known stochastic gradient descent method. Adam is readily available in most ML packages, and in practice often beats counterparts like AdaGrad or Nesterov's momentum. Here I've included two presentations for the talk, as well as a report containing rough ideas about how Adam works, along with an experiment.
  3. Graph Problems in the Semi-streaming Model. In this reading I explored the semi-streaming data model for graph problems. This model relaxes the storage constraints, which makes it a rich setting for graph-theoretic problems (like matchings, computing diameters, etc.). Check out the original paper. I have a document containing hand-written notes roughly explaining the ideas in the paper, and a final presentation for the talk. The notes contain a discussion of the unweighted bipartite matching problem in the semi-streaming model.
  4. Randomization (for high school students). This is a talk I gave during PROMYS 2021. The talk discusses some useful ideas in randomization. The main problems discussed are the polynomial identity testing problem, its application to bipartite matching, and a problem on verifying matrix multiplication. There is a presentation and a write-up of the talk.
  5. Excess scaling algorithm for the max-flow problem. As part of a course on matching and flow algorithms, I presented Ahuja and Orlin's paper A fast and simple algorithm for the maximum flow problem. The algorithm can be seen as a refined version of the usual PUSH-RELABEL paradigm. I made a presentation and a report for this.
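The update rule behind the Adam talk can be sketched in one dimension. The test function f(x) = x^2, the learning rate, and the step count below are invented for illustration; the remaining hyperparameter defaults are the ones commonly quoted for Adam.

```python
import math

# A one-dimensional sketch of the Adam update rule, minimizing
# f(x) = x^2 (so grad(x) = 2x).
def adam_minimize(grad, x, steps, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment estimate
        m_hat = m / (1 - b1 ** t)        # bias corrections for the
        v_hat = v / (1 - b2 ** t)        # zero-initialized moments
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

x_min = adam_minimize(lambda x: 2 * x, x=5.0, steps=500)
```

The division by sqrt(v_hat) is what gives Adam its per-coordinate adaptive step sizes, the feature it shares with AdaGrad.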
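The unweighted matching result in the semi-streaming entry is built around a simple one-pass greedy algorithm, sketched here on an invented edge stream: keep an arriving edge iff both endpoints are still unmatched, which uses O(n) space and yields a maximal matching, a 1/2-approximation of the maximum matching.

```python
# One-pass greedy matching in the semi-streaming model.
def greedy_matching(edge_stream):
    matched = set()          # vertices already covered by the matching
    matching = []
    for u, v in edge_stream:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

# Example: on the path 1-2-3-4-5, greedy keeps (1, 2) and (3, 4).
M = greedy_matching([(1, 2), (2, 3), (3, 4), (4, 5)])
```

The 1/2-approximation follows because every edge of a maximum matching shares an endpoint with some edge the greedy algorithm kept.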
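The matrix multiplication verification problem from the randomization talk is classically solved by Freivalds' algorithm, sketched below on invented 2x2 matrices: instead of recomputing A·B, multiply both sides by a random 0/1 vector, so each trial takes O(n^2) time and catches a wrong product with probability at least 1/2.

```python
import random

def freivalds(A, B, C, trials=20, rng=None):
    """Probabilistically check whether A @ B == C."""
    rng = rng or random.Random()
    n = len(A)

    def matvec(M, x):
        return [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]

    for _ in range(trials):
        r = [rng.randrange(2) for _ in range(n)]
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False                 # definitely not the product
    return True                          # correct with high probability

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]            # the true product A @ B
C_bad = [[19, 22], [43, 51]]             # off by one in a single entry
```

With 20 independent trials, a wrong product slips through with probability at most 2^-20.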
