Proceedings 2019: Part 1, Part 2
Proceedings 2020: Part 1, Part 2
Proceedings 2021: Part 1, Part 2
Proceedings 2023: Part 1, Part 2
The course project can take the form of a literature review, original research, or a literature review that develops into original research (there is flexibility here).
- Literature review. This is an in-depth review and analysis of a paper, selected either from a list provided by the instructor or proposed by you with the instructor's approval. The review should provide an in-depth summary, exposition, and discussion of the paper (which will often require reading other related papers on the subject).
- Original research. You are strongly encouraged to combine your current research with the course project; otherwise, the instructor will provide some ideas to pursue. The project can be either theoretical or experimental.
Milestones
- Pick a project as soon as possible. Deadline: February 17th (Friday).
- Submit a one-page description of the project: what it is about, your assessment of it, what needs to be done (including related papers to read), and whether you have ideas for improving the approaches involved. Explain why the topic is important or interesting, and include appropriate references. If it is original research, provide a plan for the next steps and for what must be done by the end of the semester to finish the project. Deadline: February 21st (Friday).
- We will probably have in-class presentations toward the end of the semester. These will be spotlight talks (~5-10 minutes). Prepare an oral presentation with slides; focus on high-level ideas and leave most technical details to your report.
- A written report. A LaTeX template will be provided (most probably in ICML format). The report should be at least six pages, excluding references. Deadline: end of the semester. Note that the project can continue beyond the end of the semester if it merits publication.
Suggested list of projects/papers (to be updated)
Project ideas
- The math and modeling behind AI in materials science
- Optimization in quantum computing: benchmarking algorithms
  - A project about setting up scenarios to test classical and quantum algorithms on classical CS problems
  - Parameter Setting in Quantum Approximate Optimization of Weighted Problems
    - Write-up provided by the instructor upon request
- Learning rate schedules and variational quantum computing algorithms: is there a connection?
  - Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient
  - Stochastic Polyak step-size for SGD: An adaptive learning rate for fast convergence (a minimal sketch of this step size appears after this list)
  - Learning-Rate-Free Learning by D-Adaptation
  - Automatic Gradient Descent: Deep Learning without Hyperparameters
  - Adaptive federated learning with auto-tuned clients
- How much pretraining do neural networks need before low-rank approximations can safely "kick in"?
- Homotopy methods, graduated optimization, and quantum annealing: find connections between them (a toy graduated-optimization sketch appears after this list)
- Advances in asynchronous distributed SGD and how to improve them
- Continual learning methods and how to improve them
- Agentic AI: new ideas and how to improve them
- One Size Fits All for Semantic Shifts: Adaptive Prompt Tuning for Continual Learning
- Compress, Then Prompt: Improving Accuracy-Efficiency Trade-off of LLM Inference with Transferable Prompt
- FedJETs: Efficient Just-In-Time Personalization with Federated Mixture of Experts
- Sweeping Heterogeneity with Smart MoPs: Mixture of Prompts for LLM Task Adaptation
- Recent advances in acceleration methods: extensions
- Review of efficient distributed protocols: independent subnetwork training (IST), plus new ideas on IST (a minimal IST simulation appears after this list)
  - Distributed learning of fully connected neural networks using independent subnet training
  - GIST: Distributed training for large-scale graph convolutional networks
  - ResIST: Layer-wise decomposition of ResNets for distributed training
  - Efficient and Light-Weight Federated Learning via Asynchronous Distributed Dropout
  - Federated Learning Over Images: Vertical Decompositions and Pre-Trained Backbones Are Difficult to Beat
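Several of the adaptive step-size papers above revolve around one compact idea. As a warm-up for that topic, here is a minimal sketch of the stochastic Polyak step size on a toy least-squares problem (the function name polyak_sgd, the problem, and all constants are my illustrative choices; the papers' full methods add safeguards this sketch omits):

```python
import numpy as np

def polyak_sgd(A, b, steps=500, gamma_max=1.0, seed=0):
    """SGD with the stochastic Polyak step size on least squares,
    f_i(x) = 0.5 * (a_i @ x - b_i)**2, assuming interpolation (f_i* = 0):
        gamma_t = (f_i(x_t) - f_i*) / ||grad f_i(x_t)||^2, capped at gamma_max.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)              # sample one data point
        resid = A[i] @ x - b[i]
        loss = 0.5 * resid ** 2          # f_i(x_t), with f_i* = 0
        grad = resid * A[i]              # gradient of f_i at x_t
        g2 = grad @ grad
        if g2 > 1e-12:                   # skip near-stationary samples
            x -= min(loss / g2, gamma_max) * grad
    return x

# Toy consistent system, so the interpolation assumption f_i* = 0 holds.
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 10))
x_true = rng.normal(size=10)
b = A @ x_true
print("recovery error:", np.linalg.norm(polyak_sgd(A, b) - x_true))
```

Note the step size needs no tuning beyond the cap: it is computed from the current stochastic loss and gradient, which is exactly what makes this family of methods "adaptive".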
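For the homotopy/graduated-optimization item above, the core loop is small enough to prototype directly: descend on a Gaussian-smoothed surrogate of a nonconvex objective, then shrink the smoothing while warm-starting each stage. A toy sketch under those assumptions (the objective, the sigma schedule, and the Monte-Carlo gradient estimator are illustrative choices, not taken from any specific paper):

```python
import numpy as np

def f(x):
    """Nonconvex toy objective: a quadratic bowl plus fast oscillations."""
    return x ** 2 + 2.0 * np.sin(8.0 * x)

def smoothed_grad(fun, x, sigma, n_samples=200, rng=None):
    """Monte-Carlo gradient of the Gaussian smoothing
    f_sigma(x) = E[fun(x + sigma * u)], u ~ N(0, 1), via the identity
    grad f_sigma(x) = E[(fun(x + sigma * u) - fun(x)) * u] / sigma."""
    rng = rng if rng is not None else np.random.default_rng(0)
    u = rng.normal(size=n_samples)
    return np.mean((fun(x + sigma * u) - fun(x)) * u) / sigma

def graduated_descent(fun, x0, sigmas=(2.0, 1.0, 0.5, 0.1, 0.01),
                      lr=0.01, steps_per_stage=300, seed=0):
    """Homotopy loop: gradient descent on f_sigma, warm-starting
    each stage as sigma shrinks toward the original objective."""
    rng = np.random.default_rng(seed)
    x = x0
    for sigma in sigmas:
        for _ in range(steps_per_stage):
            x -= lr * smoothed_grad(fun, x, sigma, rng=rng)
    return x

x_star = graduated_descent(f, x0=3.0)
print(f"found x = {x_star:.3f}, f(x) = {f(x_star):.3f}")
```

At large sigma the oscillations average out and the surrogate is nearly convex; the anneal then tracks its minimizer into (ideally) the global basin of the true objective. The connection question in the project is whether quantum annealing can be read as the same continuation strategy in a different computational model.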
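For the IST item above, the mechanism behind the listed papers can be simulated in a single process: partition the hidden neurons across workers, train each resulting subnetwork independently on a data shard, reassemble, and re-partition. A minimal sketch of one such loop for a one-hidden-layer regression network (the dropout-style rescaling, local-step counts, and data are my simplifications, not the papers' exact protocol):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, n_workers = 8, 32, 4
W1 = rng.normal(scale=0.5, size=(d_hidden, d_in))   # input -> hidden weights
w2 = rng.normal(scale=0.5, size=d_hidden)           # hidden -> scalar output

def local_sgd(W1_sub, w2_sub, X, y, scale, lr=0.01, steps=20):
    """Plain SGD on a subnetwork (a disjoint subset of hidden units).
    `scale` is a dropout-style rescaling so the reassembled network
    (the sum of subnet outputs) approximates the target."""
    for _ in range(steps):
        h = np.maximum(X @ W1_sub.T, 0.0)            # ReLU activations
        err = scale * (h @ w2_sub) - y               # residual for 0.5*MSE
        w2_grad = scale * h.T @ err / len(y)
        h_grad = scale * np.outer(err, w2_sub) * (h > 0)
        W1_sub -= lr * (h_grad.T @ X) / len(y)
        w2_sub -= lr * w2_grad
    return W1_sub, w2_sub

X = rng.normal(size=(256, d_in))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=256)     # toy regression target

for rnd in range(10):                                # IST rounds
    parts = np.array_split(rng.permutation(d_hidden), n_workers)
    shards = np.array_split(rng.permutation(256), n_workers)
    for part, shard in zip(parts, shards):           # "workers" run independently
        W1[part], w2[part] = local_sgd(W1[part].copy(), w2[part].copy(),
                                       X[shard], y[shard], scale=n_workers)
    full_pred = np.maximum(X @ W1.T, 0.0) @ w2       # reassembled network
    print(f"round {rnd}: mse = {np.mean((full_pred - y) ** 2):.4f}")
```

Because the hidden units are partitioned disjointly, workers exchange no gradients during local training; the only communication in a real deployment is the periodic reassembly and re-partition, which is what makes IST communication-efficient.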