CSE 5095-006: Machine Learning for Physical Sciences (Spring 2019)

This course will cover recent advances in machine learning for materials science, chemistry, and physics, and discuss some of the unique opportunities and challenges at the intersection of machine learning and these fields. Topics include feature selection, uncertainty, small and biased data, and other fundamental ideas in applied machine learning. The course will seek to connect students from computer science with students from the physical sciences to build projects bridging their fields.

Lectures: T/Th, 9:30-10:45am, KNS 201

Office Hours: Th, 11:00am-12noon, ITE 259

Syllabus (tentative)

Full course materials are available on HuskyCT.

Date | Materials / Further Reading | Deadlines
Tues, Jan. 22 | Lecture 1: Introduction | -
Thur, Jan. 24 | Lecture 2: Quick Review of ML Algorithms | Class survey due.
Tues, Jan. 29 | ICERM Scientific Machine Learning Workshop live stream | -
Thur, Jan. 31 | Lecture 3: Sample Complexity. Chs. 1-3 of Foundations of Machine Learning by Mohri, Rostamizadeh, and Talwalkar, MIT Press, 2012. | -
Tues, Feb. 5 | Lecture 4: Small Data; Transfer Learning. Yosinski et al., How transferable are features in deep neural networks? NIPS, 2014. | -
Thur, Feb. 7 | Lecture 5: Active Learning. Settles, Active Learning, Morgan & Claypool, 2012; Dasgupta, Two faces of active learning, Theoretical Computer Science, 2011. | -
Tues, Feb. 12 | Lecture 6: Noisy Data. Frénay & Verleysen, Classification in the Presence of Label Noise: A Survey, IEEE TNNLS, 2014; Reed et al., Training Deep Neural Networks on Noisy Labels with Bootstrapping, ICLR Workshop, 2015. | -
Thur, Feb. 14 | Lecture 7: Imbalanced Data. Krawczyk, Learning from imbalanced data: open challenges and future directions, Progress in Artificial Intelligence, 2016; He & Garcia, Learning from Imbalanced Data, IEEE TKDE, 2009. | Quiz 1
Tues, Feb. 19 | Lecture 8: Model Selection & Assessment. Ch. 7 of The Elements of Statistical Learning by Hastie, Tibshirani, and Friedman, Springer, 2009. | Final project proposal due.
Thur, Feb. 21 | Lecture 9: Stability. Shalev-Shwartz et al., Learnability, Stability and Uniform Convergence, JMLR, 2010; Bousquet & Elisseeff, Stability and Generalization, JMLR, 2002; Hardt et al., Train faster, generalize better: Stability of stochastic gradient descent, ICML, 2016. | Quiz 2
Tues, Feb. 26 | Lecture 10: Meta-learning. Andrychowicz et al., Learning to learn by gradient descent by gradient descent, NIPS Deep Learning Symposium, 2016. | -
Thur, Feb. 28 | Lecture 11: Feature Engineering & Selection. Guyon & Elisseeff, An Introduction to Variable and Feature Selection, JMLR, 2003. | Quiz 3
Tues, Mar. 5 | Lecture 12: Autoencoders. Bengio et al., Representation Learning: A Review and New Perspectives, TPAMI, 2013; Bengio, Learning Deep Architectures for AI, Foundations and Trends in Machine Learning, 2009. | -
Thur, Mar. 7 | Lecture 13: Generative Adversarial Networks. Arjovsky & Bottou, Towards Principled Methods for Training Generative Adversarial Networks, ICLR, 2017. | -
Tues, Mar. 12 | Lecture 14: Interpretability. Doshi-Velez & Kim, Considerations for Evaluation and Generalization in Interpretable Machine Learning, in Explainable and Interpretable Models in Computer Vision and Machine Learning, Springer, 2018; Adebayo et al., Sanity Checks for Saliency Maps, NIPS, 2018. | Quiz 4; Final project data due.
Thur, Mar. 14 | Lecture 15: Dimensionality Reduction; Part I Review. Domingos, A Few Useful Things to Know about Machine Learning, CACM, 2012. | Quiz 5
Tues, Mar. 19 | Spring Break | -
Thur, Mar. 21 | Spring Break | -
Tues, Mar. 26 | Paper Review | -
Thur, Mar. 28 | - | -
Tues, Apr. 2 | - | -
Thur, Apr. 4 | - | -
Tues, Apr. 9 | - | Final project algorithm code and abstract due.
Thur, Apr. 11 | - | -
Tues, Apr. 16 | - | -
Thur, Apr. 18 | - | -
Tues, Apr. 23 | - | -
Thur, Apr. 25 | - | -
Tues, Apr. 30 | Final Project Flash Talks | Final project flash talk slides due Apr. 29.
Thur, May 2 | Final Project Flash Talks | -
Thur, May 9 | - | Final project report due.