Grad Student @ MIT CSAIL
Programming Systems & Machine Learning
I'm interested in using machine learning as a programming abstraction, with the goal of making it easier to write complex programs that are hard or even impossible to write by hand.
Here is a full CV.
Programming with Neural Surrogates of Programs.
Alex Renda, Yi Ding, and Michael Carbin.
DiffTune: Optimizing CPU Simulator Parameters with Learned Differentiable Surrogates.
Alex Renda, Yishen Chen, Charith Mendis, and Michael Carbin.
TIRAMISU: A Polyhedral Compiler for Dense and Sparse Deep Learning.
Riyadh Baghdadi, Abdelkader Nadir Debbagh, Kamel Abdous, Fatima Zohra Benhamida, Alex Renda, Jonathan Elliott Frankle, Michael Carbin, and Saman Amarasinghe.
Workshop on Systems for ML, NeurIPS, 2019.
Comparing Rewinding and Fine-tuning in Neural Network Pruning.
Alex Renda, Jonathan Frankle, and Michael Carbin.
Oral presentation (<2% of submitted papers).
BHive: A Benchmark Suite and Measurement Framework for Validating x86-64 Basic Block Performance Models.
Yishen Chen, Ajay Brahmakshatriya, Charith Mendis, Alex Renda, Eric Atkinson, Ondřej Sýkora, Saman Amarasinghe, and Michael Carbin.
Ithemal: Accurate, Portable and Fast Basic Block Throughput Estimation using Deep Neural Networks.
Charith Mendis, Alex Renda, Saman Amarasinghe, and Michael Carbin.
Best Paper award at the ML for Systems workshop at ISCA 2019.
Programming Language Support for Natural Language Interaction.
Alex Renda, Harrison Goldstein, Sarah Bird, Chris Quirk, and Adrian Sampson.
- NSF GRFP Honorable Mention, 2020
- Best Paper award for Ithemal at the ML for Systems workshop at ISCA 2019
- MIT Great Educators Fellowship, 2018-2019
- Cornell University: Summa Cum Laude with Honors, 2018
- PLDI 2023 — Social Events Co-Chair
- ICLR 2023 — Reviewer
- OOPSLA 2022 — Artifact Evaluator / External Review Committee
- ECOOP 2022 — Artifact Evaluator / External Review Committee
- ICLR 2022 — Reviewer
- POPL 2022 — Artifact Evaluator
- OOPSLA 2021 — Artifact Evaluator
- NeurIPS 2021 — Reviewer
- ICML 2021 — Reviewer
- ASPLOS 2021 — Artifact Evaluator
- ICLR 2021 — Reviewer (Outstanding Reviewer)
- AAAI 2021 — Emergency Reviewer
- NeurIPS 2020 — Reviewer
- ICML 2020 — Reviewer (Top 33% Reviewer)
- PLSE Seminar Co-Coordinator — Spring 2021 - present
- PLSE Coffee Chat Co-Coordinator — Fall 2020 - present
- PLSE Lunch Co-Coordinator — Fall 2019 - Spring 2020, Fall 2021 - present
- EECS GAAP Mentor — Fall 2020, Fall 2021
- Fast ML Reading Group Coordinator — Fall 2019 - Spring 2020
January 2022 — NEC Labs Europe — Programming with Neural Surrogates of Programs
July 2021 — OctoML — DiffTune: Optimizing CPU Simulator Parameters with Learned Differentiable Surrogates
March 2021 — MIT PLSE Seminar — Learned x86 Cost Models: Steps Towards a Learned Compiler Backend
November 2020 — Facebook AI Compiler Group — Learned x86 Cost Models: Steps Towards a Learned Compiler Backend
- CS 4120 - Introduction to Compilers. Teaching Assistant. Cornell University. Spring 2018.
- CS 2112 - Object-Oriented Programming and Data Structures - Honors. Consultant. Cornell University. Fall 2015, Fall 2016.
- Ph.D. student in EECS, MIT CSAIL. 2018-present.
Working on learning-based systems and efficient neural networks.
Advised by Michael Carbin.
- S.M. in Electrical Engineering and Computer Science, MIT. 2020.
Thesis: Comparing Rewinding and Fine-tuning in Neural Network Pruning.
Worked on efficient neural networks.
Advised by Michael Carbin.
- B.S. (Summa Cum Laude) in Computer Science with Honors, with a minor in Linguistics, Cornell University. 2018.
Worked on programming abstractions for natural language and intelligent systems as an undergraduate member of the Capra group.
Advised by Adrian Sampson.
- Summer 2020: MLSys Research Intern at OctoML
- Summer 2018: Software Engineering Intern at Two Sigma
- Summer 2017: Software Engineering Intern at Two Sigma
- Summer 2016: Software Engineering Intern at Facebook
- Summer 2014: System Validation Intern at Tesla