I'm interested in using machine learning as an abstraction for writing complex programs, with the goal of making it easier to build programs that are hard or even impossible to write by hand.
Turaco: Complexity-Guided Data Sampling for Training Neural Surrogates of Programs.
Alex Renda, Yi Ding, and Michael Carbin.
OOPSLA, 2023.
Paper.
Bibtex.
Code.
DiffTune: Optimizing CPU Simulator Parameters with Learned Differentiable Surrogates.
Alex Renda, Yishen Chen, Charith Mendis, and Michael Carbin.
MICRO, 2020.
Paper.
Bibtex.
Code.
Presentation.
TIRAMISU: A Polyhedral Compiler for Dense and Sparse Deep Learning.
Riyadh Baghdadi, Abdelkader Nadir Debbagh, Kamel Abdous, Fatima Zohra Benhamida, Alex Renda, Jonathan Elliott Frankle, Michael Carbin, and Saman Amarasinghe.
Workshop on Systems for ML, NeurIPS, 2019.
Paper.
Bibtex.
Comparing Rewinding and Fine-tuning in Neural Network Pruning.
Alex Renda, Jonathan Frankle, and Michael Carbin.
ICLR, 2020.
Paper.
Bibtex.
Code.
Presentation.
Oral presentation (<2% of submitted papers).
BHive: A Benchmark Suite and Measurement Framework for Validating x86-64 Basic Block Performance Models.
Yishen Chen, Ajay Brahmakshatriya, Charith Mendis, Alex Renda, Eric Atkinson, Ondřej Sýkora, Saman Amarasinghe, and Michael Carbin.
IISWC, 2019.
Paper.
Bibtex.
Code.
Ithemal: Accurate, Portable and Fast Basic Block Throughput Estimation using Deep Neural Networks.
Charith Mendis, Alex Renda, Saman Amarasinghe, and Michael Carbin.
ICML, 2019.
Paper.
Bibtex.
Code.
Best Paper award at the ML for Systems workshop at ISCA 2019.
Programming Language Support for Natural Language Interaction.
Alex Renda, Harrison Goldstein, Sarah Bird, Chris Quirk, and Adrian Sampson.
SysML, 2018.
Paper.
Extended Draft.
Bibtex.
Code.
Drafts
CoMEt: x86 Cost Model Explanation Framework.
Isha Chaudhary, Alex Renda, Charith Mendis, and Gagandeep Singh.
In Submission, 2023.
Paper.
Renamer: A Transformer Architecture Invariant to Variable Renaming.
Zachary Ankner, Alex Renda, and Michael Carbin.
In Submission, 2022.
A Study of Equivalence-Preserving Program Embeddings.
Logan Weber, Jesse Michel, Alex Renda, Saman Amarasinghe, and Michael Carbin.
In Submission, 2022.
Cello: Efficient Computer Systems Optimization with Predictive Early Termination and Censored Regression.
Yi Ding, Alex Renda, Ahsan Pervaiz, Michael Carbin, and Henry Hoffmann.
In Preparation, 2022.
Paper.
Honors
NSF GRFP Honorable Mention, 2020
Best Paper award for Ithemal at the ML for Systems workshop at ISCA 2019
MIT Great Educators Fellowship, 2018-2019
Cornell University: Summa Cum Laude with Honors, 2018
Education
B.S. (Summa Cum Laude) in Computer Science with Honors, with a minor in Linguistics, Cornell University, 2018.
Worked on programming abstractions for natural language and intelligent systems as an undergraduate member of the Capra group, advised by Adrian Sampson.
Industry Experience
Spring 2021: Consultant at ReadySet
Summer 2020: MLSys Research Intern at OctoML
Summer 2018: Software Engineering Intern at Two Sigma
Summer 2017: Software Engineering Intern at Two Sigma
Summer 2016: Software Engineering Intern at Facebook