Sara Babakniya
About
I am currently working as a Research Scientist at Google in Kirkland.
I obtained my Ph.D. in Computer Science from the University of Southern California, where I was a member of the Information Theory and Machine Learning (vITAL) research lab under the supervision of Prof. Salman Avestimehr. I am broadly interested in challenges in machine learning, such as privacy and efficiency, and my experience has primarily been in exploring these challenges in Federated Learning and Natural Language Processing.
I received my B.Sc. in Electrical Engineering from Sharif University of Technology in 2019, during which I gained experience in designing and implementing networked systems.
Experience
Research Scientist (Sep 2025 - Present)
Google Research, Kirkland, WA

Student Researcher (Aug 2024 - March 2025)
Google Research, New York, NY
Mentors: Kareem Amin, Umar Syed

SWE Intern (Feb 2024 - May 2024)
Google, Sunnyvale, CA
Mentors: Jeremy Fisher, Sam Aldrin

Graduate Research Assistant (Aug 2019 - Aug 2025)
University of Southern California, Los Angeles, CA
Publications
Escaping Collapse: The Strength of Weak Data for Large Language Model Training
Kareem Amin, Sara Babakniya, Alex Bie, Weiwei Kong, Umar Syed, Sergei Vassilvitskii
[Authors are ordered alphabetically]
NeurIPS 2025, SSI-FM@ICLR 2025
[Paper]

A Data-Free Approach to Mitigate Catastrophic Forgetting in Federated Class Incremental Learning for Vision Tasks
Sara Babakniya, Zalan Fabian, Chaoyang He, Mahdi Soltanolkotabi, Salman Avestimehr
NeurIPS 2023
[Paper], [Code]

Revisiting Sparsity Hunting in Federated Learning: Why does Sparsity Consensus Matter?
Sara Babakniya*, Souvik Kundu*, Saurav Prakash, Yue Niu, Salman Avestimehr
Transactions on Machine Learning Research 2023
[Paper], [Code]

SLoRA: Federated Parameter Efficient Fine-Tuning of Language Models
Sara Babakniya*, Ahmed Roushdy Elkordy*, Yahya H Ezzeldin, Qingfeng Liu, Kee-Bong Song, Mostafa El-Khamy, Salman Avestimehr
FL@FM-NeurIPS 2023
Best Paper Award
[Paper]

AICircuit: A Multi-Level Dataset and Benchmark for AI-Driven Analog Integrated Circuit Design
Asal Mehradfar, Xuzhe Zhao, Yue Niu, Sara Babakniya, Mahdi Alesheikh, Hamidreza Aghasi, Salman Avestimehr
ML4S-NeurIPS 2024
Reproducibility Award
[Paper]

Don’t Memorize; Mimic The Past: Federated Class Incremental Learning Without Episodic Memory
Sara Babakniya, Zalan Fabian, Chaoyang He, Mahdi Soltanolkotabi, Salman Avestimehr
ICML-FL 2023
[Paper]

Federated Sparse Training: Lottery Aware Model Compression for Resource Constrained Edge
Sara Babakniya*, Souvik Kundu*, Saurav Prakash, Yue Niu, Salman Avestimehr
NeurIPS-FL 2022
[Paper]

Supervised Learning for Analog and RF Circuit Design: Benchmarks and Comparative Insights
Asal Mehradfar, Xuzhe Zhao, Yue Niu, Sara Babakniya, Mahdi Alesheikh, Hamidreza Aghasi, Salman Avestimehr
Preprint
[Paper]

Defending Against Poisoning Backdoor Attacks on Federated Meta-Learning
Chien-Lun Chen, Sara Babakniya, Marco Paolieri, Leana Golubchik
ACM Transactions on Intelligent Systems and Technology, 2022
[Paper]

Deep-n-Cheap: An Automated, Efficient and Extensible Search Framework for Cost-Effective Deep Learning
Sourya Dey, Sara Babakniya, Saikrishna C. Kanala, Marco Paolieri, Leana Golubchik, Peter A. Beerel, Keith M. Chugg
Springer Nature Computer Science, 2021
[Paper], [Code]
Honors and Awards
- WiSE Travel Grant, 2024
- Best Paper Award, FL@FM-NeurIPS Workshop, 2023
- Outstanding Poster Presentation, USC MHI Research Festival, 2023
- Best Poster Presentation, USC-Meta Center Workshop, 2022
- Grace Hopper Celebration Travel Grant, USC 2022
- Grad Cohort Travel Grant, CRA-W 2022
- WiSE Qualcomm Top-Off Fellowship 2021
- Grace Hopper Celebration Scholarship 2021
Contact
Email: babakniy[at]usc[dot]edu
Last updated: 12/02/2025
