About Me

I'm a fifth-year computer science graduate student at Stanford University, advised by Matei Zaharia and Peter Bailis. My research is in systems: I work on building new abstractions for distributed systems. Some projects I have worked on include DBOS (VLDB 2022), POP (SOSP 2021), Willump (MLSys 2020), MacroBase (Best of VLDB 2019), and Arachne (OSDI 2018). I did my undergrad at Harvard, where I worked with Professor Margo Seltzer on my senior thesis and with Professor Alexander Rush on predicting congressional voting records from bill text (EMNLP 2016).

Contact Details

Peter Kraft
kraftp@cs.stanford.edu

Publications

DBOS: A DBMS-Oriented Operating System.
Athinagoras Skiadopoulos, Qian Li, Peter Kraft, Kostis Kaffes, Daniel Hong, Shana Matthew, David Bestor, Michael Cafarella, Vijay Gadepally, Goetz Graefe, Jeremy Kepner, Christos Kozyrakis, Tim Kraska, Michael Stonebraker, Lalith Suresh, Matei Zaharia.
International Conference on Very Large Data Bases (VLDB) 2022.

Solving Large-Scale Granular Resource Allocation Problems Efficiently with POP.
Deepak Narayanan, Fiodar Kazhamiaka, Firas Abuzaid, Peter Kraft, Akshay Agrawal, Srikanth Kandula, Stephen Boyd, Matei Zaharia.
Symposium on Operating Systems Principles (SOSP) 2021.

Data Governance in a Database Operating System (DBOS).
Deeptaanshu Kumar, Qian Li, Jason Li, Peter Kraft, Athinagoras Skiadopoulos, Lalith Suresh, Michael Cafarella, Michael Stonebraker.
VLDB Workshop on Polystore Systems (Poly) 2021.

DIFF: A Relational Interface for Large-Scale Data Explanation.
Firas Abuzaid, Peter Kraft, Sahaana Suri, Edward Gan, Eric Xu, Atul Shenoy, Asvin Ananthanarayan, John Sheu, Erik Meijer, Xi Wu, Jeff Naughton, Peter Bailis, Matei Zaharia.
The VLDB Journal, 2021. "Best of VLDB 2019" Special Issue.

A Demonstration of Willump: A Statistically-Aware End-to-end Optimizer for Machine Learning Inference.
Peter Kraft, Daniel Kang, Deepak Narayanan, Shoumik Palkar, Peter Bailis, Matei Zaharia.
International Conference on Very Large Data Bases (VLDB) 2020.

Willump: A Statistically-Aware End-to-end Optimizer for Machine Learning Inference.
Peter Kraft, Daniel Kang, Deepak Narayanan, Shoumik Palkar, Peter Bailis, Matei Zaharia.
Conference on Machine Learning and Systems (MLSys) 2020.

DIFF: A Relational Interface for Large-Scale Data Explanation.
Firas Abuzaid, Peter Kraft, Sahaana Suri, Edward Gan, Eric Xu, Atul Shenoy, Asvin Ananthanarayan, John Sheu, Erik Meijer, Xi Wu, Jeff Naughton, Peter Bailis, Matei Zaharia.
International Conference on Very Large Data Bases (VLDB) 2019.

Arachne: Core-Aware Thread Management.
Henry Qin, Qian Li, Jacqueline Speiser, Peter Kraft, John Ousterhout.
Symposium on Operating Systems Design and Implementation (OSDI) 2018.

Improving Supreme Court Forecasting Using Boosted Decision Trees.
Aaron Kaufman, Peter Kraft, Maya Sen.
Political Analysis, 2019.

Automatically Scalable Computation That Is More Scalable and Automatic.
Peter Kraft.
Harvard University Senior Thesis, 2017.

An Embedding Model For Predicting Roll-Call Votes.
Peter Kraft, Hirsh Jain, Alexander Rush.
Conference on Empirical Methods in Natural Language Processing (EMNLP) 2016.