PPoPP 2021
Sat 27 February - Wed 3 March 2021

Modernizing Parallel Code with Pattern Analysis

Fifty years of parallel programming has generated a substantial legacy parallel codebase, creating a new portability challenge: re-parallelizing already parallel code. Our solution exploits inherently portable parallel patterns, and addresses the challenge of identifying patternization opportunities in legacy parallel code via constraint matching on traced dynamic dataflow graphs. Notably, this makes the analysis source-independent and equally applicable to sequential and parallel legacy code. We identify various map and reduction patterns, including compositions, in Pthreads code. Experiments with the Starbench suite show that our analysis is effective (finding 86% of the patterns known in the literature), accurate (reporting actual patterns in 98% of the cases), and efficient (scaling linearly with the size of the execution traces). We re-express the found patterns via a parallel pattern library, making code freely portable across CPU/GPU systems and performing competitively with hand-tuned implementations at zero additional effort.
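
To make the kind of transformation described above concrete, here is a minimal illustrative sketch (hypothetical code, not the authors' analysis tool or their pattern library): a legacy Pthreads loop in which each thread applies the same pure function to a disjoint chunk of an array. In a traced dynamic dataflow graph, the per-element independence of this computation is what identifies it as a map pattern; once recognized, the loop can be re-expressed against any portable map implementation. C++17's std::transform with std::execution::par stands in here for the pattern library used in the paper.

// Hypothetical sketch: a legacy Pthreads "map" and a portable pattern-based
// re-expression. Illustrative only; not the paper's tool or pattern library.
#include <pthread.h>
#include <algorithm>
#include <execution>
#include <cstddef>

constexpr std::size_t N = 1 << 20;
constexpr int NTHREADS = 4;

static float in[N], out[N];

struct Range { std::size_t begin, end; };

// Legacy style: each thread applies the same pure function to its own chunk.
// The element-wise independence of this computation is what marks the loop
// as a map in the traced dynamic dataflow graph.
void* worker(void* arg) {
    auto [b, e] = *static_cast<Range*>(arg);
    for (std::size_t i = b; i < e; ++i)
        out[i] = in[i] * in[i] + 1.0f;   // pure, per-element computation
    return nullptr;
}

void legacy_pthreads_map() {
    pthread_t tid[NTHREADS];
    Range ranges[NTHREADS];
    for (int t = 0; t < NTHREADS; ++t) {
        ranges[t] = { t * N / NTHREADS, (t + 1) * N / NTHREADS };
        pthread_create(&tid[t], nullptr, worker, &ranges[t]);
    }
    for (int t = 0; t < NTHREADS; ++t)
        pthread_join(tid[t], nullptr);
}

// Pattern-based re-expression: once the loop is recognized as a map, it can
// be rewritten against any portable map implementation; a parallel
// std::transform stands in for a dedicated pattern library here.
void patternized_map() {
    std::transform(std::execution::par, in, in + N, out,
                   [](float x) { return x * x + 1.0f; });
}

A reduction found the same way would analogously be re-expressed with a portable reduce pattern (for instance, std::reduce with a parallel execution policy), and compositions of the two cover map-reduce pipelines like those the paper reports in Starbench.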

Wed 3 Mar

Displayed time zone: Eastern Time (US & Canada)

12:30 - 13:30
Session 10: Machine Learning and Software Engineering (Main Conference)
Chair: Albert Cohen (Google)
12:30 (15m) Talk
TurboTransformers: An Efficient GPU Serving System For Transformer Models
Jiarui Fang (Tencent), Yang Yu, Chengduo Zhao (Tencent), Jie Zhou (Tencent)
12:45 (15m) Talk
Extracting Clean Performance Models from Tainted Programs
Marcin Copik (ETH Zurich), Alexandru Calotoiu (ETH Zurich), Tobias Grosser (University of Edinburgh), Nicolas Wicki (ETH Zurich), Felix Wolf (TU Darmstadt), Torsten Hoefler (ETH Zurich)
13:00 (15m) Talk
Modernizing Parallel Code with Pattern Analysis
Roberto Castañeda Lozano (University of Edinburgh), Murray Cole (University of Edinburgh), Björn Franke (University of Edinburgh)
13:15 (15m) Talk
DAPPLE: A Pipelined Data Parallel Approach for Training Large Models
Shiqing Fan (Alibaba Group), Yi Rong (Alibaba Group), Chen Meng (Alibaba Group), ZongYan Cao (Alibaba Group), Siyu Wang (Alibaba Group), Zhen Zheng (Alibaba Group), Chuan Wu (The University of Hong Kong), Guoping Long (Alibaba Group), Jun Yang (Alibaba Group), LiXue Xia (Alibaba Group), Lansong Diao (Alibaba Group), Xiaoyong Liu (Alibaba Group), Wei Lin (Alibaba Group)