PPoPP 2021
Sat 27 February - Wed 3 March 2021

Performance models are well-known instruments for understanding the scaling behavior of parallel applications. They express how performance changes as key execution parameters, such as the number of processes or the size of the input problem, vary. Besides supporting reasoning about program behavior, such models can also be derived automatically from performance data, an approach known as empirical performance modeling. While this sounds simple at first glance, the approach faces several serious, interrelated challenges: expensive performance measurements, inaccuracies caused by noisy benchmark data, and overall complex experiment design, starting with the selection of the right parameters. The more parameters one considers, the more experiments are needed and the stronger the impact of noise. In this paper, we show how taint analysis, a technique borrowed from the domain of computer security, can substantially improve the modeling process, lowering its cost, improving model quality, and helping validate performance models and experimental setups.
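The core idea of empirical performance modeling can be sketched in a few lines: measure runtimes at several parameter values, fit a small set of candidate scaling terms by least squares, and keep the best-fitting one. The sketch below uses hypothetical measurement data and a deliberately tiny candidate set (inspired by the performance model normal form used by tools such as Extra-P); a real tool handles multiple parameters, noise, and model selection far more carefully.

```python
import numpy as np

# Hypothetical measurements: runtime (s) at increasing process counts p.
# In practice these come from repeated, noisy benchmark runs.
p = np.array([2, 4, 8, 16, 32, 64], dtype=float)
t = np.array([1.1, 2.3, 4.2, 8.5, 17.0, 33.8])  # illustrative data only

# Candidate single-parameter scaling terms: f(p) = c0 + c1 * term(p).
candidates = {
    "log p":   np.log2(p),
    "p":       p,
    "p log p": p * np.log2(p),
    "p^2":     p ** 2,
}

best = None
for name, term in candidates.items():
    A = np.column_stack([np.ones_like(p), term])  # design matrix [1, term]
    coeffs, *_ = np.linalg.lstsq(A, t, rcond=None)
    rss = np.sum((A @ coeffs - t) ** 2)           # residual sum of squares
    if best is None or rss < best[2]:
        best = (name, coeffs, rss)

name, (c0, c1), rss = best
print(f"best model: t(p) ~ {c0:.3f} + {c1:.3f} * {name}  (RSS={rss:.3g})")
```

With only six data points and four candidate terms this already illustrates the challenge the abstract describes: every additional parameter multiplies the number of measurements required, and noise in `t` can easily make the wrong candidate win.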

Wed 3 Mar

Displayed time zone: Eastern Time (US & Canada)

12:30 - 13:30
Session 10. Machine Learning and Software Engineering
Main Conference
Chair(s): Albert Cohen Google
12:30
15m
Talk
TurboTransformers: An Efficient GPU Serving System For Transformer Models
Main Conference
Jiarui Fang Tencent, Yang Yu, Chengduo Zhao Tencent, Jie Zhou Tencent
Link to publication
12:45
15m
Talk
Extracting Clean Performance Models from Tainted Programs
Main Conference
Marcin Copik ETH Zurich, Alexandru Calotoiu ETH Zurich, Tobias Grosser University of Edinburgh, Nicolas Wicki ETH Zurich, Felix Wolf TU Darmstadt, Torsten Hoefler ETH Zurich
Link to publication Pre-print
13:00
15m
Talk
Modernizing Parallel Code with Pattern Analysis
Main Conference
Roberto Castañeda Lozano University of Edinburgh, Murray Cole University of Edinburgh, Björn Franke University of Edinburgh
Link to publication
13:15
15m
Talk
DAPPLE: A Pipelined Data Parallel Approach for Training Large Models
Main Conference
Shiqing Fan Alibaba Group, Yi Rong Alibaba Group, Chen Meng Alibaba Group, ZongYan Cao Alibaba Group, Siyu Wang Alibaba Group, Zhen Zheng Alibaba Group, Chuan Wu The University of Hong Kong, Guoping Long Alibaba Group, Jun Yang Alibaba Group, LiXue Xia Alibaba Group, Lansong Diao Alibaba Group, Xiaoyong Liu Alibaba Group, Wei Lin Alibaba Group
Link to publication