PPoPP 2021
Sat 27 February - Wed 3 March 2021

Performance models are well-known instruments to understand the scaling behavior of parallel applications. They express how performance changes as key execution parameters, such as the number of processes or the size of the input problem, vary. Besides reasoning about program behavior, such models can also be derived automatically from performance data. This is called empirical performance modeling. While this sounds simple at first glance, the approach faces several serious, interrelated challenges, including expensive performance measurements, inaccuracies caused by noisy benchmark data, and an overall complex experiment design, starting with the selection of the right parameters. The more parameters one considers, the more experiments are needed and the stronger the impact of noise. In this paper, we show how taint analysis, a technique borrowed from the domain of computer security, can substantially improve the modeling process, lowering its cost, improving model quality, and helping to validate performance models and experimental setups.
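To illustrate the idea of empirical performance modeling mentioned in the abstract, the following is a minimal sketch, not the paper's actual tool: synthetic runtime measurements over varying process counts are fitted by least squares to a small set of candidate scaling terms (here 1/p and log2 p, chosen as hypothetical examples). The "measured" data and the model terms are assumptions for illustration only.

```python
import numpy as np

# Synthetic "measurements": runtime of a hypothetical parallel kernel whose
# true scaling is T(p) = 100/p + 2*log2(p) (computation + communication),
# perturbed by small multiplicative noise to mimic benchmark variability.
rng = np.random.default_rng(0)
procs = np.array([2, 4, 8, 16, 32, 64], dtype=float)
true_t = 100.0 / procs + 2.0 * np.log2(procs)
runtime = true_t * (1.0 + 0.02 * rng.standard_normal(procs.size))

# Empirical modeling step: least-squares fit of coefficients for the
# candidate terms, standing in for a search over a model normal form.
terms = np.column_stack([1.0 / procs, np.log2(procs)])
(a, b), *_ = np.linalg.lstsq(terms, runtime, rcond=None)
print(f"T(p) ~ {a:.1f}/p + {b:.2f}*log2(p)")
```

With low noise the recovered coefficients land close to the true values (100 and 2); as the abstract notes, adding more parameters multiplies the required experiments and amplifies the effect of noise on such fits.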

Conference Day
Wed 3 Mar

Displayed time zone: Eastern Time (US & Canada)

12:30 - 13:30
Session 10: Machine Learning and Software Engineering (Main Conference)
Chair(s): Albert Cohen (Google)
12:30
15m
Talk
TurboTransformers: An Efficient GPU Serving System For Transformer Models
Main Conference
Jiarui Fang (Tencent), Yang Yu, Chengduo Zhao (Tencent), Jie Zhou (Tencent)
Link to publication
12:45
15m
Talk
Extracting Clean Performance Models from Tainted Programs
Main Conference
Marcin Copik (ETH Zurich), Alexandru Calotoiu (ETH Zurich), Tobias Grosser (University of Edinburgh), Nicolas Wicki (ETH Zurich), Felix Wolf (TU Darmstadt), Torsten Hoefler (ETH Zurich)
Link to publication · Pre-print
13:00
15m
Talk
Modernizing Parallel Code with Pattern Analysis
Main Conference
Roberto Castañeda Lozano (University of Edinburgh), Murray Cole (University of Edinburgh), Björn Franke (University of Edinburgh)
Link to publication
13:15
15m
Talk
DAPPLE: A Pipelined Data Parallel Approach for Training Large Models
Main Conference
Shiqing Fan (Alibaba Group), Yi Rong (Alibaba Group), Chen Meng (Alibaba Group), ZongYan Cao (Alibaba Group), Siyu Wang (Alibaba Group), Zhen Zheng (Alibaba Group), Chuan Wu (The University of Hong Kong), Guoping Long (Alibaba Group), Jun Yang (Alibaba Group), LiXue Xia (Alibaba Group), Lansong Diao (Alibaba Group), Xiaoyong Liu (Alibaba Group), Wei Lin (Alibaba Group)
Link to publication