Call for Papers

Software systems (e.g., smartphone apps, desktop applications, telecommunication infrastructures, and enterprise systems) have strict performance requirements. Failing to meet these requirements may cause business losses, customer defection, brand damage, and other serious consequences. In addition to conventional functional testing, the performance of these systems must be verified through load testing or benchmarking to ensure quality of service.

Load testing examines the behavior of a system by simulating hundreds or thousands of users performing tasks at the same time. Benchmarking compares a system's performance against that of similar systems in the domain. The workshop is not limited to traditional load testing; it is open to ideas for reinventing and extending load testing, as well as any other way of ensuring system performance and resilience under load, including any kind of performance testing, resilience/reliability/high-availability/stability testing, operational profile testing, stress testing, A/B and canary testing, volume testing, and chaos engineering.

Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and implementing the testing tools), environments (software and hardware setup), and time (limited time to design, run, and analyze tests). Yet little research in the software engineering domain addresses this topic.

Adapting load testing to recent industry trends, such as cloud computing, agile/iterative development, continuous integration/delivery, microservices, serverless computing, AI/ML services, and containers, poses major challenges that are not yet fully addressed.

This one-day workshop brings together software testing and software performance researchers, practitioners, and tool developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking software systems. Our ultimate goal is to grow an active community around this important and practical research topic.

We solicit two tracks of submissions:

  1. Research or industry papers
  2. Presentations (industry or research talks)

Research and industry papers should follow the standard ACM SIG proceedings format and must be submitted electronically via HotCRP. Extended abstracts for the presentation track must also be submitted via HotCRP, as "abstract only" submissions. Accepted papers will be published in the ICPE 2025 Companion Proceedings. Submissions can be research papers, position papers, case studies, or experience reports addressing issues including but not limited to the following:


Instructions for Authors from ACM

By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all ACM Publications Policies, including ACM's new Publications Policy on Research Involving Human Participants and Subjects. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.

Please ensure that you and your co-authors obtain an ORCID ID so that you can complete the publishing process for your accepted paper. ACM has been involved in ORCID from the start, and we have recently made a commitment to collect ORCID IDs from all of our published authors. The collection process started and was rolled out as a requirement throughout 2022. We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.

Important Dates

Paper Track (research and industry papers):

[Submission Link]
Abstract submission: January 20, 2025, AOE;
Paper submission: January 24, 2025, AOE;
Author notification: February 11, 2025;
Camera-ready version: February 26, 2025

Presentation Track:

Extended abstract submission: February 17, 2025, AOE;
Author notification: February 26, 2025;
Workshop date: May 5-9, 2025 (exact day TBD)


Organization:

Chairs:

Stephen Fan, The King's University, Canada
Lizhi Liao, Memorial University of Newfoundland, Canada
Zhenhao Li, York University, Canada


Web Chair:

Changyuan Lin, The University of British Columbia, Canada


Program Committee:

Alexander Podelko, Amazon
An Ran Chen, University of Alberta
Andrew Tappenden, The King's University
Daniele Di Pompeo, University of L'Aquila
Dong Jae Kim, DePaul University
Gerson Sunyé, University of Nantes
Heng Li, Polytechnique Montréal
Jinfu Chen, Wuhan University
Junkai Chen, Zhejiang University
Kundi Yao, University of Waterloo
Michele Tucci, University of L'Aquila
Zebin Ren, Vrije Universiteit Amsterdam
Zeyang Ma, Concordia University
Zishuo Ding, Hong Kong University of Science and Technology (Guangzhou)


Past LTB Workshops: