ASAP: Fast Mobile Application Switch via Adaptive Prepaging

  • Sam Son
  • Seung Yul Lee
  • Jonghyun Bae
  • Yunho Jin
  • Jinkyu Jeong
  • Tae Jun Ham
  • Jae W. Lee
  • Hongil Yoon
In Proceedings of the 2021 USENIX Annual Technical Conference (USENIX ATC '21), USENIX Association, pp. 365-380

Abstract

With ever-increasing memory demands from mobile applications, along with a steady rise in the number of applications running concurrently, memory capacity is becoming a scarce resource on mobile devices. Under high memory pressure, current mobile OSes often kill application processes that have not recently been used in order to reclaim memory. This leads to a long delay when the user relaunches the killed application, which degrades the user experience. Even if this mechanism is disabled in favor of a compression-based in-memory swap mechanism, relaunching the application still incurs a substantial latency penalty, as it requires decompressing compressed anonymous pages and issuing a stream of I/O accesses to retrieve file-backed pages into memory. This paper identifies conventional demand paging as the primary source of this inefficiency and proposes ASAP, a mechanism for fast application switch via adaptive prepaging on mobile devices. Specifically, ASAP performs prepaging effectively by combining i) high-precision switch-footprint estimators for both file-backed and anonymous pages with ii) an efficient implementation of the prepaging mechanism that minimizes the waste of CPU cycles and disk bandwidth during an application switch. Our evaluation of ASAP using eight real-world applications on a Google Pixel 4 demonstrates that ASAP reduces switch time by 22.2% on average (up to 33.3%) over vanilla Android 10.