Friday, April 30, 2010
Gates Hillman Center 6501 (Note the change in location)
University of Virginia
Modern computer system designers must consider many more factors than the raw performance of individual applications. Thermal output, power consumption, reliability, heterogeneity, and dynamic resource contention have become first-order concerns. Yet many of these issues are transient in nature, difficult to predict, and expensive to avoid entirely. These observations point toward the potential benefits of adaptive systems that detect and react to changing conditions as they arise. Historically, research efforts in optimizing computer systems have targeted a single logical layer of the system stack, whether the hardware (microarchitectural or circuit-level techniques), the middleware (operating systems and virtual machines), or the software (static and dynamic compilation). A truly adaptive system, however, requires coordination across all layers to be effective. While hardware is ideal for detecting thermal emergencies, for instance, the middleware has a global view of the runtime environment, including resource contention and observed process heterogeneity, and the software has a global view of the opportunities for permanent code-based solutions that leverage dynamic information. In this talk, I will make a case for dynamic adaptation as a solution to several modern architectural and system challenges, such as voltage noise, heterogeneity, and resource contention. I will discuss our research efforts in integrating the strengths of each design layer to provide cohesive, symbiotic solutions to these challenges. At the core of this work is the Pin dynamic instrumentation system, which we use to collate information gathered from the various system layers and to perform run-time code transformations. I will therefore discuss the myriad implementation challenges we encountered while developing, optimizing, and supporting Pin. Finally, I will highlight the numerous benefits of runtime adaptation moving forward.
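To give a flavor of the run-time instrumentation the talk describes, the sketch below is modeled on the canonical instruction-counting tool from the Pin documentation: an instrumentation routine asks Pin to insert a call to a small analysis routine before every instruction, and a total is reported when the target program exits. This is a sketch rather than a complete recipe; compiling and launching it requires the Pin kit (which supplies `pin.H` and the `pin` launcher), and the tool name `inscount` is illustrative.

```cpp
#include <iostream>
#include "pin.H"  // provided by the Pin kit

static UINT64 icount = 0;

// Analysis routine: executed before every dynamic instruction.
static VOID DoCount() { icount++; }

// Instrumentation routine: Pin calls this once per static instruction
// as code is first translated; we request a call to DoCount before it.
static VOID Instruction(INS ins, VOID *v)
{
    INS_InsertCall(ins, IPOINT_BEFORE, (AFUNPTR)DoCount, IARG_END);
}

// Fini routine: runs when the instrumented application exits.
static VOID Fini(INT32 code, VOID *v)
{
    std::cerr << "Executed " << icount << " instructions" << std::endl;
}

int main(int argc, char *argv[])
{
    if (PIN_Init(argc, argv)) return 1;          // parse Pin's command line
    INS_AddInstrumentFunction(Instruction, 0);   // register instrumenter
    PIN_AddFiniFunction(Fini, 0);                // register exit callback
    PIN_StartProgram();                          // run the app; never returns
    return 0;
}
```

A tool like this is typically invoked as `pin -t inscount.so -- <application>`, which runs the unmodified application binary under Pin's just-in-time translator with the analysis code injected.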
Kim Hazelwood is an Assistant Professor of Computer Science at the University of Virginia. She works at the boundary between hardware and software, with research efforts focusing on computer architecture, run-time optimizations, and the implementation and applications of virtual execution environments. She received her Ph.D. from Harvard University in 2004. Since then, she has become widely known for her active contributions to the Pin dynamic instrumentation system, which allows users to easily inject arbitrary code into existing program binaries at run time (www.pintool.org). Pin is widely used throughout industry and academia to investigate new approaches to program introspection, optimization, security, and architectural design. It has been downloaded over 40,000 times and cited in over 500 publications since its release in July 2004. Kim has published over 35 peer-reviewed articles related to computer architecture and virtualization. She has served on over a dozen program committees, including ISCA, PLDI, MICRO, and PACT, and is the program chair of CGO 2010. Kim is the recipient of numerous awards, including the FEST Distinguished Young Investigator Award for Excellence in Science and Technology, an NSF CAREER Award, a Woodrow Wilson Career Enhancement Fellowship, the Anita Borg Early Career Award, and research awards from Microsoft, Google, NSF, and the SRC. Her research has been featured in Computer World, ZDNet, EE Times, and Slashdot.