
Understanding and Visualizing Full Systems with Data Flow Tomography

Tuesday March 18, 2008
Hamerschlag Hall D-210
4:00 pm



Eric Chung
Carnegie Mellon University

Data Flow Tomography, recently proposed by Mysore et al. at the University of California, Santa Barbara, is a technique for visualizing and understanding the interactions between the complex components of modern computer systems. Software abstraction layers enable developers to construct sophisticated systems without needing to understand the details of the underlying system. While these abstractions are necessary, few people understand, or are even aware of, the many layers of software that run and interact beneath and around their programs.

Data flow tomography uses data flow tagging at the ISA level to identify data regions of interest. Tags can be generated at the network interface, by the application, or by instruction-level rules, and are tracked at the ISA level to aid the examination and visualization of systems composed of many independently developed components. In this talk, I will discuss three data tagging policies: 1) feed-forward tag-and-release for finding where data is used, 2) source-flow tagging for tracing data back to its source, and 3) a confluence method that relies on the collision of tags to map the boundaries between components. A prototype built on QEMU is used to test these methods over distributed systems with a variety of components (OS, Mongrel, Ruby, Rails, Apache, Perl, and MySQL) and to generate visualization results for the end user.
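
To make the first policy concrete, here is a minimal sketch of feed-forward tag propagation, in the spirit of the tag-and-release idea described above. This is an illustration written for this page, not the authors' QEMU implementation: the toy instruction format, the register and memory names, and the "network_rx" tag source are all hypothetical.

    # Shadow state mapping each register/memory location to a set of tags.
    shadow = {}

    def tags_of(loc):
        """Return the set of tags currently attached to a location."""
        return shadow.get(loc, set())

    def execute(op, dst, src1, src2=None):
        """Execute one toy instruction, propagating tags feed-forward:
        any tag on a source operand is copied onto the destination."""
        propagated = tags_of(src1) | (tags_of(src2) if src2 else set())
        if propagated:
            shadow[dst] = tags_of(dst) | propagated
        # (the actual data computation for 'op' would happen here)

    # Tag-and-release: tag data as it enters the system (e.g., off the
    # network), then observe every location the tag flows to.
    shadow["mem:0x1000"] = {"network_rx"}   # hypothetical injection point
    execute("load", "r1", "mem:0x1000")      # r1 picks up {'network_rx'}
    execute("add",  "r2", "r1", "r3")        # tag flows onward into r2
    print(tags_of("r2"))                      # -> {'network_rx'}

Logging each propagation event in a tracker like this, across every instruction the system executes, yields the raw flow graph from which the visualizations are built; the other two policies differ mainly in where tags originate and in how tag collisions are interpreted.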


Eric is a fourth-year PhD student in the Computer Architecture Laboratory at Carnegie Mellon, where he is advised by Prof. James C. Hoe. His interests are in computer architecture and FPGA-accelerated simulation technologies for full-system, multiprocessor architectural exploration.

 
