CPCP Seminar: Transforming Your Research with High-Throughput Computing by Lauren Michael

Nov 10, 2015

9:30 am - 10:30 am

3rd Floor Teaching Lab, WID (Orchard View Room)

For projects aiming to extract novel understanding from large biomedical datasets, selecting from an ever-growing list of computational tools and approaches is often a significant bottleneck. Beyond the difficulty of simply becoming aware of the various methods lies the daunting task of identifying which of them will apply to, and scale with, the evolving complexity of future research problems. In fact, the ability of research projects to expand to greater dimensionality, validity, and impact is often unknowingly limited by researchers’ perceptions of the scalability of previously selected computational tools. Ultimately, the scalability of any computational research project or tool relies on the ability to effectively break up (or “parallelize”) large, time-consuming tasks into many smaller, independent pieces that can run concurrently, improving both ease of execution and overall throughput. As a general approach, high-throughput computing not only applies to a multitude of data-intensive research problems but also provides built-in scalability for the future.

To frame a discussion of high-throughput computing approaches, Lauren Michael from UW-Madison’s Center for High Throughput Computing (CHTC) will (a) discuss the applicability of high-throughput computing (HTC) for executing data-intensive work, as compared to other approaches; (b) introduce the CHTC-produced HTCondor software, noting its specific design as an important tool for executing HTC workloads; and (c) describe the free support and computational capacity of the CHTC, which also serves as UW-Madison’s research computing center. The seminar will include diverse examples of research projects from campus and around the world that have been transformed through the use of HTC, including the specific projects of several current CPCP members.
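To give a concrete flavor of the HTC model described above, here is a minimal sketch of an HTCondor submit file that breaks one large analysis into many independent jobs, one per input file. The executable name, file names, and resource requests are illustrative assumptions, not part of the seminar material:

```
# analysis.sub -- hypothetical HTCondor submit description
# Each queued job runs the same program on a different input chunk.
executable   = analyze.sh
arguments    = input_$(Process).dat

# Per-job logging, indexed by the job's process number
output       = analyze_$(Process).out
error        = analyze_$(Process).err
log          = analyze.log

# Illustrative per-job resource requests
request_cpus   = 1
request_memory = 2GB
request_disk   = 4GB

# Transfer the matching input chunk to the execute machine
transfer_input_files = input_$(Process).dat

# Queue 100 independent jobs: input_0.dat through input_99.dat
queue 100
```

Submitted with `condor_submit analysis.sub`, this turns a single long-running task into 100 jobs that HTCondor schedules across available machines, which is the kind of parallelization the seminar discusses.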