Keywords: computational kernels; energy consumption; high-performance computing; algorithms
Abstract: This thesis extends previous research on communication bounds and computational kernels by presenting bounds on the energy consumption of a large class of algorithms. These bounds apply to sequential, distributed-parallel, and heterogeneous machine models, and we detail methods to further extend these models to larger classes of machines. We argue that the energy consumption of computational kernels is usually predictable and can be modeled by linear models with a handful of terms. Thus, these energy models (and the accompanying bounds) may apply to many HPC applications when used in composition.
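For illustration only (the terms and symbols below are assumptions, not taken from this abstract), such a linear energy model with a handful of terms might take the form

    E = \gamma_F F + \gamma_W W + \gamma_M M + (\delta \hat{M} + \epsilon) T

where F counts floating-point operations, W the words and M the messages communicated, \hat{M} the memory footprint, T the runtime, and \gamma_F, \gamma_W, \gamma_M, \delta, \epsilon are machine-dependent energy coefficients fitted from measurements.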