In this text, students of applied mathematics, science, and
engineering are introduced to fundamental ways of thinking about
the broad context of parallelism. The authors begin by giving the
reader a deeper understanding of the issues through a general
examination of timing, data dependencies, and communication. These
ideas are then developed in the contexts of shared memory, parallel
and vector processing, and distributed-memory cluster computing.