
Course catalog: Parallel Programming Training
4401 followers
(78637/99817)
Course outline:

Parallel Programming Training

Parallel Programming

We motivate parallel programming and introduce the basic constructs for building parallel programs on the JVM and in Scala. Examples such as array norm and Monte Carlo computations illustrate these concepts.
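
To make the task-parallel style concrete, here is a minimal sketch, assuming a hand-rolled two-way parallel combinator on top of a JVM thread pool (the names parallel, sumSegment, and pNorm are illustrative, not the course's exact API): the p-norm of an array is computed by summing the two halves concurrently.

    import java.util.concurrent.{Callable, Executors}

    object ArrayNormSketch {
      private val pool = Executors.newFixedThreadPool(2)

      // Evaluate a and b concurrently: a runs on the current thread, b on the pool.
      def parallel[A, B](a: => A, b: => B): (A, B) = {
        val fb = pool.submit(new Callable[B] { def call(): B = b })
        (a, fb.get())
      }

      // Sum of |xs(i)|^p over the half-open interval [from, until).
      def sumSegment(xs: Array[Double], p: Double, from: Int, until: Int): Double = {
        var i = from; var acc = 0.0
        while (i < until) { acc += math.pow(math.abs(xs(i)), p); i += 1 }
        acc
      }

      // p-norm: the two halves of the array are summed in parallel.
      def pNorm(xs: Array[Double], p: Double): Double = {
        val mid = xs.length / 2
        val (s1, s2) = parallel(sumSegment(xs, p, 0, mid),
                                sumSegment(xs, p, mid, xs.length))
        math.pow(s1 + s2, 1.0 / p)
      }

      def main(args: Array[String]): Unit = {
        val xs = Array.tabulate(1000000)(i => (i % 7).toDouble)
        println(pNorm(xs, 2.0))
        pool.shutdown()
      }
    }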

We show how to estimate the work and depth of parallel programs, as well as how to benchmark the implementations.
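
A minimal benchmarking sketch using only the standard library (the helper timed is an assumed name): warm-up runs are taken before measuring so that the JIT compiler has already optimized the code under test.

    object BenchSketch {
      // Mean wall-clock time of `body` in milliseconds over `runs` measured runs,
      // after `warmups` unmeasured runs so the JIT has compiled the hot code.
      def timed[A](warmups: Int, runs: Int)(body: => A): Double = {
        (1 to warmups).foreach(_ => body)
        val start = System.nanoTime()
        (1 to runs).foreach(_ => body)
        (System.nanoTime() - start) / 1e6 / runs
      }

      def main(args: Array[String]): Unit = {
        val xs = Array.tabulate(2000000)(i => (i % 7).toDouble)
        val t = timed(warmups = 10, runs = 50)(xs.map(x => x * x).sum)
        println(f"mean time: $t%.2f ms")
      }
    }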

Basic Task Parallel Algorithms

We continue with examples of parallel algorithms by presenting a parallel merge sort.
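
A compact sketch of such a parallel merge sort on the JVM's fork/join framework (the cutoff and names such as SortTask are illustrative, not the course's exact code): below a size threshold the input is sorted sequentially; above it, one half is forked as a separate task while the other half is sorted on the current thread, and the two sorted halves are merged.

    import java.util.concurrent.RecursiveTask

    object ParallelMergeSortSketch {
      val threshold = 1 << 12   // below this size, sort sequentially

      final class SortTask(xs: Vector[Int]) extends RecursiveTask[Vector[Int]] {
        def compute(): Vector[Int] =
          if (xs.length <= threshold) xs.sorted
          else {
            val (l, r) = xs.splitAt(xs.length / 2)
            val right = new SortTask(r)
            right.fork()                          // sort the right half asynchronously
            val sortedLeft = new SortTask(l).compute()
            merge(sortedLeft, right.join())       // join waits for the right half
          }
      }

      // Sequential merge of two sorted vectors.
      def merge(a: Vector[Int], b: Vector[Int]): Vector[Int] = {
        val out = Vector.newBuilder[Int]
        var i = 0; var j = 0
        while (i < a.length && j < b.length)
          if (a(i) <= b(j)) { out += a(i); i += 1 } else { out += b(j); j += 1 }
        while (i < a.length) { out += a(i); i += 1 }
        while (j < b.length) { out += b(j); j += 1 }
        out.result()
      }

      def sort(xs: Vector[Int]): Vector[Int] = new SortTask(xs).invoke()

      def main(args: Array[String]): Unit = {
        val xs = Vector.fill(100000)(scala.util.Random.nextInt())
        assert(sort(xs) == xs.sorted)
      }
    }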

We then explain how operations such as map, reduce, and scan can be computed in parallel. We present associativity as the key condition enabling parallel implementation of reduce and scan.
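
The sketch below shows why associativity matters for reduce (assumed names, not a library API): the input is split in half, the halves are reduced in parallel, and the partial results are combined with the same operator, which agrees with a sequential fold only when the operator is associative.

    import java.util.concurrent.RecursiveTask

    object ParallelReduceSketch {
      val threshold = 1 << 12

      final class ReduceTask[A](xs: Vector[A], f: (A, A) => A) extends RecursiveTask[A] {
        def compute(): A =
          if (xs.length <= threshold) xs.reduceLeft(f)
          else {
            val (l, r) = xs.splitAt(xs.length / 2)
            val right = new ReduceTask(r, f)
            right.fork()                               // reduce the right half in parallel
            f(new ReduceTask(l, f).compute(), right.join())
          }
      }

      def reducePar[A](xs: Vector[A])(f: (A, A) => A): A = new ReduceTask(xs, f).invoke()

      def main(args: Array[String]): Unit = {
        val xs = Vector.tabulate(1000000)(i => i.toLong)
        println(reducePar(xs)(_ + _))   // + is associative, so the result matches a sequential fold
        // An operator like subtraction is not associative, so a parallel reduce
        // may disagree with foldLeft: (a - b) - c != a - (b - c) in general.
      }
    }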

Data-Parallelism

We show how data-parallel operations enable the development of elegant data-parallel code in Scala. We give an overview of the parallel collections hierarchy, including the traits of splitters and combiners that complement iterators and builders from the sequential case.
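
As a small illustration, the same high-level operation can be run on a sequential collection or, via .par, on a parallel one (on Scala 2.13+ the parallel collections live in the separate scala-parallel-collections module, hence the import; on 2.12, .par is available out of the box).

    import scala.collection.parallel.CollectionConverters._

    object DataParallelSketch {
      def main(args: Array[String]): Unit = {
        val xs = (1 to 1000000).toVector

        // The same high-level code, sequential and data-parallel. Under the hood,
        // a splitter partitions the elements among workers and a combiner merges
        // the partial results back into one collection.
        val seqSum = xs.map(x => x.toLong * x).sum
        val parSum = xs.par.map(x => x.toLong * x).sum

        assert(seqSum == parSum)
        println(parSum)
      }
    }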

Data Structures for Parallel Computing

We give a glimpse of the internals of data structures for parallel computing, which helps us understand what is happening under the hood of parallel collections.
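
As a toy glimpse of those internals, a combiner can be viewed as a builder that also knows how to merge with another combiner cheaply. The sketch below uses invented names (it is not the real scala.collection.parallel API): elements are kept in chunks so that combining two combiners only concatenates chunk lists, and only result() pays the full copying cost.

    import scala.collection.mutable.ArrayBuffer

    class ChunkCombiner[T] {
      // Elements are stored as a list of chunks; the last chunk receives new elements.
      private val chunks = ArrayBuffer(ArrayBuffer.empty[T])

      // Builder part: add a single element.
      def +=(x: T): this.type = { chunks.last += x; this }

      // Combiner part: merge another combiner in O(number of chunks),
      // not O(number of elements).
      def combine(that: ChunkCombiner[T]): ChunkCombiner[T] = {
        chunks ++= that.chunks
        this
      }

      // Only here do we pay the cost of copying everything into the final collection.
      def result(): Vector[T] = chunks.flatten.toVector
    }

In this picture, each worker fills its own ChunkCombiner for its partition, the partial combiners are combined pairwise, and result() is called once at the end.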

