Sunday, April 28, 2013

Two options for parallelizing the CStore representation function

Adam has been working on parallelizing an expensive operation performed by the imperative interpreter. This operation (repr) converts the interpreter's internal representation of the simulation state (Object) into the type used by the reference interpreter (CStore), as this is the format used to communicate results to the graphical user interface. Currently, this operation is done sequentially: the imperative interpreter first collects snapshots of the state at each step of the simulation and then maps repr over these to produce a stream of CStore objects, which are then passed to the GUI.
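To make the discussion concrete, here is a minimal sketch of the sequential pipeline. The type definitions and the body of repr are assumptions made for illustration (pasteable into the Scala REPL), not the actual Acumen definitions:

    // Minimal stand-ins for the interpreter's types; their structure here is an
    // assumption made for illustration, not Acumen's actual definitions.
    case class Obj(fields: Map[String, Double])   // one object in the interpreter's heap
    type State  = Vector[Obj]                     // internal simulation state
    type CStore = Vector[Map[String, Double]]     // representation consumed by the GUI

    // Stand-in for the expensive conversion of a single snapshot.
    def repr(st: State): CStore = st.map(_.fields)

    // The current sequential pipeline: collect snapshots, then map repr over them.
    def reprHistory(history: Stream[State]): Stream[CStore] =
      history.map(repr)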
There are several possible ways to parallelize this operation. An attractive first option, which is trivial to implement, is to use Scala's parallel collections. This amounts to adding the .par suffix to the collection, which makes all subsequent operations on it resolve to their parallel counterparts. Unfortunately, the simplicity of this solution comes at a price: Scala's parallel collection operations only work on compatible collection types (vectors, arrays, hash maps), and will copy the contents of sequential collections (such as the Stream used to represent the simulation history) into a compatible type before performing their computation. This means that an additional copy of the simulation state has to be kept in memory, and memory use is already a bottleneck in certain simulations.
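Continuing the sketch above, the parallel-collections option would look roughly like this; note that calling .par on the lazy Stream forces it and copies all snapshots into a parallel sequence before the map runs:

    // Parallel-collections option: .par copies the sequential Stream into a
    // ParSeq (the extra copy of the simulation state mentioned above), after
    // which map runs in parallel.
    def reprHistoryPar(history: Stream[State]): Seq[CStore] =
      history.par.map(repr).seq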
Another option that Adam looked at was to push the computation of the CStore into the iterateMain function, which updates the state during simulation and is already parallel. Fusing iterateMain and repr was straightforward, since both are traversals of the heap, and the necessary changes to the interpreter were relatively small. Surprisingly, the fused implementation did not yield any measurable performance improvement. Further investigation is needed to identify why this optimization was not successful.
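A rough sketch of the fusion, again using the hypothetical types from above: the per-object update and the construction of its CStore entry happen in the same (already parallel) traversal, so no separate repr pass over the snapshot is needed afterwards:

    // stepObj stands in for the per-object state update performed by iterateMain.
    def stepObj(o: Obj): Obj = o

    // Fused traversal: each object's update and its CStore entry are produced
    // in one parallel pass over the heap.
    def iterateMainFused(heap: State): (State, CStore) = {
      val results = heap.par.map { o =>
        val updated = stepObj(o)
        (updated, updated.fields)   // conversion fused into the update traversal
      }
      results.seq.toVector.unzip
    }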