Figure 2
From: Detecting & interpreting self-manipulating hand movements for student's affect prediction

Schematic diagram of the proposed gesture and mental state prediction framework. Given input image frames, the gesture detection module extracts binary codes, which are fed to a three-layered Bayesian network that emits posterior estimates over all mental states. The bar graph shows the posterior estimate of a mental state together with the associated gesture.
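As a rough illustration of the final stage of this pipeline, the sketch below applies Bayes' rule to a single binary gesture code to obtain a posterior over mental states. The state names, prior, and likelihood values are purely hypothetical placeholders, not parameters from the paper, and a single-observation update stands in for the full three-layered network.

```python
import numpy as np

# Hypothetical mental states with illustrative prior P(state) --
# these values are NOT taken from the paper.
states = ["engaged", "bored", "frustrated"]
prior = np.array([0.5, 0.3, 0.2])

# Illustrative likelihood P(gesture code = 1 | state) for one
# binary gesture code (e.g. a self-manipulating hand movement).
p_gesture_given_state = np.array([0.2, 0.7, 0.5])

def posterior(gesture_present: bool) -> np.ndarray:
    """Posterior P(state | observation) for one binary gesture code."""
    likelihood = p_gesture_given_state if gesture_present else 1.0 - p_gesture_given_state
    unnormalized = likelihood * prior
    return unnormalized / unnormalized.sum()

post = posterior(True)
print(dict(zip(states, post.round(3))))
```

With these placeholder numbers, observing the gesture shifts probability mass toward the state whose likelihood for that gesture is highest; the real framework would combine many such binary codes across the network's layers.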