view MIT-media-projects.org @ 346:83afb1fcc999
add touch sensors and mass.
author    Robert McIntyre <rlm@mit.edu>
date      Sun, 22 Jul 2012 11:18:47 -0500
parents   c264ebf683b4
children
*Machine Learning and Pattern Recognition with Multiple Modalities

Hyungil Ahn and Rosalind W. Picard

This project develops new theory and algorithms to enable computers to make rapid and accurate inferences from multiple modes of data, such as determining a person's affective state from multiple sensors--video, mouse behavior, chair pressure patterns, typed selections, or physiology. Recent efforts focus on understanding the level of a person's attention, useful for things such as determining when to interrupt. Our approach is Bayesian: formulating probabilistic models on the basis of domain knowledge and training data, and then performing inference according to the rules of probability theory. This type of sensor fusion work is especially challenging due to problems of sensor channel drop-out, different kinds of noise in different channels, dependence between channels, scarce and sometimes inaccurate labels, and patterns to detect that are inherently time-varying. We have constructed a variety of new algorithms for solving these problems and demonstrated their performance gains over other state-of-the-art methods.

http://affect.media.mit.edu/projectpages/multimodal/
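
The abstract does not describe the actual models used by the project, but the core fusion idea it states (build a per-channel probabilistic model, combine channels by the rules of probability, and cope with sensor drop-out) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the channel names, the Gaussian likelihoods, the conditional-independence (naive Bayes) structure, and the binary attentive/distracted state are not taken from the project.

#+begin_src python
# Minimal sketch of Bayesian multimodal fusion with channel drop-out.
# NOT the project's model: channels, parameters, and states are illustrative.
import math

# Per-channel Gaussian likelihood parameters (mean, std) for each hidden state.
# These numbers are made up for illustration.
CHANNELS = {
    "video_motion":   {"attentive": (0.2, 0.1), "distracted": (0.6, 0.2)},
    "mouse_activity": {"attentive": (0.8, 0.2), "distracted": (0.3, 0.2)},
    "chair_pressure": {"attentive": (0.5, 0.1), "distracted": (0.5, 0.3)},
}

PRIOR = {"attentive": 0.5, "distracted": 0.5}


def gaussian_logpdf(x, mean, std):
    """Log density of a univariate Gaussian."""
    z = (x - mean) / std
    return -0.5 * z * z - math.log(std * math.sqrt(2.0 * math.pi))


def posterior(observations):
    """Posterior over the hidden state given a dict of channel readings.

    Channels absent from `observations` (sensor drop-out) contribute no
    likelihood term, which under the conditional-independence assumption
    is the same as marginalizing them out.
    """
    log_post = {}
    for state, prior in PRIOR.items():
        logp = math.log(prior)
        for channel, value in observations.items():
            mean, std = CHANNELS[channel][state]
            logp += gaussian_logpdf(value, mean, std)
        log_post[state] = logp

    # Normalize in log space for numerical stability.
    m = max(log_post.values())
    norm = sum(math.exp(v - m) for v in log_post.values())
    return {s: math.exp(v - m) / norm for s, v in log_post.items()}


if __name__ == "__main__":
    # The chair-pressure sensor has dropped out; fuse the remaining channels.
    print(posterior({"video_motion": 0.25, "mouse_activity": 0.7}))
#+end_src

The sketch leaves out the harder issues the abstract names (dependence between channels, noisy or scarce labels, time-varying patterns); handling those would require richer models, for example a dynamic model over time rather than a per-instant classifier.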