[SensorFetch code]
The project aims to predict five kinds of emotions of pedestrians from cellphone sensor data. The following procedures were performed: I developed an Android mobile app, SensorFetch, to record sensor data; a total of 30 pedestrians were recruited to walk with cellphones in hand under induced emotions; I then regressed the sensor data onto the emotion labels that the pedestrians self-reported. The results demonstrated only a weak correlation, indicating that further measures are required to accurately capture pedestrian emotions.
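The regression step above can be sketched as follows. This is a minimal illustration with synthetic placeholder data, not the project's actual dataset or feature set; the gait features and the 1-5 intensity scale are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_walks = 30                      # one walk per recruited pedestrian
# Hypothetical gait features per walk: e.g. step cadence,
# acceleration variance, mean jerk (synthetic stand-ins)
X = rng.normal(size=(n_walks, 3))
# Self-reported intensity for one emotion, on an assumed 1-5 scale
y = 3.0 + 0.2 * X[:, 0] + rng.normal(scale=1.0, size=n_walks)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n_walks), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# Pearson correlation between predicted and reported labels;
# a small value would reflect the weak correlation observed
r = np.corrcoef(y_hat, y)[0, 1]
print(f"coefficients: {coef.round(2)}, r = {r:.2f}")
```

One such regression would be fit per emotion category; the correlation coefficient then quantifies how well the sensor features explain each reported emotion.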