Evaluation of Fear Using Nonintrusive Measurement of Multimodal Sensors.

Choi JS, Bang JW, Heo H, Park KR - Sensors (Basel) (2015)

Bottom Line: Further, the latter causes inconvenience to the user due to the sensors attached to the body. Among various emotions, the accurate evaluation of fear is crucial in many applications, such as criminal psychology, intelligent surveillance systems and the objective evaluation of horror movies. Therefore, we propose a new method for evaluating fear based on nonintrusive measurements obtained using multiple sensors.


Affiliation: Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea. jjongssuk@dgu.edu.

ABSTRACT
Most previous research into emotion recognition has used either a single modality or multiple modalities of physiological signals. However, the former method allows for only limited enhancement of accuracy, and the latter has the disadvantage that its performance can be affected by head or body movements. Further, the latter causes inconvenience to the user due to the sensors attached to the body. Among various emotions, the accurate evaluation of fear is crucial in many applications, such as criminal psychology, intelligent surveillance systems and the objective evaluation of horror movies. Therefore, we propose a new method for evaluating fear based on nonintrusive measurements obtained using multiple sensors. Experimental results based on the t-test, the effect size and the sum of all of the correlation values with other modalities showed that facial temperature and subjective evaluation are more reliable than electroencephalogram (EEG) and eye blinking rate for the evaluation of fear.


sensors-15-17507-f005: Four corresponding (calibration) pairs of points produced by four NIR illuminators to obtain the geometric transform matrix and an example for measuring calibration accuracy. (a) Four pairs of corresponding (calibration) points; (b) pair of points for measuring calibration accuracy.

Mentions: As explained in Section 2.1 and shown in Figure 3, the images are acquired using dual cameras in order to measure the user's facial temperature variations. It is usually difficult to detect the facial feature regions in the thermal image, because the textures of the facial features are not distinct there, as shown in Figure 3. Therefore, the facial feature regions are detected in the visible-light image. However, the viewing angle and image resolution of the visible-light camera differ from those of the thermal camera, and there is a positional disparity between the two cameras, as shown in Figure 3. Consequently, the coordinates of the facial feature regions detected in the visible-light image cannot be used directly in the thermal image. To solve this problem, the two camera axes are made parallel with the minimum horizontal distance between them, as shown in Figure 3. Then, the coordinates of the two images (visible-light and thermal) are made coincident using a geometric transform, as shown in Equation (1) and Figure 5.

(1)
\[
\begin{bmatrix}
T_{x0} & T_{x1} & T_{x2} & T_{x3} \\
T_{y0} & T_{y1} & T_{y2} & T_{y3} \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{bmatrix}
=
\begin{bmatrix}
a & b & c & d \\
e & f & g & h \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{bmatrix}
\begin{bmatrix}
V_{x0} & V_{x1} & V_{x2} & V_{x3} \\
V_{y0} & V_{y1} & V_{y2} & V_{y3} \\
V_{x0}V_{y0} & V_{x1}V_{y1} & V_{x2}V_{y2} & V_{x3}V_{y3} \\
1 & 1 & 1 & 1
\end{bmatrix}
\]

(2)
\[
\begin{bmatrix}
T'_{x} \\ T'_{y} \\ 0 \\ 0
\end{bmatrix}
=
\begin{bmatrix}
a & b & c & d \\
e & f & g & h \\
0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0
\end{bmatrix}
\begin{bmatrix}
V'_{x} \\ V'_{y} \\ V'_{x}V'_{y} \\ 1
\end{bmatrix}
\]
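
Because the four calibration pairs in Figure 5a provide eight equations for the eight unknown coefficients a–h, the transform matrix can be recovered by inverting the 4 × 4 matrix of visible-light basis vectors on the right of Equation (1); Equation (2) then maps any visible-light pixel into thermal-image coordinates. Below is a minimal sketch of this procedure in Python/NumPy, not the authors' code; the coordinate values and the helper names (basis, visible_to_thermal) are illustrative assumptions.

```python
import numpy as np

# Sketch of the calibration in Equations (1) and (2): fit the bilinear transform
# that maps visible-light coordinates (Vx, Vy) to thermal-image coordinates
# (Tx, Ty) from four calibration point pairs, then apply it to new points.
# All coordinate values below are made-up examples, not data from the paper.

# Visible-light coordinates of the four calibration points (columns = points)
V = np.array([[100.0, 540.0, 540.0, 100.0],    # Vx0 .. Vx3
              [ 80.0,  80.0, 400.0, 400.0]])   # Vy0 .. Vy3

# Corresponding thermal-image coordinates of the same four points
T = np.array([[ 40.0, 280.0, 280.0,  40.0],    # Tx0 .. Tx3
              [ 30.0,  30.0, 210.0, 210.0]])   # Ty0 .. Ty3

def basis(x, y):
    """Bilinear basis vector [x, y, x*y, 1] used on the right side of Equations (1) and (2)."""
    return np.array([x, y, x * y, 1.0])

# Right-most matrix of Equation (1): one basis column per calibration point.
B = np.column_stack([basis(V[0, i], V[1, i]) for i in range(4)])

# Solve T = M @ B for the coefficient matrix M = [[a, b, c, d], [e, f, g, h]].
M = T @ np.linalg.inv(B)

def visible_to_thermal(vx, vy):
    """Equation (2): map a visible-light pixel (vx, vy) into the thermal image."""
    tx, ty = M @ basis(vx, vy)
    return tx, ty

if __name__ == "__main__":
    # e.g., map the center of a facial-feature region detected in the visible-light image
    print(visible_to_thermal(320.0, 240.0))
```

The extra point pair in Figure 5b can then be used to check calibration accuracy, for example by comparing the mapped visible-light point against the corresponding point measured directly in the thermal image.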

