Using eye movement analysis to study auditory effects on visual memory recall.

Marandi RZ, Sabzpoushan SH - Basic Clin Neurosci (2014)

Bottom Line: The method was validated with eight different participants. The recognition rate in the "with sound" stage was significantly reduced compared with the "without sound" stage. The result demonstrated that the familiarity of visual-auditory stimuli can be detected from EOG signals and that auditory input potentially improves the visual recall process.

View Article: PubMed Central - PubMed

Affiliation: Department of Biomedical Engineering, Iran University of Science and Technology, Narmak, Tehran, Iran.

ABSTRACT
Recent studies in affective computing focus on sensing human cognitive context using biosignals. In this study, electrooculography (EOG) was utilized to investigate memory recall accessibility via eye movement patterns. Twelve subjects participated in our experiment, in which pictures from four categories were presented. Each category contained nine pictures, of which three were presented twice and the rest only once. Each picture was presented for five seconds, followed by a three-second interval. The same task was then performed with new pictures accompanied by related sounds. Viewing was free, and participants were not informed of the task's purpose. Using pattern recognition techniques, participants' EOG signals in response to repeated and non-repeated pictures were classified for the "with sound" and "without sound" stages. The method was validated with eight different participants. The recognition rate in the "with sound" stage was significantly reduced compared with the "without sound" stage. The results demonstrate that the familiarity of visual-auditory stimuli can be detected from EOG signals and that auditory input potentially improves the visual recall process.
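The subject-wise validation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the classifier, so a nearest-centroid rule in plain NumPy stands in for it, and the function name and array layout are assumptions.

```python
import numpy as np

def leave_one_subject_out(features, labels, subjects):
    """Leave-one-subject-out validation with a nearest-centroid
    classifier (a stand-in; the paper's actual classifier is not
    specified in this excerpt).

    features: (n_trials, n_features) EOG-derived feature vectors
    labels:   (n_trials,) class labels, e.g. 0 = non-repeated picture,
              1 = repeated picture
    subjects: (n_trials,) subject ID for each trial
    Returns one accuracy per held-out subject.
    """
    accuracies = []
    for s in np.unique(subjects):
        test = subjects == s
        train = ~test
        # Class centroids estimated from the training subjects only.
        centroids = np.array(
            [features[train & (labels == c)].mean(axis=0) for c in (0, 1)]
        )
        # Assign each held-out trial to the nearest centroid.
        dists = np.linalg.norm(
            features[test, None, :] - centroids[None, :, :], axis=2
        )
        pred = dists.argmin(axis=1)
        accuracies.append((pred == labels[test]).mean())
    return np.array(accuracies)
```

Training and testing on disjoint subjects, as here, is what lets the reported recognition rates speak to generalization across people rather than within one recording session.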

No MeSH data available.


Related in: MedlinePlus

Figure 0009: The top 16 eye movement features selected by mRMR for all twelve training sets for the faces picture category. X-axis shows feature numbers and groups; the key on the right shows the corresponding feature names as described in Table 1; Y-axis shows the rank.

Feature rankings produced by mRMR were analyzed on each of the twelve leave-one-person-out training sets for the faces category. The rank of a feature is the position at which mRMR selected it within a set; this position reflects the feature's ability, in combination with the features already selected, to discriminate between classes. Figures 8 and 9 show the top 16 features according to the median rank over all sets (see Table 1 for feature descriptions). For each feature, the vertical bar represents the spread of mRMR ranks across the twelve training sets. The most useful features are those ranked highest (close to one) for most training sets, as indicated by shorter bars. As illustrated in Figures 8 and 9, the top discriminative features are largely common to the two parts of the experiment.
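The median-rank aggregation described above can be sketched as follows. This is an illustrative helper, not code from the paper; the function name and input layout are assumptions, and the spread is summarized here as a min-to-max range per feature, matching the vertical bars in the figures.

```python
import numpy as np

def aggregate_mrmr_ranks(rankings, top_k=16):
    """Summarize mRMR feature rankings over several training sets.

    rankings: list of selection orders; rankings[s][r] is the feature
    index that mRMR selected at position r (i.e. rank r + 1) in
    training set s.
    Returns the top_k feature indices sorted by median rank, together
    with each one's median rank and rank spread (max - min) across sets.
    """
    n_features = len(rankings[0])
    # rank_matrix[s, f] = 1-based rank of feature f in training set s
    rank_matrix = np.empty((len(rankings), n_features), dtype=int)
    for s, order in enumerate(rankings):
        for r, f in enumerate(order):
            rank_matrix[s, f] = r + 1
    median = np.median(rank_matrix, axis=0)
    top = np.argsort(median)[:top_k]
    spread = rank_matrix.max(axis=0) - rank_matrix.min(axis=0)
    return top, median[top], spread[top]
```

Under this scheme a stably useful feature has a median rank close to one and a small spread, which is exactly the "shorter bars" criterion the text uses to call out the most discriminative features.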

