Search results

Number of results: 4

Abstract

Although emotions, and learning based on emotional reactions, are individual-specific, their main features are consistent across all people. Depending on a person's emotional state, various physical and physiological changes can be observed in pulse and breathing, blood flow velocity, hormonal balance, voice properties, facial expression, and hand movements. The diversity, size, and degree of these changes are shaped by the different emotional states. Acoustic analysis, an objective evaluation method, is used to determine a person's emotional state from their voice characteristics. In this study, the reflection of anxiety disorder in people's voices was investigated through acoustic parameters. The study is a cross-sectional case-control study. Voice recordings were obtained from healthy people and from patients, and 122 acoustic parameters were extracted from these recordings by acoustic analysis. The relation of these parameters to the anxious state was investigated statistically. According to the results, 42 acoustic parameters change in the anxious state: subglottic pressure increases and the vocalization of vowels decreases. The change in the MFCC parameters in the anxious state indicates that listeners can perceive this state while listening to the speech. It was also shown that text reading is effective in triggering emotions. These findings show that the voice changes in the anxious state and that the acoustic parameters are influenced by it. For this reason, acoustic analysis can be used as an expert decision support system for the diagnosis of anxiety.
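As an illustration of the parameter-extraction step, the sketch below computes 13 MFCCs per frame from a recording and averages them over time; librosa and the placeholder file name recording.wav are assumptions, since the abstract does not name the study's toolchain.

```python
# Hedged sketch of MFCC extraction; librosa and the file name are assumptions.
import librosa

y, sr = librosa.load("recording.wav", sr=None)       # placeholder path
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # 13 coefficients per frame
mfcc_mean = mfcc.mean(axis=1)                        # one summary value per coefficient
print(mfcc_mean)                                     # candidates for group comparison
```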

Abstract

The paper analyzes the estimation of the fundamental frequency from a real speech signal, obtained by recording the speaker in a real acoustic environment and modeled by the MP3 method. The estimation was performed by the Picking-Peaks algorithm with implemented parametric cubic convolution (PCC) interpolation. The efficiency of PCC was tested for the Catmull-Rom, Greville, and Greville two-parametric kernels. Based on the MSE, the window giving optimal results was chosen.
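As a sketch of that estimation pipeline, the code below picks the spectral peak and refines it to sub-bin accuracy with the Catmull-Rom kernel (one PCC kernel choice); the FFT size, Hann window, and synthetic test tone are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch: spectral peak picking with Catmull-Rom sub-bin refinement.
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    # Catmull-Rom cubic interpolation between p1 and p2 for t in [0, 1]
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

def estimate_f0(signal, fs, n_fft=8192):
    # Coarse peak bin, then dense evaluation of the kernel on both
    # neighbouring segments to locate the fractional peak position.
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal)), n_fft))
    k = np.argmax(spectrum[2:-2]) + 2            # keep 4-point neighbourhoods valid
    t = np.linspace(0.0, 1.0, 1000)
    left = catmull_rom(*spectrum[k - 2:k + 2], t)    # segment between bins k-1 and k
    right = catmull_rom(*spectrum[k - 1:k + 3], t)   # segment between bins k and k+1
    if left.max() > right.max():
        k_ref = (k - 1) + t[np.argmax(left)]
    else:
        k_ref = k + t[np.argmax(right)]
    return k_ref * fs / n_fft                    # bin position -> Hz

fs = 16000
tone = np.sin(2 * np.pi * 221.3 * np.arange(4096) / fs)  # synthetic test tone
print(estimate_f0(tone, fs))                     # close to 221.3 Hz
```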

Abstract

Speech emotion recognition is an important part of human-machine interaction studies. The acoustic analysis method is used for emotion recognition from speech. An emotion does not cause changes in all acoustic parameters; rather, the parameters affected vary depending on the emotion type. In this context, the emotion-based variability of acoustic parameters is still an active field of study. The purpose of this study is to investigate which acoustic parameters fear affects and the extent of its influence. For this purpose, various acoustic parameters were obtained from speech recordings containing fear and neutral emotions. The change of these parameters with emotional state was analyzed using statistical methods, and the parameters affected by the fear emotion, together with the degree of that influence, were determined. According to the results, the majority of the acoustic parameters that fear affects vary with the data used. However, it was demonstrated that formant frequencies, mel-frequency cepstral coefficients, and jitter parameters can define the fear emotion independently of the data used.
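A minimal sketch of the group-comparison step is given below; the jitter values are invented for illustration, and the Mann-Whitney U test stands in for the (unspecified) statistical tests of the study.

```python
# Sketch: testing whether one acoustic parameter (per-recording jitter)
# differs between fear and neutral speech. Values are hypothetical.
import numpy as np
from scipy import stats

jitter_neutral = np.array([0.41, 0.38, 0.45, 0.39, 0.43])  # hypothetical % jitter
jitter_fear = np.array([0.62, 0.58, 0.71, 0.66, 0.60])     # hypothetical % jitter

# Mann-Whitney U is a common choice when normality cannot be assumed
u, p = stats.mannwhitneyu(jitter_neutral, jitter_fear, alternative="two-sided")
print(f"U = {u:.1f}, p = {p:.4f}")   # a small p suggests fear affects the parameter
```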

Abstract

The present study consisted of two experiments. The goal of the first experiment was to establish the just-noticeable difference for the fundamental frequency of the vowel /u/ using the 2AFC method. We obtained a threshold value of 27 cents. This value is larger than the motor reaction values observed in previous experiments (e.g. 9 or 19 cents). The second experiment was intended to provide neurophysiological confirmation of the detection of frequency shifts, using event-related potentials (ERPs). We concentrated on the mismatch negativity (MMN), the component elicited by a change in the pattern of stimuli; its occurrence is correlated with the discrimination threshold. In our study, MMN was observed for changes greater than 27 cents, i.e. shifts of ±50 and ±100 cents (effect size: Cohen's d = 2.259), but it did not appear for changes of ±10 and ±20 cents. The results showed that the values at which motor responses can be observed are indeed lower than the perceptual thresholds.
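The cent arithmetic behind these thresholds is worth making explicit: a shift of c cents maps a frequency f to f * 2^(c/1200). The sketch below applies this to a base F0 of 150 Hz, chosen purely for illustration since the abstract does not state the vowel's actual F0.

```python
# Cent <-> frequency conversions; the 150 Hz base F0 is an assumed example value.
import math

def shift_by_cents(f, cents):
    return f * 2 ** (cents / 1200)            # c cents scale f by 2**(c/1200)

def cents_between(f1, f2):
    return 1200 * math.log2(f2 / f1)

f0 = 150.0                                    # hypothetical F0 for the vowel /u/
print(shift_by_cents(f0, 27))                 # ~152.36 Hz: one 27-cent JND step
print(cents_between(f0, shift_by_cents(f0, 50)))  # 50.0 cents, round-trip check
```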
