Testing during system development
What is the best way to perform evaluation during system development? I don’t think researchers regularly recruit human evaluators to listen to the outputs of their system after each small design change, and yet some way of measuring progress seems necessary. It would be neat to have some automatic objective scoring methods, based on statistical properties of human speech, even if they could only serve as rough guides. The scenario I imagine is knowing which features most reliably distinguish synthetic speech from natural speech, and being able to evaluate how your system fares with respect to those features. Such features would need to be relatively easy and quick to calculate. They could include, for instance, a measure of the smoothness of the pitch contour, of how natural the spectral changes are, or of how human-like the placement and length of pauses is.
What I’d like to know is (a) whether we know what computable properties of speech correlate reliably with perceived naturalness; and (b) if so, whether researchers use these properties for development-stage evaluation, and if not, what they do apart from listening to the synthesised outputs themselves.
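For concreteness, here is the kind of quick-and-dirty feature I have in mind, sketched in Python under the assumption that F0 has already been extracted at a fixed frame rate, with 0 marking unvoiced frames. This is just my own illustration, not an established metric:

```python
import numpy as np

def f0_smoothness(f0):
    """Crude pitch-contour smoothness score: the mean absolute second
    difference of log-F0, computed within contiguous voiced runs so that
    voiced/unvoiced boundaries do not create spurious jumps.
    `f0` is a 1-D array of F0 values in Hz, with 0 for unvoiced frames."""
    voiced = f0 > 0
    # indices where the voiced/unvoiced state changes
    edges = np.flatnonzero(np.diff(voiced.astype(int))) + 1
    bounds = np.concatenate(([0], edges, [len(f0)]))
    deltas = []
    for start, end in zip(bounds[:-1], bounds[1:]):
        # second differences need at least 3 frames in a voiced run
        if voiced[start] and end - start >= 3:
            deltas.extend(np.abs(np.diff(np.log(f0[start:end]), n=2)))
    return float(np.mean(deltas)) if deltas else 0.0
```

Lower values mean a smoother contour; whether that actually tracks perceived naturalness is exactly what I’m asking about.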
If we don’t want to perform a listening test after every minor change to the system, then we need to rely on either objective measures or our own informal listening.
We’ll cover objective measures in the lecture.
Objective measures are widely used in statistical parametric synthesis. In fact, the statistical model is essentially trained to minimise a kind of objective error on the training data. We can then measure the same error with respect to some held-out data.
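For example, one widely reported objective measure in statistical parametric synthesis is mel-cepstral distortion (MCD) between synthesised and natural speech. Here is a minimal sketch, assuming both utterances have already been converted to mel-cepstral coefficients at the same frame rate and are time-aligned frame by frame (in practice, dynamic time warping is often used for the alignment):

```python
import numpy as np

def mel_cepstral_distortion(ref_mcep, syn_mcep):
    """Mean mel-cepstral distortion (MCD) in dB between two frame-aligned
    mel-cepstral sequences of shape (num_frames, num_coeffs).
    Coefficient 0 (overall energy) is excluded by convention."""
    diff = ref_mcep[:, 1:] - syn_mcep[:, 1:]
    # Standard MCD scaling: (10 / ln 10) * sqrt(2) converts the
    # per-frame Euclidean distance on cepstra into decibels.
    scale = (10.0 / np.log(10.0)) * np.sqrt(2.0)
    return scale * float(np.mean(np.sqrt(np.sum(diff ** 2, axis=1))))
```

Lower MCD means the synthetic spectra are closer to the natural reference, although the correlation with perceived naturalness is far from perfect, which is why formal listening tests are still needed for final evaluation.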