Again with the claim that being partially deaf somehow makes you a better listener, which I've just got to call b.s. on.
It makes one a different listener, for sure, and it can mean that sensations others would never notice become obvious. So while "better" might not be the right word, it is entirely possible that someone with impaired hearing could hear an effect inside their sensation range that a person with no impairment would not be able to distinguish.
In particular, a shift in level from below sensation level (threshold) to above it is very noticeable.
A small change in loudness due to a small change in intensity (for the normal-hearing person) may be completely indistinguishable.
So that is a case where the person with injured hearing would notice something that somebody with Stevens-standard loudness perception would not.
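To make that asymmetry concrete, here is a crude sketch in Python. The 0.3 exponent is the commonly quoted Stevens power-law value for loudness versus intensity, the `loudness` helper is my own throwaway proxy, and the threshold and level numbers are made up purely for illustration:

```python
# Crude sketch: why a small level change can be dramatic near threshold
# but negligible well above it.

def loudness(spl_db, threshold_db=0.0, exponent=0.3):
    """Very rough loudness proxy: zero below threshold, power law above."""
    if spl_db <= threshold_db:
        return 0.0
    intensity = 10 ** (spl_db / 10)   # relative intensity
    return intensity ** exponent      # Stevens-style growth

# Normal-hearing listener: a 0.5 dB step well above threshold is only a
# ~3.5% change in this loudness proxy, which may well be indistinguishable.
print(loudness(40.5) / loudness(40.0))       # ~1.035

# Listener with an elevated (40 dB) threshold: the same small step happens
# to cross threshold, going from "nothing" to "something".
print(loudness(39.8, threshold_db=40.0))     # 0.0  (inaudible)
print(loudness(40.3, threshold_db=40.0))     # > 0  (suddenly audible)
```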
Lest you wonder, this has tripped up codec designers a few times, myself included.
Being able to hear better would make you a better listener. Could you, or someone you think can do it, tell the difference between the previously mentioned lamp cord and the stuff you sold in a listening test? The woo is in the claim that people can tell the difference in quality when, logically, any such difference shouldn't be able to be heard by the human ear.
Well, you want to level match a time-proximate DBT to under 0.1 dB. That is a level change (in amplitude) of a factor of about 1.0115 or so.
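For the record, the arithmetic behind that figure (just a quick check, nothing specific to any particular test rig):

```python
# 0.1 dB expressed as an amplitude (voltage/pressure) ratio:
ratio = 10 ** (0.1 / 20)
print(ratio)   # ~1.0116, i.e. the ~1.0115 figure above
```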
This could arise from cable resistance against even a purely "resistive" load of 8 ohms, without any silliness of cable length. In other words, a difference of about 0.1 ohm in cable resistance could, just barely, create an issue.
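And a quick sanity check on that, assuming an ideal voltage source and a purely resistive 8 ohm load (real amplifiers and speakers are messier than this), with `level_drop_db` being my own throwaway helper:

```python
import math

def level_drop_db(r_cable, r_load=8.0):
    """Level at the load relative to a zero-resistance cable, in dB."""
    return 20 * math.log10(r_load / (r_load + r_cable))

# Two cables whose series resistance differs by about 0.1 ohm:
delta = level_drop_db(0.0) - level_drop_db(0.1)
print(delta)   # ~0.108 dB -- right at the matching limit mentioned above
```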
Bear in mind that such teensy intensity differences are never heard as loudness; rather, they are perceived as "quality" or "depth" or any number of other non-loudness sensations.