The question is, could this be applied in a radio environment? B101 in Philadelphia is very good at testing radio creative and makes it part of its process. I think it is wise to determine how effective the advertising can be, and B101 is very careful about what it puts on the air. And it works. KINK in Portland has a similar litmus test for determining whether or not radio creative is "KINK-worthy." 94/7 Alternative Portland does the same. It's vital to protect a radio brand, and if something sullies it, it shouldn't get on the air. Substandard work should at least be fixed to reflect the sensibility of the audience.
Logic dictates that you would want to target as precisely as you can. However, that isn't always the case. Kevin Roberts made that point in a Radiocreativeland interview. He mentioned that radio-station programming and content target him very well, but the advertising sometimes doesn't.
I generally look at a radio station as a circle. A station has programming, DJs or air personalities, imaging (the promos and other liners between songs) and advertising. For the most part, the program director, music director and talent have a great deal of control. Once the "stop set" starts (that's an industry term for "commercial break"), they can sometimes lose that control and the circle can be broken. The good news is that the vast majority of listeners stick around for the advertising, but that number can always improve.
One of the great things about radio is that it is faster to market and easier to make changes in than most other media. But I think that testing, even if it's rudimentary, is crucial as the industry evolves and changes. If a spot or campaign is really strong, it should be rewarded with a lower rate or bonus spots. If it's garbage, the advertiser should pay a premium for stinking up the station.
Now, if I could just figure out how to monitor all of this, I'd be on to something.