Desperation Morale does not claim to play products before reviewing them. Nor does it need to playtest them; playtesting is the publisher's job, done before the product is published.
While I agree that actually playing products before reviewing them would produce even deeper feedback, Pitman's experience allows him to identify obvious glitches that any seasoned player would notice. Mark's "specialty" in this regard is that he may be the only one systematically buying all that stuff. Furthermore, his reviews don't focus on the scenarios themselves but rather on elements that might give clues about their quality: printing errors, credits to playtesters (or the lack thereof), changes (or the lack thereof) in subsequent editions of a product whose earlier version's flaws were made apparent by people who actually did play it, etc.
Basically, this is the most one could expect when one man alone is doing the job. The tidbits of information Pitman provides on DM are valuable, and that is the very reason why many view his site as a valuable resource.
von Marwitz
No arguments with any of that, save my original point in the breakaway thread. If one of the criticisms a reviewer is making is that a product wasn't playtested - my contention is that there would be no way to know that unless you actually played it yourself. (This is barring an admission by the publisher that they didn't playtest it, which has happened from time to time.)
To reiterate, I think you are correct in what you are saying about the DM site, and I too find it for the most part valuable, if for nothing else than as a catalogue.
But Mark is the one asking about how to identify if something has been playtested or not. I don't think you can do that with a simple chronology or comparing to a list of release dates, for the reasons that Jim, Chris and others have elucidated very well earlier in the thread.
Put another way - if Mark wants to criticize the playtesting of a product, that's fair game, but he'll need to put the work in to do it fairly. I don't doubt he can pick up "obvious glitches" (as you put it) from a general scan of the scenario card, and probably better than most of us. I do think, however, you can't speak to thorough testing of something unless you submit it to some actual scrutiny yourself.
That raises the question of just how you would determine whether something has been playtested. What does that look like? Spelling errors or glitches on the scenario card? Those would point more to a lack of editing and proofing, though they could also indicate the product passed through very few hands during playtesting.
Wouldn't "lack of playtesting" only reveal itself in things like an attacker not having enough turns to complete the VC, or lopsided win-loss results after multiple playings? Isn't that exactly what playtesting is supposed to prevent? And how would you identify that kind of evidence with a simple scan of the scenario card? If it were that easy, I have to believe there would be no reason to ever complain about poor playtesting.