I've played through the entire game on my desktop, running XP and DX9, of course - on medium details. My desktop has only 768 MB of RAM, so I didn't experiment with anything above medium, except once, when I wanted to see how the game would look with everything maxed out. The result was...well, a horribly long load time and swap lock-ups. After a few minutes, once everything was in place, I took a look at the environment - nice. Back to medium settings.
Yesterday I installed Crysis on my notebook (Pentium Dual-Core 1.6 GHz, 1.5 GB of RAM, AMD Radeon 2600 VGA, Failsta, as mentioned above) - since it runs UT3 at playable framerates on high settings, I figured I could give it a shot. It runs nicely at medium-high settings, so I again pumped everything up to the max...not playable, but I personally didn't notice any visual difference, at least as far as I remember the max settings on XP. It looked the same to me - nice shadows, crisp textures and HDR up the ass.
Maybe there's some slight difference, and I might notice it if I looked closer, but I don't see that big a difference between DX9 and DX10. I remember when I bought my Radeon 9550 more than 3 years ago and fired up HL2 - now that was a visible difference compared to DX8 on my old GeForce 3.