ATI All-In-Wonder X1900 PCIe Review

An anonymous reader writes "ViperLair is currently running a closer look at ATI's newly released All-In-Wonder X1900 PCIe graphics card. The clock speeds and memory are pretty comparable to other cards available, but the reviewer warns that 'clock speeds do not always tell the whole story.' The review tests performance in Doom 3, UT 2004, Far Cry, Half-Life 2, and the 3DMark06 benchmarking tool." This release comes relatively quickly after the X1800 series, which was released just last October.
  • by Anonymous Coward on Saturday February 04, 2006 @04:36PM (#14642990)
    My next video card won't have any TV capture abilities. I got an MDP-130 HD capture card, and Comcast is now doing Analog Digital Simulcast (clear QAM) in my area, which means I can do straight digital captures of most major TV stations.
    • After setting up a friend's (then new) 9700 All-In-Wonder, I'd have to agree. It did decently as a video card, but trying to get the video capture and TV tuning parts working with his cable was simply a pain, and even when it was working properly, the quality was rather subpar compared to the 15-year-old TV he had. A standalone tuner/capture card would probably have been the better choice, and cheaper!
      • I'd second the cheaper part, too. I've had a tuner card that has seen me through three full systems and about seven video card upgrades and is still working great. In comparison, the 2MB video card, 64MB of memory, 56k modem, 4X CD-ROM, etc. that shared space with it when it was new have long since hit the dustbin.
  • $500 (Score:1, Flamebait)

    by Eightyford ( 893696 )
    Wow, and for the low price of only $500! Who actually buys these cards at this price? WoW only needs a GeForce 2, and those can be had for a little over 25 bucks.
    • Re:$500 (Score:4, Insightful)

      by LurkerXXX ( 667952 ) on Saturday February 04, 2006 @04:49PM (#14643032)
      People with more disposable income than you, or people for whom gaming is a higher priority than it is for you.
      • Re:$500 (Score:3, Informative)

        by d474 ( 695126 )
        "People with more disposable income than you, or people who have gaming as a higher priority in their life compared to other things than it is in yours."
        IOW, guys without girlfriends.
    • But you can watch TV AND Game!

      Well... if you have a dual-screen setup. ;-)
    • Have you ever had the thought that maybe, just maybe, some people play games other than World of Warcraft? And that some people don't like role-playing games at all? And that some people even prefer single-player games?
      • Have you ever had the thought that maybe, just maybe, some people play games other than World of Warcraft? And that some people don't like role-playing games at all? And that some people even prefer single-player games?

        I just picked a fairly new game to illustrate my point. I don't think it matters if it's a single-player or multiplayer game...
        • Re:$500 (Score:1, Offtopic)

          You picked a game over a year old that wasn't a good judge of current hardware requirements WHEN IT CAME OUT. I'm also sure you didn't actually attempt playing WoW on a GeForce 2. My 2.8 GHz machine with an ATI Radeon 9700 Pro had trouble with it. If you want to look at 'current hardware requirements,' look at F.E.A.R. or Battlefield 2 or heck, even Quake 4. That being said, I just bought myself a GeForce 7800 GT for $359 and got a free motherboard with it. Good deal for me.

          But if you want to know who the crazies are, it is possible to buy two DUAL GeForce something-or-other cards for $800 each. So $1600 for the 'graphics card' alone.
          • But if you want to know who the crazies are, it is possible to buy two DUAL GeForce something-or-other cards for $800 each. So $1600 for the 'graphics card' alone.

            That's pretty crazy! Doom 3 was probably the last hardcore 3D game I've played, but wouldn't the Xbox 360 provide a better gaming experience for a lot less money?
            • Well, that would depend on whether you're a diehard PC fan (or don't play anything but RTS/FPS games) or just someone who wants the best fun per dollar spent. Personally, I hate RTS and I'm over most FPS games (how many times can you play the same storyless "shoot the bad guy, yuo teh win" type of game without getting tired of it?). Console gaming saves me cash AND cuts the last need for Windows on my PC.
        • Well, Deus Ex 2 is (with the updated textures) absolutely unplayable on a GeForce 3, and that game is old.
  • by Suddenly_Dead ( 656421 ) on Saturday February 04, 2006 @04:43PM (#14643011)
    The review didn't really test much that stressed the video card beyond Doom 3. A look at Half-Life 2: Lost Coast and/or some other HDR game(s) would have been far more useful than testing Unreal Tournament 2004, which the review admitted was more of a CPU bottleneck than anything else. They didn't do any overclocking tests either, and the image quality tests are a little dubious.

    • Here is a list I compiled by checking out many different benchmarks. In general, the faster cards are on top, the slower ones below. Since I am concentrating on affordable cards, I haven't placed many expensive cards above the Nvidia 6600 GT and Radeon X1600 XT, so there are many high-end ones available now that are not on this list. If you see a few cards back-to-back with an equals sign (=) in front, that means they are very similar in performance to the ones next to them that also have the "=" sign.

      N/A = dis
    • Yeah, the games list is Doom 3, UT 2004, Far Cry, Half-Life 2, and the 3DMark06 benchmarking tool. Whatever happened to trying Battlefield 2 (a hugely detailed game when you scale things up), F.E.A.R. (reputedly tough on video hardware), or something like the latest Age of Empires game?

      Half-Life 2 has been out for a year; there are tougher tests for a video card, like the Lost Coast expansion pack.
  • Another ATI (Score:1, Flamebait)

    by canuck57 ( 662392 )
    Yawn... Why would someone in this day and age pay this much for a video card?
  • About 10,000 geeks worldwide just became 700 dollars poorer.
  • My GeForce 6600 is dog slow on my Athlon XP 2400+ for Doom 3 and EverQuest 2.

    It's like nothing is fast enough. After reading about the trillion or so polygons for Unreal 3 or whatever it's going to be called, I need a new card. The graphics are stunning [unrealtechnology.com], and I wonder if even the X1900 will be able to handle it.
  • While the performance of the card does take a nice step forward over the X1800, it's not really much to spend the extra $$ over. I'm still waiting for the "next generation" graphics card. Something that really takes a large step forward. Still, it really comes down to the application developers and how they design the programs. Most can't use the full capabilities of the card, so we're still left in the dark.

    - Adam
  • I am waiting on my PVR system. Mainly I am waiting for backend software that supports saving in a format I can play on any of my computers and which can be controlled from a thin client like the Windows Media Connector ones. It would also have to have the standard features: support for various cards, HD capability, widescreen, etc., etc. I don't care about the OS as long as the features are there. Unfortunately, it seems that only Windows Media Center really does what I want and it, unfortunately, d
  • Jumping from 16 to 48 pixel pipelines (X1800 to X1900), one would expect a much better frame rate. But the nice thing is, this puts ATI back up with the Nvidia GTX series.

    Very nice card; the price is steep, but nice.
    • by be-fan ( 61476 ) on Saturday February 04, 2006 @06:54PM (#14643420)
      They didn't jump from 16 to 48 pixel pipelines. The X1000 cards have a fairly non-traditional architecture. Instead of having a fixed set of pixel pipelines with fixed resources, they have a large shader array running a number of rendering threads, with ALUs assigned to each thread as necessary. The X1900 increases the number of shader units from 16 to 48, but both the X1800 and X1900 have 16 texture units and 16 raster-op units. So both cards can do 16 texture lookups per clock and commit 16 pixels to memory per clock. Where the extra ALUs in the X1900 come in handy is with complex shaders, where the X1900 can do far more calculations per pixel than the X1800.
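
      To make that concrete, here's a rough back-of-the-envelope sketch in Python using the unit counts above. The one-line throughput model (each resource pool independently caps pixels per clock) is my own simplification for illustration, not ATI's actual dispatch logic: a texture-bound shader lands on the same pixel rate on both cards, while an ALU-heavy shader lets the X1900's extra shader units pull roughly 3x ahead.

# Toy throughput model (my simplification, not ATI's scheduler): a shader needs some
# number of ALU ops and texture fetches per pixel, and each resource pool
# (shader ALUs, texture units, raster ops) independently limits pixels per clock.
def pixels_per_clock(alu_units, tex_units, rop_units, alu_ops_per_pixel, tex_ops_per_pixel):
    limits = [rop_units]                              # both cards commit at most 16 pixels/clock
    if alu_ops_per_pixel:
        limits.append(alu_units / alu_ops_per_pixel)  # ALU-bound limit
    if tex_ops_per_pixel:
        limits.append(tex_units / tex_ops_per_pixel)  # texture-bound limit
    return min(limits)

# Unit counts from the comment above: 16 texture units and 16 ROPs on both cards,
# 16 shader ALUs on the X1800 vs. 48 on the X1900.
x1800 = dict(alu_units=16, tex_units=16, rop_units=16)
x1900 = dict(alu_units=48, tex_units=16, rop_units=16)

workloads = {
    "simple shader (1 ALU op, 1 texture fetch)": dict(alu_ops_per_pixel=1, tex_ops_per_pixel=1),
    "math-heavy shader (6 ALU ops, 1 texture fetch)": dict(alu_ops_per_pixel=6, tex_ops_per_pixel=1),
}
for name, w in workloads.items():
    print(f"{name}: X1800 {pixels_per_clock(**x1800, **w):.1f} px/clk, "
          f"X1900 {pixels_per_clock(**x1900, **w):.1f} px/clk")
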
      • Then the graph in the article is off.

        All-In-Wonder Comparison

                                     X1900     X1800 XL  2006      X800 XL   X800 XT
        PCI Express                  Yes       Yes       Yes       Yes       No
        Core Clock (MHz)             500       500       450       400       500
        Memory Clock (MHz)           480       500       400       490       500
        Vertex Pipelines             8         8         2         6         6
        Pixel Pipelines              48        16        4         16        16
        Microtune Tuner              IC 2121   IC 2121   IC 2121   IC 2121   MT2050
        Shader Model 3.0             Yes       Yes       Yes       No        No
        Avivo, H.264 Acceleration    Yes       Yes       No        No        No
    • That is, until Nvidia releases the GeForce 7900 GTX :)
  • http://www.xbitlabs.com/articles/video/display/gpu-consumption2006.html [xbitlabs.com]

    NVidia's cards used to be the ones that sucked the most watts and still weren't the best performers. Now it's ATI! Ugh... Fortunately NVidia's got the best Linux drivers. ;)
