VIA and NVIDIA Working Together For PC Design 93

Vigile writes "With AMD buying up ATI and Intel working on their own discrete graphics core, it makes sense for NVIDIA and VIA to partner. It might be surprising, though, that rather than the rumors of NVIDIA buying VIA coming true, the two companies instead agreed to 'partner' on creating a balanced PC design around VIA's Nano processor and NVIDIA's mid-range discrete graphics cards. During a press event in Taiwan, VIA showed Bioshock and Crysis running on the combined platform. They also took the time to introduce a revision to the mini-ITX standard, which Intel has adopted for Atom, that pushes an open hardware and software platform design rather than the ultra-controlled version that Intel is offering."
  • once more... (Score:1, Insightful)

    by Anonymous Coward
    the video game industry is the one pushing the development of computing!
    • Re: (Score:3, Insightful)

      by sayfawa ( 1099071 )
      the porn industry is the one pushing the development of computing!

      There, fixed.
      What, you haven't been watching a lot of CG porn recently? I'm not alone... am I? :)
      • What, you haven't been watching a lot of CG porn recently?
        Not yet. The broken physics still bothers me. I can feel it getting closer, though.

        Two years, if it keeps progressing as it has lately. Maybe less - surely the money will start ramping up soon.
      • Re: (Score:3, Interesting)

        by Smauler ( 915644 )

        The major problem with CGI pron is that Uncanny Valley [wikipedia.org] can take on a whole new meaning.

        • The major problem with CGI pron is that Uncanny Valley can take on a whole new meaning.
          If they stuck to modelling the physics of silicone instead of real flesh, that problem would be quickly solved.
      • by ady1 ( 873490 )
        the porn industry is the one pushing the development of the internet!

        There, fixed.
        And yeah, you are :P
      • the porn industry is the one pushing the development of computing!
        But has anyone ever found reliable data indicating the porn industry's revenues? I know everyone quotes billions, but I'd love to know just how powerful an industry it really is.
    • as opposed to Intel's corporate agenda that tanks gaming because "businesses" don't "need" it. Both AMD and NVIDIA/VIA will put more balanced machines out there. I personally can't get over the Apple/Intel Macbook fiasco... that a $1200 Macbook (with the fastest dual core processors out there) plays games like WoW like crap... an old iBook will play them better than a Macbook! But it's "cheaper" for Intel to hose its partners with the integrated graphics, and eventually the game industry will change how they mak
      • as opposed to Intel's corporate agenda that tanks gaming because "businesses" don't "need" it.

        Exactly, businesses do not need advanced 3D acceleration. They are businesses; the bottom line is king (increased hardware costs, lost productivity, etc.).

        Both AMD and Nvidia/VIA will put more balanced machines out there.

        That is incorrect, since none of those companies actually sells working "machines". Instead they develop and sell components, which (aside from VIA's) are generally comparable with one another in terms of price and performance.

        that a $1200 Macbook (with the fastest dual core processors out there) play games like WoW like crap

        Well that's a given. Proper 3d acceleration is a must for any modern game. It should be noted my girlfriend has pr

  • by clowds ( 954575 ) on Saturday June 07, 2008 @08:23AM (#23692649)
    It would be grand to be able to buy a low watt, small box gaming machine that doesn't require 6 fans to keep it cool.

    However, with the way things are at the moment in the PC gamespace, I'd be pretty cautious about expecting any decent performance, even with their Crysis and Bioshock demos.

    I do miss the days when games had 128 multiplayer maps, ran well on cheap $200 video cards, and had more story rather than the shinies, but I guess that's progress for you. *sigh*
    • Man, I've never paid $200 for a video card and I've played most of the big games that have come along.

      I wouldn't be so sure nVidia and VIA aren't on to something here.
      • $250 for mine, and I can power a small submarine using my DVI connector.
        • $200 for mine, an ATI model... ran the auto-overclock feature in the drivers... it reported back 92°C. It's nice to be able to boil my ramen noodles while waiting for the next level to load.
      • I've never spent a single cent on a graphics card. I got my 8800 the old fashioned way -- trade/barter. And don't even get me started on soundcards -- TF2 FTW! All things considered, I think VGXPO last year was a very profitable experience for me.... Anyway, on the subject of older machines, I have a Dell Latitude D600 (2 GHz Dothan, 1GB, 160GB, 32MB ATi MR 9000), and the only part that really holds it back from running just about all games released up to about the middle of last year is the video card.
      • by ejecta ( 1167015 )
        After buying my first computer, I've never paid more than $40 for a video card, and that's AUD$! But then I am still happy playing Total Annihilation across the network with the Total Annihilation Works Project Mod.

        Shiny graphics are a waste - gameplay is king imho.
      • by Sj0 ( 472011 )
        Man, I've never paid $200 for a video card and I've played most of the big games that have come along.

        Seconded. I've always been able to play the latest games by taking advantage of the fact that you don't need the latest and greatest. A $100 budget video card will give you all the features you need, at a fraction of what the same performance cost a year or two earlier.
    • It would be grand to be able to buy a low watt, small box gaming machine that doesn't require 6 fans to keep it cool. However, with the way things are at the moment in the pc gamespace, I'd be pretty cautious expecting any decent performance, even with their Crysis and Bioshock demoes.

      Who cares? The latest games are always written to 'barely' run on top-of-the-line consumer PCs. But who needs a box like that? For myself, I normally put together a box that is a year or so behind the latest tech, if not more. That way, you can run almost anything you want, at a fraction of what it would cost you to buy the latest & greatest. Right now I've got a box that is 2-3 years old, but it still meets minimum specs for Crysis and Bioshock. I don't expect either of those to run smoothly, but I

      • by maxume ( 22995 )
        I'm pretty sure that anybody who has *anything* they would rather do with their money should be following your strategy. And I mean stuff like "Buy a 45-year supply of non-perishable food", "Get that pony you've always wanted" and other such nonsense.

        I'm glad somebody is buying cutting edge computer systems and driving the technology forward, but not as glad as I am that it isn't me.
        • Re: (Score:3, Interesting)

          by Smauler ( 915644 )

          Whenever I build a new PC for myself, I buy just below the best. My previous PC lasted me 5 years or so, with a Ti4200 (a powerhouse of a card, which will still outperform some modern 256MB cards). I'm hoping I can last as long with my current 8800GT, though I'm guessing it may be more like 4 years before I replace it.

          That's the thing about buying a decent spec PC - if you buy well, you will not have to replace it for years, and you'll be able to play just about anything because everyone else is buying g

          • *nods* I'm much the same as you are in that respect... but I've recently found my desires have changed as far as computing goes. Basically... there's exactly one computer game that I want to play any more, and its requirements are not even close to stunning. All it asks for, at a minimum, is an Intel 945 chipset, nVidia GeForce 6600, or ATI Radeon 9500 or better, coupled with 512MB of RAM, and an 800MHz P3 or Athlon. Show me *any* new computer that you can buy today which doesn't meet those specs. And it ru
          • a ti4200 (a powerhouse of a card, which will still outperform some modern 256mb cards)

            If and only if the games in question don't use fancy shaders and whatnot that the 4200 doesn't have hardware support for, that is.

            • Shaders? Z-buffers? Mip mapping? Back in my day we had to walk from system memory to frame buffer uphill and in the snow both ways!
            • by Smauler ( 915644 )

              I know; that's one of the main reasons I got a new PC. That said, however, the Ti4200 would run games like Doom 3 [digital-daily.com] (un?)happily, though I never got it. I wasn't going to upgrade just the graphics card because it was AGP, and the entire system was crap by that point anyway.

    • I do miss the days when games had 128 multiplayer maps, ran on cheap $200 video cards well and had more story

      I run The Orange Box games and Oblivion on a ~$100 GeForce 8600 GT just fine at high settings and 1680x1050 resolution (although without antialiasing); what's the problem?

  • by aceofspades1217 ( 1267996 ) <aceofspades1217 AT gmail DOT com> on Saturday June 07, 2008 @08:26AM (#23692657) Homepage Journal
    Competition can't hurt. Now that we have Intel, AMD/ATI, and NVIDIA/VIA all throwing their hats in the ring, it will keep prices down and of course spur innovation, considering it's a race to find the best technology. Personally I would like to see Intel knocked off its high horse, considering it delayed all its 45nm production just so it could sell off its older chips. Of course, it was able to do that because AMD is so far behind in the 45nm race.

    So, great: hopefully we will see some real progress and affordable laptops with OK power. Right now most normal laptops have integrated chips (you can't really fit a video card into a normal laptop), and of course the integrated card is horrible. The integrated card (at least in my laptop) also sucks up power, cutting my battery life to a third, and it overheats.

    So yea it would be great if we could have decent video processing on normal mass market laptops.

    Good Luck and may the best chip win!
    • Was anyone saying that competition would hurt?
      • No...but I just said it wouldn't :P

        Considering mine was the first post, I'm pretty sure no one else said that.

        I am just saying that any competition is good. Heck if Microsoft wanted to make a chip that would be cool :P All I'm saying is the more the marrier. Now if only we had competition between our telecoms.
        • No...but I just said it wouldn't :P

          Considering mine was the first post, I'm pretty sure no one else said that.

          I am just saying that any competition is good. Heck if Microsoft wanted to make a chip that would be cool :P All I'm saying is the more the marrier. Now if only we had competition between our telecoms.
          Microsoft?

          I wouldn't marrier!
          • I was making a point. I mean, seriously, I would like it if any company started making chips, even the horrible, despicable, and evil Microsoft :P
    • I don't know about you all, but I'm not sure three entities making all the processing hardware is enough.

      Whenever I see these "strategic partnerships", which basically mean "mergers so the DOJ won't notice", I think about what's happened to the airlines and the oil companies (oh, and telecom). Going in different directions, they are, but consumers are getting screwed all around when these big outfits team up.
      • Re: (Score:2, Interesting)

        I wouldn't really call this a merger. VIA is a processor maker and NVIDIA is a GPU maker; obviously NVIDIA's all-in-one chip wouldn't be viable on its own. VIA makes much cheaper, lower-power chips, so for these purposes they have a leg up on Intel. The whole point is that you don't really need (or want, for that matter) a fast monster processor for a smartphone, but there could be obvious benefits to having a kickass video card... playing video. The same also applies to ultraportable laptops. You're not
      • by Kjella ( 173770 )

        I don't know about you all, but I'm not sure three entities making all the processing hardware is enough.

        Well, what would you like to do about it? It's not for lack of innovation; the problem is rather that the tick-tock cycles going on cost billions and billions and billions. For one, smaller companies couldn't afford them, and even a company big enough would have to pass the cost on to consumers as substantially increased prices. The only other option I see is compulsory licensing, which would bring a host of problems of its own, as companies would lose their biggest incentive to improve the hardware.

        Besides,

        • the problem is rather that the tick-tocks going on cost billions and billions and billions.

          So, you think that there are only three corporate entities in the world that can put together "billions and billions"?

          The problem is that even if there were EIGHT companies making processing hardware today, by next Friday there would be three again because our system rewards consolidation at the expense of innovation and certainly against the best interest of the consumer.

          The system is rigged in favor of the corporat

    • by renoX ( 11677 )
      >Competition can't hurt.

      For users, no, but AMD hasn't been doing well for a long time, and investors don't like to lose money, so who knows how long they'll keep competing with Intel?

      • Yes, AMD needs to get their shit in gear. They are getting their ass kicked by Intel (and rightfully so). Their chips are still in the dark ages; they are only just now making quad cores and, more importantly, 45nm processors. I pity the fool who buys a 300 dollar 65nm processor.

        Now if NVIDIA and VIA get in the game and supplant AMD, or at least take a small niche, it will keep Intel on their toes... without competition, Intel could become like Microsoft: they could artificially set their prices.
        • Their chips are still in the dark ages and they are just barely making quad cores and more importantly 45nm processors.

          Their chips aren't Dark Age at all. There are a lot of clever tricks inside, the main one being moving the north bridge functions (the memory controller) into the CPU itself and using an open standard to communicate with the rest of the world (HyperTransport), in addition to other technical feats such as split power planes, *real* quad cores, and an unganged-mode bus.

          Intel is only catching up with that now (with their future QuickPath technology). And it's hardly an *open* standard, given the fights wit

    • by Kjella ( 173770 )

      Personally I would like to see Intel taken off it's high considering it delayed all their 45nm production just so they could sell out their older chips.

      Umm, what? Yes, they've kept their prices high and 45nm only on the high end to get rid of 65nm stock, but Intel would like nothing more than to switch to 45nm as fast as possible. The chips get considerably smaller and thus cheaper to produce, which translates directly into higher margins. Plus they get all the premium of being alone in the high-end market, another good reason to keep it stocked. I'm sure there's a lot you can blame Intel for, but I don't think this is one of them...
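      A back-of-envelope sketch of that margin point (all numbers hypothetical, ignoring yield and wafer-edge effects): die area scales roughly with the square of the linear feature-size ratio, so a 65nm-to-45nm shrink roughly halves the die and roughly doubles the dies per wafer.

```python
# Back-of-envelope die-shrink economics (hypothetical numbers).
die_area_65nm = 143.0        # mm^2, assumed die size at 65nm
shrink_ratio = 45.0 / 65.0   # linear feature-size ratio
die_area_45nm = die_area_65nm * shrink_ratio ** 2  # area scales as ratio^2

wafer_area = 3.14159 * (300 / 2) ** 2  # 300mm wafer, edge loss ignored
dies_65nm = wafer_area // die_area_65nm
dies_45nm = wafer_area // die_area_45nm

print(round(die_area_45nm, 1))          # ~68.5 mm^2: about half the area
print(round(dies_45nm / dies_65nm, 2))  # roughly 2x as many dies per wafer
```

      The assumed 143 mm^2 die and perfect scaling are illustrative only; real shrinks scale imperfectly, but the direction of the economics holds.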

      • Well obviously they aren't making the 65nm ones (for good reason) but they wouldn't have done that to begin with if AMD had released its 45nm chips.

        But that's not the point. I have Intel stuff; I love Intel. They compete hard, but I feel that now that AMD has slipped out of the consumer market, Intel is not getting as much competition. Without competition, Intel can sell their chips at whatever price they feel like. I just think someone needs to keep Intel on their toes.

        And I'm not saying anything negative agai
      • The 45nm chips are not only on the super-high-end. Sure, you don't exactly see 45nm Celerons yet, but a Core 2 Duo E7200 can be had for $130, and that's 45nm. I'm sure that before long, most if not all Intel chips will be of the 45nm variety.
    • The problem is:

      - Intel has always been a strong pusher for open source (see their graphics drivers as an example)

      - AMD has too (AMD64 Linux released before the actual processor, thanks to massive help from them - and thanks to Transmeta's code-morphing to help test before the chips come to life).
      And since they acquired ATI, Radeons have seen a lot of open-source effort (before the acquisition it was mostly reverse engineering; now AMD is slowly releasing the necessary documentation so open source drivers can be wri
      • If nVidia buys VIA maybe there's even a chance that they choke VIA's previous open source effort.

        I don't think that is likely. It would be more likely to go the other way.

        My information is not up to date, but I did some reading up on VIA a few years ago. They were then owned by a larger conglomerate that makes a lot of interlinking things. Other branches of the conglomerate make motherboards and the material used to encase CPUs and other integrated circuits, just to name two that I recall

    • In this corner: AMD/ATI with the next generation of integrated video with their first pass at pairing up the CPU/GPU and claiming a 3x speedup (over previous integrated graphics solutions, most likely). Their second pass, if they stick the two onto a single die, should be much more interesting, as it should cost even less, use less power, and, if they do enough of a redesign should actually go even faster. I'm keeping my fingers crossed that this combination will result in a fully open system.

      In the next
  • by crhylove ( 205956 ) <rhy@leperkhanz.com> on Saturday June 07, 2008 @08:29AM (#23692667) Homepage Journal
    You got my attention! Now price and availability? Is it multicore? Can I do this:

    http://tech.slashdot.org/article.pl?sid=08/05/31/1633214&from=rss [slashdot.org]

    ?

    Price and availability? Can we have a laptop that has a draw-on, Wacom-style touch screen where the keyboard is? I still want a keyboard, though, but I guess I could use a docking station with a monitor. Can I get it Eee-sized and Eee-cheap and put 6 of 'em together in a custom Beowulf cluster that grows and shrinks as the various laptops enter and exit the house over WiFi N, or WiMAX, or whatever?
    • by naz404 ( 1282810 )

      Actually, I hate the way processors are getting faster and faster and hungrier and hungrier.

      I mean, all this speed is sheer abuse! Back then I was perfectly fine with 233MHz + 128MB RAM with Win98 and Diablo II!

      All this speed is just a waste of battery life!

      If they can give me a PIII-500 equivalent processor + 256MB ram + 2GB hard drive + 640x480 + ubuntu lite + 10hours battery life at $100-$200, that would be totally teh sweet!

      Retrogames and emulators on Linux for TEH WIN!

      • Why would you want to settle for 640x480? I like the rest of the idea, but honestly, 640x480 wasn't even fun back in the days of Win95.

        Boost it to 1600x1200 or something like that, and it'd be a lot more comfortable to work with.
        • except smaller than the EEE (but still with a true keyboard). Then toss in DOSBox [dosbox.com], SCUMMVM [scummvm.org], ZSNES [zsnes.com],

          mix in a little WiFi capability for leeching off hotspots, and you now have a true hacker toy that you can lug anywhere!

          PDAs and Smartphones just don't cut it. They suck for doing stuff like coding and compiling your own programs.

          • Even on an EEE-size screen, I'd still want more pixels. It's 2008, for crying out loud; why can't we finally get 300DPI?!

      • Sounds like you just described the XO-1. [wikipedia.org]
      • Yeah, let's stop technological advances right now!
      • Every computer *I'VE* built for myself has been better than the last. My apologies if that is not the same result for everyone. I'm totally for cheap over options though, and if I could get an under $100 n64 emulator, with those new hi res textures and then play mario kart.... that would be worth $100 alone. But not $600, like I was offering for the supercomputer cluster of eee style notebooks that could also be 6 death match stations and 6 race cars. Oh yeah, and all work as perfect p2p skype video ph
      • by maxume ( 22995 )
        A low voltage core solo uses about 5 watts. At that point, the screen and other devices are going to factor much more heavily into battery consumption. The lower power core duos are around 15 watts, which still isn't going to destroy battery life.

        So get a core solo and don't worry about the 4 watts that they might be able to save by gutting performance.
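        As a rough sketch of that budget (all figures hypothetical): with the screen and the rest of the system drawing on the order of 13W, shaving the CPU from 5W to 1W gains well under an hour of runtime.

```python
# Back-of-envelope laptop runtime estimate (hypothetical figures):
# once the CPU draws only a few watts, the screen and other components
# dominate the power budget, so gutting CPU performance buys little.
battery_wh = 50.0  # assumed battery capacity, watt-hours

def runtime_hours(cpu_w, screen_w=6.0, other_w=7.0):
    """Estimated runtime in hours given component power draws in watts."""
    return battery_wh / (cpu_w + screen_w + other_w)

solo = runtime_hours(cpu_w=5.0)    # low-voltage single-core CPU
gutted = runtime_hours(cpu_w=1.0)  # hypothetical ultra-low-power CPU

print(round(solo, 2))    # ~2.78 hours
print(round(gutted, 2))  # ~3.57 hours: under an hour gained
```

        The component wattages here are assumptions for illustration, not measurements of any specific laptop.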
      • Re: (Score:3, Informative)

        by Smauler ( 915644 )

        Actually, I hate the way processors are getting faster and faster and hungrier and hungrier.

        Umm... I don't know if you've been following the processor market recently, but they're not. A lot of the recent advances have been about lower voltages, lower power consumption, and more cores. Pure processor power increase has most definitely not been a feature of the recent processor market, at least not compared to the past. I mean, AMD released over-2GHz processors 6 years ago....

      • Nowadays CPUs are becoming less important and video cards more important. Even the best games don't really need much CPU; other than physics, the CPU matters less and less, and it is now secondary in high-end gaming PCs. You won't see much of any difference between a $1,000 quad core and a $200 45nm dual core in gaming, although you will see a huge difference between an 8600 GT and an 8800 GTS.
    • Less... Caffeine... Please...

      Can I get it like eee size and eee cheap and put 6 of 'em together in a custom beowulf cluster that grows and shrinks as the various laptops enter and exit the house over wifi N, or wimax, or whatever?
      WTF would you do with a beowolf cluster of mini laptops on wireless? Folding@home that important to you?
      • Umm... (Score:4, Funny)

        by FurtiveGlancer ( 1274746 ) <AdHocTechGuy@@@aol...com> on Saturday June 07, 2008 @08:42AM (#23692715) Journal

        WTF would you do with a beowolf cluster of mini laptops on wireless?
        Leak information like a sieve?
      • by ozamosi ( 615254 )
        Crack WEP/WPA
      • Ever hear of Smart Dust? [wikipedia.org]

        That's almost exactly what they're trying to achieve.

      • Re: (Score:3, Insightful)

        by crhylove ( 205956 )
        I was thinking POV ray Quake III, but sure, like good causes or whatever! Maybe install the Folding@home screensaver by default on all machines while I'm not playing POV ray Urban Terror. Thanks in advance, and if you can make 'em for $100 each, I'll take six advance orders.

        PS Oh, can you install regular Urban Terror on each machine, too, so I have a 6 chair death match out of the box? Double Thanks in Advance. Might as well put a racing game on there too and bundle each with a dual analog stick. Thank
      • WTF would you do with a beowolf cluster of mini laptops on wireless?

        With a beowulf cluster, where you have to rework your applications, or operate them on a job-submission basis at which point you might as well just use DQS? Nothing.

        However, with a single-system-image cluster like MOSIX, and with binary compatibility across architectures, you could use the system to do all kinds of things transparently.

        However again, my understanding is that OpenMOSIX is more or less over (I would love to be corrected.) And even if it isn't you need everything to be the same architectur

      Can we have a laptop that has a draw-on, Wacom-style touch screen where the keyboard is?

      Wouldn't it make more sense to put the Wacom digitizer where the screen is?

      Also, I second the motion for a cheap, small tablet PC!

      • It seems like the GP is either ignorant of the existence of the tablet PC, or just wants a giant PDA (a tablet PC with no keyboard.) Maybe they want the clamshell form factor - personally, I'd want two screens in that case, but almost every use of two screens has turned out to be more trouble than it's worth. Me, I just want a giant PDA with some sort of optically-based multitouch and an easily replaceable ~1/4" thick sheet of Lexan over the top of the screen so that I can beat it up and then resurface it. (The on
        • It seems like the GP is either ignorant of the existence of the tablet PC, or just wants a giant PDA (tablet PC, no keyboard.)

          I own a tablet PC, and no, I don't think he wants one in its current form, because no tablet PC is small or cheap. They're all expensive, and even the ones with small screens are absurdly thick and heavy. No, what he's asking for -- and what I'd want, too -- would be a convertible tablet (i.e., including keyboard) with about a 6" by 8" 1024x768 screen, <= 2lbs, <= $500, <=

          • I own a tablet PC, and no, I don't think he wants one in its current form, because no tablet PC is small or cheap.

            But then what he wants is a "lighter, cheaper tablet PC" :P

            I think the hinge is just a problem. If I need a keyboard I can carry a snazzy folding one in my pocket - I have big pockets. I already have a serial one for handspring visor that I intend to hack for use with everything else (well, that has rs232 anyway. they have bluetooth ones these days.) I want the ultimate durability and that would ideally mean something sealed in plastic. The battery would be external and would connect to the only electric

            • I think the hinge is just a problem. If I need a keyboard I can carry a snazzy folding one in my pocket - I have big pockets.

              The problem with that is that it becomes harder to use it as a laptop: what do you prop up the screen with?

              I think Linux is fine for a pen-based system, at least potentially.

              "Potential" solutions are useless. I've actually tried using Linux on my tablet. The digitizer itself works great; that's not the problem. The problem is that there are no apps! There is no open-source continuo

  • *Swoons* (Score:2, Interesting)

    by hyperz69 ( 1226464 )
    I am a big fan of the Nano. I think it has potential to be huge if it lives up to half of its claims. I always cried thinking I would need to use the Chroma crap that VIA integrates. NVIDIA graphics on a Nano platform. Tiny little gaming boxes and notebooks. Dear lord, it's nerd heaven! Media centers for the poor! I am buzzing with glee just thinking of the possibilities. KUDOS!
  • Will NVIDIA make chipsets for VIA, since VIA's own ones suck?
  • Might there be a chance we'll finally get open source drivers from NVIDIA, now that they're teaming up with VIA?
      Might there be a chance we'll finally get open source drivers from NVIDIA, now that they're teaming up with VIA?

      You're kidding, right?

      So far VIA has been terrible with Linux. I've had much better experiences with NVIDIA, even with their closed-source drivers. This talk about VIA being open in comparison to Intel is funny, considering Intel has provided open-source Linux drivers for their hardware for years.

      • by dfries ( 466073 )

        This talk about VIA being open in comparison to Intel is funny, considering Intel has provided open-source Linux drivers for their hardware for years.

        Have they? I always thought it was their publicly available hardware specifications that allowed anyone to write drivers. It's just when they got into the graphics chip business that they lacked documentation and had to provide drivers themselves. Now Intel is releasing the source to the graphics drivers and providing the graphics chipset documentation, a great

  • One thing that has puzzled me is why more companies aren't just copying the OLPC design, maybe enhancing the processor, memory, etc., and selling it for, say, $200-$300. I don't think the OLPC foundation would say no to selling or sharing their design. Not that their design is some top secret anyway.
    • What puzzles me is why you are puzzled about something not happening when it actually is? Take a look at the Asus Eee PC and the other competing laptops starting to come out.
      • The best feature of the XO is the Marvell-based WiFi (which currently does not have Free or even Open firmware) which implements a Mesh AP on a chip. This allows the device to operate as a repeater while draining the absolute minimum power. The only way we are going to achieve independence from the greedy corporations trying to milk the internet for every penny is if we make them irrelevant. The way to do that is to build an alternative internet. And the only cost-effective way to do that at the moment is w

  • He used and spelled "discrete" correctly.

    wow.

    He used and spelled "discrete" correctly.

    wow.

    ummm... the story? yeah, sure.

    Whatever.

    He used and spelled "discrete" correctly.
  • by Glasswire ( 302197 ) on Saturday June 07, 2008 @12:51PM (#23694093) Homepage
    Hmmm, the old Mini-ITX format had multiple vendors (VIA, Intel, others) using it, and right now the only vendor using Mini-ITX 2.0 is VIA/NVIDIA. How is this more open? And in what sense was Intel making the old standard less open, other than jumping into that market and doing well?

    BTW, I have to laugh at the sight of a Mini-ITX board with a relatively low-power VIA CPU having a huge, power-sucking NVIDIA discrete GPU board on it. Surely anybody who cares about graphics performance is not using this category of board. Logically, NVIDIA would do an integrated graphics chipset for the Mini-ITX format, but a PCI-Express external card that quadruples the chassis height (and probably quadruples the power consumption of the board) is a joke. Ask embedded systems developers (still the main market for Mini-ITX systems) if this is really what they're looking for. VIA and NVIDIA cobbled together a Frankenstein combination of technologies just to make the Atom look bad with irrelevant perf specs.
    • by DrYak ( 748999 )

      Hmmm, the old Mini-ITX format had multiple vendors (VIA, Intel, others) using it, and right now the only vendor using Mini-ITX 2.0 is VIA/NVIDIA. How is this more open?

      By the fact that vendors wanting to implement it don't have to pay royalties to anybody. That is what "open" means. You don't necessarily need several vendors using it; you need vendors to face no big barrier to start using it.

      And in what sense was Intel making the old standard less open, other than jumping into that market and doing well?

      Here I agree with you. Intel's Mini-ITX has no fundamental reason to be less open (unless they're charging license fees for their improvements).

      BTW, I have to laugh at the sight of a Mini-ITX board with a relatively low-power VIA CPU having a huge, power-sucking NVIDIA discrete GPU board on it.

      Still, the whole platform is projected to cost less than any Intel offering able to run Vista.

      And I presume the whole VIA / NVidia c

      • "Intel's Mini-ITX has no fundamental reason of being less open..."

        Maybe what they're referring to as "open" (though they're using the word loosely) is that Intel doesn't want people to use its cheap Atom chip to build systems that can compete in the same market segment as its more pricey offerings. That's what would keep an OEM from outfitting a system any way they want to.

        This VIA/nVidia system that does have enough graphics performance to beat the atom and (they claim) to play recent games is bas
