NVIDIA Message Board (posts older than one year)


Posted by Joystiq Aug 11 2013 17:00 GMT
Sales of Nvidia's Android-powered gaming handheld, the Nvidia Shield, have inspired confidence in the platform's future among Nvidia's leaders, GamesIndustry International reports.

Senior Director of Investor Relations Chris Evenden describes sales as "great" and says "everything that we shipped so far has sold out ... we're just starting to ramp production."

Despite Nvidia's optimism for the Shield, the company's Q2 reports show a 6.4 percent drop in revenue and a 19 percent decline in net income year-over-year. Nvidia CEO Jen-Hsun Huang says the company expects a "strong second half" of the year due to the Shield launching outside of the U.S. and the introduction of more Tegra 4-powered devices to the market.

[Embedded YouTube video]
Posted by Joystiq Jul 24 2013 20:00 GMT

Nvidia has unveiled its new mobile graphics processor, built on the same Kepler architecture found in its latest PC graphics cards. Dubbed Project Logan, this new chip supports Unreal Engine 4 and, according to comments from Epic Games CEO Tim Sweeney, will empower mobile devices with "the same high-end graphics hardware capabilities exposed via DirectX 11 on PC games and on next-generation consoles."

Right, it's not just Unreal Engine 4 - Nvidia says Project Logan supports the popular graphics standards OpenGL 4.4 and DirectX 11, and uses one-third the power of the GPUs found in "leading tablets, like the iPad 4." Project Logan runs at 2.2 teraflops, offering more raw processing power than can be found in the PS3's GPU.

Above, you can see a video demo of Nvidia's faux-face "Ira" rendered on Project Logan in real-time, and embedded past the break you'll find Nvidia's "Island" demo running on the mobile chip. Project Logan is still in the prototyping phase, so don't expect to see it in the wild anytime soon.

Posted by Kotaku Jul 24 2013 14:00 GMT
You may remember Ira, the facial rendering demo Nvidia used to show off the power of the latest desktop graphics technology. He's even more impressive when he's running on Project Logan, Nvidia's Kepler-powered next-generation mobile processor. These are mobile graphics. Damn. Read more...

Posted by Joystiq Jul 21 2013 21:30 GMT
Nvidia's Android-powered handheld gaming system Shield will ship July 31, Nvidia announced. The system was originally expected to launch June 27, but shipment was pushed back to July due to a mechanical issue found during the console's QA process.

Our hands-on time with the Shield found some similarities to the Xbox 360 controller, with a little more bulk than a PlayStation Vita, thanks in no small part to its 5-inch, multi-touch, 720p display.

Posted by Kotaku Jul 21 2013 17:17 GMT
Following a last-minute delay that pushed the launch from June to July, today Nvidia confirmed with Kotaku that the Shield handheld is officially launching on July 31. Read more...

Posted by Kotaku Jun 26 2013 19:00 GMT
Remember last week, when Nvidia announced their Shield portable Android gaming system was launching tomorrow? Yeah, about that... Citing a mechanical issue discovered during incredibly last-minute QA testing, the PC game-streaming Shield will now be dropping sometime next month. Here's Nvidia's official statement:

"During our final QA process, we discovered a mechanical issue that relates to a 3rd party component. We want every SHIELD to be perfect, so we have elected to shift the launch date to July. We'll update you as soon as we have an exact date."

While I am glad Nvidia caught whatever the flaw is before the system started (ridiculous speculation alert) exploding in players' hands, I am amazed that a company can announce a major product for release in seven days when testing hasn't been completed. Preorder customers are likely going to be furious, having assumed they'd have the $300 system in their hands tomorrow. Heck, I'm a little miffed myself. Hopefully they get this all ironed out soon. Until then, keep your eyes on the Nvidia blog for further updates.

Posted by Joystiq Jun 26 2013 20:00 GMT
The Nvidia Shield was supposed to launch this week, but Nvidia has announced that a mechanical issue with a third-party component is pushing the release back to sometime in July. The exact date is yet to be determined.

"During our final QA process, we discovered a mechanical issue that relates to a 3rd party component," the Nvidia statement reads. "We want every Shield to be perfect, so we have elected to shift the launch date to July. We'll update you as soon as we have an exact date."

Last week, Nvidia announced it was knocking $50 off the Android-powered handheld device, bringing the Nvidia Shield's price tag down to $300. In May, we visited the Nvidia campus and got our hands on the Nvidia Shield, which sports a 5-inch 720p display and a Tegra 4 chip, and can stream games from your PC, provided you have a GTX 650 GPU or better in your rig.

Posted by Kotaku Jun 25 2013 23:00 GMT
Nvidia has continued to roll out the GeForce 700 series this week with the GTX 760 — the generation's first truly mainstream product, with pricing well under that of the GTX Titan, 780 and even the 770, which at $400 still costs more than the average gamer is willing to spend. In other words, the GTX 760 has the potential to be today's most relevant option for someone who needs a new graphics card.

Previous 700 series cards have been heavy hitters, with the GTX 770 packing about as much muscle as the GTX 680 for about $100 less. Assuming Nvidia doesn't throw us a curve ball, we expect the GTX 760 to deliver performance comparable to that of the HD 7950 with a price tag closer to the HD 7870's — a situation that would invariably benefit anyone shopping for a mid-range GPU.

Like the GTX 770, 680, 670 and 660 Ti before it, the GTX 760 is based on Nvidia's GK104 architecture. That said, the newcomer's core configuration has been cut down quite significantly compared to last month's GTX 770. Before you close the tab, there is some good news: Nvidia has left the card's 256-bit wide memory bus intact, affording the GTX 760 quite a lot of memory bandwidth.

Gainward has given us our first look at the GTX 760 with the company's special Phantom edition graphics card, which features a heavily modified board design packing upgraded cooling and some factory overclocking. We're generally pleased with the results of partners' efforts to customize their products, though it definitely adds some complexity to determining a card's value proposition.

The GTX 760 Phantom in Detail

As was the case with the GTX 770, Gainward prepared its Phantom card in time for the GTX 760 release, touting a reworked PCB, factory overclocking and a massive triple-slot cooler — the last of which is the most noteworthy enhancement. Also like the GTX 770, the GTX 760 receives the new third-generation Phantom cooler with replaceable fans. The new Phantom delivers better thermals while making less noise and boasting a sturdier construction. It's unlike any triple-slot cooler we've encountered before, featuring four 8mm heatpipes that extract heat from the base and distribute it evenly throughout the heatsink. Gainward claims that its Phantom cooler allows the GTX 760 to run 16 degrees cooler than the reference board.

The most unusual part of the cooler design is its fans — their location, specifically. Fans are typically attached to the top side of the heatsink, but instead Gainward has embedded two quiet 80mm brushless PWM fans inside the heatsink. The fans are also removable, featuring a tool-less design. Similar to the way hot-swappable hard drive bays work, the fans slide out once a single thumb screw has been removed — no cables, no fuss.

The heatsink measures 210mm long (the PCB itself is just 170mm long), 65mm wide and just 15mm tall. It has a black fan shroud that forces the 80mm fans to draw air in through fins above them and push it over the card below them at the same time. Unlike the higher-end GTX 770, the GTX 760 Phantom doesn't have a heat spreader over the memory chips, leaving them naked instead.

Nvidia has designed the GTX 760 with three graphics processing clusters, six streaming multiprocessors, 1152 CUDA cores, 96 TAUs and 32 ROPs. In the previous generation, this would place the GTX 760 in between the GTX 660 and 660 Ti. Despite that, we expect the card to be quite a bit faster than the GTX 660 Ti, as it's not only clocked faster but also has a wider memory bus (256-bit versus 192-bit).
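
Those bus widths translate directly into the bandwidth figures quoted below, since peak GDDR5 bandwidth is just bytes per transfer (bus width divided by eight) multiplied by the effective data rate. Here is a quick sanity-check sketch of ours, not TechSpot's tooling; note that the cards' nominal "6GHz" memory is a 6008MT/s effective data rate:

    def peak_bandwidth_gb_s(bus_width_bits, effective_mts):
        # bytes moved per transfer x transfers per second, reported in GB/s
        return bus_width_bits / 8 * effective_mts * 1e6 / 1e9

    print(peak_bandwidth_gb_s(192, 6008))  # GTX 660 Ti (192-bit): ~144.2 GB/s
    print(peak_bandwidth_gb_s(256, 6008))  # GTX 760 (256-bit):    ~192.3 GB/s
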
While the 660 Ti was limited to a peak memory bandwidth of 144.2GB/s, the GTX 760 enjoys a much larger 192.3GB/s of bandwidth. By default, the GTX 760 core is clocked at 980MHz with a boost frequency of 1033MHz, while the 2GB of GDDR5 memory operates at 6GHz. The GTX 660 Ti, in comparison, runs at 915MHz with a boost of 980MHz, but its memory also operates at 6GHz. Gainward has of course done some factory overclocking, boosting the base core frequency to 1072MHz with a boost speed of 1137MHz. This is a 9% increase in core speed, while the memory has been overclocked by 3% to 6.2GHz.

The rest of Gainward's card remains fairly standard, including a pair of SLI connectors, two 6-pin PCIe power connectors and an I/O panel configuration consisting of HDMI, DisplayPort and two DVI ports.

Benchmarks: Battlefield 3, Crysis 3

The GTX 760 averaged 66.1fps at 1920x1200 in Battlefield 3, 6% slower than the overclocked Gainward GTX 760 Phantom. Although the GTX 760 Phantom was faster than the HD 7950, the standard GTX 760 was 3% slower, though it was still 9% faster than the GTX 660 Ti and 12% faster than the HD 7870. The GTX 760's frame time results were fairly similar to its frames per second numbers, as the card was 5% slower than the Phantom edition but 8% faster than the GTX 660 Ti, 17% faster than the HD 7870 and 2% faster than the HD 7950.

The GTX 760 rendered 33.1fps at 1920x1200 in Crysis 3, faring 6% worse than the Gainward GTX 760 Phantom while outperforming the GTX 660 Ti, HD 7870 and HD 7950 by 13%, 39% and 10%. Crysis 3's frame time margins were quite interesting, as AMD-based cards performed better than they did in the frames per second testing. The GTX 760 was still 6% slower than the Phantom model and 14% slower than the GTX 660 Ti, however it was "only" 28% faster than the HD 7870 while it just managed to match the HD 7950's performance.

Benchmarks: BioShock Infinite, Metro Last Light

The GTX 760 averaged 54.3fps when testing BioShock Infinite, which was 3% slower than the GTX 660 Ti yet 11% faster than the HD 7950 and 33% faster than the 7870. The frame time margins were similar to those seen when measuring the frames per second performance, as the GTX 760 was 2% faster than the GTX 660 Ti, 19% faster than the HD 7950 and 38% faster than the 7870.

Despite being the newest game tested, Metro: Last Light only managed to drag the GTX 760 down to 43.7fps — 5% slower than the overclocked Phantom card, but 11% faster than the GTX 660 Ti, 3% faster than the HD 7950 and 9% faster than the 7870. The frame time results were quite different to the fps data, as the GTX 760 was just 1% faster than the GTX 660 Ti and 3% slower than the HD 7870.

Read More...
GeForce GTX 760: Mainstream Performance
Testing Methodology
Benchmarks: DiRT 3, Far Cry 3
Benchmarks: Max Payne 3, Sleeping Dogs
Benchmarks: Medal of Honor, Hitman
Benchmarks: Tomb Raider, Resident Evil 6
Overclocking Performance
Power Consumption & Temperatures
Final Thoughts

Republished with permission from TechSpot. Steven Walton is a writer at TechSpot, a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998.

Posted by Kotaku Jun 20 2013 13:00 GMT
Last month Nvidia's Project Shield Android handheld got a shortened name, a release window of June, and a $349 price tag. With June quickly coming to a close, the Shield is coming in hot on June 27, with $50 knocked off the price to sweeten the deal. Customers who've already preordered the Tegra 4-powered, PC game-streaming handheld will be charged the $299 price when their Shield ships. Those who waited to order one have missed out on suddenly having an extra $50 to screw around with. Is the Shield worth $299? I do not know. At this point I've only seen pictures and watched videos of Randy Pitchford singing. I have my preconceived notions, but they are ripe for shattering. If you're on the fence, I'm sure there'll be plenty of fresh opinions floating around come next week.

[Embedded YouTube video]
Posted by Kotaku Jun 10 2013 21:28 GMT
Something about the Nvidia Shield, or something. Who knows? Here we get a peek into the office and home of Gearbox president and CEO Randy Pitchford, and at 5:08 he plays guitar and sings. He also plays piano and performs magic. So yes, this video is pretty special. Something, something, Nvidia Shield.

[Embedded YouTube video]
Posted by Kotaku Jun 05 2013 20:45 GMT
Madfinger Games' ambitious follow-up to zombie shooter Dead Trigger is one of the poster games for Nvidia's new Tegra 4 chip, the power behind the upcoming Shield handheld. Today the developer released a video demonstrating how much moister the game will look on the powerful new tech. There are other effects in there as well — smoke and particles and such — but man, the ground in Dead Trigger 2 is going to be so wet. Even the interior location shown here has been flooded, which I guess is something that happens when the zombie apocalypse comes. Everyone is too busy dying to drink water. As a game, Dead Trigger 2 sounds rather nifty. Players will come into the game in the second year after the zombie outbreak, and from there the game will evolve according to the actions of the playing community. Players waiting until next year to play will be stepping into year three of the infestation. That's a really cool concept. As a Tegra 4 tech demo, Dead Trigger 2 is so wet. The game will release later this year as a free-to-play iOS and Android offering, and the developers haven't ruled out the possibility that Dead Trigger 2 will appear simultaneously on the MacStore, Steam and Facebook.

Posted by Kotaku Jun 03 2013 16:16 GMT
Interested in Tegra 4 gaming but don't want to drop $350 on a Shield handheld? Asus' newly-announced Transformer Pad Infinity will be one of the first non-Nvidia devices to feature the world's fastest mobile processor.

Posted by Kotaku May 30 2013 19:00 GMT
Having taken the covers off the GeForce GTX 780 a week ago, Nvidia is ready to release the next part in the GeForce 700 series. Giving us our first look at the GeForce GTX 770 is Gainward, with their special Phantom edition card featuring an upgraded cooling solution, factory overclocking, and 8-phase PWM.

But let's put things into further context. The GTX 780 that debuted last week was based on the same Kepler GK110 architecture used by the GTX Titan. Nvidia priced the GTX 780 at $650, making it 35% cheaper than the GTX Titan but also 40% more expensive than the GTX 680. In terms of performance, the GTX 780 was only 10 - 15% slower than the Titan, so it added value to an otherwise very exclusive price point. Compared to the GTX 680, however, the numbers were less impressive, as the GTX 780 was just 24% faster.

The GeForce GTX 780 is therefore an attractive option for those wanting Titan levels of performance at a more moderate price, but in the overall scope of things the 780 was hardly exciting news for the vast majority of gamers: it remains a very expensive affair, and the release did nothing to drive down prices of previous generation cards. Looking forward to the GeForce GTX 770's release, we were hoping this would be a little more meaningful for the gaming community.

The GTX 770 is based on the GK104 architecture, first used by last year's GTX 680. Earlier rumors indicated that the GTX 770's specifications would be much like a GTX 680 on steroids, and as it turns out that's exactly what it is. Virtually everything about the GTX 770 and GTX 680 is the same, except for core and memory clock speeds. The GTX 770 features the fastest GDDR5 memory we have ever seen at 7GHz. Memory at that clock rate is good for a peak bandwidth of 224GB/s, 16% more than the GTX 680. Technically, then, if you could overclock a GTX 680 well enough you could create a GTX 770.

GeForce GTX 770 Phantom in Detail

Gainward has prepped their Phantom card in time for the GTX 770 release, touting a reworked PCB with an upgraded power phase, factory overclocking and a massive triple-slot cooler — the last of which is the most noteworthy enhancement. Although Gainward featured its Phantom cooler on some GTX 600 series cards, the GTX 770 is the first to market with the company's third-generation solution. The new Phantom delivers better thermals while making less noise and boasting a sturdier construction. It's unlike any triple-slot cooler we've encountered before. It features five 6mm heatpipes that extract heat from the base and evenly distribute it throughout the heatsink.

The most unusual part of the cooler design is the fans, or rather their location. Fans are typically attached to the top side of the heatsink, but instead Gainward has embedded three quiet 80mm brushless PWM fans inside the heatsink. The fans are also removable, featuring a tool-less design. Similar to the way hot-swappable hard drive bays work, the fans slide out once a single thumb screw has been removed — no cables, no fuss.

The heatsink measures 257mm long, 65mm wide and 45mm tall. It features a black fan shroud that forces the 80mm fans to draw air in through fins above them and push it over the card below them at the same time. Moving past the heatsink is a black aluminum heat spreader that engulfs the top side of the card and cools the eight 256MB GDDR5 memory chips along with the 8-phase PWM.
By using an 8-phase design, Gainward includes two extra phases for power delivery to the GPU, which should improve performance under heavy loads and aid the card's overclocking abilities. Speaking of overclocking, Gainward has done a little of the heavy lifting by pushing the core clock from 1046MHz to 1150MHz, a decent 10% increase, while the boost clock is increased from 1085MHz to 1202MHz, an 11% increase. The GDDR5 operating frequency has been left at 7GHz, meaning the memory bandwidth remains at 224.3GB/s.

As mentioned before, beyond clock speeds the GeForce GTX 770's specifications are identical to the GTX 680's. This means there are 4 graphics processing clusters, 8 streaming multiprocessors, 1536 CUDA cores, 128 TAUs and 32 ROPs. The rest of Gainward's card remains fairly standard, including a pair of SLI connectors, 6-pin and 8-pin PCIe power connectors, and an I/O panel configuration consisting of HDMI, DisplayPort and two DVI ports.

Testing Methodology

As usual we tested each graphics card with Fraps, which lets us record frame rates over a set amount of time. Typically, we run our tests for 60 seconds. Reporting the average fps (frames per second) is how things have been done for... well, forever. It's a fantastic metric in the sense that it's easy to record and easy to understand. But it doesn't tell the whole story, as The Tech Report and others have shown.

To get a fuller picture, it's increasingly apparent that you need to factor in a card's frame latency, which looks at how quickly each frame is delivered. Regardless of how many frames a graphics card produces on average in 60 seconds, if it can't deliver them all at roughly the same speed, you might see more brief jittery points with one GPU over another — something we've witnessed but didn't fully understand. Assuming two cards deliver equal average frame rates, the one with the lowest stable frame latency is going to offer the smoothest picture, and that's a pretty important detail to consider if you're about to drop a wad of cash.

As such, we'll be including this information from now on by measuring how long in milliseconds it takes cards to render each frame individually and then graphing that in a digestible way. We'll be using the latency-focused 99th percentile metric, which looks at 99% of results recorded within X milliseconds, and the lower that number is, the faster and smoother the performance is overall. By removing the 1% most extreme results, it's possible to filter anomalies that might have been caused by other components. Again, kudos to The Tech Report and other sites like PC Per for shining a light on this issue.
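
To make the metric concrete, here is a minimal sketch, ours rather than TechSpot's tooling, of how average fps and the 99th-percentile frame time fall out of a Fraps-style per-frame log, and why the percentile catches stutter that the average hides:

    import numpy as np

    def frame_time_summary(frame_times_ms):
        # Average fps plus the latency-focused 99th percentile frame time.
        ft = np.asarray(frame_times_ms, dtype=float)
        return 1000.0 / ft.mean(), np.percentile(ft, 99)

    # Two runs with identical average fps but very different smoothness:
    steady = [20.0] * 100               # every frame on a steady 50fps cadence
    spiky = [15.0] * 90 + [65.0] * 10   # same 20ms average, regular stalls

    print(frame_time_summary(steady))   # 50.0 fps, 99th percentile 20ms
    print(frame_time_summary(spiky))    # 50.0 fps, 99th percentile 65ms
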
Test System Specs

Intel Core i7-3960X Extreme Edition (3.30GHz)
4 x 2GB G.Skill DDR3-1600 (CAS 8-8-8-20)
Asrock X79 Extreme11 (Intel X79)
OCZ ZX Series (1250W)
Crucial m4 512GB (SATA 6Gb/s)
HIS Radeon HD 7990 (6144MB)
HIS Radeon HD 7970 GHz (3072MB) Crossfire
HIS Radeon HD 7970 GHz (3072MB)
HIS Radeon HD 7970 (3072MB)
HIS Radeon HD 7950 Boost (3072MB) Crossfire
HIS Radeon HD 7950 Boost (3072MB)
HIS Radeon HD 7950 (3072MB)
HIS Radeon HD 7870 (2048MB) Crossfire
HIS Radeon HD 7870 (2048MB)
Gigabyte GeForce GTX Titan (6144MB)
Gainward GeForce GTX 780 (3072MB)
Gainward GeForce GTX 770 (2048MB)
Gainward GeForce GTX 690 (4096MB)
Gainward GeForce GTX 680 (2048MB)
Gigabyte GeForce GTX 670 (2048MB)
Gainward GeForce GTX 660 Ti (2048MB) SLI
Gainward GeForce GTX 660 Ti (2048MB)
Microsoft Windows 7 Ultimate SP1 64-bit
Nvidia Forceware 320.18
AMD Catalyst 13.5 (Beta 2)

Benchmarks: Battlefield 3, Crysis 3

The Gainward GTX 770 Phantom was 5% faster than the standard GTX 770 in Battlefield 3 at 2560x1600, rendering 54.1fps. This meant it was just 4% faster than the Radeon HD 7970 GHz Edition but 12% faster than the GeForce GTX 680. The overclocked Gainward GTX 770 Phantom was also 17% slower than the GTX 780, which averaged a more impressive 64.8fps. The frame time performance saw the Gainward GTX 770 Phantom produce similar margins when compared to the competition. Here the GTX 770 Phantom was 5% faster than the standard GTX 770, 12% faster than the GTX 680 and 7% faster than the Radeon HD 7970 GHz Edition, while it trailed the GTX 780 by an 18% margin.

The Gainward GTX 770 Phantom averaged 27.9fps when testing with Crysis 3 at 2560x1600, the same result turned in by the standard GTX 770. Despite that, it was still 6% faster than the GTX 680 and 18% faster than the Radeon HD 7970 GHz Edition. Meanwhile, the Gainward GTX 770 Phantom was just 6% slower than the GeForce GTX 780. The Crysis 3 frame time performance is quite different to the frames per second performance. This time the Gainward GTX 770 Phantom was 8% faster than the standard GTX 770 and 18% faster than the GTX 680, while it was just 4% slower than the GTX 780.

Read More...
Benchmarks: DiRT 3, Far Cry 3
Benchmarks: Max Payne 3, Sleeping Dogs
Benchmarks: Medal of Honor, Hitman
Benchmarks: Tomb Raider, Resident Evil 6
Power Consumption & Temperatures
Conclusion: Adding Value to High-End GFX?

Republished with permission from TechSpot. Steven Walton is a writer at TechSpot, a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998.

Posted by Rock, Paper, Shotgun May 27 2013 12:00 GMT

Ha, sorry. Not really. But it got your attention. And there’s a thin tendril of truth in it. It’s been a busy week in hardware and in my mortal hands I hold a laptop containing AMD’s Jaguar cores. The very same cores as found in the freshly minted games consoles from Microsoft and Sony. So what are they like and what does it mean for PC gaming?

Meanwhile, Nvidia drops a price bomb of the bad kind and Intel has some new chips on the way. Read on for the gruesome details. (more…)


Posted by Kotaku May 23 2013 13:40 GMT
The GeForce GTX 680 was Nvidia's first 28nm part, featuring 1536 CUDA cores, 128 texture units and 32 ROP units. Since release it has remained Nvidia's fastest single-GPU graphics card, second only to the dual-GPU GTX 690, which features a pair of GK104 GPUs. And so for the last 12+ months the GTX 680 and the Radeon HD 7970 have been battling over the performance crown, forcing numerous price cuts and even a little overclocking from AMD to produce the 7970 GHz Edition. In the end AMD was able to undercut Nvidia on price, producing what we believe to be the better solution.

Most recently, however, Nvidia showed what it could really do with the GK104 architecture by beefing it up with more CUDA cores, texture units and ROPs, creating the GK110. The GeForce GTX Titan is a monster that belongs to an entirely different league, crushing the GeForce GTX 680 as well as the Radeon HD 7970 GHz in every way possible. Real-world gaming tests saw the GTX Titan outpace the GTX 680 by a 42% margin and the Radeon HD 7970 GHz Edition by a 30% margin. In the past we've seen performance jumps of 20 to 25% from one generation to the next, so these numbers are indeed something special. But of course, with a $1,000 price tag it's comparing apples and oranges.

If anything, the Titan showed how much more complex and powerful Nvidia could make the current generation 28nm GPU without putting the TDP rating through the roof. It also meant that Nvidia could move to the next generation of mainstream GPUs without having to completely redesign its architecture for the GeForce 700 series, and that is exactly what the company has done. The new GeForce GTX 780 is based on a similar, albeit slightly cut down, version of the Titan GPU, keeping many of the features that make the $1,000 graphics card great, such as the 384-bit memory bus.

GeForce GTX 780 in Detail

The GeForce GTX 780 reference board measures 10.5" (26.7cm) in length. Display outputs include two dual-link DVIs, one HDMI and one DisplayPort connector. With 2304 CUDA cores at its disposal, the GeForce GTX 780 features 50% more CUDA cores than the GeForce GTX 680. The GTX 780 also gets a 3GB memory buffer as standard, 50% more than the GTX 680. Helping to take advantage of the extra memory are six 64-bit memory controllers for a 384-bit wide memory bus. Paired with a 6008MHz GDDR5 memory clock, this provides up to 288.4GB/sec of peak memory bandwidth to the GPU.

Those specs mean that those still rocking a GeForce GTX 580 should be looking at around a 70% performance improvement when upgrading to the GTX 780. And it's not just GTX 580 owners who can expect a decent performance upgrade: owners of last year's GTX 680 should still receive around 30 - 40% more performance, at least on paper.

The 12 SMX units providing the 2304 CUDA cores are clocked at 863MHz, though using Boost 2.0 they can be clocked up to 900MHz in certain scenarios. The second-generation GPU Boost technology works in the background, dynamically adjusting the GPU's graphics clock speed based on operating conditions. Originally, GPU Boost was designed to push the GPU to the highest possible clock speed while remaining within a predefined power envelope. However, Nvidia's engineers found that GPU temperature usually limits performance first. Therefore, with Boost 2.0 they have changed the way the technology works, boosting clock speeds according to GPU temperature rather than a power target. The target in question for the GTX 780 is 80 degrees Celsius.
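
Nvidia hasn't published the actual Boost 2.0 control algorithm, so the following is only a toy sketch of the temperature-targeting behaviour described above. The 863MHz base clock and 80C target come from the article; the step size and boost ceiling are invented for illustration:

    BASE_CLOCK = 863      # MHz, GTX 780 base clock
    MAX_BOOST = 1100      # MHz, arbitrary ceiling for this sketch
    STEP = 13             # MHz per adjustment, one "boost bin" (our guess)
    TEMP_TARGET = 80.0    # degrees Celsius, the GTX 780's Boost 2.0 target

    def next_clock(clock_mhz, gpu_temp_c):
        # Climb while the GPU runs cool, back off once it exceeds the target.
        if gpu_temp_c < TEMP_TARGET and clock_mhz + STEP <= MAX_BOOST:
            return clock_mhz + STEP
        if gpu_temp_c > TEMP_TARGET and clock_mhz - STEP >= BASE_CLOCK:
            return clock_mhz - STEP
        return clock_mhz  # at the target (or a limit): hold the current clock
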
In practice, then, the GTX 780 will automatically boost to the highest clock frequency it can achieve as long as the GPU temperature remains at 80C. Boost 2.0 constantly monitors GPU temperature, adjusting the GPU's clock and its voltage on the fly to maintain this temperature.

Nvidia has borrowed the design of the GTX Titan for the GTX 780, which is great news, as the Titan not only looked imposing but was also whisper quiet. Like many other recent high-end GPUs, the GTX 780 makes use of vapor chamber cooling, which consists of a copper vapor chamber that extracts heat from the processor using an evaporation process similar to a heatpipe, but more powerful. Helping to improve efficiency here is a new thermal material designed by a company called Shin-Etsu, which is said to provide twice the performance of the grease used on the GTX 680. Additionally, Nvidia has included an extra heatsink behind the 80mm blower-style fan that increases the cooling area. There is also an aluminum baseplate, which provides additional cooling for the PCB and board components.

The guts of the cooling operation are covered in a case that encloses the top of the card. Given the high-end nature of this board, Nvidia's engineers decided to use an aluminum casing for the cover. At its center is a clear polycarbonate window, allowing you to see the vapor chamber and dual-slot heatsink used to cool the GPU. Another nice touch in our opinion: the side of the card features a large GeForce GTX logo that glows green when the system is turned on. We think this looks cool, but if it's not for you, the LED intensity can be adjusted in software.

Beside the logo, towards the end of the card, is a pair of PCI Express power connectors. The configuration is the same as the GTX Titan's, meaning you will find a single 8-pin along with a 6-pin connector. The GTX 780 has been given a TDP rating of 250 watts, which is 28% greater than the GTX 680's, so Nvidia recommends using a 600W power supply. The board features a 6+2 power phase design that Nvidia says feeds enough power even when overclocking. Six phases are dedicated to the GPU while two are for the GDDR5 memory.

Testing Methodology

Reporting average fps (frames per second) using Fraps is how things have been done for... well, forever. It's a fantastic metric in the sense that it's easy to record and easy to understand. But it doesn't tell the whole story, as The Tech Report and others have shown. To get a fuller picture, it's increasingly apparent that you need to factor in a card's frame latency, which looks at how quickly each frame is delivered. Regardless of how many frames a graphics card produces on average in 60 seconds, if it can't deliver them all at roughly the same speed, you might see more brief jittery points with one GPU over another — something we've witnessed but didn't fully understand. Assuming two cards deliver equal average frame rates, the one with the lowest stable frame latency is going to offer the smoothest picture, and that's a pretty important detail to consider if you're about to drop a wad of cash.

As such, we'll be including this information from now on by measuring how long in milliseconds it takes cards to render each frame individually and then graphing that in a digestible way. We'll be using the latency-focused 99th percentile metric, which looks at 99% of results recorded within X milliseconds, and the lower that number is, the faster and smoother the performance is overall.
By removing the 1% most extreme results, it's possible to filter anomalies that might have been caused by other components. Again, kudos to The Tech Report and other sites like PC Per for shining a light on this issue.

Test System Specs

Intel Core i7-3960X Extreme Edition (3.30GHz)
4 x 2GB G.Skill DDR3-1600 (CAS 8-8-8-20)
Asrock X79 Extreme11 (Intel X79)
OCZ ZX Series (1250W)
Crucial m4 512GB (SATA 6Gb/s)
Radeon HD 7990 (6144MB)
Radeon HD 7970 GHz (3072MB) Crossfire
Radeon HD 7970 GHz (3072MB)
Radeon HD 7970 (3072MB)
Radeon HD 7950 Boost (3072MB) Crossfire
Radeon HD 7950 Boost (3072MB)
Radeon HD 7950 (3072MB)
Radeon HD 7870 (2048MB) Crossfire
Radeon HD 7870 (2048MB)
GeForce GTX Titan (6144MB)
GeForce GTX 780 (3072MB)
GeForce GTX 690 (4096MB)
GeForce GTX 680 (2048MB)
GeForce GTX 670 (2048MB)
GeForce GTX 660 Ti (2048MB) SLI
GeForce GTX 660 Ti (2048MB)
Microsoft Windows 7 Ultimate SP1 64-bit
Nvidia Forceware 320.14
AMD Catalyst 13.5 (Beta 2)

Benchmarks: Battlefield 3, Crysis 3

The GeForce GTX 780 averaged 62.4fps in Battlefield 3 at 2560x1600, making it 20% faster than the Radeon HD 7970 GHz Edition and 29% faster than the GTX 680. When compared to the dual-GPU GTX 690, the GTX 780 was 32% slower, and it was just 8% slower than the GTX Titan. The GeForce GTX 780 slipped further behind the GTX Titan when measuring frame time performance — it was 14% slower, to be precise. Still, when compared to the Radeon HD 7970 GHz Edition the GTX 780 was 17% faster, as it was 22% faster than the GTX 680.

The GeForce GTX 780 spat out 29.1fps in Crysis 3 at 2560x1600, just 2fps slower than the GTX Titan but also just 1fps faster than the GTX 680. Nevertheless, it was 31% faster than the Radeon HD 7970 GHz Edition. When measuring frame time performance in Crysis 3, the GeForce GTX 780 was 15% slower than the GTX Titan but 12% faster than the GTX 680. It was also 17% faster than the Radeon HD 7970 GHz Edition card.

Read More...
Benchmarks: DiRT 3, Far Cry 3
Benchmarks: Max Payne 3, Sleeping Dogs
Power Consumption & Temperatures
Closing Thoughts: Performance vs. Price

Republished with permission from TechSpot. Steven Walton is a writer at TechSpot, a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998.

Posted by Joystiq May 17 2013 14:30 GMT
Nvidia has dropped the waiting list for Shield pre-orders, allowing anyone to commit to the $350 Android handheld right now. The Nvidia Shield is available for pre-order directly through Nvidia or through select retailers: GameStop, Micro Center, Canada Computers and Newegg.

The Nvidia Shield - previously known as Project Shield - is a handheld gaming console powered by Android. It has a five-inch retinal multi-touch screen capable of 720p, 16GB of internal storage and can stream your PC games, provided you have a GTX 650 GPU or better in your PC.

Posted by Kotaku May 16 2013 12:30 GMT
Mere days after the announcement of its $350 price tag and June release date, Nvidia's Shield Android handheld is already showing up in the background of popular television shows. Look what Modern Family's Luke Dunphy has traded in his Nintendo 3DS for. As spotted over at Game Usagi, Luke brings Nvidia's new portable along for a family RV ride in lieu of his usual handheld system of choice, perhaps lured by the potential of the Tegra 4 processor, or the ability to stream games from a GeForce GTX video card-toting PC. Or maybe some advertising money changed hands. It's a rather subtle appearance, especially considering the average younger gamer likely has no idea what the Shield looks like, and would probably mistake it for a fancy 3DS play-through case if pressed to identify it. It'll get there.

Posted by Rock, Paper, Shotgun May 15 2013 09:00 GMT

Nvidia’s Shield is technically an Android-based device, but a) what kind of mighty android machine overlord needs a shield and b) we’re a PC gaming website. So then, why am I posting about this rare breed of land-dwelling game clam? Well, because it flawlessly streams just about any PC game you can throw at it – or at least, it will once that feature leaves beta a couple months after launch. Do you feel like an itsy bitsy screen, infinitely twiddle-able thumbsticks, and the ability to play anywhere in the whole wide worrrrrrrrrrrld (as long as your PC is, er, pretty close by) will greatly enhance your experience?  Then stream your eyeballs past the break for details.

(more…)


Posted by Joystiq May 14 2013 15:00 GMT

The Nvidia Shield arrives next month for $349.99, and yesterday I got to sit down with the final retail version. The first thing I noticed was the heft: bulkier than a PS Vita, but no less comfortable.

Where the PS Vita sacrificed bigger buttons for smaller form factor, the Nvidia Shield takes a lot of inspiration from the Xbox 360 controller. In fact, the left and right triggers feel identical to the Xbox 360 and, more or less, so does the d-pad. The one big difference from Microsoft's gamepad is the symmetrical analog sticks.

Posted by Joystiq May 14 2013 14:00 GMT
Nvidia's Project Shield - now officially dubbed Nvidia Shield - will launch before the end of June for $349.99, Nvidia has announced. Pre-orders for the Tegra 4-powered Android handheld will open on May 20, through select online and brick-and-mortar retailers: New Egg, GameStop, Micro Center and Canada Computers. Those on Nvidia's Project Shield notification list can pre-order starting today.

With the price and pre-order date, Nvidia announced five new games coming to the platform: Broken Age and Costume Quest from Double Fine, Flyhunter: Origins from Steel Wool Games, Skiing Fred from Dedalord Games and Chuck's Challenge from Niffler, the studio of Chuck Sommerville, whom you may recall from Chip's Challenge fame.

The Nvidia Shield runs Android 4.2.1, sports a 5-inch retinal display capable of 720p and has 16GB of internal storage, expandable through an SD card slot on the back. Other hardware features include a built-in mic and GPS, plus a mini-HDMI out on the back. All Nvidia Shields will include two free games: Sonic the Hedgehog 4: Episode 2 and Expendable: Rearmed.

We'll have a hands-on video with the final Nvidia Shield and some impressions up soon.

Posted by Kotaku May 14 2013 03:00 GMT
"Ira" is the name given to the bald guy you may have seen lately in some next-gen tech videos from Nvidia and Activision. He's one more step down the road towards more believable artificial characters. If you'd like a little taste of the next-gen on your current-spec PC, an interactive demo of the tech has been released by Nvidia, allowing users to adjust the settings, lighting and shaders used on Ira's face. Lifelike Human Face Rendering [Nvidia]

Posted by Rock, Paper, Shotgun May 13 2013 15:00 GMT

Back in Feb we had a little chin wag about the mad dash of annual graphics hardware launches slowing to a saunter. We can add a little more flesh to the bones of that story this week, with some pretty plausible-looking details of Nvidia’s upcoming plans – and further confirmation of nothing new from AMD. It’s worth a quick dip into the mucky waters of rumour for anyone pondering a GPU upgrade or generally a new rig, as some new kit – of sorts – is imminent. (more…)


Posted by Rock, Paper, Shotgun May 13 2013 13:00 GMT

It feels like it was only, ooh, 1 month and 24 days since we last risked our collective sanity. We stared into the cold, shark-like eyes of the technological advancement of Nvidia’s FaceWorks and lived. But at what cost? Back then, we were given a peep into the future of GraphicsFace with Digital Ira, a sadly uninteractive demonstration of what gamefaces will be like in the future. It looked impressive, but with the caveat that it was shown on stage and running on a Titan, Nvidia’s mahoosive card of graphics. Well, the tech monolith has just released the demonstration for everyone to play with. If you fancy making a high-fidelity head gurn, then your fetish is well catered for. (more…)


Posted by Rock, Paper, Shotgun Apr 25 2013 14:00 GMT

I feel like I should apologise for the headline but Nvidia call their middleware physics engine PhysX, for crying out loud. ‘Making a splash’ is almost Nabokovian in comparison. You may recall recent advances in convincing/crazed coiffures and I care about that about as much as I care about the latest floppy-fringed hair fashions in the real world. Not a jot. Fluid physics though? Ever since the invention of physics, which was sometime just before I balanced bricks on a plank to create a see-saw bridge in Half Life 2, I’ve been waiting for a game with proper water. The latest PhysX tech demo got my juices flowing and you can see it below.

(more…)


[Embedded YouTube video]
Posted by Kotaku Apr 23 2013 12:30 GMT
Simulating the physics of water has always been tricky, and game engines sometimes still have to use dodgy mechanics to make it feel real. But the above demonstration of this new fluid simulation technique proves that, slowly but surely, we're getting there. PhysXInfo has the details on how it works: Position Based Fluids is a way of simulating liquids using Position Based Dynamics (PBD), the same framework that is utilized for cloth and deformables simulation in PhysX SDK. Because PBD uses an iterative solver, it can maintain incompressibility more efficiently than traditional SPH fluid solvers. It also has an artificial pressure term which improves particle distribution and creates nice surface tension-like effects (note the filaments in the splashes). Finally, vorticity confinement is used to allow the user to inject energy back into the fluid. According to PhysXInfo, it is running in real time on a single GTX 580, which makes the whole thing even more impressive. Position Based Fluids Demonstration [YouTube]
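
For the curious, the density-constraint step PhysXInfo describes looks roughly like this. It's a minimal, unoptimized sketch of ours based on the published Position Based Fluids formulation (Macklin and Müller, 2013), not PhysX code; the real solver is GPU-parallel with spatial hashing, and the constants here are purely illustrative:

    import numpy as np

    H, REST, EPS = 0.1, 1000.0, 100.0   # smoothing radius, rest density, relaxation
    K_CORR, N_CORR, DQ = 0.1, 4, 0.02   # artificial pressure ("s_corr") parameters

    def poly6(r):
        w = np.zeros_like(r)
        m = r < H
        w[m] = 315.0 / (64 * np.pi * H**9) * (H**2 - r[m]**2) ** 3
        return w

    def spiky_grad(d, r):
        g = np.zeros_like(d)
        m = (r > 1e-9) & (r < H)
        g[m] = -45.0 / (np.pi * H**6) * ((H - r[m]) ** 2)[:, None] * d[m] / r[m][:, None]
        return g

    def density_iteration(p):
        # One Jacobi iteration of the density constraint over all particles,
        # using a naive O(n^2) neighbour search for clarity.
        n, lam = len(p), np.zeros(len(p))
        for i in range(n):
            d = p[i] - p
            r = np.linalg.norm(d, axis=1)
            C = poly6(r).sum() / REST - 1.0          # density constraint C_i
            grads = spiky_grad(d, r) / REST
            denom = (grads ** 2).sum() + (grads.sum(axis=0) ** 2).sum()
            lam[i] = -C / (denom + EPS)              # EPS is the CFM relaxation
        dp = np.zeros_like(p)
        for i in range(n):
            d = p[i] - p
            r = np.linalg.norm(d, axis=1)
            s_corr = -K_CORR * (poly6(r) / poly6(np.array([DQ]))[0]) ** N_CORR
            dp[i] = ((lam[i] + lam + s_corr)[:, None] * spiky_grad(d, r)).sum(axis=0) / REST
        return p + dp

Running a few of these iterations per frame, after the usual predict-positions step, is what lets PBD hold the fluid incompressible without the tiny timesteps traditional SPH needs; the s_corr term is the "artificial pressure" the quote credits for the filaments in the splashes.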

Posted by Rock, Paper, Shotgun Apr 18 2013 17:00 GMT

Don’t sling your old CPU on eBay just yet. Too many Rumsfeldian known unknowns remain, never mind the unknown unknowns. But the known knowns suggest Intel is bringing back at least a sliver of overclocking action to its budget CPUs. It arrives with the incoming and highly imminent Haswell generation of Intel chips, and it might help restore a little fun to the budget CPU market, not to mention a little faith in Intel. Next up, local game streaming. Seems like a super idea to me. So, I’d like to know, well, what you’d like to know about streaming. Then I’ll get some answers for you. Meanwhile, game bundles, or bagging free games when you buy PC components. Do you care? I’ve also had a play with the latest bonkers-wide 21:9-aspect PC monitors… (more…)


Posted by Rock, Paper, Shotgun Mar 20 2013 16:00 GMT

Faces are everywhere in games. NVIDIA noticed this and has been on a 20-year odyssey to make faces more facey and less unfacey (while making boobs less booby, if you’ll remember the elf-lady Dawn). Every few years they push out more facey and less unfacey face tech and make it gurn for our fetishistic graphicsface pleasure. Last night at NVIDIA’s GPU Technology Conference, NVIDIA founder Jen-Hsun Huang showed off Face Works, the latest iteration. Want to see how less unfacey games faces can be?

(more…)


Posted by Joystiq Mar 19 2013 02:45 GMT
Tomb Raider on PC got another patch this weekend, launched in conjunction with new beta drivers from Nvidia for GeForce GPUs. Download the latest GeForce 314.21 drivers direct from Nvidia.

Since the game's launch, GeForce users playing Tomb Raider have seen "major performance and stability issues" when trying to play at max settings. Complaints include problems with TressFX and tessellation tripping the game up, at least for those with 600-series cards.

There are more Tomb Raider patches incoming from Crystal Dynamics, based on ongoing player feedback.

Posted by Kotaku Mar 16 2013 00:00 GMT
Although this year's Tomb Raider reboot made our latest list of most anticipated PC games, I must admit that it was one of the games I was least looking forward to from a performance perspective. Previous titles in the franchise have received mixed to positive reviews, but gameplay aside, their visuals weren't exactly mind-blowing so we've never bothered doing a performance review on one — until now, anyway. More »

Posted by Rock, Paper, Shotgun Mar 07 2013 11:00 GMT

Nvidia have written a little apology note to all suffering with Tomb Raider graphics issues. Although I’ve yet to receive chocolates. I mentioned in yesterday’s Tomb Raider review that I had some issues with running the game on prettier graphics, and it seems I’m not alone. Apart from the silly hair mode reducing Nvidia cards to jelly, I had peculiar problems with the OSD occasionally causing the game to judder, and couldn’t play above the normal settings. Extraordinarily, as spotted by Joystiq, this is because Nvidia for some reason didn’t receive final code of the game until the weekend before release, so didn’t have a chance to create an update to accommodate it all.

(more…)