
Everything we know about the GeForce RTX 2080, Nvidia's next graphics card

DetroitDevil



If current indications are to be believed—those indications being signs and teasers direct from Nvidia itself—the company's next graphics card is called the Nvidia GeForce RTX 2080, and it will launch (or at least be officially revealed) on August 20. But there are still a lot of questions, like what will RTX 2080 performance be like, and how much will the RTX 2080 cost? Let's dive in.

It's been a while since Nvidia introduced its last new graphics architecture for gaming GPUs. That last architecture was Pascal, and it has powered everything from the top-tier GTX 1080 and GTX 1080 Ti to the entry-level GTX 1050 and GT 1030; it's in many of the best graphics cards you can buy right now. The next generation of Nvidia graphics cards is finally approaching, using the Turing architecture. Here's what we know about Turing and the RTX 2080, what we expect in terms of price, specs, and release date, and the winding path from Pascal to Turing.

Details emerge for the GeForce RTX 2080
Previously, there had been a ton of speculation and, yes, blatantly wrong guesses as to what the Turing architecture would contain. Let me just put this out there: every single one of those guesses was wrong. Chew on that for a moment. All the supposed leaks and benchmarks? They were faked. Even the naming guesses were wrong. Nvidia CEO Jensen Huang unveiled many core details of the Turing architecture at SIGGRAPH, finally putting all the rumor-mongering to bed. Combine that with a teaser video for the next GeForce cards, and we now know most aspects of what will be in the GeForce RTX 2080.

Nvidia has been extremely tight-lipped about its future GPUs this round, but with an anticipated announcement at Gamescom as part of Nvidia's GeForce gaming celebration, plus the SIGGRAPH Turing architecture details, the name is now pretty clear. GTX branding is out, RTX (for real-time ray-tracing) is in; 11-series numbers are out, and 20-series numbers are in. Nvidia also recently trademarked both GeForce RTX and Quadro RTX, and while it's possible GTX parts might coexist with RTX parts, I'd be surprised if Nvidia chose to go that route. The new cards apparently start with the GeForce RTX 2080 and will trickle down to other models over the coming months.

Moving on to the Turing architecture, this is where Nvidia really kept some surprises hidden from the rumor mill. The Volta architecture has some features that we weren't sure would get ported over to the GeForce line, but Nvidia appears ready to do that and more. The Turing architecture includes the new Tensor cores that were first used in the GV100, and then it adds RT cores to assist with ray-tracing. That could be important considering Microsoft's recent creation of the DirectX Raytracing API.

The Quadro RTX professional GPUs will have both core types enabled, though it's still possible Nvidia will flip a switch to disable the Tensor cores in GeForce. The RT cores, on the other hand, need to stick around, or else the GeForce RTX branding wouldn't make sense. However, Nvidia also revealed a new anti-aliasing algorithm called DLAA, Deep Learning Anti-Aliasing, which implies the Tensor cores will remain active as well.

Initially, Turing GPUs will be manufactured using TSMC's 12nm FinFET process. We may see later Turing models manufactured by Samsung, as was the case with the GTX 1050/1050 Ti and GT 1030 Pascal parts, but the first parts will come from TSMC. One particularly surprising revelation from the Quadro RTX announcement is that the top Turing design has 18.6 billion transistors and measures 754mm2. That's a huge chip, far larger than the GP102 used in the GTX 1080 Ti (471mm2 and 11.8 billion transistors) and only slightly smaller than the Volta GV100. That also means the new RTX 2080 will likely sit at the top of the 20-series stack, or else Nvidia will call the top card the RTX 2080 Ti.

What does the move to 12nm from 16nm mean in practice? Various sources indicate TSMC's 12nm is more of a refinement and tweak to the existing 16nm rather than a true reduction in feature sizes. In that sense, 12nm is more of a marketing term than a true die shrink, but optimizations to the process technology over the past two years should help improve clockspeeds, chip density, and power use—the holy trinity of faster, smaller, and cooler running chips. TSMC's 12nm FinFET process is also mature at this point, with good yields, allowing Nvidia to create such a large GPU design.
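A quick back-of-the-envelope check supports that "marketing term" framing: using the published transistor counts and die sizes above, the 12nm Turing chip works out to be no denser than the 16nm GP102. A rough Python sketch:

```python
# Transistor density from the published figures: 12nm Turing is actually
# a touch *less* dense than 16nm GP102, supporting the "refined 16nm" reading.
chips = {
    "GP102 (16nm)": (11.8e9, 471),   # transistors, die area in mm^2
    "Turing (12nm)": (18.6e9, 754),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # million transistors per mm^2
    print(f"{name}: {density:.1f} MTr/mm^2")

# GP102 (16nm): 25.1 MTr/mm^2
# Turing (12nm): 24.7 MTr/mm^2
```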

We also know the maximum core counts for the CUDA and Tensor cores; for the RT cores, we only know the target speed, as Nvidia hasn't discussed how many of them are used. The top Turing design allows for up to 4,608 CUDA cores, an increase of 20 percent relative to the GP102, and 29 percent more than the GTX 1080 Ti. Nvidia will deliver 16 TFLOPS of computational performance from the CUDA cores (FP32), which indicates a clockspeed of around 1700MHz. Turing also has 576 Tensor cores capable of 125 TFLOPS of FP16 performance (576 * 64 * 2 * 1700MHz again), and the RT cores can do up to 10 GigaRays/sec of ray-tracing computation, 25 times faster than what could be done with the general-purpose hardware in Pascal GPUs. Finally, the Turing architecture introduces the ability to run floating-point and integer workloads in parallel, at 16 trillion operations each, which should help improve other aspects of performance.
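Here's the arithmetic behind those throughput figures, as a small Python sketch. Note that the ~1700MHz clock is inferred from the quoted TFLOPS, not an announced spec:

```python
# FP32 throughput: each CUDA core can retire one fused multiply-add
# (2 floating-point ops) per clock.
cuda_cores = 4608
clock_hz = 1.7e9  # ~1700MHz, inferred from the 16 TFLOPS figure
fp32_tflops = cuda_cores * 2 * clock_hz / 1e12
print(f"FP32: {fp32_tflops:.1f} TFLOPS")  # ~15.7, i.e. the quoted 16

# Tensor throughput: 576 Tensor cores, each doing 64 FMAs (128 FP16 ops)
# per clock, matches the quoted 125 TFLOPS.
tensor_cores = 576
fp16_tflops = tensor_cores * 64 * 2 * clock_hz / 1e12
print(f"FP16 (Tensor): {fp16_tflops:.1f} TFLOPS")  # ~125.3
```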

Nvidia appears to have a second Turing design with up to 3,072 CUDA cores, 384 Tensor cores, and RT cores rated at 6 GigaRays/sec. It's possible this is just a harvested version of the larger chip, but that would mean disabling more than a third of the design, which usually isn't necessary. A smaller chip would be a good candidate for the RTX 2060, with the RTX 2070 using a harvested version of the larger design, though the naming may shake out differently. Regardless of which names use which chips, Nvidia will be able to disable certain units within each design, allowing for various levels of performance.

Moving along, Turing will use GDDR6 memory. Based on the Quadro RTX models, there are two chip designs, one with a 384-bit interface and 24GB/48GB of GDDR6, and the other with a 256-bit interface and 16GB GDDR6. Nvidia is avoiding the use of HBM2, due to costs and other factors, and GDDR6 delivers higher performance than GDDR5X. While GDDR6 officially has a target speed of 14-16 GT/s, and Micron has demonstrated 18 GT/s modules, the first Turing cards appear to go with the bottom of that range and will run at 14 GT/s. Nvidia states that Turing will use Samsung 16Gb modules for the Quadro RTX cards, so it looks like it's going whole hog and doubling VRAM capacities for the upcoming generation of graphics cards (unless there will also be 8Gb GDDR6 modules).
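As a sketch of where those capacities come from: each GDDR6 package has a 32-bit interface, so a 384-bit bus implies twelve chips and a 256-bit bus implies eight. With 16Gb (2GB) modules, that yields the quoted 24GB and 16GB; the 48GB option presumably doubles up with two chips per channel.

```python
# VRAM capacity from bus width and module density: each GDDR6 chip
# has a 32-bit interface, so chip count = bus width / 32.
module_gbit = 16  # Samsung 16Gb modules = 2GB each

for bus_bits in (384, 256):
    chips = bus_bits // 32
    capacity_gb = chips * module_gbit // 8
    print(f"{bus_bits}-bit bus: {chips} chips x 2GB = {capacity_gb}GB")

# 384-bit bus: 12 chips x 2GB = 24GB
# 256-bit bus: 8 chips x 2GB = 16GB
```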

With a transfer rate of 14 GT/s, the 384-bit interface has 672GB/s of bandwidth, and the 256-bit interface provides 448GB/s. Both represent massive improvements in bandwidth relative to the 1080 Ti and 1080/1070. We may also see higher clocked GDDR6 designs in the future, potentially with a narrower bus. Nvidia hasn't done a deep dive on the ray-tracing aspects yet, but I suspect having more memory and more memory bandwidth will be necessary to reach the performance levels Nvidia has revealed.
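And the bandwidth numbers are just the transfer rate times the bus width:

```python
# Bandwidth = transfer rate (GT/s) x bus width (bits) / 8 bits per byte.
rate_gtps = 14  # GT/s, the launch speed for GDDR6 here

for bus_bits in (384, 256):
    bandwidth_gbps = rate_gtps * bus_bits / 8
    print(f"{bus_bits}-bit @ {rate_gtps} GT/s: {bandwidth_gbps:.0f}GB/s")

# 384-bit @ 14 GT/s: 672GB/s
# 256-bit @ 14 GT/s: 448GB/s
```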

Nvidia's die shot of Turing shows the largest design, the one with up to 4,608 CUDA cores. At a high level, there are six large groups repeated around the chip. Each group in turn has 24 smaller clusters of chip logic, and within each of those clusters there appear to be 32 small blocks. 24 * 6 * 32 = 4,608, indicating the smallest rectangular shapes are CUDA cores.

If Nvidia sticks with its recent Pascal and Maxwell ratios, four blocks of 32 CUDA cores make up one streaming multiprocessor (SM) of 128 CUDA cores, giving Turing 36 SMs in total. 16 Tensor cores in each SM give the 576 total Tensor cores as well, and each of the six main clusters could come equipped with a 64-bit slice of the memory controller, giving the 384-bit GDDR6 interface.

For the smaller variant of Turing, chop the above design down to four main clusters instead of six. That gives the expected 3,072 CUDA cores across 24 SMs, 384 Tensor cores, and a 256-bit interface. So far so good.
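Putting that breakdown in one place, here's a small sketch that derives both configurations from the assumed ratios. This is inference from the die shot, not an official spec:

```python
# Unit counts for both Turing variants, derived from the die shot and
# assumed Pascal/Maxwell-style ratios (not official specs).
def turing_config(clusters, sm_per_cluster=6, cores_per_sm=128,
                  tensor_per_sm=16, mem_bits_per_cluster=64):
    sms = clusters * sm_per_cluster
    return {
        "SMs": sms,
        "CUDA cores": sms * cores_per_sm,
        "Tensor cores": sms * tensor_per_sm,
        "Memory bus (bits)": clusters * mem_bits_per_cluster,
    }

print(turing_config(6))  # 36 SMs, 4608 CUDA, 576 Tensor, 384-bit
print(turing_config(4))  # 24 SMs, 3072 CUDA, 384 Tensor, 256-bit
```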

The big question is where the RT cores reside. We don't have any figure for how many RT cores are in the architecture, just a performance number of 10 GigaRays/sec. The RT cores are likely built into the SMs, but without a better image it's difficult to say exactly where they might be. The RT cores might also reside in a separate area, like the center block. We'll have to wait for further clarification from Nvidia on this subject.

RTX 2080 release date, price, and specs: Everything we know about Nvidia's next GPU | PC Gamer
 



Strutz1896

But how are GPU prices now? Are they still at ridiculous levels?
 


BigKen

Other than gaming, what are the graphics good for?

Not everyone is an XBoxer or PS4 freak.

Speed and performance are all I ask for. I'm too old to react to computer-generated enemies who just happen to appear after three hundred of their buddies were killed in the same room mazes fifteen minutes ago.

I did the shit for real 55 years ago, and the people who create these games don't have a clue about what's real and what isn't. I watch my grandson play and get frustrated watching him run over the same areas over and over, killing new guys who couldn't have gotten into the area by any vehicle other than a magic carpet.
 

Picklerick 2.0

BigKen said: "Other than gaming, what are the graphics good for? […]"
Lol, this type of graphics is for high performing computers, not consoles.
 

DetroitDevil

BigKen said: "Other than gaming, what are the graphics good for? […]"
An example could be video cameras: they create live images, and those images are interpreted by software, producing massive amounts of data in a short amount of time. Graphics cards (GPUs) focus on this kind of data, and programs/instruction sets are written to process it; GPUs have instruction sets designed for exactly that. So imagine your autonomous vehicle "seeing" what's ahead: the GPU can sift through the data stream, detect the kid crossing the street in the crosswalk, and react. A GPU is superior to a CPU here, since CPUs are more of a jack of all trades. Shitty example, but easy enough to understand.
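To make that concrete, here's a minimal sketch of the difference in Python (using NumPy; the frame and threshold are made up for illustration). A CPU-style loop tests one pixel at a time, while the data-parallel version expresses the same test over the whole frame at once, which is the shape of work a GPU's thousands of cores are built for:

```python
import numpy as np

# A small stand-in for one grayscale camera frame (values 0-255).
frame = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)

# CPU-style: visit each pixel one at a time.
def count_bright_serial(frame, threshold=200):
    count = 0
    for row in frame:
        for pixel in row:
            if pixel > threshold:
                count += 1
    return count

# Data-parallel style: one operation over the entire frame at once.
# This is the pattern GPUs accelerate across thousands of cores.
def count_bright_parallel(frame, threshold=200):
    return int(np.count_nonzero(frame > threshold))

assert count_bright_serial(frame) == count_bright_parallel(frame)
```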
 

DetroitDevil

Picklerick 2.0 said: "Lol, this type of graphics is for high performing computers, not consoles."
Consoles aren't that far behind, especially with the PS5 and whatever the new Xbox is called.
 

moxie

BigKen said: "Other than gaming, what are the graphics good for? […]"
Lately they've been used for mining crypto. That's basically what caused the run-up in prices, and it's the impetus behind Nvidia creating chips specifically for mining so that gamers would still have the products they need.
 

JohnShadows

BigKen said: "Other than gaming, what are the graphics good for? […]"
Machine learning (some call it "AI"). You can use GPUs to train predictive models for image recognition and NLP (natural language processing). They're 13-20 times faster than using a CPU. Data scientists use them for this.

Did a recent PC build, still got the EVGA GTX 1070, but I'm looking for a used 2080 Ti with the 11GB. Was hoping to see some CL listings from folks who spent their stimmy checks, then have to pay the rent this month, and are listing. But so far 8GB cards are all I've seen.
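For anyone curious what that machine learning use looks like in practice, here's a minimal sketch of a single training step in PyTorch. The model and data are toys; the point is that the same code targets a GPU just by moving tensors to a CUDA device:

```python
import torch

# Same code runs on CPU or GPU; moving the model and tensors to a CUDA
# device is what hands the matrix math to the GPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(784, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

# One fake training step on random "image" data.
x = torch.randn(64, 784, device=device)
y = torch.randint(0, 10, (64,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"device={device}, loss={loss.item():.3f}")
```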
 

JohnShadows

moxie said: "Lately they've been used for mining crypto. […]"
^^^ And this, big-time.
 

BigKen

Thanks guys. I see the value going forward.

There are things I just don't get. The best example is that women can get bigger boobs, bigger butts, but I've never been able to get a bigger "Little Head". Either longer or thicker. Another feminist thing??
 