The third and final installment of our "Needs Fixing" series focuses on Nvidia. After discussing what we think Intel and AMD can do with their upcoming products to become more consumer friendly, it's now time to look at the green team. As before, we approach this from the consumer's point of view rather than as an industry analysis. Here goes...
Deceptive Product Names
After the GTX 970 mess, we thought Nvidia would want to play it safe with the GeForce 10 series. But since the 970 was one of their best-selling GPUs, clearly they didn't see the need. Nvidia started strong in May 2016 with the GTX 1080, followed by the 1070 in June and the 1060 in July. It was a strong lineup covering the $250, $400 and $600 price points.
By the end of 2016, Nvidia threw us a curveball with the 3GB 1060, a new model with half the VRAM, which on its own was no big deal. However, they also disabled an SM unit, taking the CUDA core count from 1280 to 1152. This meant that despite carrying the same name, it actually had 10% fewer cores.
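The 10% figure follows directly from the core counts. A quick sanity check (SM breakdown per Pascal's 128-cores-per-SM layout):

```python
full_cores = 1280  # GTX 1060 6GB: 10 SMs x 128 CUDA cores
cut_cores = 1152   # GTX 1060 3GB: 9 SMs x 128 CUDA cores

reduction = (full_cores - cut_cores) / full_cores
print(f"{reduction:.0%} fewer cores")  # 10% fewer cores
```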
Calling the 3GB model a GTX 1060 was misleading, since the specifications suggested that only the VRAM capacity had changed. Ideally, Nvidia should have called this model the GTX 1050 Ti; this was before the actual 1050 Ti launched in October. Admittedly, this isn't something I came down hard on Nvidia about at the time. I've called them out several times since, but I regret not making more noise in our day-one coverage.
It seems Nvidia didn't care about the negative feedback they received from some critics and many customers, as they subsequently released a 5GB 1060, and although this model has all cores enabled, the reduced memory capacity in this case also means less memory bandwidth. More recently, they also released a 3GB version of the GTX 1050 that has less bandwidth than the original 2GB and 4GB models. So we want to see Nvidia do better here and stop with the misleading product names, and we'll be making a greater effort to push back on this anti-consumer practice.
Bait and switch
If you thought for a second that we'd missed the whole GT 1030 fiasco when discussing misleading product names, we didn't. This one is so bad we think it deserves its own category. The 3GB version of the GTX 1060 was certainly misleading, but at around 20% less than the 6GB version, those gaming at 1080p should only see a ~7% drop in performance. So in terms of value it's actually pretty good. I'm not justifying it with that, just noting it's not a bad proposition.
However, what Nvidia recently pulled with the GT 1030 is genuinely disgusting. Although the core configuration was untouched, performance was severely compromised by swapping the GDDR5 memory for DDR4, the kind modern desktops use for system memory.
The end result is almost three times less memory bandwidth, dropping from an already anemic 48 GB/s to just 16.8 GB/s. This means that in memory-intensive workloads, in other words games, the new DDR4 version is often up to twice as slow. And it's not half the price either. It's $10 less. Yes, $10.
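Those bandwidth figures fall straight out of the memory specs. A rough sketch, with effective transfer rates and bus widths as per the cards' published specifications:

```python
def bandwidth_gbps(effective_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second x bytes per transfer."""
    return effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

gddr5 = bandwidth_gbps(6000, 64)  # GT 1030 GDDR5: 6,000 MT/s effective, 64-bit bus
ddr4 = bandwidth_gbps(2100, 64)   # GT 1030 DDR4:  2,100 MT/s effective, 64-bit bus

print(f"GDDR5: {gddr5:.1f} GB/s")                   # GDDR5: 48.0 GB/s
print(f"DDR4:  {ddr4:.1f} GB/s")                    # DDR4:  16.8 GB/s
print(f"{gddr5 / ddr4:.1f}x less bandwidth")        # 2.9x less bandwidth
```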
It's incredibly misleading. Budget gamers are the ones most at risk here, and while it's wrong to take advantage of anyone, fleecing those who can least afford to get into gaming is particularly unpleasant in our opinion.
Pricey "Founders Edition"
With the release of the GeForce 10 series, Nvidia introduced the "Founders Edition", a sexy-looking reference card that nobody should have bought, and we still say as much today. In terms of quality, the Founders Edition models aren't bad: the thermals leave a little to be desired, but the overall build quality is good and they look great.
Personally, I have no real problem with FE cards, but the reader consensus was to include them on this list, and I can certainly understand where you're all coming from. Nvidia charges a premium for reference graphics cards and often makes them the only choice for early adopters. That said, you do have the option to wait and get a partner card at a lower price, which is what we recommended in our day-one GTX 1080 review.
If you're after a Titan, you're forced to buy a reference card, but hey, we understand the market dynamics here too: Nvidia is simply reserving the big guns for itself, which works well for brand marketing and profit margins.
But back to the Founders Edition cards: this isn't a huge problem for us, and charging a premium is somewhat understandable. Not letting partners compete with custom designs at launch, however, is far less palatable. So ideally we'd like to see Nvidia do what it did with the GTX 1070 Ti release, where custom partner cards were available from day one.
Drip-Feeding Products: Titan X, GTX 1080 Ti, Titan Xp
Speaking of the GTX 1070 Ti, we'd like Nvidia to stop drip-feeding us new GPUs. Sure, release a new generation over a few months if you must, but after that, stop it with the endless cut-down models that share the same names as the full parts. Stop it with the cards we don't need, and stop screwing your most loyal customers in the process.
We've already covered the dubiously named models, so no need to go over that again. As for the drip-fed releases, we don't really need late-cycle products like the GTX 1070 Ti. It's not terrible, but it can confuse people.
The big problem with releasing so many products in the same series is that Nvidia keeps tripping over itself. If you bought a Titan X at the end of 2016, you probably felt pretty good about yourself: 40% more CUDA cores than the flagship GeForce part meant it was a real beast at the time. Of course, you also paid twice the price, which made absolutely no sense in terms of cost per frame, but it was still the undisputed king of the hill.
Seven months later, Nvidia said, "I hope you enjoyed those bragging rights, because now we're offering essentially the same product for $500 less." Then a month after that, they said, "Haven't upgraded to a 1080 Ti yet? No? Still in denial, huh? Well, how about a Titan Xp, the 'p' is for taking-the-piss edition. We've added 7% more CUDA cores, and you can pay $1,200 all over again."
Oh, and if you're a Star Wars fan, you only had to wait another six months for the Titan Xp Collector's Edition. It's not easy being an Nvidia fan with deep pockets.
Calm down with GPU prices
This is a tricky one to address because Nvidia can only charge as much for a graphics card as you're willing to pay. We also have to factor in cryptocurrency mining, and we've seen how that can explode GPU demand to the point where Nvidia's and AMD's MSRPs mean nothing.
In 2010, Nvidia charged $500 for its flagship part, the GeForce GTX 480. They did the same with the GTX 580, but then followed with the dual-GPU GTX 590 at $700, and that's probably when Nvidia started to realize that some groups of gamers are willing to pay more for premium graphics cards. By the time the GTX 600 series launched in 2012, they were well aware of it, and we got the GTX 690 for $1,000. It made no sense at that price, but it still sold.
With confirmation that Nvidia could sell $1,000 graphics cards, the Titan lineup was born in 2013, and this helped justify a $700 asking price for the GeForce flagship. You could argue the Titan Z was something of an experiment to see how much some people would pay for bragging rights, although it's also a compute-heavy product, so perhaps justified.
Armed with this information, the Pascal Titan series climbed to $1,200, and now gamers everywhere are worried about the cost of next-generation models (the Titan V is $2,999). Much-needed competition from AMD would certainly help here, but even so, the days of $500 flagship gaming GPUs are well and truly over.
Nvidia GameWorks
First off, let me say that I don't think the GameWorks program is as bad as some would like to believe, but there have certainly been some sneaky tactics that need to stop.
For those of you not familiar with GameWorks, Nvidia provides developers with a set of proprietary technologies that let them include cutting-edge effects such as realistic hair, destruction and shadows without having to build them from scratch. However, Nvidia often seems to go out of its way to make sure these effects run poorly on AMD GPUs, and sometimes they do so at the expense of their own GeForce customers.
We've seen evidence of certain technologies being abused with the intention of giving Nvidia an advantage, for example the use of tessellation in Crysis 2 and The Witcher 3. This caused a backlash from gamers, even those on Nvidia hardware, because these dirty tricks often hurt GeForce owners too. We've seen examples where Nvidia would take an unnecessarily large performance hit rather than optimize an effect, simply to ensure the hit for AMD is even greater. Nvidia denies these claims, but we've seen some pretty solid evidence that this is actually happening.
At release, we observed a huge performance hit on both AMD and Nvidia hardware when benchmarking The Witcher 3 with HairWorks enabled. I reported a 55% reduction in performance for the GTX 980, but a 67% drop for the Radeon R9 290. With HairWorks turned off, the GTX 980 was 16% faster than the R9 290, already a pretty big win for Nvidia. But with HairWorks enabled, the GeForce GPU was now 56% faster.
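Those percentages hang together. A rough consistency check, assuming the reported figures are rounded:

```python
geforce_lead_off = 1.16      # GTX 980 is 16% faster with HairWorks off
gtx980_retained = 1 - 0.55   # GTX 980 keeps 45% of its performance with HairWorks on
r9_290_retained = 1 - 0.67   # R9 290 keeps 33% of its performance with HairWorks on

# Relative GeForce lead once both cards take their respective hits
lead_on = geforce_lead_off * gtx980_retained / r9_290_retained
print(f"GTX 980 is ~{lead_on - 1:.0%} faster with HairWorks on")  # ~58%
```

The ~58% result lands close to the reported 56%, with the gap explained by the inputs being rounded to whole percentages.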
There were some problems with it, though. First, the GameWorks feature meant GTX 980 owners went from minimums of just over 60 fps at 1080p with HairWorks off, to under 30 fps with it enabled. This led us to recommend at the time that no one use HairWorks.
However, it was later discovered that Nvidia was abusing their tessellation performance advantage to the detriment of their own customers. By default, the tessellation level was set to x64, despite there being no real visual difference between x16 and x64, even when analyzing side-by-side screenshots. In fact, x16 was barely better than x8, while x8 was really only slightly better than x4, and dropping down to x2 made the effect look rather mediocre.
Reducing the tessellation level from x64 to x16 more than doubled frame rates in scenes that made heavy use of HairWorks, while x8 boosted performance by a further 20%. In short, tessellation at x8 provided essentially the same visuals with very little performance impact compared to disabling the technology entirely.
You can't really blame the developer here. It's well documented that Nvidia worked very closely with The Witcher 3 team, so this was either sabotage or incompetence. We've also seen much more recent examples of unjustified performance hits in GameWorks titles, but thankfully we're all much more aware of what's going on now, and the Final Fantasy 15 problems were resolved quickly after the initial benchmarking was complete. The developer also seemed quite happy to take the blame on that one, so maybe it was just an honest mistake.
Logging into GeForce Experience
We haven't given AMD any grief over driver support, and we won't give Nvidia any either. Sometimes the display drivers aren't as good as they could be, but I think for the most part both companies do a good job. After all, making sure the current range as well as a number of previous-generation GPUs all work across a wide variety of games can't be an easy task, so I'll cut them some slack there.
We're not fans of GeForce Experience, though. The software can be quite clunky at times and in many ways it feels bloated. My biggest complaint, however, is that Nvidia requires you to log in to use the software, which in my opinion is completely unnecessary.
This isn't new. Nvidia made this change years ago, and we probably didn't push back hard enough at the time. I know some of you may not see it as a big deal, and for many of you it will be rare to have to log in manually. Even so, you'd think that paying hundreds of dollars for a GeForce GPU would be enough to avoid having to hand over your details.
Maybe we're nitpicking with this one, and technically GeForce Experience is optional software, but we think it would be ideal to get its benefits with no strings attached.
Stop Anti-Consumer Practices: GPP
If you're reading this article, you've probably heard about the GeForce Partner Program, or GPP. If not, here's a quick refresher: Nvidia tried to convince all its partners, such as Asus, Gigabyte, MSI and others, to sign a document called the GeForce Partner Program that would impose heavy restrictions on how they operate.
The worst parts of the document, which was leaked earlier this year by HardOCP, suggested that if a company didn't align its gaming brand exclusively with GeForce, it would no longer receive significant development funds and other incentives.
This is another of those anti-competitive, anti-consumer moves Nvidia seems to like pulling from time to time. Rather than simply producing attractive products and competing fairly, Nvidia feels the need to shut out competitors using sneaky tactics. GeForce graphics cards, especially at the high end, are already the best buys, and Nvidia holds far more market share than AMD, so it's odd that Nvidia continually feels the need to bury its competitors.
There's no need to post garbage statements on your website about how these dubious deals "benefit" gamers and are all about "transparency". Just play fair, play nice, and actually do right by consumers.
G-Sync Monitors
What G-Sync brings to monitors is pretty good: adaptive sync support, mandatory low framerate compensation, monitor validation, and so on. You can buy a G-Sync monitor knowing you're getting something of quality. But the way Nvidia has implemented and locked down G-Sync is hostile to buyers.
G-Sync requires a dedicated hardware module, and that module only works with Nvidia graphics cards. It's also quite expensive, adding about $200 to the price of a given monitor. This means any Nvidia GPU owner wanting a G-Sync monitor must spend $200 more than AMD GPU owners, who have access to the very similar FreeSync technology.
And there's no reason Nvidia GPUs couldn't support FreeSync, other than Nvidia wanting to lock its own customers out of competing technologies. FreeSync is an open standard, in fact it's just VESA Adaptive-Sync, so Nvidia is free to implement support whenever it sees fit, giving Nvidia GPU buyers access to cheaper adaptive sync monitors. But they don't, because this way they can force gamers to buy G-Sync monitors, while locking AMD owners out of G-Sync and making it harder to switch GPU ecosystems.
Final remarks
Although this list is almost entirely negative, that doesn't mean Nvidia hasn't released genuinely incredible products with its GeForce 10 series. We love the GTX 1080 Ti, it's an amazing flagship GPU. Scratch that, we really like most of the Pascal lineup (and have repeated recommendations to prove it), which makes products like the DDR4 GT 1030 all the more disappointing.
This is simply a wish list and we don't expect Nvidia to address all of these points. Some would genuinely affect their bottom line, which makes them even harder to implement, but having dug deeper into what they could do to become more consumer friendly, it would be good to see at least some of these addressed in the coming year.
TechSpot Series: Needs Fixing
After attending Computex 2018, the very PC-centric trade show, we found ourselves internally discussing areas where Intel, AMD, and Nvidia could improve to become more consumer friendly. By the end of that discussion, we realized it would make a good column, so we're doing one for each company.