FreeSync 2 in action: How good is it (right now)?







FreeSync 2 is AMD's display technology for the next generation of HDR gaming monitors. In a previous article we covered everything you need to know about the technology; now we'll share our impressions after using one of these monitors across a range of games, and weigh in on whether a FreeSync 2 monitor is worth buying right now.



The monitor I used to test FreeSync 2 is the Samsung C49HG90, an extremely wide display that is effectively two 1080p screens side by side, for a total resolution of 3840 x 1080. It has an 1800R curve, uses VA panel technology, and is certified for DisplayHDR 600, which means a peak brightness of 600 nits, coverage of at least 90% of the DCI-P3 gamut, and basic local dimming.




While this panel doesn't meet the full DisplayHDR 1000 spec, with its 1000 nits of peak brightness for optimal HDR, the Samsung CHG90 offers more than just a basic entry-level HDR experience.





Plenty of HDR-compatible panels can't push their brightness above 400 nits and don't support a gamut wider than sRGB, but Samsung's latest Quantum Dot monitors deliver both higher brightness and a wider gamut.



Despite being advertised on Samsung's website, the CHG90 does not support FreeSync 2 out of the box; you have to download and install a firmware update for the monitor, which isn't a great experience. As previously explained, AMD announced FreeSync 2 in early 2017, but this is the first generation of products to actually support the technology.





In many cases, users will buy this monitor, hook it up to their PC without performing a firmware update, and assume FreeSync 2 is working as expected. The need to update the firmware isn't well advertised on Samsung's website (it's hidden in a footnote), and updating a monitor's firmware is not common practice.



If you buy a supported Samsung Quantum Dot monitor, make sure it is running the latest firmware that introduces FreeSync 2 support; if it is, the Information tab of the on-screen display will show a FreeSync 2 logo.



AMD's graphics driver and software utility exacerbate this firmware problem. While Radeon Settings indicates when your GPU is connected to a FreeSync display, it makes no distinction between FreeSync and FreeSync 2. There is no way to tell from Radeon Settings, or anywhere in Windows, that your system is connected to a FreeSync 2 display, so there's no way to check whether FreeSync 2 is working, whether your monitor supports it, or whether it's enabled.





There was no change in how Radeon Settings reported FreeSync support after we updated our monitor to the FreeSync 2 firmware. This is bound to confuse users and needs AMD's attention.



So, how does FreeSync 2 actually work and how do you configure it?



If FreeSync is enabled in Radeon Settings and in the monitor's on-screen display (and both are enabled by default), it should be ready to go. There is no magic toggle to make everything work and there are no real configuration options; the key features, such as low latency and low framerate compensation, are permanently enabled, or in the case of HDR, ready to be used.



Using FreeSync 2's HDR capabilities requires you to activate HDR whenever you want to use it. For the Windows 10 desktop and its apps, that means opening the Settings menu, heading to the display settings, and enabling "HDR and WCG". This switches the Windows desktop environment into HDR mode, and any application that supports HDR can then send its HDR data directly to the monitor via HDR10. For standard SDR applications, which is most Windows applications today, Windows 10 attempts to map SDR colors and brightness into the HDR format, because it can't automatically switch modes on the fly.
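For developers (or the curious), there is a programmatic way to confirm what this toggle actually does: ask DXGI which color space the desktop is outputting. A minimal sketch using the DXGI 1.6 APIs on Windows 10, with error handling and COM cleanup omitted for brevity:

```cpp
#include <dxgi1_6.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);      // first GPU

    IDXGIOutput* output = nullptr;
    adapter->EnumOutputs(0, &output);         // first attached display

    IDXGIOutput6* output6 = nullptr;
    output->QueryInterface(IID_PPV_ARGS(&output6));

    DXGI_OUTPUT_DESC1 desc = {};
    output6->GetDesc1(&desc);

    // An HDR10 desktop reports the BT.2020 + PQ color space.
    bool hdrActive =
        desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709_P2020;
    printf("HDR active: %s, reported peak: %.0f nits\n",
           hdrActive ? "yes" : "no", desc.MaxLuminance);
}
```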








While Windows 10 has improved its HDR support with each major update, it's not yet at the point where SDR is correctly mapped to HDR. With HDR and WCG enabled, SDR applications appear washed out and insufficiently bright. Some apps, like Chrome, are outright broken in HDR mode. Windows does have a slider to change the base brightness for SDR content, but with our Samsung test monitor the maximum brightness it allows for SDR content in this mode is about 190 nits, well below the 350 nits the monitor manages with HDR disabled.





Now, 190 nits of brightness is probably fine for a lot of users, but it's a bit strange that the slider doesn't cover the monitor's full brightness range. It's also a separate control from the brightness setting in the monitor's on-screen display; if the monitor's brightness is set below 100, you'll get less than 190 nits when displaying SDR content.
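The exact SDR white level Windows applies can also be read from the documented display-config APIs, which is how a figure like our ~190-nit ceiling could be verified. A sketch, assuming a Windows 10 build recent enough to expose DISPLAYCONFIG_SDR_WHITE_LEVEL (Creators Update or later), with error handling trimmed:

```cpp
#include <windows.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "user32.lib")

int main() {
    UINT32 numPaths = 0, numModes = 0;
    GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &numPaths, &numModes);

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(numPaths);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(numModes);
    QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths.data(),
                       &numModes, modes.data(), nullptr);

    for (const auto& path : paths) {
        DISPLAYCONFIG_SDR_WHITE_LEVEL white = {};
        white.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
        white.header.size = sizeof(white);
        white.header.adapterId = path.targetInfo.adapterId;
        white.header.id = path.targetInfo.id;
        if (DisplayConfigGetDeviceInfo(&white.header) == ERROR_SUCCESS) {
            // SDRWhiteLevel is in units of 1/1000 of 80 nits.
            printf("SDR white level: %.0f nits\n",
                   white.SDRWhiteLevel / 1000.0 * 80.0);
        }
    }
}
```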



If all this sounds confusing, that's because it is. The entire HDR implementation on the Windows desktop is a bit of a mess, and if you can believe it, earlier versions of Windows 10 were even worse.



This applies not just to FreeSync 2 monitors, but to all HDR displays connected to Windows 10 PCs. For now, we recommend disabling HDR and WCG when using the Windows 10 desktop and enabling it only when you want to run an HDR application; that way you'll get the best SDR experience in the vast majority of applications, which currently don't support HDR.



So what about games? Surely this is the area where FreeSync 2 and HDR monitors really shine, right? Well, it depends. We tried a range of games that currently support HDR on Windows, and we came away disappointed. HDR implementations differ from game to game, and it seems many developers have no idea how to correctly tone map their games for HDR.





The worst of the lot are EA's games. Mass Effect Andromeda's HDR implementation was infamous back when HDR monitors were first shown off, and the curse of bad HDR continues in Battlefield 1 and the new Star Wars Battlefront 2. Both games show washed-out colors in HDR mode that look far worse than the SDR presentation, compounded by a generally dark tone to the image and little use of HDR's spectacular bright highlights. In all three of these EA games that support HDR, there is no reason to enable it, because SDR simply looks better.



We don't know exactly why these games look so bad; reports suggest EA's titles are just as poor on TVs with better HDR support, and on consoles. We suspect something is fundamentally broken in how EA's Frostbite engine handles HDR, and we hope it can be fixed for upcoming games.





Hitman is one of the oldest games to support HDR, and it doesn't handle HDR well either. While the presentation isn't as washed out as EA's titles, the colors are still dull and the image as a whole is too dark, with little (if any) use of impressive highlights. The point of HDR is to expand the color gamut and increase the brightness range in use, but in Hitman everything just seems darker and less vibrant. Again, this is a game you should play in SDR mode.
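To make "increasing the brightness range" concrete: HDR10 encodes absolute luminance with the SMPTE ST 2084 (PQ) curve, which spends most of its code values on the darker end and reserves the top of the range for highlights all the way up to 10,000 nits. A small self-contained sketch of the PQ encode (inverse EOTF), using the constants from the spec:

```cpp
#include <cmath>
#include <cstdio>

// Maps an absolute luminance (0..10000 nits) to a 0..1 HDR10 signal value
// using the SMPTE ST 2084 (PQ) inverse EOTF.
double pq_encode(double nits) {
    const double m1 = 2610.0 / 16384.0;         // 0.1593...
    const double m2 = 2523.0 / 4096.0 * 128.0;  // 78.84375
    const double c1 = 3424.0 / 4096.0;          // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;   // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;   // 18.6875
    double y  = nits / 10000.0;                 // normalize to PQ's 10k-nit ceiling
    double ym = std::pow(y, m1);
    return std::pow((c1 + c2 * ym) / (1.0 + c3 * ym), m2);
}

int main() {
    // SDR-reference white (100 nits) uses only about half the code range;
    // everything above it is headroom for bright highlights.
    printf("100 nits   -> %.3f\n", pq_encode(100.0));    // ~0.508
    printf("600 nits   -> %.3f\n", pq_encode(600.0));    // CHG90-class peak, ~0.70
    printf("10000 nits -> %.3f\n", pq_encode(10000.0));  // 1.000
}
```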



Assassin's Creed Origins has an interesting HDR implementation because it lets you adjust the brightness ranges to match your display's exact specifications. We're torn on whether the game looks better in HDR or SDR; HDR seems to deliver better highlights and a wider color gamut during the day, but suffers from a strange lack of depth at night that oddly makes night scenes look less like night. SDR mode looks better during those nighttime periods and is only slightly behind the HDR presentation during the day.








On a display with a proper full local dimming feature, which this Samsung monitor doesn't have, Assassin's Creed Origins would look more appealing, but it's still not the best HDR implementation we've ever seen.





The best game for HDR is, by far, Far Cry 5. AMD tells us it will be the first game to support FreeSync 2's GPU-side tone mapping in the coming weeks, although the game doesn't support that HDR pipeline just yet. Instead, as with most HDR games, Far Cry 5 currently outputs HDR10, which is sent to the display for further tone mapping there.
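To illustrate what GPU-side tone mapping buys you: the game can compress its scene luminance directly into the panel's native range, leaving the monitor's own (often crude) tone mapper with nothing to do. The sketch below is purely illustrative and is not AMD's API; it assumes the display's min/max luminance has already been queried from the driver (FreeSync 2 exposes this to games, for example through AMD's AGS SDK) and uses a simple Reinhard-style curve:

```cpp
#include <algorithm>

// Hypothetical struct: in a real engine this would be filled from the
// driver's FreeSync 2 display query, not hard-coded.
struct DisplayRange {
    float minNits;   // e.g. the panel's black floor
    float maxNits;   // e.g. 600.0f for a DisplayHDR 600 panel
};

// Reinhard-style curve anchored to the panel's peak brightness: scene
// luminance is rolled off smoothly so nothing exceeds what the display
// can actually show, leaving the monitor no further tone mapping to do.
float tonemapToDisplay(float sceneNits, const DisplayRange& d) {
    float x = sceneNits / d.maxNits;        // normalize to the panel's peak
    float compressed = x / (1.0f + x);      // soft roll-off for highlights
    float outNits = compressed * d.maxNits;
    return std::max(outNits, d.minNits);    // respect the panel's black floor
}
```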



Unlike most other games, Far Cry 5's HDR10 output is pretty good. The color gamut is clearly expanded, producing more vibrant colors, and there's none of the washed-out look of many other HDR titles. Bright highlights are genuinely brighter in HDR mode, with great dynamic range, and overall this is one of the few titles that looks significantly better in HDR. Nice work, Ubisoft.





Middle-earth: Shadow of War is another game with a decent HDR implementation. With HDR enabled, the game uses a much wider color gamut and its highlights are brighter. Again, there are no issues with dull colors or a washed-out presentation, so the HDR mode improves on the SDR presentation in virtually every way possible.



How you activate HDR in these games isn't always the same. Most titles have a built-in HDR toggle that overrides the Windows HDR and WCG setting, letting you leave the desktop in SDR and simply enable HDR in the games you want to play in HDR mode.
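Under the hood, an in-game HDR toggle like this usually amounts to switching the game's swap chain to a 10-bit format and tagging it with the HDR10 color space, independently of the desktop setting. A minimal DXGI sketch, assuming an existing swap chain (here called swapChain) whose back-buffer references have already been released before the resize:

```cpp
#include <dxgi1_6.h>

// Flip an existing swap chain to HDR10 output, the way most in-game
// HDR toggles work. Error handling omitted for brevity.
void enableHdr10(IDXGISwapChain4* swapChain, UINT width, UINT height) {
    const DXGI_COLOR_SPACE_TYPE hdr10 =
        DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709_P2020;  // BT.2020 + PQ

    UINT support = 0;
    swapChain->CheckColorSpaceSupport(hdr10, &support);
    if (!(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return;  // the current display path can't present HDR10

    // 10-bit back buffers, then tag the swap chain with the HDR10 color space.
    swapChain->ResizeBuffers(2, width, height,
                             DXGI_FORMAT_R10G10B10A2_UNORM, 0);
    swapChain->SetColorSpace1(hdr10);
}
```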



Hitman is an interesting case: it has an HDR toggle in its game settings, but it shows a black screen if the Windows HDR toggle is also enabled. Shadow of War has no HDR toggle at all and instead defers to the Windows HDR and WCG setting, which is annoying, because you have to flip between HDR and SDR manually to get the optimal HDR experience in the game and a decent SDR experience on the desktop.








While the HDR experience in a lot of games right now is rather poor, often genuinely worse than the basic SDR presentation, we think there are reasons to be optimistic about the future of HDR gaming on PC. Newer games like Far Cry 5 and Shadow of War have fairly decent HDR implementations that meaningfully improve on SDR mode, while many of the games with bad HDR implementations are a bit older.



As the HDR ecosystem matures, we should see more Far Cry 5s and fewer Mass Effect Andromedas in terms of HDR implementation.





We're also not yet at the stage where games are using FreeSync 2's GPU-side tone mapping. As mentioned, Far Cry 5 will be the first to do so in the coming weeks, and AMD says more games should arrive later in the year with FreeSync 2 support ready right out of the box.



It will be interesting to see how GPU-side tone mapping pans out, but it certainly has the potential to improve HDR implementations for PC gaming.



However, as things stand, we see little reason to buy a FreeSync 2 monitor until more games ship with decent HDR. It's just too hit-and-miss (and mostly miss) to justify the significant investment a first-generation FreeSync 2 HDR monitor demands. This isn't the kind of technology to adopt right now: later in the year we should have a wider range of HDR monitors to choose from, potentially with better HDR support through higher brightness, better local dimming, and wider gamuts. By then we should also have a clearer view of the HDR gaming ecosystem, with a few more games whose HDR implementations are done right.



That's not to say you should avoid these Samsung Quantum Dot FreeSync 2 monitors; in fact, they're quite good as gaming monitors. Just don't buy one specifically for its HDR capabilities, or you may find yourself a little disappointed right now.




