Nvidia GeForce tuning guide: 5 tips to optimize your RTX graphics card
    Nvidia’s GeForce RTX 40-series and 30-series graphics cards are already very powerful out of the box, particularly for gaming. But of course, tuning can bring out even more performance, efficiency, and image quality. Here, we’ll provide the best tips for optimizing Nvidia’s modern GeForce GPUs, showing which features you can use to make your gaming experience better than ever.

    AMD gamers will want to check out our guide to optimizing Radeon graphics cards instead.

    Before we begin, the following GeForce models work best for these optimizations, though some of the features and tricks highlighted below will also work with older GPUs, particularly the RTX 20-series:

    Nvidia GeForce RTX 4000 (“Ada Lovelace”)


Nvidia GeForce RTX 3000 (“Ampere”)



    Tip 1: Faster through overclocking


Users who want to overclock their PC's graphics card often start out with many open questions, and they should pay attention to a few basic things to ensure the project is a success.

In order to overclock a graphics card, or more precisely its graphics processor, you need to do some preparation. First, you should know exactly which graphics card is installed and what its technical specifications are (e.g. clock frequencies).

    As we point out in our roundup of the best graphics cards for PC gaming, the Nvidia GeForce RTX 4090 is hard to beat for maximum capability.

    GeForce RTX 4090 Founders Edition


    Using the GeForce RTX 4090 as an example, in this case the Founders Edition from Nvidia, we then determine all the specifications relevant for overclocking. These can usually be found on the product page or in the user manual.

Next, we verify those specifications on the graphics card itself using a suitable program. To check the parameters already mentioned, such as the base, boost, and memory clocks, we recommend the small but powerful tool GPU-Z, developed by TechPowerUp and available as a free download.
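
If you prefer a scriptable way to cross-check these values, Nvidia's NVML interface (the same data source used by nvidia-smi) can be queried from Python. The following is only a minimal sketch and assumes the nvidia-ml-py package, imported as pynvml, is installed alongside a current GeForce driver:

# Read basic clock specifications via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    name = pynvml.nvmlDeviceGetName(handle)
    # Maximum clocks reported by the driver, in MHz
    max_gpu = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    max_mem = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_MEM)
    # Clock at the moment of the query, in MHz
    current = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)

    print(f"GPU: {name}")
    print(f"Maximum GPU clock: {max_gpu} MHz, maximum memory clock: {max_mem} MHz")
    print(f"Current GPU clock: {current} MHz")
finally:
    pynvml.nvmlShutdown()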

In the case of the GeForce RTX 4090 Founders Edition, GPU-Z reports the following technical specifications:

    Nvidia GeForce RTX 4090 Founders Edition

• 2,230 MHz base clock
• 2,520 MHz boost clock (maximum)
• 24 GB GDDR6X graphics memory at 21 Gbps (10.5 GHz)
• 1,008 GB/s memory bandwidth (see the quick check below)
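
The bandwidth figure follows directly from the memory data rate and the width of the memory interface. As a quick sanity check in Python (the 384-bit bus width of the RTX 4090 is taken from the card's public specifications, not from the list above):

# Rough memory-bandwidth check for the GeForce RTX 4090.
data_rate_gbps = 21    # GDDR6X effective data rate per pin, in Gbps
bus_width_bits = 384   # memory interface width of the RTX 4090 (assumed from public specs)

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8  # convert bits to bytes
print(f"{bandwidth_gb_s:.0f} GB/s")  # prints 1008 GB/s, matching the value above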

Our goal is to raise the maximum possible boost clock, and with it the average clock frequency in games, and optionally to increase the memory bandwidth by using a higher memory clock.

To achieve this, owners of an Nvidia graphics card, as in our example, can rely on the powerful MSI Afterburner toolbox. With the help of its integrated, fully automatic “OC Scanner” feature, the maximum stable clock frequency of the graphics card can be determined with very little user intervention.

    This MSI video explains how this works in just a few steps.


After the overclocking has been carried out with the help of the OC Scanner, the on-screen display (OSD) and the useful “Video Capture” feature help you read out and evaluate the newly achieved clock frequencies and record your gameplay.

    MSI also explained these features in an easily understandable way in a corresponding how-to video.
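
If you would rather log these values yourself instead of relying solely on the Afterburner OSD, a small NVML polling loop can record clock speed, temperature, and power draw while a game is running. Again, this is just a sketch built on the pynvml bindings:

# Log GPU clock, temperature, and power draw once per second during gameplay.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)    # MHz
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)  # °C
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000                        # mW -> W
        print(f"{time.strftime('%H:%M:%S')}  {clock} MHz  {temp} °C  {watts:.0f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass  # stop logging with Ctrl+C
finally:
    pynvml.nvmlShutdown()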


Overclocked accordingly, you can achieve roughly 10 to 15 percent more performance compared to the factory settings, depending on the model and the quality of the graphics processor.

    Important note: By overclocking or undervolting the GPU, you change important parameters such as the power dissipation (TDP), heat generation, or power consumption of the graphics card. Please note that incorrectly set voltages can not only result in an unstable system, but also irreparable damage to the GPU. Be careful!

    Tip 2: More efficient through undervolting


Manufacturers always build a safety buffer into the factory settings of their graphics cards for the sake of maximum stability and therefore apply a little more voltage to the chip than is strictly necessary. This voltage can be lowered within a certain range without causing crashes or instability.


The supreme discipline of graphics card tuning is to run the card at the same or even higher performance while drawing noticeably less power. To do this, you lower the voltage of the graphics processor and thus “undervolt” it.

To undervolt the graphics card, you no longer necessarily need a tool like MSI Afterburner or Asus GPU Tweak III. On the Nvidia side, the relevant GPU settings can also be adjusted via GeForce Experience under “GPU Tuning.”

    To reduce the power consumption of the graphics card, either the supply voltage or the power limit must be reduced. It doesn’t really matter which method you ultimately decide on — though especially with the latest GeForce RTX 40-series (“Ada Lovelace”), it has proven to be most efficient to slightly reduce the power limit.
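
For readers who prefer a script to a GUI, the power limit can also be lowered through NVML. Treat the following as a hedged sketch rather than a replacement for Afterburner or GeForce Experience: it needs administrator rights, the setting reverts after a reboot, and the 90 percent target is only an example value.

# Lower the GPU power limit to roughly 90 percent of its default via NVML.
# Requires administrator/root rights; the change is reset on reboot.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)          # milliwatts
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)  # allowed range

    target_mw = max(min_mw, int(default_mw * 0.90))  # 90% target is an example value
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

    print(f"Default limit: {default_mw / 1000:.0f} W, new limit: {target_mw / 1000:.0f} W")
finally:
    pynvml.nvmlShutdown()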

The following tools, among others, are suitable for overclocking and undervolting:

• MSI Afterburner
• Asus GPU Tweak III
• EVGA Precision X1
    Note: The manufacturer tools also work without problems with graphics cards from other manufacturers, so you can use EVGA’s Precision X1 on an Asus GPU, and so on.

    Tip 3: More stable through stress tests


    To put an overclocked graphics card through its paces in terms of stability after overclocking and undervolting, you can use the following benchmarks and stability tests:


    Since the loads (and especially the load peaks) for the GPU and the VRAM are different in games, some of your preferred titles should also be played and checked for stability. Ideally, the result is a faster and more efficient graphics card that can be operated with absolute stability. If your overclock isn’t stable in real-world use, you’ll either need to dial it back a bit, or apply a bit more power to your GeForce graphics card.
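
A practical way to judge stability during such test sessions is to watch whether the card holds its clocks and stays at sensible temperatures; sudden clock drops under constant load, artifacts, or driver resets are signs that the overclock needs to be dialed back. The sketch below simply records the lowest and highest values seen during a run (illustrative only, again using pynvml):

# Track the clock range and peak temperature while a stress test or game runs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

min_clock = max_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
peak_temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        min_clock, max_clock = min(min_clock, clock), max(max_clock, clock)
        peak_temp = max(peak_temp, temp)
        time.sleep(1)
except KeyboardInterrupt:
    # Stop with Ctrl+C once the stress test or gaming session is over.
    print(f"Clock range: {min_clock}-{max_clock} MHz, peak temperature: {peak_temp} °C")
finally:
    pynvml.nvmlShutdown()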

    Tip 4: More beautiful thanks to DLSS


Users who like to play games at higher resolutions and enjoy the significantly improved picture quality are often confronted with a lack of raw performance from their graphics card. But Nvidia has a solution for this.

    Deep Learning Super Sampling (DLSS) is the name of the technology that first calculates images in a lower resolution and then “upscales” them using an innovative upscaling process powered by dedicated AI tensor cores. This way, the frame rates and frame times in supported games can be increased without having to upgrade to a new graphics card.
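
To illustrate what “calculates images in a lower resolution” means in practice, the commonly cited per-axis scale factors of the DLSS quality modes translate into internal render resolutions roughly as follows (the exact factors can vary from game to game, so treat these numbers as approximations):

# Approximate internal render resolutions for the DLSS modes at a 4K output.
# The per-axis scale factors are the commonly cited defaults; games may differ.
output_w, output_h = 3840, 2160  # 4K target resolution

dlss_modes = {
    "Quality":           2 / 3,   # about 66.7 percent per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in dlss_modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode:<18} renders at about {w}x{h}, upscaled to {output_w}x{output_h}")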


    With the help of DLSS 3 — including frame generation, a method that generates intermediate frames — the frame rate can even be doubled again in many cases.

    Nvidia demonstrates how this works using the example of several popular current games, which can be significantly accelerated with DLSS 3. DLSS 3 Frame Generation can only be used on modern RTX 40-series graphics cards, however, while all GeForce RTX GPUs can run the fantastic DLSS 2 supersampling feature in games that support it.

    Tip 5: Smoother thanks to Nvidia G-Sync


    No distortion, no judder. That’s what Nvidia’s proprietary adaptive synchronization technology does for monitors that support a variable refresh rate (VRR) to reliably avoid tearing and reduce stuttering.

The Alienware AW3423DW, an OLED ultrawide monitor with G-Sync Ultimate support.

Nvidia G-Sync technology delivers smooth gaming performance at virtually any frame rate, with no distorted or clipped frames. The latest evolutionary stages also support refresh rates from 120 to 360 Hz, Low Framerate Compensation (LFC), and high-contrast HDR images.
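
Low Framerate Compensation, mentioned above, keeps variable refresh working even when the frame rate falls below the monitor's minimum refresh rate: the driver simply shows each frame more than once. The small calculation below illustrates the idea; the 48 to 144 Hz VRR window is an assumed example, not the spec of any particular monitor.

# Illustration of Low Framerate Compensation (LFC) on a VRR display.
vrr_min_hz, vrr_max_hz = 48, 144  # example VRR window, not a specific monitor's spec

def effective_refresh(fps: float):
    """Return (frame repeats, panel refresh) that keep the panel inside its VRR window."""
    repeats = 1
    while fps * repeats < vrr_min_hz:  # below the window: repeat each frame
        repeats += 1
    return repeats, min(fps * repeats, vrr_max_hz)

for fps in (30, 40, 60, 100):
    repeats, refresh = effective_refresh(fps)
    print(f"{fps} fps -> each frame shown {repeats}x, panel refreshes at about {refresh:.0f} Hz")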

    To be able to use Nvidia G-Sync, a compatible monitor is required.


    This article was translated from German to English and originally appeared on pcwelt.de.
