DLSS

  • What is it?
    Deep Learning Super Sampling (DLSS) is an image rendering technology developed by NVIDIA, which uses deep learning to produce an image that looks like a higher-resolution version of an original lower-resolution image.

    This technology is enabled by training a neural network on thousands of images from a specific video game, both with and without anti-aliasing enabled. The network learns to bring the image up to anti-aliasing quality without the performance cost of traditional anti-aliasing. The trained network is then delivered through NVIDIA's game drivers and runs during gameplay on the Tensor Cores featured on NVIDIA's RTX GPUs.
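
    Conceptually, the training step fits a mapping from aliased frames to their anti-aliased ground truth. The sketch below is a drastically simplified, hypothetical stand-in for that idea: instead of a deep network, it fits a single 3x3 filter by least squares on synthetic patch data (the "anti-aliased" target here is just a local average, chosen only so the example is self-contained and checkable).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy training data: 3x3 neighborhoods standing in for patches of
    # aliased frames (hypothetical, randomly generated).
    aliased = rng.random((1000, 3, 3))

    # Pretend the anti-aliased ground truth is the patch average --
    # a placeholder for the real AA reference images used in training.
    target = aliased.mean(axis=(1, 2))

    # "Training": fit one 3x3 filter by least squares, a minimal analogue
    # of optimizing a deep network on aliased/anti-aliased image pairs.
    A = aliased.reshape(1000, 9)
    w, *_ = np.linalg.lstsq(A, target, rcond=None)

    # The learned filter recovers the averaging operator (1/9 per tap).
    print(np.allclose(w, np.full(9, 1 / 9), atol=1e-6))
    ```

    The real system differs in every practical respect (deep network, per-game data, Tensor Core inference), but the structure is the same: collect paired low-quality/high-quality frames, then fit parameters that map one to the other.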

    DLSS is currently enabled on NVIDIA's RTX 20 Series GPUs with a select number of video games (currently around 25 titles).

  • Why do you need it?
    DLSS can enable much higher image resolution without the performance overhead of traditional anti-aliasing. Compared with commonly used Temporal Anti-Aliasing (TAA) techniques, it provides much better temporal stability and image clarity. DLSS can provide a performance boost that increases framerates by up to 2X at 4K compared to NVIDIA's previous generation of GeForce GPUs, allowing users to enjoy a super-smooth gaming experience at 60 FPS or more.
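
    Simple pixel arithmetic shows where the headroom comes from: if the GPU renders internally at a lower resolution and the network upscales the result, only a fraction of the output pixels are ever shaded. The resolutions below are an illustrative pairing, not official DLSS figures.

    ```python
    # Pixels the GPU must shade at native 4K versus a lower internal
    # resolution that is then upscaled (illustrative numbers only).
    native_4k = 3840 * 2160        # 8,294,400 pixels
    internal_1440p = 2560 * 1440   # 3,686,400 pixels

    ratio = internal_1440p / native_4k
    print(f"{ratio:.2%}")  # about 44% of the native pixel count
    ```

    Shading well under half the pixels per frame is what leaves room for the large framerate gains described above, even after the cost of running the upscaling network.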

  • How is GIGABYTE helpful?
    GIGABYTE's range of NVIDIA RTX 20 Series graphics cards all support DLSS technology. In addition, GIGABYTE is a leading manufacturer of the GPU server hardware used for the deep learning behind technologies such as DLSS. Some of GIGABYTE's most powerful deep learning servers include the G291, G292, G481, and G482 Series.
