We NEVER anticipated that NVIDIA would permit this…

Thanks dbrand for making our CES 2019 content feasible. Check out the Grip and Prism at http://dbrand.com/shop

Buy our new merch: http://lttstore.com

10 Comments:

So does this mean G-Sync will work on a FreeSync monitor now?

  2. Elite Gamer Nation

“BFGDs”… Big Fucking Gaming Displays, right?

  3. warwick yorkesmith

I’m a little confused: will I be able to activate G-Sync on my 1050 Ti when I connect it to a FreeSync monitor? I’m guessing not, but just taking a punt. Thanks!

The problem with modern displays is that they ONLY contain enough electronic circuitry to do a Z-scan of the display (i.e. scanning left-to-right and top-to-bottom, or sometimes right-to-left and bottom-to-top) of pixels within a window of X milliseconds (i.e. 16 ms for 60 fps or 8 ms for 120 fps!)

    Each pixel and line is scanned on a linear basis, which can be INTERRUPTED by all sorts of external factors, including RFI (Radio Frequency Interference), EMI (Electromagnetic Interference), CPU-to-video-buffer pixel transfer delays, unsynchronized buffer overwrites (lines of pixels partially filled with pixels from previous frames), simple timing jitter, and a number of OTHER electrical and CPU/GPU factors.

    In order to FIX screen tearing and blank-out, there NEEDS to be enough on-display buffer memory to fit the native resolution of the display (i.e. 41,472,000 bytes for 10-bit-per-channel HDR video at 4K 3840×2160 resolution), which can then use a technique similar to GLOBAL SHUTTER, where every pixel is sent at the same time to be output at its correct 2D XY pixel location.

    Unfortunately, one needs pixel emission circuits and a DAC (Digital-to-Analog Converter) for EVERY pixel, on a STACKED basis, where each pixel site contains the light emitter overlaid on top of a 16-bit-to-10-bit DAC microcircuit, along with a 40-to-64-bit pixel memory storage site for that pixel AND a separate line (or access to a stacked electrical pathway system) to the actual graphics card buffer. THEN individual pixels can be turned ON and OFF at the appropriate RGBA colour setting ALL AT THE SAME TIME across the ENTIRE DISPLAY!

    We actually CAN do this today using Q-LED (Quantum LED, NOT OLED!) technology, which does allow proper synchronous display of ALL pixels at the same time! This would allow FRAME RATES approaching 1000 fps (one millisecond of display time per frame) for SUPER-SMOOTH motion and NO tearing or blank-out!

    In terms of technology, I would probably go with stacked Gallium Nitride emitter circuits, because they can operate at up to 60 GHz for FAST data transfer. GaN we can do TODAY, at a price of about $2000 US for a 1000 fps (1000 Hz!) 10-bit HDR 4K gaming monitor at 27 inches in size! I do know of boron-nanotube and carbon-nanotube RGB micro-laser emitter technology SLOWLY coming online within the next 10 to 15 years, which SHOULD bring that same 1000 Hz 10-bit HDR Quantum-LED technology down to about $350 US for a 27-inch 4K gaming display!

    Graphics card VRAM and buffer technology would NEED to catch up to that fancy Q-LED technology, since a 64-bit RGBA pixel (downsampled to 10 bits per channel, i.e. 40 bits per pixel, at the display) needs a bandwidth of 66,355,200,000 bytes per second (66.4 gigabytes per second!) to be able to handle the 1000 Hz refresh rate! We would need a single-mode dense-wavelength fibre-optic connection to handle that sort of bandwidth from the graphics card (GPU) to the video display/monitor!

    In terms of electrical signal bandwidth (i.e. the ON/OFF bitwise electrical representation of data pulses plus ECC error-correction signals), you need a 1.062 terahertz (1,061,683,200,000 Hz) GPU processor to handle that electrical signal bandwidth. Again, Gallium Nitride or Gallium Arsenide GPU circuits WOULD BE ABLE to handle that, for about $5000 US in today's dollars for a 1.1 THz, 1000 fps NVIDIA 2080-series GPU equivalent! In 10 to 15 years, we could bring that price down to about $700 US, in my estimation.
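    The buffer-size and bandwidth figures above can be checked with a minimal Python sketch (editor's note: the resolution, bit depths, and refresh rate are taken from the comment itself, not from any product spec; the 66.4 GB/s figure turns out to match the full 64-bit source pixel, not the 40-bit downsampled one):

    ```python
    # Verify the commenter's arithmetic for a 4K (3840x2160) display.
    WIDTH, HEIGHT = 3840, 2160
    PIXELS = WIDTH * HEIGHT               # number of pixels per frame

    # Frame buffer at 10 bits per channel x 4 channels = 40 bits per pixel.
    frame_bytes_40bpp = PIXELS * 40 // 8
    print(frame_bytes_40bpp)              # -> 41472000 (matches the comment)

    # Bandwidth at 1000 fps. The comment's 66,355,200,000 B/s figure
    # corresponds to a full 64-bit RGBA pixel, not the 40-bit one.
    bandwidth_64bpp = PIXELS * 64 // 8 * 1000
    print(bandwidth_64bpp)                # -> 66355200000 (~66.4 GB/s)

    bandwidth_40bpp = PIXELS * 40 // 8 * 1000
    print(bandwidth_40bpp)                # -> 41472000000 (~41.5 GB/s)

    # The 1.062 THz figure equals the 64 bpp bit rate doubled,
    # presumably to account for signalling/ECC overhead.
    print(bandwidth_64bpp * 8 * 2)        # -> 1061683200000
    ```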

Linus, you're late

6:30 When you drop things with the power of your mind

So if you have a monitor, don't use HDMI, use DP?

GTX 1070 or RTX 2060? I'm not going to use ray tracing.

  9. Wow, it works now.

Linus's dropification powers have evolved to the point that even without trying, he can affect localized gravity, causing it to malfunction… in time we will all be dropping things simply by watching LTT videos.

    #LinusIsGoingToDropTheISSSoon
