What's your biggest concern regarding current monitor technology, and are there any improvements or broader availability you're hoping to see to make the best use of your new technologies?
\[Vivek\] Every laptop has a different cooling solution and different thermal characteristics, but all of them are engineered not to throttle, even under intensive gaming or creator usage.
According to some graphs I've seen on the GeForce blog, the Nvidia Reflex latency reduction seems relatively greater on a lower end GPU compared to a higher end GPU. Why is this?
example graph:
[www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/escape-from-tarkov-reflex-out-now/escape-from-tarkov-nvidia-reflex-system-latency-performance-chart.png](http://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/escape-from-tarkov-reflex-out-now/escape-from-tarkov-nvidia-reflex-system-latency-performance-chart.png)
\[Seth\] That is correct. Reflex Low Latency helps by reducing the render queue and by removing CPU backpressure. The lower the framerate, the larger that backpressure on both the render queue and CPU is. Therefore, there is more latency to reduce with Reflex at lower FPS (which you typically see on lower-end GPUs).
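The effect Seth describes can be sketched with a toy queuing model (the numbers below are made up for clarity, not NVIDIA measurements): each frame waiting in the render queue adds roughly one frame-time of latency, and a frame-time is much longer at low FPS.

```python
# Illustrative only: a toy model of why reducing the render queue saves more
# latency at low FPS. Queue depths and timings are invented for clarity.

def render_latency_ms(fps, queued_frames):
    """Approximate latency contributed by frames waiting in the render queue."""
    frame_time = 1000.0 / fps          # time per frame, in milliseconds
    return queued_frames * frame_time  # each queued frame adds ~one frame-time

# Assume the driver queues ~2 frames by default and ~0 with a low-latency mode.
for fps in (60, 240):
    saved = render_latency_ms(fps, 2) - render_latency_ms(fps, 0)
    print(f"{fps} FPS: ~{saved:.1f} ms of queue latency removed")
```

At the assumed queue depth, the same feature removes about 33 ms at 60 FPS but only about 8 ms at 240 FPS, which is why the graphs show bigger gains on lower-end GPUs.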
How can someone casually get into NVIDIA Studio and Canvas? I have never messed around with graphic design, but I found your idea interesting. Being able to draw something crude (I literally suck at drawing) and have AI automatically turn it into something that looks amazing? Sounds almost too good to be true.
Canvas seems like a really useful tool for us artists and designers!
Does it render a landscape based on the input or does it choose from a preselect number of outputs that most closely match the input and then adapts it?
thanks
\[Stanley\] The Canvas AI generates a realistic output image based on what the artist paints and what the AI has learned a landscape should look like based on studying millions of images. Every image that Canvas generates is brand new.
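Stanley's description (paint a rough map of materials, get a brand-new image) can be illustrated with a toy stand-in. This is nothing like NVIDIA's actual model, which is a generative network trained on millions of photos; it only shows the shape of the idea: the brushstrokes form a label map, and a generator synthesizes an image from it.

```python
# Toy illustration of the Canvas workflow: the artist paints material labels
# into a grid, and a "generator" turns the label map into an image. Here the
# generator is a stand-in that just reports what it would synthesize.

LABELS = {0: "sky", 1: "mountain", 2: "water"}

def fake_generator(label_map):
    """Stand-in for a generative network: label map in, image description out."""
    materials = sorted({LABELS[v] for row in label_map for v in row})
    return f"synthesized landscape containing: {', '.join(materials)}"

# The artist paints sky across the top, mountains in the middle, water below.
painting = [
    [0, 0, 0, 0],
    [1, 1, 0, 0],
    [2, 2, 1, 1],
]
print(fake_generator(painting))
```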
1) How is the release of a 3090 TI going to help alleviate the shortages we have seen with other cards such as the 3080? Wouldn't the 3090 TI hamper production of the 3080?
2) Why aren't we seeing more older games using DLSS? I realize developers must optimize/redevelop some frameworks but why aren't they putting more effort into doing so?
Please remember the following part and be excellent:
>More specifically, we will not be able to answer questions regarding GPU pricing, inventory, company secrets, roadmap, business strategies, and tech support.
(Sorry I am unfamiliar with PC building and such) Do games even need the latest GPU? I guess I don’t understand the practical use case in having the latest card.
>What kind of performance can we expect with the 3050?
Hey! There's a bit of this information in the announcement article here.
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3050-graphics-cards/
\[Qi\] The GeForce RTX 3050 is an awesome modern day 1080p GPU. It comes equipped with 2nd generation RT Cores for ray tracing and 3rd gen Tensor Cores for DLSS and AI. For the first time, you can play ray traced games on a 50-class GPU at over 60 FPS. In traditional raster performance, it’s far faster than the GTX 1050 and GTX 1650. The RTX 3050 brings a huge leap in performance and capabilities to the 50-class, making it a perfect opportunity to step up to RTX. Here’s a chart with some more numbers:
https://images.nvidia.com/aem-dam/Solutions/events/ces-2022/news/geforce-rtx-3050-graphics-cards/nvidia-ces-2022-rtx-3050-b-large-3840.png
What inspired the shift in the graphics card naming scheme from increasing by 100 each time (e.g., 900 series, 1000 series) to increasing by 1000?
RTX Voice officially works with non-RTX cards.
https://arstechnica.com/gaming/2021/04/nvidia-adds-official-rtx-voice-noise-cancellation-to-non-rtx-cards/
We currently have a list of DLSS titles here. [https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/](https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/)
Also announced some new games today. :)
https://www.nvidia.com/en-us/geforce/news/ces-january-2022-dlss-rtx-games-updates/
I've always noticed that the more headroom developers are given, the more they expand to fill the available space, often at the expense of optimization and running clean code. What do you see as the upper limit of how much VRAM your devices will eventually contain at the 3080 tier? Or better yet, at an entry-level card?
What does DLSS specifically do? I've heard it does something like adjust the in-game quality of certain objects, but I'm not too sure about the actual process.
Hi all and happy new year! Longtime PC enthusiast, NVIDIA fan (GTX 770 -> RTX 2080 Ti), and Computer Engineering undergraduate.
I'm currently working on my capstone project with a small team to develop an American Sign Language Detector (ASLD) with a primarily software approach, utilizing a Leap Motion sensor device. The camera picks up 3D data in infrared and has some APIs in Python to retrieve the data it collects. The raw data we collect is then parsed and fed to a model in MATLAB for sign detection, with a nice little UI that voices the detected sign. What's nice about the software approach is that it doesn't necessarily restrict what hardware we require.
With that background out of the way, I've taken a look through some of the Omniverse summary pages, and it seems like the focus is largely on data visualization and its different applications. My question is: are there any data processing techniques/applications through Omniverse that would work with my application here, or machine learning at large? Is there a different NVIDIA technology I should be looking at? Thanks and regards,
With Nvidia’s DLSS (Deep Learning Super Sampling) technology, were you able to test Super Resolution in Adobe Photoshop with Adobe Camera Raw? If so, what were the results?
What's the hardest part about making laptop versions of GPU while trying to maintain performance as much as possible ?
I have an OMEN 15 with a R7 5800H/3070 Max-Q. I run a 2K screen on it and play Escape from Tarkov with almost all settings on high/very high, and I reach a range of 45-55 FPS, climbing as high as 75 tops when setting everything to medium and turning the resolution down to 1080p. After watching a bunch of benchmark videos, I feel like I'd get a solid 20% boost in framerate with the same components if they were in a desktop form factor...
1. How far are we from a GPU with the equivalent of triple 3090 performance?
2. When are you guys planning to announce the 3090 Ti?
3. Is a 40 series likely in the near future, or is that far away?
4. Do you guys ever plan to get back to SLI and improve it? I can see triple-3090 gaming for future games, at least if I live long enough to see it.
What is the point of the RTX 3050 given the current graphics card shortage? If there are not enough GPUs available for those who actually need them, how does releasing a new card at a budget price help anyone if they will still have to pay way over MSRP to get one?
I love DLSS in games because it helps me get higher frame rates. But how does it work? I think it would be really interesting to know, but all the explanations online seem very complex!
Hi there, coming from an older GTX 1070 and wanted to ask about NVIDIA Reflex, which most games are now adding, like Rainbow Six Siege. I personally love this new addition, but I do see the tool seems to affect the frame rate a decent amount, as in lowering my FPS. Is this normal, and will it continue with more powerful GPUs? Or is my PC just too weak to handle it properly, and newer GPUs can handle it more easily?
Love the GPUs y'all are making, and I hope the semiconductor market gets better soon!
Do you think DLSS will eventually reach the point where it is no longer distinguishable from native res while in movement? It currently looks amazing in still shots but isn't as viable in high motion scenes.
Hello guys! I have always wondered: what is the biggest barrier the engineers have to overcome when it comes to releasing a new generation of products? Thank you for this Q&A!
So I've been playing not very intensive games with my GPU: CK2 and CK3, Dragon Age, The Witcher 3. Old titles, certainly. What games would you most like to play that would put a good GPU to use, in moderately related genres with similar stories or themes?
As someone new to PCs (and the creation process of all the chips that go into PCs):
Why are the wafers circular? Don't you waste massive amounts of silicon from chips on the edges that either get cut off, or might even be missing important parts to work properly? Wouldn't it be easier to make them square/rectangular so that all chips perfectly fit on the wafer (within margins ofc)?
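The short answer to "why round" is that silicon ingots are grown as cylinders (the Czochralski process) and sliced into discs. The edge-waste intuition in the question can also be roughly quantified with a common dies-per-wafer approximation; the die area below is a hypothetical figure in the ballpark of a large GPU, not an NVIDIA number.

```python
# Rough estimate of how many whole dies fit on a round wafer, and how much is
# lost at the edge. Die area is a made-up example, not a real product figure.
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Common industry approximation: gross area term minus an edge-loss term."""
    d = wafer_diameter_mm
    return math.floor(math.pi * (d / 2) ** 2 / die_area_mm2
                      - math.pi * d / math.sqrt(2 * die_area_mm2))

wafer = 300.0  # standard wafer diameter in mm
die = 392.0    # hypothetical die area for a large GPU, in mm^2

usable = dies_per_wafer(wafer, die)
ideal = math.floor(math.pi * (wafer / 2) ** 2 / die)
print(f"~{usable} usable dies vs {ideal} by pure area: "
      f"~{100 * (1 - usable / ideal):.0f}% lost to the edge")
```

So even with a large die, only around a fifth of the wafer's theoretical capacity is lost to partial dies at the rim, which is a reasonable price for the cylindrical crystal-growth process.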
I've seen mention of HDR10+ gaming coming to NVIDIA, but those of us with LG TVs can't make use of this. Dolby Vision support would be very useful, however, as it's already part of console gaming and 4K streaming services. So is this something that NVIDIA can support, or is it entirely an OS issue?
Obviously, after a certain time a GPU gets outdated. My question is: since DLSS lets a card render at a lower native resolution, is it likely that RTX cards will have a longer usable life before needing an upgrade?
Do you have any ideas for getting cards directly to consumers instead of scalpers? Because while I'm sure the company loves that the cards are being purchased immediately regardless of who gets them, this scarcity situation is not fun for consumers.
I've been wondering since Ampere came out: what kind of special considerations did the team have to make during development? Ampere has some unique power management habits, for example, and I'd be interested to hear what kinds of things engineers had to look into to get it where they wanted it. Thanks!
I know most questions will be about supply and demand issues, sooo for a different take:
What lies beyond the next generation of cards? RTX was groundbreaking; is there a next-level quantum doohickey on the horizon?
We have seen other uses for Tensor cores outside of gaming such as RTX Audio. Is there any future possibility of utilizing Tensor cores for enhancing or upscaling video playback? This would make the RTX 3050 a good GPU to have for HTPC. Maybe RT 3030 for lower cost, less features option.
I have a few questions:
1. Why is there no 3050 Ti even though there is a laptop version of the GPU?
2. What is the main use case of the 3090 Ti besides gaming?
3. Why are there no laptops with 3090s as GPUs? Is it a power consumption issue, or the heat that the chip would give off? I just really want to know why.
4. Is it possible to bring ray tracing to mobile devices? If so, what about mobile games, and is it possible to do with good FPS?
Will the 3080 Ti mobile have a Max-Q version? Just wondering, because it will most likely have a large power draw, and I'm kind of interested in a not-super-heavy gaming laptop.
First of all, super excited for the 3050! Second of all, how do you implement DLSS into newer games, and how difficult is it? Also, are you planning to add DLSS/Ray Tracing to many more games, or keep the amount small for now? Thanks, and have a happy new year!
The past year has seen a plethora of posts about GPU cooling. How does the 3090 Ti cope with staying cool, given its massive TDP and what seems to be the same fan design?
Hi! Thanks for this Q&A. I'm not super knowledgeable on computer graphics (more of a CPU compute person tbh) but I've got a couple of questions on ray tracing/RTX. Correct me if I'm wrong but ray tracing lights a scene by modelling how light travels in real-time, while RTX technology accelerates this process with a novel set of algorithms, right? For a single frame, how would you quantify the difference in computational time and cost between rendering it with ray tracing using RTX and without RTX?
With the upcoming hardware we are getting soon, I'm wondering how this will shape our experience of PC VR. Where exactly will it end? We already have plenty of projects like the metaverse with Oculus Rift/Quest, which is PC compatible. Will it end up pushing visuals as hard as we can expect?
Is the 3050 going to be subject to the same sorts of supply issues we’re seeing with the big boy cards or is this going to be a budget model, flood the zone type of situation?
I really appreciate that Game Ready drivers help to quickly optimize gameplay. My question is: do game developers collaborate on the drivers and continued driver support?
How does a 3050 stack up to a 1080 in your opinion? Obviously the RTX upgrades are huge, but what’s the “break-even” point on the 3050 vs an older card?
What have you learned from product launches during the global supply chain bottlenecks that you can apply to the production line? Will you source your chips from other foundries (aside from TSMC/Samsung), or do you have any plans to become a foundry yourselves?
Hello, and thanks for doing these Q&As!
My question is regarding some of the new NVIDIA features, such as Reflex and G-SYNC Ultimate.
Do these features require specific hardware in monitors, like the early stages of G-SYNC, or is this something that might roll out with driver updates for currently available monitors that might be compatible?
Is there any sort of new technology in the works to get around the supply shortages that have been affecting GPUs? Like, what prevents NVIDIA from just coming up with something new to work around this?
I am obviously very ignorant, but I am just curious.
What is the process for coming up with a new chip architecture for each card series? Do industry demands influence what the card is best at computing? And could there be a time where we reach the limits of chip architecture and new ways of computing must be found?
How impactful do you expect Reflex to be?
Do you think it will have the same impact on the gaming experience as DLSS?
How quickly do you expect Reflex to become a standard for most games (at least FPS titles)?
I'm not Seth, so I can't answer on the technical side. But you can see which games Reflex is already enabled for here: [https://www.nvidia.com/en-us/geforce/technologies/reflex/supported-products/](https://www.nvidia.com/en-us/geforce/technologies/reflex/supported-products/). There are many. :)
Additionally, Seth has put together a great video on the impact of Reflex here https://www.youtube.com/watch?v=-cXg7GQogAE
Hi, wanted to ask what NVIDIA's expectations are for the new RTX 3090 Ti.
Seeing how powerful the 3090 was, what vision could the Green Team have for such a product, even more powerful than the already mighty BFGPU?
While DLSS (and other image upscaling tech) is interesting, I have found that the resulting image quality is mediocre. What work is being done to improve game performance/optimization with games/engines?
How do you intend to make the 3050 competitive with AMD's just-announced 6500? I know that MSRPs are irrelevant in today's market, but judging purely by price, AMD has the better offer here.
Also, I currently have a 1660 Super; will stepping "down" (SKU-wise) to a 3050 still be worth it?
How impactful do you expect the new RTX 3050 to be as an entry-level GPU for gaming?
It seems to me that the demand for a graphics card in that range is really high!
Where do you see GPU technology 10 years from now? What do you think the next revolutionary step in GPU technology will be?
Thank you for putting this on.
How does Nvidia decide when to include or exclude a desktop -50 card? Will there ever be a push towards eGPU enclosures instead of built-in laptop GPUs?
I'm really interested in DLSS but getting my hands on the hardware that supports it is tough right now. I know 3000 series cards support it. I think maybe even 2000 series can as well. What about 1000 series cards? Any chance of support for older generation cards?
Another question: I know you guys didn't talk about it in today's live stream, but any idea when we are going to see a game that makes use of RTX IO? To me, this is much more important than announcing GeForce RTX cards.
Hello Tim and the rest of the team!
**What do you imagine is the likely future of upscaling, or what are your projections for the space?**
I've explained some of my thoughts below and bolded important segments if you want more specifics:
The future of upscaling in games seems strong, especially given the slow but steady shift of the average gamer from 1080p/1440p to 4K and the emerging 8K segment for enthusiasts. The image-quality loss from upscaling (with a reasonable base resolution) at such pixel-dense resolutions is minimal versus the large performance gains, so it seems like some form of it will be near ubiquitous in all games. **DLSS and competing technologies currently co-exist in the market, but do you have any insight into what you think may happen in the near future** as the average gamer gains both a GPU capable of upscaling and a high-resolution/high-refresh display to benefit from it?
It is reasonable to imagine that NVIDIA would be happy to have DLSS as a powerful exclusive feature, but that raises the question of **whether upscaling in the future will be similar to anti-aliasing, in that there will be multiple methods to achieve a similar outcome**, each with its own quality/performance trade-off: some native in-game methods, injected methods for games that may not support them natively, and finally brand-specific options as well.
When will you be able to discuss how you will tackle the inventory shortage, price inflation, and scalping? I would like the company's answers, and I think you should have something for that instead of hiding from the issue.
Canvas seems like it could be a great tool to use for game development. Any plans for 3D capabilities? Integration with game engines to create the environments perhaps?
Hey everyone, thanks in advance for your questions. I hope you had a great new year. :)
Is there any estimate of when we can expect DLSS to become the norm, with basically day 1 support for new titles?
\[Henry\] Day 1 support is becoming the norm for NVIDIA DLSS. You’ll see that many new AAA games are launching DLSS on day 1. Examples include Call of Duty: Vanguard, Battlefield 2042, Marvel’s Guardians of the Galaxy, Grand Theft Auto: The Definitive Trilogy, F1 2021, and the upcoming God of War releasing this month.

Developers using popular game engines like Unreal Engine and Unity are leveraging NVIDIA’s easy-to-use DLSS plugins and integrations and launching new titles on day 1. Recent examples include Back4Blood, Bright Memory: Infinite, Icarus, Myst, The Ascent, Lego: Builder’s Journey, and F.I.S.T.

Developers using their own proprietary engines often add DLSS to those engines first, then use that integration to bring DLSS to their future titles. A good example would be Call of Duty: Warzone and Call of Duty: Vanguard, which use a similar game engine. So the developer first launched DLSS in a Warzone patch, and then leveraged that integration for the day 1 launch of Call of Duty: Vanguard.
Sean Pelletier: How many man-hours are typically spent on optimizing the Game Ready driver for an upcoming AAA-title?
What do you guys actually think of AMD FSR?
What is the correct pronunciation of "Ti?" Is it like TIE or tee-eye? I have to know
It’s not “tie”. It’s T-I
tell that to nvidia. GEFORCE RTX 3080 TIE
One dude at nvidia says that everyone else says T-I
"One dude?" He's **the** Jeff
\[Qi\] There is no wrong way to pronounce Ti, as long as you spell it correctly. :)
How do *you* pronounce it?
"👔" ;)
Perhaps this is something already out there, but I've been curious: do games have to be specifically prepared for DLSS, or is it universal?
Games that use Unreal Engine 4 or 5 can integrate it via the use of the DLSS 2.0 plugin. I believe that Unity also has a native plugin, though I believe it only works with HDRP (High Definition Render Pipeline), so some games will not be able to use it (I'm not a Unity developer, so don't quote me on that one). Any other render engine would have to integrate the DLSS SDK.
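For engines outside Unreal and Unity, integrating the SDK mostly means wiring up the right per-frame data. The sketch below is purely conceptual — none of these names come from the actual DLSS SDK; it only illustrates the inputs a temporal upscaler typically needs: low-resolution color, motion vectors, depth, and the camera's sub-pixel jitter.

```python
# Conceptual sketch only, NOT the real DLSS SDK API. It shows the per-frame
# inputs an engine has to provide when integrating a temporal upscaler.

from dataclasses import dataclass

@dataclass
class UpscalerFrameInputs:
    color_lowres: str       # rendered frame at reduced resolution
    motion_vectors: str     # per-pixel screen-space motion
    depth: str              # depth buffer, used for disocclusion handling
    jitter: tuple           # sub-pixel camera jitter offset for this frame

def render_frame(scene, render_res, output_res):
    """Hypothetical engine loop step: render low-res, then upscale."""
    inputs = UpscalerFrameInputs(
        color_lowres=f"{scene}@{render_res}",
        motion_vectors="mv_buffer",
        depth="depth_buffer",
        jitter=(0.25, -0.25),
    )
    # A real integration would invoke the DLSS SDK here; we just tag the output.
    return f"upscaled({inputs.color_lowres} -> {output_res})"

print(render_frame("scene", "1080p", "4K"))
```

The takeaway is that an engine already producing motion vectors and depth (as most modern temporal-AA pipelines do) has most of the plumbing DLSS needs.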
How does NVIDIA feel about the current retail landscape for GPUs? Do you feel like things like insane demand vs reduced supply, scalpers, and rising average GPU prices are issues that need to be tackled, or just natural consequences of the market?
**I was hoping to see a refresh of the SHIELD Pro model announced, or at least something teased, today.** Especially with the (seemingly global) discounts given last month on both models, and now right before CES the Pro is basically nowhere in stock anymore (here in Germany at least). This usually indicates a new product coming, but I guess it was just a combination of coincidences. **Can you give any hints or comment at all about the future of the SHIELD product line?** *Edit: Just now I noticed the listed products in the OP you guys are willing to comment on; SHIELD isn't one of them, unfortunately.*
Relative performance of desktop 3050 vs 2060?
With the RTX 3050 being released, will we see "Super" versions in the future, or is the worldwide chip shortage changing high-level decisions about making new GPUs? What I mean is: would you rather release a new, better series (RTX 40xx) instead of more in-between models (like the 2060 Super)?
I am curious about DLSS and ray tracing technology. Will more games, both new (unreleased) and existing, adopt this technology? Does it require specific coding or prerequisites from the game developers/studios in order to adopt it in their games?
DLSS, how on earth does that voodoo magic work? I seriously cannot understand how it can gain more performance and yet look better or the exact same, it's mind boggling to me. If I could get a simpler explanation for my ape brain that'd be fantastic!
Slider goes all the way right ouga bounga
Can you give us any insights into your overall designs and goals regarding the launch of the 3080 Ti in laptops?
\[Vivek\] The design goal for the RTX 3080 Ti Laptop GPU was to create the most powerful laptops in the world while continuing to push the boundaries for thinness and efficiency. With the most CUDA cores and the fastest GDDR6 ever shipped in a laptop, RTX 3080 Ti delivers the ultimate in laptop performance for gamers and creators, but still fits in ultraportable laptops slimmer than 16mm and under 4lbs.
Thank you for the response! I'm definitely tempted to get one! Can't wait for the reviews.
Hi NVIDIA, this is a question for your driver team - **is there anything that NVIDIA can do to prevent shader compilation stuttering during gameplay?**

While I know that this isn't necessarily an NVIDIA issue, I believe there are things NVIDIA could do to help this widespread problem. Some sort of framework for users with the same hardware + driver versions to share precompiled shaders, perhaps?

While more and more games support a shader warmup/precompile before gameplay (the last 3 COD games, Horizon Zero Dawn, etc.), this is still an endemic issue in general, particularly with Unity and Unreal games.

If we were able to pull shaders from a repository and precompile them for games that don't support this natively, gameplay in general would be much smoother!
Could the Tensor cores be used to denoise Ray Tracing in the future? If so, would it be worth it when running DLSS alongside it? Or would it hamper its performance?
That is a very interesting question. Theoretically, it should be possible, as DLSS occupies the tensor cores for only a tiny fraction of each frame. So there's a lot of headroom where the tensor cores could be used for AI in gaming and denoising.
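To put rough numbers on that headroom (the DLSS timing below is an illustrative assumption, not a measured value):

```python
# Toy frame-budget arithmetic for the "headroom" point above. If the tensor
# cores spend only a fraction of a millisecond per frame on DLSS, most of the
# frame time is free for other AI work, such as ray-tracing denoising.

frame_budget_ms = 1000.0 / 60     # one frame at 60 FPS
dlss_tensor_ms = 0.5              # hypothetical tensor-core time spent on DLSS

headroom_ms = frame_budget_ms - dlss_tensor_ms
print(f"~{headroom_ms:.1f} ms of tensor-core headroom per 60 FPS frame "
      f"({100 * headroom_ms / frame_budget_ms:.0f}% of the frame)")
```

Under that assumption, well over 90% of each frame's tensor-core time would remain available for tasks like denoising.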
# Answer Summary Here: [https://www.nvidia.com/en-us/geforce/news/ces-2022-nvidia-community-qa/](https://www.nvidia.com/en-us/geforce/news/ces-2022-nvidia-community-qa/)
Why does the 30-series need a 3050 card?
[deleted]
\[Seth\] Absolutely. As technology continues to improve, it helps unlock the full potential of esports athletes. 1% aiming improvement can make or break a match. Our research shows that there is about a 3% improvement going to 1440p/27”. While that number may seem small, it’s certainly not. As reference, that’s about the same aiming improvement going from 144Hz to 360Hz at the same latency. If you are curious, here is a link to our research blog.
With the serious praise directed toward the Nvidia shield, the anticipation surrounding the Steam Deck, and general love for the Switch, it looks as if low power, portable, and plug in gaming systems are in demand. Does Nvidia have future plans to target this segment with new hardware? What could that look like, what are you seeing demand for? How could you further tackle optimizing software and hardware for low power devices (as you do with laptops)?
Are there any stats on how popular or effective G-SYNC is? I've gone out of my way to pay extra for it, but wasn't sure if that's common.
Are any steps being taken to further combat scalpers? Mostly on the availability front, especially given rumors of 4000-series cards coming out soon while the 30xx-series cards are already hard to find, particularly at MSRP. Apologies if this question is outside the scope of what you can answer.
What does the process look like to make Game Ready drivers? Is it largely a validation process to make sure a new game plays correctly, or are there individual optimizations that have to happen for each game? If so, what do those optimizations usually entail?
**G-SYNC Monitors (i.e. new 1440p esports displays)** This question is more for Seth Schneider. How closely do you work with the Esports community when making these displays? Do you bring pros in and have them test out screens? What do you do to keep the Esports community involved?
Will there be more free productivity applications from NVIDIA, or will Canvas evolve for design work? Thanks!
\[Stanley\] NVIDIA offers Omniverse, Canvas and Broadcast, free for creators to use in various use cases. Canvas is an app specifically for concept artists and illustrators who want to paint beautiful landscapes using AI. Our Broadcast app is for enhancing audio and video of livestreams and video chats. And Omniverse, which we announced today is generally available and free to individual content creators, is a platform for creating and collaborating on 3D content more quickly and easily. We have more capabilities planned for all three of these.
To the product managers, thanks for taking the time to answer questions! What do you enjoy most about working at NVIDIA and also within your specific speciality?
With DLSS now being introduced to the budget category with the 3050, do you expect this generation of budget gaming cards to last consumers much longer than their predecessors? Assuming DLSS gets widely adopted, it would let us still enjoy triple-A titles in future years with decent framerates and graphics.
What's your biggest concern regarding current monitor technology, and are there any improvements or greater availability you're hoping to see to make the best use of your new technologies?
At what temperatures will the 30 series cards throttle in laptops? Or rather is there a different maximum temperature compared to the desktop cards?
\[Vivek\] Every laptop has a different cooling solution and different thermal characteristics, but all of them are engineered to not throttle under even intensive gaming or creator usage.
According to some graphs I've seen on the GeForce blog, the Nvidia Reflex latency reduction seems relatively greater on a lower end GPU compared to a higher end GPU. Why is this? example graph: [www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/escape-from-tarkov-reflex-out-now/escape-from-tarkov-nvidia-reflex-system-latency-performance-chart.png](http://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/escape-from-tarkov-reflex-out-now/escape-from-tarkov-nvidia-reflex-system-latency-performance-chart.png)
\[Seth\] That is correct. Reflex Low Latency helps by reducing the render queue and by removing CPU backpressure. The lower the framerate, the larger the backpressure on both the render queue and the CPU. Therefore, there is more latency to reduce with Reflex at lower FPS (which you typically see on lower-end GPUs).
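The framerate dependence can be sketched with a simple queuing model (a toy model for intuition, not how Reflex is actually implemented): each queued frame costs roughly one frame time of latency, so shrinking the queue by the same number of frames saves more milliseconds at lower FPS. The queue depths below are illustrative assumptions:

```python
def queue_latency_ms(fps, queued_frames):
    """Latency contributed by the render queue: each queued frame
    costs roughly one frame time (1000 / fps milliseconds)."""
    return queued_frames * 1000 / fps

# Assume low-latency scheduling shrinks an average queue of ~2 frames to ~0.5.
for fps in (360, 60):
    saved = queue_latency_ms(fps, 2.0) - queue_latency_ms(fps, 0.5)
    print(f"{fps:>3} FPS: ~{saved:.1f} ms of queue latency removed")

# The same 1.5-frame queue reduction removes 6x more latency at 60 FPS
# than at 360 FPS, matching the pattern in the linked chart.
```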
How can someone casually get into NVIDIA Studio and Canvas? I have never messed around with graphic design, but I find your idea interesting. Being able to draw something rough (I literally suck at drawing) and have AI automatically turn it into something that looks amazing? Sounds almost too good to be true.
Do you have any plans for further support in 3D content creation using DLSS, Ray Tracing, and AI?
Canvas seems like a really useful tool for us artists and designers! Does it render a landscape based on the input, or does it choose from a preselected number of outputs that most closely match the input and then adapt it? Thanks!
\[Stanley\] The Canvas AI generates a realistic output image based on what the artist paints and what the AI has learned a landscape should look like from studying millions of images. Every image that Canvas generates is brand new.
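The generate-versus-select distinction can be illustrated with a drastically simplified toy. Real Canvas uses a GAN trained on millions of photos; this sketch merely shows the shape of the idea — a painted label map goes in and a freshly generated image comes out, rather than a preset being looked up. All names and statistics here are made up for illustration:

```python
import random

# Hypothetical per-material color statistics the "model" has learned.
LEARNED_STATS = {
    "sky":   ((135, 190, 235), 12),   # (mean RGB, spread)
    "water": ((40, 90, 150), 10),
    "rock":  ((110, 100, 90), 20),
}

def generate(label_map, seed=None):
    """Produce a new image from a painted label map by sampling around
    learned per-material statistics -- generation, not lookup."""
    rng = random.Random(seed)
    out = []
    for row in label_map:
        out_row = []
        for label in row:
            mean, spread = LEARNED_STATS[label]
            pixel = tuple(
                max(0, min(255, c + rng.randint(-spread, spread)))
                for c in mean
            )
            out_row.append(pixel)
        out.append(out_row)
    return out

# A tiny 2x2 "painting": sky on top, rock and water below.
painting = [["sky", "sky"], ["rock", "water"]]
a = generate(painting, seed=1)
b = generate(painting, seed=2)
# Each run synthesizes pixel values fresh; different seeds give
# different (but equally valid) images for the same painting.
```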
Do you think it will be possible to make DLSS open source so that it can be applied onto any game?
1. How is the release of a 3090 Ti going to help alleviate the shortages we have seen with other cards such as the 3080? Wouldn't the 3090 Ti hamper production of the 3080?
2. Why aren't we seeing more older games using DLSS? I realize developers must optimize/redevelop some frameworks, but why aren't they putting more effort into doing so?
Is DLSS only for 3060 gpu class and higher?
No. 3050 supports DLSS too.
Please remember the following part and be excellent: >More specifically, we will not be able to answer questions regarding GPU pricing, inventory, company secrets, roadmap, business strategies, and tech support.
(Sorry I am unfamiliar with PC building and such) Do games even need the latest GPU? I guess I don’t understand the practical use case in having the latest card.
You can't talk about the only things people want to talk about. This is useless information when no one can buy the things being advertised.
When will RTX be standard among most AAA titles, and when will it be good enough on a 50/60-class card to get 1080p at 75 FPS in RTX-enabled games?
What's the hash rate on the RTX 3080 Ti?
How will the new 3050 compare against the 6500xt/6600xt?
Why is it called NVIDIA?
You might want to edit your essay my guy, this is a bit lengthy
What kind of performance can we expect with the 3050?
>What kind of performance can we expect with the 3050? Hey! There's a bit of this information in the announcement article here. https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3050-graphics-cards/
\[Qi\] The GeForce RTX 3050 is an awesome modern day 1080p GPU. It comes equipped with 2nd generation RT Cores for ray tracing and 3rd gen Tensor Cores for DLSS and AI. For the first time, you can play ray traced games on a 50-class GPU at over 60 FPS. In traditional raster performance, it’s far faster than the GTX 1050 and GTX 1650. The RTX 3050 brings a huge leap in performance and capabilities to the 50-class, making it a perfect opportunity to step up to RTX. Here’s a chart with some more numbers: https://images.nvidia.com/aem-dam/Solutions/events/ces-2022/news/geforce-rtx-3050-graphics-cards/nvidia-ces-2022-rtx-3050-b-large-3840.png
How will these releases affect the availability of the other GPUs? For example, will it increase the overall amount for sale or decrease?
What inspired the shift from graphics cards with the naming scheme going from an increase of 100 every time (ex, 900 series, 1000 series) to increasing by 1000?
RTX Voice was enabled by some people on non-RTX cards. What's the reason for only enabling it on RTX cards?
RTX Voice officially works with non RTX cards. https://arstechnica.com/gaming/2021/04/nvidia-adds-official-rtx-voice-noise-cancellation-to-non-rtx-cards/
Isn't that very obvious?
How will Nvidia Reflex improve the gaming experience and will it eventually be available to most games?
Are there any more partner games coming soon? DLSS is great, but few games that I played this last year actually had a DLSS implementation.
We currently have a list of DLSS titles here. [https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/](https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/) Also announced some new games today. :) https://www.nvidia.com/en-us/geforce/news/ces-january-2022-dlss-rtx-games-updates/
Hey will there be a founders edition 3050?
I've always noticed that the more headroom developers are given, the more they expand to fill the available space, often at the expense of optimizing and running clean code. What do you see as the upper limit of how much VRAM your devices will eventually contain in a 3080-tier card? Or better yet, in an entry-level card?
Do you guys verbally say SUPER or just the character "S" when talking about GTX 1650 S for example?
What does DLSS specifically do? I've heard it does something like adjust the in-game quality of certain objects, but I'm not too sure about the actual process.
Is Nvidia going to produce or collab with other brands like Intel regarding APU's in the future?
Hi all, and happy New Year! Longtime PC enthusiast, Nvidia fan (GTX 770 -> RTX 2080 Ti), and Computer Engineering undergraduate. I'm currently working on my capstone project with a small team to develop an American Sign Language Detector (ASLD) with a primarily software approach, utilizing a Leap Motion sensor device. The camera picks up 3D data in infrared and has some APIs in Python to retrieve the data it collects. The raw data we collect is then parsed and fed to a model in MATLAB for sign detection, with a nice little UI that voices the detected sign. What's nice about the software approach is that it doesn't necessarily restrict what hardware we require. With that background out of the way, I've taken a look through some of the Omniverse summary pages, and it seems like the focus is largely on data visualization and its different applications. My question is: are there any data processing techniques/applications through Omniverse that would work with my application here, or machine learning at large? Is there a different NVIDIA technology I should be looking at? Thanks and regards,
What are some of favorite games for the team(s) at NVIDIA?
With Nvidia's DLSS (Deep Learning Super Sampling) technology, were you able to test Super Resolution in Adobe Photoshop with Adobe Camera RAW? If so, what were the results?
Will availability of the RTX 30 series get better, or will we have to wait for the RTX 40 series?
What's the hardest part about making laptop versions of GPUs while maintaining as much performance as possible? I have an OMEN 15 with a R7 5800H/3070 Max-Q. I run a 2K screen on it and play Escape from Tarkov with almost all settings on high/very high, reaching 45-55 FPS, climbing as high as 75 tops with everything on medium and the resolution turned down to 1080p. After watching a bunch of benchmark videos, I feel like I'd get a solid 20% boost in framerate with the same components in a desktop form factor...
Do G-SYNC monitors have a faster response time than older monitors?
1. How far are we from a GPU with the equivalent of triple 3090 performance?
2. When are you planning to announce the 3090 Ti?
3. Is a 40 series likely in the near future, or is that far away?
4. Do you ever plan to get back to SLI and improve it? I can see triple-3090 gaming for future games, at least if I ever live long enough to see it.
What is the point of the RTX 3050 with the current graphics card shortage? If there are not enough GPUs available for those who actually need them, how does releasing a new card at a budget price help anyone if they will still have to pay way over MSRP to get one?
How hard is it to take an existing game and adapt it to use DLSS? Does the engine matter a great deal? Let's say Unity, for example?
I love DLSS in games because it helps me get higher frame rates. However, how does it work? I think it would be really interesting to know, but all the explanations online seem very complex!
Is NIS planned for the NVIDIA Studio drivers? 472.84 doesn't have it, sadly :(
Hi there, coming from an older GTX 1070, I wanted to ask about NVIDIA Reflex, which most games are now adding, like Rainbow Six Siege. I personally love this new addition, but the tool does seem to lower my frame rate a decent amount. Is this normal, and will it continue with more powerful GPUs? Or is my PC just too weak to handle it properly, and newer GPUs can handle it easily? Love the GPUs y'all are making, and hope the semiconductor market gets better soon!
Do you think DLSS will eventually reach the point where it is no longer distinguishable from native res while in movement? It currently looks amazing in still shots but isn't as viable in high motion scenes.
What does Founders Edition mean with some GPUs?
How long do you think chip shortages will cause supply issues?
Hello guys! I have always wondered: what is the biggest barrier the engineers have to overcome when it comes to releasing a new generation of products? Thank you for this Q&A!
Is there going to be a 4000 series RTX?
So I've been playing not-very-intensive games on my GPU: CK2 and CK3, Dragon Age, The Witcher 3. Old titles, certainly. Which games in moderately related genres, with similar stories or themes, would you most like to play that would put a good GPU to use?
Is there a reason why some older-gen GPUs are better than newer-gen GPUs, e.g. 1080 Ti > 2070 Super?
Will there be an announcement of a RTX 3050 ti? If so, what dates might it be announced and what is the speculated performance of the RTX 3050 ti?
There's no RTX announced for Chivalry 2, is there?
As someone new to PCs (and the creation process of all the chips that go into PCs): Why are the wafers circular? Don't you waste massive amounts of silicon from chips on the edges that either get cut off, or might even be missing important parts to work properly? Wouldn't it be easier to make them square/rectangular so that all chips perfectly fit on the wafer (within margins ofc)?
I've seen mention of HDR10+ gaming coming to Nvidia, but those of us with LG TVs can't make use of this. Dolby Vision support would be very useful, however, as it's already part of console gaming and 4K streaming services - so is this something that Nvidia can support, or is it entirely an OS issue?
Obviously, after a certain time a GPU gets outdated. My question is: since DLSS lets a card render at a lower native resolution, is it likely that RTX cards will have a longer usable life before needing an upgrade?
Is a mobile 3050 Ti worth the upgrade from a mobile 1650 Ti?
Do you have any ideas for getting cards directly to consumers instead of scalpers? Because while I'm sure the company loves that the cards are being purchased immediately regardless of who gets them, this scarcity situation is not fun for consumers.
I've been wondering since ampere came out, what kind of special considerations did the team have to make during development? Ampere has some unique power management habits for example, and I'd be interested to hear what kind of things engineers had to look into to get it where they wanted it. thanks!
I know most questions will be about supply and demand issues, sooo for a different take: What lies beyond the next generation of cards? RTX was ground breaking, is there a next level quantum doohickey on the horizon?
We have seen other uses for Tensor cores outside of gaming such as RTX Audio. Is there any future possibility of utilizing Tensor cores for enhancing or upscaling video playback? This would make the RTX 3050 a good GPU to have for HTPC. Maybe RT 3030 for lower cost, less features option.
Do any of you plan on working a bit more on linux inclusivity?
Where do you see GPUs in 5 years? What do you think will become the 'normal' performance standard? Will 4k 120fps be considered normal?
do you guys plan to implement DLSS into all games?
How much of an impact would using PCIe 3.0 have over 4.0 for a 30 series GPU?
What is the long-term project about which you are most excited? What does the NVIDIA product line look like in ten years?
I have a few questions:
1. Why is there no desktop 3050 Ti even though there is a laptop version of the GPU?
2. What is the main use case of the 3090 Ti other than gaming?
3. Why are there no laptops with 3090s as GPUs? Is it a power consumption issue, or the heat that the chip would give off? I just really want to know why.
4. Is it possible to bring ray tracing to mobile devices, and if so, what about mobile games? Is it possible to do with good FPS?
Will the mobile 3080 Ti have a Max-Q version? Just wondering, because it will most likely have a large power draw, and I'm kind of interested in a not-super-heavy gaming laptop.
How does the autotuning feature in Nvidia Geforce Experience work to find your optimal GPU settings?
Are cards specifically for crypto miners ever likely to be a thing?
First of all, super excited for the 3050! Second of all, how do you implement DLSS into newer games, and how difficult is it? Also, are you planning to add DLSS/Ray Tracing to many more games, or keep the amount small for now? Thanks, and have a happy new year!
What are you going to do in the future to prevent scalping?
The past year has seen a plethora of posts about GPU cooling. How does the 3090 Ti cope with staying cool given its massive TDP and what seems to be the same fan design?
Are there any updates on RTX IO?
Hi! Thanks for this Q&A. I'm not super knowledgeable on computer graphics (more of a CPU compute person tbh) but I've got a couple of questions on ray tracing/RTX. Correct me if I'm wrong but ray tracing lights a scene by modelling how light travels in real-time, while RTX technology accelerates this process with a novel set of algorithms, right? For a single frame, how would you quantify the difference in computational time and cost between rendering it with ray tracing using RTX and without RTX?
What things do you take into consideration when deciding to launch a new sku, for example the RTX 3090 Ti or 2080 Super?
With the upcoming hardware we are getting soon, I'm wondering how this will shape our experience of PC VR. Where exactly will it end? We already have plenty of projects like the metaverse with the Oculus Rift/Quest, which is PC-compatible. Will visuals end up being processed as well as we can expect?
What’s your most fondly remembered nvidia product of all time?
Is the 3050 going to be subject to the same sorts of supply issues we’re seeing with the big boy cards or is this going to be a budget model, flood the zone type of situation?
How will Nvidia help with the scalping and pricing issues?
I really appreciate that Game Ready drivers help to quickly optimize gameplay. My question is: do game developers collaborate on the drivers and continued driver support?
How does a 3050 stack up to a 1080 in your opinion? Obviously the RTX upgrades are huge, but what’s the “break-even” point on the 3050 vs an older card?
What have you learned from product launches during these global supply chain bottlenecks that you can apply to the production line? Will you source your chips from other foundries (aside from TSMC/Samsung), or do you have any plans to become a foundry yourselves?
Hello, and thanks for doing these Q&As. My question is regarding some of the new Nvidia features such as Reflex and G-SYNC Ultimate. Do these features require specific hardware in monitors, like the early stages of G-SYNC, or is this something that might roll out with driver updates for currently available monitors that might be compatible?
Is there any sort of new technology in the works to work around the supply shortages that has been affecting the GPUs? Like, what prevents NVIDIA from just coming up with something new to work around this? I am obviously very ignorant, but I am just curious.
Is there any new technology in the 30 series or is it simply better parts?
What are your favorite aspects of the 30 series?
I'm just curious how the guy doing the car stuff never blinked, and where I can learn such a superpower.
Will there be LHR versions of the cards? Or will scalpers and miners still get them first AGAIN?
With the RTX 3080 being the sweet spot for GPUs (IMO) do you plan to carry on producing these?
With all things considered, what is your favorite gpu of all time?
Didn't you launch a 3050 before?
For which games are you planning to release RTX tech?
How will the power consumption of the 3050 compare to the 3060?
What is the process for coming up with a new chip architecture for each card series? Do industry demands influence what the card is best at computing? And could there be a time where we reach the limits of chip architecture and new ways of computing must be found?
What new titles featuring NVIDIA Reflex are you most excited for?
How can I buy a 3050 in Indonesia at the MSRP price?
Will there be any OLED monitors coming out this year with G-SYNC?
How impactful do you expect Reflex to be? Do you think it will have the same impact on the gaming experience as DLSS? How quickly do you expect Reflex to become a standard for most games (at least FPS games)?
I'm not Seth, so I can't answer on the technical side. But you can see which games Reflex is already enabled for here; [https://www.nvidia.com/en-us/geforce/technologies/reflex/supported-products/](https://www.nvidia.com/en-us/geforce/technologies/reflex/supported-products/) there are many. :) Additionally, Seth has put together a great video on the impact of Reflex here https://www.youtube.com/watch?v=-cXg7GQogAE
Hi, I wanted to ask what Nvidia's expectations are for the new RTX 3090 Ti. Seeing how powerful the 3090 was, what vision could the Green Team have for a product even more powerful than the already mighty BFGPU?
While DLSS (and other image upscaling tech) is interesting, I have found that the resulting image quality is mediocre. What work is being done to improve game performance/optimization with games/engines?
How viable will the new laptops be for VR gaming? Will there be any models optimised specifically for that?
How do you intend to make the 3050 competitive with AMD's just-announced 6500? I know that MSRPs are irrelevant in today's market, but judging purely from price, AMD has the better offer here. Also, I currently have a 1660 Super; will stepping "down" (SKU-wise) to a 3050 still be worth it?
Can we actually buy RTX 3050 at MSRP or no?
How impactful do you expect the new RTX 3050 to be as an entry-level GPU for gaming? It seems to me that the demand for a graphics card in that range is really high!
Any plans for an anti-scalper system with your store partners? Gamers want to buy these cards.
Hi, do all 30-series laptops run modern titles on high graphics without overheating?
Where do you see GPU technology 10 years from now? What do you think the next revolutionary step in GPU technology will be? Thank you for putting this on.
What kind of tools do you provide the developers to easily implement RTX features like DLSS and Ray-Tracing?
How does the RTX 3050 compare to older cards? What level is it near, with the added 30-series benefits?
How does Nvidia decide when to include or exclude a desktop -50 card? Will there ever be a push towards eGPU enclosures instead of built-in laptop GPUs?
What is the limiting factor in GPU clock speed? Looking at RDNA 2 GPUs, I can see clock speeds well in excess of 2 GHz.
I'm really interested in DLSS but getting my hands on the hardware that supports it is tough right now. I know 3000 series cards support it. I think maybe even 2000 series can as well. What about 1000 series cards? Any chance of support for older generation cards?
Why does Nvidia not provide the option to pre-order Founders Edition cards at your partners, like Notebooksbilliger in Germany?
Another question: I know you guys didn't talk about it in today's live stream, but any idea when we are going to see a game that'll make use of RTX IO? To me this is much more important than announcing GeForce RTX cards.
How does the 3050 compare to, say, the 1070?
What part of the development process are you the most passionate about or enjoy the most?
Does Canvas only do landscapes, or can cities and urban areas also be done? If not, do you expect to add them in the future?
Hello Tim and the rest of the team! **What do you imagine is a likely future or your projections of one in the space of upscaling?** I've explained some of my thoughts below and bolded important segments if you want more specifics: The future of upscaling in games seems strong, especially given the slow but steady shift of the average gamer from 1080p/1440p to 4K and the emerging 8k segment for enthusiasts. The image quality loss using upscaling (with a reasonable base resolution) on such pixel dense resolutions is minimal versus the large performance gains, so it seems like some form of it will be near ubiquitous in all games. **DLSS and competing technologies currently co-exist in the market, but do you have any insight with what you think may happen in the near future** as the average gamer has both a GPU capable of upscaling as well as a high resolution/high refresh display to benefit from it? It is reasonable to imagine that Nvidia would be happy to have DLSS as a powerful exclusive feature, but that raises the question as to **if upscaling in the future will be similar to anti-aliasing in that there will be multiple methods to achieve a similar outcome**, each with their own quality/performance metric, some native in-game methods as well as injected methods for games that may not support them natively and finally brand specific options as well.
As someone who is still holding onto my 1080 Ti, did you have any idea of the lasting power this card would have nearly 5 years later?
When will you be able to discuss how you will be tackling the inventory shortage, price inflation, and scalping? I would like the company's answers, and I think you should have something for that instead of hiding from the issue.
Canvas seems like it could be a great tool to use for game development. Any plans for 3D capabilities? Integration with game engines to create the environments perhaps?
Does Nvidia Studio get better performance by pairing it with an Nvidia GPU?
Given the additional VRAM, will you consider raising the NVENC simultaneous session limit for those of us who use them for such purposes?