BasilZero said: Yeah, the devs should have kept quiet and it should have been something that was done on the backend privately. |
I have a 3DS emulator; I am curious if it still works.
To those of you with an X670E motherboard - there are a few of you here - I'd keep an eye on this:
Crucial Discovers Flaw in AMD X670E Motherboards: Gen 5 NVMe Slots Drop to Gen 1 Speeds, Cause Boot Issues
https://www.techpowerup.com/327243/crucial-discovers-flaw-in-amd-x670e-motherboards-gen-5-nvme-slots-drop-to-gen-1-speeds-cause-boot-issues
Memory and SSD maker Crucial noticed an uptick in support requests by users claiming that their Gen 5 or Gen 4 NVMe SSDs would drop to PCIe Gen 1 speeds, besides being unable to boot into Windows after a restart. Crucial then did some digging, and localized the issue to users with motherboards based on the AMD X670E chipset, AMD's flagship Socket AM5 platform chipset. While not a function of the chipset itself, it turns out that there is a flaw in the way AMD designed the PCI-Express I/O of the X670E platform, specifically the PCIe Gen 5-capable M.2 NVMe interfaces that are attached to the CPU, causing them to drop in speeds to Gen 1. This problem isn't surfacing on the AMD B650 or the B650E, or even the X670—it is oddly specific to the X670E, despite the Gen 5 M.2 NVMe slots not being wired to the chipset.
The article speculates about the cause, offers a temporary workaround, and notes that MSI has already released a fix.
Please excuse my bad English.
Currently gaming on a PC with an i5-4670K@stock (for now), 16GB RAM 1600 MHz and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.
Chrkeller said: I have a 3DS emulator, i am curious if it still works. |
It should still work - the only thing is you can't check for updates, because the update servers are offline.
BasilZero said:
It should still work - the only thing is you can't check for updates, because the update servers are offline. |
Good to know. I'm not too worried about updates, given the emulator works fine.
Hm, I'm not sure what happened there with Ryujinx - I actually doubt Nintendo could do anything about it legally (aren't those guys from Brazil?), especially since Ryujinx avoided doing anything illegal.
So, if Nintendo were clever, they'd offer them money or a job, since it's rather embarrassing having your newest game run better on an emulator than on your actual hardware.
Jizz_Beard_thePirate said:
The thing is that it's not up to Nvidia to convince people to buy their products... People are already buying their products. It's up to AMD to convince people to buy their products instead. I think a lot of people in that price range, or really in a lot of price ranges, think to themselves... Should I buy this AMD GPU or spend more and buy an Nvidia GPU? Like I don't think people are penny pinching that much to be like, well I can buy a 7800XT for $480 but I can't spend an extra $50-$70 for a 4070. I think we are in the era where raster performance alone no longer sells products. Let's say I am a 970/1070 user and I want to upgrade. Okay, so if I were to upgrade to a 7800XT, what am I getting? A lot more raster performance (compared to 970/1070)? Yep. FSR 2? Nah, have that. Anti-Lag 2? Already have Reflex. Actually, Reflex is in way more supported games than Anti-Lag 2, so this would be a downgrade. Ray tracing? Yeah, but kinda shat on AMD. Long term driver support? Kinda iffy considering AMD's track record. AI upscaling? Nah. More VRAM? Heck yea. Okay, what if I spend an extra $100 and get a 4070? A lot more raster performance (compared to 970/1070)? Yep. DLSS that is significantly better and more supported? Yep. Ray tracing that is actually viable? Yep. Reflex? Yep. FSR 2? Yep. FSR 3? Yep. DLSS 3 FG? Yep. Long term driver support? Hell, Nvidia is still supporting Maxwell. The list goes on. So it's like, outside of a few areas like 7700XT vs 4060 Ti, Radeon really doesn't make much sense unless you want pure raster... And what is raster alone going to give you when UE5 was built around upscaling, when XVI needs a 4090 for 1440p, when the new MH game requires frame gen at 1080p to achieve 60fps? The reality is that the extra raster you get for the price point feels pointless when developers are building entire games around upscaling as a requirement. |
DLSS is better than FSR for sure - in still frames and in zoomed-in, slowed-down ones. During normal gameplay, the difference is barely noticeable, if at all, when run side by side at normal framerates, with some rare exceptions where ghosting may be an issue. I give NVidia a small advantage for the better quality and wider support.
AMD has analogues to Reflex and DLSS 3 FG, and the latter, AFMF, even works without FSR. So, no real advantage for NVidia here.
Long term driver support? AMD does it too, though they put older cards on a separate legacy branch. This is sometimes even missed by techtubers (for instance, Linus just had a new video with an all-Amazon PC with an RX 580 and used the last non-legacy driver because he didn't know that the card has more modern drivers, the most recent one being from July this year and another due out this month). However, this only goes back to Polaris, and Maxwell is one generation older, so there's a small advantage for NVidia here.
One thing you didn't mention is CUDA, which is a big advantage if you work with software that uses it, but pretty useless for gamers. Still, it's one more plus for NVidia.
But this also means that for gamers, the only real advantages are the better raytracing performance and slightly better DLSS over FSR - and raytracing is mostly worthless on NVidia cards under $500, as they lack both the raw power and the VRAM to put it to good use.
As for your 7800XT vs 4070 comparison, I just checked how they compare, and basically the 7800XT was on average 10-15% faster than the 4070 in raster but 10-15% slower when raytracing is used - and both needed upscaling to reach 60FPS with RT activated. I'd consider that a draw, especially considering some of the games already used over 10GB of VRAM, so in 1-2 years this will become the bottleneck for the NVidia card. Certainly not worth the extra $100 unless you want to upgrade soon or don't mind lowering settings to keep the VRAM buffer alive.
Long story short, under $500 I see absolutely no reason to buy an NVidia card, as they only have the small DLSS advantage left (outside of CUDA) and tend to have less VRAM, which I consider a much bigger advantage for AMD than DLSS can be for NVidia. Above that price tag, NVidia can finally flex its raytracing muscles and the gap to AMD widens, but for anything below a 4070 Super I'd much rather buy an AMD card, as NVidia is simply overpriced by comparison.
Last edited by Bofferbrauer2 - on 03 October 2024
The Nintendo eShop rating Thread: http://gamrconnect.vgchartz.com/thread.php?id=237454 List as Google Doc: https://docs.google.com/spreadsheets/d/1aW2hXQT1TheElVS7z-F3pP-7nbqdrDqWNTxl6JoJWBY/edit?usp=sharing
The Steam/GOG key gifting thread: https://gamrconnect.vgchartz.com/thread/242024/the-steamgog-key-gifting-thread/1/
Free Pc Games thread: https://gamrconnect.vgchartz.com/thread/248138/free-pc-games/1/
Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see
So pay up motherfuckers you belong to "V"
La la la la laaaaa ~ 🎵
What will she find todayyyyy
Our lovely Apatosaurus Marnieeee
Scavenging for pretty things to displayyy!!! ~ 🎶 pic.twitter.com/wILU1hETh2
— Amber Isle 🔶 Switch ➡️ Feb 2025 (@AmberIsleGame) October 2, 2024
I need this game in my veins already.
It's like Animal Crossing mixed with Moonlighter, but with Dinos.
Last edited by Chazore - on 03 October 2024
Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see
So pay up motherfuckers you belong to "V"
Bofferbrauer2 said:
DLSS is better than FSR for sure - in still frames and in zoomed-in, slowed-down ones. During normal gameplay, the difference is barely noticeable, if at all, when run side by side at normal framerates, with some rare exceptions where ghosting may be an issue. I give NVidia a small advantage for the better quality and wider support. AMD has analogues to Reflex and DLSS 3 FG, and the latter, AFMF, even works without FSR. So, no real advantage for NVidia here. Long term driver support? AMD does it too, though they put older cards on a separate legacy branch. This is sometimes even missed by techtubers (for instance, Linus just had a new video with an all-Amazon PC with an RX 580 and used the last non-legacy driver because he didn't know that the card has more modern drivers, the most recent one being from July this year and another due out this month). However, this only goes back to Polaris, and Maxwell is one generation older, so there's a small advantage for NVidia here. One thing you didn't mention is CUDA, which is a big advantage if you work with software that uses it, but pretty useless for gamers. Still, it's one more plus for NVidia. But this also means that for gamers, the only real advantages are the better raytracing performance and slightly better DLSS over FSR - and raytracing is mostly worthless on NVidia cards under $500, as they lack both the raw power and the VRAM to put it to good use. As for your 7800XT vs 4070 comparison, I just checked how they compare, and basically the 7800XT was on average 10-15% faster than the 4070 in raster but 10-15% slower when raytracing is used - and both needed upscaling to reach 60FPS with RT activated. I'd consider that a draw, especially considering some of the games already used over 10GB of VRAM, so in 1-2 years this will become the bottleneck for the NVidia card. Certainly not worth the extra $100 unless you want to upgrade soon or don't mind lowering settings to keep the VRAM buffer alive.
Long story short, under $500 I see absolutely no reason to buy an NVidia card, as they only have the small DLSS advantage left (outside of CUDA) and tend to have less VRAM, which I consider a much bigger advantage for AMD than DLSS can be for NVidia. Above that price tag, NVidia can finally flex its raytracing muscles and the gap to AMD widens, but for anything below a 4070 Super I'd much rather buy an AMD card, as NVidia is simply overpriced by comparison. |
Lol, if you really believe there is only a small difference between DLSS and FSR after all the comparisons Digital Foundry and Hardware Unboxed have done, then I don't know what to tell you other than to take the blinders off and face reality.
AMD has their own version of Reflex, yes... But it's supported in significantly fewer games, which was the point I was making...
As for drivers, the difference between what AMD and Nvidia do with their legacy drivers is that AMD, in their own words, only provides the following: "Going forward, AMD is providing critical updates for Polaris- and Vega-based products via a separate driver package, including important security and functionality updates as available." You can see this if you read the actual driver notes on AMD's website. Whereas with Nvidia, they continue to provide Game Ready driver updates.
"As for your 7800XT vs 4070 comparison, I just checked how they compare, and basically the 7800XT was on average 10-15% faster than the 4070 in raster"
Maybe you should check again, or link me where it's that much faster on average? According to Hardware Unboxed, excluding ray tracing, the 7800XT is on average 8% faster at 4K and 7% faster at 1440p. https://www.techspot.com/review/2736-geforce-rtx-4070-vs-radeon-7800-xt/
At the end of the day, it doesn't really matter what you or I think. It comes down to what the market thinks, and the market has spoken for the last 3-4 generations. The market wants Nvidia-specific features, and AMD's feature set and GPU stack aren't good enough outside of some comparisons, which gives them that 12% market share. People can blame brand loyalty, but the reality is AMD's CPU division was able to convince people that they are better than Intel. Meanwhile, the Radeon division continues to feel like it's run by idiots. Even something as simple as launching Anti-Lag 2, they goofed by getting people's accounts banned.
It's time Radeon stops being the Xbox of the GPU market if they want to actually gain any market share.
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850
Yes, AMD's CPU division was able to convince people that they are better than Intel, but even now there are people who still think Intel is better.