
PC Specs?


Laim

Recommended Posts

I own and have beaten 2077. On an i7-6950X. With everything maxed. Ran fine for me. The biggest buff I have seen as the owner of a decent CPU is that when DLSS is enabled you get a MAJOR FPS boost. In fact, the way I understand it, many players cannot play without it, likely due to low core counts or sub-par PCs.

 

I'm not at war with AMD. If it was so much better their stock would show it. The simple fact is that you have all this fake fanfare out there, something AMD is awesome at generating, but I have yet to see it matter in the real world. Intel has always out-performed in the real world, and my company does yearly testing so we can update our offerings based on customer needs. Yes, we even do gaming rig builds, and our top performers in actual games have always been Intel chips, but we don't buy bad chips. We buy their X series and similar, which is probably why we have had such good experiences with them in comparison to AMD. Not that we don't purchase and sell AMD, but they are generally relegated to budget builds or encoding setups, or SOHO servers, etc.

 

A final thing I will note is that while AMD has focused on shrinking their lithography, they had chips at 7nm compared to Intel chips at 14nm as Intel began rolling out more thoroughly tested 10nm chips. AMD's 7nm chips were barely ahead of Intel's aging 14nm chips at the time. That speaks volumes about the differences between the two. I am sure AMD was embarrassed and has since taken advantage of a 7nm lithography in a more efficient manner, but that alone keeps me from buying them the day they arrive. Oh, and let's not forget the "each core shares one cache which slows us way down" fiasco. I believe that was only fixed on AMD chips within the last year, with the current-gen Ryzens. Intel learned that lesson on the Core 2 Duos eons ago.


2 hours ago, The_Great_Sephiroth said:

I'm not at war with AMD. If it was so much better their stock would show it.

The stock market doesn't paint the full picture. Most of that valuation is based on industrial use, like supercomputers and servers, not residential gamers. Besides, AMD is currently at 50.8% market share, passing Intel in desktop CPUs, according to this article: https://www.techradar.com/news/amd-overtakes-intel-in-desktop-cpu-market-share-for-the-first-time-in-15-years

 

And as far as I'm aware, Intel still hasn't really released any 10nm chips yet, as they struggle to produce them (or they haven't been able to tune their 10nm process to compete with anything yet). The nm figure is just the process size, and shrinking it increases power efficiency, which is why Intel runs so hot compared to AMD now. Running more efficiently means AMD can find new limits as they continue to fine-tune the process with each generation. It takes time to fine-tune new technologies. Intel, on the other hand, is so desperate that they sliced a layer off their chips just to get better contact with the heatsink so they can overclock them ever so slightly more in order to keep up with AMD again. I imagine that makes them even more delicate too. AMD is already working on 5nm and 3nm chips while Intel still struggles to release any 10nm chips. At some point Intel will need to spend some of their blood money and evolve in order to catch up; until then, they're still being lazy and cheap.

Edited by Fox

3 hours ago, The_Great_Sephiroth said:

I own and have beaten 2077. On an i7-6950X. With everything maxed. Ran fine for me. The biggest buff I have seen as the owner of a decent CPU is that when DLSS is enabled you get a MAJOR FPS boost. In fact, the way I understand it, many players cannot play without it, likely due to low core counts or sub-par PCs.

 

Which isn't surprising, as the i7-6950X is a 10-core CPU, and that shows exactly what I was driving at. If you buy a 4-core CPU today you can still play a lot of games without problems, but I would call that not future-proof, or even a bad idea already for the present. The games industry is in the midst of changing and already supports a lot more cores.

 

3 hours ago, The_Great_Sephiroth said:

 

I'm not at war with AMD. If it was so much better their stock would show it. The simple fact is that you have all this fake fanfare out there, something AMD is awesome at generating, but I have yet to see it matter in the real world. Intel has always out-performed in the real world, and my company does yearly testing so we can update our offerings based on customer needs. Yes, we even do gaming rig builds, and our top performers in actual games have always been Intel chips, but we don't buy bad chips. We buy their X series and similar, which is probably why we have had such good experiences with them in comparison to AMD. Not that we don't purchase and sell AMD, but they are generally relegated to budget builds or encoding setups, or SOHO servers, etc.

 

A final thing I will note is that while AMD has focused on shrinking their lithography, they had chips at 7nm compared to Intel chips at 14nm as Intel began rolling out more thoroughly tested 10nm chips. AMD's 7nm chips were barely ahead of Intel's aging 14nm chips at the time. That speaks volumes about the differences between the two. I am sure AMD was embarrassed and has since taken advantage of a 7nm lithography in a more efficient manner, but that alone keeps me from buying them the day they arrive.

 

At another time, when AMD called their CPUs Athlon and Intel called theirs Pentium, it was exactly the other way round: AMD was trailing behind Intel's lithography and their CPUs were more than competitive. So what does that tell us? The lithography is just one of hundreds of parameters that influence the power of a CPU.

 

Secondly, numbers like 7nm and 10nm are just labels now and not easily comparable, as that depends very much on what exactly is measured in the different lithographic techniques. Here is the info in layman's terms: https://www.techcenturion.com/7nm-10nm-14nm-fabrication

Quote: "What Intel calls as 10nm, is similar to what TSMC calls as 7nm".

 

So at the moment they produce somewhat equally good CPUs (if we use gaming benchmarks) with transistors about equally densely packed (see the Transistor Density Comparison table; Intel's 10nm process is even slightly denser). There is no reason for AMD to be embarrassed; maybe the 10(?) times bigger Intel should be embarrassed that they can't get a lead with a research budget that is multiple times higher.

 

3 hours ago, The_Great_Sephiroth said:

 

 

Oh, and let's not forget the "each core shares one cache which slows us way down" fiasco. I believe that was only fixed on AMD chips within the last year, with the current-gen Ryzens. Intel learned that lesson on the Core 2 Duos eons ago.

 

So you are saying Intel made the same mistake? And learned from it like AMD did now? Well, here is 1 point for both 😉

 

Edited by meganoth

We should split this off into a CPU technical thread. I feel like we're OT now.

 

Anyway, I can only find a single story about the very brief time AMD got ahead of Intel late last year/early this year due to Intel having production issues. You know, due to the world being closed for business. AMD has suffered also, don't get me wrong, but they had better output, so it was a matter of people needing to have something.

 

This is from Jan 5, 2021

 

On the other hand, AMD lags way behind, as usual. Also as usual, AMD is still "closing the gap". Been told that since like, 2001? At the rate of closing that gap they should overtake Intel in about a thousand years.

 

Worldwide CPU Share

Passmark results updated daily

AMD still closing that gap

 

Again, both manufacturers make good chips, but Intel still dominates in every case we have put them through, including gaming. I do own a few AMD systems, but they are generally for other things. In fact I have a quad-core AMD laptop running PCLinuxOS for the wife to do web-stuff, email, streaming video, and even basic games on. Good system, but it runs hotter than my eight-core i5 laptop.

 

Speaking of which, we just had a discussion on here about AMD running hotter. A user was advised to check temps, and another user with the same setup mentioned the aftermarket fan/heatsink he was using on the chip to keep it cool. AMD may be cooler than prior generations, but our Ryzens run hotter than the Intels when under load. I assume games would stress them the same way as production software. Especially 7 Days or Raft.

 

Fox is right about Intel in one thing. They need to get their butts in gear. AMD is listening to their users and tossing on more cores so gamers can stream and play (or simply play 2077 and pray they don't overload the CPU) while Intel has not, and when they have added cores, they are charging INSANE prices. My 6950X was ridiculous, but new chips are just stupid. It is the primary reason that I have NOT upgraded yet.

 

In fact, when two things happen I will likely build an AMD-based rig. First, I need a big jump in power and cores. I would like a 16-core, 32-thread setup with at least 50% more performance per core. Second, games need to work properly with AMD. Ark is a shining example of where, at least on Windows (not sure about Linux), AMD users have more issues simply for being AMD users, more crashes being a big one. Unreal Engine is and always has been set up for Intel/nVidia systems, and Epic now needs to give some love to AMD. I suppose a third thing would be for AMD to catch up to nVidia in ray tracing. Once you have used it, you can NEVER go back. It makes things so much better! I imagine with Vulkan getting bigger and bigger, AMD can close this gap soon enough. I also suppose I could do the AMD CPU/nVidia GPU combo, but I'm not sure yet.

 

Now, probably to your surprise, we're about to build two HEAVY servers. We chose Threadrippers over Xeon-W despite the performance gap. Why? Price. Xeon-Ws are what we wanted, but they are well beyond the client's budget. Intel needs to stop their stupid "price a Xeon high so gamers don't buy them" crap. My motherboard (ASRock X99 Extreme4) can handle Xeons with more cores and even ECC RAM, but Xeons are just priced too high. I do hope AMD can put a hurt on Intel and get them to wake the heck up. Intel needs AMD as much as AMD needs Intel.

 

*EDIT*

 

Fastest AMD versus fastest Intel X. AMD has a whopping 5% overall better performance, but the Intel overclocks better. I do not OC. The big elephant in the room? The AMD is nearly half the price. This is why AMD ***IS*** going to kill Intel.

 

5% better, 50% cheaper

Edited by The_Great_Sephiroth

And let's not forget the abysmal performance that is Intel's 10th and 11th gen CPUs. It's pretty sad when some of the 9th-gen equivalents actually end up performing better in real-world scenarios.

 

For most people right now, and since Gen 3 Ryzen, AMD is the choice for performance-per-dollar in the CPU market. After over 20 years of having the dust kicked in their face by Intel, they have risen above and are truly ahead of the game for workload and performance.  It used to be that if you wanted to do little things fast (like games), you went AMD. If you needed to handle a workload of serious computing, you went Intel. Now AMD is doing both, and they're doing it better than Intel. 

 

And with Nvidia @%$#ing over the consumer market the past couple of years, AMD is getting a foothold there as well. Sure, they don't have the lead on ray tracing or DLSS, but how big of a market is that really? Performance-wise, their latest generation of GPUs is on par with or ahead of Nvidia.

 

We're talking consumer market here. PCs for normies. Not servers. :)

Edited by SylenThunder

I'll disagree with their GPUs. They may be closing in on the 2080, but the 3090 is just WAY beyond anything else out there, and the 3080 is not far behind. CPUs? AMD is literally pricing Intel out of the market. I will say this: the comparison linked above is Intel on old lithography versus a new 7nm AMD chip. The AMD is only 5% better overall. That speaks volumes to how sloppy AMD is. They should be 50% better or more, since they can cram twice as many transistors onto the same die as Intel can, but they aren't. Still, half the price even if it was the same performance says a lot to people on a budget. Like I said, I am looking at AMD for my next build unless Intel does something meaningful.

 

I would like to see Intel try to justify double the price for the same performance. That would be interesting.


While I like the fact that AMD is catching up to the greedy Nvidia in GPUs, I still prefer to buy Nvidia cards just because of gaming support dominance. If AMD can get their drivers and software to not suck, and if AMD can gain dominance in gaming support, then I would happily switch over to AMD GPUs. In my opinion, support matters more than cost, which is why I tolerate Nvidia's pricing.


With my newest computer, I went with the AMD Ryzen 7 3800X. I looked at the comparable Intel (Core i7-9700K), which, based on benchmarks, was better. However, the AMD gave me what I wanted and I got a really good deal on it. I am not a hardcore computer person like all of you, but I typically build my computers to last (7-10 years), and with the exception of adding new RAM or replacing a burnt-out component, I usually do a complete new build when the time comes and my computer is seriously lagging in performance.

 

It makes it easy to get my wife on board to spend money on a new computer for me when I point out that every time I build one, it lasts for a minimum of 7 years.

23 hours ago, The_Great_Sephiroth said:

I originally used a GTX 550 Ti with 1GB. Played just fine back in those days. I still have the card. I also did not realize that the 2080 only had 8GB. I have the Ti version which has 11GB.

 

Mine is the 2080 Super.  I don't think they had Ti or the regular ones available when I was building it, but I lucked out on getting this one.  I hope I can get a lot of years out of this one, but I don't think I play as demanding games (with the exception of 7D2D) as you all do.  As I have gotten older, I have gotten into the more casual games that I can enjoy when I am not spending time with the family  🙂


7 minutes ago, Fox said:

While I like the fact that AMD is catching up to the greedy Nvidia in GPUs, I still prefer to buy Nvidia cards just because of gaming support dominance. If AMD can get their drivers and software to not suck, and if AMD can gain dominance in gaming support, then I would happily switch over to AMD GPUs. In my opinion, support matters more than cost, which is why I tolerate Nvidia's pricing.

Don't know how old you are, but I have a guy I formed my clan with in 1999 who is still with me. He's a Marine and he travels, but we both have a good laugh about the time he bought a Radeon 9800 (AGP, years ago) and the ATI driver did reverse rendering to gain a few FPS and compare with the nVidia card of that day, which I had. The thing is, drawing player models (and plants, furniture, etc.) last meant that in most games he had a permanent, LEGAL wallhack. He sent me screenshots from UT99 and the Infiltration mod we played, and he saw all players and such all the time. ATI released a fix, but it cost him some FPS. Still, those were fun times. ATI and AMD have always had driver issues with their GPUs though. Remember the Omega drivers? I do!

 

BFT, that is what drives most gamers. Price. If an AMD CPU is in the same ballpark as the Intel, but costs less, it WILL be sold. I hope Intel learns this lesson soon, or they WILL lose me to AMD for CPUs.


1 hour ago, The_Great_Sephiroth said:

I'll disagree with their GPUs. They may be closing in on the 2080, but the 3090 is just WAY beyond anything else out there, and the 3080 is not far behind. CPUs? AMD is literally pricing Intel out of the market. I will say this: the comparison linked above is Intel on old lithography versus a new 7nm AMD chip. The AMD is only 5% better overall. That speaks volumes to how sloppy AMD is. They should be 50% better or more, since they can cram twice as many transistors onto the same die as Intel can, but they aren't.

 

If AMD had so many possibilities with double the transistors and could only do 5% better, then NOW, when Intel has the same number of transistors as AMD, they should be ahead by 45%, right?

 

But they are not. Which means that you may be putting too much value on the effect of a shrink. Ten or more years ago every shrink of the lithography was accompanied by a sizable increase in MHz as well as better IPC. All of that together made a new generation of CPUs so much better than the previous one. But now the frequency stays constant, and the small improvements in IPC and more transistors are not that much of a deal.

 

Also, both companies probably noticed that the notebook market is the current growth market (even before Corona) and have used the shrink to draw less power rather than to get more performance out.

 

 

 

Edited by meganoth

3 hours ago, Fox said:

I'm 36. And yes, I remember Omega drivers. It was often the only way I could get AMD GPUs to work properly.

Then you were born in 1985. I was born in 1980. Glad you remember those drivers. No idea what happened to them though.

 

Meganoth, I believe they are only 5% better because this is how they are reducing power and heat. Half the lithography but the same transistor count means more space to breathe. AMD has always run hotter and drawn more power in the past. Perhaps while Intel twiddles its thumbs AMD will close that gap, and when Intel begins to realize something is wrong, AMD will make better use of their stuff for more speed also. Time will tell.


18 hours ago, The_Great_Sephiroth said:

Then you were born in 1985. I was born in 1980. Glad you remember those drivers. No idea what happened to them though.

 

Meganoth, I believe they are only 5% better because this is how they are reducing power and heat. Half the lithography but the same transistor count means more space to breathe. AMD has always run hotter and drawn more power in the past. Perhaps while Intel twiddles its thumbs AMD will close that gap, and when Intel begins to realize something is wrong, AMD will make better use of their stuff for more speed also. Time will tell.

Heck, I do remember my first assembly of a PC where I had to choose between an S3 Trio and a VoodooRush ))) I made a bad choice then)), and my first PC was a 33 MHz IBM with a TURBO mode to 66 MHz which did exactly nothing))) A 10MB hard drive! And before that I was frequently visiting one of my friends, as his father was some sort of computer scientist at the time and he had a "portable" computer using a spool of magnetic tape)))


My first real experience with games was Pong. Later I wrote software on a TRS-80. Then the first PC I built was the first 16MHz home computer. Dad was VP of marketing for one of the big tech companies and got a Heathkit about a month before they hit the shelves. He tossed the instructions and just laid out the parts on the game room table. Took me about a week to sort it all out. Around '86-7 if I remember correctly. Still have that here somewhere. Have an old Mac 512k that still works too, but I lost the keyboard for it.

 

Been fixing electronics since I was like 8, and then just naturally went into PCs. Graduated high school and tried to get a job at Tandy, but they turned me down because "no experience". Then the day I was shipping out for the Army I got a call out of the blue from American Megatrends. Hadn't even applied there, but they had seen some of the stuff I did. They wanted me to work on code for BIOS software. Was really hard turning that down. I was literally getting into my truck to go report for duty when they called.

 

I really do need to clean out the closet. It's got a fair amount of old systems I should document, and a lot of e-waste I should just get rid of. LOL

Edited by SylenThunder

5 hours ago, SylenThunder said:

I really do need to clean out the closet. It's got a fair amount of old systems I should document, and a lot of e-waste I should just get rid of. LOL

This may sound cheesy but....

I had a lot of "old junk" that was more for tinkering than useful. Basically I feel better knowing I'm learning on something that's basically trash vs. spending money tearing something new up. And I had some stuff from old "hobbies" I was never going to get back into but couldn't throw away because "it was still good stuff... hardly used!"

 

Anyway: My solution, should anyone want some ideas:

1. Give away/donate what you can to people who might actually want it (not just dump it on the Salvation Army). Of course, sell what you can if it's really, truly high dollar.

 

2. Take pictures of your junk.. wait... no... that's not what I mean. Literally just take pictures and put them in "the picture collection", whatever you use. Take as many as you want, obviously. You'll likely never look at them again, but somehow it "stores" them in your brain as "not completely gone" and you can still show people the stuff if it ever comes up... so it's easier to throw out the physical stuff.

 

3. For everything else you can't throw out (sentimental junk): either display it somehow as "decor" or carefully pack it into a box and label the box with the date you packed it and the contents. Put it in a closet or something. If you don't open it, when/if you move homes you can choose to take it or not. Eventually it will seem silly to keep it. Or your house will burn down with the box in it.

 

Then: After you're done, treat yourself and buy something new, like a system that can play 7D2D at full graphics settings at 60+ FPS :). Even get a new keyboard, monitor, the works...Not trying to be a jerk, just saying "you deserve this!"

Edited by doughphunghus

6 hours ago, doughphunghus said:

This may sound cheesy but....

I had a lot of "old junk" that was more for tinkering than useful. Basically I feel better knowing I'm learning on something that's basically trash vs. spending money tearing something new up. And I had some stuff from old "hobbies" I was never going to get back into but couldn't throw away because "it was still good stuff... hardly used!"

 

Anyway: My solution, should anyone want some ideas:

1. Give away/donate what you can to people who might actually want it (not just dump it on the Salvation Army). Of course, sell what you can if it's really, truly high dollar.

 

2. Take pictures of your junk.. wait... no... that's not what I mean. Literally just take pictures and put them in "the picture collection", whatever you use. Take as many as you want, obviously. You'll likely never look at them again, but somehow it "stores" them in your brain as "not completely gone" and you can still show people the stuff if it ever comes up... so it's easier to throw out the physical stuff.

 

3. For everything else you can't throw out (sentimental junk): either display it somehow as "decor" or carefully pack it into a box and label the box with the date you packed it and the contents. Put it in a closet or something. If you don't open it, when/if you move homes you can choose to take it or not. Eventually it will seem silly to keep it. Or your house will burn down with the box in it.

 

Then: After you're done, treat yourself and buy something new, like a system that can play 7D2D at full graphics settings at 60+ FPS :). Even get a new keyboard, monitor, the works...Not trying to be a jerk, just saying "you deserve this!"

I always sell anything I'm not using. For PC hardware I have pictures like you said, but I also keep the benchmarks in case I ever want to compare. Older hardware usually doesn't have much value, although some older hardware is worth more than its original cost decades later. I always try to sell as soon as I no longer use something, to get the most resale value from it. But some things are just unsellable; I've had books and DVDs listed for years for pennies, no one wants them XD


Doughphunghus, 7 Days cannot attain more than 60fps. Unity is a very dated engine and internally things go haywire above 60fps. The only way to fool your FPS counter into thinking you are above 60fps is to disable vertical sync, which means you get a LOT of incomplete buffers which count as entire frames, despite not being so. I have a 144Hz G-Sync monitor and a 2080 Ti. I stay pegged at 60fps the entire time I am playing 7 Days, and I play in 1440p. I leave vertical sync on so I don't get tearing and so G-Sync works. Always capped at 60fps in Unity games though. Subnautica, Subnautica BZ, 7 Days, My Summer Car, The Forest, Raft, LoE, Stranded Deep, and others all lock you to 60 real FPS max. The good news is that my 2080 Ti never warms up while playing those titles!


1 hour ago, The_Great_Sephiroth said:

Doughphunghus, 7 Days cannot attain more than 60fps. Unity is a very dated engine and internally things go haywire above 60fps. The only way to fool your FPS counter into thinking you are above 60fps is to disable vertical sync, which means you get a LOT of incomplete buffers which count as entire frames, despite not being so. I have a 144Hz G-Sync monitor and a 2080 Ti. I stay pegged at 60fps the entire time I am playing 7 Days, and I play in 1440p. I leave vertical sync on so I don't get tearing and so G-Sync works. Always capped at 60fps in Unity games though. Subnautica, Subnautica BZ, 7 Days, My Summer Car, The Forest, Raft, LoE, Stranded Deep, and others all lock you to 60 real FPS max. The good news is that my 2080 Ti never warms up while playing those titles!

That's interesting, I've never heard anything like that before. Can you link your source? I wouldn't mind a read.


Have fun, I spent an entire day reading this mess about a year ago. All I can tell you is that Unreal Engine > Unity. There's plenty of easy things to check though. Look around. See any refresh rate setting in Unity? No? Me either. Here's a boat-load of threads on this very thing, and after reading enough you will find one which links to Unity Engine documentation (I am NOT spending another day looking for it) which explains why timing in their engine is based on 60Hz/60fps. This is a Unity thing, NOT a 7 Days thing.

 

Have fun!


39 minutes ago, The_Great_Sephiroth said:

Have fun, I spent an entire day reading this mess about a year ago. All I can tell you is that Unreal Engine > Unity. There's plenty of easy things to check though. Look around. See any refresh rate setting in Unity? No? Me either. Here's a boat-load of threads on this very thing, and after reading enough you will find one which links to Unity Engine documentation (I am NOT spending another day looking for it) which explains why timing in their engine is based on 60Hz/60fps. This is a Unity thing, NOT a 7 Days thing.

 

Have fun!

So, no source 😆 What you're describing isn't a Unity problem, it's a settings problem. GPUs don't render partial frames (unless it's rendering with SFR in an SLI setup, but never independently rendering partial frames). What's actually happening is that your monitor, with vsync off, is displaying multiple COMPLETE frames. This is called screen tearing and happens when your GPU is rendering more frames than your monitor can display, or fewer frames than your monitor's refresh rate. The reason you're locked to 60fps in games with vsync on is because you've not noticed that your monitor is running at 60Hz. Yes, 60Hz+ monitors don't always run at the Hz on the side of the box. This happens sometimes when you have a driver update, unplug/replug the display cable, change primary monitor, and so forth.

You can once again enjoy 60+ fps gaming with G-Sync and V-Sync enabled by opening NVIDIA Control Panel > Display > Change resolution > Refresh rate drop-down.
 

[Screenshot: NVIDIA Control Panel, Change resolution > Refresh rate drop-down]

 

The first Unity docs result in your suggested Google search on the topic:

Additionally if the QualitySettings.vSyncCount property is set, the targetFrameRate will be ignored and instead the game will use the vSyncCount and the platform's default render rate to determine the target frame rate. For example, if the platform's default render rate is 60 frames per second and vSyncCount is set to 2, the game will target 30 frames per second.
using UnityEngine;

public class Example
{
    void Start()
    {
        // Make the game run as fast as possible
        Application.targetFrameRate = 300;
    }
}
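To make that interaction concrete, here's a minimal sketch of my own (the FrameRateCheck name is just for illustration; it's not from 7DTD or the docs above) showing how vSyncCount and targetFrameRate play together, plus a log of the refresh rate Unity thinks the current display mode has:

using UnityEngine;

// Illustration only: demonstrates the vSyncCount / targetFrameRate rule quoted above.
public class FrameRateCheck : MonoBehaviour
{
    void Start()
    {
        // With vSyncCount = 1, targetFrameRate is ignored and the frame rate
        // follows the display's refresh rate (a 144Hz monitor can give up to 144fps).
        QualitySettings.vSyncCount = 1;

        // Only takes effect when vSyncCount is 0; -1 means "platform default / uncapped".
        Application.targetFrameRate = -1;

        // Log the refresh rate of the mode the game actually created its window with.
        // If this prints 60 on a 144Hz panel, the game (or the desktop) picked a 60Hz mode.
        Debug.Log("Current mode: " + Screen.currentResolution.refreshRate + " Hz");
    }
}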

 

 

Edited by Naz

I suggest you spend at least five seconds looking at the results I linked. You cannot override this. In fact, while playing Unity games, my monitor is in 144Hz mode. The game only allows 60fps due to internal timing. Again, if you want to understand it and not simply beg others to do it for you, read Unity's manuals. Also, if you read the replies on the code you linked, you'd see it still locks you to 60fps when built. You're not even reading the entire post. You read a single doc and ended it.

 

Since you're too lazy to literally just read some of what I linked, here is a very relevant result.

 

Even after disabling v-sync and using the code you linked, capped at 60fps

 

Oh, and here's an answer literally on the Unity forum explaining that it won't go above 60fps.

 

Locked at 60

 

Finally, if you own ANY Unity game it is easy to see this. Simply run with v-sync on so you see a true framerate and use Steam or your favorite FPS counter, and you'll see that despite owning eight 3090s in SLI you cannot break 60fps without breaking things. There was a thread on one of the Unity forums about this and the devs literally said that timing gets crazy above 60fps internally. This is why you can peg out at 60fps but not get higher. Disabling vertical sync can produce insane framerates (I get almost 1000fps in Ark with v-sync disabled, on my 144Hz monitor) but due to the lack of sync with the monitor, many of those buffers may be a single pixel instead of the millions which make up our scene.

 

In addition to the list I mentioned earlier, other Unity games I own which also lock to 60fps max even while the monitor is at 144Hz include The Long Dark, Firewatch, Guns of Icarus, Jazzpunk, Kona, Slender: The Arrival, Thief Simulator, and Undertale. I am sure I own more Unity titles, but explain why every single Unity title I have ever owned, across a wide array of systems, always pumps 60fps and no more. From Pentium D 960s with 4GB of RAM to this i7-6950X with 128GB of RAM, Unity has NEVER gone above 60fps and there IS a reason. I just don't feel like spending my Saturday digging through Unity forums to find the answer. Do your own homework.


36 minutes ago, The_Great_Sephiroth said:

I suggest you spend at least five seconds looking at the results I linked. You cannot override this. In fact, while playing Unity games, my monitor is in 144Hz mode. The game only allows 60fps due to internal timing. Again, if you want to understand it and not simply beg others to do it for you, read Unity's manuals. Also, if you read the replies on the code you linked, you'd see it still locks you to 60fps when built. You're not even reading the entire post. You read a single doc and ended it.

 

Since you're too lazy to literally just read some of what I linked, here is a very relevant result.

 

Even after disabling v-sync and using the code you linked, capped at 60fps

 

Oh, and here's an answer literally on the Unity forum explaining that it won't go above 60fps.

 

Locked at 60

 

Finally, if you own ANY Unity game it is easy to see this. Simply run with v-sync on so you see a true framerate and use Steam or your favorite FPS counter, and you'll see that despite owning eight 3090s in SLI you cannot break 60fps without breaking things. There was a thread on one of the Unity forums about this and the devs literally said that timing gets crazy above 60fps internally. This is why you can peg out at 60fps but not get higher. Disabling vertical sync can produce insane framerates (I get almost 1000fps in Ark with v-sync disabled, on my 144Hz monitor) but due to the lack of sync with the monitor, many of those buffers may be a single pixel instead of the millions which make up our scene.

 

In addition to the list I mentioned earlier, other Unity games I own which also lock to 60fps max even while the monitor is at 144Hz include The Long Dark, Firewatch, Guns of Icarus, Jazzpunk, Kona, Slender: The Arrival, Thief Simulator, and Undertale. I am sure I own more Unity titles, but explain why every single Unity title I have ever owned, across a wide array of systems, always pumps 60fps and no more. From Pentium D 960s with 4GB of RAM to this i7-6950X with 128GB of RAM, Unity has NEVER gone above 60fps and there IS a reason. I just don't feel like spending my Saturday digging through Unity forums to find the answer. Do your own homework.

Oh, I did. Neither of your linked "sources" is from the Unity docs. They're troubleshooting forum posts from Unity devs, and if you read the posts as you suggest, they don't support your fantasy either; everyone in those threads seems to think it's not normal and says to contact Unity support. No one else there seems to have the same problem, just as no one else with a high refresh rate monitor is having your experience.

I can run 7DTD, Subnautica, and Subnautica Below Zero, all with G-Sync and V-Sync on, at well over 60fps.
 

[Screenshot: in-game FPS counter showing well over 60fps with G-Sync and V-Sync enabled]

 

What you're experiencing isn't normal, and getting mad at someone trying to help you correct it isn't helpful either. I have already done my homework; I've benchmarked this game for hundreds of hours across multiple alphas and Unity versions, with various settings, on half a dozen different systems. The only time I've ever had the game locked to 60 with vsync in all that time has been a settings misconfiguration. Don't believe me if you wish, keep playing at 60, but I'm certainly not locking my game to 60.

This topic has gone off topic too much as it is, so this is the last I'll say. Enjoy the game and have a good day.


After you mod your nVidia settings, those override your in-game settings. If what you are saying is true, it's hysterical that this magical 60fps lock is posted all over virtually every Unity Engine-based game forum on the planet and the most common "solution" is to disable vsync and deal with tearing.

 

Oh look, the only way to beat 60fps, on the Unity forum, is to disable vsync.

Disabling V-Sync

 

More FPS issues at the Unity forum!

Half the target?

 

Games on Unity locked at 60fps? No!

Reddit

 

Yet even more advice to turn off v-sync! I guess 144Hz can't handle more than 60fps unless I use the magical Unreal Engine...

Reddit - Turn off V-Sync

 

Darn, another Unity game where a dev flat-out states it's locked to 60fps!

Huntdown

 

That's where I am stopping. From a clean install of ANY Unity game I own on all kinds of hardware, they are capped to 60fps on 144Hz monitors which literally tell you they are in 144Hz. No nVidia settings have EVER been adjusted for these games. All nVidia settings are at default. All game options are maxed. True fullscreen mode, no EAC, every setting maxed.

 

Wait, are you in fullscreen or windowed fullscreen? Show your launcher and check to see if "Exclusive fullscreen mode" is checked. If it isn't, the in-game fullscreen option just makes a window the size of your monitor. You are using some new, non-standard thing which allows for faster ALT+TAB functionality. When I play my games, I play my games. That means true fullscreen. I have a phone, tablet, and three laptops I can look stuff up on.

 

*EDIT*

 

Yep, if I switch to fake fullscreen and set my desktop refresh rate to 144Hz, I get 144fps. If I go to exclusive fullscreen mode, even if I go into the nVidia control panel and force the refresh rate to 144Hz, the game creates the rendering window at 60Hz; the monitor confirms this. This is a bug in either 7 Days or Unity which has existed since A17. Why use true (exclusive) fullscreen? Way higher framerate, better colors, better performance! I will file a bug report.

 

*EDIT2*

 

I filed a bug report, then found this article. Not sure what The Pimps are using, but it basically allows Unity to use VRR (G-Sync / Freesync) and go above 60Hz.

 

Unity Fix
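I haven't verified what The Pimps actually changed, but going by that article, a hedged sketch of the kind of fix it describes would be to request the panel's highest refresh rate when creating the exclusive-fullscreen window instead of accepting a 60Hz default. The FullscreenRefreshRate name is made up for illustration; this is not TFP's code:

using UnityEngine;
using System.Linq;

// Sketch only: my guess at the kind of change the linked article describes,
// i.e. request the display's highest refresh rate for exclusive fullscreen
// so VRR (G-Sync / Freesync) and >60Hz modes can be used. Not TFP's actual code.
public class FullscreenRefreshRate : MonoBehaviour
{
    void Start()
    {
        // Pick the highest refresh rate the display offers at the current resolution.
        Resolution best = Screen.resolutions
            .Where(r => r.width == Screen.width && r.height == Screen.height)
            .OrderByDescending(r => r.refreshRate)
            .FirstOrDefault();

        if (best.refreshRate > 0)
        {
            // Create the exclusive-fullscreen window at that rate (e.g. 144 instead of 60).
            Screen.SetResolution(best.width, best.height,
                                 FullScreenMode.ExclusiveFullScreen,
                                 best.refreshRate);
        }

        // Keep v-sync on so G-Sync / Freesync can work without tearing.
        QualitySettings.vSyncCount = 1;
    }
}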

 

Also, you would need a 160Hz+ monitor to show the framerate you took a screenshot of. I hope you're not running your desktop in 160Hz.

Edited by The_Great_Sephiroth

On 7/14/2021 at 3:21 PM, The_Great_Sephiroth said:

I wish I had a few more GPUs I could get rid of. Hate seeing people ripped off if they need one. I have a GTX 980 Ti and a GTX 970 in the box if that is an upgrade for anybody. Shoot me a PM. My 1070s are sold as soon as I can set up an eBay auction for somebody.

 

Fox, you are correct. I was thinking of the game "Raft". That game can eat cores, even on my 6950X. As for per-core performance, it is why I pay a premium for Intel chips unless the box will be doing something that benefits from more cores, such as 4K60 encoding.

I missed this. If I still didn't have this RX 480 and was on my onboard APU, I would take you up on that (it's good enough and they're close to one another). But I wanted to say that's very awesome of you to offer that to the community, and I give you kudos for it -- thank you!

