# NEW Unigine Valley Benchmark 1.0 Scores



## MrGenius (May 3, 2016)

**PRESS F12 for SCREENSHOT** - *Please attach a screen capture of your results for score verification.*


Spoiler: Where's my screenshot saved?



`C:\Users\User Name\Valley\screenshots`


***Your submission will not be added if you fail to follow the rules stated below.***

1.) Benchmark setup:

*Language: English*
*Preset: Custom (or Extreme HD if resolution is 2560x1440 or 1920x1080)*
*API: DirectX 11*
*Quality: Ultra*
*Stereo 3D: Disabled*
*Monitors: Single*
*Anti-aliasing: x8*
*Full Screen: On or Off (box checked or unchecked)*
*Resolution: 2560x1440 or 1920x1080 (full screen or windowed 2560x1440 or 1920x1080 is allowed)*

2.) Sound *ON* (sound disabled in benchmark *is not* allowed)
3.) Integrated/onboard graphics scores, and/or the usage of software such as Lucid Virtu/XLR8/Hydra, *are not* allowed (iGPU otherwise enabled *is* allowed)
4.) Tessellation settings on AMD cards *not bypassed* in CCC/Crimson/ReLive/Adrenalin (AMD optimized tessellation, shader cache, and surface format optimization *are* allowed)*
5.) Texture filtering set to *standard* (performance texture filtering *is not* allowed)**
6.) HBCC Memory Segment set to *disabled****
7.) You must also provide correct GPU and CPU clocks (CPU-Z & GPU-Z proof *is not* required, but providing such proof is not discouraged)
8.) Screenshots showing the number of GPUs as x2/x3/x4 are considered Multi GPU (whether they are or aren't)
9.) *Must be a full screenshot from within Valley with the sound tab and upper right corner info showing to be valid (see bottom of post)*
10.) The only allowed "tweak" is overclocking. *Absolutely no driver tweaks (other than stated above) or operating system tweaks are permitted*
*Because AMD optimized tessellation, shader cache, and surface format optimization are the default settings, and none of them have a significant impact on scores.
**Because that would be considered a driver tweak, and it can have a significant impact on scores.
***Because that's the default setting, enabling it would be considered a driver tweak, and it might have a significant impact on scores.

Following these rules will ensure that we have consistent results.

**Scores**

*Single GPU 2560x1440*


| # | CPU | Frequency | GPU | GPU Clocks | Score | User Name |
|---|---|---|---|---|---|---|
| 1.) | i7-8700K | 4.9GHz | RTX 2080 Ti | 1489/1937 | 5131 | trog100 |
| 2.) | i7-8700K | 5.3GHz | GTX 1080 Ti | 2202/1580 | 4852 | Enterprise24 |
| 3.) | i7-3770K | 5.3GHz | RX Vega 64 | 1667/1175 | 2552 | MrGenius |
| 4.) | i5-3570K | 5.0GHz | RX Vega 64 | 1660/1175 | 2543 | MrGenius |
| 5.) | Xeon E5-1680 v2 | 4.3GHz | GTX Titan X | 1178/1753 | 2372 | agent_x007 |

*Multi GPU 2560x1440*


| # | CPU | Frequency | GPU | GPU Clocks | # of GPUs | Score | User Name |
|---|---|---|---|---|---|---|---|

*Single GPU 1920x1080*


| # | CPU | Frequency | GPU | GPU Clocks | Score | User Name |
|---|---|---|---|---|---|---|
| 1.) | i7-8700K | 5.3GHz | GTX 1080 Ti | 2164/1580 | 7579 | Enterprise24 |
| 2.) | i7-8700K | 5.0GHz | RTX 2080 Ti | 1802/1937 | 7477 | trog100 |
| 3.) | i7-8700K | 5.0GHz | GTX 1080 Ti | 2101/1578 | 7204 | mouacyk |
| 4.) | i7-6850K | 4.6GHz | GTX 1080 Ti | 2037/1513 | 6792 | VIGILANTBOY |
| 5.) | i7-6700K | 4.8GHz | GTX 1080 Ti | 2075/1558 | 6701 | irate_primate |
| 6.) | i7-5960X | 4.6GHz | GTX 1080 Ti | 2072/1475 | 6374 | The Pack |
| 7.) | i7-6850K | 4.3GHz | GTX 1080 Ti | 2072/1475 | 6277 | The Pack |
| 8.) | i7-4790K | 4.4GHz | GTX 1080 Ti | 2012/1600 | 6213 | TexSC |
| 9.) | i7-5820K | 4.5GHz | GTX 1080 Ti | 2050/1500 | 6169 | niceoslov |
| 10.) | i7-4790K | 4.7GHz | GTX 1080 Ti | 2075/1501 | 6133 | dleester82 |



Spoiler: Sub Top 10 Single GPU 1920x1080 Scores



*Single GPU 1920x1080*


| CPU | Frequency | GPU | GPU Clocks | Score | User Name |
|---|---|---|---|---|---|
| i7-3770K | 4.8GHz | GTX 1080 Ti | 2062/1504 | 6130 | m1ch4L |
| i5-6600K | 4.6GHz | GTX 1080 Ti | 2050/1475 | 6129 | MaxxBot |
| i7-6700K | 4.5GHz | GTX 1080 Ti | 2075/1531 | 6060 | zharth |
| i7-980X | 4.7GHz | GTX 1080 Ti | 2050/1527 | 5623 | Tomgang |
| i5-7600K | 5.2GHz | GTX 980 Ti | 1554/1935 | 5104 | n1ko |
| i5-6600K | 4.6GHz | GTX 1080 | 2138/1435 | 5100 | moe1903 |
| i7-4770K | 3.9GHz | GTX 1080 Ti | 1911/1377 | 5081 | Scorpius |
| i7-7700K | 5.0GHz | GTX 1080 | 2138/1353 | 5073 | AndrewWyb |
| R5 1600 | 3.8GHz | GTX 1080 Ti | 1962/1377 | 5029 | vjeks |
| i7-8700K | 4.8GHz | GTX 1080 | 2100/1375 | 4966 | Vayra86 |
| i7-4790K | 4.7GHz | GTX 1080 | 2063/1400 | 4956 | Kaizoku11 |
| i7-8700K | 4.9GHz | GTX 1080 | 2062/1301 | 4931 | Mr.KT |
| i7-5820K | 4.9GHz | GTX 980 Ti | 1470/2025 | 4891 | Vellinious |
| i7-4770K | 4.7GHz | GTX 1080 | 1960/1860 | 4850 | D007 |
| i7-4790K | 4.6GHz | GTX 1080 | 1999/1359 | 4815 | TkBaha |
| i5-6600K | 4.7GHz | GTX 1080 | 2100/1358 | 4801 | WhiteNoise |
| i7-6800K | 4.0GHz | GTX 1080 | 2126/1377 | 4783 | MrEWhite |
| i7-6850K | 4.4GHz | GTX 1070 | 2189/2460 | 4684 | The Pack |
| i5-6600K | 4.1GHz | GTX 1080 | 2100/1326 | 4659 | ramenfan |
| i7-5820K | 4.5GHz | GTX 1080 | 2088/1375 | 4658 | gdallsk |
| i7-5930K | 4.6GHz | GTX 980 Ti | 1470/2003 | 4610 | petedread |
| i7-4770K | 4.7GHz | GTX 980 Ti | 1482/1935 | 4553 | TheHunter |
| i7-5820K | 4.0GHz | GTX 980 Ti | 1498/1929 | 4408 | EarthDog |
| i7-3960X | 4.7GHz | GTX 980 Ti | 1471/1925 | 4369 | Ferrum Master |
| i7-3930K | 4.2GHz | GTX 980 Ti | 1501/1949 | 4311 | the54thvoid |
| i7-4770K | 4.4GHz | GTX 980 Ti | 1517/1753 | 4248 | HammerON |
| R7 2700X | 4.3GHz | GTX 980 Ti | 1432/2000 | 4220 | CS85 |
| i5-6500 | 3.2GHz | GTX 1070 | 2050/2415 | 4218 | P4-630 |
| i5-4690K | 4.5GHz | GTX 980 Ti | 1581/1865 | 4204 | \_MissBehave\_ |
| i5-7600K | 5.0GHz | GTX 980 Ti | 2062/2225 | 4168 | adonaras |
| i7-3770K | 5.3GHz | RX Vega 64 | 1661/1175 | 4126 | MrGenius |
| i7-6700K | 4.0GHz | GTX 980 Ti | 1392/1753 | 4119 | stefanels |
| i5-3570K | 4.8GHz | RX Vega 64 | 1664/1175 | 4064 | MrGenius |
| R5 1600 | 3.8GHz | GTX 1070 | 2025/2202 | 4013 | Hardi |
| i5-7600K | 5.1GHz | GTX 1070 | 2000/2003 | 3915 | Spektre |
| i7-5930K | 4.4GHz | GTX 980 Ti | 1518/1800 | 3854 | erixx |
| Xeon E5-1680 v2 | 4.3GHz | GTX Titan X | 1178/1753 | 3836 | agent_x007 |
| i7-3770K | 4.1GHz | GTX 980 Ti | 1263/1753 | 3812 | rtwjunkie |
| Xeon E5-2667 V3 | 3.0GHz | GTX 980 Ti | 1430/1752 | 3801 | kkarab |
| i7-5820K | 4.9GHz | R9 290X | 1303/1802 | 3513 | Vellinious |
| i7-5820K | 4.3GHz | R9 Fury X | 1094/535 | 3460 | xkm1948 |
| i7-4770K | 3.5GHz | GTX 780 Ti | 1446/1425 | 3415 | Hugis |
| R7 1700X | 3.9GHz | GTX 1070 | 2113/2052 | 3180 | YautjaLord |
| i7-6700K | 4.0GHz | R9 Fury | 1050/500 | 3175 | stefanels |
| i3-6100 | 3.7GHz | GTX 1060 | 2100/2301 | 3148 | Cetinakpan |
| i7-5930K | 4.5GHz | GTX 980 | 1416/1878 | 3032 | THE_EGG |
| i7-4790K | 4.4GHz | GTX 980 | 1355/1753 | 2946 | Jetster |
| i5-6600K | 4.6GHz | GTX 1060 | 1928/2200 | 2939 | Lui Leyland Robert |
| i7-3770K | 4.1GHz | GTX 980 | 1485/1753 | 2922 | rtwjunkie |
| i5-4690K | 4.2GHz | GTX 970 | 1558/1879 | 2920 | jboydgolfer |
| i7-950 | 4.2GHz | GTX 970 | 1656/1988 | 2774 | Arctucas |
| FX-8350 | 4.0GHz | R9 Fury | 1040/500 | 2759 | Dirtymadra |
| Xeon X5670 | 3.3GHz | GTX 1060 | 2202/2299 | 2746 | Kliim |
| i5-3570K | 4.8GHz | GTX 970 | 1594/1872 | 2643 | PHaS3 |
| i5-3570K | 4.2GHz | GTX 970 | 1505/1753 | 2499 | BiggieShady |
| i7-3770K | 5.3GHz | R9 280X | 1272/1850 | 2361 | MrGenius |
| i5-3570K | 5.0GHz | R9 280X | 1219/1850 | 2265 | MrGenius |
| FX-8320 | 3.5GHz | RX 570 | 1280/1750 | 1921 | 1Gpi2ZV6Jy |
| i5-6400 | 3.3GHz | GTX 960 | 1632/2053 | 1795 | simmie |
| Pentium G3220 | 3.0GHz | GTX 760 | 1123/1502 | 1582 | Frick |
| i5-6600K | 4.6GHz | R9 270X | 1180/1500 | 1521 | Lui Leyland Robert |
| R7 1700X | 3.8GHz | R9 370 | 1050/1400 | 1407 | Mumtaz |
| i7-4710HQ | 3.5GHz | GTX 860M | 1232/1328 | 939 | THE_EGG |
| i7-4700HQ | 3.4GHz | GTX 770M | 862/1002 | 916 | P4-630 |




*Multi GPU 1920x1080*


| # | CPU | Frequency | GPU | GPU Clocks | # of GPUs | Score | User Name |
|---|---|---|---|---|---|---|---|
| 1.) | i7-6700K | 5.0GHz | GTX 980 Ti | 1493/1905 | 2 | 7362 | JohnnyDirect |
| 2.) | i7-6850K | 4.5GHz | GTX 1070 | 2151/2225 | 2 | 6536 | The Pack |
| 3.) | i7-6700K | 4.6GHz | GTX 980 Ti | 1531/1753 | 2 | 6291 | Shift1190 |
| 4.) | i7-4790K | 4.0GHz | GTX 980 Ti | 1500/1752 | 2 | 5895 | RealNeil |
| 5.) | i7-3770K | 4.6GHz | GTX 780 Ti | 1147/1852 | 2 | 5246 | jaggerwild |
| 6.) | i7-4790K | 4.8GHz | R9 290X-R9 290 | 1205/1600-1090/1590 | 2 | 5104 | fullinfusion |
| 7.) | Xeon X5650 | 4.4GHz | GTX 980 | 1582/1903 | 2 | 4939 | EpicGrog |
| 8.) | i7-4770K | 4.4GHz | GTX 970 | 1450/1875 | 2 | 4620 | Good Guru |
| 9.) | i7-5820K | 4.4GHz | GTX 970 | 1316/1753 | 2 | 4371 | gdallsk |
| 10.) | i7-920 | 4.3GHz | GTX 970 | 1442/1908 | 2 | 4319 | Tomgang |



Spoiler: Sub Top 10 Multi GPU 1920x1080 Scores



*Multi GPU 1920x1080*


| CPU | Frequency | GPU | GPU Clocks | # of GPUs | Score | User Name |
|---|---|---|---|---|---|---|
| i7-4770K | 3.5GHz | R9 390X | 1080/1500 | 2 | 4290 | RealNeil |
| Xeon E5-2686 v3 | 2.0GHz | GTX 1080 | 2075/1354 | 2 | 4230 | er557 |
| i5-3470 | 3.4GHz | R9 290 | 1040/1250 | 2 | 4120 | Turbo |
| i7-3930K | 4.9GHz | GTX 780 Ti | 1165/1892 | 2 | 3894 | jaggerwild |
| i5-3470 | 3.6GHz | HD 7990 | 1100/1500 | 2 | 3577 | Turbo |




Please make my job easier by posting your results as I have below (or similarly).

i5-3570K @ 5.0GHz + R9 280X @ 1219/1850 = 2265
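For anyone scripting up the list, the requested line format is regular enough to parse automatically. Here's a minimal sketch; the regex pattern and field names are my own illustrative choices, not part of the thread's rules:

```python
import re

# Matches the requested "CPU @ GHz + GPU @ core/mem = score" line format.
RESULT_RE = re.compile(
    r"^(?P<cpu>.+?) @ (?P<freq>[\d.]+)GHz \+ "
    r"(?P<gpu>.+?) @ (?P<core>\d+)/(?P<mem>\d+) = (?P<score>\d+)$"
)

def parse_result(line: str) -> dict:
    """Parse one submission line into its fields, or raise if it doesn't match."""
    m = RESULT_RE.match(line.strip())
    if m is None:
        raise ValueError(f"not in the expected format: {line!r}")
    fields = m.groupdict()
    fields["score"] = int(fields["score"])
    return fields
```

For example, the sample line above parses into `cpu` "i5-3570K", `gpu` "R9 280X", and `score` 2265.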





PS, the old Valley scores thread is dead; the OP is no longer adding new scores to the list in it. I've changed the rules of the game significantly for this thread, so I will not accept scores from the other thread posted here.


----------



## kkarab (May 4, 2016)

Hello again, posting my first Valley benchmark score...

Intel Xeon 8-core E5-2667 V3 @ 2.90GHz + GTX 980 Ti @ 1430/3500 = 3671


----------



## MrGenius (May 4, 2016)

jaggerwild said:


> jaggerwild
> 3930K@5000Mhz
> 780TI'S DC II OC SLI@1185/1893


Quality needs to be Ultra.


----------



## Frick (May 4, 2016)

Intel Pentium G3220 @ 3GHz + Nvidia GeForce GTX 760 2GB @ 1123/1502 = 1582


----------



## TheHunter (May 4, 2016)

24/7 factory boost OC 980ti @ base 1278MHz - boost 1418MHz

Intel 4770K @ 4.7GHz + Zotac Omega GTX 980TI @ 1418/1805 = *4248*


----------



## P4-630 (May 4, 2016)

i7-4700HQ 3.4GHz GTX770M 862/1002MHz


----------



## EarthDog (May 4, 2016)

I'm curious... modern NVIDIA users...

Are you putting in your ACTUAL clocks (boost clocks found on the sensor tab in GPUz) or what GPUz says on the front page? I would recommend putting in the ACTUAL boost clocks for comparison/accuracy's sake.


----------



## TheHunter (May 4, 2016)

In my case it's the load boost GPU-Z/RTSS OSD reading @ 1418MHz.

 

I modded the BIOS from a stock 1178 to 1279, and boost from 1279 to 1380. I also limited the max total boost to match extreme, which in the end made its real boost max out at 1418MHz @ 1.187V vs. the factory default 1343MHz @ 1.187V.


----------



## R00kie (May 4, 2016)

5820K 4.4 GHz
2X 970 1316/7013


----------



## P4-630 (May 4, 2016)

TheHunter said:


> 24/7 factory boost OC 980ti @ base 1278MHz - boost 1418MHz
> 
> Intel 4770K @ 4.7GHz + Zotac Omega GTX 980TI @ 1418/1805 = *4248*
> 
> ...



I thought all GTX980Ti's have 6GB vram? Yours has 4GB vram?


----------



## R00kie (May 4, 2016)

P4-630 said:


> I thought all GTX980Ti's have 6GB vram? Yours has 4GB vram?


That's what the Valley wants to believe.


----------



## BiggieShady (May 4, 2016)

Card is slightly factory overclocked ... I did make an effort to open the front door of the case.
i5-3570K @ 4 GHz + Gainward GTX 970 @ 1392 MHz all the way down to 1329 MHz, memory at 1753 MHz ... put me in for 1360/1753 = 2385


i5-3570K, 4.0 GHz, GTX 970, 1360/1753, 2385, BiggieShady


----------



## the54thvoid (May 4, 2016)

i7 3930k @ 4.2GHz + EVGA 980ti Kingpin 1501/1949 = 4311

These are my standard gaming clocks.

Card can bench (gingerly) at 1550MHz core and 8292MHz memory but it's a bit flaky for all of a few % more performance.


----------



## Tomgang (May 4, 2016)

i7-920 @ 4.3 GHz + 2 x GTX 970 @ base clock 1305 / boost clock 1442 / mem 7632 = 4319
Don't know why it says Win 8, but it was done on Win 10 Pro 64-bit.


----------



## rtwjunkie (May 4, 2016)

Score = *2922*
i7-3770K @ 4.1GHz. MSI GTX 980: 1220MHz core / 1321MHz boost (running full out at *1485MHz* boost the whole test). Memory was 3505MHz.

I just noticed... Valley reports W8! I'm on W10.


----------



## TheHunter (May 5, 2016)

My max OC, but I saw it downclock at one part to 1464 from 1478MHz.


Intel 4770K @ 4.7GHz + Zotac Omega GTX 980TI @ 1478/1860 = *4359*





Min FPS is also a funny one; it happens on scene changes. One time it was 29, now 40; my initial run was 44 FPS. Heaven has the same bug.


----------



## jboydgolfer (May 5, 2016)

Reference GTX 970 running @
Core = 1552MHz
Memory = 1890MHz
@ 1.231V

*In-benchmark capture of the results at the bottom, for validity's sake*


----------



## MrGenius (May 5, 2016)

jboydgolfer said:


> ***in benchmark Capture of the results @ bottom for Validity's sake **


No need for that. What you had there first works for me. I just want to take a quick look at the upper right corner and the score panel. And since you left out the CPU bits, I'm going with your system specs. Correct me if I'm wrong. If they weren't there I'd have to default to what the score panel shows.


Where's all my AMD peeps? This is looking like a massacre!


----------



## jboydgolfer (May 5, 2016)

I didn't do it for you, I just added it.


----------



## manofthem (May 5, 2016)

MrGenius said:


> Where's all my AMD peeps? This is looking like a massacre!



Oh, I'm coming for you! As soon as I get a few minutes, which is proving most difficult, I'll be giving it a go for you to add.


----------



## kkarab (May 5, 2016)

kkarab said:


> Hello again, posting my first valley score benchmark...
> 
> Intel Xeon 8core E5-2667 V3@2.90GHz + GTX 980Ti @1430/3500=3671



New run with system settings > Intel Xeon E5-2667 V3 @ 3.0GHz + GTX 980 Ti (GPU @ 1430 / RAM @ 1752) + DDR4 2133 @ 11-11-11-31-2T = 3801


----------



## Caring1 (May 5, 2016)

BiggieShady said:


> Card is slightly factory overclocked ... I did make an effort to open the front door of the case
> i5-3750K @ 4 GHz + Gainward GTX 970 @ 1392 MHz all the way down to 1329 MHz  memory at 1753 MHz ... put me in for 1360/1753 = 2385
> 
> i5-3750K, 4.0 GHz, GTX 970, 1360/1753, 2385, BiggieShady


Or a 3570K


----------



## EarthDog (May 5, 2016)

i7-5820K @ 4.0GHz + GTX 980 Ti @ 1498/1929 = 4408

Sorry for the lack of cropping. I made the screenshot my background.


----------



## HammerON (May 5, 2016)

i7-4770K @ 4.4GHz + EVGA GTX 980 Ti Classified @ 1517/1753 = 4248


----------



## jaggerwild (May 5, 2016)

jaggerwild
3930K @ 4900MHz
780 Ti's DC II OC 1165/3784

For some reason I can't run it at 5GHz this AM.


----------



## Vellinious (May 5, 2016)

5820k - 2 cores at 4.9
8GB 290X @ 1303 / 1802
Score:  3513








5820k @ 4.7
2 x 970 SLI @ 1576 / 2026
Score:  5581


----------



## EarthDog (May 5, 2016)

Now there is some SLI scaling... The others looked like it barely scaled.


----------



## MrGenius (May 5, 2016)

EarthDog said:


> Sorry for the lack of cropping. I made the screenshot my backround.


No cropping please. I changed the rules to state a *Full* Screenshot. And I would like to be able to see the upper right corner for GPU model and quantity verification. I hadn't mentioned that in this thread (and the rule is now changed to reflect that), so I will allow it to not be shown in this instance.


----------



## Vellinious (May 5, 2016)

I just wish I had been pushing the 2 cores of the CPU higher when I had the 970s.  I'm quite certain I could have pushed them up near 5700 with the additional CPU core clock.  

I'm still pretty new to pushing hardware quite this hard and I'm still learning...I've been overclocking for a long while, but..not on this scale / level.


----------



## BiggieShady (May 5, 2016)

Caring1 said:


> Or a 3570K


So that's why you are a caring one  ... the first one was a typo and the second one copy-pasted  ... fixed. Lucky for me the OP ingeniously parsed the message correctly.


----------



## jaggerwild (May 5, 2016)

Vellinious said:


> I just wish I had been pushing the 2 cores of the CPU higher when I had the 970s.  I'm quite certain I could have pushed them up near 5700 with the additional CPU core clock.
> 
> I'm still pretty new to pushing hardware quite this hard and I'm still learning...I've been overclocking for a long while, but..not on this scale / level.



You mean additional cooling, don't you? Of course you can push it higher; doesn't mean it will work though.


----------



## EarthDog (May 5, 2016)

MrGenius said:


> No cropping please. I changed the rules to state a *Full* Screenshot. And I would like to be able to see the upper right corner for GPU model and quantity verification. I hadn't mentioned that in this thread(and the rule is now changed to reflect that). So I will allow it to be not shown in this instance.


If it were me (and clearly it's not, LOL), I would follow Hwbot rules since they are THE authority and site for that kind of stuff. A potential concern with the current method is that the OSD doesn't report much correctly a lot of the time, so what is it really confirming? This is why they use GPUz and CPUz for verification.


----------



## MrGenius (May 5, 2016)

Vellinious said:


> I just wish I had been pushing the 2 cores of the CPU higher when I had the 970s.


I'm just so glad you didn't post that score in the previous thread. If I had to disqualify it based on that, it would have been a huge disappointment for me, whether or not you disapproved of the decision. I want to tell you that I had to look for it there just in case. Not because I don't trust you, just out of fairness to your competition. Thankfully I didn't find it there. So as it sits, you're still #1. If it is there and I missed it, here's your chance to "plead the 5th". 

Man...this is why I never take authoritative positions in life. I don't mind being the leader of the pack. So long as it doesn't require being the boss too. In that sense I'm much more comfortable being subordinate. I'm not really cut out for this giving orders and making rules crap. However, like I always say about such circumstances, "it's a dirty job, but somebody has to do it". Unfortunately in this case that somebody is me. 

No big deal though. It could be much worse.


----------



## MrGenius (May 5, 2016)

EarthDog said:


> If it was me (and clearly its not, LOL) I would follow Hwbot rules since they are THE authority and site for that kind of stuff. A potential concern with the current method is that the OSD doesn't report much right a lot of the time so what is it really confirming? This is why they use GPUz and CPUz for verification.


Well I'm not a member there. And I don't frequent that site (what's Hwbot?). So I wasn't aware of their authority on the matter. I'm 100% aware that the info is often wrong (more often than not, actually). But what does seem to be correct most, if not all, of the time is the GPU model and quantity. In particular the quantity (I guess you missed that argument in the Heaven thread). But even barring those things, it's further proof that you didn't fabricate the screenshot in any way. I make sure that all my screenshots show as clearly as possible that I didn't photoshop the info in the upper right corner. Mostly because with my particular GPU the info shown there IS 100% accurate ALL of the time.

Oh...and the CPU-Z and GPU-Z info can also be wrong. And is just as easily fabricated. So I don't care to see it.


----------



## BiggieShady (May 5, 2016)

@jboydgolfer You should post your actual clocks during the benchmark. For reference my card was much lower clocked than yours.


----------



## jboydgolfer (May 5, 2016)

BiggieShady said:


> @jboydgolfer You should post your actual clocks during the benchmark. For reference my card was much lower clocked than yours.



Those are my actual clocks... I'm not a liar. Why wouldn't I post the actual settings?

What do your card's settings have to do with my scores, for "reference"??

Edit:
Based on the results, your clocks don't look much lower than mine.
I don't think I get what you mean.
If you're referring to "boost" clocks, I'll re-run the test, and when it finishes I'll see what it reached, concerning that.

@MrGenius
My BOOST clocks (if they are indeed required) are as follows:
Core = 1552MHz
Memory = 1890MHz
I'll add them to my original post as well.


----------



## EarthDog (May 5, 2016)

GPUz shows that as well; there is a dropdown. It's also fairly obvious when someone uses multiple GPUs over one.

I missed the Heaven argument, yes, but again, it's obvious by the scores to those that benchmark for Hwbot... or if you use multiple GPUs, you use the dropdown in GPUz.

Use the best tools for the job.

A bit too late now of course, but if you get saucy and want to start another one, that is the way to go. 

Thanks for starting up a new one!!




jboydgolfer said:


> Those are my actual clocks.....im not a liar.why wouldnt i post the actual settings?
> 
> What does your cards settings have to do with my scores for "reference"??


Nobody is calling you a liar... read what we talked about earlier... the boost clocks. Neither GPUz nor Valley reports the right clocks. Boost clocks will vary; it goes WAY higher than what GPUz reports as boost (unless you modded your BIOS???). 

He mentioned reference because you have a "reference" card...


----------



## Vellinious (May 5, 2016)

jaggerwild said:


> You mean additional cooling don't you? Of course you can push it higher, doesn't mean it will work but.



Not quite sure what you're saying....


----------



## jaggerwild (May 5, 2016)

MrGenius said:


> I'm just so glad you didn't post that score in the previous thread. If I had to disqualify it based on that it was going to be a huge disappointment for me. Whether or not you disapproved of the decision. I want to tell you that I had to look for it there just in case. Not because I don't trust you. Just out of fairness to your competition. I didn't find it there thankfully. So as it sits, you're still #1. If it is there and I missed it, here's your chance to "plead the 5th".
> 
> Man...this is why I never take authoritative positions in life. I don't mind being the leader of the pack. So long as it doesn't require being the boss too. In that sense I'm much more comfortable being subordinate. I'm not really cut out for this giving orders and making rules crap. However, like I always say about such circumstances, "it's a dirty job, but somebody has to do it". Unfortunately in this case that somebody is me.
> 
> No big deal though. It could be much worse.



I'm sure he can clock out way higher and keep it in #1 place; he's a slippery one!!! You're doing a good job Mr Genius, it's a fun thread, that's all! 



> I'm quite certain I could have pushed them up near 5700 with the additional CPU core clock



I thought you meant cooling, not cores, is all.


----------



## Vellinious (May 5, 2016)

Ah, nope...core clock.  Valley is super CPU bound, so the higher you can clock your CPU, the better you'll score.  And since it only uses 2 threads max, you can disable all but 2 cores and hyperthreading to keep the heat down, and be able to clock the remaining cores higher.  It helps....  Depending on the GPU setup, it'll help a TON.


----------



## PHaS3 (May 5, 2016)

i5 3570K @ 4.5GHz
GTX 970 @ 1524 Core / 1851 (7404) RAM
Score: 2609


----------



## jboydgolfer (May 5, 2016)

Update to MY old score 





You'll notice GPUz has its reading set to "MAX", so you can see MY "actual" clocks. My bad, I didn't think of that before.
Core = 1558
Memory = 1879


----------



## MrGenius (May 5, 2016)

jboydgolfer said:


> My BOOST Clocks, (if they are indeed required...


I'm making no such requirement. At this point the rules are set in stone (I hope). And they state you must provide correct clocks. That's your job, not mine. You make the call. Since you're making the change, I will change the record of it. I'm taking everyone's word on what their correct clocks are. I don't care what the screenshot says, or what CPU-Z or GPU-Z says. I care what you say they are. And that's all. If that seems unfair to anyone...tough titties. Fight amongst yourselves. I'm not stepping in.


----------



## jboydgolfer (May 5, 2016)

MrGenius said:


> I'm making no such requirement. At this point the rules are set in stone (I hope). And they state you must provide correct clocks. That's your job, not mine. You make the call. Since you're making the change, I will change the record of it. I'm taking everyone's word on what their correct clocks are. I don't care what the screenshot says, or what CPU-Z or GPU-Z says. I care what you say they are. And that's all. If that seems unfair to anyone...tough titties.



I didn't think so, but just the same, so there's no whining, I posted a new result with GPUz set to "MAX" reading; that way there IS no doubt. I suppose I can understand that these results are important to people.


----------



## EarthDog (May 5, 2016)

Not so much important as it is that people want to use it to compare their results to those on this table. If people are posting XXXX clocks when it's actually boosting 100MHz higher, that is, as you can see, a SIGNIFICANT difference.

Its just about 'doing it right', that is all. 

This isn't really a dedicated overclocking site, so the finer points can easily be missed by those not as well versed in such activities.


----------



## jaggerwild (May 5, 2016)

Vellinious said:


> Not quite sure what you're saying....



I'll have to try it; never bothered disabling cores for Valley. Think my max was 5700MHz with 2 cores, and that was on a RIVE board; this X79 Deluxe seems way better, I guess because it's a server board.


----------



## Vellinious (May 5, 2016)

5820k @ 4.7
970 @ 1607 / 2105

I know it probably won't count, but.....it's the 2nd highest single 970 score I've seen.  This card was phenomenal.


----------



## BiggieShady (May 5, 2016)

jboydgolfer said:


> ... so theres no whining, I posted a new result ...


Nah, no whining. I just wanted to compare how much the score increases with core frequency and memory frequency increases ... because 1375 is my average boost clock (quiet build on air).


----------



## jaggerwild (May 6, 2016)

Vellinious said:


> I know it probably won't count, but.....it's the 2nd highest single 970 score I've seen. This card was phenomenal.



Did it die? You said "was". Or did you sell it?


----------



## Vellinious (May 6, 2016)

Sold them.  I get bored easily.


----------



## MrGenius (May 6, 2016)

Vellinious said:


> 5820k @ 4.7
> 970 @ 1607 / 2105
> 
> I know it probably won't count, but.....it's the 2nd highest single 970 score I've seen.


It doesn't count for this thread, as far as the list is concerned, since one of my, albeit stupid, rules does truly apply (the multi GPU showing in the score panel rule). I would make an exception to the no-previously-posted-scores rule, since it wouldn't apply *if* the other rules were followed. Which are "honor system" rules: if you say they were followed, or even if you say nothing about them, then they were followed (or I will assume they were at least). What I meant by scores from the previous thread not applying is _mostly_ because of the 4x AA thing (though other new rules have been added or clarified that could potentially disqualify it too). Otherwise though, it's definitely worth posting here for reference purposes, since it's still a nice score regardless, its disqualification only being due to a minor technicality. So far as I know.

It'll all make sense in the end. Though I still agree with you for the moment. It doesn't make much, if any, sense right now. The only scenario I can imagine so far is someone crying foul because you beat their single GPU score, and they don't want to believe it because it says "x2". So you must have photoshopped the upper right corner to make it look like a single GPU. It's a long shot...I know.


----------



## EarthDog (May 6, 2016)

Yeah, you are really reaching there... 

But uh... who are you talking to? You didn't quote anyone, and the post 'above' you doesn't have anything to do with your response?


----------



## MrGenius (May 6, 2016)

Generally to the person in the post above. Like right now.

The post above became invisible when the new page was created.

Anyways...fixed.

On the stretch of the imagination...it's not really a big deal. Like I said before. If it's such a big deal to pull the power cables from one or more of your GPUs...then who's really being silly here? At least I have an excuse at all.


----------



## jaggerwild (May 6, 2016)

See if this works
jaggerwild
3770K @ 4600MHz
GTX 780 Ti DC II OC 1147/1852 = 5246


----------



## jboydgolfer (May 6, 2016)

Ran a test at bone stock just to see what it would do... *_voltages were not stock, but clocks ARE_*
Boost Core @ 1278Mhz
Boost Memory @ 1753Mhz

Valley Score = 2538






And here's the plain old results with no readings in the way, for validity's sake.


----------



## MrGenius (May 6, 2016)

Wait...what?^^

Oh...I get it. Not really. But whatever floats your boat. 

@jaggerwild             

I don't know how you did that. But nonetheless...that's amazing!


----------



## jboydgolfer (May 6, 2016)

MrGenius said:


> Wait...what?^^
> 
> Oh...I get it. Not really. But whatever floats your boat.
> 
> ...




You don't get my post?


----------



## jaggerwild (May 6, 2016)

MrGenius said:


> Wait...what?^^
> 
> Oh...I get it. Not really. But whatever floats your boat.
> 
> ...



MrGenius ^5


----------



## MrGenius (May 6, 2016)

jboydgolfer said:


> You dont get my post?


No. I do? I think? I just don't know what to do with it. You're not under the impression that I'm going to add it to the list, right? Top scores only. 1 score per member. Unless the CPU or GPU changes; then I will put it in there. Same GPU and CPU, right? Just the GPU clocked lower? Am I missing something?

Do I need to make another rule? Please tell me I don't.


----------



## jboydgolfer (May 6, 2016)

No, I just posted it for the sake of posting it, and was done. But when you made that comment, I became confused. No, I don't expect you to add the score; I just post 'em. I've done it this way for a long time, just ignore 'em.


----------



## MrGenius (May 6, 2016)

It's cool by me. No problemo.


----------



## rtwjunkie (May 7, 2016)

So, do I interpret right, no updating the score if it improves? I just wanted to be clear.


----------



## MrGenius (May 7, 2016)

*Top scores only*. As in, if you post a *higher score with your particular GPU and CPU combo, that one gets listed* and the old score goes bye-bye. *1 score (highest posted)* per member, *per particular GPU and CPU combo*. *Unlimited top scores* per member *if they are all with a different GPU and CPU combo*. A *different GPU and CPU combo* can be either a *different GPU and the same CPU*, or the *same GPU and a different CPU*.
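The listing rule above amounts to keeping one entry per (member, CPU, GPU) combo and ranking the survivors. A minimal sketch of that bookkeeping, with illustrative field names of my own:

```python
# Keep only the highest score per (user, CPU, GPU) combo, as the rule above describes.
# The field names ("user", "cpu", "gpu", "score") are illustrative assumptions.
def best_scores(submissions):
    best = {}
    for sub in submissions:
        key = (sub["user"], sub["cpu"], sub["gpu"])
        if key not in best or sub["score"] > best[key]["score"]:
            best[key] = sub  # higher score for the same combo: old one goes bye-bye
    # Return the surviving entries, ranked highest score first
    return sorted(best.values(), key=lambda s: s["score"], reverse=True)
```

A member with the same GPU on two different CPUs keeps two entries; a re-run on the same combo just replaces the old one.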


----------



## EarthDog (May 7, 2016)

Highest entry per GPU. CPU isn't terribly relevant here... at least not enough to have two entries on the same GPU...

Make it easy on yourself.


----------



## MrGenius (May 7, 2016)

I appreciate the input. But I beg to differ. Especially in the case of this particular benchmark. In my experience, with Valley, CPU can make a major difference in scores. Whether through core count or through clock speed.

For instance: my old score in the old Valley thread with my 280X @ 1200/1850 + E8600 @ 3.33GHz was 2197 points. My later (unlisted) score with my 280X @ 1207/1850 + 3570K @ 4.8GHz was 2727 points. Now, I can't believe that +7MHz on the GPU core gained me 530 points. I'm fairly certain that had to do with more CPU cores at a higher frequency.

Heaven doesn't seem to be similar in that respect, with a difference of only 31 points between those setups: 1059 points vs. 1090 points.

In both cases it appears to make a difference that is significant enough to be noteworthy. It is also keeping within the spirit of both previous threads to list such differences. It gives a better idea of what one can expect to see results wise with whichever GPU combined with whichever CPU. These threads are not just about top scores with certain GPUs. That's not to say they couldn't or even shouldn't be. It's just how it is, and how it's always been.
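For scale, those two CPU swaps work out to roughly a 24% gain in Valley but only about 3% in Heaven. A one-line check:

```python
# Relative score gain between two runs of the same GPU, as a percentage
def gain_pct(old: int, new: int) -> float:
    return (new - old) / old * 100

valley_gain = gain_pct(2197, 2727)  # 280X, E8600 -> 3570K: roughly +24%
heaven_gain = gain_pct(1059, 1090)  # same swap in Heaven: roughly +3%
```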


----------



## xkm1948 (May 7, 2016)




----------



## xkm1948 (May 7, 2016)




----------



## jaggerwild (May 7, 2016)

Only been 3 days, you guys are stressing him out. It's FRIDAY! FRIDAY FRIDAY!!!!!


----------



## EarthDog (May 7, 2016)

MrGenius said:


> I appreciate the input. But I beg to differ. Especially in the case of this particular benchmark. In my experience, with Valley, CPU can make a major difference in scores. Whether through core count or through clock speed.
> 
> For instance. My old score in the old Valley thread with my 280X @ 1200/1850 + E8600 @ 3.33GHz = 2197 points. My later(unlisted) score with my 280X @ 1207/1850 + 3570K @ 4.8GHz = 2727 points. Now I can't believe that +7MHz on the GPU core gained me 530 points. I'm fairly certain that had to do with more CPU cores at a higher frequency.
> 
> ...


That is a huge generational gap/IPC performance difference (what, 50%+?) and a massive clockspeed difference. Nearly 50% over the E8600 dual core. Makes sense considering the E8600 likely choked the GPU at those speeds. 

I have a lot more left in the CPU, so I will post something up at some point with the same clocks and faster clockspeeds on the CPU and see what happens.


----------



## Vellinious (May 7, 2016)

Valley is really CPU bound.  I can add 100 points to my score just going from my daily clock of 4.4, down to 2 cores and no hyperthreading with the clock at 4.9.  It gets worse the more powerful the GPU is, and with multi-GPU setups, it's even more evident.


----------



## BiggieShady (May 7, 2016)

Bumped cpu to 4.2 and gpu boosted to 1505 ... vram speeds unchanged.

i5-3570K, 4.2GHz, GTX 970, 1505/1753, 2499, BiggieShady


----------



## MrGenius (May 7, 2016)

Vellinious said:


> Valley is really CPU bound.  I can add 100 points to my score just going from my daily clock of 4.4, down to 2 cores and no hyperthreading with the clock at 4.9.  It gets worse the more powerful the GPU is, and with multi-GPU setups, it's even more evident.


I'm glad you mentioned the multi-GPU part. That seems to give some explanation as to why a certain member could increase their score with a certain pair of GPUs so drastically, as was recently the case. I'm still having a very hard time understanding how that was possible. But the rules were followed so far as I can tell. So the score appears to be valid. And I did list it as such. But I'm not necessarily the score judge here. I have a rough idea of what's realistic. At least I think I do. My job, as I see it, is just to police the scores and keep track of them. I don't feel I have the experience required to deem a score as unrealistic or impossible unless it were _supposedly_ achieved with a GPU similar to the one I own currently.

Anyhow, maybe you or someone else could help me understand better how or why that score was possible. I think it's pretty obvious by a quick look at the list which score by which member I'm referring to. I'm just not willing to name names or point fingers at this point. I'm big on the benefit of the doubt theory.

However I did mention in the other thread that if anyone sees anything that they think is contestable scores wise to please let me know about it. I would prefer, or rather insist, that the protest be made publicly. That seems most fair to the parties involved. I am willing to try and keep the lists as free of deception as I can. But I will need help with that from other members. Like I said, I just don't have enough experience to make a judgment call in each particular case as to which scores are obviously fake. So unless somebody has a particular complaint about a certain score, and can make a reasonable argument as to why, I'm going to assume nobody has a problem with any listed scores. As I feel I've done the best I could, barring any potential oversights on my part, to keep invalid scores off the list.

Lastly, I apologize to the member who's score I'm making an example of. It's not accusatory. I just don't fully understand it. And you didn't provide any explanation of it. So it's left some of us wondering.


----------



## jboydgolfer (May 7, 2016)

Upping CPU clocks helps?
Or dropping cores and increasing clocks....
I wouldn't have guessed; CPU usage seems fairly low during benching.

I'll need to try that.


----------



## Emmanuel Fuller (May 7, 2016)

https://drive.google.com/open?id=0B-AEgBUCcQAwenpHRGRDQ044SUU

Here's this benchmark running in 4K. The link is a Google Drive share that shows the picture. I posted this to show what you need to play recent games maxed out in 4K. You can do it with an i5. Four times my score would be 7852.
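The 4× figure follows from pixel count alone: 3840×2160 is exactly four times the pixels of 1920×1080. A quick check of that arithmetic, with the 4K score inferred from the 7852 claim (real scaling across resolutions is rarely this linear, since fill rate and memory bandwidth don't divide evenly):

```python
# 3840x2160 has exactly four times the pixels of 1920x1080, which is the
# rough basis for the "four times my score" estimate.
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
assert pixels_4k // pixels_1080p == 4

score_4k = 1963  # implied by the 7852 figure; assumed, not read off a screenshot
estimated_1080p = score_4k * (pixels_4k // pixels_1080p)
print(estimated_1080p)  # 7852
```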


----------



## MrGenius (May 7, 2016)

Emmanuel Fuller said:


> ...my score would be a score of 7852.


Pics, or in this case screenshot, or it didn't happen.


----------



## Vellinious (May 7, 2016)

jboydgolfer said:


> Upping cpu clocks helps?
> Or dropping cores,and i creasing clocks....
> I wouldnt have guessed, cpu usage seems fairly low during benching
> 
> Ill need to try that.



Yes.  I took to turning off hyperthreading and disabling all but 2 cores, because...hyperthreading creates heat, as do the other 4 cores that aren't being used by Valley....so, using 2 cores without hyperthreading gives you more thermal headroom to get a little more core clock out of your processor.



MrGenius said:


> I glad you mentioned the multi GPU part. That seems to give some explanation as to why a certain member could increase their score with a certain pair of GPUs so drastically as was recently the case. I'm still having a very hard time understanding how that was possible. But the rules were followed so far as I can tell. So the score appears to be valid. And I did list it as such. But I'm not necessarily the scores judge here. I have a rough idea of what's realistic. At least I think I do. My job as I see it is just to police the scores and keep track of them.  I don't feel I have the experience required to deem a score as unrealistic or impossible unless it were _supposedly_ achieved with a similar GPU to the one I own currently.
> 
> Anyhow, maybe you or someone else could help me understand better how or why that score was possible. I think it's pretty obvious by a quick look at the list which score by which member I'm referring to. I'm just not willing to name names or point fingers at this point. I'm big on the benefit of the doubt theory.
> 
> ...



I'm not really sure what scores you're talking about.  PM me some examples and I'll try to help.  Some of it can be the difference in motherboards, some a difference in clock speeds, temps, stability of the overclock, generational gaps between comparisons....if it's a Maxwell GPU, one person could be seeing some thermal throttling or power limit throttling and not even know it.


----------



## MrGenius (May 7, 2016)

Vellinious said:


> I'm not really sure what scores you're talking about.  PM me some examples and I'll try to help.  Some of it can be the difference in motherboards, some a difference in clock speeds, temps, stability of the overclock, generational gaps between comparisons....if it's a Maxwell GPU, one person could be seeing some thermal throttling or power limit throttling and not even know it.


I'd rather not PM you any examples. Since they're already a matter of public record. And I don't exactly see the point of it. In this instance at least. So here's what I'm looking at.

Ok, for example: say someone had a pair of GTX 780 Tis @ 1162/1892 + an i7-3930K @ 4.9GHz and got a score of 3894 in Valley under the rules of this thread. Does it stand to reason that the same person, with the same pair of GTX 780 Tis @ 1147/1852 + an i7-3770K @ 4.6GHz, could score 5246 in Valley under the rules of this thread? And if so, how or why?
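For concreteness, the gap between those two hypothetical runs works out to roughly a third, despite the second run using slightly lower GPU and CPU clocks:

```python
# The two runs being compared, with the headline numbers from the example above.
run_a = {"gpu_mhz": 1162, "cpu": "i7-3930K @ 4.9 GHz", "score": 3894}
run_b = {"gpu_mhz": 1147, "cpu": "i7-3770K @ 4.6 GHz", "score": 5246}

gain = run_b["score"] / run_a["score"] - 1
print(f"{gain:.1%}")  # ~34.7% higher score on lower GPU and CPU clocks
```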

Like I said, I'd like the debate about such a topic to be open for all to see here. I'm willing to initiate the conversation(as I believe I have and am). But I don't think it's fair to the person in question to be left in the dark about it. Especially if it ends up in a decision to remove the score from the list. I'd like for whomever that someone is to have a chance to witness what's happening here and at least have a chance to explain themselves. Of course that person could choose not to participate in the discussion. I can only hope that they would. As it could likely be the only reason to not remove the score in this case.

Thanks for offering your help and advice. I'm thinking I might need a referee of some sort from time to time, to make the calls that I can't based on my limited experience. But I'm also kind of hoping everyone can be, to whatever extent they can at least. I definitely don't know all there is to know about the performance limitations of all the graphics cards and processors available. And what I do know about is typically too old and outdated to really apply here. Aside from my last 2 gaming setups. Which, other than previously stated, includes some experience with an HD 6950.


----------



## Vellinious (May 7, 2016)

First....your rules are a little restrictive.  There's absolutely NO way to tell if an NVIDIA guy has altered the performance settings in the control panel.  None...may as well just allow driver tweaks, EXCEPT in the case of AMD and tessellation and move on with life.

Will that little tweak make up that difference?  Doubtful.  If anything, I'd say whoever ran the 3894 has got some serious issues someplace else.  A pair of 780 Tis running Valley should see at LEAST a 5k score....anything less would indicate to me there are issues.

It's a benchmark thread...it's about doing everything you can to obtain a higher score...  Just my two copper.

**In fact, you should just go ahead and remove my 2 x 970 score, because I used that driver tweak.  Just realized today, that was a rule....


----------



## MrGenius (May 7, 2016)

I totally see your point about the rules being too restrictive. That was fully intentional. My idea is to try and put maximum stress on the CPU and GPU at the given resolution. Mainly to keep the Nvidia guys from kicking my/our ass so badly. It seems to be working so far. So I doubt I'll be changing it. It is a benchmark thread I get that. But as I gather, not your typical one. And not like the old Valley thread. More like the old Heaven thread.

Thanks for the input anyway. It's duly noted.

Ok. Will do. I greatly appreciate your honesty. All you had to do was say nothing. But I'm very glad you didn't.


----------



## Vellinious (May 7, 2016)

I think you'll find, that a lot of people aren't going to be so honest about it.  SOME AMD guys will try to sneak some tess tweaks in, and SOME NVIDIA guys will try to sneak in the performance textures tweak...among a few others...

Point is....you'll NEVER know....  I could run my 290X right now, and post a screenshot of a 3.7k score, and you'd never know I used a tess tweak.  Just like you'd never know I used driver tweaks on my 970s.  Hell....I gamed with my driver settings tweaked...


----------



## MrGenius (May 7, 2016)

And I totally get your point. My counter point is that there are many things that can't exactly be proven. So many in fact that I can't imagine anything that can't be faked or falsified to a degree as to be virtually, if not entirely, undetectable. Therefore if we are going to use the "honor system" for anything, we might as well use it for everything. Ok...not everything. But you get my point.

I don't know what else to say. I've always followed the rules stated as far as I understood them, and even applied the more restrictive interpretations in some cases. Not with the old Valley thread. Because "highest" to me meant "highest performance". If he had used the word "maximum" I might not have. That, and the flat-out vagueness of the statement. "Highest settings" as in what? Supersampling AA? 64x tessellation? High-performance or high-quality texture filtering? So I just assumed he meant what I'm sure you, and everybody else, thought it meant. Whatever yields the highest performance. Quite obviously the case by looking at the old scores.

Well one more thing I could say is, it could be worse. Much worse. Like give it a try with SS AA and High Quality texture filtering. Watch as your score plummets. Then you'll be thanking me I didn't make those a rule. And it won't seem like so much to ask for default driver settings.


----------



## Emmanuel Fuller (May 7, 2016)

I posted a Google Drive link of the screenshot in my post. Can you see it there, or did the moderators block it?

okay, I found an online image editor to make the 8MB file a lil smaller.


----------



## MrGenius (May 7, 2016)

That's not a 7852 score. It's probably still a good 4K score though. But we aren't doing 4K.

Seems a bit compressed or something. Let's try it like this.


----------



## Emmanuel Fuller (May 7, 2016)

MrGenius said:


> And I totally get your point. My counter point is that there are many things that can't exactly be proven. So many in fact that I can't imagine anything that can't be faked or falsified to a degree as to be virtually, if not entirely, undetectable. Therefore if we are going to use the "honor system" for anything, we might as well use it for everything. Ok...not everything. But you get my point.



I understand, I just ran Valley at 4K with the same settings the opener stated. My setup runs 60fps+ on all new games with 90% of everything turned up; the extra 10% we can say is 16x AA. At 4K there aren't really any jagged edges anymore. There's no need for AA over 8x, and AF at 16 is still really nice. You can find a beautiful happy spot. No kind of dynamic resolution can show you what 4K gaming really looks like on a 4K monitor. If any of you remember what 1080p looked like when it first came out, it was amazing. Also, benchmarking at 1080p will not stress these modern cards out. I guess 4K shows you how good your fill rate and memory bandwidth really are. This is just my honorable-mention Valley 1.0 4K test submission.

P.S. Did anyone catch how hot Valley 1.0 thinks my video card is?


----------



## MrGenius (May 7, 2016)

Yeah. The temp thing seems to be a Fury bug. They all look like that.

Do a 1920x1080 run with the settings per the rules in post #1. I would love to see that.

It's not all about stressing the card out. It's about the score at 1920x1080.


----------



## jboydgolfer (May 7, 2016)

Regarding scores that are WAY higher than like systems, as far as what @MrGenius and @Vellinious are discussing: I am of the opinion that if there are 10 users, all of which have the same or similar systems, and 1 of the users scores WAY more than the other 9, I personally don't consider it a "valid" score in my own mind. A few hundred points can be gained here and there, but thousands are likely exploit-generated, and I personally disregard them. THAT'S how I resolve the "you'll never know" dilemma.
As I mentioned earlier on in the thread.

For what it's worth.


----------



## MrGenius (May 7, 2016)

I agree. I just don't want to be the sole decision maker is all. Because I'm just as likely as anybody to make the wrong decision. I'm going to have to ask for some input from you guys first. I mean I have no effing idea how a pair of GTX 780 Tis beats a pair of GTX 970s. But I've at least got one other person saying it's possible. So I suppose it is. I could easily make a poor judgment call on that one though. I don't know how stupid it is that I think that or not. I have no experience in the matter. And the switch up with the CPUs just confuses the fuck out of me even worse. In my head it seems like everything got slower, but the score got higher. Am I fucking retarded or what? Help me out here.

Fuck it. I give up. Hey @jaggerwild what's with the 2 multi GPU scores being so much different there? There seems to be a general consensus that it looks weird. And since nobody else seems to have the nerve to ask you straight up about it, it looks like I'm going to have to then. I'm still not saying you did anything wrong. I'm just wondering WTF you did do. Because you must have done something(other than trade CPUs). It just doesn't make a whole lot of sense(not even the CPU trade part). And that's putting it mildly. Frankly it makes no sense whatsoever. Not to me anyway. So let's quit burning through thread space and put it to rest. No reply means I'm walking away from it for now. But it'll also probably mean we'll be back to it at a later date. My guess is the next time somebody shows up with a pair of GTX 780 Tis.


----------



## jaggerwild (May 7, 2016)

Emmanuel Fuller said:


> https://drive.google.com/open?id=0B-AEgBUCcQAwenpHRGRDQ044SUU
> 
> Here's this benchmark running in 4k. The link is a google drive shared link that shows the picture. I posted this to show you what you need to play recent games maxed out in 4k. You can do it with an i5. 4x my score would be a score of 7852.




Read the first post in this thread; you wanna show yer E-pen, you got to play by the same settings. OH, and welcome to TPU! You got a pertty mouth! 



> Fuck it. I give up. Hey @jaggerwild what's with the 2 multi GPU scores being so much different there? There seems to be a general consensus that it looks weird. And since nobody else seems to have the nerve to ask you straight up about it, it looks like I'm going to have to then. I'm still not saying you did anything wrong. I'm just wondering WTF you did do. Because you must have done something(other than trade CPUs). It just doesn't make a whole lot of sense(not even the CPU trade part). And that's putting it mildly. Frankly it makes no sense whatsoever. Not to me anyway. So let's quit burning through thread space and put it to rest. No reply means I'm walking away from it for now. But it'll also probably mean we'll be back to it at a later date. My guess is the next time somebody shows up with a pair of GTX 780 Tis.




Wow, a 500 lb gorilla in the room.
Since people keep popping in asking: I swapped my GPUs from my 3930K rig (in the kitchen window) to my everyday 3770K rig. I'll admit I do shut down background items. I was thinking Valley doesn't like my 3930K, or my overclock is so skewed that it's about to chit the bed, for lack of a better term. Maybe Valley favors the smaller chip; I must admit it seems skewed, I even noticed it myself. I'll try running it a few more times (both cards are on a loop with pump and rad). I was gonna leave 'em in the 3770K, as I will be replacing them, but I can re-run in my X79 rig also. Honestly, on both rigs I'm only shutting down Internet Explorer and a few background laggers, down to like 30 background processes. It's totally cool if you feel one or both scores aren't legit; remove them and I can re-run 'em (I won't take offense). They are Asus GTX 780 Ti DirectCU II OC cards under water. My 580s didn't score high because, let's face it, they're old school. I wouldn't discount a pair of $1400 GPUs as unable to hang with cards one generation newer. Feel free to take it down; I planned on topping it anyway. If that offends, please ask to see anything. I haven't broken out my phase change on my 3930K, and we have had very cool mornings here in NY. I'd rather show anything you want to see in the picture (like the CPU running full throttle?) when I do a screenie. If anyone feels I'm cheating, take it down!!!!!! But I do ask for a clear example so I can correct it, to be fair to me. Truth is I haven't been reading the thread, but someone mentioned moving hardware, and I kinda thought so; those are my cards.

Oh, for clarity: the first 780 Ti runs were in an Asus X79 Deluxe, then the cards moved to my Asus Maximus V Formula (just got it) and ran there. My other runs were with GTX 580s; I assume no issue with those scores? Like I said, pinpoint it and I'll redo it. I didn't switch CPUs, just rigs, with my GPUs (AGAIN, I SURELY WASN'T trying to cheat). I can go higher with my phase change unit; it runs at a constant -45 on my Sandy Bridge-E. Why its score was so low compared to my 3770K is anyone's guess. Someone did mention the minimum frame rates seem different; I noticed them change from one run to the next, but I in NO WAY used that to my advantage. In fact, on the next run (my high score) the minimum went down but the score went up.


----------



## MrGenius (May 7, 2016)

Next topic of discussion: Windowed 1920x1080. Yea or nay?

I say it's fine. So I'm probably going to allow it. Please give me your opinions on it ASAP. I'd like to change the rules sooner than later.

Thanks for your input!

@jaggerwild I have no problem with it myself. Seriously. I will say I got a PM about it. And it did make me wonder even before that person said something to me. I pretty much told that individual "I'm not gonna touch it. I'm willing to give the benefit of the doubt. But maybe if I had a pair of GTX 780 Tis I'd be pissed". But I don't have enough info to conclude anything about it at this point. I'm just trying to play the fool here so someone else doesn't have to. I mean, let's face it, it's bound to happen eventually. Or not...I don't know. But it wouldn't surprise me if someone did. It didn't make a lot of sense to me, or that guy. So I can see it not making much sense to somebody else.

God's honest truth, that 5MHz thing makes less sense to me than this. But nobody said anything to me about that, so I'm not asking. It's a helluva lot easier to just take everyone at their word and leave it at that. I don't want no part of no conflicts or battles about nothing as far as these threads are concerned. HOWEVER, that's UNLESS it has to do with an HD 7970/R9 280X. I know EXACTLY WTF I'm talking about there. And I know EXACTLY what you can do with one of those. And if you think you're going to waltz in here and knock me off of my throne....you better bring your A game and have your proof in triplicate. Because I am, have, will, and do. I ain't fucking around when it comes to that.


----------



## jaggerwild (May 7, 2016)

I can post pictures of each rig, and even a vid if it would help. jboydgolfer, like Vellinious said, are you gonna tell me EarthDog is on top by accident? But I pop in a score and it's not valid? Has someone else run some 780 Tis in SLI? It's cool, take down my scores and delete the posts!!!! It's supposed to be a fun thread, and now I'm a villain...................................

Click the link in my sig and see my hardware library at HWBot; have a visit. I go through motherboards like some people go through underwear.

I'm done!

OOPS, just noticed the 1080 question. It's the highest res I can run presently; I think it would run higher, but I assumed it wouldn't be a good score. MrGenius, take down the 780s with the 3770K; I can't delete it. OK, I'm done.


----------



## MrGenius (May 7, 2016)

@jaggerwild Wait wait wait...hold up. What's going on now? Nobody's accusing nobody of nothing last I checked. And no jboydgolfer never said anything to me about it. I don't think that's what he means by that either. He's just hypothesizing. I don't think that was directed at anybody.

BTW don't go deleting nothing please. Then I will have to take the scores down for lack of proof.

We're cool. Everybody's cool. This whole thing is probably my fault. I'll take the blame. I shouldn't have mentioned it. I didn't want to. But for some dumbass reason I did. MY BAD!!

Oh and the 1920x1080 windowed thing was for the 4K guy(s). Or whoever. I didn't even know you were running it windowed. If you were. And like I said that's fine by me even if you do. I just wanted to see what everybody else thinks.

HOLY SHIT!!! What have I done now??????????!!!!!!!!!!


----------



## jaggerwild (May 7, 2016)

Here's a run with HD on, so it doesn't count; nothing shut off in Windows, in fact Firefox open in the background. I do have the SKYNET BIOS on my GPUs and will have to find my max OC on this setup. EXAMPLE: HD @ 4500MHz




My eyes are bad, but clearly you can see the GPU settings................. 

^3770K, MVF board, 780 Tis^


----------



## MrGenius (May 7, 2016)

Ok...that's not bad. But I liked your previous score better. So if you don't mind I'd like to keep it on the list. Up to you.

Bad ass rig BTW. Or rigs? So much shit going on there I don't even know what all I'm looking at. Looks pretty sick though! 

Oh, now there are 2. I think I like the one on top best. Hard to tell by looking though.


----------



## jaggerwild (May 8, 2016)

Yeah, I've got to change the tubing to red on my GPUs in the top one, then insert it all in an InWin D-Frame Mini. Thanks. Gonna get new videos for the X79 Deluxe; I sell so much stuff it's always changing. 1 of my GTX 580s is sold, one more to sell, then shopping time. I'm gonna run my 5 HD 5770s on the X79, always fun to knock out a world record or 2. Better view: I used an AMD CPU water block between the GPUs, as it was a tight bend and I didn't want it to kink.






ON WITH THE THREAD!!!!!!!!!!!!!!!!!!!! One last photo for the DOUBTERS:
3DMark 11 Extreme score





3770K @ 4500 (to compare; nothing shut down but Explorer), GPUs @ 1201/1852





Link to the overclock.net thread showing 780 Ti scores in the 6000 range, almost 7000:
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0


----------



## EarthDog (May 9, 2016)

100 posts in and you are still switching things up??? 


Vellinious said:


> First....your rules are a little restrictive.  There's absolutely NO way to tell if an NVIDIA guy has altered the performance settings in the control panel.  None...may as well just allow driver tweaks, EXCEPT in the case of AMD and tessellation and move on with life.
> 
> Will that little tweak make up that difference?  Doubtful.  If anything, I'd say whomever ran the 3894 has got some serious issues someplace else.  A pair of 780tis running Valley should see at LEAST 5k score....anything less would indicate to me, there are issues.
> 
> ...


+111111111111

~100 posts in and there are still rule changes...

Noble effort man, but shhhhhhstuff should be together BEFORE these threads are started.

Unsubscribed... best of luck!


----------



## jboydgolfer (May 9, 2016)

Reading through these posts, it seems like I was mentioned...why?

I've made no accusation, unless it's a typo @jaggerwild

Frankly I can't tell if he is mentioning me; he typed a few names, but I'm having trouble deciphering his meaning.

I'm referring to this here....



jaggerwild said:


> I can post picture's of each rig and even a vid if it would help, Jbody like villianious said you gonna tell me Earth dog is on top by accident?



All I ever said was that if the score of a member is thousands above the scores of members using the same builds and systems, I (me, myself, personally) don't count that score. This is my right. To me it seems dubious, but I didn't accuse a single person; I was merely replying to what @MrGenius and @Vellinious were discussing together. So take me out of your accusational rant, please, IF that IS indeed my name. I'm not that kind of person.
Frankly, I don't like the direction this thread is taking, so I'm just going to stop posting in it. All of this contention is NOT the reason I visit TechPowerUp. Some may enjoy it, but I have enough stress in the REAL world.
Regards, and best of luck to all of you. Remember, the scores mean nothing.


----------



## Shift1190 (May 9, 2016)

I7 6700K @4.6Ghz MSI 980Ti SLI


----------



## Vellinious (May 9, 2016)

^^This is a perfect example of a couple high end cards being held back a LOT by CPU core clock.  As I stated earlier....the more powerful cards suffer more.


----------



## jaggerwild (May 9, 2016)

^+1, could clock that so much higher................. He's only got 2 posts, take it down!


----------



## Shift1190 (May 9, 2016)

jaggerwild said:


> ^+1 could clock that so much higher................. Hes only got 2 posts take it down!



I'm penalized for being a newbie to the forum?


----------



## MrGenius (May 9, 2016)

It's a joke. He's just got a messed up sense of humor.

Can we just do this without so much gosh darn drama? Good flipping grief...


----------



## Shift1190 (May 9, 2016)

Lol


----------



## MrGenius (May 10, 2016)

Shift1190 said:


> Lol


I'm glad you found that funny. It lightens the mood a bit.

For anybody who thinks that was out of line: I'm sorry. I'm just a little ticked right now about being accused of shit I didn't even do. Then asking politely what he meant by that. I even PMed the guy and got nothing as a reply. So that's the end of that. The end of what? *Taking shit from some asshole for no good reason at all*. What am I going to do about it?

Well @EarthDog I hope you meant you weren't coming back. Because you just sealed your fate by acting like you did. You're not welcome back here or in the Heaven thread. *You're now banned from both.* You can kiss your score good-bye. And you better not even show your face. Won't matter if you do, you're on my ignore list. I showed you respect. And there's no excuse for you to treat me like that in return. You had your chance to act like a decent human being and you blew it. I hope you're proud of yourself.

Explanation and apology accepted. Ban removed. Score is being relisted.


----------



## jaggerwild (May 10, 2016)

Shift1190 said:


> I'm penalized for being a newbie to the forum?



 Was trying to take the edge off is all, Welcome to TPU~~~~


----------



## Shift1190 (May 10, 2016)

jaggerwild said:


> Was trying to take the edge off is all, Welcome to TPU~~~~


Thanks, I enjoy reading the posts and learning from people who push their systems to the limits.


----------



## Jetster (May 10, 2016)

What's new about it?


----------



## THE_EGG (May 10, 2016)

i7-5930K @ 4.5GHz, Asus STRIX GTX 980 @ 1416MHz boost clock according to Afterburner (otherwise 1404MHz according to GPU-Z), 1878MHz memory clock. Oh, and I'm on 365.10 drivers with Windows 10.


----------



## Shift1190 (May 10, 2016)

THE_EGG said:


> View attachment 74376
> 
> i7-5930k @ 4.5Ghz, Asus STRIX GTX 980 @ 1416mhz boost clock according to Afterburner (otherwise 1404mhz according to GPU-Z), 1878mhz memory clock. Oh and I'm on 365.10 drivers with Windows 10.



Question: why does Valley report the 980 card as 4GB of VRAM? It does the same on my 980 Ti, just noticed.


----------



## Jetster (May 10, 2016)

Okay

Intel i7- 4790k @ 4.4, MSI GTX 980 1216/1753 = 2956


----------



## MrGenius (May 10, 2016)

Shift1190 said:


> Question: why does Valley report the 980 card as 4GB of VRAM? It does the same on my 980 Ti, just noticed.


Because Valley's not perfect. It doesn't mean anything more than that.



Jetster said:


> *1216*/1753


Somebody...not me...might get on your case about that. Some folks are real picky about the whole boost thing. I'll put 1481 instead if you'd like.


----------



## PHaS3 (May 10, 2016)

Decided to give it another go... 

i5 3570k @ 4.8 + 970 @ 1594 / 1872 (7488) = 2643


----------



## erocker (May 10, 2016)

MrGenius said:


> I'm glad you found that funny. It lightens the mood a bit.
> 
> For anybody who thinks that was out of line. I'm sorry. I'm just a little ticked right now about being accused of shit I didn't even do. Then asking politely what he meant be that. I even PMed the guy and got nothing as a reply. So that's the end of that. The end of what? *Taking shit from some asshole for no good reason at all*. What am I going to do about it?
> 
> Well @EarthDog I hope you meant you weren't coming back. Because you just sealed your fate by acting like you did. You're not welcome back here or in the Heaven thread. *You're now banned from both.* You can kiss your score good-bye. And you better not even show your face. Won't matter if you do, you're on my ignore list. I showed you respect. And there's no excuse for you to treat me like that in return. You had your chance to act like a decent human being and you blew it. I hope you're proud of yourself.


It's the internet. Ignore it and move on. Don't let it deter from the fact that you're doing an awesome thing by keeping track and updating this thread. 

Cheers.


----------



## Jetster (May 10, 2016)

MrGenius said:


> Somebody...not me...might get on your case about that. Some folks are real picky about the whole boost thing. I'll put 1481 instead if you'd like.




What? 1481 would be more than 1216?

So I guess I don't know what you're talking about.


----------



## Vellinious (May 10, 2016)

MrGenius said:


> I'm glad you found that funny. It lightens the mood a bit.
> 
> For anybody who thinks that was out of line. I'm sorry. I'm just a little ticked right now about being accused of shit I didn't even do. Then asking politely what he meant be that. I even PMed the guy and got nothing as a reply. So that's the end of that. The end of what? *Taking shit from some asshole for no good reason at all*. What am I going to do about it?
> 
> Well @EarthDog I hope you meant you weren't coming back. Because you just sealed your fate by acting like you did. You're not welcome back here or in the Heaven thread. *You're now banned from both.* You can kiss your score good-bye. And you better not even show your face. Won't matter if you do, you're on my ignore list. I showed you respect. And there's no excuse for you to treat me like that in return. You had your chance to act like a decent human being and you blew it. I hope you're proud of yourself.



Man....wtf is your deal?  You set rules in opposition to having a good competition, because some may and some may not follow the rules...and as there is absolutely NO way to tell, you've CREATED an unfair environment.  For someone that says he likes fairness so much....that seems folly.  THEN, you start banning people because they don't agree with you?  lol

1.  Get some thicker skin
2.  The TPU forums aren't the end-all be-all of PC / overclocking forums.  In fact, they barely register...I put it on the same level as LTT forums, and JUST ever so slightly above Tom's Hardware.  So, get over yourself.

I'll await my banishment.  /shivers


----------



## EarthDog (May 10, 2016)

MrGenius said:


> I'm glad you found that funny. It lightens the mood a bit.
> 
> For anybody who thinks that was out of line. I'm sorry. I'm just a little ticked right now about being accused of shit I didn't even do. Then asking politely what he meant by that. I even PMed the guy and got nothing as a reply. So that's the end of that. The end of what? *Taking shit from some asshole for no good reason at all*. What am I going to do about it?
> 
> Well @EarthDog I hope you meant you weren't coming back. Because you just sealed your fate by acting like you did. You're not welcome back here or in the Heaven thread. *You're now banned from both.* You can kiss your score good-bye. And you better not even show your face. Won't matter if you do, you're on my ignore list. I showed you respect. And there's no excuse for you to treat me like that in return. You had your chance to act like a decent human being and you blew it. I hope you're proud of yourself.


Wooooooow, really? That came from nowhere...

Alert me to come back... ok. Since I'm here... let's set things straight, shall we?

1. You PM'd me at ~5PM last night according to the timestamp/email.
2. I didn't log in (at a desktop - saw I had a PM though) until this morning and read your PM. My apologies for not responding instantly, but I was taking care of the things that matter in life... my family and I.
3. After the nice PM, 3 hours later, you went off ALERTING me about it to come back. W.T.F???
4. NOBODY accused you of anything. Are you that paranoid, or really have no idea what people are trying to say to you to help?

You deleted my result? Hahahalololohahaha!!! Really, I couldn't care less. If you don't want me back in this thread, I suggest not PMing me AND alerting me three hours later to come back to it. You were polite and inquisitive in your PM, which was HOURS after my post. Then, it feels like you went bipolar on me in public????!! WTF?

Anyway, my PM in response to yours explains things... though if I had seen this vitriol-infused nonsense, I probably wouldn't have bothered to respond. But for those that are curious... here is your nice PM and my nice PM in response:



			
MrGenius said:
			
		

> I didn't change anything. So I don't know what you mean by that. And I'm not really asking for an explanation if you don't want to give one. That's fine. I'm just trying to explain to you that I didn't do what it sounds to me like you're saying I did. I'm doing the best I can here. I don't expect to meet everybody's expectations. Just giving it my best shot. I hope you don't think I did anything to spite you on purpose. Maybe I did something, but it wasn't intentionally directed at you. I had a talk with jaggerwild but I didn't mention your name. I wouldn't do anything like that without your permission. I'm just sitting here wondering what you think I did or didn't do. I honestly don't know what it is or isn't. But I apologize anyway...for whatever. I don't need any more enemies.
> 
> So...I guess that's about the size of that. Thanks for listening.





			
EarthDog said:
			
		

> I have ZERO issue with any of the scores listed or anyone in the thread.
> 
> My concerns with the thread are the odd rules for the benchmark and the incessant waffling over rules. Typically, these threads are created with rules in mind already and by those who know what they are doing (sorry, I told you I was direct to a fault!). Vellinious hit the nail on the head in that the point of benchmarks is to get the highest FPS/score possible. So to put limits on things that are SO DIFFICULT to discern (i.e. the texture filtering) will really bring into question the validity of the result.
> 
> ...



In the end, I do apologize for voicing my opinion on the matter. My intentions were to help steer it straight as you were receptive to input throughout, but then you pulled an about-face. I apparently assisted in making it worse by trying to help (but also voicing my frustrations with the process in my last post). Somewhere between your nice PM and your insulting/flame-bait post you lost your mind... sorry that I caused such a hateful response.





Vellinious said:


> Man....wtf is your deal?  You set rules in opposition to having a good competition, because some may and some may not follow the rules...and as there is absolutely NO way to tell, you've CREATED an unfair environment.  For someone that says he likes fairness so much....that seems folly.  THEN, you start banning people because they don't agree with you?  lol
> 
> 1.  Get some thicker skin
> 2.  The TPU forums aren't the end-all be-all of PC / overclocking forums.  In fact, they barely register...I put it on the same level as LTT forums, and JUST ever so slightly above Tom's Hardware.  So, get over yourself.
> ...


+1


----------



## rtwjunkie (May 10, 2016)

Shift1190 said:


> I'm penalized for being a newbie to the forum?



No, it was meant as humor. 

Also, on your other question: Valley reports the 980 as 4GB of VRAM because that is how much VRAM the 980 has.


----------



## P4-630 (May 10, 2016)

I'm sadly in the lowest spot now with my GTX 770M. 
Can't wait to buy a GTX 1070 and see what that scores.


----------



## MrGenius (May 10, 2016)

Ok kids. Let me explain how it works in the real world.

1. Don't accuse me of doing shit I didn't do.
2. If you don't like the rules or the way the game is played, then don't play. Go play somewhere else. Simple as that.
3. Please grow the heck up and quit bothering me with all this childishness. *Addressed TO WHOM IT MAY CONCERN!!!*
4. PMs are *Private Messages*. In this case I don't care that it was made public. *Since I deleted the post here explaining what it states.* But please don't do that again without asking me first. BTW you just outed yourself too. If jaggerwild catches the drift of that conversation. Since he knows someone PMed me about his scores. Gee who might that have been?
5. When someone is put on another person's ignore list, their posts are made invisible to the person whose ignore list they are on. Pretty self-explanatory.



Spoiler: The deleted post as best I can recall



In fairness most of the ~100 posts so far are just jibber jabber. It's not like there's ~100 scores listed(looks like 18 so far). And I never got around to changing the rules to allow 1920x1080 windowed. I've been considering changing the rules to allow it. But I haven't done so yet. And I'm still waiting to hear what everybody else has to say about it. I'm only considering allowing it so more people would be able to play the game. Is that somehow a bad idea?

Anyways, other than that I really don't know what you're talking about. But regardless I'm sorry if I ruffled your feathers. That wasn't my intent.





Spoiler: In response to...






EarthDog said:


> 100 posts in and you are still switching things up???
> +111111111111
> ~100 posts in and there are still rule changes...
> Noble effort man, but, shhhhhhstuff should be together BEFORE these threads are started.
> Unsubscribed... best of luck!





 


Jetster said:


> What? 1481 would be more than 1216?
> 
> So I guess I don't know what you're talking about


Your core speed makes no sense for that score IMO. I've listed it as such anyway, since if the clock speeds you state are correct, then I don't care what's shown in the screenshot. But the issue has been raised before, so I thought I'd let you know ahead of time.


----------



## jaggerwild (May 10, 2016)

Shift1190 said:


> Thanks, I enjoy reading the posts and learning from people who push their systems to the limits.




You're welcome, and you're welcome HERE!

 There are lots of great minds here, VERY VERY helpful ones, and the giveaways (look around, you'll see)!!!! We should all go out of our way to help/encourage/welcome new blood to the addiction... I mean hobby... of computer hardware, especially here with what we have. Shift1190 (is that a new socket coming out soon?), if you ever need anything just ask, feel free to PM me or both. You need help reaching 4900MHz, so............................. I get a buzz from it.


----------



## MrGenius (May 10, 2016)

PHaS3 said:


> Decided to give it another go...
> 
> i5 3570k @ 4.8 + 970 @ 1594 / 1872 (7488) = 2643
> 
> View attachment 74384 View attachment 74385


I can't accept that because the screenshot isn't from within Valley. If you can do it like you did before, that would be great.

Thanks!

EDIT: Wait...it looks like maybe it is. So I guess I can. Though it's pretty borderline. I'm sure it's going to cause a problem with someone at some point. So if you can fix that it would be better. Until then I'm going to call it valid. I suppose.


----------



## HammerON (May 10, 2016)

@MrGenius please tone the language down. You do not need to continue using the f word to emphasize your point/frustration. Thanks


----------



## MrGenius (May 10, 2016)

Got it. Just a little upset. Let me go back and fix 'er up. Thanks for letting it slide. Won't happen again.


----------



## PHaS3 (May 10, 2016)

MrGenius said:


> EDIT: Wait...it looks like maybe it is. So I guess I can. Though it's pretty borderline. I'm sure it's going to cause a problem with someone at some point. So if you can fix that it would be better. Until then I'm going to call it valid. I suppose.



It was an F12 from within Valley... what looks suspect about it? The GPU-Z was a separate screenshot.

Edit - the blackness? I was fast with my F12 after the bench ended and the window appeared, is all... If you think it's too suspect I won't argue, up to you. Thanks for the thread, having some fun pushing the OC again.


----------



## MrGenius (May 10, 2016)

I've got no problems with it. On the list it stays.


----------



## Tomgang (May 10, 2016)

Oh man. So much for that 3rd place. Oh well, two GTX 970s are no match for two GTX 980 Tis.

But it looks like my seven-year-old i7 920 holds up well compared to systems with newer CPUs and the same graphics card setup.


----------



## Jetster (May 10, 2016)

*MrGenius, *

I ran it again with a monitor open in the background. I guess it does have boost; I wasn't aware of that. Feel free to change it.


----------



## MrGenius (May 10, 2016)

The thing is I don't know much about exactly how boost works. I've never owned a boost card. I have a feeling you can top that score there. But you can always post that score later. I'll go ahead and list what I believe are the correct clocks and the lower score for now. Unless you want it listed differently. The clocks are whatever you say they are. I just want to see a valid screenshot for the score. I technically shouldn't list a lower score. But also...technically...I have to because we aren't sure about what the GPU core boost was on the last one.

And just so you know for future reference, if you leave it out(please don't) I'll take what the screenshot says. Hence I offered to use 1481 instead. As it stands I'm going with 1355 since that sounds closer to the truth to me(than 1216). But I'm no expert on the subject(it very well could be 1481). I'm just a poor fool trying to keep a thread straight. Dodging bullets as I go. To the best of my ability. Who knew this was going to be so hellish? Not I.

Thanks for helping me out with that though. I need all the help I can get here. And if you happened to see what I said earlier, before I edited that reply, please understand I made a mistake by misreading what you said. I'm having a really bad day with this thread today. And you're probably the last guy here I would ever intentionally attack or be rude to on purpose. You do have an unfair advantage being a fellow Oregonian. Other than that though you seem to be a kind soft-spoken person. And I appreciate that greatly.


----------



## Jetster (May 10, 2016)

1355 is fine. Change it


----------



## JohnnyDirect (May 10, 2016)

6700K @4.6 2x980 Ti Clocked @1506 Mem 4131/8261mhz


----------



## MrGenius (May 10, 2016)

JohnnyDirect said:


> 6700K @4.6 2x980 Ti Clocked @1506 Mem 4131/8261mhz


Got screenshot?


----------



## JohnnyDirect (May 10, 2016)

MrGenius said:


> Got screenshot?


I'm working on it; it was too big to upload, apparently.


----------



## Jetster (May 10, 2016)

Just use the Snipping Tool in Windows

Or TPU capture


----------



## MrGenius (May 10, 2016)

I use MS Paint and save as jpg.


----------



## JohnnyDirect (May 10, 2016)

I posted the screenshot in the original post. Sorry for the delay. I didn't expect it to kick back my pic and still post the text. Doh!


----------



## Shift1190 (May 10, 2016)

JohnnyDirect said:


> 6700K @4.6 2x980 Ti Clocked @1506 Mem 4131/8261mhz



That's a helluva score, nice man


----------



## EpicGrog (May 10, 2016)

8-year-old X58 platform with a 4.4GHz 12-core Xeon / SLI 980 Hybrids, OC'ed


----------



## MrGenius (May 11, 2016)

JohnnyDirect said:


> 6700K @4.6 2x980 Ti Clocked @1506 Mem 4131/8261mhz


Ok...so other than turning the sound off, which is against the rules, what else did you do? Because that score is beyond possible. And is disqualified because of the no turning off sound rule regardless.

But...if turning the sound on puts your score back in the range of possibilities...I'll consider listing that one. This one...no chance in hell. Please do read *the rules* first, before you try again, to make sure you aren't breaking any more rules than the one previously mentioned. There's a great deal of honesty and self-policing required to keep this thread fair and free of deception. Most of which is in your hands. You play fair and I don't have to be judge, jury, and executioner. Which is becoming quite bothersome for me as of late. So your cooperation in these matters is greatly appreciated.

Thanks!

We've got a saying where I come from. Goes a little something like this. RTFM. As in Read The F'ing Manual.

I'm creating a new saying for these parts.  RTFR. As in Read The F'ing Rules.


----------



## JohnnyDirect (May 11, 2016)

If I had the sound off it was inadvertent. I read the rules and checked my settings, and may have turned it off as I was checking... I don't know. But I made sure it was *on* for this run. In addition, I then put my clocks back to where I leave them for daily use to show that I'm not far from max overclocks. So here are my runs along with Nvidia Control Panel.


----------



## MrGenius (May 11, 2016)

Then I'll accept it as possible. I will ask that you state the correct boost clock(and mem clock too if you would). If it is other than 1682(and 2078). 1506 doesn't seem likely is the reason I ask.

Other than that...welcome to the #1 Multi GPU score position. Nice job!

BTW...not a single yea or nay on the 1920x1080 windowed thing from anybody? Well...1 yea from me. Which I suppose doesn't count. Oh well...I tried. Maybe we'll revisit the idea at a later date. Until then it's "don't ask, don't tell".


----------



## JohnnyDirect (May 11, 2016)

MrGenius said:


> Then I'll accept it as possible. I will ask that you state the correct boost clock(and mem clock too if you would). If it is other than 1682(and 2078). 1506 doesn't seem likely is the reason I ask.
> 
> Other than that...welcome to the #1 Multi GPU score position. Nice job!
> 
Well, it's consistently possible, and my 2nd post of 7280 beat my 1st post. Your post is still new; I have no doubt my score will be beat quick, fast and in a hurry. I can run at 1519mhz but Valley will crash after a few runs, so I'm posting 1506. Nvidia's boost goes up/down in 13mhz increments. I can't get through a run @ 1531mhz. When I'm not showing off my max OC, I set my core clocks to 1493mhz and leave my mem clocks at 4131. I can play any game at those clocks all day stable. I can't take a regular screenshot through Windows without a blank screen, and Valley won't take a screenshot showing the Afterburner OSD, so here is a camera shot.
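
The 13MHz step described above means stable clocks land on discrete bins above a base clock. A toy sketch of that arithmetic (the 13MHz step and the 1493MHz base are taken from the post, not from any NVIDIA spec):

```python
# Snap a requested core clock to the nearest boost bin, using the ~13MHz
# step size described in the post above (assumed values, not from a spec).
def nearest_bin(target_mhz, base_mhz=1493, step_mhz=13):
    steps = round((target_mhz - base_mhz) / step_mhz)
    return base_mhz + steps * step_mhz
```

So 1506 and 1519 sit one and two bins above 1493, which lines up with the clocks reported as stable.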


----------



## MrGenius (May 11, 2016)

Alright then. 1506/2079 it is. That should go over well.


----------



## EpicGrog (May 12, 2016)

I think I hit the limit for these cards at 1.25V. Squeezed a little more out of them at 1582/3806 and a PCIe clock OC at 103MHz.


----------



## Shift1190 (May 12, 2016)

EpicGrog said:


> I was able to squeeze a little bit more out of the system (122 points) by increasing the PCIe clock to 103MHz


Nice


----------



## MrGenius (May 12, 2016)

Here's a tip: In the Valley > screenshots folder, right click on the screenshot file. Then scroll down to Open with > Paint and left click. Then in the upper left corner left click File. Scroll down to Save as > JPEG picture and left click. Take note of where the file is going to be saved in the window you've opened. Then left click Save. This will yield a very good quality screenshot file that can be uploaded and inserted into your post with no extra steps involved. If there's a better way to provide screenshots for this thread, I don't even care what it is. I wouldn't try it anyway.

I mean look how nice they turn out. Everything clear as a bell.
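
For anyone who would rather script that conversion than click through Paint, a minimal sketch using the third-party Pillow library (the screenshots path below is assumed from the default install noted in the first post):

```python
# Batch-convert Unigine Valley screenshots to JPEG for forum upload.
# Sketch only: requires the third-party Pillow library (pip install Pillow);
# the screenshots folder location is an assumption based on the default path.
import os
from PIL import Image

def convert_screenshots(folder, quality=90):
    """Convert every non-JPEG image in `folder` to a .jpg next to it."""
    converted = []
    for name in os.listdir(folder):
        base, ext = os.path.splitext(name)
        if ext.lower() in (".jpg", ".jpeg"):
            continue  # already upload-friendly
        src = os.path.join(folder, name)
        try:
            img = Image.open(src)
        except OSError:
            continue  # not an image Pillow understands
        dst = os.path.join(folder, base + ".jpg")
        img.convert("RGB").save(dst, "JPEG", quality=quality)
        converted.append(dst)
    return converted

if __name__ == "__main__":
    shots = os.path.join(os.path.expanduser("~"), "Valley", "screenshots")
    if os.path.isdir(shots):
        for path in convert_screenshots(shots):
            print("converted:", path)
```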


----------



## Kaapstad (May 12, 2016)

Old one not for the scoreboard.


----------



## THE_EGG (May 12, 2016)

More results! This time on my laptop; Gigabyte P34G-V2 (specs can be seen in my system specs sub-menu)




i7 4710HQ @ 2.5-3.5GHz, GTX 860M (Maxwell variant w/ 4GB). Boost @ 1232mhz, Memory @ 1328mhz.
Windows 10 Pro and 365.10 drivers.

Attached is a screenshot of my settings in afterburner and what figures GPU-Z says.


----------



## JohnnyDirect (May 12, 2016)

Verisimilitude for MrGenius: I uploaded a video of my run on YouTube. I don't know if I'm allowed to post a link or not, but search YouTube for
*Unigine Valley Evga 980 Ti SC Overclocked SLI johnny direct *


----------



## MrGenius (May 13, 2016)

FWIW I don't have a problem with it. Or I wouldn't have listed it. I'm just playing Devil's Advocate. I do however think it's a great idea that you posted that vid for all to see. You're definitely on the up and up. And then some apparently. I'm tempted to list that score for you. But dare I? Wouldn't that likely cause some kind of a fuss? I mean that's not in the rules. The rules say a full screenshot from within Valley. And lord knows if I go changing the rules at this point I'm pretty much just asking for trouble.

Anywho. I'll post it. Most they can do is take it down. Well they could do more than that I suppose. But I kinda doubt they will. Here goes nothin'!









Hell with it. I'm listing it anyways. I mean what better proof can you possibly provide? None that I can think of.


----------



## Shift1190 (May 14, 2016)

Updated score with latest driver install


----------



## JohnnyDirect (May 15, 2016)

Shift1190 said:


> Updated score with latest driver install
> View attachment 74571


 You and I have the same  memory.  Are you running the XMP profile in the BIOS/UEFI; along with enhanced performance for memory? If you bump up your CPU and memory speeds, you'll get into the 7000's. I get a nice boost with CPU/Memory increase.


----------



## BiggieShady (May 15, 2016)

MrGenius said:


> The thing is I don't know much about exactly how boost works. I've never owned a boost card.


The thing with boost is that the boost clock reported on the first tab of GPU-Z is the minimum guaranteed factory BIOS dynamic clock in the worst-case scenario (80°C temps on the GPU).
Since people change frequency tables in the GPU BIOS and boost settings in the driver at the same time, and have different thermal solutions, IMO it only makes sense to compare the real clocks recorded in the sensor tab.
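
For anyone wanting to pull that real clock out of a log rather than eyeball the sensor tab, here is a rough sketch that scans a GPU-Z sensor log (File > Log to file) for the highest recorded core clock. The exact column header varies between GPU-Z versions, so the "GPU Clock" hint below is an assumption; adjust it to match your log file.

```python
# Scan a GPU-Z sensor log for the highest real core clock recorded
# during a run. Sketch only: the "GPU Clock" header text is an
# assumption and may need adjusting for your GPU-Z version.
import csv

def max_logged_clock(log_path, column_hint="GPU Clock"):
    with open(log_path, newline="") as f:
        rows = csv.reader(f)
        header = next(rows)
        # the log pads its headers with spaces, so match loosely
        idx = next((i for i, h in enumerate(header) if column_hint in h), None)
        if idx is None:
            return None
        clocks = []
        for row in rows:
            try:
                clocks.append(float(row[idx]))
            except (ValueError, IndexError):
                continue  # skip malformed or short lines
    return max(clocks) if clocks else None
```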


MrGenius said:


> I'm just a poor fool trying to keep a thread straight. Dodging bullets as I go. To the best of my ability. Who knew this was going to be so hellish? Not I.


You are doing a good job. (If we exclude the understandable mistake of taking a case of "internet friendliness" for being attacked viciously by a stranger from the other side of the planet.)


----------



## Arctucas (May 15, 2016)

i7-950 @ 4152, eVGA GTX970SSC @1656/3975 = 2774


----------



## Ferrum Master (May 15, 2016)

3960X @ a colder 4.7GHz, otherwise as on screen.


----------



## Shift1190 (May 15, 2016)

JohnnyDirect said:


> You and I have the same  memory.  Are you running the XMP profile in the BIOS/UEFI; along with enhanced performance for memory? If you bump up your CPU and memory speeds, you'll get into the 7000's. I get a nice boost with CPU/Memory increase.



Johnny,
I am running the xmp profile in Bios but not sure what you mean about the enhanced performance for memory part. Are you talking about increasing Mem clock on the gpus?


----------



## MrGenius (May 16, 2016)

Look at his old score. Look at his new score. Look at your scores. Do the math.


----------



## TheHunter (May 16, 2016)

I've taken the lead, although that 4.8GHz failed later in the 3DMark11 physics test lol


Intel 4770K @ 4.8GHz + Zotac Omega GTX 980TI @ 1478/1925 = *4504*



----------



## JohnnyDirect (May 16, 2016)

Shift1190 said:


> Johnny,
> I am running the xmp profile in Bios but not sure what you mean about the enhanced performance for memory part. Are you talking about increasing Mem clock on the gpus?



In my bios Gigabyte Z170 Gaming 7, I am able to adjust my FCLK freqs.  It defaults to 800mhz but I bump it up to where it should be, 1000mhz.  I also have a setting  "Memory enhancement setting"  with options: Auto/Normal/enhanced stability/enhanced performance. I have that set to enhanced performance. I'm sure you have similar ability in your UEFI/BIOS; you just need to look around.   These settings along with increased CPU clocks give me a nice bump in Unigine's Valley.  You have CPU water cooling, so with increased CPU Clocks and voltage, along with spooling your GPU fans up to push your GPU Core and GPU memory a bit higher, you should be real close to my system scores.  We have essentially the same system.  I might be able to push a bit harder because my GPUs are water cooled and stay below 40c. Spool up your fans to max instead of auto for the run and you might beat me.   I'd be surprised if you don't get past a score of 7000.


----------



## Shift1190 (May 16, 2016)

JohnnyDirect said:


> In my bios Gigabyte Z170 Gaming 7, I am able to adjust my FCLK freqs.  It defaults to 800mhz but I bump it up to where it should be, 1000mhz.  I also have a setting  "Memory enhancement setting"  with options: Auto/Normal/enhanced stability/enhanced performance. I have that set to enhanced performance. I'm sure you have similar ability in your UEFI/BIOS; you just need to look around.   These settings along with increased CPU clocks give me a nice bump in Unigine's Valley.  You have CPU water cooling, so with increased CPU Clocks and voltage, along with spooling your GPU fans up to push your GPU Core and GPU memory a bit higher, you should be real close to my system scores.  We have essentially the same system.  I might be able to push a bit harder because my GPUs are water cooled and stay below 40c. Spool up your fans to max instead of auto for the run and you might beat me.   I'd be surprised if you don't get past a score of 7000.



Thanks man, I'll take your advice and see if I can get a better score. I think the wc gpus definitely give an advantage but otherwise pretty similar systems. My next build I want to dive into custom watercooling. Nice setup you have with the custom loops.


----------



## erixx (May 18, 2016)

Intel i7 5930K at 4.2GHz + Nvidia GTX 980 Ti at 1163MHz -> 3665


----------



## MrGenius (May 18, 2016)

No clocks + no FULL screenshot(as per the rules) = no score on the list

Nice score though.


----------



## erixx (May 18, 2016)

Added my clocks. But it is a full screenshot, so I don't know what you mean...

Thanks. GPU is stock, and CPU is a mess to OC, so there is room but no room for me for now haha


----------



## MrGenius (May 18, 2016)

erixx said:


> Added my clocks. But it is a full screenshot, so I don't know what you mean...


FULL as in showing the info in the upper right corner(as per the rules).


> 8.) *Must Be a Full Screenshot from within Valley with the sound tab and upper right corner info showing to be valid (See bottom of post)*


Settings tab: FPS Counter √, GPU Monitor √


----------



## jaggerwild (May 18, 2016)

I hit the Print Screen button, then open Paint, paste it in there, then name it and save it


----------



## erixx (May 19, 2016)

ok, never activated that fps counter before, lol.
Thanks for caring!


----------



## xkm1948 (May 20, 2016)

With new modified BIOS


----------



## Caring1 (May 20, 2016)

jaggerwild said:


> I hit the Print Screen button, then open Paint, paste it in there, then name it and save it


Only works for Valley, not Heaven. Must be some weird format it doesn't like.
I downloaded and use IrfanView to convert.


----------



## _MissBehave_ (May 20, 2016)

4690K@4.5Ghz MSI GTX 980ti @1581/3730 Win 10 Pro x64


----------



## Dirtymadra (May 20, 2016)




----------



## MrGenius (May 20, 2016)

fullinfusion said:


> I cant find the spot F12 saved the screen shot to


This PC > (C: ) > Users > User Name > Valley > screenshots

(C: ) = the drive Windows 10 is installed on. So it could be a different drive letter.
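
The same lookup can be done programmatically. A small stdlib-only sketch that resolves the profile folder whatever the drive letter, assuming Valley's default save location:

```python
# Locate the Valley screenshots folder regardless of which drive Windows
# is installed on; %USERPROFILE% resolves "C:\Users\User Name" for us.
# Sketch only -- assumes Valley's default save location under the profile.
import os

def valley_screenshot_dir():
    profile = os.environ.get("USERPROFILE") or os.path.expanduser("~")
    return os.path.join(profile, "Valley", "screenshots")

def newest_screenshot():
    """Return the path of the most recently saved screenshot, or None."""
    d = valley_screenshot_dir()
    if not os.path.isdir(d):
        return None
    files = [os.path.join(d, n) for n in os.listdir(d)]
    files = [p for p in files if os.path.isfile(p)]
    return max(files, key=os.path.getmtime) if files else None
```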


MrGenius said:


> *Here's a tip:* In the Valley > screenshots folder, right click on the screenshot file. Then scroll down to Open with > Paint and left click. Then in the upper left corner left click File. Scroll down to Save as > JPEG picture and left click. Take note of where the file is going to be saved in the window you've opened. Then left click Save. This will yield a very good quality screenshot file that can be uploaded and inserted into your post with no extra steps involved.



I will have to see a full screenshot from within Valley to list your score. It's a rule.


----------



## fullinfusion (May 20, 2016)

@MrGenius thank you, with your help I found it.

Here ya go 

Intel 4790K @4.8ghz / AMD 290x 1205-1600 / AMD 290 1090-1590 / Score 5104/ fullinfusion


----------



## MrGenius (May 20, 2016)

There's my proof for rule #7 too.^^^ Thank you so much for that @fullinfusion. To all that doubted me...STICK THAT IN YOUR PIPE AND SMOKE IT!!! I WAS RIGHT AFTER ALL!!! SEE...I TOLD YOU IT WOULD WORK OUT THAT WAY!!!


----------



## fullinfusion (May 20, 2016)

MrGenius said:


> There's my proof for rule #7 too.^^^ Thank you so much for that @fullinfusion. To all that doubted me...STICK THAT IN YOUR PIPE AND SMOKE IT!!! I WAS RIGHT AFTER ALL!!! SEE...I TOLD YOU IT WOULD WORK OUT THAT WAY!!!


Yeah, after you pointed out where I'd find it, there she was. Heck, I always press F12 but never knew where to look.

I had a bunch of screenshots from the past in there too lol...
But yup, simple as could be. Thanks again, I learned something new today


----------



## MrGenius (May 20, 2016)

Me too. What I learned was that sometimes x2 means 2 cards are running/enabled, *EVEN IF* info for both cards aren't shown in the upper right corner. Which a few people have argued with me about. Because sometimes it still says x2 even with one card disabled. They said it *ALWAYS* shows info for 2 cards in the upper right corner when 2 cards are running/enabled. And that x2 doesn't mean both cards are actually running/enabled. They said it needs to say x2 *AND* have the info for both cards in the upper right corner to count as Multi GPU. Obviously *NOT*. Sometimes it says x2 and *IT DOESN'T* show info for both cards in the upper right corner with both cards running/enabled. So I made the rule saying x2/x3/x4 is *ALWAYS* considered Multi GPU. Just in case something like that ever happened(I didn't know for sure it ever would until now). And I was right to make the rule. The main reason, as it turns out, for making the rule being because if you don't want it to show x2, then all you have to do is unplug the power cables from one card. But if you're running 2 cards and the info doesn't show both cards in the upper right corner there's no way to fix it so it does. And I knew that would be the case if it ever happened(well I actually never really thought about it, but I would have known if I did). What I did know was that the rule would make sense eventually. And it did/does. So I stood my ground on it, and left it in place.

I'm just so glad to finally have the proof of it. Now they can STFU. I don't have to prove anything anymore. Indeed!!!


----------



## fullinfusion (May 20, 2016)

MrGenius said:


> Me too. What I learned was that sometimes x2 means 2 cards, *EVEN IF* info for both cards aren't shown in the upper right corner. Which a few people have argued with me about. Because sometimes it still says x2 even with one card disabled. They said it *ALWAYS* shows info for 2 cards in the upper right corner when 2 cards are running. And that x2 doesn't mean both cards are running. They said it needs to say x2 *AND* have the info for both cards in the upper right corner to count as Multi GPU. Obviously *NOT*. Sometimes it says x2 and *IT DOESN'T* show info for both cards in the upper right corner. So I made the rule saying x2/x3/x4 is *ALWAYS* considered Multi GPU. Just in case something like that ever happened(I didn't know for sure it ever would until now). And I was right to make the rule. The main reason, as it turns out, for making the rule being because if you don't want it to show x2, then all you have to do is unplug the power cables from one card. But if you're running 2 cards and the info doesn't show both cards in the upper right corner there's no way to fix it so it does. And I knew that would be the case if it ever happened(well I actually never really thought about it, but I would have known if I did). What I did know was that the rule would make sense eventually. And it did/does. So I stood my ground on it, and left it in place.
> 
> I'm just so glad to finally have the proof of it. Now they can STFU. I don't have to prove anything anymore.  indeed!!!


Yeah, I hear what you're saying. I just did a run with one card disabled in the CCC, and the run showed as having 2 cards (x2), but the score clearly shows only one card was run.

When I had two identical MSI reference 7970s in CrossFire, I'd never see it in the upper right corner; it only showed the top slot card... the only thing it showed was the correct temperature, unlike the R series and the crazy 107999 temperature lol

Here is just one card active

Oh and @W1zzard image uploader isn't working right now


----------



## erixx (May 21, 2016)

intel i7 5930K at 4600Ghz + Nvidia GTX 980 Ti at 1190Ghz -> 3774
EVERYTHING ON AIR, MOFOS! lol


----------



## Melvis (May 21, 2016)

AMD Phenom II 965 @ stock + GTX 650 Ti @ 1006/1375 = 1080

Can't run it at 1080, as this monitor doesn't go that high, so just for reference


----------



## MrGenius (May 21, 2016)

erixx said:


> intel i7 5930K at 4600Ghz + Nvidia GTX 980 Ti at 1190Ghz...


That's all well and good. And I listed your score. But I have to ask about the GPU clocks. Because 1190GHz isn't possible, and 1190MHz sounds too low for that score. I see what appears to be 1418MHz in the screenshot. Which would probably be the true boost clock, or close to it. It matters to a lot of people that you provide the true boost clock for the record. I'm flexible on the matter. I'm just asking you to tell me what the clocks are. If you insist 1190MHz is correct, then it is. However, there's the memory clock too. Or lack thereof. You didn't provide it, so I went with what appears to be shown in the screenshot. It looks like 3505, which seems to be typical. So if 1190/1753 isn't correct, please let me know so I can change it.

Thanks.


----------



## JohnnyDirect (May 21, 2016)

MrGenius said:


> That's all well and good. And I listed your score. But I have to ask about the GPU clocks. Because 1190GHz isn't possible, and 1190MHz sounds too low for that score. I see what appears to be 1418MHz in the screenshot, which would probably be the true boost clock, or close to it. It matters to a lot of people that you provide the true boost clock for the record. I'm flexible on the matter. I'm just asking you to tell me what the clocks are. If you insist 1190MHz is correct, then it is. However, there's the memory clock too. Or lack thereof. You didn't provide it, so I went with what appears to be shown in the screenshot. It looks like 3505, which seems to be typical. So if 1190/1753 isn't correct, please let me know so I can change it.
> 
> Thanks.


Actually, a 5930k @ 4.6GHz will push the Valley score up nicely. This program scales well with CPU overclocking. Here is my best score: 7362, 176FPS, CPU 6700k@5003MHz, GPU running at 1493MHz. I don't get any benefit from my GPUs running above 1493MHz


----------



## MrGenius (May 21, 2016)

JohnnyDirect said:


> Actually, a 5930k @ 4.6GHz will push the Valley score up nicely. This program scales well with CPU overclocking. Here is my best score: 7362, 176FPS, CPU 6700k@5003MHz, GPU running at 1493MHz. I don't get any benefit from my GPUs running above 1493MHz


Oh I don't doubt it too much. I've noticed the same thing on my systems. I just want to be sure it's as accurate as possible. It doesn't hurt to ask.

I noticed in your last screenshot the memory clocks were accurately displayed. I'm assuming that's still the case. So I'm going to put your recent score down as 1493/1905. Feel free to correct me if I'm wrong.


----------



## erixx (May 21, 2016)

GPU-Z says 1283MHz maximum core clock during Valley.
I did not save my custom GPU settings, so I did it again; the value may be unrelated to the previous one...

Thanks a lot!


----------



## TheHunter (Jun 10, 2016)

TheHunter said:


> I've taken the lead   although that 4.8ghz failed later in 3dmark11 physics test lol
> 
> 
> Intel 4770K @ 4.8GHz + Zotac Omega GTX 980TI @ 1478/1925 = *4504*
> ...



Somewhat stabilized GPU clock now. I've read that Maxwell GPU Boost 2.0 throttles at 64°C, and yep... I set the fan manually to 75% and managed to hold a stable 1482MHz through almost the whole benchmark; only towards the end did it drop to 1470MHz, when the GPU reached 63-64°C.

Intel 4770K @ 4.7GHz + GTX 980TI @ 1482/1935 = *4553*







Edit:
Which beats the GTX 1080 FE @ 107.5fps; the max OC Strix got 116fps
http://www.overclock3d.net/reviews/gpu_displays/nvidia_gtx1080_founders_edition_review/22


Here it got only 100.5fps, which is 1fps slower than my factory OC 1418MHz boost. Not bad for this oldie, which was also 150€ less 
http://www.gamespot.com/articles/nvidia-geforce-gtx-1080-review/1100-6439863/


----------



## Vellinious (Jun 12, 2016)

i7 5820k - 2 cores at 4.875, hyperthreading disabled
980ti Classy (air cooled) - 1525 core / 2025 memory
Score:  4891


----------



## R00kie (Jun 12, 2016)

5820k 4.4GHz
1080 FE 2 GHz/2600MHz


----------



## petedread (Jun 12, 2016)

Please help, got a dumb question. My max FPS is being locked to my monitor refresh rate (it actually goes about 10-20% above). On my 1080p 144Hz monitor I get a max FPS of 165. On my 4K 60Hz monitor I get a max FPS of 70. This is only happening in Valley. I must have a V-sync setting somewhere, but I can't find it. I've looked in Windows display settings, in EVGA PrecisionX, and in Nvidia control panel, and can't find it. You guys with 980TIs are getting max FPS of 200+.

980TI @1430-1880 water cooled
5930k @4.375


----------



## JohnnyDirect (Jun 13, 2016)

petedread said:


> Please help, got a dumb question. My max FPS is being locked to my monitor refresh rate (it actually goes about 10-20% above). On my 1080p 144Hz monitor I get a max FPS of 165. On my 4K 60Hz monitor I get a max FPS of 70. This is only happening in Valley. I must have a V-sync setting somewhere, but I can't find it. I've looked in Windows display settings, in EVGA PrecisionX, and in Nvidia control panel, and can't find it.
> 980TI @1430-1880 water cooled
> 5930k @4.375


You are not running with V-Sync enabled. Your scores are expected for your clock speeds and CPU. Valley at 1080p hits a CPU bottleneck. To show you what I mean: I have two cards, but I disabled one in Device Manager and removed the SLI bridge. I also went into Nvidia control panel and made sure PhysX wasn't using my second card. Here are my single card scores with my overclocks, and then with my clocks set to match yours. My first screenshot is me running my CPU at 5GHz, GPU at 1493MHz, Vmem at 2025ish. The second screenshot, with a score of 4573, is my card set to your clocks and my CPU running at 4.4GHz. Keep in mind though, I have my motherboard/RAM timings optimized, which gives me a nice boost as well, and I'm running a 6700k CPU, so I'll still be slightly higher.


----------



## petedread (Jun 13, 2016)

@JohnnyDirect thank you for going to the trouble, it is appreciated. This is good news as now I can start tweaking my score. 

I was thinking V-sync because, before I swapped back to my 1080p monitor, my score was 2490 every time. Then when I switched to the 1080p monitor I forgot to go into monitor settings and set it to 144Hz, so it was running at 60Hz and my score was again 2490. I set the 1080p monitor to 144Hz and the score went up to 4220, and it stayed the same when I went back to the 4K 60Hz monitor. I don't know why I was stuck at 2490.
Will raising CPU cache help improve my score? It is currently 200MHz behind core because it takes a lot of voltage to get it running faster, but I could give it more voltage for benching. Also, would raising the memory multiplier help, or would I be better off tightening the timings? My system mem is a bit lame, not a great overclocker and only 3000MHz 15-16-16-39-2T (it was one of the first DDR4 batches from Kingston)


----------



## Vellinious (Jun 13, 2016)

petedread said:


> @JohnnyDirect thank you for going to the trouble, it is appreciated. This is good news as now I can start tweaking my score.
> 
> I was thinking V-sync because, before I swapped back to my 1080p monitor, my score was 2490 every time. Then when I switched to the 1080p monitor I forgot to go into monitor settings and set it to 144Hz, so it was running at 60Hz and my score was again 2490. I set the 1080p monitor to 144Hz and the score went up to 4220, and it stayed the same when I went back to the 4K 60Hz monitor. I don't know why I was stuck at 2490.
> Will raising CPU cache help improve my score? It is currently 200MHz behind core because it takes a lot of voltage to get it running faster, but I could give it more voltage for benching. Also, would raising the memory multiplier help, or would I be better off tightening the timings? My system mem is a bit lame, not a great overclocker and only 3000MHz 15-16-16-39-2T (it was one of the first DDR4 batches from Kingston)



Valley is very CPU bound, so yes...increasing your CPU core clock will help.  System memory speed won't do much of anything though.


----------



## petedread (Jun 13, 2016)

I threw a load of voltage at my CPU, knowing it was only for a couple of Valley runs, and it OC'd to 4.630. I bumped the cache up a bit too. GPU 1470, mem 2003MHz. Getting better scores now.
The problem was I had to unplug the 4K monitor and restart my machine, otherwise I would not be able to get past a score of 2490!!!! FPS barely going past 60.


----------



## petedread (Jun 13, 2016)

Are these the correct pictures to get on the score chart? GPU 980ti@1470,2003. CPU 5930k @4.621.


----------



## erixx (Jun 16, 2016)

GPU GTX980Ti Strix @1518, 3600 (or 1800)* CPU 5930@4400
* As shown in upper right.


----------



## petedread (Jun 16, 2016)

@erixx bump mem up, you have the same GPU and CPU as me but your GPU core clocks higher, you can smash that score


----------



## erixx (Jun 16, 2016)

Mem and CPU (later after work I will kindly try it. Will start to lower the airco)


----------



## Vellinious (Jun 16, 2016)

erixx said:


> GPU GTX980Ti Strix @1518, 3600 (or 1800)* CPU 5930@4400
> * As shown in upper right.
> View attachment 75480



The clocks in the upper right corner almost never show correctly.  You'd need to read the GPU-Z sensors tab to get a good reading.  Judging by that score, I'd say you have something seriously wrong, if you're anywhere NEAR those clock speeds.


----------



## jaggerwild (Jun 16, 2016)

Vellinious said:


> The clocks in the upper right corner almost never show correctly.  You'd need to read the GPU-Z sensors tab to get a good reading.  Judging by that score, I'd say you have something seriously wrong, if you're anywhere NEAR those clock speeds.



 WAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH!


----------



## Vellinious (Jun 17, 2016)

jaggerwild said:


> WAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH!



??


----------



## Caring1 (Jun 17, 2016)

Vellinious said:


> ??


I think he may be questioning your statement that something may be seriously wrong.
If you compare the readings with the screenshot directly above, which uses the same processor and series of card, the scores are not that much different; the higher overclock on the initial screeny explains the higher scores.


----------



## Vellinious (Jun 17, 2016)

Caring1 said:


> I think he may be questioning your statement that something may be seriously wrong.
> If you compare the readings with the screenshot directly above, which uses the same processor and series card, the scores are not that much different, the higher overclock on the initial screeny explains the higher scores.



800 points' difference and almost 20 fps....  He should easily be above 4k at 1500. He's probably got either a thermal throttle or a power limit throttle going on.

Even at stock Boost 2.0 clocks he should be hitting at least near 4k, if not above.  The baseline I ran on mine with stock clocks was 4100-something....  

Just telling the man that he's got something wrong, and he should check GPU-Z to see if he can get it sorted out.


----------



## WhiteNoise (Jun 19, 2016)

i5-6600K @ 4.5GHz + GTX 1080 @ 2075/5433 = 4764











*NOTE: I cannot post GPU-Z, as it is not detecting my card's OC. No matter what I do, even the slightest OC, nothing registers in GPU-Z, so I figure it needs an update to properly detect the new Nvidia cards.


----------



## rtwjunkie (Jun 19, 2016)

Hey @MrGenius Please change my entry.  Now using an MSI GTX 980Ti.  Per GPU-z it ran at 1263.  No overclock done...yet.  

Quite a jump up from my overclocked 980 score!  82% ASIC rating too. Not bad for end of the line chip.


----------



## P4-630 (Jun 21, 2016)

Hmm, I forgot how to make a screenshot within valley!  Anyone?

My new card will arrive within an hour or so, gotta test it out!


----------



## Ferrum Master (Jun 21, 2016)

P4-630 said:


> Hmm, I forgot how to make a screenshot within valley!  Anyone?
> 
> My new card will arrive within an hour or so, gotta test it out!



F12, it will stutter, you will notice it being made.


----------



## rtwjunkie (Jun 21, 2016)

P4-630 said:


> Hmm, I forgot how to make a screenshot within valley!  Anyone?
> 
> My new card will arrive within an hour or so, gotta test it out!



F12

EDIT: Damn Ninja @Ferrum Master!


----------



## P4-630 (Jun 21, 2016)

So I made a screenshot within valley, where is it?


----------



## rtwjunkie (Jun 21, 2016)

P4-630 said:


> So I made a screenshot within valley, where is it?



It will be in the Valley Folder under C drive Users.

Let me know if you don't find it.


----------



## P4-630 (Jun 21, 2016)

i5 6500 stock 3.2GHz/3.6GHz turbo - MSI GTX1070 Gaming X Core@1987MHz Memory@2003MHz


----------



## Vellinious (Jun 21, 2016)

P4-630 said:


> Hmm, I forgot how to make a screenshot within valley!  Anyone?
> 
> My new card will arrive within an hour or so, gotta test it out!



F12


----------



## P4-630 (Jun 21, 2016)

Vellinious said:


> F12



Yeah thanks, already got it benchmarked: http://www.techpowerup.com/forums/t...nchmark-1-0-scores.222154/page-9#post-3476394


----------



## Vellinious (Jun 21, 2016)

P4-630 said:


> Yeah thanks, already got it benchmarked: http://www.techpowerup.com/forums/t...nchmark-1-0-scores.222154/page-9#post-3476394



Yeah, I just noticed.  Need to add some CPU clock...that score could be quite a bit higher.


----------



## P4-630 (Jun 21, 2016)

Vellinious said:


> Yeah, I just noticed.  Need to add some CPU clock...that score could be quite a bit higher.



I could, since the CPU runs really cool and I'm still using an older BIOS so I can OC non-k CPU's, but I'll just leave it as is, I prefer not to fiddle with CPU speed in BIOS.


----------



## petedread (Jun 22, 2016)

Is my score on page 8 going to be added?


----------



## RealNeil (Jun 23, 2016)

i7-4770K with two new Sapphire Toxic R9-390X in Crossfire.


----------



## Caring1 (Jun 23, 2016)

petedread said:


> Is my score on page 8 going to be added?


@MrGenius appears to be M.I.A. for the last month.


----------



## WhiteNoise (Jun 23, 2016)

Well, I gave my CPU a little more of an overclock: 6600K@4.7GHz / GTX 1080 @ 2100/5433.
This brought my final score up a bit. I think if the OP ever updates this thread, I may be in the top 5 for single-card users.


----------



## rtwjunkie (Jun 23, 2016)

Caring1 said:


> @MrGenius appears to be M.I.A. for the last month.


Yeah.  I was hoping to get my new score up to replace the old one.


----------



## P4-630 (Jun 23, 2016)

rtwjunkie said:


> Yeah.  I was hoping to get my new score up to replace the old one.



I see you got a score of 3812, did you OC your CPU and GPU? Edit: at least your CPU I see.
I got a score of 3758 hardware on stock clocks.


----------



## rtwjunkie (Jun 23, 2016)

P4-630 said:


> I see you got a score of 3812, did you OC your CPU and GPU? Edit: at least your CPU I see.
> I got a score of 3758 hardware on stock clocks.



980Ti was stock clock. CPU is at 4.1, which probably is the difference maker.  All things being equal, your score should probably edge mine out.


----------



## EarthDog (Jun 23, 2016)

I purposely stayed the hell out of this thread (after trying to help)... but I hope for those participating he comes back soon. If not, I will consider starting and running something else... 

I wonder if he got banned... this site doesn't seem to show who is banned and who isn't...


----------



## jaggerwild (Jun 23, 2016)

Think he's staying away because of personal stuff; I thought I saw something about his brother passing in one of the benchmark threads?


----------



## rtwjunkie (Jun 23, 2016)

jaggerwild said:


> Think he's staying away because of personal stuff; I thought I saw something about his brother passing in one of the benchmark threads?



Oh no, that's sad. Thank you for letting us know.


----------



## RealNeil (Jun 24, 2016)

i7-4790K and two GTX-980Ti GPUs in SLI.  I didn't take the time to clock the two card the same.


----------



## Caring1 (Jun 24, 2016)

jaggerwild said:


> Think he's staying away because of personal stuff; I thought I saw something about his brother passing in one of the benchmark threads?


Yep, that was not long before his last post here. I guess he has personal stuff to take care of.


----------



## HammerON (Jun 24, 2016)

MrGenius will be back with us (if he chooses) after August 29th....


----------



## the54thvoid (Jun 24, 2016)

HammerON said:


> MrGenius will be back with us (if he chooses) after August 29th....



In which case the thread needs to shut down until then, or someone else should administer it until he returns. No point in all these scores with no tables. @MrGenius would not want that to happen.


----------



## Jetster (Jun 24, 2016)

the54thvoid said:


> @MrGenius would not want that to happen.



Then he should delegate someone

But he can catch up later. In the meantime, just say what he would have when someone posts scores that are not in the correct format, just like he did.


----------



## Caring1 (Jun 24, 2016)

I'm sure he won't mind updating the charts in one sitting, instead of dribs and drabs.


----------



## EarthDog (Jun 24, 2016)

the54thvoid said:


> In which case the thread needs to shut down until then or let someone else administer it until he returns?  No point all these scores with no tables.  @MrGenius would not want that to happen.


Who knows what he wants. But 3 months is a long time to wait for an update, especially after (what I am presuming is) _another_ banning.

I'll leave this alone for him and start up something different (a different benchmark) in the meantime. If he comes back, he can play catch-up if he wants. If he doesn't, well, he doesn't.

I'd close it down until then if I was staff...

EDIT: Well, I would start another, but it appears our friend Genius has a monopoly on the major benchmarks. Oy. That puts 3 of these in a major holding pattern (one doesn't have any participation anyway though).


----------



## P4-630 (Jun 24, 2016)

CPU i5 6500 stock MSI GTX1070 Gaming X OC mode Core 2012MHz Memory 2025MHz


----------



## EarthDog (Jun 24, 2016)

Or just keep  pumping in scores while the guy is gone.. LOL


----------



## P4-630 (Jun 24, 2016)

EarthDog said:


> Or just keep  pumping in scores while the guy is gone.. LOL




Can you start a new thread with valley scores then?


----------



## EarthDog (Jun 24, 2016)

Nope. Not without some closure on this one first. The last thing I want to do is upset the tumultuous MrGenius and witness another rant that may get the guy perma-banned. The staff should make a decision on this if you ask me. So while I am thankful for the staff update, waiting 3 months to see IF the guy comes back to handle his threads really isn't a resolution.


----------



## D007 (Jun 25, 2016)

Tessellation setting "extreme" then?
Vsync off?
When I set quality to "ultra" it didn't change the tessellation settings.. I assume we are all using extreme tessellation?


----------



## D007 (Jun 25, 2016)

GTX 1080 FTW:
1960/1860
i7 4770k @ 4.7
=4850


----------



## Vellinious (Jun 25, 2016)

D007 said:


> GTX 1080 FTW:
> 2113/5665
> i7 4770k @ 4.7




Nice score, but.....the clocks that show up in Valley are almost never correct.  You should read them from GPUz sensors tab.


----------



## D007 (Jun 25, 2016)

Vellinious said:


> Nice score, but.....the clocks that show up in Valley are almost never correct.  You should read them from GPUz sensors tab.


Ahh, I was wondering about that.. Ty.
Memory doubled? Doesn't seem right..lol
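The "doubled" memory reading is normal for GDDR: monitoring tools report either the real memory clock or a multiplied "effective" data rate, which is why the same card shows different memory numbers in Valley, GPU-Z, and the marketing specs. A minimal sketch of the usual conversions (the example base clock is illustrative, roughly a stock 980 Ti):

```python
def effective_rate(real_clock_mhz, multiplier):
    """GDDR5 is typically quoted at 2x (double data rate) or 4x (effective)."""
    return real_clock_mhz * multiplier

base = 1753  # example: approx. stock GTX 980 Ti memory clock in MHz
print(effective_rate(base, 2))  # 3506 - the "DDR" figure some tools show
print(effective_rate(base, 4))  # 7012 - the "effective" spec-sheet figure
```

So a tool showing ~2000MHz and another showing ~4000MHz on the same card can both be "right"; they just differ on which multiple they report.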


----------



## george vasiliadis (Jun 28, 2016)

Intel Pentium G3258 @4.2, GeForce GTX Titan Black @ 1160/1900


----------



## MrGenius (Jan 21, 2017)

Well...I'm *finally* back. Did you miss me? 

Nah...didn't think so. Seriously though, it's time to get this thread rolling again. Or not (up to you). Valley's been updated since I left, and there's another thread for those scores. Wait...WTF is Valley 1.4? There's no updated Valley beyond 1.0...that I know of. Regardless, I'll update new scores to the list here...*if* you choose to submit them. I just got this thread and the Heaven thread reopened, so please give me some time to get things caught up with both. I've been out of the game for a while and I'm still getting warmed up, so to speak. Your patience is greatly appreciated.



EarthDog said:


> EDIT: Well, I would start another, but it appears our friend Genius has a monopoly on the major benchmarks. Oy. That puts 3 of these in a major holding pattern (one doesn't have any participation anyway though).



Right you are! Nobody ever replied to that 3DM(2013) scores thread anyway. So who cares? About that one I mean. The other 2 are going to pick up right where we left off though. At least...until I get banned again. 

BTW I thought we(you & I) weren't "_friends_" anymore.


----------



## P4-630 (Jan 21, 2017)

Welcome back @MrGenius 

i5 6500 @ 3.2GHz - GTX1070 2050 / 2415


----------



## Globespy (Jan 23, 2017)

Not sure how you guys are getting some of these scores... I'm running an i7 6700K @ 4.4GHz and an EVGA GTX 1080 FTW at 2088MHz, with 32GB Corsair Vengeance LPX @3200MHz, and I'm nowhere near these scores. Any ideas why?
Running the latest 376.33 driver, but it's no better with older ones such as 368.69/81 etc.


----------



## MrGenius (Jan 24, 2017)

Experiencing technical difficulties. Need to request unlimited post editing(again) to update scores. Please stand by. Thank you for your patience. Shouldn't be much longer to get this all fixed up. Sorry so slow.


----------



## purecain (Jan 27, 2017)

It looks like Valley 1.1 is getting ready for release. I thought it was out, but nope, not yet.


----------



## GeneO (Jan 29, 2017)

Got to do some benching today. 4790k @ 4.7 GHz, MSI GTX 1070 OC to 2100 MHz on curve, memory 9116 MHz effective


----------



## MrGenius (Feb 4, 2017)

purecain said:


> It looks like Valley 1.1 is getting ready for release. I thought it was out, but nope, not yet.


That's cool. You got dibs on that thread.

But I'm wondering where you got that info. All I can find on it is a Twitter post from Nov 2013 mentioning it. Which was quite a while ago.

I'm just getting started updating the scores to the list here. Finished updating the Heaven thread today. Won't be too long before I get this one all caught up too. 

EDIT: All done! Where are all the GTX 1080s (that are willing to follow the rules for score submissions anyway)? They showed up in force over at the Heaven thread. Damn near took every top 10 single GPU score (except one). Put a royal beat down on the 980 Tis and Titan Xs. They should be able to do it here too, I'd imagine.


----------



## The Pack (Feb 12, 2017)

The Pack Score: 6188 / i7 6850K@4.4GHz/ 2x Asus Strix GTX 1070 O8G @ 2126MHz/ 4300MHz


----------



## The Pack (Feb 12, 2017)

With single GPU @2176MHz / 4900MHz


----------



## ramenfan (Feb 15, 2017)

Single GPU at 2100/4.1


----------



## ramenfan (Feb 16, 2017)

Single GPU: i5 6600k @4.1, FE 1080 @2100


----------



## AndrewWyb (Feb 18, 2017)

Think I've got everyone beat in the single GPU category (unless I missed someone's Pascal Titan post  )

Kaby Lake i7 7700K w/ 1.328V @ 5.0 Ghz / ZOTAC GTX 1080 ArcticStorm @ 2138 Mhz GPU clock/ 5412 Mhz Mem Clock




I could probably push it a little higher if I wanted, but I was getting artifacts when I got close to 2200MHz. My temps never go above 35°C; it's really a beautiful thing


----------



## MrGenius (Feb 18, 2017)

AndrewWyb said:


> Think I got everyone beat with the single GPU category (unless I missed someone's Pascal Titan post  )


Yes, indeed you do. But why stop there? Head on over to the Heaven scores thread and dominate that single GPU list too.


----------



## TkBaha (Feb 22, 2017)

GTX 1080 FE 1999/1359 newest driver test 
i7 4790k 4.6Ghz 1.206v
2400MHZ DDR3 16 GB


----------



## P4-630 (Feb 22, 2017)

Thanks for updating the scores @MrGenius ! 

Hmm, not bad, I have the best score with an i5 (@stock) so far!


----------



## EarthDog (Feb 22, 2017)

On a gpu heavy test... well done!


----------



## P4-630 (Feb 22, 2017)

EarthDog said:


> On a gpu heavy test... well done!



yeah lol!


----------



## The Pack (Feb 22, 2017)

Here is another update of my Strix

The Pack; i7 6850K@4.5Ghz; Asus Strix GTX 1070 O8G@2176/2463 = 4698


----------



## P4-630 (Feb 22, 2017)

The Pack said:


> Here is another update of my Strix
> 
> The Pack; i7 6850K@4.5Ghz; Asus Strix GTX 1070 O8G@2176/2463 = 4698View attachment 84377



SLI lol!


----------



## The Pack (Feb 22, 2017)

No!!
Look at my card


----------



## The Pack (Feb 22, 2017)

My SLI looks like this

The Pack; i7 6850K @4.5GHz, 2x Asus Strix 1070 O8G 2151/2225 = 6476


----------



## MrGenius (Feb 22, 2017)

The Pack said:


> No!!
> Look at my card


Yes!! Look at rule #7.


MrGenius said:


> *7.) Screenshots showing the number of GPUs as x2/x3/x4 are considered Multi GPU (whether they are or aren't)*


The funny thing is you did follow the rule the first time you posted your single GPU score. I guess you just forgot. 

Nice score though.


----------



## The Pack (Feb 23, 2017)

Uff, yes, you are right. It's my fault. I'm sorry; I'll look another time for a new update of my score with the single GPU.


----------



## kaizoku11 (Feb 23, 2017)

i7-4790k @ 4.7ghz + GTX 1080 @2063/1400= 4956


----------



## MrGenius (Feb 23, 2017)

kaizoku11 said:


> GTX 1080 @*1833/2063*


Are those numbers correct? 2063 for memory speed seems improbable, if not impossible. 

Are they backwards maybe? 2063 core/1833 memory? 1833 for memory speed is even questionable. 

The screenshot suggests somewhere around 1400 memory speed.


----------



## kaizoku11 (Feb 23, 2017)

MrGenius said:


> Are those numbers correct? 2063 for memory speed seems improbable, if not impossible.
> 
> Are they backwards maybe? 2063 core/1833 memory? 1833 for memory speed is even questionable.
> 
> The screenshot suggests somewhere around 1400 memory speed.



My apologies. It seems I got confused. I had it the right way the first time, then edited it. Would you like me to re-edit the post? Also, I assume the memory speed you want me to list is the one from GPU-Z, correct? Clock/memory?


----------



## MrGenius (Feb 23, 2017)

kaizoku11 said:


> My apologies. It seems I got confused. I had it the right way the first time then edited it. Would you like me to re edit the post? Also I assume the memory speed you want me to list is the one on GPUz correct? Clock/memory?


I just want to make sure what I put on the list is as correct as possible. Some people give me weird numbers. You aren't the first. It just looks different than what's shown in the screenshot. So I wanted to ask to be sure. The most accurate way to see the true clocks is to monitor the clocks with GPU-Z during the benchmark. But I'll list whatever you want me to list. The memory speed of 2063 just seems a little higher than I've seen so far. And I'm wondering if that's correct. If you say it is then it is. If you say it isn't, then all I want to know is what you think it really is. I'm not going to question it any further than that.

Anyway, just to be clear. If you wouldn't mind stating what you think the correct clocks are one more time, in the GPU clock/Memory format, that would be great. Thanks!


----------



## kaizoku11 (Feb 23, 2017)

MrGenius said:


> I just want to make sure what I put on the list is as correct as possible. Some people give me weird numbers. You aren't the first. It just looks different than what's shown in the screenshot. So I wanted to ask to be sure. The most accurate way to see the true clocks is to monitor the clocks with GPU-Z during the benchmark. But I'll list whatever you want me to list. The memory speed of 2063 just seems a little higher than I've seen so far. And I'm wondering if that's correct. If you say it is then it is. If you say it isn't, then all I want to know is what you think it really is. I'm not going to question it any further than that.
> 
> Anyway, just to be clear. If you wouldn't mind stating what you think the correct clocks are one more time, in the GPU clock/Memory format, that would be great. Thanks!


I edited the post. It's now 2063/1400 : D


----------



## R00kie (Feb 23, 2017)

I don't think I've ever posted my GTX 1080 score, so here it is:

i7 5820K @ 4.5GHz + GTX 1080 @ 2088/1375 = 4658






The score seem to have gone down since the last time I've run it on an older driver.


----------



## Lui Leyland Robert (Feb 24, 2017)

Here is my Inno3D GeForce GTX 1060 6GB's result (1750MHz Core [1928MHz Boost] / 2200MHz Mem) with an Intel Core i5 6600K @ 4.6GHz


----------



## Lui Leyland Robert (Feb 24, 2017)

Also featuring my old graphics card, a Sapphire Toxic Radeon R9 270X (PCI-Express x4 from Z170 PCH mode) (1180MHz Core / 1500MHz Mem), with an Intel Core i5 6600K @ 4.6GHz


----------



## The Pack (Feb 24, 2017)

A little bit more points with clean Windows 10 

The Pack; i7 6850K@4.5GHz; 2x 1070 2151/2225 = 6536


----------



## stefanels (Mar 24, 2017)

i7 6700K @ stock / 16Gb 3000Mhz CL15 / R9 Fury OC 1020/500


----------



## moe1903 (Mar 24, 2017)

i5-6600k @4.6 GHz, GTX1080 @ 2138/5738 MHz, 16GB DDR4-3066 MHz, Asus Z270TUF Mark 1


----------



## jboydgolfer (Mar 24, 2017)

stefanels said:


> i7 6700K @ stock / 16Gb 3000Mhz CL15 / R9 Fury OC 1020/500



2560x1440 isn't 4K, it's *2K*. FWIW.
That setup wouldn't do THAT well at 4K. Good, but not that good. More likely in the low 1000s running 4K would be my guess.

I'm impressed how relevant that Fury still is though, some serious HP


----------



## EarthDog (Mar 24, 2017)

It's not 2K... actually. 

2K is 2048x1080, if you want to be technical about it. 2560x1440 is 1440p. 

... And no, I don't care what Newegg calls 2K... lol

https://en.m.wikipedia.org/wiki/Display_resolution#
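For what it's worth, the pixel counts behind the naming argument are easy to compare directly. A quick sketch (the labels reflect common usage, as discussed above, not a single standard):

```python
# Pixel counts for the resolutions in the 2K-vs-1440p argument.
resolutions = {
    "DCI 2K": (2048, 1080),     # the cinema "2K" container
    "QHD/1440p": (2560, 1440),  # what retailers often mislabel "2K"
    "4K UHD": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")
```

1440p pushes roughly 1.7x the pixels of DCI 2K, which is part of why lumping them together under "2K" is misleading for benchmark comparisons.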


----------



## stefanels (Mar 24, 2017)

You are right...  Tomorrow I will do a 4K test...  Just now I saw that the test was done only in 1440p...

The 4K test


----------



## jboydgolfer (Mar 24, 2017)

stefanels said:


> You are right...  Tomorrow I will do a 4K test...  Just now I saw that the test was done only in 1440p...



It's impressive that the fury can still score so high


----------



## Spektre (Mar 28, 2017)

i5 7600k @ 5.1GHz + GTX 1070 1633/2002 (I think I'm doing this right.)


----------



## MrGenius (Mar 28, 2017)

Spektre said:


> i5 7600k @ 5.1GHz + GTX 1070 1633/2002 (I think I'm doing this right.)


I question the GPU core boost you've stated. But if it's not the 2037 shown in the screenshot and you think it's 1633, then 1633 it is. It just sounds kinda low to me...for a GTX 1070.


----------



## Spektre (Mar 28, 2017)

MrGenius said:


> I question the GPU core boost you've stated. But if it's not the 2037 shown in the screenshot and you think it's 1633, then 1633 it is. It just sounds kinda low to me...for a GTX 1070.


1633 is the default clock, going by what GPU-Z tells me. I think it's supposed to overclock, but I'm not entirely sure how that works out. (Maybe it was higher during the benchmark?) My CPU speed kinda changes constantly too... so I should say, it OCs up to 5.1GHz


----------



## MrGenius (Mar 28, 2017)

Yeah...it should be higher. So you can either trust what Valley says it is during the benchmark, or run GPU-Z in the background and see what it says. It'll record whatever the max boost clock is, so you can check it afterwards. Just click the GPU-Z Sensors tab right after Valley's done running, then click the down-pointing v to the right of GPU Core Clock and check Show Highest Reading. It'll show you the max boost speed achieved since GPU-Z was opened. So don't open it until right before you run the benchmark, and be sure to check it before you run any other benchmarks, games, etc.
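On NVIDIA cards the same "highest reading" idea can also be approximated from the command line by polling `nvidia-smi` while the benchmark runs. A rough sketch, assuming `nvidia-smi` is on the PATH; the polling duration and interval are arbitrary choices of mine, not anything Valley or GPU-Z requires:

```python
import subprocess
import time

def parse_clocks(csv_output):
    """Parse nvidia-smi csv,noheader,nounits output into a list of MHz ints."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def max_core_clock(duration_s=10.0, interval_s=0.5):
    """Poll nvidia-smi while a benchmark runs; return the highest graphics
    clock seen, mirroring GPU-Z's 'Show Highest Reading'."""
    highest = 0
    deadline = time.time() + duration_s
    while time.time() < deadline:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=clocks.gr",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        highest = max([highest] + parse_clocks(out))
        time.sleep(interval_s)
    return highest
```

Start it just before launching Valley and read the result right after the run, for the same reason MrGenius gives for GPU-Z: the maximum is only meaningful if nothing else ran in between.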


----------



## Spektre (Mar 28, 2017)

MrGenius said:


> Yeah...it should be higher. So you can either trust what Valley says it is during the benchmark, or run GPU-Z in the background and see what it says. It'll record whatever the max boost clock is, so you can check it afterwards. Just click the GPU-Z Sensors tab right after Valley's done running, then click the down-pointing v to the right of GPU Core Clock and check Show Highest Reading. It'll show you the max boost speed achieved since GPU-Z was opened. So don't open it until right before you run the benchmark, and be sure to check it before you run any other benchmarks, games, etc.


Thanks, I'll give that a try. Still a bit of a noob.

EDIT: Is this more what you're looking for? Highest says 1999.5MHz for core clock and 2003.4 for memory clock


----------



## MrEWhite (Apr 2, 2017)

i7-6800K @ 4.0GHz + GTX 1080 @ 2126/5508


----------



## Scorpius (Apr 3, 2017)




----------



## Vayra86 (Apr 8, 2017)

MrEWhite said:


> i7-6800K @ 4.0GHz + GTX 1080 @ 2126/5508



Impressive, the GTX 1080 closing in on the 1080 Ti below it.


----------



## P4-630 (Apr 8, 2017)

MrGenius said:


> It just sounds kinda low to me...for a GTX 1070.



At stock speed maybe...

My score with GTX1070 OC'd was just over 4000.


----------



## purecain (Apr 8, 2017)

http://img.techpowerup.org/170408/titanxpvalley.png




proper score @ 24/7 clocks




purecain
4770K @ 4400MHz
Titan Xp (Pascal) @ 1910MHz, stock cooler
lol, there's no wonder I've got a bunch of scores that sit higher. 4xAA, doh!!!!


----------



## the54thvoid (Apr 9, 2017)

I can never figure out benchmarks like this.  My 1080 Ti at 2050MHz core and 5996MHz (11992 effective) memory with a Ryzen 1700X @ 3.9GHz only gets 117fps and a score of 4895.

Tinkering is the art - not the hardware.  I don't tinker.

EDIT: I'd speculate that Valley does not know Mr Ryzen very well.


----------



## EarthDog (Apr 9, 2017)

It's not a core whore, this benchmark...it needs clock speed...which Ryzen, well, blows at, since it can't really break 4.1GHz at max. 

You also didn't mention the difference between the two setups, lol...


----------



## the54thvoid (Apr 9, 2017)

EarthDog said:


> It's not a core whore, this benchmark...needs clockspeed..which ryzen, well, blows at since it can't really break 4.1ghz at max.
> 
> You also didn't mention the difference between the two setups, lol...



What differences? I've seen this before when running the 3930K, where my CPU and gfx clocks were higher than another bench's, yet they score higher....

I'm just telling the truth: benchmarks aren't just about hardware. It's about tinkering with other processes to eliminate any and all possible limiting factors.


----------



## The Pack (Apr 10, 2017)

The Pack; i7 6850K @ 4.4GHz, GTX 1070 O8G @ 2189/4920 (effective 9840MHz) = 4684


----------



## erixx (Apr 10, 2017)

Yesterday I did a Valley run, but good old "Image Resizer" is broken in the Creators Update, so I had no fun... my screenies are always over the maximum size, and editing in Paint makes them blurry and unfit to prove epeen here.


----------



## Ferrum Master (Apr 10, 2017)

erixx said:


> Yesterday I did a Valley run, but good old "Image Resizer" is broken in the Creators Update, so I had no fun... my screenies are always over the maximum size, and editing in Paint makes them blurry and unfit to prove epeen here.



IrfanView is one of those rare things I've used since the dino age.


----------



## Brenden (Apr 10, 2017)

I don't know how to post it. I'm a PC newb.


----------



## Chlenix (Apr 10, 2017)

Gigabyte GTX 1080 Ti
I did not OC this card; it was running @ 1900MHz with an i7 7700K @ stock


----------



## the54thvoid (Apr 10, 2017)

People need to post the clock speeds for this thread to mean anything....


----------



## erixx (Apr 10, 2017)

Irfan... yeah. Remember it from the dark ages. But I preferred the right-click-and-forget of Image Resizer.


----------



## MrGenius (Apr 11, 2017)

O.k. let's do this one more time (it's been done pretty much twice in this thread already). Now pay attention. I'm going to go through it step by step again, in thorough detail this time. This is for Windows 10, but most of it will apply to earlier OSs. You just have to adjust the method accordingly.

1. Press F12 at the end of the benchmark, when your score is shown, to take a screenshot of it
2. Then open File Explorer > This PC > C: (or whichever drive your OS is installed on) > Users > User Name > Valley > screenshots
3. Find the screenshot with your high score in the screenshots folder (preview it by left clicking on it to be sure you've got the correct one, if there are many to choose from)
4. Point to and right click on the screenshot with your high score > point to Open with > then point to and left click on Paint
5. In Paint point to and left click on File > then point to Save as > then point to and left click on JPEG picture > then, after taking note of where the JPEG picture is going to be saved, point to and left click on Save
6. You can now post the screenshot in this thread by using Upload a File at the bottom right of the page
7. To do so, point to and left click on Upload a File > then point to and left click on Browse > then point to and select (by left clicking on) the screenshot from wherever it was saved during step #5
8. Now that you've uploaded the screenshot to TPU, insert it into your post by pointing to and left clicking on Thumbnail or Full Image (under the file name and Insert: )

That's how easy it is to post a high quality FULL screenshot for this thread, people 
See.
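For anyone who wants to skip the clicking, the locate-and-convert part of the routine above (finding the newest F12 capture and re-saving it as a JPEG for upload) can be sketched in Python. The helper names here are mine, and the JPEG step assumes the third-party Pillow package, since Paint is only doing a format conversion anyway:

```python
from pathlib import Path
from typing import Optional

def valley_screenshot_dir(user_profile: str) -> Path:
    # Valley saves F12 captures under <user profile>\Valley\screenshots
    return Path(user_profile) / "Valley" / "screenshots"

def newest_screenshot(folder: Path) -> Optional[Path]:
    # The most recently written file is normally the run you just took
    shots = [p for p in folder.iterdir() if p.is_file()]
    return max(shots, key=lambda p: p.stat().st_mtime, default=None)

def save_as_jpeg(shot: Path) -> Path:
    # Re-save as JPEG for upload (requires: pip install Pillow)
    from PIL import Image
    target = shot.with_suffix(".jpg")
    Image.open(shot).convert("RGB").save(target, "JPEG")
    return target
```

Point `valley_screenshot_dir` at your profile folder (e.g. `C:\Users\YourName`), convert whatever `newest_screenshot` returns, and upload the resulting .jpg as in steps 6-8.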


----------



## Good Guru (Apr 11, 2017)

The ol' 4770K at 4400MHz and the GTX 970s at 1450 on the GPUs


----------



## dleester82 (Apr 29, 2017)

i7 4790K @ 4.7GHz, MSI GTX 1080 Ti Gaming X


----------



## MrGenius (Apr 29, 2017)

dleester82 said:


> i7 4790K @ 4.7GHz, MSI GTX 1080 Ti Gaming X (View attachment 87192)


DAMN SON!


----------



## MaxxBot (Apr 29, 2017)

i5 6600k @ 4.6GHz + EVGA 1080 Ti SC Black @ 2050/5899MHz


----------



## rtwjunkie (Apr 29, 2017)

erixx said:


> yesterday I did a Valley but good old "Image Resizer" is broken in Creators Update, so I had no fun... my screenies are always over the maximum size and editing in Paint makes them blurry and unfit to proof epeen here


Try paint.net? It's all I've used for almost a decade. Constantly improved, free, and vastly better than anything MS can do.


----------



## MrGenius (Apr 30, 2017)

MaxxBot said:


> i5 6600k @ 4.6GHz + EVGA 1080 Ti @ 2050/5899MHz


Nice score!


----------



## dleester82 (Apr 30, 2017)

Reran with a little more tweaking and got a higher score, yeah


----------



## stefanels (May 1, 2017)

Here is mine, fullscreen 
6700K @ stock 4.0GHz
R9 Fury @ stock 1050/500 MHz


----------



## stefanels (May 4, 2017)

New card new score 

6700K @ stock 4.00Ghz
EVGA 980 Ti Classified @ stock boost 1392/1753MHz (Valley shows 1582MHz on the GPU???)


----------



## adonaras (Jun 18, 2017)

i5-7600K @ 5.0GHz + GTX1070 @ 2062/8900 = 4168


----------



## RealNeil (Jun 18, 2017)

Ran this on an older screen, hence the weird resolution.


----------



## niceoslov (Jul 13, 2017)

i7 5820K @ 4.5GHz DDR4 16GB @ 3200MHz

MSI GTX 1080 Ti @ 2050/1500 Core Clock/Mem Clock = 6145


----------



## MrGenius (Jul 13, 2017)

niceoslov said:


> i7 5820K @ 4.5GHz DDR4 16GB @ 3200MHz
> 
> MSI GTX 1080 Ti @ 2050/1500 Core Clock -MEM Clock = 6145


Great score!

But I can't list it.

You failed to follow rule #2


			
				MrGenius said:
			
		

> 2.) Sound *ON* (sound disabled in benchmark *is not* allowed)



Sorry.


----------



## niceoslov (Jul 13, 2017)

MrGenius said:


> Great score!
> 
> But I can't list it.
> 
> ...



Sorry about that, hope this one is valid.


i7 5820K @ 4.5GHz DDR4 16GB @ 3200MHz

MSI GTX 1080 Ti FE @ 2050/1500 - Core Clock/MEM Clock = 6169


----------



## RealNeil (Jul 13, 2017)

Same weird screen as my post #305, hence the weird resolution.
I replaced the Gigabyte 8GB RX 480 G1-Gaming pair with another pair of GPUs.
The new cards are also Gigabyte G1-Gaming 8GB cards, but these are GTX 1070s, with a much better score.
They also run cooler than the RX 480s do.

https://www.techpowerup.com/forums/attachments/1700x-score-jpg.89192/


----------



## vjeks (Jul 18, 2017)

I got a Ryzen 1600 and an X370 motherboard as temporary placeholders until Threadripper comes out 

Here's the score on my 1080p monitor:


----------



## MrGenius (Jul 18, 2017)

vjeks said:


> I got Ryzen 1600 and X370 motherboard as temporary placeholders until Threadripper comes out
> 
> Here's score on my 1080p monitor:


Nice score! I'm going to put that on the list with the clocks shown in the screenshot. R5-1600 @ 3.7GHz + GTX 1080 Ti @ 2025/1251. If those aren't correct you need to let me know what the actual clocks are.


----------



## vjeks (Jul 18, 2017)

MrGenius said:


> Nice score! I'm going to put that on the list with the clocks shown in the screenshot. R5-1600 @ 3.7GHz + GTX 1080 Ti @ 2025/1251. If those aren't correct you need to let me know what the actual clocks are.




Hey, thanks!

I got a tiny bit higher score during the second run. I didn't change anything on my system compared to the first benchmark.

The graphics card is a Gigabyte 1080 Ti Aorus, with OC mode selected within the Aorus software. I haven't OC'd it manually.
I've now provided my GPU-Z as well, so you can see the GPU/mem clocks. It seems the mem was clocked higher during the 2nd run for some reason.

The R5-1600 is @ 3.7GHz, that is correct. Overclocked using the stock Wraith cooler @ 1.25V.
Temperatures don't exceed 72°C under stress testing, and it's around 45-ish°C when idle.

Unfortunately I still can't reach 3200MHz on my DDR4, and Ryzen loves faster memory. BIOS updates should sort it out.
Before the latest update, I was unable to POST with RAM above 2133MHz. After the update, it's working @ 2800MHz.

I believe I might get quite near or slightly above 5K in the Valley bench if I manage to run the RAM at 3200. The Valley score difference between 2133 and 2800MHz DDR4 was about 440 points.


----------



## Tomgang (Jul 19, 2017)

What, how is that possible? I mean, my old system is still on the top 10 list for multi GPU after all this time?

Anyway, time to join the single GPU list as well.

Well, new old system + brand new GPU means a new score.

i7 980X @ 4.67 GHz + EVGA GTX 1080 Ti @ 2050/1526 = 5623


----------



## vjeks (Jul 19, 2017)

Very nice @ post above!

I wanted to add a new score with a measly 100MHz OC on the R5-1600. That's the most I would go at the moment with the stock air cooler, to keep thermals within reasonable margins. Nothing else was changed.

I play games at 4K, so at the moment I'm not too bothered with the score, as 1080p is CPU-bound territory. Can't wait to get the Threadripper, as I need a beastly workstation CPU.


----------



## Tomgang (Jul 19, 2017)

vjeks said:


> Very nice @ post above!
> 
> I wanted to add a new score with measly 100MHz OC on R5-1600. That's the most I would go at the moment when using stock air cooler, considering thermals within reasonable margins. Nothing else was changed.
> 
> I play games at 4K, so I'm not at the moment bothered too much with the score as 1080p is CPU bound territory. Can't wait to get the Threadripper, as I need a beastly workstation CPU.



Thanks.

You could try cranking up the clocks on the GPU and VRAM and see if that maybe helps up the score a bit. But CPU clocks also have a lot to say in this benchmark.


----------



## hapkiman (Jul 19, 2017)

Tomgang said:


> What, how is that possible. I mean my old system is still on the top 10 list for multi GPU after all this time?
> 
> Any way time to join the single GPU as well.
> 
> ...



That i7 980X is still a beast even after all these years. And when you OC it and then pair it with that 1080 Ti, you're gonna be getting really great scores.  Especially at 1080p.


----------



## Tomgang (Jul 19, 2017)

hapkiman said:


> That i7 980X is still a beast even after all these years. And when you OC it and then pair it with that 1080 Ti, you're gonna be getting really great scores.  Especially at 1080p.



Yep, good old X58 is still no slug. To be honest, I expected more of a bottleneck than I got here, because while Heaven is not so CPU-needy, Valley is, and I was a bit scared of how much PCIe 2.0 would hold this card back. Turns out not much, if any. All benchmarks show an improvement so far from GTX 970 SLI to the GTX 1080 Ti. Games also benefited from it, especially well-optimized multithreaded games. I have a bunch of benchmarks proving it, but I won't post them here, since that would be off topic.

The most fun thing is that while my old i7 920 held my GTX 970 SLI back a bit, with the i7 980X it was the 970 SLI that held back my CPU.

And even my old i7 920 setup is still in the top 10 for multi GPU.

X58 has truly been a great investment back then. As the system is now, I can go for 2 more years on X58.


----------



## irate_primate (Aug 13, 2017)

6700K @ 4.6Ghz
1080ti +80/650


----------



## Countryside (Aug 19, 2017)

First generation GCN


----------



## FreedomEclipse (Aug 19, 2017)

I think the top 10 single gpu group should be renamed the '1080 posse'


----------



## Hardi (Aug 19, 2017)

R5 1600 @ 3.8GHz + GTX 1070 @ 2025/2202 = 4013
(Yeah, I know Valley shows the GPU clock @ 2062, but that was only for a few seconds at the start; 99% of the test it ran @ 2025MHz. If that's a problem, then you can add it as 2062MHz.)


----------



## Kliim (Aug 19, 2017)




----------



## Countryside (Aug 19, 2017)

If I set 2xAA I got 44.1 FPS, which is quite decent for a 5-year-old GPU. 

The interesting thing was that if I turned off motion blur I lost 1fps


----------



## irate_primate (Aug 19, 2017)

FreedomEclipse said:


> I think the top 10 single gpu group should be renamed the '1080 posse'


Until someone comes through here with a Titan, that list will probably be 1080s for a while, unless the newest AMD GPU can score highly.  I'm curious to see how those run, though I was definitely hoping for more out of them.


----------



## VIGILANTBOY (Aug 19, 2017)

Hi everyone, I'm new here :D
Intel Core i7 6850K @ 4.5GHz 
GTX 1080 Ti Lightning Z @ +65/+400 
For my cores, anything above 2037 is unstable, and also, weirdly, for memory I cannot push above +400 for full stability. I have tested many times, and I don't know how it's possible that on a MONSTER card like the Lightning Z you only get +65/+400 :|


----------



## irate_primate (Aug 21, 2017)

VIGILANTBOY said:


> Hi everyone, I'm new here :D
> Intel Core i7 6850K @ 4.5GHz
> GTX 1080 Ti Lightning Z @ +65/+400
> For my cores, anything above 2037 is unstable, and also, weirdly, for memory I cannot push above +400 for full stability. I have tested many times, and I don't know how it's possible that on a MONSTER card like the Lightning Z you only get +65/+400 :|



Yeah that seems a tad low for the memory, but in real world applications I doubt it makes much of a difference.  I can push mine to +700, but I think it only makes a difference in synthetic benchmarks.


----------



## VIGILANTBOY (Aug 21, 2017)

irate_primate said:


> Yeah that seems a tad low for the memory, but in real world applications I doubt it makes much of a difference.  I can push mine to +700, but I think it only makes a difference in synthetic benchmarks.



WOW, +700 is really great, congratulations! Is it fully stable, with no artifacts, at +700???
YES, I was going to go with a Titan Xp, but my friends told me to go with a Kingpin or Lightning card. Though with a Titan Xp or a reference card you'd have a better chance of getting a better, golden chip.
These 4 things make a graphics card a KING over its competitors >>>>
1. Lower-voltage GPU & memory chips
2. Lower-temperature GPU & memory chips (I don't mean the GPU cooler, I mean the chip itself) <<<< some chips run hotter and some don't
3. The whole GPU cooler & cooling options (better air cooler or water block)
4. The GPU phase power delivery <<<<< Power delivery is in 4th place because if you don't have a lower-voltage chip, a lower-temperature chip, and a good cooler for the unit, then the power delivery isn't going to do anything, or has very little effect, even if it is 16+3, 20+3, or whatever. BUT if you have the first 3, that's when the power delivery comes into the game and the magic happens; then it will be a KING AND SUPER MONSTER.
But my friend's Lightning Z, which we bought together at the same time from the same shop, hits 2126 on the cores, and his temps are under 55 :O He says under full load for 30 minutes or an hour the temps are around 40-52.


----------



## irate_primate (Aug 22, 2017)

VIGILANTBOY said:


> WOW 700+ is really great congratulations is it full stable and non artifacts at 700???
> YES i was going with a titan xp but my friends told me to go with a kingpin or lightning card but with titan xp or a refrence card  it would have way better chance to get a better or gold chip
> these 4 benefits make a graphics card a KING of its competitors >>>>
> 1. LOWER VOLTAGE GPU & MEMORY CHIPSETS
> ...


2100+ on core is real good, have your friend come by here and beat my score.  I just tried for my max, and got +775 on memory with +77 on core. Honestly, it doesn't make much of a difference in gaming tho. It boosts my score again; am I allowed to post a new screenshot if I've beaten my old score?

6600K at 4.6
1080ti at +77/775


----------



## VIGILANTBOY (Aug 30, 2017)

irate_primate said:


> 2100+ on core is real good, have your friend come by here and beat my score .  I just tried for my max, and got +775 on memory with +77 on core.  Honestly, it doesn't make much of a difference in gaming tho.  Boosts my score again, am I allowed to post a new screenshot if I've beaten my old score?
> 
> 6600K at 4.6
> 1080ti at +77/775


Nice score
Yeah, 2126 is so much, I will tell him, but he is busy
What's your 1080 Ti model?


----------



## irate_primate (Aug 31, 2017)

VIGILANTBOY said:


> nice score
> yeah 2126 is so muchh i will tell him but he is busy
> Whats your 1080ti model ?


I've got the FTW3 Elite.  It's a good card, though it is a little bit louder than the Strix that I had.  I got the Strix first, but it had crazy coil whine so I had to send it back.  Newegg didn't have any more of them so I got a refund and grabbed the FTW3 instead.  It actually ran a little hot at first, but I opened it up and replaced the thermal paste with some good stuff I had and now the temps are really low.


----------



## 1Gpi2ZV6Jy (Aug 31, 2017)

**PRESS F12 for SCREENSHOT**

okay, where is it saved?


----------



## MrGenius (Aug 31, 2017)

1Gpi2ZV6Jy said:


> **PRESS F12 for SCREENSHOT**
> 
> okay, where is it saved?


https://www.techpowerup.com/forums/...chmark-1-0-scores.222154/page-12#post-3636095


----------



## 1Gpi2ZV6Jy (Aug 31, 2017)

MrGenius said:


> https://www.techpowerup.com/forums/...chmark-1-0-scores.222154/page-12#post-3636095



Nope, I mean where is the screenshot saved on my box? I can't find it....


----------



## MrGenius (Aug 31, 2017)

1Gpi2ZV6Jy said:


> Nope, I mean where is the screenshot saved on my box? I can't find it....


On your "box"? I assume you mean on your "computer". It's in the Valley screenshots folder. Where's that? On Windows 7 you click Start (or open Windows Explorer from the icon on the taskbar), then click Computer. Then click Local Disk C: (or whatever drive letter corresponds to the drive your OS is on). Then click the Users folder. Then click the folder with your User Name. Then click the Valley folder. Then click the screenshots folder.



MrGenius said:


> 2. Then open Windows Explorer > Computer > C: (or whichever drive your OS is installed on) > Users > User Name > Valley > screenshots


----------



## 1Gpi2ZV6Jy (Aug 31, 2017)

MrGenius said:


> On your "box"? I assume you mean on your "computer".....



Aha, thank you very much!
Okay, found it... how am I supposed to know when that isn't mentioned anywhere!?

Attached my lowish score - just to mention all default speeds


----------



## MrGenius (Aug 31, 2017)

1Gpi2ZV6Jy said:


> ...how am I supposed to know when that isn't mentioned anywhere!?


Well...Google comes to mind. But I just added a hint in the first post to make it a little easier.


1Gpi2ZV6Jy said:


> Attached my lowish score - just to mention all default speeds


That score's impossibly low for your card. Your score should be AT LEAST 3x higher. I don't know what you've got going on with your system, but it's killing your graphics performance.


----------



## 1Gpi2ZV6Jy (Aug 31, 2017)

MrGenius said:


> Well...Google comes to mind. But I just added a hint in the first post to make it a little easier.



Brilliant idea!



MrGenius said:


> That score's impossibly low for your card. Your score should be AT LEAST 3x higher. I don't know what you've got going on with your system, but it's killing your graphics performance.



My few games are all fine; FC4, for example, maxed out (Ultra/8xAA, full screen) runs very smooth. 
My Stormblood bench score (full screen) is 8374, which is not too bad. I think Valley is a little too old for new GPUs?!

I know there is a problem with Gigabyte's AGE and AMD's Wattman interfering with each other (killing GPU MHz)


----------



## MrGenius (Aug 31, 2017)

1Gpi2ZV6Jy said:


> I think Valley is a little too old for new GPUs?!


It's not that. In fact that's not even a thing.


1Gpi2ZV6Jy said:


> I know there is a problem between Gigabyte's AGE and AMD's Wattman interfering each other (killing GPU mhz)


That I don't know about. I can only go by what your screenshot shows, which shows that's not happening during the benchmark. It could be wrong though.

Here's a review showing an RX 570 running at 1300(Gaming Mode)/1750 scoring 2650 in Valley @ 1080p Ultra 4x AA.
http://vishveshtech.blogspot.com/2017/06/asus-rog-strix-rx-570-oc-review.html

There's no way the extra 20MHz on the core and 4x AA vs. 8x AA should account for a ~2000 point gain over your card's score. The difference between 1280/1750 and 1300/1750, plus 4x AA vs. 8x AA, shouldn't be more than 100-200 points at best. Something is holding your card back BIG TIME! You should EASILY be scoring around 2000 or more, even with no overclocking. Check your AMD Radeon gaming settings and make sure they look like this.
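The back-of-the-envelope check here can be put in numbers. Assuming (my simplifying assumption, and an optimistic one) that Valley score scales roughly linearly with core clock, the 20MHz deficit versus the reviewed card explains only a few dozen points, using the RX 570 figures from the linked review as the reference:

```python
def scaled_score(ref_score: float, ref_clock_mhz: int, clock_mhz: int) -> float:
    # Crude linear-with-clock estimate; real scaling is usually worse than
    # linear, so this is an upper bound on what the clock gap can explain.
    return ref_score * clock_mhz / ref_clock_mhz

# Reference: RX 570 @ 1300MHz scoring 2650 in Valley (from the review above)
estimate = scaled_score(2650, 1300, 1280)
print(round(estimate))  # 2609: a ~40 point drop, nowhere near a ~2000 point gap
```

The same crude estimate works in reverse for sanity-checking any score against a reviewed card at a known clock.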


----------



## 1Gpi2ZV6Jy (Aug 31, 2017)

Yep, right you are. Now I'm at 1921, and the offender was? Supersampling!
Another thx for this

Funny that the Enix benchmark scores are not affected by Crimson's AA settings; the scores stay the same... multi/supersampling does not matter.


----------



## VIGILANTBOY (Sep 4, 2017)

new update:
6850K @ 4.6 GHZ
1080 TI LIGHTNING Z @ 2037-1509


----------



## VIGILANTBOY (Sep 4, 2017)

irate_primate said:


> I've got the FTW3 Elite.  It's a good card, though it is a little bit louder than the Strix that I had.  I got the Strix first, but it had crazy coil whine so I had to send it back.  Newegg didn't have any more of them so I got a refund and grabbed the FTW3 instead.  It actually ran a little hot at first, but I opened it up and replaced the thermal paste with some good stuff I had and now the temps are really low.


YES, fan noise and coil whine.... things that you never hear from a LIGHTNING Z


----------



## irate_primate (Sep 5, 2017)

new update:

1080ti:  +77/725
6700K:  @ 4.8GHz

I don't think I'll be able to keep my lead over you with your CPU, so this will probably be my last attempt, but good scores!  And I just noticed that I said I had a 6600K previously, which was a typo, as I have a 6700K. Oops.


----------



## VIGILANTBOY (Sep 5, 2017)

new Score: 6850K@4.6GHz
1080 TI LIGHTNING Z@2037-1513





This can be my last attempt too, or maybe I can go further with some extra MHz on the GDDR5X memory and CPU... You are better on the CPU side: that's 4.8GHz to my 4.6. Also, why didn't you push the memory to +775? Push that 6700K to 5GHz. It would be interesting to know what the scores would be with your card on my system and my card on yours.


----------



## zharth (Sep 12, 2017)

First real test. Haven't hit my threshold yet, but I figured it was worth a shot. I'm on air cooling, and so far the temps are within range to keep boosting clock/memory. The CPU is at 4.5GHz, just for extra clarification, but sadly I'm pretty sure this is all it will handle unless I shove it past 1.4V. I'd rather not do that though; this is supposed to last a long time.


----------



## Melvis (Sep 26, 2017)

AMD FX 8350@ Stock + GTX 970 Sli @ Stock = 3116


----------



## Cetinakpan (Sep 30, 2017)

My Zotac GTX 1060 6G AMP! Edition with i3 6100


----------



## The Pack (Nov 30, 2017)

The Pack: i7 6850K @ 4.3GHz, Asus Strix GTX 1080 Ti O11G @ 2062/1475 = 6227


----------



## purecain (Dec 13, 2017)

No one has benchmarked a Titan V yet??? lol, I wish I had a spare 3 grand... damn it...!


----------



## Hugis (Jan 6, 2018)

Just for giggles


----------



## Vayra86 (Jan 7, 2018)

Are we really still stuck on 1080p?

I strongly advocate we move to 1440p or even 4K for Valley. These scores have become less relevant these days, and this is becoming less and less of a true GPU bench. It was like that over a year ago, and it's become worse.

I think now is a good time to consider this, because we are at the end of a GPU cycle, with Vega released and the 1080 Ti as well. It would be informative and nice to see the current score leaders perform at higher res with the same cards, and it would create a playing field for the upcoming releases to score against. Especially because we have such a wide variety of VRAM setups these days; higher res will push those harder as well, and those are more interesting results to look at.


----------



## BarbaricSoul (Jan 7, 2018)




----------



## Vayra86 (Jan 17, 2018)

i7-8700K | 4.8GHz | GTX 1080 | 2100/5500 | 4966


----------



## YautjaLord (Jan 20, 2018)

MrGenius said:


> ****PRESS F12 for SCREENSHOT - *Please attach a screen capture of your results for score verification.***
> 
> 
> Spoiler: Where's my screenshot saved?
> ...


Texture filtering - Anisotropic sample optimization - Off; Texture filtering - Negative LOD bias - Clamp (Quality); Texture filtering - Quality - High quality; Texture filtering - Trilinear optimization - On, even though I set it (unsuccessfully) to Off; blame Nvidia Control Panel for "behaving" dorky. Whatd'ya say, will it do?  GPU set to +125MHz for base & boost clocks, VRAM - 4104MHz (8208MHz), fan speed - 65%, power limit - 111%, temp limit - 92°C in MSI Afterburner; graphics card - GTX 1070 G1.


----------



## The Pack (Jan 21, 2018)

The Pack / i7 5960X @ 4.625GHz + Asus Strix GTX 1080 Ti O11G @ 2062/1463 = 6374





YautjaLord said:


> Texture filtering - Anisotropic sample optimization - Off, Texture filtering - Negative LOD bias - Clamp (Quality), Texture filtering - Quality - High quality, Texture filtering - Trilinear optimization - On, even though i set it (unsuccessfully) to Off, blame Nvidia Control Panel for "behaving" dorky. Whatd'yasay, will do?  Set GPU to +125MHz for base & boost clocks, VRAM - 4104MHz (8208MHz), fan speed - 65%, Power limit - 111%, Temp limit - 92 degrees C in MSI Afterburner, graphics card - GTX 1070 G1.



The Strix 1070 O8G I had ran under water with 9800MHz VRAM... What a machine that was...


----------



## YautjaLord (Jan 27, 2018)

I already uploaded one from Heaven 4.0; same stuff here.  Why do both Heaven 4.0 & this one report I have 4GB instead of 8GB VRAM, wtf? Because it's windowed? Cheers.


----------



## usmc362 (Feb 12, 2018)




----------



## KSMB (Apr 1, 2018)

*EXTREME HD:*
Win 10 Creators Update...........Strix GTX 970..........4670K @ 4.5GHz + H80i V2..............4x2GB G.SKILL DDR3......................EVO 850 250GB.


----------



## Vayra86 (Apr 1, 2018)

KSMB said:


> *EXTREME HD:*
> win 10 creator update...........Strix Gtx 970..........4670K @4.5Ghz + H80i V2..............4x2GB G.SKILL DDR3......................EVO 850 250Gb.
> 
> View attachment 99135





usmc362 said:


> View attachment 97121



These shots are not valid... you need to screenshot within the application with it running in full screen. Read the OP carefully.


----------



## KSMB (Apr 5, 2018)

Extreme HD settings, Valley 1080p, Strix (non-OC) GTX 970


----------



## KSMB (Apr 7, 2018)

Vayra86 said:


> These shots are not valid... you need to screenshot within the application with it running in full screen. Read the OP carefully.


 ok  (PS...they are true though)


----------



## Vayra86 (Apr 7, 2018)

KSMB said:


> ok  (PS...they are true though)



Yes, I believe you, but the problem is, anyone with Photoshop or even Paint can fake that browser window in as little as 10 seconds, while the in-app screenshot shows the actual application running, with an onscreen display of the FPS and the GPU used.

It's also kind of puzzling that people happily run and post bench results without reading the OP, and even ignore it AFTER it was specifically pointed out a few posts ago... I mean, why bother?


----------



## er557 (Apr 7, 2018)

Doesn't it seem a tad low for such a mean benchmarking machine?  Could it be the benchmark is outdated and leans heavily on single-threaded CPU clock?


----------



## Vayra86 (Apr 7, 2018)

er557 said:


> Doesn't it seem a tad low for such a mean benchmarking machine?  could it be the benchmark is outdated and focuses highly on cpu clock with single thread?
> 
> 
> 
> ...



Yes, it's low, and it's your CPU clock, because these cards can destroy the FPS in this bench. FPS doesn't scale linearly.


----------



## mouacyk (Apr 21, 2018)

i7-8700K @ 5.0GHz + GTX 1080 TI @ 2100.5/12600 = 7001


----------



## MrGenius (Apr 23, 2018)

Scores list updated.


----------



## mouacyk (May 11, 2018)

Tweaked the VRAM with VRAMBandwidth and memtestg80, with a better score:

i7-8700K @ 5.0GHz + GTX 1080 TI @ 2100.5/12627 = 7184


----------



## mouacyk (May 20, 2018)

i7-8700K @ 5.0GHz + GTX 1080 TI @ 2100.5/12627 = 7204


----------



## RealNeil (May 20, 2018)

That 8700K is damn fine. I just got one of them, but I've had no time to put it together yet.


----------



## m1ch4L (Jun 13, 2018)

i7-3770k @ 4.8GHz + GTX 1080 Ti @ 2062/12030 = 6130


----------



## RealNeil (Jun 13, 2018)

m1ch4L said:


> i7-3770k @ 4.8GHz + GTX 1080 Ti @ 2062/12030 = 6130



Good score. That beats my 6700K system.





But if you add a pair of lesser GPUs in SLI, you do better.


----------



## MrGenius (Aug 15, 2018)

One of the last runs for the old 280X.  Vega 64 coming soon! 

i7-3770K @ 5.3GHz + R9 280X @ 1272/1850 = 2361


----------



## Mr.KT (Aug 16, 2018)

With my 1080, normal settings, without OC
i7-8700K | 4.9GHz | GTX 1080 x1 | 2062 | 5205 | 4931


----------



## AlwaysHope (Aug 17, 2018)

None of my image-capturing apps work well with this benchmark; I have to completely exit the game. I captured the HTML results, but that doesn't show the app in the background. 
Apart from that, this benchmark is out of date for the world of Win 10.


----------



## KSMB (Aug 17, 2018)

*This old Haswell eats FPS like a banana... i5 4670K @ 4.5GHz {H80i v2} // Strix GTX 970* (no voltage OC on GPU)
67°C (GPU) max with Extreme HD settings (1080p)....that's freaking nice :=)










KSMB said:


> ok  (PS...they are true though)


ok....thanks :=)


----------



## MrGenius (Aug 18, 2018)

i7-3770K @ 5.3GHz + RX Vega 64 @ 1667/1159 = 4121


----------



## kurosagi01 (Aug 18, 2018)

AMD Ryzen 1600 @ 3.7GHZ + Vega 64 @ 1630/980 = 3464


----------



## Turbo (Aug 20, 2018)

I5-3470 @ 3.4GHz + HD7990 @ 1100/1500 = 3577


----------



## CS85 (Aug 21, 2018)

2700X @ Stock, GTX 980 Ti 1432/2000


----------



## purecain (Aug 21, 2018)

I just ran this on my Titan V and only scored 5342, virtually the same score as on my Titan X Pascal. My CPU is definitely a bottleneck, as it wouldn't go above 130fps... it was like a hard limit... the first time I've ever seen anything like it. I expected to see much higher scores.  My core clocked all the way up to 1912MHz as well. I haven't overclocked this card yet, but it looks like I have a really good sample. Shame about the bottleneck... I need to have a look at overclocking my CPU before running this in the future. Just checked, and it's less than I was getting before. Really weird...


----------



## CS85 (Aug 21, 2018)

purecain said:


> I just ran this on my Titan V and only scored 5342, virtually the same score as on my Titan X Pascal. My CPU is definitely a bottleneck, as it wouldn't go above 130fps... it was like a hard limit... the first time I've ever seen anything like it. I expected to see much higher scores.  My core clocked all the way up to 1912MHz as well. I haven't overclocked this card yet, but it looks like I have a really good sample. Shame about the bottleneck... I need to have a look at overclocking my CPU before running this in the future. Just checked, and it's less than I was getting before. Really weird...


130fps meaning the max FPS or just the average?


----------



## purecain (Aug 21, 2018)

The max, as in it would not go over at any point. It looks like a bottleneck... because the card can do better than 130fps in Valley... although it could be a driver issue..
Just checked online and I should have been hitting up to 170fps, so something is up...


----------



## CS85 (Aug 21, 2018)

purecain said:


> the max as in would not go over at any point.


That's strange, we've got the same CPU but you've got a much faster GPU and I got 192fps max. Maybe something limiting the fps somewhere?


----------



## purecain (Aug 21, 2018)

Yep, I've been looking for the past hour. I'll come back to it tomorrow, maybe there is something to learn from this...


----------



## Turbo (Aug 24, 2018)

I5-3470 @ 3.4GHz + Crossfire R9-290 @ 1040/1250 = 4120

Example of what happens when you use a terrible board... x16-3.0/*x4-2.0* is causing a bottleneck.
Maybe not... I see there's a pair of 390s doing 4290, and that's on an SLI Z270 board, so I guess x4 2.0 isn't hurting too much?


----------



## CS85 (Aug 24, 2018)

Turbo said:


> I5-3470 @ 3.4GHz + Crossfire R9-290 @ 1040/1250 = 4120
> 
> Example of what happens when you use a terrible board... x16-3.0/*x4-2.0* is causing a bottleneck
> maybe not... I see theres a pair of 390s doing 4290, and thats on a SLI Z270 board, so I guess X4 2.0 isnt hurting too much?


Still a good score though, how does it perform in games that support Crossfire/SLI?


----------



## Turbo (Aug 25, 2018)

CS85 said:


> Still a good score though, how does it perform in games that support Crossfire/SLI?


In BF1 I'm CPU bound and get negative scaling. Bioshock Infinite doesn't show any increase in performance, but it's already at 100fps average with max/ultra settings... haven't had a chance to try anything else... I currently only have a 1080p monitor though, so I probably won't see any scaling until I get 1440p or better.


----------



## purecain (Aug 25, 2018)

OK, this time I got 5637. I suppose this is better, but I don't think it's close to where this card should be. In MSI Afterburner I notice that CPU 1 is at 100% the whole time... I may just OC that core and check the effect...
OK, it's my CPU OC. It had originally been sticking around 4.0, then I changed the BIOS setting to the first OC setting, and that gave me the score above with all cores at 4.1GHz. I just loaded up the OC3 profile and it has given me 4.1 at stock in performance mode, with 4.2+ for the OC... will bench soon and see the result... I suspect this is the issue. If this works I'll just clock the first core to 4.35 or something and see what happens. With the CPU @ 4.2-4.3 my score was 5843, a 200-point increase for 100MHz on the CPU... the bottleneck is the CPU, and it looks like I'm getting an extra 2 points for each 1MHz increase in CPU speed. Interesting.
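The back-of-envelope scaling described above can be sketched as a two-point linear fit. A minimal Python sketch, using only the scores quoted in the post; the assumption that the CPU-bound scaling stays linear is mine, not something the benchmark guarantees:

```python
# Quick sanity check (my own sketch, not from any benchmark tool):
# fit the two runs to a straight line and extrapolate, assuming the
# roughly linear CPU-bound scaling described in the post holds.

def points_per_mhz(run_a, run_b):
    """Slope between two (cpu_mhz, score) benchmark runs."""
    (mhz_a, score_a), (mhz_b, score_b) = run_a, run_b
    return (score_b - score_a) / (mhz_b - mhz_a)

# Scores from the post: 5637 at 4.1GHz all-core, 5843 at ~4.2GHz.
slope = points_per_mhz((4100, 5637), (4200, 5843))
print(slope)  # ~2 points per MHz, matching the rough estimate

# Extrapolate to the proposed (hypothetical) 4.35GHz overclock.
predicted = 5843 + slope * (4350 - 4200)
print(round(predicted))
```

Of course, the linearity only holds while the CPU stays the bottleneck; once the GPU limit is reached, extra CPU MHz stop paying off.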


----------



## CS85 (Aug 25, 2018)

purecain said:


> ok this time I got 5637 I suppose this is better but I dont think its close to where this card should be. in msi afterburner I notice that cpu 1 is at 100% the whole time... I may just oc that core and check the effect...
> ok it's my cpu oc. its originally been sticking around 4.0, then I changed the bios setting to the first oc setting and that gave me the score above with all cores at 4.1ghz. I just loaded up the oc3 profile and it has given me 4.1 at stock in performance mode with 4.2+ for the oc... will bench soon and see the result... I suspect this is the issue. if this works I'll just clock the first core to 4.35 or something and see what happens.. with the CPU @4.2-4.3 my score was 5843, a 200point increase for 100mhz on the CPU… the bottleneck is the CPU and it looks like I'm getting an extra 2points for each 1mhz increase in CPU speed. interesting.


Yeah, I don't think the benchmark is that well multi-threaded. I noticed the same thing on Heaven; it would stutter hard at one part with one core at 100%. Got a higher score by disabling cores down to 4C/4T and overclocking to 4.4GHz. Pretty old benchmark by now though.


----------



## KSMB (Sep 11, 2018)




----------



## Athlonite (Sep 12, 2018)

It's about time Unigine updated this benchmark to 64-bit.


----------



## johnspack (Sep 26, 2018)

Again, just a placeholder until my 980 Ti gets here... my result under Linux with my 970. Yes, I don't have DX...


----------



## johnspack (Oct 1, 2018)

Yeah,  ok,  did this one under windows so it would look more legit.  About a 40% increase?  I'll post full results once I learn this new card...


----------



## TexSC (Oct 7, 2018)

Created an account to post this score! Just got a brand new EVGA GeForce GTX 1080 Ti SC2 GAMING, and have been tweaking it for a week.

CPU: i7-4790K @ 4.4GHz
GPU: 1080Ti @ 2012/1600
Score: 6213

Used EVGA Precision X OC to dial in the 100% voltage, 120% power target, 90°C temp target (priority), +26MHz GPU clock offset and then the massive +900MHz mem clock offset. I think I got lucky with a pretty good GPU core but an outstanding set of memory. No artifacts at all. Air cooled.

Pretty excited to make the top 10! (#6 at time of post). I don't think I can push it higher, and higher core clocks mean drastically lower memory clocks, which drops the score quite a bit.


----------



## Enterprise24 (Oct 8, 2018)

i7-8700K @ 5.0GHz + GTX 1080 Ti @ 2088/1586 = 6843


----------



## Enterprise24 (Oct 12, 2018)

i7-8700K @ 5.3GHz + GTX 1080 Ti @ 2164/1586 = 7414


----------



## trog100 (Oct 28, 2018)

8700K at 5g.. 2080ti gpu core 1502  boost 1802... memory 1937..

something is amiss with this one i aint sure what.. i will run it at 1440.. just to see..






trog


----------



## Enterprise24 (Nov 1, 2018)

i7-8700K @ 5.3GHz + GTX 1080 Ti @ 2164/1580 = 7579


----------



## mouacyk (Nov 3, 2018)

8700K 5.1GHz / 1080Ti 2,177/12,627 = 7403


----------



## trog100 (Nov 8, 2018)

8700K at 5g.. 2080ti gpu core 1502 boost 1802... memory 1949 .. palit bios update giving a max power setting of +26% as opposed to +15%







trog


----------



## Kwh203 (Nov 18, 2018)




----------



## TRIPTEX_CAN (Nov 18, 2018)




----------



## lapino (Nov 18, 2018)

Since my performance in BF:V is not that good, I decided to run this benchmark. i5 6600K at 4.5GHz with an EVGA GTX 1080 FTW, I get a 4211 score. Is this normal? Seems low.


----------



## Vayra86 (Nov 19, 2018)

lapino said:


> Since my performance in bf:v is not that good, I decided to run this benchmark. I5 6600k at 4.5ghz with evga gtx1080 ftw I get 4211 score. Is this normal? Seems low.



You are CPU limited. There is a GTX 1080 score above you on an 8-core Ryzen; it's 10% higher, for example.

BFV can use up to 8 threads, so your 4 thread CPU is not optimal. You stand to gain _some_ FPS. Best case I reckon 15-20%, worst case zero. If your BF V is stuttery right now, a CPU upgrade will fix that - make sure you get 6-8 physical cores in that case, so Ryzen 5 / Intel 8700K / 8600K or better.


----------



## trog100 (Nov 19, 2018)

Vayra86 said:


> You are CPU limited. There is a GTX 1080 score above you on an 8-core Ryzen, its 10% higher, for example.
> 
> BFV can use up to 8 threads, so your 4 thread CPU is not optimal. You stand to gain _some_ FPS. Best case I reckon 15-20%, worst case zero. If your BF V is stuttery right now, a CPU upgrade will fix that - make sure you get 6-8 physical cores in that case, so Ryzen 5 / Intel 8700K / 8600K or better.



some folks try very hard and often to get good scores on here.. and it aint that long ago people were saying four cores is all you need for gaming.. i could be wrong but i dont think much has changed..

trog


----------



## Vayra86 (Nov 19, 2018)

trog100 said:


> some folks try very hard and often to get good scores on here.. and it aint that long ago people were saying four cores is all you need for gaming.. i could be wrong but i dont think much has changed..
> 
> trog



Here's mine, no tryhard involved, just best out of three runs.

i7-8700K   4.8GHz   GTX 1080    2100/1375  _*4966*_     Vayra86

This bench is heavily CPU dependent; cores and clocks both have impact, and that is simply because this bench is too lightweight, so the GPU is really just waiting around a lot.

Regardless, I see the same thing in games. A fast single-thread CPU and sufficient cores are the alpha and omega for gaming. That won't ever change. And while 4 cores is 'enough', once you get into GTX 1080 or higher GPU territory, it will be the bottleneck, even if you run those cores at 5GHz. And then there is an increasing number of games that really do benefit from more than 4 cores/threads, and even scale up to 8; games on Frostbite are among them. You can look at this forum and the number of threads people make regarding their Ivy or Haswell i5 CPUs that have run out of juice for comfortable gaming, while the i7s are still doing alright.


----------



## trog100 (Nov 19, 2018)

mine is 7753 with a 2080ti.. but you are right.. its time for a 1440 resolution in both heaven and valley..

trog


----------



## MrGenius (Nov 20, 2018)

trog100 said:


> its time for a 1440 resolution in both in heaven and valley..
> 
> trog


You want it? You got it!


----------



## trog100 (Nov 20, 2018)

MrGenius said:


> You want it? You got it!



you have my thanks for doing what you are doing but i dont have the patience.. 

trog


----------



## Enterprise24 (Nov 21, 2018)

1440p result.

i7-8700K @ 5.3GHz + GTX 1080 Ti @ 2202/1580 = 4852


----------



## trog100 (Nov 21, 2018)

valley at 1440  cpu 8700K at 4.9g  2080ti gpu 1489 core  1937 memory.. camera pic i still cant do a full screen save..






trog


----------



## n1ko (Nov 21, 2018)

i5 7600K 5.2GHz; RAM: HyperX Fury 16GB 2x 2666MHz OC: 3200MHz & tweaked timings; G1 Gigabyte GTX 980 Ti 1554/3870 (+366 mem offset) - unfortunately it's not like my GTX 970's memory, which did around a +750 offset (8550MHz)


----------



## MrGenius (Nov 21, 2018)

Damn son! That's one helluva score for a 980 Ti. 

i5-3570K @ 5.0GHz + RX Vega 64 @ 1660/1175 = 2543




i5-3570K @ 4.8GHz + RX Vega 64 @ 1664/1175 = 4064


----------



## infrared (Nov 22, 2018)

My 1440P result 

R7-1800X | 4.2GHz | GTX 1080 Ti | 2101/1552 | 2616 | infrared


----------



## MrGenius (Nov 22, 2018)

infrared said:


> My 1440P result
> 
> R7-1800X | 4.2GHz | GTX 1080 Ti | 2101/1552 | 2616 | infrared
> 
> View attachment 111094


Wrong thread. Good score though.


----------



## infrared (Nov 22, 2018)

MrGenius said:


> Wrong thread. Good score though.


Oops!  I'll have a crack at Valley tomorrow.  cheers


----------



## MrGenius (Dec 3, 2018)

Switched back to the 3770K.

i7-3770K @ 5.3GHz + RX Vega 64 @ 1667/1175 = 2552




i7-3770K @ 5.3GHz + RX Vega 64 @ 1661/1175 = 4126


----------



## Enterprise24 (Dec 3, 2018)

What voltage on your 3770K 5.3Ghz ?


----------



## MrGenius (Dec 3, 2018)

Enterprise24 said:


> What voltage on your 3770K 5.3Ghz ?


1.616V is all it needs for Valley.


----------



## simmie (Dec 23, 2018)

I'm still pretty new to finding my way around overclocking, but I'd appreciate it if you'd let me know how I did and anything I can improve/tweak.


----------



## Mumtaz (Jan 3, 2019)




----------



## Vayra86 (Jan 4, 2019)

simmie said:


> I'm still pretty new to learning my ways around overclocking, but id appreciate it if you'd lmk how I did and anything I can improve/tweak



Not much you can do with your rig. The 960 won't go faster and the CPU isn't overclockable. That said, you do have a good balance between CPU/GPU. That means if you're ever upgrading, go all the way.

You can still unlock core voltage for the 960 but I wouldn't bother, honestly.


----------



## agent_x007 (Jan 26, 2019)

Titan X (Maxwell) 1178/3505MHz at 1.055V.

1080p :




1440p


----------



## P4-630 (Aug 3, 2019)

i7 6700K @4.3GHz + MSI RTX2070 Super Gaming X Trio @ stock 1980/1750 (7001)


----------



## Arctucas (Aug 3, 2019)




----------



## MurdoD (Aug 3, 2019)

i9 9900K @3.6GHz (Intel Boost and Speedstep Enabled) + Palit RTX2080 Ti GamingPro OC @ Stock 1350/1650 (7000) 

1440p




1080p Extreme






trog100 said:


> valley at 1440  cpu 8700K at 4.9g  2080ti gpu 1489 core  1937 memory.. camera pic i still cant do a full screen save..
> 
> trog



Which RTX 2080Ti is this? With core clocks so high? Or is it overclocked?


----------



## The Pack (Aug 4, 2019)

Enterprise24 said:


> i7-8700K @ 5.3GHz + GTX 1080 Ti @ 2164/1580 = 7579
> 
> View attachment 109692
> 
> View attachment 109693


Is a BIOS mod at work here (an LN2 BIOS...?)? None of the 1080 Tis take 1.13V vcore; max is 1.081 to 1.093. I picked a Poseidon and it works perfectly, but max 1.081 to 1.093V.


----------



## Enterprise24 (Aug 4, 2019)

The Pack said:


> Here is working a Bios Mod (LN2 Bios...?) No one of the 1080ti takes 1.13 vcore. Max. 1.081 to 1.093. I pick a poseidon and it works perfektly, but max. 1.081 to 1.093v



The ASUS Strix XOC BIOS allows up to 1.2V on the 1080 Ti.
My card doesn't take advantage of anything more than 1.162V.
Max OC is 2215MHz in Basemark, 2202MHz in Valley, 2190MHz in Night Raid, 2177MHz in Time Spy / Heaven.


----------



## The Pack (Aug 4, 2019)

i7 5960X @ 4.5GHz + Asus Poseidon GTX 1080 Ti O11G 2075/1550 = 4306

I cannot take a screenshot if Full Screen is enabled. I don't know why; I think it's down to Windows. In windowed mode it's possible.


----------



## Athlonite (Aug 4, 2019)

The Pack said:


> I can not take a screenshot if Full Screen is enabled



Have you tried pressing the Print Screen key, then opening Paint and pasting as a new image? That usually works for almost everything.


----------



## The Pack (Aug 4, 2019)

Windows+Printscr...


----------



## RealNeil (Aug 5, 2019)




----------



## Athlonite (Aug 5, 2019)

The Pack said:


> Windows+Printscr...



Just the Prt Scr button on your keyboard; it should be right next to the F12 key.


----------



## purecain (Aug 6, 2019)

3900x @ 4.2- 4.4ghz
Titan V 1912mhz 998mhz mem 
score: 6525


----------



## The Pack (Aug 6, 2019)

Athlonite said:


> just the Prt Scr button on your keyboard it should be right next to the F12 key


That does not work somehow. I think Microsoft has broken something in Windows once again. I have now taken a picture with my mobile phone at 2560x1440. I'll post it later, when I'm at home.

Somehow I have the worst-case scenario with my Windows. It does not save the screenshot, but I can copy it into Paint via Ctrl+V, and then the screenshot appears.

i7 5960x @ 4.5Ghz + Asus Poseidon Gtx 1080ti 2075/1552 = 4387




i7 5960x @4.5Ghz + Asus Poseidon Gtx 1080ti 2075/1550 = 6447 




or better

i7 5960x @ 4.5GHz + Asus Poseidon GTX 1080 Ti 2100/1575 = 4456




and
i7 5960x @ 4.5GHz + Asus Poseidon GTX 1080 Ti 2100/1575 = 6502


----------



## purecain (Aug 6, 2019)

The Pack: well done!!!! Nice results... BTW your screenshots are automatically saved to your Microsoft account. They give you so much space on the cloud to keep photos and stuff you don't want to lose.
Titan V 2025MHz core, 998MHz HBM2


----------



## The Pack (Aug 7, 2019)

purecain said:


> The Pack : well done!!!! nice results... BTW your scrn shots are automatically saved on your Microsoft account. they give you so much space on the cloud to keep photos and stuff you dont  want to lose.
> Titan V 2025Mhz core 998Mhz HBM2
> View attachment 128661


Thanks, I found it now. I think now is the time to change my CPU, to a 9900X or so. Better IMC, better cores.


----------



## purecain (Aug 7, 2019)

I would wait to upgrade your CPU. When the 3950X 16-core CPU is released, you'll be able to pick up a cheap 3900X... imo anyway...

here's another... 6781


Had one more run: 6832 @ 2085MHz GPU, 998MHz HBM2.
I think I need to put this card under water for any more performance... as this is all on air with the rig in sig...


----------



## JohnnyDirect (Aug 8, 2019)

Score 8703, FPS 208. 9900K @ 5.3GHz, 2x MSI 2080 Ti Sea Hawk EK X @ 2115MHz, VMEM 7835 (aka 3917x2)


----------



## purecain (Aug 8, 2019)

It's a shame SLI is receiving so little support. Benchmarks like this show the improvement. Great score!
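For a rough sense of how much the second card adds here, scaling efficiency can be computed from two scores in this thread (trog100's single 2080 Ti at 7753 vs the 2x 2080 Ti run at 8703). Purely illustrative, since clocks, CPUs, and drivers differ between the two runs:

```python
# Back-of-envelope sketch (mine, not from the thread's tooling):
# fraction of ideal n-GPU scaling achieved between two roughly
# comparable scores. Only a ballpark figure, since the two systems
# are not identical.

def sli_efficiency(multi_score, single_score, n_gpus=2):
    """multi_score / (n_gpus * single_score); 1.0 means perfect scaling."""
    return multi_score / (n_gpus * single_score)

print(f"{sli_efficiency(8703, 7753):.0%}")
```

Roughly half of ideal two-way scaling, which likely says more about the benchmark being CPU bound at 1080p than about SLI failing outright.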


----------



## mouacyk (Aug 8, 2019)

JohnnyDirect said:


> Score 8703 FPS 208 9900K@5.3ghz 2X MSI 2080 TI Sea Hawk EK X 2115ghx VMEM 7835/AKA 3917x2


If that ain't the fastest GPU for decades to come!


----------



## P4-630 (Aug 10, 2019)

i7 6700K @4.3Ghz , RTX 2070 Super 1980/2013


----------



## JohnnyDirect (Aug 11, 2019)

purecain said:


> Its a shame sli is receiving so little support. Benchmarks ,like this show the improvement. Great score!



I agree, especially for VR. I play Rainbow Six Siege and get marked improvement @ 4K with SLI, maxed out including 4x TAA; but I also play DCS and Elite Dangerous in VR. VR is sorely lacking multi-GPU support.


----------



## P4-630 (Sep 1, 2019)

i7 6700K @ 4.3GHz / RTX2070 Super 1995/2013


----------



## ThrashZone (Feb 23, 2020)

HI,
9940x 4.9 4k memory 




x99 4.5 3200c14 timings


----------



## Ferrum Master (Feb 23, 2020)

@ThrashZone

Do it in proper requested resolution dude.


----------



## ThrashZone (Feb 23, 2020)

HI,
Maybe later subs for tenforums.com


----------



## EarthDog (Feb 23, 2020)

ThrashZone said:


> HI,
> Maybe later subs for tenforums.com


Hi. Nice scores... they don't match the requested settings though.

Do you start all your posts with hi? Across every thread here and the other forums you've placed these results in?


----------



## ThrashZone (Feb 23, 2020)

EarthDog said:


> Hi. Nice scores.. they dont match the requested settings though.
> 
> Do you start all your posts with hi? Across every thread here and other forums you've placed these results in?


Hi,
Obviously yes 
Is there an echo in here


----------



## RealNeil (Feb 24, 2020)

With proper resolution and settings.
I7-8700K
2X 1080FE GPUs in SLI





With incorrect settings:


----------



## Schmuckley (Feb 24, 2020)

Is this thread maintained?



RealNeil said:


> With proper resolution and settings.
> I7-8700K
> 2X 1080FE GPUs in SLI
> 
> ...


Teh last of the SLI.

Allow me to show you some of the 1st:










This is with the gimped settings that I run daily. -25% power 



Dannng! Titan V! Lemme see how much one o' dem is!









(eBay listing: NVIDIA TITAN V Volta 12GB HBM2 Graphic Card, www.ebay.com)




Doh! Dude, it's a video card. I could probably get one that clocks to the moon with half a 7970 welded to it for under $3k.


----------



## RealNeil (Feb 24, 2020)

SLI works for me. Most of my Shooters use it, and when they don't, one 1080FE is enough to run them on decent settings.
Another one of my PCs has a pair of 1070Ti cards in it.

I just sit back and chuckle when people post about SLI being dead and then I use it for myself.
My third system has a single 1080Ti in it and it runs great, but if I get the chance, I'll buy another one just like it.
If it's just a waste, I can live with it.


----------



## Cetinakpan (Apr 24, 2020)

New


----------



## petedread (May 1, 2020)

Ryzen 9 3950x @ stock(4.55ghz) + 2080 TI @ 2130/7250 = 6453



Have I done this correctly?

Ryzen 9 3950x @ stock(4.55ghz) + 2080 TI @ 2150/7500 = 6641


----------



## petedread (May 5, 2020)

Fans and pumps running a bit faster, but nothing dialed into Afterburner. GPU at stock and scores go up lol, = 6703


Ryzen 9 3950X @ 4.5 + 2080 Ti @ 2150/8000 = 6972. Aggressive fan curve, +200/+1000.

Going from 32 threads stock to 8 cores at 4.625GHz has not gotten me a better score than the one posted above. The day is warming up, so I think temps are playing a bigger part right now. Maybe this evening when it gets cooler, using fewer cores at higher speed will get me a better score.


----------



## agent_x007 (May 10, 2020)

Full HD : Xeon E5-1680 v2 @4.3GHz + Titan Xp (CE) @ 2025/1604 = 6160 (all Air cooling)




1440p : Xeon E5-1680 v2 @4.3 + Titan Xp (CE) @ 1911/1604 = 4455


----------



## petedread (May 10, 2020)

2560x1440 score
Ryzen 9 3950X @ 4.6 + 2080 Ti 2160/1750 = 5318
Just realised I've been listing my core + memory clock instead of core + boost clock. Should it be idle clock and boost, or out-of-box boost and OC?
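Part of the confusion over which memory number to list is that GDDR/HBM clocks can be quoted either as the clock a monitoring tool reports or as the "effective" data rate from the spec sheet. A small sketch of the conversion; the helper and the multiplier table are my own illustration, not part of GPU-Z or Valley, though the multipliers are the usual ones for each memory type:

```python
# Hypothetical helper (not part of any benchmark or monitoring tool):
# convert a reported GPU memory clock into the "effective" data rate
# often quoted in specs. Multipliers per memory type are the commonly
# cited ones; treat them as an assumption.

EFFECTIVE_MULTIPLIER = {
    "GDDR5": 4,    # quad data rate
    "GDDR5X": 8,   # QDR with doubled prefetch
    "GDDR6": 8,
    "HBM2": 2,     # double data rate
}

def effective_mhz(reported_mhz, mem_type):
    return reported_mhz * EFFECTIVE_MULTIPLIER[mem_type]

# e.g. a 2080 Ti's 1750MHz GDDR6 is the familiar "14Gbps",
# and a 1080 Ti's 1376MHz GDDR5X is "11Gbps".
print(effective_mhz(1750, "GDDR6"))
print(effective_mhz(1376, "GDDR5X"))
```

Which is why memory figures in this thread range from ~1500 to ~12,600 for similar cards: different posters are quoting different points on this scale.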


----------



## jorj02 (Jun 21, 2020)

Is my score OK for a 1080 Ti?

1080 Ti, memory +500, core clock stock


----------



## uco73 (Jul 24, 2020)




----------



## sounik (Jul 29, 2020)

GTX 1080 @ 2126 core / 1550 memory, paired with an R5 3600 @ 4.6GHz and 3733/CL16 memory


----------



## Athlonite (Jul 29, 2020)

@RealNeil wrong benchmark for this thread. It's Unigine Valley, not Unigine Superposition.


----------



## the54thvoid (Jul 29, 2020)

Athlonite said:


> @RealNeil wrong benchmark for this thread It's Unigine Valley not Unigine SuperPosition




Moved to the Unigine Superposition thread for now. The benchmark thread is locked as the OP is no longer overseeing the scores.









(thread: Unigine Superposition, www.techpowerup.com)


----------



## Bantis (Sep 18, 2020)

AMD Ryzen 7 3800X: GTX 980 Ti Classified @ 1480 core / 1951 memory
Score: 7535



Extreme Score


----------



## KSMB (Aug 31, 2021)

1080p 0x AA // 1080p 8x AA

scores... OC'd GTX 970 +100 core // +200 mem (ASUS Strix, 4GB VRAM), using the new MSI Afterburner... awesome

Love the response on fiber internet. :)


----------



## sobr2005 (Oct 22, 2021)

8600K 4.4GHz
MSI 1080 Gaming,  GPU: 2025@975mV   mem:5700   temp.GPU.max:68
DDR4 32Gb@3800MHz

preset: Extreme HD  
1920x1080
121FPS@5082


----------



## P4-630 (Oct 22, 2021)

sobr2005 said:


> 8600K 4.4GHz
> MSI 1080 Gaming,  GPU: 2025@975mV   mem:5700   temp.GPU.max:68
> DDR4 32Gb@3800MHz
> 
> ...



Nice score


----------



## pokerapar88 (Nov 4, 2021)

Hey guys! Sharing my score over here. Just upgraded from a Palit RTX 3070 GameRock and was testing out the new card.
Thing is, I paid ~$600 for my 3070 in Dec 2020... traded my card + $400 for this almost-new 3080 Ti (2 months of usage). Not the best model out there, but performance is tight.


----------



## Petar666 (Jan 9, 2022)

EVGA FTW3 ULTRA 3080 Ti / 2100/21000 / 10900KF @ 5.30GHz / 2x8GB @ 4266MHz
FPS: 242.6
Score: 10149


----------



## AVATARAT (Feb 27, 2022)

Ryzen 5 5600x+PBO+CO Per Core
2x8GB DDR4@4000MHz 16-17-14-28-2T
RX 6800 XT Gaming OC 16GB @2680MHz / Mem 2150MHz
Driver 22.2.2
Win 11 21H2 (22000)

FPS: 203.3
Score: 8506


----------



## AVATARAT (Jul 21, 2022)

Ryzen 5 5600x+PBO+CO Per Core
2x8GB DDR4@4000MHz 16-17-14-28-2T
RX 6800 XT Gaming OC 16GB @2690MHz / Mem 2132MHz(17056)
Driver 22.6.1

*FPS: 219.9
Score: 9202



*


----------



## P4-630 (Jul 21, 2022)

P4-630 said:


> i7 6700K @ 4.3GHz / RTX2070 Super 1995/2013
> 
> View attachment 130628



New build, new CPU: i7 12700K @ stock speeds, RTX 2070 Super @ stock speeds.


----------

