Does Cyberpunk 2077 support SLI?

Updated: 05.07.2024

The long-awaited project from CD Projekt RED has finally been released, but it ran into problems at launch. Players keep complaining about Cyberpunk 2077's shaky technical state and weak performance even on powerful graphics cards. The game's performance on the newest Nvidia and AMD card lineups has already been tested, and the trouble spots turned out to be ultra settings and ray tracing. But what about the average gamer who doesn't own a top-tier gaming monster?


We decided to check how Cyberpunk 2077 behaves on a budget PC and how many FPS you can get on the most popular graphics card of 2016. On top of that, we also ran the game on laptops to see how it performs on mobile GPUs.

GeForce GTX 1060

  • Processor — Intel Core i5-7500, 3.4 GHz
  • Graphics card — GeForce GTX 1060
  • 16 GB RAM

Back in the summer, Valve published its statistics on graphics card popularity in Steam. According to the company, the GeForce GTX 1060 holds a confident lead. Unfortunately, playing at high settings on a four-year-old graphics card is not comfortable: over a forty-minute session Cyberpunk 2077 averaged 26 frames per second, and the FPS counter fluctuates constantly. The heaviest drops happen in the densely populated districts of the city, although in some spots you can see more than 30 FPS. If you spend some time tuning the graphics and lower a few settings to medium or below, you can achieve higher and more stable numbers.

AMD Radeon RX 560X

  • Processor — Ryzen 5 3550H, 2.1 GHz
  • Graphics card — AMD Radeon RX 560X with 4 GB
  • 8 GB RAM

In terms of specs, this laptop is comparable to a budget PC. The processor is roughly on par with a desktop Intel Core i5-8400, and the graphics card with a GeForce GTX 1050 (except that it has more memory). Full HD resolution and the lowest graphics preset: under these conditions Cyberpunk 2077 averaged 24 FPS. Overall it was a real ordeal for the laptop, which also affected the quality of the recording. There is no way to improve performance any further here.

Gaming laptop — RTX 2060

  • Processor — Intel Core i7-10750H, 2.6 GHz
  • Graphics card — GeForce RTX 2060
  • 16 GB RAM

The best results came from the gaming laptop based on the GeForce RTX 2060: ray tracing enabled, the maximum graphics preset, and Full HD. With these settings Cyberpunk 2077 delivers a full 40 FPS, albeit with drops. In hectic shootouts or during fast driving the counter can fall as low as 28 frames per second, and cutscenes can push it even lower. But for maximum settings these are excellent results.

Cyberpunk 2077 is already available. You can buy the new game from CD Projekt RED for PC and consoles in the "Игромагаз" digital store. The game is also available in our prize catalog.





How to raise FPS as much as possible (I went through every setting one by one and compared):
1) The hungriest settings are the shadow settings (especially "Cascaded Shadows Resolution") and reflections; lower these first, to medium or even low. Next in line by cost are fog and clouds.
2) If you have an RTX card, DEFINITELY enable DLSS on the "Balanced" or "Quality" preset, even if you don't play with ray tracing and even if you have a 2080 Ti.
3) If you have a GTX/AMD card, enable FidelityFX on the static preset and lower the resolution percentage; even at 85% the picture is still reasonably sharp (see the sketch below for what that means in pixels). If needed, you can partially restore sharpness a) through the Nvidia control panel's sharpening setting, or b) with ReShade and the LumaSharpen filter (if you're fighting for every frame) or FilmicAnamorphicSharpen (if your GPU still has some headroom).
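
For reference, a quick illustration (my own, not from the original post) of what that 85% figure means, assuming the slider scales each axis of the render target:

```cpp
// Illustrative only: internal render resolution at an 85% per-axis scale,
// assuming the FidelityFX slider scales width and height independently.
#include <cstdio>

int main() {
    const double w = 1920.0, h = 1080.0; // Full HD output
    const double scale = 0.85;           // the percentage set in the menu
    std::printf("internal resolution: %.0f x %.0f (~%.0f%% of the pixels)\n",
                w * scale, h * scale, scale * scale * 100.0);
    return 0;
}
```

So at 85% the GPU is rendering roughly 1632 x 918, about 72% of the pixels, which is why the gain is noticeable while the image stays fairly sharp.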

P.S. My PC is not weak (9900K, RTX 2080). I play at 1440p/60 FPS, all settings on high-to-ultra except "Cascaded Shadows Resolution", which is on medium, and DLSS on "Quality". As a result, drops below 60 happen only in the most brutal firefights, and for 95% of the playtime the GPU load stays below 90%.

P.S.2: It has also been observed that on some hardware the game sometimes simply does not react to graphics settings changes at all and performance remains terrible regardless. In that case all you can do is wait for patches and, possibly, new drivers.


Here's my situation, by the way: an i7 4790K and a GTX 980 Ti. Whatever settings I set, it's always 25 FPS in the city and 45 indoors. I even dropped the resolution to the minimum and there was zero difference.


You made all of that up yourself, just like the Cyberpunk fanatics made everything else up. Nobody promised you 30 FPS in this game. CDPR isn't to blame for anything; it's the community that's rotten.


It might simply be bottlenecked by the low-thread-count CPU. I have a 10400F and a plain 980 (not a Ti); performance is complete garbage, but still not 25 frames.

Are the textures (which can only be changed before launching the game) also set to minimum, and crowd density too?


Anyone get their SLI setup working with Cyberpunk 2077?



Hey everyone, this thread is not about whether SLI is dead or not. So if you're commenting that SLI is dead, move on to the SLI support thread. This thread is about whether it currently works in Cyberpunk and, if so, how well it's doing. You're getting the fanboys riled up, and us normal people just want to know if SLI is working so we can skip ray tracing and just go ham on FPS.


The game is not out yet, how can people test 🙂


SLI is dead to you, perhaps; most likely you always buy single cards on a fixed budget. Unfortunately, the three comments here do not represent most top-tier enthusiasts, myself included. I use SLI/NVLink and have never had a single issue. SLI is dead to you, yes, but you do not represent all people.

+1, have had two 2080 Tis for some time. Have a new system with two 3090s on the way. Not dead.

SLI works in a lot of games, usually big-budget ones.

It is literally dead for gaming. DX12 does not support it without special effort and NV/developers are stopping that at the end of the year. Most have not supported SLI since around the 900/1000 series cards.

Edit: It will probably still run with SLI set up, but the second card will be a paperweight that just burns electricity.

For being such an enthusiast, you are fairly out of the loop, because the majority of developers stopped caring about SLI support some time ago. Nvidia officially ends implicit support on Jan. 1st; however, it might as well have already happened, because almost no new SLI profiles are being added to drivers.

Like everyone else stated, [explicit] SLI is still possible, but it requires the developers to care about it and code for it, as Nvidia has no hand in it anymore since it now has to be done through the DX12 API.
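
To make the "developers have to code for it through DX12" point concrete, here is a minimal, hedged C++ sketch (my own illustration, not anything from CDPR or Nvidia): it enumerates adapters with DXGI and asks D3D12 how many linked GPU nodes each one exposes. Everything beyond this detection step, i.e. actually splitting the frame across the nodes, is entirely on the engine.

```cpp
// Minimal sketch: enumerate GPUs and report how many linked "nodes" (GPUs
// joined by an SLI/NVLink bridge) D3D12 exposes on each adapter.
#include <windows.h>
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s\n", i, desc.Description);

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // A node count above 1 means the driver exposes a linked GPU group;
            // the game itself still has to distribute work across the nodes --
            // nothing happens automatically under DX12.
            wprintf(L"  D3D12 node count: %u\n", device->GetNodeCount());
        }
    }
    return 0;
}
```

Implicit SLI (the old driver-side AFR profiles) needed none of this from the game, which is exactly why its removal matters.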


Like the title says, I have not seen any news about CP 2077 supporting multi-gpu setups yet and I would like to know.


They did not announce anything as of yet, but who knows at this point. Personally, I don't really care for SLI or CrossFire anymore, as it doesn't give you that much of a performance boost for the price you put into it (price to performance).




If it does, then my SLI GTX 760 will work; if not, I'm below the minimum requirement.

No difference between SLI 3090 or a single card, so I'm going to say no. From someone else's testing, at least.

Well, it would be Nvidia's job to add SLI, but that really doesn't mean it will be playable either.

Well, SLI does work on Nvidia's side.
My question was whether Cyberpunk's engine was coded to be able to use/support SLI

(when it isn't, you get flickering shadows and other issues).


Yeah, I know. Quite sad, considering that back then an SLI 760 setup gave better performance than a Titan for half the price.

SLI was dead and rotting 5 years ago for gaming. The only place it still works well is with "professional" workloads like renders and stuff. Workstation kinda stuff. Like doing a 3D render for example, the scaling is nearly linear, maybe like 90% scaling.

My biggest disappointment with PC hardware was when I bought a second r9 390X to run in crossfire. Hot, hot trash. Lesson learned the hard way. If I had done even some basic research before breaking my wallet out I would have realized it. Oh well, buy once cry once.


Yeah, I know. That's why only the 3090 is compatible with SLI.

That being said, SLI should only be used to buy two cheap cards to match the performance of a more expensive one for less money. Some cards even use multiple GPUs to be more expensive.

So I don't get your point. Maybe you had an mATX case and your card didn't have room to breathe?



No I was running a Full ATX case on a standard ATX motherboard. Had like a million fans on that gargantuan case, with x4 120mm fans on the side panel. Plenty of room to breathe. The problem with scaling was how it was implemented in the game engine and how the cards had to share resources.

Some games might see up to 20% uplift from crossfire. Other games saw no improvement and some were even worse than a single card. Then regardless of performance gains or losses from crossfire, many games would introduce "microstutter" that wasn't visible in FPS counters but visible to the eye. But even a 20% uplift isn't worth 2x the cost. You're better off getting a single better card.

I actually would do the opposite of what you say. SLI should only be used for top tier cards when you're a working professional and time = money. Faster renders mean faster turn around time and more throughput to your clients, which translates to more money. If you're using cheaper cards you're better off spending that money on a single nicer card. Or if you're gaming just forget SLI/Crossfire completely.
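
Just to put a number on the "not worth 2x the cost" point above, a trivial back-of-the-envelope calculation (my own, using the 20% figure cited in the post):

```cpp
// Relative performance-per-dollar of adding a second identical card,
// using the best-case ~20% CrossFire uplift mentioned above.
#include <cstdio>

int main() {
    const double uplift = 1.20; // two cards vs. one, best case cited
    const double cost   = 2.00; // you paid for two cards
    std::printf("perf per dollar vs. a single card: %.2fx\n", uplift / cost); // ~0.60x
    return 0;
}
```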


Hm, true, I guess it depends on how well SLI is implemented by the software.

The issue with SLI was how incomplete it was: everything depended on the master graphics card, so you couldn't double your VRAM, etc.

IIRC there was a technology to replace SLI which does away with this master/slave architecture, but I guess it was dropped? Or is it only reserved for professional stuff like Quadro?


DirectX 12 was supposed to be the big thing that finally made SLI and CrossFire usable, since each card wouldn't have to duplicate its memory, so you really would double the VRAM capacity with two cards. But there was never really a great way to actually get it done, whether alternate frame rendering, rendering halves of the screen, or whatever. Maybe it will make a comeback once we have a faster connection between the CPU and GPU. That latency just kills you.
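
For what it's worth, here is a small sketch (my own illustration, assuming an SLI/NVLink-linked adapter; not how any particular game does it) of the explicit model DX12 offers: the driver exposes the pair as one adapter with two nodes, and the application addresses each GPU through a node mask. That per-node bookkeeping is exactly the work developers stopped doing.

```cpp
// Minimal sketch of DX12 "linked node" multi-GPU: one command queue per
// physical GPU, selected via the node mask. On a single GPU the loop runs once.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    const UINT nodes = device->GetNodeCount();
    std::printf("linked GPU nodes: %u\n", nodes);

    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    for (UINT node = 0; node < nodes; ++node) {
        desc.NodeMask = 1u << node; // bit n addresses physical GPU n
        ComPtr<ID3D12CommandQueue> queue;
        if (SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
            std::printf("  created a direct queue on node %u\n", node);
        // a real engine would also allocate per-node heaps/render targets and
        // alternate or split each frame's work between these queues
    }
    return 0;
}
```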


Is that NVLink ?
I think that was the name of "SLI 2.0" or "better SLI"
