RTX is finished

RTX is finished.
It's over.
Nvidia can fuck off with their RTX and overpriced garbage GPUs.

youtube.com/watch?v=1nqhkDm2_Tw

Attached: RayTRaced_Reflections_header.png (1280x720, 722K)

Attached: Screenshot_2019-03-18 NEON NOIR Real-Time Ray Traced Reflections - Achieved With CRYENGINE - YouTube (1280x668, 188K)

does anyone even use Cryengine?
looks too hard to use

en.wikipedia.org/wiki/List_of_CryEngine_games
youtube.com/watch?v=Pili9GcZNc0

You convinced me, I'm gonna buy AMD now

yeah no shit. it's not like there was ever actually specific dedicated 'raytracing hardware'. the RTX line just has even more expanded GPU compute capability, which is the direction GPUs have been developing in forever (literally the whole past decade, since the shader revolution).

it's actually one of the most popular true 'shovelware' engines, and it's quietly the actual tech base of other middleware, e.g. Amazon's game engine.

the race is on to full real-time raytraced rendering (one ray per pixel, cast out from the virtual view origin through the frustum defined by the physical display panel, bouncing from the first 'visible object' to a 'light source'). this combines neatly with voxel technologies and with procgen to become basically the end of CG as we've known it (UNLIMITED DETAIL).
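
to make that concrete, here's a toy sketch of the idea in plain C++ (my own made-up scene: one sphere, one light, one primary ray per pixel, no acceleration structure, nothing like production code):

#include <cmath>
#include <cstdio>

struct Vec { float x, y, z; };
Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator*(Vec a, float s) { return {a.x * s, a.y * s, a.z * s}; }
float dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { return a * (1.0f / std::sqrt(dot(a, a))); }

// ray-sphere intersection: distance t along the ray, or -1 on a miss
float hit(Vec orig, Vec dir, Vec center, float r) {
    Vec oc = orig - center;
    float b = dot(oc, dir), c = dot(oc, oc) - r * r, d = b * b - c;
    return d < 0 ? -1.0f : -b - std::sqrt(d);
}

int main() {
    const int W = 80, H = 40;        // the "display panel"
    Vec eye = {0, 0, 0};             // virtual view origin
    Vec ball = {0, 0, -3};           // the one 'visible object'
    Vec toLight = norm({1, 1, 1});   // direction to the 'light source'
    for (int y = 0; y < H; ++y, std::putchar('\n'))
        for (int x = 0; x < W; ++x) {
            // one ray per pixel, cast out through the frustum
            Vec dir = norm({(x - W / 2) / (float)H, (H / 2 - y) / (float)H, -1});
            float t = hit(eye, dir, ball, 1.0f);
            if (t < 0) { std::putchar(' '); continue; }
            // shade the first hit by how directly it faces the light
            Vec n = norm(eye + dir * t - ball);
            float l = dot(n, toLight);
            std::putchar(l > 0.6f ? '@' : l > 0.2f ? '+' : '.');
        }
}

a real renderer adds shadow rays, multiple bounces and an acceleration structure over millions of triangles; that traversal is the expensive part everyone is racing to make real-time.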

It works with any GPUs, retard.

Alright you convinced me, I'm going to buy Nvidia now.

It's pretty neat and I'd love to see RTX get fucked, but
>that
>at 4k
>on vega 56
>at an acceptable framerate

Doubt

Good boy

Alright you convinced me, I'm going to buy Matrox now.

Are you guys telling me that hardware raytracing was all just a big lie to sell more nvidia cards?

Attached: scepticalnig.jpg (426x426, 95K)

Bad goy.

fine, s3 it is

if everyone was still rocking 1024x768 we'd see raytracing in all games. as it is, video cards are barely keeping up with bigger screen resolutions (not to mention games with very buildable/destructible environments having to toss away a lot of the static-scene optimization techniques)
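
back of the envelope: 1024x768 is ~0.79M pixels while 4K (3840x2160) is ~8.3M, so at one primary ray per pixel that's over 10x the rays per frame before you add a single bounce or extra sample. ray cost scales linearly with pixel count, so every resolution bump eats compute headroom that could have gone to tracing instead.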

Don't even know what that is lmao.

The issue with CryEngine is that it's expensive and devs are bound by contract/license. It's a pain. Even those who used it try to switch in the end, like Star Citizen switching to Amazon's CryEngine copy, etc.

UE is overall the most popular proprietary engine for a reason: it's cheap (and now, with the Epic store, literally free), has low requirements, and looks great.

There has never once been a graphically good and well-optimized Unity game. Big publishers have their own engines they keep for themselves, like the shitty Frostbite (a real cancer of an engine being pushed everywhere), Bethesda stuck with Gamebryo, etc.

CryEngine is probably always ahead graphically, but not in optimization, price, or ease of development. UE4 wins. id Software should license out their engine, it seems great. And Valve should get their shit together and make a new Source.

I need a new video card, but every time I go AMD it fucks up in the most pathetic ways. Last time I owned one it literally couldn't handle Torchlight 2, and that game looks like Legos.

Why do so many publishers advertise their game's engine on startup even though they don't publicly list it for licensing? e.g. Konami with FOX.

I actually really want to try id Tech 6
or if Capcom would license some version of MT Framework

Nvidia overestimated how well their new gimmick could sell cards. Ray tracing is neat but (a) making it a focal point of your sales pitch when it's an immature technology that almost nobody supports well and (b) using it to justify price increases is fucking retarded. If it was just an "oh by the way our high-end cards now support hardware ray tracing and we hope developers can take advantage of it" without the accompanying price increases for the 2000 line, nobody would've given a shit.

Anyone remember PhysX cards?

Why do you think frostbite is shit? It looks great and runs great on every platform.

en.wikipedia.org/wiki/List_of_CryEngine_games
Cryengine is borderline dead at this point.

an open secret of the NVDA vs AMD battle is that it's not just about the hardware but about how well the drivers are optimized for popular games (and benchmarks). the best hardware in the world doesn't matter if some drunk dev makes a crappy shader for a popular game, so the driver writers will sometimes "fix" it on their end

>It looks great and runs great on every platform.
Yeah, let me try this Frostbite game.
>5 minutes later
It's still at the loading screen?! I even put it on an SSD!

Just see Anthem for the most recent disaster. Frostbite only remotely works in DICE's own hands, and even then it's shit as far as graphics and optimization go.

Well I hear it's a total bitch to use
youtube.com/watch?v=ponYwl8bEws
>TLDR
>developers keep updating it in the hope that it implements [some stuff Unreal already had]
>every update breaks some major system, this happens multiple times, so often that their programmers leave
>Timesplitters Rewind would have been done if it had not been for the fucking CryEngine.

I guess because on the developer side we keep hearing stories about how much of a pain it is to use, like with Mass Effect Andromeda.

AMD
>Overheats
>not very powerful
>bad drivers
>burns your PCIe slot
>no RTX or G-Sync
>no developer uses them
>slightly cheaper

Nvidia
>always stays under 10 degrees
>cools the rest of your PC
>saves you money on electricity
>new driver every week
>very powerful
>RTX, G-Sync, Tessellation, and many more unique features
>developers love them
>The Way It's Meant To Be Played
>slightly higher cost

>he doesn't have a Chrome 540 GTX
>probably runs powerVR

Yeah, I was talking about DICE games only. But seriously, the Battlefront games and BF1 (don't know about 5) run like butter and are easily among the top 10% of graphics on both console and PC.

Go back to /g/

This is happening now precisely because we're hitting major diminishing returns on simple resolution increases, just like the shader revolution was precipitated by diminishing returns on polycount. Even VR resolution is getting 'solved' sooner than anyone was expecting, through foveation (which, of course, ray tracing over voxels also enormously lends itself to). On conventional PC monitors, at conventional distances, there's actually very little reason to go beyond 1440p. 4K makes more sense for big-screen TVs, but even there we're seeing that other features (e.g. HDR or dynamic backlight) matter far more than just pushing resolution.
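
rough numbers, assuming the usual ~60 pixels-per-degree figure for 20/20 acuity: a 27" 1440p panel is ~109 PPI, and at a typical ~60 cm viewing distance one degree of your visual field covers about 0.41" of screen, so you get roughly 45 pixels per degree. the same panel at 4K (~163 PPI) lands around 67 ppd, just past the acuity limit, so at desk distance 4K is the last resolution bump that buys anything and everything beyond it is literally invisible.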

I still can't believe the goyim will do this for free

implying. Ubisoft uses CryEngine (originally from Crytek) for Far Cry, they just renamed their internal version to the Dunia Engine

Far Cry, Prey, Kingdom Come, etc. The issue is that at this point there are many versions of CryEngine: Amazon has its own copy, Ubisoft has its own, etc.

>Bethesda stuck with Gamebryo
Don't remind me

Technically, id Software belongs to ZeniMax, so they have access to id Tech 7. But the Elder Scrolls games are all based on the Creation Engine, and I'm not sure they're willing to switch, considering the development tools they have built around it.

Cryengine has a 5% royalty model now, similar to UE4. CryEngine 5 is way easier to use.

Bethesda's development team was never good enough for the mere engine to be what bottlenecks their potential. And ZeniMax's roster is fucking based, as pointed out above.

all these pointless performance-hogging graphical upgrades will ensure that games stay stuck in the Xbox 360 era of gameplay and interaction

The main issue with Frostbite is that since it was designed almost entirely as an FPS engine, major changes have to be made to get it to do other things well, if various devs' complaints are anything to go by.

Source 2 exists and has been deployed in Dota 2 for multiple years now.

>"one of the engine's architect at Ubisoft Montreal, the state of the Dunia Engine as of 2017 includes "vegetation, fire simulation, destruction, vehicles, systemic AI, wildlife, weather, day/night cycles, [and] non linear storytelling" which are all fundamental elements of the Far Cry games, and little of the original CryEngine code remained in the current version"
I guess by that logic I should say that Titanfall is still using the Quake engine.

id Tech engines are not suited for RPGs.

no, the games being multiplatform and developed in part for consoles is why games are stuck where they are

id Tech has been trying to do open world games since Rage, and Rage 2 is about to be released, in case you didn't notice.

>AMD gpu market share plummets below 18.1%
>damage control threads ensue
Oh no no no no no, glad to be on the winning side of history

oh yeah, they really are going through with that piece of shit aren't they?

though, the main thing for an RPG engine is not graphics or the size of the world, but how many thousands of scripts it can run simultaneously

that's not proof of anything, just something somebody said

also filtered the retarded monkey

Eh.
Rage ran on id Tech pretty well, and Evil Within 1 and 2 ran on it quite well (don't forget that Evil Within 2 is also open world)

Instancing could be a problem I suppose but it doesn't seem like something completely impossible for it

Rage is a good example of how retarded and narrow minded Carmack is.

shitposters and retards don't win at anything

>Thinking Nvidia is retarded enough to develop something for a decade just to lie

He tries until he succeeds, and he always does.
youtube.com/watch?v=N6ilbuS3EqI

gamasutra.com/view/news/311123/GDC_Speaker_QA_Building_the_data_pipeline_for_Ubisofts_Far_Cry_5.php
Are you honestly gonna challenge a game engine architect's statement?

>cools the rest of your PC
what

>bitcoin drops
>Nvidia goes bankrupt

Role-playing game means something different to me, something different to you, and something different to the nerd who thinks stories are irrelevant in video games.

you literally made no point whatsoever, congratulations

yeah RTX is a meme, I don't care about 100% accurate shadows/reflections if it means a huge performance dip on something I barely pay attention to anyway.

just give me raw power for fps. maybe we'll get some actual uses for ray tracing beyond graphics, but I doubt it.

>>Thinking Nvidia is retarded enough to develop something for a decade just to lie
They are retarded enough.
RTX is non-existent on their current 20xx gen; devs openly admit that real ray tracing can't be done properly now and most likely won't be for the next 1-2 generations either.
The only games with RTX are Metro and BFV, and both BARELY support it, in a couple of scenes, and it destroys performance.
Not to mention that when you compare RTX on/off, it doesn't even look any better with RTX on.

A first-person shooter with a few RPG elements tacked on cosmetically is not an RPG.
id Tech engines are made for first-person shooter games.

You're a short-sighted idiot. The point is that ray tracing is the future of 3D rendering. It's actual rendering, not the shitty stopgap we use now. We have reached the logical end of rendering as we know it, because we have to fake everything. To achieve actually realistic graphics, we have to use tracing. I'm not saying RTX does this well, not yet anyway, but it's the right direction.

Probably Kojima wanted to, but everyone knows what happened in the end.

GPUs are almost like Samsung Galaxy phones: they bring nothing new and are just a waste of money.
My GTX 1080 is still maxing out all games at 1080/60fps.
Can't wait to pirate the new Doom in some time. Such is the life of the cp master race.

Attached: tFQcCpHpAy.jpg (492x627, 62K)

I like the StarCraft 2 engine.

Please cease posting, you have absolutely no idea what you're talking about and make zero sense.

RPGs have zero special engine requirements and restrictions. You could run one on the DOS command line, for fuck's sake. Engine supports scripting and UI? It's fit for an RPG.

>only reflections
>not even running a game on top of it yet and it only gets 30fps
>it will end up using DXR, which is what RTX is made for anyway
It was supposed to be our time, AMDbros...

Attached: 1550849278222.png (396x408, 165K)

I think we're being tricked by the phrasing here:
What's being said is that it was rendered on that card, and the 'real-time rendering' claim never appears on the same screen as the mention of the Vega 56.

So probably: an engine trick that makes real-time raytracing possible, but one even a 2080 Ti couldn't pull off at acceptable framerates. This clip was fully rendered on a Vega 56, but over the course of several hours.

Attached: mwuh.gif (480x262, 489K)

>no, the games being multiplatform and developed in part for consoles is why games are stuck where they are
This.
Consoles dictate progress, and that's exactly why everything looks like shit and is poorly optimized on PC as well.

How the fuck can we have any actual progress when you have to build your tech and develop your games around $300 consoles with borderline smartphone hardware? Garbage like the PS4 / PS4 Pro is outdated as fuck, and the PS5 won't be any better, because obviously you won't put actually good hardware in a cheap box, and the cheap box is the main selling point of consoles.

what is satire?

wow your 1000 dollar GPU from 2 years ago can still play games at 1080p?
amazing

That's what I thought too. The wording seems like ad-speak for "This was rendered using an AMD GPU, but isn't running in real time."

>My GTX 1080 is still maxing out all games at 1080/60fps.
WOOOOW!!!!!
AMAZING!!!!!!!!!!!!!!
Maxing out at FULL HD and SIXTY FPS. CRAZY!

Fucking retard, have you ever heard of 144Hz monitors, or of 1440p / 2160p?

>cp

Attached: 1544778624770.png (1085x1217, 1.27M)

>the newest ferrari is a waste of money
>my honda accord gets me to work just fine

everything faster/stronger/better can be dismissed as a waste of time if you decide to live in the past.

Attached: amd.png (601x44, 8K)

Yes, and 4K is still unattainable at max settings and 60 fps for all cards, and a 1440p monitor is not worth the upgrade.
I'll upgrade when a real 4K card comes out, so probably in 2-3 years' time.

>real-time-in-editor

No, you braindead faggot.
The Quake 3 engine is great for first-person shooters.
Morrowind's engine is great for role-playing games.

What you are talking about is the same mistake that braindead faggot Chris Roberts made when he chose CryEngine for Scam Citizen.

Get cancer, braindead tool.

So can real-time ray tracing just be done with compute? Was RTX another one of Nvidia's "X feature is happening soon, get in early, let us lock you into our ecosystem" schemes, like G-Sync, PhysX, etc.?
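
for reference, the "just compute" formulation people mean looks something like this hedged toy sketch (made-up scene values, a CPU loop standing in for a GPU dispatch): one independent work-item per pixel with no shared state, which is exactly the shape a compute shader wants:

#include <cmath>
#include <cstdio>
#include <vector>

// one work-item per pixel: trivially parallel, so the same body could run
// as a GPU compute kernel. (toy values, not any vendor's actual API)
float shade(int idx, int W, int H) {
    int x = idx % W, y = idx / W;
    float dx = (x - W / 2) / (float)H, dy = (H / 2 - y) / (float)H, dz = -1.0f;
    float inv = 1.0f / std::sqrt(dx * dx + dy * dy + dz * dz);
    dx *= inv; dy *= inv; dz *= inv;
    // ray from the origin vs. a unit sphere at (0,0,-3): oc = (0,0,3)
    float b = 3.0f * dz, c = 9.0f - 1.0f, d = b * b - c;
    return d < 0 ? 0.0f : 1.0f;   // hit mask; a real kernel would shade
}

int main() {
    const int W = 64, H = 32;
    std::vector<float> fb(W * H);
    for (int i = 0; i < W * H; ++i)   // stand-in for a parallel dispatch
        fb[i] = shade(i, W, H);
    for (int y = 0; y < H; ++y, std::putchar('\n'))
        for (int x = 0; x < W; ++x)
            std::putchar(fb[y * W + x] > 0 ? '#' : ' ');
}

as far as I know, what the RT cores actually accelerate is BVH traversal and ray-triangle testing against real scene geometry, the part that murders a pure compute implementation at scale. so yes, it runs anywhere; the question is only speed.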

Real time in editor is actually more impressive because the editor has extra overhead that isn't present on a running exe.

No, but Titanfall and Apex Legends are still using the Source engine. And Source never started as a fork of GoldSrc, which was a fork of Quake.

>The Quake 3 engine is great for first-person shooters.
>Morrowind's engine is great for role-playing games.
Look at this faggot and laugh. Not sure if you're a 12-year-old kid with absolutely no clue, or just so unimaginably dumb you don't know what the fuck an engine is and what game development involves.

Really, do not ever post about things you have not the slightest clue about, please.

so they did all that just to impress us? doubtful.
There's obviously some catch here, because if they could actually pull this off in-engine, in-game, they would have slathered it all over the video.
That's a much better and more convincing pitch than hiding what's actually going on behind a careful title and cautious in-video claims.

id Tech 7, or whatever they call it now, is a derivative of the Quake 3 engine.
ZeniMax renamed the Quake 3 engine to id Tech 3 so they could copyright it, since the Quake 3 source code and engine are open source now.

Go play some walking simulator, youngling, and follow a gps arrow...

While it's not the latest and greatest, a 1080 is not a fucking Honda Accord either.
But you know that, don't you, shill. The goyim must be made to buy buy buy!

Only retards use "max settings", because in most cases they're indistinguishable from high or even mid settings. Most of the time it's just a useless resource hog.

A 2080 / 2080 Ti can do 4K/60fps no problem if you don't plan to use garbage RTX.

>John Carmack - "There are still bits of early Quake code in Half Life 2..."
web.archive.org/web/20060517235354/http://www.armadilloaerospace.com/n.x/johnc/recent updates/archive?news_id=290
>When we were getting very close to releasing Half-Life 1 (less than a week or so), we found there were already some projects that we needed to start working on, but we couldn't risk checking in code to the shipping version of the game. At that point we forked off the code in VSS to be both $/Goldsrc and $/Src. Over the next few years, we used these terms internally as "Goldsource" and "Source". At least initially, the Goldsrc branch of code referred to the codebase that was currently released, and Src referred to the next set of more risky technology that we were working on. When it came down to show Half-Life 2 for the first time at E3, it was part of our internal communication to refer to the "Source" engine vs. the "Goldsource" engine, and the name stuck.
developer.valvesoftware.com/w/index.php?title=User_talk:Erik_Johnson&oldid=10088#GoldSource

I'll humor you and, even under suspicion that this is elaborate trolling, actually try to educate you, as the fact of your existence is embarrassing: it casts a shadow on humanity and lowers the overall IQ of the planet.

When people compare "engines" they're talking only and merely about graphics rendering engines, nothing else. Scripting, game mechanics, physics, pretty much everything else is entirely separate and done on top: most developers build their own tools and backend for the specific type of game they're making, be it action, RPG, RTS or whatever.

Some proprietary engines come with a wider set of development tools/libraries to make getting started easier, e.g. Gamebryo (that's the Morrowind engine, which is basically just a bunch of generic C++ libraries and tools; Bethesda developed all the game systems and dev tools entirely themselves and struggled for years to make it even remotely workable) or Unity (with easy tools for amateurs to make a game from scratch). But that's a bonus, not a requirement. In general, devs have to CODE their fucking game and development tools themselves, and it doesn't matter one fuck what graphics engine it is for any type of game, aside from restrictions like how well it can portray an open world or how many actors it can show.

Making an RPG game, it doesn't matter one fuck which engine you use at all; the engine only matters for graphics. You still have to code your own game systems. You can, if you care, buy and attach proprietary scripting/dev libraries to any engine too.

RPGs impose zero special restrictions on engine requirements compared to any other game. Open-world RPGs are a bit different, as they require a high script load and rendering of big spaces (which, btw, Bethesda handles awfully), but that's it.

Fucking RPG Maker VX is the epitome of an "RPG engine".
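
since you clearly need it spelled out, here's the split as a toy C++ sketch (hypothetical names I made up, not any real engine's API): the "engine" part draws, and the RPG systems are ordinary code bolted on top that neither knows nor cares what's underneath:

#include <cstdio>
#include <functional>
#include <vector>

// the "engine": all it knows how to do is draw. could be CryEngine, id Tech,
// or a DOS console for all the game code below cares.
struct Renderer {
    void draw(const char* s) { std::puts(s); }
};

// the "game systems": quests, stats, dialogue. none of this is the engine's
// business; it's ordinary code the devs write (or license) themselves.
struct QuestSystem {
    std::vector<std::function<void()>> scripts;
    void tick() { for (auto& s : scripts) s(); }
};

int main() {
    Renderer gfx;
    QuestSystem quests;
    int rats = 0;
    quests.scripts.push_back([&rats] {
        if (++rats == 3) std::puts("quest complete: kill 3 rats");
    });
    for (int frame = 0; frame < 3; ++frame) {
        quests.tick();                        // game logic, engine-agnostic
        gfx.draw("engine renders a frame");   // the only part the "engine" owns
    }
}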

NINTENDO HIRE THIS GUY

>HE NOT NICE HE SHILL
I own a 1070 myself, so if this were a dick measuring contest, I'd lose.

You just caught my eye since you're struggling so hard with being poor that you actually have to shit on progress to cope. New GPUs are absolutely better, and bring more performance and more technology to the table.

So what graphics card should I go with if I have £400 to spend?
I want to avoid RTX cards like the plague.

>everything faster/stronger/better can be dismissed as a waste of time if you decide to live in the past.
You must be retarded to buy a 1080 Ti / 2080 / 2080 Ti for fucking 1080p.
Even a 1080 is too much for it; a 1060 would be enough.

>Dota 2
wow, it's fucking nothing, show me it running a taxing game, not potatoware

You stink of cancer, rot, burnt plastic, judaism and reddit.
The graphics rendering engine is just one part of what I call an engine.
Text chat, sound playback, multiplayer and networking, NPC or bot scripting and AI, etc. are other parts.

id Tech 7 doesn't have some of the parts required for making a role-playing game, not even in rudimentary form.

>RPG game
lol

>tfw you game at 1080/60 fps but do rendering and other modeling shit so you need an expensive gpu anyways
Shit hurts me

Attached: Moth girls have ruined me.png (252x340, 77K)

Might just get a better monitor if that's the case, user.

Rage 2 isn't on id Tech.

Too bad AMD doesn't make high-end GPUs.

I got blown the fuck out. Thanks.

>small street with only a few reflections going on at any given time
>this is somehow a comparison to games with huge open levels that force raytracing onto everything even though you only see it sometimes
Yeah okay, it's almost like needing way less GPU power makes it way easier to run.

Attached: 29bebcbf6f400816ad0b5eb903926e4e.png (930x894, 398K)

Which type of calculations does ray tracing use? FP32? FP16? A mix?

>real ray tracing can't be done properly now and most likely won't be for the next 1-2 generations either
True when you're talking about consoles. It would be a great thing for PCs if they managed to make it easily moddable, like injecting an ENB.

When you say that RTX destroys performance, you should put it in caps: ray tracing DESTROYS performance. The game goes from 160 fps to 25 with that shit on, on a 2080. Who the fuck plays FPS games at such a low framerate, when anything under 60 is borderline unplayable?

Attached: 1455642801028.gif (416x414, 1.63M)

If you can't see it, it's not being calculated. No one's wasting power. The demo is small because why the fuck would you make a giant open world just to showcase reflections?

>kikes won't articulate plans that can span decades, if it means they get more money at the end of it
you underestimate them, and that shall be your downfall

It was pretty weird. “We know you’ve been waiting a long time for a GPU upgrade, so here’s a massively more expensive piece of hardware with minimal speed gains tied to tech that will be supported by two games in the first six months.”

Their massive growth was driven by huge performance leaps with incremental price increases. Then they drop an incremental performance leap with a huge price increase.

I called it as a failure the day it was announced and Nvidia isn’t paying me shit.

It's funny that they're now learning their lesson in the mid-range. The 1660 - 2060 are actually decent improvements over their predecessors in price/performance.

You still have to calculate rays bouncing off geometry the player can't actually see for ray tracing to work in the first place.
But my point is you can't compare the performance of this tech demo to the current implementation of RTX in games at all, because this is the optimal case for the tech (dark streets).

Attached: 34962498d6f60fbb57e88b3d578f65d9.png (1098x1216, 576K)

I'm not a silicon expert so I might sound like a retard now, but I wonder what the 2080 Ti would have been had they taken out the RTX hardware and used that space to add more GTX hardware; in other words, what regular performance did we give up for this gimmickry? The 1660 Ti is this hardware design without RTX and it's pretty fucking good, as you point out. And this is before a die shrink to 7nm.

The good thing is that with the failure of this series, Nvidia is likely to push out the 3xxx gen faster.

I'm just waiting for the day we can do this without hybrid rendering methods. Though I admit, Unreal's ray tracing implementation looks really promising, but their denoising is kinda wonky atm.

Attached: D1iqdF-X4AApvrE.jpg (1200x1013, 188K)

that doesn't look real at all

unironically looks like the sort of graphics that the fucking alien from Toonami had back in the early-to-mid 2000s

I think the general aim is real-time rendering of highly detailed lighting and reflections rather than realism, though you can't have the latter without the former.