No, that'd just be an oversight by the developers. Forgetting to place a light source would have the same result if they were using RT as well. Of course it's wrong, and it'd be wrong no matter what lighting technology you're using. I don't get the point of your comment; it's like you're implying that an accident on the devs' end somehow makes rasterization wrong. Makes no sense.
Well, OK, let's take a step back: games in general are meant to look realistic, right? Even in a fantasy game like, say, Zelda, the sun emits light, and when that light is blocked by something like a mountain, it should cast a shadow, etc. We get this.
RT makes that lighting realistic. Yes, at the cost of performance, but it's correct and realistic.
Rasterization is faked lighting (so to speak) and thus can end up looking unrealistic; we accept how it looks in general, but that doesn't make it correct.
My original question was: can't the devs do an RT pass ahead of time, just to see how the scene should look if the lighting were realistic, and then use that as a reference when authoring the faked lighting for rasterization, so it looks about as realistic as RT would without the runtime performance penalty?
(Now obviously I get that RT lighting is also dynamic and you'd lose that, sure, but there's plenty of static lighting that RT reveals to be more realistic and that could seemingly be faked easily. In Cyberpunk there's a bench with a light above it, used in Digital Foundry's RT on/off comparison video, that shows what I'm talking about.)
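To make the idea concrete, here's a rough toy sketch of what that "RT pass ahead of time" could look like, basically what engines call baking lighting into a lightmap. The scene (one point light, one box occluder over a ground plane) and all the names are made up purely for illustration, not how any particular engine does it:

```python
# A minimal sketch of the "trace once, sample forever" idea: offline, trace
# a shadow ray from each lightmap texel toward the light and store the
# result; at runtime the rasterizer just looks the stored value up.
import math

LIGHT_POS = (5.0, 4.0, 5.0)          # point light hanging above the ground
LIGHT_INTENSITY = 60.0
OCCLUDER_MIN = (4.0, 0.0, 3.0)       # axis-aligned box blocking some texels
OCCLUDER_MAX = (6.0, 2.0, 4.0)

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the segment from origin toward the light cross the box?"""
    t_near, t_far = 0.0, 1.0
    for axis in range(3):
        if abs(direction[axis]) < 1e-9:
            if not (box_min[axis] <= origin[axis] <= box_max[axis]):
                return False
            continue
        t0 = (box_min[axis] - origin[axis]) / direction[axis]
        t1 = (box_max[axis] - origin[axis]) / direction[axis]
        t0, t1 = min(t0, t1), max(t0, t1)
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_near > t_far:
            return False
    return True

def bake_lightmap(size=8):
    """Offline 'RT pass': one shadow ray per ground texel, store direct light."""
    lightmap = [[0.0] * size for _ in range(size)]
    for z in range(size):
        for x in range(size):
            point = (x + 0.5, 0.0, z + 0.5)            # texel centre on the ground
            to_light = tuple(l - p for l, p in zip(LIGHT_POS, point))
            dist2 = sum(c * c for c in to_light)
            if ray_hits_box(point, to_light, OCCLUDER_MIN, OCCLUDER_MAX):
                continue                                # in shadow: leave texel dark
            cos_term = to_light[1] / math.sqrt(dist2)   # light dir vs. up-facing normal
            lightmap[z][x] = LIGHT_INTENSITY * max(cos_term, 0.0) / dist2
    return lightmap

# "Runtime": the rasterizer only does a cheap lookup, no rays at all.
baked = bake_lightmap()
for row in baked:
    print(" ".join(f"{v:4.2f}" for v in row))
```

The expensive ray work happens once at build time; at runtime the rasterizer just samples the stored values, which is also why a baked result can't react to lights or geometry moving, the trade-off mentioned above.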