Tuesday, February 9th 2021
CD Projekt RED Hacked, Attacker Claims to Have "Cyberpunk 2077" and "The Witcher 3" Source Code
CD Projekt RED just announced that it has been hit by a cyber-attack on its internal network, with the attacker having gained access to certain sensitive information belonging to the CD Projekt group. In a press note posted to Twitter, the studio included a screenshot of a plain-text ransom note left on its servers by the attacker, who claims to possess the source code of the company's most popular titles, including "Cyberpunk 2077," "The Witcher 3: Wild Hunt," "Gwent," and an unreleased version of "The Witcher 3" (possibly a remaster). They also claim to have confidential documents related to CDPR's financial accounting, administration, legal, HR, IR, and more. The note ends with information on how to reach out to the attacker to discuss ransom within 48 hours. CDPR announced that it will not give in to the demands of the attacker, and has reached out to law enforcement.
Source:
CD Projekt RED (Twitter)
75 Comments on CD Projekt RED Hacked, Attacker Claims to Have "Cyberpunk 2077" and "The Witcher 3" Source Code
And eh... is it really a loss to a triple-A company to have some weird knockoffs of absolute gutter trash on the market?
It's an illusion that you can control data like that, despite what many love to believe. We think that by having a metric ton of rules we can prevent stuff from happening. Lol. Right. If anything, we should know better by now... not a single company, system, or group of people has proven immune to all things human. The best you can strive for is 'mitigation'. Data management is like Covid in that sense: good luck getting to zero incidents. And there is a point at which rules take a major toll on the usability of a system (or a society), creating rebellious users who are more inclined not to stick to them (and create leaks); or some event happens that makes people unhappy and willing to break the rules outright.
This is why transparency of data can be such a powerful tool, and why it is in some ways inevitable. If you have to spend more money securing the data than you can make off it, what's the point? That's herd immunity, right there: expose the data, and it's no longer valuable.
- Keeping systems up to date (especially servers, firewalls, etc.)
- Not running Windows on servers
- Restricting access to resources on a per-user basis (no common universal passwords shared by everyone); this also allows easily revoking access when someone leaves
- Having segmented networks, with firewalls or VPNs controlling how and which computers can access each other
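The per-user access point above can be sketched in a few lines. This is a purely hypothetical illustration (the `AccessControl` class and the resource names are made up, not any real product): each user carries individual grants instead of everyone sharing one universal password, so offboarding becomes a single revoke call.

```python
# Hypothetical sketch of per-user access control with easy revocation.
# Not a real product's API; just illustrates the idea from the list above.

class AccessControl:
    def __init__(self):
        self.grants = {}  # user -> set of resources that user may access

    def grant(self, user, resource):
        """Give one user access to one resource."""
        self.grants.setdefault(user, set()).add(resource)

    def can_access(self, user, resource):
        """Check an individual grant; no shared passwords involved."""
        return resource in self.grants.get(user, set())

    def revoke_user(self, user):
        """Offboarding: one call removes every grant this user had."""
        self.grants.pop(user, None)

acl = AccessControl()
acl.grant("alice", "source-repo")
acl.grant("alice", "build-server")
print(acl.can_access("alice", "source-repo"))  # True
acl.revoke_user("alice")                       # employee leaves
print(acl.can_access("alice", "source-repo"))  # False
```

With a shared password, revoking one person means rotating the credential for everyone; with per-user grants, it is one line.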
And most importantly, you do security in layers. Sooner or later, one layer may be compromised, so detection and damage control are essential. Far too many companies operate in a way where a single virus or bad actor on any one computer can steal or damage everything. Many companies with thousands of employees have been hurt by a single compromised machine; if basic security practices had been followed, this wouldn't have been possible.
But often, the damage from accidents or incompetence can be even worse. I know of a concrete case where a sysadmin at one company ran rm in the wrong folder on their main source repository server! Oops, hundreds of projects gone. Luckily they had daily backups, but still, a lot of work was lost for thousands of engineers.
*)
Some things may be hard to keep up-to-date backups of, and losing even a few hours of work can sometimes be very costly. Just give them 10,000 man-hours, and then maybe…
The flaws of this game are too severe to be fixed by a few tweaks. I think this fear is mostly old thinking: that source code is incredibly valuable, and that competitors would "steal" it and use it for competing products.
But source code isn't something that is so easily adapted to other projects. Even if your project has a super smart algorithm that I want, chances are it will be harder for me to integrate yours than to write my own. And I would argue that the bigger the source, the harder it is to adapt to your own purposes. In software engineering there is a lot of specific knowledge known only to those who wrote the source. If someone got their hands on a completed game engine, it would take them years to get familiar with the code base and redesign it to fit their own needs. By that time, does it really matter that much?
(Leaking an unfinished product is different though)
Major source code leaks have happened for years, for games, for Windows itself, and even for hardware designs (from Nintendo). I haven't seen the immediate emergence of cloned/derived software from any of them.
I hope we can get to a point where it's more common for game source code to be available (though not necessarily free). It can still be protected by copyright, so if another company uses it without permission, the owner can still sue. And if some random guy uses it, who cares, really? Companies have a lot to gain from embracing their enthusiast bases, who can provide a lot of cool additions or improvements to a product, for free.
Or you can make Geralt pr0n with it. Well, yay. And hello, modding scene, what did you do all these years? Those markets are perfectly capable of imploding on their own. Gaming is like a learning curve: people get new systems to chew on, new mechanics, and they fall for all the psychological trickery involved, then get wiser. We all did. We all played our share of MMOs and grindy games with well-managed dopamine shots and flashy bars filling up. We all raced for those high scores: the leaderboards in the arcade, being the first guy to finish a race.
Many of those concepts have been given more depth or are otherwise filled with new takes on those dopamine shots and how to manage them more effectively, fooling us once again ;) Recall lootboxes, MTX, skins, seasonal content, chapter-based releases, dailies, weeklies, monthlies, seasonal events, etc. etc. etc., ad infinitum. That is also why those older concepts work so well in the mobile space: lots of new, gullible gamers. And the worst thing is, many of them have never known how glorious and worry/money-free gaming could actually be. 'Pay 10 bucks to reset a timer that you're going to see a few hundred more times' - and people do it.
The unfortunate thing is, it seems a lot of people also don't get much wiser. Is that an opportunity lost? The best way to combat that style of game is by bringing out a real game. It's disturbingly easy, and it's part of the reason for CDPR's success. They did what few others were still doing and showed everyone there was still serious money in serious games. And not just them, of course, but you get me.
What about a company not having its security in check? Never leave source code on a machine that's hooked up to the internet.
A friend of humanity...
As for the attack: First of all, damn script kiddies. Looks like another wannabe hacker - a script kiddie using Cobalt Strike. Kudos to CDPR for having a robust enough infrastructure not to care; the sysadmins should get a hefty raise. Second, losing source code isn't as huge a deal as some seem to think. No one is going to create and sell copies of the games, and no sane company is going to use stolen data in their products. Some top-tier modders might use the leaked REDEngine files in their work, but that's about it. Leaked HR/administration data can lead to targeted attacks and/or things like identity theft, but no more than someone digging your electricity bill out of the trash - which actually happens, so at least use a permanent marker to cover your data. Seriously.
That said, that's just my take on it. There's no hatred in my heart for the 'console peasants', I just enjoy what I do and hopefully everybody else does, too.
I'm not looking down on "console peasants", and I've most certainly had some clashes with the elitist "PCMR" types. Everyone uses what they want or can, and there is no place for feelings of superiority or inferiority in that. That being said, playing a complex game like Cyberpunk 2077 on a console is like taking an old, rusty bike onto a hard singletrack - you absolutely can, but your experience will be far from optimal, and you need to understand that you will contend with a lot of problems.
Statements like these also ignore the quite large amount of training and conditioning required just to actually perceive high-performance gaming as better. I'm not disputing whether "people can actually see more than 60Hz" or other nonsense like that (human perception can rarely be reduced to a single number, and is extremely dynamic and context-dependent, and also generally friggin amazing), but there's a distinct difference between being able to vaguely perceive that something is slightly different - such as motion being smoother - and being able to actually pinpoint why that is, whether and how it affects gameplay, and whether it is actually to the better. That requires training. Outside of "hardcore gamers" and hardware enthusiasts, it's likely that the average person would be able to perceive some sort of difference between, say, playing a fast-paced game at 60Hz or 120Hz, but the more important question is: would they care? Most likely not. And if you don't care, then the difference is rendered moot.
PC gamers and hardware enthusiasts have been in a decades-long cycle of conditioning into the belief that higher performance is always better (both by the community and importantly by marketing departments of PC, component and peripheral brands), which is strongly tied into the types of games that are presented as "real games" and the types of people seen as "real gamers". It's all a question of perception, identity and gatekeeping. Depending on the game there are obviously situations where higher performance can give an advantage, or where too low performance can make a game unplayable, but extrapolating this into an "objective truth" that PC gaming is universally superior is just pure, unadulterated nonsense. PC gaming discourse loves to present itself as if we have somehow gained access to a set of universal objective truths that other people simply don't know about, which just illustrates that the original irony of the term "PC master race" has long since been abandoned. The preference for high-performance gaming is just that: a preference. Taste. And taste is what? Subjective.
Now, this rant did come slightly out of the blue, and I'm by no means trying to "get" you in any way, nor trying to somehow say you're disingenuous in what you're saying - this is directed towards everyone and no-one, and certainly not you in particular, your statement just got me going down this path. But I think we all stand to gain from deluding ourselves less, and this is a case where that's pretty simple. There is no "objective truth" to the superiority of PC gaming. Humans do not engage with the world in non-subjective ways. It's a matter of preference. And having preferences is, in case it wasn't obvious, not just perfectly fine, but a core condition of being human. But presenting our preferences as if they were objective truths is a rhetorical and intellectual bad-faith tactic that serves no other purpose than to elevate "us" above "them". And that's a dangerous path to go down.
/rant
We hear a lot about massive conspiracies from government sponsored hackers using zero-day vulnerabilities or creative techniques to communicate with air-gapped devices, or techniques that can read your data through electromagnetic noise etc., but in reality, in 99.9% of cases people just fail the well-known basics.
Big question is: did Twitter ban said hacker? lol
"It's either a fundamental lack of security, or an inside job (or both). Basic common security practices would have prevented the scale of this. Whoever was in charge of their security should probably be fired."
You've probably never worked in a corporate environment. The scale of such infrastructure simply requires some degree of simplification - you can harden anything to the point of being anything-proof, but then your employees will spend more time on security checklists than on actual work. And I can guarantee there will be some mid-level douchebag manager with a superiority complex who writes down all his passwords on a piece of paper taped to the screen, and a cleaning lady who will take that paper and sell the passwords. For tools like Cobalt Strike, that's enough: get a foothold, elevate from there. It may take some time and effort, but once you're in, you're in.
This happens to every company at some point, it's how they deal with the problem that shows their quality. Many big corporations actually prefer to pay and keep it quiet to avoid bad PR.
But once you've got the basics right, you can consider further steps, including some kind of intrusion detection, honeypots, etc. But as you mentioned, taking things too far can result in wasted resources, or even in people finding ways around the system. This is why it's important to get the fundamentals right first, instead of focusing on niche problems. And taking a balanced overall approach to good security isn't that hard.
As for the rest - I really hope you are not responsible for any kind of security system. Just read about privilege escalation; it's elementary basics. You can sandbox users or segment your network to your heart's content, but if someone gets a foothold and starts snooping around, escalating horizontally and vertically, the only limiting factor is the attacker's skill. And we live in times when the tools you can get for free, buy, or even rent are sophisticated to the point where such attacks don't really require years of learning. Hence my "damn script kiddies" remark, as everything points to a medium-skill, open-source tool being used in this attack.
And I'm not saying that you're consciously expressing any kind of derogatory attitude either, nor do I have any basis whatsoever to comment on your opinions or ideals. But it's a rather simple fact that saying (whether to someone directly or to nobody in particular) "your choice of how to perform [activity A] is fine, but mine is better due to the criteria my peers and I have chosen to constitute better performance of [activity A]" is an inherently derogatory statement, and that this discourse in PC gaming is fundamentally derogatory. That's just facts. And again, there are good arguments for why aspects of PC gaming are better, but choosing to focus on those aspects is a choice, and thus subjective. The very claim of "my opinion is objectively true" (especially, but not exclusively, when it actually isn't) is itself derogatory. I also think PC gaming is generally better than console gaming, but that's my taste and my opinion, and no amount of arguments for why I hold that opinion will make it anything but an opinion. The derogatory part comes built in when one starts assuming that one's opinions are somehow objective facts, because that also necessitates the belief that the opinions of others are factually wrong. Which, again, you can think they are, but that's another opinion. It's opinions all the way down.
I'm talking about the nature of some of the bugs. Those of us who have worked with graphics programming for many years develop a good sense for understanding a bug just by observing it, and can tell whether something is, e.g., a shading bug, a culling bug (there were some of those too), a synchronization issue, or flawed physics in the game simulation, without being familiar with the code base. Quite often, small graphical glitches are more of an annoyance, while a flawed game simulation can often be "game breaking".
Sometimes you can observe how well the game simulation is made by running it on a CPU that's barely fast enough. If you start to see objects flying around as a result of stutter (CPU-limited, not GPU-limited), it probably means the engine is feeding raw frame time deltas into the acceleration calculations. And if stutter causes objects to go through each other, fall through the ground, etc., that probably means the simulation has "skipped" a few iterations by taking one oversized step. Both are evidence of poor engine design.
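The failure mode described above can be demonstrated with a toy simulation (all names and numbers here are made up for illustration, not taken from any real engine): a body dropped onto a platform, integrated with semi-implicit Euler. Feeding raw frame deltas into the integrator lets one long stutter frame carry the body clean past the platform's collision check, while a fixed-timestep accumulator splits the same stutter into many small steps and lands correctly.

```python
# Hypothetical toy physics (not any real engine's code): a body dropped
# onto a 0.3 m thick platform at y = 0. The collision test only looks at
# where the body ends up after a step, so one oversized frame delta lets
# it jump straight past the platform - the "fell through the ground" bug.

GRAVITY = -9.8
PLATFORM_TOP = 0.0
PLATFORM_BOTTOM = -0.3

def step(y, vy, dt):
    """One semi-implicit Euler step with a naive step-end collision check."""
    vy += GRAVITY * dt
    y += vy * dt
    if PLATFORM_BOTTOM <= y <= PLATFORM_TOP:  # ended up inside the slab?
        y, vy = PLATFORM_TOP, 0.0             # land on top of the platform
    return y, vy

def simulate_naive(frame_deltas):
    """One physics step per rendered frame, fed the raw frame delta."""
    y, vy = 5.0, 0.0
    for dt in frame_deltas:
        y, vy = step(y, vy, dt)
    return y

def simulate_fixed(frame_deltas, h=1.0 / 60.0):
    """Accumulate frame time, then run as many fixed-size steps as fit."""
    y, vy, acc = 5.0, 0.0, 0.0
    for dt in frame_deltas:
        acc += dt
        while acc >= h:
            y, vy = step(y, vy, h)
            acc -= h
    return y

# Smooth 60 FPS frames with a single half-second stutter in the middle:
deltas = [1.0 / 60.0] * 30 + [0.5] + [1.0 / 60.0] * 60

print(simulate_naive(deltas))  # ends far below the platform: it tunneled
print(simulate_fixed(deltas))  # rests on top of the platform at y = 0.0
```

Under the stutter frame, the naive loop moves the body nearly five metres in a single step, far more than the platform is thick; the fixed-timestep loop runs thirty small steps for that same frame, so no step ever moves the body further than the collision check can catch.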