New surge of bad programmers is a more apt title, or "We gave game clients absolute control and you won't believe what happened next!!1". Pretty much all the games listed have ZERO server-side validation. As I commented under the previous Valorant anti-cheat backdoor story:
'what is actually crazy is that modern game servers lack even the most basic server-side assertion logic, letting players FLY around the map, teleport, summon items, kill other players on the server at the press of a button, etc. Seen in all massive FPS games, for example Apex Legends, PUBG, Escape from Tarkov.'
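To make the complaint concrete, here is a minimal sketch of the kind of server-side assertion the comment is asking for: reject a client position update that implies impossible movement. The `MAX_SPEED` value, the tuple positions, and the function names are all invented for the example; a real server would use its own physics constants.

```python
import math

MAX_SPEED = 7.0   # assumed max legitimate player speed, metres per second
EPSILON = 0.25    # slack for float error and minor client prediction drift

def plausible_move(prev_pos, new_pos, dt):
    """Return True if moving prev_pos -> new_pos in dt seconds is physically possible."""
    if dt <= 0:
        return False
    dx, dy, dz = (n - p for n, p in zip(new_pos, prev_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    return distance <= (MAX_SPEED + EPSILON) * dt

# A teleport or fly hack shows up as an impossible distance for one tick:
assert plausible_move((0, 0, 0), (0.1, 0, 0), dt=1 / 60)      # normal step
assert not plausible_move((0, 0, 0), (50, 0, 10), dt=1 / 60)  # teleport
```

A check like this doesn't stop aimbots or wallhacks, but it is the cheap tier of validation that catches the "fly around the map" class of cheat the comment describes.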
> New surge of bad programmers is a more apt title, or "We gave game clients absolute control and you won't believe what happened next!!1". Pretty much all the games listed have ZERO server-side validation.
I feel like I should share an inside opinion, but you will not like it. So please take it as an opinion and don't crucify me.
The industry itself is mired in complexity: year on year we have to build bigger, better, and more complex games, and we have diminishing budgets (both time and money) to do so. Failing to make something truly great means the next iteration gets even less budget, and then you're dead.
When it comes to game servers, which is something I understand very well[0], there are additional budgetary constraints. The cost of a player playing the game is an ongoing one. If you keep playing, we still only get the upfront game sale from you, so there's an incentive to keep costs as low as possible.
An NDA prohibits me from giving an exact number, but let's assume (and it's mostly true) that if you play a game for 3 months then we've lost money on you from server time alone, and we optimise our servers well. The more you put into verification and detection, the more cost there is.
This is why, when we originally talked about releasing the game, we intended to forgo a PC release entirely, as consoles are barely affected by this, and when they are it's comparatively easy to detect.
Game companies invented this “problem” so they could extract more money from gamers. Go back 10-12 years and most PC games shipped with server software, and the community hosted most of the matches. The community handled the cheaters.
Game companies took this away so that they could sell season passes, hats, shirts, and other cosmetic garbage.
This is a problem your industry created. So please spare us all the woe is me charade.
That is a supremely cherry-picked list. GTA4 was a single-player game with some thrown-in "community" features. Fallout 3 was a single-player game.
You see... we didn't ask for you all to ban dedicated servers and take on the cost of official servers. You did that to us against our objections so that you could extract more money from us. PC gamers would be perfectly happy playing Modern Warfare on dedicated servers. We don't have that option. We are forced to play on official servers and then you whine about the cost of doing that. That was YOUR decision!
You can't force players to play on official servers and also whine about the cost of official servers. You have created this problem! If it is too expensive to host official servers then let the community host them like we used to.
> I did warn you that you wouldn't like it.
We don't like being patronized. It is peak irony for game developers to lament the cost of official servers when official servers were something they forced on the community so that they could sell more hats.
I know you hate me; frankly, I don't really care at this point. I've been in the industry 6 years and I'm so utterly and completely burned out by being "a patronising and evil" presence whenever I make the mistake of trying to convey real information to people.
Everyone complains that the game companies are ivory towers and that their voices aren't heard: we hear you. But you don't want what you think you want and I'm not trying to patronise you by saying that.
Additionally, I didn't cherry-pick the list; I intentionally grabbed the first ones that had networking. Almost no cross-platform games from 2008 were heavily networked. (Fallout 3 was an oversight; I meant Far Cry 3.)
As I've stated repeatedly in this thread: Self-hosting servers only works for session based games.
Even then, those sessions can leave players with a very mixed experience (weird rules, mods, someone might have made themselves "god", and it opens up the client to be exploited by the server), weird latency issues and connection drops notwithstanding.
If it's dedicated servers that you rent from us as a company, then it can be expensive for you. If it's too expensive, fewer people do it; the fewer people do it, the more expensive it becomes to maintain the provisioning/admin tools, and it's amortised across far fewer players, so the price goes _even higher_.
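That death spiral is just fixed-cost amortisation. A toy calculation makes the shape visible; the figures are invented for illustration, not real costs.

```python
# Hypothetical yearly cost to maintain the provisioning/admin tooling,
# regardless of how many people rent servers.
FIXED_TOOLING_COST = 120_000

def per_renter_price(renters, margin=1.2):
    """Price each renter must pay to cover the fixed cost plus a margin."""
    return FIXED_TOOLING_COST * margin / renters

price_popular = per_renter_price(10_000)  # many renters: cheap per head
price_niche = per_renter_price(500)       # renters drop off: price balloons

assert price_niche > price_popular * 10   # same tooling, 20x the price each
```

Each round of price increases sheds more renters, which raises the per-renter price again; that feedback loop is the spiral being described.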
Then there's the server list fiasco. Where do you connect?
* XxXxBubBas-BayxxXx--freelootlvl25OnlY!?1
* 174.255.121.2:5543
* bit.ly/free-nudes
We listened to the community before, many times. Sometimes people are so outspoken that we genuinely believe we'll have a competitive edge if we're the ones to "listen", and every fucking time we get burned by it. So I don't blame anyone in the industry for listening to their CMK department more than the hardcore PC Master Race.
The Hardcore PC Master Race do not represent all gamers.
I was indifferent about you until this post.
While I appreciate the information you're giving "from the other side", I also understand why simply telling us this won't make you look any better.
But then you tell us that the reason for official-only servers is that community servers might have mods, weird rules, or, dear god, funny names? How exactly is that a problem? Currently I spend some time in Rust. It has official servers as well as community ones. And let me tell you, the community servers are what's keeping the game afloat.
Funny names in the server list are virtually a non-issue. It's like people in general have a certain level of taste, and it just so happens that servers with cringe names are left empty.
Mods are great. Different people enjoy different mechanics, and thank god for different rules on some servers, like "solo only", which is actively sought out by people who don't want to fight 20-man teams on official servers.
Your statement that community servers only work for session-based games couldn't be further from reality. Some of those servers have been hosted continuously for years, only restarted to apply updates. And this is the case for many other games.
You say that we don't want what we think we want and that the Hardcore PC Master Race doesn't represent all gamers, but frankly, if anyone knows what they want, wouldn't you agree hardcore PC gamers have given it more thought than your "average gamer"?
The issues you described aren't problems of the community that you heroically solve by taking our toys away; they are exaggerated, misunderstood, almost nonexistent issues that mainly come from the skewed views of game developers in general, and yours personally.
Maybe you are burned out from people hating on you while you proclaim "I'm only the messenger" because it touches you personally, and rightly so, because you also believe the wrong message we Hardcore PC Master Race gamers so hate to hear.
EDIT: And the fact that a multiplayer game becomes unprofitable once a player spends more time in it is horrifying. So you're telling me the studios are actually incentivized to make games we stop playing almost immediately after purchase? And you use this to defend why your servers can't afford basic anti-cheat?
I still play some Battlefield 4 from time to time. Custom servers are awful.
You'll randomly get kicked because you didn't pay for a VIP slot, randomly killed because only 5 snipers per team are allowed, and randomly kicked because the admin didn't like the particular gun you're using.
I can see why game developers would (and should) prevent that.
If I remember correctly, BF4 follows the same logic as BF3, meaning that "custom" servers actually run only on hosting providers EA has licensed, so at best the administrator can manage the players, not the game itself.
The only thing that may dissuade game creators from releasing openly available dedicated server software is that they would need to let everyone access the server list (like Valve does), so they'd get traffic without any kind of ROI.
Interestingly, I have the opposite experience with the previous Battlefield 3. There are still several active servers, all of them generally fun to play, with no weird rule sets. I remember in the earlier years of the game some servers had rules like "no snipers" and you were automatically kicked for using one, but that was made clear in the server title.
Nothing can explain it to you like experience. It's nice to be principled and see the world as black and white, but the overall theme I was trying to portray was that everything is trade-offs.
You can't have your cake and eat it too here, everything costs, and player experience is king.
Cheating happens. We don't want it to, but we do the best we can with what we have, and if that's not enough: play on a platform that does not allow memory modification and injection. It's damn near impossible to "secure" a client, and when we try people cry about DRM. (I'm not saying that some DRM isn't there to "prevent piracy", and as a free software advocate I hate DRM; I'm just saying that if we included DRM in our game people would go nuts about it, even though it would help with the issue here.)
AAA titles get hundreds of millions of dollars of development effort, and they can't even be bothered to do basic things for the community, instead thwarting community efforts. The problem is that the studios have been bought up by corrupt corporate bean counters who see video games as a product to sell, not a passion they share in. Educated consumers are the bane of sleazy businessmen, and the salvation of honest ones.
It's sad that you think we're "out to get you", we're just making games.
Consider the widely acknowledged facts that the working hours are brutal, the pay is quite low for C/C++ programmers, and the average time in the industry before burnout is 5 years.
Nobody would do this if the passion wasn't there. This industry basically couldn't exist without the passion for making games.
And, every time we fail we're yelled at for being incompetent. Every trade off we make is derided by people who never actually tried to make something like this.
Every time we try to increase the amount we're paid, or set a decent working schedule (longer development times mean more salaries, because people work on a project longer), we're told that we're "sleazy businessmen" trying to make a profit.
Like I said in another comment, the only ways for you to get "better" games are to set your expectations lower than AAA (i.e., stop buying AAAs), make us slaves, or pay more.
And people (in aggregate) do not want to pay more.
> Every time we try to increase the amount we're paid, or set a decent working schedule (longer development times mean more salaries, because people work on a project longer), we're told that we're "sleazy businessmen" trying to make a profit.
Who's this "we"? The workers or the shareholders? Is there any unionizing effort in the video game industry?
There is no need to secure the client, because everything the cheaters can mess with should be server-side to begin with. DRM doesn't solve the problem; it only makes the problem more attractive because it lets you avoid the real solution. The end result is buggier games, because the project manager can get away with shipping half-assed crap.
I'm saying the trade-offs we make are known. It's impossible to do everything. If we could do what you want us to do on the server, the game would be 1/10th of what it is, because each frame would take 2 seconds to render.
But I’m hiring. If you think you can do better, I’ll put in a good word.
Action games can't be fully server-side. Latency would make the game unworkable. So you'll get wallhacks and speed hackers, and aimbots, and trigger bots. You can ban them post facto, but it's a continual cat-and-mouse game of detection and avoidance.
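Banning post facto usually means mining match statistics for implausible outliers rather than blocking actions in real time. A toy version of that idea, with made-up thresholds and field names:

```python
def flag_suspects(match_stats, max_headshot_ratio=0.8, min_shots=50):
    """Return ids of players whose headshot ratio is implausibly high.

    Players with too few shots are skipped: small samples can't be judged.
    Thresholds here are illustrative, not tuned against real data.
    """
    suspects = []
    for player in match_stats:
        shots = player["shots"]
        if shots >= min_shots and player["headshots"] / shots > max_headshot_ratio:
            suspects.append(player["id"])
    return suspects

stats = [
    {"id": "alice", "shots": 200, "headshots": 60},   # 30%: plausible
    {"id": "bob",   "shots": 120, "headshots": 114},  # 95%: aimbot-like
    {"id": "carol", "shots": 10,  "headshots": 9},    # too few shots to judge
]
assert flag_suspects(stats) == ["bob"]
```

The cat-and-mouse part is exactly that cheaters learn the thresholds and throttle their aimbots to stay under them, so the detectors have to keep evolving.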
I think this explains why you are having trouble understanding the users point.
I was playing games like Half-Life off of user provided servers in 1998 and can't recall ever having any issues. Others on here seem to have fond memories of doing the same with other games.
> Even then, those sessions can leave players with a very mixed experience (weird rules, mods, someone might have made themselves "god", it opens up the client to be exploited by the server).
This is an odd concern to bring up considering the linked article.
I think ignoring the hate is the best thing to do. There is a mindset right now that treats these problems as purely technical ones, i.e. “it’s easy to do server validation and anyone who doesn’t do it is a dumbass”, but there are very real limitations in terms of money and time.
It’s like some engineers are forgetting that most problems faced when programming are not technical.
I started my career developing on a real-time OS, learning from people with 20-30 years in the industry. I’ve since transitioned to something “higher level” and more web/server focused, and a lot of the stuff spat out like this is complete BS.
When developing on a real-time OS, the only problems I faced were a simple product story (we are building this to solve X problem) and a simple technical problem (build a system which uses under 1% CPU to do X).
When developing these other products, though, suddenly there are many other considerations to take into account. I appreciate that you bring up latency concerns in multiplayer games; it can be the biggest downer to play a competitive FPS where players have wildly different pings.
There is truth to the idea that we’ve coded ourselves into a corner of complexity and that there are significant efficiency gains to be had in output and program efficiency. But it’s willful ignorance to think that this is the only problem, or that it is trivial to solve, or that the people involved in a project don’t think about it all the time.
> almost no cross platform games from 2008 were heavily networked.
Aren't you forgetting something? Like every MMORPG in existence? Or almost every player-hosted multiplayer game, like Warcraft 3, released in 2002? (The host can cheat because he is running the server, but the other clients can't, which is the important part.)
Seriously. I don't understand how hard it is to just run the security-sensitive parts on the server. It's not like running things on the client removes the need to communicate with a server. The amount of time the hacky solution wastes is significantly worse than doing it correctly, and it kills the game when it finally launches.
If you want to know more about the server performance situation I recommend watching a video[0] about the subject from our lead gameplay programmer.
I'm getting very tired and irritated, so my responses are coming off as unreasonable now. So with a final hopeful breath, let me say the following: I hear what you're saying.
Truly, I understand that you think it's super easy and that we have so much money that it should be solved.
We don't have the money you think we do, we run on very thin margins, in both time and resources.
Others have mentioned that we should shift the burden of running servers to users, but that has many trade-offs and doesn't work for many games, especially not on consoles, and especially not at the sheer numbers of users we have.
The issue is also defining what the 'security sensitive parts' are, because what are they? Everything is sensitive: movement, bullets, health, what you see, what you hear, what you can interact with, what loot you picked up, what gun you have equipped and how fast it's equipped, how fast you reload, how many bullets are in the clip.
If you were to stall the client so that the server verified you could do each of those things, your game would run 1 frame every 20 seconds. And that's even if you had infinite performance on the server to make those calculations for the 24 players that can see each other.
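The stalling argument is really about network round trips, not server CPU: if the client blocks on confirmation for every action, frame time is bounded below by the round trip. The RTT, tick rate, and action count below are illustrative assumptions, but the arithmetic holds for any realistic values.

```python
RTT_MS = 80            # assumed client<->server round trip over the internet
TICK_RATE_HZ = 60      # target render/simulation rate
ACTIONS_PER_FRAME = 6  # move, aim, fire, reload, interact, loot ...

frame_budget_ms = 1000 / TICK_RATE_HZ          # ~16.7 ms available per frame
blocking_cost_ms = RTT_MS * ACTIONS_PER_FRAME  # serial confirmations: 480 ms

# Even one blocking confirmation per frame (80 ms) blows the 16.7 ms budget.
assert blocking_cost_ms > frame_budget_ms
```

This is why real servers validate asynchronously (accept the client's action, check it after the fact, and correct or kick on mismatch) rather than stalling the client on every action.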
It's a hard problem, one I genuinely invite you to solve for yourself, maybe the answer is to make much simpler games like CS:GO or Minecraft.
Or maybe the answer is simple and everyone I work with, both in my office and in the industry of AAA gaming, is malignantly incompetent like the commenters here keep suggesting.
As a relative novice when it comes to multiplayer game programming, your comments have been enlightening to me.
I’ve grown my experience over the years in other fields to the point where “don’t trust the client” is nearly second nature to me. So whenever I’d dip my toes into programming a game, I’d look at the multiplayer server-side stuff with puzzlement, trying in vain to work out where or how I was supposed to validate the legitimacy of the client’s incoming game events. It never occurred to me that doing this kind of server-side validation might be too burdensome. The notion that the backend code I was trying to understand had been written with minimising cost per player-hour as a positive metric is now so face-palmingly obvious that I am still stunned I never realised it, and I'm literally having a string of “oh, so that’s why that bit there...” moments as I write this.
I was always curious whether a P2P-hosted game with host log shipping to a "validation" server would work. People could run a tournament, and validation servers would reject or accept a game. Not sure if players would go for that. Maybe if the game was super up front about it.
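One way the log-shipping idea could work, sketched below under heavy assumptions: the host records every input plus a periodic state checksum, and a validation server replays the inputs through the same deterministic simulation, accepting the match only if the checksums agree. The one-integer "game state" is a stand-in for a real deterministic simulation; every name here is invented for the example.

```python
import hashlib

def step(state, player_input):
    """Deterministic toy simulation step (placeholder for real game logic)."""
    return (state * 31 + player_input) % 1_000_003

def checksum(state):
    return hashlib.sha256(str(state).encode()).hexdigest()[:12]

def play_and_log(inputs):
    """Host side: simulate the game, logging each input with a state checksum."""
    state, log = 0, []
    for i in inputs:
        state = step(state, i)
        log.append((i, checksum(state)))
    return log

def validate(log):
    """Validation server: replay the inputs and compare checksums."""
    state = 0
    for player_input, claimed in log:
        state = step(state, player_input)
        if checksum(state) != claimed:
            return False  # host's log diverges from an honest replay
    return True

honest = play_and_log([3, 1, 4, 1, 5])
assert validate(honest)

tampered = list(honest)
tampered[2] = (9, tampered[2][1])  # host rewrites an input after the fact
assert not validate(tampered)
```

The catch, which is probably why it's rare outside of RTS lockstep games, is that the whole simulation must be bit-for-bit deterministic across machines, and validation only proves the log is self-consistent, not that the inputs weren't produced by an aimbot.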
> I'm so utterly and completely burned out by being "a patronising and evil" presence when I make the mistake of trying to actually confer real information to people.
Step down from your high horse. You're just parroting PR talking points that are totally nonsensical. If game studios could save money by offering dedicated servers they would do it. They switched to official servers not because they care about the experience but because it allowed them to monetize online play better. You can't sell hats when players can host their own servers and mod it however they wish. Why would I buy a hat from Infinity Ward when I can just mod myself a hat on my dedicated server?
> If it's dedicated servers that you rent from us as a company, then it can be expensive for you. If it's too expensive, fewer people do it; the fewer people do it, the more expensive it becomes to maintain the provisioning/admin tools, and it's amortised across far fewer players, so the price goes _even higher_.
Again, if it were possible for game studios to make more money by offering dedicated servers, they would do it. You are inventing problems with dedicated servers so that you can ride in and be the hero when you solve them. These aren't actual, real problems. They are just PR talking points to justify forcing official servers so that game studios have better monetization paths.
> As I've stated repeatedly in this thread: Self-hosting servers only works for session based games.
No, it doesn't. That is a limitation you (game studios) have invented. There are dedicated servers for World of Warcraft.
> But you don't want what you think you want and I'm not trying to patronise you by saying that.
Ah yes, the ole "I know best." The final landing spot of ridiculous arguments.
Well, "our investors demand more monetization than what's possible when first-party servers don't have a monopoly" is an actual, real problem; the problem is that the investors have too much control over the software design.
> There are dedicated servers for World of Warcraft.
There were dedicated (third-party, i.e. potentially self-hosted) servers; last I checked (when I wanted to give WoW another try), Blizzard had been aggressively C&Ding them whenever it could.
You skipped a few of the most played multiplayer games of 2008. Call of Duty 4: Modern Warfare, the 'top-selling game worldwide for 2007', moved 7 million units in a year. Then there was 2008's Call of Duty: World at War. I stopped following CoD soon after (life), but looking further, 2011's Call of Duty: Modern Warfare 3, again a top seller, also shipped with dedicated server support.
Turns out you can have AAA with dedicated servers, as long as you aren't planning to nickel-and-dime your userbase with microtransactions and day-one DLCs, or pretending your limited instancing game is an MMO.
> On a more personal note: I don't understand why everyone wants to eat the fancy cake but hate the baker for telling you what it took to make it.
Because nobody asked for the fucking fancy cake! Why do people in the industry not get this simple fact?
We were all very happy with the cookies we had before. Which is why we kept coming back to the bakery and telling all our friends how awesome the bakery is. Then the baker says, "hey, those cookies suck, what you really want is the fancy cake! No more cookies, only cake" and despite the customers screaming at the baker, pleading with him to bring back the cookies, he just won't and is surprised that customers are frustrated that they can't get what they want. We eat the fancy cake now because it's the only thing available. But we want the cookies we had before.
Most players don't want self hosting, they want to hit 'join game' and have a nice gameplay experience. They want to form up a team with their buddies and play against similarly matched players (or destroy some newbies). They want consistent ping times, worldwide rankings, and to know they are playing on a level playing field.
As I remember it, playing in a self-hosted world meant hundreds of barren servers, servers that disconnected mid-game, servers that randomly lagged due to bad internet and unknown server hardware (often shared with the game client), and hacked servers galore.
If you were in the 'in-crowd' you had the passwords to servers where the hardcore players hung out and had fun - and you were in the 1%.
The market wants free-to-play. No, really. For every player that claims they want to put down $60 before even playing a game, there are 1000 that won't even glance at your game if it isn't free. I can't even recommend $5 iOS games to friends and family without them turning up their noses because they only play 'free' games.
There is a vocal minority of gamers who feel like the games industry isn't listening to them and not giving them what they want. And they are correct, but the games industry isn't conspiring internally to screw those people; they are just trying to price games in a way that maximizes their potential to pay down development costs and pays for ongoing development and devops costs.
You can just hit 'play' and get assigned to a high-ranked community server in the same manner as you do to an official one.
Nothing you say in your first paragraph is an argument against community servers.
Your second paragraph is your memory, but it's not how things are now.
Empty servers? What about the "hide empty" checkbox in the server list? Or your praised 'Join Now' button finding a high-population server?
Servers lagging and disconnecting? Are we in 1995? Most current community servers are on par with official ones. In fact, I more often encounter connection issues in games that only offer official servers, and the fact that I can't just join a different server sucks.
Hacked servers galore? That's an issue with official servers too. Frankly, people devoting time and resources to hosting a server for their favorite game don't do so just so they can hack on it.
"If you were in the 'in-crowd' you had the passwords to servers where the hardcore players hung out and had fun - and you were in the 1%." --- Wwwwwwwwwhat? What planet are you from?
Self hosted servers were some of the best times I had with gaming. Admins kept the server protected from hackers, communities formed, you met regulars.
There was a big backlash in the community when matchmaking became the norm. I actually "boycotted" Call of Duty: MW2 when it launched because of the lack of dedicated server support.
This is the first time I've heard of it being called a fallacy, but if someone says they want cookies but spends money on cake, and I have a factory that can make either cookies or cake, and it operates under capitalism, it seems self-evident that I should make cake.
Sure, but if someone says they want not-heroin, but spends money on heroin, that doesn't mean they want heroin, except in the most asininely reductive misrepresentation of their preferences.
Maybe most people don't care about, say, the existence of servers that will put their desires above those of the game company (which is one of the things self-hosted software practically provides; no one disputes that running the server on your PC just because is a niche scenario), but the fact that they fail to effectively strategise their spending toward that goal is not evidence in favor of that conclusion, because it's much more easily explained by the known fact that people (note I didn't say "most") are short-sighted, impulsive, technologically-illiterate (okay, that one's most) hypocrites in most of their decision making.
"if I have a factory that can make either cookies or cake, and it operates under capitalism, it seems self-evident that I should make cake."
The system has evolutionary pressures towards financial reward. The game companies that make "cake" are rewarded and gain market share and power. The companies that make "cookies" are financially punished and lose market share and wither.
In the system that we have, with the financial systems at play, it's inevitable that you'll get "cakes".
fragmede's point doesn't need a rebuttal because it has nothing to do with the actual disagreement.
> > nobody asked for the fucking fancy cake! [...] We were all very happy with the cookies we had before.
> No, [...] you're wrong.
Clearly the profit-maximizing action is to manufacture cake, 'free'-to-play games, and heroin, rather than cookies, well-designed games, and nothing; no one in this thread (that I've noticed, at least) is actually disputing that. We're saying that we don't want cake, that as far as we can tell no one actually wants cake, and that it's axiologically bad that production is focused on cake rather than cookies.
Some people are then (rightly or wrongly) blaming the baker, but the only thing I was asserting was that no, the fact that people pay money for cake does not mean they want cake. (In the absence of any other evidence it suggests that people want cake - which is the reason capitalism ever works at all for anything - but we have multiple people saying cake is terrible, so there is not an absence of other evidence.)
Let's say there are thousands of bakers. There are bakers making cookies and cakes all the time.
Some of those bakers are financially rewarded for their choices, and thus gain power. Others are financially punished for their choices, and slowly are pushed to the margins or die (as a business; the person just moves to a non-baking field).
Over time, you will see an evolutionary pressure toward bakers baking cakes. You will see fewer and fewer bakers baking cookies.
This is what happened in the gaming industry. There were options for cookies, but they weren't financially rewarded and those options have been drastically diminished because of it.
I will say that, in this now-stretched analogy, I also prefer "cookies". Those games are out there. They're being made by indie developers. Go spend money on them if you want more of them.
I feel like that list is a bit misleading, and your comment ignores how the PC gaming community always wanted to run its own dedicated servers; games not allowing it often got a lot of flak.
Call of Duty games come to mind, and afaik Battlefield games used to have that option too, until EA changed it to “You can rent servers from us!”.
That doesn't even go into other issues self-hosting had solved that then came back, like toxic communities resulting from the random nature of matchmaking.
In that context I still have very fond memories of the late 90s and early 2000s PC multiplayer scene: Whole communities grew around servers, who would usually self-moderate not just for cheating but also toxic behavior.
Two decades later, this has been replaced with catch-all bad-word lists that warn and ban people for simply writing the wrong combination of letters, regardless of context, and media blasting gaming companies for their out-of-control toxic communities.
> I feel like that list is a bit misleading, and your comment ignores how the PC gaming community always wanted to run its own dedicated servers; games not allowing it often got a lot of flak.
I was responding to the parent's assertion that server software shipped with games 10-12 years ago. I did a Google search for "games shipped in 2008" (to be charitable and check the oldest games), and I took the first online-capable games that were available on PC, with no cherry-picking.
Whether they got flak or not is irrelevant; the vocal minority, especially in PC gaming, is very dangerous. To give an example: we got a lot of feedback on our game that we needed a dedicated PVP mode, so we spent a year (and many millions of euros) making it, and never achieved anything over 100 concurrent players. It was well marketed, and you could say it's "not good", but our evidence shows that nobody even tried it enough to determine that it's not good.
> Call of Duty games come to mind, and afaik Battlefield games used to have that option too, until EA changed it to “You can rent servers from us!”.
Renting servers from Us™ is decidedly _not_ a server package that you host yourself. Although it's paid for, I'm not sure it covers the running costs; I think it probably doesn't, given that this isn't an option anymore.
There's a small irony here: if it's expensive, fewer people will buy it, which means maintaining the software that lets you do it becomes more expensive, because it's not amortised among lots of people, and so it gets even more expensive (in a death-spiral kind of way).
Regardless, your comment talks exclusively about PC titles, but the majority of AAA titles that have server backends are cross-platform. I think this is part of the problem when talking about games in general. The PC guys like games tailored to their platform and really despise (or generally dislike) console players and how they interact with games.
I can't say that's unfair, but if we were to sell only on PC, releasing a server package and making directory services, we would not have covered the cost of making the game by even a third. And I know that because we have pretty decent consumer market knowledge; it's a requisite.
> I did a google search for "games shipped in 2008" (to be charitable and check the oldest games)
In 2008, I was playing (as a typical example among others) Warcraft 3, which shipped in 2004 (and, FWIW, supports LAN play, without any server at all, as the third item on the main menu). I suspect this may explain some of your confusion (i.e., at any given point in time, most software in use shipped before that point in time). No comment on the object-level argument, though; I don't like online play in the first place.
Sorry, I think we somehow ended up talking past each other.
The parent said that "if we go back 10-12 years, games had dedicated server software"; I tried to find some example games of similar deployment to mine (i.e., multi-platform AAA games with network play) to see if that assertion held true.
It seems like, aside from Valve's games (which exist on Xbox, afaik), most do not offer dedicated server software.
If we go far enough back (2004), then only the Xbox had network capability, meaning there are not many networked games that work multi-platform.
I think it's inferred but when I say "cross platform" I mean Windows and whatever the presiding consoles are.
> Sorry, I think we somehow ended up talking past each other.
I think so, yes.
> "if we go back 10-12 years and games had dedicated server software"
Assertion was specifically PC games, and I read it to mean games that people (PC users) 10-12 years ago were playing (regardless of whether console ports existed, or when they came out), which I think was the intended interpretation.
From what you've said, it sounds like the addition of network functionality to consoles was contemporaneous with PC games losing working peer/private-server functionality, which seems in line with cwhiz's position, if not their timeline.
> I think it's inferred but when I say "cross platform" I mean Windows and whatever the presiding consoles are.
I vehemently disagree with this, but I think it's mostly a difference of terminology, so I'm disinclined to argue it versus just ad-hocing around the problem[0].
I'm a huge linux enthusiast, so I share your dislike of "cross-platform" meaning consoles and Windows. I couldn't think of a better terminology though.
Linux (plus other unix and unix-like) exclusive here; I actually don't mind "cross-platform" for "console A or console B", but describing a mouse-and-keyboard game as "cross-platform" implies it works on other mouse-and-keyboard platforms, which in practice amounts to either OSX (which... sucks, basically) or unix (linux/bsd/etc tend to be compatible enough that you can get anything from one working on another with enough hammering).
> I couldn't think of a better terminology though.
Well "games that run on both PC and console" works, but it's a bit unwieldy.
> I was responding to the parents assertion that server software shipped with games 10-12 years ago
Fair enough, I guess the 10-12 year timeframe isn't far enough in the past. Though I do remember official servers for L4D2 being a thing, in addition to players being able to run their own, on their own hardware.
> Whether they got flack or not is irrelevant, the vocal minority, especially in PC gaming is very dangerous.
It wasn't a "vocal minority" when the change happened because the community hosting servers themselves used to be the norm before companies stopped shipping dedicated server software with their games.
In that context, the flack was well deserved, as we can now see the impact it had on the overall gaming landscape, namely: taking away the players' ability to run their own servers, which means players are now at the mercy of the company to decide when a multiplayer-only game stops being playable.
That wasn't an issue when dedicated server software was commonly shipped with games; back then players could just set up a server on their own hardware, extending the life of the game way past official support and allowing them to keep playing the game they paid money for.
> renting servers from us(tm) is decidedly _not_ a server package that you host yourself, although it's paid for, I'm not sure if it covers the costs of running.
But that wasn't the point. The point is that while players can technically still get "their own servers", these are not actually their own servers: they are still at the mercy of the company supporting the game and actually offering that option, while getting zero ability to customize the hardware or more advanced server settings.
It's the gaming equivalent of "outsourcing into the cloud instead of running your own hardware".
> Regardless; your comment talks exclusively about PC titles, but the majority of AAA titles that have server backends are cross platform.
My comment is specifically talking about the time when games would still ship with their own dedicated server software so players could set them up on their own hardware.
Which has by now pretty much died out because the closest thing to that which still exists is the EA variant of "We can rent you the server, but you still can't use your own hardware".
> I can't say it's unfair but if we were to sell only on PC, releasing a server package and making directory services, we would not have covered the cost of making the game by even a third.
As a consumer I consider it very unfair, particularly because for quite a while it worked quite well; it was the de facto way multiplayer was facilitated.
Which by now has been completely replaced with "games as a service" and P2P-based matchmaking, where publishers have the ultimate say over how long you are allowed to play the game, even when players would be willing to pay for the server and the bandwidth themselves.
Why take that option away? Why put these artificial expiration dates on the games? It's just a very extreme change, particularly considering that to this day one can still find privately hosted servers for all kinds of games from 20 years ago.
This works because the dedicated server software wasn't "locked away", and as such these games had their life extended by decades. That will be utterly impossible for the vast majority of modern releases, where the dedicated servers are kept in their own little "walled garden", the better to facilitate MTX and preserve the ability to shut the game down once it isn't considered profitable anymore, or once the publisher wants to force the player base onto the newest release of the franchise.
Nobody is arguing that companies should pay for servers indefinitely; all people are asking for is the ability to run their own servers, so their games aren't artificially made obsolete when the publisher decides to shut down all the official servers.
I agree it never made much sense for games like wow, but I preferred games where you could host yourself and tweak a thousand knobs and add mods and whatnot and integrate it into a website thanks to rcon. I'm not really gaming anymore for more than a decade but I'd be surprised if there aren't still games around that ship with server software. Just not the triple As anymore.
Half-Life, Unreal Tournament and every mod built on top of them had this. That's why Minecraft is so popular: custom servers, custom communities, custom experiences, decades of playability for the price of a single copy.
You don't seem to get that the game industry has been stealing PC games since 1997, with Ultima Online.
Buying any client-server game or piece of software means you're getting robbed.
Almost every game in the '90s had the ability to host locally: Doom, Duke 3D, Descent, Quake 1/2/3. We used IPX emulators like Kali/Kahn to play games like Warcraft 2 over the internet.
So you need to go back further, buddy: from 1992 to roughly 2004, all games had dedicated server functionality, until the game industry realized the public was stupid with the rise of Ultima Online, EverQuest, Guild Wars 1 and WoW, which led to Steam.
Steam is basically corporately hacked software; why would you need to log into someone's remote PC at Valve's HQ to play a game?
Up until 2004, every PC game had dedicated server functionality built into it. Just go look at the FPS released from 1995 to 2004 roughly.
This stopped because the public bought into the MMO scam of the late '90s; MMOs were just rebranded PC RPGs to get the public to take up corporately hacked software.
So no.
GTA 4, Fallout 3, Battlefield: Bad Company and Left 4 Dead 2 are post-Steam and post-MMO games, from after they'd figured out you were too oblivious to realize you were being robbed.
So go back to Quake 3-engine games and earlier PC games and you'll notice they were completely DRM-free pieces of software. You don't get that Steam is unnatural corporate malware, forced into Half-Life 2 because of what they learned about the public from 1997's Ultima Online.
UO was what spawned the great game theft of the last 20 years.
To have games that can be shut down remotely and whose servers can go offline means you're getting fraudulently coded software.
Note that this isn't a conspiracy, but just each game company individually following its incentives toward extractive business models (and occasionally openly bragging about it and encouraging others to do the same). (Not saying you claimed that, but it's easy to read your post as conspiracy theory rather than accusations of 'mere' ethical bankruptcy.)
> Not saying you claimed that, but it's easy to read your post as conspiracy theory
Actually, they seem to have a conspiracy theory, that software companies are secretly... doing the things that they're openly and obviously doing.
In other news, [the democratic and republican parties] have always hated [democracy]'s power to [pass laws that harm campaign contributors]. Anyone who has any idea of the history of [the US] and isn't a retard, knows [a bipartisan system] has been in the cards for a long time going back to pre [mass-media] days when [candidates] were trying to screw other [candidates].
It is, since Silicon valley and big media companies have always hated the PC's power to copy files. Anyone who has any idea of the history of silicon valley and isn't a retard, knows this has been in the cards for a long time going back to pre internet days when businesses were trying to screw other businesses.
> Because the public is so stupid they can operate in broad daylight. You don't seem to get that the vast majority of the population is computer illiterate.
No, I get that just fine - you're not actually arguing that they have a conspiracy; you're arguing that they don't need one.
> why would you go from having quake 3 with level editing, mods, free maps skins, etc, in big budget AAA games, to not getting that, and having the game client server locked to company PC's like diablo 3 and overwatch?
I wouldn't, partly because I'm not a moron, but mostly because I prefer Warcraft 3; the last good FPS[0] I played was Wolfenstein.
0: unless you completely ignore the "shooting" part and count minetest.
The most popular game modes in Overwatch and CSGO are their ranked modes. They'd lose much of their playerbases if they didn't have those modes, and you can't really have them without the inclusion of developer-controlled anti-cheat in the client.
And console games did not have server software 10-12 years ago so I think it's a bit unfair to claim that the move away from player-hosted servers was only done for monetary reasons.
There were ranked modes way before centralized multiplayer services, Battlefield 2, Team Fortress 2, Counter Strike, all had ladders (as they were called at the time since it was a literal list of people sorted by stats - they could easily just present the Elo score or whatever instead), both official or community-hosted.
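For what it's worth, a ladder of the kind described only needs the server to apply a standard Elo update after each match; a minimal sketch (the K-factor of 32 is a common choice, assumed here):

```python
def elo_update(r_a: float, r_b: float, score_a: float, k: float = 32) -> tuple[float, float]:
    """Update two ratings after one game. score_a: 1 = A wins, 0.5 = draw, 0 = A loses."""
    expected_a = 1 / (1 + 10 ** ((r_b - r_a) / 400))  # standard Elo expected score
    delta = k * (score_a - expected_a)
    return r_a + delta, r_b - delta

# Two equal players; A wins and gains 16 points, B loses 16.
print(elo_update(1500, 1500, 1.0))  # → (1516.0, 1484.0)
```

Because the update depends only on match results the server already knows, a ladder like this never needs to trust the client at all.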
Battlefield 2142 actually had the concept of ranked and unranked servers. Unranked servers allowed for more modification, but didn’t affect your global rank for obvious reasons, leading to sillier game play.
I don't disagree with the idea that game companies want a captive audience, and I loathe the microtransaction economy and psychological tricks that fill my games with distractions, but I'm not sure if that's directly related to the lack of community servers.
I only play Valve games (CSGO, TF2, L4D2), so I don't know how true it is of other companies, but you can still run your own game servers. If anything, it's easier to do so now than it used to be. Few people do it, because the Valve-run servers are easier to interact with and they're good enough for what most people want.
The community tried to handle cheaters before, but they weren't good at it. It relied on admins being present, and put the onus on them to identify and kick/ban cheaters. In a large-enough active gaming community, admins were around often enough, but the best they could do was guess at who was cheating. Sure, the egregious cases were obvious, but there was rarely definitive proof, and people who were subtle could cheat without getting caught. Good players who didn't cheat were also the victim of constant accusations. I was banned from plenty of servers just for having a good night and winning.
I'm glad that anti-cheat software has gotten good enough that it works for casual play. It's rare that I see anything in CSGO these days that looks like cheating. Sure, it's out there, but it's not very common. On busy weekends as a server admin, I used to kick and ban dozens of people. On the public servers now, I might run into a couple suspicious people all weekend.
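As an aside, some of that old-school admin guesswork can be automated server-side without any client hooks; a toy outlier flag on a per-player statistic (the metric, threshold, and data below are all invented for illustration):

```python
from statistics import mean, stdev

def flag_outliers(hs_ratios: dict[str, float], z_threshold: float = 2.5) -> list[str]:
    """Flag players whose headshot ratio is an extreme outlier vs. the population."""
    values = list(hs_ratios.values())
    mu, sigma = mean(values), stdev(values)
    return [p for p, v in hs_ratios.items() if sigma and (v - mu) / sigma > z_threshold]

players = {"p1": 0.18, "p2": 0.20, "p3": 0.22, "p4": 0.19, "p5": 0.21,
           "p6": 0.17, "p7": 0.23, "p8": 0.20, "p9": 0.19, "p10": 0.21,
           "sus": 0.95}
print(flag_outliers(players))
```

Real anti-cheat heuristics are far more involved, but the point stands: statistical flags like this run entirely on data the server already has.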
I don't see companies running servers as directly related. At least in these Valve games, I still get the marketing spam through the Steam client and in the games themselves, and I can still use all the cosmetic junk whether I'm playing on an official server or a private server.
Why do you have to host the games and carry the cost?
Wouldn't it be cheaper for you to provide some great server software, but people would have to host it themselves, like it was done years and years ago with games like Day of Defeat, Half life etc.
Or maybe I'm missing something - it's been a while since I played those type of games.
>Or maybe I'm missing something - it's been a while since I played those type of games.
The primary reason companies do not follow the tried and true method of game design is that they can't extract money in the form of microtransactions or server rent if they provide the full game (including server).
But if you have a cheat fest server, players will leave and not buy any hats and other accessories.
Also, as part of the server software it could include the marketplace, which would sit on the game developers servers, so all transactions would go via the game developer and maybe they could give some commission to the person hosting the game.
I guess that very much depends on the game, but the game I make would be very uninteresting if you hosted your own server, it doesn't lend itself to that 'pick up and go' style of play like counterstrike does.
It's more like Destiny, World of Warcraft or Diablo in that there are RPG elements, a grind for loot and the possibility to meet other players.
Minecraft also grew organically over time; it was not intended for consoles and had the benefit of a slow churn towards maturity over a very long time.
Minecraft as a project, over the years, easily cost more than 20x as much to make as The Division. I'm glad it's successful, but I don't know if it's really apples to apples.
It's been under some form of heavy, active development since 18 November 2011 (though it had been in development by Notch since 2009, and he quit his job and lived on the revenues at that point).
At that time there were 25 employees, all living in Stockholm Sweden, the cost of a developer there in 2011 is roughly 60-75k SEK/mo (lets say 60).
(25 × (60,000 × 12)) × 9 = 162M SEK (roughly 16M EUR) in salaries.
Not including: marketing (usually more than half the total cost of making a game), build servers, distribution servers.
To put that into context: Final Fantasy IX is on the list of most expensive games ever made and cost 40M EUR.[0]
Although it might cost more than most games over time, Notch wasn't initially rich when he started developing it, right? V1 was pretty frugally created.
(Genuinely curious)
I'd love to hear an insider's opinion on: Why aren't more games developed like Minecraft? Bootstrapped, slowly crafted over years rather than months, soft launched, minimal marketing, iterated on, etc.? Why is "pump out and hype an AAA game as inexpensively as possible" the only option Serious Game Companies™ go with?
a game like csgo basically makes all of its revenue through selling weapon/player skins. if you run a private server, you can just give people skins for free. if they really wanted, they could probably patch this out, but it would make the modding community furious. csgo has actually taken a pretty fair approach here. you can still run your own server if you want, but the official servers are decent enough that most players don't bother.
I have to doubt this is a primary factor. If it were then developers would make more single players games, but everyone wants to make online games with microtransactions
I'm trying to find a nice way of responding to this because it's rather loaded.
> I have to doubt this is a primary factor
Time and money are indeed factors; microtransactions are typically used to fund a game's continued "live services" and don't get counted towards recouping development costs (at least at Ubisoft).
> if it were then developers would make more single player games
We can, and do, but sometimes creating the game you want requires other players. Imagine an NPC-only For Honor or Rainbow Six: Siege... that would be very boring.
> but everyone wants to make online games
Because there can be more interesting games than just "go here, kill boss, go here, loot thing", you can have genuinely unique experiences on every play through with online and networked games.
> with microtransactions
To cover the cost of the servers and staff required to update/fix the game, usually.
This is a complex one because people are so polarised by microtransactions. I (and, incidentally, my studio) am convinced that they _MUST_ be optional and can offer no advantage in game.
We used to have the policy that you must be able to earn any cosmetic within a realistic amount of in-game time (which is contradictory for us, because more in-game time = less money, and obviously we "lost" the revenue of a player not buying it, but that goes without saying).
Given that we release buggy messes of games, you might argue that we would need fewer microtransactions if we just did a good job in the first place, but obviously we try our best to make decent games. I think it's dangerously arrogant to assume that 60 developers working on engine and game server code are all less intelligent than you are, and there definitely isn't malice; most game developers are gamers themselves, otherwise the complexity, stress and time spent would definitely not be worth it.
So I'm not sure why people are so outwardly hostile to game developers. The alternative is that we spend longer and charge more for games, but people don't like that either, so at some point the market has to make up its mind: accept rushed products that improve year on year, accept no or marginal improvements year on year (a la FIFA, Call of Duty), or pay more.
There is no magic here, unless we bring back slavery.
In general, I'm sympathetic; games development is difficult. However, regarding "Imagine an NPC-only For Honor or Rainbow Six: Siege... that would be very boring": I'd take it. Despite being an avid gamer, I've given up on online games precisely because of the cheating and griefing. I just don't want to deal with it anymore as a player whose free time is more and more limited as the years go by.
Not the OP, but I enjoy playing R6 with bots more than with other people. However, unlike CSGO, R6 doesn't allow you to team up with bots and that makes it a lot more boring.
I can't promise that anything will change, because we can't easily qualify how much demand there would be for such a feature. But, I can tell the people I know.
I suspect that you're right about the general audience, but I for one would be happy to pay more upfront for a game if it meant I didn't have to deal with all the current shenanigans. a AAA game has cost about $60 for as long as I can remember, despite the dollar falling significantly over the last decade or so. I would happily pay $100 for a high quality game. I've probably spent at least that on csgo skins over the years.
I have to imagine they scale the server amounts based on metrics. Running an appropriate amount of servers scaled with active installs is probably cheaper than all of the extra production value and dev work you have to add for a great single player game.
> year on year we have to build bigger games, better and more complex games and we have diminishing amounts of budget (both in time and money) to do so
Why is that? Online Gaming as a market is growing, how are budgets shrinking?
The cost of making games outpaces the revenues generated by games, especially if you're not EA, Rockstar or Activision.
That is why there are monetisation options and special editions and, disgustingly: loot boxes.
Notably, EA (which gamers hate) is known inside the gaming industry as being quite competent and not as constrained on time or financial budget, so they can throw more resources at things. But the ongoing cost of a server instance is still eternal and will eat away future profits, so they focus on being P2P.
Rockstar has budget too, but they also don't have online games in their DNA yet, their development teams seem to still think of single player games. (based on a few conversations with their devs).
Note that one needs a Google account to watch the video. You probably don't notice if you're logged in all the time so I figured I'd mention this barrier exists.
While I know the technical reasons (I've been in the industry and worked on exactly these kinds of games), I also have a certain other thought on it after getting time and distance from the trenches, which is that, on some basic first-principles level, the industry is trying too hard to market appealing falsehoods.
And online multiplayer constitutes one of the biggest falsehoods of all, since it has to pull some magic to synchronize an apparently similar, real-time experience across many computers, which leaves integral parts of the experience compromised. What's tested in competition, ordinarily, is your belief in your skills and likewise those of your opponent. When we play these games (and I still do play some of them), what's tested is more like a mixture of what you believe, what your opponents want to believe, and what the game code says you should believe.
The less authority you impose to regulate this belief - at any level, not just in the specific code for in-game results, but in terms of overall moderation, matchmaking, and QoS - the more flimsy the result is. What results from poor regulation is that player communities will behave dishonestly. Players left to their own devices are excellent at innovating new ways of gatekeeping, not just with cheating but with mutual agreements on playstyle and how to rank talent.
The industry isn't too concerned about this because, in the end, they have to make a product. I don't think it's the way to make sustainable products, though.
This may have been true in the past, but the TF2 hat model (I don't know if TF2 was first to the cosmetic market, but it was my first memorable experience with it) proves that you can continue to extract value from customers after the initial purchase.
How much does it cost to run a server compared to a $3.00 hat? (Which, if you're Valve, you supplement with a secondary market to trade those items, taking a cut after creating demand by restricting the number of earbuds available.)
Your post does nothing to explain why server side cheat detection is in such a poor state. "Don't trust user input" is a fundamental maxim in this industry.
When you log into your bank's website, does it do action validation on the client side or the server side? Every few weeks there's some security bug that boils down to the server trusting the client when they shouldn't. And hackernews comments correctly boil that down to the devs were lazy and/or incompetent.
Games are no different. Sure, your industry has its own quirks and limitations. But trusting unverifiable user input... That's in the ten commandments, man.
Let me try to explain what I mean as simply as I can so that I can be sure I'm communicating what I want to say:
Games are complex and ever more complex and costly each year.
Verifying user input is expensive on the CPU of the remote side.
Servers are expensive, and people have indicated they want one: one of our servers costs more per month than my salary. Double if it's in the cloud.
So verification is not done server-side for everything: some raycasting here and there for bullets, some bounds checks, but nothing like a defined, absolute "you can _only_ do X, Y, Z", because that would be very, very expensive on CPU; there are millions and billions of things you can do based on your current situation, and we have to trust that the actions you're sending that change your situation are true.
So we start tracking movements every frame, updating your state on the server, and then suddenly there's incoming damage. We can see that you got hit, but on your screen you were in cover. Who decides?
So now we have this exponential increase in tracking "where people are" on the server and what it's possible for them to do, instead of just raycasting the bullet trajectory (which is expensive, but not "takes 30 s" expensive).
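To illustrate the cheap end of that spectrum, here's a minimal sketch of the kind of server-side sanity check the earlier comments say is missing; the speed cap and tick length are invented numbers, not any real game's values:

```python
MAX_SPEED = 7.0  # metres/second; a made-up movement cap for this sketch

def validate_move(old_pos, new_pos, dt):
    """Reject a client-reported move that implies an impossible speed."""
    dx, dy, dz = (n - o for n, o in zip(new_pos, old_pos))
    speed = (dx * dx + dy * dy + dz * dz) ** 0.5 / dt
    return speed <= MAX_SPEED

# A teleport across the map in one 16 ms tick fails the check:
print(validate_move((0, 0, 0), (500, 0, 0), 0.016))  # → False
# A normal step (6.25 m/s) passes:
print(validate_move((0, 0, 0), (0.1, 0, 0), 0.016))  # → True
```

A check like this is O(1) per movement update, which is why "letting players fly around the map" reads as an omission rather than a CPU-budget problem.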
Synchronising state amongst all clients 240 times per second is very difficult; in fact, I assume you know computer science, so you tell me how easy distributed systems are when you can't trust anything. It's pretty difficult to reach quorum here.
Now do it for as many players as possible (say, 1,000), and you have 2 engineers for a year. BTW, at the same time you need to make deployable healing skills in-game, support level design, and write AI for the NPCs (civilians and 5 combat factions).
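A quick back-of-envelope shows why naive full-state sync at those numbers is hopeless (the per-player state size here is an assumption for illustration):

```python
def sync_bandwidth_mbps(players: int, bytes_per_player_state: int, tick_hz: int) -> float:
    """Naive full-state broadcast: every tick, every client receives every player's state."""
    bytes_per_sec = players * players * bytes_per_player_state * tick_hz
    return bytes_per_sec * 8 / 1_000_000  # megabits per second

# 1,000 players, a modest 64-byte state, 240 Hz: ~123,000 Mbit/s of egress.
print(sync_bandwidth_mbps(1_000, 64, 240))
```

This is why real engines lean on interest management, delta compression and lower tick rates rather than brute-force synchronisation.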
Making games is complex, obscenely so, and everything that's done is a hack on a hack on a hack. Frankly I'm surprised they work.
To give context: we are roughly 10% of the engineering force that makes the browser dialog pop-ups on google.com. But we have code that is at least three orders of magnitude more complex and we had 2 years to make it.
Minecraft started out as a single-player game, and then Notch bolted in some basic block syncing and player position syncing. Everything else ran on the client, including the mobs; you could set the difficulty to peaceful to get rid of mobs in multiplayer. Step by step they converted everything to run on the server before the beta phase ended. What you are talking about is just pure laziness and incompetence.
I'm very glad that Minecraft was successful. I can't speak to my competence, maybe I am grossly incompetent and those I work with are too.
I can say that I was well regarded in my profession before joining the gaming industry and I feel like the work I'm doing is significantly more complex than I was doing before, but that might be because I am lazy and incompetent.
That said, I'm pretty sure you know that Minecraft and AAA games are largely dissimilar.
It's an example of making the worst decisions possible for a multiplayer game (i.e. by not implementing any multiplayer at all) and fixing them anyway.
I'm not sure what you mean by "more complex" because there aren't many games that even reach a tiny bit of the depth of modern minecraft with a modpack like Skyfactory 3. Do your games involve building nuclear reactors out of 60k individual blocks that each have a tile entity (game object) allocated for it? Did I mention that this is just one power generator and doesn't even involve intricate item transport networks carrying hundreds of thousands of item stacks (each of that is a game object) to your manufacturing machines (another game object with completely arbitrary rules)? The machine wasn't cheap. It cost millions of iron ingots and those had to be processed from ore and that had to be mined automatically and the machines that mine the ore were tile entities too. These aren't things that could even possibly run on a game client. Just think about the nightmare of synchronizing such a huge amount of data from the client to the server. You'd need 100mbit upload or more.
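The ~100 Mbit figure above is roughly checkable with invented but plausible per-entity numbers:

```python
def upload_mbps(entities: int, bytes_per_update: int, updates_per_sec: int) -> float:
    """Bandwidth to push every entity's state to one client, in megabits/second."""
    return entities * bytes_per_update * updates_per_sec * 8 / 1_000_000

# 60k tile entities, ~50 bytes of state each, updated 4x/sec: 96 Mbit/s,
# in the same ballpark as the comment's "100mbit upload or more".
print(upload_mbps(60_000, 50, 4))
```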
Isn't that solved by letting players run their own dedicated servers, like in the good old days?
Seems like the games industry has created or at least exacerbated this problem by making everything free to play (easy to make a new account if you get banned) and by removing dedicated servers (cost problem as you mentioned)
Not really, because part of the push to run servers for the game is to provide consistency in the game experience, lower the barrier to entry for less technical players (thus widening the playerbase), and support features that would be much harder or impossible to implement if you let anyone run a server. There are trade-offs in that decision for sure, but there are also trade-offs in going the other way.
That's not what I'm saying or implying, obviously this is something the company takes into account when producing the game.
I'm lucky to work for a company that (yes, it is still a company and wants to make a profit) is more interested in continuing to make games than in seeking great profits.
So for us, a game that doesn't make much money isn't a bad thing, unless it doesn't make its costs back, and then we're risking the ability to continue making games (cough Ghost Recon Breakpoint cough).
The truth of it is, most people don't play 3 months straight with no breaks; most will play for a month, complete a campaign, and then the player numbers will dwindle rapidly to around the 20% mark.
But we put in monetisation strategies (skins, so forth) to carry the continued cost of the server instances forward. Hardcore players might buy some and that is enough to make them cost neutral or cost positive.
Wrong link? The comment you linked doesn't really address the disappearance of self hosting servers.
My guess is it was a combination of cost-savings related to the extra polish needed for making a distributable server, but mainly as an anti-piracy measure.
I can be more comprehensive though, because there's more to it than the comment I linked.
Back in the time of dedicated servers that you mention, there were a few caveats: 1) no console players; 2) you had directory listings of publicly accessible instances; 3) it was the Wild West: connecting to some servers was at best confusing, with weird rules, or at worst actually dangerous to your computer (if there were client bugs).
So, that's the first thing, the UX is atrocious and there can be a lot of abuse.
Secondly, when it comes to crafting a "global experience" as in, one with global state which improves over time, you can do the "WoW" style and let people reverse engineer your servers so they can have Private instances. Or you can do the Diablo style where you have shared persistent state, no server package.
Other session-based games that have no persistent state are relatively easy to release server packages for, because there's no expectation that you'll build a character over 200 hours. You spin up a Counter-Strike server and it plays the same whether it's an official server or not.
But with The Division or Destiny, there is "precious loot" and the whole game is built on the idea of a world in which you can encounter other people; it loses all charm if you can control who can come in or grant yourself whatever gun you want to go kill the last boss in the game.
Overwatch allows none of that and seems to have some pretty well-designed netcode. Clients only send their controller inputs and the server has full authority of what actually happens. Blizzard's GDC talk about it is really good if you want an explanation of how to make a game feel responsive and fair without giving clients authority over everything [1].
Ofc this does not eliminate aimbot or wallhacks, but it gets rid of pretty much all other forms of cheating.
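The input-only model described above can be sketched in a few lines; this is a toy illustration, not Overwatch's actual code (the movement constants and input table are invented):

```python
from dataclasses import dataclass

@dataclass
class PlayerState:
    x: float = 0.0
    y: float = 0.0

SPEED = 5.0    # units per second (invented)
TICK = 1 / 60  # server tick length (invented)

VALID_INPUTS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def apply_input(state: PlayerState, button: str) -> PlayerState:
    """Server-authoritative step: the server derives position; unknown inputs are ignored."""
    dx, dy = VALID_INPUTS.get(button, (0, 0))
    return PlayerState(state.x + dx * SPEED * TICK, state.y + dy * SPEED * TICK)

s = PlayerState()
s = apply_input(s, "right")          # moves one tick's worth
s = apply_input(s, "teleport_9999")  # not a real input: no effect
print(s)
```

Because position is never accepted from the client, teleport and speed hacks simply have no wire format to exploit; only aim and information hacks remain.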
Each one from that list could be removed by certain criteria. In terms of client-server architecture and netcode, CS:GO is worse than the others by a few miles, yeah.
It's really not: it has full server-side simulation/verification, whereas PUBG, for instance, has none and is completely client-side.
The only viable hacks in CS:GO are aimbots and extremely limited wallhacking (basically you can see a few "feet" around corners; the server does not send you any other data).
This compares to PUBG where you can press a button and headshot the entire map at once, while flying through the air and spawning guns.
I can forgive some of these things because they are basically implementing a distributed system where events have to be finalized in 1/240th of a second and where communication latency changes the outcome of the game. Some hacks are necessary, and players have to deal with the consequences.
I never enjoy dying to a weapon that was never even fired, but that is the reality of Internet latency. The travel time of the projectile in the game is less than the travel time of the packet that says "the weapon was fired". So the game client gets that and before it can draw anything, gets another message that says "btw you died". Then you're dead to a weapon that was never even fired from your view. That sort of thing would never work for storing data to a database, but it works out okay in a fast-paced game. (I know this is not what you're talking about, but what you're talking about stems from needing to reduce latency, and this is meant to be a story about how that can be annoying even when people aren't synthesizing "hey I'm cheating!" packets.)
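The arithmetic behind dying to a "never fired" weapon is easy to sketch; the numbers below are purely illustrative, not taken from any real game:

```python
# Toy numbers: a 900 m/s bullet fired from 10 m away "lands" after ~11 ms
# of simulated time, but with 60 ms of assumed one-way network latency the
# victim's client learns about the shot well after the server has already
# resolved the hit.
bullet_travel_ms = 10 / 900 * 1000   # in-game flight time, ~11.1 ms
one_way_latency_ms = 60              # assumed network delay
deficit_ms = one_way_latency_ms - bullet_travel_ms
print(f"victim hears about the shot {deficit_ms:.0f} ms after it landed")
```

Whenever the packet travel time exceeds the projectile travel time, the "you died" message can arrive with (or before) the "weapon fired" message, and no amount of clever rendering can hide that.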
Having said that, I feel like the game industry decides that because some hacks are necessary, they're all necessary. Hearthstone is a card game and relies on the client for critical checks, and people hack the client to successfully cheat. It happened recently with a certain card that copies itself and costs 0. Obviously a strong combination. There is a limit to the number of cards you can play per turn, however, and so you couldn't really go infinite with this. The card takes a certain amount of time to play, and your turn is a fixed amount of time, and that bounds the blast radius of a combo like this. What people did, though, was just hack the client to skip animations. The server happily accepted this. That sort of hack is a real problem; the server should know how many cards you are allowed to play and just say "nope, not possible". The fact that that's missing is just a hack; why write the code if it's not needed? One day it is needed, though, and people get mad and stop playing your game.
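The missing server check described above is a few lines of code. A hypothetical sketch (the constants and the `validate_play` name are invented, not Hearthstone's real rules):

```python
MAX_CARDS_PER_TURN = 10      # assumed per-turn budget
MIN_PLAY_INTERVAL_MS = 1500  # assumed minimum real time a play can take

def validate_play(turn_state, now_ms):
    """Server-side check: even if the client skips animations, plays past
    the per-turn budget or faster than physically possible are rejected."""
    if turn_state["cards_played"] >= MAX_CARDS_PER_TURN:
        return False
    if now_ms - turn_state["last_play_ms"] < MIN_PLAY_INTERVAL_MS:
        return False
    turn_state["cards_played"] += 1
    turn_state["last_play_ms"] = now_ms
    return True

turn = {"cards_played": 0, "last_play_ms": 0}
# A hacked client spamming a play every 100 ms for ~10 s of turn time
# gets almost all of them rejected.
accepted = sum(validate_play(turn, t * 100) for t in range(1, 100))
print(accepted)  # 6
```

The server doesn't need to simulate animations at all; it only needs to know that a legitimate client *couldn't* have submitted plays any faster.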
It is a tough balance, and as someone who favors correctness and least surprise... I'm glad I don't work in the game industry. I would die. But people seem to be making money selling games, so perhaps my opinion is simply wrong.
I understood why code was run client side around the year 2000, but even then cheating destroyed certain genres of games. Still, game designers would not adjust their game design to make cheating less disruptive for other players.
The "P2P" games mentioned in the article are also not really P2P. Instead, one player gets to host the game, and a lot of logic runs on each client, as if it were an offline game session with trusted peers. In a real P2P architecture, players would cross-validate each other.
And it's hilarious how game developers have shared conferences, but still haven't created an anti-cheating association to share technology and banned-player information for heuristics. Glad to see that Valve is offering its service to others, at least.
And Crysis 2. I wanted to get into the multiplayer of that game but it was hopeless. Every single server was filled with either cheaters or players that were so damn good they could almost match the cheaters.
It's more about company strategy than available talents. There are some sacrifices associated with gameplay and online experience but most of the cutting corners are about budget.
It’s probably an artifact of aging, but one of the more beneficial things I’ve done in the past half decade is switch from online games to single player or coop.
The peaks of online games are very high; when you get a team that’s cooperative and communicative it can be great. But the expected outcome is much much lower, especially when trolls and cheaters show up.
Single player games let me enjoy the experience without it being ruined by anything other than my relatively low skill level at the game in question. Also, I can buy games that are older and play them on weaker hardware to significantly reduce my cost of ownership, since there’s no penalty for playing single player games later.
I built my initial programming/security skills by making PC game cheats, and now that I'm actually working in the software industry on other stuff I decided at the beginning of quarantine to see if I could still do it. Specifically, I targeted PUBG.
They've added obfuscation, that's about it. Even one of the guys the author interviews admits it:
> “Last year, we spent time working on various measures to block cheat programs,” explains Taeseok Jang, executive producer of PUBG PC. “Most of these actions focused on blocking cheat program developers to make it more difficult for them to create these highly lucrative cheats.”
That obfuscation was probably a huge problem when PUBG initially started adding it, but so long as some bored high school kid has a pirated copy of IDA and a desire to prove themselves, that info is going to end up online. Each new obfuscation feature or anticheat detection becomes a challenge, and the results of that challenge being inevitably solved are inevitably posted in a public and high-visibility place for others to learn from and use.
All of this public information meant that creating a cheat for the game probably added around a month or two of work to adapt to the cheat prevention efforts, on top of the month or so that I spent looking for the actual in-game structures necessary to implement the radar I was going for. I already expected every hindrance I encountered when reversing the game and writing the tooling to interact with the game's process. It was still daunting, especially since I had never touched the windows kernel until this project, but ultimately when I ended up getting everything working it felt like I was just using the same techniques I used to use back in the day only with extra steps.
My takeaways for anyone interested in preventing videogames from being cheated in:
- Cheaters will eventually find a way, but you can always reduce the quantity and quality of them.
- All information on how to write a cheat for your game eventually ends up in public forums. Keep an eye on those and learn how most people are writing cheats and target those methods specifically.
- Obfuscation (new detections, new anti-reversing measures, new countermeasures to cheating methods) buys you time in the immediate term and invalidates existing online information in the long term. It's like antibiotics: it increases the barrier to entry and the pain factor of cheating only if you keep adding to and changing it.
- Obfuscation will never be adequate to prevent cheating entirely. Human monitoring, ML, skill-based pairing, and full visibility & control over hardware the game is executing on are probably the next generation in terms of cheat prevention.
> - Cheaters will eventually find a way, but you can always reduce the quantity and quality of them.
> - All information on how to write a cheat for your game eventually ends up in public forums. Keep an eye on those and learn how most people are writing cheats and target those methods specifically.
> - Obfuscation (new detections, new anti-reversing measures, new countermeasures to cheating methods) buys you time in the immediate term and invalidates existing online information in the long term. It's like antibiotics: it increases the barrier to entry and the pain factor of cheating only if you keep adding to and changing it.
> - Obfuscation will never be adequate to prevent cheating entirely. Human monitoring, skill-based pairing, and full visibility & control over hardware the game is executing on are probably the next generation in terms of cheat prevention.
Cheating in online gaming is prevalent. I think a lot of people find enjoyment in it, so there's a captive market for people selling cheats.
But I also wonder about competitiveness when it comes to people playing with others with better than average hardware.
Like someone who has a 240hz monitor and the performance to back it up versus someone with a low power slim laptop.
Then consoles could be considered a level playing field, except for the Pro variants. And there also seem to be fancier controllers with extra buttons on the rear side.
Of course adding cheating to the mix. Personally I have no interest in playing online multiplayer games.
Not really. Just as a beginner painter using the finest unicorn-hair brushes or whatever would not be able to paint the Mona Lisa, "gaming" hardware gives a very marginal benefit, one that is better taken advantage of when you actually have the game skill.
For all but the highest ranks, it does not make much of an actual difference.
Most games have some sort of skill ceiling, or at least narrow bands of clustered players. Hardware will put you slightly above competitors of the same cluster, but will not by itself enable you to climb to the next cluster.
This is evident in cross-platform games, where in casual or non-professional competitive play, a good console player can still beat a slightly less good fully-geared PC player, despite the 30fps and controller "handicaps".
People playing with better-than-average HW is not such a big issue. It could matter a bit for the absolute top-tier competitors, but that's under 0.01% of the player base, and only in certain genres (FPS primarily).
Cheating, however, affects everyone and is much more frustrating to play against than really good players are. Being outplayed has its merits (you can learn something), but being killed by cheaters feels, well, cheated. It's just a waste of time and energy.
It seems trivially easy to me to just look at game stats and catch 80% of cheaters. I assume they are gathering all sorts of information to balance the game... so just use that data to catch cheaters and ban them after the game.
When you watch these cheaters play it is obvious. They laser everyone in the head. Number of kills, type of guns used, distance of kills, hit rate, head shot rate, distance traveled, etc, etc. Just look for consistent extreme outliers.
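Flagging "consistent extreme outliers" can be as simple as a z-score over one of those stats. A toy sketch on headshot rate alone (the cutoff is an assumption to tune per game, set deliberately high so genuinely great players stay below it; real systems would combine many signals):

```python
import statistics

def flag_outliers(headshot_rates, z_cutoff=4.0):
    """Return players whose headshot rate is an extreme statistical
    outlier relative to the whole population."""
    mean = statistics.mean(headshot_rates.values())
    sd = statistics.pstdev(headshot_rates.values())
    if sd == 0:
        return []
    return [p for p, r in headshot_rates.items() if (r - mean) / sd > z_cutoff]

# 100 ordinary players clustered around a 15-19% headshot rate...
rates = {f"player{i}": 0.15 + 0.01 * (i % 5) for i in range(100)}
# ...and one account that lasers everyone in the head.
rates["suspicious"] = 0.98
print(flag_outliers(rates))  # ['suspicious']
```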
There is a push towards this and some companies offering 3rd party systems to do ML based detection of cheaters.
The basic checks are trivially easy to put into place (kills past weapon range, warping, impossible movements, etc.), although someone has to do the grunt work (and make sure the checks are not invalidated by certain gameplay features).
Once you get into things like headshot rates, it gets trickier to differentiate between players with amazing hand-eye coordination and cheaters. The line is really, really thin. Add in internet latency spikes and it gets even harder.
Some cheaters will use things like aim-bots to nudge their mouse inputs and give an advantage that is difficult to see even when watching them live on stream. Others will blatantly cheat, get detected and kicked within a few minutes (after ruining some people's experience), create a new account and repeat, multiple times an hour.
It doesn't have to be perfect. Some players are incredibly good and might show up as extreme outliers but whenever I watch cheaters it is so incredibly obvious. Cheaters don't play like Shroud. They kill people with snipers that they can't even see on their screen. It should not be that difficult to differentiate a cheater from Shroud.
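The basic checks mentioned above (kills past weapon range, impossible movements) can be sketched in a few lines. Everything here is a made-up illustration, including the weapon ranges and the slack factor that keeps laggy-but-honest players from being punished:

```python
WEAPON_RANGE = {"pistol": 50.0, "sniper": 800.0}  # assumed max ranges, metres
MAX_SPEED = 8.0                                   # assumed sprint speed, m/s
LATENCY_SLACK = 1.5                               # forgiveness for lag spikes

def check_kill(weapon, distance):
    """Reject kills beyond the weapon's maximum effective range."""
    return distance <= WEAPON_RANGE[weapon]

def check_movement(prev_pos, pos, dt):
    """Reject movement faster than sprint speed plus generous slack,
    so warps and teleports fail without flagging laggy players."""
    dist = ((pos[0] - prev_pos[0]) ** 2 + (pos[1] - prev_pos[1]) ** 2) ** 0.5
    return dist <= MAX_SPEED * dt * LATENCY_SLACK

print(check_kill("pistol", 400.0))          # False: pistol kill at 400 m
print(check_movement((0, 0), (100, 0), 1))  # False: 100 m in one second
```

Checks like these don't catch subtle aimbots, but they do catch the "headshot the entire map while flying" class of cheat outright.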
I play a game that has hacks publicly available with no real enforcement. 95% of people go the obvious route. If you got rid of the obvious hackers I would see a couple cheaters per year instead of multiple per day.
No excuses for that! Assuming the game publisher wants to pay for developer time to do the fixes, generally that doesn't happen until the hackers are obviously affecting the bottom-line.
Certainly more subtle cheats exist, but blatant cheating was one of the reasons I had to stop playing Battlefield series games on PC. It was relatively common for there to be some 80-100:1 KDR player on a team who would admit to cheating in game chat. If they'd get banned, they'd buy another hacked account and just do the same thing again.
Pretty much this; if someone's cheating so subtly that you can't tell they're cheating, so? Just bump them up to the next skill rank; if they make it to e-sports level they'll get found out soon enough, and if they don't, how is that different from someone with unfair amounts of natural talent? For that matter, would you insist that your players submit a urine sample to prove they aren't on amphetamines?
If the opposing team can distinguish between cheating and non-cheating, so can the admins; if they can't, what's the point?
For a long time, this was sold as a plus for gaming consoles, but afaik even there a bunch of cheating tools are available now. So right now game streaming is being sold as the next savior of online gaming since everything runs in the cloud.
But being the pessimist I usually am, and reading about how the cheating business is apparently a multi-million-dollar market, I expect this not to hold up forever either. There are already AIs that can play Doom and StarCraft just by scraping the screen, so if people are willing to pay for cheating tools, they'd probably also be willing to attach something to their Google Stadia that captures the screen and simulates input.
> For a long time, this was sold as a plus for gaming consoles
And it will be again when the next generation comes out this year. This happens every console generation; in the last year of the cycle, cheating tools finally catch up.
BS. There are thousands of games developed as dedicated online games without any intention of having a client only version.
The difficulty is balancing how much of the game runs on the client, how much on the server and how state is synchronized between clients and server.
Oh and it needs to render a photorealistic world with 2km Draw distances at a constant 144Hz with no more than 50ms latency between a button press and the other players seeing the result of that press. The servers will be receiving hundreds of packets per second for each game instance, will potentially be hosting thousands of game instances and must cost at most a few cents per player hour.
I understand how difficult it is to build an online game, especially real-time MMOs with seamless maps.
> The difficulty is balancing how much of the game runs on the client
However, it's important to have the mentality of building on the server by default, and to try other optimizations before giving in and trusting the client.
The reason games are fighting this cheating surge is mostly that the mindset in the game industry is "security bolted on", instead of the "security built in" you see more of in web development.
How does that help? The host can see inside the VM.
There have been a few games whose obfuscation was effectively a different instruction set being run in a VM, and it does slow down hacking by a few months while people reverse engineer and retool. Can't remember names though.
I wonder whether some day the industry will discover cloud-gaming as a "solution" for cheating. After all, if the gamer has no access to the system, many ways for cheating are impossible. And if you design a game around the additional lag, it might work well enough to feed it to the masses.
I doubt anyone will pursue game streaming purely as a solution for cheating, but it is a nice second order effect. Hosted servers are already an expensive ongoing cost to supporting a live game and cloud streaming would magnify that cost several times.
What costs? I'm talking about Google Stadia or Nvidia GeForce Now, which are paid for by the users, not the game company. If anything, the game company might even get paid by the cloud provider for enforcing usage of the cloud.
Submitting duplicates is not impossible indefinitely, just for something like 12 hours.
(And yes, it's quite random whether even a good submission ends up on the front page or not. But there's no point in complaining about it in the comments. Linking to earlier submissions mostly makes sense if there's interesting discussion there.)