7/13/2023 0 Comments

Shards of war makers

Disclaimer: The Soapbox column is entirely the opinion of this week's writer and does not necessarily reflect the views of Massively as a whole. If you're afraid of opinions other than your own, you might want to skip this column.

The MMO genre is now over a decade old, and in that time we've seen countless innovations in game design, graphics technology and hardware infrastructure. Some of these innovations have become so essential that without them a game looks cheap, old or backward. A functional market or auction mechanic now replaces the old meet-and-trade style barter of some early MMOs, for example, and an MMO without copious map and chat tools is seen as grossly incomplete. The limits of what is possible have been pushed gradually forward, and yet certain ideas that were formed in the genre's infancy still seem to stick to new titles like glue.

Sharded server models made a lot of sense in the early 2000s, when server hosting was expensive and the teams working on server code were small. Those limitations have been rapidly shattered in recent years, but new MMOs still shard their communities into small groups. There are even alternative server models out there that are just as cost-effective as the sharded model but are devoid of the negative side effects of smashing the community into hundreds of pieces. Read on as I take a look at why developers rely on the sharded server model, the problems surrounding splitting communities and what alternative server models are out there.

Back when MMOs were just starting to emerge, the games industry was operating in a fledgling market and online games weren't seen as the money-makers they have now become. Today, MMOs make so much money that companies like CCP Games can throw $50,000 US at buying a single piece of server hardware. Combined with a dramatic drop in the price of computing over the past few years, the regular income a successful MMO collects gives its development studio an unprecedented ability to buy high-end server equipment. Running a shard on higher-end machines will raise the limit on the number of players that shard can handle simultaneously, but at some point upgrades stop being a cost-effective way to squeeze more players into the game.

The solution most developers come up with is simply to start more servers. By running separate copies of the server software on cheap hardware or generic server hosting, companies can support as many players as they want. From a business standpoint, it looks like a great solution: costs scale up linearly with the total number of players supported, and server population limits can be used to keep each server from exceeding the performance requirements of the hardware it's running on.

The problem with sharding isn't in the actual splitting of players across multiple servers. Any MMO with a large enough playerbase will eventually exceed the computational capacity that one server can provide, and splitting players across servers is a necessary step in balancing that load. The problem is with splitting the game into separate micro-communities, each with its own economics, politics and achievements. Players are restricted to interacting with a small subset of the overall game community. This does allow more players and guilds to get "first" kills on raid bosses and lets several players worldwide have the same character name, but the strategy is not without its downfalls. If you find someone in real life who plays the same MMO as you, there's a high likelihood that you both play on different servers. The negative aspects of sharding an in-game community in this manner are far more detrimental than not being the first to kill a boss or get the name you want.
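The "start more servers" approach with per-shard population caps can be sketched in a few lines. This is a minimal illustrative sketch under my own assumptions, not any real MMO's server code; the `Shard` and `ShardCluster` names and the tiny capacity of 2 are hypothetical.

```python
# Toy model of sharded capacity planning: each shard has a hard population
# cap, and when every shard is full, the operator simply starts another one.
from dataclasses import dataclass, field


@dataclass
class Shard:
    name: str
    capacity: int                      # max concurrent players this hardware supports
    players: set = field(default_factory=set)

    def is_full(self) -> bool:
        return len(self.players) >= self.capacity


class ShardCluster:
    """Assigns players to shards; spins up a new shard when all are full."""

    def __init__(self, capacity_per_shard: int):
        self.capacity = capacity_per_shard
        self.shards: list[Shard] = []

    def assign(self, player: str) -> Shard:
        # Prefer the first existing shard with free capacity.
        for shard in self.shards:
            if not shard.is_full():
                shard.players.add(player)
                return shard
        # All shards full: costs grow linearly, one cheap server at a time.
        new_shard = Shard(name=f"shard-{len(self.shards) + 1}",
                          capacity=self.capacity)
        new_shard.players.add(player)
        self.shards.append(new_shard)
        return new_shard


cluster = ShardCluster(capacity_per_shard=2)
for p in ["ana", "bo", "cy"]:
    cluster.assign(p)
print(len(cluster.shards))  # → 2: the third player forced a second shard
```

Note how the toy cluster also reproduces the downside the column describes: "ana" and "cy" both play the same game, yet they land on different shards and can never interact in-game.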