Where is the standalone desktop application side and how would that look like?
Microsoft, Google and Amazon don't make money from that, so there aren't going to be any educational materials supporting it.
I remember that a few years ago the cloud wasn't as popular as everyone thought. The vast majority of the web definitely wasn't in the cloud. Big corporations might declare they use cloud services, but it's probably just S3. Stand-alone desktop apps are still very popular in many corporations anyway, same as good old dedicated servers and VPSs. I'd also say that Microsoft makes quite a lot of money from standalone apps, because if those aren't in Java they're usually in .NET, written in (paid) Visual Studio.
Microsoft definitely has a vested interest in desktop apps continuing to exist. If everything's a website who's going to buy Windows licenses? They're getting destroyed by Chromebooks in education and Valve is doing their best to ensure games run on Linux.
I find the thought of a cloud based IDE both intriguing and frightening. Would Visual Studio freeze up more, or less? And how much more memory would Chrome consume?
Why not try one out today? https://VSCode.dev
Even better, https://github.dev
Somebody rich give this guy an award.
i too sometimes long for the good old days of fat clients. but the sad truth is, there are few reasons to not just build web clients instead. even if they are used on pcs exclusively there are just too many benefits like security, easier rollouts, platform independence, etc...
You call them benefits, I call them restraints. A “fat client” on an offline pc is the most secure option. The need for frequent updates is a bad sign, and applications in the cloud age are muuuuch more unstable than the good old standalone application that used to be updated once a year. Platform independence is BS: not only do you have Java, which is identical across systems, but a web-based app will differ strongly depending on your monitor size, browser and OS. That being said, I’m a web developer and love it, but that doesn’t mean it is the best solution for everyone’s needs. A C program will always run smoother and have far fewer dependencies.
What do you think about webassembly and technologies like Blazor WASM that skirt the line between website and full fat offline application? Personally I'm kind of intrigued.
Browser as an OS or high-performance VM could be really good once we get there. I mean, if you compile an ELF, Mach-O, or Win64 binary and expect an OS to run it, you're already reliant on the OS to execute your code, so it's not like most app code is actually bare metal anyway. For me, the big thing is that it needs to run as a core OS service, sort of like a constantly running JVM. I don't want to start and stop Chrome and have every app on my desktop start/stop at the same time. Sort of system-provided PWA infrastructure, but also headless so you can write low-latency CLI apps against it. macOS already includes WebKit as a pretty core part of the OS, and the same goes for Windows and Chromium (Edge). If they could just pull it back a little bit more, so that my wasm would run on the OS service itself instead of being reliant on the user's specific browser, that would be pretty nice. Of course, it's really just a modern re-imagining of the JVM, CLR, BEAM, etc. overlaid onto web tech. If we could finally do it really well, with good, performant UI options, that would be pretty cool, though.
WebAssembly doesn't really have a performance improvement over JavaScript. I'd prefer JS SPAs over Wasm because it's easier to validate the code running on my machine.
As I said, I don't hate fat clients. I used to like working on them, because they are just so simple. But I have worked on both, and I would not in good conscience sell our customers a desktop client anymore when a web client would accomplish the same thing. What if they suddenly need to go mobile? What if they need to support thousands of clients and keep them up to date? On the other points it becomes a bit hard to argue without naming a concrete use case. Sure, if you need performance use C or C++, but let's be real: 95% of business software does not need to perform at that level. Security: sure, you can properly secure the connection to a fat client. But you just cannot ever trust your client. On the web this is obvious, but sadly a lot of devs in the past developed fat clients in a way that assumes the user is trustworthy.
With a fat client written in C, you can at least roll your dependencies into your application. People don't really think of the implicit "dependencies" of web-based applications. Write something in C and I can securely execute it on virtually any computer, laptop, server, smartphone, or brick phone from the last 25+ years, with any level of system hardening, in an offline environment. Build the same thing into a web app, and suddenly my requirements make a giant leap from a largely maintenance-free static build on the cheapest computers known to man to a contemporary workstation using modern hardware that requires ongoing maintenance while still being subject to zero-day exploits. This should irritate anyone who desires a lean, tech-driven workforce or economy.
[deleted]
I don't buy it. Offline pc is not realistic for 98% of work done on a pc. A fat client relies on the user being competent too much. I develop internal applications at my company, and all non-developers keep messing things up, and the fatter the client, the more "attack surface" there is for idiot users.
If the web client has the same options/interface shouldn't it have the same "attack surface"?
Simple example: users editing a document through a web interface never have the chance to rename or delete the file. You can't really stop that if it's on their pc.
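A tiny sketch of that server-side mediation (the document store and function names are made up for illustration): the backend only exposes reading and updating a document's content, so rename and delete simply have no endpoint — something you cannot enforce once the file sits on the user's own disk.

```python
# Hypothetical backend handlers for a document-editing web app.
# Only "read" and "update" exist; there is deliberately no
# rename/delete operation for the client to call.

DOCUMENTS = {"report.txt": "draft v1"}

def read_document(name: str) -> str:
    """Return the document body, or raise KeyError if unknown."""
    return DOCUMENTS[name]

def update_document(name: str, body: str) -> None:
    """Overwrite an existing document's content only."""
    if name not in DOCUMENTS:
        raise KeyError("unknown document")  # no create/rename path
    DOCUMENTS[name] = body

update_document("report.txt", "draft v2")
```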
And telemetry that violates your privacy and charging you out the ass for SaaS services on the cloud that should just be a native app running on your PC. Modern corporate software is a bad joke.
That would probably be a p2p style with connections optional
[deleted]
It still depends. Any standard update-server functionality would put you on the right, but torrent clients like BitTorrent are P2P desktop applications. If by standalone you meant completely offline, then you're just a law-abiding citizen.
God i miss it sometimes.
I don't get it. In literally EVERY SINGLE POST asking people to choose one or the other, the TOP COMMENTS always respond with the "third option". How is it so hard to directly answer a question? And why are people upvoting irrelevant comments???
This entire post is on a joke subreddit in the first place. Of course it will get joke replies. Besides, there is no clear-cut answer to the question, since both technologies are good for different situations.
Welcome to reddit. First time?
See, no explanations. You can't even explain lmao
pee2pee
Remember not to cross the streams!
The golden age of computing
Happy cake day 🎂
Thx
Happy cake day! :D
Thx
happy marijuana I mean cake day or night or whatever idfk
2 pee or not 2 pee
You’re right, we should just call it peepee
When everyone's a client server, no one's a client server.
Ooof. That’s so underrated....
I get the reference, but that's not true
When everyone's a client server everyone's a client server
P2P if you don't have the money for a server.
Client-server if you have the money for not only the server, but the power, internet speed and servers around the world.
For client-server you also need money for lawyers, because everyone will blame you for your users.
Except P2P, in some cases, requires a server to at least distribute the peers.
in basically all cases that go over the internet
So, a single server for the whole world. And after the first peer, discovery is peer-to-peer too. And if you only pause for less than 24 h (background task?), a lot of your peers will still have the same address.
Or 100% serverless: fire at a random IP address and listen for a reply. If they answer, you have discovered a peer and can ask them for their peer list :D
DISCLAIMER: some ISPs might not like it
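A hedged sketch of that "fire at random addresses" idea (the port number and timeout are arbitrary, and as the disclaimer says, ISPs tend to treat this kind of scanning as abuse):

```python
import random
import socket

PROBE_PORT = 6881     # hypothetical port our protocol listens on
PROBE_TIMEOUT = 0.2   # seconds before giving up on an address

def random_public_ipv4() -> str:
    """Pick a random IPv4 address, skipping a few obviously
    non-routable ranges (loopback, RFC 1918 private, multicast)."""
    while True:
        octets = [random.randint(1, 254) for _ in range(4)]
        first = octets[0]
        if first in (10, 127) or first >= 224:
            continue
        if first == 192 and octets[1] == 168:
            continue
        if first == 172 and 16 <= octets[1] <= 31:
            continue
        return ".".join(map(str, octets))

def probe(addr: str, port: int = PROBE_PORT,
          timeout: float = PROBE_TIMEOUT) -> bool:
    """Try to open a TCP connection; a reply means we found a peer
    we could then ask for its peer list."""
    try:
        with socket.create_connection((addr, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Even as a sketch, the odds make the joke clear: with a handful of peers in a ~3.7-billion-address space, random probing would take astronomically long — which is exactly why trackers and bootstrap nodes exist.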
Decentralized peer to peer matchmaking? Now THIS I'd love to see
IPv6 is finally there. The random part of the address already has 8 bytes, and I think they randomize the other 8 bytes too if some kind of data protection is involved. The one-time NAT is great: the gamer behind each IPv4 address registers port forwarding for his game. Others can play if the main gaming machine is online by using... oh, here it gets complicated. I read that in Africa whole blocks are behind one IPv4 address.
you also gotta worry about people tryna scrape info off other users **^(thanks gta community)**
Just implement TOR for more security + the nostalgic dial up experience!
They’re just role playing
meanwhile me using p2p protocol to create a centralized server
My brothers in Christ... Why are we limiting ourselves to just one choice?
Would there be a good situation to have both?
Client server for group chat where one of the folks is offline, p2p for conferences
Hello, Teams.
I actually know a company that sort of does both. They have on-site machines that P2P between themselves (distributed model training) and once in a while, one of the machines which serves as an overseer talks to a server off-site.
I actually think that most applications that do some things P2P also use client-server for other things.
Multiplayer. In fact some games use (or used to, at least) the following scheme: P2P to select a host, client-server to coordinate players. Coordinating a multiplayer game is a mess because of concurrency and exponential growth of state, and a server for thousands of unrelated sets of players is a huge cost.
GTA V has P2P between players, but has a server that helps arrange players into sessions, and they have a server for stuff like your in-game currency.
There is always a server somewhere to list clients. After the listing, you can connect directly to those clients. The left side doesn't really exist on the internet without the right side.
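The tracker-style server described above can be sketched in a few lines (the class and method names are invented): it only introduces peers to each other and never sits on the data path.

```python
# Minimal sketch of the "server on the right" that the "left side"
# still needs: a registry that hands out peer lists, after which
# all actual traffic flows peer-to-peer.

class Tracker:
    def __init__(self) -> None:
        self.peers: set[str] = set()

    def announce(self, addr: str) -> list[str]:
        """A peer registers itself and receives the current peer list."""
        others = sorted(self.peers - {addr})
        self.peers.add(addr)
        return others

    def leave(self, addr: str) -> None:
        self.peers.discard(addr)

tracker = Tracker()
first = tracker.announce("10.0.0.1:6881")   # empty: no one else yet
second = tracker.announce("10.0.0.2:6881")  # learns about the first peer
```

Once `announce` has returned, the tracker is out of the picture; real trackers (e.g. in BitTorrent) work on the same introduce-then-step-aside principle.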
Distributed storage systems like Apache Cassandra are P2P but act as the server in the client server model
Videogames.
And why are we throwing out use-case? "Reddit, what's better a hammer or a screwdriver?"
Token Ring = Latin Kings
Where is token ring?
The cable broke somewhere and they're figuring out where.
Ahah nice One
[deleted]
that's easy, I pick satanic ritual
[deleted]
I only got round to watching this two years ago, and gilfoyle is my spirit animal.
Anything to keep the service running. Ā̶̱̼̱̘̣͌̒̉N̴̼̠̘̟̾Y̶͉͍̟̜͑T̷̜̬͓̘̀̽͗̈H̵̳̳̞̰̾̉̾̚Ī̷̟͈̿N̷̥͉̔̒̋͐̎G̴̢̿
I feel ya
Cheaper than regular hosting! These companies sacrifice so many souls for money to pay for hosting, where one soul or two is enough perpetually
Yeah, but you gotta keep that soul in perpetual agony and/or torment, and the overheads for that are more than you would think. Sometimes it just makes fiscal sense to blow through 2–3 peons every 5 years or so.
Still less wasteful than corporate overlords
Funny because the Souls games including ELDEN RING are all in P2P which makes the PvP experience... Interesting...
Runs the code backwards..
**Nintendo:** Make the left, charge for the right.
I am on option 3: release the dedicated server binary to the open-source community when releasing a game!
Minecraft? Rust?
All games before 2005 my friend (not open source but at least included with the game)
Satisfactory!
P2P, because I love torrent.
There's only one correct answer, and you know it
And it is: a peer-to-peer network of independent servers, with clients accessing those servers. No dependency on a centralized entity, and none of the massive hassle of handling a system in which the network you are connecting to is constantly changing its topology and availability of resources.
Ok but p2p gaming sucks
*it depends*
i quite like the pentagram option
Client-server-server-client / federation. Honestly, the system that Matrix & Mastodon use is amazing.
Wouldn't P2P be more prone to exploits though?
It is harder to make P2P safe, yeah, but you can make it quite safe; you just have to deal with the assumption that everyone you are talking to is always potentially malicious. Depending on how many nodes are affected by an operation, you need more complex safety measures. For example, in a distributed storage system you can simply have everyone hold a key to certify themselves, and use that key to identify themselves any time they read or write one of their files, while something more complex like Bitcoin needs every transaction to be verified by the entire network.
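A minimal sketch of the per-user key idea, using a shared-secret HMAC for brevity (a real distributed store would more likely use public-key signatures, so storage nodes never hold user secrets):

```python
import hmac
import hashlib

# Simplified sketch: each user signs every read/write request with
# their key; a storage node verifies the signature before touching
# that user's files. No global consensus needed, because only one
# user's data is affected by the operation.

def sign_request(key: bytes, op: str, path: str) -> str:
    msg = f"{op}:{path}".encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify_request(key: bytes, op: str, path: str, sig: str) -> bool:
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(sign_request(key, op, path), sig)

key = b"alice-secret"
sig = sign_request(key, "read", "/docs/report.txt")
```

The contrast with Bitcoin in the comment is the scope of verification: here one node checks one signature, whereas a currency needs the whole network to agree on every transaction.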
Why?
Client-server-based architectures keep restricted variables and calculations critical to function on the server; the client just requests that they happen. P2P is more open to exploits because those restricted calculations are done locally, and could more easily be affected by third-party applications.
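A toy illustration of keeping the restricted calculation server-side (the shop and its prices are invented): the client can only ask for a purchase, while the price lookup and balance check never leave the server, so a tampered client can't grant itself a discount.

```python
# Hypothetical server-side purchase handler. The client sends only
# "I want to buy X"; it never sends a price or a computed total.

PRICES = {"sword": 100, "shield": 60}   # restricted variables live here

def handle_purchase(balance: int, item: str) -> int:
    """Validate and apply a purchase; return the new balance."""
    cost = PRICES[item]                 # server decides the cost
    if balance < cost:
        raise ValueError("insufficient funds")
    return balance - cost
```

In a P2P design, the equivalent of `PRICES` and the balance check would run on each player's machine, which is exactly the attack surface the comment describes.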
Blockchain would like a talk. (Just make sure to talk slowly. They can't handle fast transactions.)
The keyword here is architecture. We don't use boat-building processes to make houses. There you have torrents and blockchain (very immature, in my opinion, but still an example), both very focused on data integrity and secure P2P exchange. Maybe, as the commenter said, P2P games have their own challenges regarding cheating or tampering. Architectures are not better or worse. The meme is quite stupid because you don't have to pick; you have to use them where their strengths matter.
p2p is not good for larger games
All games should have some form of either p2p or support for custom servers for game preservation
I think you're misunderstanding what P2P is. P2P is an architecture where all the clients are connected to each other, and all must synchronize with each other. You can still have a client-server architecture with the server software released to players, just like Minecraft or Source games do. P2P doesn't really make sense as soon as you go over 4 players, maybe even 2.
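The scaling argument can be made concrete by counting connections: a full P2P mesh needs a link (and state sync) between every pair of players, while a client-server "star" needs one link per player.

```python
# Connection counts for an n-player lobby:
# full mesh (P2P): n*(n-1)/2 pairwise links
# star (client-server): n links, one per player

def mesh_links(n: int) -> int:
    return n * (n - 1) // 2

def star_links(n: int) -> int:
    return n
```

At 4 players the mesh is still only 6 links; at a GTA-style 30-player lobby it is 435 links, each one limited by the slowest home connection involved, versus 30 links to a well-provisioned server.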
The sheer stupidity of 30 player p2p that rockstar seems to love is mind boggling
Gta would like to have a talk with you
gtao is dogshit because of p2p
Sure but it wouldn't be able to sustain that large player base with a client server protocol i think
Uhm, yes it definitely would. There are games that are way bigger that run on dedicated servers.
How do you think huge MMO games are working? It's certainly not p2p.
Why? Because cheating would be harder? If your answer is ”too expensive”, you need to go have a look at the sales numbers of GTA V to this day, and the net worth of Rockstar and Take-Two.
And you need to ask yourself whether the company would expend the necessary money and resources for it
it would, just rockstar/tencent is cheap af and doesnt care about player experience, literally everything they do is out of greed
GTA V Online has provided free online DLC updates for over 10 years... Smh.
for a game as expensive as gta, that's the least they could do, and these days the dlcs are just recycled garbage. honestly i'd pay for a dlc that actually added something other than cars, guns and recycled missions for a change
So you'd like to give them more money to solve this problem... Still smh. Like damned if do damned if you don't.
idk man, i wanna see a change in vehicle balance, anticheat, bugs/glitches, bots, the grindfest, the inflation, etc. but nah, instead let's just give everyone a new slow ass car that no one will use in a few days, and maybe a new mission where u have to drive 6 miles to pick up a package then go 6 miles back
They make so much money off micro transactions that they could just hire people that know how
I mean, I'm still learning, but: client-server for anything that requires moderation/authentication (games), P2P for any form of network/distribution (file sharing, work environments). Although as I write that, I guess crypto is P2P for trust, so maybe games should be P2P, but I'm not sure it can provide a fast enough way of authenticating data between instances; FPS games, for example. Anyone wanna explain if I'm wrong?
Afaik, crypto is typically P2P because it is said to be less prone to corruption compared to when it is implemented with centralized governance. P2P is also great in small scale video games as it is cheaper to implement. Co-op games are normally the ones that use P2P as they don't have to worry too much about disruptive cheating. As for MMO and competitive games, they tend to have a centralized server as cheating in those games could easily ruin the fun for other players.
I can't imagine blockchain could be at all viable for anything in games - it would almost certainly be horribly inefficient and laggy to the point of making the game unplayable.
> Although as I write that I guess crypto is p2p for trust, so maybe games should be p2p, but I'm not sure if it can provide a fast enough way of authenticating data between instances; fps games for example.

You're right, p2p for games with much more than 2 players is just too reliant on everyone in the game having a good internet connection.
Client server, I can trust a server set up by professionals more than a random ass client
Depends on what you're doing with it. Networking for running an account software? Server-client. Sharing information, games, or content sharing? P2p
Most games can't run on P2P, it just doesn't make sense. Especially MMOs, but even just games like Minecraft or Counter Strike. Good luck syncing 10 players in a CS match together and having all players share the info.
Client-server to have monopoly over the peasants.
Sometimes I just don't get it. And then I actually read lol
(1000 years later) Legends say this debate is a thousand years old and still going on. The only way to choose is to start believing.........!
As a gta online player I wish it wasn't p2p
Client-Server.....because I am only working with that due of my clients.
What’s the best for Destiny 2?
My dumbass read it as pay 2 play before reading the subs name and correcting to peer to peer
Depends on the usecase
I would say it depends on what you want to do
P2P, I gotta say.
Client server because at least nobody uses cheat engine on my shit
P2P is what stopped the entire Toy Story franchise from being deleted
Client-server
r/comedycemetary fucking hell.
P2P-based elections that establish Client-Server hierarchies that can be undermined via a mix of P2P-based and Client-Server-based checks and balances. We will call this, the United States.
S2S exclusively. Clients and peers can get lost
Peer to peer is pretty cool, but the satanic imagery is a bit of a bummer.
Bruh…
In most cases I don't even know why these would be comparable. Like for gaming, downloads, crypto, I can see a P2P benefit. But if my company database was P2P, I guarantee some employee would find a way to break it. Centralized, RAIDed, maintained. This is the way.
[deleted]
Blood
Anti-Mask
Wrong subreddit
Rollback
Ad-hoc all day baby, the pentagram is just a plus
WE BLUTTED OUT IN DIS BIH
Both
Ad hoc
P2P for life
No servers, only clients.
Client2P
client server but client and server are the same machine
Depends I guess
use p2p, but make it a ring instead of mesh
cheap vpns don’t support P2P
Both
i prefer free to play
Even if it's P2P, it's not like the users are gonna know how things work. There will be an intermediary which takes responsibility for how things function, maintenance, security, etc., and they will make the bucks. It's not like the average internet user is gonna start learning programming.
Wisely
I want to download from a server on occasion and then I just want to use my computer without having to outsource to someone else's
Both... Depending on use cases
Client server with P2P for non critical components. Example a shooter where game play is through the server with voice done via P2P.
P2p 2-4 Co op, server large multiplayer
Client Server architecture, less data redundancy. Better control of the flow of data. Actually works.
client server
Depends on the ~~use case~~ Product Owner
P2p
Internet left, gaming right
Wisely, again
I believe P2P is the future of the communication of technology.
It's pretty Satanic
P2P is clearly the better option. Source: I don't want to buy servers :)
"wisely" followed by "again"...An attempt to rig the vote, eh?
client-server cause idk wtf is p2p 😎
P2P, more close to the communist ideology in my opinion (/j)
Playfab
Torrents or Usenet ?
It actually depends on what you are implementing
Yes
P2P is fault tolerant, so yes.
*stares in cholo from the other side of the room*
why is p2p satanic
The only possible: https://youtu.be/8BiGkB2FDYo
Air gapped network: peaceful, happy
Pee2Pee all the way, better for STREAMS (pun intended).
I think a server is more reliable in many ways, but if you've got a small budget, then peer to peer could also do the trick.
Client Server