> Yes, and I also think it's a toy version of an even larger problem: how to devolve power in general.
> Human societies are networks too. I think this work has political and philosophical implications inasmuch as the same information-theoretic principles that govern computer networks might also operate in human ones.
> If we can fix it here, maybe it can help us find new ways of fixing it there.
Or the other way around for that matter. Look at the societies that work best and see how they do it.
> I wonder what might be done if we could pair mesh nets with broadcast media? Has anyone looked into that? I picture a digitally encoded shortwave analog to a "numbers station" that continuously broadcasts the current mesh net consensus for trust anchor points and high-availability nodes.
I think we have two different problems here and it makes sense to distinguish them.
The first problem is the key distribution problem, which is an authentication problem. You have some name or other identity and you need a trustworthy method of obtaining the corresponding public key.
The second problem is the communication problem, which is a reliability/availability problem. You have some public key and you want to make a [more] direct connection to it so you need to identify someone or some path that can be trusted to reliably deliver the request.
Traditional broadcast media can actually solve both of them in different ways. Key distribution has the narrower solution. If you're The New York Times or CBS then you can e.g. print the QR code of your public key fingerprint on the back page of every issue. A reader who picks up an issue from a random news stand can have good confidence that the key isn't forged because distributing a hundred thousand forged copies of The New York Times every day or setting up a 50KW transmitter on a frequency allocated to CBS would be extremely conspicuous and would quickly cause the perpetrator to get sued or arrested or shut down by the FCC. But that only works if you yourself are the broadcaster (or you trust them to essentially act as a CA). And pirate radio doesn't have the same effect because the fact that the FCC will find you and eat you is *why* you can trust that a broadcast on CBS is actually from CBS. Without that it's just self-signed certificates.
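The printed-fingerprint idea above can be sketched in a few lines. This is a minimal illustration, not any particular paper's scheme: the key bytes and names are made up, and a real deployment would fingerprint a properly encoded key, but the verification logic (hash the key, compare against the out-of-band copy) is the whole trick.

```python
# Sketch: deriving a short, printable fingerprint from a public key --
# the kind of string a newspaper could print as a QR code on its back page.
# The key bytes below are placeholders, not real keys.
import base64
import hashlib

def fingerprint(pubkey_bytes: bytes) -> str:
    """SHA-256 of the key, base32-encoded (QR/print friendly), padding stripped."""
    digest = hashlib.sha256(pubkey_bytes).digest()
    return base64.b32encode(digest).decode("ascii").rstrip("=")

# The reader scans the printed fingerprint and compares it to the key
# they actually fetched over the (untrusted) network:
printed = fingerprint(b"example-public-key")
fetched = fingerprint(b"example-public-key")
assert printed == fetched   # genuine key: fingerprints match

forged = fingerprint(b"forged-public-key")
assert printed != forged    # substituted key: mismatch is detected
```

The security comes entirely from the physical channel: forging the fingerprint means forging every printed copy, which is exactly the conspicuous, expensive attack described above.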
By contrast, broadcasting could theoretically solve the availability problem for everyone. If anyone can broadcast a message and have it be received by everyone else then you've essentially solved the problem. The trouble is the efficiency. That's just the nature of broadcast. NBC broadcasting TV to millions of households who aren't watching it is an enormous waste of radio spectrum but it's a sunk cost (at least until the FCC reallocates more of their spectrum). You can even do the same thing without a broadcast tower, it just has the same lack of efficiency. It's simple enough to have every node regularly tell every other node how to contact it but it's not very efficient or scalable.
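The scaling problem with "every node regularly tells every other node how to contact it" is easy to make concrete: each of n nodes sends n-1 announcements per round, so traffic grows quadratically.

```python
# Sketch: why all-to-all contact announcements don't scale.
# Each of n nodes sends its contact info to the other n - 1 nodes.
def messages_per_round(n: int) -> int:
    return n * (n - 1)

# 10 nodes    ->        90 messages per round
# 100 nodes   ->     9,900 messages per round
# 1000 nodes  ->   999,000 messages per round
for n in (10, 100, 1000):
    print(n, messages_per_round(n))
```

A broadcast tower collapses this to n messages per round (one per node), which is why broadcast "solves" availability, just at the cost of every receiver hearing everything.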
> Or... we are admitting that trust is inherently asymmetrical because of course it is! Nobody trusts everyone equally. The question then is whether people need to agree at the "meta" level on some common things that they all trust, and if so how this is accomplished. Seems to me that they do, otherwise cooperation becomes difficult (game theory territory).
That's probably true in an "all must agree on what protocol to use" sense but I don't think dynamic global consensus is actually required in general. The things like that which everyone has to agree about are relatively static. Meanwhile if Alice and Bob want to communicate then Alice and Bob have to agree on how to do it but that doesn't require everybody else to do it in the same way or trust the same parties.
> I wonder if we could actually define good guys in some meaningful way, like via game theory? Are they actors that tend toward cooperation in an environment of mostly cooperators?
The hard part is that the bad guys can behave identically to the good guys until they don't. So establishing a trusted identity has to be in some way difficult or expensive so that burning one would be a significant loss to an attacker.
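One standard way to make identities expensive is a hashcash-style proof of work: creating an identity requires grinding for a nonce whose hash clears a difficulty threshold, so burning an identity costs the attacker real compute. This is a generic sketch of that technique, not something proposed in the thread; the difficulty value and field names are illustrative.

```python
# Sketch: hashcash-style "expensive identity".
# Minting requires finding a nonce such that sha256(pubkey || nonce)
# falls below a target; verifying takes a single hash.
import hashlib

def _hash_value(pubkey: bytes, nonce: int) -> int:
    h = hashlib.sha256(pubkey + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big")

def mint_identity(pubkey: bytes, difficulty_bits: int = 12) -> int:
    """Grind nonces until the hash has `difficulty_bits` leading zero bits.
    Expected cost: ~2**difficulty_bits hash evaluations."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while _hash_value(pubkey, nonce) >= target:
        nonce += 1
    return nonce

def verify_identity(pubkey: bytes, nonce: int, difficulty_bits: int = 12) -> bool:
    """Cheap check that the minting work was actually done."""
    return _hash_value(pubkey, nonce) < (1 << (256 - difficulty_bits))

nonce = mint_identity(b"alice-key")
assert verify_identity(b"alice-key", nonce)
```

The asymmetry is the point: minting is slow, verifying is one hash, and the difficulty knob sets how much an attacker loses every time a sockpuppet identity gets burned.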
> The goal is just to build a system where the cost of an attack is so high
Right, of course. The trouble is there could be realistic DoS attacks within the capabilities of various notorious internet trolls which are legitimately hard to defend against.
> Decentralization and the devolution of power are something that lots of people want, and they're something human beings have been trying to achieve in various ways for a very long time. Most of these efforts, like democracy, republics, governmental balance of power, anti-trust laws, etc., pre-date the Internet. Yet it never works.
I don't really agree that it never works. For all the failings of free-market capitalism, it's clearly better than a centrally planned economy. The thing about functioning decentralized and federated systems is that they often work so well they become invisible. Nobody notices the *absence* of a middleman.
And the more centralized systems seem to work even worse. Look at Congress. Their approval ratings are lower than hemorrhoids, toenail fungus, dog poop, cockroaches and zombies. Say what you will about PGP, at least it's preferable to zombies.