While thinking about the problem I talked about in my previous entry, it occurred to me that it is quite wasteful for every site to have to talk to every other site directly. Instead, we can borrow from the USENET model and create a structured distribution network. For example:
A completely hypothetical network layout, of course. The basic principle here is that each node has a set of peers and keeps track of which of those peers are interested in each journal. Subscription control messages, as well as entry state changes, are passed around the network through these channels, and since each link is created through co-operation between the two nodes it joins, it can be either a persistent socket or a pull-type connection depending on the needs of the two peers. Nodes must also track which journals should be forwarded on to which neighbours, both to avoid redundant forwarding and to ensure that smaller sites don't get overwhelmed with data.
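To make that bookkeeping concrete, here's a minimal Python sketch of the per-node state I have in mind. Everything here (the Node and Peer names, the method signatures) is hypothetical and just illustrates the idea of per-peer subscription sets driving the forwarding decisions:

```python
# Illustrative only: the names and data structures are assumptions, not a spec.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Peer:
    name: str
    # Journals this neighbour has asked us to forward to it.
    subscriptions: set[str] = field(default_factory=set)


@dataclass
class Node:
    name: str
    peers: dict[str, Peer] = field(default_factory=dict)

    def subscribe(self, peer_name: str, journal: str) -> None:
        """Handle a subscription control message from a neighbour."""
        self.peers[peer_name].subscriptions.add(journal)

    def unsubscribe(self, peer_name: str, journal: str) -> None:
        self.peers[peer_name].subscriptions.discard(journal)

    def forward_targets(self, journal: str, came_from: str | None = None):
        """Which peers should receive an entry state change for `journal`.

        Skipping the peer the change arrived from avoids redundant
        forwarding; peers that never subscribed simply never see that
        journal's traffic.
        """
        for peer in self.peers.values():
            if peer.name != came_from and journal in peer.subscriptions:
                yield peer
```

In other words, a node's subscribe/unsubscribe handlers are driven by the subscription control messages, and something like forward_targets is consulted for each incoming entry state change, so a small site only ever receives traffic for the journals it actually asked for.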
All of the nodes need to know about every node which produces content. To prevent nodes from tampering with the data as it passes through, each entry is signed, and each content-producing site has its own keypair. Key exchange is the tricky part, as it is the only part of the process where every node must connect to every other node directly so that everyone has everyone else's public keys.
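For the signing itself, here's a hedged sketch of what a content-producing site might do, assuming Ed25519 keys via Python's `cryptography` package; the signature scheme and the payload format are my assumptions, not part of the design above:

```python
# Assumption: Ed25519 signatures via the `cryptography` package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Generated once per content-producing site; the public half is what gets
# handed to every other node during key exchange.
site_key = Ed25519PrivateKey.generate()
site_public_key = site_key.public_key()

# The producing site signs the serialised entry before injecting it into
# the network. The payload format here is made up for illustration.
payload = b"journal=exampleuser&entry=1234&action=update"
signature = site_key.sign(payload)

# Any node along the path (or the final consumer) can check that the data
# wasn't altered in transit, using the producer's public key.
try:
    site_public_key.verify(signature, payload)
except InvalidSignature:
    # A node would drop or flag entries that fail verification.
    raise
```

Verification only needs the public keys, which is why the one-off key exchange is the only step that has to be all-to-all.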
As you can imagine, this is a closed network, as it requires co-operation between nodes. This is much like USENET, but the network will be a lot smaller. The obvious question is "What's in it for the sites?", and that is a good question. Big sites benefit from reciprocal links because they are trading valuable content, but the bigger players have no real reason to let the little players in. As distasteful as it may seem, someone has to pay for these things, so the worst case is that the USENET model is followed, where a peer pays an upstream provider to let it feed from them. This isn't really that bad, as a bunch of smaller services can co-operate to share a single link to the main network and split the cost between them.
I think this, really, is the only feasible model for now. If we design it right, it could be general enough to later let in producers and consumers that aren't LiveJournal-based: TypePad pushing content into the network via LiveJournal, for example; aggregator peers which suck data from a bunch of RSS feeds and republish it onto the network; and user-oriented aggregators which only consume content, providing something not unlike a LiveJournal friends page for those who have no wish to publish but want to read. That's for the future, though... for now, it'll probably just start as a small network between LJ and DJ and perhaps Plogs.net. What do you think?
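As a rough afterthought, here's what one of those RSS-sucking aggregator peers might look like. The feed URLs, the `feedparser` dependency, and the `publish_to_network` placeholder are all made up for illustration; it's just the polling loop, nothing more:

```python
# Rough sketch of an RSS-consuming aggregator peer. The feed URLs, the use
# of `feedparser`, and `publish_to_network` are assumptions for illustration.
import feedparser

FEEDS = [
    "https://example.com/alice/rss",
    "https://example.org/bob/rss",
]

seen_ids: set = set()


def publish_to_network(feed_url: str, title: str, link: str) -> None:
    """Placeholder for injecting an entry state change into the network."""
    print(f"publish from {feed_url}: {title} ({link})")


def poll_feeds() -> None:
    for url in FEEDS:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            guid = entry.get("id") or entry.get("link")
            if not guid or guid in seen_ids:
                continue
            seen_ids.add(guid)
            publish_to_network(url, entry.get("title", ""), entry.get("link", ""))


if __name__ == "__main__":
    poll_feeds()
```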