Raft isn't for "decentralized" exactly, because it relies on knowing the number and identities of participants (and deliberately stops being able to commit updates when the number falls below a quorum). It's "no single point of failure" but the algorithm implies a sysadmin in control.
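To make the quorum point concrete, here's a minimal Python sketch (illustrative names, not the full Raft algorithm) of why a cluster with known membership deliberately stops committing below a majority:

```python
# Minimal sketch of Raft-style majority commit (not the full algorithm):
# with a fixed, known membership list, a leader can only commit an entry
# once a majority of participants have acknowledged it.

def majority(cluster_size: int) -> int:
    """Smallest number of nodes that forms a quorum."""
    return cluster_size // 2 + 1

def can_commit(acks: int, cluster_size: int) -> bool:
    """An entry commits only when a quorum has acknowledged it."""
    return acks >= majority(cluster_size)

# With 5 known participants, 3 acks commit; with only 2 nodes reachable,
# nothing can commit -- the cluster deliberately stalls.
print(can_commit(3, 5))  # True
print(can_commit(2, 5))  # False
```

This is exactly why it presumes a sysadmin: someone has to decide what `cluster_size` is.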
What feature would we need in Android by default, to make meshnets a lot more possible? Like some sort of auto-connection to whatever Wi-Fi is close to you.
I'm hoping that once the carriers themselves start pushing users onto wi-fi to offload their networks, Android and iOS and others will gain features to make that easy, and meshnets will be that much closer to reality.
Not my project, but it's an interesting one, in that it starts to break down the current hierarchical physical network topology: http://ronja.twibright.com/
#2. There are a lot of considerations apart from bandwidth when it comes to disseminating information. E.g., having to use speakers or even reach for a set of headphones is reason enough for me to skip to the text.
I think csense meant bandwidth in the sense of rate at which the audience takes in the information, not network bandwidth. You're in agreement with one another.
Yes, by "lower-bandwidth" I meant "video usually [1] takes more of my time to transfer ideas from the author's mind into mine than equivalent text." I wasn't referring to bytes-on-the-wire [2].
[1] Sometimes video is more efficient than text. E.g. with respect to video games, gameplay videos can convey a lot of information in a short time. Basically if you want to teach someone about some system that changes over time, showing how it changes in video is often a lot clearer and faster than trying to describe it with words or static diagrams.
But a video that just involves listening to a person talk -- or, worse yet, read slides from a projector -- usually just ends up wasting my time compared to equivalent information presented as text.
[2] It's obvious that video consumes way more bytes-on-the-wire than text.
You're agreeing with csense without realizing it. He said "this website" to refer to the site with the videos (not the comment above him), and called video "lower-bandwidth" to mean that it is less quickly communicative, not that it somehow takes less internet bandwidth than text.
Well I know of a few people who are really interested in seeing what these hacker people are like. Are they raving maniacs? Timid hermits?
Video is a good format to get people who do not normally browse source code or read specs interested. I'd rather send my curious-but-not-technologist friends a video link than a protocol design page from a wiki which they'll never read.
And a proper transcription encompasses a fair amount of work. So maybe the authors found that that would outweigh the benefit of pleasing some of us. I'm still happy that someone is creating these videos (and is not putting them on Youtube but is self-hosting them, with WebM HTML5 video).
I'm not saying that video doesn't have certain advantages. I think that some people -- even some people who consider themselves hackers -- may even prefer it.
I'm just saying that I personally prefer text, and explaining why I prefer it.
The other problem is that when I read, I often skip backwards and forwards and backwards again, re-digesting sentences that didn't quite make sense and cross-referencing, especially within paragraphs. When I'm listening, I have to depend on my auditory buffer, which is only a couple of seconds (or one sentence) long.
Exactly. With a well laid out webpage of text I can scan fairly quickly and get the gist - and also figure out how much time to spend on it, or whether to defer it, etc. Video-only content I tend to skip.
It's good that you mentioned it, but it should be said that the experience is far from perfect (at least in FF). Playback speed resets at every question break, each new video, etc. Audio drops out of sync frequently and seems to have more issues when sped up than in VLC.
I'm more comfortable downloading the lectures, but there are a few upsides to using their HTML5 player:
- When I speed up the video in VLC the pitch of the sound goes up. This doesn't happen for me in Chrome.
- You don't get the "in-video quizzes" when you download the lectures. Some courses put them up for download separately, for others there is no other way to get to them but through the online player. The quizzes don't typically (never?) count towards your final grade, but they can reinforce concepts and correct mistaken understanding early.
On the other hand,
- I have finer speed control in VLC, and
- The video players on my computer are just "nicer" than the web alternative, maybe better than a web alternative can be.
What's this all about?
Over the last few years, we've noticed quite a few people trying to spread the Internet out again. Back to the edges, like it used to be.
Sometimes they do it for privacy. Sometimes they do it for resilience against disaster. Sometimes they do it just to bring playfulness back to computers and how we use them.
We've a list of the kind of projects we mean (any missing? send a pull request). On this site, we interview one of those projects each month.
The problem isn't just "information bandwidth", but also an audio problem in a public environment. If you're in a public place, do you want to put your headphones on, or would you rather have everyone listen to what you're watching?
I'm also with you. I rarely have sound on - actually, at the moment there seems to be a bug with my sound, and I don't plan to investigate it. I strongly dislike video content.
If you're trying to engage people who enjoy watching lolcat videos every day, a video is probably the best medium to use.
But I suspect that the majority of the people who are likely to be interested in this topic (programmers, mathematicians, engineers, internet freedom activists) prefer to access the most information possible in the shortest time possible, accompanied by the least possible amount of unnecessary bytes, and in a format that allows quick skimming. Ergo, text.
This being said, some salient dot points could be put forward to summarise the videos and what they're about. As it stands, it's largely a mystery, and not an enticing one. There's no 'hook'.
Agreed, I can't watch football and this video at the same time. I can read and listen to football though. How dare you try to consume all of my bandwidth?
If most all internet services can be implemented with open source, and processing and data storage is all virtual, then the logical place we're headed is either 1) big, centralized services that treat the details of our lives as fodder to sell products, or 2) renting cycles and storage out and decentralizing everything.
For anybody concerned with privacy and not looking forward to becoming part of some hivemind, option 2 seems most logical. I don't think cost would be prohibitive, and looks like this site is interviewing guys working in the problem area on new product development, which means that there is a future there. Anybody's bet if it actually happens, though.
The idea behind the site is to make decentralised, privacy preserving, resilient (and fun!) tech more accessible to people who don't already know about this stuff. Hence videos and user focused questions. We want to expand the appeal and plan to do podcasts and meetups too.
We hope to engage digitally literate people and show them that there are concrete existing software and app options they can use now that aren't selling your data, are interesting, easy to use and open.
However, we also need to be pragmatic - aiming to use decentralized open software ourselves when it works for our use case, but not if it stops us doing this quickly.
Please do sign up to the mailing list or email us - we have ambitious plans that could use help, and we'd love feedback (especially on things like whether videos really are much less useful to people) :)
Is it that ironic? Git is inherently decentralised; github is just a publication method. Everybody's repo is still good if github gets nuked, and any of them can serve as the new public copy. It's nothing like, say, Facebook, where They Have All The Data And You Don't. Self-hosting with Gitlab is no different, except that the public repo sits on a different server.
Issues - I'm sure these can get recreated, even the discussions. In many cases, people will have those threads in their email.
Wiki - In Github's case, this is also a git repo.
Community - GitHub is rarely the only mechanism used to manage a community. There's likely a domain name and mailing list of some form.
Losing all of the above is certainly painful but I'm trying to point out that it's not irrecoverable. A better question might be how painful would it be to lose that (and the likelihood of losing it) as compared to other options.
You know, sourceforge was once the place to be for hosting your SVN projects, forums, etc., and nobody batted an eye when they ceased to be the community centroid. Similarly with github, if it shut down or something new and shinier came along the projects would move there before you could say "merge conflict."
Yes the wiki is a git repo, but the percentage of people who have a (current) checkout of their project wiki is probably below 0.1% of all users.
Larger projects have a domain and a mailing list, but github hosts thousands of small projects which do not exist outside of it. In theory it should be trivial to keep a second remote, put it in the readme, or sync it over git in some way.
You're right about the issues, I wish GitHub used some plain-text format with an optional UI for that (e.g. plain org-mode files), but the wiki on GitHub is just a plain Git repo with Markdown files, it's not centralized.
I don't see how you lose the community more than with any hosting method. If GitHub were nuked off the face of the earth you have all the contributors in your Git history, just send them a mail with the new location of the repo.
You are missing the whole point: if github goes down and the core developers don't have the latest copy, you are screwed. If you want to promote your idea, don't use github - show people your new decentralized github clone.
You guys are bullying him without a proper reason. They (redecentralize.org) are not creating the technologies themselves; they are just making the efforts and projects of others public and organized, so they don't need to do this sort of thing themselves.
Their goal is being achieved: to spread the word about projects that can create a decentralized internet.
So, can we please give them a break, and credit for their nice work?
decentralizing is the only way to scale everything - from privacy to security, from resilience to performance.
concretely tho, it generally means hosting stuff on your own machine, on a connection you pay for. That's okay. But it's hard for non-specialists.
I think the way forward is actually to create standardized, easy ways to decentralize. The idea is shared by many and there's several attempts, although none seems to have matured enough yet.
Basically, those boxes and all allow you to store your data on your end. The part that's missing is that you need the devices you use today to be compatible. That means ios, android, etc.. have to be able to sync your pictures, movies, contacts, emails, what not there. That's the hard part.
> concretely tho, it generally means hosting stuff on your own machine, on a connection you pay for. That's okay. But it's hard for non-specialists.
To this end, I'm optimistic about lightweight containers, especially a la docker.io[1]. Releasing an application as a container can help solve dependency hell. Nesting containers for a complex application (i.e. with multiple disjoint services) looks promising to me, as well.
[1] Not that LXC and/or docker are ready out-of-the-box for non-specialists (yet) - but that seems achievable. A friendly wrapper around vagrant might do the job.
It solves simple dependency problems. It does nothing for your interdependent "decentralized" services which are talking to each other. That's where the real hard work is.
You're right. There is hard work in managing a system of services where any service endpoint can change at any time. It appears to me protocols for negotiating upgrades between services are important and there's no widely deployed solution for that yet.
The problem is lack of protocols. I can easily switch between GitHub and my laptop for hosting repos with changing just the remote. I can't do that with Facebook, which is very frustrating.
This. To get something truly decentralized, it's not just geeks who have to play with it - the mass public has to adopt it. Otherwise you have the decentralized geeks on one side, who still have to play along with the rest of the centralized world (or get isolated).
Now something like torrents seems a pretty good attempt at it (as far as I can see; my knowledge on this subject is rather limited). But for the rest (say, your own mail server and a dropbox-like solution hosted at your home) this seems simply impossible due to the lack of proper, easy-to-use tools. The masses need a one-click type of solution, not a dozen cryptic command-line instructions for an OS most of them don't even run.
What are the best channels to follow discussions on decentralized social networks? I see too many alternatives coming up, and it is getting too time-consuming to follow all of them. Friendica (BTW, this one is missing from the list in the wiki), Buddycloud, Diaspora, to name just a few - where do they coordinate their efforts, and how can an interested developer follow these projects' discussions?
Also, it would be interesting to exchange experiences with these new apps, so that not every developer has to check every network - which isn't possible in one lifetime. The existing listings of alternatives are of great value, but a more systematic approach to collective knowledge gathering and exchange would be even better.
Hey, that's what the internet was built for, wasn't it? And still no software for this exists? :)
Honestly there isn't a lot of coordination, but many of the people most interested in decentralization participate in one or more of these networks. As one of the architects of Tent (https://tent.io) I spend a lot of time talking with others on Tent about similar projects.
I cannot see what I am looking for there, sorry. The Tent protocol certainly is interesting, but I was asking for an open place where it is possible to share, exchange, collect and extract knowledge and experiences about decentralized social network software.
I was hoping for an open, easily available information source that can be followed.
I would hope this project could be that hub. Propose an interop data exchange format where information can be freely shared among all "exchanges". I'm definitely not seeing this as an easy problem to solve but it would make all of this more accessible to the grandmas of the world.
The major problem I see is geeks always think their idea is perfect and have a hard time justifying the technical cost of implementation when it doesn't directly affect them ("You mean I have to make feature x to help my competitor that's actually doing way better than I am at this? Psh!") Openness shouldn't only be about the ability to export and keep that information but to also interop and play nice with others.
I think it sucks that Disapora didn't have the traction it should have and I would hope that a data exchange format among all social networks would be beneficial to all. Unite against the Facebooks of the world. I believe united we can do this much better but its clear we haven't found the perfect disruption yet. I'm glad we haven't stopped trying because it's out there. Somewhere.
Just to be clear I'm not talking about the Tent website (https://tent.io), but the communication happening on Tent itself (or Diaspora, etc). There's a fairly strong dog fooding ethic around the decentralized projects, especially since so many of them are "social" in nature, using the projects themselves to communicate is strongly embraced by many.
"That's why we don't think the network will be taken over by child porn. You have to have someone accept what's on your node in order for them to pass your traffic around," he says. [1]
So instead of a guaranteed right to access information freely we make every mesh node a potential censor.
I do not want to depend on the likes or dislikes of mesh participants to be able to access or distribute information.
Something here seems fundamentally broken by design. I hope I am misunderstanding the concept.
I think this idea would be even better if, instead of recommending services (since people will be divided anyway), they would recommend protocols/standards for each use case. Of course, in cases like bitcoin, which is really established, that implementation should just be named as the only suggestion.
"Oh you're on Diaspora? Too bad, I'm on Kune and they don't talk with each other." You get the idea.
Great interview (the one I listened to). This stuff interests me, but how to choose a project to get behind? For example Diaspora and Gnu Social might be two scattered groups both trying to build the same thing separately, or there might be important design/philosophy differences that neither of them felt the need to explicitly state. FreedomBox sounds perfect for me but the last news/release was more than a year ago - have most of the developers moved on? What other projects are similar to it?
A redecentralizing hub is a great idea, can the list be more cross-referencable like a wiki - allowing project descriptions to be linked to similar projects, with information about what makes the "similar project" different. Appraisals of how projects are going etc.
Judging from the existing list, cross-referencing would have to be introduced by the redecentralize guys before we could start adding the info.
* gnusocial is another project that can be added to the list.
Wireless community networks are where there is much research going on for fully decentralized systems. Check out Guifi.net, the world's largest wireless community network with over 20,000 nodes. All the protocols are decentralized, from DNS to routing to HTTP Proxies for internet access.
I don't dislike it. I think that what's intended is that instead of just orating (loudly) about the sorry state of affairs, as any of us can do, these people actually walk the walk and go out of their way to make things better. Quietly, by hacking away at great projects, and not by occupying capitol hill or something.
We still have huge user experience issues with decentralized systems which make it more awkward for the average person. It's sad, because I (like most programmers) want a distributed system, but it's rather hard to get it right.
BitTorrent Sync seems to have done a really good job of making a decentralized system easy to use. Their pre-shared-key method of connecting computers is really simple for people to understand, and supports a wide variety of potential use cases.
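As a rough illustration of the general pre-shared-key idea (this is NOT BitTorrent Sync's actual protocol - just a hedged sketch with made-up function names): one side generates a random secret, shares it out of band, and both sides derive the same working keys from it deterministically.

```python
import hashlib
import hmac
import secrets

def make_share_secret() -> str:
    """Generate a random secret to hand to the other device out of band."""
    return secrets.token_hex(16)

def derive_key(share_secret: str, purpose: bytes) -> bytes:
    """Both devices derive identical sub-keys from the shared secret."""
    return hmac.new(share_secret.encode(), purpose, hashlib.sha256).digest()

s = make_share_secret()
# Device A and device B, each holding only the secret, derive the same key:
assert derive_key(s, b"encryption") == derive_key(s, b"encryption")
# Different purposes yield independent keys from one shared secret:
assert derive_key(s, b"encryption") != derive_key(s, b"discovery")
```

Part of why this UX works is that the secret doubles as both the authorization and the rendezvous: there's no account, no server-side identity, just a string you paste in.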
I think I have a good plan to solve those issues and some more. I hope to throw it to the lions here at HN real soon, so the hackers can validate the reasoning behind it and play a little bit with it. But you are right - it is hard to get it right.
In the Cryptosphere talk the author says that you can use a cryptosphere address to reference the latest version of a git repository but I thought cryptosphere was content addressable. How does this work?
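I don't know Cryptosphere's internals, but a common pattern for this (a hedged sketch with illustrative names, not necessarily how Cryptosphere does it) is to layer a mutable, versioned pointer on top of the content-addressed store: immutable blobs are addressed by their hash, while a stable name resolves to the latest blob hash. In a real system the pointer update would be signed by the repo owner's key.

```python
import hashlib

def content_address(blob: bytes) -> str:
    """Immutable storage: the address IS the hash of the content."""
    return hashlib.sha256(blob).hexdigest()

# Mutable pointer layer: stable name -> (version, latest content hash).
pointers = {}

def publish(name: str, blob: bytes) -> None:
    """Store the blob's hash under a stable name, bumping the version."""
    version = pointers.get(name, (0, ""))[0] + 1
    pointers[name] = (version, content_address(blob))

publish("my-repo", b"commit 1")
publish("my-repo", b"commit 2")
# The name stays stable while the content hash it resolves to changes:
print(pointers["my-repo"][1] == content_address(b"commit 2"))  # True
```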
Couldn't find writups about the mesh networking in Serval. How scalable is this? Last time I did some digging around mesh networks, they didn't scale too much. Every message has to be broadcasted to every node (and every client had to maintain a list of all nodes and their routes), or something along those lines.
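For a sense of why naive mesh broadcast scales badly (Serval's actual routing may well be smarter - this is just the textbook flooding baseline): every node forwards each message to all neighbours, so one message costs on the order of the number of links in the network.

```python
# Naive flooding, the simplest mesh broadcast: every node that receives a
# new message forwards it to all of its neighbours. Message cost grows
# with the number of edges, which is why plain flooding doesn't scale.

def flood(adjacency: dict, origin: str) -> int:
    """Return the number of transmissions needed to flood one message."""
    seen = {origin}
    frontier = [origin]
    sends = 0
    while frontier:
        node = frontier.pop()
        for neighbour in adjacency[node]:
            sends += 1                    # every forward is a radio send
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return sends

# A fully connected 5-node mesh: 5*4 = 20 transmissions for ONE message.
full = {n: [m for m in "ABCDE" if m != n] for n in "ABCDE"}
print(flood(full, "A"))  # 20
```

Real mesh protocols (OLSR, Babel, etc.) fight exactly this, by electing relay subsets or keeping route tables instead of rebroadcasting everything.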
If you're looking for a decentralized & encrypted communication platform (messaging, chat, file transfer, etc.) to get behind, head over to http://retroshare.sourceforge.net/ and consider developing plugins for any missing features.
So to decentralize we would need to host everything ourselves? A PC turned on 24 hours a day, and awful gaming latency since you'll be using lots of upload traffic. Please tell me I'm wrong - this would never work, for lots of reasons.
Do you need to distribute things when you're gaming? For most interactions, I imagine that when the user says "save", replication of the user's data can start at that point, and for most things, ought to finish within a minute or two, freeing up the network/CPU, at least if you're doing it across a private cloud. I imagine that for most of my data, 3 copies in disparate locations would more than suffice. How long does that take?
Having a machine on 24/7 is perhaps a bit wasteful, but what is the electrical use of a mostly idling machine? If you really cared, you could perhaps get something with less power, like a raspberry pi.
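Back-of-the-envelope on the replication question above, with assumed (not measured) numbers:

```python
# Rough estimate of how long pushing a "save" to a few remote locations
# takes over a home upload link. All figures here are assumptions for
# illustration, not benchmarks.

def replication_seconds(megabytes: float, copies: int, upload_mbps: float) -> float:
    """Time to upload `copies` full copies at `upload_mbps` megabits/s."""
    total_megabits = megabytes * 8 * copies
    return total_megabits / upload_mbps

# A 10 MB batch of documents, 3 copies, on a 5 Mbit/s home upload:
print(replication_seconds(10, 3, 5))  # 48.0 seconds
```

So for typical document-sized saves, "within a minute or two" looks plausible; photos and video are where home upload links start to hurt.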
Oh, I got the wrong idea - I wasn't able to check the video, and the "about" is really bad. But now that I'm home I opened the Github page. So it's just about not giving control over the data to a central corporation, but we're still using datacenters.
I thought it was the same folks that wanted us to self-host diaspora (as in OUR home pc/network). If this works, the *aaS market will explode even more.
There seem to be a lot of discussions going on inside each of these projects. There should be some central hub for research and exchange ;)
Disconnected knowledge-islands with people not knowing about already existing solutions to problems they are trying to solve seems to be one downside of scattered communities.
Decentralization is needed for private communication, but there are downsides - creating x parallel nets will not help us find each other. The knowledge evolution of the internet is driven by the desire of the human brain to connect all available information.
Imagine the NSA decided to create an open and freely available API to the centralized information they have collected - good-minded people certainly will extract useful knowledge from that data, possibly world-changing new ways of thinking about everything could be the results.
The human brain wants this central intelligence (the real one, not the agency) to solve the evolutionary problems we are confronted with today. It must be seen as ironic evidence of this need that the first big data collections that might be useful for all humans on this planet are being collected exclusively by paranoid psychopaths, to give them better tools for control and oppression.
But even if misused, it does happen - the human brain wants to connect. It will not be happy with decentralized knowledge - it is the reason, why the internet exists and it will not want to go back.
This does not mean that decentralized networks are a bad thing in itself (as I said before, needed for private talk urgently!), but it means that we need some information harvesting entity that makes it easy to connect.
It is not only marketing what makes google and facebook so successful - it is the biological meta-programming of our brains that make us want to use these possibilities to connect to the global information pool.
The global brain is the destination - and of course it should be untouchable by insane sadists who want to control our lives. This evolutionary force will lead to social changes - it already does.
Re-decentralizing the internet, from this point of view, only postpones the real challenge: building technical and social systems that cannot be misused, or that at least make it easier to detect and eliminate misuse. We already have concepts for this; we need to apply them. A real-life IDS would warn you if you consume fake information or are following the wrong leaders (or if you are following at all). A real-life social SIEM would prevent a group of people from misusing the system just for their own profit.
Most probably "profit" will be an ancient word that will be remembered only with shame, like being confronted with the primitive life of our ancestors, that were living in caves.
Decentralization will not rescue us from our duty to expand participation.
My own main motivation wasn't just privacy, but also resilience (against, say, hurricanes) and fun (the web doesn't have the same joy it once did).
So my favourite interview so far is the mobile mesh network one http://redecentralize.org/interviews/2013/08/14/04-paul-serv...
I also think it is important that newer algorithms and systems - DHTs like Kademlia, Bitcoin, GFShare - get known and used to their full extent.
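For anyone who hasn't looked at Kademlia: its key trick is tiny. "Distance" between two node IDs is just their XOR, and the node responsible for a key is the one XOR-closest to it - that metric is what makes the O(log N) routing tables possible. A minimal sketch:

```python
# Minimal sketch of Kademlia's XOR metric. Real Kademlia adds k-buckets,
# iterative lookups, and 160-bit IDs on top of this one idea.

def xor_distance(a: int, b: int) -> int:
    """Kademlia distance: bitwise XOR of the two IDs."""
    return a ^ b

def closest(node_ids: list, target: int) -> int:
    """The node responsible for a key is the one XOR-closest to it."""
    return min(node_ids, key=lambda n: xor_distance(n, target))

nodes = [0b0001, 0b0110, 0b1100, 0b1011]
# 0b0110 differs from 0b0111 only in the lowest bit, so it wins:
print(bin(closest(nodes, 0b0111)))  # 0b110
```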
Please get in touch if you've a project that would like interviewing!