People need to be online for that.
It wouldn't make much sense not to be online.
I can't think of any case where a P2P peer would be useful while not online.
This approach does account for being offline as well.
It just waits until the next time it can be compared to the other seeds and then updates to get back in sync (as allowed).
The immediate benefit of this is something of a built-in versioning system.
That is a requirement here, since technically every offline server might hold a different version of the same data.
The resync then brings everything back up to date.
Again, all this assuming the user actually wants to be up to date.
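A minimal sketch of the idea (not Dat's actual wire protocol; `Peer` and `sync_from` are illustrative names): each seed keeps an append-only log of changes, its "version" is simply the log length, and resyncing after time offline just means fetching the entries you are missing.

```python
# Toy model of offline peers catching up via an append-only log.
# Illustrative only; Dat's real protocol (hypercore) is more involved.

class Peer:
    def __init__(self):
        self.log = []                # append-only history of changes

    def append(self, change):
        self.log.append(change)

    def version(self):
        return len(self.log)         # version = length of the log

    def sync_from(self, other):
        """Fetch any entries beyond our current version."""
        if other.version() > self.version():
            self.log.extend(other.log[self.version():])

writer = Peer()
reader = Peer()
writer.append("v1: initial data")
reader.sync_from(writer)             # reader comes online, catches up
writer.append("v2: edited while reader was offline")
reader.sync_from(writer)             # next time reader is online
assert reader.log == writer.log      # back in sync, full history intact
```

Because the log is append-only, every intermediate version survives the resync, which is where the "built-in versioning" falls out for free.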
If you want to sync folders, for instance: that will not be possible for security reasons, and I do not expect that to change anytime soon.
I fear your hopes are dashed with this technology, but you would also likely assess technologies such as BitTorrent as risky (and you would be right).
The promise (which must be tested/proven) is that security is built into the system.
The risk here is not unlike the risks with any data that traverses the internet.
P2P is highly secure in the transfer of data, as that transfer is encrypted.
The area of concern is then in browsing the data: anyone with the address can access the information associated with that Dat, and even those who cannot may be able to root out other information.
The security lies in the fact that the address provides the key used to encrypt and decrypt the data, and that address is not necessarily public.
This is why I can access/read the data if I have the address, but neither I nor anyone else can without it.
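To make that concrete: in Dat, peers that hold the address can derive a "discovery key" from it (a keyed hash) to find each other on the network, while the hash itself cannot be reversed to recover the address. A sketch of that construction, using Python's standard-library BLAKE2b (take the constant and sizes here as illustrative, not a spec):

```python
import hashlib

# A made-up 32-byte address standing in for a real Dat address.
address = bytes.fromhex("ab" * 32)

# Keyed hash: the address keys the hash, so the discovery key can be
# shared on the network without revealing the address needed to read
# the data.
discovery_key = hashlib.blake2b(
    b"hypercore",        # constant message
    key=address,         # the secret address keys the hash
    digest_size=32,
).hexdigest()

print(discovery_key)
```

Anyone observing only `discovery_key` can tell that two peers are interested in the same Dat, but cannot decrypt or read anything without the address itself.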
This is not unlike email, in a sense: I might send an email to one or more persons, and they might in turn forward it to others.
There is always some risk that the ISPs involved might try to read/interpret that data, but their primary incentive to do so depends on what they believe they can do with that data.
In theory, at the small scale, no data need be sent beyond the address itself, which is then used to encode and decipher all subsequent traffic.
That data stream could then be limited to as little as one byte. There... intercept and decipher that!
The smaller that footprint, the more secure the data.
Additionally, the speed of that transfer is largely controlled by bandwidth (although other factors weigh in, such as compression algorithms that take advantage of previous and anticipated patterns).
So, in theory again, if all that encryption were not enough, bad data could be mixed in with the good, and those with the key would simply access the good data.
The middleman would be left with just a mess of undecipherable data.
This is the useful element associated with blockchains, without the global overhead.
But all of that is neither here nor there...
Regarding WebRTC, the developers of Dat respond thusly:
WebRTC Usage Notes
Important: dat-js uses WebRTC, so it can only connect to other WebRTC clients. It is not possible for the dat-js library to connect directly to clients using other protocols. All other Dat applications use non-WebRTC protocols (see this FAQ for more info). Non-browser clients can connect dats peer-to-peer via webrtc modules, such as electron-webrtc, or use proxies via websockets, http, or other client-server protocols.
Due to WebRTC's less than stellar performance - Dat has focused on creating solid networking using other protocols. We may integrate WebRTC if performance improves and it becomes easier to run in non-browser interfaces (though we'd prefer using more performant options in the browser, if they develop).
So, dat-js does use WebRTC, but it is deemed too limited at present for the purpose under consideration: having the browser be the server of local P2P distributed data.
Regarding areas of security: any and all P2P activity can be deemed high risk, but risks should always be calculated and mitigated.
Here's what the folks behind Dat are staring at with regard to security:
Additional information can be found here:
Added: Should we need to standardize on a more widely supported browser such as Chrome, WebRTC would be a likely alternative.
The downside: WebRTC appears to be used primarily at the local network level, which defeats the broader goal of distributed P2P.