hyperdiscovery

Join the p2p swarm for hypercore and hyperdrive feeds. Uses discovery-swarm under the hood.

npm install hyperdiscovery

Usage

Run the following code in two different places and they will replicate the contents of the given ARCHIVE_KEY.

var hyperdrive = require('hyperdrive')
var swarm = require('hyperdiscovery')

var archive = hyperdrive('./database', 'ARCHIVE_KEY')
var sw = swarm(archive)
sw.on('connection', function (peer, type) {
  console.log('got', peer, type) 
  console.log('connected to', sw.connections.length, 'peers')
  peer.on('close', function () {
    console.log('peer disconnected')
  })
})

Hyperdiscovery uses discovery-swarm to attempt to connect to peers, and datland-swarm-defaults for peer introduction defaults on the server side; these can be overwritten (see the options below).

The module can also create and join a swarm for a hypercore feed:

var hypercore = require('hypercore')
var swarm = require('hyperdiscovery')

var feed = hypercore('/feed')
var sw = swarm(feed)

API

var sw = swarm(archive, opts)

Join the p2p swarm for the given feed. The returned object, sw, is an event emitter that emits a connection event with the peer information whenever a new peer connection is made (as shown in the usage example above).

sw.connections

Get the list of currently active connections.

sw.close()

Exit the swarm.
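
For a long-running process, it is common to leave the swarm when shutting down. A minimal sketch, reusing the sw object from the usage example above:

process.on('SIGINT', function () {
  sw.close()      // leave the swarm before exiting
  process.exit()
})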

Options
  • stream: function, replication stream for each connection. Defaults to archive.replicate({live, upload, download}).
  • upload: bool, whether to upload data to other peers.
  • download: bool, whether to download data from other peers.
  • port: port for the discovery swarm.
  • utp: bool, whether to use UTP in the discovery swarm.
  • tcp: bool, whether to use TCP in the discovery swarm.
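
Passed together, a call might look like this (a sketch; the values shown are illustrative, not the module's documented defaults):

var sw = swarm(archive, {
  upload: true,    // share data with other peers
  download: true,  // fetch data from other peers
  port: 3282,      // port for the discovery swarm
  utp: true,
  tcp: true
})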

Defaults from datland-swarm-defaults can also be overwritten:

  • dns.server: DNS server
  • dns.domain: DNS domain
  • dht.bootstrap: distributed hash table bootstrapping nodes
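
For example, a private deployment might point discovery at its own infrastructure. This is a sketch that assumes these options are passed straight through to datland-swarm-defaults; the hostnames are placeholders, not real servers:

var sw = swarm(archive, {
  dns: {
    server: 'discovery.example.com',          // placeholder DNS discovery server
    domain: 'dat.example.com'                 // placeholder discovery domain
  },
  dht: {
    bootstrap: ['bootstrap.example.com:6881'] // placeholder DHT bootstrap node
  }
})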

See Also

License

ISC