Welcome
Awesome Dat
Dat Applications
datproject/dat
datproject/dat-desktop
Community Applications
codeforscience/sciencefair
mafintosh/hyperirc
jondashkyle/soundcloud-archiver
mafintosh/hypervision
joehand/hypertweet
beakerbrowser/dat-photos-app
High-Level APIs
datproject/dat-node
datproject/dat-js
beakerbrowser/pauls-dat-api
beakerbrowser/node-dat-archive
Hosting & Dat Management
mafintosh/hypercore-archiver
datprotocol/hypercloud
beakerbrowser/hashbase
joehand/dat-now
mafintosh/hypercore-archiver-bot
joehand/hypercore-archiver-ws
datproject/dat-registry-api
datproject/dat-registry-client
Managing & Aggregating Dats
datproject/multidat
datproject/multidrive
jayrbolton/dat-pki
beakerbrowser/injestdb
HTTP Hosting
joehand/hyperdrive-http
beakerbrowser/dathttpd
Dat Link Utilities
datprotocol/dat-dns
joehand/dat-link-resolve
pfrazee/parse-dat-url
juliangruber/dat-encoding
Dat Utilities
joehand/dat-log
mafintosh/dat-ls
karissa/hyperhealth
joehand/hyperdrive-network-speed
File Imports & Exports
juliangruber/hyperdrive-import-files
mafintosh/mirror-folder
pfrazee/hyperdrive-staging-area
pfrazee/hyperdrive-to-zip-stream
Hypercore Tools
mafintosh/hyperpipe
Dat Core Modules
mafintosh/hyperdrive
mafintosh/hypercore
CLI Utilities
joehand/dat-doctor
joehand/dat-ignore
joehand/dat-json
Networking
karissa/hyperdiscovery
mafintosh/discovery-swarm
mafintosh/webrtc-swarm
joehand/dat-swarm-defaults
Lower-Level Networking Modules
maxogden/discovery-channel
mafintosh/dns-discovery
mafintosh/multicast-dns
webtorrent/bittorrent-dht
mafintosh/utp-native
mafintosh/signalhub
Storage
datproject/dat-storage
datproject/dat-secret-storage
Random Access
juliangruber/abstract-random-access
mafintosh/multi-random-access
mafintosh/random-access-file
mafintosh/random-access-memory
mafintosh/random-access-page-files
datproject/dat-http
substack/random-access-idb
Other Related Dat Project Modules
mafintosh/peer-network
mafintosh/hyperdht
Dat Project Organization Stuff
datproject/datproject.org
datproject/discussions
datproject/design
datproject/dat-elements
kriesse/dat-colors
kriesse/dat-icons
juliangruber/dat.json
Outdated
juliangruber/dat.haus
poga/hyperfeed
yoshuawuyts/normcore
yoshuawuyts/github-to-hypercore
poga/hyperspark
juliangruber/hypercore-index
juliangruber/hyperdrive-encoding
mafintosh/hyperdrive-http-server
joehand/hyperdrive-http
joehand/dat-push
joehand/dat-backup
joehand/archiver-server
joehand/archiver-api
poga/hyperdrive-ln
substack/hyperdrive-multiwriter
substack/hyperdrive-named-archives
substack/git-dat
CfABrigadePhiladelphia/jawn
maxogden/dat-archiver
juliangruber/hyperdrive-stats
karissa/hypercore-stats-server
mafintosh/hypercore-stats-ui
karissa/zip-to-hyperdrive
joehand/url-dat
joehand/tar-dat
joehand/hyperdrive-duplicate

Hashbase

Hashbase is a public peer service for Dat archives. It provides an HTTP-accessible interface for creating an account and uploading Dats. It was created to power a content community for the Beaker Browser.

Setup

Clone this repository, then run

npm install
cp config.defaults.yml config.development.yml

Modify config.development.yml to fit your needs, then start the server with npm start.
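
For reference, the full setup flow might look like the following sketch, which assumes the repository is cloned from GitHub (beakerbrowser/hashbase) and that Node.js and npm are already installed:

git clone https://github.com/beakerbrowser/hashbase.git
cd hashbase
npm install
cp config.defaults.yml config.development.yml
# edit config.development.yml to fit your needs, then:
npm start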

Configuration

Before deploying the service, you absolutely must modify the following config.

Basics

dir: ./.hashbase              # where to store the data
brandname: Hashbase           # the title of your service
hostname: hashbase.local      # the hostname of your service
port: 8080                    # the port to run the service on
rateLimiting: true            # rate limit the HTTP requests?
csrf: true                    # use csrf tokens?
defaultDiskUsageLimit: 100mb  # default maximum disk usage for each user

Let's Encrypt

You can enable Let's Encrypt to automatically provision TLS certs using this config:

letsencrypt:
  debug: false          # debug mode? must be set to 'false' to use live config
  email: 'hostmaster@example.com'  # email to register domains under

If enabled, port will be ignored and the server will register at ports 80 and 443.
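
As an illustration, a production-style configuration with Let's Encrypt enabled might look like the following; the domain and email are placeholder values:

hostname: hashbase.example.com  # must be a publicly resolvable domain
letsencrypt:
  debug: false                  # live mode
  email: 'hostmaster@example.com'
# port is ignored here; the server listens on 80 and 443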

Admin Account

The admin user's credentials are set from the config YAML at load. If you change the password while the server is running and then restart the server, the password will be reset to whatever is in the config.

admin:
  email: 'admin@example.com'
  password: myverysecretpassword

HTTP Sites

Hashbase can host the archives as HTTP sites. This has the added benefit of enabling dat-dns shortnames for the archives. There are two possible schemes:

sites: per-user

Per-user will host archives at username.hostname/archivename, in a scheme similar to GitHub Pages. If the archive name is the same as the username, it will be hosted at username.hostname.

Note that, in this scheme, a DNS shortname is only provided for the user archive (username.hostname).

sites: per-archive

Per-archive will host archives at archivename-username.hostname. If the archive name is the same as the username, it will be hosted at username.hostname.

By default, HTTP Sites are disabled.
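
For example, the per-archive scheme is enabled with a single top-level key; the user and archive names in the comments are purely illustrative:

sites: per-archive
# with hostname: hashbase.local, user 'alice' and archive 'photos'
# would be served at photos-alice.hashbase.local, and an archive
# named 'alice' would be served at alice.hashbase.local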

Closed Registration

For a private instance, use closed registration with a whitelist of allowed emails:

registration:
  open: false
  allowed:
    - alice@example.com
    - bob@example.com

Reserved Usernames

Use reserved usernames to blacklist usernames which collide with frontend routes, or which might be used maliciously.

registration:
  reservedNames:
    - admin
    - root
    - support
    - noreply
    - users
    - archives

Monitoring

pm2: false         # set to true if you're using https://keymetrics.io/
alerts:
  diskUsage: 10gb  # when to trigger an alert on disk usage

Session Tokens

Hashbase uses JSON Web Tokens to manage sessions. You absolutely must replace the secret with a random string before deployment.

sessions:
  algorithm: HS256                # probably don't update this
  secret: THIS MUST BE REPLACED!  # put something random here
  expiresIn: 1h                   # how long do sessions live?
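
Any source of sufficient randomness will do for the secret; for example, either of these commands generates a suitable value:

node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
# or, using OpenSSL:
openssl rand -hex 32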

Jobs

Hashbase runs some jobs periodically. You can configure how frequently they run.

# processing jobs
jobs:
  popularArchivesIndex: 30s  # compute the index of archives sorted by num peers
  userDiskUsage: 5m          # compute how much disk space each user is using
  deleteDeadArchives: 5m     # delete removed archives from disk

Emailer

Todo, sorry

Tests

Run the tests with

npm test

To run the tests against a running server, specify the env var:

REMOTE_URL=http://{hostname}/ npm test

License

MIT