Update quickstart.md

The steps you’ll need to take are:

1. Vendor the code into your project.

1. Set `NOMS_VERSION_NEXT=1` in your environment.

1. Decide which type of storage you'd like to use: memory (convenient for playing around), disk, IPFS, or S3. (If you want to implement a store on top of another type of storage, that's possible too; email us or reach out on Slack and we can help.)

1. Set up and instantiate a database for your storage. Generally, you use the spec package to parse a [dataset spec](https://github.com/attic-labs/noms/blob/master/doc/spelling.md) like `mem::mydataset`, from which you can then get a [`Database`](https://github.com/attic-labs/noms/blob/master/go/datas/database.go) and a [`Dataset`](https://github.com/attic-labs/noms/blob/master/go/datas/dataset.go). A short end-to-end sketch follows the list of storage options below.

* **Memory**: no setup required, just instantiate it:

```
sp, err := spec.ForDataset("mem::test") // Dataset name is "test"
```

* **Disk**: identify a directory for storage, say `/path/to/chunks`, and then instantiate:

```
sp, err := spec.ForDataset("/path/to/chunks::test") // Dataset name is "test"
```

* **IPFS**: identify an IPFS node by directory. If an IPFS node doesn't exist at that directory, one will be created:

```
sp, err := spec.ForDataset("ipfs:/path/to/ipfs_repo::test") // Dataset name is "test"
```

* **S3**: Follow the [S3 setup instructions](https://github.com/attic-labs/noms/blob/master/go/nbs/NBS-on-AWS.md), then instantiate a database and dataset the same way, using the spec described in those instructions.

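Putting the pieces together, here is a minimal end-to-end sketch using the in-memory store: parse a spec, get the database and dataset handles, commit a value, and read it back. It assumes `NOMS_VERSION_NEXT=1` is set (step 2), that the `spec` and `types` packages are vendored from the repository linked above, and that the committed string is just an illustrative placeholder; treat it as a sketch, not canonical API usage.

```
package main

import (
	"fmt"

	"github.com/attic-labs/noms/go/spec"
	"github.com/attic-labs/noms/go/types"
)

func main() {
	// Parse a dataset spec; "mem" keeps all chunks in process memory.
	sp, err := spec.ForDataset("mem::test")
	if err != nil {
		panic(err)
	}
	defer sp.Close()

	// The spec hands back handles to the database and the dataset it names.
	db := sp.GetDatabase()
	ds := sp.GetDataset()

	// Commit a value; the returned dataset reflects the new head.
	ds, err = db.CommitValue(ds, types.String("hello from noms"))
	if err != nil {
		panic(err)
	}

	// Read the head value back.
	if v, ok := ds.MaybeHeadValue(); ok {
		fmt.Println(v) // hello from noms
	}
}
```

For the disk option above, the only change needed to persist the data is the spec string; IPFS and S3 additionally need the setup described in their bullets.
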
1. You can inspect data that you've committed via the [noms command-line interface](https://github.com/attic-labs/noms/blob/master/doc/cli-tour.md). For example:

```
noms log /path/to/store::ds
noms show /path/to/store::ds
```

Consider creating a [.nomsconfig](https://github.com/attic-labs/noms/blob/master/samples/cli/nomsconfig/README.md) file to save the trouble of writing database specs on the command line.

* Note that **Memory** datasets won't be inspectable because they exist only in the memory of the process that created them; a Go counterpart to these CLI checks is sketched after this step.

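As that Go counterpart, here is a hedged sketch of checking a disk-backed dataset's head from code; the path and dataset name reuse the disk example above and are illustrative assumptions, not part of the original walkthrough.

```
package main

import (
	"fmt"

	"github.com/attic-labs/noms/go/spec"
)

func main() {
	// Reopen the disk-backed dataset written earlier. A mem:: spec would
	// show nothing here, since its chunks died with the writing process.
	sp, err := spec.ForDataset("/path/to/chunks::test")
	if err != nil {
		panic(err)
	}
	defer sp.Close()

	if v, ok := sp.GetDataset().MaybeHeadValue(); ok {
		fmt.Println(v) // the current head value of dataset "test"
	} else {
		fmt.Println("dataset has no commits yet")
	}
}
```
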
1. Implement pull and merge. The [pull API](../../go/datas/pull.go) is used to pull changes from a peer, and the [merge API](../../go/merge/) is used to merge changes before committing. There's an [example of merging in the IPFS-based-chat sample app](https://github.com/attic-labs/noms/blob/master/samples/go/ipfs-chat/pubsub.go).