Commit
chore: remove node buffers from runtime code (#66)
Node `Buffers` are subclasses of `Uint8Array` and we don't use any
`Buffer`-specific functions, so do not demand `Buffer`s where
`Uint8Array`s will do. If other modules in the stack actually require
`Buffer`s, let the failure occur there.

`Buffer`s are still used in the tests because the `Buffer` requirement
needs to be pulled out of (at least) the `cids` and `multihash` modules first.

BREAKING CHANGE: input is no longer converted to node `Buffer`s; `Uint8Array`s are used instead
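The rationale above can be sketched in a few lines (a hedged illustration, not code from this commit; the `sum` function is a hypothetical consumer):

```javascript
// Node Buffers are subclasses of Uint8Array, so a function that only
// needs Uint8Array behaviour accepts Buffers without any conversion.
const buf = Buffer.from([0, 1, 2, 3])
console.log(buf instanceof Uint8Array) // true

// Hypothetical consumer relying only on Uint8Array semantics
function sum (bytes) {
  return bytes.reduce((acc, b) => acc + b, 0)
}

console.log(sum(buf)) // 6
console.log(sum(new Uint8Array([0, 1, 2, 3]))) // 6
```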
achingbrain authored Jul 28, 2020
1 parent b8b2ee3 commit db60a42
Showing 16 changed files with 826 additions and 1,002 deletions.
2 changes: 1 addition & 1 deletion lerna.json

```diff
@@ -1,5 +1,5 @@
 {
-  "lerna": "3.20.2",
+  "lerna": "3.22.1",
   "packages": [
     "packages/*"
   ],
```
1,235 changes: 776 additions & 459 deletions package-lock.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion package.json

```diff
@@ -19,7 +19,7 @@
     "update-contributors": "aegir release --lint=false --test=false --bump=false --build=false --changelog=false --commit=false --tag=false --push=false --ghrelease=false --docs=false --publish=false"
   },
   "devDependencies": {
-    "lerna": "^3.20.2"
+    "lerna": "^3.22.1"
   },
   "repository": {
     "type": "git",
```
29 changes: 15 additions & 14 deletions packages/ipfs-unixfs-exporter/README.md

````diff
@@ -50,7 +50,7 @@
 const files = []

 for await (const file of importer([{
   path: '/foo/bar.txt',
-  content: Buffer.from(0, 1, 2, 3)
+  content: new Uint8Array([0, 1, 2, 3])
 }], ipld)) {
   files.push(file)
 }
@@ -65,18 +65,16 @@
 console.info(entry.name) // bar.txt
 console.info(entry.unixfs.fileSize()) // 4

 // stream content from unixfs node
-const bytes = []
+const size = entry.unixfs.fileSize()
+const bytes = new Uint8Array(size)
+let offset = 0

-for await (const buf of entry.content({
-  offset: 0, // optional offset
-  length: 4 // optional length
-})) {
-  bytes.push(buf)
+for await (const buf of entry.content()) {
+  bytes.set(buf, offset)
+  offset += buf.length
 }

-const content = Buffer.concat(bytes)
-
-console.info(content) // 0, 1, 2, 3
+console.info(bytes) // 0, 1, 2, 3
 ```
````
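The streaming example above can preallocate `bytes` because `fileSize()` is known up front. When the total length is not known in advance, a small `Uint8Array`-based helper can replace `Buffer.concat` — a minimal sketch (the `concat` helper is an illustration, not part of this module's API):

```javascript
// Buffer-free stand-in for Buffer.concat: copy each chunk into a
// single preallocated Uint8Array at an advancing offset.
function concat (chunks) {
  const total = chunks.reduce((acc, chunk) => acc + chunk.length, 0)
  const out = new Uint8Array(total)
  let offset = 0
  for (const chunk of chunks) {
    out.set(chunk, offset)
    offset += chunk.length
  }
  return out
}

console.info(concat([new Uint8Array([0, 1]), new Uint8Array([2, 3])])) // bytes 0, 1, 2, 3
```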

#### API
````diff
@@ -175,17 +173,20 @@
 There is no `content` function for a `CBOR` node.
 When `entry` is a file or a `raw` node, `offset` and/or `length` arguments can be passed to `entry.content()` to return slices of data:

 ```javascript
-const bufs = []
+const length = 5
+const data = new Uint8Array(length)
+let offset = 0

 for await (const chunk of entry.content({
   offset: 0,
-  length: 5
+  length
 })) {
-  bufs.push(chunk)
+  data.set(chunk, offset)
+  offset += chunk.length
 }

 // `data` contains the first 5 bytes of the file
-const data = Buffer.concat(bufs)
 return data
 ```
````
If `entry` is a directory or hamt shard, passing `offset` and/or `length` to `entry.content()` will limit the number of files returned from the directory.
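To illustrate that behaviour without a running IPLD store, a plain async generator can stand in for a directory entry's `content()` — everything below is a hypothetical stand-in under that assumption, not the exporter's implementation:

```javascript
// Simulated directory content(): offset skips entries, length caps
// how many are yielded, mirroring the documented behaviour above.
async function * content ({ offset = 0, length = Infinity } = {}) {
  const names = ['a.txt', 'b.txt', 'c.txt', 'd.txt']
  for (const name of names.slice(offset, offset + length)) {
    yield { name }
  }
}

async function main () {
  const names = []
  for await (const child of content({ offset: 1, length: 2 })) {
    names.push(child.name)
  }
  console.info(names) // [ 'b.txt', 'c.txt' ]
}

main()
```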