Commit 17565876 authored by Caleb C. Sander

Fix #11

parent 758391f6
Showing with 4 additions and 3 deletions
@@ -11,7 +11,7 @@ You will also be implementing several `Transform` streams as part of this project
 ## Goals
-- See how Node.js's stream abstraction makes it easy to consume different sources of data
+- See how Node.js's stream abstraction simplifies working with different sources of data
 - Implement some `Transform` streams
 ## The Unix `grep` utility
@@ -52,7 +52,7 @@ For example, `some-command | grep ERROR` will print out only the lines of output
 Implement `grep` (including 4 of its command-line flags) in Node.js.
-Your `grep` will be invoked using `node grep.js [-i] [-r] [-v] [-z] pattern ...files`.
+Your `grep` will be invoked using `node grep.js [-i] [-r] [-v] [-z] pattern [files ...]`.
 The flags `-i`, `-r`, `-v`, and `-z` are described later.
 A simple implementation of `grep` would read the entire input, loop over its lines, and print the lines that match the pattern.
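
(For reference, a minimal sketch of the naive approach the last context line describes might look like the following; the flag handling is omitted and the output format is a simplifying assumption, not the project's required behavior.)

```js
// Hypothetical sketch of the naive approach: read each file fully,
// split it into lines, and print the lines matching the pattern.
// This fails on inputs too large to fit in memory, which is why
// the project asks for a streaming implementation instead.
const fs = require('fs');

const [pattern, ...files] = process.argv.slice(2);
const regex = new RegExp(pattern);

for (const file of files) {
  for (const line of fs.readFileSync(file, 'utf8').split('\n')) {
    if (regex.test(line)) console.log(line);
  }
}
```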
@@ -117,5 +117,6 @@ There are three types of `Readable` stream you will likely use:
 Node.js streams can emit different sorts of data.
 By default, most streams emit chunks of bytes.
-To make them emit chunks of text instead, call `stream.setEncoding('utf8')` on the input stream, as well as each `Transform` stream.
+This makes sense for a compressed file stream, but all other streams in the pipeline should emit chunks of text instead.
+You can call `stream.setEncoding('utf8')` on the input stream as well as each `Transform` stream to make them output text.
 Your `Transform` streams should also call `super({decodeStrings: false})` in their constructors so they receive each chunk of data as a string.
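
(For illustration, a `Transform` stream configured as the updated text describes might look like the sketch below; the class name and the uppercasing behavior are made up for the example.)

```js
const {Transform} = require('stream');

// Hypothetical example of a Transform stream that receives string chunks.
class UpperCaseTransform extends Transform {
  constructor() {
    // decodeStrings: false keeps incoming string chunks as strings
    // instead of converting them to Buffers before _transform() runs
    super({decodeStrings: false});
  }
  _transform(chunk, encoding, callback) {
    // chunk is a string here, not a Buffer
    callback(null, chunk.toUpperCase());
  }
}

// Make the source stream emit strings too
process.stdin.setEncoding('utf8');
process.stdin.pipe(new UpperCaseTransform()).pipe(process.stdout);
```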