This article will help you understand streams and how to work with them. We can figure this out!

What are streams?

Streams are one of the fundamental concepts that power Node.js applications. They are a data-handling method, used to read input and write output sequentially. Streams are a way to handle reading/writing files, network communications, or any kind of end-to-end information exchange in an efficient way.

What makes streams unique is that instead of a program reading a file into memory all at once, the traditional way, streams read chunks of data piece by piece, processing the content without keeping it all in memory.

This makes streams really powerful when working with large amounts of data: a file can be larger than your free memory space, making it impossible to read the whole file into memory in order to process it. Using streams to process smaller chunks of data makes it possible to read such large files.

Take "streaming" services such as YouTube or Netflix, for example: these services don't make you download the video and audio feed all at once. Instead, your browser receives the video as a continuous flow of chunks, allowing the recipient to start watching or listening almost immediately.

However, streams are not only about working with media or big data. They also give us the power of "composability" in our code. Designing with composability in mind means several components can be combined in a certain way to produce the same type of result. In Node.js, it's possible to compose powerful pieces of code by piping data to and from other, smaller pieces of code, using streams.
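To make composability concrete, here is a minimal sketch (not part of the original text; the file names are placeholders) that pipes a readable file stream through the built-in zlib compressor and into a writable file stream, using stream.pipeline so errors from any stage are handled in one place:

    const fs = require('fs')
    const zlib = require('zlib')
    const { pipeline } = require('stream')

    // Read input.txt, compress it on the fly, and write input.txt.gz.
    // Each stage only ever sees small chunks; the whole file is never in memory.
    pipeline(
      fs.createReadStream('input.txt'),
      zlib.createGzip(),
      fs.createWriteStream('input.txt.gz'),
      (err) => {
        if (err) console.error('Pipeline failed:', err)
        else console.log('Pipeline succeeded')
      }
    )

Each stage here is a small, single-purpose stream; swapping zlib.createGzip() for any other Transform stream changes the behaviour without touching the rest of the chain.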
Streams provide two major advantages compared to other data-handling methods:

- Memory efficiency: you don't need to load large amounts of data into memory before you are able to process it.
- Time efficiency: it takes significantly less time to start processing data as soon as you have it, rather than having to wait until the entire payload has been transmitted.

There are four types of streams in Node.js:

- Writable: streams to which we can write data. For example, fs.createWriteStream() lets us write data to a file using streams.
- Readable: streams from which data can be read. For example, fs.createReadStream() lets us read the contents of a file.
- Duplex: streams that are both Readable and Writable, such as a TCP socket.
- Transform: streams that can modify or transform the data as it is written and read. For example, in the case of file compression, you can write compressed data and read decompressed data to and from a file (see the sketch after this list).
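As an illustration of the Transform type, the sketch below (an assumption for illustration, not code from this article) builds a small stream that upper-cases whatever passes through it:

    const { Transform } = require('stream')

    // A Transform stream receives chunks on its writable side,
    // modifies them, and emits them on its readable side.
    const upperCase = new Transform({
      transform(chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase())
      }
    })

    // Pipe stdin through the transform to stdout:
    // every line you type is echoed back in upper case.
    process.stdin.pipe(upperCase).pipe(process.stdout)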
If you have already worked with Node.js, you may have come across streams. For example, in a Node.js based HTTP server, the request is a readable stream and the response is a writable stream. You might have used the fs module, which lets you work with both readable and writable file streams. Whenever you use Express, you are using streams to interact with the client, and streams are used in every database connection driver you can work with, because TCP sockets, the TLS stack, and other connections are all based on Node.js streams.
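As a rough sketch of that idea (assuming a placeholder file named big-file.txt exists), an HTTP server can pipe a file straight into the writable response instead of buffering it first:

    const http = require('http')
    const fs = require('fs')

    http.createServer((request, response) => {
      // request is a readable stream, response is a writable stream.
      // Piping sends the file to the client chunk by chunk.
      fs.createReadStream('big-file.txt').pipe(response)
    }).listen(3000)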
A practical example

How to create a readable stream

We first require the Readable stream, and we initialize it:

    const Stream = require('stream')
    const readableStream = new Stream.Readable()

Now that the stream is initialized, we can send data to it:

    readableStream.push('ping!')
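To round this off into something you can run, here is one possible way to finish the example (an extension of the snippet above, not part of the original). Note that a Readable constructed directly needs a _read() implementation before it can be consumed; a no-op is enough when we push data in manually:

    const Stream = require('stream')

    // A no-op read() is enough here because we push data in ourselves.
    const readableStream = new Stream.Readable({ read() {} })

    readableStream.push('ping!')
    readableStream.push(null) // null signals the end of the stream

    // Pipe the readable stream into stdout; prints "ping!"
    readableStream.pipe(process.stdout)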