Read Files with Node.js

One of the most common things you'll want to do in just about any programming language is open and read a file. In most languages this is pretty simple, but for JavaScript veterans it might seem a bit foreign. For many years JavaScript was only available in the browser, so front-end developers may only be familiar with the FileReader API or similar.

Node.js, as you probably know, is quite different from your typical JavaScript in the browser. It has its own set of libraries for handling OS and filesystem tasks, like opening and reading files. In this article I'll show you how to use Node.js to read files. Specifically, we'll be using the fs module to do just that.

There are two ways you can open and read a file using the fs module:

  • Load all of the contents at once (buffering)
  • Incrementally load contents (streaming)

Each of these methods will be explained in the next two sections.

Buffering Contents with fs.readFile

This is the most common way to read a file with Node.js, especially for beginners, due to its simplicity and convenience, although, as you'll come to realize in the next section, it isn't necessarily the best or most efficient approach.

Here is a quick example using fs.readFile:

var fs = require('fs');

fs.readFile('my-file.txt', 'utf8', function(err, data) {
    if (err) throw err;
    console.log(data);
});

The data argument passed to the callback contains the full contents of the file as a string in UTF-8 encoding. If you omit the 'utf8' argument completely, the method returns the raw contents in a Buffer object. Removing the 'utf8' argument from the above code (and assuming my-file.txt contains the string "Hey there!"), we'd get this output:

$ node read-file.js
<Buffer 48 65 79 20 74 68 65 72 65 21>
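
Working with the raw Buffer can be useful when the data isn't text, or when you want to decode it yourself. Here is a minimal sketch of that variant, reusing the same my-file.txt and decoding manually:

var fs = require('fs');

fs.readFile('my-file.txt', function(err, data) {
    if (err) throw err;

    // With no encoding specified, data is a Buffer of raw bytes
    console.log(data);                  // <Buffer 48 65 79 ...>

    // Decode the bytes yourself when you do want a string
    console.log(data.toString('utf8')); // Hey there!
});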

You may have noticed that fs.readFile returns the contents in a callback, which means this method runs asynchronously. This should be used whenever possible to avoid blocking the main execution thread, but sometimes you have to do things synchronously, in which case Node provides you with a readFileSync method.

This method works exactly the same way, except that the file contents are returned directly from the function call and the execution thread is blocked while it loads the file. I typically use this in start-up sections of my programs (like when we're loading config files) or in command-line apps where blocking the main thread isn't a big deal.

Here is how to load a file synchronously with Node:

var fs = require('fs');

try {
    var data = fs.readFileSync('my-file.txt', 'utf8');
    console.log(data);    
} catch(e) {
    console.log('Error:', e.stack);
}

Notice that with the blocking (synchronous) call we have to use try...catch to handle any errors, unlike the non-blocking (asynchronous) version where errors were just passed to us as arguments.

Other than the way these methods return data and handle errors, they work very much the same.
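
As mentioned above, a typical use for readFileSync is loading configuration at start-up, before anything else runs. Here is a minimal sketch, assuming a hypothetical config.json file sitting next to the script:

var fs = require('fs');

// Blocking here is fine: nothing useful can happen until the config is loaded
var config = JSON.parse(fs.readFileSync('config.json', 'utf8'));

// The 'port' field is just an example of what such a config might hold
console.log('Server will listen on port', config.port);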

Streaming Contents with fs.createReadStream

The second way to open and read a file is to open it as a Stream using the fs.createReadStream method. All Node streams are instances of EventEmitter, which allows you to subscribe to the stream's important events.

A readable stream object can be useful for a lot of reasons, a few of which include:

  • Smaller memory footprint. Since the target file's data is loaded in chunks, not as much memory is required to store the data in a buffer.
  • Faster response time. For time-sensitive applications, the time between the request and response is critical. Streams cut down the response time (especially for large files) since they don't need to wait to load the entire file before returning data.
  • Piping data. The stream abstraction allows you to use a common interface between data producers and consumers to pass that data around via pipes, much like the Unix pipe concept (see the sketch just after this list).
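
To make the piping point concrete, here is a minimal sketch that copies a file by piping a readable stream into a writable one. It reuses my-file.txt from earlier; copy-of-my-file.txt is just a hypothetical destination:

var fs = require('fs');

var source = fs.createReadStream('my-file.txt');
var destination = fs.createWriteStream('copy-of-my-file.txt');

// Data flows from source to destination in chunks, never fully in memory
source.pipe(destination).on('finish', function() {
    console.log('File copied.');
});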

Although it really isn't very hard to use streams, they can be a bit intimidating and aren't quite as intuitive as the fs.readFile method. Here is the 'hello world' of file streaming:

var fs = require('fs');

var data = '';

var readStream = fs.createReadStream('my-file.txt', 'utf8');

readStream.on('data', function(chunk) {
    data += chunk;
}).on('end', function() {
    console.log(data);
});

This code does exactly what the code in the first section does, except that we have to "collect" chunks of data before printing it out to the console. If your file is fairly small then you'll probably only ever receive a single chunk, but for larger files, like audio and video, you'll have to collect multiple chunks. This is the case where you'll start to notice the real value of streaming files.
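
If all you need is an aggregate, like a byte count, you can process each chunk as it arrives and throw it away, which is where streaming really pays off. A minimal sketch that counts bytes without ever holding the whole file in memory:

var fs = require('fs');

var totalBytes = 0;

fs.createReadStream('my-file.txt')
    .on('data', function(chunk) {
        // With no encoding specified, chunk is a Buffer
        totalBytes += chunk.length;
    })
    .on('end', function() {
        console.log('Read ' + totalBytes + ' bytes');
    })
    .on('error', function(err) {
        console.error('Error:', err.message);
    });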

Note that the first streaming example above mostly defeats the purpose of using a stream, since we end up collecting the data in a variable anyway, but at least it gives you an idea of how streams work. A better example showing the strengths of file streams is the following plain Node HTTP server, which handles file requests:

var fs = require('fs');
var path = require('path');
var http = require('http');

var staticBasePath = './static';

var staticServe = function(req, res) {
    var fileLoc = path.resolve(staticBasePath);
    fileLoc = path.join(fileLoc, req.url);

    var stream = fs.createReadStream(fileLoc);

    stream.on('error', function(error) {
        res.writeHead(404, 'Not Found');
        res.end();
    });

    stream.pipe(res);
};

var httpServer = http.createServer(staticServe);
httpServer.listen(8080);

All we do here is open the file with fs.createReadStream and pipe it to the response object, res. We can even subscribe to error events and handle them as they happen. It's a much better way of handling files once you learn how to use streams properly. For a more complete example and explanation of the above code, check out this article on creating static file servers with Node.
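
To try it out, save the server code (say, as a hypothetical static-server.js), drop my-file.txt into the static directory, and start the server:

$ node static-server.js

Then, from another terminal, request the file:

$ curl http://localhost:8080/my-file.txt
Hey there!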

Conclusion

From this article you should have learned the basics of reading files, as well as some more advanced loading methods using Stream objects. Knowing when to use each is the key, and the choice deserves careful consideration in memory-constrained or time-constrained applications.

What's your preferred method of handling files? How have you used Streams in the past? Let us know in the comments!
