Writing to Files in Node.js

Introduction

Writing to files is a frequent need when programming in any language. Like other languages, JavaScript with Node.js makes working with the file system straightforward, through a built-in module that interfaces with the operating system's file system.

The fs module contains the functionality for manipulating files and interacting with the platform's file system. Along with reading from and writing to files, you can use the module to query file statistics, such as a file's size and its last modification time.

By default, the fs module will write files with an encoding of 'utf8'. UTF-8 is an encoding commonly used in web pages and other documents. File encoding refers to the character set that is used for the contents of the file. Commonly used encodings are 'utf8', 'ascii', 'binary', 'hex', 'base64' and 'utf16le'.

Each character encoding has specific advantages and situations where it makes the most sense. For instance, 'ascii' is easy to read and write, but it strips the high bit from each byte of the data stream. For reasons like this, you will want to select your character encoding with care.
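
To make the effect of the encoding concrete, here is a minimal sketch using fs.writeFileSync (covered in more detail below); the file names are arbitrary examples:

// encodings.js

const fs = require('fs');

// 'utf8' writes the characters out as UTF-8 text
fs.writeFileSync('hello-utf8.txt', 'héllo', 'utf8');

// 'base64' decodes the string as Base64 and writes the resulting bytes
fs.writeFileSync('hello-base64.txt', 'aGVsbG8=', 'base64');

// 'hex' decodes pairs of hex digits into bytes
fs.writeFileSync('hello-hex.txt', '68656c6c6f', 'hex');

All three files end up containing raw bytes on disk; the encoding only determines how the string you pass in is translated into those bytes.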

In the next few sections we'll present the different ways fs allows you to write to the file system, including their differences and advantages.

Method 1: Using fs.writeFile

The fs module includes a high-level writeFile method that can write data to files asynchronously. This means that, as with a lot of operations in Node.js, the I/O is non-blocking: the method returns immediately and invokes a callback function once the write has finished (or failed).

Here is a simple code example of writing some song lyrics to a file:

// writefile.js

const fs = require('fs');

let lyrics = 'But still I\'m having memories of high speeds when the cops crashed\n' + 
             'As I laugh, pushin the gas while my Glocks blast\n' + 
             'We was young and we was dumb but we had heart';

// write to a new file named 2pac.txt
fs.writeFile('2pac.txt', lyrics, (err) => {
    // rethrow any error; you could also handle it here instead
    if (err) throw err;

    // success case, the file was saved
    console.log('Lyric saved!');
});

We specified the file name, the characters to write, and a callback function to run once the write completes. We can also pass an options object (or a string) that specifies the encoding to use, as well as the file mode and flag. If you only need to specify the character encoding, here is how you'd call the method:

fs.writeFile('2pac.txt', 'Some other lyric', 'ascii', callback);

Note that if the file doesn't exist yet, calling this method will create it for you, so you don't need to worry about that on your end.
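
If you prefer the options-object form mentioned above, the same call might look like the following sketch (the values shown are the defaults, given here only for illustration):

fs.writeFile('2pac.txt', 'Some other lyric', {
    encoding: 'utf8', // character encoding for the data
    mode: 0o666,      // permissions used if the file is created
    flag: 'w'         // 'w' truncates the file, 'a' would append instead
}, callback);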

There is a synchronous method fs.writeFileSync that you can use in place of fs.writeFile.

The difference is that the fs.writeFileSync method performs input/output operations synchronously, blocking the Node.js event loop while the file is written. This may interfere with the performance of your Node.js application if you overdo these synchronous writes. In most cases, you should prefer using the asynchronous writes.
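
As a quick sketch, a synchronous version of the earlier example might look like this, wrapped in try/catch since there is no callback to receive an error:

// writefile_sync.js

const fs = require('fs');

try {
    // blocks the event loop until the file has been written
    fs.writeFileSync('2pac.txt', 'Some other lyric');
    console.log('Lyric saved!');
} catch (err) {
    // synchronous fs methods report failures by throwing
    console.error('Could not write the file:', err);
}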

Just be aware that both of these methods overwrite by default: they create a new file if one doesn't exist, and replace its contents if it does. If you merely want to add to an existing file, you'll want to use appendFile, which we go over later in this article.

Method 2: Using fs.write

Unlike the high-level fs.writeFile and fs.writeFileSync methods, the low-level fs.write method gives you finer control when writing to files in Node.js. It lets you specify the position in the file to begin writing at, the buffer holding the data to write, and which part of that buffer should be written out to the file.

In addition, using low-level writing functionality allows you to know precisely when file descriptors are released and to respond accordingly, such as by sending push notifications in your application.

Here is an example where we write another few lines of lyrics to a different file using fs.write.

// fs_write.js

const fs = require('fs');

// specify the path to the file, and create a buffer with characters we want to write
let path = 'ghetto_gospel.txt';
let buffer = Buffer.from('Those who wish to follow me\nI welcome with my hands\nAnd the red sun sinks at last');

// open the file in writing mode, adding a callback function where we do the actual writing
fs.open(path, 'w', function(err, fd) {
    if (err) {
        throw new Error('could not open file: ' + err);
    }

    // write the contents of the buffer, from position 0 to the end, to the file descriptor returned in opening our file
    fs.write(fd, buffer, 0, buffer.length, null, function(err) {
        if (err) throw new Error('error writing file: ' + err);
        fs.close(fd, function() {
            console.log('wrote the file successfully');
        });
    });
});

You will want to use fs.write when performing fine-grained updates to files, for instance, writing a sequence of bytes at a known position in the middle of a file.

These methods work with file descriptors: you have to open the file for writing with fs.open and then, at the end, close it again with fs.close to avoid leaking file descriptors.
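
For instance, here is a sketch of overwriting a handful of bytes at a known offset without touching the rest of the file; the file, the replacement text and the offset are just illustrative:

// fs_write_position.js

const fs = require('fs');

// 'r+' opens the file for reading and writing without truncating it
fs.open('ghetto_gospel.txt', 'r+', function(err, fd) {
    if (err) throw new Error('could not open file: ' + err);

    let replacement = Buffer.from('THOSE');

    // write the whole buffer starting at byte 0 of the file
    fs.write(fd, replacement, 0, replacement.length, 0, function(err) {
        if (err) throw new Error('error writing file: ' + err);

        // release the file descriptor once we are done with it
        fs.close(fd, function() {
            console.log('updated the file in place');
        });
    });
});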

Method 3: Using fs.createWriteStream

When handling particularly large files, or files that arrive in chunks, say from a network connection, using streams is preferable to the methods above, which write the entire file in one go.

Streams write small amounts of data at a time. While this can be slower overall because the data is transferred in chunks, it has a big advantage for memory usage: since the whole file is never loaded into memory at once, RAM usage stays low.

To write to a file using streams, you need to create a new writable stream. You can then write data to the stream at intervals, all at once, or according to data availability from a server or other process, then close the stream for good once all the data packets have been written.

Here is a code example of how we do this:

// write_stream.js

const fs = require('fs');

let writeStream = fs.createWriteStream('secret.txt');

// write some data with a base64 encoding
writeStream.write('aef35ghhjdk74hja83ksnfjk888sfsf', 'base64');

// the finish event is emitted when all data has been flushed from the stream
writeStream.on('finish', () => {
    console.log('wrote all data to file');
});

// close the stream
writeStream.end();

We created a writable stream, then wrote some data to the stream. We have included a log statement when the "finish" event is emitted, letting us know that all data has been flushed to the underlying system. In this case, that means all data has been written to the file system.

Given the performance advantages, streams are a technique you will see used widely in Node.js territory, not just for file writing. Another example would be using streams to receive data from a network connection.
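
As a quick illustration of that idea, a large file can be copied chunk by chunk by piping a readable stream into a writable one; the file names below are just examples:

// copy_stream.js

const fs = require('fs');

// stream the source file into the destination in small chunks
let readStream = fs.createReadStream('large_input.txt');
let writeStream = fs.createWriteStream('large_output.txt');

// pipe() forwards each chunk and ends the write stream when the source is done
readStream.pipe(writeStream);

writeStream.on('finish', () => {
    console.log('finished copying the file');
});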

Error Handling when Writing to a File

When writing to files, many different errors can occur during input/output. Your code should address these potential errors. One simple thing to do is to throw the errors as Node.js exceptions. This crashes the program, and is therefore not recommended except in cases where you have little other recourse.

For example, if the error can only occur as a programmer mistake, also known as a bug, then crashing the program may be the best response since it alerts the programmer of the error right away and doesn't hide the error.

When you are dealing with operational errors, such as specifying a path that is inaccessible, there are two approaches to take. One is to pass the error to a callback function, where you include some logic for handling it, such as logging it.

Alternatively, for the synchronous methods you can wrap the calls that might fail in try/catch blocks and handle the error in the catch clause.
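
To make both approaches concrete, here is a small sketch that logs the error instead of crashing; the inaccessible path is just an example:

// error_handling.js

const fs = require('fs');

// asynchronous style: inspect the error passed to the callback
fs.writeFile('/root/forbidden.txt', 'some data', (err) => {
    if (err) {
        console.error('Failed to write file:', err.message);
        return;
    }
    console.log('File saved!');
});

// synchronous style: wrap the call in try/catch
try {
    fs.writeFileSync('/root/forbidden.txt', 'some data');
} catch (err) {
    console.error('Failed to write file:', err.message);
}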

Appending to a File with fs.appendFile

In cases where we just want to add to a file's contents, we can use the high-level fs.appendFile method or its synchronous counterpart, fs.appendFileSync. Using fs.appendFile creates the file if it does not exist, and appends to it otherwise.

Here is an example of using fs.appendFile to write to a file.

// append_file.js

const fs = require('fs');

// add a line to a lyric file, using appendFile
fs.appendFile('empirestate.txt', '\nRight there up on Broadway', (err) => {
    if (err) throw err;
    console.log('The lyrics were updated!');
});

Here, we are using appendFile to add an extra line to a file holding the following lyrics:

// empirestate.txt

Empire State of Mind - JAY-Z

I used to cop in Harlem;
hola, my Dominicanos

Using fs.appendFile does not overwrite the file; it adds the new data at the end of the existing contents. This is a useful feature when you just want to update a file, such as continuously adding data to a single log file, as sketched a bit further below.

So after running our code the text file will look like this:

// empirestate.txt

Empire State of Mind - JAY-Z

I used to cop in Harlem;
hola, my Dominicanos
Right there up on Broadway

As you can see, the old text is still there and our new string has been added to the end.
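
Coming back to the log-file use case mentioned above, here is a minimal sketch of appending a timestamped line every time something happens; the file name and messages are just examples:

// logger.js

const fs = require('fs');

function logLine(message) {
    let line = new Date().toISOString() + ' ' + message + '\n';

    // each call appends one line, creating app.log on the first write
    fs.appendFile('app.log', line, (err) => {
        if (err) console.error('Could not write to the log:', err);
    });
}

logLine('server started');
logLine('user logged in');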

Learn More

Want to learn more about the fundamentals of Node.js? Personally, I'd recommend taking an online course like Learn Node.js by Wes Bos. Not only will you learn the most up-to-date ES2017 syntax, but you'll get to build a full stack restaurant app. In my experience, building real-world apps like this is the fastest way to learn.

Conclusion

As we saw, there are multiple approaches to consider when writing to a file in Node.js. The simplest way, and often the most appropriate, is to use the writeFile method in the fs module. This allows you to write to a specified file path, with asynchronous behavior, and creates a new file or replaces one if it exists at the path.

If you are writing larger files, you should consider the chunked writing capabilities of writable streams. These enable writing buffers of data in chunks, instead of loading the data all at once in memory. Using streams helps you achieve better performance in such cases.

For special situations, you can use the lower-level fs utilities like fs.write. The fs module supports fine-grained file-writing operations to match your specific needs.

Last Updated: July 12th, 2023