Introduction
The term CSV stands for comma-separated values.
A CSV file is a plain text file that contains data formatted according to the CSV standard: each line represents a record, and the fields within a record are separated from one another by commas.
It's very convenient to store tabular data in CSV:
Name,Surname,Age,Gender
John,Snow,26,M
Clair,White,33,F
Fancy,Brown,78,F
Here, the first row holds the titles of the columns/fields of our CSV records, and it's followed by 3 records that each describe a person. As you can see, the values are delimited by commas and each record starts on a new row.
Hey, but what if we want to include commas or line breaks in some of the fields that are stored in the CSV format?
There are several approaches to solving this issue. For example, we could wrap such values in double quotes, although some CSV implementations don't support this feature by design.
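For instance, a value that contains a comma stays intact when the whole field is wrapped in double quotes (the address below is just made-up sample data):
Name,Address
John,"42 Main St, Springfield"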
CSV Standardization
One of the most commonly used CSV standards is described in the RFC4180.
According to it, the CSV format is described by these 7 rules:
- Each record is located on a separate line, delimited by a line break (CRLF).
- The last record in the file may or may not have an ending line break.
- There may be an optional header line appearing as the first line of the file with the same format as normal record lines. This header will contain names corresponding to the fields in the file and should contain the same number of fields as the records in the rest of the file (the presence or absence of the header line should be indicated via the optional "header" parameter of this MIME type).
- Within the header and each record, there may be one or more fields, separated by commas. Each line should contain the same number of fields throughout the file. Spaces are considered part of a field and should not be ignored. The last field in the record must not be followed by a comma.
- Each field may or may not be enclosed in double quotes (however some programs, such as Microsoft Excel, do not use double quotes at all). If fields are not enclosed with double quotes, then double quotes may not appear inside the fields.
- Fields containing line breaks (CRLF), double quotes, and commas should be enclosed in double-quotes.
- If double-quotes are used to enclose fields, then a double-quote appearing inside a field must be escaped by preceding it with another double quote.
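As an illustration of the last two rules, a field that contains both a comma and double quotes gets enclosed in double quotes, and the inner quotes are doubled (the quote below is made up for illustration):
Quote,Author
"He said ""Winter is coming"", and left",John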
If you're interested in reading more and seeing further examples, you can study the original RFC4180 document, linked above.
Reading CSV Files in Node.js
To read a CSV file in Node.js, we could use nothing but the fs module, as in essence a CSV file is a plain text file.
If you're interested in reading more about Reading Files with Node.js or Writing Files with Node.js, we've got both covered!
However, there are a couple of helpful modules that can handle generating or parsing the CSV content for us. We'll start by installing the csv-parser module:
$ npm i -s csv-parser
Then, let's put the CSV data from the beginning of the article into a file called data.csv and follow up with a very simple example:
const csv = require('csv-parser');
const fs = require('fs');

fs.createReadStream('data.csv')
    .pipe(csv())
    .on('data', (row) => {
        console.log(row);
    })
    .on('end', () => {
        console.log('CSV file successfully processed');
    });
Here, we create a read stream using the fs module and pipe it into the csv() parser, which then fires a data event each time a new row from the CSV file is processed. The end event is triggered when all of the rows from the CSV file have been processed, and we log a short message to the console to indicate that.
For demonstration purposes, we just console.log each processed row, and after running the code you'll see this output in your console:
Row {
  Name: 'John',
  Surname: 'Snow',
  Age: '26',
  Gender: 'M' }
Row {
  Name: 'Clair',
  Surname: 'White',
  Age: '33',
  Gender: 'F' }
Row {
  Name: 'Fancy',
  Surname: 'Brown',
  Age: '78',
  Gender: 'F' }
CSV file successfully processed
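Instead of just logging each row, you'll often want to keep the parsed rows around for further processing. A minimal variation of the example above, assuming we simply gather everything into a results array, could look like this:
const csv = require('csv-parser');
const fs = require('fs');

const results = [];

fs.createReadStream('data.csv')
    .pipe(csv())
    .on('data', (row) => {
        // Collect each parsed row instead of logging it
        results.push(row);
    })
    .on('end', () => {
        // All rows are available here for further processing
        console.log(`Parsed ${results.length} records`);
    });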
Writing CSV Files in Node.js
Remembering that CSV files are just plain text files, we could always limit ourselves to using only the native fs module, but to make our lives easier, we'll use another common npm module, csv-writer.
First goes the installation:
$ npm i -s csv-writer
Then, the code:
const createCsvWriter = require('csv-writer').createObjectCsvWriter;

const csvWriter = createCsvWriter({
    path: 'out.csv',
    header: [
        {id: 'name', title: 'Name'},
        {id: 'surname', title: 'Surname'},
        {id: 'age', title: 'Age'},
        {id: 'gender', title: 'Gender'},
    ]
});

const data = [
    {
        name: 'John',
        surname: 'Snow',
        age: 26,
        gender: 'M'
    }, {
        name: 'Clair',
        surname: 'White',
        age: 33,
        gender: 'F'
    }, {
        name: 'Fancy',
        surname: 'Brown',
        age: 78,
        gender: 'F'
    }
];

csvWriter
    .writeRecords(data)
    .then(() => console.log('The CSV file was written successfully'));
The csv-writer module requires an initial configuration where we provide it with the name of the resulting CSV file and the header configuration.
Note: In our JavaScript objects, all of the property names are lowercase, while in the CSV file the column titles should be capitalized. The id/title pairs in the header configuration take care of that mapping for us.
After the configuration is done, all we need to do is call the writeRecords function and pass in the data array representing the data structure that should be written to the CSV file.
Once this process is done, we'll print an informational message to the console stating that the program has completed.
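If everything goes well, the resulting out.csv file should look roughly like this:
Name,Surname,Age,Gender
John,Snow,26,M
Clair,White,33,F
Fancy,Brown,78,F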
Using the fast-csv Module
The Node.js ecosystem and npm provide a lot of options for reading and writing CSV files. We'll show one more example with another popular CSV module and take a look at how we can write our data array using fast-csv as an alternative.
First, we have to install the module:
$ npm i -s fast-csv
const fastcsv = require('fast-csv');
const fs = require('fs');

// Reusing the `data` array from the previous example
const ws = fs.createWriteStream("out.csv");

fastcsv
    .write(data, { headers: true })
    .pipe(ws);
The API is a bit different, but the result is identical. In just a couple of lines of code, we managed to write the array of JavaScript objects to a CSV file that could later be used by a variety of other applications.
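fast-csv can read CSV files as well. As a quick sketch, assuming a fast-csv version that exposes the parse() function (v3 or newer), parsing our data.csv could look like this:
const fastcsv = require('fast-csv');
const fs = require('fs');

fs.createReadStream('data.csv')
    .pipe(fastcsv.parse({ headers: true }))
    .on('data', (row) => {
        // Each row arrives as an object keyed by the header names
        console.log(row);
    })
    .on('end', (rowCount) => {
        console.log(`Parsed ${rowCount} rows`);
    });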
Conclusion
Reading and writing CSV files with Node.js is a common development task, as the CSV format is commonly used to store structured tabular data. Many npm modules provide this functionality, so you should choose the one that suits your needs best and has ongoing support.