Converting Arrays to JSON in JavaScript
JSON (JavaScript Object Notation) has become the ubiquitous data interchange format for web apps and APIs. Its lightweight structure makes JSON easy to read and parse in any language. JavaScript includes simple methods for serializing native arrays and objects to JSON strings and reviving them back to live JavaScript.
Why Convert an Array to JSON?
Here are some common reasons you may need to transform a JavaScript array into JSON format:
- Client Storage – localStorage and sessionStorage can only store strings. So arrays must be stringified before saving and parsed when reading.
- Data Transmission – Request bodies sent with Fetch, Axios, or classic Ajax are ultimately transmitted as text or binary. So arrays to be sent to servers need JSON conversion.
- Configuration – Static config files and seed data sets often store arrays and objects using JSON syntax.
- Debugging – Printing JSON representations of complex arrays or objects can be easier to inspect than the live JavaScript versions.
JSON is the "lingua franca" of web dev – transforming arrays to JSON enables transferring data between client, server, and storage layers.
JSON.stringify()
The easiest way to encode a JavaScript array to JSON is the native JSON.stringify() method:

const fruits = ['Apple', 'Banana', 'Orange'];
const fruitsJSON = JSON.stringify(fruits);
// '["Apple","Banana","Orange"]'
JSON.stringify() handles all types of arrays seamlessly:
JSON.stringify([1, 2, 3]);
// "[1,2,3]"
JSON.stringify([true, false]);
// "[true,false]"
JSON.stringify([[1,2], [3,4]]);
// "[[1,2],[3,4]]"
The array contents and structure always survive the transformation to a JSON string.
One detail is that JSON.stringify() serializes object properties in the object's own key order (insertion order for string keys), which depends on how the object was built. Array elements, however, always maintain their positions.
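For instance, nested object keys serialize in JavaScript's key order while array positions stay fixed:

```javascript
// String keys keep insertion order; array positions are untouched
const json = JSON.stringify([{ b: 1, a: 2 }, 'x']);
console.log(json);
// '[{"b":1,"a":2},"x"]'
```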
Stringification Options
JSON.stringify() accepts two optional parameters for more control over the output:
JSON.stringify(myArray, replacer, space)
The Replacer
The replacer can be either an Array listing allowed keys, or a Function that filters each value.
For example, to filter an array of objects:
const data = [{id: 1, text: 'a'}, {id: 2, text: 'b'}];

function replacer(key, value) {
  // Filter out text
  if (key === 'text') return undefined;
  return value;
}

const filtered = JSON.stringify(data, replacer);
// '[{"id":1},{"id":2}]'
Note that an array-form replacer only filters object properties; it has no effect on array indices, so passing [0, 1] would leave the array unchanged. To serialize only the first two elements, slice first:

const fruits = ['Apple', 'Orange', 'Banana'];
const firstTwo = JSON.stringify(fruits.slice(0, 2));
// '["Apple","Orange"]'

A function replacer does visit each array element, but returning undefined for an element emits null rather than dropping it, so slicing or filtering beforehand is the reliable way to control which elements get serialized.
String Indentation
The space parameter lets you define the number of spaces used to indent formatted JSON:
JSON.stringify(myArray, null, 4);
// [
// "Apple",
// "Banana",
// "Orange"
// ]
Indenting makes stringified content much more readable. Between 2 and 4 spaces is typical.
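The space argument also accepts a string of up to 10 characters, which is handy for tab indentation:

```javascript
// Indent with a tab character instead of spaces
const json = JSON.stringify(['Apple', 'Banana'], null, '\t');
console.log(json);
// '[\n\t"Apple",\n\t"Banana"\n]'
```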
Performance
For small arrays under 1,000 items, JSON.stringify() is quite fast, often taking less than a millisecond. Naive benchmarks can make per-element approaches look dramatically faster, but the outputs are not equivalent:

// Large array of 10,000 items
const largeArr = [...Array(10000).keys()];

console.time('stringify');
JSON.stringify(largeArr);
console.timeEnd('stringify');
// stringify: 8.123ms

console.time('map');
largeArr.map(v => JSON.stringify(v));
console.timeEnd('map');
// map: 0.069ms

The map() version looks far faster, but it produces an array of JSON fragments rather than a single JSON string; once you join those fragments into one document, most of the difference disappears. In practice, JSON.stringify() is the right default for serializing whole arrays.
Reviving JSON to Arrays
Once you have converted an array to a JSON string, at some point you will likely need to deserialize it back into live array form again.
The JSON.parse() method allows this:

const fruitsJSON = '["Apple","Banana","Orange"]';
const fruits = JSON.parse(fruitsJSON);
// ['Apple','Banana','Orange']
JSON.parse() essentially reverses what stringify() does:
- JSON array syntax [] becomes a JS array
- JSON strings map to JS strings
- JSON numbers/booleans match native equivalents
Array structure, element order, and data types are maintained throughout the transformations.
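One caveat: values that JSON cannot represent do not survive the round trip. In arrays, undefined and NaN become null, and Date objects come back as plain strings:

```javascript
// Unrepresentable values are coerced during stringification
const input = [undefined, NaN, new Date(0)];
const roundTripped = JSON.parse(JSON.stringify(input));
console.log(roundTripped);
// [null, null, '1970-01-01T00:00:00.000Z']
```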
Reviver Function
JSON.parse() also accepts an optional reviver function. This gets called on each parsed value and can transform the final output:
function reviver(key, value) {
  if (typeof value === 'string') {
    return value.toUpperCase();
  }
  return value;
}

JSON.parse('["a","b","c"]', reviver);
// ['A','B','C']
So revivers provide a way to hook into the parsing process for additional processing or sanitization.
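A common real-world use is rehydrating timestamps. Here is a sketch that revives ISO 8601 date strings back into Date objects (the regex is a simplified check, not a full ISO validator):

```javascript
// Simplified pattern for ISO 8601 UTC timestamps
const isoDate = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?Z$/;

function dateReviver(key, value) {
  // Turn matching strings back into Date instances
  if (typeof value === 'string' && isoDate.test(value)) {
    return new Date(value);
  }
  return value;
}

const events = JSON.parse('["2024-01-15T09:30:00.000Z"]', dateReviver);
console.log(events[0] instanceof Date); // true
```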
Use Cases
Now let's explore some practical examples of converting arrays to JSON and back again.
Browser Storage
localStorage and sessionStorage only support saving strings. So arrays need JSON conversion:
// Array to store
const fruits = ['Apple','Banana','Orange'];

// Stringify
const fruitsJSON = JSON.stringify(fruits);

// Save to localStorage
localStorage.setItem('fruits', fruitsJSON);
Then reading the array back out:
// Get JSON string
const storedFruits = localStorage.getItem('fruits');

// Revive to array form
const fruits = JSON.parse(storedFruits);
// ['Apple','Banana','Orange']
This serialize-and-deserialize workflow applies the same to sessionStorage. (IndexedDB, by contrast, can store structured data directly without JSON conversion.)
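In practice it helps to wrap this pattern in small helpers that also handle missing or corrupt entries. A sketch (the helper names saveArray and loadArray are my own):

```javascript
// Hypothetical helpers wrapping localStorage with JSON conversion
function saveArray(key, arr) {
  localStorage.setItem(key, JSON.stringify(arr));
}

function loadArray(key, fallback = []) {
  const raw = localStorage.getItem(key);
  if (raw === null) return fallback;
  try {
    return JSON.parse(raw);
  } catch {
    // Corrupt data -- fall back to the default
    return fallback;
  }
}
```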
Transmitting Data to Server
HTTP request bodies are ultimately transmitted as text or binary. So array data must be JSON encoded before sending:
const data = [1, 2, 3];

fetch('/api/endpoint', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json'
  },
  body: JSON.stringify(data)
});
For an Express backend receiving this data:
app.use(express.json());

app.post('/api/endpoint', (req, res) => {
  // Body is parsed to array
  const array = req.body;
  res.send('Data received');
});
The same pattern applies to sending array data over WebSockets and other text-based protocols.
Merging Arrays of Objects
A common task is merging array data from different sources. When the data arrives already serialized, you can splice the JSON strings together. Note, however, that simply concatenating two JSON arrays with a comma produces invalid JSON; the outer brackets must be stripped and re-added:

const fruits = [{
  name: 'Apple',
  color: 'red',
}, {
  name: 'Banana',
  color: 'yellow',
}];

const vegetables = [{
  name: 'Carrot',
  color: 'orange',
}];

// Stringify arrays
const strFruits = JSON.stringify(fruits);
const strVeg = JSON.stringify(vegetables);

// Strip the outer brackets, rejoin, and parse into a single array
const produce = JSON.parse('[' + strFruits.slice(1, -1) + ',' + strVeg.slice(1, -1) + ']');
// Merged fruits and vegetables

That said, when both arrays are already live JavaScript values, a plain spread ([...fruits, ...vegetables]) or concat() is simpler and faster; the JSON route is mainly useful when the inputs are already strings.
Debugging Data Structures
Since JSON representations are human-readable, stringifying complex arrays can help inspect their structure:
const data = [
  {
    stats: [2.71, 5.32],
    metrics: {
      efficiency: 0.87,
      errors: 0
    },
    items: [
      {label: 'a'},
      {label: 'b'}
    ]
  }
];
// Hard to read
console.log(data);
// Easy to analyze
console.log(JSON.stringify(data, null, 2));
The formatted JSON clearly shows the nested objects and arrays.
Comparing Performance
Let's analyze the performance impact of JSON conversions further. Below we stringify an array of 10,000 integers using JSON.stringify(), map() plus JSON.stringify(), and a manual serializer:
const arr = Array(10000).fill(1);
function serializeManually(arr) {
  let str = '';
  arr.forEach(num => {
    str += JSON.stringify(num) + ',';
  });
  return `[${str.slice(0, -1)}]`;
}

function testSpeed(name, fn) {
  console.time(name);
  fn(arr);
  console.timeEnd(name);
}

// JSON.stringify(): 9.468ms
testSpeed('JSON Stringify', arr => JSON.stringify(arr));

// map + JSON.stringify: 0.984ms
testSpeed('Map Stringify', arr => arr.map(v => JSON.stringify(v)));

// Manual serialize: 87.482ms
testSpeed('Manual Stringify', serializeManually);
JSON.stringify() holds up well even at this size. The hand-rolled serializer is roughly 9x slower, since repeated string concatenation is expensive. And while the map() timing looks best, it returns an array of JSON fragments rather than one JSON string, so it is not a like-for-like comparison. For best results across array sizes, simply use JSON.stringify() unless profiling shows it to be a genuine bottleneck.
Security Considerations
While JSON parsing is convenient, it can also pose security risks if handling untrusted data.
For example, a payload containing a "__proto__" key can lead to prototype pollution. JSON.parse() itself creates it as a harmless own property, but the danger appears when the parsed result is later merged recursively into another object:

// Malicious payload
const attack = '{"__proto__":{"polluted":true}}';
const parsed = JSON.parse(attack);
// Safe so far -- but feeding `parsed` to a naive deep-merge
// can end up modifying Object.prototype
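To make the risk concrete, here is a deliberately naive deep-merge sketch showing how a parsed "__proto__" key becomes dangerous (do not use a merge like this in real code):

```javascript
// A deliberately naive recursive merge -- vulnerable to prototype pollution
function naiveMerge(target, source) {
  for (const key of Object.keys(source)) {
    const value = source[key];
    if (typeof value === 'object' && value !== null) {
      // Reading target['__proto__'] returns Object.prototype,
      // so the recursion writes into it
      naiveMerge(target[key] ?? (target[key] = {}), value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// JSON.parse creates "__proto__" as an ordinary own property...
const payload = JSON.parse('{"__proto__": {"polluted": true}}');

// ...but the merge walks into Object.prototype and modifies it
naiveMerge({}, payload);
console.log({}.polluted); // true
```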
Also, excessive nesting depth or very large payloads can exhaust memory and crash the process during parsing:

// Very large strings can exhaust memory
JSON.parse(giganticString);
So validate and sanitize any untrusted JSON prior to parsing, and enforce size limits:

const json = getUntrustedData();

// Reject oversized payloads before parsing
if (json.length > 1_000_000) throw new Error('Payload too large');

// Drop arrays longer than 100 elements during parsing
const safe = JSON.parse(json, (key, value) => {
  if (Array.isArray(value) && value.length > 100) return undefined;
  return value;
});
Parsing from trusted sources is fine. But practice care when allowing arbitrary JSON inputs.
Comparison to Other Formats
JSON is the standard data interchange format for JavaScript apps given its native compatibility. But sometimes alternate serialization formats like YAML or CSV may better serve your needs:
YAML – More readable format supporting comments. Useful for configuration:
fruits:
  - Apple
  - Banana
  - Orange
CSV – Simple tabular format rendering to spreadsheets:
Id,Name,Color
1,Apple,Red
2,Banana,Yellow
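To illustrate, a minimal sketch of converting an array of objects to CSV rows (no escaping of embedded commas or quotes; real data should go through a proper CSV library):

```javascript
// Naive CSV conversion -- assumes values contain no commas or quotes
function toCSV(rows) {
  const headers = Object.keys(rows[0]);
  const lines = rows.map(row => headers.map(h => row[h]).join(','));
  return [headers.join(','), ...lines].join('\n');
}

const fruits = [
  { Id: 1, Name: 'Apple', Color: 'Red' },
  { Id: 2, Name: 'Banana', Color: 'Yellow' }
];
console.log(toCSV(fruits));
// Id,Name,Color
// 1,Apple,Red
// 2,Banana,Yellow
```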
So while JSON encoding/decoding is preferred for internal JavaScript use, other formats can excel for human editing or analysis.
Conclusion
Converting arrays to JSON enables simpler storage, transmission, and aggregation of JavaScript data. The native JSON object provides straightforward array serialization/deserialization:
- JSON.stringify – Encode arrays to compact JSON
- JSON.parse – Decode JSON back to live arrays
Chaining these methods allows array data to move anywhere from client to cloud. Learning to properly handle JSON transformations unlocks more scalable and decoupled application architectures.