Deep Copy vs Shallow Copy in JavaScript: Complete Guide

Master object copying techniques to avoid reference bugs and build more reliable JavaScript applications

Understanding Object References in JavaScript

Have you ever changed something you thought was a separate copy, only to find you had messed up the original data too? That happens a lot if you don’t understand how JavaScript handles the two kinds of copying: deep copy vs shallow copy.

Value vs Reference Types

JavaScript thinks about data in two main ways: by value and by reference. Simple values like numbers, strings, booleans, null, and undefined are value types. When you copy them, JavaScript makes a whole new value, a separate thing entirely. Changing the new one never affects the old one.

let a = 10;
let b = a; // b gets a copy of the value 10
b = 20;
console.log(a); // Output: 10 (a is unchanged)

However, objects, arrays, and functions are reference types. When you assign one variable to another, you’re not copying the actual data. Instead, you’re copying the reference to where that data lives in the computer’s memory.

It’s like having two street signs pointing to the same house. If you change the house by following one sign, the change is there when you follow the other sign, too. Confusing, isn’t it?

let obj1 = { name: 'Alice' };
let obj2 = obj1; // obj2 gets a copy of the reference to the object
obj2.name = 'Bob';
console.log(obj1.name); // Output: Bob (obj1 was changed!)

The Problem with Direct Object Assignment

Assigning one object variable to another using the simple equals sign (=) creates this reference copy. This is where things get messy if you aren’t expecting it. You think you’ve made a copy to work with, but you’ve just made another handle for the exact same object in memory. Modifying the ‘copy’ variable directly modifies the original object.

Let’s say you get some data from an API and store it in a variable, maybe called userData. Then you need to modify this data before displaying it, but you also want to keep the original userData untouched for later use. If you just do let displayData = userData; and then change displayData, you’ve just changed your original userData.

let originalUser = { id: 1, name: 'Charlie', settings: { theme: 'dark' } };
let modifiedUser = originalUser; // Just copying the reference

modifiedUser.name = 'David'; // Changes originalUser.name too!

console.log(originalUser.name); // Output: David

Therefore, if your goal is to have a separate version of the object that you can change without affecting the source, a direct assignment simply doesn’t do the job you think it might.
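
One quick way to confirm what is going on is a strict equality check; on objects, === compares references, not contents. Continuing the snippet above:

console.log(modifiedUser === originalUser); // true: both variables point at the same object

let independentUser = { ...originalUser }; // a real (shallow) copy, covered below
console.log(independentUser === originalUser); // false: a copy is a different object in memory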

Copy Operations and Memory Management

JavaScript supports two kinds of copying: shallow and deep. Each involves a different level of memory handling.

Grasping how memory works in each case helps explain why deep copying tends to be slower and more resource-intensive. So, let’s break down both types in more detail.
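
You can see that difference directly by checking whether a nested object keeps its identity after copying. Here is a small sketch using Object.assign() for the shallow case and structuredClone() for the deep case (both are covered in detail below):

const source = { profile: { theme: 'dark' } };

// Shallow copy: a new outer object, but the nested object is shared
const shallow = Object.assign({}, source);
console.log(shallow === source);                 // false: new top-level object
console.log(shallow.profile === source.profile); // true: nested object is still shared

// Deep copy: new memory for every level
const deep = structuredClone(source);
console.log(deep.profile === source.profile);    // false: nested object was duplicated too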

Shallow Copy: Methods and Limitations

When you make a shallow copy, new memory is allocated for the top-level object or array structure. However, the nested objects inside still refer to the same memory locations as the original.

It’s like a new outer box, but the things inside are the originals. It’s faster than a deep copy because it does less work.

Built-in Shallow Copy Techniques

JavaScript offers several easy ways to create shallow copies.

  • Object.assign(): This method copies enumerable own properties from one or more source objects to a target object and returns the target. It’s been around for a while and is very reliable for flat objects.
const original = { a: 1, b: { c: 2 } };
const shallowCopy = Object.assign({}, original);

shallowCopy.a = 100; // Changes only shallowCopy.a
shallowCopy.b.c = 200; // Changes original.b.c and shallowCopy.b.c

console.log(original.a); // Output: 1
console.log(original.b.c); // Output: 200 (Still shared!)
  • Spread syntax (...): This is perhaps the most modern and often cleanest way for arrays and objects. It ‘spreads’ the properties or elements from the source into a new object or array literal. It does essentially the same thing as Object.assign() for objects and provides a nice syntax for arrays.

const originalArr = [1, { d: 4 }];
const shallowCopyArr = [...originalArr]; // Shallow copy of array

shallowCopyArr[0] = 100; // Changes only shallowCopyArr[0]
shallowCopyArr[1].d = 400; // Changes originalArr[1].d and shallowCopyArr[1].d

console.log(originalArr[0]); // Output: 1
console.log(originalArr[1].d); // Output: 400 (Still shared!)

const originalObj = { a: 1, b: { c: 2 } };
const shallowCopyObj = { ...originalObj }; // Shallow copy of object

shallowCopyObj.a = 100; // Changes only shallowCopyObj.a
shallowCopyObj.b.c = 200; // Changes originalObj.b.c and shallowCopyObj.b.c

console.log(originalObj.a); // Output: 1
console.log(originalObj.b.c); // Output: 200 (Still shared!)
  • Array methods (slice(), concat()): For arrays specifically, methods like slice() without arguments or concat() with an empty array also create shallow copies. These are classic methods, reliable for many years now.

const originalArrOld = [1, { e: 5 }];
const shallowCopyArrOld1 = originalArrOld.slice();
const shallowCopyArrOld2 = [].concat(originalArrOld);

shallowCopyArrOld1[1].e = 500; // Changes originalArrOld[1].e and shallowCopyArrOld1[1].e
shallowCopyArrOld2[1].e = 600; // Changes originalArrOld[1].e, shallowCopyArrOld1[1].e, and shallowCopyArrOld2[1].e

console.log(originalArrOld[1].e); // Output: 600 (All shared!)

Nested Object Behavior

As you saw, the crucial thing about shallow copies is their behavior with nested objects. The copy process only goes one level deep. For that reason, the nested object { c: 2 }, { d: 4 }, or { e: 5 } is still the exact same object in memory. Changing it through any of the variables (original, shallowCopy, shallowCopyObj, etc.) affects all of them.

This sharing of nested references is the primary reason shallow copy isn’t suitable when you need complete independence between the original and the copy, especially when dealing with complex data structures or state management, where immutability is important. It’s a common source of bugs.
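
Here is a hypothetical example of that kind of bug (the names appState and nextState are made up for illustration): a state snapshot is "copied" with spread before editing, but the nested draft object is still shared.

const appState = { user: 'Alice', draft: { text: 'Hello', attachments: [] } };

// Spread only copies the top level, so nextState.draft is the SAME object
const nextState = { ...appState };
nextState.draft.text = ''; // meant to reset only the new state

console.log(appState.draft.text); // '' (the "untouched" snapshot was mutated too)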

Deep Copy: Complete Object Duplication

Sometimes you need a true clone, a completely independent replica where changing the copy never impacts the original, no matter how deep the nesting goes. This is where deep copies come in.

A deep copy, on the other hand, allocates entirely new memory for every part of the object structure, including all nested objects and arrays. This requires more processing time and more memory space compared to a shallow copy.

It’s like unpacking a box and then making a brand new copy of everything inside and putting it into a new box. This way, nothing in the new box points back to the old box’s contents.

JSON Method: Syntax and Limitations

The most common trick developers used for a long time to achieve a deep copy was JSON.parse(JSON.stringify(obj)). How does this work? JSON.stringify() converts a JavaScript object into a JSON string. This string representation contains only the data, not the references. Then, JSON.parse() converts that string back into a new JavaScript object. Since it’s built from a string, all the objects and arrays within it are newly created.

const originalDeep = { a: 1, b: { c: 2 } };
const deepCopyJSON = JSON.parse(JSON.stringify(originalDeep));

deepCopyJSON.a = 100; // Changes only deepCopyJSON.a
deepCopyJSON.b.c = 200; // Changes only deepCopyJSON.b.c

console.log(originalDeep.a); // Output: 1
console.log(originalDeep.b.c); // Output: 2 (Original is untouched!)

This method is simple and works well for basic data objects that contain only primitive types, arrays, and other plain objects. However, it has significant limitations you must be aware of:

  • Loses Data Types: It cannot handle certain JavaScript data types. Functions and undefined are simply dropped during stringification, NaN and Infinity turn into null, Dates become strings, and RegExp objects, Maps, and Sets become empty objects. This is because the JSON format itself doesn’t have standard representations for these JavaScript-specific concepts. The Date example below shows one case; a combined sketch after this list covers several more.
const objA = { date: new Date() };
const objB = JSON.parse(JSON.stringify(objA));

console.log(objA.date instanceof Date) // true
console.log(objB.date instanceof Date) // false (Loses the Date type!)
  • Circular References: Objects with circular references (where a property of an object refers back to the object itself or one of its ancestors) will cause JSON.stringify() to throw an error.
const objA = { name: 'A' };
const objB = { name: 'B', parent: objA };
objA.child = objB;

// JSON.parse(JSON.stringify(objA)); // Throws TypeError
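
To make the data-loss cases from the first point concrete, here is a small sketch showing what happens to a function, undefined, NaN, Infinity, and a Map after a JSON round trip:

const source = {
  fn: () => 42,
  missing: undefined,
  notANumber: NaN,
  infinite: Infinity,
  lookup: new Map([['key', 'value']])
};

const copy = JSON.parse(JSON.stringify(source));

console.log(copy);
// { notANumber: null, infinite: null, lookup: {} }
// fn and missing were dropped, NaN and Infinity became null, and the Map became an empty object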

So, while easy, this method is far from a universal deep copy solution. You might run into trouble unexpectedly.

Structured Clone API

Modern JavaScript introduces the structuredClone() function, which handles many of the limitations above. It’s becoming the standard way to do this.

const original = {
  name: 'Alpha',
  date: new Date(),
  map: new Map([['key', 'value']]),
  set: new Set([1, 2, 3])
};
original.self = original;

const copy = structuredClone(original);

console.log(copy.date instanceof Date); // true
console.log(copy.map instanceof Map);   // true
console.log(copy.set instanceof Set);   // true
console.log(copy.self === copy);        // true (circular reference preserved)

It supports:

  • Date, RegExp, Map, Set.
  • ArrayBuffer, Blob, File, FileList, ImageData, MessagePort, ImageBitmap, Error.
  • NaN, Infinity.
  • Circular references.

But it still has drawbacks:

  • Functions are not supported: trying to clone an object that contains one throws a DataCloneError.
structuredClone({ fn: () => {} }); // Throws DataCloneError
  • Doesn’t preserve the prototype chain: class instances lose their methods.
class MyClass {
  constructor(value) { this.value = value; }
  greet() { console.log('Hello ' + this.value); }
}

const instance = new MyClass('World');
const clone = structuredClone(instance);

console.log(clone.value);           // "World"
console.log(clone.greet);           // undefined
console.log(clone instanceof MyClass); // false
  • Doesn’t support getters/setters or property descriptors.
class SecretBox {
  constructor() {
    this._value = 'secret';
  }

  get hiddenValue() {
    return this._value;
  }

  set hiddenValue(val) {
    console.log('Trying to set to:', val);
  }
}

const original = new SecretBox();
const clone = structuredClone(original);

// Original still works
console.log(original.hiddenValue);  // Output: 'secret'
original.hiddenValue = 'new';       // Logs: Trying to set to: new

// Clone fails
console.log(clone.hiddenValue);     // Output: undefined (getter is lost)
clone.hiddenValue = 'test';         // No log (setter is gone)

// Prototype check
console.log(clone instanceof SecretBox); // false
console.log(Object.getPrototypeOf(clone) === SecretBox.prototype); // false

It’s a much better option than JSON for most modern needs, but not perfect for all use cases, especially when behavior or prototypes matter.


Library Solutions: Lodash

If you need a more flexible, production-grade deep clone and don’t want to write your own, libraries like Lodash are a great choice. These libraries have already solved many of the edge cases that custom implementations miss.

// npm install lodash
const _ = require('lodash'); // or: import _ from 'lodash';

class MyClass {
  constructor(value, obj) {
    this.value = value;
    this.obj = obj;
  }
  greet() { console.log('Hello ' + this.value); }
  get val() {
    return this.value;
  }
}

const instance = new MyClass('World');
instance.obj = { greeting: "Hello", date: new Date() };
instance.obj.self = instance.obj;
const clone = _.cloneDeep(instance);
clone.obj.greeting = "Hi";

console.log(clone.value);                        // "World"
console.log(instance.obj.greeting);              // "Hello"
console.log(clone.obj.greeting);                 // "Hi"
console.log(instance.obj.date instanceof Date);  // true
console.log(clone.obj.date instanceof Date);     // true
console.log(clone.obj.self);                     // Handles the circular refs (No errors)
console.log(clone.greet);                        // ƒ greet() (Doesn't lose functions)
console.log(clone instanceof MyClass);           // true
console.log(instance.val);                       // "World"
console.log(clone.val);                          // "World" (Doesn't lose getters and setters)

As you can see, it handles nearly all of these problems.

Libraries are generally the recommended path for complex deep cloning needs because they handle many tricky situations for you. Why reinvent the wheel when someone else built a perfectly good one, right?

Custom Recursive Implementation

If you need total control (maybe you want to copy specific things and ignore others), you might need to write your own deep copy function.

However, keep in mind that writing a robust custom deep clone function is quite complicated. It needs to account for all built-in types, handle inheritance (prototypes), property attributes, and those circular references.

Because of the complexity, this is usually only worth doing when libraries don’t meet your requirements.
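
If you do go that route, here is a minimal sketch to illustrate the shape of such a function. It handles plain objects, arrays, Dates, Maps, Sets, and circular references, but deliberately ignores prototypes, property descriptors, and more exotic built-ins:

function deepClone(value, seen = new WeakMap()) {
  // Primitives and functions are returned as-is
  if (value === null || typeof value !== 'object') return value;

  // Reuse the clone we already made for this object (handles circular references)
  if (seen.has(value)) return seen.get(value);

  if (value instanceof Date) return new Date(value.getTime());

  if (value instanceof Map) {
    const result = new Map();
    seen.set(value, result);
    value.forEach((v, k) => result.set(deepClone(k, seen), deepClone(v, seen)));
    return result;
  }

  if (value instanceof Set) {
    const result = new Set();
    seen.set(value, result);
    value.forEach((v) => result.add(deepClone(v, seen)));
    return result;
  }

  const result = Array.isArray(value) ? [] : {};
  seen.set(value, result);
  for (const key of Object.keys(value)) {
    result[key] = deepClone(value[key], seen);
  }
  return result;
}

// Usage
const original = { a: 1, nested: { b: 2 }, when: new Date() };
original.self = original;

const copy = deepClone(original);
console.log(copy.nested === original.nested); // false
console.log(copy.self === copy);              // true (circular reference preserved)

The WeakMap is the key design choice here: it remembers objects that have already been cloned, so circular structures don’t cause infinite recursion.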

When to Use Shallow vs Deep Copy

How do you choose between shallow and deep copy?

It comes down to the structure of your data and what you intend to do with the copy. No single rule applies to every situation.

Use shallow copy when:

  • Your object or array is flat (no nested objects/arrays).
  • You only need to modify properties directly on the top level.
  • Performance is critical, and shared nested references are acceptable or won’t be modified.

Use deep copy when:

  • Your object or array has nested objects or arrays.
  • You need complete independence between the original and the copy, allowing modification of nested structures without affecting the source.
  • Immutability of the entire structure is required.

Conclusion

JavaScript offers several approaches to copying objects, each with distinct trade-offs. Shallow copies (like Object.assign() or spread operators) are fast and memory-efficient but only duplicate the top level, leaving nested structures vulnerable to unintended side effects.

On the other hand, deep copies provide complete independence by duplicating everything, though at the cost of performance and memory. We covered several approaches for achieving them: the JSON round trip, structuredClone(), Lodash’s cloneDeep, and a custom recursive function.

The right choice depends on your specific situation, the complexity of your data, performance needs, and which limitations you can accept.

By understanding these trade-offs, you can make informed decisions that lead to more predictable, bug-free code.

Think about it

If you enjoyed this article, I’d truly appreciate it if you could share it; it really motivates me to keep creating more helpful content!


Thanks for sticking with me until the end. I hope you found this article valuable and enjoyable!

Want more dev insights like this? Subscribe to get practical tips, tutorials, and tech deep dives delivered to your inbox. No spam, unsubscribe anytime.
