Upcoming JavaScript features in 2026
It's 2026. JavaScript continues to gain new features and improvements. Below I'll list the most important features that we'll hopefully see in 2026.
Map upsert
Have you ever been in a situation where you had to update some value in
Map
but were not certain whether a key existed or not? For example, you need to count the number of occurrences of
each character in a string.
You would probably do something like this:
const string = "counting characters in a string";
const map = new Map();
for (const character of string) {
  let counterObj = map.get(character);
  if (!counterObj) {
    counterObj = {
      character,
      count: 0,
    };
    map.set(character, counterObj);
  }
  counterObj.count++;
}
console.log(map);
Alternatively, you could use map.has(character), but handling
missing keys still requires additional lookups (including the set() call). In any case, writing
this is a bit bulky, and the additional lookups add some performance overhead (even if they happen only
once per unique key).
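Spelled out, the map.has() variant looks roughly like this (a sketch; the variable names are mine):

```javascript
const text = "counting characters in a string";
const counters = new Map();
for (const character of text) {
  if (!counters.has(character)) {                     // lookup 1
    counters.set(character, { character, count: 0 }); // lookup 2
  }
  counters.get(character).count++;                    // lookup 3
}
console.log(counters.get("c")); // { character: 'c', count: 3 }
```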
Fortunately, this will become a thing of the past.
Map.prototype.getOrInsert()
and
Map.prototype.getOrInsertComputed()
allow inserting the missing map entry with only one lookup. WeakMap also has the same methods:
WeakMap.prototype.getOrInsert()
and
WeakMap.prototype.getOrInsertComputed().
So, the same thing could be done like this:
const string = "counting characters in a string";
const map = new Map();
for (const character of string) {
  // This is a combination of get and set. When it
  // fails to find the key, it inserts
  // the value provided in the second parameter and returns it
  const counterObj = map.getOrInsert(character, {
    character,
    count: 0,
  });
  counterObj.count++;
}
console.log(map);
However, if the construction of the value in the second parameter is expensive, you can use
getOrInsertComputed() instead:
const string = "counting characters in a string";
const map = new Map();
for (const character of string) {
  // In this case a callback is provided; the callback will be called
  // only when the key is missing, so the default
  // value construction will be avoided when it's not necessary
  const counterObj = map.getOrInsertComputed(character, () => ({
    character,
    count: 0,
  }));
  counterObj.count++;
}
console.log(map);
The feature is not yet supported in all major browsers. So, don't rely on this in production code.
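Until support is universal, the same semantics can be approximated with a small helper (a hand-written sketch, not the spec algorithm; the helper name is my own, and it still pays for the extra lookups):

```javascript
// Fallback approximating Map.prototype.getOrInsert(): returns the existing
// value, or inserts the provided default and returns it.
function getOrInsert(map, key, defaultValue) {
  if (map.has(key)) {
    return map.get(key);
  }
  map.set(key, defaultValue);
  return defaultValue;
}

const counts = new Map();
for (const character of "aab") {
  const counterObj = getOrInsert(counts, character, { character, count: 0 });
  counterObj.count++;
}
console.log(counts.get("a")); // { character: 'a', count: 2 }
```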
Lossless JSON serialization and deserialization
In JavaScript some values can't be encoded to and decoded from JSON out of the box. For example, it's not
possible to serialize bigint numbers:
const obj = {
  prop: 1n,
};
console.log(JSON.stringify(obj)); // throws a TypeError
One workaround for this is to convert bigint numbers to strings by passing the replacer
callback to
JSON.stringify():
const obj = {
  prop: 1n,
};
console.log(
  JSON.stringify(obj, (key, value) => typeof value === "bigint" ? value.toString() : value)
); // {"prop":"1"}
When parsing the JSON with
JSON.parse()
we can pass the reviver
callback where we can convert the stringified bigint back to bigint for known
properties:
console.log(JSON.parse(
  '{"prop":"138","otherProp":"789"}',
  (key, value) => key === "prop" && typeof value === "string" ? BigInt(value) : value
)); // { prop: 138n, otherProp: "789" }
It works, but parsing gets complicated with deeply nested objects: you have to know exactly which property
values need to be converted back into bigint.
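To illustrate the problem, here is a made-up nested payload (the property names are hypothetical). The reviver only sees each key and value, not the full path, so you must know in advance which properties hold stringified bigints:

```javascript
const json = '{"user":{"id":"42"},"order":{"id":"7","note":"17"}}';
const parsed = JSON.parse(json, (key, value) =>
  key === "id" && typeof value === "string" ? BigInt(value) : value
);
console.log(parsed.user.id);    // 42n
console.log(parsed.order.note); // '17' — numeric-looking, but stays a string
```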
A better solution is to encode bigint into a regular number and convert it back. This is possible
because the JSON standard itself has no precision / digit count limit for both integers and floating point
numbers.
Now, we need some control over, and access to, the raw JSON property value. We can achieve this thanks to
JSON.rawJSON()
and
context.source
of the JSON.parse() reviver callback.
JSON.rawJSON() accepts a string containing a serialized primitive value and returns a special
exotic object. When that object is stringified, the original string is preserved as-is (no wrapping in
quotes or any other transformation) and injected directly into the resulting JSON.
Keep in mind that JSON.rawJSON() only accepts valid stringified primitive values; passing
anything else throws an error.
context.source holds the original JSON string in the raw form.
So, by combining JSON.rawJSON() and context.source we can encode and decode some
values easily. Here is an example with bigint numbers:
const obj = {
  regularNumber: 99999,
  bigRegularNumber: 999999999999999999,
  bigInt: 999999999999999999n,
};
const serializedJSON = JSON.stringify(
  obj,
  (key, value) => typeof value === "bigint" ? JSON.rawJSON(value.toString()) : value,
);
console.log(serializedJSON);
// '{"regularNumber":99999,"bigRegularNumber":1000000000000000000,"bigInt":999999999999999999}'
// Notice that bigRegularNumber had a precision loss because the value
// (999999999999999999) is outside the safe integer range.
// Meanwhile, bigInt was encoded into regular JSON number without losses.
console.log(JSON.parse(serializedJSON));
// {regularNumber: 99999, bigRegularNumber: 1000000000000000000, bigInt: 1000000000000000000}
// If the parsed number is not a safe integer and the raw JSON source
// is in integer form, we can decode it back as a bigint
console.log(JSON.parse(
  serializedJSON,
  (key, value, { source }) =>
    typeof value === 'number' && !Number.isSafeInteger(value) && /^-?[0-9]+$/.test(source)
      ? BigInt(source)
      : value,
));
// {regularNumber: 99999, bigRegularNumber: 1000000000000000000n, bigInt: 999999999999999999n}
The feature is not yet supported in Safari. So, don't rely on this in production code.
Iterator.concat()
A nice handy feature. As the name suggests,
Iterator.concat()
does concatenation of iterable objects. It works much like
Array.prototype.concat(),
except that it accepts (and only accepts) iterables of any kind. The return value is a lazily
evaluated iterator. Here is an example of using Iterator.concat():
function* generate() {
  yield 1;
  yield 2;
  yield 3;
}
const iterator = Iterator.concat(generate(), [4, 5, 6]);
console.log(iterator); // Iterator
console.log(iterator.toArray()); // [ 1, 2, 3, 4, 5, 6 ]
Iterator.concat({}, 1); // exception, all arguments must be iterable
The feature has a very limited support. So, don't rely on this in production code.
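Until support improves, a generator can approximate the behavior (a sketch; unlike the real Iterator.concat(), it doesn't validate its arguments up front, and the helper name is my own):

```javascript
function* concatIterables(...iterables) {
  for (const iterable of iterables) {
    yield* iterable; // walk each iterable in turn, lazily
  }
}

const combined = [...concatIterables([1, 2], new Set([3]), "ab")];
console.log(combined); // [ 1, 2, 3, 'a', 'b' ]
```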
Math.sumPrecise()
Another nice but somewhat niche feature is
Math.sumPrecise().
Math.sumPrecise() receives an iterable of numbers and sums them more precisely than a naive
loop-based sum, using a specialized summation algorithm. This feature might be useful when doing
financial or mathematical calculations.
Here is an example of Math.sumPrecise() vs regular sum when calculating the approximate value of
the mathematical constant
e using the power series
formula:
// The power series for approximation of mathematical constant e
function* sequenceOfE() {
  let member = 1;
  yield member;
  for (let i = 1; i < 20; ++i) {
    member /= i;
    yield member;
  }
}
const regularSum = sequenceOfE().reduce((acc, cur) => acc + cur, 0);
const preciseSum = Math.sumPrecise(sequenceOfE());
console.log(regularSum, preciseSum); // 2.7182818284590455 2.718281828459045
// Calculating the actual errors compared to Math.E
console.log(Math.abs(Math.E - regularSum), Math.abs(Math.E - preciseSum));
// 4.440892098500626e-16 0
// Math.sumPrecise() can reduce the error
The feature has a very limited support. So, don't rely on this in production code.
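For a rough flavor of how compensated summation recovers precision, here is a Kahan-style sketch. Note this is weaker than Math.sumPrecise(), which must behave as if the sum were computed with unbounded precision and then rounded once:

```javascript
// Kahan (compensated) summation: a running error term captures the
// low-order bits that the naive accumulator loses at each step.
function kahanSum(numbers) {
  let sum = 0;
  let compensation = 0;
  for (const x of numbers) {
    const y = x - compensation;
    const t = sum + y;
    compensation = (t - sum) - y; // the part of y that didn't fit into t
    sum = t;
  }
  return sum;
}

const values = [0.1, 0.2, 0.3];
const naive = values.reduce((acc, cur) => acc + cur, 0);
console.log(naive);            // 0.6000000000000001
console.log(kahanSum(values)); // at least as close to 0.6 as the naive sum
```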
import defer (proposal)
import defer is a
proposal for lazy module evaluation. JS modules can get very large, and their loading and initialization can be
really expensive. Yes, theoretically lazy loading can be done via dynamic
import().
However, this
forces all functions and their callers to switch to an asynchronous programming model. As a result, all
callers must be updated to accommodate the new model, which is impossible without introducing API changes
that break compatibility with existing consumers. Doesn't
this remind you of anything?
In order to solve this, a new syntax was introduced: import defer. The idea is that when a
module is imported via import defer, it is fetched as usual, but its evaluation (running the
module's top-level code) is deferred, so no up-front blocking occurs. The module is only evaluated and
initialized when the script accesses one of its exports for the first time. The code that uses deferred
modules looks like normal code, and doesn't need
significant changes:
import defer * as heavyModule from '/static/js/test/some-script.js';
// The module's top-level code will not run immediately
// and won't block this script until the module is used
console.log('heavyModule is not evaluated yet'); // prints immediately, since the import is not blocking
setTimeout(() => {
  // When heavyModule is about to be accessed for the first time,
  // execution is paused until the module is evaluated and initialized.
  // After that heavyModule can be safely used, like a normal module.
  heavyModule.someFunction();
}, 1000);
This feature has reached stage 3. Hopefully, browsers will implement this soon.
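Since support is still pending, here is a userland emulation of the idea (not real syntax; all the names are made up): the "module" body runs only on first access.

```javascript
let evaluated = false;
function evaluateHeavyModule() {
  evaluated = true; // stands in for expensive top-level module code
  return { someFunction: () => "result" };
}

let namespace;
const heavyModule = new Proxy({}, {
  get(target, prop) {
    namespace ??= evaluateHeavyModule(); // evaluate lazily, exactly once
    return namespace[prop];
  },
});

console.log(evaluated);                  // false — nothing has run yet
console.log(heavyModule.someFunction()); // "result" — first access triggers evaluation
console.log(evaluated);                  // true
```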
Error.isError()
Error.isError()
lets you determine whether a value is a genuine error. This is sort of possible via
value instanceof Error. However, that's not very robust, since it can be fooled
by patching the prototype. Error.isError() instead checks a private internal field which is
impossible to fake. It's very similar to
Array.isArray().
Here are some examples of Error.isError() vs instanceof Error:
const genuineError1 = new Error();
const genuineError2 = new TypeError();
const genuineFalseNegativeError = new TypeError();
Object.setPrototypeOf(genuineFalseNegativeError, Object.prototype);
const fakeFalsePositiveError = Object.create(TypeError.prototype);
console.log(
  genuineError1 instanceof Error,
  genuineError2 instanceof Error,
  genuineFalseNegativeError instanceof Error,
  fakeFalsePositiveError instanceof Error,
); // true true false true
console.log(
  Error.isError(genuineError1),
  Error.isError(genuineError2),
  Error.isError(genuineFalseNegativeError),
  Error.isError(fakeFalsePositiveError),
); // true true true false
The feature has almost universal support, though Safari still returns false for DOMException.
Other important features
- Temporal API. Historically the old Date object has been a pain to work with. It has many quirks, missing features and footguns. The Temporal API fixes many of these problems.
- Explicit resource management. Adds a nice and clean syntax to automatically release resources.
Conclusion
Year 2026 brings several nice features that we hope will be implemented soon. These new features will make JavaScript more robust and powerful, allowing developers to build complex applications with greater confidence and fewer frustrations.