If we put export
in front of a named entity inside a module, it becomes a named export of that module. All other entities are private to the module.
//===== lib.mjs =====
// Named exports
export const one = 1, two = 2;
export function myFunc() {
return 3;
}
//===== main.mjs =====
// Named imports
import {one, myFunc as f} from './lib.mjs';
assert.equal(one, 1);
assert.equal(f(), 3);
// Namespace import
import * as lib from './lib.mjs';
assert.equal(lib.one, 1);
assert.equal(lib.myFunc(), 3);
The string after from
is called a module specifier. It identifies from which module we want to import.
import() [ES2020]
So far, all imports we have seen were static, with the following constraints:
Dynamic imports via import()
don’t have those constraints:
//===== lib.mjs =====
// Named exports
export const one = 1, two = 2;
export function myFunc() {
return 3;
}
//===== main.mjs =====
function importLibrary(moduleSpecifier) {
return import(moduleSpecifier)
.then((lib) => {
assert.equal(lib.one, 1);
assert.equal(lib.myFunc(), 3);
});
}
await importLibrary('./lib.mjs');
A default export is most often used when a module only contains a single entity (even though it can be combined with named exports):
//===== lib1.mjs =====
export default function getHello() {
return 'hello';
}
There can be at most one default export. That’s why const or let can’t be default-exported (line A):
//===== lib2.mjs =====
export default 123; // (A) instead of `const`
This is the syntax for importing default exports:
//===== main.mjs =====
import lib1 from './lib1.mjs';
assert.equal(lib1(), 'hello');
import lib2 from './lib2.mjs';
assert.equal(lib2, 123);
Module specifiers identify modules. There are three kinds of them:
Absolute specifiers are full URLs – for example:
'https://www.unpkg.com/browse/yargs@17.3.1/browser.mjs'
'file:///opt/nodejs/config.mjs'
Absolute specifiers are mostly used to access libraries that are directly hosted on the web.
Relative specifiers are relative URLs (starting with '/'
, './'
or '../'
) – for example:
'./sibling-module.js'
'../module-in-parent-dir.mjs'
'../../dir/other-module.js'
Every module has a URL whose protocol depends on its location (file:
, https:
, etc.). If it uses a relative specifier, JavaScript turns that specifier into a full URL by resolving it against the module’s URL.
Relative specifiers are mostly used to access other modules within the same code base.
Bare specifiers are paths (without protocol and domain) that start with neither slashes nor dots. They begin with the names of packages. Those names can optionally be followed by subpaths:
'some-package'
'some-package/sync'
'some-package/util/files/path-tools.js'
Bare specifiers can also refer to packages with scoped names:
'@some-scope/scoped-name'
'@some-scope/scoped-name/async'
'@some-scope/scoped-name/dir/some-module.mjs'
Each bare specifier refers to exactly one module inside a package; if it has no subpath, it refers to the designated “main” module of its package.
A bare specifier is never used directly but always resolved – translated to an absolute specifier. How resolution works depends on the platform.
What does “source code unit” mean in the world of JavaScript?
JavaScript has a rich history of source code units: ES6 brought built-in modules, but older formats are still around, too. Understanding the latter helps understand the former, so let’s investigate. The next sections describe the following ways of delivering JavaScript source code:
Table 29.1 gives an overview of these source code units. Note that we can choose between two filename extensions for CommonJS modules and ECMAScript modules. Which choice to make depends on how we want to use a file. Details are given later in this chapter.
| | Usage | Runs on | Loaded | Filename ext. |
|---|---|---|---|---|
| Script | Legacy | browsers | async | .js |
| CommonJS module | Decreasing | servers | sync | .js .cjs |
| AMD module | Legacy | browsers | async | .js |
| ECMAScript module | Modern | browsers, servers | async | .js .mjs |

Table 29.1: Ways of delivering JavaScript source code.
Before we get to built-in modules (which were introduced with ES6), all code that we’ll see will be written in ES5. Among other things, that means: there is no const and no let; only var.
Initially, browsers only had scripts – pieces of code that were executed in global scope. As an example, consider an HTML file that loads script files via the following HTML:
<script src="other-module1.js"></script>
<script src="other-module2.js"></script>
<script src="my-module.js"></script>
The main file is my-module.js
, where we simulate a module:
var myModule = (function () { // Open IIFE
// Imports (via global variables)
var importedFunc1 = otherModule1.importedFunc1;
var importedFunc2 = otherModule2.importedFunc2;
// Body
function internalFunc() {
// ···
}
function exportedFunc() {
importedFunc1();
importedFunc2();
internalFunc();
}
// Exports (assigned to global variable `myModule`)
return {
exportedFunc: exportedFunc,
};
})(); // Close IIFE
myModule is a global variable that is assigned the result of immediately invoking a function expression. The function expression starts in the first line and is invoked in the last line.
This way of wrapping a code fragment is called an immediately invoked function expression (IIFE, a term coined by Ben Alman). What do we gain from an IIFE? var is not block-scoped (like const and let); it is function-scoped: the only way to create new scopes for var-declared variables is via functions or methods (with const and let, we can use functions, methods, or blocks {}). Therefore, the IIFE in the example hides all of the following variables from global scope and minimizes name clashes: importedFunc1, importedFunc2, internalFunc, exportedFunc.
Note that we are using an IIFE in a particular manner: at the end, we pick what we want to export and return it via an object literal. That is called the revealing module pattern (coined by Christian Heilmann).
This way of simulating modules has several issues:
Prior to ECMAScript 6, JavaScript did not have built-in modules. Therefore, the flexible syntax of the language was used to implement custom module systems within the language. Two popular ones are:
The original CommonJS standard for modules was created for server and desktop platforms. It was the foundation of the original Node.js module system, where it achieved enormous popularity. Contributing to that popularity were the npm package manager for Node and tools that enabled using Node modules on the client side (browserify, webpack, and others).
From now on, CommonJS module means the Node.js version of this standard (which has a few additional features). This is an example of a CommonJS module:
// Imports
var importedFunc1 = require('./other-module1.js').importedFunc1;
var importedFunc2 = require('./other-module2.js').importedFunc2;
// Body
function internalFunc() {
// ···
}
function exportedFunc() {
importedFunc1();
importedFunc2();
internalFunc();
}
// Exports
module.exports = {
exportedFunc: exportedFunc,
};
CommonJS can be characterized as follows:
The AMD module format was created to be easier to use in browsers than the CommonJS format. Its most popular implementation is RequireJS. The following is an example of an AMD module.
define(['./other-module1.js', './other-module2.js'],
function (otherModule1, otherModule2) {
var importedFunc1 = otherModule1.importedFunc1;
var importedFunc2 = otherModule2.importedFunc2;
function internalFunc() {
// ···
}
function exportedFunc() {
importedFunc1();
importedFunc2();
internalFunc();
}
return {
exportedFunc: exportedFunc,
};
});
AMD can be characterized as follows:
A benefit of AMD modules (and the reason why they work well in browsers): they can be executed directly. In contrast, CommonJS modules must either be compiled before deployment, or custom source code must be generated and evaluated dynamically (think eval()). That isn’t always permitted on the web.
Looking at CommonJS and AMD, similarities between JavaScript module systems emerge:
ECMAScript modules (ES modules or ESM) were introduced with ES6. They continue the tradition of JavaScript modules and have all of their aforementioned characteristics. Additionally:
ES modules also have new benefits:
This is an example of ES module syntax:
import {importedFunc1} from './other-module1.mjs';
import {importedFunc2} from './other-module2.mjs';
function internalFunc() {
// ···
}
export function exportedFunc() {
importedFunc1();
importedFunc2();
internalFunc();
}
From now on, “module” means “ECMAScript module”.
The full standard of ES modules comprises the following parts:
Parts 1 and 2 were introduced with ES6. Work on part 3 is ongoing.
Each module can have zero or more named exports.
As an example, consider the following two files:
lib/my-math.mjs
main.mjs
Module my-math.mjs
has two named exports: square
and LIGHT_SPEED
.
// Not exported, private to module
function times(a, b) {
return a * b;
}
export function square(x) {
return times(x, x);
}
export const LIGHT_SPEED = 299792458;
To export something, we put the keyword export
in front of a declaration. Entities that are not exported are private to a module and can’t be accessed from outside.
Module main.mjs
has a single named import, square
:
import {square} from './lib/my-math.mjs';
assert.equal(square(3), 9);
It can also rename its import:
import {square as sq} from './lib/my-math.mjs';
assert.equal(sq(3), 9);
Named imports and destructuring look similar:
import {func} from './util.mjs'; // import
const {func} = require('./util.mjs'); // destructuring
But they are quite different:
Imports remain connected with their exports.
We can destructure again inside a destructuring pattern, but the {}
in an import statement can’t be nested.
The syntax for renaming is different:
import {func as f} from './util.mjs'; // importing
const {func: f} = require('./util.mjs'); // destructuring
Rationale: Destructuring is reminiscent of an object literal (including nesting), while importing evokes the idea of renaming.
Exercise: Named exports
exercises/modules/export_named_test.mjs
Namespace imports are an alternative to named imports. If we namespace-import a module, it becomes an object whose properties are the named exports. This is what main.mjs
looks like if we use a namespace import:
import * as myMath from './lib/my-math.mjs';
assert.equal(myMath.square(3), 9);
assert.deepEqual(
Object.keys(myMath), ['LIGHT_SPEED', 'square']
);
The named export style we have seen so far was inline: We exported entities by prefixing them with the keyword export
.
But we can also use separate export clauses. For example, this is what lib/my-math.mjs
looks like with an export clause:
function times(a, b) {
return a * b;
}
function square(x) {
return times(x, x);
}
const LIGHT_SPEED = 299792458;
export { square, LIGHT_SPEED }; // semicolon!
With an export clause, we can rename before exporting and use different names internally:
function times(a, b) {
return a * b;
}
function sq(x) {
return times(x, x);
}
const LS = 299792458;
export {
sq as square,
LS as LIGHT_SPEED, // trailing comma is optional
};
Each module can have at most one default export. The idea is that the module is the default-exported value.
As an example of default exports, consider the following two files:
my-func.mjs
main.mjs
Module my-func.mjs
has a default export:
const GREETING = 'Hello!';
export default function () {
return GREETING;
}
Module main.mjs
default-imports the exported function:
import myFunc from './my-func.mjs';
assert.equal(myFunc(), 'Hello!');
Note the syntactic difference: the curly braces around named imports indicate that we are reaching into the module, while a default import is the module.
What are use cases for default exports?
The most common use case for a default export is a module that contains a single function or a single class.
There are two styles of doing default exports.
First, we can label existing declarations with export default
:
export default function myFunc() {} // no semicolon!
export default class MyClass {} // no semicolon!
Second, we can directly default-export values. This style of export default is much like a declaration.
export default myFunc; // defined elsewhere
export default MyClass; // defined previously
export default Math.sqrt(2); // result of invocation is default-exported
export default 'abc' + 'def';
export default { no: false, yes: true };
We need the second style because export default can’t be used to label const: const may define multiple values, but export default needs exactly one value. Consider the following hypothetical code:
// Not legal JavaScript!
export default const a = 1, b = 2, c = 3;
With this code, we don’t know which one of the three values is the default export.
Exercise: Default exports
exercises/modules/export_default_test.mjs
Internally, a default export is simply a named export whose name is default
. As an example, consider the previous module my-func.mjs
with a default export:
const GREETING = 'Hello!';
export default function () {
return GREETING;
}
The following module my-func2.mjs
is equivalent to that module:
const GREETING = 'Hello!';
function greet() {
return GREETING;
}
export {
greet as default,
};
For importing, we can use a normal default import:
import myFunc from './my-func2.mjs';
assert.equal(myFunc(), 'Hello!');
Or we can use a named import:
import {default as myFunc} from './my-func2.mjs';
assert.equal(myFunc(), 'Hello!');
The default export is also available via property .default
of namespace imports:
import * as mf from './my-func2.mjs';
assert.equal(mf.default(), 'Hello!');
Isn’t default illegal as a variable name?
default can’t be a variable name, but it can be an export name and it can be a property name:
const obj = {
default: 123,
};
assert.equal(obj.default, 123);
These are my recommendations:
Avoid mixing named exports and default exports: A module can have both named exports and a default export, but it’s usually better to stick to one export style per module.
In some cases, you may be sure that the module will only ever export a single value (usually a function or a class). That is, conceptually, the module is the value – similarly to a variable. Then a default export is a good option.
You can never go wrong with only using named exports.
A module library.mjs
can export one or more exports of another module internal.mjs
as if it had made them itself. That is called re-exporting.
//===== internal.mjs =====
export function internalFunc() {}
export const INTERNAL_DEF = 'hello';
export default 123;
//===== library.mjs =====
// Named re-export [ES6]
export {internalFunc as func, INTERNAL_DEF as DEF} from './internal.mjs';
// Wildcard re-export [ES6]
export * from './internal.mjs';
// Namespace re-export [ES2020]
export * as ns from './internal.mjs';
The wildcard re-export turns all named exports of internal.mjs into exports of library.mjs – except the default export.
The namespace re-export turns all exports of internal.mjs into an object that becomes the named export ns of library.mjs. Because internal.mjs has a default export, ns has a property .default.
The following code demonstrates the two points above:
//===== main.mjs =====
import * as library from './library.mjs';
assert.deepEqual(
Object.keys(library),
['DEF', 'INTERNAL_DEF', 'func', 'internalFunc', 'ns']
);
assert.deepEqual(
Object.keys(library.ns),
['INTERNAL_DEF', 'default', 'internalFunc']
);
So far, we have used imports and exports intuitively, and everything seems to have worked as expected. But now it is time to take a closer look at how imports and exports are really related.
Consider the following two modules:
counter.mjs
main.mjs
counter.mjs
exports a (mutable!) variable and a function:
export let counter = 3;
export function incCounter() {
counter++;
}
main.mjs
name-imports both exports. When we use incCounter()
, we discover that the connection to counter
is live – we can always access the live state of that variable:
import { counter, incCounter } from './counter.mjs';
// The imported value `counter` is live
assert.equal(counter, 3);
incCounter();
assert.equal(counter, 4);
Note that while the connection is live and we can read counter
, we cannot change this variable (e.g., via counter++
).
There are two benefits to handling imports this way:
ESM supports cyclic imports transparently. To understand how that is achieved, consider the following example: figure 29.1 shows a directed graph of modules importing other modules. P importing M is the cycle in this case.
Figure 29.1: A directed graph of modules importing modules: M imports N and O, N imports P and Q, etc.
After parsing, these modules are set up in two phases:
This approach handles cyclic imports correctly, due to two features of ES modules:
Due to the static structure of ES modules, the exports are already known after parsing. That makes it possible to instantiate P before its child M: P can already look up M’s exports.
When P is evaluated, M hasn’t been evaluated yet. However, entities in P can already mention imports from M; they just can’t use them yet, because the imported values are filled in later. For example, a function in P can access an import from M. The only limitation is that we must wait until after the evaluation of M before calling that function.
Imports being filled in later is enabled by them being “live immutable views” on exports.
In the JavaScript ecosystem, a package is a way of organizing software projects: it is a directory with a standardized layout. A package can contain all kinds of files – for example:
A package can depend on other packages (which are called its dependencies):
The dependencies of a package are installed inside that package (we’ll see how soon).
One common distinction between packages is:
The next subsection explains how packages can be published.
The main way of publishing a package is to upload it to a package registry – an online software repository. Two popular public registries are:
Companies can also host their own private registries.
A package manager is a command line tool that downloads packages from a registry (or other sources) and installs them as shell scripts and/or as dependencies. The most popular package manager is called npm and comes bundled with Node.js. Its name originally stood for “Node Package Manager”. Later, when npm and the npm registry were used not only for Node.js packages, that meaning was changed to “npm is not a package manager” ([source](https://en.wikipedia.org/wiki/Npm_(software)#Acronym)). There are other popular package managers such as jsr, vlt, pnpm and yarn. All of these package managers support the npm registry, JSR, or both.
Let’s explore how the npm registry works. Each package has a name. There are two kinds of names:
Global names are unique across the whole registry. These are two examples:
minimatch
mocha
Scoped names consist of two parts: A scope and a name. Scopes are globally unique, names are unique per scope. These are two examples:
@babel/core
@rauschma/iterable
The scope starts with an @
symbol and is separated from the name with a slash.
Once a package my-package
is fully installed, it almost always looks like this:
my-package/
package.json
node_modules/
[More files]
What are the purposes of these file system entries?
package.json is a file every package must have.
node_modules/ is a directory into which the dependencies of the package are installed. Each dependency also has a node_modules folder with its own dependencies, etc. The result is a tree of dependencies.
Most packages also have the file package-lock.json
that sits next to package.json
: It records the exact versions of the dependencies that were installed and is kept up to date if we add more dependencies via npm.
This is a starter package.json that can be created via npm:
{
"name": "my-package",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
What are the purposes of these properties?
Some properties are required for public packages (published on the npm registry):
name
specifies the name of this package.
version is used for version management and follows semantic versioning with three dot-separated numbers: major.minor.patch.
Other properties for public packages are optional:
description, keywords, and author are optional and make it easier to find packages.
license
clarifies how this package can be used. It makes sense to provide this value if the package is public in any way. “Choose an open source license” can help with making this choice.
main
is a legacy property and has been superseded by exports
. It points to the code of a library package.
scripts
is a property for setting up abbreviations for development-time shell commands. These can be executed via npm run
. For example, the script test
can be executed via npm run test
.
More useful properties:
Normally, the properties name
and version
are required and npm warns us if they are missing. However, we can change that via the following setting:
"private": true
That prevents the package from accidentally being published and allows us to omit name and version.
exports
is for package exports – which specify how importers see the content of this package. We’ll learn more about package exports later.
imports
is for package imports – which define aliases for module specifiers that packages can use internally. We’ll learn more about package imports later.
dependencies
lists the dependencies of a package.
devDependencies
are dependencies that are only installed during development (not when a package is added as a dependency).
The following setting means that all files with the name extension .js
are interpreted as ECMAScript modules. Unless we are dealing with legacy code, it makes sense to add it:
"type": "module"
bin
lists modules within the package that are installed as shell scripts.
Package exports are specified via property "exports"
in package.json
and support three important features:
Hiding the internals of a package:
Without property "exports"
, every module in a package my-lib
can be accessed via a relative path after the package name – e.g.:
'my-lib/dist/src/internal/internal-module.js'
Once the property exists, only specifiers listed in it can be used. Everything else is hidden from the outside.
Nicer module specifiers: Package exports let us change the bare specifier subpaths for importing the modules of a package: They can be shorter, extension-less, etc.
Conditional exports: The same module specifier exports different modules – depending on which JavaScript platform an importer uses (browser, Node.js, etc.).
Next, we’ll look at some examples. For a more detailed explanation of how package exports work, see section “Package exports: controlling what other packages see” in “Shell scripting with Node.js”.
Example – specifying which module is imported via the bare specifier of a package (in the past, this was specified via property main
):
"exports": {
".": "./dist/src/main.js"
}
Example – specifying a better path for a module:
"exports": {
// With filename extension
"./util/errors.js": "./dist/src/util/errors.js",
// Without filename extension
"./util/errors": "./dist/src/util/errors.js"
}
Example – specifying better paths for a tree of modules (the following two patterns are alternatives; a real package.json would contain only one of them):
"exports": {
// With filename extensions
"./*": "./dist/src/*",
// Without filename extensions
"./*": "./dist/src/*.js"
}
The examples in this subsection show excerpts of package.json
.
Example – export different modules for Node.js, browsers and other platforms:
"exports": {
".": {
"node": "./main-node.js",
"browser": "./main-browser.js",
"default": "./main-browser.js"
}
}
Example – development vs. production:
"exports": {
".": {
"development": "./main-development.js",
"production": "./main-production.js"
}
}
In Node.js we can specify an environment like this:
node --conditions development app.mjs
Package imports let a package define abbreviations for module specifiers that it can use itself, internally (whereas package exports define abbreviations for other packages to use). This is an example:
package.json
:
{
"imports": {
"#some-pkg": {
"node": "some-pkg-node-native",
"default": "./polyfills/some-pkg-polyfill.js"
}
},
"dependencies": {
"some-pkg-node-native": "^1.2.3"
}
}
Each of the keys of "imports"
has to start with a hash sign (#
). The key "#some-pkg"
is conditional (with the same features as conditional package exports):
If the current package is used on Node.js, the module specifier '#some-pkg'
refers to package some-pkg-node-native
.
Elsewhere, '#some-pkg'
refers to the file ./polyfills/some-pkg-polyfill.js
inside the current package.
Note that only package imports can refer to external packages; package exports can’t do that.
What are the use cases for package imports?
Accessing package.json via package imports
Let’s explore two ways of accessing package.json via package imports.
First, we can define a package import for the root level of the package:
"imports": {
"#root/*": "./*"
},
Then the import statement looks like this:
import pkg from '#root/package.json' with { type: 'json' };
console.log(pkg.version);
Second, we can define a package import just for package.json
:
"imports": {
"#pkg": "./package.json"
},
Then the import statement looks like this:
import pkg from '#pkg' with { type: 'json' };
console.log(pkg.version);
There are no established best practices for naming module files and the variables they are imported into.
In this chapter, I’m using the following naming style:
The names of module files are dash-cased and only have lowercase letters:
./my-module.mjs
./some-func.mjs
The names of namespace imports are camel-cased and start with lowercase letters:
import * as myModule from './my-module.mjs';
The names of default imports are camel-cased and start with lowercase letters:
import someFunc from './some-func.mjs';
What is the thinking behind this style? We want module file names to be similar to package names:
Dashes are far more commonly used than underscores in package names. Maybe that is influenced by underscores being very rare in domain names.
npm doesn’t allow uppercase letters in package names (source).
Thanks to CSS, there are clear rules for translating dash-cased names to camel-cased names. We can use these rules for namespace imports and default imports.
Module specifiers are the strings that identify modules. They work slightly differently in browsers and Node.js. As we saw earlier, there are three kinds of module specifiers: absolute specifiers (full URLs), relative specifiers (relative URLs starting with '/', './' or '../') and bare specifiers (which start with package names, optionally followed by subpaths). A bare specifier is never used directly but always resolved – translated to an absolute specifier. How resolution works depends on the platform.
Bare specifiers come in three styles; their subpaths may or may not have filename extensions such as .js or .mjs.
Style 1: no subpath
'my-library'
Style 2: a subpath without a filename extension. In this case, the subpath works like a modifier for the package name:
'my-parser/sync'
'my-parser/async'
'assertions'
'assertions/strict'
Style 3: a subpath with a filename extension. In this case, the package is seen as a collection of modules and the subpath points to one of them:
'large-package/misc/util.js'
'large-package/main/parsing.js'
'large-package/main/printing.js'
Caveat of style 3 bare specifiers: How the filename extension is interpreted depends on the dependency and may differ from the importing package. For example, the importing package may use .mjs
for ESM modules and .js
for CommonJS modules, while the ESM modules exported by the dependency may have bare paths with the filename extension .js
.
Let’s see how module specifiers work in Node.js. Bare specifiers, in particular, are handled differently than in browsers.
The Node.js resolution algorithm works as follows:
If a specifier is absolute, resolution is already finished. Three protocols are most common:
file:
for local files
https:
for remote files
node:
for built-in modules
If a specifier is relative, it is resolved against the URL of the importing module.
If a specifier is bare:
If it starts with '#'
, it is resolved by looking it up among the package imports (which are explained later) and resolving the result.
Otherwise, it is a bare specifier that has one of these formats (the subpath is optional):
«package»/sub/path
@«scope»/«scoped-package»/sub/path
The resolution algorithm traverses the current directory and its ancestors until it finds a directory node_modules
that has a subdirectory matching the beginning of the bare specifier, i.e. either:
node_modules/«package»/
node_modules/@«scope»/«scoped-package»/
That directory is the directory of the package. By default, the (potentially empty) subpath after the package ID is interpreted as relative to the package directory. The default can be overridden via package exports which are explained next.
The result of the resolution algorithm must point to a file. That explains why absolute specifiers and relative specifiers always have filename extensions. Bare specifiers often don’t because they are abbreviations that are looked up in package exports.
Module files usually have these filename extensions:
If a file has the extension .mjs, it is always an ES module.
A file with the extension .js is an ES module if the closest package.json has this entry:
"type": "module"
If Node.js executes code provided via stdin, --eval
or --print
, we use the following command-line option so that it is interpreted as an ES module:
--input-type=module
In browsers, we can write inline modules like this:
<script type="module">
// Inline module
</script>
The attribute type="module" tells the browser that this is an ES module and not a classic browser script.
We can only use two kinds of module specifiers:
<!-- Absolute module specifier -->
<script type="module" src="https://unpkg.com/lodash"></script>
<!-- Relative module specifier -->
<script type="module" src="bundle.js"></script>
Read on to find out how to work around this limitation and use npm packages.
Browsers don’t care about filename extensions, only about content types.
Hence, we can use any filename extension for ECMAScript modules, as long as they are served with a JavaScript content type (text/javascript
is recommended).
On Node.js, npm packages are downloaded into the node_modules
directory and accessed via bare module specifiers. Node.js traverses the file system in order to find packages. We can’t do that in web browsers. Three approaches are common for bringing npm packages to browsers.
Content delivery networks (CDNs) such as unpkg.com
and esm.sh
let us import npm packages via URLs. This is what the unpkg.com
URLs look like:
https://unpkg.com/«package»@«version»/«file»
For example:
https://unpkg.com/lodash@4.17.21/lodash.js
One downside of CDNs is that they introduce an additional point of failure.
node_modules with bare specifiers and a bundler
A bundler is a build tool. It works roughly as follows: starting from an entry-point module, it follows all imports and combines the modules it finds into a single output file – a bundle.
If an app has multiple entry points, the bundler produces multiple bundles. It’s also possible to tell it to create bundles for parts of the application that are loaded on demand.
When bundling, we can use bare import specifiers in files because bundlers know how to find the corresponding modules in node_modules
. Bundlers also honor package exports and package imports.
Why bundle?
Loading a single bundled file is usually faster than loading many small module files. A downside of bundling is that we need to bundle the whole app every time we want to run it.
There are package managers for browsers that let us download modules as single bundled files that can be used in browsers. As an example, consider the following directory of a web app:
my-web-app/
assets/
lodash-es.js
src/
main.js
We used a bundler to install package lodash-es
into a single file. Module main.js
can import it like this:
import {pick} from '../assets/lodash-es.js';
To deploy this app, the contents of assets/
and src/
are copied to the production server (in addition to non-JavaScript artifacts).
What are the benefits of this approach compared to using a bundler?
Approach 3 can be further improved: Import maps are a browser technology that lets us define abbreviations for module specifiers – e.g. 'lodash-es'
for '../assets/lodash-es.js'
.
This is what an import map looks like if we store it inline – inside an HTML file:
<script type="importmap">
{
"imports": {
"lodash-es": "./assets/lodash-es.js"
}
}
</script>
We can also store import maps in external files (the content type must be application/importmap+json
):
<script type="importmap" src="imports.importmap"></script>
Now the import in main.js
looks like this:
import {pick} from 'lodash-es';
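Putting it together, a page sketch might look as follows; note that the import map must appear before the first module script that relies on it (the file names are the ones used above):

```html
<!-- The import map must come before any module script that uses it -->
<script type="importmap">
{
  "imports": {
    "lodash-es": "./assets/lodash-es.js"
  }
}
</script>
<!-- main.js can now use the bare specifier 'lodash-es' -->
<script type="module" src="./src/main.js"></script>
```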
import.meta
– metadata for the current module ES2020
The object import.meta
holds metadata for the current module.
import.meta.url
The most important property of import.meta
is .url
which contains a string with the URL of the current module’s file – for example:
'https://example.com/code/main.mjs'
import.meta.url
and class URL
Class URL
is available via a global variable in browsers and on Node.js. We can look up its full functionality in the Node.js documentation. When working with import.meta.url
, its constructor is especially useful:
new URL(input: string, base?: string|URL)
Parameter input
contains the URL to be parsed. It can be relative if the second parameter, base
, is provided.
In other words, this constructor lets us resolve a relative path against a base URL:
> new URL('other.mjs', 'https://example.com/code/main.mjs').href
'https://example.com/code/other.mjs'
> new URL('../other.mjs', 'https://example.com/code/main.mjs').href
'https://example.com/other.mjs'
This is how we get a URL
instance that points to a file data.txt
that sits next to the current module:
const urlOfData = new URL('data.txt', import.meta.url);
import.meta.url on Node.js
On Node.js, import.meta.url is always a string with a file: URL – for example:
'file:///Users/rauschma/my-module.mjs'
Many Node.js file system operations accept either strings with paths or instances of URL
. That enables us to read a sibling file data.txt
of the current module:
import * as fs from 'node:fs';
function readData() {
// data.txt sits next to current module
const urlOfData = new URL('data.txt', import.meta.url);
return fs.readFileSync(urlOfData, {encoding: 'utf-8'});
}
fs and URLs
For most functions of the module fs, we can refer to files via:
strings with paths
instances of Buffer
instances of URL (with the protocol file:)
For more information on this topic, see the Node.js API documentation.
file: URLs and paths
The Node.js module url has two functions for converting between file: URLs and paths:
fileURLToPath(url: URL|string): string
Converts a file: URL to a path.
pathToFileURL(path: string): URL
Converts a path to a file: URL.
If we need a path that can be used in the local file system, then property .pathname
of URL
instances does not always work:
assert.equal(
new URL('file:///tmp/with%20space.txt').pathname,
'/tmp/with%20space.txt');
Therefore, it is better to use fileURLToPath()
:
import * as url from 'node:url';
assert.equal(
url.fileURLToPath('file:///tmp/with%20space.txt'),
'/tmp/with space.txt'); // result on Unix
Similarly, pathToFileURL()
does more than just prepend 'file://'
to an absolute path.
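For example, pathToFileURL() also percent-encodes characters that are not allowed in URLs, such as spaces (result shown for Unix):

```javascript
import * as url from 'node:url';

// The space in the path is percent-encoded in the URL:
const fileUrl = url.pathToFileURL('/tmp/with space.txt');
console.log(fileUrl.href); // 'file:///tmp/with%20space.txt' (on Unix)
```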
import() ES2020 (advanced)
The import() operator returns Promises
Promises are a technique for handling results that are computed asynchronously (i.e., not immediately). It may make sense to postpone reading this section until you understand them – especially the await operator for Promises, which we use in this section.
The limitations of import statements
So far, the only way to import a module has been via an import statement. That statement has several limitations:
The module specifier must be a fixed string; we can’t compute it at runtime.
An import must appear at the top level of a module; we can’t nest it inside, e.g., an if statement.
The import() operator
The import() operator doesn’t have the limitations of import statements. It looks like this:
const namespaceObject = await import(moduleSpecifierStr);
console.log(namespaceObject.namedExport);
This operator is used like a function, receives a string with a module specifier and returns a Promise that resolves to a namespace object. The properties of that object are the exports of the imported module.
Note that await
can be used at the top levels of modules (see next section).
Consider the following files:
lib/my-math.mjs
main1.mjs
main2.mjs
We have already seen module my-math.mjs
:
// Not exported, private to module
function times(a, b) {
return a * b;
}
export function square(x) {
return times(x, x);
}
export const LIGHT_SPEED = 299792458;
We can use import()
to load this module on demand:
// main1.mjs
const moduleSpecifier = './lib/my-math.mjs';
async function getLightSpeedAsync() {
const myMath = await import(moduleSpecifier);
return myMath.LIGHT_SPEED;
}
const result = await getLightSpeedAsync();
assert.equal(result, 299792458);
Two things in this code can’t be done with import statements:
The module specifier moduleSpecifier comes from a variable, not a fixed string.
The import happens inside a function and is only performed on demand.
Why is import() an operator and not a function?
import() looks like a function but couldn’t be implemented as one: to resolve a relative module specifier, import() must know the URL of the module in which it appears. If import() were a function, we’d have to explicitly pass this information to it (e.g. via a parameter).
Use cases for import()
Some functionality of web apps doesn’t have to be present when they start; it can be loaded on demand. Then import() helps because we can put such functionality into modules – for example:
button.addEventListener('click', async (event) => {
const dialogBox = await import('./dialogBox.mjs');
dialogBox.open();
});
We may want to load a module depending on whether a condition is true. For example, a module with a polyfill that makes a new feature available on legacy platforms:
if (isLegacyPlatform()) {
await import('./my-polyfill.mjs');
}
For applications such as internationalization, it helps if we can dynamically compute module specifiers:
const message = await import(`messages_${getLocale()}.mjs`);
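Since the specifier is an arbitrary string, we can even import a module that is computed at runtime, via a data: URL – both browsers and Node.js support this for ES modules (a demo technique, not something for production code):

```javascript
// The complete module source code is embedded in the specifier:
const moduleSpecifier =
  'data:text/javascript,export const LIGHT_SPEED = 299792458;';
const myMath = await import(moduleSpecifier);
console.log(myMath.LIGHT_SPEED); // 299792458
```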
await
in modules ES2022 (advanced)
await
is a feature of async functions
await
is explained in “Async functions ES2017” (§44). It may make sense to postpone reading this section until you understand async functions.
We can use the await
operator at the top level of a module. If we do that, the module becomes asynchronous and works differently. Thankfully, we don’t usually see that as programmers because it is handled transparently by the language.
Use cases for top-level await
Why would we want to use the await operator at the top level of a module? It lets us initialize a module with asynchronously loaded data. The next three subsections show three examples of where that is useful.
const params = new URLSearchParams(location.search);
const language = params.get('lang');
const messages = await import(`./messages-${language}.mjs`); // (A)
console.log(messages.welcome);
In line A, we dynamically import a module. Thanks to top-level await
, that is almost as convenient as using a normal, static import.
let mylib;
try {
mylib = await import('https://primary.example.com/mylib');
} catch {
mylib = await import('https://secondary.example.com/mylib');
}
const resource = await Promise.any([
fetch('http://example.com/first.txt')
.then(response => response.text()),
fetch('http://example.com/second.txt')
.then(response => response.text()),
]);
Due to Promise.any()
, variable resource
is initialized via whichever download finishes first.
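The behavior of Promise.any() can be demonstrated with a self-contained sketch that uses timeouts instead of network downloads:

```javascript
// Two Promises that are fulfilled after different delays:
const slow = new Promise((resolve) => setTimeout(resolve, 50, 'slow'));
const fast = new Promise((resolve) => setTimeout(resolve, 10, 'fast'));

// Promise.any() settles with the value of whichever input
// Promise is fulfilled first:
const winner = await Promise.any([slow, fast]);
console.log(winner); // 'fast'
```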
How does top-level await work under the hood?
Consider the following two files.
first.mjs
:
const response = await fetch('http://example.com/first.txt');
export const first = await response.text();
main.mjs
:
import {first} from './first.mjs';
import {second} from './second.mjs';
assert.equal(first, 'First!');
assert.equal(second, 'Second!');
Both are roughly equivalent to the following code:
first.mjs
:
export let first;
export const promise = (async () => { // (A)
const response = await fetch('http://example.com/first.txt');
first = await response.text();
})();
main.mjs
:
import {promise as firstPromise, first} from './first.mjs';
import {promise as secondPromise, second} from './second.mjs';
export const promise = (async () => { // (B)
await Promise.all([firstPromise, secondPromise]); // (C)
assert.equal(first, 'First!');
assert.equal(second, 'Second!');
})();
A module becomes asynchronous if:
It directly uses top-level await (first.mjs).
It imports one or more asynchronous modules (main.mjs).
Each asynchronous module exports a Promise (line A and line B) that is fulfilled after its body was executed. At that point, it is safe to access the exports of that module.
In case (2), the importing module waits until the Promises of all imported asynchronous modules are fulfilled before it enters its body (line C). Synchronous modules are handled as usual.
Awaited rejections and synchronous exceptions are managed as in async functions.
Pros and cons of top-level await
What are the pros and cons of top-level await?
Pros:
It lets us initialize a module with asynchronously loaded data – almost as conveniently as with a normal, static import.
Cons:
Top-level await delays the initialization of importing modules. Therefore, it’s best used sparingly. Asynchronous tasks that take longer are better performed later, on demand. However, even modules without top-level await can block importers (e.g. via an infinite loop at the top level), so blocking per se is not an argument against it.
Modules that use top-level await cannot be required from CommonJS. That matters if you write an ESM-based package and want it to be usable from CommonJS code bases. For more information, see section “Loading ECMAScript modules using require()” in the Node.js documentation.
Importing artifacts that are not JavaScript code as modules has a long tradition in the JavaScript ecosystem. For example, the JavaScript module loader RequireJS has support for so-called plugins. To give you a feeling for how old RequireJS is: version 1.0.0 was released in 2009. Specifiers of modules that are imported via a plugin look like this:
'«specifier-of-plugin-module»!«specifier-of-artifact»'
For example, the following module specifier imports a file as JSON:
'json!./data/config.json'
Inspired by RequireJS, webpack supports the same module specifier syntax for its loaders.
These are a few use cases for importing non-JavaScript artifacts: JSON data (e.g. configuration files), CSS, images, etc. For more use cases, you can take a look at the list of webpack’s loaders.
The motivating use case for import attributes was importing JSON data as a module. That looks as follows:
import configData from './config-data.json' with { type: 'json' };
type
is an import attribute (more on the syntax soon).
You may wonder why a JavaScript engine can’t use the filename extension .json
to determine that this is JSON data. However, a core architectural principle of the web is to never use the filename extension to determine what’s inside a file. Instead, content types are used.
If a server is set up correctly, then why not do a normal import and omit the import attributes? The reason is security: a misconfigured or compromised server could send executable JavaScript instead of JSON, and a normal import would run that code. With the attribute type: 'json', the imported file is parsed as JSON and never executed.
Let’s examine in more detail what import attributes look like.
We have already seen a normal (static) import statement:
import configData from './config-data.json' with { type: 'json' };
The import attributes start with the keyword with. That keyword is followed by an object literal. For now, only a subset of object literal syntax is supported: keys can be identifiers or string literals, and values must be string literals. There are no other syntactic restrictions placed on the keys and the values, but engines should throw an exception if they don’t support a key and/or a value.
To support import attributes, dynamic imports get a second parameter – an object with configuration data:
const configData = await import(
'./config-data.json', { with: { type: 'json' } }
);
The import attributes don’t exist at the top level; they are specified via the property with
. That makes it possible to add more configuration options in the future.
A re-export imports and exports in a single step. For the importing part, we need import attributes:
export { default as config } from './config-data.json' with { type: 'json' };
Import attributes are really just syntax. They lay the foundation for actual features that make use of that syntax. The first ECMAScript feature based on import attributes is JSON modules – which we’ve already seen in action:
This is a file config-data.json
:
{
"version": "1.0.0",
"maxCount": 20
}
It sits next to the following ECMAScript module main.js
:
import configData from './config-data.json' with { type: 'json' };
assert.deepEqual(
configData,
{
version: '1.0.0',
maxCount: 20
}
);
Exercise: Importing JSON
exercises/modules/get-version_test.mjs
Backends have polyfills, too
This section is about frontend development and web browsers, but similar ideas apply to backend development.
Polyfills help with a conflict that we are facing when developing a web application in JavaScript: we want to use modern web platform features, but the application must also run on platforms that don’t support all of them. Given a web platform feature X:
A polyfill for X is a piece of code. If it is executed on a platform that already has built-in support for X, it does nothing. Otherwise, it makes the feature available on the platform. In the latter case, the polyfilled feature is (mostly) indistinguishable from a native implementation. In order to achieve that, the polyfill usually makes global changes. For example, it may modify global data or configure a global module loader. Polyfills are often packaged as modules.
A speculative polyfill is a polyfill for a proposed web platform feature (that is not standardized, yet).
A replica of X is a library that reproduces the API and functionality of X locally. Such a library exists independently of a native (and global) implementation of X.
There is also the term shim, but it doesn’t have a universally agreed upon definition. It often means roughly the same as polyfill.
Every time our web application starts, it must first execute all polyfills for features that may not be available everywhere. Afterwards, we can be sure that those features are available.
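The core pattern of a polyfill can be sketched as follows; Array.prototype.sum is a hypothetical feature, used only for illustration:

```javascript
// Only make the global change if the feature is missing:
if (typeof Array.prototype.sum !== 'function') {
  Array.prototype.sum = function () {
    return this.reduce((acc, x) => acc + x, 0);
  };
}

// After the polyfill has run, code can use the feature
// without checking for its existence:
console.log([1, 2, 3].sum()); // 6
```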