# Contributing
## Adding a new plugin or polyfill to support (when approved in the next ECMAScript version)
### Update `plugin-features.js`
Example:

If you were going to add `**`, which is in ES2016:

Find the relevant entries on compat-table: `exponentiation (**) operator`

Find the corresponding Babel plugin: `@babel/plugin-transform-exponentiation-operator`

And add them in this structure:
```js
// es2016
"@babel/plugin-transform-exponentiation-operator": {
  features: [
    "exponentiation (**) operator",
  ],
},
```
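For orientation, here is a minimal sketch of how such an entry sits alongside others in `plugin-features.js` (the neighboring entry and overall file shape are illustrative, not copied from the real source):

```js
// plugin-features.js (sketch; neighboring entries are illustrative)
module.exports = {
  // es2015
  "@babel/plugin-transform-arrow-functions": {
    features: ["arrow functions"],
  },
  // es2016
  "@babel/plugin-transform-exponentiation-operator": {
    features: [
      "exponentiation (**) operator",
    ],
  },
};
```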
### Update data for `core-js@2` polyfilling
Example:

In case you want to add `Object.values`, which is in ES2017:

Find the relevant feature and subfeature on compat-table and join them with `/`: `Object static methods / Object.values`

Find the corresponding module in core-js@2: `es7.object.values.js`

Find the required ES version in `corejs2-built-in-features.js` and add the new feature:
```js
const es = {
  // ...
  "es7.object.values": "Object static methods / Object.values"
}
```
If you want to transform a new built-in with `useBuiltIns: 'usage'`, add a mapping to the related `core-js` modules in this file.
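As a rough sketch of what such a mapping can look like (the variable name and exact file layout here are assumptions, not the real source):

```js
// Hypothetical sketch (names assumed): with `useBuiltIns: 'usage'`,
// seeing `Object.values(...)` in source should inject this core-js module.
const StaticProperties = {
  Object: {
    values: "es7.object.values",
  },
};
```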
### Update data for `core-js@3` polyfilling
Just update the version of `core-js-compat` in `dependencies`.
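The bump is a one-line change in `package.json`; the version number below is illustrative:

```json
{
  "dependencies": {
    "core-js-compat": "^3.6.0"
  }
}
```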
If you want to transform a new built-in with `useBuiltIns: 'usage'`, add a mapping to the related `core-js` modules in this file.
If you want to mark a new proposal as shipped, add it to this list.
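As a sketch of what that can look like (the file name and data shape are assumptions):

```js
// shipped-proposals.js (hypothetical sketch; file name and shape assumed)
// Proposal plugins that preset-env should treat as shipped.
const proposalPlugins = new Set([
  "@babel/plugin-proposal-unicode-property-regex",
]);
```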
### Update `plugins.json`
Until `compat-table` is a standalone npm module for its data, we are using the git URL `"compat-table": "kangax/compat-table#[latest-commit-hash]"`. So we update the commit hash and then run `npm run build-data`. If there are no changes, then `plugins.json` will be the same.
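The flow is roughly (assuming `compat-table` is pinned in `package.json` as above):

```sh
# After bumping the compat-table commit hash in package.json:
npm install
npm run build-data
# plugins.json only changes if the regenerated data differs.
```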
## Tests
### Running tests locally
```sh
npm test
```
### Checking code coverage locally
```sh
npm run coverage
```
### Writing tests
#### General
All the tests for `@babel/preset-env` exist in the `test/fixtures` folder. The test setup and conventions are exactly the same as testing a Babel plugin, so please read our documentation on writing tests.
#### Testing the `debug` option
Testing debug output to `stdout` is similar. Under `test/debug-fixtures`, create a folder with a descriptive name of your test, and add the following:

- Add an `options.json` file (just as with the other tests, this is essentially a `.babelrc`) with the desired test configuration (required; a sample sketch follows this list)
- Add a `stdout.txt` file with the expected debug output. For added convenience, if there is no `stdout.txt` present, the test runner will generate one for you.
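For example, a minimal `options.json` for a debug fixture might look like this (the targets are illustrative):

```json
{
  "presets": [
    ["@babel/preset-env", { "debug": true, "targets": { "chrome": "60" } }]
  ]
}
```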