# `testharness.js` test suite

The test suite for the testharness.js testing framework.

## Executing Tests

Install the following dependencies:

- Python
- the tox Python package
- the Mozilla Firefox web browser

Once these dependencies are satisfied, the tests may be run from the command line by executing the following command from this directory:

```
tox
```

Currently, the tests should be run with Firefox Nightly.

In order to specify the path to Firefox Nightly, use the following command-line option:

```
tox -- --binary=/path/to/FirefoxNightly
```

## Authoring Tests

Test cases are expressed as `.html` files located within the `tests/` sub-directory. Each test should include the `testharness.js` library with the following markup:

```html
<script src="../../testharness.js"></script>
<script src="../../testharnessreport.js"></script>
```

This should be followed by one or more `<script>` tags that interface with the `testharness.js` API in some way. For example:

```html
<script>
test(function() {
    1 = 1; // invalid assignment, included intentionally so that this test fails
  }, 'This test is expected to fail.');
</script>
```

Finally, each test may include a summary of the expected results as a JSON string within a `<script>` tag with an id of `"expected"`, e.g.:

```html
<script type="text/json" id="expected">
{
  "summarized_status": {
    "message": null,
    "stack": null,
    "status_string": "OK"
  },
  "summarized_tests": [
    {
      "message": "ReferenceError: invalid assignment left-hand side",
      "name": "Sample HTML5 API Tests",
      "properties": {},
      "stack": "(implementation-defined)",
      "status_string": "FAIL"
    }
  ],
  "type": "complete"
}
</script>
```

This is useful to test, for example, whether assertions that should fail or throw actually do.
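For comparison, a minimal sketch of a test that is expected to pass, paired with a corresponding `"expected"` block, might look like the following. (This example is illustrative rather than copied from the suite: the test name, the use of `assert_true`, and the `stack` value are assumptions for the sake of demonstration.)

```html
<script>
test(function() {
    // A trivially true assertion, so this test should pass.
    assert_true(true);
  }, 'This test is expected to pass.');
</script>

<script type="text/json" id="expected">
{
  "summarized_status": {
    "message": null,
    "stack": null,
    "status_string": "OK"
  },
  "summarized_tests": [
    {
      "message": null,
      "name": "This test is expected to pass.",
      "properties": {},
      "stack": null,
      "status_string": "PASS"
    }
  ],
  "type": "complete"
}
</script>
```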