Testing Framework

Revision as of 12:54, 21 February 2011 by Edahlstr (Talk | contribs)


The SVG WG is interested in working with the HTML WG Testing Task Force to have a unified and uniform testing framework.


Here are some open issues:

  • Need a way to extract only the SVG tests, for SVG-only user agents
  • Need to know the reporting framework
  • Must have a way to input results for automated and non-automated tests

Exploring the test.w3.org testing infrastructure

The SVG1.1F2 testsuite has adopted the testing framework from test.w3.org. The files have been copied from test.w3.org/resources to 1.1F2/test/resources and 1.1F2/test/harness/resources.

Here's an example of a test using the testing infrastructure: struct-dom-01-b.svg

Each SVG file that wants to use the automated testing framework needs to include the testharness.js file, like this:

<script type="text/ecmascript" xlink:href="../resources/testharness.js"/>

The SVG test harness then includes the testsuite report script, 1.1F2/test/resources/testharnessreport.js, which is meant to be implemented by each vendor. See testharnessreport.js for how to implement your own report. The testharnessreport.js script is included automatically by the SVG harness generation Perl script.
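As a rough illustration, a vendor's testharnessreport.js might hook the harness's completion callback along the lines of the sketch below. The stubbed add_completion_callback, the reportLines array, and the simulated test objects are assumptions standing in for the real harness (which provides the hook itself); this is not the actual W3C script.

```javascript
// Stand-in for the hook testharness.js provides; in a real test run,
// testharness.js defines add_completion_callback itself.
var completionCallbacks = [];
function add_completion_callback(cb) { completionCallbacks.push(cb); }

// Collected result lines; a real report script might instead POST these
// to a results server or write them into the page.
var reportLines = [];

// A vendor's testharnessreport.js could register a callback like this.
add_completion_callback(function (tests) {
  tests.forEach(function (t) {
    // testharness.js uses numeric statuses: 0 = PASS, 1 = FAIL,
    // 2 = TIMEOUT (shown here as an assumption).
    var verdict = t.status === 0 ? "PASS" : "FAIL";
    reportLines.push(t.name + ": " + verdict);
  });
});

// Simulate the harness finishing two tests, to show the output shape.
completionCallbacks.forEach(function (cb) {
  cb([{ name: "struct-dom-01-b", status: 0 },
      { name: "struct-dom-02-b", status: 1 }]);
});
```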

Converting/writing new tests using the framework

The documentation for how to write tests using the framework can be found at the top of testharness.js; it is duplicated below for easier access.

* == Introduction ==
* This file provides a framework for writing testcases. It is intended
* to provide a convenient API for making common assertions, and to work
* both for testing synchronous and asynchronous DOM features in a way that
* promotes clear, robust tests.
*
* == Basic Usage ==
*
* To use this file, import the script into the test document:
* <script src="http://test.w3.org/resources/testharness.js"></script>
*
* Within each file one may define one or more tests. Each test is atomic
* in the sense that a single test has a single result (pass/fail/timeout).
* Within each test one may have a number of asserts. The test fails at the
* first failing assert, and the remainder of the test is (typically) not run.
*
* If the file containing the tests is an HTML file with an element of id "log"
* this will be populated with a table containing the test results after all
* the tests have run.
*
* == Synchronous Tests ==
*
* To create a synchronous test use the test() function:
*
* test(test_function, name)
*
* test_function is a function that contains the code to test. For example, a
* trivial passing test would be:
*
* test(function() {assert_true(true)}, "assert_true with true")
*
* The function passed in is run in the test() call.
*
* == Asynchronous Tests ==
*
* Testing asynchronous features is somewhat more complex since the result of
* a test may depend on one or more events or other callbacks. The API provided
* for testing these features is intended to be rather low-level but hopefully
* applicable to many situations.
*
* To create a test, one starts by getting a Test object using async_test:
*
* var t = async_test("Simple async test")
*
* Assertions can be added to the test by calling the step method of the test
* object with a function containing the test assertions:
*
* t.step(function() {assert_true(true)});
*
* When all the steps are complete, the done() method must be called:
*
* t.done();
*
* == Making assertions ==
*
* Functions for making assertions start with assert_.
* The best way to get a list is to look in this file for function names
* matching that pattern. The general signature is
*
* assert_something(actual, expected, description)
*
* although not all assertions precisely match this pattern; e.g. assert_true only
* takes actual and description as arguments.
*
* The description parameter is used to present more useful error messages when a
* test fails.
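Putting the pieces above together, a minimal test file might look like the following sketch. The tiny stand-ins for test(), async_test(), and assert_true() are only there so the snippet runs outside the harness; in a real test they are provided by testharness.js and should not be redefined.

```javascript
// Minimal stand-ins so this sketch runs without testharness.js.
var results = [];
function assert_true(actual, description) {
  if (actual !== true) throw new Error(description || "assert_true failed");
}
function test(fn, name) {
  // Synchronous test: the function runs inside the test() call itself.
  try { fn(); results.push(name + ": PASS"); }
  catch (e) { results.push(name + ": FAIL"); }
}
function async_test(name) {
  // Asynchronous test: assertions run in step(), done() ends the test.
  return {
    name: name,
    failed: false,
    step: function (fn) {
      try { fn(); } catch (e) { this.failed = true; }
    },
    done: function () {
      results.push(this.name + ": " + (this.failed ? "FAIL" : "PASS"));
    }
  };
}

// The usage below mirrors the documented examples.
test(function () { assert_true(true); }, "assert_true with true");

var t = async_test("Simple async test");
t.step(function () { assert_true(true); });
t.done();
```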

Automatic verification of SVG animations

One of the things that is not automatically testable in the current testsuite is animations. If the animated values were verifiable with scripting, that would be a big improvement.

Here are two possible ways of doing that.

Method 1 - spot-testing with a pre-defined interval

  • Let a script function run at a pre-defined interval:
    1. Query the current document-time
    2. Run a reference JS-SMIL engine to compute the reference value for the current document-time
    3. Query the relevant current animated values and compare to computed references
    4. Stop after a defined time.
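Steps 2 and 3 above could be sketched roughly as follows, for the simplest case only (a single linear animate element with from/to values and no repeats). The function names and the tolerance handling are illustrative assumptions, not part of any existing reference engine such as FakeSMILe.

```javascript
// Reference value for a simple linear <animate from=... to=...> with the
// given begin and simple duration, ignoring repeats, keyTimes and fill
// modes (illustrative only).
function referenceValue(docTime, begin, dur, from, to) {
  if (docTime <= begin) return from;        // animation not yet started
  if (docTime >= begin + dur) return to;    // past the simple duration
  var progress = (docTime - begin) / dur;   // position within the duration, 0..1
  return from + (to - from) * progress;
}

// Compare a queried animated value against the reference, allowing a
// small tolerance for timing and floating-point differences.
function matchesReference(actual, expected, tolerance) {
  return Math.abs(actual - expected) <= tolerance;
}
```

In an actual test, docTime would come from SVGSVGElement.getCurrentTime() and the actual value from the relevant element's animated value.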

Pros

  • Only minor changes to existing testcases would be necessary

Cons

  • Needs a reference implementation written in ECMAScript (FakeSMILe might be one candidate, but it's rather incomplete)
  • Can easily miss interesting points in the timeline
  • Requires running for some time to gather samples
  • It is uncertain whether document-time advances while the ECMAScript checking function executes, which could lead to inconsistent results

Method 2 - spot-testing with pre-defined document-times

  • First automatically (or manually) decide some good times to query the animated values (e.g. shortly after a transition has taken place, or after some event).
    1. Pause all animations in the svg via 'SVGSVGElement.pauseAnimations'
    2. Set the current document-time to the next pre-decided time via 'SVGSVGElement.setCurrentTime'
    3. Query the relevant current animated values and compare to pre-computed references
    4. Report result to testing framework
    5. Repeat steps 2-4 until done.
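The loop above could be sketched as follows. The svg argument is expected to behave like an SVGSVGElement (pauseAnimations, setCurrentTime); the checkpoint format and the report callback are illustrative assumptions, not an existing API.

```javascript
// Sketch of the Method 2 driver loop over pre-decided document-times.
function runCheckpoints(svg, checkpoints, report) {
  svg.pauseAnimations();                 // step 1: freeze the timeline
  checkpoints.forEach(function (cp) {
    svg.setCurrentTime(cp.time);         // step 2: jump to the chosen time
    var actual = cp.query();             // step 3: read the animated value
    var pass = Math.abs(actual - cp.expected) <= cp.tolerance;
    report(cp.time, pass);               // step 4: hand result to the framework
  });                                    // step 5: loop until done
}
```

In a real test, query() would read something like an element's animVal, and report() would feed the testing framework (e.g. one assertion per checkpoint).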

Pros

  • Testing is quick, due to fast-forwarding the timeline to the interesting parts
  • Higher chance of providing consistent results due to pausing the timeline

Cons

  • May not give exactly the same results as not fast-forwarding the timeline (due to bugs in implementations)


Strategy for existing and future testsuites

There are several options open.

  1. Don't attempt to convert any of the old testsuites, instead write new minimal testcases using the new framework without regard to what tests existed in the old testsuites.
    • Cons: overlap in testing with the old testsuites (is that good or bad? could we eventually retire tests in the old testsuite because of that?), and repeated work due to test duplication
    • Pros: might mean less work initially
  2. Only write tests for things not already tested in the old testsuite using the new framework.
    • Cons: would mean fewer automated tests
    • Pros: no testing overlap
  3. Convert the testcases in the old testsuite that can be converted, e.g. most of the DOM tests and most of the animation tests.
    • Cons: requires some effort to rewrite tests
    • Pros: lowers the amount of manual testing in the old testsuites