This page groups together pages being developed by the Internationalization Core Working Group to assess internationalization support of user agents. These tests are still under development and should not be taken as final. See the description of how the tests work at the bottom of this page.
Note that these tests are not limited to testing conformance with W3C standards. In some cases the tests also allow for exploration of the behavior of user agents in ways not described by the standards.
* We are in the process of rewriting and moving all the tests. The rewrite corrects some test errors, but, more importantly, produces a test format that will fit well with other test suites (such as those for HTML and CSS) and is linked with the W3C Test Framework.
See all tests that used the old framework.
- Character encoding
- Text direction
- The dir attribute
- 3.2.6 Requirements relating to the bidirectional algorithm
- 4.6.24 The bdo element
- 4.6.26 The br element
- 4.5.3 The pre element
- The dirname attribute
- 4.10.13 The textarea element
- 10.7.4 Native user interfaces
HTML5 tests still using the old framework
- Character encoding
CSS3 Counter Styles
Predefined Counter Styles
- Exploratory tests
- Text transform
- White space
- Line breaks
- 5. Line Breaking and Word Boundaries
- 5.1. Line Breaking details
- 5.2. Breaking Rules for Punctuation: the 'line-break' property
CSS Writing Modes
CSS tests still using the old framework
- Text direction
- Line breaking
These tests have been developed bearing in mind the need for content developers to learn about features and to know whether and how those features are supported by a particular user agent. A good deal of explanatory information is therefore included with the tests.
Care has also been taken to enable fairly easy adaptation of the tests by QA Engineers as part of another test suite.
The tests and results have also proved useful to user agent developers for planning improvements to their products.
i18n-format tests vs. W3C test framework
You can view tests in an i18n-Activity-specific (i18n) format, or within the W3C Test Framework.
The i18n format provides notes and some formatting that are not visible in the W3C Test Framework. It also lets you run through the tests in an order that is often more coherent than the alphabetic order of test ids used by the W3C Test Framework. This is useful for quick tests whose results you don't want to record, or for exploring how things work.
The W3C Test Framework allows you to run the tests and record the results. It also shows results for individual tests, by section, or for a whole test suite. There is a separate test suite for each specification being tested.
We try to provide links from the i18n Activity pages to the W3C Framework for each test and each spec section.
Ways to get into the data
Results summary pages, such as Bidi algorithm in HTML, show results for major browsers on a particular topic and summarise those results, sometimes drawing conclusions.
From each test on a results summary page you can link to the test itself in the i18n format, or to the results for that test in the W3C Test Framework (from which you can also link to the test itself to run the test and record the result).
The W3C Test Framework can also be accessed directly.
i18n-format test pages
Prerequisites for running the test appear at the top of the page on a dark background.
Instructions about what to do or look for are shown in a distinctive color above the box. Green text or backgrounds usually indicate a successful test; red, a failure.
Often the test requires you to compare a result with an image (typically ignoring font differences). These tests typically use a large orange equals sign when the image and the text should match, or a large violet not-equals sign when the two should NOT match. This convention significantly speeds up visual checking of results. Note that a not-equals sign does not mean the test has failed; rather, the test matches the assertion only if the result and the image do not match.
The actual test is usually enclosed by an orange box.
A test is successful if it behaves as stated in the assertion that appears lower down the page. Note that this assertion may not always represent behaviour described in a specification. In addition, the i18n test suite has many exploratory tests, which look at behaviour that may not be specified anywhere.
Notes are often provided in gray text at the bottom of the page. These may explain how the test was put together, or other useful information.
At the top right of the page is a link to the Next Test. This will cycle through all the tests in a group, but not necessarily in a logical order (i.e. not as shown on the pages that link to the tests individually). Unless this is a recently updated set of tests, the order tends to depend on when a test was created. Test numbers are not usually changed, to help ensure that links to individual tests don't break.
Also in the top right corner of the tests you will see an indication of the format and MIME type used for a particular test, such as XHTML 1.0 served as XML.
Tests are normally run in HTML5 (though some may be served as XHTML5). It is, however, possible to run a test in any of the following formats by changing the appropriate parameter:
- h4 for HTML 4.01
- h5 for HTML5
- xh for XHTML 1.0 served as text/html
- xx for XHTML 1.0 served as application/xhtml+xml
- x5 for XHTML5
- x11 for XHTML 1.1 served as application/xhtml+xml
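The parameter values above can be collected into a simple lookup table. This is an illustrative sketch only; neither the table nor the parameter name belongs to the test suite itself:

```python
# The format parameter values listed above, as a lookup table
# (illustration only, not part of the test suite).
FORMATS = {
    "h4": "HTML 4.01",
    "h5": "HTML5",
    "xh": "XHTML 1.0 served as text/html",
    "xx": "XHTML 1.0 served as application/xhtml+xml",
    "x5": "XHTML5",
    "x11": "XHTML 1.1 served as application/xhtml+xml",
}

print(FORMATS["xx"])  # XHTML 1.0 served as application/xhtml+xml
```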
NOTE: Bear in mind that the difference between HTML5 and XHTML5 is one of MIME type, not of syntax. It is not comparable to the difference between HTML 4.01 and XHTML 1.0. (See What is XHTML5?)