Testing and Reporting for XSL-FO 1.1

W. Eliot Kimber, Innodata Isogen

18 Oct 2005

Objective and Scope

The objective of the testing process is to test the implementation of new and changed features of XSL-FO 1.1 for existence and correctness, and to ensure that existing XSL-FO 1.0 behaviors are unaffected by the implementation of new 1.1 features. Unchanged features of XSL-FO are not tested.


Test contributions will be published outside a W3C Recommendation. Contributors who are not W3C members, team, or invited experts in the working group must agree to the terms of Grant II: Grant of License for Contributed Test Cases Published Outside a W3C Recommendation. Participating Members are covered by the W3C Patent Policy. Refer to Policies for Contribution of Test Cases to W3C for more information.

Test Coverage

The set of new and changed features in XSL-FO 1.1 is listed in Appendix F, Changes from XSL 1.0, of the XSL 1.1 specification. For each new or changed formatting object or property there must be at least one test case that demonstrates whether the feature is implemented by a given FO implementation and that, when the feature is implemented, enables clear determination of its correctness. Note that there is no requirement to test every possible combination of property values or otherwise cover all possible cases. However, the tests should attempt to cover the combinations most likely to lead to incorrect implementation.

The XSL-FO subgroup has developed a base set of test cases that provide the minimum necessary feature coverage.


The XSL-FO Subgroup will, at its sole discretion, determine what tests are incorporated into the test suite. Refer to section 8 for more information on the acceptance process.

General Submission Guidelines

A test case must be either a complete and valid XSL-FO document or a non-FO XML document and an XSLT 1.0 transform that produces a single XSL-FO document. Each test case should be described by a test catalog entry in a test catalog XML document as described below.
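For the second form, a minimal sketch of an XSLT 1.0 transform that produces a single XSL-FO document might look like the following. The source vocabulary (the "para" element) is hypothetical, invented here for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:fo="http://www.w3.org/1999/XSL/Format">

  <!-- Wrap the transformed source in a complete FO document. -->
  <xsl:template match="/">
    <fo:root>
      <fo:layout-master-set>
        <fo:simple-page-master master-name="page">
          <fo:region-body/>
        </fo:simple-page-master>
      </fo:layout-master-set>
      <fo:page-sequence master-reference="page">
        <fo:flow flow-name="xsl-region-body">
          <xsl:apply-templates/>
        </fo:flow>
      </fo:page-sequence>
    </fo:root>
  </xsl:template>

  <!-- Map each (hypothetical) source paragraph to a block. -->
  <xsl:template match="para">
    <fo:block><xsl:apply-templates/></fo:block>
  </xsl:template>

</xsl:stylesheet>
```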

Constructing Test Case FO Documents

Because test cases must be verified by inspection, it is important that the test cases be as short as possible. They should clearly indicate, as part of the rendered result, both what the test case is testing and what the expected correct result is. Optionally, a test case may have an associated pre-rendered representation of the expected correct result, such as a PDF document or graphic.

Test cases should either demonstrate implementation of a single feature or demonstrate correct processing of two or more features that potentially interact with each other. In general, test cases should not demonstrate multiple features that are otherwise unrelated to each other.
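By way of illustration, a hypothetical minimal test case for one XSL-FO 1.1 feature (change bars) might look like the following; the feature choice and wording are the editor's, not part of the base test set. The rendered text itself states what is tested and what the correct result looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="page">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>
  <fo:page-sequence master-reference="page">
    <fo:flow flow-name="xsl-region-body">
      <!-- The rendered output describes the test and the expected result. -->
      <fo:block>Test: fo:change-bar-begin/fo:change-bar-end (XSL 1.1).
        Expected: a change bar in the start margin beside the
        following line.</fo:block>
      <fo:change-bar-begin change-bar-class="cb1"/>
      <fo:block>This line should carry a change bar.</fo:block>
      <fo:change-bar-end change-bar-class="cb1"/>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
```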

Test Catalog Entries

Each test case should have a corresponding test catalog entry conforming to the test suite DTD shown in Appendix A. Each test case must have an ID that is unique within the <testsuite> document that contains it. Each test case must indicate which formatting object it tests, using the “fo=” attribute. Use the “xml=” attribute to point to either the FO instance or non-FO XML document for the test case. If the test case is a non-FO XML document, specify the XSLT style sheet for generating an XSL-FO instance from the XML document via the “xsl=” attribute. The content of the <test> element should be a brief description of the purpose of the test case.

Organize related test cases together within <testcases> elements. Use the “profile=” attribute of the <testcases> element to indicate what the test cases apply to, e.g., “inside/outside for clear and float”.
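For example, a catalog fragment grouping two hypothetical float tests (file names, IDs, and descriptions invented for illustration) might read:

```xml
<testcases profile="inside/outside for clear and float">
  <test id="float-001" fo="float" xml="float-001.fo"
        results="float-001.pdf">Tests that a float with
    clear="inside" does not intrude into the inside margin.</test>
  <test id="float-002" fo="float" xml="float-002.xml"
        xsl="float-002.xsl">Tests float/clear interaction when the
    FO document is generated by a transform.</test>
</testcases>
```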

Vendor Specific Features and XML Versions

  • Submitted tests should not include any vendor specific extensions
  • Submitted tests should be independent of XML versions “1.0” and “1.1” and XML Namespaces versions “1.0” and “1.1” unless the purpose is to evaluate those differences.

Test Harness

A test harness will NOT be provided by the XSL-FO Subgroup. However, the test catalog should contain sufficient information to enable automation of the processing of each test case in order to generate the appropriate rendered result.

Test Result Interpretation

The agreement between an individual test result and the expected result as described in the test case is categorized as follows:

Full	Full agreement with the expected result, and no problem reported with either the spec or the test.
Future	The test result differed from the expected result, no problem with either the spec or the test was reported, and future support of the tested feature is expected.
Differ	The test result differed from the expected result, no problem with either the spec or the test was reported, and no future support of the tested feature was reported.
Test	Full agreement with the expected result was not reported, and a problem with the test was reported.
Spec	Full agreement with the expected result was not reported, and a problem with the spec was reported.
N/A	The test result file does not cover this test.

Challenge Test Validity or Specification

All comments relating to the testing process, the reporting process, or individual tests should be reported using the W3C Members Bugzilla tool (http://www.w3.org/Member/bugzilla/). The XSL-FO Subgroup will process every bug report.

Publish Test Results

Test results must be submitted in the form of an XML document containing <testresult> elements conforming to the testsuite.dtd shown in Appendix A. Implementers may choose to include their product and organization name in the header of the publication of the XSL-FO Test Suite results. An implementer may ask for anonymity. Test results must be submitted to the W3C staff contact: liam@w3.org.
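For example (test ID, file name, and description hypothetical), a single result entry might look like:

```xml
<testresult id="float-001" agreement="issues"
            futuresupport="partial" results="float-001-out.pdf">
  The float was rendered, but clear="inside" was ignored; support
  is expected in a future release.
</testresult>
```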

Appendix A. Test Suite DTD

Test catalog entries must conform to the following document type definition (DTD):

<?xml version="1.0" encoding="UTF-8"?>
<!-- @(#)testcases.dtd 1.0 12/18/00

 Mary Brady, NIST

 DTD describing a database of XSL tests.

 Revised 2001 Jan 10 (Paul Grosso) - augmented by XSL FO SG:
	- add to the TEST element an XSL attribute to point to
	 the XSL stylesheet and rename the URI attribute to XML
	- add to the TEST element a PROPERTY attribute to
	 indicate when a given property is being tested
	- add to the TEST element a LEVEL attribute to indicate
	 which conformance level is being tested
	- add to the TEST element a SPECPTR attribute
	 (of URI type) to allow referring to relevant sections
	 of the XSL spec.
	- add to the TEST element an ERRATUM attribute
	 (of CDATA type) to allow recording of an erratum
	 name/number that this test is meant to test.
	- add to INTERACTION attribute value list MULTIPLE
	 and give it a default of NONE
	- add to FO attribute value list MULTIPLE
	- make the BASE attribute on TESTCASES #IMPLIED,
	 since it's not unreasonable for all the URI-type
	 attributes to be absolute
	- add to the TEST element a RESULTS attribute to point
	 to, say, the PDF demonstrating the expected output

 Revised 2001 Jan 10 (Paul Grosso) - added TESTRESULT structure

 Revised 2001 Jan 24 (Paul Grosso) - lowercased names (though
 not in comments, since that helps highlight them)

 2001-02-19 (AntennaHouse) added repeatable-page-master-alternatives
	 conditional-page-master-reference region-before
	 region-after region-start region-end static-content
	 inline inline-container.

 Changed 2001 Apr 4 (Max Froumentin)
	- made xsl attribute in test element optional (so FO files
	 can be referred to directly), following the decision by the WG
	 to allow FO files in the test suite.

 Revised 2005 Apr 8 (W. Eliot Kimber) - updated to reflect the latest
	 list of FOs in the 1.1 spec.
-->

<!ENTITY % text "(#PCDATA | em | b | a)*">
<!ENTITY % URI "CDATA">

<!-- The root element of the whole collection is TESTSUITE. While not
 very different from TESTCASES, it must be distinguished due to
 improper processing by IE5. -->
<!ELEMENT testsuite ( testcases+ )>
<!ATTLIST testsuite
	profile	CDATA		#IMPLIED
>

<!-- The root element of a collection should be "TESTCASES". It serves to
 group a set of tests to be collectively identified as follows:
	* PROFILE - name of test profile
	* BASE - base directory in which tests for this collection reside;
	 allows tests to be found in many application instances.
-->
<!ELEMENT testcases (test | testresult | testcases)*>
<!ATTLIST testcases
	profile	CDATA		#IMPLIED
	base	%URI;		#IMPLIED
>

<!-- The body of each TEST element is its description as well as a
 discussion of the expected results. The following attributes
 must be specified for each test:

	* ID - unique test identifier
	* INTERACTION - used later for categorizing tests
	* FO - formatting object that is tested
	* XML - relative uri that points to the XML of the actual test

 Other optional attributes are:

	* XSL - relative uri that points to the XSL of the actual test
	 (omitted when the test is itself a complete FO document)
	* PROPERTY - indicates the property being tested
	* LEVEL - indicates which conformance level is being tested
	* SPECPTR - pointer to relevant part of the XSL spec
	* ERRATUM-LBL - label/name of the erratum being tested
	* ERRATUM-PTR - pointer to the erratum being tested
	* RESULTS - pointer to a display of the expected results
-->

<!ELEMENT test %text;>
<!ATTLIST test
	id	ID	#REQUIRED
	interaction	(none|area|writing|spacing|collapsing|multiple)	"none"
	fo	( basic-link |
	bidi-override |
	block |
	block-container |
	bookmark |
	bookmark-title |
	bookmark-tree |
	change-bar-begin |
	change-bar-end |
	character |
	color-profile |
	conditional-page-master-reference |
	declarations |
	external-graphic |
	float |
	flow |
	flow-assignment |
	flow-map |
	flow-name-specifier |
	flow-source-list |
	flow-target-list |
	folio-prefix |
	folio-suffix |
	footnote |
	footnote-body |
	index-key-reference |
	index-page-citation-list |
	index-page-citation-list-separator |
	index-page-citation-range-separator |
	index-page-number-prefix |
	index-page-number-suffix |
	index-range-begin |
	index-range-end |
	initial-property-set |
	inline |
	inline-container |
	instream-foreign-object |
	layout-master-set |
	leader |
	list-block |
	list-item |
	list-item-body |
	list-item-label |
	marker |
	multi-case |
	multi-properties |
	multi-property-set |
	multi-switch |
	multi-toggle |
	page-number |
	page-number-citation |
	page-number-citation-last |
	page-sequence |
	page-sequence-master |
	page-sequence-wrapper |
	region-after |
	region-before |
	region-body |
	region-end |
	region-name-specifier |
	region-start |
	repeatable-page-master-alternatives |
	repeatable-page-master-reference |
	retrieve-marker |
	retrieve-table-marker |
	root |
	scaling-value-citation |
	simple-page-master |
	single-page-master-reference |
	static-content |
	table |
	table-and-caption |
	table-body |
	table-cell |
	table-column |
	table-header |
	table-row |
	title |
	wrapper |
	multiple )	#REQUIRED
	xml	%URI;	#REQUIRED
	xsl	%URI;	#IMPLIED
	property	CDATA				#IMPLIED
	level	(basic|extended|complete)	"basic"
	specptr	%URI;	#IMPLIED
	erratum-lbl	CDATA	#IMPLIED
	erratum-ptr	%URI;	#IMPLIED
	results	%URI;	#IMPLIED
>

<!-- Really basic HTML font tweaks, to support highlighting
 some aspects of test descriptions ...
 EM == emphasis (e.g. italics, fun colors)
 B == bold
-->
<!ELEMENT em (#PCDATA | b)*>
<!ELEMENT b (#PCDATA | em)*>

<!-- We also allow for hyperlinks in text (e.g., to include
 references to supporting evidence within SPECPROBLEM and
 TESTPROBLEM discussions).
-->
<!ELEMENT a (#PCDATA | b | em)*>
<!ATTLIST a
	href	%URI;				#REQUIRED
>

<!-- The TESTRESULT element is used to record test results.
	It has a required ID attribute which identifies the TEST.
	It has a required AGREEMENT attribute that indicates whether
	the results are in full agreement with the expected
	results as described in the TEST case or not.

 The textual contents of the TESTRESULT element should describe
	the results of the test and any issues or further information.

 The target of the optional RESULTS attribute could be either some
	PDF showing the results or any other arbitrary resource
	describing/discussing the results.

 The optional FUTURESUPPORT attribute is used to indicate
	expected future support of the feature tested by this
	test case.

 The optional SPECPROBLEM attribute indicates if there is
	any ambiguity or other problem found in the spec that relates
	to this test. Especially if the results weren't as expected
	because of a misinterpretation of the spec, this should be
	documented here. Details/discussion should appear in the
	textual contents of the TESTRESULT element.

 The optional TESTPROBLEM attribute indicates if there is
	any issue with the TEST case and the expected results it
	suggests. Especially if the results of the test differ
	from that given as "expected" by the test case but are,
	in fact, believed to be the correct results, this should
	be documented here. Details/discussion should appear in the
	textual contents of the TESTRESULT element.
-->
<!ELEMENT testresult %text;>
<!ATTLIST testresult
	id		IDREF			#REQUIRED
	agreement	(full|issues)		#REQUIRED
	results		%URI;			#IMPLIED
	futuresupport	(full|partial|none)	#IMPLIED
	specproblem	(yes|no)		"no"
	testproblem	(yes|no)		"no"
>