TestSuite implementation

Overall goal

API tests should validate that the specification is interpreted in the same way by (at least) two implementers.

General information

The test cases can only be tied to the JSON responses, which are the evaluated features of the test suite. The reference JSON responses are translations of the reference RDF documents collected in the Ontology for Media Resources test suite.

For running the automatic test suite, it is assumed that a JavaScript interface is available for testing (which might direct calls to a web service in the back-end).
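
By way of illustration, such an interface might look like the sketch below, where the object shape, the getMediaProperty method name and the endpoint URL are all assumptions rather than spec-mandated names; the test suite only evaluates the JSON that comes back.

    // Hypothetical JavaScript interface backed by a web service; the
    // endpoint URL and the getMediaProperty name are placeholders.
    var webServiceImpl = {
      mode: "asynchronous",
      getMediaProperty: function (properties, onSuccess, onError) {
        fetch("https://example.org/mediaann?properties=" + properties.join(","))
          .then(function (res) { return res.json(); })
          .then(onSuccess, onError);
      }
    };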

Binding the test cases to actual interfaces may cause problems due to availability or visibility. In this respect it is the only point where the synchronous and asynchronous specifications intersect.

The test cases have been designed to be independent of the actual implementation mode, so the automatic test suite can be used in both the synchronous and the asynchronous mode.
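
For instance, a thin adapter could hide the mode behind one callback-style entry point, so that the same test cases drive either kind of implementation; impl.mode and the two call shapes below are assumptions, not spec-mandated names.

    // Run the same test case against a sync or an async implementation.
    function normalizedGet(impl, properties, callback) {
      if (impl.mode === "synchronous") {
        // Sync mode: the call returns the JSON response directly.
        callback(impl.getMediaProperty(properties));
      } else {
        // Async mode: the response arrives via success/error callbacks.
        impl.getMediaProperty(properties, callback, callback);
      }
    }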

Implementations Overview

Current Implementations (compliant with the new spec version)

  • Firefox extension (covering YouTube and Vimeo, though Vimeo metadata is not in scope)

Implementations under update/development

  • Web service implementation by Werner, Tobias and Florian (implementing a subset of mappings between the mw ontology and DC/MPEG-7) (ready: ?)
  • Web service implementation by Salzburg Research (formats in scope not yet discussed) (ready: mid-December; contact: Thomas Kurz)

Unknown status

  • ETRI: EXIF, ID3, YouTube (covering all the properties)
  • IBBT-MMLab: Maybe they can help us out with NinSuna? We should ping Eric on this. (Florian will ping him)

Expected Output Documents

  • Test suite (prose description & offering test data for offline testing)
  • Automatic test suite (implementation of online test and evaluation)
  • Implementation test report

The test suite shall test whether the implementations behave as defined in the API specification.

(Automatic) Test suite

The test suite has to distinguish between (a) tests that are independent of formats and (b) tests that need information on formats to be conducted.

a) General behaviour (format independent)

* An implementation must be able to echo the implemented mode (sync, async or both).
* Calling non-existent properties: the correct error response has to be set (see the sketch after this list).
* Passing null/bad values: the correct error response has to be set.
* Handling multiple entries of a property.
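
As a sketch of such a format-independent test, the following queries a non-existent property and checks that an error response is set, using the adapter sketched above; the property name and the statusCode field are illustrative assumptions.

    // Expect an error response when asking for a property that does not
    // exist. The statusCode field is an assumed response shape.
    function testNonExistentProperty(impl, done) {
      normalizedGet(impl, ["noSuchProperty"], function (response) {
        var isError = response && response.statusCode !== undefined &&
                      response.statusCode !== 200;
        done(isError ? "PASS" : "FAIL", response);
      });
    }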

b) Metadata mapping test cases (format dependent)

The API specification defines a specific JSON response structure to which an implementation has to be compliant. To enable automatic testing, a normative JSON response for an example media resource has to be created for each metadata format, covering all properties defined in the corresponding mapping table. The example media resources have already been defined in the ontology test suite.
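
Assuming both the normative response and the implementation's response can be read as property-keyed JSON objects, the comparison could look roughly as follows, with each property forming one atomic test.

    // Compare an implementation's response against the normative JSON
    // for one metadata format; each property is one atomic test.
    function compareAgainstNormative(normative, actual) {
      var results = {};
      for (var property in normative) {
        var expected = JSON.stringify(normative[property]);
        var got = JSON.stringify(actual[property]);
        results[property] = (got === expected) ? "PASS" : "FAIL";
      }
      return results;
    }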

Requirements for automatic testing:

* It must be possible to define a certain set of properties of a metadata format to test against.
* The automatic test must indicate whether a test for a specific property failed, is valid with respect to the core requirements, or reaches full compliance (see the two-level check sketched after this list).
* An implementation has to be able to retrieve the original metadata document as a whole (at least for the properties in use).
* Optional: Filtering by language and metadata format.
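
The two-level check from the second requirement might be sketched as below; which attributes count as core is an assumption made for illustration.

    // Outcome per property: "failed", "core requirements" or "full
    // compliance". The coreAttributes list is illustrative, not normative.
    var coreAttributes = ["value", "language"];

    function checkProperty(expected, actual) {
      if (!actual) return "failed";
      var matches = function (attr) {
        return JSON.stringify(actual[attr]) === JSON.stringify(expected[attr]);
      };
      if (!coreAttributes.every(matches)) return "failed";
      return Object.keys(expected).every(matches) ? "full compliance"
                                                  : "core requirements";
    }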

Note 1: A normative JSON response for a particular metadata format is a set of atomic tests. For example, if a metadata format has 10 mappings defined to our ontology, the compliance test for that format consists of at most 10 atomic tests (plus the general test cases).

Note 2: In some tests, properties are not static fields and change over time; an example is the YouTube rating. In such cases we can only check whether a value is present or not.
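
Such properties could carry a flag in the test data so that the exact-match comparison degrades to a presence check; the flag and the routing are hypothetical.

    // Presence-only check for values that change over time, e.g. a
    // YouTube rating; exact matching is not meaningful here.
    function checkDynamicProperty(actual) {
      return (actual !== undefined && actual !== null) ? "PASS" : "FAIL";
    }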

Generation of Implementation Report

The implementation report takes the form of a table in which every atomic operation performed by the automatic test suite is listed. Besides the atomic test cases of the general behaviour, every property of every format has to be listed in two instantiations (core attributes vs. specific attributes). An example for the Dublin Core creator property:

#    Metadata format  Property  Coverage             Implementation 1 ... Implementation n  JSON Fragment
i    DC               creator   core attributes      ...                                    some link
i+1  DC               creator   specific attributes  ...                                    some link

The current implementation report also contains lines for properties for which no values were passed. These cases are already covered by the general behaviour tests, which makes those lines redundant.
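
Generating the rows of such a table could work roughly as sketched below; the input shapes (formats with their properties, per-implementation results) are assumptions.

    // One row per format, property and coverage level, with one result
    // column per implementation, mirroring the table above.
    function buildReportRows(formats, implementations) {
      var rows = [];
      formats.forEach(function (format) {
        format.properties.forEach(function (property) {
          ["core attributes", "specific attributes"].forEach(function (coverage) {
            rows.push({
              format: format.name,
              property: property,
              coverage: coverage,
              results: implementations.map(function (impl) {
                return impl.results[format.name][property][coverage];
              }),
              jsonFragment: format.jsonUrl // link into the normative JSON
            });
          });
        });
      });
      return rows;
    }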

Options for running the test

1) An automatic test suite calls the implementations (browser extension or web service based) and compares the results against the reference data. Here the user is able to configure certain values.

2) Download the reference results and perform the checking locally. Maybe we should also provide some kind of Excel sheet to integrate the results for feedback.
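
Option 2 might then reduce to reading the downloaded reference file and the implementation's recorded output from disk and reusing the comparison sketched earlier; the Node.js usage and the file paths are placeholders.

    // Local check (option 2) under Node.js; reuses compareAgainstNormative
    // from the sketch above. File paths are placeholders.
    var fs = require("fs");

    function checkLocally(referencePath, actualPath) {
      var reference = JSON.parse(fs.readFileSync(referencePath, "utf8"));
      var actual = JSON.parse(fs.readFileSync(actualPath, "utf8"));
      return compareAgainstNormative(reference, actual);
    }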

Status

Normative JSON responses for general behaviour

Atomic test case                         Link to JSON  Reviewed by  Final
Implemented mode                         [1]           NA           NO
Non-existent property                    [2]           NA           NO
Property not supported in source format  [3]           NA           NO
Passing null value                       [4]           NA           NO
Duplicate properties set in array        [5]           NA           NO

Normative JSON files for metadata formats:

Metadata format Link to JSON Reviewed by Final
CableLabs 1.1 NOT IN SCOPE Joakim NO
DIG35 [6] Chris Reviewed
Dublin Core [7] Thierry Reviewed
EBUCore [8] Jean-Pierre Reviewed
Exif 2.2 [9] Tobias Reviewed
ID3 [10] Pierre-Antoine Reviewed
IPTC [11] Jean-Pierre Reviewed
LOM 2.1 [12] Tobias Reviewed
Media RSS [13] Wonsuk NO
MPEG-7 [14] Werner Reviewed
DMS-1 [15] Werner Reviewed
TTML [16] Werner Reviewed
TV-Anytime [17] Jean-Pierre Reviewed
TXFeed [18] Wonsuk NO
XMP [19] Felix Reviewed
YouTube [20] Wonsuk NO

Normative JSON files for container formats:

Container format Link to JSON Reviewed by Final
MP4 [21] Courtney, Marie-Carmen No
3gp [22] Courtney, Marie-Carmen No
f4v [23] Felix Reviewed
flv [24] Felix Reviewed
Quicktime NOT IN SCOPE Courtney, Marie-Carmen No
WebM NOT IN SCOPE Sylvia No
Ogg NOT IN SCOPE Sylvia No

Issues:

* CableLabs 1.1: Inconsistency between RDF and example file
* Exif 2.2: File not testable by an automatic routine at the moment
* How should container formats be handled? Only offline testing?
  (Affected formats: Ogg, 3gp, flv, QuickTime, mp4 and WebM)


Automatic Test suite:

* Selection between different modes -> done.
* Implementation for Web service APIs -> ongoing.
* Selection of metadata formats -> done.
* Selection of a subset of properties -> done.
* Comparison functionality of two JSON files -> done.
* Two level checking -> done.
* Debug information on invalid properties -> done.
* Test original metadata -> ongoing.
* Test filter criteria -> ongoing.