Meeting minutes
Slideset: https://
Sarven: trying to get a sense from the room about QA at W3C
Sarven: history of QA Activity in W3C started in 2000
… nowadays, WGs are expected to produce open test suites; charters have explicit success criteria on interop
… typically two independently developed pieces of software that implement the features described by the specs
… these test suites lead to interoperability reports by the groups
… and ideally, these test suites remain useful and functional long after the spec has been developed
… this isn't only about QA, but about how specs are developed
Sarven: the former QA WG/IG produced a number of documents: QA Framework and its handbook, spec guidelines, variability in specs
… a lot of it is about how to express the different layers of structure in specs, and how they relate to test suites aligned with the specs
… including considerations about how to express conformance, requirements
[quickly reviewing the spec guidelines, variability in specs, and the testing FAQ]
sarven: adjacent work includes WPT (used by a large number of specs, but not all W3C specs are in its scope either)
… there was also an HTTP vocabulary, a schema to describe test results (EARL)
… Spec Terms is a vocabulary to express the significant units in specifications
… the Solid QA work builds on the W3C QA work and tries to tie up some of the loose ends - it came out of the Solid CG, but it isn't particularly tied to Solid itself
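To make the EARL mention above concrete: EARL describes test results as RDF, e.g. "this implementation passed this test". A minimal sketch of one assertion in JSON-LD follows; the subject, asserter, and test URLs are hypothetical placeholders, only the `earl:` vocabulary terms come from the actual EARL schema.

```python
import json

# A minimal EARL assertion sketched as JSON-LD.
# All https://example.org/... identifiers are made-up placeholders.
assertion = {
    "@context": {"earl": "http://www.w3.org/ns/earl#"},
    "@type": "earl:Assertion",
    "earl:assertedBy": {"@id": "https://example.org/test-runner"},
    "earl:subject": {"@id": "https://example.org/my-implementation"},
    "earl:test": {"@id": "https://example.org/spec#requirement-1"},
    "earl:result": {
        "@type": "earl:TestResult",
        "earl:outcome": {"@id": "earl:passed"},
    },
}

print(json.dumps(assertion, indent=2))
```

A harness could emit one such assertion per (implementation, test) pair and aggregate them into the interop report.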
Sarven: Technical Reports have a long list of significant units in specifications - a number of which could be made machine readable
Sarven: many (but not all) specs use controlled vocabulary to express normative content and distinguish informative content
… that includes but isn't restricted to RFC2119/8174 keywords
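As a toy illustration of the machine readability being discussed: a sketch that pulls sentences containing RFC 2119/8174 keywords out of plain spec text. The sample text is made up; real extraction would have to respect the spec's markup and skip informative sections.

```python
import re

# RFC 2119 keywords, uppercase-only per RFC 8174.
KEYWORDS = re.compile(
    r"\b(MUST(?: NOT)?|SHALL(?: NOT)?|SHOULD(?: NOT)?"
    r"|REQUIRED|NOT RECOMMENDED|RECOMMENDED|OPTIONAL|MAY)\b"
)

def normative_sentences(text):
    """Return the sentences that contain at least one RFC 2119 keyword."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if KEYWORDS.search(s)]

sample = ("A server MUST respond with a status code. "
          "This section is informative. "
          "Clients SHOULD retry on failure.")
for s in normative_sentences(sample):
    print(s)
```

Note that lowercase "must"/"may" are deliberately not matched, following the RFC 8174 clarification that only the uppercase forms carry normative meaning.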
Sarven: a few questions to the room:
… which product classes defined in a spec are required to work together? how can we extract that from a test suite development perspective?
… how do we demonstrate actual interoperation? including considerations for optional features
… who ensures interop as specs evolve?
Sarven: helping anyone to contribute to spec development is a way to improve QA on specs
… there are no limitations on which tools can be used to write specs - with ReSpec and Bikeshed being popular options
… as long as the result conforms to pubrules
… current tools focus more on publishing than authoring - their accessibility for the latter isn't great
… machine readability of specs is quite limited at the moment
[showing screencast of an improved spec authoring environment based on dokieli]
jyasskin: restarting QA sounds good, and we should do work in that direction
… in terms of the semantic representation of the "MUST" statements: one of them is "MUST conform to HTTP semantics" - that's too big for a single test case; how does this help with developing a test suite?
… that might be guidance a new QA activity could help develop
Sarven: writing requirements is more an art than a science
… here the assumption is that when you build on top of HTTP, the underlying implementation would be tested against the existing HTTP test suite
… one requirement might have one or more test cases - the requirement needs to be specific enough, but may not hash out all the details you might need, and part of it might be optional
<Zakim> jyasskin, you wanted to ask how a testing tool uses a Requirement link that's as big as "MUST conform to HTTP Semantics"?
jyasskin: that points to another potential output: when a spec is used as a dependency, how to build your test suite to make it useful for specs that depend on yours
florian: the premise here is we should restart QA - how we do QA has evolved
… we're doing a lot of testing, thinking on interop
… if we were to restart work in this space, it would be useful to be more specific on what exactly needs more focused attention
sarven: fair enough - I have tried to identify some of the things that need more attention
florian: clarifying the assumptions about the state of the world and what we wish it was
<Zakim> manu_, you wanted to ask about value proposition of QA to implementers -- drives requirements.
<Jemma4> can I get the url for the doc on the screen or the previous one?
manu: what is the value proposition for QA at W3C? it's almost thought of as testing the spec, whereas what's more useful is having test suites that test implementations for interop
… the conformance statements in the specs are important, but if the test suite only tests the spec, its value disappears quickly
<jyasskin> Jemma4: https://
manu: the continuous testing of what WPT is doing is good
… but not all specs are in this category
<manu_> dom: The QA activity stopped in 2007 -- low participation.
dom: QA activity stopped in 2007, because there was a low level of participation, and tended to be staff-heavy with excellent but limited member participation
… Lot of practice has evolved in the meantime.
… 2 tracks of thought, both of which are interesting. 1) spec engineering: what have we learned about how to write and maintain specs that helps increase the final interoperability
… in some of the work François and I have done in specref, we've extracted information from specs and tied them to implementations in various ways, and ensured they're aligned. Semi-formal analysis of algorithms in browser specs.
… There's a whole set of things we've been doing on the side, which has grown over time, with useful outcomes. But it's a side-project. Very focused on browser specs, partly because there's more spec infrastructure on browser specs, and there are many more of them. Fairly confident that there are useful things we could broaden to other ecosystems.
… 2) not stopping at the spec level. Specs are a means to an end, but we really care about interoperability. We've learned enormously. WPT has grown from a side project to a huge project that drives a lot of work. On top of it, Interop has proved really interesting to not stop at just testing the spec. After the spec there's a lot more, and need to keep iterating.
… WebDX CG work on web features and baseline. These are browser-focused, but the lessons deserve being extracted, to see if they apply to other ecosystems. These are 2 potential interesting tracks that could deserve more structured attention.
… WG? 2 WGs? CG?
jyasskin: browser specs and the other ecosystems have evolved at a different pace: browser specs have put a lot of effort into their QA with WPT and other tools
… which sometimes creates confusion in terms of mismatched expectations
… more cross-pollination would be useful
… a lot of what Sarven showed was around non-browser ecosystems - there would be value in talking to each other
sarven: re RDF, I didn't mean to suggest a particular format or convention is needed
… whatever tool gets us where we need
<jyasskin> +1
florian: manu mentioned the idea of test suites only used to get to Rec
… that's not what we do for WPT, but in terms of the Process, that's correct
<jyasskin> And you can see that a bit in how few web specs get to REC.
florian: in WPT, the spec is a means to understand why a test should pass or fail
… there may need to be a bit of a cultural shift in that regard - e.g. with more emphasis on test suites in charters
… re dokieli, I'm not sure how realistic it is to expect spec authors to take such a structured approach - e.g. writing requirements in a way that they can be extracted out of context is non-trivial, and might be hard to get applied at scale
… it may still be useful as a way to motivate getting closer to that approach
sarven: the authoring part is a gap I'm highlighting - I don't feel we have a robust ecosystem for authoring; you're expected to use certain publishing tools and go through a bunch of hoops
… we can increase the possibility of contributions through more accessible authoring tools
florian: even if I had a magic tool that did everything perfectly, this would require a lot of additional work from the author
<astearns> Some specs (like WOFF, I believe) have markup that isolates testable assertions
<jyasskin> https://
florian: in WPT, the CSSWG used to be in the more-metadata camp, with links back to specific requirements; we've given that up to promote ease of test writing
<dom> the WebRTC spec has a "toggle test annotation" option in the ReSpec pill that highlights conformance requirements and whether they have known associated WPT test cases https://
sarven: this is a pay-as-you-go approach where additional efforts provide progressive additional benefits
<Zakim> manu_, you wanted to ask about concrete tooling that could be helpful
manu: is there a base level tooling that a QA group could work on?
… e.g. adding a conformance requirement identification feature in respec to help cross-linking from test cases
… not necessarily for WPT, but for other specs
… we had to rewrite the test infrastructure for VC/DID
… we think it's useful, but we don't know if it's reproducible and worth reproducing
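One way such tooling could cross-link requirements and test cases, sketched below as a purely hypothetical approach (not an existing ReSpec feature): derive a stable fragment identifier from the normative text itself, so tests can point back at a requirement even after editorial reshuffling.

```python
import hashlib

def requirement_id(text, length=8):
    """Derive a short, stable identifier from a requirement's normative text."""
    # Collapse whitespace so reflowing the source doesn't change the ID.
    normalized = " ".join(text.split())
    return "req-" + hashlib.sha256(normalized.encode("utf-8")).hexdigest()[:length]

# Hypothetical requirement text; a test case could then reference
# a fragment like https://example.org/spec#req-... (made-up URL).
req = "A server MUST respond to a valid request with a 2xx status code."
print(requirement_id(req))
```

The trade-off is that any substantive rewording changes the ID, which is arguably a feature: a test linked to the old ID flags that the requirement it covered has changed.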
jyasskin: +1 on spec authoring formats to help with normative statements
… having a forum to discuss these approaches would be good - a CG would be a good way to start
A Method for Writing Testable Conformance Requirements
Sarven: having a way to identify conformance requirements sounds like a good addition
<pchampin> FTR, Respec already has this https://
Sarven: there are a number of potential contributors who can't use the authoring tools; we should lower that bar
sarven: what's the sense of the room on interest in such a CG?
florian: I strongly suspect there is value
jyasskin: I'd be happy to help draft the proposal and bring WPT people
<jyasskin> I did suggest that csarven needs to drive it. :)
manu: it would be good to encourage WG test facilitators to participate in such a group