Meeting minutes
Welcome back after the break!
Announcement: WebML CG meeting at TPAC
anssik: Proposed WebML CG Meeting at TPAC 2020 on October 22 15:00 UTC:
anssik: TPAC is an opportunity to interact with the broader W3C community.
… Suggestions for topics for joint meetings with other W3C groups: WebGPU, Wasm, Audio?
Announcement: Web & ML workshop presentations published
anssik: This week we published the first wave of presentations, thanks to all the speakers!
anssik: Discussion on presentations happens via GitHub, speakers are tagged, but everyone is encouraged to take part in the discussions
anssik: The workshop culminates in September in four interactive sessions that will summarize the findings and chart a path for future W3C efforts.
anssik: August 14, 2020 was the deadline to register as a workshop participant. If you missed the deadline but think you should attend, get in touch and we'll look into it; we got 800 registrants overall but had to filter the list down to 150 to keep the room reasonably sized for discussions.
WebNN polyfill and samples
anssik: The group agreed to add WebNN API polyfill and samples to its deliverables:
https://github.com/webmachinelearning/webnn/issues/81
https://github.com/webmachinelearning/webnn/issues/70
anssik: I've asked Ningxin to give an introduction to the initial contribution from Intel proposed to be used as a starting point: its latest status, open issues, plans, areas that welcome further contributions, etc.
<ningxin_hu> https://github.com/webmachinelearning/webnn-polyfill/pull/1
ningxin_hu: I created a PR to add the polyfill foundation
… this PR implements 10 ops from the latest WebNN API spec
… required by 1st wave models
… the ops supported are listed in the PR comments
… the ops are included because they are used by our basic examples; we also agreed to add an advanced example, handwritten digit recognition with a convnet
… 10 ops implemented by the polyfill are required by the two examples included
… TFJS WebGL kernels are used; if WebGL is not available, it falls back to CPU kernels, but those are slow at this point
… licensed under the permissive Apache 2.0 license
ningxin_hu: the polyfill is implemented in TypeScript and transpiled into JS
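The WebGL-to-CPU kernel fallback described above can be sketched in plain JavaScript; the registry and function names here are made up for illustration and are not the polyfill's actual code.

```javascript
// Hypothetical kernel registry: each op has a preferred (WebGL-backed)
// implementation and a CPU fallback. Names are illustrative only.
const kernels = {
  relu: {
    webgl: null, // pretend the WebGL backend is unavailable here
    cpu: (input) => input.map((x) => Math.max(0, x)),
  },
};

// Dispatch to the fast backend when present, else fall back to CPU
// (mirroring the TFJS WebGL -> CPU fallback described above).
function dispatch(opName, input) {
  const impl = kernels[opName];
  const kernel = impl.webgl ?? impl.cpu;
  return kernel(input);
}

console.log(dispatch('relu', [-1, 0, 2])); // prints [ 0, 0, 2 ]
```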
anssik: are there open issues folks can help with?
ningxin_hu: we need to add the remaining ops required by the 1st-wave models; some new ops are being specified, so those need to be added to the polyfill too
… the spec contains 30+ ops, while the polyfill supports 10 ops, so we need to bridge this gap
… another TODO: compilation options are not implemented by the polyfill; it always uses the WebGL kernels
ningxin_hu: we develop the spec and polyfill in parallel, which allows others to get a feel for the API early on
chai: WebNN API spec is getting bigger, and some of the ops are not trivial, so having some unit tests will be helpful
… it should ideally follow the models we're targeting with the API spec; currently the polyfill is in catch-up mode
ningxin_hu: I also added some unit tests for the ops in the PR
… not a full conformance test
ningxin_hu: as chai mentioned, we want to improve certain areas in the polyfill and will open new issues for those
anssik: use the polyfill repo for discussing polyfill implementation topics and the webnn repo for spec topics; don't worry if you create an issue in the wrong repo, they're easy to move around
anssik: what is browser / OS support?
ningxin_hu: for development, I use Windows laptop, test with Chrome and Edge
… using TFJS WebGL kernels, so WebGL performance is critical
… did not test with other browsers or OSes yet, contributions welcome
ningxin_hu: I added more advanced samples in addition to the polyfill
<ningxin_hu> https://github.com/webmachinelearning/webnn-samples/pull/1
ningxin_hu: this sample uses the MNIST dataset and the LeNet architecture
<ningxin_hu> https://huningxin.github.io/webnn-samples/lenet/
ningxin_hu: to facilitate review, I host this sample live in my fork
… two modes: use the MNIST dataset, or use the mouse to draw your own digit
<ningxin_hu> https://github.com/huningxin/webnn-samples/blob/lenet/lenet/lenet.js#L22-L96
ningxin_hu: this sample uses the polyfill, the code to build the network ^
… uses multiple ops from the 1st wave
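A shape-level sketch of a classic LeNet-style pipeline over a 28x28 input may help in following the sample; the filter counts and sizes below follow the textbook LeNet-5 layout and are assumptions, not read from the sample's code (which is linked above).

```javascript
// Shape-level walk through a LeNet-style convnet on a 28x28 grayscale
// input. Layer sizes follow LeNet-5 conventions and are illustrative.
function conv2d([h, w], filterSize, filters) {
  // 'valid' convolution, stride 1
  return [h - filterSize + 1, w - filterSize + 1, filters];
}
function pool2x2([h, w, c]) {
  return [h / 2, w / 2, c];
}

let shape = conv2d([28, 28], 5, 6);  // [24, 24, 6]
shape = pool2x2(shape);              // [12, 12, 6]
shape = conv2d(shape, 5, 16);        // [8, 8, 16]
shape = pool2x2(shape);              // [4, 4, 16]
const flattened = shape[0] * shape[1] * shape[2];
// fully connected layers then map 256 -> 120 -> 84 -> 10 class scores
console.log(flattened); // prints 256
```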
chai: having the actual code will help uncover integration issues; the polyfill is kind of a reference implementation
ningxin_hu: agree with chai, samples TODO includes adding support for 1st wave models, ResNet etc.
anssik: any questions?
First-wave models and ops delta with WebNN API definition
Review the delta between the first-wave models' ops and the WebNN API surface:
- clamp
- globalAveragePool
- gru
- sigmoid
- split
- squeeze
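For concreteness, element-wise reference semantics for two of the delta ops can be written in a few lines of plain JS; this is an illustration, not the spec's normative definition.

```javascript
// Scalar semantics of two delta ops, applied element-wise over a tensor.
const clamp = (x, min, max) => Math.min(Math.max(x, min), max);
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

console.log([-2, 0.5, 3].map((x) => clamp(x, 0, 1))); // prints [ 0, 0.5, 1 ]
console.log(sigmoid(0)); // prints 0.5
```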
anssik: Discuss which ops to add to the spec definition considering e.g. major platform support and performance implications.
https://github.com/webmachinelearning/webnn/pull/83
chai: the set of models we're targeting, and ops we define should go hand in hand, many models use a shared set of ops, but not exactly the same set necessarily
… adding model support as we go has been a successful approach in some of our projects
… that is, we should support all models in the 1st wave and define the required ops as we go, iterating with the polyfill to understand what is testable and verifiable
Noise suppression
https://github.com/webmachinelearning/webnn/issues/66
chai: I spent time looking at jmvalin's models, PR sent out for GRU to partially address the issue: https://github.com/webmachinelearning/webnn/pull/83
… RNNoise relies on GRU and I'm looking forward to fully supporting it; with a couple of extra ops we'd be good to go
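To illustrate the gating structure a GRU op has to capture, here is a deliberately simplified scalar GRU step; real GRU ops use per-gate weight matrices, and gate ordering and reset-gate placement vary between definitions, so this is only a sketch.

```javascript
// Simplified scalar GRU step showing the update/reset gating structure.
// Real implementations use weight matrices per gate; here each weight
// is a single scalar for readability.
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

function gruStep(x, h, w) {
  const z = sigmoid(w.wz * x + w.uz * h + w.bz);              // update gate
  const r = sigmoid(w.wr * x + w.ur * h + w.br);              // reset gate
  const hCand = Math.tanh(w.wh * x + w.uh * (r * h) + w.bh);  // candidate state
  return (1 - z) * h + z * hCand;                             // blend old and new
}

// With all-zero weights: z = 0.5 and the candidate is 0, so the new
// state is simply half the previous hidden state.
const zeros = { wz: 0, uz: 0, bz: 0, wr: 0, ur: 0, br: 0, wh: 0, uh: 0, bh: 0 };
console.log(gruStep(1.0, 0.8, zeros)); // prints 0.4
```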
… in terms of issue #66, once we define all the ops we should be able to close it
chai: related to this, questions re 1st wave models?
… how should new models be added to the 1st wave?
RafaelCintron: I'm fine having multiple waves of models
… Teams folks would like some models for their web app; maybe their models can already be done with the 1st wave
chai: it is useful to define models in waves; we can qualify a wave as complete, which makes it more trackable
ningxin_hu: agree with chai
… we could put waves on a fixed cadence, like quarterly?
… another thing I'd like to raise: we can link these models with the use cases defined in the spec
… the flow from use cases -> models -> ops makes sense to document
<chai> +1