Meeting minutes
anssik: please welcome Baidu folks to the group
… interested in the work this group does, bringing in the Paddle and Paddle-Lite perspective and experience
conv2d and matMul op definitions
anssik: Flesh out how we proceed with adding conv2d and matMul op definitions to the WebNN API spec.
… Proposed work mode to evolve the op definitions iteratively: first land signature and arguments definitions, refine in subsequent PRs based on compatibility study findings.
… Please review and provide feedback on the respective issues prior to the call:
anssik: the compat table just added a Paddle-Lite mapping
anssik: any comments?
chai: question on versioning
… need to tackle this version question early
… should discuss compat and versioning together as we make progress
anssik: I expect the editors to make a PR for these ops
ningxin_hu: I can take that action, working with Chai, also consider versioning
proposed RESOLUTION: Add conv2d and matMul op definitions to WebNN API
ningxin_hu: one comment regarding matMul, it lacks compat study findings
… should figure out if the compat study has been useful for conv2d, then scale to matMul and other ops
Chai: compat table is definitely useful, we do not know all the APIs, maybe needed only for big ops
RESOLUTION: Add conv2d and matMul op definitions to WebNN API
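As an aside on what "signature and arguments definitions" entail, the sketch below illustrates the kind of shape semantics a conv2d and matMul op definition has to pin down. This is not spec text; the NCHW/OIHW layouts, parameter names, and padding convention are assumptions for illustration only.

```javascript
// Illustrative only: output-shape rules an op definition must specify.
// Assumes NCHW input layout and OIHW filter layout.
function conv2dOutputShape(input, filter, strides = [1, 1], padding = [0, 0, 0, 0]) {
  const [batch, , inH, inW] = input;       // [N, C, H, W]
  const [outChannels, , kH, kW] = filter;  // [O, I, kH, kW]
  const [padTop, padBottom, padLeft, padRight] = padding;
  const outH = Math.floor((inH - kH + padTop + padBottom) / strides[0]) + 1;
  const outW = Math.floor((inW - kW + padLeft + padRight) / strides[1]) + 1;
  return [batch, outChannels, outH, outW];
}

function matMulOutputShape(a, b) {
  // 2-D case only: [M, K] x [K, N] -> [M, N]
  if (a[1] !== b[0]) throw new Error("inner dimensions must match");
  return [a[0], b[1]];
}
```

For example, a 224×224 RGB input through a 7×7/stride-2 convolution with 3-pixel padding yields a [1, 64, 112, 112] output. Edge cases like this (padding modes, broadcasting for higher-rank matMul) are exactly what the compatibility study across backends has to confirm.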
Revisit inference API to load and run a model
anssik: Discuss a proposal for inference API to load and run a model.
anssik: quoting the explainer:
… "ML Inference is a proposed web API to take a custom, pre-trained machine learning model in a standard format, and apply it to example data in JavaScript in order to perform inference, like classification, regression, or ranking. The idea is to make it as easy as possible for web developers to use a custom, pre-built machine learning model in their web app, across devices and browsers."
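To make the "as easy as possible" goal concrete, a hypothetical usage sketch of such a load-and-run API is below. The names (navigator.ml.loadModel, model.predict) are illustrative assumptions, not agreed API surface.

```javascript
// Hypothetical sketch of a load-and-run inference API; all names are
// assumptions for illustration, not spec text.
async function classifyImage(modelUrl, inputTensor) {
  // Fetch and compile a pre-trained model in some standard format.
  const model = await navigator.ml.loadModel(modelUrl);
  // Run inference on the example data and return the outputs.
  return model.predict({ input: inputTensor });
}
```

The contrast with the graph API is that the developer never constructs ops; the model format carries the graph, which is why the format question below is central.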
Revisit inference API to load and run a model #41
anssik: feedback wanted in issue #41
… Given adequate support, I'll start the process to expand the Community Group's scope per the charter change process to explicitly bring this proposal in scope of this group.
Jonathan: motivation, there was a previous issue #3, some folks from Microsoft have interest in web developer focus, aka load & run model
… when I heard TF.js folks had reservations about the graph API, I was initially disappointed, so this is a level of API that is supported by Google
… complementary to the graph API
… challenge will be the model format
… in explainer, there's more information on it
… question whether an operation-based model format such as WinML, CoreML, or TF is the right level
… MLIR is looking at using a lower-than-op-level abstraction; that model format is likely Google's future direction
… MLIR is not ready, so that's an issue
… questions welcome!
ningxin_hu: very good topic to discuss
… want to understand the major concerns of the graph API
… building the graph out of ops is the concern, right?
… in terms of API surface, compilation and execution can be reused, with a load model API added