IRC log of webmachinelearning on 2020-11-12

Timestamps are in UTC.

15:03:53 [RRSAgent]
RRSAgent has joined #webmachinelearning
15:03:53 [RRSAgent]
logging to
15:03:55 [Zakim]
RRSAgent, make logs Public
15:03:56 [Zakim]
please title this meeting ("meeting: ..."), anssik
15:04:00 [anssik]
Meeting: WebML CG Teleconference – 12 November 2020
15:04:05 [anssik]
Chair: Anssi
15:04:11 [anssik]
15:04:21 [anssik]
Scribe: Anssi
15:04:27 [anssik]
scribeNick: anssik
15:04:43 [anssik]
Present+ Anssi_Kostiainen
15:04:54 [anssik]
Present+ Ningxin_Hu
15:04:59 [anssik]
Present+ Chai_Chaoweeraprasit
15:05:03 [anssik]
Present+ Rafael_Cintron
15:05:08 [anssik]
Present+ Ganesan_Ramalingam
15:05:13 [Jonathan]
Jonathan has joined #webmachinelearning
15:05:14 [anssik]
Present+ Jonathan_Bingham
15:05:39 [anssik]
Present+ Sandeep_Gupta
15:05:46 [anssik]
RRSAgent, draft minutes v2
15:05:46 [RRSAgent]
I have made the request to generate anssik
15:06:09 [anssik]
Topic: WebNN API TAG spec review submission
15:06:27 [anssik]
-> [DRAFT] TAG Specification Review: Web Neural Network API
15:08:13 [sandeepngupta]
sandeepngupta has joined #webmachinelearning
15:09:54 [anssik]
anssik: Final review of the TAG spec review request before submission:
15:09:59 [anssik]
... Please provide your feedback in the issue ahead of the meeting.
15:10:49 [anssik]
anssik: Plan to submit this request tomorrow, any concerns?
15:11:16 [anssik]
Subtopic: Self-Review Questionnaire: Security and Privacy
15:11:23 [anssik]
anssik: Contribute your suggested responses to the questionnaire:
15:11:35 [anssik]
-> Self-Review Questionnaire: Security and Privacy
15:13:42 [anssik]
anssik: Any questions re the questionnaire?
15:13:59 [anssik]
Topic: Support the execution of the sub-graph scenario
15:14:09 [anssik]
anssik: Discuss and provide feedback on the preferred builder pattern:
15:14:14 [anssik]
15:15:30 [anssik]
ningxin_hu: In our last call we discussed this issue and the follow-up action was to understand the use case better
15:15:48 [anssik]
... Ping mentioned transfer learning as the key use case
15:16:04 [anssik]
... so some layers can be trained with personal data
15:18:10 [ping_yu]
ping_yu has joined #webmachinelearning
15:18:40 [anssik]
[Ping noted "But this API does not seem to follow the conventional builder pattern"]
15:18:59 [anssik]
Sandeep: transfer learning makes sense, building into a single model, not familiar with the use case Ping had in mind
15:20:05 [anssik]
Present+ Ping_Yu
15:20:34 [anssik]
Ping: I have discussed with Ningxin about his subgraph issue, also Chai has chimed in
15:20:59 [anssik]
... there are use cases for transfer learning; the current API only considers the output part of it, the input part is not as clear as the output
15:21:29 [anssik]
... manual training is not typical(?)
15:21:45 [anssik]
... what the API now does can address majority of use cases, I can follow up on the issue
15:21:53 [anssik]
... happy with the solution from Ningxin
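[The sub-graph scenario discussed above can be illustrated with a small mock of a builder-style graph API. All names here (MockBuilder, compute, etc.) are illustrative only, not the actual WebNN interface under discussion.]

```javascript
// Minimal mock of a graph builder, to illustrate executing a
// sub-graph of a larger model (the transfer-learning use case:
// compute an intermediate feature from a base model, then feed
// it into a new head trained on personal data).
class MockBuilder {
  constructor() { this.nodes = new Map(); }
  input(name, value) {
    this.nodes.set(name, { op: 'input', value });
    return name;
  }
  add(name, a, b) {
    this.nodes.set(name, { op: 'add', deps: [a, b] });
    return name;
  }
  mul(name, a, b) {
    this.nodes.set(name, { op: 'mul', deps: [a, b] });
    return name;
  }
  // Evaluate only the operands needed for `output`, i.e. run a
  // sub-graph of the full model rather than the whole graph.
  compute(output) {
    const node = this.nodes.get(output);
    if (node.op === 'input') return node.value;
    const [x, y] = node.deps.map((d) => this.compute(d));
    return node.op === 'add' ? x + y : x * y;
  }
}

const b = new MockBuilder();
const x = b.input('x', 3);
const w = b.input('w', 2);
const feature = b.mul('feature', x, w);                   // base model
const head = b.add('head', feature, b.input('bias', 1));  // new head on top

console.log(b.compute('feature')); // sub-graph only -> 6
console.log(b.compute('head'));    // full graph     -> 7
```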
15:22:33 [anssik]
Topic: Proposed new ops Mirrorpad, SquaredDifference, Pow, TransposeConv
15:22:39 [anssik]
anssik: Review low-level ops decomposition and gaps:
15:22:46 [anssik]
15:23:32 [anssik]
Chai: I looked at the original and ONNX part, they are essentially the same models
15:23:54 [anssik]
... I think this is one of the models WebNN should support, it is one of the samples for many frameworks, also ONNX
15:24:11 [anssik]
... the ops sound reasonable, Ningxin may have some questions?
15:24:26 [anssik]
... I can take an action to map out all the ops needed, there are not that many
15:25:58 [ningxin_hu]
15:26:04 [anssik]
ack ningxin_hu
15:26:40 [anssik]
ningxin_hu: question to Chai, there are two models using different ops to decode, TF Lite and ONNX
15:27:12 [anssik]
... do you want to support convTranspose?
15:27:21 [anssik]
Chai: it is supported in many models, should add it
15:27:47 [anssik]
ningxin_hu: we had an early discussion on conv op support, will revisit
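[The "low-level ops decomposition" reviewed above can be sketched on plain arrays: some of the proposed ops reduce to existing elementwise primitives. This is an illustrative sketch, not the WebNN API itself.]

```javascript
// Elementwise primitives on plain arrays.
const sub = (a, b) => a.map((v, i) => v - b[i]);
const mul = (a, b) => a.map((v, i) => v * b[i]);

// SquaredDifference(a, b) decomposes as (a - b) * (a - b).
const squaredDifference = (a, b) => mul(sub(a, b), sub(a, b));

// Pow with a non-negative integer exponent decomposes into repeated mul.
const pow = (a, n) => {
  let out = a.map(() => 1);
  for (let i = 0; i < n; i++) out = mul(out, a);
  return out;
};

console.log(squaredDifference([3, 5], [1, 2])); // [4, 9]
console.log(pow([2, 3], 2));                    // [4, 9]
```

Ops like TransposeConv (convTranspose) do not decompose this cleanly into elementwise primitives, which is why they are discussed above as candidates for first-class support.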
15:28:17 [anssik]
Topic: Specify the ModelBuilder.createModel
15:28:23 [anssik]
anssik: Discuss naming and how to spec the ModelBuilder.createModel:
15:28:28 [anssik]
15:30:42 [anssik]
Ping: comment about the naming convention, duplication in the name is not preferred
15:31:04 [Chai]
15:32:06 [anssik]
ack Chai
15:32:36 [anssik]
Chai: suggestion for naming, createModel -> build
15:32:45 [anssik]
... that would be reasonable
15:32:52 [ningxin_hu]
+1 rename to build
15:33:44 [anssik]
Topic: Chained API for the Operands
15:33:50 [anssik]
anssik: Follow-up on the mixin interface proposal:
15:33:56 [anssik]
15:34:31 [anssik]
Rama: my question was whether in the chaining style you omit the first operand
15:34:48 [anssik]
... it was not clear how the suggested proposal would work, and Ningxin seemed to agree
15:37:12 [anssik]
Ping: TF.js chaining API is not for model building
15:38:11 [anssik]
anssik: is this a nice to have or must have feature from TF.js perspective?
15:38:16 [anssik]
Ping: nice to have feature
15:38:47 [anssik]
... that said, we should think about how the API should look; if we want to chain the ops here, we have to think about it
15:39:31 [anssik]
anssik: welcome discussion in the GH issues #106
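[The two styles contrasted above can be sketched with a mock Operand class. The names and methods here are illustrative, not the actual mixin proposal in issue #106.]

```javascript
// Mock Operand supporting method chaining, vs. a builder where
// every operand is an explicit argument.
class Operand {
  constructor(value) { this.value = value; }
  add(other) { return new Operand(this.value + other.value); }
  mul(other) { return new Operand(this.value * other.value); }
}

const builder = {
  constant: (v) => new Operand(v),
  add: (a, b) => a.add(b),
  mul: (a, b) => a.mul(b),
};

const a = builder.constant(2);
const b = builder.constant(3);
const c = builder.constant(4);

// Builder style: all operands are arguments.
const r1 = builder.mul(builder.add(a, b), c);
// Chained style: the first operand is the receiver, so it is
// omitted from the argument list (the point of Rama's question).
const r2 = a.add(b).mul(c);

console.log(r1.value, r2.value); // 20 20
```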
15:40:38 [anssik]
Topic: WG Charter feedback
15:40:44 [anssik]
Status check and discussion on WG Charter issues:
15:40:49 [anssik]
15:41:33 [anssik]
15:41:47 [anssik]
anssik: "Is this API likely to be a long-term solution?"
15:43:38 [anssik]
[ Jonathan talking through the points in the issue ]
15:46:23 [anssik]
Jonathan: asked the internal Google team why couldn't the existing NN API be augmented with the low level instructions being explored at Google
15:46:46 [anssik]
... haven't gotten an answer to that question internally yet
15:47:39 [anssik]
... would like to get the web standards and e.g. TAG perspective on the situation where a platform might gain a new API in the future that might or might not replace the old API
15:48:19 [anssik]
Chai: my response is on the issue, don't want to read it
15:48:38 [anssik]
... to me the question is what would be the right abstraction for ops to ensure interop across platforms?
15:48:45 [anssik]
... the topic is about that abstraction
15:49:00 [anssik]
... the web stack has a browser and underneath an OS
15:49:25 [anssik]
... to be really cross-platform we need to pick an abstraction the underlying platforms can support
15:49:50 [anssik]
... ping audio, media, AI it is the same, how to layer those capabilities across multiple platforms
15:50:19 [anssik]
... I believe this abstraction WebNN API chose can stay relevant for 10 years or more
15:50:59 [anssik]
... it has always been the case that an established abstraction is maintained, and the Web is never the first for a good reason; it is a follower, building on the foundation of the underlying platforms
15:51:29 [anssik]
... we have to be careful to look across ecosystems and platforms and define the abstraction we believe is good for all platforms
15:51:39 [anssik]
... this process takes time
15:53:01 [anssik]
... WebNN API is around that corner; AI/ML has had enough development in the past 5-10 years. Working in the Msft OS group, dealing with GPU vendors, I see the directions where the hardware is going and the SW frameworks, TF, ONNX, CoreML, evolving, but when you take a cross section you find there's a handful of common currency flowing through this system
15:53:13 [anssik]
... eventually you need support from the underlying platform
15:53:48 [anssik]
... the folks working on the lowest level of the stack, which people do not often see, look up the stack to understand the use cases
15:54:24 [anssik]
... this has been happening in the past years on desktop and phone; the question is, how do you define the web stack so it can take advantage of that overlap
15:54:42 [anssik]
... you can always wait, but the space moves so fast no one actually waits for you
15:55:01 [anssik]
... enough parties recognize simple things like the need to support conv, gemm etc.
15:55:30 [anssik]
... for example, matrix multiplication is so fundamental, you'll need this abstraction anywhere
15:55:58 [anssik]
... are we ready to say this is the currency we'll start with for the Web?
15:56:20 [anssik]
... there could be a new thing coming that will invalidate everything that came before it
15:56:51 [anssik]
... the web had very basic image formats 20 years ago, much improved today step by step over the years
15:57:14 [anssik]
... there may be obsolete ops in the future at some point, that's fine
16:02:23 [Jonathan]
There are a couple different ways that it could work. If Web NN 2.0 supported a lower level instruction set, either...
16:02:49 [Jonathan]
1) Web NN could support both the currently proposed ~100 operations + the lower level instructions in a single graph, or
16:03:16 [Jonathan]
2) Web NN could support a choice of two mutually exclusive op sets: either developers would use the 1.0 operation set, or they would use the 2.0 operation set
16:08:09 [sangwhan]
The proposed ~100 operations would be nice, but getting implementor traction might be a hard sell, given the complexity of the feature - I think that's my main concern. If there is a high level API that implements less but has very wide developer adoption it feels like there would be stronger motivation to push forward a more complex API.
16:08:37 [anssik]
anssik: are these 4 issues all the issues Google would like us to address?
16:08:54 [anssik]
Jonathan: there may be a couple more, but no surprises
16:10:01 [Jonathan]
As @anssik requested: can Google write something about how the future of the NN API might work for the web platform?
16:11:20 [anssik]
Topic: Adjourn
16:11:33 [anssik]
RRSAgent, draft minutes v2
16:11:33 [RRSAgent]
I have made the request to generate anssik
17:58:15 [Zakim]
Zakim has left #webmachinelearning