15:03:53 RRSAgent has joined #webmachinelearning
15:03:53 logging to https://www.w3.org/2020/11/12-webmachinelearning-irc
15:03:55 RRSAgent, make logs Public
15:03:56 please title this meeting ("meeting: ..."), anssik
15:04:00 Meeting: WebML CG Teleconference – 12 November 2020
15:04:05 Chair: Anssi
15:04:11 Agenda: https://github.com/webmachinelearning/meetings/blob/master/telcons/2020-11-12-agenda.md
15:04:21 Scribe: Anssi
15:04:27 scribeNick: anssik
15:04:43 Present+ Anssi_Kostiainen
15:04:54 Present+ Ningxin_Hu
15:04:59 Present+ Chai_Chaoweeraprasit
15:05:03 Present+ Rafael_Cintron
15:05:08 Present+ Ganesan_Ramalingam
15:05:13 Jonathan has joined #webmachinelearning
15:05:14 Present+ Jonathan_Bingham
15:05:39 Present+ Sandeep_Gupta
15:05:46 RRSAgent, draft minutes v2
15:05:46 I have made the request to generate https://www.w3.org/2020/11/12-webmachinelearning-minutes.html anssik
15:06:09 Topic: WebNN API TAG spec review submission
15:06:27 -> https://github.com/webmachinelearning/webnn/issues/89#issuecomment-722415240 [DRAFT] TAG Specification Review: Web Neural Network API
15:08:13 sandeepngupta has joined #webmachinelearning
15:09:54 anssik: Final review of the TAG spec review request before submission:
15:09:59 ... Please provide your feedback in the issue ahead of the meeting.
15:10:49 anssik: The plan is to submit this request tomorrow, any concerns?
15:11:16 Subtopic: Self-Review Questionnaire: Security and Privacy
15:11:23 anssik: Contribute your suggested responses to the questionnaire:
15:11:35 -> https://github.com/webmachinelearning/webnn/issues/119 Self-Review Questionnaire: Security and Privacy
15:13:42 anssik: Any questions re the questionnaire?
15:13:59 Topic: Support the execution of the sub-graph scenario
15:14:09 anssik: Discuss and provide feedback on the preferred builder pattern:
15:14:14 -> https://github.com/webmachinelearning/webnn/issues/105
15:15:30 ningxin_hu: In our last call we discussed this issue and the follow-up action was to understand the use case better
15:15:48 ... Ping mentioned transfer learning as the key use case
15:16:04 ... so some layers can be trained with personal data
15:18:10 ping_yu has joined #webmachinelearning
15:18:40 [Ping noted "But this API does not seem to follow the conventional builder pattern"]
15:18:59 Sandeep: transfer learning makes sense, building into a single model; not familiar with the use case Ping had in mind
15:20:05 Present+ Ping_Yu
15:20:34 Ping: I have discussed this sub-graph issue with Ningxin, and Chai has also chimed in
15:20:59 ... there are use cases for transfer learning; the current API only considers the output part of it, the input part is not as clear as the output
15:21:29 ... manual training is not typical(?)
15:21:45 ... what the API now does can address the majority of use cases, I can follow up on the issue
15:21:53 ... happy with the solution from Ningxin
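For reference, a minimal sketch of the transfer-learning / sub-graph scenario discussed above, assuming the builder-style API shape under discussion at the time; the entry point and method names (getNeuralNetworkContext, createModelBuilder, input, constant, conv2d, relu, createModel) are illustrative assumptions, not agreed spec text.

    // Minimal sketch only; the API names below are assumptions in the
    // style of the draft builder pattern, not a final WebNN API.
    const nn = navigator.ml.getNeuralNetworkContext();
    const builder = nn.createModelBuilder();

    // Frozen feature-extractor layers (the part that is not retrained).
    const input = builder.input('input', {type: 'float32', dimensions: [1, 3, 224, 224]});
    const filterData = new Float32Array(32 * 3 * 3 * 3);  // placeholder weights
    const filter = builder.constant({type: 'float32', dimensions: [32, 3, 3, 3]}, filterData);
    const features = builder.relu(builder.conv2d(input, filter));

    // Head layers that transfer learning would retrain, e.g. on personal
    // data; the head weights are fed in as a regular input so the
    // framework can update them between runs.
    const headFilter = builder.input('headFilter', {type: 'float32', dimensions: [8, 32, 3, 3]});
    const output = builder.conv2d(features, headFilter);

    // Naming both the intermediate operand and the final operand as
    // outputs lets a caller execute just the frozen sub-graph (features)
    // or the full graph, which is the sub-graph scenario in issue #105.
    const model = builder.createModel({features, output});

Whether exposing intermediate outputs like this is sufficient, or whether a dedicated sub-graph construct is needed, is part of what issue #105 is weighing.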
15:22:33 Topic: Proposed new ops MirrorPad, SquaredDifference, Pow, TransposeConv
15:22:39 anssik: Review low-level ops decomposition and gaps:
15:22:46 -> https://github.com/webmachinelearning/webnn/issues/108
15:23:32 Chai: I looked at the original and the ONNX version, they are essentially the same models
15:23:54 ... I think this is one of the models WebNN should support, it is one of the samples for many frameworks, also ONNX
15:24:11 ... the ops sound reasonable, Ningxin may have some questions?
15:24:26 ... I can take an action to map out all the ops needed, there are not that many
15:25:58 q+
15:26:04 ack ningxin_hu
15:26:40 ningxin_hu: question to Chai, the two models, TF Lite and ONNX, use different ops to decompose
15:27:12 ... do you want to support convTranspose?
15:27:21 Chai: it is supported in many models, we should add it
15:27:47 ningxin_hu: we had an early discussion on conv op support, will revisit
15:28:17 Topic: Specify the ModelBuilder.createModel
15:28:23 anssik: Discuss naming and how to spec ModelBuilder.createModel:
15:28:28 -> https://github.com/webmachinelearning/webnn/issues/107
15:30:42 Ping: comment about the naming convention, duplication the name not preferred
15:31:04 q+
15:31:57 s/the name /in the name
15:32:06 ack Chai
15:32:36 Chai: suggestion for naming, createModel -> build
15:32:45 ... that would be reasonable
15:32:52 +1 rename to build
15:33:44 Topic: Chained API for the Operands
15:33:50 anssik: Follow-up on the mixin interface proposal:
15:33:56 -> https://github.com/webmachinelearning/webnn/issues/106
15:34:31 Rama: my question was whether in the chaining style you omit the first operand
15:34:48 ... it was not clear how the suggested proposal would work, and Ningxin seemed to agree
15:37:12 Ping: the TF.js chaining API is not for model building
15:38:11 anssik: is this a nice-to-have or a must-have feature from the TF.js perspective?
15:38:16 Ping: a nice-to-have feature
15:38:47 ... that said, we should think about how the API should look; we want to chain the ops here, we have to think about it
15:39:31 anssik: discussion welcome in GH issue #106
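To make the three preceding discussions concrete, a hedged sketch touching issue #108 (decomposing a proposed op into existing element-wise ops), issue #107 (the createModel to build rename), and issue #106 (the chained operand style); as above, the builder and operand method names are assumptions, not agreed spec text.

    // Assumed entry point and example inputs, as in the earlier sketch.
    const nn = navigator.ml.getNeuralNetworkContext();
    const builder = nn.createModelBuilder();
    const a = builder.input('a', {type: 'float32', dimensions: [2, 2]});
    const b = builder.input('b', {type: 'float32', dimensions: [2, 2]});

    // Issue #108: SquaredDifference(a, b) == (a - b) * (a - b), so it can
    // be decomposed into the existing element-wise sub and mul ops rather
    // than added as a new op, if the decomposition cost is acceptable.
    const diff = builder.sub(a, b);
    const squaredDiff = builder.mul(diff, diff);

    // Issue #107: finalize the graph; the rename suggested on the call is
    // createModel() -> build(), avoiding the repetition in
    // ModelBuilder.createModel().
    const model = builder.build({squaredDiff});  // currently builder.createModel({squaredDiff})

    // Issue #106: the chained style would expose ops as operand methods
    // via a mixin, with the receiver supplying the omitted first operand:
    //   const squaredDiff2 = a.sub(b).mul(a.sub(b));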
15:40:38 Topic: WG Charter feedback
15:40:44 Status check and discussion on WG Charter issues:
15:40:49 -> https://github.com/w3c/machine-learning-charter/issues
15:41:33 -> https://github.com/w3c/machine-learning-charter/issues/7
15:41:47 anssik: "Is this API likely to be a long-term solution?"
15:43:38 [ Jonathan talking through the points in the issue ]
15:46:23 Jonathan: asked the internal Google team why the existing NN API couldn't be augmented with the low-level instructions being explored at Google
15:46:46 ... haven't gotten an answer to that question internally yet
15:47:39 ... would like to get the web standards and e.g. TAG perspective on the situation where a new API might arrive in the future and might or might not replace the old API
15:48:19 Chai: my response is on the issue, I don't want to read it out
15:48:38 ... to me the question is what would be the right abstraction for ops to ensure interop across platforms?
15:48:45 ... the topic is about that abstraction
15:49:00 ... the web stack has a browser and underneath an OS
15:49:25 ... to be really cross-platform we need to pick an abstraction the underlying platforms can support
15:49:50 ... be it audio, media, or AI, it is the same question: how to layer those capabilities across multiple platforms
15:50:19 ... I believe the abstraction the WebNN API chose can stay relevant for 10 years or more
15:50:59 ... it has always been the case that an established abstraction is maintained, and the Web is never the first for a good reason; it is a follower, building on the foundation of the underlying platforms
15:51:29 ... we have to be careful to look across ecosystems and platforms and define the abstraction we believe is good for all platforms
15:51:39 ... this process takes time
15:53:01 ... the WebNN API is at that point: AI/ML has had enough development in the past 5-10 years; working in the Msft OS group, dealing with GPU vendors, you see the direction the hardware is going and the SW frameworks, TF, ONNX, CoreML, evolving, but when you take a cross section you find there's a handful of common currency flowing through this system
15:53:13 ... eventually you need support from the underlying platform
15:53:48 ... the folks working on the lowest layer of the stack, which people do not often see, look up the stack to understand the use cases
15:54:24 ... this has been happening in the past years on desktop and phone; the question is, how do you define the web stack so it can benefit from that overlap
15:54:42 ... you can always wait, but the space moves so fast no one actually waits for you
15:55:01 ... enough parties recognize simple things like the need to support conv, gemm, etc.
15:55:30 ... for example, matrix multiplication is so fundamental, you'll need this abstraction anywhere
15:55:58 ... are we ready to say this is the currency we'll start with for the Web?
15:56:20 ... there could be a new thing coming that will invalidate everything that came before it
15:56:51 ... the web had very basic image formats 20 years ago, much improved today step by step over the years
15:57:14 ... there may be obsolete ops in the future at some point, that's fine
16:02:23 There are a couple of different ways that it could work. If Web NN 2.0 supported a lower level instruction set, either...
16:02:49 1) Web NN could support both the currently proposed ~100 operations + the lower level instructions in a single graph, or
16:03:16 2) Web NN could support a choice of two mutually exclusive op sets: either developers would use the 1.0 operation set, or they would use the 2.0 operation set
16:08:09 The proposed ~100 operations would be nice, but getting implementor traction might be a hard sell, given the complexity of the feature - I think that's my main concern. If there is a high level API that implements less but has very wide developer adoption, it feels like there would be stronger motivation to push forward a more complex API.
16:08:37 anssik: are these 4 issues all the issues Google would like us to address?
16:08:54 Jonathan: there may be a couple more, but no surprises
16:10:01 @anssik has requested: can Google write something about how the future of the NN API might work for the web platform?
16:11:20 Topic: Adjourn
16:11:33 RRSAgent, draft minutes v2
16:11:33 I have made the request to generate https://www.w3.org/2020/11/12-webmachinelearning-minutes.html anssik
17:58:15 Zakim has left #webmachinelearning