Meeting: Immersive-Web WG/CG TPAC 2020 Day 4
Agenda: https://github.com/immersive-web/administrivia/tree/main/TPAC-2020
Previous meeting: https://www.w3.org/2020/11/04-immersive-web-minutes.html
IRC log: https://www.w3.org/2020/11/05-immersive-web-irc
Minutes: https://www.w3.org/2020/11/05-immersive-web-minutes.html
Chairs: ada, yonet
Scribes: cwilso (Chris Wilson), ada, madlaina-kalunder, cabanier, bajones (Brandon Jones)

Agenda:
1. Navigation (@toji, @manish, @cabanier) (35 minutes)
2. Break (5 minutes)
3. Required dependencies. Example: Should Hit Test be made part of the AR module? (@toji, @klausw) (45 minutes)
4. Break (5 minutes)
5. WebXR WebGPU Binding (@toji) (45 minutes)
6. Break (5 minutes)
7. XR Accessibility (@Manish, @Yonet) (30 minutes)
8. Break (10 minutes)
9. WebXR Hand Input (@Manishearth, @fordacious, @thetuvix) (60 minutes)

Topic: Navigation (@toji, @manish, @cabanier)

scribe: cwilso (Chris Wilson)

bajones: I think Oculus is the only one who's implemented any form of navigation at this point (i.e., the ability to jump from page to page without leaving the immersive context).
... the UA is navigating to a new URL, but you're still in an immersive context. Think of a "portal" in my immersive web app that pops me into a new virtual world. "There are complications."

cabanier: Oculus Browser has implemented it, but for most origins it's behind a flag. For security reasons, this isn't something we can just turn on. There are a couple of origins we trust, e.g. for Facebook 3D photos.
... unless there has been movement [on the security investigation], we don't intend to turn this on by default.

bajones: Can you give a description of how this is presented to the user?

cabanier: For 3D photos, you get a loading spinner, then the new immersive page comes in.
bajones: Is the spinner from the UA itself?

cabanier: Yes.
... the spinner actually comes from the OS, not even the UA.

bajones: What happens if the page takes a long time (30 seconds, say) to load?

cabanier: You'll get 30 seconds of spinner.

bajones: Does the new page know it's being navigated to?

cabanier: Yes. I think Diego proposed something that provides a handoff.

bajones: It does seem critical that we let users know 1) that they're about to navigate, and 2) where they're navigating to. These shouldn't be able to be reasonably spoofed or hidden.
... I'm not sure how to make that happen. Using the Oculus flow as an example, the spinner should show where you're going (the origin).
... it would be really nice if there were something like the "hover over a link, see the URL" affordance from 2D navigation.

[scribe: I'd note that can be defeated]

bajones: but maybe the right place is onbeforenavigation.
... but for some people, popping up a "you're about to navigate" prompt would kill the experience.
... I feel like we need some guidance from browser security people.

cabanier: Not sure it's possible, but could we pre-register origins or something like that?

bajones: In terms of a manifest or the like?

cabanier: Yes.

bajones: That feels clunky in terms of user experience.
... imagine a GeoCities-style experience, with lots of subdomains...
... this seems almost untenable. It seems more practical to have a popup on page load, saying "this page may navigate you to XXX",
... or that experience, but one that hides after the first time.
... it feels like it's going to be clunky no matter what you do.
... I'm worried people will not get the UI they want here.
... I think Oculus' experience is only tenable because it's controlled on both ends.
... (also, that scenario is fairly lightweight in terms of resources)
... you could be staring at that spinner for a long time.
... navigating out to the page maybe isn't so bad...

scribe: ada

cwilso: Maybe the right thing at this point, since we have been looking at the UI parts, is to hit up security experts and the TAG and get pointers/advice from them.
... some of the assumptions we're making may not be correct.

scribe: cwilso

cabanier: Didn't John Pallett look into this?

bajones: I think so, but that was a while in the past, and the context may have changed.
... we should probably verify.
... I just wanted to get current status and next steps: "Yes, this is a good idea, let's continue" or "No, for these reasons we're not pursuing it." It keeps coming up, but we don't seem to have concrete steps.

ada: I seem to recall Diane doing work around this. Should we ask?

Manishearth: I can ask, though her being at Apple now might make that problematic.
... Her idea was something like: you have some UI navigation chrome that you can keep out of your way, and it expands when something's happening, like an address bar.

bajones: Seems like you'd need to do some composition with the frame.
... more like layers.

Manishearth: Diane will put it on archive today or tomorrow.

all: Thanks, Diane!

ada: Does anyone have bandwidth to have a go at implementing this?

bajones: Would need to check, but probably not.

Manishearth: It would be best if a headset did this; handhelds already have chrome.

ada: OK, we're over by 5 minutes; let's take a break.

Topic: Required dependencies. Example: Should Hit Test be made part of the AR module? (@toji, @klausw)

chair: yonet
scribe: madlaina-kalunder

yonet: Our next subject is required dependencies.
... I have Brandon and Piotr to introduce it.

bajones: The topic at hand is that we have been developing some of these features as modules.
... that seems to have worked very well for us, as we were able to develop the core API really well.
... however, looking back, it would be worth taking some time to look at how the features themselves developed.
... should these features be separate features?
... should we lift these features to core?
... we on the Chrome team see immersive-ar and hit test used together in our samples; we almost never see just immersive-ar used.
... the reason they are separate is more a matter of timing, to get immersive-ar features out.
... now that both the immersive-ar and hit-testing modules are fairly well established and used,
... we should look into combining these modules.
... would anyone like to voice their support or opposition?
... it could be a simpler message to developers.

Manishearth: My initial reaction is that this does not sound like a good idea to me.
... in particular, we had an open question that we left to the UA: can you report that you support immersive-ar?
... the environment blend mode only gets reported in immersive-ar sessions.
... I don't want to be in a situation where a headset does not support immersive-vr.

klausw: The concern is not so much that we see immersive-ar sessions using hit testing all the time when the content is AR-centric.
... your concern is that for headsets that want experiences that are more immersive-vr but have transparent optics, hit test would not matter at all.

Manishearth: Environment blend mode has not been important so far in making immersive-vr look nice,
... but you want all the VR examples to run on a HoloLens, for example.

cwilso: The "opaque" environment blend mode must not be applied in immersive-ar.
... for "alpha-blend" environment blending, this technique will not be applied for immersive-ar;
... for others it should be applied regardless.
... we can advertise that a device that uses AR blending could still use the feature.

Manishearth: This would require any device to state that it supports immersive-ar, and we would stop them from having the choice.
... all UAs can already make this choice to use additive mode in VR.

bialpio: We should be careful about saying what must be implemented by the UA.
... all modules have a mechanism to say which features are supported.
... you need to be able to recognize immersive-ar as a mode, to surface environment blend mode on a session;
... you need to be able to recognize a feature descriptor
... and try to leverage the specific feature.
... it seems that most of our modules could be merged into one big spec.
... my perspective is to keep them separated,
... to work on features separately,
... and point developers to specific specs.
... we probably should take that (developer communication) into account.
... I would not be opposed to merging those two.

klausw: I think a main issue is implementation burden. Hit test can have a minimal implementation, i.e., intersect with the floor plane. On the other hand, DOM overlay requires a full browser implementation in immersive mode; that's likely in the person-years-of-effort range, so I'd be against making that a mandatory feature.

bajones: I agree with that.

bialpio: If we keep them separated, it will also be easier for browsers to implement the features;
... we could just publish a list of what gets implemented in each of them.
... are we considering making any features mandatory in AR / VR?
... just because a feature is more natural in AR does not mean it cannot be featured in VR.
... anchors will not change over time in VR, [whereas in AR] they will be dynamic,
... but there is nothing in the spec to rule out that something can't be used in both.
... we have a mechanism to say what is implemented.
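[Editor's note: a minimal sketch, not from the minutes, of the feature-descriptor mechanism bialpio describes: hit test is requested per session via its 'hit-test' descriptor, and the blend mode is reported on the resulting session. Assumes a user activation and an async context.]

    // Hit test stays optional even in immersive-ar; the page degrades
    // gracefully if the UA or device doesn't support it.
    const session = await navigator.xr.requestSession('immersive-ar', {
      optionalFeatures: ['hit-test'],
    });
    // One of 'opaque' (VR), 'additive' (see-through optics),
    // or 'alpha-blend' (camera-based AR).
    console.log(session.environmentBlendMode);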
bajones: I want to clarify that if we merge the two modules together, I don't think hit test should be mandatory.
... I would not change the text at all,
... besides integrating them into the same document.
... I would continue to support advertising that hit test is available.
... it would imply to a lot of people that these features go hand in hand; I would be worried about that a bit,
... that developers would take the presence of one to mean the other feature is mandatory.
... maybe merging these would add some confusion.
... these features seem to be highly correlated, but there have also been enough arguments that this may not be the case.
... in that case, while I personally feel that we should leave them separate for now,
... we could have more discussion: when is it appropriate for the group to merge things, and when should we hold off until a WebXR spec 2?
... maybe there is some benefit to agreeing on that procedure.

Manishearth: Like you said, we don't need to imply the feature requirement in order to merge the specs.
... I am open to merging.
... the AR module does not have much more to be added.
... everything is related, and it would make sense to merge for maintenance.
... the AR module is not very large, so it would make sense.
... my biggest concern would be that AR would [appear to] require hit test.

bajones: My biggest concern would be developer-facing.
... a little bit of reading could sort it out for them,
... but I am always hesitant to rely on developers reading fine print.
... imagine a VR headset that implements this;
... the impact would be that we force the environment blend mode on you if you want hit test.

bialpio: Question: does anyone remember what the original plan with modules was?
... was there a plan to start merging them into the main spec when they mature?

bajones: My intention was that we would deal with this later.
... it felt natural to eventually bundle a bunch of extensions and make them part of the core.
... we did not talk about specific plans.
... we wanted to free people up to work independently, without cross-dependencies,
... which mostly worked fine, but there are some weird exceptions where the modular approach did not work so well.
... there are some awkward dependencies.
... I don't recall a specific plan for module merging,
... which is why I wanted to bring this up today.
... do we wait for the big bundling day?
... that might happen regardless in the future.
... I was not sure if intermediate merging is a good plan;
... I can see it going either way.
... we need to be careful about implications.

Manishearth: I prefer not merging things, except core concepts.
... I like how CSS does it, where they split things carefully.
... but this is far-future stuff.

bajones: We can look at established groups; they must have good reasons for how they split things.

yonet: Let's shoot for small messes.

bajones: Do we want to try and establish through a straw poll
... whether people would like to merge or not?
... or whether we should bother with this at the moment?

ada: We will do a straw poll.

bajones: +1 if you are in favor of merging, -1 for keeping them separate.

[votes: -1, -1, -1, -1, -1, 0, +0.5, +0.25 (merge common concepts like XRRay and XRWebGLBinding, wait with the rest until Big Bundling Day)]

bajones: I feel like this is coming down pretty clearly to "keep them separate".
... we can revisit it if we have compelling reasons;
... we just keep them separate for now.

Topic: WebXR WebGPU Binding (@toji)

yonet: We will get started with the WebGPU topic.

Slides: https://docs.google.com/presentation/d/17_YzJAavUluGFNBOY8-N2itdaV02cCQhu7iBoOx00E4/edit?usp=sharing

bajones: Don't stress about reading through all of the slides; I will highlight the important parts.
... just a quick update on where we are with the WebGPU bindings,
... and a few questions that come along with it.
... the current state: the proposal has a repo:

https://github.com/immersive-web/WebXR-WebGPU-Binding

bajones: It builds heavily off the layers module, which is WebGL-centric.
... there are a couple of interesting differences.
... it has no impact on the media and DOM layers;
... WebGPU will only use the new layers interface.
... we would like to move towards a more layer-centered approach.
... presenting the IDL interface (see slides)...
... what are the differences from the WebGL layers?
... one big thing is that format / usage must be specified.
... in this case, the WebGPU spec will specify the BGRA format,
... which can be rendered to on all platforms.
... on all desktops and most mobile devices, this is the most efficient format to render to, compared to RGBA;
... this is why we will use BGRA by default.
... you could also specify a different format.
... if you want a depth/stencil buffer, you can specify a format for it, but by default it will be null.
... there is also the texture usage: if you want to render on top of the texture, you have to specify the usage.
... we need to be able to allow developers to change that.
... we determined that these are all important for WebGPU;
... this is why those two APIs align pretty nicely.
... all projection layers will use texture arrays.
... this does not actually allow for multiview rendering;
... it's not clear how much benefit you would get from it.
... side-by-side rendering is still allowed.
... a couple of areas for discussion:
... any questions on what we just talked about?
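[Editor's note: a hypothetical sketch following the shape of the slides and the WebXR-WebGPU-Binding repo; the interface and dictionary names here are from the proposal and may change before the spec settles.]

    // Assumes an active immersive XRSession in `session`.
    const adapter = await navigator.gpu.requestAdapter();
    const device = await adapter.requestDevice();

    // Proposed XRGPUBinding, mirroring the layers module's XRWebGLBinding.
    const binding = new XRGPUBinding(session, device);
    const layer = binding.createProjectionLayer({
      colorFormat: 'bgra8unorm',         // the proposed default
      depthStencilFormat: 'depth24plus', // null unless requested
      textureUsage: GPUTextureUsage.RENDER_ATTACHMENT,
    });
    session.updateRenderState({ layers: [layer] });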
cabanier: When you return the list of supported formats, it seems like the format is not enough; you also need the type?

bajones: WebGPU follows those patterns a lot more, where formats are a well-defined list:
... strings that represent a very specific memory layout.
... those implicitly refer to a type,
... which is not the case in WebGL; that's the complexity we would like to avoid and simplify.

RafaelCintron: The equivalent feature of multiview was not removed in [Vulkan];
... won't we be unable to add this in the future?

bajones: I would not want to prevent adding this in the future;
... we would use texture arrays to accomplish it in the future.
... my hope is that saying this always uses texture arrays, for simplicity, will be the right call.
... I have a couple of questions;
... these apply to both [the WebGL and WebGPU layers].
... it may be useful to update XRProjectionLayer
... to advertise what the texture width, height, and layer count are.
... [today] you can only get the dimensions once you are in the middle of a frame loop;
... this can be problematic
... if you need a compatible depth buffer for your layer.
... it may be useful to lift this up to this specific interface,
... so you could take it off the critical frame-loop path.
... any concerns from the group?
... there may be a scenario where we would want to allow [sizes to change],
... but unless someone wants to advocate for that path, I feel like the current path is probably fine.

cabanier: This is a good thing to have; we should update the layers spec.
... there is a number of textures internally, I assume?

bajones: If you have a texture array, you likely want a depth buffer of the same size.
... we are not sure if we want to report it as one,
... or as the number of textures.

RafaelCintron: I don't feel too strongly about this.
... unless we know it could become a problem for people not to allocate their depth buffer outside,
... we should keep things flexible.

cabanier: We do need extra layers.
... you would be able to infer it in the frame loop.

bajones: It would be slightly more awkward to do it in the frame loop,
... especially on the WebGL side.
... it makes the math a bit harder, and you have to keep track of more.
... I would be concerned that if something changed with the stream, the developer would have to watch the sizes frame over frame;
... I am not convinced that many people will do that.
... a better way would be to fire an event when a layer has changed,
... rather than keeping track of it each frame.
... it might just be more consistent for everybody: once you've specified the layer size, it stays the same.
... it would really just be the UA's choice (e.g., for performance reasons)...
... we do still depend on setting the viewport that the API reports to you;
... that can change over time.
... we went through this discussion with Klaus: [scaling] should be application-driven for non-corrective reasons.
... this is in the hands of the applications;
... this is probably the better route.
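[Editor's note: a sketch of the mid-frame pattern bajones describes, using the layers module's WebGL binding; the variables gl, glBinding, projectionLayer, and refSpace are assumed surrounding context, not part of the minutes.]

    function onXRFrame(time, frame) {
      for (const view of frame.getViewerPose(refSpace).views) {
        const subImage = glBinding.getViewSubImage(projectionLayer, view);
        // subImage.textureWidth / subImage.textureHeight are only
        // discoverable here today; the proposal discussed above is to
        // also expose them on XRProjectionLayer itself, off the hot path.
        gl.viewport(subImage.viewport.x, subImage.viewport.y,
                    subImage.viewport.width, subImage.viewport.height);
      }
      frame.session.requestAnimationFrame(onXRFrame);
    }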
RafaelCintron: We should have spec text that says this is not really relevant to WebGPU.

bajones: There is an issue on the WebGL layers about whether we should keep this value around.
... do we still need ignoreDepthValues?
... some systems use [depth] for compositing reasons,
... but in some cases the developer may not be populating the depth buffer with values from the scene.
... we wanted a way to signal to the system to turn off using those depth values;
... this is how it is being used now.
... this concept got carried over when we did the layers API.
... if developers don't want to specify it, they should just allocate [the depth buffer] themselves
... and populate it with valid values.
... this is how we got rid of the boolean at creation time from the old method.
... sometimes you would allocate a depth buffer, but the system would just not use it for compositing;
... sometimes applications may want to know that.
... I don't know how often this situation will come up,
... or whether it will burden developers.
... we could just get rid of this value altogether.
... this feels like an easy thing for developers to overlook.

RafaelCintron: Is your position to kill it for WebGL and WebGPU?

bajones: We could make it report a reasonable value for both.
... I would prefer that we lean strongly into the practice of "this is what you should concretely do, whether you have good values or not";
... developers should not have to switch on the fly.
... we have XRWebGLLayer, which hands back framebuffers,
... and we have the layers module, which has the WebGL API.
... what we are talking about is just the layers module variant;
... the one that hands out framebuffers is in core,
... which will continue to have this boolean in it.

RafaelCintron: Given that, I would agree with removing it:
... give out textures and let people make their own depth buffers.

cabanier: The other proposal is to move it to the XRGPUBinding.

bajones: I apologize; I had not seen that proposal.
... that has a different implication to me.
... as long as we can get the verbiage right...
... "hey, you might want us to allocate the depth buffer" would seem more reasonable.
... unless anyone has further questions or comments... I have no more slides.

cabanier: Does the browser allow you to mix WebGL and WebGPU layers?

bajones: I have wondered about that; I would appreciate it if anyone could speak up if it's not like this for them.
... it seems like they always boil down to a common set of surfaces;
... they all get funneled down to a common set
... for compositing.
... whether or not we use compositing, the UA will be able to get surfaces for that.
... my intent is not to say that you could only do one or the other; they should be intermixable.
... there is not a lot of motivation for developers to do so,
... but I do think there is motivation to mix media, DOM, and graphics layers;
... we need to make sure that this works.

RafaelCintron: I don't see any problem, at least on Windows, with mixing those;
... they will all compose and work correctly on the same page.

cabanier: Do you initialize so you can do both?

RafaelCintron: Today, WebGPU runs on D3D12.

bajones: If you have an OpenGL texture, you can get it to the Windows presentation layer;
... we do this in Chrome today.
... it's all the same driver that handles this.
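[Editor's note: for reference, a sketch of the existing core-spec boolean discussed above, on the framebuffer-based XRWebGLLayer that the group noted will keep the flag. Assumes `session` and a WebGL context `gl` exist.]

    // A page that never writes meaningful depth can ask the compositor
    // to ignore the depth buffer.
    const glLayer = new XRWebGLLayer(session, gl, { ignoreDepthValues: true });
    session.updateRenderState({ baseLayer: glLayer });
    // The attribute reports whether the UA is actually ignoring depth.
    console.log(glLayer.ignoreDepthValues);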
Topic: XR Accessibility (@Manish, @Yonet)

scribe: cabanier

yonet: Manish went through the requirements,
... and we will get an update,
... and we should see if anyone is interested in working on this.

Manishearth: I don't have much new to report;
... they haven't gotten back to us.
... I know bajones is planning on writing a document
... to share with the TAG and this group.
... everything boils down to the a11y story of WebGL, which isn't great.

yonet: You said it's not possible to scale reality. Why can't we do it in VR?

Manishearth: I recall saying that, but now I don't remember the context.
... normally, scaling for a11y means making content bigger so it's readable.
... right now we don't have a way to do this in VR,
... but content can do it itself.

klausw: If you're using phone AR, you can use the OS-level functionality so the overlay is zoomed.
... zooming [the camera image] doesn't make sense.

yonet: For HoloLens we can scale the whole thing,
... and it would be really nice to have that.

bajones: I generally agree with Manishearth; zooming in VR will be very difficult.
... how does HoloLens do this? Is the content scaling up, or is there a system-level scaler?
... it seems the content has to do this.

yonet: In the experience, you would be able to scale it up to room scale,
... so you can interact with it differently.

bajones: This sounds like Tilt Brush,
... where you can scale with your hands to do some detailed work.
... it really helps people work with the content in that environment,
... but it's not the result of an OS a11y feature.

yonet: This is one of the issues that are still open.

https://youtu.be/G5m7ukcGeQg

klausw: Is there anything happening at the OpenXR level?
... many of these things make sense there as well.
... does anyone know?

bajones: Maybe Alex knows.

yonet: Do you know, Lachlan?

Laford: There is no explicit support, but it's highly dynamic, so the runtime can do it itself.

cabanier: DOM layers would give the session the same a11y features as a web page.

Manishearth: What was the audio question?

yonet: The open issue.

bajones: It was about documenting better how people can use audio.

https://github.com/immersive-web/webxr/issues/815
https://github.com/immersive-web/webxr/issues/815#issuecomment-524492352

bajones: I believe that issue was brought up in the context of making audio a component of the spec,
... but I forget why this is an outstanding issue.
... we need to document it better in the spec, because it is not normative text.
... since DOM overlay and layers were brought up: every time a system has a content-aware structure, it has good implementations for accessibility;
... this could carry over to the media layer.
... I'm not sure what opportunities are available there.
... similarly, we can look at frameworks like A-Frame;
... maybe the work there has already been done.
... we are just a consumer of raw pixels.
... can we do an OS-level zoom? That is likely not practical.
... maybe there is an extension to change the floor level;
... then you can make yourself float,
... and that is something we could surface in the API.
... is this a good idea to do through the UA?

cabanier: How does video have a11y?

bajones: It's just subtitles,
... and then we can make it easy to put the titles right in front of you,
... based on system preferences.

klausw: More could be done with audio:
... you could show visual cues for where the audio is coming from.
... many of these things [would be easier] if they were accessible at the OpenXR level;
... we should see what the platforms are doing.
... for competitive games, you might not want this,
... but those are likely rare use cases.

ada: I want to say that sites must not know that a user is using accessibility tools,
... and this is very important:
... that information is kept private.
... with regard to subtitles, someone was writing up what they would like to see for subtitles in XR;
... they had a URL, but I can no longer find it.

yonet: If you block people who manage their height, that would be very unfortunate.

Laford: The input system in OpenXR is done by the runtime.
... it has mechanisms where users can reconfigure the bindings.
... the browser is a runtime consumer; this binding API could provide a level of accessibility.
... with regard to competitive games, is there a way to stop it?

klausw: Some games look for installed tools.
... I'm unsure if this is an issue, since WebXR is not used for competitive games.

bajones: I agree that that won't happen.
... for the most part, the web is not about getting the ultimate performance;
... I don't want concerns of that type to stop our development.
... right now, Chrome and Edge do work with OpenXR.
... we don't expose a binding system;
... we expose the left and right squeeze actions, etc.
... it's worth noting that using SteamVR you can rebind Chrome,
... so a pedal could become a squeeze.
... it's unfortunate that we don't have contextual names.
... I agree it's critical that if we find we need to turn off a11y features for the integrity of the content, it would have to be something at session creation time,
... so that we are not advertising arm lengthening,
... because that would leak information about the person.

madlaina-kalunder: I want to strongly agree that it should not be disclosed if someone is using accessibility features; in games, the developers will have to find out anyway if there are any cheats involved.
... I wanted to reinforce bajones' point;
... it's important to know how it can influence the experience.
... remapping is something that is commonly used.
... is there a way to provide a semantic scene?
klausw: It depends on the framework;
... A-Frame would be able to do it.
... I think there was a discussion on integrating that,
... but this would be for specific frameworks.

ada: To go back: you can use the Xbox Adaptive Controller.

bialpio: I want to make sure I understand the question.
... are we trying to make sure that the APIs are attaching semantics
... to real-world understanding?
... maybe this is something that could be done.
... if I understand correctly, the frameworks don't hand out this information,
... so it could be a bit more challenging,
... but it's not something we can force them to do.

yonet: Maybe hit testing could tell if the user hit something.

cabanier: Kip had a proposal to add semantics to the scene and make it available to screen readers.

https://w3c.github.io/apa/xaur/

ada: I would like to mention the XR Accessibility User Requirements document.
... it contains the feedback and thoughts from the groups;
... these are things that we need to be aware of.

yonet: We can invite experts to some of our sessions.

Topic: WebXR Hand Input (@Manishearth, @fordacious, @thetuvix)

chair: ada
scribe: bajones (Brandon Jones)

Manishearth: Hand input has been shipping in Oculus for a while;
... the API is more or less done.
... it's a matter of pushing forward to a point where Oculus and Edge can ship without a flag.
... Alex Turner wanted to talk about what it would take to get there.
... my main concern is a stronger privacy/security analysis.
... Microsoft may have been looking into it.

Laford: It was put up for TAG review.
... the spec functionally shouldn't change much.
... I don't know what a privacy/security analysis would entail.

Manishearth: The current privacy/security docs don't contain much.

https://github.com/immersive-web/webxr-hand-input

Manishearth: The explainer has a privacy section written by Diane.
... it would be nice if a privacy expert could look through it and identify fingerprinting concerns and mitigations.
... I don't consider it a blocker; there's not much else to do.
... it would help if Microsoft and Oculus could work together on an unflag date.

Laford: Microsoft is pretty happy if everyone else is.

cabanier: Oculus is about to start a privacy/security review.
... I'm unsure whether we need to get explicit permission for hand access.
... Oculus devices implicitly have access to hands, but it's a new sensor for the web.

Manishearth: The spec does require a permission step, in that it requires a feature descriptor;
... it's up to the UA how it handles feature descriptors.
... UAs should strongly consider having a prompt.
... we already have a prompt; we should look at extending that.
... but the spec only strongly suggests it.
... secondary views are an example of a feature that probably doesn't need a prompt.
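[Editor's note: a minimal sketch of the feature-descriptor flow Manishearth describes, using the 'hand-tracking' descriptor from the hand input explainer; `refSpace` is assumed surrounding context.]

    const session = await navigator.xr.requestSession('immersive-vr', {
      optionalFeatures: ['hand-tracking'], // the UA decides if this prompts
    });

    function onXRFrame(time, frame) {
      for (const inputSource of session.inputSources) {
        if (!inputSource.hand) continue; // not granted or not available
        const tip = inputSource.hand.get('index-finger-tip');
        const pose = frame.getJointPose(tip, refSpace);
        // pose?.transform and pose?.radius can drive a rigged hand model.
      }
      session.requestAnimationFrame(onXRFrame);
    }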
cabanier: Happy to work with Microsoft and share the results of our internal privacy review with the group.

Laford: We're at the stage of being about to implement on top of OpenXR in Chromium.

Manishearth: Good to hear! At this stage we're mostly blocked on Oculus/Microsoft figuring out timing.
... no significant spec decisions remain, just polish.

Laford: Alex Turner was concerned about a couple of C-isms, such as how enums are used.

yonet: Do we need to add WPT tests for hand input?

Manishearth: Oh, yeah. We should!

https://immersive-web.github.io/webxr-test-api/

Manishearth: (But not me, because I'm not tied to an implementation at the moment.)
... it still doesn't block unflagging the API;
... WebXR was out for a while without tests.

cabanier: We don't intend to gate on WPT.

ada: Anything else?

Manishearth: Alex Turner requested this meeting; I hope we covered it. Lachlan?

Laford: I think Alex is in a meeting; checking.

Manishearth: Fill poses may have been the only minor thing, mostly cosmetic.
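[Editor's note: for reference, a sketch of the bulk "fill" methods referenced above (fillPoses / fillJointRadii from the hand input spec), which avoid per-joint allocations in the frame loop; `inputSource`, `frame`, and `refSpace` are assumed context.]

    const joints = Array.from(inputSource.hand.values());
    const poses = new Float32Array(joints.length * 16); // one 4x4 matrix per joint
    const radii = new Float32Array(joints.length);
    if (frame.fillPoses(joints, refSpace, poses) &&
        frame.fillJointRadii(joints, radii)) {
      // poses[i*16 .. i*16+15] is the transform for joints[i]
    }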
ada: Should we fill time by naming things?

bajones: *Stares in Software Engineer.*

cabanier: We're working on a generic hand model.

Laford: Is there a plan to expose hand meshes, as opposed to the current joint-based model?

Manishearth: We didn't have the time or motivation to look at meshes. Joints allow for gestures, and you can approximate [a mesh] with a rigged hand.
... if someone wants to do the work to write up a mesh API, we can look at it.

Laford: Yeah, meshes for gestures are not great.

Manishearth: We're already seeing an explosion of cool content, so API-provided meshes are likely not necessary.

cabanier: I wrote a hand mesh API when working at Magic Leap;
... it runs at a lower framerate.

Manishearth: I've thought about it a little. Making an API that can return hundreds of points efficiently is tough.

Laford: You could do it in concurrence with other types of meshes that you may want to track,
... but that's a whole different discussion. It makes sense not to include it for now.

cabanier: I looked it up. There IS an Oculus hand mesh API,
... used in the Oculus Home screen.
... it can be used in conjunction with joints; it just renders nicer.

ada: Did anyone see the Babylon.js update today? They added hand tracking.

https://twitter.com/babylonjs/status/1323308129631694848

yonet: It's coming next week; you can try it now.

Manishearth: HoloLens has another API for room meshes. I can't recall if the OpenXR API is the same.

Laford: It doesn't exist yet. A Windows-specific bridge exists.

Manishearth: I assume that OpenXR will eventually have a room mesh API. I'm curious whether that will be the same as the hand mesh API.
... it's worth waiting for more progress in that direction.

cabanier: I tried it at one point. They are definitely very different APIs:
... the world mesh is a lot slower and larger, with add/remove updates.

Manishearth: In that case it's probably OK that they're different. We may want to share types for triangles.
... I'm reluctant to start until I have more visibility into how world meshes work, to find commonalities.

Laford: Industry differentiates the two and has separate APIs.
... I've thought about the difference between the objects:
... collecting vertices/indices. The difference is in update timing and how static/dynamic they are.
... it makes sense to share primitives,
... but I'm not an expert in this area and not sure why they tend to be separate.

cabanier: https://github.com/immersive-web/real-world-geometry/blob/master/webxrmeshing-1.html
... it's not worth assuming that they should be separate or together; we should investigate.
... similarly, the treatment of hand joints and the human skeleton is another potential area of shared concepts.

ada: Anything else on this topic?
... nothing, so moving on to any remaining business.

Topic: AOB

ada: Tomorrow's call is at 8 PM GMT.

Laford: One more topic: the overlapping-squeeze profile?

https://github.com/immersive-web/webxr-input-profiles/pull/185

Laford: I'm not super familiar with the history.
... the original problem with hands is that squeeze and select overlap;
... the spec implies they are distinct.
... [the PR] has a grasp action mapped to a 4th button.

Manishearth: I like this.
... "grasp" is a way better name than "overlapping-squeeze".
... we need a line in the spec about it.
... we need to work with framework people to make sure we handle it right. I'm hopeful.

ada: I think we can call it an evening/afternoon/morning. :)