Socialwg/2016-09-22-minutes

From W3C Wiki

Social Web Working Group Face-to-Face Day 1

22 Sep 2016

See also: IRC log

Attendees

Present
rhiaro, cwebber, tantek, KjetilK, aaronpk, tsyesika, Benjamin_Young, csarven, newton, Arnaud, Ann Bassetti, AnnBass, ben_thatmustbeme
Regrets
Chair
tantek
Scribe
sandro


Minutes

<tantek> good morning #social! day 1 of the f2f is starting.

<tsyesika> I can hear you

<cwebber2> fabulous

<bigbluehat> scribenick: bigbluehat

Agenda item scheduling

https://www.w3.org/wiki/Socialwg/2016-09-22#Agenda

<Loqi> Social Web WG Face to Face Meeting in Lisbon (F2F7)

tantek: great work everyone on the demos yesterday
... first time I've seen a WG demo so many of their working drafts
... think we have 5?

sandro: depends on how you count

tantek: AnnB put up a great photo of the breakout
... the demos yesterday did a great job of heading off divisive discussions
... thanks to everyone for making the environment so much better
... we have a review request from I18N and a scheduled meeting with them today
... how long rhiaro?

rhiaro: an hour

tantek: they'll be reviewing AS2 and ActivityPub?

cwebber2: I'm not sure what ActivityPub will need that isn't covered by AS2

tantek: but we'll show them just the same to be sure it's covered

aaronpk: there might be a few things in Web Mention about the responses
... and that might also affect LDN

rhiaro: we do need to file a formal request for LDN and (??)

tantek: if we did 10 minutes per spec, that'd be an hour
... this afternoon, I believe, sandro and I need to go to the AC meeting
... we are meeting until 3 pm today
... unless we somehow set up Evan to remote chair

rhiaro: we go to them, right?
... can the other groups chair for the group meetings?

sandro: yeah. that could work.

tantek: I don't think I need to be there for the I18N discussions
... I believe I've shared my opinions already and those can be relayed
... now that we've discussed that bit...we should go back and do introductions
... Amy can update the agenda since she's working on scheduling the other groups
... Let's pop back to intros

<tantek> Tantek Çelik, chair, Mozilla, also on the AB

<sandro> sandro: Sandro Hawke, W3C / MIT

<cwebber2> I'm Chris Webber, I'm an editor of ActivityPub, I work on MediaGoblin as motivation, and I'm an invited expert in the group

<tantek> observers, please add yourselves to https://www.w3.org/wiki/Socialwg/2016-09-22#Observers

<Loqi> Social Web WG Face to Face Meeting in Lisbon (F2F7)

<paulcj> Paul Jeong, HTML5 Forum in Korea, making korean local social web standard using W3C standard

<aaronpk> Aaron Parecki, editor of Webmention and Micropub

<KjetilK> Kjetil Kjernsmo, Observer, old-time semwebber, worked with social media in the past, trying to get back into the area of decentralized social media

Benjamin Young, co-editor of the Web Annotation spec, interested in AS2 and LDN for their use in Web Annotation

<kaorumaeda> Kaoru Maeda, Observer

<csarven> I'm Sarven Capadisli http://csarven.ca/#i , editor of https://www.w3.org/TR/ldn/ . Invited expert. Working on https://dokie.li/

tantek: I18N is in 1.05--right next door

<rhiaro> Amy Guy, W3C/MIT/University of Edinburgh, staff contact, Social Web Protocols, LDN

tantek: at 15:30
... our end of day will be at 16:30

<tsyesika> I'll just write mine here: I'm Jessica Tallon (on hangouts), I am a co-editor on ActivityPub, and invited expert in the group and have done a lot of work on GNU Mediagoblin's federation

tantek: we have some time to discuss strategy for the next 3 months--which takes us to the end of the charter
... after that we have blocks of time for our various CRs
... I scheduled things partly around evan's schedule--he'll hopefully be awake by then

<tsyesika> :)

tantek: first thing I have is ActivityPub and then LDN and then Post Type Discovery after that...because I'll be here
... PubSubHubbub will be tomorrow
... and then finish with a "what's next?" tomorrow
... anything else?
... then let's go on to strategy for the next 3 months

Strategy for the next 3 months

tantek: we have several CRs and a few WDs that are pretty advanced
... we have another that is FPWD state, but has several implementations
... our goal--or a proposed straw goal--is to get all of these to TR before the end of our charter
... I think we have a decent chance to do that
... having multiple docs to push through the process at various times, has proved useful for getting things out the door
... I think we can continue that pattern over the next 3 months
... I think it's achievable
... the biggest unknowns are:
... Sufficient Test Suites
... and sufficient implementation coverage to show to W3C Management

sandro: we also need public wide review and horizontal review

<harry> Sorry, had to leave TPAC to help teach a course, but note that I spoke re Pubsubhubbub with DanBri, who is close with BradFitz (Pubsubhubbub original author).

tantek: yes! that's a big requirement.
... I'd like to underscore that

<harry> I would follow up with danbri, but he said as long as it's clear Google is not endorsing the work or the WG, he can speak with BradFitz over RF licensing.

sandro: plh said 3 months before CR is when you go out for horizontal review

<harry> So if any of you are at TPAC (particularly sandro/rhiaro), talk with Danbri.

tantek: yeah...that was several yester-months ago
... at this point, we'd like to get horizontal review ASAP
... especially since they're kind of a pair, those requests should go out this week

sandro: definitely this week

tantek: is that something sandro or rhiaro can cover?

rhiaro: it depends on who you're asking

<harry> Yes, I think Wendy would red-flag going forward with Pubsubhubbub if there's no contributor agreement from the original author, unless Julien didn't use any of BradFitz's original text.

sandro: yes. the staff contacts can help
... but the speed of the groups differs, and several of them have a prerequisite self-review

tantek: I think we should give them warning at least that we're coming
... and estimates of when we expect to take them to CR

<wseltzer> [note it's not a question of text, but features for RF patent commitment]

tantek: so that we don't ask for review last minute as we'd done before

sandro: we could say "we're ready to go to CR, modulo your review"

tantek: in two weeks?
... then we can try and push these through faster

csarven: how do we select who to get reviewed by?

sandro: it's based on our own needs, but if we don't get any then there are problems

tantek: correct. If there aren't external reviews, then W3C Management will be unhappy

cwebber2: who should we find for external review?

sandro: the farther away the better

cwebber2: k. trying to decide who to contact
... someone from Pump.io has recently dug into ActivityStreams and (to a lesser extent) ActivityPub and heavily reviewed it already

sandro: yeah. that's perfect.

tantek: generally I think we've taken the approach of generally useful pieces for other groups--often external
... if you expect your spec is the foundation for someone else, then be sure they're part of the review
... Web Annotation, for instance, should review LDN if they're considering recommending it

sandro: ideally, this sort of thing has gone on for 3 years
... but in the case of these new specs, we're down to the 3 months

tantek: right, so greater encouragement to review is needed

tantek: wider and greater horizontal review is the most critical thing at this point

<sandro> (and that's when it's most important to get wide review)

<harry> wseltzer, yes the concepts/features are more or less the same as BradFitz's spec.

tantek: and we're also dependent on other people to get back to us

cwebber2: so. I'm trying to figure out when we should have people get back to us

<harry> However, I also think some of the text is his as well, so it makes sense to get an RF commitment. BradFitz isn't against it, he just doesn't see the point or any advantages of standardization, but DanBri or Julien could likely discuss.

tantek: I think if you have a sense of what's optional, at risk, etc, then you're ready for wide review
... there's a list of standard horizontal reviews and rhiaro is going to share that list

<wseltzer> [harry, let's take this discussion offline. we discourage patent discussion in WGs]

tantek: I'm happy to connect editors to folks in other WGs that they want review from

sandro: we can also check into some of the community groups--though many of them lie fallow

paulcj: was curious about community groups and handling on going specs

tantek: yes. we want to discuss that, probably tomorrow, along with the recharter discussion
... which is scheduled at 15:30

paulcj: sadly, I'm not here tomorrow.

sandro: to your question, we can revise our specs after we've shipped them
... but we can use the CG to discuss them, and work toward a later recharter if we find it's needed

tantek: we can continue to do information guides and anything informative in a CG
... one of the things we did with AS2 was have it processed down to zero issues
... and then sent a wider request for input for a "last call" on filing issues
... I'd like to get the thoughts from the editors on how to handle issues
... and whether or not this would work

aaronpk: I like this in theory, but two weeks is not a lot of time
... and I want to be sure there's enough time to get feedback

sandro: yeah. the goal is more "is it ready to start implementing"
... there used to be a "last call" step and it still feels like it's missing

tantek: yeah, and that's now part of CR
... and that's more or less what we're proposing here
... bringing that back with this 2-week window / "last call" period
... I'd like to get a temperature gauge on this idea
... seeing some head nods

cwebber2: yep.

tantek: k. let's plan to do this in mid october

sandro: 2-weeks from now is Oct 6.

tantek: so. let's put that down and talk to the rest of the WG, that we'll do this 2-week window
... our goal is to say "proposed: take XYZ to CR" and get a round of +1's and push for horizontal review, etc.
... and the horizontal reviews is a different matter
... they might take 2 months
... so we'll give them a different window

sandro: is post type discovery ready for this process?

tantek: it depends on my time, but I think it'd fall just behind that schedule, but could still happen

sandro: and pubsubhubbub?

cwebber2: yeah, I think there's still interest and activity

tantek: it seems there's been some good github activity recently
... the big question there is whether its ready for FPWD

aaronpk: I'd like to review it, but first I want to finish Webmention and the other things I'm working on

tantek: right. this is sort of like Post Type Discovery. they're not as ready as the others
... they'd be more "at risk" than the others
... they feel pretty small

rhiaro: well. pubsubhubbub is pretty large

aaronpk: yeah. it's bigger than what it looks like from my guide

sandro: signed delivery specifically sounds like an "at risk" feature

tantek: or perfect for a later version
... k. we have 10 more minutes left in this item
... we'd talked about doing a November face-to-face
... presumably by then all of our specs would be in CR
... and we'd be evaluating reports and test suites
... to be sure all that was covered
... so the question is, is there value to doing some of this in person?
... or is that something we want to do remotely/virtually over telecon

csarven: real quick about the dates
... we said October 11th
... is there then sufficient time before a proposed F2F?

sandro: we'd be in CR, but we'd possibly be at the end of CR for some of these

tantek: it would be sufficient to still have time left in CR
... it'd then be up to myself and the chairs to cover
... it'd be great to quickly turn around exit reports
... it shouldn't block us on a F2F
... so I'd like to get some input

aaronpk: so. my other thought.
... our biggest difference between a F2F and the telecons is the length of consecutive time.

sandro: right a virtual face to face

bigbluehat: DPUB did this for their use case documents--and with enough coffee it's not too bad

tantek: there was some talk that if we did a F2F we could use MIT
... as the potentially preferred option
... and still looking at November

cwebber2: I'd be AOK with doing another F2F
... they've been super productive lately
... but if that's too difficult for everyone, it might be good to do the remote f2f
... maybe 2 weeks with 2 half day meetings

sandro: M, T and then the next M, T

rhiaro: one advantage of the F2F is that folks get less distracted
... I also don't know where I'll be in September
... Bali?

sandro: we should probably do that in December

tantek: so there does seem to be some consensus that a f2f would be ideal, and virtual as a workable fallback

cwebber2: maybe somewhere in europe?

tantek: wseltzer just pointed to our charter

<sandro> wseltzer, can you be a little more specific?

tantek: it says "F2F once a year at minimum, 3 times a year at maximum"

<wseltzer> yes, that's what I was pointing out

<tantek> wseltzer: are you able to join us in 1.06?

<wseltzer> since f2f's are expensive in time and travel costs, we want to keep an eye on them

tantek: the facts are, we have met 3 times this year
... we are interpreting that as we could do that, if enough of us agree
... it would be odd to say, we can't do it if everyone in the group would like to

aaronpk: I will say that I no longer have external funding for this
... so personally closer to the West Coast would be helpful

tantek: ok...
... there's a sense that a F2F would still be useful
... there's a sense that the US would be preferred over international
... there's another proposal for Sweden

cwebber2: yeah...but I can't really volunteer someone else's time and building

sandro: personally, West Coast is nicer for me than a European trip that time of year

cwebber2: my preference is Boston because i have lots of "crash spaces"

csarven: I wouldn't be able to attend unless it's in Bern

tantek: oh. here's Wendy

wseltzer: yeah. I saw you were chatting about the F2F
... and just wanted to remind that you'd chartered it to 3, but you can override it with agreement from the membership

cwebber2: I'm AOK with doing the remote thing

tantek: if we do a F2F with remote participation

rhiaro: what if we do 2 F2F's one in the US and one in the EU with remote participation

tantek: do we have that much activity in the EU?

rhiaro: not sure, I'm just continuing to volunteer people who aren't here

sandro: I've been part of two-headed f2f's with 6 people in each room

tantek: paulj, what are your thoughts on a F2F?

paulj: I am not sure we can attend a F2F

tantek: would you be interested in attending virtually?

paulj: yes.

<sandro> wseltzer, my sense would be if the WG has unanimity to meet, it's okay to meet more often than the charter (foolishly IMHO) says

paulj: it is difficult because of timezones--the telecon is at 2 am in Korea

rhiaro: we can schedule it for 24 hours and do it in shifts

tantek: let's do a straw poll

aaronpk: do we not already have that

tantek: true. anyone object to a F2F?

sandro: the one thing maybe I have said, is that I'm likely not up for traveling, but I would be up for remote

tantek: aaronpk, cwebber2?

aaronpk: I'm up for West Coast. Maybe East Coast, depending on the timeframe and cost

tantek: if we're committed to the F2F, then perhaps we can pin down the dates for the people most interested

sandro: maybe we should look at 14-15th (avoiding the week before because politics)

tantek: maybe 15 & 16, so we can do Monday for travel

aaronpk: that's actually the best week in November for travel

tantek: can we discuss 15 & 16 for a F2F?
... any other dates to propose?
... open to counter proposals. this one just seems to be getting traction

<csarven> +1 to Nov 15. -1 to Nov 16.

aaronpk: is this for boston?

tantek: if your date and location are tied together, that would be good to note

csarven: I'd be remote

tantek: how about the 17-18th

rhiaro: I'll be traveling
... I'll be traveling earlier that month, so later in the month is slightly better. Those dates are OK

aaronpk: I'd have to stay over the weekend to make it work...

rhiaro: I smell an indie web camp

aaronpk: good point

tantek: k. i think that's probably narrowed down enough that it's worth us bringing to the folks not in the room
... to see if that works for them or have a preference
... particularly Evan
... certainly in the US is easier for him
... Julien is another person that would be great to have at the F2F
... so knowing location needs for them would be great
... any objections?

sandro: tantek do you want to send that out?

tantek: I'll let you do that.
... we're about 20 minutes behind
... aaronpk are you ready to talk about Webmention next steps?
... since this is that last session before the morning break...
... csarven can you present the issues page for webmention?

Webmention

aaronpk: since we're discussing LDN later today, there's only 1 issue

tantek: actually let's be sure to do the I18N one also, so we're ready for that review

aaronpk: summary of issue #57
... the spec says that while there's no required body, a response may contain content
... there are responders that send cute messages in response
... mostly they are ACKs--essentially
... some of them do send JSON responses that point to where the notification is stored
... if it's used for things like IndieNews, then they have useful information in the response
... but if it's pure WebMention, the only thing you need in response is the 201 response code
... the I18N concern is that the spec says "a human readable response" but doesn't address I18N concerns at all

tantek: it's optional?

aaronpk: right. it's a MAY
... and likely no user will actually ever see this--just developers
... the same is true with error responses
... the spec says it MAY contain a description of the error
... sometimes they are explicit about the error
... "we were able to find the page, but unable to find your link"

tantek: let me see if I can summarize
... this is about informative developer messages
... one way we can phrase a question to the I18N
... what is your recommendation on optional informative developer messages?
... possibly this is something they have a general recommendation for that kind of thing
... that's one way could narrow that request of them

csarven: so I can understand this better, is the assumption that an application is making the request?
... is the developer unaware of the request going through?

aaronpk: yeah. generally it's a sender server application
... and it's rarely exposed to the recipient user

tantek: what about webmentions from a form request?

rhiaro: you wouldn't dump it to the user

aaronpk: so. some of them respond with a formatted HTML response that is seen by people

<rhiaro> scribenick: rhiaro

bigbluehat: the content type can be whatever in response?
... Can you just recommend that they use http headers for any language declarations?

aaronpk: probably

bigbluehat: and just say respond with http and reference 7240 or whichever one that says what the language is
... and you should do http good

aaronpk: that's probably fine

bigbluehat: just push it down the stack to http. otherwise you're going to run into defining other things

aaronpk: and there's reasons to return nothing

bigbluehat: it's a nice big known quantity you could use for that
... happy to help find those

tantek: if you're sending a human readable response you should be sending the following HTTP headers
... the other consideration which i18n is getting at is that there are Accept headers, and Accept-Language...

bigbluehat: there's accept language and content language

tantek: so you should be looking at accept headers sent by the senders

aaronpk: is it okay to just say do http?

bigbluehat: there are two parties doing http: the sender and the server. You'd have to state that you're going to pass through any Accept-Language stuff to the endpoint and then back through.. relaying those headers?

tantek: I think all you have to say is the endpoint should look at the Accept header of the request and then should respond accordingly per HTTP with the appropriate content and language headers

bigbluehat: should maintain client preferences

aaronpk: content type applies as well

tantek: you can narrow the requirements. If the Accept header is requesting HTML do this, otherwise do what you want
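A minimal sketch of what that could look like on the receiving end (hedged: this assumes a Python/Flask endpoint and an English-only body purely for illustration; nothing here is normative Webmention text):

```python
# Hypothetical sketch only (not spec text): a Webmention endpoint that honours the
# sender's Accept header when it chooses to return the optional human-readable body,
# and declares the body's language per plain HTTP.
from flask import Flask, request, Response

app = Flask(__name__)

@app.route('/webmention', methods=['POST'])
def receive_webmention():
    source = request.form.get('source')
    target = request.form.get('target')
    # ... verification of source/target omitted ...

    accept = request.headers.get('Accept', '*/*')
    if 'text/html' in accept:
        # The sender asked for HTML, so include a body and say what language it's in.
        body = '<p>Thanks! Your mention of %s was received.</p>' % target
        return Response(body, status=202,
                        content_type='text/html; charset=utf-8',
                        headers={'Content-Language': 'en'})
    # Otherwise the status code alone carries the meaning; no body is needed.
    return Response(status=202)
```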

csarven: if it's html it's definitely intended to be viewed by a human
... text/plain could go either way, but less likely human in this case

tantek: so if there was an accept header of application/json then the endpoint could just blow it off
... the only accept content type header that's relevant to pay attention to is html

bigbluehat: the ones that the spec should encourage as fallbacks are text/plain or */*, so we don't get 415, especially since the body is optional
... i18n might be okay with "the body should be ignored but may be persisted"
... options has this
... most people just ignore the body
... If you say the meaning of this response is restricted to the headers, you may reuse the contents however you see fit, and possibly take out the human readable bit, and that would totally punt on the problem
... Then you can say for more advanced use cases lean on http's defined header patterns

aaronpk: I like that

tantek: that is an option to drop that may/recommend completely
... you can put a note saying implementations have done x

aaronpk: does that include removing that example?

csarven: is that example an error?

aaronpk: it may already have a status url, doesn't mean it's done

bigbluehat: the resource exists but not its representation

csarven: if I go and dereference that..

aaronpk: what you get will change... gives you a 200 and a json body
... This is also something I want to do as an extension
... here's how to do status reporting of processing, it's pretty useful
... But totally an extension
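For illustration, the flow being described might look roughly like this from the sender's side (the endpoint URL and the JSON status body are made up; the polling step is the hypothetical extension aaronpk mentions, not part of the spec):

```python
# Hypothetical sender sketch: send a Webmention, then poll the status URL that the
# receiver returned in the Location header. The polling/status-format part is the
# "extension" idea discussed above, not a Webmention requirement.
import requests

resp = requests.post('https://example.org/webmention',   # endpoint found via discovery
                     data={'source': 'https://sender.example/post',
                           'target': 'https://example.org/article'})

if resp.status_code in (201, 202):
    status_url = resp.headers.get('Location')
    if status_url:
        # Dereferencing the status URL later returns 200 and a JSON body whose
        # exact shape is up to the receiver (e.g. {"status": "verified"}).
        print(requests.get(status_url).json())
```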

cwebber2: that's something we have in MediaGoblin, with submitting a video, it has to transcode, you don't wait to give a response

aaronpk: yeah, deserves proper research and a spec

tantek: as an interim you may want to consider an informative note

bigbluehat: and be clear that the normative response is 'it happened, here is location'

tantek: setting expectations for consumers with that information

<bigbluehat> scribenick: bigbluehat

tantek: I did want to talk about bigbluehat's point about passing HTTP headers
... is that something you want to state normatively?
... specifically we should be sure that the Accept-* headers are handled
... and perhaps recommend that */* is always included as a safety net

aaronpk: so this is solely about client to webmention endpoint. not endpoint to server.
... we can add an informative note for how things happen in a browser context

tantek: does that resolve that issue? and solve the I18N issue?

aaronpk: right. I'm going to drop the human readable response recommendation from the normative text
... there's still the error response issue
... I will ask for recommendations that have no actual processing needs

tantek: that all sounds good. plus bigbluehat's do HTTP properly recommendation
... that should hopefully make the I18N folks happy about it

aaronpk: I've added those to issue #57
... the other one is issue #48
... this came up during a face-to-face. it has my name on it but I opened it for someone else--probably Ryan of Bridgy

tantek: there are situations where this has broken "in the wild"
... so we should probably be ready for this same situation

aaronpk: the scenario is a blog post containing 8 links
... and discovery having to be done on all 8 links
... so there are interesting thoughts in the thread
... bear for instance has some interesting thoughts

csarven: so to fill in the blanks. is this the sending or the discovery?

aaronpk: it's the discovery step
... you may have added a web mention endpoint

sandro: this is just about discovery and rediscovery

aaronpk: yeah. even re-sending.
... because it's spec'd to recheck
... I feel like it's pretty simple per URL. a simple back-off strategy

sandro: cache headers?

aaronpk: per-url, following cache headers is a pretty easy answer
... you should start there.
... I don't think we need to recommend a specific back-off strategy per-url
... just document that they should have some back-off strategy
... the challenge is multiple URLs on the same host
... a very common way this actually happens is when I link to your post and your home page
... a lot of people have the mention endpoint on the post, but not on the home page
... so the question is, how do you avoid these failure cases

breaking for serious coffee needs

<tantek> resume at 11:05

<Loqi> Aaronpk made 2 edits to Socialwg/2016-09-22 https://www.w3.org/wiki/index.php?diff=100164&oldid=100009

<Loqi> Rhiaro made 2 edits to Socialwg/2016-09-22 https://www.w3.org/wiki/index.php?diff=100165&oldid=100155

<Loqi> Rhiaro made 1 edit to Socialwg/LDN CR Transition Request https://www.w3.org/wiki/index.php?diff=100166&oldid=0

<Loqi> Inword made 1 edit to Socialwg/2016-09-22 https://www.w3.org/wiki/index.php?diff=100161&oldid=100156

aaronpk: we looked at OPTIONs during the break

tantek: but it's unclear who can control that

aaronpk: also robots.txt does have some extension/variation that can state rate limit style statements
... however it's not documented in the standard
... though it is implemented by yandex and bing
... because we don't have any implementation experience around host-level rate limiting
... another option we have is to move the scenario to a client concern
... so they have a way to handle the problem or warn the server
... so it's clear why there are so many GET requests
... another option is making recommendations around multiple URLs
... one is recommending respecting cache headers per URL

tantek: sounds like there's enough information to iterate on

aaronpk: the only thing I'm confident to recommend at this point is stating that the client would include something in the user-agent string
... so that servers know why there's a high level of GET requests

csarven: so we've actually only handled it in retry scenarios

rhiaro: ActivityPub recommended we handle that

sandro: yeah. the webmention scenario is about discovery

rhiaro: LDN's discovery is basically the same

csarven: the URL could be somewhere else on the web

sandro: right it's the same for webmention

tantek: right. the follow-your-nose kind of thing

csarven: think we should just state "be nice"
... it's going to be hard to recommend a clear hard limit for people to follow

sandro: it's sort of like "how long can a URL be?"

tantek: aaronpk can you propose a solution

aaronpk: yep. 1. add a cache header and not try more often than that suggests
... also 2. including the text "webmention" in the User-Agent header so there's an indication of why the requests are coming
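A rough sender-side sketch of that proposal (the User-Agent value, the cache structure, and the header-only discovery shown here are illustrative assumptions, not spec requirements):

```python
# Illustrative only: a sender doing Webmention endpoint discovery that identifies
# itself as a webmention client and respects Cache-Control before re-fetching.
import re
import time
import requests

USER_AGENT = 'example-sender/0.1 (webmention; +https://sender.example/about)'  # hypothetical
_cache = {}  # target URL -> {'endpoint': ..., 'expires': ...}

def discover_endpoint(target):
    entry = _cache.get(target)
    if entry and time.time() < entry['expires']:
        return entry['endpoint']                      # don't re-fetch before the cache allows
    resp = requests.get(target, headers={'User-Agent': USER_AGENT})
    match = re.search(r'max-age=(\d+)', resp.headers.get('Cache-Control', ''))
    max_age = int(match.group(1)) if match else 0
    # Link-header discovery only, for brevity; a real sender also checks the HTML.
    endpoint = resp.links.get('webmention', {}).get('url')
    _cache[target] = {'endpoint': endpoint, 'expires': time.time() + max_age}
    return endpoint
```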

tantek: anyone object to that?

RESOLUTION: accept aaronpk's proposal to close issue 48

<Loqi> Tantekelik made 1 edit to Socialwg/2016-09-22 https://www.w3.org/wiki/index.php?diff=100178&oldid=100165

<aaronpk> https://github.com/w3c/webmention/issues/48#issuecomment-248865148

tantek: next issue?

aaronpk: who posted #63?

<sandro> KjetilK,

KjetilK: it's just about the HEAD request and a status code

tantek: the key is to be sure that the things you need in the later spec are still there
... next issue?

aaronpk: things seem done. waiting on a response for #55
... otherwise, we'll see after the I18N review

tantek: k. we're through the WebMention issues
... so. now we talk test suite
... does it cover the conformance requirements?

aaronpk: great question. let me find that section
... I believe it covers all the sender requirements
... most of the test suite checks the discovery and receiving of them
... there are tests for updates and deletes
... for testing receivers, it basically sends you a mention and then you prove that you can receive it
... I haven't gone through all the MUSTs and SHOULDs yet

bigbluehat: definitely the MUSTs

tantek: but it's best to do the SHOULDs too
... it's expected that implementations conform to both

aaronpk: there's actually not a lot of MUSTs in receiving at all

tantek: should there be?

aaronpk: no. lots of that is up to the receiver
... things like what sort of source content it receives
... also the number of redirects to follow...there's no tests for that

sandro: you could have it test against infinite redirects

aaronpk: I could bump whatever number they say they support by 1 and then do that many redirects and see if it succeeds or fails
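Sketched purely as an illustration of that idea (the route names and chain length here are invented; the actual test suite is whatever aaronpk builds):

```python
# Hypothetical test fixture: serve a chain of N redirects ending at a target page,
# then check whether the implementation under test still delivers the mention.
from flask import Flask, redirect, url_for

app = Flask(__name__)
CHAIN_LENGTH = 2   # e.g. one more than the implementation claims to support

@app.route('/redirect-test/<int:n>')
def redirect_hop(n):
    if n < CHAIN_LENGTH:
        return redirect(url_for('redirect_hop', n=n + 1), code=302)
    # Final hop: a target page advertising the test suite's own Webmention endpoint.
    return ('<html><head><link rel="webmention" href="/webmention-log"></head>'
            '<body>redirect test target</body></html>')
```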

tantek: another way to look at it is interoperability.

aaronpk: possibly testing for 1 redirect would be useful for interop

tantek: that does sound useful. for receivers right?

aaronpk: yes

tantek: we're looking at feature coverage and interop

sandro: could you test the infinite redirect case for the error scenario?

aaronpk: it's possible. that's not a conformance thing though

sandro: but it's a nice thing to have for killing broken code

tantek: is that something you cover in security concerns?

aaronpk: yes. I believe so
... yes. it's in security considerations

tantek: perhaps make sure the redirects bit is there

aaronpk: it's there.

tantek: don't bother with the infinite case--as it's not needed for the spec validation

Arnaud: yeah. if it's not a spec requirement it's not something we have to test

tantek: yeah. there are also better things to work on given the amount of time we have in our charter
... you might consider raising the redirect issue with the TAG

Arnaud: no. don't do that...

bigbluehat: you could do it post CR/TR as a way to test non-spec-requirement things that people really should still do, as a way to help implementers

aaronpk: I'm going to make a milestone for it

tantek: perhaps "feature complete" testing
... things that help implementors do a better job with their implementations
... we need to know from you, aaronpk (and the other editors), that you feel the tests are ready to cover the spec requirements
... and generate reports
... how are the implementation reports coming?

<aaronpk> https://github.com/w3c/webmention/tree/master/implementation-reports

aaronpk: missing a few of them
... some of these are self-reported
... some of them are check marks generated by the test suite?

sandro: is there an easy view of this?

tantek: do you have a tabular format?

aaronpk: I have not done that yet

tantek: how much more time do you want for that?

aaronpk: I can probably aggregate that today

tantek: and give a review tomorrow?

aaronpk: yeah. that should work

sandro: are you all doing the same sort of reporting?

rhiaro: we're copying webmention

cwebber2: my plan has been to copy the other two

tantek: what about AS2?

<csarven> https://github.com/w3c/activitystreams/blob/master/implementation-reports/template.md

tantek: this is a bit of an aside...we'll get to these discussions later in the AS2 section
... aaronpk you'll get us those reports tomorrow.
... we know there are more tests

aaronpk: and there are things in the reports that don't necessarily have code tests

tantek: then that's a good hint that there's more to add to the test suite

<sandro> ( looking back dreamily on https://www.w3.org/2003/08/owl-systems/test-results-out -- which took live feeds of test results )

cwebber2: do you need to ask people to re-run tests if you change the tests?

tantek: yes.

bigbluehat: if they're conformance requirements

aaronpk: the implementation report template is complete
... that does reflect the spec
... so I'm not going to be changing the template

tantek: right now that's self reporting

aaronpk: my understanding is that manual testing is an option

sandro: right. that's fine.

tantek: code would be nicer

sandro: some scenarios can't be tested with code

tantek: sure.

aaronpk: and some of these webmention tests can't be either and have to be validated by humans

tantek: my preference would be that if you can write a code test, then you should, and we should make that the conformance requirement
... I know in CSS there's a pretty high bar for claims of passing
... now. css specs often take a very long time to excite CR
... but my preference is that we do have code tests for implementations as much as possible

aaronpk: I agree that makes sense.
... however, I will say it's possible to write some of these but also impractical
... for instance the asynchronous cases
... because there's no defined way to say that it's "complete"
... we haven't specified a way to know when it's done
... so it'd be a lot of work and not even a guarantee that it's conformant

sandro: it's more like writing code to help a human do the testing

tantek: so. it's probably best that we spot check implementations that they actually work if mashed together
... as far as us taking this to a CR transition call
... so we can say that we've done manual testing and put implementations against each other

aaronpk: yeah. this is even a challenge in practice
... sometimes you don't know if it worked because the mentions are moderated

cwebber2: could you have a manual mode for your suite?

aaronpk: I could, but it's a lot of work and only marginally valuable

sandro: because webmention doesn't keep things around it's trickier to know if it worked

aaronpk: and the spam avoidance features make it particularly tricky to test

sandro: if we could go backwards we could spec features specifically for testing/validation, but it's too late for that

tantek: whatever method we employ, we need to talk the director through the interop situation.
... ideally, anyone could come to our test reporting and find conformant implementations
... it would certainly be nice. we don't have to. but it would make things smoother and more impressive

Arnaud: well. let's be real. I don't think anyone's ever lied about passing these sorts of tests

tantek: yeah. I'm not implying that, just that there may be bugs that the test suite doesn't cover or find

sandro: there are scenarios where spot checks are done across multiple implementations
... this is especially true with vocabularies
... you can test that the terms are there, but a human usually validates that they're in the right place and used the right way

aaronpk: k. just to summarize, the requirements for PR are
... implementation reports validate 2 or more implementations of every feature
... ideally done via automated testing

tantek: it's a huge plus

aaronpk: and what was the other requirement?

sandro: all issues addressed. and wide review
... did we miss security review?

tantek: yes. it's in the spec

<sandro> https://w3ctag.github.io/security-questionnaire/

<Loqi> [Mike West] Self-Review Questionnaire: Security and Privacy

tantek: wait. is it filled out?

sandro: specifically https://w3ctag.github.io/security-questionnaire/

<Loqi> [Mike West] Self-Review Questionnaire: Security and Privacy

tantek: it's not currently required
... but it's very helpful

sandro: specifically the privacy bits
... given that this is a social protocol

tantek: how do folks feel about this?
... I filled this out for CSS UI
... I went through it. I didn't find any real surprises, but it was helpful to think about these issues.
... after having done the self-review I found it helpful
... I'd like us to consider adding this as a requirement for our specs

aaronpk: where would I put this?

tantek: in security considerations

<tantek> https://www.w3.org/TR/css-ui-3/#security-privacy-considerations

<Loqi> [Tantek Çelik] CSS Basic User Interface Module Level 3 (CSS3 UI)

tantek: or an appendix would work
... which is what I did for CSS3 UI
... I think it would be pretty short
... I think it's useful for the privacy interest group specifically

csarven: should I just pick applicable ones?

tantek: no. you answer them all

csarven: that seems possible
... that's only for convenience right?

tantek: it's for anyone

csarven: I definitely see the value of it
... what about the others?
... should the I18N self review go in there too?

<sandro> https://www.w3.org/TR/international-specs/ isn't exactly a questionnaire...

tantek: let me split your question

<Loqi> [Richard Ishida] Internationalization Best Practices for Spec Developers

tantek: should we be doing self reviews? that's the first question
... and that's a yes

<aaronpk> here's the checklist https://www.w3.org/International/techniques/developing-specs

tantek: on the should we put them in the spec question, it depends on the spec

<csarven> i18n as well as a11y

tantek: if it's heavily about privacy and security, then that should be there

sandro: another approach to doing this is the issue tracker

<cwebber2> access.bit.ly

<csarven> :) I meant a11y

<csarven> can't count

bigbluehat: that sounds great
... and then go to horizontal with those filled out

tantek: that does sound like a reasonable approach

Arnaud: yes. the sooner we make these horizontal requests the better

sandro: yeah. we said we'd definitely do it this week

Arnaud: yeah. sadly it's tricky because if you ask too soon, then they just tell you to come back later

sandro: reviewers want the specs to be simpler and easier to review
... because they also have time pressures

<sandro> https://www.w3.org/2005/08/01-transitions-about

tantek: I want to minimize the unexpected requirements for editors
... and narrow in on things that all the editors agree to
... so I've put MicroPub after lunch and AS2 after that

aaronpk: we'll have just 40 minutes for lunch
... and I think MicroPub will take as long or longer than WebMention

tantek: perhaps there's enough overlap that it'll be faster
... and to rhiaro's point it should help the other editors

AnnBass: are you going to the AC meeting?

tantek: yes.
... and the other groups will chair the combined meetings

adjourned for lunch

<AnnBass> (for the record, I am also going to AC meeting)

<rhiaro> https://http.cat/418

<aaronpk> new hangouts url: https://hangouts.google.com/call/xvjzbgdgzve6rcl7l3sflmucque

<aaronpk> naps photo.jpeg

<cwebber2> scribenick: cwebber2

Micropub

tantek: let's look through open micropub issues, how about starting with #7

aaronpk: the bottom 4 we can ignore, we're waiting on response, the main one I wanted to talk about was #55
... I think cwebber2 may have experience with this
... for right now it's mentioned that when the application returns json, it returns the application/json content type

cwebber2: not sure why it would need a different media type

bigbluehat: there's no need to switch media types as long as processing is the same
... using the profile= thing may be okay but also may be unnecessary

tantek: are there any specific requirements?

aaronpk: no it's just some specific terms

bigbluehat: so, we use the json-ld context, but you could reference a schema that says here are the keys we have to have, but you could just ship it as application/json and that's fine
... if your processing model hasn't changed from json that might be fine
... what json-ld says "this term has this meaning throughout the tree"

aaronpk: right and with json-ld it says certain kinds of structure are not allowed

bigbluehat: yes like lists of lists

tantek: what do JSON-based snowflake APIs do?

bigbluehat: github uses its own vendor media type, but a profile is a better option
... usually it points to an html spec, it uses an @context

sandro: is github's model common

bigbluehat: sometimes, but profile is starting to be pushed because it's dereferenceable

tantek: you need to register them potentially, etc?

sandro: I am on the ietf types mailing list, but they aren't that common

<tantek> issue URL?

bigbluehat: with hal-json and etc, they have _links and etc

<tantek> GH issue URL

<tantek> ?

bigbluehat: that one did change the processing model, it's now hypermedia, etc
... so if you're just saying I have expected keys or I have a value, etc

aaronpk: there's a place where the actual json structure is expected, which is the microformats2-json (?), which is restricted in its structure in that it has only arrays somewhere
... this is a subset of json, so it may return a microformats 2 json, so

bigbluehat: I'll show you the web annotations spec

<csarven> http://w3c.github.io/web-annotation/model/wd/

<Loqi> [Robert Sanderson] Web Annotation Data Model

<bigbluehat> https://www.w3.org/TR/annotation-protocol/

<Loqi> [Robert Sanderson] Web Annotation Protocol

<bigbluehat> application/ld+json;profile="http://www.w3.org/ns/anno.jsonld"

bigbluehat: it looks like that ^
... the other issue is like with hal you want application/json etc, or you say no I'm a hal client, give me the links
... profile situation you're still operating as json so you can say this is what it means / conforms to
... but if user didn't bother to look this up it can still be treated as json successfully
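As a hypothetical client-side illustration (nothing here is required by Micropub today), a consumer could note the profile parameter when it is present and still fall back to ordinary JSON handling:

```python
# Hypothetical: read the media type and optional profile parameter, then treat the
# body as plain JSON either way.
import json
from email.message import Message

def parse_response(content_type_header, body):
    msg = Message()
    msg['Content-Type'] = content_type_header
    media_type = msg.get_content_type()     # e.g. "application/ld+json"
    profile = msg.get_param('profile')      # e.g. "http://www.w3.org/ns/anno.jsonld", or None
    return media_type, profile, json.loads(body)

print(parse_response('application/ld+json;profile="http://www.w3.org/ns/anno.jsonld"',
                     '{"type": "Annotation"}'))
```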

tantek: do we have any implementations that want to be content negotiating?

aaronpk: there's nothing in micropub that can/does do content negotiation?

bigbluehat: so minting another media type is hard

<aaronpk> https://github.com/w3c/Micropub/issues/55

aaronpk: and this proposal is to do another new media type, but that's not the main issue, so if there's another way to do it, that would be good

bigbluehat: if he can use application/json + profile...

tantek: is there anything else they can look at; his use case is "I want to quickly determine if I made an error"

bigbluehat: it's also the right way to do versioning
... you have the option of issuing a new profile when you send a url

tantek: is this worth a normative change that breaks open the CR?
... if it's a SHOULD it's a normative change... maybe make it a note

bigbluehat: I've had experience that it's a MUST that will break open CR
... we define in our most recent work that our application/ld+json(?) + profile....

tantek: that's the thing: he wants to use it for quick error verification, so he can't rely on it for his use case, I'm not sure what the value is

bigbluehat: beyond the first post he goes deeper into error reporting

aaronpk: there's an error responses section

tantek: if that point if a client is making this request they've already read the spec

bigbluehat: he may be referring to it, but there's a registered media type for a json shape that looks like that or really close

aaronpk: this feels like overkill to me, because at the point that you're talking to a MP server you know you're working with a MP server

tantek: I'm going to call out the versioning point, I'm getting a consensus that we don't need to make any changes for this version of micropub, so part 1: let's resolve on that if there are no objections, to close this issue with no changes for this version of micropub

csarven: the successful one doesn't do it, so why should it do anything different

aaronpk: I think he was pointing out that the successful ones do do it, so

tantek: maybe leave this issue open for a future version?
... and maybe have a way to have a micropub 1.1 server to distinguish its responses

aaronpk: right because at that point you know what version of a micropub server you're talking to

bigbluehat: if you don't start now, it's a guess

tantek: if there are new versions you can make it a MUST that they say it's a new version

bigbluehat: right with the caveat that json clients will fall back to application/json, so if they don't get the profile they'll fall back to version 1 (?)

aaronpk: that's worth calling out in a new version, if there is one

tantek: that also has the nice side effect of buying us time for finding out what that would mean
... it sounds like json-ld contexts for that?

bigbluehat: yes that seems to be what's happening

tantek: so in the future, if that catches on, we might have better guidance

<bigbluehat> as an aside, here's the application/vnd.error+json specification https://github.com/blongden/vnd.error

aaronpk: I want to add to the notes for this that doing the json type is what oauth does

<rhiaro> scribenick: rhiaro

cwebber2: The versioning thing might be something relevent to all specs if we end up taking this path
... Talking to the pump.io people about how they are going to migrate, maybe we should have AP things put a header or something that indicates the version

tantek: I thought discovery was different

cwebber2: yeah discovery uses a different media type, that might be sufficient, just thinking briefly
... maybe later on, and we can discuss when we get to AP, have a general discussion about what to do in the group
... Or decide that if in the future we have new versions putting a must for a version number solves it
... That was just the first point I wanted to make
... But the second thing I wanted to say is we started to say a resolution but we didn't capture it

<scribe> scribenick: cwebber2

RESOLUTION: We're not going to make any changes, stick with application/json, but add a note about consideration for future versions, esp if there are incompatible other changes that a mimetype would help with. If there are conventions in the future more specific we could follow that. (Regarding issue #55.)

tantek: if we have a general approach to versioning for our specs that would be good to discuss... if there are changes to pull into micropub, we can cross that bridge when we get there
... ok to move forward?

aaronpk: yes, can I close issue even though commenter has not replied?

tantek: I think you should provide commentary from the group with that explanation and say

sandro: ... "if that's good enough can we close this issue"?

tantek: yes
... if there's still an issue then, we can bring up at next telecon

<bigbluehat> here's RFC6906 which defines the "profile" Link relationship and the profile="" media type parameter discussed just now: https://tools.ietf.org/html/rfc6906#section-3.1

<tantek> I'd like to give ActivityPub the ability to do the right thing for ActivityPub since it is still a WD, and then if there's anything from that that we need to pull back into Micropub we can cross that bridge when we get to it.

<bigbluehat> In sum: "The objective of profiles is that they allow instances to clearly identify what kind of mechanism they are using for expressing additional semantics, should they follow a well-defined framework for doing so"

tantek: I think that makes all your issues awaiting commenter?

aaronpk: #54 commented this morning... great

<tantek> https://github.com/w3c/Micropub/issues/54

aaronpk: this issue was about when querying the micropub endpoint for ? properties of the post, if it doesn't exist it currently errors, this says we should use 404, but I'm arguing against that
... if it replies with http 404 it says not found
... so 400 bad request I think catches that case
... and the actual text in rfc2068 about http response codes would actually forbid using 404

sandro: can you back up and say how we got to this point?

aaronpk: yes, part of micropub involves doing a GET request, which gives you a microformats 2 json response
... if it doesn't exist, it will be 400, and say "not found"

sandro: it should probably give the thing...

aaronpk: it probably didn't

sandro: if you got a 400 vs 404 you might want to convey that....
... conceptually I think you should make it clear you're acting as a proxy

bigbluehat: 400 is the better one to use because it does malformed request syntax, etc...

sandro: I completely agree that 400 is the right thing, 404 is wrong, I was digging a side issue to explain it

bigbluehat: 404 is cacheable by default, so you could cache that your endpoint is gone, even though it's actually something farther out

aaronpk: http 404 would be terrible because it would be handwavey and actually cause failures

sandro: it's about query parameters...

bigbluehat: what if the resource is one hop away...

aaronpk: I will note that he added a comment this morning
... a lot of other tech doesn't use http error codes at all

tantek: any objection to closing this issue without change?

bigbluehat: the one problem was that you said "not found" despite using a different request, so I think that's what tripped him up

sandro: "indirect resource not found" or something

bigbluehat: I like "the guy behind me not found"

aaronpk: there are two parameters using this request, one is q=source, the other is url=blah
... so source not found, that seems to make it explicit and not be likely to be confused
... so suggestion to close this issue is to change error code to "source not found"
... is that an ok change to make?

sandro: that's a magic string in the code?

tantek: so this is a breaking change?

sandro: I'd say put this on a list for "if we go to CR do this, otherwise..."

aaronpk: I think that making this change is nice, but it's maybe not worth it

tantek: but it would also require updating implementations

aaronpk: so there are no test results in my repo of test results, but cweiske has started to collect some on the indieweb wiki
... interestingly, none of the implementations appeared to support q=source at all
... mine implements it, but mine isn't open source, so

tantek: does another one implement it?

aaronpk: I think so, but think it wasn't open source

tantek: another way to look at it would be, if we got horizontal review from an http working group would we get feedback like "fix this, you must fix it to continue"
... if that's the case this is the chance you get to fix it

<rhiaro> scribenick: rhiaro

<cwebber2> scribenick: cwebber2

aaronpk: right now the spec does not require the client do anything with these errors, so...

bigbluehat: they currently have the same value as their description more or less
... there's no processing expected beyond that right

aaronpk: right most of the actions the client would take are based on the http code, like forbidden vs post is not found

bigbluehat: one thing is that 400 has two potential values

aaronpk: so this is the only http response defined that has 2 potential string values; the case of the source not found is descriptively covered by the first one, "invalid_request", which technically covers "this doesn't exist"
... since we weren't telling clients to do anything different anyway

tantek: and dropping the string wouldn't change implementations, right

bigbluehat: I would make an editorial note to say strings using status codes from rfc, 400 Bad Request, and then say "this is the magic string"

sandro: you don't say what to do if you don't get those strings, probably say MUST ignore, but...
... why would a machine even care
... what would happen if an existing implementation already has one of these, and sends it to someone else
... so I suggest you add an editorial comment explaining what we always intended, which is fall back to invalid request
... so fall back to using numeric code

bigbluehat: from a testing perspective this whole section is a MAY

sandro: if an error code is returned, it MUST....
... if someone sends you an error code that isn't that string, it's a MUST
... they're okay by leaving it out, or by using one of these 4 strings

<ben_thatmustbeme> Now I remember. I had started with something like q=source but had switched over to just fetching the object from html since no one had q=source support at the time

tantek: you're making a conformance change but it doesn't break any existing implementations, which we can explain to the director

bigbluehat: the question is, now that we've hit it, is how extensible is this space

tantek: if it's open ended, you don't need to deal with it

<aaronpk> PROPOSED: Close #54 by dropping "not_found" from the list of error codes because that case was already covered by "invalid_request", and add a sentence saying how to handle unexpected error codes, and add a header to the bullet list of error codes to indicate this is the list of error strings defined by the spec

csarven: would this change make it into the changelog?

tantek: I've made that request yes

<sandro> sandro: and we're explicitly not saying how other error string values get their meaning, or establish shared meaning. We're not going to do a registry of these things.

<sandro> (agreement)

<sandro> (I'm not thrilled, but this doesn't seem worth the effort)
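Sketching the client behaviour this implies (hedged: the query and the set of error strings below are illustrative; the authoritative list is the bullet list in the spec, and the body shape follows the OAuth-style error object mentioned earlier in the minutes):

```python
# Hypothetical client handling of a Micropub error response: use a recognised error
# string when present, otherwise fall back to the numeric HTTP status code.
import requests

# Illustrative set; the authoritative list is the one in the spec's error section.
KNOWN_ERRORS = {'invalid_request', 'unauthorized', 'forbidden', 'insufficient_scope'}

resp = requests.get('https://example.com/micropub',
                    params={'q': 'source', 'url': 'https://example.com/no-such-post'},
                    headers={'Authorization': 'Bearer xyz'})

if resp.status_code >= 400:
    error = None
    try:
        error = resp.json().get('error')
    except ValueError:
        pass
    if error not in KNOWN_ERRORS:
        # Unknown or missing string: ignore it and act on the status code alone.
        error = None
    print(resp.status_code, error or 'HTTP %d' % resp.status_code)
```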

aaronpk: state of the test suite is I've listed the tests I have to write out

<aaronpk> https://github.com/aaronpk/micropub.rocks/issues

aaronpk: here's the list of tests to write
... what I have so far is I have the framework for someone interacting with these tests
... that's all ready to start actually writing the functionality of each test

tantek: so you have a plan but have to write the tests
... do you have a rough idea when?

tantek: when do you think you'll do it?

tantek: how about by the 4th

<bigbluehat> From the earlier topic, here's how OAuth2 defines it's error "magic string" space (and extensibility) https://tools.ietf.org/html/rfc6749#section-8.5

aaronpk: let's log that as our status on that...

tantek: do you have implementation reports?
... when do you think you can have the implementation report ready?

aaronpk: what's more important

tantek: accurate tests are important
... do you also want to try to get that template by the 4th?

aaronpk: no

tantek: week after?

aaronpk: yes

<ben_thatmustbeme> aaronpk, I can probably help with the template too

<aaronpk> https://indieweb.org/Micropub/Servers

<aaronpk> https://indieweb.org/Micropub/Clients

aaronpk: cweiske has been doing implementation reports for clients and servers
... these are the open source implementations he's been looking at, he's been testing out some features

tantek: also a good example of a summary, which we don't have for our projects

aaronpk: he's also checking for specific properties of h-entry or other properties
... so he's being more thorough in some ways, and not as much in others, but he's also only checking open source implementations

(discussion about, what do the links mean?)

tantek: I'm mentioning that since there aren't implementation reports, this helps us go to CR
... theoretically at that telecon ask the group to go to PR

aaronpk: it's definitely how I'm going to be influenced to create the test suite too, it's nice to be able to share the test stuff

sandro: so I showed earlier working group stuff, and I was joining at CR, and I did test results and went further than needed, I felt like there was a nice feedback loop of people seeing their results as their feed, which they liked *anecdote*

<sandro> static snapshot of that output: https://www.w3.org/2003/08/owl-systems/test-results-out

aaronpk: he does have media endpoint on the list, there's less implementation now, my clients and servers support it, but that's one more thing where we need to get implementation on the server

tantek: this is good, we don't have anything like this for webmention do we?

aaronpk: no

tantek: for as2, I think we don't either?

aaronpk: I don't think so...

tantek: okay, well it sounds good and you gave us dates and etc
... having a test suite with list of features, we can re-evaluate on oct 11th telecon on where we are

sandro: we might consider expecting that to be an extra long telecon?
... 90 minutes at least?

tantek: good idea to look into
... does anyone object to extending the telecon on the 11th to 90 or 120 minutes?
... we have about 25 minutes before break / AC meeting

sandro: I'm skipping the AC meeting, will go to i18n

tantek: is there anything left for schedule, such as activitypub next steps, that we could start looking at

cwebber2: we could start looking at activitypub early?

<sandro> scribe: sandro

ActivityPub

<cwebber2> https://github.com/w3c-social/activitypub/issues

only one substantial, I think

https://github.com/w3c-social/activitypub/issues/108

cwebber2: I don't think that's normative, is it?

tantek: Were you thinking of adding the security question answers?

cwebber2: This doesn;t affect interop

aaronpk: Plenty of documentation about this

tantek: security considerations aren't normative

sandro: yeah, this isn't 2119 "should", it's a more general thing

cwebber2: we do have security considerations, but I'm not sure if I got "non-normative" labels right.

tantek: now is the time to be making all your last-minute normative changes before you go to CR

<AnnBass> me has to leave for AC meeting; am facilitating the first discussion

tantek: and you should label every section non-normative that doesn't have normative content

subtopic: https://github.com/w3c-social/activitypub/issues/107

"source" field #107

cwebber2: This is a problem when the HTML is produced from something else, e.g. markdown... I'd like to add that?
... the source will not be rendered by the client, but it'll be carried along

aaronpk: clients that support editing MUST work on source?

cwebber2: optional, it's a MAY

rhiaro: source might get out of sync

cwebber2: I don't really care; people are probably using the same client to edit.
... the CLIENT converts; the server never has to understand the source
... this is what happens in clients currently -- they do markdown-to-HTML, then lose the markdown

rhiaro: I'd have client go to & from html

cwebber2: But I want emacs orgmode, where the client can't convert from HTML

aaronpk: So what happens if someone edits the HTML, using another client?

cwebber2: then you delete the source

aaronpk: I'd like to see all the cases considered.
... in Micropub, the server is the final authority on the content, and clients are expected to deal with HTML, or, if they don't understand the syntax, present it to the user as text/plain.
... It might be orgmode or markdown or something.

tantek: show us in spec?

aaronpk: it's not written down in a lot of detail
... the motivation/expectation is that the person with the Micropub server knows what the original content should be, and they'll be using multiple clients that don't know what the user wants.
... rather than having the clients know lots of formats

cwebber2: this is useful when editing your own posts
... maybe if you like seeing the original markup / sourcecode in some way
... not everyone's going to be writing in plain text

aaronpk: If the client doesn't understand format, then treat as plain text
... I avoid markdown because it's not standard

<bigbluehat> in other news text/markdown is now a Thing: https://tools.ietf.org/html/rfc7763

sandro: I think you can make this work, by prompting the user, and maybe refusing, in some cases

<aaronpk> oh boy, which markdown is this?

<bigbluehat> aaronpk: see the `variant` parameter

cwebber2: HTML to other formats is hard and error-prone
... that's not good enough for me
... or we could let the server handle it, but then I can't do org-mode!

sandro: if you can't understand the source, you must prompt the user and maybe delete the source

aaronpk: if the server gets content without source, it must delete the source

<bigbluehat> oh. and here's another bit of RFC goodness that defines what to do with what might be inside a text/markdown response body: https://tools.ietf.org/html/rfc7764

sandro: but client must prompt user before losing source

aaronpk: yes

rhiaro: I'm going to hate this. The client I make won't want to deal with source.

tantek: Medium provides a nice editor that seems to edit HTML and sends it back to the server

rhiaro: But every client has to add a whole user interaction around this

<tantek> PROPOSED: Add "source" field feature to ActivityPub per issue 107

+1 with the caveats above about clients never losing or corrupting data or getting out of sync without human approval

aaronpk: I'm not thrilled with this architecture. I want the server to be authoritative.

cwebber2: this is more like the state of the world in AP

<rhiaro> +1 at risk though I'm a bit spooked about having to build user interaction if a source is found because I always only want to handle html content

<Loqi> [@bigbluehat] "caffeinated" is a personal "At Risk" feature right now at #TPAC2016 ...time for a break @SocialWebWG #amiright?! (http://twtr.io/1HHiBjiurwt)

cwebber2: in Pumpa and Dianara, the clients do the conversion, not the server.

<cwebber2> +1 at risk

<tsyesika> +1

(my +1 is at risk)

<csarven> +0 add it and see what breaks/works ;)

<aaronpk> +0 with the addition of servers being required to drop source if an update was made with HTML, and recommending that this destructive edit be prompted to the user

<bigbluehat> +0 on the feature; +1 on the "at risk"-ness of it

tantek: I'm not sure we have consensus around any one design here

<csarven> MIME? Is that still around

tantek: so please take this to the issue discussion
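
(Editorial illustration, not shown at the meeting: one possible shape for the proposed "source" field from issue 107, carried alongside the rendered HTML of a Note. The content/mediaType sub-properties and the Markdown example here are assumptions; the exact design was left to the issue thread.)

    {
      "@context": "https://www.w3.org/ns/activitystreams",
      "type": "Create",
      "object": {
        "type": "Note",
        "content": "<p>Hello <em>world</em></p>",
        "source": {
          "content": "Hello *world*",
          "mediaType": "text/markdown"
        }
      }
    }
    (illustrative sketch only; shape under discussion in issue 107)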

<tantek> ben_thatmustbeme: could you present+ yourself?

<ben_thatmustbeme> sorry i forgot that tantek

<KevinMarks> Medium wrote about content editable and their editor

<aaronpk> https://hangouts.google.com/call/vgg2rqyvnzge7lv76rthuhzk4ae

<KevinMarks> https://medium.engineering/why-contenteditable-is-terrible-122d8a40e480#.b1nyq5dyz

<rhiaro> scribenick: sandro

i18n

(introductions)

Addison

Amy Guy

Aaron

Sarven

Sandro

Felix

r12a

(missed two people whose names I couldn't spell)

<addison> http://w3c.github.io/i18n-activity/reviews/

<r12a> https://github.com/w3c/Micropub/issues/39

<addison> https://github.com/w3c/Micropub/commit/82a49a3fa6ff6b19923344eae1288bf367f3b2bf

addison: that looks okay

RESOLUTION: close https://github.com/w3c/Micropub/issues/39 with everyone happy

aaronpk: that was my only still-open micropub one

<aaronpk> https://github.com/w3c/webmention/issues/57

on webmention:

aaronpk: "no language support"
... wm is a server-to-server protocol. In normal operation the response body is never seen.
... it only comes up when people are developing / debugging
... some developers never realized there was a response body
... talking about it today, we're curious about for error responses, is there any typical guidance?

addison: Several classes of things have occurred
... in past standards
... IETF has "i-default"
... not a very global-friendly thing
... we generally look at: if you're going to exchange natural-language text, you should include an indication of the language
... so it's a good idea to provide language information if it's available
... for APIs that interact with users, language negotiation is good
... so the server can respond with the language the user wants
... we're not ultra-concerned

sandro: can we just use http header?

addison: that's what I recommended

aaronpk: we're planning to change the example to not include a body, because there's no functionality in having a body
... since we're not recommending that
... and adding a note explaining what implementations have done.
... and saying some endpoints, when the request comes from the browser, give a full HTML response with all the negotiation
... so include something about using HTTP best practices around Accept-Language

addison: example would just be HTTP 201 Created

aaronpk: do we want to remove the specific recommendation of returning human-readable text?

addison: if you take out human readable, we wouldn't care very much

r12a: Content-Language can have multiple languages, though, so maybe it's not ideal

addison: although that's not best practice

r12a: if you happen to have multiple languages, it could be a problem

sandro: sounds like: if you include a body, you should include a content-language
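
(Editorial illustration, added for the minutes: if a Webmention endpoint does return a human-readable error body, it could label the language roughly like this; the status code, paths, and wording below are hypothetical, not from the spec or the meeting.)

    HTTP/1.1 400 Bad Request
    Content-Type: text/plain; charset=utf-8
    Content-Language: en

    The specified target URL does not accept Webmentions.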

aaronpk: in practice, there's usually very little information returned from the API, to reduce attack vectors

addison: when running in production

aaronpk: in 3.2.3 error responses

addison: when the server is down, you probably don't have a lot more information. It's nice to do i18nish things, but whatever.

aaronpk: sending to a target URL that doesn't exist

addison: that's okay
... can leave that section alone
... We'd have nothing to comment on if there's no example there.
... I don't know what else you'd put in a response body

aaronpk: Some return a data dump, some have an English sentence, etc
... none of it affects interop

addison: cool

aaronpk: Noting in issue....

<aaronpk> https://github.com/w3c/webmention/issues/57#issuecomment-248931056

"Notes from discussion with i18n:

Remove example english text from response body
Don't include "bad examples" of returning English without returning a language header
Error response section does not need an i18n recommendation because it does not suggest any response body
r12a: we don't have 167 marked as green

addison: your change will get rid of 167 because there's no longer a text/plain

aaronpk: In the POST, the body is form-encoded URLs

addison: we were just responding to your response examples
... these are just URLs, it's fine
... don't include charset with form-encoded; charset goes with text/plain
... that's why you MUST pre-define that this is UTF-8, because there's nowhere in the protocol to say that
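
(Editorial illustration: the Webmention request body in question is just two form-encoded URL parameters, which is why the spec pre-defines UTF-8 rather than relying on a charset parameter. The endpoint path and URLs below are hypothetical examples.)

    POST /webmention HTTP/1.1
    Host: example.org
    Content-Type: application/x-www-form-urlencoded

    source=https://example.com/post-by-alice&target=https://example.org/post-by-bob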

https://github.com/w3c/webmention/issues/56

<r12a> https://github.com/w3c/webmention/issues/56

aaronpk: we just covered this

<ben_thatmustbeme> sandro: woohoo

sandro: we'll be sending you two more specs right away, and two more soon-ish

cwebber2: ActivityPub is unlikely to have much i18n impact, because it's mostly just a user of AS2

<cwebber2> https://www.w3.org/TR/activitypub/ is activitypub

<Loqi> [Christopher Allan Webber] ActivityPub

csarven: In Linked Data Notifications (LDN) it's just HTTP

<cwebber2> https://linkedresearch.org/ldn/ is linked data notifications

<cwebber2> https://www.w3.org/International/techniques/developing-specs

addison: Give us URLs and maybe we can take a quick glance, ... or you can look at our list
... And then let us know when you've done that

aaronpk: on json....?

addison: No charset for JSON; it's defined as UTF-8

<addison> https://www.w3.org/International/

addison: On our homepage is a huge box on how to request review.

<r12a> https://www.w3.org/International/review-request

addison: mostly it means send email.

sandro: so review is likely to go more smoothly if we've done the checklist?

addison: generally, but not everything is clear from the checklist

aaronpk: just to clarify, including charset with json is wrong?

addison: that's right, don't do it.
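
(Editorial illustration of the point above, not shown in the room: the first header below is the correct way to serve JSON; the second, with a charset parameter, is what addison said not to do, since JSON is defined as UTF-8.)

    Content-Type: application/json                   (correct)
    Content-Type: application/json; charset=utf-8    (don't do this)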

https://github.com/w3c/i18n-activity/issues/205

https://github.com/w3c/activitystreams/issues/354

r12a: It's hanging around because I suggested adding a note saying it's useful to include a language when you're dealing with strings

cwebber2: the normalization algorithm loses it. I see.

https://www.w3.org/TR/activitystreams-core/#naturalLanguageValues

<Loqi> [James M Snell] Activity Streams 2.0

<cwebber2> http://json-ld.org/playground/?startTab=tab-expanded&json-ld=%7B%22%40context%22%3A%22https%3A%2F%2Fwww.w3.org%2Fns%2Factivitystreams%22%2C%22%40language%22%3A%22fr%22%2C%22type%22%3A%22Note%22%2C%22name%22%3A%22Une%20note%20br%C3%A8ve%22%7D

<cwebber2> http://json-ld.org/playground/#startTab=tab-nquads&json-ld=%7B%22%40context%22%3A%5B%22https%3A%2F%2Fwww.w3.org%2Fns%2Factivitystreams%22%2C%7B%22%40language%22%3A%22fr%22%7D%5D%2C%22type%22%3A%22Note%22%2C%22name%22%3A%22Une%20note%20br%C3%A8ve%22%7D
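
(Editorial note, decoded from the two playground links above for readability. The first puts @language at the top level of the object, which appears to get lost on expansion/normalization, which seems to be cwebber2's point; the second puts @language in the @context, so the language tag survives.)

    {
      "@context": "https://www.w3.org/ns/activitystreams",
      "@language": "fr",
      "type": "Note",
      "name": "Une note brève"
    }

    {
      "@context": ["https://www.w3.org/ns/activitystreams", {"@language": "fr"}],
      "type": "Note",
      "name": "Une note brève"
    }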

r12a: In "When using [JSON-LD] mechanisms to produce or consume Activity Streams 2.0 documents, the @language property MAY be used " ... we'd expect SHOULD there

sandro: I think the MAY is about which way you provide lang, not whether you provide lang.
... so maybe somewhere at the start of 4.7 we can say "You should put the language information in there somewhere"

addison: that's what we'd be looking for

sandro: so Example 16 is bad....

r12a: we at one point asked if you could put language in every example
... but didn't insist.

sandro: anyone want to speak for AS2?
... I think it'd be fine to make these editorial changes

cwebber2: It is kind of distracting to have it in every example

addison: Maybe state that we omitted it from examples, with a ...

sandro: the At Risk phrasing is very confusing

addison: it can be hard to convince people to implement

cwebber2: there's a possible foot-aimed-gun, with developers just hardcoding "en".

addison: SHOULD helps with that, MUST tends to cause that more
... I understand some developers aren't terribly motivated

sandro: Our concern is developers might then just not use AS2

addison: we understand...
... you could provide a more elegant way to specify the language, but that would be a different pain.

cwebber2: that's what @language without the { } is
... it gets lost in RDF-land

sandro: it seems reasonable TO ME to update many/most examples to be like Example 19, AND to add a note explaining the importance of including language information, e.g. around Example 16.
... but there may be other views in the WG, and implementor community

r12a: yes, we'd be happy with that
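
(Editorial illustration, added to the minutes: the per-language map form from AS2's natural-language-values section looks roughly like this; the specific strings are made up, and "nameMap" is the AS2 property for language-tagged names.)

    {
      "@context": "https://www.w3.org/ns/activitystreams",
      "type": "Note",
      "nameMap": {
        "en": "A brief note",
        "fr": "Une note brève"
      }
    }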

rhiaro: thinking about Social Web Protocols...
... Any examples are going to use AS2

sandro: doesn't MF2 have the same problem?

aaronpk: yes?

rhiaro: why hasn't this been noticed before?
... so the examples in the MicroPub spec that are in English?

https://www.w3.org/TR/micropub/#new-note

<Loqi> [Aaron Parecki] Micropub

(example of posting some natural language text, with no language indicator)

Micropub is using microformats2 (MF2), and MF2 doesn't currently handle lang

rhiaro: so, this was swept under the rug, and now we've noticed. What do we do about it?

bigbluehat: there is a proposal to add lang to MF2, but it's not mature

rhiaro: Does MP have to resolve this dependency on MF? Can MP proceed without this problem being solved?

addison: it's not exactly your problem that you based MP on MF2, but it is a problem for the internationalization community. Maybe some day we tackle i18n for MF2.

<tantek> for reference: https://github.com/microformats/microformats2-parsing/issues/3

<tantek> input welcome!

<addison> I think we are coming to the conclusion that we (I18N) might ought to do a review

<tantek> I think that would be welcomed, certainly speaking for myself. Appreciate the consideration!

<addison> https://www.w3.org/International/wiki/ContentMetadataJavaScriptDiscussion

<r12a> https://www.w3.org/International/questions/qa-lang-why

(from earlier)

aaronpk: we can handle bidi and non-English text, it's just not labeled
... best plan: write a note in Micropub saying we recognize this doesn't do lang, please see the MF2 issue, and when that gets updated it will automatically be incorporated by reference.

sandro: yep, sounds fair. References to things that are updated in place are well known, if a bit challenging. For example: Unicode.

addison: You don't want a social web protocol that references Unicode 7 and therefore doesn't have the newer emoji!

<rhiaro> can't have a social web without emoji

<aaronpk> lang=emoji

<rhiaro> o.O

❤️ emoji >

<aaronpk> http://www.bbc.com/future/story/20151012-will-emoji-become-a-new-language

<rhiaro> 💩

❤

🍔

<tantek> 👥🌐

<tantek> https://socialwg.indiewebcamp.com/irc/social/2016-09-22/line/1474561478592 😃

<Loqi> [tantek] 👥🌐

<rhiaro> http://💩.amy.gy

<rhiaro> aww loqi, let down

<aaronpk> rhiaro, needs h-entry

<rhiaro> FIN

<csarven> http://💩.csarven.ca/

<rhiaro> trackbot, end meeting

Summary of Resolutions

  1. accept aaronpk's proposal to close issue 48
  2. We're not going to make any changes, stick with application/json, but add a note about consideration for future versions, esp if there are incompatible other changes that a mimetype would help with. If there are conventions in the future more specific we could follow that.
  3. We're not going to make any changes, stick with application/json, but add a note about consideration for future versions, esp if there are incompatible other changes that a mimetype would help with. If there are conventions in the future more specific we could follow that. (Regarding issue #55.)
  4. close https://github.com/w3c/Micropub/issues/39 with everyone happy