See also: IRC log
<sandro> scribe: sandro
eprodrom: It'd be nice to have a going-to-CR checklist
aaronpk: Yesterday we discussed
and resolved all the issues
... I'll go ahead and make the required changes
... the big things to do are
... - writing the impl report template
... - some sort of tool for testing
<eprodrom> https://www.w3.org/wiki/Socialwg/CR-checklist
sandro: thoughts on that?
aaronpk: two tools, for clients
and servers
... the test server will be something you can use in testing
your client
sandro: so it'll be like your current blog but much more strict about the protocol
aaronpk: i'll do it at some URL,
then things will disappear after 48 hours or something
... to test your own server, I'll make a website act like a
client -- posting things to your server.
... it'll go and create a note, then try to edit it and make
sure it's right
sandro: but you can't GET to see
if it did the right thing....
... it'd be nice to get Accept: json to see the post data
aaronpk: so that should do it for
testing.
... - I'll check the normative references
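The client-testing flow aaronpk describes (create a note, then try to edit it and check the result) comes down to requests like this minimal Micropub create. A sketch only: the endpoint and token are placeholders, and it assumes the form-encoded `h=entry` syntax.

```python
import urllib.parse
import urllib.request

def micropub_create_body(content, name=None):
    """Build the form-encoded body for a minimal Micropub h-entry create."""
    props = {"h": "entry", "content": content}
    if name:
        props["name"] = name
    return urllib.parse.urlencode(props)

def micropub_create(endpoint, token, content):
    """POST the entry; a conforming server replies 201 with a Location
    header pointing at the new post, which a test tool could then try
    to update and re-fetch."""
    req = urllib.request.Request(
        endpoint,
        data=micropub_create_body(content).encode(),
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.headers.get("Location")
```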
tantek: publish an updated WD first, please
eprodrom: So we've closed our issues, so the resolution will be ....
sandro: Before going to CR there should be evidence of wide review
aaronpk: IWC discussion
sandro: it'd be good to get it
beyond that
... evidence is usually random public comments
<eprodrom> PROPOSED: publish new editor's draft of Micropub including changes as per resolutions on 6/6 as new working draft
+1
<tantek> +1
<eprodrom> +1
<aaronpk> +1
<cwebber2> +1
<rhiaro> +1
RESOLUTION: publish new editor's draft of Micropub including changes as per resolutions on 6/6 as new working draft
<cwebber2> how about grievous endorsements
tantek: Let's email public-review-announce@w3.org saying we're at zero issues
sandro: we can do chairs@w3.org
and horiz review
... Doing the Transreq at the same time as AS2 could be
nice
https://www.w3.org/wiki/Socialwg/AS2_CR
https://www.w3.org/wiki/Socialwg/Webmention_CR_Transition_Request
eprodrom: Should micropub include some design rationale to help address questions about how this relates to JSON-LD, AS2, etc
rhiaro: The objection is likely to be that mp uses the mf vocab not the as2 vocab
eprodrom: yes, there's that, too
tantek: feels like FAQ
... There's evidence there's been work on convergence
... The group is aware there's different vocab approaches at
work. And has converged them in some places. But in our
parallel approaches work mode, we don't see this as a blocking
issue.
... if there's an implementor that comes to the table and finds
this a blocking issue, we'd want to know.
sandro: parallel between ap and mp
tantek: things like audience targeting
cwebber2: the ap crud stuff has
some addressing happening
... But I think that's future work, largely
sandro: Can I use AS2 with MP
rhiaro: No
aaronpk: If you're using the json format, you're posting the object
cwebber2: can I post a video
rhiaro: MP requires the server to understand h-entry, so no
cwebber2: If you want to just have everything be a side effect....
aaronpk: The tradeoff is whether
the spec is generic and can be used with any vocab or
... leads to interop
... OAuth has this problem. It leaves too much
unspecified
... for interop
tantek: It's an antipattern, where the spec doesn't say enough to make things interop
cwebber2: maybe over the next few months it might be a fun experiment to see how far you can get crossing the vocabs and the pub protocols, but it might get us into trouble
rhiaro: I use aaron's mp clients
and on the server I do a little rearrangement and treat it as
AS2
... That's not a huge burden
... but it's not a complete translation, eg on likes
... it's kind of hacky
... but it doesn't tell a good story of why we don't just do
AS2
... it doesn't tell a story about why the group is bothering
with this
eprodrom: One reason is if we're going to come together to work together for two years on stand...
sandro: that's the other
side
... it sounds like mp has a normative dependency on mf
tantek: we went through this, so
MF has parts which are stable enough to be referenceable
... so that matches our model of living spec work
sandro: so micropub has to only normatively reference explicitly stable parts of microformats
tantek: right
eprodrom: are we on track to CR? are there other bits.
aaronpk: Nope
sandro: So it sounds like we need to be completely upfront about "Two Stacks are Better than Zero Stacks"
eprodrom: Does it make sense to hold up MP to be in sync with AP ?
cwebber2: We don't know what AP needs for CR
aaronpk: I'm worried about that
timeline
... I have a plan for the validators (eg test suite)
cwebber2: I could maybe have AP ready for CR in a month
aaronpk: I'm concerned about waiting for AP when there are unknowns for AP
cwebber2: I think AP (without ASub) I could do it within a month or a month and a half....
eprodrom: I only bring this up because we've talked about this before.
sandro: How about instead we just have each draft, in a big box, point to the other spec: "This is one of two social APIs from the SocialWG, with slightly different use cases and approaches; implementers should check out the other one"
tantek: (wording above)
<aaronpk> "This is one of two client APIs being produced by the working group with slightly different use cases and approaches. implementers should check out and review the other approach here."
tantek: That would greatly help
communicate this to the outside world, yes.
... It helps show that clearly these are clearly from the same
group
<cwebber2> microacts?
<cwebber2> (microactgressions?????)
aaronpk: Conclusion is proceed not in lockstep
eprodrom: Let's have cwebber2 on the MP/AS2 transition call, so he knows what to expect
<aaronpk> and the next WD of both activitypub and micropub will include the section linking to each other
sandro: should be fine, yeah, I think
https://www.w3.org/wiki/Socialwg/AS2_CR
https://www.w3.org/wiki/Socialwg/Webmention_CR_Transition_Request
sandro: maybe we can do an editing session on these later today
<eprodrom> https://www.w3.org/wiki/Socialwg/CR-checklist
<tantek> sandro did you mean like this for Micropub? https://www.w3.org/wiki/Socialwg/Micropub_CR
yes
tantek, re horizontal review: https://www.w3.org/wiki/DocumentReview
seven review groups
<eprodrom> cwebber2: have you ever seen https://github.com/activitystreams/activity-api ?
<eprodrom> scribe: tantek
<cwebber2> eprodrom: huh, I hadn't seen that!
RESUME
30-45 min on this
aaronpk: what is the goal for the end?
eprodrom: if we have any additional work to move forward, and we know what actions we need to do to move forward, or ok if have no actions to do
sandro: would be nice to close open issues
eprodrom: where are we in the process?
aaronpk: we are waiting for
implementation reports
... I can build more tools for testing a webmention
receiver
... I have a couple of issues on webmention.rocks about how to
create tests for this feature in the spec
... rest of issues on wmr are my to do list
sandro: perhaps we can help with
brainstorming those too
... procedurally, in two more weeks we are at the end of our CR
period
... if we've closed all issues, and exit criteria have been
met, then we do another transition meeting
... or can be sometimes done by email
... then at that point we go to Proposed Recommendation, it
gets republished
aaronpk: change CR to PR and hit publish again?
sandro: yes, and there's the drafting the email to the AC
tantek: how long does the AC have to vote?
sandro: four weeks
... assuming that has no formal objection (occasionally there
is), then there's one more transition meeting
... which is often waived if everything goes smoothly, and then
it gets published as a Recommendation
... there is also Horizontal Review
eprodrom: in terms of
implementations, is there value for us to be seeking
implementations from member organizations?
... is it worth our time to look down the list of member
organizations ?
... it may be more compelling case if we have high profile
implementers
... other companies?
... Medium?
sandro: What about WordPress extension?
paul: my company is interested in Micropub and maybe Webmention
eprodrom: there are also comment
SaaS services, like echo, intense?, disqus
... may be worth reaching out to
paul: if we implement the standard in our company, what should I do?
aaronpk: one way is to submit the implementation report
https://github.com/aaronpk/webmention/tree/master/implementation-reports
<rhiaro> paul ^
aaronpk: if you are a member of a group you can join a teleconference and demonstrate
sandro: there is no need to demo, but you can if you want
aaronpk: official way is to submit the implementation report
eprodrom: other tasks?
aaronpk: open issues
https://github.com/aaronpk/webmention/issues
aaronpk: and an email
sandro: I can close the one I opened for folks to voice an alternative "Webmention should use JSON-LD" #47 https://github.com/aaronpk/webmention/issues/47
<aaronpk> https://github.com/aaronpk/webmention/issues/42
aaronpk: one with the most
comments
... main issue is regarding verifying behind paywalls
... if you have a document like a PDF that is restricted, then
create a separate page that the document references, so that
there's an actual page with the document's metadata
... there are a lot of benefits to that
eprodrom: I think the resolution
you proposed makes sense
... there are two specifics here, one is format
... the second is private documents
aaronpk: webmentions for private
documents where the receiver is expected to have a login is
fine and not that much of a challenge
... the difference here is where the receiver may not have
access to the document
eprodrom: is that notification worthwhile?
aaronpk: what is the goal of that notification in the first place?
sandro: kind of harmless
... or maybe you're revealing private information
aaronpk: this is specifically about you need to pay to get access to this journal
sandro: he provides some text,
which I think is unnecessary
... if there is already a trusted relationship, then there's no
need
<sandro> (where "sender" in his proposal should be read as "owner of the source")
aaronpk: I think what he was
getting at is not actually going to work because webmention is
a thin payload
... or I could add something with the suggestion, if you have
restricted / paid access content, you should create a landing
page for that content that is public that has the links
tantek: issue opener asks for that in his last comment
https://github.com/aaronpk/webmention/issues/42#issuecomment-222242255
aaronpk: do you think doing that
will satisfy the commenter and benefit the spec
... not sure where it will go?
tantek: maybe an appendix?
aaronpk: maybe in privacy considerations?
<sandro> PROPOSAL: Close webmention #42, saying we'll include text suggesting landing pages for this kind of paywall scenario
tantek: not really that close to that, that means something else usually
<sandro> +1
+1
<aaronpk> +1
<rhiaro> +1
<eprodrom> +1
<cwebber2> +1
RESOLUTION: Close webmention #42, saying we'll include text suggesting landing pages for this kind of paywall scenario
aaronpk: next issue https://github.com/aaronpk/webmention/issues/44
sandro: maybe just editorial
tantek: how is conformance class editorial?
sandro: no code has to change
aaronpk: not going to change any implementations
sandro: would be easier to
read
... hesitate because maybe other classes of proxy
aaronpk: tantek do you remember when we talked about this?
tantek: proxy receivers cannot conform to the quoted implementation requirement in the issue?
aaronpk: why not?
rhiaro: difference between proxy that requires explicit signup by target, and proxy for any webmention
tantek: my statement is true in either case
aaronpk: the "receiver" does not have to accept the target domain's rules
sandro: maybe we can test for proxy receivers?
aaronpk: maybe same origin distinction?
<sandro> +1 "same-origin" + "proxy" receivers
tantek: I think the intent of this requirement was that the receiver at the target's domain knows that the target is a valid resource, like the page / redirect actually exists
sandro: maybe I want to accept webmentions for all pages, 404s, and use that to learn of bad links and create redirects
tantek: if we are making it possible for any target to be a valid resource then what is the point of this conformance requirement
aaronpk: the point of this
sentence is that receivers should not accept just all
webmentions
... another example is perhaps a paid proxy that receives
webmentions on behalf of others, and if someone's account
expires, then the proxy would stop accepting webmentions on
behalf of the target
sandro: maybe expand on the "valid resource"
aaronpk: I think that's a good way to handle this
<sandro> including, for example: some servers (wm.io) might accept anything, while other endpoints only accept one particular target URL
aaronpk: so I will add a "for example" informative text, clarifying the original meaning of that sentence
<sandro> sandro: This is an editorial change, trying to better express the editor's intent and WG's understanding
tantek: does not change implementations?
aaronpk: no, does not
WG decided it's an editorial change, ok with aaronpk's edits
aaronpk: let's discuss 40
https://github.com/aaronpk/webmention/issues/40
... the issue proves the reason for the link header in the HTTP
headers
... the proposed solutions do not match the issue raised
<sandro> aaronpk: Moving link header to MAY would mean it is no longer possible to do discovery on non-HTML
sandro: the suggestions break interoperability
aaronpk: and they don't back up
the original statement of the problem
... I could add, if the content type is HTML, then look at the
HTML tags
sandro: if it is not HTML then can you look at the body?
aaronpk: the discovery section
does not mention any other document types
... which is fine because the LINK header supports
everything
... the actual phrasing in the spec could be clarified
... so that if the content type is not HTML you should not be
trying to parse it as HTML
eprodrom: from a discovery standpoint, is there a phrasing along the lines of, implementers may ...
sandro: I don't want that may, e.g. if I'm publishing turtle, I don't want to be unsure about discovery
aaronpk: I was trying to use the
more positive phrasing
... non-HTML documents MUST use the LINK header for their
webmention discovery
eprodrom: next year a new
document format comes out and has a linking mechanism and too
burdensome to use the LINK header
... that seems unlikely
aaronpk: the goal is
interop
... more ways to discover = less interop
... the cost being potentially fewer documents that can use
it
... I think we're fine for the current level of things being
published
... and adding this clarification text is fine
... totally up for adding the explicit: non-HTML documents must
advertise using the HTTP LINK header
<sandro> PROPOSED: Close webmention #40 with editorial revision clarifying that one should only look for HTML tag if content is HTML. Non-HTML resources MUST use the HTTP Link header for discovery. Each additional discovery mechanism imposes a cost on every sender, which we want to avoid.
aaronpk: also helps show that the spec has thought things through
<eprodrom> +1
<sandro> +1
<aaronpk> +1
<rhiaro> +1
+1
tantek: in the rare instance we see what eprodrom is talking about, that can be handled by a spec revision
RESOLUTION: Close webmention #40 with editorial revision clarifying that one should only look for HTML tag if content is HTML. Non-HTML resources MUST use the HTTP Link header for discovery. Each additional discovery mechanism imposes a cost on every sender, which we want to avoid.
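A rough sketch of the discovery rule just resolved (the HTTP Link header always works; HTML tags count only when the content actually is HTML). The parsing here is deliberately loose and illustrative; a conforming client needs a real Link-header parser and an HTML parser.

```python
import re

def discover_webmention_endpoint(link_header, content_type, body):
    """Find a webmention endpoint: Link header first, then HTML
    <link>/<a> tags, but only when the response is HTML."""
    if link_header:
        # Naive comma split; rel values can be a space-separated list,
        # so a real client should parse per RFC 5988/8288.
        for part in link_header.split(","):
            m = re.search(r'<([^>]*)>\s*;\s*rel="?([^";]*)"?', part)
            if m and "webmention" in m.group(2).split():
                return m.group(1)
    if content_type and content_type.split(";")[0].strip() == "text/html":
        # Regexing HTML is fragile; shown only to illustrate the rule
        # that non-HTML bodies are never scanned for tags.
        m = re.search(
            r'<(?:link|a)\b[^>]*rel="?webmention"?[^>]*href="([^"]*)"',
            body)
        if m:
            return m.group(1)
    return None
```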
sandro: 46? https://github.com/aaronpk/webmention/issues/46
aaronpk: it may potentially cut down on abuse
eprodrom: an abuse is someone sends webmentions where the source is a 404
aaronpk: or the source is a giant video file
sandro: how is a HEAD request better?
aaronpk: because if I get a content type of video I can ignore it
sandro: spec?
aaronpk: spec says what to do
with different media types
... they're all examples, intentionally left open for other media
types
... e.g. if there's a way you can find a link in a PDF, you can
send a webmention with a source of a PDF
sandro: the verifier should put an ACCEPT header that says what media types they can verify links in
aaronpk: oooh
sandro: what about size?
aaronpk: spec says something
about that for HTML
... it's not required to implement a limit, but if they do,
they would only fetch the first megabyte
sandro: how would you do
that?
... dropping the connection after 1MB and then 100MB is still
in the pipe? or a range request
... not sure how many support range requests
aaronpk: if you do end up downloading, you can only parse first 1MB
sandro: OK with MAY; some techniques include setting the right media types on your Accept header, and aggressively closing the connection if it's a media type you don't know what to do with
tantek: is ACCEPT header in the spec?
aaronpk: that's worth adding to
the spec
... is this guidance?
... adding to the limit on GET requests?
sandro: yes
aaronpk: perhaps the receiver SHOULD provide an ACCEPT header of the media types they accept
sandro: do we have a DDOS vulnerability here? kind of off-topic
eprodrom: it sounds like our suggestion is to use ACCEPT header ...
aaronpk: you can still do a HEAD
request if you want
... I could put it in security considerations
eprodrom: yes
tantek: is the ACCEPT header a should?
aaronpk: that would affect every implementation
sandro: I would say SHOULD if we
weren't at this point in the process
... equal weight on all the Accept header values is important; don't
put any numeric values on them. Equally weighted.
aaronpk: this is related to HTTP
tantek: because you call out specific content types it would be good to note how that works here
<eprodrom> PROPOSED: Add text to security considerations for Webmention to suggest using HEAD request during verification, AND add text to Verification section to suggest using Accept header
<eprodrom> PROPOSED: Add text to security considerations for Webmention to suggest using HEAD request during verification, AND add text to Verification section to suggest using Accept header closing issue #46
<sandro> not "suggest using HEAD" but "clarified that it is allowed to use HEAD"
<sandro> +1
<aaronpk> +1
<cwebber2> +1
+1
<eprodrom> PROPOSED: Add text to security considerations for Webmention to clarify that it is allowed to use HEAD request during verification, AND add text to Verification section to suggest using Accept header, closing issue #46
<cwebber2> +1
RESOLUTION: Add text to security considerations for Webmention to clarify that it is allowed to use HEAD request during verification, AND add text to Verification section to suggest using Accept header, closing issue #46
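One way to read that resolution in code, a sketch assuming a receiver that can only verify links in HTML-family documents: HEAD first to reject expensive sources cheaply, then GET with an Accept header naming the verifiable types, reading at most ~1 MB. Function names and the type list are illustrative, not from the spec.

```python
import urllib.request

VERIFIABLE_TYPES = ("text/html", "application/xhtml+xml")
MAX_BYTES = 1024 * 1024  # read at most ~1 MB, echoing the spec's HTML note

def can_verify(content_type):
    """True when this receiver knows how to look for links in the type."""
    return content_type.split(";")[0].strip().lower() in VERIFIABLE_TYPES

def fetch_source(source_url):
    """HEAD first so a giant video file can be skipped without
    downloading it, then GET with an Accept header naming only the
    media types we can actually verify, truncating the body."""
    head_req = urllib.request.Request(source_url, method="HEAD")
    with urllib.request.urlopen(head_req, timeout=10) as head:
        if not can_verify(head.headers.get("Content-Type", "")):
            return None
    get_req = urllib.request.Request(
        source_url, headers={"Accept": ", ".join(VERIFIABLE_TYPES)})
    with urllib.request.urlopen(get_req, timeout=10) as resp:
        return resp.read(MAX_BYTES)  # stop instead of downloading everything
```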
eprodrom: that resolves the
issues that we have
... let's take a 5 min break and finish with AS2 before
noon
break:
5 min
Paul departs meeting
<rhiaro> scribe: rhiaro
<tantek> eprodrom: let's get started
<tantek> sandro: should Tantek chair
<tantek> tantek: ok
<tantek> chair: tantek
tantek: next steps on as2? Still talking about getting to CR
<eprodrom> https://github.com/jasnell/w3c-socialwg-activitystreams/issues
eprodrom: Link to issues ^
... We discussed CR a bit yesterday. These are all editorial
points
<tantek> these all look good
eprodrom: Linking to
implementation reports, template, linking to test suite,
submission process, change links to repo, adding a note about
dropping features
... Things that don't get implemented will be dropped
... Update a couple of references, eg. CURIE
... Making AS1 an informative reference
... Pushing the JSON-LD context
sandro: json-ld context is done
eprodrom: great
scribe: There are these editing
tasks I'll get to in the next day or so
... Then we'll push a new WD with them.. don't know if we need
to do another resolution?
tantek: didn't we resolve yesterday?
eprodrom: no we did one last
week
... For some of these but not all
tantek: my experience is that editorial changes, unless there's an objection, you can push a new WD
eprodrom: I'll finish these this
week and push new WD
... Next steps are CR transition meeting
... Which we've discussed doing along with micropub
... And then implementations
... That's going to be an interesting next step
... On my plate is .. it's clear that we have a couple dozen ..
somewhere between 10 and 20 implementations of AS1
... It's on the wiki
... I'll find it
... We have a list of implementations of AS1, they're clearly
good targets for discussing AS2
... Next steps there will be contacting the companies on that
list, letting them know we're moving to CR and we'd like to get
their implementation reports
... Which will not only stimulate getting reports, but also
implementors
... After that I'm not sure what else we need to do
... Is there additional work that needs to go into AS2?
... Hopefully more feedback after CR
tantek: I'm specifically looking
to see what percentage of AS1 implementations (ones that are current
- there are old ones that nobody has touched for years; don't
expect those) adopt AS2
... That should be our goal
... And then there's greenfield
eprodrom: It's a question of
finding the conversations with potential implementors
... For me personally, if AS2 is not taking up as much of my
time, I'd like to help out Chris with ActivityPub
... that might be the best place to be putting my time
... Not that I have that much time.
cwebber2: any help appreciated
eprodrom: And should inform.. also means any activitypub implementations are by definition AS2 implementations
sandro: just looking at the
transition request for it, in reverse order: we should link to
the implementations so far, which would at least be the empty
implementation report repo
... But if we know of some already, even without reports, it would
be good to enumerate them and show something going on
... For wide review, I don't know about wide review for AS2.
There's tons of github issues. Have we sent emails or
announcements we can point to?
eprodrom: Good idea to send emails out to old AS lists
tantek: edit the activitystrea.ms page?
eprodrom: That will be a good way to get wide review
tantek: mediawiki sends an update to everyone who has ever edited that page, so out of the blue they'll get a notification with that diff
aaronpk: wiki.activitystrea.ms vs homepage
tantek: believe homepage is on a github
<tantek> https://github.com/activitystreams/website
eprodrom: updating the
activitystreams website, emails to older mailing lists, maybe
updates to the OpenSocial people..?
... The mailing lists are gone, so it would be going through
contacts lists
tantek: you can do a pr to the website?
eprodrom: I can, not sure if I have permissions to edit
sandro: this level of outreach could be done after transition request
eprodrom: yeh
... And going after list of 1.0 implementors
... It may be worthwhile producing a document or wiki page,
AS2 for people who implemented 1
sandro: does anyone have a clever idea of how to count how many issues are from inside or outside wg? 200 is a lot to go through
rhiaro: not insurmountable to do it by hand if necessary
eprodrom: identifying folks who
have participated in issues who are not wg members we should do
before CR meeting
... Anything else?
aaronpk: this may not be related, but when we're trying to get people to implement AS2, what is the incentive for people who are not members to implement the draft before it's an actual rec?
sandro: so if they come across a
problem there's still time to fix it
... It's unlikely to change, but if it's going to change... if
they're going to hit a fatal problem with it, it's better to know
that before it's too late to change it
eprodrom: there are companies
like getstream.io, activitystreams is their business
... They may want to have that as.. 'we are the first
implementors of AS2'
sandro: also w3c can do some
press around recs, testimonials, quotes from early adopters, so
chance to get into that press cycle
... Usually if they're w3c members
eprodrom: Sounds like we're
moving forward
... We have some evangelism to do, but otherwise we're waiting
for feedback after CR to see if there are any normative
changes
... I think we're finished with AS2
... And it's 12
<aaronpk> yum
<tantek> aaronpk, eprodrom: perhaps worth mentioning in the transition request re: AS2 & Micropub: https://www.w3.org/wiki/Activity_Streams/Microformats_Mapping
<aaronpk> come back Zakim
<aaronpk> we miss you
<sandro> Lunchtime discussion of http://brighton.ncsa.uiuc.edu/~prajlich/forster.html ("The Machine Stops", 1909) and https://en.wikipedia.org/wiki/List_of_races_and_species_in_The_Hitchhiker's_Guide_to_the_Galaxy#Babel_fish
<sandro> "Now it is such a bizarrely improbable coincidence that anything so mindbogglingly useful could evolve purely by chance that some thinkers have chosen to see it as a final and clinching proof of the non-existence of God. "
<sandro> bblfish, is your name a reference to HHGTTG? What was your thinking in adopting the name?
<eprodrom> my lunch went late; I'll be back soon
<scribe> scribe: rhiaro
<eprodrom> chair: eprodrom
eprodrom: Been a while since we've seen a new version of this
<tantek> https://tantek.github.io/post-type-discovery/
tantek: you've never seen a
publication ready version, here it is
... The only normative change to this since the last version is
that more people have started publishing video posts so video
got added to the algorithm
... a one line change
... This is there, as well as the source
<tantek> https://github.com/tantek/post-type-discovery/blob/gh-pages/index-src.html
tantek: I'd like to know if folks
are still okay with publishing what was there before, just took
a while to get a draft (thanks ben_thatmustbeme)
... Re-propose to publish a WD
eprodrom: I'll ask a couple of
questions first
... Main changes since the last version is that it's been
respec'd
... I count 5 issues
... Any value to us in resolving these issues before we produce
a WD?
tantek: before the FPWD?
... Good question
... Last time I looked at them they seemed like good ideas to
do but not blockers
<tantek> https://github.com/tantek/post-type-discovery/issues
tantek: But if there's a specific one there that anyone sees as a blocker or might be a blocker, then we should explore it
eprodrom: The question about 'why' would be the first that strikes me
<tantek> https://github.com/tantek/post-type-discovery/issues/4
eprodrom: The other ones seem to
be.. number 5 sounds interesting but more theoretical
... Not sure if 2 is subsumed by 4
... If we were going to hold off FPWD, 4 would be the one I'd
say
tantek: Let's look at that one
then
... Last year, when we resolved to publish the first time,
sandro raised this.
... First, we're doing the general how-does-this-fit-in for all
the drafts
... it references AS2 and the AS2 vocab in informative explanations,
like examples. That's in the document itself; there's no
summary that explains the document's relationship with AS2
... I'll take an action to add something informative for
that
eprodrom: I feel like the
abstract clearly says ... *reads abstract* ... so you don't
have a post type (check), you want to determine the type of
that post (check) -> this is the algorithm to do it
<tantek> chair: eprodrom
eprodrom: sandro since this came from you, is there more to this?
sandro: I don't remember
<tantek> https://tantek.github.io/post-type-discovery/
cwebber2: one of the major things
I was interested in with this, that makes it really useful to
the group, especially with having mp and ap moving forward at
the same time, is that it provides a bridge between the things
we currently have in the group
... you're able to move from something without specific
types, like a micropub-type system, to a system
with types
... That's one of the major questions in this group anyway, how
do you justify these two different stacks, it seems like this
is helpful
... So maybe putting that somewhere higher
... Right now AS2 is mentioned in 2.1; seems like maybe it would
be useful in the introduction
tantek: yep
... I like that.. bridge between systems without explicit post
types to those with explicit post types
... connects it to two of our documents
sandro/tantek: *discussion of microformats that I didn't minute because I thought it would be brief and informative but is still going on*
<tantek> https://github.com/tantek/post-type-discovery/issues/4#issuecomment-224422033
tantek: totally okay with saying
we have to have text like this before we publish fpwd
... What do you think, evan?
<tantek> "Post type discovery helps provide a bridge between systems without explicit post types (e.g. Micropub, jf2) to systems with explicit post types (e.g. ActivityPub, Activity Streams)."
eprodrom: That's more explicit than what's in there now, and says why PTD is important
tantek: I'll just keep that issue open until I've made the edit
eprodrom: that would close that issue I believe... sandro?
sandro: *implied yes*
eprodrom: are there other issues on here that would block fpwd?
tantek: this is useful to get agreement between things that are rec track, so probably should be rec track
eprodrom: is that a decision we need now?
sandro: kind of
cwebber2: we should see if we already decided that
<eprodrom> Rec track or not?
<sandro> https://lists.w3.org/Archives/Public/public-socialweb/2015Oct/0020.html
rhiaro: the vague language is not good for rec track, would be clearer how it's useful if it specifically used AS2 terms. eg. RSVP post doesn't exist in AS2
tantek: I agree, needs work
eprodrom: If this was rec track, is this testable?
tantek: definitely
<tantek> needs specific examples for generating AS2 objects
eprodrom: if you were using it in an abstract way, with an untyped format and a typed format, it would be hard to test
tantek: the algorithm is very generic, so you could test it by parsing an untyped format and outputting a string to say this is the type
eprodrom: that is testable
sandro: the code would be much
shorter than the spec
... Just an if statement
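The "just an if statement" shape being discussed might look like this condensed sketch over parsed h-entry properties (a dict of lists, as microformats parsers produce). The real draft defines the exact property names, ordering, and the note/article distinction normatively; this only illustrates the shape.

```python
def post_type(props):
    """Condensed, illustrative Post Type Discovery if-chain.
    props: parsed h-entry properties, each a list of values."""
    if "rsvp" in props:
        return "rsvp"
    if "in-reply-to" in props:
        return "reply"
    if "repost-of" in props:
        return "repost"
    if "like-of" in props:
        return "like"
    if "video" in props:       # the property added for video posts
        return "video"
    if "photo" in props:
        return "photo"
    name = " ".join(props.get("name", [])).strip()
    content = " ".join(props.get("content", [])).strip()
    # A post whose name is empty or just a prefix of its content is a
    # note; a distinct explicit name makes it an article.
    if not name or content.startswith(name):
        return "note"
    return "article"
```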
tantek: we could also have
conformance classes like if you are an AS2 generating
application you must generate the following objects from the
following types
... If you want to open an issue on conformance classes that
would help
... If we get more implementors we can point them at this to
say if you're consuming untyped data, this is how you get to
AS2
... Another possible source for untyped data is RSS
... Various sites that do RSS feeds of their activities that
have made stuff up. I can research to see if there's something
I can add to post type discovery to make that more explicit
<sandro> https://www.w3.org/wiki/Socialwg/2016-01-05-minutes
sandro: I found a resolution from
January to publish PTD as FPWD
... Still not about rec track
tantek: we hadn't said note track explicitly
<eprodrom> PROPOSED: Publish first public working draft of Post Type Discovery including edits agreed upon during this meeting
<sandro> +1
<eprodrom> +1
<aaronpk> +1
sandro: what's the
shortname?
... post-type-discovery?
<cwebber2> +1
<tantek> +1
RESOLUTION: Publish first public working draft of Post Type Discovery including edits agreed upon during this meeting
eprodrom: tantek do you need help with that process?
tantek: probably..
... If I get stuck I'll ask for help
*** crisis as we notice trackbot hasn't been logging since this morning **
eprodrom: we need to remember to
produce logs from Loqi
... We're now on the tip of 3pm. We have a resolution to go to
fpwd. Do we have anything else to talk about PTD this
afternoon?
tantek: not unless there's another blocking issue
eprodrom: ten minute break
<cwebber2> https://www.youtube.com/watch?v=pdxucpPq6Lc
<cwebber2> pretty much the best animation
<tantek> (break)
<tantek> rhiaro: I'm looking at https://github.com/w3c-social/social-web-protocols/blob/gh-pages/respec.html
<tantek> chair: tantek
<tantek> interruption with figuring out repo moving to w3
<eprodrom> https://help.github.com/articles/closing-issues-via-commit-messages/
<tantek> FYI: https://tantek.github.io/post-type-discovery/ is up to date with edits agreed at this meeting
tantek: evan, what are we doing with PuSH?
eprodrom: I'll give a quick
overview and where it's at
... PuSH was originally developed by bradfitz and bret (?) from
Google
... it was a protocol which they published along with an
implementation which is the google hub
... Basically a push-based feed system where you can subscribe
to feeds and receive fat pings
... The first version 0.3 had a number of interesting
characteristics, one is that it only was defined for atom
feeds. Another was that it had a kind of complicated set of
roles; a publisher and subscriber, and then a 'hub' so you can
set it up so the publisher and subscriber don't have to scale,
but the hub does
... At its height, all google feeds were PuSH enabled:
buzz, blogger, feedburner
... It was pretty well implemented at google
... a third party implementation called superfeedr was also
enabled for tumblr, wordpress.com, a number of others
... it kind of hit a peak where it was enabled for a lot of rss
and atom feeds
... There were a few issues that made having a new version make
sense
<eprodrom> http://superfeedr-misc.s3.amazonaws.com/pubsubhubbub-core-0.4.html
eprodrom: When the community and business groups at w3c first started, PuSH was one of the first CGs; the lead was Julien Genestoux, the CEO of superfeedr
<eprodrom> https://www.w3.org/community/pubsub/
eprodrom: They made a new version
of the spec
... the 0.4 version was implemented by superfeedr and
google
tantek: and by aaronpk
aaronpk: hub is switchboard
eprodrom: big changes in 0.4,
communication between publisher and hub. Redefined how to do
publication and subscription for things that aren't atom
feeds
... anything that can have a url can be subscribed to
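The subscription side of PuSH 0.4 can be sketched as follows: the subscriber POSTs a form-encoded body to the hub naming the topic URL and its callback. This is a minimal sketch; optional parameters such as hub.lease_seconds and hub.secret are omitted, and the function names are this sketch's own.

```python
from urllib.parse import urlencode

def subscription_body(topic_url, callback_url, mode="subscribe"):
    """Build the form-encoded body a PuSH 0.4 subscriber POSTs to a
    hub. The hub then verifies the subscription by making a GET
    request to the callback URL (not shown here)."""
    return urlencode({
        "hub.mode": mode,       # "subscribe" or "unsubscribe"
        "hub.topic": topic_url,
        "hub.callback": callback_url,
    })
```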
tantek: I'm supporting that on my site, publishing via PuSH 0.4 using superfeedr
eprodrom: awesome
... At the same time, PuSH 0.3 had been incorporated in ostatus
which was mostly atom based
... and it was implemented by most of the ostatus
implementers
... statusnet, diaspora, friendica
... was relatively easy to implement
... if you used a 3rd party hub it was trivial
... or if you did it yourself still relatively easy
... I think when we started this group the question came up of
what role PuSH 0.4 or later would play for us
... And we ran into a couple of problems
... First was that when the open web foundation was first
announced, google had announced that they would be putting a
number of specs under the open web foundation patent license
and so there are blog posts to that effect, but they never
actually published the paperwork that says, signed at the
bottom, this is under this patent
... By the time that we started to be interested in this, and
having it as a w3c spec, the people who worked on it were no
longer working on it and there did not seem to be as much of an
institutional interest in this kind of standardisation around
feeds
... Fast forward to now, the superfeedr hub was just acquired
by medium
sandro: how many people worked there?
eprodrom: half a dozen to
10
... Fewer and fewer of their customers were using PuSH, most
were transitioning to using their own apis
... tumblr, foursquare had had it, but stopped. Fewer feeds out
there that are PuSH enabled
... I haven't checked the google hub in a while, probably
should
... *checks*
... It's live on the web
<eprodrom> https://pubsubhubbub.appspot.com/
eprodrom: Still possible to
use
... Protocol is being used somewhat
... I think we have some question about what role it will play
for us
... There are implementations. We have done a CG incubation for
it, so there is an affinity with w3c
... So it does make sense. Those are kind of on the positive
side.
... On the negative side we have the IP confusion, which is
hard to follow and get to a solution
... And there's a question of does it fit into the stack
... The stacks that we're using
... I think when we talk about AP it does not use PuSH for
subscription distribution. It has its own.
... For the webmention/micropub world it plays a bigger role,
but not sure how big
tantek: the number of
implementations more than doubled because of indieweb community
adoption
... There are multiple hubs
... Now we have some diversity of hubs and implementation
experience. People are using different hubs and publishing to
different hubs, and everything seems to work. I don't think we've
run into interop problems where your site can only go to one hub
because of how it's implemented; a reader that supports consuming
atom or h-feed in real time via PuSH 0.4 seems to work with all of
the hubs that have been developed
sandro: they're all using h-feed?
tantek: some use atom
... There's basically been really good implementation
incubation and maybe we're all sidestepping the problems in the
spec?
aaronpk: the reason they all are
working together is that the holes that were left in the spec
we have all filled in the same way because of the tutorial on
the indiewebcamp wiki
... In a couple of places where the spec doesn't say what to
do, I just said 'do this'
sandro: I read 0.4 on Sunday and I was like... this is so full of holes
aaronpk: but it's also... they're not that big, you can fill them
sandro: but if you don't fill them you don't have interop
aaronpk: one side, but not all
the way through
... Specifically the notifying the hub of new content is not in
the spec
sandro: intentionally left out. Also what the notifications from the hub are is left out. Gaping hole.
aaronpk: but if you're in an ecosystem where everyone is publishing and expecting the same type of content it works
sandro: the press around it is all about fat pings, but indiewebcamp doesnt' use it for fat pings. There's no format defined for what a fat ping would look like
tantek: we have specifically chosen to use the thin pings subset of 0.4
sandro: 0.4 doesn't talk about that. There's nothing in the spec about what you send.
tantek: we just send the url of the thing that's been updated?
sandro: what media type? form-encoded?
aaronpk: it says form encoded
<aaronpk> http://indiewebcamp.com/how-to-push
aaronpk: This is my guide that I
wrote
... And if you go look at the section how to subscribe, it
walks you through every part of the request, including
receiving notifications, including separate sections for
standard and fat pings
... For standard it says will not contain a body
... If you receive an empty notification, treat this as an
update to the url
eprodrom: talk at a more political or editorial or work level
sandro: the takeaway from this description is that PuSH 0.4 by itself is not useful to us, but refined the way aaron has is useful for some subset
eprodrom: well it is being used,
so in that case
... We have two or three options... we take the PuSH 0.4 and
take it to some sort of rec level right now and kind of steward
it through that process
... The other is that we take the PuSH 0.4, make an 0.5 that
clarifies some of the things that we're doing, but maybe talks
about what's specifically being used in the indieweb
community
... Third is that we don't do anything with it and accept that
it's a community standard but that we don't necessarily have
anything to add to it
sandro: One more: to change the name... like you said for 0.5 but say 'inspired by'
eprodrom: right, we could do something similar. When you do discovery you could do it for some other name, like not 'hub' it's 'publisher' or something
cwebber2: which ones of those are possible within IP if we don't get google to give it up.. how risky is that?
eprodrom: google is a member of
w3c, if we decided to publish a new version of this spec, part
of tha tprocess would be a call for exclusions, which is they
say they have ip considerations that would block publication of
this spec
... It does not seem like we could get to the point of being at
PR while still having problems with murky IP around this spec
... The problem would get solved
... And the people who are being paid a lot of money to figure
out google's IP will do it instead of you or me
tantek: I would say that if we
took on PuSH as a work item in this group whether called that
or called something else, then if we successfully produced a
rec, it would put it in a stronger position, a more
implementable situation with less IP concern than we have
today
... in that there would be at least some degree of w3c
participating member commitments implied or explicit through
that process
... The larger/first issue to resolve before the ip issue is
that there was the CG, and Julien still felt very strongly about
editing and updating the spec. I think that were we to decide
to go forward with it, specifying the details we have figured
out that allow interop would be a good thing, and I would not
be comfortable having that gated on someone outside of the
group
... We have approached Julien in the past explicitly to
participate. I think he hasn't had the time; I don't think it
was a negative thing
eprodrom: for him and his business, the state of PuSH 0.4 is fine, it works for what he needs
sandro: both what you said and
the name, the right thing to do about the name is to ask the
people who feel they have ownership of the old name, to see if
they want us to call it PuSH 0.5 or name it a new thing
... leave that up to them
tantek: I would word it more strongly - hey, we like the work you've done, we've continued trying to specify details, we would like to take that work and publish it with the same name with a new version number
sandro: we don't want to hostilely claim the next version numbers
tantek: I believe brad doesn't
care... bret is happy to see anyone build on it... I think
neither one of them wants to deal with talking to google's
lawyers
... Julien feels the strongest, he produced 0.4. If there's
anyone we need good vibes from, to make sure he knows and agrees
with it happening, it would be Julien
eprodrom: Another objection... limited time, limited resources. I'm not going to edit this. I don't know who is. But we'd need to have someone step up and do it. We only have 7 months
tantek: or we publish as a note
sandro: still work there
tantek: less work
aaronpk: what's the value in that?
tantek: shows a consensus
sandro: coherence
cwebber2: plugging holes in an official way...
tantek: plugging the holes in a w3c note is better than plugging them in the iwc wiki
cwebber2: do you know anyone interested in taking this on?
aaronpk: Well, I am the most familiar with the spec..
sandro: how about AFTER webmention and micropub go to CR
cwebber2: finish your vegetables...
aaronpk: if it's a note, there's no requirement for a test suite, which is a lot less work
tantek: theoretically possible to start a note after tpac and get it done
eprodrom: the main reason we had
problems with ostatus is that the subscriber is unauthenticated
so you can only publish things that are public
... it does not make a good channel for publishing to small
groups of people, friends, etc
tantek: limited utility
sandro: the subscriber is authenticated in that you confirm the subscription
aaronpk: the subscription is confirmed but there's no..
sandro: no bearer token
... we could add that?
aaronpk: not sure if that works
all the way through
... I haven't thought it through yet. Might work, not
sure.
... Reason because.. it might depend who is trusting the
hub
sandro: the hub has to be the one
enforcing the access control
... Really doesn't work well to have a third party hub with
access control
eprodrom: One technique is to
have different feeds by group. Secret feeds or have a token in
them that's hard to guess
... The feed of stuff that evan publishes that's available to
sandro might be under a long complicated string
... Shifts that effort onto the subscriber, it's hard to
manage
... It's especially hard to deal with combinations of
things
... That makes it kind of tricky... I wouldn't recommend it
for anything that's not public
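The "hard to guess URL" technique eprodrom describes can be sketched as a per-group feed URL containing an unguessable token. The function name and URL layout are assumptions of this sketch; as noted above, anyone holding the URL can read the feed, so this is obscurity rather than real access control.

```python
import secrets

def private_feed_url(base_url, group):
    """Sketch of a capability-URL private feed: a per-group feed URL
    with an unguessable token embedded. Suitable only for low-stakes
    sharing, since possession of the URL grants access and revocation
    means rotating the token for the whole group."""
    token = secrets.token_urlsafe(32)  # ~256 bits of randomness
    return f"{base_url}/feeds/{group}-{token}"
```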
tantek: sounds like what you're
saying is if you go down the path of a PuSH based system
you're gonna end up stuck with public-only functionality
... Which is another reason to make it a note not
rec-track
... But helps at least capture the state of the art use of
PuSH, for anyone who wants to know, here are implementations,
if this is good enough for your use cases
cwebber2: would make sense to specifically call out that it won't work if you need private communication
sandro: there are other ways to do it
aaronpk: or you can do thin pings and authenticate on GET
cwebber2: people get pings for things they can't access?
aaronpk: no you don't ping them if they can't access it
eprodrom: if you have urls as
identities you can say this subscriber endpoint is this
person..
... You can't have third party subscriber endpoints
tantek: there's a lot of brainstorming about what's possible there, we don't know if it works yet
sandro: we can say 0.5 doesn't include that functionality, but wouldn't characterise it as a dead end
tantek: want to highlight the
implementation experience. Ostatus went down that route then
backed off
... If there was an easy way to move it forward then maybe they
would have
sandro: and they had different
constraints
... On the resource thing, is maybe a step here to put the word
out that if someone is willing to take on the editorship we
would be interested, or do we want to wait until Aaron has
time?
eprodrom: not that I'm not
perfectly happy to waste aaron's time, but when we do a new spec it
does affect the rest of us
... There is a collective amount of time we take in
meetings
... So next steps?
<eprodrom> PROPOSED: Request that Julien Genestoux transition PubSubHubbub from Community Group to editor's draft within Social Web WG
<tantek> +1
<eprodrom> +1
<aaronpk> +1
<sandro> +1
<sandro> We'd like it to be Rec Track, but the time is very short
<cwebber2> +1
RESOLUTION: Request that Julien Genestoux transition PubSubHubbub from Community Group to editor's draft within Social WG
+1
5 minute break
<eprodrom> http://w3c.github.io/activitystreams/activitystreams-core/index.html
<eprodrom> http://w3c.github.io/activitystreams/activitystreams-vocabulary/index.html
tantek: I'm definitely going to be at tpac the entire week
sandro: I can go if we're going to meet although I might go just for the plenary day
tantek: it would be really
helpful the more people we get to that
... the Sapporo one was distinctly different
... We'll be able to gauge how people care about this
... By then we'll have several CRs which will be
different
... Which building blocks you use depend on your use cases
sandro: sure
... Wednesday aside, the WG meeting
tantek: my point is if you're going to be there Wednesday you will be there Thursday and Friday
sandro: if the rest of you are going, I will go
tantek: what's the threshold?
sandro: This is my threshold
*points at room*
... This is the minimum
... We couldn't have done this meeting without the people
here
... [cwebber2, aaronpk, evan, tantek, rhiaro]
aaronpk: remote is possible, but significant timezone shift
tantek: 22 and 23 is reserved
space for us
... of september
... So question is are you available, and would you be able to
be there in person
aaronpk: I don't know how I'll
get there in person
... More likely to be able to be available if I don't have to
travel
... Travel would mean 5 days at least
... Definitely yes to remote. Harder for me to go in
person.
eprodrom: timewise I could
go
... Can't commit right now. Need to take a look.
... Definitely commit to remote
rhiaro: I'll go if everyone else is going
sandro: get a cancellable hotel now, it's peak tourist season
eprodrom: on the other hand, Lisbon is awesome
cwebber2: if there is another place we can do it, I would prefer it. I'm committed to wrapping up my work and if that means I have to take a huge chunk out of my finances I will do it, but I would kind of prefer something less expensive
tantek: I thought the other meeting we had resolved on last time was November in Boston
rhiaro: I thought it was December in SF
sandro: I don't remember
eprodrom: So, timeframe. Everything currently on the table should be at CR or ready to go to CR. What would we do at a face to face? September
tantek: if we are going to do a
revised CR that will be our last chance to do so, and resolve
all outstanding issues
... If we get dozens of implementations, we will get dozens of
issues
... If we're planning for success, we should expect that
sandro: at the very least we have to go through a bunch of issues
cwebber2: ...airbnb has
affordable lodging... I might be able to do this if we agreed on
it right now
... I think it's really important we have this meeting. This
time is really important. This location.. but maybe this is the
only reasonable time we'll do it. So I'm for it.
sandro: one of the main reasons for this location is if we get people wednesday, and talking to people during tpac, to try to bring in new blood and share. Some may stop by WG meeting
tsyesika, can you make Lisbon in September?
eprodrom: Can we agree to make this decision in our next telecon?
tantek: another key reason is
assuming we are doing some rechartering we would do it
then
... Ideally better if rechartering occurs before chater
expires
eprodrom: Feels like we have
enough of a consensus to go. Everyone can make it work either
in person or remotely
... Let's say we're doing it
tantek: Sign up for wednesday, thursday, friday
FIN
<sandro> https://www.w3.org/2016/09/TPAC/
RRSAgent please generate minutes
<aaronpk> trackbot, end meeting
trackbot please generate minutes
Scribes: sandro, tantek, rhiaro
Date: 07 Jun 2016
Minutes: http://www.w3.org/2016/06/07-social-minutes.html