December 01, 2015

W3C Blog

New Scholarly Coalition Embraces W3C Web Annotations

Today marks the launch of an informal annotation coalition, organized by the Hypothes.is Project, a W3C Member. W3C is excited to be part of this growing effort of over 40 leading organizations in the technology and scholarly publishing communities, including W3C Members IDPF, MIT Press, and Wiley.

The partners in this coalition share a vision of how annotation can benefit scholarly publishing, and of open collaboration for integrating web annotation into their platforms, publications, workflow, and communities.

W3C sees an important role for Web Annotations as a new layer of user-generated content and commentary on top of the Web, and across digital publications of all sorts. Today, comments on the Web are disjointed and often disruptive; a unified mechanism for creating, publishing, displaying, and sharing annotations and other comments in a decentralized way can aid in distributed curation, improving the quality of comments that a reader sees for Web content, and improving the reading experience. In parallel, Web users want to organize and remember useful sites on the Web, and want to synchronize their favorite sites across multiple devices, or to share their thoughts about a site with friends or colleagues; Web annotations enable all this by allowing users to make highlights or detailed notes about a site, to add tags for categorization and search, and to share these links and notes across multiple conforming social media services. This is ideal for casual users, or for focused reading circles or classrooms.

The W3C Web Annotation Working Group is working on a set of loosely related “building block” specifications to enable this functionality. The Web Annotation Model serves as a simple but full-featured data structure for interchange between browsers and different annotation-capable services. The Annotation Protocol defines behavior for publishing annotations to a service, for searching these services for annotation content, or for subscribing to annotation feeds. The FindText API lets a browser or plugin “re-anchor” an annotation to its original selection within a Web page, and a related URL fragment specification will leverage the FindText API to let you share a URL that navigates directly to the selection you shared. Together with a few other bits and pieces, these specifications, when implemented, will let you create a new annotation based on a specific selection on a page, share it to your preferred social media service, and let others (either a small group, or the world) discover and read your annotation right in the context of the page you commented on, or to find other annotations in your feed.
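To make the data model concrete, here is a minimal sketch (written as a Python dict mirroring the JSON-LD serialization) of the kind of structure the draft model describes; the context URL and property names follow the draft and may differ in the final Recommendation.

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",  # JSON-LD context from the draft model
    "type": "Annotation",
    "body": {
        "type": "TextualBody",                 # the comment itself
        "value": "This claim needs a citation.",
    },
    "target": {
        "source": "http://example.org/page.html",  # the annotated page
        "selector": {                               # anchors to a text selection
            "type": "TextQuoteSelector",
            "exact": "a specific selection on a page",
        },
    },
}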

In addition to standardizing annotation technologies, W3C is experimenting with using the technology itself. Our standards process includes public review of all of our specifications, and we have enabled feedback via an annotation interface on some of our specifications; the expectation is that it will be easier for readers to provide feedback, and easier for Working Groups to understand, respond to, track, and process feedback that’s presented in the context of the specification document itself. If this experiment succeeds, we will spread this feedback mechanism across W3C’s specifications.

Before a full ecosystem develops, where multiple browsers and e-readers, content sites, and annotation services interoperate, the groundwork has to be laid, in communities that already understand the power of annotation. The scholarly community has used annotations (and the related techniques of footnotes and citations) extensively for centuries. This annotation coalition brings that practice into the 21st century, with a solid technological underpinning that will empower this community to use the Web for review, copy-editing, collaboration, categorization, and reference. W3C welcomes technical feedback on its Web Annotation specifications, and the new annotation coalition welcomes all interested stakeholders to participate in all aspects of this effort.

We look forward to keeping the conversation going about how we can meet the needs of this community, and how we can spread this to other communities, from the next generation of “close reading” students who want to engage with content and not just consume it, to the professionals who want to organize their research, to the person who just wants to share their thoughts on content that excites them.

by Doug Schepers at December 01, 2015 09:00 AM

November 30, 2015

W3C Blog

New Draft for Portable Web Publications has been Published

Following the busy TPAC F2F meeting of the Digital Publishing (DPUB) Interest Group (see the separate reports on TPAC for the first and second F2F days), the group has just published a new version of the Portable Web Publications for the Open Web Platform (PWP) draft. This draft incorporates the discussions at the F2F meeting.

As a reminder: the PWP document describes a future vision of the relationship between Digital Publishing and the Open Web Platform. The vision can be summarized as:

Our vision for Portable Web Publications is to define a class of documents on the Web that would be part of the Digital Publishing ecosystem but would also be fully native citizens of the Open Web Platform. In this vision, the current format- and workflow-level separation between offline/portable and online (Web) document publishing is diminished to zero. These are merely two dynamic manifestations of the same publication: content authored with online use as the primary mode can easily be saved by the user for offline reading in portable document form. Content authored primarily for use as a portable document can be put online, without any need for refactoring the content. Publishers can choose to utilize either or both of these publishing modes, and users can choose either or both of these consumption modes. Essential features flow seamlessly between online and offline modes; examples include cross-references, user annotations, access to online databases, as well as licensing and rights management.

The group already had lots of discussions on this vision, and published a first version of the PWP draft before the TPAC F2F meeting. That version already included a series of terms establishing the notion of Portable Web Documents, and also outlined a draft architecture for PWP readers based on Service Workers. The major changes in the new draft (beyond editorial changes) include a better description of that architecture, a reinforced view and role for manifests and, mainly, a completely re-written section on addressing and identification.

The updated section distinguishes between the role of identifiers (e.g., ISBN, DOI, etc.) and locators (or addresses) on the Web, typically an HTTP(S) URL. While the former is a stable identifier for the publication, the latter may change when, e.g., the publication is copied, made private, etc. Defining identifiers is beyond the scope of the Interest Group (and indeed of W3C in general); the goal is to further specify the usage patterns around locators, i.e., URLs. The section looks at the issue of what an HTTP GET would return for such a URL, and what the URL structure of the constituent resources is (remember that a Web Publication is defined as a set of Web Resources with its own identity). All these notions will need further refinement (and the IG has recently set up a task force to look into the details), but the new draft gives a better direction to explore.
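To illustrate the distinction as data (a Python sketch with hypothetical property names, not taken from the PWP draft): the stable identifier stays fixed while the locator and the constituent resource URLs may change over the publication’s life.

publication = {
    "identifier": "urn:isbn:978-0-00-000000-0",  # stable: survives copying or moving
    "locator": "https://publisher.example.com/books/example/",  # may change over time
    "resources": [  # a Web Publication is a set of Web Resources with its own identity
        "https://publisher.example.com/books/example/chapter1.html",
        "https://publisher.example.com/books/example/style.css",
    ],
}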

As always, issues and comments are welcome on the new document. The preferred way is to use the GitHub issue tracker but, alternatively, mails can be sent to the IG’s mailing list.

(Original blog was published in the Digital Publishing Activity Blog)

by Ivan Herman at November 30, 2015 08:00 AM

November 22, 2015

ishida >> blog

Mongolian picker updated: standardised variants

Picture of the page in action.
>> Use the picker

An update to version 17 of the Mongolian character picker is now available.

When you hover over or select a character in the selection area, the box to the left of that area displays the alternate glyph forms that are appropriate for that character. By default, this only happens when you click on a character, but you can make it happen on hover by clicking on the V in the gray selection bar to the right.

The list includes the default positional forms as well as the forms produced by following the character with a Free Variation Selector (FVS). The latter forms have been updated, based on work which has been taking place in 2015 to standardise the forms produced by using FVS. At the moment, not all fonts will produce the expected shapes for all possible combinations. (For more information, see Notes on Mongolian variant forms.)
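For instance (a minimal Python illustration; whether the variant shape actually appears depends on the font and renderer), requesting a variant is just a matter of appending an FVS code point to the base character:

base = "\u1820"     # MONGOLIAN LETTER A
fvs1 = "\u180B"     # MONGOLIAN FREE VARIATION SELECTOR ONE
print(base + fvs1)  # the font decides which standardised variant glyph to show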

An additional new feature is that when the variant list is displayed, you can add an appropriate FVS character to the output area by simply clicking in the list on the shape that you want to see in the output.

This provides an easy way to check what shapes should be produced and what shapes are produced by a given font. (You can specify which font the app should use for display of the output.)

Some small improvements were also made to the user interface. The picker works best in Firefox and Edge desktop browsers, since they now have pretty good support for vertical text. It works least well in Safari (which includes the iPad browsers).

For more information about the picker, see the notes at the bottom of the picker page.

About pickers: Pickers allow you to quickly create phrases in a script by clicking on Unicode characters arranged in a way that aids their identification. Pickers are likely to be most useful if you don’t know a script well enough to use the native keyboard. The arrangement of characters also makes it much more usable than a regular character map utility. See the list of available pickers.

by r12a at November 22, 2015 01:03 PM

November 20, 2015

W3C Blog

W3C Welcomes the FIDO 2.0 Member Submission

Today, W3C welcomes the FIDO 2.0 Platform specifications as a Member Submission. On the Web, passwords are both an everyday inconvenience for users and a weakness against modern security threats. Users re-use passwords across different sites, and password databases are irresistible targets for an enterprising attacker. W3C is committed to bringing the Web to its full potential, and that includes providing more secure and easier ways to authenticate in your browser. After our WebCrypto v.Next workshop, W3C started drafting a charter for a Web Authentication Working Group (still a draft).

But how do we “kill passwords”? This question is answered by the FIDO 2.0 specifications, which define a unified mechanism to use cryptographic credentials for unphishable authentication on the Web. The specifications enable a wide variety of user experiences and modalities. For example, a user may log into the web by unlocking a nearby Bluetooth- or NFC-connected smart phone which contains the user’s cryptographic credentials. Alternately, the user may use a USB authentication device containing cryptographic credentials which he or she inserts and activates with a touch of a button. The W3C has provided technical and procedural comments on FIDO 2.0.
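As a rough sketch of the underlying idea (not the FIDO 2.0 APIs themselves), here is the challenge-response pattern in Python using the cryptography package; details are simplified, but the essentials are that the private key never leaves the device and the signed challenge is bound to the site’s origin, which is what defeats phishing.

import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# At registration, the device generates a key pair and the site stores the public key.
device_key = Ed25519PrivateKey.generate()
stored_public_key = device_key.public_key()

# At login, the site issues a fresh challenge bound to its origin (illustrative framing).
challenge = os.urandom(32) + b"|https://site.example"

# The device signs the challenge; no password or shared secret crosses the network.
signature = device_key.sign(challenge)

# The site verifies the signature; raises InvalidSignature on failure.
stored_public_key.verify(signature, challenge)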

For more than 20 years, W3C has led the development of open standards for the Web, and one of the benefits of being a W3C Member is that any Member can suggest new standards to W3C at any time. W3C Members Google, PayPal, Microsoft, and Nok Nok Labs have proposed three FIDO 2.0 specifications, Web APIs, Key Attestation Format, and Signature Format, and we have published these as a Member Submission on the W3C site. Publication of a Member Submission by W3C does not imply endorsement by W3C, including the W3C Team or Members. However, the high technical quality of these specifications and the expertise of the companies proposing them make them a natural fit for consideration as part of W3C’s standards track.

As our next step, with FIDO 2.0 as an input document, we will formally propose the Web Authentication Working Group charter to the W3C Membership for review; W3C relies on its Members, over 400 industry leaders, to help guide the development of the Web, and their feedback will be used to improve the scope and direction of the proposed work. If the Membership supports the charter, we anticipate launching the group in January 2016.

The W3C is an open standards body. All W3C members and Invited Experts will be welcome to participate and all standardization will be done in the open and publicly archived, with the final W3C standards being licensed under the W3C Royalty-Free Patent Policy.

Announcements about the launch of the Web Authentication Working Group and publication of its specifications will be made on the W3C home page.

As the Web works across all devices, the Open Web Platform is the perfect platform to drive future standardization across all other platforms. W3C and FIDO Alliance will continue to work together to help make secure multi-factor authentication a ‘built-in’ feature on all platforms. If FIDO, W3C, and the rest of the tech industry are successful, future generations may not even know what a password is.

by Harry Halpin at November 20, 2015 10:19 AM

November 16, 2015

W3C Blog

Better specifications for the sake of the Web

This post is co-authored by Virginie Galindo and Richard Ishida, currently working hand in hand to promote better wide review of W3C specifications.

The Open Web Platform is getting increased traction from new communities and markets thanks to the attractive portability and cross-device nature of its specifications – characteristics which are strengthened by horizontal and wide reviews. But the increase in specifications compounds a growing difficulty when it comes to ensuring that specs are adequately reviewed.

The number of specifications initiated in W3C is increasing every year. That growth is welcome, but we want to avoid ending up with a series of parallel technologies that lack coherence. That is one reason the W3C is putting effort into a campaign to ensure that all specifications benefit from wide review: reviews, from the public and from experts, that ensure all features and specifications create a trusted and sustainable Web for All.

Reviewing a specification is not an easy task, especially when a reviewer does so on a voluntary basis, squeezing it in between two or more high value tasks. One can appreciate that a prerequisite for asking for wide review is that the W3C specification is readable by the non-specialist who is affected by the features it addresses.

Think also about the scenario where an accessibility expert is reviewing an automotive API, or an internationalization expert is reviewing a brand new CSS feature, or a security expert is reviewing a new protocol. The spec needs to be understandable to these non-domain experts.

The basics dictate that a specification should contain use cases and vocabulary sections, that it should rely on plain English, etc. But you should also bear in mind that most reviewers have to produce feedback in a limited time, with limited experience, and having perhaps only read the spec through a couple of times.

Here are a few additional tricks for specification editors to keep your reviewer on track.

Summarize algorithms. Parts of the spec that are expressed as algorithmic steps can make it difficult to grasp the fundamental lines of what is being proposed (and sometimes it even takes a while to ascertain that it’s not anything particularly complicated in the end). Adding a summary of what the algorithm does can make a huge difference for those needing to get the bigger picture quickly.

Do not fragment information (or do use signage). When information is dispersed around the document to such an extent that one has to hold the whole spec in one’s brain to be able to find or piece together information on a particular topic, this is not good for reviewers. If it’s possible to reduce the fragmentation, that would be helpful. If not, please add plenty of signage, so that it’s clear to people who don’t hold the whole spec in their brain where they need to look for related information.

Use diagrams. Sometimes a large amount of textual information could be expressed very quickly using a railroad diagram, an illustration, or something similar. No-one wants to wade through pages of tedious detail in a spec when a diagrammatic approach could summarise the information quickly.

Give examples. Examples are extremely useful and help people grasp even complex ideas quickly, so please use as many as you can. If you are describing a format, include an example of that format which includes most of the quirks and kinks that need to be described. If you are describing a result, show an example of the code and the result. If you are describing something you need the reader to visualise, use a picture.

Ensuring that W3C specifications are readable leads to better reviews and feedback. Better reviews and feedback lead to a more coherent Web and greater support for universal access and interoperability. These, in turn, lead to greater attractiveness of W3C specifications for new communities and markets.

by Virginie GALINDO at November 16, 2015 01:00 PM

November 13, 2015

W3C Blog

TPAC2015 and IETF94

We held TPAC 2015, our annual organization-wide meeting, on 26-30 October 2015 in Sapporo, Japan.

We registered a record attendance of 580 participants throughout the week, breaking last year’s record participation of 550. 43 working groups met face-to-face, and participants organized 50 breakout sessions on security, web payments, web of things, web-based signage, HTML, testing, CSS, video, digital publishing, etc. “Everyone is using the technologies, and driving new requirements,” said Jun Murai on stage. In a lively discussion on stage, Tim Berners-Lee talked about the new activities we are taking on, such as Web Payments and Web of Things. There was a lot of energy in the meetings at TPAC.

I presented the W3C Industry Vertical Champion Program, aimed at understanding the needs of the industries our Members are in and, with appointed internal champions, addressing business problems within the core of the Web in sectors such as Automotive, Digital Marketing, Digital Publishing, Entertainment, Telecommunications and Web Payments.

We announced Web Developers avenue, a one-stop page featuring the tools and resources W3C has for Web developers to learn, build, get involved, and move the Web forward. A question from the floor during TPAC 2013 in Shenzhen was about W3C giving a greater voice to Web developers. We focused on which of our services Web developers value in particular, on giving them a greater voice, and on increasing their affiliation: our free validators and tools, to build Web content that works now and will work in the future; W3C Community Groups, to propose and incubate new work, which more than six thousand people have embraced since 2011; our free and premium Training programs, to learn from the creators of the Web technologies; and Discourse, to share ideas and feedback with the community on Web Standards. We also introduced a gratitude program, Friends. We are making it easy to affiliate as Friends and take advantage of our offerings, and we encourage donations to support us in conducting the activities that fulfill the W3C’s mission.

W3C and NTT Communications jointly organized a W3C Developer meet-up. More than 300 people attended that successful event, which we built around industry demos and talks on substantial subjects by Natasha Rooney, Lea Verou, Jake Archibald, Hyojin Song, Noriatsu Kudo, Stefan Thomas, Evan Schwartz and Adrian Hope-Bailie.

One topic stood out during the week: Web security. At least twenty unconference breakout sessions were related to or touched on security, as well as three presentations at the W3C Advisory Committee Meeting, including a comprehensive report and new work in Web Application Security by Brad Hill of Facebook. During the Technical Plenary (minutes) I moderated a panel on the future of the Internet and the Web. I invited on stage Tim Berners-Lee, inventor of the WWW and Director of the W3C, Vint Cerf, father of the Internet, and Jun Murai, father of Japan’s Internet, who shared historical anecdotes and considerations on security – which has to be in everything, as Tim stressed – on cryptography, strong authentication and trust. Vint calls W3C and IETF enablers. We took an action to foster high-level discussions between the two on what is missing from the enabling protocol space to make strong authentication, high integrity, and other trust building mechanisms on the platform.

We “co-located” TPAC with the IETF, which held a meeting in Yokohama the following week that I attended with a few of my colleagues. I was pleased to host some senior people of the IETF at TPAC: Vint Cerf, as you’ve just read; Jari Arkko, IETF Chair; Andrew Sullivan, IAB chair; and a number of participants attending sessions at both meetings, including many in the WebRTC-rtcweb groups, where I hear the interaction was conducive to good progress. During the IETF plenary session, one question from the floor was about the co-location and whether it was going to be done again. I went to the microphone and confirmed the co-location was deliberate, as we want to be next to them in time and space as often as possible.

Lastly, as part of preparation for TPAC, we published for the Membership “W3C Highlights – October 2015,” now public, which I invite you to read.

We have already begun discussions of TPAC next year, which will take place in Lisbon, Portugal on 19-23 September 2016, and I am looking forward to seeing you there.

by Jeff Jaffe at November 13, 2015 06:41 PM

October 29, 2015

W3C Blog

W3Cx: Expose your skills as a Web developer

With the growing success of W3Cx, our MOOC platform in partnership with edX, W3C is launching a new LinkedIn group as a way for Web developers to showcase their Web technologies skills.

This W3Cx Verified Students group is available to all W3Cx students who have earned a W3Cx Verified Certificate. This group provides a way to recognize the students’ skills in W3C Web technologies. You are welcome to use this group as a discussion forum to post job offers, information about interesting startups that these learners should keep an eye on, and also new Web technology developments.

The W3Cx courses are meant to empower you to become the next leaders and innovators on the Web.

by Bernard Gidon at October 29, 2015 05:53 AM

October 28, 2015

W3C Blog

Privacy Bridges with Do Not Track

We welcome the inclusion of W3C’s Do Not Track specification among the “privacy bridges” proposed by an international group of privacy experts as structures to improve US-EU privacy cooperation. “Bridges” is an apt descriptor for W3C’s work on voluntary consensus technical standards, through which we aim to make the Web work for users, developers, and publishers around the world.

The report, presented at a privacy conference this week, proposes a focus on “user controls.” That has been a key aim of the Tracking Protection Working Group from its charter: “to improve privacy and user control by defining mechanisms for expressing user preferences.” With the publication of the Tracking Preference Expression Candidate Recommendation, and its implementation in browsers, we have given users a tool for preference-expression; with the Compliance specification, we supply some vocabulary for specifying preferences.

As the report (PDF) further recognizes, since the privacy landscape is complex, standard preference mechanisms can help reduce the complexity presented to individual users, even as laws differ around the globe. For the WWW to earn its Ws by operating World Wide, its infrastructure must meet the needs of users around the world. Thus DNT provides scaffolding that can be implemented consistently across jurisdictional borders, while conveying information that users, publishers, and regulators can use to tailor their interpretations to their local jurisdictions. Users everywhere can say “Do Not Track” or “Permit Tracking,” even if jurisdictions vary in looking for “opt-in” or “opt-out.”

Once the technical standards for communicating privacy preferences are set, the work of privacy protection is not yet done. Regulatory attention may be necessary to encourage or enforce implementation and adoption. Participants from both the U.S. Federal Trade Commission and EU Article 29 Working Party have been active discussants in the Working Group. That too is part of the global dialog on privacy.

by Wendy Seltzer at October 28, 2015 07:52 PM

October 27, 2015

W3C Blog

W3C Releases Web Developers Avenue

We are releasing today W3C Developers avenue, a one-stop page featuring the offerings and tools W3C has for Web developers, to guide them to what they need for their work. We are doing this to encourage greater developer engagement with W3C, given developers’ increasing importance in leveraging the Open Web Platform.

This one-stop page presents the offerings and tools W3C has for Web developers:

  • Free and open-source W3C validators, checkers and tools
  • Discourse, to discuss and learn
  • W3C Community Groups to propose and incubate new web technologies
  • Learning, in a W3Cx MOOC or a course from W3DevCampus
  • Testing the Web Forward

We had previously discussed Webizen, a proposed program to increase developer representation within W3C for a modest fee. Instead we have now dropped the fee, introduced a gratitude program, Friends, and focused on how W3C gives Web developers a greater voice, and which services they value in particular. Our free validators provide fair, reliable help in building Web content that works now and will work in the future. More than six thousand people have embraced W3C Community Groups since their launch in 2011, and a considerable amount of incubated work has transitioned to the W3C REC track. We are successfully teaching several thousands of Web developers via our free and premium Training programs.

Our strong commitment to those who make the Web work feeds the virtuous circle of increasing developer engagement at W3C and the value of W3C to the community. We are making it easy for you to affiliate as Friends, take advantage of our offerings, and we encourage you to donate to support us in conducting the activities that fulfill the W3C’s mission.

Diagram of W3C offerings for Web developers

by Jeff Jaffe at October 27, 2015 11:49 PM

October 22, 2015

W3C Blog

Geek Week at W3C

A little-known fact about W3C is that once a year there is Geek Week — a time for staff, technical or not, to spend a few days on a project of their choosing, either individually or with others. Some use this time for self-development, such as a reading week. Some use it for creating or tweaking tools that may prove useful (or not!).

All too often meetings or urgent tasks make Geek Week rather shorter than a week, but even so there are always interesting results. This year was no exception with projects including:

  • A picture quiz for new members of staff, helping them learn the faces of people on the team in time for TPAC next week.
  • A prototype email thread flattener to make W3C mailing lists a bit more user-friendly.
  • Work on a new viable version of Commonscribe, to improve scribing and browsing meeting records.
  • A way of converting ReSpec-based W3C TR documents to EPUB.

Personally, I chose to enhance our IRC bot. Yes, IRC is still used at W3C. A lot. There are more modern technologies around but it works well and being an open protocol allows us to mess around with it as much as we want. One example of this is a bot that is a customized version of infobot by Kevin Lenzo. We’ve named ours botie and its main role is to pass on messages to others, like a personal messenger on horseback, galloping along TCP lanes through the meadows of the internet.

When someone is offline, we type botie, inform alice that Tuesday's meeting is cancelled, and when Alice re-joins the IRC channel she’ll get an alert with that message. Unfortunately botie doesn’t understand much human language and we’ve been getting frustrated giving requests and being met with “huh?”. My Geek Week project, with help from Denis and Antonio, was very simply to add more words that botie understands. The regular expression for this is now:

/^(tell|inform|notify|advise|alert|enlighten|send\sword\sto|ping|remind|ask|beseech|beg|say) *([^ ,]*),? (that|to|about)? *(.*)/

meaning we can now be more eloquent by using commands such as botie, beseech Bob to kindly grace us with his presence at Tuesday's gathering, and botie will obey. Unlikely to make a huge impact, admittedly, but it was fun!
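For the curious, the pattern behaves the same way in Python (assuming, as the bot does, that the “botie, ” prefix has already been stripped before matching):

import re

pattern = re.compile(
    r"^(tell|inform|notify|advise|alert|enlighten|send\sword\sto"
    r"|ping|remind|ask|beseech|beg|say) *([^ ,]*),? (that|to|about)? *(.*)"
)

m = pattern.match("inform alice that Tuesday's meeting is cancelled")
print(m.groups())
# ('inform', 'alice', 'that', "Tuesday's meeting is cancelled")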

Geek Week tends to happen around July or August each year to minimize the impact on day-to-day work. There are inevitably time constraints and ideas flow more freely than the available hours in a day but it’s a valuable W3C tradition and long may it continue.

by Daniel Davis at October 22, 2015 08:53 AM

October 04, 2015

W3C Blog

New online survey: 5 questions on security from the STREWS project

The survey on Web security that we talked about here in August is now closed. But the STREWS project has opened another one:

☑ STREWS Web-security 5-question survey

Update (13 Oct 2015): The same questionnaire is also available via Surveymonkey:

☑ STREWS Web-security 5-question survey (via Surveymonkey)

If you help create or maintain a Web site (or several), you can help us by filling out the survey! It is very short.

The STREWS project is a joint project of KUL, SAP, TCD and W3C and funded in large part by the European Commission. It is finalizing a report called the European Web Security Roadmap and this survey serves as a way to validate its recommendations.

The Roadmap itself will be published later this month. It contains an overview of current (world-wide) research and standardization efforts in the area of Web security, a number of study results and an overview of what is going on in European policy making. Its goal is to help especially policy makers identify the gaps between existing research and standards on the one hand and current and emerging security threats on the other.

It turns out there are vulnerabilities that are well-researched (Cross-Site Scripting, SQL Injection, etc.), but still as prevalent as ever in practice. There are also relatively new threats, such as Pervasive Monitoring, where people are still struggling to define a model and effective countermeasures. (The STRINT workshop last year, organized by STREWS on behalf of W3C and the IAB, was the first workshop where the IETF, W3C and others worked together on a policy against PM.)

The survey is open until October 16.

by Bert Bos at October 04, 2015 01:38 PM

October 01, 2015

W3C Blog

Work Begins on Extensions to WCAG 2.0

Last week a new charter for the Web Content Accessibility Guidelines (WCAG) Working Group (WG) was formally approved by W3C after having been reviewed by the W3C Member organizations. For the first time since the finalization of WCAG 2.0 in 2008, this charter allows the Working Group to explore ways to provide guidelines beyond WCAG 2.0.

The WCAG 2.0 standard continues to be the preeminent reference for web accessibility. A growing number of national and organizational policies around the world reference WCAG 2.0, including policies in Canada, Australia, Japan, India, and the United States. WCAG 2.0 holds up well today despite significant changes in technology.

There have been some changes to the technology landscape, however, that were not fully anticipated in the development of WCAG 2.0. Changes in how people access the Web on mobile devices require success criteria that address those situations more specifically. Users with cognitive and learning disabilities and users with low vision have suggested ways in which success criteria could better address their requirements. In recent years the WCAG Working Group formed task forces on mobile, cognitive, and low vision accessibility to define requirements and candidate success criteria for these three areas. New technologies on the horizon and the rapid evolution of the underlying technologies for user interaction on the Web are likely to continue to drive the need for new guidance.

To address these needs, the WCAG Working Group has begun to develop a framework for WCAG 2.0 extensions. These would be separate guideline documents that increase coverage of particular accessibility needs. Authors and policy-makers would be able to choose to meet the guidelines with one or more extensions, which inherently meet the base WCAG 2.0 guidelines, while organizations that have policies built around WCAG 2.0 alone would not be impacted by the extensions.

The WCAG charter just approved will serve as a bridge to begin work on extensions now while we continue to define what the next generation of WAI guidelines will look like. The Working Group is gathering requirements that may lead to the creation of an updated version of WCAG, or a new set of accessibility guidelines altogether, or both. In order to better integrate the components of web accessibility into a single set of guidelines, the Working Group is exploring the possibility of merging with the Authoring Tool Accessibility Guidelines and User Agent Accessibility Guidelines Working Groups. The Authoring Tool Accessibility Guidelines Working Group (ATAG WG) has just published the completed Authoring Tool Accessibility Guidelines (ATAG) 2.0; and the User Agent Accessibility Guidelines Working Group (UAWG) has just published an updated working draft, rolling in comments from browser vendors and others, and will be publishing the User Agent Accessibility Guidelines (UAAG) 2.0 as a Working Group Note soon.

WCAG 2.0 extensions and setting the stage for next-generation accessibility guidelines means this is an excellent time to join the WCAG Working Group!

by Andrew Kirkpatrick at October 01, 2015 11:00 PM

September 28, 2015

W3C Blog

TPAC 2016 dates and location announced

We have announced today that the 2016 W3C Technical Plenary (TPAC) will be held on 19-23 September 2016 at the Congress Center of Lisbon, in Portugal. Please, save the date!

W3C hosts this yearly five-day event to allow Working Groups (WG) and Interest Groups (IG) to hold their face-to-face meetings in one place and have the opportunity to meet and liaise with participants from other groups. The W3C Advisory Committee meeting, the twice-yearly Membership meeting, takes place during the same week.

The 2015 edition of TPAC takes us to Sapporo, Japan, in just a month. We invite our working group members to register for TPAC 2015 by 7 October.

Unconference/breakout is the preferred format of the Technical Plenary Day. That day will consist of a brief Plenary Session in the morning, including a panel on the future of the Internet and Web with Tim Berners-Lee, Jun Murai, and Vint Cerf that Jeff Jaffe will moderate, followed by breakout sessions. Please continue to propose breakout sessions until the Technical Plenary Day itself.

by Coralie Mercier at September 28, 2015 01:27 PM

September 24, 2015

W3C Blog

More Accessible Web Authoring with ATAG 2.0

Easier production of accessible Web content is an important aspect of improving accessibility of the Web for people with disabilities. One of the factors that can help towards that goal is better support for accessibility in the authoring tools themselves. WAI is pleased to announce the publication of the Authoring Tool Accessibility Guidelines (ATAG) 2.0 which help authoring tool developers create more accessible products that produce more accessible content. People with disabilities need to be able to use authoring tools, and ATAG provides helpful guidance in areas specific to authoring tools, like an accessible editing view.

Real World, Real Tools

ATAG 2.0 is complete, ready for use, and already being implemented (or in the process of being implemented) by native and web-based authoring tools including: Content Management Systems (CMS) like Drupal and DeFacto CMS; Learning Management Systems (LMS) and MOOCs like edX; WYSIWYG and HTML editors like Ephox, AChecker and TinyMCE; social media tools like Easy Chirp; and media editing or specialty tools like CommonLook Global Access.

More Accessible Authoring for People with Disabilities

Tools that meet ATAG 2.0 make it easier for people with disabilities to author web content, with a focus on the editing functions of the authoring tool. Here are some examples:

  • Edit or create content with the font size and colors you need, while publishing in the size and colors you want for your audience.
  • Identify images and media in your editing view with info like alternative text or captions.
  • Use spellchecking or other status indicators that work with assistive technology (not simply a CSS or other vision-only indicator).
  • Navigate through the content structure or outline.
  • Search text and alternative text in the editing view.

ATAG will help you conform to WCAG

The Web Content Accessibility Guidelines (WCAG) 2.0 provide internationally accepted guidance for accessible web content. ATAG 2.0 is closely integrated with WCAG 2.0 and supports WCAG implementation. ATAG gives authoring tool developers guidance on making better tools that help authors in creating content that meets WCAG 2.0. Like other features of tools – spellchecking, grammar checking, syntax validation – accessibility becomes an integrated feature. When the tool helps produce more accessible content, it may improve accessibility at a lower training cost than traditional tools, and help avoid costly revisions incurred by adding accessibility later.

ATAG helps you create more accessible web content by:

  • ensuring that features that support accessibility are as easy to discover and use as other features of the tool.
  • preserving accessibility information across copy-paste or Save As operations.
  • identifying what templates are accessible.
  • helping authors with accessibility checking and repair of accessibility problems.

How Can I Start Using ATAG?

Tool developers can use ATAG 2.0 for guidance on making better authoring tools for their customers. People with disabilities and accessibility advocates can encourage authoring tool vendors to make their tools meet ATAG 2.0. Buyers and purchasing agents of authoring tools can include ATAG 2.0 conformance in Requests for Proposals/Tender, and use ATAG for evaluating the accessibility of tools.

More Information

For additional information about ATAG 2.0, see the ATAG Overview. ATAG 2.0 At a Glance provides a summary of the ATAG guidelines. ATAG’s companion document, Implementing ATAG 2.0, gives a detailed description of the intent of each success criterion, examples and use cases for the success criteria, and additional resources.

ATAG 2.0’s publication as a web standard provides another step forward in making the web more accessible by providing guidance to authoring tool developers on designing more accessible authoring tools that produce more accessible websites.

by Jeanne F Spellman at September 24, 2015 02:51 PM

August 20, 2015

W3C Blog

TPE to CR: Advancing the conversation about Web tracking preferences

W3C’s Tracking Protection Working Group today published the Candidate Recommendation of the Tracking Preference Expression (TPE) and calls for implementation and testing of the specification. Congratulations to the Working Group on this progress.

Abstract: This specification defines the DNT request header field as an HTTP mechanism for expressing the user’s preference regarding tracking, an HTML DOM property to make that expression readable by scripts, and APIs that allow scripts to register site-specific exceptions granted by the user. It also defines mechanisms for sites to communicate whether and how they honor a received preference through use of the “Tk” response header field and well-known resources that provide a machine-readable tracking status.

The “DNT” header is one piece in a larger privacy conversation. The TPE enables users, through their user-agents, to send a standard signal, “Do Not Track”, or alternatively to indicate that they do not mind being tracked; and it enables servers to recognize and respond to that user preference. DNT is implemented in most current browsers, so users can already make the technical request for privacy and ask for compliance by sites they frequent.
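On the wire the mechanism is simple. Here is a sketch with Python’s requests library (example.com is a placeholder; a “Tk” response header comes back only if the server actually implements the TPE):

import requests

# Send the user's tracking preference with the request.
resp = requests.get("https://example.com/", headers={"DNT": "1"})

# A TPE-compliant server answers with a "Tk" tracking-status header, e.g. "N".
print(resp.headers.get("Tk"))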

The Working Group was also chartered to define the meaning of compliance with the DNT preference. While the Working Group aims, in a second document, to define a compliance regime that may be useful across a wide range of use cases, it chose to make the standard flexible enough to work in a variety of regulatory or business scenarios by enabling sites to indicate (via a URI sent in tracking status responses or at a well-known location) what compliance regime they follow. They may choose to follow the W3C-defined Compliance specification or an alternate.

We welcome the work of other groups considering ways to use the DNT header. EFF and a coalition have announced an alternate, more stringent compliance policy. Users can install EFF’s Privacy Badger extension to support that compliance policy by blocking non-compliant trackers. We see this building on top of the TPE specification not as a competing effort, but as expanding the diversity of the Do Not Track ecosystem, using the language of the DNT header to convey a privacy request, and new compliance text to indicate their acceptable responses.

The importance of this work is highlighted by a recent finding from the Technical Architecture Group (TAG) on Unsanctioned Web Tracking. The TAG noted that tracking that abides by Web standards takes into account user needs for privacy and control over data flows, providing transparency to users and researchers, while “unsanctioned tracking” outside of well-defined mechanisms and standards tends to undermine user trust. TPE response and compliance can be tools of Web privacy transparency, helping sites to disclose their practices and meet user expectations. TPE thus enables sites to hear and respond to users’ preferences about tracking — giving alternatives to the regulation the TAG finding suggests might otherwise be necessary.

Next steps: Both the TPE and Compliance specifications are already implemented, but still need further testing (and resolution of remaining issues, on the Compliance spec) before they can be issued as W3C Recommendations. The Working Group will now focus on testing for interoperable implementations and addressing Last Call issues on the Compliance spec. We estimate that both specifications will be published as Recommendations in 2016.

by Wendy Seltzer at August 20, 2015 12:20 PM

August 16, 2015

Reinventing Fire

Bordering on Factual

Yesterday, a cool-looking map showed up on my Facebook feed, shared by a friend; it depicts the North American continent with the historical political boundaries of the native Americans. It listed clear boundaries for separate states of the First Nations: Anasazi, Apache Empire, Arawak, Aztec Empire, Beothuk Empire, Cherokee Soverignty, Cheyenne, Chickasaw, Chilcotin, Chinook, Chumash, Comanche, Cree Federation, Creek, Crow, Dogrib, Flathead, Great Sioux Nation, Haida Gwai, Hopi, Huron Supremacy, Inuit, Iroquois Confederacy, Mayan Empire, Mi’kmaq, Mohican, Navajo, Ojibwa, Olmec Kingdom, Pawnee, Pequot, Pomo, Powhatan, Salish, Shuswap, Slavey, Tlingit, and Ute.

Facebook post of Native American map

I’d never before seen such a clear depiction of the geopolitical boundaries of pre-Columbian America, and it was a stark reminder of how we, as a people, systematically invaded and destroyed a continent of cultured peoples. We wiped away their cultures, their languages, their history, and even the memory of them, leaving only scraps behind, and we protect our current borders of land they used to live on. The American Indian Wars ended in 1924, less than a hundred years ago, but it’s not even part of the American political dialog. And we’ve whitewashed our pogroms against Native Americans, in the same way we’re presently sugar-coating slavery in history courses.

The original person who posted the picture on Facebook also included this commentary,

America before colonization…. I’ve never seen this map in my entire 25 years of formal education. Not in one history book or one lesson. This is not a mistake… Representation matters!!! #NativeHistory #BeforeAmerica

Well said. And others agreed… the post has over 150,000 shares as I write this!

But something smelled wrong to me about the map itself.

Terms like “Empire”, “Soverignty”, “Federation”, “Confederacy”, “Nation”, “Supremacy”, and “Kingdom” seemed oddly specific and out-of-place, and even seemed designed to evoke legitimacy by comparison with European state structures. Were these really accurate labels of the political systems?

The number of different Native American nations seemed far too few; was the map aggregating them in some way I’d never seen before?

The borderlines seemed too crisp; weren’t many of these peoples semi-nomadic?

Glaringly, the Olmec were much earlier than the Aztec and Maya. What era was this supposed to be representing?

And the biggest red flag… there was no source for the data, no provenance for the map, and the label was truncated.

So I dug into it, using TinEye to find the history of the image.

I couldn’t find the original version of the map, or who made it, but I did find a Reddit post from 9 days ago entitled A map where Europe never discovered America. The image link was broken, but I found a more complete version that clearly shows the alternate history timestamp: “Aztec Calendar: Three Acatl (approx 2015 AD)”:

Imaginary map of Alternate History Non-Columbian North America

We aren’t taught this map, because this map isn’t real.

The reality is far more complicated, fortunately or unfortunately. It doesn’t lend itself to easy and obvious emotional appeals. Images and data visualizations make hard things easier to understand, and thus are extremely tempting to share.

A lot of my friends have already shared this map; they’re smart, well-meaning people, and most of them are Liberals of some stripe or another. Before the week is out, this map will be shared many hundreds of thousands of times. Self-styled Right-Wingers, Conservatives, and Republicans are going to jump on this. They’ll point to it as typical of knee-jerk America-hating Liberals, and laugh at the fact that people who consider themselves educated and intelligent were fooled by so obvious a hoax.

Here’s where these Right-Wingers are wrong: the map is incorrect, but the sentiment and the facts informing that sentiment are correct.

It’s easy to laugh at someone for being undereducated if your political party systematically suppresses the correct information that they should be getting from their schools.

That said, we’re all responsible for our own truths, and before you put something out there in the world, or share something someone else has said, you should do some fact-checking. If what you’re saying is a matter of objective fact, rather than subjective opinion, it’s more important to be correct than to be heard, otherwise you might undermine your own valid message.

But in this busy world, if you do make a mistake and spread something that you learn later was incorrect, don’t be so hard on yourself… just correct the record. We make mistakes, and it’s silly and mean-spirited to shame others for that, especially when their intentions are good; and worse yet, it forces people to defend themselves even if they were wrong, and doesn’t reinforce self-correction. Megan, my spouse, casually shared the erroneous map, but when I pointed out the flaws, she corrected herself in her comments, frankly and openly; she didn’t delete the message, she enhanced and corrected it.

This is the lesson we can carry forward from our own history as a nation: we have made mistakes, and we will continue to make mistakes, in how we treat others and how we think about our world; we need to remember these mistakes, and correct our behavior. We need to continue to make this a more perfect nation, knowing we will fail, but with good intentions.

Megan, being a conscientious map-nerd, also found a good source of a well-researched map of the true distribution of the native tribes and nations of North America, lovingly researched and rendered by Aaron Carapella.

Megan also pointed me to a great NPR radio story about Aaron’s maps and the naming of tribes. You can support Aaron’s work by buying his maps (currently on sale).

In the modern era, when it’s so easy for information to spread, it’s our social responsibility to spread factual information and to correct misinformation. It’s important that our technological tools make that easier, not harder; Facebook and Twitter don’t currently provide good tools for either of those tasks. For example, Facebook’s “Report Photo” dialog contains only the options “It’s annoying or not interesting”, “I’m in this photo and I don’t like it”, “I think it shouldn’t be on Facebook”, and “It’s spam”; why can’t they include “It contains factual errors”? How can I politely tell someone that they have made a mistake if the tool doesn’t include a way to do so?

Facebook's “Report image” dialog

I’m hopeful that the work of the W3C Web Annotation Working Group will yield a set of technologies and conforming services that make fact-checking and accountability possible through decentralized annotation. (If this intrigues you, check out Hypothes.is, a socially-conscious annotation service you can use today.)

In summary, here’s a few suggestions:

  • Don’t post stuff you haven’t verified
  • Don’t share stuff unless you’ve checked the sources (Snopes is a good first step, or read an article on Wikipedia if you have more time)
  • Cite your sources
  • Make sure that images and data visualizations accurately reflect the facts at hand
  • Don’t dismiss all facts and opinions just because some mistakes were made; get at the truth of the sentiment, don’t just nitpick
  • Be suspicious when something too closely matches your own world view (i.e. beware confirmation bias)
  • Learn from your mistakes
  • Reward others for learning and growing
  • Don’t assume you know the truth; human knowledge is always expanding

If we want a civil society in a fast-paced, hyper-connected world, we are going to need to adapt our education system, our technological tools, our social norms, and ourselves.



Samuel Cousins informed me on Twitter that the source of the map was a scifi and comic book writer, Joseph Abbott (aka liminalsoup), who posted the map on Reddit looking for feedback for a story they’re writing. I’d guessed it was probably source material for a role-playing game campaign, so I was a bit off base.

by Shepazu at August 16, 2015 10:24 PM

August 12, 2015

Reinventing Fire

Opening Standards

Today, the XML Activity, along with several Working Group charters, was approved. This is a major milestone for W3C, not because of the activities of these groups themselves, but for W3C’s process of developing standards.

For the first time, all of W3C’s active Working Groups now operate in public.

When W3C began, it operated largely as a Member-only consortium. Member companies paid substantial dues, and convened behind closed doors with each other and a handful of Invited Experts, in Member-only Working Groups (WGs). WG mailing lists, teleconferences (telcons), and face-to-face (f2f) meetings were all Member-only, as were editor’s drafts of specifications, and even the list of which organizations and people were participating. Periodic drafts of works in progress (Working Drafts, or WDs) were published on W3C’s public Technical Reports (TR) directory, and feedback was processed on public mailing lists. But the public conversations and member-only conversations didn’t mix, and specific decisions were not transparent. W3C was not a truly open standards body.

When I joined the W3C team in July 2007, one of my personal goals was to open up the organization. I joined the Team to help with SVG; 2 years prior to that, I had actually joined W3C as an Affiliate Member via my small startup, Vectoreal, to move a struggling SVG specification along. I spent the entire revenue of one of my consulting contracts to do so, in the hope that I could make it up in the long run if SVG took off and increased my business; I became a W3C Member because I didn’t feel like the SVG WG was responsive to the then-active SVG community, and I wanted to represent the needs of the average developer. In joining the W3C Team, I took a salary that was about half of my previous years’ earnings. For me, it was important that W3C should not be a “pay-to-play” organization, that Web developers –and not just paying W3C members or hand-picked Invited Experts– should have a strong voice.

This was one of the few things I had in common with the contentious WHATWG folks (among others): a core belief that standards should be developed in the public. When I took over as staff contact of the WebAPI WG, I (along with the WG chairs, Art Barstow and Chaals McCathieNevile) set about merging it with the Web Application Formats (WAF) WG, to make a single WebApps WG that would operate in the public, following on the heels of the newly rechartered HTML WG’s grand experiment as a public WG; unlike the HTML WG, however, a person didn’t have to join the WebApps WG to read and post on the mailing list, lowering the bar to participation and decreasing W3C’s overhead in processing Invited Expert applications. This proved to be the model that the SVG, CSS, and later Working Groups followed.

We were off to a good start, but then we stalled out. Many Working Groups didn’t want to become public, and conversations about making WGs public by default were shut down by Members who’d paid to have a seat, by Invited Experts who liked their personal privilege, and by W3C staffers who had a variety of concerns.

But slowly, with external and internal pressure, including encouragement by some W3C members who put a premium on openness, Working Group charters (renewed every couple of years) more and more commonly designated their group as public. Within a few years, this became the norm, even without pressure. A few Member-only holdouts persisted: the Multimodal Interaction (MMI) WG; the Web Accessibility Initiative (WAI) WGs; and the XML WGs.

</xml> (Closed XML)

In January of 2000, in response to complaints that an update to the public version of an XML specification had taken too long, Tim Bray wrote on an external mailing list, XML-Dev, “[…] But it’s a symptom of the W3C’s #1 problem, lack of resources.” This is a tune we’re all still familiar with, sadly.

Lee Anne Phillips replied to this, on 16 January 2000:

With all respect, I think the lack of resources are the fault of the W3C membership policies, which seem designed to strongly discourage individuals and small organizations and businesses from participating in the process. US$5000 for an Affiliate Membership is beyond the reach of most of us and of many small businesses since that’s in addition to the value of the time spent on the process itself.

Whether this policy is because the big players want negotiations to go on in secret (and secrecy is inherent in the W3C structure so it can’t be an accident) or because W3C just can’t be bothered with the “little people” is a matter of speculation.

What’s certainly true is that there is a vast pool of talent available, many of whom are passionately interested in the development of XML and XML-related standards and might well have more time to spend than the human resources on sometimes grudging lend-lease from major corporations. Witness this and other lists which represent a collective effort of major proportions and a tremendous pool of knowledge and skills.

While we all appreciate the enormous efforts of the organizational participants in the W3C process, who’ve done yeoman service trying to juggle activities which might directly advance their careers at their organizational home with the community responsibilities of the standards process, there just might be a better and more open way.

The Internet standards process started in the RFC methodology, which, though sometimes awkward, chaotic, and slow, allowed rapid innovation and standardization when warranted and was fully public, ensuring participation by the *real* stakeholders in the process, the community served, rather than being dominated by the vendors who want to sell products to them.

In one way or another, we’re the ones who pay for all this work. Surely a way could be found to ensure that we know what the heck is going on. Even better, we could help in the initial stages rather than waiting in front of the curtain until a team of magicians come out and present us with whatever they think we want and are then either cheered or booed off the stage.

Others defended W3C, citing the Invited Expert policy, but Lee Anne (and many other people with similar thoughts) was essentially correct on two major points: it was an exclusionary policy; and it limited the pool of possible contributors that could help a resource-constrained organization.

It took 16 years, but I can now say, truthfully, that W3C is an open standards organization, both in its specifications and in its process.

Yeah, but is it really open?

There are aspects of W3C that are still not open, and are never likely to be, for pragmatic business and collaboration reasons.

Michael Champion from Microsoft said in this same XML thread:

The W3C’s secrecy policies are there because it is a treaty organization of competitors, not a friendly group of collaborators.

This was 16 years ago; I don’t know if he’d say the same thing today, but it’s only a little less true… W3C is a forum for coopetition, and discussions are usually friendly, but it is business. As a W3C staffer, I’m privy to future implementation plans, concerns about patents and Intellectual Property, and other in camera, off-camera discussions that are important for standards makers to know, but which companies won’t or can’t talk about in public. If they couldn’t talk about it in Member-only space, it wouldn’t be talked about at all, or would be couched in lies, deception, and misdirection. This frankness is invaluable, and it should be and is respected by participants. This is part of the bond of trust that makes standards work.

There are other parts of W3C’s standards-making process that are still not completely open for participation, due to logistical issues:

  • telcons: Trying to make decisions, or even have coherent conversations, on a 1-hour telcon with dozens of people, some of whom may not be known to the WG or may not be familiar with the discussion style of the WG, would be a nightmare. Telcons are limited to WG participants, including Invited Experts, and any topic experts they invite on a one-off basis. The logs (meeting minutes) are published publicly, however.
  • f2f meetings: The same constraints for telcons apply to f2f meetings, with the extra factors of the costs for the host (meeting room, food, network, and so on), facility access (NDAs, etc.), planning (who’s available and when and where), and travel costs (which are prohibitive for some Invited Experts, and would be for many members of the public). As with telcons, the meeting minutes are scribed and published publicly. In addition, around these f2f meetings there are sometimes public meet-ups for the locals to see presentations by the WG, to meet them, make suggestions or ask questions, and otherwise socialize with the WG.
  • informal brainstorming: This might happen in a hallway or a cafe, or on an unrelated mailing list or issue tracker; it might be a collaboration between colleagues or competitors, or it might be an individual tinkering with their own thoughts until they feel they have something worth sharing. It’s hard to make this fully open.
  • decision-making: One of the most significant benefits of W3C membership is the right to help make final technical decisions in a Working Group. Anyone can make suggestions and requests, but ultimately, it is the participants of the WG (and often, more specifically, the editors of the spec) who make the decisions, through a process of consensus. This is not only logistical (somebody has to decide among different options) but usually pragmatic as well, since the members of the WG often are the implementers who will have to code and maintain the feature, and who are well-informed about the tradeoffs of factors like usability, performance, difficulty of implementation, and so on. But even if the WG has the final decision power, there is still an appeal process: anyone, whether a W3C member or an average developer, can lodge a Formal Objection if a technical feature is flawed, and the technical merits will be reviewed and decided by W3C’s Director.
  • write access: Who can edit the specifications? Who has control over merging pull requests? There are IP issues with contributions, and also a coherent editorial and technical tone that needs to be adhered to. Currently, this is limited to WG participants, and it’s likely to stay that way.

But the vast majority of the standardization process is now public, including almost all technical discussions, meeting minutes, decision processes, and specification drafts, as well as the lists of people and organizations involved.

W3C has greatly benefited from this openness; we get invaluable feedback and discussion on our mailing lists, and WGs tend to take such feedback very seriously. We value this public input so much that W3C has also expanded its offerings with W3C Community Groups, which are free and open for everyone to participate in: anyone can propose a topic for technical discussion, and if others are interested, they can form a group to develop that idea further, to write use cases and requirements, and even a technical specification; if the idea gets traction, W3C may even pick it up for the Recommendation-track formal standardization process.

Some people have taken a more extreme view: that W3C should lower its membership dues by getting rid of the technical staff. I’m all for finding ways to lower our membership dues, to be more inclusive, but getting rid of W3C technical staff would mean decreasing oversight and openness, not the reverse; most of the W3C staff are dedicated to making sure that we serve society as well as our members, and a lot of critical technical work wouldn’t happen without them. Our members are doing W3C and society a great service by sponsoring our work, even while they benefit technologically and financially themselves.

●。 (Full Circle, Full Stop)

The schism between the XML community and W3C was one of several such schisms at W3C, and perhaps the first major one. The fight over whether W3C should adopt a Royalty-Free Patent Policy for its specifications was another (we did, and W3C’s Royalty-Free patent policy is now one of its crown jewels); the battle for control over HTML with WHATWG was yet another; the ongoing debate about whether W3C’s specification document license should allow forking and require attribution is still another (W3C has recently released an optional forkable, GPL-compatible, attribution license for software and documents that may be adopted on a per-Working Group basis). All of these were about openness, in one way or another. Openness is about transparency, and accountability, and the ability to participate, and freedom of use, but also about control, and who controls what; this will always be a matter of heated debate.

The particular flavor of openness being debated in the XML community in 2000 was about a process open to participation, with transparent oversight of the discussions and decision-making at all formal stages of creating a standard. XML is widely used, and largely stable, so interest in further XML standards development has waned over the years; participation has dwindled, and with the final versions of these specifications (XML Core, XSLT, XQuery, XPath, XProc, EXI) being published, this may well be their final charter renewal. Now, with the XML Working Groups rechartered for the last time, and finally as public, we have a nice bookend to a process of open standards.


Closing Words

As usual, the reality is not quite as tidy as the story. After I first wrote this article, I went back to fact-check myself, and found some dusty corners that need cleaning out. For full disclosure, I’m compelled to mention the exceptions. We have some non-technical coordination groups, such as the Patents and Standards Interest Group and the Advisory Board, that are not public. Some of our Web Accessibility Initiative (WAI) WGs are (ironically) still not public (specifically, the Protocols and Formats WG and the WAI Coordination Group), but they are currently being rechartered, and all the WAI groups will be public later this year when the rechartering is settled. The Member-only Voice Browser WG is winding down, and is scheduled to close at the end of this month. The most notable exception is the Math WG, one of our oldest WGs (chartered in 1997); through a loophole in the charter process, the Math WG has not been rechartered since 2006; instead, it’s been extended by W3C management, on the grounds that it is only doing errata on existing MathML specs, rather than new technical publications. Still, this is an unwarranted exception, and my colleague Mike Smith and I are now advocating to resolve it as soon as possible, so that no technical W3C Working Group, even one doing only specification maintenance, operates outside the public.

Once we’ve made sure every technical W3C Working Group is chartered to operate in the public, we all need to ensure that W3C doesn’t slip back into allowing Member-only technical Working Groups. This could take the form of a W3C policy change (petition, anyone?), or failing that, a mindfulness by W3C Advisory Committee Representatives that when we last tried the Member-confidential model, it led to worse specifications, slower progress, more time-consuming feedback processes, and a community schism that nearly tore W3C apart. Let’s prevent that mistake from happening again. Eternal vigilance, and all that…

by Shepazu at August 12, 2015 06:39 AM

August 11, 2015

Reinventing Fire

Music to my Eyes!

I’m proud to have helped in the formation of the new W3C Music Notation Community Group. It’s a free and open group, and if you’re interested in the future of digital music representation, you should join now!

Chopin Prelude Opus 28, No. 7

Here’s a little history for you.

When I was kicking off the W3C Audio Incubator Group in 2010, which would spawn the Audio Working Group a year later, I knew that the Web platform needed the ability to generate and process audio, not just play back prerecorded audio streams. I didn’t know how the technology worked (and I’m still fuzzy on it); I didn’t know all of the use cases and requirements; I didn’t know the industry; I didn’t know the culture; I didn’t know the people; and I certainly didn’t know what the future held.

What I did know was that Flash was dying, and all the Web audio projects that had relied on Flash would need a new platform. And I knew that it was important that we somehow capture and encode this important cultural expression. And I knew how to find passionate people who knew all the things I didn’t, and I knew that if I gave them a place to talk (and a little gentle coaching), they would know how to make audio on the Web a reality. I wasn’t disappointed: a demo of the Audio Data API prototype by David Humphrey and Corban Brook rekindled my interest in a Web audio API; Alistair MacDonald led the initial effort as chair of the Audio Incubator Group, providing context and connections for starting the work; Chris Rogers, who designed Apple’s Core Audio before moving to Google, wrote the WebKit implementation and the early drafts of the Web Audio API; Olivier Thereaux and Chris Lowis from BBC picked up the chair baton for the W3C Audio WG, later handing it to the capable Joe Berkovitz (Noteflight) and Matthew Paradis (BBC); and Chris Wilson (Google) and Paul Adenot (Mozilla) stepped up as editors of the Web Audio API spec when Chris Rogers moved along.

Thanks to the hard work by these and many other dedicated people, we are close to stabilizing the Web Audio API (a synthesizer, DSP, mixing board, and audio processor in the browser), and we have commitments from all the major browser vendors to implement and ship it. We also have the Web MIDI API, which is not a way to play back bleep-beep-blorp MIDI files in your browser, but a way to control MIDI devices (e.g. musical instruments) via your browser, and vice versa.
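To give a flavor of what “a synthesizer in the browser” means, here is a minimal sketch of the node-graph style the Web Audio API uses: an oscillator routed through a gain node and faded out over two seconds. (This assumes a current, unprefixed AudioContext; 2015-era WebKit builds still used a vendor prefix.)

    // A 440 Hz sine tone, routed through a gain node and faded out.
    const ctx = new AudioContext();

    const osc = ctx.createOscillator();
    osc.type = "sine";
    osc.frequency.value = 440; // A4

    const gain = ctx.createGain();
    gain.gain.setValueAtTime(0.25, ctx.currentTime);
    // exponentialRampToValueAtTime can't reach 0, so fade to a tiny value.
    gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 2);

    osc.connect(gain);
    gain.connect(ctx.destination);

    osc.start();
    osc.stop(ctx.currentTime + 2);

Every node is a patch point in a processing graph, which is what makes the “mixing board” description apt.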

These are pretty obvious technologies for W3C to develop. But the scope of the audio standardization work wasn’t always so clear. There was a vocal contingent among the interested parties that wanted us to standardize a music notation format… like HTML for music.

At the time, we decided this was not our priority, not only because it would dilute our focus from an already daunting task, and not only because it was a relatively niche market, but because there was already a winner in the music notation format space: MusicXML. And Michael Good, the creator and maintainer of MusicXML, made it clear that it had been a challenging undertaking, for a competitive market, and that he wasn’t ready to bring MusicXML to a formal standards body.

But the metronome ticks on, and times change. MusicXML was acquired by MakeMusic, a major music software vendor, and Michael began to warm to the idea of his creation having a home at a vendor-neutral standards body like W3C (with Joe Berkovitz patiently encouraging him); at the same time, Daniel Spreadbury (Steinberg) was developing SMuFL (Standard Music Font Layout), and together they encouraged their companies to bring their music standards under the care of a W3C Community Group.

Thus, two weeks ago, we formed the Music Notation Community Group, and already over 160 people have joined the group! Normally, W3C staff doesn’t devote resources to Community Groups, but Ivan Herman and I lent our W3C experience to the transfer and group formation in our spare time, because we saw the cultural value in having music representation on the Web (though unlike all the other people mentioned in this blog post, I’m sadly musically illiterate… “they also serve who only standardize”). Michael, Daniel, and Joe are co-chairing the group, and we’re looking forward to lively conversations.

Music and technical standards may seem like strange bedfellows, but there’s a long tradition there. The New Yorker, in a piece on HTML5 entitled “The Group That Rules the Web” by Paul Ford, referenced a 1908 article in Music Trade Review about player piano standards. In a hauntingly familiar account, the face-to-face meeting of a committee of industry leaders decided upon a nine-to-the-inch perforation scale for player piano rolls (think punch cards on scrolls). The rise and fall of the player piano industry is a fascinating read, and should give us perspective on how we build for eternity and for change.

Will MusicXML (or its successor) ever be natively supported by browsers, so that we can see it, read it, and edit it without the need for JavaScript (and SVG) rendering? Possibly not, if we learn from the history of the even more critical MathML, which still is not properly supported in browsers. But even if it is never natively supported, there are good reasons to have a vendor-neutral digital music notation format for the Web:

  • Simple interchange between music applications, both desktop and (increasingly) Web-based.
  • Annotation, which is common among musicians. As with any kind of data, the representation is the thing that is the proximal target for annotation, but it’s the data that should be annotated; as an example, I might have some CSV data, which I can render as a bar chart, line chart, or pie chart, but an annotation should apply to all chart types, that is, it is inherent in the underlying data, not in the rendering. Similarly, it’s not the SVG <path> or <use> element that represents a musical note that should be annotated, but the underlying music data model.
  • A DOM representation, which can be read and modified by JavaScript, flowed and laid out with CSS, and which could serve as custom elements for a MusicML component library (see the sketch below).
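To illustrate that last point, here is a hypothetical sketch of such a component, written against today’s custom elements API (the registration call has changed since this was written). The <music-score> element name, its src attribute, and the note-counting “rendering” are all illustrative assumptions, not part of any published specification:

    // Hypothetical sketch: a custom element wrapping a MusicXML-like data model.
    class MusicScore extends HTMLElement {
      async connectedCallback() {
        const src = this.getAttribute("src");
        if (!src) return;
        const xml = await (await fetch(src)).text();
        const doc = new DOMParser().parseFromString(xml, "application/xml");
        // A real component would lay out staves and glyphs (likely with SVG
        // and a SMuFL font); here we only prove we can reach the data model.
        this.textContent =
          `Score with ${doc.getElementsByTagName("note").length} notes`;
      }
    }
    customElements.define("music-score", MusicScore);

    // Usage (hypothetical): <music-score src="prelude-op28-no7.xml"></music-score>

The point is that annotation, scripting, and styling would all hang off the underlying music data model, not off whatever pixels or paths happen to render it.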

And after all, past is prelude, and who knows what the future holds?

As your reward for reading this whole meandering post, here in full is the news article on the historic and infamous Gathering of the Player Men at Buffalo, for your edification and enjoyment, as an image and an OCR transcription, for posterity.

Gathering of the Player Men at Buffalo article

Gathering of the Player Men at Buffalo.

First Meeting of Player Manufacturers Ever Held in This Country—Discussions Anent the Number of Perforations to the Inch for a Standard Roll—Meeting Called to Order by L. L. Doud—Addresses by Various Representatives Who Argue That Their Position Is the Correct One—Motion to Lay the Matter Over to Convention in Detroit Is Lost—Finally Votey’s Motion Agreeing Upon Nine Perforations to the Inch Is Adopted.

(Special to The Review.)

Buffalo, N. Y., Dec. 10, 1908.

Pursuant to a call issued by the A. B. Chase Co., Norwalk, O., the piano player manufacturers and their representatives gathered in this city to-day, with the object in view of settling the vexed question of the scale to be used for the 88-note players. The player trade was well represented at this first meeting, which may be said to almost reach the dignity of a convention.

There is unquestionably a decided difference of opinion as to the number of perforations required on the music roll to the inch. There are some who hold that the 88-note roll should be no longer than the present 65-note roll. Leading makers hold that nine-to-the-inch must necessarily be the standard adopted, and the advocates of the nine-to-the-inch won at this meeting.

The first meeting was held at the Hotel Iroquois, in this city, and opened shortly after ten. The score of delegates present constituted a truly representative gathering, the majority of the leading manufacturers having someone to look after their interests and express their opinions.

The following were present: Wm. J. Keeley, the Autopiano Co., New York; H. W. Metcalf, representing the Simplex Piano Co., Worcester, Mass.; the Wilcox & White Co., Meriden, Conn.; J. W. Macy, the Baldwin Co., Cincinnati, O.; E. S. Votey, the Aeolian Co., New York; R. A. Rodesch, the Rodesch Piano Co., Dixon, Ill.; T. M. Pletcher, the Melville Clark Piano Co., Chicago; Gustave Behning, the Behning Piano Co., New York; H. C. Frederici, the Claviola Co. and the American Perforated Music Co., New York; J. H. Dickinson, the Gulbransen-Dickinson Co., Chicago; Otto Higel, Otto Higel Co., Toronto, Ont.; D. W. Bayer, Chase & Baker Co., Buffalo, N. Y.; H. Keuchen, Shaw Piano Co., Baltimore, Md.; Chas. G. Gross, Chas. M. Stieff, Baltimore, Md.; Paul E. F. Gottschalk, Niagara Music Co., Buffalo, N. Y.; E. B. Bartlett, W. W. Kimball Co., Chicago; P. B. Klugh, Cable Company, Chicago; J. A. Stewart, Farrand Co., Detroit, Mich.; L. L. Doud, A. B. Chase Co., Norwalk, O.; J. H. Parnham, Hardman, Peck & Co., New York; and J. H. Chase and Jacob Heyl, of the Chase & Baker Co., Buffalo, N. Y.

The meeting was called to order by L. L. Doud, who briefly pointed out the necessity of reaching some definite understanding regarding the best form of music roll for 88-note players and the number of perforations to the inch that would give best results from the viewpoint of both manufacturer and public. Mr. Doud stated that as the 88-note player was but in its infancy, now is the time to adopt some standard music roll that will aid the purchaser in obtaining the best results from a maximum number of rolls to select from—in other words, that the purchaser be not confined to one particular make of music roll and the natural limitations of such a list. At present 6 and 9 perforations to the inch represent the two extremes, the Aeolian Co.’s 12-to-the-inch roll being more in the nature of an experiment.

The gentlemen present then selected Mr. Doud as chairman and Mr. Chase as secretary, and then the representatives were called upon to give their individual opinions and make suggestions, with the good of the various manufacturers and the satisfaction of the public, the real judge and jury, in mind.

T. M. Pletcher, representing the Melville Clark Piano Co., was the first to speak, and said that in the opinion of his company the six-to-the-inch perforations afforded greater possibilities from a musical standpoint, in view of the greater quantity of air controlled by the perforations. Mr. Pletcher added, however, that his company were willing to abide by the sense of the convention, and had, in fact, already turned out a number of player-pianos using rolls with nine perforations to the inch.

R. A. Rodesch, who has adopted eight perforations to the inch, then spoke on the subject of a standard roll, and held that such a measurement as he used withstood climatic changes better than the nine-to-the-inch roll, and thereby insured proper tracking. Mr. Rodesch held, as did the majority of those present, that the double tracker board, one adapted to 65-note rolls, was a necessity for the present at least, affording protection to both dealer and customer.

In setting forth the Cable Company’s stand, P. B. Klugh said that the nine-to-the-inch scale had been adopted by that company and they were not open to argument on the subject, as such a scale had given entire satisfaction. Mr. Klugh offered as a solution of the improper tracking question, the adoption of an adjustable end to the roll, which when pressed against a loosely-rolled music roll would force perforations into perfect alignment. He also gave it as his opinion that the habit of twisting the roll as tightly as possible before playing was a mistake, as when held tightly, proper adjustment of the roll was impossible. Mr. Klugh stated that when the purchaser understood the secret of this method of adjustment the nine-to-the-inch roll would give entire satisfaction in every instance.

J. H. Parnham also stated that Hardman, Peck & Co. had found no trouble with rolls cut nine-to-the-inch, either before or after selling.

Gustave Behning then informed the meeting that his company had found the nine-to-the-inch scale so satisfactory that they had begun to cut the 65-note music with smaller perforations and with excellent results.

The meeting then adjourned until the afternoon.

The Afternoon Session.

The afternoon session was called to order at 2 p.m., and some time was given over to a general discussion of the relative value of the rolls having eight and nine perforations to the inch, respectively. Mr. Rodesch offered for examination a number of rolls cut on the eight-to-the-inch scale, which were compared with one of nine shown by Mr. Votey.

The general discussion was here interrupted for the purpose of considering whether or not to finally adopt the 88-note roll in preference to the 85-note roll. Mr. Heyl, of the Chase & Baker Co., spoke at length on the subject, stating that in Europe pianos of seven-octave range, or 85 notes, cutting off the three treble notes, were manufactured in considerable quantities and had a ready sale. In support of the statement, however, that the 88 notes were needed, Mr. Heyl offered the following figures: Out of 3,838 compositions cut by the Chase & Baker Co., 1,130 needed only 65 notes; 2,425, 78 notes; 2,542 needed 80 notes; 2,660 required 83 notes, and 3,676 could be cut in an 85-note range.

A motion was made and carried that the music be cut to the full 88 notes. It was also moved and carried that the rolls be made with a standard width of 11¼ inches, leaving a margin in each side for future development, it being acknowledged that any advance in future would need the margin in its consummation.

Mr. Rodesch here proposed that the final settlement of the perforation question be postponed until the annual meeting of the National Manufacturers Association, to be held in Detroit next June, that the matter could be more thoroughly studied, and several of those present concurred with him in that opinion, but the general sense of the body was that such a postponement would only increase the feeling of uncertainty among both manufacturers and dealers and cause additional trouble for those manufacturers who were turning out players and music rolls that would not conform with the standard agreed upon.

Mr. Votey then made a motion, which was unanimously carried, to the effect that the matter be decided at once. A standing vote was taken and twelve were found to favor the nine-to-the-inch scale, with only six backing the eight-to-the-inch standard. Upon motion the vote in favor of nine perforations as a standard scale was declared unanimous.

Thus, with a little over four hours’ discussion, a question was settled that has caused much worriment to the trade for over a year past, and especially so within the last few months. With a standard roll all manufacturers have a chance to do business, for a purchaser can go anywhere and get any selection he desires to play, and is not confined to one list, often restricted.

Mr. Votey, following the settlement of the perforation standard, offered a suggestion, which was accepted, to the effect that the manufacturers adopt for the 88-note music rolls the spool about being used by the Aeolian Co. The new spool has clutches inserted in the ends instead of pins, and attachments are furnished for inserting in the holders on the player, the other end being arranged to fit the clutches placed within the ends of the spool. This new spool, Mr. Votey claims, makes proper tracking a simple proposition, as the roll can be held tightly and accurately, a difficult feat where the pin is used, especially if it is driven into a spool made of cross-grained wood. The spool is also fitted with an adjustable end which may be pressed against the music roll in such a way as to force the perforations into alignment. While this adjustable end is patented, the Aeolian Co. have not, nor will not, patent the clutch, offering it for the free use of other manufacturers. The individual manufacturers, too, may invent an adjustable end that will not conflict with the patented article, but give the same result.

The question of price also came up before the meeting, and while no action was taken, it was strongly suggested that while the field was a new one, manufacturers should insure both themselves and the dealer a fair and liberal profit while the opportunity offers. Mr. Votey here stated that the Aeolian Co. would sell their 88-note rolls at the same price as the 65-note, believing that in large quantities they can be made nearly as cheaply. This company are also considering the making of player-pianos with only one tracker, that for 88-note rolls.

J. H. Dickinson, of the Gulbransen-Dickinson Co., suggested that the player-piano and music roll manufacturers present effect a permanent organization for meeting at stated times and discussing such questions as interest the meetings. As most of the firms represented were members of the National Manufacturers Association, Mr. Dickinson’s suggestion was not acted upon.

At the close of the convention of player-piano and music roll manufacturers, Paul E. V. Gottschalk, general manager of the Niagara Music Co., Buffalo, presented each one present with a music roll bearing “The Convention March,” composed by Paul R. Godeska, and “Dedicated to the Convention of Player-Piano Manufacturers, held at the Iroquois Hotel, Buffalo, N. Y., December 10, 1908.” The roll was in a handsome box decorated with holly and made a pleasing souvenir.

by Shepazu at August 11, 2015 11:26 PM

August 09, 2015

ishida >> blog

UniView 8.0.0a: CJK features added

Picture of the page in action.
>> Use UniView

This update allows you to link to information about Han characters and Hangul syllables, and fixes some bugs related to the display of Han character blocks.

Information about Han characters displayed in the lower right area will have a link, “View data in Unihan database”. As you’d expect, this opens a new window at the page of the Unihan database corresponding to that character.

Han and Hangul characters also have a link, “View in PDF code charts (page XX)”. On Firefox and Chrome, this will open the PDF file for that block at the page that lists the character. (For Safari and Edge you will need to scroll to the page indicated.) The PDF is useful if there is no picture or font glyph for that character, and it also lets you see the variant forms of the character.

For some Han blocks, the number of characters per page in the PDF file varies slightly. In that case you will see the text “approx”, and you may have to look at a page adjacent to the one you are taken to in order to find the character.
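For the curious, links like these can be derived from the character’s code point alone. Here is a rough sketch: the Unihan URL pattern is unicode.org’s real lookup CGI, but the charsPerPage figure and the front-matter page offset are my assumptions, which is exactly the “approx” caveat above:

    // Sketch: building the two kinds of links for a Han character.
    function unihanLink(ch: string): string {
      const hex = ch.codePointAt(0)!.toString(16).toUpperCase();
      return `http://www.unicode.org/cgi-bin/GetUnihanData.pl?codepoint=${hex}`;
    }

    // Code charts are one PDF per block (e.g. U4E00.pdf); the #page fragment
    // is honored by Firefox and Chrome, per the behavior described above.
    function chartLink(cp: number, blockStart: number, charsPerPage = 256): string {
      const block = blockStart.toString(16).toUpperCase().padStart(4, "0");
      const page = Math.floor((cp - blockStart) / charsPerPage) + 2; // +2 for front matter (assumed)
      return `http://www.unicode.org/charts/PDF/U${block}.pdf#page=${page}`;
    }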

Note that some of the PDF files are quite large. If the file size exceeds 3 MB, a warning is included.

by r12a at August 09, 2015 08:56 AM

August 05, 2015

W3C Blog

Participate in a survey on Web security by the STREWS project

W3C, along with SAP, TCD, and KUL, is a partner in a European project called STREWS. The goal is to bring together research and standardization in the area of Web security. The project is funded by the European Commission (7th Framework Programme). It organizes workshops, writes reports, and, as its main goal, is writing a “European Roadmap for Research on Web Security”.

That report is due later this year. To help assign priorities to the various topics in security, the project has created a survey, targeted especially at people who maintain Web sites.

If you maintain a Web site, or have helped set one up in the past, or plan to do so soon, you can help: please take some time to fill out this survey:

STREWS Web-security interactive survey

The STREWS project would especially like to hear from other European research projects, because one of the goals of the Roadmap is to help the European Commission select areas of research that need more support. That’s why a few questions talk specifically about “projects.” Just skip any questions that don’t apply to you.

The survey is open until September 11. The Roadmap is expected to be published later in September or October. It will be freely available from the STREWS Web site.

by Bert Bos at August 05, 2015 11:01 PM