October 01, 2015

W3C Blog

Work Begins on Extensions to WCAG 2.0

Last week a new charter for the Web Content Accessibility Guidelines (WCAG) Working Group (WG) was formally approved by W3C after having been reviewed by the W3C Member organizations. For the first time since the finalization of WCAG 2.0 in 2008, this charter allows the Working Group to explore ways to provide guidelines beyond WCAG 2.0.

The WCAG 2.0 standard continues to be the preeminent reference for web accessibility.  A growing number of national and organizational policies around the world reference WCAG 2.0, including Canada, Australia, Japan, India, and the United States. WCAG 2.0 holds up well today despite significant changes in technology.

There have been some changes to the technology landscape, however, that were not fully anticipated in the development of WCAG 2.0. Changes in how people access the Web on mobile devices require success criteria that address those situations more specifically. Users with cognitive and learning disabilities and users with low vision have suggested ways in which success criteria could better address their requirements. In recent years the WCAG Working Group formed task forces on mobile, cognitive, and low vision accessibility to define requirements and candidate success criteria for these three areas. New technologies on the horizon and the rapid evolution of the underlying technologies for user interaction on the Web are likely to continue to drive the need for new guidance.

To address these needs, the WCAG Working Group has begun to develop a framework for WCAG 2.0 extensions. These would be separate guideline documents that deepen coverage of particular accessibility needs. Authors and policy-makers could choose to conform to WCAG 2.0 together with one or more extensions; content that meets an extension would inherently meet the base WCAG 2.0 guidelines, so organizations that have policies built around WCAG 2.0 alone would not be impacted by the extensions.

The WCAG charter just approved will serve as a bridge to begin work on extensions now while we continue to define what the next generation of WAI guidelines will look like. The Working Group is gathering requirements that may lead to the creation of an updated version of WCAG, or a new set of accessibility guidelines altogether, or both. In order to better integrate the components of web accessibility into a single set of guidelines, the Working Group is exploring the possibility of merging with the Authoring Tool Accessibility Guidelines and User Agent Accessibility Guidelines Working Groups. The Authoring Tool Accessibility Guidelines Working Group (ATAG WG) has just published the completed Authoring Tool Accessibility Guidelines (ATAG) 2.0; and the User Agent Accessibility Guidelines Working Group (UAWG) has just published an updated working draft, rolling in comments from browser vendors and others, and will be publishing the User Agent Accessibility Guidelines (UAAG) 2.0 as a Working Group Note soon.

WCAG 2.0 extensions and setting the stage for next-generation accessibility guidelines means this is an excellent time to join the WCAG Working Group!

by Andrew Kirkpatrick at October 01, 2015 11:00 PM

September 28, 2015

W3C Blog

TPAC 2016 dates and location announced

We have announced today that the 2016 W3C Technical Plenary (TPAC) will be held on 19-23 September 2016 at the Congress Center of Lisbon, in Portugal. Please save the date!

W3C hosts this yearly five-day event to allow Working Groups (WG) and Interest Groups (IG) to hold their face-to-face meetings in one place and have the opportunity to meet and liaise with participants from other groups. The W3C Advisory Committee meeting, the twice-yearly Membership meeting, takes place during the same week.

The 2015 edition of TPAC takes us to Sapporo, Japan, in just a month. We invite our Working Group members to register for TPAC 2015 by 7 October.

Unconference/breakout is the preferred format of the Technical Plenary Day. The day will open with a brief Plenary Session in the morning, including a panel on the future of the Internet and the Web with Tim Berners-Lee, Jun Murai, and Vint Cerf, moderated by Jeff Jaffe, followed by breakout sessions. Please continue to propose breakout sessions until the Technical Plenary Day itself.

by Coralie Mercier at September 28, 2015 01:27 PM

September 24, 2015

W3C Blog

More Accessible Web Authoring with ATAG 2.0

Easier production of accessible Web content is an important aspect of improving accessibility of the Web for people with disabilities. One of the factors that can help towards that goal is better support for accessibility in the authoring tools themselves. WAI is pleased to announce the publication of the Authoring Tool Accessibility Guidelines (ATAG) 2.0, which help authoring tool developers create more accessible products that produce more accessible content. People with disabilities need to be able to use authoring tools, and ATAG provides helpful guidance in areas specific to authoring tools, like an accessible editing view.

Real World, Real Tools

ATAG 2.0 is complete, ready for use, and already implemented (or in the process of being implemented) by native and web-based authoring tools, including: Content Management Systems (CMS) like Drupal and DeFacto CMS; Learning Management Systems (LMS) and MOOCs like edX; WYSIWYG and HTML editors like Ephox, Achecker, and TinyMCE; social media tools like Easy Chirp; and media editing or specialty tools like Common Look Global Access.

More Accessible Authoring for People with Disabilities

Tools that meet ATAG 2.0 make it easier for people with disabilities to author web content, with a focus on the editing functions of the authoring tool. Here are some examples:

  • Edit or create content with the font size and colors you need, while publishing in the size and colors you want for your audience.
  • Identify images and media in your editing view with information like alternative text or captions.
  • Use spellchecking or other status indicators that work with assistive technology (not simply a CSS or other vision-only indicator).
  • Navigate through the content structure or outline.
  • Search text and alternative text in the editing view.

ATAG Will Help You Conform to WCAG

The Web Content Accessibility Guidelines (WCAG) 2.0 provide internationally accepted guidance for accessible web content. ATAG 2.0 is closely integrated with WCAG 2.0 and supports WCAG implementation. ATAG gives authoring tool developers guidance on making better tools that help authors create content that meets WCAG 2.0. Like spellchecking, grammar checking, and syntax validation, accessibility becomes an integrated feature of the tool. When the tool helps produce more accessible content, it can improve accessibility at a lower training cost than traditional tools, and help avoid the costly revisions incurred by adding accessibility later.

ATAG helps you create more accessible web content by:

  • ensuring that features that support accessibility are as easy to discover and use as other features of the tool;
  • preserving accessibility information across copy-paste or Save As operations;
  • identifying which templates are accessible;
  • helping authors check for and repair accessibility problems.
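That last point is where tooling can do concrete work. As a toy illustration (my own sketch, not anything ATAG prescribes; ATAG sets requirements, not algorithms), here is the kind of automated check an authoring tool might run to flag images missing the alternative text that WCAG 2.0 calls for:

```python
# Illustrative sketch only: flag <img> elements with no alt attribute,
# the sort of automated accessibility check an authoring tool could
# surface to authors for repair. Uses only the Python standard library.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> that lacks an alt attribute."""

    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            # alt="" is legitimate for decorative images, so only a
            # completely absent alt attribute is flagged here.
            if "alt" not in attr_dict:
                self.problems.append(attr_dict.get("src", "(no src)"))

def check(html):
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.problems

print(check('<p><img src="logo.png"><img src="barn.jpg" alt="A red barn"></p>'))
# ['logo.png']
```

A real checker would go further, prompting the author for a repair and exposing its results accessibly; this sketch only shows the detection step.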

How Can I Start Using ATAG?

Tool developers can use ATAG 2.0 for guidance on making better authoring tools for their customers. People with disabilities and accessibility advocates can encourage authoring tool vendors to make their tools meet ATAG 2.0. Buyers and purchasing agents of authoring tools can include ATAG 2.0 conformance in Requests for Proposals/Tender, and use ATAG for evaluating the accessibility of tools.

More Information

For additional information about ATAG 2.0, see the ATAG Overview. ATAG 2.0 At a Glance provides a summary of the ATAG guidelines. ATAG’s companion document, Implementing ATAG 2.0, gives a detailed description of the intent of each success criterion, examples and use cases for the success criteria, and additional resources.

ATAG 2.0’s publication as a web standard is another step toward a more accessible web, giving authoring tool developers guidance on designing more accessible authoring tools that produce more accessible websites.

by Jeanne F Spellman at September 24, 2015 02:51 PM

August 20, 2015

W3C Blog

TPE to CR: Advancing the conversation about Web tracking preferences

W3C’s Tracking Protection Working Group today published the Candidate Recommendation of the Tracking Preference Expression (TPE) and calls for implementation and testing of the specification. Congratulations to the Working Group on this progress.

Abstract: This specification defines the DNT request header field as an HTTP mechanism for expressing the user’s preference regarding tracking, an HTML DOM property to make that expression readable by scripts, and APIs that allow scripts to register site-specific exceptions granted by the user. It also defines mechanisms for sites to communicate whether and how they honor a received preference through use of the “Tk” response header field and well-known resources that provide a machine-readable tracking status.

The “DNT” header is one piece in a larger privacy conversation. The TPE enables users, through their user-agents, to send a standard signal, “Do Not Track”, or alternatively to indicate that they do not mind being tracked; and it enables servers to recognize and respond to that user preference. DNT is implemented in most current browsers, so users can already make the technical request for privacy and ask for compliance by sites they frequent.
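To make the mechanics concrete, here is a minimal sketch (function names are my own; the status-value meanings paraphrase a subset of those defined in the TPE Candidate Recommendation, so consult the spec for the authoritative list) of how a client might express the preference and interpret a server’s response:

```python
# Sketch of the TPE request/response vocabulary, not a full implementation.
# Status values below paraphrase a subset of the tracking status values
# in the TPE Candidate Recommendation.
TRACKING_STATUS = {
    "N": "not tracking",
    "T": "tracking",
    "C": "tracking with prior consent",
    "D": "disregarding the expressed preference",
    "G": "gateway to multiple parties",
    "U": "updated tracking status available",
    "?": "dynamic; check the tracking status resource",
    "!": "under construction",
}

def request_headers(prefer_no_tracking=True):
    """Header a user agent sends once the user has expressed a preference."""
    return {"DNT": "1" if prefer_no_tracking else "0"}

def interpret_tk(tk_value):
    """Map the leading character of a "Tk" response header to a description."""
    status = tk_value.strip()[:1]
    return TRACKING_STATUS.get(status, "unknown status value")

def tracking_status_resource(origin):
    """Site-wide tracking status is served from a well-known location."""
    return origin.rstrip("/") + "/.well-known/dnt/"

print(request_headers())                               # {'DNT': '1'}
print(interpret_tk("N"))                               # not tracking
print(tracking_status_resource("https://example.com"))
```

The point of the design is visible even in this toy form: the user’s preference is a single header, and the server’s answer is machine-readable, so compliance can be observed and audited rather than merely promised.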

The Working Group was also chartered to define the meaning of compliance with the DNT preference. While the Working Group aims, in a second document, to define a compliance regime that may be useful across a wide range of use cases, it chose to make the standard flexible enough to work in a variety of regulatory or business scenarios by enabling sites to indicate (via a URI sent in tracking status responses or at a well-known location) what compliance regime they follow. They may choose to follow the W3C-defined Compliance specification or an alternate.

We welcome the work of other groups considering ways to use the DNT header. EFF and a coalition have announced an alternate, more stringent compliance policy. Users can install EFF’s Privacy Badger extension to support that compliance policy by blocking non-compliant trackers. We see this building on top of the TPE specification not as a competing effort, but as expanding the diversity of the Do Not Track ecosystem: using the language of the DNT header to convey a privacy request, and new compliance text to indicate acceptable responses.

The importance of this work is highlighted by a recent finding from the Technical Architecture Group (TAG) on Unsanctioned Web Tracking. The TAG noted that tracking that abides by Web standards takes into account user needs for privacy and control over data flows, providing transparency to users and researchers, while “unsanctioned tracking” outside of well-defined mechanisms and standards tends to undermine user trust. TPE response and compliance can be tools of Web privacy transparency, helping sites to disclose their practices and meet user expectations. TPE thus enables sites to hear and respond to users’ preferences about tracking — giving alternatives to the regulation the TAG finding suggests might otherwise be necessary.

Next steps: Both the TPE and Compliance specifications are already implemented, but still need further testing (and resolution of remaining issues, on the Compliance spec) before they can be issued as W3C Recommendations. The Working Group will now focus on testing for interoperable implementations and addressing Last Call issues on the Compliance spec. We estimate that both specifications will be published as Recommendations in 2016.

by Wendy Seltzer at August 20, 2015 12:20 PM

August 16, 2015

Reinventing Fire

Bordering on Factual

Yesterday, a cool-looking map showed up on my Facebook feed, shared by a friend; it depicts the North American continent with the historical political boundaries of the native Americans. It lists clear boundaries for separate states of the First Nations: Anasazi, Apache Empire, Arawak, Aztec Empire, Beothuk Empire, Cherokee Soverignty, Cheyenne, Chickasaw, Chilcotin, Chinook, Chumash, Comanche, Cree Federation, Creek, Crow, Dogrib, Flathead, Great Sioux Nation, Haida Gwai, Hopi, Huron Supremacy, Inuit, Iroquois Confederacy, Mayan Empire, Mi’kmaq, Mohican, Navajo, Ojibwa, Olmec Kingdom, Pawnee, Pequot, Pomo, Powhatan, Salish, Shuswap, Slavey, Tlingit, and Ute.

Facebook post of Native American map

I’d never before seen such a clear depiction of the geopolitical boundaries of pre-Columbian America, and it was a stark reminder of how we, as a people, systematically invaded and destroyed a continent of cultured peoples. We wiped away their cultures, their languages, their history, and even the memory of them, leaving only scraps behind, and we protect our current borders of land they used to live on. The American Indian Wars ended in 1924, less than a hundred years ago, but it’s not even part of the American political dialog. And we’ve whitewashed our pogroms against Native Americans, in the same way we’re presently sugar-coating slavery in history courses.

The original person who posted the picture on Facebook also included this commentary,

America before colonization…. I’ve never seen this map in my entire 25 years of formal education. Not in one history book or one lesson. This is not a mistake… Representation matters!!! #NativeHistory #BeforeAmerica

Well said. And others agreed… the post has over 150,000 shares as I write this!

But something smelled wrong to me about the map itself.

Terms like “Empire”, “Soverignty”, “Federation”, “Confederacy”, “Nation”, “Supremacy”, and “Kingdom” seemed oddly specific and out-of-place, and even seemed designed to evoke legitimacy by comparison with European state structures. Were these really accurate labels of the political systems?

The number of different Native American nations seemed far too few; was the map aggregating them in some way I’d never seen before?

The borderlines seemed too crisp; weren’t many of these peoples semi-nomadic?

Glaringly, the Olmec were much earlier than the Aztec and Maya. What era was this supposed to be representing?

And the biggest red flag… there was no source for the data, no provenance for the map, and the label was truncated.

So I dug into it, using TinEye to find the history of the image.

I couldn’t find the original version of the map, or who made it, but I did find a Reddit post from 9 days ago entitled A map where Europe never discovered America. The image link was broken, but I found a more complete version that clearly shows the alternate history timestamp: “Aztec Calendar: Three Acatl (approx 2015 AD)”:

Imaginary map of Alternate History Non-Columbian North America

We aren’t taught this map, because this map isn’t real.

The reality is far more complicated, fortunately or unfortunately. It doesn’t lend itself to easy and obvious emotional appeals. Images and data visualizations make hard things easier to understand, and thus are extremely tempting to share.

A lot of my friends have already shared this map; they’re smart, well-meaning people, and most of them are Liberals of some stripe or another. Before the week is out, this map will be shared many hundreds of thousands of times. Self-styled Right-Wingers, Conservatives, and Republicans are going to jump on this. They’ll point to this as typical knee-jerk America-hating Liberalism, and laugh at the fact that people who consider themselves educated and intelligent were fooled by so obvious a hoax.

Here’s where these Right-Wingers are wrong: the map is incorrect, but the sentiment and the facts informing that sentiment are correct.

It’s easy to laugh at someone for being undereducated if your political party systematically suppresses the correct information that they should be getting from their schools.

That said, we’re all responsible for our own truths, and before you put something out there in the world, or share something someone else has said, you should do some fact-checking. If what you’re saying is a matter of objective fact, rather than subjective opinion, it’s more important to be correct than to be heard, otherwise you might undermine your own valid message.

But in this busy world, if you do make a mistake and spread something that you learn later was incorrect, don’t be so hard on yourself… just correct the record. We make mistakes, and it’s silly and mean-spirited to shame others for that, especially when their intentions are good; and worse yet, it forces people to defend themselves even if they were wrong, and doesn’t reinforce self-correction. Megan, my spouse, casually shared the erroneous map, but when I pointed out the flaws, she corrected herself in her comments, frankly and openly; she didn’t delete the message, she enhanced and corrected it.

This is the lesson we can carry forward from our own history as a nation: we have made mistakes, and we will continue to make mistakes, in how we treat others and how we think about our world; we need to remember these mistakes, and correct our behavior. We need to continue to make this a more perfect nation, knowing we will fail, but with good intentions.

Megan, being a conscientious map-nerd, also found a map of the true distribution of the native tribes and nations of North America, lovingly researched and rendered by Aaron Carapella.

Megan also pointed me to a great NPR radio story about Aaron’s maps and the naming of tribes. You can support Aaron’s work by buying his maps (currently on sale).

In the modern era, when it’s so easy for information to spread, it’s our social responsibility to spread factual information and to correct misinformation. It’s important that our technological tools make that easier, not harder; Facebook and Twitter don’t currently provide good tools for either of those tasks. For example, Facebook’s “Report Photo” dialog contains only the options “It’s annoying or not interesting”, “I’m in this photo and I don’t like it”, “I think it shouldn’t be on Facebook”, and “It’s spam”; why can’t they include “It contains factual errors”? How can I politely tell someone that they have made a mistake if the tool doesn’t include a way to do so?

Facebook's “Report image” dialog

I’m hopeful that the work of the W3C Web Annotation Working Group will yield a set of technologies and conforming services that make fact-checking and accountability possible through decentralized annotation. (If this intrigues you, check out Hypothes.is, a socially-conscious annotation service you can use today).
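For a flavor of what decentralized annotation could look like, here is a sketch of a simple fact-checking annotation shaped along the lines of the Web Annotation data model (still a working draft as I write this; the context URL and property names follow the draft, and the target URL and text are made up for illustration):

```python
import json

# An illustrative annotation: a textual comment targeting a quoted span
# of a (hypothetical) page, in the spirit of the Web Annotation draft model.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "body": {
        "type": "TextualBody",
        "value": "This map is alternate-history fiction, not a historical document.",
    },
    "target": {
        "source": "http://example.com/shared-map-post",  # hypothetical URL
        "selector": {
            "type": "TextQuoteSelector",
            "exact": "America before colonization",
        },
    },
}

print(json.dumps(annotation, indent=2))
```

Because the annotation lives outside the page it targets, a correction like this one could be attached to the viral post by anyone, and surfaced by any conforming service, without the original poster’s cooperation.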

In summary, here’s a few suggestions:

  • Don’t post stuff you haven’t verified
  • Don’t share stuff unless you’ve checked the sources (Snopes is a good first step, or read an article on Wikipedia if you have more time)
  • Cite your sources
  • Make sure that images and data visualizations accurately reflect the facts at hand
  • Don’t dismiss all facts and opinions just because some mistakes were made; get at the truth of the sentiment, don’t just nitpick
  • Be suspicious when something too closely matches your own world view (i.e. beware confirmation bias)
  • Learn from your mistakes
  • Reward others for learning and growing
  • Don’t assume you know the truth; human knowledge is always expanding

If we want a civil society in a fast-paced, hyper-connected world, we are going to need to adapt our education system, our technological tools, our social norms, and ourselves.



Samuel Cousins informed me on Twitter that the source of the map was a sci-fi and comic book writer, Joseph Abbott (aka liminalsoup), who posted the map on Reddit looking for feedback on a story they’re writing. I’d guessed it was probably source material for a role-playing game campaign, so I was a bit off base.

by Shepazu at August 16, 2015 10:24 PM

August 12, 2015

Reinventing Fire

Opening Standards

Today, the XML Activity, with several Working Group charters, was approved. This is a major milestone for W3C, not because of the activities of these groups themselves, but for W3C’s process of developing standards.

For the first time, all of W3C’s active Working Groups now operate in public.

When W3C began, it operated largely as a Member-only consortium. Member companies paid substantial dues, and convened behind closed doors with each other and a handful of Invited Experts,  in Member-only Working Groups (WGs). WG mailing lists, teleconferences (telcons), and face-to-face (f2f) meetings were all Member-only, as were editor’s drafts of specifications, and even the list of which organizations and people were participating. Periodic drafts of works in progress (Working Drafts, or WDs) were published on W3C’s public Technical Reports (TR) directory, and feedback was processed on public mailing lists. But the public conversations and member-only conversations didn’t mix, and specific decisions were not transparent. W3C was not a truly open standards body.

When I joined the W3C team in July 2007, one of my personal goals was to open up the organization. I joined the Team to help with SVG; 2 years prior to that, I had actually joined W3C as an Affiliate Member via my small startup, Vectoreal, to move a struggling SVG specification along. I spent the entire revenue of one of my consulting contracts to do so, in the hope that I could make it up in the long run if SVG took off and increased my business; I became a W3C Member because I didn’t feel like the SVG WG was responsive to the then-active SVG community, and I wanted to represent the needs of the average developer. In joining the W3C Team, I took a salary that was about half of my previous years’ earnings. For me, it was important that W3C should not be a “pay-to-play” organization, that Web developers –and not just paying W3C members or hand-picked Invited Experts– should have a strong voice.

This was one of the few things I had in common with the contentious WHATWG folks (among others): a core belief that standards should be developed in public. When I took over as staff contact of the WebAPI WG, I (along with the WG chairs, Art Barstow and Chaals McCathieNevile) set about merging it with the Web Application Formats (WAF) WG, to make a single WebApps WG that would operate in public, following on the heels of the newly rechartered HTML WG’s grand experiment as a public WG. Unlike the HTML WG, however, a person didn’t have to join the WebApps WG to read and post on the mailing list, lowering the bar to participation and decreasing W3C’s overhead in processing Invited Expert applications. This proved to be the model that the SVG, CSS, and later Working Groups followed.

We were off to a good start, but then we stalled out. Many Working Groups didn’t want to become public, and conversations about making WGs public by default were shut down by Members who’d paid to have a seat, by Invited Experts who liked their personal privilege, and by W3C staffers who had a variety of concerns.

But slowly, with external and internal pressure, including encouragement by some W3C members who put a premium on openness, Working Group charters (renewed every couple of years) more and more commonly designated their group as public. Within a few years, this became the norm, even without pressure. A few Member-only holdouts persisted: the Multimodal Interaction (MMI) WG; the Web Accessibility Initiative (WAI) WGs; and the XML WGs.

</xml> (Closed XML)

In January of 2000, in response to complaints that an update to the public version of an XML specification had taken too long, Tim Bray wrote on an external mailing list, XML-Dev, “[…] But it’s a symptom of the W3C’s #1 problem, lack of resources.” This is a tune we’re all still familiar with, sadly.

Lee Anne Phillips replied to this, on 16 January 2000:

With all respect, I think the lack of resources are the fault of the W3C membership policies, which seem designed to strongly discourage individuals and small organizations and businesses from participating in the process. US$5000 for an Affiliate Membership is beyond the reach of most of us and of many small businesses since that’s in addition to the value of the time spent on the process itself.

Whether this policy is because the big players want negotiations to go on in secret (and secrecy is inherent in the W3C structure so it can’t be an accident) or because W3C just can’t be bothered with the “little people” is a matter of speculation.

What’s certainly true is that there is a vast pool of talent available, many of whom are passionately interested in the development of XML and XML-related standards and might well have more time to spend than the human resources on sometimes grudging lend-lease from major corporations. Witness this and other lists which represent a collective effort of major proportions and a tremendous pool of knowledge and skills.

While we all appreciate the enormous efforts of the organizational participants in the W3C process, who’ve done yeoman service trying to juggle activities which might directly advance their careers at their organizational home with the community responsibilities of the standards process, there just might be a better and more open way.

The Internet standards process started in the RFC methodology, which, though sometimes awkward, chaotic, and slow, allowed rapid innovation and standardization when warranted and was fully public, ensuring participation by the *real* stakeholders in the process, the community served, rather than being dominated by the vendors who want to sell products to them.

In one way or another, we’re the ones who pay for all this work. Surely a way could be found to ensure that we know what the heck is going on. Even better, we could help in the initial stages rather than waiting in front of the curtain until a team of magicians come out and present us with whatever they think we want and are then either cheered or booed off the stage.

Others defended W3C, citing the Invited Expert policy, but Lee Anne (and many other people with similar thoughts) was essentially correct on two major points: it was an exclusionary policy; and it limited the pool of possible contributors that could help a resource-constrained organization.

It took 16 years, but I can now say, truthfully, that W3C is an open standards organization, both in its specifications and in its process.

Yeah, but is it really open?

There are aspects of W3C that are still not open, and are never likely to be, for pragmatic business and collaboration reasons.

Michael Champion from Microsoft said in this same XML thread:

The W3C’s secrecy policies are there because it is a treaty organization of competitors, not a friendly group of collaborators.

This was 16 years ago; I don’t know if he’d say the same thing today, but it’s only a little less true… W3C is a forum for coopetition, and discussions are usually friendly, but it is business. As a W3C staffer, I’m privy to future implementation plans, concerns about patents and Intellectual Property, and other in camera, off-camera discussions that are important for standards makers to know, but which companies won’t or can’t talk about in public. If they couldn’t talk about it in Member-only space, it wouldn’t be talked about at all, or would be couched in lies, deception, and misdirection. This frankness is invaluable, and it should be and is respected by participants. This is part of the bond of trust that makes standards work.

There are other parts of W3C’s standards-making process that are still not completely open for participation, due to logistical issues:

  • telcons: Trying to make decisions, or even have coherent conversations, on a 1-hour telcon with dozens of people, some of whom may not be known to the WG or may not be familiar with the discussion style of the WG, would be a nightmare. Telcons are limited to WG participants, including Invited Experts, and any topic experts they invite on a one-off basis. The logs (meeting minutes) are published publicly, however.
  • f2f meetings: The same constraints for telcons apply to f2f meetings, with the extra factors of the costs for the host (meeting room, food, network, and so on), facility access (NDAs, etc.), planning (who’s available and when and where), and travel costs (which are prohibitive for some Invited Experts, and would be for many members of the public). As with telcons, the meeting minutes are scribed and published publicly. In addition, around these f2f meetings there are sometimes public meet-ups for the locals to see presentations by the WG, to meet them, make suggestions or ask questions, and otherwise socialize with the WG.
  • informal brainstorming: This might happen in a hallway or a cafe, or on an unrelated mailing list or issue tracker; it might be a collaboration between colleagues or competitors, or it might be an individual tinkering with their own thoughts until they feel they have something worth sharing. It’s hard to make this fully open.
  • decision-making: One of the most significant benefits of W3C membership is the right to help make final technical decisions in a Working Group. Anyone can make suggestions and requests, but ultimately, it is the participants of the WG (and often, more specifically, the editors of the spec) who make the decisions, through a process of consensus. This is not only logistical (somebody has to decide among different options) but usually pragmatic as well, since the members of the WG often are the implementers who will have to code and maintain the feature, and who are well-informed about the tradeoffs of factors like usability, performance, difficulty of implementation, and so on. But even if the WG has the final decision power, there is still an appeal process: anyone, whether a W3C member or an average developer, can lodge a Formal Objection if a technical feature is flawed, and the technical merits will be reviewed and decided by W3C’s Director.
  • write access: Who can edit the specifications? Who has control over merging pull requests? There are IP issues with contributions, and also a coherent editorial and technical tone that needs to be adhered to. Currently, this is limited to WG participants, and it’s likely to stay that way.

But the vast majority of the standardization process is now public, including almost all technical discussions, meeting minutes, decision processes, and specification drafts, as well as the lists of people and organizations involved.

W3C has greatly benefited from this openness; we get invaluable feedback and discussion from our mailing lists, and WGs tend to take such feedback very seriously. We value this public input so much that W3C has also expanded its offerings with W3C Community Groups, which are free and open for everyone to participate in: anyone can propose a topic for technical discussion, and if others are interested, they can form a group to develop that idea further, to write use cases and requirements and even a technical specification; if the idea gets traction, W3C may even pick it up for the Recommendation-track formal standardization process.

Some people have taken a more extreme view: that W3C should lower its membership dues by getting rid of the technical staff. I’m all for finding ways to lower our membership dues, to be more inclusive, but getting rid of W3C technical staff would mean decreasing oversight and openness, not the reverse; most of the W3C staff are dedicated to making sure that we serve society, as well as our members, and a lot of critical technical work wouldn’t happen without them. Our members are doing W3C and society a great service by sponsoring our work, even while they benefit technologically and financially themselves.

●。 (Full Circle, Full Stop)

The schism between the XML community and W3C was one of several such schisms in W3C, perhaps the first major schism. The fight over whether W3C should adopt a Royalty-Free Patent Policy for its specifications was another (we did, and W3C’s Royalty-Free patent policy is now one of its crown jewels); the battle with the WHATWG for control over HTML was yet another; the ongoing debate about whether the W3C’s specification document license should allow forking and require attribution is still another (W3C has recently released an optional forkable, GPL-compatible, attribution license for software and documents that may be used on a per-Working Group level). All of these were about openness, in one way or another. Openness is about transparency, and accountability, and the ability to participate, and freedom of use, but also about control, and who controls what; this will always be a matter of heated debate.

The particular flavor of openness being debated in the XML community in 2000 was about a process open to participation, with transparent oversight of the discussions and decision-making at all formal stages of creating a standard. XML is widely used, and largely stable, so interest in further XML standards development has waned over the years; participation has dwindled, and with the final versions of these specifications (XML Core, XSLT, XQuery, XPath, XProc, EXI) being published, this may well be their final charter renewal. Now, with the XML Working Groups rechartered one last time, and finally as public, we have a fitting bookend to a process of open standards.


Closing Words

As usual, the reality is not quite as tidy as the story. After I first wrote this article, I went back to fact-check myself, and found some dusty corners that need cleaning out. For full disclosure, I’m compelled to mention the exceptions. We have some non-technical coordination groups, such as the Patents and Standards Interest Group and the Advisory Board, that are not public. Some of our Web Accessibility Initiative (WAI) WGs are (ironically) not currently public (specifically, the Protocols and Formats WG and the WAI Coordination Group), but they are being rechartered, and all the WAI groups will be public later this year when the rechartering is settled. The Member-only Voice Browser WG is winding down, and is scheduled to close at the end of this month. The most notable exception is the Math WG, one of our oldest WGs (chartered in 1997); through a loophole in the charter process, the Math WG has not been rechartered since 2006; instead, it’s been extended by W3C management, on the grounds that it is only doing errata on existing MathML specs rather than new technical publications. Still, this is an unwarranted exception, and my colleague Mike Smith and I are now advocating to resolve it as soon as possible, so that no technical W3C Working Group, even one doing only specification maintenance, operates outside public view.

Once we’ve made sure every technical W3C Working Group is chartered to operate in public, we all need to ensure that W3C doesn’t slip back into allowing Member-only technical Working Groups. This could take the form of a W3C policy change (petition, anyone?), or failing that, an awareness among W3C Advisory Committee Representatives that when we last tried the Member-confidential model, it led to worse specifications, slower progress, more time-consuming feedback processes, and a community schism that nearly tore W3C apart. Let’s prevent that mistake from happening again. Eternal vigilance, and all that…

by Shepazu at August 12, 2015 06:39 AM

August 11, 2015

Reinventing Fire

Music to my Eyes!

I’m proud to have helped in the formation of the new W3C Music Notation Community Group. It’s a free and open group, and if you’re interested in the future of digital music representation, you should join now!

Chopin Prelude Opus 28, No. 7

Here’s a little history for you.

When I was kicking off the W3C Audio Incubator Group in 2010, which would spawn the Audio Working Group a year later, I knew that the Web platform needed the ability to generate and process audio, not just play back prerecorded audio streams. I didn’t know how the technology worked (and I’m still fuzzy on it); I didn’t know all of the use cases and requirements; I didn’t know the industry; I didn’t know the culture; I didn’t know the people; and I certainly didn’t know what the future held.

What I did know was that Flash was dying, and all the Web audio projects that had relied on Flash would need a new platform. And I knew that it was important that we somehow capture and encode this important cultural expression. And I knew how to find passionate people who knew all the things I didn’t, and I knew that if I gave them a place to talk (and a little gentle coaching), they would know how to make audio on the Web a reality. I wasn’t disappointed: a demo of the Audio Data API prototype by David Humphrey and Corban Brook rekindled my interest in a Web audio API; Alistair MacDonald led the initial effort as chair of the Audio Incubator Group, providing context and connections for starting the work; Chris Rogers, who designed Apple’s Core Audio before moving to Google, wrote the WebKit implementation and the early drafts of the Web Audio API; Olivier Thereaux and Chris Lowis from BBC picked up the chair baton for the W3C Audio WG, later handing it to the capable Joe Berkovitz (Noteflight) and Matthew Paradis (BBC); and Chris Wilson (Google) and Paul Adenot (Mozilla) stepped up as editors of the Web Audio API spec when Chris Rogers moved along.

Thanks to the hard work by these and many other dedicated people, we are close to stabilizing the Web Audio API (a synthesizer, DSP, mixing board, and audio processor in the browser), and we have commitments from all the major browser vendors to implement and ship it. We also have the Web MIDI API, which is not a way to play back bleep-beep-blorp MIDI files in your browser, but a way to control MIDI devices (e.g. musical instruments) via your browser, and vice versa.

These are pretty obvious technologies for W3C to develop. But the scope of the audio standardization work wasn’t always so clear. There was a vocal contingent among the interested parties that wanted us to standardize a music notation format… like HTML for music.

At the time, we decided this was not our priority, not only because it would dilute our focus from an already daunting task, and not only because it was a relatively niche market, but because there was already a winner in the music notation format space: MusicXML. And Michael Good, the creator and maintainer of MusicXML, made it clear that it had been a challenging undertaking in a competitive market, and that he wasn’t ready to bring MusicXML to a formal standards body.

But the metronome ticks on, and times change. MusicXML was acquired by MakeMusic, a major music software vendor, and Michael began to warm to the idea of his creation having a home at a vendor-neutral standards body like W3C (with Joe Berkovitz patiently encouraging him); at the same time, Daniel Spreadbury (Steinberg) was developing SMuFL (Standard Music Font Layout), and together they encouraged their companies to bring their music standards under the care of a W3C Community Group.

Thus, two weeks ago, we formed the Music Notation Community Group, and already over 160 people have joined the group! Normally, W3C staff doesn’t devote resources to Community Groups, but Ivan Herman and I lent our W3C experience to the transfer and group formation in our spare time, because we saw the cultural value in having music representation on the Web (though unlike all the other people mentioned in this blog post, I’m sadly musically illiterate… “they also serve who only standardize”). Michael, Daniel, and Joe are co-chairing the group, and we’re looking forward to lively conversations.

Music and technical standards may seem like strange bedfellows, but there’s a long tradition there. The New Yorker, in a piece on HTML5 entitled “The Group That Rules the Web” by Paul Ford, referenced a 1908 article in Music Trade Review about player piano standards. In a hauntingly familiar account, the face-to-face meeting of a committee of industry leaders decided upon a nine-to-the-inch perforation scale for player piano rolls (think punch cards on scrolls). The rise and fall of the player piano industry is a fascinating read, and should give us perspective on how we build for eternity and for change.

Will MusicXML (or its successor) ever be natively supported by browsers, so that we can see it, read it, and edit it without the need for JavaScript (and SVG) rendering? Possibly not, if we learn from the history of the even more critical MathML, which still is not properly supported in browsers. But even if it is never natively supported, there are good reasons to have a vendor-neutral digital music notation format for the Web:

  • Simple interchange between music applications, both desktop and (increasingly) Web-based.
  • Annotation, which is common among musicians. As with any kind of data, a rendering is the proximal target of an annotation, but it’s the underlying data that should be annotated. For example, I might have some CSV data that I can render as a bar chart, line chart, or pie chart; an annotation should apply to all three chart types, because it is inherent in the underlying data, not in the rendering. Similarly, it’s not the SVG <path> or <use> element representing a musical note that should be annotated, but the underlying music data model.
  • A DOM representation, which can be read and modified by JavaScript, flowed and laid out with CSS, and which could serve as custom elements for a MusicML component library.
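The annotation principle above can be sketched in a few lines of JavaScript. This is purely illustrative: every class and method name here is invented, and it is not a real MusicXML or Web Annotation API — just a minimal model of "annotate the data, not the rendering."

```javascript
// Hypothetical sketch: annotations attach to the underlying data model,
// never to a particular rendering. All names here are invented for illustration.
class ScoreModel {
  constructor(notes) {
    this.notes = notes;        // the underlying music data, e.g. parsed from MusicXML
    this.annotations = [];     // annotations target note indices, not SVG elements
  }
  annotate(noteIndex, text) {
    this.annotations.push({ noteIndex, text });
  }
  // Any renderer (SVG score, piano roll, braille, ...) consults the same
  // annotation list, so a note made in one view appears in every view.
  render(view) {
    return this.notes.map((note, i) => ({
      view,
      pitch: note.pitch,
      annotations: this.annotations
        .filter(a => a.noteIndex === i)
        .map(a => a.text),
    }));
  }
}

const score = new ScoreModel([{ pitch: "A4" }, { pitch: "E5" }]);
score.annotate(1, "use fourth finger");
const svgView = score.render("svg");
const rollView = score.render("piano-roll");
console.log(svgView[1].annotations);   // the same annotation...
console.log(rollView[1].annotations);  // ...surfaces in every rendering
```

The design point is that the annotation survives a change of rendering untouched, which is exactly what breaks if you instead annotate the SVG elements a renderer happens to emit.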

And after all, past is prelude, and who knows what the future holds?

As your reward for reading this whole meandering post, here in full is the news article on the historic and infamous Gathering of the Player Men at Buffalo, for your edification and enjoyment, as an image and an OCR transcription, for posterity.

Gathering of the Player Men at Buffalo article

Gathering of the Player Men at Buffalo.

First Meeting of Player Manufacturers Ever Held in This Country—Discussions Anent the Number of Perforations to the Inch for a Standard Roll—Meeting Called to Order by L. L. Doud—Addresses by Various Representatives Who Argue That Their Position Is the Correct One—Motion to Lay the Matter Over to Convention in Detroit Is Lost—Finally Votey’s Motion Agreeing Upon Nine Perforations to the Inch Is Adopted.

(Special to The Review.)

Buffalo, N. Y., Dec. 10, 1908.

Pursuant to a call issued by the A. B. Chase Co., Norwalk, O., the piano player manufacturers and their representatives gathered in this city to-day, with the object in view of settling the vexed question of the scale to be used for the 88-note players. The player trade was well represented at this first meeting, which may be said to almost reach the dignity of a convention.

There is unquestionably a decided difference of opinion as to the number of perforations required on the music roll to the inch. There are some who hold that the 88-note roll should be no longer than the present 65-note roll. Leading makers hold that nine-to-the-inch must necessarily be the standard adopted, and the advocates of the nine-to-the-inch won at this meeting.

The first meeting was held at the Hotel Iroquois, in this city, and opened shortly after ten. The score of delegates present constituted a truly representative gathering, the majority of the leading manufacturers having someone to look after their interests and express their opinions.

The following were present: Wm. J. Keeley, the Autopiano Co., New York; H. W. Metcalf, representing the Simplex Piano Co., Worcester, Mass.; the Wilcox & White Co., Meriden, Conn.; J. W. Macy, the Baldwin Co., Cincinnati, O.; E. S. Votey, the Aeolian Co., New York; R. A. Rodesch, the Rodesch Piano Co., Dixon, Ill.; T. M. Pletcher, the Melville Clark Piano Co., Chicago; Gustave Behning, the Behning Piano Co., New York; H. C. Frederici, the Claviola Co. and the American Perforated Music Co., New York; J. H. Dickinson, the Gulbransen-Dickinson Co., Chicago; Otto Higel, Otto Higel Co., Toronto, Ont.; D. W. Bayer, Chase & Baker Co., Buffalo, N. Y.; H. Keuchen, Shaw Piano Co., Baltimore, Md.; Chas. G. Gross, Chas. M. Stieff, Baltimore, Md.; Paul E. F. Gottschalk, Niagara Music Co., Buffalo, N. Y.; E. B. Bartlett, W. W. Kimball Co., Chicago; P. B. Klugh, Cable Company, Chicago; J. A. Stewart, Farrand Co., Detroit, Mich.; L. L. Doud, A. B. Chase Co., Norwalk, O.; J. H. Parnham, Hardman, Peck & Co., New York; and J. H. Chase and Jacob Heyl, of the Chase & Baker Co., Buffalo, N. Y.

The meeting was called to order by L. L. Doud, who briefly pointed out the necessity of reaching some definite understanding regarding the best form of music roll for 88-note players and the number of perforations to the inch that would give best results from the viewpoint of both manufacturer and public. Mr. Doud stated that as the 88-note player was but in its infancy, now is the time to adopt some standard music roll that will aid the purchaser in obtaining the best results from a maximum number of rolls to select from—in other words, that the purchaser be not confined to one particular make of music roll and the natural limitations of such a list. At present 6 and 9 perforations to the inch represent the two extremes, the Aeolian Co.’s 12-to-the-inch roll being more in the nature of an experiment.

The gentlemen present then selected Mr. Doud as chairman and Mr. Chase as secretary, and then the representatives were called upon to give their individual opinions and make suggestions, with the good of the various manufacturers and the satisfaction of the public, the real judge and jury, in mind.

T. M. Pletcher, representing the Melville Clark Piano Co., was the first to speak, and said that in the opinion of his company the six-to-the-inch perforations afforded greater possibilities from a musical standpoint, in view of the greater quantity of air controlled by the perforations. Mr. Pletcher added, however, that his company were willing to abide by the sense of the convention, and had, in fact, already turned out a number of player-pianos using rolls with nine perforations to the inch.

R. A. Rodesch, who has adopted eight perforations to the inch, then spoke on the subject of a standard roll, and held that such a measurement as he used withstood climatic changes better than the nine-to-the-inch roll, and thereby insured proper tracking. Mr. Rodesch held, as did the majority of those present, that the double tracker board, one adapted to 65-note rolls, was a necessity for the present at least, affording protection to both dealer and customer.

In setting forth the Cable Company’s stand, P. B. Klugh said that the nine-to-the-inch scale had been adopted by that company and they were not open to argument on the subject, as such a scale had given entire satisfaction. Mr. Klugh offered as a solution of the improper tracking question, the adoption of an adjustable end to the roll, which when pressed against a loosely-rolled music roll would force perforations into perfect alignment. He also gave it as his opinion that the habit of twisting the roll as tightly as possible before playing was a mistake, as when held tightly, proper adjustment of the roll was impossible. Mr. Klugh stated that when the purchaser understood the secret of this method of adjustment the nine-to-the-inch roll would give entire satisfaction in every instance.

J. H. Parnham also stated that Hardman, Peck & Co. had found no trouble with rolls cut nine-to-the-inch, either before or after selling.

Gustave Behning then informed the meeting that his company had found the nine-to-the-inch scale so satisfactory that they had begun to cut the 65-note music with smaller perforations and with excellent results.

The meeting then adjourned until the afternoon.

The Afternoon Session.

The afternoon session was called to order at 2 p.m., and some time was given over to a general discussion of the relative value of the rolls having eight and nine perforations to the inch, respectively. Mr. Rodesch offered for examination a number of rolls cut on the eight-to-the-inch scale, which were compared with one of nine shown by Mr. Votey.

The general discussion was here interrupted for the purpose of considering whether or not to finally adopt the 88-note roll in preference to the 85-note roll. Mr. Heyl, of the Chase & Baker Co., spoke at length on the subject, stating that in Europe pianos of seven-octave range, or 85 notes, cutting off the three treble notes, were manufactured in considerable quantities and had a ready sale. In support of the statement, however, that the 88 notes were needed, Mr. Heyl offered the following figures: Out of 3,838 compositions cut by the Chase & Baker Co., 1,130 needed only 65 notes; 2,425, 78 notes; 2,542 needed 80 notes; 2,660 required 83 notes, and 3,676 could be cut in an 85-note range.

A motion was made and carried that the music be cut to the full 88 notes. It was also moved and carried that the rolls be made with a standard width of 11¼ inches, leaving a margin on each side for future development, it being acknowledged that any advance in future would need the margin in its consummation.

Mr. Rodesch here proposed that the final settlement of the perforation question be postponed until the annual meeting of the National Manufacturers Association, to be held in Detroit next June, that the matter could be more thoroughly studied, and several of those present concurred with him in that opinion, but the general sense of the body was that such a postponement would only increase the feeling of uncertainty among both manufacturers and dealers and cause additional trouble for those manufacturers who were turning out players and music rolls that would not conform with the standard agreed upon.

Mr. Votey then made a motion, which was unanimously carried, to the effect that the matter be decided at once. A standing vote was taken and twelve were found to favor the nine-to-the-inch scale, with only six backing the eight-to-the-inch standard. Upon motion the vote in favor of nine perforations as a standard scale was declared unanimous.

Thus, with a little over four hours’ discussion, a question was settled that has caused much worriment to the trade for over a year past, and especially so within the last few months. With a standard roll all manufacturers have a chance to do business, for a purchaser can go anywhere and get any selection he desires to play, and is not confined to one list, often restricted.

Mr. Votey, following the settlement of the perforation standard, offered a suggestion, which was accepted, to the effect that the manufacturers adopt for the 88-note music rolls the spool about to be used by the Aeolian Co. The new spool has clutches inserted in the ends instead of pins, and attachments are furnished for inserting in the holders on the player, the other end being arranged to fit the clutches placed within the ends of the spool. This new spool, Mr. Votey claims, makes proper tracking a simple proposition, as the roll can be held tightly and accurately, a difficult feat where the pin is used, especially if it is driven into a spool made of cross-grained wood. The spool is also fitted with an adjustable end which may be pressed against the music roll in such a way as to force the perforations into alignment. While this adjustable end is patented, the Aeolian Co. have not, nor will not, patent the clutch, offering it for the free use of other manufacturers. The individual manufacturers, too, may invent an adjustable end that will not conflict with the patented article, but give the same result.

The question of price also came up before the meeting, and while no action was taken, it was strongly suggested that while the field was a new one, manufacturers should insure both themselves and the dealer a fair and liberal profit while the opportunity offers. Mr. Votey here stated that the Aeolian Co. would sell their 88-note rolls at the same price as the 65-note, believing that in large quantities they can be made nearly as cheaply. This company are also considering the making of player-pianos with only one tracker, that for 88-note rolls.

J. H. Dickinson, of the Gulbransen-Dickinson Co., suggested that the player-piano and music roll manufacturers present effect a permanent organization for meeting at stated times and discussing such questions as interest the meetings. As most of the firms represented were members of the National Manufacturers Association, Mr. Dickinson’s suggestion was not acted upon.

At the close of the convention of player-piano and music roll manufacturers, Paul E. V. Gottschalk, general manager of the Niagara Music Co., Buffalo, presented each one present with a music roll bearing “The Convention March,” composed by Paul R. Godeska, and “Dedicated to the Convention of Player-Piano Manufacturers, held at the Iroquois Hotel, Buffalo, N. Y., December 10, 1908.” The roll was in a handsome box decorated with holly and made a pleasing souvenir.

by Shepazu at August 11, 2015 11:26 PM

August 09, 2015

ishida >> blog

UniView 8.0.0a: CJK features added

Picture of the page in action.
>> Use UniView

This update allows you to link to information about Han characters and Hangul syllables, and fixes some bugs related to the display of Han character blocks.

Information about Han characters displayed in the lower right area will have a link, “View data in Unihan database”. As you would expect, this opens a new window at the page of the Unihan database corresponding to that character.

Han and Hangul characters also have a link, “View in PDF code charts (page XX)”. On Firefox and Chrome, this will open the PDF file for that block at the page that lists the character. (For Safari and Edge you will need to scroll to the page indicated.) The PDF is useful if there is no picture or font glyph for that character, but it also lets you see the variant forms of the character.

For some Han blocks, the number of characters per page in the PDF file varies slightly. In this case you will see the text “approx”, and you may have to look at a page adjacent to the one you are taken to for these characters.

Note that some of the PDF files are quite large. If the file size exceeds 3MB, a warning is included.

by r12a at August 09, 2015 08:56 AM

August 05, 2015

W3C Blog

Participate in a survey on Web security by the STREWS project

W3C, along with SAP, TCD, and KUL, is a partner in a European project called STREWS. The goal is to bring research and standardization in the area of Web security closer together. The project is funded by the European Commission (7th Framework Programme). It organizes workshops, writes reports, and, as its main goal, writes a “European Roadmap for Research on Web Security”.

That report is due later this year. And to help with assigning priorities to various topics in security, the project has created a survey. It is targeted especially at people maintaining web sites.

If you maintain a Web site, or have helped set one up in the past or plan to do so soon, you can help: Please, take some time to fill out this survey:

STREWS Web-security interactive survey

The STREWS project would especially like to hear from other European research projects, because one of the goals of the Roadmap is to help the European Commission select areas of research that need more support. That’s why a few questions talk specifically about “projects.” Just skip any questions that don’t apply to you.

The survey is open until September 11. The Roadmap is expected to be published later in September or October. It will be freely available from the STREWS Web site.

by Bert Bos at August 05, 2015 11:01 PM

July 29, 2015

W3C Blog

Moving the Web Platform forward

The Web Platform keeps moving forward every day. Back in October last year, following the release of HTML 5.0 as a Recommendation, I wrote about Streaming video on the Web as a good example of more work to do. But that’s only one among many: persistent background processing, frame rate performance data, metadata associated with a web application, and mitigating cross-site attacks are among the many additions we’re working on to push the envelope. The Open Web Platform is far from complete, and we’ve been focusing on strengthening the parts of it that developers most urgently need for success, through our push for Application Foundations. Our focus on developers led us to the recent launch of the W3C’s Web Platform Incubator Community Group (WICG), which gives developers the easiest possible way to propose new platform features and incubate their ideas.

As part of the very rapid pace of innovation in the Web Platform, HTML itself will continue to evolve as well. The work on Web Components is looking to provide Web developers the means to build their own fully-featured HTML elements, to eliminate the need for scaffolding in most Web frameworks or libraries. The Digital Publishing folks are looking to produce structural semantic extensions to accommodate their industry, through the governance model for modularization and extensions of WAI-ARIA.

In the meantime, the boundaries between the work of the Web Applications Working Group and the HTML Working Group have blurred over the years, given that it is difficult nowadays to introduce new HTML elements and attributes without looking at their implications at the API level. While there is a desire to reorganize the work in terms of functionality rather than technical solutions, resulting in several Working Groups, we’re proposing the Web Platform Working Group as an interim group while discussion is ongoing regarding the proper modularization of HTML and its APIs. It enables the ongoing specifications to continue to move forward over the next 12 months. The second proposed group will be the Timed Media Working Group. The Web is increasingly used to share and consume timed media, especially video and audio, and we need to enhance these experiences by providing a good Web foundation for those uses, supporting the work of the Audio and Web Real-Time Communications Working Groups.

The challenge in making those innovations and additions is to continue to have an interoperable and royalty-free Web for everyone. Let’s continue to make the Open Web Platform the best platform for documents and applications.

by Philippe le Hegaret at July 29, 2015 04:32 PM

July 08, 2015

W3C Blog

WICG: Evolving the Web from the ground up

We are super excited to announce the launch of the W3C’s Web Platform Incubator Community Group (WICG). Despite the funny name (“the Why-See-Gee, really?”), this is a great new initiative that seeks to make it easier for developers to propose new platform features for standardization.

What we want to achieve

The purpose of the WICG:

  • Make it as easy as possible for developers to propose new platform features, in the spirit of the Extensible Web Manifesto.
  • Provide a space where developers and implementers can discuss new platform features.
  • Incubate those ideas by providing guidance and a supportive, inclusive environment to those who have never contributed to standards (and even to those who have! :D), ultimately transitioning those ideas to a W3C Working Group for formal standardization (i.e., making a “W3C Recommendation”).
  • Modernize how we do standardization of platform features (yay! no mailing lists… unless you really want one).
  • Provide a legal framework that keeps all contributions free and open.

In short, we want to be a support group for aspiring standardistas. We want to provide you with all the help you would need in order to take your idea or proposal to the next level.

What we are not

We are not planning on being the new powers-that-be. You don’t have to convince us that your idea is any good; and even if you do, that might not help you much. What we will give you, though, is feedback on the formulation of your ideas, and help iterating on them to improve the chances of their getting traction once presented to the appropriate group.

Are browser makers involved?

Yes! Absolutely. Microsoft, Apple, Google, and Mozilla are fully supporting this effort.

Inspired by what the RICG achieved, the browser vendors want to make it easier to have a dialog about new features in a space that provides the required legal framework with minimal red tape. However, we require people to join the community group in order to participate.

Having early buy-in from browser vendors is critical for getting something supported across browsers. Because all the browser vendors are involved in this effort, ideas can be quickly vetted from both a developer and browser vendor’s viewpoint.

By working together, we can create features that are fit for their purpose and, hopefully, a pleasure to use—all while solving real-world problems!

How we hope to make things easier…

In short: GitHub + tooling + a supportive community.

The elves at the W3C have been busy making sure we have the tools in place to make participation as simple as possible. We will develop specs/use case documents like we develop any open source software.

What’s the process?

Roughly, we want to follow the process already established by the RICG, though we will adapt as we go. It will go something like:

  1. State the problem: Write a description of a limitation with the Web platform and either post it to Discourse, spin up a GitHub repo, or just publish it somewhere (e.g., blog post, gist, whatever you like). This should be something you believe is missing in the platform and would make the lives of developers significantly easier if it were added. It can also be something you’ve noticed is a recurring development pattern which would benefit from standardization.
  2. Join the group: Before bringing the above to the group’s attention, join the community group, which means you agree to the terms of the W3C’s Community Contributor License Agreement (CLA). It’s critical that you join the group first, or else key members won’t be able to review or discuss your proposal. Don’t worry if you forget; the Chairs will remind you and hound you till you do :)
  3. Evaluation: As a community, we will evaluate whether the problem can already be solved using existing Web technology. We will also look at how many users of the Web might be affected. This will require collecting data, real-world usage examples, etc.
  4. Use cases: If need be, we will formalize the above into a use cases document. Such a document can help prove to the community that there is indeed a need for a solution that needs standardization (see the Use Cases and Requirements for Standardizing Responsive Images, for example).
  5. Advocate: We circulate this with browser vendors and the community at large—we pitch it to anyone who will listen. Getting everyone on-board and in our corner is critical.
  6. Specify it: Once we have buy-in from browser vendors and the community, we put together some rough proposals (e.g., a new HTML element, API, or HTTP header…) and we do an “intent to migrate”: we move the spec to a W3C Working Group to seek royalty-free licensing commitments from W3C members (you know, the “free” in “free and open”).
  7. (Bonus points) Implementation: Help turn the ideas from words on paper into working features in modern browsers.

(If you are interested in the formal process, take a look at the Web Platform Incubator Community Group Charter)


We won’t sugar-coat it: standardization is hard (just ask anyone who survived the RICG :)).

The bar to add new things to the Web is going to remain high: We might need to raise money. Or bring everyone together in a room, like we once did in Paris. Or pitch ideas at conferences to get more developer interest and get momentum behind a feature.

However, anyone who chooses to participate will be well supported. We have some extremely experienced browser/standards engineers participating who are here to help. If you don’t know where to start, or don’t know your RFC 2119 from your WebIDL, don’t worry. We’ve got your back!

Collaboration with the RICG

So what’s the relationship between this group and the RICG? Since we share members and acronym parts with it, we thought it worth explaining.

The RICG is focused primarily on pushing essential responsive features into standards and browsers, as well as getting more developers involved in the standards process. The WICG will focus mostly on that second bit: incubation of new platform features. We will help you take your idea for something that is missing from the platform and grow it until it is ready to be sent off to the appropriate group. This might even include pushing something into the RICG.

The RICG will continue to handle all things “responsive”, tackling the specific issues that have matured from a gleam in someone’s eye into ready-for-prime-time proposals.

What about other Community Groups?

Other Community Groups continue to function as normal. However, the WICG provides a one-stop shop for features specifically targeted at web browsers. Sometimes, new CGs might be spun off from this one to work on a particular feature.

Got Questions?

You can find the chairs on Twitter.

by Marcos Caceres at July 08, 2015 10:40 AM

July 07, 2015

W3C Blog

ARIA and DPUB published a First Public Working Draft (FPWD)

By: Tzviya Siegman, Markus Gylling, and Rich Schwerdtfeger

We are excited to announce that a joint task force of the Protocols and Formats Working Group and the Digital Publishing Interest Group (DPUB IG) has published a First Public Working Draft of Digital Publishing WAI-ARIA Module 1.0 (DPUB-ARIA 1.0). This draft represents the joint collaboration of experts in both accessibility and digital publishing, and the vocabulary it contains represents their efforts to bring digital publishing structures to the web in a broadly accessible way.

The DPUB-ARIA 1.0 vocabulary is still very much a work in progress, and we are seeking input on its usability and completeness from a number of communities. As a result of this review, we expect that there will be changes, so we ask you to hold off on implementing the vocabulary until a later release, as there are open issues in the document that will affect its final appearance. Instead, please share your comments, run some trials, and tell us how they go.

The roles listed in DPUB-ARIA 1.0 originated with the EPUB 3 Structural Semantic Vocabulary, a vocabulary for adding inflection to HTML, managed by the International Digital Publishing Forum (IDPF), the authors of the EPUB specification. The IDPF drew its original list of terms from the DAISY Consortium, the originators of the Digital Talking Book format. During the decades that the DAISY Consortium has maintained its standard, much has been learned about how to develop semantically rich content that can be easily read and navigated by anyone with a print disability. This work has influenced and been influenced by the broader work of the digital publishing community, and informed the development of the accessible EPUB 3 format.

DPUB-ARIA 1.0 defines structural semantics, which give authors and publishers a method of conveying intent and specific meaning through HTML tagging. A digital publishing structural semantics vocabulary defines a set of properties or behaviors relating to specific elements of a publication. This can improve general user experience as well as accessibility for users with disabilities. For example, publications often feature glossaries, which are typically marked up as a definition list using the <dl> element. However, a document may contain many definition lists, only one of which is the glossary. Adding role="glossary" to the appropriate list conveys machine-readable and human-readable information about the type of list. The inflection provided by the role enables readers of all abilities to access the glossary component. A user agent might use the declarative “glossary” markup to generate pop-ups for glossary terms as the user encounters them while reading. At the same time, the role is exposed to assistive technology (AT), so the information is meaningful to users who cannot access the pop-up.
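To make the glossary example concrete, here is a minimal, illustrative markup sketch. It uses the role value from the prose above, but since DPUB-ARIA 1.0 is a First Public Working Draft with open issues (including the question of a dpub- prefix, listed below), the final role name may differ; please don’t treat this as implementation guidance.

```html
<!-- Illustrative sketch only: the vocabulary is a working draft,
     so the exact role name may change before the final release. -->
<h2>Glossary</h2>
<dl role="glossary">
  <dt>EPUB</dt>
  <dd>An interchange and delivery format for digital publications.</dd>
  <dt>WAI-ARIA</dt>
  <dd>Accessible Rich Internet Applications, a W3C accessibility suite.</dd>
</dl>
```

Here the role distinguishes this particular definition list from any other <dl> in the document, both for tooling and for assistive technologies.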

Books and other publications have some unique properties. Publishing has many moving parts, and the nuances of a publication are not always easy to represent in HTML. For example, a book or journal production workflow relies heavily on identifying each granular element to enable processing and reuse of content. It is important for publishers and their tools to be able to distinguish among the components that are “foreword”s, “preface”s, and “chapter”s. Without these declarations, we must rely on heuristics or attributes like "title", which are useful only when working in a single language. There are several ways this inflection can be accomplished; possibilities include embedding RDFa data or stretching the technical definition of the data-* attribute. We weighed the merits of several methods and concluded that one of the most important aspects is to provide a clear method of communicating with assistive technologies. The obvious solution, then, is to extend WAI-ARIA, which already has mechanisms for exposing roles to these technologies.

DAISY has been creating and facilitating accessible books for years, and the time has come to converge with the accessible web. Bringing these semantics into WAI-ARIA represents the next step in the evolution of digital publishing, making it easier to create rich and accessible HTML, whether the author is a traditional publisher or not. This is therefore also the first implementation of the general WAI-ARIA extension mechanism, which allows groups to write extensions that accelerate semantic interoperability with assistive technologies while providing semantic curb cuts for the broader Web community. An example might be a new digital book reader feature that responds to a voice command, “go to the glossary”, and immediately navigates to the book’s glossary without the user having to flip pages to get there.

One of the roles of the DPUB IG is to work with technical representatives of the web and publishing communities toward convergence of digital publishing and web standards (see a recent blog post on the Digital Publishing Interest Group and its current plans). The DPUB IG includes representatives of the standards organizations of the publishing world, the IDPF and DAISY. Publishers have hundreds of years of experience in creating books, journals, magazines, and other publications. Modern publishing workflows and standards rely heavily on the traditions of publishing as well as on the technologies and specifications of the Open Web Platform. The DPUB IG seeks to offer input toward improving publishing on the web and to leverage the technologies of the Web as we improve publishing. This document is one of the first examples. The DPUB ARIA TF also worked with members of the HTML Accessibility Task Force to agree on a method that is appropriate for all user agents.

Shortly, the DPUB ARIA Task Force will begin the work of creating mappings to assistive technologies. We will also work with the IDPF and other members of the digital publishing community to ensure that everyone is comfortable with this proposed vocabulary. We also want to ensure that the dpub- vocabulary is itself extensible. Indeed, there are many terms in the IDPF’s vocabulary that are not yet addressed. For example, the IDPF has been working on specifications for educational publishing, and we know that there is ongoing work to address the accessibility of (educational) assessments.

We publish this draft with some open questions:

  • Are the proposed roles clear and appropriate to the needs of digital publishing?
  • Is the use of the dpub- prefix in role names to avoid potential collision with other WAI-ARIA roles acceptable?
  • What mechanism would be suitable for the addition of new roles?
  • Is the relationship of this specification to WAI-ARIA 1.1 clear?

We are eager to receive your comments. If you’re interested in contributing to the development of this and similar work, consider joining the DPUB Interest Group or the Protocols and Formats Working Group. Please submit issues with the label “dpub”. If this is not feasible, please send an email to public-dpub-aria-comments@w3.org.

by Tzviya Siegman at July 07, 2015 02:32 PM

June 29, 2015

W3C Blog

Planning the future of the Digital Publishing Interest Group

Time flies… it has been almost two years since the Digital Publishing Interest Group started its work. A lot has happened in those two years; the group

  • has published a report on the Annotation Use Cases (which contributed to the establishment of a separate Web Annotation Working Group);
  • has conducted a series of interviews (and published a report) with some of the main movers and shakers of metadata in the Publishing Industry;
  • is working with the WAI Protocols and Formats Working Group to create a separate vocabulary describing document structures using the ARIA 1.1 technology (thereby taking a further step towards better accessibility for Digital Publishing);
  • maintains a document on Requirements for Latin Text Layout and Pagination, which is also used in discussions with other W3C groups on setting priorities for specific technologies;
  • made an assessment of the various Web Accessibility Guidelines (especially the Web Content Accessibility Guidelines) from the point of view of the Publishing Industry, and plans to document which guidelines are relevant (or not) for that community and which use cases are not yet adequately covered;
  • established a reference wiki page listing the important W3C specifications for the Publishing Industry (by the way, that list is not only public, but can also be edited by anybody with a valid W3C account);
  • has conducted a series of interviews with representatives of STEM Publishing and is currently busy analyzing the results;
  • commented on a number of W3C drafts and ongoing works (in CSS, Internationalization, etc.) to get the voice of the Publishing Industry adequately heard.

However, the most important result of these two years is that the Interest Group has contributed to setting up, at last, stable and long-term contacts between the Web and Publishing industries. Collaborations now exist with IDPF (on, e.g., the development of EPUB 3.1 or in the EDUPUB Initiative) and with BISG (on, e.g., accessibility issues), and contacts with other organizations (e.g., Readium, IDAlliance, or EDItEUR) have also been established.

The group has also contributed significantly to a vision on the future of Digital Publishing, formalized by experts in IDPF and W3C and currently called “EPUB+WEB”. The vision is described in a White Paper and can be summarized as:

[…]portable documents become fully native citizens of the Open Web Platform. In this vision, the current format- and workflow-level separation between offline/portable (EPUB) and online (Web) document publishing is diminished to zero. These are merely two dynamic manifestations of the same publication: content authored with online use as the primary mode can easily be saved by the user for offline reading in portable document form. Content authored primarily for use as a portable document can be put online, without any need for refactoring the content. […] Essential features flow seamlessly between online and offline modes; examples include cross-references, user annotations, access to online databases, as well as licensing and rights management.

But, as I said, time flies: this also means that the Interest Group has to be re-chartered. This is always a time when a group can reflect on what has gone well and what should change. The group has therefore also contributed to its new, draft charter. Of course, according to this draft, most of the current activities (e.g., on document structures or accessibility) will continue. However, the work will also be greatly influenced by the vision expressed in the EPUB+WEB White Paper, which should serve as a framework for the group’s activities. In particular, the specific technical challenges in realizing this vision are to be identified, and relevant use cases worked out. Although the Interest Group is not chartered to define W3C Recommendations, it also plans to draft technical solutions, proof-of-concept code, etc., to test the feasibility of particular approaches. If the result of the discussions is that a specific W3C Recommendation should be established on a particular subject, the Interest Group will contribute to formalizing the relevant charter and to the process toward the creation of the group.

The charter is, at this point, a public draft, not yet submitted to the W3C Management or the Advisory Committee for approval. Any comment on the charter (and, actually, on the White Paper, too!) is very welcome: the goal is to submit a final charter for approval that reflects the largest possible constituency. Issues, comments, and feedback can be submitted through the issue list of the charter repository (or, respectively, the issue list of the White Paper repository) or, alternatively, sent to me by email.

Two years have passed; looking forward to another two years (or more)!

by Ivan Herman at June 29, 2015 05:18 AM

June 17, 2015

ishida >> blog

UniView 8.0.0 available

Picture of the page in action.

>> Use UniView

Unicode 8.0.0 is released today. This new version of UniView adds the new characters encoded in Unicode 8.0.0 (including 6 new scripts). The scripts listed in the block selection menu were also reordered to match changes to the Unicode charts page.

The URL for UniView is now https://r12a.github.io/uniview/. Please change your bookmarks.

The GitHub site now holds images, in two sizes, for all 28,000+ Unicode code points other than Han ideographs and Hangul syllables.

I also fixed the Show Age filter, and brought it up to date.

by r12a at June 17, 2015 06:28 PM

June 15, 2015

W3C Blog

Security standard open kitchen

Standards are an interesting kitchen, where technology is discussed, cooked, sampled, and finally implemented. It could work in a closed loop, between vendors, but our world is turning into a user-centric manufacturing house, and standards are no exception. This is why, at the same time as specifications are developed at W3C, it’s useful to go in front of web developers and ask, “hey, look at what we are doing: is it to your taste?”

Standing in front of a crowd of Web developers is a great way to test a dish from the standards kitchen. I will be doing that in October, at a conference I really like, where the audience asks questions and challenges the speakers, at the end of sessions, in corridors, or casually over beers. This conference is Paris Web, a two-day francophone conference (followed by a day of practical workshops) that attracts over 1,300 participants around the DNA (major ingredients?) of the Web, with topics such as open standards, Web design, accessibility, UX, quality, etc. I’m particularly happy that for its 10th edition, Paris Web is putting a strong focus on privacy and security.

In my talk “Quoi de neuf sous le ciel de la sécurité du web et des internets ?” (“What’s up in the heavens of security for the Web and Internet?”) I will promote the recent work in Web Application Security, Web Cryptography, Privacy, together with security and privacy related activities of the Technical Architecture Group.

I’ll do my best to expose the recent security and privacy achievements, ongoing plans, and developing success of W3C which I recently described in my blog.

  • I plan to convince the audience that security matters, and to tell how W3C is progressing on that quest.
  • How users could win decent treatment of their application permissions, and better understand the dangers of and countermeasures against browser fingerprinting.
  • How web developers could implement security policies based on cryptographic operations, and create mixed content with less security risk, thanks to the Web Crypto API, CORS, and CSP.
  • How important it is to serve the interests of both users and service providers by promoting the usage of HTTPS.
  • How the next features of the open web platform could be made available in secure contexts.

I believe that demonstrating that W3C is the right place to think and design the trusted Web is also a good means to increase the value of the work which takes place there, contributed by all W3C members.

I also want to gain something else by promoting W3C activities: collecting good insight from the French and European community during the conference, and possibly getting some of these smart people on board so that they can contribute to the Working Groups. I’m looking forward to answering questions after my talk, listening to the audience challenge our work, and sharing a beer with those passionate men and women as we toast Paris Web on its 10th anniversary!

by Virginie GALINDO at June 15, 2015 06:30 AM

June 05, 2015

W3C Blog

W3C forms Security and Privacy Task Force for Automotive

There is an increasing interest and demand around data and services in Connected Cars, and the automotive industry has been working at W3C since 2013 to bring drivers and passengers a rich Web experience, and to make the Web a competitive platform for the automotive industry.

Many industry reports have confirmed that a significant majority of consumers want safe and secure access to the Web from their connected car. At the recent meetings of the W3C Automotive Working Group and the Automotive and Web Platform Business Group, there was strong interest in creating a joint Task Force to explore the various privacy and security implications of the standards work taking place at W3C for connected cars. We hear this need resonating loudly in the automotive industry.

This task force will be exploring security primarily from the perspective of standards being worked on in the Working Group or under early exploration in the Business Group, focusing on potential attack vectors being created.

Privacy similarly will remain focused on data being exposed by standards emerging from the groups but may broaden to potential use cases of applications based on that data, API interaction, user data rights and clearly communicated opt-in sharing arrangements.

We are explicitly seeking security and privacy experts from the Automotive Business and Working Groups, W3C’s privacy- and security-focused Interest Groups, the W3C Membership, the automotive industry, researchers, and other interested parties. With active participation from the automotive industry, W3C is working to bring drivers and passengers a rich Web experience.

For details on participating please see the public call for participation.

Please send questions as well as interview requests to w3t-pr@w3.org.

by Ted Guild at June 05, 2015 12:56 AM

June 01, 2015

W3C Blog

Web and Digital Publishing Experts Converge at Digital Book Event

The Digital Publishing industry convened at the Javits Center in New York City last week for the BookExpo America (BEA) trade show and the International Digital Publishing Forum’s (IDPF) Digital Book conference.

At Digital Book, IDPF Executive Director Bill McCoy and Book Industry Study Group (BISG) outgoing Executive Director Len Vlahos welcomed more than 400 participants for plenary sessions the morning of 27 May. Over the next two days, representatives from the publishing community, including those from a dozen W3C member organizations and staff, participated in numerous Digital Book track sessions addressing current and future industry challenges and opportunities in business, education, and technology.

W3C CEO Dr. Jeff Jaffe spoke on a panel, “The Current State of Book Industry Standards,” with standards organization executives Rob Abel, IMS Global Learning, Graham Bell, EDItEUR, Bill McCoy, IDPF, and Len Vlahos, BISG, moderated by Bill Kasdorf, Apex Covantage.

Jeff Jaffe in panel

Jeff Jaffe speaking on Executive Panel on “The Current State of Book Industry Standards”

In response to the moderator’s question why there are so many standards bodies in digital publishing, Jaffe first clarified that W3C’s mission is to develop global Web standards, and then added that the different standards organizations bring different models and perspectives that are healthy to have because they are indicators of the extensive transformation and innovation taking place in the publishing industry as a result of the Open Web Platform.

He suggested “A more exciting question is what is going to be the impact of these standards on new kinds of book forms, content innovation and business opportunities, similar to what we have seen in the entertainment industry.”

McCoy concurred that the convergence of publishing and the web provides a new rich environment and emphasized that “all this content has to interoperate in a way that it did not have to before.”

The vision for how this interoperability and these new features are being addressed was the subject of the session, “The Convergence of EPUB and the Web,” moderated by Tzviya Siegman, Wiley. Siegman, who co-chairs the W3C Digital Publishing Interest Group together with Markus Gylling, CTO of the IDPF and the DAISY Consortium, was joined on the panel by Ivan Herman, W3C Digital Publishing Activity Lead.

Ivan Herman speaking

Ivan Herman speaking on “The Convergence of EPUB and the Web” panel with Markus Gylling and Tzviya Siegman

During the presentation Siegman explained that the current publishing standard format, EPUB 3, while based on foundational W3C technologies such as HTML5, CSS, and SVG, does not yet have the full set of capabilities of the Web. Conversely, the Web does not have the presentation features of EPUB, such as pagination and other layout features. She explained the progress the W3C Digital Publishing Interest Group has made to date in identifying specific use cases and requirements through the work of various task forces.

Gylling and Herman further explored the technical aspects of what is needed to achieve the vision of offline and online states of packaged or portable documents which they have begun to document in a white paper. They invited more industry participation in this conversation about EPUB and Web convergence.

The W3C Digital Publishing Interest Group is open to W3C member participation, but others may follow and comment on the work by joining the public mailing list: public-digipub-ig@w3.org

For more information about W3C’s standards activities in Digital Publishing, contact Ralph Swick, Ivan Herman or Karen Myers.

by Karen Myers at June 01, 2015 11:27 PM

May 24, 2015

W3C Blog

Building the Web of Things

The Internet of Things (IoT) is regularly in the news, and we expect there to be something like one hundred billion IoT devices within ten years. The promise of innovative new services and efficiency savings is fueling interest in a wide range of potential applications across many sectors, including smart homes, healthcare, smart grids, smart cities, retail, and smart industry. Currently there is a lot of activity, but it is occurring in isolation, resulting in product silos and incompatible platforms. The World Wide Web Consortium (W3C) is seeking to change that through work on global standards for web technologies that bridge IoT platforms through the Web, based upon a new class of Web servers. The Internet provides a basis for connecting systems but, like the phone system, it is not useful unless people are speaking the same language. W3C proposes a conceptual framework with shared semantics and data formats as the basis for interoperability.

This starts with virtual “things” as proxies for physical and abstract entities, described in terms of metadata, events, properties and actions, along with REST bindings to popular protocols such as HTTP, WebSockets, CoAP, MQTT and XMPP. Servers for the Web of Things will be available for microcontrollers, smartphones, home hubs and cloud-based server farms. Larger servers will support a range of scripting languages, whilst smaller servers could use precompiled behaviours. There is also increasing interest in enabling end-user service creation based upon event-condition-action rules, with graphical editing tools and cloud-based processing of vocal commands such as “turn down the heating when I leave home”.
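The scripting interfaces for the Web of Things were still being standardised when this was written, so the following JavaScript is a purely hypothetical sketch of the model described above (a thing as a proxy with properties, actions and events, driven by an event-condition-action rule); every name in it is invented for illustration.

```javascript
// Hypothetical sketch only: all names below are invented for illustration,
// not taken from any standardised Web of Things API.

// A virtual "thing" acting as a proxy for a heating controller,
// described in terms of properties, actions and events.
const heating = {
  name: "HeatingController",
  properties: { targetTemperature: 21 }, // degrees Celsius
  actions: {
    turnDown(delta) {
      heating.properties.targetTemperature -= delta;
    }
  },
  // Event-condition-action rules, keyed by event name.
  rules: {
    userLeftHome: () => heating.actions.turnDown(4)
  }
};

// Dispatch an event to the thing: "turn down the heating when I leave home".
function emit(thing, eventName) {
  const rule = thing.rules[eventName];
  if (rule) rule();
}

emit(heating, "userLeftHome");
console.log(heating.properties.targetTemperature); // 17
```

In a real deployment, a Web of Things server would expose such a description over one of the protocol bindings mentioned above (HTTP, CoAP, etc.) rather than keeping everything inside a single script.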

The Web of Things Framework allows for distributed control, with control located where appropriate, and promises precise synchronisation of behaviour where needed, e.g. for factory robots and process control. The use of Web technologies is expected to dramatically reduce the cost of implementing and deploying IoT services. Companies will be able to realise savings in operational costs but, just as importantly, will gain increased flexibility for rapidly reconfiguring manufacturing processes and a reduction in the time from design to shipping of new products. This will enable a shift from mass production to bespoke production, where products are tailored to each customer’s specific needs. I am very much looking forward to talking about this at the Industry of Things World this September in Berlin.

There are many existing IoT technologies that serve different requirements, and new technologies appear frequently. This necessitates an adaptation layer that bridges to the Web of Things Framework and decouples services from the details of how devices are connected. This is crucial to building robust systems that are resilient to changes at lower layers. Security and privacy are important topics, and can be challenging for constrained devices. W3C expects to work closely with the IETF and other organisations on bindings to protocols and best practices for end-to-end security. To manage privacy, data owners will be able to control who can access their data and for what purposes.

With the success of open source software and the advent of open hardware, there is a huge opportunity for hobbyists and members of the “maker” community to get involved and help build momentum around open standards for the Web of Things. It is now possible to build your own IoT services for a few dollars, and I am looking for volunteers to help with developing open source Web of Things servers on a range of scales from microcontrollers, to cloud-based server farms. Working together, we can build strong standards based upon sharing our practical experience of developing services for the Web of Things.

W3C has recently formed the Web of Things Interest Group and plans to launch a Working Group in late 2015 to standardise the Web of Things Framework. We are very interested in understanding use cases and requirements across business sectors, so please join us to help drive the Web to a whole new level!

by Dave Raggett at May 24, 2015 01:26 PM

May 21, 2015

W3C Blog

Make a Mark – Become Verified in HTML5 by W3C!

W3C’s ongoing mission is to make the Web better, and one way to do this is to offer high-quality training programs that increase the skills of Web developers and empower them to become the next leaders and innovators on the Web.

W3C has recently expanded its Training Program through the edX partnership that resulted in the creation of W3Cx. The first MOOC to be offered on that platform is Learn HTML from W3C (W3Cx HTML5.1a). This exciting 6-week course starts on 1 June 2015 and has been built by Michel Buffa, who was the creator and instructor of the HTML5 course on W3DevCampus. We think the learning experience will result in a dramatic increase in your knowledge of HTML5. While you can sign up for the course on the Honor System, I would encourage you to sign up for a Verified Certificate, as it provides a mark of distinction for you: the W3C Seal of Approval.

In fact, on W3DevCampus, students asked us to recognize that they had successfully completed a course, so we did that via Open Badges in addition to certificates of completion. People who have earned these badges tell us they make a difference as they enter the job market, change opportunities, or seek new clients. We believe employers will soon be asking for marks of distinction such as Verified Certificates.

One last comment for your consideration: we are making this course a special offer from a pricing perspective. W3C has set the price for Verified Certificates at $129 per course; for this inaugural course on W3Cx we are offering them at $99, which is an even better bargain!

We look forward to seeing you sign up for this exciting course and the many more that W3C will be offering in the coming years.

by Marie-Claire Forgue at May 21, 2015 07:19 PM

May 15, 2015

W3C Blog

Job: Web Standards Technology Expert

The World Wide Web Consortium (W3C) is looking for a new, full-time staff member, to be located at Beihang University in Beijing, where W3C currently has its China Host, to assist, as team contact, W3C Working Groups developing technical specifications in the fields of the Ubiquitous Web or Technology and Society.

This is an opportunity for a unique individual to be part of the team responsible for the design of next-generation World Wide Web technologies and to lead a variety of industry and user groups toward the development of technologies that enhance the functionality of the Web. As the recognized leader for the technical development of the World Wide Web, we are seeking an individual with both Web technology and project management experience, an excellent understanding of the Web industry, and enthusiasm for the mission and spirit of W3C.


The individual will work within standards groups to edit specifications, develop change proposals, monitor progress, reach out to different stakeholders, produce tests, and respond to clarification requests, bugs, and issues.

We would welcome candidates with specific technology or business skills such as:

Generic Skills Required

  • Bachelor’s degree or higher;
  • Background in computer science and software engineering;
  • An understanding of the Web industry and its market, practices, and product development cycles;
  • Familiarity with up-to-date Web technologies, such as HTML, CSS, Web APIs, and scripting;
  • A team player with good communication and interpersonal skills;
  • Ability to work remotely and effectively is required;
  • Experience with the development of open information technology standards is desired;
  • Ability to travel internationally is required;
  • Good written and spoken competency in the working language of the W3C, i.e., English, as well as in Chinese, is required;
  • Strong writing skills required, with past experience in editing technical specifications a plus;
  • Knowledge and practical experience of other languages and cultures is a plus.

Work starts 1 August 2015.

The position is based at the School of Computer Science & Engineering of Beihang University, No. 37 Xueyuan Road, Haidian District, Beijing. There will be regular international travel, including to W3C Host sites, and regular remote work with the Working Groups and the global W3C staff.

To apply, please send a motivation letter, your resume or CV and (if at all possible) copies of your Diplomas (including High School or equivalent) in electronic form to <team-beihang-position@w3.org>.

by Coralie Mercier at May 15, 2015 07:16 AM