Re: Use Cases | ACTION-13 Revisited

Here's a revised, shortened proposal for the use-case text (incorporating Vijay's suggestion):

"Use case: Secure application protocol with user and device authentication

Using the Web Cryptography API, a web application implements a secure application protocol in Javascript, supporting confidentiality, message integrity and authentication for both the user and the device. Device authentication is based on pre-provisioned keys, when these are supported by the device. Pre-provisioned keys enable the service to establish the identity of the device into which the keys were originally embedded, thereby establishing a level of trust based on prior knowledge of that device and its security properties. The level of service provided is adapted to the type of device and the level of trust established.

For the purposes of this use-case, pre-provisioned keys need only have origin scope and thus need not lead to cross-origin tracking concerns.

When a user navigates to the site making use of these capabilities using a device supporting pre-provisioned keys, the User Agent provides the Javascript application with access to the corresponding pre-provisioned keys through the Web Cryptography API. Before doing so, the User Agent obtains user consent through the use of context-specific mechanisms (possibly including UI prompts)."
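
For concreteness, here is a minimal sketch of how the Javascript application might exercise such a key once consent is granted. The key-discovery call and key name below are hypothetical (the draft does not yet define how pre-provisioned keys are surfaced), and the sign() call only follows the general shape of the API:

    // Illustrative sketch only: getKeyByName() and the key name are hypothetical;
    // origin-scoped discovery of pre-provisioned keys is not yet specified.
    function authenticateDevice(challenge) {   // challenge: ArrayBuffer from the service
      return navigator.cryptokeys.getKeyByName("device.key.1")      // hypothetical
        .then(function (deviceKey) {
          // Sign the service-supplied challenge with the pre-provisioned key.
          return crypto.subtle.sign({ name: "HMAC" }, deviceKey, challenge);
        })
        .then(function (signature) {
          // The application protocol returns this signature to the service, which
          // verifies it against its record of the key embedded in the device.
          return signature;
        });
    }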

…Mark

On Sep 4, 2012, at 2:05 AM, Vijay Bharadwaj wrote:

Mark,

Your last paragraph seems to assume a specific choice of UX behavior. Why not something like:

When a user navigates to a site making use of these capabilities using a device supporting pre-provisioned keys, the User Agent provides the Javascript application with access to the corresponding pre-provisioned keys through the Web Cryptography API. Before doing so, the User Agent obtains user consent through the use of context-specific mechanisms (possibly including UI prompts).


From: Mark Watson [mailto:watsonm@netflix.com]
Sent: Thursday, August 30, 2012 2:55 PM
To: Ryan Sleevi
Cc: Arun Ranganathan; Vijay Bharadwaj; public-webcrypto@w3.org Working Group; estark@mit.edu
Subject: Re: Use Cases | ACTION-13 Revisited


On Aug 30, 2012, at 2:20 PM, Ryan Sleevi wrote:


On Thu, Aug 30, 2012 at 11:12 AM, Mark Watson <watsonm@netflix.com> wrote:
Hi Editors,

I don't think I saw an answer to this. I'm happy to provide revised text for this use-case for the FPWD if you wish. I think it should be included.

…Mark

Sorry about the delay in replying, Mark.

My goal in the selection of use cases was to try to find some generic set of problems that could demonstrate each of the essential parts of the API - encryption, signing, digesting, etc. The aim was not to favour or dismiss particular consumers, but simply to provide a quick cross-section and justification.

If you feel there's some essential functionality present in the spec that isn't covered by the use cases, I think additional text would be welcome. Note that other members have proposed we remove the use cases entirely - something I'm also reluctant to do - so I think we need to find a balance in how many and how comprehensive our use cases are.

I think the use-case is relevant to any application which needs to bootstrap some form of trust in the client device - or some part thereof - from pre-provisioned keying information. Almost any application delivering paid-for virtual goods may benefit from this. Consider also financial applications, where the service must be sure that the client meets certain financial regulatory requirements, or medical applications, where the service must be sure the client meets certain regulatory privacy requirements.


A lesser concern would be causing readers of the spec to repeat the experiences from our Face to Face, where some members, upon initially reading Netflix's proposal, believed it to be DRM and thus reacted negatively. While I'm very appreciative of Netflix's initial contributions and continued involvement, I'm sure you can appreciate the concern that a use case that appears ideologically controversial might be more harmful than helpful to productive discussions.

Hopefully we can make it clear that this is not DRM. There is also a fundamental qualitative difference between the DRM discussions and what we're proposing here: there is no secret sauce.



If you have text though, I think it's better to send it, rather than to ask if you should :-)

Well, the question was whether you needed text from me, or were already working on it, since the ACTION was to "add missing use-cases" (it was Arun's action, though, not yours).

So, here is a more general spec-text version of the use-case text:

"Use case: Secure application protocol with user and device authentication

When accessing a paid-for service - such as a commercial video streaming service - the user and the service provider have a common interest in making sure that the subscription and the device accessing the service are genuine and that there is non-repudiation for the transaction. Authentication is a means to achieve such trust.

Using the Web Cryptography API, such an application may implement a secure application protocol in Javascript, including support for confidentiality, message integrity and authentication for both the user and the device, based on pre-provisioned keys (when they are supported by the device). Pre-provisioned keys enable the service to establish the identity of the device into which the keys were originally embedded, thereby establishing a level of trust based on prior knowledge of that device and its security properties. The service may then provide a level of service appropriate to the type of device and the level of trust established.

Pre-provisioned keys may be of various types, including those generated on demand by the platform and those embedded in secure hardware elements. The service is expected to infer the appropriate level of trust for a given set of keys based on prior information. Note that, for the purposes of this use-case, pre-provisioned keys need only have origin scope and thus need not raise cross-origin tracking concerns.

A user navigating to a site making use of these capabilities, on a device supporting pre-provisioned keys, may be asked by the User Agent whether they wish to provide their origin-specific device identity to the service. If so, the Web Cryptography API provides the Javascript application with access to the corresponding pre-provisioned keys."

…Mark


On Aug 27, 2012, at 1:04 PM, Mark Watson wrote:

> Ryan, other editors,
>
> Are you planning to include the use-case we submitted [1] ? Do you need revised text for this ?
>
> …Mark
>
> [1] http://www.w3.org/2012/webcrypto/wiki/Use_Cases#Client_authentication:_An_Example_From_a_Video_Streaming_App
>
> On Aug 27, 2012, at 11:57 AM, Ryan Sleevi wrote:
>
>> Vijay,
>>
>> Just to make sure I followed - is your proposed use case the same as
>> what I suggested - where the IndexedDB/storage provider takes some key
>> and uses it to protect the entire key/value/indexed store?
>>
>> The provenance of the Key doesn't matter to the security properties of
>> the storage. Instead of using a pre-provisioned key, you could equally
>> have the service give the user agent the Key after the user had
>> authenticated. Under such a scenario, you prevent offline access from
>> working, but it nominally protects against device theft (unless, you
>> know, the user stores their password within the OS/user agent, and the
>> attacker can compromise that...)
>>
>> Still, it does seem like we're in agreement that encrypting single
>> key/value pairs stored in stable storage isn't desirable, but what is
>> desirable is to encrypt/protect the entire store (e.g. all of
>> IndexedDB/WebStorage). While such features would be useful and have a
>> number of interesting properties, it does seem like something that
>> would require multi-WG coordination, and a strong understanding of the
>> security properties and guarantees, which I'm not sure we have quite
>> yet.
>>
>> On Mon, Aug 27, 2012 at 9:37 AM, Arun Ranganathan <arun@mozilla.com> wrote:
>>> Vijay,
>>>
>>> There are two things in this use case:
>>>
>>> 1. The ability for the API container for the Crypto API to access keys from secure elements.  As discussed in the F2F, ideally this is done seamlessly; namely, when a secure element is present, the user agent detects it as a key repository.
>>>
>>> 2. The ability to have an encrypted local store.  But this is the same as the question rsleevi raised before, namely is this reinventing the IndexedDB/localStorage wheel, but enabled for cryptography?  Is this something this group should take on?
>>>
>>> I envision cross-group coordination at some point, but I'm wondering whether for now we should not include encrypted local storage as a use case.
>>>
>>> -- A*
>>>
>>> On Aug 27, 2012, at 5:55 AM, Vijay Bharadwaj wrote:
>>>
>>>> Perhaps there is a case for locally encrypted content when you combine it with a secure token.
>>>>
>>>> Take for example a web app that stores its local data encrypted to a smart card (provisioned out of band, like we have been assuming all trusted smart cards are). Then while the app is vulnerable if it is used after the user agent is compromised, at least it raises the bar by requiring the attacker to do a two-touch attack. An attacker who just compromises the user agent cannot decrypt the locally stored data, because the user agent itself cannot decrypt it without the token.
>>>>
>>>> To be more specific:
>>>>
>>>> Use case: encrypted local storage
>>>>
>>>> When caching sensitive data locally, an application may wish to ensure that this data cannot be compromised in an offline attack. In such a case, the application may leverage a key stored on a secure token distributed out of band (such as a smart card) to encrypt the local cache. Thus, the cache may only be decrypted by the application when the secure token is present; at other times (such as when an attacker has stolen the machine) the local cache is inaccessible and all operations will require online authentication to the application's web service.
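>>>>
>>>> As a rough sketch of that pattern (the unwrapKey()/decrypt() calls follow the general shape of the API, while the token key handle, wrapping algorithm and storage layout are assumptions for illustration only):
>>>>
>>>> // Illustrative only: cardKey is assumed to be a handle to a private key
>>>> // resident on the smart card; fromBase64() is an app-defined helper that
>>>> // returns an ArrayBuffer, and the storage layout is made up for this example.
>>>> function decryptLocalCache(cardKey) {
>>>>   var record = JSON.parse(localStorage.getItem("encryptedCache")); // { wrappedKey, iv, data }
>>>>   // Unwrap the cache key using the token; without the card present this step
>>>>   // fails, so an offline attacker cannot decrypt the stored data.
>>>>   return crypto.subtle.unwrapKey(
>>>>       "raw", fromBase64(record.wrappedKey), cardKey,
>>>>       { name: "RSA-OAEP" },                    // assumed wrapping algorithm
>>>>       { name: "AES-GCM" }, false, ["decrypt"])
>>>>     .then(function (cacheKey) {
>>>>       return crypto.subtle.decrypt(
>>>>         { name: "AES-GCM", iv: fromBase64(record.iv) },
>>>>         cacheKey, fromBase64(record.data));
>>>>     });
>>>> }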
>>>>
>>>> -----Original Message-----
>>>> From: Arun Ranganathan [mailto:arun@mozilla.com]
>>>> Sent: Friday, August 17, 2012 7:57 AM
>>>> To: Ryan Sleevi
>>>> Cc: public-webcrypto@w3.org; estark@mit.edu
>>>> Subject: Re: Use Cases | ACTION-13 Revisited
>>>>
>>>> Ryan,
>>>>
>>>>
>>>> On Aug 16, 2012, at 7:16 PM, Ryan Sleevi wrote:
>>>>
>>>>>> On Thu, Aug 16, 2012 at 3:55 PM, Arun Ranganathan <arun@mozilla.com> wrote:
>>>>>> While working through the use cases (per [ACTION-13]) with Wan-Teh
>>>>>> (wtc), we came up with the following:
>>>>>>
>>>>
>>>> <snip/>
>>>>
>>>>>> 1. The use cases rsleevi added to the draft [spec] are pretty solid;
>>>>>> they are only missing a "local storage" scenario, first mentioned on
>>>>>> the Wiki [cf. local].
>>>>>> [cf. local]
>>>>>> http://www.w3.org/community/webcryptoapi/wiki/Use_Cases#Storing_local_storage
>>>>
>>>>
>>>>> I'm a little concerned about the "local storage" case, and wondering
>>>>> whether it's something that would necessarily be in scope for this
>>>>> group.
>>>>>
>>>>
>>>>> Consider the example of IndexedDB, which uses "Keys" (IDB keys -
>>>>> http://www.w3.org/TR/IndexedDB/#key-construct ) and returns "Values" (
>>>>> http://www.w3.org/TR/IndexedDB/#value-construct ), and can
>>>>> alternatively be accessed via indices (
>>>>> http://www.w3.org/TR/IndexedDB/#index-concept ).
>>>>>
>>>>
>>>>> A naive assumption would be that this API would only protect the
>>>>> Values - not the keys, nor the indices. However, as practically
>>>>> deployed today, that wouldn't offer much protection, since both Keys
>>>>> and Indices often reveal quite a bit of information.
>>>>>
>>>>> Further, by ciphering contents, it's a tradeoff between efficiency and
>>>>> privacy. Perfect privacy (storing no relationships about keys/indices,
>>>>> everything randomly distributed) is the worst efficiency, while
>>>>> perfect efficiency (which is what is afforded by today's IndexedDB)
>>>>> has no privacy/cryptography.
>>>>>
>>>>> A refinement might be to have the IndexedDB actually take a Key
>>>>> (Crypto API key), that it can use to protect however the IndexedDB is
>>>>> stored - keys, indices, everything. Call it an "EncryptedIndexedDB".
>>>>> This is better, in that it allows the user agent to decrypt on the fly
>>>>> (see caveat), and allows applications to use existing indices/keys.
>>>>> The caveat, however, is that encryption requires defining an
>>>>> encryption algorithm, and the choice of encryption algorithm directly
>>>>> affects the efficiency of the API. For example, under today's
>>>>> IndexedDB, a user agent can load data on the fly (eg: from disk), but
>>>>> under EncryptedIndexedDB with, say, a block cipher algorithm like AES, it
>>>>> might have to read the entire DB into memory, then decrypt, in order
>>>>> to be able to offer this functionality.
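>>>>>
>>>>> (Purely as a thought experiment, such an interface might look something like
>>>>> the following - every name here is hypothetical, nothing like it exists today:)
>>>>>
>>>>> // Hypothetical: an IndexedDB open() variant that takes a Crypto API Key and
>>>>> // transparently ciphers keys, indices and values before they reach disk.
>>>>> var req = indexedDB.openEncrypted("appCache", 1, protectionKey);  // not a real API
>>>>> req.onsuccess = function (e) {
>>>>>   var db = e.target.result;
>>>>>   // From here the application uses the normal IndexedDB surface; the user
>>>>>   // agent decrypts records on the fly as they are read back.
>>>>>   db.transaction("items").objectStore("items").get("someKey");
>>>>> };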
>>>>>
>>>>> Even more fundamental, though, is the question of what attack this
>>>>> is trying to defend against. The arguments I've heard for encrypted
>>>>> local storage seem to be about a remote server, serving a web
>>>>> application, distrusting the client platform. If that's the case, it
>>>>> doesn't seem like any level of cryptography will save them. As I noted
>>>>> in the existing security considerations, it SHOULD be perfectly valid
>>>>> for a user agent to store a key in plaintext on disk, so what actual
>>>>> protections are afforded by this?
>>>>
>>>>
>>>> You're right -- if the use case is primarily about an untrusted multi-user machine or virtual computing environment, we're only as safe as general user safety anyway.  This doesn't seem to be a use case we can salvage, nor one that should influence the API.  We should probably not include it.
>>>>
>>>> But:
>>>>
>>>>>
>>>>> If something like EncryptedIndexedDB is what is meant here, then this
>>>>> seems like something that would likely live in the Web Apps WG (since
>>>>> it's about extending IndexedDB).
>>>>>
>>>>
>>>> Maybe -- I doubt it's worth their while to solve for that use case either :).  Interestingly enough (and not to confuse matters, but) we've just heard from Facebook [FB-ScriptSigning] about localStorage (or IndexedDB) used as a script cache.  People are already using IndexedDB and localStorage in unsafe-ish ways.  Of course, we shouldn't confuse script signing with a general use case for protected/encrypted local storage, but perhaps if we jettison the "protected local storage" use case, we can bolster the "document signing" use case to explicitly refer to documents extracted from local storage for signature verification.
>>>>
>>>> This raises the sticky issue of types of documents.  We might naively say that a script is no ordinary document, and can be used by the relevant JSON primitive if it passes signature validation.
>>>>
>>>> In a nutshell, I'm saying: perhaps we cannot cater to an encrypted local store use case, but we may be able to flesh out the use case for signature verification, including extraction from local storage.  Our use cases should encourage patterns of behavior that we think are desirable.  We can't control or solve for undesirable patterns of behavior :)
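>>>>
>>>> (A minimal sketch of that pattern, assuming the page already holds the service's
>>>> public verification key as a CryptoKey and stores a detached signature alongside
>>>> the cached script - the helper names are made up:)
>>>>
>>>> // Illustrative only: verifyKey is a CryptoKey imported from the service's
>>>> // public key; toBuffer()/fromBase64() are application-defined helpers
>>>> // returning ArrayBuffers.
>>>> function runCachedScript(verifyKey) {
>>>>   var source = localStorage.getItem("cachedScript");
>>>>   var signature = fromBase64(localStorage.getItem("cachedScriptSig"));
>>>>   return crypto.subtle.verify(
>>>>       { name: "RSASSA-PKCS1-v1_5" },         // assumed signature algorithm
>>>>       verifyKey, signature, toBuffer(source))
>>>>     .then(function (ok) {
>>>>       // Only execute the cached script if the signature checks out.
>>>>       if (ok) { (new Function(source))(); }
>>>>     });
>>>> }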
>>>>
>>>>
>>>>> I just want to make sure that we're carefully considering the use case
>>>>> and the security implications before committing to them, as well as to
>>>>> figure out what parts of the spec may need to change in order to
>>>>> meaningfully implement them.
>>>>
>>>>
>>>> +1.
>>>>
>>>> -- A*
>>>>
>>>> [FB-ScriptSigning] http://lists.w3.org/Archives/Public/public-webcrypto/2012Aug/0121.html
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>
>
>

Received on Tuesday, 4 September 2012 19:36:18 UTC