Update on W3C Credentials Community Group

We’re attending the main call for the W3C Credentials Community Group and we have our own repository set up in line with the work item that was approved by W3C. The repo is empty at the moment.

We have some things to do to establish a presence there so they can help us better. We need to set out some tangible deliverables that make sense both for us and the wider community of folks at CCG.

If there is anything significant we need to publish for wider relevance beyond co-ops, we need to define it properly, in a way the wider CCG audience can understand. They have a lot of ‘protocol’ around this and some useful templates. More formal definitions of our use cases would be a great place to start!

I have reported back to them that we are making good progress with engagement of other co-ops and in getting research underway. They’re keen to hear from us, so after our own community call maybe we could give them an update on our status and plans and list some questions so that we can get wider advice, especially on VC standards.

On this week’s call we talked about the status of the new version of the VC spec, v1.1, which is coming up for approval, and about the work item process.

We also talked about a new work item proposal this week: to have VCs automatically refreshed in the background if they are expiring. This makes it easier to have short-lived credentials. It is based on the VC API work done in another task force. It looks very useful for managing risk around credentials, and also for prompting manual checks when they are necessary - just like an expiry date on a paper document or inspection certificate.
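For illustration, the VC data model already defines a `refreshService` property for exactly this. Here is a minimal sketch of how a holder’s wallet might decide it is time to refresh - the issuer URL, refresh service id/type and dates are all invented examples:

```python
from datetime import datetime, timedelta, timezone

# A hypothetical short-lived credential. The refreshService property is
# part of the VC data model, but the ids and type values here are invented.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "https://coop.example/issuers/1",
    "expirationDate": "2022-05-01T00:00:00Z",
    "refreshService": {
        "id": "https://coop.example/refresh/1",
        "type": "ManualRefreshService2018",
    },
}

def needs_refresh(cred, now, margin=timedelta(days=7)):
    """True if the credential expires within `margin` of `now`."""
    expires = datetime.fromisoformat(cred["expirationDate"].replace("Z", "+00:00"))
    return expires - now <= margin

# Three days before expiry, with a seven-day margin -> time to refresh.
print(needs_refresh(credential, datetime(2022, 4, 28, tzinfo=timezone.utc)))
```

A background agent could poll this check and call the refresh service automatically, which is the ‘short-lived credentials without manual hassle’ idea in a nutshell.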

The (very full!) archive of the minutes of all the meetings is here:


This week’s meeting (25/1) had nothing of particular interest for us - it was mostly technical:

  • new VC spec approved + thinking about ways to allow more ‘crypto-agility’, i.e. make the specs more future-proof (lots of innovation in new libraries / proofs)

  • interoperability and testability - some good tools / thinking there for big ecosystems:

Although our early use cases are going to be simple and won’t need any of this, it’s critical to have this in the standards ready for the day when things scale up!
Another part of testing in any scenario is the workflow: it’s important to think about the different situations that can occur across a complex ecosystem where many different types of credentials are issued (and potentially revoked), and where there are dependencies or contention between them.


Meeting on 15/2:

A very technical and detailed review of the crypto underpinning the VCs against the US NIST standards. Nothing to do with use cases… more about ‘can it be cracked?’ ‘is it safe?’ ‘is it government-approved?’

David Balenson and Anil John did the cryptography review of the DID and VC specs to increase their level of compliance with federal government standards… DHS especially.

  • ‘Can it be cracked?’ Not feasibly… yet, but expect a move to quantum-secure keys in future.
  • ‘Is it safe?’ As safe as most commercial crypto - ‘yes, if you use it right’. Follow the guidelines.
  • ‘Is it government-approved?’ In general, yes, but it depends on approval of the actual products.

The full CCG call next week will discuss all this in more detail from an implementer perspective - meaning those who are still building the VC infrastructure products, not us or our community, who aim to USE them.


This week’s CCG call featured some news about a new standard (coming out of the same verifiable credentials stable) from the Coalition for Content Provenance and Authenticity (C2PA).

It might interest any co-ops engaged in proving the provenance of a digital creative work, or perhaps those telling the story of - or certifying - the supply or value chain behind the creation of goods, an asset, or an object of some kind.


Overview - C2PA

An open technical standard giving publishers, creators, and consumers the ability to trace the origin of different types of media.

The C2PA does not prescribe a single, unified platform for authenticity (e.g. Microsoft Azure cloud services, as in the video, or some particular blockchain platform), but instead presents a set of standards that can be used to create and reveal attribution and history for images, documents, time-based media (video, audio) and streaming content.

For example, Resonate co-operative might be able to use it some day to ensure the best possible, verifiable attribution of work uploaded and shared via the platform.

A lot of work has been done on it by folks at Adobe and Microsoft… but it is being published as an open source standard initiative.
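To make the ‘attribution and history’ idea concrete, here is a toy sketch - emphatically NOT the real C2PA manifest format, and all names are invented - of how each step in an asset’s history can be bound to a hash of the asset itself, making tampering detectable:

```python
import hashlib

def digest(data: bytes) -> str:
    """Content fingerprint of the asset bytes."""
    return hashlib.sha256(data).hexdigest()

# Toy provenance record: each entry binds an attribution claim to a hash
# of the asset as it existed at that step. (Real C2PA manifests are signed
# and structured quite differently - this only illustrates the principle.)
asset_v1 = b"original track master"
asset_v2 = b"original track master, remastered"

manifest = [
    {"action": "created", "by": "artist@resonate.example", "asset_hash": digest(asset_v1)},
    {"action": "remastered", "by": "engineer@resonate.example", "asset_hash": digest(asset_v2)},
]

def matches(manifest, data: bytes) -> bool:
    """Does the latest manifest entry match the asset we actually have?"""
    return manifest[-1]["asset_hash"] == digest(data)

print(matches(manifest, asset_v2))     # the history fits this file
print(matches(manifest, b"tampered"))  # a swapped file fails the check
```

A consumer (or a platform like Resonate) could walk the manifest backwards to see who did what to the asset, and verify that the file in hand is really the one the history describes.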

Here’s the deck:




The meeting this week was about decentralised, secure, ‘performant’ cloud storage that can compete with the big players, like Amazon S3 or Google Cloud Platform data buckets.


Storj is “a robust object store that encrypts, shards, and distributes data to nodes around the world for storage. Data is stored and served in a manner purposefully designed to prevent breaches using a network of provider nodes and then clients.” Storj has been going for several years as a project (I looked at it in 2016 and signed up). It’s not co-operative, but it is / is intended to be open source. It’s looking quite mature now as a live service, and is good enough to handle the decryption of sharded video on demand.

Third edition of the white paper here: https://www.storj.io/storjv3.pdf
Presentation deck to follow, explaining the fit / alignment with verifiable credentials technology.


Cool to see this project mentioned. I believe I collected a few GBs of free space with their alpha offer about three years ago.


“quantum secure” - interesting reminder of the performance window of all present crypto applications.


DIDComm Deep Dive

The meeting last Tuesday, 5th April, was an excellent, in-depth presentation by Daniel Hardman on decentralised identifiers (DIDs) and DIDComm, a ‘framework for safe, structured interactions built atop DIDs’. Here is the audio and here is the DIDComm spec, the ‘protocol’ thing that Daniel is explaining.

If it helps, my understanding is that verifiable credentials are concerned with authentic, verifiable structured info, mostly about humans, while DIDComm is about NOT ONLY how that info could be exchanged, BUT ALSO how communication between all sorts of services (microservices), doing all sorts of things, can be achieved without creating big silos of tech. It’s more like a set of building blocks for protocols that do all sorts of fancy things, securely, yet in a very decentralised way.

So… you don’t necessarily need DIDs for Verifiable Credentials. And you don’t need DIDComm just because you have used DIDs.
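A minimal sketch of that first point, with invented example URLs: in the VC data model, the issuer and subject identifiers can be ordinary HTTPS URLs - DIDs are one option, not a requirement.

```python
# A minimal (unsigned) credential shape along the lines of the VC data
# model. Note the issuer and subject ids are plain HTTPS URLs, not DIDs.
# All URLs and the membership type here are invented for illustration.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "CoopMembershipCredential"],
    "issuer": "https://coop.example/about",
    "issuanceDate": "2022-04-01T00:00:00Z",
    "credentialSubject": {
        "id": "https://coop.example/members/42",
        "memberOf": "Example Co-operative",
    },
}

def uses_dids(cred) -> bool:
    """Do any of the core identifiers in this credential use a DID?"""
    ids = [cred["issuer"], cred["credentialSubject"]["id"]]
    return any(i.startswith("did:") for i in ids)

print(uses_dids(credential))  # False - a VC can stand without DIDs
```

For simple co-op use cases, starting with plain URLs like this keeps things understandable, and DIDs can be swapped in later if and when decentralised resolution is actually needed.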

Opinion… beware… I may have a touch of the bends here:

Building identity on top of the DID concept already feels pretty complex, and it can be tough to use right now. DIDComm is a spec, not a standard yet. In the long run, if it gains critical mass and is deployed consistently, everywhere, by products and their stacks, it could be a transformational move to a world based on pure peer-to-peer interactions rather than clients and servers. It’s a hugely ambitious thing, almost like a fancy replacement for HTTP.

Up Periscope

I can’t help feeling that all this will take a very long time to mature and reach scale. In the meantime, as co-ops we can try to keep it as simple and as easy to use as possible, even if imperfect, accepting some theoretical limitations. Let’s use pragmatic tech that hides the DID stuff away from users, or even dodges it completely for now?


I think the work that @angus has done on our plugin for Discourse is a great example of that pragmatic approach, and it’s good enough to try out!


Grandma OAuth2 meets the VC kids


Yet another identity wallet - a browser plugin… but open source and simple to ‘sign in with…’ using the good old familiar OAuth2 protocol to access websites and protected resources at ‘endpoints’.

The Good

This week’s call offered an interesting open-source approach that made maximum use of the almost-ubiquitous OAuth2 protocol… something that any co-op might be able to set up with existing tools and no coding. It does assume an ‘enterprise’ scenario with a trusted authorisation server - so it’s not fully decentralised - but for a co-operative of co-operatives that approach might be acceptable, especially if it is open source and not reliant on third-party cloud services.

All that messing about with DIDs and DID resolution is optional here… (Nikos is focused on easy ‘enterprise’ use more than ‘global interoperability’). The core idea is to make it fast and easy to use a wallet and a VC ‘access token’ to do things with websites, without having sneaky old federated ID servers in the background. Nice and simple. Predetermined choice of wallet.

presentation slides

website and demo
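For anyone unfamiliar with the ‘good old familiar’ part, here is a sketch of the plain-OAuth2 half of such a flow (the standard client-credentials grant from RFC 6749); the VC-specific wallet steps are out of scope, and the client id and scope values are invented:

```python
from urllib.parse import urlencode

# Standard OAuth2 token request body (RFC 6749 client_credentials grant).
# In the scheme presented on the call, the access token the server returns
# would carry or point to a verifiable credential - that part is omitted.
token_request_body = urlencode({
    "grant_type": "client_credentials",
    "client_id": "coop-wallet",        # invented example client
    "scope": "read:members",           # invented example scope
})

def bearer_header(access_token: str) -> dict:
    """The Authorization header a client sends to a protected endpoint."""
    return {"Authorization": f"Bearer {access_token}"}

print(token_request_body)
print(bearer_header("example-token"))
```

The attraction is exactly that this request/header shape is already supported by countless existing tools and libraries, so a co-op could wire it up with minimal or no coding.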

The Bad

  • It’s very new and not production-ready… a Firefox wallet plugin only at this point
  • Some of the CCG folks felt there were too many crypto operations in the browser to be safe
  • Some of the CCG folks felt that more attention should have been given to inter-operability (multiple wallets via a credential handler API)… more complexity!
  • Narrower use cases… the credentials are more ‘tokens’ bound to endpoints than true VCs with a multiplicity of issuers… requires more co-operation and co-ordination

…but I think we can learn a lot from this work?


Some Show and Tell Sessions on Inter-Operability

This seems a long way ahead of where we are in co-op credentials: still on use cases / user stories, and remaining, in principle and as far as possible, agnostic on solution platforms. It’s a global/industry topic and very technically focused. However, we need to follow it… it might give an idea of where the industry as a whole will end up on identity wallets and infrastructure.

Inter-op is not just about helping things join up sometime in the future… it’s also about avoiding lock-in to a particular vendor’s technology and world view. Apple, Google/Chrome and Microsoft will otherwise continue to drive so much of the landscape… fundamentally, they want to protect their dominance and will move together using bodies they control, internal development, or acquisition. Maybe that’s inevitable…?

Markus Sabadello and Danube Tech

Small start-up, very active on inter-op. Lots of test suites now available. The big story is here:

Transatlantic SSI Interop

(The DID universal resolver, one of the technical components that underpins this is Markus’ awesome work.)

It’s worth mentioning that this inter-operability is technical and does not cover the legal and governance aspects of these (proof-of-concept) scenarios. Human social, economic, political and legal inter-operability is by far the most important thing.

Manu Sporny and Chris Abernethy

Cross-Industry Inter-operability

The ‘verticals’ can become ‘silos’, and global inter-op can be compromised. Active areas right now are:

  • Traceability in supply chains - Spec
  • Immigration and education - permanent resident card - interop around citizen credentials
  • Jobs for the future - Workforce credentials - vc-edu
  • ‘mobile driver’s licence’ (mDL)

An inter-operability dashboard is proposed… modelled on the browser inter-op dashboards that developers use.

Developer onboarding and tools are improving, but there is a very big learning curve… very hard. OpenAPI and Postman are helping. Linking user stories to the test suites, so that there is more of a Test-Driven Development (TDD) approach, seems to be the preferred route - i.e. write tests before you write code!


This Week: Drummond Reed and Sam Smith on the linking of Credentials

This week there was a presentation of some exciting work on the linking or chaining of Verifiable Credentials. For example, say I’m a member of the co-operative (cred 1) and I am an elected lead of a sub-group (cred 2). Cred 2 depends on cred 1. We don’t need to load all of this into one big fat credential, nor should we create two overlapping creds… we need some sort of lightweight but secure and decentralised cross-referencing. Hence:

“Authentic Chained Data Containers” (ACDC), which have come out of some really cool work called KERI.

…to be fair, none of this is likely to be relevant to us right now, because our use cases will be simple and less in need of government-grade cryptographic protection. It’s good to know, however, that these tools will be available as standard.

It will probably be going into version 2 of the VC standard and data model. It brings a greater level of security, reliability and performance to the cross-referencing of credentials to each other, the supporting data they point to, and the schemas associated with the credentials or datasets. Basically, Verifiable Credentials will be able to point to other VCs and other verifiable data by using a special ‘internal’ crypto-identifier called a SAID (Self-Addressing IDentifier).
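A much-simplified sketch of the self-addressing idea (real SAIDs in KERI/ACDC use a specific serialization and encoding; this only shows the principle): hash the credential with its identifier field blanked out, then embed that hash as the identifier - so the id cryptographically commits to the content, and other credentials can chain to it just by quoting the id.

```python
import hashlib
import json

def saidify(doc: dict, field: str = "id") -> dict:
    """Toy self-addressing identifier: hash the doc with `field` blanked,
    then embed the hash as `field`. Not the real SAID encoding."""
    blanked = dict(doc, **{field: ""})
    said = hashlib.sha256(json.dumps(blanked, sort_keys=True).encode()).hexdigest()
    return dict(blanked, **{field: said})

def verify_said(doc: dict, field: str = "id") -> bool:
    """Recompute the hash and check it matches the embedded identifier."""
    blanked = dict(doc, **{field: ""})
    expected = hashlib.sha256(json.dumps(blanked, sort_keys=True).encode()).hexdigest()
    return doc[field] == expected

# The membership example from above, as a chain (claims are invented):
cred1 = saidify({"id": "", "claim": "member of co-op"})
cred2 = saidify({"id": "", "claim": "elected sub-group lead",
                 "chainedTo": cred1["id"]})  # cred2 points at cred1 by SAID

print(verify_said(cred1), verify_said(cred2))  # True True
```

Any tampering with cred1’s content changes its recomputed hash, so both cred1’s own check and cred2’s reference to it break - which is the lightweight, decentralised cross-referencing the presentation was about.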

It is already seeing some application / innovation in the world of legal entity identifiers - GLEIF uses it for the trusted issuance of company legal entity identifiers in a chain back to a root of trust.

The data model is based on GQL - property graphs - and this is particularly useful for the automation of the delegation of authority.

It also has a neat feature called ‘graduated’ disclosure, where the verifier can first look at the metadata and ask ‘what type of stuff do you have in this VC?’ without external references.

Finally, there’s a possibility to build a ‘Ricardian Contract’ in which the clauses of a set of rules and contracts are wrapped up into a digest. This is something we considered doing (much less securely!) using a simple hash of a discussion thread in say Discourse. (@angus and other lawyers will love this one :wink: )

There’s also better protection from statistical correlation in case someone does get hold of a shedload of data and credentials and tries to infer something…
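The simple hash approach we considered for a ‘Ricardian Contract’ can be sketched like this (the posts are invented; a real Ricardian construction would sign and structure the clauses rather than just digest them):

```python
import hashlib

# Digest a discussion thread so an agreed set of rules can be referenced
# by a single fingerprint. Much less secure than the ACDC approach - any
# verifier must fetch and re-hash the whole thread to check it.
posts = [
    "Rule 1: members vote on proposals.",
    "Rule 2: quorum is 50% of members.",
]

def thread_digest(posts) -> str:
    h = hashlib.sha256()
    for post in posts:
        h.update(post.encode("utf-8"))
        h.update(b"\x00")  # separator so post boundaries matter
    return h.hexdigest()

original = thread_digest(posts)
edited = thread_digest(posts + ["Rule 3: added later."])
print(original != edited)  # any change to the thread changes the digest
```

That single digest could then be embedded in a credential, so everyone can check whether the rules they are reading are the rules that were agreed.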


I was thinking about just this the other day, and wondering, if there are so many VC providers out there, would there be a way or standard to connect or link to others? It may be outside what is described here, but reading this gives me inspiration :slight_smile:


Yes, Authentic Chained Data Containers represent a streamlined, yet capable and powerful way to express your personal data graph including your metadata and schemas (organization and structure of your data).

As far as how the standards would link or connect to others, the design of the Decentralized Identifier specification - DIDs, DID methods, DID documents, and DID resolvers - provides precisely the facilities needed for interoperability of the core identifiers.

Credentials are the next big thing. Compatibility across credential types, ecosystems, and processes requires detailed schema reconciliation to know precisely what equivalences, similarities, and differences exist, down to the specific attributes. The mechanisms for relating credential systems also permit straightforward interoperability, because each credential type can contain the other credential types - if a bit awkwardly for some.

An additional credential parser, or data processor, is needed for each type of credential you work with. It’s only a matter of time before basic parser stacks are created that integrate multiple ecosystems.
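The ‘one parser per credential type’ idea can be sketched as a simple registry that dispatches on the declared type - the type names and field layouts below are invented placeholders, not real format definitions:

```python
# Sketch of a per-type parser stack: each credential format gets its own
# parser, and a registry dispatches to the right one.
def parse_w3c_vc(cred):
    # Hypothetical W3C-VC-style layout.
    return {"subject": cred["credentialSubject"]["id"]}

def parse_acdc(cred):
    # Hypothetical ACDC-style layout with compact field names.
    return {"subject": cred["a"]["i"]}

PARSERS = {
    "w3c-vc": parse_w3c_vc,
    "acdc": parse_acdc,
}

def parse(cred_type, cred):
    """Normalise any supported credential to a common internal shape."""
    parser = PARSERS.get(cred_type)
    if parser is None:
        raise ValueError(f"no parser registered for {cred_type!r}")
    return parser(cred)

print(parse("w3c-vc", {"credentialSubject": {"id": "member-42"}}))
```

Adding support for a new ecosystem then means registering one more parser, rather than rewriting the application - which is the ‘basic parser stacks’ integration path described above.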

I don’t believe a common, shared data model for verifiable credentials or data containers (ACDCs) will ever land, since so much of data modelling is bespoke to each organization; though for the basic attributes that are overwhelmingly present in many places, there will be convergence on a small, minimal set.

Credential schema registries are an attempt to address this problem, though whether local or global registries are the best option remains to be seen. From a security perspective, local credential schema registries in your own domain (namespace) are best, since you have full control and can protect against schema revocation and schema malleability attacks. Global credential schema registries are vulnerable to those sorts of attacks, yet are still very useful, so they will likely remain popular for applications where the highest levels of security are not required.
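One way a local registry resists malleability, sketched with an invented schema: key each schema by a hash of its own content, so a swapped-in (‘malleated’) schema simply fails to resolve against the pinned identifier.

```python
import hashlib
import json

def schema_hash(schema: dict) -> str:
    """Content-addressed identifier for a schema."""
    return hashlib.sha256(json.dumps(schema, sort_keys=True).encode()).hexdigest()

# Local registry in your own namespace: schemas are pinned by content hash.
# The schema itself is an invented example.
membership_schema = {"title": "CoopMembership", "required": ["memberOf"]}
REGISTRY = {schema_hash(membership_schema): membership_schema}

def resolve(schema_id: str, candidate: dict):
    """Accept a candidate schema only if it matches the pinned hash."""
    if schema_id not in REGISTRY or schema_hash(candidate) != schema_id:
        raise ValueError("schema does not match pinned hash")
    return candidate

pinned_id = schema_hash(membership_schema)
print(resolve(pinned_id, membership_schema) is membership_schema)  # True
```

A global registry, by contrast, asks you to trust whatever the registry serves for a given id; content-addressing like this is what gives the local option its stronger guarantees.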

I believe there will be a mix of both depending on the use case as time goes on.
