# Using Welshare Protocol and the SDK
> reference documentation for Welshare
This file contains all documentation content in a single document following the llmstxt.org standard.
## React SDK Usage
# Interacting with Welshare Profiles from Your Application
We deliberately keep all private and most of the correlatable information separated. Applications should not interact with private (or derived) keys of users, and users should not learn more about an application than what they absolutely must know to establish a trust relationship - as any data that they store with a particular application will automatically be shared with that application.
At the moment, web applications interact with a dialog frame that runs on the welshare.app domain and is secured by browser domain boundaries. We're currently fine-tuning the security aspects of this (quite common) solution. In case you want to try it out, make very sure not to promise anything to *your* test users that *we* cannot keep.
## Try our Demo Apps
- Findrisc: https://findrisc.diabetesdao.com/ submits Fhir QuestionnaireResponses for the LOINC panel implementation of the Finnish Findrisc Diabetes Risk Score specification. Public repo: https://github.com/welshare/diabetesdao-findrisc
- Reflex: https://reflex.welshare.app/ uses a custom proprietary data schema to store data that behaves like Wearables. Public repo: https://github.com/elmariachi111/reflex-master

## Install the library
The public integration library contains plain React code and doesn't contain or expose cryptographic features or Nillion connectivity. It's a plain implementation of a conversational message passing layer that orchestrates applications and the Welshare wallet:
(use npm, yarn, bun, whatever you fancy, we use pnpm: )
```
pnpm add @welshare/react
```
### Configuration Prerequisites
- To integrate this, you need to [register a new application id](./API/register-app.md). We're referring to this as `VITE_APP_ID` in the code example. This id is public.
- When you're submitting a questionnaire response, you first must [register a new questionnaire](./API/register-questionnaire.md) with your app.
- Your app is responsible for transforming any form data into valid `QuestionnaireResponse`s that validate against your schema. You then pass the common `QuestionnaireResponse` schema id provided by Welshare (`b14b538f-7de3-4767-ad77-464d755d78bd`) to the `submitData` function.
- Your response must include a reference to the questionnaire that it responds to. Set your `QuestionnaireResponse`'s `questionnaire` field to the questionnaire id that you've received when you created it. We're referring to this id as `VITE_QUESTIONNAIRE_ID` in the code example.
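For orientation, the object your app hands to `submitData` might look roughly like this (a sketch; the linkIds and answer types depend entirely on your Questionnaire):
```ts
// A minimal QuestionnaireResponse as your app might construct it (illustrative values only).
const response = {
  resourceType: "QuestionnaireResponse",
  questionnaire: import.meta.env.VITE_QUESTIONNAIRE_ID, // the questionnaire you registered
  status: "completed",
  item: [
    {
      linkId: "item-1", // must match an item linkId of your registered Questionnaire
      answer: [{ valueDecimal: 27.4 }], // the answer[x] type must match the item's type
    },
  ],
};
```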
## Submit Data to Welshare
```tsx title="submit-data.tsx"
import { ConnectWelshareButton, Schemas, useWelshare } from "@welshare/react";

export function QuestionnaireForm() {
  const { isConnected, openWallet, submitData } = useWelshare({
    applicationId: import.meta.env.VITE_APP_ID,
    environment: "development", // optional, at the moment the environment is always "development"
    callbacks: {
      onUploaded: (payload) => console.log("Data uploaded:", payload),
      onError: (error) => console.error("Error:", error),
      onSessionReady: (storageKey) => console.log("Session ready:", storageKey),
    },
  });

  const handleSubmit = () => {
    // `response` is a QuestionnaireResponse compatible object built from your form data.
    // Make sure its `questionnaire` property refers to your questionnaire definition.
    submitData(Schemas.QuestionnaireResponse, {
      ...response,
      questionnaire: import.meta.env.VITE_QUESTIONNAIRE_ID,
    });
  };

  // Using the `ConnectWelshareButton` is not mandatory. Use your own buttons if you feel like it
  // (`openWallet` opens the wallet dialog directly).
  return (
    <>
      {!isConnected ? (
        <ConnectWelshareButton>Connect to Welshare</ConnectWelshareButton>
      ) : (
        <button onClick={handleSubmit}>Submit questionnaire</button>
      )}
    </>
  );
}
```
---
## Authentication
One of Welshare's main long term design objectives is guaranteed, provable privacy. Hence, a cornerstone of the protocol design is to derive purpose-bound accounts from user root keys that differ by application context. This approach improves the privacy of each entity on the protocol: it's impossible for third parties to correlate identifiers with the root accounts (i.e. users). If you're interested in the technical background, check out our [key derivation docs](../basics/key-management.md).
:::warning Alpha Notice
Like many other concepts of the Welshare protocol, this idea is quite experimental, its implementation is not final, and the decorrelation guarantees are brittle: right now we use just one derived key for all subsequent operations, which of course does not sufficiently decorrelate user accounts yet.
:::
## Logging in / Root Profile Control
Every Welshare user completes a [Siwe login process](https://docs.login.xyz/) with a keypair that serves as their account root key. They can either use their favorite crypto wallet or create a dedicated [embedded (app) wallet](https://docs.privy.io/wallets/overview#common-usage) that's secured by our partner Privy. Privy's "self-custodial" wallets provide similar security guarantees as non-custodial wallets and they are recoverable by a social login of your choosing.
## User Storage Keys
When starting a data submission, users derive a new secp256k1 keypair using a deterministic EIP-712 signature by their root key and salt values for added entropy. They are asked to do so in the respective dialogs or can manually create (and locally revoke) it on the Welshare wallet app.

Those keys are stored in local browser storage and are used to sign off secret data transfers. They should by no means be used to control any asset on any chain. Storage keys are used by users to interact directly with Nillion nodes (that's where they got their name from) after Welshare has delegated NUCs to them.
---
## Applications
If you're a health service, a DeSci DAO or any client that wants to interact with Welshare profiles or build on Questionnaires, your first step is to sign up with an EVM key (this can be a custodial wallet or an MPC wallet): https://wallet.welshare.app/application

## Creating application ids
Creating several application instances with one control key allows you to e.g. run several websites that request user data while keeping their submissions separated. If you're running various projects as a company - e.g. [CerebrumDAO](https://www.cerebrumdao.com/)'s [Transfidelity](https://www.cerebrumdao.com/projects/project-transfidelity) and [Percepta](https://www.cerebrumdao.com/projects/percepta-brain) spinouts - you can create separate application ids for them to separate their data submissions.
Unique applications are identified by a custom string of your choice and callback URLs that certain protocol components can use to gate incoming traffic, e.g. using CORS headers on wallet frontends. Signing the registration message [derives an application specific keypair](../basics/key-management.md) that your application will use to authenticate server side requests or sign off requests by your users.

### Application Registration and Data Access
```mermaid
sequenceDiagram
participant D as DAO
participant W as welshare wallet app
participant A as welshare api
participant N as nillion
%% DAO authentication flow
D->>W: Registers App with wallet
W->>A: stores app metadata
A->>N: saves app metadata
N->>D: returns app id
Note over D,A: this is not in active use right now:
D->>D: self signs scoped JWT
D->>A: uses JWT to request application data
```
### Using application keys to request questionnaire data
This makes use of self signed JWTs under the hood. They are created when an application interacts with welshare APIs from the frontend, e.g. to request their questionnaire submissions.
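Roughly, the pattern looks like the sketch below. The endpoint path and scope string are illustrative placeholders, not the canonical API, and the JWT is created the same way as the `createJWTForStorageKeys` helper shown later in the storage flow docs, just signed with the application keypair:
```ts
// Sketch only: the endpoint path and scope naming below are illustrative placeholders.
const appJwt = await createJWTForStorageKeys(applicationKeypair, {
  scopes: [`read:questionnaire[${questionnaireId}]`],
});

const res = await fetch(
  // hypothetical endpoint for an application's questionnaire submissions
  `https://wallet.welshare.app/api/questionnaire/${questionnaireId}/responses`,
  { headers: { Authorization: `Bearer ${appJwt}` } }
);
const submissions = await res.json();
```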
```mermaid
sequenceDiagram
participant D as DAO
participant W as welshare wallet app
participant A as welshare api
participant N as nillion
%% DAO authentication flow
D->>W: requests questionnaire data
D->>W: derives application keypair from wallet keys
D->>D: self signs read:questionnaire[id] JWT
D->>W: provides JWT for authentication
W->>A: calls questionnaire endpoint
A->>A: verifies signature, and app permissions
A->>N: filters requested data
N->>A: returns filtered data
A->>D: questionnaire data
```
### How's that different from a cloud infra and where's the HPMP?
Besides the key derivation and JWT signing, you might ask: what's the benefit here for application providers, and why are we using rather complex logic to run this?
At the moment of writing (Sep 25), this is due to the preliminary state of the Nillion network. Applications actually *will* become the instances [to interact with Nillion nodes directly](../basics/data-storage/storage-flow#tbd-applications-access-owned-collections-directly) - they must be fully authenticated against Nillion nodes, which right now requires them to subscribe individually to Nillion and keep those control keys secret.
In the flow depicted above, the *welshare API* can intercept the actual user data. Provided that Nillion will eventually allow users to grant ACL access to *unsubscribed* "builders" (= apps), the welshare "middleman" will become obsolete.
#### Users Implicitly Grant Read Access to HPMP
The bolder goal that welshare as a company is following here is to ask users to share their data with a trusted blind computation subsystem that runs inside a decentralized trusted execution environment. This allows applications and fourth party research agents to profit from the shared data storage infrastructure.
---
## Questionnaires
Welshare can help you collect Questionnaire responses. We're building upon a baseline of [Fhir schemas](../basics/data-standards.md), so responses and questionnaire definitions should be generally interoperable with other systems and vice versa.
## Crafting a new Questionnaire
The first thing to do after having registered as an application is to switch over to your favorite form builder / preview tool and build a valid Fhir Questionnaire.
:::info
At the moment we're not offering a form builder of our own, for obvious reasons: it's not trivial to build one, and there are insanely good ones publicly available. We've tested many of them for simple and complex scenarios, also tried out clinical standard forms (LOINC panels), and we're quite positive that this is a good start for Alpha state software.
:::
That being said, don't be too sad if some Questionnaire code that you found on the internet breaks our inline preview - it likely just contains a feature that our default renderer doesn't handle well. You should still be able to register the Questionnaire and collect responses, but you are responsible for *rendering* it yourself. Some Fhir questionnaire form builders that we recommend, in no particular order:
- https://form-builder.aidbox.app/ very visual with different options to preview the questionnaires using arbitrary renderers.
- https://smartforms.csiro.au/playground CSIRO.au provides an excellent Fhir library, and while they offer far less UI support, it's a great way to quickly validate your definitions
- https://formbuilder.nlm.nih.gov/ The NIH's NLM form builder is feature rich and likely the most advanced tool.

If you're interested in best practices for crafting questionnaires, [check out this documentation by eHealth Suisse](https://ehealthsuisse.github.io/EPR-by-example/Questionnaire/)
## Registering your Questionnaire with Welshare
Visit your application page and hit the "New Questionnaire" button. On the following page, paste the JSON definition file that you've exported from your favorite tool. We immediately validate that it conforms to the Fhir Questionnaire specs and display any errors.
If the form is valid, you can render a preview to check that it'd likely work as expected. We use [CSIRO's Smart Forms Renderer](https://www.npmjs.com/package/@aehrc/smart-forms-renderer) to render generic forms. You'll likely want to present a far more customized version to your users, however.

### Overwritten fields
Upon pasting your new Questionnaire definition file, we automatically add (or replace) the `publisher` field with the `did:nil:` identity that represents the application you're registering the questionnaire for. If your Questionnaire definition contains an `id` field, it remains untouched; otherwise we override it with the technical database `_id`.
## Creating Form Frontends
A Fhir Questionnaire document resource implies the values of respective QuestionnaireResponse documents. When users submit new records according to that response schema, those responses are validated against the referenced Questionnaire instance. Applications are supposed to come up with virtually *any* custom questionnaire renderer interface that transforms responses into appropriate QuestionnaireResponse documents. We're offering a simple json endpoint that allows you to download the Questionnaire specification using its public unique _id, e.g. `https://wallet.welshare.app/api/questionnaire/49a07119-42db-427c-ae10-d83c76466e31`
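A minimal sketch of pulling that definition into a frontend (assuming the endpoint returns the raw Questionnaire JSON as registered):
```ts
// Fetch a registered Questionnaire definition by its public _id.
const QUESTIONNAIRE_ID = "49a07119-42db-427c-ae10-d83c76466e31";

const res = await fetch(
  `https://wallet.welshare.app/api/questionnaire/${QUESTIONNAIRE_ID}`
);
if (!res.ok) throw new Error(`Failed to load questionnaire: ${res.status}`);

const questionnaire = await res.json(); // a Fhir Questionnaire resource
// questionnaire.item now drives whatever custom renderer you build
```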
Modern LLM based UI builders are shockingly good in rendering a form frontend for that specification, particularly if you make sure to add the Fhir docs to their context. Here's what [v0](https://v0.dev) makes out of a trivial one shot prompt ([Try it](https://v0-fhir-questionnaire-rendering.vercel.app/)), here's another [demo repo](https://github.com/welshare/demo-saq) that submits data for real.
Prompt
> Here's a JSON file that contains a Questionnaire definition according to the Fhir standard: https://wallet.welshare.app/api/questionnaire/49a07119-42db-427c-ae10-d83c76466e31 . Render a nice looking frontend that collects information according to the form, converts it into an appropriate QuestionResponse format upon submission and logs the submitted response to the browser console.

### Overwritten Fields on QuestionnaireResponses
Before storage, we overwrite the following fields of a submitted response, since we cannot trust users to be honest about them:
- `authored` is replaced with the submission date ("now"). This will always be an ISO timestamp in UTC.
- `status` is set to `completed`
- `subject` is set to the submitting user's profile did
- `author` is set to the controller did of the application that created the questionnaire
- `source` is set to the `_id` of the application that created the questionnaire
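For illustration only (the values below are placeholders showing which fields get overwritten), a stored response roughly ends up looking like this:
```ts
// Illustrative only - placeholder values, shown to highlight the overwritten fields.
const storedResponse = {
  resourceType: "QuestionnaireResponse",
  questionnaire: "<your questionnaire id>",
  item: [/* the answers your frontend collected */],
  // overwritten by Welshare on submission, regardless of what you send:
  authored: "2025-09-30T12:34:56Z", // submission time, ISO timestamp in UTC
  status: "completed",
  subject: "<submitting user's profile did>",
  author: "<controller did of the questionnaire's application>",
  source: "<_id of the application that created the questionnaire>",
};
```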
### Using the Generic Form Frontend
Mostly for testing reasons we're offering a generically rendered form frontend that you can simply send to your users to have them fill it out. We discourage this approach in favor of a far more appealing onboarding experience.
https://wallet.welshare.app/questionnaire/49a07119-42db-427c-ae10-d83c76466e31
---
## Applications (2)
Welshare avoids collecting any data on its own. Instead, *applications* are supposed to utilize the protocol's data storage capabilities by registering application accounts and delegating capacity to their users.
During the Alpha phase the welshare protocol is free to use for applications - ultimately the protocol will likely ask them to contribute to the involved (Nillion) storage and key management costs, which can potentially also be covered by usage fees paid by other parties (e.g. research agents).
Here's our [application registration guide](../API/register-app.md)
## Keypairs
Similar to how users derive their storage keys, applications derive purpose bound interaction keys that identify them on the Nillion network. The respective DIDs start with `did:nil:` followed by the public key part.
## Delegations
Welshare itself has no intrinsic interest to access user profile data directly. We instead ask applications to use the protocol and store their users' information on their respective Welshare profile. Conceptually, users always stay in control over their own information but to be able to *store* data, they somehow must interact with the storage layer.
This works by *delegating* a [NUC policy token](https://docs.nillion.com/build/permissions-and-payments#nuc) from a `builder` account (welshare) to the end user. To figure out whether a delegation request is legit, it must be signed off by the application that "guarantees" the legitimacy of the user's request. Details on how this works under the hood can be found [in the fundamentals section](./data-storage/storage-flow.md).
## Schemas
Application based *custom* schemas are a topic that we're actively discussing (as it's touching Nillion's core offering). At the moment we're not offering a public endpoint that would allow applications to register custom schemas or collections using the welshare builder account.
## Questionnaires
The current main demo use case is built around Questionnaires and user responses. Applications are free to create new Questionnaire definitions (see [Crafting a New Questionnaire](../API/register-questionnaire.md)) and build frontends that submit their users' responses using the self signed / delegated data submission flow. Our [React SDK](../sdk.md) helps with integrating Welshare Wallet features directly into apps. Here's a fully vibe coded demo that shows how apps can generate a fully working, highly customized, standards based questionnaire submission website without knowing much about the internals: https://github.com/welshare/demo-saq
---
## Data Standards
## It all starts with Fhir
[HL7/FHIR](https://hl7.org/fhir/) (or Fhir for short 🔥) is *the* de facto industry standard to schematize health related information. At first, it's sufficient to know that virtually any system storing electronic health records (EHR) across the world has agreed on this specification. There are many relevant industry products that allow national administrations to store and organize their populations' health data. However tempting, Welshare is not getting into a debate over the highly centralized nature of those solutions; we'd rather embrace the schematics and standards the industry has agreed on and use them as the foundation of all data that our users are storing.
We obviously cannot be fully feature complete against an industry behemoth of a standard, so we mostly utilize Fhir [to capture Questionnaires and their responses](https://medblocks.com/blog/a-beginners-guide-to-fhir-questionnaire-question-types).
## Questionnaires
One of the most commonly requested features is to submit Fhir compatible QuestionnaireResponse resources to user profiles. We've created two collection schemas that accept `Questionnaire` resources and `QuestionnaireResponses`.
The Questionnaire schema is a standard collection, only writable by the Welshare Nillion builder. Here's [our documentation](../API/register-questionnaire.md) on how applications can create a new questionnaire definition without any code.
These are the official documented specifications:
- Questionnaire: https://www.hl7.org/fhir/questionnaire.html
- QuestionnaireResponse: https://www.hl7.org/fhir/questionnaireresponse.html
Before you start coming up with your own questionnaires, it might be worthwhile to parse through those that already have been thoroughly specified, coded and defined, e.g.
- LOINC's definition of the Finnish Findrisc panel: https://loinc.org/97055-8
- a collection of commonly used Questionnaires by NIH/NLM: https://lhcforms.nlm.nih.gov/lhcforms
## Reflex Data (as a standin for Wearables)
Besides questionnaire responses, welshare has been built around the idea of combining several information contexts into one queryable user profile, particularly reading and aggregating raw wearable readings. Until we're at the point where we can safely store that data, we're simulating it, e.g. with our very own [Reflex App](https://reflex.welshare.app). It's a demo of how to store arbitrary data associated with a user profile and extract aggregated information from it again.
---
## Nillion - A Blind Data Backend
# Welshare runs on Nillion: A Network for Blind Compute
Welshare's data storage infrastructure is powered by Nillion, a network architecture for secure computation. We can store and process sensitive data while maintaining privacy by applying modern cryptographic techniques, eliminating the need to trust any single entity with raw data.
Nillion nodes can be recruited into clusters for specific privacy-enhancing technologies (PETs). Each node operates one or more specialized modules that handle different types of secure computation. When moving forward towards a production ready environment, Welshare plans to become part of a Nillion based cluster that specifically deals with securing health related information.
Nillion's [Blind Modules](https://docs.nillion.com/learn/blind-modules) allow for additive secret sharing that splits data into mathematical shares distributed across multiple nodes. Clients with sufficient access permissions can reconcile the shares to decrypt the original information. These operations require a minimum threshold of nodes to collaborate, but even if some node fails to respond the data remains available. Individual nodes cannot reconstruct the original data from their shares on their own.
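For intuition, here's what additive secret sharing looks like in its textbook n-of-n form (a conceptual toy only; Nillion's actual scheme adds thresholds and fault tolerance on top, which this sketch omits):
```ts
// Textbook n-of-n additive secret sharing over a prime field - NOT Nillion's wire protocol.
const P = 2n ** 61n - 1n; // an arbitrary prime modulus for the example

function randomFieldElement(): bigint {
  // toy randomness; real implementations use a CSPRNG
  return BigInt(Math.floor(Math.random() * Number.MAX_SAFE_INTEGER)) % P;
}

function split(secret: bigint, numShares: number): bigint[] {
  const shares: bigint[] = [];
  let sum = 0n;
  for (let i = 0; i < numShares - 1; i++) {
    const share = randomFieldElement();
    shares.push(share);
    sum = (sum + share) % P;
  }
  // the last share makes all shares add up to the secret again
  shares.push(((secret - sum) % P + P) % P);
  return shares;
}

function reconstruct(shares: bigint[]): bigint {
  return shares.reduce((acc, share) => (acc + share) % P, 0n);
}

// each node holds one share and learns nothing about the value on its own
const shares = split(120n, 3);
console.log(reconstruct(shares) === 120n); // true
```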
Nillion supports basic FHE sum operations, using the [Paillier Cryptosystem](https://en.wikipedia.org/wiki/Paillier_cryptosystem) to run summation aggregations over encrypted data. The network operates [Trusted Execution Environments](https://learn.microsoft.com/en-us/azure/confidential-computing/trusted-execution-environment) (TEEs), which leverage CPU-level security features for sensitive operations. One application that's particularly relevant for Welshare is **private LLM Inference**. This allows the envisioned [HPMP](https://mirror.xyz/stadolf.eth/gjxWAnMSXZdR_lkMgIlgaeaAMoRNyc8lpBao2yhJrfs?referrerAddress=0xE231B4e55fE1D0Afb3e746e64E78eEffB5b599d1) to run AI models and patient-data-based retrieval-augmented generation (RAG) enrichment inside the TEEs.
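The property that makes Paillier suitable for these encrypted sums is its additive homomorphism: the product of two ciphertexts decrypts to the sum of the underlying plaintexts.
```latex
% Paillier's additive homomorphism (textbook form)
\mathrm{Dec}\!\left(\mathrm{Enc}(m_1)\cdot \mathrm{Enc}(m_2) \bmod n^2\right) = (m_1 + m_2) \bmod n
```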
## Data Schemas and Collection Ownership
Nillion's private storage / nilDB builds upon a MongoDB foundation. If you know how to [interact with data on Mongo](https://www.mongodb.com/docs/manual/tutorial/query-documents/), you know how to query Nillion. To foster decentralization and secrecy aspects, Nillion layers a schema concept on top of raw data, which surfaces prominently when you're defining encrypted fields using shared secrets.
There are two conceptual kinds of collections in Nillion's private storage. `standard` collections allow their owner to write, delete and query / filter arbitrary data - very similar to how a typical database collection would behave. In contrast, `owned` collections can be directly accessed by accounts the collection owner delegates access tokens to. These delegates use their individual delegate tokens to interact directly with the Nillion network. Records in `owned` collections carry record level ACL rules that are obeyed by the individual Nillion nodes.
### Authentication
All requests issued for Nillion nodes must be authenticated with an access token that identifies the *builder* or their delegate. At the time of writing *builders* are accounts that actively subscribe to Nillion's network services. They can delegate access permissions over their owned collections to other accounts. Conceptually each access token requires a subscribed **builder** to sign it off, see [Nillion's docs](https://nillion.pub/secretvaults-ts/classes/SecretVaultBuilderClient.html) and the [api access](https://docs.nillion.com/build/network-api-access) docs for reference.
## Earned Security Benefits
Welshare is not operating any data storage service that stores user identifiable information. We also don't use a formally trusted and compliant cloud provider right now. We prefer incentivized sovereignty and follow a "prove don't trust" notion over corporate claims and SLA promises. In the mid term we're headed towards a fully end to end encrypted and user owned system.
That being said, during our Alpha rollout phase, as the root *builders* of the protocol **we are technically able to read records by users** that wrote data directly into Nillion nodes. This major operator leak will be closed by a solution we're working on with Nillion - or that we need to address by making data collecting applications builders themselves (which is very costly at the moment).
---
## Data Storage - A Primer
# Data Storage: A Primer
There are many options for how one could store user information in a decentralized context. First, think of how storage and persistence is typically solved in permissioned environments like AWS RDS / S3 or ORM / SQL providers like Prisma, Neon or Supabase. If you were to start some patient record management system from scratch, you'd define some database schema, write mappers (or let an ORM handle that for you) and attach query APIs like REST or GraphQL. Rolling this out means migrating a certain object relational or document oriented schema to some infrastructure that you might or might not control yourself.
### The Security Bottlenecks in Classical Databases
At some point many applications, developers, users or product owners will stumble upon an obvious question: **who can actually access that data?** And we're not talking about an authentication issue here; user information is usually sent over encrypted channels but could be intercepted at some debug log sink (when it's decrypted to be inserted into the database).
Not only your Kafka stream processor but also most relational databases keep around a write log with cleartext data that can be replayed to recover from failures or to provision replicas. Administrators with root access can gain read access to the filesystem or might even be able to unmount drives and read their data from the comfort of their own home.
Long ago database vendors started ideating good encryption concepts to avoid this situation, and the solution landscape addressing the fundamental security gaps has evolved significantly.
### Modern Database Encryption: The State of the Art
Contemporary database management systems employ a multi-layered encryption approach that addresses data protection at every stage of its lifecycle. **Encryption in transit** secures data moving between clients and servers against man-in-the-middle attacks. **Encryption at rest** protects data when it's stored on disk. This encompasses several sophisticated techniques: **transparent data encryption (TDE)** encrypts entire database files, including data files, log files, and backup files, while **column-level encryption** allows for granular protection of sensitive fields like social security numbers or credit card information. **File-level encryption** extends protection to the operating system level, ensuring that even if someone gains access to the underlying storage, the data remains unintelligible without proper decryption keys.
### Who holds the keys and knows the data?
The encryption key management landscape has also matured considerably. **Hardware Security Modules (HSMs)** provide tamper-resistant environments for storing and managing encryption keys, while **key rotation policies** ensure that compromised keys can be replaced without data loss.
In cloud provider scenarios it's not uncommon to see a **Bring Your Own Key (BYOK)** model that allows organizations to maintain control over their encryption keys even when using cloud database services, addressing the fundamental trust issue of cloud providers having access to customer data.
### Adding scifi cryptography
**Query-level encryption** represents the [cutting edge](https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-database-engine?view=sql-server-ver17), where individual queries can be encrypted before reaching the database, and results can be returned in encrypted form. This approach, combined with **homomorphic encryption** techniques, allows for certain types of computations to be performed on encrypted data without decryption. **Zero-knowledge proofs** enable verification of data properties without revealing the underlying information, opening possibilities for privacy-preserving analytics and compliance verification.
### Trusted Parties can Intercept Data
Despite these advances, traditional encryption approaches still face fundamental limitations. **Backup and replication systems** often require decryption for processing, creating temporary vulnerabilities. **Database administrators** with sufficient privileges can still access decrypted data during maintenance operations.
**Application-level encryption** can provide additional protection, but it complicates querying and indexing, often requiring architectural trade-offs between security and performance. These limitations highlight why decentralized storage solutions, with their inherent cryptographic guarantees and elimination of trusted intermediaries, represent such a compelling alternative for truly sensitive data.
## The Complexity of Decentralization
### The Beauty of Permissionless End to End Encryption
This is where decentralized storage solutions come into play. They shift the paradigm from a traditional "trust but verify" model to a fundamentally trustless architecture. **Permissionless end-to-end encryption** eliminates the need for trusted intermediaries entirely - no database administrators, no cloud providers, no backup systems that could eventually hold keys to the plaintext. User data is encrypted with keys that only they control, stored across a distributed network where no single node has access to the complete, decrypted dataset.
This approach not only solves the fundamental trust issues of centralized systems but also provides **censorship resistance** - data cannot be taken down, modified, or accessed by any authority, regardless of their technical capabilities or legal jurisdiction. The result is a storage system where security is not a feature you configure, but a fundamental property of the architecture itself.
The solution domain is manifold, and it can be approached from different angles. Welshare chooses to rely on Nillion's "blindfold" key share-distribution idea and their TEE based compute and database engines. The next chapter explains [how Welshare utilizes Nillion technology](./nillion.md) in detail.
---
## Delegating Data Storage
:::info Only for reference
The code on this page is only here for reference and illustrative reasons (e.g. if you want to build custom interactions with our storage layer). If you just want to integrate your users and data, check out the [Authenticating Apps and Users](../../API/authenticating.md) docs.
:::
## Schemas
Welshare is hosting Nillion [owned and standard collections](./nillion.md) on behalf of authorized [applications](../applications.md). Our goal is to abstract away the cryptographic complexities of the underlying storage layer for applications who just want to safely store their users' information. Right now applications can only store data in collections that we're providing, but we're planning to allow them to bring their own schemas or even reuse Welshare derived keys to help their users write into schemas that they manage.
### Authorization
To be able to issue [Nillion delegate tokens (NUCs)](https://docs.nillion.com/build/permissions-and-payments#nuc) for our users, we must ensure that they're legitimate controllers of their keys and actually have a plausible intent to write that data in the context of an application that they're using. For that, users present a *self signed JWT* to our delegation endpoints. Since support for verifying self signed ES256K signatures inside JWTs [was abandoned by popular libraries in early 2025](https://github.com/panva/jose/discussions/767), they're not trivial to create with standard libraries. Here's a code snippet showing how we get the job done nevertheless:
### creating self signed jwts
```ts
import { secp256k1 } from "@noble/curves/secp256k1";
import { hexToBytes } from "@noble/hashes/utils";

// shape of Nillion dids (see the Keypairs section)
type DidString = `did:nil:${string}`;

/**
 * Represents a secp256k1 elliptic curve key pair with secure key handling
 */
declare class NillionKeypair {
  constructor(privateKey: Uint8Array);
  toDidString(): DidString;
  sign(msg: string, signatureFormat?: "bytes"): Uint8Array;
  sign(msg: string, signatureFormat: "hex"): string;
}

function base64urlEncode(data: Uint8Array | string): string {
  const bytes = typeof data === "string" ? new TextEncoder().encode(data) : data;
  let binary = "";
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]!);
  }
  return btoa(binary)
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=/g, "");
}

export async function createJWTForStorageKeys(
  userKeypair: NillionKeypair,
  payload: Record<string, unknown>
): Promise<string> {
  const header = {
    alg: "ES256K",
    typ: "JWT",
  };
  const now = Math.floor(Date.now() / 1000);
  const _payload = {
    iss: userKeypair.toDidString(),
    iat: now,
    exp: now + 3600,
    nonce: Math.random().toString(36).substring(2, 15),
    ...payload,
  };
  const encodedHeader = base64urlEncode(JSON.stringify(header));
  const encodedPayload = base64urlEncode(JSON.stringify(_payload));
  const message = `${encodedHeader}.${encodedPayload}`;
  const signatureBytes = userKeypair.sign(message, "bytes");
  const encodedSignature = base64urlEncode(signatureBytes);
  return `${message}.${encodedSignature}`;
}
```
Create a write scoped JWT for users like so:
```ts
const selfSignedJWT = await createJWTForStorageKeys(userKeypair, {
scopes: ["write"],
});
```
Users present this JWT when calling welshare's delegation endpoint `/api/auth/delegate`, and submit their storage key did in the request's body. The delegation endpoint's response contains a `delegation` object.
```ts
// create a nuc for users
const delegateResponse = await fetch("/api/auth/delegate", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${selfSignedJWT}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    audienceDid: userKeypair.toDidString(),
  }),
});

if (!delegateResponse.ok) {
  throw new Error("Failed to delegate");
}
const { delegation, audienceDid } = await delegateResponse.json();
```
That delegation allows users to write (and read) on Welshare's Nillion collections directly, e.g. using the [secretvaults SDK](https://docs.nillion.com/build/private-storage/quickstart#user-stores-private-data).
```ts
// that's the builder who should be able to read the user's data
// const builderDid = `did:nil:...`;
// to grant access to the HPMP, for now you grant access to welshare:
const builderDid = `did:nil:027a3fcce7f7b12061bb7d872d685b7cbcab838d4e74036ca394f504ea89169a9d`;

// the nodes to store data on (`baseUrls` takes a list of node URLs)
const nillionNodes = ["https://nildb-stg-n1.nillion.network"];

const userClient = await SecretVaultUserClient.from({
  baseUrls: nillionNodes,
  keypair: userKeypair,
});

const uploadResults = await userClient.createData(delegation, {
  collection: schemaId, // the Welshare-provided collection/schema id you're writing to
  owner: audienceDid,
  data: [record], // `record` is a placeholder for the document(s) conforming to the collection schema
  acl: {
    grantee: builderDid, // Grant access to the application
    read: true, // app can read the data
    write: false, // app cannot modify the data
    execute: true, // app can run queries on the data
  },
});
```
### Storing data on Nillion using delegated welshare NUCs
```mermaid
sequenceDiagram
participant M as member
participant W as welshare wallet app
participant D as Application
participant A as welshare api
participant N as nillion
%% DAO authentication flow
D->>W: render survey frontend
M->>D: fills out questionnaire e.a.
D->>W: Opens new wallet dialog for user
M->>W: signs in (e.g. privy)
M->>W: derives storage key from auth signature
M->>W: signs JWT over their did with storage key
M->>A: authenticates with JWT to request NUC (refers to app)
A->>M: grants storage delegation NUC
D->>W: submits signed data
W->>N: authenticates with NUC
W->>N: Uploads data
```
### Applications use Welshare to Access Owned Collections
```mermaid
sequenceDiagram
participant W as welshare wallet app
participant D as Application
participant A as welshare api
participant N as nillion
%% App authentication flow
D->>W: signs JWT over their application key
W-->>D: JWT
D->>A: auths with JWT
D->>A: requests aggregation
A->>N: requests docs / filters / aggregates
N-->>A:
A->>D: discloses insights
```
### TBD: Applications Access Owned Collections Directly
```mermaid
sequenceDiagram
participant W as welshare wallet app
participant M as Member
participant D as Application
participant A as welshare api
participant N as nillion
M->>W: requests a NUC
W-->>M: delegates write access for collection
M->>N: uses NUC to store data on Nillion
N->>D: grants "read" ACL on the data point
N-->>A: grants "read" ACL for HPMP
D->>N: requests data / runs aggregation
N-->>D: yields only information shared with the app
```
---
## Key Derivation
# Key Management and Derivation
To preserve the protocol user's privacy, we decorrelate user "control" accounts while keeping the relationship between root keys and derived keys provable in zero knowledge.
## Using Privy for Root Keys
[Privy](https://privy.io) is a wallet provisioning service that combines the best of all worlds. It can manage [user owned embedded wallets in TEEs](https://docs.privy.io/wallets/overview#common-usage), works across many chains, comes with a battle-tested UI, runs in arbitrary environments and on arbitrary devices, has [proven to be secure at scale](https://docs.privy.io/security/wallet-infrastructure/architecture#concepts), and it integrates like a charm with Wagmi and React libraries. Also, it doesn't stand in the way when users want to bring their own wallets - the usage difference is really only the UI. That's why we decided to trust Privy to provide key material for users who absolutely don't want to bother with any crypto terms.
## Why crypto keys are really bad for privacy, even when you control them
In his [opinion piece from April](https://vitalik.eth.limo/general/2025/04/14/privacy.html), Vitalik underlines the importance of privacy and that today's understanding of zkSNARKs is perfectly sufficient to operate privacy preserving protocols. Vitalik argues that while cryptographic systems provide security guarantees, they create a fundamental privacy problem through key correlation. Once someone's cryptographic identity (key/address) is known or linked to their real identity, all their past and future cryptographic activities using that key become traceable and correlatable.
His thesis is that cryptographic systems, while secure, inherently compromise privacy through persistent key-based identity correlation, but this problem can now be solved through zero-knowledge proofs and other modern privacy technologies that maintain security while breaking the link between identity and transaction history.
## Using HKDF to derive keys for purpose specific usage
There's no shortage of options for how keys could be derived from basic entropy. The most obvious choice is to go with well known BIP-32 HD key derivation, which lets you derive an arbitrary number of accounts from a random secret.
Our requirements are slightly different, however: we want to deterministically derive keys using a replayable piece of information that users can only create with their control key, plus secrets that are either known to the user or to parties the user trusts. The key derivation mechanism we choose must also work for smart contract accounts (e.g. Safe or [EIP-7702](https://eips.ethereum.org/EIPS/eip-7702) contracts) and support signatures conforming to [EIP-1271](https://eips.ethereum.org/EIPS/eip-1271), [EIP-7913](https://eips.ethereum.org/EIPS/eip-7913) and eventually [EIP-7739](https://eips.ethereum.org/EIPS/eip-7739).
[RFC 5869](https://datatracker.ietf.org/doc/html/rfc5869) (HMAC-based Extract-and-Expand Key Derivation Function (HKDF)) provides a cryptographically robust alternative to BIP-32 for general-purpose key derivation. The extract-then-expand paradigm first concentrates entropy through HMAC-based extraction, then generates multiple derived keys through controlled expansion.
Our specific implementation is rooted in a 2025 paper from [IBM, ETH Zürich, and TU Darmstadt](https://eprint.iacr.org/2025/657.pdf) that explores how to add several inputs into the key derivation function.
For secp256k1 applications, HKDF excels in ECDH key agreement scenarios where parties derive symmetric keys from shared secrets. The system ensures derived keys meet secp256k1's range requirements (0 < key < n) through iterative generation with incrementing info parameters.
HKDF's strength lies in its formal security analysis and general applicability. It serves broader cryptographic protocols while maintaining provable security under the pseudorandom function assumption for HMAC.
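In RFC 5869 terms, the two phases (instantiated with HMAC-SHA256, as in the code below) are:
```latex
% Extract: concentrate the input keying material (IKM) into a pseudorandom key
\mathrm{PRK} = \mathrm{HMAC}(\mathrm{salt},\ \mathrm{IKM})
% Expand: iterate until enough output keying material (OKM) is produced
T(0) = \text{empty},\quad T(i) = \mathrm{HMAC}\big(\mathrm{PRK},\ T(i-1)\,\|\,\mathrm{info}\,\|\,i\big),\quad \mathrm{OKM} = T(1)\,\|\,T(2)\,\|\,\dots
```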
This is actual code that runs when users derive storage or application keys by signing EIP-712 derivation messages with their root control wallets (injected or privy):
```ts title="sessionKeys.ts"
import { Address, Hex, hexToBytes, PublicClient, WalletClient } from "viem";
import { deriveKey } from "./key-derivation";

interface SessionKeyAuthMessage {
  keyId: string;
  context: string;
}

interface AuthorizedSessionProof {
  message: SessionKeyAuthMessage;
  signature: Hex;
  signer: Address;
  timestamp: string;
}

interface SessionKeyData {
  sessionKeyPair: Nillion.Keypair;
  authorizationProof: AuthorizedSessionProof;
  authorizedBy: Hex; // Main wallet address
  signature: Hex; // Main wallet's signature authorizing this session key
}

/**
 * Generate a new purpose key pair and get authorization signature from main wallet
 */
export async function deriveAuthorizedKeypair(
  walletClient: WalletClient,
  authMessage: SessionKeyAuthMessage,
  userSecret: string,
  options?: {
    domainName: string;
    primaryType: string;
    extendedTypes: Array<{ name: string; type: string }> | undefined;
  }
): Promise<SessionKeyData> {
  if (!walletClient?.account?.address) {
    throw new Error("Wallet not connected");
  }
  const primaryType = options?.primaryType || "SessionKeyAuthorization";
  const domainName = options?.domainName || "Welshare Health Wallet";

  // note the alphabetical prop ordering is important to recreate the signature payload
  const typedData = {
    domain: {
      name: domainName,
      version: "1.0",
    },
    message: authMessage,
    primaryType,
    types: {
      EIP712Domain: [
        { name: "name", type: "string" },
        { name: "version", type: "string" },
      ],
      [primaryType]: options?.extendedTypes || [
        { name: "context", type: "string" },
        { name: "keyId", type: "string" },
      ],
    },
  };

  // Sign the authorization message with root wallet
  const bindingSignature = await walletClient.signTypedData(typedData);

  const _userSecret = new TextEncoder().encode(userSecret);
  const derivedKey = await deriveKey(
    _userSecret,
    hexToBytes(bindingSignature),
    authMessage
  );

  const authorizationProof: AuthorizedSessionProof = {
    message: authMessage,
    signature: bindingSignature,
    signer: walletClient.account.address,
    timestamp: new Date().toISOString(),
  };

  return {
    sessionKeyPair: derivedKey,
    authorizedBy: walletClient.account.address,
    authorizationProof,
    signature: bindingSignature,
  };
}
```
And the key derivation code, implementing [RFC 5869](https://datatracker.ietf.org/doc/html/rfc5869):
```ts title="/lib/key-derivation.ts"
import { secp256k1 } from "@noble/curves/secp256k1";
import { hmac } from "@noble/hashes/hmac";
import { sha256 } from "@noble/hashes/sha2";
import { concatBytes, utf8ToBytes } from "@noble/hashes/utils";

// `SessionKeyAuthMessage` is the { keyId, context } interface from sessionKeys.ts,
// `Nillion.Keypair` is the keypair type of the Nillion SDK (imports omitted here).

/**
 * User secret examples:
 * - Password-derived key: PBKDF2 output from user password
 * - Hardware wallet entropy: Internal randomness from secure element
 * - Mnemonic-derived: BIP32 master seed from recovery phrase
 * - App-specific secret: User's secret data for this application
 */
type UserSecret = Uint8Array; // 32 bytes of entropy

/**
 * Key ID examples:
 * - Sequential: 0, 1, 2, 3... for multiple keys
 * - Purpose-based: "signing", "encryption", "authentication"
 * - Account-based: "account-0", "account-1" for different accounts
 * - Feature-based: "email-key", "document-key", "chat-key"
 */
type KeyId = string;

/**
 * Context examples:
 * - Application: "myapp.com", "wallet.ethereum.org"
 * - Version: "v1.0", "beta", "production"
 * - Environment: "development", "staging", "production"
 * - Domain: "user@company.com", "tenant-123"
 */
type Context = string;

const COMMON_KDF_SALT = "SIGNATURE_INTEGRATED_KDF_v1";

/**
 * HKDF implementation using HMAC-SHA256
 */
export function hkdf(
  inputKeyMaterial: Uint8Array,
  contextInformation: Uint8Array,
  salt: Uint8Array = utf8ToBytes(COMMON_KDF_SALT),
  privateKeyLength: number = 32
): Uint8Array {
  // Extract phase (master key for expand phase)
  const pseudoRandomKey = hmac(sha256, salt, inputKeyMaterial);

  // Expand phase
  const output = new Uint8Array(privateKeyLength);
  const hashLen = 32; // SHA256 output length
  const n = Math.ceil(privateKeyLength / hashLen);
  let t = new Uint8Array(0);
  let outputPos = 0;

  for (let i = 1; i <= n; i++) {
    const input = concatBytes(t, contextInformation, new Uint8Array([i]));
    t = hmac(sha256, pseudoRandomKey, input);
    const copyLen = Math.min(hashLen, privateKeyLength - outputPos);
    output.set(t.subarray(0, copyLen), outputPos);
    outputPos += copyLen;
  }
  return output;
}

/**
 * Interpret big-endian bytes as a bigint (helper for the range check below)
 */
function bytesToBigInt(bytes: Uint8Array): bigint {
  let result = 0n;
  for (const byte of bytes) {
    result = (result << 8n) | BigInt(byte);
  }
  return result;
}

/**
 * Ensure the derived key material is valid for secp256k1
 */
function ensureValidSecp256k1Key(
  keyMaterial: Uint8Array,
  derivationData: Uint8Array
): Uint8Array {
  const n = secp256k1.CURVE.n; // secp256k1 curve order
  let candidate = keyMaterial;
  let counter = 0;

  while (true) {
    const keyValue = bytesToBigInt(candidate);
    if (keyValue > 0n && keyValue < n) {
      return candidate;
    }
    // If invalid, derive a new candidate
    counter++;
    const counterBytes = new Uint8Array(4);
    new DataView(counterBytes.buffer).setUint32(0, counter, false);
    candidate = hkdf(
      concatBytes(keyMaterial, counterBytes),
      utf8ToBytes("SECP256K1_RETRY"),
      derivationData,
      32
    );
    if (counter > 1000) {
      throw new Error(
        "Failed to generate valid secp256k1 key after 1000 attempts"
      );
    }
  }
}

/**
 * Create deterministic derivation data for a specific key
 */
function createDerivationData(authMessage: SessionKeyAuthMessage): Uint8Array {
  return utf8ToBytes(JSON.stringify(authMessage));
}

/**
 * Derive a new key with cryptographic binding to root key
 */
export async function deriveKey(
  userSecret: UserSecret,
  bindingSignature: Uint8Array,
  authMessage: SessionKeyAuthMessage
): Promise<Nillion.Keypair> {
  // Create derivation commitment
  const derivationData = createDerivationData(authMessage);

  // Use signature as additional entropy in multi-input KDF
  const derivedKeyMaterial = hkdf(
    concatBytes(userSecret, bindingSignature),
    derivationData
  );

  // Ensure the derived key is valid for secp256k1
  const privateKey = ensureValidSecp256k1Key(
    derivedKeyMaterial,
    derivationData
  );

  return Nillion.Keypair.from(privateKey);
}
```
## Proving Control of Derived Keys
It's not trivial for a user to prove to a third party that they control a key they derived from their root key, since the required signature and the salt values needed to *recover* the key would allow verifiers to *recreate* the actual key material.
We could let them cross sign a two sided "I am account x and I control account y" message - this would prove that the key holders both verifiably *claim* that they control both keys at the same time. This approach would fully disclose the relationship between purpose driven keys and root keys, thereby breaking privacy guarantees - particularly if the verification happens on a public blockchain.
As it turns out, it's very much feasible to generically prove control over both keys in zero knowledge by using the root signature as a secret input, using a well defined key derivation function as the circuit, and demonstrating a signature over a random public message.
If that message is chosen to be some uncorrelated new account created by the prover, a user could e.g. claim rewards for certain actions they performed with a derived key pair. They can claim the actual rewards using the fully uncorrelated account and create a nullifier that prevents executing the claim twice.
## Historically Proving EIP-1271 Signature Validity
EIP-1271 relies on chain state and contract functions to verify signature validity. Hence a contract signature's validity can change from one block to another, depending on the implementation (e.g. a signature created by a Safe's previous set of signers would no longer be considered valid once a transaction increasing the signer threshold has passed).
Proving in zero knowledge that some signature was valid at a certain point in time requires proving EVM execution and transaction ordering at that time. We ran a larger analysis on that topic earlier, and we're convinced that it will be possible to safely use EIP-1271 signers as root entities for Welshare profiles: https://welshare.notion.site/Proving-Historical-EIP-1271-Signature-Validity-with-Ethereum-State-Proofs-22b5be1dc95d80be8949f1bd7fc80f1e?source=copy_link
# Key Derivation Implementation Notes
While Claude Code helped us translate the key derivation process into Python code, we stumbled upon some issues related to differing interpretations of coding primitives. The following is a compaction of our learnings. The key derivation process creates Nillion `did:nil` keypairs from Ethereum EOA private keys using EIP-712 signature-based entropy and HKDF.
**Process Flow:**
1. Derive Ethereum EOA from BIP44 HD wallet (`m/44'/60'/0'/0/{index}`)
2. Sign EIP-712 structured message with Ethereum private key
3. Combine signature + user secret as HKDF input
4. Derive 32-byte key material using HKDF
5. Ensure key is valid for secp256k1 curve
6. Generate Nillion `did:nil` keypair from derived private key
## Critical Implementation Details
### 1. JSON Field Ordering (MOST CRITICAL)
**Problem:** JSON field ordering differs between TypeScript and Python.
**TypeScript behavior:**
```typescript
// JSON.stringify preserves insertion order, so the reference implementation
// constructs the message with its fields in alphabetical order:
const authMessage = { context: "nillion", keyId: "1" };
JSON.stringify(authMessage);
// Output: {"context":"nillion","keyId":"1"}
// Alphabetical ordering: context before keyId
```
**Python solution:**
```python
def to_dict(self) -> Dict[str, str]:
    # MUST preserve alphabetical order to match TypeScript
    d = {}
    d["context"] = self.context  # context FIRST
    d["keyId"] = self.key_id     # keyId SECOND
    return d

# Use separators to match TypeScript compact format
json.dumps(auth_dict, separators=(',', ':'))
```
**Why this matters:**
- This JSON is used as the HKDF context information
- Even one byte difference changes the entire derived keypair
- TypeScript's `JSON.stringify` preserves insertion order; the reference implementation inserts the fields alphabetically
- Python's `json.dumps` with `sort_keys=True` also alphabetizes, but the insertion order matters when `sort_keys=False`
**Test case result:**
- Wrong order: `did:nil:0324fb5d4a3c983a4ef2bd5b7eee31fe01ad97aaeff96470c9f2eafd1730ba61c0`
- Correct order: `did:nil:03ecd47816bb8f475734b77aa9a3f4cc19a6075f3f603de0eebe6e11a784bb2e2d`
### 2. EIP-712 Signature Format
**Format:** 65 bytes (r + s + v)
- r: 32 bytes (signature component)
- s: 32 bytes (signature component)
- v: 1 byte (recovery id)
**Python implementation:**
```python
from eth_account.messages import encode_typed_data
encoded_message = encode_typed_data(full_message=typed_data)
signed_message = account.sign_message(encoded_message)
signature_bytes = signed_message.signature # 65 bytes
```
**TypeScript equivalent:**
```typescript
const bindingSignature = await walletClient.signTypedData(typedData);
// Returns hex string like "0x43f8f2f0..."
const signatureBytes = hexToBytes(bindingSignature); // 65 bytes
```
**Note:** Both implementations produce identical 65-byte signatures.
### 3. HKDF Implementation
**Salt:** `SIGNATURE_INTEGRATED_KDF_v1` (hardcoded constant)
**Python implementation:**
```python
import hashlib
import hmac
import math

COMMON_KDF_SALT = b"SIGNATURE_INTEGRATED_KDF_v1"

def hkdf(input_key_material: bytes, context_information: bytes,
         salt: bytes = COMMON_KDF_SALT, output_length: int = 32) -> bytes:
    # Extract phase
    prk = hmac.new(salt, input_key_material, hashlib.sha256).digest()
    # Expand phase
    n = math.ceil(output_length / 32)  # number of HMAC-SHA256 blocks needed
    output = bytearray()
    t = b''
    for i in range(1, n + 1):
        t = hmac.new(prk, t + context_information + bytes([i]), hashlib.sha256).digest()
        output.extend(t)
    return bytes(output[:output_length])
```
**Critical parameters:**
- Input: user_secret (UTF-8 bytes) + signature (65 bytes) = 80 bytes
- Context: JSON.stringify(authMessage) as UTF-8 bytes
- Salt: `b"SIGNATURE_INTEGRATED_KDF_v1"`
- Output: 32 bytes
### 4. secp256k1 Key Validation
**Problem:** Not all 32-byte values are valid secp256k1 private keys.
**Valid range:** `0 < key < n` where `n = secp256k1.CURVE.n`
**Python implementation:**
```python
SECP256K1_ORDER = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
def ensure_valid_secp256k1_key(key_material: bytes, derivation_data: bytes) -> bytes:
    candidate = key_material
    counter = 0
    while counter < 1000:
        key_value = int.from_bytes(candidate, byteorder='big')
        if 0 < key_value < SECP256K1_ORDER:
            return candidate
        # Re-derive with counter if invalid
        counter += 1
        counter_bytes = counter.to_bytes(4, byteorder='big')
        candidate = hkdf(
            candidate + counter_bytes,
            b"SECP256K1_RETRY",
            derivation_data,
            32
        )
    raise ValueError("Failed to generate valid secp256k1 key after 1000 attempts")
```
### 5. Public Key Compression
**Format:** 33 bytes (1-byte prefix + 32-byte x-coordinate)
**Prefix determination:**
- `0x02` if y-coordinate is even
- `0x03` if y-coordinate is odd
**Python implementation:**
```python
from ecdsa import SigningKey, SECP256k1
signing_key = SigningKey.from_string(private_key, curve=SECP256k1)
verifying_key = signing_key.get_verifying_key()
public_key_uncompressed = verifying_key.to_string() # 64 bytes
x_coord = public_key_uncompressed[:32]
y_coord = public_key_uncompressed[32:]
y_is_odd = y_coord[-1] & 1
prefix = b'\x03' if y_is_odd else b'\x02'
compressed_public_key = prefix + x_coord # 33 bytes
```
**DID format:** `did:nil:{compressed_public_key_hex}`
### 6. Byte Concatenation
**Critical order:**
```python
# Input key material: user_secret + signature
user_secret_bytes = "user@secret.com".encode('utf-8') # 15 bytes
binding_signature = ... # 65 bytes
input_key_material = user_secret_bytes + binding_signature # 80 bytes total
```
**Must match TypeScript:**
```typescript
const _userSecret = new TextEncoder().encode(userSecret);
const input = concatBytes(_userSecret, hexToBytes(bindingSignature));
```
## Common Pitfalls
### ❌ Using `sort_keys=True` in json.dumps
- Creates different JSON than TypeScript
- Results in completely different derived keypair
### ❌ Wrong field insertion order
- Even with alphabetical fields, must insert `context` before `keyId`
### ❌ Including `0x` prefix in signature
- Signature should be raw bytes, not hex string with prefix
### ❌ Wrong public key format for DID
- Must use compressed (33 bytes), not uncompressed (64 bytes)
### ❌ Not validating secp256k1 key range
- Can result in invalid private keys that can't be used
## Test Vectors
**Known safe test key (Hardhat/Ganache default account #0):**
```
Private Key: ac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80
Ethereum Address: 0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266
Auth Message: { keyId: "1", context: "nillion" }
User Secret: "user@secret.com"
Expected Results:
- EIP-712 Signature: 43f8f2f081a113628a5ab4ab232ca74707a455346b338905b7eb3041961e46ef74a1eeb95a1e9e878665afe68db14900ae7686641bcd07760e46d784312e1aee1c
- Derivation Data: 7b22636f6e74657874223a226e696c6c696f6e222c226b65794964223a2231227d
- HKDF Output: fc7d9e63f27d06c1d69c090f86a7f15a91464f8c5de6ee14be7c3dff6f70f9f1
- Final DID: did:nil:03ecd47816bb8f475734b77aa9a3f4cc19a6075f3f603de0eebe6e11a784bb2e2d
```
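To cross-check a port against these vectors, you can feed the documented signature and derivation data straight into the `hkdf` function from `key-derivation.ts` shown earlier (a sketch; the expected output is the HKDF value above):
```ts
import { hexToBytes, bytesToHex, concatBytes, utf8ToBytes } from "@noble/hashes/utils";
import { hkdf } from "./key-derivation";

// inputs copied verbatim from the test vectors above
const signature = hexToBytes(
  "43f8f2f081a113628a5ab4ab232ca74707a455346b338905b7eb3041961e46ef74a1eeb95a1e9e878665afe68db14900ae7686641bcd07760e46d784312e1aee1c"
);
const userSecret = utf8ToBytes("user@secret.com");
const derivationData = utf8ToBytes('{"context":"nillion","keyId":"1"}');

const okm = hkdf(concatBytes(userSecret, signature), derivationData);
console.log(bytesToHex(okm));
// expected: fc7d9e63f27d06c1d69c090f86a7f15a91464f8c5de6ee14be7c3dff6f70f9f1
```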
## References
- TypeScript implementation: `key-derivation.ts`, `sessionKeys.ts`
- Welshare docs: https://docs.welshare.app/basics/key-management
- HKDF RFC: RFC 5869
- EIP-712 spec: https://eips.ethereum.org/EIPS/eip-712
---
## Welshare
## Matching AI agent researchers with patients to accelerate medical breakthroughs
Welshare is on a mission to accelerate science using human data. All protocols that are part of Welshare's architecture rigorously prioritize data owner sovereignty and privacy. User identification and authorization are performed using cryptographically sound protocols, data is stored on decentralized infrastructure, and all computation and inference runs in provably trusted execution environments, making use of privacy enhancing technologies.
We want to make data accessible to agentic AI, either for contextual personal health recommendations or to support agentic research hypotheses derived from knowledge graphs, literature research or artificial intuition - but we don't want everyone to have to give up their privacy for this.
Welshare strongly advocates against establishing more data silos. Our stack will always put the user first, in a way that if *we* go away, users can still access their information. Applications building on the protocol are the first parties getting access grants to their users' data, but the information itself is at this point already *owned* and controlled by the individual user.
## A note on Alpha (α) Level Software
:::warning
The Welshare protocol at the moment is in **Alpha state**. We're operating on testnets, charge testnet tokens, we're potentially even leaking data at this stage (even though we're building towards a system that makes this impossible). Please be careful when using this.
:::
All the aforementioned claims reflect strongly held beliefs, and we will not launch anything in production unless we're fully convinced that user data stays under the control of the user. That being said, while we're building demos and preliminary software, not all goals will be fully achievable during this early phase. Every time we're talking about "alpha software", we can't guarantee that the software or storage is perfectly safe, persistent or permanent, or that private data can't be leaked to third parties. Right now we just architecturally ensure that this will be the case once we get closer to production mode.