
  1. Protecting data
    1. Security of devices
      1. Physical vulnerabilities
        1. Device lost
        2. Device defect (device unavailable, e.g. broken or out of battery)
        3. Device stolen
      2. Lack of Device Security
    2. Security of Wallets → one app or wallet bundling many functions across different sectors.
      1. Phishing Attacks
      2. Malware and Viruses
      3. Social Engineering
    3. Security of Verifiable Credentials
      1. Just like with traditional passwords, weak keys or improperly stored credentials in distributed identity systems can be vulnerable to hacking
        1. by end user
        2. by service providers
        3. by issuers (tricky)
        4. by third parties → misuse or reuse of data by third parties through illegal access, e.g. intrusion through a malicious app, social engineering, duplication, skimming
    4. Security of Services → dependency on service security
      1. relying parties
      2. intermediaries
      3. GenAI: here we mean using GenAI outside the wallet (AI-as-a-Service)  DRAFT
          1. Implicit data leakage (even without “sending data”)

          Even if you think you’re only sending:

          • policies

          • capability lists

          • proof requests

          …the structure, timing, and combinations of requests can leak:

          • user attributes

          • behavior patterns

          • service usage profiles

          This is called inference leakage.

          Over time, the AI provider can reconstruct who you are and what you’re doing — without seeing raw identity data.
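          The point above can be made concrete with a toy sketch: an observer who sees only the *shape* of proof requests (credential types and rough timing), never any raw identity data, can still infer user attributes. All credential-type names and thresholds below are invented for illustration.

```python
# Toy illustration of inference leakage: the external AI provider sees
# only request metadata (hour of day, which credential types were
# requested), yet can still build a behavioral profile.
from collections import Counter

# Metadata visible to the provider: (hour_of_request, credential_types_requested)
observed_requests = [
    (8,  {"transit_pass", "student_id"}),
    (9,  {"student_id", "library_card"}),
    (22, {"age_over_18"}),
    (8,  {"transit_pass", "student_id"}),
]

# Count how often each credential type appears: structure alone leaks a profile.
type_counts = Counter(t for _, types in observed_requests for t in types)

inferred = []
if type_counts["student_id"] >= 2:
    inferred.append("likely a student")
if any(hour < 9 for hour, types in observed_requests if "transit_pass" in types):
    inferred.append("commutes in the morning")

print(inferred)  # ['likely a student', 'commutes in the morning']
```

          No credential value was ever disclosed, yet the request pattern alone yields a profile; real inference attacks would additionally exploit ordering, frequency, and policy structure.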

          2. Loss of user sovereignty

          When AI runs outside the wallet:

          • decision logic lives elsewhere

          • prompt logic evolves without the user’s control

          • model updates silently change behavior

          Result:

          The wallet becomes a UI, not an agent.

          This quietly breaks self-sovereign identity principles.

          3. Policy manipulation & dark negotiation

          External AI can:

          • bias disclosure decisions

          • “optimize” for platform goals

          • subtly over-disclose to reduce friction

          Even without malice:

          • optimization objectives ≠ user interests

          This is algorithmic coercion, not a bug.

          4. Prompt and context retention

          Most AI services:

          • log prompts

          • retain context

          • reuse data for tuning or monitoring

          Even anonymized logs can:

          • correlate identities across services

          • deanonymize users through linkage attacks

          Once logged:

          You can’t revoke it.
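          A minimal sketch of such a linkage attack, assuming two “anonymized” logs that share quasi-identifiers (all field names and values here are hypothetical):

```python
# Toy linkage attack: an AI provider's pseudonymous prompt log and a
# relying party's account log are joined on quasi-identifiers
# (hour bucket + coarse region), re-identifying the pseudonym.
ai_provider_log = [
    {"pseudonym": "u_91f3", "hour": 14, "region": "Berlin"},
    {"pseudonym": "u_2ab7", "hour": 9,  "region": "Hamburg"},
]
relying_party_log = [
    {"account": "alice@example.org", "hour": 14, "region": "Berlin"},
]

# Join the logs on the shared quasi-identifiers: pseudonym -> real account.
links = {
    a["pseudonym"]: r["account"]
    for a in ai_provider_log
    for r in relying_party_log
    if (a["hour"], a["region"]) == (r["hour"], r["region"])
}

print(links)  # {'u_91f3': 'alice@example.org'}
```

          Once such a join succeeds on retained logs, the deanonymization cannot be undone by deleting data on the wallet side.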

          5. Correlation across wallets and services

          A single AI provider serving many wallets can:

          • correlate request fingerprints

          • identify the same user across devices or contexts

          • create a shadow identity graph

          This recreates centralized identity — without consent.
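          A sketch of the correlation step, under the assumption that a “request fingerprint” is simply the set of disclosed attributes (real fingerprints would also use timing, ordering, and policy structure; all names are invented):

```python
# Toy cross-wallet correlation: a single AI provider serving many
# wallets matches identical disclosure fingerprints and links sessions
# that likely belong to the same user across devices.
from itertools import combinations

# (wallet, session) -> fingerprint (the set of attributes disclosed)
sessions = {
    ("wallet_A", "session_1"): frozenset({"age_over_18", "city", "insurance_no"}),
    ("wallet_B", "session_7"): frozenset({"age_over_18", "city", "insurance_no"}),
    ("wallet_C", "session_3"): frozenset({"name", "email"}),
}

# Link any two sessions whose fingerprints match exactly.
linked = [
    (a, b)
    for a, b in combinations(sessions, 2)
    if sessions[a] == sessions[b]
]

print(linked)  # [(('wallet_A', 'session_1'), ('wallet_B', 'session_7'))]
```

          Accumulating such links across many services is exactly how a shadow identity graph forms, even though no single request contained an identifier.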

          6. Regulatory and jurisdictional drift

          External AI services may:

          • run in foreign jurisdictions

          • be subject to subpoenas

          • fall under surveillance regimes

          This creates:

          • unclear data residency

          • legal exposure for wallet providers

          • compliance contradictions (GDPR, eIDAS, etc.)

          7. Model hallucination becomes a security risk

          Inside a wallet:

          • AI mistakes are bounded

          Outside:

          • hallucinated policy interpretations

          • incorrect legal assumptions

          • wrong proof selection

          These can cause:

          • over-disclosure

          • invalid consent

          • irreversible identity actions

          Hallucination here is not UX noise — it’s identity damage.

  2. Losing data → lack of support mechanisms for security incidents
    1. Insufficient recovery solutions
    2. No insurance
  3. Darknet security economy → there is a business in generating fake IDs or misusing real IDs, which can be used for money laundering or other illegal actions
    1. Fake IDs
    2. Misuse of VCs
  4. Trust Infrastructure → any vulnerability caused by mistakes in the trust infrastructure
    1. PKI
    2. Registry
    3. Any intermediaries