git diff origin/stable..origin/ubuntu-20.04-upgrade
Note: this listing is truncated; the full diff can be regenerated locally as sketched below.
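A minimal sketch of how to do that, assuming both branches are available on the `origin` remote:

```bash
# Fetch the latest refs for both branches.
git fetch origin stable ubuntu-20.04-upgrade

# Write the full diff between the two branches to a file for review.
git diff origin/stable..origin/ubuntu-20.04-upgrade > upgrade-review.diff
```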
* [X] diff --git a/.devcontainer/Dockerfile b/.devcontainer/Dockerfile
This is fine and we can ignore it since it is part of the new build.
new file mode 100644
index 00000000..e101f8aa
--- /dev/null
+++ b/.devcontainer/Dockerfile
@@ -0,0 +1,74 @@
+# See here for image contents: https://github.com/microsoft/vscode-dev-containers/tree/v0.205.2/containers/ubuntu/.devcontainer/base.Dockerfile
+
+# [Choice] Ubuntu version (use hirsuite or bionic on local arm64/Apple Silicon): hirsute, focal, bionic
+ARG VARIANT="focal"
+FROM mcr.microsoft.com/vscode/devcontainers/base:${VARIANT}
+
+RUN apt-get update -y && apt-get install -y \
+ # common stuff
+ git \
+ wget \
+ gnupg \
+ apt-transport-https \
+ ca-certificates \
+ apt-utils \
+ curl \
+ jq
+
+# ========================================================================================================
+# Update repository signing keys
+# --------------------------------------------------------------------------------------------------------
+# Hyperledger
+RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 9692C00E657DDE61 && \
+ # Sovrin
+ apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
+# ========================================================================================================
+
+# Plenum
+# - https://github.com/hyperledger/indy-plenum/issues/1546
+# - Needed to pick up rocksdb=5.8.8
+RUN echo "deb https://hyperledger.jfrog.io/artifactory/indy focal dev" >> /etc/apt/sources.list && \
+ echo "deb http://security.ubuntu.com/ubuntu bionic-security main" >> /etc/apt/sources.list && \
+ echo "deb https://repo.sovrin.org/deb bionic master" >> /etc/apt/sources.list && \
+ echo "deb https://repo.sovrin.org/sdk/deb bionic master" >> /etc/apt/sources.list
+
+RUN apt-get update -y && apt-get install -y \
+ # Python
+ python3-pip \
+ python3-nacl \
+ # rocksdb python wrapper
+ rocksdb=5.8.8 \
+ libgflags-dev \
+ libsnappy-dev \
+ zlib1g-dev \
+ libbz2-dev \
+ liblz4-dev \
+ libgflags-dev \
+ # zstd is needed for caching in github actions pipeline
+ zstd \
+ # fpm
+ ruby \
+ ruby-dev \
+ rubygems \
+ gcc \
+ make \
+ # Indy Node and Plenum
+ libssl1.0.0 \
+ ursa=0.3.2-1 \
+ # Indy SDK
+ libindy=1.15.0~1625-bionic \
+ # Need to move libursa.so to parent dir
+ && mv /usr/lib/ursa/* /usr/lib && rm -rf /usr/lib/ursa
+
+RUN pip3 install -U \
+ # Required by setup.py
+ setuptools==50.3.2 \
+ 'pyzmq==22.3.0'
+
+
+# install fpm
+RUN gem install --no-document rake
+RUN gem install --no-document fpm -v 1.14.2
+
+RUN apt-get -y autoremove \
+ && rm -rf /var/lib/apt/lists/*
* [X] diff --git a/.devcontainer/devcontainer.json b/.devcontainer/devcontainer.json
This is fine since it is part of the new build.
new file mode 100644
index 00000000..e405e52d
--- /dev/null
+++ b/.devcontainer/devcontainer.json
@@ -0,0 +1,31 @@
+// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
+// https://github.com/microsoft/vscode-dev-containers/tree/v0.205.2/containers/ubuntu
+{
+ "name": "Ubuntu",
+ "build": {
+ "dockerfile": "Dockerfile",
+ // Update 'VARIANT' to pick an Ubuntu version: hirsute, focal, bionic
+ // Use hirsute or bionic on local arm64/Apple Silicon.
+ "args": { "VARIANT": "focal" }
+ },
+
+ // Set *default* container specific settings.json values on container create.
+ "settings": {},
+
+
+ // Add the IDs of extensions you want installed when the container is created.
+ "extensions": [
+ "mhutchie.git-graph",
+ "eamodio.gitlens",
+ "ms-python.python"
+ ],
+
+ // Use 'forwardPorts' to make a list of ports inside the container available locally.
+ // "forwardPorts": [],
+
+ // Use 'postCreateCommand' to run commands after the container is created.
+ "postCreateCommand": "pip install .[tests]",
+
+ // Comment out connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
+ //"remoteUser": "vscode"
+}
\ No newline at end of file
* [X] diff --git a/.flake8 b/.flake8
This change makes sense and needs to be carried forward.
* [ ] TODO: Create a cherry-pick for this commit (a sketch of the command follows after this hunk).
commit d346dc3a9e07848bcbab91e2c94743768b1cc3f0
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Wed Nov 29 14:22:51 2017 +0300
Add flake error to ignore
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
.flake8 | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
index 85491324..3e3a277c 100644
--- a/.flake8
+++ b/.flake8
@@ -1,5 +1,5 @@
[flake8]
-ignore = N801, N802, N803, N806, N813, C901, E501, F401, E741
+ignore = N801, N802, N803, N806, N813, C901, E501, F401, W605
exclude =
# common places
.git,__pycache__,docs/source/conf.py,old,build,dist,environment
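A minimal sketch of one way to create that cherry-pick, assuming the commit is reachable after fetching; the target branch below is only illustrative:

```bash
# Make sure the commit is available locally.
git fetch origin

# Start a working branch from the branch that should receive the change
# (target branch is an assumption; adjust as needed).
git checkout -b carry-flake8-w605 origin/stable

# Cherry-pick the .flake8 change, recording the source commit (-x)
# and adding a Signed-off-by trailer (-s).
git cherry-pick -x -s d346dc3a9e07848bcbab91e2c94743768b1cc3f0
```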
* [X] diff --git a/.gitattributes b/.gitattributes
This is a new file.
new file mode 100644
index 00000000..92ce082e
--- /dev/null
+++ b/.gitattributes
@@ -0,0 +1,5 @@
+# Set the default behavior, in case people don't have core.autocrlf set.
+* text=auto
+
+# Declare files that will always have LF line endings on checkout.
+**/publishPackages text eol=lf
\ No newline at end of file
* [X] diff --git a/.github/actions/getNewNodeVersion/action.yaml b/.github/actions/getNewNodeVersion/action.yaml
Part of the new build.
new file mode 100644
index 00000000..e67138ab
--- /dev/null
+++ b/.github/actions/getNewNodeVersion/action.yaml
@@ -0,0 +1,38 @@
+name: "Get New NodeVersion"
+description: "Sets version parameters and makes them available as outputs for subsequent processes."
+
+inputs:
+ isRC:
+ description: "A flag indicating whether or not this is a release candidate build; set to either 'true' or 'false'."
+ required: true
+ default: "true"
+
+outputs:
+ nodeVersion:
+ description: "The new Version. (Bumped Patch by 1)"
+ value: ${{ steps.versions.outputs.nodeVersion }}
+
+runs:
+ using: "composite"
+ steps:
+ - uses: actions/checkout@v3
+ - name: Set up Python
+ uses: actions/setup-python@v2
+ with:
+ python-version: '3.8'
+ - name: Get Versions
+ id: versions
+ shell: bash
+ run: |
+ major=$(python3 -c "from indy_node import load_version; patch = load_version().parts[0]; print('' if patch is None else patch)")
+ minor=$(python3 -c "from indy_node import load_version; patch = load_version().parts[1]; print('' if patch is None else patch)")
+ patch=$(python3 -c "from indy_node import load_version; patch = load_version().parts[2]; patch+=1; print('' if patch is None else patch)")
+ if [[ "${{ inputs.isRC }}" == "true" ]]; then
+ export nodeVersion="${major}.${minor}.${patch}rc1"
+ else
+ export nodeVersion="${major}.${minor}.${patch}"
+ fi
+ echo "nodeVersion=$nodeVersion" >> $GITHUB_OUTPUT
+ echo "::group::DEBUG"
+ echo "nodeVersion is set to $nodeVersion"
+ echo "::endgroup::"
\ No newline at end of file
* [X] diff --git a/.github/settings.yml b/.github/settings.yml
This is part of the new build
new file mode 100644
index 00000000..77237599
--- /dev/null
+++ b/.github/settings.yml
@@ -0,0 +1,18 @@
+#
+# SPDX-License-Identifier: Apache-2.0
+#
+
+repository:
+ name: indy-node
+ description: The server portion of a distributed ledger purpose-built for decentralized identity.
+ homepage: https://wiki.hyperledger.org/display/indy
+ default_branch: master
+ has_downloads: false
+ has_issues: true
+ has_projects: false
+ has_wiki: false
+ archived: false
+ private: false
+ allow_squash_merge: true
+ allow_merge_commit: true
+ allow_rebase_merge: true
* [X] diff --git a/.github/workflows/PR.yaml b/.github/workflows/PR.yaml
This is part of the new build.
new file mode 100644
index 00000000..1d491c45
--- /dev/null
+++ b/.github/workflows/PR.yaml
@@ -0,0 +1,84 @@
+name: Indy Node - PR Workflow
+on:
+ pull_request:
+ paths:
+ - '**'
+ - "!indy_node/__version__.json"
+
+ branches:
+ - ubuntu-20.04-upgrade
+ workflow_dispatch:
+
+jobs:
+ workflow-setup:
+ name: Initialize Workflow
+ runs-on: ubuntu-latest
+ outputs:
+ CACHE_KEY_BUILD: ${{ steps.setup.outputs.CACHE_KEY_BUILD }}
+ UBUNTU_VERSION: ${{ steps.setup.outputs.UBUNTU_VERSION }}
+ # Expose the lowercase version of the GitHub repository name
+ # to all subsequent jobs that reference image repositories
+ # as the push and pull operations require the URL of the repository
+ # to be in lowercase.
+ GITHUB_REPOSITORY_NAME: ${{ steps.setup.outputs.GITHUB_REPOSITORY_NAME }}
+ distribution: ${{ steps.setup.outputs.distribution }}
+ publish: ${{ steps.setup.outputs.publish }}
+ testsNeeded: ${{ steps.testsNeeded.outputs.testsNeeded }}
+ steps:
+ - name: checkout source code
+ uses: actions/checkout@v3
+ - name: setup
+ id: setup
+ uses: hyperledger/indy-shared-gha/.github/actions/workflow-setup@v1
+ - name: testsNeeded
+ id: testsNeeded
+ uses: dorny/paths-filter@v2
+ with:
+ filters: |
+ testsNeeded:
+ - '**.py'
+ - '.github/**'
+ - 'build-scripts/**'
+ - 'bump_version.sh'
+
+ lint:
+ name: Lint
+ needs: [workflow-setup]
+ if: ${{ needs.workflow-setup.outputs.testsNeeded == 'true' }}
+ uses: hyperledger/indy-shared-gha/.github/workflows/lint.yaml@v1
+
+ build-image:
+ name: Create Builder Image
+ needs: [workflow-setup, lint]
+ uses: hyperledger/indy-shared-gha/.github/workflows/buildimage.yaml@v1
+ with:
+ CACHE_KEY_BUILD: ${{ needs.workflow-setup.outputs.CACHE_KEY_BUILD }}
+ DOCKER_IMAGE: ghcr.io/${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}/node-build
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+
+ indy_node_tests:
+ name: Indy Node Tests
+ needs: [workflow-setup, build-image]
+ uses: ./.github/workflows/reuseable_test.yaml
+ with:
+ GITHUB_REPOSITORY_NAME: ${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+
+ build_packages:
+ name: Build Packages
+ needs: [workflow-setup, indy_node_tests]
+ uses: hyperledger/indy-shared-gha/.github/workflows/buildpackages.yaml@v1
+ with:
+ DOCKER_IMAGE: ghcr.io/${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}/node-build:${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+ isDev: true
+ isRC: false
+ moduleName: indy_node
+
+ statusCheck:
+ name: statusCheck
+ runs-on: ubuntu-latest
+ needs: [workflow-setup, build_packages]
+ if: ${{ needs.workflow-setup.outputs.testsNeeded == 'false' || success() }}
+ steps:
+ - run: 'echo "Just a status Check (Always true, when executed) for branch protection rules(blocks merging while test are running and if tests fail)." '
\ No newline at end of file
* [X] diff --git a/.github/workflows/Push.yaml b/.github/workflows/Push.yaml
This is part of the new build.
new file mode 100644
index 00000000..5099ac9a
--- /dev/null
+++ b/.github/workflows/Push.yaml
@@ -0,0 +1,77 @@
+name: Indy Node - Push Workflow
+on:
+ push:
+ branches:
+ - ubuntu-20.04-upgrade
+ paths:
+ - '**.py'
+ - '.github/**'
+ - 'build-scripts/**'
+ - 'bump_version.sh'
+
+jobs:
+ workflow-setup:
+ name: Initialize Workflow
+ runs-on: ubuntu-latest
+ outputs:
+ CACHE_KEY_BUILD: ${{ steps.setup.outputs.CACHE_KEY_BUILD }}
+ UBUNTU_VERSION: ${{ steps.setup.outputs.UBUNTU_VERSION }}
+ # Expose the lowercase version of the GitHub repository name
+ # to all subsequent jobs that reference image repositories
+ # as the push and pull operations require the URL of the repository
+ # to be in lowercase.
+ GITHUB_REPOSITORY_NAME: ${{ steps.setup.outputs.GITHUB_REPOSITORY_NAME }}
+ distribution: ${{ steps.setup.outputs.distribution }}
+ publish: ${{ steps.setup.outputs.publish }}
+ steps:
+ - name: checkout source code
+ uses: actions/checkout@v3
+ - name: setup
+ id: setup
+ uses: hyperledger/indy-shared-gha/.github/actions/workflow-setup@v1
+
+ lint:
+ name: Lint
+ uses: hyperledger/indy-shared-gha/.github/workflows/lint.yaml@v1
+
+ build-image:
+ name: Create Builder Image
+ needs: [workflow-setup, lint]
+ uses: hyperledger/indy-shared-gha/.github/workflows/buildimage.yaml@v1
+ with:
+ CACHE_KEY_BUILD: ${{ needs.workflow-setup.outputs.CACHE_KEY_BUILD }}
+ DOCKER_IMAGE: ghcr.io/${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}/node-build
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+
+ indy_node_tests:
+ name: Indy Node Tests
+ needs: [workflow-setup, build-image]
+ uses: ./.github/workflows/reuseable_test.yaml
+ with:
+ GITHUB_REPOSITORY_NAME: ${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+
+ build_packages:
+ name: Build Packages
+ needs: [workflow-setup, indy_node_tests]
+ uses: hyperledger/indy-shared-gha/.github/workflows/buildpackages.yaml@v1
+ with:
+ DOCKER_IMAGE: ghcr.io/${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}/node-build:${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+ isDev: true
+ isRC: false
+ moduleName: indy_node
+
+ publish_artifacts:
+ name: Publish Artifacts
+ needs: [workflow-setup, build_packages]
+ if: needs.workflow-setup.outputs.publish == 'true'
+ uses: hyperledger/indy-shared-gha/.github/workflows/publish_artifacts.yaml@v1
+ with:
+ COMPONENT: 'dev'
+ UBUNTU_VERSION: ${{ needs.workflow-setup.outputs.UBUNTU_VERSION }}
+ distribution: ${{ needs.workflow-setup.outputs.distribution }}
+ moduleName: indy_node
+ secrets:
+ INDY_ARTIFACTORY_REPO_CONFIG: ${{ secrets.INDY_ARTIFACTORY_REPO_CONFIG }}
+ PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
* [X] diff --git a/.github/workflows/README.md b/.github/workflows/README.md
This is part of the new build. Not a concern.
index f1587ac3..adddac7a 100644
--- a/.github/workflows/README.md
+++ b/.github/workflows/README.md
@@ -1,25 +1,11 @@
-### Github Actions Workflow
+# GitHub Actions Workflows
-This build file replaces the existing `Jenkins.ci` build process.
+The [PR](PR.yaml) workflow runs on Pull Requests to the ubuntu-20.04-upgrade branch,
+which only contain changes to python files. If no python file is affected it doesn't run.
+The same applies to the [Push](Push.yaml) workflow respectively for pushes.
-`lint.yaml` replaces the `Static code validation` stage of the Jenkins build.
+The [tag](tag.yaml), [releasepr](releasepr.yaml) and [publishRelease](publishRelease.yaml) workflows are used for the new [Release Workflow](../../docs/release-workflow.png).
+They use reuseable workflows from the [indy-shared-gha](https://github.com/hyperledger/indy-shared-gha) repository and the following workflow in this folder.
-`build.yaml` replaces the `Build / Test` stage of the Jenkins build.
-
-Many of the other stages are replaced merely by the fact we're using Github Actions, we use prebuild Docker containers so we don't have to replicate the steps for building containers.
-
-The `Build result notification` stage was not moved to GHA, build failures will be reports via GHA.
-
-The build process for `Jenkins.nightly` was not ported to GHA.
-
-#### Configuring actions
-
-If you are cloning or forking this repo you will need to configure two secrets for Actions to run correctly.
-
-Secrets can be set via Settings -> Secrets -> New repository secret.
-
-CR_USER is your GH username.
-CR_PAT can be created by following [these directions](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token)
-
-Once you have run the build once with those secrets, you have to make then package public.
-Access the package at https://ghcr.io/USER/indy-node/indy-node-build or https://ghcr.io/USER/indy-node/indy-node-lint then change the visibility in 'Package Settings' to 'Public' then re-run the build.
++ [reuseable_test.yaml](reuseable_test.yaml)
+ This workflow runs the tests inside the uploaded docker images.
\ No newline at end of file
* [X] diff --git a/.github/workflows/WIPupdatePlenumDependency b/.github/workflows/WIPupdatePlenumDependency
This is part of the new build.
new file mode 100644
index 00000000..5a805d0d
--- /dev/null
+++ b/.github/workflows/WIPupdatePlenumDependency
@@ -0,0 +1,83 @@
+# WIP to Update node after a plenum update has occured
+# name: Update Plenum Dependency
+
+# on:
+# push:
+# tags:
+# - setPlenum-v**
+
+# jobs:
+# taginfos:
+# name: get Tag infos
+# runs-on: ubuntu-latest
+# outputs:
+# version: ${{ steps.get-release-info.outputs.version }}
+# isPreRelease: ${{ steps.get-release-info.outputs.isRC }}
+# Branch: ${{ steps.setPRBranch.outputs.prBranch }}
+# BASE: ${{ steps.branch.outputs.BASE }}
+# nodeVersion: ${{ steps.nodeVersion.outputs.nodeVersion }}
+# steps:
+# - name: checkout source code
+# uses: actions/checkout@v3
+# - name: extract branch
+# id: branch
+# run: |
+# raw=$(git branch -r --contains ${{ github.ref }})
+# branch=${raw/origin\/}
+# echo ::set-output\ name=BASE::$branch
+# echo "::debug::BASE is being set to $branch"
+# - name: get-release-info
+# id: get-release-info
+# uses: hyperledger/indy-shared-gha/.github/actions/get-release-info@v1
+# with:
+# versionString: "${{ github.ref }}"
+# - name: setPRBranch
+# id: setPRBranch
+# run: |
+# if [[ "${{ steps.get-release-info.outputs.isRC }}" == "true" ]]; then
+# echo ::set-output\ name=prBranch::update-plenum-rc-version
+# else
+# echo ::set-output\ name=prBranch::update-plenum-version
+# fi
+# # - name: Set up Python
+# # uses: actions/setup-python@v2
+# # with:
+# # python-version: '3.8'
+# # - name: Get New Node Version
+# # id: nodeVersion
+# # run: |
+# # major=$(python3 -c "from indy_node import load_version; patch = load_version().parts[0]; print('' if patch is None else patch)")
+# # minor=$(python3 -c "from indy_node import load_version; patch = load_version().parts[1]; print('' if patch is None else patch)")
+# # patch=$(python3 -c "from indy_node import load_version; patch = load_version().parts[2]; patch+=1; print('' if patch is None else patch)")
+# # if [[ "${{ inputs.isRC }}" == "true" ]]; then
+# # echo ::set-output\ name=nodeVersion::${major}.${minor}.${patch}rc1
+# # else
+# # echo ::set-output\ name=nodeVersion::${major}.${minor}.${patch}
+# # fi
+# - name: Get new Node Version
+# id: nodeVersion
+# uses: ./.github/actions/getNewNodeVersion
+# with:
+# isRC: ${{ steps.get-release-info.outputs.isRC }}
+
+# updateAndCommit:
+# runs-on: ubuntu-latest
+# needs: taginfos
+# steps:
+# - name: checkout source code
+# uses: actions/checkout@v3
+# with:
+# token: ${{ secrets.BOT_PR_PAT }}
+# - name: Update Setup.py
+# run: |
+# sed -E -i 's/indy-plenum==[[:digit:]]+.[[:digit:]]+.[[:digit:]]+(.(dev|rc)[[:digit:]]+)?/indy-plenum==${{ needs.taginfos.outputs.version }}/g' setup.py
+# - name: CommitAndTag
+# uses: EndBug/add-and-commit@v9
+# with:
+# author_name: ${{ github.actor }}
+# author_email: ${{ github.event.pusher.email }}
+# commit: --signoff
+# message: 'Update Plenum Dependency to ${{ needs.taginfos.outputs.version }}'
+# push: "origin ${{ needs.taginfos.outputs.Branch }} --set-upstream --force"
+# new_branch: ${{ needs.taginfos.outputs.Branch }}
+# tag: "setRelease-v${{ needs.taginfos.outputs.nodeVersion }}"
\ No newline at end of file
* [X] diff --git a/.github/workflows/build.yaml b/.github/workflows/build.yaml
This was part of the old build.
deleted file mode 100644
index 9ee5ffad..00000000
--- a/.github/workflows/build.yaml
+++ /dev/null
@@ -1,117 +0,0 @@
-name: indy-node-build
-on: [ push, pull_request ]
-
-jobs:
- workflow-setup:
- runs-on: ubuntu-latest
- outputs:
- CACHE_KEY_LINT: ${{ steps.cache.outputs.CACHE_KEY_LINT }}
- CACHE_KEY_BUILD: ${{ steps.cache.outputs.CACHE_KEY_BUILD }}
- # Expose the lowercase version of the GitHub repository name
- # to all subsequent jobs that reference image repositories
- # as the push and pull operations require the URL of the repository
- # to be in lowercase.
- GITHUB_REPOSITORY_NAME: ${{ steps.cache.outputs.GITHUB_REPOSITORY_NAME }}
- steps:
- - name: Git checkout
- uses: actions/checkout@v2
- - name: Set outputs
- id: cache
- run: |
- echo "::set-output name=CACHE_KEY_LINT::${{ hashFiles('.github/workflows/lint/Dockerfile') }}"
- echo "::set-output name=CACHE_KEY_BUILD::${{ hashFiles('.github/workflows/build/Dockerfile') }}"
- echo "::set-output name=GITHUB_REPOSITORY_NAME::$(echo ${GITHUB_REPOSITORY,,})"
-
- build-lint-image:
- needs: workflow-setup
- runs-on: ubuntu-latest
- env:
- DOCKER_BUILDKIT: 1
- CACHE_KEY_LINT: ${{ needs.workflow-setup.outputs.CACHE_KEY_LINT }}
- GITHUB_REPOSITORY_NAME: ${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}
- steps:
- - name: Git checkout
- uses: actions/checkout@v2
- - name: Try load from cache.
- id: cache-image-lint
- uses: actions/cache@v2
- with:
- path: ${GITHUB_WORKSPACE}/cache
- key: ${{ env.CACHE_KEY_LINT }}
- - name: If NOT found in cache, build and push image.
- if: steps.cache-image-lint.outputs.cache-hit != 'true'
- run: |
- echo ${{ secrets.CR_PAT }} | docker login ghcr.io --username ${{ secrets.CR_USER }} --password-stdin
- docker build -f .github/workflows/lint/Dockerfile --no-cache -t ${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-lint:${{ env.CACHE_KEY_LINT }} .
- docker tag ${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-lint:${{ env.CACHE_KEY_LINT }} ghcr.io/${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-lint:latest
- docker push ghcr.io/${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-lint:latest
- mkdir -p ${GITHUB_WORKSPACE}/cache
- touch ${GITHUB_WORKSPACE}/cache/${{ env.CACHE_KEY_LINT }}
-
- build-test-image:
- needs: workflow-setup
- runs-on: ubuntu-latest
- env:
- DOCKER_BUILDKIT: 1
- CACHE_KEY_BUILD: ${{ needs.workflow-setup.outputs.CACHE_KEY_BUILD }}
- GITHUB_REPOSITORY_NAME: ${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}
- steps:
- - name: Git checkout
- uses: actions/checkout@v2
- - name: Try load from cache.
- id: cache-image-build
- uses: actions/cache@v2
- with:
- path: ${GITHUB_WORKSPACE}/cache
- key: ${{ env.CACHE_KEY_BUILD }}
- - name: If NOT found in cache, build and push image.
- if: steps.cache-image-build.outputs.cache-hit != 'true'
- run: |
- echo ${{ secrets.CR_PAT }} | docker login ghcr.io --username ${{ secrets.CR_USER }} --password-stdin
- docker build -f .github/workflows/build/Dockerfile --no-cache -t ${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-build:${{ env.CACHE_KEY_BUILD }} .
- docker tag ${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-build:${{ env.CACHE_KEY_BUILD }} ghcr.io/${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-build:latest
- docker push ghcr.io/${{ env.GITHUB_REPOSITORY_NAME }}/indy-node-build:latest
- mkdir -p ${GITHUB_WORKSPACE}/cache
- touch ${GITHUB_WORKSPACE}/cache/${{ env.CACHE_KEY_BUILD }}
-
- indy_node:
- name: Build Indy Node
- needs: build-test-image
- runs-on: ubuntu-18.04
- container:
- image: ghcr.io/${{ github.repository }}/indy-node-build
- strategy:
- matrix:
- module: [indy_node, indy_common]
- slice: [1, 2, 3, 4 ,5, 6, 7,8, 9, 10, 11]
- fail-fast: false
- steps:
- - name: Check out code
- uses: actions/checkout@v2
-
- - name: Install dependencies
- run: pip install .[tests]
- continue-on-error: true
-
- - name: Run Indy Node ${{ matrix.module }} test slice ${{ matrix.slice }}/${{ strategy.job-total }}
- run: RUSTPYTHONASYNCIODEBUG=0 python3 runner.py --pytest "python3 -m pytest -l -vv --junitxml=test-result-indy-node-${{ matrix.module }}-${{ matrix.slice }}.xml" --dir "${{ matrix.module }}" --output "test-result-indy-node-${{ matrix.slice }}.txt" --test-only-slice "${{ matrix.slice }}/${{ strategy.job-total }}"
-
- - name: Publish Test Report
- uses: scacap/action-surefire-report@v1
- with:
- check_name: Indy Node ${{ matrix.module }} ${{ matrix.slice }}/${{ strategy.job-total }} Test Report
- github_token: ${{ secrets.GITHUB_TOKEN }}
- report_paths: test-result-indy-node-${{ matrix.module }}-${{ matrix.slice }}.xml
-
- lint:
- name: Lint
- runs-on: ubuntu-latest
- container:
- image: ghcr.io/${{ needs.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}/indy-node-lint
- needs: [workflow-setup, build-lint-image]
- steps:
- - name: Check out code
- uses: actions/checkout@v2
-
- - name: flake8
- run: python3 -m flake8
* [X] diff --git a/.github/workflows/build/Dockerfile b/.github/workflows/build/Dockerfile
Part of the old build.
deleted file mode 100644
index 0a1ce13c..00000000
--- a/.github/workflows/build/Dockerfile
+++ /dev/null
@@ -1,17 +0,0 @@
-FROM hyperledger/indy-core-baseci:0.0.3-master
-LABEL maintainer="Hyperledger <hyperledger-indy@lists.hyperledger.org>"
-
-RUN apt-get update -y && apt-get install -y \
- python3-nacl \
- libindy-crypto=0.4.5 \
- libindy=1.13.0~1420 \
-# rocksdb python wrapper
- libbz2-dev \
- zlib1g-dev \
- liblz4-dev \
- libsnappy-dev \
- rocksdb=5.8.8 \
- ursa=0.3.2-2 \
- jq
-
-RUN indy_image_clean
* [X] diff --git a/.github/workflows/build/Dockerfile.ubuntu-2004 b/.github/workflows/build/Dockerfile.ubuntu-2004
Part of new build.
new file mode 100644
index 00000000..39991dc4
--- /dev/null
+++ b/.github/workflows/build/Dockerfile.ubuntu-2004
@@ -0,0 +1,73 @@
+FROM ubuntu:20.04
+
+ARG uid=1000
+ARG user=indy
+
+RUN apt-get update -y && apt-get install -y \
+ # common stuff
+ git \
+ wget \
+ gnupg \
+ apt-transport-https \
+ ca-certificates \
+ apt-utils \
+ curl \
+ jq
+
+# ========================================================================================================
+# Update repository signing keys
+# --------------------------------------------------------------------------------------------------------
+# Hyperledger
+RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 9692C00E657DDE61 && \
+ # Sovrin
+ apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
+# ========================================================================================================
+
+# Plenum
+# - https://github.com/hyperledger/indy-plenum/issues/1546
+# - Needed to pick up rocksdb=5.8.8
+RUN echo "deb https://hyperledger.jfrog.io/artifactory/indy focal dev" >> /etc/apt/sources.list && \
+ echo "deb http://security.ubuntu.com/ubuntu bionic-security main" >> /etc/apt/sources.list && \
+ echo "deb https://repo.sovrin.org/deb bionic master" >> /etc/apt/sources.list && \
+ echo "deb https://repo.sovrin.org/sdk/deb bionic master" >> /etc/apt/sources.list
+
+RUN apt-get update -y && apt-get install -y \
+ # Python
+ python3-pip \
+ python3-nacl \
+ # rocksdb python wrapper
+ rocksdb=5.8.8 \
+ libgflags-dev \
+ libsnappy-dev \
+ zlib1g-dev \
+ libbz2-dev \
+ liblz4-dev \
+ libgflags-dev \
+ # zstd is needed for caching in github actions pipeline
+ zstd \
+ # fpm
+ ruby \
+ ruby-dev \
+ rubygems \
+ gcc \
+ make \
+ # Indy Node and Plenum
+ libssl1.0.0 \
+ ursa=0.3.2-1 \
+ # Indy SDK
+ libindy=1.15.0~1625-bionic \
+ # Need to move libursa.so to parent dir
+ && mv /usr/lib/ursa/* /usr/lib && rm -rf /usr/lib/ursa
+
+RUN pip3 install -U \
+ # Required by setup.py
+ setuptools==50.3.2 \
+ 'pyzmq==22.3.0'
+
+
+# install fpm
+RUN gem install --no-document rake
+RUN gem install --no-document fpm -v 1.14.2
+
+RUN apt-get -y autoremove \
+ && rm -rf /var/lib/apt/lists/*
* [X] diff --git a/.github/workflows/lint/Dockerfile b/.github/workflows/lint/Dockerfile
Part of old build.
deleted file mode 100644
index 21b3067c..00000000
--- a/.github/workflows/lint/Dockerfile
+++ /dev/null
@@ -1,21 +0,0 @@
-# Development
-FROM ubuntu:18.04
-LABEL maintainer="Kevin Griffin <griffin.kev@gmail.com>"
-
-RUN apt-get update && apt-get dist-upgrade -y
-
-# Install environment
-RUN apt-get install -y \
- git \
- wget \
- python3.5 \
- python3-pip \
- python-setuptools \
- python3-nacl
-
-RUN pip3 install -U \
- 'pip<10.0.0' \
- 'setuptools<=50.3.2' \
- pep8==1.7.1 \
- pep8-naming==0.6.1 \
- flake8==3.5.0
* [X] diff --git a/.github/workflows/lint/README.md b/.github/workflows/lint/README.md
Part of old build
deleted file mode 100644
index 6b2a83fd..00000000
--- a/.github/workflows/lint/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# Building the lint image
-
-This `Dockerfile` is used as part of the workflow, any changes to it will force the docker image to be rebuilt and that new image will be used to run the downstream workflow.
\ No newline at end of file
* [X] diff --git a/.github/workflows/publishRelease.yaml b/.github/workflows/publishRelease.yaml
Part of new build.
new file mode 100644
index 00000000..88c72219
--- /dev/null
+++ b/.github/workflows/publishRelease.yaml
@@ -0,0 +1,121 @@
+name: Triggered by Version Bump merged
+
+#disable all tags and enable all brannches and only version file
+on:
+ push:
+ branches-ignore:
+ - update-rc-version
+ - update-version
+ paths:
+ - '!**'
+ - "indy_node/__version__.json"
+
+jobs:
+ release-infos:
+ name: release-infos
+ runs-on: ubuntu-latest
+ outputs:
+ isVersionBump: ${{ steps.get-release-info.outputs.isVersionBump }}
+ isPreRelease: ${{ steps.get-release-info.outputs.isRC }}
+ versionTag: ${{ steps.get-release-info.outputs.versionTag }}
+ component: ${{ steps.get-release-info.outputs.component }}
+ CACHE_KEY_BUILD: ${{ steps.workflow-setup.outputs.CACHE_KEY_BUILD }}
+ UBUNTU_VERSION: ${{ steps.workflow-setup.outputs.UBUNTU_VERSION }}
+ # Expose the lowercase version of the GitHub repository name
+ # to all subsequent jobs that reference image repositories
+ # as the push and pull operations require the URL of the repository
+ # to be in lowercase.
+ GITHUB_REPOSITORY_NAME: ${{ steps.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}
+ distribution: ${{ steps.workflow-setup.outputs.distribution }}
+ publish: ${{ steps.workflow-setup.outputs.publish }}
+ steps:
+ - name: checkout source code
+ uses: actions/checkout@v3
+ - name: get-release-info
+ id: get-release-info
+ uses: hyperledger/indy-shared-gha/.github/actions/get-release-info@v1
+ with:
+ versionString: "${{ github.event.head_commit.message }}"
+ - name: workflow-setup
+ id: workflow-setup
+ uses: hyperledger/indy-shared-gha/.github/actions/workflow-setup@v1
+
+ createRelease:
+ name: Create Release
+ needs: [release-infos]
+ if: needs.release-infos.outputs.isVersionBump == 'true'
+ runs-on: ubuntu-latest
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v3
+
+ - name: Download Node deb Artifacts from Github Action Artifacts
+ uses: dawidd6/action-download-artifact@v2
+ with:
+ github_token: ${{ secrets.GITHUB_TOKEN }}
+ workflow: releasepr.yaml
+ workflow_conclusion: success
+ name: indy_node-deb
+ path: artifacts/indy_node-deb
+ - name: Download Node python Artifacts from Github Action Artifacts
+ uses: dawidd6/action-download-artifact@v2
+ with:
+ github_token: ${{ secrets.GITHUB_TOKEN }}
+ workflow: releasepr.yaml
+ workflow_conclusion: success
+ name: indy_node-python
+ path: artifacts/indy_node-python
+ - name: Download Node third party dependency Artifacts from Github Action Artifacts
+ uses: dawidd6/action-download-artifact@v2
+ with:
+ github_token: ${{ secrets.GITHUB_TOKEN }}
+ workflow: releasepr.yaml
+ workflow_conclusion: success
+ name: third-party-dependencies
+ path: artifacts/third-party-dependencies
+ - uses: actions/upload-artifact@v3
+ with:
+ name: third-party-dependencies
+ path: artifacts/third-party-dependencies
+ retention-days: 5
+ - uses: actions/upload-artifact@v3
+ with:
+ name: indy_node-deb
+ path: artifacts/indy_node-deb
+ retention-days: 5
+ - uses: actions/upload-artifact@v3
+ with:
+ name: indy_node-python
+ path: artifacts/indy_node-python
+ retention-days: 5
+ - name: Zip Files for Release
+ run: |
+ zip -r artifacts/indy_node-deb.zip artifacts/indy_node-deb
+ zip -r artifacts/indy_node-python.zip artifacts/indy_node-python
+ zip -r artifacts/third-party-dependencies.zip artifacts/third-party-dependencies
+ - name: Generate Release
+ uses: softprops/action-gh-release@v1
+ with:
+ tag_name: ${{ needs.release-infos.outputs.VERSIONTAG }}
+ files: |
+ artifacts/**.zip
+ generate_release_notes: true
+ body: "[${{ needs.release-infos.outputs.VERSIONTAG }}] "
+ prerelease: ${{ needs.release-infos.outputs.isPreRelease }}
+ target_commitish: ${{ github.event.ref }}
+ name: "${{ needs.release-infos.outputs.VERSIONTAG }}"
+ token: ${{ secrets.BOT_PR_PAT }}
+
+ publish_artifacts:
+ name: Publish Artifacts
+ needs: [release-infos, createRelease]
+ if: needs.release-infos.outputs.isVersionBump == 'true' && needs.release-infos.outputs.publish == 'true'
+ uses: hyperledger/indy-shared-gha/.github/workflows/publish_artifacts.yaml@v1
+ with:
+ COMPONENT: ${{ needs.release-infos.outputs.component }}
+ UBUNTU_VERSION: ${{ needs.release-infos.outputs.UBUNTU_VERSION }}
+ distribution: ${{ needs.release-infos.outputs.distribution }}
+ moduleName: indy_node
+ secrets:
+ INDY_ARTIFACTORY_REPO_CONFIG: ${{ secrets.INDY_ARTIFACTORY_REPO_CONFIG }}
+ PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
* [X] diff --git a/.github/workflows/releasepr.yaml b/.github/workflows/releasepr.yaml
Part of new build
new file mode 100644
index 00000000..44ac6c91
--- /dev/null
+++ b/.github/workflows/releasepr.yaml
@@ -0,0 +1,80 @@
+name: Triggered by Version Bump Release PR
+
+on:
+ pull_request:
+ paths:
+ - '!**'
+ - "indy_node/__version__.json"
+
+jobs:
+ release-infos:
+ name: infos
+ runs-on: ubuntu-latest
+ outputs:
+ isVersionBump: ${{ steps.get-release-info.outputs.isVersionBump }}
+ isPreRelease: ${{ steps.get-release-info.outputs.isRC }}
+ CACHE_KEY_BUILD: ${{ steps.workflow-setup.outputs.CACHE_KEY_BUILD }}
+ UBUNTU_VERSION: ${{ steps.workflow-setup.outputs.UBUNTU_VERSION }}
+ # Expose the lowercase version of the GitHub repository name
+ # to all subsequent jobs that reference image repositories
+ # as the push and pull operations require the URL of the repository
+ # to be in lowercase.
+ GITHUB_REPOSITORY_NAME: ${{ steps.workflow-setup.outputs.GITHUB_REPOSITORY_NAME }}
+ distribution: ${{ steps.workflow-setup.outputs.distribution }}
+ steps:
+ - name: checkout source code
+ uses: actions/checkout@v3
+ - name: get-release-info
+ id: get-release-info
+ uses: hyperledger/indy-shared-gha/.github/actions/get-release-info@v1
+ with:
+ versionString: "${{ github.event.pull_request.body }}"
+ - name: workflow-setup
+ id: workflow-setup
+ uses: hyperledger/indy-shared-gha/.github/actions/workflow-setup@v1
+
+ lint:
+ name: Lint
+ needs: [release-infos]
+ if: needs.release-infos.outputs.isVersionBump == 'true'
+ uses: hyperledger/indy-shared-gha/.github/workflows/lint.yaml@v1
+
+ build-docker-image:
+ name: Create Builder Image
+ needs: [release-infos, lint]
+ if: needs.release-infos.outputs.isVersionBump == 'true'
+ uses: hyperledger/indy-shared-gha/.github/workflows/buildimage.yaml@v1
+ with:
+ CACHE_KEY_BUILD: ${{ needs.release-infos.outputs.CACHE_KEY_BUILD }}
+ DOCKER_IMAGE: ghcr.io/${{ needs.release-infos.outputs.GITHUB_REPOSITORY_NAME }}/node-build
+ UBUNTU_VERSION: ${{ needs.release-infos.outputs.UBUNTU_VERSION }}
+
+
+ indy_node_tests:
+ name: Indy Node Tests
+ needs: [release-infos, build-docker-image]
+ if: needs.release-infos.outputs.isVersionBump == 'true'
+ uses: ./.github/workflows/reuseable_test.yaml
+ with:
+ GITHUB_REPOSITORY_NAME: ${{ needs.release-infos.outputs.GITHUB_REPOSITORY_NAME }}
+ UBUNTU_VERSION: ${{ needs.release-infos.outputs.UBUNTU_VERSION }}
+
+
+ build_packages:
+ name: Build Packages
+ needs: [release-infos, indy_node_tests]
+ if: needs.release-infos.outputs.isVersionBump == 'true'
+ uses: hyperledger/indy-shared-gha/.github/workflows/buildpackages.yaml@v1
+ with:
+ DOCKER_IMAGE: ghcr.io/${{ needs.release-infos.outputs.GITHUB_REPOSITORY_NAME }}/node-build:${{ needs.release-infos.outputs.UBUNTU_VERSION }}
+ UBUNTU_VERSION: ${{ needs.release-infos.outputs.UBUNTU_VERSION }}
+ isDev: 'false'
+ isRC: '${{ needs.release-infos.outputs.isPreRelease }}'
+ moduleName: indy_node
+
+ statusCheck:
+ name: statusCheck
+ runs-on: ubuntu-latest
+ needs: [build_packages]
+ steps:
+ - run: 'echo "Just a status Check (Always true, when executed) for branch protection rules(blocks merging while test are running and if tests fail)." '
\ No newline at end of file
* [X] diff --git a/.github/workflows/reuseable_test.yaml b/.github/workflows/reuseable_test.yaml
Part of new build
new file mode 100644
index 00000000..63368b97
--- /dev/null
+++ b/.github/workflows/reuseable_test.yaml
@@ -0,0 +1,75 @@
+name: "Test Indy Node"
+
+on:
+ workflow_call:
+ inputs:
+ GITHUB_REPOSITORY_NAME:
+ required: true
+ type: string
+ UBUNTU_VERSION:
+ required: true
+ type: string
+
+jobs:
+ indy_node_tests:
+ name: Sliced Module Tests
+ runs-on: ubuntu-20.04
+ # Fix for scacap/action-surefire-report out of memory error:
+ # - https://github.com/ScaCap/action-surefire-report/issues/17
+ env:
+ NODE_OPTIONS: '--max_old_space_size=4096'
+ #SLICE_TOTAL_SLICES needs to match the total number of slices in the matrix strategy.
+ SLICE_TOTAL_SLICES: 11
+ container:
+ image: ghcr.io/${{ inputs.GITHUB_REPOSITORY_NAME }}/node-build:${{ inputs.UBUNTU_VERSION }}
+ strategy:
+ matrix:
+ module: [indy_node, indy_common]
+ slice: [1, 2, 3, 4 ,5, 6, 7, 8, 9, 10, 11]
+ fail-fast: false
+ steps:
+ - name: Check out code
+ uses: actions/checkout@v3
+
+ # ===============================================
+ # Caching cannot be used.
+ # - For some reason as soon as it is enabled
+ # the test start complaining about zmq missing
+ # for the plenum install.
+ # -----------------------------------------------
+ # - name: Cache pip
+ # uses: actions/cache@v3
+ # with:
+ # # pip cache on the node-build image is not in the default location.
+ # # path: ~/.cache/pip
+ # path: /root/.cache/pip
+ # key: ${{ runner.os }}-indy-node-pip-${{ hashFiles('**/requirements.txt', '**/setup.py') }}
+ # restore-keys: |
+ # ${{ runner.os }}-indy-node-pip-
+
+ - name: Install dependencies
+ run: |
+ # Explicitly use the existing pip cache location in the node-build image.
+ pip --cache-dir /root/.cache/pip install .[tests]
+
+ - name: Run Indy Node ${{ matrix.module }} test slice ${{ matrix.slice }}/ ${{ env.SLICE_TOTAL_SLICES }}
+ id: node-test
+ run: RUSTPYTHONASYNCIODEBUG=0 python3 runner.py --pytest "python3 -m pytest -l -vv" --dir "${{ matrix.module }}" --output "test-result-node-${{ matrix.slice }}.txt" --test-only-slice "${{ matrix.slice }}/ ${{ env.SLICE_TOTAL_SLICES }}"
+
+ - name: Publish Test Report
+ if: success() || failure()
+ uses: scacap/action-surefire-report@v1.0.7
+ continue-on-error: true
+ with:
+ check_name: Indy Node ${{ matrix.module }} Test Report for slice ${{ matrix.slice }}/${{ strategy.job-total }}
+ github_token: ${{ secrets.GITHUB_TOKEN }}
+ report_paths: "*-test-results.xml"
+
+ - name: Upload Detailed Test Failure Results
+ # The test runner only emits the detailed test results if the tests fail.
+ if: (steps.node-test.outcome == 'failure') && failure()
+ uses: actions/upload-artifact@v3
+ with:
+ name: detailed-test-result-slice-${{ matrix.slice }}
+ path: test-result-node-${{ matrix.slice }}.txt
+ retention-days: 5
* [X] diff --git a/.github/workflows/tag.yaml b/.github/workflows/tag.yaml
Part of new build
new file mode 100644
index 00000000..797a0fe5
--- /dev/null
+++ b/.github/workflows/tag.yaml
@@ -0,0 +1,74 @@
+name: Triggered by set Tag
+
+on:
+ push:
+ tags:
+ - setRelease-v**
+
+jobs:
+ taginfos:
+ name: get Tag infos
+ runs-on: ubuntu-latest
+ outputs:
+ version: ${{ steps.get-release-info.outputs.version }}
+ versionTag: ${{ steps.get-release-info.outputs.versionTag }}
+ prBranch: ${{ steps.get-release-info.outputs.prBranch }}
+ BASE: ${{ steps.get-branch.outputs.branch }}
+ steps:
+ - name: checkout source code
+ uses: actions/checkout@v1
+ - name: extract branch
+ id: get-branch
+ uses: hyperledger/indy-shared-gha/.github/actions/branch-from-tag@v1
+ with:
+ tag: ${{ github.ref }}
+ - name: get-release-info
+ id: get-release-info
+ uses: hyperledger/indy-shared-gha/.github/actions/get-release-info@v1
+ with:
+ versionString: "${{ github.ref }}"
+
+ bump_version:
+ name: Bump Version Number
+ needs: taginfos
+ runs-on: ubuntu-20.04
+ steps:
+ - name: Check out code
+ uses: actions/checkout@v3
+ - name: Set up Python
+ uses: actions/setup-python@v2
+ with:
+ python-version: '3.8'
+ - name: Install deps for version change
+ run: |
+ sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 9692C00E657DDE61
+ sudo add-apt-repository 'deb https://hyperledger.jfrog.io/artifactory/indy focal dev'
+ sudo apt-get update -y && sudo apt-get install -y \
+ rocksdb=5.8.8 \
+ libgflags-dev \
+ libsnappy-dev \
+ zlib1g-dev \
+ libbz2-dev \
+ liblz4-dev \
+ libgflags-dev
+ pip install packaging \
+ importlib_metadata==3.10.1 \
+ indy-plenum==1.13.1rc3 \
+ pyzmq==22.3.0
+ - name: Prepare package and set version
+ run: |
+ ./bump_version.sh ${{ needs.taginfos.outputs.VERSION }}
+
+ - name: Create Pull Request
+ uses: peter-evans/create-pull-request@v3
+ with:
+ author: ${{ github.actor }} <${{ github.event.pusher.email }}>
+ committer: ${{ github.actor }} <${{ github.event.pusher.email }}>
+ signoff: true
+ commit-message: Update Version number for v${{ needs.taginfos.outputs.version }}
+ base: ${{ needs.taginfos.outputs.BASE }}
+ branch: ${{ needs.taginfos.outputs.prBranch }}
+ title: "[${{ needs.taginfos.outputs.versionTag }}] - Update Version Number for Release"
+ body: "[${{ needs.taginfos.outputs.versionTag }}] - Update Version number for Release"
+ delete-branch: true
+ token: ${{ secrets.BOT_PR_PAT }}
* [X] diff --git a/.gitpod.Dockerfile b/.gitpod.Dockerfile
Part of new build
new file mode 100644
index 00000000..2d62d840
--- /dev/null
+++ b/.gitpod.Dockerfile
@@ -0,0 +1,72 @@
+FROM gitpod/workspace-full as base
+
+USER gitpod
+
+
+RUN sudo apt-get update -y && sudo apt-get install -y \
+ # common stuff
+ git \
+ wget \
+ gnupg \
+ apt-transport-https \
+ ca-certificates \
+ apt-utils \
+ curl \
+ jq
+
+# ========================================================================================================
+# Update repository signing keys
+# --------------------------------------------------------------------------------------------------------
+# Hyperledger
+RUN sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 9692C00E657DDE61 && \
+ # Sovrin
+ sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
+# ========================================================================================================
+
+# Plenum
+# - https://github.com/hyperledger/indy-plenum/issues/1546
+# - Needed to pick up rocksdb=5.8.8
+RUN sudo add-apt-repository 'deb https://hyperledger.jfrog.io/artifactory/indy focal dev' && \
+ sudo add-apt-repository 'deb http://security.ubuntu.com/ubuntu bionic-security main' && \
+ sudo add-apt-repository 'deb https://repo.sovrin.org/deb bionic master' && \
+ sudo add-apt-repository 'deb https://repo.sovrin.org/sdk/deb bionic master'
+
+
+
+RUN sudo apt-get update -y && sudo apt-get install -y \
+ # Python
+ python3-pip \
+ python3-nacl \
+ # rocksdb python wrapper
+ rocksdb=5.8.8 \
+ libgflags-dev \
+ libsnappy-dev \
+ zlib1g-dev \
+ libbz2-dev \
+ liblz4-dev \
+ libgflags-dev \
+ # zstd is needed for caching in github actions pipeline
+ zstd \
+ # fpm
+ ruby \
+ ruby-dev \
+ rubygems \
+ gcc \
+ make \
+ # Indy Node and Plenum
+ libssl1.0.0 \
+ ursa=0.3.2-1 \
+ # Indy SDK
+ libindy=1.15.0~1625-bionic \
+ # Need to move libursa.so to parent dir
+ && sudo mv /usr/lib/ursa/* /usr/lib && sudo rm -rf /usr/lib/ursa
+
+RUN pip3 install -U \
+ # Required by setup.py
+ setuptools==50.3.2 \
+ 'pyzmq==22.3.0'
+
+
+# install fpm
+RUN sudo gem install --no-document rake
+RUN sudo gem install --no-document fpm -v 1.14.2
* [X] diff --git a/.gitpod.yml b/.gitpod.yml
Part of new build
new file mode 100644
index 00000000..6015df5c
--- /dev/null
+++ b/.gitpod.yml
@@ -0,0 +1,26 @@
+image:
+ file: .gitpod.Dockerfile
+
+tasks:
+ - name: Pip installs
+ init: pip install .[tests]
+
+github:
+ prebuilds:
+ # enable for the master/default branch (defaults to true)
+ master: false
+ # enable for all branches in this repo (defaults to false)
+ branches: false
+ # enable for pull requests coming from this repo (defaults to true)
+ pullRequests: false
+ # enable for pull requests coming from forks (defaults to false)
+ pullRequestsFromForks: false
+ # add a "Review in Gitpod" button as a comment to pull requests (defaults to true)
+ addComment: false
+ # add a "Review in Gitpod" button to pull requests (defaults to false)
+ addBadge: false
+ # add a label once the prebuild is ready to pull requests (defaults to false)
+ addLabel: false
+
+vscode:
+ extensions: ["mhutchie.git-graph", "eamodio.gitlens","ms-python.python" ]
\ No newline at end of file
* [X] diff --git a/.vscode/settings.json b/.vscode/settings.json
Part of new build
new file mode 100644
index 00000000..b1ac01a9
--- /dev/null
+++ b/.vscode/settings.json
@@ -0,0 +1,12 @@
+{
+ "python.linting.pylintEnabled": false,
+ "python.linting.flake8Enabled": true,
+ "python.linting.enabled": true,
+ "python.testing.unittestEnabled": false,
+ "python.testing.pytestEnabled": true,
+ "python.linting.mypyEnabled": false,
+ "python.testing.pytestArgs": [
+ "-c",
+ "pytest.ini"
+ ]
+}
\ No newline at end of file
* [X] diff --git a/CHANGELOG.md b/CHANGELOG.md
Hmmm, it seems we might want some changes from here... let's investigate. No, it seems we already have everything (a quick way to double-check is sketched after this hunk).
index 685962b3..b2a1a94c 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,11 @@
# Hyperledger Indy Node Release Notes
+* [1.12.4](#1124)
+
+* [1.12.3](#1123)
+
+* [1.12.2](#1122)
+
* [1.12.1](#1121)
* [1.12.0](#1120)
@@ -56,6 +62,92 @@
Although every attempt has been made to make this information as accurate as possible, please know there may be things that are omitted, not fully developed yet, or updates since this publication that were not included in the information below. Only the most pressing or significant items have been listed. For the entire list of tickets and or specific information about any given item, please visit the list at [Hyperleder Indy's Jira](https://jira.hyperledger.org/). Once logged in, simply navigate to Projects > Indy.
+## 1.12.4
+### Release date: Aug 19th, 2020
+
+### Component Version Information
+| Components | Version Numbers |
+| --- | --- |
+| indy-plenum | 1.12.4 |
+| indy-node | 1.12.4 |
+| sovrin | 1.1.89 |
+
+### Additional Information:
+
+**There are possible OOM issues during 3+ hours of target load or large catch-ups at 8 GB RAM nodes pool so 32 GB is recommended.**
+
+### Major Changes
+- NYM dynamic validation check transaction w/out verkey or role
+
+### Detailed Changelog
+
+#### Changes and Additions
+| Description | Additional Information | Ticket Number |
+| --- | --- | --- |
+| NYM dynamic validation check transaction w/out verkey or role | | |
+
+## 1.12.3
+### Release date: Jun 1st, 2020
+
+### Component Version Information
+| Components | Version Numbers |
+| --- | --- |
+| indy-plenum | 1.12.3 |
+| indy-node | 1.12.3 |
+| sovrin | 1.1.81 |
+
+### Additional Information:
+
+**There are possible OOM issues during 3+ hours of target load or large catch-ups at 8 GB RAM nodes pool so 32 GB is recommended.**
+
+### Major Changes
+- Bug fix
+
+### Detailed Changelog
+
+#### Major Fixes
+| Description | Additional Information | Ticket Number |
+| --- | --- | --- |
+| TAA signature's validation fix (milliseconds sending broke the primary) | | |
+
+## 1.12.2
+### Release date: Jan 30th, 2020
+
+### Component Version Information
+| Components | Version Numbers |
+| --- | --- |
+| indy-plenum | 1.12.2 |
+| indy-node | 1.12.2 |
+| sovrin | 1.1.71 |
+
+### Additional Information:
+
+**Stop indy-node service on demoted nodes to avoid a minor issue with client's requests processing (see Known Issues for details).**
+
+**There are possible OOM issues during 3+ hours of target load or large catch-ups at 8 GB RAM nodes pool so 32 GB is recommended.**
+
+### Major Changes
+- Stability fixes
+
+### Detailed Changelog
+
+#### Major Fixes
+| Description | Additional Information | Ticket Number |
+| --- | --- | --- |
+| WARNING messages incorrectly logged if tokens are not used | | [INDY-2221](https://jira.hyperledger.org/browse/INDY-2221) |
+| REV_REG_DEF `tag` field is not validated | | [INDY-2314](https://jira.hyperledger.org/browse/INDY-2314) |
+| A node may re-send messages in a loop in case of connection issues | | [INDY-2318](https://jira.hyperledger.org/browse/INDY-2318) |
+| Up to F Nodes may not be able to finish View Change if there are uncommitted NODE txns | | [INDY-2319](https://jira.hyperledger.org/browse/INDY-2319) |
+| A node lagging behind may not be able to finish view change if nodes have been added/demoted | | [INDY-2308](https://jira.hyperledger.org/browse/INDY-2308) |
+| A lagging node may use wrong N and F quorum values and never finish view change if there are NODE txns being processed | | [INDY-2320](https://jira.hyperledger.org/browse/INDY-2320) |
+| A lagging node may be the only one who started view change in case of F Nodes added/promoted in 1 batch | | [INDY-2322](https://jira.hyperledger.org/browse/INDY-2322) |
+| Debug View Change when nodes added/demoted/promoted | | [INDY-2326](https://jira.hyperledger.org/browse/INDY-2326) |
+
+#### Known Issues
+| Description | Additional Information | Ticket Number |
+| --- | --- | --- |
+| Demoted Node should not process client's requests | | [INDY-2334](https://jira.hyperledger.org/browse/INDY-2334) |
+
## 1.12.1
### Release date: Dec 28th, 2019
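A quick, illustrative way to double-check that nothing from these CHANGELOG entries is missing, assuming both branches are fetched from `origin`:

```bash
# Show only the CHANGELOG differences between the two branches.
git diff origin/stable..origin/ubuntu-20.04-upgrade -- CHANGELOG.md

# List the CHANGELOG commits already on each branch for a manual cross-check.
git log --oneline origin/stable -- CHANGELOG.md
git log --oneline origin/ubuntu-20.04-upgrade -- CHANGELOG.md
```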
* [X] diff --git a/CODEOWNERS b/CODEOWNERS
This seems fine.
index 7329bfc9..0f08cb39 100644
--- a/CODEOWNERS
+++ b/CODEOWNERS
@@ -1,5 +1,7 @@
-# indy-admin
-* @esplinr @dhh1128 @nage
+# SPDX-License-Identifier: Apache-2.0
+
+# Indy Admin
+* @hyperledger/indy-admin
# Indy Common
-* @anikitinDSR @ashcherbakov @skhoroshavin @Toktar @donqui @ken-ebert
+* @hyperledger/indy-common
* [X] diff --git a/Jenkinsfile.cd b/Jenkinsfile.cd
Part of old build
deleted file mode 100644
index b7174deb..00000000
--- a/Jenkinsfile.cd
+++ /dev/null
@@ -1,138 +0,0 @@
-#!groovy
-
-@Library('SovrinHelpers@v2.2') _
-
-String name = 'indy-node'
-String pkgName = name
-String mainModuleName = 'indy_node'
-Boolean gatherLogs = (params.GATHER_LOGS ?: env.GATHER_LOGS) != 'false'
-
-def nodeTestUbuntu = {
- try {
- echo 'Ubuntu Test: Checkout csm'
- checkout scm
-
- echo 'Ubuntu Test: Build docker image'
- def testEnv = dockerHelpers.build(name)
-
- testEnv.inside('--network host') {
- echo 'Ubuntu Test: Install dependencies'
- sh "pip install 'pip<10.0.0' 'pyzmq==18.1.0'"
- testHelpers.install()
-
- echo 'Ubuntu Test: Test'
- testHelpers.testRunner([resFile: "test-result-node.${NODE_NAME}.txt", testDir: 'indy_node'])
- //testHelpers.testJUnit(resFile: "test-result-node.${NODE_NAME}.xml")
- }
- }
- finally {
- echo 'Ubuntu Test: Cleanup'
- step([$class: 'WsCleanup'])
- }
-}
-
-def commonTestUbuntu = {
- try {
- echo 'Ubuntu Test: Checkout csm'
- checkout scm
-
- echo 'Ubuntu Test: Build docker image'
- def testEnv = dockerHelpers.build(name)
-
- testEnv.inside {
- echo 'Ubuntu Test: Install dependencies'
- sh "pip install 'pip<10.0.0' 'pyzmq==18.1.0'"
- testHelpers.install()
-
- echo 'Ubuntu Test: Test'
- testHelpers.testJUnit([resFile: "test-result-common.${NODE_NAME}.xml", testDir: 'indy_common'])
- }
- }
- finally {
- echo 'Ubuntu Test: Cleanup'
- step([$class: 'WsCleanup'])
- }
-}
-
-def buildDebUbuntu = { releaseVersion, sourcePath, packageVersion=null, missedPkgs=false ->
- def volumeName = "$name-deb-u1604"
- packageVersion = packageVersion ?: releaseVersion
-
- if (env.BRANCH_NAME != '' && env.BRANCH_NAME != 'master') {
- volumeName = "${volumeName}.${BRANCH_NAME}"
- }
- if (sh(script: "docker volume ls -q | grep -q '^$volumeName\$'", returnStatus: true) == 0) {
- sh "docker volume rm $volumeName"
- }
-
- // TODO build only missed ones
- dir('build-scripts/ubuntu-1604') {
- sh "./build-$name-docker.sh \"$sourcePath\" $releaseVersion $volumeName $packageVersion"
- if (missedPkgs == [pkgName]) {
- echo "Skip 3rd parties building"
- } else {
- sh "./build-3rd-parties-docker.sh $volumeName"
- }
- }
- return "$volumeName"
-}
-
-def systemTests = { component, releaseVersion ->
- def localLib
- nodeWrapper('ubuntu') {
- stage('Load local shared library') {
- checkout scm
- localLib = load 'ci/pipeline.groovy'
- }
- }
-
- localLib.systemTests {
- repoChannel = component
- pkgVersion = releaseVersion
- testSchema = [
- ['test_ledger.py'],
- ['TestViewChangeSuite.py'],
- ['TestConsensusSuite.py'],
- ['test_state_proof.py']
- ]
- testVersion = 'v0.8-indy-crypto'
- testVersionByTag = true
- delegate.gatherLogs = gatherLogs
- }
-}
-
-def options = new TestAndPublishOptions()
-options.setPkgName(pkgName)
-options.setApprovers(['QA'])
-options.setNotifEmails([
- QA: [
- to: "${env.INDY_NODE_QA_RECIPIENTS ?: ''}",
- cc: "${env.INDY_NODE_RECIPIENTS ?: ''}"
- ],
- success: [
- to: "${env.INDY_NODE_RECIPIENTS ?: ''}"
- ],
- fail: [
- to: "${env.INDY_NODE_RECIPIENTS ?: ''}"
- ]
-])
-
-// TODO duplicates list from build scripts
-options.setBuiltPkgs([
- 'python3-timeout-decorator': '0.4.0',
- 'python3-distro': '1.3.0',
-])
-
-
-options.enable([StagesEnum.PACK_RELEASE_COPY, StagesEnum.PACK_RELEASE_COPY_ST])
-options.setCopyWithDeps(true)
-options.setSystemTestsCb(systemTests)
-options.setPrContexts([env.INDY_GITHUB_PR_REQUIRED_CONTEXT ?: "ci/hyperledger-jenkins/pr-merge"])
-
-testAndPublish(
- name,
- [
- ubuntu: [node: nodeTestUbuntu, common: commonTestUbuntu]
- ],
- true, options, [ubuntu: buildDebUbuntu], mainModuleName
-)
* [X] diff --git a/Jenkinsfile.ci b/Jenkinsfile.ci
Part of old build
deleted file mode 100644
index 01fb8552..00000000
--- a/Jenkinsfile.ci
+++ /dev/null
@@ -1,220 +0,0 @@
-#!/usr/bin/env groovy
-
-/*
- * This Jenkinsfile is intended to run on https://ci.evernym.com and may fail anywhere else.
- *
- * Environment requirements:
- * - environment variable:
- * - INDY_AGENT_LINUX_DOCKER_LABEL: label for agents with ability
- * to run linux docker containers
- * - (optional) INDY_AGENT_WINDOWS_LABEL: label for windows agents
- * - agents:
- * - linux:
- * - docker
- * - windows:
- * - python3.5 + virtualenv
- * - cygwin
- */
-
-name = 'indy-node'
-
-def config = [
- codeValidation: true,
- runTests: true,
- failFast: false,
- sendNotif: true
-]
-
-
-// TODO enable windows
-def labels = [
- linux: env.INDY_AGENT_LINUX_DOCKER_LABEL ?: 'linux'
-]
-
-if (env.INDY_AGENT_WINDOWS_LABEL) {
- labels[windows] = env.INDY_AGENT_WINDOWS_LABEL
-}
-
-def wsCleanup() {
- try {
- cleanWs()
- } catch (NoSuchMethodError ex) {
- echo "WARNING: failed to clean the workspace, seems ws-cleanup plugin is not installed"
- }
-}
-
-def buildDocker(imageName, dockerfile) {
- def uid = sh(returnStdout: true, script: 'id -u').trim()
- return docker.build("$imageName", "--build-arg uid=$uid -f $dockerfile")
-}
-
-
-def install(options=[:]) {
- options.pip = options.pip ?: 'pip'
- options.isVEnv = options.isVEnv ?: false
- options.deps = options.deps ?: []
-
- for (def dep : options.deps) {
- sh "$options.pip install " + (options.isVEnv ? "-U" : "") + " $dep"
- }
-
- // TODO check that `--ignore-installed` case works when windows is enabled
- // (makes sense only for virtual envs with `--system-site-packages`)
- sh "$options.pip install " + (options.isVEnv ? "--ignore-installed" : "") + " .[tests]"
-}
-
-
-def withTestEnv(body) {
- echo 'Test: Checkout csm'
- checkout scm
-
- if (isUnix()) {
- echo 'Test: Build docker image'
-
- buildDocker("hyperledger/indy-node-ci", "ci/ubuntu.dockerfile ci").inside {
- echo 'Test: Install dependencies'
- sh "pip install 'pip<10.0.0' 'pyzmq==18.1.0'"
- install()
- body.call('python')
- }
- } else { // windows expected
- echo 'Test: Build virtualenv'
- def virtualEnvDir = ".venv"
- sh "virtualenv --system-site-packages $virtualEnvDir"
-
- echo 'Test: Install dependencies'
- install(pip: "$virtualEnvDir/Scripts/pip", isVenv: true)
- body.call("$virtualEnvDir/Scripts/python")
- }
-}
-
-
-def test(options=[:]) {
- options.resFile = options.resFile ?: 'test-result.txt'
- options.testDir = options.testDir ?: '.'
- options.python = options.python ?: 'python'
- options.useRunner = options.useRunner ?: false
- options.testOnlySlice = options.testOnlySlice ?: '1/1'
-
- try {
- if (options.useRunner) {
- sh "PYTHONASYNCIODEBUG='0' $options.python runner.py --pytest \"$options.python -m pytest -l -vv\" --dir $options.testDir --output \"$options.resFile\" --test-only-slice \"$options.testOnlySlice\""
- } else {
- sh "$options.python -m pytest -l -vv --junitxml=$options.resFile $options.testDir"
- }
- }
- finally {
- try {
- sh "ls -la $options.resFile"
- } catch (Exception ex) {
- // pass
- }
-
- if (options.useRunner) {
- archiveArtifacts allowEmptyArchive: true, artifacts: "$options.resFile"
- } else {
- junit "$options.resFile"
- }
- }
-}
-
-
-def staticCodeValidation() {
- try {
- echo 'Static code validation'
- checkout scm
-
- buildDocker('code-validation', 'ci/code-validation.dockerfile ci').inside {
- sh "python3 -m flake8"
- }
- }
- finally {
- echo 'Static code validation: Cleanup'
- wsCleanup()
- }
-}
-
-
-def tests = [
- common: { python ->
- test(
- resFile: "test-result-common.${NODE_NAME}.xml",
- testDir: 'indy_common',
- python: python
- )
- },
- node: { python ->
- test(
- resFile: "test-result-node.${NODE_NAME}.txt",
- testDir: 'indy_node',
- python: python,
- useRunner: true
- )
- },
-].collect {k, v -> [k, v]}
-
-
-def builds = [:]
-def _labels = labels.collect {k, v -> v}
-for (i = 0; i < _labels.size(); i++) {
- def label = _labels[i]
- def descr = "${label}Test"
-
- for(j = 0; j < tests.size(); j++) {
- def part = tests[j][0]
- def testFn = tests[j][1]
- def currDescr = "${descr}-${part}"
- builds[(currDescr)] = {
- stage(currDescr) {
- node(label) {
- try {
- withTestEnv() { python ->
- echo 'Test'
- testFn(python)
- }
- }
- finally {
- echo 'Cleanup'
- wsCleanup()
- }
- }
- }
- }
- }
-}
-
-// PIPELINE
-
-try {
- timeout(180) {
- stage('Static code validation') {
- if (config.codeValidation) {
- node(labels.linux) {
- staticCodeValidation()
- }
- }
- }
- stage('Build / Test') {
- if (config.runTests) {
- builds.failFast = config.failFast
- parallel builds
- }
- }
- currentBuild.result = 'SUCCESS'
- }
-} catch (Exception err) {
- println(err.toString())
- currentBuild.result = 'FAILURE'
-} finally {
- stage('Build result notification') {
- if (config.sendNotif) {
- def emailMessage = [
- body: '$DEFAULT_CONTENT',
- replyTo: '$DEFAULT_REPLYTO',
- subject: '$DEFAULT_SUBJECT',
- recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']]
- ]
- emailext emailMessage
- }
- }
-}
* [X] diff --git a/Jenkinsfile.nightly b/Jenkinsfile.nightly
Part of old build
deleted file mode 100644
index 8c7a5b22..00000000
--- a/Jenkinsfile.nightly
+++ /dev/null
@@ -1,91 +0,0 @@
-#!groovy
-
-@Library('SovrinHelpers@v2.2') _
-
-String pkgName = 'indy-node'
-String mainModuleName = 'indy_node'
-String emailRecipients = params.INDY_NODE_RECIPIENTS ?: env.INDY_NODE_RECIPIENTS ?: ''
-Boolean gatherLogs = (params.GATHER_LOGS ?: env.GATHER_LOGS) != 'false'
-
-def localLib
-def err
-String buildPkgVersion
-String buildSrcVersion
-
-String scmRepoUrl
-String scmSha1
-
-try {
- nodeWrapper('ubuntu') {
- stage('Resolve version to build') {
- docker.image('hyperledger/indy-core-baseci:0.0.3-master').inside('-u 0') {
- sh "apt-get update && apt-get install -y $pkgName"
- releaseVersion = getReleaseVersion(mainModuleName, false)
- buildPkgVersion = "${releaseVersion.release}~${releaseVersion.pre}${releaseVersion.revision}"
- buildSrcVersion = sh(returnStdout: true, script: """
- python3 -c "from $mainModuleName import load_manifest; print(load_manifest()['sha1'])"
- """).trim()
- }
- echo "Version to build: buildSrcVersion=$buildSrcVersion, buildPkgVersion=$buildPkgVersion"
- }
-
- stage('Load local shared library') {
- checkout scm
- localLib = load 'ci/pipeline.groovy'
-
- scmRepoUrl = gitHelper.repoUrl()
- scmSha1 = gitHelper.sha1()
- }
- }
-
- localLib.systemTests {
- repoChannel = 'master'
- pkgVersion = buildPkgVersion
- srcVersion = buildSrcVersion
- testSchema = [
- ['test_ledger.py'],
- ['test_state_proof.py'],
- ['TestViewChangeSuite.py'],
- ['test_off_ledger_signature.py'],
- ['TestConsensusSuite.py', 'TestTAASuite.py'],
- ['test_roles.py', 'test_freshness.py', 'TestMultiSigSuite.py'],
- ['TestAuditSuite.py'],
- ['TestCatchUpSuite.py'],
- ['TestCatchUpSuiteExtended.py'],
- // set of authmap tests
- // TODO might be groupped in parts once https://github.com/docker/docker-py/issues/2278 is resolved
- ['TestAuthMapAttribSuite.py'],
- ['TestAuthMapCredDefSuite.py'],
- ['TestAuthMapMiscSuite.py'],
- ['TestAuthMapNymSuite.py'],
- ['TestAuthMapPluginsSuite.py'],
- ['TestAuthMapRevocRegDefSuite.py'],
- ['TestAuthMapRevocRegEntrySuite.py'],
- ['TestAuthMapSchemaSuite.py'],
- ['TestAuthMapUpgradeSuite.py'],
- ['TestAdHocSuite.py'],
- ['TestProductionSuite.py']
- ]
- testVersion = 'v0.8-indy-crypto'
- testVersionByTag = true
- delegate.gatherLogs = gatherLogs
- }
-} catch(Exception _err) {
- currentBuild.result = "FAILED"
- err = _err
- throw _err
-} finally {
- stage('Build result notification') {
- sendNotification.email {
- to = emailRecipients
- subject = "[${pkgName}][nightly] Build #${this.env.BUILD_NUMBER} ${err ? 'failed' : 'succeed'} for version ${buildPkgVersion} (${buildSrcVersion})"
- srcUrl = "${scmRepoUrl}/tree/${scmSha1}"
- }
- }
-
- if (err) {
- stage('Error dump') {
- echo "Pipeline failed: $err"
- }
- }
-}
* [X] diff --git a/README.md b/README.md
Let us read through the diff here... Oh, that is pretty short. Basically, ursa instead of indy-crypto.
index aa13a7e1..5978da16 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,6 @@
![logo](collateral/logos/indy-logo.png)
+
+[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/hyperledger/indy-node/tree/ubuntu-20.04-upgrade)
# Indy Node
* [About Indy Node](#about-indy-node)
* [Technical Overview of Indy Blockchain](#technical-overview-of-indy-blockchain)
@@ -75,9 +77,8 @@ Indy Node repo consists of the following parts:
- An official SDK for Indy.
- It contains client and anoncreds implementation
- You don't need it to contribute to Indy-Node. But please use indy-sdk for your own applications dealing with Indy ecosystem.
-- [indy-crypto](https://github.com/hyperledger/indy-crypto)
- - A shared crypto library
- - It's based on [AMCL](https://github.com/milagro-crypto/amcl)
+- [ursa](https://github.com/hyperledger/ursa)
+ - Hyperledger's shared crypto library
- In particular, it contains BLS multi-signature crypto needed for state proofs support in Indy.
## Contact us
* [X] diff --git a/acceptance/indy-cli-batches/expected/AS-02-01-invalid-cases.expected b/acceptance/indy-cli-batches/expected/AS-02-01-invalid-cases.expected
Newline change...ignoring.
index f507d62b..07335e25 100644
--- a/acceptance/indy-cli-batches/expected/AS-02-01-invalid-cases.expected
+++ b/acceptance/indy-cli-batches/expected/AS-02-01-invalid-cases.expected
@@ -42,4 +42,4 @@ Transaction has been rejected: Proof of possession RbGAR89T5bg6Bg66Xg1fy5NjVarmv
Pool "AS-pool" has been disconnected
Wallet "AS-02-wallet" has been closed
-Goodbye...
\ No newline at end of file
+Goodbye...
* [X] diff --git a/build-scripts/ubuntu-1604/Dockerfile b/build-scripts/ubuntu-1604/Dockerfile
Part of old build
deleted file mode 100644
index 59d79fce..00000000
--- a/build-scripts/ubuntu-1604/Dockerfile
+++ /dev/null
@@ -1,30 +0,0 @@
-FROM ubuntu:16.04
-
-RUN apt-get update -y && apt-get install -y \
- # common stuff
- git \
- wget \
- unzip \
- python3.5 \
- python3-pip \
- python3-venv \
- # fmp
- ruby \
- ruby-dev \
- rubygems \
- gcc \
- make \
- && rm -rf /var/lib/apt/lists/*
-
-# issues with pip>=10:
-# https://github.com/pypa/pip/issues/5240
-# https://github.com/pypa/pip/issues/5221
-RUN python3 -m pip install -U 'pip<10.0.0' 'setuptools<=50.3.2' \
- && pip3 list
-
-# install fpm
-RUN gem install --no-ri --no-rdoc rake fpm
-
-WORKDIR /root
-
-ADD . /root
* [X] diff --git a/build-scripts/ubuntu-1604/prepare-package.sh b/build-scripts/ubuntu-1604/prepare-package.sh
Part of old build
deleted file mode 100755
index 05cc4308..00000000
--- a/build-scripts/ubuntu-1604/prepare-package.sh
+++ /dev/null
@@ -1,42 +0,0 @@
-#!/bin/bash -xe
-
-if [ "$1" = "--help" ] ; then
- echo "Usage: $0 <path-to-repo-folder> <release-version-dotted>"
- exit 0
-fi
-
-repo="$1"
-version_dotted="$2"
-
-pushd $repo
-
-echo -e "\nSetting version to $version_dotted"
-bash -ex ./bump_version.sh $version_dotted
-cat indy_node/__version__.json
-
-echo -e "\nGenerating manifest"
-bash -ex ./generate_manifest.sh
-cat indy_node/__manifest__.json
-
-echo -e "\n\nPrepares indy-plenum debian package version"
-sed -i -r "s~indy-plenum==([0-9\.]+[0-9])(\.)?([a-z]+)~indy-plenum==\1\~\3~" setup.py
-
-echo -e "\nAdapt the dependencies for the Canonical archive"
-sed -i "s~timeout-decorator~python3-timeout-decorator~" setup.py
-sed -i "s~distro~python3-distro~" setup.py
-
-echo "Preparing config files"
-GENERAL_CONFIG_DIR="\/etc\/indy"
-REPO_GENERAL_CONFIG_DIR="indy_node/general_config"
-# Define user config directory
-sed -i "s/^\(GENERAL_CONFIG_DIR\s*=\s*\).*\$/\1\"$GENERAL_CONFIG_DIR\"/" indy_common/config.py
-# Create user config
-cp $REPO_GENERAL_CONFIG_DIR/general_config.py $REPO_GENERAL_CONFIG_DIR/indy_config.py
-cat $REPO_GENERAL_CONFIG_DIR/ubuntu_platform_config.py >> $REPO_GENERAL_CONFIG_DIR/indy_config.py
-rm -f $REPO_GENERAL_CONFIG_DIR/general_config.py
-rm -f $REPO_GENERAL_CONFIG_DIR/ubuntu_platform_config.py
-rm -f $REPO_GENERAL_CONFIG_DIR/windows_platform_config.py
-
-popd
-
-echo -e "\nFinished preparing $repo for publishing\n"
* [X] diff --git a/build-scripts/ubuntu-2004/Dockerfile b/build-scripts/ubuntu-2004/Dockerfile
Part of new build.
new file mode 100644
index 00000000..59fcaf64
--- /dev/null
+++ b/build-scripts/ubuntu-2004/Dockerfile
@@ -0,0 +1,73 @@
+FROM ubuntu:20.04
+
+ARG uid=1000
+ARG user=indy
+
+RUN apt-get update -y && apt-get install -y \
+ # common stuff
+ git \
+ wget \
+ gnupg \
+ apt-transport-https \
+ ca-certificates \
+ apt-utils
+
+# ========================================================================================================
+# Update repository signing keys
+# --------------------------------------------------------------------------------------------------------
+# Hyperledger
+RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 9692C00E657DDE61 && \
+ # Sovrin
+ apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
+# ========================================================================================================
+
+# Plenum
+# - https://github.com/hyperledger/indy-plenum/issues/1546
+# - Needed to pick up rocksdb=5.8.8
+RUN echo "deb https://hyperledger.jfrog.io/artifactory/indy focal dev" >> /etc/apt/sources.list && \
+ echo "deb http://security.ubuntu.com/ubuntu bionic-security main" >> /etc/apt/sources.list && \
+ echo "deb https://repo.sovrin.org/deb bionic master" >> /etc/apt/sources.list && \
+ echo "deb https://repo.sovrin.org/sdk/deb bionic master" >> /etc/apt/sources.list
+
+# Sovrin
+RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
+
+RUN apt-get update -y && apt-get install -y \
+ # Python
+ python3-pip \
+ python3-nacl \
+ # rocksdb python wrapper
+ rocksdb=5.8.8 \
+ libgflags-dev \
+ libsnappy-dev \
+ zlib1g-dev \
+ libbz2-dev \
+ liblz4-dev \
+ libgflags-dev \
+ # zstd is needed for caching in github actions pipeline
+ zstd \
+ # fpm
+ ruby \
+ ruby-dev \
+ rubygems \
+ gcc \
+ make \
+ # Indy Node and Plenum
+ libssl1.0.0 \
+ ursa=0.3.2-1 \
+ # Indy SDK
+ libindy=1.15.0~1625-bionic \
+ # Need to move libursa.so to parent dir
+ && mv /usr/lib/ursa/* /usr/lib && rm -rf /usr/lib/ursa
+
+RUN pip3 install -U \
+ # Required by setup.py
+ setuptools==50.3.2 \
+ 'pyzmq==22.3.0'
+
+# install fpm
+RUN gem install --no-document rake
+RUN gem install --no-document fpm -v 1.14.2
+
+RUN apt-get -y autoremove \
+ && rm -rf /var/lib/apt/lists/*
* [X] diff --git a/build-scripts/ubuntu-1604/README.md b/build-scripts/ubuntu-2004/README.md
Readme for new build.
similarity index 100%
rename from build-scripts/ubuntu-1604/README.md
rename to build-scripts/ubuntu-2004/README.md
* [X] diff --git a/build-scripts/ubuntu-1604/build-3rd-parties-docker.sh b/build-scripts/ubuntu-2004/build-3rd-parties-docker.sh
Seems okay. Part of new build process. NOTE: Interesting that the process requires jq.
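A quick way to see what the new PyPI lookup would resolve to today (a sketch, assuming network access and jq installed; timeout-decorator is the package left unpinned below):
curl -s https://pypi.org/pypi/timeout-decorator/json | jq --raw-output '.info.version'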
similarity index 67%
rename from build-scripts/ubuntu-1604/build-3rd-parties-docker.sh
rename to build-scripts/ubuntu-2004/build-3rd-parties-docker.sh
index e333bd28..9ca5c7ae 100755
--- a/build-scripts/ubuntu-1604/build-3rd-parties-docker.sh
+++ b/build-scripts/ubuntu-2004/build-3rd-parties-docker.sh
@@ -10,10 +10,10 @@ else
fi
PKG_NAME=indy-node
-IMAGE_NAME="${PKG_NAME}-build-u1604"
-OUTPUT_VOLUME_NAME="${1:-"${PKG_NAME}-deb-u1604"}"
+IMAGE_NAME="${PKG_NAME}-build-u2004"
+OUTPUT_VOLUME_NAME="${1:-"${PKG_NAME}-deb-u2004"}"
-docker build -t "${PKG_NAME}-build-u1604" -f Dockerfile .
+docker build -t "${PKG_NAME}-build-u2004" -f Dockerfile .
docker volume create --name "${OUTPUT_VOLUME_NAME}"
docker run \
* [X] diff --git a/build-scripts/ubuntu-1604/build-3rd-parties.sh b/build-scripts/ubuntu-2004/build-3rd-parties.sh
Part of the new build.
similarity index 70%
rename from build-scripts/ubuntu-1604/build-3rd-parties.sh
rename to build-scripts/ubuntu-2004/build-3rd-parties.sh
index 994945d0..57465864 100755
--- a/build-scripts/ubuntu-1604/build-3rd-parties.sh
+++ b/build-scripts/ubuntu-2004/build-3rd-parties.sh
@@ -10,6 +10,10 @@ function build_from_pypi {
if [ -z "$2" ]; then
PACKAGE_VERSION=""
+ # Get the most recent package version from PyPI to be included in the package name of the Debian artifact
+ curl -X GET "https://pypi.org/pypi/${PACKAGE_NAME}/json" > "${PACKAGE_NAME}.json"
+ PACKAGE_VERSION="==$(cat "${PACKAGE_NAME}.json" | jq --raw-output '.info.version')"
+ rm "${PACKAGE_NAME}.json"
else
PACKAGE_VERSION="==$2"
fi
@@ -41,5 +45,11 @@ function build_from_pypi {
# build 3rd parties:
# build_from_pypi <pypi-name> <version>
# TODO duplicates list from Jenkinsfile.cd
-build_from_pypi timeout-decorator 0.4.0
-build_from_pypi distro 1.3.0
+
+SCRIPT_PATH="${BASH_SOURCE[0]}"
+pushd `dirname ${SCRIPT_PATH}` >/dev/null
+
+build_from_pypi timeout-decorator
+build_from_pypi distro 1.7.0
+
+popd >/dev/null
\ No newline at end of file
* [X] diff --git a/build-scripts/ubuntu-1604/build-indy-node-docker.sh b/build-scripts/ubuntu-2004/build-indy-node-docker.sh
Simple rename for new build.
similarity index 88%
rename from build-scripts/ubuntu-1604/build-indy-node-docker.sh
rename to build-scripts/ubuntu-2004/build-indy-node-docker.sh
index f27b9bff..dd5e1779 100755
--- a/build-scripts/ubuntu-1604/build-indy-node-docker.sh
+++ b/build-scripts/ubuntu-2004/build-indy-node-docker.sh
@@ -3,8 +3,8 @@
PKG_SOURCE_PATH="$1"
VERSION="$2"
PKG_NAME=indy-node
-IMAGE_NAME="${PKG_NAME}-build-u1604"
-OUTPUT_VOLUME_NAME="${3:-"${PKG_NAME}-deb-u1604"}"
+IMAGE_NAME="${PKG_NAME}-build-u2004"
+OUTPUT_VOLUME_NAME="${3:-"${PKG_NAME}-deb-u2004"}"
PACKAGE_VERSION="${4:-$VERSION}"
if [[ (-z "${PKG_SOURCE_PATH}") || (-z "${VERSION}") ]]; then
* [X] diff --git a/build-scripts/ubuntu-1604/build-indy-node.sh b/build-scripts/ubuntu-2004/build-indy_node.sh
Build script update for the new layout, plus a newer libsodium dependency (libsodium23 instead of libsodium18).
similarity index 86%
rename from build-scripts/ubuntu-1604/build-indy-node.sh
rename to build-scripts/ubuntu-2004/build-indy_node.sh
index fe4f0cf4..ec961d17 100755
--- a/build-scripts/ubuntu-1604/build-indy-node.sh
+++ b/build-scripts/ubuntu-2004/build-indy_node.sh
@@ -12,8 +12,8 @@ TMP_DIR="$(mktemp -d)"
cp -r "${INPUT_PATH}/." "${TMP_DIR}"
# prepare the sources
-cd "${TMP_DIR}/build-scripts/ubuntu-1604"
-./prepare-package.sh "${TMP_DIR}" "${VERSION}"
+cd "${TMP_DIR}/build-scripts/ubuntu-2004"
+./prepare-package.sh "${TMP_DIR}" indy_node "${VERSION}" debian-packages
sed -i "s/{package_name}/${PACKAGE_NAME}/" "prerm"
@@ -28,7 +28,7 @@ fpm --input-type "python" \
--exclude "*.pyo" \
--depends at \
--depends iptables \
- --depends libsodium18 \
+ --depends libsodium23 \
--no-python-fix-dependencies \
--maintainer "Hyperledger <hyperledger-indy@lists.hyperledger.org>" \
--before-install "preinst_node" \
* [X] diff --git a/build-scripts/ubuntu-1604/postinst b/build-scripts/ubuntu-2004/postinst
Use python 3.8 instead of python 3.5.
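A minimal sanity check, assuming Docker is available, that the default python3 on Ubuntu 20.04 really is 3.8 (which is what the new dist-packages path relies on):
docker run --rm ubuntu:20.04 bash -c "apt-get update -qq && apt-get install -qq -y python3 >/dev/null && python3 --version"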
old mode 100755
new mode 100644
similarity index 68%
rename from build-scripts/ubuntu-1604/postinst
rename to build-scripts/ubuntu-2004/postinst
index cf749a65..31b51450
--- a/build-scripts/ubuntu-1604/postinst
+++ b/build-scripts/ubuntu-2004/postinst
@@ -2,7 +2,7 @@
# Automatically added from template:
if which py3compile >/dev/null 2>&1; then
- py3compile -O -p {package_name} /usr/local/lib/python3.5/dist-packages/
+ py3compile -O -p {package_name} /usr/local/lib/python3.8/dist-packages/
fi
# End automatically added section
* [X] diff --git a/build-scripts/ubuntu-1604/postinst_node b/build-scripts/ubuntu-2004/postinst_node
Python version change.
old mode 100755
new mode 100644
similarity index 99%
rename from build-scripts/ubuntu-1604/postinst_node
rename to build-scripts/ubuntu-2004/postinst_node
index a13ddcad..ae86a4b4
--- a/build-scripts/ubuntu-1604/postinst_node
+++ b/build-scripts/ubuntu-2004/postinst_node
@@ -6,7 +6,7 @@ GENERAL_CONFIG_DIR="/etc/indy"
GENERAL_DATA_DIR="/var/lib/indy"
GENERAL_LOG_DIR="/var/log/indy"
-INSTALL_DIR='/usr/local/lib/python3.5/dist-packages'
+INSTALL_DIR='/usr/local/lib/python3.8/dist-packages'
NOFILES_SOFT_LIMIT=65536
NOFILES_HARD_LIMIT=131072
* [X] diff --git a/build-scripts/ubuntu-1604/preinst_node b/build-scripts/ubuntu-2004/preinst_node
Rename
old mode 100755
new mode 100644
similarity index 100%
rename from build-scripts/ubuntu-1604/preinst_node
rename to build-scripts/ubuntu-2004/preinst_node
* [X] diff --git a/build-scripts/ubuntu-2004/prepare-package.sh b/build-scripts/ubuntu-2004/prepare-package.sh
What to make of this code? It appears to be part of the new build system.
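Judging by the build-indy_node.sh change above, the intended deb-build invocation is presumably:
./prepare-package.sh "${TMP_DIR}" indy_node "${VERSION}" debian-packages
with python-packages as the last argument when preparing PyPi artifacts instead of deb packages.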
new file mode 100755
index 00000000..2f00dfe6
--- /dev/null
+++ b/build-scripts/ubuntu-2004/prepare-package.sh
@@ -0,0 +1,61 @@
+#!/bin/bash -xe
+
+if [ "$1" = "--help" ] ; then
+ echo "Usage: $0 <path-to-repo-folder> <main-module-name> <release-version-dotted> <distro-packages>"
+ echo "<distro-packages> - Set to 'debian-packages' when preparing deb packages, and 'python-packages' when preparing PyPi packages."
+ exit 0
+fi
+
+repo="$1"
+module_name="$2"
+version_dotted="$3"
+distro_packages="$4"
+
+BUMP_SH_SCRIPT="bump_version.sh"
+GENERATE_MANIFEST_SCRIPT="generate_manifest.sh"
+
+pushd $repo
+
+echo -e "\nSetting version to $version_dotted"
+bash -ex $BUMP_SH_SCRIPT $version_dotted
+cat $module_name/__version__.json
+
+echo -e "\nGenerating manifest"
+bash -ex $GENERATE_MANIFEST_SCRIPT
+cat $module_name/__manifest__.json
+
+if [ "$distro_packages" = "debian-packages" ]; then
+ echo -e "\n\nPrepares indy-node debian package version"
+ sed -i -r "s~indy-node==([0-9\.]+[0-9])(\.)?([a-z]+)~indy-node==\1\~\3~" setup.py
+
+ # Update the package names to match the names that are available on the os.
+ echo -e "\nAdapt the dependencies for the Canonical archive"
+ sed -i "s~timeout-decorator~python3-timeout-decorator~" setup.py
+ sed -i "s~distro~python3-distro~" setup.py
+ sed -i "s~importlib-metadata=~python3-importlib-metadata=~" setup.py
+
+ # Only used for the deb package builds, NOT for the PyPi package builds.
+ echo -e "\n\nPrepares indy-plenum debian package version"
+ sed -i -r "s~indy-plenum==([0-9\.]+[0-9])(\.)?([a-z]+)~indy-plenum==\1\~\3~" setup.py
+
+ echo "Preparing config files"
+ GENERAL_CONFIG_DIR="\/etc\/indy"
+ REPO_GENERAL_CONFIG_DIR="indy_node/general_config"
+ # Define user config directory
+ sed -i "s/^\(GENERAL_CONFIG_DIR\s*=\s*\).*\$/\1\"$GENERAL_CONFIG_DIR\"/" indy_common/config.py
+ # Create user config
+ cp $REPO_GENERAL_CONFIG_DIR/general_config.py $REPO_GENERAL_CONFIG_DIR/indy_config.py
+ cat $REPO_GENERAL_CONFIG_DIR/ubuntu_platform_config.py >> $REPO_GENERAL_CONFIG_DIR/indy_config.py
+ rm -f $REPO_GENERAL_CONFIG_DIR/general_config.py
+ rm -f $REPO_GENERAL_CONFIG_DIR/ubuntu_platform_config.py
+ rm -f $REPO_GENERAL_CONFIG_DIR/windows_platform_config.py
+elif [ "$distro_packages" = "python-packages" ]; then
+ echo -e "\nNo adaption of dependencies for python packages"
+else
+ echo -e "\nNo distribution specified. Please, specify distribution as 'debian-packages' or 'python-packages'."
+ exit 1
+fi
+
+popd
+
+echo -e "\nFinished preparing $repo for publishing\n"
\ No newline at end of file
* [X] diff --git a/build-scripts/ubuntu-1604/prerm b/build-scripts/ubuntu-2004/prerm
Rename
old mode 100755
new mode 100644
similarity index 100%
rename from build-scripts/ubuntu-1604/prerm
rename to build-scripts/ubuntu-2004/prerm
* [X] diff --git a/ci/Makefile b/ci/Makefile
Part of old build system.
deleted file mode 100644
index ad91792a..00000000
--- a/ci/Makefile
+++ /dev/null
@@ -1,15 +0,0 @@
-SHELL := /bin/bash
-
-IMAGE_NAME ?= indy-node-test
-
-.PHONY: all test_docker
-
-all: test_docker
-
-test_docker:
- $(eval UID := $(shell id -u))
- docker build -t $(IMAGE_NAME) --build-arg uid=$(UID) -f ubuntu.dockerfile .
- $(eval REPO_PATH := $(realpath ..))
- $(eval WORKDIR := /home/indy/indy-node)
- docker run --rm -it -w $(WORKDIR) -v $(REPO_PATH):$(WORKDIR) $(IMAGE_NAME) \
- bash -c "pip install .[tests] && python -m pytest"
* [X] diff --git a/ci/code-validation.dockerfile b/ci/code-validation.dockerfile
Part of old build system.
deleted file mode 100644
index f7281c1c..00000000
--- a/ci/code-validation.dockerfile
+++ /dev/null
@@ -1,18 +0,0 @@
-# Development
-FROM ubuntu:16.04
-
-ARG uid=1000
-
-# Install environment
-RUN apt-get update -y && apt-get install -y \
- git \
- wget \
- python3.5 \
- python3-pip \
- python-setuptools \
- python3-nacl
-RUN pip3 install -U \
- 'setuptools<=50.3.2' \
- pep8==1.7.1 \
- pep8-naming==0.6.1 \
- flake8==3.5.0
* [X] diff --git a/ci/pipeline.groovy b/ci/pipeline.groovy
Part of old build system.
deleted file mode 100644
index 10b320fd..00000000
--- a/ci/pipeline.groovy
+++ /dev/null
@@ -1,202 +0,0 @@
-#!groovy
-
-def systemTests(Closure body) {
- String prefix = "System Tests"
- String systemTestsNetwork = 'indy-test-automation-network'
- String systemTestsDir = './system_tests'
-
- def config = delegateConfig([
- repoChannel: 'master',
- pkgVersion: null,
- indyNodeRepoUrl: 'https://github.com/hyperledger/indy-node.git',
- srcVersion: null,
- testSchema: [['.']],
- testVersion: null,
- testVersionByTag: false,
- gatherLogs: true
- ],
- body, ['pkgVersion'], {}, prefix
- )
-
- Map systemTestsParams = [
- targetDir: systemTestsDir
- ]
-
- if (config.testVersion) {
- if (!!config.testVersionByTag) {
- systemTestsParams.tag = config.testVersion
- } else {
- systemTestsParams.branch = config.testVersion
- }
- }
-
- Map indyPlenumVersions = [:]
- Map indySDKVersions = [:]
- Map indyCryptoVersions = [:]
-
- def dockerClean = {
- sh "./system/docker/clean.sh $systemTestsNetwork"
-
- try {
- sh "docker ps -q --filter network=$systemTestsNetwork | xargs -r docker rm -f"
- } catch (Exception exc) {
- echo "$prefix: failed to remove docker containers in $systemTestsNetwork network: $exc"
- throw exc
- }
-
- try {
- sh "docker network ls -q --filter name=$systemTestsNetwork | xargs -r docker network rm"
- } catch (Exception exc) {
- echo "$prefix: failed to remove docker $systemTestsNetwork network: $exc"
- throw exc
- }
-
- sh "docker container prune -f"
- sh "docker network prune -f"
- }
-
- def runTest = { testGroup ->
-
- stage("[${testGroup}] Checkout system tests") {
- testHelpers.getSystemTests(systemTestsParams)
- }
-
- dir(systemTestsDir) {
- stage("[${testGroup}] Patch system tests python requirements") {
- sh """
- sed -i 's/python3-indy.*/python3-indy==${indySDKVersions.pypi}/g' ./system/requirements.txt
- #sed -i 's/indy-plenum.*/indy-plenum==${indyPlenumVersions.pypi}/g' ./system/requirements.txt
- #sed -i 's/indy-crypto.*/indy-crypto==${indyCryptoVersions.pypi}/g' ./system/requirements.txt
- """
- }
-
- stage("[${testGroup}] Cleanup docker") {
- dockerClean()
- }
-
- stage("[${testGroup}] Prepare docker env") {
- withEnv([
- "INDY_NODE_REPO_COMPONENT=${config.repoChannel}",
- "LIBINDY_CRYPTO_VERSION=${indyCryptoVersions.debian}",
- "PYTHON3_LIBINDY_CRYPTO_VERSION=${indyCryptoVersions.debian}",
- "INDY_PLENUM_VERSION=${indyPlenumVersions.debian}",
- "INDY_NODE_VERSION=${config.pkgVersion}",
- "LIBINDY_REPO_COMPONENT=${indySDKVersions.debian.replaceAll("-xenial", "") == indySDKVersions.pypi ? 'stable' : 'master'}",
- "LIBINDY_VERSION=${indySDKVersions.debian}",
- ]) {
- sh "./system/docker/prepare.sh $systemTestsNetwork"
- }
- }
-
- try {
- def err
- String testReportFileNameXml = "system_tests_${testGroup}_report.${config.repoChannel}.xml"
- String testReportFileNamePlain = "system_tests_${testGroup}_report.${config.repoChannel}.txt"
- String testTargets = config.testSchema[testGroup].collect{"system/indy-node-tests/$it"}.join(' ')
- String buildLogsDir = "_build/logs"
- String gatherLogsOpt = config.gatherLogs ? ' --gatherlogs' : ''
-
- try {
- stage("[${testGroup}] Run tests") {
- sh """
- bash -c "\
- set -o pipefail; \
- ./system/docker/run.sh \
- \\"$testTargets\\" \
- \\"-l -vv --junit-xml=$testReportFileNameXml ${gatherLogsOpt} --logsdir=${buildLogsDir}\\" \
- \\"$systemTestsNetwork\\" 2>&1 | tee $testReportFileNamePlain;\
- "
- """
- }
- } catch (_err) {
- err = _err
- throw _err
- } finally {
- stage("[${testGroup}] Upload test report") {
- sh "ls -la *report* || true"
- if (err) {
- archiveArtifacts artifacts: testReportFileNamePlain, allowEmptyArchive: true
- archiveArtifacts artifacts: "$buildLogsDir/**/*", allowEmptyArchive: true
- }
- junit testResults: testReportFileNameXml, allowEmptyResults: true
- }
- }
- } catch (Exception exc) {
- echo "$prefix: fail: $exc"
- throw exc
- } finally {
- stage("[${testGroup}] Cleanup docker") {
- dockerClean()
- }
- }
- }
- }
-
- nodeWrapper("ubuntu") {
- stage("Checkout SCM") {
- if (config.srcVersion) {
- checkout([
- $class: 'GitSCM',
- branches: [[name: config.srcVersion]],
- userRemoteConfigs: [[
- url: config.indyNodeRepoUrl,
- ]]
- ])
- } else {
- checkout scm
- }
- }
-
- stage("Get versions of dependencies") {
- String pipLogName = "pip.intsall.log"
- def uid = sh(returnStdout: true, script: 'id -u').trim()
- docker.build("hyperledger/indy-node-ci", "--build-arg uid=$uid -f ci/ubuntu.dockerfile ci").inside {
- sh """
- pip install 'pip<10.0.0' 'pyzmq==18.1.0'
- pip install .[tests] >$pipLogName
- """
-
- indyPlenumVersions.pypi = sh(returnStdout: true, script: """
- grep "^Collecting indy-plenum==" $pipLogName | awk '{print \$2}' | awk -F'==' '{print \$2}'
- """).trim()
- indyPlenumVersions.debian = indyPlenumVersions.pypi.replaceAll(/\.?(dev|rc)(.*)/, "~\$1\$2")
- echo "indy-plenum versions: $indyPlenumVersions"
-
- indySDKVersions.pypi = sh(returnStdout: true, script: """
- grep "^Collecting python3-indy==" $pipLogName | awk '{print \$2}' | awk -F'==' '{print \$2}'
- """).trim()
- indySDKVersions.debian = indySDKVersions.pypi.replaceAll(/-(dev|rc)-(.*)/, "~\$2") + '-xenial'
- echo "indy-sdk version: ${indySDKVersions}"
-
- indyCryptoVersions.pypi = sh(returnStdout: true, script: """
- grep "^Collecting indy-crypto==" $pipLogName | awk '{print \$2}' | awk -F'==' '{print \$2}'
- """).trim()
- indyCryptoVersions.debian = indyCryptoVersions.pypi.replaceAll(/-(dev|rc)-(.*)/, "~\$2")
- echo "indy-crypto version: ${indyCryptoVersions}"
- }
-
- if (!(indyPlenumVersions.debian && indySDKVersions.debian && indyCryptoVersions.debian)) {
- error "Failed to get versions for indy-plenum or indy-crypto or indy-sdk"
- }
- }
- }
-
- Map builds = [:]
- for (int i = 0; i < config.testSchema.size(); i++) {
- String testNames = config.testSchema[i].join(' ')
- Boolean isFirst = (i == 0)
- int testGroup = i
- builds[testNames] = {
- stage("Run ${testNames}") {
- nodeWrapper('ubuntu') {
- runTest(testGroup)
- }
- }
- }
- }
- builds.failFast = false
-
- parallel builds
-}
-
-return this;
* [X] diff --git a/ci/ubuntu.dockerfile b/ci/ubuntu.dockerfile
Part of old build system.
deleted file mode 100644
index b05c8285..00000000
--- a/ci/ubuntu.dockerfile
+++ /dev/null
@@ -1,29 +0,0 @@
-FROM hyperledger/indy-core-baseci:0.0.3-master
-LABEL maintainer="Hyperledger <hyperledger-indy@lists.hyperledger.org>"
-
-ARG uid=1000
-ARG user=indy
-ARG venv=venv
-
-# Update Sovrin signing key
-RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
-
-RUN apt-get update -y && apt-get install -y \
- python3-nacl \
- libindy-crypto=0.4.5 \
- libindy=1.13.0~1420 \
-# rocksdb python wrapper
- libbz2-dev \
- zlib1g-dev \
- liblz4-dev \
- libsnappy-dev \
- rocksdb=5.8.8
-
-ENV PATH="/home/$user/$venv/bin:$PATH"
-
-RUN indy_ci_add_user $uid $user $venv
-
-RUN indy_image_clean
-
-USER $user
-WORKDIR /home/$user
\ No newline at end of file
* [ ] diff --git a/data/migrations/deb/1_0_28_to_1_0_29.py b/data/migrations/deb/1_0_96_to_1_0_97.py
Okay, the migrations here don't make sense to me. Have we been deleting migrations?
git diff origin/stable..origin/ubuntu-20.04-upgrade --stat --no-renames data
data/migrations/deb/1_0_28_to_1_0_29.py | 29 ---
data/migrations/deb/1_0_96_to_1_0_97.py | 29 +++
data/migrations/deb/1_1_37_to_1_1_38.py | 29 ---
data/migrations/deb/1_1_43_to_1_2_44.py | 41 -----
data/migrations/deb/1_1_150_to_1_1_151.py | 41 +++++
data/migrations/deb/1_2_44_to_1_2_45.py | 198 --------------------
data/migrations/deb/1_2_50_to_1_2_51.py | 96 ----------
data/migrations/deb/1_2_51_to_1_2_52.py | 19 --
data/migrations/deb/1_2_188_to_1_2_189.py | 198 ++++++++++++++++++++
data/migrations/deb/1_2_233_to_1_2_234.py | 96 ++++++++++
data/migrations/deb/1_2_273_to_1_2_274.py | 19 ++
data/migrations/deb/disabled_1_0_29_to_1_0_28.py | 216 ----------------------
data/migrations/deb/disabled_1_0_97_to_1_0_96.py | 216 ++++++++++++++++++++++
data/migrations/deb/helper_1_0_28_to_1_0_29.py | 223 -----------------------
data/migrations/deb/helper_1_0_96_to_1_0_97.py | 223 +++++++++++++++++++++++
data/migrations/deb/helper_1_1_37_to_1_1_38.py | 191 -------------------
16 files changed, 822 insertions(+), 1042 deletions(-)
* [ ] TODO investigate migrations...
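One way to chase the TODO, assuming the renamed file names from the stat output above, is to let git follow the rename and list everything that touched the migrations directory between the two branches:
git log --follow --oneline origin/ubuntu-20.04-upgrade -- data/migrations/deb/1_0_96_to_1_0_97.py
git log --oneline origin/stable..origin/ubuntu-20.04-upgrade -- data/migrations/deb/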
similarity index 94%
rename from data/migrations/deb/1_0_28_to_1_0_29.py
rename to data/migrations/deb/1_0_96_to_1_0_97.py
index f90dc267..84e58f2e 100644
--- a/data/migrations/deb/1_0_28_to_1_0_29.py
+++ b/data/migrations/deb/1_0_96_to_1_0_97.py
@@ -13,7 +13,7 @@ migration_script_path = \
os.path.normpath(
os.path.join(
os.path.dirname(os.path.abspath(__file__)),
- 'helper_1_0_28_to_1_0_29.py'))
+ 'helper_1_0_96_to_1_0_97.py'))
logger.info('script path {}'.format(migration_script_path))
ret = subprocess.run(
* [ ] diff --git a/data/migrations/deb/1_1_43_to_1_2_44.py b/data/migrations/deb/1_1_150_to_1_1_151.py
similarity index 100%
rename from data/migrations/deb/1_1_43_to_1_2_44.py
rename to data/migrations/deb/1_1_150_to_1_1_151.py
* [ ] diff --git a/data/migrations/deb/1_1_37_to_1_1_38.py b/data/migrations/deb/1_1_37_to_1_1_38.py
deleted file mode 100644
index f255b2ed..00000000
--- a/data/migrations/deb/1_1_37_to_1_1_38.py
+++ /dev/null
@@ -1,29 +0,0 @@
-#!/usr/bin/python3.5
-import os
-import subprocess
-
-from stp_core.common.log import getlogger
-
-from sovrin_common.util import compose_cmd
-from sovrin_node.utils.node_control_tool import NodeControlTool, TIMEOUT
-
-logger = getlogger()
-
-migration_script_path = \
- os.path.normpath(
- os.path.join(
- os.path.dirname(os.path.abspath(__file__)),
- 'helper_1_1_37_to_1_1_38.py'))
-
-logger.info('script path {}'.format(migration_script_path))
-ret = subprocess.run(
- compose_cmd(
- ["su -c 'python3 {}' sovrin".format(migration_script_path)]
- ),
- shell=True,
- timeout=TIMEOUT)
-
-if ret.returncode != 0:
- msg = 'Migration failed: script returned {}'.format(ret.returncode)
- logger.error(msg)
- raise Exception(msg)
* [ ] diff --git a/data/migrations/deb/1_2_44_to_1_2_45.py b/data/migrations/deb/1_2_188_to_1_2_189.py
similarity index 100%
rename from data/migrations/deb/1_2_44_to_1_2_45.py
rename to data/migrations/deb/1_2_188_to_1_2_189.py
* [ ] diff --git a/data/migrations/deb/1_2_50_to_1_2_51.py b/data/migrations/deb/1_2_233_to_1_2_234.py
similarity index 94%
rename from data/migrations/deb/1_2_50_to_1_2_51.py
rename to data/migrations/deb/1_2_233_to_1_2_234.py
index ed0c2d61..8087b5da 100644
--- a/data/migrations/deb/1_2_50_to_1_2_51.py
+++ b/data/migrations/deb/1_2_233_to_1_2_234.py
@@ -50,9 +50,9 @@ def migrate_nodes_data():
except FileNotFoundError:
visit_dirs = []
for node_name in visit_dirs:
- move_path = os.path.join(old_nodes_data_dir, node_name)
- to_path = os.path.join(new_node_data_dir, node_name)
- ext_copytree(move_path, to_path)
+ move_path = os.path.join(old_nodes_data_dir, node_name)
+ to_path = os.path.join(new_node_data_dir, node_name)
+ ext_copytree(move_path, to_path)
shutil.rmtree(old_nodes_data_dir)
set_own_perm("indy", [new_node_data_dir])
* [ ] diff --git a/data/migrations/deb/1_2_51_to_1_2_52.py b/data/migrations/deb/1_2_273_to_1_2_274.py
similarity index 100%
rename from data/migrations/deb/1_2_51_to_1_2_52.py
rename to data/migrations/deb/1_2_273_to_1_2_274.py
* [ ] diff --git a/data/migrations/deb/disabled_1_0_29_to_1_0_28.py b/data/migrations/deb/disabled_1_0_97_to_1_0_96.py
similarity index 100%
rename from data/migrations/deb/disabled_1_0_29_to_1_0_28.py
rename to data/migrations/deb/disabled_1_0_97_to_1_0_96.py
* [ ] diff --git a/data/migrations/deb/helper_1_0_28_to_1_0_29.py b/data/migrations/deb/helper_1_0_96_to_1_0_97.py
similarity index 100%
rename from data/migrations/deb/helper_1_0_28_to_1_0_29.py
rename to data/migrations/deb/helper_1_0_96_to_1_0_97.py
* [ ] diff --git a/data/migrations/deb/helper_1_1_37_to_1_1_38.py b/data/migrations/deb/helper_1_1_37_to_1_1_38.py
deleted file mode 100644
index b2f687c3..00000000
--- a/data/migrations/deb/helper_1_1_37_to_1_1_38.py
+++ /dev/null
@@ -1,191 +0,0 @@
-#!/usr/bin/python3.5
-import fileinput
-import os
-import shutil
-import sys
-
-from ledger.compact_merkle_tree import CompactMerkleTree
-from ledger.ledger import Ledger
-from plenum.persistence.leveldb_hash_store import LevelDbHashStore
-from stp_core.common.log import getlogger
-
-from sovrin_common.config_util import getConfig
-
-logger = getlogger()
-
-
-def _migrate_ledger(data_directory,
- old_ledger_file, new_ledger_file):
- """
- Test for the directory, open old and new ledger, migrate data, rename directories
- """
-
- # open the current ledger
- logger.info("Old ledger folder: {}, {}".format(
- data_directory, old_ledger_file))
- old_ledger = Ledger(CompactMerkleTree(),
- dataDir=data_directory,
- fileName=old_ledger_file)
- logger.info("old size for {}: {}".format(
- old_ledger_file, str(old_ledger.size)))
-
- # open the new ledger with new serialization
- new_ledger_file_backup = new_ledger_file + "_new"
- logger.info("New ledger folder: {}, {}".format(
- data_directory, new_ledger_file_backup))
- new_ledger = Ledger(CompactMerkleTree(),
- dataDir=data_directory,
- fileName=new_ledger_file_backup)
-
- # add all txns into the new ledger
- for _, txn in old_ledger.getAllTxn():
- # remove all NULL values from there!
- txn = _prepare_old_txn(txn)
- print(txn)
- new_ledger.add(txn)
- logger.info("new size for {}: {}".format(
- new_ledger_file, str(new_ledger.size)))
-
- old_ledger.stop()
- new_ledger.stop()
-
- # now that everything succeeded, remove the old files and move the new
- # files into place
- shutil.rmtree(
- os.path.join(data_directory, old_ledger_file))
- os.rename(
- os.path.join(data_directory, new_ledger_file_backup),
- os.path.join(data_directory, new_ledger_file))
-
- logger.info("Final new ledger folder: {}".format(
- os.path.join(data_directory, new_ledger_file)))
-
-
-def _prepare_old_txn(txn):
- return {k: v for k, v in txn.items() if v is not None}
-
-
-def _open_new_ledger(data_directory, new_ledger_file, hash_store_name):
- # open new Ledger with leveldb hash store (to re-init it)
- logger.info("Open new ledger folder: {}".format(
- os.path.join(data_directory, new_ledger_file)))
- new_ledger = Ledger(CompactMerkleTree(
- hashStore=LevelDbHashStore(
- dataDir=data_directory, fileNamePrefix=hash_store_name)),
- dataDir=data_directory,
- fileName=new_ledger_file)
- new_ledger.stop()
-
-
-def migrate_domain_hash_stores(node_data_directory):
- # the new hash store (merkle tree) will be recovered from the new transaction log after re-start
- # just delete the current hash store
- old_merkle_nodes = os.path.join(node_data_directory, '_merkleNodes')
- old_merkle_leaves = os.path.join(node_data_directory, '_merkleLeaves')
- new_merkle_leaves_domain = os.path.join(
- node_data_directory, 'domain_merkleLeaves')
- new_merkle_nodes_domain = os.path.join(
- node_data_directory, 'domain_merkleNodes')
-
- if os.path.exists(old_merkle_nodes):
- logger.info('removed {}'.format(old_merkle_nodes))
- shutil.rmtree(old_merkle_nodes)
- if os.path.exists(old_merkle_leaves):
- logger.info('removed {}'.format(old_merkle_leaves))
- shutil.rmtree(old_merkle_leaves)
- if os.path.exists(new_merkle_leaves_domain):
- logger.info('removed {}'.format(new_merkle_leaves_domain))
- shutil.rmtree(new_merkle_leaves_domain)
- if os.path.exists(new_merkle_nodes_domain):
- shutil.rmtree(new_merkle_nodes_domain)
- logger.info('removed {}'.format(new_merkle_nodes_domain))
-
- # open new Ledgers
- _, new_domain_ledger_name = _get_domain_ledger_file_names()
- _open_new_ledger(node_data_directory,
- new_domain_ledger_name, 'domain')
-
-
-def _get_domain_ledger_file_names():
- config = getConfig()
- if config.domainTransactionsFile.startswith('domain'):
- old_name = config.domainTransactionsFile
- new_name = config.domainTransactionsFile
- else:
- # domain ledger uses old file name
- old_name = config.domainTransactionsFile.replace('domain_', '')
- new_name = 'domain_' + config.domainTransactionsFile
- # new_name = old_name
- return old_name, new_name
-
-
-def migrate_domain_ledger_for_node(node_data_directory):
- old_name, new_name = _get_domain_ledger_file_names()
- _migrate_ledger(node_data_directory,
- old_name,
- new_name)
-
-
-def migrate_all_states(node_data_directory):
- # the states will be recovered from the ledger during the start-up.
- # just delete the current ones
- pool_state_path = os.path.join(node_data_directory, 'pool_state')
- domain_state_path = os.path.join(node_data_directory, 'domain_state')
- config_state_path = os.path.join(node_data_directory, 'config_state')
-
- if os.path.exists(pool_state_path):
- shutil.rmtree(pool_state_path)
- logger.info('removed {}'.format(pool_state_path))
-
- if os.path.exists(domain_state_path):
- shutil.rmtree(domain_state_path)
- logger.info('removed {}'.format(domain_state_path))
-
- if os.path.exists(config_state_path):
- shutil.rmtree(config_state_path)
- logger.info('removed {}'.format(config_state_path))
-
-
-def migrate_custom_config(config_file):
- # config_file = os.path.join(node_data_directory, 'sovrin_config.py')
- if not os.path.exists(config_file):
- return
-
- logger.info("Migrating custom config file : {}".format(config_file))
-
- for line in fileinput.input(config_file, inplace=1):
- if 'domainTransactionsFile' in line:
- sys.stdout.write("domainTransactionsFile = 'domain_transactions_live'\n")
- continue
- sys.stdout.write(line)
-
-
-def migrate_all():
- config = getConfig()
- base_dir = config.baseDir
- nodes_data_dir = os.path.join(base_dir, config.nodeDataDir)
- if not os.path.exists(nodes_data_dir):
- # TODO: find a better way
- base_dir = '/home/sovrin/.sovrin'
- nodes_data_dir = os.path.join(base_dir, config.nodeDataDir)
- if not os.path.exists(nodes_data_dir):
- msg = 'Can not find the directory with the ledger: {}'.format(
- nodes_data_dir)
- logger.error(msg)
- raise Exception(msg)
-
- for node_dir in os.listdir(nodes_data_dir):
- node_data_dir = os.path.join(nodes_data_dir, node_dir)
- logger.info("Applying migration to {}".format(node_data_dir))
-
- config_file = os.path.join(base_dir, 'sovrin_config.py')
- migrate_custom_config(config_file)
-
- migrate_domain_ledger_for_node(node_data_dir)
- migrate_domain_hash_stores(node_data_dir)
- migrate_all_states(node_data_dir)
-
- # subprocess.run(['chown', '-R', 'sovrin:sovrin', '/home/sovrin/.sovrin'])
-
-
-migrate_all()
* [X] diff --git a/design/anoncreds.md b/design/anoncreds.md
Okay, what has changed here? Do we care? No, this is part of the indy-crypto -> ursa upgrade.
index 077e956c..9adf4e1e 100644
--- a/design/anoncreds.md
+++ b/design/anoncreds.md
@@ -16,8 +16,8 @@ Here you can find the requirements and design for Anoncreds workflow (including
Anoncreds protocol links:
- [Anoncreds Sequence Diagram](https://github.com/hyperledger/indy-sdk/blob/master/doc/libindy-anoncreds.svg)
-- [Anoncreds Protocol Math](https://github.com/hyperledger/indy-crypto/blob/master/libindy-crypto/docs/AnonCred.pdf)
-- [Anoncreds Protocol Crypto API](https://github.com/hyperledger/indy-crypto/blob/master/libindy-crypto/docs/anoncreds-design.md)
+- [Anoncreds Protocol Math](https://github.com/hyperledger/ursa-docs/tree/master/specs/anoncreds1)
+- [Anoncreds Protocol Crypto API](https://github.com/hyperledger/ursa/blob/master/libursa/docs/anoncreds-design.md)
## Requirements
1. Creation of Schemas:
* [X] diff --git a/dev-setup/osx/setup-dev-dependencies.sh b/dev-setup/osx/setup-dev-dependencies.sh
We removed indy-crypto here, so maybe we care? Let us look.
* commit 11a7e1346b0b28071376ebd660bbe60aaa0217fd
Author: Brent Zundel <brent.zundel@gmail.com>
Date: Thu Jun 18 15:51:01 2020 -0600
removed all mention of indy-crypto
Signed-off-by: Brent Zundel <brent.zundel@gmail.com>
This was removed in the ubuntu branch, but not stable.
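For the record, that commit can presumably be re-found with a pickaxe search along these lines:
git log --oneline -S indy-crypto origin/ubuntu-20.04-upgrade -- dev-setup/osx/setup-dev-dependencies.sh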
index 479aacb1..4dadc474 100755
--- a/dev-setup/osx/setup-dev-dependencies.sh
+++ b/dev-setup/osx/setup-dev-dependencies.sh
@@ -1,13 +1,12 @@
#!/usr/bin/env bash
if [ "$#" -ne 2 ]; then
- echo "Please specify indy-sdk and indy-crypto version tags"
- echo "e.g ./setup-dev-dependencies.sh 1.6.7 0.4.2"
+ echo "Please specify indy-sdk version tag"
+ echo "e.g ./setup-dev-dependencies.sh 1.6.7"
exit 1
fi
indy_sdk_version=$1
-indy_crypto_version=$2
brew update
@@ -43,16 +42,3 @@ popd
rm -rf indy-sdk
popd
echo 'Installed libindy'
-
-echo 'Installing libcrypto...'
-pushd /tmp
-git clone https://github.com/hyperledger/indy-crypto.git
-pushd indy-crypto/libindy-crypto
-git fetch --all --tags --prune
-git checkout tags/v"${indy_crypto_version}"
-cargo build --release
-cp target/release/libindy_crypto.dylib /usr/local/lib/
-popd
-rm -rf indy-crypto
-popd
-echo 'Installed libcrypto'
* [X] diff --git a/dev-setup/ubuntu/setup-dev-depend-ubuntu14.sh b/dev-setup/ubuntu/setup-dev-depend-ubuntu14.sh
Get rid of libindy-crypto
index 4686ac7c..b01f166e 100755
--- a/dev-setup/ubuntu/setup-dev-depend-ubuntu14.sh
+++ b/dev-setup/ubuntu/setup-dev-depend-ubuntu14.sh
@@ -23,6 +23,6 @@ echo 'Installing libsodium...'
sudo apt-get install -y libsodium13
echo 'Installed libsodium'
-echo 'Installing Libindy and Libindy Crypto...'
-sudo apt-get install -y libindy libindy-crypto
-echo 'Installed Libindy and Libindy Crypto'
+echo 'Installing Libindy...'
+sudo apt-get install -y libindy
+echo 'Installed Libindy'
* [X] diff --git a/dev-setup/ubuntu/setup-dev-depend-ubuntu16.sh b/dev-setup/ubuntu/setup-dev-depend-ubuntu16.sh
Get rid of libindy-crypto, add ursa and libindy.
index 74210de4..1bc47b62 100755
--- a/dev-setup/ubuntu/setup-dev-depend-ubuntu16.sh
+++ b/dev-setup/ubuntu/setup-dev-depend-ubuntu16.sh
@@ -30,6 +30,10 @@ sudo apt-get install -y libbz2-dev \
echo 'Installed RocksDB'
-echo 'Installing Libindy and Libindy Crypto...'
-sudo apt-get install -y libindy libindy-crypto
-echo 'Installed Libindy and Libindy Crypto'
+echo 'Installing Ursa...'
+sudo apt-get install -y ursa
+echo 'Installed Ursa'
+
+echo 'Installing Libindy...'
+sudo apt-get install -y libindy
+echo 'Installed Libindy'
* [X] diff --git a/dev-setup/ubuntu/ubuntu-2004/SetupVMTest.txt b/dev-setup/ubuntu/ubuntu-2004/SetupVMTest.txt
Part of the new build.
new file mode 100644
index 00000000..faaf67b6
--- /dev/null
+++ b/dev-setup/ubuntu/ubuntu-2004/SetupVMTest.txt
@@ -0,0 +1,105 @@
+#VM 20.04 Setup
+
+##Pre-Install
+
+ sudo apt-get update && sudo apt-get install -y apt-transport-https ca-certificates
+ sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88 || sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys CE7709D068DB5E88
+ sudo echo "deb https://repo.sovrin.org/deb bionic master" >> /etc/apt/sources.list
+ sudo echo "deb https://repo.sovrin.org/deb bionic stable" >> /etc/apt/sources.list
+ sudo echo "deb http://security.ubuntu.com/ubuntu bionic-security main" >> /etc/apt/sources.list
+
+ sudo apt-get update && sudo apt-get install -y \
+ git \
+ wget \
+ unzip \
+ python3-pip \
+ python3-venv \
+ libsodium23 \
+ iptables \
+ at \
+ supervisor \
+ python3-nacl \
+ rocksdb-tools \
+ librocksdb5.17 \
+ librocksdb-dev \
+ libsnappy-dev \
+ liblz4-dev \
+ libbz2-dev \
+ libssl1.0.0 \
+ libindy \
+ ursa
+
+ git clone https://github.com/hyperledger/indy-node.git
+ git clone https://github.com/hyperledger/indy-plenum.git
+ # in both indy-node and indy-plenum checkout origin/ubuntu-20.04-upgrade
+ sudo cp /usr/lib/ursa/libursa.* /usr/lib/
+ # Should be done in python env
+ pip install -U \
+ Pygments==2.2.0 \
+ Pympler==0.8 \
+ apipkg==1.5 \
+ attrs==20.3.0 \
+ base58==2.1.0 \
+ distro==1.5.0 \
+ execnet==1.8.0 \
+ flake8==3.8.4 \
+ indy-plenum==1.13.0.dev14 \
+ indy-node==1.13.0.dev19 \
+ iniconfig==1.1.1 \
+ intervaltree==2.1.0 \
+ ioflo==2.0.2 \
+ jsonpickle==2.0.0 \
+ leveldb==0.201 \
+ libnacl==1.7.2 \
+ mccabe==0.6.1 \
+ msgpack-python==0.5.6 \
+ orderedset==2.0.3 \
+ packaging==20.9 \
+ pip \
+ pluggy==0.13.1 \
+ portalocker==2.2.1 \
+ prompt-toolkit==3.0.16 \
+ psutil==5.6.6 \
+ py==1.10.0 \
+ pycodestyle==2.6.0 \
+ pyflakes==2.2.0 \
+ pyparsing==2.4.7 \
+ pytest==6.2.2 \
+ pytest-asyncio==0.14.0 \
+ pytest-forked==1.3.0 \
+ pytest-runner==5.3.0 \
+ pytest-xdist==2.2.1 \
+ python-dateutil==2.6.1 \
+ python-rocksdb==0.7.0 \
+ python-ursa==0.1.1 \
+ python3-indy==1.13.0 \
+ pyzmq==22.3.0 \
+ rlp==0.6.0 \
+ semver==2.13.0 \
+ setuptools==53.0.0 \
+ sha3==0.2.1 \
+ six==1.15.0 \
+ sortedcontainers==1.5.7 \
+ timeout-decorator==0.5.0 \
+ toml==0.10.2 \
+ ujson==1.33 \
+ wcwidth==0.2.5 \
+ wheel==0.34.2 \
+ zipp==1.2.0
+
+##IDE Setup
+ Pycharm:
+ # Open indy-node
+ # Open indy-plenum - Link
+ # Create virtual env in project structure - python interpreter
+ # Create virtual env in project structure - python interpreter
+ # All pip3 commands mentioned above must be done in env
+
+## Base Dependencies Needed to test
+### Library Dependencies:
+ libindy 1.15.0-bionic
+ libindy-crypto 0.4.5
+ ursa 0.3.2-2
+
+
+
* [X] diff --git a/docs/source/auth_rules.md b/docs/source/auth_rules.md
Okay, so we need to answer: were these added in ubuntu-20.04-upgrade, or were they removed in stable? Let us find out...
git log --date-order --graph --decorate ^origin/stable origin/ubuntu-20.04-upgrade --stat docs/source/auth_rules.md
* commit 3da39bbe0b51e6c0b31b4b8687a9caac42bac32f
| Author: anton.denishchenko <anton.denishchenko@evernym.com>
| Date: Tue Feb 23 02:03:34 2021 +0300
|
| Update docs after review
|
| Signed-off-by: anton.denishchenko <anton.denishchenko@evernym.com>
|
| docs/source/auth_rules.md | 14 ++++++++++++++
| 1 file changed, 14 insertions(+)
|
* commit 6df828fa4c63638c6f60d349ee26c23927620368
Author: Adam Burdett <burdettadam@gmail.com>
Date: Thu Jan 23 17:31:02 2020 -0700
rs_schema_handler
Signed-off-by: Adam Burdett <burdettadam@gmail.com>
docs/source/auth_rules.md | 30 ++++++++++++++++++++++++++++++
1 file changed, 30 insertions(+)
We can see that the file was modified after the fact to add these rules. Observing the patches themselves confirms it.
* commit 3da39bbe0b51e6c0b31b4b8687a9caac42bac32f
| Author: anton.denishchenko <anton.denishchenko@evernym.com>
| Date: Tue Feb 23 02:03:34 2021 +0300
|
| Update docs after review
|
| Signed-off-by: anton.denishchenko <anton.denishchenko@evernym.com>
|
| diff --git a/docs/source/auth_rules.md b/docs/source/auth_rules.md
| index 19444819..3545b970 100644
| --- a/docs/source/auth_rules.md
| +++ b/docs/source/auth_rules.md
| @@ -537,6 +537,15 @@
| <td><sub>1 TRUSTEE OR 1 STEWARD OR 1 NETWORK_MONITOR</sub></td>
| <td><sub>Getting validator_info from pool</sub></td>
| </tr>
| + <tr>
| + <td><sub>LEDGERS_FREEZE</sub></td>
| + <td><sub>EDIT</sub></td>
| + <td><sub><code>*</code></sub></td>
| + <td><sub><code>*</code></sub></td>
| + <td><sub><code>*</code></sub></td>
| + <td><sub>3 TRUSTEE</sub></td>
| + <td><sub>Freeze specific ledgers</sub></td>
| + </tr>
| </table>
|
|
| @@ -729,6 +738,11 @@
| <td><sub>N/A</sub></td>
| </tr>
|
| + <tr>
| + <td><sub>LEDGERS_FREEZE</sub></td>
| + <td><sub>EDIT</sub></td>
| + <td><sub>N/A</sub></td>
| + </tr>
|
| </table>
|
|
* commit 6df828fa4c63638c6f60d349ee26c23927620368
Author: Adam Burdett <burdettadam@gmail.com>
Date: Thu Jan 23 17:31:02 2020 -0700
rs_schema_handler
Signed-off-by: Adam Burdett <burdettadam@gmail.com>
diff --git a/docs/source/auth_rules.md b/docs/source/auth_rules.md
index 4903bffd..19444819 100644
--- a/docs/source/auth_rules.md
+++ b/docs/source/auth_rules.md
@@ -302,6 +302,24 @@
<td><sub><code>*</code></sub></td>
<td><sub>No one can edit existing Context</sub></td>
<td><sub>Editing a Context</sub></td>
+ </tr>
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>ADD</sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub>1 TRUSTEE OR 1 STEWARD OR 1 ENDORSER</sub></td>
+ <td><sub>Adding a new Rich Schema</sub></td>
+ </tr>
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>EDIT</sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub>No one can edit existing Context</sub></td>
+ <td><sub>Editing a Rich Schema</sub></td>
</tr>
<tr>
<td><sub>CLAIM_DEF</sub></td>
@@ -579,6 +597,18 @@
<td><sub>The DID used to create the CONTEXT</sub></td>
</tr>
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>ADD</sub></td>
+ <td><sub>N/A</sub></td>
+ </tr>
+
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>EDIT</sub></td>
+ <td><sub>The DID used to create the RICH_SCHEMA</sub></td>
+ </tr>
+
<tr>
<td><sub>CLAIM_DEF</sub></td>
<td><sub>ADD</sub></td>
index 4903bffd..3545b970 100644
--- a/docs/source/auth_rules.md
+++ b/docs/source/auth_rules.md
@@ -302,6 +302,24 @@
<td><sub><code>*</code></sub></td>
<td><sub>No one can edit existing Context</sub></td>
<td><sub>Editing a Context</sub></td>
+ </tr>
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>ADD</sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub>1 TRUSTEE OR 1 STEWARD OR 1 ENDORSER</sub></td>
+ <td><sub>Adding a new Rich Schema</sub></td>
+ </tr>
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>EDIT</sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub>No one can edit existing Context</sub></td>
+ <td><sub>Editing a Rich Schema</sub></td>
</tr>
<tr>
<td><sub>CLAIM_DEF</sub></td>
@@ -519,6 +537,15 @@
<td><sub>1 TRUSTEE OR 1 STEWARD OR 1 NETWORK_MONITOR</sub></td>
<td><sub>Getting validator_info from pool</sub></td>
</tr>
+ <tr>
+ <td><sub>LEDGERS_FREEZE</sub></td>
+ <td><sub>EDIT</sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub><code>*</code></sub></td>
+ <td><sub>3 TRUSTEE</sub></td>
+ <td><sub>Freeze specific ledgers</sub></td>
+ </tr>
</table>
@@ -579,6 +606,18 @@
<td><sub>The DID used to create the CONTEXT</sub></td>
</tr>
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>ADD</sub></td>
+ <td><sub>N/A</sub></td>
+ </tr>
+
+ <tr>
+ <td><sub>SET_RICH_SCHEMA</sub></td>
+ <td><sub>EDIT</sub></td>
+ <td><sub>The DID used to create the RICH_SCHEMA</sub></td>
+ </tr>
+
<tr>
<td><sub>CLAIM_DEF</sub></td>
<td><sub>ADD</sub></td>
@@ -699,6 +738,11 @@
<td><sub>N/A</sub></td>
</tr>
+ <tr>
+ <td><sub>LEDGERS_FREEZE</sub></td>
+ <td><sub>EDIT</sub></td>
+ <td><sub>N/A</sub></td>
+ </tr>
</table>
* [X] diff --git a/docs/source/ci-cd.md b/docs/source/ci-cd.md
Well, how do we resolve this one? These are the CI/CD docs, and they have been updated for the ubuntu-20.04-upgrade branch and the GitHub Actions pipeline. So we call it good and move on.
index 5bf76c03..c43cb163 100644
--- a/docs/source/ci-cd.md
+++ b/docs/source/ci-cd.md
@@ -2,24 +2,28 @@
#### Branches
-- `master` branch contains the latest changes. All PRs usually need to be sent to master.
-- `stable` branch contains latest releases (https://github.com/hyperledger/indy-node/releases). Hotfixes need to be sent to both stable and master.
-- `release-*` branches hold release candidates during release workflow
-- `hotfix-*` branches hold release candidates during hotfix workflow
+At the moment work is being done so that `ubuntu-20.04-upgrade` becomes the new master/main branch.
+The old `master` will be moved to `ubuntu16` branch.
+The documentation for the old "legacy" process will reside in that branch.
+
+- `master` | `ubuntu-20.04` branches contains the latest changes. All PRs usually need to be sent to `master` | `ubuntu-20.04-upgrade`.
+
+
#### Pull Requests
- Each PR needs to be reviewed.
-- PR can be merged only after all tests pass and code is reviewed.
+- PR can be merged only after all tests pass and the code is reviewed.
## Continuous Integration
- for each PR we execute:
- static code validation
- Unit/Integration tests
-- We use pipeline in code approach and Jenkins as our main CI/CD server.
-- CI part of the pipeline (running tests for each PR) is defined in `Jenkinsfile.ci` file.
-- CI part is run on Hyperledger and Sovrin Foundation Jenkins servers, so they are public and open as every contributor needs to see results of the tests run for his or her PR.
+- We use pipeline in code approach and Github Actions as our main CI/CD server.
+- CI part of the pipeline (running tests for each PR) is defined in `.github/workflows/PR.yaml` file.
+- CI part is run on Github, so they are public and open as every contributor needs to see results of the tests run for his or her PR.
+ Github Actions are by default deactivated on new forks. The pipeline contain checks so that no publish action is tried on forks (which would likely fail due to missing login information).
#### Static Code Validation
@@ -33,14 +37,14 @@
## Continuous Delivery
-- CD part of the pipeline is defined in `Jenkinsfile.cd` file.
-- CD part is run on Sovrin Foundation Jenkins server dealing with issuing and uploading new builds.
+- CD part of the pipeline is defined in `.github/workflows/tag.yaml`, `.github/workflows/releasepr.yaml`, and `.github/workflows/publishRelease.yaml` file.
+- CD part is run via GitHubActions issuing and uploading new builds.
#### Builds
What artifacts are produced after each push
- to `master` branch:
- - all artifacts include developmental release segment `devN` in their version
+ - all artifacts include developmental release segment `devN` in their version.
- indy-plenum:
- indy-plenum in [pypi](https://pypi.python.org/pypi/indy-plenum)
- indy-plenum deb package in [`https://repo.sovrin.org/deb xenial master-latest`](https://repo.sovrin.org/lib/apt/xenial/master-latest/)
@@ -49,27 +53,15 @@ What artifacts are produced after each push
- indy-node deb package in [`https://repo.sovrin.org/deb xenial master-latest`](https://repo.sovrin.org/lib/apt/xenial/master-latest/)
- indy-node deb package in [`https://repo.sovrin.org/deb xenial master`](https://repo.sovrin.org/lib/apt/xenial/master/) (copied from `master-latest`)
- indy-plenum deb package in [`https://repo.sovrin.org/deb xenial master`](https://repo.sovrin.org/lib/apt/xenial/master/) (copied from `master-latest`)
-- to `release-*` and `hotfix-*` branches:
- - all artifacts include pre-release segment `rcN` in their version
+- to `ubuntu-20.04-upgrade` branch:
+ - all artifacts include developmental release segment `devN` in their version, where `N` is a unix timestamp.
- indy-plenum:
- indy-plenum in [pypi](https://pypi.python.org/pypi/indy-plenum)
- - indy-plenum deb package in [`https://repo.sovrin.org/deb xenial rc-latest`](https://repo.sovrin.org/lib/apt/xenial/rc-latest/)
+ - indy-plenum deb package in [`https://hyperledger.jfrog.io/artifactory/indy`](https://hyperledger.jfrog.io/artifactory/indy)
- indy-node:
- indy-node in [pypi](https://pypi.python.org/pypi/indy-node)
- - indy-node deb package in [`https://repo.sovrin.org/deb xenial rc-latest`](https://repo.sovrin.org/lib/apt/xenial/rc-latest/)
- - indy-node deb package in [`https://repo.sovrin.org/deb xenial rc`](https://repo.sovrin.org/lib/apt/xenial/rc/) (copied from `rc-latest`)
- - indy-plenum deb package in [`https://repo.sovrin.org/deb xenial rc`](https://repo.sovrin.org/lib/apt/xenial/rc/) (copied from `rc-latest`)
-- to `stable` branch:
- - indy-plenum:
- - indy-plenum in [pypi](https://pypi.python.org/pypi/indy-plenum)
- - indy-plenum deb package in [`https://repo.sovrin.org/deb xenial stable-latest`](https://repo.sovrin.org/lib/apt/xenial/stable-latest/)
- - indy-plenum release tag (https://github.com/hyperledger/indy-plenum/releases)
- - indy-node:
- - indy-node in [pypi](https://pypi.python.org/pypi/indy-node)
- - indy-node deb package in [`https://repo.sovrin.org/deb xenial stable-latest`](https://repo.sovrin.org/lib/apt/xenial/stable-latest/) (re-packed from `rc-latest`)
- - indy-node deb package in [`https://repo.sovrin.org/deb xenial stable`](https://repo.sovrin.org/lib/apt/xenial/stable/) (copied from `rc-latest`)
- - indy-plenum deb package in [`https://repo.sovrin.org/deb xenial stable`](https://repo.sovrin.org/lib/apt/xenial/stable/) (copied from `stable-latest`)
- - indy-node release tag (https://github.com/hyperledger/indy-node/releases)
+ - indy-node deb package in [`https://hyperledger.jfrog.io/artifactory/indy`](https://hyperledger.jfrog.io/artifactory/indy)
+
Use cases for artifacts
- PyPI artifacts can be used for development experiments, but not intended to be used for production.
@@ -83,7 +75,7 @@ Use cases for artifacts
##### Supported platforms and OSes
-- Ubuntu 16.04 on x86_64
+- Ubuntu 20.04 on x86_64
##### Build scripts
@@ -92,8 +84,8 @@ We use [fpm](https://github.com/jordansissel/fpm) for packaging python code into
- https://github.com/hyperledger/indy-plenum/blob/master/build-scripts
We also pack some 3rd parties dependencies which are not presented in canonical ubuntu repositories:
-- https://github.com/hyperledger/indy-node/blob/master/build-scripts/ubuntu-1604/build-3rd-parties.sh
-- https://github.com/hyperledger/indy-plenum/blob/master/build-scripts/ubuntu-1604/build-3rd-parties.sh
+- https://github.com/hyperledger/indy-node/tree/ubuntu-20.04-upgrade/build-scripts/ubuntu-2004/build-3rd-parties.sh
+- https://github.com/hyperledger/indy-plenum/tree/ubuntu-20.04-upgrade/build-scripts//ubuntu-2004/build-3rd-parties.sh
Each `build-scripts` folder includes `Readme.md`. Please check them for more details.
@@ -102,7 +94,7 @@ Each `build-scripts` folder includes `Readme.md`. Please check them for more det
- Please note, that we are using versioning that satisfies [PEP 440](https://www.python.org/dev/peps/pep-0440) with release segment as `MAJOR.MINOR.PATCH` that satisfies [SemVer](https://semver.org/) as well.
- Version is set in the code (see [\_\_version\_\_.json](https://github.com/hyperledger/indy-node/blob/master/indy_node/__version__.json)).
- Version is bumped for new releases / hotfixes either manually or using [bump_version.sh](https://github.com/hyperledger/indy-node/blob/master/indy_node/bump_version.sh) script. The latter is preferred.
-- During development phase version includes developmental segment `devN`, where `N` is set for CD pipeline artifacts as incremented build number of build server jobs. In the source code it is just equal to `0` always.
+- During development phase version includes developmental segment `devN`, where `N` is a unix timestamp at buildtime.
- During release preparation phase (release / hotfix workflows) version includes pre-release segment `rcN`, where `N>=1` and set in the source code by developers.
- Each dependency (including indy-plenum) has a strict version (see [setup.py](https://github.com/hyperledger/indy-node/blob/master/setup.py))
- If you install indy-node (either from pypi, or from deb package), the specified in setup.py version of indy-plenum is installed.
@@ -112,99 +104,42 @@ Each `build-scripts` folder includes `Readme.md`. Please check them for more det
- different versions in migrations scripts
-##### For releases `< 1.7.0` (deprecated)
-- Please note, that we are using semver-like approach for versioning (major, minor, build) for each of the components.
-- Major and minor parts are set in the code (see [\_\_metadata\_\_.py](https://github.com/hyperledger/indy-node/blob/master/indy_node/__metadata__.py)). They must be incremented for new releases manually from code if needed.
-- Build part is incremented with each build on Jenkins (so it always increases, but may be not sequentially)
-- Each dependency (including indy-plenum) has a strict version (see [setup.py](https://github.com/hyperledger/indy-node/blob/master/setup.py))
-- If you install indy-node (either from pypi, or from deb package), the specified in setup.py version of indy-plenum is installed.
-- Master and Stable builds usually have different versions.
-- Differences in master and stable code:
- - `setup.py`:
- - dev suffix in project names and indy-plenum dependency in master; no suffixes in stable
- - different versions of indy-plenum dependency
- - different versions in migrations scripts
-
## Release workflow
+It starts with setting a tag in the form of `setRelease-v<Major>.<Minor>.<Patch>[-rc<Num>]`.
+![release-workflow](./release-workflow.png)
### Feature Release
-#### 1. Release Candidate Preparation
+#### 1. Release Candidate and Release Preparation
1. [**Maintainer**]
- - Create `release-X.Y.Z` branch from `stable` (during the first RC preparation only).
-2. [**Contributor**]
- - Create `rc-X.Y.Z.rcN` branch from `release-X.Y.Z` (`N` starts from `1` and is incremented for each new RC).
- - Apply necessary changes from `master` (either `merge` or `cherry-pick`).
- - (_optional_) [`indy-node`] Set **stable** (just X.Y.Z) `indy-plenum` version in `setup.py`.
- - Set the package version `./bump_version.sh X.Y.Z.rcN`.
- - Commit, push and create a PR to `release-X.Y.Z`.
-3. Until PR is merged:
- 1. [**build server**]
- - Run CI for the PR and notifies GitHub.
- 2. [**Maintainer**]
- - Review the PR.
- - Either ask for changes or merge.
- 3. [**Contributor**]
- - (_optional_) Update the PR if either CI failed or reviewer asked for changes.
- - (_optional_) [**indy-node**] Bump `indy-plenum` version in `setup.py` if changes require new `indy-plenum` release.
-
-#### 2. Release Candidate Acceptance
+ - Create `setRelease-vX.Y.Z` tag on desired branch (most of the time it would be `master|ubuntu20.04-upgrade`).
+2. [**GHA `tag.yaml`**]
+ - Bumps version
+ - creates PR with the updated Version
+2. [**GHA `releasepr.yaml`**]
+ - Runs on the update Version PR
+ - Tests the code (e.g. DCO, CI testing, static code validation etc.).
+ - Builds packages.
+
+#### 2. Release Candidate and Release Acceptance
**Note** If any of the following steps fails new release candidate should be prepared.
1. [**Maintainer**]
- - **Start release candidate pipeline manually**.
-2. [**build server**]
- - Checkout the repository.
- - Publish to PyPI as `X.Y.Z.rcN`.
- - Bump version locally to `X.Y.Z`, commit and push as the `release commit` to remote.
- - Build debian packages:
- - for the project: source code version would be `X.Y.Z`, debian package version `X.Y.Z~rcN`;
- - for the 3rd party dependencies missed in the official debian repositories.
- - Publish the packages to `rc-latest` debian channel.
- - [`indy-node`] Copy the package along with its dependencies (including `indy-plenum`)
- from `rc-latest` to `rc` channel.
- - [`indy-node`] Run system tests for the `rc` channel.
- - Create **release PR** from `release-X.Y.Z` (that points to `release commit`) branch to `stable`.
- - Notify maintainers.
- - Wait for an approval to proceed. **It shouldn't be provided until `release PR` passes all necessary checks** (e.g. DCO, CI testing, maintainers reviews etc.).
-3. [**build server**]
- - Run CI for the PR and notify GitHub.
-4. [**QA**]
- - (_optional_) Perform additional testing.
-5. [**Maintainer**]
- - Review the PR but **do not merge it**.
- - If approved: let build server to proceed.
- - Otherwise: stop the pipeline.
-6. [**build server**]
- - If approved:
- - perform fast-forward merge;
- - create and push tag `vX.Y.Z`;
- - Notify maintainers.
- - Otherwise rollback `release commit` by moving `release-X.Y.Z` to its parent.
-
-#### 3. Publishing
-
-1. [**build server**] triggered once the `release PR` is merged
- - Publish to PyPI as `X.Y.Z`.
- - Download and re-pack debian package `X.Y.Z~rcN` (from `rc-latest` channel) to `X.Y.Z` changing only the package name.
- - Publish the package to `rc-latest` debian channel.
- - Copy the package along with its dependencies from `rc-latest` to `stable-latest` channel.
- - [`indy-node`] Copy the package along with its dependencies (including `indy-plenum`) from `stable-latest` to `stable` channel.
- - [`indy-node`] Run system tests for the `stable` channel.
- - Notify maintainers.
+ - Wait till the PR with the updated Version number from the RC preperation step is created and the Pipeline has run its tests (`releasepr.yaml` run successfully).
+2. [**Maintainer**]
+ - If all checks passed and RC is approved merge the update version PR. It will kick of the `publishRelease.yaml` Pipeline, which creates a Github Release and publishes the packages.
+ - Otherwise: just close the update version PR **without** merging.
+3. [**GHA (`publishRelease.yaml`)**]
+ - Gets the artifacts from the last successfull `releasepr.yaml` run.
+ - Publishes the artifacts to a Github Release, pypi and Artifactory.
#### 4. New Development Cycle Start
-1. [**Contributor**]:
- - Create PR to `master` with version bump to `X'.Y'.Z'.dev0`, where `X'.Y'.Z'` is next target release version. Usually it increments one of `X`, `Y` or `Z` and resets lower parts (check [SemVer](https://semver.org/) for more details), e.g.:
- - `X.Y.Z+1` - bugfix release
- - `X.Y+1.0` - feature release, backwards compatible API additions/changes
- - `X+1.0.0` - major release, backwards incompatible API changes
-
### Hotfix Release
Hotfix release is quite similar except the following difference:
- - hotfix branches named `hotfix-X.Y.Z`;
+ - hotfix branches named `hotfix-X.Y.Z` created from last Release commit;
- `master` usually is not merged since hotfixes (as a rule) should include only fixes for stable code.
+ - `setRelease`-Tag created on Hotfix branch.
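For reference, the whole CD flow described above is driven by pushing the `setRelease` tag. A minimal sketch of how a maintainer would kick off a release candidate (the version number and branch here are just examples):
```
# create and push the trigger tag on the branch the release is cut from
git checkout ubuntu-20.04-upgrade
git pull --ff-only
git tag setRelease-v1.13.0-rc1
git push origin setRelease-v1.13.0-rc1
# tag.yaml then bumps the version and opens the "update version" PR;
# merging that PR kicks off publishRelease.yaml
```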
* [X] diff --git a/docs/source/release-workflow.png b/docs/source/release-workflow.png
New binary file. A release workflow image.
new file mode 100644
index 00000000..35271888
Binary files /dev/null and b/docs/source/release-workflow.png differ
* [X] diff --git a/docs/source/release-workflow.puml b/docs/source/release-workflow.puml
Release workflow diagram source (PlantUML). Okidoke
new file mode 100644
index 00000000..79d17610
--- /dev/null
+++ b/docs/source/release-workflow.puml
@@ -0,0 +1,81 @@
+@startuml
+
+:**Tag Triggered - Release Workflow**;
+start
+
+:Create a tag on the branch and commit on which the release
+is to be based.
+
+The tag must be in the following format:
+ - ""setRelease-v<Major>.<Minor>.<Patch>[-rc<Num>]""
+
+Examples:
+ - To generate an RC Release (marked as a pre-release)
+ - ""setRelease-v1.12.6-rc0""
+ - To generate an official Release
+ - ""setRelease-v1.12.6"";
+
+partition "**Workflow**: tag.yaml" {
+ floating note left
+ Workflow triggered
+ by the ""setRelease""
+ tag being pushed.
+ end note
+
+ :Extract information;
+ note left:Extract version number from tag
+ :Bump version number;
+ :Create Pull Request to the branch, where the tag was set on;
+}
+
+partition "**Workflow**: releasepr.yaml" {
+ floating note left
+ Workflow triggered
+ by Pull Request affecting
+ **ONLY** ""indy_node/_version_.json"".
+ end note
+
+ :Extract information;
+ if (isVersionBump) then (yes)
+ :Lint using ""indy-shared-gha"";
+ :build Docker-images using ""indy-shared-gha"";
+ :Execute tests (""reuseable_tests.yaml"");
+ :Esecute ""indy-test-automation"";
+ note left: WIP
+ :build packages using ""indy-shared-gha"";
+ note left: packages published to workflow
+ else (no)
+ endif
+}
+
+if (**Review PR** - All tests passed?) then (Merge PR)
+ partition "**Workflow**: publishRelease.yaml" {
+ floating note right
+ Workflow triggered by
+ pushes affecting
+ **ONLY**
+ ""indy_node/_version_.json"".
+ end note
+
+ :Extract version number from the commit message;
+ if (isVersionBump) then (yes)
+ :Download artifacts from last successfull ""releasepr"" run;
+ :Create a GitHub Release;
+ :Set tag and title to match release version;
+
+ if (is RC Release) then (yes)
+ :Set pre-release checkbox;
+ else (no)
+ endif
+ :Publish GitHub Release
+ - Containing the artifacts;
+ :Publish Packages using ""indy-shared-gha"";
+ else (no)
+ endif
+ }
+else (Close PR without Merging)
+ :Release process aborted;
+endif
+
+stop
+@enduml
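The committed PNG above is presumably rendered from this PlantUML source; if the two ever drift apart, regenerating the image is a one-liner (a sketch, assuming a local `plantuml` install):
```
# writes docs/source/release-workflow.png next to the .puml source
plantuml docs/source/release-workflow.puml
```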
* [X] diff --git a/docs/source/requests-new.md b/docs/source/requests-new.md
Let us double-check whether this is new or old; I think it is new though.
Ledger freezes were added
* commit 4d2066d82387820fd151e3d42c43fa034fb543b4
Author: anton.denishchenko <anton.denishchenko@evernym.com>
Date: Fri Feb 19 02:40:22 2021 +0300
Add docs for freeze transactions
Signed-off-by: anton.denishchenko <anton.denishchenko@evernym.com>
As we can see from the following git log, this was not in stable.
git log --date-order --graph --decorate ^origin/stable origin/ubuntu-20.04-upgrade -p docs/source/requests-new.md
index e69ff800..cbb1f63e 100644
--- a/docs/source/requests-new.md
+++ b/docs/source/requests-new.md
@@ -16,6 +16,7 @@
* [NODE](#node)
* [POOL_UPGRADE](#pool_upgrade)
* [POOL_CONFIG](#pool_config)
+ * [LEDGERS_FREEZE](#ledgers_freeze)
* [Read Requests](#read-requests)
@@ -24,6 +25,7 @@
* [GET_SCHEMA](#get_schema)
* [GET_CLAIM_DEF](#get_claim_def)
* [GET_TXN](#get_txn)
+ * [GET_FROZEN_LEDGERS](#get_frozen_ledgers)
This doc is about supported client"s Request (both write and read ones).
If you are interested in transactions and their representation on the Ledger (that is internal one),
@@ -1386,6 +1388,66 @@ Command to change Pool's configuration
}
```
+### LEDGERS_FREEZE
+
+Freeze deprecated ledgers (default ledgers such as the domain, config, pool, and audit ledgers cannot be frozen). If a ledger is frozen it can be used for reading but not for writing. Frozen ledgers will not be caught up by new nodes and they can't be unfrozen. Frozen ledgers can be removed without breaking consensus, but this would prevent third parties from auditing the ledger history. [More information is in the Indy Plenum documenation](https://hyperledger/indy-plenum/tree/master/docs/source/transaction_freeze_ledgers.md).
+
+The request has static and dynamic validations. Static validation checks to avoid freezing base ledgers (pool, audit, domain and config). Dynamic validation checks the existence of ledgers before freezing. Authorize checks the permissions for the freeze request (3 trustee signatures are needed by default).
+
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "9",
+ "ledgers_ids": [1,2,3,4]
+ },
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 311348486,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
+}
+```
+
+*Reply Example*:
+```
+{
+ "op":"REPLY",
+ "result":{
+ "txn":{
+ "type":"9",
+ "data":{
+ "ledgers_ids":[1,2,3,4]
+ },
+ "protocolVersion":2,
+ "metadata":{
+ "reqId":311348486,
+ "payloadDigest":"09a3ccedf806e224beb56b547e967b442f3ee3181d5c87623f063742df7a692e",
+ "from":"M9BJDuS24bqbJNvBRsoGg3",
+ "digest":"dfdc6c5b77181953b4e32f975b0c5e64b25dc3e3061716aca1baae4cbe0ce494"
+ }
+ },
+ "ver":"1",
+ "auditPath":[
+ "DDwrSsKwpFkfGVqp7AxzRMusUuT9D5RmidCmnr8phTWD"
+ ],
+ "txnMetadata":{
+ "txnTime":1613735420,
+ "seqNo":3
+ },
+ "reqSignature":{
+ "type":"ED25519",
+ "values":[
+ {
+ "value":"bSMfwJraLXzBAmdZKQUC1XfoVb3YWygc6UCQAmrTNKDr9beXta7MFyNZnQtbmNoRurjicSHLo3sW7qv7ZTWcZJa",
+ "from":"M9BJDuS24bqbJNvBRsoGg3"
+ }
+ ]
+ },
+ "rootHash":"E27ssC3LK8azpgxxtBY4ETLXvDLGtfmeSjngc6jrV1Qt"
+ }
+}
+```
+
## Read Requests
### GET_NYM
@@ -1898,3 +1960,67 @@ A generic request to get a transaction from Ledger by its sequence number.
}
```
+
+### GET_FROZEN_LEDGERS
+
+Get whole list of frozen ledgers. Reply has follow state format data:
+
+```
+<ledger_id>: {
+ ledger: <ledger_root_hash>,
+ state: <state_root_hash>,
+ seq_no: <last_seq_no>
+}
+```
+
+*Request Example*:
+```
+{
+ "operation":{
+ "type":"10"
+ },
+ "reqId":783857061,
+ "protocolVersion":2,
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3"
+}
+```
+
+*Reply Example*:
+```
+{
+ "result":{
+ "seqNo":3,
+ "type":"10",
+ "state_proof":{
+ "root_hash":"HUv35b31eqncHZ1R8xMQW9pJnCBqAaUVrfCA8AeTtx6u",
+ "multi_signature":{
+ "value":{
+ "pool_state_root_hash":"4bCEk76QsB6p3yCiDntMedpeZmiQtdH9NRpcFyvaLHhc",
+ "state_root_hash":"HUv35b31eqncHZ1R8xMQW9pJnCBqAaUVrfCA8AeTtx6u",
+ "timestamp":1613736202,
+ "ledger_id":2,
+ "txn_root_hash":"BY6PV9SrV1dgQgxy2kpeTLESQfazTYoLdLZfjzVmcLeV"
+ },
+ "signature":"R8FRHVg51YiY5nS8Hh8iXNa1ZPKjrQMmurnrGek2A7QMKq79Pws4DLhgcVgf66PSJGEPjmyASYxFziEnubY1RFHQiE7ZToLZqW4oJt11hhL1XgXwrdswyqTQjuyxx5nzjyE4AzyTvs3BywD54s3w3mUhLG3QWwBp1uTX8agLEKZDkK",
+ "participants":[
+ "Gamma",
+ "Delta",
+ "Beta"
+ ]
+ },
+ "proof_nodes":"+L74vJEgNDpGUk9aRU5fTEVER0VSU7io+Ka4pHsibHNuIjozLCJsdXQiOjE2MTM3MzYyMDIsInZhbCI6eyI5MDkiOnsibGVkZ2VyIjoiR0tvdDVoQnNkODFrTXVwTkNYSGFxYmh2M2h1RWJ4QUZNTG5wY1gyaG5pd24iLCJzZXFfbm8iOjAsInN0YXRlIjoiRGZOTG1INERBSFRLdjYzWVBGSnp1UmRlRXRWd0Y1UnRWbnZLWUhkOGlMRUEifX19"
+ },
+ "txnTime":1613736202,
+ "reqId":666493618,
+ "data":{
+ "909":{
+ "seq_no":0,
+ "state":"DfNLmH4DAHTKv63YPFJzuRdeEtVwF5RtVnvKYHd8iLEA",
+ "ledger":"GKot5hBsd81kMupNCXHaqbhv3huEbxAFMLnpcX2hniwn"
+ }
+ },
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3"
+ },
+ "op":"REPLY"
+}
+```
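A quick way to confirm the freeze feature is actually wired into the ubuntu-20.04-upgrade branch (and not just documented) is to grep the branch for the new names. A sketch, with the search terms taken straight from the docs above (whether they live in indy-node or indy-plenum code is something to confirm):
```
# look for the LEDGERS_FREEZE / GET_FROZEN_LEDGERS plumbing on the branch
git grep -n -i -e frozen_ledgers -e ledgers_freeze origin/ubuntu-20.04-upgrade | head -20
```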
* [X] diff --git a/docs/source/requests.md b/docs/source/requests.md
This is new in the ubuntu-20.04-upgrade branch.
git log --date-order --graph --decorate ^origin/stable origin/ubuntu-20.04-upgrade -p docs/source/requests.md
* commit e2c6efe2a0d2a3e56d9425e4467ff67cef70cffe
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Fri Feb 21 02:14:29 2020 +0300
|
| INDY-2338: add requests docs
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
| diff --git a/docs/source/requests.md b/docs/source/requests.md
| index 300a7fd8..f44bca28 100644
| --- a/docs/source/requests.md
| +++ b/docs/source/requests.md
| @@ -10,6 +10,12 @@
| * [CLAIM_DEF](#claim_def)
| * [REVOC_REG_DEF](#revoc_reg_def)
| * [REVOC_REG_ENTRY](#revoc_reg_entry)
| + * [JSON_LD_CONTEXT](#json_ld_context)
| + * [RICH_SCHEMA](#rich_schema)
| + * [RICH_SCHEMA_ENCODING](#rich_schema_encoding)
| + * [RICH_SCHEMA_MAPPING](#rich_schema_mapping)
| + * [RICH_SCHEMA_CRED_DEF](#rich_schema_cred_def)
| + * [RICH_SCHEMA_PRES_DEF](#rich_schema_pres_def)
index 6ca1d588..eaba18f0 100644
--- a/docs/source/requests.md
+++ b/docs/source/requests.md
@@ -10,6 +10,12 @@
* [CLAIM_DEF](#claim_def)
* [REVOC_REG_DEF](#revoc_reg_def)
* [REVOC_REG_ENTRY](#revoc_reg_entry)
+ * [JSON_LD_CONTEXT](#json_ld_context)
+ * [RICH_SCHEMA](#rich_schema)
+ * [RICH_SCHEMA_ENCODING](#rich_schema_encoding)
+ * [RICH_SCHEMA_MAPPING](#rich_schema_mapping)
+ * [RICH_SCHEMA_CRED_DEF](#rich_schema_cred_def)
+ * [RICH_SCHEMA_PRES_DEF](#rich_schema_pres_def)
* [NODE](#node)
* [POOL_UPGRADE](#pool_upgrade)
* [POOL_CONFIG](#pool_config)
@@ -18,7 +24,8 @@
* [TRANSACTION_AUTHOR_AGREEMENT](#transaction_author_agreement)
* [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_AML)
* [TRANSACTION_AUTHOR_AGREEMENT_DISABLE](#transaction_author_agreement_disable)
- * [SET_CONTEXT](#set_context)
+ * [LEDGERS_FREEZE](#ledgers_freeze)
+
* [Read Requests](#read-requests)
What do we do about the rest of the diffs? There seems to be a lot in this file... Perhaps we should look at the edit history that is in stable but not in ubuntu-20.04. What we are really trying to see is whether there is anything in the code base that is in stable but not in 20.04, or that should be removed because it was removed from stable. So we care more about origin/stable history than we do about ubuntu-20.04 history...
So basically this...
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p docs/source/requests.md
* commit 9adcfca1f3c9a82929b71e40359859990a0ca0cb
| Merge: b54404d0 c99cc4cc
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Mon Jan 27 14:50:17 2020 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.12.2.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: ff88db39 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 8940e559 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: 1cd837b1 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit 1cd837b1d093f1afc408b5395cbc00353f7e3cca
| Merge: a8784f11 df9959d0
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed May 29 09:36:39 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.8.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: c009c3f0 d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
Merge: 089d12e4 16697cf9
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Feb 6 09:41:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.6.83
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
We can see that all of these are merge commits that ubuntu-20.04 already has, so we can safely assume this file is taken care of.
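To run the same check across the whole tree instead of file by file, something like this works (a sketch using the same ref notation as above):
```
# list every file touched by commits reachable from origin/stable
# but not from origin/ubuntu-20.04-upgrade
git log --name-only --pretty=format: origin/stable ^origin/ubuntu-20.04-upgrade \
  | sort -u | sed '/^$/d'
```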
@@ -28,12 +35,16 @@
* [GET_CLAIM_DEF](#get_claim_def)
* [GET_REVOC_REG_DEF](#get_revoc_reg_def)
* [GET_REVOC_REG](#get_revoc_reg)
- * [GET_REVOC_REG_DELTA](#get_revoc_reg_delta)
+ * [GET_REVOC_REG_DELTA](#get_revoc_reg_delta)
+ * [GET_RICH_SCHEMA_OBJECT_BY_ID](#get_rich_schema_object_by_id)
+ * [GET_RICH_SCHEMA_OBJECT_BY_METADATA](#get_rich_schema_object_by_metadata)
* [GET_AUTH_RULE](#get_auth_rule)
* [GET_TRANSACTION_AUTHOR_AGREEMENT](#get_transaction_author_agreement)
* [GET_TRANSACTION_AUTHOR_AGREEMENT_AML](#get_transaction_author_agreement_aml)
* [GET_CONTEXT](#get_context)
+ * [GET_RICH_SCHEMA](#get_rich_schema)
* [GET_TXN](#get_txn)
+ * [GET_FROZEN_LEDGERS](#get_frozen_ledgers)
* [Action Requests](#action-requests)
@@ -54,17 +65,18 @@ Each Request (both write and read) is a JSON with a number of common metadata fi
```
{
- 'operation': {
- 'type': <request type>,
+ "operation": {
+ "type": <request type>,
+ "ver": <operation version>,
<request-specific fields>
},
- 'identifier': <author DID>,
- `endorser`: <endorser DID>,
- 'reqId': <req_id unique integer>,
- 'protocolVersion': 2,
- 'signature': <signature_value>,
- # 'signatures': {
+ "identifier": <author DID>,
+ "endorser": <endorser DID>,
+ "reqId": <req_id unique integer>,
+ "protocolVersion": 2,
+ "signature": <signature_value>,
+ # "signatures": {
# `did1`: <sig1>,
# `did2`: <sig2>,
# }
@@ -94,7 +106,12 @@ Each Request (both write and read) is a JSON with a number of common metadata fi
- REVOC_REG_ENTRY = "114"
- AUTH_RULE = "120"
- AUTH_RULES = "122"
- - SET_CONTEXT = "200"
+ - JSON_LD_CONTEXT = "200"
+ - RICH_SCHEMA = "201"
+ - RICH_SCHEMA_ENCODING = "202"
+ - RICH_SCHEMA_MAPPING = "203"
+ - RICH_SCHEMA_CRED_DEF = "204"
+ - RICH_SCHEMA_PRES_DEF = "205"
- read requests:
@@ -108,8 +125,16 @@ Each Request (both write and read) is a JSON with a number of common metadata fi
- GET_REVOC_REG_DEF = "115"
- GET_REVOC_REG = "116"
- GET_REVOC_REG_DELTA = "117"
- - GET_AUTH_RULE = "121"
- - GET_CONTEXT = "300"
+ - GET_AUTH_RULE = "121"
+ - GET_RICH_SCHEMA_OBJECT_BY_ID = "300"
+ - GET_RICH_SCHEMA_OBJECT_BY_METADATA = "301"
+
+ - `ver` (string; optional)
+
+ Operation request version which will be used as a transaction's payload version.
+ If the input request doesn't have the version specified, then default one will be used.
+ Some transactions (for example TRANSACTION_AUTHOR_AGREEMENT) have non-default transaction payload version
+ defined in source code as a result of evolution of business logic and features.
- request-specific data
@@ -159,22 +184,23 @@ Write requests to Domain and added-by-plugins ledgers may have additional Transa
```
{
- 'operation': {
- 'type': <request type>,
+ "operation": {
+ "type": <request type>,
+ "ver": <operation version>,
<request-specific fields>
},
- 'identifier': <author DID>,
- 'endorser': <endorser DID>,
- 'reqId': <req_id unique integer>,
- 'taaAcceptance': {
- 'taaDigest': <digest hex string>,
- 'mechanism': <mechaism string>,
- 'time': <time integer>
+ "identifier": <author DID>,
+ "endorser": <endorser DID>,
+ "reqId": <req_id unique integer>,
+ "taaAcceptance": {
+ "taaDigest": <digest hex string>,
+ "mechanism": <mechaism string>,
+ "time": <time integer>
}
- 'protocolVersion': 2,
- 'signature': <signature_value>,
- # 'signatures': {
+ "protocolVersion": 2,
+ "signature": <signature_value>,
+ # "signatures": {
# `did1`: <sig1>,
# `did2`: <sig2>,
# }
@@ -205,15 +231,16 @@ of a transaction in the Ledger (see [transactions](transactions.md)).
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": <...>,
"txn": {
"type": <...>,
+ "ver": <...>,
"protocolVersion": <...>,
"data": {
- "ver": <...>,
+
<txn-specific fields>
},
@@ -243,8 +270,8 @@ of a transaction in the Ledger (see [transactions](transactions.md)).
}]
}
- 'rootHash': '5ecipNPSztrk6X77fYPdepzFRUvLdqBuSqv4M9Mcv2Vn',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '3phchUcMsnKFk2eZmcySAWm2T5rnzZdEypW7A5SKi1Qt'],
+ "rootHash": "5ecipNPSztrk6X77fYPdepzFRUvLdqBuSqv4M9Mcv2Vn",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "3phchUcMsnKFk2eZmcySAWm2T5rnzZdEypW7A5SKi1Qt"],
}
}
```
@@ -275,6 +302,7 @@ of a transaction in the Ledger (see [transactions](transactions.md)).
- REVOC_REG_DEF = "114"
- AUTH_RULE = "120"
- SET_CONTEXT = "200"
+ - SET_RICH_SCHEMA = "201"
- `protocolVersion` (integer; optional):
@@ -282,6 +310,13 @@ of a transaction in the Ledger (see [transactions](transactions.md)).
Since clients and different Nodes may be at different versions, we need this field to support backward compatibility
between clients and nodes.
+ - `ver` (string; optional)
+
+ Transaction's payload version as defined in the input request.
+ If the input request doesn't have the version specified, then default one will be used.
+ Some transactions (for example TRANSACTION_AUTHOR_AGREEMENT) have non-default transaction payload version
+ defined in source code as a result of evolution of business logic and features.
+
- `data` (dict):
Transaction-specific data fields (see next sections for each transaction description).
@@ -382,32 +417,32 @@ These common metadata values are added to the result's JSON at the same level as
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '105',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514214863899317,
+ "op": "REPLY",
+ "result": {
+ "type": "105",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514214863899317,
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'state_proof': {
- 'root_hash': '7Wdj3rrMCZ1R1M78H4xK5jxikmdUUGW2kbfJQ1HoEpK',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514214795,
- 'ledger_id': 1,
- 'txn_root_hash': 'DqQ7G4fgDHBfdfVLrE6DCdYyyED1fY5oKw76aDeFsLVr',
- 'pool_state_root_hash': 'TfMhX3KDjrqq94Wj7BHV9sZrgivZyjbHJ3cGRG4h1Zj',
- 'state_root_hash': '7Wdj3rrMCZ1R1M78H4xK5jxikmdUUGW2kbfJQ1HoEpK'
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "state_proof": {
+ "root_hash": "7Wdj3rrMCZ1R1M78H4xK5jxikmdUUGW2kbfJQ1HoEpK",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514214795,
+ "ledger_id": 1,
+ "txn_root_hash": "DqQ7G4fgDHBfdfVLrE6DCdYyyED1fY5oKw76aDeFsLVr",
+ "pool_state_root_hash": "TfMhX3KDjrqq94Wj7BHV9sZrgivZyjbHJ3cGRG4h1Zj",
+ "state_root_hash": "7Wdj3rrMCZ1R1M78H4xK5jxikmdUUGW2kbfJQ1HoEpK"
},
- 'signature': 'RTyxbErBLcmTHBLj1rYCAEpMMkLnL65kchGni2tQczqzomYWZx9QQpLvnvNN5rD2nXkqaVW3USGak1vyAgvj2ecAKXQZXwcfosmnsBvRrH3M2M7cJeZSVWJCACfxMWuxAoMRtuaE2ABuDz6NFcUctXcSa4rdZFkxh5GoLYFqU4og6b',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "RTyxbErBLcmTHBLj1rYCAEpMMkLnL65kchGni2tQczqzomYWZx9QQpLvnvNN5rD2nXkqaVW3USGak1vyAgvj2ecAKXQZXwcfosmnsBvRrH3M2M7cJeZSVWJCACfxMWuxAoMRtuaE2ABuDz6NFcUctXcSa4rdZFkxh5GoLYFqU4og6b",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
- 'data': <transaction-specific data>,
+ "data": <transaction-specific data>,
<request-specific data>
}
@@ -430,6 +465,7 @@ These common metadata values are added to the result's JSON at the same level as
- GET_REVOC_REG_DELTA = "117"
- GET_AUTH_RULE = "121"
- GET_CONTEXT = "300"
+ - GET_RICH_SCHEMA = "301"
- `identifier` (base58-encoded string):
@@ -515,7 +551,17 @@ creation of new DIDs, setting, and rotation of verification key, setting and cha
NYM's alias.
+- `diddocContent` (json string; optional):
+
+ The diddocContent item is stored directly in the ledger state and has a maximum size of 10 KiB (10 x 1024 bytes).
+
+- `version` (integer; optional):
+
+ The NYM transaction version specifies the required level of validation of the relationship between the namespace identifier component of the DID and the intial public key (verkey). This field is optional, but if the NYM transaction version is provided, it must be set upon creation and cannot be updated. The accepted values are as follows:
+ 0 or NYM transaction version is not set: No validation of namespace identifier and initial verkey binding is performed.
+ 1: Validation is performed according to the did:sov method, in which the DID must be the first 16 bytes of the Verification Method public key.
+ 2: Validation is performed according to the did:indy, in which the namespace identifier component of the DID (last element) is derived from the initial public key of the DID, using the base58 encoding of the first 16 bytes of the SHA256 of the Verification Method public key (did = Base58(Truncate_msb(16(SHA256(publicKey))))). This DID is considered self-certifying.
If there is no NYM transaction with the specified DID (`dest`), then it can be considered as the creation of a new DID.
If there is a NYM transaction with the specified DID (`dest`), then this is update of existing DID.
@@ -527,32 +573,32 @@ So, if key rotation needs to be performed, the owner of the DID needs to send a
*Request Example*:
```
{
- 'operation': {
- 'type': '1'
- 'dest': 'GEzcdDLhCpGCYRHW82kjHd',
- 'role': '101',
- 'verkey': '~HmUWn928bnFT6Ephf65YXv'
+ "operation": {
+ "type": "1"
+ "dest": "GEzcdDLhCpGCYRHW82kjHd",
+ "role": "101",
+ "verkey": "~HmUWn928bnFT6Ephf65YXv"
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514213797569745,
- 'protocolVersion': 2,
- 'signature': '49W5WP5jr7x1fZhtpAhHFbuUDqUYZ3AKht88gUjrz8TEJZr5MZUPjskpfBFdboLPZXKjbGjutoVascfKiMD5W7Ba',
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514213797569745,
+ "protocolVersion": 2,
+ "signature": "49W5WP5jr7x1fZhtpAhHFbuUDqUYZ3AKht88gUjrz8TEJZr5MZUPjskpfBFdboLPZXKjbGjutoVascfKiMD5W7Ba",
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
"type":"1",
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver": 1,
"dest":"GEzcdDLhCpGCYRHW82kjHd",
"verkey":"~HmUWn928bnFT6Ephf65YXv",
"role":101,
@@ -578,12 +624,12 @@ So, if key rotation needs to be performed, the owner of the DID needs to send a
}]
}
- 'rootHash': '5ecipNPSztrk6X77fYPdepzFRUvLdqBuSqv4M9Mcv2Vn',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '3phchUcMsnKFk2eZmcySAWm2T5rnzZdEypW7A5SKi1Qt'],
+ "rootHash": "5ecipNPSztrk6X77fYPdepzFRUvLdqBuSqv4M9Mcv2Vn",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "3phchUcMsnKFk2eZmcySAWm2T5rnzZdEypW7A5SKi1Qt"],
- 'dest': 'N22KY2Dyvmuu2PyyqSFKue',
- 'role': '101',
- 'verkey': '31V83xQnJDkZTSvm796X4MnzZFtUc96Tq6GJtuVkFQBE'
+ "dest": "N22KY2Dyvmuu2PyyqSFKue",
+ "role": "101",
+ "verkey": "31V83xQnJDkZTSvm796X4MnzZFtUc96Tq6GJtuVkFQBE"
}
}
```
@@ -614,33 +660,33 @@ Adds or updates an attribute to a NYM record.
*Request Example*:
```
{
- 'operation': {
- 'type': '100'
- 'dest': 'N22KY2Dyvmuu2PyyqSFKue',
- 'raw': '{"name": "Alice"}'
+ "operation": {
+ "type": "100"
+ "dest": "N22KY2Dyvmuu2PyyqSFKue",
+ "raw": "{"name": "Alice"}"
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514213797569745,
- 'protocolVersion': 2,
- 'signature': '49W5WP5jr7x1fZhtpAhHFbuUDqUYZ3AKht88gUjrz8TEJZr5MZUPjskpfBFdboLPZXKjbGjutoVascfKiMD5W7Ba',
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514213797569745,
+ "protocolVersion": 2,
+ "signature": "49W5WP5jr7x1fZhtpAhHFbuUDqUYZ3AKht88gUjrz8TEJZr5MZUPjskpfBFdboLPZXKjbGjutoVascfKiMD5W7Ba",
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
"type":"100",
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
"dest":"N22KY2Dyvmuu2PyyqSFKue",
- 'raw': '{"name":"Alice"}'
+ "raw": "{"name":"Alice"}"
},
"metadata": {
@@ -663,8 +709,8 @@ Adds or updates an attribute to a NYM record.
}]
}
- 'rootHash': '5ecipNPSztrk6X77fYPdepzFRUvLdqBuSqv4M9Mcv2Vn',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '3phchUcMsnKFk2eZmcySAWm2T5rnzZdEypW7A5SKi1Qt'],
+ "rootHash": "5ecipNPSztrk6X77fYPdepzFRUvLdqBuSqv4M9Mcv2Vn",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "3phchUcMsnKFk2eZmcySAWm2T5rnzZdEypW7A5SKi1Qt"],
}
}
@@ -687,39 +733,39 @@ So, if the Schema needs to be evolved, a new Schema with a new version or name n
*Request Example*:
```
{
- 'operation': {
- 'type': '101',
- 'data': {
- 'version': '1.0',
- 'name': 'Degree',
- 'attr_names': ['undergrad', 'last_name', 'first_name', 'birth_date', 'postgrad', 'expiry_date']
+ "operation": {
+ "type": "101",
+ "data": {
+ "version": "1.0",
+ "name": "Degree",
+ "attr_names": ["undergrad", "last_name", "first_name", "birth_date", "postgrad", "expiry_date"]
},
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'endorser': 'D6HG5g65TDQr1PPHHRoiGf',
- 'reqId': 1514280215504647,
- 'protocolVersion': 2,
- 'signature': '5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS'
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
"type":"101",
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
"data": {
"name": "Degree",
"version": "1.0",
- 'attr_names': ['undergrad', 'last_name', 'first_name', 'birth_date', 'postgrad', 'expiry_date']
+ "attr_names": ["undergrad", "last_name", "first_name", "birth_date", "postgrad", "expiry_date"]
}
},
@@ -744,8 +790,8 @@ So, if the Schema needs to be evolved, a new Schema with a new version or name n
}]
}
- 'rootHash': '5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b'],
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
}
}
@@ -782,43 +828,43 @@ a new Claim Def needs to be created by a new Issuer DID (`identifier`).
*Request Example*:
```
{
- 'operation': {
- 'type': '102',
- 'signature_type': 'CL',
- 'ref': 10,
- 'tag': 'some_tag',
- 'data': {
- 'primary': ....,
- 'revocation': ....
+ "operation": {
+ "type": "102",
+ "signature_type": "CL",
+ "ref": 10,
+ "tag": "some_tag",
+ "data": {
+ "primary": ....,
+ "revocation": ....
}
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'endorser': 'D6HG5g65TDQr1PPHHRoiGf',
- 'reqId': 1514280215504647,
- 'protocolVersion': 2,
- 'signature': '5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS'
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
"type":"102",
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
"signature_type":"CL",
- 'ref': 10,
- 'tag': 'some_tag',
- 'data': {
- 'primary': ....,
- 'revocation': ....
+ "ref": 10,
+ "tag": "some_tag",
+ "data": {
+ "primary": ....,
+ "revocation": ....
}
},
@@ -843,8 +889,8 @@ a new Claim Def needs to be created by a new Issuer DID (`identifier`).
}]
},
- 'rootHash': '5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b'],
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
}
}
@@ -874,51 +920,51 @@ It contains public keys, maximum number of credentials the registry may contain,
*Request Example*:
```
{
- 'operation': {
- 'type': '113',
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'credDefId': 'FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag'
- 'revocDefType': 'CL_ACCUM',
- 'tag': 'tag1',
- 'value': {
- 'maxCredNum': 1000000,
- 'tailsHash': '6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297',
- 'tailsLocation': 'http://tails.location.com',
- 'issuanceType': 'ISSUANCE_BY_DEFAULT',
- 'publicKeys': {},
+ "operation": {
+ "type": "113",
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "credDefId": "FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag"
+ "revocDefType": "CL_ACCUM",
+ "tag": "tag1",
+ "value": {
+ "maxCredNum": 1000000,
+ "tailsHash": "6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297",
+ "tailsLocation": "http://tails.location.com",
+ "issuanceType": "ISSUANCE_BY_DEFAULT",
+ "publicKeys": {},
},
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'endorser': 'D6HG5g65TDQr1PPHHRoiGf',
- 'reqId': 1514280215504647,
- 'protocolVersion': 2,
- 'signature': '5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS'
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
"type":"113",
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'credDefId': 'FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag'
- 'revocDefType': 'CL_ACCUM',
- 'tag': 'tag1',
- 'value': {
- 'maxCredNum': 1000000,
- 'tailsHash': '6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297',
- 'tailsLocation': 'http://tails.location.com',
- 'issuanceType': 'ISSUANCE_BY_DEFAULT',
- 'publicKeys': {},
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "credDefId": "FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag"
+ "revocDefType": "CL_ACCUM",
+ "tag": "tag1",
+ "value": {
+ "maxCredNum": 1000000,
+ "tailsHash": "6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297",
+ "tailsLocation": "http://tails.location.com",
+ "issuanceType": "ISSUANCE_BY_DEFAULT",
+ "publicKeys": {},
},
},
@@ -943,8 +989,8 @@ It contains public keys, maximum number of credentials the registry may contain,
}]
},
- 'rootHash': '5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b'],
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
}
}
@@ -968,45 +1014,45 @@ The RevocReg entry containing the new accumulator value and issued/revoked indic
*Request Example*:
```
{
- 'operation': {
- 'type': '114',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1'
- 'revocDefType': 'CL_ACCUM',
- 'value': {
- 'accum': 'accum_value',
- 'prevAccum': 'prev_acuum_value',
- 'issued': [],
- 'revoked': [10, 36, 3478],
+ "operation": {
+ "type": "114",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1"
+ "revocDefType": "CL_ACCUM",
+ "value": {
+ "accum": "accum_value",
+ "prevAccum": "prev_acuum_value",
+ "issued": [],
+ "revoked": [10, 36, 3478],
},
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'endorser': 'D6HG5g65TDQr1PPHHRoiGf',
- 'reqId': 1514280215504647,
- 'protocolVersion': 2,
- 'signature': '5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS'
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
"type":"114",
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1'
- 'revocDefType': 'CL_ACCUM',
- 'value': {
- 'accum': 'accum_value',
- 'prevAccum': 'prev_acuum_value',
- 'issued': [],
- 'revoked': [10, 36, 3478],
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1"
+ "revocDefType": "CL_ACCUM",
+ "value": {
+ "accum": "accum_value",
+ "prevAccum": "prev_acuum_value",
+ "issued": [],
+ "revoked": [10, 36, 3478],
},
},
@@ -1031,93 +1077,113 @@ The RevocReg entry containing the new accumulator value and issued/revoked indic
}]
},
- 'rootHash': '5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b'],
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
}
}
```
-### NODE
-Adds a new node to the pool, or updates existing node in the pool.
+### JSON_LD_CONTEXT
+Adds a JSON LD Context as part of Rich Schema feature.
-- `data` (dict):
-
- Data associated with the Node:
-
- - `alias` (string): Node's alias
- - `blskey` (base58-encoded string; optional): BLS multi-signature key as base58-encoded string (it's needed for BLS signatures and state proofs support)
- - `client_ip` (string; optional): Node's client listener IP address, that is the IP clients use to connect to the node when sending read and write requests (ZMQ with TCP)
- - `client_port` (string; optional): Node's client listener port, that is the port clients use to connect to the node when sending read and write requests (ZMQ with TCP)
- - `node_ip` (string; optional): The IP address other Nodes use to communicate with this Node; no clients are allowed here (ZMQ with TCP)
- - `node_port` (string; optional): The port other Nodes use to communicate with this Node; no clients are allowed here (ZMQ with TCP)
- - `services` (array of strings; optional): the service of the Node. `VALIDATOR` is the only supported one now.
+It's not possible to update an existing Context.
+If the Context needs to be evolved, a new Context with a new id and name-version needs to be created.
-- `dest` (base58-encoded string):
- Target Node's verkey as base58-encoded string for 16 or 32 byte DID value.
- It differs from `identifier` metadata field, where `identifier` is the DID of the transaction submitter (Steward's DID).
- *Example*: `identifier` is a DID of a Steward creating a new Node, and `dest` is the verkey of this Node.
+- `id` (string):
-If there is no NODE transaction with the specified Node ID (`dest`), then it can be considered as creation of a new NODE.
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
-If there is a NODE transaction with the specified Node ID (`dest`), then this is update of existing NODE.
-In this case we can specify only the values we would like to override. All unspecified values remain the same.
-So, if a Steward wants to rotate BLS key, then it's sufficient to send a NODE transaction with `dest` and a new `blskey` in `data`.
-There is no need to specify all other fields in `data`, and they will remain the same.
+ Context object as JSON serialized in canonical form. It must have `@context` as a top level key.
+ The `@context` value must be either:
+ 1) a URI (it should dereference to a Context object)
+ 2) a Context object (a dict)
+ 3) an array of Context objects and/or Context URIs
+
+- `rsType` (string):
+
+ Context's type. Currently expected to be `ctx`.
+
+- `rsName` (string):
+
+ Context's name
+
+- `rsVersion` (string):
+
+ Context's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
*Request Example*:
```
{
- 'operation': {
- 'type': '0'
- 'data': {
- 'alias': 'Node1',
- 'client_ip': '127.0.0.1',
- 'client_port': 7588,
- 'node_ip': '127.0.0.1',
- 'node_port': 7587,
- 'blskey': '00000000000000000000000000000000',
- 'services': ['VALIDATOR']}
- } ,
- 'dest': '6HoV7DUEfNDiUP4ENnSC4yePja8w7JDQJ5uzVgyW4nL8'
+ "operation": {
+ "type": "200",
+ "id": "did:sov:GGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "@context": [
+ {
+ "@version": 1.1
+ },
+ "https://www.w3.org/ns/odrl.jsonld",
+ {
+ "ex": "https://example.org/examples#",
+ "schema": "http://schema.org/",
+ "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
+ }
+ ]
+ }",
+ "rsName":"SimpleContext",
+ "rsVersion":"1.0",
+ "rsType": "ctx"
},
-
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
- "type":0,
+ "type":"200",
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver":1,
- 'data': {
- 'alias': 'Node1',
- 'client_ip': '127.0.0.1',
- 'client_port': 7588,
- 'node_ip': '127.0.0.1',
- 'node_port': 7587,
- 'blskey': '00000000000000000000000000000000',
- 'services': ['VALIDATOR']}
- } ,
- 'dest': '6HoV7DUEfNDiUP4ENnSC4yePja8w7JDQJ5uzVgyW4nL8'
+ "id": "did:sov:GGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "@context": [
+ {
+ "@version": 1.1
+ },
+ "https://www.w3.org/ns/odrl.jsonld",
+ {
+ "ex": "https://example.org/examples#",
+ "schema": "http://schema.org/",
+ "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
+ }
+ ]
+ }",
+ "rsName":"SimpleContext",
+ "rsVersion":"1.0",
+ "rsType": "ctx"
},
"metadata": {
- "reqId":1514304094738044,
- "from":"21BPzYYrFzbuECcBV3M1FH",
+ "reqId":1514280215504647,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
"digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
"payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
},
@@ -1125,118 +1191,166 @@ There is no need to specify all other fields in `data`, and they will remain the
"txnMetadata": {
"txnTime":1513945121,
"seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
},
"reqSignature": {
"type": "ED25519",
"values": [{
- "from": "21BPzYYrFzbuECcBV3M1FH",
- "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}]
}
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
+
}
}
```
-### POOL_UPGRADE
-
-Command to upgrade the Pool (sent by Trustee). It upgrades the specified Nodes (either all nodes in the Pool, or some specific ones).
-
-- `name` (string):
+### RICH_SCHEMA
+Adds a Rich Schema object as part of Rich Schema feature.
- Human-readable name for the upgrade.
+It's not possible to update an existing Rich Schema.
+If the Rich Schema needs to be evolved, a new Rich Schema with a new id and name-version needs to be created.
-- `action` (enum: `start` or `cancel`):
- Starts or cancels the Upgrade.
-
-- `version` (string):
- The version of indy-node package we perform upgrade to.
- Must be greater than existing one (or equal if `reinstall` flag is True).
-
-- `schedule` (dict of node DIDs to timestamps):
+- `id` (string):
- Schedule of when to perform upgrade on each node. This is a map where Node DIDs are keys, and upgrade time is a value (see example below).
- If `force` flag is False, then it's required that time difference between each Upgrade must be not less than 5 minutes
- (to give each Node enough time and not make the whole Pool go down during Upgrade).
-
-- `sha256` (sha256 hash string):
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
- sha256 hash of the package
+ Rich Schema object as JSON serialized in canonical form.
+ This value must be a json-ld, rich schema object. json-ld supports many parameters that are optional for a rich schema txn.
+ However, the following parameters must be there:
-- `force` (boolean; optional):
-
- Whether we should apply transaction (schedule Upgrade) without waiting for consensus
- of this transaction.
- If false, then transaction is applied only after it's written to the ledger.
- Otherwise it's applied regardless of result of consensus, and there are no restrictions on the Upgrade `schedule` for each Node.
- So, we can Upgrade the whole Pool at the same time when it's set to True.
- False by default. Avoid setting to True without good reason.
+ - `@id`: The value of this property must be (or map to, via a context object) a URI.
+ - `@type`: The value of this property must be (or map to, via a context object) a URI.
+ - `@context`(optional): If present, the value of this property must be a context object or a URI which can be dereferenced to obtain a context object.
-- `reinstall` (boolean; optional):
-
- Whether it's allowed to re-install the same version.
- False by default.
-- `timeout` (integer; optional):
+- `rsType` (string):
- Limits upgrade time on each Node.
+ Rich Schema's type. Currently expected to be `sch`.
+
+- `rsName` (string):
-- `justification` (string; optional):
+ Rich Schema's name
+
+- `rsVersion` (string):
- Optional justification string for this particular Upgrade.
+ Rich Schema's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
*Request Example*:
```
{
- 'operation': {
- 'type': '109'
- 'name': `upgrade-13`,
- 'action': `start`,
- 'version': `1.3`,
- 'schedule': {"4yC546FFzorLPgTNTc6V43DnpFrR8uHvtunBxb2Suaa2":"2017-12-25T10:25:58.271857+00:00","AtDfpKFe1RPgcr5nnYBw1Wxkgyn8Zjyh5MzFoEUTeoV3":"2017-12-25T10:26:16.271857+00:00","DG5M4zFm33Shrhjj6JB7nmx9BoNJUq219UXDfvwBDPe2":"2017-12-25T10:26:25.271857+00:00","JpYerf4CssDrH76z7jyQPJLnZ1vwYgvKbvcp16AB5RQ":"2017-12-25T10:26:07.271857+00:00"},
- 'sha256': `db34a72a90d026dae49c3b3f0436c8d3963476c77468ad955845a1ccf7b03f55`,
- 'force': false,
- 'reinstall': false,
- 'timeout': 1
+ "operation": {
+ "type": "201",
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "@id": "test_unique_id",
+ "@context": "ctx:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "@type": "rdfs:Class",
+ "rdfs:comment": "ISO18013 International Driver License",
+ "rdfs:label": "Driver License",
+ "rdfs:subClassOf": {
+ "@id": "sch:Thing"
+ },
+ "driver": "Driver",
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "issuingAuthority": "Text",
+ "licenseNumber": "Text",
+ "categoriesOfVehicles": {
+ "vehicleType": "Text",
+ "vehicleType-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^(A|B|C|D|BE|CE|DE|AM|A1|A2|B1|C1|D1|C1E|D1E)$"
+ },
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "restrictions": "Text",
+ "restrictions-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^([A-Z]|[1-9])$"
+ }
+ },
+ "administrativeNumber": "Text"
+ }",
+ "rsName":"SimpleRichSchema",
+ "rsVersion":"1.0",
+ "rsType": "sch"
},
-
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
-
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver": 1,
"txn": {
- "type":109,
+ "type":"201",
"protocolVersion":2,
-
+ "ver":1,
+
"data": {
- "ver":1,
- "name":"upgrade-13",
- "action":"start",
- "version":"1.3",
- "schedule":{"4yC546FFzorLPgTNTc6V43DnpFrR8uHvtunBxb2Suaa2":"2017-12-25T10:25:58.271857+00:00","AtDfpKFe1RPgcr5nnYBw1Wxkgyn8Zjyh5MzFoEUTeoV3":"2017-12-25T10:26:16.271857+00:00","DG5M4zFm33Shrhjj6JB7nmx9BoNJUq219UXDfvwBDPe2":"2017-12-25T10:26:25.271857+00:00","JpYerf4CssDrH76z7jyQPJLnZ1vwYgvKbvcp16AB5RQ":"2017-12-25T10:26:07.271857+00:00"},
- "sha256":"db34a72a90d026dae49c3b3f0436c8d3963476c77468ad955845a1ccf7b03f55",
- "force":false,
- "reinstall":false,
- "timeout":1,
- "justification":null,
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "@id": "test_unique_id",
+ "@context": "ctx:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "@type": "rdfs:Class",
+ "rdfs:comment": "ISO18013 International Driver License",
+ "rdfs:label": "Driver License",
+ "rdfs:subClassOf": {
+ "@id": "sch:Thing"
+ },
+ "driver": "Driver",
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "issuingAuthority": "Text",
+ "licenseNumber": "Text",
+ "categoriesOfVehicles": {
+ "vehicleType": "Text",
+ "vehicleType-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^(A|B|C|D|BE|CE|DE|AM|A1|A2|B1|C1|D1|C1E|D1E)$"
+ },
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "restrictions": "Text",
+ "restrictions-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^([A-Z]|[1-9])$"
+ }
+ },
+ "administrativeNumber": "Text"
+ }",
+ "rsName":"SimpleRichSchema",
+ "rsVersion":"1.0",
+ "rsType": "sch"
+ },
+ "meta": {
+ "name":"recipeIngredient",
+ "version":"1.0",
+ "type": "sch",
+ "tag": "sometag"
+ },
},
"metadata": {
- "reqId":1514304094738044,
- "from":"21BPzYYrFzbuECcBV3M1FH",
+ "reqId":1514280215504647,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
"digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
"payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
},
@@ -1244,75 +1358,142 @@ Command to upgrade the Pool (sent by Trustee). It upgrades the specified Nodes (
"txnMetadata": {
"txnTime":1513945121,
"seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1:recipeIngredient:1.0",
},
"reqSignature": {
"type": "ED25519",
"values": [{
- "from": "21BPzYYrFzbuECcBV3M1FH",
- "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}]
- },
-
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
+ }
+
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
+
}
}
```
-### POOL_CONFIG
+### RICH_SCHEMA_ENCODING
+Adds an Encoding object as part of the Rich Schema feature.
+It's not possible to update an existing Encoding.
+If the Encoding needs to be evolved, a new Encoding with a new id and name-version needs to be created.
-Command to change Pool's configuration
-- `writes` (boolean):
- Whether any write requests can be processed by the pool (if false, then pool goes to read-only state).
- True by default.
+- `id` (string):
+  A unique ID (for example, a DID with an id-string being the base58 representation of the SHA2-256 hash of the `content` field); a derivation sketch follows this field list.
+
+- `content` (json-serialized string):
-- `force` (boolean; optional):
+ Encoding object as JSON serialized in canonical form.
+
+ - `input`: a description of the input value
+ - `output`: a description of the output value
+ - `algorithm`:
+ - `documentation`: a URL which references a specific github commit of
+ the documentation that fully describes the transformation algorithm.
+ - `implementation`: a URL that links to a reference implementation of the
+ transformation algorithm. It is not necessary to use the implementation
+ linked to here, as long as the implementation used implements the same
+ transformation algorithm.
+ - `description`: a brief description of the transformation algorithm.
+  - `testVectors`: a URL which references a specific github commit of a
+    selection of test vectors that may be used to provide assurance that a
+    transformation algorithm implementation is correct.
+
+- `rsType` (string):
+
+ Encoding's type. Currently expected to be `enc`.
+
+- `rsName` (string):
- Whether we should apply transaction (for example, move pool to read-only state) without waiting for consensus
- of this transaction.
- If false, then transaction is applied only after it's written to the ledger.
- Otherwise it's applied regardless of result of consensus.
- False by default. Avoid setting to True without good reason.
+ Encoding's name
+
+- `rsVersion` (string):
+
+ Encoding's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
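As an aside on the `id` field above: the id-string can be derived from the canonical serialization of `content`. Below is a minimal Python sketch of that derivation, assuming compact, key-sorted `json.dumps` as the canonical form and the third-party `base58` package; it is illustrative only and not the normative indy-node implementation.

```
import hashlib
import json

import base58  # assumption: the PyPI "base58" package


def rich_schema_id(content: dict, method: str = "sov") -> str:
    """Illustrative id for a rich schema object:
    a DID whose id-string is base58(SHA2-256(canonical content))."""
    # Canonical form assumed here: compact separators, sorted keys.
    canonical = json.dumps(content, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode("utf-8")).digest()
    return f"did:{method}:{base58.b58encode(digest).decode('ascii')}"


encoding_content = {
    "input": {"id": "DateRFC3339", "type": "string"},
    "output": {"id": "UnixTime", "type": "256-bit integer"},
}
print(rich_schema_id(encoding_content))
```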
*Request Example*:
```
{
- 'operation': {
- 'type': '111'
- 'writes':false,
- 'force':true
+ "operation": {
+ "type": "202",
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "input": {
+ "id": "DateRFC3339",
+ "type": "string"
+ },
+ "output": {
+ "id": "UnixTime",
+ "type": "256-bit integer"
+ },
+ "algorithm": {
+ "description": "This encoding transforms an
+ RFC3339-formatted datetime object into the number
+ of seconds since January 1, 1970 (the Unix epoch).",
+ "documentation": URL to specific github commit,
+ "implementation": URL to implementation
+ },
+ "test_vectors": URL to specific github commit
+ }",
+ "rsName":"SimpleEncoding",
+ "rsVersion":"1.0",
+ "rsType": "enc"
},
-
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}
```
-
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- "ver":1,
+ "op": "REPLY",
+ "result": {
+ "ver": 1,
"txn": {
- "type":111,
+ "type":"202",
"protocolVersion":2,
-
+ "ver":1,
+
"data": {
- "ver":1,
- "writes":false,
- "force":true,
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "input": {
+ "id": "DateRFC3339",
+ "type": "string"
+ },
+ "output": {
+ "id": "UnixTime",
+ "type": "256-bit integer"
+ },
+ "algorithm": {
+ "description": "This encoding transforms an
+ RFC3339-formatted datetime object into the number
+ of seconds since January 1, 1970 (the Unix epoch).",
+ "documentation": URL to specific github commit,
+ "implementation": URL to implementation
+ },
+ "test_vectors": URL to specific github commit
+ }",
+ "rsName":"SimpleEncoding",
+ "rsVersion":"1.0",
+ "rsType": "enc"
},
"metadata": {
- "reqId":1514304094738044,
- "from":"21BPzYYrFzbuECcBV3M1FH",
+ "reqId":1514280215504647,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
"digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
"payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
},
@@ -1320,43 +1501,785 @@ Command to change Pool's configuration
"txnMetadata": {
"txnTime":1513945121,
"seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1:recipeIngredient:1.0",
},
"reqSignature": {
"type": "ED25519",
"values": [{
- "from": "21BPzYYrFzbuECcBV3M1FH",
- "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
}]
- },
-
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
+ }
+
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
+
}
}
```
-### AUTH_RULE
+### RICH_SCHEMA_MAPPING
+Adds a Mapping object as part of the Rich Schema feature.
-A command to change authentication rules.
-Internally authentication rules are stored as a key-value dictionary: `{action} -> {auth_constraint}`.
+It's not possible to update an existing Mapping.
+If the Mapping needs to be evolved, a new Mapping with a new id and name-version needs to be created.
-The list of actions is static and can be found in [auth_rules.md](auth_rules.md).
-There is a default Auth Constraint for every action (defined in [auth_rules.md](auth_rules.md)).
-The `AUTH_RULE` command allows to change the Auth Constraint.
-So, it's not possible to register new actions by this command. But it's possible to override authentication constraints (values) for the given action.
-Please note, that list elements of `GET_AUTH_RULE` output can be used as an input (with a required changes) for `AUTH_RULE`.
+- `id` (string):
-If format of a transaction is incorrect, the client will receive NACK message for the request.
-A client will receive NACK for
-- a request with incorrect format;
-- a request with "ADD" action, but with "old_value";
-- a request with "EDIT" action without "old_value";
-- a request with a key that is not in the [auth_rule](auth_rule.md).
+  A unique ID (for example, a DID with an id-string being the base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
-The following input parameters must match an auth rule from the [auth_rules.md](auth_rules.md):
-- `auth_type` (string enum)
+ Mapping object as JSON serialized in canonical form.
+ This value must be a json-ld object. json-ld supports many parameters that are optional for a rich schema txn.
+  However, the following parameters must be present:
+
+ - `@id`: The value of this property must be (or map to, via a context object) a URI.
+ - `@type`: The value of this property must be (or map to, via a context object) a URI.
+ - `@context`(optional): If present, the value of this property must be a context object or a URI which can be dereferenced to obtain a context object.
+ - `schema`: An `id` of the corresponding Rich Schema
+ - `attributes` (dict): A dict of all the schema attributes the Mapping object is going to map to encodings and use in credentials.
+ An attribute may have nested attributes matching the schema structure.
+ It must also contain the following default attributes required by any W3C compatible
+ verifiable credential (plus any additional attributes that may have been included from the
+ W3C verifiable credentials data model):
+
+ - `issuer`
+ - `issuanceDate`
+ - any additional attributes
+
+ Every leaf attribute's value is an array of the following pairs:
+
+ - `enc` (string): Encoding object (referenced by its `id`) to be used for representation of the attribute as an integer.
+    - `rank` (int): Rank of the attribute, defining the order in which the attribute is signed by the Issuer. No two `rank` values may be identical (a validation sketch follows this field list).
+
+
+- `rsType` (string):
+
+ Mapping's type. Currently expected to be `map`.
+
+- `rsName` (string):
+
+ Mapping's name
+
+- `rsVersion` (string):
+
+ Mapping's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
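Since no two `rank` values in a Mapping may be identical, a client can sanity-check the `attributes` dict before submitting the request. A minimal Python sketch, assuming the nested dict/list structure shown in the example below; the function names are illustrative, not part of indy-node.

```
def collect_ranks(attributes):
    """Walk the (possibly nested) Mapping attributes and collect all rank values."""
    ranks = []
    for value in attributes.values():
        if isinstance(value, dict):           # nested attribute group
            ranks.extend(collect_ranks(value))
        else:                                 # leaf: list of {"enc": ..., "rank": ...} pairs
            ranks.extend(pair["rank"] for pair in value)
    return ranks


def check_ranks(attributes):
    ranks = collect_ranks(attributes)
    if len(ranks) != len(set(ranks)):
        raise ValueError("duplicate rank values in Mapping attributes")


check_ranks({
    "issuer": [{"enc": "did:sov:enc1", "rank": 1}],
    "issuanceDate": [{"enc": "did:sov:enc2", "rank": 2}],
    "categoriesOfVehicles": {
        "vehicleType": [{"enc": "did:sov:enc3", "rank": 3}],
    },
})
print("ranks are unique")
```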
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "203",
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ '@id': "did:sov:5e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "schema": "did:sov:4e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+            "attributes" : {
+ "issuer": [{
+ "enc": "did:sov:9x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 1
+ }],
+ "issuanceDate": [{
+ "enc": "did:sov:119F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 2
+ }],
+ "expirationDate": [{
+ "enc": "did:sov:119F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 11
+ }],
+ "driver": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 5
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:2x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 4
+ }],
+ "issuingAuthority": [{
+ "enc": "did:sov:3x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 3
+ }],
+ "licenseNumber": [
+ {
+ "enc": "did:sov:4x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 9
+ },
+ {
+ "enc": "did:sov:5x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 10
+ },
+ ],
+ "categoriesOfVehicles": {
+ "vehicleType": [{
+ "enc": "did:sov:6x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 6
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:7x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 7
+ }],
+ },
+ "administrativeNumber": [{
+ "enc": "did:sov:8x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 8
+ }]
+ }
+ }",
+ "rsName":"SimpleMapping",
+ "rsVersion":"1.0",
+ "rsType": "map"
+ },
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+}
+```
+*Reply Example*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "ver": 1,
+ "txn": {
+ "type":"203",
+ "protocolVersion":2,
+ "ver":1,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ '@id': "did:sov:5e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "schema": "did:sov:4e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+                "attributes" : {
+ "issuer": [{
+ "enc": "did:sov:9x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 1
+ }],
+ "issuanceDate": [{
+ "enc": "did:sov:119F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 2
+ }],
+ "expirationDate": [{
+ "enc": "did:sov:119F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 11
+ }],
+ "driver": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 5
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:2x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 4
+ }],
+ "issuingAuthority": [{
+ "enc": "did:sov:3x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 3
+ }],
+ "licenseNumber": [
+ {
+ "enc": "did:sov:4x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 9
+ },
+ {
+ "enc": "did:sov:5x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 10
+ },
+ ],
+ "categoriesOfVehicles": {
+ "vehicleType": [{
+ "enc": "did:sov:6x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 6
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:7x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 7
+ }],
+ },
+ "administrativeNumber": [{
+ "enc": "did:sov:8x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 8
+ }]
+ }
+ }",
+ "rsName":"SimpleMapping",
+ "rsVersion":"1.0",
+ "rsType": "map"
+ },
+
+ "metadata": {
+ "reqId":1514280215504647,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1:recipeIngredient:1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+ }]
+ }
+
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
+
+ }
+}
+```
+
+### RICH_SCHEMA_CRED_DEF
+Adds a Credential Definition object as part of the Rich Schema feature.
+
+Credential Definition is considered a mutable object, as the Issuer may rotate the keys present there.
+However, rotation of the Issuer's keys should be done carefully, as it will invalidate all
+credentials issued for the previous key.
+
+
+- `id` (string):
+
+  A unique ID (for example, a DID with an id-string being the base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Credential Definition object as JSON serialized in canonical form.
+
+ - `signatureType` (string): Type of the ZKP signature. `CL` (Camenisch-Lysyanskaya) is the only supported type now.
+ - `mapping` (string): An `id` of the corresponding Mapping
+ - `schema` (string): An `id` of the corresponding Rich Schema. The `mapping` must reference the same Schema.
+  - `publicKey` (dict): Issuer's public keys. Consists of primary and revocation keys.
+ - `primary` (dict): primary key
+ - `revocation` (dict, optional): revocation key
+
+- `rsType` (string):
+
+ Credential Definition's type. Currently expected to be `cdf`.
+
+- `rsName` (string):
+
+ Credential Definition's name
+
+- `rsVersion` (string):
+
+ Credential Definition's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
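Note that `content` travels as a JSON-serialized string rather than a nested object. A minimal Python sketch of assembling the operation, assuming compact, key-sorted `json.dumps` as the canonical form; the DIDs are taken from the request example below, and the `id` would normally be derived from `content`.

```
import json

cred_def_content = {
    "signatureType": "CL",
    "mapping": "did:sov:UVj5w8DRzcmPVDpUMr4AZhJ",
    "schema": "did:sov:U5x5w8DRzcmPVDpUMr4AZhJ",
    "publicKey": {"primary": "...", "revocation": "..."},
}

operation = {
    "type": "204",
    "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",   # in practice derived from the content
    # content is carried as a canonically serialized string, not as a nested dict
    "content": json.dumps(cred_def_content, sort_keys=True, separators=(",", ":")),
    "rsType": "cdf",
    "rsName": "SimpleCredDef",
    "rsVersion": "1.0",
}
print(operation["content"])
```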
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "204",
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "signatureType": "CL",
+ "mapping": "did:sov:UVj5w8DRzcmPVDpUMr4AZhJ",
+ "schema": "did:sov:U5x5w8DRzcmPVDpUMr4AZhJ",
+ "publicKey": {
+ "primary": "...",
+ "revocation": "..."
+ }
+ }",
+ "rsName":"SimpleCredDef",
+ "rsVersion":"1.0",
+ "rsType": "cdf"
+ },
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+}
+```
+*Reply Example*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "ver": 1,
+ "txn": {
+ "type":"204",
+ "protocolVersion":2,
+ "ver":1,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "signatureType": "CL",
+ "mapping": "did:sov:UVj5w8DRzcmPVDpUMr4AZhJ",
+ "schema": "did:sov:U5x5w8DRzcmPVDpUMr4AZhJ",
+ "publicKey": {
+ "primary": "...",
+ "revocation": "..."
+ }
+ }",
+ "rsName":"SimpleCredDef",
+ "rsVersion":"1.0",
+ "rsType": "cdf"
+ },
+
+ "metadata": {
+ "reqId":1514280215504647,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1:recipeIngredient:1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+ }]
+ }
+
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
+
+ }
+}
+```
+
+### RICH_SCHEMA_PRES_DEF
+Adds a Presentation Definition object as part of the Rich Schema feature.
+
+Presentation Definition is considered a mutable object, since the restrictions on Issuers, Schemas and Credential Definitions
+to be used in a proof may evolve.
+ For example, an Issuer's key for a given Credential Definition may be compromised,
+ so the Presentation Definition can be updated to exclude this Credential Definition from the list of recommended ones.
+
+- `id` (string):
+
+  A unique ID (for example, a DID with an id-string being the base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Presentation Definition object as JSON serialized in canonical form.
+
+- `rsType` (string):
+
+ Presentation Definition's type. Currently expected to be `pdf`.
+
+- `rsName` (string):
+
+ Presentation Definition's name
+
+- `rsVersion` (string):
+
+ Presentation Definition's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "205",
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ TBD
+ }",
+ "rsName":"SimplePresDef",
+ "rsVersion":"1.0",
+ "rsType": "pdf"
+ },
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId": 1514280215504647,
+ "protocolVersion": 2,
+ "signature": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+}
+```
+*Reply Example*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "ver": 1,
+ "txn": {
+ "type":"205",
+ "protocolVersion":2,
+ "ver":1,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ TBD
+ }",
+ "rsName":"SimplePresDef",
+ "rsVersion":"1.0",
+ "rsType": "pdf"
+ },
+
+ "metadata": {
+ "reqId":1514280215504647,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1:recipeIngredient:1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+ }]
+ }
+
+ "rootHash": "5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm",
+ "auditPath": ["Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA", "66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b"],
+
+ }
+}
+```
+### NODE
+Adds a new node to the pool, or updates an existing node in the pool.
+
+- `data` (dict):
+
+ Data associated with the Node:
+
+ - `alias` (string): Node's alias
+ - `blskey` (base58-encoded string; optional): BLS multi-signature key as base58-encoded string (it's needed for BLS signatures and state proofs support)
+ - `client_ip` (string; optional): Node's client listener IP address, that is the IP clients use to connect to the node when sending read and write requests (ZMQ with TCP)
+ - `client_port` (string; optional): Node's client listener port, that is the port clients use to connect to the node when sending read and write requests (ZMQ with TCP)
+ - `node_ip` (string; optional): The IP address other Nodes use to communicate with this Node; no clients are allowed here (ZMQ with TCP)
+ - `node_port` (string; optional): The port other Nodes use to communicate with this Node; no clients are allowed here (ZMQ with TCP)
+ - `services` (array of strings; optional): the service of the Node. `VALIDATOR` is the only supported one now.
+
+- `dest` (base58-encoded string):
+
+ Target Node's verkey as base58-encoded string for 16 or 32 byte DID value.
+ It differs from `identifier` metadata field, where `identifier` is the DID of the transaction submitter (Steward's DID).
+
+ *Example*: `identifier` is a DID of a Steward creating a new Node, and `dest` is the verkey of this Node.
+
+If there is no NODE transaction with the specified Node ID (`dest`), then it is considered the creation of a new NODE.
+
+If there is a NODE transaction with the specified Node ID (`dest`), then this is an update of the existing NODE.
+In this case we can specify only the values we would like to override. All unspecified values remain the same.
+So, if a Steward wants to rotate the BLS key, it's sufficient to send a NODE transaction with `dest` and a new `blskey` in `data` (see the sketch below).
+There is no need to specify all other fields in `data`, and they will remain the same.
+
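Because only the overridden fields need to appear in `data`, a BLS key rotation can be expressed with a very small operation. A minimal Python sketch of such a payload; the verkey and key value are placeholders reused from the examples in this document.

```
# Minimal NODE operation that rotates only the BLS key of an existing node.
# All other node attributes keep their current values on the ledger.
rotate_blskey_op = {
    "type": "0",
    "dest": "6HoV7DUEfNDiUP4ENnSC4yePja8w7JDQJ5uzVgyW4nL8",  # target node's verkey
    "data": {
        "alias": "Node1",                                   # identifies the node
        "blskey": "00000000000000000000000000000000",       # the new BLS key
    },
}
print(rotate_blskey_op)
```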
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "0"
+ "data": {
+ "alias": "Node1",
+ "client_ip": "127.0.0.1",
+ "client_port": 7588,
+ "node_ip": "127.0.0.1",
+ "node_port": 7587,
+ "blskey": "00000000000000000000000000000000",
+ "services": ["VALIDATOR"]}
+ } ,
+ "dest": "6HoV7DUEfNDiUP4ENnSC4yePja8w7JDQJ5uzVgyW4nL8"
+ },
+
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
+}
+```
+
+*Reply Example*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "ver": 1,
+ "txn": {
+ "type":0,
+ "protocolVersion":2,
+ "ver": 1,
+
+ "data": {
+ "data": {
+ "alias": "Node1",
+ "client_ip": "127.0.0.1",
+ "client_port": 7588,
+ "node_ip": "127.0.0.1",
+ "node_port": 7587,
+ "blskey": "00000000000000000000000000000000",
+ "services": ["VALIDATOR"]}
+ } ,
+ "dest": "6HoV7DUEfNDiUP4ENnSC4yePja8w7JDQJ5uzVgyW4nL8"
+ },
+
+ "metadata": {
+ "reqId":1514304094738044,
+ "from":"21BPzYYrFzbuECcBV3M1FH",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "21BPzYYrFzbuECcBV3M1FH",
+ "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
+ }]
+ }
+
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
+ }
+}
+```
+
+### POOL_UPGRADE
+
+Command to upgrade the Pool (sent by Trustee). It upgrades the specified Nodes (either all nodes in the Pool, or some specific ones).
+
+- `name` (string):
+
+ Human-readable name for the upgrade.
+
+- `action` (enum: `start` or `cancel`):
+
+ Starts or cancels the Upgrade.
+
+- `version` (string):
+
+ The version of indy-node package we perform upgrade to.
+  The version of the indy-node package to upgrade to.
+  Must be greater than the existing one (or equal if the `reinstall` flag is True).
+- `schedule` (dict of node DIDs to timestamps):
+
+  Schedule of when to perform the upgrade on each node. This is a map where Node DIDs are keys and upgrade times are values (see the example below).
+  If the `force` flag is False, the time difference between Upgrades on consecutive nodes must be at least 5 minutes
+  (to give each Node enough time and not take the whole Pool down during the Upgrade); a schedule-building sketch follows this field list.
+
+- `sha256` (sha256 hash string):
+
+ sha256 hash of the package
+
+- `force` (boolean; optional):
+
+  Whether we should apply the transaction (schedule the Upgrade) without waiting for consensus
+  on this transaction.
+  If False, the transaction is applied only after it's written to the ledger.
+  Otherwise it's applied regardless of the result of consensus, and there are no restrictions on the Upgrade `schedule` for each Node,
+  so the whole Pool can be Upgraded at the same time when this is set to True.
+  False by default. Avoid setting to True without good reason.
+
+- `reinstall` (boolean; optional):
+
+ Whether it's allowed to re-install the same version.
+ False by default.
+
+- `timeout` (integer; optional):
+
+ Limits upgrade time on each Node.
+
+- `justification` (string; optional):
+
+ Optional justification string for this particular Upgrade.
+
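Since per-node upgrade times must be at least 5 minutes apart when `force` is False, the `schedule` dict can be generated from a start time. A minimal Python sketch, reusing the node DIDs from the request example below; timestamps are ISO 8601 with a timezone, as in the example.

```
from datetime import datetime, timedelta, timezone

node_dids = [
    "4yC546FFzorLPgTNTc6V43DnpFrR8uHvtunBxb2Suaa2",
    "AtDfpKFe1RPgcr5nnYBw1Wxkgyn8Zjyh5MzFoEUTeoV3",
    "DG5M4zFm33Shrhjj6JB7nmx9BoNJUq219UXDfvwBDPe2",
    "JpYerf4CssDrH76z7jyQPJLnZ1vwYgvKbvcp16AB5RQ",
]

start = datetime.now(timezone.utc) + timedelta(minutes=10)
# Space the per-node upgrade times 5 minutes apart (required when force=False).
schedule = {
    did: (start + timedelta(minutes=5 * i)).isoformat()
    for i, did in enumerate(node_dids)
}
print(schedule)
```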
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "109"
+ "name": `upgrade-13`,
+ "action": `start`,
+ "version": `1.3`,
+ "schedule": {"4yC546FFzorLPgTNTc6V43DnpFrR8uHvtunBxb2Suaa2":"2017-12-25T10:25:58.271857+00:00","AtDfpKFe1RPgcr5nnYBw1Wxkgyn8Zjyh5MzFoEUTeoV3":"2017-12-25T10:26:16.271857+00:00","DG5M4zFm33Shrhjj6JB7nmx9BoNJUq219UXDfvwBDPe2":"2017-12-25T10:26:25.271857+00:00","JpYerf4CssDrH76z7jyQPJLnZ1vwYgvKbvcp16AB5RQ":"2017-12-25T10:26:07.271857+00:00"},
+ "sha256": `db34a72a90d026dae49c3b3f0436c8d3963476c77468ad955845a1ccf7b03f55`,
+ "force": false,
+ "reinstall": false,
+ "timeout": 1
+ },
+
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
+}
+```
+
+*Reply Example*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "ver": 1,
+ "txn": {
+ "type":109,
+ "protocolVersion":2,
+ "ver": 1,
+
+ "data": {
+ "name":"upgrade-13",
+ "action":"start",
+ "version":"1.3",
+ "schedule":{"4yC546FFzorLPgTNTc6V43DnpFrR8uHvtunBxb2Suaa2":"2017-12-25T10:25:58.271857+00:00","AtDfpKFe1RPgcr5nnYBw1Wxkgyn8Zjyh5MzFoEUTeoV3":"2017-12-25T10:26:16.271857+00:00","DG5M4zFm33Shrhjj6JB7nmx9BoNJUq219UXDfvwBDPe2":"2017-12-25T10:26:25.271857+00:00","JpYerf4CssDrH76z7jyQPJLnZ1vwYgvKbvcp16AB5RQ":"2017-12-25T10:26:07.271857+00:00"},
+ "sha256":"db34a72a90d026dae49c3b3f0436c8d3963476c77468ad955845a1ccf7b03f55",
+ "force":false,
+ "reinstall":false,
+ "timeout":1,
+ "justification":null,
+ },
+
+ "metadata": {
+ "reqId":1514304094738044,
+ "from":"21BPzYYrFzbuECcBV3M1FH",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "21BPzYYrFzbuECcBV3M1FH",
+ "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
+ }]
+ },
+
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
+ }
+}
+```
+
+### POOL_CONFIG
+
+
+Command to change Pool's configuration
+
+- `writes` (boolean):
+
+ Whether any write requests can be processed by the pool (if false, then pool goes to read-only state).
+ True by default.
+
+
+- `force` (boolean; optional):
+
+  Whether we should apply the transaction (for example, move the pool to a read-only state) without waiting for consensus
+  on this transaction.
+  If False, the transaction is applied only after it's written to the ledger.
+  Otherwise it's applied regardless of the result of consensus.
+  False by default. Avoid setting to True without good reason.
+
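A minimal Python sketch of the two typical payloads: switching the pool to read-only mode for maintenance and re-enabling writes afterwards. Only the operation part is shown; signing and submission work as for any other write request.

```
# Put the pool into read-only mode; force=True applies it without waiting for consensus.
make_read_only = {"type": "111", "writes": False, "force": True}

# Re-enable writes once maintenance is done; this one goes through normal consensus.
enable_writes = {"type": "111", "writes": True, "force": False}
print(make_read_only, enable_writes)
```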
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "111"
+ "writes":false,
+ "force":true
+ },
+
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
+}
+```
+
+*Reply Example*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "ver":1,
+ "txn": {
+ "type":111,
+ "protocolVersion":2,
+ "ver": 1,
+
+ "data": {
+ "writes":false,
+ "force":true,
+ },
+
+ "metadata": {
+ "reqId":1514304094738044,
+ "from":"21BPzYYrFzbuECcBV3M1FH",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "21BPzYYrFzbuECcBV3M1FH",
+ "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
+ }]
+ },
+
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
+ }
+}
+```
+
+### AUTH_RULE
+
+A command to change authentication rules.
+Internally authentication rules are stored as a key-value dictionary: `{action} -> {auth_constraint}`.
+
+The list of actions is static and can be found in [auth_rules.md](auth_rules.md).
+There is a default Auth Constraint for every action (defined in [auth_rules.md](auth_rules.md)).
+
+The `AUTH_RULE` command allows changing the Auth Constraint.
+So, it's not possible to register new actions with this command, but it is possible to override the authentication constraints (values) for a given action.
+
+Please note that list elements of the `GET_AUTH_RULE` output can be used as input (with the required changes) for `AUTH_RULE`.
+
+If the format of a transaction is incorrect, the client will receive a NACK message for the request.
+A client will receive a NACK for:
+- a request with incorrect format;
+- a request with "ADD" action, but with "old_value";
+- a request with "EDIT" action without "old_value";
+- a request with a key that is not in the [auth_rule](auth_rule.md).
+
+The following input parameters must match an auth rule from the [auth_rules.md](auth_rules.md):
+- `auth_type` (string enum)
The type of transaction to change the auth constraints to. (Example: "0", "1", ...). See transactions description to find the txn type enum value.
@@ -1386,18 +2309,18 @@ The `constraint_id` fields is where one can define the desired auth constraint f
Constraint Type. As of now, the following constraint types are supported:
- - 'ROLE': a constraint defining how many signatures of a given role are required
- - 'OR': logical disjunction for all constraints from `auth_constraints`
- - 'AND': logical conjunction for all constraints from `auth_constraints`
- - 'FORBIDDEN': a constraint for not allowed actions
+ - "ROLE": a constraint defining how many signatures of a given role are required
+ - "OR": logical disjunction for all constraints from `auth_constraints`
+ - "AND": logical conjunction for all constraints from `auth_constraints`
+ - "FORBIDDEN": a constraint for not allowed actions
- - fields if `'constraint_id': 'OR'` or `'constraint_id': 'AND'`
+ - fields if `"constraint_id": "OR"` or `"constraint_id": "AND"`
- `auth_constraints` (list)
A list of constraints. Any number of nested constraints is supported recursively
- - fields if `'constraint_id': 'ROLE'`:
+ - fields if `"constraint_id": "ROLE"`:
- `role` (string enum)
@@ -1424,7 +2347,7 @@ The `constraint_id` fields is where one can define the desired auth constraint f
Dictionary for additional parameters of the constraint. Can be used by plugins to add additional restrictions.
- - fields if `'constraint_id': 'FORBIDDEN'`:
+ - fields if `"constraint_id": "FORBIDDEN"`:
no fields
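The ROLE/OR/AND/FORBIDDEN semantics described above can be illustrated with a small recursive checker. A Python sketch only, assuming signers are given as (role, is_owner) pairs; it ignores the `metadata` extension point and is not the indy-node implementation.

```
def satisfies(constraint, signers):
    """signers: list of (role, is_owner) tuples describing who signed the request."""
    cid = constraint["constraint_id"]
    if cid == "FORBIDDEN":
        return False                      # the action is never allowed
    if cid == "OR":
        return any(satisfies(c, signers) for c in constraint["auth_constraints"])
    if cid == "AND":
        return all(satisfies(c, signers) for c in constraint["auth_constraints"])
    if cid == "ROLE":
        matching = [s for s in signers
                    if s[0] == constraint["role"]
                    and (s[1] or not constraint["need_to_be_owner"])]
        return len(matching) >= constraint["sig_count"]
    raise ValueError("unknown constraint_id: {}".format(cid))


# "2 signatures of role 0, OR 1 owner signature of role 2", as in the example request below.
constraint = {
    "constraint_id": "OR",
    "auth_constraints": [
        {"constraint_id": "ROLE", "role": "0", "sig_count": 2,
         "need_to_be_owner": False, "metadata": {}},
        {"constraint_id": "ROLE", "role": "2", "sig_count": 1,
         "need_to_be_owner": True, "metadata": {}},
    ],
}
print(satisfies(constraint, [("0", False), ("0", False)]))  # True: two role-0 signatures
```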
@@ -1436,89 +2359,89 @@ Let's consider an example of changing a value of a NODE transaction's `service`
```
{
- 'operation': {
- 'type':'120',
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
- 'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 2,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "operation": {
+ "type":"120",
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
+ "constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 2,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
},
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj'
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
}
```
*Reply Example*:
```
-{ 'op':'REPLY',
- 'result':{
- 'txnMetadata':{
- 'seqNo':1,
- 'txnTime':1551776783
+{ "op":"REPLY",
+ "result":{
+ "txnMetadata":{
+ "seqNo":1,
+ "txnTime":1551776783
},
- 'reqSignature':{
- 'values':[
+ "reqSignature":{
+ "values":[
{
- 'value':'4j99V2BNRX1dn2QhnR8L9C3W9XQt1W3ScD1pyYaqD1NUnDVhbFGS3cw8dHRe5uVk8W7DoFtHb81ekMs9t9e76Fg',
- 'from':'M9BJDuS24bqbJNvBRsoGg3'
+ "value":"4j99V2BNRX1dn2QhnR8L9C3W9XQt1W3ScD1pyYaqD1NUnDVhbFGS3cw8dHRe5uVk8W7DoFtHb81ekMs9t9e76Fg",
+ "from":"M9BJDuS24bqbJNvBRsoGg3"
}
],
- 'type':'ED25519'
+ "type":"ED25519"
},
- 'txn':{
- 'data':{
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
- 'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 2,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "txn":{
+ "data":{
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
+ "constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 2,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
},
- 'protocolVersion':2,
- 'metadata':{
- 'from':'M9BJDuS24bqbJNvBRsoGg3',
- 'digest':'ea13f0a310c7f4494d2828bccbc8ff0bd8b77d0c0bfb1ed9a84104bf55ad0436',
- 'payloadDigest': '21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685',
- 'reqId':711182024
+ "protocolVersion":2,
+ "metadata":{
+ "from":"M9BJDuS24bqbJNvBRsoGg3",
+ "digest":"ea13f0a310c7f4494d2828bccbc8ff0bd8b77d0c0bfb1ed9a84104bf55ad0436",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ "reqId":711182024
},
- 'type':'120'
+ "type":"120"
},
- 'ver':'1',
- 'rootHash':'GJNfknLWDAb8R93cgAX3Bw6CYDo23HBhiwZnzb4fHtyi',
- 'auditPath':['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6']
+ "ver":"1",
+ "rootHash":"GJNfknLWDAb8R93cgAX3Bw6CYDo23HBhiwZnzb4fHtyi",
+ "auditPath":["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"]
}
}
```
@@ -1577,18 +2500,18 @@ A client will receive NACK for
Constraint Type. As of now, the following constraint types are supported:
- - 'ROLE': a constraint defining how many signatures of a given role are required
- - 'OR': logical disjunction for all constraints from `auth_constraints`
- - 'AND': logical conjunction for all constraints from `auth_constraints`
- - 'FORBIDDEN': a constraint for not allowed actions
+ - "ROLE": a constraint defining how many signatures of a given role are required
+ - "OR": logical disjunction for all constraints from `auth_constraints`
+ - "AND": logical conjunction for all constraints from `auth_constraints`
+ - "FORBIDDEN": a constraint for not allowed actions
- - fields if `'constraint_id': 'OR'` or `'constraint_id': 'AND'`
+ - fields if `"constraint_id": "OR"` or `"constraint_id": "AND"`
- `auth_constraints` (list)
A list of constraints. Any number of nested constraints is supported recursively
- - fields if `'constraint_id': 'ROLE'`:
+ - fields if `"constraint_id": "ROLE"`:
- `role` (string enum)
@@ -1615,103 +2538,103 @@ A client will receive NACK for
Dictionary for additional parameters of the constraint. Can be used by plugins to add additional restrictions.
- - fields if `'constraint_id': 'FORBIDDEN'`:
+ - fields if `"constraint_id": "FORBIDDEN"`:
no fields
*Request Example*:
```
{
- 'operation': {
- 'type':'122',
- 'rules': [
- {'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 1,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "operation": {
+ "type":"122",
+ "rules": [
+ {"constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 1,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
- 'field' :'services',
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'old_value': [VALIDATOR],
- 'new_value': []
+ "field" :"services",
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "old_value": [VALIDATOR],
+ "new_value": []
},
...
]
},
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 1,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj'
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 1,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
}
```
*Reply Example*:
```
-{ 'op':'REPLY',
- 'result':{
- 'txnMetadata':{
- 'seqNo':1,
- 'txnTime':1551776783
+{ "op":"REPLY",
+ "result":{
+ "txnMetadata":{
+ "seqNo":1,
+ "txnTime":1551776783
},
- 'reqSignature':{
- 'values':[
+ "reqSignature":{
+ "values":[
{
- 'value':'4j99V2BNRX1dn2QhnR8L9C3W9XQt1W3ScD1pyYaqD1NUnDVhbFGS3cw8dHRe5uVk8W7DoFtHb81ekMs9t9e76Fg',
- 'from':'M9BJDuS24bqbJNvBRsoGg3'
+ "value":"4j99V2BNRX1dn2QhnR8L9C3W9XQt1W3ScD1pyYaqD1NUnDVhbFGS3cw8dHRe5uVk8W7DoFtHb81ekMs9t9e76Fg",
+ "from":"M9BJDuS24bqbJNvBRsoGg3"
}
],
- 'type':'ED25519'
+ "type":"ED25519"
},
- 'txn':{
- 'type':'122',
- 'data':{
- 'rules': [
- {'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 1,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "txn":{
+ "type":"122",
+ "data":{
+ "rules": [
+ {"constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 1,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
- 'field' :'services',
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'old_value': [VALIDATOR],
- 'new_value': []
+ "field" :"services",
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "old_value": [VALIDATOR],
+ "new_value": []
},
...
]
}
- 'protocolVersion':2,
- 'metadata':{
- 'from':'M9BJDuS24bqbJNvBRsoGg3',
- 'digest':'ea13f0a310c7f4494d2828bccbc8ff0bd8b77d0c0bfb1ed9a84104bf55ad0436',
- 'reqId':711182024
+ "protocolVersion":2,
+ "metadata":{
+ "from":"M9BJDuS24bqbJNvBRsoGg3",
+ "digest":"ea13f0a310c7f4494d2828bccbc8ff0bd8b77d0c0bfb1ed9a84104bf55ad0436",
+ "reqId":711182024
}
},
- 'ver':'1',
- 'rootHash':'GJNfknLWDAb8R93cgAX3Bw6CYDo23HBhiwZnzb4fHtyi',
- 'auditPath':[
+ "ver":"1",
+ "rootHash":"GJNfknLWDAb8R93cgAX3Bw6CYDo23HBhiwZnzb4fHtyi",
+ "auditPath":[
]
}
@@ -1780,35 +2703,35 @@ At least one [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_am
*New Agreement Request Example*:
```
{
- 'operation': {
- 'type': '4'
- 'version': '1.0',
- 'text': 'Please read carefully before writing anything to the ledger',
- 'ratification_ts': 1514304094738044
+ "operation": {
+ "type": "4"
+ "version": "1.0",
+ "text": "Please read carefully before writing anything to the ledger",
+ "ratification_ts": 1514304094738044
},
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
}
```
*New Agreement Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver":1,
"txn": {
"type":4,
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
- 'version': '1.0',
- 'text': 'Please read carefully before writing anything to the ledger',
- 'ratification_ts': 1514304094738044
+ "version": "1.0",
+ "text": "Please read carefully before writing anything to the ledger",
+ "ratification_ts": 1514304094738044
},
"metadata": {
@@ -1830,8 +2753,8 @@ At least one [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_am
}]
},
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
}
}
```
@@ -1840,33 +2763,33 @@ At least one [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_am
*Retire Agreement Request Example*:
```
{
- 'operation': {
- 'type': '4'
- 'version': '1.0',
- 'retirement_ts': 1515415195838044
+ "operation": {
+ "type": "4"
+ "version": "1.0",
+ "retirement_ts": 1515415195838044
},
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738066,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738066,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
}
```
*Retire Agreement Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver":1,
"txn": {
"type":4,
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
- 'version': '1.0',
- 'retirement_ts': 1515415195838044
+ "version": "1.0",
+ "retirement_ts": 1515415195838044
},
"metadata": {
@@ -1888,8 +2811,8 @@ At least one [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_am
}]
},
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
}
}
```
@@ -1919,8 +2842,8 @@ Each acceptance mechanisms list has a unique version.
*Request Example*:
```
{
- 'operation': {
- 'type': '5'
+ "operation": {
+ "type": "5"
"version": "1.0",
"aml": {
"EULA": "Included in the EULA for the product being used",
@@ -1931,25 +2854,25 @@ Each acceptance mechanisms list has a unique version.
"amlContext": "http://aml-context-descr"
},
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver":1,
"txn": {
"type":5,
"protocolVersion":2,
+ "ver": 1,
"data": {
- "ver":1,
"version": "1.0",
"aml": {
"EULA": "Included in the EULA for the product being used",
@@ -1979,8 +2902,8 @@ Each acceptance mechanisms list has a unique version.
}]
},
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
}
}
```
@@ -1994,168 +2917,112 @@ A new Agreement needs to be sent instead.
*Request Example*:
```
{
- 'operation': {
- 'type': '8'
+ "operation": {
+ "type": "8"
},
- 'identifier': '21BPzYYrFzbuECcBV3M1FH',
- 'reqId': 1514304094738044,
- 'protocolVersion': 2,
- 'signature': '3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj',
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 1514304094738044,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
+ "op": "REPLY",
+ "result": {
"ver":1,
"txn": {
"type":8,
"protocolVersion":2,
+ "ver": 1,
- "data": {},
-
- "metadata": {
- "reqId":1514304094738044,
- "from":"21BPzYYrFzbuECcBV3M1FH",
- "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
- "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
- },
- },
- "txnMetadata": {
- "txnTime":1513945121,
- "seqNo": 10,
- },
- "reqSignature": {
- "type": "ED25519",
- "values": [{
- "from": "21BPzYYrFzbuECcBV3M1FH",
- "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
- }]
- },
-
- 'rootHash': 'DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV',
- 'auditPath': ['6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6'],
- }
-}
-```
-
-### SET_CONTEXT
-Adds Context.
-
-It's not possible to update existing Context.
-So, if the Context needs to be evolved, a new Context with a new version or name needs to be created.
-
-- `data` (dict):
-
- Dictionary with Context's data:
-
- - `@context`: This value must be either:
- 1) a URI (it should dereference to a Context object)
- 2) a Context object (a dict)
- 3) an array of Context objects and/or Context URIs
-
-- `meta` (dict)
-
- Dictionary with Context's metadata
-
- - `name`: Context's name string
- - `version`: Context's version string
- - `type`: 'ctx'
-
-*Request Example*:
-```
-{
- 'operation': {
- 'type': '200',
- "data":{
- "@context": [
- {
- "@version": 1.1
- },
- "https://www.w3.org/ns/odrl.jsonld",
- {
- "ex": "https://example.org/examples#",
- "schema": "http://schema.org/",
- "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
- }
- ]
- },
- "meta": {
- "name":"SimpleContext",
- "version":"1.0",
- "type": "ctx"
- },
- },
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'endorser': 'D6HG5g65TDQr1PPHHRoiGf',
- 'reqId': 1514280215504647,
- 'protocolVersion': 2,
- 'signature': '5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS'
-}
-```
-
-*Reply Example*:
-```
-{
- 'op': 'REPLY',
- 'result': {
- "ver": 1,
- "txn": {
- "type":"200",
- "protocolVersion":2,
-
- "data": {
- "ver":1,
- "data":{
- "@context": [
- {
- "@version": 1.1
- },
- "https://www.w3.org/ns/odrl.jsonld",
- {
- "ex": "https://example.org/examples#",
- "schema": "http://schema.org/",
- "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
- }
- ]
- },
- "meta": {
- "name":"SimpleContext",
- "version":"1.0",
- "type": "ctx"
- },
- },
+ "data": {},
"metadata": {
- "reqId":1514280215504647,
- "from":"L5AD5g65TDQr1PPHHRoiGf",
- "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "reqId":1514304094738044,
+ "from":"21BPzYYrFzbuECcBV3M1FH",
"digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
- "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685"
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
},
},
"txnMetadata": {
"txnTime":1513945121,
"seqNo": 10,
- "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
},
"reqSignature": {
"type": "ED25519",
"values": [{
- "from": "L5AD5g65TDQr1PPHHRoiGf",
- "value": "5ZTp9g4SP6t73rH2s8zgmtqdXyTuSMWwkLvfV1FD6ddHCpwTY5SAsp8YmLWnTgDnPXfJue3vJBWjy89bSHvyMSdS"
+ "from": "21BPzYYrFzbuECcBV3M1FH",
+ "value": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj"
}]
- }
-
- 'rootHash': '5vasvo2NUAD7Gq8RVxJZg1s9F7cBpuem1VgHKaFP8oBm',
- 'auditPath': ['Cdsoz17SVqPodKpe6xmY2ZgJ9UcywFDZTRgWSAYM96iA', '66BCs5tG7qnfK6egnDsvcx2VSNH6z1Mfo9WmhLSExS6b'],
-
+ },
+
+ "rootHash": "DvpkQ2aADvQawmrzvTTjF9eKQxjDkrCbQDszMRbgJ6zV",
+ "auditPath": ["6GdvJfqTekMvzwi9wuEpfqMLzuN1T91kvgRBQLUzjkt6"],
}
}
```
+### LEDGERS_FREEZE
+
+Freeze deprecated ledgers (default ledgers such as the domain, config, pool, and audit ledgers cannot be frozen). If a ledger is frozen, it can be used for reading but not for writing. Frozen ledgers will not be caught up by new nodes and they can't be unfrozen. Frozen ledgers can be removed without breaking consensus, but this would prevent third parties from auditing the ledger history. [More information is in the Indy Plenum documentation](https://github.com/hyperledger/indy-plenum/tree/master/docs/source/transaction_freeze_ledgers.md).
+
+The request has static and dynamic validation. Static validation ensures that base ledgers (pool, audit, domain and config) are not frozen. Dynamic validation checks that the ledgers to be frozen exist. Authorization checks the permissions for the freeze request (3 trustee signatures are needed by default).
+
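The static-validation rule (base ledgers cannot be frozen) can be mirrored on the client before submitting. A minimal Python sketch; the base-ledger IDs 0-3 (pool, domain, config, audit) are the conventional defaults and are an assumption here, as are the example IDs.

```
BASE_LEDGER_IDS = {0, 1, 2, 3}  # pool, domain, config, audit (assumed default IDs)


def build_freeze_operation(ledgers_ids):
    frozen_base = BASE_LEDGER_IDS.intersection(ledgers_ids)
    if frozen_base:
        raise ValueError("base ledgers cannot be frozen: {}".format(sorted(frozen_base)))
    return {"type": "9", "ledgers_ids": list(ledgers_ids)}


print(build_freeze_operation([909, 910]))  # illustrative plugin/deprecated ledger IDs
```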
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "9",
+ "ledgers_ids": [1,2,3,4]
+ },
+ "identifier": "21BPzYYrFzbuECcBV3M1FH",
+ "reqId": 311348486,
+ "protocolVersion": 2,
+ "signature": "3YVzDtSxxnowVwAXZmxCG2fz1A38j1qLrwKmGEG653GZw7KJRBX57Stc1oxQZqqu9mCqFLa7aBzt4MKXk4MeunVj",
+}
+```
+
+*Reply Example*:
+```
+{
+ "op":"REPLY",
+ "result":{
+ "txn":{
+ "type":"9",
+ "data":{
+ "ledgers_ids":[ ]
+ },
+ "protocolVersion":2,
+ "metadata":{
+ "reqId":311348486,
+ "payloadDigest":"09a3ccedf806e224beb56b547e967b442f3ee3181d5c87623f063742df7a692e",
+ "from":"M9BJDuS24bqbJNvBRsoGg3",
+ "digest":"dfdc6c5b77181953b4e32f975b0c5e64b25dc3e3061716aca1baae4cbe0ce494"
+ }
+ },
+ "ver":"1",
+ "auditPath":[
+ "DDwrSsKwpFkfGVqp7AxzRMusUuT9D5RmidCmnr8phTWD"
+ ],
+ "txnMetadata":{
+ "txnTime":1613735420,
+ "seqNo":3
+ },
+ "reqSignature":{
+ "type":"ED25519",
+ "values":[
+ { "value":"bSMfwJraLXzBAmdZKQUC1XfoVb3YWygc6UCQAmrTNKDr9beXta7MFyNZnQtbmNoRurjicSHLo3sW7qv7ZTWcZJa",
+ "from":"M9BJDuS24bqbJNvBRsoGg3"
+ }
+ ]
+ },
+ "rootHash":"E27ssC3LK8azpgxxtBY4ETLXvDLGtfmeSjngc6jrV1Qt"
+ }
+}
+```
## Read Requests
@@ -2169,51 +3036,60 @@ Gets information about a DID (NYM).
*Example*: `identifier` is a DID of the read request sender, and `dest` is the requested DID.
+- `timestamp` (POSIX timestamp; optional; mutually exclusive with `seqNo`):
+
+  Retrieve the value of the nym at the specified timestamp.
+
+- `seqNo` (integer; optional; mutually exclusive with `timestamp`):
+
+ Retrieve the value of the nym at the time the transaction identified by
+ `seqNo` was written to the ledger.
+
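Because `timestamp` and `seqNo` are mutually exclusive, a request builder can enforce that before submitting. A minimal Python sketch; the helper name is illustrative and the DID is taken from the request example below.

```
import time


def build_get_nym(dest, timestamp=None, seq_no=None):
    if timestamp is not None and seq_no is not None:
        raise ValueError("timestamp and seqNo are mutually exclusive")
    operation = {"type": "105", "dest": dest}
    if timestamp is not None:
        operation["timestamp"] = timestamp   # value of the NYM at this point in time
    if seq_no is not None:
        operation["seqNo"] = seq_no          # value of the NYM as of this transaction
    return operation


print(build_get_nym("2VkbBskPNNyWrLrZq7DBhk"))                                      # current value
print(build_get_nym("2VkbBskPNNyWrLrZq7DBhk", timestamp=int(time.time()) - 3600))   # one hour ago
```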
*Request Example*:
```
{
- 'operation': {
- 'type': '105'
- 'dest': '2VkbBskPNNyWrLrZq7DBhk'
+ "operation": {
+ "type": "105"
+ "dest": "2VkbBskPNNyWrLrZq7DBhk"
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '105',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "op": "REPLY",
+ "result": {
+ "type": "105",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
- 'data': '{"dest":"2VkbBskPNNyWrLrZq7DBhk","identifier":"L5AD5g65TDQr1PPHHRoiGf","role":null,"seqNo":10,"txnTime":1514308168,"verkey":"~6hAzy6ubo3qutnnw5A12RF"}',
+ "data": "{"dest":"2VkbBskPNNyWrLrZq7DBhk","identifier":"L5AD5g65TDQr1PPHHRoiGf","role":null,"seqNo":10,"txnTime":1514308168,"verkey":"~6hAzy6ubo3qutnnw5A12RF"}",
- 'dest': '2VkbBskPNNyWrLrZq7DBhk'
+ "dest": "2VkbBskPNNyWrLrZq7DBhk"
}
}
```
@@ -2243,53 +3119,62 @@ i.e. reply data contains requested value only.
Encrypted attribute.
+- `timestamp` (POSIX timestamp; optional; mutually exclusive with `seqNo`):
+
+  Retrieve the value of the attrib at the specified timestamp.
+
+- `seqNo` (integer; optional; mutually exclusive with `timestamp`):
+
+ Retrieve the value of the attrib at the time the transaction identified by
+ `seqNo` was written to the ledger.
+
*Request Example*:
```
{
- 'operation': {
- 'type': '104'
- 'dest': 'AH4RRiPR78DUrCWatnCW2w',
- 'raw': 'dateOfBirth'
+ "operation": {
+ "type": "104"
+ "dest": "AH4RRiPR78DUrCWatnCW2w",
+ "raw": "dateOfBirth"
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '104',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "op": "REPLY",
+ "result": {
+ "type": "104",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
- 'data': '{"dateOfBirth":{"dayOfMonth":23,"month":5,"year":1984}}',
+ "data": "{"dateOfBirth":{"dayOfMonth":23,"month":5,"year":1984}}",
- 'dest': 'AH4RRiPR78DUrCWatnCW2w',
- 'raw': 'dateOfBirth'
+ "dest": "AH4RRiPR78DUrCWatnCW2w",
+ "raw": "dateOfBirth"
}
}
```
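Side note (not from the diff): the new `timestamp`/`seqNo` options above aren't shown in the request example, so here is a minimal sketch of building a GET_ATTRIB query that reads an attrib as of a given time or transaction. It only assembles the documented fields as a plain dict; signing and submission are assumed to happen elsewhere, and the DIDs are placeholders.

```python
import time

def build_get_attrib_request(submitter_did, target_did, raw_name,
                             timestamp=None, seq_no=None):
    """Build a GET_ATTRIB (type "104") request dict.

    `timestamp` and `seq_no` are mutually exclusive, per the description above.
    """
    if timestamp is not None and seq_no is not None:
        raise ValueError("timestamp and seqNo are mutually exclusive")

    operation = {"type": "104", "dest": target_did, "raw": raw_name}
    if timestamp is not None:
        operation["timestamp"] = timestamp
    if seq_no is not None:
        operation["seqNo"] = seq_no

    return {
        "operation": operation,
        "identifier": submitter_did,
        "reqId": int(time.time() * 1_000_000),  # microsecond-resolution reqId, as in the examples
        "protocolVersion": 2,
    }

# Value of the attrib at the time transaction #10 was written:
req = build_get_attrib_request("L5AD5g65TDQr1PPHHRoiGf",
                               "AH4RRiPR78DUrCWatnCW2w",
                               "dateOfBirth", seq_no=10)
```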
@@ -2315,56 +3200,56 @@ Gets Claim's Schema.
*Request Example*:
```
{
- 'operation': {
- 'type': '107'
- 'dest': '2VkbBskPNNyWrLrZq7DBhk',
- 'data': {
- 'name': 'Degree',
- 'version': '1.0'
+ "operation": {
+ "type": "107"
+ "dest": "2VkbBskPNNyWrLrZq7DBhk",
+ "data": {
+ "name": "Degree",
+ "version": "1.0"
},
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '107',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "op": "REPLY",
+ "result": {
+ "type": "107",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
- 'data': {
- 'name': 'Degree',
- 'version': '1.0',
- 'attr_names': ['attrib1', 'attrib2', 'attrib3']
+ "data": {
+ "name": "Degree",
+ "version": "1.0",
+ "attr_names": ["attrib1", "attrib2", "attrib3"]
},
- 'dest': '2VkbBskPNNyWrLrZq7DBhk'
+ "dest": "2VkbBskPNNyWrLrZq7DBhk"
}
}
```
@@ -2393,57 +3278,57 @@ Gets Claim Definition.
*Request Example*:
```
{
- 'operation': {
- 'type': '108'
- 'signature_type': 'CL',
- 'origin': '2VkbBskPNNyWrLrZq7DBhk',
- 'ref': 10,
- 'tag': 'some_tag',
+ "operation": {
+ "type": "108"
+ "signature_type": "CL",
+ "origin": "2VkbBskPNNyWrLrZq7DBhk",
+ "ref": 10,
+ "tag": "some_tag",
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '108',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "op": "REPLY",
+ "result": {
+ "type": "108",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
- 'data': {
- 'primary': ...,
- 'revocation': ...
+ "data": {
+ "primary": ...,
+ "revocation": ...
},
- 'signature_type': 'CL',
- 'origin': '2VkbBskPNNyWrLrZq7DBhk',
- 'ref': 10,
- 'tag': 'some_tag'
+ "signature_type": "CL",
+ "origin": "2VkbBskPNNyWrLrZq7DBhk",
+ "ref": 10,
+ "tag": "some_tag"
}
}
```
@@ -2457,59 +3342,59 @@ Gets a Revocation Registry Definition, that Issuer creates and publishes for a p
*Request Example*:
```
{
- 'operation': {
- 'type': '115'
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
+ "operation": {
+ "type": "115"
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
},
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '115',
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
-
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'data': {
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'credDefId': 'FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag'
- 'revocDefType': 'CL_ACCUM',
- 'tag': 'tag1',
- 'value': {
- 'maxCredNum': 1000000,
- 'tailsHash': '6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297',
- 'tailsLocation': 'http://tails.location.com',
- 'issuanceType': 'ISSUANCE_BY_DEFAULT',
- 'publicKeys': {},
+ "op": "REPLY",
+ "result": {
+ "type": "115",
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "data": {
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "credDefId": "FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag"
+ "revocDefType": "CL_ACCUM",
+ "tag": "tag1",
+ "value": {
+ "maxCredNum": 1000000,
+ "tailsHash": "6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297",
+ "tailsLocation": "http://tails.location.com",
+ "issuanceType": "ISSUANCE_BY_DEFAULT",
+ "publicKeys": {},
},
},
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
@@ -2531,56 +3416,56 @@ Gets a Revocation Registry Accumulator.
*Request Example*:
```
{
- 'operation': {
- 'type': '116'
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'timestamp': 1514214800
+ "operation": {
+ "type": "116"
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "timestamp": 1514214800
},
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '116',
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
+ "op": "REPLY",
+ "result": {
+ "type": "116",
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'timestamp': 1514214800
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "timestamp": 1514214800
- 'seqNo': 10,
- 'txnTime': 1514214795,
+ "seqNo": 10,
+ "txnTime": 1514214795,
- 'data': {
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1'
- 'revocDefType': 'CL_ACCUM',
- 'value': {
- 'accum': 'accum_value',
+ "data": {
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1"
+ "revocDefType": "CL_ACCUM",
+ "value": {
+ "accum": "accum_value",
},
},
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
@@ -2614,91 +3499,91 @@ If `from` is not set, then there is just one state proof (as usual) for both `ac
*Request Example when both `from` and `to` present*:
```
{
- 'operation': {
- 'type': '117'
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'from': 1514214100
- 'to': 1514214900
+ "operation": {
+ "type": "117"
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "from": 1514214100
+ "to": 1514214900
},
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example when both `from` and `to` present*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '117',
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'from': 1514214100
- 'to': 1514214900
-
- 'seqNo': 18,
- 'txnTime': 1514214795,
-
- 'data': {
- 'revocDefType': 'CL_ACCUM',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'value': {
- 'accum_to': {
- 'revocDefType': 'CL_ACCUM',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'txnTime': 1514214795,
- 'seqNo': 18,
- 'value': {
- 'accum': '9a512a7624'
+ "op": "REPLY",
+ "result": {
+ "type": "117",
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "from": 1514214100
+ "to": 1514214900
+
+ "seqNo": 18,
+ "txnTime": 1514214795,
+
+ "data": {
+ "revocDefType": "CL_ACCUM",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "value": {
+ "accum_to": {
+ "revocDefType": "CL_ACCUM",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "txnTime": 1514214795,
+ "seqNo": 18,
+ "value": {
+ "accum": "9a512a7624"
}
},
- 'revoked': [10, 11],
- 'issued': [1, 2, 3],
- 'accum_from': {
- 'revocDefType': 'CL_ACCUM',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'txnTime': 1514214105,
- 'seqNo': 16,
- 'value': {
- 'accum': 'be080bd74b'
+ "revoked": [10, 11],
+ "issued": [1, 2, 3],
+ "accum_from": {
+ "revocDefType": "CL_ACCUM",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "txnTime": 1514214105,
+ "seqNo": 16,
+ "value": {
+ "accum": "be080bd74b"
}
}
},
- 'stateProofFrom': {
- 'multi_signature': {
- 'participants': ['Delta', 'Gamma', 'Alpha'],
- 'signature': 'QpP4oVm2MLQ7SzLVZknuFjneXfqYj6UStn3oQtCdSiKiYuS4n1kxRphKRDMwmS7LGeXgUmy3C8GtcVM5X9SN9qLr2MBApjpPtKE9DkBTwyieh3vN1UMq1Kwx2Jkz7vcSJNH2WzjEKSUnpFLEJk4mpFaibqd1xX2hrwruxzSDUi2uCT',
- 'value': {
- 'state_root_hash': '2sfnQcEKkjw78KYnGJyk5Gw9gtwESvX6NdFFPEiQYQsz',
- 'ledger_id': 1,
- 'pool_state_root_hash': 'JDt3NNrZenx3x41oxsvhWeuSFFerdyqEvQUWyGdHX7gx',
- 'timestamp': 1514214105,
- 'txn_root_hash': 'FCkntnPqfaGx4fCX5tTdWeLr1mXdFuZnTNuEehiet32z'
+ "stateProofFrom": {
+ "multi_signature": {
+ "participants": ["Delta", "Gamma", "Alpha"],
+ "signature": "QpP4oVm2MLQ7SzLVZknuFjneXfqYj6UStn3oQtCdSiKiYuS4n1kxRphKRDMwmS7LGeXgUmy3C8GtcVM5X9SN9qLr2MBApjpPtKE9DkBTwyieh3vN1UMq1Kwx2Jkz7vcSJNH2WzjEKSUnpFLEJk4mpFaibqd1xX2hrwruxzSDUi2uCT",
+ "value": {
+ "state_root_hash": "2sfnQcEKkjw78KYnGJyk5Gw9gtwESvX6NdFFPEiQYQsz",
+ "ledger_id": 1,
+ "pool_state_root_hash": "JDt3NNrZenx3x41oxsvhWeuSFFerdyqEvQUWyGdHX7gx",
+ "timestamp": 1514214105,
+ "txn_root_hash": "FCkntnPqfaGx4fCX5tTdWeLr1mXdFuZnTNuEehiet32z"
}
},
- 'root_hash': '2sfnQcEKkjw78KYnGJyk5Gw9gtwESvX6NdFFPEiQYQsz',
- 'proof_nodes': '+QLB+QE3uFEgOk1TaktUV2tQTHRZb1BFYVRGMVRVRGI6NDpNU2pLVFdrUEx0WW9QRWFURjFUVURiOjM6Q0w6MTM6c29tZV90YWc6Q0xfQUNDVU06YTk4ZWO44vjguN57ImxzbiI6MTYsImx1dCI6MTU1ODUyNDEzMSwidmFsIjp7InJldm9jRGVmVHlwZSI6IkNMX0FDQ1VNIiwicmV2b2NSZWdEZWZJZCI6Ik1TaktUV2tQTHRZb1BFYVRGMVRVRGI6NDpNU2pLVFdrUEx0WW9QRWFURjFUVURiOjM6Q0w6MTM6c29tZV90YWc6Q0xfQUNDVU06YTk4ZWMiLCJzZXFObyI6MTYsInR4blRpbWUiOjE1NTg1MjQxMzEsInZhbHVlIjp7ImFjY3VtIjoiYmUwODBiZDc0YiJ9fX34UYCAgICAoAgDh8v1CXNEJGFl302RO98x8R6Ozscy0ZFdRpiCobh3oCnaxHyPnZq6E+mbnfU6oC994Wv1nh7sf0pQOp5g93tbgICAgICAgICAgPkBMYCAgKBmZE7e2jSrhTw9usjxZcAb25uSisJV+TzkbXNUypyJvaA/KAFG4RQqB9dAGRfTgly2XjXvPCeVr7vBn6FSkN7sH4CAoBGxQDip9XfEC/CkgimSkkhCeMm9XnkxxwWiMJwzuhAjgKAmh0g8FUI60e7NBwTu7ukdfz6kaON6u9U87kTeTlPcXICgsk2X2G6MlVhEqMEzthWAT4ey6qRaKXpOuMZOA1kMODagQdHobiexMaAwqtI7P5bbfqNkQEoZD79m6z43DEGQGL6gXQfLXgd+xWWXypr1pDxPvjSU8UtHjfWhJ58aiuqAzmKgyee3YL7GFFd+5oxG9b/q4od/mRjFpLdXKR3YG2o/hAygRIVVdoVD0dpqktsN8kSc03UhYiI76nxdCejX+CV4OX6A'
+ "root_hash": "2sfnQcEKkjw78KYnGJyk5Gw9gtwESvX6NdFFPEiQYQsz",
+ "proof_nodes": "+QLB+QE3uFEgOk1TaktUV2tQTHRZb1BFYVRGMVRVRGI6NDpNU2pLVFdrUEx0WW9QRWFURjFUVURiOjM6Q0w6MTM6c29tZV90YWc6Q0xfQUNDVU06YTk4ZWO44vjguN57ImxzbiI6MTYsImx1dCI6MTU1ODUyNDEzMSwidmFsIjp7InJldm9jRGVmVHlwZSI6IkNMX0FDQ1VNIiwicmV2b2NSZWdEZWZJZCI6Ik1TaktUV2tQTHRZb1BFYVRGMVRVRGI6NDpNU2pLVFdrUEx0WW9QRWFURjFUVURiOjM6Q0w6MTM6c29tZV90YWc6Q0xfQUNDVU06YTk4ZWMiLCJzZXFObyI6MTYsInR4blRpbWUiOjE1NTg1MjQxMzEsInZhbHVlIjp7ImFjY3VtIjoiYmUwODBiZDc0YiJ9fX34UYCAgICAoAgDh8v1CXNEJGFl302RO98x8R6Ozscy0ZFdRpiCobh3oCnaxHyPnZq6E+mbnfU6oC994Wv1nh7sf0pQOp5g93tbgICAgICAgICAgPkBMYCAgKBmZE7e2jSrhTw9usjxZcAb25uSisJV+TzkbXNUypyJvaA/KAFG4RQqB9dAGRfTgly2XjXvPCeVr7vBn6FSkN7sH4CAoBGxQDip9XfEC/CkgimSkkhCeMm9XnkxxwWiMJwzuhAjgKAmh0g8FUI60e7NBwTu7ukdfz6kaON6u9U87kTeTlPcXICgsk2X2G6MlVhEqMEzthWAT4ey6qRaKXpOuMZOA1kMODagQdHobiexMaAwqtI7P5bbfqNkQEoZD79m6z43DEGQGL6gXQfLXgd+xWWXypr1pDxPvjSU8UtHjfWhJ58aiuqAzmKgyee3YL7GFFd+5oxG9b/q4od/mRjFpLdXKR3YG2o/hAygRIVVdoVD0dpqktsN8kSc03UhYiI76nxdCejX+CV4OX6A"
}
},
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
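Side note (not from the diff): a sketch of how a client might consume the `issued`/`revoked` lists between `accum_from` and `accum_to` in the reply above to keep a local set of revoked credential indices up to date. This is my reading of the documented fields, not code from this repo.

```python
def apply_revoc_delta(revoked_so_far, delta_value):
    """Apply the 'value' dict of a GET_REVOC_REG_DELTA reply to a set of revoked indices."""
    updated = set(revoked_so_far)
    updated.update(delta_value.get("revoked", []))            # revoked between `from` and `to`
    updated.difference_update(delta_value.get("issued", []))  # (re)issued between `from` and `to`
    return updated

delta_value = {
    "accum_from": {"value": {"accum": "be080bd74b"}},
    "accum_to": {"value": {"accum": "9a512a7624"}},
    "issued": [1, 2, 3],
    "revoked": [10, 11],
}
print(apply_revoc_delta({1, 2, 3, 7}, delta_value))  # -> {7, 10, 11}
```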
@@ -2712,72 +3597,247 @@ If `from` is not set, then there is just one state proof (as usual) for both `ac
*Request Example when there is only `to` present*:
```
{
- 'operation': {
- 'type': '117'
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'to': 1514214900
+ "operation": {
+ "type": "117"
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "to": 1514214900
},
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example when there is only `to` present*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '117',
- 'identifier': 'T6AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'to': 1514214900
-
- 'seqNo': 18,
- 'txnTime': 1514214795,
-
- 'data': {
- 'revocDefType': 'CL_ACCUM',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'value': {
- 'accum_to': {
- 'revocDefType': 'CL_ACCUM',
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'txnTime': 1514214795,
- 'seqNo': 18,
- 'value': {
- 'accum': '9a512a7624'
+ "op": "REPLY",
+ "result": {
+ "type": "117",
+ "identifier": "T6AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "to": 1514214900
+
+ "seqNo": 18,
+ "txnTime": 1514214795,
+
+ "data": {
+ "revocDefType": "CL_ACCUM",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "value": {
+ "accum_to": {
+ "revocDefType": "CL_ACCUM",
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "txnTime": 1514214795,
+ "seqNo": 18,
+ "value": {
+ "accum": "9a512a7624"
}
},
- 'issued': [],
- 'revoked': [1, 2, 3, 4, 5]
+ "issued": [],
+ "revoked": [1, 2, 3, 4, 5]
+ },
+
+
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
+ },
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
+ }
},
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ }
+}
+```
+
+### GET_RICH_SCHEMA_OBJECT_BY_ID
+
+Gets a Rich Schema object (of any type) by its unique `id`.
+
+- `id` (string):
+
+  A unique ID (for example a DID with an id-string being the base58 representation of the SHA2-256 hash of the `content` field)
+
+
+
+
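Side note (not from the diff): a minimal sketch of deriving the id-string described above, i.e. the base58 representation of the SHA2-256 hash of the `content` field. It assumes the third-party `base58` package, and the `did:sov:` prefix is only illustrative.

```python
import hashlib

import base58  # third-party package, assumed to be available

def rich_schema_object_id(content: str) -> str:
    """Return 'did:sov:' + base58(SHA2-256(content)) as described for the `id` field."""
    digest = hashlib.sha256(content.encode("utf-8")).digest()
    return "did:sov:" + base58.b58encode(digest).decode("ascii")

print(rich_schema_object_id('{"input": {"id": "DateRFC3339", "type": "string"}}'))
```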
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "300"
+ "id": "did:sov:GGAD5g65TDQr1PPHHRoiGf",
+ },
+
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
+}
+```
+
+*Reply Example (for an Encoding object)*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "type": "300",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+
+ "seqNo": 10,
+ "txnTime": 1514214795,
+
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "input": {
+ "id": "DateRFC3339",
+ "type": "string"
+ },
+ "output": {
+ "id": "UnixTime",
+ "type": "256-bit integer"
+ },
+ "algorithm": {
+ "description": "This encoding transforms an
+ RFC3339-formatted datetime object into the number
+ of seconds since January 1, 1970 (the Unix epoch).",
+ "documentation": URL to specific github commit,
+ "implementation": URL to implementation
+ },
+ "test_vectors": URL to specific github commit
+ }",
+ "rsName":"SimpleEncoding",
+ "rsVersion":"1.0",
+ "rsType": "enc",
+ "from": "89kbBskPNNyWrLrZq7DBhk",
+ "endorser": "45kbBskPNNyWrLrZq7DBhk",
+ "ver": "1"
+ },
+
+ "dest": "2VkbBskPNNyWrLrZq7DBhk"
+ }
+}
+```
+
+### GET_RICH_SCHEMA_OBJECT_BY_METADATA
+
+Gets a Rich Schema object (of any type) by its unique `rsName`, `rsVersion` and `rsType`.
+
+- `rsType` (string):
+ Requested rich schema object's type.
+
+- `rsName` (string):
+
+ Requested rich schema object's name
+
+- `rsVersion` (string):
+
+ Requested rich schema object's version
+
+*Request Example*:
+```
+{
+ "operation": {
+ "type": "301"
+ "rsName":"SimpleEncoding",
+ "rsVersion":"1.0",
+ "rsType": "enc"
+ },
+
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
+}
+```
+*Reply Example (for an Encoding object)*:
+```
+{
+ "op": "REPLY",
+ "result": {
+ "type": "301",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "seqNo": 10,
+ "txnTime": 1514214795,
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 1,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
+ },
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
+ }
+ },
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "input": {
+ "id": "DateRFC3339",
+ "type": "string"
+ },
+ "output": {
+ "id": "UnixTime",
+ "type": "256-bit integer"
+ },
+ "algorithm": {
+ "description": "This encoding transforms an
+ RFC3339-formatted datetime object into the number
+ of seconds since January 1, 1970 (the Unix epoch).",
+ "documentation": URL to specific github commit,
+ "implementation": URL to implementation
+ },
+ "test_vectors": URL to specific github commit
+ }",
+ "rsName":"SimpleEncoding",
+ "rsVersion":"1.0",
+ "rsType": "enc",
+ "from": "89kbBskPNNyWrLrZq7DBhk",
+ "endorser": "45kbBskPNNyWrLrZq7DBhk",
+ "ver": "1"
+ },
+ "dest": "2VkbBskPNNyWrLrZq7DBhk"
}
}
```
+
### GET_AUTH_RULE
A request to get an auth constraint for an authentication rule or a full list of rules from Ledger. The constraint format is described in [AUTH_RULE transaction](#auth_rule).
@@ -2816,76 +3876,76 @@ Each output list element is equal to the input of [AUTH_RULE](#auth_rule), so li
*Request Example (for getting one rule)*:
```
{
- 'reqId':572495653,
- 'signature':'366f89ehxLuxPySGcHppxbURWRcmXVdkHeHrjtPKNYSRKnvaxzUXF8CEUWy9KU251u5bmnRL3TKvQiZgjwouTJYH',
- 'identifier':'M9BJDuS24bqbJNvBRsoGg3',
- 'operation':{
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
+ "reqId":572495653,
+ "signature":"366f89ehxLuxPySGcHppxbURWRcmXVdkHeHrjtPKNYSRKnvaxzUXF8CEUWy9KU251u5bmnRL3TKvQiZgjwouTJYH",
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3",
+ "operation":{
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
},
- 'protocolVersion':2
+ "protocolVersion":2
}
```
*Reply Example (for getting one rule)*:
```
{
- 'op':'REPLY',
- 'result':{
- 'type':'121',
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
+ "op":"REPLY",
+ "result":{
+ "type":"121",
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
- 'reqId':441933878,
- 'identifier':'M9BJDuS24bqbJNvBRsoGg3',
+ "reqId":441933878,
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3",
- 'data':[
+ "data":[
{
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
- 'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 2,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
+ "constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 2,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
}
],
- 'state_proof':{
- 'proof_nodes':'+Pz4+pUgQURELS0xLS1yb2xlLS0qLS0xMDG44vjguN57ImF1dGhfY29uc3RyYWludHMiOlt7ImNvbnN0cmFpbnRfaWQiOiJST0xFIiwibWV0YWRhdGEiOnt9LCJuZWVkX3RvX2JlX293bmVyIjpmYWxzZSwicm9sZSI6IjAiLCJzaWdfY291bnQiOjF9LHsiY29uc3RyYWludF9pZCI6IlJPTEUiLCJtZXRhZGF0YSI6e30sIm5lZWRfdG9fYmVfb3duZXIiOmZhbHNlLCJyb2xlIjoiMiIsInNpZ19jb3VudCI6MX1dLCJjb25zdHJhaW50X2lkIjoiQU5EIn0=',
- 'root_hash':'DauPq3KR6QFnkaAgcfgoMvvWR6UTdHKZgzbjepqWaBqF',
- 'multi_signature':{
- 'signature':'RNsPhUuPwwtA7NEf4VySCg1Fb2NpwapXrY8d64TLsRHR9rQ5ecGhRd89NTHabh8qEQ8Fs1XWawHjbSZ95RUYsJwx8PEXQcFEDGN3jc5VY31Q5rGg3aeBdFFxgYo11cZjrk6H7Md7N8fjHrKRdxo6TzDKSszJTNM1EAPLzyC6kKCnF9',
- 'value':{
- 'state_root_hash':'DauPq3KR6QFnkaAgcfgoMvvWR6UTdHKZgzbjepqWaBqF',
- 'pool_state_root_hash':'9L5CbxzhsNrZeGSJGVVpsC56JpuS5DGdUqfsFsR1RsFQ',
- 'timestamp':1552395470,
- 'txn_root_hash':'4CowHvnk2Axy2HWcYmT8b88A1Sgk45x7yHAzNnxowN9h',
- 'ledger_id':2
+ "state_proof":{
+ "proof_nodes":"+Pz4+pUgQURELS0xLS1yb2xlLS0qLS0xMDG44vjguN57ImF1dGhfY29uc3RyYWludHMiOlt7ImNvbnN0cmFpbnRfaWQiOiJST0xFIiwibWV0YWRhdGEiOnt9LCJuZWVkX3RvX2JlX293bmVyIjpmYWxzZSwicm9sZSI6IjAiLCJzaWdfY291bnQiOjF9LHsiY29uc3RyYWludF9pZCI6IlJPTEUiLCJtZXRhZGF0YSI6e30sIm5lZWRfdG9fYmVfb3duZXIiOmZhbHNlLCJyb2xlIjoiMiIsInNpZ19jb3VudCI6MX1dLCJjb25zdHJhaW50X2lkIjoiQU5EIn0=",
+ "root_hash":"DauPq3KR6QFnkaAgcfgoMvvWR6UTdHKZgzbjepqWaBqF",
+ "multi_signature":{
+ "signature":"RNsPhUuPwwtA7NEf4VySCg1Fb2NpwapXrY8d64TLsRHR9rQ5ecGhRd89NTHabh8qEQ8Fs1XWawHjbSZ95RUYsJwx8PEXQcFEDGN3jc5VY31Q5rGg3aeBdFFxgYo11cZjrk6H7Md7N8fjHrKRdxo6TzDKSszJTNM1EAPLzyC6kKCnF9",
+ "value":{
+ "state_root_hash":"DauPq3KR6QFnkaAgcfgoMvvWR6UTdHKZgzbjepqWaBqF",
+ "pool_state_root_hash":"9L5CbxzhsNrZeGSJGVVpsC56JpuS5DGdUqfsFsR1RsFQ",
+ "timestamp":1552395470,
+ "txn_root_hash":"4CowHvnk2Axy2HWcYmT8b88A1Sgk45x7yHAzNnxowN9h",
+ "ledger_id":2
},
- 'participants':[
- 'Beta',
- 'Gamma',
- 'Delta'
+ "participants":[
+ "Beta",
+ "Gamma",
+ "Delta"
]
}
},
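Side note (not from the diff): in the examples above `[VALIDATOR]` is shorthand rather than literal JSON. A minimal sketch of building the GET_AUTH_RULE (type "121") query with concrete values, either for one rule or for the full list:

```python
import time

def build_get_auth_rule_request(submitter_did, auth_type=None, auth_action=None,
                                field=None, old_value=None, new_value=None):
    """Build a GET_AUTH_RULE (type "121") query dict.

    Pass only submitter_did to fetch the full list of rules; pass the rule key
    fields together to fetch a single rule, as in the examples above.
    """
    operation = {"type": "121"}
    if auth_type is not None:
        operation["auth_type"] = auth_type
        operation["auth_action"] = auth_action
        operation["field"] = field
        if old_value is not None:
            operation["old_value"] = old_value
        operation["new_value"] = new_value
    return {
        "operation": operation,
        "identifier": submitter_did,
        "reqId": int(time.time() * 1_000_000),
        "protocolVersion": 2,
    }

# Single rule: EDIT of a node's "services" from ["VALIDATOR"] to [] (validator demotion)
one_rule = build_get_auth_rule_request("M9BJDuS24bqbJNvBRsoGg3", "0", "EDIT",
                                       "services", ["VALIDATOR"], [])
# Full list of rules
all_rules = build_get_auth_rule_request("M9BJDuS24bqbJNvBRsoGg3")
```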
@@ -2896,60 +3956,60 @@ Each output list element is equal to the input of [AUTH_RULE](#auth_rule), so li
*Request Example (for getting all rules)*:
```
{
- 'reqId':575407732,
- 'signature':'4AheMmtrfoHuAEtg5VsFPGe1j2w1UYxAvShRmfsCTSHnBDoA5EbmCa2xZzZVQjQGUFbYr65uznu1iUQhW22RNb1X',
- 'identifier':'M9BJDuS24bqbJNvBRsoGg3',
- 'operation':{
- 'type':'121'
+ "reqId":575407732,
+ "signature":"4AheMmtrfoHuAEtg5VsFPGe1j2w1UYxAvShRmfsCTSHnBDoA5EbmCa2xZzZVQjQGUFbYr65uznu1iUQhW22RNb1X",
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3",
+ "operation":{
+ "type":"121"
},
- 'protocolVersion':2
+ "protocolVersion":2
}
```
*Reply Example (for getting all rules)*:
```
{
- 'op':'REPLY',
- 'result':{
- 'type':'121',
+ "op":"REPLY",
+ "result":{
+ "type":"121",
- 'reqId':575407732,
- 'identifier':'M9BJDuS24bqbJNvBRsoGg3'
+ "reqId":575407732,
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3"
- 'data':[
+ "data":[
{
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
- 'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 2,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
+ "constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 2,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
},
{
- 'auth_type': '102',
- 'auth_action': 'ADD',
- 'field' :'*',
- 'new_value': '*'
- 'constraint':{
- 'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': False,
- 'metadata': {}
+ "auth_type": "102",
+ "auth_action": "ADD",
+ "field" :"*",
+ "new_value": "*"
+ "constraint":{
+ "constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": False,
+ "metadata": {}
},
},
........
@@ -2988,32 +4048,32 @@ All input parameters are optional and mutually exclusive.
*Request Example*:
```
{
- 'operation': {
- 'type': '6'
- 'version': '1.0',
+ "operation": {
+ "type": "6"
+ "version": "1.0",
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '6',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
+ "op": "REPLY",
+ "result": {
+ "type": "6",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
- 'version': '1.0',
+ "version": "1.0",
- 'seqNo': 10,
- 'txnTime': 1514214795,
+ "seqNo": 10,
+ "txnTime": 1514214795,
- 'data': {
+ "data": {
"version": "1.0",
"text": "Please read carefully before writing anything to the ledger",
"digest": "ca11c39b44ce4ec8666a8f63efd5bacf98a8e26c4f8890c87f629f126a3b74f3"
@@ -3021,19 +4081,19 @@ All input parameters are optional and mutually exclusive.
"retirement_ts": 1515415195838044
},
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 2,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 2,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
@@ -3065,32 +4125,32 @@ All input parameters are optional and mutually exclusive.
*Request Example*:
```
{
- 'operation': {
- 'type': '7'
- 'version': '1.0',
+ "operation": {
+ "type": "7"
+ "version": "1.0",
},
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
+ "protocolVersion": 2
}
```
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'type': '7',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
+ "op": "REPLY",
+ "result": {
+ "type": "7",
+ "identifier": "L5AD5g65TDQr1PPHHRoiGf",
+ "reqId": 1514308188474704,
- 'version': '1.0',
+ "version": "1.0",
- 'seqNo': 10,
- 'txnTime': 1514214795,
+ "seqNo": 10,
+ "txnTime": 1514214795,
- 'data': {
+ "data": {
"version": "1.0",
"aml": {
"EULA": "Included in the EULA for the product being used",
@@ -3101,19 +4161,19 @@ All input parameters are optional and mutually exclusive.
"amlContext": "http://aml-context-descr"
},
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 2,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
+ "state_proof": {
+ "root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH",
+ "proof_nodes": "+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=",
+ "multi_signature": {
+ "value": {
+ "timestamp": 1514308168,
+ "ledger_id": 2,
+ "txn_root_hash": "4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y",
+ "pool_state_root_hash": "9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK",
+ "state_root_hash": "81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH"
},
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
+ "signature": "REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3",
+ "participants": ["Delta", "Gamma", "Alpha"]
}
},
@@ -3122,95 +4182,6 @@ All input parameters are optional and mutually exclusive.
}
```
-### GET_CONTEXT
-
-Gets Context.
-
-- `dest` (base58-encoded string):
-
- Context DID as base58-encoded string for 16 or 32 byte DID value.
- It differs from `identifier` metadata field, where `identifier` is the DID of the submitter.
-
- *Example*: `identifier` is a DID of the read request sender, and `dest` is the DID of the Context.
-
-- `meta` (dict):
-
- - `name` (string): Context's name string
- - `version` (string): Context's version string
-
-
-
-*Request Example*:
-```
-{
- 'operation': {
- 'type': '300'
- 'dest': '2VkbBskPNNyWrLrZq7DBhk',
- 'meta': {
- 'name': 'SimpleContext',
- 'version': '1.0',
- 'type': 'ctx'
- },
- },
-
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
- 'protocolVersion': 2
-}
-```
-
-*Reply Example*:
-```
-{
- 'op': 'REPLY',
- 'result': {
- 'type': '300',
- 'identifier': 'L5AD5g65TDQr1PPHHRoiGf',
- 'reqId': 1514308188474704,
-
- 'seqNo': 10,
- 'txnTime': 1514214795,
-
- 'state_proof': {
- 'root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH',
- 'proof_nodes': '+QHl+FGAgICg0he/hjc9t/tPFzmCrb2T+nHnN0cRwqPKqZEc3pw2iCaAoAsA80p3oFwfl4dDaKkNI8z8weRsSaS9Y8n3HoardRzxgICAgICAgICAgID4naAgwxDOAEoIq+wUHr5h9jjSAIPDjS7SEG1NvWJbToxVQbh6+Hi4dnsiaWRlbnRpZmllciI6Ikw1QUQ1ZzY1VERRcjFQUEhIUm9pR2YiLCJyb2xlIjpudWxsLCJzZXFObyI6MTAsInR4blRpbWUiOjE1MTQyMTQ3OTUsInZlcmtleSI6In42dWV3Um03MmRXN1pUWFdObUFkUjFtIn348YCAgKDKj6ZIi+Ob9HXBy/CULIerYmmnnK2A6hN1u4ofU2eihKBna5MOCHiaObMfghjsZ8KBSbC6EpTFruD02fuGKlF1q4CAgICgBk8Cpc14mIr78WguSeT7+/rLT8qykKxzI4IO5ZMQwSmAoLsEwI+BkQFBiPsN8F610IjAg3+MVMbBjzugJKDo4NhYoFJ0ln1wq3FTWO0iw1zoUcO3FPjSh5ytvf1jvSxxcmJxoF0Hy14HfsVll8qa9aQ8T740lPFLR431oSefGorqgM5ioK1TJOr6JuvtBNByVMRv+rjhklCp6nkleiyLIq8vZYRcgIA=',
- 'multi_signature': {
- 'value': {
- 'timestamp': 1514308168,
- 'ledger_id': 1,
- 'txn_root_hash': '4Y2DpBPSsgwd5CVE8Z2zZZKS4M6n9AbisT3jYvCYyC2y',
- 'pool_state_root_hash': '9fzzkqU25JbgxycNYwUqKmM3LT8KsvUFkSSowD4pHpoK',
- 'state_root_hash': '81bGgr7FDSsf4ymdqaWzfnN86TETmkUKH4dj4AqnokrH'
- },
- 'signature': 'REbtR8NvQy3dDRZLoTtzjHNx9ar65ttzk4jMqikwQiL1sPcHK4JAqrqVmhRLtw6Ed3iKuP4v8tgjA2BEvoyLTX6vB6vN4CqtFLqJaPJqMNZvr9tA5Lm6ZHBeEsH1QQLBYnWSAtXt658PotLUEp38sNxRh21t1zavbYcyV8AmxuVTg3',
- 'participants': ['Delta', 'Gamma', 'Alpha']
- }
- },
-
- "data":{
- "@context": [
- {
- "@version": 1.1
- },
- "https://www.w3.org/ns/odrl.jsonld",
- {
- "ex": "https://example.org/examples#",
- "schema": "http://schema.org/",
- "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
- }
- ]
- },
-
- "meta": {
- "name":"SimpleContext",
- "version":"1.0",
- "type": "ctx"
- },
-
- 'dest': '2VkbBskPNNyWrLrZq7DBhk'
- }
-}
-```
### GET_TXN
@@ -3227,15 +4198,15 @@ A generic request to get a transaction from Ledger by its sequence number.
*Request Example (requests a NYM txn with seqNo=9)*:
```
{
- 'operation': {
- 'type': '3',
- 'ledgerId': 1,
- 'data': 9
+ "operation": {
+ "type": "3",
+ "ledgerId": 1,
+ "data": 9
},
- 'identifier': 'MSjKTWkPLtYoPEaTF1TUDb',
- 'reqId': 1514311281279625,
- 'protocolVersion': 2
+ "identifier": "MSjKTWkPLtYoPEaTF1TUDb",
+ "reqId": 1514311281279625,
+ "protocolVersion": 2
}
```
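Side note (not from the diff): the same GET_TXN query can be produced by the python3-indy wrapper. A rough sketch follows; the exact parameter order `(submitter_did, ledger_type, seq_no)` is from memory and not verified against the wrapper version pinned here.

```python
import asyncio
import json

from indy import ledger  # python3-indy wrapper around libindy

async def build_get_txn():
    # GET_TXN (type "3") for seqNo 9; "DOMAIN" should correspond to ledgerId 1
    # as in the request example above.
    request_json = await ledger.build_get_txn_request("MSjKTWkPLtYoPEaTF1TUDb", "DOMAIN", 9)
    print(json.dumps(json.loads(request_json), indent=2))

asyncio.get_event_loop().run_until_complete(build_get_txn())
```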
@@ -3300,6 +4271,70 @@ A generic request to get a transaction from Ledger by its sequence number.
}
```
+### GET_FROZEN_LEDGERS
+
+Get the whole list of frozen ledgers. The reply `data` uses the following state format:
+
+```
+<ledger_id>: {
+ ledger: <ledger_root_hash>,
+ state: <state_root_hash>,
+ seq_no: <last_seq_no>
+}
+```
+
+*Request Example*:
+```
+{
+ "operation":{
+ "type":"10"
+ },
+ "reqId":783857061,
+ "protocolVersion":2,
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3"
+}
+```
+
+*Reply Example*:
+```
+{
+ "result":{
+ "seqNo":3,
+ "type":"10",
+ "state_proof":{
+ "root_hash":"HUv35b31eqncHZ1R8xMQW9pJnCBqAaUVrfCA8AeTtx6u",
+ "multi_signature":{
+ "value":{
+ "pool_state_root_hash":"4bCEk76QsB6p3yCiDntMedpeZmiQtdH9NRpcFyvaLHhc",
+ "state_root_hash":"HUv35b31eqncHZ1R8xMQW9pJnCBqAaUVrfCA8AeTtx6u",
+ "timestamp":1613736202,
+ "ledger_id":2,
+ "txn_root_hash":"BY6PV9SrV1dgQgxy2kpeTLESQfazTYoLdLZfjzVmcLeV"
+ },
+ "signature":"R8FRHVg51YiY5nS8Hh8iXNa1ZPKjrQMmurnrGek2A7QMKq79Pws4DLhgcVgf66PSJGEPjmyASYxFziEnubY1RFHQiE7ZToLZqW4oJt11hhL1XgXwrdswyqTQjuyxx5nzjyE4AzyTvs3BywD54s3w3mUhLG3QWwBp1uTX8agLEKZDkK",
+ "participants":[
+ "Gamma",
+ "Delta",
+ "Beta"
+ ]
+ },
+ "proof_nodes":"+L74vJEgNDpGUk9aRU5fTEVER0VSU7io+Ka4pHsibHNuIjozLCJsdXQiOjE2MTM3MzYyMDIsInZhbCI6eyI5MDkiOnsibGVkZ2VyIjoiR0tvdDVoQnNkODFrTXVwTkNYSGFxYmh2M2h1RWJ4QUZNTG5wY1gyaG5pd24iLCJzZXFfbm8iOjAsInN0YXRlIjoiRGZOTG1INERBSFRLdjYzWVBGSnp1UmRlRXRWd0Y1UnRWbnZLWUhkOGlMRUEifX19"
+ },
+ "txnTime":1613736202,
+ "reqId":666493618,
+ "data":{
+ "909":{
+ "seq_no":0,
+ "state":"DfNLmH4DAHTKv63YPFJzuRdeEtVwF5RtVnvKYHd8iLEA",
+ "ledger":"GKot5hBsd81kMupNCXHaqbhv3huEbxAFMLnpcX2hniwn"
+ }
+ },
+ "identifier":"M9BJDuS24bqbJNvBRsoGg3"
+ },
+ "op":"REPLY"
+}
+```
+
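Side note (not from the diff): a minimal sketch of walking the `data` map in a GET_FROZEN_LEDGERS reply, using the state format documented above (ledger root hash, state root hash and last seq_no keyed by ledger id).

```python
def summarize_frozen_ledgers(reply):
    """Print one line per frozen ledger from a GET_FROZEN_LEDGERS reply."""
    frozen = reply["result"].get("data") or {}
    for ledger_id, info in frozen.items():
        print(f"ledger {ledger_id}: last seq_no={info['seq_no']}, "
              f"ledger root={info['ledger']}, state root={info['state']}")

reply = {"result": {"data": {"909": {
    "seq_no": 0,
    "state": "DfNLmH4DAHTKv63YPFJzuRdeEtVwF5RtVnvKYHd8iLEA",
    "ledger": "GKot5hBsd81kMupNCXHaqbhv3huEbxAFMLnpcX2hniwn",
}}}}
summarize_frozen_ledgers(reply)
```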
## Action Requests
### POOL_RESTART
@@ -3351,11 +4386,11 @@ Command provide info from all the connected nodes without need of consensus.
*Request Example*:
```
{
- 'protocolVersion': 2,
- 'reqId': 83193,
- 'identifier': 'M9BJDuS24bqbJNvBRsoGg3',
- 'operation': {
- 'type': '119'
+ "protocolVersion": 2,
+ "reqId": 83193,
+ "identifier": "M9BJDuS24bqbJNvBRsoGg3",
+ "operation": {
+ "type": "119"
}
}
```
@@ -3363,12 +4398,12 @@ Command provide info from all the connected nodes without need of consensus.
*Reply Example*:
```
{
- 'op': 'REPLY',
- 'result': {
- 'reqId': 83193,
- 'data': { <Json with node info> },
- 'type': '119',
- 'identifier': 'M9BJDuS24bqbJNvBRsoGg3'
+ "op": "REPLY",
+ "result": {
+ "reqId": 83193,
+ "data": { <Json with node info> },
+ "type": "119",
+ "identifier": "M9BJDuS24bqbJNvBRsoGg3"
}
}
```
* [X] diff --git a/docs/source/setup-dev.md b/docs/source/setup-dev.md
This covers the new devcontainer/Gitpod development setup and removes the obsolete Indy-Crypto instructions. Looks good to me. 😀
index 5157d773..c4d9ef3e 100644
--- a/docs/source/setup-dev.md
+++ b/docs/source/setup-dev.md
@@ -1,9 +1,22 @@
# Dev Setup
+The preferred method of setting up the development environment is to use the devcontainers.
+All configuration files for VSCode and [Gitpod](https://gitpod.io) are already placed in this repository.
+If you are new to the concept of devcontainers in combination with VSCode [here](https://code.visualstudio.com/docs/remote/containers) is a good article about it.
-There are scripts that can help in setting up environment and project for developers.
+Simply clone this repository and VSCode will most likely ask you to open it in the devcontainer, if you have the correct extension("ms-vscode-remote.remote-containers") installed.
+If VSCode didn't ask to open it, open the command palette and use the `Remote-Containers: Rebuild and Reopen in Container` command.
+
+If you want to use Gitpod simply use this [link](https://gitpod.io/#https://github.com/hyperledger/indy-node/tree/ubuntu-20.04-upgrade)
+or if you want to work with your fork, prefix the entire URL of your branch with `gitpod.io/#` so that it looks like `https://gitpod.io/#https://github.com/hyperledger/indy-node/tree/ubuntu-20.04-upgrade`.
+
+**Note**: Be aware that the config files for Gitpod and VSCode are currently only used in the `ubuntu-20.04-upgrade` branch!
+
+
+There are also scripts that can help in setting up an environment and project for developers.
The scripts are in [dev-setup](https://github.com/hyperledger/indy-node/tree/master/dev-setup) folder.
+**Note**: Beware that these may be outdated and more cumbersome to set up.
-**Note**: as of now, we provide scripts for Ubuntu only. It's not guaranteed that the code is working on Windows.
+**Note**: As of now, we provide scripts for Ubuntu only. It's not guaranteed that the code is working on Windows.
- One needs Python 3.5 to work with the code
- We recommend using Python virtual environment for development
@@ -18,7 +31,7 @@ You can also have a look at the scripts mentioned below to follow them and perfo
1. Get scripts from [dev-setup-ubuntu](https://github.com/hyperledger/indy-node/tree/master/dev-setup/ubuntu)
1. Run `setup-dev-python.sh` to setup Python3.5, pip and virtualenv
1. Run `source ~/.bashrc` to apply virtual environment wrapper installation
-1. Run `setup-dev-depend-ubuntu16.sh` to setup dependencies (libindy, libindy-crypto, libsodium)
+1. Run `setup-dev-depend-ubuntu16.sh` to setup dependencies (libindy, ursa, libsodium)
1. Fork [indy-plenum](https://github.com/hyperledger/indy-plenum) and [indy-node](https://github.com/hyperledger/indy-node)
1. Go to the destination folder for the project
1. Run `init-dev-project.sh <github-name> <new-virtualenv-name>` to clone indy-plenum and indy-node projects and
@@ -112,29 +125,12 @@ Once you have homebrew installed, run ```brew install libsodium``` to install li
1. Copy the libsodium-x.dll from libsodium-win32\bin or libsodium-win64\bin to C:\Windows\System or System32 and rename it to libsodium.dll.
-### Setup Indy-Crypto
-
-Indy depends on [Indy-Crypto](https://github.com/hyperledger/indy-crypto).
-
-There is a deb package of libindy-crypto that can be used on Ubuntu:
-```
-sudo apt-get update
-sudo apt-get install apt-transport-https ca-certificates
-apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
-sudo add-apt-repository "deb https://repo.sovrin.org/deb xenial master"
-sudo apt-get update
-sudo apt-get install libindy-crypto
-```
-
-See [Indy-Crypto](https://github.com/hyperledger/indy-crypto) on how it can be installed on other platforms.
-
-
### Setup RocksDB
Indy depends on RocksDB, an embeddable persistent key-value store for fast storage.
Currently Indy requires RocksDB version 5.8.8 or higher. There is a deb package of RocksDB-5.8.8 and related stuff that
-can be used on Ubuntu 16.04 (repository configuration steps may be skipped if Indy-Crypto installation steps have been done):
+can be used on Ubuntu 16.04:
```
# Start of repository configuration steps
sudo apt-get update
@@ -153,19 +149,21 @@ sudo apt-get install libbz2-dev \
See [RocksDB](https://github.com/facebook/rocksdb) on how it can be installed on other platforms.
-### Setup Libindy
+### Setup Libindy and Ursa
Indy needs [Libindy](https://github.com/hyperledger/indy-sdk) as a test dependency.
+It also relies on [ursa](https://github.com/hyperledger/ursa), a library that supplies cryptographic signatures.
-There is a deb package of libindy that can be used on Ubuntu:
+There are deb packages of libindy and ursa that can be used on Ubuntu:
```
sudo add-apt-repository "deb https://repo.sovrin.org/sdk/deb xenial stable"
sudo apt-get update
-sudo apt-get install -y libindy
+sudo apt-get install -y libindy ursa
```
-See [Libindy](https://github.com/hyperledger/indy-sdk) on how it can be installed on other platforms.
-
+See [Libindy](https://github.com/hyperledger/indy-sdk) on how libindy can be installed on other platforms.
+See [Ursa build environment](https://github.com/hyperledger/ursa/blob/master/docs/build-environment.md)
+on how Ursa can be installed and built for other platforms.
### Using a virtual environment (recommended)
* [X] diff --git a/docs/source/setup-iptables.md b/docs/source/setup-iptables.md
I'm having a hard time understanding this diff... It doesn't look right to me. Ohhhhhh, okay, we do remove those lines and add ones that look almost the same... 🤦
We don't have to think about this one too hard.
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p docs/source/setup-iptables.md
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
Merge: 089d12e4 16697cf9
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Feb 6 09:41:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.6.83
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
We can see that there is only a single commit touching this file that is in origin/stable but not in origin/ubuntu-20.04-upgrade, and it is a merge with no patch of its own, so we don't have to worry about it; the same content should have reached 20.04 through its own merge...
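If we ever want to double-check that reasoning, a hedged variant of the same command should show which side actually carries non-merge patches for this file (patch-equivalent commits get dropped by --cherry-pick; "<" means stable-only, ">" means 20.04-only):
git log --left-right --cherry-pick --no-merges --oneline origin/stable...origin/ubuntu-20.04-upgrade -- docs/source/setup-iptables.md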
index 768cd331..058c8b31 100644
--- a/docs/source/setup-iptables.md
+++ b/docs/source/setup-iptables.md
@@ -1,70 +1,71 @@
# Setup iptables rules (recommended)
-It is strongly recommended to add iptables (or some other firewall) rule that limits the number of simultaneous clients
-connections for client port.
-There are at least two important reasons for this:
- - preventing the indy-node process from reaching of open file descriptors limit caused by clients connections
- - preventing the indy-node process from large memory usage as ZeroMQ creates the separate queue for each TCP connection.
+It is strongly recommended to add iptables (or some other firewall) rules to limit the number of simultaneous clients
+connections to your node's client port.
-NOTE: limitation of the number of *simultaneous clients connections* does not mean that we limit the
-number of *simultaneous clients* the indy-node works with in any time. The IndySDK client does not keep
-connection infinitely, it uses the same connection for request-response session with some optimisations,
-so it's just about **connections**, **not** about **clients**.
+There are at least two important reasons for this:
+ - preventing the indy-node process from exceeding the limit of open file descriptors due to an excessive number of clients connections.
+ - controlling the indy-node process's memory use, as ZeroMQ creates a separate queue for each TCP connection.
-Also iptables can be used to deal with various DoS attacks (e.g. syn flood) but rules' parameters are not estimated yet.
+NOTE: The limitation of the number of *simultaneous clients connections* does not mean that we limit the
+number of *simultaneous clients* indy-node works with in any time. Connections are not left open infinitely. The same connection is used for a request-response session with some optimisations and then closed, therefore it's just about **connections**, **not** about **clients**.
-NOTE: you should be a root to operate with iptables.
+NOTE: You will need to have sudo privileges to work with iptables.
+## Using indy scripts
-## Setting up clients connections limit
+For ease of use and for people that are not familiar with iptables we've added two scripts:
+ - [`setup_iptables`](https://github.com/hyperledger/indy-node/blob/main/scripts/setup_iptables):
+ - By default this scripts adds rules to iptables to limit the number of simultaneous clients connections for a specified port.
+ - To get a full list of options run `./setup_iptables -h` from the scripts directory.
-#### Using raw iptables command or iptables front-end
+ - [`setup_indy_node_iptables`](https://github.com/hyperledger/indy-node/blob/main/scripts/setup_indy_node_iptables):
+ - A wrapper around `setup_iptables` which gets client port and connection limit settings from the `/etc/indy/indy.env` that is created by the `init_indy_node` script.
-In case of deb installation the indy-node environment file /etc/indy/indy.env is created by `init_indy_node` script.
-This environment file contains client port (NODE_CLIENT_PORT) and recommended clients connections limit (CLIENT_CONNECTIONS_LIMIT).
-This parameters can be used to add the iptables rule for chain INPUT:
+Which one you use depends on how you installed indy-node on your server. Refer to the [For deb package based installations](#for-deb-package-based-installations), and [For pip based installations](#for-pip-based-installations) sections below.
-```
-# iptables -I INPUT -p tcp --syn --dport 9702 -m connlimit --connlimit-above 500 --connlimit-mask 0 -j REJECT --reject-with tcp-reset
-```
-Some key options:
- - --dport - a port for which limit is set
- - --connlimit-above - connections limit, exceeding new connections will be rejected using TCP reset
- - --connlimit-mask - group hosts using the prefix length, 0 means "all subnets"
+### Updating the scripts and configuration
-Corresponding fields should be set in case of some iptables front-end usage.
+Before you run the scripts you should ensure you are using the latest scripts and recommended settings by following these steps while logged into your node:
+1. Make a backup copy of the existing `setup_iptables` script by executing the command:
+ ```
+ sudo cp /usr/local/bin/setup_iptables /usr/local/bin/setup_iptables_$(date "+%Y%m%d-%H%M%S")
+ ```
-#### Using indy scripts
+1. Update the default client connection limit to 15000 in `/etc/indy/indy.env`.
+ - NOTE:
+ - `/etc/indy/indy.env` only exists for deb package based installations.
+ - `\1` is an excape sequence `\115000` is not a typo.
+ ```
+ sudo sed -i -re "s/(^CLIENT_CONNECTIONS_LIMIT=).*$/\115000/" /etc/indy/indy.env
+ ```
-For ease of use and for people that are not familiar with iptables we've
-added two scripts:
- - setup_iptables: adds a rule to iptables to limit the number of simultaneous
- clients connections for specified port;
- - setup_indy_node_iptables: a wrapper for setup_iptables script which gets client
- port and recommended connections limit from indy-node environment file that is created by init_indy_node script.
+1. Download the latest version of the script.
+ ```
+ sudo curl -o /usr/local/bin/setup_iptables https://raw.githubusercontent.com/hyperledger/indy-node/main/scripts/setup_iptables
+ ```
+ The sha256 checksum for the current version of the script is `a0e4451cc49897dc38946091b245368c1f1360201f374a3ad121925f9aa80664`
-Links to these scripts:
- - https://github.com/hyperledger/indy-node/blob/master/scripts/setup_iptables
- - https://github.com/hyperledger/indy-node/blob/master/scripts/setup_indy_node_iptables
-
-NOTE: for now the iptables chain for which the rule is added is not parameterized,
-the rule is always added for INPUT chain, we can parameterize it in future if needed.
+### For deb package based installations
-###### For deb installation
-To setup the limit of the number of simultaneous clients connections it is enough to run the following script without parameters
+Run:
```
-# setup_indy_node_iptables
+setup_indy_node_iptables
```
-This script gets client port and recommended connections limit from the indy-node environment file.
+NOTE:
+ - This script should only be called *after* your node has been initialized using `init_indy_node`, to ensure `/etc/indy/indy.env` has been created.
-NOTE: this script should be called *after* `init_indy_node` script.
+### For pip based installations
-###### For pip installation
-The `setup_indy_node_iptables` script can not be used in case of pip installation as indy-node environment file does not exist,
-use the `setup_iptables` script instead (9702 is a client port, 500 is recommended limit for now)
+For pip based installations `/etc/indy/indy.env` does not exist, therefore `setup_indy_node_iptables` cannot be used. Instead you run `setup_iptables` directly.
+
+For example, if your client port is 9702, you would run:
```
-# setup_iptables 9702 500
+setup_iptables 9702 15000
```
-In fact, the `setup_indy_node_iptables` script is just a wrapper for the `setup_iptables` script.
+
+## Using raw iptables command or iptables front-end
+
+If you are confident with using iptables, you may add additional rules as you see fit using iptables directly.
\ No newline at end of file
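Side note: the rewritten doc embeds a sha256 for the latest setup_iptables script. If anyone wants to verify it, something like this should do, assuming the raw.githubusercontent.com URL quoted in the doc is still current:
curl -s https://raw.githubusercontent.com/hyperledger/indy-node/main/scripts/setup_iptables | sha256sum
and then compare the result against a0e4451cc49897dc38946091b245368c1f1360201f374a3ad121925f9aa80664 from the doc.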
* [X] diff --git a/docs/source/transactions.md b/docs/source/transactions.md
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p docs/source/transactions.md
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: ff88db39 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 8940e559 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: 1cd837b1 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit 1cd837b1d093f1afc408b5395cbc00353f7e3cca
| Merge: a8784f11 df9959d0
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed May 29 09:36:39 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.8.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: c009c3f0 d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
Merge: 089d12e4 16697cf9
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Feb 6 09:41:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.6.83
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
These are all merge commits, with no patches, so we look good.
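To be extra sure none of those merges carry content of their own, the merge-free variant of the same check (untested here, same flags plus --no-merges) should come back empty:
git log --oneline --no-merges origin/stable ^origin/ubuntu-20.04-upgrade -- docs/source/transactions.md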
index a089f9c9..f7ca28c7 100644
--- a/docs/source/transactions.md
+++ b/docs/source/transactions.md
@@ -11,7 +11,12 @@
* [CLAIM_DEF](#claim_def)
* [REVOC_REG_DEF](#revoc_reg_def)
* [REVOC_REG_ENTRY](#revoc_reg_entry)
- * [SET_CONTEXT](#set_context)
+ * [JSON_LD_CONTEXT](#json_ld_context)
+ * [RICH_SCHEMA](#rich_schema)
+ * [RICH_SCHEMA_ENCODING](#rich_schema_encoding)
+ * [RICH_SCHEMA_MAPPING](#rich_schema_mapping)
+ * [RICH_SCHEMA_CRED_DEF](#rich_schema_cred_def)
+ * [RICH_SCHEMA_PRES_DEF](#rich_schema_pres_def)
* [Pool Ledger](#pool-ledger)
* [NODE](#node)
@@ -25,9 +30,8 @@
* [TRANSACTION_AUTHOR_AGREEMENT](#transaction_author_agreement)
* [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_AML)
* [TRANSACTION_AUTHOR_AGREEMENT_DISABLE](#transaction_author_agreement_disable)
+ * [LEDGERS_FREEZE](#ledgers_freeze)
-* [Actions](#actions)
- * [POOL_RESTART](#pool_restart)
## General Information
@@ -67,10 +71,10 @@ transaction specific data:
"ver": <...>,
"txn": {
"type": <...>,
+ "ver": <...>,
"protocolVersion": <...>,
"data": {
- "ver": <...>,
<txn-specific fields>
},
@@ -128,8 +132,20 @@ transaction specific data:
- REVOC_REG_DEF = "114"
- AUTH_RULE = "120"
- AUTH_RULES = "122"
- - SET_CONTEXT = "200"
-
+ - JSON_LD_CONTEXT = "200"
+ - RICH_SCHEMA = "201"
+ - RICH_SCHEMA_ENCODING = "202"
+ - RICH_SCHEMA_MAPPING = "203"
+ - RICH_SCHEMA_CRED_DEF = "204"
+ - RICH_SCHEMA_PRES_DEF = "205"
+
+ - `ver` (string)
+
+ Transaction's payload version as defined in the input request.
+ If the input request doesn't have the version specified, then default one will be used.
+ Some transactions (for example TRANSACTION_AUTHOR_AGREEMENT) have non-default transaction payload version
+ defined in source code as a result of evolution of business logic and features.
+
- `protocolVersion` (integer; optional):
The version of client-to-node or node-to-node protocol. Each new version may introduce a new feature in requests/replies/data.
@@ -224,7 +240,7 @@ Creates a new NYM record for a specific user, endorser, steward or trustee.
Note that only trustees and stewards can create new endorsers and a trustee can be created only by other trustees (see [roles](auth_rules.md)).
The transaction can be used for
-creation of new DIDs, setting and rotation of verification key, setting and changing of roles.
+creation of new DIDs, setting and rotation of verification key, setting and changing of roles, setting and changing of DID documents.
- `dest` (base58-encoded string):
@@ -259,6 +275,18 @@ creation of new DIDs, setting and rotation of verification key, setting and chan
NYM's alias.
+- `diddocContent` (json string; optional):
+
+ The diddocContent item is stored directly in the ledger state and has a maximum size of 10 KiB (10 x 1024 bytes).
+
+- `version` (integer; optional):
+
+ The NYM transaction version specifies the required level of validation of the relationship between the namespace identifier component of the DID and the intial public key (verkey). This field is optional, but if the NYM transaction version is provided, it must be set upon creation and cannot be updated. The accepted values are as follows:
+
+ 0 or NYM transaction version is not set: No validation of namespace identifier and initial verkey binding is performed.
+ 1: Validation is performed according to the did:sov method, in which the DID must be the first 16 bytes of the Verification Method public key.
+ 2: Validation is performed according to the did:indy, in which the namespace identifier component of the DID (last element) is derived from the initial public key of the DID, using the base58 encoding of the first 16 bytes of the SHA256 of the Verification Method public key (did = Base58(Truncate_msb(16(SHA256(publicKey))))). This DID is considered self-certifying.
+
If there is no NYM transaction for the specified DID (`did`) yet, then this can be considered as the creation of a new DID.
If there is already a NYM transaction with the specified DID (`did`), then this is is considered an update of that DID.
@@ -274,10 +302,10 @@ So, if key rotation needs to be performed, the owner of the DID needs to send a
"ver": 1,
"txn": {
"type":"1",
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver": 1,
"dest":"GEzcdDLhCpGCYRHW82kjHd",
"verkey":"~HmUWn928bnFT6Ephf65YXv",
"role":101,
@@ -346,10 +374,10 @@ Adds an attribute to a NYM record
"ver": 1,
"txn": {
"type":"100",
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver":1,
"dest":"GEzcdDLhCpGCYRHW82kjHd",
"raw":"3cba1e3cf23c8ce24b7e08171d823fbd9a4929aafd9f27516e30699d3a42026a",
},
@@ -403,10 +431,10 @@ So, if the Schema needs to be evolved, a new Schema with a new version or new na
"ver": 1,
"txn": {
"type":101,
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver":1,
"data": {
"attr_names": ["undergrad","last_name","first_name","birth_date","postgrad","expiry_date"],
"name":"Degree",
@@ -472,10 +500,10 @@ Adds a claim definition (in particular, public key), that Issuer creates and pub
"ver": 1,
"txn": {
"type":102,
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver":1,
"data": {
"primary": {
...
@@ -486,7 +514,7 @@ Adds a claim definition (in particular, public key), that Issuer creates and pub
},
"ref":12,
"signature_type":"CL",
- 'tag': 'some_tag'
+ "tag": "some_tag"
},
"metadata": {
@@ -544,20 +572,20 @@ It contains public keys, maximum number of credentials the registry may contain,
"ver": 1,
"txn": {
"type":113,
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver":1,
- 'id': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1',
- 'credDefId': 'FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag'
- 'revocDefType': 'CL_ACCUM',
- 'tag': 'tag1',
- 'value': {
- 'maxCredNum': 1000000,
- 'tailsHash': '6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297',
- 'tailsLocation': 'http://tails.location.com',
- 'issuanceType': 'ISSUANCE_BY_DEFAULT',
- 'publicKeys': {},
+ "id": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1",
+ "credDefId": "FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag"
+ "revocDefType": "CL_ACCUM",
+ "tag": "tag1",
+ "value": {
+ "maxCredNum": 1000000,
+ "tailsHash": "6619ad3cf7e02fc29931a5cdc7bb70ba4b9283bda3badae297",
+ "tailsLocation": "http://tails.location.com",
+ "issuanceType": "ISSUANCE_BY_DEFAULT",
+ "publicKeys": {},
},
},
@@ -565,8 +593,8 @@ It contains public keys, maximum number of credentials the registry may contain,
"reqId":1513945121191691,
"from":"L5AD5g65TDQr1PPHHRoiGf",
"endorser": "D6HG5g65TDQr1PPHHRoiGf",
- 'digest': '4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453',
- 'payloadDigest': '21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685',
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
"taaAcceptance": {
"taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
"mechanism": "EULA",
@@ -611,17 +639,17 @@ The RevocReg entry containing the new accumulator value and issued/revoked indic
"ver": 1,
"txn": {
"type":114,
+ "ver": 1,
"protocolVersion":2,
"data": {
- "ver":1,
- 'revocRegDefId': 'L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1'
- 'revocDefType': 'CL_ACCUM',
- 'value': {
- 'accum': 'accum_value',
- 'prevAccum': 'prev_acuum_value',
- 'issued': [],
- 'revoked': [10, 36, 3478],
+ "revocRegDefId": "L5AD5g65TDQr1PPHHRoiGf:3:FC4aWomrA13YyvYC1Mxw7:3:CL:14:some_tag:CL_ACCUM:tag1"
+ "revocDefType": "CL_ACCUM",
+ "value": {
+ "accum": "accum_value",
+ "prevAccum": "prev_acuum_value",
+ "issued": [],
+ "revoked": [10, 36, 3478],
},
},
@@ -629,8 +657,8 @@ The RevocReg entry containing the new accumulator value and issued/revoked indic
"reqId":1513945121191691,
"from":"L5AD5g65TDQr1PPHHRoiGf",
"endorser": "D6HG5g65TDQr1PPHHRoiGf",
- 'digest': '4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453',
- 'payloadDigest': '21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685',
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
"taaAcceptance": {
"taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
"mechanism": "EULA",
@@ -654,29 +682,39 @@ The RevocReg entry containing the new accumulator value and issued/revoked indic
```
-#### SET_CONTEXT
-Adds a Context
+#### JSON_LD_CONTEXT
+Adds a JSON LD Context as part of Rich Schema feature.
It's not possible to update an existing Context.
-If the Context needs to be evolved, a new Context with a new version or new name needs to be created.
+If the Context needs to be evolved, a new Context with a new id and name-version needs to be created.
+
-- `data` (dict):
- Dictionary with Context's data:
+- `id` (string):
+
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
- - `@context`: This value must be either:
- 1) a URI (it should dereference to a Context object)
- 2) a Context object (a dict)
- 3) an array of Context objects and/or Context URIs
+- `content` (json-serialized string):
+
+ Context object as JSON serialized in canonical form. It must have `@context` as a top level key.
+ The `@context` value must be either:
+ 1) a URI (it should dereference to a Context object)
+ 2) a Context object (a dict)
+ 3) an array of Context objects and/or Context URIs
-- `meta` (dict)
+- `rsType` (string):
- Dictionary with Context's metadata
+ Context's type. Currently expected to be `ctx`.
- - `name`: Context's name string
- - `version`: Context's version string
- - `type`: 'ctx'
+- `rsName` (string):
+ Context's name
+
+- `rsVersion` (string):
+
+ Context's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
**Example**:
```
@@ -684,11 +722,12 @@ If the Context needs to be evolved, a new Context with a new version or new name
"ver": 1,
"txn": {
"type":200,
+ "ver":1,
"protocolVersion":2,
"data": {
- "ver":1,
- "data":{
+ "id": "did:sov:GGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
"@context": [
{
"@version": 1.1
@@ -700,12 +739,552 @@ If the Context needs to be evolved, a new Context with a new version or new name
"rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
}
]
- },
- "meta": {
- "name":"SimpleContext",
- "version":"1.0",
- "type": "ctx
- },
+ }",
+ "rsName":"SimpleContext",
+ "rsVersion":"1.0",
+ "rsType": "ctx"
+ },
+
+ "metadata": {
+ "reqId":1513945121191691,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ "taaAcceptance": {
+ "taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "mechanism": "EULA",
+ "time": 1513942017
+ }
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "4X3skpoEK2DRgZxQ9PwuEvCJpL8JHdQ8X4HDDFyztgqE15DM2ZnkvrAh9bQY16egVinZTzwHqznmnkaFM4jjyDgd"
+ }]
+ }
+}
+```
+
+
+#### RICH_SCHEMA
+Adds a Rich Schema object as part of Rich Schema feature.
+
+It's not possible to update an existing Rich Schema.
+If the Rich Schema needs to be evolved, a new Rich Schema with a new id and name-version needs to be created.
+
+
+
+- `id` (string):
+
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Rich Schema object as JSON serialized in canonical form.
+ This value must be a json-ld, rich schema object. json-ld supports many parameters that are optional for a rich schema txn.
+ However, the following parameters must be there:
+
+ - `@id`: The value of this property must be (or map to, via a context object) a URI.
+ - `@type`: The value of this property must be (or map to, via a context object) a URI.
+ - `@context`(optional): If present, the value of this property must be a context object or a URI which can be dereferenced to obtain a context object.
+
+
+- `rsType` (string):
+
+ Rich Schema's type. Currently expected to be `sch`.
+
+- `rsName` (string):
+
+ Rich Schema's name
+
+- `rsVersion` (string):
+
+ Rich Schema's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
+
+**Example**:
+```
+{
+ "ver": 1,
+ "txn": {
+ "type":201,
+ "ver":1,
+ "protocolVersion":2,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "@id": "test_unique_id",
+ "@context": "ctx:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "@type": "rdfs:Class",
+ "rdfs:comment": "ISO18013 International Driver License",
+ "rdfs:label": "Driver License",
+ "rdfs:subClassOf": {
+ "@id": "sch:Thing"
+ },
+ "driver": "Driver",
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "issuingAuthority": "Text",
+ "licenseNumber": "Text",
+ "categoriesOfVehicles": {
+ "vehicleType": "Text",
+ "vehicleType-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^(A|B|C|D|BE|CE|DE|AM|A1|A2|B1|C1|D1|C1E|D1E)$"
+ },
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "restrictions": "Text",
+ "restrictions-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^([A-Z]|[1-9])$"
+ }
+ },
+ "administrativeNumber": "Text"
+ }",
+ "rsName":"SimpleRichSchema",
+ "rsVersion":"1.0",
+ "rsType": "sch"
+ },
+
+ "metadata": {
+ "reqId":1513945121191691,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ "taaAcceptance": {
+ "taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "mechanism": "EULA",
+ "time": 1513942017
+ }
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "4X3skpoEK2DRgZxQ9PwuEvCJpL8JHdQ8X4HDDFyztgqE15DM2ZnkvrAh9bQY16egVinZTzwHqznmnkaFM4jjyDgd"
+ }]
+ }
+}
+```
+
+#### RICH_SCHEMA_ENCODING
+Adds an Encoding object as part of Rich Schema feature.
+
+It's not possible to update an existing Encoding.
+If the Encoding needs to be evolved, a new Encoding with a new id and name-version needs to be created.
+
+
+
+- `id` (string):
+
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Encoding object as JSON serialized in canonical form.
+
+ - `input`: a description of the input value
+ - `output`: a description of the output value
+ - `algorithm`:
+ - `documentation`: a URL which references a specific github commit of
+ the documentation that fully describes the transformation algorithm.
+ - `implementation`: a URL that links to a reference implementation of the
+ transformation algorithm. It is not necessary to use the implementation
+ linked to here, as long as the implementation used implements the same
+ transformation algorithm.
+ - `description`: a brief description of the transformation algorithm.
+ - `testVectors`: a URL which references a specific github commit of a
+selection of test vectors that may be used to provide assurance that a
+transformation algorithm implementation is correct.
+
+- `rsType` (string):
+
+ Encoding's type. Currently expected to be `enc`.
+
+- `rsName` (string):
+
+ Encoding's name
+
+- `rsVersion` (string):
+
+ Encoding's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
+
+**Example**:
+```
+{
+ "ver": 1,
+ "txn": {
+ "type":202,
+ "ver":1,
+ "protocolVersion":2,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "input": {
+ "id": "DateRFC3339",
+ "type": "string"
+ },
+ "output": {
+ "id": "UnixTime",
+ "type": "256-bit integer"
+ },
+ "algorithm": {
+ "description": "This encoding transforms an
+ RFC3339-formatted datetime object into the number
+ of seconds since January 1, 1970 (the Unix epoch).",
+ "documentation": URL to specific github commit,
+ "implementation": URL to implementation
+ },
+ "test_vectors": URL to specific github commit
+ }",
+ "rsName":"SimpleEncoding",
+ "rsVersion":"1.0",
+ "rsType": "enc"
+ },
+
+ "metadata": {
+ "reqId":1513945121191691,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ "taaAcceptance": {
+ "taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "mechanism": "EULA",
+ "time": 1513942017
+ }
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "4X3skpoEK2DRgZxQ9PwuEvCJpL8JHdQ8X4HDDFyztgqE15DM2ZnkvrAh9bQY16egVinZTzwHqznmnkaFM4jjyDgd"
+ }]
+ }
+}
+```
+
+#### RICH_SCHEMA_MAPPING
+Adds a Mapping object as part of Rich Schema feature.
+
+It's not possible to update an existing Mapping.
+If the Mapping needs to be evolved, a new Mapping with a new id and name-version needs to be created.
+
+
+
+- `id` (string):
+
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Mapping object as JSON serialized in canonical form.
+ This value must be a json-ld object. json-ld supports many parameters that are optional for a rich schema txn.
+ However, the following parameters must be there:
+
+ - `@id`: The value of this property must be (or map to, via a context object) a URI.
+ - `@type`: The value of this property must be (or map to, via a context object) a URI.
+ - `@context`(optional): If present, the value of this property must be a context object or a URI which can be dereferenced to obtain a context object.
+ - `schema`: An `id` of the corresponding Rich Schema
+ - `attributes` (dict): A dict of all the schema attributes the Mapping object is going to map to encodings and use in credentials.
+ An attribute may have nested attributes matching the schema structure.
+ It must also contain the following default attributes required by any W3C compatible
+ verifiable credential (plus any additional attributes that may have been included from the
+ W3C verifiable credentials data model):
+
+ - `issuer`
+ - `issuanceDate`
+ - any additional attributes
+
+ Every leaf attribute's value is an array of the following pairs:
+
+ - `enc` (string): Encoding object (referenced by its `id`) to be used for representation of the attribute as an integer.
+ - `rank` (int): Rank of the attribute to define the order in which the attribute is signed by the Issuer. It is important that no two `rank` values may be identical.
+
+
+- `rsType` (string):
+
+ Mapping's type. Currently expected to be `map`.
+
+- `rsName` (string):
+
+ Mapping's name
+
+- `rsVersion` (string):
+
+ Mapping's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
+
+**Example**:
+```
+{
+ "ver": 1,
+ "txn": {
+ "type":203,
+ "ver":1,
+ "protocolVersion":2,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ '@id': "did:sov:5e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "schema": "did:sov:4e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "attribuites" : {
+ "issuer": [{
+ "enc": "did:sov:9x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 1
+ }],
+ "issuanceDate": [{
+ "enc": "did:sov:119F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 2
+ }],
+ "expirationDate": [{
+ "enc": "did:sov:119F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 11
+ }],
+ "driver": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 5
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:2x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 4
+ }],
+ "issuingAuthority": [{
+ "enc": "did:sov:3x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 3
+ }],
+ "licenseNumber": [
+ {
+ "enc": "did:sov:4x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 9
+ },
+ {
+ "enc": "did:sov:5x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 10
+ },
+ ],
+ "categoriesOfVehicles": {
+ "vehicleType": [{
+ "enc": "did:sov:6x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 6
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:7x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 7
+ }],
+ },
+ "administrativeNumber": [{
+ "enc": "did:sov:8x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 8
+ }]
+ }
+ }",
+ "rsName":"SimpleMapping",
+ "rsVersion":"1.0",
+ "rsType": "map"
+ },
+
+ "metadata": {
+ "reqId":1513945121191691,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ "taaAcceptance": {
+ "taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "mechanism": "EULA",
+ "time": 1513942017
+ }
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "4X3skpoEK2DRgZxQ9PwuEvCJpL8JHdQ8X4HDDFyztgqE15DM2ZnkvrAh9bQY16egVinZTzwHqznmnkaFM4jjyDgd"
+ }]
+ }
+}
+```
+
+#### RICH_SCHEMA_CRED_DEF
+Adds a Credential Definition object as part of Rich Schema feature.
+
+Credential Definition is considered as a mutable object as the Issuer may rotate keys present there.
+However, rotation of Issuer's keys should be done carefully as it will invalidate all
+credentials issued for this key.
+
+
+- `id` (string):
+
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Credential Definition object as JSON serialized in canonical form.
+
+ - `signatureType` (string): Type of the ZKP signature. `CL` (Camenisch-Lysyanskaya) is the only supported type now.
+ - `mapping` (string): An `id` of the corresponding Mapping
+ - `schema` (string): An `id` of the corresponding Rich Schema. The `mapping` must reference the same Schema.
+ - `publicKey` (dict): Issuer's public keys. Consists ot primary and revocation keys.
+ - `primary` (dict): primary key
+ - `revocation` (dict, optional): revocation key
+
+- `rsType` (string):
+
+ Credential Definition's type. Currently expected to be `cdf`.
+
+- `rsName` (string):
+
+ Credential Definition's name
+
+- `rsVersion` (string):
+
+ Credential Definition's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
+
+**Example**:
+```
+{
+ "ver": 1,
+ "txn": {
+ "type":204,
+ "ver":1,
+ "protocolVersion":2,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ "signatureType": "CL",
+ "mapping": "did:sov:UVj5w8DRzcmPVDpUMr4AZhJ",
+ "schema": "did:sov:U5x5w8DRzcmPVDpUMr4AZhJ",
+ "publicKey": {
+ "primary": "...",
+ "revocation": "..."
+ }
+ }",
+ "rsName":"SimpleCredDef",
+ "rsVersion":"1.0",
+ "rsType": "cdf"
+ },
+
+ "metadata": {
+ "reqId":1513945121191691,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "endorser": "D6HG5g65TDQr1PPHHRoiGf",
+ "digest": "4ba05d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ "taaAcceptance": {
+ "taaDigest": "6sh15d9b2c27e52aa8778708fb4b3e5d7001eecd02784d8e311d27b9090d9453",
+ "mechanism": "EULA",
+ "time": 1513942017
+ }
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1513945121,
+ "seqNo": 10,
+ "txnId":"L5AD5g65TDQr1PPHHRoiGf1|Degree|1.0",
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "4X3skpoEK2DRgZxQ9PwuEvCJpL8JHdQ8X4HDDFyztgqE15DM2ZnkvrAh9bQY16egVinZTzwHqznmnkaFM4jjyDgd"
+ }]
+ }
+}
+```
+
+#### RICH_SCHEMA_PRES_DEF
+Adds a Presentation Definition object as part of Rich Schema feature.
+
+Presentation Definition is considered as a mutable object since restrictions to Issuers, Schemas and Credential Definitions
+to be used in proof may evolve.
+ For example, Issuer's key for a given Credential Definition may be compromised,
+ so Presentation Definition can be updated to exclude this Credential Definition from the list of recommended ones.
+
+- `id` (string):
+
+ A unique ID (for example a DID with a id-string being base58 representation of the SHA2-256 hash of the `content` field)
+
+- `content` (json-serialized string):
+
+ Presentation Definition object as JSON serialized in canonical form.
+
+- `rsType` (string):
+
+ Presentation Definition's type. Currently expected to be `pdf`.
+
+- `rsName` (string):
+
+ Presentation Definition's name
+
+- `rsVersion` (string):
+
+ Presentation Definition's version
+
+`rsType`, `rsName` and `rsVersion` must be unique among all rich schema objects on the ledger.
+
+
+**Example**:
+```
+{
+ "ver": 1,
+ "txn": {
+ "type":205,
+ "ver":1,
+ "protocolVersion":2,
+
+ "data": {
+ "id": "did:sov:HGAD5g65TDQr1PPHHRoiGf",
+ "content":"{
+ TBD
+ }",
+ "rsName":"SimplePresDef",
+ "rsVersion":"1.0",
+ "rsType": "pdf"
},
"metadata": {
@@ -869,10 +1448,10 @@ Command to upgrade the Pool (sent by Trustee). It upgrades the specified Nodes (
"ver": 1,
"txn": {
"type":109,
+ "ver":1,
"protocolVersion":2,
"data": {
- "ver":1,
"name":"upgrade-13",
"action":"start",
"version":"1.3",
@@ -923,10 +1502,10 @@ Status of each Node's upgrade (sent by each upgraded Node)
"ver":1,
"txn": {
"type":110,
+ "ver":1,
"protocolVersion":2,
"data": {
- "ver":1,
"action":"complete",
"version":"1.2"
},
@@ -977,10 +1556,10 @@ Command to change Pool's configuration
"ver":1,
"txn": {
"type":111,
+ "ver":1,
"protocolVersion":2,
"data": {
- "ver":1,
"writes":false,
"force":true,
},
@@ -1050,17 +1629,17 @@ The `constraint_id` fields is where one can define the desired auth constraint f
Constraint Type. As of now, the following constraint types are supported:
- - 'ROLE': a constraint defining how many siganatures of a given role are required
- - 'OR': logical disjunction for all constraints from `auth_constraints`
- - 'AND': logical conjunction for all constraints from `auth_constraints`
+ - "ROLE": a constraint defining how many siganatures of a given role are required
+ - "OR": logical disjunction for all constraints from `auth_constraints`
+ - "AND": logical conjunction for all constraints from `auth_constraints`
- - fields if `'constraint_id': 'OR'` or `'constraint_id': 'AND'`
+ - fields if `"constraint_id": "OR"` or `"constraint_id": "AND"`
- `auth_constraints` (list)
A list of constraints. Any number of nested constraints is supported recursively
- - fields if `'constraint_id': 'ROLE'`:
+ - fields if `"constraint_id": "ROLE"`:
- `role` (string enum)
@@ -1087,7 +1666,7 @@ The `constraint_id` fields is where one can define the desired auth constraint f
Dictionary for additional parameters of the constraint. Can be used by plugins to add additional restrictions.
- - fields if `'constraint_id': 'FORBIDDEN'`:
+ - fields if `"constraint_id": "FORBIDDEN"`:
no fields
@@ -1099,52 +1678,52 @@ Let's consider an example of changing a value of a NODE transaction's `service`
```
{
- 'txn':{
- 'type':'120',
- 'protocolVersion':2,
- 'data':{
- 'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
- 'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 2,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ "txn":{
+ "type":"120",
+ "protocolVersion":2,
+ "data":{
+ "auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
+ "constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 2,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
},
- 'metadata':{
- 'reqId':252174114,
- 'from':'M9BJDuS24bqbJNvBRsoGg3',
- 'digest':'6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c',
- 'payloadDigest': '21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685',
+ "metadata":{
+ "reqId":252174114,
+ "from":"M9BJDuS24bqbJNvBRsoGg3",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
}
},
- 'txnMetadata':{
- 'txnTime':1551785798,
- 'seqNo':1
+ "txnMetadata":{
+ "txnTime":1551785798,
+ "seqNo":1
},
- 'reqSignature':{
- 'type':'ED25519',
- 'values':[
+ "reqSignature":{
+ "type":"ED25519",
+ "values":[
{
- 'value':'4wpLLAtkT6SeiKEXPVsMcCirx9KvkeKKd11Q4VsMXmSv2tnJrRw1TQKFyov4m2BuPP4C5oCiZ6RUwS9w3EPdywnz',
- 'from':'M9BJDuS24bqbJNvBRsoGg3'
+ "value":"4wpLLAtkT6SeiKEXPVsMcCirx9KvkeKKd11Q4VsMXmSv2tnJrRw1TQKFyov4m2BuPP4C5oCiZ6RUwS9w3EPdywnz",
+ "from":"M9BJDuS24bqbJNvBRsoGg3"
}
]
},
- 'ver':'1'
+ "ver":"1"
}
```
@@ -1194,18 +1773,18 @@ Please note, that list elements of `GET_AUTH_RULE` output can be used as an inpu
Constraint Type. As of now, the following constraint types are supported:
- - 'ROLE': a constraint defining how many siganatures of a given role are required
- - 'OR': logical disjunction for all constraints from `auth_constraints`
- - 'AND': logical conjunction for all constraints from `auth_constraints`
- - 'FORBIDDEN': a constraint for not allowed actions
+ - "ROLE": a constraint defining how many siganatures of a given role are required
+ - "OR": logical disjunction for all constraints from `auth_constraints`
+ - "AND": logical conjunction for all constraints from `auth_constraints`
+ - "FORBIDDEN": a constraint for not allowed actions
- - fields if `'constraint_id': 'OR'` or `'constraint_id': 'AND'`
+ - fields if `"constraint_id": "OR"` or `"constraint_id": "AND"`
- `auth_constraints` (list)
A list of constraints. Any number of nested constraints is supported recursively
- - fields if `'constraint_id': 'ROLE'`:
+ - fields if `"constraint_id": "ROLE"`:
- `role` (string enum)
@@ -1232,7 +1811,7 @@ Please note, that list elements of `GET_AUTH_RULE` output can be used as an inpu
Dictionary for additional parameters of the constraint. Can be used by plugins to add additional restrictions.
- - fields if `'constraint_id': 'FORBIDDEN'`:
+ - fields if `"constraint_id": "FORBIDDEN"`:
no fields
@@ -1240,55 +1819,55 @@ Please note, that list elements of `GET_AUTH_RULE` output can be used as an inpu
```
{
- 'txn':{
- 'type':'120',
- 'protocolVersion':2,
- 'data':{
+ "txn":{
+ "type":"120",
+ "protocolVersion":2,
+ "data":{
rules: [
- {'auth_type': '0',
- 'auth_action': 'EDIT',
- 'field' :'services',
- 'old_value': [VALIDATOR],
- 'new_value': []
- 'constraint':{
- 'constraint_id': 'OR',
- 'auth_constraints': [{'constraint_id': 'ROLE',
- 'role': '0',
- 'sig_count': 2,
- 'need_to_be_owner': False,
- 'metadata': {}},
+ {"auth_type": "0",
+ "auth_action": "EDIT",
+ "field" :"services",
+ "old_value": [VALIDATOR],
+ "new_value": []
+ "constraint":{
+ "constraint_id": "OR",
+ "auth_constraints": [{"constraint_id": "ROLE",
+ "role": "0",
+ "sig_count": 2,
+ "need_to_be_owner": False,
+ "metadata": {}},
- {'constraint_id': 'ROLE',
- 'role': '2',
- 'sig_count': 1,
- 'need_to_be_owner': True,
- 'metadata': {}}
+ {"constraint_id": "ROLE",
+ "role": "2",
+ "sig_count": 1,
+ "need_to_be_owner": True,
+ "metadata": {}}
]
},
},
...
]
- 'metadata':{
- 'reqId':252174114,
- 'from':'M9BJDuS24bqbJNvBRsoGg3',
- 'digest':'6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c',
- 'payloadDigest': '21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685',
+ "metadata":{
+ "reqId":252174114,
+ "from":"M9BJDuS24bqbJNvBRsoGg3",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
}
},
- 'txnMetadata':{
- 'txnTime':1551785798,
- 'seqNo':1
+ "txnMetadata":{
+ "txnTime":1551785798,
+ "seqNo":1
},
- 'reqSignature':{
- 'type':'ED25519',
- 'values':[
+ "reqSignature":{
+ "type":"ED25519",
+ "values":[
{
- 'value':'4wpLLAtkT6SeiKEXPVsMcCirx9KvkeKKd11Q4VsMXmSv2tnJrRw1TQKFyov4m2BuPP4C5oCiZ6RUwS9w3EPdywnz',
- 'from':'M9BJDuS24bqbJNvBRsoGg3'
+ "value":"4wpLLAtkT6SeiKEXPVsMcCirx9KvkeKKd11Q4VsMXmSv2tnJrRw1TQKFyov4m2BuPP4C5oCiZ6RUwS9w3EPdywnz",
+ "from":"M9BJDuS24bqbJNvBRsoGg3"
}
]
},
- 'ver':'1'
+ "ver":"1"
}
```
@@ -1354,13 +1933,13 @@ At least one [TRANSACTION_AUTHOR_AGREEMENT_AML](#transaction_author_agreement_am
**New Agreement Example:**
```
{
- "ver": 2,
+ "ver": 1,
"txn": {
"type":4,
+ "ver": 2,
"protocolVersion":2,
"data": {
- "ver": 2,
"version": "1.0",
"text": "Please read carefully before writing anything to the ledger",
"ratification_ts": 1514304094738044
@@ -1527,31 +2106,40 @@ A new Agreement needs to be sent instead.
}
```
-## Actions
-
-The actions are not written to the Ledger, so this is not a transaction, just a command.
+#### LEDGERS_FREEZE
-#### POOL_RESTART
-POOL_RESTART is the command to restart all nodes at the time specified in field "datetime"(sent by Trustee).
-
-- `datetime` (string):
-
- Restart time in datetime frmat/
- To restart as early as possible, send message without the "datetime" field or put in it value "0" or ""(empty string) or the past date on this place.
- The restart is performed immediately and there is no guarantee of receiving an answer with Reply.
-
-
-- `action` (enum: `start` or `cancel`):
-
- Starts or cancels the Restart.
+Freeze deprecated ledgers (default ledgers such as the domain, config, pool, and audit ledgers cannot be frozen). If a ledger is frozen it can be used for reading but not for writing. Frozen ledgers will not be caught up by new nodes and they can't be unfrozen. Frozen ledgers can be removed without breaking consensus, but this would prevent third parties from auditing the ledger history. [More information is in the Indy Plenum documenation](https://hyperledger/indy-plenum/tree/master/docs/source/transaction_freeze_ledgers.md).
**Example:**
```
{
- "reqId": 98262,
- "type": "118",
- "identifier": "M9BJDuS24bqbJNvBRsoGg3",
- "datetime": "2018-03-29T15:38:34.464106+00:00",
- "action": "start"
+ "ver": 1,
+ "txn": {
+ "type":9,
+ "protocolVersion":2,
+
+ "data": {
+ "ver": 1,
+ "ledgers_ids": [1,2,3,4],
+ },
+
+ "metadata": {
+ "reqId":1513945121191691,
+ "from":"L5AD5g65TDQr1PPHHRoiGf",
+ "digest":"6cee82226c6e276c983f46d03e3b3d10436d90b67bf33dc67ce9901b44dbc97c",
+ "payloadDigest": "21f0f5c158ed6ad49ff855baf09a2ef9b4ed1a8015ac24bccc2e0106cd905685",
+ },
+ },
+ "txnMetadata": {
+ "txnTime":1577836799,
+ "seqNo": 10,
+ },
+ "reqSignature": {
+ "type": "ED25519",
+ "values": [{
+ "from": "L5AD5g65TDQr1PPHHRoiGf",
+ "value": "4X3skpoEK2DRgZxQ9PwuEvCJpL8JHdQ8X4HDDFyztgqE15DM2ZnkvrAh9bQY16egVinZTzwHqznmnkaFM4jjyDgd"
+ }]
+ }
}
```
* [X] diff --git a/docs/source/troubleshooting.md b/docs/source/troubleshooting.md
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p docs/source/troubleshooting.md
There are no commits touching this file in stable that aren't in ubuntu 20.04.
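Which makes sense, since the file is brand new in the 20.04 branch. A quick confirmation that it doesn't exist on stable at all (hedged sketch, plain git plumbing):
git cat-file -e origin/stable:docs/source/troubleshooting.md || echo "not on stable"
git cat-file -e origin/ubuntu-20.04-upgrade:docs/source/troubleshooting.md && echo "present on 20.04"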
new file mode 100644
index 00000000..95abad72
--- /dev/null
+++ b/docs/source/troubleshooting.md
@@ -0,0 +1,208 @@
+
+# Indy Node troubleshooting guide
+
+## Background info
+
+From deployment point of view Indy Node is highly available replicated database, capable of withstanding crashes and malicious behavior of individual **validator** nodes, collectively called **pool**.
+In order to do so Indy Node uses protocol called RBFT, which is leader-based byzantine fault tolerant protocol for **ordering** transactions - agreeing upon global order between them on all non-faulty nodes, so that nodes will always write them in same order into their **ledgers** (which are basically transaction logs with merkle tree on top, so that consistency can be easily checked) and end up in same **state** (which is a key-value storage based on Merkle Patricia Tree) after executing ordered transactions.
+This protocol guarantees both safety and liveness as long as no more than **f=N/3** nodes (rounding down) are faulty (either unavailable or performing malicious actions), where **N** is number of all validator nodes.
+
+Protocol progresses (in other words - performs writes) as long as leader node (called **master primary**) creates new **batches** (sets of transactions to be executed), which are then agreed upon by at least of **N-f** of nodes, including master primary.
+Performance of pool is capped by performance of current master primary node - if it doesn't propose new batches then there is nothing to agree upon and execute by the rest of the pool.
+If master primary for some reason is down, or tries to perform some malicios actions (including trying to slow down writes) pool elects a new leader using subprotocol called **view change**.
+Note that during view change incoming write requests are rejected, however read requests are normally processed.
+
+In order to catch performance problems RBFT actually employs _f+1_ PBFT protocol instances, one of them called master, and other backups, each with its own leader node, called primary (so master primary is just a leader of master instance, and leaders of other instances are called **backup primaries**).
+Each instance works independently and spans all nodes (meaning all nodes run all instances), but only transactions ordered by master instance are actually executed.
+Sole purpose of backup instances is to compare their performance to master instance and initiate a view change if master primary is too slow.
+Instances are numbered, master is always 0, and backups are assigned numbers starting from 1.
+
+When node is starting up or detects that it is lagging behind the rest of the pool it can start process of **catch up**, which is basically downloading (along with consistency checks) of latest parts of ledgers from other nodes and applying transactions to state.
+More info about internals of Indy Node, including sequence diagrams of different processes can be found [here](https://github.com/hyperledger/indy-plenum/tree/master/docs/source/diagrams).
+
+In order to be capable of automatically [upgrading](https://github.com/hyperledger/indy-node/blob/master/docs/source/pool-upgrade.md) itself Indy Node employs separate service called `indy-node-control`, which runs along with main service called `indy-node`.
+Also it is worth noting that `indy-node` service is configured to automatically restart node process in case it crashes.
+
+### Types of failures
+
+In an ideal world network connections are always stable, nodes do not fail, and software doesn't have bugs.
+Unfortunatelly in real world this is not the case, so Indy Node pool can fail in some circumstances.
+
+Most of failures can divided into following categories (in order of increasing severity):
+- failures induced by environment problems, like misconfigured firewalls preventing nodes connecting to each others
+- transient consensus failures on some nodes, most likely to some unhandled edge cases, which can go away after restarting affected nodes
+- node failures, like inability to properly perform an upgrade or handle some incoming request due to some bug, which require manual intervention, but doesn't affect ledger data
+- ledger corruption, which require touching ledger data in order to fix it
+
+Of course, if number of affected nodes is _f_ or less then from external point of view functionality of pool will be unaffected.
+However if more than _f_ nodes become affected then pool will not be able to do writes, although pool still will be able to process reads until all nodes fail.
+
+### Where to get info
+
+Most useful places get info are the following:
+- [Indy CLI](https://github.com/hyperledger/indy-sdk/tree/master/cli) can be used for sending read and write requests to pool, as well as checking general connectivity.
+- Either VALIDATOR_INFO command sent through Indy CLI (in case of Sovrin network you'll need to have keys for priveleged DID to do so), or [validator-info](https://github.com/hyperledger/indy-node/blob/master/design/validator_info.md) script run on validator node.
+ These tools provide important information like how many nodes are connected to each other, when last write happened (due to freshness check it should happen at least once per 5 minutes), whether a view change is in progress now or other important data, which is useful to assess pool current health and can be a starting point for further investigation, when needed.
+- `journalctl` logs can be useful because they contain tracebacks of indy-node crashes, if they happened, and these logs are really easy to check.
+ Sometime crashes can be due to some bugs in code, but also they can be caused by insufficient resource (either memory or disk space), and if this is the case `journalctl` logs can provide a quick answer.
+- Indy node logs, located in `/var/log/indy/<network name>/`.
+  They can provide a lot of historical information, and very often contain enough clues to properly diagnose a situation, however they can be hard to read for the unprepared.
+  Lately most of the time it was enough to use the `grep` and `sort` command-line tools to analyze them (although we'd recommend using [ripgrep](https://github.com/BurntSushi/ripgrep/releases) instead of plain grep, as it has quite a few usability and performance improvements over traditional grep, while having a compatible interface and no extra dependencies).
+  There is also a [process_logs](https://github.com/hyperledger/indy-plenum/tree/master/scripts/process_logs) utility script, which can be useful as well.
+  More info about them is provided in the next sections.
+- Indy node control tool logs, located in `/var/log/indy/node_control.log`.
+ They can be useful when investigating upgrade-related problems.
+- In case of the Sovrin network there are public websites showing the contents of the different ledgers, even if the network is down (they basically mirror the ledgers in their local databases).
+  We find the most user-friendly one to be [indyscan.io](https://indyscan.io).
+  This can be useful as a quick check of whether some transaction type was written in the past when investigating transaction-specific problems.
+- When in doubt about connectivity issues due to misconfigured firewalls or DPI, a purpose-built tool [test_zmq](https://github.com/hyperledger/indy-plenum/blob/master/scripts/test_zmq/README.md) can be used.
+- Debug metrics.
+ They are turned off by default, but can be turned on by adding `METRICS_COLLECTOR_TYPE = 'kv'` to `/etc/indy/indy_config.py` and restarting node.
+ Tools for processing these metrics are scripts `process_logs` and `build_graph_from_csv` bundled with Indy Node.
+  Debug metrics can be used to find insidious problems like memory leaks and other hard-to-detect problems.
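+
+Enabling them boils down to adding one line to `/etc/indy/indy_config.py` (as mentioned above) and restarting the node; a minimal sketch:
+
+```python
+# /etc/indy/indy_config.py
+# Turns on the key-value debug metrics collector; since metrics are off by default,
+# removing this line and restarting the node reverts to normal behaviour.
+METRICS_COLLECTOR_TYPE = 'kv'
+```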
+
+### Node data structure
+
+A node stores all its data (except local configuration and logs) in `/var/lib/indy/<network name>`.
+Among other things it contains the following data:
+- `keys` subdirectory contains node keys, including private ones.
+ **Never ever share this folder with 3rd party!**
+- `data` contains various databases, including ledgers, states and various caches.
+ Since all data inside this directory is effectively public it should be safe to share its contents, for example for debugging purposes.
+
+Directory `/var/lib/indy/<network name>/data/<node_name>` contains directories with RocksDB databases, the most interesting ones are:
+- `*_transactions` contains transaction data of ledgers (`*` here can be `audit`, `domain`, `config`, `pool` and possibly some plugin ledgers)
+- `*_merkleNodes` and `*_merkleLeaves` contain merkle trees for transaction data of ledgers.
+ This data can be rebuilt from transactions data if needed.
+- `*_state` contains state in a form of merkle patricia tree (note that there is no state for `audit` ledger, only for `domain`, `config`, `pool` and possibly some plugin ledgers).
+  This data can be rebuilt from transaction data if needed, however if after such a rebuild the root hashes change, clients won't be able to get state proofs in order to trust replies from any single node and will fall back to asking `f+1` nodes for such hashes.
+ This shouldn't be a problem with queries on current data since it gets updated with every new batch ordered (including empty freshness batches), but it can degrade performance of accessing some historical data.
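+
+As a quick illustration, the raw contents of these databases can be inspected with the `python3-rocksdb` bindings used by Indy Node; a minimal sketch (the network and node names are placeholders, and it is best run against a stopped node or a copy of the `data` directory):
+
+```python
+import rocksdb  # provided by the python3-rocksdb package
+
+# Placeholder path - substitute your own network and node names.
+path = "/var/lib/indy/sandbox/data/Node1/domain_transactions"
+
+# Open read-only so nothing is modified; keys and values are raw bytes.
+db = rocksdb.DB(path, rocksdb.Options(), read_only=True)
+it = db.iterkeys()
+it.seek_to_first()
+print("raw entries:", sum(1 for _ in it))
+```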
+
+## Troubleshooting checklist
+
+### Emergency checklist
+
+In case of client-visible incidents, first of all assess how bad the situation is and try to fix it as soon as possible:
+- If failure happened after upgrade
+  - Check that the upgrade finished successfully and all nodes have the same versions, if not - investigate and fix that first
+- If it is impossible to read from ledger
+  - Check whether the problem affects all read transactions, or just some subset of them.
+    The latter case indicates that most probably there is a bug in the source code.
+    This needs longer investigation, but on the other hand it doesn't affect all use cases.
+    The first place to look should be journalctl to make sure nodes are not crashing on receiving transactions.
+ - If problem affects all read transactions:
+    - Check whether nodes are accessible at all from the client (using tools like `netcat`, or see the connectivity sketch after this checklist).
+      Node IPs and ports can be found in the pool ledger, which should be present on the client.
+      In case of the Sovrin network node addresses can also be found on some 3rd party sites like [indyscan.io](https://indyscan.io).
+ - If nodes are inaccessible from client check whether nodes are actually running and firewalls are configured properly (by asking corresponding Stewards).
+ When coordinating with Stewards an additional [purpose-built tool](https://github.com/hyperledger/indy-plenum/blob/master/scripts/test_zmq/README.md) can be used for checking ZeroMQ connectivity.
+    - If nodes are running, but do not respond - check journalctl logs, there is a high chance that nodes are perpetually crashing while trying to start up.
+      This needs investigation (but usually it shouldn't take too long, since a stack trace is available) and fixing the source code.
+- If it is impossible to write to the ledger
+  - Check whether the problem affects all write transactions, or just some subset of them.
+    The latter case indicates that most probably there is a bug in the source code.
+    This needs longer investigation, but on the other hand it doesn't affect all use cases.
+    The first place to look should be journalctl to make sure nodes are not crashing on receiving transactions.
+ - If problem affects all write transactions (meaning that write consensus is lost):
+ - Check whether there is an ongoing view change (write requests are rejected during view change), if so - it could be reasonable to give it some time to finish.
+ If it doesn't finish in 5 minutes proceed to next checks.
+ - Check whether at least _N-f_ nodes are online, if not - start more nodes.
+ - Check that all online nodes (and especially master primary) are connected to each other, if not - most probably it is firewall issues which need to be fixed.
+ - Check that all started nodes are participating in consensus (i.e. not in catch-up or view change state)
+      - if some are stuck in catch up or view change - try rebooting them individually, this could help if the error condition is transient
+      - if the majority of nodes are stuck - send a POOL_RESTART action to simultaneously restart the whole pool, this could help if the error condition is transient
+ - if restart doesn't help (and it is not a ledger or state corruption, see below) problem needs deeper investigation
+    - Check all nodes have the same state - if some nodes appear to be corrupted (in this case corrupted nodes usually contain `incorrect state trie` messages in their logs) delete the state from them and restart, they should be able to recreate the correct state from the ledger, unless there is some bug in the code or the ledger itself is corrupted.
+    - Check all nodes have the same ledgers - if some nodes appear to be corrupted (their logs can contain `incorrect state trie`, `incorrect audit root hash`, `blacklisting` etc) it makes sense to delete all data from them and let them catch up from healthy nodes. **This should be done with extreme care, and only if the number of corrupted nodes is small (certainly less than _f_).** Also, after deleting corrupted ledgers it is advised to check whether the offending nodes were blacklisted by others, and if so - restart the nodes that performed the blacklisting.
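+
+For the basic reachability checks mentioned above, anything that can open a TCP connection will do; a minimal Python sketch equivalent to a `netcat` probe (the address and port are placeholders and should be taken from the pool ledger):
+
+```python
+import socket
+
+def can_reach(ip: str, port: int, timeout: float = 3.0) -> bool:
+    """Return True if a TCP connection to ip:port can be established."""
+    try:
+        with socket.create_connection((ip, port), timeout=timeout):
+            return True
+    except OSError:
+        return False
+
+# Placeholder address and port - use the real values from the pool ledger.
+print(can_reach("10.0.0.1", 9702))
+```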
+
+### Recommended regular health checks
+
+An Indy Node pool can function even with some nodes failing, however it is better to catch and fix these failures before too many nodes are affected and we end up with a major incident. The first things to look at are:
+- check that there are no crashes in journalctl, if there are some - investigate reasons, if it leads to finding some bugs - fix them
+- check that all nodes can connect to each other, if not - it is most likely firewall issue which needs to be fixed
+- check that all nodes participate in consensus, and are not stuck in view change or catch up, if not - investigate why they became stuck and reboot them
+- check that all nodes have correct ledgers and state, if not - investigate why it diverged and then fix (usually by deleting state or full data and letting node restore it)
+- check that nodes are not blacklisting each other, if they do some blacklisting - investigate and fix offending node and then restart all nodes that blacklisted it
+
+**Warning:** when a significant number of nodes are out of consensus, try to refrain from sending promotion/demotion transactions, especially many in a row.
+
+### First things to look at in node logs during investigations
+
+- `incorrect <anything except time>` - if you see log messages blaming some PREPREPAREs for having `incorrect state trie`, `incorrect audit root hash` or the like, it means that we've got data corruption.
+ If this happened next steps would be:
+ - try to understand how many and which nodes got their data corrupted:
+    - if these messages are seen on a minority of nodes, and a view change doesn't happen, then it is these nodes that have corrupted data
+    - if these messages are seen on a majority of nodes, but then a view change happens and ordering continues normally, then the data corruption happened on the primary and the non-blaming nodes (after the view change they should start blaming the new primary for incorrect PREPREPAREs, since the view change reduces the situation to the previous case)
+ - try to understand what exactly is corrupted and why:
+ - look through `journalctl` logs for recent crashes, especially due to memory or disk space errors.
+ Sometimes crashes can lead to data corruption.
+ - look through node logs for recent view changes - this is quite a complex process, which sometimes led to data corruption in the past (now this is hopefully fixed with correct implementation of PBFT view change)
+  - if the warning was about `incorrect state trie` then most likely only the state was corrupted.
+    In this case you can try to stop the node, delete the state and start it again - the node should be able to rebuild the correct state from the transaction log (ledger).
+    If deleting the state doesn't help, and the node continues complaining about an incorrect state trie, then the situation is worse, and there is probably a bug
+
+- `incorrect time` - if you see log messages blaming some PREPREPAREs for having `incorrect time`, then one of the following is happening:
+  - either the primary node, or the node complaining about `incorrect time`, has its local clock set significantly off; if this is the case the clock needs to be adjusted
+ - incoming message queues are filled up faster than processed - which can be indirectly confirmed by increased memory consumption.
+ Possible reasons behind this might be:
+    - previous PREPREPARE messages were also discarded, but for some other reason.
+      If this is the case, having PREPREPAREs with incorrect time is just a consequence
+    - there was some performance spike, which at first caused some message delays, followed by an avalanche of message requests.
+ This can be fixed by pool restart (using POOL_RESTART transaction)
+ - there is too much load on the pool (very unlikely situation for current Sovrin deployments, but we've seen this during load tests against test pools)
+
+- `blacklisting` - this message can be seen during catchup when some node finds that some other node tries to send it transactions which are not in the ledgers of other nodes.
+  Usually this indicates that ledger corruption happened somewhere, which can be diagnosed and fixed as described above in the "incorrect state/audit" section.
+  There is a twist however - even if the offending node is fixed, other nodes retain their blacklists until restart.
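+
+When triaging these messages, a few lines of Python can stand in for a chain of `grep` commands over the node logs; a minimal sketch (the network name in the path is a placeholder):
+
+```python
+from collections import Counter
+from pathlib import Path
+
+# Warning patterns discussed in this section.
+PATTERNS = ("incorrect state trie", "incorrect audit root hash", "incorrect time", "blacklisting")
+LOG_DIR = Path("/var/log/indy/sandbox")  # placeholder network name
+
+counts = Counter()
+for log_file in LOG_DIR.glob("*.log*"):
+    with log_file.open(errors="replace") as f:
+        for line in f:
+            for pattern in PATTERNS:
+                if pattern in line:
+                    counts[pattern] += 1
+
+for pattern, n in counts.most_common():
+    print(f"{n:8d}  {pattern}")
+```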
+
+### Useful fields in validator info output
+
+- `Software` - this can be useful for checking what exact versions of indy-related packages are installed
+- `Pool_info` - shows generic pool parameters and connectivity status, most useful fields are:
+ - `Total_nodes_count` - number of validator nodes in pool
+ - `f_value` - maximum number of faulty nodes that can be tolerated without affecting consensus, should be `Total_nodes_count/3` rounded down
+ - `Unreachable_nodes_count` - number of nodes that cannot be reached, ideally should be 0, if more than `f_value` write consensus is impossible
+ - `Unreachable_nodes` - list of unreachable nodes, ideally should be empty
+  - `Blacklisted_nodes` - list of blacklisted nodes, should be empty; if it is not and the blacklisted nodes were already fixed, the blacklisting node should be rebooted to clear this list
+- `Node_info` - shows information about node
+ - `Mode` - operation mode of node, normally it should be `participating`, if not - node is not participating in write consensus
+  - `Freshness_status` - status of freshness checks (periodic attempts to order some batch on the master instance) per ledger, usually either all are successful, or all have failed. The latter case means that at least on this node write consensus is broken. If write consensus is broken on more than _f_ nodes then it is certainly broken on all nodes, meaning the pool has lost write consensus and cannot process write transactions.
+ - `Catchup_status` - status of either last completed or currently ongoing catchup
+    - `Ledger_statuses` - catchup statuses of individual ledgers, if some are not `synced` then it means that there is an ongoing catchup
+ - `View_change_status` - status of either last completed or currently ongoing view change
+ - `View_No` - current view no, should be same on all nodes.
+ If there is an ongoing view change this will indicate target view no.
+ - `VC_in_progress` - indicates that there is an ongoing view change at least on this node, this shouldn't last for more than 5 minutes (actually in most cases view change should complete in under 1 minute).
+      Usually all nodes should start and finish a view change at approximately the same time, but sometimes due to some edge cases fewer than _f_ nodes can enter (or fail to finish) a view change and linger in this state indefinitely.
+      In this case it is advised to try to reboot such nodes, and if that doesn't help - investigate the situation further.
+ - `Last_complete_view_no` - indicates view no of last completed view change
+ - `IC_queue` - list of received instance change messages (effectively votes for view change) which didn't trigger a view change yet (at least _N-f_ votes for view change to same view_no is needed)
+
+
+### Useful patterns in logs
+
+- `Starting up indy-node` - node just started. This can be useful to identify restart points among other events
+- `starting catchup (is_initial=<>)` - node started catching up
+- `transitioning from <state> to <state>` - node successfully progressed through catch up. If node fails to finish catch up this could help identify exact stage which failed
+- `caught up to <n> txns in last catch up` - node successfully finished a catch up
+- `sending an instance change` - this shows our node voted for a view change (and includes reason for doing so)
+- `received instance change request` - this indicates that we received vote for a view change from some other node. When node gets _N-f_ instance change messages (including ours) with same `view_no` (however `code` can be different) it starts a view change. This message also can be very useful when you have access to logs from a limited number of nodes but need to diagnose problems happening on other nodes. Codes which are most likely to occur:
+ - 25 - master primary has too low performance
+ - 26 - master primary disconnected (which means it cannot propose new batches to order, and pool needs to select a new one)
+ - 28 - there is an ongoing view change, and it failed to complete in time
+ - 43 - too much time has passed since last successfully ordered batch (in other words node suspects that pool cannot perform write transactions anymore)
+ - 18 - time of PREPREPARE received from master primary is way too off, this can also indirectly indicate that incoming message queues are filled faster than processed
+ - 21 - state trie root hash of received PREPREPARE looks incorrect, **this indicates ledger corruption**, refer to [this](#first-things-to-look-at-in-node-logs-during-investigations) section for more details
+ - 44 - audit root hash of received PREPREPARE looks incorrect, **this indicates ledger corruption**, refer to [this](#first-things-to-look-at-in-node-logs-during-investigations) section for more details
+ - 46-51 - these codes are connected to promotions or demotions of validator nodes, and:
+ - either indicate that some nodes were promoted or demoted, in which case pool need to choose a new primary, hence votes for view change
+ - or indicate some transient problems when selecting new primary after changing number of nodes, so yet another view change is needed
+ - other suspicion codes can be found [here](https://github.com/hyperledger/indy-plenum/blob/master/plenum/server/suspicion_codes.py)
+- `started view change to view` - this marks start of view change
+- `finished view change to view` - view change service accepted NEW_VIEW message (so there is enough connectivity between honest nodes to reach consensus), however there are some cases when ordering in new view fails and another view change will be needed
+- `started participating` - node finished all side activities (like catch up or view change) and started participating in consensus
+- `0 ordered batch request` - node master instance just managed to order one more batch of transactions (so there is write consensus)
+- `<n> ordered batch request` - some backup instance just managed to order one more batch (so there is write consensus on backups, but that doesn't mean write consensus on master, which matters for clients).
+- `disconnected from <node>` - for some reason we lost connection to some other validator node
+- `now connected to <node>` - we managed to connect to some other validator node after being disconnected
+- `connections changed from <list> to <list>` - duplicates information about connection and disconnection events, it can be useful for quickly checking list of connected nodes at given point in time without needing to track all previous individual connection events. Previous data can
* [X] diff --git a/environment/docker/baseimage/indy-baseimage.ubuntu.dockerfile b/environment/docker/baseimage/indy-baseimage.ubuntu.dockerfile
This file seems invalid now, as we are using ubuntu 20.04, not ubuntu 16.04.
* [ ] TODO: We should remove this file and / or replace it with ubuntu 20.04
index 76a2bb90..41fb405a 100644
--- a/environment/docker/baseimage/indy-baseimage.ubuntu.dockerfile
+++ b/environment/docker/baseimage/indy-baseimage.ubuntu.dockerfile
@@ -21,7 +21,7 @@ RUN apt-get update && apt-get install -y \
# pypi based packages
RUN pip3 install -U \
'pip<10.0.0' \
- 'setuptools<=50.3.2' \
+ setuptools \
virtualenv
COPY scripts/clean.sh /usr/local/bin/indy_image_clean
* [X] diff --git a/environment/docker/baseimage/indy-core-repo.preferences b/environment/docker/baseimage/indy-core-repo.preferences
We removed indy-crypto, good.
index f6c62527..459b5f57 100644
--- a/environment/docker/baseimage/indy-core-repo.preferences
+++ b/environment/docker/baseimage/indy-core-repo.preferences
@@ -1,3 +1,3 @@
-Package: /indy-crypto/ /libindy/
+Package: /libindy/
Pin: release l=Indy Main Repository
Pin-Priority: 1000
* [X] diff --git a/environment/docker/pool/core.ubuntu.dockerfile b/environment/docker/pool/core.ubuntu.dockerfile
This builds off of xenial, or ubuntu 16.04.
* [ ] TODO: This should be upgraded to ubuntu 20.04 or removed.
index 71694bf2..b855d424 100644
--- a/environment/docker/pool/core.ubuntu.dockerfile
+++ b/environment/docker/pool/core.ubuntu.dockerfile
@@ -14,7 +14,7 @@ RUN apt-get update -y && apt-get install -y \
ca-certificates
RUN pip3 install -U \
'pip<10.0.0' \
- 'setuptools<=50.3.2'
+ setuptools
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys BD33704C
RUN echo "deb https://repo.sovrin.org/deb xenial master" >> /etc/apt/sources.list
* [X] diff --git a/environment/openshift/core.ubuntu.dockerfile b/environment/openshift/core.ubuntu.dockerfile
This builds off of xenial, or ubuntu 16.04.
* [ ] TODO: This should be upgraded to ubuntu 20.04 or removed.
index 8dcdd9b0..1b1720b9 100644
--- a/environment/openshift/core.ubuntu.dockerfile
+++ b/environment/openshift/core.ubuntu.dockerfile
@@ -17,7 +17,7 @@ RUN apt-get update -y && apt-get install -y \
RUN pip3 install -U \
'pip<10.0.0' \
- 'setuptools<=50.3.2'
+ setuptools
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys CE7709D068DB5E88
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys BD33704C
* [X] diff --git a/indy_common/auth.py b/indy_common/auth.py
Let us poke at the auth and see if this change makes sense. We did add a LEDGERS_FREEZE option, so this looks right to me. Checking history just to verify.
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p indy_common/auth.py
returns only merge commits, so I think we are good to move on.
index 176f0eae..dbaf9ab9 100644
--- a/indy_common/auth.py
+++ b/indy_common/auth.py
@@ -1,5 +1,5 @@
from indy_common.config_util import getConfig
-from plenum.common.constants import TRUSTEE, STEWARD, NODE
+from plenum.common.constants import TRUSTEE, STEWARD, NODE, LEDGERS_FREEZE
from stp_core.common.log import getlogger
from indy_common.constants import OWNER, POOL_UPGRADE, ENDORSER, NYM, \
@@ -62,6 +62,8 @@ def generate_auth_map(valid_roles):
{TRUSTEE: []},
'{}_<any>_<any>_<any>'.format(VALIDATOR_INFO):
{TRUSTEE: [], STEWARD: []},
+ '{}_<any>_<any>_<any>'.format(LEDGERS_FREEZE):
+ {TRUSTEE: []},
}
return auth_map
* [X] diff --git a/indy_common/authorize/auth_constraints.py b/indy_common/authorize/auth_constraints.py
Better error message. Looks good to me.
index 8c0e14e9..d136fe61 100644
--- a/indy_common/authorize/auth_constraints.py
+++ b/indy_common/authorize/auth_constraints.py
@@ -136,7 +136,7 @@ class AuthConstraint(AbstractAuthConstraint):
elif role == "ALL" and self.need_to_be_owner and self.sig_count == 1:
error_msg = "1 signature of any role is required and needs to be owner"
elif role == 'ALL' and not self.need_to_be_owner and self.sig_count == 1:
- error_msg = "1 signature of any role is required".format(role)
+ error_msg = "1 signature of any role is required {}".format(role)
elif role == 'ALL' and not self.need_to_be_owner and self.sig_count > 1:
error_msg = "{} signatures of any role are required".format(self.sig_count)
elif role == "ALL" and self.need_to_be_owner and self.sig_count > 1:
* [X] diff --git a/indy_common/authorize/auth_map.py b/indy_common/authorize/auth_map.py
Well, we aren't going to pretend we understand or know what is going on here, so we are going to let the commit log tell the story. First, let's look at the stable commits vs ubuntu 20.04
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p indy_common/authorize/auth_map.py
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: ff88db39 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 8940e559 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: 1cd837b1 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit 1cd837b1d093f1afc408b5395cbc00353f7e3cca
| Merge: a8784f11 df9959d0
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed May 29 09:36:39 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.8.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: c009c3f0 d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
Merge: 089d12e4 16697cf9
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Feb 6 09:41:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.6.83
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
As we can see, they are all merge commits that appear to merge in commits that were already merged in by ubuntu 20.04. This means we are good and shouldn't think about this one. 😀
index 00f06343..4a88a232 100644
--- a/indy_common/authorize/auth_map.py
+++ b/indy_common/authorize/auth_map.py
@@ -5,9 +5,11 @@ from indy_common.authorize.auth_actions import AuthActionAdd, AuthActionEdit
from indy_common.authorize.auth_constraints import AuthConstraint, AuthConstraintOr, accepted_roles, IDENTITY_OWNER, \
AuthConstraintForbidden
from indy_common.constants import ENDORSER, POOL_CONFIG, VALIDATOR_INFO, POOL_UPGRADE, POOL_RESTART, NODE, \
- CLAIM_DEF, SCHEMA, SET_CONTEXT, NYM, ROLE, AUTH_RULE, NETWORK_MONITOR, REVOC_REG_ENTRY, REVOC_REG_DEF, ATTRIB, AUTH_RULES
+ CLAIM_DEF, SCHEMA, NYM, ROLE, AUTH_RULE, NETWORK_MONITOR, REVOC_REG_ENTRY, \
+ REVOC_REG_DEF, ATTRIB, AUTH_RULES, JSON_LD_CONTEXT, RICH_SCHEMA, RICH_SCHEMA_MAPPING, \
+ RICH_SCHEMA_ENCODING, RICH_SCHEMA_CRED_DEF, RICH_SCHEMA_PRES_DEF, FLAG
from plenum.common.constants import TRUSTEE, STEWARD, VERKEY, TXN_AUTHOR_AGREEMENT, TXN_AUTHOR_AGREEMENT_AML, \
- TXN_AUTHOR_AGREEMENT_DISABLE
+ TXN_AUTHOR_AGREEMENT_DISABLE, LEDGERS_FREEZE
edit_role_actions = {} # type: Dict[str, Dict[str, AuthActionEdit]]
for role_from in accepted_roles:
@@ -78,15 +80,59 @@ edit_schema = AuthActionEdit(txn_type=SCHEMA,
old_value='*',
new_value='*')
-add_context = AuthActionAdd(txn_type=SET_CONTEXT,
- field='*',
- value='*')
+add_json_ld_context = AuthActionAdd(txn_type=JSON_LD_CONTEXT,
+ field='*',
+ value='*')
-edit_context = AuthActionEdit(txn_type=SET_CONTEXT,
- field='*',
- old_value='*',
- new_value='*')
+edit_json_ld_context = AuthActionEdit(txn_type=JSON_LD_CONTEXT,
+ field='*',
+ old_value='*',
+ new_value='*')
+
+add_rich_schema = AuthActionAdd(txn_type=RICH_SCHEMA,
+ field='*',
+ value='*')
+
+edit_rich_schema = AuthActionEdit(txn_type=RICH_SCHEMA,
+ field='*',
+ old_value='*',
+ new_value='*')
+
+add_rich_schema_encoding = AuthActionAdd(txn_type=RICH_SCHEMA_ENCODING,
+ field='*',
+ value='*')
+
+edit_rich_schema_encoding = AuthActionEdit(txn_type=RICH_SCHEMA_ENCODING,
+ field='*',
+ old_value='*',
+ new_value='*')
+
+add_rich_schema_mapping = AuthActionAdd(txn_type=RICH_SCHEMA_MAPPING,
+ field='*',
+ value='*')
+edit_rich_schema_mapping = AuthActionEdit(txn_type=RICH_SCHEMA_MAPPING,
+ field='*',
+ old_value='*',
+ new_value='*')
+
+add_rich_schema_cred_def = AuthActionAdd(txn_type=RICH_SCHEMA_CRED_DEF,
+ field='*',
+ value='*')
+
+edit_rich_schema_cred_def = AuthActionEdit(txn_type=RICH_SCHEMA_CRED_DEF,
+ field='*',
+ old_value='*',
+ new_value='*')
+
+add_rich_schema_pres_def = AuthActionAdd(txn_type=RICH_SCHEMA_PRES_DEF,
+ field='*',
+ value='*')
+
+edit_rich_schema_pres_def = AuthActionEdit(txn_type=RICH_SCHEMA_PRES_DEF,
+ field='*',
+ old_value='*',
+ new_value='*')
add_claim_def = AuthActionAdd(txn_type=CLAIM_DEF,
field='*',
@@ -190,6 +236,16 @@ edit_revoc_reg_entry = AuthActionEdit(txn_type=REVOC_REG_ENTRY,
old_value='*',
new_value='*')
+edit_frozen_ledgers = AuthActionEdit(txn_type=LEDGERS_FREEZE,
+ field='*',
+ old_value='*',
+ new_value='*')
+
+edit_config_flag = AuthActionEdit(txn_type=FLAG,
+ field='*',
+ old_value='*',
+ new_value='*')
+
# Anyone constraint
anyone_constraint = AuthConstraint(role='*',
sig_count=1)
@@ -208,6 +264,9 @@ steward_owner_constraint = AuthConstraint(STEWARD, 1, need_to_be_owner=True)
# One Trustee constraint
one_trustee_constraint = AuthConstraint(TRUSTEE, 1)
+# Three Trustee constraint
+three_trustee_constraint = AuthConstraint(TRUSTEE, 3)
+
# Steward or Trustee constraint
steward_or_trustee_constraint = AuthConstraintOr([AuthConstraint(STEWARD, 1),
AuthConstraint(TRUSTEE, 1)])
@@ -240,8 +299,20 @@ auth_map = OrderedDict([
(edit_attrib.get_action_id(), owner_constraint),
(add_schema.get_action_id(), endorser_or_steward_or_trustee_constraint),
(edit_schema.get_action_id(), no_one_constraint),
- (add_context.get_action_id(), endorser_or_steward_or_trustee_constraint),
- (edit_context.get_action_id(), no_one_constraint),
+
+ (add_json_ld_context.get_action_id(), endorser_or_steward_or_trustee_constraint),
+ (edit_json_ld_context.get_action_id(), no_one_constraint),
+ (add_rich_schema.get_action_id(), endorser_or_steward_or_trustee_constraint),
+ (edit_rich_schema.get_action_id(), no_one_constraint),
+ (add_rich_schema_encoding.get_action_id(), endorser_or_steward_or_trustee_constraint),
+ (edit_rich_schema_encoding.get_action_id(), no_one_constraint),
+ (add_rich_schema_mapping.get_action_id(), endorser_or_steward_or_trustee_constraint),
+ (edit_rich_schema_mapping.get_action_id(), no_one_constraint),
+ (add_rich_schema_cred_def.get_action_id(), endorser_or_steward_or_trustee_constraint),
+ (edit_rich_schema_cred_def.get_action_id(), no_one_constraint),
+ (add_rich_schema_pres_def.get_action_id(), endorser_or_steward_or_trustee_constraint),
+ (edit_rich_schema_pres_def.get_action_id(), no_one_constraint),
+
(add_claim_def.get_action_id(), endorser_or_steward_or_trustee_constraint),
(edit_claim_def.get_action_id(), owner_constraint),
(adding_new_node.get_action_id(), steward_owner_constraint),
@@ -266,6 +337,8 @@ auth_map = OrderedDict([
(add_revoc_reg_entry.get_action_id(), endorser_or_steward_or_trustee_owner_constraint),
(edit_revoc_reg_def.get_action_id(), owner_constraint),
(edit_revoc_reg_entry.get_action_id(), owner_constraint),
+ (edit_frozen_ledgers.get_action_id(), three_trustee_constraint),
+ (edit_config_flag.get_action_id(), one_trustee_constraint),
])
# Edit Trustee:
* [X] diff --git a/indy_common/compat_set.py b/indy_common/compat_set.py
This appears to be a new file. Let us double check whether the file exists in the old stable branch or was deleted or removed from it.
Doesn't exist. Good.
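As a quick sanity check, the public API of the new class (visible in the diff below) can be exercised with a small snippet like this - hypothetical usage, not part of the change:

```python
from indy_common.compat_set import CompatSet

s = CompatSet(["b", "a", "c"])
s.add("d")
s.discard("a")
assert "d" in s and "a" not in s
print(list(s))                  # iteration order mimics the CPython 3.5 set layout
print(s.union({"e"}))           # set operations return new CompatSet instances
print(s.issubset({"b", "c", "d", "e"}))
```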
new file mode 100644
index 00000000..4fcb2d20
--- /dev/null
+++ b/indy_common/compat_set.py
@@ -0,0 +1,515 @@
+"""
+A Set collection with Python 3.5-compatible ordering characteristics.
+
+Adapted from cpython/Objects/setobject.c (written by Raymond D. Hettinger)
+https://github.com/python/cpython/blob/v3.5.10/Objects/setobject.c
+"""
+
+from collections import namedtuple
+from io import StringIO
+from typing import AbstractSet, Any, Iterable, Optional, Tuple
+
+
+class _Entry(namedtuple("_Entry", ("elem", "hash"))):
+ @classmethod
+ def from_elem(cls, elem) -> "_Entry":
+ return super().__new__(cls, elem, hash(elem))
+
+
+MIN_SIZE = 8
+LINEAR_PROBES = 9
+PERTURB_SHIFT = 5
+REMOVED = _Entry(None, 0)
+
+
+class CompatSet:
+ """Compatibility set."""
+
+ def __init__(self, elems: Iterable = None):
+ """Initialize the set."""
+ self._inner = _CompatSetInner()
+ if elems is not None:
+ self._inner.update(elems)
+
+ def add(self, elem):
+ """Add element elem to the set."""
+ self._inner.add(elem)
+
+ def capacity(self) -> int:
+ """Get the current allocated capacity of the set."""
+ return self._inner._mask + 1
+
+ def discard(self, elem):
+ """Remove element elem from the set if it is present."""
+ self._inner.discard(elem)
+
+ def remove(self, elem):
+ """
+ Remove element elem from the set.
+
+ Raises KeyError if elem is not contained in the set.
+ """
+ if not self._inner.discard(elem):
+ raise KeyError(elem)
+
+ def pop(self) -> Any:
+ """
+ Remove and return an arbitrary element from the set.
+
+ Raises KeyError if the set is empty.
+ """
+ return self._inner.pop()
+
+ def clear(self):
+ """Remove all elements from the set."""
+ self._inner = _CompatSetInner()
+
+ def copy(self) -> "CompatSet":
+ """Return a shallow copy of the set."""
+ return CompatSet._from_inner(self._inner.copy())
+
+ def difference(self, *others) -> "CompatSet":
+ """Return a new set with elements in only this set."""
+ inner = None
+ for other in others:
+ if inner is None:
+ inner = self._inner.difference(other)
+ else:
+ inner = inner.difference_update(other)
+ if inner is None:
+ inner = self._inner.copy()
+ return CompatSet._from_inner(inner)
+
+ def symmetric_difference(self, other: AbstractSet) -> "CompatSet":
+ """Return a new set with elements in either set but not both."""
+ inner = _CompatSetInner()
+ inner.update(other)
+ inner = inner.symmetric_difference_update(self)
+ return CompatSet._from_inner(inner)
+
+ def intersection(self, *others) -> "CompatSet":
+ """Return a new set with elements common to the set and all others."""
+ inner = None
+ for other in others:
+ if inner is None:
+ inner = self._inner.intersection(other)
+ else:
+ inner = inner.intersection(other)
+ if inner is None:
+ inner = self._inner.copy()
+ return CompatSet._from_inner(inner)
+
+ def update(self, *others):
+ """Update the set from one or more iterators."""
+ for other in others:
+ self._inner.update(other)
+
+ def isdisjoint(self, other: AbstractSet) -> bool:
+ """Determine if there is no intersection between two sets."""
+ if len(self) > len(other):
+ (other, self) = (self, other)
+ return all(elem not in self for elem in other)
+
+ def issubset(self, other: AbstractSet) -> bool:
+ """Test whether every element in the set is in other."""
+ return all((elem in other) for elem in self)
+
+ def issuperset(self, other: AbstractSet) -> bool:
+ """Test whether every element in other is in the set."""
+ return all((elem in self) for elem in other)
+
+ def union(self, *others) -> "CompatSet":
+ """Return a new set with elements from the set and all others."""
+ inner = self._inner.copy()
+ for other in others:
+ inner.update(other)
+ return CompatSet._from_inner(inner)
+
+ def __iter__(self) -> "_CompatSetIter":
+ """Iterate over the set."""
+ return _CompatSetIter(self._inner)
+
+ def __bool__(self) -> bool:
+ """Get the truthy value of the set."""
+ return self._inner._used > 0
+
+ def __eq__(self, other) -> bool:
+ """Compare two sets."""
+ if isinstance(other, CompatSet):
+ return other._inner == self._inner
+ if isinstance(other, set):
+ return len(other) == len(self) and all((elem in other) for elem in self)
+ return False
+
+ def __len__(self) -> int:
+ """Get the length of the set."""
+ return self._inner._used
+
+ def __contains__(self, elem) -> bool:
+ """Determine if the set contains elem (elem in self)."""
+ return self._inner.contains(elem)
+
+ def __and__(self, other: Iterable) -> "CompatSet":
+ """Return a new set with elements common to both."""
+ return self.intersection(other)
+
+ def __iand__(self, other: Iterable) -> "CompatSet":
+ """Make this set an intersection with the other set."""
+ self._inner = self._inner.intersection(other)
+ return self
+
+ def __or__(self, other: Iterable) -> "CompatSet":
+ """Return a new set with elements in only this set."""
+ return self.union(other)
+
+ def __ior__(self, other: Iterable) -> "CompatSet":
+ """Add elements from the other set."""
+ self._inner.update(other)
+ return self
+
+ def __sub__(self, other: Iterable) -> "CompatSet":
+ """Return a new set with elements in only this set."""
+ return self.difference(other)
+
+ def __isub__(self, other: Iterable) -> "CompatSet":
+ """Remove elements from the other set."""
+ self._inner = self._inner.difference_update(other)
+ return self
+
+ def __xor__(self, other: AbstractSet) -> "CompatSet":
+ """Return a new set with elements in either set but not both."""
+ return self.symmetric_difference(other)
+
+ def __ixor__(self, other: AbstractSet) -> "CompatSet":
+ """Update this set with the symmetric intersection."""
+ self._inner = self._inner.symmetric_difference_update(other)
+ return self
+
+ def __repr__(self) -> str:
+ """Get the set representation."""
+ s = StringIO()
+ s.write("{")
+ fst = True
+ for elem in self:
+ if not fst:
+ s.write(", ")
+ fst = False
+ s.write(repr(elem))
+ s.write("}")
+ return s.getvalue()
+
+ @classmethod
+ def _from_inner(cls, inner: "_CompatSetInner") -> "CompatSet":
+ slf = cls.__new__(cls)
+ slf._inner = inner
+ return slf
+
+
+class _CompatSetInner:
+ def __init__(self):
+ self._fill = 0
+ self._mask = MIN_SIZE - 1
+ self._search = 0
+ self._table = [None] * MIN_SIZE
+ self._used = 0
+
+ def add(self, elem):
+ self.add_entry(_Entry.from_elem(elem))
+
+ def add_entry(self, entry: _Entry):
+ if self.insert_entry(entry) and self._fill * 3 >= (self._mask + 1) * 2:
+ self.grow()
+
+ def contains(self, elem) -> bool:
+ return self.contains_entry(_Entry.from_elem(elem))
+
+ def copy(self) -> "_CompatSetInner":
+ res = _CompatSetInner()
+ res.merge(self)
+ return res
+
+ def contains_entry(self, entry: _Entry):
+ index = self.look_entry(entry)
+ found = self._table[index]
+ return found is not None and found is not REMOVED
+
+ def difference(self, other: Iterable) -> "_CompatSetInner":
+ if isinstance(other, (CompatSet, set)) and (self._used >> 2) <= len(other):
+ res = _CompatSetInner()
+ pos = 0
+ if isinstance(other, CompatSet):
+ inner = other._inner
+ while True:
+ (pos, entry) = self.next(pos)
+ if entry is None:
+ break
+ if not inner.contains_entry(entry):
+ res.add_entry(entry)
+ else:
+ while True:
+ (pos, entry) = self.next(pos)
+ if entry is None:
+ break
+ if entry.elem not in other:
+ res.add_entry(entry)
+ return res
+
+ # If len(self) much more than len(other), it's more efficient to
+ # simply copy and then iterate other looking for common elements
+ res = self.copy()
+ return res.difference_update(other)
+
+ def difference_update(self, other: Iterable) -> "_CompatSetInner":
+ if isinstance(other, CompatSet) and other._inner is self:
+ return _CompatSetInner()
+ for elem in other:
+ self.discard(elem)
+ # If more than 1/5 are removed, then resize them away
+ if (self._fill - self._used) * 5 >= self._mask:
+ self.grow()
+ return self
+
+ def discard(self, elem) -> bool:
+ return self.discard_entry(_Entry.from_elem(elem))
+
+ def discard_entry(self, entry: _Entry) -> bool:
+ index = self.look_entry(entry)
+ found = self._table[index]
+ if found is not None and found is not REMOVED:
+ self._table[index] = REMOVED
+ self._used -= 1
+ return True
+ return False
+
+ def grow(self):
+ if self._used > 50000:
+ sz = self._used * 2
+ else:
+ sz = self._used * 4
+ self.resize(sz)
+
+ def insert_entry(self, entry: _Entry) -> bool:
+ index = self.look_entry(entry)
+ found = self._table[index]
+ if found is None:
+ self._table[index] = entry
+ self._fill += 1
+ self._used += 1
+ return True
+ elif found is REMOVED:
+ self._table[index] = entry
+ self._used += 1
+ return True
+ else:
+ return False # present
+
+ def insert_entry_clean(self, entry: _Entry):
+ """
+ Insert an entry known to not be present.
+
+ Must only be used in a table with no removed entries.
+ """
+ mask = self._mask
+ table = self._table
+ i = entry.hash & mask
+ perturb = entry.hash
+ found = None
+
+ while True:
+ if table[i] is None:
+ found = i
+ break
+
+ if i + LINEAR_PROBES <= mask:
+ for j in range(i + 1, i + LINEAR_PROBES + 1):
+ if table[j] is None:
+ found = j
+ break
+
+ if found is not None:
+ break
+
+ perturb >>= PERTURB_SHIFT
+ i = (i * 5 + 1 + perturb) & mask
+
+ table[found] = entry
+ self._fill += 1
+ self._used += 1
+
+ def intersection(self, other: Iterable) -> "_CompatSetInner":
+ """Return a new inner set representing the intersection with another."""
+ res = _CompatSetInner()
+ if isinstance(other, CompatSet):
+ if other._inner is self:
+ res.update(self)
+ else:
+ inner = other._inner
+ if inner._used > self._used:
+ (self, other) = (other, self)
+ pos = 0
+ while True:
+ (pos, entry) = inner.next(pos)
+ if entry is None:
+ break
+ if self.contains_entry(entry):
+ res.add_entry(entry)
+ else:
+ for elem in other:
+ entry = _Entry.from_elem(elem)
+ if self.contains_entry(entry):
+ res.add_entry(entry)
+ return res
+
+ def look_entry(self, entry: _Entry) -> int:
+ """Find an element in the set, or find an empty entry."""
+ # The set must have at least one empty entry for this to terminate
+ assert self._fill <= self._mask
+
+ free_idx = None
+ mask = self._mask
+ table = self._table
+ i = entry.hash & mask
+ perturb = entry.hash
+
+ while True:
+ if table[i] is None:
+ return i if free_idx is None else i
+ elif table[i] is REMOVED and free_idx is None:
+ free_idx = i
+ elif table[i] == entry:
+ return i
+
+ if i + LINEAR_PROBES <= mask:
+ for j in range(i + 1, i + LINEAR_PROBES + 1):
+ if table[j] is None:
+ return j if free_idx is None else j
+ elif table[j] is REMOVED and free_idx is None:
+ free_idx = j
+ elif table[j] == entry:
+ return j
+
+ perturb >>= PERTURB_SHIFT
+ i = (i * 5 + 1 + perturb) & mask
+
+ def merge(self, other: "_CompatSetInner"):
+ if other is self or not other._used:
+ return
+
+ # Do one big resize at the start, rather than incrementally
+ # resizing as we insert new keys. Expect that there will be no
+ # (or few) overlapping keys.
+ if (self._fill + other._used) * 3 >= (self._mask + 1) * 2:
+ self.resize((self._used + other._used) * 2)
+
+ # If our table is empty, and both tables have the same size, and
+ # there are no removed entries, then just copy the entries.
+ if self._fill == 0 and self._mask == other._mask and other._fill == other._used:
+ self._table = other._table.copy()
+ self._fill = other._fill
+ self._used = other._used
+ return
+
+ # If our table is empty, we can use insert_entry_clean()
+ if self._fill == 0:
+ for entry in other._table:
+ if entry is not None and entry is not REMOVED:
+ self.insert_entry_clean(entry)
+ return
+
+ # We can't assure there are no duplicates, so do normal insertions
+ for entry in other._table:
+ if entry is not None and entry is not REMOVED:
+ self.insert_entry(entry)
+
+ def next(self, pos: int) -> Tuple[int, Optional[_Entry]]:
+ while True:
+ if pos > self._mask:
+ return (pos, None)
+ entry = self._table[pos]
+ pos += 1
+ if entry is not None and entry is not REMOVED:
+ return (pos, entry)
+
+ def pop(self):
+ i = self._search & self._mask
+ if not self._used:
+ raise KeyError("pop from empty set")
+
+ while True:
+ entry = self._table[i]
+ if entry is None or entry is REMOVED:
+ i += 1
+ if i > self._mask:
+ i = 0
+ else:
+ break
+
+ self._table[i] = REMOVED
+ self._used -= 1
+ self._search = i + 1
+ return entry.elem
+
+ def resize(self, min_used: int):
+ """Adjust the allocated size of the set."""
+ old_table = self._table
+ assert min_used >= 0
+
+ new_size = MIN_SIZE
+ while new_size <= min_used:
+ new_size *= 2
+
+ self._table = [None] * new_size
+ self._fill = 0
+ self._mask = new_size - 1
+ self._used = 0
+
+ for entry in old_table:
+ if entry is not None and entry is not REMOVED:
+ self.insert_entry_clean(entry)
+
+ def symmetric_difference_update(self, other: Iterable) -> "_CompatSetInner":
+ if isinstance(other, CompatSet) and other._inner is self:
+ return _CompatSetInner()
+ for elem in other:
+ entry = _Entry.from_elem(elem)
+ if not self.discard_entry(entry):
+ self.add_entry(entry)
+ return self
+
+ def update(self, other: Iterable):
+ """Update the set from an iterator of elements."""
+ if isinstance(other, CompatSet):
+ self.merge(other._inner)
+ return
+ for elem in other:
+ self.add(elem)
+
+ def __eq__(self, other) -> bool:
+ if isinstance(other, _CompatSetInner):
+ if other._used != self._used:
+ return False
+ for pos in range(self._used):
+ entry = self._table[pos]
+ if (
+ entry is not None
+ and entry is not REMOVED
+ and not other.contains_entry(entry)
+ ):
+ return False
+ return True
+ return False
+
+
+class _CompatSetIter:
+ def __init__(self, s: _CompatSetInner):
+ self._s = s
+ self._pos = 0
+
+ def __iter__(self):
+ return self
+
+ def __next__(self):
+ (self._pos, entry) = self._s.next(self._pos)
+ if entry is None:
+ raise StopIteration
+ return entry.elem
* [X] diff --git a/indy_common/config.py b/indy_common/config.py
Let us check the stable branch... All merges... so we are good...
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p indy_common/config.py
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: 438aa426 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit 438aa42694b2d0d7af6d6366e8196604ac229093
| Merge: ff88db39 3c783228
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Nov 28 13:30:56 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.12.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 8940e559 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: a8784f11 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: c009c3f0 d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
| Merge: d9f0aed8 16697cf9
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Feb 6 09:41:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.6.83
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit d9f0aed8d1aff8580aac478b70abd1a93fca22f2
| Merge: f9e80160 f2d5a028
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Fri Nov 30 10:29:09 2018 +0300
|
| Merge remote-tracking branch 'public/master' into 1.6.79-rc
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit f9e80160c7e68a38c2b96a865b44f95a30b4ef68
| Merge: 0bd19eff 1204b349
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Aug 9 18:00:37 2018 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.6.69
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 0bd19eff3d8d5149a92e3e5c4756753c4b992f8e
| Merge: cbf2c8b4 7dcf675c
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Fri Jun 22 15:03:07 2018 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.4.63
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit cbf2c8b47fa05e6792c5f007827a98abe3bce99e
| Merge: 135dfcf3 dd33ba1a
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Apr 12 15:13:48 2018 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.3.56
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 135dfcf3c36857a619fb4dad70a7855bda544357
| Merge: 19586a3c 5fdb50cc
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:01 2018 +0300
|
| Merge remote-tracking branch 'origin/master' into stable-51
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index 004bac08..a090799c 100644
--- a/indy_common/config.py
+++ b/indy_common/config.py
@@ -97,11 +97,11 @@ INCONSISTENCY_WATCHER_NETWORK_TIMEOUT = 90
# Top level package to be updated via pool upgrade command
UPGRADE_ENTRY = 'indy-node'
-PACKAGES_TO_HOLD = ['indy-plenum', 'indy-node', 'python3-indy-crypto', 'libindy-crypto',
+PACKAGES_TO_HOLD = ['indy-plenum', 'indy-node',
# From indy-plenum:
'python3-ioflo', 'python3-orderedset', 'python3-base58', 'python3-prompt-toolkit', 'python3-rlp', 'python3-sha3',
'python3-libnacl', 'python3-six', 'python3-portalocker', 'python3-sortedcontainers',
- 'python3-dateutil', 'python3-semver', 'python3-pygments', 'python3-psutil', 'python3-pyzmq', 'python3-intervaltree',
+ 'python3-dateutil', 'python3-semver', 'python3-psutil', 'python3-pyzmq',
'python3-jsonpickle', 'python3-rocksdb', 'python3-pympler', 'python3-packaging',
# From indy-node:
'python3-timeout-decorator', 'python3-distro']
@@ -112,3 +112,10 @@ authPolicy = CONFIG_LEDGER_AUTH_POLICY
SCHEMA_ATTRIBUTES_LIMIT = 125
CONTEXT_SIZE_LIMIT = 131072
+JSON_LD_LIMIT = CONTEXT_SIZE_LIMIT
+DIDDOC_CONTENT_SIZE_LIMIT = 10 * 1024
+
+ENABLE_RICH_SCHEMAS = False
+
+# Disables the legacy sorting
+REV_STRATEGY_USE_COMPAT_ORDERING = False
* [X] diff --git a/indy_common/constants.py b/indy_common/constants.py
Check history...
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p indy_common/constants.py
This shows merges only...
* commit 9adcfca1f3c9a82929b71e40359859990a0ca0cb
| Merge: ff88db39 c99cc4cc
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Mon Jan 27 14:50:17 2020 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.12.2.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 8940e559 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: 1cd837b1 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit 1cd837b1d093f1afc408b5395cbc00353f7e3cca
| Merge: a8784f11 df9959d0
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed May 29 09:36:39 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.8.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: c009c3f0 d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
| Merge: d9f0aed8 16697cf9
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Feb 6 09:41:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.6.83
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit d9f0aed8d1aff8580aac478b70abd1a93fca22f2
| Merge: f9e80160 f2d5a028
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Fri Nov 30 10:29:09 2018 +0300
|
| Merge remote-tracking branch 'public/master' into 1.6.79-rc
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
.
.
.
index 8950af34..9717fc29 100644
--- a/indy_common/constants.py
+++ b/indy_common/constants.py
@@ -10,15 +10,18 @@ Environment = NamedTuple("Environment", [
("domainLedger", str)
])
-# Rich Schema
-RS_TYPE = "type"
-META = "meta"
-# CONTEXT
-CONTEXT_NAME = "name"
-CONTEXT_VERSION = "version"
-CONTEXT_CONTEXT = "@context"
-CONTEXT_FROM = "dest"
-CONTEXT_TYPE = 'ctx'
+DOMAIN_LEDGER_ID = 1
+
+# NYM
+DIDDOC_CONTENT = "diddocContent"
+NYM_VERSION = "version"
+VERSION_ID = "seqNo"
+VERSION_TIME = "timestamp"
+
+NYM_VERSION_NULL = 0 # No Checks
+NYM_VERSION_CONVENTION = 1 # Nym is base58 of first 16 bytes of verkey
+NYM_VERSION_SELF_CERT = 2 # Nym is base58 of first 16 bytes of sha256 of verkey
+DEFAULT_NYM_VERSION = NYM_VERSION_NULL # No checks by default
# SCHEMA
SCHEMA_NAME = "name"
@@ -170,10 +173,6 @@ GET_REVOC_REG = IndyTransactions.GET_REVOC_REG.value
GET_REVOC_REG_DELTA = IndyTransactions.GET_REVOC_REG_DELTA.value
CHANGE_KEY = IndyTransactions.CHANGE_KEY.value
-# Rich Schema
-SET_CONTEXT = IndyTransactions.SET_CONTEXT.value
-GET_CONTEXT = IndyTransactions.GET_CONTEXT.value
-
POOL_UPGRADE = IndyTransactions.POOL_UPGRADE.value
NODE_UPGRADE = IndyTransactions.NODE_UPGRADE.value
POOL_RESTART = IndyTransactions.POOL_RESTART.value
@@ -184,6 +183,19 @@ AUTH_RULE = IndyTransactions.AUTH_RULE.value
AUTH_RULES = IndyTransactions.AUTH_RULES.value
GET_AUTH_RULE = IndyTransactions.GET_AUTH_RULE.value
+FLAG = IndyTransactions.FLAG.value
+GET_FLAG = IndyTransactions.GET_FLAG.value
+
+# client Rich Schema transaction types
+JSON_LD_CONTEXT = IndyTransactions.JSON_LD_CONTEXT.value
+RICH_SCHEMA = IndyTransactions.RICH_SCHEMA.value
+RICH_SCHEMA_ENCODING = IndyTransactions.RICH_SCHEMA_ENCODING.value
+RICH_SCHEMA_MAPPING = IndyTransactions.RICH_SCHEMA_MAPPING.value
+RICH_SCHEMA_CRED_DEF = IndyTransactions.RICH_SCHEMA_CRED_DEF.value
+RICH_SCHEMA_PRES_DEF = IndyTransactions.RICH_SCHEMA_PRES_DEF.value
+GET_RICH_SCHEMA_OBJECT_BY_ID = IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_ID.value
+GET_RICH_SCHEMA_OBJECT_BY_METADATA = IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_METADATA.value
+
CONFIG_LEDGER_ID = 2
JUSTIFICATION_MAX_SIZE = 1000
@@ -193,3 +205,53 @@ CONFIG_LEDGER_AUTH_POLICY = 2
TAG_LIMIT_SIZE = 256
APP_NAME = "indy-node"
+
+# RICH SCHEMA
+RS_ID = 'id'
+RS_TYPE = 'rsType'
+RS_NAME = 'rsName'
+RS_VERSION = 'rsVersion'
+RS_CONTENT = 'content'
+RS_FROM = 'from'
+RS_ENDORSER = 'endorser'
+
+JSON_LD_CONTEXT_FIELD = "@context"
+JSON_LD_ID_FIELD = "@id"
+JSON_LD_TYPE_FIELD = "@type"
+
+# RICH SCHEMA type names
+RS_CONTEXT_TYPE_VALUE = 'ctx'
+RS_SCHEMA_TYPE_VALUE = 'sch'
+RS_ENCODING_TYPE_VALUE = 'enc'
+RS_MAPPING_TYPE_VALUE = 'map'
+RS_CRED_DEF_TYPE_VALUE = 'cdf'
+RS_PRES_DEF_TYPE_VALUE = 'pdf'
+
+# Specific Rich Schema Object's content fields:
+RS_CRED_DEF_SIG_TYPE = "signatureType"
+RS_CRED_DEF_MAPPING = "mapping"
+RS_CRED_DEF_SCHEMA = "schema"
+RS_CRED_DEF_PUB_KEY = "publicKey"
+
+RS_ENC_INPUT = "input"
+RS_ENC_OUTPUT = "output"
+RS_ENC_ALGORITHM = "algorithm"
+RS_ENC_TEST_VECS = "testVectors"
+RS_ENC_ID = "id"
+RS_ENC_TYPE = "type"
+RS_ENC_ALG_DESC = "description"
+RS_ENC_ALG_DOC = "documentation"
+RS_ENC_ALG_IMPL = "implementation"
+
+RS_MAPPING_ATTRIBUTES = "attributes"
+RS_MAPPING_ISSUER = "issuer"
+RS_MAPPING_ISSUANCE_DATE = "issuanceDate"
+RS_MAPPING_SCHEMA = "schema"
+RS_MAPPING_ENC = "enc"
+RS_MAPPING_RANK = "rank"
+
+# Flag content fields:
+FLAG_NAME = "name"
+FLAG_VALUE = "value"
+# Known flag names
+FLAG_NAME_COMPAT_ORDERING = "REV_STRATEGY_USE_COMPAT_ORDERING"
* [X] diff --git a/indy_common/exceptions.py b/indy_common/exceptions.py
Check history
git log --date-order --graph --decorate origin/stable ^origin/ubuntu-20.04-upgrade -p indy_common/exceptions.py
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
History is boring. Moving on. This file is fine. All changes are in ubuntu 20.04
index ce746a7b..40a817ec 100644
--- a/indy_common/exceptions.py
+++ b/indy_common/exceptions.py
@@ -6,14 +6,19 @@ class InvalidConnectionException(Exception):
pass
+class InvalidDIDDocException(Exception):
+ def __init__(self, reason: str):
+ self.reason = reason
+
+
class NotFound(RuntimeError):
pass
class ConnectionNotFound(NotFound):
- def __init__(self, name: str=None):
+ def __init__(self, name: str = None):
if name:
- self.reason = "Connection with name not found".format(name)
+ self.reason = "Connection with name not found {}".format(name)
class VerkeyNotFound(NotFound):
* [X] diff --git a/indy_common/identity.py b/indy_common/identity.py
This is a formatting change. Don't care.
index 62f12db6..eff6cbc2 100644
--- a/indy_common/identity.py
+++ b/indy_common/identity.py
@@ -12,7 +12,7 @@ from indy_common.types import Request
class Identity(GeneratesRequest):
def __init__(self,
identifier: Identifier,
- endorser: Identifier=None,
+ endorser: Identifier = None,
verkey=None,
role=None,
last_synced=None,
* [X] diff --git a/indy_common/req_utils.py b/indy_common/req_utils.py
Hmmm... check history.
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/req_utils.py
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 0bd19eff 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 0bd19eff3d8d5149a92e3e5c4756753c4b992f8e
Merge: 089d12e4 7dcf675c
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Fri Jun 22 15:03:07 2018 +0300
Merge remote-tracking branch 'public/master' into rc-1.4.63
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Great, changes are only in ubuntu 20.04. Moving on.
index 904d8be3..8eb3e0cc 100644
--- a/indy_common/req_utils.py
+++ b/indy_common/req_utils.py
@@ -1,7 +1,6 @@
from indy_common.constants import SCHEMA_NAME, SCHEMA_VERSION, SCHEMA_ATTR_NAMES, SCHEMA_FROM, \
- CONTEXT_NAME, CONTEXT_VERSION, CONTEXT_CONTEXT, CONTEXT_FROM, \
CLAIM_DEF_SIGNATURE_TYPE, CLAIM_DEF_SCHEMA_REF, CLAIM_DEF_TAG, CLAIM_DEF_PUBLIC_KEYS, CLAIM_DEF_FROM, \
- CLAIM_DEF_TAG_DEFAULT, CLAIM_DEF_CL, META
+ CLAIM_DEF_TAG_DEFAULT, CLAIM_DEF_CL
from plenum.common.constants import DATA
from plenum.common.request import Request
@@ -10,64 +9,6 @@ from plenum.common.txn_util import get_payload_data
# TODO: use data classes instead
-# Rich Schema
-# CONTEXT
-
-def get_write_context_name(req):
- return req.operation[META][CONTEXT_NAME]
-
-
-def get_txn_context_name(txn):
- return get_payload_data(txn)[META][CONTEXT_NAME]
-
-
-def get_write_context_version(req: Request):
- return req.operation[META][CONTEXT_VERSION]
-
-
-def get_txn_context_version(txn):
- return get_payload_data(txn)[META][CONTEXT_VERSION]
-
-
-def get_write_context_data(req: Request):
- return req.operation[DATA]
-
-
-def get_txn_context_data(txn):
- return get_payload_data(txn)[DATA]
-
-
-def get_txn_context_meta(txn):
- return get_payload_data(txn)[META]
-
-
-def get_read_context_name(req: Request):
- return req.operation[META][CONTEXT_NAME]
-
-
-def get_read_context_version(req: Request):
- return req.operation[META][CONTEXT_VERSION]
-
-
-def get_read_context_from(req: Request):
- return req.operation[CONTEXT_FROM]
-
-
-def get_reply_context_name(reply):
- return reply[DATA][CONTEXT_NAME]
-
-
-def get_reply_context_version(reply):
- return reply[DATA][CONTEXT_VERSION]
-
-
-def get_reply_context_context(reply):
- return reply[DATA].get(CONTEXT_CONTEXT)
-
-
-def get_reply_context_from(reply):
- return reply[CONTEXT_FROM]
-
# SCHEMA
* [X] diff --git a/indy_common/state/state_constants.py b/indy_common/state/state_constants.py
Check history...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/state/state_constants.py
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: 8940e559 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
Merge: 089d12e4 48199562
Author: Andrey Kononykhin <andkononykhin@gmail.com>
Date: Thu Jun 27 19:32:01 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
History is clean. Changes all come from ubuntu 20.04.
index 92d28317..ef7c4291 100644
--- a/indy_common/state/state_constants.py
+++ b/indy_common/state/state_constants.py
@@ -6,9 +6,11 @@ MARKER_REVOC_DEF = "4"
MARKER_REVOC_REG_ENTRY = "5"
MARKER_REVOC_REG_ENTRY_ACCUM = "6"
MARKER_CONTEXT = "7"
+MARKER_RS_SCHEMA = "8"
LAST_SEQ_NO = "lsn"
VALUE = "val"
LAST_UPDATE_TIME = "lut"
# CONFIG LEDGER
MARKER_AUTH_RULE = "1"
+MARKER_FLAG = "2"
* [X] diff --git a/indy_common/strict_types.py b/indy_common/strict_types.py
Check history
Again, history is boring... this makes things easy though...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/strict_types.py
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index 782ad516..348c369d 100644
--- a/indy_common/strict_types.py
+++ b/indy_common/strict_types.py
@@ -31,8 +31,7 @@ class strict_types:
if self.is_complex_type(type_b):
type_b = tuple(
- getattr(type_b, '__args__', None) or
- getattr(type_b, '__union_set_params__', None)
+ getattr(type_b, '__args__', None) or getattr(type_b, '__union_set_params__', None)
)
if self.is_complex_type(type_a):
* [X] diff --git a/indy_common/test/auth/metadata/test_auth_rule_with_metadata_complex.py b/indy_common/test/auth/metadata/test_auth_rule_with_metadata_complex.py
Check history...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/auth/metadata/test_auth_rule_with_metadata_complex.py
This shows changes. We need to identify what, if anything, we need here... This is a test, so perhaps we can ignore it, but it may indicate some deeper changes that we do need. Let us tag it for future investigation.
* [ ] TODO dig deeper into this task.
For example, the following may be of interest.
commit c840a122cec2fe7099b78b5f4f8bceac4cfd0de9
Author: ashcherbakov <alexander.sherbakov@dsr-company.com>
Date: Fri Aug 9 19:13:47 2019 +0300
Merge pull request #1408 from ashcherbakov/transaction-endorser
INDY-2199: Endorsers must be specified within the transaction
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
* [X] design/transaction_endorser.md | 7 ++++
* [X] indy_common/authorize/auth_request_validator.py | 8 +++--
* [X] indy_common/authorize/authorizer.py | 105 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++----
* [X] indy_common/test/auth/metadata/conftest.py | 10 ++++++
* [X] indy_common/test/auth/metadata/helper.py | 57 +++++++++++++++++++--------------
* [X] indy_common/test/auth/multi_sig/conftest.py | 13 ++++++--
* [X] indy_common/test/auth/multi_sig/test_auth_multi_sig_for_1_owner.py | 5 ++-
* [X] indy_common/test/auth/multi_sig/test_auth_multi_sig_for_5_owners.py | 5 ++-
* [X] indy_common/test/auth/test_endorser_authorizer.py | 204 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
* [X] indy_common/test/auth/test_role_authorizer.py | 28 +++++++++++++++-
* [X] indy_node/server/node.py | 1 -
* [X] indy_node/test/auth_rule/test_multisig_auth_rule.py | 15 ---------
* [X] indy_node/test/endorser/test_send_by_endorser.py | 117 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++--
* [X] indy_node/test/nym_txn/test_create_did_without_endorser.py | 27 +---------------
* [X] indy_node/test/upgrade/test_force_upgrade_process_in_view_change.py | 2 +-
* [X] indy_node/test/validator_info/test_validator_info.py | 2 +-
We seem to already have the changes for the above files.
* [ ] indy_common/test/auth/metadata/test_auth_rule_with_metadata_complex.py | 206 +++++++++++++++++++++++++++++++++++++++++++++++++---------------------------------------------------------------------
* [ ] indy_common/test/auth/metadata/test_auth_rule_with_metadata_composite.py | 316 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++-------------------------------------------
* [ ] indy_common/test/auth/metadata/test_auth_rule_with_metadata_simple.py | 298 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++------------
* [ ] indy_common/test/auth/metadata/test_error_messages.py | 43 +++++++++++++++----------
* [ ] indy_node/test/conftest.py | 24 +++++++++++---
21 files changed, 1170 insertions(+), 323 deletions(-)
I find it really odd that one file would be left out of the changes... 🤔 Or maybe five... 🤔 Don't know what to think of this...
Are the changes we are looking at in the commit? Perhaps something else happened here? 🤔
These changes look like changes that are occurring to the "missing" patch. Hmmm, let us look at the history of the file from the perspective of ubuntu 20.04. Perhaps we can find an identical version over there that was then modified further...
So we want to pay attention to the following info
* commit c840a122cec2fe7099b78b5f4f8bceac4cfd0de9
| Author: ashcherbakov <alexander.sherbakov@dsr-company.com>
| Date: Fri Aug 9 19:13:47 2019 +0300
|
| Merge pull request #1408 from ashcherbakov/transaction-endorser
|
| INDY-2199: Endorsers must be specified within the transaction
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
Perhaps we have something similar on the ubuntu branch? Um, no we don't. 🫠
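If we wanted to double-check, a sketch of how to search the ubuntu branch for the same work, by commit message and by content (INDY-2199 is the ticket from the commit above, and build_req_and_action is a helper touched by that patch, per the diff further down):
git log --oneline origin/ubuntu-20.04-upgrade --grep="INDY-2199"
git log --oneline -S"build_req_and_action" origin/ubuntu-20.04-upgrade -- indy_common/test/auth/metadata/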
We aren't the right people to know what to do with this. It seems non-obvious to me. Maybe Adam?
index 0c21e835..762b8b88 100644
--- a/indy_common/test/auth/metadata/test_auth_rule_with_metadata_complex.py
+++ b/indy_common/test/auth/metadata/test_auth_rule_with_metadata_complex.py
@@ -60,17 +60,20 @@ def test_plugin_and_or_rule_diff_role_trustee_no_endorser(write_auth_req_validat
valid_actions=[Action(author=TRUSTEE, endorser=None, sigs={ENDORSER: 3, TRUSTEE: s},
is_owner=owner, amount=3, extra_sigs=True)
for s in range(2, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={STEWARD: s1, ENDORSER: 3, TRUSTEE: s2},
- is_owner=owner, amount=3, extra_sigs=True)
- for s1 in range(2, 4)
- for s2 in range(1, MAX_SIG_COUNT + 1)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={STEWARD: s1, IDENTITY_OWNER: 3, TRUSTEE: s2},
- is_owner=True, amount=3, extra_sigs=True)
- for s1 in range(2, 4)
- for s2 in range(1, MAX_SIG_COUNT + 1)],
-
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={STEWARD: s1,
+ ENDORSER: 3,
+ TRUSTEE: s2},
+ is_owner=owner, amount=3, extra_sigs=True)
+ for s1 in range(2, 4)
+ for s2 in range(1, MAX_SIG_COUNT + 1)
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={STEWARD: s1,
+ IDENTITY_OWNER: 3,
+ TRUSTEE: s2},
+ is_owner=True, amount=3, extra_sigs=True)
+ for s1 in range(2, 4)
+ for s2 in range(1, MAX_SIG_COUNT + 1)],
author=TRUSTEE, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -105,20 +108,19 @@ def test_plugin_or_and_rule_diff_roles_trustee_no_endorser(write_auth_req_valida
valid_actions=[Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3},
is_owner=False, amount=None, extra_sigs=True),
Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3},
- is_owner=True, amount=None, extra_sigs=True)] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, STEWARD: s2},
- is_owner=owner, amount=None, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(2, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, ENDORSER: s2},
- is_owner=True, amount=2, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(2, 4)] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 2, IDENTITY_OWNER: 3},
- is_owner=True, amount=3, extra_sigs=True),
- Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, IDENTITY_OWNER: 3},
- is_owner=True, amount=3, extra_sigs=True)],
+ is_owner=True, amount=None, extra_sigs=True)] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, STEWARD: s2},
+ is_owner=owner, amount=None, extra_sigs=True)
+ for s1 in range(1, 4)
+ for s2 in range(2, 4)
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, ENDORSER: s2},
+ is_owner=True, amount=2, extra_sigs=True)
+ for s1 in range(1, 4) for s2 in range(2, 4)] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 2, IDENTITY_OWNER: 3},
+ is_owner=True, amount=3, extra_sigs=True),
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, IDENTITY_OWNER: 3},
+ is_owner=True, amount=3, extra_sigs=True)],
author=TRUSTEE, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -174,25 +176,25 @@ def test_plugin_complex_trustee_no_endorser(write_auth_req_validator, write_requ
is_owner=owner, amount=1, extra_sigs=True)
for s1 in range(1, 4)
for s2 in range(1, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, STEWARD: s},
- is_owner=owner, amount=1, extra_sigs=True)
- for s in range(2, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, IDENTITY_OWNER: s}, # 2d
- is_owner=True, amount=3, extra_sigs=True)
- for s in range(1, 4)] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, STEWARD: s}, # 3d
- is_owner=owner, amount=None, extra_sigs=True)
- for s in range(2, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, ENDORSER: s},
- is_owner=owner, amount=None, extra_sigs=True)
- for s in range(2, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 2, IDENTITY_OWNER: s}, # 4th
- is_owner=True, amount=2, extra_sigs=True)
- for s in range(1, 4)],
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, STEWARD: s},
+ is_owner=owner, amount=1, extra_sigs=True)
+ for s in range(2, 4)
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, IDENTITY_OWNER: s}, # 2d
+ is_owner=True, amount=3, extra_sigs=True)
+ for s in range(1, 4)] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, STEWARD: s}, # 3d
+ is_owner=owner, amount=None, extra_sigs=True)
+ for s in range(2, 4)
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 3, ENDORSER: s},
+ is_owner=owner, amount=None, extra_sigs=True)
+ for s in range(2, 4)
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: 2, IDENTITY_OWNER: s}, # 4th
+ is_owner=True, amount=2, extra_sigs=True)
+ for s in range(1, 4)],
author=TRUSTEE, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
* [X] diff --git a/indy_common/test/auth/metadata/test_auth_rule_with_metadata_composite.py b/indy_common/test/auth/metadata/test_auth_rule_with_metadata_composite.py
* [ ] TODO needs further investigation.
index a24a50e9..74ec144a 100644
--- a/indy_common/test/auth/metadata/test_auth_rule_with_metadata_composite.py
+++ b/indy_common/test/auth/metadata/test_auth_rule_with_metadata_composite.py
@@ -17,19 +17,21 @@ def test_plugin_or_rule_all_amount_trustee_no_endorser(write_auth_req_validator,
AuthConstraint(role=ENDORSER, sig_count=1, need_to_be_owner=True,
metadata={PLUGIN_FIELD: 3}),
]),
- valid_actions=[Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s},
- is_owner=owner, amount=1, extra_sigs=True)
+ valid_actions=[Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s}, is_owner=owner, amount=1,
+ extra_sigs=True)
for s in range(1, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, STEWARD: s2},
- is_owner=owner, amount=2, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(1, 4)
- for owner in [True, False]] +
- [Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, ENDORSER: s2},
- is_owner=True, amount=3, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(1, 4)],
+ for owner in [True, False]
+ ] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1,
+ STEWARD: s2},
+ is_owner=owner, amount=2, extra_sigs=True)
+ for s1 in range(1, 4)
+ for s2 in range(1, 4)
+ for owner in [True, False]] + [
+ Action(author=TRUSTEE, endorser=None, sigs={TRUSTEE: s1, ENDORSER: s2},
+ is_owner=True, amount=3, extra_sigs=True)
+ for s1 in range(1, 4)
+ for s2 in range(1, 4)],
author=TRUSTEE, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -125,11 +127,12 @@ def test_plugin_or_rule_one_amount_diff_roles_endorser_no_endorser(write_auth_re
valid_actions=[Action(author=ENDORSER, endorser=None, sigs={ENDORSER: s},
is_owner=owner, amount=None, extra_sigs=True)
for s in range(1, 4)
- for owner in [True, False]] +
- [Action(author=ENDORSER, endorser=None, sigs={ENDORSER: s1, IDENTITY_OWNER: s2},
- is_owner=True, amount=1, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(1, 4)],
+ for owner in [True, False]] + [
+ Action(author=ENDORSER, endorser=None, sigs={ENDORSER: s1,
+ IDENTITY_OWNER: s2},
+ is_owner=True, amount=1, extra_sigs=True)
+ for s1 in range(1, 4)
+ for s2 in range(1, 4)],
author=ENDORSER, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -167,11 +170,12 @@ def test_plugin_or_rule_one_amount_diff_roles_owner_endorser(write_auth_req_vali
is_owner=owner, amount=None, extra_sigs=True)
for s1 in range(1, 4)
for s2 in range(1, 4)
- for owner in [True, False]] +
- [Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s1, ENDORSER: s2},
- is_owner=True, amount=1, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(1, 4)],
+ for owner in [True, False]] + [
+ Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s1,
+ ENDORSER: s2},
+ is_owner=True, amount=1, extra_sigs=True)
+ for s1 in range(1, 4)
+ for s2 in range(1, 4)],
author=IDENTITY_OWNER, endorser=ENDORSER,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -191,10 +195,9 @@ def test_plugin_or_rule_one_amount_all_roles_endorser_no_endorser(write_auth_req
valid_actions=[Action(author=ENDORSER, endorser=None, sigs={ENDORSER: s},
is_owner=owner, amount=None, extra_sigs=True)
for s in range(1, 4)
- for owner in [True, False]] +
- [Action(author=ENDORSER, endorser=None, sigs={ENDORSER: s},
- is_owner=True, amount=3, extra_sigs=True)
- for s in range(1, 4)],
+ for owner in [True, False]] + [
+ Action(author=ENDORSER, endorser=None, sigs={ENDORSER: s}, is_owner=True, amount=3, extra_sigs=True)
+ for s in range(1, 4)],
author=ENDORSER, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -234,11 +237,11 @@ def test_plugin_or_rule_one_amount_all_roles_owner_endorser(write_auth_req_valid
is_owner=owner, amount=None, extra_sigs=True)
for s1 in range(1, 4)
for s2 in range(1, 4)
- for owner in [True, False]] +
- [Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s1, ENDORSER: s2},
- is_owner=True, amount=3, extra_sigs=True)
- for s1 in range(1, 4)
- for s2 in range(1, 4)],
+ for owner in [True, False]] + [
+ Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s1, ENDORSER: s2},
+ is_owner=True, amount=3, extra_sigs=True)
+ for s1 in range(1, 4)
+ for s2 in range(1, 4)],
author=IDENTITY_OWNER, endorser=ENDORSER,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
@@ -301,11 +304,12 @@ def test_plugin_or_rule_diff_amount_same_role_owner_endorser(write_auth_req_vali
valid_actions=[Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s, ENDORSER: 2},
is_owner=owner, amount=2, extra_sigs=True)
for s in range(1, 4)
- for owner in [True, False]] +
- [Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s, ENDORSER: 3},
- is_owner=owner, amount=1, extra_sigs=True)
- for s in range(1, 4)
- for owner in [True, False]],
+ for owner in [True, False]] + [
+ Action(author=IDENTITY_OWNER, endorser=ENDORSER, sigs={IDENTITY_OWNER: s,
+ ENDORSER: 3},
+ is_owner=owner, amount=1, extra_sigs=True)
+ for s in range(1, 4)
+ for owner in [True, False]],
author=IDENTITY_OWNER, endorser=ENDORSER,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
* [X] diff --git a/indy_common/test/auth/metadata/test_auth_rule_with_metadata_simple.py b/indy_common/test/auth/metadata/test_auth_rule_with_metadata_simple.py
* [ ] TODO needs further investigation.
index de7ce7a8..1bfce171 100644
--- a/indy_common/test/auth/metadata/test_auth_rule_with_metadata_simple.py
+++ b/indy_common/test/auth/metadata/test_auth_rule_with_metadata_simple.py
@@ -260,11 +260,11 @@ def test_plugin_simple_rule_0_sig_owner_no_endorser(write_auth_req_validator, wr
metadata={PLUGIN_FIELD: 2}),
valid_actions=[Action(author=IDENTITY_OWNER, endorser=None, sigs={},
is_owner=owner, amount=2, extra_sigs=False)
- for owner in [True, False]] +
- [Action(author=IDENTITY_OWNER, endorser=None, sigs={IDENTITY_OWNER: s},
- is_owner=owner, amount=2, extra_sigs=False)
- for owner in [True, False]
- for s in range(1, MAX_SIG_COUNT + 1)],
+ for owner in [True, False]] + [
+ Action(author=IDENTITY_OWNER, endorser=None, sigs={IDENTITY_OWNER: s},
+ is_owner=owner, amount=2, extra_sigs=False)
+ for owner in [True, False]
+ for s in range(1, MAX_SIG_COUNT + 1)],
author=IDENTITY_OWNER, endorser=None,
all_signatures=signatures, is_owner=is_owner, amount=amount,
write_auth_req_validator=write_auth_req_validator,
* [X] diff --git a/indy_common/test/auth/metadata/test_error_messages.py b/indy_common/test/auth/metadata/test_error_messages.py
This change seems pretty straightforward. This actually looks okay. The changes that are only in stable don't matter...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/auth/metadata/test_error_messages.py
* commit c840a122cec2fe7099b78b5f4f8bceac4cfd0de9
| Author: ashcherbakov <alexander.sherbakov@dsr-company.com>
| Date: Fri Aug 9 19:13:47 2019 +0300
|
| Merge pull request #1408 from ashcherbakov/transaction-endorser
|
| INDY-2199: Endorsers must be specified within the transaction
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
| diff --git a/indy_common/test/auth/metadata/test_error_messages.py b/indy_common/test/auth/metadata/test_error_messages.py
| index 6b81256d..f4f9dbd2 100644
| --- a/indy_common/test/auth/metadata/test_error_messages.py
| +++ b/indy_common/test/auth/metadata/test_error_messages.py
| @@ -4,7 +4,7 @@ from collections import OrderedDict
| import pytest
|
| from indy_common.authorize.auth_constraints import AuthConstraint, IDENTITY_OWNER, AuthConstraintOr
| -from indy_common.test.auth.metadata.helper import set_auth_constraint, PLUGIN_FIELD, build_req_and_action
| +from indy_common.test.auth.metadata.helper import set_auth_constraint, PLUGIN_FIELD, build_req_and_action, Action
| from plenum.common.constants import TRUSTEE, STEWARD
| from plenum.common.exceptions import UnauthorizedClientRequest
|
| @@ -15,9 +15,11 @@ def test_plugin_simple_error_msg_no_plugin_field(write_auth_req_validator):
| set_auth_constraint(write_auth_req_validator,
| AuthConstraint(role=IDENTITY_OWNER, sig_count=1, need_to_be_owner=True,
| metadata={PLUGIN_FIELD: 2}))
| - req, actions = build_req_and_action(signatures={IDENTITY_OWNER: 1},
| - need_to_be_owner=True,
| - amount=None)
| + req, actions = build_req_and_action(Action(author=IDENTITY_OWNER, endorser=None,
| + sigs={IDENTITY_OWNER: 1},
| + is_owner=True,
| + amount=None,
| + extra_sigs=False))
|
| with pytest.raises(UnauthorizedClientRequest) as excinfo:
| write_auth_req_validator.validate(req, actions)
| @@ -27,9 +29,11 @@ def test_plugin_simple_error_msg_no_plugin_field(write_auth_req_validator):
| def test_plugin_simple_error_msg_extra_plugin_field(write_auth_req_validator):
| set_auth_constraint(write_auth_req_validator,
| AuthConstraint(role=IDENTITY_OWNER, sig_count=1, need_to_be_owner=True))
| - req, actions = build_req_and_action(signatures={IDENTITY_OWNER: 1},
| - need_to_be_owner=True,
| - amount=5)
| + req, actions = build_req_and_action(Action(author=IDENTITY_OWNER, endorser=None,
| + sigs={IDENTITY_OWNER: 1},
| + is_owner=True,
| + amount=5,
| + extra_sigs=False))
|
| with pytest.raises(UnauthorizedClientRequest) as excinfo:
| write_auth_req_validator.validate(req, actions)
| @@ -40,9 +44,11 @@ def test_plugin_simple_error_msg_not_enough_amount(write_auth_req_validator):
| set_auth_constraint(write_auth_req_validator,
| AuthConstraint(role=IDENTITY_OWNER, sig_count=1, need_to_be_owner=True,
| metadata={PLUGIN_FIELD: 10}))
| - req, actions = build_req_and_action(signatures={IDENTITY_OWNER: 1},
| - need_to_be_owner=True,
| - amount=5)
| + req, actions = build_req_and_action(Action(author=IDENTITY_OWNER, endorser=None,
| + sigs={IDENTITY_OWNER: 1},
| + is_owner=True,
| + amount=5,
| + extra_sigs=False))
|
.
.
.
index f4f9dbd2..26af15ef 100644
--- a/indy_common/test/auth/metadata/test_error_messages.py
+++ b/indy_common/test/auth/metadata/test_error_messages.py
@@ -77,7 +77,7 @@ def test_plugin_or_error_msg_not_enough_amount(write_auth_req_validator):
"Constraint: 1 TRUSTEE signature is required, Error: Not enough TRUSTEE signatures",
"Constraint: 1 STEWARD signature is required with additional metadata new_field 10, Error: not enough amount in plugin field"
])
- assert expected in str(excinfo.value.args[0])
+ assert expected in str(excinfo.value.reason)
def test_plugin_or_error_msg_not_enough_amount_multiple_metadata_fields(write_auth_req_validator):
@@ -104,4 +104,4 @@ def test_plugin_or_error_msg_not_enough_amount_multiple_metadata_fields(write_au
"Constraint: 1 TRUSTEE signature is required, Error: Not enough TRUSTEE signatures",
"Constraint: 1 STEWARD signature is required with additional metadata new_field 10 aaa bbb, Error: not enough amount in plugin field"
])
- assert expected in str(excinfo.value.args[0])
+ assert expected in str(excinfo.value.reason)
* [X] diff --git a/indy_common/test/auth/test_auth_constraint.py b/indy_common/test/auth/test_auth_constraint.py
Just merges, we are good.
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/auth/test_auth_constraint.py
* commit b4f6928f4f5121f7c8070aca326540094515a04c
| Merge: 8940e559 522fee89
| Author: Sergey Khoroshavin <sergey.khoroshavin@dsr-corporation.com>
| Date: Tue Jul 30 02:57:19 2019 +0300
|
| Merge remote-tracking branch 'upstream/master' into rc-1.9.1.rc1
|
| Signed-off-by: Sergey Khoroshavin <sergey.khoroshavin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: a8784f11 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: c009c3f0 d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
Merge: 089d12e4 16697cf9
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Feb 6 09:41:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.6.83
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
index fd6828d9..91075cfd 100644
--- a/indy_common/test/auth/test_auth_constraint.py
+++ b/indy_common/test/auth/test_auth_constraint.py
@@ -62,7 +62,7 @@ def test_str_any_1_sig_not_owner():
constraint = AuthConstraint(role='*',
sig_count=1,
need_to_be_owner=False)
- assert str(constraint) == '1 signature of any role is required'
+ assert str(constraint) == '1 signature of any role is required ALL'
def test_str_any_several_sig_not_owner():
* [X] diff --git a/indy_common/test/auth/test_auth_nym_with_new_auth_map.py b/indy_common/test/auth/test_auth_nym_with_new_auth_map.py
We are good.
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/auth/test_auth_nym_with_new_auth_map.py
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
| Merge: c009c3f0 48199562
| Author: Andrey Kononykhin <andkononykhin@gmail.com>
| Date: Thu Jun 27 19:32:01 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
|
| Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
Merge: 089d12e4 16697cf9
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Feb 6 09:41:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.6.83
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
index 3e10c731..d3cd4788 100644
--- a/indy_common/test/auth/test_auth_nym_with_new_auth_map.py
+++ b/indy_common/test/auth/test_auth_nym_with_new_auth_map.py
@@ -306,6 +306,6 @@ def test_change_verkey(write_request_validation, req, is_owner):
assert authorized == write_request_validation(req,
[AuthActionEdit(txn_type=NYM,
field=VERKEY,
- old_value="_verkey".format(req.identifier),
+ old_value="_verkey {}".format(req.identifier),
new_value='new_value',
is_owner=is_owner)])
* [X] diff --git a/indy_common/test/auth/test_auth_context_with_new_auth_map.py b/indy_common/test/auth/test_auth_rich_schema_objects_with_new_auth_map.py
No changes in stable.
similarity index 55%
rename from indy_common/test/auth/test_auth_context_with_new_auth_map.py
rename to indy_common/test/auth/test_auth_rich_schema_objects_with_new_auth_map.py
index e74fd5db..e8368941 100644
--- a/indy_common/test/auth/test_auth_context_with_new_auth_map.py
+++ b/indy_common/test/auth/test_auth_rich_schema_objects_with_new_auth_map.py
@@ -1,19 +1,28 @@
+import pytest
+
from indy_common.authorize.auth_actions import AuthActionAdd, AuthActionEdit
-from indy_common.constants import SET_CONTEXT
+from indy_common.constants import JSON_LD_CONTEXT, RICH_SCHEMA, RICH_SCHEMA_ENCODING, \
+ RICH_SCHEMA_MAPPING, RICH_SCHEMA_CRED_DEF, RICH_SCHEMA_PRES_DEF
-def test_context_adding(write_request_validation, req, is_owner):
+@pytest.mark.parametrize('txn_type',
+ [JSON_LD_CONTEXT, RICH_SCHEMA, RICH_SCHEMA_ENCODING, RICH_SCHEMA_MAPPING,
+ RICH_SCHEMA_CRED_DEF, RICH_SCHEMA_PRES_DEF])
+def test_rich_schema_object_adding(write_request_validation, req, is_owner, txn_type):
authorized = req.identifier in ("trustee_identifier", "steward_identifier", "endorser_identifier")
assert authorized == write_request_validation(req,
- [AuthActionAdd(txn_type=SET_CONTEXT,
+ [AuthActionAdd(txn_type=txn_type,
field='some_field',
value='some_value',
is_owner=is_owner)])
-def test_context_editing(write_request_validation, req, is_owner):
+@pytest.mark.parametrize('txn_type',
+ [JSON_LD_CONTEXT, RICH_SCHEMA, RICH_SCHEMA_ENCODING, RICH_SCHEMA_MAPPING,
+ RICH_SCHEMA_CRED_DEF, RICH_SCHEMA_PRES_DEF])
+def test_rich_schema_object_editing(write_request_validation, req, is_owner, txn_type):
assert not write_request_validation(req,
- [AuthActionEdit(txn_type=SET_CONTEXT,
+ [AuthActionEdit(txn_type=txn_type,
field='some_field',
old_value='old_value',
new_value='new_value',
* [X] diff --git a/indy_common/test/conftest.py b/indy_common/test/conftest.py
Stable branch only has merges. Moving on.
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/conftest.py
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: c009c3f0 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit c009c3f04bb8aace76a7d263ae81905f4554418c
| Merge: f9e80160 16697cf9
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Feb 6 09:41:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.6.83
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit f9e80160c7e68a38c2b96a865b44f95a30b4ef68
| Merge: 0bd19eff 1204b349
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Aug 9 18:00:37 2018 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.6.69
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 0bd19eff3d8d5149a92e3e5c4756753c4b992f8e
| Merge: 135dfcf3 7dcf675c
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Fri Jun 22 15:03:07 2018 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.4.63
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 135dfcf3c36857a619fb4dad70a7855bda544357
| Merge: 19586a3c 5fdb50cc
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:01 2018 +0300
|
| Merge remote-tracking branch 'origin/master' into stable-51
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index 71597480..236b8f69 100644
--- a/indy_common/test/conftest.py
+++ b/indy_common/test/conftest.py
@@ -16,7 +16,7 @@ strict_types.defaultShouldCheck = True
# noinspection PyUnresolvedReferences
from plenum.test.conftest import GENERAL_CONFIG_DIR, \
- txnPoolNodesLooper, overriddenConfigValues # noqa
+ overriddenConfigValues # noqa
logger = getlogger()
* [X] diff --git a/indy_common/test/test_compat_set.py b/indy_common/test/test_compat_set.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/test_compat_set.py
No changes that aren't already in ubuntu 20.04.
new file mode 100644
index 00000000..7e9ee81d
--- /dev/null
+++ b/indy_common/test/test_compat_set.py
@@ -0,0 +1,502 @@
+from indy_common.compat_set import CompatSet
+
+
+def _order_test(old, diff, addition, expected):
+ val = CompatSet(old).difference(diff)
+ val.update(addition)
+ assert list(val) == expected
+
+
+def test_empty_set():
+ val = CompatSet()
+ assert len(val) == 0
+ assert 99 not in val
+ assert None not in val
+ assert "" not in val
+ assert "1" not in val
+
+ assert val == CompatSet()
+ assert val != CompatSet([1])
+ assert val == set()
+ assert val != {1}
+ assert val != dict()
+
+
+def test_simple_set():
+ val = CompatSet([1, 2, 3])
+ assert len(val) == 3
+
+ assert 2 in val
+ assert 99 not in val
+ assert None not in val
+ assert "" not in val
+ assert "1" not in val
+
+ assert val == CompatSet([2, 1, 3])
+ assert val == {1, 2, 3}
+ assert val != {1}
+ assert val != set()
+ assert val != dict()
+
+
+def test_copy_simple():
+ val = CompatSet([12, 4, 5, 6])
+ assert list(val) == [5, 12, 4, 6]
+ val = val.copy()
+ assert list(val) == [5, 12, 4, 6]
+
+
+def test_copy_removed():
+ val = CompatSet([12, 4, 5, 6, 3])
+ assert list(val) == [5, 3, 12, 4, 6]
+ val.remove(5)
+ assert list(val) == [3, 12, 4, 6]
+ val = val.copy()
+ assert list(val) == [3, 12, 4, 6]
+
+
+def test_difference():
+ val = CompatSet([1, 2])
+ dif = val.difference([2, 3])
+ assert val == CompatSet([1, 2])
+ assert dif == CompatSet([1])
+
+ assert 1 in dif
+ assert 2 not in dif
+
+ assert CompatSet([1, 2]) - {2, 3} == dif
+ val -= {2, 3}
+ assert val == dif
+ assert CompatSet([1, 2]) - set() == {1, 2}
+ assert CompatSet([1, 2]) - {1} == {2}
+
+
+def test_intersection():
+ val = CompatSet([1, 2])
+ dif = val.intersection([2, 3])
+ assert val == CompatSet([1, 2])
+ assert dif == CompatSet([2])
+
+ assert 2 in dif
+ assert 1 not in dif
+
+ assert CompatSet([1, 2]) & {2, 3} == dif
+ val = CompatSet([1, 2])
+ val &= {2, 3}
+ assert val == dif
+
+
+def test_isdisjoint():
+ val = CompatSet([1, 2])
+ assert val.isdisjoint(set())
+ assert val.isdisjoint({3})
+ assert not val.isdisjoint({2, 3})
+
+
+def test_issubset():
+ val = CompatSet([1, 2])
+ assert val.issubset({1, 2, 3})
+ assert val.issubset({1, 2})
+ assert not val.issubset({1})
+ assert not val.issubset(set())
+ assert CompatSet().issubset({1})
+
+
+def test_issuperset():
+ val = CompatSet([1, 2])
+ assert val.issuperset(set())
+ assert val.issuperset({1})
+ assert val.issuperset({1, 2})
+ assert not val.issuperset({1, 2, 3})
+ assert CompatSet().issuperset(set())
+ assert not CompatSet().issuperset({1})
+
+
+def test_symmetric_difference():
+ val = CompatSet([1, 2])
+ dif = val.symmetric_difference([2, 3])
+ assert val == CompatSet([1, 2])
+ assert dif == CompatSet([1, 3])
+
+ assert 1 in dif
+ assert 2 not in dif
+
+ assert CompatSet([1, 2]) ^ {2, 3} == dif
+ val ^= {2, 3}
+ assert val == dif
+ assert CompatSet([1, 2]) ^ set() == {1, 2}
+ assert CompatSet([1, 2]) ^ {1} == {2}
+
+
+def test_update():
+ val = CompatSet([1, 2])
+ assert 2 in val
+ assert 3 not in val
+ val.update([3, 4])
+ assert 3 in val
+ assert 2 in val
+ assert 99 not in val
+
+ assert CompatSet([1, 2]) | {3} == {1, 2, 3}
+ val = CompatSet([1, 2])
+ val |= {3}
+ assert val == {1, 2, 3}
+
+
+def test_update_order():
+ val = CompatSet([1, 7, 6, 5])
+ assert list(val) == [1, 5, 6, 7]
+ val.update([9])
+ assert list(val) == [1, 9, 5, 6, 7]
+ val.update([8])
+ assert list(val) == [1, 5, 6, 7, 8, 9]
+
+
+def test_difference_order():
+ val = CompatSet([5, 6, 8, 9, 10, 11, 12]).difference([])
+ assert list(val) == [5, 6, 8, 9, 10, 11, 12]
+ val.update([16])
+ assert list(val) == [16, 5, 6, 8, 9, 10, 11, 12]
+
+ val = CompatSet([5, 6, 8, 9, 10, 11, 12])
+ assert list(val) == [5, 6, 8, 9, 10, 11, 12]
+ val.update([16])
+ assert list(val) == [5, 6, 8, 9, 10, 11, 12, 16]
+
+
+def test_order_simple():
+ _order_test([12, 4, 5, 6], [], [13], [5, 13, 12, 4, 6])
+
+
+def test_order_large():
+ _order_test(
+ [
+ 5,
+ 6,
+ 8,
+ 9,
+ 10,
+ 11,
+ 12,
+ 13,
+ 16,
+ 17,
+ 18,
+ 19,
+ 161,
+ 162,
+ 164,
+ 165,
+ 166,
+ 167,
+ 41,
+ 169,
+ 170,
+ 171,
+ 173,
+ 174,
+ 175,
+ 176,
+ 441,
+ 178,
+ 179,
+ 180,
+ 181,
+ 182,
+ 183,
+ 440,
+ 184,
+ 185,
+ 186,
+ 436,
+ 437,
+ 438,
+ 190,
+ 63,
+ 64,
+ 192,
+ 194,
+ 67,
+ 70,
+ 199,
+ 201,
+ 202,
+ 206,
+ ],
+ [],
+ [442],
+ [
+ 5,
+ 6,
+ 8,
+ 9,
+ 10,
+ 11,
+ 12,
+ 13,
+ 16,
+ 17,
+ 18,
+ 19,
+ 161,
+ 162,
+ 164,
+ 165,
+ 166,
+ 167,
+ 41,
+ 169,
+ 170,
+ 171,
+ 173,
+ 174,
+ 175,
+ 176,
+ 442,
+ 178,
+ 179,
+ 180,
+ 181,
+ 182,
+ 183,
+ 440,
+ 441,
+ 184,
+ 185,
+ 186,
+ 436,
+ 437,
+ 438,
+ 190,
+ 63,
+ 64,
+ 192,
+ 194,
+ 67,
+ 70,
+ 199,
+ 201,
+ 202,
+ 206,
+ ],
+ )
+
+
+def test_order_fill():
+ _order_test(
+ [
+ 1,
+ 2,
+ 3,
+ 4,
+ 5,
+ 6,
+ 7,
+ 8,
+ 9,
+ 10,
+ 11,
+ 12,
+ 13,
+ 14,
+ 15,
+ 16,
+ 17,
+ 18,
+ 19,
+ 20,
+ 21,
+ 22,
+ 23,
+ 24,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ ],
+ [],
+ [
+ 83,
+ 38,
+ 101,
+ 103,
+ 56,
+ 61,
+ 70,
+ 73,
+ 53,
+ 39,
+ 77,
+ 92,
+ 82,
+ 64,
+ 48,
+ 78,
+ 51,
+ 68,
+ 96,
+ 102,
+ 37,
+ 84,
+ 35,
+ 58,
+ 59,
+ 91,
+ 90,
+ 85,
+ 31,
+ 36,
+ 72,
+ 104,
+ 60,
+ 43,
+ 74,
+ 44,
+ 98,
+ 87,
+ 34,
+ 100,
+ 63,
+ 95,
+ 52,
+ 99,
+ 80,
+ 54,
+ 45,
+ 40,
+ 30,
+ 75,
+ 81,
+ 41,
+ 86,
+ 89,
+ 55,
+ 42,
+ 71,
+ 57,
+ 32,
+ 47,
+ 76,
+ 69,
+ 33,
+ 46,
+ 79,
+ 94,
+ 66,
+ 93,
+ 50,
+ 88,
+ 67,
+ 97,
+ 49,
+ 65,
+ 62,
+ ],
+ [
+ 1,
+ 2,
+ 3,
+ 4,
+ 5,
+ 6,
+ 7,
+ 8,
+ 9,
+ 10,
+ 11,
+ 12,
+ 13,
+ 14,
+ 15,
+ 16,
+ 17,
+ 18,
+ 19,
+ 20,
+ 21,
+ 22,
+ 23,
+ 24,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 30,
+ 31,
+ 32,
+ 33,
+ 34,
+ 35,
+ 36,
+ 37,
+ 38,
+ 39,
+ 40,
+ 41,
+ 42,
+ 43,
+ 44,
+ 45,
+ 46,
+ 47,
+ 48,
+ 49,
+ 50,
+ 51,
+ 52,
+ 53,
+ 54,
+ 55,
+ 56,
+ 57,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 64,
+ 65,
+ 66,
+ 67,
+ 68,
+ 69,
+ 70,
+ 71,
+ 72,
+ 73,
+ 74,
+ 75,
+ 76,
+ 77,
+ 78,
+ 79,
+ 80,
+ 81,
+ 82,
+ 83,
+ 84,
+ 85,
+ 86,
+ 87,
+ 88,
+ 89,
+ 90,
+ 91,
+ 92,
+ 93,
+ 94,
+ 95,
+ 96,
+ 97,
+ 98,
+ 99,
+ 100,
+ 101,
+ 102,
+ 103,
+ 104,
+ ],
+ )
* [X] diff --git a/indy_common/test/test_strict_types.py b/indy_common/test/test_strict_types.py
Branch only includes merges
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/test_strict_types.py
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index 0044a030..9ada510d 100644
--- a/indy_common/test/test_strict_types.py
+++ b/indy_common/test/test_strict_types.py
@@ -12,14 +12,6 @@ def takesStr(s: str) -> int:
pass
-@strict_types()
-def takesUnion(s: typing.Union[str, None]) -> int:
- try:
- return int(s)
- except ValueError:
- pass
-
-
def testInvalidArgumentType():
with pytest.raises(TypeError):
takesStr(1)
@@ -34,10 +26,6 @@ def testValidInputAndReturn():
takesStr('1')
-def testWorksWithComplexTypes():
- takesUnion('1')
-
-
@decClassMethods(strict_types())
class TestClass:
* [X] diff --git a/indy_common/test/test_transactions.py b/indy_common/test/test_transactions.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/test_transactions.py
All commits are merges, so everything is new in ubuntu 20.04.
index e5176176..f74055ee 100644
--- a/indy_common/test/test_transactions.py
+++ b/indy_common/test/test_transactions.py
@@ -1,7 +1,11 @@
from indy_common.constants import NYM, NODE, ATTRIB, SCHEMA, CLAIM_DEF, DISCLO, GET_ATTR, GET_NYM, GET_TXNS, \
- GET_SCHEMA, SET_CONTEXT, GET_CONTEXT, GET_CLAIM_DEF, POOL_UPGRADE, NODE_UPGRADE, POOL_CONFIG, REVOC_REG_DEF, REVOC_REG_ENTRY, \
+ GET_SCHEMA, GET_CLAIM_DEF, POOL_UPGRADE, NODE_UPGRADE, \
+ POOL_CONFIG, REVOC_REG_DEF, REVOC_REG_ENTRY, \
GET_REVOC_REG_DEF, GET_REVOC_REG, GET_REVOC_REG_DELTA, POOL_RESTART, VALIDATOR_INFO, CHANGE_KEY, AUTH_RULE, \
- GET_AUTH_RULE, AUTH_RULES
+ GET_AUTH_RULE, AUTH_RULES, RICH_SCHEMA, RICH_SCHEMA_ENCODING, \
+ RICH_SCHEMA_MAPPING, RICH_SCHEMA_CRED_DEF, JSON_LD_CONTEXT, GET_RICH_SCHEMA_OBJECT_BY_METADATA, \
+ GET_RICH_SCHEMA_OBJECT_BY_ID, RICH_SCHEMA_PRES_DEF
+
from indy_common.transactions import IndyTransactions
@@ -35,8 +39,14 @@ def test_transactions_are_encoded():
assert GET_AUTH_RULE == "121"
assert AUTH_RULES == "122"
- assert SET_CONTEXT == "200"
- assert GET_CONTEXT == "300"
+ assert JSON_LD_CONTEXT == "200"
+ assert RICH_SCHEMA == "201"
+ assert RICH_SCHEMA_ENCODING == "202"
+ assert RICH_SCHEMA_MAPPING == "203"
+ assert RICH_SCHEMA_CRED_DEF == "204"
+ assert RICH_SCHEMA_PRES_DEF == "205"
+ assert GET_RICH_SCHEMA_OBJECT_BY_ID == "300"
+ assert GET_RICH_SCHEMA_OBJECT_BY_METADATA == "301"
def test_transaction_enum_decoded():
@@ -69,8 +79,14 @@ def test_transaction_enum_decoded():
assert IndyTransactions.VALIDATOR_INFO.name == "VALIDATOR_INFO"
- assert IndyTransactions.SET_CONTEXT.name == "SET_CONTEXT"
- assert IndyTransactions.GET_CONTEXT.name == "GET_CONTEXT"
+ assert IndyTransactions.JSON_LD_CONTEXT.name == "JSON_LD_CONTEXT"
+ assert IndyTransactions.RICH_SCHEMA.name == "RICH_SCHEMA"
+ assert IndyTransactions.RICH_SCHEMA_ENCODING.name == "RICH_SCHEMA_ENCODING"
+ assert IndyTransactions.RICH_SCHEMA_MAPPING.name == "RICH_SCHEMA_MAPPING"
+ assert IndyTransactions.RICH_SCHEMA_CRED_DEF.name == "RICH_SCHEMA_CRED_DEF"
+ assert IndyTransactions.RICH_SCHEMA_PRES_DEF.name == "RICH_SCHEMA_PRES_DEF"
+ assert IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_ID.name == "GET_RICH_SCHEMA_OBJECT_BY_ID"
+ assert IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_METADATA.name == "GET_RICH_SCHEMA_OBJECT_BY_METADATA"
def test_transaction_enum_encoded():
@@ -99,8 +115,14 @@ def test_transaction_enum_encoded():
assert IndyTransactions.POOL_RESTART.value == "118"
assert IndyTransactions.VALIDATOR_INFO.value == "119"
- assert IndyTransactions.SET_CONTEXT.value == "200"
- assert IndyTransactions.GET_CONTEXT.value == "300"
+ assert IndyTransactions.JSON_LD_CONTEXT.value == "200"
+ assert IndyTransactions.RICH_SCHEMA.value == "201"
+ assert IndyTransactions.RICH_SCHEMA_ENCODING.value == "202"
+ assert IndyTransactions.RICH_SCHEMA_MAPPING.value == "203"
+ assert IndyTransactions.RICH_SCHEMA_CRED_DEF.value == "204"
+ assert IndyTransactions.RICH_SCHEMA_PRES_DEF.value == "205"
+ assert IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_ID.value == "300"
+ assert IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_METADATA.value == "301"
def test_get_name_from_code():
@@ -130,7 +152,19 @@ def test_get_name_from_code():
assert IndyTransactions.get_name_from_code(IndyTransactions.GET_REVOC_REG_DELTA.value) == "GET_REVOC_REG_DELTA"
assert IndyTransactions.get_name_from_code(IndyTransactions.VALIDATOR_INFO.value) == "VALIDATOR_INFO"
- assert IndyTransactions.get_name_from_code(IndyTransactions.SET_CONTEXT.value) == "SET_CONTEXT"
- assert IndyTransactions.get_name_from_code(IndyTransactions.GET_CONTEXT.value) == "GET_CONTEXT"
+ assert IndyTransactions.get_name_from_code(IndyTransactions.JSON_LD_CONTEXT.value) == "JSON_LD_CONTEXT"
+ assert IndyTransactions.get_name_from_code(IndyTransactions.RICH_SCHEMA.value) == "RICH_SCHEMA"
+ assert IndyTransactions.get_name_from_code(
+ IndyTransactions.RICH_SCHEMA_ENCODING.value) == "RICH_SCHEMA_ENCODING"
+ assert IndyTransactions.get_name_from_code(
+ IndyTransactions.RICH_SCHEMA_MAPPING.value) == "RICH_SCHEMA_MAPPING"
+ assert IndyTransactions.get_name_from_code(
+ IndyTransactions.RICH_SCHEMA_CRED_DEF.value) == "RICH_SCHEMA_CRED_DEF"
+ assert IndyTransactions.get_name_from_code(
+ IndyTransactions.RICH_SCHEMA_PRES_DEF.value) == "RICH_SCHEMA_PRES_DEF"
+ assert IndyTransactions.get_name_from_code(
+ IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_ID.value) == "GET_RICH_SCHEMA_OBJECT_BY_ID"
+ assert IndyTransactions.get_name_from_code(
+ IndyTransactions.GET_RICH_SCHEMA_OBJECT_BY_METADATA.value) == "GET_RICH_SCHEMA_OBJECT_BY_METADATA"
assert IndyTransactions.get_name_from_code("some_unexpected_code") == "Unknown_transaction_type"
* [X] diff --git a/indy_common/test/test_util.py b/indy_common/test/test_util.py
* [ ] TODO this change is missing
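Whoever picks up this TODO could start by locating the commit that introduced the f-string form on the ubuntu branch; a sketch, with the -S search string copied from the diff below:
git log --oneline -S"f'dpkg -s {pkg_name}'" origin/ubuntu-20.04-upgrade -- indy_common/test/test_util.py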
index eb6ed89d..e9653899 100644
--- a/indy_common/test/test_util.py
+++ b/indy_common/test/test_util.py
@@ -28,14 +28,15 @@ def test_getIndex():
]
)
def test_compose_cmd(pkg_name, package):
- expected_cmd = 'dpkg -s {}'.format(pkg_name)
+ expected_cmd = f'dpkg -s {pkg_name}'
+
cmd = compose_cmd(['dpkg', '-s', package])
assert expected_cmd == cmd
def test_compose_cmd_allows_whitespace():
pkg_name = 'package_7 some_other_package'
- expected_cmd = 'dpkg -s {}'.format(pkg_name)
+ expected_cmd = f'dpkg -s {pkg_name}'
cmd = compose_cmd(['dpkg', '-s', pkg_name])
assert expected_cmd == cmd
* [X] diff --git a/indy_common/test/types/test_attrib.py b/indy_common/test/types/test_attrib.py
Just merges, don't need to worry.
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/types/test_attrib.py
* commit 0bd19eff3d8d5149a92e3e5c4756753c4b992f8e
| Merge: 135dfcf3 7dcf675c
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Fri Jun 22 15:03:07 2018 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.4.63
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 135dfcf3c36857a619fb4dad70a7855bda544357
| Merge: 19586a3c 5fdb50cc
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:01 2018 +0300
|
| Merge remote-tracking branch 'origin/master' into stable-51
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index affc0857..b0b45c48 100644
--- a/indy_common/test/types/test_attrib.py
+++ b/indy_common/test/types/test_attrib.py
@@ -24,7 +24,7 @@ def test_attrib_with_enc_raw_hash_at_same_time_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: only one field "
+ ex_info.match(r"validation error \[ClientAttribOperation\]: only one field "
"from {}, {}, {} is expected"
"".format(RAW, ENC, HASH))
@@ -37,7 +37,7 @@ def test_attrib_without_enc_raw_hash_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: missed fields - {}, {}, {}"
+ r"validation error \[ClientAttribOperation\]: missed fields - {}, {}, {}"
"".format(
RAW,
ENC,
@@ -52,8 +52,8 @@ def test_attrib_with_raw_string_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: should be a "
- "valid JSON string \({}=foo\)".format(RAW))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: should be a "
+ r"valid JSON string \({}=foo\)".format(RAW))
def test_attrib_with_raw_empty_json_fails():
@@ -65,8 +65,8 @@ def test_attrib_with_raw_empty_json_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: should contain one attribute "
- "\({}={{}}\)".format(RAW))
+ r"validation error \[ClientAttribOperation\]: should contain one attribute "
+ r"\({}={{}}\)".format(RAW))
def test_attrib_with_raw_array_fails():
@@ -78,8 +78,8 @@ def test_attrib_with_raw_array_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: should be a dict "
- "\({}=<class 'list'>\)".format(RAW))
+ r"validation error \[ClientAttribOperation\]: should be a dict "
+ r"\({}=<class 'list'>\)".format(RAW))
def test_attrib_with_raw_having_more_one_attrib_fails():
@@ -91,8 +91,8 @@ def test_attrib_with_raw_having_more_one_attrib_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: should contain one attribute "
- "\({}={{.*}}\)".format(RAW))
+ r"validation error \[ClientAttribOperation\]: should contain one attribute "
+ r"\({}={{.*}}\)".format(RAW))
def test_attrib_with_raw_having_one_attrib_passes():
@@ -140,8 +140,8 @@ def test_attrib_with_raw_having_endpoint_ha_with_ip_address_only_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: invalid endpoint format ip_address:port "
- "\({}={{'ha': '8.8.8.8'}}\)".format(ENDPOINT))
+ r"validation error \[ClientAttribOperation\]: invalid endpoint format ip_address:port "
+ r"\({}={{'ha': '8.8.8.8'}}\)".format(ENDPOINT))
def test_attrib_with_raw_having_endpoint_ha_with_invalid_port_fails():
@@ -153,8 +153,8 @@ def test_attrib_with_raw_having_endpoint_ha_with_invalid_port_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: invalid endpoint port "
- "\(ha=8.8.8.8:65536\)")
+ r"validation error \[ClientAttribOperation\]: invalid endpoint port "
+ r"\(ha=8.8.8.8:65536\)")
def test_attrib_with_raw_having_endpoint_ha_with_invalid_ip_address_fails():
@@ -166,8 +166,8 @@ def test_attrib_with_raw_having_endpoint_ha_with_invalid_ip_address_fails():
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
ex_info.match(
- "validation error \[ClientAttribOperation\]: invalid endpoint address "
- "\(ha=256.8.8.8:9700\)")
+ r"validation error \[ClientAttribOperation\]: invalid endpoint address "
+ r"\(ha=256.8.8.8:9700\)")
def test_attrib_with_valid_hash_passes():
@@ -188,8 +188,8 @@ def test_attrib_with_shorter_hash_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: not a valid hash "
- "\(needs to be in hex too\) \({}={}\)".format(HASH, invalid_hash))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: not a valid hash "
+ r"\(needs to be in hex too\) \({}={}\)".format(HASH, invalid_hash))
def test_attrib_with_longer_hash_fails():
@@ -201,8 +201,8 @@ def test_attrib_with_longer_hash_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: not a valid hash "
- "\(needs to be in hex too\) \({}={}\)".format(HASH, invalid_hash))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: not a valid hash "
+ r"\(needs to be in hex too\) \({}={}\)".format(HASH, invalid_hash))
def test_attrib_with_invalid_hash_fails():
@@ -216,8 +216,8 @@ def test_attrib_with_invalid_hash_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: not a valid hash "
- "\(needs to be in hex too\) \({}={}\)".format(HASH, invalid_hash))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: not a valid hash "
+ r"\(needs to be in hex too\) \({}={}\)".format(HASH, invalid_hash))
def test_attrib_with_empty_hash_fails():
@@ -229,8 +229,8 @@ def test_attrib_with_empty_hash_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: not a valid hash "
- "\(needs to be in hex too\) \({}={}\)".format(HASH, empty_hash))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: not a valid hash "
+ r"\(needs to be in hex too\) \({}={}\)".format(HASH, empty_hash))
empty_hash = None
msg = {
@@ -240,8 +240,8 @@ def test_attrib_with_empty_hash_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: expected types "
- "'str', got 'NoneType' \({}={}\)".format(HASH, empty_hash))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: expected types "
+ r"'str', got 'NoneType' \({}={}\)".format(HASH, empty_hash))
def test_attrib_with_enc_passes():
@@ -265,8 +265,8 @@ def test_attrib_with_empty_enc_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: "
- "empty string \({}={}\)".format(ENC, empty_enc))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: "
+ r"empty string \({}={}\)".format(ENC, empty_enc))
empty_enc = None
msg = {
@@ -276,5 +276,5 @@ def test_attrib_with_empty_enc_fails():
}
with pytest.raises(TypeError) as ex_info:
validator.validate(msg)
- ex_info.match("validation error \[ClientAttribOperation\]: expected types "
- "'str', got 'NoneType' \({}={}\)".format(ENC, empty_enc))
+ ex_info.match(r"validation error \[ClientAttribOperation\]: expected types "
+ r"'str', got 'NoneType' \({}={}\)".format(ENC, empty_enc))
* [X] diff --git a/indy_common/test/types/test_attrib_schema.py b/indy_common/test/types/test_attrib_schema.py
Just merges, nothing interesting.
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/types/test_attrib_schema.py
* commit 135dfcf3c36857a619fb4dad70a7855bda544357
| Merge: 19586a3c 5fdb50cc
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:01 2018 +0300
|
| Merge remote-tracking branch 'origin/master' into stable-51
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index 05bbef02..5085e3b8 100644
--- a/indy_common/test/types/test_attrib_schema.py
+++ b/indy_common/test/types/test_attrib_schema.py
@@ -10,7 +10,7 @@ EXPECTED_ORDERED_FIELDS = OrderedDict([
("dest", IdentifierField),
("raw", JsonField),
('enc', LimitedLengthStringField),
- ('hash', Sha256HexField)
+ ('hash', Sha256HexField),
])
* [X] diff --git a/indy_common/test/types/test_context_schema.py b/indy_common/test/types/test_context_schema.py
Only merges
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/types/test_attrib_schema.py
* commit 135dfcf3c36857a619fb4dad70a7855bda544357
| Merge: 19586a3c 5fdb50cc
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:01 2018 +0300
|
| Merge remote-tracking branch 'origin/master' into stable-51
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
deleted file mode 100644
index a120cba2..00000000
--- a/indy_common/test/types/test_context_schema.py
+++ /dev/null
@@ -1,56 +0,0 @@
-from collections import OrderedDict
-
-from indy_common.types import ClientSchemaOperation, SetContextMetaField, SetContextDataField, \
- ClientSetContextOperation, ContextField
-from plenum.common.messages.fields import ConstantField, VersionField, IterableField, LimitedLengthStringField
-
-EXPECTED_ORDERED_CONTEXT_META_FIELDS = OrderedDict([
- ("name", LimitedLengthStringField),
- ("version", VersionField),
- ("type", ConstantField),
-])
-
-
-def test_meta_has_expected_fields_s():
- actual_field_names = OrderedDict(SetContextMetaField.schema).keys()
- assert actual_field_names == EXPECTED_ORDERED_CONTEXT_META_FIELDS.keys()
-
-
-def test_meta_has_expected_validators_s():
- schema = dict(SetContextMetaField.schema)
- for field, validator in EXPECTED_ORDERED_CONTEXT_META_FIELDS.items():
- assert isinstance(schema[field], validator)
-
-
-EXPECTED_ORDERED_CONTEXT_DATA_FIELDS = OrderedDict([
- ("@context", ContextField),
-])
-
-
-def test_data_has_expected_fields_s():
- actual_field_names = OrderedDict(SetContextDataField.schema).keys()
- assert actual_field_names == EXPECTED_ORDERED_CONTEXT_DATA_FIELDS.keys()
-
-
-def test_data_has_expected_validators_s():
- schema = dict(SetContextDataField.schema)
- for field, validator in EXPECTED_ORDERED_CONTEXT_DATA_FIELDS.items():
- assert isinstance(schema[field], validator)
-
-
-EXPECTED_ORDERED_FIELDS = OrderedDict([
- ("type", ConstantField),
- ("meta", SetContextMetaField),
- ("data", SetContextDataField),
-])
-
-
-def test_has_expected_fields():
- actual_field_names = OrderedDict(ClientSetContextOperation.schema).keys()
- assert actual_field_names == EXPECTED_ORDERED_FIELDS.keys()
-
-
-def test_has_expected_validators():
- schema = dict(ClientSetContextOperation.schema)
- for field, validator in EXPECTED_ORDERED_FIELDS.items():
- assert isinstance(schema[field], validator)
* [X] diff --git a/indy_common/test/types/test_get_attrib_schema.py b/indy_common/test/types/test_get_attrib_schema.py
Only merges
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade indy_common/test/types/test_get_attrib_schema.py
* commit 135dfcf3c36857a619fb4dad70a7855bda544357
| Merge: 19586a3c 5fdb50cc
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:01 2018 +0300
|
| Merge remote-tracking branch 'origin/master' into stable-51
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index aee498db..9fae6b16 100644
--- a/indy_common/test/types/test_get_attrib_schema.py
+++ b/indy_common/test/types/test_get_attrib_schema.py
@@ -1,7 +1,7 @@
import pytest
from indy_common.types import ClientGetAttribOperation
from collections import OrderedDict
-from plenum.common.messages.fields import ConstantField, LimitedLengthStringField, IdentifierField, Sha256HexField
+from plenum.common.messages.fields import ConstantField, IntegerField, LimitedLengthStringField, IdentifierField, Sha256HexField, TxnSeqNoField
EXPECTED_ORDERED_FIELDS = OrderedDict([
@@ -9,7 +9,9 @@ EXPECTED_ORDERED_FIELDS = OrderedDict([
("dest", IdentifierField),
("raw", LimitedLengthStringField),
('enc', LimitedLengthStringField),
- ('hash', Sha256HexField)
+ ('hash', Sha256HexField),
+ ('timestamp', IntegerField),
+ ('seqNo', TxnSeqNoField)
])
* [X] diff --git a/indy_common/test/types/test_get_context_schema.py b/indy_common/test/types/test_get_context_schema.py
This is fine
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_common/test/types/test_get_context_schema.py
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
Merge: 089d12e4 1e437ca1
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Thu Oct 3 12:01:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
deleted file mode 100644
index 19084c19..00000000
--- a/indy_common/test/types/test_get_context_schema.py
+++ /dev/null
@@ -1,38 +0,0 @@
-from collections import OrderedDict
-
-from indy_common.types import ClientGetContextOperation, GetContextField
-from plenum.common.messages.fields import ConstantField, IdentifierField, VersionField, LimitedLengthStringField
-
-EXPECTED_ORDERED_FIELDS_SCHEMA = OrderedDict([
- ("name", LimitedLengthStringField),
- ("version", VersionField)
-])
-
-
-def test_has_expected_fields_s():
- actual_field_names = OrderedDict(GetContextField.schema).keys()
- assert actual_field_names == EXPECTED_ORDERED_FIELDS_SCHEMA.keys()
-
-
-def test_has_expected_validators_s():
- schema = dict(GetContextField.schema)
- for field, validator in EXPECTED_ORDERED_FIELDS_SCHEMA.items():
- assert isinstance(schema[field], validator)
-
-
-EXPECTED_ORDERED_FIELDS = OrderedDict([
- ("type", ConstantField),
- ("dest", IdentifierField),
- ('meta', GetContextField),
-])
-
-
-def test_has_expected_fields():
- actual_field_names = OrderedDict(ClientGetContextOperation.schema).keys()
- assert actual_field_names == EXPECTED_ORDERED_FIELDS.keys()
-
-
-def test_has_expected_validators():
- schema = dict(ClientGetContextOperation.schema)
- for field, validator in EXPECTED_ORDERED_FIELDS.items():
- assert isinstance(schema[field], validator)
* [X] diff --git a/indy_common/test/types/test_get_nym_schema.py b/indy_common/test/types/test_get_nym_schema.py
This is fine
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_common/test/types/test_get_nym_schema.py
* commit 19586a3c82361073cf940c5619a623be60d29aeb
Merge: 089d12e4 56cacae3
Author: Dmitry Surnin <dmitry.surnin@dsr-company.com>
Date: Thu Nov 16 18:18:55 2017 +0300
Merge remote-tracking branch 'origin/master' into rc-17.23-44
Signed-off-by: Dmitry Surnin <dmitry.surnin@dsr-company.com>
index fba4698d..bda6bf6f 100644
--- a/indy_common/test/types/test_get_nym_schema.py
+++ b/indy_common/test/types/test_get_nym_schema.py
@@ -1,12 +1,14 @@
import pytest
from indy_common.types import ClientGetNymOperation
from collections import OrderedDict
-from plenum.common.messages.fields import ConstantField, IdentifierField
+from plenum.common.messages.fields import ConstantField, IdentifierField, IntegerField, TxnSeqNoField
EXPECTED_ORDERED_FIELDS = OrderedDict([
("type", ConstantField),
("dest", IdentifierField),
+ ("timestamp", IntegerField),
+ ("seqNo", TxnSeqNoField)
])
* [X] diff --git a/indy_common/test/types/test_get_rich_schema_object_by_id_schema.py b/indy_common/test/types/test_get_rich_schema_object_by_id_schema.py
New file, this is fine
new file mode 100644
index 00000000..87035f7d
--- /dev/null
+++ b/indy_common/test/types/test_get_rich_schema_object_by_id_schema.py
@@ -0,0 +1,20 @@
+from collections import OrderedDict
+
+from indy_common.types import ClientGetRichSchemaObjectByIdOperation
+from plenum.common.messages.fields import ConstantField, NonEmptyStringField
+
+EXPECTED_ORDERED_FIELDS = OrderedDict([
+ ("type", ConstantField),
+ ("id", NonEmptyStringField),
+])
+
+
+def test_has_expected_fields():
+ actual_field_names = OrderedDict(ClientGetRichSchemaObjectByIdOperation.schema).keys()
+ assert actual_field_names == EXPECTED_ORDERED_FIELDS.keys()
+
+
+def test_has_expected_validators():
+ schema = dict(ClientGetRichSchemaObjectByIdOperation.schema)
+ for field, validator in EXPECTED_ORDERED_FIELDS.items():
+ assert isinstance(schema[field], validator)
* [X] diff --git a/indy_common/test/types/test_get_rich_schema_object_by_metadata_schema.py b/indy_common/test/types/test_get_rich_schema_object_by_metadata_schema.py
New file, this is fine.
new file mode 100644
index 00000000..a1919821
--- /dev/null
+++ b/indy_common/test/types/test_get_rich_schema_object_by_metadata_schema.py
@@ -0,0 +1,22 @@
+from collections import OrderedDict
+
+from indy_common.types import ClientGetRichSchemaObjectByMetadataOperation
+from plenum.common.messages.fields import ConstantField, LimitedLengthStringField, VersionField, NonEmptyStringField
+
+EXPECTED_ORDERED_FIELDS = OrderedDict([
+ ("type", ConstantField),
+ ("rsType", NonEmptyStringField),
+ ("rsName", LimitedLengthStringField),
+ ("rsVersion", VersionField),
+])
+
+
+def test_has_expected_fields():
+ actual_field_names = OrderedDict(ClientGetRichSchemaObjectByMetadataOperation.schema).keys()
+ assert actual_field_names == EXPECTED_ORDERED_FIELDS.keys()
+
+
+def test_has_expected_validators():
+ schema = dict(ClientGetRichSchemaObjectByMetadataOperation.schema)
+ for field, validator in EXPECTED_ORDERED_FIELDS.items():
+ assert isinstance(schema[field], validator)
* [X] diff --git a/indy_common/test/types/test_nym.py b/indy_common/test/types/test_nym.py
New file, this is fine.
new file mode 100644
index 00000000..cb5f93fa
--- /dev/null
+++ b/indy_common/test/types/test_nym.py
@@ -0,0 +1,37 @@
+from plenum.common.constants import TARGET_NYM, TXN_TYPE
+import pytest
+from indy_common.constants import DIDDOC_CONTENT, NYM
+from indy_common.types import ClientNYMOperation
+
+VALID_TARGET_NYM = 'a' * 43
+
+
+@pytest.fixture
+def validator():
+ yield ClientNYMOperation()
+
+
+def test_nym(validator):
+ """Validate that the NYM transaction accepts only JSON
+ strings as DIDDOC_CONTENT."""
+
+ msg = {
+ TXN_TYPE: NYM,
+ TARGET_NYM: VALID_TARGET_NYM,
+ DIDDOC_CONTENT: '{}',
+ }
+ validator.validate(msg)
+
+
+def test_nym_raw_dictionary(validator):
+ """Validate that the NYM transaction does not accept
+ dictionaries as DIDDOC_CONTENT."""
+ msg = {
+ TXN_TYPE: NYM,
+ TARGET_NYM: VALID_TARGET_NYM,
+ DIDDOC_CONTENT: {},
+ }
+ with pytest.raises(TypeError) as ex_info:
+ validator.validate(msg)
+
+ ex_info.match("validation error")
* [X] diff --git a/indy_common/test/types/test_rich_schema_object_schema.py b/indy_common/test/types/test_rich_schema_object_schema.py
New file.
new file mode 100644
index 00000000..89abcf2b
--- /dev/null
+++ b/indy_common/test/types/test_rich_schema_object_schema.py
@@ -0,0 +1,43 @@
+from collections import OrderedDict
+
+import pytest
+
+from indy_common.types import ClientJsonLdContextOperation, ClientRichSchemaOperation, \
+ ClientRichSchemaEncodingOperation, ClientRichSchemaMappingOperation, ClientRichSchemaCredDefOperation, \
+ ClientRichSchemaPresDefOperation
+from plenum.common.messages.fields import ConstantField, LimitedLengthStringField, NonEmptyStringField, VersionField
+
+EXPECTED_ORDERED_FIELDS = OrderedDict([
+ ("type", ConstantField),
+ ("ver", LimitedLengthStringField),
+ ("id", NonEmptyStringField),
+ ("rsType", ConstantField),
+ ("rsName", LimitedLengthStringField),
+ ("rsVersion", VersionField),
+ ("content", NonEmptyStringField),
+])
+
+
+@pytest.mark.parametrize('operation_class',
+ [ClientJsonLdContextOperation, ClientRichSchemaOperation, ClientRichSchemaEncodingOperation,
+ ClientRichSchemaMappingOperation,
+ ClientRichSchemaCredDefOperation,
+ ClientRichSchemaPresDefOperation])
+def test_has_expected_fields(operation_class):
+ actual_field_names = OrderedDict(operation_class.schema).keys()
+ assert actual_field_names == EXPECTED_ORDERED_FIELDS.keys()
+
+
+@pytest.mark.parametrize('operation_class, txn_type, rs_type',
+ [(ClientJsonLdContextOperation, "200", 'ctx'),
+ (ClientRichSchemaOperation, "201", 'sch'),
+ (ClientRichSchemaEncodingOperation, "202", 'enc'),
+ (ClientRichSchemaMappingOperation, "203", 'map'),
+ (ClientRichSchemaCredDefOperation, "204", 'cdf'),
+ (ClientRichSchemaPresDefOperation, "205", 'pdf')])
+def test_has_expected_validators(operation_class, txn_type, rs_type):
+ schema = dict(operation_class.schema)
+ for field, validator in EXPECTED_ORDERED_FIELDS.items():
+ assert isinstance(schema[field], validator)
+ assert schema["rsType"].value == rs_type
+ assert schema["type"].value == txn_type
* [X] diff --git a/indy_common/test/version/test_node_version_fallback.py b/indy_common/test/version/test_node_version_fallback.py
Merges only
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_common/test/version/test_node_version_fallback.py
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
Merge: 089d12e4 d8c42999
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Wed Apr 24 13:04:08 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
index e81baf9d..5388a648 100644
--- a/indy_common/test/version/test_node_version_fallback.py
+++ b/indy_common/test/version/test_node_version_fallback.py
@@ -12,8 +12,7 @@ def test_node_version_fallback_str():
def test_node_version_fallback_repr():
version = '1.2.3'
- assert (repr(NodeVersionFallback(version)) ==
- "{}(version='{}')".format(NodeVersionFallback.__name__, version))
+ assert (repr(NodeVersionFallback(version)) == "{}(version='{}')".format(NodeVersionFallback.__name__, version))
@pytest.mark.parametrize(
* [X] diff --git a/indy_common/transactions.py b/indy_common/transactions.py
Merges only
index 4f57a7e0..79182b54 100644
--- a/indy_common/transactions.py
+++ b/indy_common/transactions.py
@@ -38,9 +38,18 @@ class IndyTransactions(Transactions):
GET_AUTH_RULE = "121"
AUTH_RULES = "122"
+ FLAG = "130"
+ GET_FLAG = "131"
+
# Rich Schema
- SET_CONTEXT = "200"
- GET_CONTEXT = "300"
+ JSON_LD_CONTEXT = "200"
+ RICH_SCHEMA = "201"
+ RICH_SCHEMA_ENCODING = "202"
+ RICH_SCHEMA_MAPPING = "203"
+ RICH_SCHEMA_CRED_DEF = "204"
+ RICH_SCHEMA_PRES_DEF = "205"
+ GET_RICH_SCHEMA_OBJECT_BY_ID = "300"
+ GET_RICH_SCHEMA_OBJECT_BY_METADATA = "301"
@staticmethod
def get_name_from_code(code: str):
* [X] diff --git a/indy_common/types.py b/indy_common/types.py
Merges only
index b3afa321..78def586 100644
--- a/indy_common/types.py
+++ b/indy_common/types.py
@@ -1,51 +1,54 @@
import json
-import sys
from copy import deepcopy
from hashlib import sha256
-from common.exceptions import PlenumTypeError, PlenumValueError
-
from plenum.common.constants import TARGET_NYM, NONCE, RAW, ENC, HASH, NAME, \
- VERSION, FORCE, ORIGIN, OPERATION_SCHEMA_IS_STRICT
+ VERSION, FORCE, ORIGIN, OPERATION_SCHEMA_IS_STRICT, OP_VER, TXN_METADATA_SEQ_NO, \
+ ALIAS, TXN_TYPE, DATA, VERKEY, ROLE, NYM
from plenum.common.messages.client_request import ClientMessageValidator as PClientMessageValidator
from plenum.common.messages.client_request import ClientOperationField as PClientOperationField
+from plenum.common.messages.client_request import ClientNYMOperation as PClientNYMOperation
from plenum.common.messages.fields import ConstantField, IdentifierField, \
LimitedLengthStringField, TxnSeqNoField, \
Sha256HexField, JsonField, MapField, BooleanField, VersionField, \
ChooseField, IntegerField, IterableField, \
- AnyMapField, NonEmptyStringField, DatetimeStringField, RoleField, AnyField, FieldBase
+ AnyMapField, NonEmptyStringField, DatetimeStringField, RoleField, AnyField, FieldBase, \
+ VerkeyField, DestNymField, NonNegativeNumberField
from plenum.common.messages.message_base import MessageValidator
-from plenum.common.messages.node_messages import NonNegativeNumberField
from plenum.common.request import Request as PRequest
from plenum.common.types import OPERATION
from plenum.common.util import is_network_ip_address_valid, is_network_port_valid
from plenum.config import JSON_FIELD_LIMIT, NAME_FIELD_LIMIT, DATA_FIELD_LIMIT, \
- NONCE_FIELD_LIMIT, \
+ NONCE_FIELD_LIMIT, ALIAS_FIELD_LIMIT, \
ENC_FIELD_LIMIT, RAW_FIELD_LIMIT, SIGNATURE_TYPE_FIELD_LIMIT
from common.version import GenericVersion
+from indy_common.config import DIDDOC_CONTENT_SIZE_LIMIT
from indy_common.authorize.auth_actions import ADD_PREFIX, EDIT_PREFIX
from indy_common.authorize.auth_constraints import ConstraintsEnum, CONSTRAINT_ID, AUTH_CONSTRAINTS, METADATA, \
- NEED_TO_BE_OWNER, SIG_COUNT, ROLE, OFF_LEDGER_SIGNATURE
-from indy_common.config import SCHEMA_ATTRIBUTES_LIMIT, CONTEXT_SIZE_LIMIT
-from indy_common.constants import TXN_TYPE, ATTRIB, GET_ATTR, \
- DATA, GET_NYM, GET_SCHEMA, GET_CLAIM_DEF, ACTION, \
+ NEED_TO_BE_OWNER, SIG_COUNT, ROLE as AUTH_ROLE, OFF_LEDGER_SIGNATURE
+from indy_common.config import SCHEMA_ATTRIBUTES_LIMIT
+from indy_common.constants import ATTRIB, GET_ATTR, \
+ GET_NYM, GET_SCHEMA, GET_CLAIM_DEF, ACTION, \
POOL_UPGRADE, POOL_CONFIG, \
- DISCLO, SCHEMA, ENDPOINT, CLAIM_DEF, SCHEDULE, SHA256, \
+ DISCLO, SCHEMA, ENDPOINT, CLAIM_DEF, SCHEDULE, NYM_VERSION, SHA256, \
TIMEOUT, JUSTIFICATION, JUSTIFICATION_MAX_SIZE, REINSTALL, WRITES, START, CANCEL, \
REVOC_REG_DEF, ISSUANCE_TYPE, MAX_CRED_NUM, PUBLIC_KEYS, \
TAILS_HASH, TAILS_LOCATION, ID, REVOC_TYPE, TAG, CRED_DEF_ID, VALUE, \
REVOC_REG_ENTRY, ISSUED, REVOC_REG_DEF_ID, REVOKED, ACCUM, PREV_ACCUM, \
GET_REVOC_REG_DEF, GET_REVOC_REG, TIMESTAMP, \
GET_REVOC_REG_DELTA, FROM, TO, POOL_RESTART, DATETIME, VALIDATOR_INFO, \
- SET_CONTEXT, GET_CONTEXT, CONTEXT_NAME, CONTEXT_VERSION, CONTEXT_CONTEXT, CONTEXT_FROM, \
SCHEMA_FROM, SCHEMA_NAME, SCHEMA_VERSION, \
SCHEMA_ATTR_NAMES, CLAIM_DEF_SIGNATURE_TYPE, CLAIM_DEF_PUBLIC_KEYS, CLAIM_DEF_TAG, CLAIM_DEF_SCHEMA_REF, \
CLAIM_DEF_PRIMARY, CLAIM_DEF_REVOCATION, CLAIM_DEF_FROM, PACKAGE, AUTH_RULE, AUTH_RULES, CONSTRAINT, AUTH_ACTION, \
AUTH_TYPE, \
- FIELD, OLD_VALUE, NEW_VALUE, GET_AUTH_RULE, RULES, ISSUANCE_BY_DEFAULT, ISSUANCE_ON_DEMAND, RS_TYPE, CONTEXT_TYPE, \
- META, TAG_LIMIT_SIZE
-from indy_common.version import SchemaVersion, ContextVersion
+ FIELD, OLD_VALUE, NEW_VALUE, GET_AUTH_RULE, RULES, ISSUANCE_BY_DEFAULT, ISSUANCE_ON_DEMAND, RS_TYPE, \
+ TAG_LIMIT_SIZE, JSON_LD_CONTEXT, RS_VERSION, \
+ RS_NAME, RS_ID, RS_CONTENT, RS_CONTEXT_TYPE_VALUE, RICH_SCHEMA, RS_SCHEMA_TYPE_VALUE, RS_ENCODING_TYPE_VALUE, \
+ RICH_SCHEMA_ENCODING, RS_MAPPING_TYPE_VALUE, RICH_SCHEMA_MAPPING, RS_CRED_DEF_TYPE_VALUE, \
+ RICH_SCHEMA_CRED_DEF, GET_RICH_SCHEMA_OBJECT_BY_ID, GET_RICH_SCHEMA_OBJECT_BY_METADATA, \
+ RICH_SCHEMA_PRES_DEF, RS_PRES_DEF_TYPE_VALUE, DIDDOC_CONTENT
+from indy_common.version import SchemaVersion
class Request(PRequest):
@@ -68,6 +71,20 @@ class ClientGetNymOperation(MessageValidator):
schema = (
(TXN_TYPE, ConstantField(GET_NYM)),
(TARGET_NYM, IdentifierField()),
+ (TIMESTAMP, IntegerField(optional=True)),
+ (TXN_METADATA_SEQ_NO, TxnSeqNoField(optional=True)),
+ )
+
+
+class ClientNYMOperation(PClientNYMOperation):
+ schema = (
+ (TXN_TYPE, ConstantField(NYM)),
+ (ALIAS, LimitedLengthStringField(max_length=ALIAS_FIELD_LIMIT, optional=True)),
+ (VERKEY, VerkeyField(optional=True, nullable=True)),
+ (TARGET_NYM, DestNymField()),
+ (ROLE, RoleField(optional=True)),
+ (DIDDOC_CONTENT, JsonField(max_length=DIDDOC_CONTENT_SIZE_LIMIT, optional=True)),
+ (NYM_VERSION, IntegerField(optional=True)),
)
@@ -99,49 +116,6 @@ class SchemaField(MessageValidator):
)
-# Rich Schema
-class SetContextMetaField(MessageValidator):
- schema = (
- (CONTEXT_NAME, LimitedLengthStringField(max_length=NAME_FIELD_LIMIT)),
- (CONTEXT_VERSION, VersionField(version_cls=ContextVersion)),
- (RS_TYPE, ConstantField(CONTEXT_TYPE)),
- )
-
-
-class ContextField(FieldBase):
- _base_types = None
-
- def __init__(self, max_size=None, **kwargs):
- if max_size is not None:
- if not isinstance(max_size, int):
- raise PlenumTypeError('max_size', max_size, int)
- if not max_size > 0:
- raise PlenumValueError('max_size', max_size, '> 0')
- self.max_size = max_size
- super().__init__(**kwargs)
-
- def _specific_validation(self, val):
- if self.max_size is not None:
- arr = json.dumps(val)
- size = sys.getsizeof(arr)
- if size > self.max_size:
- return 'size should be at most {}, context has size {}'.format(self.max_size, size)
-
-
-class SetContextDataField(MessageValidator):
- schema = (
- (CONTEXT_CONTEXT, ContextField(
- max_size=CONTEXT_SIZE_LIMIT)),
- )
-
-
-class GetContextField(MessageValidator):
- schema = (
- (CONTEXT_NAME, LimitedLengthStringField(max_length=NAME_FIELD_LIMIT)),
- (CONTEXT_VERSION, VersionField(version_cls=ContextVersion))
- )
-
-
class ClaimDefField(MessageValidator):
schema = (
(CLAIM_DEF_PRIMARY, AnyMapField()),
@@ -204,24 +178,6 @@ class ClientGetSchemaOperation(MessageValidator):
)
-# Rich Schema
-# this class is not actually used for static validation at this time
-class ClientSetContextOperation(MessageValidator):
- schema = (
- (TXN_TYPE, ConstantField(SET_CONTEXT)),
- (META, SetContextMetaField()),
- (DATA, SetContextDataField()),
- )
-
-
-class ClientGetContextOperation(MessageValidator):
- schema = (
- (TXN_TYPE, ConstantField(GET_CONTEXT)),
- (CONTEXT_FROM, IdentifierField()),
- (META, GetContextField()),
- )
-
-
class ClientAttribOperation(MessageValidator):
schema = (
(TXN_TYPE, ConstantField(ATTRIB)),
@@ -263,6 +219,9 @@ class ClientAttribOperation(MessageValidator):
def __validate_endpoint_ha_field(self, endpoint):
if endpoint is None:
return # remove the attribute, valid case
+ if not isinstance(endpoint, dict):
+ self._raise_invalid_fields(ENDPOINT, endpoint,
+ 'should be a dict')
HA_NAME = 'ha'
ha = endpoint.get(HA_NAME)
if ha is None:
@@ -287,6 +246,8 @@ class ClientGetAttribOperation(ClientAttribOperation):
(RAW, LimitedLengthStringField(max_length=RAW_FIELD_LIMIT, optional=True)),
(ENC, LimitedLengthStringField(max_length=ENC_FIELD_LIMIT, optional=True)),
(HASH, Sha256HexField(optional=True)),
+ (TIMESTAMP, IntegerField(optional=True)),
+ (TXN_METADATA_SEQ_NO, TxnSeqNoField(optional=True)),
)
def _validate_message(self, msg):
@@ -404,7 +365,7 @@ class ConstraintField(FieldBase):
class ConstraintEntityField(MessageValidator):
schema = (
(CONSTRAINT_ID, ChooseField(values=ConstraintsEnum.values())),
- (ROLE, RoleField()),
+ (AUTH_ROLE, RoleField()),
(SIG_COUNT, NonNegativeNumberField()),
(NEED_TO_BE_OWNER, BooleanField(optional=True)),
(OFF_LEDGER_SIGNATURE, BooleanField(optional=True)),
@@ -473,8 +434,61 @@ class ClientGetAuthRuleOperation(MessageValidator):
)
+def rich_schema_objects_schema(txn_type, rs_type):
+ return (
+ (TXN_TYPE, ConstantField(txn_type)),
+ (RS_ID, NonEmptyStringField()),
+ (RS_TYPE, ConstantField(rs_type)),
+ (RS_NAME, LimitedLengthStringField(max_length=NAME_FIELD_LIMIT)),
+ (RS_VERSION, VersionField(version_cls=SchemaVersion)),
+ (RS_CONTENT, NonEmptyStringField()),
+ (OP_VER, LimitedLengthStringField(optional=True))
+ )
+
+
+class ClientJsonLdContextOperation(MessageValidator):
+ schema = rich_schema_objects_schema(JSON_LD_CONTEXT, RS_CONTEXT_TYPE_VALUE)
+
+
+class ClientRichSchemaOperation(MessageValidator):
+ schema = rich_schema_objects_schema(RICH_SCHEMA, RS_SCHEMA_TYPE_VALUE)
+
+
+class ClientRichSchemaEncodingOperation(MessageValidator):
+ schema = rich_schema_objects_schema(RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE)
+
+
+class ClientRichSchemaMappingOperation(MessageValidator):
+ schema = rich_schema_objects_schema(RICH_SCHEMA_MAPPING, RS_MAPPING_TYPE_VALUE)
+
+
+class ClientRichSchemaCredDefOperation(MessageValidator):
+ schema = rich_schema_objects_schema(RICH_SCHEMA_CRED_DEF, RS_CRED_DEF_TYPE_VALUE)
+
+
+class ClientRichSchemaPresDefOperation(MessageValidator):
+ schema = rich_schema_objects_schema(RICH_SCHEMA_PRES_DEF, RS_PRES_DEF_TYPE_VALUE)
+
+
+class ClientGetRichSchemaObjectByIdOperation(MessageValidator):
+ schema = (
+ (TXN_TYPE, ConstantField(GET_RICH_SCHEMA_OBJECT_BY_ID)),
+ (RS_ID, NonEmptyStringField()),
+ )
+
+
+class ClientGetRichSchemaObjectByMetadataOperation(MessageValidator):
+ schema = (
+ (TXN_TYPE, ConstantField(GET_RICH_SCHEMA_OBJECT_BY_METADATA)),
+ (RS_TYPE, NonEmptyStringField()),
+ (RS_NAME, LimitedLengthStringField(max_length=NAME_FIELD_LIMIT)),
+ (RS_VERSION, VersionField(version_cls=SchemaVersion)),
+ )
+
+
class ClientOperationField(PClientOperationField):
_specific_operations = {
+ NYM: ClientNYMOperation(),
SCHEMA: ClientSchemaOperation(),
ATTRIB: ClientAttribOperation(),
GET_ATTR: ClientGetAttribOperation(),
@@ -495,8 +509,14 @@ class ClientOperationField(PClientOperationField):
GET_REVOC_REG_DEF: ClientGetRevocRegDefField(),
GET_REVOC_REG: ClientGetRevocRegField(),
GET_REVOC_REG_DELTA: ClientGetRevocRegDeltaField(),
- SET_CONTEXT: ClientSetContextOperation(), # Rich Schema
- GET_CONTEXT: ClientGetContextOperation(),
+ JSON_LD_CONTEXT: ClientJsonLdContextOperation(),
+ RICH_SCHEMA: ClientRichSchemaOperation(),
+ RICH_SCHEMA_ENCODING: ClientRichSchemaEncodingOperation(),
+ RICH_SCHEMA_MAPPING: ClientRichSchemaMappingOperation(),
+ RICH_SCHEMA_CRED_DEF: ClientRichSchemaCredDefOperation(),
+ RICH_SCHEMA_PRES_DEF: ClientRichSchemaPresDefOperation(),
+ GET_RICH_SCHEMA_OBJECT_BY_ID: ClientGetRichSchemaObjectByIdOperation(),
+ GET_RICH_SCHEMA_OBJECT_BY_METADATA: ClientGetRichSchemaObjectByMetadataOperation()
}
# TODO: it is a workaround because INDY-338, `operations` must be a class
* [X] diff --git a/indy_common/version.py b/indy_common/version.py
Merges only
index 34ad536b..fd32edb2 100644
--- a/indy_common/version.py
+++ b/indy_common/version.py
@@ -11,12 +11,6 @@ from indy_common.node_version import NodeVersion
NodeVersion = NodeVersion
-# Rich Schema
-class ContextVersion(DigitDotVersion):
- def __init__(self, version: str, **kwargs):
- super().__init__(version, parts_num=(2, 3), **kwargs)
-
-
class SchemaVersion(DigitDotVersion):
def __init__(self, version: str, **kwargs):
super().__init__(version, parts_num=(2, 3), **kwargs)
* [X] diff --git a/indy_node/__init__.py b/indy_node/__init__.py
Merges only
index 2d5ab7d6..e231f2a8 100644
--- a/indy_node/__init__.py
+++ b/indy_node/__init__.py
@@ -14,7 +14,7 @@ PLUGIN_CLIENT_REQ_OP_TYPES = {}
def setup_plugins():
import sys
import os
- import pip
+ import importlib_metadata
import importlib # noqa
from importlib.util import module_from_spec, spec_from_file_location # noqa: E402
from indy_common.config_util import getConfigOnce # noqa: E402
@@ -50,7 +50,7 @@ def setup_plugins():
format(plugin_root))
sys.path.insert(0, plugin_root.__path__[0])
enabled_plugins = config.ENABLED_PLUGINS
- installed_packages = {p.project_name: p for p in pip.get_installed_distributions()}
+ installed_packages = set(p.metadata["Name"] for p in importlib_metadata.distributions())
for plugin_name in enabled_plugins:
plugin = find_and_load_plugin(plugin_name, plugin_root, installed_packages)
plugin_globals = plugin.__dict__
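Side note: pip.get_installed_distributions() was dropped from pip's public API (around pip 10, as far as I can tell), so the switch to importlib_metadata looks expected for the newer toolchain. Rough sketch of the two spellings, for my own reference (illustrative, not the actual plugin-loading code):

# Old spelling (pre-pip-10 only):
#   import pip
#   installed_packages = {p.project_name: p for p in pip.get_installed_distributions()}

# New spelling, matching the diff: a set of installed distribution names.
import importlib_metadata

installed_packages = {dist.metadata["Name"] for dist in importlib_metadata.distributions()}
print("indy-node" in installed_packages)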
* [X] diff --git a/indy_node/__metadata__.py b/indy_node/__metadata__.py
The version metadata file; checking its history:
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/__metadata__.py
* commit d3136ce536be849cd2f959c9b3acc893eda1ffa1
| Author: dsurnin <dmitry.surnin@dsr-corporation.com>
| Date: Fri Feb 9 18:50:21 2018 +0300
|
| Set version 1.3
|
| Signed-off-by: dsurnin <dmitry.surnin@dsr-corporation.com>
|
| diff --git a/indy_node/__metadata__.py b/indy_node/__metadata__.py
| index a514414d..d9282e19 100644
| --- a/indy_node/__metadata__.py
| +++ b/indy_node/__metadata__.py
| @@ -1,7 +1,7 @@
| """
| indy-node package metadata
| """
| -__version_info__ = (1, 2)
| +__version_info__ = (1, 3)
| __version__ = '.'.join(map(str, __version_info__))
| __author__ = "Hyperledger"
| __license__ = "Apache 2.0"
|
Does this matter? No, the version details match...
index 34768c24..1603cb02 100644
--- a/indy_node/__metadata__.py
+++ b/indy_node/__metadata__.py
@@ -41,13 +41,13 @@ def load_manifest(manifest_file: str = MANIFEST_FILE) -> Any:
try:
with open(manifest_file, 'r') as _f:
return json.load(_f)
- except IOError as exc:
+ except IOError:
return None
def set_manifest(manifest: Any, manifest_file: str = MANIFEST_FILE):
with open(manifest_file, 'w') as _f:
- json.dump(manifest, _f)
+ json.dump(manifest, _f, sort_keys=True)
_f.write('\n')
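For the record, the sort_keys change only makes the manifest file deterministic; it doesn't change the data, so it shouldn't affect the version comparison above. Tiny illustration (field names made up):

import json

manifest = {"version": [1, 13, 2, "rc", 4], "sha": "abc123"}  # made-up fields

plain = json.dumps(manifest)
stable = json.dumps(manifest, sort_keys=True)

# Key order in the file differs, but the data round-trips identically.
assert json.loads(plain) == json.loads(stable)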
* [X] diff --git a/indy_node/__version__.json b/indy_node/__version__.json
We are missing a bunch of commits like...
* commit 7cf4a372ed90a07d68eb8b8da88f79fd785b79df (tag: v1.12.6, origin/stable, origin/release-1.12.6)
| Author: Sovbot <bot@sovrin.org>
| Date: Thu Aug 18 18:38:06 2022 +0000
|
| release 1.12.6
|
| Signed-off-by: Sovbot <bot@sovrin.org>
|
| diff --git a/indy_node/__version__.json b/indy_node/__version__.json
| index 807539f6..0031cfb1 100644
| --- a/indy_node/__version__.json
| +++ b/indy_node/__version__.json
| @@ -1 +1 @@
| -[1, 12, 6, "rc", "2"]
| +[1, 12, 6, "", ""]
|
* commit 6bda5ecc58607f8727b51897751d9b79215b1089
| Author: Wade Barnes <wade@neoterictech.ca>
| Date: Thu Aug 18 07:22:42 2022 -0700
|
| [RC2-1.12.6] Fix exception type in update_package_cache
|
| - Bump indy-node version.
|
| Signed-off-by: Wade Barnes <wade@neoterictech.ca>
|
| diff --git a/indy_node/__version__.json b/indy_node/__version__.json
| index c4122823..807539f6 100644
| --- a/indy_node/__version__.json
| +++ b/indy_node/__version__.json
| @@ -1 +1 @@
| -[1, 12, 6, "rc", "1"]
| +[1, 12, 6, "rc", "2"]
These don't matter because the version number will be different anyway...
index 0031cfb1..33a1348c 100644
--- a/indy_node/__version__.json
+++ b/indy_node/__version__.json
@@ -1 +1 @@
-[1, 12, 6, "", ""]
+[1, 13, 2, "rc", 4]
* [X] diff --git a/indy_node/server/action_log.py b/indy_node/server/action_log.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/action_log.py
* commit 438aa42694b2d0d7af6d6366e8196604ac229093
| Merge: a8784f11 3c783228
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Nov 28 13:30:56 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.12.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit a8784f11815f42422e0d007c0b8bcba19fcf6fd2
| Merge: 0bd19eff d8c42999
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Wed Apr 24 13:04:08 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.7.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 0bd19eff3d8d5149a92e3e5c4756753c4b992f8e
Merge: 089d12e4 7dcf675c
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Fri Jun 22 15:03:07 2018 +0300
Merge remote-tracking branch 'public/master' into rc-1.4.63
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
index 278235c0..165da011 100644
--- a/indy_node/server/action_log.py
+++ b/indy_node/server/action_log.py
@@ -114,8 +114,7 @@ class ActionLogEvent(CsvSerializer):
self._data_items_prefix = '_data_'
self._items = (
- ['ts', 'ev_type'] +
- [(self._data_items_prefix + i) for i in self.data._items]
+ ['ts', 'ev_type'] + [(self._data_items_prefix + i) for i in self.data._items]
)
def __getattr__(self, name):
@@ -137,7 +136,7 @@ class ActionLog:
file_path: str,
data_class: Type[CsvSerializer] = ActionLogData,
event_types: Type[Enum] = ActionLogEvents,
- delimiter: str ='\t'
+ delimiter: str = '\t'
):
self._delimiter = delimiter
self._file_path = file_path
* [X] diff --git a/indy_node/server/node_bootstrap.py b/indy_node/server/node_bootstrap.py
Boring...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/node_bootstrap.py
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: 438aa426 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit 438aa42694b2d0d7af6d6366e8196604ac229093
| Merge: ff88db39 3c783228
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Nov 28 13:30:56 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.12.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
| Merge: b4f6928f 1e437ca1
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Oct 3 12:01:40 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit b4f6928f4f5121f7c8070aca326540094515a04c
| Merge: 8940e559 522fee89
| Author: Sergey Khoroshavin <sergey.khoroshavin@dsr-corporation.com>
| Date: Tue Jul 30 02:57:19 2019 +0300
|
| Merge remote-tracking branch 'upstream/master' into rc-1.9.1.rc1
|
| Signed-off-by: Sergey Khoroshavin <sergey.khoroshavin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
Merge: 089d12e4 48199562
Author: Andrey Kononykhin <andkononykhin@gmail.com>
Date: Thu Jun 27 19:32:01 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
index 7d0bfbd3..3c92c760 100644
--- a/indy_node/server/node_bootstrap.py
+++ b/indy_node/server/node_bootstrap.py
@@ -1,3 +1,8 @@
+from common.serializers.serialization import domain_state_serializer
+from indy_common.authorize.auth_constraints import ConstraintsSerializer
+from indy_common.authorize.auth_map import auth_map
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+from indy_common.constants import CONFIG_LEDGER_ID
from indy_node.persistence.attribute_store import AttributeStore
from indy_node.persistence.idr_cache import IdrCache
from indy_node.server.pool_config import PoolConfig
@@ -7,6 +12,8 @@ from indy_node.server.request_handlers.config_batch_handler import ConfigBatchHa
from indy_node.server.request_handlers.config_req_handlers.auth_rule.auth_rule_handler import AuthRuleHandler
from indy_node.server.request_handlers.config_req_handlers.auth_rule.auth_rule_handler_1_9_1 import AuthRuleHandler191
from indy_node.server.request_handlers.config_req_handlers.auth_rule.auth_rules_handler import AuthRulesHandler
+from indy_node.server.request_handlers.config_req_handlers.flag_handler import FlagRequestHandler
+from indy_node.server.request_handlers.config_req_handlers.ledgers_freeze_handler import LedgersFreezeHandler
from indy_node.server.request_handlers.config_req_handlers.node_upgrade_handler import NodeUpgradeHandler
from indy_node.server.request_handlers.config_req_handlers.pool_config_handler import PoolConfigHandler
from indy_node.server.request_handlers.config_req_handlers.pool_upgrade_handler import PoolUpgradeHandler
@@ -17,35 +24,40 @@ from indy_node.server.request_handlers.config_req_handlers.txn_author_agreement_
from indy_node.server.request_handlers.config_req_handlers.txn_author_agreement_handler import TxnAuthorAgreementHandler
from indy_node.server.request_handlers.config_req_handlers.txn_author_agreement_handler_v1 import \
TxnAuthorAgreementHandlerV1
-from indy_node.server.request_handlers.domain_req_handlers.idr_cache_nym_handler import IdrCacheNymHandler
-from indy_node.server.request_handlers.idr_cache_batch_handler import IdrCacheBatchHandler
-from indy_node.server.request_handlers.read_req_handlers.get_auth_rule_handler import GetAuthRuleHandler
-
-from indy_node.server.request_handlers.domain_req_handlers.claim_def_handler import ClaimDefHandler
-from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_entry_handler import RevocRegEntryHandler
-from indy_node.server.request_handlers.domain_req_handlers.schema_handler import SchemaHandler
-from indy_node.server.request_handlers.domain_req_handlers.context_handler import ContextHandler
-
from indy_node.server.request_handlers.domain_req_handlers.attribute_handler import AttributeHandler
+from indy_node.server.request_handlers.domain_req_handlers.claim_def_handler import ClaimDefHandler
+from indy_node.server.request_handlers.domain_req_handlers.idr_cache_nym_handler import IdrCacheNymHandler
from indy_node.server.request_handlers.domain_req_handlers.nym_handler import NymHandler
from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_def_handler import RevocRegDefHandler
-
-from indy_common.authorize.auth_map import auth_map
-
-from common.serializers.serialization import domain_state_serializer
-from indy_common.authorize.auth_constraints import ConstraintsSerializer
-from indy_common.authorize.auth_request_validator import WriteRequestValidator
-from indy_common.constants import CONFIG_LEDGER_ID
+from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_entry_handler import RevocRegEntryHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.json_ld_context_handler import \
+ JsonLdContextHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_cred_def_handler import \
+ RichSchemaCredDefHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_encoding_handler import \
+ RichSchemaEncodingHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_handler import RichSchemaHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_mapping_handler import \
+ RichSchemaMappingHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_pres_def_handler import \
+ RichSchemaPresDefHandler
+from indy_node.server.request_handlers.domain_req_handlers.schema_handler import SchemaHandler
+from indy_node.server.request_handlers.idr_cache_batch_handler import IdrCacheBatchHandler
from indy_node.server.request_handlers.pool_req_handlers.node_handler import NodeHandler
from indy_node.server.request_handlers.read_req_handlers.get_attribute_handler import GetAttributeHandler
+from indy_node.server.request_handlers.read_req_handlers.get_auth_rule_handler import GetAuthRuleHandler
from indy_node.server.request_handlers.read_req_handlers.get_claim_def_handler import GetClaimDefHandler
+from plenum.server.request_handlers.ledgers_freeze.get_frozen_ledgers_handler import GetFrozenLedgersHandler
+from indy_node.server.request_handlers.read_req_handlers.get_flag_handler import GetFlagRequestHandler
from indy_node.server.request_handlers.read_req_handlers.get_nym_handler import GetNymHandler
from indy_node.server.request_handlers.read_req_handlers.get_revoc_reg_def_handler import GetRevocRegDefHandler
from indy_node.server.request_handlers.read_req_handlers.get_revoc_reg_delta_handler import GetRevocRegDeltaHandler
from indy_node.server.request_handlers.read_req_handlers.get_revoc_reg_handler import GetRevocRegHandler
+from indy_node.server.request_handlers.read_req_handlers.rich_schema.get_rich_schema_object_by_id_handler import \
+ GetRichSchemaObjectByIdHandler
+from indy_node.server.request_handlers.read_req_handlers.rich_schema.get_rich_schema_object_by_metadata_handler import \
+ GetRichSchemaObjectByMetadataHandler
from indy_node.server.request_handlers.read_req_handlers.get_schema_handler import GetSchemaHandler
-from indy_node.server.request_handlers.read_req_handlers.get_context_handler import GetContextHandler
-
from indy_node.server.restarter import Restarter
from indy_node.server.upgrader import Upgrader
from plenum.common.constants import IDR_CACHE_LABEL, ATTRIB_LABEL
@@ -93,10 +105,9 @@ class NodeBootstrap(PNodeBootstrap):
def _register_domain_req_handlers(self):
# Read handlers
- get_nym_handler = GetNymHandler(database_manager=self.node.db_manager)
- get_attribute_handler = GetAttributeHandler(database_manager=self.node.db_manager)
+ get_nym_handler = GetNymHandler(node=self.node, database_manager=self.node.db_manager)
+ get_attribute_handler = GetAttributeHandler(node=self.node, database_manager=self.node.db_manager)
get_schema_handler = GetSchemaHandler(database_manager=self.node.db_manager)
- get_context_handler = GetContextHandler(database_manager=self.node.db_manager)
get_claim_def_handler = GetClaimDefHandler(database_manager=self.node.db_manager)
get_revoc_reg_def_handler = GetRevocRegDefHandler(database_manager=self.node.db_manager)
get_revoc_reg_handler = GetRevocRegHandler(database_manager=self.node.db_manager)
@@ -110,34 +121,55 @@ class NodeBootstrap(PNodeBootstrap):
write_req_validator=self.node.write_req_validator)
schema_handler = SchemaHandler(database_manager=self.node.db_manager,
write_req_validator=self.node.write_req_validator)
- context_handler = ContextHandler(database_manager=self.node.db_manager,
- write_req_validator=self.node.write_req_validator)
claim_def_handler = ClaimDefHandler(database_manager=self.node.db_manager,
write_req_validator=self.node.write_req_validator)
revoc_reg_def_handler = RevocRegDefHandler(database_manager=self.node.db_manager,
write_req_validator=self.node.write_req_validator)
revoc_reg_entry_handler = RevocRegEntryHandler(database_manager=self.node.db_manager,
write_req_validator=self.node.write_req_validator,
- get_revocation_strategy=RevocRegDefHandler.get_revocation_strategy)
+ get_revocation_strategy=RevocRegDefHandler.get_revocation_strategy,
+ node=self.node)
+ json_ld_context_handler = JsonLdContextHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ rich_schema_handler = RichSchemaHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ rich_schema_encoding_handler = RichSchemaEncodingHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ rich_schema_mapping_handler = RichSchemaMappingHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ rich_schema_cred_def_handler = RichSchemaCredDefHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ rich_schema_pres_def_handler = RichSchemaPresDefHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ get_rich_schema_obj_by_id_handler = GetRichSchemaObjectByIdHandler(database_manager=self.node.db_manager)
+ get_rich_schema_obj_by_metadata_handler = GetRichSchemaObjectByMetadataHandler(
+ database_manager=self.node.db_manager)
+
# Register write handlers
self.node.write_manager.register_req_handler(nym_handler)
self.node.write_manager.register_req_handler(attrib_handler)
self.node.write_manager.register_req_handler(schema_handler)
- self.node.write_manager.register_req_handler(context_handler)
self.node.write_manager.register_req_handler(claim_def_handler)
self.node.write_manager.register_req_handler(revoc_reg_def_handler)
self.node.write_manager.register_req_handler(revoc_reg_entry_handler)
+ self.node.write_manager.register_req_handler(json_ld_context_handler)
+ self.node.write_manager.register_req_handler(rich_schema_handler)
+ self.node.write_manager.register_req_handler(rich_schema_encoding_handler)
+ self.node.write_manager.register_req_handler(rich_schema_mapping_handler)
+ self.node.write_manager.register_req_handler(rich_schema_cred_def_handler)
+ self.node.write_manager.register_req_handler(rich_schema_pres_def_handler)
# Additional handler for idCache
self.register_idr_cache_nym_handler()
# Register read handlers
self.node.read_manager.register_req_handler(get_nym_handler)
self.node.read_manager.register_req_handler(get_attribute_handler)
self.node.read_manager.register_req_handler(get_schema_handler)
- self.node.read_manager.register_req_handler(get_context_handler)
self.node.read_manager.register_req_handler(get_claim_def_handler)
self.node.read_manager.register_req_handler(get_revoc_reg_def_handler)
self.node.read_manager.register_req_handler(get_revoc_reg_handler)
self.node.read_manager.register_req_handler(get_revoc_reg_delta_handler)
+ self.node.read_manager.register_req_handler(get_rich_schema_obj_by_id_handler)
+ self.node.read_manager.register_req_handler(get_rich_schema_obj_by_metadata_handler)
def _register_config_req_handlers(self):
# Read handlers
@@ -170,6 +202,13 @@ class NodeBootstrap(PNodeBootstrap):
get_taa_aml_handler = GetTxnAuthorAgreementAmlHandler(database_manager=self.node.db_manager)
get_taa_handler = GetTxnAuthorAgreementHandler(database_manager=self.node.db_manager)
node_upgrade_handler = NodeUpgradeHandler(database_manager=self.node.db_manager)
+ ledgers_freeze_handler = LedgersFreezeHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
+ get_frozen_ledgers_handler = GetFrozenLedgersHandler(database_manager=self.node.db_manager)
+ get_flag_handler = GetFlagRequestHandler(node=self.node,
+ database_manager=self.node.db_manager)
+ flag_handler = FlagRequestHandler(database_manager=self.node.db_manager,
+ write_req_validator=self.node.write_req_validator)
# Register write handlers
self.node.write_manager.register_req_handler(auth_rule_handler)
self.node.write_manager.register_req_handler(auth_rules_handler)
@@ -179,10 +218,14 @@ class NodeBootstrap(PNodeBootstrap):
self.node.write_manager.register_req_handler(taa_handler)
self.node.write_manager.register_req_handler(taa_disable_handler)
self.node.write_manager.register_req_handler(node_upgrade_handler)
+ self.node.write_manager.register_req_handler(ledgers_freeze_handler)
+ self.node.write_manager.register_req_handler(flag_handler)
# Register read handlers
self.node.read_manager.register_req_handler(get_auth_rule_handler)
self.node.read_manager.register_req_handler(get_taa_aml_handler)
self.node.read_manager.register_req_handler(get_taa_handler)
+ self.node.read_manager.register_req_handler(get_frozen_ledgers_handler)
+ self.node.read_manager.register_req_handler(get_flag_handler)
# Register write handlers for a version
self.node.write_manager.register_req_handler_with_version(auth_rule_handler_1_9_1,
version="1.9.1")
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/auth_rule/abstract_auth_rule_handler.py b/indy_node/server/request_handlers/config_req_handlers/auth_rule/abstract_auth_rule_handler.py
Only merges
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/request_handlers/config_req_handlers/auth_rule/abstract_auth_rule_handler.py
* commit 438aa42694b2d0d7af6d6366e8196604ac229093
| Merge: 8940e559 3c783228
| Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
| Date: Thu Nov 28 13:30:56 2019 +0300
|
| Merge remote-tracking branch 'public/master' into rc-1.12.0.rc1
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
Merge: 089d12e4 48199562
Author: Andrey Kononykhin <andkononykhin@gmail.com>
Date: Thu Jun 27 19:32:01 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
index 461d52e6..d9d19588 100644
--- a/indy_node/server/request_handlers/config_req_handlers/auth_rule/abstract_auth_rule_handler.py
+++ b/indy_node/server/request_handlers/config_req_handlers/auth_rule/abstract_auth_rule_handler.py
@@ -27,7 +27,7 @@ class AbstractAuthRuleHandler(WriteRequestHandler, metaclass=ABCMeta):
except (ValueError, KeyError) as exp:
raise InvalidClientRequest(identifier,
req_id,
- exp)
+ str(exp))
StaticAuthRuleHelper.check_auth_key(operation, identifier, req_id, self.write_req_validator.auth_map)
def _update_auth_constraint(self, auth_key: str, constraint):
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rule_handler.py b/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rule_handler.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rule_handler.py
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: 8940e559 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
Merge: 089d12e4 48199562
Author: Andrey Kononykhin <andkononykhin@gmail.com>
Date: Thu Jun 27 19:32:01 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
index 376c7e7e..0bda7a0c 100644
--- a/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rule_handler.py
+++ b/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rule_handler.py
@@ -21,7 +21,7 @@ class AuthRuleHandler(AbstractAuthRuleHandler):
self._validate_request_type(request)
self._static_validation_for_rule(operation, identifier, req_id)
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
self.write_req_validator.validate(request,
[AuthActionEdit(txn_type=AUTH_RULE,
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rules_handler.py b/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rules_handler.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rules_handler.py
* commit b54404d0bdc23dfb60d10bababc9216e99c3ee49
| Merge: 8940e559 e0d3b94d
| Author: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
| Date: Thu Dec 26 15:23:08 2019 +0300
|
| Merge remote-tracking branch 'remotes/upstream/master' into rc-1.12.1.rc1
|
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-corporation.com>
|
* commit 8940e5595b2827dad458c3acf4b5fd75113231f2
Merge: 089d12e4 48199562
Author: Andrey Kononykhin <andkononykhin@gmail.com>
Date: Thu Jun 27 19:32:01 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.9.0.rc1
Signed-off-by: Andrey Kononykhin <andkononykhin@gmail.com>
index d739c473..807b035a 100644
--- a/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rules_handler.py
+++ b/indy_node/server/request_handlers/config_req_handlers/auth_rule/auth_rules_handler.py
@@ -22,7 +22,7 @@ class AuthRulesHandler(AbstractAuthRuleHandler):
for rule in operation.get(RULES):
self._static_validation_for_rule(rule, identifier, req_id)
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
self.write_req_validator.validate(request,
[AuthActionEdit(txn_type=AUTH_RULES,
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/flag_handler.py b/indy_node/server/request_handlers/config_req_handlers/flag_handler.py
New file
new file mode 100644
index 00000000..64cccc2b
--- /dev/null
+++ b/indy_node/server/request_handlers/config_req_handlers/flag_handler.py
@@ -0,0 +1,80 @@
+from typing import Optional
+
+from common.serializers.serialization import config_state_serializer
+from indy_common.authorize.auth_actions import AuthActionEdit
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+from indy_common.constants import (
+ FLAG,
+ CONFIG_LEDGER_ID,
+ FLAG_NAME,
+ FLAG_VALUE,
+)
+from indy_common.state.state_constants import LAST_SEQ_NO, LAST_UPDATE_TIME, MARKER_FLAG
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.common.request import Request
+from plenum.common.txn_util import (
+ get_payload_data,
+ get_seq_no,
+ get_txn_time,
+)
+from plenum.server.database_manager import DatabaseManager
+from plenum.server.request_handlers.handler_interfaces.write_request_handler import (
+ WriteRequestHandler,
+)
+
+
+class FlagRequestHandler(WriteRequestHandler):
+ def __init__(
+ self,
+ database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator,
+ ):
+ super().__init__(database_manager, FLAG, CONFIG_LEDGER_ID)
+ self.write_req_validator = write_req_validator
+ self.state_serializer = config_state_serializer
+
+ def static_validation(self, request: Request):
+ self._validate_request_type(request)
+ name = request.operation.get(FLAG_NAME)
+ value = request.operation.get(FLAG_VALUE)
+ if not name:
+ raise InvalidClientRequest(
+ request.identifier, request.reqId, "Flag name is required"
+ )
+ if not isinstance(name, str):
+ raise InvalidClientRequest(
+ request.identifier, request.reqId, "Flag name must be of type string"
+ )
+ if not (isinstance(value, str) or (value is None)):
+ raise InvalidClientRequest(
+ request.identifier, request.reqId, "Flag value must be of type string or None"
+ )
+
+ def additional_dynamic_validation(
+ self, request: Request, req_pp_time: Optional[int]
+ ):
+ self._validate_request_type(request)
+ self.write_req_validator.validate(request, [AuthActionEdit(txn_type=FLAG)])
+
+ def update_state(self, txn, prev_result, request, is_committed=False):
+ self._validate_txn_type(txn)
+ key = get_payload_data(txn).get(FLAG_NAME)
+ value = get_payload_data(txn).get(FLAG_VALUE)
+ state = {}
+ state[FLAG_VALUE] = value
+ state[LAST_SEQ_NO] = get_seq_no(txn)
+ state[LAST_UPDATE_TIME] = get_txn_time(txn)
+ val = self.state_serializer.serialize(state)
+ path = self.make_state_path_for_flag(key)
+ self.state.set(path, val)
+ return state
+
+ @staticmethod
+ def make_state_path_for_flag(key) -> bytes:
+ return "{MARKER}:{FLAG}".format(MARKER=MARKER_FLAG, FLAG=key).encode()
+
+ @staticmethod
+ def get_state_value(state_raw):
+ if state_raw is None:
+ return state_raw
+ return state_raw.get(FLAG_VALUE)
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/ledgers_freeze_handler.py b/indy_node/server/request_handlers/config_req_handlers/ledgers_freeze_handler.py
New file
new file mode 100644
index 00000000..b189bb63
--- /dev/null
+++ b/indy_node/server/request_handlers/config_req_handlers/ledgers_freeze_handler.py
@@ -0,0 +1,23 @@
+from typing import Optional
+
+from indy_common.authorize.auth_actions import AuthActionEdit
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+from plenum.common.constants import LEDGERS_FREEZE
+from plenum.common.request import Request
+from plenum.server.database_manager import DatabaseManager
+from plenum.server.request_handlers.ledgers_freeze.ledgers_freeze_handler import LedgersFreezeHandler as PLedgersFreezeHandler
+
+
+class LedgersFreezeHandler(PLedgersFreezeHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(database_manager)
+ self.write_req_validator = write_req_validator
+
+ def authorize(self, request):
+ self.write_req_validator.validate(request,
+ [AuthActionEdit(txn_type=LEDGERS_FREEZE,
+ field='*',
+ old_value='*',
+ new_value='*')])
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/node_upgrade_handler.py b/indy_node/server/request_handlers/config_req_handlers/node_upgrade_handler.py
Only merges in history
index f0cb0024..28b96d8e 100644
--- a/indy_node/server/request_handlers/config_req_handlers/node_upgrade_handler.py
+++ b/indy_node/server/request_handlers/config_req_handlers/node_upgrade_handler.py
@@ -16,7 +16,7 @@ class NodeUpgradeHandler(WriteRequestHandler):
def update_state(self, txn, prev_result, request, is_committed=False):
pass
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
pass
def static_validation(self, request: Request):
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/pool_config_handler.py b/indy_node/server/request_handlers/config_req_handlers/pool_config_handler.py
Only merge commits
index 2e046fee..d65adbb9 100644
--- a/indy_node/server/request_handlers/config_req_handlers/pool_config_handler.py
+++ b/indy_node/server/request_handlers/config_req_handlers/pool_config_handler.py
@@ -22,7 +22,7 @@ class PoolConfigHandler(WriteRequestHandler):
def static_validation(self, request: Request):
self._validate_request_type(request)
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
action = '*'
status = '*'
* [X] diff --git a/indy_node/server/request_handlers/config_req_handlers/pool_upgrade_handler.py b/indy_node/server/request_handlers/config_req_handlers/pool_upgrade_handler.py
This simply looks like a rename. The history shows a number of changes that aren't included here, but we appear to end up with roughly the same code in the end (see the sketch after this diff). I'm going to mark this one as a pass.
index db8bf414..591be921 100644
--- a/indy_node/server/request_handlers/config_req_handlers/pool_upgrade_handler.py
+++ b/indy_node/server/request_handlers/config_req_handlers/pool_upgrade_handler.py
@@ -50,7 +50,7 @@ class PoolUpgradeHandler(WriteRequestHandler):
"{} not a valid schedule since {}".
format(schedule, msg))
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
identifier, req_id, operation = get_request_data(request)
status = '*'
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/attribute_handler.py b/indy_node/server/request_handlers/domain_req_handlers/attribute_handler.py
Only merges
index d2df1d2b..8869ab5a 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/attribute_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/attribute_handler.py
@@ -48,20 +48,19 @@ class AttributeHandler(WriteRequestHandler):
get_key = attrib_raw_data_serializer.deserialize(operation[RAW])
if len(get_key) == 0:
raise InvalidClientRequest(identifier, request.reqId,
- '"row" attribute field must contain non-empty dict'.
+ '"raw" attribute field must contain non-empty JSON object {}'.
format(TARGET_NYM))
except JSONDecodeError:
raise InvalidClientRequest(identifier, request.reqId,
- 'Attribute field must be dict while adding it as a row field'.
+ 'Attribute field must be valid JSON object when adding it as a raw field {}'.
format(TARGET_NYM))
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
identifier, req_id, operation = get_request_data(request)
- if not (not operation.get(TARGET_NYM) or
- self.__has_nym(operation[TARGET_NYM], is_committed=False)):
+ if not (not operation.get(TARGET_NYM) or self.__has_nym(operation[TARGET_NYM], is_committed=False)):
raise InvalidClientRequest(identifier, request.reqId,
'{} should be added before adding '
'attribute for it'.
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/claim_def_handler.py b/indy_node/server/request_handlers/domain_req_handlers/claim_def_handler.py
Again, the history shows some other changes along the way, but we end up with a single function under a different name.
index 5e1b1855..b8b8528e 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/claim_def_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/claim_def_handler.py
@@ -29,7 +29,7 @@ class ClaimDefHandler(WriteRequestHandler):
def static_validation(self, request: Request):
pass
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
# we can not add a Claim Def with existent ISSUER_DID
# sine a Claim Def needs to be identified by seqNo
self._validate_request_type(request)
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/context_handler.py b/indy_node/server/request_handlers/domain_req_handlers/context_handler.py
File deleted
deleted file mode 100644
index 01ae7f95..00000000
--- a/indy_node/server/request_handlers/domain_req_handlers/context_handler.py
+++ /dev/null
@@ -1,111 +0,0 @@
-from typing import Optional
-
-from indy_common.authorize.auth_actions import AuthActionAdd, AuthActionEdit
-from indy_common.authorize.auth_request_validator import WriteRequestValidator
-
-from indy_common.constants import SET_CONTEXT, META, CONTEXT_CONTEXT
-
-from indy_common.req_utils import get_write_context_name, get_write_context_version, get_txn_context_name, \
- get_txn_context_version, get_txn_context_data, get_txn_context_meta
-from indy_common.state.state_constants import MARKER_CONTEXT
-from plenum.common.constants import DOMAIN_LEDGER_ID, DATA
-from plenum.common.exceptions import InvalidClientRequest
-
-from plenum.common.request import Request
-from plenum.common.txn_util import get_request_data, get_from, get_seq_no, get_txn_time
-from plenum.server.database_manager import DatabaseManager
-from plenum.server.request_handlers.handler_interfaces.write_request_handler import WriteRequestHandler
-from plenum.server.request_handlers.utils import encode_state_value
-
-from re import findall
-
-URI_REGEX = r'^(?P<scheme>\w+):(?:(?:(?P<url>//[.\w]+)(?:(/(?P<path>[/\w]+)?)?))|(?:(?P<method>\w+):(?P<id>\w+)))'
-
-
-class ContextHandler(WriteRequestHandler):
-
- def __init__(self, database_manager: DatabaseManager,
- write_req_validator: WriteRequestValidator):
- super().__init__(database_manager, SET_CONTEXT, DOMAIN_LEDGER_ID)
- self.write_req_validator = write_req_validator
-
- def static_validation(self, request: Request):
- self._validate_request_type(request)
- data = request.operation[DATA]
- self._validate_context(data[CONTEXT_CONTEXT], request.identifier, request.reqId)
-
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
- # we can not add a Context with already existent NAME and VERSION
- # since a Context needs to be identified by seqNo
- self._validate_request_type(request)
- identifier, req_id, operation = get_request_data(request)
- context_name = get_write_context_name(request)
- context_version = get_write_context_version(request)
- path = ContextHandler.make_state_path_for_context(identifier, context_name, context_version)
- context, _, _ = self.get_from_state(path)
- if context:
- self.write_req_validator.validate(request,
- [AuthActionEdit(txn_type=SET_CONTEXT,
- field='*',
- old_value='*',
- new_value='*')])
- else:
- self.write_req_validator.validate(request,
- [AuthActionAdd(txn_type=SET_CONTEXT,
- field='*',
- value='*')])
-
- def gen_txn_id(self, txn):
- self._validate_txn_type(txn)
- path = ContextHandler.prepare_context_for_state(txn, path_only=True)
- return path.decode()
-
- def update_state(self, txn, prev_result, request, is_committed=False) -> None:
- self._validate_txn_type(txn)
- path, value_bytes = ContextHandler.prepare_context_for_state(txn)
- self.state.set(path, value_bytes)
-
- def _validate_context(self, context, id, reqId):
- if isinstance(context, list):
- for ctx in context:
- if not isinstance(ctx, dict):
- if self._bad_uri(ctx):
- raise InvalidClientRequest(id, reqId, '@context URI {} badly formed'.format(ctx))
- elif isinstance(context, dict):
- pass
- elif isinstance(context, str):
- if self._bad_uri(context):
- raise InvalidClientRequest(id, reqId, '@context URI {} badly formed'.format(context))
- else:
- raise InvalidClientRequest(id, reqId, "'@context' value must be url, array, or object")
-
- def _bad_uri(self, uri_string):
- url = findall(URI_REGEX, uri_string)
- if not url:
- return True
- return False
-
- @staticmethod
- def make_state_path_for_context(authors_did, context_name, context_version) -> bytes:
- return "{DID}:{MARKER}:{CONTEXT_NAME}:{CONTEXT_VERSION}" \
- .format(DID=authors_did,
- MARKER=MARKER_CONTEXT,
- CONTEXT_NAME=context_name,
- CONTEXT_VERSION=context_version).encode()
-
- @staticmethod
- def prepare_context_for_state(txn, path_only=False):
- origin = get_from(txn)
- context_name = get_txn_context_name(txn)
- context_version = get_txn_context_version(txn)
- value = {
- META: get_txn_context_meta(txn),
- DATA: get_txn_context_data(txn)
- }
- path = ContextHandler.make_state_path_for_context(origin, context_name, context_version)
- if path_only:
- return path
- seq_no = get_seq_no(txn)
- txn_time = get_txn_time(txn)
- value_bytes = encode_state_value(value, seq_no, txn_time)
- return path, value_bytes
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/idr_cache_nym_handler.py b/indy_node/server/request_handlers/domain_req_handlers/idr_cache_nym_handler.py
Only merge commits
index b796b45b..988f195c 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/idr_cache_nym_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/idr_cache_nym_handler.py
@@ -36,7 +36,7 @@ class IdrCacheNymHandler(WriteRequestHandler):
def static_validation(self, request):
pass
- def dynamic_validation(self, request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request, req_pp_time: Optional[int]):
pass
def gen_state_key(self, txn):
* [ ] diff --git a/indy_node/server/request_handlers/domain_req_handlers/nym_handler.py b/indy_node/server/request_handlers/domain_req_handlers/nym_handler.py
* [ ] TODO review changes here, as we have some history we need to consider...
index 3ea822cb..43c5be1c 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/nym_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/nym_handler.py
@@ -1,57 +1,117 @@
from binascii import hexlify
-from typing import Optional
+from hashlib import sha256
+import json
+from typing import Mapping, Optional
+import base58
from common.serializers.serialization import domain_state_serializer
-from indy_common.authorize.auth_actions import AuthActionAdd, AuthActionEdit
-from indy_common.authorize.auth_request_validator import WriteRequestValidator
-from indy_common.constants import NYM
-from indy_common.auth import Authoriser
from ledger.util import F
-
-from plenum.common.constants import ROLE, TARGET_NYM, VERKEY, TXN_TIME
+from plenum.common.constants import ROLE, TARGET_NYM, TXN_TIME, VERKEY
from plenum.common.exceptions import InvalidClientRequest
from plenum.common.request import Request
-from plenum.common.txn_util import get_payload_data, get_seq_no, get_txn_time, get_request_data, get_from
+from plenum.common.txn_util import (
+ get_from,
+ get_payload_data,
+ get_request_data,
+ get_seq_no,
+ get_txn_time,
+)
from plenum.common.types import f
from plenum.server.database_manager import DatabaseManager
from plenum.server.request_handlers.nym_handler import NymHandler as PNymHandler
-from plenum.server.request_handlers.utils import nym_to_state_key, get_nym_details
+from plenum.server.request_handlers.utils import get_nym_details, nym_to_state_key
+
+from indy_common.auth import Authoriser
+from indy_common.authorize.auth_actions import AuthActionAdd, AuthActionEdit
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+from indy_common.constants import (
+ DIDDOC_CONTENT,
+ NYM,
+ NYM_VERSION,
+ NYM_VERSION_CONVENTION,
+ NYM_VERSION_NULL,
+ NYM_VERSION_SELF_CERT,
+)
+from indy_common.exceptions import InvalidDIDDocException
class NymHandler(PNymHandler):
state_serializer = domain_state_serializer
- def __init__(self, config, database_manager: DatabaseManager,
- write_req_validator: WriteRequestValidator):
+ def __init__(
+ self,
+ config,
+ database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator,
+ ):
super().__init__(config, database_manager)
self.write_req_validator = write_req_validator
def static_validation(self, request: Request):
self._validate_request_type(request)
identifier, req_id, operation = get_request_data(request)
+ assert isinstance(operation, Mapping)
role = operation.get(ROLE)
nym = operation.get(TARGET_NYM)
if isinstance(nym, str):
nym = nym.strip()
if not nym:
- raise InvalidClientRequest(identifier, req_id,
- "{} needs to be present".
- format(TARGET_NYM))
+ raise InvalidClientRequest(
+ identifier, req_id, "{} needs to be present".format(TARGET_NYM)
+ )
if not Authoriser.isValidRole(role):
- raise InvalidClientRequest(identifier, req_id,
- "{} not a valid role".
- format(role))
+ raise InvalidClientRequest(
+ identifier, req_id, "{} not a valid role".format(role)
+ )
+
+ diddoc_content = operation.get(DIDDOC_CONTENT, None)
+ if diddoc_content:
+ diddoc_content = json.loads(diddoc_content)
+ try:
+ self._validate_diddoc_content(diddoc_content)
+ except InvalidDIDDocException as error:
+ raise InvalidClientRequest(
+ identifier,
+ req_id,
+ error.reason,
+ ) from error
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ version = operation.get(NYM_VERSION)
+ if version is not None and version not in (
+ NYM_VERSION_NULL,
+ NYM_VERSION_CONVENTION,
+ NYM_VERSION_SELF_CERT,
+ ):
+ raise InvalidClientRequest(
+ request.identifier,
+ request.reqId,
+ "Version must be one of {{{}}}, received {}".format(
+ ", ".join(
+ str(val)
+ for val in [
+ NYM_VERSION_NULL,
+ NYM_VERSION_CONVENTION,
+ NYM_VERSION_SELF_CERT,
+ ]
+ ),
+ version
+ ),
+ )
+
+ def additional_dynamic_validation(
+ self, request: Request, req_pp_time: Optional[int]
+ ):
self._validate_request_type(request)
operation = request.operation
+ assert isinstance(operation, Mapping)
+ nym_data = self.database_manager.idr_cache.getNym(
+ operation[TARGET_NYM], isCommitted=False
+ )
- nym_data = self.database_manager.idr_cache.getNym(operation[TARGET_NYM], isCommitted=False)
- if not nym_data:
- # If nym does not exist
- self._validate_new_nym(request, operation)
- else:
+ if nym_data:
self._validate_existing_nym(request, operation, nym_data)
+ else:
+ self._validate_new_nym(request, operation)
def gen_txn_id(self, txn):
self._validate_txn_type(txn)
@@ -62,8 +122,7 @@ class NymHandler(PNymHandler):
def update_state(self, txn, prev_result, request, is_committed=False):
self._validate_txn_type(txn)
nym = get_payload_data(txn).get(TARGET_NYM)
- existing_data = get_nym_details(self.state, nym,
- is_committed=is_committed)
+ existing_data = get_nym_details(self.state, nym, is_committed=is_committed)
txn_data = get_payload_data(txn)
new_data = {}
if not existing_data:
@@ -75,6 +134,10 @@ class NymHandler(PNymHandler):
new_data[ROLE] = txn_data[ROLE]
if VERKEY in txn_data:
new_data[VERKEY] = txn_data[VERKEY]
+ if DIDDOC_CONTENT in txn_data:
+ new_data[DIDDOC_CONTENT] = txn_data[DIDDOC_CONTENT]
+ if NYM_VERSION in txn_data:
+ new_data[NYM_VERSION] = txn_data[NYM_VERSION]
new_data[F.seqNo.name] = get_seq_no(txn)
new_data[TXN_TIME] = get_txn_time(txn)
existing_data.update(new_data)
@@ -87,40 +150,83 @@ class NymHandler(PNymHandler):
identifier, req_id, _ = get_request_data(request)
role = operation.get(ROLE)
- nym_data = self.database_manager.idr_cache.getNym(request.identifier, isCommitted=False)
+ nym_data = self.database_manager.idr_cache.getNym(
+ request.identifier, isCommitted=False
+ )
if not nym_data:
# Non-ledger nym case. These two checks duplicated and mainly executed in client_authn,
# but it has point to repeat them here, for clear understanding of validation non-ledger request cases.
if request.identifier != request.operation.get(TARGET_NYM):
- raise InvalidClientRequest(identifier, req_id, "DID which is not stored on ledger can "
- "send nym txn only if appropriate auth_rules set "
- "and sender did equal to destination nym")
+ raise InvalidClientRequest(
+ identifier,
+ req_id,
+ "DID which is not stored on ledger can "
+ "send nym txn only if appropriate auth_rules set "
+ "and sender did equal to destination nym",
+ )
if not request.operation.get(VERKEY):
- raise InvalidClientRequest(identifier, req_id, "Non-ledger nym txn must contain verkey for new did")
+ raise InvalidClientRequest(
+ identifier,
+ req_id,
+ "Non-ledger nym txn must contain verkey for new did",
+ )
+
+ version = request.operation.get(NYM_VERSION)
+ if version == NYM_VERSION_CONVENTION and not self._legacy_convention_validation(
+ request.operation.get(TARGET_NYM), request.operation.get(VERKEY)
+ ):
+ raise InvalidClientRequest(
+ identifier,
+ req_id,
+ "Identifier with version 1 must be first 16 bytes of verkey",
+ )
+ elif version == NYM_VERSION_SELF_CERT and not self._is_self_certifying(
+ request.operation.get(TARGET_NYM), request.operation.get(VERKEY)
+ ):
+ raise InvalidClientRequest(
+ identifier,
+ req_id,
+ "Identifier with version 2 must be first 16 bytes of SHA256 of verkey",
+ )
- self.write_req_validator.validate(request,
- [AuthActionAdd(txn_type=NYM,
- field=ROLE,
- value=role)])
+ self.write_req_validator.validate(
+ request, [AuthActionAdd(txn_type=NYM, field=ROLE, value=role)]
+ )
def _validate_existing_nym(self, request, operation, nym_data):
origin = request.identifier
- owner = self.database_manager.idr_cache.getOwnerFor(operation[TARGET_NYM], isCommitted=False)
+ owner = self.database_manager.idr_cache.getOwnerFor(
+ operation[TARGET_NYM], isCommitted=False
+ )
is_owner = origin == owner
updateKeys = [ROLE, VERKEY]
updateKeysInOperationOrOwner = is_owner
+
+ if NYM_VERSION in operation:
+ raise InvalidClientRequest(
+ request.identifier,
+ request.reqId,
+ "Cannot set version on existing nym"
+ )
+
for key in updateKeys:
if key in operation:
updateKeysInOperationOrOwner = True
newVal = operation[key]
oldVal = nym_data.get(key)
- self.write_req_validator.validate(request,
- [AuthActionEdit(txn_type=NYM,
- field=key,
- old_value=oldVal,
- new_value=newVal,
- is_owner=is_owner)])
+ self.write_req_validator.validate(
+ request,
+ [
+ AuthActionEdit(
+ txn_type=NYM,
+ field=key,
+ old_value=oldVal,
+ new_value=newVal,
+ is_owner=is_owner,
+ )
+ ],
+ )
if not updateKeysInOperationOrOwner:
raise InvalidClientRequest(request.identifier, request.reqId)
@@ -128,3 +234,52 @@ class NymHandler(PNymHandler):
if encoded:
return domain_state_serializer.deserialize(encoded)
return None, None, None
+
+ def _is_self_certifying(self, identifier: str, verkey: str):
+ """Validate that the identifier is self-certifying.
+
+ DID must be the base58 encoded first 16 bytes of the 32 bytes verkey
+ See https://hyperledger.github.io/indy-did-method/#creation
+ """
+ return identifier == base58.b58encode(
+ sha256(base58.b58decode(verkey)).digest()[:16]
+ ).decode("utf-8")
+
+ def _legacy_convention_validation(self, identifier: str, verkey: str):
+ """Validate old Indy SDK convention for DID generation.
+
+ The did is derived from the first 16 bytes of the verkey.
+ """
+ return identifier == base58.b58encode(
+ base58.b58decode(verkey)[:16]
+ ).decode("utf-8")
+
+ def _validate_diddoc_content(self, diddoc):
+ """Validate DID Doc according to assembly rules.
+
+ See https://hyperledger.github.io/indy-did-method/#creation
+ """
+
+ # Must not have an id property
+ if diddoc.get("id", None):
+ raise InvalidDIDDocException(
+ "diddocContent must not have `id` at root"
+ )
+
+ # No element in diddoc is allowed to have same id as verkey in base
+ # diddoc
+ for el in diddoc.values():
+ if isinstance(el, list):
+ for item in el:
+ if self._has_same_id_fragment(item, "verkey"):
+ raise InvalidDIDDocException(
+ "diddocContent must not have object with `id` "
+ "containing fragment `verkey`"
+ )
+
+ def _has_same_id_fragment(self, item, fragment):
+ return (
+ isinstance(item, dict)
+ and "id" in item
+ and item["id"].partition("#")[2] == fragment
+ )
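To sanity-check the new NYM version rules, here is a hedged sketch (not part of the diff) of the two DID-derivation conventions enforced by `_legacy_convention_validation` and `_is_self_certifying`; the 32-byte verkey is a made-up placeholder and the snippet assumes the same `base58` package the handler imports.

```python
# Hedged sketch of the NYM_VERSION_CONVENTION / NYM_VERSION_SELF_CERT checks above.
from hashlib import sha256
import base58

verkey_bytes = bytes(range(32))                        # placeholder, not a real ed25519 verkey
verkey = base58.b58encode(verkey_bytes).decode("utf-8")

# Version 1 (legacy convention): DID is the base58 of the first 16 bytes of the verkey
did_v1 = base58.b58encode(base58.b58decode(verkey)[:16]).decode("utf-8")

# Version 2 (self-certifying): DID is the base58 of the first 16 bytes of SHA-256(verkey)
did_v2 = base58.b58encode(sha256(base58.b58decode(verkey)).digest()[:16]).decode("utf-8")

# A NYM with NYM_VERSION=1 must use did_v1 as its dest; NYM_VERSION=2 must use did_v2.
print(did_v1, did_v2)
```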
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_def_handler.py b/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_def_handler.py
index f387cfe5..2a17f5de 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_def_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_def_handler.py
@@ -40,7 +40,7 @@ class RevocRegDefHandler(WriteRequestHandler):
"Expected: 'did:marker:signature_type:schema_ref' or "
"'did:marker:signature_type:schema_ref:tag'".format(CRED_DEF_ID))
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
operation = request.operation
cred_def_id = operation.get(CRED_DEF_ID)
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_entry_handler.py b/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_entry_handler.py
Only merges
index 05ac5261..4fb6c1d5 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_entry_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/revoc_reg_entry_handler.py
@@ -3,32 +3,54 @@ from typing import Callable, Optional
from indy_common.authorize.auth_actions import AuthActionEdit, AuthActionAdd
from indy_common.authorize.auth_request_validator import WriteRequestValidator
-from indy_common.constants import REVOC_REG_ENTRY, REVOC_REG_DEF_ID, VALUE, ISSUANCE_TYPE
+from indy_common.config_util import getConfig
+from indy_common.constants import FLAG_NAME_COMPAT_ORDERING, REVOC_REG_ENTRY, REVOC_REG_DEF_ID, VALUE, ISSUANCE_TYPE
from indy_common.state.state_constants import MARKER_REVOC_REG_ENTRY, MARKER_REVOC_REG_ENTRY_ACCUM
+from indy_node.server.request_handlers.read_req_handlers.get_flag_handler import GetFlagRequestHandler
from plenum.common.constants import DOMAIN_LEDGER_ID, TXN_TIME
from plenum.common.exceptions import InvalidClientRequest
from plenum.common.request import Request
from plenum.common.txn_util import get_from, get_payload_data, get_req_id, get_request_data, get_txn_time, get_seq_no
from plenum.common.types import f
-
from plenum.server.database_manager import DatabaseManager
from plenum.server.request_handlers.handler_interfaces.write_request_handler import WriteRequestHandler
from plenum.server.request_handlers.utils import encode_state_value
+from plenum.server.node import Node
+
+from indy_node.server.request_handlers.config_req_handlers.flag_handler import FlagRequestHandler
class RevocRegEntryHandler(WriteRequestHandler):
def __init__(self, database_manager: DatabaseManager,
write_req_validator: WriteRequestValidator,
- get_revocation_strategy: Callable):
+ get_revocation_strategy: Callable,
+ node: Node):
super().__init__(database_manager, REVOC_REG_ENTRY, DOMAIN_LEDGER_ID)
self.get_revocation_strategy = get_revocation_strategy
self.write_req_validator = write_req_validator
+ # Default sorting behavior
+ self.legacy_sort_config = getConfig().REV_STRATEGY_USE_COMPAT_ORDERING or False
+ # Create a local GetFlagRequestHandler to allow lookups for the config flag
+ get_flag_handler = GetFlagRequestHandler(node=node,
+ database_manager=database_manager)
+ self.get_flag_handler = get_flag_handler
+
+ def use_legacy_sort(self, txn) -> bool:
+ txn_time = get_txn_time(txn)
+ state_raw = self.get_flag_handler.lookup_key(FLAG_NAME_COMPAT_ORDERING, timestamp=txn_time)
+ sort_state = FlagRequestHandler.get_state_value(state_raw)
+ if sort_state and isinstance(sort_state, str):
+ # only allow False as value, ignore otherwise
+ if sort_state.lower() == 'false':
+ return False
+ # default to default behavior
+ return self.legacy_sort_config
def static_validation(self, request: Request):
pass
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
self._validate_request_type(request)
rev_reg_tags = request.operation[REVOC_REG_DEF_ID]
author_did, req_id, operation = get_request_data(request)
@@ -71,7 +93,9 @@ class RevocRegEntryHandler(WriteRequestHandler):
)
writer_cls = self.get_revocation_strategy(
revoc_def[VALUE][ISSUANCE_TYPE])
- writer = writer_cls(self.state)
+
+ sort_legacy = self.use_legacy_sort(txn)
+ writer = writer_cls(self.state, sort_legacy)
writer.write(current_entry, txn)
return txn
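The practical effect of the flag lookup is easier to see in isolation; here is a hedged standalone sketch of the `use_legacy_sort` decision, where `flag_value` stands in for whatever `GetFlagRequestHandler.lookup_key` returns and `config_default` for `REV_STRATEGY_USE_COMPAT_ORDERING`:

```python
# Standalone sketch of use_legacy_sort() above: the ledger flag can only turn the
# legacy compat ordering *off*; any other value falls back to the configured default.
def use_legacy_sort(flag_value, config_default: bool) -> bool:
    if flag_value and isinstance(flag_value, str):
        if flag_value.lower() == "false":
            return False
    return config_default

assert use_legacy_sort(None, True) is True       # flag never set -> keep default
assert use_legacy_sort("False", True) is False   # explicit 'false' disables legacy sort
assert use_legacy_sort("true", True) is True     # any other value is ignored
```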
* [X] diff --git a/indy_node/test/context/__init__.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/__init__.py
similarity index 100%
rename from indy_node/test/context/__init__.py
rename to indy_node/server/request_handlers/domain_req_handlers/rich_schema/__init__.py
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/abstract_rich_schema_object_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/abstract_rich_schema_object_handler.py
New file
new file mode 100644
index 00000000..b6b171eb
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/abstract_rich_schema_object_handler.py
@@ -0,0 +1,146 @@
+from abc import ABCMeta, abstractmethod
+from typing import Optional
+
+from common.serializers.json_serializer import JsonSerializer
+from indy_common.authorize.auth_actions import AuthActionEdit, AuthActionAdd
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+from indy_common.constants import DOMAIN_LEDGER_ID, RS_ID, RS_TYPE, RS_VERSION, RS_NAME, RS_CONTENT, JSON_LD_ID_FIELD, \
+ JSON_LD_TYPE_FIELD
+from indy_common.state.domain import encode_state_value
+from indy_common.types import Request
+from indy_common.config_util import getConfig
+from plenum.common.constants import TXN_PAYLOAD_METADATA_ENDORSER, TXN_PAYLOAD_METADATA_FROM, TXN_PAYLOAD_VERSION
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.common.txn_util import get_payload_data, get_seq_no, get_txn_time, get_from, get_endorser, \
+ get_payload_txn_version
+from plenum.server.database_manager import DatabaseManager
+from plenum.server.request_handlers.handler_interfaces.write_request_handler import WriteRequestHandler
+
+
+class AbstractRichSchemaObjectHandler(WriteRequestHandler, metaclass=ABCMeta):
+
+ def __init__(self, txn_type, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(database_manager, txn_type, DOMAIN_LEDGER_ID)
+ self.write_req_validator = write_req_validator
+ self.config = getConfig()
+
+ def _enabled(self) -> bool:
+ return self.config.ENABLE_RICH_SCHEMAS
+
+ def _validate_enabled(self, request: Request):
+ if not self._enabled():
+ raise InvalidClientRequest(request.identifier, request.reqId, "RichSchema transactions are disabled")
+
+ @abstractmethod
+ def is_json_ld_content(self):
+ pass
+
+ @abstractmethod
+ def do_static_validation_content(self, content_as_dict, request):
+ pass
+
+ @abstractmethod
+ def do_dynamic_validation_content(self, request):
+ pass
+
+ def static_validation(self, request: Request):
+ self._validate_request_type(request)
+ self._validate_enabled(request)
+
+ try:
+ content_as_dict = JsonSerializer.loads(request.operation[RS_CONTENT])
+ except ValueError:
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "'{}' must be a JSON serialized string".format(RS_CONTENT))
+
+ if self.is_json_ld_content():
+ self.do_static_validation_json_ld(content_as_dict, request)
+
+ self.do_static_validation_content(content_as_dict, request)
+
+ def do_static_validation_json_ld(self, content_as_dict, request):
+ if not content_as_dict.get(JSON_LD_ID_FIELD):
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "'content' must be a valid JSON-LD and have non-empty '{}' field".format(JSON_LD_ID_FIELD))
+ if not content_as_dict.get(JSON_LD_TYPE_FIELD):
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "'content' must be a valid JSON-LD and have non-empty '{}' field".format(
+ JSON_LD_TYPE_FIELD))
+
+ if content_as_dict[JSON_LD_ID_FIELD] != request.operation[RS_ID]:
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "content's @id must be equal to id={}".format(request.operation[RS_ID]))
+
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ self._validate_request_type(request)
+
+ rs_id = request.operation[RS_ID]
+ rs_object, _, _ = self.get_from_state(rs_id)
+
+ # check that (rs_name, rs_type, rs_version) is unique within all rich schema objects
+ secondary_key = self.make_secondary_key(request.operation[RS_TYPE],
+ request.operation[RS_NAME],
+ request.operation[RS_VERSION])
+ if not rs_object and self.state.get(secondary_key, isCommitted=False) is not None:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ 'An object with {rs_name}="{rs_name_value}", {rs_version}="{rs_version_value}" '
+ 'and {rs_type}="{rs_type_value}" already exists. '
+ 'Please choose different {rs_name}, {rs_version} or {rs_type}'.format(
+ rs_name=RS_NAME, rs_version=RS_VERSION, rs_type=RS_TYPE,
+ rs_name_value=request.operation[RS_NAME],
+ rs_version_value=request.operation[RS_VERSION],
+ rs_type_value=request.operation[RS_TYPE]))
+
+ # do common auth-rule-based validation (which will check the default immutability of most of the objects)
+ if rs_object:
+ self.write_req_validator.validate(request,
+ [AuthActionEdit(txn_type=self.txn_type,
+ field='*',
+ old_value='*',
+ new_value='*')])
+ else:
+ self.write_req_validator.validate(request,
+ [AuthActionAdd(txn_type=self.txn_type,
+ field='*',
+ value='*')])
+
+ self.do_dynamic_validation_content(request)
+
+ def update_state(self, txn, prev_result, request, is_committed=False) -> None:
+ self._validate_txn_type(txn)
+
+ txn_data = get_payload_data(txn)
+
+ primary_key = txn_data[RS_ID].encode()
+ secondary_key = self.make_secondary_key(txn_data[RS_TYPE],
+ txn_data[RS_NAME],
+ txn_data[RS_VERSION])
+
+ value = {
+ RS_ID: txn_data[RS_ID],
+ RS_TYPE: txn_data[RS_TYPE],
+ RS_NAME: txn_data[RS_NAME],
+ RS_VERSION: txn_data[RS_VERSION],
+ RS_CONTENT: txn_data[RS_CONTENT],
+ TXN_PAYLOAD_METADATA_FROM: get_from(txn),
+ TXN_PAYLOAD_METADATA_ENDORSER: get_endorser(txn),
+ TXN_PAYLOAD_VERSION: get_payload_txn_version(txn),
+ }
+ seq_no = get_seq_no(txn)
+ txn_time = get_txn_time(txn)
+ value_bytes = encode_state_value(value, seq_no, txn_time)
+
+ self.state.set(primary_key, value_bytes)
+ self.state.set(secondary_key, primary_key)
+
+ def gen_txn_id(self, txn):
+ self._validate_txn_type(txn)
+ return get_payload_data(txn)[RS_ID]
+
+ @staticmethod
+ def make_secondary_key(rs_type, rs_name, rs_version):
+ return "{RS_TYPE}:{RS_NAME}:{RS_VERSION}".format(RS_TYPE=rs_type,
+ RS_NAME=rs_name,
+ RS_VERSION=rs_version).encode()
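The state layout used by `update_state` is worth spelling out: a primary key (the rsId) maps to the encoded object, and a secondary `type:name:version` key maps back to the primary key, which is what the uniqueness check in `additional_dynamic_validation` relies on. A small sketch follows; the id, type, name and version values are made up for illustration.

```python
# Illustration of the two keys written by update_state(); values are placeholders.
def make_secondary_key(rs_type, rs_name, rs_version):
    return "{RS_TYPE}:{RS_NAME}:{RS_VERSION}".format(
        RS_TYPE=rs_type, RS_NAME=rs_name, RS_VERSION=rs_version
    ).encode()

primary_key = "did:sov:123:some-rich-schema-id".encode()           # rsId -> full object value
secondary_key = make_secondary_key("sch", "driverLicense", "1.0")  # type:name:version -> rsId

assert secondary_key == b"sch:driverLicense:1.0"
# state[primary_key] = value_bytes; state[secondary_key] = primary_key
```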
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/json_ld_context_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/json_ld_context_handler.py
New file
new file mode 100644
index 00000000..202a0ae4
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/json_ld_context_handler.py
@@ -0,0 +1,51 @@
+from re import findall
+
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+from indy_common.constants import JSON_LD_CONTEXT, RS_CONTENT, JSON_LD_CONTEXT_FIELD
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.common.request import Request
+from plenum.server.database_manager import DatabaseManager
+
+URI_REGEX = r'^(?P<scheme>\w+):(?:(?:(?P<url>//[.\w]+)(?:(/(?P<path>[/\w]+)?)?))|(?:(?P<method>\w+):(?P<id>\w+)))'
+
+
+class JsonLdContextHandler(AbstractRichSchemaObjectHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(JSON_LD_CONTEXT, database_manager, write_req_validator)
+
+ def is_json_ld_content(self):
+ return False
+
+ def do_static_validation_content(self, content_as_dict, request: Request):
+ if JSON_LD_CONTEXT_FIELD not in content_as_dict:
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "'{}' must contain a {} field".format(RS_CONTENT, JSON_LD_CONTEXT_FIELD))
+
+ self._validate_context(content_as_dict[JSON_LD_CONTEXT_FIELD], request.identifier, request.reqId)
+
+ def do_dynamic_validation_content(self, request):
+ pass
+
+ def _validate_context(self, context, id, reqId):
+ if isinstance(context, list):
+ for ctx in context:
+ if not isinstance(ctx, dict):
+ if self._bad_uri(ctx):
+ raise InvalidClientRequest(id, reqId, '@context URI {} badly formed'.format(ctx))
+ elif isinstance(context, dict):
+ pass
+ elif isinstance(context, str):
+ if self._bad_uri(context):
+ raise InvalidClientRequest(id, reqId, '@context URI {} badly formed'.format(context))
+ else:
+ raise InvalidClientRequest(id, reqId, "'@context' value must be url, array, or object")
+
+ def _bad_uri(self, uri_string):
+ url = findall(URI_REGEX, uri_string)
+ if not url:
+ return True
+ return False
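The `_bad_uri` check is easy to sanity-test on its own with the same regex; the sample URIs below are only illustrations.

```python
# Quick standalone check of the URI_REGEX used by JsonLdContextHandler._bad_uri above.
from re import findall

URI_REGEX = r'^(?P<scheme>\w+):(?:(?:(?P<url>//[.\w]+)(?:(/(?P<path>[/\w]+)?)?))|(?:(?P<method>\w+):(?P<id>\w+)))'

def bad_uri(uri_string):
    return not findall(URI_REGEX, uri_string)

assert not bad_uri("https://www.w3.org/2018/credentials/v1")   # scheme://host/path form
assert not bad_uri("did:sov:abc123")                            # scheme:method:id form
assert bad_uri("not a context uri")
```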
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_cred_def_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_cred_def_handler.py
New file
new file mode 100644
index 00000000..6985fd96
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_cred_def_handler.py
@@ -0,0 +1,72 @@
+from common.serializers.json_serializer import JsonSerializer
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+
+from indy_common.constants import RICH_SCHEMA_CRED_DEF, RS_CRED_DEF_SIG_TYPE, RS_CRED_DEF_MAPPING, \
+ RS_CRED_DEF_SCHEMA, RS_CRED_DEF_PUB_KEY, RS_CONTENT, RS_TYPE, RS_SCHEMA_TYPE_VALUE, RS_MAPPING_TYPE_VALUE
+
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+from plenum.common.exceptions import InvalidClientRequest
+
+from plenum.server.database_manager import DatabaseManager
+from stp_core.common.log import getlogger
+
+logger = getlogger()
+
+
+class RichSchemaCredDefHandler(AbstractRichSchemaObjectHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(RICH_SCHEMA_CRED_DEF, database_manager, write_req_validator)
+
+ def is_json_ld_content(self):
+ return False
+
+ def do_static_validation_content(self, content_as_dict, request):
+ missing_fields = []
+ for field in [RS_CRED_DEF_SIG_TYPE, RS_CRED_DEF_MAPPING, RS_CRED_DEF_SCHEMA, RS_CRED_DEF_PUB_KEY]:
+ if not content_as_dict.get(field):
+ missing_fields.append("'{}'".format(field))
+
+ if missing_fields:
+ missing_fields_str = ", ".join(missing_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be set in '{}'".format(missing_fields_str, RS_CONTENT))
+
+ def do_dynamic_validation_content(self, request):
+ # it has been checked on static validation step that the content is a valid JSON.
+ # and it has schema and mapping fields
+ content_as_dict = JsonSerializer.loads(request.operation[RS_CONTENT])
+ schema_id = content_as_dict[RS_CRED_DEF_SCHEMA]
+ mapping_id = content_as_dict[RS_CRED_DEF_MAPPING]
+
+ # 1. check that the schema field points to an existing object on the ledger
+ schema, _, _ = self.get_from_state(schema_id)
+ if not schema:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "Can not find a referenced '{}' with id={}; please make sure that it has been added to the ledger".format(
+ RS_CRED_DEF_SCHEMA, schema_id))
+
+ # 2. check that the mapping field points to an existing object on the ledger
+ mapping, _, _ = self.get_from_state(mapping_id)
+ if not mapping:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "Can not find a referenced '{}' with id={}; please make sure that it has been added to the ledger".format(
+ RS_CRED_DEF_MAPPING, mapping_id))
+
+ # 3. check that the schema field points to an object of the Schema type
+ if schema.get(RS_TYPE) != RS_SCHEMA_TYPE_VALUE:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "'{}' field must reference a schema with {}={}".format(
+ RS_CRED_DEF_SCHEMA, RS_TYPE, RS_SCHEMA_TYPE_VALUE))
+
+ # 4. check that the mapping fields points to an object of the Mapping type
+ if mapping.get(RS_TYPE) != RS_MAPPING_TYPE_VALUE:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "'{}' field must reference a mapping with {}={}".format(
+ RS_CRED_DEF_MAPPING, RS_TYPE, RS_MAPPING_TYPE_VALUE))
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_encoding_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_encoding_handler.py
New file
new file mode 100644
index 00000000..052374a5
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_encoding_handler.py
@@ -0,0 +1,58 @@
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+
+from indy_common.constants import RICH_SCHEMA_ENCODING, RS_ENC_INPUT, RS_ENC_OUTPUT, RS_ENC_ALGORITHM, RS_ENC_TEST_VECS, \
+ RS_ENC_ID, RS_ENC_TYPE, RS_ENC_ALG_DESC, RS_ENC_ALG_DOC, RS_ENC_ALG_IMPL, RS_CONTENT
+
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+from plenum.common.exceptions import InvalidClientRequest
+
+from plenum.server.database_manager import DatabaseManager
+from stp_core.common.log import getlogger
+
+logger = getlogger()
+
+
+class RichSchemaEncodingHandler(AbstractRichSchemaObjectHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(RICH_SCHEMA_ENCODING, database_manager, write_req_validator)
+
+ def is_json_ld_content(self):
+ return False
+
+ def do_static_validation_content(self, content_as_dict, request):
+ # 1. check for top level fields
+ missing_fields = []
+ for field in [RS_ENC_INPUT, RS_ENC_OUTPUT, RS_ENC_ALGORITHM, RS_ENC_TEST_VECS]:
+ if not content_as_dict.get(field):
+ missing_fields.append("'{}'".format(field))
+ if missing_fields:
+ missing_fields_str = ", ".join(missing_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be set in '{}'".format(missing_fields_str, RS_CONTENT))
+
+ # 2. check for input-output fields
+ for io_field in [RS_ENC_INPUT, RS_ENC_OUTPUT]:
+ missing_io_fields = []
+ for field in [RS_ENC_ID, RS_ENC_TYPE]:
+ if not content_as_dict[io_field].get(field):
+ missing_io_fields.append("'{}'".format(field))
+ if missing_io_fields:
+ missing_io_fields_str = " and ".join(missing_io_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be set in '{}'".format(missing_io_fields_str, io_field))
+
+ # 3. check for algorithm fields
+ missing_alg_fields = []
+ for field in [RS_ENC_ALG_DESC, RS_ENC_ALG_DOC, RS_ENC_ALG_IMPL]:
+ if not content_as_dict[RS_ENC_ALGORITHM].get(field):
+ missing_alg_fields.append("'{}'".format(field))
+ if missing_alg_fields:
+ missing_alg_fields_str = ", ".join(missing_alg_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be set in '{}'".format(missing_alg_fields_str, RS_ENC_ALGORITHM))
+
+ def do_dynamic_validation_content(self, request):
+ pass
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_handler.py
New file
new file mode 100644
index 00000000..5a3472e4
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_handler.py
@@ -0,0 +1,27 @@
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+
+from indy_common.constants import RICH_SCHEMA
+
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+
+from plenum.server.database_manager import DatabaseManager
+from stp_core.common.log import getlogger
+
+logger = getlogger()
+
+
+class RichSchemaHandler(AbstractRichSchemaObjectHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(RICH_SCHEMA, database_manager, write_req_validator)
+
+ def is_json_ld_content(self):
+ return True
+
+ def do_static_validation_content(self, content_as_dict, request):
+ pass
+
+ def do_dynamic_validation_content(self, request):
+ pass
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_mapping_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_mapping_handler.py
New file
new file mode 100644
index 00000000..6023c16b
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_mapping_handler.py
@@ -0,0 +1,144 @@
+from common.serializers.json_serializer import JsonSerializer
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+
+from indy_common.constants import RICH_SCHEMA_MAPPING, RS_MAPPING_SCHEMA, RS_CONTENT, RS_TYPE, RS_SCHEMA_TYPE_VALUE, \
+ RS_MAPPING_ENC, RS_MAPPING_RANK, RS_ENCODING_TYPE_VALUE, RS_MAPPING_ATTRIBUTES, RS_MAPPING_ISSUER, \
+ RS_MAPPING_ISSUANCE_DATE
+
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+from plenum.common.exceptions import InvalidClientRequest
+
+from plenum.server.database_manager import DatabaseManager
+from stp_core.common.log import getlogger
+
+logger = getlogger()
+
+
+class RichSchemaMappingHandler(AbstractRichSchemaObjectHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(RICH_SCHEMA_MAPPING, database_manager, write_req_validator)
+
+ def is_json_ld_content(self):
+ return True
+
+ def do_static_validation_content(self, content_as_dict, request):
+ # 1. check for missing `schema` or `attributes` fields
+ missing_fields = []
+ for field in [RS_MAPPING_SCHEMA, RS_MAPPING_ATTRIBUTES]:
+ if not content_as_dict.get(field):
+ missing_fields.append("'{}'".format(field))
+
+ if missing_fields:
+ missing_fields_str = " and ".join(missing_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be set in '{}'".format(missing_fields_str, RS_CONTENT))
+
+ # 2. check for missing defaukt attributes in `attributes` (`issuer` and `issuanceDate`)
+ missing_fields = []
+ for field in [RS_MAPPING_ISSUER, RS_MAPPING_ISSUANCE_DATE]:
+ if not content_as_dict[RS_MAPPING_ATTRIBUTES].get(field):
+ missing_fields.append("'{}'".format(field))
+
+ if missing_fields:
+ missing_fields_str = " and ".join(missing_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be in {}'s '{}'".format(missing_fields_str, RS_CONTENT,
+ RS_MAPPING_ATTRIBUTES))
+
+ def do_dynamic_validation_content(self, request):
+ # it has been checked on static validation step that the content is a valid JSON.
+ # and it has schema and attributes fields
+ content_as_dict = JsonSerializer.loads(request.operation[RS_CONTENT])
+
+ # 1. check that the schema field points to an existing object on the ledger
+ schema_id = content_as_dict[RS_MAPPING_SCHEMA]
+ schema, _, _ = self.get_from_state(schema_id)
+ if not schema:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ 'Can not find a schema with id={}; please make sure that it has been added to the ledger'.format(
+ schema_id))
+
+ # 2. check that the schema field points to an object of the Schema type
+ if schema.get(RS_TYPE) != RS_SCHEMA_TYPE_VALUE:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "'{}' field must reference a schema with {}={}".format(
+ RS_MAPPING_SCHEMA, RS_TYPE, RS_SCHEMA_TYPE_VALUE))
+
+ # 3. find all attribute leaf dicts with encoding-rank pairs
+ enc_desc_dicts = list(find_encoding_desc_dicts(content_as_dict[RS_MAPPING_ATTRIBUTES]))
+
+ # 4. check that every dict has encoding and rank fields
+ # Note: this check can be done in static validation, but then we will have to traverse the leaf dicts twice
+ for desc_dict, attr in enc_desc_dicts:
+ if not isinstance(desc_dict, dict):
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "{} and {} must be set for the attribute '{}'".format(RS_MAPPING_ENC,
+ RS_MAPPING_RANK, attr))
+
+ missing_fields = []
+ for field in [RS_MAPPING_ENC, RS_MAPPING_RANK]:
+ v = desc_dict.get(field)
+ if not v and v != 0:
+ missing_fields.append(field)
+
+ if missing_fields:
+ missing_fields_str = " and ".join(missing_fields)
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "{} must be set for the attribute '{}'".format(missing_fields_str, attr))
+
+ # 5. check that all ranks are unique and form a sequence without gaps
+ # Note: this check can be done in static validation, but then we will have to traverse the leaf dicts twice
+ expected_ranks = list(range(1, len(enc_desc_dicts) + 1))
+ ranks = sorted([desc_dict[RS_MAPPING_RANK] for desc_dict, attr in enc_desc_dicts])
+ if ranks != expected_ranks:
+ raise InvalidClientRequest(request.identifier, request.reqId,
+ "the attribute's ranks are not sequential: expected ranks are all values from 1 to {}".format(
+ len(enc_desc_dicts)))
+
+ # 6. check that all the enc fields point to an existing object on the ledger of the type Encoding
+ for desc_dict, attr in enc_desc_dicts:
+ encoding_id = desc_dict[RS_MAPPING_ENC]
+ encoding, _, _ = self.get_from_state(encoding_id)
+ if not encoding:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "Can not find a referenced '{}' with id={} in the '{}' attribute; please make sure that it has been added to the ledger".format(
+ RS_MAPPING_ENC, encoding_id, attr))
+ if encoding.get(RS_TYPE) != RS_ENCODING_TYPE_VALUE:
+ raise InvalidClientRequest(request.identifier,
+ request.reqId,
+ "'{}' field in the '{}' attribute must reference an encoding with {}={}".format(
+ RS_MAPPING_ENC, attr, RS_TYPE, RS_ENCODING_TYPE_VALUE))
+
+
+def find_encoding_desc_dicts(content, last_attr=None):
+ if is_leaf_dict(content):
+ yield content, last_attr
+ return
+
+ for k, v in content.items():
+ if k in []:
+ continue
+ if isinstance(v, list):
+ for item in v:
+ for desc_dict, last_attr in find_encoding_desc_dicts(item, k):
+ yield desc_dict, last_attr
+ else:
+ for desc_dict, last_attr in find_encoding_desc_dicts(v, k):
+ yield desc_dict, last_attr
+
+
+def is_leaf_dict(node):
+ if isinstance(node, dict):
+ for k, v in node.items():
+ if isinstance(v, dict):
+ return False
+ if isinstance(v, list):
+ return False
+ return True
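To make the rank rule concrete, here is a hedged sketch of an attributes tree that would pass checks 4 and 5 above; the literal 'enc' and 'rank' keys stand in for RS_MAPPING_ENC / RS_MAPPING_RANK, and the encoding ids are placeholders.

```python
# Hedged sketch: nested attributes whose leaf descriptors satisfy the rank checks above.
attributes = {
    "issuer": [{"enc": "<encoding-id-1>", "rank": 1}],
    "issuanceDate": [{"enc": "<encoding-id-2>", "rank": 2}],
    "degree": {
        "name": [{"enc": "<encoding-id-3>", "rank": 3}],
    },
}

def leaf_descriptors(node):
    """Yield every leaf dict (enc/rank descriptor), mirroring find_encoding_desc_dicts()."""
    if isinstance(node, dict) and not any(isinstance(v, (dict, list)) for v in node.values()):
        yield node
        return
    for child in (node.values() if isinstance(node, dict) else node):
        if isinstance(child, (dict, list)):
            yield from leaf_descriptors(child)

ranks = sorted(d["rank"] for d in leaf_descriptors(attributes))
assert ranks == list(range(1, len(ranks) + 1))  # ranks must be 1..N with no gaps or duplicates
```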
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_pres_def_handler.py b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_pres_def_handler.py
New file
new file mode 100644
index 00000000..05e0e72d
--- /dev/null
+++ b/indy_node/server/request_handlers/domain_req_handlers/rich_schema/rich_schema_pres_def_handler.py
@@ -0,0 +1,28 @@
+from indy_common.authorize.auth_request_validator import WriteRequestValidator
+
+from indy_common.constants import RICH_SCHEMA_PRES_DEF
+
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+
+from plenum.server.database_manager import DatabaseManager
+from stp_core.common.log import getlogger
+
+logger = getlogger()
+
+
+class RichSchemaPresDefHandler(AbstractRichSchemaObjectHandler):
+
+ def __init__(self, database_manager: DatabaseManager,
+ write_req_validator: WriteRequestValidator):
+ super().__init__(RICH_SCHEMA_PRES_DEF, database_manager, write_req_validator)
+
+ def is_json_ld_content(self):
+ return True
+
+ # TODO: implement specific validation
+ def do_static_validation_content(self, content_as_dict, request):
+ pass
+
+ def do_dynamic_validation_content(self, request):
+ pass
* [X] diff --git a/indy_node/server/request_handlers/domain_req_handlers/schema_handler.py b/indy_node/server/request_handlers/domain_req_handlers/schema_handler.py
Only merges
index fe34d203..4702bc2d 100644
--- a/indy_node/server/request_handlers/domain_req_handlers/schema_handler.py
+++ b/indy_node/server/request_handlers/domain_req_handlers/schema_handler.py
@@ -27,7 +27,7 @@ class SchemaHandler(WriteRequestHandler):
def static_validation(self, request: Request):
pass
- def dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
+ def additional_dynamic_validation(self, request: Request, req_pp_time: Optional[int]):
# we can not add a Schema with already existent NAME and VERSION
# sine a Schema needs to be identified by seqNo
self._validate_request_type(request)
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/get_attribute_handler.py b/indy_node/server/request_handlers/read_req_handlers/get_attribute_handler.py
Only merges
index 0dcba0c1..2359b351 100644
--- a/indy_node/server/request_handlers/read_req_handlers/get_attribute_handler.py
+++ b/indy_node/server/request_handlers/read_req_handlers/get_attribute_handler.py
@@ -1,74 +1,112 @@
-from indy_node.server.request_handlers.domain_req_handlers.attribute_handler import AttributeHandler
-from plenum.server.request_handlers.handler_interfaces.read_request_handler import ReadRequestHandler
-from indy_common.constants import ATTRIB, GET_ATTR
-from indy_node.server.request_handlers.utils import validate_attrib_keys
-from plenum.common.constants import RAW, ENC, HASH, TARGET_NYM, DOMAIN_LEDGER_ID
+from typing import Any, Mapping
+
+from plenum.common.constants import DOMAIN_LEDGER_ID, ENC, HASH, RAW, TARGET_NYM
from plenum.common.exceptions import InvalidClientRequest
from plenum.common.request import Request
from plenum.common.txn_util import get_request_data
from plenum.server.database_manager import DatabaseManager
+from plenum.server.request_handlers.utils import decode_state_value
from stp_core.common.log import getlogger
+from indy_common.constants import ATTRIB, GET_ATTR, VERSION_ID, VERSION_TIME
+from indy_node.server.request_handlers.domain_req_handlers.attribute_handler import (
+ AttributeHandler,
+)
+from indy_node.server.request_handlers.read_req_handlers.version_read_request_handler import (
+ VersionReadRequestHandler,
+)
+from indy_node.server.request_handlers.utils import validate_attrib_keys
+
logger = getlogger()
-class GetAttributeHandler(ReadRequestHandler):
+class GetAttributeHandler(VersionReadRequestHandler):
- def __init__(self, database_manager: DatabaseManager):
- super().__init__(database_manager, GET_ATTR, DOMAIN_LEDGER_ID)
+ def __init__(self, node, database_manager: DatabaseManager):
+ super().__init__(node, database_manager, GET_ATTR, DOMAIN_LEDGER_ID)
def get_result(self, request: Request):
self._validate_request_type(request)
+
identifier, req_id, operation = get_request_data(request)
+
+ timestamp = operation.get(VERSION_TIME)
+ seq_no = operation.get(VERSION_ID)
+ if timestamp and seq_no:
+ raise InvalidClientRequest(
+ request.identifier,
+ request.reqId,
+ f"{VERSION_ID} and {VERSION_TIME} are mutually exclusive; only one should be "
+ "specified",
+ )
+ # The above check determines whether the request is valid
+ # A similar check in VersionReadRequestHandler determines
+ # whether the method is used correctly
+
if not validate_attrib_keys(operation):
- raise InvalidClientRequest(identifier, req_id,
- '{} should have one and only one of '
- '{}, {}, {}'
- .format(ATTRIB, RAW, ENC, HASH))
- nym = operation[TARGET_NYM]
+ raise InvalidClientRequest(
+ identifier, req_id,
+ '{} should have one and only one of '
+ '{}, {}, {}'
+ .format(ATTRIB, RAW, ENC, HASH)
+ )
+
+ attr_type = self._get_attr_type(operation)
+ path = AttributeHandler.make_state_path_for_attr(
+ operation[TARGET_NYM],
+ operation[attr_type],
+ attr_type == HASH,
+ )
+
+ encoded, proof = self.lookup_version(
+ path, seq_no=seq_no, timestamp=timestamp, with_proof=True
+ )
+
+ if not encoded:
+ return self.make_result(
+ request=request,
+ data=None,
+ last_seq_no=None,
+ update_time=None,
+ proof=proof
+ )
+
+ store_key, last_seq_no, last_update_time = decode_state_value(encoded)
+
+ if attr_type == HASH:
+ return self.make_result(
+ request=request,
+ data=operation[HASH],
+ last_seq_no=last_seq_no,
+ update_time=last_update_time,
+ proof=proof
+ )
+
+ store_value = self._get_value_from_attribute_store(store_key)
+ return self.make_result(
+ request=request,
+ data=store_value,
+ last_seq_no=last_seq_no,
+ update_time=last_update_time,
+ proof=proof
+ )
+
+ @staticmethod
+ def _get_attr_type(operation: Mapping[str, Any]):
+ """Return attribute type based on presence of keys in operation."""
if RAW in operation:
- attr_type = RAW
- elif ENC in operation:
- # If attribute is encrypted, it will be queried by its hash
- attr_type = ENC
- else:
- attr_type = HASH
- attr_key = operation[attr_type]
- value, last_seq_no, last_update_time, proof = \
- self.get_attr(did=nym, key=attr_key, attr_type=attr_type)
- attr = None
- if value is not None:
- if HASH in operation:
- attr = attr_key
- else:
- attr = value
- return self.make_result(request=request,
- data=attr,
- last_seq_no=last_seq_no,
- update_time=last_update_time,
- proof=proof)
-
- def get_attr(self,
- did: str,
- key: str,
- attr_type,
- is_committed=True) -> (str, int, int, list):
- assert did is not None
- assert key is not None
- path = AttributeHandler.make_state_path_for_attr(did, key, attr_type == HASH)
+ return RAW
+ if ENC in operation:
+ return ENC
+ return HASH
+
+ def _get_value_from_attribute_store(self, key: str):
+ """Retrieve value from attribute store or return None if it doesn't exist."""
try:
- hashed_val, last_seq_no, last_update_time, proof = \
- self.lookup(path, is_committed, with_proof=True)
+ value = self.database_manager.attribute_store.get(key)
except KeyError:
- return None, None, None, None
- if not hashed_val or hashed_val == '':
- # Its a HASH attribute
- return hashed_val, last_seq_no, last_update_time, proof
- else:
- try:
- value = self.database_manager.attribute_store.get(hashed_val)
- except KeyError:
- logger.error('Could not get value from attribute store for {}'
- .format(hashed_val))
- return None, None, None, None
- return value, last_seq_no, last_update_time, proof
+ logger.error(
+ 'Could not get value from attribute store for %s', key
+ )
+ return None
+ return value
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/get_context_handler.py b/indy_node/server/request_handlers/read_req_handlers/get_context_handler.py
Deleted file, only merges in history...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/request_handlers/read_req_handlers/get_context_handler.py
* commit ff88db3954bcc7c111e5c3d2fa5e66717291c370
Merge: 089d12e4 1e437ca1
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Thu Oct 3 12:01:40 2019 +0300
Merge remote-tracking branch 'public/master' into rc-1.10.0.rc1
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
deleted file mode 100644
index 145b6781..00000000
--- a/indy_node/server/request_handlers/read_req_handlers/get_context_handler.py
+++ /dev/null
@@ -1,46 +0,0 @@
-from indy_common.constants import GET_CONTEXT
-from indy_common.req_utils import get_read_context_from, get_read_context_name, get_read_context_version
-from indy_node.server.request_handlers.domain_req_handlers.context_handler import ContextHandler
-from plenum.common.constants import DOMAIN_LEDGER_ID
-from plenum.common.request import Request
-from plenum.server.database_manager import DatabaseManager
-from plenum.server.request_handlers.handler_interfaces.read_request_handler import ReadRequestHandler
-
-
-class GetContextHandler(ReadRequestHandler):
-
- def __init__(self, database_manager: DatabaseManager):
- super().__init__(database_manager, GET_CONTEXT, DOMAIN_LEDGER_ID)
-
- def get_result(self, request: Request):
- self._validate_request_type(request)
- author_did = get_read_context_from(request)
- context_name = get_read_context_name(request)
- context_version = get_read_context_version(request)
- context, last_seq_no, last_update_time, proof = self.get_context(
- author=author_did,
- context_name=context_name,
- context_version=context_version,
- with_proof=True
- )
- return self.make_result(request=request,
- data=context,
- last_seq_no=last_seq_no,
- update_time=last_update_time,
- proof=proof)
-
- def get_context(self,
- author: str,
- context_name: str,
- context_version: str,
- is_committed=True,
- with_proof=True) -> (str, int, int, list):
- assert author is not None
- assert context_name is not None
- assert context_version is not None
- path = ContextHandler.make_state_path_for_context(author, context_name, context_version)
- try:
- keys, seq_no, last_update_time, proof = self.lookup(path, is_committed, with_proof=with_proof)
- return keys, seq_no, last_update_time, proof
- except KeyError:
- return None, None, None, None
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/get_flag_handler.py b/indy_node/server/request_handlers/read_req_handlers/get_flag_handler.py
New file
new file mode 100644
index 00000000..07bff542
--- /dev/null
+++ b/indy_node/server/request_handlers/read_req_handlers/get_flag_handler.py
@@ -0,0 +1,75 @@
+from typing import Any, Dict, Optional, Tuple, cast, Union
+from common.serializers.serialization import config_state_serializer
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.common.request import Request
+from plenum.server.database_manager import DatabaseManager
+from plenum.server.node import Node
+from indy_common.constants import CONFIG_LEDGER_ID, GET_FLAG, FLAG_NAME, VERSION_ID, VERSION_TIME
+from indy_common.state.state_constants import LAST_SEQ_NO, LAST_UPDATE_TIME
+from indy_node.server.request_handlers.config_req_handlers.flag_handler import FlagRequestHandler
+from indy_node.server.request_handlers.read_req_handlers.version_read_request_handler import VersionReadRequestHandler
+from storage.state_ts_store import StateTsDbStorage
+
+
+class GetFlagRequestHandler(VersionReadRequestHandler):
+ """Flag Read Request handler with historic lookup."""
+
+ def __init__(self, node: Node, database_manager: DatabaseManager):
+ super().__init__(node, database_manager, GET_FLAG, CONFIG_LEDGER_ID)
+ self.timestamp_store: StateTsDbStorage = cast(
+ StateTsDbStorage, self.database_manager.ts_store
+ )
+ self.state_serializer = config_state_serializer
+
+ def get_result(self, request: Request) -> Dict[str, Any]:
+ self._validate_request_type(request)
+ key = request.operation.get(FLAG_NAME)
+ if key is None:
+ raise InvalidClientRequest(
+ request.identifier,
+ request.reqId,
+ "Flag name must be provided in get request for config flags"
+ )
+ path = FlagRequestHandler.make_state_path_for_flag(key)
+
+ timestamp = request.operation.get(VERSION_TIME)
+ seq_no = request.operation.get(VERSION_ID)
+
+ if timestamp and seq_no:
+ raise InvalidClientRequest(
+ request.identifier,
+ request.reqId,
+ f"{VERSION_ID} and {VERSION_TIME} are mutually exclusive; only one should be "
+ "specified",
+ )
+
+ flag_data, proof = self.deserialize(self.lookup_version(
+ path, seq_no=seq_no, timestamp=timestamp, with_proof=True
+ ))
+
+ last_seq_no = None
+ update_time = None
+ if flag_data:
+ last_seq_no = flag_data[LAST_SEQ_NO]
+ update_time = flag_data[LAST_UPDATE_TIME]
+
+ result = self.make_result(
+ request=request,
+ data=flag_data,
+ proof=proof,
+ last_seq_no=last_seq_no,
+ update_time=update_time
+ )
+ return result
+
+ def lookup_key(self, flag: str, seq_no=None, timestamp=None) -> Tuple[Optional[Union[bytes, str]], Any]:
+ """Lookup a flag from the ledger state and return only the value"""
+ path = FlagRequestHandler.make_state_path_for_flag(flag)
+ value, _ = self.deserialize(self.lookup_version(path, seq_no, timestamp, with_proof=False))
+ return value
+
+ def deserialize(self, input: Tuple[Optional[Union[bytes, str]], Any]) -> Tuple[Optional[Union[bytes, str]], Any]:
+ (value, proof) = input
+ if value:
+ value = self.state_serializer.deserialize(value)
+ return (value, proof)
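
Side note for review: the new GET_FLAG handler takes a flag name plus an optional seqNo (VERSION_ID) or timestamp (VERSION_TIME), never both. A minimal sketch of how a client-side operation payload could be assembled against the same constants the handler imports — the helper function itself is illustrative and not part of the diff:

```python
from indy_common.constants import GET_FLAG, FLAG_NAME, VERSION_ID, VERSION_TIME
from plenum.common.constants import TXN_TYPE


def build_get_flag_operation(name, seq_no=None, timestamp=None):
    """Sketch only: build the operation dict GetFlagRequestHandler expects."""
    if seq_no is not None and timestamp is not None:
        # mirrors the InvalidClientRequest raised by get_result()
        raise ValueError(f"{VERSION_ID} and {VERSION_TIME} are mutually exclusive")
    op = {TXN_TYPE: GET_FLAG, FLAG_NAME: name}
    if seq_no is not None:
        op[VERSION_ID] = seq_no        # historic lookup by ledger seqNo
    if timestamp is not None:
        op[VERSION_TIME] = timestamp   # historic lookup by txn time
    return op
```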
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/get_nym_handler.py b/indy_node/server/request_handlers/read_req_handlers/get_nym_handler.py
Only merges
index 92e50b77..fb71cfc7 100644
--- a/indy_node/server/request_handlers/read_req_handlers/get_nym_handler.py
+++ b/indy_node/server/request_handlers/read_req_handlers/get_nym_handler.py
@@ -1,41 +1,62 @@
-from indy_common.constants import GET_NYM
+from indy_common.constants import GET_NYM, VERSION_ID, VERSION_TIME
from common.serializers.serialization import domain_state_serializer
from indy_node.server.request_handlers.domain_req_handlers.nym_handler import NymHandler
from plenum.common.constants import TARGET_NYM, TXN_TIME, DOMAIN_LEDGER_ID
from plenum.common.request import Request
from plenum.common.types import f
+from plenum.common.exceptions import InvalidClientRequest
from plenum.server.database_manager import DatabaseManager
-from plenum.server.request_handlers.handler_interfaces.read_request_handler import ReadRequestHandler
+from .version_read_request_handler import VersionReadRequestHandler
-class GetNymHandler(ReadRequestHandler):
- def __init__(self, database_manager: DatabaseManager):
- super().__init__(database_manager, GET_NYM, DOMAIN_LEDGER_ID)
+class GetNymHandler(VersionReadRequestHandler):
+ def __init__(self, node, database_manager: DatabaseManager):
+ super().__init__(node, database_manager, GET_NYM, DOMAIN_LEDGER_ID)
def get_result(self, request: Request):
self._validate_request_type(request)
nym = request.operation[TARGET_NYM]
path = NymHandler.make_state_path_for_nym(nym)
- nym_data, proof = self._get_value_from_state(path, with_proof=True)
+
+ timestamp = request.operation.get(VERSION_TIME)
+ seq_no = request.operation.get(VERSION_ID)
+
+ if timestamp and seq_no:
+ raise InvalidClientRequest(
+ request.identifier,
+ request.reqId,
+ f"{VERSION_ID} and {VERSION_TIME} are mutually exclusive; only one should be "
+ "specified",
+ )
+ # The above check determines whether the request is valid
+ # A similar check in VersionReadRequestHandler determines
+ # whether the method is used correctly
+
+ data = None
+ last_seq_no = None
+ update_time = None
+ proof = None
+
+ nym_data, proof = self.lookup_version(
+ path, seq_no=seq_no, timestamp=timestamp, with_proof=True
+ )
+
if nym_data:
nym_data = domain_state_serializer.deserialize(nym_data)
nym_data[TARGET_NYM] = nym
data = domain_state_serializer.serialize(nym_data)
- seq_no = nym_data[f.SEQ_NO.nm]
+ last_seq_no = nym_data[f.SEQ_NO.nm]
update_time = nym_data[TXN_TIME]
- else:
- data = None
- seq_no = None
- update_time = None
-
- # TODO: add update time here!
- result = self.make_result(request=request,
- data=data,
- last_seq_no=seq_no,
- update_time=update_time,
- proof=proof)
+
+ result = self.make_result(
+ request=request,
+ data=data, # Serailized retrieved txn data
+ last_seq_no=last_seq_no, # nym_data[seqNo]
+ update_time=update_time, # nym_data[TXN_TIME]
+ proof=proof, # _get_value_from_state(..., with_proof=True)[1]
+ )
result.update(request.operation)
return result
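
Same pattern for GET_NYM: the handler now honours seqNo/timestamp for historic lookups. An illustrative client-side sketch, mirroring how the attrib tests further down inject these fields (the literal key names "seqNo" and "timestamp" are assumed from those tests, not taken from this hunk):

```python
import json
from indy.ledger import build_get_nym_request


async def build_historic_get_nym(submitter_did, target_did, seq_no=None, timestamp=None):
    # seq_no and timestamp are mutually exclusive, as enforced by GetNymHandler
    assert not (seq_no and timestamp)
    req = json.loads(await build_get_nym_request(submitter_did, target_did))
    if seq_no:
        req["operation"]["seqNo"] = seq_no
    elif timestamp:
        req["operation"]["timestamp"] = timestamp
    return json.dumps(req)
```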
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/get_revoc_reg_delta_handler.py b/indy_node/server/request_handlers/read_req_handlers/get_revoc_reg_delta_handler.py
Only merges
index b4c84e47..cdb16e7a 100644
--- a/indy_node/server/request_handlers/read_req_handlers/get_revoc_reg_delta_handler.py
+++ b/indy_node/server/request_handlers/read_req_handlers/get_revoc_reg_delta_handler.py
@@ -1,4 +1,4 @@
-from collections import Callable
+from collections.abc import Callable
from indy_common.constants import FROM, TO, REVOC_REG_DEF_ID, ISSUANCE_TYPE, REVOKED, ISSUED, VALUE, REVOC_TYPE, \
ACCUM_TO, STATE_PROOF_FROM, ACCUM_FROM, GET_REVOC_REG_DELTA
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/get_rs_schema_handler.py b/indy_node/server/request_handlers/read_req_handlers/get_rs_schema_handler.py
New file
new file mode 100644
index 00000000..99dd64aa
--- /dev/null
+++ b/indy_node/server/request_handlers/read_req_handlers/get_rs_schema_handler.py
@@ -0,0 +1,26 @@
+from indy_common.constants import GET_RS_SCHEMA, RS_META, RS_META_NAME, RS_META_VERSION, RS_SCHEMA_FROM, \
+ DOMAIN_LEDGER_ID
+from plenum.common.request import Request
+from plenum.server.database_manager import DatabaseManager
+from plenum.server.request_handlers.handler_interfaces.read_request_handler import ReadRequestHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rs_schema_handler import RsSchemaHandler
+
+
+class GetRsSchemaHandler(ReadRequestHandler):
+ def __init__(self, database_manager: DatabaseManager):
+ super().__init__(database_manager, GET_RS_SCHEMA, DOMAIN_LEDGER_ID)
+
+ def get_result(self, request: Request):
+ self._validate_request_type(request)
+ author_did = request.operation[RS_SCHEMA_FROM]
+ name = request.operation[RS_META][RS_META_NAME]
+ version = request.operation[RS_META][RS_META_VERSION]
+ state_path = RsSchemaHandler.make_state_path(author_did, name, version)
+
+ rs_schema, last_seq_no, last_update_time, proof = {}, None, None, None
+ try: # lookup returns: keys, seq_no, last_update_time, proof
+ rs_schema, last_seq_no, last_update_time, proof = self.lookup(state_path, True, True)
+ except KeyError:
+ pass
+ return self.make_result(request=request, data=rs_schema, last_seq_no=last_seq_no, update_time=last_update_time,
+ proof=proof)
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/rich_schema/__init__.py b/indy_node/server/request_handlers/read_req_handlers/rich_schema/__init__.py
New file
new file mode 100644
index 00000000..e69de29b
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/rich_schema/abstract_rich_schema_read_req_handler.py b/indy_node/server/request_handlers/read_req_handlers/rich_schema/abstract_rich_schema_read_req_handler.py
New file
new file mode 100644
index 00000000..8abfe522
--- /dev/null
+++ b/indy_node/server/request_handlers/read_req_handlers/rich_schema/abstract_rich_schema_read_req_handler.py
@@ -0,0 +1,26 @@
+from abc import abstractmethod
+
+from plenum.server.request_handlers.handler_interfaces.read_request_handler import ReadRequestHandler
+from plenum.common.exceptions import InvalidClientRequest
+from indy_common.config_util import getConfig
+from plenum.common.request import Request
+from plenum.server.database_manager import DatabaseManager
+
+
+class AbstractRichSchemaReadRequestHandler(ReadRequestHandler):
+
+ def __init__(self, database_manager: DatabaseManager, txn_type, ledger_id):
+ super().__init__(database_manager, txn_type, ledger_id)
+ self.config = getConfig()
+
+ def _enabled(self) -> bool:
+ return self.config.ENABLE_RICH_SCHEMAS
+
+ def _validate_enabled(self, request: Request):
+ if not self._enabled():
+ raise InvalidClientRequest(request.identifier, request.reqId, "RichSchema queries are disabled")
+
+ @abstractmethod
+ def get_result(self, request: Request):
+ super().get_result(request)
+ self._validate_enabled(request)
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/rich_schema/get_rich_schema_object_by_id_handler.py b/indy_node/server/request_handlers/read_req_handlers/rich_schema/get_rich_schema_object_by_id_handler.py
New file
new file mode 100644
index 00000000..8a72c5ff
--- /dev/null
+++ b/indy_node/server/request_handlers/read_req_handlers/rich_schema/get_rich_schema_object_by_id_handler.py
@@ -0,0 +1,31 @@
+from indy_common.constants import GET_RICH_SCHEMA_OBJECT_BY_ID, RS_ID
+from indy_common.config_util import getConfig
+from indy_node.server.request_handlers.read_req_handlers.rich_schema.abstract_rich_schema_read_req_handler import \
+ AbstractRichSchemaReadRequestHandler
+from plenum.common.constants import DOMAIN_LEDGER_ID
+from plenum.common.request import Request
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.server.database_manager import DatabaseManager
+
+
+class GetRichSchemaObjectByIdHandler(AbstractRichSchemaReadRequestHandler):
+
+ def __init__(self, database_manager: DatabaseManager):
+ super().__init__(database_manager, GET_RICH_SCHEMA_OBJECT_BY_ID, DOMAIN_LEDGER_ID)
+
+ def get_result(self, request: Request):
+ super().get_result(request)
+ self._validate_request_type(request)
+
+ id = request.operation[RS_ID]
+
+ try:
+ value, seq_no, last_update_time, proof = self.lookup(id, is_committed=True, with_proof=True)
+ except KeyError:
+ value, seq_no, last_update_time, proof = None, None, None, None
+
+ return self.make_result(request=request,
+ data=value,
+ last_seq_no=seq_no,
+ update_time=last_update_time,
+ proof=proof)
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/rich_schema/get_rich_schema_object_by_metadata_handler.py b/indy_node/server/request_handlers/read_req_handlers/rich_schema/get_rich_schema_object_by_metadata_handler.py
New file
new file mode 100644
index 00000000..44ffb4bc
--- /dev/null
+++ b/indy_node/server/request_handlers/read_req_handlers/rich_schema/get_rich_schema_object_by_metadata_handler.py
@@ -0,0 +1,37 @@
+from indy_common.constants import RS_NAME, GET_RICH_SCHEMA_OBJECT_BY_METADATA, \
+ RS_TYPE, RS_VERSION
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+from indy_node.server.request_handlers.read_req_handlers.rich_schema.abstract_rich_schema_read_req_handler import \
+ AbstractRichSchemaReadRequestHandler
+from plenum.common.constants import DOMAIN_LEDGER_ID
+from plenum.common.request import Request
+from plenum.server.database_manager import DatabaseManager
+
+
+class GetRichSchemaObjectByMetadataHandler(AbstractRichSchemaReadRequestHandler):
+
+ def __init__(self, database_manager: DatabaseManager):
+ super().__init__(database_manager, GET_RICH_SCHEMA_OBJECT_BY_METADATA, DOMAIN_LEDGER_ID)
+
+ def get_result(self, request: Request):
+ super().get_result(request)
+ self._validate_request_type(request)
+
+ secondary_key = AbstractRichSchemaObjectHandler.make_secondary_key(request.operation[RS_TYPE],
+ request.operation[RS_NAME],
+ request.operation[RS_VERSION])
+ value, seq_no, last_update_time, proof = None, None, None, None
+ try:
+ id, proof = self._get_value_from_state(secondary_key, with_proof=True)
+ if id is not None:
+ value, seq_no, last_update_time, proof = self.lookup(id, is_committed=True, with_proof=True)
+ except KeyError:
+ # means absence of data
+ pass
+
+ return self.make_result(request=request,
+ data=value,
+ last_seq_no=seq_no,
+ update_time=last_update_time,
+ proof=proof)
* [X] diff --git a/indy_node/server/request_handlers/read_req_handlers/version_read_request_handler.py b/indy_node/server/request_handlers/read_req_handlers/version_read_request_handler.py
New file
new file mode 100644
index 00000000..8682f266
--- /dev/null
+++ b/indy_node/server/request_handlers/read_req_handlers/version_read_request_handler.py
@@ -0,0 +1,68 @@
+
+from typing import Any, Optional, Tuple, Union, cast
+from plenum.common.txn_util import get_txn_time
+from plenum.server.database_manager import DatabaseManager
+from plenum.server.request_handlers.handler_interfaces.read_request_handler import ReadRequestHandler
+from plenum.server.node import Node
+from storage.state_ts_store import StateTsDbStorage
+
+
+class VersionReadRequestHandler(ReadRequestHandler):
+ """Specialized read request handler enabling looking up past versions."""
+
+ def __init__(
+ self, node: Node, database_manager: DatabaseManager, txn_type, ledger_id
+ ):
+ super().__init__(database_manager, txn_type, ledger_id)
+ self.node = node
+ self.timestamp_store: StateTsDbStorage = cast(
+ StateTsDbStorage, self.database_manager.ts_store
+ )
+
+ def lookup_version(
+ self,
+ path: bytes,
+ seq_no: Optional[int] = None,
+ timestamp: Optional[int] = None,
+ with_proof=False,
+ ) -> Tuple[Optional[Union[bytes, str]], Any]:
+ """Lookup a value from the ledger state, optionally retrieving from the past.
+
+ If seq_no or timestamp is specified and no value is found, returns (None, None).
+ If neither are specified, value in its current state is retrieved, which will
+ also return (None, None) if it is not found on the ledger.
+
+ seq_no and timestamp are mutually exclusive.
+ """
+ if seq_no is not None and timestamp is not None:
+ raise ValueError("seq_no and timestamp are mutually exclusive")
+ # The above check determines whether the method is used correctly
+ # A similar check in GetNymHandler and GetAttributeHandler determines
+ # whether the request is valid
+
+ if seq_no:
+ timestamp = self._timestamp_from_seq_no(seq_no)
+ if not timestamp:
+ return None, None
+
+ if timestamp:
+ past_root = self.timestamp_store.get_equal_or_prev(timestamp, ledger_id=self.ledger_id)
+ if past_root:
+ return self._get_value_from_state(
+ path, head_hash=past_root, with_proof=with_proof
+ )
+
+ return None, None
+
+ return self._get_value_from_state(
+ path, with_proof=with_proof
+ )
+
+ def _timestamp_from_seq_no(self, seq_no: int) -> Optional[int]:
+ """Return timestamp of a transaction identified by seq_no."""
+ db = self.database_manager.get_database(self.ledger_id)
+ txn = self.node.getReplyFromLedger(db.ledger, seq_no, write=False)
+
+ if txn and "result" in txn:
+ return get_txn_time(txn.result)
+ return None
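
For orientation, lookup_version() resolves in this order: seqNo to txn time (via getReplyFromLedger), txn time to a past state root (via the timestamp store), then root to (value, proof). A minimal sketch of a concrete handler reusing it; the txn type, state path, and request field names below are placeholders, not part of the diff:

```python
from plenum.common.constants import DOMAIN_LEDGER_ID
from plenum.common.request import Request
from plenum.server.database_manager import DatabaseManager
from indy_node.server.request_handlers.read_req_handlers.version_read_request_handler import \
    VersionReadRequestHandler


class ExampleVersionedReadHandler(VersionReadRequestHandler):
    def __init__(self, node, database_manager: DatabaseManager):
        # "12345" is a placeholder txn type used only for this sketch
        super().__init__(node, database_manager, "12345", DOMAIN_LEDGER_ID)

    def get_result(self, request: Request):
        self._validate_request_type(request)
        path = request.operation["path"].encode()          # placeholder state path
        value, proof = self.lookup_version(
            path,
            seq_no=request.operation.get("seqNo"),          # assumed field name
            timestamp=request.operation.get("timestamp"),   # assumed field name
            with_proof=True,
        )
        return self.make_result(request=request, data=value, proof=proof)
```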
* [X] diff --git a/indy_node/server/restarter.py b/indy_node/server/restarter.py
Git log shows the following changes present in stable but not in the ubuntu branch...
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/server/restarter.py
|
| diff --git a/indy_node/server/restarter.py b/indy_node/server/restarter.py
| index 1d622220..6f648e86 100644
| --- a/indy_node/server/restarter.py
| +++ b/indy_node/server/restarter.py
| @@ -52,9 +52,6 @@ class Restarter(NodeMaintainer):
| self._actionLog.appendSucceeded(when)
| logger.info("Node '{}' successfully restarted"
| .format(self.nodeName))
| - self._notifier.sendMessageUponNodeRestartComplete(
| - "Restart of node '{}' scheduled on {} "
| - "completed successfully".format(self.nodeName, when))
|
| def handleRestartRequest(self, req: Request) -> None:
| """
| @@ -252,7 +249,6 @@ class Restarter(NodeMaintainer):
| if external_reason:
| logger.error("This problem may have external reasons, "
| "check syslog for more information")
| - self._notifier.sendMessageUponNodeRestartFail(error_message)
|
|
| class RestartMessage(NodeControlToolMessage):
|
Those changes are unrelated to the diff below...
index 08858891..8eaccefd 100644
--- a/indy_node/server/restarter.py
+++ b/indy_node/server/restarter.py
@@ -216,8 +216,7 @@ class Restarter(NodeMaintainer):
"""
logger.info("Timeout exceeded for {}".format(ev_data.when))
last = self._actionLog.last_event
- if (last and last.ev_type == RestartLog.Events.failed and
- last.data == ev_data):
+ if (last and last.ev_type == RestartLog.Events.failed and last.data == ev_data):
return None
self._action_failed(ev_data,
* [X] diff --git a/indy_node/server/revocation_strategy.py b/indy_node/server/revocation_strategy.py
Only merges
index 10e93a94..60015329 100644
--- a/indy_node/server/revocation_strategy.py
+++ b/indy_node/server/revocation_strategy.py
@@ -2,6 +2,7 @@ from abc import abstractmethod, ABCMeta
from copy import deepcopy
+from indy_common.compat_set import CompatSet
from indy_common.state import domain
from indy_common.types import Request
from indy_common.constants import REVOC_REG_DEF_ID, VALUE, ACCUM, PREV_ACCUM, ISSUED, REVOKED
@@ -11,11 +12,12 @@ from plenum.common.txn_util import get_from, get_req_id, get_payload_data
class RevocationStrategy(metaclass=ABCMeta):
- def __init__(self, state):
+ def __init__(self, state, sort_legacy=False):
self.state = state
self.author_did = None
self.revoc_reg_def_id = None
self.req_id = None
+ self.sort_legacy = sort_legacy
def set_parameters_from_txn(self, author_did, revoc_reg_def_id, req_id):
self.author_did = author_did
@@ -137,10 +139,17 @@ class RevokedStrategy(RevocationStrategy):
issued_from_txn = value_from_txn.get(ISSUED, [])
revoked_from_txn = value_from_txn.get(REVOKED, [])
# set with all previous revoked minus issued from txn
- result_indicies = set(indices).difference(issued_from_txn)
- result_indicies.update(revoked_from_txn)
+ if self.sort_legacy:
+ result_indicies = CompatSet(indices).difference(issued_from_txn)
+ result_indicies.update(revoked_from_txn)
+ result_indicies = list(result_indicies)
+ else:
+ result_indicies = set(indices).difference(issued_from_txn)
+ result_indicies.update(revoked_from_txn)
+ result_indicies = list(result_indicies)
+ result_indicies.sort()
value_from_txn[ISSUED] = []
- value_from_txn[REVOKED] = list(result_indicies)
+ value_from_txn[REVOKED] = result_indicies
txn_data[VALUE] = value_from_txn
# contains already changed txn
self.set_to_state(txn)
@@ -195,10 +204,17 @@ class IssuedStrategy(RevocationStrategy):
issued_from_txn = value_from_txn.get(ISSUED, [])
revoked_from_txn = value_from_txn.get(REVOKED, [])
# set with all previous issued minus revoked from txn
- result_indicies = set(indices).difference(revoked_from_txn)
- result_indicies.update(issued_from_txn)
+ if self.sort_legacy:
+ result_indicies = CompatSet(indices).difference(revoked_from_txn)
+ result_indicies.update(issued_from_txn)
+ result_indicies = list(result_indicies)
+ else:
+ result_indicies = set(indices).difference(revoked_from_txn)
+ result_indicies.update(issued_from_txn)
+ result_indicies = list(result_indicies)
+ result_indicies.sort()
value_from_txn[REVOKED] = []
- value_from_txn[ISSUED] = list(result_indicies)
+ value_from_txn[ISSUED] = result_indicies
txn_data[VALUE] = value_from_txn
# contains already changed txn
self.set_to_state(txn)
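
The functional change here: on the non-legacy path the resulting index list is now sorted, while sort_legacy=True keeps the old ordering via CompatSet (presumably to reproduce the historic Python set iteration order when replaying old txns). A tiny worked example of the new path in RevokedStrategy, with made-up indices:

```python
indices = [3, 1, 7]          # previously revoked (from state)
issued_from_txn = [1]        # re-issued in this txn
revoked_from_txn = [5, 2]    # newly revoked in this txn

result_indicies = set(indices).difference(issued_from_txn)
result_indicies.update(revoked_from_txn)
result_indicies = sorted(result_indicies)

assert result_indicies == [2, 3, 5, 7]   # deterministic, sorted REVOKED list
```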
* [X] diff --git a/indy_node/server/upgrader.py b/indy_node/server/upgrader.py
History shows...
* commit f9d7b639504892755323c30c04278533a568bfd4
| Author: Lovesh <lovesh.harchandani@evernym.com>
| Date: Sat Feb 24 01:53:13 2018 +0300
|
| fix KeyError for "services" when pool ledger contains node update txns
|
| Signed-off-by: Lovesh <lovesh.harchandani@evernym.com>
|
| (cherry picked from commit 83837b4)
| Signed-off-by: ashcherbakov <alexander.sherbakov@dsr-company.com>
|
| diff --git a/indy_node/server/upgrader.py b/indy_node/server/upgrader.py
| index 4189423e..b67413ff 100644
| --- a/indy_node/server/upgrader.py
| +++ b/indy_node/server/upgrader.py
| @@ -258,8 +258,8 @@ class Upgrader(HasActionQueue):
| Validates schedule of planned node upgrades
|
| :param schedule: dictionary of node ids and upgrade times
| - :param nodeSrvs: dictionary of node ids and services
| - :return: whether schedule valid
| + :param node_srvs: dictionary of node ids and services
| + :return: a 2-tuple of whether schedule valid or not and the reason
| """
|
| # flag "force=True" ignore basic checks! only datetime format is
|
The changes appear to have made it in...
index 74985159..16098a9e 100644
--- a/indy_node/server/upgrader.py
+++ b/indy_node/server/upgrader.py
@@ -170,8 +170,8 @@ class Upgrader(NodeMaintainer):
# searching for CANCEL for this upgrade submitted after START txn
last_pool_upgrade_txn_cancel = self.get_upgrade_txn(
lambda txn:
- get_type(txn) == POOL_UPGRADE and get_payload_data(txn)[ACTION] == CANCEL and
- get_payload_data(txn)[VERSION] == get_payload_data(last_pool_upgrade_txn_start)[VERSION],
+ get_type(txn) == POOL_UPGRADE and get_payload_data(txn)[ACTION] == CANCEL and get_payload_data(txn)
+ [VERSION] == get_payload_data(last_pool_upgrade_txn_start)[VERSION],
start_no=last_pool_upgrade_txn_seq_no + 1)
if last_pool_upgrade_txn_cancel:
logger.info('{} found upgrade CANCEL txn {}'.format(
@@ -227,8 +227,7 @@ class Upgrader(NodeMaintainer):
.format(pkg_name))
# TODO weak check
- if (APP_NAME not in pkg_name and
- all([APP_NAME not in d for d in cur_deps])):
+ if APP_NAME not in pkg_name and all([APP_NAME not in d for d in cur_deps]):
return "Package {} doesn't belong to pool".format(pkg_name)
# compare whether it makes sense to try (target >= current, = for reinstall)
@@ -288,8 +287,7 @@ class Upgrader(NodeMaintainer):
last_event = self.lastActionEventInfo
if last_event:
- if (last_event.data.upgrade_id == upgrade_id and
- last_event.ev_type in FINALIZING_EVENT_TYPES):
+ if last_event.data.upgrade_id == upgrade_id and last_event.ev_type in FINALIZING_EVENT_TYPES:
logger.info(
"Node '{}' has already performed an upgrade with upgrade_id {}. "
"Last recorded event is {}"
@@ -322,8 +320,7 @@ class Upgrader(NodeMaintainer):
return
if action == CANCEL:
- if (self.scheduledAction and
- self.scheduledAction.version == version):
+ if self.scheduledAction and self.scheduledAction.version == version:
self._cancelScheduledUpgrade(justification)
logger.info("Node '{}' cancels upgrade to {}".format(
self.nodeName, version))
@@ -448,8 +445,7 @@ class Upgrader(NodeMaintainer):
.format(ev_data.when, ev_data.version))
last = self._actionLog.last_event
# TODO test this
- if (last and last.ev_type == UpgradeLog.Events.failed and
- last.data == ev_data):
+ if last and last.ev_type == UpgradeLog.Events.failed and last.data == ev_data:
return None
self._action_failed(ev_data, reason="exceeded upgrade timeout")
* [X] diff --git a/indy_node/test/api/helper.py b/indy_node/test/api/helper.py
History only shows merges
index e5aa7d33..9d23df19 100644
--- a/indy_node/test/api/helper.py
+++ b/indy_node/test/api/helper.py
@@ -1,18 +1,28 @@
import json
-import base58
+import random
+import base58
from indy.anoncreds import issuer_create_schema
from indy.ledger import build_schema_request
-from indy_common.state.state_constants import MARKER_CONTEXT
-from plenum.common.constants import TXN_TYPE, DATA, CURRENT_PROTOCOL_VERSION
+from indy_common.constants import RS_ID, RS_TYPE, RS_NAME, RS_VERSION, RS_CONTENT
+from plenum.common.constants import TXN_TYPE
+from plenum.test.helper import sdk_get_reply, sdk_sign_and_submit_req, sdk_get_and_check_replies, sdk_gen_request
+
+
+def req_id():
+ id = random.randint(1, 100000000)
+ while True:
+ yield id
+ id += 1
+
-from indy_common.constants import SET_CONTEXT, CONTEXT_TYPE, META, RS_TYPE, CONTEXT_NAME, CONTEXT_VERSION
-from plenum.test.helper import sdk_get_reply, sdk_sign_and_submit_req, sdk_get_and_check_replies
+_reqId = req_id()
# Utility predicates
+
def is_one_of(*args):
def check(v):
return v in args
@@ -180,6 +190,17 @@ def validate_claim_def_txn(txn):
optional(data, 'tag', is_str)
+def validate_rich_schema_txn(txn, txn_type):
+ require(txn, 'type', is_one_of(txn_type))
+
+ data = txn['data']
+ require(data, 'id', is_str)
+ require(data, 'rsName', is_str)
+ require(data, 'rsType', is_str)
+ require(data, 'rsVersion', is_str)
+ require(data, 'content', is_str)
+
+
# Misc utility
@@ -198,6 +219,48 @@ def sdk_build_schema_request(looper, sdk_wallet_client,
)
+def build_get_rs_schema_request(did, txnId):
+ identifier, type, name, version = txnId.split(':')
+ # _id = identifier + ':' + type + ':' + name + ':' + version
+ txn_dict = {
+ 'operation': {
+ 'type': "301",
+ 'from': identifier,
+ 'meta': {
+ 'name': name,
+ 'version': version,
+ 'type': 'sch' # type
+ }
+ },
+ "identifier": did,
+ "reqId": next(_reqId),
+ "protocolVersion": 2
+ }
+ schema_json = json.dumps(txn_dict)
+ return schema_json
+
+
+def build_rs_schema_request(identifier, schema={}, name="", version=""):
+ txn_dict = {
+ 'operation': {
+ 'type': "201",
+ 'meta': {
+ 'name': name,
+ 'version': version,
+ 'type': "sch"
+ },
+ 'data': {
+ 'schema': schema
+ }
+ },
+ "identifier": identifier,
+ "reqId": next(_reqId),
+ "protocolVersion": 2
+ }
+ schema_json = json.dumps(txn_dict)
+ return schema_json
+
+
def sdk_write_schema(looper, sdk_pool_handle, sdk_wallet_client, multi_attribute=[], name="", version=""):
_, identifier = sdk_wallet_client
if multi_attribute:
@@ -208,7 +271,7 @@ def sdk_write_schema(looper, sdk_pool_handle, sdk_wallet_client, multi_attribute
issuer_create_schema(identifier, "name", "1.0", json.dumps(["first", "last"])))
request = looper.loop.run_until_complete(build_schema_request(identifier, schema_json))
return schema_json, \
- sdk_get_reply(looper, sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_client, request))[1]
+ sdk_get_reply(looper, sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_client, request))[1]
def sdk_write_schema_and_check(looper, sdk_pool_handle, sdk_wallet_client,
@@ -220,49 +283,42 @@ def sdk_write_schema_and_check(looper, sdk_pool_handle, sdk_wallet_client,
return rep
-def sdk_write_context(looper, sdk_pool_handle, sdk_wallet_steward, context=[], name="", version=""):
- _wh, did = sdk_wallet_steward
- # create json
- raw_json = {
- 'operation': {
- TXN_TYPE: SET_CONTEXT,
- META: {
- CONTEXT_NAME: name,
- CONTEXT_VERSION: version,
- RS_TYPE: CONTEXT_TYPE
- },
- DATA: context
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": CURRENT_PROTOCOL_VERSION,
- }
- set_context_txn_json = json.dumps(raw_json)
+# Rich Schema
- return json.dumps({'id': did + ':' + MARKER_CONTEXT + ':' + name + ':' + version}), \
- sdk_get_reply(looper, sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_steward, set_context_txn_json))[1]
+def sdk_build_rich_schema_request(looper, sdk_wallet_client,
+ txn_type, rs_id, rs_name, rs_version, rs_type, rs_content):
+ # TODO: replace by real SDK call
+ _, identifier = sdk_wallet_client
+ op = {
+ TXN_TYPE: txn_type,
+ RS_ID: rs_id,
+ RS_NAME: rs_name,
+ RS_TYPE: rs_type,
+ RS_VERSION: rs_version,
+ RS_CONTENT: rs_content
+ }
+ req_obj = sdk_gen_request(op, identifier=sdk_wallet_client[1])
+ return json.dumps(req_obj.as_dict)
-def sdk_write_context_and_check(looper, sdk_pool_handle, sdk_wallet_steward,
- context=[], name="", version="", reqId=12345678):
- _wh, did = sdk_wallet_steward
- # create json
- raw_json = {
- 'operation': {
- TXN_TYPE: SET_CONTEXT,
- META: {
- CONTEXT_NAME: name,
- CONTEXT_VERSION: version,
- RS_TYPE: CONTEXT_TYPE
- },
- DATA: context
- },
- "identifier": did,
- "reqId": reqId,
- "protocolVersion": CURRENT_PROTOCOL_VERSION,
- }
- set_context_txn_json = json.dumps(raw_json)
- req = sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_steward, set_context_txn_json)
+def sdk_write_rich_schema_object_and_check(looper, sdk_wallet_client, sdk_pool_handle,
+ txn_type, rs_id, rs_name, rs_version, rs_type, rs_content):
+ request = sdk_build_rich_schema_request(looper, sdk_wallet_client,
+ txn_type, rs_id=rs_id, rs_name=rs_name,
+ rs_version=rs_version, rs_type=rs_type,
+ rs_content=json.dumps(rs_content))
+ req = sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_client, request)
rep = sdk_get_and_check_replies(looper, [req])
return rep
+
+
+def sdk_write_rich_schema_object(looper, sdk_wallet_client, sdk_pool_handle,
+ txn_type, rs_id, rs_name, rs_version, rs_type, rs_content):
+ request = sdk_build_rich_schema_request(looper, sdk_wallet_client,
+ txn_type, rs_id=rs_id, rs_name=rs_name,
+ rs_version=rs_version, rs_type=rs_type,
+ rs_content=json.dumps(rs_content))
+
+ return sdk_get_reply(looper,
+ sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_client, req))[1]
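
Quick usage sketch of the new build_get_rs_schema_request() helper above: the txnId packs identifier:type:name:version and gets unpacked into a type-301 GET_RS_SCHEMA query. The DID and schema name below are made up:

```python
from indy_node.test.api.helper import build_get_rs_schema_request

schema_json = build_get_rs_schema_request(
    did="V4SGRU86Z58d6TV7PBUe6f",                                   # submitter DID (made up)
    txnId="L5AD5g65TDQr1PPHHRoiGf:sch:ISO18013_DriverLicense:1.0",  # identifier:type:name:version
)
# schema_json is the JSON request body to sign and submit,
# e.g. via sdk_sign_and_send_prepared_request(...)
```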
* [X] diff --git a/indy_node/test/api/test_rich_schema_objects_reply.py b/indy_node/test/api/test_rich_schema_objects_reply.py
New file, no worries
new file mode 100644
index 00000000..b16266d9
--- /dev/null
+++ b/indy_node/test/api/test_rich_schema_objects_reply.py
@@ -0,0 +1,43 @@
+import json
+
+import pytest
+
+from indy_common.constants import JSON_LD_CONTEXT, RS_CONTEXT_TYPE_VALUE, RICH_SCHEMA, RICH_SCHEMA_ENCODING, \
+ RICH_SCHEMA_MAPPING, RICH_SCHEMA_CRED_DEF, RS_CRED_DEF_TYPE_VALUE, RS_MAPPING_TYPE_VALUE, \
+ RS_ENCODING_TYPE_VALUE, RS_SCHEMA_TYPE_VALUE, RICH_SCHEMA_PRES_DEF, RS_PRES_DEF_TYPE_VALUE
+from indy_node.test.api.helper import validate_write_reply, validate_rich_schema_txn, sdk_build_rich_schema_request
+from indy_node.test.helper import rich_schemas_enabled_scope
+from indy_node.test.rich_schema.templates import RICH_SCHEMA_EX1, W3C_BASE_CONTEXT, RICH_SCHEMA_ENCODING_EX1, \
+ RICH_SCHEMA_MAPPING_EX1, RICH_SCHEMA_CRED_DEF_EX1, RICH_SCHEMA_PRES_DEF_EX1
+from plenum.common.util import randomString
+from plenum.test.helper import sdk_get_reply, sdk_sign_and_submit_req
+
+
+@pytest.fixture(scope="module")
+def tconf(tconf):
+ with rich_schemas_enabled_scope(tconf):
+ yield tconf
+
+
+# The order of creation is essential as some rich schema object reference others by ID
+# Encoding's id must be equal to the one used in RICH_SCHEMA_MAPPING_EX1
+
+@pytest.mark.parametrize('txn_type, rs_type, content, rs_id',
+ [(JSON_LD_CONTEXT, RS_CONTEXT_TYPE_VALUE, W3C_BASE_CONTEXT, randomString()),
+ (RICH_SCHEMA, RS_SCHEMA_TYPE_VALUE, RICH_SCHEMA_EX1, RICH_SCHEMA_EX1['@id']),
+ (RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE, RICH_SCHEMA_ENCODING_EX1,
+ "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD"),
+ (RICH_SCHEMA_MAPPING, RS_MAPPING_TYPE_VALUE, RICH_SCHEMA_MAPPING_EX1,
+ RICH_SCHEMA_MAPPING_EX1['@id']),
+ (RICH_SCHEMA_CRED_DEF, RS_CRED_DEF_TYPE_VALUE, RICH_SCHEMA_CRED_DEF_EX1, randomString()),
+ (RICH_SCHEMA_PRES_DEF, RS_PRES_DEF_TYPE_VALUE, RICH_SCHEMA_PRES_DEF_EX1,
+ RICH_SCHEMA_PRES_DEF_EX1['@id'])])
+def test_rich_schema_object_reply_is_valid(looper, sdk_pool_handle, sdk_wallet_steward,
+ txn_type, rs_type, content, rs_id):
+ request = sdk_build_rich_schema_request(looper, sdk_wallet_steward,
+ txn_type=txn_type, rs_id=rs_id, rs_name=randomString(),
+ rs_version='1.0', rs_type=rs_type,
+ rs_content=json.dumps(content))
+ reply = sdk_get_reply(looper, sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_steward, request))[1]
+ validate_write_reply(reply)
+ validate_rich_schema_txn(reply['result']['txn'], txn_type)
* [X] diff --git a/indy_node/test/attrib_txn/test_send_get_attr.py b/indy_node/test/attrib_txn/test_send_get_attr.py
Only merges
index c6abd2c8..6764189c 100644
--- a/indy_node/test/attrib_txn/test_send_get_attr.py
+++ b/indy_node/test/attrib_txn/test_send_get_attr.py
@@ -1,17 +1,22 @@
-import pytest
+from hashlib import sha256
import json
+from random import randint
+import time
+from typing import Optional
from indy.ledger import build_get_attrib_request
from libnacl.secret import SecretBox
-from hashlib import sha256
-
from plenum.common.exceptions import RequestNackedException
-
from plenum.test.helper import sdk_get_and_check_replies
-
-from indy_node.test.helper import createUuidIdentifier, sdk_add_attribute_and_check, \
- sdk_get_attribute_and_check, modify_field
from plenum.test.pool_transactions.helper import sdk_sign_and_send_prepared_request
+import pytest
+
+from indy_node.test.helper import (
+ createUuidIdentifier,
+ modify_field,
+ sdk_add_attribute_and_check,
+ sdk_get_attribute_and_check,
+)
attrib_name = 'dateOfBirth'
@@ -34,6 +39,19 @@ def send_raw_attrib(looper, sdk_pool_handle, sdk_wallet_trustee):
return rep
+@pytest.fixture
+def send_raw_attrib_factory(looper, sdk_pool_handle, sdk_wallet_trustee):
+
+ def _factory(attrib: dict):
+ rep = sdk_add_attribute_and_check(
+ looper, sdk_pool_handle, sdk_wallet_trustee,
+ json.dumps(attrib)
+ )
+
+ return rep
+ return _factory
+
+
@pytest.fixture(scope="module")
def send_enc_attrib(looper, sdk_pool_handle, sdk_wallet_trustee):
rep = sdk_add_attribute_and_check(looper, sdk_pool_handle, sdk_wallet_trustee, None,
@@ -124,3 +142,67 @@ def test_send_get_attr_hash_succeeds_for_existing_uuid_dest(
request_couple = sdk_sign_and_send_prepared_request(looper, sdk_wallet_trustee,
sdk_pool_handle, req)
sdk_get_and_check_replies(looper, [request_couple])
+
+
+def test_get_attr_by_timestamp(
+ looper, sdk_pool_handle, sdk_wallet_trustee, send_raw_attrib_factory
+):
+ _, did = sdk_wallet_trustee
+
+ # Setup
+ initial = send_raw_attrib_factory({"attrib": 1})
+ time.sleep(3)
+ final = send_raw_attrib_factory({"attrib": 2})
+
+ timestamp = initial[0][1]["result"]["txnMetadata"]["txnTime"]
+ update_timestamp = final[0][1]["result"]["txnMetadata"]["txnTime"]
+
+ def _get_attrib(timestamp: Optional[int] = None):
+ raw_req = looper.loop.run_until_complete(
+ build_get_attrib_request(did, did, "attrib", None, None))
+
+ req = json.loads(raw_req)
+ if timestamp:
+ req["operation"]["timestamp"] = timestamp
+
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_trustee,
+ sdk_pool_handle, json.dumps(req)
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+ return json.loads(replies[0][1]["result"]["data"])["attrib"]
+
+ assert _get_attrib() == 2
+ assert _get_attrib(timestamp=timestamp) == 1
+ assert _get_attrib(randint(timestamp + 1, update_timestamp - 1)) == 1
+
+
+def test_get_attr_by_seq_no(
+ looper, sdk_pool_handle, sdk_wallet_trustee, send_raw_attrib_factory
+):
+ _, did = sdk_wallet_trustee
+
+ # Setup
+ initial = send_raw_attrib_factory({"attrib": 1})
+ time.sleep(3)
+ send_raw_attrib_factory({"attrib": 2})
+
+ seq_no = initial[0][1]["result"]["txnMetadata"]["seqNo"]
+
+ def _get_attrib(seq_no: Optional[int] = None):
+ raw_req = looper.loop.run_until_complete(
+ build_get_attrib_request(did, did, "attrib", None, None))
+
+ req = json.loads(raw_req)
+ if seq_no:
+ req["operation"]["seqNo"] = seq_no
+
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_trustee,
+ sdk_pool_handle, json.dumps(req)
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+ return json.loads(replies[0][1]["result"]["data"])["attrib"]
+
+ assert _get_attrib() == 2
+ assert _get_attrib(seq_no=seq_no) == 1
* [X] diff --git a/indy_node/test/auth_rule/auth_framework/edit_frozen_ledgers.py b/indy_node/test/auth_rule/auth_framework/edit_frozen_ledgers.py
New file
new file mode 100644
index 00000000..8dd0d380
--- /dev/null
+++ b/indy_node/test/auth_rule/auth_framework/edit_frozen_ledgers.py
@@ -0,0 +1,79 @@
+import pytest
+from plenum.common.constants import STEWARD, TRUSTEE_STRING, LEDGERS_FREEZE
+
+from indy_node.server.request_handlers.action_req_handlers.pool_restart_handler import PoolRestartHandler
+
+from indy_common.authorize.auth_actions import EDIT_PREFIX
+from indy_common.authorize.auth_constraints import AuthConstraint
+from indy_node.test.auth_rule.auth_framework.basic import AuthTest
+from plenum.common.exceptions import RequestRejectedException
+from plenum.test.freeze_ledgers.helper import sdk_send_freeze_ledgers
+from plenum.test.pool_transactions.helper import sdk_add_new_nym
+
+from indy_node.test.helper import build_auth_rule_request_json, sdk_send_and_check_req_json
+
+
+class EditFrozenLedgersTest(AuthTest):
+ def __init__(self, env, action_id):
+ super().__init__(env, action_id)
+ self.trustee_wallets = [self.trustee_wallet]
+
+ def prepare(self):
+ for i in range(3):
+ wallet = sdk_add_new_nym(self.looper,
+ self.sdk_pool_handle,
+ self.trustee_wallet,
+ alias='trustee{}'.format(i),
+ role=TRUSTEE_STRING)
+ self.trustee_wallets.append(wallet)
+ self.default_auth_rule = self.get_default_auth_rule()
+ self.changed_auth_rule = self.get_changed_auth_rule()
+ for n in self.env.txnPoolNodeSet:
+ for h in n.action_manager.request_handlers.values():
+ if isinstance(h, PoolRestartHandler):
+ h.restarter.handleRestartRequest = lambda *args, **kwargs: True
+
+ def run(self):
+ frozen_ledgers_ids = []
+
+ # Step 1. Check default auth rule
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, self.trustee_wallets, frozen_ledgers_ids)
+ with pytest.raises(RequestRejectedException):
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, [self.new_default_wallet], frozen_ledgers_ids)
+
+ # Step 2. Change auth rule
+ self.send_and_check(self.changed_auth_rule, wallet=self.trustee_wallet)
+
+ # Step 3. Check, that we cannot send txn the old way
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, [self.new_default_wallet], frozen_ledgers_ids)
+ with pytest.raises(RequestRejectedException):
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, self.trustee_wallets, frozen_ledgers_ids)
+
+ # Step 4. Check, that we can send restart action in changed way
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, [self.new_default_wallet], frozen_ledgers_ids)
+
+ # Step 5. Return default auth rule
+ self.send_and_check(self.default_auth_rule, self.trustee_wallet)
+
+ # Step 6. Check, that default auth rule works
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, self.trustee_wallets, frozen_ledgers_ids)
+ with pytest.raises(RequestRejectedException):
+ sdk_send_freeze_ledgers(self.looper, self.sdk_pool_handle, [self.new_default_wallet], frozen_ledgers_ids)
+
+ def result(self):
+ pass
+
+ def get_changed_auth_rule(self):
+ self.new_default_wallet = sdk_add_new_nym(self.looper, self.sdk_pool_handle, self.trustee_wallet, role=STEWARD)
+ constraint = AuthConstraint(role=STEWARD,
+ sig_count=1,
+ need_to_be_owner=False)
+ return build_auth_rule_request_json(
+ self.looper, self.trustee_wallet[1],
+ auth_action=EDIT_PREFIX,
+ auth_type=LEDGERS_FREEZE,
+ field='*',
+ old_value='*',
+ new_value='*',
+ constraint=constraint.as_dict
+ )
* [x] diff --git a/indy_node/test/auth_rule/auth_framework/test_auth_rule_using.py b/indy_node/test/auth_rule/auth_framework/test_auth_rule_using.py
Only merges
index 2f734146..8744fca1 100644
--- a/indy_node/test/auth_rule/auth_framework/test_auth_rule_using.py
+++ b/indy_node/test/auth_rule/auth_framework/test_auth_rule_using.py
@@ -4,6 +4,7 @@ from datetime import datetime, timedelta
from collections import OrderedDict
from indy_node.test.auth_rule.auth_framework.disable_taa import TAADisableTest
+from indy_node.test.auth_rule.auth_framework.edit_frozen_ledgers import EditFrozenLedgersTest
from plenum.common.constants import STEWARD, TRUSTEE, IDENTITY_OWNER
from indy_common.constants import (
@@ -123,6 +124,7 @@ class TestAuthRuleUsing():
auth_map.change_client_port.get_action_id(): EditNodeClientPortTest,
auth_map.change_bls_key.get_action_id(): EditNodeBlsTest,
auth_map.disable_txn_author_agreement.get_action_id(): TAADisableTest,
+ auth_map.edit_frozen_ledgers.get_action_id(): EditFrozenLedgersTest,
})
# TODO a workaround until sdk aceepts empty TAA to make possible its deactivation
* [ ] diff --git a/indy_node/test/auth_rule/test_auth_rule_transaction.py b/indy_node/test/auth_rule/test_auth_rule_transaction.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/test/auth_rule/test_auth_rule_transaction.py
* [ ] TODO need to consider this file
index 1e6302a9..b4102170 100644
--- a/indy_node/test/auth_rule/test_auth_rule_transaction.py
+++ b/indy_node/test/auth_rule/test_auth_rule_transaction.py
@@ -63,6 +63,7 @@ def test_reject_with_unacceptable_role_in_constraint(looper,
sdk_wallet_trustee,
constraint=constraint)
e.match('InvalidClientRequest')
+ e.match('client request invalid')
e.match('Role {} is not acceptable'.format(unacceptable_role))
@@ -84,7 +85,8 @@ def test_reqnack_auth_rule_transaction_with_wrong_key(looper,
sdk_pool_handle,
sdk_wallet_trustee,
auth_type="*")
- e.match("InvalidClientRequest")
+ e.match('InvalidClientRequest')
+ e.match("client request invalid")
e.match("is not found in authorization map")
@@ -100,7 +102,8 @@ def test_reqnack_auth_rule_edit_transaction_with_wrong_format(looper,
sdk_send_and_check_req_json(
looper, sdk_pool_handle, sdk_wallet_trustee, req_json,
)
- e.match("InvalidClientRequest")
+ e.match('InvalidClientRequest')
+ e.match("client request invalid")
e.match("Transaction for change authentication "
"rule for {}={} must contain field {}".
format(AUTH_ACTION, EDIT_PREFIX, OLD_VALUE))
@@ -116,7 +119,8 @@ def test_reqnack_auth_rule_add_transaction_with_wrong_format(looper,
sdk_wallet_trustee,
**generate_key(old_value="*")
)
- e.match("InvalidClientRequest")
+ e.match('InvalidClientRequest')
+ e.match("client request invalid")
e.match("Transaction for change authentication "
"rule for {}={} must not contain field {}".
format(AUTH_ACTION, ADD_PREFIX, OLD_VALUE))
* [X] diff --git a/indy_node/test/auth_rule/test_auth_rules_transaction.py b/indy_node/test/auth_rule/test_auth_rules_transaction.py
Merges only
index f391d1ce..937d8811 100644
--- a/indy_node/test/auth_rule/test_auth_rules_transaction.py
+++ b/indy_node/test/auth_rule/test_auth_rules_transaction.py
@@ -83,7 +83,7 @@ def test_reject_with_empty_rules_list(looper,
sdk_wallet_trustee,
sdk_pool_handle):
with pytest.raises(RequestNackedException,
- match="InvalidClientRequest.*length should be at least 1"):
+ match="client request invalid.*InvalidClientRequest.*length should be at least 1"):
sdk_send_and_check_auth_rules_request_invalid(looper,
sdk_pool_handle,
sdk_wallet_trustee,
@@ -102,6 +102,7 @@ def test_reject_with_unacceptable_role_in_constraint(looper,
sdk_wallet_trustee,
rules=[rule])
e.match('InvalidClientRequest')
+ e.match('client request invalid')
e.match('Role {} is not acceptable'.format(unacceptable_role))
@@ -123,7 +124,8 @@ def test_reqnack_auth_rules_transaction_with_wrong_key(looper,
sdk_pool_handle,
sdk_wallet_trustee,
[generate_auth_rule(auth_type="*")])
- e.match("InvalidClientRequest")
+ e.match('InvalidClientRequest')
+ e.match("client request invalid")
e.match("is not found in authorization map")
@@ -137,7 +139,8 @@ def test_reqnack_auth_rules_edit_transaction_with_wrong_format(looper,
sdk_pool_handle,
sdk_wallet_trustee,
rules=[rule])
- e.match("InvalidClientRequest")
+ e.match('InvalidClientRequest')
+ e.match("client request invalid")
e.match("Transaction for change authentication "
"rule for {}={} must contain field {}".
format(AUTH_ACTION, EDIT_PREFIX, OLD_VALUE))
@@ -151,7 +154,8 @@ def test_reqnack_auth_rules_add_transaction_with_wrong_format(looper,
sdk_pool_handle,
sdk_wallet_trustee,
[generate_auth_rule(old_value="*")])
- e.match("InvalidClientRequest")
+ e.match('InvalidClientRequest')
+ e.match("client request invalid")
e.match("Transaction for change authentication "
"rule for {}={} must not contain field {}".
format(AUTH_ACTION, ADD_PREFIX, OLD_VALUE))
* [X] diff --git a/indy_node/test/auth_rule/test_auth_txn_with_deprecated_key.py b/indy_node/test/auth_rule/test_auth_txn_with_deprecated_key.py
New file
new file mode 100644
index 00000000..d2c270b2
--- /dev/null
+++ b/indy_node/test/auth_rule/test_auth_txn_with_deprecated_key.py
@@ -0,0 +1,150 @@
+import shutil
+from contextlib import contextmanager
+
+import pytest
+
+from indy_common.config_helper import NodeConfigHelper
+from indy_node.test.helper import TestNode
+from plenum.test.node_catchup.helper import ensure_all_nodes_have_same_data
+from plenum.test.test_node import ensureElectionsDone, ensure_node_disconnected, checkNodesConnected
+from indy_node.test.auth_rule.helper import sdk_send_and_check_auth_rule_request, sdk_send_and_check_get_auth_rule_request
+from indy_common.authorize.auth_actions import ADD_PREFIX, AuthActionAdd
+from indy_common.authorize.auth_constraints import AuthConstraint, ROLE
+from indy_common.constants import CONSTRAINT, AUTH_TYPE, CONFIG_LEDGER_ID, NYM
+from indy_common.authorize.auth_map import one_trustee_constraint
+from plenum.common.constants import STEWARD, DATA
+from plenum.common.exceptions import RequestNackedException
+
+
+@contextmanager
+def extend_auth_map(nodes, key, constraint):
+ """
+ Context manager to add a new auth rule to the auth map and remove it on exit.
+
+ :param nodes: nodes list which auth maps should be changed
+ :param key: str gotten from AuthActionAdd(...).get_action_id()
+ :param constraint: AuthConstraint
+ """
+ for node in nodes:
+ node.write_req_validator.auth_map[key] = constraint
+ yield
+ for node in nodes:
+ node.write_req_validator.auth_map.pop(key, None)
+
+
+def test_auth_txn_with_deprecated_key(tconf, tdir, allPluginsPath,
+ txnPoolNodeSet,
+ looper,
+ sdk_wallet_trustee,
+ sdk_pool_handle):
+ """
+ Add to the auth_map a fake rule
+ Send AUTH_RULE txn to change this fake rule (and set the fake key to the config state)
+ Send GET_AUTH_RULE txn and check that the fake rule was changed
+ Remove the fake auth rule from the map
+ Check that we can't get the fake auth rule
+ Restart the last node with its state regeneration
+ Check that nodes data is equal after changing the existing auth rule (restarted node regenerate config state)
+ """
+
+ fake_txn_type = "100002"
+ fake_key = AuthActionAdd(txn_type=fake_txn_type,
+ field="*",
+ value="*").get_action_id()
+ fake_constraint = one_trustee_constraint
+ new_auth_constraint = AuthConstraint(role=STEWARD, sig_count=1, need_to_be_owner=False).as_dict
+
+ # Add to the auth_map a fake rule
+ with extend_auth_map(txnPoolNodeSet,
+ fake_key,
+ fake_constraint):
+ # Send AUTH_RULE txn to change this fake rule (and set the fake key to the config state)
+ sdk_send_and_check_auth_rule_request(looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ auth_action=ADD_PREFIX,
+ auth_type=fake_txn_type,
+ field='*',
+ new_value='*',
+ constraint=new_auth_constraint)
+ # Send GET_AUTH_RULE txn and check that the fake rule was changed
+ result = sdk_send_and_check_get_auth_rule_request(
+ looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ auth_type=fake_txn_type,
+ auth_action=ADD_PREFIX,
+ field="*",
+ new_value="*"
+ )[0][1]["result"][DATA][0]
+ assert result[AUTH_TYPE] == fake_txn_type
+ assert result[CONSTRAINT] == new_auth_constraint
+
+ # Remove the fake auth rule from the map
+ # Check that we can't get the fake auth rule
+ with pytest.raises(RequestNackedException, match="not found in authorization map"):
+ sdk_send_and_check_auth_rule_request(looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ auth_action=ADD_PREFIX,
+ auth_type=fake_txn_type,
+ field='*',
+ new_value='*',
+ constraint=AuthConstraint(role=STEWARD, sig_count=2,
+ need_to_be_owner=False).as_dict)
+
+ resp = sdk_send_and_check_get_auth_rule_request(looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee)
+
+ assert all(rule[AUTH_TYPE] != fake_txn_type for rule in resp[0][1]["result"][DATA])
+
+ with pytest.raises(RequestNackedException, match="not found in authorization map"):
+ sdk_send_and_check_get_auth_rule_request(
+ looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ auth_type=fake_txn_type,
+ auth_action=ADD_PREFIX,
+ field="*",
+ new_value="*"
+ )
+ # Restart the last node with its state regeneration
+ ensure_all_nodes_have_same_data(looper, txnPoolNodeSet)
+
+ node_to_stop = txnPoolNodeSet[-1]
+ node_state = node_to_stop.states[CONFIG_LEDGER_ID]
+ assert not node_state.isEmpty
+ state_db_path = node_state._kv.db_path
+ node_to_stop.cleanupOnStopping = False
+ node_to_stop.stop()
+ looper.removeProdable(node_to_stop)
+ ensure_node_disconnected(looper, node_to_stop, txnPoolNodeSet[:-1])
+
+ shutil.rmtree(state_db_path)
+
+ config_helper = NodeConfigHelper(node_to_stop.name, tconf, chroot=tdir)
+ restarted_node = TestNode(
+ node_to_stop.name,
+ config_helper=config_helper,
+ config=tconf,
+ pluginPaths=allPluginsPath,
+ ha=node_to_stop.nodestack.ha,
+ cliha=node_to_stop.clientstack.ha)
+ looper.add(restarted_node)
+ txnPoolNodeSet[-1] = restarted_node
+
+ # Check that nodes data is equal (restarted node regenerate config state)
+ looper.run(checkNodesConnected(txnPoolNodeSet))
+ ensureElectionsDone(looper, txnPoolNodeSet, customTimeout=30)
+ sdk_send_and_check_auth_rule_request(looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ auth_action=ADD_PREFIX,
+ auth_type=NYM,
+ field=ROLE,
+ new_value=STEWARD,
+ constraint=AuthConstraint(role=STEWARD, sig_count=2,
+ need_to_be_owner=False).as_dict)
+ ensure_all_nodes_have_same_data(looper, txnPoolNodeSet, custom_timeout=20)
+
* [X] diff --git a/indy_node/test/auth_rule/test_catching_up_auth_rule_txn.py b/indy_node/test/auth_rule/test_catching_up_auth_rule_txn.py
Merges only
index 0b2e09f7..45da913c 100644
--- a/indy_node/test/auth_rule/test_catching_up_auth_rule_txn.py
+++ b/indy_node/test/auth_rule/test_catching_up_auth_rule_txn.py
@@ -43,12 +43,16 @@ def test_catching_up_auth_rule_txn(looper,
auth_type=action.txn_type, field=action.field,
new_value=action.value, old_value=None,
constraint=changed_constraint.as_dict)
+ sdk_add_new_nym(looper,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ 'newSteward2')
delayed_node.start_catchup()
looper.run(eventually(lambda: assertExp(delayed_node.mode == Mode.participating)))
sdk_add_new_nym(looper,
sdk_pool_handle,
sdk_wallet_steward,
- 'newSteward2',
+ 'newSteward3',
STEWARD_STRING,
dest=new_steward_did, verkey=new_steward_verkey)
ensure_all_nodes_have_same_data(looper, txnPoolNodeSet)
* [ ] diff --git a/indy_node/test/conftest.py b/indy_node/test/conftest.py
* [ ] TODO need to consider further
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/test/conftest.py
index bcdc9008..adb1df18 100644
--- a/indy_node/test/conftest.py
+++ b/indy_node/test/conftest.py
@@ -1,3 +1,4 @@
+import json
import logging
import time
import warnings
@@ -22,7 +23,7 @@ from plenum.server.request_managers.action_request_manager import ActionRequestM
from plenum.server.request_managers.read_request_manager import ReadRequestManager
from plenum.server.request_managers.write_request_manager import WriteRequestManager
from plenum.test.pool_transactions.helper import sdk_add_new_nym, sdk_pool_refresh, prepare_new_node_data, \
- create_and_start_new_node, prepare_node_request, sdk_sign_and_send_prepared_request
+ create_and_start_new_node, prepare_node_request, sdk_sign_and_send_prepared_request, prepare_nym_request
from plenum.test.testing_utils import FakeSomething
from state.pruning_state import PruningState
from storage.kv_in_memory import KeyValueStorageInMemory
@@ -59,6 +60,7 @@ from indy_common.test.conftest import general_conf_tdir, tconf as _tconf, poolTx
domainTxnOrderedFields, looper, setTestLogLevel, node_config_helper_class, config_helper_class
from indy_node.test.helper import TestNode, TestNodeBootstrap
+from indy_node.test.mock import build_nym_request
from indy_node.server.upgrader import Upgrader
from indy_node.utils.node_control_utils import NodeControlUtil
@@ -93,11 +95,31 @@ def warnfilters():
message="The 'warn' method is deprecated")
warnings.filterwarnings(
'ignore', category=ResourceWarning, message='unclosed transport')
+ # These warnings occur when zmq sockets are still open when being garbage collected.
+ # The sockets are then automatically closed.
+ warnings.filterwarnings(
+ 'ignore',
+ category=ResourceWarning,
+ message='unclosed.*socket.*\<zmq\.Socket')
return _
-@pytest.fixture(scope='module')
+@pytest.fixture(scope='module', name="sdk_node_theta_added")
+def sdk_node_theta_added_fixture(looper,
+ txnPoolNodeSet,
+ tdir,
+ tconf,
+ sdk_pool_handle,
+ sdk_wallet_trustee,
+ allPluginsPath,
+ node_config_helper_class,
+ testNodeClass,
+ name=None,
+ services=[VALIDATOR]):
+ return sdk_node_theta_added(looper, txnPoolNodeSet, tdir, tconf, sdk_pool_handle, sdk_wallet_trustee, allPluginsPath, node_config_helper_class, testNodeClass, name)
+
+
def sdk_node_theta_added(looper,
txnPoolNodeSet,
tdir,
@@ -156,6 +178,35 @@ def sdk_node_theta_added(looper,
return new_steward_wallet, new_node
+@pytest.fixture(scope="module")
+def sdk_wallet_endorser_factory(looper, sdk_pool_handle, sdk_wallet_trustee):
+ def _sdk_wallet_endorser_factory(diddoc_content=None, version=None):
+ seed = randomString(32)
+ alias = randomString(5)
+
+ raw_nym_request, did = looper.loop.run_until_complete(
+ prepare_nym_request(sdk_wallet_trustee, seed, alias, role='ENDORSER')
+ )
+ nym_request = json.loads(raw_nym_request)
+ if diddoc_content:
+ nym_request["operation"]["diddocContent"] = (
+ json.dumps(diddoc_content)
+ if isinstance(diddoc_content, dict)
+ else diddoc_content
+ )
+ if version:
+ nym_request["operation"]["version"] = version
+
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_trustee, sdk_pool_handle, json.dumps(nym_request)
+ )
+ sdk_get_and_check_replies(looper, [request_couple])
+
+ return sdk_wallet_trustee[0], did
+
+ return _sdk_wallet_endorser_factory
+
+
@pytest.fixture(scope="module")
def sdk_wallet_endorser(looper, sdk_pool_handle, sdk_wallet_trustee):
return sdk_add_new_nym(looper, sdk_pool_handle, sdk_wallet_trustee,
@@ -244,21 +295,21 @@ def nodeIds(nodeSet):
@pytest.fixture(scope="module")
def pool_ledger(tconf, tmpdir_factory):
- tdir = tmpdir_factory.mktemp('').strpath
+ tdir = tmpdir_factory.mktemp('tmp').strpath
return Ledger(CompactMerkleTree(),
dataDir=tdir)
@pytest.fixture(scope="module")
def domain_ledger(tconf, tmpdir_factory):
- tdir = tmpdir_factory.mktemp('').strpath
+ tdir = tmpdir_factory.mktemp('tmp').strpath
return Ledger(CompactMerkleTree(),
dataDir=tdir)
@pytest.fixture(scope="module")
def config_ledger(tconf, tmpdir_factory):
- tdir = tmpdir_factory.mktemp('').strpath
+ tdir = tmpdir_factory.mktemp('tmp').strpath
return Ledger(CompactMerkleTree(),
dataDir=tdir)
* [X] diff --git a/indy_node/test/context/test_send_context.py b/indy_node/test/context/test_send_context.py
File deleted
Only merges
deleted file mode 100644
index 8a69d228..00000000
--- a/indy_node/test/context/test_send_context.py
+++ /dev/null
@@ -1,232 +0,0 @@
-import pytest
-
-from indy_common.config import CONTEXT_SIZE_LIMIT
-from plenum.common.constants import DATA
-
-from indy_common.authorize.auth_constraints import AuthConstraintForbidden
-from indy_common.constants import CONTEXT_NAME, CONTEXT_VERSION, RS_TYPE, CONTEXT_TYPE, META, \
- SET_CONTEXT
-from indy_node.test.context.helper import W3C_BASE_CONTEXT, SCHEMA_ORG_CONTEXT, EXCESSIVELY_BIG_CONTEXT
-from indy_common.types import SetContextMetaField, SetContextDataField, ClientSetContextOperation
-from indy_node.test.api.helper import validate_write_reply, sdk_write_context_and_check
-from plenum.common.exceptions import RequestRejectedException
-
-
-def test_send_context_pass(looper, sdk_pool_handle,
- sdk_wallet_endorser):
- rep = sdk_write_context_and_check(
- looper, sdk_pool_handle,
- sdk_wallet_endorser,
- SCHEMA_ORG_CONTEXT,
- "Base_Context1",
- "1.0"
- )
- meta = rep[0][0]['operation'][META]
- assert meta[CONTEXT_VERSION] == '1.0'
- assert meta[CONTEXT_NAME] == 'Base_Context1'
- assert meta[RS_TYPE] == CONTEXT_TYPE
- data = rep[0][0]['operation'][DATA]
- assert data == SCHEMA_ORG_CONTEXT
-
-
-def test_write_same_context_returns_same_response(looper, sdk_pool_handle, sdk_wallet_endorser):
- rep1 = sdk_write_context_and_check(
- looper, sdk_pool_handle,
- sdk_wallet_endorser,
- W3C_BASE_CONTEXT,
- "Base_Context2",
- "1.0"
- )
- rep2 = sdk_write_context_and_check(
- looper, sdk_pool_handle,
- sdk_wallet_endorser,
- W3C_BASE_CONTEXT,
- "Base_Context2",
- "1.0"
- )
- assert rep1==rep2
-
-
-def test_write_same_context_with_different_reqid_fails(looper, sdk_pool_handle, sdk_wallet_endorser):
- sdk_write_context_and_check(
- looper, sdk_pool_handle,
- sdk_wallet_endorser,
- SCHEMA_ORG_CONTEXT,
- "Base_Context3",
- "1.0",
- 1234
- )
- with pytest.raises(RequestRejectedException,
- match=str(AuthConstraintForbidden())):
- resp = sdk_write_context_and_check(
- looper, sdk_pool_handle,
- sdk_wallet_endorser,
- SCHEMA_ORG_CONTEXT,
- "Base_Context3",
- "1.0",
- 2345
- )
- validate_write_reply(resp)
-
-
-def test_context_over_maximum_size():
- context = SetContextDataField()
- with pytest.raises(TypeError) as ex_info:
- context.validate(EXCESSIVELY_BIG_CONTEXT)
- ex_info.match(
- "size should be at most {}".format(CONTEXT_SIZE_LIMIT)
- )
-
-
-def test_validate_meta_fail_on_empty():
- meta = SetContextMetaField()
- with pytest.raises(TypeError) as e:
- meta.validate({})
- assert "validation error [SetContextMetaField]: missed fields" in str(e.value)
- assert "name" in str(e.value)
- assert "version" in str(e.value)
- assert "type" in str(e.value)
-
-
-def test_validate_meta_fail_no_name():
- meta = SetContextMetaField()
- with pytest.raises(TypeError) as e:
- meta.validate({
- "version": "2.5",
- "type": "ctx"
- })
- assert "validation error [SetContextMetaField]: missed fields" in str(e.value)
- assert "name" in str(e.value)
-
-
-def test_validate_meta_fail_no_version():
- meta = SetContextMetaField()
- with pytest.raises(TypeError) as e:
- meta.validate({
- "name": "New Context",
- "type": "ctx"
- })
- assert "validation error [SetContextMetaField]: missed fields" in str(e.value)
- assert "version" in str(e.value)
-
-
-def test_validate_meta_fail_invalid_version():
- meta = SetContextMetaField()
- with pytest.raises(TypeError) as e:
- meta.validate({
- "name": "New Context",
- "type": "ctx",
- "version": "A"
- })
- assert "validation error [SetContextMetaField]: Invalid version: 'A' (version=A)" in str(e.value)
-
-
-def test_validate_meta_fail_no_type():
- meta = SetContextMetaField()
- with pytest.raises(TypeError) as e:
- meta.validate({
- "name": "New Context",
- "version": "2.5"
- })
- assert "validation error [SetContextMetaField]: missed fields" in str(e.value)
- assert "type" in str(e.value)
-
-
-def test_validate_meta_fail_wrong_type():
- meta = SetContextMetaField()
- with pytest.raises(TypeError) as e:
- meta.validate({
- "name": "New Context",
- "version": "2.5",
- "type": "sch"
- })
- assert "validation error [SetContextMetaField]: has to be equal ctx (type=sch)" in str(e.value)
-
-
-def test_validate_meta_pass():
- meta = SetContextMetaField()
- meta.validate({
- "name": "New Context",
- "version": "5.2",
- "type": "ctx"
- })
-
-
-def test_validate_data_fail_on_empty():
- data = SetContextDataField()
- with pytest.raises(TypeError) as e:
- data.validate({})
- assert "validation error [SetContextDataField]: missed fields" in str(e.value)
- assert "@context" in str(e.value)
-
-
-def test_validate_data_fail_not_dict():
- data = SetContextDataField()
- with pytest.raises(TypeError) as e:
- data.validate("context")
- assert "validation error [SetContextDataField]: invalid type <class 'str'>, dict expected" in str(e.value)
-
-
-def test_validate_data_fail_no_context_property():
- data = SetContextDataField()
- with pytest.raises(TypeError) as e:
- data.validate({
- "name": "Thing"
- })
- assert "validation error [SetContextDataField]: missed fields - @context" in str(e.value)
-
-
-def test_validate_data_pass():
- data = SetContextDataField()
- data.validate({"@context": "https://www.example.com"})
-
-
-def test_validate_operation_fail_no_meta():
- operation = ClientSetContextOperation()
- with pytest.raises(TypeError) as e:
- operation.validate({
- "data": W3C_BASE_CONTEXT,
- "type": SET_CONTEXT
- })
- assert 'validation error [ClientSetContextOperation]: missed fields - meta' in str(e.value)
-
-
-def test_validate_operation_fail_no_data():
- operation = ClientSetContextOperation()
- with pytest.raises(TypeError) as e:
- operation.validate({
- "meta": {
- "type": CONTEXT_TYPE,
- "name": "TestContext",
- "version": "1.0"
- },
- "type": SET_CONTEXT
- })
- assert "validation error [ClientSetContextOperation]: missed fields - data." in str(e.value)
-
-
-def test_validate_operation_fail_no_type():
- operation = ClientSetContextOperation()
- with pytest.raises(TypeError) as e:
- operation.validate({
- "meta": {
- "type": CONTEXT_TYPE,
- "name": "TestContext",
- "version": "1.0"
- },
- "data": W3C_BASE_CONTEXT
- })
- assert "validation error [ClientSetContextOperation]: missed fields - type." in str(e.value)
-
-
-def test_validate_operation_pass():
- operation = ClientSetContextOperation()
- operation.validate({
- "meta": {
- "type": CONTEXT_TYPE,
- "name": "TestContext",
- "version": "1.0"
- },
- "data": W3C_BASE_CONTEXT,
- "type": SET_CONTEXT
- })
* [X] diff --git a/indy_node/test/context/test_send_get_context.py b/indy_node/test/context/test_send_get_context.py
File deleted
Only merges in its history
deleted file mode 100644
index aa811f6b..00000000
--- a/indy_node/test/context/test_send_get_context.py
+++ /dev/null
@@ -1,293 +0,0 @@
-import json
-
-import pytest
-
-from indy_common.constants import GET_CONTEXT, CONTEXT_TYPE, RS_TYPE, CONTEXT_NAME, CONTEXT_VERSION, META
-from indy_node.test.context.helper import W3C_BASE_CONTEXT
-from plenum.common.exceptions import RequestNackedException
-
-from plenum.common.constants import DATA, NAME, VERSION, TXN_METADATA, TXN_METADATA_SEQ_NO, TXN_TYPE
-
-from plenum.common.types import OPERATION
-
-from plenum.test.helper import sdk_sign_and_submit_req, sdk_get_and_check_replies
-
-from indy_node.test.api.helper import sdk_write_context
-from indy_node.test.helper import createUuidIdentifier, modify_field
-
-
-TEST_CONTEXT_NAME = "Base_Context"
-TEST_CONTEXT_VERSION = "1.0"
-
-
-@pytest.fixture(scope="module")
-def send_context(looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee):
- context_json, _ = sdk_write_context(looper, sdk_pool_handle, sdk_wallet_trustee,
- W3C_BASE_CONTEXT,
- TEST_CONTEXT_NAME,
- TEST_CONTEXT_VERSION)
- return json.loads(context_json)['id']
-
-
-def test_send_get_context_succeeds(looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- rep = sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- assert rep[0][1]['result']['seqNo']
- assert rep[0][1]['result'][DATA][META][RS_TYPE] == CONTEXT_TYPE
- assert rep[0][1]['result'][DATA][META][CONTEXT_NAME] == TEST_CONTEXT_NAME
- assert rep[0][1]['result'][DATA][META][CONTEXT_VERSION] == TEST_CONTEXT_VERSION
- assert rep[0][1]['result'][DATA][DATA] == W3C_BASE_CONTEXT
-
-
-def test_send_get_context_as_client(looper, sdk_pool_handle, nodeSet, sdk_wallet_client, sdk_wallet_trustee,
- send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- rep = sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_client,
- get_context_txn_json)])
- assert rep[0][1]['result']['seqNo']
- assert rep[0][1]['result'][DATA][META][RS_TYPE] == CONTEXT_TYPE
- assert rep[0][1]['result'][DATA][META][CONTEXT_NAME] == TEST_CONTEXT_NAME
- assert rep[0][1]['result'][DATA][META][CONTEXT_VERSION] == TEST_CONTEXT_VERSION
- assert rep[0][1]['result'][DATA][DATA] == W3C_BASE_CONTEXT
-
-
-def test_send_get_context_fails_with_invalid_name(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: "bad_name",
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- rep = sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- assert rep[0][1]['result']['seqNo'] is None
-
-
-def test_send_get_context_fails_with_incorrect_dest(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, sdk_wallet_client, send_context):
- _, did = sdk_wallet_client
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- rep = sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- assert rep[0][1]['result']['seqNo'] is None
-
-
-def test_send_get_context_fails_with_invalid_dest(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': "wrong_did",
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- with pytest.raises(RequestNackedException) as e:
- sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- assert "validation error [ClientGetContextOperation]: should not contain the following chars [\'_\'] (" \
- "dest=wrong_did)" in str(e.value)
-
-
-def test_send_get_context_fails_with_incorrect_version(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: '2.0',
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- rep = sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- assert rep[0][1]['result']['seqNo'] is None
-
-
-def test_send_get_context_fails_with_invalid_version(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: 2.0,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- with pytest.raises(RequestNackedException) as e:
- sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- assert "validation error [GetContextField]: expected types 'str', got 'float' (version=2.0)" in str(e.value)
-
-
-def test_send_get_context_fails_with_invalid_version_syntax(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: 'asd',
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- with pytest.raises(RequestNackedException) as e:
- sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- e.match("Invalid version: 'asd'")
-
-
-def test_send_get_context_fails_without_version(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- with pytest.raises(RequestNackedException) as e:
- sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- e.match('missed fields - version')
-
-
-def test_send_get_context_fails_without_name(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- 'dest': did,
- META: {
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- with pytest.raises(RequestNackedException) as e:
- sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- e.match('missed fields - name')
-
-
-def test_send_get_context_fails_without_dest(
- looper, sdk_pool_handle, nodeSet, sdk_wallet_trustee, send_context):
- _, did = sdk_wallet_trustee
- raw_json = {
- 'operation': {
- TXN_TYPE: GET_CONTEXT,
- META: {
- CONTEXT_NAME: TEST_CONTEXT_NAME,
- CONTEXT_VERSION: TEST_CONTEXT_VERSION,
- RS_TYPE: CONTEXT_TYPE
- }
- },
- "identifier": did,
- "reqId": 12345678,
- "protocolVersion": 2,
- }
- get_context_txn_json = json.dumps(raw_json)
- with pytest.raises(RequestNackedException) as e:
- sdk_get_and_check_replies(looper, [sdk_sign_and_submit_req(sdk_pool_handle, sdk_wallet_trustee,
- get_context_txn_json)])
- e.match('missed fields - dest')
* [X] diff --git a/indy_node/test/freeze_ledgers/__init__.py b/indy_node/test/freeze_ledgers/__init__.py
New file
new file mode 100644
index 00000000..e69de29b
* [X] diff --git a/indy_node/test/freeze_ledgers/test_send_ledgers_freeze.py b/indy_node/test/freeze_ledgers/test_send_ledgers_freeze.py
New file
new file mode 100644
index 00000000..2f692fef
--- /dev/null
+++ b/indy_node/test/freeze_ledgers/test_send_ledgers_freeze.py
@@ -0,0 +1,39 @@
+import pytest
+from plenum.common.constants import DATA
+from plenum.common.exceptions import RequestRejectedException
+from plenum.test.freeze_ledgers.helper import sdk_get_frozen_ledgers, sdk_send_freeze_ledgers
+from plenum.test.helper import freshness
+
+FRESHNESS_TIMEOUT = 5
+
+
+@pytest.fixture(scope="module")
+def tconf(tconf):
+ with freshness(tconf, enabled=True, timeout=FRESHNESS_TIMEOUT):
+ yield tconf
+
+
+def test_send_freeze_ledgers(looper, txnPoolNodeSet, sdk_pool_handle, sdk_wallet_trustee_list):
+ with pytest.raises(RequestRejectedException, match="Not enough TRUSTEE signatures"):
+ sdk_send_freeze_ledgers(
+ looper, sdk_pool_handle,
+ sdk_wallet_trustee_list[:-1],
+ []
+ )
+
+ # check that the config state doesn't contain frozen ledgers records
+ result = sdk_get_frozen_ledgers(looper, sdk_pool_handle,
+ sdk_wallet_trustee_list[0])[1]["result"][DATA]
+ assert result is None
+
+ # add to the config state a frozen ledgers record with an empty list
+ sdk_send_freeze_ledgers(
+ looper, sdk_pool_handle,
+ sdk_wallet_trustee_list,
+ []
+ )
+
+ # check that the config state contains a frozen ledgers record with an empty list
+ result = sdk_get_frozen_ledgers(looper, sdk_pool_handle,
+ sdk_wallet_trustee_list[0])[1]["result"][DATA]
+ assert len(result) == 0
* [X] diff --git a/indy_node/test/helper.py b/indy_node/test/helper.py
Do we care about this?
No, looks fine.
* commit 074ba6ba659952fb79ba80698450ae330e59da2f
| Author: toktar <renata.toktar@dsr-corporation.com>
| Date: Thu Aug 22 18:27:17 2019 +0300
|
| Merge pull request #1418 from Toktar/bug-2211-auth-constraints
|
| INDY-2211: don't write OFF_LEDGER_SIGNATURE if it's not explicitly set
|
| Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
|
| diff --git a/indy_node/test/helper.py b/indy_node/test/helper.py
| index f975b9e7..36cece4b 100644
| --- a/indy_node/test/helper.py
| +++ b/indy_node/test/helper.py
| @@ -184,12 +184,6 @@ def sdk_send_and_check_auth_rule_request(
| constraint=constraint
| )
|
| - # temp fix untill new sdk released
| - req_json = json.loads(req_json)
| - if req_json[OPERATION][CONSTRAINT][CONSTRAINT_ID] == 'ROLE':
| - req_json[OPERATION][CONSTRAINT][OFF_LEDGER_SIGNATURE] = constraint[OFF_LEDGER_SIGNATURE]
| - req_json = json.dumps(req_json)
| -
| return sdk_send_and_check_req_json(
| looper, sdk_pool_handle, sdk_wallet, req_json, no_wait=no_wait
| )
| @@ -250,14 +244,16 @@ def generate_constraint_entity(constraint_id=ConstraintsEnum.ROLE_CONSTRAINT_ID,
| role=TRUSTEE,
| sig_count=1,
| need_to_be_owner=False,
| - off_ledger_signature=False,
| + off_ledger_signature=None,
| metadata={}):
| - return {CONSTRAINT_ID: constraint_id,
| - ROLE: role,
| - SIG_COUNT: sig_count,
| - NEED_TO_BE_OWNER: need_to_be_owner,
| - OFF_LEDGER_SIGNATURE: off_ledger_signature,
| - METADATA: metadata}
| + constraint = {CONSTRAINT_ID: constraint_id,
| + ROLE: role,
| + SIG_COUNT: sig_count,
| + NEED_TO_BE_OWNER: need_to_be_owner,
| + METADATA: metadata}
| + if off_ledger_signature is not None:
| + constraint[OFF_LEDGER_SIGNATURE] = off_ledger_signature
| + return constraint
|
|
| base58_alphabet = set(base58.alphabet.decode("utf-8"))
index 36cece4b..a5c1fd2d 100644
--- a/indy_node/test/helper.py
+++ b/indy_node/test/helper.py
@@ -1,6 +1,7 @@
import json
-import base58
+from contextlib import contextmanager
+import base58
from indy.did import replace_keys_start, replace_keys_apply
from indy.ledger import (
build_attrib_request, build_get_attrib_request,
@@ -13,23 +14,21 @@ from indy_common.authorize.auth_constraints import ROLE, CONSTRAINT_ID, Constrai
METADATA, OFF_LEDGER_SIGNATURE
from indy_common.config_helper import NodeConfigHelper
from indy_common.constants import NYM, ENDORSER, CONSTRAINT, AUTH_ACTION, AUTH_TYPE, FIELD, NEW_VALUE, OLD_VALUE
+from indy_common.test.helper import TempStorage
+from indy_node.server.node import Node
from indy_node.server.node_bootstrap import NodeBootstrap
+from indy_node.server.upgrader import Upgrader
from plenum.common.constants import TRUSTEE
from plenum.common.signer_did import DidSigner
from plenum.common.signer_simple import SimpleSigner
-from plenum.common.types import OPERATION
from plenum.common.util import rawToFriendly
-from plenum.test.pool_transactions.helper import sdk_sign_and_send_prepared_request, sdk_add_new_nym
-from stp_core.common.log import getlogger
from plenum.test.helper import sdk_get_and_check_replies, sdk_sign_and_submit_req
+from plenum.test.pool_transactions.helper import sdk_add_new_nym
from plenum.test.test_node import TestNodeCore
from plenum.test.testable import spyable
-from indy_common.test.helper import TempStorage
-from indy_node.server.node import Node
-from indy_node.server.upgrader import Upgrader
+from stp_core.common.log import getlogger
from stp_core.types import HA
-
logger = getlogger()
@@ -324,3 +323,11 @@ def createCryptonym(seed=None):
def createUuidIdentifierAndFullVerkey(seed=None):
didSigner = DidSigner(identifier=createUuidIdentifier(), seed=seed)
return didSigner.identifier, didSigner.verkey
+
+
+@contextmanager
+def rich_schemas_enabled_scope(tconf):
+ old_value = tconf.ENABLE_RICH_SCHEMAS
+ tconf.ENABLE_RICH_SCHEMAS = True
+ yield tconf
+ tconf.ENABLE_RICH_SCHEMAS = old_value
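Note: the new rich_schemas_enabled_scope context manager added above is consumed later in this diff (see the rich_schema conftest.py entry below) to turn rich schemas on for a test module. A minimal sketch of that usage, assuming the same module-scoped tconf fixture pattern as in the conftest hunk:

    import pytest
    from indy_node.test.helper import rich_schemas_enabled_scope

    @pytest.fixture(scope="module")
    def tconf(tconf):
        # Temporarily set ENABLE_RICH_SCHEMAS = True for this module's tests;
        # the context manager restores the previous value on exit.
        with rich_schemas_enabled_scope(tconf):
            yield tconf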
* [X] diff --git a/indy_node/test/mock.py b/indy_node/test/mock.py
New file
new file mode 100644
index 00000000..e6153881
--- /dev/null
+++ b/indy_node/test/mock.py
@@ -0,0 +1,69 @@
+import json
+from random import randint
+from typing import Any, Dict, Optional, Union
+
+from indy_common.constants import GET_NYM, NYM
+
+
+def build_nym_request(
+ identifier: str,
+ dest: str,
+ verkey: Optional[str] = None,
+ diddoc_content: Optional[Union[dict, str]] = None,
+ role: Optional[str] = None,
+ version: Optional[int] = None,
+):
+ request = {
+ "identifier": identifier,
+ "reqId": randint(100, 1000000),
+ "protocolVersion": 2
+ }
+
+ operation: Dict[str, Any] = {
+ "dest": dest,
+ "type": NYM
+ }
+
+ if verkey:
+ operation["verkey"] = verkey
+ if diddoc_content:
+ operation["diddocContent"] = (
+ json.dumps(diddoc_content)
+ if isinstance(diddoc_content, dict)
+ else diddoc_content
+ )
+ if role:
+ operation["role"] = role
+
+ if version:
+ operation["version"] = version
+
+ request["operation"] = operation
+ return json.dumps(request)
+
+
+def build_get_nym_request(
+ identifier: str,
+ dest: str,
+ timestamp: Optional[int] = None,
+ seq_no: Optional[int] = None,
+):
+ request = {
+ "identifier": identifier,
+ "reqId": randint(100, 1000000),
+ "protocolVersion": 2
+ }
+
+ operation: Dict[str, Any] = {
+ "dest": dest,
+ "type": GET_NYM
+ }
+
+ if timestamp:
+ operation["timestamp"] = timestamp
+
+ if seq_no:
+ operation["seqNo"] = seq_no
+
+ request["operation"] = operation
+ return json.dumps(request)
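For context, these request builders are exercised by the new nym_txn tests further down in the diff. A minimal usage sketch (the DID value is illustrative only; everything else comes from mock.py and indy_common.constants):

    import json
    from indy_common.constants import GET_NYM, NYM
    from indy_node.test.mock import build_nym_request, build_get_nym_request

    did = "V4SGRU86Z58d6TV7PBUe6f"  # illustrative DID

    # NYM request carrying an optional diddocContent payload (dict is JSON-serialized).
    nym_req = build_nym_request(
        identifier=did,
        dest=did,
        diddoc_content={"@context": ["https://www.w3.org/ns/did/v1"]},
    )

    # Matching GET_NYM request; timestamp and seq_no stay optional.
    get_nym_req = build_get_nym_request(identifier=did, dest=did)

    assert json.loads(nym_req)["operation"]["type"] == NYM
    assert json.loads(get_nym_req)["operation"]["type"] == GET_NYM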
* [ ] diff --git a/indy_node/test/node_control_utils/test_node_control_util.py b/indy_node/test/node_control_utils/test_node_control_util.py
* [ ] TODO: dig into this further
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/test/node_control_utils/test_node_control_util.py
index aecad7d2..bb3e35e1 100644
--- a/indy_node/test/node_control_utils/test_node_control_util.py
+++ b/indy_node/test/node_control_utils/test_node_control_util.py
@@ -33,15 +33,15 @@ def catch_generated_commands(monkeypatch):
some_package_info = 'Package: some_package\nVersion: 1.2.3\nDepends: aaa (= 1.2.4), bbb (>= 1.2.5), ccc, aaa'
some_other_package_info = 'Package: some_other_package\nVersion: 4.5.6\nDepends: ddd (= 3.4.5), eee (>= 5.1.2), fff, ddd'
-app_package_info = 'Package: {}\nVersion: 1.2.3\nDepends: aaa (= 1.2.4), bbb (>= 1.2.5), ccc, aaa'.format(APP_NAME)
+app_package_info = f'Package: {APP_NAME}\nVersion: 1.2.3\nDepends: aaa (= 1.2.4), bbb (>= 1.2.5), ccc, aaa'
any_package_info = 'Package: any_package\nVersion: 1.2.3\nDepends: aaa (= 1.2.4), bbb (>= 1.2.5), ccc, aaa'
@pytest.fixture
def patch_run_shell_command(monkeypatch):
generated_commands[:] = []
- pkg_list = 'openssl\nsed\ntar\nsome_package\nsome_other_package\n{}\nany_package'.format(APP_NAME)
- pkg_info = '{}\n\n{}\n\n{}\n\n{}'.format(some_package_info, some_other_package_info, app_package_info, any_package_info)
+ pkg_list = f'openssl\nsed\ntar\nsome_package\nsome_other_package\n{APP_NAME}\nany_package'
+ pkg_info = f'{some_package_info}\n\n{some_other_package_info}\n\n{app_package_info}\n\n{any_package_info}'
def mock_run_shell_command(command, *args, **kwargs):
# Keep track of the generated commands
@@ -331,7 +331,7 @@ def test_get_latest_pkg_version_invalid_args():
(APP_NAME, None, 'Version: 1.2.3\nVersion: 1.2.4\nVersion: 1.2.5~rc1\nVersion: 1.2.5~dev1\nVersion: 1.2.3.4.5', '1.2.5rc1'),
(APP_NAME, '1.2.5', 'Version: 1.2.3\nVersion: 1.2.4\nVersion: 1.2.5~rc1\nVersion: 1.2.5~dev1\nVersion: 1.2.3.4.5', None),
],
- ids=lambda s: s.replace('\n', '_').replace(' ', '_')
+ ids=lambda s: s.replace('\n', '_').replace(' ', '_') if s else None
)
def test_get_latest_pkg_version(
monkeypatch, pkg_name, upstream, output, expected):
@@ -387,12 +387,12 @@ def test_curr_pkg_info(patch_run_shell_command, pkg_name, version, expected_deps
@pytest.mark.parametrize(
'pkg_name',
[
- pytest.param('{} | echo "hey"; echo "hi" && echo "hello"|echo "hello world"'.format(APP_NAME), id='multiple'),
- pytest.param('{}|echo "hey"'.format(APP_NAME), id='pipe'),
- pytest.param('{};echo "hey"'.format(APP_NAME), id='semi-colon'),
- pytest.param('{}&&echo "hey"'.format(APP_NAME), id='and'),
- pytest.param('{}\necho "hey"'.format(APP_NAME), id='Cr'),
- pytest.param('{} echo "hey"'.format(APP_NAME), id='whitespace'),
+ pytest.param(f'{APP_NAME} | echo "hey"; echo "hi" && echo "hello"|echo "hello world"', id='multiple'),
+ pytest.param(f'{APP_NAME}|echo "hey"', id='pipe'),
+ pytest.param(f'{APP_NAME};echo "hey"', id='semi-colon'),
+ pytest.param(f'{APP_NAME}&&echo "hey"', id='and'),
+ pytest.param(f'{APP_NAME}\necho "hey"', id='Cr'),
+ pytest.param(f'{APP_NAME} echo "hey"', id='whitespace'),
]
)
def test_curr_pkg_info_with_command_concat(patch_run_shell_command, pkg_name):
* [X] diff --git a/indy_node/test/nym_txn/conftest.py b/indy_node/test/nym_txn/conftest.py
Only merges in its history
index 549a5430..48ecafce 100644
--- a/indy_node/test/nym_txn/conftest.py
+++ b/indy_node/test/nym_txn/conftest.py
@@ -1,6 +1,32 @@
+import json
+
import pytest
@pytest.fixture(scope="function", params=[False, True])
def with_verkey(request):
return request.param
+
+
+@pytest.fixture
+def diddoc_content():
+ yield {
+ "@context": [
+ "https://www.w3.org/ns/did/v1",
+ "https://identity.foundation/didcomm-messaging/service-endpoint/v1",
+ ],
+ "serviceEndpoint": [
+ {
+ "id": "did:indy:sovrin:123456#didcomm",
+ "type": "didcomm-messaging",
+ "serviceEndpoint": "https://example.com",
+ "recipientKeys": ["#verkey"],
+ "routingKeys": [],
+ }
+ ],
+ }
+
+
+@pytest.fixture
+def diddoc_content_json(diddoc_content):
+ yield json.dumps(diddoc_content)
* [X] diff --git a/indy_node/test/nym_txn/test_get_nym_versions.py b/indy_node/test/nym_txn/test_get_nym_versions.py
New file
new file mode 100644
index 00000000..647659b5
--- /dev/null
+++ b/indy_node/test/nym_txn/test_get_nym_versions.py
@@ -0,0 +1,198 @@
+import copy
+import json
+import time
+from random import randint
+
+import pytest
+from indy_common.constants import NYM_VERSION_CONVENTION
+from indy_node.test.helper import sdk_send_and_check_req_json
+from indy_node.test.mock import build_get_nym_request, build_nym_request
+from plenum.common.exceptions import RequestNackedException
+from plenum.test.helper import sdk_get_and_check_replies
+from plenum.test.pool_transactions.helper import sdk_sign_and_send_prepared_request
+
+
+def test_get_nym_data_with_diddoc_content_without_seq_no_or_timestamp(
+ looper, sdk_pool_handle, sdk_wallet_endorser_factory, diddoc_content, diddoc_content_json
+):
+ sdk_wallet_endorser = sdk_wallet_endorser_factory(diddoc_content)
+ _, did = sdk_wallet_endorser
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ json.loads(replies[0][1]["result"]["data"])["diddocContent"] == diddoc_content_json
+ )
+
+
+def test_get_previous_nym_data_by_timestamp(
+ looper, sdk_pool_handle, sdk_wallet_endorser_factory, diddoc_content, diddoc_content_json
+):
+ sdk_wallet_endorser = sdk_wallet_endorser_factory(diddoc_content)
+ _, did = sdk_wallet_endorser
+
+ # Get current nym data
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ # Get timestamp from data
+ timestamp = replies[0][1]["result"]["txnTime"]
+
+ # Write new nym data
+ new_diddoc_content = copy.deepcopy(diddoc_content)
+ new_diddoc_content["serviceEndpoint"][0][
+ "serviceEndpoint"
+ ] = "https://new.example.com"
+ new_diddoc_content = json.dumps(new_diddoc_content)
+
+ time.sleep(3)
+
+ nym_request = build_nym_request(
+ identifier=did, dest=did, diddoc_content=new_diddoc_content
+ )
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, nym_request
+ )
+ sdk_get_and_check_replies(looper, [request_couple])
+
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ json.loads(replies[0][1]["result"]["data"])["diddocContent"] == new_diddoc_content
+ )
+
+ update_ts = replies[0][1]["result"]["txnTime"]
+
+ # Get previous nym data by exact timestamp
+ get_nym_request = build_get_nym_request(did, did, timestamp)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ json.loads(replies[0][1]["result"]["data"])["diddocContent"] == diddoc_content_json
+ )
+
+ # Get previous nym data by timestamp but not exact
+ ts = randint(timestamp + 1, update_ts - 1)
+ get_nym_request = build_get_nym_request(did, did, ts)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ json.loads(replies[0][1]["result"]["data"])["diddocContent"] == diddoc_content_json
+ )
+
+
+def test_get_previous_nym_data_by_seq_no(
+ looper, sdk_pool_handle, sdk_wallet_endorser_factory, diddoc_content, diddoc_content_json
+):
+ sdk_wallet_endorser = sdk_wallet_endorser_factory(diddoc_content)
+ _, did = sdk_wallet_endorser
+
+ # Get current nym data
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ # Get seq_no from data
+ seq_no = replies[0][1]["result"]["seqNo"]
+
+ # Write new nym data
+ new_diddoc_content = copy.deepcopy(diddoc_content)
+ new_diddoc_content["serviceEndpoint"][0][
+ "serviceEndpoint"
+ ] = "https://new.example.com"
+ new_diddoc_content = json.dumps(new_diddoc_content)
+
+ time.sleep(3)
+
+ nym_request = build_nym_request(did, did, None, new_diddoc_content, None)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, nym_request
+ )
+ sdk_get_and_check_replies(looper, [request_couple])
+
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ json.loads(replies[0][1]["result"]["data"])["diddocContent"] == new_diddoc_content
+ )
+
+ # Get previous nym data by seq_no
+ get_nym_request = build_get_nym_request(did, did, seq_no=seq_no)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ json.loads(replies[0][1]["result"]["data"])["diddocContent"] == diddoc_content_json
+ )
+
+
+def test_nym_txn_rejected_with_both_seqNo_and_timestamp(
+ looper, sdk_pool_handle, sdk_wallet_endorser
+):
+ _, did = sdk_wallet_endorser
+
+ # Attempt to get previous nym data by exact timestamp and seqNo
+ get_nym_request = build_get_nym_request(did, did, int(time.time()), 10)
+
+ with pytest.raises(RequestNackedException) as e:
+ sdk_send_and_check_req_json(
+ looper, sdk_pool_handle, sdk_wallet_endorser, get_nym_request
+ )
+ e.match("InvalidClientRequest")
+ e.match("client request invalid")
+ e.match("mutually exclusive")
+
+
+def test_get_nym_returns_no_nym_version_when_absent(
+ looper, sdk_pool_handle, sdk_wallet_endorser
+):
+ _, did = sdk_wallet_endorser
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ assert (
+ "version" not in json.loads(replies[0][1]["result"]["data"])
+ )
+
+
+def test_get_nym_returns_nym_version_when_set(
+ looper, sdk_pool_handle, sdk_wallet_endorser_factory
+):
+ sdk_wallet_endorser = sdk_wallet_endorser_factory(version=NYM_VERSION_CONVENTION)
+ _, did = sdk_wallet_endorser
+ get_nym_request = build_get_nym_request(did, did)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, get_nym_request
+ )
+ replies = sdk_get_and_check_replies(looper, [request_couple])
+
+ returned = json.loads(replies[0][1]["result"]["data"])
+ assert "version" in returned
+ assert returned["version"] == NYM_VERSION_CONVENTION
* [X] diff --git a/indy_node/test/nym_txn/test_nym_additional.py b/indy_node/test/nym_txn/test_nym_additional.py
Only merges in its history
index 1c589a27..612f84b1 100644
--- a/indy_node/test/nym_txn/test_nym_additional.py
+++ b/indy_node/test/nym_txn/test_nym_additional.py
@@ -25,7 +25,7 @@ def set_verkey(looper, sdk_pool_handle, sdk_wallet_sender, dest, verkey):
return wh, new_did
-@pytest.fixture("module")
+@pytest.fixture(scope="module")
def endorser_did_verkey(looper, sdk_wallet_client):
wh, _ = sdk_wallet_client
named_did, verkey = looper.loop.run_until_complete(
* [X] diff --git a/indy_node/test/nym_txn/test_nym_diddoc_content.py b/indy_node/test/nym_txn/test_nym_diddoc_content.py
new file
new file mode 100644
index 00000000..9c160b3b
--- /dev/null
+++ b/indy_node/test/nym_txn/test_nym_diddoc_content.py
@@ -0,0 +1,34 @@
+import copy
+
+import pytest
+from indy_node.test.mock import build_nym_request
+from plenum.common.exceptions import RequestNackedException
+from plenum.test.helper import sdk_get_and_check_replies
+from plenum.test.pool_transactions.helper import sdk_sign_and_send_prepared_request
+
+
+def test_diddoc_content_added(
+ looper, sdk_pool_handle, sdk_wallet_endorser, diddoc_content
+):
+ _, did = sdk_wallet_endorser
+ nym_request = build_nym_request(did, did, None, diddoc_content, None)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, nym_request
+ )
+ sdk_get_and_check_replies(looper, [request_couple])
+
+
+def test_add_didoc_with_id_fails(
+ looper, sdk_pool_handle, sdk_wallet_endorser, diddoc_content
+):
+ _, did = sdk_wallet_endorser
+ diddoc_content_with_id = copy.deepcopy(diddoc_content)
+ diddoc_content_with_id["id"] = "someId"
+ nym_request = build_nym_request(did, did, None, diddoc_content_with_id, None)
+ request_couple = sdk_sign_and_send_prepared_request(
+ looper, sdk_wallet_endorser, sdk_pool_handle, nym_request
+ )
+ with pytest.raises(RequestNackedException) as e:
+ sdk_get_and_check_replies(looper, [request_couple])
+ e.match("InvalidClientRequest")
+ e.match("diddocContent must not have `id` at root")
* [ ] diff --git a/indy_node/test/pool_restart/test_pool_restart.py b/indy_node/test/pool_restart/test_pool_restart.py
* [ ] TODO: need to look into this further
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/test/pool_restart/test_pool_restart.py
index e0ef5142..876c79fb 100644
--- a/indy_node/test/pool_restart/test_pool_restart.py
+++ b/indy_node/test/pool_restart/test_pool_restart.py
@@ -11,7 +11,6 @@ from indy_node.server.restarter import Restarter
from indy_node.test.pool_restart.helper import _createServer, _stopServer, sdk_send_restart
from plenum.common.constants import REPLY, TXN_TYPE
from plenum.common.types import f
-from plenum.test.testing_utils import FakeSomething
def test_pool_restart(
@@ -89,9 +88,9 @@ def test_pool_restart_cancel(
def test_pool_restart_now_without_datetime(
- sdk_pool_handle, sdk_wallet_trustee, looper, tdir, tconf):
+ sdk_pool_handle, sdk_wallet_trustee, looper, tdir, tconf, txnPoolNodeSet):
pool_restart_now(sdk_pool_handle, sdk_wallet_trustee, looper,
- tdir, tconf, START)
+ tdir, tconf, START, txnPoolNodeSet)
def test_pool_restart_in_view_change(sdk_pool_handle, sdk_wallet_trustee, looper,
@@ -101,11 +100,11 @@ def test_pool_restart_in_view_change(sdk_pool_handle, sdk_wallet_trustee, looper
node.master_replica._consensus_data.waiting_for_new_view = True
pool_restart_now(sdk_pool_handle, sdk_wallet_trustee, looper,
- tdir, tconf, START)
+ tdir, tconf, START, txnPoolNodeSet)
def pool_restart_now(sdk_pool_handle, sdk_wallet_trustee, looper, tdir, tconf,
- action, datetime=None):
+ action, txnPoolNodeSet, use_time=True):
server, indicator = looper.loop.run_until_complete(
_createServer(
host=tconf.controlServiceHost,
@@ -113,11 +112,15 @@ def pool_restart_now(sdk_pool_handle, sdk_wallet_trustee, looper, tdir, tconf,
)
)
+ time = None
+ if use_time:
+ unow = datetime.utcnow().replace(tzinfo=dateutil.tz.tzutc())
+ time = str(datetime.isoformat(unow + timedelta(seconds=1000)))
req_obj, resp = sdk_send_restart(looper,
sdk_wallet_trustee,
sdk_pool_handle,
action=action,
- datetime=datetime)
+ datetime=time)
_stopServer(server)
_comparison_reply(resp, req_obj)
* [X] diff --git a/indy_node/test/pool_restart/test_pool_restart_now_with_empty_datetime.py b/indy_node/test/pool_restart/test_pool_restart_now_with_empty_datetime.py
git log --date-order --graph --decorate -p origin/stable ^origin/ubuntu-20.04-upgrade -- indy_node/test/pool_restart/test_pool_restart_now_with_empty_datetime.py
* commit 0bd19eff3d8d5149a92e3e5c4756753c4b992f8e
Merge: 089d12e4 7dcf675c
Author: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Date: Fri Jun 22 15:03:07 2018 +0300
Merge remote-tracking branch 'public/master' into rc-1.4.63
Signed-off-by: Andrew Nikitin <andrew.nikitin@dsr-corporation.com>
Only merges in history
index ac3e2221..bf807da6 100644
--- a/indy_node/test/pool_restart/test_pool_restart_now_with_empty_datetime.py
+++ b/indy_node/test/pool_restart/test_pool_restart_now_with_empty_datetime.py
@@ -3,6 +3,6 @@ from indy_node.test.pool_restart.test_pool_restart import pool_restart_now
def test_pool_restart_now_with_empty_datetime(
- sdk_pool_handle, sdk_wallet_trustee, looper, tdir, tconf):
+ sdk_pool_handle, sdk_wallet_trustee, looper, tdir, tconf, txnPoolNodeSet):
pool_restart_now(sdk_pool_handle, sdk_wallet_trustee, looper,
- tdir, tconf, START, "")
\ No newline at end of file
+ tdir, tconf, START, "", use_time=False)
* [X] diff --git a/indy_node/test/request_handlers/conftest.py b/indy_node/test/request_handlers/conftest.py
Only merges in history
index b8603e55..3304800c 100644
--- a/indy_node/test/request_handlers/conftest.py
+++ b/indy_node/test/request_handlers/conftest.py
@@ -1,18 +1,25 @@
+import random
+
import pytest
-from indy_common.constants import SCHEMA, REVOC_REG_DEF, CRED_DEF_ID, REVOC_TYPE, TAG, CONTEXT_TYPE
+from indy_common.constants import CONFIG_LEDGER_ID, DOMAIN_LEDGER_ID, FLAG, FLAG_NAME, FLAG_VALUE, FLAG_NAME_COMPAT_ORDERING, REVOC_REG_DEF, CRED_DEF_ID, REVOC_TYPE, TAG
from indy_node.persistence.idr_cache import IdrCache
-from indy_node.server.request_handlers.domain_req_handlers.context_handler import ContextHandler
+from indy_node.server.request_handlers.config_req_handlers.flag_handler import FlagRequestHandler
from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_def_handler import RevocRegDefHandler
from indy_node.server.request_handlers.domain_req_handlers.schema_handler import SchemaHandler
+from indy_node.server.request_handlers.read_req_handlers.get_flag_handler import GetFlagRequestHandler
from indy_node.test.auth_rule.helper import generate_auth_rule_operation
-from indy_node.test.context.helper import W3C_BASE_CONTEXT
from indy_node.test.request_handlers.helper import add_to_idr
from plenum.common.constants import KeyValueStorageType, TXN_TYPE, TXN_AUTHOR_AGREEMENT, TXN_AUTHOR_AGREEMENT_TEXT, \
- TXN_AUTHOR_AGREEMENT_VERSION, TXN_AUTHOR_AGREEMENT_RATIFICATION_TS
+ TXN_AUTHOR_AGREEMENT_VERSION, TXN_AUTHOR_AGREEMENT_RATIFICATION_TS, TS_LABEL
from plenum.common.request import Request
from plenum.common.util import randomString
-from storage.helper import initKeyValueStorage
+from plenum.server.node import Node
+from storage.helper import initKeyValueStorage, initKeyValueStorageIntKeys
+from storage.state_ts_store import StateTsDbStorage
+from storage.kv_in_memory import KeyValueStorageInMemory
+from indy_node.test.api.helper import req_id
+_reqId = req_id()
@pytest.fixture(scope="module")
@@ -36,7 +43,7 @@ def schema_request():
return Request(identifier=randomString(),
reqId=5,
signature="sig",
- operation={'type': SCHEMA,
+ operation={'type': "101",
'data': {
'version': '1.0',
'name': 'Degree',
@@ -45,27 +52,78 @@ def schema_request():
}})
-@pytest.fixture(scope="module")
-def context_handler(db_manager, write_auth_req_validator):
- return ContextHandler(db_manager, write_auth_req_validator)
-
-
@pytest.fixture(scope="function")
-def context_request():
- return Request(identifier=randomString(),
- reqId=1234,
+def rs_schema_request():
+ authors_did, name, version, _type = "2hoqvcwupRTUNkXn6ArYzs", randomString(), "1.1", "8"
+ _id = authors_did + ':' + _type + ':' + name + ':' + version
+ return Request(identifier=authors_did,
+ reqId=next(_reqId),
signature="sig",
+ protocolVersion=2,
operation={
+ "type": "201",
"meta": {
- "type": CONTEXT_TYPE,
- "name": "TestContext",
- "version": 1
+ "type": "sch",
+ "name": name,
+ "version": version
},
- "data": W3C_BASE_CONTEXT,
- "type": "200"
+ "data": {
+ "schema": {
+ "@id": _id,
+ "@context": "ctx:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "@type": "rdfs:Class",
+ "rdfs:comment": "ISO18013 International Driver License",
+ "rdfs:label": "Driver License",
+ "rdfs:subClassOf": {
+ "@id": "sch:Thing"
+ },
+ "driver": "Driver",
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "issuingAuthority": "Text",
+ "licenseNumber": "Text",
+ "categoriesOfVehicles": {
+ "vehicleType": "Text",
+ "vehicleType-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^(A|B|C|D|BE|CE|DE|AM|A1|A2|B1|C1|D1|C1E|D1E)$"
+ },
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "restrictions": "Text",
+ "restrictions-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^([A-Z]|[1-9])$"
+ }
+ },
+ "administrativeNumber": "Text"
+ }
+ }
})
+@pytest.fixture(scope="function")
+def rs_schema_broken_request():
+ authors_did, name, version, _type = "2hoqvcwupRTUNkXn6ArYzs", randomString(), "1.1", "8"
+ _id = authors_did + ':' + _type + ':' + name + ':' + version
+ return Request(identifier=authors_did,
+ reqId=random.randint(1, 10000000000000000000),
+ signature="sig",
+ protocolVersion=2,
+ operation={'type': '201',
+ 'data':
+ {'schema':
+ {'@id': '7MpRep22vL3bnyR8VqCvsS:8:ISO18023_Drivers_License:1.2',
+ '@type': '0od'}
+ },
+ 'meta':
+ {'type': 'sch',
+ 'version': '1.2',
+ 'name': 'ISO18023_Drivers_License'}
+ }
+ )
+
+
@pytest.fixture(scope="module")
def revoc_reg_def_handler(db_manager, write_auth_req_validator):
return RevocRegDefHandler(db_manager, write_auth_req_validator)
@@ -107,3 +165,52 @@ def taa_request(creator):
TXN_AUTHOR_AGREEMENT_TEXT: "text",
TXN_AUTHOR_AGREEMENT_VERSION: "version",
TXN_AUTHOR_AGREEMENT_RATIFICATION_TS: 0})
+
+
+@pytest.fixture(scope="module")
+def flag_handler(db_manager_ts, write_auth_req_validator):
+ return FlagRequestHandler(db_manager_ts, write_auth_req_validator)
+
+
+@pytest.fixture(scope="module")
+def ts_store(tmpdir_factory):
+ data_location = tmpdir_factory.mktemp('tmp').strpath
+ config_storage = initKeyValueStorageIntKeys(
+ KeyValueStorageType.Rocksdb,
+ data_location,
+ "config_test_db")
+ return StateTsDbStorage("test", {CONFIG_LEDGER_ID: config_storage})
+
+
+@pytest.fixture(scope="module")
+def db_manager_ts(db_manager, ts_store):
+ db_manager.register_new_store(TS_LABEL, ts_store)
+ return db_manager
+
+
+@pytest.fixture(scope="module")
+def get_flag_request_handler(db_manager_ts):
+ node = Node.__new__(Node)
+ return GetFlagRequestHandler(node, db_manager_ts)
+
+
+@pytest.fixture(scope="function")
+def flag_request():
+ identifier = randomString()
+ return Request(
+ identifier=identifier,
+ reqId=5,
+ operation={TXN_TYPE: FLAG, FLAG_NAME: FLAG_NAME_COMPAT_ORDERING, FLAG_VALUE: "True"},
+ signature="randomString",
+ )
+
+
+@pytest.fixture(scope="function")
+def flag_get_request():
+ identifier = randomString()
+ return Request(
+ identifier=identifier,
+ reqId=5,
+ operation={TXN_TYPE: FLAG, FLAG_NAME: FLAG_NAME_COMPAT_ORDERING, FLAG_VALUE: "True"},
+ signature="randomString",
+ )
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/__init__.py b/indy_node/test/request_handlers/rich_schema/__init__.py
New file
new file mode 100644
index 00000000..e69de29b
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/conftest.py b/indy_node/test/request_handlers/rich_schema/conftest.py
New file
new file mode 100644
index 00000000..9123f04b
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/conftest.py
@@ -0,0 +1,46 @@
+import pytest
+
+from indy_common.constants import RS_CONTEXT_TYPE_VALUE, RS_ENCODING_TYPE_VALUE, RS_CRED_DEF_TYPE_VALUE, \
+ RS_SCHEMA_TYPE_VALUE, RS_MAPPING_TYPE_VALUE, RS_PRES_DEF_TYPE_VALUE
+from indy_common.types import Request
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.abstract_rich_schema_object_handler import \
+ AbstractRichSchemaObjectHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.json_ld_context_handler import \
+ JsonLdContextHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_cred_def_handler import \
+ RichSchemaCredDefHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_encoding_handler import \
+ RichSchemaEncodingHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_handler import RichSchemaHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_mapping_handler import \
+ RichSchemaMappingHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_pres_def_handler import \
+ RichSchemaPresDefHandler
+from indy_node.test.helper import rich_schemas_enabled_scope
+from indy_node.test.request_handlers.rich_schema.helper import context_request, rich_schema_request, \
+ rich_schema_encoding_request, rich_schema_mapping_request, rich_schema_cred_def_request, \
+ rich_schema_pres_def_request
+
+
+@pytest.fixture(scope="module")
+def tconf(tconf):
+ with rich_schemas_enabled_scope(tconf):
+ yield tconf
+
+
+@pytest.fixture(params=[RS_CONTEXT_TYPE_VALUE, RS_SCHEMA_TYPE_VALUE,
+ RS_ENCODING_TYPE_VALUE, RS_MAPPING_TYPE_VALUE,
+ RS_CRED_DEF_TYPE_VALUE, RS_PRES_DEF_TYPE_VALUE])
+def handler_and_request(request, db_manager, write_auth_req_validator) -> (AbstractRichSchemaObjectHandler, Request):
+ if request.param == RS_CONTEXT_TYPE_VALUE:
+ return JsonLdContextHandler(db_manager, write_auth_req_validator), context_request()
+ if request.param == RS_SCHEMA_TYPE_VALUE:
+ return RichSchemaHandler(db_manager, write_auth_req_validator), rich_schema_request()
+ if request.param == RS_ENCODING_TYPE_VALUE:
+ return RichSchemaEncodingHandler(db_manager, write_auth_req_validator), rich_schema_encoding_request()
+ if request.param == RS_MAPPING_TYPE_VALUE:
+ return RichSchemaMappingHandler(db_manager, write_auth_req_validator), rich_schema_mapping_request()
+ if request.param == RS_CRED_DEF_TYPE_VALUE:
+ return RichSchemaCredDefHandler(db_manager, write_auth_req_validator), rich_schema_cred_def_request()
+ if request.param == RS_PRES_DEF_TYPE_VALUE:
+ return RichSchemaPresDefHandler(db_manager, write_auth_req_validator), rich_schema_pres_def_request()
\ No newline at end of file
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/helper.py b/indy_node/test/request_handlers/rich_schema/helper.py
New file
new file mode 100644
index 00000000..2643963d
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/helper.py
@@ -0,0 +1,81 @@
+import copy
+import json
+
+from indy_common.constants import RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE, RICH_SCHEMA_MAPPING, \
+ RS_MAPPING_TYPE_VALUE, RS_CRED_DEF_TYPE_VALUE, RICH_SCHEMA_CRED_DEF, RICH_SCHEMA, RS_SCHEMA_TYPE_VALUE, \
+ JSON_LD_CONTEXT, RS_CONTEXT_TYPE_VALUE, RS_ID, RS_NAME, RS_TYPE, RS_VERSION, RS_CONTENT, RS_PRES_DEF_TYPE_VALUE, \
+ RICH_SCHEMA_PRES_DEF
+from indy_common.types import Request
+from indy_node.test.rich_schema.templates import RICH_SCHEMA_EX1, W3C_BASE_CONTEXT, RICH_SCHEMA_ENCODING_EX1, \
+ RICH_SCHEMA_MAPPING_EX1, RICH_SCHEMA_CRED_DEF_EX1, RICH_SCHEMA_PRES_DEF_EX1
+from plenum.common.constants import TXN_TYPE, OP_VER, CURRENT_PROTOCOL_VERSION
+from plenum.common.txn_util import reqToTxn, append_txn_metadata
+from plenum.common.util import randomString
+
+
+def rs_req(txn_type, rs_type, content, id=None):
+ author = randomString()
+ endorser = randomString()
+ return Request(identifier=author,
+ reqId=1234,
+ signatures={author: "sig1", endorser: "sig2"},
+ endorser=endorser,
+ protocolVersion=CURRENT_PROTOCOL_VERSION,
+ operation={
+ TXN_TYPE: txn_type,
+ OP_VER: '1.1',
+ RS_ID: id or randomString(),
+ RS_NAME: randomString(),
+ RS_TYPE: rs_type,
+ RS_VERSION: '1.0',
+ RS_CONTENT: json.dumps(content)
+ })
+
+
+def context_request():
+ return rs_req(JSON_LD_CONTEXT, RS_CONTEXT_TYPE_VALUE,
+ content=W3C_BASE_CONTEXT)
+
+
+def rich_schema_request():
+ id = randomString()
+ content = copy.deepcopy(RICH_SCHEMA_EX1)
+ content['@id'] = id
+ return rs_req(RICH_SCHEMA, RS_SCHEMA_TYPE_VALUE,
+ content=content, id=id)
+
+
+def rich_schema_encoding_request():
+ return rs_req(RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE,
+ content=RICH_SCHEMA_ENCODING_EX1)
+
+
+def rich_schema_mapping_request():
+ id = randomString()
+ content = copy.deepcopy(RICH_SCHEMA_MAPPING_EX1)
+ content['@id'] = id
+ return rs_req(RICH_SCHEMA_MAPPING, RS_MAPPING_TYPE_VALUE,
+ content=content, id=id)
+
+
+def rich_schema_cred_def_request():
+ return rs_req(RICH_SCHEMA_CRED_DEF, RS_CRED_DEF_TYPE_VALUE,
+ content=RICH_SCHEMA_CRED_DEF_EX1)
+
+
+def rich_schema_pres_def_request():
+ id = randomString()
+ content = copy.deepcopy(RICH_SCHEMA_PRES_DEF_EX1)
+ content['@id'] = id
+ return rs_req(RICH_SCHEMA_PRES_DEF, RS_PRES_DEF_TYPE_VALUE,
+ content=content, id=id)
+
+
+def make_rich_schema_object_exist(handler, request, commit=False):
+ seq_no = 1
+ txn_time = 1560241033
+ txn = reqToTxn(request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ handler.update_state(txn, None, context_request)
+ if commit:
+ handler.state.commit()
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_all_rich_schema_handlers.py b/indy_node/test/request_handlers/rich_schema/test_all_rich_schema_handlers.py
New file
new file mode 100644
index 00000000..3247bc9b
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_all_rich_schema_handlers.py
@@ -0,0 +1,157 @@
+import copy
+import json
+import random
+
+import pytest
+
+from indy_common.authorize.auth_constraints import AuthConstraintForbidden
+from indy_common.constants import RS_ID, RS_TYPE, RS_NAME, RS_VERSION, RS_CONTENT, ENDORSER, JSON_LD_ID_FIELD, \
+ JSON_LD_TYPE_FIELD
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_handler import RichSchemaHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_mapping_handler import \
+ RichSchemaMappingHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_pres_def_handler import \
+ RichSchemaPresDefHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import context_request, make_rich_schema_object_exist
+from plenum.common.constants import OP_VER, TRUSTEE
+from plenum.common.exceptions import UnauthorizedClientRequest, InvalidClientRequest
+from plenum.common.txn_util import reqToTxn, append_txn_metadata
+from plenum.common.util import SortedDict, randomString
+
+
+def test_update_state(handler_and_request):
+ handler, request = handler_and_request
+ seq_no = 1
+ txn_time = 1560241033
+ txn = reqToTxn(request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ op = request.operation
+
+ handler.update_state(txn, None, context_request)
+
+ value = {
+ 'id': op[RS_ID],
+ 'rsType': op[RS_TYPE],
+ 'rsName': op[RS_NAME],
+ 'rsVersion': op[RS_VERSION],
+ 'content': op[RS_CONTENT],
+ 'from': request.identifier,
+ 'endorser': request.endorser,
+ 'ver': op[OP_VER],
+ }
+ primary_key = op[RS_ID]
+ secondary_key = "{RS_TYPE}:{RS_NAME}:{RS_VERSION}".format(RS_TYPE=op['rsType'],
+ RS_NAME=op['rsName'],
+ RS_VERSION=op['rsVersion']).encode()
+
+ value_from_state = handler.get_from_state(primary_key)
+ assert SortedDict(value_from_state[0]) == SortedDict(value)
+ assert value_from_state[1] == seq_no
+ assert value_from_state[2] == txn_time
+ assert handler.state.get(secondary_key, isCommitted=False) == op[RS_ID].encode()
+
+
+def test_static_validation_pass(handler_and_request):
+ handler, request = handler_and_request
+ handler.static_validation(request)
+
+
+def test_static_validation_content_is_json(handler_and_request):
+ handler, request = handler_and_request
+
+ request.operation[RS_CONTENT] = randomString()
+ with pytest.raises(InvalidClientRequest, match="must be a JSON serialized string"):
+ handler.static_validation(request)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_content_is_json_ld_with_atid(handler_and_request, status):
+ handler, request = handler_and_request
+
+ content = copy.deepcopy(json.loads(request.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop(JSON_LD_ID_FIELD, None)
+ elif status == 'empty':
+ content[JSON_LD_ID_FIELD] = ""
+ elif status == 'none':
+ content[JSON_LD_ID_FIELD] = None
+ request.operation[RS_CONTENT] = json.dumps(content)
+
+ if not isinstance(handler, (RichSchemaMappingHandler, RichSchemaHandler, RichSchemaPresDefHandler)):
+ handler.static_validation(request)
+ return
+
+ with pytest.raises(InvalidClientRequest, match="'content' must be a valid JSON-LD and have non-empty '@id' field"):
+ handler.static_validation(request)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_content_is_json_ld_with_attype(handler_and_request, status):
+ handler, request = handler_and_request
+
+ content = copy.deepcopy(json.loads(request.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop(JSON_LD_TYPE_FIELD, None)
+ elif status == 'empty':
+ content[JSON_LD_TYPE_FIELD] = ""
+ elif status == 'none':
+ content[JSON_LD_TYPE_FIELD] = None
+ request.operation[RS_CONTENT] = json.dumps(content)
+
+ if not isinstance(handler, (RichSchemaMappingHandler, RichSchemaHandler, RichSchemaPresDefHandler)):
+ handler.static_validation(request)
+ return
+
+ with pytest.raises(InvalidClientRequest,
+ match="'content' must be a valid JSON-LD and have non-empty '@type' field"):
+ handler.static_validation(request)
+
+
+def test_static_validation_atid_equals_to_id(handler_and_request):
+ handler, request = handler_and_request
+
+ content = copy.deepcopy(json.loads(request.operation[RS_CONTENT]))
+ content["@id"] = request.operation[RS_ID] + "a"
+ request.operation[RS_CONTENT] = json.dumps(content)
+ if not isinstance(handler, (RichSchemaMappingHandler, RichSchemaHandler, RichSchemaPresDefHandler)):
+ handler.static_validation(request)
+ return
+
+ with pytest.raises(InvalidClientRequest,
+ match="content's @id must be equal to id={}".format(request.operation[RS_ID])):
+ handler.static_validation(request)
+
+
+def test_dynamic_validation_failed_not_authorised(handler_and_request):
+ handler, request = handler_and_request
+ add_to_idr(handler.database_manager.idr_cache, request.identifier, None)
+ with pytest.raises(UnauthorizedClientRequest):
+ handler.dynamic_validation(request, 0)
+
+
+def test_dynamic_validation_for_existing(handler_and_request):
+ handler, request = handler_and_request
+ make_rich_schema_object_exist(handler, request)
+ add_to_idr(handler.database_manager.idr_cache, request.identifier, TRUSTEE)
+ add_to_idr(handler.database_manager.idr_cache, request.endorser, ENDORSER)
+
+ with pytest.raises(UnauthorizedClientRequest, match=str(AuthConstraintForbidden())):
+ handler.dynamic_validation(request, 0)
+
+
+def test_dynamic_validation_for_existing_metadata(handler_and_request):
+ handler, request = handler_and_request
+ make_rich_schema_object_exist(handler, request)
+ add_to_idr(handler.database_manager.idr_cache, request.identifier, TRUSTEE)
+ add_to_idr(handler.database_manager.idr_cache, request.endorser, ENDORSER)
+
+ request.operation[RS_ID] = randomString()
+ request.operation[RS_CONTENT] = randomString()
+ request.reqId = random.randint(10, 1000000000)
+
+ with pytest.raises(InvalidClientRequest,
+ match='An object with rsName="{}", rsVersion="{}" and rsType="{}" already exists. '
+ 'Please choose different rsName, rsVersion or rsType'.format(
+ request.operation[RS_NAME], request.operation[RS_VERSION], request.operation[RS_TYPE])):
+ handler.dynamic_validation(request, 0)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_get_rs_object_by_id_handler.py b/indy_node/test/request_handlers/rich_schema/test_get_rs_object_by_id_handler.py
new file
new file mode 100644
index 00000000..c8aba449
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_get_rs_object_by_id_handler.py
@@ -0,0 +1,134 @@
+import pytest
+
+from indy_common.constants import RS_ID, GET_RICH_SCHEMA_OBJECT_BY_ID, RS_VERSION, RS_NAME, RS_CONTENT, RS_TYPE
+from indy_common.types import Request
+from indy_node.server.request_handlers.read_req_handlers.rich_schema.get_rich_schema_object_by_id_handler import \
+ GetRichSchemaObjectByIdHandler
+from indy_node.test.state_proof.helper import check_valid_proof
+from indy_node.test.state_proof.test_state_multi_proofs_for_get_requests import is_proof_verified, save_multi_sig
+from plenum.common.constants import TXN_TYPE, OP_VER, DATA, CURRENT_PROTOCOL_VERSION
+from plenum.common.txn_util import reqToTxn, append_txn_metadata, get_payload_data
+from plenum.common.util import randomString, SortedDict
+
+
+@pytest.fixture()
+def id():
+ return randomString()
+
+
+@pytest.fixture()
+def get_rich_schema_req(id):
+ return Request(identifier=randomString(),
+ reqId=1234,
+ sig="sig",
+ protocolVersion=CURRENT_PROTOCOL_VERSION,
+ operation={
+ TXN_TYPE: GET_RICH_SCHEMA_OBJECT_BY_ID,
+ RS_ID: id,
+ })
+
+
+@pytest.fixture()
+def get_rich_schema_by_id_handler(db_manager):
+ return GetRichSchemaObjectByIdHandler(db_manager)
+
+
+def test_get_rich_schema_obj(db_manager, handler_and_request, id,
+ get_rich_schema_by_id_handler, get_rich_schema_req):
+ # prepare: store object in state with bls multi-sig
+ handler, request = handler_and_request
+ op = request.operation
+ op[RS_ID] = id
+ seq_no = 1
+ txn_time = 1560241033
+ txn = reqToTxn(request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ handler.update_state(txn, None, request)
+ handler.state.commit()
+ save_multi_sig(db_manager)
+
+ # execute: get object
+ result = get_rich_schema_by_id_handler.get_result(get_rich_schema_req)
+
+ # check
+ assert result
+ expected_data = SortedDict({
+ 'ver': op[OP_VER],
+ 'id': op[RS_ID],
+ 'rsType': op[RS_TYPE],
+ 'rsName': op[RS_NAME],
+ 'rsVersion': op[RS_VERSION],
+ 'content': op[RS_CONTENT],
+ 'from': request.identifier,
+ 'endorser': request.endorser,
+ })
+ assert SortedDict(result['data']) == expected_data
+ assert result['seqNo'] == seq_no
+ assert result['txnTime'] == txn_time
+ assert result['state_proof']
+ check_valid_proof(result)
+ path = id.encode()
+ assert is_proof_verified(db_manager,
+ result['state_proof'],
+ path, result[DATA], seq_no, txn_time)
+
+
+def test_get_rich_schema_obj_not_existent(db_manager, handler_and_request,
+ get_rich_schema_by_id_handler, get_rich_schema_req):
+ save_multi_sig(db_manager)
+
+ # execute: get object
+ result = get_rich_schema_by_id_handler.get_result(get_rich_schema_req)
+
+ # check
+ assert result
+ assert result['data'] is None
+ assert result['seqNo'] is None
+ assert result['txnTime'] is None
+ assert result['state_proof']
+ check_valid_proof(result)
+
+
+def test_get_rich_schema_obj_committed_only(db_manager, handler_and_request, id,
+ get_rich_schema_by_id_handler, get_rich_schema_req):
+ # prepare: store object in state with bls multi-sig, and then update the object (uncommitted)
+ handler, request = handler_and_request
+ op = request.operation
+ op[RS_ID] = id
+ seq_no = 1
+ txn_time = 1560241033
+ txn = reqToTxn(request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ handler.update_state(txn, None, request)
+ handler.state.commit()
+
+ get_payload_data(txn)[RS_NAME] = "new uncommitted name"
+ get_payload_data(txn)[RS_VERSION] = "3.0"
+ handler.update_state(txn, None, request)
+
+ save_multi_sig(db_manager)
+
+ # execute: get object
+ result = get_rich_schema_by_id_handler.get_result(get_rich_schema_req)
+
+ # check
+ assert result
+ expected_data = SortedDict({
+ 'ver': op[OP_VER],
+ 'id': op[RS_ID],
+ 'rsType': op[RS_TYPE],
+ 'rsName': op[RS_NAME],
+ 'rsVersion': op[RS_VERSION],
+ 'content': op[RS_CONTENT],
+ 'from': request.identifier,
+ 'endorser': request.endorser,
+ })
+ assert SortedDict(result['data']) == expected_data
+ assert result['seqNo'] == seq_no
+ assert result['txnTime'] == txn_time
+ assert result['state_proof']
+ check_valid_proof(result)
+ path = id.encode()
+ assert is_proof_verified(db_manager,
+ result['state_proof'],
+ path, result[DATA], seq_no, txn_time)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_get_rs_object_by_metadata_handler.py b/indy_node/test/request_handlers/rich_schema/test_get_rs_object_by_metadata_handler.py
New file
new file mode 100644
index 00000000..b9d52b54
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_get_rs_object_by_metadata_handler.py
@@ -0,0 +1,142 @@
+import pytest
+
+from indy_common.constants import RS_ID, RS_VERSION, RS_NAME, RS_CONTENT, RS_TYPE, \
+ GET_RICH_SCHEMA_OBJECT_BY_METADATA
+from indy_common.types import Request
+from indy_node.server.request_handlers.read_req_handlers.rich_schema.get_rich_schema_object_by_metadata_handler import \
+ GetRichSchemaObjectByMetadataHandler
+from indy_node.test.state_proof.helper import check_valid_proof
+from indy_node.test.state_proof.test_state_multi_proofs_for_get_requests import is_proof_verified, save_multi_sig
+from plenum.common.constants import TXN_TYPE, OP_VER, DATA, CURRENT_PROTOCOL_VERSION
+from plenum.common.txn_util import reqToTxn, append_txn_metadata, get_payload_data
+from plenum.common.util import randomString, SortedDict
+
+
+@pytest.fixture()
+def metadata():
+ return randomString(), randomString(), randomString()
+
+
+@pytest.fixture()
+def get_rich_schema_req(metadata):
+ return Request(identifier=randomString(),
+ reqId=1234,
+ sig="sig",
+ protocolVersion=CURRENT_PROTOCOL_VERSION,
+ operation={
+ TXN_TYPE: GET_RICH_SCHEMA_OBJECT_BY_METADATA,
+ RS_NAME: metadata[0],
+ RS_VERSION: metadata[1],
+ RS_TYPE: metadata[2]
+ })
+
+
+@pytest.fixture()
+def get_rich_schema_by_meta_handler(db_manager):
+ return GetRichSchemaObjectByMetadataHandler(db_manager)
+
+
+def test_get_rich_schema_obj(db_manager, handler_and_request, metadata,
+ get_rich_schema_by_meta_handler, get_rich_schema_req):
+ # prepare: store object in state with bls multi-sig
+ handler, request = handler_and_request
+ rs_name, rs_version, rs_type = metadata
+ op = request.operation
+ op[RS_NAME] = rs_name
+ op[RS_VERSION] = rs_version
+ op[RS_TYPE] = rs_type
+ seq_no = 1
+ txn_time = 1560241033
+ txn = reqToTxn(request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ handler.update_state(txn, None, request)
+ handler.state.commit()
+ save_multi_sig(db_manager)
+
+ # execute: get object
+ result = get_rich_schema_by_meta_handler.get_result(get_rich_schema_req)
+
+ # check
+ assert result
+ expected_data = SortedDict({
+ 'ver': op[OP_VER],
+ 'id': op[RS_ID],
+ 'rsType': op[RS_TYPE],
+ 'rsName': op[RS_NAME],
+ 'rsVersion': op[RS_VERSION],
+ 'content': op[RS_CONTENT],
+ 'from': request.identifier,
+ 'endorser': request.endorser,
+ })
+ assert SortedDict(result['data']) == expected_data
+ assert result['seqNo'] == seq_no
+ assert result['txnTime'] == txn_time
+ assert result['state_proof']
+ check_valid_proof(result)
+ path = op[RS_ID].encode()
+ assert is_proof_verified(db_manager,
+ result['state_proof'],
+ path, result[DATA], seq_no, txn_time)
+
+
+def test_get_rich_schema_obj_not_existent(db_manager, handler_and_request, metadata,
+ get_rich_schema_by_meta_handler, get_rich_schema_req):
+ save_multi_sig(db_manager)
+
+ # execute: get object
+ result = get_rich_schema_by_meta_handler.get_result(get_rich_schema_req)
+
+ # check
+ assert result
+ assert result['data'] is None
+ assert result['seqNo'] is None
+ assert result['txnTime'] is None
+ assert result['state_proof']
+ check_valid_proof(result)
+
+
+def test_get_rich_schema_obj_committed_only(db_manager, handler_and_request, metadata,
+ get_rich_schema_by_meta_handler, get_rich_schema_req):
+ # prepare: store object in state with bls multi-sig, and then update the object (uncommitted)
+ handler, request = handler_and_request
+ rs_name, rs_version, rs_type = metadata
+ op = request.operation
+ op[RS_NAME] = rs_name
+ op[RS_VERSION] = rs_version
+ op[RS_TYPE] = rs_type
+ seq_no = 1
+ txn_time = 1560241033
+ txn = reqToTxn(request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ handler.update_state(txn, None, request)
+ handler.state.commit()
+
+ get_payload_data(txn)[RS_CONTENT] = "new uncommitted content"
+ handler.update_state(txn, None, request)
+
+ save_multi_sig(db_manager)
+
+ # execute: get object
+ result = get_rich_schema_by_meta_handler.get_result(get_rich_schema_req)
+
+ # check
+ assert result
+ expected_data = SortedDict({
+ 'ver': op[OP_VER],
+ 'id': op[RS_ID],
+ 'rsType': op[RS_TYPE],
+ 'rsName': op[RS_NAME],
+ 'rsVersion': op[RS_VERSION],
+ 'content': op[RS_CONTENT],
+ 'from': request.identifier,
+ 'endorser': request.endorser,
+ })
+ assert SortedDict(result['data']) == expected_data
+ assert result['seqNo'] == seq_no
+ assert result['txnTime'] == txn_time
+ assert result['state_proof']
+ check_valid_proof(result)
+ path = op[RS_ID].encode()
+ assert is_proof_verified(db_manager,
+ result['state_proof'],
+ path, result[DATA], seq_no, txn_time)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_jsonld_context_handler.py b/indy_node/test/request_handlers/rich_schema/test_jsonld_context_handler.py
New file
new file mode 100644
index 00000000..88e48061
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_jsonld_context_handler.py
@@ -0,0 +1,117 @@
+import json
+
+import pytest
+
+from common.exceptions import LogicError
+from indy_common.constants import RS_CONTENT, JSON_LD_CONTEXT_FIELD, ENDORSER
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.json_ld_context_handler import \
+ JsonLdContextHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import context_request
+from indy_node.test.rich_schema.templates import W3C_BASE_CONTEXT, W3C_EXAMPLE_V1_CONTEXT
+from plenum.common.constants import TXN_TYPE, TRUSTEE
+from plenum.common.exceptions import InvalidClientRequest
+
+
+@pytest.fixture()
+def context_handler(db_manager, write_auth_req_validator):
+ return JsonLdContextHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def context_req(context_handler):
+ req = context_request()
+ add_to_idr(context_handler.database_manager.idr_cache, req.identifier, TRUSTEE)
+ add_to_idr(context_handler.database_manager.idr_cache, req.endorser, ENDORSER)
+ return req
+
+
+def test_static_validation_context_no_context_field(context_handler, context_req):
+ context_req.operation[RS_CONTENT] = json.dumps({"aaa": "2http:/..@#$"})
+ with pytest.raises(InvalidClientRequest) as e:
+ context_handler.static_validation(context_req)
+
+ assert "must contain a @context field" in str(e.value)
+
+
+def test_static_validation_context_fail_bad_uri(context_handler, context_req):
+ context_req.operation[RS_CONTENT] = json.dumps({JSON_LD_CONTEXT_FIELD: "2http:/..@#$"})
+ with pytest.raises(InvalidClientRequest) as e:
+ context_handler.static_validation(context_req)
+
+ assert "@context URI 2http:/..@#$ badly formed" in str(e.value)
+
+
+def test_static_validation_fail_context_not_uri_or_array_or_object(context_handler, context_req):
+ context_req.operation[RS_CONTENT] = json.dumps({JSON_LD_CONTEXT_FIELD: 52})
+ with pytest.raises(InvalidClientRequest) as e:
+ context_handler.static_validation(context_req)
+
+ assert "'@context' value must be url, array, or object" in str(e.value)
+
+
+def test_static_validation_pass_context_value_is_dict(context_handler, context_req):
+ context = {
+ "favoriteColor": "https://example.com/vocab#favoriteColor"
+ }
+ context_req.operation[RS_CONTENT] = json.dumps({JSON_LD_CONTEXT_FIELD: context})
+ context_handler.static_validation(context_req)
+
+
+def test_static_validation_pass_context_value_is_list_with_dict_and_uri(context_handler, context_req):
+ context = [
+ {
+ "favoriteColor": "https://example.com/vocab#favoriteColor"
+ },
+ "https://www.w3.org/ns/odrl.jsonld"
+ ]
+ context_req.operation[RS_CONTENT] = json.dumps({JSON_LD_CONTEXT_FIELD: context})
+ context_handler.static_validation(context_req)
+
+
+def test_static_validation_pass_context_w3c_example_15(context_handler, context_req):
+ context = {
+ "@context": {
+ "referenceNumber": "https://example.com/vocab#referenceNumber",
+ "favoriteFood": "https://example.com/vocab#favoriteFood"
+ }
+ }
+ context_req.operation[RS_CONTENT] = json.dumps(context)
+ context_handler.static_validation(context_req)
+
+
+def test_static_validation_fail_context_is_list_with_dict_and_bad_uri(context_handler, context_req):
+ context = [
+ {
+ "favoriteColor": "https://example.com/vocab#favoriteColor"
+ },
+ "this is a bad uri"
+ ]
+ context_req.operation[RS_CONTENT] = json.dumps({JSON_LD_CONTEXT_FIELD: context})
+ with pytest.raises(InvalidClientRequest) as e:
+ context_handler.static_validation(context_req)
+
+ assert "@context URI this is a bad uri badly formed" in str(e.value)
+
+
+def test_static_validation_pass_context_w3c_base(context_handler, context_req):
+ # Sample from specification: https://w3c.github.io/vc-data-model/#base-context
+ # Actual file contents from: https://www.w3.org/2018/credentials/v1
+ context_req.operation[RS_CONTENT] = json.dumps(W3C_BASE_CONTEXT)
+ context_handler.static_validation(context_req)
+
+
+def test_static_validation_pass_context_w3c_examples_v1(context_handler, context_req):
+ # test for https://www.w3.org/2018/credentials/examples/v1
+ context_req.operation[RS_CONTENT] = json.dumps(W3C_EXAMPLE_V1_CONTEXT)
+ context_handler.static_validation(context_req)
+
+
+def test_static_validation_fail_invalid_type(context_handler, context_req):
+ context_req.operation[TXN_TYPE] = "201"
+ with pytest.raises(LogicError):
+ context_handler.static_validation(context_req)
+
+
+def test_schema_dynamic_validation_passes(context_handler, context_req):
+ context_handler.dynamic_validation(context_req, 0)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_rich_schema_cred_def_handler.py b/indy_node/test/request_handlers/rich_schema/test_rich_schema_cred_def_handler.py
New file
new file mode 100644
index 00000000..6489d6fa
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_rich_schema_cred_def_handler.py
@@ -0,0 +1,145 @@
+import copy
+import json
+
+import pytest
+
+from indy_common.constants import RS_CONTENT, ENDORSER, RS_ID, RS_CRED_DEF_SCHEMA, RS_CRED_DEF_MAPPING
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_cred_def_handler import \
+ RichSchemaCredDefHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_handler import RichSchemaHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_mapping_handler import \
+ RichSchemaMappingHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import rich_schema_request, rich_schema_cred_def_request, \
+ rich_schema_mapping_request, make_rich_schema_object_exist
+from plenum.common.constants import TRUSTEE
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.common.util import randomString
+
+
+@pytest.fixture()
+def cred_def_handler(db_manager, write_auth_req_validator):
+ return RichSchemaCredDefHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def rich_schema_handler(db_manager, write_auth_req_validator):
+ return RichSchemaHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def mapping_handler(db_manager, write_auth_req_validator):
+ return RichSchemaMappingHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def rich_schema_req():
+ return rich_schema_request()
+
+
+@pytest.fixture()
+def mapping_req():
+ return rich_schema_mapping_request()
+
+
+@pytest.fixture()
+def cred_def_req(rich_schema_handler, mapping_handler, rich_schema_req, mapping_req):
+ make_rich_schema_object_exist(rich_schema_handler, rich_schema_req)
+ make_rich_schema_object_exist(mapping_handler, mapping_req)
+
+ req = rich_schema_cred_def_request()
+
+ content = copy.deepcopy(json.loads(req.operation[RS_CONTENT]))
+ content[RS_CRED_DEF_SCHEMA] = rich_schema_req.operation[RS_ID]
+ content[RS_CRED_DEF_MAPPING] = mapping_req.operation[RS_ID]
+ req.operation[RS_CONTENT] = json.dumps(content)
+
+ add_to_idr(rich_schema_handler.database_manager.idr_cache, req.identifier, TRUSTEE)
+ add_to_idr(rich_schema_handler.database_manager.idr_cache, req.endorser, ENDORSER)
+
+ return req
+
+
+def test_static_validation_pass(cred_def_handler, cred_def_req):
+ cred_def_handler.static_validation(cred_def_req)
+
+
+@pytest.mark.parametrize('missing_field', ['signatureType', 'mapping', 'schema', 'publicKey'])
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_no_field(cred_def_handler, cred_def_req, missing_field, status):
+ content = copy.deepcopy(json.loads(cred_def_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop(missing_field, None)
+ elif status == 'empty':
+ content[missing_field] = ""
+ elif status == 'none':
+ content[missing_field] = None
+ cred_def_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'{}' must be set in 'content'".format(missing_field)):
+ cred_def_handler.static_validation(cred_def_req)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_no_all_fields(cred_def_handler, cred_def_req, status):
+ content = copy.deepcopy(json.loads(cred_def_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop('signatureType', None)
+ content.pop('mapping', None)
+ content.pop('schema', None)
+ content.pop('publicKey', None)
+ elif status == 'empty':
+ content['signatureType'] = ""
+ content['mapping'] = ""
+ content['schema'] = ""
+ content['publicKey'] = ""
+ elif status == 'none':
+ content['signatureType'] = None
+ content['mapping'] = None
+ content['schema'] = None
+ content['publicKey'] = None
+ cred_def_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'signatureType', 'mapping', 'schema', 'publicKey' must be set in 'content'"):
+ cred_def_handler.static_validation(cred_def_req)
+
+def test_dynamic_validation_passes(cred_def_handler, cred_def_req):
+ cred_def_handler.dynamic_validation(cred_def_req, 0)
+
+
+@pytest.mark.parametrize('field', [RS_CRED_DEF_SCHEMA, RS_CRED_DEF_MAPPING])
+def test_dynamic_validation_not_existent_ref(cred_def_handler, cred_def_req,
+ field):
+ content = copy.deepcopy(json.loads(cred_def_req.operation[RS_CONTENT]))
+ wrong_id = randomString()
+ content[field] = wrong_id
+ cred_def_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="Can not find a referenced '{}' with id={}; please make sure that it has been added to the ledger".format(
+ field, wrong_id)):
+ cred_def_handler.dynamic_validation(cred_def_req, 0)
+
+
+def test_dynamic_validation_not_schema_in_schema_field(cred_def_handler, cred_def_req,
+ mapping_req):
+ content = copy.deepcopy(json.loads(cred_def_req.operation[RS_CONTENT]))
+ content[RS_CRED_DEF_SCHEMA] = mapping_req.operation[RS_ID]
+ cred_def_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'schema' field must reference a schema with rsType=sch"):
+ cred_def_handler.dynamic_validation(cred_def_req, 0)
+
+
+def test_dynamic_validation_not_mapping_in_mapping_field(cred_def_handler, cred_def_req,
+ rich_schema_req):
+ content = copy.deepcopy(json.loads(cred_def_req.operation[RS_CONTENT]))
+ content[RS_CRED_DEF_MAPPING] = rich_schema_req.operation[RS_ID]
+ cred_def_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'mapping' field must reference a mapping with rsType=map"):
+ cred_def_handler.dynamic_validation(cred_def_req, 0)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_rich_schema_encoding_handler.py b/indy_node/test/request_handlers/rich_schema/test_rich_schema_encoding_handler.py
New file
new file mode 100644
index 00000000..e3bf1125
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_rich_schema_encoding_handler.py
@@ -0,0 +1,143 @@
+import copy
+import json
+
+import pytest
+
+from indy_common.constants import RS_CONTENT, ENDORSER, RS_ENC_ALGORITHM
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_encoding_handler import \
+ RichSchemaEncodingHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import rich_schema_encoding_request
+from plenum.common.constants import TRUSTEE
+from plenum.common.exceptions import InvalidClientRequest
+
+
+@pytest.fixture()
+def encoding_handler(db_manager, write_auth_req_validator):
+ return RichSchemaEncodingHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def encoding_req(encoding_handler):
+ req = rich_schema_encoding_request()
+ add_to_idr(encoding_handler.database_manager.idr_cache, req.identifier, TRUSTEE)
+ add_to_idr(encoding_handler.database_manager.idr_cache, req.endorser, ENDORSER)
+ return req
+
+
+@pytest.mark.parametrize('missing_field', ['input', 'output', 'algorithm', 'testVectors'])
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_no_field(encoding_handler, encoding_req, missing_field, status):
+ content = copy.deepcopy(json.loads(encoding_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop(missing_field, None)
+ elif status == 'empty':
+ content[missing_field] = {}
+ elif status == 'none':
+ content[missing_field] = None
+ encoding_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'{}' must be set in 'content'".format(missing_field)):
+ encoding_handler.static_validation(encoding_req)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_no_all_fields(encoding_handler, encoding_req, status):
+ content = copy.deepcopy(json.loads(encoding_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop('input', None)
+ content.pop('output', None)
+ content.pop('algorithm', None)
+ content.pop('testVectors', None)
+ elif status == 'empty':
+ content['input'] = {}
+ content['output'] = {}
+ content['algorithm'] = {}
+ content['testVectors'] = {}
+ elif status == 'none':
+ content['input'] = None
+ content['output'] = None
+ content['algorithm'] = None
+ content['testVectors'] = None
+ encoding_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'input', 'output', 'algorithm', 'testVectors' must be set in 'content'"):
+ encoding_handler.static_validation(encoding_req)
+
+
+@pytest.mark.parametrize('missing_field', ['id', 'type'])
+@pytest.mark.parametrize('input_output', ['input', 'output'])
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_input_output(encoding_handler, encoding_req, missing_field, input_output, status):
+ content = copy.deepcopy(json.loads(encoding_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content[input_output].pop(missing_field, None)
+ elif status == 'empty':
+ content[input_output][missing_field] = {}
+ elif status == 'none':
+ content[input_output][missing_field] = None
+ encoding_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'{}' must be set in '{}'".format(missing_field, input_output)):
+ encoding_handler.static_validation(encoding_req)
+
+
+@pytest.mark.parametrize('input_output', ['input', 'output'])
+@pytest.mark.parametrize('status', ['empty', 'none'])
+def test_static_validation_input_output_all_missing(encoding_handler, encoding_req, input_output,
+ status):
+ content = copy.deepcopy(json.loads(encoding_req.operation[RS_CONTENT]))
+ if status == 'empty':
+ content[input_output]['id'] = {}
+ content[input_output]['type'] = {}
+ elif status == 'none':
+ content[input_output]['id'] = None
+ content[input_output]['type'] = None
+ encoding_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'id' and 'type' must be set in '{}'".format(input_output)):
+ encoding_handler.static_validation(encoding_req)
+
+
+@pytest.mark.parametrize('missing_field', ['description', 'documentation', 'implementation'])
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_algorithm(encoding_handler, encoding_req, missing_field, status):
+ content = copy.deepcopy(json.loads(encoding_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content[RS_ENC_ALGORITHM].pop(missing_field, None)
+ elif status == 'empty':
+ content[RS_ENC_ALGORITHM][missing_field] = {}
+ elif status == 'none':
+ content[RS_ENC_ALGORITHM][missing_field] = None
+ encoding_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'{}' must be set in '{}'".format(missing_field, RS_ENC_ALGORITHM)):
+ encoding_handler.static_validation(encoding_req)
+
+
+@pytest.mark.parametrize('status', ['empty', 'none'])
+def test_static_validation_algorithm_all_missing(encoding_handler, encoding_req, status):
+ content = copy.deepcopy(json.loads(encoding_req.operation[RS_CONTENT]))
+ if status == 'empty':
+ content[RS_ENC_ALGORITHM]['description'] = {}
+ content[RS_ENC_ALGORITHM]['documentation'] = {}
+ content[RS_ENC_ALGORITHM]['implementation'] = {}
+ elif status == 'none':
+ content[RS_ENC_ALGORITHM]['description'] = None
+ content[RS_ENC_ALGORITHM]['documentation'] = None
+ content[RS_ENC_ALGORITHM]['implementation'] = None
+ encoding_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'description', 'documentation', 'implementation' must be set in '{}'".format(
+ RS_ENC_ALGORITHM)):
+ encoding_handler.static_validation(encoding_req)
+
+
+def test_dynamic_validation_passes(encoding_handler, encoding_req):
+ encoding_handler.dynamic_validation(encoding_req, 0)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_rich_schema_handler.py b/indy_node/test/request_handlers/rich_schema/test_rich_schema_handler.py
New file
new file mode 100644
index 00000000..8704f7e8
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_rich_schema_handler.py
@@ -0,0 +1,24 @@
+import pytest
+
+from indy_common.constants import ENDORSER
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_handler import RichSchemaHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import rich_schema_request
+from plenum.common.constants import TRUSTEE
+
+
+@pytest.fixture()
+def rich_schema_handler(db_manager, write_auth_req_validator):
+ return RichSchemaHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def rich_schema_req(rich_schema_handler):
+ req = rich_schema_request()
+ add_to_idr(rich_schema_handler.database_manager.idr_cache, req.identifier, TRUSTEE)
+ add_to_idr(rich_schema_handler.database_manager.idr_cache, req.endorser, ENDORSER)
+ return req
+
+
+def test_schema_dynamic_validation_passes(rich_schema_handler, rich_schema_req):
+ rich_schema_handler.dynamic_validation(rich_schema_req, 0)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_rich_schema_mapping_handler.py b/indy_node/test/request_handlers/rich_schema/test_rich_schema_mapping_handler.py
New file
new file mode 100644
index 00000000..a80613a0
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_rich_schema_mapping_handler.py
@@ -0,0 +1,382 @@
+import copy
+import json
+from functools import reduce
+from operator import getitem
+
+import pytest
+
+from indy_common.constants import RS_CONTENT, ENDORSER, RS_MAPPING_SCHEMA, RS_ID, \
+ RICH_SCHEMA_MAPPING, RS_MAPPING_TYPE_VALUE, RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE, RICH_SCHEMA, \
+ RS_SCHEMA_TYPE_VALUE, RS_MAPPING_ENC, RS_MAPPING_RANK, RS_MAPPING_ATTRIBUTES
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_encoding_handler import \
+ RichSchemaEncodingHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_handler import RichSchemaHandler
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_mapping_handler import \
+ RichSchemaMappingHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import make_rich_schema_object_exist, rs_req
+from indy_node.test.rich_schema.templates import RICH_SCHEMA_ENCODING_EX1, RICH_SCHEMA_EX1
+from plenum.common.constants import TRUSTEE
+from plenum.common.exceptions import InvalidClientRequest
+from plenum.common.util import randomString
+
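+# Note: TEST_MAPPING uses ranks 1..7 exactly once across the mapped attributes
+# (attr2: 1,2; attr1: 3; attr5: 4; attr6: 5; issuanceDate: 6; issuer: 7);
+# the rank-sequence tests below rely on this.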
+TEST_MAPPING = {
+ '@id': "did:sov:8a9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "schema": "did:sov:4e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "attributes": {
+ "attr1": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 3
+ }
+ ],
+ "attr2": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 2
+ },
+ {
+ "enc": "did:sov:2x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 1
+ },
+ ],
+ "attr3": {
+ "attr4": [
+ {
+ "attr5": [
+ {
+ "enc": "did:sov:3x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 4
+ }
+ ]
+ },
+ {
+ "attr6": [
+ {
+ "enc": "did:sov:3x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 5
+ }
+ ]
+ },
+ ]
+ },
+ "issuer": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 7
+ }
+ ],
+ "issuanceDate": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 6
+ }
+ ],
+ }
+}
+
+TEST_ENCODING_1 = rs_req(RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE,
+ content=RICH_SCHEMA_ENCODING_EX1, id="did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD")
+
+TEST_ENCODING_2 = rs_req(RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE,
+ content=RICH_SCHEMA_ENCODING_EX1, id="did:sov:2x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD")
+
+TEST_ENCODING_3 = rs_req(RICH_SCHEMA_ENCODING, RS_ENCODING_TYPE_VALUE,
+ content=RICH_SCHEMA_ENCODING_EX1, id="did:sov:3x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD")
+
+
+@pytest.fixture()
+def mapping_handler(db_manager, write_auth_req_validator):
+ return RichSchemaMappingHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def rich_schema_handler(db_manager, write_auth_req_validator):
+ return RichSchemaHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def encoding_handler(db_manager, write_auth_req_validator):
+ return RichSchemaEncodingHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def rich_schema_req():
+ id = "did:sov:4e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD"
+ content = copy.deepcopy(RICH_SCHEMA_EX1)
+ content['@id'] = id
+ return rs_req(RICH_SCHEMA, RS_SCHEMA_TYPE_VALUE,
+ content=content, id=id)
+
+
+@pytest.fixture()
+def mapping_req(rich_schema_handler, encoding_handler, rich_schema_req):
+ make_rich_schema_object_exist(rich_schema_handler, rich_schema_req)
+ make_rich_schema_object_exist(encoding_handler, TEST_ENCODING_1)
+ make_rich_schema_object_exist(encoding_handler, TEST_ENCODING_2)
+ make_rich_schema_object_exist(encoding_handler, TEST_ENCODING_3)
+
+ id = randomString()
+ content = copy.deepcopy(TEST_MAPPING)
+ content['@id'] = id
+ req = rs_req(RICH_SCHEMA_MAPPING, RS_MAPPING_TYPE_VALUE,
+ content=content, id=id)
+
+ add_to_idr(rich_schema_handler.database_manager.idr_cache, req.identifier, TRUSTEE)
+ add_to_idr(rich_schema_handler.database_manager.idr_cache, req.endorser, ENDORSER)
+
+ return req
+
+
+def test_static_validation_pass(mapping_handler, mapping_req):
+ mapping_handler.static_validation(mapping_req)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+@pytest.mark.parametrize('missing_field', ['schema', 'attributes'])
+def test_static_validation_fail_no_schema_or_attribute(mapping_handler, mapping_req, status, missing_field):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop(missing_field, None)
+ elif status == 'empty':
+ content[missing_field] = ""
+ elif status == 'none':
+ content[missing_field] = None
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'{}' must be set in 'content'".format(missing_field)):
+ mapping_handler.static_validation(mapping_req)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_fail_no_schema_and_attributes(mapping_handler, mapping_req, status):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content.pop('attributes', None)
+ content.pop('schema', None)
+ elif status == 'empty':
+ content['attributes'] = {}
+ content['schema'] = {}
+ elif status == 'none':
+ content['attributes'] = None
+ content['schema'] = None
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'schema' and 'attributes' must be set in 'content'"):
+ mapping_handler.static_validation(mapping_req)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+@pytest.mark.parametrize('missing_field', ['issuer', 'issuanceDate'])
+def test_static_validation_fail_no_issuer_or_issuance_date(mapping_handler, mapping_req, status, missing_field):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content[RS_MAPPING_ATTRIBUTES].pop(missing_field, None)
+ elif status == 'empty':
+ content[RS_MAPPING_ATTRIBUTES][missing_field] = {}
+ elif status == 'none':
+ content[RS_MAPPING_ATTRIBUTES][missing_field] = None
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'{}' must be in content's 'attributes'".format(missing_field)):
+ mapping_handler.static_validation(mapping_req)
+
+
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+def test_static_validation_fail_no_issuance_date_and_issuer(mapping_handler, mapping_req, status):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ if status == 'missing':
+ content[RS_MAPPING_ATTRIBUTES].pop('issuanceDate', None)
+ content[RS_MAPPING_ATTRIBUTES].pop('issuer', None)
+ elif status == 'empty':
+ content[RS_MAPPING_ATTRIBUTES]['issuanceDate'] = {}
+ content[RS_MAPPING_ATTRIBUTES]['issuer'] = {}
+ elif status == 'none':
+ content[RS_MAPPING_ATTRIBUTES]['issuanceDate'] = None
+ content[RS_MAPPING_ATTRIBUTES]['issuer'] = None
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'issuer' and 'issuanceDate' must be in content's 'attributes'"):
+ mapping_handler.static_validation(mapping_req)
+
+
+def test_schema_dynamic_validation_passes(mapping_handler, mapping_req):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+def test_dynamic_validation_not_existent_schema(mapping_handler, mapping_req):
+ schema_id = randomString()
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ content[RS_MAPPING_SCHEMA] = schema_id
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match='Can not find a schema with id={}; please make sure that it has been added to the ledger'.format(
+ schema_id)):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+def test_dynamic_validation_not_schema_in_schema_field(mapping_handler, mapping_req):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ content[RS_MAPPING_SCHEMA] = TEST_ENCODING_1.operation[RS_ID]
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'schema' field must reference a schema with rsType=sch"):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+def get_mapping_attr_value(keys, mapping_content):
+ return reduce(getitem, keys, mapping_content)
+
+
+# a test against TEST_MAPPING
+@pytest.mark.parametrize('enc_path, index', [
+ (['attr1'], 0),
+ (['attr2'], 0),
+ (['attr2'], 1),
+ (['attr3', 'attr4', 0, 'attr5'], 0),
+ (['attr3', 'attr4', 1, 'attr6'], 0)
+])
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none'])
+@pytest.mark.parametrize('missing_field', [RS_MAPPING_ENC, RS_MAPPING_RANK])
+def test_dynamic_validation_empty_field_in_encoding_desc(mapping_handler, mapping_req,
+ enc_path, index, status, missing_field):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ enc_dict = get_mapping_attr_value(enc_path, content[RS_MAPPING_ATTRIBUTES])[index]
+ if status == 'missing':
+ enc_dict.pop(missing_field, None)
+ elif status == 'empty':
+ enc_dict[missing_field] = ""
+ elif status == 'none':
+ enc_dict[missing_field] = None
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="{} must be set for the attribute '{}'".format(missing_field, enc_path[-1])):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+# a test against TEST_MAPPING
+@pytest.mark.parametrize('enc_path, index', [
+ (['attr1'], 0),
+ (['attr2'], 0),
+ (['attr2'], 1),
+ (['attr3', 'attr4', 0, 'attr5'], 0),
+ (['attr3', 'attr4', 1, 'attr6'], 0)
+])
+@pytest.mark.parametrize('status', ['missing', 'empty', 'none', 'not_a_dict'])
+def test_dynamic_validation_empty_encoding_desc(mapping_handler, mapping_req,
+ enc_path, index, status):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ enc_list = get_mapping_attr_value(enc_path, content[RS_MAPPING_ATTRIBUTES])
+ enc_dict = enc_list[index]
+ if status == 'missing':
+ enc_dict.pop(RS_MAPPING_ENC, None)
+ enc_dict.pop(RS_MAPPING_RANK, None)
+ elif status == 'empty':
+ enc_dict[RS_MAPPING_ENC] = ""
+ enc_dict[RS_MAPPING_RANK] = ""
+ elif status == 'none':
+ enc_dict[RS_MAPPING_ENC] = None
+ enc_dict[RS_MAPPING_RANK] = None
+ elif status == 'not_a_dict':
+ enc_list[index] = "aaaa"
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="enc and rank must be set for the attribute '{}'".format(enc_path[-1])):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+# a test against TEST_MAPPING
+@pytest.mark.parametrize('enc_path, index', [
+ (['attr1'], 0),
+ (['attr2'], 0),
+ (['attr2'], 1),
+ (['attr3', 'attr4', 0, 'attr5'], 0),
+ (['attr3', 'attr4', 1, 'attr6'], 0)
+])
+def test_dynamic_validation_not_existent_encoding(mapping_handler, mapping_req,
+ enc_path, index):
+ wrong_id = randomString()
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ enc_dict = get_mapping_attr_value(enc_path, content[RS_MAPPING_ATTRIBUTES])[index]
+ enc_dict[RS_MAPPING_ENC] = wrong_id
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="Can not find a referenced 'enc' with id={} in the '{}' attribute; please make sure that it has been added to the ledger".format(
+ wrong_id, enc_path[-1])):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+# a test against TEST_MAPPING
+@pytest.mark.parametrize('enc_path, index', [
+ (['attr1'], 0),
+ (['attr2'], 0),
+ (['attr2'], 1),
+ (['attr3', 'attr4', 0, 'attr5'], 0),
+ (['attr3', 'attr4', 1, 'attr6'], 0)
+])
+def test_dynamic_validation_not_encoding_in_enc_field(mapping_handler, mapping_req,
+ rich_schema_req,
+ enc_path, index):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ enc_dict = get_mapping_attr_value(enc_path, content[RS_MAPPING_ATTRIBUTES])[index]
+ enc_dict[RS_MAPPING_ENC] = rich_schema_req.operation[RS_ID]
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="'enc' field in the '{}' attribute must reference an encoding with rsType=enc".format(
+ enc_path[-1])):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+# a test against TEST_MAPPING
+@pytest.mark.parametrize('enc_path, index', [
+ (['attr1'], 0),
+ (['attr2'], 0),
+ (['attr2'], 1),
+ (['attr3', 'attr4', 0, 'attr5'], 0),
+ (['attr3', 'attr4', 1, 'attr6'], 0),
+ (['issuer'], 0),
+ (['issuanceDate'], 0)
+])
+@pytest.mark.parametrize('rank_value', [0, 8, 9, 12, -1])
+def test_dynamic_validation_rank_sequence(mapping_handler, mapping_req,
+ rich_schema_req,
+ enc_path, index, rank_value):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ enc_dict = get_mapping_attr_value(enc_path, content[RS_MAPPING_ATTRIBUTES])[index]
+ enc_dict[RS_MAPPING_RANK] = rank_value
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="the attribute's ranks are not sequential: expected ranks are all values from 1 to 7"):
+ mapping_handler.dynamic_validation(mapping_req, 0)
+
+
+@pytest.mark.parametrize('enc_path, index', [
+ (['attr1'], 0),
+ (['attr2'], 0),
+ (['attr2'], 1),
+])
+@pytest.mark.parametrize('rank_value', [4, 5, 6, 7])
+def test_dynamic_validation_rank_same_rank(mapping_handler, mapping_req,
+ rich_schema_req,
+ enc_path, index, rank_value):
+ content = copy.deepcopy(json.loads(mapping_req.operation[RS_CONTENT]))
+ enc_dict = get_mapping_attr_value(enc_path, content[RS_MAPPING_ATTRIBUTES])[index]
+ enc_dict[RS_MAPPING_RANK] = rank_value
+ mapping_req.operation[RS_CONTENT] = json.dumps(content)
+
+ with pytest.raises(InvalidClientRequest,
+ match="the attribute's ranks are not sequential: expected ranks are all values from 1 to 7"):
+ mapping_handler.dynamic_validation(mapping_req, 0)
* [X] diff --git a/indy_node/test/request_handlers/rich_schema/test_rich_schema_pres_def_handler.py b/indy_node/test/request_handlers/rich_schema/test_rich_schema_pres_def_handler.py
New file
new file mode 100644
index 00000000..69b0302b
--- /dev/null
+++ b/indy_node/test/request_handlers/rich_schema/test_rich_schema_pres_def_handler.py
@@ -0,0 +1,25 @@
+import pytest
+
+from indy_common.constants import ENDORSER
+from indy_node.server.request_handlers.domain_req_handlers.rich_schema.rich_schema_pres_def_handler import \
+ RichSchemaPresDefHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_node.test.request_handlers.rich_schema.helper import rich_schema_pres_def_request
+from plenum.common.constants import TRUSTEE
+
+
+@pytest.fixture()
+def pres_def_handler(db_manager, write_auth_req_validator):
+ return RichSchemaPresDefHandler(db_manager, write_auth_req_validator)
+
+
+@pytest.fixture()
+def pres_def_req(pres_def_handler):
+ req = rich_schema_pres_def_request()
+ add_to_idr(pres_def_handler.database_manager.idr_cache, req.identifier, TRUSTEE)
+ add_to_idr(pres_def_handler.database_manager.idr_cache, req.endorser, ENDORSER)
+ return req
+
+
+def test_schema_dynamic_validation_passes(pres_def_handler, pres_def_req):
+ pres_def_handler.dynamic_validation(pres_def_req, 0)
* [X] diff --git a/indy_node/test/request_handlers/test_claim_def_handler.py b/indy_node/test/request_handlers/test_claim_def_handler.py
Only merges
index 41c4d0d7..4ec25e35 100644
--- a/indy_node/test/request_handlers/test_claim_def_handler.py
+++ b/indy_node/test/request_handlers/test_claim_def_handler.py
@@ -58,7 +58,7 @@ def test_claim_def_dynamic_validation_without_schema(claim_def_request,
with pytest.raises(InvalidClientRequest) as e:
claim_def_handler.dynamic_validation(claim_def_request, 0)
assert "Mentioned seqNo ({}) doesn't exist.".format(claim_def_request.operation[REF]) \
- in e._excinfo[1].args[0]
+ in e._excinfo[1].reason
def test_claim_def_dynamic_validation_for_new_claim_def(claim_def_request, schema,
@@ -104,7 +104,7 @@ def test_claim_def_dynamic_validation_without_ref_to_not_schema(claim_def_reques
with pytest.raises(InvalidClientRequest) as e:
claim_def_handler.dynamic_validation(claim_def_request, 0)
assert "Mentioned seqNo ({}) isn't seqNo of the schema.".format(claim_def_request.operation[REF]) \
- in e._excinfo[1].args[0]
+ in e._excinfo[1].reason
def test_update_state(claim_def_request, claim_def_handler: ClaimDefHandler, schema_handler, schema_request):
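Both hunks here just switch the assertions from e._excinfo[1].args[0] to e._excinfo[1].reason. A minimal sketch of the updated pattern, reusing the fixtures from the test above (illustration only, not part of the diff):

    with pytest.raises(InvalidClientRequest) as e:
        claim_def_handler.dynamic_validation(claim_def_request, 0)
    # the validation message is now read from the exception's reason attribute
    assert "Mentioned seqNo" in e._excinfo[1].reason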
* [X] diff --git a/indy_node/test/request_handlers/test_context_handler.py b/indy_node/test/request_handlers/test_context_handler.py
deleted file mode 100644
index 9e752b88..00000000
--- a/indy_node/test/request_handlers/test_context_handler.py
+++ /dev/null
@@ -1,251 +0,0 @@
-
-import pytest
-
-from indy_node.test.request_handlers.helper import add_to_idr
-from plenum.common.constants import DATA, TRUSTEE
-
-from indy_common.authorize.auth_constraints import AuthConstraintForbidden
-from plenum.common.exceptions import UnauthorizedClientRequest, InvalidClientRequest
-
-from indy_common.req_utils import get_write_context_name, get_write_context_version, get_txn_context_meta, \
- get_txn_context_data
-from plenum.common.txn_util import get_request_data, reqToTxn, append_txn_metadata
-
-from common.exceptions import LogicError
-from indy_common.constants import CONTEXT_TYPE, META, RS_TYPE, CONTEXT_CONTEXT
-
-from indy_node.server.request_handlers.domain_req_handlers.context_handler import ContextHandler
-from plenum.common.request import Request
-from indy_node.test.context.helper import W3C_BASE_CONTEXT, W3C_EXAMPLE_V1_CONTEXT
-from plenum.server.request_handlers.utils import encode_state_value
-
-
-def test_static_validation_context_fail_bad_uri():
- context = "2http:/..@#$"
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: context
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- with pytest.raises(InvalidClientRequest) as e:
- ch.static_validation(req)
- assert "@context URI 2http:/..@#$ badly formed" in str(e.value)
-
-
-def test_static_validation_fail_context_not_uri_or_array_or_object():
- context = 52
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: context
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- with pytest.raises(InvalidClientRequest) as e:
- ch.static_validation(req)
- assert "'@context' value must be url, array, or object" in str(e.value)
-
-
-def test_static_validation_pass_context_value_is_dict():
- context = {
- "favoriteColor": "https://example.com/vocab#favoriteColor"
- }
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: context
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- ch.static_validation(req)
-
-
-def test_static_validation_pass_context_value_is_list_with_dict_and_uri():
- context = [
- {
- "favoriteColor": "https://example.com/vocab#favoriteColor"
- },
- "https://www.w3.org/ns/odrl.jsonld"
- ]
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: context
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- ch.static_validation(req)
-
-
-def test_static_validation_pass_context_w3c_example_15():
- context = {
- "@context": {
- "referenceNumber": "https://example.com/vocab#referenceNumber",
- "favoriteFood": "https://example.com/vocab#favoriteFood"
- }
- }
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: context
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- ch.static_validation(req)
-
-
-def test_static_validation_fail_context_is_list_with_dict_and_bad_uri():
- context = [
- {
- "favoriteColor": "https://example.com/vocab#favoriteColor"
- },
- "this is a bad uri"
- ]
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: context
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- with pytest.raises(InvalidClientRequest) as e:
- ch.static_validation(req)
- assert "@context URI this is a bad uri badly formed" in str(e.value)
-
-
-def test_static_validation_pass_context_w3c_base():
- # Sample from specification: https://w3c.github.io/vc-data-model/#base-context
- # Actual file contents from: https://www.w3.org/2018/credentials/v1
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: W3C_BASE_CONTEXT
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- ch.static_validation(req)
-
-
-def test_static_validation_pass_context_w3c_examples_v1():
- # test for https://www.w3.org/2018/credentials/examples/v1
- operation = {
- META: {
- "name": "TestContext",
- "version": 1,
- "type": CONTEXT_TYPE
- },
- DATA: {
- CONTEXT_CONTEXT: W3C_EXAMPLE_V1_CONTEXT
- },
- RS_TYPE: "200"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- ch.static_validation(req)
-
-
-def test_static_validation_fail_invalid_type():
- operation = {
- "meta": {
- "type": "context",
- "name": "TestContext",
- "version": 1
- },
- "data": W3C_BASE_CONTEXT,
- "type": "201"
- }
- req = Request("test", 1, operation, "sig",)
- ch = ContextHandler(None, None)
- with pytest.raises(LogicError):
- ch.static_validation(req)
-
-
-def test_static_validation_fail_no_type(context_handler, context_request):
- del context_request.operation['type']
- with pytest.raises(LogicError):
- context_handler.static_validation(context_request)
-
-
-def make_context_exist(context_request, context_handler):
- identifier, req_id, operation = get_request_data(context_request)
- context_name = get_write_context_name(context_request)
- context_version = get_write_context_version(context_request)
- path = ContextHandler.make_state_path_for_context(identifier, context_name, context_version)
- context_handler.state.set(path, encode_state_value("value", "seqNo", "txnTime"))
-
-
-def test_context_dynamic_validation_failed_existing_context(context_request, context_handler):
- make_context_exist(context_request, context_handler)
- with pytest.raises(UnauthorizedClientRequest, match=str(AuthConstraintForbidden())):
- context_handler.dynamic_validation(context_request, 0)
-
-
-def test_context_dynamic_validation_failed_not_authorised(context_request, context_handler):
- add_to_idr(context_handler.database_manager.idr_cache, context_request.identifier, None)
- with pytest.raises(UnauthorizedClientRequest):
- context_handler.dynamic_validation(context_request, 0)
-
-
-def test_schema_dynamic_validation_passes(context_request, context_handler):
- add_to_idr(context_handler.database_manager.idr_cache, context_request.identifier, TRUSTEE)
- context_handler.dynamic_validation(context_request, 0)
-
-
-def test_update_state(context_request, context_handler):
- seq_no = 1
- txn_time = 1560241033
- txn = reqToTxn(context_request)
- append_txn_metadata(txn, seq_no, txn_time)
- path, value_bytes = ContextHandler.prepare_context_for_state(txn)
- value = {
- META: get_txn_context_meta(txn),
- DATA: get_txn_context_data(txn)
- }
-
- context_handler.update_state(txn, None, context_request)
- assert context_handler.get_from_state(path) == (value, seq_no, txn_time)
-
* [X] diff --git a/indy_node/test/request_handlers/test_flag_handler.py b/indy_node/test/request_handlers/test_flag_handler.py
New file
new file mode 100644
index 00000000..d72c33c4
--- /dev/null
+++ b/indy_node/test/request_handlers/test_flag_handler.py
@@ -0,0 +1,375 @@
+import pytest
+
+from common.exceptions import LogicError
+from indy_node.server.request_handlers.config_req_handlers.flag_handler import (
+ FlagRequestHandler,
+)
+from indy_node.server.request_handlers.read_req_handlers.get_flag_handler import GetFlagRequestHandler
+from indy_node.test.request_handlers.helper import add_to_idr
+from indy_common.constants import (
+ CONFIG_LEDGER_ID,
+ FLAG_NAME,
+ FLAG_VALUE,
+ GET_FLAG,
+ TXN_TYPE,
+ NYM,
+ ENDORSER,
+ VERSION_ID,
+ VERSION_TIME,
+)
+from plenum.common.constants import TRUSTEE, STEWARD, DATA, STATE_PROOF, TXN_TIME
+from plenum.common.exceptions import InvalidClientRequest, UnauthorizedClientRequest
+from plenum.common.request import Request
+from plenum.common.txn_util import reqToTxn, append_txn_metadata
+from plenum.common.types import f
+
+@pytest.fixture(scope="function")
+def get_flag_request(creator):
+ return Request(identifier=creator,
+ reqId=5,
+ operation={TXN_TYPE: GET_FLAG})
+
+
+def test_config_flag_static_validation_wrong_type(
+ flag_handler: FlagRequestHandler, flag_request: Request
+):
+ with pytest.raises(LogicError):
+ flag_request.operation[TXN_TYPE] = NYM
+ flag_handler.static_validation(flag_request)
+
+
+def test_config_flag_static_validation_no_key(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ flag_request.operation[FLAG_NAME] = None
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+ with pytest.raises(InvalidClientRequest, match="Flag name is required"):
+ flag_handler.static_validation(flag_request)
+
+
+def test_config_flag_static_validation_empty_key(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ flag_request.operation[FLAG_NAME] = ""
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+ with pytest.raises(InvalidClientRequest, match="Flag name is required"):
+ flag_handler.static_validation(flag_request)
+
+
+def test_config_flag_static_validation_wrong_name_type(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ flag_request.operation[FLAG_NAME] = 123
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+ with pytest.raises(InvalidClientRequest, match="Flag name must be of type string"):
+ flag_handler.static_validation(flag_request)
+
+
+def test_config_flag_static_validation_wrong_value_type(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ flag_request.operation[FLAG_VALUE] = True
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+ with pytest.raises(InvalidClientRequest, match="Flag value must be of type string or None"):
+ flag_handler.static_validation(flag_request)
+
+
+def test_config_flag_static_validation_pass(
+ flag_handler: FlagRequestHandler, flag_request: Request
+):
+ flag_handler.static_validation(flag_request)
+
+
+def test_config_flag_dynamic_validation_authorize_not_trustee(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache, flag_request.identifier, ENDORSER
+ )
+ with pytest.raises(
+ UnauthorizedClientRequest, match=".*'Not enough TRUSTEE signatures'"
+ ):
+ flag_handler.additional_dynamic_validation(flag_request, 0)
+
+ add_to_idr(
+ flag_handler.database_manager.idr_cache, flag_request.identifier, STEWARD
+ )
+ with pytest.raises(
+ UnauthorizedClientRequest, match=".*'Not enough TRUSTEE signatures'"
+ ):
+ flag_handler.additional_dynamic_validation(flag_request, 0)
+
+
+def test_config_flag_dynamic_validation_authorize_no_nym(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ with pytest.raises(
+ UnauthorizedClientRequest, match=".* is not found in the Ledger"
+ ):
+ flag_handler.additional_dynamic_validation(flag_request, 0)
+
+
+def test_config_flag_dynamic_validation_authorize_no_permission(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ add_to_idr(flag_handler.database_manager.idr_cache, flag_request.identifier, None)
+ with pytest.raises(
+ UnauthorizedClientRequest, match=".*'Not enough TRUSTEE signatures'"
+ ):
+ flag_handler.additional_dynamic_validation(flag_request, 0)
+
+
+def test_config_flag_dynamic_validation_authorize_passes(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+ flag_handler.additional_dynamic_validation(flag_request, 0)
+
+
+def test_config_flag_update_state_no_value(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ flag_request.operation[FLAG_VALUE] = None
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 2
+ txn_time = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+
+ # lookup
+ state = get_flag_request_handler.lookup_key(key)
+ assert(state)
+ state_value = FlagRequestHandler.get_state_value(state)
+ assert(state_value is None)
+
+
+def test_config_flag_update_state_empty_value(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ flag_request.operation[FLAG_VALUE] = ""
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 2
+ txn_time = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+
+ state = get_flag_request_handler.lookup_key(key)
+ assert(state)
+ state_value = FlagRequestHandler.get_state_value(state)
+ assert(state_value == "")
+
+
+def test_config_flag_update_state_correct_value(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ val = "True"
+ flag_request.operation[FLAG_VALUE] = val
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 2
+ txn_time = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ state = get_flag_request_handler.lookup_key(key)
+ assert(state)
+ state_value = FlagRequestHandler.get_state_value(state)
+ assert(state_value == val)
+
+
+def test_config_flag_update_state_edit(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ val_old = "old_value"
+ flag_request.operation[FLAG_VALUE] = val_old
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 2
+ txn_time_old = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time_old)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ get_flag_request_handler.timestamp_store.set(txn_time_old, flag_handler.state.headHash, ledger_id=CONFIG_LEDGER_ID)
+
+ val_new = "new_value"
+ flag_request.operation[FLAG_VALUE] = val_new
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 3
+ txn_time = 1560251034
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ get_flag_request_handler.timestamp_store.set(txn_time, flag_handler.state.headHash, ledger_id=CONFIG_LEDGER_ID)
+
+ # check current state
+ state = get_flag_request_handler.lookup_key(key)
+ assert(state)
+ state_value = FlagRequestHandler.get_state_value(state)
+ assert(state_value == val_new)
+
+ # check old state
+ state = get_flag_request_handler.lookup_key(key, timestamp=txn_time_old)
+ assert(state)
+ state_value = FlagRequestHandler.get_state_value(state)
+ assert(state_value == val_old)
+
+ # check for None state before
+ state = get_flag_request_handler.lookup_key(key, timestamp=(txn_time_old - 1))
+ assert(state is None)
+
+
+def test_config_flag_get_result_exclusive(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler,
+ get_flag_request: Request
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ val = "True"
+ flag_request.operation[FLAG_VALUE] = val
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 2
+ txn_time = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+
+ state = get_flag_request_handler.lookup_key(key)
+ assert(state)
+ state_value = FlagRequestHandler.get_state_value(state)
+ assert(state_value == val)
+
+ get_flag_request.operation[FLAG_NAME] = key
+ get_flag_request.operation[VERSION_TIME] = txn_time
+ get_flag_request.operation[VERSION_ID] = seq_no
+
+ with pytest.raises(
+ InvalidClientRequest, match=".*seqNo and timestamp are mutually exclusive; only one should be specified"
+ ):
+ get_flag_request_handler.get_result(get_flag_request)
+
+
+def test_config_flag_get_state_empty_key(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ seq_no = 2
+ txn_time = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+
+ value = get_flag_request_handler.lookup_key("")
+ assert(value is None)
+
+
+def test_config_flag_get_result_valid(
+ flag_handler: FlagRequestHandler,
+ flag_request: Request,
+ get_flag_request_handler: GetFlagRequestHandler,
+ get_flag_request: Request,
+):
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ val = "Test12345"
+ flag_request.operation[FLAG_VALUE] = val
+ key = flag_request.operation.get(FLAG_NAME)
+ seq_no = 2
+ txn_time = 1560241033
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ get_flag_request_handler.timestamp_store.set(txn_time, flag_handler.state.headHash, ledger_id=CONFIG_LEDGER_ID)
+
+ get_flag_request.operation[FLAG_NAME] = key
+ result = get_flag_request_handler.get_result(get_flag_request)
+ assert(result.get(TXN_TYPE) == GET_FLAG)
+ assert(result.get(TXN_TIME) == txn_time)
+ assert(result.get(f.SEQ_NO.nm) == seq_no)
+ state_value = FlagRequestHandler.get_state_value(result.get(DATA))
+ assert(state_value == val)
+ assert(result.get(STATE_PROOF) is None)
* [ ] diff --git a/indy_node/test/request_handlers/test_nym_handler.py b/indy_node/test/request_handlers/test_nym_handler.py
* [ ] TODO: file needs further consideration — adds DIDDOC_CONTENT handling and NYM_VERSION (self-certification) validation tests; see the sketch after this diff.
index 7b34085a..f271ea76 100644
--- a/indy_node/test/request_handlers/test_nym_handler.py
+++ b/indy_node/test/request_handlers/test_nym_handler.py
@@ -1,17 +1,27 @@
-import pytest
-from indy_common.auth import Authoriser
-
-from indy_common.constants import NYM, ROLE
+from contextlib import contextmanager
+import json
+from unittest import mock
-from indy_node.server.request_handlers.domain_req_handlers.nym_handler import NymHandler
-from indy_node.test.request_handlers.helper import add_to_idr, get_exception
from ledger.util import F
-from plenum.common.constants import STEWARD, TARGET_NYM, IDENTIFIER, VERKEY
+from plenum.common.constants import IDENTIFIER, STEWARD, TARGET_NYM, TXN_TYPE, VERKEY
from plenum.common.exceptions import InvalidClientRequest, UnauthorizedClientRequest
from plenum.common.request import Request
-from plenum.common.txn_util import reqToTxn, append_txn_metadata
+from plenum.common.txn_util import append_txn_metadata, reqToTxn
from plenum.common.util import randomString
from plenum.server.request_handlers.utils import nym_to_state_key
+import pytest
+
+from indy_common.auth import Authoriser
+from indy_common.constants import (
+ DIDDOC_CONTENT,
+ NYM,
+ NYM_VERSION,
+ NYM_VERSION_CONVENTION,
+ NYM_VERSION_SELF_CERT,
+ ROLE,
+)
+from indy_node.server.request_handlers.domain_req_handlers.nym_handler import NymHandler
+from indy_node.test.request_handlers.helper import add_to_idr, get_exception
@pytest.fixture(scope="module")
@@ -19,16 +29,6 @@ def nym_handler(db_manager, tconf, write_auth_req_validator):
return NymHandler(tconf, db_manager, write_auth_req_validator)
-@pytest.fixture(scope="function")
-def nym_request(creator):
- return Request(identifier=creator,
- reqId=5,
- operation={'type': NYM,
- 'dest': randomString(),
- 'role': None,
- 'verkey': randomString()})
-
-
@pytest.fixture(scope="module")
def creator(db_manager):
identifier = randomString()
@@ -37,47 +37,132 @@ def creator(db_manager):
return identifier
+@pytest.fixture
+def doc():
+ yield {
+ "@context": [
+ "https://www.w3.org/ns/did/v1",
+ "https://identity.foundation/didcomm-messaging/service-endpoint/v1",
+ ],
+ "serviceEndpoint": [
+ {
+ "id": "did:indy:sovrin:123456#didcomm",
+ "type": "didcomm-messaging",
+ "serviceEndpoint": "https://example.com",
+ "recipientKeys": ["#verkey"],
+ "routingKeys": [],
+ }
+ ],
+ }
+
+
+@pytest.fixture
+def nym_request_factory(creator, doc):
+ def _factory(diddoc_content: dict = None, version: int = None):
+ if diddoc_content is None:
+ diddoc_content = doc
+ return Request(
+ identifier=creator,
+ reqId=5,
+ operation={
+ TXN_TYPE: NYM,
+ TARGET_NYM: "X3XUxYQM2cfkSMzfMNma73",
+ ROLE: None,
+ DIDDOC_CONTENT: json.dumps(diddoc_content),
+ VERKEY: "HNjfjoeZ7WAHYDSzWcvzyvUABepctabD7QSxopM48fYx",
+ **({NYM_VERSION: version} if version is not None else {})
+ },
+ )
+
+ yield _factory
+
+
+@pytest.fixture(scope="function")
+def nym_request(nym_request_factory):
+ yield nym_request_factory()
+
+
def test_nym_static_validation_passes(nym_request, nym_handler: NymHandler):
nym_handler.static_validation(nym_request)
-def test_nym_static_validation_failed_without_dest(nym_request, nym_handler: NymHandler):
- del nym_request.operation['dest']
+def test_nym_static_validation_failed_without_dest(
+ nym_request, nym_handler: NymHandler
+):
+ del nym_request.operation["dest"]
with pytest.raises(InvalidClientRequest):
nym_handler.static_validation(nym_request)
-def test_nym_static_validation_failed_with_none_dest(nym_request, nym_handler: NymHandler):
- nym_request.operation['dest'] = None
+def test_nym_static_validation_failed_with_none_dest(
+ nym_request, nym_handler: NymHandler
+):
+ nym_request.operation["dest"] = None
with pytest.raises(InvalidClientRequest):
nym_handler.static_validation(nym_request)
-def test_nym_static_validation_failed_with_empty_dest(nym_request, nym_handler: NymHandler):
- nym_request.operation['dest'] = ''
+def test_nym_static_validation_failed_with_empty_dest(
+ nym_request, nym_handler: NymHandler
+):
+ nym_request.operation["dest"] = ""
with pytest.raises(InvalidClientRequest):
nym_handler.static_validation(nym_request)
-def test_nym_static_validation_failed_with_spaced_dest(nym_request, nym_handler: NymHandler):
- nym_request.operation['dest'] = ' ' * 5
+def test_nym_static_validation_failed_with_spaced_dest(
+ nym_request, nym_handler: NymHandler
+):
+ nym_request.operation["dest"] = " " * 5
with pytest.raises(InvalidClientRequest):
nym_handler.static_validation(nym_request)
def test_nym_static_validation_authorized(nym_request, nym_handler: NymHandler):
for role in Authoriser.ValidRoles:
- nym_request.operation['role'] = role
+ nym_request.operation["role"] = role
+ nym_handler.static_validation(nym_request)
+
+
+def test_nym_static_validation_not_authorized_random(
+ nym_request, nym_handler: NymHandler
+):
+ nym_request.operation["role"] = randomString()
+ with pytest.raises(InvalidClientRequest):
nym_handler.static_validation(nym_request)
-def test_nym_static_validation_not_authorized_random(nym_request, nym_handler: NymHandler):
- nym_request.operation['role'] = randomString()
+def test_nym_static_validation_fails_diddoc_content_with_id(
+ nym_request_factory, doc, nym_handler: NymHandler
+):
+ doc["id"] = randomString()
+ nym_request = nym_request_factory(doc)
with pytest.raises(InvalidClientRequest):
nym_handler.static_validation(nym_request)
-def test_nym_dynamic_validation_for_new_nym(nym_request, nym_handler: NymHandler, creator):
+def test_nym_static_validation_diddoc_content_without_context(
+ nym_request_factory, doc, nym_handler: NymHandler
+):
+ del doc["@context"]
+ nym_request = nym_request_factory(doc)
+ nym_handler.static_validation(nym_request)
+
+
+def test_nym_static_validation_diddoc_content_fails_with_same_id(
+ nym_request_factory, doc, nym_handler: NymHandler
+):
+ doc["authentication"] = [{
+ "id": "did:indy:sovrin:123456#verkey"
+ }]
+ nym_request = nym_request_factory(doc)
+ with pytest.raises(InvalidClientRequest):
+ nym_handler.static_validation(nym_request)
+
+
+def test_nym_dynamic_validation_for_new_nym(
+ nym_request, nym_handler: NymHandler, creator
+):
nym_handler.write_req_validator.validate = get_exception(False)
add_to_idr(nym_handler.database_manager.idr_cache, creator, STEWARD)
nym_handler.dynamic_validation(nym_request, 0)
@@ -87,8 +172,12 @@ def test_nym_dynamic_validation_for_new_nym(nym_request, nym_handler: NymHandler
nym_handler.dynamic_validation(nym_request, 0)
-def test_nym_dynamic_validation_for_existing_nym(nym_request: Request, nym_handler: NymHandler, creator):
- add_to_idr(nym_handler.database_manager.idr_cache, nym_request.operation['dest'], None)
+def test_nym_dynamic_validation_for_existing_nym(
+ nym_request: Request, nym_handler: NymHandler, creator
+):
+ add_to_idr(
+ nym_handler.database_manager.idr_cache, nym_request.operation["dest"], None
+ )
nym_handler.write_req_validator.validate = get_exception(False)
add_to_idr(nym_handler.database_manager.idr_cache, creator, STEWARD)
nym_handler.dynamic_validation(nym_request, 0)
@@ -98,13 +187,15 @@ def test_nym_dynamic_validation_for_existing_nym(nym_request: Request, nym_handl
nym_handler.dynamic_validation(nym_request, 0)
-def test_nym_dynamic_validation_for_existing_nym_fails_with_no_changes(nym_handler: NymHandler,
- creator):
- nym_request = Request(identifier=creator,
- reqId=5,
- operation={'type': NYM,
- 'dest': randomString()})
- add_to_idr(nym_handler.database_manager.idr_cache, nym_request.operation['dest'], None)
+def test_nym_dynamic_validation_for_existing_nym_fails_with_no_changes(
+ nym_handler: NymHandler, creator
+):
+ nym_request = Request(
+ identifier=creator, reqId=5, operation={"type": NYM, "dest": randomString()}
+ )
+ add_to_idr(
+ nym_handler.database_manager.idr_cache, nym_request.operation["dest"], None
+ )
add_to_idr(nym_handler.database_manager.idr_cache, creator, STEWARD)
nym_handler.write_req_validator.validate = get_exception(True)
@@ -127,3 +218,81 @@ def test_update_state(nym_request: Request, nym_handler: NymHandler):
assert state_value[F.seqNo.name] == seq_no
assert state_value[ROLE] == nym_request.operation.get(ROLE)
assert state_value[VERKEY] == nym_request.operation.get(VERKEY)
+ assert state_value[DIDDOC_CONTENT] == nym_request.operation.get(DIDDOC_CONTENT)
+
+
+def test_fail_on_version_update(nym_request: Request, nym_handler: NymHandler, doc, nym_request_factory):
+ nym_request = nym_request_factory(doc, NYM_VERSION_SELF_CERT)
+ txn = reqToTxn(nym_request)
+ seq_no = 1
+ txn_time = 1560241033
+ append_txn_metadata(txn, seq_no, txn_time)
+ nym_data = nym_handler.update_state(txn, None, None, False)
+ with pytest.raises(InvalidClientRequest) as e:
+ nym_handler._validate_existing_nym(nym_request, nym_request.operation, nym_data)
+ e.match("Cannot set version on existing nym")
+
+
+@pytest.mark.parametrize("version", [-1, 3, 100])
+def test_fail_on_bad_version(version: int, nym_request: Request, nym_handler: NymHandler, doc, nym_request_factory):
+ nym_request = nym_request_factory(doc, version)
+ txn = reqToTxn(nym_request)
+ seq_no = 1
+ txn_time = 1560241033
+ append_txn_metadata(txn, seq_no, txn_time)
+ nym_data = nym_handler.update_state(txn, None, None, False)
+ with pytest.raises(InvalidClientRequest) as e:
+ nym_handler.static_validation(nym_request)
+ e.match("Version must be one of")
+
+
+def test_nym_validation_legacy_convention_x(nym_request: Request, nym_handler: NymHandler, doc, nym_request_factory):
+ nym_request = nym_request_factory(doc, NYM_VERSION_CONVENTION)
+ nym_request.operation[TARGET_NYM] = "NOTDERIVEDFROMVERKEY"
+ nym_request.operation[VERKEY] = "HNjfjoeZ7WAHYDSzWcvzyvUABepctabD7QSxopM48fYz"
+ txn = reqToTxn(nym_request)
+ seq_no = 1
+ txn_time = 1560241033
+ append_txn_metadata(txn, seq_no, txn_time)
+ with pytest.raises(InvalidClientRequest) as e:
+ nym_handler._validate_new_nym(nym_request, nym_request.operation)
+ e.match("Identifier with version 1")
+
+
+def test_nym_validation_legacy_convention(nym_request: Request, nym_handler: NymHandler, doc, nym_request_factory):
+ nym_request = nym_request_factory(doc, NYM_VERSION_CONVENTION)
+ nym_request.operation[TARGET_NYM] = "X3XUxYQM2cfkSMzfMNma73"
+ nym_request.operation[VERKEY] = "HNjfjoeZ7WAHYDSzWcvzyvUABepctabD7QSxopM48fYz"
+ txn = reqToTxn(nym_request)
+ seq_no = 1
+ txn_time = 1560241033
+ append_txn_metadata(txn, seq_no, txn_time)
+
+ with mock.patch.object(nym_handler, "write_req_validator"):
+ nym_handler._validate_new_nym(nym_request, nym_request.operation)
+
+
+def test_nym_validation_self_certifying_x(nym_request: Request, nym_handler: NymHandler, doc, nym_request_factory):
+ nym_request = nym_request_factory(doc, NYM_VERSION_SELF_CERT)
+ nym_request.operation[TARGET_NYM] = "NOTSELFCERTIFYING"
+ nym_request.operation[VERKEY] = "HNjfjoeZ7WAHYDSzWcvzyvUABepctabD7QSxopM48fYz"
+ txn = reqToTxn(nym_request)
+ seq_no = 1
+ txn_time = 1560241033
+ append_txn_metadata(txn, seq_no, txn_time)
+ with pytest.raises(InvalidClientRequest) as e:
+ nym_handler._validate_new_nym(nym_request, nym_request.operation)
+ e.match("Identifier with version 2")
+
+
+def test_nym_validation_self_certifying(nym_request: Request, nym_handler: NymHandler, doc, nym_request_factory):
+ nym_request = nym_request_factory(doc, NYM_VERSION_SELF_CERT)
+ nym_request.operation[TARGET_NYM] = "Dyqasan6xG5KsKdLufxCEf"
+ nym_request.operation[VERKEY] = "HNjfjoeZ7WAHYDSzWcvzyvUABepctabD7QSxopM48fYz"
+ txn = reqToTxn(nym_request)
+ seq_no = 1
+ txn_time = 1560241033
+ append_txn_metadata(txn, seq_no, txn_time)
+
+ with mock.patch.object(nym_handler, "write_req_validator"):
+ nym_handler._validate_new_nym(nym_request, nym_request.operation)
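
For reviewing the new version-2 (`NYM_VERSION_SELF_CERT`) tests above: a minimal sketch, assuming the did:indy convention that a self-certifying DID is the base58 encoding of the first 16 bytes of the 32-byte verkey. This is not the handler's actual code; the helper name and the use of the PyPI `base58` package are assumptions for illustration only.

```python
# Minimal sketch (assumption): the self-certification rule the new
# NYM_VERSION_SELF_CERT tests appear to exercise — the DID must equal
# the base58 encoding of the first 16 bytes of the verkey.
import base58  # assumption: the PyPI "base58" package is available


def is_self_certifying(did: str, verkey: str) -> bool:
    """Return True if `did` is derived from the first 16 bytes of `verkey`."""
    verkey_bytes = base58.b58decode(verkey)
    return did == base58.b58encode(verkey_bytes[:16]).decode("ascii")


# The tests above pair a passing DID and a deliberately wrong one
# ("NOTSELFCERTIFYING") with the same verkey; the real check is driven
# through NymHandler._validate_new_nym.
```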
* [X] diff --git a/indy_node/test/request_handlers/test_pool_upgrade_handler.py b/indy_node/test/request_handlers/test_pool_upgrade_handler.py
Diff is only formatting (the error-message assertions are converted to f-strings); no behavioral change.
index 57a7a87a..f844447a 100644
--- a/indy_node/test/request_handlers/test_pool_upgrade_handler.py
+++ b/indy_node/test/request_handlers/test_pool_upgrade_handler.py
@@ -96,7 +96,7 @@ def test_pool_upgrade_dynamic_validation_fails_not_installed(
lambda *x: (None, None))
with pytest.raises(InvalidClientRequest) as e:
pool_upgrade_handler.dynamic_validation(pool_upgrade_request, 0)
- e.match('{} is not installed and cannot be upgraded'.format(pool_upgrade_request.operation[PACKAGE]))
+ e.match(f'{pool_upgrade_request.operation[PACKAGE]} is not installed and cannot be upgraded')
def test_pool_upgrade_dynamic_validation_fails_not_installed_mpr(
@@ -133,7 +133,7 @@ def test_pool_upgrade_dynamic_validation_fails_belong(
lambda *x: ('1.1.1', ['some_pkg']))
with pytest.raises(InvalidClientRequest) as e:
pool_upgrade_handler.dynamic_validation(pool_upgrade_request, 0)
- e.match('{} doesn\'t belong to pool'.format(pool_upgrade_request.operation[PACKAGE]))
+ e.match(f'{pool_upgrade_request.operation[PACKAGE]} doesn\'t belong to pool')
def test_pool_upgrade_dynamic_validation_fails_upgradable(
* [X] diff --git a/indy_node/test/request_handlers/test_revoc_reg_entry_handler.py b/indy_node/test/request_handlers/test_revoc_reg_entry_handler.py
Only merges already in history; adds the compatibility-ordering flag tests (see the sorting sketch after this diff).
index 5fbc0915..e905ea5c 100644
--- a/indy_node/test/request_handlers/test_revoc_reg_entry_handler.py
+++ b/indy_node/test/request_handlers/test_revoc_reg_entry_handler.py
@@ -1,9 +1,30 @@
import pytest
-from indy_common.constants import REVOC_REG_ENTRY, REVOC_REG_DEF_ID, ISSUANCE_BY_DEFAULT, \
- VALUE, ISSUANCE_TYPE, ISSUED, REVOKED, ACCUM
-from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_def_handler import RevocRegDefHandler
-from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_entry_handler import RevocRegEntryHandler
+from indy_common.constants import (
+ FLAG_NAME,
+ FLAG_NAME_COMPAT_ORDERING,
+ FLAG_VALUE,
+ REVOC_REG_ENTRY,
+ REVOC_REG_DEF_ID,
+ ISSUANCE_BY_DEFAULT,
+ VALUE,
+ ISSUANCE_TYPE,
+ ISSUED,
+ REVOKED,
+ ACCUM,
+ REVOC_REG_DEF,
+ CRED_DEF_ID,
+ REVOC_TYPE,
+ TAG,
+ CONFIG_LEDGER_ID
+)
+from indy_common.config_util import getConfig
+from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_def_handler import (
+ RevocRegDefHandler,
+)
+from indy_node.server.request_handlers.domain_req_handlers.revoc_reg_entry_handler import (
+ RevocRegEntryHandler,
+)
from indy_node.test.request_handlers.helper import add_to_idr
from plenum.common.constants import TXN_TIME, TRUSTEE
@@ -13,72 +34,101 @@ from plenum.common.txn_util import reqToTxn, append_txn_metadata, get_payload_da
from plenum.common.types import f
from plenum.common.util import randomString
from plenum.server.request_handlers.utils import encode_state_value
+from plenum.server.node import Node
-
-@pytest.fixture(scope="module")
-def revoc_reg_entry_handler(db_manager, write_auth_req_validator):
- return RevocRegEntryHandler(db_manager, write_auth_req_validator,
- RevocRegDefHandler.get_revocation_strategy)
+@pytest.fixture(scope="function")
+def revoc_reg_entry_handler(db_manager_ts, write_auth_req_validator):
+ node = Node.__new__(Node)
+ return RevocRegEntryHandler(
+ db_manager_ts, write_auth_req_validator, RevocRegDefHandler.get_revocation_strategy, node
+ )
@pytest.fixture(scope="function")
def revoc_reg_entry_request():
identifier = randomString()
- return Request(identifier= identifier,
- reqId=5,
- operation={'type': REVOC_REG_ENTRY,
- REVOC_REG_DEF_ID: identifier,
- VALUE: {ACCUM: 5}},
- signature="randomString")
-
-
-def test_revoc_reg_entry_dynamic_validation_without_req_def(revoc_reg_entry_handler,
- revoc_reg_entry_request):
- with pytest.raises(InvalidClientRequest,
- match="There is no any REVOC_REG_DEF by path"):
+ return Request(
+ identifier=identifier,
+ reqId=5,
+ operation={
+ "type": REVOC_REG_ENTRY,
+ REVOC_REG_DEF_ID: identifier,
+ VALUE: {ACCUM: 5},
+ },
+ signature="randomString",
+ )
+
+
+def test_revoc_reg_entry_dynamic_validation_without_req_def(
+ revoc_reg_entry_handler, revoc_reg_entry_request
+):
+ with pytest.raises(
+ InvalidClientRequest, match="There is no any REVOC_REG_DEF by path"
+ ):
revoc_reg_entry_handler.dynamic_validation(revoc_reg_entry_request, 0)
-def test_revoc_reg_entry_dynamic_validation_passes(revoc_reg_entry_handler,
- revoc_reg_entry_request):
- add_to_idr(revoc_reg_entry_handler.database_manager.idr_cache,
- revoc_reg_entry_request.identifier,
- TRUSTEE)
+def test_revoc_reg_entry_dynamic_validation_passes(
+ revoc_reg_entry_handler, revoc_reg_entry_request
+):
+ add_to_idr(
+ revoc_reg_entry_handler.database_manager.idr_cache,
+ revoc_reg_entry_request.identifier,
+ TRUSTEE,
+ )
- revoc_reg_entry_handler.state.set(revoc_reg_entry_request.operation[REVOC_REG_DEF_ID].encode(),
- encode_state_value({VALUE: {ISSUANCE_TYPE: ISSUANCE_BY_DEFAULT}},
- "seqNo", "txnTime"))
+ revoc_reg_entry_handler.state.set(
+ revoc_reg_entry_request.operation[REVOC_REG_DEF_ID].encode(),
+ encode_state_value(
+ {VALUE: {ISSUANCE_TYPE: ISSUANCE_BY_DEFAULT}}, "seqNo", "txnTime"
+ ),
+ )
revoc_reg_entry_handler.dynamic_validation(revoc_reg_entry_request, 0)
-def test_revoc_reg_entry_dynamic_validation_fail_in_strategy(revoc_reg_entry_handler,
- revoc_reg_entry_request):
- add_to_idr(revoc_reg_entry_handler.database_manager.idr_cache,
- revoc_reg_entry_request.identifier,
- TRUSTEE)
- revoc_reg_entry_handler.state.set(revoc_reg_entry_request.operation[REVOC_REG_DEF_ID].encode(),
- encode_state_value({VALUE: {ISSUANCE_TYPE: ISSUANCE_BY_DEFAULT}},
- "seqNo", "txnTime"))
- revoc_reg_entry_request.operation[VALUE] = {ISSUED: [1],
- REVOKED: [1]}
- with pytest.raises(InvalidClientRequest, match="Can not have an index in both "
- "'issued' and 'revoked' lists"):
+def test_revoc_reg_entry_dynamic_validation_fail_in_strategy(
+ revoc_reg_entry_handler, revoc_reg_entry_request
+):
+ add_to_idr(
+ revoc_reg_entry_handler.database_manager.idr_cache,
+ revoc_reg_entry_request.identifier,
+ TRUSTEE,
+ )
+ revoc_reg_entry_handler.state.set(
+ revoc_reg_entry_request.operation[REVOC_REG_DEF_ID].encode(),
+ encode_state_value(
+ {VALUE: {ISSUANCE_TYPE: ISSUANCE_BY_DEFAULT}}, "seqNo", "txnTime"
+ ),
+ )
+ revoc_reg_entry_request.operation[VALUE] = {ISSUED: [1], REVOKED: [1]}
+ with pytest.raises(
+ InvalidClientRequest,
+ match="Can not have an index in both " "'issued' and 'revoked' lists",
+ ):
revoc_reg_entry_handler.dynamic_validation(revoc_reg_entry_request, 0)
-def test_revoc_reg_entry_dynamic_validation_without_permission(revoc_reg_entry_handler,
- revoc_reg_entry_request):
- add_to_idr(revoc_reg_entry_handler.database_manager.idr_cache,
- revoc_reg_entry_request.identifier,
- None)
- revoc_reg_entry_handler.state.set(revoc_reg_entry_request.operation[REVOC_REG_DEF_ID].encode(),
- encode_state_value({VALUE: {ISSUANCE_TYPE: ISSUANCE_BY_DEFAULT}},
- "seqNo", "txnTime"))
- revoc_reg_entry_request.operation[VALUE] = {ISSUED: [1],
- REVOKED: [1]}
- with pytest.raises(UnauthorizedClientRequest, match="1 TRUSTEE signature is required and needs to be owner OR "
- "1 STEWARD signature is required and needs to be owner OR "
- "1 ENDORSER signature is required and needs to be owner"):
+def test_revoc_reg_entry_dynamic_validation_without_permission(
+ revoc_reg_entry_handler, revoc_reg_entry_request
+):
+ add_to_idr(
+ revoc_reg_entry_handler.database_manager.idr_cache,
+ revoc_reg_entry_request.identifier,
+ None,
+ )
+ revoc_reg_entry_handler.state.set(
+ revoc_reg_entry_request.operation[REVOC_REG_DEF_ID].encode(),
+ encode_state_value(
+ {VALUE: {ISSUANCE_TYPE: ISSUANCE_BY_DEFAULT}}, "seqNo", "txnTime"
+ ),
+ )
+ revoc_reg_entry_request.operation[VALUE] = {ISSUED: [1], REVOKED: [1]}
+ with pytest.raises(
+ UnauthorizedClientRequest,
+ match="1 TRUSTEE signature is required and needs to be owner OR "
+ "1 STEWARD signature is required and needs to be owner OR "
+ "1 ENDORSER signature is required and needs to be owner",
+ ):
revoc_reg_entry_handler.dynamic_validation(revoc_reg_entry_request, 0)
@@ -87,13 +137,18 @@ def test_failed_update_state(revoc_reg_entry_handler, revoc_reg_entry_request):
txn_time = 1560241033
txn = reqToTxn(revoc_reg_entry_request)
append_txn_metadata(txn, seq_no, txn_time)
- with pytest.raises(InvalidClientRequest,
- match="There is no any REVOC_REG_DEF by path"):
+ with pytest.raises(
+ InvalidClientRequest, match="There is no any REVOC_REG_DEF by path"
+ ):
revoc_reg_entry_handler.update_state(txn, None, revoc_reg_entry_request)
-def test_update_state(revoc_reg_entry_handler, revoc_reg_entry_request,
- revoc_reg_def_handler, revoc_reg_def_request):
+def test_update_state(
+ revoc_reg_entry_handler,
+ revoc_reg_entry_request,
+ revoc_reg_def_handler,
+ revoc_reg_def_request,
+):
# create revoc_req_def
seq_no = 1
txn_time = 1560241030
@@ -101,8 +156,7 @@ def test_update_state(revoc_reg_entry_handler, revoc_reg_entry_request,
revoc_reg_def_request.operation[VALUE][ISSUANCE_TYPE] = ISSUANCE_BY_DEFAULT
txn = reqToTxn(revoc_reg_def_request)
append_txn_metadata(txn, seq_no, txn_time)
- path = RevocRegDefHandler.prepare_revoc_def_for_state(txn,
- path_only=True)
+ path = RevocRegDefHandler.prepare_revoc_def_for_state(txn, path_only=True)
revoc_reg_def_handler.update_state(txn, None, revoc_reg_def_request)
# create revoc_req_entry
@@ -118,9 +172,206 @@ def test_update_state(revoc_reg_entry_handler, revoc_reg_entry_request,
txn_data[f.SEQ_NO.nm] = seq_no
txn_data[TXN_TIME] = txn_time
assert revoc_reg_entry_handler.get_from_state(
- RevocRegEntryHandler.prepare_revoc_reg_entry_for_state(txn,
- path_only=True)) == (txn_data, seq_no, txn_time)
+ RevocRegEntryHandler.prepare_revoc_reg_entry_for_state(txn, path_only=True)
+ ) == (txn_data, seq_no, txn_time)
# check state for revoc_reg_entry
txn_data[VALUE] = {ACCUM: txn_data[VALUE][ACCUM]}
path, _ = RevocRegEntryHandler.prepare_revoc_reg_entry_accum_for_state(txn)
assert revoc_reg_entry_handler.get_from_state(path) == (txn_data, seq_no, txn_time)
+
+
+def test_legacy_switch_by_default_new(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler
+):
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler
+ )
+ assert state[VALUE][REVOKED] == [5, 6, 12, 13]
+
+
+def test_legacy_switch_old(
+ revoc_reg_entry_request,
+ revoc_reg_def_handler,
+ revoc_reg_entry_handler
+):
+ revoc_reg_entry_handler.legacy_sort_config = True
+
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler
+ )
+ assert state[VALUE][REVOKED] == [12, 13, 5, 6]
+
+
+def _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=1, txn_time=1560241030
+):
+ # create revoc_req_def
+ # Force a new rev reg def for these tests
+ revoc_reg_def_request = Request(
+ identifier=randomString(),
+ reqId=5,
+ signature="sig",
+ operation={
+ "type": REVOC_REG_DEF,
+ CRED_DEF_ID: "credDefId",
+ REVOC_TYPE: randomString(),
+ TAG: randomString(),
+ },
+ )
+
+ revoc_reg_def_request.operation[VALUE] = {}
+ revoc_reg_def_request.operation[VALUE][ISSUANCE_TYPE] = ISSUANCE_BY_DEFAULT
+ txn = reqToTxn(revoc_reg_def_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ path = RevocRegDefHandler.prepare_revoc_def_for_state(txn, path_only=True)
+ revoc_reg_def_handler.update_state(txn, None, revoc_reg_def_request)
+
+ # create first revoc_req_entry
+ seq_no += 1
+ txn_time += 50
+ revoc_reg_entry_request.operation[REVOC_REG_DEF_ID] = path.decode()
+ revoc_reg_entry_request.operation[VALUE][ISSUED] = []
+ revoc_reg_entry_request.operation[VALUE][REVOKED] = [4, 5, 6, 12]
+ txn = reqToTxn(revoc_reg_entry_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ revoc_reg_entry_handler.update_state(txn, None, revoc_reg_entry_request)
+
+ # create second revoc_req_entry
+ seq_no += 1
+ txn_time += 123
+ revoc_reg_entry_request.operation[REVOC_REG_DEF_ID] = path.decode()
+ revoc_reg_entry_request.operation[VALUE][ISSUED] = [4]
+ revoc_reg_entry_request.operation[VALUE][REVOKED] = [13]
+ txn = reqToTxn(revoc_reg_entry_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ revoc_reg_entry_handler.update_state(txn, None, revoc_reg_entry_request)
+ state = revoc_reg_entry_handler.get_from_state(
+ RevocRegEntryHandler.prepare_revoc_reg_entry_for_state(txn, path_only=True)
+ )
+ return state[0]
+
+
+def test_ordering_switch_via_transaction(
+ flag_handler,
+ flag_request,
+ get_flag_request_handler,
+ revoc_reg_def_handler,
+ revoc_reg_entry_handler,
+ revoc_reg_entry_request
+):
+ # enable legacy sorting
+ revoc_reg_entry_handler.legacy_sort_config = True
+
+ # First run to make sure legacy sorting works
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=1
+ )
+ assert state[VALUE][REVOKED] == [12, 13, 5, 6]
+
+ # config flag transaction
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ seq_no = 4
+ txn_time = 1560241033
+ flag_request.operation[FLAG_NAME] = FLAG_NAME_COMPAT_ORDERING
+ flag_request.operation[FLAG_VALUE] = "False"
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ get_flag_request_handler.timestamp_store.set(txn_time, flag_handler.state.headHash, ledger_id=CONFIG_LEDGER_ID)
+
+ # Second run to test the switch to proper sorting via config transaction
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=5
+ )
+ assert state[VALUE][REVOKED] == [5, 6, 12, 13]
+
+
+def test_ordering_switch_via_transaction_catchup(
+ flag_handler,
+ get_flag_request_handler,
+ flag_request,
+ revoc_reg_def_handler,
+ revoc_reg_entry_handler,
+ revoc_reg_entry_request
+):
+
+ # enable legacy sorting locally
+ revoc_reg_entry_handler.legacy_sort_config = True
+
+ # config flag transaction
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ seq_no = 1
+ txn_time = 156025000
+ flag_request.operation[FLAG_NAME] = FLAG_NAME_COMPAT_ORDERING
+ flag_request.operation[FLAG_VALUE] = "False"
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ get_flag_request_handler.timestamp_store.set(txn_time, flag_handler.state.headHash, ledger_id=CONFIG_LEDGER_ID)
+
+ # First run to make sure legacy sorting works
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=2, txn_time=156021000
+ )
+ assert state[VALUE][REVOKED] == [12, 13, 5, 6]
+
+ # Second run to test the switch to proper sorting via config transaction
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=5, txn_time=156029000
+ )
+ assert state[VALUE][REVOKED] == [5, 6, 12, 13]
+
+
+def test_ordering_switch_via_transaction_catchup_locally(
+ flag_handler,
+ flag_request,
+ get_flag_request_handler,
+ revoc_reg_def_handler,
+ revoc_reg_entry_handler,
+ revoc_reg_entry_request
+):
+
+ # disable legacy sorting locally
+ revoc_reg_entry_handler.legacy_sort_config = False
+
+ # config flag transaction
+ add_to_idr(
+ flag_handler.database_manager.idr_cache,
+ flag_request.identifier,
+ TRUSTEE,
+ )
+
+ seq_no = 1
+ txn_time = 156025000
+ flag_request.operation[FLAG_NAME] = FLAG_NAME_COMPAT_ORDERING
+ flag_request.operation[FLAG_VALUE] = "False"
+ txn = reqToTxn(flag_request)
+ append_txn_metadata(txn, seq_no, txn_time)
+ flag_handler.update_state(txn, None, flag_request)
+ flag_handler.state.commit()
+ get_flag_request_handler.timestamp_store.set(txn_time, flag_handler.state.headHash, ledger_id=CONFIG_LEDGER_ID)
+
+
+ # First run to make sure we use new sorting before the transaction time because of local sort setting
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=2, txn_time=156021000
+ )
+ assert state[VALUE][REVOKED] == [5, 6, 12, 13]
+
+ # Second run to test the switch to proper sorting via config transaction
+ state = _test_ordering(
+ revoc_reg_entry_handler, revoc_reg_entry_request, revoc_reg_def_handler, seq_no=5, txn_time=156029000
+ )
+ assert state[VALUE][REVOKED] == [5, 6, 12, 13]
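
The expected orderings in these tests ([12, 13, 5, 6] under the legacy setting versus [5, 6, 12, 13] otherwise) look like lexicographic versus numeric sorting of the revoked indices. A minimal sketch of that difference follows; this is an assumption about what the compat-ordering flag controls, not the handler's code.

```python
# Sketch (assumption): legacy vs proper ordering of the net revoked set
# produced by the two entries above — [4, 5, 6, 12] revoked, then 4
# re-issued and 13 revoked.
revoked = ({4, 5, 6, 12} - {4}) | {13}

legacy_order = sorted(revoked, key=str)  # lexicographic: "12" < "13" < "5" < "6"
proper_order = sorted(revoked)           # numeric

assert legacy_order == [12, 13, 5, 6]
assert proper_order == [5, 6, 12, 13]
```

The later tests flip between the two behaviours by writing a FLAG_NAME_COMPAT_ORDERING config transaction and reading it back through the timestamp store.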
* [X] diff --git a/indy_node/test/request_propagates/test_request_propagates.py b/indy_node/test/request_propagates/test_request_propagates.py
Only merges already in history; adds an RS_SCHEMA case to the propagation test.
index 09f56284..3ec8ee22 100644
--- a/indy_node/test/request_propagates/test_request_propagates.py
+++ b/indy_node/test/request_propagates/test_request_propagates.py
@@ -7,7 +7,7 @@ from indy.ledger import build_attrib_request, sign_request, build_schema_request
from indy_node.test.helper import createHalfKeyIdentifierAndAbbrevVerkey
from indy_common.types import Request
-from indy_node.test.api.helper import sdk_write_schema
+from indy_node.test.api.helper import sdk_write_schema, build_rs_schema_request
from plenum.common.messages.node_messages import Propagate
from plenum.common.types import f
from plenum.server.message_handlers import PropagateHandler
@@ -25,7 +25,7 @@ def emulate_received(node, msg):
)
-@pytest.fixture(scope="module", params=["ATTRIB", "SCHEMA", "CLAIM_DEF", "NYM"])
+@pytest.fixture(scope="module", params=["ATTRIB", "SCHEMA", "RS_SCHEMA", "CLAIM_DEF", "NYM"])
def req(request, looper, sdk_pool_handle, sdk_wallet_steward):
wallet_handle, identifier = sdk_wallet_steward
if request.param == "ATTRIB":
@@ -36,6 +36,9 @@ def req(request, looper, sdk_pool_handle, sdk_wallet_steward):
_, schema_json = looper.loop.run_until_complete(
issuer_create_schema(identifier, "name", "1.0", json.dumps(["first", "last"])))
request_json = looper.loop.run_until_complete(build_schema_request(identifier, schema_json))
+ elif request.param == "RS_SCHEMA":
+ rs_schema = {'@id': "fakeId234e", '@type': "0od"}
+ request_json = build_rs_schema_request(identifier, rs_schema, "ISO18023_Drivers_License", "1.1")
elif request.param == "CLAIM_DEF":
schema_json, _ = sdk_write_schema(looper, sdk_pool_handle, sdk_wallet_steward)
schema_id = json.loads(schema_json)['id']
* [X] diff --git a/indy_node/test/rich_schema/__init__.py b/indy_node/test/rich_schema/__init__.py
New file
new file mode 100644
index 00000000..e69de29b
* [X] diff --git a/indy_node/test/context/helper.py b/indy_node/test/rich_schema/templates.py
Renamed from indy_node/test/context/helper.py, with the rich schema templates added.
similarity index 51%
rename from indy_node/test/context/helper.py
rename to indy_node/test/rich_schema/templates.py
index b001a072..aa127caf 100644
--- a/indy_node/test/context/helper.py
+++ b/indy_node/test/rich_schema/templates.py
@@ -1,4 +1,127 @@
+RICH_SCHEMA_CRED_DEF_EX1 = {
+ "signatureType": "CL",
+ "mapping": "did:sov:8a9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "schema": "did:sov:3e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "publicKey": {
+ "primary": "...",
+ "revocation": "..."
+ }
+}
+
+# TODO: finalize PresDef format
+RICH_SCHEMA_PRES_DEF_EX1 = {
+ '@id': "did:sov:9f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "attr1": "",
+ "attr2": ""
+}
+
+RICH_SCHEMA_MAPPING_EX1 = {
+ '@id': "did:sov:8a9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "schema": "did:sov:3e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "attributes": {
+ "issuer": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 1
+ }
+ ],
+ "issuanceDate": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 2
+ }
+ ],
+ "driver": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 5
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 4
+ }],
+ "issuingAuthority": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 3
+ }],
+ "licenseNumber": [
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 8
+ },
+ {
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 9
+ },
+ ],
+ "categoriesOfVehicles": {
+ "vehicleType": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 6
+ }],
+ "dateOfIssue": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 7
+ }],
+ },
+ "administrativeNumber": [{
+ "enc": "did:sov:1x9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ "rank": 10
+ }]
+ }
+}
+
+RICH_SCHEMA_ENCODING_EX1 = {
+ "input": {
+ "id": "DateRFC3339",
+ "type": "string"
+ },
+ "output": {
+ "id": "UnixTime",
+ "type": "256-bit integer"
+ },
+ "algorithm": {
+ "description": "This encoding transforms an RFC3339 - formatted datetime object into the number of seconds since January 1, 1970(the Unix epoch).",
+ "documentation": "https://github.com/hyperledger/indy-hipe/commit/3a39665fd384254f08316eef6230c2f411b8f765",
+ "implementation": "https://github.com/hyperledger/indy-hipe/commit/3a39665fd384254f08316eef6230c2f411b8f869",
+ },
+ "testVectors": "https://github.com/hyperledger/indy-hipe/commit/3a39665fd384254f08316eef6230c2f411b8f766"
+}
+
+RICH_SCHEMA_EX1 = {
+ '@id': "did:sov:3e9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@context': "did:sov:2f9F8ZmxuvDqRiqqY29x6dx9oU4qwFTkPbDpWtwGbdUsrCD",
+ '@type': "rdfs:Class",
+ "rdfs:comment": "ISO18013 International Driver License",
+ "rdfs:label": "Driver License",
+ "rdfs:subClassOf": {
+ "@id": "sch:Thing"
+ },
+ "driver": "Driver",
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "issuingAuthority": "Text",
+ "licenseNumber": "Text",
+ "categoriesOfVehicles": {
+ "vehicleType": "Text",
+ "vehicleType-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^(A|B|C|D|BE|CE|DE|AM|A1|A2|B1|C1|D1|C1E|D1E)$"
+ },
+ "dateOfIssue": "Date",
+ "dateOfExpiry": "Date",
+ "restrictions": "Text",
+ "restrictions-input": {
+ "@type": "sch:PropertyValueSpecification",
+ "valuePattern": "^([A-Z]|[1-9])$"
+ }
+ },
+ "administrativeNumber": "Text"
+}
W3C_EXAMPLE_V1_CONTEXT = {
"@context": [
@@ -49,7 +172,6 @@ W3C_EXAMPLE_V1_CONTEXT = {
]
}
-
W3C_BASE_CONTEXT = {
"@context": {
"@version": 1.1,
@@ -246,12 +368,11 @@ W3C_BASE_CONTEXT = {
}
}
-
SCHEMA_ORG_CONTEXT = {
- "@context": {
+ "@context": {
"type": "@type",
"id": "@id",
- "HTML": { "@id": "rdf:HTML" },
+ "HTML": {"@id": "rdf:HTML"},
"@vocab": "http://schema.org/",
"xml": "http://www.w3.org/XML/1998/namespace",
@@ -1345,1302 +1466,1319 @@ SCHEMA_ORG_CONTEXT = {
"XRay": {"@id": "schema:XRay"},
"ZoneBoardingPolicy": {"@id": "schema:ZoneBoardingPolicy"},
"Zoo": {"@id": "schema:Zoo"},
- "about": { "@id": "schema:about"},
- "abridged": { "@id": "schema:abridged"},
- "accelerationTime": { "@id": "schema:accelerationTime"},
- "acceptedAnswer": { "@id": "schema:acceptedAnswer"},
- "acceptedOffer": { "@id": "schema:acceptedOffer"},
- "acceptedPaymentMethod": { "@id": "schema:acceptedPaymentMethod"},
- "acceptsReservations": { "@id": "schema:acceptsReservations"},
- "accessCode": { "@id": "schema:accessCode"},
- "accessMode": { "@id": "schema:accessMode"},
- "accessModeSufficient": { "@id": "schema:accessModeSufficient"},
- "accessibilityAPI": { "@id": "schema:accessibilityAPI"},
- "accessibilityControl": { "@id": "schema:accessibilityControl"},
- "accessibilityFeature": { "@id": "schema:accessibilityFeature"},
- "accessibilityHazard": { "@id": "schema:accessibilityHazard"},
- "accessibilitySummary": { "@id": "schema:accessibilitySummary"},
- "accountId": { "@id": "schema:accountId"},
- "accountMinimumInflow": { "@id": "schema:accountMinimumInflow"},
- "accountOverdraftLimit": { "@id": "schema:accountOverdraftLimit"},
- "accountablePerson": { "@id": "schema:accountablePerson"},
- "acquiredFrom": { "@id": "schema:acquiredFrom"},
- "acrissCode": { "@id": "schema:acrissCode"},
- "action": { "@id": "schema:action"},
- "actionAccessibilityRequirement": { "@id": "schema:actionAccessibilityRequirement"},
- "actionApplication": { "@id": "schema:actionApplication"},
- "actionOption": { "@id": "schema:actionOption"},
- "actionPlatform": { "@id": "schema:actionPlatform"},
- "actionStatus": { "@id": "schema:actionStatus"},
- "actionableFeedbackPolicy": { "@id": "schema:actionableFeedbackPolicy", "@type": "@id"},
- "activeIngredient": { "@id": "schema:activeIngredient"},
- "activityDuration": { "@id": "schema:activityDuration"},
- "activityFrequency": { "@id": "schema:activityFrequency"},
- "actor": { "@id": "schema:actor"},
- "actors": { "@id": "schema:actors"},
- "addOn": { "@id": "schema:addOn"},
- "additionalName": { "@id": "schema:additionalName"},
- "additionalNumberOfGuests": { "@id": "schema:additionalNumberOfGuests"},
- "additionalProperty": { "@id": "schema:additionalProperty"},
- "additionalType": { "@id": "schema:additionalType", "@type": "@id"},
- "additionalVariable": { "@id": "schema:additionalVariable"},
- "address": { "@id": "schema:address"},
- "addressCountry": { "@id": "schema:addressCountry"},
- "addressLocality": { "@id": "schema:addressLocality"},
- "addressRegion": { "@id": "schema:addressRegion"},
- "administrationRoute": { "@id": "schema:administrationRoute"},
- "advanceBookingRequirement": { "@id": "schema:advanceBookingRequirement"},
- "adverseOutcome": { "@id": "schema:adverseOutcome"},
- "affectedBy": { "@id": "schema:affectedBy"},
- "affiliation": { "@id": "schema:affiliation"},
- "afterMedia": { "@id": "schema:afterMedia", "@type": "@id"},
- "agent": { "@id": "schema:agent"},
- "aggregateRating": { "@id": "schema:aggregateRating"},
-