
Compare revisions

Changes are shown as if the source revision was being merged into the target revision.



Commits on Source: 487

387 additional commits have been omitted to prevent performance issues.
179 files changed: +10989 −2865

Files

.cargo/config

0 → 100644
+11 −0
Original line number Diff line number Diff line
[alias]
bdex = "build --release --package duniter-dbex"
ca = "check --all"
cn = "check --manifest-path neon/native/Cargo.toml"
dex = "run --release --package duniter-dbex --"
ta = "test --all"
rr = "run --release --"
uc = "update -p duniter-core"
ug = "update -p duniter-gva"
ugc = "update -p duniter-gva-conf"
xtask = "run --package xtask --"

.dockerignore

0 → 100644
+75 −0
# Do not edit this file. It is generated from this command:
# ./dockerignore.make

.cargo
.git*
doc
dockerignore.make
gui
test

# ------------------
# .gitignore content
# ------------------

*.sublime*
node_modules/
*.html
npm-debug.log
bin/jpgp*.jar
.idea/
gui/nw

# Vim swap files
*~
*.swp
*.swo

# Vagrant
.vagrant/
vagrant/*.log
vagrant/duniter

# Python compiled
*.pyc

# Releases
/work
*.deb
*.tar.gz
*.log
*.exe

# vscode
.vscode

# istanbul
.nyc_output
coverage/

# typecode
typedoc/

# files generated by tsc
/index.js*
/index.d.ts
/server.js*
/server.d.ts
*/**/*.js*
app/**/*.d.ts
neon/lib/*.d.ts
test/**/*.d.ts

# files generated by neon
neon/native/artifacts.json

# rust binaries
bin/duniter
neon/native/index.node
target

# files generated by rust tests
neon/native/tests/*.txt
neon/native/tests/wotb-*
test2.bin.gz
**/*.wot
+5 −2
@@ -46,13 +46,16 @@ app/**/*.d.ts
neon/lib/*.d.ts
test/**/*.d.ts

# files generated by neon tests
test2.bin.gz
# files generated by neon
neon/native/artifacts.json

# rust binaries
bin/duniter
neon/native/index.node
target

# files generated by rust tests
neon/native/tests/*.txt
neon/native/tests/wotb-*
test2.bin.gz
**/*.wot
+40 −65
@@ -11,35 +11,32 @@ workflow:
    - changes:
      - .gitlab/**/*
      - app/**/*
      - bin/duniter
      - bin/duniter_js
      - neon/**/*
      - releases/**/*
      - release/**/*
      - rust-bins/**/*
      - rust-libs/**/*
      - test/**/*
      - .gitlab-ci.yml
      - Cargo.toml
      - Cargo.lock
      - index.ts
      - package.json
      - package-lock.json
      - server.ts

.env: &env
  image: registry.duniter.org/docker/duniter-ci:v0.0.4
.env:
  image: registry.duniter.org/docker/duniter-ci:v0.2.0
  tags:
    - redshift
  before_script:
    - curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
    - export PATH="$HOME/.cargo/bin:$PATH"
    - export RUSTFLAGS="-D warnings"
    - export NVM_DIR="$HOME/.nvm"
    - . "$NVM_DIR/nvm.sh"
    - nvm install 10
    - nvm use 10
    - export RUSTFLAGS="-D warnings"


.cached_nvm: &cached_nvm
  <<: *env
  cache:
    untracked: true
    paths:
      - node_modules/

.rust_env:
  image: registry.duniter.org/docker/rust/rust-x64-stable-ci:latest
@@ -49,33 +46,8 @@ workflow:
    - export PATH="$HOME/.cargo/bin:$PATH"
    - rustup show && rustc --version && cargo --version


#pages:
#  <<: *env
#  stage: pages
#  cache: {}
#  script:
#    - npm i
#    - npm doc
#    - mkdir -p public
#    - cp .gitlab/pages/pages-index.html public/index.html
#    - sed -i "s/{BRANCH}/$CI_COMMIT_REF_NAME/g" public/index.html
#    - mv typedoc public/
#    - echo "$CI_JOB_ID"
#    - curl "https://git.duniter.org/nodes/typescript/duniter/-/jobs/$CI_JOB_ID/artifacts/raw/coverage.tar.gz"
#    - tar xzf coverage.tar.gz
#    - mv coverage "public/coverage"
#    - ls public
#  artifacts:
#    untracked: true
#    paths:
#      - public
#  only:
#    - loki
#    - dev

tests:
  <<: *env
  extends: .env
  rules:
    - if: $CI_COMMIT_REF_NAME =~ /^wip*$/
      when: manual
@@ -85,6 +57,7 @@ tests:
    - when: manual
  stage: tests
  script:
    - cargo test --all
    - npm i
    - npm run format:check
    - npm test
@@ -120,9 +93,9 @@ audit_dependencies:
    - cargo deny -V
  stage: quality
  script:
    - cargo deny check
    - cargo deny --workspace check

.integration_rules: &integration_rules
.integration_rules:
  allow_failure: true
  rules:
    - if: $CI_COMMIT_TAG
@@ -130,24 +103,26 @@ audit_dependencies:
    - when: manual

sync_g1:
  <<: *env
  <<: *integration_rules
  extends:
    - .env
    - .integration_rules
  stage: integration
  script:
    - npm i
    - bash .gitlab/test/check_g1_sync.sh

sync_gtest:
  <<: *env
  <<: *integration_rules
  extends:
    - .env
    - .integration_rules
  stage: integration
  script:
    - npm i
    - bash .gitlab/test/check_gt_sync.sh

.build_releases: &build_releases
.build_releases:
  stage: package
  image: duniter/release-builder:v1.4.0
  image: duniter/release-builder:v2.1.0
  cache: {}
  tags:
    - redshift
@@ -156,7 +131,7 @@ sync_gtest:
      - work/bin/

releases:test:
  <<: *build_releases
  extends: .build_releases
  allow_failure: true
  rules:
    - if: $CI_COMMIT_TAG
@@ -170,7 +145,7 @@ releases:test:
    expire_in: 72h

releases:x64:
  <<: *build_releases
  extends: .build_releases
  rules:
    - if: $CI_COMMIT_TAG
  script:
@@ -182,18 +157,18 @@ releases:x64:

.docker-build-app-image:
  stage: package
  image: docker:18.06
  image: docker:git
  tags:
    - redshift
  services:
    - docker:18.06-dind
  before_script:
    - docker info
    - docker
  script:
    - docker pull $CI_REGISTRY_IMAGE:$IMAGE_TAG || true
    - docker build --cache-from $CI_REGISTRY_IMAGE:$IMAGE_TAG --pull -t "$CI_REGISTRY_IMAGE:$IMAGE_TAG" -f release/docker/Dockerfile .
    - docker login -u "gitlab-ci-token" -p "$CI_BUILD_TOKEN" $CI_REGISTRY
    - docker push "$CI_REGISTRY_IMAGE:$IMAGE_TAG"
    - docker build --cache-from $CI_REGISTRY_IMAGE:$IMAGE_TAG --pull -t "$CI_REGISTRY_IMAGE:$IMAGE_TAG" --build-arg="INSTALL_DEX=$INSTALL_DEX" -f release/docker/Dockerfile .
    #- docker login -u "gitlab-ci-token" -p "$CI_BUILD_TOKEN" $CI_REGISTRY
    #- docker push "$CI_REGISTRY_IMAGE:$IMAGE_TAG"
    # Temporary push on dockerhub 
    - docker login -u "duniterteam" -p "$DUNITERTEAM_PASSWD"
    - docker tag "$CI_REGISTRY_IMAGE:$IMAGE_TAG" "duniter/duniter:$IMAGE_TAG"
    - docker push "duniter/duniter:$IMAGE_TAG"

package:test:docker-test-image:
  extends: .docker-build-app-image
@@ -206,6 +181,7 @@ package:test:docker-test-image:
    - when: manual
  variables:
    IMAGE_TAG: "test-image"
    INSTALL_DEX: "yes"

package:dev:docker:
  extends: .docker-build-app-image
@@ -215,16 +191,15 @@ package:dev:docker:
    - if: $CI_COMMIT_BRANCH == "dev"
  variables:
    IMAGE_TAG: "dev"
    INSTALL_DEX: "yes"

package:prod:docker:
  stage: package
  rules:
    - if: $CI_COMMIT_TAG
  image: docker:18.06
  image: docker:git
  tags:
    - redshift
  services:
    - docker:18.06-dind
    - docker
  script:
    - docker build --pull -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_TAG" -f release/docker/Dockerfile .
    - docker login -u "gitlab-ci-token" -p "$CI_BUILD_TOKEN" $CI_REGISTRY
@@ -236,7 +211,7 @@ package:prod:docker:
    - docker push duniter/duniter


.release_jobs: &release_jobs
.release_jobs:
  image: rdadev/jinja2:py3.6
  tags:
    - redshift
@@ -244,7 +219,7 @@ package:prod:docker:
    - python3 .gitlab/releaser

prerelease:
  <<: *release_jobs
  extends: .release_jobs
  rules:
    - if: $CI_COMMIT_TAG
  stage: prerelease
@@ -253,7 +228,7 @@ prerelease:
    SOURCE_EXT: '["tar.gz", "zip"]'

publish:
  <<: *release_jobs
  extends: .release_jobs
  rules:
    - if: $CI_COMMIT_TAG
      when: manual
@@ -13,7 +13,8 @@ class ReleaseNote(ProjectApi):
    __PH_NOTE = PlaceHolder('note')

    def __init__(self):
        ProjectApi.__init__(self, '/repository/tags/{}'.format(os.environ['CI_COMMIT_TAG']))
        ProjectApi.__init__(self)
        self.token = ('PRIVATE-TOKEN', os.environ['RELEASER_TOKEN'])
        self.message_read = False

    def get_note(self):
@@ -22,7 +23,7 @@ class ReleaseNote(ProjectApi):
        :return: The note if it exists, None otherwise.
        :rtype: str or None
        '''
        request = self.build_request()
        request = self.build_request('/repository/tags/{}'.format(os.environ['CI_COMMIT_TAG']))
        response = urllib.request.urlopen(request)
        response_data = response.read().decode()
        data = json.loads(response_data)
@@ -69,6 +70,10 @@ class ReleaseNote(ProjectApi):
            'description': note
        }
        send_data_serialized = json.dumps(send_data).encode('utf-8')
        request = self.build_request('/release', data=send_data_serialized, method=method)
        if not self.message_read:
            request = self.build_request('/releases', data=send_data_serialized, method=method)
        else:
            request = self.build_request('/releases/{}'.format(os.environ['CI_COMMIT_TAG']), data=send_data_serialized, method=method)
        request.add_header('Content-Type', 'application/json')
        request.add_header(*self.token)
        urllib.request.urlopen(request)
@@ -18,7 +18,7 @@ checksum_test() {
  local correct_hash=$2
  local db=$3
  echo "Checking $table's checksum..."
  bin/duniter --mdb ${db} dump table "$table" > "$DUMP_DIR/$table"
  bin/duniter_js --mdb ${db} dump table "$table" > "$DUMP_DIR/$table"
  result_hash=`sha1sum "$DUMP_DIR/$table" | grep -Po ".* " | grep -Po "[a-f0-9]+"`
#  rm -f "$DUMP_DIR/$table"
  if [ "$result_hash" == "$correct_hash" ]; then
@@ -33,8 +33,8 @@ sync_data() {
  local db=$1
  local target=$2
  local target_block=$3
  local reset_data="bin/duniter --mdb ${db} reset all"
  local sync="bin/duniter --mdb ${db} sync ${target} --nointeractive ${target_block}"
  local reset_data="bin/duniter_js --mdb ${db} reset all"
  local sync="bin/duniter_js --mdb ${db} sync ${target} --nointeractive ${target_block}"
  echo "$reset_data"
  ${reset_data}
  echo "$sync"

.rusty-hook.toml

0 → 100644
+5 −0
[hooks]
pre-commit = "cargo fmt -- --check"

[logging]
verbose = true
+9 −1
# CHANGELOG

## v1.8.0 (XX XXXX 2020)
<!-- next-header -->

## [Unreleased] - ReleaseDate

## [1.8.0] - 2020-03-12

### Highlights

@@ -125,3 +129,7 @@ Thanks @bpresles, @c-geek
## v1.7.14 (29th March 2019)

- … To be completed

<!-- next-url -->
[Unreleased]: https://git.duniter.org/nodes/typescript/duniter/compare/v1.8.0...HEAD
[1.8.0]: https://git.duniter.org/nodes/typescript/duniter/compare/v1.7.21...v1.8.0
+7 −2
@@ -11,6 +11,10 @@ via [the technical forum] before making a change.

Please note we have a specific workflow, please follow it in all your interactions with the project.

## Developer documentation

Please read the [Developer documentation] before contributing.

## Workflow

- You must create an issue for each feature you wish to develop, with a precise title and a
@@ -46,7 +50,8 @@ Please note we have a specific workflow, please follow it in all your interactio
- @Moul
- @c-geek

[commit naming conventions]: ./doc/dev/git-conventions.md#naming-commits
[Developer documentation]: ./doc/dev/index.md
[project's git conventions]: ./doc/dev/git-conventions.md
[Setting up your development environment]: ./doc/dev/setup_env_dev.md
[the technical forum]: https://forum.duniter.org
[project's git conventions]: ./doc/dev/git-conventions.md
[commit naming conventions]: ./doc/dev/git-conventions.md#naming-commits
+3280 −302

File changed; preview size limit exceeded, changes collapsed.

+51 −1
[package]
authors = ["elois <elois@duniter.org>"]
description = "Duniter cli."
edition = "2018"
keywords = ["duniter"]
license = "AGPL-3.0"
name = "duniter-cli"
repository = "https://git.duniter.org/nodes/typescript/duniter"
version = "1.9.0-dev"

[[bin]]
bench = false
path = "rust-bins/duniter-cli/src/main.rs"
name = "duniter"

[dependencies]
anyhow = "1.0.32"
ctrlc = "3.1.6"
daemonize-me = "0.3.1"
dirs = "3.0.1"
duniter-core = { git = "https://git.duniter.org/nodes/rust/duniter-core", features = ["bc-writer"] }
duniter-gva-conf = { git = "https://git.duniter.org/nodes/rust/modules/duniter-gva" }
log = "0.4.11"
logwatcher = "0.1.1"
nix = "0.17.0"
read_input = "0.8.4"
serde_json = "1.0.53"
structopt = "0.3.18"

[dev-dependencies]
rusty-hook = "0.11.2"

[workspace]
members = ["neon/native"]
members = [
    "neon/native",
    "rust-bins/duniter-dbex",
    "rust-bins/xtask",
    "rust-libs/duniter-server",
    "rust-libs/tests/duniter-integration-tests",
]

[patch.'https://git.duniter.org/nodes/rust/duniter-core']
#duniter-core = { path = "../duniter-core" }

[patch.'https://git.duniter.org/nodes/rust/modules/duniter-gva']
#duniter-gva = { path = "../duniter-gva" }

[patch.crates-io]
#dubp = { git = "https://git.duniter.org/libs/dubp-rs-libs" }
#dubp = { path = "../dubp-rs-libs" }

#leveldb_minimal = { path = "../../../../rust/leveldb_minimal" }
+19 −22
![Duniter logo](https://git.duniter.org/nodes/typescript/duniter/raw/dev/images/250%C3%97250.png)

# Duniter [![build status](https://git.duniter.org/nodes/typescript/duniter/badges/dev/pipeline.svg)](https://git.duniter.org/nodes/typescript/duniter/commits/dev) [![Coverage Status](https://coveralls.io/repos/github/duniter/duniter/badge.svg?branch=master)](https://coveralls.io/github/duniter/duniter?branch=master) [![Dependencies](https://david-dm.org/duniter/duniter.svg)](https://david-dm.org/duniter/duniter)
# Duniter [![build status](https://git.duniter.org/nodes/typescript/duniter/badges/dev/pipeline.svg)](https://git.duniter.org/nodes/typescript/duniter/commits/dev) [![Coverage Status](https://coveralls.io/repos/github/duniter/duniter/badge.svg?branch=master)](https://coveralls.io/github/duniter/duniter?branch=master) [![Dependencies](https://david-dm.org/duniter/duniter.svg)](https://david-dm.org/duniter/duniter) [![Minimum rustc version](https://img.shields.io/badge/rustc-1.47.0+-yellow.svg)](https://github.com/rust-lang/rust/blob/master/RELEASES.md)

Duniter (previously uCoin) is a libre software allowing to create a new kind of P2P crypto-currencies based on individuals and Universal Dividend.

@@ -18,22 +18,9 @@ However, we are running simultaneously a testing currency.

See [Install a node documentation](https://duniter.org/en/wiki/duniter/install/).

### Clients, wallets
### Clients, wallets, GUI, Web-UI, Mobile-App

#### Cesium

- [Website](https://cesium.app/)
- [Repository](https://git.duniter.org/clients/cesium-grp/cesium)

#### Sakia

- [Website](http://sakia-wallet.org)
- [Repository](https://git.duniter.org/clients/python/sakia)

#### Silkaj

- [Website](https://silkaj.duniter.org)
- [Repository](https://git.duniter.org/clients/python/silkaj)
See [duniter.org](https://duniter.org/) or for end user [monnaie-libre.fr](https://monnaie-libre.fr/)

## Going further

@@ -42,6 +29,15 @@ See [Install a node documentation](https://duniter.org/en/wiki/duniter/install/)
- See [CONTRIBUTING](./CONTRIBUTING.md).
- [Guide (fr)](./doc/dev/contribute-french.md)

#### In a post-it

```bash
git clone https://git.duniter.org/nodes/typescript/duniter.git
cd duniter
cargo xtask build
./target/release/duniter start
```

### Documentation

Visit [Duniter website](https://duniter.org): it gathers theoretical informations, FAQ and several useful links. If you want to learn, this is the first place to visit.
@@ -65,15 +61,16 @@ The fact of migrating from code to [Rust] is commonly called "oxidation", so we

The long-term goal is to oxidize Duniter entirely, but it is a long process that will take several years.

Duniter is divided into several  git repositories:
Duniter's code is divided into several git repositories:

- [Duniter](https://git.duniter.org/nodes/typescript/duniter): this repository.
- [Dubp-rs-libs](https://git.duniter.org/libs/dubp-rs-libs): Set of Rust libraries common to Duniter and a possible future Rust client/wallet.
- [Web admin](https://git.duniter.org/nodes/typescript/modules/duniter-ui): web administration interface (optional).
- [GVA](https://git.duniter.org/nodes/typescript/modules/gva-api): Future client API aimed to replace BMA. GVA stands for GraphQL Validation API.
- **[dubp-rs-libs]** contains the logic common to Duniter and its clients.
- **[duniter-core]** contains the core code of Duniter.
- The gitlab subgroup **[nodes/rust/modules]** contains the main Duniter modules code (gva, admin, etc).
- **[duniter]** repository contains the "official" implementations of the "duniter-cli" and "duniter-desktop" programs with their default modules (also contains the historical implementation being migrated).

Optional repositories:
Old optional repositories (will be archived when the migration is complete):

- [**Web admin**](https://git.duniter.org/nodes/typescript/modules/duniter-ui): web administration interface (optional).
- [Currency monit](https://git.duniter.org/nodes/typescript/modules/duniter-currency-monit): charts to monitor currency and web of trust state.
- [Remuniter](https://github.com/duniter/remuniter): service to remunerate blocks issuers.

@@ -325,7 +325,7 @@ export class DuniterBlockchain {
    await dal.trimSandboxes(block);

    // Saves the block (DAL)
    await dal.saveBlock(dbb);
    await dal.saveBlock(dbb, conf);

    // Save wot file
    if (!dal.fs.isMemoryOnly()) {
@@ -487,11 +487,16 @@ export class DuniterBlockchain {
  }

  static async revertBlock(
    conf: ConfDTO,
    number: number,
    hash: string,
    dal: FileDAL,
    block?: DBBlock
  ) {
    if (block) {
      dal.rustServer.revertBlock(BlockDTO.fromJSONObject(block));
    }

    const blockstamp = [number, hash].join("-");

    // Revert links
@@ -587,7 +592,7 @@ export class DuniterBlockchain {
    for (const obj of block.transactions) {
      obj.currency = block.currency;
      let tx = TransactionDTO.fromJSONObject(obj);
      await dal.saveTransaction(DBTx.fromTransactionDTO(tx));
      await dal.saveTransaction(tx);
    }
  }

@@ -641,7 +646,7 @@ export class DuniterBlockchain {
      obj.currency = block.currency;
      const tx = TransactionDTO.fromJSONObject(obj);
      const txHash = tx.getHash();
      await dal.removeTxByHash(txHash);
      await dal.removePendingTxByHash(txHash);
    }
  }

@@ -18,12 +18,12 @@ const GT = "g1-test";
const CURRENCY = "[a-zA-Z0-9-_ ]{2,50}";
const BASE58 = "[123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz]+";
const PUBKEY =
  "[123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz]{43,44}";
  "[123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz]{40,44}";
const SIGNATURE = "[A-Za-z0-9+\\/=]{87,88}";
const USER_ID = "[A-Za-z0-9_-]{2,100}";
const INTEGER = "(0|[1-9]\\d{0,18})";
const FINGERPRINT = "[A-F0-9]{64}";
const BLOCK_VERSION = "(10|11|12)";
const BLOCK_VERSION = "(1[0-3])";
const TX_VERSION = "(10)";
const DIVIDEND = "[1-9][0-9]{0,5}";
const ZERO_OR_POSITIVE_INT = "0|[1-9][0-9]{0,18}";
@@ -53,7 +53,7 @@ const CONDITIONS =
  "\\)|CSV\\(" +
  CSV_INTEGER +
  "\\))))*";

const CONDITION_SIG_PUBKEY = "SIG\\((" + PUBKEY + ")\\)";
const BMA_REGEXP = /^BASIC_MERKLED_API( ([a-z_][a-z0-9-_.]*))?( ([0-9.]+))?( ([0-9a-f:]+))?( ([0-9]+))$/;
const BMAS_REGEXP = /^BMAS( ([a-z_][a-z0-9-_.]*))?( ([0-9.]+))?( ([0-9a-f:]+))?( ([0-9]+))( (\/.+))?$/;
const BMATOR_REGEXP = /^BMATOR( ([a-z0-9]{16})\.onion)( ([0-9.]+))?( ([0-9a-f:]+))?( ([0-9]+))$/;
@@ -162,6 +162,8 @@ export const CommonConstants = {
  MAXIMUM_LEN_OF_OUTPUT,
  MAXIMUM_LEN_OF_UNLOCK,

  PUSH_NEW_PENDING_TXS_EVERY_MS: 30_000,

  POW_TURN_DURATION_PC: 100,
  POW_TURN_DURATION_ARM: 500,

@@ -533,6 +535,8 @@ export const CommonConstants = {
    LOCKTIME: find("Locktime: (" + INTEGER + ")"),
    INLINE_COMMENT: exact(COMMENT),
    OUTPUT_CONDITION: exact(CONDITIONS),
    OUTPUT_CONDITION_SIG_PUBKEY: find(CONDITION_SIG_PUBKEY),
    OUTPUT_CONDITION_SIG_PUBKEY_UNIQUE: exact(CONDITION_SIG_PUBKEY),
  },
  PEER: {
    BLOCK: find("Block: (" + INTEGER + "-" + FINGERPRINT + ")"),
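The widened `PUBKEY` quantifier (`{43,44}` → `{40,44}`) accounts for base58 encoding: leading zero bytes in a 32-byte key shorten its base58 form, so some valid public keys are fewer than 43 characters long. A minimal sketch with hypothetical keys (not real Duniter pubkeys):

```javascript
// Why the PUBKEY quantifier widens from {43,44} to {40,44}:
// base58 encodes leading zero bytes compactly, so a valid 32-byte key
// can encode to fewer than 43 characters and was rejected before the fix.
const BASE58_CHARS =
  "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz";
const OLD_PUBKEY = new RegExp(`^[${BASE58_CHARS}]{43,44}$`);
const NEW_PUBKEY = new RegExp(`^[${BASE58_CHARS}]{40,44}$`);

const normalKey = "2".repeat(44); // hypothetical 44-char pubkey
const shortKey = "2".repeat(42);  // hypothetical 42-char pubkey

console.log(OLD_PUBKEY.test(normalKey)); // true
console.log(OLD_PUBKEY.test(shortKey));  // false: rejected before the fix
console.log(NEW_PUBKEY.test(shortKey));  // true: accepted after the fix
```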
@@ -49,42 +49,42 @@ export class BlockParser extends GenericParser {
        { prop: "membersCount", regexp: CommonConstants.BLOCK.MEMBERS_COUNT },
        {
          prop: "identities",
          regexp: /Identities:\n([\s\S]*)Joiners/,
          regexp: /Identities:\n([\s\S]*?)Joiners/,
          parser: splitAndMatch("\n", CommonConstants.IDENTITY.INLINE),
        },
        {
          prop: "joiners",
          regexp: /Joiners:\n([\s\S]*)Actives/,
          regexp: /Joiners:\n([\s\S]*?)Actives/,
          parser: splitAndMatch("\n", CommonConstants.BLOCK.JOINER),
        },
        {
          prop: "actives",
          regexp: /Actives:\n([\s\S]*)Leavers/,
          regexp: /Actives:\n([\s\S]*?)Leavers/,
          parser: splitAndMatch("\n", CommonConstants.BLOCK.ACTIVE),
        },
        {
          prop: "leavers",
          regexp: /Leavers:\n([\s\S]*)Excluded/,
          regexp: /Leavers:\n([\s\S]*?)Excluded/,
          parser: splitAndMatch("\n", CommonConstants.BLOCK.LEAVER),
        },
        {
          prop: "revoked",
          regexp: /Revoked:\n([\s\S]*)Excluded/,
          regexp: /Revoked:\n([\s\S]*?)Excluded/,
          parser: splitAndMatch("\n", CommonConstants.BLOCK.REVOCATION),
        },
        {
          prop: "excluded",
          regexp: /Excluded:\n([\s\S]*)Certifications/,
          regexp: /Excluded:\n([\s\S]*?)Certifications/,
          parser: splitAndMatch("\n", CommonConstants.PUBLIC_KEY),
        },
        {
          prop: "certifications",
          regexp: /Certifications:\n([\s\S]*)Transactions/,
          regexp: /Certifications:\n([\s\S]*?)Transactions/,
          parser: splitAndMatch("\n", CommonConstants.CERT.OTHER.INLINE),
        },
        {
          prop: "transactions",
          regexp: /Transactions:\n([\s\S]*)/,
          regexp: /Transactions:\n([\s\S]*)/, // No need for greedy "?" regexp capture, "Transaction" parsing is different from previous multiline fields
          parser: extractTransactions,
        },
        { prop: "inner_hash", regexp: CommonConstants.BLOCK.INNER_HASH },
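The switch from greedy `[\s\S]*` to lazy `[\s\S]*?` in the section captures above matters whenever a section keyword reappears later in the raw block text. A minimal sketch with a hypothetical raw block (simplified field names, not a valid Duniter block):

```javascript
// Greedy vs lazy multiline capture: the greedy form backtracks to the
// LAST occurrence of the delimiter word, swallowing later sections.
const raw = [
  "Identities:",
  "alice-identity",
  "Joiners:",
  "bob-joiner",
  "Actives:",
  "Transactions:",
  "TX whose comment happens to contain the word Joiners",
].join("\n");

const greedy = /Identities:\n([\s\S]*)Joiners/;
const lazy = /Identities:\n([\s\S]*?)Joiners/;

console.log(lazy.exec(raw)[1]);   // "alice-identity\n" — stops at the header
console.log(greedy.exec(raw)[1].includes("bob-joiner")); // true — overshoots
```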
@@ -28,8 +28,6 @@ export interface ProgramOptions {
  loglevel?: string;
  sqlTraces?: boolean;
  memory?: boolean;
  storeTxs?: boolean;
  storeWw?: boolean;
}

export const cliprogram: ProgramOptions = {
@@ -174,6 +174,7 @@ export class BlockchainContext {
      throw DataErrors[DataErrors.BLOCK_TO_REVERT_NOT_FOUND];
    }
    await DuniterBlockchain.revertBlock(
      this.conf,
      head_1.number,
      head_1.hash,
      this.dal,
@@ -187,7 +188,12 @@ export class BlockchainContext {
  async revertCurrentHead() {
    const head_1 = await this.dal.bindexDAL.head(1);
    this.logger.debug("Reverting HEAD~1... (b#%s)", head_1.number);
    await DuniterBlockchain.revertBlock(head_1.number, head_1.hash, this.dal);
    await DuniterBlockchain.revertBlock(
      this.conf,
      head_1.number,
      head_1.hash,
      this.dal
    );
    // Invalidates the head, since it has changed.
    await this.refreshHead();
  }
@@ -112,9 +112,21 @@ export class CFSCore {
   * @param content String content to write.
   * @param deep Wether to make a deep write or not.
   */
  async write(filePath: string, content: string, deep: boolean): Promise<void> {
  async write(
    filePath: string,
    content: string,
    deep: boolean,
    secureMode: boolean = false
  ): Promise<void> {
    if (secureMode) {
      return this.qfs.fsWriteSecure(
        path.join(this.rootPath, filePath),
        content
      );
    } else {
      return this.qfs.fsWrite(path.join(this.rootPath, filePath), content);
    }
  }

  /**
   * REMOVE operation of CFS. Set given file as removed. Logical deletion since physical won't work due to the algorithm of CFS.
@@ -19,6 +19,8 @@ export interface IIndexDAO extends ReduceableDAO<IindexEntry> {

  searchThoseMatching(search: string): Promise<OldIindexEntry[]>;

  getOldFromPubkey(pub: string): Promise<OldIindexEntry | null>;

  getFullFromUID(uid: string): Promise<FullIindexEntry>;

  getFullFromPubkey(pub: string): Promise<FullIindexEntry>;
+0 −42
import { GenericDAO } from "./GenericDAO";
import { TransactionDTO } from "../../../dto/TransactionDTO";
import { SandBox } from "../../sqliteDAL/SandBox";
import { DBTx } from "../../../db/DBTx";

export interface TxsDAO extends GenericDAO<DBTx> {
  trimExpiredNonWrittenTxs(limitTime: number): Promise<void>;

  getAllPending(versionMin: number): Promise<DBTx[]>;

  getTX(hash: string): Promise<DBTx>;

  addLinked(
    tx: TransactionDTO,
    block_number: number,
    time: number
  ): Promise<DBTx>;

  addPending(dbTx: DBTx): Promise<DBTx>;

  getLinkedWithIssuer(pubkey: string): Promise<DBTx[]>;

  getLinkedWithRecipient(pubkey: string): Promise<DBTx[]>;

  getPendingWithIssuer(pubkey: string): Promise<DBTx[]>;

  getPendingWithRecipient(pubkey: string): Promise<DBTx[]>;

  removeTX(hash: string): Promise<void>;

  removeAll(): Promise<void>;

  sandbox: SandBox<{
    issuers: string[];
    output_base: number;
    output_amount: number;
  }>;

  getSandboxRoom(): Promise<number>;

  setSandboxSize(size: number): void;
}
@@ -268,7 +268,7 @@ export class LevelDBBlockchain extends LevelDBTable<DBBlock>
    block.fork = false;
    // We remove the eventual fork
    const forkKey = LevelDBBlockchain.trimForkKey(block.number, block.hash);
    if (this.forks.getOrNull(forkKey)) {
    if (await this.forks.getOrNull(forkKey)) {
      await this.forks.del(forkKey);
    }
    // We return the saved block
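The added `await` above fixes a classic truthiness bug: without it, `getOrNull` returns a pending Promise, and any Promise object is truthy, so the `del` branch ran even when no fork entry existed. A minimal illustration with a hypothetical stand-in for `getOrNull`:

```javascript
// Sketch of the bug: a Promise is always truthy, so `if (p)` cannot
// distinguish "entry found" from "entry missing" unless it is awaited.
async function getOrNull(key) {
  return null; // simulate a missing fork entry
}

async function main() {
  const withoutAwait = getOrNull("fork-key"); // a pending Promise
  console.log(Boolean(withoutAwait)); // true: the branch always ran

  const withAwait = await getOrNull("fork-key"); // the resolved value
  console.log(Boolean(withAwait)); // false: branch correctly skipped
}

main();
```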
@@ -280,4 +280,12 @@ export class LevelDBIindex extends LevelDBTable<IindexEntry[]>
      .filter((u) => u.pub)
      .concat(pubIdentities.filter((p) => p.pub));
  }

  async getOldFromPubkey(pub: string): Promise<OldIindexEntry | null> {
    const identities = await this.findByPub(pub);
    if (!identities.length) {
      return null;
    }
    return OldTransformers.toOldIindexEntry(reduce(identities));
  }
}
@@ -12,12 +12,14 @@ import { SIndexDAO } from "../abstract/SIndexDAO";
import { Underscore } from "../../../common-libs/underscore";
import { pint } from "../../../common-libs/pint";
import { arrayPruneAllCopy } from "../../../common-libs/array-prune";
import { CommonConstants } from "../../../common-libs/constants";

export class LevelDBSindex extends LevelDBTable<SindexEntry>
  implements SIndexDAO {
  private indexForTrimming: LevelDBTable<string[]>;
  private indexForConsumed: LevelDBTable<string[]>;
  private indexForConditions: LevelDBTable<string[]>;
  private indexOfComplexeConditionForPubkeys: LevelDBTable<string[]>;

  constructor(protected getLevelDB: (dbName: string) => Promise<LevelUp>) {
    super("level_sindex", getLevelDB);
@@ -41,9 +43,14 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
      "level_sindex/conditions",
      this.getLevelDB
    );
    this.indexOfComplexeConditionForPubkeys = new LevelDBTable<string[]>(
      "level_sindex/complex_condition_pubkeys",
      this.getLevelDB
    );
    await this.indexForTrimming.init();
    await this.indexForConsumed.init();
    await this.indexForConditions.init();
    await this.indexOfComplexeConditionForPubkeys.init();
  }

  async close(): Promise<void> {
@@ -51,6 +58,7 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
    await this.indexForTrimming.close();
    await this.indexForConsumed.close();
    await this.indexForConditions.close();
    await this.indexOfComplexeConditionForPubkeys.close();
  }

  /**
@@ -127,14 +135,14 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
      pos: number;
    }[]
  > {
-    // TODO: very costly: needs a full scan, would be better to change this implementatio
-    const entries = await this.findWhere((e) =>
-      e.conditions.includes(`SIG(${pubkey})`)
-    );
-    const reduced = Indexer.DUP_HELPERS.reduceBy(entries, [
-      "identifier",
-      "pos",
-    ]);
+    const forSimpleConditions = await this.getForConditions(`SIG(${pubkey})`);
+    const forComplexConditions = await this.getForComplexeConditionPubkey(
+      pubkey
+    );
+    const reduced = Indexer.DUP_HELPERS.reduceBy(
+      forSimpleConditions.concat(forComplexConditions),
+      ["identifier", "pos"]
+    );
    return reduced.filter((r) => !r.consumed);
  }
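The rewrite above replaces a full-table scan with two index lookups, then merges the simple-condition and complex-condition hits and dedupes them on the composite (identifier, pos) key. A standalone sketch of that grouping step, with a hypothetical `reduceBy` standing in for `Indexer.DUP_HELPERS.reduceBy`:

```typescript
interface MiniSource {
  identifier: string;
  pos: number;
  consumed: boolean;
}

// Group records sharing the same composite key and keep one merged record
// per group, later records overriding earlier ones.
function reduceBy(
  records: MiniSource[],
  keys: (keyof MiniSource)[]
): MiniSource[] {
  const groups = new Map<string, MiniSource>();
  for (const r of records) {
    const k = keys.map((key) => String(r[key])).join("-");
    const prev = groups.get(k);
    groups.set(k, prev ? { ...prev, ...r } : { ...r });
  }
  return Array.from(groups.values());
}
```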

@@ -269,6 +277,20 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
    return found;
  }

  async getForComplexeConditionPubkey(pubkey: string): Promise<SindexEntry[]> {
    const ids =
      (await this.indexOfComplexeConditionForPubkeys.getOrNull(pubkey)) || [];
    const found: SindexEntry[] = [];
    for (const id of ids) {
      const entries = await this.findByIdentifierAndPos(
        id.split("-")[0],
        pint(id.split("-")[1])
      );
      entries.forEach((e) => found.push(e));
    }
    return found;
  }
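The per-pubkey index stores source ids in the `identifier-pos` form that `getForComplexeConditionPubkey` splits back apart, mirroring `LevelDBSindex.trimPartialKey`. A tiny sketch of that round-trip, with hypothetical helper names:

```typescript
// Build the composite id stored in the pubkey index.
function toPartialKey(identifier: string, pos: number): string {
  return `${identifier}-${pos}`;
}

// Parse it back, mirroring the id.split("-") calls above.
function fromPartialKey(id: string): { identifier: string; pos: number } {
  const parts = id.split("-");
  return { identifier: parts[0], pos: parseInt(parts[1], 10) };
}
```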

  async removeBlock(blockstamp: string): Promise<void> {
    const writtenOn = pint(blockstamp);
    // We look at records written on this blockstamp: `indexForTrimming` allows to get them
@@ -316,24 +338,25 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
  }

  private async trimConditions(condition: string, id: string) {
-    // Get all the account's TX sources
+    // Get all the condition's sources
    const existing = (await this.indexForConditions.getOrNull(condition)) || [];
-    // Prune the source from the account
+    // Prune the source from the condition
    const trimmed = arrayPruneAllCopy(existing, id);
    if (trimmed.length) {
-      // If some sources are left for this "account", persist what remains
+      // If some sources are left for this "condition", persist what remains
      await this.indexForConditions.put(condition, trimmed);
    } else {
      // Otherwise just delete the "account"
      await this.indexForConditions.del(condition);
    }
    }

    // If complex conditions
    if (this.isComplexCondition(condition)) {
      const pubkeys = this.getDistinctPubkeysFromCondition(condition);
      await this.trimComplexeConditionPubkeys(pubkeys, id);
    }
  }

  /**
   * Duplicate with trimConditions?!
   * @param writtenOn
   * @param id
   */
  private async trimWrittenOn(writtenOn: number, id: string) {
    const k = LevelDBSindex.trimWrittenOnKey(writtenOn);
    const existing = await this.getWrittenOnSourceIds(writtenOn);
@@ -356,6 +379,28 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
    }
  }

  private async trimComplexeConditionPubkeys(pubkeys: string[], id: string) {
    if (!pubkeys || !pubkeys.length) return;
    for (const p of pubkeys) {
      await this.trimComplexeConditionPubkey(p, id);
    }
  }

  private async trimComplexeConditionPubkey(pubkey: string, id: string) {
    // Get all the condition's sources
    const existing =
      (await this.indexOfComplexeConditionForPubkeys.getOrNull(pubkey)) || [];
    // Prune the source from the condition
    const trimmed = arrayPruneAllCopy(existing, id);
    if (trimmed.length) {
      // If some sources are left for this "condition", persist what remains
      await this.indexOfComplexeConditionForPubkeys.put(pubkey, trimmed);
    } else {
      // Otherwise just delete the "account"
      await this.indexOfComplexeConditionForPubkeys.del(pubkey);
    }
  }
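Both trim helpers share the same prune-or-delete shape: drop every copy of the id from the stored list, then either persist the remainder or delete the key entirely so no empty lists accumulate. A sketch of the pattern over a plain Map, with a hypothetical `pruneAllCopy` assumed to behave like `arrayPruneAllCopy`:

```typescript
// Remove every copy of `value`, returning a new array (the input is untouched).
function pruneAllCopy<T>(arr: T[], value: T): T[] {
  return arr.filter((v) => v !== value);
}

// Apply the prune-or-delete pattern to a Map-backed index.
function trimIndexEntry(
  index: Map<string, string[]>,
  key: string,
  id: string
): void {
  const trimmed = pruneAllCopy(index.get(key) || [], id);
  if (trimmed.length) {
    index.set(key, trimmed); // some sources left: persist the remainder
  } else {
    index.delete(key); // nothing left: drop the key entirely
  }
}
```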

  private async getWrittenOnSourceIds(writtenOn: number) {
    const indexForTrimmingId = LevelDBSindex.trimWrittenOnKey(writtenOn);
    return (await this.indexForTrimming.getOrNull(indexForTrimmingId)) || [];
@@ -393,6 +438,7 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
    const byConsumed: { [k: number]: SindexEntry[] } = {};
    const byWrittenOn: { [k: number]: SindexEntry[] } = {};
    const byConditions: { [k: string]: SindexEntry[] } = {};
    const byPubkeys: { [k: string]: SindexEntry[] } = {};
    records
      .filter((r) => r.consumed)
      .forEach((r) => {
@@ -410,12 +456,24 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
        arrWO = byWrittenOn[r.writtenOn] = [];
      }
      arrWO.push(r);
-      // Conditiosn
+      // Conditions
      let arrCN = byConditions[r.conditions];
      if (!arrCN) {
        arrCN = byConditions[r.conditions] = [];
      }
      arrCN.push(r);

      // If complex condition
      if (this.isComplexCondition(r.conditions)) {
        const pubkeys = this.getDistinctPubkeysFromCondition(r.conditions);
        pubkeys.forEach((pub) => {
          let arrPub = byPubkeys[pub];
          if (!arrPub) {
            arrPub = byPubkeys[pub] = [];
          }
          arrPub.push(r);
        });
      }
    });
    // Index consumed => (identifier + pos)[]
    for (const k of Underscore.keys(byConsumed)) {
@@ -446,5 +504,47 @@ export class LevelDBSindex extends LevelDBTable<SindexEntry>
        Underscore.uniq(existing.concat(newSources))
      );
    }
    // Index pubkeys => (identifier + pos)[]
    for (const k of Underscore.keys(byPubkeys).map(String)) {
      const existing =
        (await this.indexOfComplexeConditionForPubkeys.getOrNull(k)) || [];
      const newSources = byPubkeys[k].map((r) =>
        LevelDBSindex.trimPartialKey(r.identifier, r.pos)
      );
      await this.indexOfComplexeConditionForPubkeys.put(
        k,
        Underscore.uniq(existing.concat(newSources))
      );
    }
  }

  private isComplexCondition(condition: string): boolean {
    return (
      (condition &&
        !CommonConstants.TRANSACTION.OUTPUT_CONDITION_SIG_PUBKEY_UNIQUE.test(
          condition
        )) ||
      false
    );
  }
  /**
   * Get all pubkeys used by an output condition (e.g. 'SIG(A) && SIG(B)' will return ['A', 'B'])
   * @param condition
   * @private
   */
  private getDistinctPubkeysFromCondition(condition: string): string[] {
    const pubKeys: string[] = [];
    if (!condition) return pubKeys;
    let match: RegExpExecArray | null;
    while (
      (match = CommonConstants.TRANSACTION.OUTPUT_CONDITION_SIG_PUBKEY.exec(
        condition
      )) !== null
    ) {
      pubKeys.push(match[1]);
      condition = condition.substring(match.index + match[0].length);
    }

    return Underscore.uniq(pubKeys);
  }
}
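`getDistinctPubkeysFromCondition` re-scans the remainder of the string after each hit because the constant is a non-global regex, whose `exec` always matches from the start. A standalone sketch, with an assumed `SIG(...)` pattern in place of `CommonConstants.TRANSACTION.OUTPUT_CONDITION_SIG_PUBKEY`:

```typescript
// Assumed stand-in for the real OUTPUT_CONDITION_SIG_PUBKEY constant
// (base58 alphabet inside SIG(...)).
const SIG_PUBKEY = /SIG\(([1-9A-HJ-NP-Za-km-z]+)\)/;

function distinctPubkeysFromCondition(condition: string): string[] {
  const pubKeys: string[] = [];
  let match: RegExpExecArray | null;
  while ((match = SIG_PUBKEY.exec(condition)) !== null) {
    pubKeys.push(match[1]);
    // Non-global regex: advance manually past the match.
    condition = condition.substring(match.index + match[0].length);
  }
  return Array.from(new Set(pubKeys)); // keep each pubkey once
}
```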
@@ -91,12 +91,12 @@ export class LevelMIndexExpiresOnIndexer extends LevelDBDataIndex<
    const values: MindexEntry[] = Underscore.values(
      newStateByPub
    ).map((entries) => reduce(entries));
-    const byExpiredOn = reduceGroupBy(values, "expired_on");
+    const byExpiresOnForExpired = reduceGroupBy(values, "expires_on");
    await Promise.all(
-      Underscore.keys(byExpiredOn).map(async (expiresOn) =>
+      Underscore.keys(byExpiresOnForExpired).map(async (expiresOn) =>
        this.addAllKeysToExpiresOn(
          pint(expiresOn),
-          byExpiredOn[expiresOn].map((e) => e.pub)
+          byExpiresOnForExpired[expiresOn].map((e) => e.pub)
        )
      )
    );
@@ -112,8 +112,10 @@ export class LevelMIndexExpiresOnIndexer extends LevelDBDataIndex<
      entry = [];
    }
    for (const pub of pubkeys) {
      if (!entry.includes(pub)) {
        entry.push(pub);
      }
    }
    await this.put(key, entry);
  }
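The second hunk makes `addAllKeysToExpiresOn` idempotent: a pubkey is appended to the bucket only if it is not already present, so replayed updates no longer duplicate entries. A sketch of the same logic over a Map, a hypothetical store standing in for the LevelDB table:

```typescript
// Append pubkeys to the bucket for `expiresOn`, skipping duplicates.
function addAllKeysToExpiresOn(
  buckets: Map<number, string[]>,
  expiresOn: number,
  pubkeys: string[]
): void {
  const entry = buckets.get(expiresOn) || [];
  for (const pub of pubkeys) {
    if (!entry.includes(pub)) {
      entry.push(pub);
    }
  }
  buckets.set(expiresOn, entry);
}
```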

-import { FullIindexEntry, IindexEntry, Indexer } from "../../../indexer";
+import {
+  FullIindexEntry,
+  IindexEntry,
+  Indexer,
+  reduce,
+} from "../../../indexer";
import { SQLiteDriver } from "../../drivers/SQLiteDriver";
import { MonitorExecutionTime } from "../../../debug/MonitorExecutionTime";
import { IIndexDAO } from "../abstract/IIndexDAO";
@@ -212,6 +217,18 @@ export class SqliteIIndex extends SqliteTable<IindexEntry>
    return (await this.getFromUID(uid)) as FullIindexEntry;
  }

  @MonitorExecutionTime()
  async getOldFromPubkey(pub: string): Promise<OldIindexEntry | null> {
    const entries = await this.find(
      "SELECT * FROM iindex WHERE pub = ? order by writtenOn ASC",
      [pub]
    );
    if (!entries.length) {
      return null;
    }
    return OldTransformers.toOldIindexEntry(reduce(entries));
  }

  @MonitorExecutionTime()
  async getMembers(): Promise<{ pubkey: string; uid: string | null }[]> {
    const members = await this.find(
@@ -328,6 +328,10 @@ export class IdentityDAL extends AbstractSQLite<DBIdentity> {
    });
  }

  findByPub(pub: string) {
    return this.sqlFind({ pubkey: pub });
  }

  async trimExpiredIdentities(medianTime: number) {
    await this.exec(
      "DELETE FROM " +


@@ -346,7 +346,7 @@ export class PeerDTO implements Cloneable {
      p.signature,
      p.status,
      p.statusTS,
-      false
+      !p.nonWoT
    );
  }

The remaining files in this comparison were collapsed by the preview size limit; only their summary stats survive:

- (unnamed file): changed, +31 −15
- (unnamed file): changed, +38 −137
- (unnamed file): changed
- (unnamed file): changed
- (unnamed file): changed, +25 −1
- (unnamed file): changed, +85 −79
- doc/api/gva.md: added (0 → 100644), +108 −0
- doc/dev/index.md: added (0 → 100644), +12 −0
- (unnamed file): added, +103 −0
- doc/use/configure.md: added (0 → 100644), +219 −0
- (unnamed file): changed
- doc/use/index.md: added (0 → 100644), +13 −0
- doc/use/install.md: added (0 → 100644), +54 −0
- dockerignore.make: added (0 → 100755), +44 −0
- duniter.sh: deleted (100755 → 0), +0 −52
- format.sh: deleted (100755 → 0), +0 −11
- (unnamed file): changed
- (unnamed file): changed, +1 −1
- (unnamed file): changed, +7 −10
- (unnamed file): changed, +2 −22
- (unnamed file): changed
- neon/native/artifacts.json: deleted (100644 → 0), +0 −1; its single line (no newline at end of file) was:
  {"active":"debug","targets":{"debug":{"rustc":"","env":{"npm_config_target":null,"npm_config_arch":null,"npm_config_target_arch":null,"npm_config_disturl":null,"npm_config_runtime":null,"npm_config_build_from_source":null,"npm_config_devdir":null}},"release":{"rustc":"","env":{"npm_config_target":null,"npm_config_arch":null,"npm_config_target_arch":null,"npm_config_disturl":null,"npm_config_runtime":null,"npm_config_build_from_source":null,"npm_config_devdir":null}}}}
- (unnamed file): added, +15 −0
- (unnamed file): added, +148 −0
- (unnamed file): changed, +10 −134
- (unnamed file): changed, +9 −17
- release.toml: added (0 → 100644), +25 −0
- (unnamed file): changed
- release/docker/dex.sh: added (0 → 100755), +5 −0
- release/docker/duniter.sh: changed, mode 100644 → 100755, +15 −16
- release/new_version.sh: deleted (100755 → 0), +0 −30
- (unnamed file): changed, +37 −8
- test/run.sh: deleted (100755 → 0), +0 −11
- (unnamed file): changed