    Duniter-Squid

    A Squid-based indexer. It takes ĞDev data and serves it via a GraphQL API.

    Run in production

    1. Copy files from this repo to a new local directory:

      • docker-compose.yml
      • .env.example

      Rename .env.example to .env and adapt it to your own configuration.

    2. Start the Docker containers:

    docker compose up -d
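The .env file configures the database and API; the variable names below are purely illustrative (refer to .env.example in the repository for the authoritative list):

```env
# hypothetical sketch — use the variable names from .env.example
DB_NAME=squid
DB_PASS=postgres
DB_PORT=23798
GQL_PORT=4350
```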

    Dev requirements

    • node 20.x
    • npm
    • docker
    • sqd: npm install --global @subsquid/cli@latest
    • hasura: curl -L https://github.com/hasura/graphql-engine/raw/stable/cli/get.sh | bash

    Run

    Example commands below use sqd. Please install it before proceeding.

    # 1. Install dependencies
    npm ci
    
    # 2. Start target Postgres database and detach
    sqd up
    
    # 3. Build the project
    sqd build
    
    # 4. Start both the squid processor and the GraphQL server
    sqd run .

    A GraphiQL playground will be available at localhost:4350/graphql.
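From the playground you can then issue queries. The entity and field names below are hypothetical (they depend on the actual schema); this only illustrates the shape of a Hasura-style query:

```graphql
# hypothetical entity and fields — replace with ones from the real schema
query {
  block(limit: 5, order_by: { height: desc }) {
    height
    hash
  }
}
```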

    Dev flow

    TL;DR

    # reboot the database
    sqd down && sqd up
    # after modifying typegen.json
    sqd typegen
    # after modifying schema.graphql (need `sqd build` first time for hasura export metadata)
    sqd db:update
    # build and run
    sqd build && sqd run .

    The sections below come from the squid substrate template.

    1. Define database schema

    Start development by defining the schema of the target database via schema.graphql. The schema definition consists of regular GraphQL type declarations annotated with custom directives. A full description of the schema.graphql dialect is available here.
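As an illustration, a minimal schema.graphql entity might look like the following. The entity and its fields are hypothetical, not the actual duniter-squid schema; only the `@entity` directive and the scalar types (`BigInt`, `DateTime`) are standard Squid schema constructs:

```graphql
# hypothetical entity annotated with Squid's custom @entity directive
type Account @entity {
  id: ID!
  balance: BigInt!
  createdAt: DateTime!
}
```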

    2. Generate TypeORM classes

    Mapping developers use TypeORM entities to interact with the target database during data processing. All necessary entity classes are generated by the squid framework from schema.graphql. This is done by running npx squid-typeorm-codegen or, equivalently, the sqd codegen command.

    3. Apply changes in database

    We use Hasura as the GraphQL engine, so we use the Hasura CLI to generate the needed metadata and apply the database changes. Run sqd --help to see all available hasura:* commands.

    4. Generate database migration

    Subsquid database changes are applied through migration files located in db/migrations. The squid-typeorm-migration(1) tool provides several commands to drive the process; it is all TypeORM under the hood.

    # Connect to database, analyze its state and generate migration to match the target schema.
    # The target schema is derived from entity classes generated earlier.
    # Don't forget to compile your entity classes beforehand!
    npx squid-typeorm-migration generate
    
    # Create template file for custom database changes
    npx squid-typeorm-migration create
    
    # Apply database migrations from `db/migrations`
    npx squid-typeorm-migration apply
    
    # Revert the last performed migration
    npx squid-typeorm-migration revert

    On top of this, we need to generate the Hasura metadata. Instead of the npx commands listed above, we can use these sqd shortcuts:

    # Export Hasura metadata with compiled js script `lib/generateHasuraMetadata.js`
    sqd hasura:export
    
    # Apply metadata with command `hasura metadata apply`
    sqd hasura:apply

    5. Generate TypeScript definitions for substrate events, calls and storage

    This part is optional, but strongly advisable.

    Event, call and runtime storage data come to mapping handlers as raw untyped JSON. While it is possible to work with raw untyped JSON data, it is extremely error-prone and the JSON structure may change over time due to runtime upgrades.

    Squid framework provides a tool for generating type-safe wrappers around events, calls and runtime storage items for each historical change in the spec version. See the Substrate typegen documentation page.
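A typegen.json sketch is shown below, assuming the gdev-metadata.jsonl file produced in the "Runtime update process" section further down; the field layout follows the Substrate typegen documentation, and Balances.Transfer is just an example event, not necessarily one indexed by this project:

```json
{
  "outDir": "src/types",
  "specVersions": "gdev-metadata.jsonl",
  "events": ["Balances.Transfer"],
  "calls": [],
  "storage": []
}
```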

    Deploy the Squid

    See https://duniter.org/wiki/duniter-v2/indexers/duniter-squid/

    More

    Runtime update process

    To update the runtime metadata:

    pnpm install --global @subsquid/substrate-metadata-explorer
    squid-substrate-metadata-explorer --rpc ws://127.0.0.1:9944 --out gdev-metadata.jsonl

    Network update process

    To update the assets (new network):

    # copy the links from https://git.duniter.org/nodes/rust/duniter-v2s/-/releases/
    # 1. genesis
    wget https://nodes.pages.duniter.org/-/rust/duniter-v2s/-/jobs/120948/artifacts/release/gdev.json -O ./assets/gdev.json
    # 2. tx history
    wget https://nodes.pages.duniter.org/-/rust/duniter-v2s/-/jobs/126142/artifacts/release/gdev-indexer.json -O ./assets/gdev-indexer.json
    jq ".transactions_history" ./assets/gdev-indexer.json > ./assets/history.json
    rm ./assets/gdev-indexer.json

    Note: for now gdev-indexer.json bundles everything into one huge file, but here we prefer to split it into several files, hence the jq line extracting "transactions_history". The Duniter CI needs to be updated accordingly.
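The jq extraction can be sanity-checked on a tiny fabricated file (the JSON content below is illustrative, not real ĞDev data):

```shell
# build a minimal stand-in for gdev-indexer.json (illustrative content only)
echo '{"genesis": {}, "transactions_history": {"5Alice": []}}' > /tmp/gdev-indexer.json

# extract the history object into its own file, as in the commands above
jq ".transactions_history" /tmp/gdev-indexer.json > /tmp/history.json

# the extracted file contains only the history object
cat /tmp/history.json
```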

    Scripts

    See the scripts/ directory.