Proof of work allows machines sharing a database in a peer-to-peer environment to synchronize with each other. In Duniter’s case, our database is of course the blockchain, which acts as a ledger keeping a record of all transactions, the status of the web of trust and much more.

How can we let several machines add data (i.e. a transaction) at the same time?

In addition, how do we settle on how much time has gone by since the blockchain was last updated? This is of the utmost importance: we need to know when it’s time to create the Universal Dividend, but also to keep track of membership and certification expiries, etc.

Proof-of-work provides a clever solution to both problems:

1. Any machine can write into the blockchain (= mine a new block), but it is only authorised to do so if it has previously solved a mathematical challenge requiring the said machine to do some work. The challenge has to be hard enough that no two machines solve it at the same time, ensuring that each block has a unique creator.

2. Solving this challenge takes a certain amount of time, which depends on the computing power of the network as a whole. This provides a common ground for defining the time reference that we need. All we have to do after that is set a block time (e.g. 1 block = 5 min) and adapt the difficulty of the challenge so that a new block is created every 5 minutes on average.

## Only members can “mine”

One of Duniter’s major differences with other PoW-based cryptocurrencies is that only members are allowed to author blocks.

Each block is signed with the member’s private key, allowing the algorithm to determine a personalised difficulty. Without this mechanism in place, the Ğ1 would be just as asymmetrical and ‘non-libre’ as Bitcoin.

This personalised difficulty also eliminates the rat race for ever more sophisticated mining equipment. Another benefit is that no “supercomputer” could take control of the blockchain. Lastly, the personalised difficulty means there is a rotation among mining members. This lightweight PoW is also very friendly to the environment: members can mine with anything from a Raspberry Pi to a privacy-first Internet Cube.

A block hash starting with five zeros, for example, is very hard to achieve and takes a lot of *work* for someone’s computer. Hence the term “*proof of work*”.

### The common difficulty

In order to settle on a “yardstick” for our time unit, we need a common difficulty whose role is to make sure that the blockchain moves forward at a steady pace: 1 block every `avgGenTime` seconds, `avgGenTime` being one of the 20 parameters of the Duniter protocol.

This difficulty’s initial value can be set arbitrarily (`70` in Duniter `v1.5.x`). It then acts like a spring, regulating block creation: the difficulty increases when blocks are created in less than `avgGenTime` and decreases when they take longer.
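As a purely illustrative sketch of this spring behaviour (the actual DUP adjustment rules are more involved, and `adjust_difficulty` is a name of our own choosing):

```python
def adjust_difficulty(difficulty: int, observed_avg_time: float, avg_gen_time: float) -> int:
    """Illustrative spring: raise the difficulty when blocks arrive
    faster than avgGenTime, relax it when they arrive more slowly."""
    if observed_avg_time < avg_gen_time:
        return difficulty + 1
    if observed_avg_time > avg_gen_time:
        return max(1, difficulty - 1)
    return difficulty
```

For instance, with `avgGenTime = 300` seconds, blocks averaging 240 seconds would push the difficulty up, and blocks averaging 360 seconds would pull it down.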

#### How is difficulty applied?

The numeric value of the difficulty maps to a subset of all possible hashes. In Duniter v1.5.x, a block’s hash is its hexadecimal sha256 digest, which means each character of the hash can only take one of 16 values: digits 0 to 9 and letters A to F.

To apply the difficulty, we perform a Euclidean division of its value by 16.

Here’s an example with a value of `70`: 70 // 16 = **4** with a remainder of **6**. Valid hashes are those starting with four zeros and whose 5<sup>th</sup> character is less than or equal to 9, because we start at `F` (index 0) and go down from there: `E(1), D(2), C(3), B(4), A(5), 9(6)`. Valid hashes therefore start with `0000[0-9]`.
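This division can be turned into a small check. The sketch below follows the rules just described (hexadecimal sha256 hash, division by 16); the function names are ours, not Duniter’s:

```python
def hash_requirement(difficulty: int) -> tuple[int, str]:
    """Split the difficulty by Euclidean division by 16: the quotient is
    the number of leading zeros required, and the remainder picks the
    highest allowed value for the next character, counting down from 'F'."""
    zeros, remainder = divmod(difficulty, 16)
    max_digit = "FEDCBA9876543210"[remainder]
    return zeros, max_digit

def hash_is_valid(block_hash: str, difficulty: int) -> bool:
    """True when an uppercase hex hash satisfies the difficulty constraint.
    (Comparing characters works because '0'-'9' sort before 'A'-'F'.)"""
    zeros, max_digit = hash_requirement(difficulty)
    return block_hash.startswith("0" * zeros) and block_hash[zeros] <= max_digit
```

With a difficulty of `70`, `hash_requirement` yields `(4, "9")`: four leading zeros, then a character between 0 and 9, i.e. the `0000[0-9]` pattern above.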

> Fine, but the hash of a mined block will never change and there’s no reason it should start with a given sequence of numbers. So how then can we make sure a block’s hash starts with exactly the sequence needed?

Enter the nonce, short for “number used once”. When a member is mining a new block, their computer freezes the block’s content and changes the nonce until the hash reaches the required number of zeros.
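The freeze-and-retry loop can be sketched as follows. This is a toy model (plain sha256 over a string and a zero-prefix target only), not Duniter’s actual block format:

```python
import hashlib

def mine(block_content: str, required_zeros: int, max_attempts: int = 1_000_000):
    """Keep the block's content frozen and vary only the nonce until the
    hex sha256 digest starts with the required number of zeros."""
    target = "0" * required_zeros
    for nonce in range(max_attempts):
        digest = hashlib.sha256(f"{block_content}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
    return None  # gave up before finding a valid hash
```

Each extra required zero multiplies the expected number of attempts by 16, which is why the work is hard to fake but trivial to verify.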

### The Nonce

The nonce allows us to mine a new block by finding a hash. The hash’s value in turn lets us determine the difficulty level of the proof-of-work performed.

Examples of possible Nonce values:

* 10100000112275

* 10300000288743

* 10400000008538

* 10700000079653

* 10300000070919

In reality the `Nonce` value follows a pre-determined format akin to `XYY00000000000`. The Nonce’s value isn’t the number of attempts but rather a value within a set of possible ones. This is how the Nonce is built:

* X is a number assigned to a specific peer. Suppose someone runs several nodes sharing the same private key: if they all mined the same block, their nonces could collide. Each node therefore has its own unique X to prevent this from happening.

* Y is the number of cores of the processor. A Nonce starting with `107[…]` belongs to a 7-core processor, while `199[...]` could be a proof generated by a 99-core processor.

The rest of the Nonce, the part after the XYY, is the numerical space for this individual node and is unique to each of the CPU’s cores. This space comprises 11 digits (`00000000000`). For accuracy’s sake, we use the term CPU in its wider sense; it can be understood as a bi-CPU, for example. What we take into consideration for the resulting PoW is the number of cores.
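The `XYY00000000000` layout described above can be sketched like this (`build_nonce` is a hypothetical helper, not a Duniter function):

```python
def build_nonce(peer_number: int, core_count: int, attempt: int) -> int:
    """Assemble a nonce of the form XYY00000000000: X identifies the
    peer, YY the number of cores, and the last 11 digits are the
    numerical space explored during mining."""
    assert 1 <= peer_number <= 9      # X: one digit identifying the node
    assert 0 <= core_count <= 99      # YY: up to two digits
    assert 0 <= attempt < 10 ** 11    # 11-digit search space
    return peer_number * 10 ** 13 + core_count * 10 ** 11 + attempt
```

`build_nonce(1, 7, 79653)` reproduces the example value `10700000079653` from the list above: peer `1`, a 7-core processor, and `79653` as the current position in the search space.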

### Personalised difficulty

Earlier in this article, we explained that the personalised difficulty is the new and key concept that sets Duniter apart from other *PoW-based* cryptocurrencies such as Bitcoin.

Here is how this personalised difficulty is calculated and assigned:

It is determined by a combination of two different constraints with complementary roles: the **exclusion factor** and the **handicap**.

Let `powMin` be the common difficulty, `exFact` a member’s exclusion factor and `handicap` their handicap. This member’s personalised difficulty `diff` is:

`diff = powMin * exFact + handicap`

#### Understanding `exFact` , the exclusion factor

Members who have never produced blocks or haven’t for quite some time are assigned an exclusion factor of `1`. Their personalised difficulty is therefore simply the sum of `powMin + handicap`.

Before reading on, one must grasp the role of this exclusion factor. When a member adds a block to the chain, his factor jumps up from 1 to a very high value, to prevent him/her from mining other blocks immediately after and taking control of the blockchain.

The exclusion factor then rapidly returns to 1. How fast this occurs is expressed in a number of blocks and is calculated as a proportion of the number of members mining. In the Ğ1’s case, this proportion is 1/3: if there are 15 members currently mining, a member’s exclusion factor drops back to 1 after 5 blocks.

> Hold on, can you detail what exactly is intended by “the number of members mining”?

Very good question. Here we mean the number of members trying to mine the next block. In reality, there is no way of knowing precisely how many members are calculating at any given time: it’s impossible to view the entire network, and there are ways to make oneself invisible on it. In spite of this, we have to agree on a formula to estimate this number, without which it would be impossible to assign a personalised difficulty.

The way we currently do this is by looking back at the chain and assuming that the number of members mining is the number of those who have found at least 1 block in the X blocks we are looking at, minus the very last one.

> So how do you choose X?

We use the concept of current window, X’s value is equal to the size of this window. Let’s see how it works:

`issuersFrame` is the size of the current window in blocks, and `issuersCount` the number of members who have calculated at least one block during the current window. When a blockchain is first started, the very first block has `issuersFrame=1` and `issuersCount=0`: since the genesis block is itself the very last block at that precise point in time, it is excluded, and there are no members in the current window!

From the second block onwards (block #1) we track the variation of `issuersCount`. The member having mined block #0 enters the current window and in block #1 we will therefore mention `issuersCount=1`.

`issuersFrame` then varies as follows:

If `issuersCount` increases by X (with a maximum step of X = 1 per block), then `issuersFrame` will increase by one unit per block over the 5X following blocks. Conversely, if `issuersCount` decreases by Y (with Y = 2 at most: the current window inching forward plus the loss of one calculating member), then `issuersFrame` will decrease by one unit per block during 5Y blocks. The effects compound over time: if a new member starts calculating at block `T` and another stops at `T+1`, `issuersFrame` will increase by 1 unit at `T+1`, then decrease by 1 unit at `T+2`, to then settle at that value. The calculation can be found under rules [BR_G05](https://github.com/duniter/duniter/blob/master/doc/Protocol.md#br_g05---headissuersframe) and [BR_G06](https://github.com/duniter/duniter/blob/master/doc/Protocol.md#br_g06---headissuersframevar) of the DUP protocol.
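A simplified sketch of one block-to-block update step, under our own reading of rules BR_G05 and BR_G06 (all names are hypothetical; the protocol text is authoritative):

```python
def next_frame(frame: int, frame_var: int, new_issuers: int, gone_issuers: int):
    """One simplified update step: each member entering the window
    schedules a +1-per-block adjustment over 5 blocks (var += 5), each
    member leaving schedules the opposite (var -= 5), and the frame
    absorbs the pending variation one unit per block."""
    var = frame_var + 5 * new_issuers - 5 * gone_issuers
    if var > 0:
        return frame + 1, var - 1
    if var < 0:
        return frame - 1, var + 1
    return frame, 0
```

Starting from `(frame=1, var=0)`, a newcomer gives `(2, 4)`; if a member then leaves at the next block, the pending variations cancel and the frame settles back, matching the compounding example described above.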

> Let’s go back to our personalised difficulty for a second !

We’ve explained that `exFact` spikes immediately after the member has found a block, then rapidly decreases to `1` after a number of blocks equal to 1/3 of the number of calculating members. Let’s see precisely how we calculate `exFact`:

Let `nbPreviousIssuers` be the value of the `issuersCount` field at the last block found by the member, and `nbBlocksSince` the number of blocks found by the rest of the network since then.

The exclusion factor is computed as `exFact = MAX(1, FLOOR(0.67 * nbPreviousIssuers / (1 + nbBlocksSince)))`, FLOOR being a simple truncation function. For `exFact` to exclude the member, the ratio `0.67 * nbPreviousIssuers / (1 + nbBlocksSince)` has to be greater than or equal to 2. As soon as `nbBlocksSince` reaches roughly 1/3 of the calculating members (`0.33 * nbPreviousIssuers`), the ratio falls below 2 and the member is no longer excluded from calculating the next block. With 6 calculating members, for example, `0.67 * 6 / (1 + 1) = 2.01 ≥ 2` (still excluded one block later), whereas `0.67 * 6 / (1 + 3) ≈ 1.005 < 2` (no longer excluded three blocks later).
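The exclusion factor described here can be sketched directly, assuming the DUP formula `FLOOR(0.67 * nbPreviousIssuers / (1 + nbBlocksSince))` floored at 1 (variable names follow the text):

```python
from math import floor

def ex_fact(nb_previous_issuers: int, nb_blocks_since: int) -> int:
    """Exclusion factor: the truncated ratio, floored at 1 so that it
    never lowers the member's difficulty below the common one."""
    ratio = 0.67 * nb_previous_issuers / (1 + nb_blocks_since)
    return max(1, floor(ratio))
```

With 6 calculating members: `ex_fact(6, 0)` is 4 (the member just mined a block), `ex_fact(6, 1)` is still 2, and `ex_fact(6, 3)` is back to 1.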

On the other hand, if the last block was authored by the said member then `nbBlocksSince=0` and `exFact` is equal to `0.67 * nbPreviousIssuers`, which increases according to the number of members calculating. Just imagine how high the number would be if hundreds or thousands of members were calculating together. One’s difficulty would spike so high that even the most powerful of supercomputers or mining farm would be excluded. We have therefore succeeded in our intent to deter attempts to seize the blockchain and its currency.

However, at any time t (no, not tea time 🍵 ), the two-thirds of calculating members all have an exclusion factor of `1`, even though they might not all have the same computing power at hand. If the personalised difficulty only took into account the exclusion factor then only the members with the highest computational power from the remaining third would be able to author new blocks and the other 2/3s would almost always be excluded. Machines such as Raspberry PIs wouldn’t stand a chance …

#### The handicap

The handicap is the second parameter of the personalised difficulty. Its main role is to improve the rotation mechanism between calculating peers by assigning a higher handicap to members that have higher calculating power to afford better chances to lesser machines and decrease the currency’s environmental footprint.

In defining the handicap’s value, we take into account the median number of blocks authored per peer within the current window. The goal is to handicap the half that has authored the most blocks so as to favour the other half.

Let `nbPersonalBlocksInFrame` be the number of blocks authored by a single member within the current window and `medianOfBlocksInFrame` the median number of blocks written by the calculating members during the same timeframe.

`(nbPersonalBlocksInFrame + 1) / medianOfBlocksInFrame` is simply the ratio between the number of blocks authored by the peer and the median number of blocks.

For example, assume that a peer has authored `9` blocks in the current window and that the median is at `5`; the ratio will then be `(9+1)/5 = 2`. The MAX function ensures that this ratio is at least equal to `1`, so that its logarithm is never negative.

We then take the natural logarithm (LN) of this ratio to prevent the handicap from becoming excluding: its role is to level the calculating field so that all peers stand a chance, not to exclude anyone.

If we wanted the handicap to apply as soon as the median is reached, we would then divide by `LN(1)`. The problem is that we have already floored the ratio at `1` with the MAX function, so dividing by `LN(1)` would give every calculating peer a handicap >= `1`. Besides, is it really fair to handicap a member who sits exactly on the median?


That’s why we went for `1.189` rather than `1`: you have to be at least `18.9%` above the median to be assigned a handicap (leaving aside the +1 in the formula, which becomes negligible as the number of calculating peers increases).

Why `18.9%`, you might ask?

It’s actually 16^(1/16), the difficulty factor between two levels of the proof-of-work, whose hash is hexadecimal.
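Putting the pieces together (the ratio to the median, the MAX function, the natural logarithm and the `1.189` divisor), the handicap can be sketched as follows, assuming the combination described in the text:

```python
from math import floor, log

def handicap(nb_personal_blocks_in_frame: int, median_of_blocks_in_frame: int) -> int:
    """Handicap indexed on the log of the ratio to the median: the MAX
    keeps the ratio at 1 or above so the log is never negative, and
    LN(1.189) sets the 18.9% activation threshold."""
    ratio = max(1.0, (nb_personal_blocks_in_frame + 1) / median_of_blocks_in_frame)
    return floor(log(ratio) / log(1.189))
```

With the earlier example (`9` blocks authored against a median of `5`, i.e. a ratio of 2), the handicap comes out at 4, while a member at or below the median gets 0.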

To conclude, you have to remember that the handicap is indexed on the logarithm of the ratio to the median and that it is only applied on members whose ratio to the median is greater than the ratio between two levels of the proof-of-work’s difficulty.