Compare commits

...

141 Commits
dev ... main

Author SHA1 Message Date
Ladd Hoffman 3c4a5cbb7a started on tests for lightweight bench 2024-06-29 12:57:07 -05:00
Ladd Hoffman da20410f87 update DAO to use global forum 2024-06-28 13:44:18 -05:00
Ladd Hoffman ed928043ed move private structs inside bench contract 2024-06-28 10:46:52 -05:00
Ladd Hoffman 282d9478df add global forum contract 2024-06-28 10:46:21 -05:00
Ladd Hoffman ef19b9bd66 add lightweight bench contract 2024-06-28 10:46:04 -05:00
Ladd Hoffman 6b37cead66 future work: reduce on-chain costs 2024-06-19 17:58:10 -05:00
Ladd Hoffman abc7cf692c links to test cases 2024-06-19 17:49:20 -05:00
Ladd Hoffman fcdaa3e3b4 add notes from discussion 2024-06-19 17:43:08 -05:00
Ladd Hoffman 467ec9b13c availability stake wording 2024-06-13 15:23:58 -05:00
Ladd Hoffman ff74de6b23 reputation can be assigned to a contract address 2024-06-13 15:19:32 -05:00
Ladd Hoffman 6646c7643f link to wiki 2024-06-13 09:56:03 -05:00
Ladd Hoffman 425c334eab remove gitea action 2024-06-10 20:02:46 -05:00
Ladd Hoffman 7a60015726 user stories: new member
Gitea Actions Demo / Explore-Gitea-Actions (push) Failing after 28s Details
2024-06-10 20:01:22 -05:00
Ladd Hoffman 2c1dfa6753 user stories: initialize DAO 2024-06-10 17:10:31 -05:00
Ladd Hoffman 7aca83714f arbitration: more text 2024-06-10 16:56:56 -05:00
Ladd Hoffman 19967d0827 user stories: deploy DAO 2024-06-10 16:56:48 -05:00
Ladd Hoffman 9bb6340636 arbitration: appeal 2024-06-10 16:07:44 -05:00
Ladd Hoffman 06be48fa0b arbitration 2024-06-10 15:48:21 -05:00
Ladd Hoffman 918a2bc701 capitalize Batch Worker 2024-06-10 12:38:35 -05:00
Ladd Hoffman 3023408e04 compute references for batch post 2024-06-10 12:31:48 -05:00
Ladd Hoffman a81e637907 sys design: Onboarding 2024-06-10 11:29:09 -05:00
Ladd Hoffman 9375e8f0db minor 2024-06-05 20:34:19 -05:00
Ladd Hoffman 774ede921f minor 2024-06-05 20:32:46 -05:00
Ladd Hoffman 844f7217f1 sys design: matrix pools 2024-06-05 20:11:38 -05:00
Ladd Hoffman efdf838a41 sys design: rollup details 2024-06-04 10:51:31 -05:00
Ladd Hoffman e2f47f0be2 sys design: more forum details 2024-06-04 10:01:02 -05:00
Ladd Hoffman f537ea93e5 add user stories section 2024-06-03 09:02:55 -05:00
Ladd Hoffman 30792ee479 sys design: more VP details 2024-05-21 16:33:05 -05:00
Ladd Hoffman e54339aecf more valdiation pool details 2024-05-21 15:44:44 -05:00
Ladd Hoffman 5a407efd2d sys design: api read 2024-05-20 16:49:21 -05:00
Ladd Hoffman ee6730e13f sys design: api write 2024-05-20 16:41:42 -05:00
Ladd Hoffman 94778d711c fixup ss import 2024-05-20 16:31:04 -05:00
Ladd Hoffman 75ba4dd349 rename citation -> reference 2024-05-20 15:26:28 -05:00
Ladd Hoffman 6376a37d9b stubs for sys design sections 2024-05-20 08:54:38 -05:00
Ladd Hoffman c8a512a401 remove superfluous text 2024-05-19 17:57:36 -05:00
Ladd Hoffman fd0c4896c3 add arrow Work --> Rollup 2024-05-19 17:56:10 -05:00
Ladd Hoffman c1cd822a92 fixup links 2024-05-19 17:53:57 -05:00
Ladd Hoffman 6ae2b6d472 system design for contracts 2024-05-19 17:52:46 -05:00
Ladd Hoffman 3c05c32dff flatten sys design section 2024-05-19 17:05:13 -05:00
Ladd Hoffman e5af69de76 formatting 2024-05-19 16:24:23 -05:00
Ladd Hoffman fe165184f3 more REP reqs 2024-05-19 16:19:52 -05:00
Ladd Hoffman 1d1641ac36 More rep requirement details 2024-05-19 15:45:26 -05:00
Ladd Hoffman 828de931d0 more general reqs for reputation 2024-05-19 15:36:26 -05:00
Ladd Hoffman 60231f8913 add wiki link to readme 2024-05-19 13:42:19 -05:00
Ladd Hoffman f054a04724 add links to readme 2024-05-19 13:41:01 -05:00
Ladd Hoffman 5f5a0c7150 fixup 2024-05-16 21:39:24 -05:00
Ladd Hoffman 9dde941574 fixup 2024-05-16 21:37:52 -05:00
Ladd Hoffman d45f46843e sys design: organizing and add diagram 2024-05-16 21:25:04 -05:00
Ladd Hoffman 22c62d9953 spec: system design: contracts and matrix, stub 2024-05-16 20:12:30 -05:00
Ladd Hoffman b11f777362 Rollup requirements 2024-05-16 12:26:27 -05:00
Ladd Hoffman 40da0a0a2c more VP details 2024-05-15 17:02:32 -05:00
Ladd Hoffman 7ad2cf5bf7 docs 2024-05-15 16:48:08 -05:00
Ladd Hoffman 856b56238c requirements: core contracts 2024-05-15 16:33:53 -05:00
Ladd Hoffman 3a5736f6a3 judicial requirements 2024-05-15 15:29:50 -05:00
Ladd Hoffman eac3605c62 Intro paragraph for executive reqs 2024-05-15 14:25:39 -05:00
Ladd Hoffman 6272d5c153 requirements numbering 2024-05-15 14:16:50 -05:00
Ladd Hoffman 66fe02221b requirements: executive 2024-05-15 14:15:51 -05:00
Ladd Hoffman 0b63aac805 Add terminology 2024-05-15 13:47:21 -05:00
Ladd Hoffman b098ec5764 need index.md to generate index.html 2024-05-15 13:28:17 -05:00
Ladd Hoffman 4e48613de9 initial stub for specification document 2024-05-15 13:13:07 -05:00
Ladd Hoffman f64eba070c allow VP with no fee 2024-05-15 11:35:26 -05:00
Ladd Hoffman d9479152da only stake half available REP if an error was encountered when evaluating a VP 2024-05-15 11:34:54 -05:00
Ladd Hoffman 72e16651fb minor refactor, add additional subdir for rollup 2024-05-09 12:06:09 -05:00
Ladd Hoffman f266f2c261 Merge branch 'main-local' 2024-05-09 11:47:54 -05:00
Ladd Hoffman f7dcf0ec5e rollup: use time rather than count for interval; add resetBatchWorker method 2024-05-09 11:47:05 -05:00
Ladd Hoffman 081b360837 Reposition note in diagram 2024-05-08 19:41:26 -05:00
Ladd Hoffman 837b972149 add rollup docs 2024-05-08 18:41:01 -05:00
Ladd Hoffman 2190ac3aaf fix typo in filename 2024-05-07 15:08:11 -05:00
Ladd Hoffman 33b7111b2c fix batch items record keeping 2024-05-07 15:07:45 -05:00
Ladd Hoffman 29087d37bd fix typo in filename 2024-05-07 15:04:48 -05:00
Ladd Hoffman 4d8889e1af Update backend/src/util/gate-by-proposal.js (fix typo in console log) 2024-05-06 15:10:44 -05:00
Ladd Hoffman 86d7fa921a fixup batch items record keeping 2024-05-05 12:30:24 -05:00
Ladd Hoffman efef9e8169 Command to restart a matrix pool 2024-05-04 20:05:39 -05:00
Ladd Hoffman 4fe17240a6 logging 2024-05-04 19:57:11 -05:00
Ladd Hoffman 1e1686619d attempt to fix batch ordering 2024-05-04 19:52:44 -05:00
Ladd Hoffman 0617fea89e more logging 2024-05-04 19:30:49 -05:00
Ladd Hoffman 75919de640 More logging, and added a test case 2024-05-04 19:24:22 -05:00
Ladd Hoffman aab9998462 More logging 2024-05-04 18:20:02 -05:00
Ladd Hoffman ea56cefe90 fixup filename 2024-05-04 18:10:25 -05:00
Ladd Hoffman 50a0a5d4d8 Use promise rather than callback mode for fastq 2024-05-04 17:50:56 -05:00
Ladd Hoffman 890d5d3c14 sepolia deploy 2024-05-03 20:49:37 -05:00
Ladd Hoffman 981dda97b1 Fixes for matrix pool batching 2024-05-03 20:43:58 -05:00
Ladd Hoffman f9f8fc8f5f remove extraneous export 2024-05-03 14:41:58 -05:00
Ladd Hoffman 8be06678d7 null result -> abstain from vote 2024-05-03 14:30:24 -05:00
Ladd Hoffman d3f4740422 refactored rollup for code clarity 2024-05-03 14:18:20 -05:00
Ladd Hoffman 69c869a693 pause matrix outbound queue until target room id is set 2024-05-03 13:19:20 -05:00
Ladd Hoffman f7b1bfcb3b verify batch item identifiers on-chain at batch submit 2024-05-03 13:10:25 -05:00
Ladd Hoffman 7823b97c60 logging 2024-05-02 20:38:37 -05:00
Ladd Hoffman a2aa5c8134 log before submitBatch 2024-05-02 20:18:59 -05:00
Ladd Hoffman ff42215809 reset batch will send matrix pool start if needed 2024-05-02 20:10:41 -05:00
Ladd Hoffman 36697e1f6f additional logging 2024-05-02 19:51:25 -05:00
Ladd Hoffman 7d9734318a fixup console log of batch post id 2024-05-02 19:36:04 -05:00
Ladd Hoffman 233dcee11f retry contract calls 2024-05-02 19:32:52 -05:00
Ladd Hoffman 6911c04684 fixup, require bluebird 2024-05-02 19:20:11 -05:00
Ladd Hoffman 8944212c03 sepolia deploy 2024-05-02 19:11:13 -05:00
Ladd Hoffman 2a4b9ef744 fixup, include roomId and eventId in matrixPool record 2024-05-02 19:11:01 -05:00
Ladd Hoffman f7afd0105f rollup is working 2024-05-02 19:08:53 -05:00
Ladd Hoffman 9c95e813a2 try/catch 2024-05-02 14:00:56 -05:00
Ladd Hoffman 01cf630a20 fixup logic for stakeAvailability allowance values 2024-05-02 13:57:32 -05:00
Ladd Hoffman c16db92e27 rework availability contract to keep one active stake per worker 2024-05-02 13:44:37 -05:00
Ladd Hoffman 2790c9262b comments 2024-05-02 12:01:14 -05:00
Ladd Hoffman 073f6e61aa first cut at backend part of rollup implementation 2024-05-01 22:25:04 -05:00
Ladd Hoffman 33a458aba1 backend: refactor bot commands and identity registration into topics 2024-04-30 22:27:21 -05:00
Ladd Hoffman b3371c84d9 backend, slight refactor; we already had a wallet available to export 2024-04-30 19:22:55 -05:00
Ladd Hoffman 62741ead01 enable staking 2024-04-30 17:53:57 -05:00
Ladd Hoffman 4e9ebfa7c3 fixup backend 2024-04-30 17:32:55 -05:00
Ladd Hoffman fb3f842355 send http 202 accepted if import is taking a long time 2024-04-30 16:57:34 -05:00
Ladd Hoffman 40c1fd43d7 deploy core contracts separately due to size limit 2024-04-30 15:56:43 -05:00
Ladd Hoffman 5aee8c9314 staking enable/diable fixup 2024-04-29 18:13:24 -05:00
Ladd Hoffman e709a1bc67 import from matrix: fixup 2024-04-29 17:51:51 -05:00
Ladd Hoffman 337c4824fd import from matrix: sign as sender 2024-04-29 17:21:03 -05:00
Ladd Hoffman 276677e1c8 stub for backend automatic staking 2024-04-29 17:00:56 -05:00
Ladd Hoffman 0843a3279d frontend: refactor to consolidate main tabs as a component 2024-04-28 20:27:25 -05:00
Ladd Hoffman 2c5fb00180 rename WorkContract to Work 2024-04-28 18:11:20 -05:00
Ladd Hoffman fe0326bf2c rename contentId to postId 2024-04-28 18:06:26 -05:00
Ladd Hoffman 9702626e0e add batch size check in rollup contract 2024-04-28 18:02:39 -05:00
Ladd Hoffman 08dee20a29 delegated stake minor improvements 2024-04-28 17:45:07 -05:00
Ladd Hoffman 37fd387cf3 add post separate from propose 2024-04-28 16:51:35 -05:00
Ladd Hoffman 8e272bf2e8 refactor and stub for rollup 2024-04-28 15:06:10 -05:00
Ladd Hoffman b175a34b9f project architecture documentation 2024-04-28 14:19:23 -05:00
Ladd Hoffman 696a6d53b6 Merge pull request 'minor refactor' (#10) from Virtual-branch into main (Reviewed-on: #10) 2024-04-27 19:18:05 -05:00
Ladd Hoffman d568c08cce minor refactor 2024-04-27 19:19:53 -05:00
Ladd Hoffman 6cca9a1d95 Merge pull request 'remove unneeded comments' (#9) from Virtual-branch into main (Reviewed-on: #9) 2024-04-27 18:55:25 -05:00
Ladd Hoffman 27efdafc94 remove unneeded comments 2024-04-27 18:56:38 -05:00
Ladd Hoffman 551e6dfa54 fix papers import 2024-04-27 18:38:01 -05:00
Ladd Hoffman 5e580768e9 formatting 2024-04-27 18:09:06 -05:00
Ladd Hoffman 2399294e13 destructure hash property of write result 2024-04-27 18:07:41 -05:00
Ladd Hoffman 93345f6152 remove infura key 2024-04-27 17:25:43 -05:00
Ladd Hoffman 53fcc2f2d3 Merge pull request 'frontend: add infura api key' (#7) from frontend-add-infura-api-key- into main (Reviewed-on: #7) 2024-04-27 17:19:03 -05:00
Ladd Hoffman b1d1ea6d7a frontend: add infura api key 2024-04-27 17:03:16 -05:00
Ladd Hoffman d2d781aef3 send reply notice when importing matrix post 2024-04-27 16:11:10 -05:00
Ladd Hoffman 0b6d147f9c matrix event import is working 2024-04-27 15:59:36 -05:00
Ladd Hoffman c497b55294 consolidate more into outbound queue 2024-04-27 13:49:27 -05:00
Ladd Hoffman de4ce0da5e update readme for backend 2024-04-27 13:40:11 -05:00
Ladd Hoffman f7bd1fc67b refactor matrix.js into separate files 2024-04-27 13:37:16 -05:00
Ladd Hoffman 17e8a559cf backend: refactor files into directories 2024-04-27 12:57:15 -05:00
Ladd Hoffman 44821a2556 remove outdated comments 2024-04-26 22:58:23 -05:00
Ladd Hoffman b1b59e6df5 ignore cited papers with no authors 2024-04-26 22:48:52 -05:00
Ladd Hoffman 6a644097fe indicate if a user was already registered 2024-04-26 21:56:20 -05:00
Ladd Hoffman 260de4e724 remove "watch rep" button 2024-04-26 19:52:50 -05:00
Ladd Hoffman 58dc26ee30 use wildcard for widget client origin 2024-04-26 19:45:33 -05:00
130 changed files with 10141 additions and 2894 deletions


@@ -1,19 +0,0 @@
name: Gitea Actions Demo
run-name: ${{ gitea.actor }} is testing out Gitea Actions 🚀
on: [push]
jobs:
Explore-Gitea-Actions:
runs-on: ubuntu-latest
steps:
- run: echo "🎉 The job was automatically triggered by a ${{ gitea.event_name }} event."
- run: echo "🐧 This job is now running on a ${{ runner.os }} server hosted by Gitea!"
- run: echo "🔎 The name of your branch is ${{ gitea.ref }} and your repository is ${{ gitea.repository }}."
- name: Check out repository code
uses: actions/checkout@v4
- run: echo "💡 The ${{ gitea.repository }} repository has been cloned to the runner."
- run: echo "🖥️ The workflow is now ready to test your code on the runner."
- name: List files in the repository
run: |
ls ${{ gitea.workspace }}
- run: echo "🍏 This job's status is ${{ job.status }}."

README.md

@@ -1,6 +1,110 @@
# DGF Prototype
Decentralized Governance Framework
* [Specification](https://spec.dgov.io)
* [Demo](https://demo.dgov.io)
* [Wiki](https://daogovernanceframework.com/wiki/DAO_Governance_Framework)
## Project Architecture
| directory | description |
| --------- | ----------- |
| ethereum | Solidity smart contracts and associated deploy scripts |
| backend | Node.js application with an HTTP API that also functions as a Matrix bot and Ethereum client |
| frontend | React.js frontend with a WebApp route and a Matrix Widget route |
### Data Flow Diagram
```mermaid
flowchart TD
Blockchain <-- ethers --> API
Blockchain <-- Web3<br>+ MetaMask --> WebApp
Blockchain <-- Web3<br>+ MetaMask --> Widget
WebApp <-- HTTPS --> API
Widget <-- HTTPS --> API
Widget <-- matrix-widget-api --> Matrix
API <-- matrix-bot-sdk --> Matrix
```
## Rollup
Instead of calling `DAO.initiateValidationPool()`, a contract can call `Rollup.addItem()`.
We demonstrate this by extending our base `Work` contract as `RollableWork`.
Our work contract normally triggers a validation pool when the customer submits work approval. With rollup enabled, the fee and worker availability stakes are instead transferred to the `Rollup` contract.
The `Rollup` contract itself uses the `Availability` contract to assign a batch worker. This worker is responsible for making sure off-chain pools are conducted.
When ready, the worker submits the current batch on-chain by calling `DAO.addPost()` to create a batch post and then calling `Rollup.submitBatch()`, which initiates a validation pool targeting the batch post.
```mermaid
sequenceDiagram
participant client as Staking client
participant matrix as Matrix room
box Blockchain
participant worker as Worker
participant customer as Customer
participant work as Work contract
participant rollup as Rollup contract
participant vp as Validation pool
participant forum as Forum
end
worker ->> work : Availability stake<br />(REP)
activate worker
activate work
customer ->> work : Request work<br />(fee)
activate customer
worker ->> work : Submit work evidence<br />(postId)
deactivate worker
customer ->> work : Submit work approval
deactivate customer
work ->> rollup : Add item<br />(fee, REP, postId)
activate rollup
deactivate work
rollup ->> client : Event: BatchItemAdded
activate client
client ->> matrix : io.dgov.pool.start<br />(postId)
activate matrix
matrix -->> client :
client ->> matrix : io.dgov.pool.stake<br />(postId, REP, inFavor)
matrix -->> client :
client ->> matrix : io.dgov.pool.result<br />(postId, votePasses, quorumMet)
matrix -->> client :
note right of client : All staking clients<br/>record each other's stakes
client ->> forum : Add post<br />(batchPostId)
activate forum
client ->> rollup : Submit batch<br />(batchPostId)
client ->> matrix : io.dgov.rollup.submit
matrix -->> client :
deactivate matrix
rollup ->> vp : Initiate validation pool<br />(fee, REP, batchPostId)
activate vp
note right of vp : Mints REP in<br />proportion to fee
deactivate rollup
vp ->> client : Event: ValidationPoolInitiated
note right of client : Each staking client <br />verifies the rollup post
client ->> vp : Stake for/against
client ->> vp : Evaluate outcome
vp ->> client : REP rewards for policing
deactivate client
vp ->> forum : Minted REP
deactivate vp
forum ->> worker : REP rewards for batch post authors
deactivate forum
```
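The batching flow above can be sketched off-chain. The following is a minimal, hypothetical JavaScript illustration (not the Solidity implementation): the `RollupBatch` class and its item fields are assumptions; only the method names mirror `Rollup.addItem()` and `Rollup.submitBatch()` described in this README.

```javascript
// Hypothetical off-chain sketch of the batching flow described above.
// Only the method names mirror the on-chain Rollup contract; the item
// fields and accounting here are illustrative, not the contract logic.
class RollupBatch {
  constructor() {
    this.items = [];
  }

  // Queue one work item's fee, staked REP, and forum postId
  // (what the Work contract hands to Rollup.addItem() instead of
  // initiating its own validation pool).
  addItem({ fee, rep, postId }) {
    this.items.push({ fee, rep, postId });
  }

  // Drain the queue and return the aggregate fee and stake that a
  // single validation pool would be initiated with for the batch post.
  submitBatch(batchPostId) {
    const totalFee = this.items.reduce((sum, item) => sum + item.fee, 0);
    const totalRep = this.items.reduce((sum, item) => sum + item.rep, 0);
    const batch = {
      batchPostId,
      totalFee,
      totalRep,
      postIds: this.items.map((item) => item.postId),
    };
    this.items = [];
    return batch;
  }
}
```

For example, two queued items with fees of 100 and 50 produce a single batch carrying a combined fee of 150, so one on-chain validation pool replaces two.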
## Local development setup
Clone this repository to a directory on your machine


@@ -10,4 +10,9 @@ MATRIX_PASSWORD=
MATRIX_ACCESS_TOKEN=
BOT_STORAGE_PATH="./data/bot-storage.json"
BOT_CRYPTO_STORAGE_PATH="./data/bot-crypto"
BOT_INSTANCE_ID=
ENABLE_API=
ENABLE_MATRIX=
ENABLE_STAKING=
START_PROPOSAL_ID=
STOP_PROPOSAL_ID=


@@ -1,6 +1,6 @@
module.exports = {
root: true,
env: { es2020: true },
env: { es2020: true, mocha: true },
extends: [
'eslint:recommended',
'airbnb',
@@ -11,7 +11,7 @@ module.exports = {
'import/no-extraneous-dependencies': [
'error',
{
devDependencies: false,
devDependencies: ['**/*.test.js'],
optionalDependencies: false,
peerDependencies: false,
},

backend/.mocharc.js (new file)

@@ -0,0 +1,3 @@
module.exports = {
spec: ['src/**/*.test.js'],
};


@@ -1,17 +1,19 @@
# Setup
1. Prepare the environment variables

       cp .env.example .env

1. Install packages so that we can run the login script (next command)

       npm install

1. Run the login script

       npm run login

1. Edit `.env` to include the `MATRIX_AUTH_TOKEN` output from the previous command
1. Build and start the docker container

       docker compose up -d --build


@@ -1,14 +1,24 @@
{
"localhost": {
"DAO": "0x57BDFFf79108E5198dec6268A6BFFD8B62ECfA38",
"Work1": "0xB8f0cd092979F273b752FDa060F82BF2745f192e",
"Onboarding": "0x8F00038542C87A5eAf18d5938B7723bF2A04A4e4",
"Proposals": "0x6c18eb38b7450F8DaE5A5928A40fcA3952493Ee4"
"DAO": "0x3734B0944ea37694E85AEF60D5b256d19EDA04be",
"Work1": "0x8BDA04936887cF11263B87185E4D19e8158c6296",
"Onboarding": "0x8688E736D0D72161db4D25f68EF7d0EE4856ba19",
"Proposals": "0x3287061aDCeE36C1aae420a06E4a5EaE865Fe3ce",
"Rollup": "0x71cb20D63576a0Fa4F620a2E96C73F82848B09e1",
"Work2": "0x76Dfe9F47f06112a1b78960bf37d87CfbB6D6133",
"Reputation": "0xEAefe601Aad7422307B99be65bbE005aeA966012",
"Forum": "0x79e365342329560e8420d7a0f016633d7640cB18",
"Bench": "0xC0f00E5915F9abE6476858fD1961EAf79395ea64"
},
"sepolia": {
"DAO": "0x8e5bd58B2ca8910C5F9be8de847d6883B15c60d2",
"Work1": "0x1708A144F284C1a9615C25b674E4a08992CE93e4",
"Onboarding": "0xb21D4c986715A1adb5e87F752842613648C20a7B",
"Proposals": "0x930c47293F206780E8F166338bDaFF3520306032"
"DAO": "0xBA2e65ae29667E145343bD5Fd655A72dcf873b08",
"Work1": "0x251dB891768ea85DaCA6bb567669F97248D09Fe3",
"Onboarding": "0x78FC8b520001560A9D7a61072855218320C71BDC",
"Proposals": "0xA888cDC4Bd80d402b14B1FeDE5FF471F1737570c",
"Reputation": "0x62cc0035B17F1686cE30320B90373c77fcaA58CD",
"Forum": "0x51b5Af12707e0d879B985Cb0216bFAC6dca85501",
"Bench": "0x98d9F0e97Af71936747819040ddBE896A548ef4d",
"Rollup": "0x678DC2c846bfDCC813ea27DfEE428f1d7f2521ED",
"Work2": "0x609102Fb6cA15da80D37E8cA68aBD5e1bD9C855B"
}
}

Nine file diffs suppressed because one or more lines are too long


@@ -18,16 +18,21 @@
"express-async-errors": "^3.1.1",
"fastq": "^1.17.1",
"level": "^8.0.1",
"lodash": "^4.17.21",
"matrix-bot-sdk": "^0.7.1",
"object-hash": "^3.0.0"
"object-hash": "^3.0.0",
"uuid": "^9.0.1"
},
"devDependencies": {
"assert": "^2.1.0",
"eslint": "^8.56.0",
"eslint-config-airbnb": "^19.0.4",
"eslint-plugin-import": "^2.29.1",
"eslint-plugin-jsx-a11y": "^6.8.0",
"eslint-plugin-react": "^7.33.2",
"eslint-plugin-react-hooks": "^4.6.0"
"eslint-plugin-react-hooks": "^4.6.0",
"mocha": "^10.4.0",
"proxyquire": "^2.1.3"
}
},
"node_modules/@aashutoshrathi/word-wrap": {
@@ -610,6 +615,15 @@
"resolved": "https://registry.npmjs.org/another-json/-/another-json-0.2.0.tgz",
"integrity": "sha512-/Ndrl68UQLhnCdsAzEXLMFuOR546o2qbYRqCglaNHbjXrwG1ayTcdwr3zkSGOGtGXDyR5X9nCFfnyG2AFJIsqg=="
},
"node_modules/ansi-colors": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/ansi-colors/-/ansi-colors-4.1.1.tgz",
"integrity": "sha512-JoX0apGbHaUJBNl6yF+p6JAFYZ666/hhCGKN5t9QFjbJQKUU/g8MNbFDbvfrgKXvI1QpZplPOnwIo99lX/AAmA==",
"dev": true,
"engines": {
"node": ">=6"
}
},
"node_modules/ansi-regex": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
@@ -633,6 +647,19 @@
"url": "https://github.com/chalk/ansi-styles?sponsor=1"
}
},
"node_modules/anymatch": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/anymatch/-/anymatch-3.1.3.tgz",
"integrity": "sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==",
"dev": true,
"dependencies": {
"normalize-path": "^3.0.0",
"picomatch": "^2.0.4"
},
"engines": {
"node": ">= 8"
}
},
"node_modules/argparse": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz",
@@ -805,6 +832,19 @@
"safer-buffer": "~2.1.0"
}
},
"node_modules/assert": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/assert/-/assert-2.1.0.tgz",
"integrity": "sha512-eLHpSK/Y4nhMJ07gDaAzoX/XAKS8PSaojml3M0DM4JpV1LAi5JOJ/p6H/XWrl8L+DzVEvVCW1z3vWAaB9oTsQw==",
"dev": true,
"dependencies": {
"call-bind": "^1.0.2",
"is-nan": "^1.3.2",
"object-is": "^1.1.5",
"object.assign": "^4.1.4",
"util": "^0.12.5"
}
},
"node_modules/assert-plus": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/assert-plus/-/assert-plus-1.0.0.tgz",
@@ -948,6 +988,18 @@
"resolved": "https://registry.npmjs.org/tweetnacl/-/tweetnacl-0.14.5.tgz",
"integrity": "sha512-KXXFFdAbFXY4geFIwoyNK+f5Z1b7swfXABfL7HXCmoIWMKU3dmS26672A4EeQtDzLKy7SXmfBu51JolvEKwtGA=="
},
"node_modules/binary-extensions": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/binary-extensions/-/binary-extensions-2.3.0.tgz",
"integrity": "sha512-Ceh+7ox5qe7LJuLHoY0feh3pHuUDHAcRUeyL2VYghZwfpkNIy/+8Ocg0a3UuSoYzavmylwuLWQOf3hl0jjMMIw==",
"dev": true,
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/bluebird": {
"version": "3.7.2",
"resolved": "https://registry.npmjs.org/bluebird/-/bluebird-3.7.2.tgz",
@@ -999,6 +1051,18 @@
"concat-map": "0.0.1"
}
},
"node_modules/braces": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
"integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
"dev": true,
"dependencies": {
"fill-range": "^7.0.1"
},
"engines": {
"node": ">=8"
}
},
"node_modules/browser-level": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/browser-level/-/browser-level-1.0.1.tgz",
@@ -1010,6 +1074,12 @@
"run-parallel-limit": "^1.1.0"
}
},
"node_modules/browser-stdout": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/browser-stdout/-/browser-stdout-1.3.1.tgz",
"integrity": "sha512-qhAVI1+Av2X7qelOfAIYwXONood6XlZE/fXaBSmW/T5SzLAmCgzi+eiWE7fUvbHaeNBQH13UftjpXxsfLkMpgw==",
"dev": true
},
"node_modules/buffer": {
"version": "6.0.3",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz",
@@ -1068,6 +1138,18 @@
"node": ">=6"
}
},
"node_modules/camelcase": {
"version": "6.3.0",
"resolved": "https://registry.npmjs.org/camelcase/-/camelcase-6.3.0.tgz",
"integrity": "sha512-Gmy6FhYlCY7uOElZUSbxo2UCDH8owEk996gkbrpsgGtrJLM3J7jGxl9Ic7Qwwj4ivOE5AWZWRMecDdF7hqGjFA==",
"dev": true,
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/caseless": {
"version": "0.12.0",
"resolved": "https://registry.npmjs.org/caseless/-/caseless-0.12.0.tgz",
@@ -1096,6 +1178,45 @@
"url": "https://github.com/chalk/chalk?sponsor=1"
}
},
"node_modules/chokidar": {
"version": "3.5.3",
"resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.5.3.tgz",
"integrity": "sha512-Dr3sfKRP6oTcjf2JmUmFJfeVMvXBdegxB0iVQ5eb2V10uFJUCAS8OByZdVAyVb8xXNz3GjjTgj9kLWsZTqE6kw==",
"dev": true,
"funding": [
{
"type": "individual",
"url": "https://paulmillr.com/funding/"
}
],
"dependencies": {
"anymatch": "~3.1.2",
"braces": "~3.0.2",
"glob-parent": "~5.1.2",
"is-binary-path": "~2.1.0",
"is-glob": "~4.0.1",
"normalize-path": "~3.0.0",
"readdirp": "~3.6.0"
},
"engines": {
"node": ">= 8.10.0"
},
"optionalDependencies": {
"fsevents": "~2.3.2"
}
},
"node_modules/chokidar/node_modules/glob-parent": {
"version": "5.1.2",
"resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz",
"integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==",
"dev": true,
"dependencies": {
"is-glob": "^4.0.1"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/classic-level": {
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/classic-level/-/classic-level-1.4.1.tgz",
@@ -1112,6 +1233,17 @@
"node": ">=12"
}
},
"node_modules/cliui": {
"version": "7.0.4",
"resolved": "https://registry.npmjs.org/cliui/-/cliui-7.0.4.tgz",
"integrity": "sha512-OcRE68cOsVMXp1Yvonl/fzkQOyjLSu/8bhPDfQt0e0/Eb283TKP20Fs2MqoPsr9SwA595rRCA+QMzYc9nBP+JQ==",
"dev": true,
"dependencies": {
"string-width": "^4.2.0",
"strip-ansi": "^6.0.0",
"wrap-ansi": "^7.0.0"
}
},
"node_modules/color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
@@ -1246,6 +1378,18 @@
}
}
},
"node_modules/decamelize": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/decamelize/-/decamelize-4.0.0.tgz",
"integrity": "sha512-9iE1PgSik9HeIIw2JO94IidnE3eBoQrFJ3w7sFuzSX4DpmZ3v5sZpUiV5Swcf6mQEF+Y0ru8Neo+p+nyh2J+hQ==",
"dev": true,
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/deep-is": {
"version": "0.1.4",
"resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz",
@@ -1327,6 +1471,15 @@
"npm": "1.2.8000 || >= 1.4.16"
}
},
"node_modules/diff": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/diff/-/diff-5.0.0.tgz",
"integrity": "sha512-/VTCrvm5Z0JGty/BWHljh+BAiw3IK+2j87NGMu8Nwc/f48WoDAC395uomO9ZD117ZOBaHmkX1oyLvkVM/aIT3w==",
"dev": true,
"engines": {
"node": ">=0.3.1"
}
},
"node_modules/doctrine": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/doctrine/-/doctrine-3.0.0.tgz",
@@ -1586,6 +1739,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/escalade": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/escalade/-/escalade-3.1.2.tgz",
"integrity": "sha512-ErCHMCae19vR8vQGe50xIsVomy19rg6gFu3+r3jkEO46suLMWBksvVyoGgQV+jOfl84ZSOSlmv6Gxa89PmTGmA==",
"dev": true,
"engines": {
"node": ">=6"
}
},
"node_modules/escape-html": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
@@ -2162,6 +2324,31 @@
"node": "^10.12.0 || >=12.0.0"
}
},
"node_modules/fill-keys": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/fill-keys/-/fill-keys-1.0.2.tgz",
"integrity": "sha512-tcgI872xXjwFF4xgQmLxi76GnwJG3g/3isB1l4/G5Z4zrbddGpBjqZCO9oEAcB5wX0Hj/5iQB3toxfO7in1hHA==",
"dev": true,
"dependencies": {
"is-object": "~1.0.1",
"merge-descriptors": "~1.0.0"
},
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/fill-range": {
"version": "7.0.1",
"resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
"integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
"dev": true,
"dependencies": {
"to-regex-range": "^5.0.1"
},
"engines": {
"node": ">=8"
}
},
"node_modules/finalhandler": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-1.2.0.tgz",
@@ -2208,6 +2395,15 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/flat": {
"version": "5.0.2",
"resolved": "https://registry.npmjs.org/flat/-/flat-5.0.2.tgz",
"integrity": "sha512-b6suED+5/3rTpUBdG1gupIl8MPFCAMA0QXwmljLhvCUKcUvdE4gWky9zpuGCcXHOsz4J9wPGNWq6OKpmIzz3hQ==",
"dev": true,
"bin": {
"flat": "cli.js"
}
},
"node_modules/flat-cache": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/flat-cache/-/flat-cache-3.2.0.tgz",
@@ -2299,6 +2495,20 @@
"integrity": "sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw==",
"dev": true
},
"node_modules/fsevents": {
"version": "2.3.3",
"resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
"integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==",
"dev": true,
"hasInstallScript": true,
"optional": true,
"os": [
"darwin"
],
"engines": {
"node": "^8.16.0 || ^10.6.0 || >=11.0.0"
}
},
"node_modules/function-bind": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
@@ -2334,6 +2544,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/get-caller-file": {
"version": "2.0.5",
"resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
"integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==",
"dev": true,
"engines": {
"node": "6.* || 8.* || >= 10.*"
}
},
"node_modules/get-intrinsic": {
"version": "1.2.4",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.4.tgz",
@@ -2572,6 +2791,15 @@
"node": ">= 0.4"
}
},
"node_modules/he": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/he/-/he-1.2.0.tgz",
"integrity": "sha512-F/1DnUGPopORZi0ni+CvrCgHQ5FyEAHRLSApuYWMmrbSwoN2Mn/7k+Gl38gJnR7yyDZk6WLXwiGod1JOWNDKGw==",
"dev": true,
"bin": {
"he": "bin/he"
}
},
"node_modules/html-to-text": {
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/html-to-text/-/html-to-text-9.0.5.tgz",
@@ -2752,6 +2980,22 @@
"node": ">= 0.10"
}
},
"node_modules/is-arguments": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/is-arguments/-/is-arguments-1.1.1.tgz",
"integrity": "sha512-8Q7EARjzEnKpt/PCD7e1cgUS0a6X8u5tdSiMqXhojOdoV9TsMsiO+9VLC5vAmO8N7/GmXn7yjR8qnA6bVAEzfA==",
"dev": true,
"dependencies": {
"call-bind": "^1.0.2",
"has-tostringtag": "^1.0.0"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-array-buffer": {
"version": "3.0.4",
"resolved": "https://registry.npmjs.org/is-array-buffer/-/is-array-buffer-3.0.4.tgz",
@@ -2795,6 +3039,18 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-binary-path": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/is-binary-path/-/is-binary-path-2.1.0.tgz",
"integrity": "sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==",
"dev": true,
"dependencies": {
"binary-extensions": "^2.0.0"
},
"engines": {
"node": ">=8"
}
},
"node_modules/is-boolean-object": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/is-boolean-object/-/is-boolean-object-1.1.2.tgz",
@@ -2893,6 +3149,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-fullwidth-code-point": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
"integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/is-generator-function": {
"version": "1.0.10",
"resolved": "https://registry.npmjs.org/is-generator-function/-/is-generator-function-1.0.10.tgz",
@@ -2929,6 +3194,22 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-nan": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/is-nan/-/is-nan-1.3.2.tgz",
"integrity": "sha512-E+zBKpQ2t6MEo1VsonYmluk9NxGrbzpeeLC2xIViuO2EjU2xsXsBPwTr3Ykv9l08UYEVEdWeRZNouaZqF6RN0w==",
"dev": true,
"dependencies": {
"call-bind": "^1.0.0",
"define-properties": "^1.1.3"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-negative-zero": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/is-negative-zero/-/is-negative-zero-2.0.3.tgz",
@@ -2941,6 +3222,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-number": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz",
"integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==",
"dev": true,
"engines": {
"node": ">=0.12.0"
}
},
"node_modules/is-number-object": {
"version": "1.0.7",
"resolved": "https://registry.npmjs.org/is-number-object/-/is-number-object-1.0.7.tgz",
@@ -2956,6 +3246,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-object": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/is-object/-/is-object-1.0.2.tgz",
"integrity": "sha512-2rRIahhZr2UWb45fIOuvZGpFtz0TyOZLf32KxBbSoUCeZR495zCKlWUKKUByk3geS2eAs7ZAABt0Y/Rx0GiQGA==",
"dev": true,
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-path-inside": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/is-path-inside/-/is-path-inside-3.0.3.tgz",
@@ -2965,6 +3264,15 @@
"node": ">=8"
}
},
"node_modules/is-plain-obj": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-2.1.0.tgz",
"integrity": "sha512-YWnfyRwxL/+SsrWYfOpUtz5b3YD+nyfkHvjbcanzk8zgyO4ASD67uVMRt8k5bM4lLMDnXfriRhOpemw+NfT1eA==",
"dev": true,
"engines": {
"node": ">=8"
}
},
"node_modules/is-plain-object": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/is-plain-object/-/is-plain-object-5.0.0.tgz",
@@ -3068,6 +3376,18 @@
"resolved": "https://registry.npmjs.org/is-typedarray/-/is-typedarray-1.0.0.tgz",
"integrity": "sha512-cyA56iCMHAh5CdzjJIa4aohJyeO1YbwLi3Jc35MmRU6poroFjIGZzUzupGiRPOjgHg9TLu43xbpwXk523fMxKA=="
},
"node_modules/is-unicode-supported": {
"version": "0.1.0",
"resolved": "https://registry.npmjs.org/is-unicode-supported/-/is-unicode-supported-0.1.0.tgz",
"integrity": "sha512-knxG2q4UC3u8stRGyAVJCOdxFmv5DZiRcdlIaAQXAbSfJya+OhopNotLQrstBhququ4ZpuKbDc/8S6mgXgPFPw==",
"dev": true,
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-weakmap": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/is-weakmap/-/is-weakmap-2.0.1.tgz",
@@ -3334,6 +3654,22 @@
"integrity": "sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==",
"dev": true
},
"node_modules/log-symbols": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/log-symbols/-/log-symbols-4.1.0.tgz",
"integrity": "sha512-8XPvpAA8uyhfteu8pIvQxpJZ7SYYdpUivZpGy6sFsBuKRY/7rQGavedeB8aK+Zkyq6upMFVL/9AW6vOYzfRyLg==",
"dev": true,
"dependencies": {
"chalk": "^4.1.0",
"is-unicode-supported": "^0.1.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/loose-envify": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
@@ -3504,6 +3840,102 @@
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/mocha": {
"version": "10.4.0",
"resolved": "https://registry.npmjs.org/mocha/-/mocha-10.4.0.tgz",
"integrity": "sha512-eqhGB8JKapEYcC4ytX/xrzKforgEc3j1pGlAXVy3eRwrtAy5/nIfT1SvgGzfN0XZZxeLq0aQWkOUAmqIJiv+bA==",
"dev": true,
"dependencies": {
"ansi-colors": "4.1.1",
"browser-stdout": "1.3.1",
"chokidar": "3.5.3",
"debug": "4.3.4",
"diff": "5.0.0",
"escape-string-regexp": "4.0.0",
"find-up": "5.0.0",
"glob": "8.1.0",
"he": "1.2.0",
"js-yaml": "4.1.0",
"log-symbols": "4.1.0",
"minimatch": "5.0.1",
"ms": "2.1.3",
"serialize-javascript": "6.0.0",
"strip-json-comments": "3.1.1",
"supports-color": "8.1.1",
"workerpool": "6.2.1",
"yargs": "16.2.0",
"yargs-parser": "20.2.4",
"yargs-unparser": "2.0.0"
},
"bin": {
"_mocha": "bin/_mocha",
"mocha": "bin/mocha.js"
},
"engines": {
"node": ">= 14.0.0"
}
},
"node_modules/mocha/node_modules/brace-expansion": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.1.tgz",
"integrity": "sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==",
"dev": true,
"dependencies": {
"balanced-match": "^1.0.0"
}
},
"node_modules/mocha/node_modules/glob": {
"version": "8.1.0",
"resolved": "https://registry.npmjs.org/glob/-/glob-8.1.0.tgz",
"integrity": "sha512-r8hpEjiQEYlF2QU0df3dS+nxxSIreXQS1qRhMJM0Q5NDdR386C7jb7Hwwod8Fgiuex+k0GFjgft18yvxm5XoCQ==",
"dev": true,
"dependencies": {
"fs.realpath": "^1.0.0",
"inflight": "^1.0.4",
"inherits": "2",
"minimatch": "^5.0.1",
"once": "^1.3.0"
},
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/mocha/node_modules/minimatch": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-5.0.1.tgz",
"integrity": "sha512-nLDxIFRyhDblz3qMuq+SoRZED4+miJ/G+tdDrjkkkRnjAsBexeGpgjLEQ0blJy7rHhR2b93rhQY4SvyWu9v03g==",
"dev": true,
"dependencies": {
"brace-expansion": "^2.0.1"
},
"engines": {
"node": ">=10"
}
},
"node_modules/mocha/node_modules/ms": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
"integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
"dev": true
},
"node_modules/mocha/node_modules/supports-color": {
"version": "8.1.1",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-8.1.1.tgz",
"integrity": "sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==",
"dev": true,
"dependencies": {
"has-flag": "^4.0.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/chalk/supports-color?sponsor=1"
}
},
"node_modules/module-error": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/module-error/-/module-error-1.0.2.tgz",
@@ -3512,6 +3944,12 @@
"node": ">=10"
}
},
"node_modules/module-not-found-error": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/module-not-found-error/-/module-not-found-error-1.0.1.tgz",
"integrity": "sha512-pEk4ECWQXV6z2zjhRZUongnLJNUeGQJ3w6OQ5ctGwD+i5o93qjRQUk2Rt6VdNeu3sEP0AB4LcfvdebpxBRVr4g==",
"dev": true
},
"node_modules/morgan": {
"version": "1.10.0",
"resolved": "https://registry.npmjs.org/morgan/-/morgan-1.10.0.tgz",
@@ -3613,6 +4051,15 @@
"node-gyp-build-test": "build-test.js"
}
},
"node_modules/normalize-path": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz",
"integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==",
"dev": true,
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/oauth-sign": {
"version": "0.9.0",
"resolved": "https://registry.npmjs.org/oauth-sign/-/oauth-sign-0.9.0.tgz",
@@ -3646,6 +4093,22 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/object-is": {
"version": "1.1.6",
"resolved": "https://registry.npmjs.org/object-is/-/object-is-1.1.6.tgz",
"integrity": "sha512-F8cZ+KfGlSGi09lJT7/Nd6KJZ9ygtvYC0/UYYLI9nmQKLMnydpB9yvbv9K1uSkEu7FU9vYPmVwLg328tX+ot3Q==",
"dev": true,
"dependencies": {
"call-bind": "^1.0.7",
"define-properties": "^1.2.1"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/object-keys": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/object-keys/-/object-keys-1.1.1.tgz",
@@ -3915,6 +4378,18 @@
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.0.tgz",
"integrity": "sha512-1fygroTLlHu66zi26VoTDv8yRgm0Fccecssto+MhsZ0D/DGW2sm8E8AjW7NU5VVTRt5GxbeZ5qBuJr+HyLYkjQ=="
},
"node_modules/picomatch": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
"integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
"dev": true,
"engines": {
"node": ">=8.6"
},
"funding": {
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/pify": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/pify/-/pify-3.0.0.tgz",
@@ -4016,6 +4491,17 @@
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg=="
},
"node_modules/proxyquire": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/proxyquire/-/proxyquire-2.1.3.tgz",
"integrity": "sha512-BQWfCqYM+QINd+yawJz23tbBM40VIGXOdDw3X344KcclI/gtBbdWF6SlQ4nK/bYhF9d27KYug9WzljHC6B9Ysg==",
"dev": true,
"dependencies": {
"fill-keys": "^1.0.2",
"module-not-found-error": "^1.0.1",
"resolve": "^1.11.1"
}
},
"node_modules/psl": {
"version": "1.9.0",
"resolved": "https://registry.npmjs.org/psl/-/psl-1.9.0.tgz",
@@ -4062,6 +4548,15 @@
}
]
},
"node_modules/randombytes": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/randombytes/-/randombytes-2.1.0.tgz",
"integrity": "sha512-vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ==",
"dev": true,
"dependencies": {
"safe-buffer": "^5.1.0"
}
},
"node_modules/range-parser": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
@@ -4090,6 +4585,18 @@
"integrity": "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==",
"dev": true
},
"node_modules/readdirp": {
"version": "3.6.0",
"resolved": "https://registry.npmjs.org/readdirp/-/readdirp-3.6.0.tgz",
"integrity": "sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==",
"dev": true,
"dependencies": {
"picomatch": "^2.2.1"
},
"engines": {
"node": ">=8.10.0"
}
},
"node_modules/reflect.getprototypeof": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/reflect.getprototypeof/-/reflect.getprototypeof-1.0.5.tgz",
@@ -4228,6 +4735,15 @@
"uuid": "bin/uuid"
}
},
"node_modules/require-directory": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
"integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==",
"dev": true,
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/resolve": {
"version": "1.22.8",
"resolved": "https://registry.npmjs.org/resolve/-/resolve-1.22.8.tgz",
@@ -4456,6 +4972,15 @@
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
"integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="
},
"node_modules/serialize-javascript": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-6.0.0.tgz",
"integrity": "sha512-Qr3TosvguFt8ePWqsvRfrKyQXIiW+nGbYpy8XK24NQHE83caxWt+mIymTT19DGFbNWNLfEwsrkSmN64lVWB9ag==",
"dev": true,
"dependencies": {
"randombytes": "^2.1.0"
}
},
"node_modules/serve-static": {
"version": "1.15.0",
"resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.15.0.tgz",
@@ -4605,6 +5130,26 @@
"graceful-fs": "^4.1.3"
}
},
"node_modules/string-width": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
"dev": true,
"dependencies": {
"emoji-regex": "^8.0.0",
"is-fullwidth-code-point": "^3.0.0",
"strip-ansi": "^6.0.1"
},
"engines": {
"node": ">=8"
}
},
"node_modules/string-width/node_modules/emoji-regex": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==",
"dev": true
},
"node_modules/string.prototype.matchall": {
"version": "4.0.10",
"resolved": "https://registry.npmjs.org/string.prototype.matchall/-/string.prototype.matchall-4.0.10.tgz",
@@ -4740,6 +5285,18 @@
"integrity": "sha512-N+8UisAXDGk8PFXP4HAzVR9nbfmVJ3zYLAWiTIoqC5v5isinhr+r5uaO8+7r3BMfuNIufIsA7RdpVgacC2cSpw==",
"dev": true
},
"node_modules/to-regex-range": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
"integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==",
"dev": true,
"dependencies": {
"is-number": "^7.0.0"
},
"engines": {
"node": ">=8.0"
}
},
"node_modules/toidentifier": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
@@ -4938,6 +5495,19 @@
"punycode": "^2.1.0"
}
},
"node_modules/util": {
"version": "0.12.5",
"resolved": "https://registry.npmjs.org/util/-/util-0.12.5.tgz",
"integrity": "sha512-kZf/K6hEIrWHI6XqOFUiiMa+79wE/D8Q+NCNAWclkyg3b4d2k7s0QGepNjiABc+aR3N1PAyHL7p6UcLY6LmrnA==",
"dev": true,
"dependencies": {
"inherits": "^2.0.3",
"is-arguments": "^1.0.4",
"is-generator-function": "^1.0.7",
"is-typed-array": "^1.1.3",
"which-typed-array": "^1.1.2"
}
},
"node_modules/utils-merge": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/utils-merge/-/utils-merge-1.0.1.tgz",
@@ -5070,6 +5640,29 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/workerpool": {
"version": "6.2.1",
"resolved": "https://registry.npmjs.org/workerpool/-/workerpool-6.2.1.tgz",
"integrity": "sha512-ILEIE97kDZvF9Wb9f6h5aXK4swSlKGUcOEGiIYb2OOu/IrDU9iwj0fD//SsA6E5ibwJxpEvhullJY4Sl4GcpAw==",
"dev": true
},
"node_modules/wrap-ansi": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
"integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
"dev": true,
"dependencies": {
"ansi-styles": "^4.0.0",
"string-width": "^4.1.0",
"strip-ansi": "^6.0.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/chalk/wrap-ansi?sponsor=1"
}
},
"node_modules/wrappy": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz",
@@ -5096,11 +5689,62 @@
}
}
},
"node_modules/y18n": {
"version": "5.0.8",
"resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz",
"integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==",
"dev": true,
"engines": {
"node": ">=10"
}
},
"node_modules/yallist": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A=="
},
"node_modules/yargs": {
"version": "16.2.0",
"resolved": "https://registry.npmjs.org/yargs/-/yargs-16.2.0.tgz",
"integrity": "sha512-D1mvvtDG0L5ft/jGWkLpG1+m0eQxOfaBvTNELraWj22wSVUMWxZUvYgJYcKh6jGGIkJFhH4IZPQhR4TKpc8mBw==",
"dev": true,
"dependencies": {
"cliui": "^7.0.2",
"escalade": "^3.1.1",
"get-caller-file": "^2.0.5",
"require-directory": "^2.1.1",
"string-width": "^4.2.0",
"y18n": "^5.0.5",
"yargs-parser": "^20.2.2"
},
"engines": {
"node": ">=10"
}
},
"node_modules/yargs-parser": {
"version": "20.2.4",
"resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-20.2.4.tgz",
"integrity": "sha512-WOkpgNhPTlE73h4VFAFsOnomJVaovO8VqLDzy5saChRBFQFBoMYirowyW+Q9HB4HFF4Z7VZTiG3iSzJJA29yRA==",
"dev": true,
"engines": {
"node": ">=10"
}
},
"node_modules/yargs-unparser": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/yargs-unparser/-/yargs-unparser-2.0.0.tgz",
"integrity": "sha512-7pRTIA9Qc1caZ0bZ6RYRGbHJthJWuakf+WmHK0rVeLkNrrGhfoabBNdue6kdINI6r4if7ocq9aD/n7xwKOdzOA==",
"dev": true,
"dependencies": {
"camelcase": "^6.0.0",
"decamelize": "^4.0.0",
"flat": "^5.0.2",
"is-plain-obj": "^2.1.0"
},
"engines": {
"node": ">=10"
}
},
"node_modules/yocto-queue": {
"version": "0.1.0",
"resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz",

View File

@@ -4,7 +4,7 @@
"description": "",
"main": "src/index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"test": "mocha",
"login": "node scripts/matrix-login"
},
"author": "",
@@ -19,15 +19,20 @@
"express-async-errors": "^3.1.1",
"fastq": "^1.17.1",
"level": "^8.0.1",
"lodash": "^4.17.21",
"matrix-bot-sdk": "^0.7.1",
"object-hash": "^3.0.0"
"object-hash": "^3.0.0",
"uuid": "^9.0.1"
},
"devDependencies": {
"assert": "^2.1.0",
"eslint": "^8.56.0",
"eslint-config-airbnb": "^19.0.4",
"eslint-plugin-import": "^2.29.1",
"eslint-plugin-jsx-a11y": "^6.8.0",
"eslint-plugin-react": "^7.33.2",
"eslint-plugin-react-hooks": "^4.6.0"
"eslint-plugin-react-hooks": "^4.6.0",
"mocha": "^10.4.0",
"proxyquire": "^2.1.3"
}
}

View File

@@ -13,8 +13,8 @@ const {
const login = async () => {
console.log(`MATRIX_HOMESERVER_URL="${MATRIX_HOMESERVER_URL}"`);
const auth = new MatrixAuth(MATRIX_HOMESERVER_URL);
const client = await auth.passwordLogin(MATRIX_USER, MATRIX_PASSWORD);
console.log(`MATRIX_ACCESS_TOKEN="${client.accessToken}"`);
const matrixClient = await auth.passwordLogin(MATRIX_USER, MATRIX_PASSWORD);
console.log(`MATRIX_ACCESS_TOKEN="${matrixClient.accessToken}"`);
};
login();

View File

@@ -0,0 +1,73 @@
const { matrixClient } = require('../matrix-bot');
const { matrixUserToAuthorAddress } = require('../util/db');
const write = require('../util/forum/write');
const { wallet } = require('../util/contracts');
const addPostWithRetry = require('../util/add-post-with-retry');
const {
ETH_NETWORK,
} = process.env;
module.exports = async (req, res) => {
const {
body: {
eventUri,
},
} = req;
if (!eventUri) {
res.status(400).end();
return;
}
console.log(`importFromMatrix: event ${eventUri}`);
// URI format:
// https://matrix.to/#/${roomId}/${eventId}?via=
const uriRegex = /#\/(![A-Za-z0-9:._-]+)\/(\$[A-Za-z0-9._-]+)(\?.*)$/;
const [, roomId, eventId] = uriRegex.exec(new URL(eventUri).hash);
console.log('roomId', roomId);
console.log('eventId', eventId);
const event = await matrixClient.getEvent(roomId, eventId);
console.log('event', event);
let authorAddress;
try {
authorAddress = await matrixUserToAuthorAddress.get(event.sender);
} catch (e) {
// Matrix user has not registered their author address
res.send(`Author address not registered for matrix user ${event.sender}`);
return;
}
// We want to add a post representing this matrix message.
const authors = [{ authorAddress, weightPPM: 1000000 }];
// TODO: Take references as input to this API call, referencing other posts or matrix events
const references = [];
const content = `Matrix event URI: ${eventUri}`;
const embeddedData = {
roomId,
eventId,
};
// We can't sign it on behalf of the author, but we can sign it with our own key
const sender = await wallet.getAddress();
const contentToVerify = `${content}\n\n${JSON.stringify(embeddedData, null, 2)}`;
const signature = await wallet.signMessage(contentToVerify);
const { hash } = await write({
sender, authors, references, content, embeddedData, signature,
});
// Now we want to add a post on-chain
const { alreadyAdded } = await addPostWithRetry(authors, hash, references);
if (alreadyAdded) {
console.log(`Post already added for matrix event ${eventUri}`);
} else {
console.log(`Added post to blockchain for matrix event ${eventUri}`);
// Send matrix event reply to the targeted event, notifying of this blockchain post
await matrixClient.replyNotice(roomId, event, `Added to ${ETH_NETWORK} blockchain as post ${hash}`);
}
res.json({ postId: hash, alreadyAdded });
};

View File

@@ -1,15 +1,15 @@
const axios = require('axios');
const ethers = require('ethers');
const crypto = require('crypto');
const objectHash = require('object-hash');
const Promise = require('bluebird');
const verifySignature = require('./verify-signature');
const { authorAddresses, authorPrivKeys, forum } = require('./db');
const { dao } = require('./contracts');
const objectHash = require('object-hash');
const { authorAddresses, authorPrivKeys } = require('../util/db');
const { dao } = require('../util/contracts');
const write = require('../util/forum/write');
// Each post allocates 30% of its reputation to citations
const PPM_TO_CITATIONS = 300000;
// Each post allocates 30% of its reputation to references
const PPM_TO_REFERENCES = 300000;
const fetchWithRetry = async (url, retryDelay = 5000) => {
let retry = false;
@@ -72,8 +72,8 @@ const getOrCreateAuthors = async (paper) => Promise.mapSeries(
// Generate and store a new account
const id = crypto.randomBytes(32).toString('hex');
authorPrivKey = `0x${id}`;
const wallet = new ethers.Wallet(authorPrivKey);
authorAddress = wallet.address;
const authorWallet = new ethers.Wallet(authorPrivKey);
authorAddress = authorWallet.address;
await authorAddresses.put(authorId, authorAddress);
await authorPrivKeys.put(authorAddress, authorPrivKey);
}
@@ -90,9 +90,9 @@ const generatePost = async (paper) => {
throw new Error('Paper has no authors with id');
}
const firstAuthorWallet = new ethers.Wallet(authorsInfo[0].authorPrivKey);
const eachAuthorWeightPercent = Math.floor(1000000 / authorsInfo.length);
const eachAuthorWeightPPM = Math.floor(1000000 / authorsInfo.length);
const authors = authorsInfo.map(({ authorAddress }) => ({
weightPPM: eachAuthorWeightPercent,
weightPPM: eachAuthorWeightPPM,
authorAddress,
}));
// Make sure author weights sum to 100%
@@ -111,29 +111,20 @@ HREF ${paper.url}`;
contentToSign += `\n\n${JSON.stringify(embeddedData, null, 2)}`;
}
const signature = firstAuthorWallet.signMessageSync(contentToSign);
const verified = verifySignature({
authors, content, signature, embeddedData,
});
if (!verified) {
throw new Error('Signature verification failed');
}
const hash = objectHash({
authors, content, signature, embeddedData,
});
return {
hash, authors, content, signature, embeddedData,
authors, content, signature, embeddedData,
};
};
const addPostWithRetry = async (authors, hash, citations, retryDelay = 5000) => {
const addPostWithRetry = async (authors, hash, references, retryDelay = 5000) => {
try {
await dao.addPost(authors, hash, citations);
await dao.addPost(authors, hash, references);
} catch (e) {
if (e.code === 'REPLACEMENT_UNDERPRICED') {
console.log('retry delay (sec):', retryDelay / 1000);
await Promise.delay(retryDelay);
return addPostWithRetry(authors, hash, citations, retryDelay * 2);
} if (e.reason === 'A post with this contentId already exists') {
return addPostWithRetry(authors, hash, references, retryDelay * 2);
} if (e.reason === 'A post with this postId already exists') {
return { alreadyAdded: true };
}
throw e;
@@ -144,40 +135,46 @@ const addPostWithRetry = async (authors, hash, citations, retryDelay = 5000) =>
const importPaper = async (paper) => {
console.log('references count:', paper.references.length);
const { paperId } = paper;
const references = paper.references.filter((x) => !!x.paperId);
const eachCitationWeightPercent = Math.floor(PPM_TO_CITATIONS / references.length);
const citations = await Promise.mapSeries(
references,
const paperReferences = paper.references.filter((x) => !!x.paperId);
const eachReferenceWeightPPM = Math.floor(PPM_TO_REFERENCES / paperReferences.length);
const references = (await Promise.mapSeries(
paperReferences,
async (citedPaper) => {
// We need to fetch this paper so we can generate the post we WOULD add to the forum.
// That way, if we later add the cited paper to the blockchain it will have the correct hash.
// The forum allows dangling references to support this use case.
try {
const citedPost = await generatePost(citedPaper);
const citedPostHash = objectHash(citedPost);
return {
weightPPM: eachReferenceWeightPPM,
targetPostId: citedPostHash,
};
} catch (e) {
return null;
}
},
)).filter((x) => !!x);
// Make sure reference weights sum to the designated total
if (references.length) {
const totalReferenceWeight = references.reduce((t, { weightPPM }) => t + weightPPM, 0);
references[0].weightPPM += PPM_TO_REFERENCES - totalReferenceWeight;
}
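The weight-assignment logic above floor-divides a PPM budget and then adds the rounding remainder to the first entry so the weights sum exactly to the budget. A minimal standalone sketch of the same normalization (the helper name is illustrative):

```javascript
// Split a PPM budget evenly across targets; floor division loses up to
// (n - 1) PPM, so the remainder is credited to the first entry.
// (Illustrative helper; not part of the repository.)
const normalizeWeights = (targetPostIds, totalPPM) => {
  const each = Math.floor(totalPPM / targetPostIds.length);
  const weighted = targetPostIds.map((targetPostId) => ({ targetPostId, weightPPM: each }));
  const sum = weighted.reduce((t, { weightPPM }) => t + weightPPM, 0);
  weighted[0].weightPPM += totalPPM - sum;
  return weighted;
};
```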
// Create a post for this paper
const {
authors, content, signature, embeddedData,
} = await generatePost(paper);
// Write the new post to our database
const { hash } = await write({
authors, content, signature, embeddedData, references,
});
// Add the post to the forum (on-chain)
console.log('addPostWithRetry', { authors, hash, references });
const { alreadyAdded } = await addPostWithRetry(authors, hash, references);
if (alreadyAdded) {
console.log(`Post already added for paper ${paperId}`);
} else {
@ -204,11 +201,20 @@ module.exports = async (req, res) => {
console.log(`importFromSS: author ${authorId}`);
const papers = await fetchAuthorPapers(authorId);
console.log('papers count:', papers.length);
const earlyResponseTimeout = setTimeout(() => {
res.status(202).end();
}, 30000); // Send an early 202 if the import runs long (delay value assumed)
const result = await Promise.mapSeries(papers, importPaper);
clearTimeout(earlyResponseTimeout);
if (result.length) {
console.log(`Added posts for ${result.length} papers by author ${authorId}`);
}
if (!res.headersSent) {
res.json(result);
}
} else {
res.status(400).end();
}

View File

@ -2,16 +2,21 @@ const express = require('express');
require('express-async-errors');
const read = require('../util/forum/read');
const write = require('../util/forum/write');
const importFromSS = require('./import-from-ss');
const importFromMatrix = require('./import-from-matrix');
const app = express();
const port = process.env.API_LISTEN_PORT || 3000;
app.use(express.json());
app.post('/write', async (req, res) => {
const { hash } = await write(req.body);
console.log('write', hash);
res.send(hash);
});
app.get('/read/:hash', async (req, res) => {
const { hash } = req.params;
@ -22,6 +27,8 @@ app.get('/read/:hash', async (req, res) => {
app.post('/importFromSemanticScholar', importFromSS);
app.post('/importFromMatrix', importFromMatrix);
app.get('*', (req, res) => {
console.log(`404 req.path: ${req.path}`);
res.status(404).json({ errorCode: 404 });
@ -31,7 +38,9 @@ app.use((err, req, res, next) => {
const status = err.response?.status ?? err.status ?? 500;
const message = err.response?.data?.error ?? err.message;
console.error(`error: ${message}`, err);
if (!res.headersSent) {
res.status(status).send(message);
}
next();
});

View File

@ -1,38 +0,0 @@
const ethers = require('ethers');
const { getContractAddressByNetworkName } = require('./contract-config');
const DAOArtifact = require('../contractArtifacts/DAO.json');
const ProposalsArtifact = require('../contractArtifacts/Proposals.json');
const network = process.env.ETH_NETWORK;
console.log('network:', network);
const getProvider = () => {
switch (network) {
case 'localhost':
return ethers.getDefaultProvider('http://localhost:8545');
case 'sepolia':
return new ethers.InfuraProvider(
network,
process.env.INFURA_API_KEY,
);
default:
throw new Error('Unknown network');
}
};
const wallet = new ethers.Wallet(process.env.ETH_PRIVATE_KEY, getProvider());
module.exports = {
dao: new ethers.Contract(
getContractAddressByNetworkName(process.env.ETH_NETWORK, 'DAO'),
DAOArtifact.abi,
wallet,
),
proposals: new ethers.Contract(
getContractAddressByNetworkName(process.env.ETH_NETWORK, 'Proposals'),
ProposalsArtifact.abi,
wallet,
),
};

View File

@ -0,0 +1,100 @@
const { registerMatrixMessageHandler } = require('../matrix-bot');
const { setTargetRoomId } = require('../matrix-bot/outbound-queue');
const {
appState,
proposalEventIds,
matrixPools,
} = require('../util/db');
const submitRollup = require('./rollup/submit-rollup');
const { resetBatchItems } = require('./rollup/batch-items');
const { initiateMatrixPools } = require('./rollup/matrix-pools/initiate-matrix-pools');
const initiateMatrixPool = require('./rollup/matrix-pools/initiate');
const read = require('../util/forum/read');
const {
BOT_INSTANCE_ID,
ETH_NETWORK,
} = process.env;
// TODO: Refactor into separate files
const handleCommand = async (client, roomId, event) => {
// Don't handle unhelpful events (ones that aren't text messages, are redacted, or sent by us)
if (event.content?.msgtype !== 'm.text') return;
if (event.sender === await client.getUserId()) return;
const helloRegex = /^!hello\b/i;
const targetRegex = /^!target (.*)\b/i;
const proposalRegex = /\bprop(|osal) ([0-9]+)\b/i;
const submitRollupRegex = /^!submitBatch\b/i;
const resetBatchRegex = /^!resetBatch (.*)\b/i;
const restartMatrixPoolRegex = /^!restartMatrixPool (.*)\b/i;
const { body } = event.content;
if (helloRegex.test(body)) {
console.log(`!hello roomId ${roomId}`);
await client.replyNotice(roomId, event, 'Hello world!');
} else if (targetRegex.test(body)) {
const [, instanceId] = targetRegex.exec(body);
console.log(`!target roomId ${roomId} instanceId ${instanceId}`);
if (instanceId === BOT_INSTANCE_ID) {
setTargetRoomId(roomId);
await appState.put('targetRoomId', roomId);
await client.replyNotice(roomId, event, `Events will be sent to this room (${roomId}) for network ${ETH_NETWORK}`);
}
} else if (proposalRegex.test(body)) {
const [, , proposalIndexStr] = proposalRegex.exec(body);
const proposalIndex = parseInt(proposalIndexStr, 10);
console.log(`mention of proposal ${proposalIndex} in roomId ${roomId}`);
try {
const proposalEventId = await proposalEventIds.get(proposalIndex);
const proposalEventUri = `https://matrix.to/#/${roomId}/${proposalEventId}`;
// TODO: Send HTML message
const content = {
body: `Proposal ${proposalIndex}: ${proposalEventUri}`,
msgtype: 'm.text',
};
if (event.content['m.relates_to']?.rel_type === 'm.thread') {
content['m.relates_to'] = event.content['m.relates_to'];
}
await client.sendEvent(roomId, 'm.room.message', content);
} catch (e) {
// Not found
}
} else if (submitRollupRegex.test(body)) {
console.log('!submitBatch');
const { batchPostId, batchItems, authors } = await submitRollup();
if (batchItems.length) {
await client.replyText(roomId, event, `Submitted batch, post ${batchPostId} with ${batchItems.length} posts by ${authors.length} authors`);
} else {
await client.replyText(roomId, event, 'No matrix pools have finished since the last batch was submitted');
}
} else if (resetBatchRegex.test(body)) {
const [, instanceId] = resetBatchRegex.exec(body);
console.log(`!resetBatch roomId ${roomId} instanceId ${instanceId}`);
if (instanceId === BOT_INSTANCE_ID) {
console.log('!resetBatch');
const batchItems = await resetBatchItems();
await initiateMatrixPools();
await client.replyText(roomId, event, `Reset batch, now contains ${batchItems.length} items`);
}
} else if (restartMatrixPoolRegex.test(body)) {
const [, postId] = restartMatrixPoolRegex.exec(body);
console.log(`!restartMatrixPool roomId ${roomId} postId ${postId}`);
try {
const { sender, fee } = await matrixPools.get(postId);
const post = await read(postId);
await initiateMatrixPool(postId, post, sender, fee);
} catch (e) {
// Can't restart if it was never started
}
}
};
const start = () => {
registerMatrixMessageHandler(handleCommand);
};
module.exports = {
start,
};

View File

@ -0,0 +1,19 @@
const proposalsNotifier = require('./proposals-notifier');
const validationPools = require('./validation-pools');
const work1 = require('./work1');
const rollup = require('./rollup');
const registerIdentity = require('./register-identity');
const botCommands = require('./bot-commands');
const start = () => {
proposalsNotifier.start();
validationPools.start();
work1.start();
rollup.start();
registerIdentity.start();
botCommands.start();
};
module.exports = {
start,
};

View File

@ -0,0 +1,54 @@
const { proposals } = require('../util/contracts');
const read = require('../util/forum/read');
const { sendMatrixText } = require('../matrix-bot/outbound-queue');
const { proposalEventIds } = require('../util/db');
// Subscribe to proposal events
const start = () => {
console.log('registering proposal listener for proposal notifier');
proposals.on('NewProposal', async (proposalIndex) => {
console.log('New Proposal, index', proposalIndex);
const proposal = await proposals.proposals(proposalIndex);
console.log('postId:', proposal.postId);
// Read post from database
let post;
try {
post = await read(proposal.postId);
} catch (e) {
// Post for proposal not found
console.error(`error: post for proposal ${proposalIndex} not found`);
return;
}
console.log('post.content:', post.content);
// Send matrix room event
// TODO: Send HTML message
let message = `Proposal ${proposalIndex}\n\n${post.content}`;
if (post.embeddedData && Object.entries(post.embeddedData).length) {
message += `\n\n${JSON.stringify(post.embeddedData, null, 2)}`;
}
try {
await proposalEventIds.get(Number(proposalIndex));
// If this doesn't throw, it means we already sent a message for this proposal
} catch (e) {
if (e.status === 404) {
console.log('sending new proposal event to room', { message });
const { eventId } = await sendMatrixText(message);
await proposalEventIds.put(Number(proposalIndex), eventId);
}
}
});
proposals.on('ProposalAccepted', async (proposalIndex) => {
console.log('Proposal Accepted, index:', proposalIndex);
// TODO: Send notification as a reply to the new proposal message
});
};
module.exports = {
start,
};

View File

@ -0,0 +1,42 @@
const { recoverPersonalSignature } = require('@metamask/eth-sig-util');
const {
matrixUserToAuthorAddress,
authorAddressToMatrixUser,
} = require('../util/db');
const { registerMatrixEventHandler } = require('../matrix-bot');
const handleRegisterIdentity = async (client, roomId, event) => {
if (event.type !== 'io.dgov.identity.register') return;
const { message, signature } = event.content;
console.log('Received request to register identity');
let account;
try {
account = recoverPersonalSignature({ data: message, signature });
} catch (e) {
console.log('error: failed to recover signature:', e.message);
}
if (account) {
try {
const authorAddress = await matrixUserToAuthorAddress.get(event.sender);
if (account === authorAddress) {
await client.sendNotice(roomId, `Matrix user ${event.sender} author address ${account} already registered`);
} else {
await client.sendNotice(roomId, `Matrix user ${event.sender} updated author address from ${authorAddress} to ${account}`);
}
} catch (e) {
// Not found
await client.sendNotice(roomId, `Matrix user ${event.sender} registered author address ${account}`);
}
await matrixUserToAuthorAddress.put(event.sender, account);
await authorAddressToMatrixUser.put(account, event.sender);
}
};
const start = () => {
registerMatrixEventHandler(handleRegisterIdentity);
};
module.exports = {
start,
};

View File

@ -0,0 +1,39 @@
const { applicationData } = require('../../../util/db');
let batchItems;
const initializeBatchItems = async () => {
try {
batchItems = await applicationData.get('batchItems');
} catch (e) {
batchItems = [];
}
};
const getBatchItems = () => batchItems;
const addBatchItem = async (postId) => {
if (!batchItems.includes(postId)) {
batchItems.push(postId);
await applicationData.put('batchItems', batchItems);
}
};
const clearBatchItems = async (itemsToClear) => {
batchItems = batchItems.filter((item) => !itemsToClear.includes(item));
await applicationData.put('batchItems', batchItems);
};
const resetBatchItems = async () => {
batchItems = [];
await applicationData.put('batchItems', batchItems);
return batchItems;
};
module.exports = {
initializeBatchItems,
getBatchItems,
addBatchItem,
clearBatchItems,
resetBatchItems,
};

View File

@ -0,0 +1,34 @@
const { rollup, wallet } = require('../../../util/contracts');
let batchWorker;
let batchStart;
const getCurrentBatchWorker = () => batchWorker;
const initializeBatchWorker = async () => {
batchWorker = await rollup.batchWorker();
console.log('At startup, batch worker:', batchWorker);
rollup.on('BatchWorkerAssigned', async (batchWorker_) => {
batchWorker = batchWorker_;
batchStart = new Date();
console.log('Batch worker assigned:', batchWorker);
if (batchWorker === await wallet.getAddress()) {
console.log('This instance is the new batch worker');
}
});
};
const setBatchWorker = (batchWorker_) => {
batchWorker = batchWorker_;
};
const getBatchAge = () => (new Date() - batchStart) / 1000;
module.exports = {
getCurrentBatchWorker,
initializeBatchWorker,
setBatchWorker,
getBatchAge,
};

View File

@ -0,0 +1,59 @@
const Promise = require('bluebird');
const read = require('../../../util/forum/read');
const { matrixPools } = require('../../../util/db');
const WEIGHT_TO_REFERENCES = 300000;
const computeBatchPost = async (batchItems_) => {
const weights = {};
let references = [];
await Promise.each(batchItems_, async (postId) => {
const post = await read(postId);
const matrixPool = await matrixPools.get(postId);
const { fee, result: { votePasses, quorumMet } } = matrixPool;
if (votePasses && quorumMet) {
post.authors.forEach(({ authorAddress, weightPPM }) => {
weights[authorAddress] = weights[authorAddress] ?? 0;
// scale by matrix pool fee
weights[authorAddress] += weightPPM * fee;
});
post.references?.forEach(({ targetPostId, weightPPM }) => {
// scale by matrix pool fee
references.push({
targetPostId,
weightPPM: weightPPM * fee,
});
});
}
// TODO: Rewards for policing
});
// Rescale author weights so they sum to 1000000
const sumOfWeights = Object.values(weights).reduce((t, v) => t + v, 0);
if (!sumOfWeights) {
return { authors: [], references: [] };
}
const scaledWeights = Object.values(weights)
.map((weight) => Math.floor((weight * 1000000) / sumOfWeights));
const sumOfScaledWeights = scaledWeights.reduce((t, v) => t + v, 0);
scaledWeights[0] += 1000000 - sumOfScaledWeights;
const authors = Object.keys(weights)
.map((authorAddress, i) => ({ authorAddress, weightPPM: scaledWeights[i] }));
// Rescale reference weights so they sum to WEIGHT_TO_REFERENCES
if (references.length) {
const sumOfReferenceWeights = references.reduce((t, { weightPPM }) => t + weightPPM, 0);
const scaledReferences = references.map((reference) => ({
targetPostId: reference.targetPostId,
weightPPM: Math.floor((reference.weightPPM * WEIGHT_TO_REFERENCES) / sumOfReferenceWeights),
}));
const sumOfScaledReferenceWeights = scaledReferences
.reduce((t, { weightPPM }) => t + weightPPM, 0);
scaledReferences[0].weightPPM += WEIGHT_TO_REFERENCES - sumOfScaledReferenceWeights;
references = scaledReferences;
}
return { authors, references };
};
module.exports = computeBatchPost;

View File

@ -0,0 +1,128 @@
// const { expect } = require('chai');
const assert = require('assert');
const proxyquire = require('proxyquire');
let posts = {};
let pools = {};
const read = (postId) => posts[postId];
const matrixPools = {
get: (postId) => pools[postId],
};
const computeBatchPost = proxyquire('./compute-batch-post', {
'../../../util/forum/read': read,
'../../../util/db': { matrixPools },
});
describe('computeBatchPost', () => {
it('multiple posts by one author', async () => {
posts = {
a: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
b: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
c: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
};
pools = {
a: { fee: 100, result: { votePasses: true, quorumMet: true } },
b: { fee: 100, result: { votePasses: true, quorumMet: true } },
c: { fee: 100, result: { votePasses: true, quorumMet: true } },
};
const { authors, references } = await computeBatchPost(['a', 'b', 'c']);
assert.deepEqual(authors, [{ authorAddress: '0xa1', weightPPM: 1000000 }]);
assert.deepEqual(references, []);
});
it('posts by different authors', async () => {
posts = {
a: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
b: { authors: [{ authorAddress: '0xa2', weightPPM: 1000000 }] },
};
pools = {
a: { fee: 100, result: { votePasses: true, quorumMet: true } },
b: { fee: 100, result: { votePasses: true, quorumMet: true } },
};
const { authors, references } = await computeBatchPost(['a', 'b']);
assert.deepEqual(authors, [
{ authorAddress: '0xa1', weightPPM: 500000 },
{ authorAddress: '0xa2', weightPPM: 500000 },
]);
assert.deepEqual(references, []);
});
it('posts by different authors and pools with different fees', async () => {
posts = {
a: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
b: { authors: [{ authorAddress: '0xa2', weightPPM: 1000000 }] },
};
pools = {
a: { fee: 100, result: { votePasses: true, quorumMet: true } },
b: { fee: 200, result: { votePasses: true, quorumMet: true } },
};
const { authors, references } = await computeBatchPost(['a', 'b']);
assert.deepEqual(authors, [
{ authorAddress: '0xa1', weightPPM: 333334 },
{ authorAddress: '0xa2', weightPPM: 666666 },
]);
assert.deepEqual(references, []);
});
it('posts with multiple authors', async () => {
posts = {
a: { authors: [{ authorAddress: '0xa1', weightPPM: 500000 }, { authorAddress: '0xa2', weightPPM: 500000 }] },
b: { authors: [{ authorAddress: '0xa1', weightPPM: 500000 }, { authorAddress: '0xa3', weightPPM: 500000 }] },
};
pools = {
a: { fee: 100, result: { votePasses: true, quorumMet: true } },
b: { fee: 100, result: { votePasses: true, quorumMet: true } },
};
const { authors, references } = await computeBatchPost(['a', 'b']);
assert.deepEqual(authors, [
{ authorAddress: '0xa1', weightPPM: 500000 },
{ authorAddress: '0xa2', weightPPM: 250000 },
{ authorAddress: '0xa3', weightPPM: 250000 },
]);
assert.deepEqual(references, []);
});
it('post with references', async () => {
posts = {
a: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
b: { authors: [{ authorAddress: '0xa2', weightPPM: 1000000 }], references: [{ targetPostId: 'a', weightPPM: 500000 }] },
};
pools = {
b: { fee: 100, result: { votePasses: true, quorumMet: true } },
};
const { authors, references } = await computeBatchPost(['b']);
assert.deepEqual(authors, [
{ authorAddress: '0xa2', weightPPM: 1000000 },
]);
assert.deepEqual(references, [{ targetPostId: 'a', weightPPM: 300000 }]);
});
it('post with references and pools with different fees', async () => {
posts = {
a: { authors: [{ authorAddress: '0xa1', weightPPM: 1000000 }] },
b: { authors: [{ authorAddress: '0xa2', weightPPM: 1000000 }] },
c: { authors: [{ authorAddress: '0xa3', weightPPM: 1000000 }], references: [{ targetPostId: 'a', weightPPM: 500000 }] },
d: { authors: [{ authorAddress: '0xa4', weightPPM: 1000000 }], references: [{ targetPostId: 'b', weightPPM: 500000 }] },
};
pools = {
c: { fee: 100, result: { votePasses: true, quorumMet: true } },
d: { fee: 200, result: { votePasses: true, quorumMet: true } },
};
const { authors, references } = await computeBatchPost(['c', 'd']);
assert.deepEqual(authors, [
{ authorAddress: '0xa3', weightPPM: 333334 },
{ authorAddress: '0xa4', weightPPM: 666666 },
]);
assert.deepEqual(references, [
{ targetPostId: 'a', weightPPM: 100000 },
{ targetPostId: 'b', weightPPM: 200000 },
]);
});
});

View File

@ -0,0 +1,13 @@
const { rollup } = require('../../../util/contracts');
const fetchBatchItemsInfo = async () => {
// Read from Rollup.items
const itemCount = await rollup.itemCount();
const promises = [];
for (let i = 0; i < itemCount; i += 1) {
promises.push(rollup.items(i));
}
return Promise.all(promises);
};
module.exports = fetchBatchItemsInfo;

View File

@ -0,0 +1,71 @@
const { v4: uuidv4 } = require('uuid');
const write = require('../../../util/forum/write');
const addPostWithRetry = require('../../../util/add-post-with-retry');
const callWithRetry = require('../../../util/call-with-retry');
const { getBatchItems, clearBatchItems } = require('./batch-items');
const computeBatchPost = require('./compute-batch-post');
const { wallet, rollup } = require('../../../util/contracts');
const { sendMatrixEvent } = require('../../../matrix-bot');
const { stakeRollupAvailability } = require('../utils');
const fetchBatchItemsInfo = require('./fetch-batch-items-info');
const submitRollup = async () => {
const availableBatchItems = getBatchItems();
const batchItems = [];
const batchItemsInfo = await fetchBatchItemsInfo();
console.log('available batch items', availableBatchItems);
for (let i = 0; i < batchItemsInfo.length; i += 1) {
const { postId } = batchItemsInfo[i];
if (availableBatchItems.includes(postId)) {
console.log(`post ${postId} is available`);
batchItems.push(postId);
} else {
// Batch items have to be submitted in the correct order, with no gaps
console.log(`post ${postId} is not available`);
break;
}
}
if (!batchItems.length) {
return { batchItems: [] };
}
const { authors, references } = await computeBatchPost(batchItems);
if (!authors.length) {
return { batchItems: [] };
}
const content = `Batch of ${batchItems.length} items`;
const embeddedData = {
batchItems,
nonce: uuidv4().replaceAll('-', ''),
};
const sender = await wallet.getAddress();
const contentToVerify = `${content}\n\n${JSON.stringify(embeddedData, null, 2)}`;
const signature = await wallet.signMessage(contentToVerify);
// Write to the forum database
const { hash: batchPostId } = await write({
sender, authors, references, content, embeddedData, signature,
});
// Add rollup post on-chain
console.log('adding batch post on-chain', { authors, batchPostId, references });
await addPostWithRetry(authors, batchPostId, references);
// Stake our availability to be the next rollup worker
console.log('staking availability to be the next rollup worker');
await stakeRollupAvailability();
// Call Rollup.submitBatch
console.log('Submitting batch', { batchPostId, batchItems, authors });
const poolDuration = 60;
await callWithRetry(() => rollup.submitBatch(batchPostId, batchItems, poolDuration));
// Send matrix event
await sendMatrixEvent('io.dgov.rollup.submit', {
batchPostId, batchItems, authors, references,
});
// Clear the batch in preparation for next batch
await clearBatchItems(batchItems);
return {
batchPostId,
batchItems,
authors,
};
};
module.exports = submitRollup;
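The selection loop in `submitRollup` enforces the "submitted in the correct order, with no gaps" rule by taking the longest contiguous prefix of on-chain items that are locally available. That rule can be sketched on its own (the function name is illustrative):

```javascript
// Walk the on-chain item order and stop at the first item we don't have:
// only a gap-free prefix may be submitted in a batch.
// (Illustrative sketch; not part of the repository.)
const selectContiguousPrefix = (onChainIds, availableIds) => {
  const selected = [];
  for (const postId of onChainIds) {
    if (!availableIds.includes(postId)) break; // gap: stop here
    selected.push(postId);
  }
  return selected;
};
```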

View File

@ -0,0 +1,9 @@
const {
ROLLUP_AVAILABILITY_STAKE_DURATION,
ROLLUP_INTERVAL,
} = process.env;
module.exports = {
rollupInterval: ROLLUP_INTERVAL,
availabilityStakeDuration: ROLLUP_AVAILABILITY_STAKE_DURATION || 600,
};

View File

@ -0,0 +1,189 @@
const { isEqual } = require('lodash');
const { registerDecider } = require('../validation-pools');
const { registerMatrixEventHandler, sendMatrixText, sendMatrixEvent } = require('../../matrix-bot');
const { matrixPools, matrixUserToAuthorAddress } = require('../../util/db');
const {
rollup, wallet,
} = require('../../util/contracts');
const read = require('../../util/forum/read');
const { availabilityStakeDuration } = require('./config');
const {
stakeRollupAvailability, authorsMatch, validatePost,
referencesMatch,
} = require('./utils');
const computeMatrixPoolResult = require('./matrix-pools/compute-result');
const { initializeBatchItems, addBatchItem, clearBatchItems } = require('./batch/batch-items');
const { getCurrentBatchWorker, initializeBatchWorker } = require('./batch/batch-worker');
const initiateMatrixPool = require('./matrix-pools/initiate');
const { initiateMatrixPools } = require('./matrix-pools/initiate-matrix-pools');
const computeBatchPost = require('./batch/compute-batch-post');
const start = async () => {
console.log('registering validation pool decider for rollup');
registerDecider(async (pool, post) => {
// If this is not sent by the work1 contract, it's not of interest here.
if (pool.sender !== rollup.target) return false;
// A rollup post should contain
// - a list of off-chain validation pools
// - authorship corresponding to the result of those off-chain pools
if (!post.embeddedData?.batchItems) return false;
// Our task here is to check whether the posted result agrees with our own computations
try {
const { authors, references } = await computeBatchPost(post.embeddedData.batchItems);
const valid = authorsMatch(post.authors, authors)
&& referencesMatch(post.references, references);
console.log(`batch post ${pool.props.postId} is ${valid ? 'valid' : 'invalid'}`);
return valid;
} catch (e) {
console.error('Error calculating batch post author weights', e);
return null;
}
});
// Even if we're not the current batch worker, keep track of batch items
initializeBatchItems();
// Check for an assigned batch worker
await initializeBatchWorker();
// Stake availability and set an interval to maintain it
await stakeRollupAvailability();
setInterval(stakeRollupAvailability, availabilityStakeDuration * 1000);
// Initiate any matrix pools that haven't already occurred
await initiateMatrixPools();
// `sender` is the address that called Rollup.addItem on chain, i.e. the Work2 contract.
rollup.on('BatchItemAdded', async (postId, sender, fee) => {
// If we are the batch worker or there is no batch worker, initiate a matrix pool
const batchWorker = getCurrentBatchWorker();
if (batchWorker === await wallet.getAddress()
|| batchWorker === '0x0000000000000000000000000000000000000000') {
let post;
try {
post = await read(postId);
} catch (e) {
console.error(`Post ID ${postId} not found`);
return;
}
// Initialize a matrix pool
try {
await matrixPools.get(postId);
// If this doesn't throw, it means we or someone else already sent this event
console.log(`Matrix pool start event has already been sent for postId ${postId}`);
} catch (e) {
if (e.status === 404) {
await initiateMatrixPool(postId, post, sender, fee);
} else {
throw e;
}
}
}
});
registerMatrixEventHandler(async (client, roomId, event) => {
switch (event.type) {
case 'io.dgov.pool.start': {
// Note that matrix pools are identified by the postId to which they pertain.
// This means that for a given post there can only be one matrix pool at a time.
const { postId, sender, ...params } = event.content;
// We can use LevelDB to store information about validation pools
const eventId = event.event_id;
console.log('Matrix pool started', { postId, ...params });
// Validate the target post, and stake for/against
let post;
try {
post = await read(postId);
} catch (e) {
console.error(`Post ID ${postId} not found`);
break;
}
// Register our own stake and send a message
const { amount, inFavor } = await validatePost(sender, post);
sendMatrixEvent('io.dgov.pool.stake', { postId, amount, inFavor });
const matrixPool = {
postId,
roomId,
eventId,
...params,
stakes: [{ amount, inFavor, account: await wallet.getAddress() }],
};
await matrixPools.put(postId, matrixPool);
break;
}
case 'io.dgov.pool.stake': {
const { postId, amount, inFavor } = event.content;
let account;
try {
account = await matrixUserToAuthorAddress.get(event.sender);
} catch (e) {
// Error, sender has not registered their matrix identity
sendMatrixText(`Matrix user ${event.sender} has not registered their wallet address`);
break;
}
let matrixPool;
try {
matrixPool = await matrixPools.get(postId);
} catch (e) {
// Error. matrix pool not found
sendMatrixText(`Received stake for unknown matrix pool, for post ${postId}. Stake sent by ${event.sender}`);
break;
}
const stake = { account, amount, inFavor };
matrixPool.stakes = matrixPool.stakes ?? [];
matrixPool.stakes.push(stake);
await matrixPools.put(postId, matrixPool);
console.log(`registered stake in matrix pool for post ${postId} by ${account}`);
break;
}
case 'io.dgov.pool.result': {
// This should be sent by the current batch worker
// const { stakedFor, stakedAgainst, totalSupply, votePasses, quorumMet, } = result;
const { postId, result } = event.content;
let matrixPool;
try {
matrixPool = await matrixPools.get(postId);
} catch (e) {
// Error. matrix pool not found
sendMatrixText(`Received result for unknown matrix pool, for post ${postId}. Result sent by ${event.sender}`);
break;
}
// Compare batch worker's result with ours to verify and provide early warning
const expectedResult = await computeMatrixPoolResult(matrixPool);
if (!isEqual(result, expectedResult)) {
sendMatrixText(`Unexpected result for matrix pool, for post ${postId}. Result sent by ${event.sender}\n\n`
+ `received ${JSON.stringify(result)}\n`
+ `expected ${JSON.stringify(expectedResult)}`);
}
matrixPool.result = result;
await matrixPools.put(postId, matrixPool);
await addBatchItem(postId);
break;
}
case 'io.dgov.rollup.submit': {
// This should include the identifier of the on-chain validation pool
const {
batchPostId, batchItems, authors, references,
} = event.content;
// Compare batch worker's result with ours to verify
const { authors: expectedAuthors, references: expectedReferences } = await computeBatchPost(batchItems);
if (!authorsMatch(authors, expectedAuthors)
|| !referencesMatch(references, expectedReferences)) {
sendMatrixText(`Unexpected result for batch post ${batchPostId}`);
}
// Reset batchItems in preparation for next batch
await clearBatchItems(batchItems);
break;
}
default:
}
});
};
module.exports = {
start,
};

View File

@ -0,0 +1,23 @@
const {
dao,
} = require('../../../util/contracts');
const computeMatrixPoolResult = async (matrixPool) => {
// This should already contain all the info we need to evaluate the outcome
const { stakes, quorum, winRatio } = matrixPool;
const stakedFor = stakes
.filter((x) => x.inFavor)
.reduce((total, { amount }) => total + amount, 0);
const stakedAgainst = stakes
.filter((x) => !x.inFavor)
.reduce((total, { amount }) => total + amount, 0);
const votePasses = stakedFor * winRatio[1] >= (stakedFor + stakedAgainst) * winRatio[0];
const totalSupply = Number(await dao.totalSupply());
const quorumMet = (stakedFor + stakedAgainst) * quorum[1] >= totalSupply * quorum[0];
const result = {
stakedFor, stakedAgainst, totalSupply, votePasses, quorumMet,
};
return result;
};
module.exports = computeMatrixPoolResult;
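The cross-multiplication in `computeMatrixPoolResult` avoids floating-point division when comparing against the `[numerator, denominator]` ratios. A quick numeric check of that arithmetic (all values below are illustrative):

```javascript
// With winRatio = [1, 2] (a simple majority) and quorum = [1, 3]
// (one third of total supply must participate):
const stakes = [
  { amount: 60, inFavor: true },
  { amount: 40, inFavor: false },
];
const winRatio = [1, 2];
const quorum = [1, 3];
const totalSupply = 240;
const stakedFor = stakes.filter((x) => x.inFavor).reduce((t, { amount }) => t + amount, 0);
const stakedAgainst = stakes.filter((x) => !x.inFavor).reduce((t, { amount }) => t + amount, 0);
// 60 * 2 >= 100 * 1, so the vote passes
const votePasses = stakedFor * winRatio[1] >= (stakedFor + stakedAgainst) * winRatio[0];
// 100 * 3 >= 240 * 1, so quorum is met
const quorumMet = (stakedFor + stakedAgainst) * quorum[1] >= totalSupply * quorum[0];
```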

View File

@ -0,0 +1,45 @@
const { sendMatrixEvent } = require('../../../matrix-bot');
const { wallet } = require('../../../util/contracts');
const { matrixPools } = require('../../../util/db');
const { addBatchItem, getBatchItems } = require('../batch/batch-items');
const { getCurrentBatchWorker, getBatchAge } = require('../batch/batch-worker');
const computeMatrixPoolResult = require('./compute-result');
const { rollupInterval } = require('../config');
const submitRollup = require('../batch/submit-rollup');
const { stakeRollupAvailability } = require('../utils');
const evaluateMatrixPoolOutcome = async (postId) => {
const matrixPool = await matrixPools.get(postId);
const result = await computeMatrixPoolResult(matrixPool);
console.log(`Matrix pool for post ${postId} outcome evaluated`, result);
matrixPool.result = result;
await matrixPools.put(postId, matrixPool);
sendMatrixEvent('io.dgov.pool.result', { postId, result });
await addBatchItem(postId);
let submitBatch = false;
const batchWorker = getCurrentBatchWorker();
if (batchWorker === '0x0000000000000000000000000000000000000000') {
// If there's no batch worker, we should stake our availability
// and then submit the batch immediately.
console.log('There is no batch worker assigned. Staking availability and submitting first batch.');
submitBatch = true;
} else if (batchWorker === await wallet.getAddress()) {
// If we are the batch worker, we should wait an appropriate amount of time /
// number of matrix pools before submitting a batch.
const batchAge = getBatchAge();
const batchItems = getBatchItems();
if (batchAge > rollupInterval && batchItems.length) {
console.log(`Batch age = ${batchAge}, size = ${batchItems.length}. Submitting batch.`);
submitBatch = true;
}
}
if (submitBatch) {
await stakeRollupAvailability();
await submitRollup();
}
return result;
};
module.exports = evaluateMatrixPoolOutcome;

@ -0,0 +1,35 @@
const Promise = require('bluebird');
const { matrixPools } = require('../../../util/db');
const read = require('../../../util/forum/read');
const initiateMatrixPool = require('./initiate');
const { addBatchItem, getBatchItems } = require('../batch/batch-items');
const fetchBatchItemsInfo = require('../batch/fetch-batch-items-info');
const initiateMatrixPools = async () => {
const batchItemsInfo = await fetchBatchItemsInfo();
// Make sure there's a matrix pool for each batch item.
// If one doesn't exist, start it.
await Promise.each(batchItemsInfo, async ({ postId, sender, fee }) => {
let post;
try {
post = await read(postId);
} catch (e) {
console.error(`Post ID ${postId} not found`);
return;
}
try {
const matrixPool = await matrixPools.get(postId);
if (matrixPool.result) {
await addBatchItem(postId);
}
} catch (e) {
// TODO: It's possible we missed messages about pools that have already occurred.
await initiateMatrixPool(postId, post, sender, fee);
}
});
console.log('batch items count:', getBatchItems().length);
};
module.exports = {
initiateMatrixPools,
};

@ -0,0 +1,47 @@
const { sendMatrixEvent } = require('../../../matrix-bot');
const { validatePost } = require('../utils');
const evaluateMatrixPoolOutcome = require('./evaluate');
const { matrixPools } = require('../../../util/db');
const { wallet } = require('../../../util/contracts');
const initiateMatrixPool = async (postId, post, sender, fee) => {
const duration = 20;
const quorum = [1, 3];
const winRatio = [1, 2];
const params = {
sender,
fee: Number(fee),
duration,
quorum,
winRatio,
};
console.log('sending matrix pool start event');
const { roomId, eventId } = await sendMatrixEvent('io.dgov.pool.start', {
postId,
...params,
});
console.log('sent matrix pool start event');
// Register our own stake and send a message
const { amount, inFavor } = await validatePost(sender, post);
sendMatrixEvent('io.dgov.pool.stake', { postId, amount, inFavor });
const matrixPool = {
postId,
roomId,
eventId,
...params,
stakes: [{ amount, inFavor, account: await wallet.getAddress() }],
};
await matrixPools.put(postId, matrixPool);
// Since we're assuming responsibility as the batch worker,
// set a timeout to evaluate the outcome
setTimeout(
() => {
evaluateMatrixPoolOutcome(postId);
},
duration * 1000,
);
};
module.exports = initiateMatrixPool;

@ -0,0 +1,58 @@
const callWithRetry = require('../../util/call-with-retry');
const {
rollup, wallet, dao,
work2,
} = require('../../util/contracts');
const { availabilityStakeDuration } = require('./config');
const stakeRollupAvailability = async () => {
const currentRep = await dao.balanceOf(await wallet.getAddress());
if (currentRep) {
await callWithRetry(() => dao.stakeAvailability(
rollup.target,
currentRep,
availabilityStakeDuration,
));
}
};
const authorsMatch = async (authors, expectedAuthors) => {
if (expectedAuthors.length !== authors.length) return false;
return authors.every(({ authorAddress, weightPPM }) => {
const expectedAuthor = expectedAuthors.find((x) => x.authorAddress === authorAddress);
return !!expectedAuthor && weightPPM === expectedAuthor.weightPPM;
});
};
const referencesMatch = async (references, expectedReferences) => {
if (expectedReferences.length !== references.length) return false;
return references.every(({ targetPostId, weightPPM }) => {
const expectedReference = expectedReferences.find((x) => x.targetPostId === targetPostId);
return !!expectedReference && weightPPM === expectedReference.weightPPM;
});
};
const validateWorkEvidence = async (sender, post) => {
let valid = false;
if (sender === work2.target) {
const expectedContent = 'This is a work evidence post';
valid = post.content.startsWith(expectedContent);
}
console.log(`Work evidence ${valid ? 'matched' : 'did not match'} the expected content`);
return valid;
};
const validatePost = async (sender, post) => {
const currentRep = Number(await dao.balanceOf(await wallet.getAddress()));
const valid = await validateWorkEvidence(sender, post);
const stake = { amount: currentRep, inFavor: valid };
return stake;
};
module.exports = {
stakeRollupAvailability,
authorsMatch,
referencesMatch,
validateWorkEvidence,
validatePost,
};
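The match helpers compare author and reference weights expressed in parts per million (PPM), so two equal co-authors carry 500000 each. A standalone sketch of the authorsMatch logic with hypothetical addresses (the guard on `expected` avoids a crash when lengths match but an address differs):

```javascript
// Sketch of the authorsMatch comparison with hypothetical data.
const authorsMatch = (authors, expectedAuthors) => {
  if (expectedAuthors.length !== authors.length) return false;
  return authors.every(({ authorAddress, weightPPM }) => {
    const expected = expectedAuthors.find((x) => x.authorAddress === authorAddress);
    return !!expected && weightPPM === expected.weightPPM;
  });
};

// Hypothetical author sets, equal co-authors at 500000 PPM each
const expectedAuthors = [
  { authorAddress: '0x1111', weightPPM: 500000 },
  { authorAddress: '0x2222', weightPPM: 500000 },
];
console.log(authorsMatch(expectedAuthors, expectedAuthors)); // true
console.log(authorsMatch(
  [
    { authorAddress: '0x1111', weightPPM: 500000 },
    { authorAddress: '0x3333', weightPPM: 500000 },
  ],
  expectedAuthors,
)); // false: 0x3333 is not among the expected authors
```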

@ -0,0 +1,82 @@
const Promise = require('bluebird');
const { dao, wallet } = require('../util/contracts');
const read = require('../util/forum/read');
const gateByProposal = require('../util/gate-by-proposal');
const {
ENABLE_STAKING,
} = process.env;
const deciders = [];
const registerDecider = (decider) => {
deciders.push(decider);
};
let enableStaking;
if (ENABLE_STAKING === 'false') {
console.log('STAKING DISABLED');
enableStaking = false;
} else {
gateByProposal(((enable) => {
enableStaking = enable;
}));
}
const start = async () => {
dao.on('ValidationPoolInitiated', async (poolIndex) => {
console.log('Validation Pool Initiated, index', poolIndex);
const pool = await dao.getValidationPool(poolIndex);
// Read post from database
let post;
try {
post = await read(pool.props.postId);
} catch (e) {
// Post not found
console.error(`error: post for validation pool ${poolIndex} not found`);
return;
}
console.log('postId:', pool.props.postId);
console.log('post.content:', post.content);
// We have the opportunity to stake for/against this validation pool.
// To implement the legislative process of upgrading this protocol,
// the execution of this code can be protected by a given proposal.
// The code will only execute if the proposal has been accepted.
if (!enableStaking) {
return;
}
const decisions = await Promise.mapSeries(deciders, (decider) => decider(pool, post));
const inFavor = decisions.some((x) => x === true);
const nullResult = decisions.some((x) => x === null);
const currentRep = await dao.balanceOf(await wallet.getAddress());
let stakeAmount = currentRep;
if (!inFavor && nullResult) {
console.log(`Obtained a NULL RESULT for pool ${poolIndex}.`);
// TODO: Retry?
// TODO: Notify
// Calculate the minimum stake S against the post, such that if the honest actors
// each stake S, the result will be enough to meet the win ratio.
// This way, we combat the threat of a truly invalid post,
// while reducing our exposure in the case that the error is unique to us.
// Assume 2/3 honest actors.
// S * (2/3) = 1/3
// S = 1/2;
stakeAmount = Math.ceil(currentRep / 2);
}
// Stake all available reputation
console.log(`STAKING ${stakeAmount} ${inFavor ? 'in favor of' : 'against'} pool ${poolIndex}`);
try {
await dao.stakeOnValidationPool(poolIndex, stakeAmount, inFavor);
} catch (e) {
// Maybe the end time passed?
console.error(`STAKING failed, reason: ${e.reason}`);
}
});
};
module.exports = {
start,
registerDecider,
};

@ -0,0 +1,26 @@
const { getContractAddressByNetworkName } = require('../util/contract-config');
const { registerDecider } = require('./validation-pools');
const {
ETH_NETWORK,
} = process.env;
const work1Address = getContractAddressByNetworkName(ETH_NETWORK, 'Work1');
const start = async () => {
console.log('registering validation pool decider for work1');
registerDecider((pool, post) => {
// If this is not sent by the work1 contract, it's not of interest here.
if (pool.sender !== work1Address) return false;
const expectedContent = 'This is a work evidence post';
const result = post.content.startsWith(expectedContent);
console.log(`Work evidence ${result ? 'matched' : 'did not match'} the expected content`);
return result;
});
};
module.exports = {
start,
};

@ -1,5 +1,20 @@
require('dotenv').config();
require('./api').start();
require('./matrix').start();
require('./proposals').start();
const api = require('./api');
const matrixBot = require('./matrix-bot');
const eventHandlers = require('./event-handlers');
const {
ENABLE_API,
ENABLE_MATRIX,
} = process.env;
if (ENABLE_API !== 'false') {
api.start();
}
if (ENABLE_MATRIX !== 'false') {
matrixBot.start();
}
eventHandlers.start();

@ -0,0 +1,64 @@
const {
AutojoinRoomsMixin,
MatrixClient,
RustSdkCryptoStorageProvider,
SimpleFsStorageProvider,
} = require('matrix-bot-sdk');
const {
MATRIX_HOMESERVER_URL,
MATRIX_ACCESS_TOKEN,
BOT_STORAGE_PATH,
BOT_CRYPTO_STORAGE_PATH,
} = process.env;
const storageProvider = new SimpleFsStorageProvider(BOT_STORAGE_PATH);
const cryptoProvider = new RustSdkCryptoStorageProvider(BOT_CRYPTO_STORAGE_PATH);
console.log('MATRIX_HOMESERVER_URL:', MATRIX_HOMESERVER_URL);
const matrixClient = new MatrixClient(
MATRIX_HOMESERVER_URL,
MATRIX_ACCESS_TOKEN,
storageProvider,
cryptoProvider,
);
let joinedRooms;
const { initializeOutboundQueue, sendMatrixEvent, sendMatrixText } = require('./outbound-queue');
const start = async () => {
// Automatically join a room to which we are invited
AutojoinRoomsMixin.setupOnClient(matrixClient);
joinedRooms = await matrixClient.getJoinedRooms();
console.log('joined rooms:', joinedRooms);
matrixClient.start().then(() => {
console.log('Matrix bot started!');
// Start the outbound queue
initializeOutboundQueue(matrixClient);
});
};
const registerMatrixMessageHandler = (eventHandler) => {
matrixClient.on('room.message', async (roomId, event) => {
if (event.sender === await matrixClient.getUserId()) return;
eventHandler(matrixClient, roomId, event);
});
};
const registerMatrixEventHandler = (eventHandler) => {
matrixClient.on('room.event', async (roomId, event) => {
if (event.sender === await matrixClient.getUserId()) return;
if (event.state_key !== undefined) return; // state event
eventHandler(matrixClient, roomId, event);
});
};
module.exports = {
start,
matrixClient,
registerMatrixMessageHandler,
registerMatrixEventHandler,
sendMatrixEvent,
sendMatrixText,
};

@ -0,0 +1,82 @@
const fastq = require('fastq');
const { applicationData } = require('../util/db');
let matrixClient;
let targetRoomId;
const processOutboundQueue = async ({ type, ...args }) => {
switch (type) {
case 'MatrixEvent': {
const { eventType, content, onSend } = args;
const eventId = await matrixClient.sendEvent(targetRoomId, eventType, content);
onSend(targetRoomId, eventId);
break;
}
case 'MatrixText': {
const { text, onSend } = args;
const eventId = await matrixClient.sendText(targetRoomId, text);
onSend(targetRoomId, eventId);
break;
}
default:
}
};
const outboundQueue = fastq.promise(processOutboundQueue, 1);
// Pause outbound queue until matrixClient and targetRoomId are set
outboundQueue.pause();
const setTargetRoomId = async (roomId) => {
targetRoomId = roomId;
console.log('target room ID:', targetRoomId);
await applicationData.put('targetRoomId', targetRoomId);
if (matrixClient) {
console.log('Starting Matrix outbound queue processor');
outboundQueue.resume();
}
};
const initializeOutboundQueue = async (matrixClient_) => {
matrixClient = matrixClient_;
try {
targetRoomId = await applicationData.get('targetRoomId');
console.log('target room ID:', targetRoomId);
} catch (e) {
// No target room set
console.warn('target room ID is not set -- will not be able to send messages until it is set. Use !target <bot-id>');
}
if (targetRoomId) {
console.log('Starting Matrix outbound queue processor');
outboundQueue.resume();
}
};
const sendMatrixEvent = async (eventType, content) => new Promise((resolve) => {
outboundQueue.push({
type: 'MatrixEvent',
eventType,
content,
onSend: ((roomId, eventId) => {
resolve({ roomId, eventId });
}),
});
});
const sendMatrixText = async (text) => new Promise((resolve) => {
outboundQueue.push({
type: 'MatrixText',
text,
onSend: ((roomId, eventId) => {
resolve({ roomId, eventId });
}),
});
});
module.exports = {
setTargetRoomId,
outboundQueue,
initializeOutboundQueue,
sendMatrixEvent,
sendMatrixText,
};

@ -1,164 +0,0 @@
const {
AutojoinRoomsMixin,
MatrixClient,
RustSdkCryptoStorageProvider,
SimpleFsStorageProvider,
} = require('matrix-bot-sdk');
const fastq = require('fastq');
const { recoverPersonalSignature } = require('@metamask/eth-sig-util');
const {
appState,
proposalEventIds,
matrixUserToAuthorAddress,
authorAddressToMatrixUser,
} = require('./db');
const {
MATRIX_HOMESERVER_URL,
MATRIX_ACCESS_TOKEN,
BOT_STORAGE_PATH,
BOT_CRYPTO_STORAGE_PATH,
BOT_INSTANCE_ID,
ETH_NETWORK,
} = process.env;
const storageProvider = new SimpleFsStorageProvider(BOT_STORAGE_PATH);
const cryptoProvider = new RustSdkCryptoStorageProvider(BOT_CRYPTO_STORAGE_PATH);
let client;
let joinedRooms;
let targetRoomId;
const processOutboundQueue = async ({ type, ...args }) => {
if (!targetRoomId) return;
switch (type) {
case 'NewProposal': {
const { proposalIndex, text } = args;
try {
await proposalEventIds.get(Number(proposalIndex));
} catch (e) {
if (e.status === 404) {
console.log('sending to room', targetRoomId, { text });
const eventId = await client.sendText(targetRoomId, text);
await proposalEventIds.put(Number(proposalIndex), eventId);
}
}
break;
}
default:
}
};
const outboundQueue = fastq(processOutboundQueue, 1);
outboundQueue.pause();
const start = async () => {
console.log('MATRIX_HOMESERVER_URL:', MATRIX_HOMESERVER_URL);
client = new MatrixClient(
MATRIX_HOMESERVER_URL,
MATRIX_ACCESS_TOKEN,
storageProvider,
cryptoProvider,
);
// Automatically join a room to which we are invited
AutojoinRoomsMixin.setupOnClient(client);
joinedRooms = await client.getJoinedRooms();
console.log('joined rooms:', joinedRooms);
try {
targetRoomId = await appState.get('targetRoomId');
} catch (e) {
// Leave targetRoomId uninitialized for now
}
const handleCommand = async (roomId, event) => {
// Don't handle unhelpful events (ones that aren't text messages, are redacted, or sent by us)
if (event.content?.msgtype !== 'm.text') return;
if (event.sender === await client.getUserId()) return;
const helloRegex = /^!hello\b/i;
const targetRegex = /^!target (.*)\b/i;
const proposalRegex = /\bprop(|osal) ([0-9]+)\b/i;
const { body } = event.content;
if (helloRegex.test(body)) {
console.log(`!hello roomId ${roomId}`);
await client.replyNotice(roomId, event, 'Hello world!');
} else if (targetRegex.test(body)) {
const [, instanceId] = targetRegex.exec(body);
console.log(`!target roomId ${roomId} instanceId ${instanceId}`);
if (instanceId === BOT_INSTANCE_ID) {
targetRoomId = roomId;
await appState.put('targetRoomId', targetRoomId);
await client.replyNotice(roomId, event, `Proposal events will be sent to this room for network ${ETH_NETWORK}`);
}
} else if (proposalRegex.test(body)) {
const [, , proposalIndexStr] = proposalRegex.exec(body);
const proposalIndex = parseInt(proposalIndexStr, 10);
console.log(`mention of proposal ${proposalIndex} in roomId ${roomId}`);
try {
const proposalEventId = await proposalEventIds.get(proposalIndex);
const proposalEventUri = `https://matrix.to/#/${roomId}/${proposalEventId}`;
const content = {
body: `Proposal ${proposalIndex}: ${proposalEventUri}`,
msgtype: 'm.text',
};
if (event.content['m.relates_to']?.rel_type === 'm.thread') {
content['m.relates_to'] = event.content['m.relates_to'];
}
await client.sendEvent(roomId, 'm.room.message', content);
} catch (e) {
// Not found
}
}
};
const handleRegisterIdentity = async (roomId, event) => {
const { message, signature } = event.content;
console.log('Received request to register identity');
let account;
try {
account = recoverPersonalSignature({ data: message, signature });
} catch (e) {
console.log('error: failed to recover signature:', e.message);
}
if (account) {
await matrixUserToAuthorAddress.put(event.sender, account);
await authorAddressToMatrixUser.put(account, event.sender);
}
await client.sendNotice(roomId, `Registered matrix user ${event.sender} to author address ${account}`);
};
// Before we start the bot, register our command handler
client.on('room.message', handleCommand);
// Handler for custom events
client.on('room.event', (roomId, event) => {
// Note that state events can also be sent down this listener too
if (event.state_key !== undefined) return; // state event
switch (event.type) {
case 'io.dgov.identity.register':
handleRegisterIdentity(roomId, event);
break;
default:
}
});
client.start().then(() => {
console.log('Bot started!');
outboundQueue.resume();
});
};
const sendNewProposalEvent = (proposalIndex, text) => {
outboundQueue.push({ type: 'NewProposal', proposalIndex, text });
};
module.exports = {
start,
sendNewProposalEvent,
};

@ -1,28 +0,0 @@
const { proposals } = require('./contracts');
const read = require('./read');
const { sendNewProposalEvent } = require('./matrix');
// Subscribe to proposal events
const start = () => {
proposals.on('NewProposal', async (proposalIndex) => {
console.log('New Proposal, index', proposalIndex);
const proposal = await proposals.proposals(proposalIndex);
console.log('postId:', proposal.postId);
// Read post from database
const post = await read(proposal.postId);
console.log('post.content:', post.content);
// Send matrix room event
let message = `Proposal ${proposalIndex}\n\n${post.content}`;
if (post.embeddedData && Object.entries(post.embeddedData).length) {
message += `\n\n${JSON.stringify(post.embeddedData, null, 2)}`;
}
sendNewProposalEvent(proposalIndex, message);
});
};
module.exports = {
start,
};

@ -1,34 +0,0 @@
const objectHash = require('object-hash');
const verifySignature = require('./verify-signature');
const { forum } = require('./db');
const read = async (hash) => {
// Fetch content
const data = await forum.get(hash);
data.embeddedData = data.embeddedData || undefined;
const {
authors, content, signature, embeddedData, citations,
} = data;
// Verify hash
const derivedHash = objectHash({
authors, content, signature, embeddedData,
});
if (derivedHash !== hash) {
throw new Error('hash mismatch');
}
// Verify signature
if (!verifySignature(data)) {
throw new Error('signature verificaition failed');
}
return {
authors, content, signature, embeddedData, citations,
};
};
module.exports = read;

@ -0,0 +1,16 @@
const callWithRetry = require('./call-with-retry');
const { dao } = require('./contracts');
const addPostWithRetry = async (authors, hash, references) => {
try {
await callWithRetry(() => dao.addPost(authors, hash, references));
} catch (e) {
if (e.reason === 'A post with this postId already exists') {
return { alreadyAdded: true };
}
throw e;
}
return { alreadyAdded: false };
};
module.exports = addPostWithRetry;

@ -0,0 +1,18 @@
const Promise = require('bluebird');
const callWithRetry = async (contractCall, retryDelay = 5000) => {
let result;
try {
result = await contractCall();
} catch (e) {
if (e.code === 'REPLACEMENT_UNDERPRICED') {
console.log('retry delay (sec):', retryDelay / 1000);
await Promise.delay(retryDelay);
return callWithRetry(contractCall, retryDelay * 2);
}
throw e;
}
return result;
};
module.exports = callWithRetry;
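The retry helper doubles its delay on each `REPLACEMENT_UNDERPRICED` failure (an ethers.js transaction-replacement error). A self-contained sketch of the same pattern, using a plain setTimeout-based delay instead of bluebird's `Promise.delay`:

```javascript
// Minimal sketch of the callWithRetry exponential-backoff pattern.
const delay = (ms) => new Promise((resolve) => { setTimeout(resolve, ms); });

const callWithRetry = async (contractCall, retryDelay = 5000) => {
  try {
    return await contractCall();
  } catch (e) {
    if (e.code === 'REPLACEMENT_UNDERPRICED') {
      await delay(retryDelay);
      // Double the delay for the next attempt
      return callWithRetry(contractCall, retryDelay * 2);
    }
    throw e;
  }
};

// Usage sketch: a hypothetical call that fails once, then succeeds.
let attempts = 0;
const flaky = async () => {
  attempts += 1;
  if (attempts < 2) {
    const err = new Error('replacement underpriced');
    err.code = 'REPLACEMENT_UNDERPRICED';
    throw err;
  }
  return 'ok';
};
callWithRetry(flaky, 10).then((result) => console.log(result, attempts)); // ok 2
```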

@ -1,4 +1,4 @@
const contractAddresses = require('../contract-addresses.json');
const contractAddresses = require('../../contract-addresses.json');
const networks = {
localhost: '0x539',

@ -0,0 +1,56 @@
const ethers = require('ethers');
const { getContractAddressByNetworkName } = require('./contract-config');
const DAOArtifact = require('../../contractArtifacts/DAO.json');
const ProposalsArtifact = require('../../contractArtifacts/Proposals.json');
const RollupArtifact = require('../../contractArtifacts/Rollup.json');
const Work2Artifact = require('../../contractArtifacts/Work2.json');
const {
ETH_NETWORK,
ETH_PRIVATE_KEY,
INFURA_API_KEY,
} = process.env;
console.log('network:', ETH_NETWORK);
const getProvider = () => {
switch (ETH_NETWORK) {
case 'localhost':
return ethers.getDefaultProvider('http://localhost:8545');
case 'sepolia':
return new ethers.InfuraProvider(
ETH_NETWORK,
INFURA_API_KEY,
);
default:
throw new Error('Unknown network');
}
};
const wallet = new ethers.Wallet(ETH_PRIVATE_KEY, getProvider());
module.exports = {
wallet,
getProvider,
dao: new ethers.Contract(
getContractAddressByNetworkName(ETH_NETWORK, 'DAO'),
DAOArtifact.abi,
wallet,
),
proposals: new ethers.Contract(
getContractAddressByNetworkName(ETH_NETWORK, 'Proposals'),
ProposalsArtifact.abi,
wallet,
),
rollup: new ethers.Contract(
getContractAddressByNetworkName(ETH_NETWORK, 'Rollup'),
RollupArtifact.abi,
wallet,
),
work2: new ethers.Contract(
getContractAddressByNetworkName(ETH_NETWORK, 'Work2'),
Work2Artifact.abi,
wallet,
),
};

@ -3,6 +3,7 @@ const { Level } = require('level');
const dataDir = process.env.LEVEL_DATA_DIR || 'data';
module.exports = {
applicationData: new Level(`${dataDir}/applicationData`, { valueEncoding: 'json' }),
forum: new Level(`${dataDir}/forum`, { valueEncoding: 'json' }),
authorAddresses: new Level(`${dataDir}/authorAddresses`),
authorPrivKeys: new Level(`${dataDir}/authorPrivKeys`),
@ -11,4 +12,5 @@ module.exports = {
referendumEventIds: new Level(`${dataDir}/referendumEventIds`, { keyEncoding: 'json' }),
matrixUserToAuthorAddress: new Level(`${dataDir}/matrixUserToAuthorAddress`),
authorAddressToMatrixUser: new Level(`${dataDir}/authorAddressToMatrixUser`),
matrixPools: new Level(`${dataDir}/matrixPools`, { valueEncoding: 'json' }),
};

@ -0,0 +1,36 @@
const objectHash = require('object-hash');
const verifySignature = require('../verify-signature');
const { forum } = require('../db');
const read = async (hash) => {
// Fetch content
const data = await forum.get(hash);
data.embeddedData = data.embeddedData || undefined;
const {
sender, authors, content, signature, embeddedData, references,
} = data;
// Verify hash
const derivedHash = objectHash({
sender, authors, content, signature, embeddedData,
});
if (derivedHash !== hash) {
throw new Error('hash mismatch');
}
if (signature) {
// Verify signature
if (!verifySignature(data)) {
throw new Error('signature verification failed');
}
}
return {
sender, authors, content, signature, embeddedData, references,
};
};
module.exports = read;

@ -0,0 +1,56 @@
const objectHash = require('object-hash');
const verifySignature = require('../verify-signature');
const { forum } = require('../db');
const read = require('./read');
const write = async ({
sender, authors, content, references, embeddedData, signature,
}) => {
// Check author signature
if (!verifySignature({
sender, authors, content, signature, embeddedData,
})) {
const err = new Error();
err.status = 403;
err.message = 'Signature verification failed';
throw err;
}
// Compute content hash
const data = {
sender, authors, content, signature, embeddedData, references,
};
// We omit references from the hash in order to support forum graph import.
// When a post is imported, the hashes can be precomputed for cited posts,
// without traversing the graph infinitely to compute hashes along the entire reference chain.
const hash = objectHash({
sender, authors, content, signature, embeddedData,
});
// Make sure a post with this hash has not already been written
let existingPost;
try {
existingPost = await read(hash);
// If this doesn't throw, it means a post with this hash was already written
} catch (e) {
if (e.status !== 404) {
throw e;
}
}
if (existingPost) {
const err = new Error();
err.status = 403;
err.message = `A post with hash ${hash} already exists`;
throw err;
}
// Store content
await forum.put(hash, data);
// Return hash
return { hash, data };
};
module.exports = write;

@ -0,0 +1,61 @@
const { proposals } = require('./contracts');
const {
START_PROPOSAL_ID,
STOP_PROPOSAL_ID,
} = process.env;
const gateByProposal = async (enable) => {
enable(true);
if (STOP_PROPOSAL_ID) {
enable(false);
// Check for status
const proposal = await proposals.proposals(STOP_PROPOSAL_ID);
if (proposal.stage === BigInt(5)) {
// Proposal is accepted
enable(false);
console.log(`STOP_PROPOSAL_ID ${STOP_PROPOSAL_ID} proposal is accepted. Disabling staking.`);
} else if (proposal.stage === BigInt(4)) {
// Proposal is failed
console.log(`STOP_PROPOSAL_ID ${STOP_PROPOSAL_ID} proposal is failed. No effect.`);
} else {
// Register a ProposalAccepted event handler.
console.log(`STOP_PROPOSAL_ID ${STOP_PROPOSAL_ID} proposal is stage ${proposal.stage.toString()}. Registering listener.`);
const proposalAcceptedHandler = (proposalIndex) => {
if (proposalIndex.toString() === STOP_PROPOSAL_ID) {
console.log(`STOP_PROPOSAL_ID ${STOP_PROPOSAL_ID} proposal is accepted. Disabling staking.`);
enable(false);
proposals.off('ProposalAccepted', proposalAcceptedHandler);
}
};
proposals.on('ProposalAccepted', proposalAcceptedHandler);
}
}
if (START_PROPOSAL_ID) {
enable(false);
// Check for status
const proposal = await proposals.proposals(START_PROPOSAL_ID);
if (proposal.stage === BigInt(5)) {
// Proposal is accepted
enable(true);
console.log(`START_PROPOSAL_ID ${START_PROPOSAL_ID} proposal is accepted. Enabling staking.`);
} else if (proposal.stage === BigInt(4)) {
// Proposal is failed
console.log(`START_PROPOSAL_ID ${START_PROPOSAL_ID} proposal is failed. Disabling staking.`);
} else {
// Register a ProposalAccepted event handler.
console.log(`START_PROPOSAL_ID ${START_PROPOSAL_ID} proposal is stage ${proposal.stage.toString()}. Registering listener.`);
const proposalAcceptedHandler = (proposalIndex) => {
if (proposalIndex.toString() === START_PROPOSAL_ID) {
console.log(`START_PROPOSAL_ID ${START_PROPOSAL_ID} proposal is accepted. Enabling staking.`);
enable(true);
proposals.off('ProposalAccepted', proposalAcceptedHandler);
}
};
proposals.on('ProposalAccepted', proposalAcceptedHandler);
}
}
};
module.exports = gateByProposal;

@ -1,7 +1,7 @@
const { recoverPersonalSignature } = require('@metamask/eth-sig-util');
const verifySignature = ({
authors, content, signature, embeddedData,
sender, authors, content, signature, embeddedData,
}) => {
let contentToVerify = content;
if (embeddedData && Object.entries(embeddedData).length) {
@ -9,9 +9,12 @@ const verifySignature = ({
}
try {
const account = recoverPersonalSignature({ data: contentToVerify, signature });
const authorAddresses = authors.map((author) => author.authorAddress.toLowerCase());
if (!authorAddresses.includes(account.toLowerCase())) {
console.log('error: signer is not among the authors');
const addresses = authors
.map((author) => author.authorAddress)
.concat(sender ? [sender] : [])
.map((authorAddress) => authorAddress.toLowerCase());
if (!addresses.includes(account.toLowerCase())) {
console.log('error: signer is not among the authors or sender');
return false;
}
} catch (e) {

@ -0,0 +1,11 @@
const ethers = require('ethers');
const { getProvider } = require('./contracts');
const {
ETH_PRIVATE_KEY,
} = process.env;
const wallet = new ethers.Wallet(ETH_PRIVATE_KEY, getProvider());
module.exports = wallet;

@ -1,35 +0,0 @@
const objectHash = require('object-hash');
const verifySignature = require('./verify-signature');
const { forum } = require('./db');
module.exports = async (req, res) => {
const {
body: {
authors, content, signature, embeddedData, citations,
},
} = req;
// Check author signature
if (!verifySignature({
authors, content, signature, embeddedData,
})) {
res.status(403).end();
return;
}
// Compute content hash
const data = {
authors, content, signature, embeddedData, citations,
};
const hash = objectHash({
authors, content, signature, embeddedData,
});
console.log('write', hash);
console.log(data);
// Store content
await forum.put(hash, data);
// Return hash
res.send(hash);
};

@ -6,4 +6,5 @@ MAINNET_PRIVATE_KEY=
SEED_PHRASE=
ETHERSCAN_API_KEY=
WORK1_PRICE="0.001"
ONBOARDING_PRICE="0.001"
ONBOARDING_PRICE="0.001"
ROLLUP_INTERVAL=120

@ -1,14 +1,24 @@
{
"localhost": {
"DAO": "0x57BDFFf79108E5198dec6268A6BFFD8B62ECfA38",
"Work1": "0xB8f0cd092979F273b752FDa060F82BF2745f192e",
"Onboarding": "0x8F00038542C87A5eAf18d5938B7723bF2A04A4e4",
"Proposals": "0x6c18eb38b7450F8DaE5A5928A40fcA3952493Ee4"
"DAO": "0x3734B0944ea37694E85AEF60D5b256d19EDA04be",
"Work1": "0x8BDA04936887cF11263B87185E4D19e8158c6296",
"Onboarding": "0x8688E736D0D72161db4D25f68EF7d0EE4856ba19",
"Proposals": "0x3287061aDCeE36C1aae420a06E4a5EaE865Fe3ce",
"Rollup": "0x71cb20D63576a0Fa4F620a2E96C73F82848B09e1",
"Work2": "0x76Dfe9F47f06112a1b78960bf37d87CfbB6D6133",
"Reputation": "0xEAefe601Aad7422307B99be65bbE005aeA966012",
"Forum": "0x79e365342329560e8420d7a0f016633d7640cB18",
"Bench": "0xC0f00E5915F9abE6476858fD1961EAf79395ea64"
},
"sepolia": {
"DAO": "0x8e5bd58B2ca8910C5F9be8de847d6883B15c60d2",
"Work1": "0x1708A144F284C1a9615C25b674E4a08992CE93e4",
"Onboarding": "0xb21D4c986715A1adb5e87F752842613648C20a7B",
"Proposals": "0x930c47293F206780E8F166338bDaFF3520306032"
"DAO": "0xBA2e65ae29667E145343bD5Fd655A72dcf873b08",
"Work1": "0x251dB891768ea85DaCA6bb567669F97248D09Fe3",
"Onboarding": "0x78FC8b520001560A9D7a61072855218320C71BDC",
"Proposals": "0xA888cDC4Bd80d402b14B1FeDE5FF471F1737570c",
"Reputation": "0x62cc0035B17F1686cE30320B90373c77fcaA58CD",
"Forum": "0x51b5Af12707e0d879B985Cb0216bFAC6dca85501",
"Bench": "0x98d9F0e97Af71936747819040ddBE896A548ef4d",
"Rollup": "0x678DC2c846bfDCC813ea27DfEE428f1d7f2521ED",
"Work2": "0x609102Fb6cA15da80D37E8cA68aBD5e1bD9C855B"
}
}

@ -0,0 +1,84 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./core/DAO.sol";
import "./interfaces/IAcceptAvailability.sol";
contract Availability is IAcceptAvailability, DAOContract {
struct AvailabilityStake {
address worker;
uint256 amount;
uint endTime;
bool assigned;
}
mapping(uint => AvailabilityStake) public stakes;
mapping(address worker => uint stakeIndex) activeWorkerStakes;
uint public stakeCount;
event AvailabilityStaked(uint stakeIndex);
constructor(DAO dao) DAOContract(dao) {}
/// Accept availability stakes as reputation token transfer
function acceptAvailability(
address worker,
uint256 amount,
uint duration
) external returns (uint refund) {
require(
msg.sender == address(dao),
"acceptAvailability must only be called by DAO contract"
);
require(amount > 0, "No stake provided");
// If we already have a stake for this worker, replace it
uint stakeIndex = activeWorkerStakes[worker];
if (stakeIndex == 0 && stakes[stakeIndex].worker != worker) {
// We don't have an existing stake for this worker
stakeIndex = stakeCount++;
activeWorkerStakes[worker] = stakeIndex;
} else if (stakes[stakeIndex].assigned) {
// Stake has already been assigned; We need to create a new one
stakeIndex = stakeCount++;
activeWorkerStakes[worker] = stakeIndex;
} else {
// We are replacing an existing stake.
// That means we can refund some of the granted allowance
refund = stakes[stakeIndex].amount;
}
AvailabilityStake storage stake = stakes[stakeIndex];
stake.worker = worker;
stake.amount = amount;
stake.endTime = block.timestamp + duration;
emit AvailabilityStaked(stakeIndex);
}
/// Select a worker randomly from among the available workers, weighted by amount staked
function randomWeightedSelection() internal view returns (uint stakeIndex) {
uint totalStakes;
for (uint i = 0; i < stakeCount; i++) {
if (stakes[i].assigned) continue;
if (block.timestamp > stakes[i].endTime) continue;
totalStakes += stakes[i].amount;
}
require(totalStakes > 0, "No available worker stakes");
uint select = block.prevrandao % totalStakes;
uint acc;
for (uint i = 0; i < stakeCount; i++) {
if (stakes[i].assigned) continue;
if (block.timestamp > stakes[i].endTime) continue;
acc += stakes[i].amount;
if (acc > select) {
stakeIndex = i;
break;
}
}
}
/// Assign a random available worker
function assignWork() internal returns (uint stakeIndex) {
stakeIndex = randomWeightedSelection();
AvailabilityStake storage stake = stakes[stakeIndex];
stake.assigned = true;
}
}
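The `randomWeightedSelection` function above walks a cumulative sum of unassigned, unexpired stake amounts and picks the index where the sum first exceeds a random value. A JavaScript model of the same walk, with an injected `select` value standing in for `block.prevrandao % totalStakes`:

```javascript
// Model of the weighted-selection walk in Availability.sol.
// `select` is assumed to satisfy 0 <= select < sum(amounts).
const weightedSelect = (amounts, select) => {
  let acc = 0;
  for (let i = 0; i < amounts.length; i += 1) {
    acc += amounts[i];
    if (acc > select) return i;
  }
  return -1; // unreachable for valid select values
};

// Stakes of 10, 30, 60: select 0-9 -> index 0, 10-39 -> 1, 40-99 -> 2,
// so selection probability is proportional to the amount staked.
console.log(weightedSelect([10, 30, 60], 5)); // 0
console.log(weightedSelect([10, 30, 60], 25)); // 1
console.log(weightedSelect([10, 30, 60], 99)); // 2
```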

@ -0,0 +1,105 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
struct Reference {
int weightPPM;
string targetPostId;
}
struct Author {
uint weightPPM;
address authorAddress;
}
struct Post {
string id;
address sender;
Author[] authors;
Reference[] references;
string content;
}
contract GlobalForum {
mapping(string => Post) posts;
string[] public postIds;
uint public postCount;
event PostAdded(string id);
function addPost(
Author[] calldata authors,
string calldata postId,
Reference[] calldata references
) external {
require(authors.length > 0, "Post must include at least one author");
postCount++;
postIds.push(postId);
Post storage post = posts[postId];
require(
post.authors.length == 0,
"A post with this postId already exists"
);
post.sender = msg.sender;
post.id = postId;
uint authorTotalWeightPPM;
for (uint i = 0; i < authors.length; i++) {
authorTotalWeightPPM += authors[i].weightPPM;
post.authors.push(authors[i]);
}
require(
authorTotalWeightPPM == 1000000,
"Author weights must sum to 1000000"
);
for (uint i = 0; i < references.length; i++) {
post.references.push(references[i]);
}
int totalReferenceWeightPos;
int totalReferenceWeightNeg;
for (uint i = 0; i < post.references.length; i++) {
int weight = post.references[i].weightPPM;
require(
weight >= -1000000,
"Each reference weight must be >= -1000000"
);
require(
weight <= 1000000,
"Each reference weight must be <= 1000000"
);
if (weight > 0) totalReferenceWeightPos += weight;
else totalReferenceWeightNeg += weight;
}
require(
totalReferenceWeightPos <= 1000000,
"Sum of positive references must be <= 1000000"
);
require(
totalReferenceWeightNeg >= -1000000,
"Sum of negative references must be >= -1000000"
);
emit PostAdded(postId);
}
function getPostAuthors(
string calldata postId
) external view returns (Author[] memory) {
Post storage post = posts[postId];
return post.authors;
}
function getPost(
string calldata postId
)
external
view
returns (
Author[] memory authors,
Reference[] memory references,
address sender
)
{
Post storage post = posts[postId];
authors = post.authors;
references = post.references;
sender = post.sender;
}
}
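`addPost` enforces fixed-point weights in parts per million: author weights must sum to exactly 1,000,000, and reference weights are bounded both individually and in aggregate. A hedged Python restatement of just those checks (the names here are ours, not the contract's):

```python
PPM = 1_000_000  # parts-per-million fixed-point base used throughout the contracts

# Mirror of addPost's weight validation, taking bare weight lists for brevity.
def validate_post(author_weights, reference_weights):
    if not author_weights:
        raise ValueError("Post must include at least one author")
    if sum(author_weights) != PPM:
        raise ValueError("Author weights must sum to 1000000")
    if any(w < -PPM or w > PPM for w in reference_weights):
        raise ValueError("Each reference weight must be within [-1000000, 1000000]")
    if sum(w for w in reference_weights if w > 0) > PPM:
        raise ValueError("Sum of positive references must be <= 1000000")
    if sum(w for w in reference_weights if w < 0) < -PPM:
        raise ValueError("Sum of negative references must be >= -1000000")
    return True
```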

View File

@ -2,16 +2,16 @@
pragma solidity ^0.8.24;
import "./core/DAO.sol";
import "./core/Forum.sol";
import "./WorkContract.sol";
import "./Work.sol";
import "./interfaces/IOnValidate.sol";
contract Onboarding is WorkContract, IOnValidate {
contract Onboarding is Work, IOnValidate {
constructor(
DAO dao_,
Proposals proposals_,
uint price_
) WorkContract(dao_, proposals_, price_) {}
DAO dao,
GlobalForum forum,
Proposals proposals,
uint price
) Work(dao, forum, proposals, price) {}
/// Accept work approval/disapproval from customer
function submitWorkApproval(
@ -29,13 +29,13 @@ contract Onboarding is WorkContract, IOnValidate {
// Make work evidence post
Author[] memory authors = new Author[](1);
authors[0] = Author(1000000, stake.worker);
dao.addPost(authors, request.evidenceContentId, request.citations);
forum.addPost(authors, request.evidencePostId, request.references);
emit WorkApprovalSubmitted(requestIndex, approval);
// Initiate validation pool
uint poolIndex = dao.initiateValidationPool{
value: request.fee - request.fee / 10
}(
request.evidenceContentId,
request.evidencePostId,
POOL_DURATION,
[uint256(1), uint256(3)],
[uint256(1), uint256(2)],
@ -60,7 +60,7 @@ contract Onboarding is WorkContract, IOnValidate {
uint,
uint,
bytes calldata callbackData
) external returns (uint) {
) external {
require(
msg.sender == address(dao),
"onValidate may only be called by the DAO contract"
@ -70,15 +70,15 @@ contract Onboarding is WorkContract, IOnValidate {
if (!votePasses || !quorumMet) {
// refund the customer the remaining amount
payable(request.customer).transfer(request.fee / 10);
return 1;
return;
}
// Make onboarding post
Citation[] memory emptyCitations;
Reference[] memory emptyReferences;
Author[] memory authors = new Author[](1);
authors[0] = Author(1000000, request.customer);
dao.addPost(authors, request.requestContentId, emptyCitations);
forum.addPost(authors, request.requestPostId, emptyReferences);
dao.initiateValidationPool{value: request.fee / 10}(
request.requestContentId,
request.requestPostId,
POOL_DURATION,
[uint256(1), uint256(3)],
[uint256(1), uint256(2)],
@ -87,6 +87,5 @@ contract Onboarding is WorkContract, IOnValidate {
false,
""
);
return 0;
}
}
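The `Onboarding` flow above splits the customer fee with integer division: `fee - fee / 10` funds the evidence validation pool immediately, and the `fee / 10` holdback is either refunded to the customer or spent on the onboarding post's pool. A small Python sketch of that split (the `split_fee` name is ours):

```python
# Sketch of the Onboarding fee split, using truncating integer division as Solidity does.
def split_fee(fee):
    holdback = fee // 10          # refunded on failure, or funds the onboarding post's pool
    evidence_pool = fee - holdback  # funds the work-evidence validation pool up front
    return evidence_pool, holdback
```

Because the holdback is computed first and subtracted, the two parts always sum back to the original fee even when `fee` is not a multiple of 10.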

View File

@ -59,22 +59,17 @@ contract Proposals is DAOContract, IOnValidate {
// TODO receive : we want to be able to accept refunds from validation pools
/// Submit a post as a proposal. forum.addPost should be called before this.
function propose(
string calldata contentId,
address author,
string calldata postId,
uint[3] calldata durations,
bool callbackOnAccepted,
bytes calldata callbackData
) external payable returns (uint proposalIndex) {
// TODO: Take citations as a parameter
Citation[] memory emptyCitations;
Author[] memory authors = new Author[](1);
authors[0] = Author(1000000, author);
dao.addPost(authors, contentId, emptyCitations);
proposalIndex = proposalCount++;
Proposal storage proposal = proposals[proposalIndex];
proposal.sender = msg.sender;
proposal.postId = contentId;
proposal.postId = postId;
proposal.startTime = block.timestamp;
proposal.referenda[0].duration = durations[0];
proposal.referenda[1].duration = durations[1];
@ -158,7 +153,7 @@ contract Proposals is DAOContract, IOnValidate {
uint stakedFor,
uint stakedAgainst,
bytes calldata callbackData
) external returns (uint) {
) external {
require(
msg.sender == address(dao),
"onValidate may only be called by the DAO contract"
@ -182,7 +177,7 @@ contract Proposals is DAOContract, IOnValidate {
proposal.stage = Stage.Failed;
emit ProposalFailed(proposalIndex, "Quorum not met");
proposal.remainingFee += fee;
return 1;
return;
}
// Participation threshold of 50%
@ -243,7 +238,7 @@ contract Proposals is DAOContract, IOnValidate {
} else if (proposal.stage == Stage.Referendum100) {
initiateValidationPool(proposalIndex, 2, proposal.fee / 10);
}
return 0;
return;
}
/// External function that will advance a proposal to the referendum process

View File

@ -0,0 +1,52 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./Work.sol";
import "./Rollup.sol";
abstract contract RollableWork is Work {
Rollup immutable rollupContract;
constructor(
DAO dao,
GlobalForum forum,
Proposals proposalsContract,
Rollup rollupContract_,
uint price
) Work(dao, forum, proposalsContract, price) {
rollupContract = rollupContract_;
}
/// Accept work approval/disapproval from customer
function submitWorkApproval(
uint requestIndex,
bool approval
) external override {
WorkRequest storage request = requests[requestIndex];
require(
request.status == WorkStatus.EvidenceSubmitted,
"Status must be EvidenceSubmitted"
);
AvailabilityStake storage stake = stakes[request.stakeIndex];
request.status = WorkStatus.ApprovalSubmitted;
request.approval = approval;
emit WorkApprovalSubmitted(requestIndex, approval);
// Make work evidence post
Author[] memory authors = new Author[](1);
authors[0] = Author(1000000, stake.worker);
forum.addPost(authors, request.evidencePostId, request.references);
// send worker stakes and customer fee to rollup contract
dao.forwardAllowance(
stake.worker,
address(rollupContract),
stake.amount
);
rollupContract.addItem{value: request.fee}(
stake.worker,
stake.amount,
request.evidencePostId
);
}
}

View File

@ -0,0 +1,141 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./core/DAO.sol";
import "./Availability.sol";
contract Rollup is Availability {
struct BatchItem {
address sender;
address worker;
uint stakeAmount;
uint fee;
string postId;
}
mapping(uint => BatchItem) public items;
uint public itemCount;
address public batchWorker;
uint batchWorkerStakeIndex;
uint public immutable batchInterval;
uint public batchStart;
uint lastWorkerReset;
uint constant minResetInterval = 120;
event BatchItemAdded(string postId, address sender, uint fee);
event BatchWorkerAssigned(address batchWorker);
constructor(DAO dao, uint batchInterval_) Availability(dao) {
batchInterval = batchInterval_;
}
/// Instead of initiating a validation pool, call this method to include
/// the stakes and fee in the next batch validation pool
function addItem(
address author,
uint stakeAmount,
string calldata postId
) public payable {
BatchItem storage item = items[itemCount++];
item.sender = msg.sender;
item.worker = author;
item.stakeAmount = stakeAmount;
item.fee = msg.value;
item.postId = postId;
emit BatchItemAdded(postId, item.sender, item.fee);
}
/// To be called by the currently assigned batch worker.
/// If no batch worker has been assigned, this may be called by anybody,
/// but it will only succeed if it is able to assign a new worker.
function submitBatch(
string calldata batchPostId,
string[] calldata batchItems,
uint poolDuration
) public returns (uint poolIndex) {
if (batchWorker != address(0)) {
require(
msg.sender == batchWorker,
"Batch result must be submitted by current batch worker"
);
}
require(batchItems.length <= itemCount, "Batch size too large");
// Make sure all batch items match
for (uint i = 0; i < batchItems.length; i++) {
require(
keccak256(bytes(batchItems[i])) ==
keccak256(bytes(items[i].postId)),
"Batch item mismatch"
);
}
// initiate a validation pool for this batch
uint fee;
for (uint i = 0; i < batchItems.length; i++) {
fee += items[i].fee;
}
poolIndex = dao.initiateValidationPool{value: fee}(
batchPostId,
poolDuration,
[uint256(1), uint256(3)],
[uint256(1), uint256(2)],
100,
true,
false,
""
);
// Include all the availability stakes from the batched work
for (uint i = 0; i < batchItems.length; i++) {
dao.delegatedStakeOnValidationPool(
poolIndex,
items[i].worker,
items[i].stakeAmount,
true
);
}
// Include availability stakes from the batch worker
if (batchWorker != address(0)) {
dao.delegatedStakeOnValidationPool(
poolIndex,
batchWorker,
stakes[batchWorkerStakeIndex].amount,
true
);
}
if (batchItems.length < itemCount) {
// Some items were added after this batch was computed.
// Keep them in the queue to be included in the next batch.
for (uint i = 0; i < itemCount - batchItems.length; i++) {
items[i] = items[batchItems.length + i];
}
itemCount = itemCount - batchItems.length;
} else {
// Reset item count so we can start the next batch
itemCount = 0;
}
// Select the next batch worker
batchWorkerStakeIndex = assignWork();
batchWorker = stakes[batchWorkerStakeIndex].worker;
batchStart = block.timestamp;
emit BatchWorkerAssigned(batchWorker);
}
/// If the batch worker fails to submit the batch, a new batch worker may be selected
function resetBatchWorker() public {
// TODO: Grace period after the current batch is due and before the worker can be replaced
require(
block.timestamp - batchStart > batchInterval,
"Current batch interval has not yet elapsed"
);
require(itemCount > 0, "Current batch is empty");
require(
lastWorkerReset == 0 ||
block.timestamp - lastWorkerReset >= minResetInterval,
"Minimum reset interval has not elapsed since last batch worker reset"
);
// TODO: Submit a validation pool targeting a null post, and send the worker's availability stake
// This gives the DAO an opportunity to police the failed work
// Select a new batch worker
batchWorkerStakeIndex = assignWork();
batchWorker = stakes[batchWorkerStakeIndex].worker;
}
}
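`submitBatch` requires the submitted batch to be a prefix of the current item queue, pools the matching fees into one validation pool, and keeps any items added after the batch was computed queued for the next batch. A simplified Python model of that queue handling (names are ours; staking and worker assignment are omitted):

```python
# Model of submitBatch's queue handling: verify the batch is a prefix of the
# queue, sum its fees, and return the leftover items for the next batch.
def submit_batch(queue, batch_post_ids):
    if len(batch_post_ids) > len(queue):
        raise ValueError("Batch size too large")
    for post_id, item in zip(batch_post_ids, queue):
        if post_id != item["postId"]:
            raise ValueError("Batch item mismatch")
    fee = sum(item["fee"] for item in queue[:len(batch_post_ids)])
    return fee, queue[len(batch_post_ids):]
```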

View File

@ -2,23 +2,11 @@
pragma solidity ^0.8.24;
import "./core/DAO.sol";
import "./core/Forum.sol";
import "./Availability.sol";
import "./Proposals.sol";
import "./interfaces/IAcceptAvailability.sol";
import "./interfaces/IOnProposalAccepted.sol";
abstract contract WorkContract is
DAOContract,
IAcceptAvailability,
IOnProposalAccepted
{
struct AvailabilityStake {
address worker;
uint256 amount;
uint endTime;
bool assigned;
}
abstract contract Work is Availability, IOnProposalAccepted {
enum WorkStatus {
Requested,
EvidenceSubmitted,
@ -31,9 +19,9 @@ abstract contract WorkContract is
uint256 fee;
WorkStatus status;
uint stakeIndex;
string requestContentId;
string evidenceContentId;
Citation[] citations;
string requestPostId;
string evidencePostId;
Reference[] references;
bool approval;
}
@ -42,18 +30,16 @@ abstract contract WorkContract is
uint proposalIndex;
}
GlobalForum forum;
Proposals proposalsContract;
uint public price;
mapping(uint => PriceProposal) public priceProposals;
uint public priceProposalCount;
mapping(uint => AvailabilityStake) public stakes;
uint public stakeCount;
mapping(uint => WorkRequest) public requests;
uint public requestCount;
uint constant POOL_DURATION = 20;
event AvailabilityStaked(uint stakeIndex);
event WorkAssigned(uint requestIndex, uint stakeIndex);
event WorkEvidenceSubmitted(uint requestIndex);
event WorkApprovalSubmitted(uint requestIndex, bool approval);
@ -62,89 +48,32 @@ abstract contract WorkContract is
constructor(
DAO dao,
GlobalForum forum_,
Proposals proposalsContract_,
uint price_
) DAOContract(dao) {
) Availability(dao) {
price = price_;
proposalsContract = proposalsContract_;
}
/// Accept availability stakes as reputation token transfer
function acceptAvailability(
address sender,
uint256 amount,
uint duration
) external {
require(amount > 0, "No stake provided");
uint stakeIndex = stakeCount++;
AvailabilityStake storage stake = stakes[stakeIndex];
stake.worker = sender;
stake.amount = amount;
stake.endTime = block.timestamp + duration;
emit AvailabilityStaked(stakeIndex);
}
function extendAvailability(uint stakeIndex, uint duration) external {
AvailabilityStake storage stake = stakes[stakeIndex];
require(
msg.sender == stake.worker,
"Worker can only extend their own availability stake"
);
require(!stake.assigned, "Stake has already been assigned work");
if (block.timestamp > stake.endTime) {
stake.endTime = block.timestamp + duration;
} else {
stake.endTime = stake.endTime + duration;
}
emit AvailabilityStaked(stakeIndex);
}
/// Select a worker randomly from among the available workers, weighted by amount staked
function randomWeightedSelection() internal view returns (uint stakeIndex) {
uint totalStakes;
for (uint i = 0; i < stakeCount; i++) {
if (stakes[i].assigned) continue;
if (block.timestamp > stakes[i].endTime) continue;
totalStakes += stakes[i].amount;
}
require(totalStakes > 0, "No available worker stakes");
uint select = block.prevrandao % totalStakes;
uint acc;
for (uint i = 0; i < stakeCount; i++) {
if (stakes[i].assigned) continue;
if (block.timestamp > stakes[i].endTime) continue;
acc += stakes[i].amount;
if (acc > select) {
stakeIndex = i;
break;
}
}
}
/// Assign a random available worker
function assignWork(uint requestIndex) internal returns (uint stakeIndex) {
stakeIndex = randomWeightedSelection();
AvailabilityStake storage stake = stakes[stakeIndex];
stake.assigned = true;
emit WorkAssigned(requestIndex, stakeIndex);
forum = forum_;
}
/// Accept work request with fee
function requestWork(string calldata requestContentId) external payable {
function requestWork(string calldata requestPostId) external payable {
require(msg.value >= price, "Insufficient fee");
uint requestIndex = requestCount++;
WorkRequest storage request = requests[requestIndex];
request.customer = msg.sender;
request.fee = msg.value;
request.stakeIndex = assignWork(requestIndex);
request.requestContentId = requestContentId;
request.stakeIndex = assignWork();
request.requestPostId = requestPostId;
emit WorkAssigned(requestIndex, request.stakeIndex);
}
/// Accept work evidence from worker
function submitWorkEvidence(
uint requestIndex,
string calldata evidenceContentId,
Citation[] calldata citations
string calldata evidencePostId,
Reference[] calldata references
) external {
WorkRequest storage request = requests[requestIndex];
require(
@ -157,9 +86,9 @@ abstract contract WorkContract is
"Worker can only submit evidence for work they are assigned"
);
request.status = WorkStatus.EvidenceSubmitted;
request.evidenceContentId = evidenceContentId;
for (uint i = 0; i < citations.length; i++) {
request.citations.push(citations[i]);
request.evidencePostId = evidencePostId;
for (uint i = 0; i < references.length; i++) {
request.references.push(references[i]);
}
emit WorkEvidenceSubmitted(requestIndex);
}
@ -180,11 +109,11 @@ abstract contract WorkContract is
// Make work evidence post
Author[] memory authors = new Author[](1);
authors[0] = Author(1000000, stake.worker);
dao.addPost(authors, request.evidenceContentId, request.citations);
forum.addPost(authors, request.evidencePostId, request.references);
emit WorkApprovalSubmitted(requestIndex, approval);
// Initiate validation pool
uint poolIndex = dao.initiateValidationPool{value: request.fee}(
request.evidenceContentId,
request.evidencePostId,
POOL_DURATION,
[uint256(1), uint256(3)],
[uint256(1), uint256(2)],
@ -202,9 +131,11 @@ abstract contract WorkContract is
);
}
/// Initiate a new proposal to change the price for this work contract.
/// This takes a postId; forum.addPost should be called before or concurrently with this.
function proposeNewPrice(
uint newPrice,
string calldata contentId,
string calldata postId,
uint[3] calldata durations
) external payable {
uint priceProposalIndex = priceProposalCount++;
@ -214,13 +145,7 @@ abstract contract WorkContract is
priceProposal.price = newPrice;
priceProposal.proposalIndex = proposalsContract.propose{
value: msg.value
}(
contentId,
msg.sender,
durations,
true,
abi.encode(priceProposalIndex)
);
}(postId, durations, true, abi.encode(priceProposalIndex));
emit PriceChangeProposed(priceProposalIndex);
}

View File

@ -2,13 +2,14 @@
pragma solidity ^0.8.24;
import "./core/DAO.sol";
import "./WorkContract.sol";
import "./Work.sol";
import "./Proposals.sol";
contract Work1 is WorkContract {
contract Work1 is Work {
constructor(
DAO dao_,
Proposals proposals_,
uint price_
) WorkContract(dao_, proposals_, price_) {}
DAO dao,
GlobalForum forum,
Proposals proposals,
uint price
) Work(dao, forum, proposals, price) {}
}

View File

@ -0,0 +1,17 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./core/DAO.sol";
import "./RollableWork.sol";
import "./Proposals.sol";
import "./Rollup.sol";
contract Work2 is RollableWork {
constructor(
DAO dao,
GlobalForum forum,
Proposals proposals,
Rollup rollup,
uint price
) RollableWork(dao, forum, proposals, rollup, price) {}
}

View File

@ -0,0 +1,419 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./DAO.sol";
import "../GlobalForum.sol";
struct ValidationPoolParams {
uint duration;
uint[2] quorum; // [ Numerator, Denominator ]
uint[2] winRatio; // [ Numerator, Denominator ]
uint bindingPercent;
bool redistributeLosingStakes;
}
struct ValidationPoolProps {
string postId;
uint fee;
uint minted;
uint endTime;
bool resolved;
bool outcome;
}
contract Bench {
struct Stake {
uint id;
bool inFavor;
uint amount;
address sender;
}
struct Pool {
uint id;
address sender;
mapping(uint => Stake) stakes;
uint stakeCount;
ValidationPoolParams params;
ValidationPoolProps props;
bool callbackOnValidate;
bytes callbackData;
}
mapping(uint => Pool) public validationPools;
uint public validationPoolCount;
DAO dao;
GlobalForum forum;
// Validation Pool parameters
uint constant minDuration = 1; // 1 second
uint constant maxDuration = 365_000_000 days; // 1 million years
uint[2] minQuorum = [1, 10];
// Forum parameters
// TODO: Make depth limit configurable; take as param
uint depthLimit = 3;
mapping(string => mapping(string => int)) _edgeBalances;
function registerDAO(DAO dao_, GlobalForum forum_) external {
require(
address(dao) == address(0),
"A DAO has already been registered"
);
dao = dao_;
forum = forum_;
}
/// Register a stake for/against a validation pool
function stakeOnValidationPool(
uint poolIndex,
address sender,
uint256 amount,
bool inFavor
) external {
require(
msg.sender == address(dao),
"Only DAO contract may call stakeOnValidationPool"
);
Pool storage pool = validationPools[poolIndex];
require(
block.timestamp <= pool.props.endTime,
"Pool end time has passed"
);
// We don't call _update here; We defer that until evaluateOutcome.
uint stakeIndex = pool.stakeCount++;
Stake storage s = pool.stakes[stakeIndex];
s.sender = sender;
s.inFavor = inFavor;
s.amount = amount;
s.id = stakeIndex;
}
/// Accept fee to initiate a validation pool
function initiateValidationPool(
address sender,
string calldata postId,
uint duration,
uint[2] calldata quorum, // [Numerator, Denominator]
uint[2] calldata winRatio, // [Numerator, Denominator]
uint bindingPercent,
bool redistributeLosingStakes,
bool callbackOnValidate,
bytes calldata callbackData
) external payable returns (uint poolIndex) {
require(
msg.sender == address(dao),
"Only DAO contract may call initiateValidationPool"
);
require(duration >= minDuration, "Duration is too short");
require(duration <= maxDuration, "Duration is too long");
require(
minQuorum[1] * quorum[0] >= minQuorum[0] * quorum[1],
"Quorum is below minimum"
);
require(quorum[0] <= quorum[1], "Quorum is greater than one");
require(winRatio[0] <= winRatio[1], "Win ratio is greater than one");
require(bindingPercent <= 100, "Binding percent must be <= 100");
poolIndex = validationPoolCount++;
Pool storage pool = validationPools[poolIndex];
pool.id = poolIndex;
pool.sender = sender;
pool.props.postId = postId;
pool.props.fee = msg.value;
pool.props.endTime = block.timestamp + duration;
pool.params.quorum = quorum;
pool.params.winRatio = winRatio;
pool.params.bindingPercent = bindingPercent;
pool.params.redistributeLosingStakes = redistributeLosingStakes;
pool.params.duration = duration;
pool.callbackOnValidate = callbackOnValidate;
pool.callbackData = callbackData;
// We use our privilege as the DAO contract to mint reputation in proportion with the fee.
// Here we assume a minting ratio of 1
// TODO: Make minting ratio an adjustable parameter
dao.mint(address(dao), pool.props.fee);
pool.props.minted = msg.value;
dao.emitValidationPoolInitiated(poolIndex);
}
/// Evaluate outcome of a validation pool
function evaluateOutcome(uint poolIndex) public returns (bool votePasses) {
require(
msg.sender == address(dao),
"Only DAO contract may call evaluateOutcome"
);
Pool storage pool = validationPools[poolIndex];
require(pool.props.resolved == false, "Pool is already resolved");
uint stakedFor;
uint stakedAgainst;
Stake storage s;
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
// Make sure the sender still has the required balance.
// If not, automatically decrease the staked amount.
if (dao.balanceOf(s.sender) < s.amount) {
s.amount = dao.balanceOf(s.sender);
}
if (s.inFavor) {
stakedFor += s.amount;
} else {
stakedAgainst += s.amount;
}
}
stakedFor += pool.props.minted / 2;
stakedAgainst += pool.props.minted / 2;
if (pool.props.minted % 2 != 0) {
stakedFor += 1;
}
// Special case for early evaluation if dao.totalSupply has been staked
require(
block.timestamp > pool.props.endTime ||
stakedFor + stakedAgainst == dao.totalSupply(),
"Pool end time has not yet arrived"
);
// Check that quorum is met
if (
pool.params.quorum[1] * (stakedFor + stakedAgainst) <=
dao.totalSupply() * pool.params.quorum[0]
) {
// TODO: Refund fee
// TODO: this could be made available for the sender to withdraw
// payable(pool.sender).transfer(pool.props.fee);
pool.props.resolved = true;
dao.emitValidationPoolResolved(poolIndex, false, false);
// Callback if requested
if (pool.callbackOnValidate) {
dao.onValidate(
pool.sender,
votePasses,
false,
stakedFor,
stakedAgainst,
pool.callbackData
);
}
return false;
}
// A tie is resolved in favor of the validation pool.
// This is especially important so that the DAO's first pool can pass,
// when no reputation has yet been minted.
votePasses =
stakedFor * pool.params.winRatio[1] >=
(stakedFor + stakedAgainst) * pool.params.winRatio[0];
pool.props.resolved = true;
pool.props.outcome = votePasses;
dao.emitValidationPoolResolved(poolIndex, votePasses, true);
// Value of losing stakes should be distributed among winners, in proportion to their stakes
// Only bindingPercent % should be redistributed
// Stake senders should get the remaining (100 - bindingPercent) % back
uint amountFromWinners = votePasses ? stakedFor : stakedAgainst;
uint totalRewards;
uint totalAllocated;
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
if (votePasses != s.inFavor) {
// Losing stake
uint amount = (s.amount * pool.params.bindingPercent) / 100;
if (pool.params.redistributeLosingStakes) {
dao.update(s.sender, address(dao), amount);
totalRewards += amount;
} else {
dao.burn(s.sender, amount);
}
}
}
if (votePasses) {
// If vote passes, reward the author as though they had staked the winning portion of the VP initial stake
// Here we assume a stakeForAuthor ratio of 0.5
// TODO: Make stakeForAuthor an adjustable parameter
totalRewards += pool.props.minted / 2;
// Include the losing portion of the VP initial stake
// Issue rewards to the winners
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
if (
pool.params.redistributeLosingStakes &&
votePasses == s.inFavor
) {
// Winning stake
uint reward = (((totalRewards * s.amount) /
amountFromWinners) * pool.params.bindingPercent) / 100;
totalAllocated += reward;
dao.update(address(dao), s.sender, reward);
}
}
// Due to rounding, there may be some excess REP. Award it to the author.
uint remainder = totalRewards - totalAllocated;
if (pool.props.minted % 2 != 0) {
// We staked the odd remainder in favor of the post, on behalf of the author.
remainder += 1;
}
// Transfer REP to the forum instead of to the author directly
propagateReputation(
pool.props.postId,
int(pool.props.minted / 2 + remainder),
false,
0
);
} else {
// If vote does not pass, divide the losing stake among the winners
totalRewards += pool.props.minted;
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
if (
pool.params.redistributeLosingStakes &&
votePasses == s.inFavor
) {
// Winning stake
uint reward = (((totalRewards * s.amount) /
(amountFromWinners - pool.props.minted / 2)) *
pool.params.bindingPercent) / 100;
totalAllocated += reward;
dao.update(address(dao), s.sender, reward);
}
}
}
// Distribute fee proportionately among all reputation holders
dao.distributeFeeAmongMembers{value: pool.props.fee}();
// Callback if requested
if (pool.callbackOnValidate) {
dao.onValidate(
pool.sender,
votePasses,
true,
stakedFor,
stakedAgainst,
pool.callbackData
);
}
}
function _handleReference(
string memory postId,
Reference memory ref,
int amount,
bool initialNegative,
uint depth
) internal returns (int outboundAmount) {
outboundAmount = (amount * ref.weightPPM) / 1000000;
if (bytes(ref.targetPostId).length == 0) {
// Incineration
require(
outboundAmount >= 0,
"Leeching from incinerator is forbidden"
);
dao.burn(address(dao), uint(outboundAmount));
return outboundAmount;
}
int balanceToOutbound = _edgeBalances[postId][ref.targetPostId];
if (initialNegative) {
if (outboundAmount < 0) {
outboundAmount = outboundAmount > -balanceToOutbound
? outboundAmount
: -balanceToOutbound;
} else {
outboundAmount = outboundAmount < -balanceToOutbound
? outboundAmount
: -balanceToOutbound;
}
}
int refund = propagateReputation(
ref.targetPostId,
outboundAmount,
initialNegative || (depth == 0 && ref.weightPPM < 0),
depth + 1
);
outboundAmount -= refund;
_edgeBalances[postId][ref.targetPostId] += outboundAmount;
}
function _distributeAmongAuthors(
Author[] memory authors,
int amount
) internal returns (int refund) {
int allocated;
for (uint i = 0; i < authors.length; i++) {
dao.registerMember(authors[i].authorAddress);
}
for (uint i = 0; i < authors.length; i++) {
Author memory author = authors[i];
int share;
if (i < authors.length - 1) {
share = (amount * int(author.weightPPM)) / 1000000;
allocated += share;
} else {
// For the last author, allocate the remainder.
share = amount - allocated;
}
if (share > 0) {
dao.update(address(dao), author.authorAddress, uint(share));
} else if (dao.balanceOf(author.authorAddress) < uint(-share)) {
// Author has already lost some REP gained from this post.
// That means other DAO members have earned it for policing.
// We need to refund the difference here to ensure accurate bookkeeping
uint authorBalance = dao.balanceOf(author.authorAddress);
refund += share + int(authorBalance);
dao.update(
author.authorAddress,
address(dao),
dao.balanceOf(author.authorAddress)
);
} else {
dao.update(author.authorAddress, address(dao), uint(-share));
}
}
}
function propagateReputation(
string memory postId,
int amount,
bool initialNegative,
uint depth
) internal returns (int refundToInbound) {
if (depth >= depthLimit) {
return amount;
}
Reference[] memory references;
Author[] memory authors;
address sender;
(authors, references, sender) = forum.getPost(postId);
if (authors.length == 0) {
// We most likely got here via a reference to a post that hasn't been added yet.
// We support this scenario so that a reference graph can be imported one post at a time.
return amount;
}
// Propagate negative references first
for (uint i = 0; i < references.length; i++) {
if (references[i].weightPPM < 0) {
amount -= _handleReference(
postId,
references[i],
amount,
initialNegative,
depth
);
}
}
// Now propagate positive references
for (uint i = 0; i < references.length; i++) {
if (references[i].weightPPM > 0) {
amount -= _handleReference(
postId,
references[i],
amount,
initialNegative,
depth
);
}
}
refundToInbound = _distributeAmongAuthors(authors, amount);
}
}
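Both threshold checks in `evaluateOutcome` compare fractions by cross-multiplying integers, avoiding fixed-point division on chain. A Python sketch of the two checks in isolation (a simplification: it ignores the minted initial stake that the contract adds to both sides):

```python
# Integer cross-multiplication form of evaluateOutcome's quorum and win-ratio checks.
def pool_outcome(staked_for, staked_against, total_supply, quorum, win_ratio):
    participation = staked_for + staked_against
    # Quorum: participation / total_supply must exceed quorum[0] / quorum[1]
    quorum_met = quorum[1] * participation > total_supply * quorum[0]
    if not quorum_met:
        return False, False  # (quorum_met, vote_passes)
    # Ties resolve in favor: staked_for / participation >= win_ratio[0] / win_ratio[1]
    vote_passes = staked_for * win_ratio[1] >= participation * win_ratio[0]
    return True, vote_passes
```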

View File

@ -3,21 +3,381 @@ pragma solidity ^0.8.24;
import {ERC20} from "@openzeppelin/contracts/token/ERC20/ERC20.sol";
import "./Reputation.sol";
import "./ValidationPools.sol";
import "./Forum.sol";
import "./Bench.sol";
import "./LightweightBench.sol";
import "../GlobalForum.sol";
import "../interfaces/IAcceptAvailability.sol";
import "../interfaces/IOnValidate.sol";
contract DAO {
Reputation rep;
GlobalForum forum;
Bench bench;
LightweightBench lightweightBench;
mapping(uint => address) public members;
uint public memberCount;
mapping(address => bool) public isMember;
event PostAdded(string id);
event ValidationPoolInitiated(uint poolIndex);
event ValidationPoolResolved(
uint poolIndex,
bool votePasses,
bool quorumMet
);
event LWValidationPoolInitiated(uint poolIndex);
event LWValidationPoolResolved(
uint poolIndex,
bool votePasses,
bool quorumMet
);
event LWResultProposed(
uint poolIndex,
uint proposedResultIndex,
string proposedResultHash
);
constructor(
Reputation reputation_,
Bench bench_,
LightweightBench lightweightBench_,
GlobalForum forum_
) {
rep = reputation_;
bench = bench_;
lightweightBench = lightweightBench_;
forum = forum_;
rep.registerDAO(this);
bench.registerDAO(this, forum);
lightweightBench.registerDAO(this);
}
function emitPostAdded(string calldata id) public {
emit PostAdded(id);
}
function emitValidationPoolInitiated(uint poolIndex) public {
emit ValidationPoolInitiated(poolIndex);
}
function emitValidationPoolResolved(
uint poolIndex,
bool votePasses,
bool quorumMet
) public {
emit ValidationPoolResolved(poolIndex, votePasses, quorumMet);
}
function emitLWValidationPoolInitiated(uint poolIndex) public {
emit LWValidationPoolInitiated(poolIndex);
}
function emitLWResultProposed(
uint poolIndex,
uint proposedResultIndex,
string calldata proposedResultHash
) public {
emit LWResultProposed(
poolIndex,
proposedResultIndex,
proposedResultHash
);
}
function update(address from, address to, uint256 value) public {
require(
msg.sender == address(lightweightBench) ||
msg.sender == address(bench),
"Only DAO core contracts may call update"
);
rep.update(from, to, value);
}
function mint(address account, uint256 value) public {
require(
msg.sender == address(lightweightBench) ||
msg.sender == address(bench),
"Only DAO core contracts may call mint"
);
rep.mint(account, value);
}
function burn(address account, uint256 value) public {
require(
msg.sender == address(lightweightBench) ||
msg.sender == address(bench),
"Only DAO core contracts may call burn"
);
rep.burn(account, value);
}
function registerMember(address account) public {
require(
msg.sender == address(lightweightBench) ||
msg.sender == address(bench),
"Only DAO core contracts may call registerMember"
);
if (!isMember[account]) {
members[memberCount++] = account;
isMember[account] = true;
}
}
function balanceOf(address account) public view returns (uint256) {
return rep.balanceOf(account);
}
function totalSupply() public view returns (uint256) {
return rep.totalSupply();
}
function allowance(
address owner,
address spender
) public view returns (uint256) {
return rep.allowance(owner, spender);
}
function forwardAllowance(
address owner,
address to,
uint256 amount
) public {
rep.spendAllowance(owner, msg.sender, amount);
rep.approve(owner, to, rep.allowance(owner, to) + amount);
}
contract DAO is Reputation, Forum, ValidationPools {
/// Authorize a contract to transfer REP, and call that contract's acceptAvailability method
function stakeAvailability(
address to,
uint256 value,
uint duration
) external returns (bool) {
uint refund = IAcceptAvailability(to).acceptAvailability(
msg.sender,
value,
duration
);
rep.approve(
msg.sender,
to,
rep.allowance(msg.sender, to) + value - refund
);
return true;
}
function distributeFeeAmongMembers() public payable {
uint allocated;
for (uint i = 0; i < memberCount; i++) {
address member = members[i];
uint share;
if (i < memberCount - 1) {
share = (msg.value * balanceOf(member)) / totalSupply();
allocated += share;
} else {
// Due to rounding, give the remainder to the last member
share = msg.value - allocated;
}
// TODO: For efficiency this could be modified to hold the funds for recipients to withdraw
payable(member).transfer(share);
}
}
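The loop above is a floor-division split in which the last member absorbs the rounding remainder. The same arithmetic can be sketched off-chain (plain JavaScript with BigInt, hypothetical balances; not part of the contracts):

```javascript
// Mirrors distributeFeeAmongMembers: each member receives
// floor(fee * balance / totalSupply); the last member gets whatever is
// left, so the shares always sum exactly to the fee.
function splitFee(fee, balances) {
  const totalSupply = balances.reduce((a, b) => a + b, 0n);
  let allocated = 0n;
  return balances.map((balance, i) => {
    if (i < balances.length - 1) {
      // BigInt division truncates, like Solidity integer division
      const share = (fee * balance) / totalSupply;
      allocated += share;
      return share;
    }
    return fee - allocated; // remainder goes to the last member
  });
}
```

With balances 3, 3, 4 and a fee of 100 wei this yields 30, 30, 40.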
function getValidationPool(
uint poolIndex
)
public
view
returns (
uint id,
address sender,
uint stakeCount,
ValidationPoolParams memory params,
ValidationPoolProps memory props,
bool callbackOnValidate,
bytes memory callbackData
)
{
return bench.validationPools(poolIndex);
}
function getValidationPoolCount() public view returns (uint) {
return bench.validationPoolCount();
}
function initiateValidationPool(
string calldata postId,
uint duration,
uint[2] calldata quorum, // [Numerator, Denominator]
uint[2] calldata winRatio, // [Numerator, Denominator]
uint bindingPercent,
bool redistributeLosingStakes,
bool callbackOnValidate,
bytes calldata callbackData
) external payable returns (uint) {
return
bench.initiateValidationPool{value: msg.value}(
msg.sender,
postId,
duration,
quorum,
winRatio,
bindingPercent,
redistributeLosingStakes,
callbackOnValidate,
callbackData
);
}
function stakeOnValidationPool(
uint poolIndex,
uint256 amount,
bool inFavor
) public {
require(
balanceOf(msg.sender) >= amount,
"Insufficient REP balance to cover stake"
);
bench.stakeOnValidationPool(poolIndex, msg.sender, amount, inFavor);
}
/// Accept reputation stakes toward a validation pool
function delegatedStakeOnValidationPool(
uint poolIndex,
address owner,
uint256 amount,
bool inFavor
) public {
if (allowance(owner, msg.sender) < amount) {
amount = allowance(owner, msg.sender);
}
rep.spendAllowance(owner, msg.sender, amount);
bench.stakeOnValidationPool(poolIndex, owner, amount, inFavor);
}
function evaluateOutcome(uint poolIndex) public returns (bool) {
return bench.evaluateOutcome(poolIndex);
}
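evaluateOutcome defers to the Bench, where the quorum and win-ratio tests are cross-multiplied integer comparisons on [numerator, denominator] pairs, as seen later in this diff. A sketch of the two predicates (plain JavaScript, illustrative only):

```javascript
// Quorum: total stake must exceed quorum as a fraction of total supply;
// the contract treats exact equality as quorum NOT met.
function quorumMet(stakedFor, stakedAgainst, totalSupply, quorum) {
  return quorum[1] * (stakedFor + stakedAgainst) > totalSupply * quorum[0];
}

// Win condition: stakedFor / (stakedFor + stakedAgainst) >= winRatio.
// Using >= means a tie resolves in favor of the pool, which lets the
// DAO's first pool pass before any reputation has been minted.
function votePasses(stakedFor, stakedAgainst, winRatio) {
  return stakedFor * winRatio[1] >= (stakedFor + stakedAgainst) * winRatio[0];
}
```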
function getLWValidationPool(
uint poolIndex
)
public
view
returns (
uint id,
address sender,
uint stakeCount,
LWVPoolParams memory params,
LWVPoolProps memory props,
bool callbackOnValidate,
bytes memory callbackData
)
{
return lightweightBench.validationPools(poolIndex);
}
function getLWValidationPoolCount() public view returns (uint) {
return lightweightBench.validationPoolCount();
}
function initiateLWValidationPool(
string calldata postId,
uint duration,
uint[2] calldata quorum, // [Numerator, Denominator]
uint[2] calldata winRatio, // [Numerator, Denominator]
uint bindingPercent,
bool redistributeLosingStakes,
bool callbackOnValidate,
bytes calldata callbackData
) external payable returns (uint) {
return
lightweightBench.initiateValidationPool{value: msg.value}(
msg.sender,
postId,
duration,
quorum,
winRatio,
bindingPercent,
redistributeLosingStakes,
callbackOnValidate,
callbackData
);
}
function proposeLWResult(
uint poolIndex,
string calldata resultHash,
Transfer[] calldata transfers
) external {
lightweightBench.proposeResult(poolIndex, resultHash, transfers);
}
function stakeOnLWValidationPool(
uint poolIndex,
string calldata resultHash,
uint256 amount,
bool inFavor
) public {
require(
balanceOf(msg.sender) >= amount,
"Insufficient REP balance to cover stake"
);
lightweightBench.stakeOnValidationPool(
poolIndex,
resultHash,
msg.sender,
amount,
inFavor
);
}
/// Accept reputation stakes toward a validation pool
function delegatedStakeOnLWValidationPool(
uint poolIndex,
string calldata resultHash,
address owner,
uint256 amount,
bool inFavor
) public {
if (allowance(owner, msg.sender) < amount) {
amount = allowance(owner, msg.sender);
}
rep.spendAllowance(owner, msg.sender, amount);
lightweightBench.stakeOnValidationPool(
poolIndex,
resultHash,
owner,
amount,
inFavor
);
}
function evaluateLWOutcome(uint poolIndex) public returns (bool) {
return lightweightBench.evaluateOutcome(poolIndex);
}
function onValidate(
address target,
bool votePasses,
bool quorumMet,
uint stakedFor,
uint stakedAgainst,
bytes calldata callbackData
) public {
require(
msg.sender == address(lightweightBench) ||
msg.sender == address(bench),
"Only DAO core contracts may call onValidate"
);
IOnValidate(target).onValidate(
votePasses,
quorumMet,
stakedFor,
stakedAgainst,
callbackData
);
}
}
/// Convenience contract to extend for other contracts that will be initialized to


@@ -1,240 +0,0 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./Reputation.sol";
struct Citation {
int weightPPM;
string targetPostId;
}
struct Author {
uint weightPPM;
address authorAddress;
}
struct Post {
string id;
address sender;
Author[] authors;
Citation[] citations;
uint reputation;
// TODO: timestamp
}
contract Forum is Reputation {
mapping(string => Post) public posts;
string[] public postIds;
uint public postCount;
mapping(string => mapping(string => int)) _edgeBalances;
event PostAdded(string id);
// Forum parameters
// TODO: Make depth limit configurable; take as param in _onValidatePost callback
uint depthLimit = 3;
function addPost(
Author[] calldata authors,
string calldata contentId,
Citation[] calldata citations
) external {
require(authors.length > 0, "Post must include at least one author");
postCount++;
postIds.push(contentId);
Post storage post = posts[contentId];
require(
post.authors.length == 0,
"A post with this contentId already exists"
);
post.sender = msg.sender;
post.id = contentId;
uint authorTotalWeightPercent;
for (uint i = 0; i < authors.length; i++) {
authorTotalWeightPercent += authors[i].weightPPM;
post.authors.push(authors[i]);
}
require(
authorTotalWeightPercent == 1000000,
"Author weights must sum to 1000000"
);
for (uint i = 0; i < citations.length; i++) {
post.citations.push(citations[i]);
}
int totalCitationWeightPos;
int totalCitationWeightNeg;
for (uint i = 0; i < post.citations.length; i++) {
int weight = post.citations[i].weightPPM;
require(
weight >= -1000000,
"Each citation weight must be >= -1000000"
);
require(
weight <= 1000000,
"Each citation weight must be <= 1000000"
);
if (weight > 0) totalCitationWeightPos += weight;
else totalCitationWeightNeg += weight;
}
require(
totalCitationWeightPos <= 1000000,
"Sum of positive citations must be <= 1000000"
);
require(
totalCitationWeightNeg >= -1000000,
"Sum of negative citations must be >= -1000000"
);
emit PostAdded(contentId);
}
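All weights in addPost are parts-per-million integers. The invariants it enforces can be restated off-chain (plain JavaScript, illustrative only):

```javascript
// Mirrors addPost's requires: author weights sum to exactly 1,000,000 PPM;
// each citation weight is within [-1000000, 1000000]; and the positive and
// negative citation totals are each bounded by one (in PPM).
function postWeightsValid(authors, citations) {
  const authorTotal = authors.reduce((sum, a) => sum + a.weightPPM, 0);
  if (authorTotal !== 1_000_000) return false;
  let pos = 0;
  let neg = 0;
  for (const c of citations) {
    if (c.weightPPM < -1_000_000 || c.weightPPM > 1_000_000) return false;
    if (c.weightPPM > 0) pos += c.weightPPM;
    else neg += c.weightPPM;
  }
  return pos <= 1_000_000 && neg >= -1_000_000;
}
```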
function getPostAuthors(
string calldata postId
) external view returns (Author[] memory) {
Post storage post = posts[postId];
return post.authors;
}
function _handleCitation(
string memory postId,
Citation memory citation,
int amount,
bool initialNegative,
uint depth
) internal returns (int outboundAmount) {
outboundAmount = (amount * citation.weightPPM) / 1000000;
if (bytes(citation.targetPostId).length == 0) {
// Incineration
require(
outboundAmount >= 0,
"Leaching from incinerator is forbidden"
);
_burn(address(this), uint(outboundAmount));
return outboundAmount;
}
int balanceToOutbound = _edgeBalances[postId][citation.targetPostId];
if (initialNegative) {
if (outboundAmount < 0) {
outboundAmount = outboundAmount > -balanceToOutbound
? outboundAmount
: -balanceToOutbound;
} else {
outboundAmount = outboundAmount < -balanceToOutbound
? outboundAmount
: -balanceToOutbound;
}
}
int refund = _propagateReputation(
citation.targetPostId,
outboundAmount,
initialNegative || (depth == 0 && citation.weightPPM < 0),
depth + 1
);
outboundAmount -= refund;
_edgeBalances[postId][citation.targetPostId] += outboundAmount;
}
function _distributeAmongAuthors(
Post memory post,
int amount
) internal returns (int refund) {
int allocated;
for (uint i = 0; i < post.authors.length; i++) {
address authorAddress = post.authors[i].authorAddress;
if (!isMember[authorAddress]) {
members[memberCount++] = authorAddress;
isMember[authorAddress] = true;
}
}
for (uint i = 0; i < post.authors.length; i++) {
Author memory author = post.authors[i];
int share;
if (i < post.authors.length - 1) {
share = (amount * int(author.weightPPM)) / 1000000;
allocated += share;
} else {
// For the last author, allocate the remainder.
share = amount - allocated;
}
if (share > 0) {
_update(address(this), author.authorAddress, uint(share));
if (!isMember[author.authorAddress]) {
members[memberCount++] = author.authorAddress;
isMember[author.authorAddress] = true;
}
} else if (balanceOf(author.authorAddress) < uint(-share)) {
// Author has already lost some REP gained from this post.
// That means other DAO members have earned it for policing.
// We need to refund the difference here to ensure accurate bookkeeping
refund += share + int(balanceOf(author.authorAddress));
_update(
author.authorAddress,
address(this),
balanceOf(author.authorAddress)
);
} else {
_update(author.authorAddress, address(this), uint(-share));
}
}
}
function _propagateReputation(
string memory postId,
int amount,
bool initialNegative,
uint depth
) internal returns (int refundToInbound) {
if (depth >= depthLimit) {
return amount;
}
Post storage post = posts[postId];
if (post.authors.length == 0) {
// We most likely got here via a citation to a post that hasn't been added yet.
// We support this scenario so that a citation graph can be imported one post at a time.
return amount;
}
// Propagate negative citations first
for (uint i = 0; i < post.citations.length; i++) {
if (post.citations[i].weightPPM < 0) {
amount -= _handleCitation(
postId,
post.citations[i],
amount,
initialNegative,
depth
);
}
}
// Now propagate positive citations
for (uint i = 0; i < post.citations.length; i++) {
if (post.citations[i].weightPPM > 0) {
amount -= _handleCitation(
postId,
post.citations[i],
amount,
initialNegative,
depth
);
}
}
if (amount > 0) {
_distributeAmongAuthors(post, amount);
post.reputation += uint(amount);
} else {
if (int(post.reputation) + amount >= 0) {
// Reduce the reputation of each author proportionately;
// If any author has insufficient reputation, refund the difference.
refundToInbound = _distributeAmongAuthors(post, amount);
post.reputation -= uint(-amount);
} else {
// If we applied the full amount, the post's reputation would decrease below zero.
refundToInbound = int(post.reputation) + amount;
refundToInbound += _distributeAmongAuthors(
post,
-int(post.reputation)
);
post.reputation = 0;
}
}
}
}
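_handleCitation forwards a PPM-weighted slice of the incoming amount along each citation edge, and _propagateReputation stops recursing at depthLimit. Ignoring the edge-balance clamping and author refunds, the core arithmetic looks like this (plain JavaScript, illustrative only):

```javascript
// Each citation forwards amount * weightPPM / 1e6, truncated toward zero
// as in Solidity integer division; nothing propagates at or beyond the
// depth limit (3 in the contract above).
const DEPTH_LIMIT = 3;
function citationOutbound(amount, citations, depth = 0) {
  if (depth >= DEPTH_LIMIT) return citations.map(() => 0);
  return citations.map((c) => Math.trunc((amount * c.weightPPM) / 1_000_000));
}
```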


@@ -1,18 +1,9 @@
// SPDX-License-Identifier: Unlicense
pragma solidity ^0.8.24;
import "./Reputation.sol";
import "./Forum.sol";
import "../interfaces/IOnValidate.sol";
import "./DAO.sol";
struct ValidationPoolStake {
uint id;
bool inFavor;
uint amount;
address sender;
}
struct ValidationPoolParams {
struct LWVPoolParams {
uint duration;
uint[2] quorum; // [ Numerator, Denominator ]
uint[2] winRatio; // [ Numerator, Denominator ]
@@ -20,78 +11,67 @@ struct ValidationPoolParams {
bool redistributeLosingStakes;
}
struct ValidationPool {
uint id;
struct LWVPoolProps {
string postId;
address sender;
uint minted;
mapping(uint => ValidationPoolStake) stakes;
uint stakeCount;
ValidationPoolParams params;
uint fee;
uint minted;
uint endTime;
bool resolved;
bool outcome;
bool callbackOnValidate;
bytes callbackData;
}
contract ValidationPools is Reputation, Forum {
mapping(uint => ValidationPool) public validationPools;
struct Transfer {
address from;
address to;
uint amount;
}
contract LightweightBench {
struct ProposedResult {
Transfer[] transfers;
uint stakedFor;
}
struct Stake {
uint id;
bool inFavor;
uint amount;
address sender;
string resultHash;
}
struct Pool {
uint id;
address sender;
mapping(string => ProposedResult) proposedResults;
string[] proposedResultHashes;
mapping(uint => Stake) stakes;
uint stakeCount;
LWVPoolParams params;
LWVPoolProps props;
bool callbackOnValidate;
bytes callbackData;
}
mapping(uint => Pool) public validationPools;
uint public validationPoolCount;
DAO dao;
uint constant minDuration = 1; // 1 second
uint constant maxDuration = 365_000_000 days; // 1 million years
uint[2] minQuorum = [1, 10];
event ValidationPoolInitiated(uint poolIndex);
event ValidationPoolResolved(
uint poolIndex,
bool votePasses,
bool quorumMet
);
/// Internal function to register a stake for/against a validation pool
function _stakeOnValidationPool(
ValidationPool storage pool,
address sender,
uint256 amount,
bool inFavor
) internal {
require(block.timestamp <= pool.endTime, "Pool end time has passed");
// We don't call _update here; we defer that until evaluateOutcome.
uint stakeIndex = pool.stakeCount++;
ValidationPoolStake storage s = pool.stakes[stakeIndex];
s.sender = sender;
s.inFavor = inFavor;
s.amount = amount;
s.id = stakeIndex;
}
/// Accept reputation stakes toward a validation pool
function stakeOnValidationPool(
uint poolIndex,
uint256 amount,
bool inFavor
) public {
ValidationPool storage pool = validationPools[poolIndex];
_stakeOnValidationPool(pool, msg.sender, amount, inFavor);
}
/// Accept reputation stakes toward a validation pool
function delegatedStakeOnValidationPool(
uint poolIndex,
address owner,
uint256 amount,
bool inFavor
) public {
ValidationPool storage pool = validationPools[poolIndex];
_spendAllowance(owner, msg.sender, amount);
_stakeOnValidationPool(pool, owner, amount, inFavor);
function registerDAO(DAO dao_) external {
require(
address(dao) == address(0),
"A DAO has already been registered"
);
dao = dao_;
}
/// Accept fee to initiate a validation pool
function initiateValidationPool(
address sender,
string calldata postId,
uint duration,
uint[2] calldata quorum, // [Numerator, Denominator]
@@ -101,7 +81,10 @@ contract ValidationPools is Reputation, Forum {
bool callbackOnValidate,
bytes calldata callbackData
) external payable returns (uint poolIndex) {
require(msg.value > 0, "Fee is required to initiate validation pool");
require(
msg.sender == address(dao),
"Only DAO contract may call initiateValidationPool"
);
require(duration >= minDuration, "Duration is too short");
require(duration <= maxDuration, "Duration is too long");
require(
@@ -111,69 +94,172 @@ contract ValidationPools is Reputation, Forum {
require(quorum[0] <= quorum[1], "Quorum is greater than one");
require(winRatio[0] <= winRatio[1], "Win ratio is greater than one");
require(bindingPercent <= 100, "Binding percent must be <= 100");
Post storage post = posts[postId];
require(post.authors.length != 0, "Target post not found");
poolIndex = validationPoolCount++;
ValidationPool storage pool = validationPools[poolIndex];
pool.sender = msg.sender;
pool.postId = postId;
pool.fee = msg.value;
Pool storage pool = validationPools[poolIndex];
pool.id = poolIndex;
pool.sender = sender;
pool.props.postId = postId;
pool.props.fee = msg.value;
pool.props.endTime = block.timestamp + duration;
pool.params.quorum = quorum;
pool.params.winRatio = winRatio;
pool.params.bindingPercent = bindingPercent;
pool.params.redistributeLosingStakes = redistributeLosingStakes;
pool.params.duration = duration;
pool.endTime = block.timestamp + duration;
pool.id = poolIndex;
pool.callbackOnValidate = callbackOnValidate;
pool.callbackData = callbackData;
// We use our privilege as the DAO contract to mint reputation in proportion with the fee.
// Here we assume a minting ratio of 1
// TODO: Make minting ratio an adjustable parameter
_mint(address(this), msg.value);
pool.minted = msg.value;
emit ValidationPoolInitiated(poolIndex);
dao.mint(address(dao), pool.props.fee);
pool.props.minted = msg.value;
dao.emitLWValidationPoolInitiated(poolIndex);
}
function proposeResult(
uint poolIndex,
string calldata resultHash,
Transfer[] calldata transfers
) external {
require(
transfers.length > 0,
"The proposed result contains no transfers"
);
Pool storage pool = validationPools[poolIndex];
require(
block.timestamp <= pool.props.endTime,
"Pool end time has passed"
);
ProposedResult storage proposedResult = pool.proposedResults[
resultHash
];
require(
proposedResult.transfers.length == 0,
"This result hash has already been proposed"
);
uint resultIndex = pool.proposedResultHashes.length;
pool.proposedResultHashes.push(resultHash);
for (uint i = 0; i < transfers.length; i++) {
proposedResult.transfers.push(transfers[i]);
}
dao.emitLWResultProposed(poolIndex, resultIndex, resultHash);
}
/// Register a stake for/against a validation pool
function stakeOnValidationPool(
uint poolIndex,
string calldata resultHash,
address sender,
uint256 amount,
bool inFavor
) external {
require(
msg.sender == address(dao),
"Only DAO contract may call stakeOnValidationPool"
);
Pool storage pool = validationPools[poolIndex];
require(
block.timestamp <= pool.props.endTime,
"Pool end time has passed"
);
if (inFavor) {
ProposedResult storage proposedResult = pool.proposedResults[
resultHash
];
require(
proposedResult.transfers.length > 0,
"This result hash has not been proposed"
);
}
// We don't call _update here; we defer that until evaluateOutcome.
uint stakeIndex = pool.stakeCount++;
Stake storage s = pool.stakes[stakeIndex];
s.sender = sender;
s.inFavor = inFavor;
s.amount = amount;
s.id = stakeIndex;
s.resultHash = resultHash;
}
/// Evaluate outcome of a validation pool
function evaluateOutcome(uint poolIndex) public returns (bool votePasses) {
ValidationPool storage pool = validationPools[poolIndex];
require(pool.resolved == false, "Pool is already resolved");
require(
msg.sender == address(dao),
"Only DAO contract may call evaluateOutcome"
);
Pool storage pool = validationPools[poolIndex];
require(pool.props.resolved == false, "Pool is already resolved");
uint stakedFor;
uint stakedAgainst;
ValidationPoolStake storage s;
Stake storage s;
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
// Make sure the sender still has the required balance.
// If not, automatically decrease the staked amount.
if (dao.balanceOf(s.sender) < s.amount) {
s.amount = dao.balanceOf(s.sender);
}
if (s.inFavor) {
stakedFor += s.amount;
ProposedResult storage proposedResult = pool.proposedResults[
s.resultHash
];
proposedResult.stakedFor += s.amount;
} else {
stakedAgainst += s.amount;
}
}
stakedFor += pool.minted / 2;
stakedAgainst += pool.minted / 2;
if (pool.minted % 2 != 0) {
// Determine the winning result hash
uint[] memory stakedForResult = new uint[](
pool.proposedResultHashes.length
);
uint winningResult;
for (uint i = 0; i < pool.proposedResultHashes.length; i++) {
string storage proposedResultHash = pool.proposedResultHashes[i];
ProposedResult storage proposedResult = pool.proposedResults[
proposedResultHash
];
stakedForResult[i] += proposedResult.stakedFor;
if (stakedForResult[i] > stakedForResult[winningResult]) {
winningResult = i;
}
}
// Only count stakes for the winning hash among the total staked in favor of the pool
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
if (
s.inFavor &&
keccak256(bytes(s.resultHash)) ==
keccak256(bytes(pool.proposedResultHashes[winningResult]))
) {
stakedFor += s.amount;
}
}
stakedFor += pool.props.minted / 2;
stakedAgainst += pool.props.minted / 2;
if (pool.props.minted % 2 != 0) {
stakedFor += 1;
}
// Special case for early evaluation if dao.totalSupply has been staked
require(
block.timestamp > pool.endTime ||
stakedFor + stakedAgainst == totalSupply(),
block.timestamp > pool.props.endTime ||
stakedFor + stakedAgainst == dao.totalSupply(),
"Pool end time has not yet arrived"
);
// Check that quorum is met
if (
pool.params.quorum[1] * (stakedFor + stakedAgainst) <=
totalSupply() * pool.params.quorum[0]
dao.totalSupply() * pool.params.quorum[0]
) {
// TODO: Refund fee
// TODO: this could be made available for the sender to withdraw
// payable(pool.sender).transfer(pool.fee);
pool.resolved = true;
emit ValidationPoolResolved(poolIndex, false, false);
// payable(pool.sender).transfer(pool.props.fee);
pool.props.resolved = true;
dao.emitValidationPoolResolved(poolIndex, false, false);
// Callback if requested
if (pool.callbackOnValidate) {
IOnValidate(pool.sender).onValidate(
dao.onValidate(
pool.sender,
votePasses,
false,
stakedFor,
@@ -184,15 +270,16 @@ contract ValidationPools is Reputation, Forum {
return false;
}
// A tie is resolved in favor of the validation pool.
// This is especially important so that the DAO's first pool can pass,
// when no reputation has yet been minted.
votePasses =
stakedFor * pool.params.winRatio[1] >=
(stakedFor + stakedAgainst) * pool.params.winRatio[0];
pool.resolved = true;
pool.outcome = votePasses;
emit ValidationPoolResolved(poolIndex, votePasses, true);
pool.props.resolved = true;
pool.props.outcome = votePasses;
dao.emitValidationPoolResolved(poolIndex, votePasses, true);
// Value of losing stakes should be distributed among winners, in proportion to their stakes
// Only bindingPercent % should be redistributed
@@ -206,10 +293,10 @@ contract ValidationPools is Reputation, Forum {
// Losing stake
uint amount = (s.amount * pool.params.bindingPercent) / 100;
if (pool.params.redistributeLosingStakes) {
_update(s.sender, address(this), amount);
dao.update(s.sender, address(dao), amount);
totalRewards += amount;
} else {
_burn(s.sender, amount);
dao.burn(s.sender, amount);
}
}
}
@@ -218,7 +305,7 @@ contract ValidationPools is Reputation, Forum {
// If vote passes, reward the author as though they had staked the winning portion of the VP initial stake
// Here we assume a stakeForAuthor ratio of 0.5
// TODO: Make stakeForAuthor an adjustable parameter
totalRewards += pool.minted / 2;
totalRewards += pool.props.minted / 2;
// Include the losing portion of the VP initial stake
// Issue rewards to the winners
for (uint i = 0; i < pool.stakeCount; i++) {
@@ -231,26 +318,30 @@ contract ValidationPools is Reputation, Forum {
uint reward = (((totalRewards * s.amount) /
amountFromWinners) * pool.params.bindingPercent) / 100;
totalAllocated += reward;
_update(address(this), s.sender, reward);
dao.update(address(dao), s.sender, reward);
}
}
// Due to rounding, there may be some excess REP. Award it to the author.
uint remainder = totalRewards - totalAllocated;
if (pool.minted % 2 != 0) {
if (pool.props.minted % 2 != 0) {
// We staked the odd remainder in favor of the post, on behalf of the author.
remainder += 1;
}
// Transfer REP to the forum instead of to the author directly
_propagateReputation(
pool.postId,
int(pool.minted / 2 + remainder),
false,
0
);
// Execute the transfers from the winning proposed result
ProposedResult storage result = pool.proposedResults[
pool.proposedResultHashes[winningResult]
];
for (uint i = 0; i < result.transfers.length; i++) {
dao.update(
result.transfers[i].from,
result.transfers[i].to,
result.transfers[i].amount
);
}
} else {
// If vote does not pass, divide the losing stake among the winners
totalRewards += pool.minted;
totalRewards += pool.props.minted;
for (uint i = 0; i < pool.stakeCount; i++) {
s = pool.stakes[i];
if (
@@ -259,25 +350,21 @@ contract ValidationPools is Reputation, Forum {
) {
// Winning stake
uint reward = (((totalRewards * s.amount) /
(amountFromWinners - pool.minted / 2)) *
(amountFromWinners - pool.props.minted / 2)) *
pool.params.bindingPercent) / 100;
totalAllocated += reward;
_update(address(this), s.sender, reward);
dao.update(address(dao), s.sender, reward);
}
}
}
// Distribute fee proportionately among all reputation holders
for (uint i = 0; i < memberCount; i++) {
address member = members[i];
uint share = (pool.fee * balanceOf(member)) / totalSupply();
// TODO: For efficiency this could be modified to hold the funds for recipients to withdraw
payable(member).transfer(share);
}
dao.distributeFeeAmongMembers{value: pool.props.fee}();
// Callback if requested
if (pool.callbackOnValidate) {
IOnValidate(pool.sender).onValidate(
dao.onValidate(
pool.sender,
votePasses,
true,
stakedFor,


@@ -2,11 +2,36 @@
pragma solidity ^0.8.24;
import {ERC20} from "@openzeppelin/contracts/token/ERC20/ERC20.sol";
import "./DAO.sol";
contract Reputation is ERC20("Reputation", "REP") {
mapping(uint => address) public members;
uint public memberCount;
mapping(address => bool) public isMember;
DAO dao;
function registerDAO(DAO dao_) external {
require(
address(dao) == address(0),
"A DAO has already been registered"
);
dao = dao_;
}
function update(address from, address to, uint256 value) public {
require(
msg.sender == address(dao),
"Only DAO contract may call update"
);
_update(from, to, value);
}
function mint(address account, uint256 value) public {
require(msg.sender == address(dao), "Only DAO contract may call mint");
_mint(account, value);
}
function burn(address account, uint256 value) public {
require(msg.sender == address(dao), "Only DAO contract may call burn");
_burn(account, value);
}
function decimals() public pure override returns (uint8) {
return 9;
@@ -24,4 +49,24 @@ contract Reputation is ERC20("Reputation", "REP") {
) public pure override returns (bool) {
revert("REP transfer is not allowed");
}
function spendAllowance(
address owner,
address spender,
uint256 value
) public {
require(
msg.sender == address(dao),
"Only DAO contract may call spendAllowance"
);
_spendAllowance(owner, spender, value);
}
function approve(address owner, address spender, uint256 value) public {
require(
msg.sender == address(dao),
"Only DAO contract may call approve"
);
_approve(owner, spender, value);
}
}


@@ -6,5 +6,5 @@ interface IAcceptAvailability {
address from,
uint256 value,
uint duration
) external;
) external returns (uint refund);
}


@@ -8,5 +8,5 @@ interface IOnValidate {
uint stakedFor,
uint stakedAgainst,
bytes calldata callbackData
) external returns (uint);
) external;
}


@@ -1,10 +1,8 @@
const { ethers } = require('hardhat');
const { execSync } = require('child_process');
const { getContractAddressByNetworkName } = require('./contract-config');
const readFromApi = require('./util/read-from-api');
const network = process.env.HARDHAT_NETWORK;
let currentVersionProposalId;
let dao;
let work1;
@@ -16,20 +14,6 @@ let posts;
let proposalsContract;
let proposals;
const getCurrentVersion = () => {
const currentCommit = execSync('git rev-parse HEAD');
return currentCommit.toString();
};
const fetchCurrentVersionProposal = async () => {
// const p = await proposalsContract.
};
const getLatestVersion = () => {
const latestVersion = 'TBD';
return latestVersion;
};
const fetchReputation = async () => {
reputation = await dao.balanceOf(account);
console.log(`reputation: ${reputation}`);
@@ -37,14 +21,14 @@ const fetchReputation = async () => {
const fetchPost = async (postIndex) => {
const {
id, sender, author, contentId,
id, sender, author, postId,
} = await dao.posts(postIndex);
const { content, embeddedData } = await readFromApi(contentId);
const { content, embeddedData } = await readFromApi(postId);
const post = {
id,
sender,
author,
contentId,
postId,
content,
embeddedData,
};
@@ -55,7 +39,7 @@ const fetchPost = async (postIndex) => {
const fetchValidationPool = async (poolIndex) => {
const {
id, postIndex, sender, stakeCount, fee, duration, endTime, resolved, outcome,
} = await dao.validationPools(poolIndex);
} = await dao.getValidationPool(poolIndex);
const pool = {
id, postIndex, sender, stakeCount, fee, duration, endTime, resolved, outcome,
};
@@ -65,7 +49,7 @@ const fetchValidationPool = async (poolIndex) => {
};
const fetchValidationPools = async () => {
const count = await dao.validationPoolCount();
const count = await dao.getValidationPoolCount();
console.log(`validation pool count: ${count}`);
const promises = [];
validationPools = [];
@@ -75,22 +59,6 @@ const fetchValidationPools = async () => {
await Promise.all(promises);
};
const fetchProposal = async (proposalIndex) => {
const proposal = await proposalsContract.proposals(proposalIndex);
proposals[proposalIndex] = proposal;
};
const fetchProposals = async () => {
const count = await proposalsContract.proposalCount();
console.log(`proposal count: ${count}`);
const promises = [];
proposals = [];
for (let i = 0; i < count; i += 1) {
promises.push(fetchProposal(i));
}
await Promise.all(promises);
};
const initialize = async () => {
const getContract = (name) => ethers.getContractAt(
name,
@@ -106,12 +74,11 @@ const initialize = async () => {
posts = [];
await fetchReputation();
await fetchValidationPools();
await fetchProposals();
};
const poolIsActive = (pool) => {
if (new Date() >= new Date(Number(pool.endTime) * 1000)) return false;
if (pool.resolved) return false;
if (new Date() >= new Date(Number(pool.props.endTime) * 1000)) return false;
if (pool.props.resolved) return false;
return true;
};
@@ -120,7 +87,7 @@ const poolIsValidWorkContract = (pool) => {
case getContractAddressByNetworkName(network, 'Work1'): {
// If this is a valid work evidence
// TODO: Can we decode from the post, a reference to the work request?
// The work request does have its own contentId, the work contract has that
// The work request does have its own postId, the work contract has that
// under availabilityStakes
const expectedContent = 'This is a work evidence post';
return pool.post.content.startsWith(expectedContent);
@@ -138,8 +105,8 @@ const poolIsProposal = (pool) => pool.sender === getContractAddressByNetworkName
const getPoolStatus = (pool) => {
if (poolIsActive(pool)) return 'Active';
if (!pool.resolved) return 'Ready to Evaluate';
if (pool.outcome) return 'Accepted';
if (!pool.props.resolved) return 'Ready to Evaluate';
if (pool.props.outcome) return 'Accepted';
return 'Rejected';
};
@@ -179,8 +146,6 @@ const printPool = (pool) => {
};
async function main() {
console.log('Current version:', getCurrentVersion());
await initialize();
validationPools.forEach(printPool);


@@ -1,7 +1,7 @@
const deployContract = require('./util/deploy-contract');
const deployDAOCoreContracts = require('./util/deploy-core-contracts');
async function main() {
await deployContract('DAO', [], true);
await deployDAOCoreContracts();
}
main().catch((error) => {


@@ -1,12 +1,20 @@
const deployWorkContract = require('./util/deploy-work-contract');
require('dotenv').config();
const deployContract = require('./util/deploy-contract');
const deployDAOContract = require('./util/deploy-dao-contract');
const deployWorkContract = require('./util/deploy-work-contract');
const deployRollableWorkContract = require('./util/deploy-rollable-work-contract');
const deployDAOCoreContracts = require('./util/deploy-core-contracts');
const { ROLLUP_INTERVAL } = process.env;
async function main() {
await deployContract('DAO', [], true);
await deployContract('GlobalForum');
await deployDAOCoreContracts();
await deployDAOContract('Rollup', [ROLLUP_INTERVAL]);
await deployDAOContract('Proposals');
await deployWorkContract('Work1');
await deployWorkContract('Onboarding');
await deployRollableWorkContract('Work2');
}
main().catch((error) => {


@@ -0,0 +1,17 @@
require('dotenv').config();
const deployContract = require('./deploy-contract');
const contractAddresses = require('../../contract-addresses.json');
const network = process.env.HARDHAT_NETWORK;
const deployDAOCoreContracts = async () => {
await deployContract('Reputation', [], true);
await deployContract('Bench', [], true);
await deployContract('DAO', [
contractAddresses[network].Reputation,
contractAddresses[network].GlobalForum,
contractAddresses[network].Bench,
], true);
};
module.exports = deployDAOCoreContracts;

View File

@@ -6,8 +6,8 @@ require('dotenv').config();
const network = process.env.HARDHAT_NETWORK;
const deployDAOContract = async (name) => {
await deployContract(name, [contractAddresses[network].DAO]);
const deployDAOContract = async (name, args = []) => {
await deployContract(name, [contractAddresses[network].DAO, ...args]);
};
module.exports = deployDAOContract;

View File

@@ -0,0 +1,22 @@
const { ethers } = require('hardhat');
const deployContract = require('./deploy-contract');
const contractAddresses = require('../../contract-addresses.json');
require('dotenv').config();
const network = process.env.HARDHAT_NETWORK;
const deployRollableWorkContract = async (name) => {
const priceEnvVar = `${name.toUpperCase()}_PRICE`;
const price = ethers.parseEther(process.env[priceEnvVar] || '0.001');
await deployContract(name, [
contractAddresses[network].DAO,
contractAddresses[network].Proposals,
contractAddresses[network].Rollup,
price,
]);
};
module.exports = deployRollableWorkContract;
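The deploy helpers above derive each work contract's price from an environment variable named after the contract, falling back to a default. A minimal standalone sketch of that convention (the helper name `priceFor` is hypothetical; the real scripts inline this logic before calling `ethers.parseEther`):

```javascript
// Sketch of the per-contract price lookup used by the deploy scripts (assumed shape).
function priceFor(name, env = process.env) {
  const key = `${name.toUpperCase()}_PRICE`; // e.g. 'Work2' -> 'WORK2_PRICE'
  // Return a decimal string so ethers.parseEther receives a string, not a float
  return env[key] || '0.001';
}
```

Passing the fallback as the string `'0.001'` rather than the number `0.001` matters: ethers v6 `parseEther` expects a decimal string, which is exactly the bug the `deploy-work-contract` hunk above fixes.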

View File

@@ -9,10 +9,11 @@ const network = process.env.HARDHAT_NETWORK;
const deployWorkContract = async (name) => {
const priceEnvVar = `${name.toUpperCase()}_PRICE`;
const price = ethers.parseEther(process.env[priceEnvVar] || 0.001);
const price = ethers.parseEther(process.env[priceEnvVar] || '0.001');
await deployContract(name, [
contractAddresses[network].DAO,
contractAddresses[network].GlobalForum,
contractAddresses[network].Proposals,
price]);
};

View File

@@ -4,20 +4,22 @@ const {
} = require('@nomicfoundation/hardhat-toolbox/network-helpers');
const { expect } = require('chai');
const { ethers } = require('hardhat');
const deployDAO = require('./util/deploy-dao');
describe('Forum', () => {
async function deploy() {
const [account1, account2, account3] = await ethers.getSigners();
const DAO = await ethers.getContractFactory('DAO');
const dao = await DAO.deploy();
const [account1, account2, account3, account4] = await ethers.getSigners();
const { dao, forum } = await deployDAO();
return {
dao, account1, account2, account3,
dao, forum, account1, account2, account3, account4,
};
}
let dao;
let forum;
let account1;
let account2;
let account3;
let account4;
const POOL_DURATION = 3600; // 1 hour
const POOL_FEE = 100;
const emptyCallbackData = ethers.AbiCoder.defaultAbiCoder().encode([], []);
@@ -39,57 +41,51 @@ describe('Forum', () => {
{ value: fee ?? POOL_FEE },
);
const addPost = (author, contentId, citations) => dao.addPost([{
const addPost = (author, postId, references) => forum.addPost([{
weightPPM: 1000000,
authorAddress: author,
}], contentId, citations);
}], postId, references);
describe('Post', () => {
beforeEach(async () => {
({
dao, account1, account2, account3,
dao, forum, account1, account2, account3, account4,
} = await loadFixture(deploy));
});
it('should be able to add a post', async () => {
const contentId = 'some-id';
await expect(addPost(account1, contentId, [])).to.emit(dao, 'PostAdded').withArgs('some-id');
const post = await dao.posts(contentId);
const postId = 'some-id';
await expect(addPost(account1, postId, [])).to.emit(forum, 'PostAdded').withArgs('some-id');
const post = await forum.getPost(postId);
expect(post.sender).to.equal(account1);
expect(post.id).to.equal(contentId);
const postAuthors = await dao.getPostAuthors(contentId);
expect(postAuthors).to.have.length(1);
expect(postAuthors[0].weightPPM).to.equal(1000000);
expect(postAuthors[0].authorAddress).to.equal(account1);
expect(post.authors).to.have.length(1);
expect(post.authors[0].weightPPM).to.equal(1000000);
expect(post.authors[0].authorAddress).to.equal(account1);
});
it('should be able to add a post on behalf of another account', async () => {
const contentId = 'some-id';
await addPost(account2, contentId, []);
const post = await dao.posts(contentId);
const postId = 'some-id';
await addPost(account2, postId, []);
const post = await forum.getPost(postId);
expect(post.sender).to.equal(account1);
expect(post.id).to.equal(contentId);
const postAuthors = await dao.getPostAuthors(contentId);
expect(postAuthors).to.have.length(1);
expect(postAuthors[0].weightPPM).to.equal(1000000);
expect(postAuthors[0].authorAddress).to.equal(account2);
expect(post.authors).to.have.length(1);
expect(post.authors[0].weightPPM).to.equal(1000000);
expect(post.authors[0].authorAddress).to.equal(account2);
});
it('should be able to add a post with multiple authors', async () => {
const contentId = 'some-id';
await expect(dao.addPost([
const postId = 'some-id';
await expect(forum.addPost([
{ weightPPM: 500000, authorAddress: account1 },
{ weightPPM: 500000, authorAddress: account2 },
], contentId, [])).to.emit(dao, 'PostAdded').withArgs('some-id');
const post = await dao.posts(contentId);
], postId, [])).to.emit(forum, 'PostAdded').withArgs('some-id');
const post = await forum.getPost(postId);
expect(post.sender).to.equal(account1);
expect(post.id).to.equal(contentId);
const postAuthors = await dao.getPostAuthors(contentId);
expect(postAuthors).to.have.length(2);
expect(postAuthors[0].weightPPM).to.equal(500000);
expect(postAuthors[0].authorAddress).to.equal(account1);
expect(postAuthors[1].weightPPM).to.equal(500000);
expect(postAuthors[1].authorAddress).to.equal(account2);
expect(post.authors).to.have.length(2);
expect(post.authors[0].weightPPM).to.equal(500000);
expect(post.authors[0].authorAddress).to.equal(account1);
expect(post.authors[1].weightPPM).to.equal(500000);
expect(post.authors[1].authorAddress).to.equal(account2);
await initiateValidationPool({ postId: 'some-id' });
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(0);
@@ -98,53 +94,48 @@ describe('Forum', () => {
});
it('should not be able to add a post with total author weight < 100%', async () => {
const contentId = 'some-id';
await expect(dao.addPost([
const postId = 'some-id';
await expect(forum.addPost([
{ weightPPM: 500000, authorAddress: account1 },
{ weightPPM: 400000, authorAddress: account2 },
], contentId, [])).to.be.rejectedWith('Author weights must sum to 1000000');
], postId, [])).to.be.rejectedWith('Author weights must sum to 1000000');
});
it('should not be able to add a post with total author weight > 100%', async () => {
const contentId = 'some-id';
await expect(dao.addPost([
const postId = 'some-id';
await expect(forum.addPost([
{ weightPPM: 500000, authorAddress: account1 },
{ weightPPM: 600000, authorAddress: account2 },
], contentId, [])).to.be.rejectedWith('Author weights must sum to 1000000');
], postId, [])).to.be.rejectedWith('Author weights must sum to 1000000');
});
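The two rejection tests above pin down the author-weight rule: weights are expressed in parts per million and must sum to exactly 1,000,000. A minimal off-chain sketch of that check (the function name is hypothetical; the contract enforces this in its own `require`):

```javascript
// Author weights are parts-per-million shares; they must sum to exactly 100%.
const TOTAL_PPM = 1000000;

function validateAuthors(authors) {
  const total = authors.reduce((sum, a) => sum + a.weightPPM, 0);
  if (total !== TOTAL_PPM) throw new Error('Author weights must sum to 1000000');
  return true;
}
```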
it('should be able to donate reputation via citations', async () => {
it('should be able to donate reputation via references', async () => {
await addPost(account1, 'content-id', []);
await addPost(account2, 'second-content-id', [{ weightPPM: 500000, targetPostId: 'content-id' }]);
await initiateValidationPool({ postId: 'second-content-id' });
const pool = await dao.validationPools(0);
expect(pool.postId).to.equal('second-content-id');
const pool = await dao.getValidationPool(0);
expect(pool.props.postId).to.equal('second-content-id');
await dao.evaluateOutcome(0);
expect(await dao.balanceOf(account1)).to.equal(50);
expect(await dao.balanceOf(account2)).to.equal(50);
});
it('should be able to leech reputation via citations', async () => {
it('should be able to leech reputation via references', async () => {
await addPost(account1, 'content-id', []);
expect((await dao.posts('content-id')).reputation).to.equal(0);
await initiateValidationPool({ postId: 'content-id' });
await dao.evaluateOutcome(0);
expect(await dao.balanceOf(account1)).to.equal(100);
expect((await dao.posts('content-id')).reputation).to.equal(100);
await addPost(account2, 'second-content-id', [{ weightPPM: -500000, targetPostId: 'content-id' }]);
expect((await dao.posts('second-content-id')).reputation).to.equal(0);
await initiateValidationPool({ postId: 'second-content-id' });
const pool = await dao.validationPools(1);
expect(pool.postId).to.equal('second-content-id');
const pool = await dao.getValidationPool(1);
expect(pool.props.postId).to.equal('second-content-id');
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(1);
expect(await dao.balanceOf(account1)).to.equal(50);
expect(await dao.balanceOf(account2)).to.equal(150);
expect((await dao.posts('content-id')).reputation).to.equal(50);
expect((await dao.posts('second-content-id')).reputation).to.equal(150);
});
it('should be able to redistribute power via citations', async () => {
it('should be able to redistribute power via references', async () => {
await addPost(account1, 'content-id', []);
await initiateValidationPool({ postId: 'content-id' });
await dao.evaluateOutcome(0);
@@ -156,8 +147,8 @@ describe('Forum', () => {
{ weightPPM: 1000000, targetPostId: 'second-content-id' },
]);
await initiateValidationPool({ postId: 'third-content-id' });
const pool = await dao.validationPools(1);
expect(pool.postId).to.equal('third-content-id');
const pool = await dao.getValidationPool(1);
expect(pool.props.postId).to.equal('third-content-id');
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(1);
expect(await dao.balanceOf(account1)).to.equal(0);
@@ -165,7 +156,7 @@ describe('Forum', () => {
expect(await dao.balanceOf(account3)).to.equal(0);
});
it('should be able to reverse a negative citation with a negative citation', async () => {
it('should be able to reverse a negative reference with a negative reference', async () => {
await addPost(account1, 'content-id', []);
await initiateValidationPool({ postId: 'content-id' });
await dao.evaluateOutcome(0);
@@ -192,8 +183,8 @@ describe('Forum', () => {
{ weightPPM: 100000, targetPostId: 'nonexistent-content-id' },
]);
await initiateValidationPool({ postId: 'second-content-id' });
const pool = await dao.validationPools(0);
expect(pool.postId).to.equal('second-content-id');
const pool = await dao.getValidationPool(0);
expect(pool.props.postId).to.equal('second-content-id');
await dao.evaluateOutcome(0);
expect(await dao.balanceOf(account1)).to.equal(10);
expect(await dao.balanceOf(account2)).to.equal(90);
@@ -221,7 +212,6 @@ describe('Forum', () => {
});
it('should limit effects of negative references on prior positive references', async () => {
console.log('First post');
await addPost(account1, 'content-id', []);
await initiateValidationPool({ postId: 'content-id' });
await dao.evaluateOutcome(0);
@@ -263,21 +253,15 @@ describe('Forum', () => {
it('should enforce depth limit', async () => {
await addPost(account1, 'content-id-1', []);
await addPost(account1, 'content-id-2', [{ weightPPM: 1000000, targetPostId: 'content-id-1' }]);
await addPost(account1, 'content-id-3', [{ weightPPM: 1000000, targetPostId: 'content-id-2' }]);
await addPost(account1, 'content-id-4', [{ weightPPM: 1000000, targetPostId: 'content-id-3' }]);
await addPost(account2, 'content-id-2', [{ weightPPM: 1000000, targetPostId: 'content-id-1' }]);
await addPost(account3, 'content-id-3', [{ weightPPM: 1000000, targetPostId: 'content-id-2' }]);
await addPost(account4, 'content-id-4', [{ weightPPM: 1000000, targetPostId: 'content-id-3' }]);
await initiateValidationPool({ postId: 'content-id-4' });
await dao.evaluateOutcome(0);
const posts = await Promise.all([
await dao.posts('content-id-1'),
await dao.posts('content-id-2'),
await dao.posts('content-id-3'),
await dao.posts('content-id-4'),
]);
expect(posts[0].reputation).to.equal(0);
expect(posts[1].reputation).to.equal(100);
expect(posts[2].reputation).to.equal(0);
expect(posts[3].reputation).to.equal(0);
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(account3)).to.equal(0);
expect(await dao.balanceOf(account4)).to.equal(0);
});
it('should be able to incinerate reputation', async () => {
@@ -290,18 +274,16 @@ describe('Forum', () => {
await initiateValidationPool({ postId: 'content-id-1' });
expect(await dao.totalSupply()).to.equal(100);
await dao.evaluateOutcome(0);
expect((await dao.posts('content-id-1')).reputation).to.equal(50);
expect(await dao.totalSupply()).to.equal(50);
});
describe('negative citation of a post, the author having already staked and lost reputation', async () => {
describe('negative reference of a post, the author having already staked and lost reputation', async () => {
beforeEach(async () => {
await addPost(account1, 'content-id', []);
await initiateValidationPool({ postId: 'content-id' });
await dao.evaluateOutcome(0);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.totalSupply()).to.equal(100);
expect((await dao.posts('content-id')).reputation).to.equal(100);
await addPost(account2, 'second-content-id', []);
await initiateValidationPool({ postId: 'second-content-id' });
@@ -310,8 +292,6 @@ describe('Forum', () => {
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.totalSupply()).to.equal(200);
expect((await dao.posts('content-id')).reputation).to.equal(100);
expect((await dao.posts('second-content-id')).reputation).to.equal(100);
// account1 stakes and loses
await initiateValidationPool({ postId: 'second-content-id' });
@@ -322,8 +302,6 @@ describe('Forum', () => {
expect(await dao.balanceOf(account1)).to.equal(50);
expect(await dao.balanceOf(account2)).to.equal(250);
expect(await dao.totalSupply()).to.equal(300);
expect((await dao.posts('content-id')).reputation).to.equal(100);
expect((await dao.posts('second-content-id')).reputation).to.equal(100);
});
it('author and post rep can be completely destroyed', async () => {
@@ -336,9 +314,6 @@ describe('Forum', () => {
expect(await dao.balanceOf(account2)).to.equal(250);
expect(await dao.balanceOf(account3)).to.equal(250);
expect(await dao.totalSupply()).to.equal(500);
expect((await dao.posts('content-id')).reputation).to.equal(0);
expect((await dao.posts('second-content-id')).reputation).to.equal(100);
expect((await dao.posts('third-content-id')).reputation).to.equal(250);
});
it('author rep can be destroyed while some post rep remains', async () => {
@@ -351,9 +326,6 @@ describe('Forum', () => {
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.balanceOf(account2)).to.equal(250);
expect(await dao.balanceOf(account3)).to.equal(120);
expect((await dao.posts('content-id')).reputation).to.equal(30);
expect((await dao.posts('second-content-id')).reputation).to.equal(100);
expect((await dao.posts('third-content-id')).reputation).to.equal(120);
});
it('author rep can be destroyed while some post rep remains (odd amount)', async () => {
@@ -366,15 +338,12 @@ describe('Forum', () => {
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.balanceOf(account2)).to.equal(250);
expect(await dao.balanceOf(account3)).to.equal(125);
expect((await dao.posts('content-id')).reputation).to.equal(25);
expect((await dao.posts('second-content-id')).reputation).to.equal(100);
expect((await dao.posts('third-content-id')).reputation).to.equal(125);
});
});
describe('negative citation of a post with multiple authors', async () => {
describe('negative reference of a post with multiple authors', async () => {
beforeEach(async () => {
await dao.addPost([
await forum.addPost([
{ weightPPM: 500000, authorAddress: account1 },
{ weightPPM: 500000, authorAddress: account2 },
], 'content-id', []);
@@ -383,18 +352,16 @@ describe('Forum', () => {
expect(await dao.balanceOf(account1)).to.equal(50);
expect(await dao.balanceOf(account2)).to.equal(50);
expect(await dao.totalSupply()).to.equal(100);
expect((await dao.posts('content-id')).reputation).to.equal(100);
// account1 stakes and loses
await initiateValidationPool({ postId: 'content-id' });
await dao.stakeOnValidationPool(1, 25, true);
await dao.connect(account2).stakeOnValidationPool(1, 60, false);
await dao.connect(account2).stakeOnValidationPool(1, 50, false);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(1);
expect(await dao.balanceOf(account1)).to.equal(25);
expect(await dao.balanceOf(account2)).to.equal(175);
expect(await dao.totalSupply()).to.equal(200);
expect((await dao.posts('content-id')).reputation).to.equal(100);
});
it('author and post rep can be completely destroyed', async () => {
@@ -404,11 +371,9 @@ describe('Forum', () => {
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(2);
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.balanceOf(account2)).to.equal(125);
expect(await dao.balanceOf(account3)).to.equal(475);
expect(await dao.balanceOf(account2)).to.equal(0);
expect(await dao.balanceOf(account3)).to.equal(600);
expect(await dao.totalSupply()).to.equal(600);
expect((await dao.posts('content-id')).reputation).to.equal(0);
expect((await dao.posts('second-content-id')).reputation).to.equal(475);
});
it('author rep can be destroyed while some post rep remains', async () => {
@@ -421,8 +386,6 @@ describe('Forum', () => {
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.balanceOf(account2)).to.equal(140);
expect(await dao.balanceOf(account3)).to.equal(130);
expect((await dao.posts('content-id')).reputation).to.equal(30);
expect((await dao.posts('second-content-id')).reputation).to.equal(130);
});
});
});

View File

@@ -0,0 +1,318 @@
const {
time,
loadFixture,
} = require('@nomicfoundation/hardhat-toolbox/network-helpers');
const { expect } = require('chai');
const { ethers } = require('hardhat');
const deployDAO = require('./util/deploy-dao');
describe('Lightweight Validation Pools', () => {
async function deploy() {
const [account1, account2] = await ethers.getSigners();
const { dao, forum } = await deployDAO();
return {
dao, forum, account1, account2,
};
}
let dao;
let forum;
let account1;
let account2;
const POOL_DURATION = 3600; // 1 hour
const POOL_FEE = 100;
const emptyCallbackData = ethers.AbiCoder.defaultAbiCoder().encode([], []);
const initiateValidationPool = ({
postId, duration,
quorum, winRatio, bindingPercent,
redistributeLosingStakes, callbackOnValidate,
callbackData, fee,
} = {}) => dao.initiateLWValidationPool(
postId ?? 'content-id',
duration ?? POOL_DURATION,
quorum ?? [1, 3],
winRatio ?? [1, 2],
bindingPercent ?? 100,
redistributeLosingStakes ?? true,
callbackOnValidate ?? false,
callbackData ?? emptyCallbackData,
{ value: fee ?? POOL_FEE },
);
beforeEach(async () => {
({
dao, forum, account1, account2,
} = await loadFixture(deploy));
await forum.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'content-id', []);
const init = () => initiateValidationPool({ fee: POOL_FEE });
await expect(init()).to.emit(dao, 'LWValidationPoolInitiated').withArgs(0);
expect(await dao.getLWValidationPoolCount()).to.equal(1);
expect(await dao.memberCount()).to.equal(0);
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.totalSupply()).to.equal(POOL_FEE);
});
describe('Initiate', () => {
it('should be able to initiate a validation pool without a fee', async () => {
const init = () => initiateValidationPool({ fee: 0 });
await expect(init()).to.emit(dao, 'LWValidationPoolInitiated');
});
it('should not be able to initiate a validation pool with a quorum below the minimum', async () => {
const init = () => initiateValidationPool({ quorum: [1, 11] });
await expect(init()).to.be.revertedWith('Quorum is below minimum');
});
it('should not be able to initiate a validation pool with a quorum greater than 1', async () => {
const init = () => initiateValidationPool({ quorum: [11, 10] });
await expect(init()).to.be.revertedWith('Quorum is greater than one');
});
it('should not be able to initiate a validation pool with duration below minimum', async () => {
const init = () => initiateValidationPool({ duration: 0 });
await expect(init()).to.be.revertedWith('Duration is too short');
});
it('should not be able to initiate a validation pool with duration above maximum', async () => {
const init = () => initiateValidationPool({ duration: 40000000000000 });
await expect(init()).to.be.revertedWith('Duration is too long');
});
it('should not be able to initiate a validation pool with bindingPercent above 100', async () => {
const init = () => initiateValidationPool({ bindingPercent: 101 });
await expect(init()).to.be.revertedWith('Binding percent must be <= 100');
});
it('should be able to initiate a second validation pool', async () => {
const init = () => initiateValidationPool();
await expect(init()).to.emit(dao, 'LWValidationPoolInitiated').withArgs(1);
expect(await dao.getLWValidationPoolCount()).to.equal(2);
});
it('Should be able to fetch pool instance', async () => {
const pool = await dao.getLWValidationPool(0);
expect(pool).to.exist;
expect(pool.params.duration).to.equal(POOL_DURATION);
expect(pool.props.postId).to.equal('content-id');
expect(pool.props.resolved).to.be.false;
expect(pool.sender).to.equal(account1);
});
});
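The quorum tests above treat the quorum as a fraction `[numerator, denominator]` that must lie between some minimum and 1. A sketch of the validation those reverts imply, comparing fractions by cross-multiplication to avoid floating point (the minimum of 1/10 is an assumption inferred from the `[1, 11]` rejection, not a value read from the contract):

```javascript
// Hypothetical mirror of the on-chain quorum checks; MIN_QUORUM is assumed.
const MIN_QUORUM = [1, 10];

function validateQuorum([num, den]) {
  // num/den < MIN  <=>  num * MIN_den < MIN_num * den
  if (num * MIN_QUORUM[1] < MIN_QUORUM[0] * den) throw new Error('Quorum is below minimum');
  if (num > den) throw new Error('Quorum is greater than one');
  return true;
}
```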
describe('Propose Result', () => {
it('should not be able to propose an empty result', async () => {
await expect(dao.proposeLWResult(0, 'some-hash', [])).to.be.revertedWith('The proposed result contains no transfers');
});
it('should be able to propose a result', async () => {
await expect(dao.proposeLWResult(0, 'some-hash', [{ from: account1, to: account2, amount: 0 }])).to.emit(dao, 'LWResultProposed').withArgs(0, 0, 'some-hash');
await expect(dao.proposeLWResult(0, 'some-other-hash', [{ from: account1, to: account2, amount: 0 }])).to.emit(dao, 'LWResultProposed').withArgs(0, 1, 'some-other-hash');
});
it('should not be able to propose the same result twice', async () => {
await expect(dao.proposeLWResult(0, 'some-hash', [{ from: account1, to: account2, amount: 0 }])).to.emit(dao, 'LWResultProposed').withArgs(0, 0, 'some-hash');
await expect(dao.proposeLWResult(0, 'some-hash', [{ from: account1, to: account2, amount: 0 }])).to.be.revertedWith('This result hash has already been proposed');
});
});
describe('Stake', async () => {
beforeEach(async () => {
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(0);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(0);
await initiateValidationPool();
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
});
it('should be able to stake before validation pool has elapsed', async () => {
await dao.stakeOnLWValidationPool(1, 10, true);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, true, true);
expect(await dao.balanceOf(account1)).to.equal(200);
expect(await dao.balanceOf(dao.target)).to.equal(0);
});
it('should not be able to stake after validation pool has elapsed', async () => {
await time.increase(POOL_DURATION + 1);
await expect(dao.stakeOnValidationPool(1, 10, true)).to.be.revertedWith('Pool end time has passed');
});
it('should be able to stake against a validation pool', async () => {
await dao.stakeOnValidationPool(1, 10, false);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, false, true);
expect(await dao.balanceOf(account1)).to.equal(200);
expect(await dao.balanceOf(dao.target)).to.equal(0);
const pool = await dao.getValidationPool(1);
expect(pool.props.outcome).to.be.false;
});
it('should not be able to stake more REP than the sender owns', async () => {
await expect(dao.stakeOnValidationPool(1, 200, true)).to.be.revertedWith('Insufficient REP balance to cover stake');
});
});
describe('Delegated stake', () => {
it('should stake the lesser of the allowed amount or the owner\'s remaining balance', async () => {
// TODO: owner delegates stake and then loses rep
});
});
describe('Evaluate outcome', () => {
it('should not be able to evaluate outcome before duration has elapsed if not all rep has been staked', async () => {
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(0);
await initiateValidationPool({ fee: 100 });
await expect(dao.evaluateOutcome(1)).to.be.revertedWith('Pool end time has not yet arrived');
});
it('should not be able to evaluate outcome before duration has elapsed unless all rep has been staked', async () => {
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(0);
await initiateValidationPool({ fee: 100 });
await dao.stakeOnValidationPool(1, 100, true);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, true, true);
});
it('should be able to evaluate outcome after duration has elapsed', async () => {
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(0)).to.emit(dao, 'ValidationPoolResolved').withArgs(0, true, true);
expect(await dao.memberCount()).to.equal(1);
expect(await dao.balanceOf(account1)).to.equal(100);
const pool = await dao.getValidationPool(0);
expect(pool.props.resolved).to.be.true;
expect(pool.props.outcome).to.be.true;
});
it('should not be able to evaluate outcome more than once', async () => {
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(0)).to.emit(dao, 'ValidationPoolResolved').withArgs(0, true, true);
await expect(dao.evaluateOutcome(0)).to.be.revertedWith('Pool is already resolved');
});
it('should be able to evaluate outcome of second validation pool', async () => {
const init = () => initiateValidationPool();
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
expect(await dao.getValidationPoolCount()).to.equal(2);
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(0)).to.emit(dao, 'ValidationPoolResolved').withArgs(0, true, true);
expect(await dao.balanceOf(account1)).to.equal(100);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, true, true);
expect(await dao.balanceOf(account1)).to.equal(200);
});
it('should not be able to evaluate outcome if quorum is not met', async () => {
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(0)).to.emit(dao, 'ValidationPoolResolved').withArgs(0, true, true);
const init = () => initiateValidationPool({ quorum: [1, 1] });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
expect(await dao.getValidationPoolCount()).to.equal(2);
await time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, false, false);
});
describe('Validation pool options', () => {
beforeEach(async () => {
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(0);
await forum.addPost([{ weightPPM: 1000000, authorAddress: account2 }], 'content-id-2', []);
const init = () => initiateValidationPool({ postId: 'content-id-2' });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(1);
});
it('Binding validation pool should redistribute stakes', async () => {
const init = () => initiateValidationPool();
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(2);
await dao.connect(account1).stakeOnValidationPool(2, 10, true);
await dao.connect(account2).stakeOnValidationPool(2, 10, false);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(2);
expect(await dao.balanceOf(account1)).to.equal(210);
expect(await dao.balanceOf(account2)).to.equal(90);
expect(await dao.balanceOf(dao.target)).to.equal(0);
});
it('Non binding validation pool should not redistribute stakes', async () => {
const init = () => initiateValidationPool({ bindingPercent: 0 });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(2);
await dao.connect(account1).stakeOnValidationPool(2, 10, true);
await dao.connect(account2).stakeOnValidationPool(2, 10, false);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(2);
expect(await dao.balanceOf(account1)).to.equal(200);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(0);
});
it('Partially binding validation pool should redistribute some stakes', async () => {
const init = () => initiateValidationPool({ bindingPercent: 50 });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(2);
await dao.connect(account1).stakeOnValidationPool(2, 10, true);
await dao.connect(account2).stakeOnValidationPool(2, 10, false);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(2);
expect(await dao.balanceOf(account1)).to.equal(205);
expect(await dao.balanceOf(account2)).to.equal(95);
expect(await dao.balanceOf(dao.target)).to.equal(0);
expect(await dao.totalSupply()).to.equal(300);
});
it('If redistributeLosingStakes is false, validation pool should burn binding portion of losing stakes', async () => {
const init = () => initiateValidationPool({
bindingPercent: 50,
redistributeLosingStakes: false,
});
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(2);
await dao.connect(account1).stakeOnValidationPool(2, 10, true);
await dao.connect(account2).stakeOnValidationPool(2, 10, false);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(2);
expect(await dao.balanceOf(account1)).to.equal(200);
expect(await dao.balanceOf(account2)).to.equal(95);
expect(await dao.balanceOf(dao.target)).to.equal(0);
expect(await dao.totalSupply()).to.equal(295);
});
it('If redistributeLosingStakes is false and bindingPercent is 0, accounts should recover initial balances', async () => {
const init = () => initiateValidationPool({
bindingPercent: 0,
redistributeLosingStakes: false,
});
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(2);
await dao.connect(account1).stakeOnValidationPool(2, 10, true);
await dao.connect(account2).stakeOnValidationPool(2, 10, false);
expect(await dao.balanceOf(account1)).to.equal(100);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(100);
await time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(2);
expect(await dao.balanceOf(account1)).to.equal(200);
expect(await dao.balanceOf(account2)).to.equal(100);
expect(await dao.balanceOf(dao.target)).to.equal(0);
expect(await dao.totalSupply()).to.equal(300);
});
});
});
});
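The binding/redistribution tests above imply a simple settlement rule: only the binding portion of the losing stake moves, and when `redistributeLosingStakes` is false that portion is burned instead of paid to the winner. A sketch consistent with the expected balances (the function and field names are hypothetical, and the rounding is assumed, not the contract's actual code):

```javascript
// Settlement of a losing stake, as implied by the test expectations above.
function settle(losingStake, bindingPercent, redistributeLosingStakes) {
  const binding = Math.floor((losingStake * bindingPercent) / 100);
  return {
    winnerGain: redistributeLosingStakes ? binding : 0, // paid to winning side
    loserLoss: binding,                                 // deducted from loser
    burned: redistributeLosingStakes ? 0 : binding,     // removed from totalSupply
  };
}
```

For example, with 10 REP staked on each side and `bindingPercent: 50`, the loser gives up 5 REP, matching the 205/95 balances the partially binding test expects.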

View File

@@ -4,6 +4,7 @@ const {
} = require('@nomicfoundation/hardhat-toolbox/network-helpers');
const { expect } = require('chai');
const { ethers } = require('hardhat');
const deployDAO = require('./util/deploy-dao');
describe('Onboarding', () => {
const PRICE = 100;
@@ -12,14 +13,13 @@ describe('Onboarding', () => {
// Contracts are deployed using the first signer/account by default
const [account1, account2] = await ethers.getSigners();
const DAO = await ethers.getContractFactory('DAO');
const dao = await DAO.deploy();
const { dao, forum } = await deployDAO();
const Proposals = await ethers.getContractFactory('Proposals');
const proposals = await Proposals.deploy(dao.target);
const Onboarding = await ethers.getContractFactory('Onboarding');
const onboarding = await Onboarding.deploy(dao.target, proposals.target, PRICE);
const onboarding = await Onboarding.deploy(dao.target, forum.target, proposals.target, PRICE);
await dao.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'content-id', []);
await forum.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'content-id', []);
const callbackData = ethers.AbiCoder.defaultAbiCoder().encode([], []);
await dao.initiateValidationPool(
'content-id',
@@ -37,7 +37,7 @@ describe('Onboarding', () => {
expect(await dao.balanceOf(account1)).to.equal(100);
return {
dao, onboarding, account1, account2,
dao, forum, onboarding, account1, account2,
};
}
@@ -53,13 +53,14 @@ describe('Onboarding', () => {
describe('Work approval/disapproval', () => {
let dao;
let forum;
let onboarding;
let account1;
let account2;
beforeEach(async () => {
({
dao, onboarding, account1, account2,
dao, forum, onboarding, account1, account2,
} = await loadFixture(deploy));
await dao.stakeAvailability(onboarding.target, 50, STAKE_DURATION);
});
@@ -70,16 +71,14 @@
await expect(onboarding.submitWorkApproval(0, true))
.to.emit(dao, 'ValidationPoolInitiated').withArgs(1)
.to.emit(onboarding, 'WorkApprovalSubmitted').withArgs(0, true);
const post = await dao.posts('evidence-content-id');
const post = await forum.getPost('evidence-content-id');
expect(post.sender).to.equal(onboarding.target);
expect(post.id).to.equal('evidence-content-id');
const postAuthors = await dao.getPostAuthors('evidence-content-id');
expect(postAuthors).to.have.length(1);
expect(postAuthors[0].weightPPM).to.equal(1000000);
expect(postAuthors[0].authorAddress).to.equal(account1);
const pool = await dao.validationPools(1);
expect(pool.postId).to.equal('evidence-content-id');
expect(pool.fee).to.equal(PRICE * 0.9);
expect(post.authors).to.have.length(1);
expect(post.authors[0].weightPPM).to.equal(1000000);
expect(post.authors[0].authorAddress).to.equal(account1);
const pool = await dao.getValidationPool(1);
expect(pool.props.postId).to.equal('evidence-content-id');
expect(pool.props.fee).to.equal(PRICE * 0.9);
expect(pool.sender).to.equal(onboarding.target);
});
@@ -114,7 +113,7 @@ describe('Onboarding', () => {
describe('Onboarding followup', () => {
it('resolving the first validation pool should trigger a second pool', async () => {
const {
dao, onboarding, account2,
dao, forum, onboarding, account2,
} = await loadFixture(deploy);
await dao.stakeAvailability(onboarding.target, 50, STAKE_DURATION);
await onboarding.connect(account2).requestWork('req-content-id', { value: PRICE });
@@ -122,24 +121,22 @@
await expect(onboarding.submitWorkApproval(0, true)).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
await time.increase(86401);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolInitiated').withArgs(2);
expect(await dao.postCount()).to.equal(3);
const post = await dao.posts('req-content-id');
expect(await forum.postCount()).to.equal(3);
const post = await forum.getPost('req-content-id');
expect(post.sender).to.equal(onboarding.target);
expect(post.id).to.equal('req-content-id');
const postAuthors = await dao.getPostAuthors('req-content-id');
expect(postAuthors).to.have.length(1);
expect(postAuthors[0].weightPPM).to.equal(1000000);
expect(postAuthors[0].authorAddress).to.equal(account2);
const pool = await dao.validationPools(2);
expect(pool.postId).to.equal('req-content-id');
expect(pool.fee).to.equal(PRICE * 0.1);
expect(post.authors).to.have.length(1);
expect(post.authors[0].weightPPM).to.equal(1000000);
expect(post.authors[0].authorAddress).to.equal(account2);
const pool = await dao.getValidationPool(2);
expect(pool.props.postId).to.equal('req-content-id');
expect(pool.props.fee).to.equal(PRICE * 0.1);
expect(pool.sender).to.equal(onboarding.target);
expect(pool.fee);
expect(pool.props.fee);
});
it('if the first validation pool is rejected it should not trigger a second pool', async () => {
const {
dao, onboarding, account2,
dao, forum, onboarding, account2,
} = await loadFixture(deploy);
await dao.stakeAvailability(onboarding.target, 40, STAKE_DURATION);
await onboarding.connect(account2).requestWork('req-content-id', { value: PRICE });
@@ -148,7 +145,7 @@ describe('Onboarding', () => {
await dao.stakeOnValidationPool(1, 60, false);
await time.increase(86401);
await expect(dao.evaluateOutcome(1)).not.to.emit(dao, 'ValidationPoolInitiated');
expect(await dao.postCount()).to.equal(2);
expect(await forum.postCount()).to.equal(2);
});
});
});
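The assertions in these Onboarding tests encode a 90/10 split of the request price: the work-evidence validation pool receives `PRICE * 0.9` and the follow-up onboarding pool receives `PRICE * 0.1`. A minimal sketch of that arithmetic in integer math; the helper name and the basis-points parameter are illustrative, not part of the contracts:

```javascript
// Illustrative only: mirrors the 90/10 fee split asserted in the tests above.
// `workerShareBps` (basis points) is a hypothetical parameter, not contract API.
function splitFee(price, workerShareBps = 9000) {
  const workPoolFee = Math.floor((price * workerShareBps) / 10000); // first pool
  const onboardingPoolFee = price - workPoolFee; // remainder funds the follow-up pool
  return { workPoolFee, onboardingPoolFee };
}

console.log(splitFee(100)); // → { workPoolFee: 90, onboardingPoolFee: 10 }
```

Taking the remainder (rather than multiplying twice) guarantees the two fees always sum to the original price.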


@@ -5,19 +5,19 @@ const {
const { expect } = require('chai');
const { ethers } = require('hardhat');
const { beforeEach } = require('mocha');
const deployDAO = require('./util/deploy-dao');
describe('Proposal', () => {
async function deploy() {
// Contracts are deployed using the first signer/account by default
const [account1, account2] = await ethers.getSigners();
const DAO = await ethers.getContractFactory('DAO');
const dao = await DAO.deploy();
const { dao, forum } = await deployDAO();
const Proposals = await ethers.getContractFactory('Proposals');
const proposals = await Proposals.deploy(dao.target);
await dao.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'some-content-id', []);
await dao.addPost([{ weightPPM: 1000000, authorAddress: account2 }], 'some-other-content-id', []);
await forum.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'some-content-id', []);
await forum.addPost([{ weightPPM: 1000000, authorAddress: account2 }], 'some-other-content-id', []);
const callbackData = ethers.AbiCoder.defaultAbiCoder().encode([], []);
await dao.initiateValidationPool(
'some-content-id',
@@ -46,7 +46,7 @@ describe('Proposal', () => {
await dao.evaluateOutcome(1);
return {
dao, proposals, account1, account2,
dao, forum, proposals, account1, account2,
};
}
@@ -65,6 +65,7 @@ describe('Proposal', () => {
describe('Attestation', () => {
let dao;
let forum;
let proposals;
let account1;
let account2;
@@ -73,13 +74,15 @@
beforeEach(async () => {
({
dao,
forum,
proposals,
account1,
account2,
} = await loadFixture(deploy));
const emptyCallbackData = ethers.AbiCoder.defaultAbiCoder().encode([], []);
await proposals.propose('proposal-content-id', account1, [20, 20, 20], false, emptyCallbackData, { value: 100 });
await forum.addPost([{ authorAddress: account1, weightPPM: 1000000 }], 'proposal-content-id', []);
await proposals.propose('proposal-content-id', [20, 20, 20], false, emptyCallbackData, { value: 100 });
expect(await proposals.proposalCount()).to.equal(1);
proposal = await proposals.proposals(0);
expect(proposal.postId).to.equal('proposal-content-id');
@@ -87,10 +90,10 @@
});
it('Can submit a proposal', async () => {
const postAuthors = await dao.getPostAuthors('proposal-content-id');
expect(postAuthors).to.have.length(1);
expect(postAuthors[0].weightPPM).to.equal(1000000);
expect(postAuthors[0].authorAddress).to.equal(account1);
const post = await forum.getPost('proposal-content-id');
expect(post.authors).to.have.length(1);
expect(post.authors[0].weightPPM).to.equal(1000000);
expect(post.authors[0].authorAddress).to.equal(account1);
});
it('Can attest for a proposal', async () => {
@@ -221,8 +224,8 @@
});
afterEach(async () => {
const pool = await dao.validationPools(3);
expect(pool.resolved).to.be.true;
const pool = await dao.getValidationPool(3);
expect(pool.props.resolved).to.be.true;
});
it('proposal dies if it fails to meet quorum', async () => {
@@ -307,8 +310,8 @@
});
afterEach(async () => {
const pool = await dao.validationPools(4);
expect(pool.resolved).to.be.true;
const pool = await dao.getValidationPool(4);
expect(pool.props.resolved).to.be.true;
});
it('proposal dies if it fails to meet quorum', async () => {
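Posts in these tests carry author lists whose `weightPPM` values sum to 1,000,000 (parts per million), e.g. `{ weightPPM: 1000000, authorAddress: account1 }` for a sole author. A hedged sketch of how an amount could be apportioned pro rata under that convention; the function name and flooring behavior are assumptions, not the contracts' code:

```javascript
// Sketch under assumed semantics: weightPPM values are parts-per-million
// shares summing to 1,000,000; each author's cut is floor(amount * w / 1e6).
function distributeByWeight(amount, authors) {
  return authors.map(({ authorAddress, weightPPM }) => ({
    authorAddress,
    share: Math.floor((amount * weightPPM) / 1000000),
  }));
}

// Hypothetical two-author post with a 75/25 split:
const shares = distributeByWeight(100, [
  { authorAddress: '0xAuthorA', weightPPM: 750000 },
  { authorAddress: '0xAuthorB', weightPPM: 250000 },
]);
console.log(shares.map((s) => s.share)); // → [ 75, 25 ]
```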


@@ -4,15 +4,18 @@ const {
} = require('@nomicfoundation/hardhat-toolbox/network-helpers');
const { expect } = require('chai');
const { ethers } = require('hardhat');
const deployDAO = require('./util/deploy-dao');
describe('Validation Pools', () => {
async function deploy() {
const [account1, account2] = await ethers.getSigners();
const DAO = await ethers.getContractFactory('DAO');
const dao = await DAO.deploy();
return { dao, account1, account2 };
const { dao, forum } = await deployDAO();
return {
dao, forum, account1, account2,
};
}
let dao;
let forum;
let account1;
let account2;
const POOL_DURATION = 3600; // 1 hour
@@ -37,20 +40,22 @@
);
beforeEach(async () => {
({ dao, account1, account2 } = await loadFixture(deploy));
await dao.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'content-id', []);
({
dao, forum, account1, account2,
} = await loadFixture(deploy));
await forum.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'content-id', []);
const init = () => initiateValidationPool({ fee: POOL_FEE });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(0);
expect(await dao.validationPoolCount()).to.equal(1);
expect(await dao.getValidationPoolCount()).to.equal(1);
expect(await dao.memberCount()).to.equal(0);
expect(await dao.balanceOf(account1)).to.equal(0);
expect(await dao.totalSupply()).to.equal(POOL_FEE);
});
describe('Initiate', () => {
it('should not be able to initiate a validation pool without a fee', async () => {
it('should be able to initiate a validation pool without a fee', async () => {
const init = () => initiateValidationPool({ fee: 0 });
await expect(init()).to.be.revertedWith('Fee is required to initiate validation pool');
await expect(init()).to.emit(dao, 'ValidationPoolInitiated');
});
it('should not be able to initiate a validation pool with a quorum below the minimum', async () => {
@@ -81,15 +86,15 @@
it('should be able to initiate a second validation pool', async () => {
const init = () => initiateValidationPool();
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
expect(await dao.validationPoolCount()).to.equal(2);
expect(await dao.getValidationPoolCount()).to.equal(2);
});
it('Should be able to fetch pool instance', async () => {
const pool = await dao.validationPools(0);
const pool = await dao.getValidationPool(0);
expect(pool).to.exist;
expect(pool.params.duration).to.equal(POOL_DURATION);
expect(pool.postId).to.equal('content-id');
expect(pool.resolved).to.be.false;
expect(pool.props.postId).to.equal('content-id');
expect(pool.props.resolved).to.be.false;
expect(pool.sender).to.equal(account1);
});
});
@@ -128,8 +133,18 @@
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, false, true);
expect(await dao.balanceOf(account1)).to.equal(200);
expect(await dao.balanceOf(dao.target)).to.equal(0);
const pool = await dao.validationPools(1);
expect(pool.outcome).to.be.false;
const pool = await dao.getValidationPool(1);
expect(pool.props.outcome).to.be.false;
});
it('should not be able to stake more REP than the sender owns', async () => {
await expect(dao.stakeOnValidationPool(1, 200, true)).to.be.revertedWith('Insufficient REP balance to cover stake');
});
});
describe('Delegated stake', () => {
it('should stake the lesser of the allowed amount or the owner\'s remaining balance', async () => {
// TODO: owner delegates stake and then loses rep
});
});
@@ -155,9 +170,9 @@
await expect(dao.evaluateOutcome(0)).to.emit(dao, 'ValidationPoolResolved').withArgs(0, true, true);
expect(await dao.memberCount()).to.equal(1);
expect(await dao.balanceOf(account1)).to.equal(100);
const pool = await dao.validationPools(0);
expect(pool.resolved).to.be.true;
expect(pool.outcome).to.be.true;
const pool = await dao.getValidationPool(0);
expect(pool.props.resolved).to.be.true;
expect(pool.props.outcome).to.be.true;
});
it('should not be able to evaluate outcome more than once', async () => {
@@ -169,7 +184,7 @@
it('should be able to evaluate outcome of second validation pool', async () => {
const init = () => initiateValidationPool();
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
expect(await dao.validationPoolCount()).to.equal(2);
expect(await dao.getValidationPoolCount()).to.equal(2);
time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(0)).to.emit(dao, 'ValidationPoolResolved').withArgs(0, true, true);
expect(await dao.balanceOf(account1)).to.equal(100);
@@ -183,7 +198,7 @@
const init = () => initiateValidationPool({ quorum: [1, 1] });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
expect(await dao.validationPoolCount()).to.equal(2);
expect(await dao.getValidationPoolCount()).to.equal(2);
time.increase(POOL_DURATION + 1);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, false, false);
});
@@ -192,7 +207,7 @@
beforeEach(async () => {
time.increase(POOL_DURATION + 1);
await dao.evaluateOutcome(0);
await dao.addPost([{ weightPPM: 1000000, authorAddress: account2 }], 'content-id-2', []);
await forum.addPost([{ weightPPM: 1000000, authorAddress: account2 }], 'content-id-2', []);
const init = () => initiateValidationPool({ postId: 'content-id-2' });
await expect(init()).to.emit(dao, 'ValidationPoolInitiated').withArgs(1);
time.increase(POOL_DURATION + 1);
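The staking tests in this file (for instance, `bindingPercent: 0` with `redistributeLosingStakes: false` returning every account's initial balance) suggest a settlement rule in which only the bound fraction of each stake is at risk. A speculative model consistent with those assertions; every name here is hypothetical and this is not the contract's implementation:

```javascript
// Speculative model: only bindingPercent% of each stake is bound; losing
// stakes forfeit their bound portion, and winners split that portion only
// when redistributeLosingStakes is true. Derived from test assertions.
function settle(stakes, outcome, bindingPercent, redistributeLosingStakes) {
  const bound = (s) => Math.floor((s.amount * bindingPercent) / 100);
  const winnersBound = stakes.filter((s) => s.inFavor === outcome)
    .reduce((sum, s) => sum + bound(s), 0);
  const losersBound = stakes.filter((s) => s.inFavor !== outcome)
    .reduce((sum, s) => sum + bound(s), 0);
  return stakes.map((s) => {
    if (s.inFavor !== outcome) return s.amount - bound(s); // unbound part returned
    const reward = redistributeLosingStakes && winnersBound > 0
      ? Math.floor((losersBound * bound(s)) / winnersBound)
      : 0;
    return s.amount + reward;
  });
}

// With bindingPercent 0 and no redistribution, both stakers recover in full:
console.log(settle(
  [{ amount: 10, inFavor: true }, { amount: 10, inFavor: false }],
  true, 0, false,
)); // → [ 10, 10 ]
```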


@@ -4,6 +4,7 @@
} = require('@nomicfoundation/hardhat-toolbox/network-helpers');
const { expect } = require('chai');
const { ethers } = require('hardhat');
const deployDAO = require('./util/deploy-dao');
describe('Work1', () => {
const WORK1_PRICE = 100;
@@ -12,14 +13,13 @@ describe('Work1', () => {
// Contracts are deployed using the first signer/account by default
const [account1, account2] = await ethers.getSigners();
const DAO = await ethers.getContractFactory('DAO');
const dao = await DAO.deploy();
const { dao, forum } = await deployDAO();
const Proposals = await ethers.getContractFactory('Proposals');
const proposals = await Proposals.deploy(dao.target);
const Work1 = await ethers.getContractFactory('Work1');
const work1 = await Work1.deploy(dao.target, proposals.target, WORK1_PRICE);
const work1 = await Work1.deploy(dao.target, forum.target, proposals.target, WORK1_PRICE);
await dao.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'some-content-id', []);
await forum.addPost([{ weightPPM: 1000000, authorAddress: account1 }], 'some-content-id', []);
const callbackData = ethers.AbiCoder.defaultAbiCoder().encode([], []);
await dao.initiateValidationPool(
'some-content-id',
@@ -36,7 +36,7 @@ describe('Work1', () => {
await dao.evaluateOutcome(0);
return {
dao, work1, proposals, account1, account2,
dao, forum, work1, proposals, account1, account2,
};
}
@@ -54,12 +54,9 @@ describe('Work1', () => {
let dao;
let work1;
let account1;
let account2;
beforeEach(async () => {
({
dao, work1, account1, account2,
} = await loadFixture(deploy));
({ dao, work1, account1 } = await loadFixture(deploy));
await expect(dao.stakeAvailability(work1.target, 50, STAKE_DURATION)).to.emit(work1, 'AvailabilityStaked').withArgs(0);
});
@@ -72,41 +69,53 @@
expect(stake.worker).to.equal(account1);
expect(stake.amount).to.equal(50);
expect(stake.endTime).to.equal(await time.latest() + STAKE_DURATION);
expect(await dao.allowance(account1, work1.target)).to.equal(50);
});
it('should not be able to stake availability without reputation value', async () => {
await expect(dao.stakeAvailability(work1.target, 0, STAKE_DURATION)).to.be.revertedWith('No stake provided');
});
it('should not be able to call acceptAvailability directly', async () => {
await expect(work1.acceptAvailability(account1, 50, STAKE_DURATION)).to.be.revertedWith('acceptAvailability must only be called by DAO contract');
});
it('should be able to extend the duration of an availability stake before it expires', async () => {
await time.increase(STAKE_DURATION / 2);
await expect(work1.extendAvailability(0, STAKE_DURATION)).to.emit(work1, 'AvailabilityStaked').withArgs(0);
expect(await work1.stakeCount()).to.equal(1);
await expect(dao.stakeAvailability(work1.target, 50, STAKE_DURATION)).to.emit(work1, 'AvailabilityStaked').withArgs(0);
expect(await work1.stakeCount()).to.equal(1);
expect(await dao.allowance(account1, work1.target)).to.equal(50);
});
it('should be able to extend the duration of an availability stake after it expires', async () => {
await time.increase(STAKE_DURATION * 2);
await work1.extendAvailability(0, STAKE_DURATION);
expect(await work1.stakeCount()).to.equal(1);
await dao.stakeAvailability(work1.target, 50, STAKE_DURATION);
expect(await work1.stakeCount()).to.equal(1);
expect(await dao.allowance(account1, work1.target)).to.equal(50);
});
it('should not be able to extend the duration of another worker\'s availability stake', async () => {
await time.increase(STAKE_DURATION * 2);
await expect(work1.connect(account2).extendAvailability(0, STAKE_DURATION)).to.be.revertedWith('Worker can only extend their own availability stake');
});
it('extending a stake before expiration should increase the end time by the given duration', async () => {
it('extending a stake before expiration should reset the end time to the new duration from the present', async () => {
await time.increase(STAKE_DURATION / 2);
await work1.extendAvailability(0, STAKE_DURATION * 2);
const expectedEndTime = await time.latest() + 2.5 * STAKE_DURATION;
expect(await work1.stakeCount()).to.equal(1);
await dao.stakeAvailability(work1.target, 50, STAKE_DURATION * 2);
expect(await work1.stakeCount()).to.equal(1);
const expectedEndTime = await time.latest() + 2 * STAKE_DURATION;
const stake = await work1.stakes(0);
expect(stake.endTime).to.be.within(expectedEndTime - 1, expectedEndTime);
expect(await dao.allowance(account1, work1.target)).to.equal(50);
});
it('extending a stake after expiration should restart the stake for the given duration', async () => {
await time.increase(STAKE_DURATION * 2);
await work1.extendAvailability(0, STAKE_DURATION * 2);
expect(await work1.stakeCount()).to.equal(1);
await dao.stakeAvailability(work1.target, 50, STAKE_DURATION * 2);
expect(await work1.stakeCount()).to.equal(1);
const expectedEndTime = await time.latest() + STAKE_DURATION * 2;
const stake = await work1.stakes(0);
expect(stake.endTime).to.be.within(expectedEndTime - 1, expectedEndTime);
expect(await dao.allowance(account1, work1.target)).to.equal(50);
});
});
@@ -121,7 +130,7 @@ describe('Work1', () => {
expect(await work1.requestCount()).to.equal(1);
const request = await work1.requests(0);
expect(request.customer).to.equal(account2);
expect(request.requestContentId).to.equal('req-content-id');
expect(request.requestPostId).to.equal('req-content-id');
});
it('should not be able to request work if there are no availability stakes', async () => {
@@ -160,26 +169,30 @@
await expect(requestWork()).to.be.revertedWith('No available worker stakes');
});
it('should not be able to extend a stake that has been assigned work', async () => {
it('after a stake has been assigned work, staking again should create a new stake', async () => {
const {
dao, work1, account2,
dao, work1, account1, account2,
} = await loadFixture(deploy);
await dao.stakeAvailability(work1.target, 50, STAKE_DURATION);
await work1.connect(account2).requestWork('req-content-id', { value: WORK1_PRICE });
await time.increase(STAKE_DURATION * 2);
await expect(work1.extendAvailability(0, STAKE_DURATION)).to.be.revertedWith('Stake has already been assigned work');
expect(await work1.stakeCount()).to.equal(1);
await dao.stakeAvailability(work1.target, 50, STAKE_DURATION);
expect(await work1.stakeCount()).to.equal(2);
expect(await dao.allowance(account1, work1.target)).to.equal(100);
});
});
describe('Work evidence and approval/disapproval', () => {
let dao;
let forum;
let work1;
let account1;
let account2;
beforeEach(async () => {
({
dao, work1, account1, account2,
dao, forum, work1, account1, account2,
} = await loadFixture(deploy));
await dao.stakeAvailability(work1.target, 50, STAKE_DURATION);
});
@@ -211,17 +224,15 @@
.to.emit(work1, 'WorkApprovalSubmitted').withArgs(0, true);
expect(await dao.balanceOf(work1.target)).to.equal(0);
expect(await dao.balanceOf(account1)).to.equal(100);
const post = await dao.posts('evidence-content-id');
const post = await forum.getPost('evidence-content-id');
expect(post.sender).to.equal(work1.target);
expect(post.id).to.equal('evidence-content-id');
const postAuthors = await dao.getPostAuthors('evidence-content-id');
expect(postAuthors).to.have.length(1);
expect(postAuthors[0].weightPPM).to.equal(1000000);
expect(postAuthors[0].authorAddress).to.equal(account1);
const pool = await dao.validationPools(1);
expect(pool.fee).to.equal(WORK1_PRICE);
expect(post.authors).to.have.length(1);
expect(post.authors[0].weightPPM).to.equal(1000000);
expect(post.authors[0].authorAddress).to.equal(account1);
const pool = await dao.getValidationPool(1);
expect(pool.props.fee).to.equal(WORK1_PRICE);
expect(pool.sender).to.equal(work1.target);
expect(pool.postId).to.equal('evidence-content-id');
expect(pool.props.postId).to.equal('evidence-content-id');
expect(pool.stakeCount).to.equal(1);
await time.increase(86401);
await expect(dao.evaluateOutcome(1)).to.emit(dao, 'ValidationPoolResolved').withArgs(1, true, true);
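The availability-stake tests above assert that staking again resets a stake's `endTime` to the present plus the new duration (rather than appending to the old end time). A toy model of that timing rule; the helper name and injected clock are assumptions, not contract code:

```javascript
// Toy model of the end-time rule the tests assert: re-staking always sets
// endTime = now + duration, whether or not the old stake has expired.
function restake(stake, duration, now) {
  return { ...stake, endTime: now + duration };
}

const t0 = 1000;
const stake = { worker: '0xWorker', amount: 50, endTime: t0 + 3600 };
// Extend halfway through: end time becomes now + new duration, not old end + duration.
const extended = restake(stake, 7200, t0 + 1800);
console.log(extended.endTime); // → 10000
```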


@@ -0,0 +1,27 @@
const { ethers } = require('hardhat');
const deployDAO = async () => {
const Reputation = await ethers.getContractFactory('Reputation');
const Bench = await ethers.getContractFactory('Bench');
const LightweightBench = await ethers.getContractFactory('LightweightBench');
const DAO = await ethers.getContractFactory('DAO');
const GlobalForum = await ethers.getContractFactory('GlobalForum');
const forum = await GlobalForum.deploy();
const reputation = await Reputation.deploy();
const bench = await Bench.deploy();
const lightweightBench = await LightweightBench.deploy();
const dao = await DAO.deploy(
reputation.target,
bench.target,
lightweightBench.target,
forum.target,
);
return {
forum,
dao,
reputation,
bench,
};
};
module.exports = deployDAO;


@@ -1,14 +1,24 @@
{
"localhost": {
"DAO": "0x57BDFFf79108E5198dec6268A6BFFD8B62ECfA38",
"Work1": "0xB8f0cd092979F273b752FDa060F82BF2745f192e",
"Onboarding": "0x8F00038542C87A5eAf18d5938B7723bF2A04A4e4",
"Proposals": "0x6c18eb38b7450F8DaE5A5928A40fcA3952493Ee4"
"DAO": "0x3734B0944ea37694E85AEF60D5b256d19EDA04be",
"Work1": "0x8BDA04936887cF11263B87185E4D19e8158c6296",
"Onboarding": "0x8688E736D0D72161db4D25f68EF7d0EE4856ba19",
"Proposals": "0x3287061aDCeE36C1aae420a06E4a5EaE865Fe3ce",
"Rollup": "0x71cb20D63576a0Fa4F620a2E96C73F82848B09e1",
"Work2": "0x76Dfe9F47f06112a1b78960bf37d87CfbB6D6133",
"Reputation": "0xEAefe601Aad7422307B99be65bbE005aeA966012",
"Forum": "0x79e365342329560e8420d7a0f016633d7640cB18",
"Bench": "0xC0f00E5915F9abE6476858fD1961EAf79395ea64"
},
"sepolia": {
"DAO": "0x8e5bd58B2ca8910C5F9be8de847d6883B15c60d2",
"Work1": "0x1708A144F284C1a9615C25b674E4a08992CE93e4",
"Onboarding": "0xb21D4c986715A1adb5e87F752842613648C20a7B",
"Proposals": "0x930c47293F206780E8F166338bDaFF3520306032"
"DAO": "0xBA2e65ae29667E145343bD5Fd655A72dcf873b08",
"Work1": "0x251dB891768ea85DaCA6bb567669F97248D09Fe3",
"Onboarding": "0x78FC8b520001560A9D7a61072855218320C71BDC",
"Proposals": "0xA888cDC4Bd80d402b14B1FeDE5FF471F1737570c",
"Reputation": "0x62cc0035B17F1686cE30320B90373c77fcaA58CD",
"Forum": "0x51b5Af12707e0d879B985Cb0216bFAC6dca85501",
"Bench": "0x98d9F0e97Af71936747819040ddBE896A548ef4d",
"Rollup": "0x678DC2c846bfDCC813ea27DfEE428f1d7f2521ED",
"Work2": "0x609102Fb6cA15da80D37E8cA68aBD5e1bD9C855B"
}
}

File diff suppressed because one or more lines are too long (8 files)

Some files were not shown because too many files have changed in this diff