Fix pipeline and tests #9

Closed · wants to merge 10 commits
111 changes: 0 additions & 111 deletions .github/workflows/develop-pipeline.yml

This file was deleted.

22 changes: 1 addition & 21 deletions .github/workflows/master-pipeline.yml
@@ -43,22 +43,6 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_S3_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_S3_SECRET_ACCESS_KEY }}
aws-region: ${{ secrets.AWS_S3_REGION }}

- name: Download latest DB backup from S3
run: |
FILENAME=$(aws s3 ls ${{ secrets.AWS_S3_BUCKET_PATH_PROD }}/ | sort | tail -n 1 | awk '{print $4}')
aws s3 cp ${{ secrets.AWS_S3_BUCKET_PATH_PROD }}/$FILENAME /tmp/db_backup.zip

- name: Unzip DB backup
run: |
unzip /tmp/db_backup.zip -d /tmp
mv /tmp/backups/givethio-db/*.sql /tmp/backups/givethio-db/db_backup.sql

- name: Wait for PostgreSQL to become ready
run: |
@@ -69,9 +53,6 @@ jobs:
sleep 1
done

- name: Restore DB backup
run: PGPASSWORD=postgres psql -h localhost -p 5443 -U postgres -d givethio < /tmp/backups/givethio-db/db_backup.sql

- name: Use Node.js
uses: actions/setup-node@v1
with:
@@ -113,7 +94,6 @@ jobs:
SOLANA_DEVNET_NODE_RPC_URL: ${{ secrets.SOLANA_DEVNET_NODE_RPC_URL }}
SOLANA_MAINNET_NODE_RPC_URL: ${{ secrets.SOLANA_MAINNET_NODE_RPC_URL }}
MPETH_GRAPHQL_PRICES_URL: ${{ secrets.MPETH_GRAPHQL_PRICES_URL }}
GIV_POWER_SUBGRAPH_URL: ${{ secrets.GIV_POWER_SUBGRAPH_URL }}

publish:
needs: test
@@ -128,7 +108,7 @@ jobs:
username: ${{ github.actor }}
password: ${{ github.token }}
registry: ghcr.io
repository: giveth/impact-graph
repository: generalmagicio/qacc-be # todo: check this to be correct, I just set that to be different from giveth/impact-graph
add_git_labels: true
# Add branch name to docker image tag @see{@link https://github.com/docker/build-push-action/tree/releases/v1#tag_with_ref}
tag_with_ref: true
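For reference, the DB-restore steps this PR removes can be sketched as the shell sequence below. The S3 calls are stubbed out with a sample listing, and the bucket path, filenames, and database name are illustrative, not the actual secret values:

```shell
# Sketch of the removed restore flow; only the filename-selection logic runs here.
set -eu

# `aws s3 ls <bucket>/` prints "date time size name"; the removed step picked
# the lexicographically last line, i.e. the newest timestamped backup.
listing='2024-01-01 03:00:00   1024 backup-2024-01-01.zip
2024-01-02 03:00:00   1024 backup-2024-01-02.zip
2024-01-03 03:00:00   1024 backup-2024-01-03.zip'

FILENAME=$(printf '%s\n' "$listing" | sort | tail -n 1 | awk '{print $4}')
echo "$FILENAME"   # backup-2024-01-03.zip

# The real workflow then continued with (requires AWS credentials):
#   aws s3 cp "$BUCKET_PATH/$FILENAME" /tmp/db_backup.zip
#   unzip /tmp/db_backup.zip -d /tmp
#   PGPASSWORD=postgres psql -h localhost -p 5443 -U postgres -d givethio \
#     < /tmp/backups/givethio-db/db_backup.sql
```

Sorting the listing works as a newest-first pick only because the backup names embed an ISO-style date, which sorts lexicographically in chronological order.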
40 changes: 0 additions & 40 deletions .github/workflows/staging-pipeline.yml
@@ -68,22 +68,6 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v1
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_S3_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_S3_SECRET_ACCESS_KEY }}
aws-region: ${{ secrets.AWS_S3_REGION }}

- name: Download latest DB backup from S3
run: |
FILENAME=$(aws s3 ls ${{ secrets.AWS_S3_BUCKET_PATH_STAGING }}/ | sort | tail -n 1 | awk '{print $4}')
aws s3 cp ${{ secrets.AWS_S3_BUCKET_PATH_STAGING }}/$FILENAME /tmp/db_backup.zip

- name: Unzip DB backup
run: |
unzip /tmp/db_backup.zip -d /tmp
mv /tmp/backups/givethio-staging/*.sql /tmp/backups/givethio-staging/db_backup.sql

- name: Wait for PostgreSQL to become ready
run: |
@@ -94,9 +78,6 @@ jobs:
sleep 1
done

- name: Restore DB backup
run: PGPASSWORD=postgres psql -h localhost -p 5443 -U postgres -d givethio < /tmp/backups/givethio-staging/db_backup.sql

- name: Use Node.js
uses: actions/setup-node@v1
with:
@@ -117,28 +98,7 @@ jobs:
- name: Run tests
run: npm run test
env:
ETHERSCAN_API_KEY: ${{ secrets.ETHERSCAN_API_KEY }}
XDAI_NODE_HTTP_URL: ${{ secrets.XDAI_NODE_HTTP_URL }}
INFURA_API_KEY: ${{ secrets.INFURA_API_KEY }}
INFURA_ID: ${{ secrets.INFURA_ID }}
POLYGON_SCAN_API_KEY: ${{ secrets.POLYGON_SCAN_API_KEY }}
OPTIMISTIC_SCAN_API_KEY: ${{ secrets.OPTIMISTIC_SCAN_API_KEY }}
CELO_SCAN_API_KEY: ${{ secrets.CELO_SCAN_API_KEY }}
CELO_ALFAJORES_SCAN_API_KEY: ${{ secrets.CELO_ALFAJORES_SCAN_API_KEY }}
ARBITRUM_SCAN_API_KEY: ${{ secrets.ARBITRUM_SCAN_API_KEY }}
ARBITRUM_SEPOLIA_SCAN_API_KEY: ${{ secrets.ARBITRUM_SEPOLIA_SCAN_API_KEY }}
BASE_SCAN_API_KEY: ${{ secrets.BASE_SCAN_API_KEY }}
BASE_SEPOLIA_SCAN_API_KEY: ${{ secrets.BASE_SEPOLIA_SCAN_API_KEY }}
ZKEVM_MAINNET_SCAN_API_KEY: ${{ secrets.ZKEVM_MAINNET_SCAN_API_KEY }}
ZKEVM_CARDONA_SCAN_API_KEY: ${{ secrets.ZKEVM_CARDONA_SCAN_API_KEY }}
MORDOR_ETC_TESTNET: ${{ secrets.MORDOR_ETC_TESTNET }}
ETC_NODE_HTTP_URL: ${{ secrets.ETC_NODE_HTTP_URL }}
DROP_DATABASE: ${{ secrets.DROP_DATABASE_DURING_TEST_STAGING }}
SOLANA_TEST_NODE_RPC_URL: ${{ secrets.SOLANA_TEST_NODE_RPC_URL }}
SOLANA_DEVNET_NODE_RPC_URL: ${{ secrets.SOLANA_DEVNET_NODE_RPC_URL }}
SOLANA_MAINNET_NODE_RPC_URL: ${{ secrets.SOLANA_MAINNET_NODE_RPC_URL }}
MPETH_GRAPHQL_PRICES_URL: ${{ secrets.MPETH_GRAPHQL_PRICES_URL }}
GIV_POWER_SUBGRAPH_URL: ${{ secrets.GIV_POWER_SUBGRAPH_URL }}

publish:
needs: test
3 changes: 1 addition & 2 deletions .mocharc.all-test.json
@@ -3,8 +3,7 @@
"spec": [
"./test/pre-test-scripts.ts",
"./src/**/*.test.ts",
"./src/**/**/*.test.ts",
"./migration/tests/*.test.ts"
"./src/**/**/*.test.ts"
],
"timeout": 90000,
"exit": true,
23 changes: 0 additions & 23 deletions README.md
@@ -71,10 +71,6 @@ You can see logs beautifully with this command

```

## Features

- [Power Boosting Specs](./docs/powerBoosting.md)

### Authentication

### Start fast
@@ -246,20 +242,6 @@ in below image links

[![](https://mermaid.ink/img/pako:eNrFVcFu2zAM_RXCl7VAmgA9-hCga9euQJcNS7tTgEGVaEeNLXkSnSwo-u-jJDtJ03TYgA7zyYbI98j3aOoxk1Zhlmcef7RoJF5oUTpRzwzwIyRZB3ceXfpuhCMtdSMMwaWzhtColyfvhVwcPJhOQHiYWqlFBROklXULCGEzk4J7TDgZj3uYHM5FVYFQaoKrlPvF2UJXCHVLgrQ1cOQ5UJsSfDzvoQUjNc4-oKRv6HShZQy_tK6-VseJ0lhCsEt0W74pEmjTtARKkOBXcKi0j3AlGnSCUwSQ4wQmPWcBX8X6JEIItK6C-zXISqOhazWIiI7ruuODAQyHw4TQ5cFJEKBXI4evDql1JsAcEGo64YgOD1o2C8h2SoDppVDKofcpm32IBMHYHM78YpPlUCiQ1rCeFPutbKk7c0I0bAhvwsFLom5uKoIL7aV13RT0rLu2vih55Hfd9SPJvt9z9EglKFhpmvNMSm7kO1lGCd7w2Lo1eHKs85ZsI-R4HKq9Yktj_doUNjiRUG4DyPMK90pMsveZKXQz26HNK2vLCt-iyzIiDVKX4a_8o-6iGdPwEVOYoORe_6q9neDouVZJ3W3mgcG-wFRiFwtHoOmdh4cVHXc_Smr5OgheOFsDzdHhjoJbDW-0WbBM5i1UrDqs_6bjvnb7HHGbzYkan49GotHDvuKhtPVoeTqq8TdUYdG9Oo8HbLqKu6y3IO2xKMzu9oqG0dzZFeBPiU3cqbpIjjIoLEWlX-c4560RN-IyLllU8MyciO6wCjGagrIHdjIUvJS3E_H57O7242naPuDbKOxhSbdLsuh3YtmuPfi5bSt-10tkvbi4IHNg4UZZg6Z1jfW4J-AefSG4fPWPebNBVqOrhVZ8Cz_G2zDjX4WnIMv5VWEh2opm2cw8cWjb8J2EH5TmiznLC1F5HGSiJTtdG5nl5Frsg7qbvIt6-gW6nqHe)](https://mermaid-js.github.io/mermaid-live-editor/edit#pako:eNrFVcFu2zAM_RXCl7VAmgA9-hCga9euQJcNS7tTgEGVaEeNLXkSnSwo-u-jJDtJ03TYgA7zyYbI98j3aOoxk1Zhlmcef7RoJF5oUTpRzwzwIyRZB3ceXfpuhCMtdSMMwaWzhtColyfvhVwcPJhOQHiYWqlFBROklXULCGEzk4J7TDgZj3uYHM5FVYFQaoKrlPvF2UJXCHVLgrQ1cOQ5UJsSfDzvoQUjNc4-oKRv6HShZQy_tK6-VseJ0lhCsEt0W74pEmjTtARKkOBXcKi0j3AlGnSCUwSQ4wQmPWcBX8X6JEIItK6C-zXISqOhazWIiI7ruuODAQyHw4TQ5cFJEKBXI4evDql1JsAcEGo64YgOD1o2C8h2SoDppVDKofcpm32IBMHYHM78YpPlUCiQ1rCeFPutbKk7c0I0bAhvwsFLom5uKoIL7aV13RT0rLu2vih55Hfd9SPJvt9z9EglKFhpmvNMSm7kO1lGCd7w2Lo1eHKs85ZsI-R4HKq9Yktj_doUNjiRUG4DyPMK90pMsveZKXQz26HNK2vLCt-iyzIiDVKX4a_8o-6iGdPwEVOYoORe_6q9neDouVZJ3W3mgcG-wFRiFwtHoOmdh4cVHXc_Smr5OgheOFsDzdHhjoJbDW-0WbBM5i1UrDqs_6bjvnb7HHGbzYkan49GotHDvuKhtPVoeTqq8TdUYdG9Oo8HbLqKu6y3IO2xKMzu9oqG0dzZFeBPiU3cqbpIjjIoLEWlX-c4560RN-IyLllU8MyciO6wCjGagrIHdjIUvJS3E_H57O7242naPuDbKOxhSbdLsuh3YtmuPfi5bSt-10tkvbi4IHNg4UZZg6Z1jfW4J-AefSG4fPWPebNBVqOrhVZ8Cz_G2zDjX4WnIMv5VWEh2opm2cw8cWjb8J2EH5TmiznLC1F5HGSiJTtdG5nl5Frsg7qbvIt6
-gW6nqHe)

### Power Snapshot

Impact-graph supports ranking projects by the power users have boosted to them.
Users who hold GIVpower can boost a project by allocating a portion (a percentage) of their GIVpower to it; impact-graph then regularly takes snapshots of user GIVpower balances and boost percentages.
At the end of each GIVbacks round (14 days), the average allocated power becomes each project's effective power balance.

The snapshotting mechanism is implemented with the help of a database cron job and impact-graph's support for reading historic user balances on the blockchain.

##### Database Snapshot

Snapshot taking on the database side is implemented with the help of the `pg_cron` extension on the Postgres database.
On a regular interval (defined by a cron expression), it calls a DB procedure named public."TAKE_POWER_BOOSTING_SNAPSHOT".
This procedure creates a new power_snapshot record, copies the power boosting percentages to another table, and associates them with the new power_snapshot record.

###### Cron Job Creation

Cron job creation for the test environment is already implemented in dbCronRepository.ts, together with a modified Docker image that has the `pg_cron` extension enabled.
Expand Down Expand Up @@ -303,8 +285,3 @@ SELECT CRON.schedule(
'*/5 * * * *',
$$CALL public."ARCHIVE_POWER_BOOSTING_OLD_SNAPSHOT_DATA"()$$);
```
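The archival schedule above has a counterpart for taking the snapshot itself; a matching `pg_cron` schedule for the procedure named earlier would look roughly like this (the interval is illustrative, not the production value):

```sql
-- Hypothetical schedule for the snapshot procedure; '*/5 * * * *' is an
-- example interval, not the real configuration.
SELECT CRON.schedule(
  '*/5 * * * *',
  $$CALL public."TAKE_POWER_BOOSTING_SNAPSHOT"()$$);
```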

##### User GIVpower balance snapshot

impact-graph monitors the power_snapshot table; whenever a new record is created, it finds the corresponding Ethereum block number and fills it in on the snapshot record.
Then, for every user who has a percentage snapshot, it fills the balance snapshot table with that user's balance at the corresponding block number, with the help of the impact-graph block filter.
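The balance-snapshot loop described above can be sketched as follows. The interfaces and function names here are hypothetical stand-ins, not impact-graph's actual API:

```typescript
// Illustrative sketch of the fill-snapshot loop; the source/repository
// interfaces below are invented for this example.
interface PowerSnapshot { id: number; time: Date; blockNumber?: number; }

interface BalanceSource {
  getBlockNumberAt(time: Date): Promise<number>;
  getBalance(user: string, block: number): Promise<string>;
}

async function fillSnapshot(
  snapshot: PowerSnapshot,
  usersWithBoosts: string[],
  source: BalanceSource,
  saveBalance: (snapshotId: number, user: string, balance: string) => Promise<void>,
): Promise<void> {
  // 1. Resolve the chain block that corresponds to the snapshot timestamp.
  const block = await source.getBlockNumberAt(snapshot.time);
  snapshot.blockNumber = block;
  // 2. For every user with a boosting percentage in this snapshot,
  //    record their GIVpower balance at that block.
  for (const user of usersWithBoosts) {
    const balance = await source.getBalance(user, block);
    await saveBalance(snapshot.id, user, balance);
  }
}
```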
37 changes: 0 additions & 37 deletions config/example.env
@@ -115,21 +115,6 @@ TWITTER_CLIENT_ID=
TWITTER_CLIENT_SECRET=
TWITTER_CALLBACK_URL=https://dev.serve.giveth.io/socialProfiles/callback/twitter

GIVPOWER_BOOSTING_USER_PROJECTS_LIMIT=5
GIVPOWER_BOOSTING_PERCENTAGE_PRECISION=2

#GIV_POWER_SUBGRAPH_ADAPTER=givPower
GIV_POWER_SUBGRAPH_ADAPTER=mock
GIV_POWER_SUBGRAPH_URL=https://api.thegraph.com/subgraphs/name/aminlatifi/giveconomy-xdai-deployment-seven
GIV_POWER_UNIPOOL_CONTRACT_ID=0xdaea66adc97833781139373df5b3bced3fdda5b1
FIRST_GIVBACK_ROUND_TIME_STAMP=1640361600
# GIVpower round 14days * 24 hours * 3600seconds =1209600
GIVPOWER_ROUND_DURATION=1209600
FILL_POWER_SNAPSHOT_BALANCE_SERVICE_ACTIVE=true
FILL_POWER_SNAPSHOT_BALANCE_CRONJOB_EXPRESSION=0 0 * * * *
NUMBER_OF_FILLING_POWER_SNAPSHOT_BALANCE_CONCURRENT_JOB=5


#NOTIFICATION_CENTER_ADAPTER=notificationCenter
NOTIFICATION_CENTER_ADAPTER=mock
NOTIFICATION_CENTER_BASE_URL=
@@ -144,20 +129,9 @@ PROJECT_UPDATES_EXPIRED_ADDITIONAL_REVOKE_DAYS=30
PROJECT_REVOKE_SERVICE_ACTIVE=true
PROJECT_UPDATES_FIRST_REVOKE_BATCH_DATE=2022-10-22

FILL_POWER_SNAPSHOT_SERVICE_ACTIVE=true

#Every 30 minutes
UPDATE_POWER_ROUND_CRONJOB_EXPRESSION=10 */30 * * * *
UPDATE_POWER_SNAPSHOT_SERVICE_ACTIVE=true

GIVBACK_MIN_FACTOR=0.5
GIVBACK_MAX_FACTOR=0.8

ENABLE_DB_POWER_BOOSTING_SNAPSHOT=

DB_POWER_BOOSTING_SNAPSHOT_CRONJOB_EXPRESSION=
ARCHIVE_POWER_BOOSTING_OLD_SNAPSHOT_DATA_CRONJOB_EXPRESSION=

PROJECT_FILTERS_THREADS_POOL_CONCURRENCY=1
PROJECT_FILTERS_THREADS_POOL_NAME=ProjectFiltersThreadPool
PROJECT_FILTERS_THREADS_POOL_SIZE=4
@@ -181,12 +155,6 @@ SERVICE_NAME=example
OPTIMISM_NODE_HTTP_URL=https://optimism-mainnet.public.blastapi.io/
OPTIMISM_SEPOLIA_NODE_HTTP_URL=

####################################### INSTANT BOOSTING #################################
# OPTIONAL - default: false
ENABLE_INSTANT_BOOSTING_UPDATE=true
# OPTIONAL - default: Every 5 minutes
INSTANT_BOOSTING_UPDATE_CRONJOB_EXPRESSION=0 */5 * * * *

# Gitcoin API
GITCOIN_ADAPTER=mock
GITCOIN_PASSPORT_API=
@@ -195,11 +163,6 @@ GITCOIN_SCORER_ID=
# OPTIONAL - default: Every 10 minutes
CHECK_QF_ROUND_ACTIVE_STATUS_CRONJOB_EXPRESSION=*/10 * * * *

BALANCE_AGGREGATOR_BASE_URL=https://dev.serve.giveth.io/givpower-balance-aggregator
POWER_BALANCE_AGGREGATOR_ADAPTER=powerBalanceAggregator
#POWER_BALANCE_AGGREGATOR_ADAPTER=mock
NUMBER_OF_BALANCE_AGGREGATOR_BATCH=20

# OPTIONAL - default: 60000 (1 minute
QF_ROUND_ESTIMATED_MATCHING_CACHE_DURATION=60000

27 changes: 0 additions & 27 deletions config/test.env
@@ -114,24 +114,6 @@ LINKEDIN_CLIENT_ID=
LINKEDIN_CLIENT_SECRET=
LINKEDIN_REDIRECT_URL=http://localhost:3040/socialProfiles/callback/linkedin

GIVPOWER_BOOSTING_USER_PROJECTS_LIMIT=5
GIVPOWER_BOOSTING_PERCENTAGE_PRECISION=2

# GIVpower round 14days * 24 hours * 3600seconds =1209600
GIVPOWER_ROUND_DURATION=1209600

GIV_POWER_SUBGRAPH_ADAPTER=mock
FIRST_GIVBACK_ROUND_TIME_STAMP=1640361600
GIV_POWER_SUBGRAPH_URL=
_GIV_POWER_SUBGRAPH_URL=http://localhost:8000/subgraphs/name/local/staging
GIV_POWER_UNIPOOL_CONTRACT_ID=0xdaea66adc97833781139373df5b3bced3fdda5b1
FILL_POWER_SNAPSHOT_BALANCE_SERVICE_ACTIVE=false
FILL_POWER_SNAPSHOT_BALANCE_CRONJOB_EXPRESSION=0 * * * * *
# Set 10 to do it fast while running tests
NUMBER_OF_FILLING_POWER_SNAPSHOT_BALANCE_CONCURRENT_JOB=20
ENABLE_DB_POWER_BOOSTING_SNAPSHOT=false
DB_POWER_BOOSTING_SNAPSHOT_CRONJOB_EXPRESSION =0 0 1 1 *

NOTIFICATION_CENTER_ADAPTER=mock
NOTIFICATION_CENTER_BASE_URL=
NOTIFICATION_CENTER_USERNAME=
@@ -144,14 +126,10 @@ PROJECT_UPDATES_VERIFIED_REVOKED_DAYS=104
PROJECT_UPDATES_FIRST_REVOKE_BATCH_DATE=2021-01-22
PROJECT_REVOKE_SERVICE_ACTIVE=true

UPDATE_POWER_ROUND_CRONJOB_EXPRESSION=0 0 * * *
UPDATE_POWER_SNAPSHOT_SERVICE_ACTIVE=false

GIVBACK_MIN_FACTOR=0.5
GIVBACK_MAX_FACTOR=0.8

ONRAMPER_SECRET=secreto
ENABLE_GIV_POWER_TESTING=true
THIRD_PARTY_PROJECTS_ADMIN_USER_ID=4

PROJECT_FILTERS_THREADS_POOL_CONCURRENCY=1
@@ -172,11 +150,6 @@ RECURRING_DONATION_VERIFICAITON_EXPIRATION_HOURS=24
POLYGON_MAINNET_NODE_HTTP_URL=https://polygon-rpc.com
OPTIMISM_NODE_HTTP_URL=https://optimism-mainnet.public.blastapi.io

BALANCE_AGGREGATOR_BASE_URL=https://dev.serve.giveth.io/givpower-balance-aggregator
POWER_BALANCE_AGGREGATOR_ADAPTER=mock
NUMBER_OF_BALANCE_AGGREGATOR_BATCH=7


# ! millisecond cache, if we increase cache in test ENV we might get some errors in tests
QF_ROUND_ESTIMATED_MATCHING_CACHE_DURATION=1
# ! millisecond cache, if we increase cache in test ENV we might get some errors in tests