This is a Node API which in turn uses a private Geth PoA Ethereum blockchain as its datastore. Most of the actual business logic consists of on-chain smart contracts. It all (optionally) runs inside Docker containers.
These services have been Dockerized so that they can be launched via a single docker-compose command, once the various configuration files have been appropriately populated. Automated integration tests exist for the smart contracts and the API. Details on how to build and launch the app are below.
To get this code up and running locally, you're going to need:
- The code, obviously, cloned from this repo.
- Docker and docker-compose installed locally, to run it in containers, or
- Node and truffle installed locally, to run it directly
Note also that the instructions here assume a Unix environment (and it was developed on OS X), so any necessary operating-system adjustments are, I'm afraid, left as an exercise for the reader.
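If you want to sanity-check the prerequisites before going any further, something like this from a shell will tell you what you have (the Docker pair matters for the container route, Node and Truffle for the direct route):

```sh
# Confirm the prerequisites are installed; each command just prints a version
docker --version
docker-compose --version
node --version
truffle version
```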
- Ensure you have the prerequisites mentioned above.
- Populate the YKarma configuration file for the API service:
  - Copy the `.example.env.production` file in the "server" top-level directory to `.env.production`
  - Edit the values there per your needs. In particular, change the admin URL to your email or other URL.
  - Copy the
- Breathe a sigh of relief that the annoying config-file stuff is now done and you shouldn't need to deal with it again.
- Build the app with docker-compose:
  - From a shell in the project root directory, run `docker-compose build`
  - Note you'll need to repeat this step after code changes for those to be promoted into the containers.
- Run the app with docker-compose:
  - From a shell in the project root directory, run `docker-compose up`
  - A lot of stuff happens on first run: launching the blockchain, compiling all the smart contracts and migrating them into the blockchain, communicating the contract address to the API server, etc. This can take a few minutes. Wait for it to settle down into a steady stream of mining empty blocks, and/or wait for a "hostname" log message from the "node_1" container, before you...
  - Open a browser and point it to "localhost"
- Profit!
  - Use the API: log in with your admin URL
  - Send karma to other URLs to add them to the built-in test community
  - Create rewards, purchase rewards, etc.
  - Use this code as a basis for your own experimentation!
Note that when running the blockchain via Docker, the actual data directory is the "geth/cbdata" directory under the project root, which is shared with (and written to by) Docker as a volume. If you want to restart with a brand-new, unsullied blockchain, just delete that directory, delete the `YKARMA_ADDRESS` line in `server/.env.production`, and run `docker-compose up` again.
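For example, a full reset from the project root might look like this (the `sed` line assumes macOS/BSD sed; adjust the flags for your platform):

```sh
# Start over with a fresh blockchain (run from the project root)
rm -rf geth/cbdata                                     # wipe the Docker-mounted chain data
sed -i '' '/^YKARMA_ADDRESS/d' server/.env.production  # drop the stale contract address (BSD sed syntax)
docker-compose up                                      # relaunch; the contracts redeploy and a new address is written
```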
Docker deployment is relatively fast (other than the configuration-file dance, but that's software for you these days) but is a little too arms-length for a development environment. If you want to build on this code, or if for some reason Docker isn't cooperating, here's how to get it up and running directly on your machine, with no container abstraction:
- Open a shell and run `ganache-cli -u 0` (installed as part of Truffle) to get a local blockchain up and running. For extra debug info you may wish to run `ganache-cli -u 0 --noVMErrorsOnRPCResponse`
- Open another shell, navigate to the "ethereum" top-level directory of this repo, and run `truffle test`. This should compile the YKarma smart contracts, write them to the local blockchain, and run some JavaScript test code against them. The output should include the admin email from the file `server/.env` and also the results of the "Paces" integration test, which should pass.
Note that YKarma.sol, the fundamental smart contract with which the JavaScript code interfaces, is very nearly at the maximum size limit for an Ethereum smart contract; if you want to add to it you may need to split it up.
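If you're curious how close you are to that ceiling, the EIP-170 limit is 24,576 bytes of deployed bytecode, and you can measure the compiled artifact against it. This is just a sketch, assuming the standard Truffle artifact lands at `ethereum/build/contracts/YKarma.json`:

```js
// check-size.js -- sketch: compare YKarma's deployed bytecode size to the EIP-170 limit.
// Assumes a Truffle build artifact at ethereum/build/contracts/YKarma.json.
const artifact = require('./ethereum/build/contracts/YKarma.json');

const LIMIT = 24576; // EIP-170 maximum deployed contract size, in bytes
// deployedBytecode is a 0x-prefixed hex string: two hex characters per byte
const bytes = (artifact.deployedBytecode.length - 2) / 2;

console.log(`YKarma: ${bytes} bytes deployed (${((bytes / LIMIT) * 100).toFixed(1)}% of the ${LIMIT}-byte limit)`);
```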
Both the API server and the React client have a configuration file. Their use is slightly confusing because this code can run in two environments: Docker and local. The Docker environment uses the `.env.production` files to set its environment variables; the local environment simply uses `.env` files.
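For what it's worth, the switch between the two presumably comes down to something like the following dotenv call; this is a sketch of the idea (assuming `NODE_ENV` is what distinguishes the Docker build), not the repo's actual bootstrap code:

```js
// Sketch only: load .env.production when running as "production" (the Docker case),
// and plain .env otherwise. The real server startup may differ.
const path = require('path');
const dotenv = require('dotenv');

const envFile = process.env.NODE_ENV === 'production' ? '.env.production' : '.env';
dotenv.config({ path: path.resolve(__dirname, envFile) });

console.log('Loaded environment from', envFile);
```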
So, to get the Node API up and running locally:
- Copy the `.example.env.development` file in the "server" top-level directory to `.env`
- Edit the values there per your needs. In particular, change the admin URL to your URL.
- Open a shell and run `ganache-cli -u 0` as above.
- Open another shell, navigate to the "ethereum" top-level directory of this repo, and run `truffle deploy` (this should compile and write the smart contracts, again, and write the address of the resulting YKarma interface contract to `server/.env` for use by the Node code).
- If you want to run the automated tests locally, instead run `TRUFFLE_ENV=test truffle deploy`, which populates the blockchain with a few test accounts and rewards subsequently assumed by the API test code.
- Open a third shell, navigate to the "server" top-level directory of this repo, run `npm install`, and then run `npm run test` to run the API in test mode.
- Open a fourth shell, navigate to the "server" top-level directory of this repo, and run `mocha` to run the API tests. They should pass.
- Once you've established that tests are passing, stop all of those, restart ganache, run `truffle deploy` without `TRUFFLE_ENV` set, and run `npm start` rather than `npm run test`. Voila! The API runs locally on port 3001.
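Condensed into shell form, the test-mode sequence above looks roughly like this, with each block in its own terminal:

```sh
# Shell 1: local blockchain
cd ethereum && ganache-cli -u 0

# Shell 2: deploy the contracts plus the test fixtures
cd ethereum && TRUFFLE_ENV=test truffle deploy

# Shell 3: API server in test mode
cd server && npm install && npm run test

# Shell 4: the API tests themselves
cd server && mocha
```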
The above looks more painful and difficult than it is because of all the config files. Once those are set up, running locally consists of opening three shell windows and running, in this order:
- (in the "ethereum" directory) `ganache-cli -u 0`
- (in the "ethereum" directory) `truffle deploy`
- (in the "server" directory) `npm start`
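If the three windows get old, a throwaway script along these lines (not part of the repo, and the `sleep` is an admittedly crude way to wait for ganache) does the same dance:

```sh
#!/usr/bin/env bash
# Hypothetical convenience script, not included in the repo.
set -e
cd ethereum
ganache-cli -u 0 > ../ganache.log 2>&1 &   # local blockchain in the background
sleep 5                                    # crude wait for ganache to start listening
truffle deploy                             # compile the contracts, write the address to server/.env
cd ../server
npm start                                  # API on port 3001
```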
You'll need to restart Node and React if you change the API code, and obviously you'll need to re-run `truffle deploy` as well if you change the smart contracts.
If you want to run this in production, eg on a DigitalOcean droplet, you should obviously use the Docker configuration. You should be able to run it well on a fairly small instance (eg 4GB memory). All of the Quick Launch instructions above apply, along with a few other notes:
- Again, when running under Docker the actual blockchain data is the local "geth/cbdata" directory, shared with and written to by Docker as a volume
- If you want your site secured via TLS, you probably want to use Let's Encrypt via Docker; here's a good guide to doing that. If you have SSL set up, you want to replace `nginx.conf` in "server" with `nginx.conf.https`, and there are two more lines to uncomment in `docker-compose.yml`
- You'll want to set up a cron job to refresh users' karma every week, and another to refresh the Let's Encrypt cert. An example crontab for your server can be found in the "cron" directory under "server".
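As a purely illustrative sketch (the real example lives in the "cron" directory under "server", and its actual commands may well differ), the crontab shape is roughly:

```sh
# Hypothetical crontab sketch; see server/cron for the real example.
# Weekly karma refresh, Mondays at 04:00 (the script path is an assumption):
0 4 * * 1  /path/to/ykarma/server/cron/refresh-karma.sh
# Daily Let's Encrypt renewal attempt (the script path is an assumption):
0 3 * * *  /path/to/ykarma/server/cron/renew-cert.sh
```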
The hard parts, technically, are yet to come, but much of the tedious work is now done. (Not counting the inevitable bug fixes and/or design changes which will necessitate the hair-tearing idea of updating smart contracts and data in a production blockchain, but we'll burn those bridges when we come to them.)
The technical downsides to this architecture are obvious. Blockchains and blockchain development are both far less efficient than databases and database development. As mentioned, refactoring which involves production data is very difficult, as, rather than, say, running a simple ALTER TABLE statement, you have to either add to the existing, immutable, on-chain data; or transform it all with a long list of blockchain transactions; or launch a new chain, copy the old data over, and transform it en route. Even fetching data from the chain can be surprisingly convoluted. Etc etc etc.
I was tempted to add more abstraction layers, eg some kind of ORM data layer between the API and the blockchain, but the idea was for this to be illustrative as well as useful, so I went with just trying to make the code simple, readable, and straightforward.
I'm a polyglot programmer, and neither JavaScript nor Solidity is my first or even my fifth language of choice (though like many I have warmed to JS over the years), which will probably be very apparent to serious Node developers when they look at the code.
If I were building this to seriously scale I ... well, I wouldn't have used a blockchain. Given its necessity, I'd probably add that ORM data layer and also use it to cache data for reads, along with a messaging queue for writes, and maybe Kubernetes to support an arbitrary number of blockchain nodes and web servers, and then worry about how/when to invalidate the cache(s) ... but obviously caching would add a great deal of complexity to the system. (There's a little Redis caching in there now, but only a little.)