Progress

Well that’s been a fun and productive couple of months!

ConsenSys Academy 2018

I’m now officially a ConsenSys certified dApp Developer 👊! (Certificate apparently on its way)

The ConsenSys Developer course was definitely worthwhile. I covered a lot of Blockchain theory while following the course lectures and taking the quizzes. The real learning and fun came from the final project where I actually had to build something.

ConsenSys Academy Final Project
My final project was a bounty DApp that allows anyone to upload a picture of an item they want identified, along with an associated bounty in ETH for the best answer. I got a lot of experience using the various parts of the Web3 technology stack: Truffle for development/testing, IPFS for storing the pictures and data (it was cool to use this; a very powerful idea), uPort for identity, the OpenZeppelin libraries (which are really useful), an upgradeable design pattern, deployment to Rinkeby, and lots of practice securing and testing smart contracts.

Colony Hackathon Winner

I also managed to bag myself a prize in the Colony Hackathon for my decentralised issue reporting app. I got the Creativity Honorable Mention which was pretty cool and I used my winnings to buy a Devcon IV ticket ✈️ 🤘!!

The Learnings

I came across a few things that I wanted to do while I was #BUIDLING but couldn’t easily find the info on, so I’ve been keeping a kind of cheat sheet. Hopefully it might help someone else out there.

https://github.com/johngrantuk/dAppCheatSheet/blob/master/README.md

The Future Is Bright

Over the last few months I’ve confirmed to myself that the Blockchain/Ethereum world is something I want to be involved in. There are so many different, exciting areas to investigate further; now I just have to choose one and dive further down the rabbit hole!

Blockchain – The Adjacent Infrastructure

The Adjacent Infrastructure Movement

Chris Robison’s Ethereal summary post really reinforces the feeling I get about Blockchain. He mentions how Joseph Lubin described the tech as an Adjacent Infrastructure, and this really rings true to me. He goes on to say that a new parallel society is being built from the ground up, and people can opt into it. Some nice examples are given:

  • Crypto – adjacent financial industry infrastructure
  • Ujo – adjacent music industry infrastructure
  • Hoard – adjacent virtual entertainment infrastructure

It’s my first real experience of a “community”. This community is growing. There are people genuinely enthusiastic about the technology who are willing to contribute and collaborate to drive things forward. As more and more adjacent infrastructure is built, more people are going to opt in. At the same time, the building blocks of the Adjacent Infrastructure are still being built, so there’s loads of opportunity to get involved and help make this happen. Personally I’m looking forward to getting stuck into the ConsenSys Academy in June/July, and here are some of the other things that have enthused me lately!

Ideo, Designing for Blockchain: Three Ways to Get Started

A really good article centered on three Blockchain challenges that can be worked on now:

  1. Design secure and user-friendly ways to store private keys.
  2. Design better ways to help users make decisions about transaction costs, or find ways to abstract them away altogether.
  3. Design ways to display blockchain addresses in a more readable or recognizable format.

The challenges are real. People who are well versed, immersed and interested in the technology sometimes forget how difficult it can be to actually just use it. Solving these challenges would go a long way towards getting more adoption and real-life use. The article also provides a bit of inspiration to get started and it’s definitely thought provoking.
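On the third challenge, one small pattern already common in wallet UIs is shortening an address for display. A minimal sketch of the idea (the helper name and the truncation lengths are my own choices, not from any particular library):

```python
def short_address(addr, head=6, tail=4):
    """Shorten a hex address for display, e.g. in a wallet UI.
    Keeps the 0x prefix, the first `head` and last `tail` hex chars."""
    if len(addr) <= 2 + head + tail:
        return addr
    return addr[:2 + head] + "…" + addr[-tail:]

print(short_address("0x32Be343B94f860124dC4fEe278FDCBD38C102D88"))
# 0x32Be34…2D88
```

Shortening helps recognition but not verification — two addresses can share a prefix and suffix — which is why the article’s challenge is about genuinely readable formats, not just truncation.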

WalletConnect/Balance

A project I’ve enjoyed reading about lately is WalletConnect from the team at Balance. Their blog posts are great and the code is interesting and well documented. Richard Burton appears to be really open to collaboration with a genuine desire to drive the technology forward.

The WalletConnect concept is an example of something built to solve the usability challenges described in the Ideo article. The solution feels natural to me and I can definitely imagine using it. Balance themselves are in the process of developing a mobile wallet and I think this is something to look forward to.

ETHPrize

I read the introduction post to ETHPrize and it sounds like a great idea. Bounties as a driving force for development seem to be taking off.

The list of key points is really interesting. It shows how early stage this technology and the tools around it are. There’s so much opportunity to get involved and really make a difference.

I found it really weird to be inspired to do something about documentation! It’s not something I naturally gravitate towards, but it’s something I know can actually make a massive difference, and from experience it’s an issue in places. It seems like a great way to get involved and do valuable work for projects.

 

Django – Custom Migrations

The Problem

I have a Django app running on Heroku. Recently I had to change one of the models to add a new ForeignKey. The app in production has data stored in the database that I want to keep and this existing data can be used to define the new ForeignKey field.

After a bit of searching I discovered I can run a custom migration to manipulate the data in the database. Before now I never really understood what migrations were actually doing. As usual the Django docs are really good, and for this particular case I found the How to Create Django Data Migrations blog post useful – Vitor Freitas’ stuff is always good.

The Solution

First I added the new field to the model:
from django.db import models

class modelToChange(models.Model):
    manualAz = models.TextField(default="0")
    manualEl = models.TextField(default="0")
    manualPol = models.TextField(default="0")
    newField = models.ForeignKey(FkModel, null=True)
Next I ran the makemigrations command, which is responsible for creating new migrations based on the changes you have made to your models (kind of obvious really).
$ python manage.py makemigrations
This creates a new migration file in the app/migrations folder which looks like:
# -*- coding: utf-8 -*-
# Generated by Django 1.11.7 on 2018-02-08 11:57
from __future__ import unicode_literals

from django.db import migrations, models
import django.db.models.deletion

class Migration(migrations.Migration):

    dependencies = [
        ('myApp', '0015_auto_20180201_0936'),
    ]

    operations = [
        migrations.AddField(
            model_name='modelToChange',
            name='newField',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='myApp.FkModel'),
        ),
    ]

This adds a new field, ‘newField’, to the existing model, ‘modelToChange’. The field is a ForeignKey field link to another model class called FkModel.

Normally I’d run the migrate command next. migrate is responsible for applying and unapplying migrations – basically, it updates the database. In this case I want to add some custom code to the migration to update the new field using existing data.

After the additions the migration file looks like this:

# -*- coding: utf-8 -*-
# Generated by Django 1.11.7 on 2018-02-08 11:57
from __future__ import unicode_literals

from django.db import migrations, models
import django.db.models.deletion


def addCustom(apps, schema_editor):

    ExistingRecords = apps.get_model('myApp', 'modelToChange')
    ForeignKeyRecords = apps.get_model('myApp', 'FkModel')

    for message in ExistingRecords.objects.all():
        fkRecord = ForeignKeyRecords.objects.filter(id=message.id)

        if len(fkRecord) == 0:
            # No matching record exists yet, so create a placeholder one.
            fkRecord = ForeignKeyRecords(id="unknown:" + str(message.id))
            fkRecord.save()
        else:
            fkRecord = fkRecord[0]

        message.newField = fkRecord
        message.save()

class Migration(migrations.Migration):

    dependencies = [
        ('myApp', '0015_auto_20180201_0936'),
    ]

    operations = [
        migrations.AddField(
            model_name='modelToChange',
            name='newField',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='myApp.FkModel'),
        ),

        migrations.RunPython(addCustom),
    ]

Notice the addition of migrations.RunPython(addCustom) to the Migration class and the new function, addCustom.

The last step is just to run the migrate command:

$ python manage.py migrate

When the migrate command is run the addCustom function will be called. The addCustom function iterates through existing records in the database and adds the foreign key object to the existing data.

Heroku Application

It’s also possible to run custom migrations on a Heroku application. Migration files will be pushed to the app along with the new code. Once this is deployed, the following command can be run:

heroku run python manage.py migrate

You can put your app into maintenance mode first with:

heroku maintenance:on

Django And Heroku Postgres Databases

I’m using Heroku to run one of my Django applications. The application has a Heroku Postgres Add-on for storing data. I’d like to use this ‘live’ data when I’m working on the application on my local development set-up.

I can see two options – connect directly to the live database or retrieve a copy to use locally. Using the live database directly could lead to issues if I make a mistake, so I’m going with the local copy.

PostgreSQL Mac Set-up

Initially my local app was using SQLite, so the first thing I needed to do was get my development Mac set up for PostgreSQL. As detailed in the Heroku docs I used the Postgres.app package.

To confirm it was running ok:

$ which psql
 /Applications/Postgres.app/Contents/Versions/latest/bin/psql

And verified:

$ psql -h localhost
 psql (10.1)
 Type "help" for help.

Making A Local Database Copy

The pg:pull command can be used to pull remote data from a Heroku Postgres database to a database on the local machine. I also want to use a user and password, so the command I used is:

PGUSER=postgres PGPASSWORD=password heroku pg:pull DATABASE_URL bcmlocaldb

A few things to note:

  • This is run from the main application directory on my local machine.
  • PGUSER and PGPASSWORD set the authentication credentials for the local db.
  • My Django app has the Database URL stored under the DATABASE_URL environment variable. The URL can be viewed with the $ heroku config command or on the Heroku dashboard if you want to use it directly.
  • bcmlocaldb is the name of the new local db.

Once the command has successfully run I could see the db in the PostgreSQL dashboard on my local machine.

PostgreSQL Dashboard

Django Settings To Use Local DB

Now that I have a local copy of the db, I just need to change the local app’s settings to use this instead of the old SQLite. I’m using the DJ-Database-URL utility to configure my database from an environment variable, so in my local .env file I changed:

DATABASE_URL=sqlite:///db.sqlite3

to:

DATABASE_URL=postgres://postgres:password@localhost:5432/bcmlocaldb

Now when I run the local application it’s using the new database!
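For what it’s worth, all DJ-Database-URL really does is parse that URL into the pieces Django’s DATABASES setting needs. A rough stdlib-only sketch of the idea (the function name and the exact ENGINE mapping are illustrative, not the library’s actual code):

```python
from urllib.parse import urlparse

def parse_database_url(url):
    """Split a DATABASE_URL into Django-style connection settings."""
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql"
                  if parts.scheme == "postgres" else parts.scheme,
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port,
    }

config = parse_database_url("postgres://postgres:password@localhost:5432/bcmlocaldb")
print(config["NAME"], config["HOST"], config["PORT"])  # bcmlocaldb localhost 5432
```

In the real settings.py you’d just use the library itself, e.g. `DATABASES = {'default': dj_database_url.config()}`, which reads DATABASE_URL for you.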

Heroku has a lot of nice documentation and the Postgres info can be found here.

Alternative – Connect to live Heroku Postgres Database

To get the application’s Postgres URI I just run the $ heroku config command; you will see something along the lines of:

$ heroku config

DATABASE_URL: postgres://your_username:the_password@host.amazonaws.com:5432/database

Now you just replace the local DATABASE_URL .env variable with the info above.

There’s some good documentation going into more detail about connecting from outside Heroku here.

DApp Learnings  –  Storing & Iterating a Collection

I’ve been working on a rock, paper, scissors Ethereum DApp using Solidity, Web3 and the Truffle framework. I hit a few difficulties trying to replicate functionality that would normally be trivial in a non-blockchain world, so I thought I’d share what I learned.

My first thought for the DApp was to display a list of existing games that people had created. Normally if I were doing something like this in Django I’d create a game model and save any new games in the database. To display a list of existing games on the front end I’d query the db and iterate over the returned collection. (I realise storage is expensive when using the Ethereum blockchain, but I thought trying to replicate this functionality would make sense and would be a good place to start.)

Solidity

Structures

While investigating the various data types that could be used I found the Typing and Your Contracts Storage page from Ethereum useful. I settled on using a struct, a grouping of variables, stored under one reference.

struct Game {
    string name;
    uint move;
    bool isFinished;
    address ownerAddress;
    uint stake;
    uint index;
}

That handles one game but I want to store all games. I attempted to do this in a number of different ways but settled on mapping using the games index as the key. Every time a new game is added the index is incremented so I also use gameCount to keep count of the total games.

mapping (uint => Game) games;
uint gameCount;

struct Game {
    string name;
    uint move;
    bool isFinished;
    address ownerAddress;
    uint stake;
    uint index;
}

To add a new game I used this function:

function StartGame(uint moveId, string gameName) payable {
    require(moveId <= 2);  // moveId is a uint so it can never be negative
    games[gameCount].name = gameName;
    games[gameCount].move = moveId;
    games[gameCount].isFinished = false;
    games[gameCount].ownerAddress = msg.sender;
    games[gameCount].stake = msg.value;
    games[gameCount].index = gameCount;
    gameCount++;
}

I also added a function that returns the total number of games:

function GetGamesLength() public constant returns (uint) {
    return gameCount;
}

Returning A Structure

Next I want to be able to get information about a game using its index. In Solidity a structure can only be returned by a function from an internal call, so for the front end to get the data I had to find another way. I went with the suggestion here — return the fields of the struct as separate return variables.

function GetGame(uint Index) public constant returns (string, bool, address, uint, uint) {
    return (games[Index].name, games[Index].isFinished, games[Index].ownerAddress, games[Index].stake, games[Index].index);
}

Front End

On the front end I use Web3 to iterate over each game and display it. To begin I call the GetGamesLength() function. As we saw previously this gives the total number of games. Then I can iterate the index from 0->NoGames to get the data for each game using the GetGame(uint Index) function.

When my page first loads it calls:

getGames: function() {
    var rpsInstance;
    App.contracts.RpsFirst.deployed().then(function(instance) {
      rpsInstance = instance;
      return rpsInstance.GetGamesLength.call();
    }).then(function(gameCount) {
      App.getAllGames(web3.toDecimal(gameCount), rpsInstance);
    }).catch(function(err) {
      console.log(err.message);
    });
  },

Web3 – Promises, Promises & more Promises…

The getAllGames function calls GetGame(uint Index) for each game. To do this I created a sequence of promises using the method described here:

getAllGames: function(NoGames, Instance){

   var sequence = Promise.resolve();

   for (var i = 0; i < NoGames; i++){(function(){
         var capturedindex = i;
         sequence = sequence.then(function(){
            return Instance.GetGame.call(capturedindex);
         }).then(function(Game){
            console.log(Game + ' fetched!');
            // Do something with game data.
            console.log(Game[0]); // Name
            console.log(Game[1]); // isFinished
         }).catch(function(err){
            console.log('Error loading ' + err);
         });
      }());
   }
}

Conclusion

Looking back at this now it all looks pretty easy, but it took me a while to get there! I’m still not even sure if it’s the best way to do it. Any advice would be awesome, and if it helps someone, even better.

Token Models – Storm

Storm

Storm are another interesting crypto company, and they are planning an ICO for their Storm Token in the near future, so it’s worth investigating how their token model works.

Storm Background

Storm plan to release a number of DApps:

Storm Play: Storm already have a product called BitMaker (1.2M+ downloads across 187 countries) which they are rebranding as Storm Play and integrating with the other Storm products.

Storm Market:  A decentralised micro-task marketplace.

Storm Shop: Opportunities where users can learn about or sample retail goods and services (no real info on this yet).

How It Works

The service is relatively simple, and I see this as a positive. The basics of the service are described below:

“Makers” and “Players” meet each other to buy and sell tasks.

“Storm Makers” post tasks using the Storm Play app. To post a task you are required to pay using STORM tokens.

“Storm Players” use the Storm Play app to perform tasks. Players receive reward units called Bolts in return.

Bolts may not be transferred off of Storm Market but are redeemable for STORM Tokens.

Players can also earn Bolts as part of the gamified reward system (see below). Players can be rewarded for:

  • Creating tasks
  • Completing tasks
  • Managing other Storm Players that a person referred in to complete a task
    successfully
  • Helping categorize a task or helping to rank a task

These Bolts can be used for boosts. Boosts give the Storm Player access to more micro-tasks for a certain period of time.

Token Analysis

As I did the last time, working through the twenty questions proposed by William Mougayar in Tokenomics — A Business Guide to Token Usage, Utility and Value really helps to think about how the tokens are used:

I found it more difficult to answer some of these and I’m not 100% convinced with some of my answers. For now I think the tokens offer the following utility:

The Value Exchange

Players are rewarded Bolts for tasks such as categorising a task or helping to rank one. These Bolts can be used for boosts, which give the Storm Player access to more micro-tasks for a certain period of time and should lead to more tasks being completed. I think this is maybe stretching a little to meet Mougayar’s definition – there’s not a real transactional economy between buyers/sellers as there’s not much to spend Bolts on – but I think this will develop.

The Toll

Makers are required to pay for posts in STORM tokens. Players also have to pay Bolts to boost.

 

The Currency

Reduces traditional payment fees.

Conclusion:

The overall score is 9/20. I’m not surprised it’s a fairly low score, as my gut feeling is that apart from reducing fees the token isn’t intrinsic to the business plan. Micro-tasks and tokens fit well but they don’t need each other. I do still like the idea though, and maybe ‘only’ reducing fees is enough – especially with the current industry leaders charging up to 40% (see whitepaper).

Gamification:

Whilst not blockchain technology specific the gamification aspect is interesting and is a big strength for the Storm team so I thought I’d investigate it a bit further in this section.

The main goal of embedding a ‘gamification’ layer in the Storm Market platform is to make it effective at reaching objectives and delivering business results, like ROE (Return on Engagement). Storm Play can make monotonous work much more enjoyable through a gamified platform.

Unlike the previously cited freelance platforms, gamification is a very important aspect of Storm’s ecosystem. Makers post tasks on the market and Players execute those tasks in exchange for different kinds of rewards (including money) that make the whole thing a fun experience.

To gamify the progression of its users, Storm uses “Bolts” as the units of its reward system. Users can earn Bolts by doing micro-tasks and by doing training tasks to improve their skills. As their experience increases, they are assigned more micro-tasks. But your experience level on the Storm market is not tied to your Bolts; they are an artifact of it. You can actually use them to get “boosts”: “Within Storm Market, Storm Players can use their balance of Bolts for boosts. Boosts give the Storm Player access to more micro-tasks for a certain period of time.”

Links

Storm Website

Analysis of Storm ICO

Token Models – Grid+

Grid+

In my previous post I stated I would try to learn more about token models and the associated technology by investigating crypto companies. First up a company called Grid+.

Grid+ aims to be a utility provider that exposes its customers to wholesale electricity prices. I really like their blog posts, they put a lot of effort into explaining the thought process behind the company. One of the co-founders, Alex Miller, actually encourages criticism and I think his transparency is refreshing. They’re one of the few companies who actually go into details about how blockchain technology can be used in a real life business and reading their posts really helps to develop my understanding of this technology.

Grid+ Value

First up it’s useful to look at what value Grid+ can potentially bring to a customer.

Grid+ want to expose its customers to wholesale electricity prices. Traditional utilities use fixed prices which are not market driven but wholesale prices vary throughout the day. Grid+ will offer prices both for consumed and generated energy based on the wholesale price (with a relatively small markup for most customers). This means customers have the opportunity to buy electricity when it is cheap, store it in batteries and sell when the price goes up. To me that’s pretty cool by itself.

The plan is to operate using the existing grid infrastructure. In theory that’s a smart way to save costs, especially as a start-up. From the description of how the system will work (see below) I wonder if this might potentially restrict them.

They place a lot of emphasis on giving their users ‘agency’, which means users control the sending of funds, etc. In this system there is minimal human administration and minimal credit risk. Ultimately this keeps the price low for customers by avoiding the traditional costs associated with admin, etc.

An IoT ‘Agent’ device is used to automatically handle payments via the Ethereum network. The goal is to create a p2p market between Grid+ Agents. A customer who has electricity to sell can offer it at a local rate lower than the nearest wholesale rate. They will get bought out by another customer, with the two customers’ Agents handling payments via a Grid+ hub on the Raiden network.

Grid+ acts as the bridge between the old world of traditional energy suppliers and the new world of prosumers and p2p transactions.

They also have ambitions to make the Agent ‘smart’ to help the customer purchase the cheapest electricity possible. The Agent can be connected to other sensors, such as thermostats; it will then predict energy usage and save money on energy purchasing.

Ultimately these are all potentially smart ways to provide cheaper electricity to the customer.

Tokens

Grid+ Tokens

Grid+ will operate with a two-token model, a BOLT token and a GRID token.

The BOLT token will be treated by Grid+ as a stable-coin, redeemable by Grid+ customers for $1 worth of energy from Grid+ and backed by USD deposits.

The GRID token will allow Grid+ customers to purchase electricity from Grid+ at wholesale price. 1 GRID token = 500 kWh at the wholesale price.

Token Evaluation

Working through the twenty questions proposed by William Mougayar in Tokenomics — A Business Guide to Token Usage, Utility and Value really helps to think about how the tokens are used:

  1. Is the token tied to a product usage, i.e. does it give the user exclusive access to it, or provide interaction rights to the product? BOLT: Yes – provides access to Grid+ energy. GRID: Yes – 1 GRID token = 500 kWh at the wholesale price.
  2. Does the token grant a governance action, like voting on a consensus related or other decision-making factor? BOLT: No. GRID: No.
  3. Does the token enable the user to contribute to a value-adding action for the network or market that is being built? BOLT: Yes – selling generated electricity on a p2p network. GRID: No.
  4. Does the token grant an ownership of sorts, whether it is real or a proxy to a value? BOLT: Yes – ownership of energy. GRID: Yes – ownership of wholesale energy.
  5. Does the token result in a monetizable reward based on an action by the user (active work)? BOLT: Yes – any revenue earned by a customer from selling electricity is earned in BOLT tokens instead of fiat and is stored on the Agent. GRID: No.
  6. Does the token grant the user a value based on sharing or disclosing some data about them (passive work)? BOLT: No. GRID: No.
  7. Is buying something part of the business model? BOLT: Yes. GRID: Yes.
  8. Is selling something part of the business model? BOLT: Yes. GRID: No.
  9. Can users create a new product or service? BOLT: No. GRID: No.
  10. Is the token required to run a smart contract or to fund an oracle? BOLT: No. GRID: No.
  11. Is the token required as a security deposit to secure some aspect of the blockchain’s operation? BOLT: No. GRID: No.
  12. Is the token (or a derivative of it, like a stable coin or gas unit) used to pay for some usage? BOLT: Yes – pay for used electricity. GRID: No.
  13. Is the token required to join a network or other related entity? BOLT: Yes – can’t join the Grid+ network without the token. GRID: No.
  14. Does the token enable a real connection between users? BOLT: Yes? – allows p2p payments between users. GRID: No.
  15. Is the token given away or offered at a discount, as an incentive to encourage product trial or usage? BOLT: No. GRID: Yes – I think it incentivises people to start using Grid+.
  16. Is the token your principal payment unit, essentially functioning as an internal currency? BOLT: Yes. GRID: No.
  17. Is the token (or derivative of it) the principal accounting unit for all internal transactions? BOLT: I think this is a yes, although it may be kWh instead. GRID: No.
  18. Does your blockchain autonomously distribute profits to token holders? BOLT: Yes – Agents are automatically updated. GRID: No.
  19. Does your blockchain autonomously distribute other benefits to token holders? BOLT: Yes – possibly Casper, etc.? GRID: No.
  20. Is there a related benefit to your users, resulting from built-in currency inflation? BOLT: No. GRID: No.

Total: BOLT 13/20, GRID 4/20.
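The totals are just a count of the Yes answers in each column. As a quick sanity check, a few lines of Python reproduce them (the True/False lists are my transcription of the answers above, with the qualified “Yes?”/“I think yes” answers counted as Yes):

```python
# One boolean per question, in order 1-20, for each token.
bolt = [True, False, True, True, True, False, True, True, False, False,
        False, True, True, True, False, True, True, True, True, False]
grid = [True, False, False, True, False, False, True, False, False, False,
        False, False, False, False, True, False, False, False, False, False]

def score(answers):
    """Token-utility score: number of Yes answers out of the total."""
    return "{}/{}".format(sum(answers), len(answers))

print(score(bolt))  # 13/20
print(score(grid))  # 4/20
```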

This leads to the conclusion that the tokens offer the following utility:

The Value Exchange

The BOLT token is a unit of value that allows the buying and selling of energy between Grid+ and users, and also user to user.

Users can also earn BOLTs by generating and selling electricity, or by arbitrage.

The Toll

It’s obvious that the BOLT token is used to pay for electricity used, but I feel like I’m not 100% on this one. Maybe it’s because a token isn’t essential for this function. In the User Agency post Alex does admit this part could be done using a traditional infrastructure.

The Function

As far as I can see the GRID token is mainly promotional: it offers a discount that encourages users to use Grid+ services.

The Currency

Traditional payment processing fees and admin costs are reduced.

p2p payments are made economically and technologically viable.

Conclusions

I’m not sure if 13/20 is considered a high score but my feelings are the BOLT token utility more than proves itself essential to the business model.

Initially I was sceptical about the use of the GRID token but after working through the questions I can see it’s potentially a really nice way to incentivise people to start using the service.

There are also some other interesting opportunities described in the post Casper, Plasma, and the Grid+ Agent that demonstrate the Grid+ team are super innovative when it comes to using blockchain technology and are still exploring new ideas – it must be a cool place to work! I think the last paragraph from Alex Miller’s The P2P Grid We All Want post demonstrates this:

However, this brainstorming session is likely an exercise in futility because the future is always uncertain and we’re talking about a future that is likely a decade or more away. Nevertheless, we at Grid+ are primarily motivated by bringing forth the future of energy. If Grid+ is so successful that our customers eventually dis-intermediate us and we really have no way to make money, then perhaps we’ve accomplished what we set out to do. We as a team would be happy with an outcome that empowers the people and establishes a new, transactive grid because that would mean we will have proven the power of decentralizing technologies — as far as we’re concerned, that’s a win for humanity.

How Grid+ Will Work

The following isn’t actually that relevant from the token model perspective, but it’s really interesting to see the nuts and bolts of how a crypto company could function. Really cool of Grid+ to share this.

A customer purchases an Agent (a small device capable of making signatures) and claims that device based on a serial number printed on the box.

Before shipping the Agent, Grid+ whitelists its serial number on their registry contract.

Grid+ maps that serial number to an Ethereum address, which is a function of its public key.

Agent Address – created when the device first boots up.

Owner Address – created when the customer enters the serial number printed on the device into the Grid+ web console.

Customers will make a refundable initial deposit and then prepay each month.

Fiat payments will be converted to USDX tokens.

Every few hours Grid+ requests payment from a customer’s Agent commensurate with the amount of energy used.

If the customer exceeds her monthly allowance (or her Agent simply doesn’t pay), the customer is notified and deductions are made from her initial account deposit.

Your utility company tracks how much power goes into your house and how much power goes out of your house, then sends you a bill based on the net result.

Smart meters send data to the ISO. The ISO makes the data available via an API.

Grid+ will query the ISO (ERCOT in Texas) and bill customers’ Agents based on that usage.
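The billing loop described above is simple enough to sketch in a few lines. Everything here – the names, the toy Agent class, the deposit fallback – is my own illustration of the described flow, not Grid+ code:

```python
class Agent:
    """Toy stand-in for the customer's payment device."""
    def __init__(self, balance):
        self.balance = balance  # token balance held on the device

    def pay(self, amount):
        if self.balance >= amount:
            self.balance -= amount
            return True
        return False

def bill_customers(usage_by_customer, agents, deposits, price_per_kwh):
    """One billing cycle: charge each customer's Agent for metered usage;
    if the Agent can't (or won't) pay, deduct from the initial deposit."""
    collected = {}
    for customer, kwh in usage_by_customer.items():
        amount = kwh * price_per_kwh
        if agents[customer].pay(amount):
            collected[customer] = amount
        else:
            # Agent didn't pay: the customer would be notified and the
            # shortfall comes out of the refundable deposit.
            taken = min(amount, deposits[customer])
            deposits[customer] -= taken
            collected[customer] = taken
    return collected

agents = {"alice": Agent(100.0), "bob": Agent(0.0)}
deposits = {"alice": 50.0, "bob": 30.0}
print(bill_customers({"alice": 500, "bob": 500}, agents, deposits, 0.1))
# {'alice': 50.0, 'bob': 30.0}  -- bob's Agent can't pay, so his deposit is tapped
```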

Analysing Token Models

ICOs

An ICO or Initial Coin Offering allows a company to sell its own cryptocurrency token to investors. As William Mougayar (a true Blockchain expert) states:

A [cryptocurrency] token is just another term for a type of privately issued currency.

It’s probably fair to say that, at this time, raising funds via an ICO is relatively easy compared to traditional methods. There have been many successful ICOs with large amounts of cash being generated. It’s also probably fair to say that a large number of the companies running these have no real requirement for a token and are simply accessing the ‘easy’ cash.

Some experts predict the majority of ICOs will fail but the organisations with genuine token models (and good execution) will survive and these are the ones that are innovative, valuable and interesting.

Analysing The Token Model

My plan is to analyse companies that use tokens to basically answer the question:

What does the token do that makes it essential to the business model?

This should help me to better understand the technology and identify how tokens can be used.

As a basis for the investigation I will use William Mougayar’s Role — Features — Purpose framework for assessing token utility, as described in his post, Tokenomics — A Business Guide to Token Usage.

The chart shown below is taken from the Tokenomics post and shows the main roles a token can play. The post itself provides great detail that I won’t repeat, but it is definitely worth a read. Using this and the list of questions at the end should result in a reliable conclusion on the utility of the token.

Token Utility – William Mougayar

Token Utility Questions (credit William Mougayar)

  1. Is the token tied to a product usage, i.e. does it give the user exclusive access to it, or provide interaction rights to the product?
  2. Does the token grant a governance action, like voting on a consensus related or other decision-making factor?
  3. Does the token enable the user to contribute to a value-adding action for the network or market that is being built?
  4. Does the token grant an ownership of sorts, whether it is real or a proxy to a value?
  5. Does the token result in a monetizable reward based on an action by the user (active work)?
  6. Does the token grant the user a value based on sharing or disclosing some data about them (passive work)?
  7. Is buying something part of the business model?
  8. Is selling something part of the business model?
  9. Can users create a new product or service?
  10. Is the token required to run a smart contract or to fund an oracle? (an oracle is a source of information or data that a smart contract can use)
  11. Is the token required as a security deposit to secure some aspect of the blockchain’s operation?
  12. Is the token (or a derivative of it, like a stable coin or gas unit) used to pay for some usage?
  13. Is the token required to join a network or other related entity?
  14. Does the token enable a real connection between users?
  15. Is the token given away or offered at a discount, as an incentive to encourage product trial or usage?
  16. Is the token your principal payment unit, essentially functioning as an internal currency?
  17. Is the token (or derivative of it) the principal accounting unit for all internal transactions?
  18. Does your blockchain autonomously distribute profits to token holders?
  19. Does your blockchain autonomously distribute other benefits to token holders?
  20. Is there a related benefit to your users, resulting from built-in currency inflation?

Python & Redis PUB/SUB

I recently had an issue where three Python scripts need to be started on my Raspberry Pi when it boots. Script 1 identifies which serial devices are attached to the Pi and saves the information to a database. Scripts 2 and 3 need to use this information, so they have to wait until Script 1 has completed.

I wasn’t sure of the best way to do this, but I came up with a method that let me experiment with Redis PUB/SUB, which I’ve wanted an excuse to use for a while!

My Scripts

The redis-py Python client is well documented and really easy to use. I created a function, RedisCheck(), that Scripts 2 & 3 call when they start. The function subscribes to a Redis channel called ‘startScripts’ and then loops until it receives a ‘START’ message on that channel. Once this is received, the main script can continue with its primary job.

Script 1 is very simple: once its main work is done, it PUBLISHES the ‘START’ message on the ‘startScripts’ channel.

The following snippets show how easy the code was to write, and it works really well.

RedisCheck() SUB for Scripts 2 & 3

Script 1 PUB START

Publishing Via The Command Line

Another nice thing I found was how easy it was to PUBLISH the ‘START’ message using the redis-cli. All I do is run:

# redis-cli

> PUBLISH startScripts START

This is really useful if I’m debugging and so easy to do. Overall I really like Redis.

Python Map Plotting Using Cartopy

Cartopy Plot of Scotland

Recently I’ve been using Python and Cartopy to plot some latitude/longitude data on a map. Initially it took some time to figure out how to get it working, so I thought I’d share my code in case it’s useful.

According to the Cartopy intro it is

“a Python package designed to make drawing maps for data analysis and visualisation as easy as possible.”

I’m not sure how active the project is and I found the documentation a bit lacking, but once I was up and running it was easy to use and I think the results look pretty good.

Plotting My Data

I have a csv file with various data timestamped and saved on each line. For this case I was interested in the lat/lng location, signal strength (for an antenna) and also a satellite number. An example of one line of data is:

2017-07-10 22:31:59:203,Processing UpdatePacket: [':', '1', '0', '0', '1', '0', '0', '1.63', '17.15', '246.57', '114.11', '57.008263', '-5.827861', '310.00', '1', 'NAN', '0', '2', '0', 'c\n']

and from that the information I require is:

lat/lng position: 57.008263,-5.827861
signal strength: 1.63
satellite number: 310.00

Initially, for each lat/lng position I wanted to plot a point on the map, with the marker coloured to show the satellite number; if the signal strength was -100, the marker should instead be red. An example taken from some of the data is shown below.


Lat/Lng Plots with different zoom level

The following Gist shows the Python script I used:

Script Details

Most of the script is actually concerned with reading the file and parsing out the relevant data. The main plotting functionality is in this section:
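The parsing step can be sketched roughly as follows, assuming the field positions shown in the example line above (parse_line is my naming, not the Gist’s):

```python
def parse_line(line):
    """Pull lat, lng, signal strength and satellite number from one log line."""
    # The packet list sits between the first '[' and the last ']'
    raw = line[line.index('[') + 1 : line.rindex(']')]
    # Strip whitespace and quotes from each comma-separated field
    fields = [f.strip().strip("'") for f in raw.split(',')]
    strength = float(fields[7])   # signal strength, e.g. 1.63
    lat = float(fields[11])       # latitude, e.g. 57.008263
    lng = float(fields[12])       # longitude, e.g. -5.827861
    sat = float(fields[13])       # satellite number, e.g. 310.00
    return lat, lng, strength, sat
```

Each parsed line’s values are then appended to the longitude/latitude lists used by the scatter plot.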

import matplotlib.pyplot as plt
import cartopy.crs as ccrs

ax = plt.axes(projection=ccrs.Mercator())  # Map projection
ax.coastlines(resolution='10m')            # Adds coastline to map at highest resolution

plt.scatter(lngArr, latArr, s=area, c=satLngArr,
            alpha=0.5, transform=ccrs.Geodetic())  # Plot the points
plt.show()

The projection sets the coordinate system and is selected from the Cartopy projection list (there’s a lot to pick from and I chose the one I thought looked the best).

Next a coastline is added to the projection. As I was focusing on a small section of Scottish coastline I went with the 10m resolution which is the highest but lower resolutions can be selected as detailed in the documentation.

Finally a scatter plot is created. The data has been parsed into equal sized lists of longitude and latitude points.

The ‘s’ parameter defines the size of the marker at each point, in this case all set to 1pt radii.

The ‘c’ parameter defines the colour of the marker, in this case blue for satellite 310, green for 60, yellow for 302, black for any other satellite and red if signal strength is -100.
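That colour rule can be sketched as a small helper (marker_colour is my naming; the satellite-to-colour mapping is from the description above):

```python
def marker_colour(sat, strength):
    """Marker colour rules: red overrides everything when strength is -100."""
    if strength == -100:
        return 'red'
    colours = {310.0: 'blue', 60.0: 'green', 302.0: 'yellow'}
    return colours.get(sat, 'black')  # black for any other satellite
```

The list built from this for each point is, presumably, what gets passed as the ‘c’ parameter to plt.scatter.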

Finally the transform=ccrs.Geodetic() sets the lat/lng coordinate system as defined here.

Scaling Marker Size

It’s also possible to adjust the radius of the marker at each point. To scale it relative to the signal strength (I removed the -100 strengths):

area = np.pi * (strengthNpArray)**2

Which gives:


Marker scaled to strength at point