Last week the tension was palpable in Kenya in the run-up to the 8th August general elections. Thankfully the voting process went off without too much hassle, and the mood across the country was very positive, with the interwebs awash in the purple-shaded "pinkie" that confirmed its holder had indeed exercised a key democratic right. There was also Githeri Man, who helped punctuate the mood with humor, spawning hundreds of memes.
Things then seemingly took a turn south as the counting and transmission of results began, unsettling varied publics and giving rise to all manner of conspiracy theories and dodgy dossiers quickly disseminated via social media. We have seen the impact of fake news on even the most advanced economies, and the effects are probably compounded in our setup by a huge rural populace that is several notches more impressionable.
Steering clear of a granular analysis of the just-concluded election tech, which I personally think performed well on the frontend and was a giant step up from the previous season, I would like to guide our future thinking from first principles. This lets us design the solution around what we want to achieve rather than what we happen to have at hand. It will also require us to run experiments and test them at scale, to earn the maximum possible buy-in from all invested parties, especially the 45 million citizens.
At the core of every election pain point and ruse is trust. If we can better guarantee data integrity, then our problems are 80 percent sorted. Data integrity can be broken down further into integrity at capture, in transit and at rest. We need to assure the publics that how the data is collected, how it is transmitted and how it is stored is both secure and auditable, with an audit function that is open and transparent for anyone to query and follow.
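One way to make stored results auditable by anyone is an append-only, hash-chained log: each entry commits to the one before it, so tampering with any record breaks every subsequent link. The sketch below is purely illustrative; the station names, field names and genesis value are my own assumptions, not anything from an actual election system.

```python
import hashlib
import json

def chain_entry(prev_hash: str, record: dict) -> str:
    """Hash the previous link together with the new record."""
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_log(records) -> list:
    """Build the chain of hashes; publishing them lets anyone audit the data."""
    prev = "0" * 64  # arbitrary genesis value (assumption)
    hashes = []
    for record in records:
        prev = chain_entry(prev, record)
        hashes.append(prev)
    return hashes

def verify_log(records, hashes) -> bool:
    """Replay the chain over the stored records and check every published hash."""
    return hashes == build_log(records)

# Hypothetical polling-station returns, for illustration only.
results = [
    {"station": "001A", "candidate_x": 120, "candidate_y": 98},
    {"station": "001B", "candidate_x": 87, "candidate_y": 143},
]
log = build_log(results)
assert verify_log(results, log)

# Any after-the-fact edit to the data at rest is caught on audit.
tampered = [dict(results[0], candidate_x=220), results[1]]
assert not verify_log(tampered, log)
```

The point is not the specific hash function but the property: the audit check needs no privileged access, only the records and the published hashes.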
Once the integrity of the data committed to transit has been verified, the payload encrypted, and its distribution made decentralised and asymmetric, transmission can happen over any connection, even the open internet; the link itself need not be secured via a VPN or any such channel.
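The reason the link can be untrusted is that each payload carries its own verifiable tag; anything altered in transit is simply rejected on receipt. A real deployment would use an asymmetric signature scheme (e.g. Ed25519) so receivers hold only public keys; in this stdlib-only sketch a symmetric HMAC stands in for that signature, and the per-station key is a made-up assumption.

```python
import hmac
import hashlib

# Assumption: each station is provisioned with a key before election day.
STATION_KEY = b"per-station secret provisioned ahead of time"

def tag_payload(payload: bytes) -> str:
    """Sender computes an integrity tag over the exact bytes sent."""
    return hmac.new(STATION_KEY, payload, hashlib.sha256).hexdigest()

def accept(payload: bytes, tag: str) -> bool:
    """Receiver recomputes the tag in constant time; altered bytes fail."""
    return hmac.compare_digest(tag_payload(payload), tag)

payload = b'{"station": "001A", "candidate_x": 120}'
tag = tag_payload(payload)

assert accept(payload, tag)             # intact payload is accepted
assert not accept(payload + b" ", tag)  # any in-transit change is rejected
```

With verification happening at the endpoints, the network in between contributes nothing to, and can take nothing away from, the integrity of the result.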
With thousands or even millions of nodes and agents in the ecosystem performing maker-checker operations on every single system interaction, it becomes possible to tabulate output reliably, share it across different mediums, and arrive at a leaderboard that communicates the will of the people and could very well be incontestable.
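The maker-checker idea can be sketched in a few lines: several agents independently tabulate the same stream of returns, and a total only reaches the leaderboard when every tabulation agrees. The data and the three-agent count below are illustrative assumptions, not a prescription.

```python
from collections import Counter

def tabulate(returns) -> Counter:
    """One agent's independent tally across all station returns (maker)."""
    totals = Counter()
    for record in returns:
        for candidate, votes in record.items():
            totals[candidate] += votes
    return totals

def reconcile(tallies) -> Counter:
    """Checker step: accept totals only if every maker produced the same ones."""
    first = tallies[0]
    if all(tally == first for tally in tallies[1:]):
        return first
    raise ValueError("tallies diverge; flag for audit")

# Hypothetical returns from two stations.
returns = [
    {"x": 120, "y": 98},
    {"x": 87, "y": 143},
]
agents = [tabulate(returns) for _ in range(3)]  # three independent tabulations
leaderboard = reconcile(agents)
assert leaderboard == {"x": 207, "y": 241}
```

Scaled out, a single divergent tally no longer corrupts the outcome; it merely raises a flag that anyone following the open audit trail can inspect.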
I have steered clear of mentioning technology stacks explicitly, not to put blinders on you, but because we already have all the building blocks; all that is left is the goodwill to push the envelope. Another world first? Perhaps.