Some time ago, I built the OpenERZ API (and wrote about it on this blog). The API provides nice and easy access to the waste collection data of the City of Zurich, provided by Entsorgung + Recycling Zürich (ERZ) via the Open Data Portal of the City of Zurich.
OpenERZ provides iCal calendars, so you can create a custom calendar for the waste types and the ZIP code you're interested in and add it to your digital calendar.
But for me, the killer feature is getting push notifications. There used to be a free SMS service from ERZ to remind you of the waste collection (e.g. to put your cardboard in front of your house, so that it gets collected the next day). Unfortunately, they shut down this service and launched their own app.
So here I am, with an API that has all the data and the need for push notifications. Instead of building everything from scratch, I decided to look into existing solutions. The first thing that came to mind was IFTTT, the universal recipe tool to connect different services.
It has almost everything I need:
- A time-based trigger to run something every day (think cron)
- A so-called channel to make and receive web requests (it's called "Maker" on IFTTT)
- A channel for Pushover, a service to receive push notifications on your phone (there are other options for this, but it's my current choice)
To glue these together, all that's needed is a little script that makes API requests to OpenERZ and transforms the results into individual messages that can be delivered as push notifications. For this purpose, I created the erz-trigger project. It's a small node application that I deployed to Heroku.
If you make a GET request to the application, it calls the OpenERZ API for the waste collections of the following day with the given ZIP code. It then calls the Maker channel webhook for each waste collection returned by the API. IFTTT takes care that I get a push notification whenever the webhook is called.
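The transformation step can be sketched roughly like this. This is only an illustration, not the actual erz-trigger code; the field names (`result`, `zip`, `date`, `type`) are assumptions about the API response shape, not the documented OpenERZ schema:

```javascript
// Hypothetical sketch: turn an OpenERZ-style response into one push
// message per waste collection. Field names are assumptions.
function toPushMessages(apiResponse) {
  return apiResponse.result.map(
    (c) => `Reminder: ${c.type} collection in ${c.zip} on ${c.date}. Put it out tonight!`
  );
}

// Example input shaped like a typical JSON API response
const sample = {
  result: [
    { zip: 8005, date: '2016-06-05', type: 'cardboard' },
    { zip: 8005, date: '2016-06-05', type: 'paper' },
  ],
};

const messages = toPushMessages(sample);
console.log(messages.length); // one message per collection
```

Each of these messages is then posted to the IFTTT Maker webhook, which forwards it to Pushover.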
In the end, I have the Heroku instance running and set up two recipes on IFTTT:
To make it a little more secure, the application is protected by an API token. So feel free to launch your own instance of erz-trigger, just make sure to configure your own token and provide your own IFTTT API key.
At this year's OpenData.ch conference in Lausanne, I presented the current state of the Swiss federal open data portal opendata.swiss and gave some technical insights into how we built certain components.
04.06.2016 Civic Hacking @ ImNeuland
At the "Informatiktage 2016" there were several events, and I spoke at the ImNeuland conference about civic hacking. My talk was mainly about why I consider myself a civic hacker, why my curiosity drives me to explore things, and how data can help us explain the world.
Here are my slides:
If you are like me, chances are you have a vagrant setup for each project you're working on. It's a great way of sharing a specific system configuration, especially if you work in a team of developers. In my team, people run different operating systems and setups in general. So vagrant is a real life-saver when it comes to developing together and agreeing on a setup for a specific project.
But there are some pain points. If you turn off your machine but forget that you have a vagrant box running, the shutdown gets stuck, because it waits for the virtual network interface to be freed, which never happens. To solve this issue, it's important to know which instances are still running. If you only have one or two setups, this might be easy. But it soon becomes a tedious task to keep track yourself.
Luckily, vagrant provides us with a nice command to show all boxes on a machine called `vagrant global-status`.
This shows all instances that are known to vagrant. The results are cached and only updated from time to time. In my case, it didn't recognize all of my instances. Unfortunately, the only way to change that is to rebuild the box.
What I did instead is create a little script that collects its own data (based on the existence of `.vagrant` directories).
This gives a more fine-grained view of the system and I have better control over when the data is updated.
vagrant-status collects the data and updates the cache (a slow operation).
I run it every 15 minutes as a cronjob to keep the cache up-to-date.
vagrant-cache simply displays the result of the last run of vagrant-status.
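The actual scripts are on GitHub; as an illustration of the idea only (not the real code), here is how the core step, parsing the output of `vagrant status` into machine states, could look in JavaScript:

```javascript
// Illustrative sketch: parse the output of `vagrant status` into a
// name -> state map. Expected line format: "<name>   <state> (<provider>)".
function parseVagrantStatus(output) {
  const states = {};
  for (const line of output.split('\n')) {
    // capture name, state (possibly multi-word like "not created"), provider
    const match = line.match(/^(\S+)\s+(\S+(?: \S+)*)\s+\((\w+)\)\s*$/);
    if (match) states[match[1]] = match[2];
  }
  return states;
}

// Typical `vagrant status` output
const sampleOutput = [
  'Current machine states:',
  '',
  'default                   running (virtualbox)',
  '',
  'The VM is running. To stop this VM, you can run `vagrant halt` to',
  'shut it down forcefully, or you can run `vagrant suspend` to simply',
  'suspend the virtual machine.',
].join('\n');

console.log(parseVagrantStatus(sampleOutput)); // { default: 'running' }
```

A cronjob runs this in every project directory and writes the result to a cache file, which the display command then just prints.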
You can find the scripts on GitHub.
As I already mentioned before, I'm not a big fan of E-Voting.
Grünenfelder warned that interruptions endanger the spread and acceptance of E-Voting more than looming security risks do. It is therefore necessary, he said, that the federal government, the cantons and the Federal Chancellery continue to push E-Voting and the digitalization of politics forward. "The success story will continue," said Grünenfelder. "Despite all the naysayers."
As long as we don't have an open source solution that is accessible to everyone and that makes it possible for everyone to verify that their vote has been cast, we can stop all further discussions.
My fear is that E-Voting will just become an accepted reality, without any discussion about it. I'm not against technology, not by any means. But we need to have a discussion as a society about what this means for us, and whether we are okay with the consequences. Now is the time to discuss this. Not when all the systems are in place and everybody already uses them. Or when they get hacked for the first time.
Update: Someone mentioned to me that we have the option to vote by snail mail and that my criticism is invalid, as this way of voting is just as bad, but already accepted. For the record: just because there is another way of casting my vote that is equally bad and hard to control/verify does not mean that the criticism is invalid. It only means that we have to think about this other way as well. And that the criticism may apply to it as well.
01.12.2015 Open Data MashUp @ Impact Hub
Last week I had the opportunity to talk at the Impact Hub in Zurich about Open Data. This event was part of a group of events in different locations of Impact Hub, called MashUp.
The talk was held in the Pecha Kucha format, which means you have 20 slides and 20 seconds for each slide. Because of this condensed form of giving a talk, there is not much room for text on the slides; they mostly contain images, graphics etc. This is just a warning, as the slides without my talk might not be very useful. Feel free to either fill the gaps yourself or contact me if you need more information or want to hear the talk again.
Download the PDF of my slides or see them directly here:
17.09.2015 Why I'm against E-Voting
In Switzerland, we'll have the national elections coming up on 18 October 2015. It's that time again when the voices are getting louder. Especially in these moments, but actually before every political vote in Switzerland (which happens several times a year), there seems to be a rising interest in the subject of E-Voting.
As I work in IT, people often assume I'm waiting for the moment when it's finally allowed (actually, there are two cantons that have the permission, but I don't live in one of them). But it's actually the opposite. And it's probably because I work in IT. I've had this conversation several times already, so I thought it was time to write down my main arguments against the most common objections.
Feel free to correct me :)
I can haz vote on my smartphone on my way to work?
Democracy is hard. It's not designed to be easy, and voting is the crucial core of democracy. Voting is a public act that must be accountable while preserving each person's voting secrecy. And to achieve that, we already have a perfect system in place: you can either vote in person or send the ballot by mail.
It's not that hard, but it takes some time. We're not talking about a Facebook "Like" click or a Tinder "Hot or Not" swipe; we're talking about our political rights and our duty as citizens not to easily give up the power we have.
E-Voting will increase participation in elections; especially the young will vote if there is E-Voting
Bullshit! If someone doesn't care about voting, they will not start doing so just because they get another channel. Even the Federal Chancellery had to admit that participation didn't increase in their tests (see also here, here and here).
E-Banking is secure, why should E-Voting be less secure?
First of all: E-Banking is not secure. The banks try their best to make it so, but there are so many ways to compromise a system that it's impossible to call any of them secure. But the banks are willing to take the risk. That is up to them, and I myself am a happy E-Banking user.
I do this knowing that if something goes wrong, it can be fixed. Or if I lose money due to my own stupidity, so be it. I'm willing to take that risk as well.
But none of this applies to a core element of democracy. There is no one who covers for a fraudulent election. The risks that are inherent to using computers are simply not acceptable here.
Fraud is also possible in the classic voting system. You could count wrong. Or enter the wrong number when transferring it to the next department. But E-Voting would make it possible to commit systematic fraud.
All you would have to do is attack a single point. If you can change the software, you might be able to make it count differently. Or you could run a Denial-of-Service attack and prevent people from voting online.
The software they use is security-audited and state-of-the-art
Yes, I'm sure about that. But that doesn't change a thing. It only means that it's hard to do, not impossible. Plus, we can't be sure that the auditors catch everything. Of course, a good way to avoid or mitigate this kind of problem is to use open source software: I must be able to read its source code. So we can stop talking about all those closed-source solutions, where we'll never be able to tell if they're doing what they should be doing. The consequence, however, is that only people who can code can verify the E-Voting system, whereas the current (paper-based) system is immediately clear to everybody.
There is a famous competition called "Underhanded C". The goal is to write innocent-looking code that passes visual inspection but has a harmful side effect. So it's definitely possible to do. It's only a matter of time.
But let's assume for a moment that this perfect piece of software exists, that it has been audited, and that it works and does no evil. We still have a problem: there is no way of telling what code currently runs on a computer. So maybe the software was replaced between the review and the runtime. Or it could have been changed at runtime. We simply cannot say. If we could, we would have solved the Halting problem (see also this excellent explanation of why this is the case).
We as a society have to decide whether we're willing to take the risk that the E-Voting system might get hacked. That it might be broken. That it might have been compromised. The answer for me is clear: it's broken by design, and it's way too important to take chances here.
Let's keep the current system
The current system is not broken. Why should we replace it with a broken one? The current system allows us to watch the voting process. We can see the ballots, we can count them.
By hiding all that inside a computer, there is nothing left to watch. We can only trust the system. It's way too important to just leave it at that.
That said, I'm always open to discussing alternatives. Maybe a time will come when we discover something fundamentally new that makes it possible. But this new thing is not just another algorithm. If it's based on the same principles as described above, it is not acceptable for a democracy.
Even though I'm not an education expert and don't really have a professional background in didactics, I have some experience when it comes to teaching people how to code, especially people who have never written a single line of code before.
The topic of teaching has accompanied me for some time now. For one, the most obvious one: I'm an apprentice trainer. This means it's part of my job to teach our young apprentices how to program. But it doesn't stop there. I'm quite active on StackOverflow (although one could argue this doesn't count as teaching). Then last year, I got the chance to hold a workshop at the Jugendmedientage to teach programming to young journalists who had never coded before. All these experiences led me to believe that:
- Programming can be learned
- It takes time, effort and patience for both the teacher and the student
- Curiosity is the key
I'm writing about this topic because not long ago we were discussing it at work. One of our project managers said she wanted to learn at least a bit of programming in order to understand what her team is doing. Intuitively, I disagreed. I argued that it's not the responsibility of a project manager to understand programming. Everybody should focus on their main responsibilities and skills.
But then again:
- Wouldn't it be nice to have a common understanding of some basics?
- Wouldn't it be a tremendous help if a PM could tell whether a problem is hard or easy?
- Or if a PM understood some of the technical terms that get thrown around the room all the time?
- And why should anybody stop someone from learning something they are clearly interested in?
But it's not just about PMs, and it's not just about my workplace. It's nice to be able to code. To know that this task you've been doing manually for so many years could be written in just a few lines of code. Your life can get easier.
One of the projects I really like is OpenTechSchool. It aims to create free online learning material for everybody to learn about programming and related topics. Apart from the docs, there are workshops, free of charge, led by volunteers like me.
The goal of such workshops is not to become a super-hero-programming-ninja. It's about understanding how a programmer tackles a problem, learn new skills and grasp some basic concepts.
Everybody can learn to code. If they want to. I'm happy to help.
01.01.2015 Why and how I created the OpenERZ API
The holiday season is always a good time to reflect and sort out ideas and thoughts. A number of reasons led me to create the OpenERZ API.
It all started about a month ago, when I got the new calendar of Entsorgung und Recycling Zürich (ERZ) with the dates for the different waste collections. This is a physical paper thingy that is sent to all households in Zurich. Attached was a note that ERZ has its own iOS and Android app, so you have the same data in electronic format on your personal smartphone.
That's when it got interesting. I knew that ERZ had already published their data as CSVs on the Open Data portal of the City of Zurich. I also knew that my fellow open data enthusiasts had already created a smartphone app with that data. What I didn't know at that point was whether there had been any kind of communication between those parties or how this all fits together.
Via Twitter, I got confirmation that there was indeed no communication. Unfortunately, ERZ did not contact the original developers and did not engage with the official Open Data team of the City of Zurich. André Golliez wrote an "open letter" to the CEO of ERZ. After that, there were replies, meetings etc. to clear things up.
What really caught my attention was the reply of SR Filippo Leutenegger, where he stated:
"The database is at the center; the app «Sauberes Zürich», with its three operating systems, is one output of that database."
So they created a database, which makes it easier for them to update the data. And this database is used for the print products, the apps and possibly other clients like their website (I'm not sure if that's the case, though). For me, it was clear that there is a need for an API on top of this data, no matter which client uses it. A database is a good start, but an API is how data becomes really useful and easy to integrate.
My first thought was that the developers of the new app had already created an API, but just hadn't told anyone. That would have been perfect, as all I would have had to do was find this hidden API and create a public proxy to it. To investigate, I downloaded the app, set up a Charles proxy and looked at the communication (thanks to my colleague Franz for pointing me in the right direction here!).
I quickly found the calls of the app:
- ... etc.
So there we have all the data.
metadata.json contains a list of available files, which are then requested by the client.
It looks like an export from the above-mentioned database, which is loaded when the app starts.
Of course, this is a kind of API, but not one that you can query for single records.
At this point, I decided to create the OpenERZ API. The JSON files from the app are not very clear and self-explanatory (they were not designed with that in mind). So I decided to use the CSVs from the Open Data portal as a basis instead. It was made clear by SR Filippo Leutenegger that ERZ wants to continue providing this data, so I think it's a safe choice to build the API on top of that.
The OpenERZ API is a node application that runs on Heroku. The process is quite simple: the CSVs are converted to JSON and transformed into a clear format. After that, the nicely formatted JSON is imported into MongoDB as documents. This later allows us to query the data easily. And by using MongoDB, it's easy to extend the data model at a later point without breaking the application.
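The CSV-to-document step can be sketched roughly like this. This is only an illustration, not the real import code; the column names (`PLZ`, `Abholdatum`) and the semicolon separator are assumptions about the ERZ CSVs:

```javascript
// Hypothetical sketch of the CSV -> JSON transformation. Column names
// and the ';' separator are assumptions, not the actual ERZ format.
function csvToDocs(csv, wasteType) {
  const [header, ...rows] = csv.trim().split('\n');
  const cols = header.split(';');
  return rows.map((row) => {
    const values = row.split(';');
    const doc = { type: wasteType };
    cols.forEach((col, i) => {
      if (col === 'PLZ') doc.zip = Number(values[i]);       // ZIP code as number
      if (col === 'Abholdatum') doc.date = values[i];       // collection date
    });
    return doc;
  });
}

const csv = 'PLZ;Abholdatum\n8005;2015-01-13\n8006;2015-01-14';
const docs = csvToDocs(csv, 'cardboard');
// docs can now be inserted into MongoDB, e.g. collection.insertMany(docs)
console.log(docs[0]); // { type: 'cardboard', zip: 8005, date: '2015-01-13' }
```

Storing one document per collection date keeps the queries simple and leaves room to add fields later.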
The routing part of the API is built using the Hapi framework and its ecosystem (Joi, hapi-swagger etc.). After some iterations, the current API gained some features:
- routes for all the different waste types
- a list of stations
- data can be filtered and sorted by different parameters
- GeoJSON of all the cargo and e-tram stops
- for the sake of completeness, the JSONs of the app are available as well, at the /export route
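As a rough illustration of the filtering and sorting (independent of the actual Hapi handlers), the idea behind the query parameters could look like this; the parameter names `zip` and `sort` are assumptions for this example:

```javascript
// Illustrative sketch of applying filter/sort query parameters to the
// stored documents. Parameter names are assumptions, not the real API.
function queryDocs(docs, { zip, sort } = {}) {
  let result = docs;
  if (zip !== undefined) result = result.filter((d) => d.zip === zip);
  if (sort) {
    // copy before sorting so the original list stays untouched
    result = [...result].sort((a, b) =>
      a[sort] < b[sort] ? -1 : a[sort] > b[sort] ? 1 : 0
    );
  }
  return result;
}

const collections = [
  { zip: 8006, date: '2015-01-14', type: 'paper' },
  { zip: 8005, date: '2015-01-13', type: 'paper' },
];

const filtered = queryDocs(collections, { zip: 8005 });
const sorted = queryDocs(collections, { sort: 'date' });
console.log(filtered.length); // 1
console.log(sorted[0].zip);   // 8005
```

In the real API, Joi validates these parameters before the query ever reaches MongoDB.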
All the code is on GitHub, feel free to contribute, either with code, ideas or bug reports and of course use the API for whatever you want :)
18.09.2014 Workshop about Open Data Technologies
At today's Opendata.ch Conference 2014, I had the opportunity to talk about technologies around the topic of open data. I gave a quick overview of the different approaches that are being used, including CKAN, The DataTank and Data Central.
Here are my slides (German only, sorry!):
In the discussions that followed, we talked about why we need Open Data portals (conclusion: mainly for the PR) and whether we prefer to wait until a data provider has "cleaned" its data or rather get it fast, so that the community can help clean it up (consensus: quick and dirty).
For me, today was a really fun conference. I met lots of new people, caught up with familiar faces and listened to a lot of great talks.
Update: The videos of the conference have now been published. My session was not recorded, but at the end I was able to quickly present our findings (again, German only):