I was asked to give a workshop about Open Data. The audience was very broad, from young to old, from novice to tech-savvy, but they all had very little knowledge about Open Data.
It was a very good experience to explain the field of my work, and one of my biggest private interests, from the ground up.
My approach was to break down this very complicated topic into 5 areas, and I tried to encourage the participants to actually "get their hands dirty" by diving into an open data portal. In the end, 60 minutes is a very short time, but I hope I made a lasting impression and that open data is no longer an "unknown" in their heads.
12.11.2016 – Jugend hackt Schweiz: GitHub & Git
I was co-organizing "Jugend hackt" in Switzerland this year. If you want more information about the event, read the excellent blog post about it, watch the video of the final presentations or read the article by SRF (including an audio interview with me :) ).
During the event there was also a series of lightning talks on a variety of topics. I gave a talk about "GitHub & Git"; the slides are on my GitHub account or available online, and they are actually a fork of a similar talk from a previous "Jugend hackt" event.
This year CKANCon took place in Madrid, prior to the International Open Data Conference. It was a one-day, rather tech-focused conference. As CKAN is the framework I use almost daily, and I know a lot of people from the mailing list, it's good to meet them in person from time to time.
My talk was about the CKAN/WordPress integration we did for opendata.swiss. You can find the slides below.
Update: In the meantime, all CKAN and WordPress plugins have been open-sourced and can be found on GitHub!
I was invited by the Hacks/Hackers meetup to give a talk about my passion, open data, so I decided to mash up my previous talks and speak about both my work (building open data portals) and my private interest (being an open data activist and a civic hacker). It was a very interesting experience: there were a lot of questions in the Q&A after my talk, and even afterwards over drinks at the venue.
Thank you to the organizers for having me, and to everybody who showed up!
As always, here are my slides:
Some time ago, I built the OpenERZ API (and wrote about it on this blog). The API provides easy access to the waste collection data of the City of Zurich, provided by Entsorgung + Recycling Zürich (ERZ) via the Open Data Portal of the City of Zurich.
OpenERZ provides iCal calendars, so you can create a custom calendar for the waste types and the ZIP code you're interested in and add it to your digital calendar.
But for me, the killer feature is getting push notifications. There used to be a free SMS service from ERZ to remind you of the waste collection (e.g. to put your cardboard in front of your house, so that it gets collected the next day). Unfortunately, they shut down this service and launched their own app.
So here I am, with an API that has all the data and a need for push notifications. Instead of building everything from scratch, I decided to look into existing solutions. The first thing that came to mind was IFTTT, the universal recipe tool for connecting different services.
It has almost everything I need:
- A time based trigger, to run something every day (think cron)
- A so-called channel to make and receive web requests (it's called "Maker" on IFTTT)
- A channel for Pushover, a service to receive push notifications on your phone (there are other options to do this, but this is my current choice)
To glue these together, all that's needed is a little script that makes API requests to OpenERZ and transforms the results into single messages that can be received as push notifications. For this purpose I created the erz-trigger project. It's a small node application that I deployed to Heroku.
If you make a GET request to the application, it calls the OpenERZ API for the waste collections of the following day with the given ZIP code. It then makes a call to the Maker channel webhook for each waste collection returned by the API. IFTTT takes care that I get a push notification whenever the webhook is called.
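The flow above can be sketched in a few lines of shell. This is an illustrative sketch, not the actual erz-trigger code: the OpenERZ endpoint path, the JSON field names and the IFTTT event name ("waste_collection") are assumptions.

```shell
# Build the JSON payload the IFTTT Maker webhook expects
# (value1 is used as the notification text).
build_payload() {
  printf '{"value1": "%s"}' "$1"
}

# Fetch tomorrow's collections for a ZIP code from OpenERZ and fire
# one webhook call per returned waste type.
notify_tomorrow() {
  local zip="$1" ifttt_key="$2"
  local tomorrow
  tomorrow=$(date -d tomorrow +%Y-%m-%d)  # GNU date

  # Endpoint and jq filter are assumed for illustration.
  curl -s "https://openerz.metaodi.ch/api/calendar.json?zip=${zip}&start=${tomorrow}&end=${tomorrow}" \
    | jq -r '.result[] | .waste_type' \
    | while read -r waste_type; do
        curl -s -X POST \
          -H 'Content-Type: application/json' \
          -d "$(build_payload "Tomorrow: ${waste_type} collection")" \
          "https://maker.ifttt.com/trigger/waste_collection/with/key/${ifttt_key}"
      done
}
```

An IFTTT time trigger (or a plain cron entry) would then invoke this once a day with your ZIP code and Maker key.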
In the end, I have the Heroku instance running and set up two recipes on IFTTT:
To make it a little more secure, the application is protected by an API token. So feel free to launch your own instance of erz-trigger, just make sure to configure your own token and to provide your IFTTT API key.
At this year's OpenData.ch conference in Lausanne, I was able to present the current state of the Swiss federal open data portal opendata.swiss and give some technical insights into how we built certain components.
04.06.2016 – Civic Hacking @ ImNeuland
At the "Informatiktage 2016" there were several events, and I spoke at the ImNeuland conference about civic hacking. My talk was mainly about why I consider myself a civic hacker, why my curiosity drives me to explore things, and how data can help us explain the world.
Here are my slides:
If you are like me, chances are you have a vagrant setup for each project you're working on. It's a great way of sharing a specific system configuration, especially if you work in a team of developers. In my team, people run different operating systems and settings in general, so vagrant is a real life-saver when it comes to developing together and agreeing on a setup for a specific project.
But there are some pain points. If you turn off your machine but forget that a vagrant box is still running, the shutdown gets stuck, because it waits for the virtual network interface to be freed, which never happens. To solve this issue, it's important to know which instances are still running. If you only have one or two setups, this might be easy. But it soon becomes a tedious task to keep track yourself.
Luckily, vagrant provides a nice command to show all boxes on a machine: vagrant global-status.
This shows all instances that are known to vagrant. The results are cached and only updated from time to time. In my case, it didn't recognize all of my instances. Unfortunately, the only way to change that is to rebuild the box.
What I did instead was create a little script that collects its own data (based on the existence of a .vagrant directory in each project folder).
This gives a more fine-grained view of the system and I have better control over when the data is updated.
vagrant-status collects the data and updates the cache (slow operation).
I run it every 15 minutes as a cronjob, to keep the cache up to date.
vagrant-cache simply displays the result of the last run of vagrant-status (fast operation).
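The pair of scripts could look roughly like this. It's a minimal sketch, not the actual code from the repository: the cache location, the projects directory and the detection via a .vagrant directory are assumptions.

```shell
# Assumed locations; the real scripts may differ.
CACHE="${CACHE:-$HOME/.vagrant-status-cache}"
PROJECTS_DIR="${PROJECTS_DIR:-$HOME/projects}"

# vagrant-status: slow operation. Ask vagrant for the state of every
# project that contains a .vagrant directory and rewrite the cache file.
vagrant_status() {
  : > "$CACHE"
  for dir in "$PROJECTS_DIR"/*/; do
    if [ -d "${dir}.vagrant" ]; then
      state=$(cd "$dir" && vagrant status --machine-readable \
        | awk -F, '$3 == "state" { print $4 }')
      printf '%s: %s\n' "$(basename "$dir")" "$state" >> "$CACHE"
    fi
  done
}

# vagrant-cache: fast operation. Just print the cached result of the
# last vagrant_status run.
vagrant_cache() {
  cat "$CACHE"
}
```

Running vagrant_status from cron every 15 minutes keeps the cache fresh, while vagrant_cache answers instantly from the cache file.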
You can find the scripts on GitHub.
As I already mentioned before, I'm not a big fan of E-Voting.
Grünenfelder warned that interruptions endanger the spread and acceptance of E-Voting more than any looming security risks do. It is therefore necessary, he said, that the federal government, the cantons and the Federal Chancellery continue to push E-Voting and the digitalization of politics forward. "The success story will continue," said Grünenfelder. "Despite all the naysayers."
As long as we don't have an open source solution that is accessible to everyone and that makes it possible for everyone to verify that their vote has been cast, we can stop all further discussions.
My fear is that E-Voting will just become an accepted reality, without any discussion about it. I'm not against technology, by any means. But as a society we need to discuss what this means for us, and whether we are okay with the consequences. Now is the time to discuss this. Not when all systems are in place and everybody is already using them. Or when they get hacked for the first time.
Update: Someone mentioned to me that we have the option to vote by snail mail and that my criticism is invalid, as this way of voting is just as bad, but already accepted. For the record: just because there is another way of casting my vote that is equally bad and hard to control/verify does not mean that the criticism is invalid. It only means that we have to think about this other way as well, and that the criticism may apply to it, too.
01.12.2015 – Open Data MashUp @ Impact Hub
Last week I had the opportunity to talk at the Impact Hub in Zurich about Open Data. This event was part of a group of events in different locations of Impact Hub, called MashUp.
The talk was held in the Pecha Kucha format, which means you have 20 slides and 20 seconds per slide. Because of this condensed form, there is not much space for text on the slides; they are mostly images, graphics, etc. Consider this a warning, as the slides without my talk might not be very useful. Feel free to either fill the gaps yourself or contact me if you need more information or want to hear the talk again.
Download the PDF of my slides or see them directly here: