Ubiquitous urban sensors can give us rich information about the world around us, such as current temperature, humidity, air quality, or noise levels. This information can help us make decisions in our everyday lives. We may, for example, decide to take a different route to work if the air quality on our commute is particularly bad, or we may want to open a window if humidity or CO levels are rising indoors. On a higher level, such information can make us more aware of our environment and how we contribute to it. While sensor platforms are commercially available and sensors have been placed in our cities and streets, data from these sensors is usually only available through websites or is stored in large data sheets that are hidden from, or unengaging to, inexperienced users. Furthermore, direct representations of sensor readings do not always make sense to us: what unit are readings presented in, and when are readings out of the ordinary?
Physikit is a toolkit that makes users’ data visible and tangible through physical and embedded data visualizations called PhysiCubes (Figure, right). Physikit consists of (i) a number of PhysiCubes, each providing one unique physical visualization such as movement, light, air, or vibration, and (ii) a web-based end-user configuration tool that allows users to quickly and easily connect data sources (Figure, left) to the PhysiCubes using a touch-enabled interface (Figure, center). Users can explore, interpret, and engage with different kinds of data by setting up simple rules for a variety of physical ambient visualizations. A key research question this raises is whether allowing users to program the mapping between data and physical visualizations empowers them to explore, engage with, and understand a diversity of data.
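To give a flavor of the kind of rule a user might set up, the sketch below maps an air-quality reading onto the brightness of a light cube. This is an illustrative assumption only: the function name, thresholds, and brightness scale are hypothetical and do not reflect Physikit's actual configuration interface or API.

```python
# Hypothetical sketch of a Physikit-style rule mapping a data source
# (an air-quality reading) to a physical visualization (light intensity).
# All names and thresholds here are illustrative assumptions.

def brightness_for_air_quality(aqi, low=50, high=150):
    """Map an air-quality index reading to a light brightness in [0, 1].

    Readings at or below `low` give a dim ambient glow (0.1); readings
    at or above `high` give full brightness; values in between scale
    linearly, so the cube becomes more insistent as air quality worsens.
    """
    if aqi <= low:
        return 0.1
    if aqi >= high:
        return 1.0
    return 0.1 + 0.9 * (aqi - low) / (high - low)
```

A mapping like this keeps the raw sensor value out of sight and instead answers the question the introduction raises ("when are readings out of the ordinary?") through an ambient cue whose intensity users can tune themselves.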
Videos and Pictures