The Blog

CHI 2015 Workshop: Beyond Personal Informatics

Alongside Chris Elsden, Dave Kirk and Chris Speed, I am involved in organising a workshop for CHI 2015, entitled ‘Beyond Personal Informatics: Designing for Experiences with Data’.
The goal of the workshop is to take a more critical look at Personal Informatics, and open up new opportunities for both capturing and designing with this kind of data, beyond its current narrow scope.
You can read our full proposal here, but the call for participation is pasted below.

======================================================
Beyond Personal Informatics: Designing for Experiences with Data
Workshop at ACM CHI 2015 Conference, Seoul, Korea

Submission deadline: 5th January 2015
Notifications: 6th February 2015
Workshop date: Saturday April 18th 2015
Website and details: http://di.ncl.ac.uk/beyondpersonalinformatics/
Contact: c.r.elsden@ncl.ac.uk
========================================================

A ‘data-driven life’ – the subject of five previous CHI workshops about self-tracking, personal informatics (PI), and the ‘Quantified Self’ – appears increasingly possible and popular. While these workshops have undoubtedly moved the field forward, we argue we should now develop a more critical understanding of the experience of living with, and by, data – rather than focusing only on the utility and efficiency of these technologies for behaviour change or health management.

We propose a workshop that looks beyond personal informatics, broadening and remapping a design space to consider the situated experience of a data-driven life. The workshop will be interactive and future-focused, seeking to critically challenge existing narratives and identify new design opportunities. How does PI become a social concern? How does the value of data evolve over many years? How can a ‘Quantified Self’ be represented besides graphs and numbers?

We invite participants from a range of backgrounds and practices to submit position papers, which may include, but are not limited to:

• Critical reflections on the design of PI or ‘Quantified Self’
• Case studies of existing experiences with PI
• Design or deployment of bespoke PI systems
• Speculative scenarios or design fictions including PI

Submissions should be 4-6 pages in Extended Abstract format. However, we are flexible with regard to how you use this space, and would, for example, welcome submissions in the pictorial format introduced at DIS 2014 if this better suits your work. These should be sent to c.r.elsden@ncl.ac.uk. The organising committee will review submissions, with particular attention to how they extend current thinking on personal informatics.

At least one author of each accepted submission must register for the workshop and at least one day of the conference. For approximate pricing, see CHI 2014 rates.

Organisers:
Chris Elsden, Culture Lab, Newcastle University
David Kirk, Culture Lab, Newcastle University
Mark Selby, Design Informatics, University of Edinburgh
Chris Speed, University of Edinburgh

Photobox: Best Paper Award at CHI 2014

Photobox Cat

Photobox was recently the subject of a paper that won a Best Paper award at CHI.

Will Odom, a researcher at Carnegie Mellon, ran a study in which he deployed three Photoboxes in three households for 14 months each. After wading through a lot of data, he wrote a great paper about what he found.
The project and paper were a collaboration between researchers at Microsoft Research, Carnegie Mellon and Culture Lab, Newcastle University. You can read ‘Designing for Slowness, Anticipation and Re-visitation: A Long Term Field Study of the Photobox’ here.

As well as contributing to this paper, I’m really happy to have designed something that was part of a few people’s lives for over a year, and that gave them a good experience while prompting them to think more critically about their relationships with the technologies they use, and the ways they record and revisit their experiences.

Late Labs: The Internet of Things That Matter

smashed

On the 10th of April I’ll be giving a short talk and showing a version of The Earthquake Shelf at an event called The Internet of Things That Matter. It will take place at 7:00pm in the Informatics Forum, 10 Crichton Street, Edinburgh EH8 9AB.

The event is one of a series of Late Labs that are part of the Edinburgh International Science Festival, in collaboration with New Media Scotland and the University of Edinburgh’s School of Informatics.

How does data change our relationship with physical ‘things’? The Internet of Things exploits new technologies to link physical artefacts with data across social and technical networks. Join the Design Informatics Research Group to explore this new technology.

From teapots that you can haggle with in Oxfam shops, and shelves that shake when earthquakes strike on the other side of the world, to clocks that print you a postcard of something that happened in the past: let’s reflect on the implications for our social lives.

Do come along if you can – there are lots of other great projects from Design Informatics lined up for the evening and, I’m sure, some great talks and conversations.

Design Informatics and Learning Energy Systems

start new job

For the last few months I have been writing up my PhD thesis, and while that may be pushing what can acceptably be described as a ‘few’, the end appears to be in sight. I’ll post more about that when the time is right, but for now there is other news. On the 6th of January I received the above notification, touched ‘OK’, and dragged myself out of the black hole to start a new job.

I am very happy to have joined the Centre for Design Informatics at Edinburgh College of Art, University of Edinburgh. It’s an exciting department, with some very original and refreshing ideas about the opportunities that can come from designing with data and physical things. For the next 18 months or so I’ll be working as a Postdoctoral Research Fellow on the Learning Energy Systems research project. There’ll be lots more information on the project website in due course, but for now here’s a short explanation:

This project will develop a collective ‘Learning Energy System’ involving people, objects, data and machines. Central to this is a digital system designed to align human needs and comfort with building energy systems, with the aim of reducing overall energy demand. This project differs from many energy reduction projects: the building user, as a sensor of conditions, a driver of energy demand, an individual and a collective, is at the heart of the ‘Learning Energy System’.

The project is situated in a couple of schools in Scotland. The Building Management Services that run these school buildings collect energy use data, but the students, teachers and staff that ‘use’ the school building, and the energy, have no real access to that data or any engagement with the ways the energy is used. We are interested in learning about the reality of how energy is used in those schools: how energy is tied into the social and educational lives of the schools’ inhabitants, and the value(s) placed on its use in such a context. Through a process of design-led research we hope to reveal the complexities and messiness of this energy system, at the centre of which are the building users. Then, turning the schools into ‘Living Labs’, we will use co-design methods to develop what a Learning Energy System might be, and to find ways that energy systems might fit meaningfully into such complex settings to better support and engage the communities within them.

This is a really exciting project for me to be a part of. Though I’ve worked with similar themes and processes before, this represents a big step, and I’m really looking forward to getting out of my comfort zone to learn about new kinds of design practice, new communities of people, and new ways of thinking about technology. Of course, this also means that I have moved to Edinburgh. I love getting to know a new place, and am having a great time exploring this beautiful and intriguing city.

Timestreams In Brazil

While the extended British winter has me thinking of warmer, sunnier places, I remembered that I never really posted about the work I did in Brazil back in October. I was lucky enough to go there for a two-week residency with Active Ingredient for Timestreams – a joint research and development project between the RCUK’s Horizon Digital Economy Research Hub at the University of Nottingham and Active Ingredient.

Timestreams is a WordPress plugin that allows you to import, edit, overlay and compare different sets of live and pre-recorded data in timelines, much as you would in video editing software. This lets you compose different streams of data and play with the relationships between them, or simply use it as an easily accessible repository of live data.

Our role in the project was not to produce polished, finished pieces, but instead to explore the possibilities and capabilities of the platform by creating artistic experiments in response to the environmental data we collected.

We spent the first week of our trip on a farm in the Mata Atlantica called Vera Cruz (top image), just outside Miguel Perriera, where we explored, gathered data and tested ideas for things to make. As it turned out, besides being beautiful, our remote setting offered some valuable reminders about our creative process when working with data. Surrounded by nature, and faced with a total lack of web connectivity, we were eventually forced to come up with more analogue, and to my mind more creative, approaches. Trying things out quickly and simply is much more valuable when you are trying to get a feel for the stories you want to tell. It requires relatively little investment and risk, and in doing so allows you to be more agile in your creative process and decision making.

One example of this was an idea of Rachel’s that came from the possibility that climate change might bring about more frequent instances of extreme weather. She wanted to build a prediction machine that monitored humidity and temperature data in order to offer advice on how to deal with this changing weather. Instead of trying to build the machine immediately, she decided to make these fortune empanadas.

They contained predictions and advice based on interviews she conducted with us and with local people. This was designed as a way to see what it felt like to receive such a prediction, without having access to the data needed to build the final machine. Fitting the predictions inside empanadas meant that they had to be brief while retaining their meaning. As a result, they were often cryptic or mysterious, scary or mundane. We wanted to retain these qualities, so when we did eventually build the prediction machine, we used the same predictions.

Week two of our residency led us back to Rio de Janeiro, where we took up residence at Barracao Maravilha, an artist studio and gallery space in the city. We spent this time developing what we had started testing during our first week on the farm, but were also keen to collaborate with the various artists who worked in the studio. For many of them it was their first encounter with using data, and they had some really interesting approaches to engaging with it, and with us. These collaborations were similar in spirit: because the artists at Barracao weren’t used to working with data, they brought ideas with them that we could test out equally quickly and combine with data relatively easily.

This record player belonged to Bruno, one of the Barracao artists. He’d had it for a while and had recently become interested in Arduino, so we had a perfect opportunity to work together and hook it up to Timestreams. First, though, we connected the Arduino to the motor that drives the turntable and used it to vary the speed. Bruno made this makeshift, but surprisingly effective, speaker out of a piece of paper and a sewing needle, and we were treated to a raucous, fluctuating rendition of Stravinsky’s Rite of Spring.

The result of this experiment was definitely intriguing and compelling, both as an object and as a performance of the music. So the next step was to get some data involved. We experimented with decibels, temperature, humidity and a few more, but we knew from the experiment that we wanted the speed of the record to fluctuate gradually up and down. We remembered the Mauna Loa CO2 data that we had come across in a previous project. This data set contains global CO2 levels from 1959 to the present day. Within each year the levels fluctuate seasonally, but the overall trend is a gradual, incremental increase. The effect of using this data was that the speed of the record fluctuated, but gradually got faster and faster – the pitch changing with it.
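
To give a rough sense of how simple the mapping can be, here is a minimal Arduino-style sketch of the idea. The pin number, the PWM drive and the handful of hard-coded ppm values are illustrative assumptions, not the code or figures we actually used; in the installation the values arrived via Timestreams rather than being baked into the sketch.

```cpp
// Minimal sketch: map CO2 readings (ppm) to turntable motor speed.
// Pin number and sample values are illustrative assumptions.

const int MOTOR_PIN = 9;            // PWM pin driving the motor controller

// A few illustrative annual-mean CO2 values (ppm), oldest to newest.
const int ppm[] = {316, 325, 338, 354, 369, 385, 395};
const int N = sizeof(ppm) / sizeof(ppm[0]);

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  for (int i = 0; i < N; i++) {
    // Scale the ppm range onto a usable PWM range, so the record
    // fluctuates but gradually speeds up as the values climb.
    int speed = map(ppm[i], 310, 400, 120, 255);
    analogWrite(MOTOR_PIN, constrain(speed, 0, 255));
    delay(2000);                    // hold each year's value for a moment
  }
}
```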

Two of the other Barracao artists, Hugo and Natali, make huge, brightly coloured inflatable structures and place them in various natural and man-made contexts. They made the one seen in the image above especially for us to experiment with. They had always wanted them to ‘breathe’ and move, so we attached the small fans that inflate them to an Arduino and, after some fine-tuning, we were turning the fans on and off to make them appear as though they were breathing by themselves. The movement and their bright, vibrant colours were reminiscent of the street life outside the gallery, so we wanted to use data to draw the link between them. We had a decibel sensor sending live data from the street to a Timestream, allowing the three inflatable structures to respond to the noise of the street.
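
The breathing behaviour amounts to little more than switching a fan on and off on a slow rhythm and letting a noise reading stretch or squeeze that rhythm. Below is a hedged sketch of the idea: the pins, the analogue sound sensor and the timings are assumptions for illustration, and in the installation the street noise arrived as a Timestream rather than from a sensor wired to the same board.

```cpp
// Sketch of the 'breathing' inflatables: the fan toggles on a slow rhythm,
// and a street-noise reading shortens the cycle as things get louder.
// Pin assignments and timings are illustrative assumptions.

const int FAN_PIN = 7;       // relay or transistor switching the inflation fan
const int MIC_PIN = A0;      // analogue sound-level sensor

void setup() {
  pinMode(FAN_PIN, OUTPUT);
}

void loop() {
  int noise = analogRead(MIC_PIN);              // 0-1023, louder = higher
  // Louder street -> faster breathing: half-cycle between ~8s and ~2s.
  long halfCycle = map(noise, 0, 1023, 8000, 2000);

  digitalWrite(FAN_PIN, HIGH);                  // inhale: fan on, structure inflates
  delay(halfCycle);
  digitalWrite(FAN_PIN, LOW);                   // exhale: fan off, it slowly sags
  delay(halfCycle);
}
```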

These prototypes reflect something of the environment in which they were constructed too – they are vibrant, warm, and at times ramshackle. A pretty fair reflection of our experience in Rio, and the area surrounding the studio we were resident in.

They also reflect an inherent tension in making ‘digital’ work, or at least work that relies on electricity and a wi-fi connection, in an environment where those infrastructures were themselves temperamental. In these circumstances we are still in the business of using data to create something engaging for an audience, which presents some interesting questions that we have encountered before. When some part of the technical infrastructure you are using breaks down, it leaves you with a difficult choice. First, you can hang an ‘out of order’ sign on the work and apologise for the technical failures. But this is always a bit embarrassing and ultimately gives your audience nothing to engage with, and nothing to take away. Alternatively, you can use some pre-recorded ‘backup’ data. While this at least gives the audience something to engage with, it also leaves you with an ethical dilemma. If the work is described as ‘live’, people can find out pretty quickly that it isn’t, and that you are misleading them. Also, and this is particularly true when using scientific data, you are in effect giving people information that is not accurate, and therefore potentially misleading.

So which is more important, the experience or the science?

If you are interested in finding out more about the project, there are more photos on the Active Ingredient Flickr, as well as more information on the Timestreams website.

At the time of writing it is not yet possible to get the Timestreams plugin, but stand by for news of its availability. If you are interested in using it yourself, the API is here.

Participants Wanted: Experiential Manufacturing

I am currently looking for people who have been affected by an earthquake to take part in a new design-led research project called Experiential Manufacturing.

Experiential Manufacturing is a system that uses real data and eyewitness accounts to make personal mementos of earthquakes. It is part of a research project from the Mixed Reality Lab at the University of Nottingham that uses scientific data like magnitude, duration and location, along with more personal descriptive accounts of events, to alter the material characteristics of existing objects.

The project investigates how data about our previous experiences could be used to shape the material qualities (touch, shape, feel) of otherwise meaningless objects so that they are better able to reflect our personal experiences and tell stories about our past.

During the course of the project, a set of bespoke domestic manufacturing systems will be produced and installed in participants’ homes. Using both scientifically recorded and anecdotal data from real earthquakes, the exact configuration and behaviour of the systems will be derived from participants’ descriptions of the material and sensory conditions during their own earthquake experiences. The Experiential Manufacturing system will then act as a digital manufacturing service, creating devices and objects that aim to be evocative of aspects of these experiences in order to craft personal memorial artefacts. Each artefact produced by the Experiential Manufacturing system will be materially unique.
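
Purely as an illustration of the kind of mapping involved, and emphatically not the actual system, a minimal Arduino-style sketch might scale an event’s magnitude and duration onto the intensity and run-time of an actuator working on an object. The pin, the scaling and the actuator itself are hypothetical.

```cpp
// Hypothetical sketch: earthquake magnitude and duration drive the
// intensity and run-time of an actuator that works on an object.
// Pin, scaling and actuator are illustrative assumptions, not the
// actual Experiential Manufacturing rig.

const int ACTUATOR_PIN = 5;   // PWM pin driving a small vibration motor

void shakeObject(float magnitude, unsigned long durationSeconds) {
  // Map magnitude (roughly 4.0-9.0) onto PWM intensity.
  int intensity = map((int)(magnitude * 10), 40, 90, 60, 255);
  analogWrite(ACTUATOR_PIN, constrain(intensity, 0, 255));
  delay(durationSeconds * 1000UL);
  analogWrite(ACTUATOR_PIN, 0);
}

void setup() {
  pinMode(ACTUATOR_PIN, OUTPUT);
  shakeObject(6.3, 40);       // e.g. a magnitude 6.3 event lasting 40 seconds
}

void loop() {}
```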

If you think you, or anyone you know, might be interested in taking part, please head over to the site for more information about the project and what will happen if you take part: http://experientialmanufacturing.com/

IBI: Weather Maker

The Institute have been busy recently conducting weather modification experiments. We were invited by Phoenix Square in Leicester to undertake a week-long residency in their gallery. We used the time to conduct a series of weather modification experiments, as well as to collect research material about the facts, speculations and controversies surrounding the science. The devices we built, and the results of our research, have been left behind as an exhibition that will be up for the next couple of weeks.
There’s more information and content over here.

Camera Explora – Development

Tracing Mobility is almost over (it ends on the 12th), and it really has been a pleasure, if a little intimidating, to be part of an exhibition with so much great work.

Above is an image of it in the gallery, and I’m really pleased with how it turned out. The maps on the wall are those created by people as they used the camera to explore Berlin. That photo was taken not long after the show opened, so hopefully by now there are a few more.

Anyway, since it’s all up and running, I thought I’d post a few snippets of Camera Explora’s development.

The Camera

The camera itself has more or less been completely rebuilt. We got hold of some Android phones that are better able to cope with processing the images people take and logging co-ordinates to the server. Sam has also completely re-written the application so that it’s much more efficient. The UI is still very simple, with just three main screens and relatively few ‘choices’. Applying constraints to digital technology is a big part of this project, so we wanted to strip out most of the functionality associated with digital photography, especially the ability to review and edit on the fly.

The camera’s body is a 3D-printed case intended to make the smartphone feel a little more like a camera. This was again important in creating the desired experience of using a camera that does one job and relates to one experience of one place, rather than a smartphone that (excellent though they may be) does everything. This isn’t necessarily a comment on one being better than the other; more an attempt to find out what it means to have digital technologies with a specific purpose. The casing also serves the functional purpose of blocking access to the phone’s buttons, which means people can’t exit the app and play with a free smartphone for a few hours.

The Plotter

The plotter draws lines with felt pen onto a map of the city. These lines don’t show your exact route, but instead connect the locations where you take photos. This links the locations of things or events that you considered important, or worthy of attention, and therefore of recording.

Because the pen only draws when you take a photograph, it rests in one place until another picture is taken. During this time the ink from the pen bleeds into the paper, so the size of the dot becomes a rough indicator of the length of time that passes between photos – or between places and events of interest. The maps were custom-made using OpenStreetMap and printed on treated, inkjet-ready drawing paper.
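
For anyone curious about the logic, turning a photo’s GPS fix into a pen position is just a linear mapping from the map’s geographic bounds onto the plotter’s travel. This is a rough sketch of that idea; the bounds, step counts and function name are made up for illustration rather than taken from the actual system.

```cpp
// Sketch of mapping a photo's GPS coordinates onto plotter coordinates.
// The geographic bounds and step ranges are illustrative assumptions.

struct PenPosition {
  long x;   // steps along the X axis
  long y;   // steps along the Y axis
};

// Geographic bounds of the printed map (illustrative values).
const float LON_MIN = 13.30f, LON_MAX = 13.50f;   // longitude range
const float LAT_MIN = 52.48f, LAT_MAX = 52.56f;   // latitude range

// Usable plotter travel in motor steps (illustrative values).
const long X_STEPS = 4000, Y_STEPS = 2800;

PenPosition positionFor(float lat, float lon) {
  PenPosition p;
  p.x = (long)((lon - LON_MIN) / (LON_MAX - LON_MIN) * X_STEPS);
  p.y = (long)((LAT_MAX - lat) / (LAT_MAX - LAT_MIN) * Y_STEPS);  // Y grows downwards
  return p;
}
```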

The new version of the plotter is made from an A3 scanner. This gives you all the mechanical elements you need to run the X-axis, and it is driven with a pretty standard 4-wire stepper motor, so controlling it with an Arduino and motor shield was quite straightforward. The mechanism involves a few drive wheels, belts and some wire-and-pulley systems that slow and smooth the movement of the stepper.
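
Driving that axis is about as simple as stepper control gets. The sketch below uses the stock Arduino Stepper library directly rather than the particular motor shield we used, and the pins, steps-per-revolution and speed are assumptions for illustration.

```cpp
#include <Stepper.h>

// X-axis drive: a standard 4-wire stepper, here on pins 8-11.
// Steps-per-revolution, pins and speed are illustrative assumptions;
// the real plotter drove the motor through a motor shield.
const int STEPS_PER_REV = 200;
Stepper xAxis(STEPS_PER_REV, 8, 9, 10, 11);

void setup() {
  xAxis.setSpeed(30);          // RPM; kept low so the pen moves smoothly
}

void moveXBy(int steps) {
  xAxis.step(steps);           // positive = one direction, negative = the other
}

void loop() {
  moveXBy(400);                // nudge the pen carriage along...
  delay(1000);
  moveXBy(-400);               // ...and back, just to exercise the axis
  delay(1000);
}
```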

A flatbed scanner only drives one axis, so for the Y-axis I took the drive mechanism out of a smaller A4 scanner and attached it to the A3 scanner’s scan head using some custom-made 3D-printed brackets. The motor that came with this mechanism was a DC motor with an optical encoder on the back, rather than a stepper motor. These are really accurate, but unfortunately I couldn’t get a reading from the encoder – possibly why the scanner was being thrown out. Instead I got hold of a small stepper motor that could fit inside the scan head along with the drive wheel of the smaller scanner mechanism.

This version of the plotter has a stand so that it can be displayed as a stand-alone unit. I also wanted it to look and feel a bit more like a piece of furniture than last time.

It’s built using plain old pine timber and plywood, but to make it look a little more like something that would belong in a home I covered it with oak veneer. I like how it turned out – it’s intentionally quite retro-looking, which was partly to lend it a little domestic familiarity.

The Printer

As you walk around and take photos, a small photo printer hidden in the plotter display prints your photos. This happens as you take them.

The printer is a Polaroid Pogo. These are designed to work with mobile phones and certain cameras with PictBridge functionality, either over a USB connection to the camera or a Bluetooth OBEX transfer. A PC won’t recognise these printers over USB because it can’t run PictBridge, so the photos had to be sent over Bluetooth.

The printer is held inside the plotter display casing by a 3D-printed bracket, so it can easily be slid in and out when the paper needs replacing – and, because the printer only holds 10 sheets, that is quite often.

That’s a bit of a quick overview, but for the sake of brevity it will do for now. Along with Sam, who did all the Android development, many thanks also go to Mike Golembewski and Rob Mitchelmore for help with various bits of software development. It will soon be coming back from Berlin, and when it does I’ll get around to making a film about the project and posting up some of the maps and photos that have been created.

I will also be running some more controlled user trials for use in my PhD research. So if you’d like to have a go, get in touch!

Coming Up – Tracing Mobility Berlin

Camera Explora will soon be going to Berlin to take part in the Tracing Mobility exhibition at Haus der Kulturen der Welt from the 24th November to the 12th December 2011.
The exhibition will showcase about 20 works around the theme of cartography and migration in networked space.

Mobility has become one of the most important keywords in the discourse on globalisation, techno-economic change or the Information Society. The idea of nomadic, ‘mobile’ persons supported by spatial mobilisation of capital, goods and knowledge pervades politics and economics, technology and science, advertising and media, commerce and culture. The Tracing Mobility exhibition and symposium present a snapshot of the dynamic topography of this constant being-in-motion.

The prototypes have changed a lot since their last outing in Nottingham. I’m still working with Sam and we’ve managed to solve pretty much all of the hardware and software bugs (we’re working on the rest!), so hopefully the experience should run much more as intended. This means that we should be able to start getting some more useful feedback from people about how the technology influences their exploration and perception of the city. I’ll do a write-up about this after the event, as well as more detail about the prototypes.

It’s going to be really exciting to see people using them in a city I’m unfamiliar with – I’ve only ever seen it, or tried it out, in cities where I live, so hopefully I might even get to have a go myself!

Anyway, if you’re in Berlin any time between the 24th November and the 12th December, come along and say hi.

Visualising Climate Change … again

The next showing of Active Ingredient’s project ‘A Conversation Between Trees’ is coming up at the Rufford Gallery and Country Park in Nottinghamshire. It runs from the 13th September to the 30th October 2011, with members of Active Ingredient in residence until the 24th September.

There’s been a whole load of development since the last exhibition, with improvements to the environmental sensors and mobile apps, to name but a few. The Climate Machine I’m developing has also undergone significant improvement. A lot of the kinks are being smoothed out, not only to make it mechanically more efficient, but also to make it more robust and easier for us to run throughout the whole period of the show. We’ve also been playing around with how the machine draws out the data, trying to find the approach that we think offers the most truthful and interpretable impression of global CO2 levels since 1959.
To add to this, we also have a new data set based on Met Office predictions for future CO2 levels. This gives forecasts up until 2050, which will hopefully allow us to give an impression of a trajectory into the future, and to think about what these increasing CO2 levels might mean.
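
Just to give a flavour of the kind of processing involved, without giving the machine itself away, a sketch like the one below could splice the measured and predicted series together and normalise them onto the machine’s drawing range. The figures, the drawing range and the function name are placeholders, not the real data or the machine’s geometry.

```cpp
// Sketch: combine the measured series (1959-present) with the predicted
// series (to 2050) and normalise both onto the machine's drawing range.
// All values and the drawing range are placeholders.
#include <vector>
#include <algorithm>

std::vector<float> measured  = {316.0f, 326.0f, 339.0f, 354.0f, 370.0f, 390.0f};
std::vector<float> predicted = {400.0f, 420.0f, 445.0f, 470.0f};   // placeholder forecast

std::vector<float> toDrawingOffsets(float maxOffsetMm) {
  std::vector<float> all(measured);
  all.insert(all.end(), predicted.begin(), predicted.end());

  float lo = *std::min_element(all.begin(), all.end());
  float hi = *std::max_element(all.begin(), all.end());

  std::vector<float> offsets;
  for (float ppm : all) {
    // Linear scaling keeps the shape of the rise legible on paper.
    offsets.push_back((ppm - lo) / (hi - lo) * maxOffsetMm);
  }
  return offsets;
}
```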

Anyway, I shouldn’t give too much away just yet. For more information visit the project website: www.hello-tree.com.