
Late Labs: The Internet of Things That Matter


On the 10th of April I’ll be giving a short talk and showing a version of The Earthquake Shelf at an event called The Internet of Things That Matter. It will take place at 7:00pm in the Informatics Forum, 10 Crichton Street, Edinburgh EH8 9AB.

The event is one of a series of Late Labs that are a part of the Edinburgh International Science Festival, in collaboration with New Media Scotland and the University of Edinburgh’s School of Informatics.

How does data change our relationship with physical ‘things’? The Internet of Things exploits new technologies to link physical artefacts with data across social and technical networks. Join the Design Informatics Research Group to explore this new technology.

From teapots you can haggle with in Oxfam shops and shelves that shake when earthquakes strike on the other side of the world, to clocks that print you a postcard of something that happened in the past – let’s reflect on the implications for our social lives.

Do come along if you can – there are lots of other great projects from Design Informatics lined up for the evening and, I’m sure, some great talks and conversations.


IBI: Weather Maker

The Institute have been busy recently conducting weather modification experiments. We were invited by Phoenix Square in Leicester to undertake a week-long residency in their gallery. We used the time to conduct a series of weather modification experiments, as well as to collect research material about the facts, speculations and controversies surrounding the science. The devices we built and the results of our research have been left behind as an exhibition that will be up for the next couple of weeks.
There’s more information and content over here.

Camera Explora – Development

Tracing Mobility is almost over – it ends on the 12th – and it really has been a pleasure (if also a little intimidating) to be part of an exhibition with so much great work.

Above is an image of it in the gallery, and I’m really pleased with how it turned out. The maps on the wall are those created by people as they used the camera to explore Berlin. That photo was taken not long after the show opened, so hopefully by now there are a few more.

Anyway, since it’s all up and running I thought I’d post up a few snippets of Camera Explora’s development.

The Camera

The camera itself has more or less been completely rebuilt. We got hold of some Android phones that are better able to cope with processing the images people take and logging co-ordinates to the server. Sam has also completely re-written the application so that it’s much more efficient. The UI is still very simple, with just three main screens and relatively few ‘choices’. Applying constraints to digital technology is a big part of this project, so we wanted to strip out most of the functionality associated with digital photography, especially the ability to review and edit on the fly.

The camera’s body is a 3D printed case intended to make the smartphone feel a little more like a camera. This was again important in creating the desired experience: using a camera that does one job and relates to one experience of one place, rather than a smartphone that (excellent as they may be) does everything. This isn’t necessarily a comment on one being better than the other; it’s more an attempt to find out what it means to have digital technologies with a specific purpose. The casing also serves the functional purpose of blocking access to the phone’s buttons, which means people can’t exit the app and play with a free smartphone for a few hours.


The Plotter

The plotter draws lines with felt pen onto a map of the city. These lines don’t show your exact route, but instead connect the locations where you take photos. This links together the locations of things or events that you considered important, or worthy of attention, and therefore of recording.

Because the pen only draws when you take a photograph, it rests in one place until another picture is taken. During this time the ink from the pen bleeds into the paper, so the size of the dot becomes a rough indicator of the length of time that passes between photos – or between places and events of interest. The maps were custom made using OpenStreetMap and printed on treated, inkjet-ready drawing paper.
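For the curious, the mapping from photo locations to pen positions is simple: the printed map covers a fixed bounding box of the city, so latitude and longitude can be scaled linearly onto the plotter’s axes. Here’s a minimal sketch of that idea in C++ – the bounding box and step counts are placeholder values for illustration, not the calibration we actually used:

```cpp
#include <cstdio>

// Map a photo's lat/lon onto plotter steps, assuming the printed map covers a
// known bounding box. A linear (equirectangular) mapping is fine at city scale.
// All constants are illustrative placeholders, not real calibration values.
const float LAT_MIN = 52.49f, LAT_MAX = 52.54f;  // map bounding box (Berlin-ish)
const float LON_MIN = 13.35f, LON_MAX = 13.45f;
const long  X_STEPS = 4000, Y_STEPS = 2800;      // plotter travel in motor steps

long lonToX(float lon) {
  return (long)((lon - LON_MIN) / (LON_MAX - LON_MIN) * X_STEPS);
}

long latToY(float lat) {
  // Paper y runs top-to-bottom while latitude increases northwards, so invert.
  return (long)((LAT_MAX - lat) / (LAT_MAX - LAT_MIN) * Y_STEPS);
}

int main() {
  // A photo taken mid-map should land near the centre of the paper.
  printf("x=%ld y=%ld\n", lonToX(13.40f), latToY(52.515f));
  return 0;
}
```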

The new version of the plotter is made using an A3 scanner. This gives you all the mechanical elements you need to run the x-axis, and it’s driven by a pretty standard 4-wire stepper motor, so controlling it with an Arduino and motor shield was quite straightforward. The mechanism involves a few drive wheels, belts and some wire-and-pulley systems that slow and smooth the movement of the stepper.
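For anyone wanting to try something similar, the control code really is minimal. A sketch along these lines would do it, assuming the common Adafruit motor shield and its AFMotor library (the shield port, step count and speed here are assumptions, not the project’s actual values):

```cpp
// Minimal x-axis test, assuming an Adafruit motor shield and the AFMotor
// library. A 200-step motor on the shield's first stepper port is an
// assumption for illustration.
#include <AFMotor.h>

AF_Stepper xAxis(200, 1);             // steps per revolution, shield port

void setup() {
  xAxis.setSpeed(30);                 // RPM - slow, to suit the pulley reduction
}

void loop() {
  xAxis.step(400, FORWARD, DOUBLE);   // run the head along the axis
  delay(1000);
  xAxis.step(400, BACKWARD, DOUBLE);  // and bring it back
  delay(1000);
}
```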

Flatbed scanners have no y-axis, so I took the drive mechanism out of a smaller A4 scanner and attached it to the A3 scanner’s scan head using some custom made 3D printed brackets. The motor that came with this mechanism was a DC motor with an optical encoder on the back, rather than a stepper motor. These are really accurate, but unfortunately I couldn’t get a reading from the encoder – possibly why the scanner was being thrown out. Instead I got hold of a small stepper motor that could fit inside the scan head along with the drive wheel of the smaller scanner mechanism.

This version of the plotter has a stand so that it can be displayed as a stand-alone unit. I also wanted it to look and feel a bit more like a piece of furniture than last time.

It’s built using plain old pine timber and plywood, but to make it look a little more like something that would belong in a home I covered it with oak veneer. I like how it turned out – it’s intentionally quite retro looking, partly to lend it a little domestic familiarity.


The Printer

As you walk around, a small photo printer hidden in the plotter display prints each photo as you take it.

The printer is a Polaroid Pogo. These are designed to work with mobile phones and certain cameras with PictBridge functionality, either over a USB connection to the camera or via a Bluetooth OBEX transfer. A PC won’t recognise these printers through a USB connection because it can’t run PictBridge, so the photos had to be sent over Bluetooth.
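In practice that just means an OBEX object push of each JPEG to the printer’s Bluetooth address. As a rough illustration of the idea (not the code we actually ran), something like this would work on a Linux machine by shelling out to the stock obexftp tool – the MAC address, channel and file name are placeholders:

```cpp
// Rough illustration: push a JPEG to the Pogo via Bluetooth OBEX by shelling
// out to the obexftp command-line tool. The MAC address and RFCOMM channel
// below are placeholders - use the values your printer reports.
#include <cstdlib>
#include <string>

bool printPhoto(const std::string& jpegPath) {
  std::string cmd = "obexftp --nopath --noconn --uuid none "
                    "--bluetooth 00:11:22:33:44:55 --channel 6 "
                    "--put " + jpegPath;
  return std::system(cmd.c_str()) == 0;   // zero exit status means success
}

int main() {
  return printPhoto("photo_0001.jpg") ? 0 : 1;  // hypothetical file name
}
```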

The printer is held inside the plotter display casing by a 3D printed bracket. This means the printer can be easily slid in and out when the paper needs replacing, which, because they only hold 10 sheets, is quite often.



That’s a bit of a quick overview, but for the sake of brevity it will do for now. Along with Sam, who did all the Android development, many thanks go to Mike Golembewski and Rob Mitchelmore for help with various bits of software development. It will soon be coming back from Berlin, and when it does I’ll get around to making a film about the project and posting up some of the maps and photos that have been created.

I will also be running some more controlled user trials for use in my PhD research. So if you’d like to have a go, get in touch!

Coming Up – Tracing Mobility Berlin

Camera Explora will soon be going to Berlin to take part in the Tracing Mobility exhibition at Haus der Kulturen der Welt from the 24th November to the 12th December 2011.
The exhibition will showcase about 20 works around the theme of cartography and migration in networked space.

Mobility has become one of the most important keywords in the discourse on globalisation, techno-economic change or the Information Society. The idea of nomadic, ‘mobile’ persons supported by spatial mobilisation of capital, goods and knowledge pervades politics and economics, technology and science, advertising and media, commerce and culture. The Tracing Mobility exhibition and symposium present a snapshot of the dynamic topography of this constant being-in-motion.

The prototypes have changed a lot since their last outing in Nottingham. I’m still working with Sam and we’ve managed to solve pretty much all of the hardware and software bugs (we’re working on the rest!), so hopefully the experience should run much more as intended. This means that we should be able to start getting some more useful feedback from people about how the technology influences their exploration and perception of the city. I’ll do a write-up about this after the event, as well as more detail about the prototypes.

It’s going to be really exciting to see people using them in a city I’m unfamiliar with – I’ve only ever seen it or tried it out in cities where I live, so hopefully I might even get to have a go myself!

Anyway, if you’re in Berlin any time between the 24th November and the 12th December, come along and say hi.

Visualising Climate Change … again


The next showing of Active Ingredient’s project ‘A Conversation Between Trees’ is coming up at the Rufford Gallery and Country Park in Nottinghamshire. It’s running from the 13th September to the 30th October 2011, with members of Active Ingredient in residence until the 24th September.

There’s been a whole load of development since the last exhibition, with improvements to the environmental sensors and mobile apps, to name but a few. The Climate Machine I’m developing has also undergone significant improvement. A lot of the kinks are being smoothed out, not only to make it mechanically more efficient, but also to make it more robust and easier for us to run throughout the whole period of the show. We’ve also been playing around with how the machine draws out the data, trying to find the approach that we think offers the most truthful and interpretable impression of global CO2 levels since 1959.
To add to this, we also have a new data set based on Met Office predictions for future CO2 levels. This gives forecasts up until 2050, which will hopefully allow us to give an impression of a trajectory into the future, and to think about what these increasing CO2 levels might mean.

Anyway, I shouldn’t give too much away just yet. For more information visit the project website: www.hello-tree.com.

Visualising Climate Change

In a previous post I mentioned that I was working with Active Ingredient on their current project, A Conversation Between Trees. Well, that’s exactly what happened, and we recently spent a few days in Rockingham Forest, Northamptonshire, installing the work in the Fineshade Woods art centre. You can read up on the project over at the site, but for now I’m just going to talk a little about what I worked on.

The main part of the installation involves projected visualisations developed in Unity 3D that show real-time environmental data from sensors in trees in Sherwood Forest, UK, and the Mata Atlantica, Brazil. To complement these, we wanted to create something that would add some historical context to the data. Global CO2 levels in particular change slowly, and by very small increments that aren’t linear or continuous, so we wanted to add a more accumulative and temporal impression of the data, giving more of a ‘big picture’.


The machine draws out CO2 data taken from the Mauna Loa observatory in Hawaii. This dataset is the longest continuous record of global CO2 data available, and has monthly readings dating back to 1959. Having data that covers such a large period of time allows us to depict the behaviour and effects of the increasing CO2 levels over the years.


The machine consists of a revolving circular platform and an arm that moves between the center and edge of the paper depending on the CO2 level. The revolving platform represents time, with one revolution of the platform representing one year. After one revolution the paper is removed and the machine starts again. Because each year is on a separate sheet of paper, it gradually builds a stack of paper marked with 52 years’ worth of CO2 fluctuations.
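In other words the drawing is polar: the month of the reading fixes the platform angle, and the CO2 level fixes the arm’s radius. A minimal sketch of that mapping – the step counts and ppm range below are illustrative assumptions, not the machine’s real calibration:

```cpp
#include <cstdio>

// Sketch of the polar mapping: one platform revolution = one year, twelve
// readings per revolution, CO2 ppm mapped onto the arm's radial travel.
// All constants are illustrative assumptions, not the real calibration.
const long  PLATFORM_STEPS_PER_REV = 6400;     // full circle = one year
const long  ARM_STEPS = 2000;                  // centre-to-edge travel in steps
const float PPM_MIN = 310.0f, PPM_MAX = 400.0f; // plausible range for 1959-2011

long monthToPlatformStep(int month) {          // month: 0..11
  return (long)month * PLATFORM_STEPS_PER_REV / 12;
}

long ppmToArmStep(float ppm) {
  return (long)((ppm - PPM_MIN) / (PPM_MAX - PPM_MIN) * ARM_STEPS);
}

int main() {
  // July of a mid-range year: half a revolution, arm halfway out.
  printf("platform=%ld arm=%ld\n", monthToPlatformStep(6), ppmToArmStep(355.0f));
  return 0;
}
```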

The data readings are monthly, so within one revolution there are 12 readings, and the arm plots lines between each data point. Arduino-controlled stepper motors drive both the platform and the arm, and the steps needed to draw lines between data points are calculated using Bresenham’s algorithm. This is done in Processing rather than on the Arduino, as it made sense to do all the calculations on a laptop rather than on the Arduino itself. The Processing sketch then just passes simple movement commands to the Arduino, which drives the motors accordingly.
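The Arduino end can therefore stay very dumb: it just listens on serial for step counts and moves the two motors. A minimal sketch of that listener is below – the pins and the ‘platformSteps armSteps’ command format are assumptions for illustration, not the project’s actual protocol:

```cpp
// Minimal Arduino-side listener: read signed "platformSteps armSteps" pairs
// over serial and drive the two steppers. Pin numbers and the command format
// are assumptions for illustration.
#include <Stepper.h>

Stepper platform(200, 2, 3, 4, 5);   // revolving platform motor
Stepper arm(200, 6, 7, 8, 9);        // radial arm motor

void setup() {
  Serial.begin(9600);
  platform.setSpeed(10);             // RPM
  arm.setSpeed(10);
}

void loop() {
  if (Serial.available()) {
    int p = Serial.parseInt();       // signed platform steps from the laptop
    int a = Serial.parseInt();       // signed arm steps
    platform.step(p);                // Stepper.step() blocks until complete
    arm.step(a);
  }
}
```

Bresenham on the laptop then reduces each line segment to a stream of small step pairs like these, so the platform and arm appear to move together.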


Lines are scorched onto the recycled paper using a soldering iron. We did a lot of experiments to test which papers scorched nicely but didn’t burn, and which speed and heat combinations left the nicest mark on the paper. We’re not sure we’ve got this quite right yet, and will continue trying different tips for the soldering iron, and different speeds for the new surface-area and heat combinations these create.

As well as forcing an uneasy contradiction by using a carbonising process to make marks, the act of burning symbolises the relationship between global CO2 levels and temperature.

This brings us to an interesting point of discussion that has come to light through doing this project. The team have discussed many times the role of technology-led human intervention in environmentally engaged artwork. As ever, we have no answers, but these projects are about presenting information to provoke debate and dialogue around some very serious but complex issues without being prescriptive, or trying to force our views on the audience.

This is still the first prototype – there are a few bugs to work out and some more in-depth design decisions to make – but it’s a good starting point, and by the time the next exhibitions come around it will, of course, be flawless.

Active Ingredient: A Conversation Between Trees

Data Visualisation

[image from Active Ingredient]

Over the next few months I’m going to be working with Active Ingredient to develop a dynamic sculpture for their project A Conversation Between Trees. The project combines environmental data gathered from trees in Nottingham’s Sherwood Forest and Brazil’s Mata Atlantica. This data from each location is then visualized (above image) side by side to illustrate the contrast between the two environments, and represent a form of conversation.

Here’s what they say about the project:

“Welcome to a forest that spans time and location… a journey from the temperate north to the tropical south to discover the invisible forces at play, to reveal a story of 150 years of climate and environmental change.”

A Conversation Between Trees connects trees in different environments, using sensors connected to mobile phones to visualize and interpret the sensor data, as part of a new locative artwork. It follows on from research developed as part of the Dark Forest Project.

Active Ingredient will work with environmental sensors including CO2, temperature and humidity, to interpret the carbon cycle and the sensitivity of this process to climate change (carbon cycle feedback).  This will take place in forest and woodland environments in a series of locations.

The sculpture will join the work they have already developed for the upcoming series of exhibitions, and will offer an alternative to the data shown through the visualisations, giving a physical and accumulative representation of the contrasting environmental conditions, and of the significance of the changing climates over time.

Human sensor data maps

[image from Active Ingredient]

Showing environmental data in ways that are meaningful to people, but still true to its complexity, is extremely problematic. To find compelling ways of doing this, Active Ingredient have undertaken several local, community-based exercises to map and visualise environmental data and the longer-term effects of climate change. One of these was a workshop with school children in Brazil [images above], where the children created data maps using felt to depict the environmental conditions as they perceived them:

As objects, these data maps are quite beautiful; the colours, layout and style (to use the language of Robin, Active Ingredient’s programmer) were simple yet evocative representations of the data the children collected as ‘human sensors’.

I really like these data maps, and although they are highly personal representations, I think the intuition involved in making them and their highly evocative illustration also describe part of our approach to designing the installation: simply, the idea is to create something that changes and evolves to show abstract data, and the tensions and dialogues at work within it, in compelling ways that people can immediately and tangibly make sense of.

I’ll post more as things develop, but have a look through Active Ingredient’s website for more information and regular updates, including the exhibition dates and locations.

Camera Explora at Territorial Play

[camera explora at territorial play]

Camera Explora recently appeared at Radiator’s Territorial Play, the opening event of their Tracing Mobility programme. Unfortunately I didn’t get much chance to publicize this fact in the run-up to the event, as I was too busy trying to work out what kind of string would give the most friction on a rubber pulley.
Embroidery cotton is quite good.

There were two main elements – activity and installation. The activity bit involved people going out and exploring the city using the camera, which is now a repackaged Google G1 phone running a custom-made Android application. That bit was programmed by Sam Meek, who’s done a great job in spite of the somewhat … ‘limited’ hardware.

Those that took part seemed to respond well to the experience. A few said that they found it frustrating at first to be so constrained in what they could take photos of, but eventually began to resist the urge to photograph the first thing they came across and took the time to have a proper look around first.

camera case prototype

The second part – the installation part – was an Arduino-controlled CNC plotter (hence the business with the string) that drew lines onto a paper map of the city between the locations where each photograph was taken, as they were being taken. Each photo represents, in theory, something the photographer found interesting or noteworthy. Physically connecting these instances on a paper map ties them all together. It links them in memory and space, as well as providing a tangible, non-photographic mnemonic of those experiences.*

The aesthetic of the plotter is quite rough. Although it’s absolutely a work in progress this was, for the most part, intentional – because it was an installation rather than a product design I wanted it to look like the kind of eccentric, unrefined, but very personally engaging and valuable machine that someone might have built for themselves. The details of that were worked out by just building as much of it as possible out of stuff that I had lying around. Whether or not that was the best strategy is up for debate.

plotter closeup

The project is about exploring new places, so one concern leading into the event was that, because most of the participants would be from Nottingham, the intended experience might be somewhat diluted. However, even those who were familiar with the city enjoyed actively seeking out things that they might not have seen or noticed before, which certainly seems to suggest more attentive exploration of the city. Some even asked to keep the photos they had taken, as well as the route map that had been drawn, when they returned. It’s nice when things like this come out in testing.

Anyway, this isn’t an especially in-depth write-up just yet – I think I’d need to run it again to do that. There were also a few minor technical issues that we couldn’t iron out in the time available. So although things didn’t run quite as smoothly as we would have liked, it helped us see exactly what was and wasn’t right about the prototype, both technically and in terms of the design. There’s nothing that can’t be fixed.

Not bad for a first go. Fun too – it’s always good to see people using and enjoying something that you’ve made.


* This is not to say that tangible things are necessarily, or inherently any more or less valuable than digital things. One of the aims of the project is to investigate ways of generating meaningful records of experiences, and the play between digital and physical things is just one way of looking at how to do that.