It's inevitable that I would be drawn into doing some work about cycling – I live in a small market town that is overrun by crazy people donning lycra and heading to the hills. The Tour de Yorkshire is the latest in a series of major cycle events coming through my town this weekend. We were lucky enough to be on the route of Le Grand Depart back in July 2014, an event that brought out the entire community. While we can muse about who actually comes to these events – they involve public money, so they should be accessible to and attended by all sections of society, right? – we don't actually know for sure. Finding out was the purpose of some work that I did with Matt Whittle and Nik Lomax last summer. We worked with LCC on some very tasty data that was collated throughout Le Grand Depart. The findings do indeed back up what you would expect: those who came to view the race typically fall into the category commonly labelled MAMILs (middle-aged men in lycra), and this was particularly prevalent at the King of the Mountains sections. Further details about this work can be found in the very catchy-sounding Conversation article: Charge of the lycra brigade.
In the fastest journal submission-to-publication turnaround I have ever experienced, the following paper has just been published online, and is free to grab a copy of:
Here is the abstract to whet your appetite:
Cities are complex systems, comprising many interacting parts. How we simulate and understand causality in urban systems is continually evolving. Over the last decade the agent-based modeling (ABM) paradigm has provided a new lens for understanding the effects of interactions of individuals and how, through such interactions, macro structures emerge, both in the social and physical environment of cities. However, such a paradigm has been hindered by limited computational power and a lack of large fine-scale datasets. Within the last few years we have witnessed a massive increase in computational processing power and storage, combined with the onset of Big Data. Today geographers find themselves in a data-rich era. We now have access to a variety of data sources (e.g., social media, mobile phone data, etc.) that tell us how, and when, individuals are using urban spaces. These data raise several questions: can we effectively use them to understand and model cities as complex entities? How well have ABM approaches lent themselves to simulating the dynamics of urban processes? What has been, or will be, the influence of Big Data on increasing our ability to understand and simulate cities? What is the appropriate level of spatial analysis and time frame to model urban phenomena? Within this paper we discuss these questions using several examples of ABM applied to urban geography to begin a dialogue about the utility of ABM for urban modeling. The arguments that the paper raises are applicable across the wider research environment where researchers are considering using this approach.
The question is, what famous line to try and get into the title of a paper next time?
Fancy coming to London at the end of August? Looking for an exciting Geocomp session to show off your fancy research? Look no further…
Call for papers: **GeoComputation: the next 20 years**
Just returning from a workshop on Geocomputation at King's College London. The event was put together to bring researchers together from around the country to discuss the 'future of Geocomputation'. There were three keynotes (Chris Brunsdon, Alex Singleton and myself), each giving our different views on the future of Geocomputation. Whilst we concentrated on different aspects (technology, in particular agent-based modelling, for me; a Bayesian approach called ABC (approximate Bayesian computation) for Chris; and the teaching of GIS for Alex), there was commonality in the areas where we felt future work was needed, such as data handling, visualisation, more engaging teaching methods and teaching programming to students. For me, the future of Geocomputation is very much going to be shaped by developments in both agent-based modelling and big data. Instead of developing yet more agent frameworks (of which there are numerous – I did a head count of about 86), we should instead focus on tackling the thorny issues of identifying behaviour and processes in systems, as well as calibration and validation.
This is something I will return to in a future post, but a copy of my slides can be found by clicking on Heppenstall.
New paper just published…
Olner D; Evans A; Heppenstall A (2015) An agent model of urban economics: Digging into emergence, Computers, Environment and Urban Systems. doi: 10.1016/j.compenvurbsys.2014.12.003
This paper presents an agent-based 'monocentric' model: assuming only a fixed location for firms, outcomes closely parallel those found in classical urban economic models, but emerge through 'bottom-up' interaction in an agent-based model. Agents make buying and movement decisions based on a set of simple costs they face from their current location. These spatial costs are reduced to two types: the costs of moving people and goods across geographical distances and the costs (and benefits) of 'being here' (the effects of being at a particular location such as land costs, amenities or disamenities). Two approaches to land cost are compared: landlords and a 'density cost' proxy. Emergent equilibrium outcomes are found to depend on the interaction of externalities and time. These findings are produced by looking at how agents react to changing four types of cost, two spatial and two non-spatial: commuting, wage, good cost and good delivery. The models explore equilibrium outcomes, the effect of changing costs and the impact of heterogeneous agents, before focusing on one example to find the source of emergence in the externalities of agent choice. The paper finishes by emphasising the importance of thinking about emergence as a tool, not an end in itself.
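To give a flavour of the kind of mechanism the abstract describes, here is a minimal toy sketch in Python – emphatically not the paper's model. It invents a 1-D geography, a single firm fixed at location 0, a linear commuting cost, and a 'density cost' proxy for land price (cost rises with how many agents already occupy a spot); all parameter values are made up for illustration. Each step, every agent relocates to the location that minimises its total cost, and a declining density gradient away from the firm emerges from these bottom-up choices:

```python
import random

def simulate(n_agents=200, n_locations=20, commute=1.0, crowding=0.5,
             steps=50, seed=42):
    """Toy 'monocentric' agent model on a line with one firm at location 0.

    Each agent repeatedly moves to the location minimising
    commute * distance + crowding * (occupants already there).
    Returns the final occupancy count per location.
    """
    rng = random.Random(seed)
    positions = [rng.randrange(n_locations) for _ in range(n_agents)]
    counts = [0] * n_locations
    for p in positions:
        counts[p] += 1

    for _ in range(steps):
        for i, p in enumerate(positions):
            def cost(loc):
                # density-cost proxy: exclude the agent itself from occupancy
                others = counts[loc] - (1 if loc == p else 0)
                return commute * loc + crowding * others
            best = min(range(n_locations), key=cost)
            counts[p] -= 1
            counts[best] += 1
            positions[i] = best
    return counts

densities = simulate()
print(densities)  # occupancy falls with distance from the firm at location 0
```

The emergent gradient mirrors the classical monocentric result: locations near the firm are crowded (high implicit land cost) and density tails off with commuting distance, even though no agent knows anything about the aggregate pattern.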