Thursday 28 May 2009

Back and forth to Cardiff

I've been to a very interesting workshop held by Glyn Elwyn and colleagues at Cardiff University. It was really good fun - the object being to think through the 'grand challenges' for telecare over the next decade. Douglas Robinson from Paris did an interesting talk on Multi-Path Mapping, I did Normalization Process Theory, and then there were animated discussions. I learned a lot and came away with some ideas about things that I hope one day to have enough time to write about. 

The taxi into town from the airport was interesting.... As we climbed into the car the driver said, apropos of nothing, "I've got a story for you, about pies....." And then launched into a long discourse about the fine cheese and potato pies to be had at Kidderminster Harriers' football ground. I found my mind drifting. Suddenly (several miles later) I felt him grip my arm tightly and he said, "and then I was seized with panic. Well, wouldn't I be?" I had to agree that he would be. But for the next 24 hrs I wondered why. What had I missed?

Tuesday 26 May 2009

Sociology and the evidence base

Another post that I have recovered from the wreckage of the 'Great Blog Crash of 2008'.

Some time ago, a colleague remarked that he hoped that, in the Institute in which I work, research would be “less critical and more evidence-based”. I’ve thought about this a great deal, partly because as time has passed, I think my work has become more, not less, critical. The idea of an ‘evidence base’ seems to make social scientists both excited and anxious. On the one hand, it promises us a place at the table. Our work can inform policy – and the more it informs policy, the more that resources and prestige seem to flow from it. On the other, it ties us to a particular kind of research programme – making evidence for policy – one that in the UK has been effectively nationalized and tied to a set of policy imperatives. But it also ties us to a set of political processes that decouple the analysis of data (something that academics do) from its interpretation (something that policy-makers do). In the impulse to make ‘evidence’ we risk regressing to method and technique as the central scholarly problem. The more we focus on technical problems of practice, the less we focus on explanation and theory.

Evidence only works as evidence if it is engaged with explanation. This week, I’ve been relaxing and recovering from double vision by reading Stephen Kern’s 'A Cultural History of Causality' (Princeton, 2004). Kern has a wonderful take on this problem. He examines the ways in which ideas about the causes of human behaviour have changed since the 1830s, exploring how theories from the psychosocial sciences have shaped explanations for action. The twist is that he uses examples from detective fiction to give bite to his argument. There’s a second twist: by focusing on individual acts of aberrant behaviour (mainly murder), Kern shows the complexity of individual action and its integration within patterns of collective action. Kern’s analysis is an interesting and useful antidote to the grinding orthodoxy of much Foucauldian scholarship about ‘social construction’, and is also a useful reminder of the enervating and convincing patterns of explanation that arose from earlier – phenomenological – constructionist theory in the 1960s. In particular, it draws us back to the work of Peter L. Berger, and the problems of how realities are constructed in practice by their participants. That is, the constant engagement and interdependence of the individual and the societal, and – as Kern emphatically announces on the final page of his wonderful book – how little we yet know about this, and how much more we have to learn.

The genome and social structure

Beautifully written and very interesting indeed is Daniel Adkins and Stephen Vaisey's recent paper, 'Toward a Unified Stratification Theory', Sociological Theory, 27(2): 99-121. They make a fundamental point, and they make it very well. This is that "while both genome and social background influence the status attainment process, the relative importance of these factors is determined by the surrounding structure of the society" (p. 99). What follows from this is a clever debunking of genetic determinism, and, I would guess, a nice proto-rejection of the kinds of biological reductionism that also underpin the growing paradigmatic claims of the neurosciences. It's a paper worth reading.

'Unification' is an interesting theme in recent sociological writing, and I'd draw attention to a previous 'Paper of Note' which was lost when my blog crashed at the end of 2008 and all text was lost. This was Guillermina Jasso's wonderful article, 'A new unified theory of sociobehavioural forces', European Sociological Review (2008, 24(4): 411-434). If ever a paper might have encouraged me to give up ethnography and discourse analysis in favour of mathematical sociology, then this was it.

Developing an on-line resource for Normalization Process Theory

Normalization Process Theory enables clinicians and managers to understand the dynamics of embedding new healthcare techniques and organizational changes in context. In the UK, the Economic and Social Research Council (ESRC) has now announced that it will fund the development of an on-line users' manual and web-based tools that will assist researchers, clinicians, and managers in employing NPT. This is a major investment by the ESRC, and over the next 12 months it will involve authoring an on-line users' manual for NPT, developing a set of web-enabled tools for users, and will conclude with a major symposium.

Investigators on this project include: Frances Mair, Elizabeth Murray, Carl May, Shaun Treweek, Tim Rapley and Tracy Finch. International collaborators include Anne MacFarlane (NUI Galway), Luciana Ballini (Bologna), Jane Gunn (Melbourne), Mary Ellen Purkis (Victoria), France Legare (Montreal), and Victor Montori (the Mayo Clinic).

The protocol for the project is available at http://www.newcastle.academia.edu/CarlMay

Sunday 24 May 2009

Developing Theory

One problem for students and researchers interested in evaluating theories in the social sciences is understanding the trajectory of their development. Because theories are so important in explaining and understanding social phenomena, we need to know how they are put together and operationalized. This helps us adjudicate on the claims of theorists about their work, its validity, and its practical usefulness. After all, there's nothing so practical as a good theory!

The network of researchers that has coalesced around developing Normalization Process Theory has just published a paper that describes the work of defining and developing NPT. It shows how the theory was organized and enacted practically in a series of well-defined, discrete tasks that ran from the development of a set of empirical generalizations about telemedicine systems to a fully formed middle-range theory of implementation and integration.

The paper is available here: May C, Mair FS, Finch T, MacFarlane A, Dowrick C, Treweek S, et al. Development of a theory of implementation and integration: Normalization Process Theory. Implementation Science. 2009;4(29).