Dawes, R.M., 1980. Social dilemmas. Annual Review of Psychology, 31, pp.169-193. Available at: http://www.annualreviews.org/doi/abs/10.1146/annurev.ps.31.020180.001125 [Accessed March 27, 2012].
This review from 1980 gives an overview of social dilemma research for games with more than two players. A social dilemma game is defined by two properties: (a) each player gets a bigger payoff for defecting than for cooperating, whatever the others do; but (b) everyone ends up better off if all cooperate than if all defect.
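Those two properties can be sketched with a toy payoff function. This is just a minimal illustration using hypothetical linear public-goods numbers (the `benefit` and `cost` parameters are my own, not from Dawes's paper):

```python
# Minimal sketch of an N-player social dilemma payoff structure.
# The linear public-goods parameters here are made up for illustration.

def payoff(cooperate: bool, n_cooperators: int,
           benefit: float = 2.0, cost: float = 3.0) -> float:
    """Payoff for one player, given how many players (including them) cooperate.

    Each cooperator adds `benefit` to every player's payoff but pays `cost`
    personally. Defecting therefore dominates (you keep `cost` for yourself),
    yet universal cooperation beats universal defection.
    """
    return benefit * n_cooperators - (cost if cooperate else 0.0)

n = 5  # hypothetical group size

# Property (a): whatever k others do, switching to defection pays more.
for k in range(n):
    assert payoff(False, k) > payoff(True, k + 1)

# Property (b): all-cooperate beats all-defect for every player.
assert payoff(True, n) > payoff(False, 0)
```

With these numbers, defecting always gains you 1 unit individually, but the all-defect group earns 0 each while the all-cooperate group earns 7 each, which is the dilemma in a nutshell.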
(The standard prisoner's dilemma is a two-player, one-shot game, which doesn't allow for retaliation or anonymity effects.)
Involvement - inter-subject involvement. Where financial rewards were offered, the reviewed papers reported a high degree of affect among subjects. The reviewed papers that used an abstract points system apparently didn't report affect, which Dawes reckoned probably meant it was absent, given how noticeable it was to the other authors. I'm not too sure about that. But apparently there was much threatening and calling of 'fink'.
Communication (which I think kind of follows from involvement) - the ability to communicate almost doubled the rate of cooperation. I think that reflects Ostrom's findings too; I need to re-read that. One paper broke communication down into three conditions: non-game-related talk, game-related talk with choices kept secret, and declaring what you're going to choose. Apparently in the last condition everyone claimed they were going to cooperate, but not all of them did. Not hugely surprising.
Group size - Dawes had some trouble with the papers he reviewed on group size effects. Although they all agree that cooperation is more likely in smaller groups, he found flaws in how group size had been balanced against rewards, and one paper even claimed to look at 1-person groups (huh?).
Public disclosure vs anonymity - publicly announcing your choice did apparently increase the level of cooperation.
Expectation of others' behaviour - defectors were more accurate at guessing the overall defection rate, but no better than cooperators at guessing exactly who would defect or cooperate. In fact, most people were rubbish at working that out.
Moralising - if subjects had a sermon read to them about the value of cooperating, they were more likely to cooperate. I like that. It brings the salience thing in again.
In the conclusion he says something really interesting about cooperation: it's not so much the rules of the game that get people to cooperate, but "altruisms, norms, and conscience". He reckons that anything that increases the salience of those cooperative norms should increase cooperation (e.g. communication, public disclosure, moralising...). Which is exactly what I'm trying to do: using social identity theory to identify the game rules that should make group behaviour more salient, then changing those things and watching the results.