Situation Room survey: brief overview of methodology


The survey covers six countries: Hungary, Poland, Greece, Italy, France and Germany.

Its questionnaire has two main goals: first, to better understand attitudes towards open and closed societies, and second, to examine how people respond when asked to weigh the merits of open society attributes against those associated with closed societies.

 

 

What makes a good society

In the first part of the survey respondents were presented with 14 statements – half associated with open society views, and half with closed society ones. They were asked to rate each statement. For example:

  1. Should everyone be allowed to practise their religion freely?
  2. Should government ensure that media reporting always reflects a positive image of a country?

 

Trade-offs within a good society

For each of the first part’s seven open society statements, the second part presented two comparisons, each pairing it with an alternative statement, and asked respondents to evaluate their relative importance for a good society. In each comparison they were asked to choose one statement, the other, or both.

For example, for the statement that everyone should be able to express their opinion freely, the alternatives were:

  1. That Christian values not be offended
  2. That ethnic and national minorities not be offended.

The survey also asked for views on immigration, civil society and political affiliation, with a few country-specific questions.

 

The development team

The research institute d|part developed the survey in close cooperation with OSEPI and the six country partners. The final survey was written and edited in English, before being translated into the six relevant languages.

Fieldwork was carried out by Lightspeed Germany, and the survey was administered via an online panel in each of the six countries. Quotas were applied in each country for age, gender, geography, education and income. A soft launch pilot was carried out with 50 respondents in each country, and the full launch included more than 1,000 respondents per country. The survey took place between 12 February and 5 March 2018.


 


The extended explanation


This note briefly summarises the background of the survey that was carried out to collect data on open society attitudes across the six countries in the project (Hungary, Poland, Greece, Italy, France and Germany). The questionnaire was developed to address two main areas of concern: first, to evaluate how open and closed society attitudes are composed conceptually, and second, to examine how individuals respond when asked to decide whether a particular open society attribute is more important than, less important than, or equally important as an attribute associated with closed societies.

Open society constructions

In the first part of the survey respondents were presented with 14 items (in random order) and asked to indicate how important they thought the respective item was for a good society. Half of the items (7) were attributes defined as open society characteristics and half (7) were attributes more closely associated with closed societies.

The data collected in this first section allows us to apply dimension reduction techniques to examine how the items relate to each other. The items were:

Table 1

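The note mentions dimension reduction techniques but does not name one. As an illustration only, the sketch below assumes principal component analysis on a hypothetical `ratings` data frame holding the 14 item scores (one column per item); the function name and inputs are not part of the survey materials.

```python
# Illustrative only: explore how the 14 good-society items relate to each other.
# PCA is an assumption; the note does not specify which technique was used.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler


def explore_item_structure(ratings: pd.DataFrame, n_components: int = 2) -> pd.DataFrame:
    """Return item loadings on the first few principal components."""
    standardised = StandardScaler().fit_transform(ratings)  # z-score each item
    pca = PCA(n_components=n_components)
    pca.fit(standardised)
    return pd.DataFrame(
        pca.components_.T,                                   # one row per item
        index=ratings.columns,
        columns=[f"PC{i + 1}" for i in range(n_components)],
    )
```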
Based on the answer options we computed two scores, one for the rating of open society attributes and one for closed society attributes. To calculate each score, the order of the answer scale was reversed, so that higher values indicated rating an item as more essential. The seven respective item scores were then added up, resulting in a score between 7 and 28, which we standardised to a range of 0 to 1. Each score therefore measures how essential respondents rated the open or closed society attribute items: a score of 0 means a respondent answered “not at all essential” to all seven items, and a score of 1 means they answered “absolutely essential” to all seven items.
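A minimal sketch of this scoring is below. The 7–28 range implies a four-point answer scale; the assumption that raw answers are coded 1 (“absolutely essential”) to 4 (“not at all essential”) before reversal is ours, not stated in the note.

```python
# Sketch of the score computation described above (coding direction assumed).

def society_score(answers: list[int], scale_max: int = 4) -> float:
    """Reverse the scale, sum the seven items (7-28) and standardise to 0-1."""
    reversed_items = [scale_max + 1 - a for a in answers]  # higher = more essential
    total = sum(reversed_items)                            # between 7 and 28
    return (total - 7) / (28 - 7)                          # standardised to 0-1

# A respondent answering "absolutely essential" to all seven items:
# society_score([1] * 7)  ->  1.0
# A respondent answering "not at all essential" to all seven items:
# society_score([4] * 7)  ->  0.0
```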

Trade-off experiments

In the second part of the survey respondents were presented with 14 direct comparisons between two items and asked to evaluate their relative importance. For each of the seven open society attributes from the first part of the survey, respondents were presented with two alternative items and asked to make a decision in each of those comparisons. The order in which the comparisons were presented was randomised. The full set of comparisons was as follows:

Table 2
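The note specifies only that respondents could choose one item, the other, or both. The sketch below uses hypothetical response labels (“open”, “alternative”, “both”) to show how the shares for a single comparison could be tallied.

```python
# Illustrative only: tally responses to one trade-off comparison.
from collections import Counter


def comparison_shares(choices: list[str]) -> dict[str, float]:
    """Return the share of respondents picking each option in a comparison."""
    counts = Counter(choices)
    n = len(choices)
    return {option: counts.get(option, 0) / n for option in ("open", "alternative", "both")}

# Example with made-up responses:
# comparison_shares(["open", "both", "alternative", "open"])
# -> {"open": 0.5, "alternative": 0.25, "both": 0.25}
```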

Other questions

In addition to the instruments presented above, we included a number of socio-economic questions and correlate questions on attitudinal domains, such as attitudes towards immigration, civil society and political affiliation, across all countries. A few country-specific questions were also added in each country to support analyses within that country’s context.

Survey development

The survey was developed by d|part’s core team for the Situation Room project in close cooperation with all five country partners. After initial scoping of issues and instruments, a draft questionnaire was developed and then discussed in a workshop with representatives of all country partners, d|part and OSEPI. Based on that workshop a second draft questionnaire was developed, which went through further iterations with feedback from all partners. The final survey (in an English master version) was then translated by professional translators into the six country languages. Country partners checked the translations, in addition to checks carried out by the core team, and feedback was given to the translators to further improve the translations. Country partners also formulated draft questions specific to their own contexts, which were reviewed and edited by the core team before going through a final round of checks together with the master questionnaire.

Fieldwork and data

The fieldwork was carried out by Lightspeed Germany in close cooperation with the core team. The survey was administered through an online panel in all countries. The programmed survey was pre-tested by several people in each country to check user experience, correct routing and the implementation of the translations. Quotas for age, gender, geography, education and income, as well as several cross quotas, were applied to achieve good representation in the samples. Quotas were relaxed only at later stages of the fieldwork, and only where they could not otherwise be filled adequately. Before commencing fieldwork, a soft launch pilot was carried out with 50 respondents in each country to test the survey instruments and check initial distributions and participation. The full launch then took place, with over 1,000 respondents recruited in each country. The survey was carried out between 12 February 2018 and 5 March 2018. Where achieved sample distributions deviated from actual population distributions, weights were calculated and applied to account for those deviations.
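The note does not describe the weighting procedure itself. As one common approach, the sketch below shows simple post-stratification on a single variable; the inputs and the choice of method are assumptions, not a description of how the actual weights were built.

```python
# Illustrative only: simple post-stratification weights for one variable.
import pandas as pd


def poststratification_weights(sample: pd.Series, population_shares: dict[str, float]) -> pd.Series:
    """Return a weight per respondent: population share / sample share of their group."""
    sample_shares = sample.value_counts(normalize=True)
    return sample.map(lambda group: population_shares[group] / sample_shares[group])

# Example with made-up gender shares (sample 50/50, population 49/51):
# poststratification_weights(pd.Series(["f", "f", "m", "m"]), {"f": 0.49, "m": 0.51})
# -> weights of 0.98 for "f" respondents and 1.02 for "m" respondents
```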
 