Variables in quantitative research

What is the difference between interval/ ratio, ordinal, nominal and categorical variables? This post answers this question!

Interval/ ratio variables

These are variables where the distances between the categories are identical across the range of categories.

For example, in question 2, the age intervals go up in years, and the distance between the years is the same between every interval.

Interval/ ratio variables are regarded as the highest level of measurement because they permit a wider variety of statistical analyses to be conducted.

There is also a difference between interval and ratio variables… the latter have a fixed zero point.

Ordinal variables

These are variables that can be rank ordered but the distances between the categories are not equal across the range. For example, in question 6, the periods can be ranked, but the distances between the categories are not equal.

NB if you choose to group an interval variable like age in question 2 into categories (e.g. 20 and under, 21-30, 31-40 and so on) you are converting it into an ordinal variable.

Nominal or categorical variables

These consist of categories that cannot be rank ordered. For example, in questions 7-9, it is not possible to rank the subjective responses of respondents into an order.

Dichotomous variables

These variables contain data that have only two categories – e.g. ‘male’ and ‘female’. Their relationship to the other types of variable is slightly ambiguous. In the case of question 1, this dichotomous variable is also a categorical variable. However, some dichotomous variables may be ordinal variables as they could have one distinct interval between responses – e.g. a question might ask ‘have you ever heard of Karl Marx?’ – a yes response could be regarded as higher in rank order than a no response.

Multiple-indicator measures such as Likert scales, strictly speaking, produce ordinal variables; however, many writers argue they can be treated as though they produce interval/ ratio variables if they generate a large number of categories.

In fact Bryman and Cramer (2011) make a distinction between ‘true’ interval/ ratio variables and those generated by Likert Scales.

A flow chart to help define variables

*A nominal variable – aka categorical variable! 

Questionnaire Example 

This section deals with how different types of question in a questionnaire can be designed to yield different types of variable in the responses from respondents.

If you look at the example of a questionnaire below, you will notice that the information you receive varies by question.

Some of the questions ask for answers in terms of real numbers, such as question 2, which asks ‘how old are you?’, or questions 4, 5 and 6, which ask students how many hours a day they spend doing sociology class work and homework. These will yield interval variables.

Some of the questions ask for either/ or answers or yes/ no answers and are thus in the form of dichotomies. For example, question 1 asks ‘are you male or female’ and question 10 asks students to respond ‘yes’ or ‘no’ to whether they intend to study sociology at university. These will yield dichotomous variables.

The rest of the questions ask the respondent to select from lists of categories:

The responses to some of these list questions can be rank ordered – for example in question 6, once a day is clearly more than once a month! Responses to these questions will yield ordinal variables. 

Some other ‘categorical list’ questions yield responses which cannot be ranked in order – for example it is impossible to say that studying sociology because you find it generally interesting is ranked higher than studying it because it fits in with your career goals.  These will yield categorical variables.

These different types of response correspond to the four main types of variable above.


What the public thinks of Boris Johnson

YouGov recently published a post outlining ‘Everything they know about what the public think of Boris Johnson‘ – these are basically the results of various opinion polls carried out in recent months and are a great example of secondary quantitative data.

The polls clearly show that most people think Boris will be a strong leader, and a different type of leader to previous PMs, but not in a good way: most people don’t trust Boris and think he’s going to make a terrible Prime Minister….

Most people think he’ll be a different type of leader…


But almost 60% of people don’t trust Boris


One of the more creative questions was what Hogwarts House Boris Johnson would be in – no surprise that 42% of the population put him in Slytherin – which values ambition and cunning.


Relevance of all of this to A-level Sociology 

These polls are a good example of the problems with validity in quantitative social survey research.

We need to treat these results with caution: the negative responses may be because of the lack of say most people have had over Boris being elected, or about the lack of any kind of progress over Brexit.

In other words, people may not be expressing their dissatisfaction with Boris in particular, but possibly at the whole of the inept political class in general!

Having said that, I’m not going to dismiss criticism of Boris: he is an Eton educated millionaire who seems to be prepared to lie and spin his way to the top, always putting his own personal ambition ahead of anything else.

The Global Drug Survey – a good example of invalid data due to bias?

86% of the global population have used drugs in the last year, and more people have used cannabis than tobacco. Almost 30% of the world’s population have used cocaine in the last year, at least according to the 2019 Global Drug Survey.


This survey asked adults in 36 countries about their use of drugs and alcohol.

According to the same survey, the British get drunk more often than people in any other nation.

In Britain, people stated they got drunk an average of 51 times last year, with the U.S., Canada and Australia not far behind. The global average was 33 times.

Where cocaine use was concerned, 73% of people in England said they had tried it, compared to 43% globally.

How valid is this data?

I don’t know about you, but to me these figures seem very high, and I’m left wondering if they aren’t skewed upwards by selective sampling or loose questions.

This report is produced by a private company who sell products related to addiction advice, and I guess their market is national health care services.

Seems to me like it’s in their interests to skew the data upwards to give themselves more of a purpose.

I certainly don’t believe the average person in the UK gets drunk once a week, or that almost three-quarters of the population have tried cocaine.

Sources

The Week 25th May 2019


Using interviews to research education

Interviews are one of the most commonly used qualitative research methods in the sociology of education. In this post I consider some of the strengths and limitations of using interviews to research education, focussing mainly on unstructured interviews.

This post is primarily designed to get students thinking about methods in context, or ‘applied research methods’. Before reading through this students might like to brush up on methods in context by reading this introductory post. Links to other methods in context advice posts can be found at the bottom of the research methods page (link above!)

Practical issues with interviews  

Gaining access may be a problem as schools are hierarchical institutions, and the lower down the hierarchy an individual is, the more permissions the interviewer will require to gain access to interview them. For example, you might require the headmaster’s permission to interview a teacher, while to interview pupils you’ll require both the headmaster’s and their parents’ permission.

However, if you can gain consent, and get the headmaster onside, the hierarchy may make doing interviews more efficient – the headmaster can instruct teachers to release pupils from lessons to do the interviews, for example.

Interviews tend to take more time than questionnaires, so finding the time to do them may be a problem – busy teachers are unlikely to want to give up lesson time for interviews, and pupils are unlikely to want to spend their free time in breaks or after school taking part in them.

However, if the topic is especially relevant or interesting, this will be less of a problem, and the interviewer could use incentives (rewards) to encourage respondents to take part. Group interviews would also be more time efficient.

Younger respondents tend to have more difficulty in keeping to the point, and they often pick up on unexpected details in questions, which can make interviews take longer.

Younger respondents may have a shorter attention span than adults, which means that interviews need to be kept short.

Validity issues

Students may see the interviewer as the ‘teacher in disguise’ – they may see them as part of the hierarchical structure of the institution, which could distort their responses. This could make pupils give socially desirable responses. With questions about homework, for example, students may tell the interviewer they are doing the number of hours that the school tells them they should be doing, rather than the actual number of hours they spend doing homework.

To overcome this the teacher might consider conducting interviews away from school premises and ensure that confidentiality is guaranteed.

Young people’s intellectual and linguistic skills are less developed than adults’, and the interviewer needs to keep in mind that:

  • They may not understand longer words or more complex sentences
  • They may lack the language to be able to express themselves clearly
  • They may have a shorter attention span than adults
  • They may read body language differently to adults

Having said all of that, younger people with poor reading and writing skills are probably going to be more comfortable speaking, which means interviews are nearly always going to be a better choice than questionnaires where younger pupils are concerned.

To ensure greater validity in interviews, researchers should try to do the following:

  • Avoid using leading questions as young people are more suggestible than adults.
  • Use open ended questions
  • Not interrupt students’ responses
  • Learn to tolerate pauses while students think.
  • Avoid repeating questions, as this can make students change their first answer because they think it was wrong.

Unstructured interviews may thus be more suitable than structured interviews, because they make it easier for the researcher to rephrase questions if necessary.

The location may affect the validity of responses – if a student associates school with authority, and the interview takes place in a school, then they are probably more likely to give socially desirable answers.

If the researcher is conducting interviews over several days, later respondents may get wind of the topics/ questions which may influence the responses they give.

Ethical issues

Schools and parents may object to students being interviewed about sensitive topics such as drugs or sexuality, so they may not give consent.

To overcome this the researcher might consider doing interviews with the school alongside their PSHE programme.

Interviews may be unsettling for some students – they are, after all, artificial situations. This could be especially true of group interviews, depending on who is making up the groups.

Group interviews

Peer group interviews may well be a good choice for researchers studying topics within the sociology of education.

Advantages 

  • Group interviews can create a safe environment for pupils
  • Peer-group discussion should be something pupils are familiar with from lessons
  • Peer-support can reduce the power imbalance between interviewer and students
  • The free-flowing nature of the group interview could allow for more information to come forth.
  • The group interview also allows the researcher to observe group dynamics.
  • They are more time efficient than one on one interviews.

Disadvantages

  • Peer pressure may mean students are reluctant to be honest for fear of ridicule
  • Students may also encourage each other to exaggerate or lie for laughs.
  • Group interviews are unpredictable, and very difficult to standardise and repeat, which means they are low in reliability.

Depression leads to more social media usage, not the other way around!

Recent longitudinal research from Brock University in Canada suggests that depression leads to people spending more time on social media, rather than those who spend more time on social media being more likely to develop depression.


This study contradicts many of the ‘moral panic’ type headlines which suggest a link between heavy social media use and depression. Such headlines tend to be based on studies which look at correlations between indicators of depression and indicators of social media use at the same point in time, which cannot tell us which comes first: the depression or the heavy social media use.

This Canadian study followed a sample of teenagers from 2015 (and university students for 6 years) and surveyed them at intervals using a set of questions designed to measure depression levels and another set designed to measure social media usage and other aspects of screen time.

What they found was that teenage girls who showed signs of depression early on in the study were more likely to have higher rates of social media usage later on, leading to the theory that teenage girls who are depressed may well turn to social media to make themselves feel better.

The study found no such relationship for boys, or for adults of either sex.

This is an interesting research study which really goes to show the advantages of the longitudinal method (researching the same sample at intervals over time) in possibly busting a few myths about the harmful effects of social media!


The limitations of School Exclusion Statistics

The Department for Education publishes an annual report on exclusions, the latest edition, published in August 2018, being ‘Permanent and fixed-period exclusions in England: 2016 to 2017’.

The 2018 report shows that the overall rate of permanent exclusions was 0.1 per cent of pupil enrolments in 2016/17. The number of exclusions was 7,720.
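As a quick sanity check (a sketch, not anything from the report itself), the 0.1 per cent rate and the 7,720 exclusions together imply roughly 7.7 million pupil enrolments, which is about the size of the school population in England:

```python
# Sanity-check the DfE headline figures: 7,720 permanent exclusions at a rate
# of 0.1% of pupil enrolments implies roughly 7.7 million enrolments in total.
exclusions = 7_720
rate = 0.001  # 0.1 per cent

implied_enrolments = exclusions / rate
print(f"Implied enrolments: {implied_enrolments:,.0f}")  # Implied enrolments: 7,720,000

# Reverse check: the exclusion rate implied by that many enrolments
print(f"Implied rate: {exclusions / round(implied_enrolments):.1%}")  # Implied rate: 0.1%
```

So the two headline figures are at least internally consistent with each other.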


The report also goes into more detail, for example….

  • The vast majority of exclusions (over 85%) were from secondary schools.
  • The three main reasons for permanent exclusions (not counting ‘other’) were:
    • Persistent disruptive behaviour
    • Physical assault against a pupil
    • Physical assault against an adult.

Certain groups of students are far more likely to be permanently excluded:

  • Free School Meals (FSM) pupils had a permanent exclusion rate four times higher than non-FSM pupils
  • FSM pupils accounted for 40.0% of all permanent exclusions
  • The permanent exclusion rate for boys was over three times higher than that for girls
  • Over half of all permanent exclusions occur in national curriculum year 9 or above. A quarter of all permanent exclusions were for pupils aged 14
  • Black Caribbean pupils had a permanent exclusion rate nearly three times higher than the school population as a whole.
  • Pupils with identified special educational needs (SEN) accounted for around half of all permanent exclusions

The ‘reasons why’ and ‘types of pupil’ data probably hold no surprises, but NB there are quite a few limitations with the above data, and so these stats should be treated with caution!

Limitations of data on permanent exclusions

Validity problems…

According to this Guardian article, the figures do not take into account ‘informal exclusions’ or ‘off-rolling’ – where schools convince parents to withdraw their children without making a formal exclusion order – technically it’s then down to the parents to enrol their child at another institution or home-educate them, but in many cases this doesn’t happen.

According to research conducted by FFT Education Datalab, up to 7,700 students go missing from the school roll between year 7 and year 11, when they are supposed to sit their GCSEs – equivalent to a 1.4% drop-out rate from first enrolment at secondary school to GCSEs.

Datalab took their figures from the annual school census and the DfE’s national pupil database. The cohort’s numbers were traced from year seven, the first year of secondary school, up until taking their GCSEs in 2017.

The entire cohort enrolled in year 7 in state schools in England in 2013 was 550,000 children.

However, by time of sitting GCSEs:

  • 8,700 pupils were in alternative provision or pupil referral units
  • nearly 2,500 had moved to special schools
  • 22,000 had left the state sector (up from 20,000 in 2014). Of these 22,000:
    • 3,000 had moved to mainstream private schools
    • just under 4,000 were enrolled or sat their GCSEs at a variety of other education institutions
    • around 60% of the remaining 15,000 children were estimated (using emigration data by age and internal migration data) to have moved away from England, in some cases to other parts of the UK such as Wales
    • that leaves between 6,000 and 7,700 former pupils unaccounted for, who appear not to have sat any GCSE or equivalent qualifications or been counted in school data.

Working out the percentages, this means that by GCSEs the following proportions of the original year 7 cohort had been ‘moved on’ from their original schools:

  • 6%, or around 32,000 students in all, including the 8,700 moved to ‘state funded alternative provision’, e.g. Pupil Referral Units.
  • 4%, or around 22,000, left the mainstream state sector altogether (presumably due to exclusion or ‘coerced withdrawal’, i.e. off-rolling), of which
  • 1.4%, or up to 7,700, cannot be found in any educational records!
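These percentages can be checked against the raw figures quoted above (a minimal sketch; the 550,000 cohort size and the sub-totals are the ones listed in the Datalab research earlier):

```python
# Check the 'moved on' percentages against the original year 7 cohort
cohort = 550_000

alternative_provision = 8_700   # pupil referral units etc.
special_schools = 2_500
left_state_sector = 22_000
unaccounted_for = 7_700         # upper estimate

# 33,200 in total, close to the article's rounded 'around 32,000'
moved_on = alternative_provision + special_schools + left_state_sector

print(f"Moved on: {moved_on / cohort:.1%}")                    # Moved on: 6.0%
print(f"Left state sector: {left_state_sector / cohort:.1%}")  # Left state sector: 4.0%
print(f"Unaccounted for: {unaccounted_for / cohort:.1%}")      # Unaccounted for: 1.4%
```

The arithmetic confirms that the 7,700 unaccounted-for pupils correspond to 1.4% of the cohort, not 4%.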

This Guardian article provides a decent summary of the research.

Further limitations of data on school exclusions

  • There is very little detail on why pupils were excluded, other than the ‘main reason’ formally recorded by the head teacher in each school. There is no information at all about the specific act or the broader context. Labelling theorists might have something to say about this!
  • There is a significant time gap between recording and publication of the data. This data was published in summer 2018 and covers exclusions in the academic year 2016-2017. Given that you might be looking at this in 2019 (data is published annually) and that there is probably a ‘long history’ behind many exclusions (i.e. pupils probably get more than one second chance), this data refers to events that happened 2 or more years ago.

Relevance of this to A-level sociology

This is of obvious relevance to the education module… it might be something of a wake-up call that 4% of students leave mainstream secondary education before making it to GCSEs, and that 1.4% seem to end up out of education and not sitting GCSEs!

It’s also a good example of why independent longitudinal studies provide a more valid figure of exclusions (and ‘informal’ exclusions) than the official government statistics on this.


I’ll be producing more posts on why students get excluded, and on what happens to them when they do and the consequences for society in coming weeks.


This is a topic that interests me, shame it’s not a direct part of the A level sociology education spec!

Do 25% of children really have their own mobiles? Invalid Research Example #01

This is a ‘new thread’ idea… posting up examples of naff research. I figure there are two advantages to this…

  1. It’s useful for students to have good examples of naff research, to show them the meaning of ‘invalid data’ or ‘unrepresentative samples’, or in this case, just plain unreferenced material which may as well be ‘Fake News’.
  2. At least I get some kind of payback (in the form of the odd daily post) for having wasted my time wading through this drivel.

My first example is from The Independent, the ex-newspaper turned click-bait website.

I’ve been doing a bit of research on smart phone usage statistics this week and I came across this 2018 article in the Independent: Quarter of Children under 6 have a smartphone, study finds.


The article provides the following statistics

  • 25% of children under 6 now have their own mobile
  • 12% of children under 6 spend more than 24 hours a week on their mobile
  • 80% of parents admit to not limiting the amount of time their children spend on games

Eventually it references a company called musicMagpie (which is an online store) but fails to provide a link to the research, and provides no information at all about the sampling methods used or other details of the survey (i.e. the actual questions, or how it was administered). I dug around for a few minutes, but couldn’t find the original survey either.

The above figures just didn’t sound believable to me, and they don’t tie in with OFCOM’s 2017 findings which say that only 5% of 5-7 year olds and 1% of 3-4 year olds have their own mobiles.

As it stands, because of the simple fact that I can’t find details of the survey, these research findings from musicMagpie are totally invalid.

I’m actually quite suspicious that the two companies have colluded to generate some misleading click-bait statistics to drive people to their websites to increase advertising and sales revenue.

If you cannot validate your sources, then do not use the data!

Have one in five Britons really considered going vegan?

According to a recent poll (1) of 1000 people, one in five Britons have considered going vegan, which is 20% of the population.
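It’s also worth remembering how much sampling error a poll of 1,000 people carries even when taken at face value. A minimal sketch of the 95% confidence interval for a 20% result, assuming simple random sampling (which real polls only approximate):

```python
import math

# 95% confidence interval for a proportion from a simple random sample
n = 1_000   # poll sample size
p = 0.20    # proportion who said they had 'considered going vegan'

standard_error = math.sqrt(p * (1 - p) / n)
margin = 1.96 * standard_error  # half-width of the 95% confidence interval

print(f"Margin of error: +/- {margin:.1%}")             # Margin of error: +/- 2.5%
print(f"95% CI: {p - margin:.1%} to {p + margin:.1%}")  # 95% CI: 17.5% to 22.5%
```

So the poll itself can only pin the figure down to somewhere between roughly 17.5% and 22.5% of the population, and that is before any validity problems are considered.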

But how many of these people have a genuine intention of going vegan? Possibly not that many…..

Firstly, if someone’s asking you questions about veganism, there is going to be a degree of social pressure to state that ‘you have thought about going vegan’…. so social desirability is going to come into play here!

Secondly, vague questioning doesn’t help… the ‘I’ve considered going vegan’ response covers everything from ‘I’m definitely going Vegan in January’ to ‘I thought about it once, but really I’ve got no serious intention of giving up meat’.

Finally, there’s the problem that a third of the general population seem confused as to what veganism entails…. 27% think vegans can’t eat fruit (God knows what they think a vegan diet consists of!), while 6% think it’s OK to eat fish if you’re a vegan.

Fish: those vegetables what swim in the sea? 


However, apparently 3.5 million people in the UK are now vegan, which suggests enough of a baseline figure to make 20% of the population ‘thinking’ about going vegan not seem completely unrealistic.

Then there’s the fact that 100K people signed up for Veganuary 2018, and probably more this year, meaning that veganism is in the news a lot more than it used to be, even a couple of years ago.

Having said that, veganism may be on the increase, but apparently 15% of vegans think it’s OK to eat dairy and eggs.

Sources 

(1) Poll of 1000 people

Research Methods Practice Questions for A-level sociology

AQA A-level sociology Papers 1 and 3 will both contain an ‘outline and explain’ 10 mark (no item) question on sociological theories, and/ or methods.

One possible format for this question is what I like to think of as the ‘pure research methods’ format (‘classic’ might be a better word than ‘pure’) in which students are asked to outline and explain two theoretical, practical or ethical advantages or problems of using one of the main research methods.

For example (taken from the AQA’s June 2017 Education with Theory and Methods paper): ‘Outline and explain two problems of using documents in social research’

There are actually 36 possible variations of such ‘pure’ or ‘classic’ research methods questions, as outlined in the flow chart below.

Outline and Explain 10 mark research methods questions
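The flow chart isn’t reproduced in this text, so the exact breakdown is an assumption on my part, but one way to arrive at 36 variations is six main methods × three types of issue × two framings. A quick sketch (the list of six methods below is hypothetical, not taken from the flow chart itself):

```python
from itertools import product

# Hypothetical reconstruction of the 36 'pure' research methods questions
methods = ["questionnaires", "structured interviews", "unstructured interviews",
           "participant observation", "experiments", "documents"]
issues = ["theoretical", "practical", "ethical"]
framings = ["advantages", "problems"]

questions = [
    f"Outline and explain two {issue} {framing} of using {method} in social research (10)"
    for method, issue, framing in product(methods, issues, framings)
]

print(len(questions))  # 36
print(questions[0])    # e.g. '...two theoretical advantages of using questionnaires...'
```

Whatever the exact categories on the flow chart, the point is the same: a small set of methods and issue types multiplies out into a large bank of possible exam questions to practise.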

Students may be asked to give two advantages or problems of any of the above methods, or more specific methods (field experiments for example), or they may be asked to give two advantages of using overt compared to covert participant observation, or asked to simply give two ethical problems which you may encounter when doing research more generally.

Then of course, students may be asked to relate methods to theories, or just asked about a pure ‘theoretical’/ perspectives question.

While there is no guarantee that this particular format of question will actually come up on either paper 1 or 3, it’s still good practice for students to work through a number of such questions as revision practice.

Outline and explain two practical problems of using documents in social research (10)

There are a lot of documents available, and it can be time-consuming to analyse them qualitatively.

Taking news for example, there are thousands of news items published every day.

You also need to distinguish between ‘real’ and ‘fake’ news.

Also, in the postmodern age, where fewer people get their news from mainstream sources, it is necessary to analyse a wide range of media content to achieve representativeness, which makes this more difficult.

Because there are so many documents available today, it is necessary to use computer-assisted qualitative analysis, which effectively quantifies the qualitative data, meaning that some of the depth and insight are lost in the process.

With personal documents, gaining access might be a problem

Personal diaries are one of the most authentic sources of information because people write them with no intention of them being seen.

However, they may not be willing to show researchers the content because diaries may contain negative feelings about people close to them, which could harm those people if revealed.

Blogs would be easier to access, but the problem is that people will edit out much of what they really feel because these are published.