The polls clearly show that most people think Boris will be a strong leader, and a different type of leader to previous PMs, but not in a good way: most people don’t trust Boris and think he’s going to make a terrible Prime Minister….
Most people think he’ll be a different type of leader…
But almost 60% of people don’t trust Boris
One of the more creative questions was what Hogwarts House Boris Johnson would be in – no surprise that 42% of the population put him in Slytherin – which values ambition and cunning.
We need to treat these results with caution: the negative responses may be because of the lack of say most people have had over Boris being elected, or about the lack of any kind of progress over Brexit.
In other words, people may not be expressing their dissatisfaction with Boris in particular, but possibly at the whole of the inept political class in general!
Having said that, I’m not going to dismiss criticism of Boris: he is an Eton educated millionaire who seems to be prepared to lie and spin his way to the top, always putting his own personal ambition ahead of anything else.
According to the 2019 Global Drug Survey, 86% of respondents had used drugs in the last year, more respondents had used cannabis than tobacco, and almost 30% had used cocaine in the last year.
This survey asked adults in 36 countries about their use of drugs and alcohol.
According to the same survey, the British get drunk more often than people in any other nation.
In Britain, people said they got drunk an average of 51 times last year, with the U.S., Canada and Australia not far behind. The global average was 33 times.
Where cocaine use was concerned, 73% of respondents in England said they had tried it, compared to 43% globally.
How valid is this data?
I don’t know about you, but to me these figures seem very high, and I’m left wondering if they aren’t skewed upwards by selective sampling or loose questions.
This report is produced by a private company which sells products related to addiction advice, and I guess their market is national health care services.
Seems to me like it’s in their interests to skew the data upwards to give themselves more of a purpose.
I certainly don’t believe the average person in the UK gets drunk once a week, or that almost three-quarters of the population have tried cocaine.
Interviews are one of the most commonly used qualitative research methods in the sociology of education. In this post I consider some of the strengths and limitations of using interviews to research education, focussing mainly on unstructured interviews.
This post is primarily designed to get students thinking about methods in context, or ‘applied research methods’. Before reading through this students might like to brush up on methods in context by reading this introductory post. Links to other methods in context advice posts can be found at the bottom of the research methods page (link above!)
Practical issues with interviews
Gaining access may be a problem as schools are hierarchical institutions, and the lower down the hierarchy an individual is, the more permissions the interviewer will require to gain access to interview them. For example, you might require the headmaster’s permission to interview a teacher, while to interview pupils you’ll require both the headmaster’s and their parents’ permission.
However, if you can gain consent, and get the headmaster onside, the hierarchy may make doing interviews more efficient – the headmaster can instruct teachers to release pupils from lessons to do the interviews, for example.
Interviews tend to take more time than questionnaires, so finding the time to do them may be a problem – busy teachers are unlikely to want to give up lesson time for interviews, and pupils are unlikely to want to spend their free time during breaks or after school taking part in them.
However, if the topic is especially relevant or interesting, this will be less of a problem, and the interviewer could use incentives (rewards) to encourage respondents to take part. Group interviews would also be more time efficient.
Younger respondents tend to have more difficulty in keeping to the point, and they often pick up on unexpected details in questions, which can make interviews take longer.
Younger respondents may have a shorter attention span than adults, which means that interviews need to be kept short.
Students may see the interviewer as the ‘teacher in disguise’ – they may see them as part of the hierarchical structure of the institution, which could distort their responses. This could make pupils give socially desirable responses. With questions about homework, for example, students may tell the interviewer they are doing the number of hours that the school tells them they should be doing, rather than the actual number of hours they spend doing homework.
To overcome this, the researcher might consider conducting interviews away from school premises and ensuring that confidentiality is guaranteed.
Young people’s intellectual and linguistic skills are less developed than adults’, and the interviewer needs to keep in mind that:
They may not understand longer words or more complex sentences.
They may lack the language to be able to express themselves clearly
They may have a shorter attention span than adults
They may read body language differently to adults
Having said all of that, younger people with poor communication skills are probably going to be more comfortable speaking than reading and writing, which means interviews are nearly always going to be a better choice than questionnaires where younger pupils are concerned.
To ensure greater validity in interviews, researchers should try to do the following:
Avoid using leading questions as young people are more suggestible than adults.
Use open ended questions
Not interrupt students’ responses
Learn to tolerate pauses while students think.
Avoid repeating questions, as this can make students change their first answer because they think it was wrong.
Unstructured interviews may thus be more suitable than structured interviews, because they make it easier for the researcher to rephrase questions if necessary.
The location may affect the validity of responses – if a student associates school with authority, and the interview takes place in a school, then they are probably more likely to give socially desirable answers.
If the researcher is conducting interviews over several days, later respondents may get wind of the topics/ questions which may influence the responses they give.
Schools and parents may object to students being interviewed about sensitive topics such as drugs or sexuality, so they may not give consent.
To overcome this the researcher might consider doing interviews with the school alongside their PSHE programme.
Interviews may be unsettling for some students – they are, after all, artificial situations. This could be especially true of group interviews, depending on who is making up the groups.
Peer group interviews may well be a good choice for researchers studying topics within the sociology of education.
Group interviews can create a safe environment for pupils
Peer-group discussion should be something pupils are familiar with from lessons
Peer-support can reduce the power imbalance between interviewer and students
The free-flowing nature of the group interview could allow for more information to come forth.
The group interview also allows the researcher to observe group dynamics.
They are more time efficient than one on one interviews.
Peer pressure may mean students are reluctant to be honest for fear of ridicule
Students may also encourage each other to exaggerate or lie for laughs.
Group interviews are unpredictable, and very difficult to standardise and repeat, which means they are low in reliability.
This study contradicts many of the ‘moral panic’ type headlines which suggest a link between heavy social media use and depression. Such headlines tend to be based on studies which look at correlations between indicators of depression and indicators of social media use at the same point in time, which cannot tell us which comes first: the depression or the heavy social media use.
This Canadian study followed a sample of teenagers from 2015 (and university students for 6 years) and surveyed them at intervals using a set of questions designed to measure depression levels and another set designed to measure social media usage and other aspects of screen time.
What they found was that teenage girls who showed signs of depression early on in the study were more likely to have higher rates of social media usage later on, leading to the theory that teenage girls who are depressed may well turn to social media to make themselves feel better.
The study found no such relationship for boys, or for adults of either sex.
This is an interesting research study which really goes to show the advantages of the longitudinal method (researching the same sample at intervals over time) in possibly busting a few myths about the harmful effects of social media!
The 2018 report shows that the overall rate of permanent exclusions was 0.1 per cent of pupil enrolments in 2016/17. The number of exclusions was 7,720.
The report also goes into more detail, for example….
The vast majority of permanent exclusions (over 85%) were from secondary schools.
The three main reasons for permanent exclusions (not counting ‘other’) were
Persistent disruptive behaviour
Physical assault against a pupil
Physical assault against an adult.
Certain groups of students are far more likely to be permanently excluded:
Free School Meals (FSM) pupils had a permanent exclusion rate four times higher than non-FSM pupils
FSM pupils accounted for 40.0% of all permanent exclusions
The permanent exclusion rate for boys was over three times higher than that for girls
Over half of all permanent exclusions occur in national curriculum year 9 or above. A quarter of all permanent exclusions were for pupils aged 14
Black Caribbean pupils had a permanent exclusion rate nearly three times higher than the school population as a whole.
Pupils with identified special educational needs (SEN) accounted for around half of all permanent exclusions
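The headline figures above can be cross-checked with a few lines of arithmetic. This is a rough sketch: the only inputs are the 0.1% rate, the 7,720 count, the 4x FSM ratio and the 40% FSM share quoted above, and the algebra assumes those rounded figures are exact.

```python
# Rough consistency checks on the permanent exclusions statistics.
# All input figures are taken from the DfE bullet points above.

exclusions = 7_720
rate = 0.001  # 0.1% of pupil enrolments

# The rate and the count together imply the size of the enrolment base.
implied_enrolments = exclusions / rate
print(f"{implied_enrolments:,.0f} implied pupil enrolments")  # ~7,720,000

# If FSM pupils are excluded at 4x the non-FSM rate and account for 40%
# of exclusions, the implied FSM share p of all pupils solves:
#   4p / (4p + (1 - p)) = 0.4  =>  2.8p = 0.4
fsm_share = 0.4 / 2.8
print(f"implied FSM share of pupils: {fsm_share:.1%}")  # ~14.3%
```

The implied FSM share of roughly 14% is in the right ballpark for FSM eligibility in England, which suggests the two FSM figures are at least mutually consistent.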
The ‘reasons why’ and ‘types of pupil’ data probably hold no surprises, but NB there are quite a few limitations with the above data, and so these stats should be treated with caution!
Limitations of data on permanent exclusions
According to this Guardian article, the figures do not take into account ‘informal exclusions’ or ‘off-rolling’ – where schools convince parents to withdraw their children without making a formal exclusion order – technically it’s then down to the parents to enrol their child at another institution or home-educate them, but in many cases this doesn’t happen.
According to research conducted by FFT Education Datalab, up to 7,700 students go missing from the school roll between year 7 and year 11, when they are supposed to sit their GCSEs – equivalent to a 1.4% drop-out rate from first enrolment at secondary school to GCSEs.
Datalab took their figures from the annual school census and the DfE’s national pupil database. The cohort’s numbers were traced from year seven, the first year of secondary school, up until taking their GCSEs in 2017.
The entire cohort enrolled in year 7 in state schools in England in 2013 was 550,000 children
However, by time of sitting GCSEs:
8,700 pupils were in alternative provision or pupil referral units,
nearly 2,500 had moved to special schools
22,000 had left the state sector (an increase from 20,000 in 2014). Of the 22,000:
3,000 had moved to mainstream private schools
Just under 4,000 were enrolled or sat their GCSEs at a variety of other education institutions.
Datalab used emigration data by age and internal migration data to estimate that around 60% of the remaining 15,000 children were likely to have moved away from England, in some cases to other parts of the UK such as Wales.
That leaves between 6,000 and 7,700 former pupils unaccounted for, who appear not to have sat any GCSE or equivalent qualifications or been counted in school data.
Working out the percentages, this means that by the time of GCSEs, the following proportions of the original year 7 cohort had been ‘moved on’ from their original schools:
6% (around 32,000 students) in all, of which around 11,000 were moved to state-funded alternative provision or special schools, e.g. Pupil Referral Units.
4% (22,000) left the mainstream state sector altogether, presumably due to exclusion or ‘coerced withdrawal’ (i.e. off-rolling), of which
1.4% (up to 7,700) cannot be found in any educational records!
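These percentages are easy to mis-transcribe, so here is a quick sanity check against the cohort figures quoted above (all figures are approximate, taken from the Datalab numbers):

```python
# Sanity check of the 'moved on' percentages against the 550,000 cohort.
cohort = 550_000            # year 7 enrolment in English state schools, 2013

left_state_sector = 22_000  # left the state sector entirely by GCSEs
alt_provision = 8_700       # in alternative provision / pupil referral units
special_schools = 2_500     # moved to special schools
unaccounted = 7_700         # upper estimate of pupils missing from all records

moved_on = left_state_sector + alt_provision + special_schools  # ~33,000

print(f"moved on overall:  {moved_on / cohort:.1%}")           # ~6%
print(f"left state sector: {left_state_sector / cohort:.1%}")  # 4.0%
print(f"unaccounted for:   {unaccounted / cohort:.1%}")        # 1.4%
```

Note that the 7,700 unaccounted-for pupils are 1.4% of the original cohort, matching Datalab’s drop-out figure, not 4%.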
There is very little detail on why pupils were excluded, other than the ‘main reason’ formally recorded by the head teacher in each school. There is no information at all about the specific act or the broader context. Labelling theorists might have something to say about this!
There is a significant time gap between recording and publication of the data. This data was published in summer 2018 and covers exclusions in the academic year 2016-2017. Given that you might be looking at this in 2019 (data is published annually) and that there is probably a ‘long history’ behind many exclusions (i.e. pupils probably get more than one second chance), this data refers to events that happened 2 or more years ago.
Relevance of this to A-level sociology
This is of obvious relevance to the education module… it might be something of a wake-up call that 4% of students leave mainstream secondary education before making it to GCSEs, and that 1.4% seem to end up out of education and not sitting GCSEs!
This is a ‘new thread’ idea… posting up examples of naff research. I figure there are two advantages to this…
It’s useful for students to have good examples of naff research, to show them the meaning of ‘invalid data’ or ‘unrepresentative samples’, or in this case, just plain unreferenced material which may as well be ‘Fake News’.
At least I get some kind of pay back (in the form of the odd daily post) for having wasted my time wading through this drivel.
My first example is from The Independent, the ex-newspaper turned click-bait website.
12% of children under 6 spend more than 24 hours a week on their mobile
80% of parents admit to not limiting the amount of time their children spend on games
Eventually it references a company called MusicMagpie (an online store) but fails to provide a link to the research, and provides no information at all about the sampling methods used or other details of the survey (i.e. the actual questions, or how it was administered). I dug around for a few minutes, but couldn’t find the original survey either.
The above figures just didn’t sound believable to me, and they don’t tie in with OFCOM’s 2017 findings which say that only 5% of 5-7 year olds and 1% of 3-4 year olds have their own mobiles.
As it stands, because of the simple fact that I can’t find details of the survey, these research findings from musicMagpie are totally invalid.
I’m actually quite suspicious that the two companies have colluded to generate some misleading click-bait statistics to drive people to their websites to increase advertising and sales revenue.
If you cannot validate your sources, then do not use the data!
According to a recent poll (1) of 1,000 people, one in five Britons (20%) have considered going vegan.
But how many of these people have a genuine intention of going vegan? Possibly not that many…..
Firstly, if someone’s asking you questions about veganism, there is going to be a degree of social pressure to state that ‘you have thought about going vegan’…. so social desirability is going to come into play here!
Secondly, vague questioning doesn’t help… the ‘I’ve considered going vegan’ response covers everything from ‘I’m definitely going Vegan in January’ to ‘I thought about it once, but really I’ve got no serious intention of giving up meat’.
Finally, there’s the problem that around a third of the general population seems confused as to what veganism entails…. 27% think vegans can’t eat fruit (God knows what they think a vegan diet consists of!), while 6% think it’s OK to eat fish if you’re a vegan.
AQA A-level sociology Papers 1 and 3 will both contain an ‘outline and explain’ 10 mark (no item) question on sociological theories, and/ or methods.
One possible format for this question is what I like to think of as the ‘pure research methods’ format (‘classic’ might be a better word than ‘pure’) in which students are asked to outline and explain two theoretical, practical or ethical advantages or problems of using one of the main research methods.
For example (taken from the AQA’s June 2017 Education with Theory and Methods paper): ‘Outline and explain two problems of using documents in social research’
There are actually 36 possible variations of such ‘pure’ or ‘classic’ research methods questions, as outlined in the flow chart below.
Students may be asked to give two advantages or problems of any of the above methods, or more specific methods (field experiments for example), or they may be asked to give two advantages of using overt compared to covert participant observation, or asked to simply give two ethical problems which you may encounter when doing research more generally.
Then of course, students may be asked to relate methods to theories, or just asked about a pure ‘theoretical’/ perspectives question.
While there is no guarantee that this particular format of question will actually come up on either paper 1 or 3, it’s still good practice for students to work through a number of such questions as revision practice.
There are a lot of documents available and it can be time consuming to analyse them qualitatively
Taking news for example, there are thousands of news items published every day.
You also need to distinguish between ‘real and ‘fake news’.
Also, in the postmodern age, where fewer people get their news from mainstream sources, it is necessary to analyse a wide range of media content to achieve representativeness, which makes this more difficult.
Because there are so many documents available today, it is necessary to use computer-assisted qualitative analysis, which effectively quantifies the qualitative data, meaning that some of the depth and insight are lost in the process.
With personal documents, gaining access might be a problem
Personal diaries are one of the most authentic sources of information because people write them with no intention of them being seen.
However, people may not be willing to show researchers the content, because diaries may contain negative feelings about people close to them, which could cause harm if revealed.
Blogs would be easier to access, but the problem is that people will edit out much of what they really feel, because blogs are published.
Official Statistics are a quick and cheap means of accessing data relevant to an entire population in a country.
They are cheap for researchers to use because they are collected by governments, who often make them available online for free—for example, the UK Census.
Marxists might point out that the fact they are free enables marginalised groups to ‘keep a check on government’.
More generally, they are useful for making quick evaluations of government policy, to see if taxpayers’ money is being spent effectively.
Official statistics are a very convenient way of making cross national comparisons without visiting other countries.
Most governments in the developed world today collect official statistics which are made available for free.
More and more governments collect data around the world, so there is more and more data available every year.
The United Nations Development Programme, for example, collects the same data in the same way across many countries, making it easy to assess the relationship between economic and social development in a global age.
Theory and Methods A Level Sociology Revision Bundle
If you like this sort of thing, then you might like my Theory and Methods Revision Bundle – specifically designed to get students through the theory and methods sections of A level sociology papers 1 and 3.
74 pages of revision notes
15 mind maps on various topics within theory and methods