Appraising the Quality of Knowledge Resources


This article or area is currently under construction and may only be partially complete. Please come back soon to see the finished work! (25/11/2023)

Original Editor - User Name

Top Contributors - Wanda van Niekerk, Jess Bell and Kim Jackson  

Introduction

The third step in the evidence-based practice process is appraising the quality of the resources found after creating a clinical question and locating the resources. Critically appraising a paper or study means closely examining its results so that you can decide whether it is worthy of being used to inform your clinical practice. (Hoffman et al.) Unfortunately, just because a study is peer-reviewed and published does not necessarily mean that it is of good quality. (Ioannidis 2016) Clinical research is often criticised as not useful because it is not sufficiently pragmatic (i.e. applicable to real-life scenarios), patient-centred, transparent or feasible. With so many research papers published, it is also challenging to decide which studies to use. This is where critical appraisal of studies is advised.

How to Deal with Information Overload?

The vast amount of published information available creates two questions for rehabilitation professionals. (Benita)

1. How do you ensure that you find what you need in the vast amount of available information?
a. Develop a well-thought-through clinical question
b. Use specific search terms (a small search string sketch is shown below)
2. How do you know which findings are based on sound science and are applicable to your specific case or scenario?

If good summarised evidence-based information, such as a systematic review or a clinical practice guideline, is available that answers your exact clinical question, use it, learn from it and apply it as you see fit with your clinical reasoning skills.
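
As an illustration of point 1b, the following is a minimal sketch of how a PICO-style question can be turned into a Boolean search string. The clinical question, search terms and helper function are hypothetical examples for illustration only; they are not taken from this article or from any specific database's documentation.

<syntaxhighlight lang="python">
# Hypothetical PICO concepts for an illustrative clinical question
pico = {
    "population": ["chronic low back pain"],
    "intervention": ["exercise therapy", "motor control exercise"],
    "comparison": [],  # the comparison is often not included as a search term
    "outcome": ["pain", "disability"],
}

def build_search_string(pico_terms):
    """Join synonyms within a concept with OR, and join concepts with AND."""
    blocks = []
    for terms in pico_terms.values():
        if terms:  # skip concepts with no search terms
            blocks.append("(" + " OR ".join(f'"{t}"' for t in terms) + ")")
    return " AND ".join(blocks)

print(build_search_string(pico))
# ("chronic low back pain") AND ("exercise therapy" OR "motor control exercise") AND ("pain" OR "disability")
</syntaxhighlight>

Databases such as PubMed accept this kind of Boolean combination of terms, although the exact syntax, field tags and filters vary between databases.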

Steps in Appraising the Quality of Knowledge Resources

Before using the results of a study to help you with the clinical decision-making process, it is important to determine whether the study used sound methods. Poorly designed studies may lead to bias and may provide you with misleading results. (Hoffman et al.)

Levels of Evidence

Levels of evidence (also referred to as hierarchies of evidence) help in deciding which study type will provide the best evidence for a specific question. The heuristic ("rule of thumb") developed by the Oxford Centre for Evidence Based Medicine is a useful tool to refer to:

* Introductory document: [https://www.cebm.ox.ac.uk/resources/levels-of-evidence/levels-of-evidence-introductory-document Levels of Evidence: An introduction]
* Background document: [https://www.cebm.ox.ac.uk/files/levels-of-evidence/cebm-levels-of-evidence-background-document-2-1.pdf CEBM Background document: Explanation of the 2011 Oxford Centre for Evidence-Based Medicine (OCEBM) Levels of Evidence]
* [https://www.cebm.ox.ac.uk/files/levels-of-evidence/cebm-levels-of-evidence-2-1.pdf Levels of evidence document]

Study designs

Clinical practice guidelines are documents that contain recommendations for the management of a specific condition. A group of stakeholders or experts comes together, forms a committee and puts the recommendations together.

Systematic reviews pool together various primary or individual studies.

The Centre for Evidence-Based Medicine provides a study design tree that can help you work out what type of study you are reading. The design tree has two arms: descriptive and analytical.

Descriptive studies consider a population and an outcome, with no intervention. Examples include surveys and qualitative studies; these describe a problem rather than examine relationships or associations.

Analytical studies follow a PICO or PECO structure: Population, Intervention / Exposure, Comparison, Outcome.

Experimental studies involve active involvement of the researcher: one group may be given an intervention, another group acts as a control group, and the groups are compared.

For more information on these study types, see: https://www.physio-pedia.com/Quantitative_Research

Randomised controlled trials are used to determine the effect of an intervention.

Randomisation ensures that unknown or confounding factors are equally distributed between groups.

Concealed allocation means that the person enrolling participants cannot foresee which group the next participant will be allocated to.

Blinding means that participants, clinicians and/or outcome assessors are unaware of group allocation.

Types of randomised trials include parallel-group and crossover designs.
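
To make the ideas of randomisation and a pre-generated allocation sequence more concrete, here is a minimal sketch in Python. The group names, sample size and seed are hypothetical, and real trials use dedicated randomisation procedures or services rather than a few lines of code.

<syntaxhighlight lang="python">
import random

def generate_allocation_sequence(n_participants, groups=("intervention", "control"), seed=None):
    """Randomly allocate each participant to one of the groups.

    Generating the whole sequence in advance, and keeping it hidden from the
    people who enrol participants, is what supports concealed allocation.
    """
    rng = random.Random(seed)
    return [rng.choice(groups) for _ in range(n_participants)]

# Example: a 20-participant, two-arm parallel-group trial
sequence = generate_allocation_sequence(20, seed=42)
print(sequence)
# With larger samples, simple randomisation tends to distribute known and
# unknown confounding factors evenly between the groups.
</syntaxhighlight>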

Another resource to be aware of is the GetReal Trial Tool.

In the observational analytical arm, the researcher is less involved than in an experimental design.

Cohort studies follow a group of participants over a period of time to examine the effect of risk factors on an outcome.

Cross-sectional studies give a snapshot of what is going on in a population at a specific point in time.

Case-control studies compare two groups: one with the outcome of interest and one without.

How to decide what study design was used

The Centre for Evidence-Based Medicine provides a series of questions that can help you work out which study design was used. A figure by Grimes and Schulz also illustrates how study designs can be classified.
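
As a rough illustration of the kind of yes/no questions these tools ask, here is a simplified sketch. This is a hypothetical example rather than the official CEBM questions or the Grimes and Schulz algorithm, and it deliberately ignores many nuances.

<syntaxhighlight lang="python">
def classify_study_design(researcher_assigned_intervention: bool,
                          has_comparison_group: bool,
                          groups_defined_by_outcome: bool,
                          followed_over_time: bool) -> str:
    """Return a rough study design label based on a few yes/no questions."""
    if researcher_assigned_intervention:
        # The researcher actively assigns the intervention
        return "experimental study (e.g. randomised controlled trial)"
    if not has_comparison_group:
        # No comparison group: the study only describes a population and outcome
        return "descriptive study (e.g. survey, qualitative study, case series)"
    if groups_defined_by_outcome:
        # Groups are selected by outcome status; exposure is examined retrospectively
        return "case-control study"
    if followed_over_time:
        # Groups are defined by exposure and followed forward in time
        return "cohort study"
    # Exposure and outcome are measured at a single point in time
    return "cross-sectional study"

print(classify_study_design(False, True, False, True))  # -> cohort study
</syntaxhighlight>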

Other study designs and forms of evidence include historically controlled studies, mechanism-based reasoning and case series.

Critical Appraisal

Useful resources for critical appraisal include the EQUATOR Network, which collects reporting guidelines for different study designs, and the Critical Appraisal Skills Programme (CASP), which provides appraisal checklists for different study types.


Resources


References