Journal of Korean Academy of Fundamentals of Nursing 1998;5(2): 269
Interrater Reliability in the Content Analysis of Preparatory Information for Mechanically Ventilated Patients
Hwa-Soon Kim
Department of Nursing, College of Medicine, Inha University
Abstract
In nursing research in which data are collected through clinical observation, analysis of clinical records, or coding of interpersonal interaction in clinical settings, testing and reporting interrater reliability is essential to ensure reliable results. Assessing interrater reliability in such studies should follow two steps. The first step is to determine unitizing reliability, defined as consistency among two or more raters in identifying the same data elements when reviewing the same record. Unitizing reliability has rarely been reported in previous studies, yet it should be established as a precondition before proceeding to the next step. The second step is to determine interpretive reliability. Cohen's kappa is the preferred method for calculating the extent of agreement between observers or judges because it corrects for chance agreement. Despite its usefulness, kappa can sometimes yield paradoxical conclusions and can be difficult to interpret. These difficulties arise because kappa is affected in complex ways by the presence of bias between observers and by the true prevalence of certain categories. Therefore, percentage agreement should be reported alongside kappa so that kappa can be interpreted adequately. The presence of bias should be assessed with the bias index, and the effect of prevalence with the prevalence index. Researchers have typically reported only global reliability, which reflects the extent to which coders can consistently apply the whole coding system across all categories. Category-by-category reliability should also be reported, because it reveals whether some categories are harder to use than others.
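The quantities discussed above can be illustrated with a short computation. The sketch below, which is not part of the original study, computes percentage agreement, Cohen's kappa, and the standard prevalence and bias indices from the four cells of a two-rater, two-category agreement table (the function name and the example counts are illustrative only):

```python
def kappa_stats(a, b, c, d):
    """Agreement statistics for a 2x2 table of two raters.

    a: both raters code the category as present
    d: both raters code it as absent
    b, c: the two kinds of disagreement (rater 1 vs. rater 2)
    """
    n = a + b + c + d
    # Observed percentage agreement: proportion of units coded identically.
    po = (a + d) / n
    # Chance-expected agreement from each rater's marginal totals.
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    # Cohen's kappa: agreement beyond chance, scaled to its maximum.
    kappa = (po - pe) / (1 - pe)
    # Prevalence index: imbalance between the two agreement cells.
    prevalence_index = (a - d) / n
    # Bias index: asymmetry between the two disagreement cells.
    bias_index = (b - c) / n
    return {"po": po, "kappa": kappa,
            "prevalence": prevalence_index, "bias": bias_index}


# Hypothetical counts for 100 coded units.
stats = kappa_stats(a=40, b=9, c=6, d=45)
```

For these counts, percentage agreement is 0.85 while kappa is about 0.70; with a highly imbalanced table such as `kappa_stats(85, 5, 5, 5)`, agreement stays at 0.90 but kappa drops to about 0.44, which is the prevalence-driven paradox the abstract warns about, and which a large prevalence index makes visible.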
Key words: Content Analysis | Preparatory Information | Interrater Reliability | Cohen's kappa