The significance of participant experience when evaluating software inspection techniques
Software inspections have been used to improve software quality for 30 years. The Checklist-Based Reading strategy has traditionally been the most prevalent reading strategy. Increased object-oriented usage has raised questions regarding this technique's efficacy, given issues such as delocalisation. This study compared two inspection techniques, Use-Case Reading and Usage-Based Reading, with Checklist-Based Reading. Students and industry professionals were recruited to participate in the study. The effectiveness of each reading strategy was analysed, along with the effect experience had on inspection efficacy. The results showed no significant difference between inspection techniques, whether used by student or professional developers, but a significant difference was identified between student and professional developers in applying the different techniques. Qualitative results highlighted the differences in ability between industry professionals and students with respect to what each group considered important when inspecting and writing code. These results highlight the differences between students and industry professionals when applying inspections. Therefore, when selecting participants for empirical software engineering studies, participant experience level must be accounted for in the reporting of results.
| Main Authors: | McMeekin, David; von Konsky, Brian; Robey, Michael; Cooper, David |
|---|---|
| Other Authors: | Paul Strooper; David Carrington |
| Format: | Conference Paper |
| Published: | IEEE Computer Society, 2009 |
| Online Access: | http://hdl.handle.net/20.500.11937/39323 |