Evaluating institutional open access performance: Sensitivity analysis
In the article “Evaluating institutional open access performance: Methodology, challenges and assessment” we develop the first comprehensive and reproducible workflow that integrates multiple bibliographic data sources for evaluating institutional open access (OA) performance. The major data sources include Web of Science, Scopus, Microsoft Academic, and Unpaywall. However, each of these databases continues to update, both actively and retrospectively. This implies that the results produced by the proposed workflow are potentially sensitive to both the choice of data source and the version used. In addition, there remains the issue of selection bias arising from sample size and the associated margin of error. The current work shows that the sensitivity to these issues can be significant at the institutional level. Hence, transparent and clear documentation of the choices made about data sources (and their versions) and cut-off boundaries is vital for reproducibility and verifiability.
| Main Authors: | Huang, Karl; Neylon, Cameron; Hosking, Richard; Montgomery, Lucy; Wilson, Katie; Ozaygen, Alkim; Brookes-Kenworthy, Chloe |
|---|---|
| Format: | Journal Article |
| Published: | 2020 |
| Online Access: | http://hdl.handle.net/20.500.11937/85129 |
| DOI: | 10.1101/2020.03.19.998542 |
| Access: | Restricted |
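The abstract notes that institutional OA estimates are sensitive to sample size and the associated margin of error. As a minimal, hypothetical sketch (not code from the article), the snippet below computes the standard normal-approximation margin of error for an estimated OA proportion; the function name and example figures are illustrative assumptions only.

```python
import math

def oa_margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Normal-approximation margin of error for an estimated OA
    proportion p_hat based on a sample of n publications
    (95% confidence by default). Illustrative only; not the
    article's own code."""
    return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

# Hypothetical example: 35% estimated OA share from 400 sampled outputs.
print(round(oa_margin_of_error(0.35, 400), 3))  # ~0.047, i.e. +/- 4.7 points
```

Halving the sample to 200 outputs widens this interval to roughly ±6.6 percentage points, which illustrates how sample-size cut-off boundaries can move institution-level OA figures.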
| id | curtin-20.500.11937-85129 |
|---|---|
| building | Curtin Institutional Repository |
| collection | Online Access |
| institution | Curtin University Malaysia |
| institution_category | Local University |
| recordtype | eprints |
| repository_type | Digital Repository |