The work stream will develop scripts to qualify an R environment, covering the base installation and the additional R frameworks that exist for testing it; this does not include testing additional packages. The tests will initially comprise those that ship with R and its packages; the work stream will also develop a repository of tests to supplement them. Organisations could extend the tests if they find them insufficient and contribute their changes back to the project if their policies allow. This work stream will collaborate with the R Validation Hub and the R Consortium to gather requirements for R environments and best practices for qualifying R systems. The project will focus on R environments; however, the process could be extended to analysis environments for other languages, including Python and Julia. The framework could also be extended by other groups working with virtualised environments, such as Best Practices for Interactive Visualisations, or by the Clinical Statistical Reporting in a Multilingual World group to ensure that environments with multiple languages work as expected.
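As an illustration of the "tests that R and the packages come installed with", base R already ships a qualification mechanism in its `tools` package. The snippet below is a minimal sketch, not the work stream's script; the scope choices are illustrative assumptions and an organisation would select the coverage its policies require.

```r
## Minimal sketch: exercising the tests that ship with a base R
## installation, using only the 'tools' package from base R.
library(tools)

## Run the checks distributed with base R itself
## (the installed analogue of 'make check').
testInstalledBasic(scope = "basic")

## Run the tests shipped with the base and recommended packages.
## scope and types are illustrative choices, not mandated values.
testInstalledPackages(scope = "both", types = "tests")
```

Both functions write their results under the current working directory, so a qualification script would typically run them in a dedicated directory and archive the output as evidence.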
|Eli Miller (Atorus Research)|
|Paula Rowley (PHUSE Project Assistant)||firstname.lastname@example.org|
|Objectives & Deliverables||Timeline|
|CURRENT STATUS||Q1 2023|
The project is currently on hold due to low demand and is awaiting feedback from the R Validation Hub.
|David Dube||Sarepta Therapeutics|
|Drew Foglia||Veeva Systems|
|Kapil Anand||Parexel International|
|Matthew Travell||GW Pharmaceuticals|
|Tania Walton||Sycamore Informatics|
|Vineet Sharma||Sarepta Therapeutics|