
Project Scope

The work stream will develop scripts to qualify an R environment, covering the base installation and the additional framework needed for the base installation tests; testing of additional packages is out of scope. The tests will initially be those distributed with R and its packages, but the work stream will also develop a repository of tests to supplement them. Organisations could extend the tests if they find them insufficient and contribute their changes back to the project if their policies allow. This work stream will collaborate with the R Validation Hub and the R Consortium to gather requirements for R environments and best practices for qualifying R systems. The project will focus on R environments; however, the process could be extended to other analysis environments, including Python, Julia and other languages. The work done with the framework could be extended by other groups working with virtualised environments, such as Best Practices for Interactive Visualisations. It could also be extended by the Clinical Statistical Reporting in a multilingual world group to ensure environments with multiple languages work as expected.
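
To illustrate the starting point, base R already distributes its own installation tests in the tools package. A minimal sketch of wrapping them (assuming the R build was installed with its test files, which is not guaranteed on every system) could be:

# Minimal sketch: run the tests that ship with a base R installation.
# Assumes the installation includes its test files; the work stream's
# scripts would wrap calls like these and collect the results.
library(tools)

# Sanity tests for the base installation itself
testInstalledBasic(scope = "basic")

# Examples and tests bundled with the base and recommended packages
testInstalledPackages(scope = "both", types = c("examples", "tests"))

The supplementary test repository described above would then add checks that these built-in tests do not cover.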

Project Leads | Email
Eli Miller | eli.miller@atorusresearch.com

Project Members | Organisation
Eric Milliman | Biogen
Kelci Miclaus | Veeva Systems
Manolo Corte | Industry
Vineet Sharma | Serepta

Objectives & Deliverables | Timeline
Kick-off Meeting | Q2 2021
Working System Qualification Script | Q3 2021

CURRENT STATUS: Q2 2021

This is a new project and is actively seeking participation. If you are interested in joining this team, please email workinggroups@phuse.global.

Problem Statement 

Organisations have taken several approaches to qualifying an R environment, and many of these approaches do not capture the nuances of qualifying open-source software. Qualifying an analysis environment is a time-consuming and labour-intensive process, yet much of it could be automated. The groups that perform the environment qualification are often not the end users of the environment and may not know best practices for R systems. This framework could give information to both the users of the R environment and the IT managers who must ensure the system is qualified.

Problem Impact 

This work stream would result in a repository that an organisation could clone and use to test, qualify, and document the quality and characteristics of their R environments. This would improve the quality and efficiency of qualifying R environments and expand their use to containers and virtualised systems. The report generated from the framework would be extensible, flexible, and meet the requirements most QA groups look for. The report would require minimal set-up and would pull the required information from the environment itself.
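
As an illustration of pulling information from the environment itself, the details such a report typically needs are already available from base R. The snippet below is only a sketch; the output file name and list structure are illustrative and not part of the framework:

# Minimal sketch: gather environment details that a qualification report could include.
# All calls are base R; the output file name is illustrative only.
env_info <- list(
  r_version = R.version.string,        # exact R build in use
  platform  = R.version$platform,      # OS/architecture string
  lib_paths = .libPaths(),             # library trees visible to this session
  locale    = Sys.getlocale(),         # locale settings affecting sorting/formatting
  packages  = installed.packages()[, c("Package", "Version", "LibPath")]
)

# Persist a plain-text snapshot alongside the full session information
capture.output(str(env_info), sessionInfo(), file = "environment-snapshot.txt")

Because everything is read from the running session, the same snapshot can be taken on a workstation, in a container, or on a virtualised system.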
