
Naming Convention for Split RS Domain

In my company, questionnaires are split and have a split dataset name. 

The Eastern Cooperative Oncology Group (ECOG) performance status, which was initially mapped into QSEG, now needs to be mapped into RSEG, according to SDTM IG v3.3. However, there is also an RS domain for Disease Response forms, resulting in a compliance issue.

To solve this issue, I propose one of the following: use a split dataset name for Disease Response in RS (RSRS); keep RS as it is, because it is not a questionnaire; or keep RSEG and provide an explanation of the P21 issue in the cSDRG. What do you think? Do you have any recommendations?

PHUSE Team Response: 26 March 2024

Suggest either keeping everything in RS, including ECOG, or splitting RS into multiple datasets with different names (e.g. RS01, RSDR), avoiding a suffix that may resemble a standard domain name (e.g. RSEG may be confused with having a relationship to the EG domain). The assumption is that the split datasets add up to the original domain (FA, RS, QS). It is acceptable to treat the individual response scores as split datasets; however, in that case, the upper-level domain (e.g. RS) cannot be included in the data package.
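A minimal Python/pandas sketch of the splitting convention described above, assuming hypothetical records and RSCAT values: the split dataset names differ (e.g. RSDR for Disease Response, RS01 for ECOG), while the DOMAIN variable remains "RS" in every split.

```python
import pandas as pd

# Hypothetical combined RS data; RSCAT values are illustrative assumptions.
rs = pd.DataFrame({
    "STUDYID":  ["STUDY01"] * 4,
    "DOMAIN":   ["RS"] * 4,
    "USUBJID":  ["01-001", "01-001", "01-002", "01-002"],
    "RSTESTCD": ["OVRLRESP", "ECOG101", "OVRLRESP", "ECOG101"],
    "RSCAT":    ["DISEASE RESPONSE", "ECOG", "DISEASE RESPONSE", "ECOG"],
    "RSORRES":  ["PR", "1", "SD", "0"],
})

# Split by category into separate datasets; DOMAIN stays "RS" in each split,
# only the dataset name changes (e.g. rsdr.xpt, rs01.xpt).
splits = {
    "RSDR": rs[rs["RSCAT"] == "DISEASE RESPONSE"],
    "RS01": rs[rs["RSCAT"] == "ECOG"],
}

for name, data in splits.items():
    print(name, len(data), "records, DOMAIN =", data["DOMAIN"].unique())
```

As noted above, when the individual split datasets are submitted, the upper-level RS dataset itself would not be included in the data package.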


Can an SDTM domain that is in the SDTM IG v3.3 be used for a study that is using the SDTM IG v3.2 to map the study data? Will this domain be considered a custom domain and need to be documented in the cSDRG?

Similarly, some variables are defined in a TAUG but are not part of the SDTM IG version that is being used for a given study. Can such a variable be added to the parent domain in the tabulation domains? How should this addition be documented in the cSDRG?

PHUSE Team Response: 24 June 2022 

There is no harm in borrowing domains from a later SDTM IG version, or domains documented in a TAUG, and adding them as part of the study tabulation domains. Domains such as AG and CC have been included in the SDTM IG v3.3 and are part of the Alzheimer’s Disease TAUG (for AG) and the Cardiovascular TAUG (for CC). The sponsor may choose to include this information in the cSDRG to add clarification for a regulatory reviewer; it is not required by either CDISC or PHUSE.

There is also no harm in adding variables documented in a given TAUG but not yet existing as part of a parent domain in the SDTM IG version being used. The SDTM IG v3.3, for example, includes the FOCID variable as part of the OE domain. To avoid any Pinnacle 21 findings, however, it is recommended to add such variables to the supplemental qualifier of the domain.

Note that SDTM model differences should also be taken into consideration. The FOCID variable mentioned above, for example, should be added to the supplemental qualifier if the other variables recommended for the FO domain by the SDTM model associated with the SDTM IG v3.3 are not present. Additionally, the standardised controlled terminology and standardised external dictionaries linked to the SDTM IG version and model should also be taken into consideration when implementing such variables in the domains.
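A minimal sketch, assuming hypothetical OE records, of how a variable such as FOCID could be carried as a supplemental qualifier (SUPPOE) rather than as a parent-domain variable when the SDTM IG version in use does not include it; the IDVAR choice and QLABEL text are shown for illustration only.

```python
import pandas as pd

# Hypothetical OE records collected with FOCID, which is not a parent-domain
# variable in the SDTM IG version being used for the study.
oe = pd.DataFrame({
    "STUDYID":  ["STUDY01", "STUDY01"],
    "DOMAIN":   ["OE", "OE"],
    "USUBJID":  ["01-001", "01-001"],
    "OESEQ":    [1, 2],
    "OETESTCD": ["VACUITY", "VACUITY"],
    "FOCID":    ["OD", "OS"],   # would raise a P21 finding if left in OE
})

# Represent FOCID in the supplemental qualifier dataset SUPPOE, linked back
# to the parent record via IDVAR/IDVARVAL = OESEQ (QLABEL is illustrative).
suppoe = pd.DataFrame({
    "STUDYID":  oe["STUDYID"],
    "RDOMAIN":  "OE",
    "USUBJID":  oe["USUBJID"],
    "IDVAR":    "OESEQ",
    "IDVARVAL": oe["OESEQ"].astype(str),
    "QNAM":     "FOCID",
    "QLABEL":   "Focus of Study-Specific Interest",
    "QVAL":     oe["FOCID"],
    "QORIG":    "CRF",
})

oe = oe.drop(columns=["FOCID"])   # parent domain now matches the IG in use
print(suppoe)
```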

Missing Severity in the Pinnacle 21 Community Version 3.1

In general, we execute the Pinnacle 21 Community version and justify any outstanding issues present in the Pinnacle 21 report, along with their severity level, under section 4.1 of the Reviewers Guide. Previously, the Pinnacle 21 report included various severity levels, e.g. Error, Warning, Notice, Reject, but the Pinnacle 21 Community version 3.1 (FDA Engine version 1907.2) report only shows severity for rejection rules.

  1. Is it OK to leave the severity column blank in the Reviewers Guide (SDRG/ADRG)?
  2. The severity level used to be a major driver of our decision to document a particular issue. Does the FDA recommend any particular criteria, instead of the severity level, for a successful submission, e.g. particular FDA Business or Validator Rules that need to be fixed where possible?

PHUSE Team Response: 16 March 2021

The PHUSE SDTM/ADaM Implementation FAQ team reached out to the eData team at the FDA. Responses received from the eData team on 1 February 2021 are summarised below:

  1. It is fine to leave the severity column blank in the SDRG since, as you mentioned, only the severity for rejection rules is shown in the most current version of the Pinnacle 21 validation report.
  2. Currently, the FDA does not have particular criteria in line with severity level for a successful submission. Please correct all validation findings where possible and document any unfixed findings in the Reviewers Guide.

1) What are FDA Business Rules and Validator Rules?

PHUSE Team Response: 20 July 2018

a) FDA Business Rules

The FDA Business Rules document V1.3, published December 2017, states 'The FDA Business Rules describe the business requirement for regulatory review to help ensure that the study data is compliant, useful, and will support meaningful review and analysis.' For more information see Section 8 of the Technical Conformance Guide.

b) Validator Rules

The FDA Validator Rules document V1.2 published December 2017, states 'The rules used by the FDA study data validator to ensure data are standards compliant and support meaningful review and analysis. In addition, the document links the study data business rules to the study data validator rules.'

Please refer to the most recent version of these documents, which are available in the Business Rules section at the following web page.

2) What are CDISC SDTM Conformance Rules?

PHUSE Team Response: 20 July 2018

a) CDISC Study Data Tabulation Model: Conformance Rules User Guide, V1.0 published in December 2016, states 'The purpose of this guide is to document a standard, concise structure for identifying and classifying SDTM and SDTMIG text that may constitute a conformance rule definition. The structure for the rules, the Rules Metadata Model, and the conventions for its content are described in detail.' Additionally, a companion Microsoft Excel workbook, SDTMIGV3.2 Conformance Rules V1.0, was released simultaneously. For each rule, the workbook describes the RuleID, Class, Domain, Variable, Rule, and Condition along with the SDTMIG reference details, Programmable Flag and FDA Rule ID (V1.0).

Please refer to the most recent version of the SDTM Conformance Rules document here.
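One possible way to work with the companion workbook programmatically is sketched below in Python/pandas, separating rules flagged as programmable from those that require human review; the file name, sheet position and exact column headers are assumptions and should be checked against the downloaded workbook.

```python
import pandas as pd

# File name, sheet position and column headers are assumptions; adjust them
# to match the SDTMIG V3.2 Conformance Rules workbook as downloaded from CDISC.
rules = pd.read_excel("SDTMIG_V3.2_Conformance_Rules_V1.0.xlsx", sheet_name=0)

# Split rules flagged as implementable as automated checks from those that
# need manual assessment (column name assumed to be "Programmable Flag").
is_programmable = rules["Programmable Flag"].astype(str).str.upper().eq("Y")
programmable = rules[is_programmable]
manual_review = rules[~is_programmable]

print(len(programmable), "programmable rules;",
      len(manual_review), "rules needing manual assessment")
print(manual_review[["RuleID", "Class", "Domain", "Rule"]].head())
```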

b) CDISC ADaM Validation Checks V1.3, published in March 2015, states 'This document contains a list of requirements which may be used to validate datasets against a subset of these rules which are objective and unambiguously evaluable. The validation checks within this document are defined to be machine readable (i.e. programmable within computer software) and capable of being implemented by ADaM users. The validation checks within this document can be implemented with software to test rules defined within the ADaM Implementation Guide 1.0, Data Structure for Adverse Events (ADAE), and the ADaM Basic Data Structure for Time-to-Event Analyses.' Additionally, a companion Microsoft Excel workbook, ADaM-validation_checks_V1.3_final, was released simultaneously. For each rule, the ADaM workbook describes the following: Check Number, ADaM IG Section Number, Text from ADaM IG, ADaM Structure Group, Functional Group, ADaM Variable Group and Machine-Testable Failure Criteria.

Please refer to the most recent version of the ADaM Validation Checks document here.

3) How do the FDA Business Rules and Validator Rules differ from the CDISC SDTM Conformance rules and ADaM checks?

PHUSE Team Response: 20 July 2018

a) The CDISC conformance rules check for conformance to the CDISC standards; whereas the FDA business rules help to confirm that the study data are compliant, useful and support a meaningful review.

The FDA Validator Rules check whether the FDA business rules are met. Not every FDA Business Rule can be automated; checking some of them would need human involvement.

4) If I get no error messages from my CDISC conformance checks, are the SDTM and ADaM submission datasets CDISC-conformant?

PHUSE Team Response: 20 July 2018

a) It is important to understand that the absence of validation tool error messages does not ensure CDISC conformance. Some aspects of SDTM and ADaM conformance are not testable by computer.

Section 1 of the 'CDISC Study Data Tabulation Model: Conformance Rules User Guide V1.0' document states, 'Rules governed by this guidance are not assumed to be universally programmable, that is, capable of being implemented as automated checks.' Section 3 defines a rules metadata attribute 'Programmable' as 'Indicator that a rule may be able to be implemented as an automated check.' The 'Programmable Flag Comment' is defined as 'Supplemental explanatory text for rules where there is a condition or factor as to whether they are able to be programmed as an automated check. In most cases this text would indicate a specific dependency on data or metadata that cannot be assumed to be always present and available.' Of the 410 conformance rules defined in the document, 85 are dependent on additional data or metadata, including in some cases non-standard sponsor data and metadata. Some of these rules are not tested by common validation tools, yet they must still be followed for SDTM conformance.

Similarly, Section 2 of the 'CDISC ADaM Validation Checks V1.3' document states:

'The validation checks within this document can be implemented with software... The checks are meant to test the structure and certain standardised content of the ADaM data sets. These checks are not meant to define the whole spectrum of ADaM compliance including content and well defined metadata.

The following are examples of aspects of ADaM compliance that cannot be tested by a software program:

  • Within section 4.3.1 of the Implementation Guide the text says, 'Include all observed and derived rows for a given analysis parameter.'
  • Within section 4.6.1 of the Implementation Guide the text says, 'To identify population-specific analysed rows, use population-specific indicator variables.'
  • Many ADaM variables are conditionally required (required if a condition is true), but some conditions are not testable by a software program.
  • One of the key components of the ADaM is the inclusion of thorough and well defined metadata. The thoroughness and clarity of metadata cannot be tested by a machine-readable algorithm but is necessary to enable the traceability that ADaM requires.

While the examples above are rules that must be followed while implementing ADaM, they cannot be tested by a machine-readable algorithm. Instead, a complete assessment of compliance must be based on an understanding of the scope of the study data and the analyses which the datasets should support coupled with the published validation checks within this document and the general rules and principles published in the ADaM Implementation Guide.'

1) When Errors and Warnings remain in the validation report after running the validation tool, before submitting to regulatory agencies, how do companies document this?

2) Should each error and warning be documented, or does every unique error and warning have to be documented? How can the different errors and warnings produced in the report be handled?

3) How should messages with Reject severity be addressed?

PHUSE Team Response: 7 June 2017

1) In general, the outstanding errors and warnings should be documented in the Study Data Reviewers Guide (SDRG, as csdrg.pdf) for SDTM and the Analysis Data Reviewers Guide (ADRG, as adrg.pdf) for ADaM. See the reference section below.

2) It is the decision of the sponsor how to document the errors/warnings, but it is highly recommended to document the rationale for each failure. The level of documentation depends on the reviewer and the regulatory agency. It is recommended to document each unique SDTM error/warning within each domain in the Study Data Reviewers Guide, and each unique ADaM error/warning within each dataset in the Analysis Data Reviewers Guide, with as much detail as possible (one way to collect the unique findings from a Pinnacle 21 report is sketched after this list).

3) The intent of Reject severity is that the data must be FIXED in the submission. Please be proactive and speak to the regulatory agencies prior to a submission.
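For point 2) above, one possible way to collect each unique finding per domain from a Pinnacle 21 report before writing the Reviewers Guide explanations is sketched below in Python/pandas; the report file name, sheet name and column headers are assumptions based on a typical P21 Excel export and should be adjusted to the actual report layout.

```python
import pandas as pd

# File name, sheet name and column headers are assumptions; adjust them to
# the layout of the Pinnacle 21 report produced at your site.
issues = pd.read_excel("pinnacle21-report.xlsx", sheet_name="Issue Summary")

# One row per unique (dataset, rule, severity, message) combination with a
# record count, ready to be pasted into the SDRG/ADRG conformance-issues
# section and annotated with a rationale for each outstanding finding.
summary = (issues
           .groupby(["Dataset", "Rule ID", "Severity", "Message"], dropna=False)
           .size()
           .reset_index(name="Count")
           .sort_values(["Dataset", "Severity", "Rule ID"]))

summary["Explanation"] = ""   # to be completed by the study team
summary.to_csv("reviewers_guide_issue_summary.csv", index=False)
```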

Additional References:


Study Data Reviewers Guide and Analysis Data Reviewers Guide

What are the best ways to document errors/warnings caused by Controlled Terminology, e.g. when a non-extensible codelist has been extended or when an extensible codelist has been extended?

PHUSE Team Response: 7 June 2017

Please reference the FDA Technical Conformance Guide, section 6, on the maintenance of controlled terminology for US submissions. Please refer to the Validation Rules spreadsheet on the PMDA website for more information on non-extensible codelists. It is recommended to document these errors/warnings in the SDRG or ADRG.
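A minimal sketch, assuming a hypothetical dataset and a hand-coded codelist, of how submitted values could be checked against controlled terminology so that an extension of a non-extensible codelist is caught and documented before it appears as a validation error; in practice the published CDISC CT file for the version referenced in the define.xml would be read instead.

```python
import pandas as pd

# Example non-extensible codelist (No Yes Response); hand-coded here for
# illustration rather than read from the published CDISC CT file.
no_yes_response = {"N", "Y", "U", "NA"}
extensible = False

ae = pd.DataFrame({"USUBJID": ["01-001", "01-002", "01-003"],
                   "AESER":   ["N", "Y", "YES"]})   # "YES" is not a valid term

unexpected = sorted(set(ae["AESER"].dropna()) - no_yes_response)
if unexpected and not extensible:
    # These values will trigger CT findings: remap them to valid terms, or, if
    # the codelist were extensible, document the extension in the define.xml
    # and explain any resulting warnings in the SDRG/ADRG.
    print("Non-extensible codelist extended with:", unexpected)
```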

Additional References:

N/A

How do we document errors/warnings from the FDA or PMDA validation rules that are not part of the CDISC validation rules?

PHUSE Team Response: 7 June 2017

Please refer to the Validation Rules spreadsheets on the FDA and PMDA websites for more information. It is recommended to document errors/warnings specific to the regulatory authorities’ validation rules in the SDRG and ADRG. Please be proactive and speak to the reviewer and the regulatory agencies prior to a submission.

Additional References:

N/A

The validation software used by the FDA is very buggy. How do we distinguish false positive errors/warnings from real ones? Is there a numbered list of them, so that we can reference these false positives in the SDRG?
