Audit
This section explains the steps to be taken to go through each part of the SUNRISE Safety Assurance Framework (SAF), including references to SUNRISE deliverable sections for further details (indicated by “D”). These steps can guide both the auditors and the appliers of the SAF. The stepwise approach applied in this section caters to the work of certifiers and regulators, who typically work with a process format rather than a framework. In this context, “Auditors” are people who audit the application of the SUNRISE SAF (such as certifiers or regulators), whereas “Appliers” are people who apply the SUNRISE SAF to assess the safety of Cooperative Connected Automated Mobility (CCAM) systems (such as vehicle manufacturers or their suppliers).
Some important remarks related to the contents of this section:
- This section may be modified and will be expanded with additional steps, based on the contents of deliverables that were not yet available at the time of writing.
- Auditing of CCAM safety assurance procedures (like the SUNRISE SAF) may involve aspects beyond the steps described in this section, including (but not limited to) audits of the internal procedures for design, development, testing and overall safety management at CCAM technology developers (such as vehicle manufacturers and their suppliers). While these aspects may well be important, they are not treated here due to the scope limitations of the SUNRISE project.
Scenario
By following the steps indicated below, a user of the SUNRISE SAF can audit the contents of the Scenario block. These steps aim to audit the overall characteristics and properties of the scenario database and do not consider the actual content of individual test cases (which is part of the Execute block). Note that the Scenario sub-blocks (Create, Format and Store) are the responsibility of the scenario database owner and are therefore not further elaborated in the SUNRISE SAF. For the same reason, auditing steps are not specified for the individual sub-blocks; the steps below apply only to the overarching Scenario block.
- Check Standardization Compliance
- Verify that the scenario databases follow standardized formats for storing scenarios (D3.2 Section 7.1)
- Confirm alignment with data exchange and testing standards (D6.1 Section 2.6)
- Verify Source Documentation
- Check that scenarios in the database include source information (D3.2 Section 4.2.3, Requirement A.3)
- Validate that sources are properly documented with data or expert knowledge origin (D3.2 Section 4.2.3, 8.1.1)
- Evaluate Database Extension Capabilities
- Verify the database supports extensible parameter lists (D3.2 Section 5.5.2, Requirement H.2)
- Ensure the database can accommodate new parameters, for example when standards or protocols are updated (D3.2 Section 5.5.2, 8.2.4)
- Verify Database Search Functionality
- Confirm the database supports searching/querying using tags (D3.2 Section 4.2.2, Requirement A.2); a simple query and database-metrics sketch is given after this list
- Review Data Accuracy (D5.3 Section 4.2.2)
- Assess the correctness of data entered into the scenario database
- Cross-reference with known benchmarks or references
- Verify that scenarios reflect real-world conditions accurately
- Review Data Consistency (D5.3 Section 4.2.2)
- Check for uniformity across all scenarios in:
- Units of measurement
- Data formats
- Terminologies used
- Standardized naming conventions
- Check Data Freshness (D5.3 Section 4.2.2)
- Check how up-to-date the scenarios are
- Verify current relevance of the database content
- Check last update timestamps
- Check Number of Scenarios (D5.3 Section 4.2.2)
- Count total distinct scenarios available
- Obtain overview of database comprehensiveness
- Check scenario quantity metrics
- Check Covered Kilometres (D5.3 Section 4.2.2)
- Check how many kilometres of driving data the scenario database is based on
- Assess span and scale of included scenarios
- Quantify geographic coverage
- Verify Scenario Distribution (D5.3 Section 4.2.2)
- Analyse breakdown of scenarios by:
- Geographic regions
- Road types
- Weather conditions
- Time of day
- Traffic density
- Verify Scenario Complexity (D5.3 Section 4.2.2)
- Assess difficulty levels across scenarios considering:
- Number of vehicles involved
- Presence of pedestrians
- Road complexity
- Environmental conditions
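To illustrate how several of the checks above (tag-based querying, source documentation, data freshness, scenario count and distribution) could be supported in practice, the following Python sketch shows simple audit helpers over a hypothetical scenario database. The record structure, function names and the one-year freshness threshold are illustrative assumptions and are not prescribed by the SUNRISE SAF, D3.2 or D5.3.

```python
# Illustrative audit helpers for the Scenario block: tag-based querying (Requirement A.2),
# source documentation (Requirement A.3), data freshness, scenario count and distribution.
# Record structure, names and the one-year freshness threshold are assumptions for this sketch.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ScenarioRecord:
    scenario_id: str
    tags: set            # e.g. {"highway", "rain", "cut-in"}
    source: str          # data-driven or expert-knowledge origin
    last_updated: datetime
    region: str

def query_by_tags(database, required_tags):
    """Tag-based query: return all scenarios carrying every required tag."""
    return [s for s in database if required_tags <= s.tags]

def audit_scenario_database(database, max_age_days=365):
    """Collect simple database-level indicators used in the audit steps above."""
    now = datetime.now()
    return {
        "total_scenarios": len(database),
        "missing_source": [s.scenario_id for s in database if not s.source],
        "stale_scenarios": [s.scenario_id for s in database
                            if now - s.last_updated > timedelta(days=max_age_days)],
        "distribution_by_region": Counter(s.region for s in database),
    }
```

In practice, the resulting indicators would be compared against thresholds agreed between the auditor and the database owner.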
Environment
Query & Concretize
By following the steps indicated below, a user of the SUNRISE SAF can audit the contents of the Query & Concretize block, thereby ensuring that the resulting test cases are of high quality, relevant, and comprehensive for the intended testing purposes:
- Review the inputs to the COTSATO (COncretizing Test Scenarios and Associating Test Objectives) process (D3.2 Section 7.3)
- Verify that the Operational Design Domain (ODD) description is provided and follows the format guidelines in ISO 34503
- Check that the system requirements are clearly defined
- Ensure the SUT (System Under Test) is properly specified
- Confirm that variables to be measured during test execution are listed
- Validate that pass/fail criteria for successful test execution are provided
- Examine the query used to fetch scenarios from the scenario storage:
- Ensure the query is well-formulated and aligns with the requirements specified in D3.2 Section 7.4
- Verify the scenarios retrieved from the scenario storage:
- Check that the scenario concept complies with the requirements outlined in D3.2 Section 4
- Confirm that the scenario parameters meet the requirements outlined in D3.2 Section 5
- Validate that the parameter spaces adhere to the requirements in D3.2 Section 6
- Review the test cases generated by the COTSATO process (D3.2 Section 7.5)
- Ensure each test case includes a test scenario, metrics, validation criteria, and pass/fail criteria (a structural-check sketch is given after this list)
- Depending on the purpose, verify that the metrics cover aspects such as safety, functional performance, HMI (Human Machine Interface), operational performance, reliability, and scenario validation
- Check the mapping of requirements to test cases (D3.2 Section 7.5)
- Confirm that a clear mapping exists between system requirements and the generated test cases
- Review the metrics on the collection of test scenarios (D3.2 Section 7.5)
- Verify that information about representativity and source of test scenarios is provided
- Evaluate Individual Scenario Quality
- Check the testing purpose metrics to ensure scenarios are relevant for the intended testing (D5.3 Section 3.1.1)
- Assess scenario exposure and probability to verify if scenarios represent realistic situations (D5.3 Section 3.1.2)
- Verify scenario description quality including completeness and unambiguity (D5.3 Section 3.1.3)
- Validate scenario consistency in terms of semantic and format consistency (D5.3 Section 3.1.4)
- Check scenario processability to ensure scenarios can be executed in the intended test environments (D5.3 Section 3.1.5)
- Evaluate Multiple Scenario Quality
- Assess diversity and similarity between scenarios to avoid redundant testing (D5.3 Section 3.2.1)
- Verify scenario coverage to ensure comprehensive testing of the ODD (D5.3 Section 3.2.3)
- Check completeness of data across the scenario set (D5.3 Section 3.2.4)
- If Using Multiple Scenario Databases
- Use the Scenario Relatedness Index (SRI) described in D5.3 Section 5 to:
- Check for identification and filtering out of redundant scenarios
- Verify adequate coverage
- Verify test efficiency optimization
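As an illustration of the structural checks on COTSATO outputs described above (D3.2 Section 7.5), the following Python sketch verifies that each generated test case carries a scenario, metrics, validation criteria, pass/fail criteria and a mapping to system requirements. The TestCase fields and helper functions are assumptions made for this example only.

```python
# Illustrative structural checks on COTSATO outputs (D3.2 Section 7.5).
# The TestCase fields and helper functions below are assumptions for this example.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    test_case_id: str
    scenario_id: str                                      # concrete test scenario
    metrics: list = field(default_factory=list)           # e.g. safety, HMI, reliability
    validation_criteria: list = field(default_factory=list)
    pass_fail_criteria: list = field(default_factory=list)
    requirement_ids: list = field(default_factory=list)   # mapping to system requirements

def check_test_case_completeness(tc):
    """Return audit findings for one generated test case (empty list means complete)."""
    findings = []
    if not tc.scenario_id:
        findings.append("missing test scenario")
    if not tc.metrics:
        findings.append("no metrics associated")
    if not tc.validation_criteria:
        findings.append("no validation criteria defined")
    if not tc.pass_fail_criteria:
        findings.append("no pass/fail criteria defined")
    if not tc.requirement_ids:
        findings.append("no mapping to system requirements")
    return findings

def uncovered_requirements(test_cases, requirements):
    """System requirements that are not mapped to any generated test case."""
    covered = {r for tc in test_cases for r in tc.requirement_ids}
    return set(requirements) - covered
```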
Allocate
By following the steps indicated below, a user of the SUNRISE SAF can audit the contents of the Allocate block, thereby ensuring that the allocation process was carried out correctly and comprehensively according to the guidelines provided by the SAF.
- Review the comparison of test case requirements with test instance capabilities:
- Ensure that the structure outlined in D3.3 Section 3 was followed, which includes scenery elements, environment conditions, dynamic elements, and test criteria (D3.3 Section 3.3).
- Verify that the metrics described in D3.3 Section 4.3 were applied for the comparison.
- Check the decision-making process for test case allocation:
- Confirm that the process outlined in D3.3 Section 4.5 and D3.3 Figure 27 was followed.
- Verify that the “virtual simulation first” approach was applied as described in D3.3 Section 4.2.
- Examine the documentation of the allocation results:
- Review the allocation matrix or table as described in D3.3 Section 4.6 and exemplified in D3.3 Figure 28; a simplified allocation sketch is given after this list.
- Ensure that scenarios that could not be allocated or were not sufficiently tested are properly flagged and reported to the “Coverage” block of the “Analyse” part of the SAF.
- Verify the consideration of various metrics:
- Check that both functional and non-functional metrics were considered, as described in D3.3 Sections 4.3 and 4.4.
- Confirm that safety was prioritized in the decision-making process (D3.3 Section 4.5).
- Review the reallocation process:
- Ensure that the iterative allocation to higher-fidelity test instances, when necessary, was performed as described in D3.3 Section 4.5.
- Verify that the reasons for reallocation decisions were properly documented (D3.3 Section 4.6).
- Check for special circumstances:
- Review if any deviations from the general methodology were made due to special circumstances, and if so, ensure they were properly justified (D3.3 Section 4.5).
- Verify the completeness of documentation:
- Ensure that all steps of the decision-making process, including reasons for decisions, were documented and returned to the SAF (D3.3 Section 4.6).
- Check for the presence of a tree structure containing all metrics and results of the comparison to all test instances (D3.3 Section 4.6).
- Review the consideration of safety standards:
- Verify that safety standards such as SOTIF were considered in the allocation process, particularly for identifying potentially triggering conditions or functional insufficiencies of the System under Test (SUT) (D3.3 Section 4.5).
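The following Python sketch illustrates, in highly simplified form, the “virtual simulation first” capability comparison and the flagging of unallocatable test cases described above (D3.3 Sections 4.2–4.6). The capability model, the fixed fidelity ordering and the rationale strings are illustrative assumptions and do not reproduce the actual D3.3 metrics or decision tree.

```python
# Highly simplified illustration of "virtual simulation first" allocation and flagging
# of unallocatable test cases. Capability sets and rationale strings are assumptions.
from dataclasses import dataclass

# Test instances ordered by increasing fidelity ("virtual simulation first").
TEST_INSTANCE_CAPABILITIES = {
    "virtual_simulation": {"rain", "night", "vru", "highway", "urban"},
    "proving_ground":     {"rain", "vru", "highway"},
    "real_world":         {"highway", "urban"},
}

@dataclass
class AllocationResult:
    test_case_id: str
    allocated_to: str      # empty string means not allocatable
    rationale: str         # reasons for the decision, documented and returned to the SAF

def allocate(test_case_id, required_capabilities):
    """Allocate a test case to the first (lowest-fidelity) instance covering its needs."""
    for instance, capabilities in TEST_INSTANCE_CAPABILITIES.items():
        if required_capabilities <= capabilities:
            return AllocationResult(test_case_id, instance,
                                    f"all required capabilities covered by {instance}")
    return AllocationResult(test_case_id, "",
                            "no test instance covers the requirements; "
                            "flag and report to the Coverage block")
```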
Execute
By following the steps indicated below, a user of the SUNRISE SAF can audit the contents of the Execute block, thereby ensuring that the simulation framework used, whichever it is, meets the essential requirements and produces reliable, validated results. Note that while using the SUNRISE harmonized V&V simulation framework is recommended for better interoperability, modularity and scalability, it is not mandatory within the SUNRISE SAF.
- Verify that the simulation framework used aligns with the harmonized V&V simulation framework described in D4.4 Section 4.5, which includes:
- Checking that the base layer contains the four core interconnected subsystems:
- Subject Vehicle – Vehicle Dynamics
- Subject Vehicle – Sensors
- Subject Vehicle – Automated Driving (AD) function
- Environment
- Confirming the framework uses standardized interfaces between subsystems, particularly ASAM OSI as detailed in D4.4 Section 4.4
- Validate the data formats used align with recommended standards (D4.4 Section 4.3):
- ASAM OpenSCENARIO for scenario descriptions
- ASAM OpenDRIVE for road networks
- ASAM OpenLABEL for sensor data and scenario tagging
- Assess simulation model validation:
- Verify that correlation analysis between virtual simulation and physical tests was performed (D4.1 Section 3.6, D4.1 Section 1.1, D4.2 Section 8.1 – R1.1_14); a correlation-check sketch is given after this list
- Confirm robustness and representativeness of virtual validation framework (D4.2 Section 8.1 – R1.2_10)
- Check that model quality metrics meet defined thresholds (D4.1 Section 3.6)
- Review the simulation model validation test report (D4.1 Section 3.6, D4.3 Section 4.4)
- Evaluate validation metrics and Key Performance Indicators (KPIs):
- Review test case validation metrics (D4.1 Section 3.1)
- Verify that requirements from protocols, standards and regulations, for example Euro NCAP or the GSR, were used where applicable (D4.2 Section 8.1 – R1.1_01, R1.2_01, R3.1_01)
- Check that SOTIF requirements were addressed per D4.2 Section 4.1, including:
- Risk quantification for scenarios, triggering conditions and ODD boundaries (D4.2 Table 12 – R10.1.19, R10.1.20, R10.1.21)
- Validation results for known unsafe scenarios (D4.2 Table 12 – R10.1.20.1)
- Validation results for discovered unknown unsafe scenarios (D4.2 Table 12 – R10.1.18.1)
- Assessment of residual risk (D4.2 Table 12 – R10.1.6)
- Check test case execution results:
- Check that all executed test cases generated the desired results
- Confirm that test coverage metrics have been generated, for example:
- Check Euro NCAP and GSR compliance metrics (D4.2 Section 8.1 – R1.1_01, R1.2_01, R3.1_01)
- Verify sensor validation metrics were applied (D4.2 Section 8.1 – R1.1_02)
- Review correlation coefficients between simulation and physical test results (D4.2 Section 8.2 – R2.1_49)
- Confirm that test results include both virtual and physical validation data where applicable (D4.2 Section 8.1 – R1.1_14)
- Verify that executed simulations correspond to the requests from the scenario manager (D4.2 Section 8.1 – R1.1_25)
- Verify that test results are correctly documented and stored (D4.2 Section 8.1 – R1.1_27)
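As a minimal illustration of the correlation analysis between virtual simulation and physical tests referred to above, the following Python sketch computes a Pearson correlation coefficient between a simulated and a measured signal and compares it against an assumed acceptance threshold. The signal values and the 0.9 threshold are assumptions, not values taken from D4.1 or D4.2.

```python
# Minimal sketch of a correlation check between a simulated and a physically measured
# signal. Example values and the 0.9 acceptance threshold are assumptions.
import math

def pearson_correlation(sim, phys):
    """Pearson correlation coefficient between two equally long signals."""
    if len(sim) != len(phys) or len(sim) < 2:
        raise ValueError("signals must have equal length of at least 2 samples")
    n = len(sim)
    mean_s, mean_p = sum(sim) / n, sum(phys) / n
    cov = sum((s - mean_s) * (p - mean_p) for s, p in zip(sim, phys))
    std_s = math.sqrt(sum((s - mean_s) ** 2 for s in sim))
    std_p = math.sqrt(sum((p - mean_p) ** 2 for p in phys))
    return cov / (std_s * std_p)

# Example: longitudinal acceleration from simulation vs. proving-ground measurement.
sim_accel = [0.0, 0.4, 0.9, 1.3, 1.2, 0.8]
phys_accel = [0.1, 0.5, 0.8, 1.2, 1.1, 0.9]
r = pearson_correlation(sim_accel, phys_accel)
print(f"correlation coefficient r = {r:.3f}, meets assumed threshold: {r >= 0.9}")
```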
Analyse
Coverage
By following the steps indicated below, a user of the SUNRISE SAF can audit the contents of the Coverage block, thereby ensuring that the combined set of test case results sufficiently covers the ODD and the parameter value ranges:
- Verify Scenario Coverage (D5.3 Section 4.2.3):
- Check if the coverage analysis includes all four types of coverage metrics described by de Gelder et al. (D5.3 Reference [59]):
- Tag-based coverage: Verify that scenarios cover all relevant ODD aspects through specific tags (a simple coverage-check sketch is given after this list)
- Time-based coverage: Confirm all timestamps in driving data are represented
- Actor-based coverage: Ensure all relevant actors are included in at least one scenario
- Actor-over-time-based coverage: Check that relevant actors are included throughout their period of importance
- Verify Parameter Coverage (D5.3 Section 4.2.3):
- According to Laurent et al. (D5.3 Reference [60]), verify that:
- Parameters influencing ADS decision-making are adequately tested
- Multiple simulations with different parameter values have been run
- Where expected, changes in parameter values lead to statistically significant differences in outcomes
- Key metrics like path deviation and safety (minimum distance to objects) are considered
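As a simple illustration of a tag-based coverage check in the spirit of the coverage metrics referenced above (D5.3 Section 4.2.3), the following Python sketch computes the fraction of relevant ODD tags that are covered by at least one scenario and reports the uncovered tags. The ODD and scenario tag sets are assumed examples.

```python
# Simple illustration of a tag-based coverage check (in the spirit of D5.3 Section 4.2.3).
# The ODD and scenario tags below are assumed examples.
def tag_based_coverage(odd_tags, scenario_tag_sets):
    """Return the fraction of relevant ODD tags covered by at least one scenario,
    together with the set of uncovered tags."""
    covered = set().union(*scenario_tag_sets) & set(odd_tags) if scenario_tag_sets else set()
    uncovered = set(odd_tags) - covered
    coverage = len(covered) / len(odd_tags) if odd_tags else 1.0
    return coverage, uncovered

odd = {"highway", "urban", "rain", "night", "pedestrian_crossing"}
scenarios = [{"highway", "rain"}, {"urban", "night"}, {"highway", "night"}]
coverage, missing = tag_based_coverage(odd, scenarios)
print(f"tag-based coverage: {coverage:.0%}, uncovered ODD aspects: {missing}")
```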
Test Evaluate
Further inputs for this section will follow from deliverable D3.5, which was not yet finalized at the time of writing.
Decide
Further inputs for this section will follow from deliverable D2.3, which was not yet finalized at the time of writing.