SUNRISE Safety Assurance Framework

Execute


In the Execute block, test execution may take place in a virtual, hybrid or physical test environment, depending on the test instance resulting from the Allocate block. The allocated test cases form the input for this block. How the tests are carried out is not explicitly specified by the Safety Assurance Framework (SAF) and is the responsibility of the entity performing the tests. If the Allocate block has been applied correctly, the selected test instance is guaranteed to be capable of performing the tests. For virtual testing, the SAF recommends a harmonised approach: the SUNRISE project developed a harmonised V&V simulation framework that can be used for virtual validation of Cooperative Connected Automated Mobility (CCAM) systems, although its use is not mandatory.
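To make the hand-off from Allocate to Execute concrete, the sketch below dispatches allocated test cases to their assigned test environments. The names and structure here are illustrative only; the SAF does not prescribe any particular implementation:

```python
from dataclasses import dataclass
from enum import Enum

class TestEnvironment(Enum):
    """Test environments the Allocate block may assign (per the SAF)."""
    VIRTUAL = "virtual"
    HYBRID = "hybrid"
    PHYSICAL = "physical"

@dataclass
class TestCase:
    scenario_id: str
    environment: TestEnvironment  # assigned upstream by the Allocate block

def dispatch(test_cases):
    """Group allocated test cases by their target test environment,
    so each batch can be handed to the entity performing the tests."""
    batches = {env: [] for env in TestEnvironment}
    for tc in test_cases:
        batches[tc.environment].append(tc)
    return batches
```

Each batch can then be executed independently; only the virtual batch would go through a simulation framework such as the one described below.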

As shown in the figure below, the SUNRISE harmonised V&V simulation framework consists of a so-called base layer comprising four interconnected subsystems, namely:

  • Subject Vehicle – Sensors (the sensors installed in the vehicle)
  • Subject Vehicle – Automated Driving (AD) function (the behavioural competencies of the vehicle)
  • Subject Vehicle – Vehicle Dynamics
  • Environment (in which the vehicle operates)

In this approach, the base layer is the core element that can be harmonised, because these four subsystems are essential for every simulation; this is what makes it possible to use standardised interfaces between them. The framework can be extended by the user in four dedicated dimensions related to the target Operational Design Domain (ODD), the vehicle Sensor set-up, the Software architecture and the Hardware architecture.
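As a minimal sketch of the base-layer structure, the following code wires the four subsystems into a closed simulation loop: the Environment provides ground truth, the Sensors perceive it, the AD function decides, and the Vehicle Dynamics update the Environment. All interfaces and behaviours here are invented for illustration; in the actual framework the subsystems would exchange standardised ASAM OSI messages:

```python
class Environment:
    """World state: subject vehicle position and a static obstacle."""
    def __init__(self):
        self.vehicle_position = 0.0
    def ground_truth(self):
        # In the real framework this would be an ASAM OSI GroundTruth message.
        return {"position": self.vehicle_position, "obstacle_at": 50.0}
    def update(self, new_position):
        self.vehicle_position = new_position

class Sensors:
    """Subject Vehicle - Sensors: derives sensor data from ground truth."""
    def detect(self, ground_truth):
        return {"distance_to_obstacle":
                ground_truth["obstacle_at"] - ground_truth["position"]}

class ADFunction:
    """Subject Vehicle - AD function: toy behaviour, brake near obstacles."""
    def decide(self, sensor_data):
        stop = sensor_data["distance_to_obstacle"] < 10.0
        return {"speed": 0.0 if stop else 5.0}

class VehicleDynamics:
    """Subject Vehicle - Vehicle Dynamics: integrate the speed command."""
    def step(self, position, command, dt=0.1):
        return position + command["speed"] * dt

def simulate(steps=120):
    env, sensors, ad, dyn = Environment(), Sensors(), ADFunction(), VehicleDynamics()
    for _ in range(steps):
        gt = env.ground_truth()
        cmd = ad.decide(sensors.detect(gt))
        env.update(dyn.step(gt["position"], cmd))
    return env.vehicle_position
```

Because each subsystem talks to the others only through these narrow interfaces, any one of them can be swapped out (a different sensor model, AD stack, or dynamics model) without touching the rest, which is exactly what the standardised base-layer interfaces enable.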

 

Audit instructions for 'Execute'

By following the steps below, a user of the SUNRISE SAF can audit the contents of the Execute block, ensuring that whichever simulation framework was used meets the essential requirements and produces reliable, validated results. Note that while the SUNRISE harmonised V&V simulation framework is recommended for better interoperability, modularity and scalability, its use is not mandatory within the SUNRISE SAF.

  1. Verify that the simulation framework used aligns with the harmonised V&V simulation framework described in D4.4 Section 4.5, which includes:
    1. Checking that the base layer contains the four core interconnected subsystems:
      • Subject Vehicle – Vehicle Dynamics
      • Subject Vehicle – Sensors
      • Subject Vehicle – Automated Driving (AD) function
      • Environment
    2. Confirming the framework uses standardized interfaces between subsystems, particularly ASAM OSI as detailed in D4.4 Section 4.4
  2. Validate the data formats used align with recommended standards (D4.4 Section 4.3):
    1. ASAM OpenSCENARIO for scenario descriptions
    2. ASAM OpenDRIVE for road networks
    3. ASAM OpenLABEL for sensor data and scenario tagging
  3. Assess simulation model validation:
    1. Verify that correlation analysis between virtual simulation and physical tests was performed (D4.1 Section 3.6, D4.1 Section 1.1, D4.2 Section 8.1 – R1.1_14)
    2. Confirm robustness and representativeness of virtual validation framework (D4.2 Section 8.1 – R1.2_10)
    3. Check that model quality metrics meet defined thresholds (D4.1 Section 3.6)
    4. Review the simulation model validation test report (D4.1 Section 3.6, D4.3 Section 4.4)
  4. Evaluate validation metrics and Key Performance Indicators (KPIs):
    1. Review test case validation metrics (D4.1 Section 3.1)
    2. Verify that requirements from protocols, standards and regulations were used where applicable, for example from Euro NCAP or the General Safety Regulation (GSR) (D4.2 Section 8.1 – R1.1_01, R1.2_01, R3.1_01)
    3. Check that SOTIF requirements were addressed per D4.2 Section 4.1, including:
      • Risk quantification for scenarios, triggering conditions and ODD boundaries (D4.2 Table 12 – R10.1.19, R10.1.20, R10.1.21)
      • Validation results for known unsafe scenarios (D4.2 Table 12 – R10.1.20.1)
      • Validation results for discovered unknown unsafe scenarios (D4.2 Table 12 – R10.1.18.1)
      • Assessment of residual risk (D4.2 Table 12 – R10.1.6)
  5. Check test case execution results:
    1. Check that all executed test cases generated the desired results.
    2. Confirm test coverage metrics have been generated. For example:
      • Check Euro NCAP and GSR compliance metrics (D4.2 Section 8.1 – R1.1_01, R1.2_01, R3.1_01)
      • Verify sensor validation metrics were applied (D4.2 Section 8.1 – R1.1_02)
      • Review correlation coefficients between simulation and physical test results (D4.2 Section 8.2 – R2.1_49)
    3. Confirm that test results include both virtual and physical validation data where applicable (D4.2 Section 8.1 – R1.1_14)
    4. Verify that executed simulations correspond to the requests from the scenario manager (D4.2 Section 8.1 – R1.1_25)
    5. Verify that test results are correctly documented and stored (D4.2 Section 8.1 – R1.1_27)
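Parts of such an audit lend themselves to automation. The sketch below checks two of the items above: that artifacts use the recommended ASAM data formats (step 2), and that the simulation-to-physical correlation meets an acceptance threshold (steps 3 and 5). The file-name conventions and the 0.9 threshold are assumptions made for illustration, not values prescribed by the SAF:

```python
# Hypothetical file-extension conventions for the recommended ASAM formats;
# the SAF recommends the standards, not specific file names.
EXPECTED_EXTENSIONS = {
    "scenario": ".xosc",       # ASAM OpenSCENARIO
    "road_network": ".xodr",   # ASAM OpenDRIVE
    "labels": ".json",         # ASAM OpenLABEL (JSON-based)
}

MIN_CORRELATION = 0.9  # assumed acceptance threshold, not a SAF value

def audit_artifacts(artifacts, correlation):
    """Return a list of audit findings (empty list means no issues found).

    artifacts   -- mapping of artifact kind to its file path
    correlation -- correlation coefficient between simulation and
                   physical test results (cf. D4.2 Section 8.2 - R2.1_49)
    """
    findings = []
    for kind, path in artifacts.items():
        expected = EXPECTED_EXTENSIONS.get(kind)
        if expected and not path.endswith(expected):
            findings.append(f"{kind}: expected a '{expected}' file, got '{path}'")
    if correlation < MIN_CORRELATION:
        findings.append(
            f"correlation {correlation:.2f} below threshold {MIN_CORRELATION}")
    return findings
```

An auditor would still review the findings manually; the automated checks only flag obvious deviations from the recommended formats and thresholds.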