Query & Concretise
The Query & Concretise block takes input from the Input block, which contains the Operational Design Domain (ODD), behaviour, external requirements, and test objectives. It then passes these requirements to the SUNRISE Data Framework (DF) as a query and retrieves scenarios from the individual scenario databases. The scenarios returned from the SUNRISE DF to the Query & Concretise block can be at either the logical or the concrete scenario level. In a logical scenario, all parameters are defined using value ranges, allowing an infinite number of concrete scenarios to be derived from a single logical scenario. The next step is to concretise these parameter ranges into specific values and to combine the resulting concrete scenarios with the test objectives. Once combined, the scenarios are allocated within an execution environment.
The text below describes the workflow of the Query & Concretise block.
- Since scenario creation and formatting occur in the individual Scenario Databases (SCDBs), the Input block information (test objectives, ODD and behaviour requirements, and external requirements) is fed directly into the Query & Concretise block.
- These external requirements are then used to query individual SCDBs via the SUNRISE DF, which returns logical or concrete scenarios. It is at this step that scenarios hosted within SCDBs become test scenarios, because they become associated with the intended testing purposes upon retrieval from the scenario databases.
- If the returned scenarios are at the logical level (i.e., parameters are described as ranges):
- the first function of the block creates concrete scenarios with specific parameter values;
- the second function of the block then combines each concrete test scenario with the test objectives.
- If the returned scenarios are already concrete, this block skips the concrete-scenario creation step and only combines them with the test objectives.
- The block will send the concrete test scenarios with their test objectives to the Allocate block for test execution.
- During or after execution, the Analyse block feeds the test outcome back to this block, based on which the next set of concrete parameter combinations can be created. An example of such a process is incorporating an optimisation algorithm to pursue a testing objective, for example, exploring the failure points within a logical scenario.
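The workflow above can be sketched as follows. This is a minimal illustration, assuming hypothetical names (`concretise`, `query_and_concretise`, the parameter dictionaries, and the cut-in scenario); the actual SUNRISE DF interfaces are defined in the project deliverables.

```python
import random

def concretise(logical_params, rng):
    """Sample one concrete value from each logical parameter range."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in logical_params.items()}

def query_and_concretise(logical_params, test_objectives, n_scenarios, seed=0):
    """Turn one logical scenario into n concrete test scenarios with objectives."""
    rng = random.Random(seed)
    return [
        {"parameters": concretise(logical_params, rng),
         "objectives": list(test_objectives)}
        for _ in range(n_scenarios)
    ]

# Hypothetical logical cut-in scenario with two parameter ranges
logical = {"ego_speed_kph": (30.0, 70.0), "cut_in_gap_m": (5.0, 25.0)}
test_cases = query_and_concretise(logical, ["min_TTC >= 1.5 s"], n_scenarios=3)
```

In a closed-loop setup, the Analyse block's feedback would steer how `concretise` picks the next parameter values instead of sampling uniformly at random.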
Logical or concrete scenarios can be retrieved from databases connected to the SUNRISE Data Framework through queries. These queries are constructed using tags recorded in the OpenLABEL format, which adheres to a harmonised ontology developed within the SUNRISE project. This approach ensures a unified understanding of all elements and their interrelationships across the connected databases.
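The tag-based retrieval can be illustrated with a minimal sketch. The records, tag names, and the `query_by_tags` helper below are hypothetical, not actual SUNRISE data or DF interfaces; in practice the tags would be recorded in OpenLABEL against the harmonised ontology.

```python
# Hypothetical scenario records, each carrying a set of ontology tags
scenario_db = [
    {"id": "sc-001", "tags": {"highway", "cut-in", "rain"}},
    {"id": "sc-002", "tags": {"urban", "pedestrian-crossing"}},
    {"id": "sc-003", "tags": {"highway", "cut-in", "dry"}},
]

def query_by_tags(db, required_tags):
    """Return every scenario whose tag set contains all required tags."""
    required = set(required_tags)
    return [s for s in db if required <= s["tags"]]

matches = query_by_tags(scenario_db, {"highway", "cut-in"})
# matches: sc-001 and sc-003
```

Because all connected databases tag against the same ontology, one query formulated this way yields consistent results across them.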
To derive concrete scenarios from the logical ones for testing purposes, several sampling methodologies have been developed. These methodologies facilitate the discretisation of the continuous parameter space and enable the selection of specific samples (concrete scenarios) within this space. These samples are chosen to estimate the distribution of a safety measure across the parameter space. Alternatively, these methodologies can be applied to optimise for other testing objectives, such as identifying the pass/fail boundary within the parameter space or identifying parameter subspaces of interest.
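One simple sampling methodology can be sketched as follows: discretise each continuous parameter range into evenly spaced levels, then draw concrete scenarios from the resulting grid. The function and parameter names are illustrative assumptions; the project deliverables describe more advanced samplers (e.g. optimisation-based search for failure points), for which this grid-and-sample approach only stands in.

```python
import itertools
import random

def discretise(ranges, levels):
    """Map each (lo, hi) range to `levels` evenly spaced values."""
    return {
        name: [lo + i * (hi - lo) / (levels - 1) for i in range(levels)]
        for name, (lo, hi) in ranges.items()
    }

def sample_concrete_scenarios(ranges, levels, n_samples, seed=0):
    """Pick n distinct grid points as concrete scenarios."""
    grid = discretise(ranges, levels)
    names = list(grid)
    points = list(itertools.product(*(grid[n] for n in names)))
    chosen = random.Random(seed).sample(points, n_samples)
    return [dict(zip(names, p)) for p in chosen]

# Hypothetical logical scenario with two continuous parameter ranges
ranges = {"ego_speed_kph": (30.0, 70.0), "cut_in_gap_m": (5.0, 25.0)}
concrete = sample_concrete_scenarios(ranges, levels=5, n_samples=4)
```

A sampler aimed at estimating a safety-measure distribution or locating a pass/fail boundary would replace the uniform draw with an adaptive selection rule.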
Audit instructions for 'Query & Concretise'
- Review the inputs to the COTSATO (COncretizing Test Scenarios and Associating Test Objectives) process (D3.2 Section 7.3)
- Verify that the Operational Design Domain (ODD) description is provided and follows the format guidelines in ISO 34503
- Check that the system requirements are clearly defined
- Ensure the SUT (System Under Test) is properly specified
- Confirm that variables to be measured during test execution are listed
- Validate that pass/fail criteria for successful test execution are provided
- Examine the query used to fetch scenarios from the scenario storage:
- Ensure the query is well-formulated and aligns with the requirements specified in D3.2 Section 7.4
- Verify the scenarios retrieved from the scenario storage:
- Check that the scenario concept complies with the requirements outlined in D3.2 Section 4
- Confirm that the scenario parameters meet the requirements outlined in D3.2 Section 5
- Validate that the parameter spaces adhere to the requirements in D3.2 Section 6
- Review the test cases generated by the COTSATO process (D3.2 Section 7.5)
- Ensure each test case includes a test scenario, metrics, validation criteria, and pass/fail criteria
- Depending on the purpose, verify that the metrics cover aspects such as safety, functional performance, HMI (Human Machine Interface), operational performance, reliability, and scenario validation
- Check the mapping of requirements to test cases (D3.2 Section 7.5)
- Confirm that a clear mapping exists between system requirements and the generated test cases
- Review the metrics on the collection of test scenarios (D3.2 Section 7.5)
- Verify that information about representativity and source of test scenarios is provided
- Evaluate Individual Scenario Quality
- Check the testing purpose metrics to ensure scenarios are relevant for the intended testing (D5.3 Section 3.1.1)
- Assess scenario exposure and probability to verify if scenarios represent realistic situations (D5.3 Section 3.1.2)
- Verify scenario description quality including completeness and unambiguity (D5.3 Section 3.1.3)
- Validate scenario consistency in terms of semantic and format consistency (D5.3 Section 3.1.4)
- Check scenario processability to ensure scenarios can be executed in the intended test environments (D5.3 Section 3.1.5)
- Evaluate Multiple Scenario Quality
- Assess diversity and similarity between scenarios to avoid redundant testing (D5.3 Section 3.2.1)
- Verify scenario coverage to ensure comprehensive testing of the ODD (D5.3 Section 3.2.3)
- Check completeness of data across the scenario set (D5.3 Section 3.2.4)
- If Using Multiple Scenario Databases
- Use the Scenario Relatedness Index (SRI) described in D5.3 Section 5 to:
- Check for identification and filtering out of redundant scenarios
- Verify adequate coverage
- Verify test efficiency optimization
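The redundancy checks above can be partly automated. The sketch below is NOT the Scenario Relatedness Index from D5.3 (whose definition is not reproduced here); it only illustrates, with assumed names and data, the kind of duplicate detection an auditor might run across scenarios drawn from multiple databases: compute pairwise distances in normalised parameter space and flag near-zero distances as redundant.

```python
import math

def normalise(scenario, ranges):
    """Scale each parameter into [0, 1] using its logical range."""
    return [(scenario[n] - lo) / (hi - lo) for n, (lo, hi) in ranges.items()]

def min_pairwise_distance(scenarios, ranges):
    """Smallest distance between any two scenarios; 0.0 means exact duplicates."""
    pts = [normalise(s, ranges) for s in scenarios]
    return min(
        math.dist(pts[i], pts[j])
        for i in range(len(pts))
        for j in range(i + 1, len(pts))
    )

# Hypothetical concrete scenarios pooled from two databases
ranges = {"ego_speed_kph": (30.0, 70.0), "cut_in_gap_m": (5.0, 25.0)}
scenarios = [
    {"ego_speed_kph": 50.0, "cut_in_gap_m": 10.0},
    {"ego_speed_kph": 50.0, "cut_in_gap_m": 10.0},  # redundant duplicate
    {"ego_speed_kph": 65.0, "cut_in_gap_m": 20.0},
]
redundant = min_pairwise_distance(scenarios, ranges) == 0.0
```

In an audit, a threshold slightly above zero would also catch near-duplicates that differ only by negligible parameter variations.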