L4 Vendor System Interface (VSI)
Overview (VSI)
This document outlines the testing requirements and strategy for the Vendor System Interfaces (VSI) in the RDK framework.
Implementing the Testing Suite: A Proposed Approach
This document outlines a proposed approach for implementing a comprehensive testing suite, leveraging the raft
framework and adhering to the RRDI (Research, Review, Design, Implement) methodology.
Current Status: This plan is a work in progress and will be refined as more information becomes available.
Target Environment Assumptions:
- Base Image: The testing suite can assume that the target device has been programmed with a valid vendor image.
- Driver Installation: All necessary drivers are fully installed and operational on the target device.
- Clean Slate: No middleware or application layer exists on the target device. This allows for testing the vendor layer in isolation and ensures that tests are not influenced by pre-existing software components.
Proposed Steps:
1. Research and Review:
- Identify and Evaluate: Conduct thorough research to identify suitable open-source testing suites that align with the module's testing requirements. Consider factors like:
- Test Coverage: Does the suite cover the necessary protocols, functionalities, and edge cases?
- Maturity and Support: Is the suite actively maintained with a strong community or support channels?
- Licensing: Is the licensing compatible with the project?
- Integration: How easily can the suite be integrated with the raft framework and the target environment?
- Deep Dive: Review the selected testing suite's documentation and codebase to understand its capabilities, limitations, and potential integration challenges.
- Best Practices: Refer to the RRDI guidelines provided in the https://github.com/rdkcentral/ut-core/wiki/3.2.-Standards:-Requirements-for-building-testing-suites document for best practices.
2. Design:
- Test Strategy: Based on the research and review findings, design a comprehensive testing strategy, including:
- Test Cases: Define specific test cases to be implemented, prioritizing "big ticket" checks for core functionality.
- Test Data: Outline how test data will be generated and managed. Consider using pre-defined datasets or dynamic generation techniques.
- Validation Classes: Design L4-wide validation classes to abstract the validation mechanisms, allowing for phased automation (human-assisted initially, progressing to fully automated); a sketch of such a class follows the Proposed Steps.
- raft Integration: Detail how the testing suite will be integrated with the raft framework for test execution, result collection, and reporting.
- Phased Automation: Incorporate a plan for the gradual transition from human-assisted validation to automated checks within the validation classes.
3. Implementation:
- Leverage raft: Utilize the raft framework throughout the implementation process (a minimal sketch of this flow follows the Proposed Steps):
- Download: Download a specific version of the chosen open-source testing suite using raft.
- Build: Build the testing suite using the toolchain provided by the sc docker environment, ensuring compatibility with the target platform.
- Deploy: Copy the built testing suite to the target device/environment using raft.
- Orchestrate: Utilize raft to orchestrate the execution of the test suite on the target, including setup, execution, and teardown.
- Remote Execution: Enable the capability to download and execute the test suite on a running device/environment using raft.
- Result Collation: Utilize raft to collect and collate the test results for analysis and reporting.
- Debugging Support: Ensure the implementation allows for easy debugging by enabling single-stepping through raft scripts and providing seamless access to the target device for engineers.
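The download/build/deploy/execute/collect flow above could look roughly like the following sketch. It deliberately uses plain Python and standard tools (git, docker, ssh/scp) rather than raft's actual API, and every URL, container image, path, and device address is a placeholder.

```python
#!/usr/bin/env python3
"""Illustrative download/build/deploy/run/collect flow.

This is NOT raft's API -- it is plain Python showing the sequence of steps
the raft integration is expected to automate. The suite URL, container
image, device address and paths below are all placeholders.
"""
import os
import subprocess

SUITE_REPO = "https://example.com/open-source-suite.git"  # placeholder
SUITE_TAG = "v1.2.3"                                      # pinned suite version
TARGET = "root@target-device"                             # placeholder DUT address
RESULTS_DIR = "results"

def run(cmd: list[str]) -> None:
    """Run one step and fail loudly so broken steps are easy to debug."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

os.makedirs(RESULTS_DIR, exist_ok=True)

# 1. Download a specific, reproducible version of the suite.
run(["git", "clone", "--branch", SUITE_TAG, "--depth", "1", SUITE_REPO, "suite"])

# 2. Build inside the vendor toolchain container (hypothetical image name).
run(["docker", "run", "--rm", "-v", f"{os.getcwd()}:/work",
     "sc-docker-toolchain", "make", "-C", "/work/suite"])

# 3. Deploy the built artefacts to the target device.
run(["scp", "-r", "suite/build", f"{TARGET}:/opt/testsuite"])

# 4. Orchestrate execution on the target: setup, run, teardown.
run(["ssh", TARGET, "/opt/testsuite/run_all.sh --output /tmp/results.xml"])

# 5. Collate results back to the host for analysis and reporting.
run(["scp", f"{TARGET}:/tmp/results.xml", RESULTS_DIR])
```

In the real integration these steps would be driven by raft and the platform input profile rather than hard-coded values.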
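The validation classes described in the Design step could be abstracted along these lines. Class and method names here are illustrative only and are not part of raft or any existing RDK module; the point is that a test case depends only on the common interface, so a check can be promoted from human-assisted to automated without rewriting the test.

```python
from abc import ABC, abstractmethod

class ValidationBase(ABC):
    """Common interface for L4 validation checks.

    Concrete checks start life as human-assisted validations and can later
    be swapped for fully automated ones without changing the test cases.
    """

    @abstractmethod
    def validate(self, description: str) -> bool:
        """Return True if the observed behaviour is acceptable."""

class ManualValidation(ValidationBase):
    """Phase 1: an engineer confirms the result at the console."""
    def validate(self, description: str) -> bool:
        answer = input(f"{description} -- pass? [y/N]: ")
        return answer.strip().lower() == "y"

class AutomatedValidation(ValidationBase):
    """Phase 2: the same check answered by a callable, no human needed."""
    def __init__(self, probe):
        self._probe = probe  # e.g. a function that inspects the device

    def validate(self, description: str) -> bool:
        return bool(self._probe())

# A test case only depends on the ValidationBase interface, so promoting a
# check from manual to automated is a one-line change in the test setup.
def test_wifi_scan(validator: ValidationBase) -> bool:
    return validator.validate("wpa_supplicant scan returned at least one BSS")
```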
Key Considerations:
- Module Requirements: Clearly define the testing requirements and goals for the module under test. This will guide the selection of the testing suite and the design of specific test cases.
- raft Integration: Ensure seamless integration with the raft framework throughout the entire testing process, from downloading and building the suite to executing tests and collecting results.
- Target Environment: Consider the specific characteristics of the target environment (e.g., hardware limitations, operating system) when selecting and building the testing suite. These will be driven by platform-specific input profiles fed into raft and the build process (a hypothetical profile fragment follows this list).
- Scalability and Maintainability: Design the testing suite and its integration with raft for scalability and maintainability, allowing for easy expansion and updates in the future.
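A platform-specific input profile could take a shape similar to the fragment below. Every field name and value is hypothetical; the actual schema is defined by raft and the build system, not by this sketch.

```python
# Hypothetical platform input profile; field names and values are illustrative only.
PLATFORM_PROFILE = {
    "platform": "vendor-soc-x",            # placeholder platform identifier
    "toolchain_image": "sc-docker-toolchain",
    "target": {
        "address": "target-device.local",  # placeholder device address
        "user": "root",
        "deploy_path": "/opt/testsuite",
    },
    "capabilities": {                      # drives which suites are built and run
        "bluetooth": True,
        "wifi": True,
        "opengles_version": "3.2",
    },
    "limits": {
        "max_test_runtime_s": 3600,
        "min_free_storage_mb": 256,
    },
}
```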
By following this approach and incorporating the principles outlined in the previous document, we can create a robust and efficient testing suite that effectively validates the functionality and stability of the stack.
Key Interfaces
Here's the list of main modules that require dedicated testing:
- Bluetooth (Bluez)
- Requirements Definition: Clearly define the requirements for Bluetooth functionality within RDK.
- Collaboration: Architecture experts need to review and confirm these requirements.
- WiFi (wpa-supplicant)
- API Testing: Utilize an open-source testing suite to conduct comprehensive API testing of wpa-supplicant (see the sketch at the end of this document).
- A clear set of requirements needs to be defined.
- OpenGLES / EGL
- Compliance and Performance:
- Gather Vendor Test Data: Obtain a detailed description of the testing performed by the SoC vendor, including compliance tests and performance benchmarks. Request evidence (test results, reports) to support these claims.
- Performance Benchmarking: Establish performance benchmarks using OpenGLES benchmarking tools (e.g., glbenchmark).
- Cross-Platform Comparison: It must be possible to run the same benchmarks across all supported platforms to establish a baseline and to ensure that new vendor deliveries meet the minimum performance requirements. The benchmark output will therefore be produced in a common format that allows per-platform comparison (see the comparison sketch at the end of this document).
- Kernel Testing
- Kernel Configuration Requirements: Define specific kernel configuration requirements for RDK in collaboration with the vendor team. These requirements will guide the selection of appropriate validation testing suites from LFS (see the configuration-check sketch at the end of this document).
- LFS Testing System: Leverage the Linux Foundation System (LFS) testing infrastructure for kernel-level testing.
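For the WiFi (wpa-supplicant) item above, a first big-ticket API check could simply confirm that the daemon answers over its standard wpa_cli front end, as sketched below. The interface name wlan0 and the pass criteria are assumptions about the target device.

```python
import subprocess

def wpa_cli(interface: str, *args: str) -> str:
    """Query wpa_supplicant through its standard wpa_cli front end."""
    out = subprocess.run(["wpa_cli", "-i", interface, *args],
                         capture_output=True, text=True, check=True)
    return out.stdout

def test_supplicant_responds(interface: str = "wlan0") -> None:
    """Big-ticket check: the daemon is up and reports a sane state."""
    status = wpa_cli(interface, "status")
    fields = dict(line.split("=", 1) for line in status.splitlines() if "=" in line)
    # wpa_state is present in a healthy status reply.
    assert "wpa_state" in fields, "wpa_supplicant did not report a state"
    assert fields["wpa_state"] != "INTERFACE_DISABLED"

if __name__ == "__main__":
    test_supplicant_responds()
    print("wpa_supplicant status check passed")
```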
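For the OpenGLES / EGL cross-platform comparison, benchmark output could be normalised into one small record per platform and checked against baseline minimums, for example as below. The file layout, metric names, and baseline values are assumptions, not an agreed format.

```python
import json
from pathlib import Path

# Hypothetical normalised benchmark record, one JSON file per platform, e.g.
#   results/vendor-soc-x.json -> {"platform": "vendor-soc-x",
#                                 "fill_rate_mtexels": 1450.0,
#                                 "triangle_rate_mtris": 98.0}
BASELINE = {"fill_rate_mtexels": 1200.0, "triangle_rate_mtris": 80.0}  # assumed minimums

def check_platform(result_file: Path) -> bool:
    """Return True if every benchmark metric meets the baseline minimum."""
    record = json.loads(result_file.read_text())
    failures = [name for name, minimum in BASELINE.items()
                if record.get(name, 0.0) < minimum]
    for name in failures:
        print(f"{record['platform']}: {name}={record.get(name)} below minimum {BASELINE[name]}")
    return not failures

if __name__ == "__main__":
    ok = all(check_platform(p) for p in Path("results").glob("*.json"))
    raise SystemExit(0 if ok else 1)
```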
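For the Kernel Testing item, one concrete starting point is verifying that a vendor kernel enables the configuration options RDK requires, as in the sketch below. The option list is a placeholder; the real list comes out of the requirements work with the vendor team, and reading /proc/config.gz assumes the kernel was built with CONFIG_IKCONFIG_PROC.

```python
import gzip
from pathlib import Path

# Placeholder list -- the real set comes from the RDK/vendor requirements work.
REQUIRED_OPTIONS = ["CONFIG_CGROUPS", "CONFIG_NAMESPACES", "CONFIG_OVERLAY_FS"]

def load_kernel_config() -> set[str]:
    """Read enabled options from /proc/config.gz (needs CONFIG_IKCONFIG_PROC)."""
    text = gzip.decompress(Path("/proc/config.gz").read_bytes()).decode()
    return {line.split("=")[0] for line in text.splitlines()
            if line and not line.startswith("#")}

def check_required_options() -> bool:
    enabled = load_kernel_config()
    missing = [opt for opt in REQUIRED_OPTIONS if opt not in enabled]
    for opt in missing:
        print(f"missing required kernel option: {opt}")
    return not missing

if __name__ == "__main__":
    raise SystemExit(0 if check_required_options() else 1)
```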