Testing Models the Right Way

From Requirements to Model Testing – 2 days

Date/Time & Location: Upon request
Language: English/German

This training class provides a comprehensive overview of the principles, processes, and objectives of model testing – from requirements to model tests. We offer step-by-step guidance from creating requirements-based test specifications, through testing TargetLink and/or Embedded Coder models, to automated test evaluation based on test assessments and back-to-back/regression tests. In particular, we emphasize ISO 26262-compliant test management and explain the test process for MiL and SiL, as well as tracing requirements to test specifications and test assessments. You will learn all process steps through hands-on exercises with Simulink and TargetLink or Embedded Coder models. During the training, we use the MES Test Manager (MTest) as the model test framework for the practical exercises. However, the training is suitable for anyone who wants to learn how to test models the right way – no matter which tool you use.

Target Audience

This training class is aimed at developers, testers, test managers, and quality managers who focus on model-based development of embedded software with MATLAB/Simulink and TargetLink or Embedded Coder.

Highlights

  • Test objectives and workflow
  • Test management
  • Test specification
  • Testing TargetLink/Embedded Coder models
  • Regression and back-to-back testing
  • Automated test evaluation with test assessments
  • Model and code coverage

★★★★★ Participant from Valeo Siemens

"The best way to learn about both – the theory and practice of testing."

Languages

Available in English and German

Formats

On-Site Training

Open-enrollment Trainings
at one of our locations

Online Training

Virtual Classroom Trainings
wherever you are

In-House Training

In-house trainings
online or on site

Agenda

Day 1

Introduction to model testing

  • Objectives, workflow, and process steps of model testing
  • Test specification methods
  • Test evaluation methods
  • Test documentation
  • Tracing requirements in model testing

Introduction to sample application

  • Setup of testing environment
  • Introduction to sample models

Systematic requirements-based specification of test sequences

  • Test cases: What are the typical basic elements?
  • Definition of test groups and test sequences with MTCD
  • Specification functions and parameter handling
  • Best practices for test specifications
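As a generic illustration of the idea behind requirements-based test sequences – an ordered series of steps that set inputs, advance the simulation, and check expected outputs, each sequence traced to a requirement – the following Python sketch models such a structure. This is not MTCD syntax; all class and field names (`Step`, `TestSequence`, `requirement_id`, the signal names, and the requirement ID `REQ-042`) are illustrative assumptions.

```python
# Generic sketch of a requirements-based test sequence (not MTCD syntax).
# A sequence is an ordered list of steps that set inputs, wait a number
# of simulation steps, and check expected outputs. All names are made up.

from dataclasses import dataclass, field

@dataclass
class Step:
    set_inputs: dict                            # input signal values to apply
    wait_samples: int                           # simulation steps to advance
    expect: dict = field(default_factory=dict)  # expected output values

@dataclass
class TestSequence:
    name: str
    requirement_id: str   # traceability link back to the requirement
    steps: list

# Example sequence: drive the input above a threshold, expect a warning
seq = TestSequence(
    name="activate_above_threshold",
    requirement_id="REQ-042",
    steps=[
        Step(set_inputs={"speed": 0}, wait_samples=10),
        Step(set_inputs={"speed": 120}, wait_samples=5,
             expect={"warning_active": True}),
    ],
)
print(seq.requirement_id, len(seq.steps))  # REQ-042 2
```

Keeping the requirement ID on the sequence itself is what makes the tracing from requirements to test specifications (and later to test assessments) mechanical rather than manual.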

Hands-on: Systematic requirements-based test specification

  • Creating test sequences
  • Executing test sequences
  • Using parameters for efficient modification of test sequences
  • Importing measurement data for testing (.mat file import)

Regression and back-to-back signal comparison

  • Scope (MiL – SiL – PiL)
  • Combination of back-to-back and regression testing
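The core of a back-to-back comparison can be sketched in a few lines: the same test sequence runs on two platforms (for example MiL and SiL), and the output signals are compared sample by sample against a combined absolute/relative tolerance band. The function name and tolerance values below are illustrative assumptions, not tool defaults.

```python
# Minimal sketch of a back-to-back signal comparison with tolerances.
# Two platforms execute the same test; their outputs must agree within
# an absolute plus relative tolerance band. Tolerances are illustrative.

def signals_match(reference, actual, abs_tol=1e-6, rel_tol=1e-3):
    """Return True if every sample of `actual` lies within the combined
    absolute/relative tolerance band around `reference`."""
    if len(reference) != len(actual):
        return False
    for r, a in zip(reference, actual):
        if abs(a - r) > abs_tol + rel_tol * abs(r):
            return False
    return True

# Example: MiL output used as the reference, SiL output as the test object
mil_output = [0.0, 0.5, 1.0, 1.5]
sil_output = [0.0, 0.5004, 1.0009, 1.5011]
print(signals_match(mil_output, sil_output))  # True
```

In a regression setup, the reference would instead be a stored result from an earlier software version – the comparison logic stays the same, which is why the two techniques combine so naturally.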

Hands-on: Signal comparison

  • Configuring and executing a test evaluation
  • Definition of tolerances
  • Documenting test evaluation results in reports and catalogs
  • Converting output signals into reference signals

Day 2

Testing TargetLink and Embedded Coder models and model/code coverage

  • Automated test bed creation and module testing for subsystems
  • Advanced support of code generation in model testing
  • Model coverage for all MiL test platforms
  • Code coverage for SiL/PiL test platforms
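The bookkeeping behind a coverage report can be illustrated with a toy example: each decision outcome in the test object records itself when executed, and coverage is the fraction of possible outcomes hit by the test cases. This is only a sketch of the principle; real model/code coverage is measured by instrumentation in the tool chain, and all names below are made up.

```python
# Toy illustration of structural (decision) coverage bookkeeping.
# Each branch outcome registers itself in a set when executed; coverage
# is the fraction of known outcomes that the test cases actually hit.

covered = set()
ALL_OUTCOMES = {"limit_true", "limit_false"}

def saturate(x, upper):
    """Example test object: clamp x to an upper limit."""
    if x > upper:
        covered.add("limit_true")
        return upper
    covered.add("limit_false")
    return x

# Two test cases exercise both outcomes of the decision
saturate(5, upper=3)
saturate(1, upper=3)
coverage = len(covered & ALL_OUTCOMES) / len(ALL_OUTCOMES)
print(f"decision coverage: {coverage:.0%}")  # decision coverage: 100%
```

An uncovered outcome in the report points directly at a missing structure-based test case – which is exactly the workflow practiced in the hands-on session below.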

Hands-on: Increasing model/code coverage

  • Automatic test execution for MiL/SiL/PiL
  • Interpretation and evaluation of coverage reports
  • Increasing model/code coverage through structure-based test cases
  • Logging internal signals

Introduction to test evaluation with test assessments

  • Principles and objectives of test assessments
  • Structure and content of test assessments

Assessment generation from requirements

  • Types of requirement patterns
  • Benefits of a formal requirements syntax
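To give a feel for why a formal requirement pattern can be turned into an executable assessment, here is a generic Python sketch (not MTest assessment syntax) of the pattern "whenever `trigger` becomes true, `response` shall be true within `max_delay` samples", evaluated over a recorded signal trace. All names and the trace data are illustrative assumptions.

```python
# Illustrative sketch (not MTest syntax): evaluating the requirement
# pattern "whenever `trigger` rises, `response` shall become true within
# `max_delay` samples" over a logged trace. All names are made up.

def assess_response_time(trigger, response, max_delay):
    """Check the pattern at every rising edge of `trigger`.
    Returns a list of (sample_index, passed) verdicts."""
    verdicts = []
    for i in range(1, len(trigger)):
        if trigger[i] and not trigger[i - 1]:          # rising edge
            window = response[i:i + max_delay + 1]     # allowed reaction window
            verdicts.append((i, any(window)))
    return verdicts

trigger  = [0, 0, 1, 1, 0, 1, 1, 1]
response = [0, 0, 0, 1, 0, 0, 0, 0]
print(assess_response_time(trigger, response, max_delay=2))
# [(2, True), (5, False)]
```

Because the pattern is formal, the evaluation logic can be generated once per pattern type and reused for every requirement written in that syntax – the main benefit over free-text requirements.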

Hands-on: Formal requirements and assessment generation

  • Writing typical formal requirements
  • Generating and executing test assessments
  • Workflow with generated assessments

Hands-on: Functional test evaluation with test assessments

  • Writing typical assessments manually or extending assessments
  • Test assessment evaluation in the assessment catalog
  • Best practices for test assessments

Overview of results and progress of the model test

  • Judging the progress of a test project (tracing, coverage)
  • Are requirements correctly implemented in the test object?
  • Assessing the quality of test results (test catalog, test report)
  • When is testing over? (test project protocol)

Hands-on: Overview of results and progress of the model test

  • Efficient workflow in case of modified requirements
  • Modifying test specifications and test assessments after requirement changes
  • Review of test specifications and test assessments