Testing Models with MTest

From Requirements to Model Testing - 3 days

Date: Upon request
Language: English/German

This training class provides a comprehensive overview of the principles, processes, and objectives of model testing with the MES Test Manager (MTest), from requirements to model tests. We offer step-by-step guidance from creating requirements-based test specifications, through testing TargetLink and/or Embedded Coder models, to automated test evaluation based on test assessments and back-to-back/regression tests. In particular, we emphasize ISO 26262-compliant test management and explain the test process for MiL and SiL, as well as tracing requirements to test specifications and test assessments. You will learn all process steps through hands-on practical exercises using Simulink and TargetLink or Embedded Coder models, with MTest as the model test framework.

Target Audience

This training class is aimed at developers and testers who want to learn how to use MTest for model testing. Experience with model-based development of embedded software in MATLAB/Simulink using TargetLink or Embedded Coder is advantageous. Share your experiences and discuss them with other users of the MES Test Manager (MTest).


Topics

  • Test objectives and workflow
  • Test management
  • Test specification with MTCD
  • Testing TargetLink/Embedded Coder models
  • Regression and back-to-back testing
  • Automated test evaluation with test assessments
  • Model and code coverage
  • Insight into test progress and test quality
  • Several hands-on sessions with MTest


Available in English and German



Open-enrolment Trainings
at one of our locations


Virtual Classroom Trainings
wherever you are


In-house Trainings
online or in-house


Day 1

Introduction to model testing

  • Objectives, workflow, and process steps of model testing
  • Test specification methods
  • Test evaluation methods
  • Test documentation
  • Tracing requirements in model testing
  • Setting up the work environment for the workshop

Introduction to sample models

  • Setup of testing environment
  • Introduction to customer models
  • Walk-through (MTest)

Systematic requirements-based specification of test sequences

  • Definition of test groups and test sequences with MTCD
  • Specification functions in MTCD (functions, synchronous, asynchronous)
  • Parameter handling with MTCD
  • Specification of test cases using variation
  • Best practices for test specifications

Hands-on: Test specification

  • In-depth work based on practical exercises
  • Joint creation of test specifications
  • Executing test sequences
  • Using parameters for modifying test sequences efficiently

Day 2

Testing TargetLink and Embedded Coder models and model/code coverage

  • Automated test bed creation and module testing for subsystems
  • Advanced support of code generation in model testing
  • Model coverage for all MiL test platforms
  • Code coverage for SiL/PiL test platforms

Hands-on: Increasing model/code coverage

  • Automatic test execution for MiL/SiL/PiL
  • Interpretation and evaluation of coverage reports
  • Increasing model/code coverage through structure-based test cases
  • Logging internal signals

Back-to-back and regression comparison

  • Areas of application (MiL vs. SiL vs. PiL, model simulation vs. measurement data)
  • Combining test assessments with back-to-back/regression testing
  • Conversion of output signals into reference signals

Hands-on: Test evaluation

  • Execution and documentation of test evaluation in a report
  • Definition of tolerances (amplitude and time)
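The tolerance concepts above can be illustrated with a minimal sketch (plain Python, not MTest syntax; the function name and sample data are hypothetical): a simulated output passes against a reference signal if every sample matches the reference within an amplitude tolerance, optionally allowing a small time shift.

```python
def within_tolerance(reference, actual, amp_tol, time_tol=0):
    """Check a sampled signal against a reference signal.

    A sample passes if it matches any reference sample within
    +/- time_tol steps (time tolerance) to within +/- amp_tol
    (amplitude tolerance).
    """
    n = len(reference)
    for i, value in enumerate(actual):
        lo = max(0, i - time_tol)
        hi = min(n, i + time_tol + 1)
        if not any(abs(value - reference[j]) <= amp_tol for j in range(lo, hi)):
            return False
    return True

ref = [0.0, 1.0, 2.0, 2.0, 1.0]
out = [0.0, 1.05, 2.0, 1.95, 1.0]           # small amplitude deviations
print(within_tolerance(ref, out, amp_tol=0.1))                # prints True
shifted = [0.0, 0.0, 1.0, 2.0, 2.0]         # same shape, one step late
print(within_tolerance(ref, shifted, amp_tol=0.1, time_tol=1))  # prints True
```

In a real back-to-back comparison the time tolerance absorbs small simulation-timing differences between MiL and SiL, while the amplitude tolerance absorbs numeric differences such as fixed-point quantization.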

Introduction to automated test evaluation with test assessments

  • Principles and objectives of test assessments
  • Structure and content of test assessments

Assessment generation from requirements (MARS)

  • Types of requirement patterns
  • Benefits of a formal requirements syntax
  • Assessment generation

Day 3

Requirements-based test case creation and generation

  • How does the equivalence class method work and how can it help?
  • Creating test sequences with the classification tree method
  • Boundary value testing
  • Generation of test sequences from formal requirements
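As a generic illustration of the boundary-value idea listed above (a plain-Python sketch, not MTCD syntax; the speed-signal range is a hypothetical example): for a valid input range, classic boundary-value testing selects the boundaries themselves, their inner neighbours, and the first invalid values just outside the range.

```python
def boundary_values(low, high, step=1):
    """Classic boundary-value selection for a valid input range
    [low, high]: the boundaries, their inner neighbours, and the
    first invalid values just outside the range."""
    return [low - step, low, low + step, high - step, high, high + step]

# Hypothetical valid range for a speed input, 0..250 km/h
print(boundary_values(0, 250))  # prints [-1, 0, 1, 249, 250, 251]
```

The equivalence-class method complements this by adding one representative value from the inside of each valid and invalid class, so that each class is exercised at least once.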

Hands-on: Requirements-based test case generation

  • Automated stimulation and evaluation
  • Inspection of coverage and trigger behavior

MTest and Continuous Integration

  • Workflow of test projects using CI
  • MES Jenkins Plugin
  • Demo: MTest and Jenkins

Hands-on: Complete setup of test project

  • Create a test project
  • Select the test object and corresponding requirements
  • Formalize requirements
  • Create test sequences and simulate
  • Create test assessments and evaluate
  • Inspect model/code coverage and write further test sequences
  • Perform a back-to-back test and configure tolerances

Result and progress overview

  • Where can I see the progress of my test project? (tracing, coverage, project integrity)
  • Are the requirements correctly implemented in the test object? (assessment catalog)
  • What is the quality of the test results? (test catalog, test report)
  • When am I done testing?

Hands-on: Results and progress

  • What is an efficient workflow after requirement modifications?
  • Modifying test specifications and test assessments after requirements changes
  • Review of test specifications and test assessments