Introduction to Model-Based Development and Quality Assurance of Embedded Software


Date: available upon request

Languages: English or German

This 3-day training workshop provides a practice-based overview of developing and safeguarding embedded software with Simulink® and TargetLink® in series production projects. It takes participants through all process steps, from designing and creating the simulation model in Simulink® and Stateflow® to production code generation. Model quality assurance comprises verifying the model and software architecture, enforcing the modeling guidelines, and checking compliance with the requirements in the model test. An efficient requirements-based test specification is created for the models and executed in MiL and SiL tests. Functional accuracy is verified by evaluating regression and back-to-back tests. In practical exercises, you will apply all of these steps using the MES Test Manager® (MTest), MES Model Examiner® (MXAM), and MES M-XRAY® (MXRAY).
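The back-to-back principle mentioned above — running the same test cases against the model (MiL) and the generated code (SiL) and comparing the output signals within a tolerance — can be sketched in a tool-neutral way. This is an illustrative Python snippet, not an MES or MathWorks API; the function name and tolerance value are assumptions:

```python
def back_to_back_pass(mil_signal, sil_signal, abs_tol=1e-4):
    """Compare MiL and SiL output traces sample by sample.

    Returns True if every sample pair differs by at most abs_tol,
    mimicking the absolute-tolerance check of a back-to-back test.
    """
    if len(mil_signal) != len(sil_signal):
        return False  # traces must cover the same simulation steps
    return all(abs(m - s) <= abs_tol for m, s in zip(mil_signal, sil_signal))

# Example: fixed-point quantization in the SiL run causes small deviations.
mil = [0.0, 0.5, 1.0, 1.5]
sil = [0.0, 0.50002, 0.99998, 1.50001]
print(back_to_back_pass(mil, sil))                # small deviations pass
print(back_to_back_pass(mil, sil, abs_tol=1e-6))  # tighter tolerance fails
```

In practice the tolerance definition is part of the test evaluation configuration; the workshop covers this on day 3.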

Valeo Siemens

"A prolific overview of the model-based development process and betterment of quality of the project.”

Valeo Siemens

"A must for modern software development!”


Target audience

The training workshop is aimed at modelers, developers, testers, quality managers, project managers, and team leaders whose focus is the model-based development of embedded software with MATLAB®/Simulink® in series production projects.

Conditions of Participation and Cost

Available as an in-house training class worldwide on request.
For company-specific adaptations to the agenda, quotations, or questions, please do not hesitate to contact us via


Topics

  • Model-based development with Simulink® and Stateflow®
  • Developing safety-critical software in compliance with ISO 26262
  • Code generation from Simulink® models
  • Model quality analysis and evaluation
  • Modeling guidelines
  • Model testing and test implementation techniques


    AGENDA - DAY 1

    9 a.m. Welcome and introduction round
    9:30 a.m. Model-based software development with Simulink®
  • Foundations of model-based development
  • Overview of development and quality assurance activities
  • Characteristics of ISO 26262-compliant development
    10:15 a.m. Introduction to the sample application
  • Setting up the modeling environment
  • Introduction to the sample models
    10:30 a.m. Modeling embedded software in Simulink®
  • Simulink® modeling environment
  • Parametrization of Simulink® models
  • Continuous and discrete modeling
    11:15 a.m. Hands-on: Simulink®
  • Creating Simulink® modules
    12:30 p.m. Lunch break and open dialog
    1:30 p.m. Modeling embedded software with Stateflow®
  • Introduction to the concept of finite state machines
  • Stateflow® modeling environment
  • Stateflow® design patterns
  • Recommended best practices
    2:15 p.m. Hands-on: Stateflow®
  • Creating a Stateflow® chart
    4 p.m. Analysis and evaluation of model structure
  • Model structure analysis
  • Introduction to complexity metrics
  • Calculating model complexity
  • Countermeasures against overly complex models
  • Assessing coherence in models
    4:30 p.m. Hands-on: Model architecture analysis with MES M-XRAY® (MXRAY)
  • Analysis of the sample models
  • Detecting complex subsystems, ineffective interfaces, and cloned subsystems
    5 p.m. End of day
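The finite state machines introduced in the afternoon's Stateflow® session can be illustrated with a minimal, tool-neutral sketch (plain Python; states and transitions only — Stateflow® adds hierarchy, parallel states, and graphical semantics on top of this idea). The drive-enable example and all state/event names are invented for illustration, not taken from the training models:

```python
# Minimal state machine: a pedal-based drive enable, purely illustrative.
# Each key is a (current_state, event) pair mapping to the next state.
TRANSITIONS = {
    ("Off",     "ignition_on"):    "Standby",
    ("Standby", "pedal_pressed"):  "Drive",
    ("Drive",   "pedal_released"): "Standby",
    ("Standby", "ignition_off"):   "Off",
}

def step(state, event):
    """Return the next state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "Off"
for event in ["ignition_on", "pedal_pressed", "pedal_released", "ignition_off"]:
    state = step(state, event)
print(state)  # → Off
```

Making the transition table explicit like this is what enables the guideline checks and complexity metrics discussed later in the day: states, events, and transitions are countable artifacts.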

    AGENDA - DAY 2

    9 a.m. Code generation via TargetLink®/Embedded Coder® development environment
  • Principles of code generation
  • Data Dictionary
  • Data types, classes, scaling, and fixed-point arithmetic
  • Interfaces (signals and buses)
    10:30 a.m. Integrating models and distributed modeling
  • Advantages of model referencing and libraries
  • Definition of distributed parameter files
    11 a.m. Ensuring model quality with modeling guidelines
  • Overview of modeling guidelines
  • Modeling guidelines for ISO 26262-compliant modeling
  • Automatic checks of modeling guidelines with the MES Model Examiner® (MXAM)
    12 p.m. Lunch break and open dialog
    1 p.m. Hands-on: Modeling guidelines with the MES Model Examiner® (MXAM)
  • Performing analyses with MXAM
  • Evaluating and discussing specific guideline violations
  • Repairing or justifying guideline violations
    2 p.m. Ensuring model quality with model tests
  • Goals, workflow, and process steps of model testing
  • Test specification methods
  • Regression testing and back-to-back testing (MiL – SiL – PiL)
  • Automatic test evaluation with test assessments
  • Tracing requirements in model tests
    3 p.m. Systematic requirements-based test case creation
  • Test cases: What are the typical basic elements?
  • Specification functions in MTCD (functions, synchronous, asynchronous)
  • Parameter handling with MTCD
  • Best practices for test specifications
    4 p.m. Hands-on: MTCD with the MES Test Manager® (MTest)
  • Creating test sequences
  • Executing test sequences
  • Using parameters to efficiently modify test sequences
    5 p.m. End of day
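The fixed-point arithmetic covered in the morning's code generation session boils down to representing a physical value as an integer plus a scaling factor (LSB). The following is a hedged, generic sketch of that quantization — the formula is the standard one, but the function names, the LSB value, and the saturation behavior are illustrative assumptions, not TargetLink® or Embedded Coder® output:

```python
def to_fixed(value, lsb=0.01, bits=16):
    """Quantize a physical value to a signed fixed-point integer.

    real_value ~= stored_int * lsb; the stored integer is saturated
    to the signed range of the given word length (here int16).
    """
    raw = round(value / lsb)
    lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    return max(lo, min(hi, raw))

def to_real(stored, lsb=0.01):
    """Convert the stored integer back to the physical value."""
    return stored * lsb

q = to_fixed(3.14159)
print(q, to_real(q))   # 314 3.14 — quantization error bounded by lsb/2
print(to_fixed(1e6))   # 32767 — out-of-range values saturate
```

The quantization error visible here (at most half an LSB per sample) is exactly why the back-to-back comparisons on day 3 need signal tolerances rather than exact equality.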

    AGENDA - DAY 3

    9 a.m. Regression and back-to-back signal comparison
  • Scope (MiL vs. SiL vs. PiL, model simulation vs. measurement data)
  • Combining back-to-back and regression testing
    9:30 a.m. Hands-on: Signal comparison with the MES Test Manager® (MTest)
  • Configuring and executing a test evaluation
  • Defining tolerances
  • Documenting test evaluation results in reports and catalogs
  • Converting output signals into reference signals
    10 a.m. Automated test evaluation with test assessments
  • Principles and objectives of test assessments
  • Assessment generation from requirements (MARS)
  • Benefits of a formal requirements syntax
    11 a.m. Hands-on: MARS with the MES Test Manager® (MTest)
  • Creating typical formal requirements
  • Generating and executing test assessments
  • Workflow with generated assessments
    12 p.m. Lunch break and open dialog
    1 p.m. Model and code coverage in the model test
  • Model coverage for all MiL test platforms
  • Code coverage for SiL/PiL test platforms
    1:30 p.m. Hands-on: Increasing model/code coverage with the MES Test Manager® (MTest)
  • Interpreting and evaluating coverage reports
  • Increasing model/code coverage through structure-based test cases
    2:30 p.m. Overview of results and progress of the model test
  • Judging the progress of a test project
  • Are the requirements correctly implemented in the test object?
  • Assessing the quality of test results
  • When am I finished with development and quality assurance?
    3 p.m. Hands-on: Overview of results and progress of the model test
  • Assessing the quality of the test objects (test catalog, test report)
  • Efficient workflow when requirements change
  • Overview of development and project quality
    4 p.m. Concluding words and feedback
    5 p.m. End of training class
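The idea behind day 3's coverage session — requirements-based tests rarely exercise every decision, and structure-based test cases close the remaining gaps — can be sketched with a hand-rolled decision-coverage counter. The saturation function and the bookkeeping here are purely illustrative; in a real project the coverage data comes from the instrumentation of the simulation or SiL environment, not from code like this:

```python
# Illustrative decision-coverage bookkeeping for a saturation function.
# Each decision must be observed both True and False for full coverage.
outcomes = {"upper_true": False, "upper_false": False,
            "lower_true": False, "lower_false": False}

def saturate(x, lo=-10, hi=10):
    if x > hi:
        outcomes["upper_true"] = True
        return hi
    outcomes["upper_false"] = True
    if x < lo:
        outcomes["lower_true"] = True
        return lo
    outcomes["lower_false"] = True
    return x

for test_input in [0, 15]:        # requirements-based tests
    saturate(test_input)
coverage = sum(outcomes.values()) / len(outcomes)
print(f"{coverage:.0%}")          # 75% — the x < lo branch is never taken

saturate(-20)                     # structure-based test case closes the gap
coverage = sum(outcomes.values()) / len(outcomes)
print(f"{coverage:.0%}")          # 100%
```

Reading a coverage report this way — as a list of decisions still missing one outcome — is what turns "increase coverage" into concrete additional test cases.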