MBSE and AI – Let’s talk machine

Spoiler: If the general discussion doesn’t interest you, skip ahead; you’ll find a concrete application of AI at the end.

What impact will AI have on MBSE? I’m pretty sure the impact is big, but what does it mean specifically? What is already going on, what can we expect in the future, and what role does MBSE play in the context of AI? Do we even need MBSE anymore?

The short answer: I’m convinced we still need MBSE, and with AI even more so.

The long answer will unfold over several blog posts. They haven’t all been written yet, and some of the ideas will probably only emerge over time. The topic is multi-faceted, and I would like to explore it in a series of short posts. I look forward to lively discussions and further views, so please send me your thoughts.

To start with, a few basic questions: What is MBSE anyway, and what is a model? The fundamental questions are usually the hardest. I have already explained my understanding of them in the following blog posts.

In a nutshell, modeling makes engineering information accessible to machines, which enables machine assistance.

Finally, the promised concrete application of AI in SysML v2 modeling.

The SysML v2 pilot implementation in JupyterLab can be extended with an AI assistant via the free plugin ChatGPT-Jupyter-AI Assistant, which allows SysML v2 models to be explained in natural language. If you do not have a local installation of the SysML v2 pilot implementation, you can also use SysMLv2Lab.com.
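For readers who have not used the pilot implementation before: a Jupyter notebook cell simply contains SysML v2 text, which the kernel parses directly, and the AI assistant can then be asked in plain English to explain the model. A minimal sketch of such a cell might look like this (the package and element names here are illustrative only, not part of the model shown below):

```
package AssistantDemo {
    part def Battery;
    part def Drone {
        part battery : Battery;
        part engine[4];
    }
}
```

With the plugin installed, you could then select the cell and ask the assistant something like "Explain this SysML v2 model in plain language."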

The following video demonstrates this use case. After the video, you will find the SysML v2 model that was used.

// Extract of the MBSE4U example model set about a forest fire observation system
// Copyright 2023 MBSE4U

abstract part def Drone {
    // Spatial extent
    item :>> shape : ShapeItems::Cylinder {
        :>> height = 20 [SI::cm];
        :>> base.shape : ShapeItems::Circle {
            :>> radius = 60 [SI::cm];
        }
    }
    item boundingBox : ShapeItems::Box [1] :> boundingShapes {
        :>> length = 120 [SI::cm];
        :>> width  = 120 [SI::cm];
        :>> height = 20 [SI::cm];
    }
    // Timeslices
    timeslice parking {
        snapshot startCharging {
            // model some start charging conditions
        }
        then snapshot stopCharging;
    }
    then timeslice flying {
        :>> battery[1];
    }
    // States
    state droneStates {
        entry; then off;

        state off;
        accept SigSwitchOn then standBy;
        state standBy;
        accept SigSwitchOff then off;
        transition standBy_charging
            first standBy
            accept SigStartCharging
            then charging;
        transition standBy_ready
            first standBy
            accept SigActivate
            then ready;
        state charging;
        transition charging_standBy
            first charging
            accept SigStopCharging
            then standBy;
        state ready;
        accept SigDeactivate then standBy;
        transition ready_flying
            first ready
            accept SigStartFlying
            then flying;
        state flying;
        accept SigStopFlying then ready;
    }
    // Signals
    attribute def SigStartCharging;
    attribute def SigStopCharging;
    attribute def SigSwitchOn;
    attribute def SigSwitchOff;
    attribute def SigActivate;
    attribute def SigDeactivate;
    attribute def SigStartFlying;
    attribute def SigStopFlying;
    attribute def SigBatteryLow;
    // Power
    part battery {
        part batteryManagementSystem;
    }
    message notifyBatteryLow of SigBatteryLow from battery.batteryManagementSystem to flightControl;
    // Engines
    part engine[4];
    // Flight Control
    part flightControl;
}
// Drone Use Cases
use case observeArea {
    subject drone : Drone;
    in item hi_observationArea;
    first start;
    then action defineObservationArea {
        in item hi_observationArea = observeArea::hi_observationArea;
        out item observationArea;
    }
    then action approachArea {
        in item observationArea;
    }
    then action flyObservationPatterns;
    then action returnToHomeBase;
    then done;
    flow defineObservationArea.observationArea to approachArea.observationArea;
}
use case chargeDrone {
    subject drone : Drone;
    first start;
    then action plugDrone;
    then action chargeDroneBatteries;
    then done;
}


2 Responses

  1. Varun says:

Hello Tim Weilkiens, it’s really awesome to have AI assistance for analyzing SysML v2 models.
Does this work only with a subscription to GPT-4, or also with the free GPT-3.5?

  2. Tim Weilkiens says:

As far as I know, it should work. You need an OpenAI API key, and they are provided for free, at least for a limited trial.
