visual.build_module_mermaid_chart

lucid.visual.build_module_mermaid_chart(module: Module, input_shape: list[int] | tuple[int] | list[list[int] | tuple[int]] | None = None, inputs: Iterable[Tensor] | Tensor | None = None, depth: int = 2, direction: str = 'LR', include_io: bool = True, show_params: bool = False, return_lines: bool = False, copy_to_clipboard: bool = False, compact: bool = False, use_class_defs: bool = False, end_semicolons: bool = True, edge_mode: Literal['dataflow', 'execution'] = 'execution', collapse_repeats: bool = True, repeat_min: int = 2, color_by_subpackage: bool = True, container_name_from_attr: bool = True, edge_stroke_width: float = 2.0, emphasize_model_title: bool = True, model_title_font_px: int = 20, show_shapes: bool = False, hide_subpackages: Iterable[str] = (), hide_module_names: Iterable[str] = (), dash_multi_input_edges: bool = True, subgraph_fill: str = '#000000', subgraph_fill_opacity: float = 0.05, subgraph_stroke: str = '#000000', subgraph_stroke_opacity: float = 0.75, force_text_color: str | None = None, edge_curve: str = 'natural', node_spacing: int = 50, rank_spacing: int = 50, input_dtype: type | None = None, **forward_kwargs) -> str | list[str]

Generates a Mermaid flowchart diagram from a lucid.nn.Module by running a forward pass (using the provided inputs, or a randomly generated tensor matching input_shape) and recording the execution order or dataflow between modules.

This is intended for quick architecture inspection and for embedding lightweight model diagrams in documentation.
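Because the returned chart is raw Mermaid source, embedding it in a Markdown page only requires wrapping it in a mermaid code fence. A minimal helper sketch (the short chart string below is a hypothetical stand-in for the real return value, just to keep the snippet self-contained):

```python
# Sketch: embed a generated chart in a Markdown document.
# `chart` would normally come from build_module_mermaid_chart(...).

FENCE = "`" * 3  # literal triple backticks, built programmatically


def to_markdown_block(chart: str) -> str:
    """Wrap raw Mermaid source in a fenced mermaid block for Markdown docs."""
    return f"{FENCE}mermaid\n{chart.strip()}\n{FENCE}\n"


chart = "flowchart LR\n  input --> m0;\n  m0 --> output;"
print(to_markdown_block(chart))
```

Renderers that support Mermaid (e.g. GitHub, MkDocs with a Mermaid plugin) will then draw the diagram inline.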

Basic Example

import lucid
import lucid.nn as nn
from lucid.visual import build_module_mermaid_chart

class Tiny(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, stride=1, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 8, kernel_size=3, stride=1, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = Tiny()
chart = build_module_mermaid_chart(
    model,
    input_shape=(1, 3, 32, 32),
    depth=3,
    edge_mode="execution",
    show_shapes=True,
    collapse_repeats=True,
)

print(chart)
%%{init: {"flowchart":{"curve":"monotoneX","nodeSpacing":50,"rankSpacing":50}} }%%
flowchart LR
  linkStyle default stroke-width:2.0px
  subgraph sg_m0["<span style='font-size:20px;font-weight:700'>Tiny</span>"]
  style sg_m0 fill:#000000,fill-opacity:0.05,stroke:#000000,stroke-opacity:0.75,stroke-width:1px
    subgraph sg_m1["net"]
      direction TB;
    style sg_m1 fill:#000000,fill-opacity:0.05,stroke:#000000,stroke-opacity:0.75,stroke-width:1px
      m2["Conv2d<br/><span style='font-size:11px;color:#c53030;font-weight:400'>(1,3,32,32) → (1,8,32,32)</span>"];
      m3["ReLU"];
      m4["Conv2d"];
    end
  end
  input["Input<br/><span style='font-size:11px;color:#a67c00;font-weight:400'>(1,3,32,32)</span>"];
  output["Output<br/><span style='font-size:11px;color:#a67c00;font-weight:400'>(1,8,32,32)</span>"];
  style input fill:#fff3cd,stroke:#a67c00,stroke-width:1px;
  style output fill:#fff3cd,stroke:#a67c00,stroke-width:1px;
  style m2 fill:#ffe8e8,stroke:#c53030,stroke-width:1px;
  style m3 fill:#faf5ff,stroke:#6b46c1,stroke-width:1px;
  style m4 fill:#ffe8e8,stroke:#c53030,stroke-width:1px;
  input --> m2;
  m2 --> m3;
  m3 --> m4;
  m4 --> output;

Key Parameters

  • inputs / input_shape: Provide explicit inputs (Tensor or iterable of Tensor) or let the function generate random inputs from input_shape.

  • depth: Limits how deeply the module tree is expanded into nested subgraphs for grouping.

  • edge_mode: "execution" records the sequential order in which modules ran; "dataflow" connects producers to consumers through the intermediate tensors.

  • show_shapes: Adds input/output shape hints to node labels (when a shape change is detected).

  • collapse_repeats / repeat_min: Collapses repeated sibling structures into a single node like Layer x N.

  • hide_subpackages / hide_module_names: Filters out modules by built-in Lucid subpackage (e.g. activation, drop) or by class name (e.g. ReLU).