
Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #1212

Triggered via pull request: November 15, 2024, 16:59
Status: Success
Total duration: 27s
Artifacts

copyright-check.yml

on: pull_request
copyright-check / main (18s)