
Expand of a scalar panics burn-import #4206

@GregorySech

Description

Feature description

The Expand ONNX operation with a scalar as its first input should work when importing a model with burn-import.
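
For context, the ONNX specification defines Expand as broadcasting its input to a target shape, and a scalar (rank-0) input is valid there: it simply broadcasts to every position. Below is a minimal sketch of the intended behaviour using Burn's tensor API (the value 3 and the target shape [4] are made up; Burn has no rank-0 tensors, so the scalar is modelled as a one-element rank-1 tensor):

  use burn::tensor::{backend::Backend, Int, Tensor};

  // Broadcast a "scalar" Int64 value to a 1-D target shape, which is how
  // the ONNX Expand operator is expected to behave for such a node.
  fn expand_scalar<B: Backend>(device: &B::Device) -> Tensor<B, 1, Int> {
      let scalar = Tensor::<B, 1, Int>::from_ints([3], device);
      scalar.expand([4]) // yields [3, 3, 3, 3]
  }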

Feature motivation

I'm trying to get EdgeSAM imported into Burn.
I've converted the official ONNX models to opset 16 and created a build.rs script as described in the docs (a sketch of that setup is included below for reference).
The encoder imports with no issues; the decoder panics with the following logs:

  DEBUG onnx_ir::rank_inference: Inferring rank for node: expand3
  DEBUG onnx_ir::node::expand: Expand node expand3 has 2 inputs
  DEBUG onnx_ir::node::expand: Expand node expand3 input[0]: Scalar(Int64)
  DEBUG onnx_ir::node::expand: Expand node expand3 input[1]: Tensor(TensorType { elem_type: Int64, rank: 1, static_shape: Some([1]) })
  ERROR burn_import::logger: PANIC => panicked at /Users/gregorysech/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/onnx-ir-0.19.1/src/node/expand.rs:39:14:
  Expand operation requires first input to be a tensor

Netron's view of the decoder ONNX model agrees with the inference reported by burn_import and onnx_ir.
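
For reference, the build.rs follows the standard burn-import pattern from the Burn book; a minimal sketch, with the .onnx paths as placeholders for the actual opset-16 EdgeSAM exports:

  // build.rs (sketch; paths are placeholders for the converted EdgeSAM models)
  use burn_import::onnx::ModelGen;

  fn main() {
      ModelGen::new()
          .input("src/model/edge_sam_encoder.onnx") // imports without issues
          .input("src/model/edge_sam_decoder.onnx") // panics on the Expand node above
          .out_dir("model/")
          .run_from_script();
  }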

[Image: Netron screenshot of the decoder's Expand node]

Labels

bug (Something isn't working), onnx
