Improve handling for FP8 grouped gemm without zero_start_index_M (#3615)
Summary:
Pull Request resolved: #3615
X-link: facebookresearch/FBGEMM#694

When zero_start_index_M isn't provided, inputs can have variable M values across groups. To support this, we need to return a tensor with shape [total_M, N], since it isn't possible to view the tensor as [G, M, N].

Reviewed By: jasonjk-park, bradleyhd

Differential Revision: D68686266

fbshipit-source-id: 4267de288d6f7f2d0dec82b881eba056c11ea737
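The shape reasoning above can be sketched in plain Python. This is an illustrative helper, not FBGEMM's actual API: `group_ms` (the per-group M sizes) and `n` are hypothetical names used only to show why a uniform-M batch can be viewed as [G, M, N] while variable-M groups force the flat [total_M, N] layout.

```python
def grouped_output_shape(group_ms, n):
    """Illustrative only: pick the output shape for a grouped GEMM.

    If every group shares the same M, the concatenated [total_M, N]
    result can equivalently be viewed as [G, M, N]. With variable M
    across groups that view is impossible, so the flat [total_M, N]
    shape is the only consistent return layout.
    """
    if len(set(group_ms)) == 1:
        # Uniform M: a [G, M, N] view of the output is valid.
        return (len(group_ms), group_ms[0], n)
    # Variable M: only the concatenated [total_M, N] shape works.
    return (sum(group_ms), n)
```

For example, three groups with M = 16 each yield `(3, 16, 8)` for N = 8, whereas groups with M = 16, 32, and 8 must return the flat shape `(56, 8)`.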