
How to do transpose on input data, (hwc->chw) #815

Closed
WIll-Xu35 opened this issue Mar 6, 2019 · 11 comments
@WIll-Xu35

Hi all,

I found this code in the wiki to transpose CHW to CWH:

void transpose(const ncnn::Mat& in, ncnn::Mat& out)
{
    ncnn::Option opt;
    opt.num_threads = 2;

    ncnn::Layer* op = ncnn::create_layer("Permute");

    // set param
    ncnn::ParamDict pd;
    pd.set(0, 1);// order_type

    op->load_param(pd);

    op->create_pipeline(opt);

    // forward
    op->forward(in, out, opt);

    op->destroy_pipeline(opt);

    delete op;
}

Does anyone know how to transpose Mat, HWC to CHW?

I'm sorry if this is a dumb question, but I'm new to C++ and ncnn; any help is appreciated.

Thanks.


nihui commented Mar 7, 2019

You can change the order_type param to 5, i.e. pd.set(0, 5);

// 5 = c h w

@nihui nihui closed this as completed Mar 7, 2019
@WIll-Xu35

So just use:
pd.set(0, 5);// order_type
in the transpose function, right?

The original code:
pd.set(0, 1);// order_type
according to the comment, transposes WHC to HWC.

But this link (https://github.com/Tencent/ncnn/wiki/low-level-operation-api)
says it transposes CHW to CWH. I'm confused about which one is correct.

@nihui

nihui commented Mar 7, 2019

The order_type convention is inner-dim-first:
order_type whc = chw in wiki notation
order_type hwc = cwh in wiki notation

@WIll-Xu35

Got it, much appreciated.


WIll-Xu35 commented Mar 11, 2019

@nihui

I used the given transpose function to preprocess the input data. However, no matter which order_type I set, the output of the model is exactly the same.

I've read #221. Does ncnn::Mat automatically transform the HWC channel order of cv::Mat to CHW in ncnn::Mat, or do I have to do the transpose on the cv::Mat and then initialize the ncnn::Mat?

My final goal is to feed a 3x112x112 RGB image to the model.

Thanks!


nihui commented Mar 11, 2019

Use Mat::from_pixels() to convert packed RGB data to Mat.

@WIll-Xu35

@nihui

So this line of code can read a cv::Mat, which is in BGR order, into RGB order in an ncnn::Mat:
ncnn::Mat in_rgb = ncnn::Mat::from_pixels(bgr_data, ncnn::Mat::PIXEL_BGR2RGB, w, h);

But how do I further transpose it into a 3x112x112 shape?

What I've done: I first use from_pixels() to initialize the ncnn::Mat, but when I use the transpose() function (mentioned earlier in this issue) to change the shape from 112x112x3 to 3x112x112, the network output stays the same, so the transpose() function seems to have no effect.

The overall question I have now: after Caffe int8 quantization, my face recognition network has lost too much accuracy. I'm trying to eliminate all possible sources of error apart from the quantized model itself, so I want to be 100% sure my input is correct; then I can determine whether the problem is the input or the model.

Sorry to keep you busy these days; I really appreciate your help.

lizozom commented Jun 1, 2022

@nihui I am also attempting to transpose an image (following https://github.com/nihui/ncnn-webassembly-yolov5) and can't figure out a way to do so:

// convert to RGB
ncnn::Mat in = ncnn::Mat::from_pixels(rgba.data, ncnn::Mat::PIXEL_RGBA2RGB, width, height);

// reshape to match model's input tensor
in = in.reshape(1, 3, 640, 640);

// normalize
const float norm_vals[3] = {1 / 255.f, 1 / 255.f, 1 / 255.f};
in.substract_mean_normalize(0, norm_vals);

// How to reorder axes from [batch_size, width, height, color] to [batch_size, color, width, height]?

lucasjinreal commented Jun 15, 2022

@lizozom As far as I understand, from_pixels already converts your HWC data to CHW layout:

for (; remain > 0; remain--) {
    *ptr0 = pixels[2];
    *ptr1 = pixels[1];
    *ptr2 = pixels[0];

    pixels += 3;
    ptr0++;
    ptr1++;
    ptr2++;
}

pixels += wgap;

ptr0, ptr1, ptr2 should store data in RRRRR, GGGGG, BBBBB planar layout respectively if you set the BGR2RGB pixel type.

kikirizki commented Sep 29, 2022

Hi @nihui, I'm sorry if this is another dumb question, but how do I prevent Mat::from_pixels() from transposing my tensor? I want to keep it HWC.

@Suncheng2022


from_pixels() converts HWC to CHW by default; you just need to give it the cv::Mat.


6 participants