@Alby-o
Last active March 20, 2024 18:43
// imgLib -> Image package from https://pub.dartlang.org/packages/image
import 'package:image/image.dart' as imglib;
import 'package:camera/camera.dart';

Future<List<int>?> convertImagetoPng(CameraImage image) async {
  try {
    final imglib.Image img;
    if (image.format.group == ImageFormatGroup.yuv420) {
      img = _convertYUV420(image);
    } else if (image.format.group == ImageFormatGroup.bgra8888) {
      img = _convertBGRA8888(image);
    } else {
      return null; // Unsupported format
    }
    // Convert to png
    final pngEncoder = imglib.PngEncoder();
    return pngEncoder.encodeImage(img);
  } catch (e) {
    print(">>>>>>>>>>>> ERROR: $e");
  }
  return null;
}
// CameraImage BGRA8888 -> PNG
// Color
imglib.Image _convertBGRA8888(CameraImage image) {
  return imglib.Image.fromBytes(
    image.width,
    image.height,
    image.planes[0].bytes,
    format: imglib.Format.bgra,
  );
}
// CameraImage YUV420_888 -> PNG -> Image (compression: 0, filter: none)
// Grayscale (Y plane only)
imglib.Image _convertYUV420(CameraImage image) {
  final img = imglib.Image(image.width, image.height); // Create Image buffer
  final Plane plane = image.planes[0];
  const int shift = (0xFF << 24);
  // Fill image buffer with plane[0] from YUV420_888
  for (int x = 0; x < image.width; x++) {
    for (int planeOffset = 0;
        planeOffset < image.height * image.width;
        planeOffset += image.width) {
      final pixelColor = plane.bytes[planeOffset + x];
      // color: 0x FF FF FF FF
      //           A  B  G  R
      // Calculate pixel color
      final newVal = shift | (pixelColor << 16) | (pixelColor << 8) | pixelColor;
      img.data[planeOffset + x] = newVal;
    }
  }
  return img;
}
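For context, a rough usage sketch (not part of the original gist): grab a single frame from the camera's image stream and write the encoded PNG to disk. The CameraController setup, the outputPath parameter, and the one-frame logic are illustrative assumptions only.

import 'dart:io';

import 'package:camera/camera.dart';

// Hypothetical helper: convert the first streamed frame with convertImagetoPng
// (defined above) and save it, then stop the stream.
Future<void> captureOnePng(CameraController controller, String outputPath) async {
  var saved = false;
  await controller.startImageStream((CameraImage frame) async {
    if (saved) return;
    saved = true;
    await controller.stopImageStream();
    final png = await convertImagetoPng(frame);
    if (png != null) {
      await File(outputPath).writeAsBytes(png);
    }
  });
}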
@yh4922
yh4922 commented Dec 9, 2021

imglib.Image _convertYUV420(CameraImage image) {
  var img = imglib.Image(image.width, image.height); // Create Image buffer

  final int width = image.width;
  final int height = image.height;
  final int uvRowStride = image.planes[1].bytesPerRow;
  final int uvPixelStride = image.planes[1].bytesPerPixel;
  const shift = (0xFF << 24);

  for(int x=0; x < width; x++) {
    for(int y=0; y < height; y++) {
      final int uvIndex = uvPixelStride * (x/2).floor() + uvRowStride*(y/2).floor();
      final int index = y * width + x;

      final yp = image.planes[0].bytes[index];
      final up = image.planes[1].bytes[uvIndex];
      final vp = image.planes[2].bytes[uvIndex];
      // Calculate pixel color
      int r = (yp + vp * 1436 / 1024 - 179).round().clamp(0, 255);
      int g = (yp - up * 46549 / 131072 + 44 -vp * 93604 / 131072 + 91).round().clamp(0, 255);
      int b = (yp + up * 1814 / 1024 - 227).round().clamp(0, 255);     
      // color: 0x FF  FF  FF  FF 
      //           A   B   G   R
      img.data[index] = shift | (b << 16) | (g << 8) | r;
    }
  }

  return img;
}

@sikandernoori this way it can be converted into a color image, but the conversion takes more than 1000 ms on a phone with a Snapdragon 870 CPU, and it blocks the UI.
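For reference (not something stated in the thread): the integer constants above appear to approximate the usual full-range BT.601 YUV-to-RGB conversion, roughly

R = Y + 1.402 * (V - 128)
G = Y - 0.344 * (U - 128) - 0.714 * (V - 128)
B = Y + 1.772 * (U - 128)

e.g. 1436/1024 ≈ 1.402 and 179 ≈ 1.402 * 128; the G-channel fractions in the code differ slightly from these nominal coefficients.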

@alexcohn
alexcohn commented Dec 9, 2021

@yh4922 this is the time it takes Dart to convert a YUV 420 image (which comes from the Android camera) to RGB. Performance could be much better if you do it in C, possibly enabling NEON or the GPU. You can do this conversion in OpenCV; it's nicely optimized.

@sikandernoori
@yh4922 the solution proposed by @alexcohn is a good option.
But if you want a simpler one, then I would suggest using isolates to convert the image ...

Send the CameraImage to an isolate, convert the image within the isolate, and use the result ...
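For illustration, a minimal sketch of that idea using Flutter's compute(), which runs a top-level function in a short-lived background isolate. It assumes a convertYUV420ToImage function like the ones in this thread, and that your camera/Flutter versions allow a CameraImage to be sent to an isolate (otherwise copy the plane bytes into a plain object first):

import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart' show compute;
import 'package:image/image.dart' as imglib;

// Must be a top-level (or static) function so compute() can run it in an isolate.
imglib.Image _convertFrame(CameraImage cameraImage) =>
    convertYUV420ToImage(cameraImage); // any of the converters from this thread

Future<imglib.Image> convertOffUiThread(CameraImage cameraImage) {
  // The heavy pixel loop runs in a background isolate, so a ~1 s
  // conversion no longer blocks the UI thread.
  return compute(_convertFrame, cameraImage);
}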

@alexcohn
@ramsmart-inno if you are using OpenCV anyway, this image conversion does not cost you APK size

@luomo-pro
Thanks for your answers, I see many kinds of solutions.
I now want to convert yuv420 images to color; can you tell me which solution can do that?
I've seen many that only convert to black and white, and that's not what I want.
In addition, when the camera's imageFormatGroup parameter is set to jpeg, it is easy to get a color image, but I found that it makes the preview lag, and the experience is very bad.
So I may only be able to convert yuv420.
Thank you very much!

@krzaklus
krzaklus commented Feb 22, 2022

After a few days of struggling with the CameraImage to Image conversion, I managed to improve the method to handle padding on different devices. I tested it on several devices, checking the conversion at different camera resolutions, and I think it works.

imglib.Image convertYUV420ToImage(CameraImage cameraImage) {
    final imageWidth = cameraImage.width;
    final imageHeight = cameraImage.height;

    final yBuffer = cameraImage.planes[0].bytes;
    final uBuffer = cameraImage.planes[1].bytes;
    final vBuffer = cameraImage.planes[2].bytes;

    final int yRowStride = cameraImage.planes[0].bytesPerRow;
    final int yPixelStride = cameraImage.planes[0].bytesPerPixel!;

    final int uvRowStride = cameraImage.planes[1].bytesPerRow;
    final int uvPixelStride = cameraImage.planes[1].bytesPerPixel!;

    final image = imglib.Image(imageWidth, imageHeight);

    for (int h = 0; h < imageHeight; h++) {
      int uvh = (h / 2).floor();

      for (int w = 0; w < imageWidth; w++) {
        int uvw = (w / 2).floor();

        final yIndex = (h * yRowStride) + (w * yPixelStride);

        // Y plane should have positive values belonging to [0...255]
        final int y = yBuffer[yIndex];

        // U/V values are subsampled, i.e. each pixel in the U/V channel of a
        // YUV_420 image acts as the chroma value for 4 neighbouring pixels
        final int uvIndex = (uvh * uvRowStride) + (uvw * uvPixelStride);

        // U/V values ideally fall under [-0.5, 0.5] range. To fit them into
        // [0, 255] range they are scaled up and centered to 128.
        // Operation below brings U/V values to [-128, 127].
        final int u = uBuffer[uvIndex];
        final int v = vBuffer[uvIndex];

        // Compute RGB values per formula above.
        int r = (y + v * 1436 / 1024 - 179).round();
        int g = (y - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91).round();
        int b = (y + u * 1814 / 1024 - 227).round();

        r = r.clamp(0, 255);
        g = g.clamp(0, 255);
        b = b.clamp(0, 255);

        // Use 255 for the alpha value, no transparency. The packed value is
        // laid out ABGR across the bytes of a single 4-byte integer:
        // [AAAAAAAABBBBBBBBGGGGGGGGRRRRRRRR]
        final int argbIndex = h * imageWidth + w;

        image.data[argbIndex] = 0xff000000 |
            ((b << 16) & 0xff0000) |
            ((g << 8) & 0xff00) |
            (r & 0xff);
      }
    }

    return image;
  }

@neoacevedo
This code uses some APIs that are deprecated as of Flutter 3.7.1:
imglib.Image _convertBGRA8888(CameraImage image) has to be rewritten, because imglib.Image.fromBytes no longer accepts positional parameters and the Format enum doesn't have a bgra value.

@Holofox
Holofox commented Feb 17, 2023

@neoacevedo,

imglib.Image _convertBGRA8888(CameraImage image) {
  return imglib.Image.fromBytes(
    image.width,
    image.height,
    image.planes[0].bytes,
    order: ChannelOrder.bgra,
  );
}

@rraayy
rraayy commented Mar 15, 2023

Hi, 'img.data[index]' seems to no longer be supported; how should the code be modified when using convertYUV420ToImage?

@juanlabrador
Hi, 'img.data[index]' seems to no longer be supported; how should the code be modified when using convertYUV420ToImage?

I have the same problem

@rraayy
rraayy commented Mar 17, 2023

Hi, 'img.data[index]' seems to no longer be supported; how should the code be modified when using convertYUV420ToImage?

I have the same problem

Hi juanlabrador

I found a way and it works well. Replace 'img.data[index]' with 'img.setPixelRgba(x, y, r, g, b, hexFF);'
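For illustration, the replacement inside the pixel loop of convertYUV420ToImage would look roughly like this (assuming image 4.x, where setPixelRgba(x, y, r, g, b, a) is available; w and h are the loop variables and 0xFF is full opacity):

// old (image 3.x): write the packed value into the raw buffer
// image.data[argbIndex] = 0xff000000 | ((b << 16) & 0xff0000) | ((g << 8) & 0xff00) | (r & 0xff);
// new (image 4.x): let the library set the pixel
image.setPixelRgba(w, h, r, g, b, 0xFF);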

@Serdar1112
Hi, 'img.data[index]' seems to no longer be supported; how should the code be modified when using convertYUV420ToImage?

I have the same problem

Hi juanlabrador

I found a way and it works well. Replace 'img.data[index]' with 'img.setPixelRgba(x, y, r, g, b, hexFF);'

And what should go into each value?

@federico-amura-kenility
@neoacevedo,

imglib.Image _convertBGRA8888(CameraImage image) {
  return imglib.Image.fromBytes(
    image.width,
    image.height,
    image.planes[0].bytes,
    order: ChannelOrder.bgra,
  );
}

The resulting image is all broken.

@krzaklus
krzaklus commented May 1, 2023

@neoacevedo I used this function.

  static imglib.Image convertBGRA8888ToImage(CameraImage cameraImage) {
    return imglib.Image.fromBytes(
      width: cameraImage.planes[0].width!,
      height: cameraImage.planes[0].height!,
      bytes: cameraImage.planes[0].bytes.buffer,
      order: imglib.ChannelOrder.bgra,
    );
  }

@federico-amura-kenility
federico-amura-kenility commented May 1, 2023

I used this function.

static imglib.Image convertBGRA8888ToImage(CameraImage cameraImage) {
  return imglib.Image.fromBytes(
    width: cameraImage.planes[0].width!,
    height: cameraImage.planes[0].height!,
    bytes: cameraImage.planes[0].bytes.buffer,
    order: imglib.ChannelOrder.bgra,
  );
}

It's working for iOS, but not on Android.
What imageFormatGroup did you use to create the CameraController?

@krzaklus
krzaklus commented May 1, 2023

@federico-amura-kenility I think on Android you should use convertYUV420ToImage. I modified my class after updating the Image lib; please try this code.

part of object_detection;

/// ImageUtils
class ImageUtils {
  ///
  /// Converts a [CameraImage] in YUV420 format to [image_lib.Image] in RGB format
  ///
  static imglib.Image convertCameraImage(CameraImage cameraImage) {
    if (cameraImage.format.group == ImageFormatGroup.yuv420) {
      return convertYUV420ToImage(cameraImage);
    } else if (cameraImage.format.group == ImageFormatGroup.bgra8888) {
      return convertBGRA8888ToImage(cameraImage);
    } else {
      throw Exception('Undefined image type.');
    }
  }

  ///
  /// Converts a [CameraImage] in BGRA8888 format to [image_lib.Image] in RGB format
  ///
  static imglib.Image convertBGRA8888ToImage(CameraImage cameraImage) {
    return imglib.Image.fromBytes(
      width: cameraImage.planes[0].width!,
      height: cameraImage.planes[0].height!,
      bytes: cameraImage.planes[0].bytes.buffer,
      order: imglib.ChannelOrder.bgra,
    );
  }

  ///
  /// Converts a [CameraImage] in YUV420 format to [image_lib.Image] in RGB format
  ///
  static imglib.Image convertYUV420ToImage(CameraImage cameraImage) {
    final imageWidth = cameraImage.width;
    final imageHeight = cameraImage.height;

    final yBuffer = cameraImage.planes[0].bytes;
    final uBuffer = cameraImage.planes[1].bytes;
    final vBuffer = cameraImage.planes[2].bytes;

    final int yRowStride = cameraImage.planes[0].bytesPerRow;
    final int yPixelStride = cameraImage.planes[0].bytesPerPixel!;

    final int uvRowStride = cameraImage.planes[1].bytesPerRow;
    final int uvPixelStride = cameraImage.planes[1].bytesPerPixel!;

    final image = imglib.Image(width: imageWidth, height: imageHeight);

    for (int h = 0; h < imageHeight; h++) {
      int uvh = (h / 2).floor();

      for (int w = 0; w < imageWidth; w++) {
        int uvw = (w / 2).floor();

        final yIndex = (h * yRowStride) + (w * yPixelStride);

        // Y plane should have positive values belonging to [0...255]
        final int y = yBuffer[yIndex];

        // U/V values are subsampled, i.e. each pixel in the U/V channel of a
        // YUV_420 image acts as the chroma value for 4 neighbouring pixels
        final int uvIndex = (uvh * uvRowStride) + (uvw * uvPixelStride);

        // U/V values ideally fall under [-0.5, 0.5] range. To fit them into
        // [0, 255] range they are scaled up and centered to 128.
        // Operation below brings U/V values to [-128, 127].
        final int u = uBuffer[uvIndex];
        final int v = vBuffer[uvIndex];

        // Compute RGB values per formula above.
        int r = (y + v * 1436 / 1024 - 179).round();
        int g = (y - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91).round();
        int b = (y + u * 1814 / 1024 - 227).round();

        r = r.clamp(0, 255);
        g = g.clamp(0, 255);
        b = b.clamp(0, 255);

        image.setPixelRgb(w, h, r, g, b);
      }
    }

    return image;
  }
}

@saad-palapa
saad-palapa commented May 21, 2023

After some trial and error, I found the perfect solution for iOS:

const IOS_BYTES_OFFSET = 28;

static Image _convertBGRA8888ToImage(CameraImage cameraImage) {
  final plane = cameraImage.planes[0];

  return Image.fromBytes(
    width: cameraImage.width,
    height: cameraImage.height,
    bytes: plane.bytes.buffer,
    rowStride: plane.bytesPerRow,
    bytesOffset: IOS_BYTES_OFFSET,
    order: ChannelOrder.bgra,
  );
}

The other solution produced a 1088-wide image with an 8 px black bar. By adding rowStride and bytesOffset, it is now 1080 wide with no black bars.

I have no idea where the offset of 28 comes from. Does anyone know why 28 works?

@jmealo
jmealo commented May 22, 2023 via email

@alexcohn
@saad-palapa Does anyone know why 28 works?

Now that we know the answer, the explanation is rather easy. 28 bytes is 8 pixels of BGRA. The image in memory is 1088 pixels wide with a black bar before the first column (the illustration keeps the 8 "black" extra pixels, but does not keep the real dimensions):

XXXXXXXX......................
XXXXXXXX.......... _ .........
XXXXXXXX........ _( )_ .......
XXXXXXXX....... (_(%)_) ......
XXXXXXXX......... (_)\ .......
XXXXXXXX............. | __ ...
XXXXXXXX............. |/_/ ...
XXXXXXXX............. | ......
XXXXXXXX............. | ......
XXXXXXXX......................
XXXXXXXX......................

By adding the offset, you feed to Image.fromBytes() something like

......................XXXXXXXX
.......... _ .........XXXXXXXX
........ _( )_ .......XXXXXXXX
....... (_(%)_) ......XXXXXXXX
......... (_)\ .......XXXXXXXX
............. | __ ...XXXXXXXX
............. |/_/ ...XXXXXXXX
............. | ......XXXXXXXX
............. | ......XXXXXXXX
......................XXXXXXXX
......................

The function is smart enough to throw away the extra pixels on the right when the width parameter is 1080 and rowStride is 1088.

@rraayy
rraayy commented Jun 1, 2023

Hello, I've run into another problem. Has anyone tried to convert NV21 from CameraImage to Image?
Some devices (Xiaomi, Motorola) produce the NV21 format. Can anyone help?

@alexcohn
alexcohn commented Jun 3, 2023

@rraayy you can see how this can be efficiently done with ffi on Medium. The sample code is available at https://github.com/Hugand/camera_tutorial

@rraayy
rraayy commented Jun 4, 2023

@rraayy you can see how this can be efficiently done with ffi on Medium. The sample code is available at https://github.com/Hugand/camera_tutorial

Thanks so much @alexcohn, I will try it with ffi!

@DmitrySikorsky
@federico-amura-kenility I think on Android you should use convertYUV420ToImage. I modified my class after updating the Image lib; please try this code.

This works on Android, thank you very much.

@owjoh
owjoh commented Jun 16, 2023

@rraayy Any luck converting NV21 to Image? I'm having the same difficulties. The NV21 formatted CameraImage object only has one plane.

@alexcohn
@owjoh here @camsim99 claims that the issue with NV21 for Motorola and Xiaomi has been fixed in a recent CameraX implementation.

Alternatively, the convertImage function could be tuned to handle single-plane images, but I don't know their actual layout.
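For what it's worth, a rough sketch under the assumption that the single plane follows the standard NV21 layout (a full-resolution Y plane followed by interleaved V/U samples at half resolution), using the image 4.x API like the snippets above. Untested, and it ignores any row padding (bytesPerRow):

imglib.Image convertNV21ToImage(CameraImage cameraImage) {
  final width = cameraImage.width;
  final height = cameraImage.height;
  // Assumption: plane 0 contains the whole NV21 buffer (Y, then interleaved V/U).
  final bytes = cameraImage.planes[0].bytes;
  final image = imglib.Image(width: width, height: height);

  for (int h = 0; h < height; h++) {
    for (int w = 0; w < width; w++) {
      final int yIndex = h * width + w;
      // One V/U pair per 2x2 block, stored after the Y plane (V first in NV21).
      final int uvIndex = width * height + (h ~/ 2) * width + (w ~/ 2) * 2;

      final int y = bytes[yIndex];
      final int v = bytes[uvIndex];
      final int u = bytes[uvIndex + 1];

      // Same YUV -> RGB approximation used elsewhere in this thread.
      final int r = (y + v * 1436 / 1024 - 179).round().clamp(0, 255).toInt();
      final int g = (y - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91)
          .round()
          .clamp(0, 255)
          .toInt();
      final int b = (y + u * 1814 / 1024 - 227).round().clamp(0, 255).toInt();

      image.setPixelRgb(w, h, r, g, b);
    }
  }
  return image;
}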

@prkhrv
prkhrv commented Aug 30, 2023

Can anyone help me with yuv420 to RGB conversion on iOS?

@prkhrv
prkhrv commented Aug 30, 2023

After a few days of struggling with the CameraImage to Image conversion, I managed to improve the method to handle padding on different devices. I tested it on several devices, checking the conversion at different camera resolutions, and I think it works.

imglib.Image convertYUV420ToImage(CameraImage cameraImage) {
    final imageWidth = cameraImage.width;
    final imageHeight = cameraImage.height;

    final yBuffer = cameraImage.planes[0].bytes;
    final uBuffer = cameraImage.planes[1].bytes;
    final vBuffer = cameraImage.planes[2].bytes;

    final int yRowStride = cameraImage.planes[0].bytesPerRow;
    final int yPixelStride = cameraImage.planes[0].bytesPerPixel!;

    final int uvRowStride = cameraImage.planes[1].bytesPerRow;
    final int uvPixelStride = cameraImage.planes[1].bytesPerPixel!;

    final image = imglib.Image(imageWidth, imageHeight);

    for (int h = 0; h < imageHeight; h++) {
      int uvh = (h / 2).floor();

      for (int w = 0; w < imageWidth; w++) {
        int uvw = (w / 2).floor();

        final yIndex = (h * yRowStride) + (w * yPixelStride);

        // Y plane should have positive values belonging to [0...255]
        final int y = yBuffer[yIndex];

        // U/V values are subsampled, i.e. each pixel in the U/V channel of a
        // YUV_420 image acts as the chroma value for 4 neighbouring pixels
        final int uvIndex = (uvh * uvRowStride) + (uvw * uvPixelStride);

        // U/V values ideally fall under [-0.5, 0.5] range. To fit them into
        // [0, 255] range they are scaled up and centered to 128.
        // Operation below brings U/V values to [-128, 127].
        final int u = uBuffer[uvIndex];
        final int v = vBuffer[uvIndex];

        // Compute RGB values per formula above.
        int r = (y + v * 1436 / 1024 - 179).round();
        int g = (y - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91).round();
        int b = (y + u * 1814 / 1024 - 227).round();

        r = r.clamp(0, 255);
        g = g.clamp(0, 255);
        b = b.clamp(0, 255);

        // Use 255 for the alpha value, no transparency. The packed value is
        // laid out ABGR across the bytes of a single 4-byte integer:
        // [AAAAAAAABBBBBBBBGGGGGGGGRRRRRRRR]
        final int argbIndex = h * imageWidth + w;

        image.data[argbIndex] = 0xff000000 |
            ((b << 16) & 0xff0000) |
            ((g << 8) & 0xff00) |
            (r & 0xff);
      }
    }

    return image;
  }

Does this work for iOS?

@rraayy
rraayy commented Aug 31, 2023

Hi @prkhrv,
Please take a look at nv21 to Image.
In my project, that solution works for me.

@min23asdw
min23asdw commented Feb 24, 2024

I fixed it by using image.setPixelRgb; it produces a valid image:
imglib.Image convertYUV420ToImage(CameraImage cameraImage) {
final imageWidth = cameraImage.width;
final imageHeight = cameraImage.height;

final yBuffer = cameraImage.planes[0].bytes;
final uBuffer = cameraImage.planes[1].bytes;
final vBuffer = cameraImage.planes[2].bytes;

final int yRowStride = cameraImage.planes[0].bytesPerRow;
final int yPixelStride = cameraImage.planes[0].bytesPerPixel!;

final int uvRowStride = cameraImage.planes[1].bytesPerRow;
final int uvPixelStride = cameraImage.planes[1].bytesPerPixel!;

final image = imglib.Image(width: imageWidth, height: imageHeight);

for (int h = 0; h < imageHeight; h++) {
  int uvh = (h / 2).floor();

  for (int w = 0; w < imageWidth; w++) {
    int uvw = (w / 2).floor();

    final yIndex = (h * yRowStride) + (w * yPixelStride);

    // Y plane should have positive values belonging to [0...255]
    final int y = yBuffer[yIndex];

    // U/V values are subsampled, i.e. each pixel in the U/V channel of a
    // YUV_420 image acts as the chroma value for 4 neighbouring pixels
    final int uvIndex = (uvh * uvRowStride) + (uvw * uvPixelStride);

    // U/V values ideally fall under [-0.5, 0.5] range. To fit them into
    // [0, 255] range they are scaled up and centered to 128.
    // Operation below brings U/V values to [-128, 127].
    final int u = uBuffer[uvIndex];
    final int v = vBuffer[uvIndex];

    // Compute RGB values per formula above.
    int r = (y + v * 1436 / 1024 - 179).round();
    int g = (y - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91).round();
    int b = (y + u * 1814 / 1024 - 227).round();

    r = r.clamp(0, 255);
    g = g.clamp(0, 255);
    b = b.clamp(0, 255);

    // Use 255 for alpha value, no transparency.
    image.setPixelRgb(w, h, r, g, b);
  }
}

return image;

}
"
but it still rotates the image by 270 degrees.
You can fix it by rotating the result, but I don't know how to do that in Dart; I just rotate it in my Python backend!
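For reference, the rotation can probably also be done on the Dart side with the image package's copyRotate (image 4.x); the 90-degree angle here is only an assumption, the correct value depends on the device's sensor orientation:

// Rotate the converted frame back to the expected orientation.
final imglib.Image upright = imglib.copyRotate(image, angle: 90);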

@KevinCCucumber
imglib.Image _convertYUV420(CameraImage image) {
  var img = imglib.Image(image.width, image.height); // Create Image buffer

  final int width = image.width;
  final int height = image.height;
  final int uvRowStride = image.planes[1].bytesPerRow;
  final int uvPixelStride = image.planes[1].bytesPerPixel;
  const shift = (0xFF << 24);

  for(int x=0; x < width; x++) {
    for(int y=0; y < height; y++) {
      final int uvIndex = uvPixelStride * (x/2).floor() + uvRowStride*(y/2).floor();
      final int index = y * width + x;

      final yp = image.planes[0].bytes[index];
      final up = image.planes[1].bytes[uvIndex];
      final vp = image.planes[2].bytes[uvIndex];
      // Calculate pixel color
      int r = (yp + vp * 1436 / 1024 - 179).round().clamp(0, 255);
      int g = (yp - up * 46549 / 131072 + 44 -vp * 93604 / 131072 + 91).round().clamp(0, 255);
      int b = (yp + up * 1814 / 1024 - 227).round().clamp(0, 255);     
      // color: 0x FF  FF  FF  FF 
      //           A   B   G   R
      img.data[index] = shift | (b << 16) | (g << 8) | r;
    }
  }

  return img;
}

@sikandernoori this way it can be converted into a color image, but the conversion takes more than 1000 ms on a phone with a Snapdragon 870 CPU, and it blocks the UI.

This works with 2 planes like the iPad camera does. But for me, "final int uvPixelStride = image.planes[1].bytesPerPixel;" is always null, so I cannot use this code. Any idea what I can change if that is null?
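Not an answer from the thread, but for reference: with ImageFormatGroup.yuv420, iOS typically delivers a biplanar (NV12-style) buffer where plane 0 is Y and plane 1 holds interleaved U/V pairs, so a sketch along these lines might work when there is no third plane and bytesPerPixel is null. This is an untested assumption:

// Sketch for a two-plane (biplanar, NV12-style) CameraImage: plane 0 = Y,
// plane 1 = interleaved U,V. Assumes a U/V pixel stride of 2 when
// bytesPerPixel is null.
imglib.Image convertBiPlanarYUVToImage(CameraImage cameraImage) {
  final width = cameraImage.width;
  final height = cameraImage.height;

  final yPlane = cameraImage.planes[0];
  final uvPlane = cameraImage.planes[1];
  final int yRowStride = yPlane.bytesPerRow;
  final int uvRowStride = uvPlane.bytesPerRow;
  final int uvPixelStride = uvPlane.bytesPerPixel ?? 2;

  final image = imglib.Image(width: width, height: height);

  for (int h = 0; h < height; h++) {
    for (int w = 0; w < width; w++) {
      final int y = yPlane.bytes[h * yRowStride + w];
      final int uvIndex = (h ~/ 2) * uvRowStride + (w ~/ 2) * uvPixelStride;
      final int u = uvPlane.bytes[uvIndex];     // Cb
      final int v = uvPlane.bytes[uvIndex + 1]; // Cr

      final int r = (y + v * 1436 / 1024 - 179).round().clamp(0, 255).toInt();
      final int g = (y - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91)
          .round()
          .clamp(0, 255)
          .toInt();
      final int b = (y + u * 1814 / 1024 - 227).round().clamp(0, 255).toInt();

      image.setPixelRgb(w, h, r, g, b);
    }
  }
  return image;
}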

@alexcohn
@KevinCCucumber what device are you working with?

@KevinCCucumber
@KevinCCucumber what device are you working with?

@alexcohn I am using an iPad Air 5th gen.
