Reading the pixel data for my camera frame produces a Uint8Array with only a single byte per pixel.
width=640, height=480, pixelFormat=rgb, orientation=landscape-right, bytesPerRow=2560
const buffer = frame.toArrayBuffer();
const data = new Uint8Array(buffer);
data.length is 307200, which is exactly 640 * 480, indicating only 1 byte per pixel (grayscale?). But if the pixelFormat were actually RGBA, it would be 640 * 480 * 4.
The bytesPerRow seems to be correct, bytesPerRow (2560) === width * 4, but it doesn't match the array length. Am I doing something wrong? I'd like to grab pixels from the edges of the screen, but the indices keep going out of bounds.
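The mismatch can be checked with plain arithmetic. The helpers below are a hypothetical sketch (not part of VisionCamera's API): one derives the apparent bytes-per-pixel from the actual buffer size, and one computes a pixel's byte offset using bytesPerRow as the row stride, which is why indexing near the right/bottom edges overflows this buffer:

```typescript
// Hypothetical helper: derive the apparent bytes-per-pixel from the
// actual buffer size reported at runtime.
function apparentBytesPerPixel(byteLength: number, width: number, height: number): number {
  return byteLength / (width * height);
}

// Byte offset of pixel (x, y). bytesPerRow may include row padding,
// so offsets should never be computed as (y * width + x) * bytesPerPixel.
function pixelOffset(x: number, y: number, bytesPerRow: number, bytesPerPixel: number): number {
  return y * bytesPerRow + x * bytesPerPixel;
}

// Values reported in the issue: 640x480, pixelFormat=rgb, bytesPerRow=2560,
// buffer length 307200.
const width = 640, height = 480, bytesPerRow = 2560, byteLength = 307200;

console.log(apparentBytesPerPixel(byteLength, width, height)); // 1 byte per pixel
console.log(bytesPerRow / width); // 4 — the metadata claims 4 bytes per pixel
// At 4 bytes/pixel, the bottom-right pixel would start at this offset:
console.log(pixelOffset(width - 1, height - 1, bytesPerRow, 4)); // 1228796 — far past 307200
```

So the metadata (bytesPerRow) describes a 4-byte-per-pixel layout, while the buffer that actually arrives is exactly one byte per pixel; any edge-pixel index computed from the metadata lands out of bounds.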
@mrousavy just an update: I got my hands on a newer Android device (Galaxy S21) and rebuilt for it, but the same issue occurs as on the S9.
const frameProcessor = useFrameProcessor(frame => {
  "worklet";
  try {
    runAtTargetFps(3, () => {
      const { width, height, orientation, bytesPerRow, pixelFormat } = frame;
      const buffer = frame.toArrayBuffer();
      const data = new Uint8Array(buffer);
      const bytesPerPixel = 4;
      console.log(`Frame: ${frame.width} ${frame.height}`); // 1280 720
      console.log(`Data length: ${data.length}`); // 921600
      console.log(`Expected length: ${width * height * bytesPerPixel}`); // 2764800
      if (data.length !== width * height * bytesPerPixel) {
        console.error(
          "Data length does not match expected size. Check the frame format and conversion method."
        );
      }
    });
  } catch (err) {
    console.log("frameProcessor err", err, (err as Error).message);
  }
}, []);
The data.length is always frame.width * frame.height, which doesn't make sense. Is the frame compressed in some way? Am I doing something wrong?
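A buffer whose length equals width * height is exactly the size of a single 8-bit plane, which would be consistent with receiving only the luminance (Y) plane of a YUV frame instead of interleaved RGB/RGBA data. That is a guess, not confirmed behavior; a small diagnostic like the hypothetical one below can at least classify what layout actually arrived:

```typescript
// Hypothetical diagnostic: classify a frame buffer by its size alone.
// The candidate layouts are assumptions about what the native side may return.
function classifyBuffer(byteLength: number, width: number, height: number): string {
  const pixels = width * height;
  if (byteLength === pixels * 4) return "interleaved 4 bytes/pixel (e.g. RGBA)";
  if (byteLength === pixels * 3) return "interleaved 3 bytes/pixel (e.g. packed RGB)";
  if (byteLength === Math.floor(pixels * 1.5)) return "planar YUV 4:2:0 (all planes)";
  if (byteLength === pixels) return "single 8-bit plane (e.g. Y/luminance only)";
  return `unrecognized layout (${byteLength} bytes for ${pixels} pixels)`;
}

// Values logged above: 1280x720 frame, 921600-byte buffer.
console.log(classifyBuffer(921600, 1280, 720)); // single 8-bit plane (e.g. Y/luminance only)
```

Running this against the logged values classifies the buffer as a lone 8-bit plane, matching the data.length === width * height observation on both the S9 and S21.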
Relevant log output
Camera Device
Device
Galaxy S9
VisionCamera Version
4.3.2 and 4.4.1
Can you reproduce this issue in the VisionCamera Example app?
I didn't try (⚠️ your issue might get ignored & closed if you don't try this)
Additional information