CameraX ML Kit gives the error "java.lang.IllegalStateException: Image is already closed"

Problem description

I want to build a real-time image classifier with Google ML Kit and the CameraX API, using the CameraX Preview and Analysis use cases. It gives the following error:

    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err: Caused by: java.lang.IllegalStateException: Image is already closed
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at android.media.Image.throwISEIfImageIsInvalid(Image.java:68)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at android.media.ImageReader$SurfaceImage$SurfacePlane.getBuffer(ImageReader.java:832)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.ImageConvertUtils.zza(com.google.mlkit:vision-common@@16.0.0:139)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.ImageConvertUtils.convertToUpRightBitmap(com.google.mlkit:vision-common@@16.0.0:89)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.ImageConvertUtils.getUpRightBitmap(com.google.mlkit:vision-common@@16.0.0:10)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.label.automl.internal.zzo.zza(com.google.mlkit:image-labeling-automl@@16.0.0:16)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.label.automl.internal.zzo.run(com.google.mlkit:image-labeling-automl@@16.0.0:60)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.MobileVisionBase.zza(com.google.mlkit:vision-common@@16.0.0:23)
    2020-07-27 01:17:18.146 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.zzb.call(com.google.mlkit:vision-common@@16.0.0)
    2020-07-27 01:17:18.146 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.common.sdkinternal.ModelResource.zza(com.google.mlkit:common@@16.0.0:26)
    2020-07-27 01:17:18.146 11009-11009/com.example.camerax_automl W/System.err:    ... 9 more

Here I use a TextureView for the preview and a TextView to show the classification result. I have also placed the .tflite model in the assets folder and added the required dependencies. My code is as follows -

public class MainActivity extends AppCompatActivity {

    private int REQUEST_CODE_PERMISSIONS = 101;

    private final String[] REQUIRED_PERMISSIONS = new String[]{"android.permission.CAMERA"};

    TextureView textureView;
    ImageButton imgbutton;
    //LinearLayout linear1;
    TextView text1;



    // AutoML objects

    AutoMLImageLabelerLocalModel localModel =
            new AutoMLImageLabelerLocalModel.Builder()
                    .setAssetFilePath("model/manifest.json")
                    // or .setAbsoluteFilePath(absolute file path to manifest file)
                    .build();


    AutoMLImageLabelerOptions autoMLImageLabelerOptions =
            new AutoMLImageLabelerOptions.Builder(localModel)
                    .setConfidenceThreshold(0.0f)  // Evaluate your model in the Firebase console
                    // to determine an appropriate value.
                    .build();
    ImageLabeler labeler = ImageLabeling.getClient(autoMLImageLabelerOptions);


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        textureView = findViewById(R.id.view_finder);
        imgbutton = findViewById(R.id.imgCapture);
        text1 = findViewById(R.id.textView2);

        if(allPermissionsGranted())
        {
            startCamera();

        }
        else
        {

            ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS);
        }
    }


    private void startCamera() {

        CameraX.unbindAll();

        Rational aspectRatio = new Rational (textureView.getWidth(), textureView.getHeight());
        Size screen = new Size(textureView.getWidth(), textureView.getHeight()); //size of the screen

        PreviewConfig pConfig = new PreviewConfig.Builder()
                .setTargetAspectRatio(aspectRatio)
                .setTargetResolution(screen)
                .build();

        Preview preview = new Preview(pConfig);

        preview.setOnPreviewOutputUpdateListener(new Preview.OnPreviewOutputUpdateListener() {
            @Override
            public void onUpdated(Preview.PreviewOutput output) {
                ViewGroup parent = (ViewGroup) textureView.getParent();
                parent.removeView(textureView);
                parent.addView(textureView, 0);

                textureView.setSurfaceTexture(output.getSurfaceTexture());
                updateTransform();

            }
        });

        ImageAnalysisConfig imconfig = new ImageAnalysisConfig.Builder().setTargetAspectRatio(aspectRatio)
                .setTargetResolution(screen)
                .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE).build();

        final ImageAnalysis analysis = new ImageAnalysis(imconfig);

        analysis.setAnalyzer(new ImageAnalysis.Analyzer() {
            @Override
            public void analyze(ImageProxy image, int rotationDegrees) {

                Image img = image.getImage();
                if (image.getImage() == null) {
                    Log.d("Null", "Image is Null");
                } else {
                    InputImage img1 = InputImage.fromMediaImage(img, rotationDegrees);

                    labeler.process(img1)
                            .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
                                @Override
                                public void onSuccess(List<ImageLabel> labels) {
                                    // Task completed successfully
                                    for (ImageLabel label : labels) {
                                        String text = label.getText();
                                        float confidence = label.getConfidence();
                                        int index = label.getIndex();

                                        text1.setText(text + " " + confidence);
                                    }
                                }
                            })
                            .addOnFailureListener(new OnFailureListener() {
                                @Override
                                public void onFailure(@NonNull Exception e) {
                                    // Task failed with an exception
                                    // ...
                                    e.printStackTrace();
                                }
                            });
                }

                image.close();
            }
        });

        CameraX.bindToLifecycle((LifecycleOwner)this,analysis, preview);


    }

    private void updateTransform() {

        Matrix mx = new Matrix();
        float w = textureView.getMeasuredWidth();
        float h = textureView.getMeasuredHeight();

        float cX = w / 2f;
        float cY = h / 2f;

        int rotationDgr;
        int rotation = (int)textureView.getRotation();

        switch(rotation){
            case Surface.ROTATION_0:
                rotationDgr = 0;
                break;
            case Surface.ROTATION_90:
                rotationDgr = 90;
                break;
            case Surface.ROTATION_180:
                rotationDgr = 180;
                break;
            case Surface.ROTATION_270:
                rotationDgr = 270;
                break;
            default:
                return;
        }

        mx.postRotate((float)rotationDgr, cX, cY);
        textureView.setTransform(mx);
    }

    private boolean allPermissionsGranted(){

        for(String permission:REQUIRED_PERMISSIONS) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }

        return true;
    }



}

What am I doing wrong here?

Tags: android, firebase-mlkit, android-camerax, google-mlkit

Solution


You must not close the image before it has been processed. Closing the image triggers the camera to deliver another image for your app to process.

Processing takes time; it is not immediate. labeler.process() returns a Task that completes asynchronously, so the image.close() call at the end of your analyze() runs before labeling has finished, and the labeler then tries to read from an already-closed image. Close the image only once the task completes:

labeler.process(img1)
        .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
            @Override
            public void onSuccess(List<ImageLabel> labels) {
                // Close the image
                image.close();
                ...
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                // Close the image
                image.close();
                ...
            }
        });
