Camera preview is fine but front camera produces very dark photos

Problem description

I built a custom camera using the Camera 1 API and for some reason it produces very dark pictures (only on the front camera; the back camera works perfectly fine). The camera preview shows the scene as it should, with the correct brightness - it's only when an image is captured and decoded into a bitmap that it looks really dark. I have been frantically googling for a while and have found this problem reported quite a few times, but can't find a working solution. The device I'm using is a Samsung J5.

CameraPreview:

class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {

  private static final String CAMERA = "CAMERA";

  private static Camera mCamera;

  private final CameraActivity cameraActivity;
  private final SurfaceHolder mHolder;

  public CameraPreview(Camera camera, CameraActivity cameraActivity) {
    super(cameraActivity);
    this.cameraActivity = cameraActivity;
    mCamera = camera;
    mHolder = getHolder();
    mHolder.addCallback(this);
    mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
  }

  public void setCameraDisplayOrientation(int cameraId) {
    Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
    Camera.getCameraInfo(cameraId, info);
    final int rotation = cameraActivity.getWindowManager().getDefaultDisplay().getRotation();
    int degrees = 0;

    switch (rotation) {
      case Surface.ROTATION_0:
        degrees = 0;
        break;
      case Surface.ROTATION_90:
        degrees = 90;
        break;
      case Surface.ROTATION_180:
        degrees = 180;
        break;
      case Surface.ROTATION_270:
        degrees = 270;
        break;
    }

    int result;
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
      result = (info.orientation + degrees) % 360;
      result = (360 - result) % 360; // compensate for the front-camera mirror
    } else { // back-facing
      result = (info.orientation - degrees + 360) % 360;
    }

    mCamera.setDisplayOrientation(result);
  }

  @Override
  public void surfaceCreated(SurfaceHolder holder) {
    try {
      mCamera.setPreviewDisplay(holder);
      mCamera.startPreview();
      cameraActivity.isSafeToTakePicture(true);
      Camera.Parameters params = mCamera.getParameters();
      // my attempt at preventing darkness 
      params.setExposureCompensation(params.getMaxExposureCompensation());
      if(params.isAutoExposureLockSupported()) {
        params.setAutoExposureLock(false);
      }

      mCamera.setParameters(params);
    } catch (IOException e) {
      Log.d(CAMERA, "An error occurred when setting up the camera preview: " + e.getMessage());
    }
  }

  @Override
  public void surfaceDestroyed(SurfaceHolder holder) {
    mCamera.stopPreview();
  }

  @Override
  public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
  }
}

On my CameraPictureCallback (when an image is taken), I send the bytes to this method, which decodes the bytes into a bitmap, puts it in a bundle and passes it to the next fragment:

 public void openFragmentWithBitmap(byte[] bytes) {
    final BitmapFactory.Options scalingOptions = new BitmapFactory.Options();
    final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, scalingOptions);
    final Bundle bundle = new Bundle();
    bundle.putParcelable(SELFIE, bitmap);

    mCamera.stopPreview();
    final FragmentTransaction ft = getSupportFragmentManager().beginTransaction();
    final Fragment startChainFragment = new StartChainFragment();
    startChainFragment.setArguments(bundle);
    ft.setCustomAnimations(R.anim.slide_up, R.anim.slide_down)
      .replace(R.id.rlPlaceholder, startChainFragment, StartChainFragment.TAG)
      .addToBackStack(null)
      .commit();
  }
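
For completeness, the receiving fragment reads the bitmap back out of its arguments; a minimal sketch of that side (the layout and ImageView id here are placeholders, not from the question):

// Hypothetical sketch of StartChainFragment pulling the bitmap from its
// arguments; R.layout.fragment_start_chain and R.id.ivSelfie are made-up ids.
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
  final View view = inflater.inflate(R.layout.fragment_start_chain, container, false);
  final Bitmap selfie = getArguments().getParcelable(SELFIE);
  ((ImageView) view.findViewById(R.id.ivSelfie)).setImageBitmap(selfie);
  return view;
}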

Am I missing a trick here? In my surfaceCreated() I set the exposure compensation to the max but this doesn't seem to have an effect. Appreciate any help.

Edit: Turns out adding a delay didn't make a difference, so here are the results (camera preview vs actual image taken):

Camera preview: [screenshot]

Captured image: [screenshot]

The image is captured by calling mCamera.takePicture(null, null, pictureCallback) when the capture button is clicked (the callback just transfers the bytes to the above method).
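
The callback itself is just a thin forwarder; a minimal sketch of what it looks like (the exact code isn't shown here):

// Sketch of the picture callback: it simply hands the JPEG bytes over to
// openFragmentWithBitmap() for decoding.
private final Camera.PictureCallback pictureCallback = new Camera.PictureCallback() {
  @Override
  public void onPictureTaken(byte[] data, Camera camera) {
    openFragmentWithBitmap(data);
  }
};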

Tags: java, android, camera, android-camera, android-hardware

Solution


After all the blood, sweat and tears, I found a solution. I noticed that the preview and the final picture didn't look like they were the same resolution (you can see that the bottle in the preview is wider than in the captured image), so I tried to make them the same, or as close as possible. I call this method after startPreview() in surfaceCreated():

private void loadCameraParameters() {
    final Camera.Parameters camParams = mCamera.getParameters();

    final Camera.Size previewSize = getOptimalPreviewSize(camParams.getSupportedPreviewSizes(),  screenWidth, screenHeight);
    camParams.setPreviewSize(previewSize.width, previewSize.height);

    final Camera.Size pictureSize = getOptimalPreviewSize(camParams.getSupportedPictureSizes(), screenWidth, screenHeight);
    camParams.setPictureSize(pictureSize.width, pictureSize.height);

    mCamera.setParameters(camParams);
}

Where getOptimalPreviewSize() is:

private Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int targetWidth, int targetHeight) {
    final double aspectTolerance = 0.05;
    final double targetRatio = (double) targetWidth/targetHeight;

    if (sizes == null) {
      return null;
    }

    Camera.Size optimalSize = null;
    double minDiff = Double.MAX_VALUE;

    for (Camera.Size size : sizes) {
      final double ratio = (double) size.width / size.height;
      if (Math.abs(ratio - targetRatio) > aspectTolerance) continue;
      if (Math.abs(size.height - targetHeight) < minDiff) {
        optimalSize = size;
        minDiff = Math.abs(size.height - targetHeight);
      }
    }

    if (optimalSize == null) {
      minDiff = Double.MAX_VALUE;
      for (Camera.Size size : sizes) {
        if (Math.abs(size.height - targetHeight) < minDiff) {
          optimalSize = size;
          minDiff = Math.abs(size.height - targetHeight);
        }
      }
    }

    return optimalSize;
}
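
A minimal sketch of how this might be wired into surfaceCreated(), assuming screenWidth and screenHeight are fields filled from the display metrics (the answer doesn't show where they come from):

// Sketch only: obtain the target dimensions from the display, start the
// preview, then apply matching preview and picture sizes.
@Override
public void surfaceCreated(SurfaceHolder holder) {
  try {
    final DisplayMetrics metrics = new DisplayMetrics();
    cameraActivity.getWindowManager().getDefaultDisplay().getMetrics(metrics);
    screenWidth = metrics.widthPixels;   // assumed int fields on this class
    screenHeight = metrics.heightPixels;

    mCamera.setPreviewDisplay(holder);
    mCamera.startPreview();
    loadCameraParameters();              // match preview and picture sizes
  } catch (IOException e) {
    Log.d(CAMERA, "An error occurred when setting up the camera preview: " + e.getMessage());
  }
}

Note that on some devices changing the preview size while the preview is running may require a stopPreview()/startPreview() cycle, so treat the ordering above as a sketch of what the answer describes rather than a guaranteed recipe.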

Now my pictures and the preview match perfectly on both of my devices, with the same brightness and resolution.

