#StackBounty: #android #kotlin #android-camera2 How to take a picture where all settings are set manually including the flash without m…

Bounty: 100

I used the latest Camera2Basic sample program as a source for my trials:

https://github.com/android/camera-samples.git

Basically, I configure the CaptureRequest before calling the capture() function in takePhoto(), like this:

    private fun prepareCaptureRequest(captureRequest: CaptureRequest.Builder) {
        // set all needed camera settings here
        captureRequest.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF)

        captureRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF)
        //captureRequest.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_CANCEL)
        //captureRequest.set(CaptureRequest.CONTROL_AWB_LOCK, true)
        captureRequest.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_OFF)
        captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)
        //captureRequest.set(CaptureRequest.CONTROL_AE_LOCK, true)
        //captureRequest.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_CANCEL)
        //captureRequest.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_FAST)

        // flash
        if (mState == CaptureState.PRECAPTURE) {
            //captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)
            captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF)
        }
        if (mState == CaptureState.TAKEPICTURE) {
            //captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE)
            //captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH)
            captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE)
        }

        val iso = 100
        captureRequest.set(CaptureRequest.SENSOR_SENSITIVITY, iso)

        val fractionOfASecond = 750L
        captureRequest.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 1_000_000_000L / fractionOfASecond)
        //val exposureTime = 133333L
        //captureRequest.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureTime)

        //val characteristics = cameraManager.getCameraCharacteristics(cameraId)
        //val configs: StreamConfigurationMap? = characteristics[CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP]
        //val frameDuration = 33333333L
        //captureRequest.set(CaptureRequest.SENSOR_FRAME_DURATION, frameDuration)

        val focusDistanceCm = 20.0f    // 20 cm
        captureRequest.set(CaptureRequest.LENS_FOCUS_DISTANCE, 100.0f / focusDistanceCm)

        //captureRequest.set(CaptureRequest.COLOR_CORRECTION_MODE, CameraMetadata.COLOR_CORRECTION_MODE_FAST)
        captureRequest.set(CaptureRequest.COLOR_CORRECTION_MODE, CaptureRequest.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX)
        val colorTemp = 8000.0f
        val rggb = colorTemperature(colorTemp)
        //captureRequest.set(CaptureRequest.COLOR_CORRECTION_TRANSFORM, colorTransform)
        captureRequest.set(CaptureRequest.COLOR_CORRECTION_GAINS, rggb)
    }
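The two unit conversions in this request are easy to get wrong: SENSOR_EXPOSURE_TIME is specified in nanoseconds, and LENS_FOCUS_DISTANCE in diopters (1/meters). A minimal, plain-Java sketch of the arithmetic used above:

```java
// Sketch of the unit conversions used in the request above. Assumptions:
// SENSOR_EXPOSURE_TIME takes nanoseconds, LENS_FOCUS_DISTANCE takes diopters.
class ManualExposureMath {
    // Exposure time in ns for a shutter speed of 1/fraction seconds.
    static long exposureTimeNs(long fractionOfASecond) {
        return 1_000_000_000L / fractionOfASecond;
    }

    // Focus distance in diopters for a subject distance given in centimeters:
    // 1 / (distanceCm / 100 m).
    static float focusDistanceDiopters(float distanceCm) {
        return 100.0f / distanceCm;
    }

    public static void main(String[] args) {
        System.out.println(exposureTimeNs(750));        // 1333333 (≈ 1/750 s)
        System.out.println(focusDistanceDiopters(20f)); // 5.0 (20 cm)
    }
}
```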

but the returned picture is never the one where the flash is at its brightest. This is on a Google Pixel 2 device.
Since I only take one picture, I am also not sure how to check CaptureResult states to find the correct frame, as there is only one result.
I have already looked at solutions to similar problems here, but they were either never really solved or took the picture during the capture preview, which I don’t want.
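One way to identify the frame where the flash actually fired, even with a single capture, is to inspect CaptureResult.FLASH_STATE in the capture callback. A plain-Java sketch of that check; the constants mirror the documented CameraMetadata values (FLASH_STATE_FIRED == 3, FLASH_STATE_PARTIAL == 4), and in a real app you would use the CameraMetadata constants directly:

```java
// Hypothetical sketch: decide whether a CaptureResult corresponds to a frame
// where the flash was actually firing. The constants mirror the documented
// android.hardware.camera2.CameraMetadata values; use those in a real app.
class FlashFrameCheck {
    static final int FLASH_STATE_FIRED = 3;
    static final int FLASH_STATE_PARTIAL = 4;

    // True if FLASH_STATE says the flash was firing for this frame
    // (not merely charging or ready).
    static boolean flashFiredForFrame(Integer flashState) {
        return flashState != null
                && (flashState == FLASH_STATE_FIRED || flashState == FLASH_STATE_PARTIAL);
    }
}
```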

Another strange observation is that on other devices the images are taken (though not always at the right moment), but the manual values I set are not reflected in the JPEG metadata of the image.

If needed, I can put my fork on GitHub.


Get this bounty!!!

#StackBounty: #android #android-camera2 #textureview Android Camera2 displays black and distorted JPEG image on TextureView?

Bounty: 150

I’m making a test app for a friend on the Samsung S20.

The Samsung S20 has a ToF (Time of Flight) camera facing the back.

I would like to display the ToF image preview and the regular camera preview side by side on TextureViews.

I’m able to read the ToF sensor and convert its raw output to a visual output using a color mask, displaying depth ranges visually (red farthest, oranges for nearer ranges, etc.); see the screenshot:

[Screenshot: ToF depth visualization]
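The color-mask idea (red for the farthest range, oranges for nearer ones) can be sketched as a simple depth-to-ARGB mapping; this is an illustrative stand-in, not the exact code behind the screenshot:

```java
// Illustrative sketch of mapping a depth sample to an ARGB color
// (red = farthest, oranges/yellows = nearer), as described above.
// Stand-in only; not the actual mask used for the screenshot.
class DepthColorMask {
    // depthMm: measured depth in millimeters; rangeMm: maximum displayed range.
    static int depthToArgb(int depthMm, int rangeMm) {
        if (depthMm <= 0 || depthMm > rangeMm) return 0xFF000000; // out of range -> black
        float t = depthMm / (float) rangeMm;   // 0 (near) .. 1 (far)
        int green = (int) (255 * (1.0f - t));  // fade green out with distance
        return 0xFF000000 | (255 << 16) | (green << 8); // red stays at full
    }
}
```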

Below is relevant code:

<?xml version="1.0" encoding="utf-8"?>
<androidx.coordinatorlayout.widget.CoordinatorLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">
    <com.google.android.material.appbar.AppBarLayout
        android:id="@+id/appBarLayout"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:theme="@style/AppTheme.AppBarOverlay">
        <androidx.appcompat.widget.Toolbar
            android:id="@+id/toolbar"
            android:layout_width="match_parent"
            android:layout_height="?attr/actionBarSize"
            android:background="?attr/colorPrimary"
            app:popupTheme="@style/AppTheme.PopupOverlay" />

        <androidx.constraintlayout.widget.ConstraintLayout
            android:layout_width="match_parent"
            android:layout_height="619dp"
            android:background="#FFFFFFFF">
            <TextureView
                android:id="@+id/regularBackCamera"
                android:layout_width="320dp"
                android:layout_height="240dp"
                android:layout_marginEnd="44dp"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                app:layout_constraintVertical_bias="0.899" />
            <TextView
                android:id="@+id/textView3"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="Raw ToF Data"
                android:textColor="@android:color/primary_text_light"
                app:layout_constraintEnd_toEndOf="@+id/rawData"
                app:layout_constraintStart_toStartOf="@+id/rawData"
                app:layout_constraintTop_toBottomOf="@+id/rawData" />
            <TextureView
                android:id="@+id/rawData"
                android:layout_width="320dp"
                android:layout_height="240dp"
                android:layout_marginStart="44dp"
                app:layout_constraintBottom_toTopOf="@+id/regularBackCamera"
                app:layout_constraintStart_toStartOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                app:layout_constraintVertical_bias="0.485" />
            <TextView
                android:id="@+id/textView5"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginStart="120dp"
                android:text="Back Camera"
                android:textColor="@android:color/primary_text_light"
                app:layout_constraintStart_toStartOf="@+id/regularBackCamera"
                app:layout_constraintTop_toBottomOf="@+id/regularBackCamera" />
        </androidx.constraintlayout.widget.ConstraintLayout>
    </com.google.android.material.appbar.AppBarLayout>
</androidx.coordinatorlayout.widget.CoordinatorLayout>

MainActivity class:

/*  This is an example of getting and processing ToF data
 */

public class MainActivity extends AppCompatActivity implements DepthFrameVisualizer, RegularCameraFrameVisualizer {
    private static final String TAG = MainActivity.class.getSimpleName();
    public static final int CAM_PERMISSIONS_REQUEST = 0;

    private TextureView rawDataView;
    private TextureView regularImageView;
    private Matrix ToFBitmapTransform;
    private Matrix regularBackCameraBitmapTransform;
    private BackToFCamera backToFCamera;
    private RegularBackCamera regularBackCamera;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        rawDataView = findViewById(R.id.rawData);
        regularImageView = findViewById(R.id.regularBackCamera);
        checkCamPermissions();

    }

    @Override
    protected void onPause() {
        super.onPause();

        if ( backToFCamera !=null)
        {
            backToFCamera.getCamera().close();
            backToFCamera = null;

        }
        if ( regularBackCamera!= null)
        {
            regularBackCamera.getCamera().close();
            regularBackCamera = null;
        }
    }

    @Override
    protected void onResume() {
        super.onResume();

        backToFCamera = new BackToFCamera(this, this);
        String tofCameraId = backToFCamera.openCam(null);

        regularBackCamera = new RegularBackCamera(this, this);
        //pass in tofCameraId to avoid opening again since both regular cam & ToF camera are back facing
        regularBackCamera.openCam(tofCameraId);

    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
    }

    private void checkCamPermissions() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, CAM_PERMISSIONS_REQUEST);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    }

    @Override
    public void onRawDataAvailable(Bitmap bitmap) {
        renderBitmapForToFToTextureView(bitmap, rawDataView);
    }

    @Override
    public void onRegularImageAvailable(Bitmap bitmap) {
        renderBitmapToTextureView( bitmap,regularImageView);
    }

    /* We don't want a direct camera preview, since we want to get the frames of data
       directly from the camera and process them.

       This takes a converted bitmap and renders it onto the surface, with a basic
       rotation applied.
     */
    private void renderBitmapForToFToTextureView(Bitmap bitmap, TextureView textureView) {

        if (bitmap!=null && textureView!=null) {
            Canvas canvas = textureView.lockCanvas();
            canvas.drawBitmap(bitmap, ToFBitmapTransform(textureView), null);
            textureView.unlockCanvasAndPost(canvas);
        }
    }

    private void renderBitmapToTextureView(Bitmap bitmap, TextureView textureView) {
        if (bitmap != null && textureView != null) {
            Canvas canvas = textureView.lockCanvas();
            if (canvas != null) {
                canvas.drawBitmap(bitmap, regularBackCamBitmapTransform(textureView), null);
                textureView.unlockCanvasAndPost(canvas);
            }
        }
    }

    private Matrix ToFBitmapTransform(TextureView view) {

        if (view!=null) {
            if (ToFBitmapTransform == null || view.getWidth() == 0 || view.getHeight() == 0) {
                int rotation = getWindowManager().getDefaultDisplay().getRotation();
                Matrix matrix = new Matrix();
                int centerX = view.getWidth() / 2;
                int centerY = view.getHeight() / 2;

                int bufferWidth = DepthFrameAvailableListener.SAMSUNG_S20_TOF_WIDTH;
                int bufferHeight = DepthFrameAvailableListener.SAMSUNG_S20_TOF_HEIGHT;

                RectF bufferRect = new RectF(0, 0, bufferWidth, bufferHeight);
                RectF viewRect = new RectF(0, 0, view.getWidth(), view.getHeight());
                matrix.setRectToRect(bufferRect, viewRect, Matrix.ScaleToFit.CENTER);

                Log.i(TAG, " rotation:" + rotation);
                if (Surface.ROTATION_90 == rotation) {

                    matrix.postRotate(270, centerX, centerY);
                } else if (Surface.ROTATION_270 == rotation) {

                    matrix.postRotate(90, centerX, centerY);
                } else if (Surface.ROTATION_180 == rotation) {

                    matrix.postRotate(180, centerX, centerY);
                } else {
                    //strange but works!
                    matrix.postRotate(90, centerX, centerY);
                }


                ToFBitmapTransform = matrix;
            }
        }
        return  ToFBitmapTransform;
    }

    private Matrix regularBackCamBitmapTransform(TextureView view) {
        if (view!=null) {
            if (regularBackCameraBitmapTransform == null || view.getWidth() == 0 || view.getHeight() == 0) {

                int rotation = getWindowManager().getDefaultDisplay().getRotation();
                Matrix matrix = new Matrix();
                RectF bufferRect = new RectF(0, 0, MAX_PREVIEW_WIDTH,MAX_PREVIEW_HEIGHT);
                RectF viewRect = new RectF(0, 0, view.getWidth(), view.getHeight());
                matrix.setRectToRect(bufferRect, viewRect, Matrix.ScaleToFit.CENTER);
                float centerX = viewRect.centerX();
                float centerY = viewRect.centerY();

                Log.i(TAG, " rotation:" + rotation);
                if (Surface.ROTATION_90 == rotation) {

                    matrix.postRotate(270, centerX, centerY);
                } else if (Surface.ROTATION_270 == rotation) {

                    matrix.postRotate(90, centerX, centerY);
                } else if (Surface.ROTATION_180 == rotation) {

                    matrix.postRotate(180, centerX, centerY);
                } else {
                    //strange but works!
                    matrix.postRotate(90, centerX, centerY);
                }

                regularBackCameraBitmapTransform = matrix;
            }
        }
        return regularBackCameraBitmapTransform;
    }
}
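The rotation branches in both transform methods reduce to a small mapping from the display rotation to a postRotate angle. Sketched as plain logic, using the documented Surface.ROTATION_* values (ROTATION_0..ROTATION_270 are 0..3):

```java
// Sketch of the rotation choice in the transforms above: map the display
// rotation (Surface.ROTATION_* values 0..3) to the postRotate angle used.
class RotationMath {
    // Surface.ROTATION_0=0, ROTATION_90=1, ROTATION_180=2, ROTATION_270=3.
    static int postRotateDegrees(int displayRotation) {
        switch (displayRotation) {
            case 1:  return 270; // ROTATION_90
            case 3:  return 90;  // ROTATION_270
            case 2:  return 180; // ROTATION_180
            default: return 90;  // ROTATION_0 -- the "strange but works!" case
        }
    }
}
```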

Listener that signals that a frame is available for display; look at the function publishOriginalBitmap():

import static com.example.opaltechaitestdepthmap.RegularBackCamera.MAX_PREVIEW_HEIGHT;
import static com.example.opaltechaitestdepthmap.RegularBackCamera.MAX_PREVIEW_WIDTH;

public class BackCameraFrameAvailableListener implements ImageReader.OnImageAvailableListener {
    private static final String TAG = BackCameraFrameAvailableListener.class.getSimpleName();
    private RegularCameraFrameVisualizer regularCameraFrameVisualizer;

    public BackCameraFrameAvailableListener(RegularCameraFrameVisualizer regularCameraFrameVisualizer) {
        this.regularCameraFrameVisualizer = regularCameraFrameVisualizer;
    }

    @Override
    public void onImageAvailable(ImageReader reader) {
        try {
            Image image = reader.acquireNextImage();
            if (image != null && image.getFormat() == ImageFormat.JPEG) {
                publishOriginalBitmap(image);
            }
        }
        catch (Exception e) {
            Log.e(TAG, "Failed to acquireNextImage: " + e.getMessage());
        }
    }

    private void publishOriginalBitmap(final Image image) {

        if (regularCameraFrameVisualizer != null) {
            new Thread() {
                public void run() {
                    Bitmap bitmap = returnBitmap(image);
                    if (bitmap != null) {
                        regularCameraFrameVisualizer.onRegularImageAvailable(bitmap);
                        bitmap.recycle();
                    }
                }
            }.start();
        }

    }

    private Bitmap returnBitmap(Image image) {
        Bitmap bitmap = null;
        // width=1920, height=1080
        int width = 1920;
        int height = 1080;
        if (image != null) {
            Log.i(TAG, "returnBitmap, CONSTANT MAX width:" + MAX_PREVIEW_WIDTH + ", MAX height:" + MAX_PREVIEW_HEIGHT);
            Log.i(TAG, "BEFORE returnBitmap, image.width:" + width + ", height:" + height);
            Image.Plane[] planes = image.getPlanes();
            if (planes != null && planes.length > 0) {
                ByteBuffer buffer = planes[0].getBuffer();
                image.close();
                Log.i(TAG, "buffer size:" + buffer.capacity());

                float currentBufferSize = buffer.capacity();
                float jpegReportedArea = width * height;
                if (currentBufferSize >= jpegReportedArea) {
                    Log.i(TAG, "currentBufferSize >= jpegReportedArea");
                    float quotient = jpegReportedArea / currentBufferSize;
                    width = (int) Math.ceil(width * quotient);
                    height = (int) Math.ceil(height * quotient);
                } else {
                    Log.i(TAG, "currentBufferSize < jpegReportedArea");
                    float quotient = currentBufferSize / jpegReportedArea;
                    width = (int) Math.ceil(width * quotient);
                    height = (int) Math.ceil(height * quotient);
                }

                Log.i(TAG, "AFTER width:" + width + ", height:" + height);
                //***here bitmap is black
                bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);

                buffer.rewind();
                if (bitmap != null) {
                    bitmap.copyPixelsFromBuffer(buffer);
                }
            }
        }

        return bitmap;
    }
} 

The interface used by the listener to signal that an image is ready:

package com.example.opaltechaitestdepthmap;

import android.graphics.Bitmap;

public interface RegularCameraFrameVisualizer {
    void onRegularImageAvailable(Bitmap bitmap);
}

Handles camera states:

public class RegularBackCamera extends CameraDevice.StateCallback {

    private static final String TAG = RegularBackCamera.class.getSimpleName();
    private static int FPS_MIN = 15;
    private static int FPS_MAX = 30;
    public static final int MAX_PREVIEW_WIDTH = 1920;
    public static final int MAX_PREVIEW_HEIGHT = 1080;
    private Context context;
    private CameraManager cameraManager;
    private ImageReader RawSensorPreviewReader;
    private CaptureRequest.Builder previewBuilder;
    private BackCameraFrameAvailableListener imageAvailableListener;
    private String cameraId;
    private CameraDevice camera;

    public RegularBackCamera(Context context, RegularCameraFrameVisualizer frameVisualizer) {
        this.context = context;
        cameraManager = (CameraManager)context.getSystemService(Context.CAMERA_SERVICE);
        imageAvailableListener = new BackCameraFrameAvailableListener(frameVisualizer);

    }

    // Open the back camera and start sending frames
    public String openCam(String idToExclude) {
        this.cameraId  = getBackCameraID(idToExclude);
        Size size = openCamera(this.cameraId);

        //Tried this, DID NOT WORK: Size smallerPreviewSize = chooseSmallerPreviewSize();

        RawSensorPreviewReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH,
                MAX_PREVIEW_HEIGHT, ImageFormat.JPEG, 2);
        Log.i(TAG, "ImageFormat.JPEG, width:" + size.getWidth() + ", height:" + size.getHeight());
        RawSensorPreviewReader.setOnImageAvailableListener(imageAvailableListener, null);

        return this.cameraId;
    }

    private String getBackCameraID(String idToExclude) {
        String cameraId = null;
        CameraManager cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {

            if (idToExclude!=null) {
                for (String camera : cameraManager.getCameraIdList()) {
                    //avoid getting same camera
                    if (!camera.equalsIgnoreCase(idToExclude)) {
                        //avoid return same camera twice as 1 sensor can only be accessed once
                        CameraCharacteristics chars = cameraManager.getCameraCharacteristics(camera);
                        final int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                        boolean facingBack = chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK;

                        if (facingBack) {
                            cameraId = camera;
                            // Note that the sensor size is much larger than the available capture size
                            SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                            Log.i(TAG, "Sensor size: " + sensorSize);

                            // Since sensor size doesn't actually match capture size and because it is
                            // reporting an extremely wide aspect ratio, this FoV is bogus
                            float[] focalLengths = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                            if (focalLengths.length > 0) {
                                float focalLength = focalLengths[0];
                                double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
                                Log.i(TAG, "Calculated FoV: " + fov);
                            }

                        }

                    }//end avoid getting same camera


                }//end for
            }
            else
            {
                for (String camera : cameraManager.getCameraIdList()) {

                    //avoid return same camera twice as 1 sensor can only be accessed once
                    CameraCharacteristics chars = cameraManager.getCameraCharacteristics(camera);
                    final int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                    boolean facingBack = chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK;

                    if (facingBack) {
                        cameraId = camera;
                        // Note that the sensor size is much larger than the available capture size
                        SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                        Log.i(TAG, "Sensor size: " + sensorSize);

                        // Since sensor size doesn't actually match capture size and because it is
                        // reporting an extremely wide aspect ratio, this FoV is bogus
                        float[] focalLengths = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                        if (focalLengths.length > 0) {
                            float focalLength = focalLengths[0];
                            double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
                            Log.i(TAG, "Calculated FoV: " + fov);
                        }

                    }
                }//end for
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return    cameraId ;
    }

    //opens camera based on ID & returns the optimal size, capped at the maximum size per the docs
    private Size openCamera(String cameraId) {
        Size size = null;
        try{
            int permission = ContextCompat.checkSelfPermission(context, Manifest.permission.CAMERA);
            if(PackageManager.PERMISSION_GRANTED == permission) {
                if ( cameraManager!=null) {
                    if (cameraId!=null) {
                        cameraManager.openCamera(cameraId, this, null);

                        CameraCharacteristics characteristics =
                                cameraManager.getCameraCharacteristics(cameraId);

                        StreamConfigurationMap map = characteristics.get(
                                CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

                        size = Collections.max(
                                Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                                new CompareSizeByArea());
                        if (size.getWidth() > MAX_PREVIEW_WIDTH ||  size.getHeight() > MAX_PREVIEW_HEIGHT)
                        {
                            size = new Size( MAX_PREVIEW_WIDTH ,MAX_PREVIEW_HEIGHT);

                        }

                        List<Size> sizes = Arrays.asList(map.getOutputSizes(ImageFormat.JPEG));
                        for (int i = 0; i < sizes.size(); i++) {
                            Log.i(RegularBackCamera.class.toString(), "JPEG sizes, width=" + sizes.get(i).getWidth() + ",height=" + sizes.get(i).getHeight());
                        }

                    }

                }
            }else{
                Log.e(TAG,"Permission not available to open camera");
            }
        }catch (CameraAccessException | IllegalStateException | SecurityException e){
            Log.e(TAG,"Opening Camera has an Exception " + e);
            e.printStackTrace();
        }
        return  size;
    }


    @Override
    public void onOpened(@NonNull CameraDevice camera) {
        try {
            this.camera = camera;
            previewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewBuilder.set(CaptureRequest.JPEG_ORIENTATION, 0);
            Range<Integer> fpsRange = new Range<>(FPS_MIN, FPS_MAX);
            previewBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
            previewBuilder.addTarget(RawSensorPreviewReader.getSurface());

            List<Surface> targetSurfaces = Arrays.asList(RawSensorPreviewReader.getSurface());

            camera.createCaptureSession(targetSurfaces,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            onCaptureSessionConfigured(session);
                        }
                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            Log.e(TAG,"!!! Creating Capture Session failed due to internal error ");
                        }
                    }, null);

        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void onCaptureSessionConfigured(@NonNull CameraCaptureSession session) {
        Log.i(TAG,"Capture Session created");
        previewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        try {
            session.setRepeatingRequest(previewBuilder.build(), null, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }


    @Override
    public void onDisconnected(@NonNull CameraDevice camera) {

        if (camera!=null)
        {
            camera.close();
            camera = null;
        }
    }

    @Override
    public void onError(@NonNull CameraDevice camera, int error) {
        if (camera!=null)
        {
            camera.close();
            Log.e(TAG,"onError,cameraID:"+camera.getId()+",error:"+error);
            camera = null;


        }
    }

    protected Size chooseSmallerPreviewSize()
    {
        CameraManager cm = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics cc = null;
        try {
            cc = cm.getCameraCharacteristics(this.cameraId);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        StreamConfigurationMap streamConfigs = cc.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        Size[] sizes = streamConfigs.getOutputSizes( ImageFormat.JPEG);
        Size smallerPreviewSize = chooseVideoSize( sizes);

        return smallerPreviewSize;
    }


    //References: https://stackoverflow.com/questions/46997776/camera2-api-error-failed-to-create-capture-session
    protected Size chooseVideoSize(Size[] choices) {
        List<Size> smallEnough = new ArrayList<>();

        for (Size size : choices) {
            if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
                smallEnough.add(size);
            }
        }
        if (smallEnough.size() > 0) {
            return Collections.max(smallEnough, new CompareSizeByArea());
        }

        return choices[choices.length - 1];
    }

    public CameraDevice getCamera() {
        return camera;
    }
}

Helper to sort preview sizes:

public class CompareSizeByArea implements Comparator<Size> {
    @Override
    public int compare(Size lhs, Size rhs) {
        return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
                (long) rhs.getWidth() * rhs.getHeight());
    }

}
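A quick usage sketch of this comparator with Collections.max; LocalSize is a hypothetical stand-in for android.util.Size so the example runs off-device, and the comparison logic matches CompareSizeByArea:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

// Usage sketch for the area comparator above. LocalSize stands in for
// android.util.Size so this runs off-device.
class ComparatorDemo {
    static class LocalSize {
        final int width, height;
        LocalSize(int width, int height) { this.width = width; this.height = height; }
    }

    // Same ordering as CompareSizeByArea: compare by pixel area, overflow-safe.
    static final Comparator<LocalSize> BY_AREA = (lhs, rhs) ->
            Long.signum((long) lhs.width * lhs.height - (long) rhs.width * rhs.height);

    public static void main(String[] args) {
        List<LocalSize> sizes = Arrays.asList(
                new LocalSize(1920, 1080),
                new LocalSize(4032, 3024),
                new LocalSize(640, 480));
        LocalSize max = Collections.max(sizes, BY_AREA); // picks 4032x3024
        System.out.println(max.width + "x" + max.height);
    }
}
```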

I included only the code for the regular camera, since that is the one not displaying; the code for obtaining the ToF camera and its listeners is exactly the same, apart from ToF-specific logic.

I’m not seeing any exceptions or errors in the app logs; however, the system log shows:

E/CHIUSECASE: [ERROR  ] chxusecase.cpp:967 ReturnFrameworkResult() ChiFrame: 0 App Frame: 0 - pResult contains more buffers (1) than the expected number of buffers (0) to return to the framework!
E/CamX: [ERROR][CORE   ] camxnode.cpp:4518 CSLFenceCallback() Node::FastAECRealtime_IFE0 : Type:65536 Fence 3 handler failed in node fence handler
E/CamX: [ERROR][SENSOR ] camxsensornode.cpp:9279 GetSensorMode() Sensor name: s5k2la
E/CamX: [ERROR][SENSOR ] camxsensornode.cpp:9302 GetSensorMode() W x H : 4032, 3024
E//vendor/bin/hw/vendor.samsung.hardware.camera.provider@3.0-service_64: vendor/qcom/proprietary/commonsys-intf/adsprpc/src/fastrpc_apps_user.c:750: Error 0xe08132b8: remote_handle_invoke failed
E/CamX: [ERROR][ISP    ] camxispiqmodule.h:1871 IsTuningModeDataChanged() Invalid pointer to current tuning mode parameters (0x0)
E/CamX: [ERROR][PPROC  ] camxipenode.cpp:9529 GetFaceROI() Face ROI is not published

1. How can I display the regular back-facing camera as a Bitmap on a TextureView, correctly?
2. How can I save that bitmap as a JPEG or PNG on the internal drive?

Thanks a million!


Get this bounty!!!

#StackBounty: #android #android-camera2 Android studio Camera 2 save video in full screen

Bounty: 50

I am using the Camera2 API to record videos and used this project as a reference. I managed to change my TextureView to full screen, but the video that I am saving is still not full screen. How can I make the saved video full screen as well?

[Screenshot]

[Screenshot]

You can see that when I play the video, the video dimensions are the same as in my preview. Please help: how can I save the video with the same perspective as my preview?


Get this bounty!!!

#StackBounty: #android #scaling #android-camera2 How to show a camera preview with the new android.hardware.camera2 API?

Bounty: 50

I want to replace the deprecated Camera API with the android.hardware.camera2 API. The example project on GitHub shows the camera picture but creates a black stripe at the edge of the screen if the device’s resolution doesn’t perfectly match the camera’s scaling options. I’ve tried it on an emulator (Android 26-29, Nexus 2), with the same result as on the Samsung devices SM-T395 and SM-T585. On FullHD it is quite strange.

Increasing the MAX_PREVIEW_HEIGHT and MAX_PREVIEW_WIDTH doesn’t help.
I think I need to take
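For reference, the black stripes come from an aspect-preserving "fit" scale: when the buffer and view aspect ratios differ, the preview is letterboxed. A "fill" (center-crop) scale removes the stripes at the cost of cropping. A minimal sketch of the two scale choices, assuming buffer and view dimensions in pixels:

```java
// Sketch: "fit" letterboxes (black stripes) when aspect ratios differ;
// "fill" (center-crop) removes the stripes but crops the image instead.
class PreviewScale {
    // Largest uniform scale at which the buffer still fits inside the view.
    static float fitScale(float bufW, float bufH, float viewW, float viewH) {
        return Math.min(viewW / bufW, viewH / bufH); // stripes possible
    }

    // Smallest uniform scale at which the buffer fully covers the view.
    static float fillScale(float bufW, float bufH, float viewW, float viewH) {
        return Math.max(viewW / bufW, viewH / bufH); // crops instead
    }
}
```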

Here some example picture of the emulator:

[Screenshots: custom resolution with quite a good fit; FullHD sample]

How can I show a preview of the camera in a custom-sized view?


Get this bounty!!!