Android allows devices to support concurrent streaming of camera devices, for example, to allow both the front and back cameras to operate at the same time. Starting in Android 11, the Camera2 API includes the following methods that apps can call to determine whether the cameras support concurrent streaming and which stream configurations are supported.
getConcurrentCameraIds
: Gets the set of combinations of currently connected camera device identifiers that support configuring camera device sessions concurrently.

isConcurrentSessionConfigurationSupported
: Checks whether the provided set of camera devices and their corresponding session configurations can be configured concurrently.
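The following sketch shows one way an app might call these methods. It assumes the caller already has a CameraManager, an Executor, and a preview Surface per camera; the class and helper names are illustrative and not part of the platform API.

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.OutputConfiguration;
import android.hardware.camera2.params.SessionConfiguration;
import android.view.Surface;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.Executor;

final class ConcurrentCameraQuery {
    // Returns the sets of camera IDs that the device reports as safe to stream
    // concurrently. An empty result means concurrent streaming isn't supported.
    static Set<Set<String>> concurrentCombinations(CameraManager manager)
            throws CameraAccessException {
        return manager.getConcurrentCameraIds();
    }

    // Checks whether one preview stream per camera can be configured concurrently.
    // surfacesPerCamera (camera ID to preview Surface) and executor are assumed
    // to be supplied by the caller.
    static boolean canStreamConcurrently(CameraManager manager,
            Map<String, Surface> surfacesPerCamera, Executor executor)
            throws CameraAccessException {
        Map<String, SessionConfiguration> configs = new HashMap<>();
        for (Map.Entry<String, Surface> entry : surfacesPerCamera.entrySet()) {
            List<OutputConfiguration> outputs = new ArrayList<>();
            outputs.add(new OutputConfiguration(entry.getValue()));
            configs.put(entry.getKey(), new SessionConfiguration(
                    SessionConfiguration.SESSION_REGULAR, outputs, executor,
                    new CameraCaptureSession.StateCallback() {
                        @Override public void onConfigured(CameraCaptureSession session) {}
                        @Override public void onConfigureFailed(CameraCaptureSession session) {}
                    }));
        }
        return manager.isConcurrentSessionConfigurationSupported(configs);
    }
}
```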
The set of mandatory stream combinations that must be supported during concurrent streaming is exposed through a camera device's camera characteristics in the SCALER_MANDATORY_CONCURRENT_STREAM_COMBINATIONS property.
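As a sketch of how an app might inspect this property through the Camera2 API (the helper class and logging are illustrative):

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.MandatoryStreamCombination;
import android.util.Log;

final class MandatoryConcurrentStreams {
    private static final String TAG = "ConcurrentStreams";

    // Logs the concurrent stream combinations that this camera must support.
    static void logCombinations(CameraManager manager, String cameraId)
            throws CameraAccessException {
        CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
        MandatoryStreamCombination[] combinations = chars.get(
                CameraCharacteristics.SCALER_MANDATORY_CONCURRENT_STREAM_COMBINATIONS);
        if (combinations == null) {
            return; // The camera doesn't advertise concurrent stream combinations.
        }
        for (MandatoryStreamCombination combination : combinations) {
            Log.d(TAG, "Mandatory concurrent combination: " + combination.getDescription());
        }
    }
}
```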
Each camera device advertised through getConcurrentCameraIds() must support the following guaranteed configurations for concurrent streams.
| Target 1 type | Target 1 max size | Target 2 type | Target 2 max size | Sample use cases |
|---|---|---|---|---|
| YUV | s1440p | | | In-app video or image processing |
| PRIV | s1440p | | | In-app viewfinder analysis |
| JPEG | s1440p | | | No-viewfinder still image capture |
| YUV / PRIV | s720p | JPEG | s1440p | Standard still imaging |
| YUV / PRIV | s720p | YUV / PRIV | s1440p | In-app video or processing with preview |
Devices with the MONOCHROME capability (CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES includes CameraMetadata#REQUEST_AVAILABLE_CAPABILITIES_MONOCHROME) that support Y8 must support substituting YUV streams with Y8 in all guaranteed stream combinations.
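A minimal sketch of how an app might detect this case, assuming the Y8 substitution applies when the camera reports the MONOCHROME capability and exposes Y8 output sizes; the helper name is hypothetical:

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.params.StreamConfigurationMap;

final class MonochromeY8Check {
    // Returns true if the camera is MONOCHROME and exposes Y8 output sizes, in
    // which case Y8 may be substituted for YUV in the guaranteed combinations.
    static boolean supportsY8Substitution(CameraCharacteristics chars) {
        int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
        boolean monochrome = false;
        if (capabilities != null) {
            for (int capability : capabilities) {
                if (capability == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_MONOCHROME) {
                    monochrome = true;
                    break;
                }
            }
        }
        if (!monochrome) {
            return false;
        }
        StreamConfigurationMap map =
                chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        return map != null && map.getOutputSizes(ImageFormat.Y8) != null;
    }
}
```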
s720p refers to 720p (1280 x 720) or the maximum supported resolution for the particular format returned by StreamConfigurationMap.getOutputSizes().

s1440p refers to 1440p (1920 x 1440) or the maximum supported resolution for the particular format returned by StreamConfigurationMap.getOutputSizes().
Devices whose capabilities don't include ANDROID_REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE must support at least a single Y16 stream (Dataspace::DEPTH) with sVGA resolution during concurrent operation, where sVGA is the smaller of the following two resolutions:
- maximum output resolution for the given format
- 640 x 480
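A small sketch of how an app might compute these bounded sizes (s720p, s1440p, or sVGA) against the format's supported output sizes. Comparing sizes by pixel count is an assumption here, and the helper name is hypothetical.

```java
import android.hardware.camera2.params.StreamConfigurationMap;
import android.util.Size;

final class BoundedSizes {
    // Returns the nominal size (1280x720 for s720p, 1920x1440 for s1440p,
    // 640x480 for sVGA) or the largest supported output size for the format,
    // whichever is smaller by pixel count.
    static Size boundedSize(StreamConfigurationMap map, int format, Size nominal) {
        Size[] supported = map.getOutputSizes(format);
        if (supported == null || supported.length == 0) {
            return nominal;
        }
        Size max = supported[0];
        for (Size size : supported) {
            if (pixels(size) > pixels(max)) {
                max = size;
            }
        }
        return pixels(max) < pixels(nominal) ? max : nominal;
    }

    private static long pixels(Size size) {
        return (long) size.getWidth() * size.getHeight();
    }
}
```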
Implementation
To allow apps to query a device to determine whether its cameras support concurrent streaming, implement the ICameraProvider@2.6 HAL interface, which includes the following methods:
- getConcurrentStreamingCameraIds
- isConcurrentStreamCombinationSupported
For a reference implementation of the ICameraProvider@2.6
HAL interface, see
the emulated camera HAL library at
EmulatedCameraProviderHWLImpl.cpp
.
Validation
To test that your implementation of this feature works as intended, use the
ConcurrentCameraTest.java
CTS test. Also, test using an app that opens multiple cameras and operates them concurrently.
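For the manual part of this validation, a test app might look like the following sketch. It assumes the CAMERA permission is granted and a Handler is available; the class name and empty callback bodies are placeholders, and a real app would configure and stream a session on each device in onOpened().

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

import java.util.Set;

final class ConcurrentOpenTest {
    // Opens every camera in the first advertised concurrent combination.
    static void openFirstCombination(CameraManager manager, Handler handler)
            throws CameraAccessException {
        Set<Set<String>> combinations = manager.getConcurrentCameraIds();
        if (combinations.isEmpty()) {
            return; // The device doesn't advertise concurrent streaming support.
        }
        for (String cameraId : combinations.iterator().next()) {
            manager.openCamera(cameraId, new CameraDevice.StateCallback() {
                @Override public void onOpened(CameraDevice device) { /* configure a session */ }
                @Override public void onDisconnected(CameraDevice device) { device.close(); }
                @Override public void onError(CameraDevice device, int error) { device.close(); }
            }, handler);
        }
    }
}
```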
Resource allocation problems
If camera HALs advertise support for concurrent operation of camera devices, they might run into resource allocation problems. This is especially likely when the phone has enough image signal processor (ISP) resources to stream the front and back (or other) cameras concurrently, but not at their full capacity. In this case, the camera HAL must allocate the limited hardware resources to each camera device.
Example scenario
The following scenario demonstrates this problem.
Problem
The device has the following configuration:
- Camera ID 0 is a logical camera backed by a wide and an ultrawide camera, each of which takes one ISP resource.
- Camera ID 1 is a camera that takes one ISP resource.
The device (phone) has two ISPs. If camera ID 0 is opened and a session is configured, it's possible that the camera HAL reserves both ISPs, anticipating both ultrawide and wide camera use. If that's the case, the front camera (ID 1) can't configure any streams because both ISPs are in use.
Solution
To address this problem, the framework can open both camera IDs 0 and 1 before configuring sessions to provide a hint to the camera HAL about how to allocate resources (because it now expects concurrent operation of cameras). However, this can lead to limited capabilities, for example, zoom might not be able to handle the full zoom ratio range (because switching physical camera IDs might be problematic).

To implement this solution, make the following updates to provider@2.6::ICameraProvider::getConcurrentStreamingCameraIds.
- Mandate that, for concurrent operation of cameras, the camera framework must open camera devices (@3.2::ICameraDevice::open) before configuring any sessions on the camera devices. This allows camera providers to allocate resources accordingly.
- To address the issue of not being able to handle the full zoom ratio range, ensure that camera apps, when using cameras concurrently, are guaranteed to use the ZOOM_RATIO control setting only between 1x and MAX_DIGITAL_ZOOM instead of the complete ZOOM_RATIO_RANGE (this prevents the switching of physical cameras internally, which potentially requires more ISPs).
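As a rough illustration of the second point from the app's perspective, a capture request builder could clamp its zoom as in the following sketch while cameras run concurrently. The helper is hypothetical and illustrates the constraint; it is not the framework's implementation of it.

```java
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CaptureRequest;

final class ConcurrentZoom {
    // Clamps the requested zoom to [1.0, MAX_DIGITAL_ZOOM] instead of the full
    // CONTROL_ZOOM_RATIO_RANGE, which avoids internal physical-camera switches
    // that can require extra ISP resources.
    static void applyConcurrentZoom(CaptureRequest.Builder builder,
            CameraCharacteristics chars, float requestedZoom) {
        Float maxDigitalZoom =
                chars.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
        float upperBound = (maxDigitalZoom != null) ? maxDigitalZoom : 1.0f;
        float clamped = Math.max(1.0f, Math.min(requestedZoom, upperBound));
        builder.set(CaptureRequest.CONTROL_ZOOM_RATIO, clamped);
    }
}
```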
Problem with testDualCameraPreview
When you make the updates above, they can create a problem with a behavior allowed by the MultiViewTest.java#testDualCameraPreview test.
The testDualCameraPreview test doesn't open all cameras before configuring sessions. Instead, it opens and configures each camera in turn:
for each camera in cameraDevices:
    device = openCamera(camera)
    createCaptureSession(device)
It does, however, tolerate camera open failures with ERROR_MAX_CAMERAS_IN_USE. Third-party apps might depend on this behavior.
Because the camera HAL doesn't know the complete set of camera IDs being opened for concurrent operation before sessions are configured, it can be hard for the HAL to allocate hardware resources (assuming there is some competition for them).
To address this problem while maintaining backward compatibility in addition to supporting concurrent streaming, camera HALs should fail openCamera calls with ERROR_MAX_CAMERAS_IN_USE if they can't support the full stream configuration for all cameras running concurrently.
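On the app side, such a failure surfaces through CameraDevice.StateCallback.onError(). The following sketch shows a callback that tolerates it, as the testDualCameraPreview sequence does; the class name and comments are illustrative.

```java
import android.hardware.camera2.CameraDevice;

final class TolerantOpenCallback extends CameraDevice.StateCallback {
    @Override
    public void onOpened(CameraDevice device) {
        // Continue with createCaptureSession(...) as in the sequence above.
    }

    @Override
    public void onDisconnected(CameraDevice device) {
        device.close();
    }

    @Override
    public void onError(CameraDevice device, int error) {
        device.close();
        if (error == ERROR_MAX_CAMERAS_IN_USE) {
            // Expected when the HAL can't allocate resources for another concurrently
            // open camera; skip this camera instead of failing the whole preview.
        }
    }
}
```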