📸 Embedding a camera experience within your own app shouldn't be that hard.
A Flutter plugin that brings an awesome camera experience to Android and iOS.
This package provides you with a fully customizable camera experience that you can use within your app.
Use our awesome built-in interface or customize it as you want.
If you are migrating from version 1.x.x to 2.x.x, please read the migration guide.
Here are all the native features that CamerAwesome provides to the Flutter side.
Features | Android | iOS |
---|---|---|
Ask permissions | ✅ | ✅ |
Record video | ✅ | ✅ |
Multi camera (🚧 BETA) | ✅ | ✅ |
Enable/disable audio | ✅ | ✅ |
Take photos | ✅ | ✅ |
Photo live filters | ✅ | ✅ |
Exposure level | ✅ | ✅ |
Broadcast live image stream | ✅ | ✅ |
Image analysis (barcode scan & more) | ✅ | ✅ |
Zoom | ✅ | ✅ |
Device flash support | ✅ | ✅ |
Auto focus | ✅ | ✅ |
Live switching camera | ✅ | ✅ |
Camera rotation stream | ✅ | ✅ |
Background auto stop | ✅ | ✅ |
Sensor type switching | ⚙️ | ✅ |
Enable/disable front camera mirroring | ✅ | ✅ |
Add the dependency to your `pubspec.yaml`:

```yaml
dependencies:
  camerawesome: ^2.0.0-dev.1
```
Add these entries in `ios/Runner/Info.plist`:

```xml
<key>NSCameraUsageDescription</key>
<string>Your own description</string>
<key>NSMicrophoneUsageDescription</key>
<string>To enable microphone access when recording video</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>To enable GPS location access for Exif data</string>
```
Change the minimum SDK version to 21 (or higher) in `android/app/build.gradle`:

```gradle
minSdkVersion 21
```
In order to take pictures or record videos, you may need additional permissions depending on the Android version and on where you want to save the files. Read more about it in the official documentation. Note that `WRITE_EXTERNAL_STORAGE` is no longer included in the plugin starting with version 1.4.0.
If you want to record videos with audio, add this permission to your `AndroidManifest.xml`:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.yourpackage">
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <!-- Other declarations -->
</manifest>
```
You may also want to save the location of your pictures in their Exif metadata. In that case, add the permissions below:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.yourpackage">
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
  <uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
  <!-- Other declarations -->
</manifest>
```
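Once these permissions are declared, GPS tagging is turned on from Dart through `ExifPreferences`. A minimal sketch (the `saveConfig` usage matches the full example later in this README):

```dart
// Sketch: enable GPS coordinates in the Exif metadata of captured pictures.
CameraAwesomeBuilder.awesome(
  saveConfig: SaveConfig.photo(
    // Writes the device's location into each picture's Exif data,
    // provided the location permissions above were granted.
    exifPreferences: ExifPreferences(saveGPSLocation: true),
  ),
),
```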
⚠️ Overriding Android dependencies

Some of the dependencies used by CamerAwesome can be overridden if you have a conflict. Set these variables in your `android/build.gradle` to define which versions you want to use:
```gradle
buildscript {
  ext.kotlin_version = '1.7.10'
  ext {
    // You can override these variables
    compileSdkVersion = 33
    minSdkVersion = 24 // 21 minimum
    playServicesLocationVersion = "20.0.0"
    exifInterfaceVersion = "1.3.4"
  }
  // ...
}
```
Only change these variables if you know what you are doing. For example, overriding the Play Services Location version can help when another plugin pulls in a conflicting version, which surfaces as an error like this:

```
java.lang.IncompatibleClassChangeError: Found interface com.google.android.gms.location.ActivityRecognitionClient, but class was expected
```
```dart
import 'package:camerawesome/camerawesome_plugin.dart';
```

Just use our builder. That's all you need to create a complete camera experience within your app.

```dart
CameraAwesomeBuilder.awesome(
  saveConfig: SaveConfig.photoAndVideo(),
  onMediaTap: (mediaCapture) {
    OpenFile.open(mediaCapture.filePath);
  },
),
```
This builder can be customized with various settings. Check the full documentation to learn more.
If the `awesome()` factory is not enough, you can use `custom()` instead. It provides a `builder` property that lets you create your own camera experience. The camera preview will be visible behind whatever you provide to the builder.
```dart
CameraAwesomeBuilder.custom(
  saveConfig: SaveConfig.photo(),
  builder: (state, previewSize, previewRect) {
    // create your interface here
  },
)
```

See more in the documentation.
Here is the definition of our builder method:

```dart
typedef CameraLayoutBuilder = Widget Function(CameraState cameraState, PreviewSize previewSize, Rect previewRect);
```

How do the camera states work?
Using the state you can do anything you need without having to think about the camera flow:

- On app start, the camera is in `PreparingCameraState`.
- Then, depending on the initial capture mode, it becomes a `PhotoCameraState` or a `VideoCameraState`.
- Starting a video recording pushes a `VideoRecordingCameraState`.
- Stopping the recording pushes back a `VideoCameraState`.
```dart
state.when(
  onPhotoMode: (photoState) => photoState.start(),
  onVideoMode: (videoState) => videoState.start(),
  onVideoRecordingMode: (videoState) => videoState.pause(),
);
```
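As a concrete illustration, a single capture button could react to whichever state is active. This is a sketch based on the state API described above (`takePhoto`, `startRecording`, and `stopRecording` are the methods exposed by the corresponding states in recent plugin versions; verify the names against your version):

```dart
// Sketch: one button, different behavior depending on the camera state.
void onCaptureButtonPressed(CameraState state) {
  state.when(
    // In photo mode, take a picture
    onPhotoMode: (photoState) => photoState.takePhoto(),
    // In video mode, start a recording
    onVideoMode: (videoState) => videoState.startRecording(),
    // While recording, stop it
    onVideoRecordingMode: (recordingState) => recordingState.stopRecording(),
  );
}
```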
See more in the documentation.
Use image analysis to achieve features such as barcode scanning, face detection, or your own realtime processing. You can check examples using MLKit inside the `example` directory. The example above comes from `ai_analysis_faces.dart`; it detects faces and draws their contours. It's also possible to use MLKit to read barcodes: check `ai_analysis_barcode.dart` and `preview_overlay_example.dart` for examples, or the documentation.
```dart
CameraAwesomeBuilder.awesome(
  saveConfig: SaveConfig.photo(),
  onImageForAnalysis: analyzeImage,
  imageAnalysisConfig: AnalysisConfig(
    // Android specific options
    androidOptions: const AndroidAnalysisOptions.nv21(
      // Target width (CameraX will choose the closest resolution to this width)
      width: 250,
    ),
    // Whether to start the analysis automatically (true by default)
    autoStart: true,
    // Max frames per second, null for no limit (default)
    maxFramesPerSecond: 20,
  ),
)
```
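The `analyzeImage` callback referenced above receives each frame as an `AnalysisImage`. A minimal sketch of what it could look like, with the actual processing left as a placeholder (the `when` helper with per-format cases follows the pattern used in the plugin's examples; treat the exact member names as assumptions for your version):

```dart
// Sketch: handle frames from onImageForAnalysis. The processing itself
// (MLKit, your own model, etc.) is intentionally omitted.
Future<void> analyzeImage(AnalysisImage img) async {
  img.when(
    nv21: (image) {
      // On Android, frames arrive in NV21 format
      debugPrint('NV21 frame received: ${image.width}x${image.height}');
    },
    bgra8888: (image) {
      // On iOS, frames arrive in BGRA8888 format
      debugPrint('BGRA8888 frame received: ${image.width}x${image.height}');
    },
  );
}
```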
MLKit recommends using the nv21 format on Android; bgra8888 is the iOS format. For machine learning you don't need full-resolution images (720px wide or lower should be enough and makes computation easier). Learn more about the image analysis configuration in the documentation. Also check the detailed explanations on how to use MLKit to read barcodes and detect faces.
⚠️ On Android, some devices don't support video recording and image analysis at the same time. You can check this capability with `CameraCharacteristics.isVideoRecordingAndImageAnalysisSupported(Sensors.back)`.

Through the state you can access a `SensorConfig` class.
Function | Comment |
---|---|
setZoom | Change the zoom level |
setFlashMode | Change the flash mode between NONE, ON, AUTO, ALWAYS |
setBrightness | Change the brightness level manually (usually better to leave this on auto) |
setMirrorFrontCamera | Set mirroring for the front camera |
All of these configurations are listenable through a stream so your UI can automatically get updated according to the actual configuration.
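For instance, a custom UI could drive these setters and listen for changes. This sketch assumes a `sensorConfig` getter on `CameraState` and a `flashMode$` stream, following the plugin's convention of exposing listenable configuration streams; confirm the names against your version:

```dart
// Sketch: tweak the sensor configuration and keep the UI in sync.
// flashMode$ is an assumed stream name based on the plugin's conventions.
void tuneSensor(CameraState state) {
  final config = state.sensorConfig;
  config.setZoom(0.5); // between 0.0 (no zoom) and 1.0 (max zoom)
  config.setFlashMode(FlashMode.auto);

  // React to configuration changes, e.g. to update a flash icon
  config.flashMode$.listen((mode) {
    debugPrint('Flash mode is now $mode');
  });
}
```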
Apply live filters to your pictures using the built-in interface:
You can also choose to use a specific filter from the start:
```dart
CameraAwesomeBuilder.awesome(
  // other params
  filter: AwesomeFilter.AddictiveRed,
  availableFilters: ...
)
```
Or set the filter programmatically:
```dart
CameraAwesomeBuilder.custom(
  builder: (cameraState, previewSize, previewRect) {
    return cameraState.when(
      onPreparingCamera: (state) =>
          const Center(child: CircularProgressIndicator()),
      onPhotoMode: (state) => TakePhotoUI(state, onFilterTap: () {
        state.setFilter(AwesomeFilter.Sierra);
      }),
      onVideoMode: (state) => RecordVideoUI(state, recording: false),
      onVideoRecordingMode: (state) => RecordVideoUI(state, recording: true),
    );
  },
)
```
See all available filters in the documentation.
> [!TIP]
> By default, the awesome UI setup shows a filter list, but you can pass an empty list to remove it.
🚧 Feature in beta 🚧 Any feedback is welcome!

In order to use CamerAwesome with multiple cameras simultaneously, you need to define a `SensorConfig` that uses several sensors. You can use the `SensorConfig.multiple()` constructor for this:
```dart
CameraAwesomeBuilder.awesome(
  sensorConfig: SensorConfig.multiple(
    sensors: [
      Sensor.position(SensorPosition.back),
      Sensor.position(SensorPosition.front),
    ],
    flashMode: FlashMode.auto,
    aspectRatio: CameraAspectRatios.ratio_16_9,
  ),
  // Other params
)
```
This feature is not supported by all devices and even when it is, there are limitations that you must be aware of.
Check the details in the dedicated documentation.
Run this command with Flutter:

```shell
$ flutter pub add camerawesome
```

This will add a line like this to your package's `pubspec.yaml` (and run an implicit `flutter pub get`):

```yaml
dependencies:
  camerawesome: ^2.0.0+1
```

Alternatively, your editor might support `flutter pub get`. Check the docs for your editor to learn more.
Now in your Dart code, you can use:

```dart
import 'package:camerawesome/camerawesome_plugin.dart';
import 'package:camerawesome/pigeon.dart';
```
```dart
import 'dart:io';

// import 'package:better_open_file/better_open_file.dart';
import 'package:camerawesome/camerawesome_plugin.dart';
import 'package:camerawesome/pigeon.dart';
import 'package:flutter/material.dart';
import 'package:path_provider/path_provider.dart';

import 'utils/file_utils.dart';

void main() {
  runApp(const CameraAwesomeApp());
}

class CameraAwesomeApp extends StatelessWidget {
  const CameraAwesomeApp({super.key});

  @override
  Widget build(BuildContext context) {
    return const MaterialApp(
      title: 'camerAwesome',
      home: CameraPage(),
    );
  }
}

class CameraPage extends StatelessWidget {
  const CameraPage({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Container(
        color: Colors.white,
        child: CameraAwesomeBuilder.awesome(
          saveConfig: SaveConfig.photoAndVideo(
            initialCaptureMode: CaptureMode.photo,
            mirrorFrontCamera: true,
            photoPathBuilder: (sensors) async {
              final Directory extDir = await getTemporaryDirectory();
              final testDir = await Directory(
                '${extDir.path}/camerawesome',
              ).create(recursive: true);
              if (sensors.length == 1) {
                final String filePath =
                    '${testDir.path}/${DateTime.now().millisecondsSinceEpoch}.jpg';
                return SingleCaptureRequest(filePath, sensors.first);
              } else {
                // Separate pictures taken with front and back camera
                return MultipleCaptureRequest(
                  {
                    for (final sensor in sensors)
                      sensor:
                          '${testDir.path}/${sensor.position == SensorPosition.front ? 'front_' : 'back_'}${DateTime.now().millisecondsSinceEpoch}.jpg',
                  },
                );
              }
            },
            videoOptions: VideoOptions(
              enableAudio: true,
              ios: CupertinoVideoOptions(
                fps: 10,
              ),
              android: AndroidVideoOptions(
                bitrate: 6000000,
                fallbackStrategy: QualityFallbackStrategy.lower,
              ),
            ),
            exifPreferences: ExifPreferences(saveGPSLocation: true),
          ),
          sensorConfig: SensorConfig.single(
            sensor: Sensor.position(SensorPosition.back),
            flashMode: FlashMode.auto,
            aspectRatio: CameraAspectRatios.ratio_4_3,
            zoom: 0.0,
          ),
          enablePhysicalButton: true,
          // filter: AwesomeFilter.AddictiveRed,
          previewFit: CameraPreviewFit.contain,
          onMediaTap: (mediaCapture) {
            mediaCapture.captureRequest.when(
              single: (single) {
                debugPrint('single: ${single.file?.path}');
                single.file?.open();
              },
              multiple: (multiple) {
                multiple.fileBySensor.forEach((key, value) {
                  debugPrint('multiple file taken: $key ${value?.path}');
                  value?.open();
                });
              },
            );
          },
          availableFilters: awesomePresetFiltersList,
        ),
      ),
    );
  }
}
```
Download details:

- Author: apparence.io
- Source: https://github.com/Apparence-io/camera_awesome