A Flutter plugin providing a Signature Pad for drawing smooth signatures. The library is written in pure Dart/Flutter, so it supports all platforms.
An easy-to-use library with a variety of drawing and export settings. It also supports SVG files.
Signature pad drawing is based on cubic Bézier curves.
Offers a choice between performance and beauty modes.
Usage
import 'package:hand_signature/signature.dart';
With HandSignatureControl and HandSignature it is possible to tweak drawing aspects such as stroke width, smoothing ratio, or velocity weight.
final control = HandSignatureControl(
threshold: 3.0,
smoothRatio: 0.65,
velocityRange: 2.0,
);
final widget = HandSignature(
control: control,
color: Colors.blueGrey,
strokeWidth: 1.0,
maxStrokeWidth: 10.0,
type: SignatureDrawType.shape,
);
HandSignatureControl handles the math behind the signature: it processes input touches and manages the control points of the signature curve.
HandSignature sets up the visual style of the signature curve.
Export
Properties such as canvas size, stroke min/max width, and color can be modified during export.
There are multiple ways and formats to export a signature; the most commonly used are SVG and PNG.
final control = HandSignatureControl();
final svg = control.toSvg();
final png = control.toImage();
final json = control.toMap();
control.importData(json);
Svg: SignatureDrawType.shape generates a reasonably small file that is read well by all programs. On the other hand, arc generates a really big SVG file, and some programs can struggle to handle that many objects. line is a simple Bézier curve.
Image: Export to image supports ImageByteFormat and provides PNG or raw RGBA data.
Json/Map: Exports the current state as raw data that can be used later to restore the signature.
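As a sketch of how these exports can be combined, the snippet below uses the same export parameters that appear in the full example later in this article (color, background, type, fit); treat it as illustrative rather than an exhaustive API reference.

```dart
import 'package:flutter/material.dart';
import 'package:hand_signature/signature.dart';

Future<void> exportAndRestore(HandSignatureControl control) async {
  // SVG export, with styling applied at export time.
  final svg = control.toSvg(
    color: Colors.black,
    type: SignatureDrawType.shape,
    fit: true,
  );

  // PNG export (ByteData) with a background color.
  final png = await control.toImage(
    color: Colors.black,
    background: Colors.white,
  );

  // Raw state round trip: persist the Map (e.g. json-encode it),
  // then restore it into a fresh control later.
  final state = control.toMap();
  final restored = HandSignatureControl()..importData(state);
}
```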
Parsing and drawing a saved SVG
The exported SVG string can be displayed with another library, such as flutter_svg.
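A minimal sketch of rendering an exported string with flutter_svg (assuming svgData holds a non-null exported signature):

```dart
import 'package:flutter/widgets.dart';
import 'package:flutter_svg/flutter_svg.dart';

// Renders a previously exported SVG signature string.
Widget buildSignaturePreview(String svgData) {
  return SvgPicture.string(svgData);
}
```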
Run this command:
With Flutter:
$ flutter pub add hand_signature
This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
hand_signature: ^3.0.1
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:hand_signature/signature.dart';
import 'dart:typed_data';
import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';
import 'package:flutter_svg/flutter_svg.dart';
import 'package:hand_signature/signature.dart';
import 'scroll_test.dart';
void main() => runApp(MyApp());
HandSignatureControl control = HandSignatureControl(
threshold: 0.01,
smoothRatio: 0.65,
velocityRange: 2.0,
);
ValueNotifier<String?> svg = ValueNotifier<String?>(null);
ValueNotifier<ByteData?> rawImage = ValueNotifier<ByteData?>(null);
ValueNotifier<ByteData?> rawImageFit = ValueNotifier<ByteData?>(null);
class MyApp extends StatelessWidget {
bool get scrollTest => false;
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'Signature Demo',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: Scaffold(
backgroundColor: Colors.orange,
body: scrollTest
? ScrollTest()
: SafeArea(
child: Stack(
children: <Widget>[
Column(
children: <Widget>[
Expanded(
child: Center(
child: AspectRatio(
aspectRatio: 2.0,
child: Stack(
children: <Widget>[
Container(
constraints: BoxConstraints.expand(),
color: Colors.white,
child: HandSignature(
control: control,
type: SignatureDrawType.shape,
),
),
CustomPaint(
painter: DebugSignaturePainterCP(
control: control,
cp: false,
cpStart: false,
cpEnd: false,
),
),
],
),
),
),
),
Row(
children: <Widget>[
CupertinoButton(
onPressed: () {
control.clear();
svg.value = null;
rawImage.value = null;
rawImageFit.value = null;
},
child: Text('clear'),
),
CupertinoButton(
onPressed: () async {
svg.value = control.toSvg(
color: Colors.blueGrey,
type: SignatureDrawType.shape,
fit: true,
);
rawImage.value = await control.toImage(
color: Colors.blueAccent,
background: Colors.greenAccent,
fit: false,
);
rawImageFit.value = await control.toImage(
color: Colors.black,
background: Colors.greenAccent,
fit: true,
);
},
child: Text('export'),
),
],
),
SizedBox(
height: 16.0,
),
],
),
Align(
alignment: Alignment.bottomRight,
child: Column(
mainAxisSize: MainAxisSize.min,
children: <Widget>[
_buildImageView(),
_buildScaledImageView(),
_buildSvgView(),
],
),
),
],
),
),
),
);
}
Widget _buildImageView() => Container(
width: 192.0,
height: 96.0,
decoration: BoxDecoration(
border: Border.all(),
color: Colors.white30,
),
child: ValueListenableBuilder<ByteData?>(
valueListenable: rawImage,
builder: (context, data, child) {
if (data == null) {
return Container(
color: Colors.red,
child: Center(
child: Text('not signed yet (png)\nscaleToFill: false'),
),
);
} else {
return Padding(
padding: EdgeInsets.all(8.0),
child: Image.memory(data.buffer.asUint8List()),
);
}
},
),
);
Widget _buildScaledImageView() => Container(
width: 192.0,
height: 96.0,
decoration: BoxDecoration(
border: Border.all(),
color: Colors.white30,
),
child: ValueListenableBuilder<ByteData?>(
valueListenable: rawImageFit,
builder: (context, data, child) {
if (data == null) {
return Container(
color: Colors.red,
child: Center(
child: Text('not signed yet (png)\nscaleToFill: true'),
),
);
} else {
return Container(
padding: EdgeInsets.all(8.0),
color: Colors.orange,
child: Image.memory(data.buffer.asUint8List()),
);
}
},
),
);
Widget _buildSvgView() => Container(
width: 192.0,
height: 96.0,
decoration: BoxDecoration(
border: Border.all(),
color: Colors.white30,
),
child: ValueListenableBuilder<String?>(
valueListenable: svg,
builder: (context, data, child) {
if (data == null) {
return Container(
color: Colors.red,
child: Center(
child: Text('not signed yet (svg)'),
),
);
}
return Padding(
padding: EdgeInsets.all(8.0),
child: SvgPicture.string(
data,
placeholderBuilder: (_) => Container(
color: Colors.lightBlueAccent,
child: Center(
child: Text('parsing data(svg)'),
),
),
),
);
},
),
);
}
Download Details:
Author: basecontrol.dev
Source Code: https://github.com/romanbase/hand_signature
A practical hand tracking engine.
npm install @handtracking.io/yoha
Please note: you need to serve the model files from node_modules/@handtracking.io/yoha, since the library needs to download them from there (see the Webpack example).
Yoha is a hand tracking engine built with the goal of being a versatile solution in practical scenarios where hand tracking adds value to an application. While the ultimate goal is a general-purpose hand tracking engine supporting any hand pose, the engine revolves around specific hand poses that users and developers find useful. These poses are detected by the engine, which makes it possible to build applications with meaningful interactions. See the demo for an example.
Yoha is currently in beta.
About the name: Yoha is short for "Your Hand Tracking".
Yoha is currently available for the web via JavaScript. More languages will be added in the future. If you want to port Yoha to another language and need help, feel free to reach out.
Yoha was built from scratch. It uses a custom neural network trained on a custom dataset. The backbone for in-browser inference is currently TensorFlow.js.
If your desired pose is not supported, feel free to create an issue for it.
Yoha was built with performance in mind. It is able to provide a realtime user experience on a broad range of laptop and desktop devices. The performance on mobile devices is not great, which will hopefully change with the further development of inference frameworks like TensorFlow.js.
Please note that native inference speed cannot be compared with web inference speed. Put differently, if you were to run Yoha natively it would be much faster than via the web browser.
To run the example locally:
git clone https://github.com/handtracking-io/yoha && \
cd yoha/example && \
yarn && \
yarn start
To set up the repository for development:
git clone https://github.com/handtracking-io/yoha && \
cd yoha && \
./download_models.sh && \
yarn && \
yarn start
Author: Handtracking-io
Source Code: https://github.com/handtracking-io/yoha
License: MIT license