flutter_huaji_push
A new Flutter plugin.
Add the dependency to your project's `pubspec.yaml`:

```yaml
dependencies:
  flutter_huaji_push: 1.0.0
```
## Usage
TPNS cluster domains:

- China (Shanghai): tpns.sh.tencent.com
- China (Hong Kong): tpns.hk.tencent.com
- Singapore: tpns.sgp.tencent.com

To configure the cluster domain, use:

```dart
void configureClusterDomainName(String domainStr);
```
On Android, the domain can also be set in `AndroidManifest.xml`:

```xml
<application>
    <!-- other Android components -->
    <meta-data
        android:name="XG_SERVER_SUFFIX"
        android:value="domain for your region" />
</application>
```
```dart
import 'package:flutter_huaji_push/flutter_huaji_push.dart';
```

Note: for API usage, refer to the `/flutter_huaji_push/example/lib/main.dart` and `/flutter_huaji_push/example/lib/ios/homeTest.dart` files.
```groovy
android {
    // ...
    defaultConfig {
        applicationId "replace with your application ID"
        // ...
        ndk {
            // Select the .so ABIs to include.
            abiFilters 'armeabi', 'armeabi-v7a', 'x86', 'x86_64', 'mips', 'mips64', 'arm64-v8a'
        }
        manifestPlaceholders = [
            XG_ACCESS_ID : "replace with your ACCESS_ID",   // ACCESS_ID from the TPNS (XinGe) console
            XG_ACCESS_KEY: "replace with your ACCESS_KEY",  // ACCESS_KEY from the TPNS (XinGe) console
        ]
    }
}
```
Add the following ProGuard rules:

```
-keep public class * extends android.app.Service
-keep public class * extends android.content.BroadcastReceiver
-keep class com.tencent.android.tpush.** {*;}
-keep class com.tencent.tpns.baseapi.** {*;}
-keep class com.tencent.tpns.mqttchannel.** {*;}
-keep class com.tencent.tpns.dataacquisition.** {*;}
# Not required for TPNS-Android-SDK 1.2.0.1 and above:
-keep class com.tencent.bigdata.baseapi.** {*;}
-keep class com.tencent.bigdata.mqttchannel.** {*;}
```
Note: provides integration for each Android vendor push channel.
Note: provides all of TPNS's business APIs.
Run this command with Flutter:

```shell
$ flutter pub add flutter_huaji_push
```
This will add a line like this to your package's `pubspec.yaml` (and run an implicit `flutter pub get`):

```yaml
dependencies:
  flutter_huaji_push: ^1.0.12
```
Alternatively, your editor might support `flutter pub get`. Check the docs for your editor to learn more.
Now in your Dart code, you can use:

```dart
import 'package:flutter_huaji_push/flutter_huaji_push.dart';
```
```dart
import 'dart:async';

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:flutter_huaji_push/flutter_huaji_push.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  String _platformVersion = 'Unknown';

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  // Platform messages are asynchronous, so we initialize in an async method.
  Future<void> initPlatformState() async {
    String platformVersion;
    // Platform messages may fail, so we use a try/catch PlatformException.
    try {
      platformVersion = await FlutterHuajiPush.xgSdkVersion;
    } on PlatformException {
      platformVersion = 'Failed to get platform version.';
    }

    // If the widget was removed from the tree while the asynchronous platform
    // message was in flight, we want to discard the reply rather than calling
    // setState to update our non-existent appearance.
    if (!mounted) return;

    setState(() {
      _platformVersion = platformVersion;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: const Text('Plugin example app'),
        ),
        body: Center(
          child: Text('Running on: $_platformVersion\n'),
        ),
      ),
    );
  }
}
```
Download Details:
Author: wskkhn-hezhong
Source Code: https://github.com/wskkhn-hezhong/flutter_huaji_push
A list of online resources for quantitative modeling, trading, portfolio management
There are lots of other valuable online resources. We are not trying to be exhaustive. Please feel free to send a pull request if you believe something is worth recommending. A general rule of thumb for including an open-source project is that it has already received 100 stars on GitHub.
awesome-quant - Awesome quant is another curated list of quant resources
Quantopian - First Python-based online quantitative trading platform; its core library zipline and its performance evaluation library pyfolio; and alphalens
QuantConnect - C# based online quantitative trading platform; its core library Lean
Quantiacs - The Marketplace For Algorithmic Trading Strategies; its Matlab and Python toolbox
Numerai - crowd-sourced trading strategies; its Python API
Collective2 - The platform that allows investors to subscribe to top traders; its algotrades system
ZuluTrade - The platform that allows investors to subscribe to top traders
Tradingview - Provides free widgets used by, for example, Huobi
Investing.com - Real time multi-assets and markets
KloudTrader Narwhal - Trading algorithm deployment platform with flat-rate commission-free brokerage
MetaTrader 5 - Multi-Asset trading system
TradeStation - Trading system
SmartQuant(OpenQuant) - C# Trading system
RightEdge - Trading system
AmiBroker - Trading system
Algo Terminal - C# Trading system
NinjaTrader - Trading system
QuantTools - Enhanced Quantitative Trading Modelling in R
vnpy - A popular and powerful trading platform
pyalgotrade - Python Algorithmic Trading Library
finmarketpy - Python library for backtesting trading strategies
IBridgePy - A Python system derived from zipline
Backtrader - Blog, trading community, and github
IbPy - Interactive Brokers Python API
PyLimitBook - Python implementation of fast limit-order book
qtpylib - Pythonic Algorithmic Trading via IbPy API and its Website
Quantdom - Python-based framework for backtesting trading strategies & analyzing financial markets [GUI]
ib_insync - Python sync/async framework for Interactive Brokers API
rqalpha - A popular trading platform
bt - flexible backtesting for Python
TradingGym - Trading and Backtesting environment for training reinforcement learning agent or simple rule base algo.
btgym - Gym-compatible backtesting
prophet - Python backtesting and trading platform
OpenHFT - Java components for high-frequency trading
libtrading - C API, low latency, fix support
thOth - open-source high frequency trading library in C++ 11
qt_tradingclient - multithreaded Qt C++ trading application, QuantLib-1.2.1, CUDA 5.0
SubMicroTrading - Java Ultra Low Latency Trading Framework
WPF/MVVM Real-Time Trading Application - Architecture
TradeLink - TradeLink, one of the earliest open source trading system
Reactive Trader - using reactive Rx framework, includes Reactive Trader and Reactive Trader Cloud. The demo is here.
QuantTrading - Pure C# trading system
StockTrading - C# system utilising WPF, WCF, PRISM, MVVM, Threading
Quanter - StockTrader
StockSharp - C# trading system
SharpQuant - C# trading system
QuantSys - C# trading system
StockTicker - C# trading system
gotrade - Electronic trading and order management system written in Golang
gofinance - Financial information retrieval and munging in golang
goib - Pure Go interface to Interactive Brokers IB API
Matlab Trading Toolbox - Official toolbox from Matlab; accompanying Introduction to Matlab Trading Toolbox, and webinar Automated Trading System Development with MATLAB, and webinar Automated Trading with MATLAB, as well as webinar A Real-Time Trading System in MATLAB, Automated Trading with Matlab, Commodities Trading with Matlab, Cointegration and Pairs Trading with Econometrics Toolbox
Matlab risk management Toolbox - Official toolbox from Matlab
Matlab Walk Forward Analysis Toolbox - toolbox for walk-forward analysis
IB4m - matlab interface to interactive broker
IB-Matlab - introduction to another matlab interface to interactive broker and demo video
openAlgo Matlab - openAlgo's Matlab library
MatTest - Matlab backtest system
Quantlib - famous C++ library for quantitative finance; translated into other languages via SWIG
TA-Lib - Python wrapper for TA-Lib
DX Analytics - Python-based financial analytics library
FinMath - Java analytics library
OpenGamma - Java analytics library named STRATA
pyflux - Open source time series library for Python
arch - ARCH models in Python
flint - A Time Series Library for Apache Spark
Statsmodels - Statsmodels' documentation
awesome-deep-trading - A list of machine learning resources for trading
Awesome-Quant-Machine-Learning-Trading - Another list of machine learning resources for trading
awesome-ai-in-finance - A collection of AI resources in finance
deepstock - Technical experimentations to beat the stock market using deep learning
qtrader - Reinforcement Learning for Portfolio Management
stockPredictor - Predict stock movement with Machine Learning and Deep Learning algorithms
stock_market_reinforcement_learning - Stock market environment using OpenGym with Deep Q-learning and Policy Gradient
deep-algotrading - deep learning techniques from regression to LSTM using financial data
deep_trader - Use reinforcement learning on stock market and agent tries to learn trading.
Deep-Trading - Algorithmic trading with deep learning experiments
Deep-Trading - Algorithmic Trading using RNN
100 Day Machine Learning - Machine Learning tutorial with code
Multidimensional-LSTM-BitCoin-Time-Series - Using multidimensional LSTM neural networks to create a forecast for Bitcoin price
QLearning_Trading - Learning to trade under the reinforcement learning framework
bulbea - Deep Learning based Python Library for Stock Market Prediction and Modelling
PGPortfolio - source code of "A Deep Reinforcement Learning Framework for the Financial Portfolio Management Problem"
gym-trading - Environment for reinforcement-learning algorithmic trading models
Thesis - Reinforcement Learning for Automated Trading
DQN - Reinforcement Learning for finance
Deep-Trading-Agent - Deep Reinforcement Learning based Trading Agent for Bitcoin
deep_portfolio - Use Reinforcement Learning and Supervised learning to Optimize portfolio allocation.
Deep-Reinforcement-Learning-in-Stock-Trading - Using deep actor-critic model to learn best strategies in pair trading
Stock-Price-Prediction-LSTM - OHLC Average Prediction of Apple Inc. Using LSTM Recurrent Neural Network
DeepDow - Portfolio optimization with deep learning
Personae - Quantitative trading with deep learning
tensortrade - Reinforcement learning and trading
stockpredictionai - AI models such as GAN and PPO applied to stock markets
machine-learning-for-trading - Machine learning for algorithmic trading book
algorithmic-trading-with-python - Algorithmic Trading with Python book (2020)
machine-learning-asset-management - Machine Learning in Asset Management by firmai.org
Interactive Brokers - popular among retail trader
Bloomberg API - from Bloomberg
Quandl - free and premium data sources
iex - free market data
one tick - historical tick data
iqfeed - real time data feed
quantquote - tick and live data
algoseek - historical intraday
EOD data - historical data
EOD historical data - historical data
intrinio - financial data
arctic - High performance datastore from Man AHL for time series and tick data
SEC EDGAR API -- Query company filings on SEC EDGAR
Blockchain-stuff - Blockchain and Cryptocurrency Resources
cryptrader - Node.js Bitcoin bot for MtGox/Bitstamp/BTC-E/CEX.IO; cryptrade
BitcoinExchangeFH - Cryptocurrency exchange market data feed handler
hummingbot - free open source crypto trading bot that supports both DEXes and CEXes
blackbird - C++ trading system that does automatic long/short arbitrage between Bitcoin exchanges
Peatio - An open-source crypto currency exchange on github
Qt Bitcoin Trader - Qt C++ Bitcoin trading
ccxt - A JavaScript / Python / PHP cryptocurrency trading library with support for more than 130 bitcoin/altcoin exchanges
r2 - An automatic arbitrage trading system powered by Node.js + TypeScript
bcoin - Javascript bitcoin library for node.js and browsers
XChange - Java library providing a streamlined API for interacting with 60+ Bitcoin and Altcoin exchanges
Krypto-trading-bot - Self-hosted crypto trading bot (automated high frequency market making) in node.js, angular, typescript and c++
freqtrade - Simple High Frequency Trading Bot for crypto currencies
Gekko - A bitcoin trading bot written in node
viabtc_exchange_server - A trading engine with high-speed performance and real-time notification
catalyst - An Algorithmic Trading Library for Crypto-Assets in Python Enigma
buttercoin - Opensource Bitcoin Exchange Software
zenbot - A command-line cryptocurrency trading bot using Node.js and MongoDB.
tribeca - A high frequency, market making cryptocurrency trading platform in node.js
rbtc_arbitrage - A gem for automating arbitrage between Bitcoin exchanges.
automated-trading - Automated Trading: Trading View Strategies => Bitfinex, itBit, DriveWealth
gocryptotrader - A cryptocurrency trading bot and framework supporting multiple exchanges written in Golang
btcrobot - Golang bitcoin trading bot
bitex - Open Source Bitcoin Exchange; and its front-end
cryptoworks - A cryptocurrency arbitrage opportunity calculator. Over 800 currencies and 50 markets; cryptocurrency-arbitrage
crypto-exchange - list of crypto exchanges to interact with their API's in a uniform fashion
bitcoin-abe - block browser for Bitcoin and similar currencies
MultiPoolMiner - Monitors crypto mining pools in real-time in order to find the most profitable for your machine. Controls any miner that is available via command line
tai - An open source, composable, real time, market data and trade execution toolkit. Written in Elixir
crypto-signal - Technical signals for multiple exchanges
Not trying to be exhaustive
FIA PTG and FIA Europe
Commodity Focused
Top Geeky Quant Blogs - A quant blogs check out list
Quantocracy - Aggregation of news on quants
seekingalpha - Seeking Alpha community
Quantivity - quantitative and algorithmic trading
Wilmott - quantitative finance community forum
Elitetrader - trading forum
nuclearphynance - quantitative finance forum
Investopedia - The Encyclopedia of investments
Quantpedia - The Encyclopedia of Quantitative Trading Strategies
EpChan - Dr. Ernie Chan's blog
Quantinsti - Quant Institute
QuantStart - Michael Halls-Moore's quantstart, quant trading 101; its Python backtest platform qstrader and qsforex
Algotrading 101 - Algo trading 101
Systematic Investor/old version - Michael Kapler's blog, one of the best R quantitative blog; Systematic Investor Toolkit
R-Finance - R-Finance repository. It has backtest quantstrat, trade blotter, famous performance analytics package, and package portfolio analytics, portfolio attribution.
quantmod - R modelling and trading framework
r programming - Guy Yollin's R backtesting
Seer Trading - R Backtest and live trading
python programming finance - Python finance tutorial and Quantopian tutorial
python for finance - python finance
Quant Econ - open source python and julia codes for economic modeling; and lectures
JuliaQuant - Quantitative Finance in Julia
Portfolio Effect - real time portfolio and risk management
quant365 - Henry Moo's blog and trading system; including Sentosa, pysentosa binding, rsentosa binding and qblog.
hpc quantlib - HPC + QuantLib
quantstrat trader - Backtesting trading ideas with R QuantStrat package
Backtesting Strategies - Backtesting in R; codes at Github
The Quant MBA - good quant blog
Foss Trading - Algorithmic trading with free open source software
Gekko Quant - Quantitative Trading
Investment Idiocy - Systematic Trading, Quantitative Finance, Investing, Financial Activism, Economic decision making by Robert Carver; his book and his Python library
Quantifiable Edges/old version - Assessing market action with indicators and history
My Simple Quant - Market analysis utilizing historical, back-tested data
Vix and more - discussions on Vix
Timely Portfolio - Strategies and tests in R
Quantitative Research and Trading
Qusma - Quantitative Systematic Market Analysis
return and risk - Quantitative finance, analysis, and applications
Physics of Finance - Inspiration from physics for thinking about economics, finance and social systems
Quantum Financier - algorithmic trading
Trading the Odds -- market timing & quantitative analysis
CSSA - new concepts in quantitative research
Tr8dr - strategies, statistics, computer science, numerical techniques
Deniz's Note - blog of a quant Deniz Turan
Quant at risk - quantitative analysis and risk management
Quant Blog - Quantitative trading, portfolio management, and machine learning, with source codes on Github
The R Trader - Using R in quant finance
rbresearch - Using R for trading strategy ideas in FX and equity markets
NaN Quantivity - quant trading, statistical learning, coding and brainstorming
Factor Investing - blog on wordpress
Big Mike Trading - YouTube channel on quant trading
Predict Stock Prices Using RNN
BlackArbs - blog and machine learning notebooks on Github
Author: EliteQuant
Source Code: https://github.com/EliteQuant/EliteQuant
License: Apache-2.0 license
#machinelearning #trading #platform #quantitative #finance #algorithmic
The global cloud computing market is a rapidly growing one, valued at over 405.65 billion USD in 2021. According to predictions from Fortune Business Insights, professionals in the cloud computing industry are expected to enjoy its impressive compound annual growth rate (CAGR) of 19.9% and increasing demand across many regions.
To beginner developers and those who are looking into digitally transforming their businesses, the concept of cloud data platforms might be a bit difficult to grasp. But worry not – in this article, we’ll be tackling all you need to know about the basics of cloud data platforms.
A post on ‘What is a Data Platform?’ by MongoDB defines it as a set of technologies that can completely meet an organization’s end-to-end data needs. ‘End-to-end’ often refers to everything from the collection, storage, preparation, delivery, and safekeeping of data.
Cloud data platforms were essentially created as a solution to the problems that big data posed over 20 years ago.
Now, cloud data platforms and tools are the new norm, and have been developed to handle large data volumes at incredible speeds.
One of the major considerations for those who are looking into cloud platforms is how it differs from running databases on-premises.
An article by TechTarget explains that any IaaS or DBaaS is similar to running an on-premise database. Traditionally, organizations build data centers and servers with everything needed to manage data. Instead, major cloud platform providers can provide these services and tools to minimize the work needed for developers. Time and resources can then be spent on the development process itself.
First, make sure to look into database management systems that are compatible with your OS. Most of the top providers can be installed on Windows and MacOS, but some are more suited for specific operating systems. There are also three different types of cloud databases to choose from: self-managed, autonomous, and automated cloud databases. We recommend doing your research to choose the best database category for the type of program or application you are creating.
Next, you will need to learn about the process of data normalization, which refers to the structured approach to organizing a database. This reduces the chances of data redundancy and ensures that the database is easy to navigate.
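To make normalization concrete, here is a small illustrative sketch using Python's built-in sqlite3 module (the table and column names are invented for this example): instead of repeating a customer's name on every order row, each fact is stored once and the tables are linked by a key.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized design would repeat the customer's name on every order row,
# so renaming a customer means updating many rows (an update anomaly).
# Normalized design stores each fact exactly once and links tables by keys.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    item TEXT
)""")
cur.execute("INSERT INTO customers VALUES (1, 'Ada')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 'laptop'), (2, 1, 'mouse')])

# A join reassembles the combined view without storing the name twice.
rows = cur.execute("""SELECT c.name, o.item
                      FROM orders o JOIN customers c ON o.customer_id = c.id
                      ORDER BY o.id""").fetchall()
# rows == [('Ada', 'laptop'), ('Ada', 'mouse')]
```

The same principle applies regardless of which cloud database you pick: redundancy is pushed out of the data and recovered through relationships.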
Managed cloud database services can handle aspects like the necessary hardware, automated backups, and storage capacity management.
Most providers also support popular languages such as JavaScript, Python, and Ruby, which makes them accessible even to beginners. Regardless of your experience as a developer, cloud database building is essentially the same at the core. Most of the work lies in understanding the structure of your cloud database and keeping your records and attributes organized.
In the next decade, we will continue to see a rise in the usage of cloud data platforms and their impact on industries like retail, finance, manufacturing, healthcare, and even for government use. It’s a great time to learn about this technology and its many uses.
Original article source at: https://makitweb.com/
Open-source developer infrastructure for internal tools. A self-hostable alternative to Airplane, Pipedream, and Superblocks, and a simplified Temporal, with autogenerated UIs to trigger workflows and scripts as internal apps. Scripts are turned into UIs and no-code modules, no-code modules can be composed into very rich flows, and scripts and flows can be triggered from internal UIs made with a low-code builder. The supported script languages are: Python, TypeScript, Go, and Bash.
Windmill
Disclaimer: Windmill is in BETA. It is secure to run in production but we are still improving the product fast.
Define a minimal and generic script in Python, TypeScript, Go, or Bash that solves a specific task; here, sending an email with SMTP. The code can be written in the provided web IDE or synchronized with your own GitHub repo:
Your script's parameters are automatically parsed and used to generate a frontend.
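As a sketch of what such a script can look like (the parameter names are hypothetical; the assumption here is Windmill's convention of a top-level `main` function whose typed parameters drive the auto-generated form):

```python
import smtplib
from email.message import EmailMessage

def build_message(sender: str, to: str, subject: str, body: str) -> EmailMessage:
    # Assemble a plain-text email; kept separate from sending so it is easy to test.
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = to
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def main(smtp_host: str, sender: str, to: str, subject: str, body: str,
         smtp_port: int = 587):
    # Each parameter of `main` becomes a field in the generated UI.
    msg = build_message(sender, to, subject, body)
    with smtplib.SMTP(smtp_host, smtp_port) as server:
        server.starttls()          # upgrade the connection to TLS before sending
        server.send_message(msg)
    return {"sent_to": to}         # returned values surface as the job's result
```

The typed signature is what makes the form generation possible: a `str` parameter becomes a text field, an `int` a number field, and defaults pre-fill the form.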
Make it flow! You can chain your scripts or scripts made by the community shared on WindmillHub.
Build complex UI on top of your scripts and flows.
Scripts and flows can also be triggered by a cron schedule '*/5 * * * *' or through webhooks.
You can build your entire infra on top of Windmill!
We have a powerful CLI to interact with the Windmill platform and sync your scripts from your own GitHub repo. See more details.
- `backend/`: Rust backend
- `frontend/`: Svelte frontend
- `lsp/`: LSP assistant for the Monaco editor
- `<lang>-client/`: Windmill client for the given `<lang>`
Windmill uses nsjail on top of Deno's sandboxing. It is production-grade, multi-tenant secure. Do not take our word for it; take fly.io's.
There is one encryption key per workspace to encrypt the credentials and secrets stored in Windmill's K/V store.
In addition, we strongly recommend that you encrypt the whole Postgres database. That is what we do at https://app.windmill.dev.
Once a job has started, there is no overhead compared to running the same script on the node with its corresponding runtime (Deno/Go/Python/Bash). The added latency from a job being pulled from the queue, started, and having its result sent back to the database is ~50ms. A typical lightweight Deno job will take around 100ms total.
We only provide a docker-compose setup here. For more advanced setups, like compiling from source or running without a Postgres superuser, see the documentation.

Running

```shell
docker compose up
```

with the following docker-compose file is sufficient: https://github.com/windmill-labs/windmill/blob/main/docker-compose.yml

Go to https://localhost and voilà :)

For older kernels (< 4.18), set `DISABLE_NUSER=true` as an env variable, otherwise nsjail will not be able to launch the isolated scripts. To disable nsjail altogether, set `DISABLE_NSJAIL=true`.
The default super-admin user is admin@windmill.dev / changeme. From there, you can create other users (do not forget to change the password!).
We publish helm charts at: https://github.com/windmill-labs/windmill-helm-charts
To self-host Windmill, you must respect the terms of the AGPLv3 license, which you do not need to worry about for personal use. For business use, you should be fine as long as you do not re-expose Windmill in any way to your users and are comfortable with AGPLv3.
To re-expose any Windmill parts to your users as a feature of your product, or to build a feature on top of Windmill, to comply with AGPLv3 your product must be AGPLv3 or you must get a commercial license. Contact us at license@windmill.dev if you have any doubts.
In addition, a commercial license grants you a dedicated engineer to transition your current infrastructure to Windmill, support with tight SLA, audit logs export features, SSO, unlimited users creation, advanced permission managing features such as groups and the ability to create more than one workspace.
To get the same OAuth integrations as Windmill Cloud, mount `oauth.json` with the following format:

```json
{
  "<client>": {
    "id": "<CLIENT_ID>",
    "secret": "<CLIENT_SECRET>",
    "allowed_domains": ["windmill.dev"] // restrict a client's OAuth login to some domains
  }
}
```

and mount it at `/usr/src/app/oauth.json`.

The redirect URL for the OAuth clients is: `<instance_url>/user/login_callback/<client>`
The list of all possible "connect an app" oauth clients
To add more "connect an app" OAuth clients to the Windmill project, read the Contributor's guide. We welcome contributions!
You may also add your own custom OAuth2 IdP and OAuth2 Resource provider:
```json
{
  "<client>": {
    "id": "<CLIENT_ID>",
    "secret": "<CLIENT_SECRET>",
    // To add a new OAuth2 IdP
    "login_config": {
      "auth_url": "<auth_endpoint>",
      "token_url": "<token_endpoint>",
      "userinfo_url": "<userinfo_endpoint>",
      "scopes": ["scope1", "scope2"],
      "extra_params": "<if_needed>"
    },
    // To add a new OAuth2 Resource
    "connect_config": {
      "auth_url": "<auth_endpoint>",
      "token_url": "<token_endpoint>",
      "scopes": ["scope1", "scope2"],
      "extra_params": "<if_needed>"
    }
  }
}
```
You will also want to import all the approved resource types from WindmillHub. There is currently no automatic way to do this, but it will be possible using a command in the upcoming CLI tool.
| Environment variable | Default | Description | Server/Worker/All |
|---|---|---|---|
| DATABASE_URL | | The Postgres database URL | All |
| DISABLE_NSJAIL | true | Disable nsjail sandboxing | Worker |
| NUM_WORKERS | 3 | The number of workers per worker instance (set to 1 on EKS to have 1 pod = 1 worker) | Worker |
| METRICS_ADDR | None | The socket address at which to expose Prometheus metrics at the /metrics path. Set to "true" to expose it on port 8001 | All |
| JSON_FMT | false | Output the logs in JSON format instead of logfmt | All |
| BASE_URL | http://localhost:8000 | The base URL that is exposed publicly to access your instance | Server |
| BASE_INTERNAL_URL | http://localhost:8000 | The base URL that is reachable by your workers to talk to the servers. This helps avoid going through the external load balancer for VPC-internal requests | Worker |
| TIMEOUT | 300 | The timeout in seconds for the execution of a script | Worker |
| SLEEP_QUEUE | 50 | The number of ms to sleep between checks for new jobs in the DB. It is multiplied by NUM_WORKERS such that on average, for one worker instance, there is one pull every SLEEP_QUEUE ms | Worker |
| MAX_LOG_SIZE | 500000 | The maximum number of characters a job can emit (log + result) | Worker |
| DISABLE_NUSER | false | If nsjail is enabled, disable nsjail's clone_newuser setting | Worker |
| KEEP_JOB_DIR | false | Keep the job directory after the job is done. Useful for debugging | Worker |
| LICENSE_KEY (EE only) | None | License key checked at startup for the Enterprise Edition of Windmill | Worker |
| S3_CACHE_BUCKET (EE only) | None | The S3 bucket to sync the worker cache to | Worker |
| TAR_CACHE_RATE (EE only) | 100 | The rate at which to tar the worker cache. 100 means every 100th job on average (uniformly randomly distributed) | Worker |
| SLACK_SIGNING_SECRET | None | The signing secret of your Slack app. See Slack documentation | Server |
| COOKIE_DOMAIN | None | The domain of the cookie. If not set, the cookie will be set by the browser based on the full origin | Server |
| SERVE_CSP | None | The CSP directives to use when serving the frontend static assets | Server |
| DENO_PATH | /usr/bin/deno | The path to the deno binary | Worker |
| PYTHON_PATH | /usr/local/bin/python3 | The path to the python binary | Worker |
| GO_PATH | /usr/bin/go | The path to the go binary | Worker |
| PIP_INDEX_URL | None | The index URL to pass to pip | Worker |
| PIP_EXTRA_INDEX_URL | None | The extra index URL to pass to pip | Worker |
| PIP_TRUSTED_HOST | None | The trusted host to pass to pip | Worker |
| PATH | None | The PATH environment variable, usually inherited | Worker |
| HOME | None | The home directory to use for Go and Bash, usually inherited | Worker |
| DATABASE_CONNECTIONS | 50 (Server) / 3 (Worker) | The max number of connections in the database connection pool | All |
| SUPERADMIN_SECRET | None | A token that lets the caller act as the virtual superadmin superadmin@windmill.dev | Server |
| TIMEOUT_WAIT_RESULT | 20 | The number of seconds to wait before timing out on the 'run_wait_result' endpoint | Worker |
| QUEUE_LIMIT_WAIT_RESULT | None | The max number of jobs in the queue before immediately rejecting requests to the 'run_wait_result' endpoint. Takes precedence over the query arg. If none is specified, there is no limit | Worker |
| DENO_AUTH_TOKENS | None | Custom DENO_AUTH_TOKENS to pass to the worker to allow the use of private modules | Worker |
| DENO_FLAGS | None | Override the flags passed to deno (default --allow-all) to tighten permissions. Minimum permissions needed are "--allow-read=args.json --allow-write=result.json" | Worker |
| PIP_LOCAL_DEPENDENCIES | None | Specify dependencies that are installed locally and do not need to be resolved nor installed again | Worker |
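A quick worked example of the SLEEP_QUEUE / NUM_WORKERS interaction described above (plain arithmetic, not Windmill code):

```python
# Per the description of SLEEP_QUEUE: each worker sleeps
# SLEEP_QUEUE * NUM_WORKERS ms between DB polls, so a whole worker
# instance averages one pull every SLEEP_QUEUE ms.
sleep_queue_ms = 50   # default SLEEP_QUEUE
num_workers = 3       # default NUM_WORKERS

per_worker_sleep_ms = sleep_queue_ms * num_workers         # 150 ms per worker
instance_pulls_per_second = num_workers * 1000 / per_worker_sleep_ms
# -> 20 pulls/s instance-wide, i.e. one every 50 ms on average
```

Raising NUM_WORKERS therefore does not increase DB polling pressure per instance; the per-worker sleep grows to compensate.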
This will use the backend of https://app.windmill.dev but your own frontend with hot-code reloading.

- In `frontend/`: `npm install`, `npm run generate-backend-client`, then `npm run dev`
- `sudo caddy run --config CaddyfileRemote`
- Go to http://localhost/

See the ./frontend/README_DEV.md file for all running options.
- `cargo install sqlx-cli && sqlx migrate run`. This will also avoid compile-time issues with sqlx's `query!` macro
- Install Deno at `/usr/bin/deno` and Python at `/usr/local/bin/python3`
- In `frontend/`: `npm install`, `npm run generate-backend-client`, then `npm run dev`. Also run `npm run build`, otherwise the backend will not find the `frontend/build` folder and will crash
- `sudo caddy run --config Caddyfile`
- In `backend/`: `DATABASE_URL=<DATABASE_URL_TO_YOUR_WINDMILL_DB> RUST_LOG=info cargo run`
- Go to http://localhost/
Try it (personal workspaces are free forever): https://app.windmill.dev
Documentation: https://docs.windmill.dev
Discord: https://discord.gg/V7PM2YHsPB
Contributor's guide: https://docs.windmill.dev/docs/contributors_guide
Roadmap: https://github.com/orgs/windmill-labs/projects/2
You can show your support for the project by starring this repo.
Windmill Labs offers commercial licenses, an enterprise edition, local hub mirrors, and support: contact ruben@windmill.dev.
Author: Windmill-labs
Source Code: https://github.com/windmill-labs/windmill
License: Unknown and 2 other licenses found
If you want to learn something, doing so online is a good idea. This is because, if you think about it, everything is on the internet nowadays. Besides, you don't get old textbooks or anything of that kind; in this age we have very nice interactive and multimedia material!
But because everything is on the internet, there are many learning platforms available. Knowing which one is best could be a big challenge.
For that reason, in today’s article we will talk about 9 of the best online learning platforms out there.
Let’s start.
Let’s begin with one of the best programming learning platforms out there. It offers a broad range of subjects to learn about.
They offer courses in:

HTML, CSS, JavaScript, React JS, React Native, Python, Blockchain.

The courses are very interactive, easy to digest, and insightful. They feature a constant learning feedback loop, showing you a multiple-choice question after most definitions/explanations.

Courses are always organized coherently, into career paths and a comprehensive chart.

The platform promises to give you enough tools to build projects on your own after a few courses. It also offers certificates, and there is a very neat dark mode available.

The platform offers several free courses, and a Pro upgrade that starts at $25/mo., $240 per year, or a one-time payment in a limited offer, with unlimited access to the library and certificates.
Udemy is one of the biggest, most well-known learning platforms to this day. Its library holds over 130k courses encompassing career and personal skills, offered by top industry instructors.

Its main categories are:

Design, (Web) Development, Marketing, IT and Software, Personal Development, Business, Photography, and Music.

There is a search bar and featured sections where courses are presented like products in a marketplace.

Each course has a detailed description, a rating system, and the languages on offer.

There is also a course content section where you'll see the main areas of the course and its estimated time. Plus, there are requirements, reviews, and info about the instructor. And they have a mobile app.

About prices: since this is a marketplace rather than a subscription-based teaching platform, you must pay for every individual course; prices vary and start at 15-20 euros.
If you’ve been online long enough, you’ve already heard of LinkedIn. Well, they also have a learning platform.
The platform offers courses from industry experts, with over 16k courses and learning paths.
Among the popular topics are:
Personal Effectiveness, Spreadsheets, Illustration, Personal Branding, Data Analysis, Drawing, Business Intelligence, etc.
Courses are video-based, with a preview video and a table of contents. Each section has estimated times and transcripts.
There are also view counts and a general overview, and some courses include exercise files and quizzes. Plus, they have a mobile app as well.
There are no free plans or courses. There is a single plan at $39/mo., which gives full access to the library, certificates, and LinkedIn Premium.
Skillshare is one of the most famous learning platforms out there, with courses taught by experienced industry practitioners. Its main categories are: Creative, Business, Technology, and Lifestyle.
The platform focuses heavily on creative work, so some of its categories include:
Animation, Design, Illustration, Lifestyle, Photography and Film, Business (creative), Writing, etc.
But it covers everything else too, such as:
Data Science, Mobile/Web Development, Language, Culinary, etc.
Courses include a welcome video, a table of contents, and a difficulty level.
Each section has estimated times. You will also find the number of students and the projects available, plus comments and detailed descriptions. There is a nice rating system that includes an "expectations met" rubric as well. And there are transcripts for each section, a mobile app, and offline viewing.
Coursera is one of the most corporate- and university-oriented learning platforms there is. It collaborates with over 200 partners, including Google, IBM, and Stanford. It provides industry-recognized credentials and has a mobile app at your disposal.
The library consists of over 5,100 courses/specializations, over 40 certificates, and over 25 degrees.
Among the popular subjects are:
Business, Computer Science, Data Science, Language, etc.
The online master’s and university degrees cover:
IT, Business, Health, and other sectors. The catalog is large, with both free and paid courses. As a rule, the free courses charge a fee for the certificate.
Udacity is a learning platform focused on digital skills of all sorts.
Its categories are called "schools", and these are currently available:
Data Science, Programming, Business, AI, Autonomous Systems, Cloud Computing, and Cybersecurity.
Courses have a syllabus, estimated times, prerequisites, detailed descriptions, and a full list of offerings.
There is also a list of instructors with their details, reviews and ratings, and 24/7 access to mentors. Free trials are available, but prices depend on the course. Most of the top programs start at $399/mo., with discounts when paying per semester that bring the price closer to $200/mo.
edX is a not-for-profit organization partnered exclusively with universities, over 150 of them, including MIT and Harvard.
It applies cognitive-science learning techniques and offers 24 distinct learning features, including videos, graphs, interactive segments, quizzes, etc.
Its library consists of over 2,800 courses, and the top courses relate to:
Computer Science, Business, Data Science, Engineering, Design, Humanities, and Language.
Each course has an overview, detailed information, and average effort/time required. There is also information about the instructor, plus endorsements. There is no rating system available.
There are some caveats, though: some programs are incomplete unless you pay, and others are complete, but the certificate is not included.
Thinkific, like Teachable, is best known as a course-creation platform, but both also offer courses.
Thinkific is different in that you won’t browse its courses in a marketplace: creators build and publish them on their own.
That makes it hard to say which courses or subjects are most popular, since there is no central platform on which to compare them. But we can tell you what to expect.
Based on best practices, you will find at least an overview, a detailed description, prerequisites, and an estimated time.
Talking about prices isn’t possible either, for the same reason: each course sets its own.
Teachable is a well-known online learning platform. Among the courses you can take, you will find those related to:
Marketing, Productivity, Course Creation, Writing, and Business, as well as Art, Language, Tech and Programming, and Health and Fitness.
Each course includes a description and downloadable materials (videos included).
There is also a table of contents with estimated times, structure information, and exercise sheets.
Each course also has an FAQ section; unfortunately, a rating system is missing. As for prices, there are free and paid courses, which can range from $100-200 or exceed $1,000.
Online learning has never been this accessible and varied. Whatever your interests, if you are committed to gaining knowledge, all doors are open. Whether you take free short courses or pay for an online university degree, there is plenty to make the most of.
With some research and reflection, we’re certain you’ll find what suits you best.
Thanks for reading this article; we hope it was educational and useful. To our fellow learners out there, we wish you the best of luck.
Original article source at: https://www.blog.duomly.com/
1671227040
Artificial intelligence has taken off with a speed that few could have predicted five years ago. With companies like Google and Facebook investing billions of dollars every year into AI research, we’re able to see self-driving cars and virtual assistants that can recognize our voices while responding almost instantaneously to our commands after only a couple of iterations.
In this post, I’ll go into more depth about how Kaggle works, what types of competitions are available, and then give details about how one would solve the challenge at hand using machine learning.
If you’d like to learn more about what Kaggle is, how it works, and why 600,000 people use the platform, read on below!
Kaggle is a platform where data scientists can compete in machine learning challenges. These challenges can be anything from predicting housing prices to detecting cancer cells. Kaggle has a massive community of data scientists who are always willing to help others with their data science problems. In addition to the competitions, Kaggle also has many tutorials and resources that can help you get started in machine learning.
If you are an aspiring data scientist, Kaggle is the best way to get started. Many companies will give offers to those who rank highly in their competitions. In fact, Kaggle may become your full-time job if you can hit one of their high rankings.
Kaggle is best for businesses that have data that they feel needs to be analyzed. The most significant benefit of Kaggle is that these companies can easily find someone who knows how to work with their data, which makes solving the problem much easier than if they were trying to figure out what was wrong with their system themselves.
There are many different types of competitions available on Kaggle. You can enter a contest in everything from predicting cancer cells in microscope images to analyzing satellite images for changes over time.
Examples include:
Every competition on Kaggle has a dataset associated with it and a goal you must reach (i.e., predict housing prices or detect cancer cells). You can access the data as often as possible and build your prediction model. Still, once you submit your solution, you cannot use it to make future submissions.
This ensures that everyone is starting from the same point when competing against one another, so there are no advantages given to those with more computational power than others trying to solve the problem.
Competitions are separated into different categories depending on their complexity level, how long they take, whether or not prize money is involved, etc., so users with varying experience levels can compete against each other in the same arena.
You should be comfortable with data analysis and machine learning if you’re looking to get involved in competitions.
Data science is a very broad term that can be interpreted in many ways depending on who you talk to. But suppose we’re talking specifically about competitive data science like what you see on Kaggle. In that case, it’s about solving problems or gaining insights from data.
It doesn’t necessarily involve machine learning, but you will need to understand the basics of machine learning to get started. There are no coding prerequisites either, though I would recommend having some programming experience in Python or R beforehand.
That being said, if competitive data science sounds interesting to you and you want to get started right away, we have a course for that on Duomly!
The best way to improve is just practice, so feel free to give any of their challenges a shot!
The sign-up process for entering a competition is very straightforward. Most competitions ask competitors to submit code that meets specific criteria at the end of each challenge. However, there may be times when they want competitors to explain which algorithms they used or provide input about how things work.
Suppose you want to solve one of their business-related challenges. In that case, you’ll need a good understanding of machine learning and of which models work well with certain types of data. Suppose instead you want to do one of their custom competitions. Then you’ll need a computer science background to code in the language associated with the problem.
Many companies on Kaggle are looking for solutions, so there is always a prize attached to each competition. If your solution is strong enough, you can win a lot of money!
Some of these competitions are just for fun or learning purposes but still award winners with cash or merchandise prizes.
The most important tool that competitors rely on every day is the Python programming language. It’s used by over 60% of all data scientists, so it has an extremely large community behind it. It’s also extremely robust, with many different packages available for data manipulation, preprocessing, and exploration to get you started.
TensorFlow is another popular tool that machine learning enthusiasts use to solve Kaggle competitions. It allows quick prototyping of models to get the best possible results. Several other tools are used in addition to Python and TensorFlow, such as R (a statistical programming language), Git (version control), and Bash (a command-line interface). Still, I’ll let you research those on your own!
Kaggle aims to give you the tools necessary to become a world-class data scientist. They provide you with access to real data in real-time so you can practice solving problems similar to what companies face around the world.
They’re constantly updating the site so that you always have the most up-to-date learning material.
Kaggle gives beginners a way to learn more about machine learning and will allow them to utilize their skills no matter where they’re at.
Using Kaggle allows beginners to see what’s going on in the industry, keep up with trends, and become an expert with their tools as things change.
It also offers free training material for those just starting out, those who want a refresher on specific concepts, and those who need help getting started.
With many tutorials and datasets readily available, machine learning enthusiasts will find Kaggle very interesting.
It is an excellent place for them to learn more about machine learning, practice what they’ve learned, and compete with other data scientists, all of which helps them become better at their craft.
Data analysts who want to use machine learning in their work can turn to Kaggle when choosing tools to improve the performance of business-related tasks such as forecasting sales numbers or predicting customer behavior.
In addition, businesses who are looking for third-party solutions can benefit from Kaggle’s extensive list of companies offering the service they need.
If you need machine learning services, don’t hesitate to contact us. We have a team of experts who can help you with your needs.
Original article source at: https://www.blog.duomly.com/
1670079125
Power Apps cards are simple, lightweight micro-apps that hold and display enterprise data along with workflow and logic functions.
No coding expertise is needed: using simple drag-and-drop functions, cards can be quickly built and shared with other applications as actionable apps.
Data cards are inserted inside a Power App itself as integrated cards, whereas Power Apps cards are standalone micro-apps in the Power Platform ecosystem whose card UI elements can be used by other applications to display content.
To find Cards, go to https://make.powerapps.com/; in the left-side navigation you will find the new feature.
Microsoft has provided an interface to create cards. UI controls can be dragged and dropped into a card, attached to function logic, and connected to any of the Power Platform connectors to display data.
You can add
And so on.
It is a service where you can manage the cards used to send and receive data and to embed data and workflows.
Power Platform connectors can be connected in no time, with all the safety and security of the environment enabled.
Not just static data: you can build business logic using Power Fx, and the resulting responses can be targeted and inserted into the UI elements available in the cards.
You can send cards from the card designer to a Microsoft Teams chat or channel to share your cards with others.
No. This is a preview feature, and previews are not suitable for production environments. You can play around with it while waiting for general availability.
Yes, definitely. Please visit the Microsoft Learn docs to read more about it.
Original article source at: https://www.c-sharpcorner.com/
1669454290
What is Google Cloud Platform (GCP)? – Introduction to GCP Services & GCP Account
In recent years the market for Cloud Computing has grown unexpectedly. There are many cloud providers in the market today, such as VMware, Amazon Web Services, Google Cloud Platform, Microsoft Azure, IBM Cloud, and many more. According to Gartner’s prediction, the worldwide public cloud service market will reach $178 billion in 2018, up from $146 billion in 2017, and will continue to grow at a 22% CAGR (Compound Annual Growth Rate). So let’s begin with our What is Google Cloud blog.
Below are the topics this article will cover:
Google Cloud Platform is a set of Computing, Networking, Storage, Big Data, Machine Learning and Management services provided by Google that runs on the same Cloud infrastructure that Google uses internally for its end-user products, such as Google Search, Gmail, Google Photos and YouTube.
You can go through this Google Cloud Platform video lecture, where our GCP training expert discusses every nitty-gritty detail of Google Cloud technology.
So before looking into the details of Google Cloud Platform, Let’s understand Cloud Computing First.
Cloud computing is the on-demand delivery of compute power, database storage, applications, and other IT resources through a cloud services platform via the internet with pay-as-you-go pricing. It is the use of remote servers on the internet to store, manage and process data rather than a local server or your personal computer.
Cloud computing allows companies to avoid or minimize up-front IT infrastructure costs, keep their applications up and running faster with improved manageability and less maintenance, and enable IT teams to adjust resources rapidly to meet fluctuating and unpredictable demand.
Cloud-computing providers offer their services according to different models, of which the three standard models per NIST (National Institute of Standards and Technology) are: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
Now that you have a brief idea of what Google Cloud Platform and cloud computing are, let’s understand why one should go for it. Google Cloud Platform is a suite of cloud computing services that run on the same infrastructure that Google uses internally for its end-user products, such as Google Search, Gmail, Google Photos, and YouTube. We all know how big the databases of Gmail, YouTube, and Google Search are.
And Google’s servers have hardly gone down in recent years. Google runs one of the biggest infrastructures in the world, so trusting them seems an obvious choice, right?
So now let’s look at some of the features of GCP that really give it an upper hand over other vendors.
Google Cloud Platform services are available in various locations across North America, South America, Europe, Asia, and Australia. These locations are divided into regions and zones. You can choose where to locate your applications to meet your latency, availability and durability requirements.
Here you can see that there are a total of 15 regions, with at least 3 zones in every region.
Google offers a wide range of services across the following domains:
Compute: GCP provides a scalable range of computing options you can tailor to match your needs. It provides highly customizable virtual machines and the option to deploy your code directly or via containers.
Networking: The Networking domain includes the following services related to networking:
Storage and Databases: The Storage domain includes the following services related to data storage:
Big Data: The Big Data domain includes the following services related to big data:
Cloud AI: The Cloud AI domain includes the following services related to machine learning:
Identity & Security: The Identity & Security domain includes the following services related to security:
Management Tools: The Management Tools domain includes the following services related to monitoring and management:
Developer Tools: The Developer Tools domain includes the following services related to development:
Now that we have learned what Google Cloud Computing is, to gain access to these services you just need to create a free account on GCP. You get $300 worth of credit to spend over a period of 12 months. You need to provide your card details, but you won’t be charged after your trial period ends or you have exhausted the $300 credit.
After you create an account, go to the Console.
Here you will have a Dashboard that summarizes the Google Cloud Platform services you are using, along with stats and a billing report.
In this section of Google Cloud Platform, you can find the summarized view of the following:
So this is it, guys!
I hope you enjoyed this What is Google Cloud Platform blog. If you are reading this, congratulations! You are no longer a newbie to GCP.
Now that you have understood what Google Cloud Server and Google Cloud Engine are, check out the GCP Certification Training by Edureka, a trusted online learning company with a network of more than 250,000 satisfied learners spread across the globe. The Edureka Google Cloud Computing course is designed to help you pass the Professional Cloud Architect – Google Cloud Certification.
Got a question for us? Please mention it in the comments section and we will get back to you or join our GCP Training in Singapore today.
Original article source at: https://www.edureka.co/
1667017455
This plugin is used to recognize user activity on Android and iOS platforms. Under the hood, it uses ActivityRecognitionClient on Android and CMMotionActivityManager on iOS.
To use this plugin, add flutter_activity_recognition as a dependency in your pubspec.yaml file. For example:
dependencies:
  flutter_activity_recognition: ^1.3.0
After adding the flutter_activity_recognition plugin to the Flutter project, we need to specify the platform-specific permissions and services for this plugin to work properly.
Open the AndroidManifest.xml file and add the following permissions between the <manifest> and <application> tags.
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="com.google.android.gms.permission.ACTIVITY_RECOGNITION" />
Open the ios/Runner/Info.plist file and add the following permission inside the <dict> tag.
<key>NSMotionUsageDescription</key>
<string>Used to recognize user activity information.</string>
Get the FlutterActivityRecognition instance.
final activityRecognition = FlutterActivityRecognition.instance;
Future<bool> isPermissionGrants() async {
// Check if the user has granted permission. If not, request permission.
PermissionRequestResult reqResult;
reqResult = await activityRecognition.checkPermission();
if (reqResult == PermissionRequestResult.PERMANENTLY_DENIED) {
dev.log('Permission is permanently denied.');
return false;
} else if (reqResult == PermissionRequestResult.DENIED) {
reqResult = await activityRecognition.requestPermission();
if (reqResult != PermissionRequestResult.GRANTED) {
dev.log('Permission is denied.');
return false;
}
}
return true;
}
// Subscribe to the activity stream.
final _activityStreamSubscription = activityRecognition.activityStream
.handleError(_handleError)
.listen(_onActivityReceive);
@override
void dispose() {
_activityStreamSubscription?.cancel();
super.dispose();
}
Defines the type of permission request result.
Value | Description |
---|---|
GRANTED | Occurs when the user grants permission. |
DENIED | Occurs when the user denies permission. |
PERMANENTLY_DENIED | Occurs when the user denies the permission once and chooses not to ask again. |
A model representing the user's activity.
Property | Description |
---|---|
type | The type of activity recognized. |
confidence | The confidence of activity recognized. |
Defines the type of activity.
Value | Description |
---|---|
IN_VEHICLE | The device is in a vehicle, such as a car. |
ON_BICYCLE | The device is on a bicycle. |
RUNNING | The device is on a user who is running. This is a sub-activity of ON_FOOT. |
STILL | The device is still (not moving). |
WALKING | The device is on a user who is walking. This is a sub-activity of ON_FOOT. |
UNKNOWN | Unable to detect the current activity. |
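As an illustrative sketch (the helper name is ours, and it assumes these values are exposed on the plugin's ActivityType enum), the recognized type can be mapped to a display label:

```dart
import 'package:flutter_activity_recognition/flutter_activity_recognition.dart';

/// Hypothetical helper: map an ActivityType to a human-readable label.
String activityLabel(ActivityType type) {
  switch (type) {
    case ActivityType.IN_VEHICLE:
      return 'In a vehicle';
    case ActivityType.ON_BICYCLE:
      return 'On a bicycle';
    case ActivityType.RUNNING:
      return 'Running';
    case ActivityType.STILL:
      return 'Still';
    case ActivityType.WALKING:
      return 'Walking';
    default: // UNKNOWN or any other value
      return 'Unknown';
  }
}
```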
Defines the confidence of activity.
Value | Description |
---|---|
HIGH | High accuracy: 80~100 |
MEDIUM | Medium accuracy: 50~80 |
LOW | Low accuracy: 0~50 |
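As a sketch, events below a chosen confidence can be filtered out before they reach the UI (assuming the plugin's ActivityConfidence enum and the activityStream shown earlier; the function name is ours):

```dart
import 'package:flutter_activity_recognition/flutter_activity_recognition.dart';

/// Sketch: forward only activity events with MEDIUM or HIGH confidence.
void listenToConfidentActivities(void Function(Activity) onActivity) {
  FlutterActivityRecognition.instance.activityStream
      .where((activity) => activity.confidence != ActivityConfidence.LOW)
      .listen(onActivity);
}
```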
If you find any bugs or issues while using the plugin, please file an issue on GitHub. You can also contact us at hwj930513@naver.com.
Run this command:
With Flutter:
$ flutter pub add flutter_activity_recognition
This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
  flutter_activity_recognition: ^1.3.0
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:flutter_activity_recognition/flutter_activity_recognition.dart';
import 'dart:async';
import 'dart:developer' as dev;
import 'package:flutter/material.dart';
import 'package:flutter_activity_recognition/flutter_activity_recognition.dart';
void main() => runApp(ExampleApp());
class ExampleApp extends StatefulWidget {
@override
_ExampleAppState createState() => _ExampleAppState();
}
class _ExampleAppState extends State<ExampleApp> {
final _activityStreamController = StreamController<Activity>();
StreamSubscription<Activity>? _activityStreamSubscription;
void _onActivityReceive(Activity activity) {
dev.log('Activity Detected >> ${activity.toJson()}');
_activityStreamController.sink.add(activity);
}
void _handleError(dynamic error) {
dev.log('Catch Error >> $error');
}
@override
void initState() {
super.initState();
WidgetsBinding.instance?.addPostFrameCallback((_) async {
final activityRecognition = FlutterActivityRecognition.instance;
// Check if the user has granted permission. If not, request permission.
PermissionRequestResult reqResult;
reqResult = await activityRecognition.checkPermission();
if (reqResult == PermissionRequestResult.PERMANENTLY_DENIED) {
dev.log('Permission is permanently denied.');
return;
} else if (reqResult == PermissionRequestResult.DENIED) {
reqResult = await activityRecognition.requestPermission();
if (reqResult != PermissionRequestResult.GRANTED) {
dev.log('Permission is denied.');
return;
}
}
// Subscribe to the activity stream.
_activityStreamSubscription = activityRecognition.activityStream
.handleError(_handleError)
.listen(_onActivityReceive);
});
}
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
appBar: AppBar(
title: const Text('Flutter Activity Recognition'),
centerTitle: true
),
body: _buildContentView()
),
);
}
@override
void dispose() {
_activityStreamController.close();
_activityStreamSubscription?.cancel();
super.dispose();
}
Widget _buildContentView() {
return StreamBuilder<Activity>(
stream: _activityStreamController.stream,
builder: (context, snapshot) {
final updatedDateTime = DateTime.now();
final content = snapshot.data?.toJson().toString() ?? '';
return ListView(
physics: const BouncingScrollPhysics(),
padding: const EdgeInsets.all(8.0),
children: [
Text('•\t\tActivity (updated: $updatedDateTime)'),
SizedBox(height: 10.0),
Text(content)
]
);
}
);
}
}
Download Details:
Author: Dev-hwang
Source Code: https://github.com/Dev-hwang/flutter_activity_recognition
1665680340
ownCloud gives you freedom and control over your own data. A personal cloud which runs on your own server.
For installing ownCloud, see the official ownCloud 10 installation manual.
Note that when doing a local development build, you need to have Composer v2 installed. If your OS provides a lower version than v2, you can install Composer v2 manually. As an example, which may be valid for other releases/distros too, see How to install Composer on Ubuntu 22.04 | 20.04 LTS.
You also must have yarn and node (v14 or higher) installed.
https://owncloud.com/contribute/
Learn about the different ways you can get support for ownCloud: https://owncloud.com/support/
Please submit translations via Transifex: https://explore.transifex.com/owncloud-org/
See the detailed information about translations here.
Author: Owncloud
Source Code: https://github.com/owncloud/core
License: AGPL-3.0, Unknown licenses found
1661617752
The library provides direct access to Internet files on all platforms. It also has middleware to store files locally if needed. It is aimed primarily at use with plugins that cannot work with the Internet on their own.
Simple usage anywhere:
import 'package:internet_file/internet_file.dart';
final Uint8List bytes = await InternetFile.get(
'https://github.com/rbcprolabs/icon_font_generator/raw/master/example/lib/icon_font/ui_icons.ttf',
progress: (receivedLength, contentLength) {
final percentage = receivedLength / contentLength * 100;
print(
'download progress: $receivedLength of $contentLength ($percentage%)');
},
);
To store files locally, you can use InternetFileStorageIO (does not work on web):
import 'package:internet_file/storage_io.dart';
final storageIO = InternetFileStorageIO();
await InternetFile.get(
'https://github.com/rbcprolabs/icon_font_generator/raw/master/example/lib/icon_font/ui_icons.ttf',
storage: storageIO,
storageAdditional: storageIO.additional(
filename: 'ui_icons.ttf',
location: '',
),
);
Or you can write your own storage that does not require dart:io (for web support, etc.):
class MyOwnInternetFileStorage extends InternetFileStorage {
  @override
  Future<Uint8List?> findExist(
    String url,
    InternetFileStorageAdditional additional,
  ) async {
    // Look up the locally stored file here.
    // Access your own string property:
    print(additional['my_string_property'] as String);
    // Access your own property of any type:
    print((additional['my_date_property'] as DateTime).toString());
    return null; // Return the saved bytes, or null if the file is not stored.
  }

  @override
  Future<void> save(
    String url,
    InternetFileStorageAdditional additional,
    Uint8List bytes,
  ) async {
    // Save the file here.
  }
}
final myOwnStorage = MyOwnInternetFileStorage();
await InternetFile.get(
'https://github.com/rbcprolabs/icon_font_generator/raw/master/example/lib/icon_font/ui_icons.ttf',
storage: myOwnStorage,
storageAdditional: {
'my_string_property': 'string',
'my_int_property': 99,
'my_date_property': DateTime.now(),
},
);
InternetFile.get params
Parameter | Description | Optional | Default |
---|---|---|---|
url | Link to network file | required | - |
headers | Headers passed for the file load | optional | - |
progress | Callback reporting received and total byte counts while the file loads | optional | - |
storage | Implementation of InternetFileStorage with save & find methods for storing files locally | optional | - |
storageAdditional | Additional args for pass to InternetFileStorage implementation passed in storage | optional | {} |
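Putting a few of these parameters together, a sketch of a download call might look like this (the URL and token are placeholders, not real endpoints):

```dart
import 'dart:typed_data';

import 'package:internet_file/internet_file.dart';

/// Sketch: download a protected file with custom headers and a
/// progress callback. Placeholder URL and token.
Future<Uint8List> downloadWithAuth() => InternetFile.get(
      'https://example.com/protected/file.pdf',
      headers: {'Authorization': 'Bearer <TOKEN>'},
      progress: (receivedLength, contentLength) =>
          print('loaded $receivedLength of $contentLength bytes'),
    );
```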
Full api reference available here
Inspired by flutter_cache_manager, but made to support all platforms.
Uses:
InternetFileStorageIO
Created for usage in:
Run this command:
With Dart:
$ dart pub add internet_file
With Flutter:
$ flutter pub add internet_file
This will add a line like this to your package's pubspec.yaml (and run an implicit dart pub get):
dependencies:
  internet_file: ^1.2.0
Alternatively, your editor might support dart pub get or flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:internet_file/internet_file.dart';
example/lib/internet_file_example.dart
import 'package:internet_file/internet_file.dart';
import 'package:internet_file/storage_io.dart';
void main() async {
final storageIO = InternetFileStorageIO();
await InternetFile.get(
'http://www.africau.edu/images/default/sample.pdf',
storage: storageIO,
storageAdditional: storageIO.additional(
filename: 'ui_icons.ttf',
location: '',
),
force: true,
progress: (receivedLength, contentLength) {
final percentage = receivedLength / contentLength * 100;
print(
'download progress: $receivedLength of $contentLength ($percentage%)');
},
);
}
Download Details:
Author: ScerIO
Source Code: https://github.com/ScerIO/packages.dart/tree/master/packages/internet_file
1661536163
What is Rudder?
Short answer: Rudder is an open-source Segment alternative written in Go, built for the enterprise.
Long answer: Rudder is a platform for collecting, storing and routing customer event data to dozens of tools. Rudder is open-source, can run in your cloud environment (AWS, GCP, Azure or even your data-centre) and provides a powerful transformation framework to process your event data on the fly.
Released under MIT License
1. Open pubspec.yaml and add rudder_sdk_flutter under the dependencies section:
dependencies:
  rudder_sdk_flutter: ^1.2.0
2. Run flutter pub get to install the SDK.
3. Import the RudderClient:
import 'package:rudder_sdk_flutter/RudderClient.dart';
4. Initialize the RudderClient. Somewhere in your application, add the following code:
RudderConfigBuilder builder = RudderConfigBuilder();
builder.withDataPlaneUrl(DATA_PLANE_URL);
builder.withTrackLifecycleEvents(true);
builder.withRecordScreenViews(true);
final client = RudderClient.instance;
client.initialize(WRITE_KEY, config: builder.build());
An example track call is as below:
RudderProperty property = RudderProperty();
property.put("test_key_1", "test_key_1");
client.track("test_track_event", properties: property);
You can pass your device token for Push Notifications, to be forwarded to the destinations that support them. We set the token under context.device.token. An example of setting the device token is as below:
client.putDeviceToken(<DEVICE_TOKEN>);
We use the deviceId as the anonymousId by default. You can use the following method to override it and use your own anonymousId with the SDK. You need to call putAnonymousId before calling getInstance. An example of setting the anonymousId is shown below:
client.putAnonymousId(<ANONYMOUS_ID>);
You can use the putAdvertisingId method to pass your Android AAID or your iOS IDFA. The method accepts a single string argument:
advertisingId: your Android advertisingId (AAID) or your iOS advertisingId (IDFA)
On an Android device you need to call putAdvertisingId before calling getInstance. Example usage:
client.putAdvertisingId(<ADVERTISING_ID>);
The advertisingId parameter you pass to the above method is assigned as the AAID on an Android device and as the IDFA on an iOS device. For more detailed documentation, check the documentation page. If you come across any issues while configuring or using RudderStack, please feel free to contact us or start a conversation on our Slack channel. We will be happy to help you.
Run this command:
With Flutter:
$ flutter pub add rudder_sdk_flutter_platform_interface
This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
rudder_sdk_flutter_platform_interface: ^2.1.0
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:rudder_sdk_flutter_platform_interface/platform.dart';
import 'package:rudder_sdk_flutter_platform_interface/rudder_sdk_platform.dart';
Download Details:
Author: rudderlabs
Source Code: https://github.com/rudderlabs/rudder-sdk-flutter
Get Web Renderer
This package helps you detect the current web renderer. It provides a few basic APIs for recognizing the renderer in use. You just need to add get_web_renderer: ^any to your pubspec.yaml, and these are all the APIs available to you:
/// Returns true if the current renderer is HTML
bool _isHtmlRenderer = WebRenderer.isHtmlRenderer;
/// Returns true if the current renderer is CanvasKit
bool _isCanvasKitRenderer = WebRenderer.isCanvasKitRenderer;
/// Returns true if the app is not running on the web platform
bool _isOtherRenderer = WebRenderer.isOtherRenderer;
// Returns CurrentRenderer.html, CurrentRenderer.canvasKit, or CurrentRenderer.other
CurrentRenderer _currentRenderer = WebRenderer.getCurrentRenderer;
Run this command:
With Dart:
$ dart pub add get_web_renderer
With Flutter:
$ flutter pub add get_web_renderer
This will add a line like this to your package's pubspec.yaml (and run an implicit dart pub get):
dependencies:
get_web_renderer: ^1.1.0
Alternatively, your editor might support dart pub get or flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:get_web_renderer/get_web_renderer.dart';
import 'dart:async';
import 'package:flutter/material.dart';
import 'package:get_web_renderer/get_web_renderer.dart';
void main() {
runApp(const MyApp());
}
class MyApp extends StatefulWidget {
const MyApp({Key? key}) : super(key: key);
@override
State<MyApp> createState() => _MyAppState();
}
class _MyAppState extends State<MyApp> {
CurrentRenderer? _currentRenderer;
@override
void initState() {
super.initState();
initPlatformState();
}
// Platform messages are asynchronous, so we initialize in an async method.
Future<void> initPlatformState() async {
final currentRenderer = WebRenderer.getCurrentRenderer;
// If the widget was removed from the tree while the asynchronous platform
// message was in flight, we want to discard the reply rather than calling
// setState to update our non-existent appearance.
if (!mounted) return;
setState(() {
_currentRenderer = currentRenderer;
});
}
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
appBar: AppBar(
title: const Text('Current Web Renderer'),
),
body: _currentRenderer == null
? const CircularProgressIndicator()
: Center(
child: Text('This device is using the $_currentRenderer renderer'),
),
),
);
}
}
Download Details:
Author: vursin
Source Code: https://github.com/vursin/get_web_renderer
quick_breakpad
A cross-platform Flutter plugin for C/C++/Objective-C crash reporting via Google Breakpad
Use breakpad for quick_breakpad_example
$CLI_BREAKPAD is a local clone of https://github.com/Sunbreak/cli-breakpad.trial
# Device/emulator connected
$ android_abi=`adb shell getprop ro.product.cpu.abi`
$ pushd example
$ flutter run
✓ Built build/app/outputs/flutter-apk/app-debug.apk.
I/quick_breakpad(28255): JNI_OnLoad
I quick_breakpad_example(28255): JNI_OnLoad
D quick_breakpad(28255): Dump path: /data/data/com.example.quick_breakpad_example/cache/54ecbb9d-cef5-4fa9-5b6869b2-198bc87e.dmp
$ popd
$ adb shell "run-as com.example.quick_breakpad_example sh -c 'cat /data/data/com.example.quick_breakpad_example/cache/54ecbb9d-cef5-4fa9-5b6869b2-198bc87e.dmp'" >| 54ecbb9d-cef5-4fa9-5b6869b2-198bc87e.dmp
Only C/C++ crashes are caught for now
$ $CLI_BREAKPAD/breakpad/linux/$(arch)/dump_syms example/build/app/intermediates/cmake/debug/obj/${android_abi}/libquick-breakpad-example.so > libquick-breakpad-example.so.sym
$ uuid=`awk 'FNR==1{print \$4}' libquick-breakpad-example.so.sym`
$ mkdir -p symbols/libquick-breakpad-example.so/$uuid/
$ mv ./libquick-breakpad-example.so.sym symbols/libquick-breakpad-example.so/$uuid/
$ $CLI_BREAKPAD/breakpad/linux/$(arch)/minidump_stackwalk 54ecbb9d-cef5-4fa9-5b6869b2-198bc87e.dmp symbols/ > libquick-breakpad-example.so.log
head -n 20 libquick-breakpad-example.so.log
So the crash is at line 30 of quick_breakpad_example.cpp
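The same three-step symbolication pipeline (dump_syms, lay out a symbols/ tree, minidump_stackwalk) repeats on every platform below; only the tool paths change. The directory layout can be scripted. Here is a minimal sketch (install_sym is a hypothetical helper, not part of the plugin), assuming the first line of a .sym file is the standard `MODULE <os> <arch> <id> <name>` record:

```shell
#!/bin/sh
# install_sym SYMFILE [SYMDIR]
# Reads the MODULE record on line 1 of SYMFILE and moves the file into the
# SYMDIR/<name>/<id>/<name>.sym layout that minidump_stackwalk expects.
install_sym() {
    symfile="$1"
    symdir="${2:-symbols}"
    id=$(awk 'FNR==1 {print $4}' "$symfile")    # module id (uuid)
    name=$(awk 'FNR==1 {print $5}' "$symfile")  # module name
    mkdir -p "$symdir/$name/$id"
    mv "$symfile" "$symdir/$name/$id/$name.sym"
}

# Demo with a fake symbol file:
printf 'MODULE Linux arm64 1234ABCD libquick-breakpad-example.so\n' > demo.sym
install_sym demo.sym symbols
```

After that, `minidump_stackwalk <dump>.dmp symbols/` can resolve frames from the installed module.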
$ flutter devices
1 connected device:
iPhone SE (2nd generation) (mobile) • C7E50B0A-D9AE-4073-9C3C-14DAF9D93329 • ios • com.apple.CoreSimulator.SimRuntime.iOS-14-5 (simulator)
$ device=C7E50B0A-D9AE-4073-9C3C-14DAF9D93329
$ pushd example
$ flutter run -d $device
Running Xcode build...
└─Compiling, linking and signing... 2,162ms
Xcode build done. 6.2s
Lost connection to device.
$ popd
$ data=`xcrun simctl get_app_container booted com.example.quickBreakpadExample data`
$ ls $data/Library/Caches/Breakpad
A1D2CF75-848E-42C4-8F5C-0406E8520647.dmp Config-FsNxCZ
$ cp $data/Library/Caches/Breakpad/A1D2CF75-848E-42C4-8F5C-0406E8520647.dmp .
Runner
Only C/C++/Objective-C crashes are caught for now
$ dsymutil example/build/ios/Debug-iphonesimulator/Runner.app/Runner -o Runner.dSYM
$ $CLI_BREAKPAD/breakpad/mac/dump_syms Runner.dSYM > Runner.sym
$ uuid=`awk 'FNR==1{print \$4}' Runner.sym`
$ mkdir -p symbols/Runner/$uuid/
$ mv ./Runner.sym symbols/Runner/$uuid/
$ $CLI_BREAKPAD/breakpad/mac/$(arch)/minidump_stackwalk A1D2CF75-848E-42C4-8F5C-0406E8520647.dmp symbols > Runner.log
head -n 20 Runner.log
So the crash is at line 11 of AppDelegate.m
rem Command Prompt
> pushd example
> flutter run -d windows
Building Windows application...
dump_path: .
minidump_id: 34cd2b95-aef1-4003-ae75-1c848b18aad2
> popd
> copy example\34cd2b95-aef1-4003-ae75-1c848b18aad2.dmp .
rem Command Prompt
> %CLI_BREAKPAD%\windows\%PROCESSOR_ARCHITECTURE%\dump_syms example\build\windows\runner\Debug\quick_breakpad_example.pdb > quick_breakpad_example.sym
# bash or zsh
$ uuid=`awk 'FNR==1{print \$4}' quick_breakpad_example.sym`
$ mkdir -p symbols/quick_breakpad_example.pdb/$uuid/
$ mv ./quick_breakpad_example.sym symbols/quick_breakpad_example.pdb/$uuid/
$ $CLI_BREAKPAD/breakpad/linux/$(arch)/minidump_stackwalk 34cd2b95-aef1-4003-ae75-1c848b18aad2.dmp symbols > quick_breakpad_example.log
# bash or zsh
$ head -n 20 quick_breakpad_example.log
So the crash is at line 23 of flutter_windows.cpp
https://github.com/woodemi/quick_breakpad/issues/5
$ pushd example
$ flutter run -d linux
Building Linux application...
Dump path: /tmp/d4a1c6ac-2ad7-4301-c22e3c9b-0a4c5588.dmp
$ popd
$ cp /tmp/d4a1c6ac-2ad7-4301-c22e3c9b-0a4c5588.dmp .
# flutterArch=x64 or arm64
$ $CLI_BREAKPAD/breakpad/linux/$(arch)/dump_syms build/linux/${flutterArch}/debug/bundle/quick_breakpad_example > quick_breakpad_example.sym
$ uuid=`awk 'FNR==1{print \$4}' quick_breakpad_example.sym`
$ mkdir -p symbols/quick_breakpad_example/$uuid/
$ mv ./quick_breakpad_example.sym symbols/quick_breakpad_example/$uuid/
$ $CLI_BREAKPAD/breakpad/linux/$(arch)/minidump_stackwalk d4a1c6ac-2ad7-4301-c22e3c9b-0a4c5588.dmp symbols/ > quick_breakpad_example.log
head -n 20 quick_breakpad_example.log
So the crash is at line 19 of my_application.cc
Run this command:
With Flutter:
$ flutter pub add quick_breakpad
This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
quick_breakpad: ^0.3.0
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Now in your Dart code, you can use:
import 'package:quick_breakpad/quick_breakpad.dart';
import 'package:flutter/material.dart';
import 'dart:async';
import 'package:flutter/services.dart';
import 'package:quick_breakpad/quick_breakpad.dart';
void main() {
runApp(MyApp());
}
class MyApp extends StatefulWidget {
@override
_MyAppState createState() => _MyAppState();
}
class _MyAppState extends State<MyApp> {
String _platformVersion = 'Unknown';
@override
void initState() {
super.initState();
initPlatformState();
}
// Platform messages are asynchronous, so we initialize in an async method.
Future<void> initPlatformState() async {
String platformVersion;
// Platform messages may fail, so we use a try/catch PlatformException.
try {
platformVersion = await QuickBreakpad.platformVersion;
} on PlatformException {
platformVersion = 'Failed to get platform version.';
}
// If the widget was removed from the tree while the asynchronous platform
// message was in flight, we want to discard the reply rather than calling
// setState to update our non-existent appearance.
if (!mounted) return;
setState(() {
_platformVersion = platformVersion;
});
}
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
appBar: AppBar(
title: const Text('Plugin example app'),
),
body: Center(
child: Text('Running on: $_platformVersion\n'),
),
),
);
}
}
Download Details:
Author: woodemi
Source Code: https://github.com/woodemi/quick_breakpad
Distributed R is a scalable high-performance platform for the R language. It enables and accelerates large scale machine learning, statistical analysis, and graph processing.
The Distributed R platform exposes data structures, such as distributed arrays, to store data across a cluster. Arrays act as a single abstraction to efficiently express both machine learning algorithms, which primarily use matrix operations, and graph algorithms, which manipulate the graph’s adjacency matrix. In addition to distributed arrays, the platform also provides distributed data frames, lists and loops.
Using Distributed R constructs, data can be loaded in parallel from any data source. Distributed R also provides a parallel data loader from the Vertica database; please see the vRODBC repository.
Distributed R is delivered in a single, easy-to-install tar file. The installation tool "distributedR_install" installs the platform and all parallel algorithm R packages. You can register and get the tar file here.
You can also get a Virtual Machine with everything installed here.
On Ubuntu:
$ sudo apt-get install -y make gcc g++ libxml2-dev rsync bison byacc flex
On CentOS:
$ sudo yum install -y make gcc gcc-c++ libxml2-devel rsync bison byacc flex
On Ubuntu:
$ echo "deb http://cran.r-project.org/bin/linux/ubuntu trusty/" | sudo tee /etc/apt/sources.list.d/r.list
$ sudo apt-get update
$ sudo apt-get install -y --force-yes r-base-core
On CentOS:
$ curl -O http://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
$ sudo rpm -i epel-release-latest-7.noarch.rpm
$ sudo yum update
$ sudo yum install R R-devel
Install R dependencies:
$ sudo R # to install globally
R> install.packages(c('Rcpp','RInside','XML','randomForest','data.table'))
Compile and install Distributed R:
$ R CMD INSTALL platform/executor
$ R CMD INSTALL platform/master
Or directly from the R console:
R> devtools::install_github('vertica/DistributedR',subdir='platform/executor')
R> devtools::install_github('vertica/DistributedR',subdir='platform/master')
Open R and run an example:
library(distributedR)
distributedR_start() # start DR
distributedR_status()
B <- darray(dim=c(9,9), blocks=c(3,3), sparse=FALSE) # create a darray
foreach(i, 1:npartitions(B),
init<-function(b = splits(B,i), index=i) {
b <- matrix(index, nrow=nrow(b), ncol=ncol(b))
update(b)
}) # initialize it
getpartition(B) # collect darray data
distributedR_shutdown() # stop DR
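The foreach above fills each 3x3 block of the 9x9 darray with its (1-based) partition index. As a quick illustration of that block layout (a sketch only, independent of Distributed R; whether partitions are numbered row- or column-major is an implementation detail, and this assumes column-major, matching R's matrix convention):

```shell
# Prints a 9x9 grid where each 3x3 block holds its (1-based) partition index.
darray_blocks() {
  awk 'BEGIN {
    for (r = 0; r < 9; r++) {
      line = ""
      for (c = 0; c < 9; c++) {
        # column-major partition numbering (assumption, see note above)
        p = int(c / 3) * 3 + int(r / 3) + 1
        line = line p " "
      }
      print line
    }
  }'
}
darray_blocks
```

getpartition(B) on the real darray would return the assembled 9x9 matrix with this block structure.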
You can help us in different ways. In order to contribute to the code base of this project, you must agree to the Developer Certificate of Origin (DCO) 1.1 for this project under GPLv2+:
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I have the
right to submit it under the open source license indicated in the file; or
(b) The contribution is based upon previous work that, to the best of my
knowledge, is covered under an appropriate open source license and I
have the right under that license to submit that work with modifications,
whether created in whole or in part by me, under the same open source
license (unless I am permitted to submit under a different license),
as indicated in the file; or
(c) The contribution was provided directly to me by some other person who
certified (a), (b) or (c) and I have not modified it.
(d) I understand and agree that this project and the contribution are public and
that a record of the contribution (including all personal information I submit
with it, including my sign-off) is maintained indefinitely and may be
redistributed consistent with this project or the open source license(s) involved.
To indicate acceptance of the DCO you need to add a Signed-off-by line to every commit, e.g.:
Signed-off-by: John Doe <john.doe@hisdomain.com>
To automatically add that line, use the -s switch when running git commit:
$ git commit -s
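If you tend to forget the -s flag, a commit-msg hook can reject commits that lack the sign-off. A minimal sketch (a hypothetical hook, not part of this repository):

```shell
#!/bin/sh
# .git/hooks/commit-msg — Git passes the path of the commit message file as $1.
# Rejects the commit if no Signed-off-by line is present.
check_signoff() {
    if grep -q '^Signed-off-by: ' "$1"; then
        return 0
    fi
    echo 'commit rejected: missing Signed-off-by line (use git commit -s)' >&2
    return 1
}

# Demo against two sample messages:
printf 'Fix bug\n\nSigned-off-by: John Doe <john.doe@hisdomain.com>\n' > signed.txt
printf 'Fix bug\n' > unsigned.txt
check_signoff signed.txt
check_signoff unsigned.txt || echo 'unsigned message would be rejected'
```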
Author: Vertica
Source Code: https://github.com/vertica/DistributedR
License: GPL-2.0 license