Address Challenges with Community Metrics

Consider this advice for addressing the organizational and technical challenges of implementing community health metrics for your own community.

The previous two articles in this series looked at open source community health and the metrics used to understand it. They showed examples of how open source communities have measured their health through metrics. This final article brings those ideas together, discussing the challenges of implementing community health metrics for your own community.

Organizational challenges

First, you must decide which metrics you want to examine. This requires understanding the questions you have about reaching your goals as a community. The metrics relevant to you are the ones that can answer those questions. Otherwise, you risk being overwhelmed by the amount of data available.

Second, you need to anticipate how you want to react to the metrics. This is about making decisions based on what your data shows you. For example, this includes managing engagement with other community members, as discussed in previous articles.

Third, you must differentiate between good and bad results in your metrics. A common pitfall is to compare your community to other communities, but the truth is that every community works and behaves differently. You can't necessarily even compare metrics within the same project. For example, you may be unable to compare the number of commits in repositories within the same project because one may be squashing commits while the other might have hundreds of micro commits. You can establish a baseline of where you are and have been and then see whether you've improved over time.

Privacy

The final organizational challenge I want to discuss is the handling of Personally Identifiable Information (PII). One of open source's core values and strengths is the transparency of how contributors work. This means everyone has information about who's engaged, including their name, email address, and possibly other information. There are ethical considerations about using that data.

In recent years, regulations like the European General Data Protection Regulation (GDPR) have defined legal requirements for what you can and cannot do with PII data. The key question is whether you need to ask everyone's permission to process their data. This is an opt-in strategy. On the other hand, you might choose to use the data and provide an opt-out process.

This distinction is important. For instance, suppose you're providing metrics and dashboards as a service to your community. In an effort to improve the community, you might make the case that the (already publicly available) information has greater value for the community once it's processed. Either way, make it clear what data you use and how you use it.

Technical challenges

Where is your community data being collected? To answer this, consider all the places and platforms your community is engaging in. This includes the software repository, whether it's GitLab, GitHub, Bitbucket, Codeberg, or just a mailing list and a Git server. It may also include issue trackers, a change request workflow system like Gerrit, or a wiki.

But don't stop at the software development interactions. Where else does the community exist? These could be forums, mailing lists, instant messaging channels, question-and-answer sites, or meetups. There's a lot of activity in open source communities that doesn't strictly involve software development work but that you want to recognize in your metrics. These non-coding activities may be hard to track automatically, but you should pay special attention to them or risk ignoring important community members.

With all of these considerations addressed, it's time to take action.

1. Retrieve the data

Once you've identified the data sources, you must get the data and make it useful. Collecting raw data is almost always the easiest step. You have logs and APIs for that. Once set up, the (hopefully occasional) main challenge is when APIs and log formats change.
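As a rough illustration of this step, the following Python sketch pulls recent commit records from a GitHub repository through the public REST API and stores the raw responses for later enrichment. The repository name and output file are hypothetical, and the requests library is assumed to be available; treat this as a starting point rather than a finished collector.

# Sketch: collect raw commit data from a public GitHub repository.
# The repository below is hypothetical; unauthenticated API calls are rate limited.
import json
import requests

REPO = "example-org/example-project"  # hypothetical repository
URL = f"https://api.github.com/repos/{REPO}/commits"

def fetch_commits(per_page=100, max_pages=5):
    """Return a list of raw commit records from the GitHub REST API."""
    commits = []
    for page in range(1, max_pages + 1):
        resp = requests.get(URL, params={"per_page": per_page, "page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        commits.extend(batch)
    return commits

if __name__ == "__main__":
    raw = fetch_commits()
    # Keep the raw responses on disk so the enrichment steps can be re-run later.
    with open("commits_raw.json", "w") as fh:
        json.dump(raw, fh)
    print(f"Collected {len(raw)} commits")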

2. Data enrichment

Once you have the data, you probably need to enrich it.

First, you must unify the data. This step includes converting data into a standard format, which is no small feat. Just think of all the different ways to express a simple date. The order of the year, month, and day varies between regions; dates may use dots, slashes, or other symbols, or they can be expressed in the Unix epoch. And that's just a timestamp!

Whatever your raw data format is, make it consistent for analysis. You also want to determine the level of detail. For example, when you look at a Git log, you may only be interested in when a commit was made and by whom, which is high-level information. Then again, maybe you also want to know what files were touched or how many lines were added and removed. That's a detailed view.
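To make the timestamp problem concrete, here is a minimal Python sketch that normalizes a few common representations into UTC ISO 8601 strings. The list of accepted formats is an assumption for illustration, and naive date strings are treated as UTC, which a real pipeline would need to verify.

# Sketch: normalize heterogeneous timestamps into UTC ISO 8601 strings.
from datetime import datetime, timezone

KNOWN_FORMATS = [
    "%Y-%m-%d %H:%M:%S",   # 2023-04-01 13:37:00
    "%d.%m.%Y %H:%M",      # 01.04.2023 13:37
    "%m/%d/%Y %I:%M %p",   # 04/01/2023 01:37 PM
]

def normalize_timestamp(value):
    """Accept a Unix epoch or a known date string; return ISO 8601 in UTC."""
    if isinstance(value, (int, float)):
        return datetime.fromtimestamp(value, tz=timezone.utc).isoformat()
    for fmt in KNOWN_FORMATS:
        try:
            # Naive strings are assumed to be UTC for this illustration.
            return datetime.strptime(value, fmt).replace(tzinfo=timezone.utc).isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized timestamp: {value!r}")

print(normalize_timestamp(1650410160))
print(normalize_timestamp("01.04.2023 13:37"))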

You may also want to track metadata about different contributions. This may involve adding contextual information on how the data was collected or the circumstances under which it was created. For example, you could tag contributions made during the Hacktoberfest event.
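A small sketch of what such tagging might look like, with a hypothetical date window and record layout:

# Sketch: tag contributions with contextual metadata such as an event window.
# The date range and record fields are assumptions for illustration.
from datetime import date

HACKTOBERFEST_2022 = (date(2022, 10, 1), date(2022, 10, 31))

def tag_contribution(record):
    """Attach a list of context tags to an enriched contribution record."""
    tags = []
    start, end = HACKTOBERFEST_2022
    if start <= record["date"] <= end:
        tags.append("hacktoberfest-2022")
    record["tags"] = tags
    return record

print(tag_contribution({"author": "alice", "date": date(2022, 10, 15)}))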

Finally, standardize the data into a format suitable for analysis and visualization.

When you care about who is active in your community (and possibly what organizations they work for), you must pay special attention to identity. This can be a challenge because contributors may use different usernames and email addresses across the various platforms. You need a mechanism to track an individual by several online identifiers, such as an issue tracker, mailing list, and chat.
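One minimal way to approach this, sketched below with a hypothetical hand-maintained mapping, is to resolve every platform-specific identifier to a canonical profile before counting anything. Dedicated tooling (for example, the identity management built into GrimoireLab) handles this at scale, so treat the sketch as an illustration of the idea only.

# Sketch: resolve platform-specific identifiers to one canonical identity.
# The mapping is hypothetical; real projects maintain it in a reviewed data store.
IDENTITY_MAP = {
    "alice@example.org": "alice",   # mailing list address
    "alice-dev": "alice",           # issue tracker username
    "alice_chat": "alice",          # chat handle
    "bob@example.org": "bob",
}

def canonical_identity(platform_id):
    """Return the unified identity, falling back to the raw identifier."""
    return IDENTITY_MAP.get(platform_id, platform_id)

events = [("alice-dev", "issue"), ("alice@example.org", "email"), ("bob@example.org", "email")]
active_people = {canonical_identity(pid) for pid, _ in events}
print(active_people)  # {'alice', 'bob'}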

You can also pre-process data and calculate metrics during the data enrichment phase. For example, the original raw data may have a timestamp of when an issue was opened and closed, but you really want to know the number of days the issue has been open. You may also have categorization criteria for contributions, such as identifying which contribution came from a core contributor, who's been doing a lot in a project, how many "fly by" contributors show up and then leave, and so on. Doing these calculations during the enrichment phase makes it easier to visualize and analyze the data and requires less overhead at later stages.
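For example, a derived "days open" field and a rough contributor category can be computed once during enrichment, as in this sketch (the field names and thresholds are assumptions to adapt to your own data):

# Sketch: pre-compute derived metrics during enrichment.
# Field names, thresholds, and the example record are assumptions.
from datetime import datetime

def days_open(issue):
    """Days between opening and closing an issue (None if still open)."""
    if issue.get("closed_at") is None:
        return None
    return (issue["closed_at"] - issue["opened_at"]).days

def contributor_type(commit_count):
    """Rough categorization; tune the thresholds to your own community."""
    if commit_count >= 50:
        return "core"
    if commit_count > 1:
        return "regular"
    return "fly-by"

issue = {"opened_at": datetime(2023, 1, 2), "closed_at": datetime(2023, 1, 9)}
print(days_open(issue))        # 7
print(contributor_type(1))     # fly-by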

3. Make data useful

Now that your data is ready, you must decide how to make it useful. This involves figuring out who the user of the information is and what they want to do with it. This helps determine how to present and visualize the data. One thing to remember is that the data may be interesting but not impactful by itself. The best way to use the data is to make it part of a story about your community.

You can use the data in two ways to tell your community story:

  • Have a story in mind, and then verify that the data supports how you perceive the community. You can use the data as evidence to corroborate the story. Of course, you should look for evidence that your story is incorrect and try to refute it, similar to how you make a scientific hypothesis.
  • Use data to find anomalies and interesting developments you wouldn't have otherwise observed. The results can help you construct a data-driven story about the community by providing a new perspective that perhaps has outgrown casual observation.
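As a small sketch of the second approach, you can flag periods whose activity deviates sharply from the norm and then investigate what happened. The monthly commit counts below are invented for illustration; a real pipeline would read them from the enriched data.

# Sketch: flag months whose activity deviates strongly from the mean.
from statistics import mean, stdev

monthly_commits = [42, 38, 45, 40, 44, 39, 41, 95, 43, 40]  # invented example data

avg = mean(monthly_commits)
sd = stdev(monthly_commits)

anomalies = [
    (month, count)
    for month, count in enumerate(monthly_commits)
    if abs(count - avg) > 2 * sd
]
print(anomalies)  # [(7, 95)] -- a spike worth a closer look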

Solve problems with open source

Before you tackle these challenges on your own, here's the good news: you're working in open source, and others have already solved many of the problems you're facing. There are several open source solutions available to you:

  • CHAOSS GrimoireLab: The industry standard and enterprise-ready solution for community health analytics.
  • CHAOSS Augur: A research project with a well-defined data model and bleeding-edge functionality for community health analytics.
  • Apache Kibble: The Apache Software Foundation's solution for community health analytics.
  • CNCF Dev Analytics: CNCF's GitHub statistics for community health analytics.

To overcome organizational challenges, rely on the CHAOSS Project, a community of practice around community health.

The important thing to remember is that you and your community aren't alone. You're a part of a larger community that's constantly growing.

I've covered a lot in the past three articles. Here's what I hope you take away:

  • Use metrics to identify where your community needs help.
  • Track whether specific actions lead to changes.
  • Track metrics early, and establish a baseline.
  • Gather the easy metrics first, and get more sophisticated later.
  • Present metrics in context. Tell a story about your community.
  • Be transparent with your community about metrics. Provide a public dashboard and publish reports.

Original article source at: https://opensource.com/

#challenge #community #metrics 


Laravel.io: The Laravel.io Community Portal

Laravel.io

This is the repository for the Laravel.io community portal. The code is entirely open source and licensed under the MIT license. We welcome your contributions but we encourage you to read the contributing guide before creating an issue or sending in a pull request. Read the installation guide below to get started with setting up the app on your machine.

Sponsors

We'd like to thank these amazing companies for sponsoring us. If you are interested in becoming a sponsor, please visit the Laravel.io GitHub Sponsors page.

Requirements

The following tools are required in order to start the installation.

Installation

Note that you're free to adjust the ~/Sites/laravel.io location to any directory you want on your machine. In doing so, be sure to run the valet link command inside the desired directory.

  1. Clone this repository with git clone git@github.com:laravelio/laravel.io.git ~/Sites/laravel.io
  2. Run composer install to install the PHP dependencies
  3. Set up a local database called laravel
  4. Run composer setup to set up the application
  5. Set up a working e-mail driver like Mailtrap
  6. Run valet link to link the site to a testing web address
  7. Configure the (optional) features from below

You can now visit the app in your browser at http://laravel.io.test. If you seeded the database, you can log in to a test account with the username testing and the password password.

GitHub Authentication (optional)

To get GitHub authentication to work locally, you'll need to register a new OAuth application on GitHub. Use http://laravel.io.test for the homepage URL and http://laravel.io.test/auth/github for the callback URL. When you've created the app, fill in the ID and secret in your .env file using the env variables below. You should now be able to authenticate with GitHub.

GITHUB_ID=
GITHUB_SECRET=
GITHUB_URL=http://laravel.io.test/auth/github

Algolia Search (optional)

To get Algolia search running locally, you'll need to register for a new account and create an index called threads. Algolia has a free tier that satisfies all of the requirements needed for a development environment. Now update the variables below in your .env file. The App ID and secret keys can be found in the API Keys section of the Algolia UI.

SCOUT_DRIVER=algolia
SCOUT_QUEUE=true

ALGOLIA_APP_ID=
ALGOLIA_SECRET="Use the Write API Key"

VITE_ALGOLIA_APP_ID="${ALGOLIA_APP_ID}"
VITE_ALGOLIA_SECRET="Use the Search API Key"
VITE_ALGOLIA_INDEX=threads

In order to index your existing threads, run the following command:

php artisan scout:import App\\Models\\Thread

New threads will be automatically added to the index and threads which get updated will be automatically synced. If you need to flush your index and start again, you can run the following command:

php artisan scout:flush App\\Models\\Thread

Twitter Sharing (optional)

To enable published articles to be automatically shared on Twitter, you'll need to create a Twitter app. Once the app has been created, update the variables below in your .env file. The consumer key, consumer secret, access token, and access token secret can be found in the Keys and tokens section of the Twitter developer UI.

TWITTER_CONSUMER_KEY=
TWITTER_CONSUMER_SECRET=
TWITTER_ACCESS_TOKEN=
TWITTER_ACCESS_SECRET=

Approved articles are shared in the order they were submitted for approval. Articles are shared twice per day at 14:00 and 18:00 UTC. Once an article has been shared, it will not be shared again.

Telegram Notifications (optional)

Laravel.io can notify maintainers of newly submitted articles through Telegram. For this to work, you'll need to set up a Telegram bot and obtain a token. Then, configure the channel you want to send new article messages to.

TELEGRAM_BOT_TOKEN=
TELEGRAM_CHANNEL=

Fathom Analytics (optional)

To enable view counts on articles, you'll need to register a Fathom Analytics account and install it on the site. You will then need to create an API token and find your site ID before updating the below environment variables in your .env file.

FATHOM_SITE_ID=
FATHOM_TOKEN=

Commands

Command                             Description
vendor/bin/pest -p                  Run the tests with parallel execution
php artisan migrate:fresh --seed    Reset the database
npm run dev                         Build and watch for changes in CSS and JS files

Maintainers

The Laravel.io portal is currently maintained by Dries Vints and Joe Dixon. If you have any questions please don't hesitate to create an issue on this repo.

Contributing

Please read the contributing guide before creating an issue or sending in a pull request.

Code of Conduct

Please read our Code of Conduct before contributing or engaging in discussions.

Security Vulnerabilities

Please review our security policy on how to report security vulnerabilities.

Download Details:

Author: laravelio
Source Code: https://github.com/laravelio/laravel.io 
License: MIT license

#php #laravel #community 

Community: Discussion, Support and Common information for Projects

community

Discussion, support and common information for projects in the community.

FAQ

What is Level?

Level is a collection of Node.js modules for creating transparent databases. A solid set of primitives enable powerful databases to be built in userland. They can be embedded or networked, persistent or transient - in short, tailored to your needs.

At the heart of Level are key-value databases that follow the characteristics of LevelDB. They support binary keys and values, batched atomic writes and bi-directional iterators that read from a snapshot in time. Entries are sorted lexicographically by keys which, when combined with ranged iterators, makes for a powerful query mechanism. Level combines idiomatic JavaScript interfaces like async iterators with Node.js interfaces like streams, events and buffers. It offers a rich set of data types through encodings and can split a database into evented sections called sublevels.

The underlying storage can be easily swapped by selecting a different database implementation, all sharing a common API and the same characteristics. Together they target a wide range of runtime environments: Node.js and Electron on Linux, Mac OS, Windows and FreeBSD, including ARM platforms like Raspberry Pi and Android, as well as Chrome, Firefox, Edge, Safari, iOS Safari and Chrome for Android.

Where do I start?

The level module is the recommended way to get started. It offers a persistent database that works in both Node.js and browsers, backed by LevelDB and IndexedDB respectively. Many alternatives are available. For example, memory-level is an in-memory database backed by a red-black tree. Visit Level/awesome to discover more modules.

What is abstract-level?

If you are new to Level, there is a quick answer: abstract-level is the new core of Level on top of which several databases are (or will be) implemented. Read on if you're already familiar with Level modules (before 2022) and have used level, levelup, abstract-leveldown, encoding-down or deferred-leveldown.

Back in 2012, levelup offered a Node.js binding for Google's LevelDB. Authored by Rod Vagg, levelup exposed the features of LevelDB in a Node.js-friendly way. It had streams, binary support, encodings... all the goodies. Later on, the binding was moved to leveldown, so that other stores could be swapped in while retaining the friendly API of levelup.

This is when "up" vs "down" naming was born, where databases followed the formula of "level = levelup + leveldown". For example, level-mem was a convenience package that bundled levelup with memdown. The abstract-leveldown module offered a lower-level abstraction for the "down" part, to encapsulate common logic between "down" stores. Many such stores were written, replacing LevelDB with IndexedDB, RocksDB, in-memory red-black trees, relational databases and more.

Around 2017, further parts were extracted from levelup and moved to single-purpose modules. This effectively introduced the concept of "layers", where an implementation of abstract-leveldown wasn't necessarily a storage for levelup but could also wrap another abstract-leveldown implementation. For example, levelup encoding logic was extracted to encoding-down. This changed the database formula to "level = levelup + encoding-down + leveldown". Or in other words: "levelup + layer + layer".

This highly modular architecture led to clean code, where each module had a single responsibility. By this time, the overall API had settled and matured, some contributors moved on to other exciting things and the primary remaining effort was maintenance. This posed new challenges. We worked on test suites, added automated browser tests, code coverage and database manifests.

Yet, releases too often required canary testing in dependents. It was hard to predict the effect of a change. In addition, documentation became fragmented and some modules actually suffered from the high modularity, having to peel off layers to customize behavior. At the same time, we could see that typical usage of a Level database still involved encodings and the other goodies that the original levelup had.

Enter abstract-level. This module merges levelup, encoding-down and abstract-leveldown into a single codebase. Instead of implementing behaviors "vertically" in layers, it is done per database method. Performance-wise, abstract-level is on par with the old modules. GC pressure is lower because methods allocate fewer callback functions. Custom (userland) database methods also benefit from the new architecture, because they can reuse utility methods included in abstract-level rather than a layer having to detect and wrap custom methods.

Lastly, abstract-level comes with new features, some of which were not possible to implement before. Among them: Uint8Array support, built-in sublevels, atomically committing data to multiple sublevels, and reading multiple or all entries from an iterator in one call.

How do I upgrade to abstract-level?

We've put together several upgrade guides for different modules. For example, if you're currently using level@7 and no other modules (ignoring transitive dependencies) then it will suffice to read the upgrade guide of level@8.

Naming-wise, databases generally use an npm package name in the form of *-level while utilities and plugins are called level-*. This replaces the down versus up naming scheme. Similarly, while it was previously helpful for documentation to distinguish between "database" and its "underlying store", now you will mostly just encounter the term "database".

To upgrade, please consult the following table. If you use a combination of the modules listed here, each must be upgraded to its abstract-level equivalent.

Old module                       New module             Named export (3)   Upgrade guide
level <= 7                       level >= 8             Level              level@8
abstract-leveldown               abstract-level         AbstractLevel      abstract-level@1
levelup                          n/a                    n/a                Depends (2)
level or levelup with streams    level-read-stream      EntryStream        level-read-stream@1
leveldown                        classic-level          ClassicLevel       classic-level@1
level-mem                        memory-level           MemoryLevel        memory-level@1
memdown                          memory-level           MemoryLevel        memory-level@1
level-js                         browser-level          BrowserLevel       browser-level@1
level-rocksdb                    rocks-level            RocksLevel         Not yet available
rocksdb                          rocks-level            RocksLevel         Not yet available
multileveldown                   many-level             ManyLevelGuest     many-level@1
level-party                      rave-level             RaveLevel          Not yet available
subleveldown (1)                 n/a                    n/a                abstract-level@1
deferred-leveldown (1)           n/a                    n/a                abstract-level@1
encoding-down (1)                n/a                    n/a                abstract-level@1
level-errors (1)                 n/a                    n/a                abstract-level@1
level-packager                   n/a                    n/a                n/a
level-supports <= 2              level-supports >= 3    supports           n/a
level-codec (4)                  level-transcoder       Transcoder         level-transcoder@1
level-test                       n/a                    n/a                Not yet available
  1. Functionality is now included in abstract-level.
  2. If the module that you're wrapping with levelup is listed here then refer to that module's upgrade guide, else see abstract-level@1.
  3. Most new modules use named exports, for example const { ClassicLevel } = require('classic-level') instead of const leveldown = require('leveldown').
  4. Encodings that follow the level-codec interface (without level-codec as a dependency) can still be used.

Where can I get support?

If you need help - technical, philosophical or other - feel free to open an issue in community or a more specific repository. We don't (yet) use GitHub Discussions, at least until discussions get the ability to close them.

You will generally find someone willing to help. Good questions get better and quicker answers. We do not offer paid support. All time is volunteered.

Where can I follow progress?

Most if not all activity happens on GitHub. See our project board to find out what we're working on. Any timelines there are just a rough indication of priority. We cannot guarantee that feature X or Y will actually be released on the given dates.

Subscribe to individual repositories to follow their progress. All releases are accompanied by a changelog and a GitHub Release, which gives you the option to only subscribe to new releases.

People

Collaborators

Collaborator emeriti

Contributors

Is your name missing? Send us a pull request!

API

This repository also used to hold a small amount of metadata on past and present contributors, which can be accessed from code:

console.log(require('level-community'))

This metadata is no longer maintained and the npm package will be deprecated at some point. Contributors are instead documented in this README under People.

Contributing

Level/community is an OPEN Open Source Project. This means that:

Individuals making significant and valuable contributions are given commit-access to the project to contribute as they see fit. This project is more like an open wiki than a standard guarded open source project.

See the Contribution Guide for more details.

Author: Level
Source Code: https://github.com/Level/community 
License: MIT license

#javascript #node #community #project 


GlobaLeaks | Web Application to Enable anonymous & Secure Reporting

GlobaLeaks

GlobaLeaks is free, open source software enabling anyone to easily set up and maintain a secure whistleblowing platform.

Community support

If you need technical support, have general questions, or have new ideas for GlobaLeaks, please post your message on the Community Forum.

Join our Slack to get in touch with the development team and the GlobaLeaks community:

  • #development to participate in development discussions
  • #community-support for the community support

You can also join our development discussions on IRC, in the #globaleaks channel on the OFTC network.

If you want to contribute to the project please check the Contributors Guidelines.

If you need to file a security report, please check our Security Policy.

Brand guidelines and brand assets

Within the GlobaLeaks project, we developed a clean and consistent brand style, using accessible colors and aiming to communicate our values. If you are planning a press release, a conference, or any promotion of GlobaLeaks, please refer to our official Brand Guidelines and use our Brand Assets.

Continuous integration and testing

Build status, code quality, coverage, and documentation badges are maintained for the main and devel branches.

Project best practices and scores are tracked via the Mozilla HTTP Observatory, Security Headers, SSL Labs, and the CII Best Practices badge.

Project statistics on OpenHub: www.openhub.net/p/globaleaks

Infrastructure status: uptime.globaleaks.org

Documentation

The GlobaLeaks documentation is available at: docs.globaleaks.org

Download Details:
Author: globaleaks
Source Code: https://github.com/globaleaks/GlobaLeaks
License: View license

#python


EXist Native XML Database and Application Platform Written in Java

eXist-db is a high-performance open source native XML database—a NoSQL document database and application platform built entirely around XML technologies. The main homepage for eXist-db can be found at exist-db.org. This is the GitHub repository of eXist source code, and this page links to resources for downloading, building, and contributing to eXist-db, below.

The eXist-db community has adopted the Contributor Covenant Code of Conduct.

Open Community Calls

We hold an open Community Call each week on Monday, from 19:30-20:30 CET. The meetings are posted to this public Google Calendar.

If you wish to participate, please join the #community channel on our Slack workspace (invitation link below). Pinned to that channel is a link to the upcoming meeting's agenda, which contains the link to the call, as well as a link to timeanddate.com to look up the time of the meeting for your local time zone.

The notes of past Community Calls are located here.

Resources

New developers may find the notes in BUILD.md and CONTRIBUTING.md helpful for getting started and sharing their work with the eXist community.

Credits

The eXist-db developers use the YourKit Java Profiler.


YourKit kindly supports open source projects with its full-featured Java Profiler. YourKit, LLC is the creator of YourKit Java Profiler and YourKit .NET Profiler, innovative and intelligent tools for profiling Java and .NET applications.


Cross-browser Testing Platform and Open Source <3 Provided by Sauce Labs

Download Details:
Author: eXist-db
Source Code: https://github.com/eXist-db/exist
License: LGPL-2.1 license

#database  #java 


The Featured Effects Available in Xamarin Community Toolkit

In this post, we will look at the featured effects available in the Xamarin Community Toolkit and how to use them. This post is part of the Xamarin Community Toolkit - Tutorial Series; visit that post for an overview of the Xamarin Community Toolkit. The Xamarin Community Toolkit is a collection of reusable elements for mobile development with Xamarin.Forms, including animations, behaviors, converters, effects, and helpers. It simplifies and demonstrates common developer tasks when building iOS, Android, macOS, WPF, and Universal Windows Platform (UWP) apps with Xamarin.Forms.

Coding part

Steps

  1. Step 1: Create new Xamarin.Forms projects.
  2. Step 2: Set up the Xamarin Community Toolkit in the Xamarin.Forms .NET Standard project.
  3. Step 3: Implement effects using the Xamarin Community Toolkit.

Step 1: Create new Xamarin.Forms projects

Create a new project by selecting New Project, choose Xamarin Cross Platform App, and click OK.

Note: the Xamarin.Forms version must be higher than 5.0.

Then select the Android and iOS platforms as shown below, with PCL or .NET Standard as the code-sharing strategy, and click OK.

Step 2: Set up the Xamarin Community Toolkit in the Xamarin.Forms .NET Standard project

In this step, we will see how to set up the plugin.

  • Open the NuGet Manager in your Visual Studio solution by right-clicking the solution and selecting "Manage NuGet Packages".

  • Then select "Xamarin Community Toolkit", check all the projects in the solution, and install the plugin.

Step 3: Implement effects using the Xamarin Community Toolkit

In this step, we will see how to implement the featured effects offered in the Xamarin Community Toolkit. Here, we cover the implementation of the Safe Area Effect, Shadow Effect, Lifecycle Effect, and StatusBar Effect.

  • Open your XAML layout file and add the following namespace to use the views on the page.
xmlns:xct="http://xamarin.com/schemas/2020/toolkit"

SafeAreaEffect

  • SafeAreaEffect is an effect that can be added to any element via an attached property to indicate whether that element should take the current safe areas into account.
  • The safe area is the region of the screen that is safe to draw in on all devices running iOS 11 and above.

Specifically, it helps ensure that content is not clipped by a device's rounded corners, the home indicator, or the sensor housing on an iPhone X. The effect only targets iOS, which means it does nothing on other platforms.

Properties

<StackLayout xct:SafeAreaEffect.SafeArea="True" BackgroundColor="White"> </StackLayout>

SafeArea

Indicates which safe areas should be taken into account for this element.

Before applying the effect

After applying the effect

ShadowEffect

ShadowEffect adds a drop shadow to Xamarin.Forms views. There are five properties you need to understand in order to use this effect.

Properties

  1. Color: The color the shadow will have.
  2. Opacity: Controls how opaque you want the shadow to be.
  3. Radius: Controls the blur of the shadow.
  4. OffsetX/OffsetY: Define the displacement of the shadow; OffsetX specifies the horizontal offset distance, while OffsetY specifies the vertical offset.

In this example, I applied the shadow effect to an Image control as in the following code block.

<Image
	x:Name="img"
	HeightRequest="150"
	Margin="10"
	xct:ShadowEffect.Color="Green"
	xct:ShadowEffect.OffsetY="15"
	Source="https://shorturl.at/qsvJ1">
</Image>

LifecycleEffect

LifecycleEffect lets you determine when the platform assigns its renderer to a VisualElement. This can be detected through the LifecycleEffect event handlers.

<Image
	x:Name="img"
	HeightRequest="150"
	Margin="10"
	Source="https://shorturl.at/qsvJ1">
	<Image.Effects>
		<xct:LifecycleEffect Loaded="LifeCycleEffect_Loaded" Unloaded="LifeCycleEffect_Unloaded" />
	</Image.Effects>
</Image>


private void LifeCycleEffect_Loaded(object sender, EventArgs e)
{
	Console.WriteLine("Image loaded...");
}

private void LifeCycleEffect_Unloaded(object sender, EventArgs e)
{
	Console.WriteLine("Image Unloaded...");
}

StatusBarEffect

  • This effect is used to control the status bar color of a Xamarin.Forms application at build time, at runtime, or on events such as button clicks. It takes only a single line of code to use.

In this example, we will create a color resource in the App.xaml file as shown below. It will be updated dynamically later.

<Color x:Key="StatusBarColor">Firebrick</Color>

Then add the following line to the root element of your XAML page as a DynamicResource:

xct:StatusBarEffect.Color="{DynamicResource StatusBarColor}"

Then add the three buttons as in the sample screen, and update the color of the dynamic resource on each button click as shown below.

private void ButtonClicked(object sender, EventArgs e)
{
	Application.Current.Resources["StatusBarColor"] = ((Button)sender).TextColor;
}

Complete code

<?xml version="1.0" encoding="utf-8" ?>
<ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             xmlns:xct="http://xamarin.com/schemas/2020/toolkit"
             x:Class="XamarinCommunityToolkit.EffectsSamplePage"
             xct:StatusBarEffect.Color="{DynamicResource StatusBarColor}">
    <ContentPage.Content>
        <StackLayout xct:SafeAreaEffect.SafeArea="True"
                     BackgroundColor="White">

            <Frame BackgroundColor="#2196F3"
                   Padding="24"
                   CornerRadius="0">
                <Label Text="Xamarin Forms Effects using XCT"
                       HorizontalTextAlignment="Center"
                       TextColor="White"
                       FontSize="36"/>
            </Frame>

            <Image
                x:Name="img"
                HeightRequest="150"
                Margin="10"
                xct:ShadowEffect.Color="Green"
                xct:ShadowEffect.OffsetY="15"
                Source="https://shorturl.at/qsvJ1">
                <Image.Effects>
                    <xct:LifecycleEffect Loaded="LifeCycleEffect_Loaded" Unloaded="LifeCycleEffect_Unloaded" />
                </Image.Effects>
            </Image>
            <Grid Padding="10">
                <Button Clicked="ButtonClicked" Text="Red" TextColor="Red" BorderColor="Red" BorderWidth="2" Grid.Column="0"/>
                <Button Clicked="ButtonClicked" Text="Green" TextColor="Green" BorderColor="Green" BorderWidth="2" Grid.Column="1"/>
                <Button Clicked="ButtonClicked" Text="Blue" TextColor="Blue" BorderColor="Blue" BorderWidth="2" Grid.Column="2"/>
            </Grid>
        </StackLayout>
    </ContentPage.Content>
</ContentPage>

The XCT also offers other types of effects. Visit the following link for more samples: https://github.com/xamarin/XamarinCommunityToolkit/tree/main/samples/XCT.Sample/Pages/Effects

Download the code

You can download the code from GitHub. If you have any questions, feel free to post a comment. If you like this article and find it useful, give it a like, share it, and star the repository on GitHub.

Source: https://www.c-sharpcorner.com/article/xamarin-community-toolkit-effects/

#xamarin #community #toolkit #effective 

Learn About TOP Stories From The Microsoft DevOps Community

The top stories from the Azure DevOps #community for 2021.09.03 are here! Welcome to September! Here in Brooklyn, NY, I am getting ready to switch from shorts to jeans and from tank tops to hooded sweatshirts. If you're ready to switch up your computer, …

⭐️ You can see more at the link at the end of the article. Thank you for your interest in the blog. If you find it interesting, please like, comment, and share to show your support for the author.

#azuredevops #azure 


 


What is xHashtag (XTAG) | What is xHashtag token | What is XTAG token

In this article, we'll discuss the xHashtag project and its XTAG token.

What is xHashtag?

xHashtag is a Work 4.0 DAO that lets registered users earn cryptocurrency by choosing from a number of simple on-chain, off-chain and creative tasks available to them. These tasks could range from something as simple as retweeting a given tweet, making an on-chain transaction on a DEX or something more creative — A YouTube Shorts video perhaps.

While this exercise is rewarding for the user, who doesn’t have to pay any fees to be eligible to participate in these tasks, it also opens up a new marketing vehicle for Web3 projects, to help them accelerate their community and token growth. Thus, both the Marketer and the Consumer stand to gain.

Communities: The Backbone of Web 3.0

Web 3.0 projects rely on communities to unlock conversations that drive project growth and adoption. With many projects coming up on Web 3.0, there is intense competition and pressure on the projects to adhere to timelines and deliver a good product. xHashtag automates and amplifies your community messaging through activities that you decide, while letting you focus on your core aim of creating and building your product.

When several people initiate conversations or complete on-chain activities, other interested people are more likely to gain confidence in it, and possibly invest or be a part of that project.

Both Project Marketers and Digital Influencers would agree when we say that digital growth is all about the numbers, and these numbers are an outcome of a homogenous mix of organic and incentivized tasks to drive engagement.

Not having enough on-chain activity can be a critical issue in terms of branding a Web 3.0 product, as it is likely to obtain a lower ranking on websites such as dAppRadar, which offer analytical statistics about these projects.

However, most Web 3.0 projects are clueless about how to go ahead planning the right marketing mix. Would high spending translate to better conversions? Would it be better to simply have a large number of airdrops and giveaways on the platform? Or would it be wise to hand over the marketing duties to a specialized agency?

With xHashtag, projects no longer need to worry about these questions. As a DAO, xHashtag allows projects to create a #Community where they can:

A) Leverage the community talents and skillsets to accelerate project growth

B) Set token-denominated incentives for the community members who add value to the ecosystem

FutureOfWork — Revolutionizing Work through Blockchain

xHashtag strongly believes in the concept of Future of Work, where digital platforms play a major role in shaping how work is executed. This digitization would cause most traditionally established structures such as fixed work location, fixed work hours, and organizational hierarchies to become redundant and obsolete.

We believe that a DAO which automates most of the work done by corporate boards and senior executives, while having no centralized authority to influence its functioning is the way forward. Any major decision regarding the progress of the project would be taken on the basis of participant votes, with users having a higher stake naturally having a bigger say.

Thus, the xHashtag DAO aims to be a non-partisan, fair platform for projects, users and reviewers to collaborate for mutual benefit based on the concept of #PlayToEarn.

From a Project’s Perspective

If you represent a Web 3.0 project, xHashtag is a perfect marketing vehicle for you to drive engagement and growth. Getting started is as simple as 1–2–3.

1) Set your Objective

Whether you want the #community to conduct an on-chain transaction on your dApp or to post a simple tweet, we’ve got you covered.

2) Launch your Campaign

Prior to launching the campaign, you get to choose who exactly you want doing these tasks. You can filter users by preferences such as Public/Staked Users, KYC/Non-KYC, Gender, Age Group, and a lot more. Once you’ve decided on these parameters, simply fund the campaign with your project $tokens.

3) Receive Results

Once the campaign is live, it will run until the DAO distributes all of the allocated project $tokens to users who have completed the tasks set by you. You will also receive a continuous stream of proof that your task is being executed by our decentralized workforce.

From a User’s Perspective

Once you sign up on our platform, you immediately become eligible to receive simple, on-chain or creative tasks based on parameters set by the Project running the campaign. Even here, it is as simple as 1–2–3.

1) Check your Timeline Regularly

Once logged in, the users only need to keep checking the timeline regularly to find out if they are eligible for any task.

2) Complete Tasks on Time

Once a user accepts a task, they would need to complete it within the stipulated time and submit them for review, in order to be eligible for rewards.

3) Earn Rewards as Indicated

Once the DAO reviews the submission, the reward is automatically credited to the user’s wallet with no manual intervention required from either the Project or xHashtag.

From a Reviewer’s Perspective

With great power comes great responsibility. As a Reviewer on the platform, you would need to uphold the highest standards of integrity in verifying user submissions to earn greater rewards. However, to become a Reviewer, you would need to stake $XTAG tokens.

1) Stake $XTAG tokens

Only accounts that have staked $XTAG tokens would be eligible to be a part of the Reviewers panel. Once the $XTAG is staked, Users are upgraded to Reviewers, with the ability to verify submitted tasks.

2) Review User Submissions

The reviewer needs to verify if the user has completed the task as per the campaign requirements, and report this to the DAO.

3) Earn Accelerated Rewards

Reviewers receive additional incentives in exchange for honest reviews. In case the project raises a dispute that a reviewer has approved an incorrect submission, the DAO will automatically slash the stake. Multiple wrong reviews would lead to revoking the ‘Reviewer’ status.

Why xHashtag?

As mentioned above, xHashtag is a platform that has something in it for everyone. The platform offers its users the ability to earn rewards with zero investment or risk, however, staking $XTAG and project $tokens of our partners will allow users to multiply their earnings. We’ll be covering this in detail in a future article, so stay tuned.

Also, by virtue of being a DAO, the possibilities of scaling up xHashtag into something much bigger are endless. As an early adopter, you stand to gain more. So, what are you waiting for? Hurry up and subscribe to be the first one to know about our launch.

The utility of the $XTAG token

Staking $XTAG tokens on the platform serve three major purposes — Governance, Task Eligibility Prioritization, and Reviewer Status Upgrades. While staking isn’t mandatory for a casual user of the platform who simply wants to participate in campaigns without investing (Yes, that’s possible!), it does offer a host of utilitarian benefits for the ones who show their support to our project by staking tokens.

  1. Participatory Governance

On the xHashtag DAO, our native $XTAG token assumes the role of a governance token. This means that staked users on the platform would have a say on every major decision taken concerning the furtherance of the project. Naturally, users with a higher amount of $XTAG staked would have a stronger influence on the voting power for any proposals for the development of the platform.

Staking $XTAG allows users to initiate and participate in what we term an XDP — xHashtag Development Proposal. Through this proposal, users can vote in favour of, or against decisions that shape the outcome of the DAO. A majority approval in terms of voting would be required for a proposal to be accepted and implemented.

2. Task eligibility prioritization

As a free user on the platform who hasn’t staked $XTAG, eligible tasks would show up only after a fixed number of hours. However, by staking $XTAG, users receive tasks on priority, thus ensuring that they are more likely to receive tasks before the campaign’s fund runs out.

Currently, the task delay for free (non-staked) users is set at 3 hours of the campaign going live. For prioritization based on staking, we propose to have 3 staking categories — xWhale, xShark, and xDolph, in the decreasing order of their staking power.

xWhales are the users with the highest stake, currently set at 900 $XTAG. They are eligible to receive tasks without any delay, as soon as a project creates a campaign, subject to these xWhales fulfilling the campaign participation criteria.

xSharks are the mid-level stakers, with a staking value currently set at 600 $XTAG. They are eligible to receive tasks with a delay of 1 hour from campaign set-live, subject to these xSharks fulfilling the campaign participation criteria.

xDolphs receive eligible tasks after 2 hours of the campaign being initiated, and the current staking value to become an xDolph is set at 300 $XTAG.

However, it is important to note that all the above-mentioned staking values and delay durations can be changed by the DAO through a community-approved XDP.

3. Upgrading to task reviewer status

Although certain simple on-chain tasks are automatically approved by the DAO, other complex tasks including off-chain tasks or procedural tasks would need to be manually verified post user submission to ensure that the campaign creator who is funding the campaign is not delivered an unacceptable quality of work, either by omission, oversight or willfully.

By staking $XTAG tokens, users are granted access to a reviewer role. The reviewers, as an integral part of the DAO perform the task review, thus generating consensus regarding task completion. In case of an incorrect or incomplete submission, the reviewer rejects the submission through the DAO, and in the process gets rewarded with $XTAG for every valid review.

For every incorrect review, the reviewer loses a portion of his/her staked $XTAG tokens, which will be used to compensate the project for the incorrect submission. This form of reward and penalty system will ensure that the reviewer is attentive, prudent, and dedicated to the task.

Every user on the platform can become a reviewer, provided they stake the required minimum amount of $XTAG. This minimum can be changed by the DAO through a community-approved XDP.
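A minimal sketch of the reviewer reward-and-penalty mechanic might look like the following. The reward amount, slash fraction, and minimum stake are placeholder assumptions, since the real values are governed by the DAO through XDPs.

```python
# Hypothetical model of the reviewer reward/penalty system (illustrative only).
REVIEW_REWARD_XTAG = 1.0    # assumed reward per valid review
SLASH_FRACTION = 0.10       # assumed share of stake lost per incorrect review
MIN_REVIEWER_STAKE = 500.0  # assumed minimum stake for the reviewer role

class Reviewer:
    def __init__(self, staked_xtag: float):
        if staked_xtag < MIN_REVIEWER_STAKE:
            raise ValueError("stake is below the reviewer threshold")
        self.staked = staked_xtag
        self.rewards = 0.0

    def settle_review(self, review_was_correct: bool) -> None:
        if review_was_correct:
            self.rewards += REVIEW_REWARD_XTAG
        else:
            # the slashed amount compensates the project for the bad submission
            self.staked -= self.staked * SLASH_FRACTION

r = Reviewer(600.0)
r.settle_review(True)   # valid review: earns a reward
r.settle_review(False)  # incorrect review: loses part of the stake
print(r.staked, r.rewards)  # 540.0 1.0
```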

Summary

The $XTAG token is a core utility token on the xHashtag DAO, with staking being the primary use-case. Through staking, users on the platform can benefit from:

  • Participatory access to XDPs, which essentially facilitates governance on the platform
  • Priority access to tasks made available by the Web3 projects on the platform, based on the level of staking
  • Reviewer access to conduct performance checks on completed tasks submitted by the users, in exchange for rewards

The future is massive at xHashtag. Don’t miss out on being an early adopter of this exciting new DAO which is set to revolutionize the #FutureOfWork and help shape the #FutureOfWeb3.

How and Where to Buy XTAG token?

XTAG has been listed on a number of crypto exchanges, but unlike the major cryptocurrencies it cannot be purchased directly with fiat money. However, you can still buy this coin by first buying Bitcoin, ETH, USDT, or BNB on any large exchange and then transferring it to an exchange that trades XTAG. In this guide we will walk you through the steps to buy the XTAG token in detail.

You will first have to buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH), Tether (USDT), or Binance Coin (BNB)…

We will use Binance Exchange here as it is one of the largest crypto exchanges that accept fiat deposits.

Binance is a popular cryptocurrency exchange that started in China but later moved its headquarters to the crypto-friendly island of Malta in the EU. Binance is known for its crypto-to-crypto exchange services. It exploded onto the scene in the mania of 2017 and has since become the top crypto exchange in the world.

Once you have finished the KYC process, you will be asked to add a payment method. Here you can either provide a credit/debit card or use a bank transfer to buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH), Tether (USDT), or Binance Coin (BNB)…

☞ SIGN UP ON BINANCE

Step-by-Step Guide: What is Binance | How to Create an account on Binance (Updated 2021)

Next step - Transfer your cryptos to an Altcoin Exchange

Since XTAG is an altcoin, we need to transfer our coins to an exchange where XTAG can be traded. Below is a list of exchanges that offer XTAG in various market pairs; head to their websites and register for an account.

Once finished, you will need to make a BTC/ETH/USDT/BNB deposit from Binance to that exchange, depending on the available market pairs. After the deposit is confirmed, you can purchase XTAG on the exchange.

The top exchange for trading the XTAG token is currently SolRazr.

Other top exchanges for token and coin trading include Binance, Poloniex, Gate.io, Bitfinex, Huobi, MXC, ProBit, and Coinbase.

🔺DISCLAIMER: The information in this post isn't financial advice and is intended FOR GENERAL INFORMATION PURPOSES ONLY. Trading cryptocurrency is VERY risky. Make sure you understand these risks and that you are responsible for what you do with your money.

🔥 If you’re a beginner, I believe the article below will be useful to you ☞ What You Should Know Before Investing in Cryptocurrency - For Beginner

Find more information about the XTAG token ☞ Website

I hope this post helps you. Don't forget to leave a like, comment, and share it with others. Thank you!

#bitcoin #cryptocurrency

Vinnie Erdman

Learn New Developer Gig!

In this video I’m going to talk a bit about my new role with Octopus Deploy, a leading continuous deployment/delivery and automation tool. I’ll also go into what this looks like for my content and why I’m so excited about it!

#developer #development #community

Lina Biyinzika

Determining Communities In Graphs Using Label Propagation Algorithm

Community detection is a technique that can be applied to datasets whose elements have an inherent grouping among themselves.

For example, redwood, birch, and oak are types of trees. Fish, algae, and octopus are types of underwater creatures. These entities can be grouped under a common category, and the task of doing that is called community detection.

Sometimes we have an unstructured dataset and we want to get some value out of it; in those cases community detection can be useful. In this article I will talk about how to detect communities using LPA, the Label Propagation Algorithm, in the GraphFrames package for Spark. The code for the article can be found here.
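As a taste of what that looks like in practice (this is not the article's linked code), here is a minimal GraphFrames call on a toy graph. The vertex and edge data are made up for illustration, and you need Spark with the GraphFrames package available, for example via the --packages option; the exact package coordinate depends on your Spark and Scala versions.

```python
# Minimal GraphFrames label propagation example on a toy graph.
from pyspark.sql import SparkSession
from graphframes import GraphFrame

spark = SparkSession.builder.appName("lpa-demo").getOrCreate()

# vertices need an "id" column; edges need "src" and "dst" columns
vertices = spark.createDataFrame(
    [("a",), ("b",), ("c",), ("d",), ("e",), ("f",)], ["id"]
)
edges = spark.createDataFrame(
    [("a", "b"), ("b", "c"), ("c", "a"),   # first triangle
     ("d", "e"), ("e", "f"), ("f", "d"),   # second triangle
     ("c", "d")],                          # single bridge between them
    ["src", "dst"],
)

g = GraphFrame(vertices, edges)
communities = g.labelPropagation(maxIter=5)  # adds a "label" column
communities.orderBy("label").show()
```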

Label Propagation Algorithm

The Label Propagation Algorithm is a fast algorithm that detects communities in a graph. It does not require prior information about the communities; it uses only the network structure to detect them. Each node starts with an initial label, and the algorithm takes it from there.

LPA was proposed by Raghavan in this paper. It works by labelling the nodes, propagating these labels throughout the network, and forming communities based on how the labels spread.

The idea is that a label will become dominant in a densely connected group of nodes but will have trouble propagating across a sparsely connected region. When the algorithm finishes, nodes with the same label are considered part of the same community.

The algorithm works as follows:

a) Every node is initialized with a unique label.

b) Labels are propagated through the network.

c) At every iteration of the propagation, each node updates its label to the label shared by the largest number of its neighbours.

d) LPA reaches convergence when each node has the majority label of its neighbours.

e) LPA stops when either convergence or the user-defined maximum number of iterations is reached.
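To make the steps above concrete, here is a small self-contained sketch of synchronous label propagation on a toy adjacency list. It is plain Python, independent of Spark, and the graph data is an assumption chosen only for illustration.

```python
import random
from collections import Counter

def label_propagation(adjacency, max_iter=10):
    # a) every node starts with a unique label (its own name)
    labels = {node: node for node in adjacency}
    for _ in range(max_iter):
        changed = False
        nodes = list(adjacency)
        random.shuffle(nodes)  # visit nodes in a random order each sweep
        for node in nodes:
            neighbour_labels = Counter(labels[n] for n in adjacency[node])
            if not neighbour_labels:
                continue
            # c) adopt the label held by the largest number of neighbours
            best_label, _count = neighbour_labels.most_common(1)[0]
            if labels[node] != best_label:
                labels[node] = best_label
                changed = True
        # d)/e) stop once no node changes its label (convergence)
        if not changed:
            break
    return labels

# toy graph: two triangles joined by a single bridge edge
graph = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"],
}
print(label_propagation(graph))
```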

#apache-spark #python #community #knowledge-graph #graph

John Garcia

THIS COMMUNITY IS A FUD-FREE ZONE

THIS COMMUNITY IS A FUD-FREE ZONE. LET US STAY POSITIVE AND UPLIFT EACH OTHER. NO ROOM FOR ANYTHING ELSE, BUT TO MOVE FORWARD. GREAT THINGS AHEAD FOR US, THE SHIBA COMMUNITY!

📺 The video in this post was made by SHIBANALYST
The origin of the article: https://www.youtube.com/watch?v=n85pXk423Ss
🔺 DISCLAIMER: The article is for information sharing. The content of this video is solely the opinion of the speaker, who is not a licensed financial advisor or registered investment advisor. It is not investment or legal advice.
Cryptocurrency trading is VERY risky. Make sure you understand these risks and that you are responsible for what you do with your money.
🔥 If you’re a beginner, I believe the article below will be useful to you ☞ What You Should Know Before Investing in Cryptocurrency - For Beginner
Thanks for visiting and watching! Please don’t forget to leave a like, comment and share!

#bitcoin #blockchain #shiba-inu #this community is a fud-free zone #fud #community

Eric Bukenya

Top Stories from the Microsoft DevOps Community – 2021.06.18

The top stories from the #AzureDevOps #community for 2021.06.18 are here!

It’s Friday which means there’s a new batch of content from the community to share. We have new posts on pipelines, DevSecOps, and Bicep to share with you. Let’s get into it!

Azure DevOps Pipelines for deploying content to RStudio Connect

Kelly checks in with a post on moving from push-button or git-backed deployments to deployments via the RStudio Connect Server API.

Implementing DevSecOps in Azure

Ilche Bedelovski extols the benefits of keeping security at the front of the line when developing a DevOps strategy.

Introduction to Azure Bicep v 0.4.x

Deploy your Azure infrastructure using Bicep! Olivier Miossec gives us an introduction to the latest version of this native Infrastructure-as-Code domain-specific language.

Creating an Azure API Management Instance using Bicep Lang via Azure DevOps

Will Velida loves Bicep! Learn about deploying your Bicep templates via Azure DevOps.

DataOps Automation — Deploying Databricks notebooks with Azure DevOps YAML Pipelines

Wesley shows you how to deploy your Databricks notebooks using Azure DevOps and YAML pipelines.

Using Azure DevOps for Private Go Modules

Attention Gophers: This post by Sheldon Hull goes over configs for developing your private Go Modules.

Thank you for your contributions Kelly, Ilche, Olivier, Will, Wesley, and Sheldon! The DevOps community appreciates your help in sharing knowledge.

#azure & cloud #community #microsoft devops

Rylan Becker

Top Stories from the Microsoft DevOps Community – 2021.05.21

Happy Friday everyone! I can’t believe it’s already the end of May. This year is flying by! The area of Texas I live in has been getting some much-needed rain, and I’ve enjoyed checking out what the community has been up to during these torrential storms.

This week’s posts cover things like figuring out who an approver is on a pipeline for contact, how to safely use System.AccessToken in Docker builds, hosting free websites with Azure Static Web Apps, and more! Be sure to check them out!

Getting the approver for release to an environment within an Azure DevOps Multi-Stage YAML pipeline

Are you working on a pipeline but you’re not sure who the approvers are or how you can contact them? Richard shows us how to programmatically pull approver contact info from an Azure Pipeline.

Static Code Analyses – Terrascan, Terraform and Azure DevOps

James shows us how to use Terrascan to perform code analysis on a Terraform codebase within an Azure Pipeline.

Hosting Free Website with Serverless backend on Azure Static Web App

Looking for a place to host a free website? Subhankar walks us through using the Azure Free Tier to host an Azure Static Web App.

Use your Azure DevOps System.AccessToken for Docker builds… safely

Using the System.AccessToken within an Azure Pipeline is straightforward… but what if you need to use that token within Docker? Karsten shows us how we can accomplish this safely.

Send Slack Notification via InvokeRESTAPI@1 – Azure DevOps Tricks #2

Sending Slack notifications automatically during an Azure Pipeline run is super powerful. Iain shows us how we can get this set up using a Service Connection in Azure Pipelines.

#azure & cloud #community #microsoft devops
