What is Aluna (ALN) | What is Aluna token | What is ALN token

Introducing the Aluna Project: Aluna.Social and ALN Token Overview

The Aluna project aims to address the lack of transparency and trust in crypto trading.

Problem

Anyone can say what they think about the market, but it’s difficult to prove whether a trader puts their money where their mouth is. Amid the constant noise of Crypto Twitter, Telegram, YouTube channels and other media, it is hard to tell what’s actually being traded versus what’s being promoted; there is simply a lack of transparency and trust.

Solution

In 2018, we embarked on the Aluna project with the goal of creating a transparent environment where aspiring crypto traders can thrive, by combining a trading terminal with a social network.

What is Aluna.Social?

Aluna.Social is Aluna’s flagship product — a multi-exchange social trading terminal for crypto traders.

On Aluna.Social, users can connect and manage multiple exchange accounts in one place, verify and share their trading performance in an unforgeable way, leverage community insights and positive social feedback loops, and automatically copy the trades of the world’s best traders (or counter-copy the worst!).

The platform is currently in beta, integrated with BitMEX, Binance, Bitfinex, Bittrex and Poloniex, with more exchange integrations on the way.

Three notable features in the pipeline are:

  • Web3 Prediction Games
  • DeFi Leaderboard — An on-chain analytics tool for DEX traders
  • On-chain Social Trading

What is ALN?

Aluna (ALN) Token is the utility token at the heart of the Aluna ecosystem.

The core functions of ALN are:

  1. To fuel the incentive and gamification mechanisms of ALN-powered products and services.
  2. To incentivise ecosystem participants and bootstrap ALN liquidity and utility.
  3. To coordinate decentralised governance and reward the community of governors.

ALN Token Allocation

51% of the total ALN supply is distributed to the community, who will govern the remaining 49% held in the treasury through the Aluna DAO.

  • Early Adopters: 10%
  • Token Sale: 15%
  • Ecosystem Fund: 15%
  • Team & Advisors: 11%
  • Treasury: 49%

The total supply of ALN is 100 million.
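
To make the distribution concrete, here is a minimal Ruby sketch (illustrative only, using the figures above) that converts each allocation percentage into a token amount and checks the community/treasury split:

TOTAL_SUPPLY = 100_000_000 # ALN

ALLOCATION = {
  "Early Adopters"  => 10,
  "Token Sale"      => 15,
  "Ecosystem Fund"  => 15,
  "Team & Advisors" => 11,
  "Treasury"        => 49
}

raise "percentages must sum to 100" unless ALLOCATION.values.sum == 100

# Token amount for each allocation bucket
ALLOCATION.each do |bucket, pct|
  puts format("%-16s %2d%% = %11d ALN", bucket, pct, TOTAL_SUPPLY * pct / 100)
end

# Everything outside the treasury is community-distributed
community = ALLOCATION.reject { |name, _| name == "Treasury" }.values.sum
puts "Community share: #{community}%" # prints "Community share: 51%"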

[Figure: ALN token distribution, omitted here.]

ALN Token Vesting

[Figure: ALN token vesting schedule, omitted here.]

ALN Token Release Schedule

[Figure: ALN token release schedule, omitted here.]

ALN Tokenomics

  • EARN: Platform incentives reward active users and the best traders on the leaderboard.
  • SPEND: Users can spend ALN tokens for a discounted monthly subscription plan.
  • PLAY: Users can play Web3 prediction games (similar to binary options) using ALN tokens.
  • STAKE: Staking at least 1337 ALN gives users platform benefits, including a free monthly subscription and fee discounts, while holding 1337 ALN in a MetaMask wallet grants access to the token-permissioned Aluna Telegram chat group (see the sketch after this list).
  • FARM: The Ecosystem Fund is used to bootstrap liquidity, attract a larger audience, and incentivise meaningful contributions, such as being an active or top-performing trader on Aluna.Social.
  • BURN: Part of the revenue generated from monthly subscription fees and gaming fees paid in ALN is burned.
  • GOVERN: ALN is self-governed through the Aluna DAO, enabling decentralised control and evolution of the ALN token. Governors will be able to propose and vote on usage of the treasury and other protocol improvements.
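
As a rough illustration of the STAKE rules above, the following Ruby sketch models the 1337 ALN threshold. The method name and benefit labels are hypothetical; the actual platform logic is server-side and not published here:

STAKE_THRESHOLD = 1337 # ALN

# Hypothetical helper mirroring the staking benefits described above
def staking_benefits(staked_aln, wallet_aln)
  benefits = []
  benefits += [:free_subscription, :fee_discounts] if staked_aln >= STAKE_THRESHOLD
  benefits << :telegram_access if wallet_aln >= STAKE_THRESHOLD
  benefits
end

staking_benefits(1500, 0) #=> [:free_subscription, :fee_discounts]
staking_benefits(0, 1337) #=> [:telegram_access]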

TOKEN SALE: 15 DEC – 29 DEC

Ticker: ALN 
Token type: ERC20
ICO Token Price: 1 ALN = 0.1 USD
Fundraising Goal: $1,500,000
Total Tokens: 100,000,000 
Available for Token Sale: 5%
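
Reading these figures together (an interpretation, not an official statement): the full 15% Token Sale allocation priced at $0.10 corresponds to the $1,500,000 goal, while the 5% available in this sale would raise $500,000. A quick Ruby check:

TOTAL_SUPPLY = 100_000_000
PRICE_USD    = 0.10

full_sale_allocation = TOTAL_SUPPLY * 15 / 100 # the 15% Token Sale allocation
this_sale            = TOTAL_SUPPLY * 5 / 100  # the 5% available in this sale

puts full_sale_allocation * PRICE_USD # 1500000.0 (the $1.5M fundraising goal)
puts this_sale * PRICE_USD            # 500000.0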




ALN Private Sale, Revised Token Metrics and Roadmap

We have updated ALN’s allocation, metrics, and tokenomics based on recent developments shared here. Refer to our Whitepaper V1.5 for the full details.

This post covers the revised token allocation and the token sale and private sale breakdown, then highlights some tokenomics changes, and ends with our roadmap for the next four quarters.

Revised ALN Token Allocation

The treasury allocation has been reduced to accommodate the new token sale allocation: 65% of the total ALN supply is now distributed to the community, who will govern the remaining 35% held in the treasury through the Aluna DAO. The complete breakdown, as of 25th February 2021, is as follows:

  • Early Adopters: 10%
  • Token Sale: 25%
  • Ecosystem Fund: 15%
  • Team & Advisors: 15%
  • Treasury: 35%

ALN Token Sale Overview

25% of the total supply (25,000,000 ALN) is allocated to the Token Sale and divided into 4 phases held from Q4 2020 to Q1 2021, detailed in the table below:

[Table: token sale phases and vesting terms, omitted here.]

Note: All vesting is linear (by block) and starts from the date of exchange listing.

Unsold tokens will be added to the Treasury.

ALN Token Private Sale Overview

The ALN private sale is currently open, and will end on 22nd February 2021, or when allocated tokens are sold out, whichever comes first.

5% of the total supply (5,000,000 ALN) is allocated to the Private Sale, at a price of $0.08/ALN. Tokens will be locked up until exchange listing, followed by 6 months of linear vesting (by block); see the sketch after the conditions list below.

Private Sale conditions:

  • Minimum contribution: US$10,000
  • Accepted currencies: ETH and ERC-20 (WBTC, DAI, USDC and USDT)
  • Restricted jurisdictions: the USA and sanctioned countries
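
A minimal sketch of the private-sale maths under the terms above: tokens priced at $0.08 each, locked until listing, then vested linearly by block over six months. The blocks-per-month figure is an illustrative assumption, not an official parameter:

PRICE_USD        = 0.08
BLOCKS_PER_MONTH = 199_000 # assumption: ~13s Ethereum block times
VESTING_BLOCKS   = 6 * BLOCKS_PER_MONTH

# Tokens purchased for a given USD contribution
def tokens_for(usd)
  (usd / PRICE_USD).floor
end

# Linearly vested amount at `current_block`, vesting from `listing_block`
def vested(total, listing_block, current_block)
  elapsed = [current_block - listing_block, 0].max
  [total * elapsed / VESTING_BLOCKS, total].min
end

total = tokens_for(10_000) #=> 125000, the minimum contribution in ALN
vested(total, 1_000_000, 1_000_000 + VESTING_BLOCKS / 2) #=> 62500 (half vested)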

ALN Token: Tokenomics Highlights

The core functions of the ALN token are to:

  1. Bootstrap the ALN community, ecosystem, utility and liquidity.
  2. Fuel the incentive and gamification mechanisms of ALN-powered products and services.
  3. Coordinate decentralised governance and reward the community of governors.

Here we highlight some new incentive mechanisms from our latest Whitepaper, namely the Aluna.Social Performance Pool and ALN’s value accrual design with the addition of the Rewards Pool.

For more information about other incentive mechanisms such as participation mining, holding and staking rewards, refer to our latest Whitepaper.

Aluna.Social Performance Pool

Up to 50% of fees from Aluna.Social (e.g. payment for PRO subscription plans starting Q2 2021) will be added to a Performance Pool. This is then shared among all the profitable leader traders every month.

Rewards Pool

Up to 50% of ALN fees and 5% of non-ALN fees from ALN-powered platforms and smart contracts (e.g. prediction games, DeFi social trading) will be added to a Rewards Pool. This is then distributed to ALN stakers and added to the community-owned treasury.
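
Putting the two pools together, here is an illustrative Ruby sketch of the fee routing described above, using the maximum quoted rates ("up to 50%" and "up to 5%"); how the Rewards Pool is split between stakers and the treasury is not specified here:

# Illustrative fee routing based on the whitepaper highlights above
def route_fees(aluna_social_fees:, aln_fees:, non_aln_fees:)
  {
    performance_pool: aluna_social_fees * 0.50, # shared monthly by profitable leader traders
    rewards_pool:     aln_fees * 0.50 + non_aln_fees * 0.05 # to stakers and the treasury
  }
end

route_fees(aluna_social_fees: 10_000, aln_fees: 4_000, non_aln_fees: 2_000)
#=> {:performance_pool=>5000.0, :rewards_pool=>2100.0}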

Aluna Roadmap: 2021

We also completed our smart contract audit with CertiK last month, and are in the process of completing a platform infrastructure security audit with Deployflow.

Here’s our roadmap for the next 4 quarters:

[Figure: Aluna 2021 roadmap, omitted here.]

Refer to section 7 in our Whitepaper for the long-term roadmap.

How and Where to Buy ALN – An Easy Step by Step Guide

You will first have to buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH) or Tether (USDT)… We will use Binance here, as it is one of the largest crypto exchanges that accept fiat deposits.

Binance is a popular cryptocurrency exchange that was started in China but moved its headquarters to the crypto-friendly island of Malta in the EU. Binance is known for its crypto-to-crypto exchange services. It exploded onto the scene in the mania of 2017 and has since gone on to become the top crypto exchange in the world.

Once you have finished the KYC process, you will be asked to add a payment method. Here you can either provide a credit/debit card or use a bank transfer. You will be charged higher fees when using cards, but the purchase is instant. A bank transfer is cheaper but slower, depending on your country of residence.


Next step: transfer your crypto to an altcoin exchange

Since ALN is an altcoin, we need to transfer our coins to an exchange where ALN can be traded. Below is a list of exchanges that offer ALN trading in various market pairs; head to their websites and register for an account.

Once finished, you will then need to make a BTC/ETH/USDT deposit to the exchange from Binance, depending on the available market pairs. After the deposit is confirmed, you may then purchase ALN from the exchange view.

Exchange: Gate.io

Apart from the exchange(s) above, there are a few popular crypto exchanges with decent daily trading volumes and a huge user base. This ensures you will be able to sell your coins at any time, and the fees will usually be lower. It is suggested that you also register on these exchanges, since once ALN gets listed there it will attract significant trading volume from their users, which means more trading opportunities.

Top exchanges for trading ALN: Binance, Bittrex, Poloniex, Bitfinex, Huobi, MXC, ProBit, Gate.io, Coinbase.


Thanks for visiting and reading this article! I highly appreciate your support! Please share if you liked it!

#blockchain #bitcoin #crypto #aluna.social #aln


WordsCounted: A Ruby Natural Language Processor

WordsCounted

We are all in the gutter, but some of us are looking at the stars.

-- Oscar Wilde

WordsCounted is a Ruby NLP (natural language processor) that lets you implement powerful tokenisation strategies with a very flexible tokeniser class.

Features

  • Out of the box, get the following data from any string or readable file, or URL:
    • Token count and unique token count
    • Token densities, frequencies, and lengths
    • Char count and average chars per token
    • The longest tokens and their lengths
    • The most frequent tokens and their frequencies.
  • A flexible way to exclude tokens from the tokeniser. You can pass a string, regexp, symbol, lambda, or an array of any combination of those types for powerful tokenisation strategies.
  • Pass your own regexp rules to the tokeniser if you prefer. The default regexp filters special characters but keeps hyphens and apostrophes. It also plays nicely with diacritics (UTF-8 and Unicode characters): Bayrūt is treated as ["Bayrūt"] and not ["Bayr", "ū", "t"], for example (see the sketch after this list).
  • Opens and reads files. Pass in a file path or a url instead of a string.
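
A quick sketch of those defaults in action; the expected output follows from the behaviour described in the list above:

require "words_counted"

# The default pattern keeps diacritics intact; input is downcased.
WordsCounted::Tokeniser.new("Bayrūt is beautiful").tokenise
#=> ["bayrūt", "is", "beautiful"]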

Installation

Add this line to your application's Gemfile:

gem 'words_counted'

And then execute:

$ bundle

Or install it yourself as:

$ gem install words_counted

Usage

Pass in a string or a file path, and an optional filter and/or regexp.

counter = WordsCounted.count(
  "We are all in the gutter, but some of us are looking at the stars."
)

# Using a file
counter = WordsCounted.from_file("path/or/url/to/my/file.txt")

.count and .from_file are convenience methods that take an input, tokenise it, and return an instance of WordsCounted::Counter initialized with the tokens. The WordsCounted::Tokeniser and WordsCounted::Counter classes can be used alone, however.
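
For instance, a minimal sketch of using the two classes directly, with the Counter initialised from an array of tokens as described above:

tokens  = WordsCounted::Tokeniser.new("We are all in the gutter").tokenise
counter = WordsCounted::Counter.new(tokens)

counter.token_count      #=> 6
counter.uniq_token_count #=> 6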

API

WordsCounted

WordsCounted.count(input, options = {})

Tokenises input and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.count("Hello Beirut!")

Accepts two options: exclude and pattern. See Excluding tokens from the tokeniser and Passing in a custom regexp respectively.

WordsCounted.from_file(path, options = {})

Reads and tokenises a file, and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.from_file("hello_beirut.txt")

Accepts the same options as .count.

Tokeniser

The tokeniser allows you to tokenise text in a variety of ways. You can pass in your own rules for tokenisation, and apply a powerful filter with any combination of rules as long as they can boil down into a lambda.

Out of the box the tokeniser includes only alpha chars. Hyphenated tokens and tokens with apostrophes are considered a single token.

#tokenise([pattern: TOKEN_REGEXP, exclude: nil])

tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise

# With `exclude`
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise(exclude: "hello")

# With `pattern`
tokeniser = WordsCounted::Tokeniser.new("I <3 Beirut!").tokenise(pattern: /[a-z]/i)

See Excluding tokens from the tokeniser and Passing in a custom regexp for more information.

Counter

The WordsCounted::Counter class allows you to collect various statistics from an array of tokens.

#token_count

Returns the token count of a given string.

counter.token_count #=> 15

#token_frequency

Returns a sorted (unstable) two-dimensional array where each element is a token and its frequency. The array is sorted by frequency in descending order.

counter.token_frequency

[
  ["the", 2],
  ["are", 2],
  ["we",  1],
  # ...
  ["all", 1]
]

#most_frequent_tokens

Returns a hash where each key-value pair is a token and its frequency.

counter.most_frequent_tokens

{ "are" => 2, "the" => 2 }

#token_lengths

Returns a sorted (unstable) two-dimensional array where each element contains a token and its length. The array is sorted by length in descending order.

counter.token_lengths

[
  ["looking", 7],
  ["gutter",  6],
  ["stars",   5],
  # ...
  ["in",      2]
]

#longest_tokens

Returns a hash where each key-value pair is a token and its length.

counter.longest_tokens

{ "looking" => 7 }

#token_density([ precision: 2 ])

Returns a sorted (unstable) two-dimensional array where each element contains a token and its density as a float, rounded to a precision of two. The array is sorted by density in descending order. It accepts a precision argument, which must be a float.

counter.token_density

[
  ["are",     0.13],
  ["the",     0.13],
  ["but",     0.07 ],
  # ...
  ["we",      0.07 ]
]

#char_count

Returns the char count of tokens.

counter.char_count #=> 76

#average_chars_per_token([ precision: 2 ])

Returns the average char count per token rounded to two decimal places. Accepts a precision argument which defaults to two. Precision must be a float.

counter.average_chars_per_token #=> 4

#uniq_token_count

Returns the number of unique tokens.

counter.uniq_token_count #=> 13

Excluding tokens from the tokeniser

You can exclude anything you want from the input by passing the exclude option. The exclude option accepts a variety of filters and is extremely flexible.

  1. A space-delimited string. The filter will normalise the string.
  2. A regular expression.
  3. A lambda.
  4. A symbol that names a predicate method. For example :odd?.
  5. An array of any combination of the above.

tokeniser =
  WordsCounted::Tokeniser.new(
    "Magnificent! That was magnificent, Trevor."
  )

# Using a string
tokeniser.tokenise(exclude: "was magnificent")
# => ["that", "trevor"]

# Using a regular expression
tokeniser.tokenise(exclude: /trevor/)
# => ["magnificent", "that", "was", "magnificent"]

# Using a lambda
tokeniser.tokenise(exclude: ->(t) { t.length < 4 })
# => ["magnificent", "that", "magnificent", "trevor"]

# Using symbol
tokeniser = WordsCounted::Tokeniser.new("Hello! محمد")
tokeniser.tokenise(exclude: :ascii_only?)
# => ["محمد"]

# Using an array
tokeniser = WordsCounted::Tokeniser.new(
  "Hello! اسماءنا هي محمد، كارولينا، سامي، وداني"
)
tokeniser.tokenise(
  exclude: [:ascii_only?, /محمد/, ->(t) { t.length > 6}, "و"]
)
# => ["هي", "سامي", "وداني"]

Passing in a custom regexp

The default regexp accounts for letters, hyphenated tokens, and apostrophes. This means twenty-one is treated as one token. So is Mohamad's.

/[\p{Alpha}\-']+/
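
With the default pattern, hyphenated and apostrophised words therefore stay whole; the expected output per the description above:

WordsCounted.count("twenty-one of Mohamad's books").tokens
#=> ["twenty-one", "of", "mohamad's", "books"]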

You can pass your own criteria as a Ruby regular expression to split your string as desired.

For example, if you wanted to include numbers, you can override the regular expression:

counter = WordsCounted.count("Numbers 1, 2, and 3", pattern: /[\p{Alnum}\-']+/)
counter.tokens
#=> ["numbers", "1", "2", "and", "3"]

Opening and reading files

Use the from_file method to open files. from_file accepts the same options as .count. The file path can be a URL.

counter = WordsCounted.from_file("url/or/path/to/file.text")

Gotchas

A hyphen used in lieu of an em or en dash will form part of the token. This affects the tokeniser algorithm.

counter = WordsCounted.count("How do you do?-you are well, I see.")
counter.token_frequency

[
  ["do",   2],
  ["how",  1],
  ["you",  1],
  ["-you", 1], # WTF, mate!
  ["are",  1],
  # ...
]

In this example -you and you are separate tokens. Also, the tokeniser does not include numbers by default. Remember that you can pass your own regular expression if the default behaviour does not fit your needs.

A note on case sensitivity

The program normalises (downcases) all incoming strings for consistency and to simplify filtering.
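
For example, mixed-case duplicates collapse into a single token:

WordsCounted.count("Hello HELLO hello").token_frequency
#=> [["hello", 3]]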

Roadmap

Ability to open URLs

def self.from_url
  # open url and send string here after removing html
end

Are you using WordsCounted to do something interesting? Please tell me about it.


RubyDoc documentation.

Demo

Visit this website for one example of what you can do with WordsCounted.


Contributors

See contributors.

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Author: Abitdodgy
Source Code: https://github.com/abitdodgy/words_counted 
License: MIT license

#ruby #nlp 



SafeMoon Clone | Create A DeFi Token Like SafeMoon | DeFi token like SafeMoon

SafeMoon is a decentralized finance (DeFi) token that combines RFI tokenomics with an auto-liquidity-generating protocol. A DeFi token like SafeMoon has reached mainstream attention on Binance Smart Chain. Its success and popularity have been immense, prompting many business firms to adopt this style of cryptocurrency as an alternative.

A DeFi token like SafeMoon is similar to other crypto tokens, the main difference being that it charges a 10% fee on transactions when users sell their tokens, with half of that fee (5% of the transaction) redistributed to existing SafeMoon holders. This feature rewards owners for holding onto their tokens.
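
As a rough sketch of that fee mechanic (the even split between holder redistribution and the auto-liquidity pool is an interpretation of SafeMoon's published design, and all numbers are illustrative):

SELL_FEE       = 0.10 # 10% total fee on sells
REDISTRIBUTION = 0.05 # 5% of the transaction goes back to holders
AUTO_LIQUIDITY = 0.05 # remaining 5% feeds the liquidity pool

def settle_sell(amount)
  {
    seller_receives: amount * (1 - SELL_FEE),
    to_holders:      amount * REDISTRIBUTION,
    to_liquidity:    amount * AUTO_LIQUIDITY
  }
end

settle_sell(1_000)
#=> {:seller_receives=>900.0, :to_holders=>50.0, :to_liquidity=>50.0}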

Read More @ https://bit.ly/3oFbJoJ

#create a defi token like safemoon #defi token like safemoon #safemoon token #safemoon token clone #defi token