1620263993
Evolution Finance has created an innovative token design. While the token’s uniqueness may capture most of the early attention the project receives, the core value Evolution Finance provides is its advanced lending market, which targets a market roughly 10x larger than that of any existing lending operator.
Evolution Finance uses renVM to port the top 50 market assets onto Ethereum for availability on the Evolution Finance lending market.
Farming has quickly become a mainstay of the DeFi space. However, it has pushed projects toward problematic inflationary emission schedules that harm a project’s longevity.
The constant supply of free tokens comes at the expense of holders, and it conditions platform users to depend on an unsustainable incentive system. Evolution Finance addresses this problem with a powerful yet deflationary farming system.
EVN Token is deflationary.
Whenever the EVN token is transacted, 1.7% of the amount is deducted. For example, if a person sells 100 EVN tokens, only 98.3 tokens are actually processed in the sale; the remaining 1.7 tokens are claimed by the EVN smart contract for distribution.
This 1.7% is distributed across the ecosystem, creating a natural deflation of EVN while incentivizing the growth of the liquidity that supports the token.
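To make the arithmetic concrete, here is a minimal sketch of the fee-on-transfer mechanic described above. It is illustrative only, not the EVN contract; the text only specifies the 1% staker share (see below), so the destination of the remaining 0.7% is left abstract.

TRANSFER_FEE = 0.017  # 1.7% deducted from every EVN transfer
STAKER_SHARE = 0.010  # 1% of transfer volume goes to staked liquidity providers

def settle_transfer(amount)
  fee        = amount * TRANSFER_FEE
  to_stakers = amount * STAKER_SHARE
  remainder  = fee - to_stakers  # distributed per the project's own breakdown
  { received: amount - fee, to_stakers: to_stakers, remainder: remainder }
end

settle_transfer(100.0)
# => received ≈ 98.3, to_stakers ≈ 1.0, remainder ≈ 0.7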
EVN liquidity is **permanently** locked. This means any liquidity added to become a liquidity provider is permanently allocated to the token’s price floor.
EVN liquidity provider tokens are called Evolution Yield tokens (EVNY). Liquidity providers have to stake their LP tokens (EVNY) to begin enjoying the 1% fee from the token transfer volume. EVNY can also be sold on its own liquidity pairs, which the community can create on Uniswap or Balancer.
The permanent liquidity lock is a desirable trade-off for the 1% of transaction volume that EVNY stakers earn in perpetuity.
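The text does not spell out how that 1% is divided among individual stakers; a common approach is a pro-rata split by staked balance, sketched here purely as an assumption (distribute_fee and the balances are hypothetical).

# Hypothetical pro-rata split of the staker fee pool; not the actual contract logic.
def distribute_fee(fee_pool, stakes)
  total = stakes.values.sum.to_f
  stakes.transform_values { |staked| fee_pool * (staked / total) }
end

distribute_fee(1.0, { "alice" => 300.0, "bob" => 100.0 })
# => {"alice"=>0.75, "bob"=>0.25}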
EVNY tokens have a hard cap. LP tokens have never had a hard cap until now, because anyone can buy fractional tokens, add ETH, and become a liquidity provider.
Unlimited entry can be problematic for liquidity providers who are locking in their assets. To keep participation attractive, EVNY staking is capped once the EVN liquidity pools reach $5M in net liquidity. This ensures EVNY stakers do not have to worry about reward dilution and can participate confidently.
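As a rough illustration of that cap (the on-chain details are not specified in the text), new EVNY staking can simply close once net liquidity in the EVN pools reaches the $5M threshold:

LIQUIDITY_CAP_USD = 5_000_000  # stated cap on net EVN pool liquidity

def staking_open?(net_liquidity_usd)
  net_liquidity_usd < LIQUIDITY_CAP_USD
end

staking_open?(3_200_000)  # => true: new EVNY can still be staked
staking_open?(5_000_000)  # => false: cap reached, existing rewards stop diluting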
As exciting as the token design is, it’s not the core product but a mere feature in the overall development of Evolution Finance.
The Evolution lending market is the primary service and value-add of the ecosystem. Lending is one of the most prominent and successful businesses in the crypto market. Every major industry operator, including Digital Currency Group (Grayscale), Binance, Kraken, Huobi, etc., participates in crypto lending.
Crypto lending is driven by traders who want to either leverage their position or short sell an asset. The largest markets for leveraged trading and short selling are BTC, ETH, XRP, LTC, BCH, DOT, EOS, TRX, and so many other billion-dollar coins, yet only BTC and ETH can be leveraged or short sold in Ethereum’s DeFi ecosystem.
This is a huge market gap.
CeFi is capitalizing on billions of dollars in demand to lend and borrow assets that do not yet exist on Ethereum, let alone in decentralized lending markets. Evolution Finance is using renVM and other wrapping solutions to bring the top 50 native blockchain assets (besides ETH) onto the Ethereum network, for exclusive availability on the Evolution lending market.
The business of Evolution Finance is to capitalize on an untapped lending market that is 10x larger than any DeFi competitor.
Evolution Finance is an initiative by known partners and public figures in the space who are tired of the flaws in DeFi. This initiative takes advantage of their network, experience, security and exposure. As a natural by-product, the project has already secured major collaborations and partnerships.
At inception, the project will be managed by the partner network, with actions approved via multi-sig signers to ensure no malicious activity. Evolution Finance will move to fully decentralized governance with the launch of the primary lending and borrowing platform.
A total of 300,000 EVN tokens will be minted at the token generation event (TGE), with no future minting function enabled. The 300,000 tokens minted during the TGE will be the first and only batch of EVN tokens ever minted. These tokens will be added to Uniswap proportionately, with $300k (in ETH) of locked liquidity:
At launch there will be one pool.
Pool 1: ETH/EVN = $300k ETH + 300,000 EVN
Post-launch, a second pool will be created for tETH/EVN (wrapped ETH used to manage ETH below the price floor). Liquidity will be split from the ETH/EVN pool and managed between the two pools to ensure maximum value.
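For reference, here is the arithmetic implied by the launch pool above. In a Uniswap-style constant-product pool, the spot price is simply the ratio of the reserves, so $300k of ETH against 300,000 EVN implies a starting price of about $1 per EVN; the figures below are illustrative and say nothing about the price once trading begins.

eth_side_usd = 300_000.0  # dollar value of the ETH added at launch
evn_reserve  = 300_000.0  # EVN tokens added at launch

implied_price = eth_side_usd / evn_reserve
# => 1.0, i.e. roughly $1.00 per EVN at the moment of listing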
The EVN token is now live on the Ethereum mainnet. The token address for EVN is 0x9af15d7b8776fa296019979e70a5be53c714a7ec. Be careful not to purchase any other token with a different smart contract address, as the token can easily be faked. We strongly advise you to stay vigilant and safe throughout the launch; don’t let the excitement get the best of you.
Just be sure you have enough ETH in your wallet to cover the transaction fees.
You will first have to buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH), Tether (USDT), or Binance Coin (BNB)…
We will use Binance Exchange here as it is one of the largest crypto exchanges that accept fiat deposits.
Once you have finished the KYC process, you will be asked to add a payment method. Here you can either provide a credit/debit card or use a bank transfer, and buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH), Tether (USDT), or Binance Coin (BNB)…
Step by Step Guide : What is Binance | How to Create an account on Binance (Updated 2021)
Next step
You need a wallet address to connect to the Uniswap decentralized exchange; we use the Metamask wallet.
If you don’t have a Metamask wallet, read this article and follow the steps
☞What is Metamask wallet | How to Create a wallet and Use
Transfer $ETH to your new Metamask wallet from your existing wallet
Next step
Connect your Metamask wallet to the Uniswap decentralized exchange and buy or swap the EVN token.
Contract: 0x9af15d7b8776fa296019979e70a5be53c714a7ec
Read more: What is Uniswap | Beginner’s Guide on How to Use Uniswap
The top exchange for trading in EVN token is currently …
There are a few popular crypto exchanges with decent daily trading volumes and a huge user base. This ensures you will be able to sell your coins at any time, and the fees will usually be lower. It is suggested that you also register on these exchanges, since once EVN gets listed there it will attract a large amount of trading volume from their users, which means some great trading opportunities!
Top exchanges for token-coin trading. Follow instructions and make unlimited money
☞ https://www.binance.com
☞ https://www.bittrex.com
☞ https://www.poloniex.com
☞ https://www.bitfinex.com
☞ https://www.huobi.com
Find more information about EVN:
☞ Website ☞ Explorer ☞ Explorer 2 ☞ Source Code ☞ Social Channel ☞ Social Channel 2 ☞ Social Channel 3 ☞ Message Board ☞ Coinmarketcap
🔺DISCLAIMER: The information in this post isn’t financial advice and is intended FOR GENERAL INFORMATION PURPOSES ONLY. Trading cryptocurrency is VERY risky. Make sure you understand these risks and that you are responsible for what you do with your money.
🔥 If you’re a beginner, I believe the article below will be useful to you
⭐ ⭐ ⭐ What You Should Know Before Investing in Cryptocurrency - For Beginner ⭐ ⭐ ⭐
I hope this post helps you. Don’t forget to leave a like, comment, and share it with others. Thank you!
#blockchain #bitcoin #evn #evolution finance
1624219980
NFT Art Finance is currently one of the most popular cryptocurrencies on the market, so in today’s video I will show you how to easily buy NFT Art Finance on your phone using the Trust Wallet application.
📺 The video in this post was made by More LimSanity
The origin of the article: https://www.youtube.com/watch?v=sKE6Pc_w1IE
🔺 DISCLAIMER: This article is for information sharing. The content of this video is solely the opinion of the speaker, who is not a licensed financial advisor or registered investment advisor. It is not investment advice or legal advice.
Cryptocurrency trading is VERY risky. Make sure you understand these risks and that you are responsible for what you do with your money.
🔥 If you’re a beginner, I believe the article below will be useful to you ☞ What You Should Know Before Investing in Cryptocurrency - For Beginner
⭐ ⭐ ⭐The project is of interest to the community. Join to Get free ‘GEEK coin’ (GEEKCASH coin)!
☞ **-----CLICK HERE-----**⭐ ⭐ ⭐
Thanks for visiting and watching! Please don’t forget to leave a like, comment and share!
#bitcoin #blockchain #nft art finance token #token #buy nft art finance #how to buy nft art finance token - the easiest method!
1624312800
SPORE FINANCE PREDICTION - WHAT IS SPORE FINANCE & SPORE FINANCE ANALYSIS - SPORE FINANCE
In this video, I talk about the Spore Finance coin and give my Spore Finance prediction. I cover the latest Spore Finance analysis and the Spore Finance crypto coin, which has been hit pretty hard in the last 24 hours. I also go over what Spore Finance is and how many holders this new crypto coin has.
📺 The video in this post was made by Josh’s Finance
The origin of the article: https://www.youtube.com/watch?v=qbPQvdxCtEI
🔺 DISCLAIMER: This article is for information sharing. The content of this video is solely the opinion of the speaker, who is not a licensed financial advisor or registered investment advisor. It is not investment advice or legal advice.
Cryptocurrency trading is VERY risky. Make sure you understand these risks and that you are responsible for what you do with your money.
🔥 If you’re a beginner, I believe the article below will be useful to you ☞ What You Should Know Before Investing in Cryptocurrency - For Beginner
⭐ ⭐ ⭐The project is of interest to the community. Join to Get free ‘GEEK coin’ (GEEKCASH coin)!
☞ **-----CLICK HERE-----**⭐ ⭐ ⭐
Thanks for visiting and watching! Please don’t forget to leave a like, comment and share!
#bitcoin #blockchain #spore finance #what is spore finance #spore finance prediction - what is spore finance & spore finance analysis - spore finance #spore finance prediction
1659601560
We are all in the gutter, but some of us are looking at the stars.
-- Oscar Wilde
WordsCounted is a Ruby NLP (natural language processor). WordsCounted lets you implement powerful tokenisation strategies with a very flexible tokeniser class.
Are you using WordsCounted to do something interesting? Please tell me about it.
Visit this website for one example of what you can do with WordsCounted.
["Bayrūt"]
and not ["Bayr", "ū", "t"]
, for example.Add this line to your application's Gemfile:
gem 'words_counted'
And then execute:
$ bundle
Or install it yourself as:
$ gem install words_counted
Pass in a string or a file path, and an optional filter and/or regexp.
counter = WordsCounted.count(
"We are all in the gutter, but some of us are looking at the stars."
)
# Using a file
counter = WordsCounted.from_file("path/or/url/to/my/file.txt")
.count and .from_file are convenience methods that take an input, tokenise it, and return an instance of WordsCounted::Counter initialized with the tokens. The WordsCounted::Tokeniser and WordsCounted::Counter classes can be used alone, however.
WordsCounted.count(input, options = {})
Tokenises input and initializes a WordsCounted::Counter object with the resulting tokens.
counter = WordsCounted.count("Hello Beirut!")
Accepts two options: exclude and regexp. See Excluding tokens from the analyser and Passing in a custom regexp respectively.
WordsCounted.from_file(path, options = {})
Reads and tokenises a file, and initializes a WordsCounted::Counter object with the resulting tokens.
counter = WordsCounted.from_file("hello_beirut.txt")
Accepts the same options as .count.
The tokeniser allows you to tokenise text in a variety of ways. You can pass in your own rules for tokenisation, and apply a powerful filter with any combination of rules as long as they can boil down into a lambda.
Out of the box the tokeniser includes only alpha chars. Hyphenated tokens and tokens with apostrophes are considered a single token.
#tokenise([pattern: TOKEN_REGEXP, exclude: nil])
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise
# With `exclude`
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise(exclude: "hello")
# With `pattern`
tokeniser = WordsCounted::Tokeniser.new("I <3 Beirut!").tokenise(pattern: /[a-z]/i)
See Excluding tokens from the analyser and Passing in a custom regexp for more information.
The WordsCounted::Counter class allows you to collect various statistics from an array of tokens.
#token_count
Returns the token count of a given string.
counter.token_count #=> 15
#token_frequency
Returns a sorted (unstable) two-dimensional array where each element is a token and its frequency. The array is sorted by frequency in descending order.
counter.token_frequency
[
["the", 2],
["are", 2],
["we", 1],
# ...
["all", 1]
]
#most_frequent_tokens
Returns a hash where each key-value pair is a token and its frequency.
counter.most_frequent_tokens
{ "are" => 2, "the" => 2 }
#token_lengths
Returns a sorted (unstable) two-dimensional array where each element contains a token and its length. The array is sorted by length in descending order.
counter.token_lengths
[
["looking", 7],
["gutter", 6],
["stars", 5],
# ...
["in", 2]
]
#longest_tokens
Returns a hash where each key-value pair is a token and its length.
counter.longest_tokens
{ "looking" => 7 }
#token_density([ precision: 2 ])
Returns a sorted (unstable) two-dimensional array where each element contains a token and its density as a float, rounded to a precision of two. The array is sorted by density in descending order. It accepts a precision argument, which must be a float.
counter.token_density
[
["are", 0.13],
["the", 0.13],
["but", 0.07 ],
# ...
["we", 0.07 ]
]
#char_count
Returns the char count of tokens.
counter.char_count #=> 76
#average_chars_per_token([ precision: 2 ])
Returns the average char count per token rounded to two decimal places. Accepts a precision argument which defaults to two. Precision must be a float.
counter.average_chars_per_token #=> 4
#uniq_token_count
Returns the number of unique tokens.
counter.uniq_token_count #=> 13
You can exclude anything you want from the input by passing the exclude option. The exclude option accepts a variety of filters and is extremely flexible: a string, a regular expression, a lambda, a symbol naming a predicate method (such as :odd?), or an array of any combination of these.

tokeniser = WordsCounted::Tokeniser.new(
  "Magnificent! That was magnificent, Trevor."
)
# Using a string
tokeniser.tokenise(exclude: "was magnificent")
# => ["that", "trevor"]
# Using a regular expression
tokeniser.tokenise(exclude: /trevor/)
# => ["magnificent", "that", "was", "magnificent"]
# Using a lambda
tokeniser.tokenise(exclude: ->(t) { t.length < 4 })
# => ["magnificent", "that", "magnificent", "trevor"]
# Using symbol
tokeniser = WordsCounted::Tokeniser.new("Hello! محمد")
tokeniser.tokenise(exclude: :ascii_only?)
# => ["محمد"]
# Using an array
tokeniser = WordsCounted::Tokeniser.new(
"Hello! اسماءنا هي محمد، كارولينا، سامي، وداني"
)
tokeniser.tokenise(
exclude: [:ascii_only?, /محمد/, ->(t) { t.length > 6}, "و"]
)
# => ["هي", "سامي", "وداني"]
The default regexp accounts for letters, hyphenated tokens, and apostrophes. This means twenty-one is treated as one token. So is Mohamad's.
/[\p{Alpha}\-']+/
You can pass your own criteria as a Ruby regular expression to split your string as desired.
For example, if you wanted to include numbers, you can override the regular expression:
counter = WordsCounted.count("Numbers 1, 2, and 3", pattern: /[\p{Alnum}\-']+/)
counter.tokens
#=> ["numbers", "1", "2", "and", "3"]
Use the from_file method to open files. from_file accepts the same options as .count. The file path can be a URL.
counter = WordsCounted.from_file("url/or/path/to/file.text")
A hyphen used in lieu of an em or en dash will form part of the token. This affects the tokeniser algorithm.
counter = WordsCounted.count("How do you do?-you are well, I see.")
counter.token_frequency
[
["do", 2],
["how", 1],
["you", 1],
["-you", 1], # WTF, mate!
["are", 1],
# ...
]
In this example -you and you are separate tokens. Also, the tokeniser does not include numbers by default. Remember that you can pass your own regular expression if the default behaviour does not fit your needs.
The program will normalise (downcase) all incoming strings for consistency and filters.
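A quick illustration of that normalisation, reusing the API shown above: differently cased spellings collapse into one token.

counter = WordsCounted.count("Ruby ruby RUBY")
counter.token_frequency
# => [["ruby", 3]]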
A from_url method is sketched as a possible future addition:

def self.from_url
  # open url and send string here after removing html
end
See contributors.
To contribute, create your feature branch (git checkout -b my-new-feature), commit your changes (git commit -am 'Add some feature'), push to the branch (git push origin my-new-feature), and open a pull request.
Author: abitdodgy
Source code: https://github.com/abitdodgy/words_counted
License: MIT license
#ruby #ruby-on-rails