What is DeFi100 (D100)? An Introduction to the D100 Token

The year 2020 saw staggering growth and a phenomenal rise in the decentralized finance (DeFi) industry. The DeFi sector continues to break new highs in terms of both market cap and total value locked.

It would not be wrong to say that 2020 was the year of DeFi: throughout the year, DeFi tokens saw unprecedented growth.

The total market cap of DeFi tokens, which stood at $1.6 billion as of 31 December 2019, touched $19.95 billion on 21 December 2020. At the time of writing, the market cap of DeFi tokens stands at roughly $19 billion.


If someone wants to speculate on the growth of the DeFi sector as a whole, rather than investing in one or two tokens, how can they do that?

DEFI100-Rebase makes it possible to speculate on the growth of the DeFi sector as a whole through a single token that tracks the total DeFi market cap. The token rebases itself, expanding and contracting its supply, to align its value with the DeFi market cap at a ratio of 1:100 billion.

The DeFi sector is fueled for long-term growth, and its current market capitalization is highly undervalued considering the demand for and scope of the sector.

DEFI100-REBASE (D100R)

DEFI100-Rebase is the first project by Wrapp3d, a consortium working on creating new synthetic assets that bring indices and real-world tradable assets to the token economy. D100R is an index token that derives its value from the total market cap of the DeFi sector.

Why is it required?

Decentralized finance as a sector has massive growth potential. However, it is not possible for any individual to keep track of hundreds of tokens and benefit from their future growth. Some assets will see their valuations increase, whereas others might fail to achieve their milestones, resulting in declining valuations. Over time, new projects and tokens will also make their way into the top 100, which requires regular portfolio updates. For someone who is not an active trader, this is simply not practical.

D100R solves this problem at its core. As an index token with its price pegged to the market cap of the top 100 DeFi tokens, D100R makes an ideal asset for betting on the growth of decentralized finance.

The price of D100R rises and falls with the total market capitalization of the decentralized finance sector. The token's elastic supply adjusts at every rebase to bring the spot price into equilibrium with the target price, i.e. the DeFi market cap at a ratio of 1:100 billion.

Smart Rebase


Rebase is a fairly new concept in the cryptocurrency space, pioneered by Ampleforth (AMPL) and later tested by Base Protocol (BASE), which pegged its value to the total crypto market cap. Algorithmic rebase coins are still at an experimental, nascent stage; however, they provide one of the best solutions for offering truly trustlessly pegged assets.

D100R is a rebase token that increases or decreases its supply through positive or negative rebases to bring the spot value of the token in line with the target price.

What is Smart about DEFI100 REBASE?

To give the market a window in which market forces can bring the spot price of the token into equilibrium with the target price, D100R rebases follow these rules:

  • Positive rebases are uncapped.
  • For a rebase to occur, the difference between the spot price and the target price must be greater than 5%. If the difference is less than or equal to 5%, there is no rebase.
  • Negative rebases are capped at 10%. This gives market forces time to rebalance demand and supply and bring the spot price equal to or above the target price.
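These rules can be sketched in a few lines of code. This is an illustrative model only, not the actual contract: the function name and per-epoch application are our assumptions, while the 5% band and the 10% cap come from the list above.

```ruby
# Illustrative sketch of the "smart rebase" rules; not the contract code.
REBASE_BAND  = 0.05   # no rebase when |deviation| <= 5%
NEGATIVE_CAP = -0.10  # negative rebases are capped at 10%

# Returns the fractional supply change to apply at a rebase epoch.
def smart_rebase_factor(spot_price, target_price)
  deviation = (spot_price - target_price) / target_price
  return 0.0 if deviation.abs <= REBASE_BAND          # inside the band: skip
  # Positive rebases are uncapped; negative ones are clamped at -10%.
  deviation.negative? ? [deviation, NEGATIVE_CAP].max : deviation
end

smart_rebase_factor(1.50, 1.00)  # => 0.5   (supply expands by 50%)
smart_rebase_factor(0.80, 1.00)  # => -0.1  (-20% clamped to the -10% cap)
smart_rebase_factor(1.03, 1.00)  # => 0.0   (within the 5% band)
```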

Vision — DEFI 100-REBASE

The decentralized finance sector is growing at unprecedented speed, and DeFi will soon be a major player in the cryptocurrency ecosystem. D100R aims to become the leading index token for investing in, or betting on, the growth of the DeFi sector.

D100R is an experiment in removing the inherent vice of the rebase model of elastic-supply tokens. By capping negative rebases at 10%, we aim to give the market and users more opportunity to drive prices toward equilibrium by creating demand while supply is being reduced.

Oracles

DIA (Decentralised Information Asset) is an open-source financial information platform that uses crypto-economic incentives to source and validate data. Market actors can supply, share, and use financial and digital asset data.

The DEFI100-Rebase token will be powered by DIA data feeds for its rebases. DIA's decentralized oracle sources price feeds from multiple sources to provide the most accurate data.

DEFI100 IDO on BakerySwap


DEFI100 IDO — 22 Feb 2021

This is the moment we have all been eagerly waiting for: the dates have been finalized for our much-awaited launch on #BinanceSmartChain.

DEFI100 will begin its journey as the first synthetic asset with an elastic supply pegged to the #DeFi market cap in the Binance Smart Chain ecosystem. Before we answer the questions that might be lingering in your mind, here is a brief overview of what DEFI100 is all about.

DEFI100 — What & Why?

DEFI100 is a synthetic index token that works on the principle of elastic supply and uses a rebase function to maintain equilibrium between its spot price and target price. The token is pegged to the total market cap of the decentralized finance sector at a ratio of 1:100 billion.

Since tokens are smart contracts, and smart contracts cannot fetch real-world data on their own, they require an "oracle" to bridge the gap between the blockchain and the real world. DEFI100 uses our partners' decentralized oracles to provide the real-world data, i.e. the total market cap of the DeFi sector, that powers DEFI100 rebases.

Now the questions arise: what is a rebase? What is elastic supply, and how does it work?

The elastic supply principle and rebases are not exactly new concepts in the otherwise young blockchain industry. Credit for popularizing them goes to Ampleforth (AMPL) and later Base Protocol. With elastic supply, tokens do not have a fixed supply; their supply expands or contracts according to market demand.

In the case of AMPL, its value is pegged to 1 USD, while Base Protocol pegs its value to the total crypto market cap. DEFI100 derives its value from the DeFi sector market cap at a ratio of 1:100 billion. That means:

  • If the DeFi market cap is $80 billion, the D100 token will be $0.80.
  • If the DeFi market cap is $90 billion, the D100 token will be $0.90.
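The peg arithmetic is simply the DeFi market cap in USD divided by 100 billion. A minimal sketch (the function and constant names are ours, not the project's):

```ruby
# Target price under the 1:100 billion peg described above (illustrative).
PEG_DIVISOR = 100_000_000_000.0  # 100 billion

def target_price(defi_market_cap_usd)
  defi_market_cap_usd / PEG_DIVISOR
end

target_price(80_000_000_000)  # => 0.8
target_price(90_000_000_000)  # => 0.9
```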

How will it affect your holdings?

The remarkable thing about this mechanism is that it will not affect the value of your holdings at all. What does that mean?

Let us explain it with an example.

  • The DeFi market cap is $100 billion, making our target price $1 (DEFI100 is pegged at a ratio of 1:100 billion).
  • The DEFI100 token is trading at $1.50 (spot price).
  • Mr. Hodler holds 1,000 DEFI100 tokens, presently valued at $1,500.

In the above example, there will be a positive rebase, since the spot price ($1.50) is higher than the target price ($1.00), so supply will increase. How will that affect Mr. Hodler's holding? Let's see:

  • After the rebase, Mr. Hodler will have 1,500 DEFI100 tokens in his wallet, and the price will come down to $1 per token, keeping the value of his holding at $1,500.

The increase or decrease in holdings happens automatically.
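The worked example above can be checked with a few lines of arithmetic. This is an illustrative sketch of the balance scaling, not the token contract: a rebase multiplies every balance by spot/target, so the USD value of the holding is preserved.

```ruby
# A positive rebase scales every wallet balance by spot/target,
# leaving the USD value of the holding unchanged (illustrative only).
def rebased_balance(balance, spot_price, target_price)
  balance * (spot_price / target_price)
end

new_balance = rebased_balance(1000, 1.50, 1.00)  # => 1500.0 tokens
new_balance * 1.00                               # => 1500.0 USD, same as 1000 * 1.50
```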

What will happen in case of a negative rebase?

Negative rebases are painful. To reduce the number of negative rebases, we are implementing a couple of twists in our rebase mechanism.

  • No rebase will occur if the difference between the spot and target price is 5% or less.
  • Negative rebases are capped at 10%, whereas positive rebases are uncapped.

These changes have been incorporated to minimize negative rebases and give market forces more time to bring the price into equilibrium.

Now let’s move to more information about our upcoming IDO!

Token Metrics

  • Name — DEFI 100
  • Symbol — D100
  • Token Type — BEP20
  • Initial supply — 2,500,000 D100 tokens
  • IDO Date & Time — 9:00 AM UTC, 22 Feb 2021
  • Currencies accepted for IDO — BNB & BUSD
  • Contract Address — 0x9d8aac497a4b8fe697dd63101d793f0c6a6eebb6

How to Participate in DEFI100 IDO?

The DEFI100 IDO will be launched on BakerySwap, the best AMM on BSC ( https://www.bakeryswap.org/#/ido ), at 9:00 AM UTC on February 22.

Token Distribution and Vesting

Tokens allotted for the IDO/public sale will have no vesting and will be distributed immediately after liquidity is added.

5% of the tokens, reserved for the team, will be vested for a period of one year and thereafter released at a rate of 25% each month.

The added liquidity will be locked for 11 months.

The ecosystem reserve includes tokens set aside for various competitions, such as trading competitions and hackathons. These tokens will be slow-released at 10% per month. If no event is scheduled, the tokens will be locked for another 11 months.

25% of the tokens are reserved as rewards for the various liquidity pools that will be available for staking and farming.


🔺DISCLAIMER: The information in this post is not financial advice and is intended for general information purposes only. Trading cryptocurrency is very risky. Make sure you understand these risks and that you are responsible for what you do with your money.


Thank you for visiting and reading this article! I highly appreciate your support. Please share if you liked it!

#blockchain #bitcoin #defi100 #d100


WordsCounted: A Ruby Natural Language Processor

WordsCounted

We are all in the gutter, but some of us are looking at the stars.

-- Oscar Wilde

WordsCounted is a Ruby NLP (natural language processor). WordsCounted lets you implement powerful tokenisation strategies with a very flexible tokeniser class.

Are you using WordsCounted to do something interesting? Please tell me about it.

 

Demo

Visit this website for one example of what you can do with WordsCounted.

Features

  • Out of the box, get the following data from any string or readable file, or URL:
    • Token count and unique token count
    • Token densities, frequencies, and lengths
    • Char count and average chars per token
    • The longest tokens and their lengths
    • The most frequent tokens and their frequencies.
  • A flexible way to exclude tokens from the tokeniser. You can pass a string, regexp, symbol, lambda, or an array of any combination of those types for powerful tokenisation strategies.
  • Pass your own regexp rules to the tokeniser if you prefer. The default regexp filters special characters but keeps hyphens and apostrophes. It also plays nicely with diacritics (UTF and unicode characters): Bayrūt is treated as ["Bayrūt"] and not ["Bayr", "ū", "t"], for example.
  • Opens and reads files. Pass in a file path or a url instead of a string.

Installation

Add this line to your application's Gemfile:

gem 'words_counted'

And then execute:

$ bundle

Or install it yourself as:

$ gem install words_counted

Usage

Pass in a string or a file path, and an optional filter and/or regexp.

counter = WordsCounted.count(
  "We are all in the gutter, but some of us are looking at the stars."
)

# Using a file
counter = WordsCounted.from_file("path/or/url/to/my/file.txt")

.count and .from_file are convenience methods that take an input, tokenise it, and return an instance of WordsCounted::Counter initialized with the tokens. The WordsCounted::Tokeniser and WordsCounted::Counter classes can be used alone, however.

API

WordsCounted

WordsCounted.count(input, options = {})

Tokenises input and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.count("Hello Beirut!")

Accepts two options: exclude and regexp. See Excluding tokens from the tokeniser and Passing in a custom regexp respectively.

WordsCounted.from_file(path, options = {})

Reads and tokenises a file, and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.from_file("hello_beirut.txt")

Accepts the same options as .count.

Tokeniser

The tokeniser allows you to tokenise text in a variety of ways. You can pass in your own rules for tokenisation, and apply a powerful filter with any combination of rules as long as they can boil down into a lambda.

Out of the box the tokeniser includes only alpha chars. Hyphenated tokens and tokens with apostrophes are considered a single token.

#tokenise([pattern: TOKEN_REGEXP, exclude: nil])

tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise

# With `exclude`
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise(exclude: "hello")

# With `pattern`
tokeniser = WordsCounted::Tokeniser.new("I <3 Beirut!").tokenise(pattern: /[a-z]/i)

See Excluding tokens from the tokeniser and Passing in a custom regexp for more information.

Counter

The WordsCounted::Counter class allows you to collect various statistics from an array of tokens.

#token_count

Returns the token count of a given string.

counter.token_count #=> 15

#token_frequency

Returns a sorted (unstable) two-dimensional array where each element is a token and its frequency. The array is sorted by frequency in descending order.

counter.token_frequency

[
  ["the", 2],
  ["are", 2],
  ["we",  1],
  # ...
  ["all", 1]
]

#most_frequent_tokens

Returns a hash where each key-value pair is a token and its frequency.

counter.most_frequent_tokens

{ "are" => 2, "the" => 2 }

#token_lengths

Returns a sorted (unstable) two-dimensional array where each element contains a token and its length. The array is sorted by length in descending order.

counter.token_lengths

[
  ["looking", 7],
  ["gutter",  6],
  ["stars",   5],
  # ...
  ["in",      2]
]

#longest_tokens

Returns a hash where each key-value pair is a token and its length.

counter.longest_tokens

{ "looking" => 7 }

#token_density([ precision: 2 ])

Returns a sorted (unstable) two-dimensional array where each element contains a token and its density as a float, rounded to a precision of two. The array is sorted by density in descending order. It accepts a precision argument, which must be a float.

counter.token_density

[
  ["are",     0.13],
  ["the",     0.13],
  ["but",     0.07 ],
  # ...
  ["we",      0.07 ]
]

#char_count

Returns the char count of tokens.

counter.char_count #=> 76

#average_chars_per_token([ precision: 2 ])

Returns the average char count per token rounded to two decimal places. Accepts a precision argument which defaults to two. Precision must be a float.

counter.average_chars_per_token #=> 4

#uniq_token_count

Returns the number of unique tokens.

counter.uniq_token_count #=> 13

Excluding tokens from the tokeniser

You can exclude anything you want from the input by passing the exclude option. The exclude option accepts a variety of filters and is extremely flexible.

  1. A space-delimited string. The filter will normalise the string.
  2. A regular expression.
  3. A lambda.
  4. A symbol that names a predicate method. For example :odd?.
  5. An array of any combination of the above.

tokeniser =
  WordsCounted::Tokeniser.new(
    "Magnificent! That was magnificent, Trevor."
  )

# Using a string
tokeniser.tokenise(exclude: "was magnificent")
# => ["that", "trevor"]

# Using a regular expression
tokeniser.tokenise(exclude: /trevor/)
# => ["magnificent", "that", "was", "magnificent"]

# Using a lambda
tokeniser.tokenise(exclude: ->(t) { t.length < 4 })
# => ["magnificent", "that", "magnificent", "trevor"]

# Using symbol
tokeniser = WordsCounted::Tokeniser.new("Hello! محمد")
tokeniser.tokenise(exclude: :ascii_only?)
# => ["محمد"]

# Using an array
tokeniser = WordsCounted::Tokeniser.new(
  "Hello! اسماءنا هي محمد، كارولينا، سامي، وداني"
)
tokeniser.tokenise(
  exclude: [:ascii_only?, /محمد/, ->(t) { t.length > 6}, "و"]
)
# => ["هي", "سامي", "وداني"]

Passing in a custom regexp

The default regexp accounts for letters, hyphenated tokens, and apostrophes. This means twenty-one is treated as one token. So is Mohamad's.

/[\p{Alpha}\-']+/
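You can try the documented default pattern with plain String#scan, without the gem installed (the sample sentence is ours):

```ruby
# The documented default pattern, applied directly with String#scan.
# Hyphenated tokens and apostrophes survive as single tokens.
pattern = /[\p{Alpha}\-']+/
"Twenty-one years and Mohamad's book".downcase.scan(pattern)
#=> ["twenty-one", "years", "and", "mohamad's", "book"]
```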

You can pass your own criteria as a Ruby regular expression to split your string as desired.

For example, if you wanted to include numbers, you can override the regular expression:

counter = WordsCounted.count("Numbers 1, 2, and 3", pattern: /[\p{Alnum}\-']+/)
counter.tokens
#=> ["numbers", "1", "2", "and", "3"]

Opening and reading files

Use the from_file method to open files. from_file accepts the same options as .count. The file path can be a URL.

counter = WordsCounted.from_file("url/or/path/to/file.text")

Gotchas

A hyphen used in lieu of an em or en dash will form part of the token. This affects the tokeniser algorithm.

counter = WordsCounted.count("How do you do?-you are well, I see.")
counter.token_frequency

[
  ["do",   2],
  ["how",  1],
  ["you",  1],
  ["-you", 1], # WTF, mate!
  ["are",  1],
  # ...
]

In this example -you and you are separate tokens. Also, the tokeniser does not include numbers by default. Remember that you can pass your own regular expression if the default behaviour does not fit your needs.

A note on case sensitivity

The program will normalise (downcase) all incoming strings for consistency and filters.
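For example, tokenising after downcasing with the same default pattern (plain Ruby, gem not required; tally needs Ruby 2.7+):

```ruby
# Downcasing first means "Hello", "hello", and "HELLO" count as one token.
tokens = "Hello hello HELLO world".downcase.scan(/[\p{Alpha}\-']+/)
tokens.tally  #=> {"hello" => 3, "world" => 1}
```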

Roadmap

Ability to open URLs

def self.from_url
  # open url and send string here after removing html
end

Contributors

See contributors.

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Author: abitdodgy
Source code: https://github.com/abitdodgy/words_counted
License: MIT license

#ruby  #ruby-on-rails 

Royce  Reinger

Royce Reinger

1658068560

WordsCounted: A Ruby Natural Language Processor

WordsCounted

We are all in the gutter, but some of us are looking at the stars.

-- Oscar Wilde

WordsCounted is a Ruby NLP (natural language processor). WordsCounted lets you implement powerful tokensation strategies with a very flexible tokeniser class.

Features

  • Out of the box, get the following data from any string or readable file, or URL:
    • Token count and unique token count
    • Token densities, frequencies, and lengths
    • Char count and average chars per token
    • The longest tokens and their lengths
    • The most frequent tokens and their frequencies.
  • A flexible way to exclude tokens from the tokeniser. You can pass a string, regexp, symbol, lambda, or an array of any combination of those types for powerful tokenisation strategies.
  • Pass your own regexp rules to the tokeniser if you prefer. The default regexp filters special characters but keeps hyphens and apostrophes. It also plays nicely with diacritics (UTF and unicode characters): Bayrūt is treated as ["Bayrūt"] and not ["Bayr", "ū", "t"], for example.
  • Opens and reads files. Pass in a file path or a url instead of a string.

Installation

Add this line to your application's Gemfile:

gem 'words_counted'

And then execute:

$ bundle

Or install it yourself as:

$ gem install words_counted

Usage

Pass in a string or a file path, and an optional filter and/or regexp.

counter = WordsCounted.count(
  "We are all in the gutter, but some of us are looking at the stars."
)

# Using a file
counter = WordsCounted.from_file("path/or/url/to/my/file.txt")

.count and .from_file are convenience methods that take an input, tokenise it, and return an instance of WordsCounted::Counter initialized with the tokens. The WordsCounted::Tokeniser and WordsCounted::Counter classes can be used alone, however.

API

WordsCounted

WordsCounted.count(input, options = {})

Tokenises input and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.count("Hello Beirut!")

Accepts two options: exclude and regexp. See Excluding tokens from the analyser and Passing in a custom regexp respectively.

WordsCounted.from_file(path, options = {})

Reads and tokenises a file, and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.from_file("hello_beirut.txt")

Accepts the same options as .count.

Tokeniser

The tokeniser allows you to tokenise text in a variety of ways. You can pass in your own rules for tokenisation, and apply a powerful filter with any combination of rules as long as they can boil down into a lambda.

Out of the box the tokeniser includes only alpha chars. Hyphenated tokens and tokens with apostrophes are considered a single token.

#tokenise([pattern: TOKEN_REGEXP, exclude: nil])

tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise

# With `exclude`
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise(exclude: "hello")

# With `pattern`
tokeniser = WordsCounted::Tokeniser.new("I <3 Beirut!").tokenise(pattern: /[a-z]/i)

See Excluding tokens from the tokeniser and Passing in a custom regexp for more information.

Counter

The WordsCounted::Counter class allows you to collect various statistics from an array of tokens.

#token_count

Returns the total number of tokens.

counter.token_count #=> 15

#token_frequency

Returns a sorted (unstable) two-dimensional array where each element is a token and its frequency. The array is sorted by frequency in descending order.

counter.token_frequency

[
  ["the", 2],
  ["are", 2],
  ["we",  1],
  # ...
  ["all", 1]
]

#most_frequent_tokens

Returns a hash where each key-value pair is a token and its frequency.

counter.most_frequent_tokens

{ "are" => 2, "the" => 2 }

#token_lengths

Returns a sorted (unstable) two-dimensional array where each element contains a token and its length. The array is sorted by length in descending order.

counter.token_lengths

[
  ["looking", 7],
  ["gutter",  6],
  ["stars",   5],
  # ...
  ["in",      2]
]

#longest_tokens

Returns a hash where each key-value pair is a token and its length.

counter.longest_tokens

{ "looking" => 7 }

#token_density([ precision: 2 ])

Returns a sorted (unstable) two-dimensional array where each element contains a token and its density as a float, rounded to a precision of two. The array is sorted by density in descending order. It accepts a precision argument, which must be a float.

counter.token_density

[
  ["are",     0.13],
  ["the",     0.13],
  ["but",     0.07],
  # ...
  ["we",      0.07]
]

#char_count

Returns the char count of tokens.

counter.char_count #=> 76

#average_chars_per_token([ precision: 2 ])

Returns the average char count per token rounded to two decimal places. Accepts a precision argument which defaults to two. Precision must be a float.

counter.average_chars_per_token #=> 4

#uniq_token_count

Returns the number of unique tokens.

counter.uniq_token_count #=> 13

Excluding tokens from the tokeniser

You can exclude anything you want from the input by passing the exclude option. The exclude option accepts a variety of filters and is extremely flexible.

  1. A space-delimited string. The filter will normalise the string.
  2. A regular expression.
  3. A lambda.
  4. A symbol that names a predicate method. For example :odd?.
  5. An array of any combination of the above.

tokeniser =
  WordsCounted::Tokeniser.new(
    "Magnificent! That was magnificent, Trevor."
  )

# Using a string
tokeniser.tokenise(exclude: "was magnificent")
# => ["that", "trevor"]

# Using a regular expression
tokeniser.tokenise(exclude: /trevor/)
# => ["magnificent", "that", "was", "magnificent"]

# Using a lambda
tokeniser.tokenise(exclude: ->(t) { t.length < 4 })
# => ["magnificent", "that", "magnificent", "trevor"]

# Using symbol
tokeniser = WordsCounted::Tokeniser.new("Hello! محمد")
tokeniser.tokenise(exclude: :ascii_only?)
# => ["محمد"]

# Using an array
tokeniser = WordsCounted::Tokeniser.new(
  "Hello! اسماءنا هي محمد، كارولينا، سامي، وداني"
)
tokeniser.tokenise(
  exclude: [:ascii_only?, /محمد/, ->(t) { t.length > 6}, "و"]
)
# => ["هي", "سامي", "وداني"]

Passing in a custom regexp

The default regexp accounts for letters, hyphenated tokens, and apostrophes. This means twenty-one is treated as one token. So is Mohamad's.

/[\p{Alpha}\-']+/

You can pass your own criteria as a Ruby regular expression to split your string as desired.

For example, if you wanted to include numbers, you can override the regular expression:

counter = WordsCounted.count("Numbers 1, 2, and 3", pattern: /[\p{Alnum}\-']+/)
counter.tokens
#=> ["numbers", "1", "2", "and", "3"]

Opening and reading files

Use the from_file method to open files. from_file accepts the same options as .count. The file path can be a URL.

counter = WordsCounted.from_file("url/or/path/to/file.text")

Gotchas

A hyphen used in lieu of an em or en dash will form part of the token, because hyphens are included in the default token pattern.

counter = WordsCounted.count("How do you do?-you are well, I see.")
counter.token_frequency

[
  ["do",   2],
  ["how",  1],
  ["you",  1],
  ["-you", 1], # WTF, mate!
  ["are",  1],
  # ...
]

In this example -you and you are separate tokens. Also, the tokeniser does not include numbers by default. Remember that you can pass your own regular expression if the default behaviour does not fit your needs.

A note on case sensitivity

The program normalises (downcases) all incoming strings for consistency and to simplify filtering.

Roadmap

Ability to open URLs

def self.from_url
  # open url and send string here after removing html
end

Are you using WordsCounted to do something interesting? Please tell me about it.

RubyDoc documentation.

Demo

Visit this website for one example of what you can do with WordsCounted.


Contributors

See contributors.

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Author: Abitdodgy
Source Code: https://github.com/abitdodgy/words_counted 
License: MIT license

#ruby #nlp 

aaron silva

1622197808

SafeMoon Clone | Create A DeFi Token Like SafeMoon | DeFi token like SafeMoon

SafeMoon is a decentralized finance (DeFi) token built on the Binance Smart Chain that combines RFI tokenomics with an auto-liquidity-generating protocol. Its success and popularity have been immense, prompting many business firms to adopt this style of cryptocurrency as an alternative.

A DeFi token like SafeMoon works much like other crypto tokens, with one key difference: it charges a 10% fee on every sell transaction, and 5% of the transaction is redistributed to the remaining SafeMoon holders. This rewards owners for holding onto their tokens.
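As a rough sketch of that fee split (the constant names and helper below are our own illustration, not part of any actual SafeMoon contract):

```ruby
# Hypothetical illustration of a SafeMoon-style sell fee.
SELL_FEE_RATE     = 0.10 # 10% charged on every sell
HOLDER_SHARE_RATE = 0.05 # 5% of the transaction goes to existing holders

def safemoon_sell(amount)
  fee = amount * SELL_FEE_RATE
  {
    fee: fee,
    redistributed_to_holders: amount * HOLDER_SHARE_RATE,
    seller_receives: amount - fee
  }
end

safemoon_sell(1_000.0)
#=> {:fee=>100.0, :redistributed_to_holders=>50.0, :seller_receives=>900.0}
```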

Read More @ https://bit.ly/3oFbJoJ

#create a defi token like safemoon #defi token like safemoon #safemoon token #safemoon token clone #defi token

aaron silva

1621844791

SafeMoon Clone | SafeMoon Token Clone | SafeMoon Token Clone Development

The SafeMoon Token Clone Development is a new trendsetter in the digital world, bringing significant changes that can benefit the growth of investors' businesses in a short period. The SafeMoon token clone is widely discussed among global users because its market value has soared. SafeMoon token development combines RFI tokenomics with an auto-liquidity-generating process. The SafeMoon token is a replica of decentralized finance (DeFi) tokens, highly scalable and implemented with tamper-proof security.

The SafeMoon tokens execute efficient functionalities such as RFI static rewards, automated liquidity provision, and automatic token burns. The SafeMoon token is considered one of the most advanced stablecoins in the crypto market. It gained global attention for maintaining a stable asset value without major fluctuations in the marketplace. The SafeMoon token clone is completely decentralized, eliminating the need for intermediaries and benefiting users with lower transaction fees and shorter wait times than the traditional banking process.

Reasons to invest in SafeMoon Token Clone :

  • The SafeMoon token clone benefits the investors with Automated Liquidity Pool as a unique feature since it adds more revenue for their business growth in less time. The traders can experience instant trade round the clock for reaping profits with less investment towards the SafeMoon token.
  • It is integrated with high-end security protocols such as two-factor authentication and signature processes to prevent hacks and other vulnerabilities. The smart contract system in SafeMoon token development manages the overall operation of transactions without delay.
  • The users can obtain a reward amount based on the volume of SafeMoon tokens traded in the marketplace. The efficient trading mechanism allows the users to trade the SafeMoon tokens at the best price for farming. The user can earn higher rewards based on the staking volume of tokens by users in the trade market.
  • It allows the token holders to gain complete ownership over their SafeMoon tokens after purchasing from DeFi exchanges. The SafeMoon community governs the token distribution, price fluctuations, staking, and every other token activity. The community boosts the value of SafeMoon tokens.
  • Automated token burns reduce the circulating supply over time, and the community can manage the burns to support the token's value in the marketplace. Transactions of SafeMoon tokens on the blockchain platform are fast, safe, and secure.

The SafeMoon Token Clone Development offers a promising future for upcoming investors and startups seeking to increase their business revenue in less time. The SafeMoon token clone is in great demand among millions of users for its market value. Investors can contact a leading firm such as Infinite Block Tech for assistance in developing a world-class SafeMoon token clone that accelerates business growth.

#safemoon token #safemoon token clone #safemoon token clone development #defi token