What is Bonded Finance (BOND) | What is BOND token

Welcome To Bonded Finance

Decentralized Finance (“DeFi”) Today

In what has become something of an arms race, the open finance/DeFi space, despite its share of copies, forks and fugazis, represents something both old and new. From the outside looking in, it may appear no different from the proliferation of Bitcoin clones and forks circa 2017 that brought us the likes of Super Bitcoin and lesser imitators. Do not let the usual suspects lull us back into a bubble mindset, because this is very different. To do so would be to miss a great transformation, something akin to what the late Bruce Lee said: “It’s like pointing at the moon. Do not concentrate on the finger, or you will miss all of the heavenly glory.”

Though somewhat obscured amidst the whirlwind of development, we are witnessing the rapid emergence of a new paradigm that has reclaimed the very essence of crypto. The principle of cooperation and the ethos of open source have been re-established by governance tokens, and it appears there will be no slowing down or looking back. With the advent of true cross-chain interoperability, coupled with an inspiring number of co-opetitive initiatives, we are redefining our world by way of governance-driven, crowd-wise communities. Never has the phrase “what we cannot do alone, we can do together” been more apropos.

The Bonded platform was created to incubate and deploy experimental, high-yield, smart-contract-driven financial instruments that push the bounds of open finance. Bonding is an algorithmic model that aims to unlock, aggregate and de-risk roughly $50 billion in dormant value distributed amongst untapped digital assets. Bonded proposes a new fintech solution deploying both permissioned and permissionless high-yield exotic financial products in the rapidly growing DeFi sector.

Philosophy

As yet another entrant into the fray, Bonded Finance diverged from old-guard business tenets, which typically ask “what can we capture?”, and replaced the question with “what can we contribute?” We believe the pathway to success for the crypto economy is collaboration. This mindset is not just a belief system; it is built into our operational model and the very mechanics of our smart-contract financial products.

The Name

When unstable atoms share, transfer and manage electrons to fill their outer shells, it is known as covalent bonding; ionic bonding occurs when two oppositely charged ions attract. These principles inspired the architecture of our smart instruments. A well-managed, shared basket of volatile assets can create a stable ecosystem, while bonding lenders and borrowers, despite their opposite agendas, allows a sharing of resources and creates lasting, sustainable growth and value.

The Opportunity

Our effort to add value and contribute has been laborious as we experimented with lending products while watching the space crawl, walk and occasionally stumble. After months of pivoting and market observation, it became clear that there is a gaping hole in DeFi. As more and more assets are put to work, an inordinate and truly inefficient amount of liquidity and resources in altcoins remains untapped. Certainly, Uniswap and AMMs have breathed much-needed life into many early-stage projects; however, the ability for many holders to aggregate and earn has been absent. Providing a way to put alternative coins to work is the essence of Bonded Finance.

In crypto today, there is roughly fifty billion USD in dormant value and some thirty billion in under-utilised liquidity locked in altcoins. Nor is this the full total: these estimates are taken from a large sample of verifiable projects with price and volume history. Currently, crypto lenders support only a handful of select assets, and though that number is slowly growing, the volatile and unrefined altcoin space remains rife with opportunity, and with risk in need of amelioration. In the most general terms, why should a supporter of a given project not be able to earn while keeping her long bag in tow? Again, we are entering a phase of great growth in which cooperation outstrips competition. Previously, projects were continually at each other’s throats and held hostage by traders; liquidity came at a great premium, and the success of one project essentially created the failings of another: a finite amount of liquidity playing musical chairs.

Through an educated lens, we can identify distributed pools of both capital and liquidity within the community-rich demographic of the altcoin markets. It is the mission of Bonded Finance to harvest this hidden, unchallenged capital.

To that end, we have created smart instruments that generate incentives, can reduce downside exposure and effectively realize value in a vast but largely untargeted market.

Our pioneering contribution leverages the untapped altcoin marketplace to create theoretically unlimited total value locked (TVL). Given the proliferation of new projects, we believe a suite of smart-contract instruments could go a long way toward securing value. Though we have some four products in the pipeline, we are happy to share our first: a simple crypto loan accelerator with a distinctive feature set.

Bonded Accelerated Crypto Loan (“ACL”):

· Allows loans against traditionally dormant assets

· Creates low-risk yield for liquidity providers (“LPs”) on those same assets

· High-yield, interest-bearing product

· Variable APY balanced by a price elasticity of demand algorithm

· Upside exposure of the collateral used to secure your investment

· Aggressive loan-to-value ratios to guard against volatility

· Dynamic loan-to-value (“LTV”) ratios and interest-automating algorithm
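The rate mechanics behind the “price elasticity of demand algorithm” are not spelled out here, so the following is only a minimal sketch of a utilization-driven variable rate, with all names and numbers invented for illustration: as borrowing drains the pool, the rate climbs, which both cools demand and attracts fresh liquidity.

```ruby
# Hypothetical sketch only: a variable rate that rises with pool utilization.
# `base_rate` and `slope` are invented parameters, not Bonded's actual values.
def variable_apy(borrowed, pool_size, base_rate: 0.02, slope: 0.20)
  utilization = borrowed.to_f / pool_size  # share of the pool currently on loan
  base_rate + slope * utilization          # rate climbs as demand rises
end

variable_apy(100_000, 1_000_000)  # low utilization  => ~0.04 (4% APY)
variable_apy(800_000, 1_000_000)  # high utilization => ~0.18 (18% APY)
```

Linear models like this are the simplest case; a production system would likely use a steeper curve near full utilization, but the balancing intuition is the same.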

To simply describe the Accelerated Crypto Loan:

Unlike a traditional loan, the issuer of the ACL is not a person, bank or lending institution but a pool of shared liquidity supported by makers looking to put their crypto to work. This ensures liquidity for borrowers, provides instant click-and-borrow functionality, eliminates third-party intermediaries, and allows compounded returns for LPs.

Let’s take a closer look at how this works:

· A borrower moves funds from an exchange wallet to the Bonded platform

· The Bonded interface determines the current loan-to-value ratio, the dynamic interest rate and the general risk profile

· Borrower deposits collateral then withdraws the loan from the LP pool

The loan amount will depend on a variety of factors:

· The specific coin deposited and the amount

· Current and post-deposit state of the liquidity pool

· Various other health factors related to pool size and volatility
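As a rough illustration of how those factors might combine (all figures, and the 50% ratio, are invented for the example and are not Bonded's actual parameters), the borrowable amount can be thought of as the collateral's value scaled by the LTV ratio, capped by what the pool can currently supply:

```ruby
# Hypothetical illustration only: loan principal derived from collateral
# value, an LTV ratio, and the pool's available liquidity.
def max_loan(collateral_value, ltv, pool_available)
  # A borrower cannot draw more than the pool currently holds.
  [collateral_value * ltv, pool_available].min
end

max_loan(10_000, 0.5, 100_000)  # healthy pool: up to 5000.0 against $10,000
max_loan(10_000, 0.5, 3_000)    # shallow pool caps the loan at 3000
```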

Over the life of the loan, Bonded LPs will benefit from their participation in a number of ways:

1. All generated fees and interest related to the loan are returned to the liquidity pool (and hence to the LPs), with the added benefit of a more secure pool and additional liquidity.

2. If applicable, borrowers in the Bonded protocol agree to “upside-redistribution”* at the end of the loan term.

3. A percentage of that gain is returned to the liquidity pool, and LP token holders are further compensated for their participation.

*Upside-redistribution takes place when the price of the asset at the end of the term is greater than at the time of loan issuance.

What you end up with is a positive feedback loop, for both lender and borrower, that supports higher yield generation on less liquid assets while growing the liquidity that determines rewards.
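To make the upside-redistribution arithmetic concrete, here is a hypothetical worked example; the 10% share and all figures are invented for illustration, as the actual percentage is not specified here.

```ruby
# Hypothetical worked example of upside-redistribution: if the collateral
# appreciated over the term, a share of the gain flows back to the pool;
# if it fell, nothing is redistributed. The 10% share is invented.
def upside_redistribution(units, price_at_open, price_at_close, share: 0.10)
  gain = units * (price_at_close - price_at_open)
  gain.positive? ? gain * share : 0.0
end

upside_redistribution(1_000, 2.00, 2.50)  # $500 gain => ~$50 back to the pool
upside_redistribution(1_000, 2.00, 1.50)  # price fell => 0.0
```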

Would you like to earn tokens right now? ☞ CLICK HERE

☞ Public Sale Details 
☞ Token Distribution

Looking for more information…

☞ Website
☞ Announcement
☞ Source Code
☞ Social Channel
☞ Documentation
☞ Coinmarketcap

Thanks for visiting and reading this article! Please share if you liked it!

#blockchain #bitcoin #cryptocurrency #bonded finance #bond


Words Counted: A Ruby Natural Language Processor.

WordsCounted

We are all in the gutter, but some of us are looking at the stars.

-- Oscar Wilde

WordsCounted is a Ruby NLP (natural language processor). WordsCounted lets you implement powerful tokenisation strategies with a very flexible tokeniser class.

Are you using WordsCounted to do something interesting? Please tell me about it.

 

Demo

Visit this website for one example of what you can do with WordsCounted.

Features

  • Out of the box, get the following data from any string or readable file, or URL:
    • Token count and unique token count
    • Token densities, frequencies, and lengths
    • Char count and average chars per token
    • The longest tokens and their lengths
    • The most frequent tokens and their frequencies.
  • A flexible way to exclude tokens from the tokeniser. You can pass a string, regexp, symbol, lambda, or an array of any combination of those types for powerful tokenisation strategies.
  • Pass your own regexp rules to the tokeniser if you prefer. The default regexp filters special characters but keeps hyphens and apostrophes. It also plays nicely with diacritics (UTF and unicode characters): Bayrūt is treated as ["Bayrūt"] and not ["Bayr", "ū", "t"], for example.
  • Opens and reads files. Pass in a file path or a url instead of a string.

Installation

Add this line to your application's Gemfile:

gem 'words_counted'

And then execute:

$ bundle

Or install it yourself as:

$ gem install words_counted

Usage

Pass in a string or a file path, and an optional filter and/or regexp.

counter = WordsCounted.count(
  "We are all in the gutter, but some of us are looking at the stars."
)

# Using a file
counter = WordsCounted.from_file("path/or/url/to/my/file.txt")

.count and .from_file are convenience methods that take an input, tokenise it, and return an instance of WordsCounted::Counter initialized with the tokens. The WordsCounted::Tokeniser and WordsCounted::Counter classes can be used alone, however.

API

WordsCounted

WordsCounted.count(input, options = {})

Tokenises input and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.count("Hello Beirut!")

Accepts two options: exclude and regexp. See Excluding tokens from the analyser and Passing in a custom regexp respectively.

WordsCounted.from_file(path, options = {})

Reads and tokenises a file, and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.from_file("hello_beirut.txt")

Accepts the same options as .count.

Tokeniser

The tokeniser allows you to tokenise text in a variety of ways. You can pass in your own rules for tokenisation, and apply a powerful filter with any combination of rules as long as they can boil down into a lambda.

Out of the box the tokeniser includes only alpha chars. Hyphenated tokens and tokens with apostrophes are considered a single token.

#tokenise([pattern: TOKEN_REGEXP, exclude: nil])

tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise

# With `exclude`
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise(exclude: "hello")

# With `pattern`
tokeniser = WordsCounted::Tokeniser.new("I <3 Beirut!").tokenise(pattern: /[a-z]/i)

See Excluding tokens from the analyser and Passing in a custom regexp for more information.

Counter

The WordsCounted::Counter class allows you to collect various statistics from an array of tokens.

#token_count

Returns the token count of a given string.

counter.token_count #=> 15

#token_frequency

Returns a sorted (unstable) two-dimensional array where each element is a token and its frequency. The array is sorted by frequency in descending order.

counter.token_frequency

[
  ["the", 2],
  ["are", 2],
  ["we",  1],
  # ...
  ["all", 1]
]

#most_frequent_tokens

Returns a hash where each key-value pair is a token and its frequency.

counter.most_frequent_tokens

{ "are" => 2, "the" => 2 }

#token_lengths

Returns a sorted (unstable) two-dimensional array where each element contains a token and its length. The array is sorted by length in descending order.

counter.token_lengths

[
  ["looking", 7],
  ["gutter",  6],
  ["stars",   5],
  # ...
  ["in",      2]
]

#longest_tokens

Returns a hash where each key-value pair is a token and its length.

counter.longest_tokens

{ "looking" => 7 }

#token_density([ precision: 2 ])

Returns a sorted (unstable) two-dimensional array where each element contains a token and its density as a float, rounded to a precision of two. The array is sorted by density in descending order. It accepts a precision argument, which must be a float.

counter.token_density

[
  ["are",     0.13],
  ["the",     0.13],
  ["but",     0.07 ],
  # ...
  ["we",      0.07 ]
]
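Density here is simply a token's frequency divided by the total token count, rounded to the requested precision. Recomputing it by hand for the Oscar Wilde line above shows where the 0.13 comes from:

```ruby
# Recomputing token density by hand for the sentence used throughout:
# density = frequency / total token count, rounded to two places.
tokens = "we are all in the gutter but some of us are looking at the stars".split
total  = tokens.size              # 15 tokens
freq   = tokens.tally             # { "are" => 2, "the" => 2, ... }

freq["are"].fdiv(total).round(2)  # 2 / 15 => 0.13
freq["but"].fdiv(total).round(2)  # 1 / 15 => 0.07
```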

#char_count

Returns the char count of tokens.

counter.char_count #=> 76

#average_chars_per_token([ precision: 2 ])

Returns the average char count per token rounded to two decimal places. Accepts a precision argument which defaults to two. Precision must be a float.

counter.average_chars_per_token #=> 4

#uniq_token_count

Returns the number of unique tokens.

counter.uniq_token_count #=> 13

Excluding tokens from the tokeniser

You can exclude anything you want from the input by passing the exclude option. The exclude option accepts a variety of filters and is extremely flexible.

  1. A space-delimited string. The filter will normalise the string.
  2. A regular expression.
  3. A lambda.
  4. A symbol that names a predicate method. For example :odd?.
  5. An array of any combination of the above.

tokeniser =
  WordsCounted::Tokeniser.new(
    "Magnificent! That was magnificent, Trevor."
  )

# Using a string
tokeniser.tokenise(exclude: "was magnificent")
# => ["that", "trevor"]

# Using a regular expression
tokeniser.tokenise(exclude: /trevor/)
# => ["magnificent", "that", "was", "magnificent"]

# Using a lambda
tokeniser.tokenise(exclude: ->(t) { t.length < 4 })
# => ["magnificent", "that", "magnificent", "trevor"]

# Using symbol
tokeniser = WordsCounted::Tokeniser.new("Hello! محمد")
tokeniser.tokenise(exclude: :ascii_only?)
# => ["محمد"]

# Using an array
tokeniser = WordsCounted::Tokeniser.new(
  "Hello! اسماءنا هي محمد، كارولينا، سامي، وداني"
)
tokeniser.tokenise(
  exclude: [:ascii_only?, /محمد/, ->(t) { t.length > 6}, "و"]
)
# => ["هي", "سامي", "وداني"]

Passing in a custom regexp

The default regexp accounts for letters, hyphenated tokens, and apostrophes. This means twenty-one is treated as one token. So is Mohamad's.

/[\p{Alpha}\-']+/

You can pass your own criteria as a Ruby regular expression to split your string as desired.

For example, if you wanted to include numbers, you can override the regular expression:

counter = WordsCounted.count("Numbers 1, 2, and 3", pattern: /[\p{Alnum}\-']+/)
counter.tokens
#=> ["numbers", "1", "2", "and", "3"]

Opening and reading files

Use the from_file method to open files. from_file accepts the same options as .count. The file path can be a URL.

counter = WordsCounted.from_file("url/or/path/to/file.text")

Gotchas

A hyphen used in lieu of an em or en dash will form part of the token. This affects the tokeniser algorithm.

counter = WordsCounted.count("How do you do?-you are well, I see.")
counter.token_frequency

[
  ["do",   2],
  ["how",  1],
  ["you",  1],
  ["-you", 1], # WTF, mate!
  ["are",  1],
  # ...
]

In this example -you and you are separate tokens. Also, the tokeniser does not include numbers by default. Remember that you can pass your own regular expression if the default behaviour does not fit your needs.

A note on case sensitivity

The program normalises (downcases) all incoming strings for consistency and for filtering.

Roadmap

Ability to open URLs

def self.from_url
  # open url and send string here after removing html
end

Contributors

See contributors.

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Author: abitdodgy
Source code: https://github.com/abitdodgy/words_counted
License: MIT license

#ruby  #ruby-on-rails 
