What is Router Protocol (ROUTE)?

Router Protocol is a cross-chain liquidity aggregator platform built to provide seamless bridging infrastructure between current and emerging Layer 1 and Layer 2 blockchain solutions. The goal is to let users swap assets across different networks in a near-instant and low-cost manner.

On cross-chain liquidity: Introducing the Router Protocol

As a multi-blockchain future becomes a reality, there is a critical need for infrastructure that can port liquidity across chains if the ecosystem is to develop.

Ethereum has been great: it is the first Turing-complete, trustless ‘world computer’ of its kind. And we are only beginning to tap the full potential of Ethereum, which should be unleashed with Eth 2.0 and its transition to proof of stake (PoS) over the coming months and years.

However, the fact remains that scalability efforts are still in their early days, as most platforms continue to wrestle with the scalability trilemma: the trade-off between decentralization, scalability, and security. Layer 2 solutions include Bitcoin’s Lightning Network, as well as multiple solutions for Ethereum such as Plasma. Layer 1 solutions include sharding, as well as consensus-protocol changes that inevitably involve sliding-scale trade-offs across the axes of the trilemma. While Eth 2.0 promises figures of over 100,000 tps (transactions per second) at some point in the future, there are significant technical hurdles to overcome before the sharding in ETH 2.0 can get there. Meanwhile, most major blockchain solutions offer tps figures far below those of Visa, Mastercard, or PayPal, which handle at least 1,500 tps even by a conservative estimate.

Therefore we have multiple efforts aimed at scalability. In addition to Eth 2.0, there are consensus-protocol approaches such as Dfinity, Tezos, Polkadot, and Cosmos, as well as Layer 2 solutions such as Matic, OmiseGO, and xDai. There is no conclusive winner yet, and the future of blockchains may well be multi-chain for quite a while: a fragmented market, with liquidity and developer communities dispersed across these platforms.

In this scenario, what will be needed is infrastructure that seamlessly ports liquidity between these isolated liquidity pools, almost like highways connecting far-flung cities. The development of railways starting in the early 1800s helped build the US economy and laid the foundations for it to become the superpower of the 20th century; it also spurred the growth of cities across the United States. Similarly, cross-chain bridging infrastructure will promote liquidity migration and developer effort toward emerging chains and solutions, and eventually lead to a thriving, bustling blockchain ecosystem.

Router is a cross-chain liquidity infrastructure primitive that aims to seamlessly provide bridging infrastructure between various current and emerging Layer 1 and Layer 2 blockchain solutions, such as Matic and Ethereum.

Over time, Router plans to build out bridging infrastructure to multiple other chains on its roadmap. Stay tuned as we roll out the product and expand on our roadmap over the coming weeks.

Router Protocol is set on building a cross-chain future

The decentralized ecosystem consists of over 6,000 cryptocurrencies that currently operate in closed-off, isolated infrastructures, able to access neither the liquidity nor the functionality offered by the others. Protocols and communities across the industry are continually battling to claim their spot in the emerging Web 3.0 paradigm. And while the rise of Web 3.0 is inevitable, it is still too early to tell what the ecosystem will look like: will it evolve into a monopoly, with the entire space dominated by a single blockchain platform, or into an oligopoly, where a handful of protocols control the market?

What is clear, however, is the need to bridge these fragmented networks. With some 6,000 different cryptocurrencies on the market encapsulating over $752 billion in value, there has never been a more pressing need for porting liquidity across them. As more institutional investors join the crypto space, the need for better efficiency and flexibility grows almost exponentially.

The crypto industry was built on a fundamental premise: the fragmentation and limitations endemic to traditional finance needed to be replaced with a better, more robust system built with user needs front and center. The world of traditional finance, while dealing with a wide variety of asset classes, is deeply flawed: the limitations it has essentially placed upon itself do more to facilitate rent-seeking behavior than to benefit its users. Unfortunately, several of these legacy encumbrances have been grandfathered into our new world of blockchains.

When the need arises to transfer assets from one blockchain to another, users are kept in the dark about exactly what goes on under the hood, and are forced to pay high fees for technically simple transfers. For example, at the time of writing, the cost of transferring value on Ethereum is near its all-time high at over $10, and a token swap on a service like Uniswap costs north of $50.

Router Protocol wants to resolve this issue, which becomes more important as the world grows more connected. We at Router believe that the future lies in cross-chain interoperability; the crypto industry doesn’t have to suffer from the fragmentation seen in traditional finance.

One of the most integral parts of the emerging Web 3.0 paradigm is liquidity farming. While opinions on the topic are divided, liquidity farming is here to stay and should be seen as a largely positive development for the crypto industry. Unlike the data farming that shaped the Web 2.0 era, yield farming in the Web 3.0 ecosystem provides users with real, tangible benefits: a steady trickle of yield.

With Web 3.0 being all about startups building products around various facets of capital flows, the rise of financial primitives such as loans seems inevitable.

At the same time, the trustless, open-source nature of the emerging Web 3.0 ecosystem is set to produce a large number of closed-off projects, each competing for a small slice of the market. Such a fragmented market will yield multiple Layer 1 and Layer 2 solutions, each working to make its own system more efficient.

As a cross-chain liquidity infrastructure primitive, Router Protocol aims to provide a seamless bridge between those Layer 1 and Layer 2 blockchain platforms. Aside from connecting blockchains and enabling a free flow of information, Router will also make smart order routing possible, enabling users to swap their assets across different networks seamlessly. Using Router will be completely transparent: users will be able to see what goes on “under the hood” every time they interact with the protocol.

Liquidity Landscape after Router Protocol

Being transparent, however, doesn’t have to mean being complicated. Those interacting with Router Protocol will enjoy a largely hands-off user experience: when swapping assets, the system will automatically find the best market price and execute the swap, as sketched below.
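
As a toy illustration of that idea (this is not Router’s published implementation; the venue names and rates below are hypothetical), best-price execution boils down to querying several liquidity venues for a quote and executing against the best one:

# Hypothetical sketch of best-price order routing. Venue names and
# rates are invented for illustration; Router's actual routing logic
# is not public.
quotes = {
  "venue_a" => 0.9971, # output tokens per input token, net of fees
  "venue_b" => 0.9985,
  "venue_c" => 0.9962
}

best_venue, best_rate = quotes.max_by { |_venue, rate| rate }
amount_in = 1_000.0

puts "Route via #{best_venue}: #{(amount_in * best_rate).round(4)} out"
#=> Route via venue_b: 998.5 out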

The first major product from Router Protocol will be a bridge between the Matic Network and Ethereum. This bridge will give Ethereum a scaling solution that is near-instant, low-cost, and incredibly flexible, and it will let users get the best of both worlds: Matic users will reap the benefits of Ethereum’s unprecedented liquidity, while Ethereum users will be able to take advantage of Matic’s scalability and high throughput.

In the future, Router plans to build bridging infrastructure between multiple other blockchains and blockchain solutions. Stay tuned as we roll out new products and expand our roadmap in the coming weeks. Router Protocol will extend interoperability across many different layers; we are currently considering Avalanche, Binance Smart Chain, Cardano, Algorand, Solana, Stellar, Polkadot, and Litecoin.

Total Supply: 20,000,000 ROUTE
Initial Circulating Supply: 1,022,865 ROUTE

Token Distribution

Seed Round: 600,000 ROUTE, 3.00% of Total Supply
Private Round: 1,521,818 ROUTE, 7.61% of Total Supply
Reward Pool: 3,444,000 ROUTE, 17.22% of Total Supply
Ecosystem Fund: 5,084,000 ROUTE, 25.42% of Total Supply
Team: 3,000,000 ROUTE, 15.00% of Total Supply
Liquidity Provision Fund: 350,000 ROUTE, 1.75% of Total Supply
Foundation: 4,000,000 ROUTE, 20.00% of Total Supply
Strategic Partners: 2,000,000 ROUTE, 10.00% of Total Supply
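
The percentages above follow directly from dividing each allocation by the 20,000,000 ROUTE total supply. A quick sanity-check sketch (illustrative only, using the figures from the table):

# Derive each allocation's share of the 20,000,000 ROUTE total supply.
TOTAL_SUPPLY = 20_000_000.0

allocations = {
  "Seed Round"               => 600_000,
  "Private Round"            => 1_521_818,
  "Reward Pool"              => 3_444_000,
  "Ecosystem Fund"           => 5_084_000,
  "Team"                     => 3_000_000,
  "Liquidity Provision Fund" => 350_000,
  "Foundation"               => 4_000_000,
  "Strategic Partners"       => 2_000_000
}

allocations.each do |name, amount|
  puts format("%-25s %.2f%%", name, amount / TOTAL_SUPPLY * 100)
end
#=> e.g. "Private Round             7.61%"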

Token Vesting Schedule

Seed Round: Vesting over 15 months, with a 6-month cliff
Private Round: Vesting over 9 months
Reward Pool: Locked in a smart contract for daily distribution over 12 months, as per reward programs
Ecosystem Fund: 8% unlocked at day 0, 7% unlocked at month 3, remainder distributed monthly over 17 months
Team: 10% unlocked at month 9, remainder distributed monthly over 12 months
Liquidity Provision Fund: 80,000 ROUTE unlocked at day 0, 30,000 ROUTE at day 30, remainder distributed quarterly over 12 months
Foundation: 10% unlocked at month 9, remainder distributed monthly over 12 months
Strategic Partners: Vesting over 12 months

Initial Circulating Supply at listing: 1,022,865 ROUTE (5.11% of Total Supply), including:
- 304,364 ROUTE from the private investors’ unlock
- 80,000 ROUTE from the Liquidity Provision Fund
- 406,720 ROUTE from the Ecosystem Fund
- 31,371 ROUTE from the Reward Pool
- 200,000 ROUTE from Strategic Partners

Thanks for visiting and reading this article! Please share it if you liked it!

#blockchain #bitcoin #crypto #routerprotocol #route

WordsCounted: A Ruby Natural Language Processor

WordsCounted

We are all in the gutter, but some of us are looking at the stars.

-- Oscar Wilde

WordsCounted is a Ruby natural language processor. It lets you implement powerful tokenisation strategies with a very flexible tokeniser class.

Features

  • Out of the box, get the following data from any string or readable file, or URL:
    • Token count and unique token count
    • Token densities, frequencies, and lengths
    • Char count and average chars per token
    • The longest tokens and their lengths
    • The most frequent tokens and their frequencies.
  • A flexible way to exclude tokens from the tokeniser. You can pass a string, regexp, symbol, lambda, or an array of any combination of those types for powerful tokenisation strategies.
  • Pass your own regexp rules to the tokeniser if you prefer. The default regexp filters special characters but keeps hyphens and apostrophes. It also plays nicely with diacritics (UTF and unicode characters): Bayrūt is treated as ["Bayrūt"] and not ["Bayr", "ū", "t"], for example.
  • Opens and reads files. Pass in a file path or a url instead of a string.

Installation

Add this line to your application's Gemfile:

gem 'words_counted'

And then execute:

$ bundle

Or install it yourself as:

$ gem install words_counted

Usage

Pass in a string or a file path, and an optional filter and/or regexp.

counter = WordsCounted.count(
  "We are all in the gutter, but some of us are looking at the stars."
)

# Using a file
counter = WordsCounted.from_file("path/or/url/to/my/file.txt")

.count and .from_file are convenience methods that take an input, tokenise it, and return an instance of WordsCounted::Counter initialized with the tokens. The WordsCounted::Tokeniser and WordsCounted::Counter classes can be used alone, however.
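
For example, a minimal sketch of driving the two classes directly (mirroring what the convenience methods do internally):

require "words_counted"

# Tokenise the input ourselves, then hand the tokens to a Counter.
tokens  = WordsCounted::Tokeniser.new("We are all in the gutter").tokenise
counter = WordsCounted::Counter.new(tokens)

counter.token_count #=> 6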

API

WordsCounted

WordsCounted.count(input, options = {})

Tokenises input and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.count("Hello Beirut!")

Accepts two options: exclude and pattern. See Excluding tokens from the tokeniser and Passing in a custom regexp respectively.

WordsCounted.from_file(path, options = {})

Reads and tokenises a file, and initializes a WordsCounted::Counter object with the resulting tokens.

counter = WordsCounted.from_file("hello_beirut.txt")

Accepts the same options as .count.

Tokeniser

The tokeniser allows you to tokenise text in a variety of ways. You can pass in your own rules for tokenisation, and apply a powerful filter with any combination of rules, as long as they boil down to a lambda.

Out of the box the tokeniser includes only alpha chars. Hyphenated tokens and tokens with apostrophes are considered a single token.
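
For instance (a small sketch; note the output is downcased, as described under the case-sensitivity note below):

WordsCounted::Tokeniser.new("Mother-in-law's visit!").tokenise
#=> ["mother-in-law's", "visit"]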

#tokenise([pattern: TOKEN_REGEXP, exclude: nil])

tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise

# With `exclude`
tokeniser = WordsCounted::Tokeniser.new("Hello Beirut!").tokenise(exclude: "hello")

# With `pattern`
tokeniser = WordsCounted::Tokeniser.new("I <3 Beirut!").tokenise(pattern: /[a-z]/i)

See Excluding tokens from the tokeniser and Passing in a custom regexp for more information.

Counter

The WordsCounted::Counter class allows you to collect various statistics from an array of tokens.

#token_count

Returns the token count of a given string.

counter.token_count #=> 15

#token_frequency

Returns a sorted (unstable) two-dimensional array where each element is a token and its frequency. The array is sorted by frequency in descending order.

counter.token_frequency

[
  ["the", 2],
  ["are", 2],
  ["we",  1],
  # ...
  ["all", 1]
]

#most_frequent_tokens

Returns a hash where each key-value pair is a token and its frequency.

counter.most_frequent_tokens

{ "are" => 2, "the" => 2 }

#token_lengths

Returns a sorted (unstable) two-dimensional array where each element contains a token and its length. The array is sorted by length in descending order.

counter.token_lengths

[
  ["looking", 7],
  ["gutter",  6],
  ["stars",   5],
  # ...
  ["in",      2]
]

#longest_tokens

Returns a hash where each key-value pair is a token and its length.

counter.longest_tokens

{ "looking" => 7 }

#token_density([ precision: 2 ])

Returns a sorted (unstable) two-dimensional array where each element contains a token and its density as a float, rounded to a precision of two. The array is sorted by density in descending order. It accepts a precision argument, which must be a float.

counter.token_density

[
  ["are",     0.13],
  ["the",     0.13],
  ["but",     0.07 ],
  # ...
  ["we",      0.07 ]
]

#char_count

Returns the char count of tokens.

counter.char_count #=> 76

#average_chars_per_token([ precision: 2 ])

Returns the average char count per token rounded to two decimal places. Accepts a precision argument which defaults to two. Precision must be a float.

counter.average_chars_per_token #=> 4

#uniq_token_count

Returns the number of unique tokens.

counter.uniq_token_count #=> 13

Excluding tokens from the tokeniser

You can exclude anything you want from the input by passing the exclude option. The exclude option accepts a variety of filters and is extremely flexible.

  1. A space-delimited string. The filter will normalise the string.
  2. A regular expression.
  3. A lambda.
  4. A symbol that names a predicate method. For example :odd?.
  5. An array of any combination of the above.

tokeniser =
  WordsCounted::Tokeniser.new(
    "Magnificent! That was magnificent, Trevor."
  )

# Using a string
tokeniser.tokenise(exclude: "was magnificent")
# => ["that", "trevor"]

# Using a regular expression
tokeniser.tokenise(exclude: /trevor/)
# => ["magnificent", "that", "was", "magnificent"]

# Using a lambda
tokeniser.tokenise(exclude: ->(t) { t.length < 4 })
# => ["magnificent", "that", "magnificent", "trevor"]

# Using symbol
tokeniser = WordsCounted::Tokeniser.new("Hello! محمد")
tokeniser.tokenise(exclude: :ascii_only?)
# => ["محمد"]

# Using an array
tokeniser = WordsCounted::Tokeniser.new(
  "Hello! اسماءنا هي محمد، كارولينا، سامي، وداني"
)
tokeniser.tokenise(
  exclude: [:ascii_only?, /محمد/, ->(t) { t.length > 6}, "و"]
)
# => ["هي", "سامي", "وداني"]

Passing in a custom regexp

The default regexp accounts for letters, hyphenated tokens, and apostrophes. This means twenty-one is treated as one token. So is Mohamad's.

/[\p{Alpha}\-']+/
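
For instance, with the default pattern (a quick sketch):

WordsCounted.count("twenty-one of Mohamad's books").tokens
#=> ["twenty-one", "of", "mohamad's", "books"]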

You can pass your own criteria as a Ruby regular expression to split your string as desired.

For example, if you wanted to include numbers, you can override the regular expression:

counter = WordsCounted.count("Numbers 1, 2, and 3", pattern: /[\p{Alnum}\-']+/)
counter.tokens
#=> ["numbers", "1", "2", "and", "3"]

Opening and reading files

Use the from_file method to open files. from_file accepts the same options as .count. The file path can be a URL.

counter = WordsCounted.from_file("url/or/path/to/file.text")

Gotchas

A hyphen used in lieu of an em or en dash will form part of the token. This affects how the tokeniser splits the text.

counter = WordsCounted.count("How do you do?-you are well, I see.")
counter.token_frequency

[
  ["do",   2],
  ["how",  1],
  ["you",  1],
  ["-you", 1], # WTF, mate!
  ["are",  1],
  # ...
]

In this example -you and you are separate tokens. Also, the tokeniser does not include numbers by default. Remember that you can pass your own regular expression if the default behaviour does not fit your needs.

A note on case sensitivity

The program normalises (downcases) all incoming strings for consistency and to simplify filtering.
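
A short sketch of the effect:

counter = WordsCounted.count("Hello HELLO hello")
counter.token_frequency
#=> [["hello", 3]]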

Roadmap

Ability to open URLs

def self.from_url
  # open url and send string here after removing html
end
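
One possible shape for this feature (a sketch only; it is not part of the released gem, and the open-uri and nokogiri dependencies are assumptions):

require "open-uri"
require "nokogiri" # assumed dependency for stripping HTML

# Hypothetical implementation: fetch the page, strip markup, and
# delegate to .count with the remaining visible text.
def self.from_url(url, options = {})
  html = URI.open(url).read
  text = Nokogiri::HTML(html).text
  count(text, options)
end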

Are you using WordsCounted to do something interesting? Please tell me about it.

Documentation is available on RubyDoc.

Demo

Visit this website for one example of what you can do with WordsCounted.


Contributors

See contributors.

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Author: Abitdodgy
Source Code: https://github.com/abitdodgy/words_counted 
License: MIT license

#ruby #nlp 

SafeMoon Clone: Create a DeFi Token Like SafeMoon

SafeMoon is a decentralized finance (DeFi) token that combines RFI tokenomics with an auto-liquidity-generating protocol. A DeFi token like SafeMoon has reached mainstream standards on the Binance Smart Chain, and its success and popularity have been immense, leading many business firms to adopt this style of cryptocurrency as an alternative.

A DeFi token like SafeMoon works much like any other crypto token, the main difference being that it charges a 10% fee on sell transactions, half of which (5% of the transaction) is distributed to the remaining SafeMoon holders. This feature rewards owners for holding onto their tokens; a simplified sketch of the mechanic follows.
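
As a toy sketch of that fee mechanic (simplified numbers; this is not SafeMoon’s actual contract code):

# Simplified model of a 10% sell fee where half of the fee
# (5% of the transaction) is redistributed to existing holders.
sell_amount  = 1_000.0               # tokens sold
fee          = sell_amount * 0.10    #=> 100.0
to_holders   = fee / 2               #=> 50.0 redistributed pro rata
to_liquidity = fee - to_holders      #=> 50.0 (e.g. to auto-liquidity)
seller_nets  = sell_amount - fee     #=> 900.0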

Read More @ https://bit.ly/3oFbJoJ

#create a defi token like safemoon #defi token like safemoon #safemoon token #safemoon token clone #defi token

SafeMoon Token Clone Development

SafeMoon token clone development is a new trendsetter in the digital world, bringing significant changes that help investors grow their businesses in a short period. The SafeMoon token clone is one of the most widely discussed topics among global users, with its value soaring in the marketplace. SafeMoon token development combines RFI tokenomics with an auto-liquidity-generating process. The SafeMoon token clone is a replica of a decentralized finance (DeFi) token that is highly scalable and implemented with tamper-proof security.

SafeMoon tokens execute functions such as RFI static rewards, automated liquidity provision, and automatic token burns. The SafeMoon token is considered one of the most advanced coins in the crypto market, and it gained global attention for maintaining the stability of its asset value without major fluctuations in the marketplace. The SafeMoon token clone is completely decentralized, eliminating the need for intermediaries and benefiting users with lower transaction fees and shorter wait times than the traditional banking process.

Reasons to invest in a SafeMoon token clone:

  • The SafeMoon token clone benefits investors with an automated liquidity pool, a unique feature that adds more revenue for business growth in less time. Traders can trade around the clock and reap profits with a small investment in the SafeMoon token.
  • It is integrated with high-end security protocols, such as two-factor authentication and a signature process, to prevent hacks and other malicious activity. The smart-contract system in SafeMoon token development manages the overall operation of transactions without delay.
  • Users can obtain rewards based on the volume of SafeMoon tokens traded in the marketplace. The trading mechanism allows users to trade SafeMoon tokens at the best price for farming, and users can earn higher rewards based on the volume of tokens they stake in the trade market.
  • It allows token holders to gain complete ownership of their SafeMoon tokens after purchasing them on DeFi exchanges. The SafeMoon community governs token distribution, price fluctuations, staking, and every other token activity, and the community boosts the value of SafeMoon tokens.
  • With automated token burns, no single party retains control over the burned SafeMoon tokens; instead, the community can manage burns efficiently to support the token’s value in the marketplace. Transactions of SafeMoon tokens on the blockchain are fast, safe, and secure.

SafeMoon token clone development offers a promising future for upcoming investors and startups looking to increase their business revenue in less time. The SafeMoon token clone is in great demand among millions of users for its market value. Investors can contact a leading firm such as Infinite Block Tech for assistance in developing a world-class SafeMoon token clone that accelerates business growth.

#safemoon token #safemoon token clone #safemoon token clone development #defi token