Ben Taylor

What is Tokenomics | The Real Value of a Token

In this post, you'll learn what tokenomics is and why it matters.

So much about the crypto ecosystem is novel and disruptive, including its vocabulary, which features entirely new words invented to describe entirely new concepts. Tokenomics is a great example. It is what is known as a portmanteau, a word that blends the meaning of two other words - tokens and economics. It fills an empty space in the dictionary, describing how the mechanics of a cryptocurrency - its supply, distribution and incentive structure - relate to its value.

What is Cryptocurrency?

Although there are many different types of cryptocurrencies, they all have one thing in common: they operate on blockchain technology, making them decentralized. Decentralization of financial operations through cryptocurrencies has several efficiencies over the traditional financial system, including:

  • Cuts out almost all the overhead costs associated with banks
  • Less expensive transactions that can be sent and received internationally
  • Inflation or finite supply that’s written into code — no need to trust the Federal Reserve
  • Financial derivatives like trading strategies and loans can be coded directly onto certain cryptocurrency blockchains, replacing the need for financial intermediaries.

The largest cryptocurrency is Bitcoin, and it's used as a "digital gold." Essentially, Bitcoin is a commodity used as a store of value. Ethereum is the 2nd-largest cryptocurrency, with a market cap of $170 billion. Developers can build smart contracts on Ethereum's blockchain to create decentralized alternatives to traditional banking functions, like lending and trading.

How Does Cryptocurrency Work?

Cryptocurrencies are digital assets powered by blockchain technology. A blockchain stores a ledger of every transaction of the cryptocurrency on every node powering the network. Nodes are computers connected to Bitcoin's network, many of which mine Bitcoin. If one of these miners tries to enter false transactions, those transactions are rejected because they conflict with the ledger held by the rest of the network.

The correct ledger is determined by the majority of miners' records. In theory, you could hack a blockchain by controlling 51% of the cryptocurrency's network, in what is called a 51% attack. In practice, this is economically infeasible: it would require a carefully choreographed attack using billions, if not trillions, of dollars' worth of computer hardware.

To transact with a cryptocurrency, you need to have a set of public and private keys. These keys are like passwords generated by your cryptocurrency wallet. Your public key is connected to your wallet’s address and allows people to send you cryptocurrency. Your private key is used to approve transactions being sent from your wallet –– only you have access to your private keys.
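
To make the key mechanics concrete, here is a minimal sketch of signing and verifying a payload, assuming the third-party Python ecdsa package and Bitcoin's SECP256k1 curve. Real wallets add address encoding, key derivation and much more; this only illustrates why the private key must stay secret while the public key can be shared freely.

# Minimal sign/verify sketch using the third-party `ecdsa` package (an
# assumption - wallets use their own hardened implementations).
from ecdsa import SigningKey, SECP256k1

private_key = SigningKey.generate(curve=SECP256k1)   # kept secret by the wallet
public_key = private_key.get_verifying_key()         # shareable, tied to your address

transaction = b"send 0.1 BTC to bc1q..."             # placeholder payload
signature = private_key.sign(transaction)            # only the private key can produce this

print(public_key.verify(signature, transaction))     # anyone can check it: True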

Contrary to popular belief, many cryptocurrencies don’t have a finite supply. Bitcoin’s total supply is capped at 21 million coins, but many altcoins have a set inflation rate with no cap on total supply, like Ethereum.

What is tokenomics?

The term's constituent parts - token and economics - may seem a confusing pairing if you assume that cryptocurrencies are simply a new form of internet money. In reality, crypto can apply to any form of value transfer.

This is why the word token is used: units of cryptocurrency value can function as money, but they can also give the holder specific utility. Just as a games arcade or laundromat may require that you use a specific token to operate their machines, many blockchain-based services are powered by their own token, which unlocks specific privileges or rewards:

  • DEFI - users are rewarded with tokens for activity (borrowing/lending), or tokens are created as synthetic versions of other existing cryptocurrencies
  • DAOs - token holders get voting rights within Decentralised Autonomous Organisations, new digital communities governed by Smart Contracts
  • Gaming/Metaverse - where game activity and in-game items are represented by tokens and can have exchangeable value

If we add this understanding of cryptocurrency as tokens to the traditional definition of economics - the study of the production, distribution and consumption of goods and services - we can break down what tokenomics measures into:

  1. how tokens are produced via their supply schedule, using a specific set of supply metrics
  2. how tokens are distributed among holders
  3. the incentives that encourage usage and ownership of tokens

We can start to unpack these aspects of tokenomics by looking at the supply schedule for the first ever cryptocurrency, Bitcoin.

1. Supply schedule

Bitcoin went live in January 2009, based on a set of rules - the Bitcoin Protocol - that included a clearly defined supply schedule:

  • New bitcoins are created through Mining. Miners compete to process a new block of transactions by committing computing power to solve a mathematical puzzle. They do this by running a set algorithm hoping to find the answer - this is known as Proof of Work.
  • A new block is mined roughly every ten minutes. The system is self-regulating, through a difficulty adjustment of the mining algorithm every two weeks, to maintain the steady rate of block creation.
  • The mining reward began at 50 BTC in 2009, but halves every 210,000 blocks - roughly four years. There have been three so-called halvings - the last in May 2020 - with the block reward now set at 6.25 BTC.
  • This fixed supply schedule will continue until a maximum of 21 million bitcoin are created.
  • There is no other way that bitcoin can be created
  • Along with the block reward, successful Miners also receive fees that each transaction pays to be sent over the network

The importance of Bitcoin's fixed supply schedule to perceived value cannot be overstated. It enables us to know Bitcoin's inflation rate over time - its programmed scarcity.

It also tells us that as of January 2022, 90% of Bitcoin supply has been mined and that the maximum supply will be reached in around 2140, at which point the only reward Miners will receive will be the transaction fees.
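
To make the schedule concrete, here is a small Python sketch that reproduces the headline numbers from nothing more than the 50 BTC starting reward and the 210,000-block halving interval. The January 2022 block height used below is an approximation, included purely for illustration.

# Sketch of Bitcoin's emission schedule from the rules above.
INITIAL_REWARD = 50.0
BLOCKS_PER_ERA = 210_000

def supply_at(height: int) -> float:
    """Total BTC issued after `height` blocks."""
    supply, reward, remaining = 0.0, INITIAL_REWARD, height
    while remaining > 0 and reward >= 1e-8:        # stop once the reward drops below 1 satoshi
        mined = min(remaining, BLOCKS_PER_ERA)
        supply += mined * reward
        remaining -= mined
        reward /= 2                                # the halving
    return supply

max_supply = supply_at(64 * BLOCKS_PER_ERA)        # effectively all eras
print(f"Maximum supply: ~{max_supply:,.0f} BTC")                      # ~21,000,000
print(f"Mined by Jan 2022: {supply_at(717_000) / max_supply:.0%}")    # ~90% (approx. height)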

The supply schedule is a critical piece of the tokenomics puzzle. If a coin has a maximum supply, you know that inflation will decline to zero at the point the last coins are mined. This quality is described as disinflationary - supply keeps increasing, but at a decreasing marginal rate - and is a valuable characteristic for anything intended to function as a store of value.

If there is no maximum supply, tokens will keep being created indefinitely, potentially diluting value. This is true of the existing fiat monetary system, and is one of its biggest criticisms, along with the uncertainty that surrounds changes in supply.

To know whether the supply of fiat money is expanding or contracting - with the obvious knock-on effects on its purchasing power and the wider economy - you have to wait anxiously for the outcome of periodic closed-door Federal Reserve or ECB meetings. Contrast that with the certainty that Bitcoin's fixed supply schedule provides, which even allows scarcity-based models to attempt to predict its value.

Supply Metrics

As the first example of a cryptocurrency, Bitcoin effectively introduced the concept of tokenomics, along with a set of metrics that break down the supply schedule of any cryptocurrency into key components, giving valuable insight into potential, or comparative, value.

These common yardsticks are published on popular crypto price comparison sites like Coinmarketcap or Coingecko as a complement to the headline price and volume data. 

  • Maximum Supply - A hard cap on the total number of coins that will ever exist. In the case of Bitcoin 21 million.
  • Disinflationary - Coins with a maximum supply are described as disinflationary, because the marginal supply increase decreases over time (a coin whose circulating supply actually shrinks, for example through burning, is deflationary).
  • Inflationary - Coins without a maximum supply are described as inflationary because the supply will constantly grow - inflate - over time, which may decrease the purchasing power of existing coins.
  • Total Supply - The total number of coins in existence right now.
  • Circulating Supply - The best guess of the total number of coins circulating in the public’s hands right now. In the case of Bitcoin, Total Supply and Circulating Supply are the same thing because its distribution was broadcast from day one.
  • Market Capitalisation - The Circulating Supply multiplied by current price; this is the main metric for measuring the overall value and importance of a cryptocurrency, just as it is for public companies, which multiply share price by the number of tradable shares. Generally abbreviated to Marketcap, it is often used as a proxy for value; though helpful in a comparative sense, its reliance on price means it reflects what the last person was prepared to pay, which is a very different thing from fundamental value (see the sketch after this list).
  • Fully Diluted Market Capitalisation - The maximum supply multiplied by current price; this projects an overall value of the fully supplied coin, but based on current price.
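
As a quick illustration of the last two metrics, here is a minimal sketch with invented numbers - these are assumptions for the example, not live market data.

# Marketcap vs Fully Diluted Marketcap, using invented numbers.
circulating_supply = 19_000_000       # coins currently in circulation (assumed)
max_supply         = 21_000_000       # hard cap, Bitcoin-style (assumed)
price              = 40_000           # current price in USD (assumed)

market_cap        = circulating_supply * price
fully_diluted_cap = max_supply * price

print(f"Market capitalisation:         ${market_cap:,.0f}")
print(f"Fully diluted capitalisation:  ${fully_diluted_cap:,.0f}")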

Read more: Using CoinMarketCap Like A Pro | A Guide to Coinmarketcap (CMC)

2. Supply Distribution

Whereas the Supply Schedule tells you what the current Circulating Supply is and the rate at which coins are being created, Supply Distribution considers how coins are spread among addresses, which can have a big influence on value and is another important part of tokenomics.

Given cryptocurrencies like Bitcoin are open source, this information is freely available to anyone with an internet connection and some data analysis skills. 

The raw supply distribution for Bitcoin doesn't look particularly healthy, with less than 1% of addresses owning 86% of coins, which would suggest that it is vulnerable to the actions of a small number of controlling addresses.
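
As a rough illustration, here is how that kind of concentration figure might be calculated from a list of address balances. The balances below are invented; real analysis would pull them from a node or block explorer, and with such a tiny sample the "top 1%" collapses to the single largest address.

# Sketch: share of supply held by the top 1% of addresses (invented balances).
balances = [12_000.0, 5_400.0, 800.0, 150.0, 3.2, 0.5, 0.02, 0.001]   # BTC per address (assumed)

def top_share(balances, fraction=0.01):
    """Fraction of total supply held by the top `fraction` of addresses."""
    ranked = sorted(balances, reverse=True)
    top_n = max(1, int(len(ranked) * fraction))
    return sum(ranked[:top_n]) / sum(ranked)

print(f"Top 1% of addresses hold {top_share(balances):.0%} of the sampled supply")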

But this picture is somewhat misleading, as an individual will have numerous addresses, while one address might belong to an entity - like an exchange - which holds custody of Bitcoin on behalf of potentially millions of users.

Analysis by blockchain analytics provider, Glassnode, suggests that concentration is nowhere near as dramatic, and that the relative amount of bitcoin held by smaller entities has been consistently growing over time.

So though the Bitcoin blockchain is transparent, address ownership is pseudonymous, which means we can infer certain information about the concentration of crypto ownership, and use this to provide insight into value, but never really know the true supply distribution at a granular level.

This has spawned an entirely new field of analysis called on-chain analytics - the closest thing to blockchain economics - which uses patterns in address behaviour to infer future price movement.

Read more: What is Bitcoin Halving | How it Works and Why Does it Matter?

Lost or Burned Coins 

Another factor that further muddies the waters around supply distribution is the number of coins that can never be spent because their Private Keys are lost, or they have been sent to a burn address. 

Though there are some well-publicised cases where significant amounts of bitcoin have been lost, it is impossible to put an exact figure on the total amount of lost coins for any cryptocurrency.

Dormancy - a measure of how long addresses have been inactive - is the main hint that on-chain analysts use to calculate how many coins are genuinely lost. Studies estimate that around 3 million bitcoin are irretrievable, which equates to over 14% of the Maximum Supply.

This is an important consideration as price is a function of demand and supply. If the supply of available coins is actually smaller than thought, but demand is unchanged, existing coins become more valuable. This is another reason why Marketcap can be misleading, because it cannot account for lost or burned coins.
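
A back-of-the-envelope sketch of that adjustment, using the roughly 3 million BTC estimate above and an assumed price, shows how much the headline figure can overstate the spendable supply.

# Adjusting supply for coins estimated to be lost. The price is an assumed
# placeholder; the 3 million figure is the study estimate cited above.
circulating_supply = 19_000_000
estimated_lost     = 3_000_000
price              = 40_000          # assumed USD price

reported_cap  = circulating_supply * price
effective_cap = (circulating_supply - estimated_lost) * price

print(f"Reported market cap:   ${reported_cap:,.0f}")
print(f"Effective market cap:  ${effective_cap:,.0f} (excluding lost coins)")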

Intentionally burning Bitcoin - by sending it to an address that is known to be irretrievable - is, for obvious reasons, very rare. Burning coins is, however, an important concept for inflationary coins, as a way to counteract supply growth and its negative impact on price.

Unfortunately, burning often happens as a one-off manual action, announced without warning, because it is associated with price increases. Burning can, however, be used programmatically to reduce supply inflation in uncapped cryptocurrencies, as we'll see with Ethereum below.

As if measuring supply distribution wasn't hard enough, there is another crucial consideration impacting value that raw data doesn't account for: how coins are shared out before a project is even launched.

If we compare how crypto's two dominant currencies - Bitcoin and Ether - were distributed at launch, we can understand why that is so important.

Bitcoin's Sacred Launch

Bitcoin was the first cryptocurrency, described in a 2008 whitepaper and launched in January 2009. We don't know who created it; we just have a pseudonym, Satoshi Nakamoto, who disappeared soon after it was up and running. Their last public communication was in December 2010.

The creation of Bitcoin is sometimes called a Sacred Launch, because the manner in which it started is exactly how it runs now. No deals were cut, no venture capitalists were involved, no shareholders existed. There was no initial distribution to vested parties.

Given what we now know about the relationship of supply distribution to value, Bitcoin's Sacred Launch is a significant part of its appeal. But though Satoshi didn't award themselves a huge stack of coins for creating Bitcoin, they had to play the role of sole Miner until others were convinced to join, and therefore earned the 50 BTC reward every 10 minutes for a considerable time.

Much is made of what are described as Satoshi's coins - the vast amount of bitcoin earned while they were the only one mining it in the months after launch.

The addresses that hold these coins amount to around 1.1 million BTC, none of which has ever moved; they sit among the handful of addresses holding between 100,000 and 1 million bitcoin.

Even rumours that Satoshi's coins have moved can have a huge impact on price, showing that tokenomics is not just a matter of numbers, but includes elements of behavioural analysis, inference and game theory.

Though a significant amount of bitcoin is certainly in a few hands, its Sacred Launch and permissionless nature are regarded as features, rather than bugs.

Most of the cryptocurrencies that followed, however, took a different approach to their launch and how supply was initially distributed.

Ethereum & the concept of Premine

It turns out that the initial approach taken by Satoshi was the exception, rather than the rule, largely because the majority of cryptocurrencies that followed were created by a known team, and supported by early investors, both of whom were rewarded with coins before the network was up and running. 

One of the reasons why skeptics think crypto has no value is because of the idea that, given its virtual nature, it can just be created out of thin air. In many cases that is actually what happens with the initial distribution of a new coin, aka a Premine.

The best-known example of a Premine came with the launch of Ethereum. Rather than a Sacred Launch, Ethereum's founders decided on an initial distribution of Ether - the native token - that included the original team, developers and the community, with a portion set aside for early investors through what was known as an Initial Coin Offering (ICO), held in 2014.

The Premine was essentially crypto's way of using a traditional form of equity distribution to reward entrepreneurs with a stake in their creation. It can, however, put a significant proportion of the overall supply in very few hands, and - depending on what restrictions are placed on selling - can tell you a lot about whether the founders are focused on creating long-term value or short-term personal gain.

The ICO took a completely new approach to investing in a tech start-up, attempting to give everyone an equal chance to invest by setting aside a fixed amount on a first-come-first-served basis, which - in the case of Ethereum's launch - simply required an investor to send bitcoin to a specific address.

This was intended to counter the privileged access that venture capital has to early investment in emerging companies. That was the theory; things didn't quite work out that way in practice.

Unfortunately, Premines and ICOs quickly got out of control, and the idea of democratising early-stage investment soon evaporated. Initial allocations incentivised hype and over-promising, while ICOs were driven by FOMO and greed:

  • If you had enough ETH you could game the system by paying ridiculous fees & frontrunning
  • In many cases ICOs were staggered, with privileged access to early investors or brokers

Premines and visible founders are two of the biggest arguments used by Bitcoin Maximalists who feel that only Bitcoin provides genuine decentralisation because it has no single controlling figure, and has a vast network of Nodes that all have to agree on potential rule changes.

This is why tokenomics must include some measure of decentralisation: even if a cryptocurrency has a maximum supply, its founders may be capable of rewriting the rules in their favour, or simply disappearing in a so-called rug pull.

Address distribution should be a consideration when trying to understand what value a cryptocurrency has. The more diverse ownership is, the lower the chance that price can be impacted by one holder or a small group of holders.

Node Distribution

Just as concentration of supply in a few hands is unhealthy, so is concentration of the network itself: if there are only a small number of miners/validators, the threshold to force a change to the supply schedule is relatively low, which could also devastate value.

In the same way, the distribution of those that run the network - the Nodes and Validators - has a crucial influence. Nodes enforce the rules that govern how a cryptocurrency works, including the supply schedule and consensus method already mentioned.

If there is only a small number of Nodes, they can collude to enforce a different version of those rules or to gain a majority agreement on a different version of the blockchain record the network holds (aka a 51% attack).

Either scenario means there is no certainty that the tokenomics can be relied upon, which negatively impacts potential value.


3. Tokenomics & Incentives

Another important aspect of tokenomics is the set of incentives users have to play some role in a cryptocurrency's function. The most explicit reward is the one provided for processing new blocks of transactions, which differs depending on the consensus method used; the two main methods have already been introduced:

Mining (PoW) - Being rewarded for processing transactions by running mining algorithms for Proof of Work blockchains, like Bitcoin

Validating/Staking (PoS) - Being rewarded for validating transactions by staking funds in Proof of Stake blockchains.

Blockchains are self-organising. They don't recruit or contract Miners or Validators; participants simply join the network because of the economic incentive for providing a service. The byproduct of more Nodes is an increase in the resilience and independence of the network.

Being directly involved as a Miner or Validator requires technical knowledge and up-front costs, such as specialist equipment - which in the case of Bitcoin means industrial-scale operations beyond the budget of solo miners - and, in the case of Ethereum, a minimum stake of 32 ETH.

But as the crypto ecosystem has become more sophisticated, opportunities to passively generate income by indirectly staking and mining have grown dramatically.

Users can simply stake funds for PoS chains with a few clicks within a supported wallet and generate a passive income, or add their Bitcoin to a Mining Pool to generate a share of the aggregate mining rewards.
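
As a rough sketch of what that passive income looks like, here is a simple compounding calculation. The 5% rate and daily compounding are assumptions made for illustration; real staking yields vary by chain, validator performance and fees.

# Compounding a staking reward over one year (all figures assumed).
stake   = 32.0      # ETH - the minimum solo-validator stake mentioned above
apr     = 0.05      # assumed annual reward rate
periods = 365       # assumed daily compounding

final = stake * (1 + apr / periods) ** periods
print(f"Stake after one year: {final:.4f} ETH (+{final - stake:.4f} ETH)")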

Ethereum will experience a significant change in its tokenomics in 2022, moving from a Proof of Work consensus mechanism to Proof of Stake. ETH holders have been able to stake since December 2020, when the Ethereum 2.0 Beacon Chain launched.

Total Value Locked (TVL) provides a measure of how much Ethereum has been staked, while figures are also available for how much ETH is now being burned, and the impact on overall supply. 

Both these metrics are being interpreted positively by supporters of Ethereum, but its detractors simply say that the ability to make wholesale changes to its governing principles illustrates weakness, not strength.

How successful chains are at attracting this financial backing has a significant impact on price, especially where funds are locked for a given period as part of the commitment, as this provides price stability.

The impact of fees

Whatever consensus method a cryptocurrency uses, it can only grow if there is demand for transactions from users, which will be influenced by:

  • the cost of making a transaction, how it is calculated & who earns it
  • how fast a transaction is processed

Fees and Miner/Validator revenue are two sides of the same coin, providing a barometer of blockchain usage and health. Low fees can incentivise usage; while an active and growing user base attracts more Miners/Validators, keen to earn fees. This creates network effects, generating value for all participants in a win-win situation.

Fees are especially important where they pay for computational power, rather than just the processing of transactions. This type of blockchain emerged in the years following Bitcoin's launch, starting with Ethereum, known as the world's computer. It processes the majority of transactions related to the growth areas of DEFI and NFTs, but has become a victim of its own success, with its fees - measured in something called GAS - pricing out all but the wealthiest users.

Addressing that challenge is one of the key objectives of the Ethereum roadmap, starting with EIP 1559, part of the London Upgrade, which went live in August 2021.

Not only did the fee estimation process completely change, with the aim of making fees more predictable, the changes to Ethereum's fee structure are also having a significant impact on its tokenomics. Instead of all transaction fees going to Ethereum Miners, a mechanism was introduced to burn a portion of fees, pushing Ether's supply growth from inflationary (it has no maximum supply cap) towards disinflationary.
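
The effect on supply can be sketched as a simple net-issuance calculation; both daily figures below are invented placeholders, not measured data.

# Post-EIP-1559 idea: net issuance = new coins issued minus fees burned.
issued_per_day = 13_000.0     # ETH paid out in block rewards per day (assumed)
burned_per_day = 6_500.0      # ETH destroyed via the base-fee burn per day (assumed)

net_issuance = issued_per_day - burned_per_day
print(f"Net daily issuance: {net_issuance:+,.0f} ETH")
print("Shrinking supply" if net_issuance < 0 else "Growing supply, but more slowly than before the burn")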

Consensus methods and fee structures can therefore provide important incentives for participation in a blockchain ecosystem, and even directly impact supply, so they should be considered as part of tokenomics. There are also a number of other incentives and influences that complete the tokenomics picture.

IEOs, IDOs & Bonding Curves

ICOs failed because they fuelled bad behaviour from both entrepreneurs, with exit scams and untested ideas, and from investors, encouraging short term speculation, rather than actual usage.

What has emerged are more innovative ways to incentivise ownership and usage of tokens - as intended - that learn from these mistakes.

One approach to launching is to negotiate directly with centralised exchanges to ensure they are listed and tap into the existing base of users - known as an IEO - Initial Exchange Offering. This can have a significant impact on ownership distribution and price, as illustrated by the well publicised price boost that coins listed on Coinbase experience. But this is a long way from Bitcoin's organic, decentralised debut.

IEOs put all the power in the hands of the large exchanges, who pick and choose the coins they anticipate demand for. But given crypto is about removing the middleman, one of the most interesting developments in coin launches is the IDO - Initial Decentralised Exchange Offering.

An IDO is a programmatic way of listing a new token on a Decentralised Exchange (DEX) using Ethereum Smart Contracts and mathematics to shape incentives for buying and selling through something called a Bonding Curve.

Bonding Curves create a deterministic price discovery mechanism based on the supply of, and demand for, a new token, relative to the price of Ethereum. Their complexity warrants a separate article, but it is enough to know that the shape of a bonding curve is relevant to the tokenomics of new ERC20 coins launched on DEXs or DEFI platforms, because it can incentivise the timing of investment.
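
To give a feel for the idea without the full mathematics, here is a toy linear bonding curve: the price of the next token is a fixed multiple of how many tokens have already been sold, so earlier buyers pay less. The slope is an arbitrary assumption, and real IDO curves use more sophisticated shapes.

# Toy linear bonding curve: price rises deterministically with tokens sold.
SLOPE = 0.0000001    # assumed price increase (in ETH) per token already sold

def next_price(tokens_sold: int) -> float:
    return SLOPE * tokens_sold

for sold in (0, 100_000, 1_000_000, 10_000_000):
    print(f"After {sold:>10,} tokens sold, the next token costs {next_price(sold):.4f} ETH")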

While bonding curves are a mathematically complex way to incentivise investment in new cryptocurrencies, there are more obvious and cruder approaches, particularly within DEFI, where the focus is on providing interest on tokens as a way to encourage early investment.

APYs & Ponzinomics

DEFI has exploded over the last 18 months, with over $90bn in TVL according to Defi Pulse, but this has also fuelled a mania around APY (annual percentage yield).

Many tokens have no real use case other than incentivising users to buy and stake/lock-up the coin in order to generate early liquidity. This doesn’t reward positive behaviour, but simply creates a race to the bottom, with users chasing ludicrous returns then dumping coins before the interest rates inevitably crash. This approach has been nicknamed Ponzinomics as the ongoing function of the token is unsustainable.

Airdrops

There is another way to reward holders based on how much they have actually used a token as it was intended - Airdrops. DEFI projects like Uniswap and 1Inch are good examples, while OpenSea did the same for those most active in minting and trading NFTs.

Airdrops are financed from the initial treasury but unfortunately aren’t built into road maps, as telegraphing them would be self-defeating.

Many savvy investors use new DEFI, NFT or Metaverse platforms simply in the hope, or expectation, of an Airdrop. That makes them relevant to tokenomics, as they can drastically alter the supply distribution of a token, but given the secrecy that surrounds them, they can only be factored in retrospectively.

DAOs & Governance

We’ve already discussed how the concentration of ownership and the network impacts perceived value, given the concern that control rests in a few hands. Even where there is a healthy distribution of holding addresses, they are largely passive, and have no specific influence on how the cryptocurrency functions.

There is a growing move towards crypto projects that are actively run by their communities through DAOs (Decentralised Autonomous Organisations). 

DAOs give holders of the native token the right to actively participate in its governance. Token holders can submit proposals and vote, in proportion to their holdings, on which proposals are accepted. DAOs therefore have a crucial influence on tokenomics, because the community can decide to tweak or even rip up the rules.
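
In code, the core of that governance mechanism is just a token-weighted tally. The addresses and balances below are invented for illustration; real DAOs implement this in smart contracts, with snapshots, quorums and delegation on top.

# Token-weighted DAO voting: each vote counts in proportion to the voter's balance.
votes = {
    "0xAlice": (1_000, "yes"),   # (token balance, vote) - hypothetical
    "0xBob":   (  250, "no"),
    "0xCarol": (4_000, "yes"),
}

tally = {"yes": 0.0, "no": 0.0}
for balance, choice in votes.values():
    tally[choice] += balance

total = sum(tally.values())
print({option: f"{weight / total:.0%}" for option, weight in tally.items()})
# => {'yes': '95%', 'no': '5%'}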

DAOs are essentially attempts to create a new digital democracy via crypto, and still have a lot of hurdles to overcome, as rational rules have to be written by irrational humans.

Tokenomics & Rational Decision Making

Sensible tokenomics doesn’t guarantee a project will succeed, nor does a blatantly vague token model doom a coin to failure.

For every project that makes huge efforts toward transparent supply schedules, good governance and healthy incentives for using the network, there are hundreds, if not thousands, with fuzzy or non-existent distribution logic, because sensible tokenomics isn't their aim; they simply want to meme, or hustle, their way to a higher market capitalisation.

Coins like Dogecoin or Shiba Inu have crazy supply schedules yet can still generate a huge market cap - bigger than global publicly traded brands - because investors are irrational. 

So studying tokenomics on its own doesn't mean you can find cryptocurrencies that will succeed and increase in price; you also have to understand how other people are making decisions, many of whom have no interest in tokenomics, or don't even know what it means.

What tokenomics does give you is a framework to understand how a coin is intended to work, which can form part of an investing decision.

Here’s a summary of what the main metrics can tell you:

  • Maximum supply - Positive indicator for an effective store of value; if there is no supply cap, there will be ongoing inflation, which may dilute the value of all existing coins.
  • Network/Nodes - The more diverse the better. Will make arbitrary decisions less likely, producing stability.
  • Supply Distribution - The more evenly distributed the better, as there is less chance that one person can have a disproportionate impact on price by selling their coins
  • Fee Revenue - Shows you how much people are actively using it; a proxy for cashflow
  • TVL Locked - Shows that users are willing to put their money where their mouth is, and lock in their investment for a share of rewards
  • Governance, Airdrops, Incentives & launch strategies can all influence supply distribution so should be considered as part of tokenomics.

The principles, philosophies and models behind tokens, coins and the projects that underpin them are at the very beginning of an experiment in what works, and what doesn't.

There are plenty of models that won't work, and we expect those projects to fade away. The ones that do work will go on to inspire and guide the new projects still to come.

Thanks for visiting and reading this article! Please don't forget to leave a like, comment and share!

#blockchain #tokenization #cryptocurrency 



