What is Gigco (GIG) | What is Gigco token | What is GIG token

In this article, we'll discuss the Gigco project and the GIG token.

GIGCO is a brand-new live music booking platform that incorporates Solana blockchain technology to provide instantly deployable smart contracts and the network’s tokenized payment system.

For too many years, the act of coordinating live music events has followed the same old rutted pathways. The industry is stymied by the lack of an easily accessible database of venues and artists, and a small number of booking agents have monopolized bookings.

GIGCO seeks to provide an all-in-one tokenized mobile application that delivers a seamless experience for artists, venues, and music fans alike.

What is GIGCO?

The GIGCO ethos empowers artists and venues to connect directly. Gigco does this by:

● Establishing a new, trustworthy booking protocol

● Launching a healthy utility token

● Integrating fanbase input.

Music fans worldwide want to feel more connected to their favorite artists and venues. GIGCO will enable this connection, allowing fans to share in the success of their favorite artists through a combination of NFTs and cleverly designed staking programs.

The GIGCO app is designed to solve inherent problems with the entire process of live music bookings, ticket sales, music management, and a dozen more music-related functions through the implementation of smart contracts. By introducing a tokenized payment system, GIGCO ensures that the transfer of value within the app is fast and seamless. In addition, structuring the token distribution to reward early adopters of the platform aligns directly with the ethos of the GIGCO model.

GIGCO is a U.K. limited company headquartered in Newcastle upon Tyne, with its main development office in Laos P.D.R. Our team is a collection of musicians and programmers who share a vision of more adaptable, fluid systems for all aspects of the live music industry.

Issues with the Current Systems

The issues limiting income generation for music producers and live performers stem from a small minority of agents, intermediaries, and distribution platforms that have tightened their stranglehold on the industry over recent decades. This grip, held by such a small number of people and corporations, has left millions of exceptionally talented musicians worldwide unable to earn enough income from their life's passion.

As an example of this vice-like grip: to earn the UK minimum yearly wage of £18,137 on Spotify, you would need fans to stream your music 6,477,120 times, assuming you own 100% of the rights. Many who sign to major labels receive only 20% as the artist, plus a further 8-15% if they wrote 100% of the material themselves, so at best 35%; the reality is therefore around 20 million streams per year to earn minimum wage. Platforms like Amazon and Apple offer artists slightly better rates per stream, but others, like YouTube, pay even less.

You would need almost 50 million views per year to earn a minimum wage through YouTube. 
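
To put these figures in context, here is a rough worked calculation. It is only a sketch: the per-stream payout is derived from the article's own numbers (£18,137 ÷ 6,477,120 streams ≈ £0.0028) rather than any official Spotify rate, and the royalty shares are the 100% and 35% cases quoted above.

# Streams needed to earn the UK minimum yearly wage at a given royalty share.
# The per-stream rate is an assumption derived from the figures quoted above.
minimum_wage_gbp = 18_137
payout_per_stream = minimum_wage_gbp / 6_477_120   # ≈ £0.0028 per stream

def streams_needed(target_income_gbp, royalty_share):
    """Streams required to reach a target income at a given royalty share."""
    return round(target_income_gbp / (payout_per_stream * royalty_share))

print(streams_needed(minimum_wage_gbp, 1.00))   # ≈ 6.5 million streams (own 100% of rights)
print(streams_needed(minimum_wage_gbp, 0.35))   # ≈ 18.5 million streams ("around 20 million")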

The Vision

The GIGCO vision is a decentralized booking protocol where artists and venues can operate independently of agents or intermediaries, providing working opportunities for millions of talented musicians no matter how big or small the venue they play. The rise of blockchain technology since the creation of Bitcoin, the innovation of smart contract development, and the move from Proof-of-Work to Proof-of-Stake consensus mechanisms have brought considerable increases in network speed and capability over the last decade, making the GIGCO vision possible.

The GIGCO app will leverage Solana technology to allow instantly deployable smart contracts for arranging gigs, selling event tickets, purchasing merchandise, facilitating platform staking protocols, and further into the future, providing equipment hire, production/stage crews, and much more.

The token native to GIGCO, GIG, has already been minted on the Solana blockchain. User wallets, created with Solana technology, will allow for thousands of transactions per second with maximum security. An incentivized token economy model will be built, focusing on rewarding active community members who engage in programs to drive initial growth, user adoption, and general brand awareness. The GIG token and NFTs will be used to promote ecosystem growth and to allow fans to support their favorite artists or venues, sharing in their success both emotionally and financially.

NFTs can be minted on Solana, allowing GIGCO to provide a marketplace for artists who wish to sell their work. By creating music NFTs, GIGCO will offer NFS ‘Non-Fungible Song’ functionality, allowing streaming of music on a pay-per-play basis. In addition to this, GIGCO will mint unique NFT Ticket Stubs & Promotional Posters.

GIGCO will ensure secure data governance while supporting procedures that progress further decentralization over time.


The GIGCO Ecosystem

The GIGCO app is the core of the ecosystem. The app lists venues, artists, and gigs. A venue can source artists, organize gigs, and sell tickets through the app. An artist can organize gigs as well; they can also sell merchandise and music NFTs through their profile. Everyone can use GIG to pay for goods or services, including tickets to live shows or booking an artist on the platform. Settling a transaction with GIG will always result in the best discounts and the lowest (micro) transaction fees. The ecosystem offers complete payment flexibility: it will be possible to pay using fiat currencies (through Stripe online payment integration) and other crypto payment methods, but to encourage continued use of GIG, settling a transaction solely with GIG will always be the cheapest option.

Gig Finder: Arrange live music gigs in seconds using specific search filters to narrow down potential artists or venues, select the desired profile, and commit to the GIG (smart) contract. This process confirms that both parties are insured and covered against possible no-shows or defaulters. Professional or beginner, everyone can find the right fit and cultivate their career with GIGCO.

Insta-Gig: As a venue owner, gigs (live shows) can be arranged instantly using our swipe function to seamlessly browse the artist video database and select who you like. Then, a smart contract is created, and the gig is arranged.

Ticket Sales: Event tickets are created automatically via smart contracts within the app. When a gig is arranged, the venue owner chooses from various ticketing options in a drop-down menu. Options could include VIP, Premium, Standard, Basic, Seated, or Standing, with different payment methods available (GIG, other crypto, or fiat).
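
To make the booking-and-ticketing flow more concrete, here is a minimal, purely illustrative Python sketch of the kind of record a gig contract might carry once a venue selects its ticketing options. The field names, tiers, and structure are assumptions based on the drop-down options described above, not GIGCO's actual on-chain schema.

from dataclasses import dataclass, field

# Tier and payment labels taken from the options listed above (assumed names).
TICKET_TIERS = ("VIP", "Premium", "Standard", "Basic")
PAYMENT_METHODS = ("GIG", "Crypto", "Fiat")

@dataclass
class TicketOption:
    tier: str             # one of TICKET_TIERS
    price_gig: float      # price denominated in GIG
    seated: bool = False  # Seated vs. Standing

@dataclass
class Gig:
    artist: str
    venue: str
    date: str
    tickets: list = field(default_factory=list)

# Example: a venue arranging a gig and attaching two ticket options.
gig = Gig(artist="Example Artist", venue="Example Venue", date="2022-07-01")
gig.tickets.append(TicketOption(tier="VIP", price_gig=50.0, seated=True))
gig.tickets.append(TicketOption(tier="Standard", price_gig=20.0))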

What is GIG token?

GIG is the identification ticker for the token native to the GIGCO platform. Like all tokens, GIG is a digital voucher that can be exchanged for goods or services. It can also be given as a gift or as part of a promotional offer. Through intuitively designed reward-based listing and referral programs, GIG will feature heavily during the onboarding of users to the platform.

One hundred and ninety million GIG tokens have been minted on the Solana blockchain as it offers fast, secure, and highly scalable transactions of value across the platform. The creation of the GIG token allows users to transfer value throughout the network much more fluidly than with fiat transfers and with lower fees.

GIG Tokenomics

At the pre-launch stage, GIG tokens will be available as a reward, earned by listing venues or referring artists. Once the IDO is launched, GIG tokens will be available from DEXs (decentralized exchanges).

A total of 190 million GIG tokens have been created on the Solana blockchain; 4.75 million tokens will become the initial liquid supply after the IDO. The remaining tokens will be unlocked over 36 months as per the vesting schedule until the circulating supply matches the total supply.

The token supply will be allocated to various sections of the business as outlined below:

Ecosystem and Community sectors combined will receive 50% of the total supply, providing the project with resources required to drive network growth. ICO and early investment round sales account for 19% of the supply, with a further 15% reserved for current and future advisory and development teams. The remaining 16% of the supply is split between the reserve and liquidity pools as insurance against any uncertainty in cash flow development.
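
As a quick sanity check, the percentages above can be converted into token counts from the 190 million total supply. This is a simple illustration using only the group-level figures quoted in this section; the exact split inside each group is not specified.

# Convert the quoted allocation percentages into GIG token counts.
TOTAL_SUPPLY = 190_000_000   # total GIG minted (from this article)

allocation_pct = {
    "Ecosystem and Community": 50,
    "ICO and early investment rounds": 19,
    "Advisory and development teams": 15,
    "Reserve and liquidity pools": 16,
}

assert sum(allocation_pct.values()) == 100   # the groups cover the whole supply
for name, pct in allocation_pct.items():
    print(f"{name}: {TOTAL_SUPPLY * pct // 100:,} GIG")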

The remaining GIG tokens will be unlocked and distributed over a vesting period of 36 months. The unlocking schedule varies by sector, as listed below; a simple sketch of the cliff-and-vest mechanics follows the list.

  • Seed: 0% unlocked at listing. 6–12 months cliff. Daily vesting over 6 months. 
  • Private: 0% unlocked at listing. 3–9 months cliff. Daily vesting over 6 months. 
  • Public: 50% unlocked at listing. 
  • Team: 6 months cliff. Daily vesting over 24 months. 
  • Advisor: 6 months cliff. Daily vesting over 24 months. 
  • Liquidity: Monthly vesting for 36 months. 
  • Ecosystem: Monthly vesting for 36 months. 
  • Community: Monthly vesting for 36 months. 
  • Reserve: Daily vesting for 12 months.
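
The cliff-and-vest mechanics above can be illustrated with a small sketch. The allocation size below is a hypothetical figure (the article does not give per-sector token counts); the cliff and vesting lengths use the Team entry from the schedule (6 months cliff, daily vesting over 24 months).

# Tokens unlocked by a given day under a cliff + linear (daily) vesting schedule.
def vested(total_allocation, day, cliff_days, vesting_days, unlocked_at_listing=0.0):
    unlocked = unlocked_at_listing * total_allocation
    if day <= cliff_days:
        return unlocked
    days_vesting = min(day - cliff_days, vesting_days)
    return unlocked + (total_allocation - unlocked) * days_vesting / vesting_days

team_allocation = 10_000_000   # hypothetical allocation, not from the article
print(vested(team_allocation, day=180, cliff_days=180, vesting_days=730))   # 0 at the cliff
print(vested(team_allocation, day=545, cliff_days=180, vesting_days=730))   # half vested
print(vested(team_allocation, day=910, cliff_days=180, vesting_days=730))   # fully vested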

Buyback Policy

With so many exciting and scalable use cases for GIG, tokens acquired through the GIGCO buyback policy will be reintroduced into the economy through community rewards and giveaways. Currently, there is no plan to hold any token-burning events.

The value of GIG will be managed through the frequency of release of tokens from the liquidity reserve. GIGCO reserves the right to review and alter this policy once the ecosystem has gone past the launch/start-up phase.

GIGCO will look at multiple ways to own as many GIG tokens as possible over time. The product is strong, with much room for developing new features and expanding into alternative marketplaces; therefore, the GIG token will increase in scarcity and value.

  • GIGCO Original NFT Profits: A series of original collectibles will be released over time; the profits from sales will be used to buy back GIG tokens. The bought-back GIG tokens will be reintroduced to the community and liquidity reserves, generating more volume and trust. 
  • GIGCO Monthly Profits: A total of 20% of GIGCO's monthly net profit will be used to buy back GIG tokens from the market, to be added to the community and liquidity reserves and to fund other future community engagement. 
  • GIGCO Quarterly Buybacks: Larger, targeted quarterly buybacks will also make more tokens available to the community and the liquidity reserves. 
  • GIGCO Purchase Swap: A swapping process occurs whenever a transaction is made on the platform and the purchaser does not already hold the GIG required to settle the balance. The balance of a transaction will always be displayed in GIG, with other settlement options available. If, for example, the purchaser chooses to settle in USD, the checkout balance is displayed in USD and the transaction is settled; at that moment, an API call is made to all exchanges where GIG is available. When the best price for GIG is located, a swap for the exact purchase amount is executed and the USD is exchanged for GIG. This process helps drive exchange volume while ensuring that GIGCO always holds as much GIG as possible. A simple sketch of this flow follows below.
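
Below is a minimal sketch of the purchase-swap flow described in the last item, assuming a simple price lookup across exchanges. The exchange names, prices, and lookup function are placeholders for illustration, not real GIGCO or exchange APIs.

# Sketch of the purchase swap: settle in USD, then swap that USD for GIG at the best price.
def best_gig_price(exchange_prices):
    """Return the (exchange, price) pair offering the lowest USD price per GIG."""
    return min(exchange_prices.items(), key=lambda kv: kv[1])

def settle_in_usd(purchase_usd, exchange_prices):
    """Settle a USD purchase, then swap the exact amount into GIG on the cheapest exchange."""
    exchange, price = best_gig_price(exchange_prices)
    gig_acquired = purchase_usd / price
    print(f"Swapped ${purchase_usd:.2f} for {gig_acquired:.2f} GIG on {exchange}")
    return gig_acquired

# Hypothetical prices (USD per GIG) on exchanges where GIG is listed.
prices = {"exchange_a": 0.105, "exchange_b": 0.101, "exchange_c": 0.108}
settle_in_usd(25.0, prices)   # buys about 247.5 GIG via exchange_b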

Transaction Fees

GIGCO will introduce transaction fees but is still finalizing the policy. The initial transaction fee policy will match Solana’s fee structure until GIGCO has determined details to support the buyback plan. GIGCO’s pricing policy is to maintain an industry-low fee structure to keep profits high for venues & artists and ticket costs low for music fans.

GIGCO will be paying Solana’s fees when distributing rewards. The user will only have to provide a valid SPL wallet address.

How and Where to Buy GIG token?

You will first have to buy one of the major cryptocurrencies, usually Bitcoin (BTC), Ethereum (ETH), Tether (USDT), or Binance Coin (BNB)…

We will use Binance Exchange here as it is one of the largest crypto exchanges that accept fiat deposits.

Once you have finished the KYC process, you will be asked to add a payment method. Here you can either provide a credit/debit card or use a bank transfer to buy one of the major cryptocurrencies listed above.

☞ SIGN UP ON BINANCE

Step by Step Guide : What is Binance | How to Create an account on Binance (Updated 2022)

Once finished, you will need to deposit your BTC/ETH/USDT/BNB from Binance to the exchange that lists GIG, depending on the available market pairs. After the deposit is confirmed, you can purchase GIG on that exchange.

GIG is not yet listed on a major exchange; the IDO will go live on the 24th of June, 2022, on SolRazr!

🔺DISCLAIMER: The information in this post isn't financial advice and is intended FOR GENERAL INFORMATION PURPOSES ONLY. Trading cryptocurrency is VERY risky. Make sure you understand these risks and that you are responsible for what you do with your money.

🔥 If you’re a beginner. I believe the article below will be useful to you ☞ What You Should Know Before Investing in Cryptocurrency - For Beginner

Find more information about the GIG token ☞ Website

I hope this post will help you. Don't forget to leave a like, comment, and share it with others. Thank you!

#bitcoin #cryptocurrency 



