Vern Greenholt

The Problems of Generative Adversarial Networks (GANs)

In the last post, I discussed GANs: their structure, the steps of training, and the loss function. In this post, I will discuss the limitations and problems of training a GAN.

1. Introduction

Many different problems can prevent successful GAN training, partly because GANs have such extremely diverse applications. Accordingly, improving GAN training remains an open research field.

Before discussing the problems, let's have a quick look at some of the GAN equations.

[Figure: the architecture of the GAN with the terms of the loss function and their gradients]

The figure above shows the terms of the final loss function used to train the discriminator and the generator, together with the corresponding gradients. The following are some of the basic problems that make GANs hard to train.
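
Since the figure is not reproduced here, a standard way to write the two losses it refers to (the original minimax formulation; a reconstruction, not necessarily the figure's exact notation) is:

$$L_D = -\,\mathbb{E}_{x \sim p_{\text{data}}}\big[\log D(x)\big] - \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]$$

$$L_G = \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]$$

The discriminator is updated by descending $\nabla_{\theta_d} L_D$ and the generator by descending $\nabla_{\theta_g} L_G$.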

2. Vanishing gradients

The first problem in GAN training that should be taken seriously is vanishing gradients. Before diving into the vanishing-gradient problem, the KL-divergence and the JS-divergence should be explained.

A divergence between two probability distributions can be defined as a measure of the distance between them. If we minimize the divergence, we drive the two distributions toward being equal.
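
For reference, the standard definitions for two distributions $P$ and $Q$ are:

$$D_{KL}(P \,\|\, Q) = \mathbb{E}_{x \sim P}\left[\log \frac{P(x)}{Q(x)}\right]$$

$$D_{JS}(P \,\|\, Q) = \frac{1}{2} D_{KL}\!\left(P \,\Big\|\, \frac{P + Q}{2}\right) + \frac{1}{2} D_{KL}\!\left(Q \,\Big\|\, \frac{P + Q}{2}\right)$$

Note that the KL-divergence is asymmetric, while the JS-divergence is symmetric and bounded above by $\log 2$.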

2.1 KL-Divergence

In many generative models, the goal is to find the model parameters that best fit the training data via maximum likelihood estimation (MLE). The maximum likelihood equation is shown below.
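
The original equation image is not reproduced here; the standard MLE objective it refers to, for training samples $x^{(1)}, \dots, x^{(m)}$, is:

$$\theta^{*} = \arg\max_{\theta} \sum_{i=1}^{m} \log p_{\theta}\big(x^{(i)}\big)$$

As the number of samples grows, maximizing this objective becomes equivalent to minimizing $D_{KL}(p_{\text{data}} \,\|\, p_{\theta})$, which is what connects MLE to the divergences above.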

#kl-and-js-divergence #mode-collapse #nash-equilibrium #problem-of-gan #vanishing-gradient #neural networks


Paper Reading on Generative Adversarial Nets

Generative Adversarial Nets

The main idea is to develop a generative model via an adversarial process; we will discuss what an adversarial process is later. A GAN consists of two models: a generative model G and a discriminative model D. The purpose of the generative model is to generate data as close as possible to the real data given some input. The purpose of the discriminative model is to classify between two classes, 0 and 1: 0 means the sample came from the generator's output, and 1 means the sample is a true sample from the original data.

This architecture corresponds to a two-player minimax game: each network tries to defeat the other. Such networks are called adversarial networks. In the process of this conflict, both of them learn to become better and stronger. When the discriminator outputs a value of ½ (0.5), it implies that the discriminator is not able to distinguish whether the value came from the generator's output or from an original sample.

Here, G and D are defined as multilayer perceptrons so that the entire system can be trained with backpropagation. The discriminator and the generator are trained separately, in alternating steps.
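
Concretely, Algorithm 1 in the paper alternates $k$ gradient steps on the discriminator with one on the generator. For a minibatch of $m$ data samples $x^{(i)}$ and noise samples $z^{(i)}$:

$$\text{update } D: \quad \nabla_{\theta_d} \frac{1}{m} \sum_{i=1}^{m} \Big[\log D\big(x^{(i)}\big) + \log\Big(1 - D\big(G(z^{(i)})\big)\Big)\Big] \quad (\text{ascent})$$

$$\text{update } G: \quad \nabla_{\theta_g} \frac{1}{m} \sum_{i=1}^{m} \log\Big(1 - D\big(G(z^{(i)})\big)\Big) \quad (\text{descent})$$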

According to the paper, the generative model can be thought of as analogous to a team of counterfeiters trying to produce fake currency and use it without getting caught.

The discriminative model, meanwhile, can be thought of as analogous to the police, who are trying to detect the fake currency. Both teams keep improving their methods until the counterfeits are indistinguishable from the genuine currency.

Adversarial Networks

Straight from the paper,

To learn the generator's distribution Pg over data x, we define a prior on input noise variables Pz(z), then represent a mapping to data space as G(z; θg),

where G is a differentiable function represented by a multilayer perceptron with parameters θg.

We also define a second multilayer perceptron D(x; θd) that outputs a single scalar,

where D(x) represents the probability that x came from the data rather than Pg.
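
D and G then play the following two-player minimax game with value function V(D, G) (Equation 1 in the paper):

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]$$

For a fixed G, the optimal discriminator is $D^{*}(x) = \frac{p_{\text{data}}(x)}{p_{\text{data}}(x) + p_g(x)}$, which equals ½ everywhere exactly when $p_g = p_{\text{data}}$, matching the ½ intuition described above.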

The architecture of a GAN can be explained with the following figure.

[Figure: the architecture of a GAN]

#generative-adversarial #discriminator #adversarial-network #deep-learning #neural-networks

Angela Dickens

Attention in GANs (Generative Adversarial Networks)

In 2017, the paper "Attention Is All You Need" shocked the land of NLP (Natural Language Processing). It was shocking not only because it has a good title, but also because it introduced to the world a new model architecture called the "Transformer", which proved to perform much better than traditional RNN-type networks and paved the way for the state-of-the-art NLP model BERT.

This stone cast in the pond of NLP has created ripples in the pond of GANs (Generative Adversarial Networks). Many have been inspired by it and have attempted to harness the magic power of attention. But when I first started reading papers about using attention in GANs, it appeared to me that there were many different meanings behind the same word "attention". In case you are as confused as I was, let me be at your service and shed some light on what people really mean when they say they use "attention" in GANs.

Meaning 1: Self-attention

Self-attention in GANs is very similar to the mechanism in the NLP Transformer model. Basically, it addresses the difficulty AI models have in understanding long-range dependencies.
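
For reference, the Transformer's scaled dot-product attention from "Attention Is All You Need" is:

$$\text{Attention}(Q, K, V) = \text{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V$$

Because every position attends to every other position in a single step, the distance between two related elements no longer matters.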

In NLP, the problem arises when there is a long sentence. Take the example of this Oscar Wilde quote: "To **live** is the rarest thing in the world. Most people **exist**, that is all." The two words in bold ("live" and "exist") have a relationship, but they are placed far apart from each other, which makes it hard for RNN-type AI models to capture the relationship.

It is almost the same in GANs: most GANs use a CNN structure, which is good at capturing local features but may overlook long-range dependencies outside its receptive field. As a result, it is easy for a GAN to generate realistic-looking fur on a dog, but it may make a mistake such as generating a dog with five legs.

Self-Attention Generative Adversarial Networks (SAGAN) add a self-attention module that guides the model to look at features in distant portions of the image. In the pictures below, the image on the left is the generated image, with some sample locations labeled with colored dots. The other images show the corresponding attention maps for those locations. I find the most interesting one to be the 5th image, with the cyan dot: it shows that when the model generates the left ear of the dog, it looks not only at the local region around the left ear but also at the right ear.

[Figure: visualization of the attention maps for the color-labeled locations (from the SAGAN paper)]

If you are interested in the technical details of SAGAN, then beyond reading the paper, I also recommend this post.
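
As a sketch of the mechanism (following the SAGAN paper's notation from memory, so treat the exact symbols as approximate): the feature map x at N locations is projected by learned 1×1 convolutions f, g, and h, and the attention weights and outputs are computed as

$$s_{ij} = f(x_i)^{\top} g(x_j), \qquad \beta_{j,i} = \frac{\exp(s_{ij})}{\sum_{i=1}^{N} \exp(s_{ij})}$$

$$o_j = \sum_{i=1}^{N} \beta_{j,i}\, h(x_i), \qquad y_i = \gamma\, o_i + x_i$$

where γ is a learnable scalar initialized to 0, so the network starts from purely local features and gradually learns to use distant ones.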

Meaning 2: Attention in the discriminator

GANs consist of a generator and a discriminator. In the GANs world, they are like two gods eternally at war, where the generator god tirelessly creates, and the discriminator god stands at the side and criticizes how bad these creations are. It may sound like the discriminator is the bad god, but that is not true. It is through these criticisms that the generator god knows how to improve.

If these "criticisms" from the discriminator are so helpful, why don't we pay more attention to them? Let's see how the paper "U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation" does this.

The U-GAT-IT project tackles a difficult task: converting a human photo into a Japanese anime image. It is difficult because an anime character's face is vastly different from a real person's face. Take the example of the pair of images below. The anime character on the right is deemed a good conversion of the person on the left. But if we put ourselves in the computer's shoes for a moment, we will see that the eyes, nose, and mouth in the two images are very different, and the structure and proportions of the face also change a lot. It is very hard for a computer to know which features to preserve and which to modify.

#machine-learning #gans #neural-networks #deep-learning #generative-model #deep learning

amelia jones

How To Take Help Of Referencing Generator

APA Referencing Generator

Many students use APA style as the key citation style for their assignments at university or college. However, many people find it quite difficult to write references for their sources; it is easy to miss authors' names and dates. Hence, an APA referencing generator is important for reducing the burden on students. They can now complete their assignments on time with far less stress.

The functioning of an APA referencing generator

If you are struggling to write APA references, you can take the help of an APA referencing generator. It will create an excellent reference list. You are only required to enter the information about the source. Just ensure that the text is credible and original; copying references from elsewhere is a copyright violation.

You can use a referencing generator in just a click. It will generate the right references for all the sources, which you then organize in alphabetical order. The generator will help make sure that you get good grades.

How to use an APA referencing generator?

Select what needs to be cited, such as a journal, book, film, or other source. Choose the type of citation list required and enter all the required fields. The fields are dates, author name, title, editor name, edition, name of publisher, chapter number, page numbers, and title of the journal. Click to generate the reference and you will get the desired result.

Chicago Referencing Generator

Do you require this citation style? You can rely on a Chicago referencing generator to ensure that you get the right citation in just a click. The generator is designed to help students cite their research papers in Chicago style. It has proved to be one of the quickest and best citation generators on the market, helping to sort out homework issues in a few seconds and saving a lot of time and energy.

This tool helps researchers, professional writers, and students manage and generate in-text citations for essays. It helps with writing Chicago style in a fast and easy way, and it also provides details and directions for formatting and citing resources.

So, stop wasting time and go for a Chicago referencing generator or an APA referencing generator. These citation generators will help solve your citation issues. You can easily create citations using endnotes and footnotes.

You can generate bibliographies, references, in-text citations, and title pages. These tools are fully automatic: you are just required to enter certain details about the citation and you will get the citation in the proper and required format.

So, if you are facing any problems with an assignment, you can take the help of an assignment help service.
If you require help with an assignment, then livewebtutors is the right place for you. If you look at our prices, you will observe that they are actually very affordable. Also, you can always expect a discount. Our team is capable and versatile enough to offer you exactly what you need: the best services at prices you can afford.

Read more: Are you struggling to write a bibliography? Use a Harvard referencing generator.

#apa referencing generator #harvard referencing generator #chicago referencing generator #mla referencing generator #deakin referencing generator #oxford referencing generator

Royce Reinger

A Ruby Library for Generating Text with Recursive Template Grammars

Calyx

Calyx provides a simple API for generating text with declarative recursive grammars.

Install

Command Line

gem install calyx

Gemfile

gem 'calyx'

Examples

The best way to get started quickly is to install the gem and run the examples locally.

Any Gradient

Requires Roda and Rack to be available.

gem install roda

Demonstrates how to use Calyx to construct SVG graphics. Any Gradient generates a rectangle with a linear gradient of random colours.

Run as a web server and preview the output in a browser (http://localhost:9292):

ruby examples/any_gradient.rb

Or generate SVG files via a command line pipe:

ruby examples/any_gradient.rb > gradient1.xml

Tiny Woodland Bot

Requires the Twitter client gem and API access configured for a specific Twitter handle.

gem install twitter

Demonstrates how to use Calyx to make a minimal Twitter bot that periodically posts unique tweets. See @tiny_woodland on Twitter and the writeup here.

TWITTER_CONSUMER_KEY=XXX-XXX
TWITTER_CONSUMER_SECRET=XXX-XXX
TWITTER_ACCESS_TOKEN=XXX-XXX
TWITTER_ACCESS_TOKEN_SECRET=XXX-XXX
ruby examples/tiny_woodland_bot.rb

Faker

Faker is a popular library for generating fake names and associated sample data like internet addresses, company names and locations.

This example demonstrates how to use Calyx to reproduce the same functionality using custom lists defined in a YAML configuration file.

ruby examples/faker.rb

Usage

Require the library and inherit from Calyx::Grammar to construct a set of rules to generate a text.

require 'calyx'

class HelloWorld < Calyx::Grammar
  start 'Hello world.'
end

To generate the text itself, initialize the object and call the generate method.

hello = HelloWorld.new
hello.generate
# => "Hello world."

Obviously, this hardcoded sentence isn’t very interesting by itself. Possible variations can be added to the text by adding additional rules which provide a named set of text strings. The rule delimiter syntax ({}) can be used to substitute the generated content of other rules.

class HelloWorld < Calyx::Grammar
  start '{greeting} world.'
  greeting 'Hello', 'Hi', 'Hey', 'Yo'
end

Each time #generate runs, it evaluates the tree and randomly selects variations of rules to construct a resulting string.

hello = HelloWorld.new

hello.generate
# => "Hi world."

hello.generate
# => "Hello world."

hello.generate
# => "Yo world."

By convention, the start rule specifies the default starting point for generating the final text. You can start from any other named rule by passing it explicitly to the generate method.

class HelloWorld < Calyx::Grammar
  hello 'Hello world.'
end

hello = HelloWorld.new
hello.generate(:hello)

Block Constructors

As an alternative to subclassing, you can also construct rules unique to an instance by passing a block when initializing the class:

hello = Calyx::Grammar.new do
  start '{greeting} world.'
  greeting 'Hello', 'Hi', 'Hey', 'Yo'
end

hello.generate

Template Expressions

Basic rule substitution uses single curly brackets as delimiters for template expressions:

fruit = Calyx::Grammar.new do
  start '{colour} {fruit}'
  colour 'red', 'green', 'yellow'
  fruit 'apple', 'pear', 'tomato'
end

6.times { fruit.generate }
# => "yellow pear"
# => "red apple"
# => "green tomato"
# => "red pear"
# => "yellow tomato"
# => "green apple"

Nesting and Substitution

Rules are recursive. They can be arbitrarily nested and connected to generate larger and more complex texts.

class HelloWorld < Calyx::Grammar
  start '{greeting} {world_phrase}.'
  greeting 'Hello', 'Hi', 'Hey', 'Yo'
  world_phrase '{happy_adj} world', '{sad_adj} world', 'world'
  happy_adj 'wonderful', 'amazing', 'bright', 'beautiful'
  sad_adj 'cruel', 'miserable'
end

Nesting and hierarchy can be manipulated to balance consistency with novelty. The exact same word atoms can be combined in a variety of ways to produce strikingly different resulting texts.

module HelloWorld
  class Sentiment < Calyx::Grammar
    start '{happy_phrase}', '{sad_phrase}'
    happy_phrase '{happy_greeting} {happy_adj} world.'
    happy_greeting 'Hello', 'Hi', 'Hey', 'Yo'
    happy_adj 'wonderful', 'amazing', 'bright', 'beautiful'
    sad_phrase '{sad_greeting} {sad_adj} world.'
    sad_greeting 'Goodbye', 'So long', 'Farewell'
    sad_adj 'cruel', 'miserable'
  end

  class Mixed < Calyx::Grammar
    start '{greeting} {adj} world.'
    greeting 'Hello', 'Hi', 'Hey', 'Yo', 'Goodbye', 'So long', 'Farewell'
    adj 'wonderful', 'amazing', 'bright', 'beautiful', 'cruel', 'miserable'
  end
end

Random Sampling

By default, the outcomes of generated rules are selected with Ruby's built-in pseudorandom number generator (as seen in methods like Kernel.rand and Array#sample). To seed the random number generator, pass an integer seed value to the constructor with the seed keyword:

grammar = Calyx::Grammar.new(seed: 12345) do
  # rules...
end

Alternatively, you can pass a preconfigured instance of Ruby’s stdlib Random class:

random = Random.new(12345)

grammar = Calyx::Grammar.new(rng: random) do
  # rules...
end

When a random seed isn’t supplied, Time.new.to_i is used as the default seed, which makes each run of the generator relatively unique.

Weighted Choices

Choices can be weighted so that some rules have a greater probability of expanding than others.

Weights are defined by passing a hash instead of a list of rules where the keys are strings or symbols representing the grammar rules and the values are weights.

Weights can be represented as floats, integers or ranges.

  • Floats must be in the interval 0..1 and the given weights for a production must sum to 1.
  • Ranges must be contiguous and cover the entire interval from 1 to the maximum value of the largest range.
  • Integers (Fixnums) will produce a distribution based on the sum of all given numbers, with each number being a fraction of that sum.

The following definitions produce an equivalent weighting of choices:

Calyx::Grammar.new do
  start 'heads' => 1, 'tails' => 1
end

Calyx::Grammar.new do
  start 'heads' => 0.5, 'tails' => 0.5
end

Calyx::Grammar.new do
  start 'heads' => 1..5, 'tails' => 6..10
end

Calyx::Grammar.new do
  start 'heads' => 50, 'tails' => 50
end

There are a lot of interesting things you can do with this. For example, you can model the triangular distribution produced by rolling 2d6:

Calyx::Grammar.new do
  start(
    '2' => 1,
    '3' => 2,
    '4' => 3,
    '5' => 4,
    '6' => 5,
    '7' => 6,
    '8' => 5,
    '9' => 4,
    '10' => 3,
    '11' => 2,
    '12' => 1
  )
end

Or reproduce Gary Gygax’s famous generation table from the original Dungeon Master’s Guide (page 171):

Calyx::Grammar.new do
  start(
    :empty => 0.6,
    :monster => 0.1,
    :monster_treasure => 0.15,
    :special => 0.05,
    :trick_trap => 0.05,
    :treasure => 0.05
  )
  empty 'Empty'
  monster 'Monster Only'
  monster_treasure 'Monster and Treasure'
  special 'Special'
  trick_trap 'Trick/Trap.'
  treasure 'Treasure'
end

String Modifiers

Dot-notation is supported in template expressions, allowing you to call any available method on the String object returned from a rule. Formatting methods can be chained arbitrarily and will execute in the same way as they would in native Ruby code.

greeting = Calyx::Grammar.new do
  start '{hello.capitalize} there.', 'Why, {hello} there.'
  hello 'hello', 'hi'
end

4.times { greeting.generate }
# => "Hello there."
# => "Hi there."
# => "Why, hello there."
# => "Why, hi there."

You can also extend the grammar with custom modifiers that provide useful formatting functions.

Filters

Filters accept an input string and return the transformed output:

greeting = Calyx::Grammar.new do
  filter :shoutycaps do |input|
    input.upcase
  end

  start '{hello.shoutycaps} there.', 'Why, {hello.shoutycaps} there.'
  hello 'hello', 'hi'
end

4.times { greeting.generate }
# => "HELLO there."
# => "HI there."
# => "Why, HELLO there."
# => "Why, HI there."

Mappings

The mapping shortcut allows you to specify a map of regex patterns pointing to their resulting substitution strings:

green_bottle = Calyx::Grammar.new do
  mapping :pluralize, /(.+)/ => '\\1s'
  start 'One green {bottle}.', 'Two green {bottle.pluralize}.'
  bottle 'bottle'
end

2.times { green_bottle.generate }
# => "One green bottle."
# => "Two green bottles."

Modifier Mixins

In order to use more intricate rewriting and formatting methods in a modifier chain, you can add methods to a module and embed it in a grammar using the modifier classmethod.

Modifier methods accept a single argument representing the input string from the previous step in the expression chain and must return a string, representing the modified output.

module FullStop
  def full_stop(input)
    input << '.'
  end
end

hello = Calyx::Grammar.new do
  modifier FullStop
  start '{hello.capitalize.full_stop}'
  hello 'hello'
end

hello.generate
# => "Hello."

To share custom modifiers across multiple grammars, you can include the module in Calyx::Modifiers. This will make the methods available to all subsequent instances:

module FullStop
  def full_stop(input)
    input << '.'
  end
end

class Calyx::Modifiers
  include FullStop
end

Monkeypatching String

Alternatively, you can combine methods from existing Gems that monkeypatch String:

require 'indefinite_article'

module FullStop
  def full_stop
    self << '.'
  end
end

class String
  include FullStop
end

noun_articles = Calyx::Grammar.new do
  start '{fruit.with_indefinite_article.capitalize.full_stop}'
  fruit 'apple', 'orange', 'banana', 'pear'
end

4.times { noun_articles.generate }
# => "An apple."
# => "An orange."
# => "A banana."
# => "A pear."

Memoized Rules

Rule expansions can be ‘memoized’ so that multiple references to the same rule return the same value. This is useful for picking a noun from a list and reusing it in multiple places within a text.

The @ sigil is used to mark memoized rules. This evaluates the rule and stores it in memory the first time it’s referenced. All subsequent references to the memoized rule use the same stored value.

# Without memoization
grammar = Calyx::Grammar.new do
  start '{name} <{name.downcase}>'
  name 'Daenerys', 'Tyrion', 'Jon'
end

3.times { grammar.generate }
# => Daenerys <jon>
# => Tyrion <daenerys>
# => Jon <tyrion>

# With memoization
grammar = Calyx::Grammar.new do
  start '{@name} <{@name.downcase}>'
  name 'Daenerys', 'Tyrion', 'Jon'
end

3.times { grammar.generate }
# => Tyrion <tyrion>
# => Daenerys <daenerys>
# => Jon <jon>

Note that the memoization symbol can only be used on the right hand side of a production rule.

Unique Rules

Rule expansions can be marked as ‘unique’, meaning that multiple references to the same rule always return a different value. This is useful for situations where the same result appearing twice would appear awkward and messy.

Unique rules are marked by the $ sigil.

grammar = Calyx::Grammar.new do
  start "{$medal}, {$medal}, {$medal}"
  medal 'Gold', 'Silver', 'Bronze'
end

grammar.generate
# => Silver, Bronze, Gold

Dynamically Constructing Rules

Template expansions can be dynamically constructed at runtime by passing a context map of rules to the #generate method:

class AppGreeting < Calyx::Grammar
  start 'Hi {username}!', 'Welcome back {username}...', 'Hola {username}'
end

context = {
  username: UserModel.username
}

greeting = AppGreeting.new
greeting.generate(context)

External File Formats

In addition to defining grammars in pure Ruby, you can load them from external JSON and YAML files:

hello = Calyx::Grammar.load('hello.yml')
hello.generate

The format requires a flat map with keys representing the left-hand side named symbols and the values representing the right hand side substitution rules.

In JSON:

{
  "start": "{greeting} world.",
  "greeting": ["Hello", "Hi", "Hey", "Yo"]
}

In YAML:

---
start: "{greeting} world."
greeting:
  - Hello
  - Hi
  - Hey
  - Yo

Accessing the Raw Generated Tree

Calling #evaluate on the grammar instance will give you access to the raw generated tree structure before it gets flattened into a string.

The tree is encoded as an array of nested arrays, with the leading symbols labeling the choices and rules selected, and the trailing terminal leaves encoding string values.

This may not make a lot of sense unless you’re familiar with the concept of s-expressions. It’s a fairly speculative feature at this stage, but it leads to some interesting possibilities.

grammar = Calyx::Grammar.new do
  start 'Riddle me ree.'
end

grammar.evaluate
# => [:start, [:choice, [:concat, [[:atom, "Riddle me ree."]]]]]

Roadmap

Rough plan for stabilising the API and features for a 1.0 release.

Version | Features planned
0.6     | block constructor
0.7     | support for template context map passed to generate
0.8     | method missing metaclass API
0.9     | return grammar tree from #evaluate, with flattened string from #generate being separate
0.10    | inject custom string functions for parameterised rules, transforms and mappings
0.11    | support YAML format (and JSON?)
0.12    | API documentation
0.13    | Support for unique rules
0.14    | Support for Ruby 2.4
0.15    | Options config and 'strict mode' error handling
0.16    | Improve representation of weighted probability selection
0.17    | Return result object from #generate calls

Credits


Author: Maetl
Source Code: https://github.com/maetl/calyx 
License: MIT license

#ruby #text