The goal of ERC721A is to provide a fully compliant implementation of IERC721 with significant gas savings for minting multiple NFTs in a single transaction. This project and implementation will be updated regularly and will continue to stay up to date with best practices.
The Azuki team created ERC721A for its sale on 1/12/22. There was significant demand for 8700 tokens made available to the public, and all were minted within minutes. The network BASEFEE remained low despite huge demand, resulting in low gas costs for minters, while minimizing network disruption for the wider ecosystem as well.
For more information on how ERC721A works under the hood, please visit our blog. To find other projects that are using ERC721A, please visit erc721a.org and our curated projects list.
Chiru Labs is not liable for any outcomes as a result of using ERC721A. DYOR.
https://chiru-labs.github.io/ERC721A/
https://github.com/chiru-labs/ERC721A-Upgradeable
npm install --save-dev erc721a
Once installed, you can use the contracts in the library by importing them:
pragma solidity ^0.8.4;
import "erc721a/contracts/ERC721A.sol";
contract Azuki is ERC721A {
    constructor() ERC721A("Azuki", "AZUKI") {}

    function mint(uint256 quantity) external payable {
        // `_mint`'s second argument now takes in a `quantity`, not a `tokenId`.
        _mint(msg.sender, quantity);
    }
}
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
Don't forget to give the project a star! Thanks again!
git checkout -b feature/AmazingFeature
git commit -m 'Add some AmazingFeature'
git push origin feature/AmazingFeature
npm install
npm run test
Project Link: https://github.com/chiru-labs/ERC721A
📢 Version 4.x introduces several breaking changes. Please refer to the documentation for more details.
We highly recommend reading the migration guide, especially the part on supportsInterface if you are using it with OpenZeppelin extensions (e.g. ERC2981); a sketch of such an override follows below.
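For illustration, a minimal sketch of what such an override can look like when combining ERC721A with OpenZeppelin's ERC2981 (the contract name is illustrative; follow the migration guide for the exact pattern for your versions):

pragma solidity ^0.8.4;

import "erc721a/contracts/ERC721A.sol";
import "@openzeppelin/contracts/token/common/ERC2981.sol";

contract AzukiWithRoyalties is ERC721A, ERC2981 {
    constructor() ERC721A("Azuki", "AZUKI") {}

    // Both base contracts declare supportsInterface, so the override must list both
    // and report the union of the supported interface IDs.
    function supportsInterface(bytes4 interfaceId)
        public
        view
        virtual
        override(ERC721A, ERC2981)
        returns (bool)
    {
        return ERC721A.supportsInterface(interfaceId) || ERC2981.supportsInterface(interfaceId);
    }
}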
Author: chiru-labs
Source code: https://github.com/chiru-labs/ERC721A
License: MIT license
#solidity #smartcontract #blockchain #web3 #ethereum #javascript
Form objects decoupled from your models.
Reform gives you a form object with validations and nested setup of models. It is completely framework-agnostic and doesn't care about your database.
Although Reform can be used in any Ruby framework, it comes with Rails support, works with simple_form and other form gems, allows nesting forms to implement has_one and has_many relationships, can compose a form from multiple objects and gives you coercion.
Reform is part of the Trailblazer framework. Full documentation is available on the project site.
Temporary note: Reform 2.2 does not automatically load Rails files anymore (e.g. ActiveModel::Validations). You need the reform-rails gem, see Installation.
Forms are defined in separate classes. Often, these classes partially map to a model.
class AlbumForm < Reform::Form
  property :title
  validates :title, presence: true
end
Fields are declared using ::property. Validations work exactly as you know them from Rails or other frameworks. Note that validations no longer go into the model.
Forms have a ridiculously simple API with only a handful of public methods.
- #initialize always requires a model that the form represents.
- #validate(params) updates the form's fields with the input data (only the form, not the model) and then runs all validations. The return value is the boolean result of the validations.
- #errors returns validation messages in a classic ActiveModel style.
- #sync writes form data back to the model. This will only use setter methods on the model(s).
- #save (optional) will call #save on the model and nested models. Note that this implies a #sync call.
- #prepopulate! (optional) will run pre-population hooks to "fill out" your form before rendering.

In addition to the main API, forms expose accessors to the defined properties. This is used for rendering or manual operations.
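For orientation, a compact sketch of that lifecycle, assuming the AlbumForm defined above (hash keys and error messages depend on your setup):

form = AlbumForm.new(Album.new)               # initialize with the model

if form.validate("title" => "Greatest Hits")  # writes to the form only and runs validations
  form.save                                   # sync to the model(s) and call #save on them
else
  form.errors.messages                        # e.g. {title: ["can't be blank"]}
end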
In your controller or operation you create a form instance and pass in the models you want to work on.
class AlbumsController
  def new
    @form = AlbumForm.new(Album.new)
  end
This will also work as an editing form with an existing album.
def edit
  @form = AlbumForm.new(Album.find(1))
end
Reform will read property values from the model in setup. In our example, the AlbumForm will call album.title to populate the title field.
Your @form is now ready to be rendered, either do it yourself or use something like Rails' #form_for, simple_form or formtastic.
= form_for @form do |f|
  = f.input :title
Nested forms and collections can be easily rendered with fields_for, etc. Note that you no longer pass the model to the form builder, but the Reform instance.
Optionally, you might want to use the #prepopulate! method to pre-populate fields and prepare the form for rendering.
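A rough sketch of that, assuming an Artist model and the prepopulator: option described in the Trailblazer docs:

class AlbumForm < Reform::Form
  property :artist, prepopulator: ->(*) { self.artist = Artist.new } do
    property :name
  end
end

form = AlbumForm.new(Album.new)
form.prepopulate!   # runs the prepopulator blocks, e.g. to build a blank nested artist
# now render the form as usual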
After form submission, you need to validate the input.
class SongsController
  def create
    @form = SongForm.new(Song.new)

    #=> params: {song: {title: "Rio", length: "366"}}
    if @form.validate(params[:song])
The #validate method first updates the values of the form - the underlying model is still treated as immutable and remains unchanged. It then runs all validations you provided in the form.
It's the only entry point for updating the form. This is by design, as separating writing and validation doesn't make sense for a form.
This allows rendering the form after validate with the data that has been submitted. However, don't get confused: the model's values are still the old, original values and are only changed after a #save or #sync operation.
After validation, you have two choices: either call #save and let Reform sort out the rest, or call #sync, which will write all the properties back to the model. In a nested form, this works recursively, of course.
It's then up to you what to do with the updated models - they're still unsaved.
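A minimal sketch of the #sync route, persisting the model yourself afterwards:

if @form.validate(params[:song])
  @form.sync          # writes the form values to the model via its setters, nothing is persisted yet
  @form.model.save!   # persist manually, e.g. inside a transaction
else
  # re-render the form, @form.errors holds the messages
end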
The easiest way to save the data is to call #save on the form.
if @form.validate(params[:song])
  @form.save #=> populates album with incoming data
             #   by calling @form.album.title=.
else
  # handle validation errors.
end
This will sync the data to the model and then call album.save.
Sometimes, you need to do saving manually.
Reform allows default values to be provided for properties.
class AlbumForm < Reform::Form
  property :price_in_cents, default: 9_95
end
Calling #save with a block will provide a nested hash of the form's properties and values. This does not call #save on the models and allows you to implement the saving yourself.
The block parameter is a nested hash of the form input.
@form.save do |hash|
  hash #=> {title: "Greatest Hits"}
  Album.create(hash)
end
You can always access the form's model. This is helpful when you use populators to set up objects while validating.
@form.save do |hash|
  album = @form.model
  album.update_attributes(hash[:album])
end
Reform provides support for nested objects. Let's say the Album model keeps some associations.
class Album < ActiveRecord::Base
  has_one  :artist
  has_many :songs
end
The implementation details do not really matter here; as long as your album exposes readers and writers like Album#artist and Album#songs, you can define nested forms.
class AlbumForm < Reform::Form
  property :title
  validates :title, presence: true

  property :artist do
    property :full_name
    validates :full_name, presence: true
  end

  collection :songs do
    property :name
  end
end
You can also reuse an existing form from elsewhere using :form.
property :artist, form: ArtistForm
Reform will wrap defined nested objects in their own forms. This happens automatically when instantiating the form.
album.songs #=> [<Song name:"Run To The Hills">]
form = AlbumForm.new(album)
form.songs[0] #=> <SongForm model: <Song name:"Run To The Hills">>
form.songs[0].name #=> "Run To The Hills"
When rendering a nested form you can use the form's readers to access the nested forms.
= text_field :title, @form.title
= text_field "artist[name]", @form.artist.name
Or use something like #fields_for in a Rails environment.
= form_for @form do |f|
  = f.text_field :title
  = f.fields_for :artist do |a|
    = a.text_field :name
validate will assign values to the nested forms. sync and save work analogously to the non-nested form, just in a recursive way.
The block form of #save would give you the following data.
@form.save do |nested|
  nested #=> {title:  "Greatest Hits",
         #    artist: {name: "Duran Duran"},
         #    songs:  [{title: "Hungry Like The Wolf"},
         #             {title: "Last Chance On The Stairways"}]
         #   }
end
Manual saving with a block is not encouraged. You should rather check the Disposable docs to find out how to implement your manual tweak with the official API.
Very often, you need to give Reform some information about how to create or find nested objects when validating. This directive is called a populator and is documented here.
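As a taste, here is a sketch using the populate_if_empty shortcut (assuming a Song model; full populator blocks give you more control and are covered in the docs):

class AlbumForm < Reform::Form
  collection :songs, populate_if_empty: Song do
    property :name
  end
end

# During validate, a new Song is added for every incoming songs fragment
# that has no counterpart in the model yet.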
Add this line to your Gemfile:
gem "reform"
Reform works fine with Rails 3.1-5.0. However, inheritance of validations with ActiveModel::Validations is broken in Rails 3.2 and 4.0.
Since Reform 2.2, you have to add the reform-rails gem to your Gemfile to automatically load ActiveModel/Rails files.
gem "reform-rails"
Since Reform 2.0 you need to specify which validation backend you want to use (unless you're in a Rails environment where ActiveModel will be used).
To use ActiveModel (not recommended, as it is very outdated):
require "reform/form/active_model/validations"
Reform::Form.class_eval do
  include Reform::Form::ActiveModel::Validations
end
To use dry-validation (recommended):
require "reform/form/dry"
Reform::Form.class_eval do
  feature Reform::Form::Dry
end
Put this in an initializer or on top of your script.
Reform allows you to map multiple models to one form. The complete documentation is here; in short, this is how it works.
class AlbumForm < Reform::Form
  include Composition

  property :id,    on: :album
  property :title, on: :album
  property :songs, on: :cd
  property :cd_id, on: :cd, from: :id
end
When initializing a composition, you have to pass a hash that contains the composees.
AlbumForm.new(album: album, cd: CD.find(1))
Reform comes with many more optional features, like hash fields, coercion, virtual fields, and so on. Check the full documentation here.
Reform is part of the Trailblazer project. Please buy my book to support the development and learn everything about Reform - there are two chapters dedicated to Reform!
By explicitly defining the form layout using ::property there is no more need for protecting from unwanted input. strong_parameter or attr_accessible become obsolete. Reform will simply ignore undefined incoming parameters.
Temporary note: This is the README and API for Reform 2. On the public API, only a few tiny things have changed. Here are the Reform 1.2 docs.
Anyway, please upgrade and report problems and do not simply assume that we will magically find out what needs to get fixed. When in trouble, join us on Gitter.
Full documentation for Reform is available online, or support us and grab the Trailblazer book. There is an Upgrading Guide to help you migrate through versions.
Great thanks to Blake Education for giving us the freedom and time to develop this project in 2013 while working on their project.
Author: trailblazer
Source code: https://github.com/trailblazer/reform
License: MIT license
In this post, I will show you easy steps for multiple file upload in Laravel 7 and 6, as well as how to validate file type and size before uploading to the database in Laravel.
You can easily upload multiple files with validation in a Laravel application using the following steps:
https://www.tutsmake.com/laravel-6-multiple-file-upload-with-validation-example/
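As a rough idea of what the tutorial walks through, here is a hedged controller sketch for Laravel 7 (the File model, the files input name, and the uploads path are illustrative assumptions):

<?php

namespace App\Http\Controllers;

use App\File; // assumed Eloquent model for storing uploaded file names
use Illuminate\Http\Request;

class FileUploadController extends Controller
{
    public function store(Request $request)
    {
        // Validate that every uploaded file has an allowed type and is at most 2 MB.
        $request->validate([
            'files'   => 'required',
            'files.*' => 'mimes:pdf,jpg,jpeg,png|max:2048',
        ]);

        foreach ($request->file('files') as $file) {
            // Give each file a unique name and move it to public/uploads.
            $name = time() . '_' . $file->getClientOriginalName();
            $file->move(public_path('uploads'), $name);

            File::create(['name' => $name]);
        }

        return back()->with('success', 'Files uploaded successfully.');
    }
}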
#laravel multiple file upload validation #multiple file upload in laravel 7 #multiple file upload in laravel 6 #upload multiple files laravel 7 #upload multiple files in laravel 6 #upload multiple files php laravel
Though there are lots of technological innovations going on, is there anything bigger than the phenomenon of Non-Fungible Tokens (NFTs)? These unique crypto collectibles have attained a market capitalization of $22.69 billion along with a daily trading volume of $2.02 billion as per CoinMarketCap.
One of the biggest advantages of NFTs is that artists and content creators are getting adequate monetary compensation for their work. Digital collectibles eliminate the role of intermediaries in buying and selling. Content developers can directly reach out to their target audience. They need not depend on any middleman or a platform. NFTs are interoperable, non-interchangeable, scarce, and very transparent.
Entrepreneurs looking to enter the profitable opportunity of NFT trading can partner with a competent app development company to fulfil their business objectives. Skilled developers create an advanced White-label NFT minting platform. It can run on several blockchain networks like Binance Smart Chain (BSC), Cardano, EOS, Ethereum, Flow, Polkadot, Stellar and TRON.
What is the crucial role played by a White-label NFT Minting app?
It facilitates the hassle-free sale of artwork, domain names, fashion accessories, gaming assets, music (albums and tracks), photos, software licenses, sports goods, and videos.
A White-label NFT Minting platform also distributes crypto collectibles for free to investors through an airdrop program. Users have to complete certain tasks within a stipulated time to get free crypto tokens.
A White-label NFT Minting app ensures a high level of safety and transparency for all buyers and sellers. It conducts Anti-Money Laundering (AML) and Know Your Customer (KYC) verification on all the traders. This ensures a trustworthy experience for both institutional and retail investors.
An ERC-721 Token Minting platform contains comprehensive information like the contract address of the token pair, name of the owner, token balance, token ID, and the total supply of the token. Besides that, it contains a function for the quick transfer of the tokens.
Investors can keep close track of the Ethereum (ETH) tokens through the Etherscan Block Explorer. It shares real-time updates about the daily and weekly trading volumes/transfers on the Ethereum blockchain network.
An app development company will also help in the hassle-free sale of ERC-721 digital collectibles on white-label Ethereum-based NFT marketplaces like CryptoKitties, Mintable.app, OpenSea, and Rarible.
A White-label NFT Minting platform safeguards the crypto assets of traders. It offers secure digital wallets like Argent, Fortmatic, MetaMask, MyEtherWallet (MEW), Portis, Torus, Trust Wallet, WalletConnect, and WalletLink.
A White-label NFT Minting app gives 24x7 technical support for artists, buyers, and content creators. Round-the-clock assistance is offered to resolve their glitches and issues via Discord, email, live chat, phone, and Telegram.
Conclusion
The global NFT trading market has grown tremendously with the emergence of multiple marketplaces. Content creators can become millionaires overnight by selling their crypto collectibles for a high price.
Hence ambitious entrepreneurs can become an important part of the booming digital collectible selling industry. Set up a game-changing White-label NFT Minting platform with dedicated assistance from an app development company.
#white-label nft minting platform #white-label nft minting app #erc-721 token minting platform #nft minting platform #nft minting software
The plugin provides the functionality to merge OpenAPI specification files (formerly known as Swagger) with one or multiple YML files containing the x-amazon-apigateway extensions. There are several use cases for keeping this information separated; for example, it may be necessary to deploy different API Gateway integrations depending on the stage environment.
When dealing with functional tests you do not want to test the production environment, but only a mocking response.
The plugin supports YML based OpenApi3 specification files only
See the examples folder for a full working example
Installation & Setup
Run npm install in your Serverless project.
$ npm install --save-dev serverless-openapi-integration-helper
Add the plugin to your serverless.yml file
plugins:
- serverless-openapi-integration-helper
Plugin configuration
You can configure the plugin under the key openApiIntegration. See Configuration Reference for a list of available options.
The mapping array must be used to configure where the files containing the x-amazon-apigateway-integration blocks are located.
openApiIntegration:
  package: true #New feature! Hook into the package & deploy process
  inputFile: schema.yml
  mapping:
    - stage: [dev, prod] #multiple stages
      path: integrations
    - stage: test #single stage
      path: mocks
In the above example, all YML files inside the integrations directory will be processed and merged with the schema.yml file when deploying the dev stage:
serverless deploy --stage=dev
To use a different x-amazon-apigateway integration for functional tests (e.g. with mocking responses), the mocks directory is processed and merged with the schema.yml file when deploying the test stage:
serverless deploy --stage=test
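For orientation, an integration file referenced by such a mapping might contain a block roughly like the following (a hedged sketch using the standard x-amazon-apigateway-integration mock format; the plugin's generated output may differ in detail):

# e.g. mocks/user.yml (illustrative)
paths:
  /api/v1/user:
    post:
      x-amazon-apigateway-integration:
        type: mock
        requestTemplates:
          application/json: '{"statusCode": 201}'
        responses:
          default:
            statusCode: "201"
            responseTemplates:
              application/json: '{"message": "mocked response"}'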
Usage
You can set up a fully working API Gateway with any OpenAPI 3.0 specification file. First, create the input file containing the OpenAPI specification:
# ./schema.yml
openapi: 3.0.0
info:
  description: User Registration
  version: 1.0.0
  title: UserRegistration
paths:
  /api/v1/user:
    post:
      summary: adds a user
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Customer'
      responses:
        '201':
          description: user created
components:
  schemas:
    Customer:
      type: object
      required:
        - email_address
        - password
      properties:
        email_address:
          type: string
          example: test@example.com
        password:
          type: string
          format: password
          example: someStrongPassword#
The plugin will generate the x-amazon-apigateway-integration objects for all methods that do not have an integration.
#generate a file containing a gateway mock integration in the directory /mocks
serverless integration create --output mocks --type mock --stage=test
#generate a file containing the production integration in the directory integrations/
serverless integration create --output integrations --type http --stage=prod
Supported types are
The plugin now generates a merged file during deployment that is automatically injected in your serverless resources
#Create OpenApi File containing mocking responses (usable in functional tests) and deploy to ApiGateway
serverless deploy --stage=test
#Create OpenApi File containing the production integration and deploy to ApiGateway
serverless deploy --stage=prod
The generated output is automatically injected in the resources.Resources.YOUR_API_GATEWAY.Properties.Body property
resources:
  Resources:
    ApiGatewayRestApi:
      Type: AWS::ApiGateway::RestApi
      Properties:
        ApiKeySourceType: HEADER
        Body: ~ #autogenerated by plugin
        Description: "Some Description"
        FailOnWarnings: false
        Name: ${opt:stage, self:provider.stage}-some-name
        EndpointConfiguration:
          Types:
            - REGIONAL
    ApiGatewayDeployment:
      Type: AWS::ApiGateway::Deployment
      Properties:
        RestApiId:
          Ref: ApiGatewayRestApi
        StageName: ${opt:stage, self:provider.stage}
Commands
The merge command can be used independently with
serverless integration merge --stage=dev
In that case, the API Gateway Body property has to be specified manually:
resources:
  Resources:
    ApiGatewayRestApi:
      Type: AWS::ApiGateway::RestApi
      Properties:
        ApiKeySourceType: HEADER
        Body: ${file(openapi-integration/api.yml)}
CORS generator
The plugin can generate full CORS support out of the box.
openApiIntegration:
  cors: true
  ...
If enabled, the plugin generates all required OPTIONS methods as well as the required header information and adds a mocking response to API Gateway. You can customize the CORS templates by placing your own files inside a directory openapi-integration (in your project root). The following files can be overwritten:
| Filename | Description |
| --- | --- |
| headers.yml | All headers required for CORS support |
| integration.yml | Contains the x-amazon-apigateway-integration block |
| path.yml | OpenApi specification for the OPTIONS method |
| response-parameters.yml | The response parameters of the x-amazon-apigateway-integration responses |
See the EXAMPLES directory for detailed instructions.
Auto Mock Generator
If enabled, the plugin generates mocking responses for all methods that do not have an x-amazon-apigateway-integration block defined. It takes the first 2xx response defined in the OpenAPI specification and generates a simple mocking response on the fly.
openApiIntegration:
  autoMock: true
  ...
When using the autoMock feature, you do not need to specify inputPath mappings, since all endpoints are mocked automatically
openApiIntegration:
  package: true
  inputFile: schema.yml
  mapping: ~
VALIDATION generator
The plugin supports full request validation out of the box
openApiIntegration:
  validation: true
  ...
If enabled, the plugin generates the x-amazon-apigateway-request-validators blocks and adds a basic request validation to all methods. You can customize the VALIDATION template by placing your own files inside a directory openapi-integration (in your project root). The following files can be overwritten:
| Filename | Description |
| --- | --- |
| request-validator.yml | The x-amazon-apigateway-request-validators block |
See the EXAMPLES directory for detailed instructions.
Proxy Manager
The proxymanager feature automates the complete generation of an HTTP proxy integration. You only have to define the target URL and all necessary AWS integration blocks are generated on-the-fly during deployment.
openApiIntegration:
  cors: true
  validation: true
  mapping:
    - stage: [dev, prod]
      proxyManager:
        type: http_proxy
        baseUrl: https://www.example.com
        pattern: "(?<=api\/v1)\/.+"
  ...
With this setting, no separate integration files need to be created.
A combination of your own and auto-generated files is still possible without any problems.
At the moment, only http_proxy is supported.
The base URL is required to map the path variable from the OpenAPI specification to the URI of the AWS integration.
Example:
#original openapi specification
paths:
  /api/v1/user:
    post:
      ...
will be translated to
#generated openapi specification output
paths:
  /api/v1/user:
    post:
      ...
      x-amazon-apigateway-integration:
        type: http_proxy
        passthroughBehavior: when_no_match
        httpMethod: POST
        uri: https://www.example.com/api/v1/user
The pattern can be used to adapt the mapping of the base URL using a regexp, e.g. to remove a prefix or a version string.
Example:
baseUrl: https://www.example.com
pattern: "(?<=api\/v1)\/.+"
will translate the route /api/v1/user to https://www.example.com/user
Configuration Reference
Configure the plugin under the key openApiIntegration:
openApiIntegration:
  inputFile: schema.yml #required
  package: true #optional, defaults to false
  inputDirectory: ./ #optional, defaults to ./
  cors: true #optional, defaults to false
  autoMock: true #optional, defaults to false
  validation: true #optional, defaults to false
  mapping: #optional, can be completely blank if autoMock option is enabled
    - stage: [dev, prod] #multiple stages
      path: integrations
      proxyManager: #optional
        type: http_proxy
        baseUrl: https://example.com
        pattern: "(?<=v1)\/.+"
    - stage: test #single stage
      path: mocks/customer.yml
  outputFile: api.yml #optional, defaults to api.yml
  outputDirectory: openapi-integration #optional, defaults to ./openapi-integration
Known Issues
When using the Serverless Framework only to deploy your AWS resources, without having any Lambda functions or triggers, the AWS API Gateway deployment does not behave as expected. Any deployment to an existing stage will be ignored, since CloudFormation does not redeploy a stage if the DeploymentIdentifier has not changed.
The plugin serverless-random-gateway-deployment-id solves this problem by adding a random id to the deployment-name and all references to it on every deploy
See the examples folder for a full working example
Serverless variables inside the openapi integration files are not resolved correctly when using the package & deploy hooks. This problem can be solved by using the api gateway STAGE VARIABLES.
See the examples folder for a full working example
Example
service:
  name: user-registration

provider:
  name: aws
  stage: dev
  region: eu-central-1

plugins:
  - serverless-openapi-integration-helper

openApiIntegration:
  inputFile: schema.yml
  package: true
  mapping:
    - path: integrations
      stage: [dev, prod]
    - path: mocks/customer.yml
      stage: test

functions:

resources:
  Resources:
    ApiGatewayRestApi:
      Type: AWS::ApiGateway::RestApi
      Properties:
        ApiKeySourceType: HEADER
        Body: ~
        Description: "Some Description"
        FailOnWarnings: false
        Name: ${opt:stage, self:provider.stage}-some-name
        EndpointConfiguration:
          Types:
            - REGIONAL
    ApiGatewayDeployment:
      Type: AWS::ApiGateway::Deployment
      Properties:
        RestApiId:
          Ref: ApiGatewayRestApi
        StageName: ${opt:stage, self:provider.stage}
serverless deploy --stage=test
serverless deploy --stage=prod
Approach to a functional test of schema validation
The plugin works well in combination with the serverless-plugin-test-helper to automate tests against the deployed api gateway
npm install --save-dev serverless-plugin-test-helper
Add the plugin as a dependency in your serverless configuration file and configure it according to its README:
#./serverless.yml
plugins:
  - serverless-plugin-test-helper
  - serverless-openapi-integration-helper

[...]

resources:
  Outputs:
    GatewayUrl: # This is the key that will be used in the generated outputs file
      Description: This is a helper for functional tests
      Value: !Join
        - ''
        - - 'https://'
          - !Ref ApiGatewayRestApi
          - '.execute-api.'
          - ${opt:region, self:provider.region}
          - '.amazonaws.com/'
          - ${opt:stage, self:provider.stage}

  Resources:
    ApiGatewayRestApi:
      Type: AWS::ApiGateway::RestApi
      Properties:
        ApiKeySourceType: HEADER
        Body: ~
        Description: User Registration (${opt:stage, self:provider.stage})
        FailOnWarnings: false
        Name: ${opt:stage, self:provider.stage}-gateway
        EndpointConfiguration:
          Types:
            - REGIONAL
    ApiGatewayDeployment:
      Type: AWS::ApiGateway::Deployment
      Properties:
        RestApiId:
          Ref: ApiGatewayRestApi
        StageName: ${opt:stage, self:provider.stage}
Add a functional test (e.g. with jest)
//tests/registration.js
import {getOutput} from 'serverless-plugin-test-helper';
import axios from 'axios';

axios.defaults.adapter = require('axios/lib/adapters/http'); //Todo

const URL = getOutput('GatewayUrl');

test('request validation on registration', async () => {
    expect.assertions(1);

    const {status} = await axios.post(URL + '/api/v1/user',
        {
            "email_address": "test@example.com",
            "password": "someStrongPassword#"
        },
        {
            headers: {
                'Content-Type': 'application/json',
            }
        });

    expect(status).toEqual(201);
});
test('request validation on registration (invalid request)', async () => {
    expect.assertions(1);

    try {
        await axios.post(URL + '/api/v1/user',
            {
                "email": "test@example.com"
            },
            {
                headers: {
                    'Content-Type': 'application/json',
                }
            });
    } catch (e) {
        expect(e.response).toMatchObject({
            statusText: 'Bad Request',
            status: 400
        });
    }
});
Then perform the functional test
serverless deploy --stage=test
npm test
serverless remove --stage=test
These commands deploy a test stage, run the functional tests against it, and remove the stage afterwards.
See the examples folder for a full working example
Feedback is appreciated! If you have an idea for how this plugin/library can be improved (or even just a complaint/criticism) then please open an issue.
Author: yndlingsfar
Source Code: https://github.com/yndlingsfar/serverless-openapi-integration-helper
License: MIT license