A GUI for Local DynamoDB: dynamodb-admin

Quick Start Guide

1. Install the package globally from npm.

$ npm install -g dynamodb-admin

2. Run DynamoDB locally inside a Docker container

Make sure you have Docker installed on your system. Instructions are here.

Now pull and run the Docker dynamodb-local image to spin up your very own DynamoDB instance running on port 8000.

$ docker pull amazon/dynamodb-local
$ docker run -p 8000:8000 amazon/dynamodb-local

3. Start dynamodb-admin (with defaults)

MacOS/Linux

$ dynamodb-admin

Windows

> set DYNAMO_ENDPOINT=http://localhost:8000
> dynamodb-admin

After these steps you will have a local DynamoDB instance listening on port 8000 and the dynamodb-admin GUI running in your browser (by default at http://localhost:8001).

The next step is to create a table and start reading/writing to it!

Advanced Setup

You may need to override regions, endpoints and/or credentials to peek inside local DynamoDB instances you have spun up to replicate a production environment.

If so, just override the defaults when starting the service. You can override some or all of these, as required.

DYNAMO_ENDPOINT=http://localhost:<PORT> AWS_REGION=<AWS-REGION> AWS_ACCESS_KEY_ID=<YOUR-ACCESS-KEY> AWS_SECRET_ACCESS_KEY=<YOUR-SECRET> dynamodb-admin

Setting up your Tables

The easy way — for simple use cases

Here we are going to create your table using the dynamodb-admin GUI. This is most likely the appropriate approach for your use case.

Clicking on ‘Create table’ takes us to a screen where we can define how our table should look. Remember that DynamoDB is effectively a key-value store, meaning to get started we only need to define a table name and a hash attribute (the primary key). If you expect that you’ll need to perform lookups based on another attribute of your data, you may want to add some Secondary Indices.

In this example, I’ve named the table ‘Cats’ and given it a primary index ‘name’ and a secondary index ‘owner’. Both indices are of type String and we must give the secondary index a name — which can be different from the name of the attribute.

‘name’ is not a good choice of primary index in practice, as it means only one cat with a given name can be present in the table. Instead, it would be better to give each cat a unique id and use that as the primary index.

Finally, add your first item to the table by clicking the ‘Create item’ link on the top right. This will take you to a new screen where you can enter the JSON that defines the record. The only requirement for each entry is that the primary key is included.
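
For example, a minimal item for the Cats table described here might look like the following (the values are illustrative; only the primary key attribute, name, is strictly required, and owner simply feeds the secondary index when present):

{
  "name": "Whiskers",
  "owner": "Alice"
}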

The Hard(er) Way — for more complex table structures

Using the GUI to set up tables is fine for simple tables, or when you’re just exploring how your data storage might be structured. However, if you are trying to mimic a complex table, or you want to stand up tables quickly for testing, you may want to use the command line to create the table(s) for you.

1. Install the AWS CLI

Instructions for installing (for Mac) via the command line are here. This is a very powerful utility; I’d recommend installing it if you work with AWS, even if you don’t opt to use it here.

2. Create a table schema

If you already have a table schema you can skip this and move along to Step 3.

$ aws dynamodb create-table --generate-cli-skeleton > dynamo_table_def.json

This will generate a skeleton of the create-table input and redirect it into a file named dynamo_table_def.json.

You can now open the JSON file and edit it to fit your desired table schema. This is great because you can reuse the same schema file whenever you need the table, rather than manually setting it up each time via the GUI.

If you’re just curious what the schema should look like, or you need some inspiration for your own — here is the schema for the Cats table.

{
  "AttributeDefinitions": [
    {
      "AttributeName": "name",
      "AttributeType": "S"
    },
    {
      "AttributeName": "owner",
      "AttributeType": "S"
    }
  ],
  "TableName": "Cats",
  "KeySchema": [
    {
      "AttributeName": "name",
      "KeyType": "HASH"
    }
  ],
  "ProvisionedThroughput": {
    "ReadCapacityUnits": 3,
    "WriteCapacityUnits": 3
  },
  "GlobalSecondaryIndexes": [
    {
      "IndexName": "idx_owner",
      "KeySchema": [
        {
          "AttributeName": "owner",
          "KeyType": "HASH"
        }
      ],
      "Projection": {
        "ProjectionType": "ALL"
      },
      "ProvisionedThroughput": {
        "ReadCapacityUnits": 3,
        "WriteCapacityUnits": 3
      }
    }
  ]
}

3. Use the schema to create the table

Finally, create the table locally:

$ aws dynamodb create-table --cli-input-json file://dynamo_table_def.json --endpoint-url http://localhost:8000

Remember the --endpoint-url parameter, otherwise a real table will be created in whatever region your AWS CLI defaults to.

After running this command, go back to dynamodb-admin in your browser. You’ll see your table has been created. Now it’s time to use it!
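
If you prefer to check from the terminal, you can also list the tables against the same local endpoint:

$ aws dynamodb list-tables --endpoint-url http://localhost:8000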

Use Cases

I’ve picked 3 examples to show how dynamodb-admin can help you develop and test your applications.

Running locally alongside an application

This example is super simple. Let’s say you’re developing a Python application which reads from a DynamoDB table of movies. You may want to run a local DynamoDB instance for development and tests, to avoid standing up unnecessary infrastructure. Dynamodb-local is a godsend for this. However, it can be fiddly to put data into the table from the command line.

You could write code to put the correct items in the table. Indeed, for tests this might be ideal, as you absolutely should test the logic you’re using to read from and write to the table.

However, to quickly exercise a code path or build out a feature when the remote infrastructure or data is not present, it’s typically much easier to put the data into the DynamoDB table manually.
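
If you do decide to seed the table from code, a minimal sketch with boto3 might look like this. It assumes a Movies table whose primary key is year (partition key) plus title (sort key), matching the read example further down; the item values are purely illustrative.

import boto3


def put_movie(title, year, plot, rating, dynamodb=None):
    if not dynamodb:
        # Point boto3 at the local DynamoDB instance rather than AWS
        dynamodb = boto3.resource('dynamodb', endpoint_url="http://localhost:8000")

    table = dynamodb.Table('Movies')

    # The item must contain the full primary key (year + title);
    # any other attributes are free-form
    return table.put_item(
        Item={
            'year': year,
            'title': title,
            'info': {'plot': plot, 'rating': rating},
        }
    )


if __name__ == '__main__':
    put_movie("The Big New Movie", 2015, "Nothing happens at all.", 0)
    print("Put movie succeeded.")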

You can set up your table and add some movies using the GUI, as described above. Then read from the table like so:

from pprint import pprint
import boto3
from botocore.exceptions import ClientError


def get_movie(title, year, dynamodb=None):
    if not dynamodb:
        # Point boto3 at the local DynamoDB instance rather than AWS
        dynamodb = boto3.resource('dynamodb', endpoint_url="http://localhost:8000")

    table = dynamodb.Table('Movies')

    try:
        # The full primary key (partition + sort) is needed to fetch a single item
        response = table.get_item(Key={'year': year, 'title': title})
    except ClientError as e:
        print(e.response['Error']['Message'])
    else:
        return response['Item']


if __name__ == '__main__':
    movie = get_movie("The Big New Movie", 2015)
    if movie:
        print("Get movie succeeded:")
        pprint(movie, sort_dicts=False)
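
Run the script against your local instance; if you added an item for ‘The Big New Movie’ (2015) via the GUI, it will be printed back to you.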

Creating the table and putting items in it, using dynamodb-admin, lets you focus on the business logic. When you’re happy with how your logic looks, you can focus on writing to the table.

Using dynamodb-admin as a library

Since dynamodb-admin is a Node library, we can use it inside our Node projects. This is again great for local development, as each time you run the service you have what is effectively the AWS console ready to view and manipulate the data.

const AWS = require('aws-sdk');
const {createServer} = require('dynamodb-admin');

// When dynamodb-admin is used as a library you construct the clients yourself,
// so point them at the local instance explicitly (adjust the endpoint, region
// and credentials to match your own setup).
const dynamodb = new AWS.DynamoDB({
  endpoint: 'http://localhost:8000',
  region: 'us-fake-1',
  accessKeyId: 'fake',
  secretAccessKey: 'fake',
});
const dynClient = new AWS.DynamoDB.DocumentClient({service: dynamodb});

const app = createServer(dynamodb, dynClient);

// Serve the GUI on a different port to the database itself
const port = 8001;
const server = app.listen(port);
server.on('listening', () => {
  const address = server.address();
  console.log(`  listening on http://0.0.0.0:${address.port}`);
});
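
With this snippet running alongside your application, the admin GUI is served at http://localhost:8001 while both your app and dynamodb-admin talk to the same local DynamoDB instance on port 8000.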

Using dynamodb-admin with AWS Amplify

AWS Amplify is a development framework that deals with a lot of the common problems of building a mobile or web application by setting up the required infrastructure for you. Each part of the framework deserves a blog post of its own, but here we are going to look at mocking the DynamoDB tables Amplify creates based on your GraphQL API definition.

If you’d like to find out how to use Amplify to create a GraphQL API, the documentation is here.

Amplify lets you mock services used by your app with the Amplify CLI tool by running:

$ amplify mock <service>

If you have used Amplify to create a GraphQL API to serve as the backend for your project, you can run:

$ amplify mock api

This will do two things:

  1. Start a mock AppSync API endpoint on port 20002
  2. Create a local DynamoDB instance on port 62224

We can now use dynamodb-admin to take a peek inside the tables Amplify has created based on our API’s requirements, by running:

$ AWS_REGION=us-fake-1 AWS_ACCESS_KEY_ID=fake AWS_SECRET_ACCESS_KEY=fake DYNAMO_ENDPOINT=http://localhost:62224 dynamodb-admin

I’ve personally found this really useful for testing locally before committing to pushing my API changes.

Originally published by https://medium.com/swlh 

#dynamodb #aws #code #dynamodb-admin #dynamodb-local
