Dylan North

How to Set up file uploads to S3 via Django

Originally published by David Bruton at blog.theodo

I’m going to show you how to correctly configure your Django project to allow you to upload files safely and securely to an AWS S3 bucket. This is a great way of handling files in Django and as you will see in this guide, is quick and easy to set up.

Setting up a simple Django App

This can be skipped if you already have a project. Django has a quick guide to setting up a Django project here, which I've followed to quickly set up a demo app.
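
If you're starting from scratch, the commands from that guide boil down to something like the following sketch (the project and app names, mysite and photos, are just illustrative):

pip install django
django-admin startproject mysite
cd mysite
python manage.py startapp photos   # remember to add 'photos' to INSTALLED_APPS
python manage.py migrate
python manage.py createsuperuser   # you'll need this for the admin later
python manage.py runserver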

Adding a photo model

Once you've done the basic setup in the tutorial, you can create a "Photo" model to start testing your uploads. I've included the model class I've used below, so you can add the following directly into your models.py file:

from django.db import models
import uuid

class Photo(models.Model):
    # Use a random UUID as the primary key instead of an auto-incrementing id
    uuid = models.UUIDField(
        primary_key=True, default=uuid.uuid4, editable=False,
    )
    created_at = models.DateTimeField(auto_now_add=True)  # set once on creation
    title = models.CharField(max_length=100)
    photo = models.FileField()  # the uploaded file itself

This is just a simple Django model for storing photos. It has a Django FileField for the photo file itself, as well as a title, a created-at datetime field, and a UUID primary key.
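
After adding the model, create and apply a migration for it so the table exists before you test any uploads:

python manage.py makemigrations
python manage.py migrate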

The Django admin allows file uploads using the FileField on a model, which means we can use this built-in functionality to test our uploads without having to spend time setting up forms.

Testing default behaviour of file uploads

First, go to the admin site for your project and find the Photo model you have defined. Make sure you have registered the model in your admin.py so that you have access to it here.
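
If you haven't registered it yet, a minimal admin.py looks like this (assuming the app from earlier is called photos):

from django.contrib import admin

from .models import Photo

admin.site.register(Photo)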

[Screenshot: the Photo model in the Django admin]

Click on "Add photo" and you will be able to give your photo object a title and choose the file you want to upload, as shown below. Once you've done this, click Save.

[Screenshot: the "Add photo" button in the admin]

[Screenshot: entering a title and choosing a file]

[Screenshot: the successful upload message]

If we do this, we see that the file is uploaded to the root directory of the app.

This is rarely where we want to store uploaded files, but it can easily be changed using the MEDIA_ROOT setting in your app's settings file, which defines the path at which uploaded media files will be stored.

To demonstrate this, we can set this value to a path within the app using BASE_DIR, which is defined automatically when the project is generated with Django. If you add the following line to your settings.py, your uploads will be stored inside the project folder at the file path specified in quotes.

import os  # needed at the top of settings.py if it isn't already imported

MEDIA_ROOT = os.path.join(BASE_DIR, 'path/to/store/my/files/')

If you now try to upload a file using the admin, you will see a new file path inside the root directory of the app (here, example-django-app) leading to the uploaded photo.

[Screenshot: the new file path created inside the project directory]

However, this is still often not a sensible place to store uploaded files, for several reasons. If your server runs on AWS or another cloud service, files stored directly on a server instance's file system will be lost when the service scales and creates new instances. This is a major problem.

Furthermore, the files are not easily accessible when you need to retrieve them, since they live directly on the server, especially if they need to be accessible externally (e.g. to allow controlled access for other users or websites).

We need a better solution: somewhere that lets permitted users easily retrieve uploaded files, scales to handle large numbers of uploads, and keeps files stored safely even as your server scales up or down.

AWS S3 Buckets

Amazon's Simple Storage Service (S3) offers a secure, reliable and scalable storage system which is perfect for this use case. It is simple to set up and can be used regardless of where your Django site is hosted.

Setting up an S3 bucket and allowing the Django app access

You can create an S3 bucket easily by logging into your AWS account, going to the S3 section of the AWS console, clicking "Create bucket" and following the steps. You will also need to create a user with programmatic access in the "IAM" console and give it the AmazonS3FullAccess permission so your app can access the bucket correctly. AWS provide this useful walkthrough to help set this up.

Once you have done this, you need to set the access key and secret access key generated when you created the user in your settings.py file, as shown below.

AWS_ACCESS_KEY_ID = 'YOUR_ACCESS_KEY_HERE'
AWS_SECRET_ACCESS_KEY = 'YOUR_SECRET_ACCESS_KEY_HERE'

Also make sure you remove the MEDIA_ROOT property in the settings file.

For your production environment, it’s best to get these values from environment variables rather than hardcoding them in your settings file.
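
For example, a minimal sketch using environment variables (the variable names here are a common convention, not something Django or boto3 requires):

import os

AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']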

Installing some helper libraries

We're nearly ready to start using the S3 bucket for uploads; we just need to install two Python libraries: boto3 and django-storages. Boto3 is the SDK that AWS provides for managing AWS services from within Python code. We need it so your file uploads can be directed to the S3 bucket (you can look at the boto3 documentation here).

Django-storages is a useful library which provides storage backends and helper methods (you can read its documentation here). You will also need to add 'storages' to INSTALLED_APPS in your settings file.
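
For example:

INSTALLED_APPS = [
    # ... your existing apps ...
    'storages',
]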

To install the libraries, you can simply run:

pip install boto3
pip install django-storages

inside the virtual environment your app runs in, and then add these libraries to your dependencies list (e.g. your requirements.txt).

Once you’ve done this, you need to define a few more variables in your settings file:

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_STORAGE_BUCKET_NAME = 'insert-your-bucket-name-here'
AWS_S3_REGION_NAME = 'eu-west-2'

When you created your S3 bucket, you defined a name for it and chose an AWS region for it to be hosted in. Insert those values here.

We should now be ready to upload files directly to S3 via the Django admin upload we tried earlier. If we try the uploads again, we will see that the files no longer appear in the root directory of our app but are stored in the S3 bucket we set up.

[Screenshot: the uploaded files listed in the S3 bucket]

We are now able to retrieve these files in S3 in the AWS console or directly in the Django admin.
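
As a quick sanity check, you can also grab an uploaded file's URL from the Django shell (a sketch; photos is the illustrative app name from earlier, and it assumes at least one Photo exists):

# run inside `python manage.py shell`
from photos.models import Photo

photo = Photo.objects.first()
print(photo.photo.url)  # with S3Boto3Storage this URL points at the object in your bucket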

Conclusion

We have been able to set up file uploads from a very basic Django app quickly and easily. You should now be able to set this up on future Django projects to make handling file uploads hassle free.

#django #python #web-development

Hermann Frami

A Simple Wrapper Around Amplify AppSync Simulator

This serverless plugin is a wrapper for amplify-appsync-simulator, made for testing AppSync APIs built with serverless-appsync-plugin.

Install

npm install serverless-appsync-simulator
# or
yarn add serverless-appsync-simulator

Usage

This plugin relies on your serverless.yml file and on the serverless-offline plugin.

plugins:
  - serverless-dynamodb-local # only if you need dynamodb resolvers and you don't have an external dynamodb
  - serverless-appsync-simulator
  - serverless-offline

Note: Order is important. serverless-appsync-simulator must go before serverless-offline.

To start the simulator, run the following command:

sls offline start

You should see in the logs something like:

...
Serverless: AppSync endpoint: http://localhost:20002/graphql
Serverless: GraphiQl: http://localhost:20002
...

Configuration

Put options under custom.appsync-simulator in your serverless.yml file

| option | default | description |
| ------ | ------- | ----------- |
| apiKey | 0123456789 | When using API_KEY as authentication type, the key to authenticate to the endpoint. |
| port | 20002 | AppSync operations port; if using multiple APIs, the value of this option will be used as a starting point, and each other API will have a port of lastPort + 10 (e.g. 20002, 20012, 20022, etc.) |
| wsPort | 20003 | AppSync subscriptions port; if using multiple APIs, the value of this option will be used as a starting point, and each other API will have a port of lastPort + 10 (e.g. 20003, 20013, 20023, etc.) |
| location | . (base directory) | Location of the lambda functions handlers. |
| refMap | {} | A mapping of resource resolutions for the Ref function |
| getAttMap | {} | A mapping of resource resolutions for the GetAtt function |
| importValueMap | {} | A mapping of resource resolutions for the ImportValue function |
| functions | {} | A mapping of external functions for providing invoke urls for external functions |
| dynamoDb.endpoint | http://localhost:8000 | DynamoDB endpoint. Specify it if you're not using serverless-dynamodb-local. Otherwise, the port is taken from the dynamodb-local conf |
| dynamoDb.region | localhost | DynamoDB region. Specify it if you're connecting to a remote DynamoDB instance. |
| dynamoDb.accessKeyId | DEFAULT_ACCESS_KEY | AWS Access Key ID to access DynamoDB |
| dynamoDb.secretAccessKey | DEFAULT_SECRET | AWS Secret Key to access DynamoDB |
| dynamoDb.sessionToken | DEFAULT_ACCESS_TOKEEN | AWS Session Token to access DynamoDB, only if you have temporary security credentials configured on AWS |
| dynamoDb.* | | You can add every configuration accepted by the DynamoDB SDK |
| rds.dbName | | Name of the database |
| rds.dbHost | | Database host |
| rds.dbDialect | | Database dialect. Possible values: mysql or postgres |
| rds.dbUsername | | Database username |
| rds.dbPassword | | Database password |
| rds.dbPort | | Database port |
| watch | *.graphql, *.vtl | Array of glob patterns to watch for hot-reloading. |

Example:

custom:
  appsync-simulator:
    location: '.webpack/service' # use webpack build directory
    dynamoDb:
      endpoint: 'http://my-custom-dynamo:8000'

Hot-reloading

By default, the simulator will hot-reload when changes to *.graphql or *.vtl files are detected. Changes to *.yml files are not supported (yet? - this is a Serverless Framework limitation). You will need to restart the simulator each time you change yml files.

Hot-reloading relies on watchman. Make sure it is installed on your system.

You can change the files being watched with the watch option, which is then passed to watchman as the match expression.

e.g.

custom:
  appsync-simulator:
    watch:
      - ["match", "handlers/**/*.vtl", "wholename"] # => array is interpreted as the literal match expression
      - "*.graphql"                                 # => string like this is equivalent to `["match", "*.graphql"]`

Or you can opt out by setting the option to an empty array or to false.
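
For example, a minimal sketch that disables watching entirely:

custom:
  appsync-simulator:
    watch: false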

Note: Functions should not require hot-reloading, unless you are using a transpiler or a bundler (such as webpack, babel or typescript), in which case you should delegate hot-reloading to that instead.

Resource CloudFormation functions resolution

This plugin supports resolving some resources referenced via the Ref, Fn::GetAtt and Fn::ImportValue functions in your yaml file. It also supports some other Cfn functions such as Fn::Join, Fn::Sub, etc.

Note: Under the hood, this feature relies on the cfn-resolver-lib package. For more info on supported cfn functions, refer to the documentation.

Basic usage

You can reference resources in your functions' environment variables (that will be accessible from your lambda functions) or datasource definitions. The plugin will automatically resolve them for you.

provider:
  environment:
    BUCKET_NAME:
      Ref: MyBucket # resolves to `my-bucket-name`

resources:
  Resources:
    MyDbTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: myTable
      ...
    MyBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: my-bucket-name
    ...

# in your appsync config
dataSources:
  - type: AMAZON_DYNAMODB
    name: dynamosource
    config:
      tableName:
        Ref: MyDbTable # resolves to `myTable`

Override (or mock) values

Sometimes, some references cannot be resolved, as they come from an Output from Cloudformation; or you might want to use mocked values in your local environment.

In those cases, you can define (or override) those values using the refMap, getAttMap and importValueMap options.

  • refMap takes a mapping of resource name to value pairs
  • getAttMap takes a mapping of resource name to attribute/values pairs
  • importValueMap takes a mapping of import name to values pairs

Example:

custom:
  appsync-simulator:
    refMap:
      # Override `MyDbTable` resolution from the previous example.
      MyDbTable: 'mock-myTable'
    getAttMap:
      # define ElasticSearchInstance DomainName
      ElasticSearchInstance:
        DomainEndpoint: 'localhost:9200'
    importValueMap:
      other-service-api-url: 'https://other.api.url.com/graphql'

# in your appsync config
dataSources:
  - type: AMAZON_ELASTICSEARCH
    name: elasticsource
    config:
      # endpoint resolves as 'http://localhost:9200'
      endpoint:
        Fn::Join:
          - ''
          - - https://
            - Fn::GetAtt:
                - ElasticSearchInstance
                - DomainEndpoint

Key-value mock notation

In some special cases you will need to use the key-value mock notation. A good example is when you need to include the serverless stage value (${self:provider.stage}) in the import name.

This notation can be used with all mocks - refMap, getAttMap and importValueMap

provider:
  environment:
    FINISH_ACTIVITY_FUNCTION_ARN:
      Fn::ImportValue: other-service-api-${self:provider.stage}-url

custom:
  serverless-appsync-simulator:
    importValueMap:
      - key: other-service-api-${self:provider.stage}-url
        value: 'https://other.api.url.com/graphql'

Limitations

This plugin only tries to resolve the following parts of the yml tree:

  • provider.environment
  • functions[*].environment
  • custom.appSync

If you have the need of resolving others, feel free to open an issue and explain your use case.

For now, the resources that can be automatically resolved by Ref: are:

  • DynamoDb tables
  • S3 Buckets

Feel free to open a PR or an issue to extend them as well.

External functions

When a function is not defined within the current serverless file, you can still call it by providing an invoke url which should point to a REST method. Make sure you specify "get" or "post" for the method. The default is "get", but you probably want "post".

custom:
  appsync-simulator:
    functions:
      addUser:
        url: http://localhost:3016/2015-03-31/functions/addUser/invocations
        method: post
      addPost:
        url: https://jsonplaceholder.typicode.com/posts
        method: post

Supported Resolver types

This plugin supports resolvers implemented by amplify-appsync-simulator, as well as custom resolvers.

From Aws Amplify:

  • NONE
  • AWS_LAMBDA
  • AMAZON_DYNAMODB
  • PIPELINE

Implemented by this plugin

  • AMAZON_ELASTIC_SEARCH
  • HTTP
  • RELATIONAL_DATABASE

Relational Database

Sample VTL for a create mutation

#set( $cols = [] )
#set( $vals = [] )
#foreach( $entry in $ctx.args.input.keySet() )
  #set( $regex = "([a-z])([A-Z]+)")
  #set( $replacement = "$1_$2")
  #set( $toSnake = $entry.replaceAll($regex, $replacement).toLowerCase() )
  #set( $discard = $cols.add("$toSnake") )
  #if( $util.isBoolean($ctx.args.input[$entry]) )
      #if( $ctx.args.input[$entry] )
        #set( $discard = $vals.add("1") )
      #else
        #set( $discard = $vals.add("0") )
      #end
  #else
      #set( $discard = $vals.add("'$ctx.args.input[$entry]'") )
  #end
#end
#set( $valStr = $vals.toString().replace("[","(").replace("]",")") )
#set( $colStr = $cols.toString().replace("[","(").replace("]",")") )
#if ( $valStr.substring(0, 1) != '(' )
  #set( $valStr = "($valStr)" )
#end
#if ( $colStr.substring(0, 1) != '(' )
  #set( $colStr = "($colStr)" )
#end
{
  "version": "2018-05-29",
  "statements":   ["INSERT INTO <name-of-table> $colStr VALUES $valStr", "SELECT * FROM    <name-of-table> ORDER BY id DESC LIMIT 1"]
}

Sample VTL for an update mutation

#set( $update = "" )
#set( $equals = "=" )
#foreach( $entry in $ctx.args.input.keySet() )
  #set( $cur = $ctx.args.input[$entry] )
  #set( $regex = "([a-z])([A-Z]+)")
  #set( $replacement = "$1_$2")
  #set( $toSnake = $entry.replaceAll($regex, $replacement).toLowerCase() )
  #if( $util.isBoolean($cur) )
      #if( $cur )
        #set ( $cur = "1" )
      #else
        #set ( $cur = "0" )
      #end
  #end
  #if ( $util.isNullOrEmpty($update) )
      #set($update = "$toSnake$equals'$cur'" )
  #else
      #set($update = "$update,$toSnake$equals'$cur'" )
  #end
#end
{
  "version": "2018-05-29",
  "statements":   ["UPDATE <name-of-table> SET $update WHERE id=$ctx.args.input.id", "SELECT * FROM <name-of-table> WHERE id=$ctx.args.input.id"]
}

Sample resolver for delete mutation

{
  "version": "2018-05-29",
  "statements":   ["UPDATE <name-of-table> set deleted_at=NOW() WHERE id=$ctx.args.id", "SELECT * FROM <name-of-table> WHERE id=$ctx.args.id"]
}

Sample mutation response VTL with support for handling AWSDateTime

#set ( $index = -1)
#set ( $result = $util.parseJson($ctx.result) )
#set ( $meta = $result.sqlStatementResults[1].columnMetadata)
#foreach ($column in $meta)
    #set ($index = $index + 1)
    #if ( $column["typeName"] == "timestamptz" )
        #set ($time = $result["sqlStatementResults"][1]["records"][0][$index]["stringValue"] )
        #set ( $nowEpochMillis = $util.time.parseFormattedToEpochMilliSeconds("$time.substring(0,19)+0000", "yyyy-MM-dd HH:mm:ssZ") )
        #set ( $isoDateTime = $util.time.epochMilliSecondsToISO8601($nowEpochMillis) )
        $util.qr( $result["sqlStatementResults"][1]["records"][0][$index].put("stringValue", "$isoDateTime") )
    #end
#end
#set ( $res = $util.parseJson($util.rds.toJsonString($util.toJson($result)))[1][0] )
#set ( $response = {} )
#foreach($mapKey in $res.keySet())
    #set ( $s = $mapKey.split("_") )
    #set ( $camelCase="" )
    #set ( $isFirst=true )
    #foreach($entry in $s)
        #if ( $isFirst )
          #set ( $first = $entry.substring(0,1) )
        #else
          #set ( $first = $entry.substring(0,1).toUpperCase() )
        #end
        #set ( $isFirst=false )
        #set ( $stringLength = $entry.length() )
        #set ( $remaining = $entry.substring(1, $stringLength) )
        #set ( $camelCase = "$camelCase$first$remaining" )
    #end
    $util.qr( $response.put("$camelCase", $res[$mapKey]) )
#end
$utils.toJson($response)

Using Variable Map

Variable map support is limited and does not differentiate between number and string data types; please inject them directly if needed.

null, true, and false values will be escaped properly.

{
  "version": "2018-05-29",
  "statements":   [
    "UPDATE <name-of-table> set deleted_at=NOW() WHERE id=:ID",
    "SELECT * FROM <name-of-table> WHERE id=:ID and unix_timestamp > $ctx.args.newerThan"
  ],
  "variableMap": {
    ":ID": $ctx.args.id,
##    ":TIMESTAMP": $ctx.args.newerThan -- This will be handled as a string!!!
  }
}


Author: Serverless-appsync
Source Code: https://github.com/serverless-appsync/serverless-appsync-simulator 
License: MIT License

#serverless #sync #graphql 

I am Developer

Multiple File Upload in Laravel 7, 6

In this post, I will show you easy steps for multiple file upload in Laravel 7 and 6.

I'll also cover how to validate file type and size before uploading to the database in Laravel.

Laravel 7/6 Multiple File Upload

You can easily upload multiple files with validation in a Laravel application using the following steps:

  1. Download Laravel Fresh New Setup
  2. Setup Database Credentials
  3. Generate Migration & Model For File
  4. Make Route For File uploading
  5. Create File Controller & Methods
  6. Create Multiple File Blade View
  7. Run Development Server

https://www.tutsmake.com/laravel-6-multiple-file-upload-with-validation-example/

#laravel multiple file upload validation #multiple file upload in laravel 7 #multiple file upload in laravel 6 #upload multiple files laravel 7 #upload multiple files in laravel 6 #upload multiple files php laravel

Ahebwe Oscar

How to integrate CKEditor in Django

Welcome to my blog. In this article we learn how to integrate CKEditor in Django and enable the image upload button so you can add images to a blog post from your local machine. When I added CKEditor to a project for the first time it was very difficult for me, but now I can implement it easily, so you can learn from this and add CKEditor to your own project without trouble.


#django #add image upload in ckeditor #add image upload option ckeditor #ckeditor image upload #ckeditor image upload from local #how to add ckeditor in django #how to add image upload plugin in ckeditor #how to install ckeditor in django #how to integrate ckeditor in django #image upload in ckeditor #image upload option in ckeditor

I am Developer

Laravel 7 File Upload Via API Example From Scratch

This is a Laravel 7 file/image upload via API using Postman example tutorial. Here, you will learn how to upload files/images via API using Postman in a Laravel app.

You can also upload images via API using AJAX in Laravel apps.

If you work with Laravel APIs and want to upload files or images using Postman or AJAX, and want to validate those files or images before they reach the server, this tutorial will guide you step by step through uploading a file via API using Postman and AJAX in Laravel, with validation.

Laravel Image Upload Via API Using Postman Example


Follow the steps below to upload files via APIs using Postman, with validation, in your Laravel app:

  • Step 1: Install Laravel New App
  • Step 2: Add Database Credentials
  • Step 3: Generate Migration & Model
  • Step 4: Create Routes For File
  • Step 5: Generate Controller by Artisan
  • Step 6: Run Development Server
  • Step 7: Laravel Upload File Via Api Using PostMan

Checkout Full Article here https://www.tutsmake.com/laravel-file-upload-via-api-example-from-scratch/

#uploading files via laravel api #laravel file upload api using postman #laravel image upload via api #upload image using laravel api #image upload api in laravel validation #laravel send file to api

Ahebwe Oscar

Django admin full Customization step by step

Welcome to my blog. Hey everyone, in this article you will learn how to customize the Django admin app and its views. You will see how to register and unregister models from the admin view, how to add filtering, how to add a custom input field and a button that triggers an action on all objects, and even how to change the look of your app and pages using the Django Suit package. Let's get started.

Database

Custom Titles of Django Admin

Exclude in Django Admin

Fields in Django Admin

#django #create super user django #customize django admin dashboard #django admin #django admin custom field display #django admin customization #django admin full customization #django admin interface #django admin register all models #django customization