This tutorial looks at how to work with SQLAlchemy asynchronously with SQLModel and FastAPI. We'll also configure Alembic for handling database migrations.
Start by cloning down the base project from the fastapi-sqlmodel-alembic repo:
$ git clone -b base https://github.com/testdrivenio/fastapi-sqlmodel-alembic
$ cd fastapi-sqlmodel-alembic
From the project root, create the images and spin up the Docker containers:
$ docker-compose up -d --build
Once the build is done, navigate to http://localhost:8004/ping. You should see:
{
"ping": "pong!"
}
Take a quick look at the project structure before moving on.
Next, let's add SQLModel, a library for interacting with SQL databases from Python code, with Python objects. Based on Python type annotations, it's essentially a wrapper on top of pydantic and SQLAlchemy, making it easy to work with both.
We'll also need Psycopg.
Add the two dependencies to project/requirements.txt:
fastapi==0.68.1
psycopg2-binary==2.9.1
sqlmodel==0.0.4
uvicorn==0.15.0
Create two new files in "project/app", db.py and models.py.
project/app/models.py:
from sqlmodel import SQLModel, Field
class SongBase(SQLModel):
name: str
artist: str
class Song(SongBase, table=True):
id: int = Field(default=None, primary_key=True)
class SongCreate(SongBase):
pass
Here, we defined three models:

- SongBase is the base model that the others inherit from. It has two properties, name and artist, both of which are strings. This is a data-only model since it lacks table=True, which means that it's only used as a pydantic model.
- Song, meanwhile, adds an id property to the base model. It's a table model, so it's a pydantic and SQLAlchemy model. It represents a database table.
- SongCreate is a data-only pydantic model that will be used to create new song instances.
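To see how these pieces fit together, here's a quick sketch (a hypothetical REPL session, not part of the project files). SongCreate validates input like any pydantic model, while Song carries the table mapping:

from app.models import Song, SongCreate

payload = SongCreate(name="Midnight Fit", artist="Mogwai")  # validated by pydantic
song = Song(**payload.dict())  # table model instance, not yet persisted
print(song.id)  # None -- the primary key is assigned by the database

project/app/db.py: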
import os
from sqlmodel import create_engine, SQLModel, Session
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_engine(DATABASE_URL, echo=True)
def init_db():
SQLModel.metadata.create_all(engine)
def get_session():
with Session(engine) as session:
yield session
Here, we initialized a new SQLAlchemy engine using create_engine from SQLModel. The main differences between SQLModel's create_engine and SQLAlchemy's version are that the SQLModel one adds type annotations (for editor support) and enables the SQLAlchemy "2.0" style of engines and connections. Also, we passed in echo=True so we can see the generated SQL queries in the terminal. This is always nice to enable in development mode for debugging purposes.
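It's also worth noting that get_session is a generator. FastAPI runs generator dependencies like context managers: the code up to yield runs before the request is handled, and the rest runs afterwards. Roughly, per request (an illustrative sketch, not FastAPI's actual internals):

# Approximation of how FastAPI consumes the get_session dependency:
gen = get_session()
session = next(gen)   # opens Session(engine) and hands it to the route
try:
    ...               # the route handler runs with `session`
finally:
    gen.close()       # unwinds the with-block, closing the session

Next, inside project/app/main.py, let's create the tables on startup, using the startup event: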
from fastapi import FastAPI
from app.db import init_db
from app.models import Song
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
It's worth noting that from app.models import Song is required. Without it, the song table will not be created: importing the module is what registers the Song table model on SQLModel.metadata, which create_all reads.
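To see the registration happen, you can run a quick check inside the web container (a hypothetical snippet, e.g. via docker-compose exec web python):

from sqlmodel import SQLModel
import app.models  # noqa: F401  -- importing the module registers the table models
print(SQLModel.metadata.tables.keys())  # 'song' should be listed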
To test, bring down the old containers and volumes, rebuild the images, and spin up the new containers:
$ docker-compose down -v
$ docker-compose up -d --build
Open the container logs via docker-compose logs web. You should see:
web_1 | CREATE TABLE song (
web_1 | name VARCHAR NOT NULL,
web_1 | artist VARCHAR NOT NULL,
web_1 | id SERIAL,
web_1 | PRIMARY KEY (id)
web_1 | )
Open psql:
$ docker-compose exec db psql --username=postgres --dbname=foo
psql (13.4 (Debian 13.4-1.pgdg100+1))
Type "help" for help.
foo=# \dt
List of relations
Schema | Name | Type | Owner
--------+------+-------+----------
public | song | table | postgres
(1 row)
foo=# \q
With the table up, let's add a few new routes to project/app/main.py:
from fastapi import Depends, FastAPI
from sqlalchemy import select
from sqlmodel import Session
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
result = session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
def add_song(song: SongCreate, session: Session = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
session.commit()
session.refresh(song)
return song
Add a song:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
In your browser, navigate to http://localhost:8004/songs. You should see:
[
  {
    "id": 1,
    "name": "Midnight Fit",
    "artist": "Mogwai"
  }
]
Moving right along, let's add async support to SQLModel.
First, bring down the containers and volumes:
$ docker-compose down -v
Update the database URI in docker-compose.yml, adding +asyncpg:
environment:
- DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/foo
Next, replace Psycopg with asyncpg:
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Update project/app/db.py to use the async flavors of SQLAlchemy's engine and session:
import os
from sqlmodel import SQLModel
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_async_engine(DATABASE_URL, echo=True, future=True)
async def init_db():
async with engine.begin() as conn:
# await conn.run_sync(SQLModel.metadata.drop_all)
await conn.run_sync(SQLModel.metadata.create_all)
async def get_session() -> AsyncSession:
async_session = sessionmaker(
engine, class_=AsyncSession, expire_on_commit=False
)
async with async_session() as session:
yield session
Notes:

- We disabled expire-on-commit behavior by passing in expire_on_commit=False. With an AsyncSession, accessing attributes on an expired instance after commit would otherwise require an implicit database refresh, which the async session can't perform.
- metadata.create_all doesn't execute asynchronously, so we used run_sync to execute it synchronously within the async function.

Turn on_startup into an async function in project/app/main.py:
@app.on_event("startup")
async def on_startup():
await init_db()
That's it. Rebuild the images and spin up the containers:
$ docker-compose up -d --build
Ensure the tables were created.
Finally, update the route handlers in project/app/main.py to use async execution:
from fastapi import Depends, FastAPI
from sqlalchemy.future import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
async def on_startup():
await init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Add a new song and make sure http://localhost:8004/songs works as expected.
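For example, the same request as before should still work unchanged:

$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs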
Finally, let's add Alembic into the mix to properly handle database schema changes.
Add it to the requirements file:
alembic==1.7.1
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Remove the startup event from project/app/main.py since we no longer want the tables created at startup:
@app.on_event("startup")
async def on_startup():
await init_db()
Again, bring down the existing containers and volumes:
$ docker-compose down -v
Spin the containers back up:
$ docker-compose up -d --build
Take a quick look at Using Asyncio with Alembic while the new images are building.
Once the containers are back up, initialize Alembic with the async template:
$ docker-compose exec web alembic init -t async migrations
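This creates an alembic.ini file at the project root along with the "migrations" folder, laid out roughly like this (other project files omitted):

project/
├── alembic.ini
└── migrations/
    ├── README
    ├── env.py
    ├── script.py.mako
    └── versions/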
Inside the generated "project/migrations" folder, import SQLModel into script.py.mako, a Mako template file:
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel # NEW
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}
Now, when a new migration file is generated, it will include import sqlmodel. This is needed because autogenerated migrations can reference SQLModel column types, like sqlmodel.sql.sqltypes.AutoString seen below.
Next, we need to update the top of project/migrations/env.py like so:
import asyncio
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlmodel import SQLModel # NEW
from alembic import context
from app.models import Song # NEW
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = SQLModel.metadata # UPDATED
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
...
Here, we imported SQLModel and our song model. We then set target_metadata to our model's MetaData, SQLModel.metadata. For more on the target_metadata argument, check out Auto Generating Migrations from the official Alembic docs.
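For context, the rest of the generated env.py (which we leave as-is) runs the migrations through an AsyncEngine. It looks roughly like this, per the async template:

def do_run_migrations(connection):
    context.configure(connection=connection, target_metadata=target_metadata)
    with context.begin_transaction():
        context.run_migrations()

async def run_migrations_online():
    # Build an async engine from the sqlalchemy.url in alembic.ini
    connectable = AsyncEngine(
        engine_from_config(
            config.get_section(config.config_ini_section),
            prefix="sqlalchemy.",
            poolclass=pool.NullPool,
            future=True,
        )
    )
    async with connectable.connect() as connection:
        # Alembic's migration machinery is sync, so run it via run_sync
        await connection.run_sync(do_run_migrations)

    await connectable.dispose()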
Update sqlalchemy.url in project/alembic.ini:
sqlalchemy.url = postgresql+asyncpg://postgres:postgres@db:5432/foo
To generate the first migration file, run:
$ docker-compose exec web alembic revision --autogenerate -m "init"
If all went well, you should see a new migration file in "project/migrations/versions" that looks something like this:
"""init
Revision ID: f9c634db477d
Revises:
Create Date: 2021-09-10 00:24:32.718895
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel
# revision identifiers, used by Alembic.
revision = 'f9c634db477d'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('song',
sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('artist', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('id', sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_song_artist'), 'song', ['artist'], unique=False)
op.create_index(op.f('ix_song_id'), 'song', ['id'], unique=False)
op.create_index(op.f('ix_song_name'), 'song', ['name'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_name'), table_name='song')
op.drop_index(op.f('ix_song_id'), table_name='song')
op.drop_index(op.f('ix_song_artist'), table_name='song')
op.drop_table('song')
# ### end Alembic commands ###
Apply the migration:
$ docker-compose exec web alembic upgrade head
Ensure you can add a song.
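You can also confirm which revision is applied (the hash will match your generated migration file):

$ docker-compose exec web alembic current
f9c634db477d (head)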
Let's quickly test a schema change. Update the SongBase model in project/app/models.py:
class SongBase(SQLModel):
name: str
artist: str
year: Optional[int] = None
Don't forget the import:
from typing import Optional
Create a new migration file:
$ docker-compose exec web alembic revision --autogenerate -m "add year"
Update the upgrade and downgrade functions from the autogenerated migration file like so:
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('song', sa.Column('year', sa.Integer(), nullable=True))
op.create_index(op.f('ix_song_year'), 'song', ['year'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_year'), table_name='song')
op.drop_column('song', 'year')
# ### end Alembic commands ###
Apply the migration:
$ docker-compose exec web alembic upgrade head
Update the route handlers:
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, year=song.year, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist, year=song.year)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Test:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai", "year":"2021"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
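Note that pydantic coerces the string "2021" to an int. The response should look something like this (the id depends on how many songs you've already added):

{
  "id": 2,
  "name": "Midnight Fit",
  "artist": "Mogwai",
  "year": 2021
}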
In this tutorial, we covered how to configure SQLAlchemy, SQLModel, and Alembic to work with FastAPI asynchronously.
If you're looking for more challenges, check out all of our FastAPI tutorials and courses.
You can find the source code in the fastapi-sqlmodel-alembic repo. Cheers!
Source: https://testdriven.io
#fastapi #sqlmodel #sqlalchemy
1660599180
Ce didacticiel explique comment utiliser SQLAlchemy de manière asynchrone avec SQLModel et FastAPI. Nous allons également configurer Alembic pour gérer les migrations de bases de données.
Commencez par cloner le projet de base à partir du référentiel fastapi-sqlmodel-alembic :
$ git clone -b base https://github.com/testdrivenio/fastapi-sqlmodel-alembic
$ cd fastapi-sqlmodel-alembic
À partir de la racine du projet, créez les images et lancez les conteneurs Docker :
$ docker-compose up -d --build
Une fois la construction terminée, accédez à http://localhost:8004/ping . Tu devrais voir:
{
"ping": "pong!"
}
Jetez un coup d'œil à la structure du projet avant de continuer.
Ajoutons ensuite SQLModel , une bibliothèque permettant d'interagir avec des bases de données SQL à partir de code Python, avec des objets Python. Basé sur des annotations de type Python, il s'agit essentiellement d'un wrapper au-dessus de pydantic et SQLAlchemy , ce qui facilite le travail avec les deux.
Nous aurons également besoin de Psycopg .
Ajoutez les deux dépendances à project/requirements.txt :
fastapi==0.68.1
psycopg2-binary==2.9.1
sqlmodel==0.0.4
uvicorn==0.15.0
Créez deux nouveaux fichiers dans "project/app", db.py et models.py .
projet/app/models.py :
from sqlmodel import SQLModel, Field
class SongBase(SQLModel):
name: str
artist: str
class Song(SongBase, table=True):
id: int = Field(default=None, primary_key=True)
class SongCreate(SongBase):
pass
Ici, nous avons défini trois modèles :
SongBase
est le modèle de base dont les autres héritent. Il a deux propriétés, name
et artist
, qui sont toutes deux des chaînes. Il s'agit d'un modèle de données uniquement car il manque table=True
, ce qui signifie qu'il n'est utilisé que comme modèle pydantique.Song
, quant à lui, ajoute une id
propriété au modèle de base. C'est un modèle de table, donc c'est un modèle pydantic et SQLAlchemy. Il représente une table de base de données.SongCreate
est un modèle pydantic de données uniquement qui sera utilisé pour créer de nouvelles instances de chanson.projet/app/db.py :
import os
from sqlmodel import create_engine, SQLModel, Session
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_engine(DATABASE_URL, echo=True)
def init_db():
SQLModel.metadata.create_all(engine)
def get_session():
with Session(engine) as session:
yield session
Ici, nous :
create_engine
de SQLModel. Les principales différences entre les versions de SQLModel create_engine
et de SQLAlchemy sont que la version de SQLModel ajoute des annotations de type (pour la prise en charge de l'éditeur) et active le style SQLAlchemy "2.0" des moteurs et des connexions . De plus, nous sommes passés echo=True
afin que nous puissions voir les requêtes SQL générées dans le terminal. C'est toujours agréable à activer en mode développement à des fins de débogage.Ensuite, dans project/app/main.py , créons les tables au démarrage, en utilisant l' événement startup :
from fastapi import FastAPI
from app.db import init_db
from app.models import Song
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
Il convient de noter que from app.models import Song
c'est obligatoire. Sans cela, la table des chansons ne sera pas créée.
Pour tester, arrêtez les anciens conteneurs et volumes, reconstruisez les images et lancez les nouveaux conteneurs :
$ docker-compose down -v
$ docker-compose up -d --build
Ouvrez les journaux du conteneur via docker-compose logs web
. Tu devrais voir:
web_1 | CREATE TABLE song (
web_1 | name VARCHAR NOT NULL,
web_1 | artist VARCHAR NOT NULL,
web_1 | id SERIAL,
web_1 | PRIMARY KEY (id)
web_1 | )
Ouvrez psql :
$ docker-compose exec db psql --username=postgres --dbname=foo
psql (13.4 (Debian 13.4-1.pgdg100+1))
Type "help" for help.
foo=# \dt
List of relations
Schema | Name | Type | Owner
--------+------+-------+----------
public | song | table | postgres
(1 row)
foo=# \q
Avec le tableau en place, ajoutons quelques nouvelles routes à project/app/main.py :
from fastapi import Depends, FastAPI
from sqlalchemy import select
from sqlmodel import Session
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
result = session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
def add_song(song: SongCreate, session: Session = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
session.commit()
session.refresh(song)
return song
Ajouter une chanson :
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
Dans votre navigateur, accédez à http://localhost:8004/songs . Tu devrais voir:
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
Passons à autre chose, ajoutons le support asynchrone à SQLModel.
Tout d'abord, faites descendre les conteneurs et les volumes :
$ docker-compose down -v
Mettez à jour l'URI de la base de données dans docker-compose.yml , en ajoutant+asyncpg
:
environment:
- DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/foo
Ensuite, remplacez Psycopg par asyncpg :
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Mettez à jour project/app/db.py : pour utiliser les saveurs asynchrones du moteur et de la session de SQLAlchemy :
import os
from sqlmodel import SQLModel
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_async_engine(DATABASE_URL, echo=True, future=True)
async def init_db():
async with engine.begin() as conn:
# await conn.run_sync(SQLModel.metadata.drop_all)
await conn.run_sync(SQLModel.metadata.create_all)
async def get_session() -> AsyncSession:
async_session = sessionmaker(
engine, class_=AsyncSession, expire_on_commit=False
)
async with async_session() as session:
yield session
Remarques:
expire_on_commit=False
.metadata.create_all
ne s'exécute pas de manière asynchrone, nous avons donc utilisé run_sync pour l'exécuter de manière synchrone dans la fonction async.Transformez -vous on_startup
en une fonction asynchrone dans project/app/main.py :
@app.on_event("startup")
async def on_startup():
await init_db()
C'est ça. Reconstruisez les images et faites tourner les conteneurs :
$ docker-compose up -d --build
Assurez-vous que les tables ont été créées.
Enfin, mettez à jour les gestionnaires de route dans project/app/main.py pour utiliser l'exécution asynchrone :
from fastapi import Depends, FastAPI
from sqlalchemy.future import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
async def on_startup():
await init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Ajoutez une nouvelle chanson et assurez-vous que http://localhost:8004/songs fonctionne comme prévu.
Enfin, ajoutons Alembic au mélange pour gérer correctement les changements de schéma de base de données.
Ajoutez-le au fichier requirements :
alembic==1.7.1
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Supprimez l'événement de démarrage de project/app/main.py car nous ne voulons plus des tables créées au démarrage :
@app.on_event("startup")
async def on_startup():
await init_db()
Encore une fois, réduisez les conteneurs et les volumes existants :
$ docker-compose down -v
Remontez les conteneurs :
$ docker-compose up -d --build
Jetez un coup d'œil à Utilisation d'Asyncio avec Alembic pendant la construction des nouvelles images.
Une fois les conteneurs sauvegardés, initialisez Alembic avec le template async :
$ docker-compose exec web alembic init -t async migrations
Dans le dossier "project/migrations" généré, importez SQLModel dans script.py.mako , un fichier de modèle Mako :
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel # NEW
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}
Désormais, lorsqu'un nouveau fichier de migration est généré, il inclura import sqlmodel
.
Ensuite, nous devons mettre à jour le haut de project/migrations/env.py comme ceci :
import asyncio
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlmodel import SQLModel # NEW
from alembic import context
from app.models import Song # NEW
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = SQLModel.metadata # UPDATED
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
...
Ici, nous avons importé SQLModel et notre modèle de chanson. Nous avons ensuite défini les métadonnéestarget_metadata
de notre modèle , . Pour en savoir plus sur l' argument, consultez Auto Generating Migrations à partir de la documentation officielle d'Alembic.SQLModel.metadatatarget_metadata
Mise à jour sqlalchemy.url
dans project/alembic.ini :
sqlalchemy.url = postgresql+asyncpg://postgres:postgres@db:5432/foo
Pour générer le premier fichier de migration, exécutez :
$ docker-compose exec web alembic revision --autogenerate -m "init"
Si tout s'est bien passé, vous devriez voir un nouveau fichier de migration dans "project/migrations/versions" qui ressemble à ceci :
"""init
Revision ID: f9c634db477d
Revises:
Create Date: 2021-09-10 00:24:32.718895
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel
# revision identifiers, used by Alembic.
revision = 'f9c634db477d'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('song',
sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('artist', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('id', sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_song_artist'), 'song', ['artist'], unique=False)
op.create_index(op.f('ix_song_id'), 'song', ['id'], unique=False)
op.create_index(op.f('ix_song_name'), 'song', ['name'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_name'), table_name='song')
op.drop_index(op.f('ix_song_id'), table_name='song')
op.drop_index(op.f('ix_song_artist'), table_name='song')
op.drop_table('song')
# ### end Alembic commands ###
Appliquez la migration :
$ docker-compose exec web alembic upgrade head
Assurez-vous que vous pouvez ajouter une chanson.
Testons rapidement un changement de schéma. Mettez à jour le SongBase
modèle dans project/app/models.py :
class SongBase(SQLModel):
name: str
artist: str
year: Optional[int] = None
N'oubliez pas l'importation :
from typing import Optional
Créez un nouveau fichier de migration :
$ docker-compose exec web alembic revision --autogenerate -m "add year"
Mettez à jour les fonctions upgrade
et downgrade
à partir du fichier de migration généré automatiquement comme suit :
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('song', sa.Column('year', sa.Integer(), nullable=True))
op.create_index(op.f('ix_song_year'), 'song', ['year'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_year'), table_name='song')
op.drop_column('song', 'year')
# ### end Alembic commands ###
Appliquez la migration :
$ docker-compose exec web alembic upgrade head
Mettez à jour les gestionnaires de routage :
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, year=song.year, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist, year=song.year)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Test:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai", "year":"2021"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
Dans ce didacticiel, nous avons expliqué comment configurer SQLAlchemy, SQLModel et Alembic pour qu'ils fonctionnent avec FastAPI de manière asynchrone.
Si vous recherchez plus de défis, consultez tous nos tutoriels et cours FastAPI .
Vous pouvez trouver le code source dans le dépôt fastapi-sqlmodel-alembic . Acclamations!
Source : https://testdrive.io
1660628820
This tutorial looks at how to work with SQLAlchemy asynchronously with SQLModel and FastAPI. We'll also configure Alembic for handling database migrations.
Source: https://testdriven.io
1660606440
本教程著眼於如何使用 SQLModel 和 FastAPI 異步使用 SQLAlchemy。我們還將配置 Alembic 以處理數據庫遷移。
首先從fastapi-sqlmodel-alembic 存儲庫中克隆基礎項目:
$ git clone -b base https://github.com/testdrivenio/fastapi-sqlmodel-alembic
$ cd fastapi-sqlmodel-alembic
從項目根目錄,創建鏡像並啟動 Docker 容器:
$ docker-compose up -d --build
構建完成後,導航到http://localhost:8004/ping。你應該看到:
{
"ping": "pong!"
}
在繼續之前快速瀏覽一下項目結構。
接下來,讓我們添加SQLModel,一個用於從 Python 代碼與 SQL 數據庫交互的庫,以及 Python 對象。基於 Python 類型註釋,它本質上是pydantic和SQLAlchemy之上的一個包裝器,可以輕鬆使用兩者。
我們還需要Psycopg。
將兩個依賴項添加到project/requirements.txt:
fastapi==0.68.1
psycopg2-binary==2.9.1
sqlmodel==0.0.4
uvicorn==0.15.0
在“project/app”中創建兩個新文件db.py和models.py。
項目/應用程序/models.py:
from sqlmodel import SQLModel, Field
class SongBase(SQLModel):
name: str
artist: str
class Song(SongBase, table=True):
id: int = Field(default=None, primary_key=True)
class SongCreate(SongBase):
pass
在這裡,我們定義了三個模型:
SongBase
是其他人繼承的基本模型。它有兩個屬性,name
和artist
,它們都是字符串。這是一個純數據模型,因為它缺少table=True
,這意味著它僅用作 pydantic 模型。Song
,同時,將id
屬性添加到基本模型。這是一個表模型,所以它是一個 pydantic 和 SQLAlchemy 模型。它代表一個數據庫表。SongCreate
是一個純數據的 pydantic 模型,將用於創建新的歌曲實例。項目/應用程序/db.py:
import os
from sqlmodel import create_engine, SQLModel, Session
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_engine(DATABASE_URL, echo=True)
def init_db():
SQLModel.metadata.create_all(engine)
def get_session():
with Session(engine) as session:
yield session
在這裡,我們:
create_engine
SQLModel 和 SQLAlchemy 版本之間的主要區別在於create_engine
SQLModel 版本添加了類型註釋(用於編輯器支持)並啟用了SQLAlchemy “2.0” 風格的引擎和連接。另外,我們傳入,echo=True
所以我們可以在終端中看到生成的 SQL 查詢。出於調試目的,在開發模式下啟用這總是很好的。接下來,在project/app/main.py中,讓我們在啟動時使用啟動事件創建表:
from fastapi import FastAPI
from app.db import init_db
from app.models import Song
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
值得注意的是,這from app.models import Song
是必需的。沒有它,就不會創建歌曲表。
要進行測試,請關閉舊容器和卷,重建映像並啟動新容器:
$ docker-compose down -v
$ docker-compose up -d --build
通過打開容器日誌docker-compose logs web
。你應該看到:
web_1 | CREATE TABLE song (
web_1 | name VARCHAR NOT NULL,
web_1 | artist VARCHAR NOT NULL,
web_1 | id SERIAL,
web_1 | PRIMARY KEY (id)
web_1 | )
打開 psql:
$ docker-compose exec db psql --username=postgres --dbname=foo
psql (13.4 (Debian 13.4-1.pgdg100+1))
Type "help" for help.
foo=# \dt
List of relations
Schema | Name | Type | Owner
--------+------+-------+----------
public | song | table | postgres
(1 row)
foo=# \q
隨著表格的出現,讓我們向project/app/main.py添加一些新路由:
from fastapi import Depends, FastAPI
from sqlalchemy import select
from sqlmodel import Session
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
result = session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
def add_song(song: SongCreate, session: Session = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
session.commit()
session.refresh(song)
return song
添加歌曲:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
在瀏覽器中,導航到http://localhost:8004/songs。你應該看到:
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
繼續,讓我們為 SQLModel 添加異步支持。
首先,降低容器和卷:
$ docker-compose down -v
更新docker-compose.yml中的數據庫 URI ,添加+asyncpg
:
environment:
- DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/foo
接下來,將 Psycopg 替換為asyncpg:
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
更新project/app/db.py:使用 SQLAlchemy 引擎和會話的異步風格:
import os
from sqlmodel import SQLModel
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_async_engine(DATABASE_URL, echo=True, future=True)
async def init_db():
async with engine.begin() as conn:
# await conn.run_sync(SQLModel.metadata.drop_all)
await conn.run_sync(SQLModel.metadata.create_all)
async def get_session() -> AsyncSession:
async_session = sessionmaker(
engine, class_=AsyncSession, expire_on_commit=False
)
async with async_session() as session:
yield session
筆記:
expire_on_commit=False
。metadata.create_all
不會異步執行,所以我們使用run_sync在 async 函數中同步執行。在project/app/main.py中on_startup
變成一個異步函數:
@app.on_event("startup")
async def on_startup():
await init_db()
而已。重建圖像並啟動容器:
$ docker-compose up -d --build
確保已創建表。
最後,更新project/app/main.py中的路由處理程序以使用異步執行:
from fastapi import Depends, FastAPI
from sqlalchemy.future import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
async def on_startup():
await init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
await session.commit()
await session.refresh(song)
return song
添加一首新歌並確保http://localhost:8004/songs按預期工作。
最後,讓我們將Alembic添加到組合中以正確處理數據庫架構更改。
將其添加到需求文件中:
alembic==1.7.1
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
從project/app/main.py中刪除啟動事件,因為我們不再希望在啟動時創建表:
@app.on_event("startup")
async def on_startup():
await init_db()
再次,降低現有的容器和卷:
$ docker-compose down -v
旋轉容器備份:
$ docker-compose up -d --build
在構建新圖像時快速查看將 Asyncio 與 Alembic一起使用。
備份容器後,使用異步模板初始化 Alembic:
$ docker-compose exec web alembic init -t async migrations
在生成的“project/migrations”文件夾中,將 SQLModel 導入到script.py.mako中,這是一個Mako模板文件:
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel # NEW
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}
現在,當生成新的遷移文件時,它將包含import sqlmodel
.
Next, we need to update the top of project/migrations/env.py like so:
import asyncio
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlmodel import SQLModel # NEW
from alembic import context
from app.models import Song # NEW
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = SQLModel.metadata # UPDATED
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
...
Here, we imported SQLModel and our song model. We then set target_metadata
to our model's MetaData, SQLModel.metadata
. For more on the target_metadata
argument, check out Auto Generating Migrations from the official Alembic docs.
Update sqlalchemy.url
in project/alembic.ini:
sqlalchemy.url = postgresql+asyncpg://postgres:postgres@db:5432/foo
To generate the first migration file, run:
$ docker-compose exec web alembic revision --autogenerate -m "init"
If all went well, you should see a new migration file in "project/migrations/versions" that looks something like this:
"""init
Revision ID: f9c634db477d
Revises:
Create Date: 2021-09-10 00:24:32.718895
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel
# revision identifiers, used by Alembic.
revision = 'f9c634db477d'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('song',
sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('artist', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('id', sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_song_artist'), 'song', ['artist'], unique=False)
op.create_index(op.f('ix_song_id'), 'song', ['id'], unique=False)
op.create_index(op.f('ix_song_name'), 'song', ['name'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_name'), table_name='song')
op.drop_index(op.f('ix_song_id'), table_name='song')
op.drop_index(op.f('ix_song_artist'), table_name='song')
op.drop_table('song')
# ### end Alembic commands ###
Apply the migration:
$ docker-compose exec web alembic upgrade head
Ensure you can add a song.
Let's quickly test a schema change. Update the SongBase
model in project/app/models.py:
class SongBase(SQLModel):
name: str
artist: str
year: Optional[int] = None
Don't forget the import:
from typing import Optional
Create a new migration file:
$ docker-compose exec web alembic revision --autogenerate -m "add year"
從自動生成的遷移文件中更新upgrade
和函數,如下所示:downgrade
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('song', sa.Column('year', sa.Integer(), nullable=True))
op.create_index(op.f('ix_song_year'), 'song', ['year'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_year'), table_name='song')
op.drop_column('song', 'year')
# ### end Alembic commands ###
應用遷移:
$ docker-compose exec web alembic upgrade head
更新路由處理程序:
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, year=song.year, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist, year=song.year)
session.add(song)
await session.commit()
await session.refresh(song)
return song
測試:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai", "year":"2021"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
在本教程中,我們介紹瞭如何配置 SQLAlchemy、SQLModel 和 Alembic 以異步使用 FastAPI。
如果您正在尋找更多挑戰,請查看我們所有的 FastAPI教程和課程。
您可以在fastapi-sqlmodel-alembic 存儲庫中找到源代碼。乾杯!
1660584420
Este tutorial mostra como trabalhar com SQLAlchemy de forma assíncrona com SQLModel e FastAPI. Também configuraremos o Alembic para lidar com migrações de banco de dados.
Comece clonando o projeto base do repositório fastapi-sqlmodel-alembic :
$ git clone -b base https://github.com/testdrivenio/fastapi-sqlmodel-alembic
$ cd fastapi-sqlmodel-alembic
Na raiz do projeto, crie as imagens e gire os contêineres do Docker:
$ docker-compose up -d --build
Quando a compilação estiver concluída, navegue até http://localhost:8004/ping . Você deveria ver:
{
"ping": "pong!"
}
Dê uma olhada rápida na estrutura do projeto antes de prosseguir.
Em seguida, vamos adicionar SQLModel , uma biblioteca para interagir com bancos de dados SQL do código Python, com objetos Python. Com base nas anotações do tipo Python, é essencialmente um wrapper sobre pydantic e SQLAlchemy , facilitando o trabalho com ambos.
Também precisaremos de Psycopg .
Adicione as duas dependências a project/requirements.txt :
fastapi==0.68.1
psycopg2-binary==2.9.1
sqlmodel==0.0.4
uvicorn==0.15.0
Crie dois novos arquivos em "project/app", db.py e models.py .
project/app/models.py :
from sqlmodel import SQLModel, Field
class SongBase(SQLModel):
name: str
artist: str
class Song(SongBase, table=True):
id: int = Field(default=None, primary_key=True)
class SongCreate(SongBase):
pass
Aqui, definimos três modelos:
SongBase
é o modelo básico do qual os outros herdam. Ele tem duas propriedades, name
e artist
, ambas sendo strings. Este é um modelo somente de dados, pois não possui table=True
, o que significa que é usado apenas como um modelo pydantic.Song
, enquanto isso, adiciona uma id
propriedade ao modelo base. É um modelo de tabela, portanto, é um modelo pydantic e SQLAlchemy. Representa uma tabela de banco de dados.SongCreate
é um modelo pydantic somente de dados que será usado para criar novas instâncias de música.projeto/aplicativo/db.py :
import os
from sqlmodel import create_engine, SQLModel, Session
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_engine(DATABASE_URL, echo=True)
def init_db():
SQLModel.metadata.create_all(engine)
def get_session():
with Session(engine) as session:
yield session
Aqui nós:
create_engine
SQLModel. As principais diferenças entre a versão do SQLModel create_engine
e do SQLAlchemy é que a versão SQLModel adiciona anotações de tipo (para suporte ao editor) e habilita o estilo SQLAlchemy "2.0" de mecanismos e conexões . Além disso, passamos echo=True
para que possamos ver as consultas SQL geradas no terminal. Isso é sempre bom para habilitar no modo de desenvolvimento para fins de depuração.Em seguida, dentro de project/app/main.py , vamos criar as tabelas na inicialização, usando o evento startup :
from fastapi import FastAPI
from app.db import init_db
from app.models import Song
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
Vale a pena notar que from app.models import Song
é necessário. Sem ele, a tabela de músicas não será criada.
Para testar, desative os contêineres e volumes antigos, reconstrua as imagens e ative os novos contêineres:
$ docker-compose down -v
$ docker-compose up -d --build
Abra os logs do contêiner via docker-compose logs web
. Você deveria ver:
web_1 | CREATE TABLE song (
web_1 | name VARCHAR NOT NULL,
web_1 | artist VARCHAR NOT NULL,
web_1 | id SERIAL,
web_1 | PRIMARY KEY (id)
web_1 | )
Abra o psql:
$ docker-compose exec db psql --username=postgres --dbname=foo
psql (13.4 (Debian 13.4-1.pgdg100+1))
Type "help" for help.
foo=# \dt
List of relations
Schema | Name | Type | Owner
--------+------+-------+----------
public | song | table | postgres
(1 row)
foo=# \q
Com a tabela montada, vamos adicionar algumas novas rotas para project/app/main.py :
from fastapi import Depends, FastAPI
from sqlalchemy import select
from sqlmodel import Session
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
result = session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
def add_song(song: SongCreate, session: Session = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
session.commit()
session.refresh(song)
return song
Adicione uma música:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
Em seu navegador, navegue até http://localhost:8004/songs . Você deveria ver:
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
Continuando, vamos adicionar suporte assíncrono ao SQLModel.
Primeiro, derrube os contêineres e volumes:
$ docker-compose down -v
Atualize o URI do banco de dados em docker-compose.yml , adicionando +asyncpg
:
environment:
- DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/foo
Em seguida, substitua Psycopg por asyncpg :
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Atualize project/app/db.py : para usar os sabores assíncronos do mecanismo e da sessão do SQLAlchemy:
import os
from sqlmodel import SQLModel
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_async_engine(DATABASE_URL, echo=True, future=True)
async def init_db():
async with engine.begin() as conn:
# await conn.run_sync(SQLModel.metadata.drop_all)
await conn.run_sync(SQLModel.metadata.create_all)
async def get_session() -> AsyncSession:
async_session = sessionmaker(
engine, class_=AsyncSession, expire_on_commit=False
)
async with async_session() as session:
yield session
Notas:
expire_on_commit=False
.metadata.create_all
não é executado de forma assíncrona, então usamos run_sync para executá-lo de forma síncrona dentro da função assíncrona.Transforme on_startup
em uma função assíncrona em project/app/main.py :
@app.on_event("startup")
async def on_startup():
await init_db()
É isso. Reconstrua as imagens e gire os contêineres:
$ docker-compose up -d --build
Certifique-se de que as tabelas foram criadas.
Por fim, atualize os manipuladores de rotas em project/app/main.py para usar a execução assíncrona:
from fastapi import Depends, FastAPI
from sqlalchemy.future import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
async def on_startup():
await init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Adicione uma nova música e certifique-se de que http://localhost:8004/songs funcione conforme o esperado.
Por fim, vamos adicionar o Alembic à mistura para lidar adequadamente com as alterações do esquema do banco de dados.
Adicione-o ao arquivo de requisitos:
alembic==1.7.1
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Remova o evento de inicialização de project/app/main.py , pois não queremos mais que as tabelas sejam criadas na inicialização:
@app.on_event("startup")
async def on_startup():
await init_db()
Novamente, derrube os contêineres e volumes existentes:
$ docker-compose down -v
Gire os contêineres de volta:
$ docker-compose up -d --build
Dê uma olhada rápida em Using Asyncio with Alembic enquanto as novas imagens estão sendo construídas.
Depois que os contêineres estiverem de volta, inicialize o Alembic com o modelo assíncrono :
$ docker-compose exec web alembic init -t async migrations
Dentro da pasta "project/migrations" gerada, importe SQLModel para script.py.mako , um arquivo de modelo Mako :
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel # NEW
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}
Agora, quando um novo arquivo de migração for gerado, ele incluirá import sqlmodel
.
Em seguida, precisamos atualizar o topo de project/migrations/env.py assim:
import asyncio
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlmodel import SQLModel # NEW
from alembic import context
from app.models import Song # NEW
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = SQLModel.metadata # UPDATED
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
...
Aqui, importamos SQLModel e nosso modelo de música. Em seguida, definimos como MetaDatatarget_metadata
do nosso modelo , . Para saber mais sobre o argumento, confira Auto Generating Migrations nos documentos oficiais do Alembic.SQLModel.metadatatarget_metadata
Atualização sqlalchemy.url
em project/alembic.ini :
sqlalchemy.url = postgresql+asyncpg://postgres:postgres@db:5432/foo
Para gerar o primeiro arquivo de migração, execute:
$ docker-compose exec web alembic revision --autogenerate -m "init"
Se tudo correu bem, você deverá ver um novo arquivo de migração em "projeto/migrações/versões" que se parece com isto:
"""init
Revision ID: f9c634db477d
Revises:
Create Date: 2021-09-10 00:24:32.718895
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel
# revision identifiers, used by Alembic.
revision = 'f9c634db477d'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('song',
sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('artist', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('id', sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_song_artist'), 'song', ['artist'], unique=False)
op.create_index(op.f('ix_song_id'), 'song', ['id'], unique=False)
op.create_index(op.f('ix_song_name'), 'song', ['name'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_name'), table_name='song')
op.drop_index(op.f('ix_song_id'), table_name='song')
op.drop_index(op.f('ix_song_artist'), table_name='song')
op.drop_table('song')
# ### end Alembic commands ###
Aplique a migração:
$ docker-compose exec web alembic upgrade head
Certifique-se de que você pode adicionar uma música.
Vamos testar rapidamente uma mudança de esquema. Atualize o SongBase
modelo em project/app/models.py :
class SongBase(SQLModel):
name: str
artist: str
year: Optional[int] = None
Não esqueça da importação:
from typing import Optional
Crie um novo arquivo de migração:
$ docker-compose exec web alembic revision --autogenerate -m "add year"
Atualize as funções upgrade
e downgrade
do arquivo de migração gerado automaticamente da seguinte forma:
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('song', sa.Column('year', sa.Integer(), nullable=True))
op.create_index(op.f('ix_song_year'), 'song', ['year'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_year'), table_name='song')
op.drop_column('song', 'year')
# ### end Alembic commands ###
Aplique a migração:
$ docker-compose exec web alembic upgrade head
Atualize os manipuladores de rotas:
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, year=song.year, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist, year=song.year)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Teste:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai", "year":"2021"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
Neste tutorial, abordamos como configurar SQLAlchemy, SQLModel e Alembic para trabalhar com FastAPI de forma assíncrona.
Se você está procurando por mais desafios, confira todos os nossos tutoriais e cursos FastAPI .
Você pode encontrar o código-fonte no repositório fastapi-sqlmodel-alembic . Felicidades!
Fonte: https://testdrive.io
1660614060
В этом руководстве рассматривается, как работать с SQLAlchemy асинхронно с SQLModel и FastAPI. Мы также настроим Alembic для обработки миграции базы данных.
Начните с клонирования базового проекта из репозитория fastapi-sqlmodel-alembic :
$ git clone -b base https://github.com/testdrivenio/fastapi-sqlmodel-alembic
$ cd fastapi-sqlmodel-alembic
В корне проекта создайте образы и разверните контейнеры Docker:
$ docker-compose up -d --build
После завершения сборки перейдите по адресу http://localhost:8004/ping . Тебе следует увидеть:
{
"ping": "pong!"
}
Прежде чем двигаться дальше, ознакомьтесь со структурой проекта.
Далее добавим SQLModel , библиотеку для взаимодействия с базами данных SQL из кода Python, с объектами Python. Основанный на аннотациях типов Python, он по сути представляет собой оболочку поверх pydantic и SQLAlchemy , что упрощает работу с ними обоими.
Нам также понадобится Psycopg .
Добавьте две зависимости в project/requirements.txt :
fastapi==0.68.1
psycopg2-binary==2.9.1
sqlmodel==0.0.4
uvicorn==0.15.0
Создайте два новых файла в "project/app", db.py и models.py .
проект/приложение/models.py :
from sqlmodel import SQLModel, Field
class SongBase(SQLModel):
name: str
artist: str
class Song(SongBase, table=True):
id: int = Field(default=None, primary_key=True)
class SongCreate(SongBase):
pass
Здесь мы определили три модели:
SongBase
является базовой моделью, от которой наследуются остальные. Он имеет два свойства name
и artist
, оба из которых являются строками. Это модель только для данных, так как в ней отсутствует table=True
, а это означает, что она используется только как пидантическая модель.Song
, тем временем добавляет id
свойство к базовой модели. Это табличная модель, поэтому это модель pydantic и SQLAlchemy. Он представляет собой таблицу базы данных.SongCreate
— это модель pydantic только для данных, которая будет использоваться для создания новых экземпляров песни.проект/приложение/db.py :
import os
from sqlmodel import create_engine, SQLModel, Session
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_engine(DATABASE_URL, echo=True)
def init_db():
SQLModel.metadata.create_all(engine)
def get_session():
with Session(engine) as session:
yield session
Мы тут:
create_engine
SQLModel. Основные различия между версиями SQLModel create_engine
и SQLAlchemy заключаются в том, что версия SQLModel добавляет аннотации типов (для поддержки редактора) и включает движки и соединения в стиле SQLAlchemy "2.0" . Кроме того, мы прошли, echo=True
чтобы мы могли видеть сгенерированные SQL-запросы в терминале. Это всегда полезно включить в режиме разработки для целей отладки.Далее внутри project/app/main.py давайте создадим таблицы при запуске, используя событие запуска :
from fastapi import FastAPI
from app.db import init_db
from app.models import Song
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
Стоит отметить, что from app.models import Song
требуется. Без него таблица песен не будет создана.
Для проверки отключите старые контейнеры и тома, перестройте образы и запустите новые контейнеры:
$ docker-compose down -v
$ docker-compose up -d --build
Open the container logs via docker-compose logs web. You should see:
web_1 | CREATE TABLE song (
web_1 | name VARCHAR NOT NULL,
web_1 | artist VARCHAR NOT NULL,
web_1 | id SERIAL,
web_1 | PRIMARY KEY (id)
web_1 | )
Open psql:
$ docker-compose exec db psql --username=postgres --dbname=foo
psql (13.4 (Debian 13.4-1.pgdg100+1))
Type "help" for help.
foo=# \dt
List of relations
Schema | Name | Type | Owner
--------+------+-------+----------
public | song | table | postgres
(1 row)
foo=# \q
With the table up, let's add a few new routes to project/app/main.py:
from fastapi import Depends, FastAPI
from sqlalchemy import select
from sqlmodel import Session
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
def on_startup():
init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
result = session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
def add_song(song: SongCreate, session: Session = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
session.commit()
session.refresh(song)
return song
Add a song:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
In your browser, navigate to http://localhost:8004/songs. You should see:
[
{
"id": 1,
"name": "Midnight Fit",
"artist": "Mogwai"
}
]
Moving right along, let's add async support to SQLModel.
First, bring down the containers and volumes:
$ docker-compose down -v
Update the database URI in docker-compose.yml, adding +asyncpg:
environment:
- DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/foo
Then, replace Psycopg with asyncpg:
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Update project/app/db.py to use the async flavors of SQLAlchemy's engine and session:
import os
from sqlmodel import SQLModel
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
DATABASE_URL = os.environ.get("DATABASE_URL")
engine = create_async_engine(DATABASE_URL, echo=True, future=True)
async def init_db():
async with engine.begin() as conn:
# await conn.run_sync(SQLModel.metadata.drop_all)
await conn.run_sync(SQLModel.metadata.create_all)
async def get_session() -> AsyncSession:
async_session = sessionmaker(
engine, class_=AsyncSession, expire_on_commit=False
)
async with async_session() as session:
yield session
Notes:
- We disabled expire on commit behavior by passing in expire_on_commit=False (more on why in the sketch after this list).
- metadata.create_all doesn't execute asynchronously, so we used run_sync to execute it synchronously within the async function.
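A minimal illustration of why expire_on_commit=False matters here (assumed behavior of SQLAlchemy's asyncio extension, not tutorial code; session is an AsyncSession):
song = Song(name="Midnight Fit", artist="Mogwai")
session.add(song)
await session.commit()
# With the default expire_on_commit=True, every attribute of song would
# now be expired, and the next access (e.g. when serializing the response)
# would trigger a synchronous lazy refresh, which raises an error under
# asyncio. With expire_on_commit=False the loaded values stay available,
# so a handler can simply return song after committing.
print(song.name)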
Turn on_startup into an async function in project/app/main.py:
@app.on_event("startup")
async def on_startup():
await init_db()
That's it. Rebuild the images and spin up the containers:
$ docker-compose up -d --build
Make sure the tables were created.
Finally, update the route handlers in project/app/main.py to use async execution:
from fastapi import Depends, FastAPI
from sqlalchemy.future import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.db import get_session, init_db
from app.models import Song, SongCreate
app = FastAPI()
@app.on_event("startup")
async def on_startup():
await init_db()
@app.get("/ping")
async def pong():
return {"ping": "pong!"}
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Add a new song and make sure http://localhost:8004/songs works as expected.
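For example:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
$ curl http://localhost:8004/songs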
Finally, let's add Alembic into the mix to properly handle database schema changes.
Add it to the requirements file:
alembic==1.7.1
asyncpg==0.24.0
fastapi==0.68.1
sqlmodel==0.0.4
uvicorn==0.15.0
Remove the startup event from project/app/main.py since we no longer want the tables created at startup:
@app.on_event("startup")
async def on_startup():
await init_db()
Again, bring down the existing containers and volumes:
$ docker-compose down -v
Spin the containers back up:
$ docker-compose up -d --build
While the new images are building, take a quick look at Using Asyncio with Alembic.
Once the containers are back up, initialize Alembic with the async template:
$ docker-compose exec web alembic init -t async migrations
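This scaffolds out the migration environment. You should end up with a layout roughly like this (a sketch; the exact files can vary slightly between Alembic versions):
project/alembic.ini
project/migrations/README
project/migrations/env.py
project/migrations/script.py.mako
project/migrations/versions/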
In the newly generated "project/migrations" folder, import SQLModel into script.py.mako, a Mako template file:
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel # NEW
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}
Now, when a new migration file is generated, it will include import sqlmodel (the autogenerated operations reference SQLModel types like sqlmodel.sql.sqltypes.AutoString, as you'll see shortly).
Next, we need to update the top of project/migrations/env.py like so:
import asyncio
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import AsyncEngine
from sqlmodel import SQLModel # NEW
from alembic import context
from app.models import Song # NEW
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = SQLModel.metadata # UPDATED
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
...
Here, we imported SQLModel and our song model. We then set target_metadata to our model's metadata, SQLModel.metadata. To learn more about the target_metadata argument, check out Auto Generating Migrations from the official Alembic docs.
Update sqlalchemy.url in project/alembic.ini:
sqlalchemy.url = postgresql+asyncpg://postgres:postgres@db:5432/foo
To generate the first migration file, run:
$ docker-compose exec web alembic revision --autogenerate -m "init"
If all went well, you should see a new migration file in "project/migrations/versions" that looks something like this:
"""init
Revision ID: f9c634db477d
Revises:
Create Date: 2021-09-10 00:24:32.718895
"""
from alembic import op
import sqlalchemy as sa
import sqlmodel
# revision identifiers, used by Alembic.
revision = 'f9c634db477d'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('song',
sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('artist', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
sa.Column('id', sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_song_artist'), 'song', ['artist'], unique=False)
op.create_index(op.f('ix_song_id'), 'song', ['id'], unique=False)
op.create_index(op.f('ix_song_name'), 'song', ['name'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_name'), table_name='song')
op.drop_index(op.f('ix_song_id'), table_name='song')
op.drop_index(op.f('ix_song_artist'), table_name='song')
op.drop_table('song')
# ### end Alembic commands ###
Apply the migration:
$ docker-compose exec web alembic upgrade head
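To verify, you can hop back into psql; \dt should now list an alembic_version table (where Alembic records the applied revision ID) alongside song:
$ docker-compose exec db psql --username=postgres --dbname=foo
foo=# \dt
foo=# \q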
Make sure you can add a song.
Let's quickly test a schema change. Update the SongBase model in project/app/models.py:
class SongBase(SQLModel):
name: str
artist: str
year: Optional[int] = None
Don't forget the import:
from typing import Optional
Create a new migration file:
$ docker-compose exec web alembic revision --autogenerate -m "add year"
Update the upgrade and downgrade functions in the autogenerated migration file like so:
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('song', sa.Column('year', sa.Integer(), nullable=True))
op.create_index(op.f('ix_song_year'), 'song', ['year'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_song_year'), table_name='song')
op.drop_column('song', 'year')
# ### end Alembic commands ###
Apply the migration:
$ docker-compose exec web alembic upgrade head
Update the route handlers:
@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
result = await session.execute(select(Song))
songs = result.scalars().all()
return [Song(name=song.name, artist=song.artist, year=song.year, id=song.id) for song in songs]
@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
song = Song(name=song.name, artist=song.artist, year=song.year)
session.add(song)
await session.commit()
await session.refresh(song)
return song
Test it:
$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai", "year":"2021"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
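Then confirm that the new year field comes back when you fetch the songs:
$ curl http://localhost:8004/songs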
In this tutorial, we covered how to configure SQLAlchemy, SQLModel, and Alembic to work with FastAPI asynchronously.
If you're looking for more challenges, check out all of our FastAPI tutorials and courses.
You can find the source code in the fastapi-sqlmodel-alembic repository. Cheers!
Source: https://testdriven.io