libervia-backend: sat/memory/migration/env.py @ 3934:e345d93fb6e5
plugin OXPS: OpenPGP for XMPP Pubsub implementation:
OpenPGP for XMPP Pubsub (https://xmpp.org/extensions/inbox/pubsub-encryption.html,
currently a protoXEP) is implemented and activated when `encrypted` is set to `True` in
pubsub's `extra` data.
On item retrieval, decryption is transparent if the key is known, unless the
`decrypt` key in `extra` is set to `False` (notably useful when one wants to check that
data is actually encrypted).
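
A minimal sketch of how these two flags are meant to be used, assuming hypothetical
publish/retrieve coroutines (`publish_item` and `get_items` below are illustrative
stand-ins, not the actual plugin or bridge method names):

    async def encrypt_then_verify(service, node, payload):
        # publish with end-to-end encryption: OXPS encrypts the item because
        # "encrypted" is True in the pubsub extra data
        await publish_item(service, node, payload, extra={"encrypted": True})

        # retrieve without transparent decryption, e.g. to check that what the
        # pubsub service stores is really the encrypted payload
        items, metadata = await get_items(service, node, extra={"decrypt": False})
        return items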
Methods and corresponding bridge methods have been implemented to manage shared secrets
(to share, revoke or rotate the secrets).
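
The method names are not given here; a purely illustrative sketch of what such calls
could look like (every coroutine name below is a hypothetical placeholder):

    async def manage_secrets(service, node, recipient_jid, secret_id):
        # hypothetical placeholders for the share/revoke/rotate operations mentioned above
        await share_secrets(service, node, recipient_jid)  # give a contact the current secrets
        await revoke_secret(service, node, secret_id)      # mark one secret as revoked
        await rotate_secret(service, node)                 # generate and share a new secret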
Plugin XEP-0060's `XEP-0060_publish` trigger point has been moved before the actual
publish, so items can be modified (here, end-to-end encrypted) by triggers. A new
`XEP-0060_items` trigger point has also been added.
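
For plugin authors, hooking these points could look like the sketch below;
`host.trigger.add` is the usual SàT trigger registration, but the callback signatures
shown are assumptions, not taken from this changeset:

    class HypotheticalE2EEPlugin:
        def __init__(self, host):
            self.host = host
            # runs before the actual publish, so the items can still be
            # modified (e.g. encrypted) here
            host.trigger.add("XEP-0060_publish", self._publish_trigger)
            # runs on retrieved items, e.g. to decrypt them transparently
            host.trigger.add("XEP-0060_items", self._items_trigger)

        async def _publish_trigger(self, client, service, node, items, options, sender, extra):
            # assumed signature: transform items in place before they are sent
            return True  # True lets the normal workflow continue

        async def _items_trigger(self, client, service, node, items, rsm_request, extra):
            # assumed signature: post-process items after retrieval
            return True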
The `encrypted` flag can also be used with plugin XEP-0277's microblog data.
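
On the XEP-0277 side, only the `encrypted` key comes from this changeset; the data
fields and the `send_microblog` stand-in in this sketch are illustrative:

    async def post_encrypted_blog(send_microblog):
        # send_microblog is a stand-in for the XEP-0277 publish coroutine
        mb_data = {
            "content": "this post will be end-to-end encrypted",
            "encrypted": True,  # ask OXPS to encrypt the generated pubsub item
        }
        await send_microblog(mb_data)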
rel 380
author: Goffi <goffi@goffi.org>
date: Sat, 15 Oct 2022 20:36:53 +0200
parents: 54c249ec35ce
children: 524856bd7b19
line source
import asyncio
from logging.config import fileConfig
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import create_async_engine
from alembic import context
from sat.memory import sqla_config
from sat.memory.sqla_mapping import Base

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL and not an Engine, though an
    Engine is acceptable here as well. By skipping the Engine creation we
    don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the script
    output.
    """
    db_config = sqla_config.getDbConfig()
    context.configure(
        url=db_config["url"],
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def include_name(name, type_, parent_names):
    # ignore the full-text search tables (pubsub_items_fts*): they are not
    # part of the ORM metadata, so autogenerate must not touch them
    if type_ == "table":
        if name.startswith("pubsub_items_fts"):
            return False
    return True


def do_run_migrations(connection):
    context.configure(
        connection=connection,
        target_metadata=target_metadata,
        render_as_batch=True,
        include_name=include_name,
    )

    with context.begin_transaction():
        context.run_migrations()


async def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine and associate a connection
    with the context.
    """
    db_config = sqla_config.getDbConfig()
    engine = create_async_engine(
        db_config["url"],
        poolclass=pool.NullPool,
        future=True,
    )

    async with engine.connect() as connection:
        await connection.run_sync(do_run_migrations)


if context.is_offline_mode():
    run_migrations_offline()
else:
    asyncio.run(run_migrations_online())
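
For reference, this env.py is executed by Alembic itself; a minimal sketch of driving
the same migrations programmatically follows (the ini path is an assumption, the
backend normally resolves its configuration through sat.memory.sqla_config):

    from alembic import command
    from alembic.config import Config

    # the ini file must set script_location to sat/memory/migration so that
    # Alembic picks up this env.py
    alembic_cfg = Config("alembic.ini")
    command.upgrade(alembic_cfg, "head")              # online mode: run_migrations_online()
    # command.upgrade(alembic_cfg, "head", sql=True)  # offline mode: emit SQL via run_migrations_offline()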