Alembic migration with Postgres multi-schema not working #1506
Replies: 2 comments
-
Hi, have you tried looking at this section of the docs? https://alembic.sqlalchemy.org/en/latest/autogenerate.html#autogenerate-include-hooks
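For reference, that section wires the hooks roughly like this (a minimal sketch adapted from the docs page; the schema names are placeholders):
def include_name(name, type_, parent_names):
    if type_ == "schema":
        # schema names only reach this hook when include_schemas=True
        return name in [None, "schema_one", "schema_two"]
    else:
        return True

context.configure(
    # ...
    include_schemas=True,
    include_name=include_name,
)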
-
I have tried. Below are my include_object, include_name, and process_revision_directives (which I didn't understand much :-(). I referred to https://alembic.sqlalchemy.org/en/latest/cookbook.html#rudimental-schema-level-multi-tenancy-for-postgresql-databases since I am using schema_translate_map. Below are the commands I am running:
Now when I do … Please suggest.
from typing import Iterable, Literal, Optional

from alembic.operations import ops
from alembic.operations.ops import MigrationScript
from alembic.runtime.migration import MigrationContext
from sqlalchemy.schema import SchemaItem


def include_object(
    object: SchemaItem,
    name: Optional[str],
    type_: Literal[
        "schema", "table", "column", "index",
        "unique_constraint", "foreign_key_constraint",
    ],
    reflected: bool,
    compare_to: Optional[SchemaItem],
) -> bool:
    # Never include Alembic's own version table in autogenerate.
    if type_ == "table" and name == "alembic_version":
        return False
    print(f"reflected {reflected}: {type_} {name}")
    return True
def include_name(
    name: str | None,
    type_: str,
    parent_names: dict[str, str | None],
) -> bool:
    print("inside include_name")
    print(type_, name)
    print(target_metadata.schema)
    print(type(parent_names), parent_names)
    if type_ == "schema":
        # Only reflect the default schema and 'plat'.
        return name in [None, 'plat']
    elif type_ == "table":
        # Only reflect tables that exist in the model metadata.
        return (
            parent_names["schema_qualified_table_name"]
            in target_metadata.tables
        )
    else:
        return True
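# Side check for the table branch above: MetaData.tables is keyed
# "schema.name" only when the table actually has a schema set (standalone
# illustration; the table names here are made up):
from sqlalchemy import Column, Integer, MetaData, Table

_md = MetaData()
Table("user", _md, Column("id", Integer, primary_key=True), schema="public")
Table("issue", _md, Column("id", Integer, primary_key=True))
print(sorted(_md.tables))  # ['issue', 'public.user']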
def process_revision_directives(
    _context: MigrationContext,
    revision: str | Iterable[str | None] | Iterable[str],
    directives: list[MigrationScript],
) -> None:
    print("process_revision_directives called")
    script = directives[0]
    for op in script.upgrade_ops.ops:
        if isinstance(op, ops.CreateTableOp):
            print(f"directive schema = {op.schema}")
            if op.schema is None:
                op.schema = get_target_schema()
            elif op.schema == 'public':
                # Handle special cases for the 'public' schema if needed;
                # or leave it as is based on your logic.
                op.schema = get_target_schema()
        print(op.info)
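# For anyone else reading: autogenerate hands this hook a single
# MigrationScript inside `directives`; the per-table operations such as
# CreateTableOp live in script.upgrade_ops.ops, which is why the loop
# descends into upgrade_ops rather than scanning `directives` itself.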
context.configure(
    connection=connection,
    target_metadata=target_metadata,
    # NOTE: with include_schemas=False only the default (translated)
    # schema is scanned, so include_name never sees other schema names.
    include_schemas=False,
    version_table_schema=target_schema,
    render_item=render_item,
    include_object=include_object,
    process_revision_directives=process_revision_directives,
    include_name=include_name,
    # render_as_batch=True,
)
with context.begin_transaction():
    context.run_migrations()
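For comparison, here is roughly how the linked cookbook recipe wires schema_translate_map (a sketch adapted from that recipe; connectable is the Engine your env.py already builds, and the tenant x-argument name follows the cookbook):
# Invoked per tenant, e.g.:
#   alembic -x tenant=test revision --autogenerate -m "..."
#   alembic -x tenant=test upgrade head
current_tenant = context.get_x_argument(as_dictionary=True).get("tenant")

with connectable.connect() as connection:
    # Route schemaless tables into the per-tenant schema at execution time.
    connection = connection.execution_options(
        schema_translate_map={None: current_tenant},
    )
    context.configure(
        connection=connection,
        target_metadata=target_metadata,
        # keep the version table in the tenant schema as well
        version_table_schema=current_tenant,
    )
    with context.begin_transaction():
        context.run_migrations()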
-
Describe the bug
When running an Alembic migration against Postgres with multiple schemas, it does not work.
Expected behavior
The migration should be able to identify tables that need to be created in the public schema and those that have been added in another schema, say test.
To Reproduce
Create two DB models, one in the public schema and one with no schema defined.
class User(Base, TableName):
    ACCOUNT_TYPES = [
        ('atlassian', 'Atlassian'),
        ('app', 'App'),
        ('customer', 'Customer'),
    ]
    __table_args__ = {'schema': 'public'}
from sqlalchemy import BIGINT, TEXT, ForeignKeyConstraint, String, UniqueConstraint
from sqlalchemy.orm import mapped_column


class Issue(Base):
    use_snake_case = True
    # Drop column list -
    # issue_hierarchy_level
    id = mapped_column(
        BIGINT, nullable=False, primary_key=True,
        # unique=True
    )
    key = mapped_column(TEXT, nullable=False, index=True)
    parent_id = mapped_column(
        BIGINT,
        # ForeignKey(column="issue.id", deferrable=True, initially='DEFERRED'),
        nullable=True, index=True,
    )
    parent_key = mapped_column(
        String,
        # ForeignKey(column="issue.key", deferrable=True, initially='DEFERRED'),
        nullable=True, index=True,
    )
    summary = mapped_column(String, nullable=False)
    __table_args__ = (
        UniqueConstraint("id", name="uq_issue_id"),
        UniqueConstraint("key", name="uq_issue_key"),
        ForeignKeyConstraint([parent_id], [id], name="fk_issue_parent_id"),
        ForeignKeyConstraint([parent_key], [key], name="fk_issue_parent_key"),
    )
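Note: User is pinned to the public schema via __table_args__, while Issue has no schema, so Issue is the table that schema_translate_map is expected to redirect into the target schema.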
Error
Versions.
Additional context
The issue is how to handle multiple schemas with Postgres.
Have a nice day!