Inspect violated constraints #224

Open
wants to merge 1 commit into base: master
flask_resty/view.py: 36 additions & 10 deletions
@@ -3,6 +3,7 @@
 import flask
 from flask.views import MethodView
 from marshmallow import fields
+import sqlalchemy as sa
 from sqlalchemy.exc import IntegrityError
 from sqlalchemy.orm import Load
 from sqlalchemy.orm.exc import NoResultFound
@@ -415,20 +416,45 @@ def commit(self):
     def resolve_integrity_error(self, error):
         original_error = error.orig

-        if (
-            hasattr(original_error, 'pgcode') and
-            original_error.pgcode in (
+        if hasattr(original_error, 'pgcode'):
+            if original_error.pgcode in (
                 '23502',  # not_null_violation
                 '23514',  # check_violation
-            )
-        ):
-            # Using the psycopg2 error code, we can tell that this was not from
-            # an integrity error that was not a conflict. This means there was
-            # a schema bug, so we emit an interal server error instead.
-            return error
+            ):
+                # Using the psycopg2 error code, we can tell that this was not
+                # from an integrity error that was not a conflict. This means
+                # there was a schema bug, so we emit an internal server error
+                # instead.
+                return error
+            else:
+                diag = original_error.diag
+
+                schema_name = diag.schema_name
+                table_name = diag.table_name
+                constraint_name = diag.constraint_name
+
+                insp = sa.inspect(self.session.bind)
Member:
does it make more sense to do this at startup for each table, or to cache these?
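A minimal sketch of that caching idea, assuming a module-level cache keyed by (schema, table); the helper name and cache layout are hypothetical and not part of this PR:

    import sqlalchemy as sa

    # Hypothetical cache: (schema_name, table_name) -> {constraint_name: column_names}
    _unique_constraints_by_table = {}

    def get_unique_constraint_columns(bind, schema_name, table_name, constraint_name):
        """Look up the columns of a named unique constraint, caching per table."""
        key = (schema_name, table_name)
        if key not in _unique_constraints_by_table:
            insp = sa.inspect(bind)
            _unique_constraints_by_table[key] = {
                constraint['name']: constraint['column_names']
                for constraint in insp.get_unique_constraints(
                    table_name, schema=schema_name,
                )
            }
        return _unique_constraints_by_table[key].get(constraint_name, ())

resolve_integrity_error could then call get_unique_constraint_columns(self.session.bind, schema_name, table_name, constraint_name) instead of hitting the inspector on every conflict.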

+                unique_constraints = insp.get_unique_constraints(
+                    table_name,
+                    schema=schema_name,
+                )
+
+                for unique_constraint in unique_constraints:
Member:
... so you can make this a map based on name. Not sure if all this is premature optimization, but I think it also makes the code cleaner; a for loop with break smells to me.
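A sketch of the map-based lookup being suggested here, reusing the inspector call from the diff (not part of the PR):

    unique_constraints = insp.get_unique_constraints(table_name, schema=schema_name)

    # Key the constraints by name so the lookup is a plain dict access
    # instead of a for/else with break.
    columns_by_constraint = {
        constraint['name']: constraint['column_names']
        for constraint in unique_constraints
    }
    column_names = columns_by_constraint.get(constraint_name, ())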

+                    if unique_constraint['name'] == constraint_name:
+                        column_names = unique_constraint['column_names']
+                        break
+                else:
+                    column_names = ()
+        else:
+            column_names = ()

         flask.current_app.logger.exception("handled integrity error")
-        return ApiError(409, {'code': 'invalid_data.conflict'})
+
+        error = {'code': 'invalid_data.conflict'}
+        if column_names:
+            error['source'] = {'pointer': '/data/{}'.format(column_names[-1])}
Contributor Author:
We'll actually have to go through the schema and figure out which deserializer field corresponds to this column.
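A rough sketch of that column-to-field mapping, assuming the deserializer is a marshmallow schema whose fields map to columns via their attribute (or the field name itself); the helper is hypothetical:

    def field_name_for_column(schema, column_name):
        """Find the wire-format key of the schema field backing a column."""
        for field_name, field in schema.fields.items():
            if (field.attribute or field_name) == column_name:
                # data_key is the marshmallow 3 spelling of the wire-format key;
                # fall back to the field name when it is not set.
                return getattr(field, 'data_key', None) or field_name
        # No matching field; fall back to the raw column name.
        return column_name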

Contributor Author:
We need to tweak the whole path thing, too. Generally the insert is only on the main model (i.e. no relationship inserts, most likely), but... eh.

Contributor Author:
For multi-column unique constraints (e.g. tuple of tenant_id, name), we generally want to report the conflict on the last column, but this is something we need to tweak.
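For example, under that rule a constraint over (tenant_id, name) would be reported against name only (values here are illustrative):

    column_names = ('tenant_id', 'name')
    pointer = '/data/{}'.format(column_names[-1])  # '/data/name'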

Contributor:
I think that assumption would require some level of documentation. Alternatively, we could make the default show all columns and then have a hook in the view that reduces what is shown.
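A sketch of what such a hook might look like on the view; the method name and default behavior are hypothetical:

    def get_conflict_columns(self, column_names):
        """Hook: given the columns of the violated unique constraint,
        return the columns to report in the error source.

        The default shows all of them; a view could override this to drop
        e.g. tenant_id from a (tenant_id, name) constraint.
        """
        return column_names

resolve_integrity_error would then pass column_names through self.get_conflict_columns(column_names) before building the pointer(s).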

Contributor Author:
Docs for sure. The issue is that we have a lot of unique constraints like (tenant_id, name), and it would be odd to show an error on both fields in such cases.

Member:
We could also have a pointers field that is an array of pointers. We are already not JSON:API compliant anyway :trollface:
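A sketch of that payload shape; whether the list lives under source or at the top level is a guess, not something this PR implements:

    error = {
        'code': 'invalid_data.conflict',
        'source': {
            # One pointer per column in the violated unique constraint.
            'pointers': [
                '/data/{}'.format(column_name) for column_name in column_names
            ],
        },
    }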


+        return ApiError(409, error)

     def set_item_response_meta(self, item):
         super(ModelView, self).set_item_response_meta(item)
tests/test_view_errors.py: 15 additions & 1 deletion
@@ -230,10 +230,24 @@ def test_integrity_error_conflict(client, path):
     }])


+@pytest.mark.parametrize('path', ('/widgets', '/widgets_flush'))
+def test_integrity_error_conflict_with_source(db, client, path):
+    if db.engine.driver != 'psycopg2':
+        pytest.xfail("IntegrityError source detection requires psycopg2")
+
+    response = client.post(path, data={
+        'name': "Foo",
+    })
+    assert_response(response, 409, [{
+        'code': 'invalid_data.conflict',
+        'source': {'pointer': '/data/name'},
+    }])
+
+
 @pytest.mark.parametrize('path', ('/widgets', '/widgets_flush'))
 def test_integrity_error_uncaught(db, app, client, path):
     if db.engine.driver != 'psycopg2':
-        pytest.xfail("IntegrityError cause detection only works with psycopg2")
+        pytest.xfail("IntegrityError cause detection requires psycopg2")

     app.testing = False