diff --git a/doc/deployment_guide.md b/doc/deployment_guide.md
index c48f240c..90a59885 100644
--- a/doc/deployment_guide.md
+++ b/doc/deployment_guide.md
@@ -1,5 +1,81 @@
 # Deployment Guide
+The tech stack supporting the Data Sharing Portal has been updated, and we now deploy the application to two separate servers: one for the web application and one for the database.
+
+## Web Application Server
+
+When deploying the web application server, I recommend following the instructions outlined in the [Digital Ocean Deployment Guide for Ubuntu 20.04](https://www.digitalocean.com/community/tutorials/how-to-set-up-django-with-postgres-nginx-and-gunicorn-on-ubuntu-20-04). However, many of the steps in the Digital Ocean guide have been modified, so additional instructions can be found below.
+
+1. **Clone repositories:** Clone the web application and config repositories. I recommend cloning both to the **'/opt'** directory using the commands below.
+   - `cd /opt`
+   - `git clone https://github.com/ODM2/ODM2DataSharingPortal.git`
+   - `cd ./ODM2DataSharingPortal`
+   - `git checkout develop` (use the 'master' branch for the production server)
+   - `cd /opt`
+   - `git clone https://github.com/LimnoTech/ODM2DataSharingPortalConfig.git`
+   - `cd ./ODM2DataSharingPortalConfig`
+   - `git checkout develop`
+2. **Set up Python Environment:** For this project we use Miniconda to create and manage our Python environments. I did not find a Miniconda build in our package manager, so I opted to download one directly from [Anaconda.com](https://anaconda.com). The instructions below link to the latest version of conda, which is what we used in our deployment. Future users of these instructions should double-check the available versions and consider whether the latest release is right for their application; it may make more sense to use an older LTS release, for example. Also note that these deployment instructions are for a server using an ARM CPU architecture.
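The architecture of the target server can be confirmed directly before picking an installer; `uname -m` is a standard POSIX command that prints the machine hardware name:

```shell
# Print the machine hardware name: 'aarch64' means the ARM build used
# below is the right one; 'x86_64' means the standard Linux installer.
uname -m
```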
If deploying to a more traditional x86 CPU, look for the Miniconda release that is not built for the 'aarch64' Linux platform.
+   - `cd ~`
+   - `wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-aarch64.sh`
+   - `chmod +x Miniconda3-latest-Linux-aarch64.sh`
+   - `sudo ./Miniconda3-latest-Linux-aarch64.sh`
+   - When prompted where to install, I selected '/opt/miniconda3'.
+   - I also needed to add our conda install to the system path:
+   - `export PATH=$PATH:/opt/miniconda3/bin`
+   - `conda init bash` initializes Miniconda
+   - `exec bash` restarts bash so that the initialization takes effect
+
+   Next we'll set up a virtual environment from the .yml file.
+   - `conda env create -f /opt/ODM2DataSharingPortal/environment_py38_dj22.yml` (note: I modified this yml file so that the environment name is 'ODM2DataSharingPortal')
+   - `conda activate ODM2DataSharingPortal`
+3. **Create a symlink so the Web App will use the settings.json file in the config repo:** Note that the instructions below use the development/staging settings. You may need to point to a different file for the production deployment.
+   - `sudo ln -s /opt/ODM2DataSharingPortalConfig/django/staging.settings.json /opt/ODM2DataSharingPortal/src/WebSDL/settings/settings.json`
+4. **Set Up Gunicorn:**
+   - Gunicorn is not in the default Miniconda channels, so we'll need to get it from conda-forge.
+   - `conda config --add channels conda-forge`
+   - `conda install -c conda-forge gunicorn`
+   - Test that gunicorn starts:
+   - `conda activate ODM2DataSharingPortal`
+   - `cd /opt/ODM2DataSharingPortal/src`
+   - `gunicorn wsgi:application --bind 0.0.0.0:8000`
+   - Once the test works, copy the gunicorn service file from the config repo to the server. I tried a symlink here, but that did not work (probably because it is a service). I ended up just copying the file from the config repo to the system services directory. Also note that my copy command renames the file from envirodiy to gunicorn.
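For orientation, a gunicorn systemd unit of this kind generally looks roughly like the sketch below. This is an illustration only: the `User`, socket path, and description are assumptions, and the authoritative file is `GUnicorn/envirodiy.service` in the config repo.

```ini
# Hypothetical sketch of a gunicorn unit file -- not the real one from the
# config repo. Paths follow this guide's assumptions.
[Unit]
Description=gunicorn daemon for the ODM2 Data Sharing Portal
After=network.target

[Service]
User=www-data
WorkingDirectory=/opt/ODM2DataSharingPortal/src
ExecStart=/usr/bin/gunicorn wsgi:application --bind unix:/run/gunicorn.sock

[Install]
WantedBy=multi-user.target
```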
The service we just created will automatically start gunicorn with the appropriate arguments and create a socket to the application, which becomes the entry point for nginx.
+   - `sudo cp /opt/ODM2DataSharingPortalConfig/GUnicorn/envirodiy.service /etc/systemd/system/gunicorn.service`
+   - As mentioned above, the service automatically calls gunicorn, but where gunicorn lives on your system depends on a variety of things (e.g. how and where Python was installed). To keep the service file flexible and independent of the specifics of the installation, it references gunicorn at '/usr/bin/gunicorn'. However, this is very likely not where gunicorn was installed. The solution is to create a symlink.
+   - (optional) `whereis gunicorn` to help find the install location
+   - `sudo ln -s /path/to/gunicorn /usr/bin/gunicorn`
+   - Finally, I needed to modify the permissions of wsgi.py so that gunicorn can spin up the application.
+   - `cd /opt/ODM2DataSharingPortal/src`
+   - `chmod 755 wsgi.py`
+   - Now we just need to start the gunicorn service we created:
+   - `sudo systemctl start gunicorn`
+5. **Set up nginx**
+   - Install nginx:
+   - `sudo apt install nginx`
+   - Create a symlink between the config repo and nginx:
+   - `sudo ln -s /opt/ODM2DataSharingPortalConfig/nginx/staging_data_environdiy /etc/nginx/sites-enabled/ODM2DataSharingPortal`
+   - Test nginx:
+   - `sudo nginx -t`
+   - If there were no errors during the test, start nginx, which should make the site accessible online.
+   - `sudo systemctl start nginx`
+
+6. **Set up SSL certificate**
+
+   - The following commands are taken verbatim from https://certbot.eff.org/lets-encrypt/ubuntufocal-nginx.html:
+
+   - Install the current version of certbot:
+   - `sudo snap install core; sudo snap refresh core`
+   - `sudo snap install --classic certbot`
+   - `sudo ln -s /snap/bin/certbot /usr/bin/certbot`
+   - Install the certificate:
+   - `sudo certbot --nginx` (enter domain name `staging.monitormywatershed.org` when prompted)
+   - Verify installation and auto-renewal:
+   - `sudo certbot certificates`
+   - `sudo certbot renew --dry-run`
+
+---
+
+## Python 2.7
+
 To deploy an instance of the Data Sharing Portal, follow the [Digital Ocean Ubuntu deployment guide](https://www.digitalocean.com/community/tutorials/how-to-set-up-django-with-postgres-nginx-and-gunicorn-on-ubuntu-16-04). This deployment guide is an outline with specific Data Sharing Portal configuration steps.
diff --git a/environment.yml b/environment.yml
new file mode 100644
index 00000000..b89a41d0
--- /dev/null
+++ b/environment.yml
@@ -0,0 +1,50 @@
+# Development environment for ODM2DataSharingPortal
+name: ODM2DataSharingPortal
+channels:
+  - conda-forge
+  - defaults
+
+dependencies:
+  # For ODM2DataSharingPortal migration to AWS
+  - python =3.8.10  # May 3, 2021: final regular Py 3.8 release. https://www.python.org/downloads/release/python-3810/
+  # - django =2.2.*  # Installs 2.2.14 but latest is 2.2.24. Use pip to install.
+  #   https://docs.djangoproject.com/en/3.2/releases
+
+  # # Other Requirements
+  # - beautifulsoup4 =4.9.3  # with python 3.8
+  - coverage >=5.5
+  - google-api-python-client >=2.12.0
+  - hs_restclient >=1.3.7  # https://github.com/hydroshare/hs_restclient
+  - markdown >=3.3.4
+  # - oauthlib >=3.1.1  # with google api
+  - pandas >=1.3
+  - psycopg2 >=2.9.1
+  - python-crontab >=2.5.1
+  # - requests  # with python 3.8
+  # - six  # with python 3.8
+
+
+  # Dev tools
+  - python-language-server
+
+  # package management
+  - conda
+  - conda-build
+  - pip
+
+  # Dependency versions not available on conda-forge
+  - pip:
+    # - codegen >=1.0  # necessary? Last updated in 2012. https://pypi.org/project/codegen/
+    - django ==2.2.24
+    - django-admin-select2  # >=1.0.1
+    - django-debug-toolbar  # >=1.11.1
+    - django-discover-runner  # >=1.0
+    - django-reset-migrations  # >=0.3.1
+    - django-webtest  # >=1.8.0
+    - django-widget-tweaks  # >=1.4.1
+    - djangorestframework
+    # - patterns >=0.3  # necessary? Last updated in 2014.
https://pypi.org/project/patterns/
+    # - sqlparse  # with django=2.2
+    # - waitress  # with django extensions
+    # - WebOb  # with django extensions
+    # - WebTest  # with django extensions
diff --git a/release_commands.txt b/release_commands.txt
deleted file mode 100644
index e69de29b..00000000
diff --git a/requirements.txt b/requirements.txt
deleted file mode 100644
index 55f88e81..00000000
--- a/requirements.txt
+++ /dev/null
@@ -1,28 +0,0 @@
-beautifulsoup4==4.5.1
-codegen==1.0
-coverage==4.2
-Django==1.11.28
-django-debug-toolbar==1.6
-django-discover-runner==1.0
-django-webtest==1.8.0
-django-widget-tweaks==1.4.1
-djangorestframework==3.9.1
-Markdown==2.6.7
-patterns==0.3
-psycopg2==2.7.1
-six==1.10.0
-sqlparse==0.2.2
-waitress==1.4.3
-WebOb==1.6.2
-WebTest==2.0.23
-requests==2.20.0
-hs_restclient==1.2.10
-enum==0.4.6
-unicodecsv==0.14.1
-python-crontab==2.2.8
-oauthlib==2.0.*
-django-reset-migrations==0.3.1
-google-api-python-client==1.6.7
-django-admin-select2==1.0.1
-pandas==0.23.4
-influxdb==5.2.1
diff --git a/src/WebSDL/settings/base.py b/src/WebSDL/settings/base.py
index f9f1b634..8720e93e 100644
--- a/src/WebSDL/settings/base.py
+++ b/src/WebSDL/settings/base.py
@@ -61,7 +61,8 @@
     'django.contrib.staticfiles',
     'widget_tweaks',
     'requests',
-    'reset_migrations'
+    'reset_migrations',
+    'timeseries_visualization'
 ]
 
 MIDDLEWARE = [
@@ -70,7 +71,6 @@
     'django.middleware.common.CommonMiddleware',
     'django.middleware.csrf.CsrfViewMiddleware',
     'django.contrib.auth.middleware.AuthenticationMiddleware',
-    'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
     'django.contrib.messages.middleware.MessageMiddleware',
     'django.middleware.clickjacking.XFrameOptionsMiddleware',
     'hydroshare_util.middleware.AuthMiddleware',
@@ -104,7 +104,7 @@
     },
 ]
 
-WSGI_APPLICATION = 'WebSDL.wsgi.application'
+#WSGI_APPLICATION = 'WebSDL.wsgi.application'
 
 # Database
@@ -123,8 +123,6 @@
     'TEST': database['test'] if 'test' in database else {},
 }
 
-INFLUX_CONNECTION =
data['influx_connection']
-
 # Password validation
 # https://docs.djangoproject.com/en/1.9/ref/settings/#auth-password-validators
@@ -148,13 +146,9 @@
 # https://docs.djangoproject.com/en/1.9/topics/i18n/
 
 LANGUAGE_CODE = 'en-us'
-
 USE_I18N = True
-
 USE_L10N = True
-
 LOGIN_URL = '/login/'
-
 DATABASE_ROUTERS = ['WebSDL.db_routers.WebSDLRouter']
@@ -165,32 +159,20 @@
 # SECURE_SSL_REDIRECT = True
 
 RECAPTCHA_KEY = data["recaptcha_secret_key"] if "recaptcha_secret_key" in data else ""
-
 RECAPTCHA_USER_KEY = data["recaptcha_user_key"] if "recaptcha_user_key" in data else ""
-
 RECAPTCHA_VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"
 
 EMAIL_SENDER = data['password_email_sender'] if 'password_email_sender' in data else '',
-
 NOTIFY_EMAIL = data['notify_email_sender'] if 'notify_email_sender' in data else ''
-
 DEFAULT_FROM_EMAIL = EMAIL_SENDER[0] if isinstance(EMAIL_SENDER, tuple) else EMAIL_SENDER
-
 NOTIFY_EMAIL_SENDER = NOTIFY_EMAIL[0] if isinstance(NOTIFY_EMAIL, tuple) else NOTIFY_EMAIL
-
 EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
-
 EMAIL_SERVER = data['email_host'] if 'email_host' in data else '',
-
 EMAIL_HOST = EMAIL_SERVER[0] if isinstance(EMAIL_SERVER, tuple) else EMAIL_SERVER
-
 EMAIL_HOST_USER = data['email_user'] if 'email_user' in data else ''
-
 EMAIL_HOST_PASSWORD = data['email_password'] if 'email_password' in data else ''
-
 EMAIL_USE_TLS = True
-
 DATETIME_FORMAT = "N j, Y g:i a"
 
 HYDROSHARE_UTIL_CONFIG = {
@@ -199,26 +181,16 @@
     'REDIRECT_URI': data['hydroshare_oauth']['redirect_uri']
 }
 
-INFLUX_URL_QUERY = data['influx_query']
-
-INFLUX_UPDATE_URL = data['influx_updater_query']['url']
-
-INFLUX_UPDATE_BODY = data['influx_updater_query']['body']
-
 # This data period is measured in days
 SENSOR_DATA_PERIOD = data['sensor_data_period'] if 'sensor_data_period' in data else '2'
 
-TSA_URL = data['tsa_url'] if 'tsa_url' in data else ''
-
 # crontab job settings
 CRONTAB_USER = data.get('crontab_user', getpass.getuser())
-
 CRONTAB_LOGFILE_PATH =
data.get('crontab_log_file', '/var/log/odm2websdl-cron.log')
-
 CRONTAB_EXECUTE_DAILY_AT_HOUR = 5
 
 GOOGLE_API_CONF = data.get('google_api_conf', None)
 
 AUTH_USER_MODEL = 'accounts.User'
 
-DEBUG = True if 'debug_mode' in data and data['debug_mode'] == "True" else False
+DEBUG = True if 'debug_mode' in data and data['debug_mode'] == "True" else False
\ No newline at end of file
diff --git a/src/WebSDL/urls.py b/src/WebSDL/urls.py
index 9af9ec7a..2d33379a 100644
--- a/src/WebSDL/urls.py
+++ b/src/WebSDL/urls.py
@@ -17,12 +17,13 @@
 from django.conf.urls import url, include
 from django.contrib import admin
 from django.contrib.auth import views as auth_views
-from django.core.urlresolvers import reverse_lazy
+from django.urls import reverse_lazy
 
 from accounts.views import UserRegistrationView, UserUpdateView
 
-BASE_URL = settings.SITE_URL[1:]
+#BASE_URL = settings.SITE_URL[1:]
+BASE_URL = ''
 
 login_configuration = {
     'redirect_field_name': 'next'
@@ -41,19 +42,20 @@
 }
 
 urlpatterns = [
-    url(r'^' + BASE_URL + 'password-reset/$', auth_views.password_reset, password_reset_configuration, name='password_reset'),
-    url(r'^' + BASE_URL + 'password-reset/done/$', auth_views.password_reset_done, name='password_reset_done'),
-    url(r'^' + BASE_URL + 'password-reset/(?P<uidb64>[0-9A-Za-z]+)-(?P<token>.+)/$', auth_views.password_reset_confirm, password_done_configuration, name='password_reset_confirm'),
-    url(r'^' + BASE_URL + 'password-reset/completed/$', auth_views.password_reset_complete, name='password_reset_complete'),
+    url(r'^' + BASE_URL + 'password-reset/$', auth_views.PasswordResetView.as_view(), password_reset_configuration, name='password_reset'),
+    url(r'^' + BASE_URL + 'password-reset/done/$', auth_views.PasswordResetDoneView.as_view(), name='password_reset_done'),
+    url(r'^' + BASE_URL + 'password-reset/(?P<uidb64>[0-9A-Za-z]+)-(?P<token>.+)/$', auth_views.PasswordResetConfirmView.as_view(), password_done_configuration, name='password_reset_confirm'),
+    url(r'^' + BASE_URL + 'password-reset/completed/$',
auth_views.PasswordResetCompleteView.as_view(), name='password_reset_complete'),
     url(r'^' + BASE_URL + 'admin/', admin.site.urls),
-    url(r'^' + BASE_URL + 'login/$', auth_views.login, login_configuration, name='login'),
-    url(r'^' + BASE_URL + 'logout/$', auth_views.logout, logout_configuration, name='logout'),
+    url(r'^' + BASE_URL + 'login/$', auth_views.LoginView.as_view(), login_configuration, name='login'),
+    url(r'^' + BASE_URL + 'logout/$', auth_views.LogoutView.as_view(), logout_configuration, name='logout'),
     url(r'^' + BASE_URL + 'register/$', UserRegistrationView.as_view(), name='user_registration'),
     url(r'^' + BASE_URL + 'account/$', UserUpdateView.as_view(), name='user_account'),
     url(r'^' + BASE_URL + 'api-auth/', include('rest_framework.urls', namespace='rest_framework')),
     url(r'^' + BASE_URL + 'hydroshare/', include('hydroshare.urls', namespace='hydroshare')),
     url(BASE_URL, include('dataloaderinterface.urls')),
-    url(BASE_URL, include('dataloaderservices.urls'))
+    url(BASE_URL, include('dataloaderservices.urls')),
+    url(BASE_URL, include('timeseries_visualization.urls'))
 ]
 
 # if settings.DEBUG:
diff --git a/src/dataloader/models.py b/src/dataloader/models.py
index 644136c5..d14664ff 100644
--- a/src/dataloader/models.py
+++ b/src/dataloader/models.py
@@ -3,6 +3,10 @@
 import inspect
 import sys
 import uuid
+from django.db.models.deletion import CASCADE
+from django.db.models.fields.related import OneToOneField
+
+from django.utils.tree import Node
 
 from dataloader.querysets import AffiliationQuerySet, RelatedActionManager, ResultManager, \
     DataLoggerFileManager, InstrumentOutputVariableManager, \
@@ -50,7 +54,7 @@ class Meta:
 @python_2_unicode_compatible
 class AnnotationBridge(models.Model):
     bridge_id = models.AutoField(db_column='bridgeid', primary_key=True)
-    annotation = models.ForeignKey('Annotation', db_column='annotationid')
+    annotation = models.ForeignKey('Annotation', db_column='annotationid', on_delete=models.CASCADE)
 
     def __str__(self):
         return '%s' %
self.annotation @@ -62,7 +66,7 @@ class Meta: @python_2_unicode_compatible class ExtensionPropertyBridge(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - property = models.ForeignKey('ExtensionProperty', db_column='propertyid') + property = models.ForeignKey('ExtensionProperty', db_column='propertyid', on_delete=models.CASCADE) property_value = models.CharField(db_column='propertyvalue', max_length=255) def __str__(self): @@ -75,7 +79,7 @@ class Meta: @python_2_unicode_compatible class ExternalIdentifierBridge(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - external_identifier_system = models.ForeignKey('ExternalIdentifierSystem', db_column='externalidentifiersystemid') + external_identifier_system = models.ForeignKey('ExternalIdentifierSystem', db_column='externalidentifiersystemid', on_delete=models.CASCADE) def __str__(self): return '%s' % self.external_identifier_system @@ -87,7 +91,7 @@ class Meta: @python_2_unicode_compatible class ObjectRelation(models.Model): relation_id = models.AutoField(db_column='relationid', primary_key=True) - relationship_type = models.ForeignKey('RelationshipType', db_column='relationshiptypecv') + relationship_type = models.ForeignKey('RelationshipType', db_column='relationshiptypecv', on_delete=models.CASCADE) def __str__(self): return '%s' % self.relationship_type_id @@ -100,8 +104,8 @@ class Meta: @python_2_unicode_compatible class ExtendedResult(models.Model): - result = models.OneToOneField('Result', db_column='resultid', primary_key=True) - spatial_reference = models.ForeignKey('SpatialReference', db_column='spatialreferenceid', blank=True, null=True) + result = models.OneToOneField('Result', db_column='resultid', on_delete=models.CASCADE, primary_key=True) + spatial_reference = models.ForeignKey('SpatialReference', db_column='spatialreferenceid', on_delete=models.CASCADE, blank=True, null=True) def __str__(self): return '%s' % self.result @@ -136,7 
+140,7 @@ class Meta: @python_2_unicode_compatible class ResultValueAnnotation(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - annotation = models.ForeignKey('Annotation', db_column='annotationid') + annotation = models.ForeignKey('Annotation', db_column='annotationid', on_delete=models.CASCADE) def __str__(self): return '%s %s' % (self.value_datetime, self.data_value) @@ -151,7 +155,7 @@ class Meta: class AggregatedComponent(models.Model): - aggregation_statistic = models.ForeignKey('AggregationStatistic', db_column='aggregationstatisticcv') + aggregation_statistic = models.ForeignKey('AggregationStatistic', db_column='aggregationstatisticcv', on_delete=models.CASCADE) class Meta: abstract = True @@ -159,7 +163,7 @@ class Meta: class TimeAggregationComponent(models.Model): time_aggregation_interval = models.FloatField(db_column='timeaggregationinterval') - time_aggregation_interval_unit = models.ForeignKey('Unit', related_name='+', db_column='timeaggregationintervalunitsid', blank=True, null=True) + time_aggregation_interval_unit = models.ForeignKey('Unit', related_name='+', db_column='timeaggregationintervalunitsid', on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -167,7 +171,7 @@ class Meta: class XOffsetComponent(models.Model): x_location = models.FloatField(db_column='xlocation') - x_location_unit = models.ForeignKey('Unit', related_name='+', db_column='xlocationunitsid', blank=True, null=True) + x_location_unit = models.ForeignKey('Unit', related_name='+', db_column='xlocationunitsid', on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -175,7 +179,7 @@ class Meta: class YOffsetComponent(models.Model): y_location = models.FloatField(db_column='ylocation') - y_location_unit = models.ForeignKey('Unit', related_name='+', db_column='ylocationunitsid', blank=True, null=True) + y_location_unit = models.ForeignKey('Unit', related_name='+', db_column='ylocationunitsid', 
on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -183,7 +187,7 @@ class Meta: class ZOffsetComponent(models.Model): z_location = models.FloatField(db_column='zlocation') - z_location_unit = models.ForeignKey('Unit', related_name='+', db_column='zlocationunitsid', blank=True, null=True) + z_location_unit = models.ForeignKey('Unit', related_name='+', db_column='zlocationunitsid', on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -191,7 +195,7 @@ class Meta: class XIntendedComponent(models.Model): intended_x_spacing = models.FloatField(db_column='intendedxspacing') - intended_x_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedxspacingunitsid', blank=True, null=True) + intended_x_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedxspacingunitsid', on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -199,7 +203,7 @@ class Meta: class YIntendedComponent(models.Model): intended_y_spacing = models.FloatField(db_column='intendedyspacing', blank=True, null=True) - intended_y_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedyspacingunitsid', blank=True, null=True) + intended_y_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedyspacingunitsid',on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -207,7 +211,7 @@ class Meta: class ZIntendedComponent(models.Model): intended_z_spacing = models.FloatField(db_column='intendedzspacing', blank=True, null=True) - intended_z_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedzspacingunitsid', blank=True, null=True) + intended_z_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedzspacingunitsid',on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True @@ -215,15 +219,15 @@ class Meta: class TimeIntendedComponent(models.Model): 
intended_time_spacing = models.FloatField(db_column='intendedtimespacing', blank=True, null=True) - intended_time_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedtimespacingunitsid', blank=True, null=True) + intended_time_spacing_unit = models.ForeignKey('Unit', related_name='+', db_column='intendedtimespacingunitsid',on_delete=models.CASCADE, blank=True, null=True) class Meta: abstract = True class QualityControlComponent(models.Model): - censor_code = models.ForeignKey('CensorCode', db_column='censorcodecv') - quality_code = models.ForeignKey('QualityCode', db_column='qualitycodecv') + censor_code = models.ForeignKey('CensorCode', db_column='censorcodecv',on_delete=models.CASCADE) + quality_code = models.ForeignKey('QualityCode', db_column='qualitycodecv',on_delete=models.CASCADE) class Meta: abstract = True @@ -420,12 +424,12 @@ class Meta: @python_2_unicode_compatible class Organization(ODM2Model): organization_id = models.AutoField(db_column='organizationid', primary_key=True) - organization_type = models.ForeignKey('OrganizationType', db_column='organizationtypecv') + organization_type = models.ForeignKey('OrganizationType', db_column='organizationtypecv', on_delete=models.CASCADE) organization_code = models.CharField(db_column='organizationcode', max_length=50, unique=True) organization_name = models.CharField(db_column='organizationname', max_length=255) organization_description = models.CharField(db_column='organizationdescription', blank=True, max_length=500) organization_link = models.CharField(db_column='organizationlink', blank=True, max_length=255) - parent_organization = models.ForeignKey('self', db_column='parentorganizationid', blank=True, null=True) + parent_organization = models.ForeignKey('self', db_column='parentorganizationid', blank=True, null=True, on_delete=models.CASCADE) people = models.ManyToManyField('People', through='Affiliation') @@ -447,8 +451,8 @@ class Meta: @python_2_unicode_compatible class 
Affiliation(ODM2Model): affiliation_id = models.AutoField(db_column='affiliationid', primary_key=True) - person = models.ForeignKey('People', related_name='affiliations', db_column='personid') - organization = models.ForeignKey('Organization', related_name='affiliations', db_column='organizationid', blank=True, null=True) + person = models.ForeignKey('People', related_name='affiliations', db_column='personid', on_delete=models.CASCADE) + organization = models.ForeignKey('Organization', related_name='affiliations', db_column='organizationid', on_delete=models.CASCADE, blank=True, null=True) is_primary_organization_contact = models.NullBooleanField(db_column='isprimaryorganizationcontact', default=None) affiliation_start_date = models.DateField(db_column='affiliationstartdate') affiliation_end_date = models.DateField(db_column='affiliationenddate', blank=True, null=True) @@ -480,12 +484,12 @@ class Meta: @python_2_unicode_compatible class Method(ODM2Model): method_id = models.AutoField(db_column='methodid', primary_key=True) - method_type = models.ForeignKey('MethodType', db_column='methodtypecv') + method_type = models.ForeignKey('MethodType', db_column='methodtypecv', on_delete=models.CASCADE) method_code = models.CharField(db_column='methodcode', max_length=50) method_name = models.CharField(db_column='methodname', max_length=255) method_description = models.CharField(db_column='methoddescription', blank=True, max_length=500) method_link = models.CharField(db_column='methodlink', blank=True, max_length=255) - organization = models.ForeignKey('Organization', db_column='organizationid', blank=True, null=True) + organization = models.ForeignKey('Organization', db_column='organizationid', on_delete=models.CASCADE, blank=True, null=True) annotations = models.ManyToManyField('Annotation', related_name='annotated_methods', through='MethodAnnotation') extension_property_values = models.ManyToManyField('ExtensionProperty', related_name='methods', 
through='MethodExtensionPropertyValue') @@ -507,8 +511,8 @@ class Meta: @python_2_unicode_compatible class Action(ODM2Model): action_id = models.AutoField(db_column='actionid', primary_key=True) - action_type = models.ForeignKey('ActionType', db_column='actiontypecv') - method = models.ForeignKey('Method', db_column='methodid') + action_type = models.ForeignKey('ActionType', db_column='actiontypecv', on_delete=models.CASCADE) + method = models.ForeignKey('Method', db_column='methodid', on_delete=CASCADE) begin_datetime = models.DateTimeField(db_column='begindatetime') begin_datetime_utc_offset = models.IntegerField(db_column='begindatetimeutcoffset') end_datetime = models.DateTimeField(db_column='enddatetime', blank=True, null=True) @@ -541,8 +545,8 @@ class Meta: @python_2_unicode_compatible class ActionBy(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - action = models.ForeignKey('Action', related_name="action_by", db_column='actionid') - affiliation = models.ForeignKey('Affiliation', db_column='affiliationid') + action = models.ForeignKey('Action', related_name="action_by", db_column='actionid', on_delete=models.CASCADE) + affiliation = models.ForeignKey('Affiliation', db_column='affiliationid', on_delete=models.CASCADE) is_action_lead = models.BooleanField(db_column='isactionlead', default=None) role_description = models.CharField(db_column='roledescription', blank=True, max_length=255) @@ -564,13 +568,13 @@ class Meta: class SamplingFeature(models.Model): sampling_feature_id = models.AutoField(db_column='samplingfeatureid', primary_key=True) sampling_feature_uuid = models.UUIDField(default=uuid.uuid4, editable=False, db_column='samplingfeatureuuid') - sampling_feature_type = models.ForeignKey('SamplingFeatureType', db_column='samplingfeaturetypecv') + sampling_feature_type = models.ForeignKey('SamplingFeatureType', db_column='samplingfeaturetypecv', on_delete=models.CASCADE) sampling_feature_code = 
models.CharField(db_column='samplingfeaturecode', max_length=50, unique=True) sampling_feature_name = models.CharField(db_column='samplingfeaturename', blank=True, max_length=255) sampling_feature_description = models.CharField(db_column='samplingfeaturedescription', blank=True, max_length=500) - sampling_feature_geo_type = models.ForeignKey('SamplingFeatureGeoType', db_column='samplingfeaturegeotypecv', blank=True, null=True) + sampling_feature_geo_type = models.ForeignKey('SamplingFeatureGeoType', db_column='samplingfeaturegeotypecv', on_delete=models.CASCADE, blank=True, null=True) elevation_m = models.FloatField(db_column='elevation_m', blank=True, null=True) - elevation_datum = models.ForeignKey('ElevationDatum', db_column='elevationdatumcv', blank=True, null=True) + elevation_datum = models.ForeignKey('ElevationDatum', db_column='elevationdatumcv', on_delete=models.CASCADE, blank=True, null=True) feature_geometry = models.BinaryField(db_column='featuregeometry', blank=True, null=True) actions = models.ManyToManyField('Action', related_name='sampling_features', through='FeatureAction') @@ -602,8 +606,8 @@ class Meta: @python_2_unicode_compatible class FeatureAction(models.Model): feature_action_id = models.AutoField(db_column='featureactionid', primary_key=True) - sampling_feature = models.ForeignKey('SamplingFeature', related_name="feature_actions", db_column='samplingfeatureid') - action = models.ForeignKey('Action', related_name="feature_actions", db_column='actionid') + sampling_feature = models.ForeignKey('SamplingFeature', related_name="feature_actions", db_column='samplingfeatureid', on_delete=models.CASCADE) + action = models.ForeignKey('Action', related_name="feature_actions", db_column='actionid', on_delete=models.CASCADE) objects = FeatureActionQuerySet.as_manager() @@ -623,7 +627,7 @@ class Meta: class DataSet(models.Model): data_set_id = models.AutoField(db_column='datasetid', primary_key=True) data_set_uuid = models.UUIDField(default=uuid.uuid4, 
editable=False, db_column='datasetuuid') - data_set_type = models.ForeignKey('DataSetType', db_column='datasettypecv') + data_set_type = models.ForeignKey('DataSetType', db_column='datasettypecv', on_delete=models.CASCADE) data_set_code = models.CharField(db_column='datasetcode', max_length=50) data_set_title = models.CharField(db_column='datasettitle', max_length=255) data_set_abstract = models.CharField(db_column='datasetabstract', max_length=500) @@ -662,8 +666,8 @@ class Meta: class RelatedAction(ObjectRelation): - action = models.ForeignKey('Action', related_name='related_actions', db_column='actionid') - related_action = models.ForeignKey('Action', related_name='reverse_related_actions', db_column='relatedactionid') + action = models.ForeignKey('Action', related_name='related_actions', db_column='actionid', on_delete=models.CASCADE) + related_action = models.ForeignKey('Action', related_name='reverse_related_actions', db_column='relatedactionid', on_delete=models.CASCADE) objects = RelatedActionManager() @@ -683,11 +687,11 @@ class Meta: @python_2_unicode_compatible class TaxonomicClassifier(models.Model): taxonomic_classifier_id = models.AutoField(db_column='taxonomicclassifierid', primary_key=True) - taxonomic_classifier_type = models.ForeignKey('TaxonomicClassifierType', db_column='taxonomicclassifiertypecv') + taxonomic_classifier_type = models.ForeignKey('TaxonomicClassifierType', db_column='taxonomicclassifiertypecv', on_delete=models.CASCADE) taxonomic_classifier_name = models.CharField(db_column='taxonomicclassifiername', max_length=255) taxonomic_classifier_common_name = models.CharField(db_column='taxonomicclassifiercommonname', blank=True, max_length=255) taxonomic_classifier_description = models.CharField(db_column='taxonomicclassifierdescription', blank=True, max_length=500) - parent_taxonomic_classifier = models.ForeignKey('self', db_column='parenttaxonomicclassifierid', blank=True, null=True) + parent_taxonomic_classifier = 
models.ForeignKey('self', db_column='parenttaxonomicclassifierid', on_delete=models.CASCADE, blank=True, null=True) external_identifiers = models.ManyToManyField('ExternalIdentifierSystem', related_name='taxonomic_classifier', through='TaxonomicClassifierExternalIdentifier') @@ -708,7 +712,7 @@ class Meta: @python_2_unicode_compatible class Unit(models.Model): unit_id = models.AutoField(db_column='unitsid', primary_key=True) - unit_type = models.ForeignKey('UnitsType', db_column='unitstypecv') + unit_type = models.ForeignKey('UnitsType', db_column='unitstypecv', on_delete=models.CASCADE) unit_abbreviation = models.CharField(db_column='unitsabbreviation', max_length=255) unit_name = models.CharField(db_column='unitsname', max_length=255) unit_link = models.CharField(db_column='unitslink', blank=True, max_length=255) @@ -729,11 +733,11 @@ class Meta: @python_2_unicode_compatible class Variable(models.Model): variable_id = models.AutoField(db_column='variableid', primary_key=True) - variable_type = models.ForeignKey('VariableType', db_column='variabletypecv') + variable_type = models.ForeignKey('VariableType', db_column='variabletypecv', on_delete=models.CASCADE) variable_code = models.CharField(db_column='variablecode', max_length=50) - variable_name = models.ForeignKey('VariableName', db_column='variablenamecv') + variable_name = models.ForeignKey('VariableName', db_column='variablenamecv', on_delete=models.CASCADE) variable_definition = models.CharField(db_column='variabledefinition', blank=True, max_length=500) - speciation = models.ForeignKey('Speciation', db_column='speciationcv', blank=True, null=True) + speciation = models.ForeignKey('Speciation', db_column='speciationcv', on_delete=models.CASCADE, blank=True, null=True) no_data_value = models.FloatField(db_column='nodatavalue') extension_property_values = models.ManyToManyField('ExtensionProperty', related_name='variables', @@ -758,18 +762,18 @@ class Meta: class Result(models.Model): result_id = 
models.AutoField(db_column='resultid', primary_key=True) result_uuid = models.UUIDField(default=uuid.uuid4, editable=False, db_column='resultuuid') - feature_action = models.ForeignKey('FeatureAction', related_name='results', db_column='featureactionid') - result_type = models.ForeignKey('ResultType', db_column='resulttypecv') - variable = models.ForeignKey('Variable', db_column='variableid') - unit = models.ForeignKey('Unit', db_column='unitsid') - taxonomic_classifier = models.ForeignKey('TaxonomicClassifier', db_column='taxonomicclassifierid', blank=True, null=True) - processing_level = models.ForeignKey(ProcessingLevel, db_column='processinglevelid') + feature_action = models.ForeignKey('FeatureAction', related_name='results', db_column='featureactionid', on_delete=models.CASCADE) + result_type = models.ForeignKey('ResultType', db_column='resulttypecv', on_delete=models.CASCADE) + variable = models.ForeignKey('Variable', db_column='variableid', on_delete=models.CASCADE) + unit = models.ForeignKey('Unit', db_column='unitsid', on_delete=models.CASCADE) + taxonomic_classifier = models.ForeignKey('TaxonomicClassifier', db_column='taxonomicclassifierid', on_delete=models.CASCADE, blank=True, null=True) + processing_level = models.ForeignKey(ProcessingLevel, db_column='processinglevelid', on_delete=models.CASCADE) result_datetime = models.DateTimeField(db_column='resultdatetime', blank=True, null=True) result_datetime_utc_offset = models.BigIntegerField(db_column='resultdatetimeutcoffset', blank=True, null=True) valid_datetime = models.DateTimeField(db_column='validdatetime', blank=True, null=True) valid_datetime_utc_offset = models.BigIntegerField(db_column='validdatetimeutcoffset', blank=True, null=True) - status = models.ForeignKey('Status', db_column='statuscv', blank=True) - sampled_medium = models.ForeignKey('Medium', db_column='sampledmediumcv') + status = models.ForeignKey('Status', db_column='statuscv', blank=True, on_delete=models.CASCADE) + sampled_medium 
= models.ForeignKey('Medium', db_column='sampledmediumcv', on_delete=models.CASCADE) value_count = models.IntegerField(db_column='valuecount', default=0) data_sets = models.ManyToManyField('DataSet', related_name='results', through='DataSetResult') @@ -803,7 +807,7 @@ class Meta: @python_2_unicode_compatible class DataLoggerProgramFile(models.Model): program_id = models.AutoField(db_column='programid', primary_key=True) - affiliation = models.ForeignKey('Affiliation', db_column='affiliationid', related_name='data_logger_programs') + affiliation = models.ForeignKey('Affiliation', db_column='affiliationid', on_delete=models.CASCADE, related_name='data_logger_programs') program_name = models.CharField(db_column='programname', max_length=255) program_description = models.CharField(db_column='programdescription', blank=True, max_length=500) program_version = models.CharField(db_column='programversion', blank=True, max_length=50) @@ -824,7 +828,7 @@ class Meta: @python_2_unicode_compatible class DataLoggerFile(models.Model): data_logger_file_id = models.AutoField(db_column='dataloggerfileid', primary_key=True) - program = models.ForeignKey('DataLoggerProgramFile', db_column='programid', related_name='data_logger_files') + program = models.ForeignKey('DataLoggerProgramFile', db_column='programid', on_delete=models.CASCADE, related_name='data_logger_files') data_logger_file_name = models.CharField(db_column='dataloggerfilename', max_length=255) data_logger_file_description = models.CharField(db_column='dataloggerfiledescription', blank=True, max_length=500) data_logger_file_link = models.FileField(db_column='dataloggerfilelink', blank=True) @@ -847,17 +851,17 @@ class Meta: @python_2_unicode_compatible class DataLoggerFileColumn(models.Model): data_logger_file_column_id = models.AutoField(db_column='dataloggerfilecolumnid', primary_key=True) - result = models.ForeignKey('Result', related_name='data_logger_file_columns', db_column='resultid', blank=True, null=True) - 
data_logger_file = models.ForeignKey('DataLoggerFile', related_name='data_logger_file_columns', db_column='dataloggerfileid') - instrument_output_variable = models.ForeignKey('InstrumentOutputVariable', related_name='data_logger_file_columns', db_column='instrumentoutputvariableid') + result = models.ForeignKey('Result', related_name='data_logger_file_columns', db_column='resultid', on_delete=models.CASCADE, blank=True, null=True) + data_logger_file = models.ForeignKey('DataLoggerFile', related_name='data_logger_file_columns', db_column='dataloggerfileid', on_delete=models.CASCADE) + instrument_output_variable = models.ForeignKey('InstrumentOutputVariable', related_name='data_logger_file_columns', db_column='instrumentoutputvariableid', on_delete=models.CASCADE) column_label = models.CharField(db_column='columnlabel', max_length=50) column_description = models.CharField(db_column='columndescription', blank=True, max_length=500) measurement_equation = models.CharField(db_column='measurementequation', blank=True, max_length=255) scan_interval = models.FloatField(db_column='scaninterval', blank=True, null=True) - scan_interval_unit = models.ForeignKey('Unit', related_name='scan_interval_data_logger_file_columns', db_column='scanintervalunitsid', blank=True, null=True) + scan_interval_unit = models.ForeignKey('Unit', related_name='scan_interval_data_logger_file_columns', db_column='scanintervalunitsid', blank=True, null=True, on_delete=models.CASCADE) recording_interval = models.FloatField(db_column='recordinginterval', blank=True, null=True) - recording_interval_unit = models.ForeignKey('Unit', related_name='recording_interval_data_logger_file_columns', db_column='recordingintervalunitsid', blank=True, null=True) - aggregation_statistic = models.ForeignKey('AggregationStatistic', related_name='data_logger_file_columns', db_column='aggregationstatisticcv', blank=True) + recording_interval_unit = models.ForeignKey('Unit', 
related_name='recording_interval_data_logger_file_columns', db_column='recordingintervalunitsid', on_delete=models.CASCADE, blank=True, null=True) + aggregation_statistic = models.ForeignKey('AggregationStatistic', related_name='data_logger_file_columns', db_column='aggregationstatisticcv', on_delete=models.CASCADE, blank=True) def __str__(self): return '%s %s' % (self.column_label, self.column_description) @@ -874,7 +878,7 @@ class Meta: @python_2_unicode_compatible class EquipmentModel(models.Model): equipment_model_id = models.AutoField(db_column='equipmentmodelid', primary_key=True) - model_manufacturer = models.ForeignKey('Organization', db_column='modelmanufacturerid') + model_manufacturer = models.ForeignKey('Organization', db_column='modelmanufacturerid', on_delete=models.CASCADE) model_part_number = models.CharField(db_column='modelpartnumber', blank=True, max_length=50) model_name = models.CharField(db_column='modelname', max_length=255) model_description = models.CharField(db_column='modeldescription', blank=True, max_length=500) @@ -904,12 +908,12 @@ class Meta: @python_2_unicode_compatible class InstrumentOutputVariable(models.Model): instrument_output_variable_id = models.AutoField(db_column='instrumentoutputvariableid', primary_key=True) - model = models.ForeignKey('EquipmentModel', related_name='instrument_output_variables', db_column='modelid') - variable = models.ForeignKey('Variable', related_name='instrument_output_variables', db_column='variableid') - instrument_method = models.ForeignKey('Method', related_name='instrument_output_variables', db_column='instrumentmethodid') + model = models.ForeignKey('EquipmentModel', related_name='instrument_output_variables', db_column='modelid', on_delete=models.CASCADE) + variable = models.ForeignKey('Variable', related_name='instrument_output_variables', db_column='variableid', on_delete=models.CASCADE) + instrument_method = models.ForeignKey('Method', related_name='instrument_output_variables', 
db_column='instrumentmethodid', on_delete=models.CASCADE) instrument_resolution = models.CharField(db_column='instrumentresolution', blank=True, max_length=255) instrument_accuracy = models.CharField(db_column='instrumentaccuracy', blank=True, max_length=255) - instrument_raw_output_unit = models.ForeignKey('Unit', related_name='instrument_output_variables', db_column='instrumentrawoutputunitsid') + instrument_raw_output_unit = models.ForeignKey('Unit', related_name='instrument_output_variables', db_column='instrumentrawoutputunitsid', on_delete=models.CASCADE) objects = InstrumentOutputVariableManager() @@ -936,11 +940,11 @@ class Equipment(models.Model): equipment_id = models.AutoField(db_column='equipmentid', primary_key=True) equipment_code = models.CharField(db_column='equipmentcode', max_length=50) equipment_name = models.CharField(db_column='equipmentname', max_length=255) - equipment_type = models.ForeignKey('EquipmentType', db_column='equipmenttypecv') - equipment_model = models.ForeignKey('EquipmentModel', related_name='equipment', db_column='equipmentmodelid') + equipment_type = models.ForeignKey('EquipmentType', db_column='equipmenttypecv', on_delete=models.CASCADE) + equipment_model = models.ForeignKey('EquipmentModel', related_name='equipment', db_column='equipmentmodelid', on_delete=models.CASCADE) equipment_serial_number = models.CharField(db_column='equipmentserialnumber', max_length=50) - equipment_owner = models.ForeignKey('People', related_name='owned_equipment', db_column='equipmentownerid') - equipment_vendor = models.ForeignKey('Organization', related_name='equipment', db_column='equipmentvendorid') + equipment_owner = models.ForeignKey('People', related_name='owned_equipment', db_column='equipmentownerid', on_delete=models.CASCADE) + equipment_vendor = models.ForeignKey('Organization', related_name='equipment', db_column='equipmentvendorid', on_delete=models.CASCADE) equipment_purchase_date = 
models.DateTimeField(db_column='equipmentpurchasedate') equipment_purchase_order_number = models.CharField(db_column='equipmentpurchaseordernumber', blank=True, max_length=50) equipment_description = models.CharField(db_column='equipmentdescription', blank=True, max_length=500) @@ -970,8 +974,8 @@ class Meta: @python_2_unicode_compatible class CalibrationReferenceEquipment(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - action = models.ForeignKey('CalibrationAction', related_name='+', db_column='actionid') - equipment = models.ForeignKey('Equipment', related_name='+', db_column='equipmentid') + action = models.ForeignKey('CalibrationAction', related_name='+', db_column='actionid', on_delete=models.CASCADE) + equipment = models.ForeignKey('Equipment', related_name='+', db_column='equipmentid', on_delete=models.CASCADE) objects = CalibrationReferenceEquipmentManager() @@ -990,8 +994,8 @@ class Meta: @python_2_unicode_compatible class EquipmentUsed(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - action = models.ForeignKey('Action', related_name='+', db_column='actionid') - equipment = models.ForeignKey('Equipment', related_name='+', db_column='equipmentid') + action = models.ForeignKey('Action', related_name='+', db_column='actionid', on_delete=models.CASCADE) + equipment = models.ForeignKey('Equipment', related_name='+', db_column='equipmentid', on_delete=models.CASCADE) objects = EquipmentUsedManager() @@ -1009,7 +1013,7 @@ class Meta: @python_2_unicode_compatible class MaintenanceAction(models.Model): - action = models.OneToOneField(Action, related_name='maintenance', db_column='actionid', primary_key=True) + action = models.OneToOneField(Action, related_name='maintenance', db_column='actionid', on_delete=models.CASCADE, primary_key=True) is_factory_service = models.BooleanField(db_column='isfactoryservice', default=None) maintenance_code = models.CharField(db_column='maintenancecode', 
blank=True, max_length=50) maintenance_reason = models.CharField(db_column='maintenancereason', blank=True, max_length=500) @@ -1029,8 +1033,8 @@ class Meta: class RelatedEquipment(ObjectRelation): - equipment = models.ForeignKey('Equipment', related_name='related_equipment', db_column='equipmentid') - related_equipment = models.ForeignKey('Equipment', related_name='reverse_related_equipment', db_column='relatedequipmentid') + equipment = models.ForeignKey('Equipment', related_name='related_equipment', db_column='equipmentid', on_delete=models.CASCADE) + related_equipment = models.ForeignKey('Equipment', related_name='reverse_related_equipment', db_column='relatedequipmentid', on_delete=models.CASCADE) relationship_start_datetime = models.DateTimeField(db_column='relationshipstartdatetime') relationship_start_datetime_utc_offset = models.IntegerField(db_column='relationshipstartdatetimeutcoffset') relationship_end_datetime = models.DateTimeField(db_column='relationshipenddatetime', blank=True, null=True) @@ -1053,9 +1057,9 @@ class Meta: @python_2_unicode_compatible class CalibrationAction(models.Model): - action = models.OneToOneField(Action, related_name='calibration', db_column='actionid', primary_key=True) + action = models.OneToOneField(Action, related_name='calibration', db_column='actionid', on_delete=models.CASCADE, primary_key=True) calibration_check_value = models.FloatField(db_column='calibrationcheckvalue', blank=True, null=True) - instrument_output_variable = models.ForeignKey('InstrumentOutputVariable', db_column='instrumentoutputvariableid') + instrument_output_variable = models.ForeignKey('InstrumentOutputVariable', db_column='instrumentoutputvariableid', on_delete=models.CASCADE) calibration_equation = models.CharField(db_column='calibrationequation', blank=True, max_length=255) calibration_standards = models.ManyToManyField('ReferenceMaterial', related_name='calibration_actions', through='CalibrationStandard') @@ -1081,7 +1085,7 @@ class Meta: 
class Directive(models.Model): directive_id = models.AutoField(db_column='directiveid', primary_key=True) - directive_type = models.ForeignKey('DirectiveType', db_column='directivetypecv') + directive_type = models.ForeignKey('DirectiveType', db_column='directivetypecv', on_delete=models.CASCADE) directive_description = models.CharField(db_column='directivedescription', max_length=500) def __repr__(self): @@ -1095,8 +1099,8 @@ class Meta: class ActionDirective(models.Model): bridge_id = models.IntegerField(db_column='bridgeid', primary_key=True) - action = models.ForeignKey('Action', related_name='+', db_column='actionid') - directive = models.ForeignKey('Directive', related_name='+', db_column='directiveid') + action = models.ForeignKey('Action', related_name='+', db_column='actionid', on_delete=models.CASCADE) + directive = models.ForeignKey('Directive', related_name='+', db_column='directiveid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1108,7 +1112,7 @@ class Meta: class SpecimenBatchPosition(models.Model): - feature_action = models.OneToOneField('FeatureAction', db_column='featureactionid', primary_key=True) + feature_action = models.OneToOneField('FeatureAction', db_column='featureactionid', on_delete=models.CASCADE, primary_key=True) batch_position_number = models.IntegerField(db_column='batchpositionnumber') batch_position_label = models.CharField(db_column='batchpositionlabel', blank=True, max_length=255) @@ -1145,9 +1149,9 @@ class Meta: class Specimen(models.Model): - sampling_feature = models.OneToOneField('SamplingFeature', db_column='samplingfeatureid', primary_key=True) - specimen_type = models.ForeignKey('SpecimenType', db_column='specimentypecv') - specimen_medium = models.ForeignKey('Medium', db_column='specimenmediumcv') + sampling_feature = models.OneToOneField('SamplingFeature', db_column='samplingfeatureid', on_delete=models.CASCADE, primary_key=True) + specimen_type = models.ForeignKey('SpecimenType', 
db_column='specimentypecv', on_delete=models.CASCADE) + specimen_medium = models.ForeignKey('Medium', db_column='specimenmediumcv', on_delete=models.CASCADE) is_field_specimen = models.BooleanField(db_column='isfieldspecimen', default=None) def __repr__(self): @@ -1162,13 +1166,13 @@ class Meta: class SpatialOffset(models.Model): spatial_offset_id = models.AutoField(db_column='spatialoffsetid', primary_key=True) - spatial_offset_type = models.ForeignKey('SpatialOffsetType', db_column='spatialoffsettypecv') + spatial_offset_type = models.ForeignKey('SpatialOffsetType', db_column='spatialoffsettypecv', on_delete=models.CASCADE) offset_1_value = models.FloatField(db_column='offset1value') - offset_1_unit = models.ForeignKey('Unit', related_name='+', db_column='offset1unitid') + offset_1_unit = models.ForeignKey('Unit', related_name='+', db_column='offset1unitid', on_delete=models.CASCADE) offset_2_value = models.FloatField(db_column='offset2value', blank=True, null=True) - offset_2_unit = models.ForeignKey('Unit', related_name='+', db_column='offset2unitid', blank=True, null=True) + offset_2_unit = models.ForeignKey('Unit', related_name='+', db_column='offset2unitid', on_delete=models.CASCADE, blank=True, null=True) offset_3_value = models.FloatField(db_column='offset3value', blank=True, null=True) - offset_3_unit = models.ForeignKey('Unit', related_name='+', db_column='offset3unitid', blank=True, null=True) + offset_3_unit = models.ForeignKey('Unit', related_name='+', db_column='offset3unitid', on_delete=models.CASCADE, blank=True, null=True) def __repr__(self): return "" % ( @@ -1180,11 +1184,11 @@ class Meta: class Site(models.Model): - sampling_feature = models.OneToOneField('SamplingFeature', related_name='site', db_column='samplingfeatureid', primary_key=True) - site_type = models.ForeignKey('SiteType', db_column='sitetypecv') + sampling_feature = models.OneToOneField('SamplingFeature', related_name='site', db_column='samplingfeatureid', 
on_delete=models.CASCADE, primary_key=True) + site_type = models.ForeignKey('SiteType', db_column='sitetypecv', on_delete=models.CASCADE) latitude = models.FloatField(db_column='latitude') longitude = models.FloatField(db_column='longitude') - spatial_reference = models.ForeignKey('SpatialReference', db_column='spatialreferenceid') + spatial_reference = models.ForeignKey('SpatialReference', db_column='spatialreferenceid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1196,9 +1200,9 @@ class Meta: class RelatedFeature(ObjectRelation): - sampling_feature = models.ForeignKey('SamplingFeature', related_name='related_features_sampling_feature', db_column='samplingfeatureid') - related_feature = models.ForeignKey('SamplingFeature', related_name='related_features_related_feature', db_column='relatedfeatureid') - spatial_offset = models.ForeignKey('SpatialOffset', db_column='spatialoffsetid', blank=True, null=True) + sampling_feature = models.ForeignKey('SamplingFeature', related_name='related_features_sampling_feature', db_column='samplingfeatureid', on_delete=models.CASCADE) + related_feature = models.ForeignKey('SamplingFeature', related_name='related_features_related_feature', db_column='relatedfeatureid', on_delete=models.CASCADE) + spatial_offset = models.ForeignKey('SpatialOffset', db_column='spatialoffsetid', on_delete=models.CASCADE, blank=True, null=True) def __repr__(self): return "" % ( @@ -1212,9 +1216,9 @@ class Meta: class SpecimenTaxonomicClassifier(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - sampling_feature = models.ForeignKey('Specimen', related_name='taxonomic_classifiers', db_column='samplingfeatureid') - taxonomic_classifier = models.ForeignKey('TaxonomicClassifier', related_name='specimens', db_column='taxonomicclassifierid') - citation = models.ForeignKey('Citation', related_name='specimen_taxonomic_classifiers', db_column='citationid', blank=True, null=True) + sampling_feature = 
models.ForeignKey('Specimen', related_name='taxonomic_classifiers', db_column='samplingfeatureid', on_delete=models.CASCADE) + taxonomic_classifier = models.ForeignKey('TaxonomicClassifier', related_name='specimens', db_column='taxonomicclassifierid', on_delete=models.CASCADE) + citation = models.ForeignKey('Citation', related_name='specimen_taxonomic_classifiers', db_column='citationid', on_delete=models.CASCADE, blank=True, null=True) def __repr__(self): return "" % ( @@ -1248,8 +1252,8 @@ class Meta: class RelatedModel(ObjectRelation): - model = models.ForeignKey('Model', related_name='related_model_model', db_column='modelid') - related_model = models.ForeignKey('Model', related_name='related_model_related_model', db_column='relatedmodelid') + model = models.ForeignKey('Model', related_name='related_model_model', db_column='modelid', on_delete=models.CASCADE) + related_model = models.ForeignKey('Model', related_name='related_model_related_model', db_column='relatedmodelid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1262,7 +1266,7 @@ class Meta: class Simulation(models.Model): simulation_id = models.AutoField(db_column='simulationid', primary_key=True) - action = models.ForeignKey('Action', related_name='simulations', db_column='actionid') + action = models.ForeignKey('Action', related_name='simulations', db_column='actionid', on_delete=models.CASCADE) simulation_name = models.CharField(db_column='simulationname', max_length=255) simulation_description = models.CharField(db_column='simulationdescription', max_length=500, blank=True) simulation_start_datetime = models.DateTimeField(db_column='simulationstartdatetime') @@ -1270,9 +1274,9 @@ class Simulation(models.Model): simulation_end_datetime = models.DateTimeField(db_column='simulationenddatetime') simulation_end_datetime_utc_offset = models.IntegerField(db_column='simulationenddatetimeutcoffset') time_step_value = models.FloatField(db_column='timestepvalue') - time_step_unit = 
models.ForeignKey('Unit', related_name='simulations', db_column='timestepunitsid') - input_data_set = models.ForeignKey('DataSet', related_name='simulations', db_column='inputdatasetid', blank=True, null=True) - model = models.ForeignKey('Model', related_name='simulations', db_column='modelid') + time_step_unit = models.ForeignKey('Unit', related_name='simulations', db_column='timestepunitsid', on_delete=models.CASCADE) + input_data_set = models.ForeignKey('DataSet', related_name='simulations', db_column='inputdatasetid', on_delete=models.CASCADE, blank=True, null=True) + model = models.ForeignKey('Model', related_name='simulations', db_column='modelid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1310,14 +1314,14 @@ class Meta: class Annotation(models.Model): annotation_id = models.AutoField(db_column='annotationid', primary_key=True) - annotation_type = models.ForeignKey('AnnotationType', db_column='annotationtypecv') + annotation_type = models.ForeignKey('AnnotationType', db_column='annotationtypecv', on_delete=models.CASCADE) annotation_code = models.CharField(db_column='annotationcode', blank=True, max_length=50) annotation_text = models.CharField(db_column='annotationtext', max_length=500) annotation_datetime = models.DateTimeField(db_column='annotationdatetime', blank=True, null=True) annotation_utc_offset = models.IntegerField(db_column='annotationutcoffset', blank=True, null=True) annotation_link = models.CharField(db_column='annotationlink', blank=True, max_length=255) - annotator = models.ForeignKey('People', db_column='annotatorid', blank=True, null=True) - citation = models.ForeignKey('Citation', db_column='citationid', blank=True, null=True) + annotator = models.ForeignKey('People', db_column='annotatorid', on_delete=models.CASCADE, blank=True, null=True) + citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE, blank=True, null=True) def __repr__(self): return "" % ( @@ -1330,7 +1334,7 @@ class 
Meta: class ActionAnnotation(AnnotationBridge): - action = models.ForeignKey('Action', related_name='+', db_column='actionid') + action = models.ForeignKey('Action', related_name='+', db_column='actionid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1342,7 +1346,7 @@ class Meta: class EquipmentAnnotation(AnnotationBridge): - equipment = models.ForeignKey('Equipment', related_name='+', db_column='equipmentid') + equipment = models.ForeignKey('Equipment', related_name='+', db_column='equipmentid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1354,7 +1358,7 @@ class Meta: class MethodAnnotation(AnnotationBridge): - method = models.ForeignKey('Method', related_name='+', db_column='methodid') + method = models.ForeignKey('Method', related_name='+', db_column='methodid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1366,7 +1370,7 @@ class Meta: class ResultAnnotation(AnnotationBridge): - result = models.ForeignKey('Result', related_name='dated_annotations', db_column='resultid') + result = models.ForeignKey('Result', related_name='dated_annotations', db_column='resultid', on_delete=models.CASCADE) begin_datetime = models.DateTimeField(db_column='begindatetime') end_datetime = models.DateTimeField(db_column='enddatetime') @@ -1380,7 +1384,7 @@ class Meta: class SamplingFeatureAnnotation(AnnotationBridge): - sampling_feature = models.ForeignKey('SamplingFeature', related_name='+', db_column='samplingfeatureid') + sampling_feature = models.ForeignKey('SamplingFeature', related_name='+', db_column='samplingfeatureid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1397,8 +1401,8 @@ class Meta: class DataSetResult(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - data_set = models.ForeignKey('DataSet', related_name='+', db_column='datasetid') - result = models.ForeignKey('Result', related_name='+', db_column='resultid') + data_set = models.ForeignKey('DataSet', 
related_name='+', db_column='datasetid', on_delete=models.CASCADE) + result = models.ForeignKey('Result', related_name='+', db_column='resultid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1411,10 +1415,10 @@ class Meta: class DataQuality(models.Model): data_quality_id = models.AutoField(db_column='dataqualityid', primary_key=True) - data_quality_type = models.ForeignKey('DataQualityType', db_column='dataqualitytypecv') + data_quality_type = models.ForeignKey('DataQualityType', db_column='dataqualitytypecv', on_delete=models.CASCADE) data_quality_code = models.CharField(db_column='dataqualitycode', max_length=255) data_quality_value = models.FloatField(db_column='dataqualityvalue', blank=True, null=True) - data_quality_value_unit = models.ForeignKey('Unit', db_column='dataqualityvalueunitsid', blank=True, null=True) + data_quality_value_unit = models.ForeignKey('Unit', db_column='dataqualityvalueunitsid', on_delete=models.CASCADE, blank=True, null=True) data_quality_description = models.CharField(db_column='dataqualitydescription', blank=True, max_length=500) data_quality_link = models.CharField(db_column='dataqualitylink', blank=True, max_length=255) @@ -1429,14 +1433,14 @@ class Meta: class ReferenceMaterial(models.Model): reference_material_id = models.AutoField(db_column='referencematerialid', primary_key=True) - reference_material_medium = models.ForeignKey('Medium', db_column='referencematerialmediumcv') - reference_material_organization = models.ForeignKey('Organization', db_column='referencematerialorganizationid') + reference_material_medium = models.ForeignKey('Medium', db_column='referencematerialmediumcv', on_delete=models.CASCADE) + reference_material_organization = models.ForeignKey('Organization', db_column='referencematerialorganizationid', on_delete=models.CASCADE) reference_material_code = models.CharField(db_column='referencematerialcode', max_length=50) reference_material_lot_code = 
models.CharField(db_column='referencemateriallotcode', blank=True, max_length=255) reference_material_purchase_date = models.DateTimeField(db_column='referencematerialpurchasedate', blank=True, null=True) reference_material_expiration_date = models.DateTimeField(db_column='referencematerialexpirationdate', blank=True, null=True) reference_material_certificate_link = models.FileField(db_column='referencematerialcertificatelink', blank=True) # TODO: is it a link or a file link? BOTH - sampling_feature = models.ForeignKey('SamplingFeature', db_column='samplingfeatureid', blank=True, null=True) + sampling_feature = models.ForeignKey('SamplingFeature', db_column='samplingfeatureid', on_delete=models.CASCADE, blank=True, null=True) external_identifiers = models.ManyToManyField('ExternalIdentifierSystem', related_name='reference_materials', through='ReferenceMaterialExternalIdentifier') @@ -1453,8 +1457,8 @@ class Meta: class CalibrationStandard(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - action = models.ForeignKey('CalibrationAction', related_name='+', db_column='actionid') - reference_material = models.ForeignKey('ReferenceMaterial', related_name='+', db_column='calibration_standards') + action = models.ForeignKey('CalibrationAction', related_name='+', db_column='actionid', on_delete=models.CASCADE) + reference_material = models.ForeignKey('ReferenceMaterial', related_name='+', db_column='calibration_standards', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1467,12 +1471,12 @@ class Meta: class ReferenceMaterialValue(models.Model): reference_material_value_id = models.AutoField(db_column='referencematerialvalueid', primary_key=True) - reference_material = models.ForeignKey('ReferenceMaterial', related_name='referencematerialvalue', db_column='referencematerialid') + reference_material = models.ForeignKey('ReferenceMaterial', related_name='referencematerialvalue', db_column='referencematerialid', 
on_delete=models.CASCADE) reference_material_value = models.FloatField(db_column='referencematerialvalue') reference_material_accuracy = models.FloatField(db_column='referencematerialaccuracy', blank=True, null=True) - variable = models.ForeignKey('Variable', db_column='variableid') - unit = models.ForeignKey('Unit', db_column='unitsid') - citation = models.ForeignKey('Citation', db_column='citationid', blank=True, null=True) + variable = models.ForeignKey('Variable', db_column='variableid', on_delete=models.CASCADE) + unit = models.ForeignKey('Unit', db_column='unitsid', on_delete=models.CASCADE) + citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE, blank=True, null=True) def __repr__(self): return "" % ( @@ -1484,8 +1488,8 @@ class Meta: class ResultNormalizationValue(models.Model): - result = models.OneToOneField('Result', db_column='resultid', primary_key=True) - normalized_by_reference_material_value = models.ForeignKey('ReferenceMaterialValue', db_column='normalizedbyreferencematerialvalueid') + result = models.OneToOneField('Result', db_column='resultid', on_delete=models.CASCADE, primary_key=True) + normalized_by_reference_material_value = models.ForeignKey('ReferenceMaterialValue', db_column='normalizedbyreferencematerialvalueid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1499,8 +1503,8 @@ class Meta: class ResultDataQuality(models.Model): bridge_id = models.AutoField(db_column='bridgeid', primary_key=True) - result = models.ForeignKey('Result', related_name='+', db_column='resultid') - data_quality = models.ForeignKey('DataQuality', related_name='+', db_column='dataqualityid') + result = models.ForeignKey('Result', related_name='+', db_column='resultid', on_delete=models.CASCADE) + data_quality = models.ForeignKey('DataQuality', related_name='+', db_column='dataqualityid', on_delete=models.CASCADE) def __repr__(self): return "" % ( @@ -1519,8 +1523,8 @@ class ExtensionProperty(models.Model): 
     property_id = models.AutoField(db_column='propertyid', primary_key=True)
     property_name = models.CharField(db_column='propertyname', max_length=255)
     property_description = models.CharField(db_column='propertydescription', blank=True, max_length=500)
-    property_data_type = models.ForeignKey('PropertyDataType', db_column='propertydatatypecv')
-    property_units = models.ForeignKey('Unit', db_column='propertyunitsid', blank=True, null=True)
+    property_data_type = models.ForeignKey('PropertyDataType', db_column='propertydatatypecv', on_delete=models.CASCADE)
+    property_units = models.ForeignKey('Unit', db_column='propertyunitsid', on_delete=models.CASCADE, blank=True, null=True)

     def __repr__(self):
         return "" % (
@@ -1532,7 +1536,7 @@ class Meta:

 class ActionExtensionPropertyValue(ExtensionPropertyBridge):
-    action = models.ForeignKey('Action', db_column='actionid')
+    action = models.ForeignKey('Action', db_column='actionid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1544,7 +1548,7 @@ class Meta:

 class CitationExtensionPropertyValue(ExtensionPropertyBridge):
-    citation = models.ForeignKey('Citation', db_column='citationid')
+    citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1557,7 +1561,7 @@ class Meta:

 class MethodExtensionPropertyValue(ExtensionPropertyBridge):
-    method = models.ForeignKey('Method', db_column='methodid')
+    method = models.ForeignKey('Method', db_column='methodid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1570,7 +1574,7 @@ class Meta:

 class ResultExtensionPropertyValue(ExtensionPropertyBridge):
-    result = models.ForeignKey('Result', db_column='resultid')
+    result = models.ForeignKey('Result', db_column='resultid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1583,7 +1587,7 @@ class Meta:

 class SamplingFeatureExtensionPropertyValue(ExtensionPropertyBridge):
-    sampling_feature = models.ForeignKey('SamplingFeature', db_column='samplingfeatureid')
+    sampling_feature = models.ForeignKey('SamplingFeature', db_column='samplingfeatureid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1596,7 +1600,7 @@ class Meta:

 class VariableExtensionPropertyValue(ExtensionPropertyBridge):
-    variable = models.ForeignKey('Variable', db_column='variableid')
+    variable = models.ForeignKey('Variable', db_column='variableid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1615,7 +1619,7 @@ class Meta:

 class ExternalIdentifierSystem(models.Model):
     external_identifier_system_id = models.AutoField(db_column='externalidentifiersystemid', primary_key=True)
     external_identifier_system_name = models.CharField(db_column='externalidentifiersystemname', max_length=255)
-    identifier_system_organization = models.ForeignKey('Organization', db_column='identifiersystemorganizationid')
+    identifier_system_organization = models.ForeignKey('Organization', db_column='identifiersystemorganizationid', on_delete=models.CASCADE)
     external_identifier_system_description = models.CharField(db_column='externalidentifiersystemdescription', blank=True, max_length=500)
     external_identifier_system_url = models.CharField(db_column='externalidentifiersystemurl', blank=True, max_length=255)
@@ -1630,7 +1634,7 @@ class Meta:

 class CitationExternalIdentifier(ExternalIdentifierBridge):
-    citation = models.ForeignKey('Citation', db_column='citationid')
+    citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE)
     citation_external_identifier = models.CharField(db_column='citationexternalidentifier', max_length=255)
     citation_external_identifier_uri = models.CharField(db_column='citationexternalidentifieruri', blank=True, max_length=255)
@@ -1644,7 +1648,7 @@ class Meta:

 class MethodExternalIdentifier(ExternalIdentifierBridge):
-    method = models.ForeignKey('Method', db_column='methodid')
+    method = models.ForeignKey('Method', db_column='methodid', on_delete=models.CASCADE)
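Every hunk in this file makes the same change: Django 2.0 made `on_delete` a required argument for `ForeignKey` and `OneToOneField`, so each relationship now states its delete behavior explicitly. As a sketch of what `CASCADE` means (not part of the diff; stdlib `sqlite3`, with table names borrowed from the models above for illustration — note Django emulates the cascade in Python rather than relying on the database constraint):

```python
import sqlite3

# SQL-level analogue of on_delete=models.CASCADE: deleting a parent row
# also removes the rows that reference it.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per-connection
conn.execute("CREATE TABLE methods (methodid INTEGER PRIMARY KEY)")
conn.execute(
    "CREATE TABLE methodextensionpropertyvalues ("
    " bridgeid INTEGER PRIMARY KEY,"
    " methodid INTEGER REFERENCES methods(methodid) ON DELETE CASCADE)"
)
conn.execute("INSERT INTO methods VALUES (1)")
conn.execute("INSERT INTO methodextensionpropertyvalues VALUES (10, 1)")
conn.execute("DELETE FROM methods WHERE methodid = 1")
orphans = conn.execute(
    "SELECT COUNT(*) FROM methodextensionpropertyvalues"
).fetchone()[0]
print(orphans)  # 0: the bridge row went with its parent
```

For nullable relationships like `property_units` above, `models.SET_NULL` would also be a legal choice; this diff opts for `CASCADE` uniformly.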
     method_external_identifier = models.CharField(db_column='methodexternalidentifier', max_length=255)
     method_external_identifier_uri = models.CharField(db_column='methodexternalidentifieruri', blank=True, max_length=255)
@@ -1658,7 +1662,7 @@ class Meta:

 class PersonExternalIdentifier(ExternalIdentifierBridge):
-    person = models.ForeignKey('People', db_column='personid')
+    person = models.ForeignKey('People', db_column='personid', on_delete=models.CASCADE)
     person_external_identifier = models.CharField(db_column='personexternalidentifier', max_length=255)
     person_external_identifier_uri = models.CharField(db_column='personexternalidentifieruri', blank=True, max_length=255)
@@ -1673,7 +1677,7 @@ class Meta:

 class ReferenceMaterialExternalIdentifier(ExternalIdentifierBridge):
-    reference_material = models.ForeignKey('ReferenceMaterial', db_column='referencematerialid')
+    reference_material = models.ForeignKey('ReferenceMaterial', db_column='referencematerialid', on_delete=models.CASCADE)
     reference_material_external_identifier = models.CharField(db_column='referencematerialexternalidentifier', max_length=255)
     reference_material_external_identifier_uri = models.CharField(db_column='referencematerialexternalidentifieruri', blank=True, max_length=255)
@@ -1688,7 +1692,7 @@ class Meta:

 class SamplingFeatureExternalIdentifier(ExternalIdentifierBridge):
-    sampling_feature = models.ForeignKey('SamplingFeature', db_column='samplingfeatureid')
+    sampling_feature = models.ForeignKey('SamplingFeature', db_column='samplingfeatureid', on_delete=models.CASCADE)
     sampling_feature_external_identifier = models.CharField(db_column='samplingfeatureexternalidentifier', max_length=255)
     sampling_feature_external_identifier_uri = models.CharField(db_column='samplingfeatureexternalidentifieruri', blank=True, max_length=255)
@@ -1703,7 +1707,7 @@ class Meta:

 class SpatialReferenceExternalIdentifier(ExternalIdentifierBridge):
-    spatial_reference = models.ForeignKey('SpatialReference', db_column='spatialreferenceid')
+    spatial_reference = models.ForeignKey('SpatialReference', db_column='spatialreferenceid', on_delete=models.CASCADE)
     spatial_reference_external_identifier = models.CharField(db_column='spatialreferenceexternalidentifier', max_length=255)
     spatial_reference_external_identifier_uri = models.CharField(db_column='spatialreferenceexternalidentifieruri', blank=True, max_length=255)
@@ -1718,7 +1722,7 @@ class Meta:

 class TaxonomicClassifierExternalIdentifier(ExternalIdentifierBridge):
-    taxonomic_classifier = models.ForeignKey('TaxonomicClassifier', db_column='taxonomicclassifierid')
+    taxonomic_classifier = models.ForeignKey('TaxonomicClassifier', db_column='taxonomicclassifierid', on_delete=models.CASCADE)
     taxonomic_classifier_external_identifier = models.CharField(db_column='taxonomicclassifierexternalidentifier', max_length=255)
     taxonomic_classifier_external_identifier_uri = models.CharField(db_column='taxonomicclassifierexternalidentifieruri', blank=True, max_length=255)
@@ -1733,7 +1737,7 @@ class Meta:

 class VariableExternalIdentifier(ExternalIdentifierBridge):
-    variable = models.ForeignKey('Variable', db_column='variableid')
+    variable = models.ForeignKey('Variable', db_column='variableid', on_delete=models.CASCADE)
     variable_external_identifier = models.CharField(db_column='variableexternalidentifer', max_length=255)
     variable_external_identifier_uri = models.CharField(db_column='variableexternalidentifieruri', blank=True, max_length=255)
@@ -1752,8 +1756,8 @@ class Meta:

 class AuthorList(models.Model):
     bridge_id = models.AutoField(db_column='bridgeid', primary_key=True)
-    citation = models.ForeignKey('Citation', db_column='citationid')
-    person = models.ForeignKey('People', db_column='personid')
+    citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE)
+    person = models.ForeignKey('People', db_column='personid', on_delete=models.CASCADE)
     author_order = models.IntegerField(db_column='authororder')
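`AuthorList` just above is an ordered bridge table: each row ties a citation to a person and ranks it with `authororder`. A minimal sketch of how such a bridge is queried (illustrative only, not from the diff; stdlib `sqlite3` with made-up sample rows):

```python
import sqlite3

# Ordered bridge table in miniature: (citationid, personid, authororder).
# Author order comes from the bridge column, never from insertion order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE people (personid INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE citations (citationid INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE authorlists (
        bridgeid INTEGER PRIMARY KEY,
        citationid INTEGER REFERENCES citations(citationid),
        personid INTEGER REFERENCES people(personid),
        authororder INTEGER);
    INSERT INTO people VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO citations VALUES (7, 'Example paper');
    -- insertion order deliberately differs from author order
    INSERT INTO authorlists VALUES (100, 7, 2, 2), (101, 7, 1, 1);
""")
authors = [row[0] for row in conn.execute(
    "SELECT p.name FROM authorlists al "
    "JOIN people p ON p.personid = al.personid "
    "WHERE al.citationid = 7 ORDER BY al.authororder")]
print(authors)  # ['Ada', 'Grace']
```

The Django equivalent would expose this as a `ManyToManyField(through='AuthorList')` and order querysets by `authorlist__author_order`.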
     def __repr__(self):
@@ -1767,9 +1771,9 @@ class Meta:

 class DataSetCitation(models.Model):
     bridge_id = models.AutoField(db_column='bridgeid', primary_key=True)
-    data_set = models.ForeignKey('DataSet', db_column='datasetid')
-    relationship_type = models.ForeignKey('RelationshipType', db_column='relationshiptypecv')
-    citation = models.ForeignKey('Citation', db_column='citationid')
+    data_set = models.ForeignKey('DataSet', db_column='datasetid', on_delete=models.CASCADE)
+    relationship_type = models.ForeignKey('RelationshipType', db_column='relationshiptypecv', on_delete=models.CASCADE)
+    citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1794,8 +1798,8 @@ class Meta:

 class ResultDerivationEquation(models.Model):
-    result = models.OneToOneField('Result', db_column='resultid', primary_key=True)
-    derivation_equation = models.ForeignKey('DerivationEquation', db_column='derivationequationid')
+    result = models.OneToOneField('Result', db_column='resultid', on_delete=models.CASCADE, primary_key=True)
+    derivation_equation = models.ForeignKey('DerivationEquation', db_column='derivationequationid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1808,9 +1812,9 @@ class Meta:

 class MethodCitation(models.Model):
     bridge_id = models.AutoField(db_column='bridgeid', primary_key=True)
-    method = models.ForeignKey('Method', db_column='methodid')
-    relationship_type = models.ForeignKey('RelationshipType', db_column='relationshiptypecv')
-    citation = models.ForeignKey('Citation', db_column='citationid')
+    method = models.ForeignKey('Method', db_column='methodid', on_delete=models.CASCADE)
+    relationship_type = models.ForeignKey('RelationshipType', db_column='relationshiptypecv', on_delete=models.CASCADE)
+    citation = models.ForeignKey('Citation', db_column='citationid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1822,8 +1826,8 @@ class Meta:

 class RelatedAnnotation(ObjectRelation):
-    annotation = models.ForeignKey('Annotation', related_name='related_annonation_annotation', db_column='annotationid')
-    related_annotation = models.ForeignKey('Annotation', related_name='related_annotation_related_annontation', db_column='relatedannotationid')
+    annotation = models.ForeignKey('Annotation', related_name='related_annonation_annotation', db_column='annotationid', on_delete=models.CASCADE)
+    related_annotation = models.ForeignKey('Annotation', related_name='related_annotation_related_annontation', db_column='relatedannotationid', on_delete=models.CASCADE)

     def __repr__(self):
         return "" % (
@@ -1836,8 +1840,8 @@ class Meta:

 class RelatedDataSet(ObjectRelation):
-    data_set = models.ForeignKey('DataSet', related_name='related_dataset_dataset', db_column='datasetid')
-    related_data_set = models.ForeignKey('DataSet', related_name='related_dataset_related_dataset', db_column='relateddatasetid')
+    data_set = models.ForeignKey('DataSet', related_name='related_dataset_dataset', db_column='datasetid', on_delete=models.CASCADE)
+    related_data_set = models.ForeignKey('DataSet', related_name='related_dataset_related_dataset', db_column='relateddatasetid', on_delete=models.CASCADE)
     version_code = models.CharField(db_column='versioncode', blank=True, max_length=50)

     def __repr__(self):
@@ -1851,8 +1855,8 @@ class Meta:

 class RelatedResult(ObjectRelation):
-    result = models.ForeignKey('Result', db_column='resultid')
-    related_result = models.ForeignKey('Result', related_name='related_result_related_result', db_column='relatedresultid')
+    result = models.ForeignKey('Result', db_column='resultid', on_delete=models.CASCADE)
+    related_result = models.ForeignKey('Result', related_name='related_result_related_result', db_column='relatedresultid', on_delete=models.CASCADE)
     version_code = models.CharField(db_column='versioncode', blank=True, max_length=50)
     related_result_sequence_number = models.IntegerField(db_column='relatedresultsequencenumber', blank=True, null=True)
@@ -1881,7 +1885,7 @@ class Meta:

 class CategoricalResult(ExtendedResult, XOffsetComponent, YOffsetComponent, ZOffsetComponent):
-    quality_code = models.ForeignKey('QualityCode', db_column='qualitycodecv')
+    quality_code = models.ForeignKey('QualityCode', db_column='qualitycodecv', on_delete=models.CASCADE)

     class Meta:
         db_table = 'categoricalresults'
@@ -1889,7 +1893,7 @@ class Meta:

 class TransectResult(ExtendedResult, AggregatedComponent, ZOffsetComponent, TimeIntendedComponent):
     intended_transect_spacing = models.FloatField(db_column='intendedtransectspacing')
-    intended_transect_spacing_unit = models.ForeignKey('Unit', db_column='intendedtransectspacingunitsid', blank=True, null=True)
+    intended_transect_spacing_unit = models.ForeignKey('Unit', db_column='intendedtransectspacingunitsid', on_delete=models.CASCADE, blank=True, null=True)

     class Meta:
         db_table = 'transectresults'
@@ -1897,7 +1901,7 @@ class Meta:

 class SpectraResult(ExtendedResult, AggregatedComponent, XOffsetComponent, YOffsetComponent, ZOffsetComponent):
     intended_wavelength_spacing = models.FloatField(db_column='intendedwavelengthspacing')
-    intended_wavelength_spacing_unit = models.ForeignKey('Unit', db_column='intendedwavelengthspacingunitsid', blank=True, null=True)
+    intended_wavelength_spacing_unit = models.ForeignKey('Unit', db_column='intendedwavelengthspacingunitsid', on_delete=models.CASCADE, blank=True, null=True)

     class Meta:
         db_table = 'spectraresults'
@@ -1915,7 +1919,7 @@ class Meta:

 class TrajectoryResult(ExtendedResult, AggregatedComponent, TimeIntendedComponent):
     intended_trajectory_spacing = models.FloatField(db_column='intendedtrajectoryspacing')
-    intended_trajectory_spacing_unit = models.ForeignKey('Unit', db_column='intendedtrajectoryspacingunitsid', blank=True, null=True)
+    intended_trajectory_spacing_unit = models.ForeignKey('Unit', db_column='intendedtrajectoryspacingunitsid', on_delete=models.CASCADE, blank=True, null=True)

     class Meta:
         db_table = 'trajectoryresults'
@@ -1927,7 +1931,7 @@ class Meta:

 class CategoricalResultValue(ResultValue):
-    result = models.ForeignKey('CategoricalResult', db_column='resultid')
+    result = models.ForeignKey('CategoricalResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.CharField(db_column='datavalue', max_length=255)
     annotations = models.ManyToManyField('Annotation', related_name='annotated_categorical_values', through='CategoricalResultValueAnnotation')
@@ -1937,7 +1941,7 @@ class Meta:

 class MeasurementResultValue(ResultValue):
-    result = models.ForeignKey('MeasurementResult', db_column='resultid')
+    result = models.ForeignKey('MeasurementResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     annotations = models.ManyToManyField('Annotation', related_name='annotated_measurement_values', through='MeasurementResultValueAnnotation')
@@ -1947,7 +1951,7 @@ class Meta:

 class PointCoverageResultValue(ResultValue, XOffsetComponent, YOffsetComponent, QualityControlComponent):
-    result = models.ForeignKey('PointCoverageResult', db_column='resultid')
+    result = models.ForeignKey('PointCoverageResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.BigIntegerField(db_column='datavalue')
     annotations = models.ManyToManyField('Annotation', related_name='annotated_point_coverage_values', through='PointCoverageResultValueAnnotation')
@@ -1957,7 +1961,7 @@ class Meta:

 class ProfileResultValue(ResultValue, ZOffsetComponent, QualityControlComponent, TimeAggregationComponent):
-    result = models.ForeignKey('ProfileResult', db_column='resultid')
+    result = models.ForeignKey('ProfileResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     z_aggregation_interval = models.FloatField(db_column='zaggregationinterval')
     annotations = models.ManyToManyField('Annotation', related_name='annotated_profile_values',
@@ -1968,7 +1972,7 @@ class Meta:

 class SectionResultValue(ResultValue, AggregatedComponent, XOffsetComponent, ZOffsetComponent, QualityControlComponent, TimeAggregationComponent):
-    result = models.ForeignKey('SectionResult', db_column='resultid')
+    result = models.ForeignKey('SectionResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     x_aggregation_interval = models.FloatField(db_column='xaggregationinterval')
     z_aggregation_interval = models.FloatField(db_column='zaggregationinterval')
@@ -1980,11 +1984,11 @@ class Meta:

 class SpectraResultValue(ResultValue, QualityControlComponent, TimeAggregationComponent):
-    result = models.ForeignKey('SpectraResult', db_column='resultid')
+    result = models.ForeignKey('SpectraResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     excitation_wavelength = models.FloatField(db_column='excitationwavelength')
     emission_wavelength = models.FloatField(db_column='emissionwavelength')
-    wavelength_unit = models.ForeignKey('Unit', db_column='wavelengthunitsid')
+    wavelength_unit = models.ForeignKey('Unit', db_column='wavelengthunitsid', on_delete=models.CASCADE)
     annotations = models.ManyToManyField('Annotation', related_name='annotated_spectra_values', through='SpectraResultValueAnnotation')
@@ -1993,7 +1997,7 @@ class Meta:

 class TimeSeriesResultValue(ResultValue, QualityControlComponent, TimeAggregationComponent):
-    result = models.ForeignKey('TimeSeriesResult', related_name='values', db_column='resultid')
+    result = models.ForeignKey('TimeSeriesResult', related_name='values', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     annotations = models.ManyToManyField('Annotation', related_name='annotated_time_series_values', through='TimeSeriesResultValueAnnotation')
@@ -2006,11 +2010,11 @@ class Meta:

 class TrajectoryResultValue(ResultValue, XOffsetComponent, YOffsetComponent, ZOffsetComponent, QualityControlComponent, TimeAggregationComponent):
-    result = models.ForeignKey('TrajectoryResult', db_column='resultid')
+    result = models.ForeignKey('TrajectoryResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     trajectory_distance = models.FloatField(db_column='trajectorydistance')
     trajectory_distance_aggregation_interval = models.FloatField(db_column='trajectorydistanceaggregationinterval')
-    trajectory_distance_unit = models.ForeignKey('Unit', db_column='trajectorydistanceunitsid')
+    trajectory_distance_unit = models.ForeignKey('Unit', db_column='trajectorydistanceunitsid', on_delete=models.CASCADE)
     annotations = models.ManyToManyField('Annotation', related_name='annotated_Trajectory_values', through='TrajectoryResultValueAnnotation')
@@ -2019,11 +2023,11 @@ class Meta:

 class TransectResultValue(ResultValue, AggregatedComponent, XOffsetComponent, YOffsetComponent, QualityControlComponent, TimeAggregationComponent):
-    result = models.ForeignKey('TransectResult', db_column='resultid')
+    result = models.ForeignKey('TransectResult', db_column='resultid', on_delete=models.CASCADE)
     data_value = models.FloatField(db_column='datavalue')
     transect_distance = models.FloatField(db_column='transectdistance')
     transect_distance_aggregation_interval = models.FloatField(db_column='transectdistanceaggregationinterval')
-    transect_distance_unit = models.ForeignKey('Unit', db_column='transectdistanceunitsid')
+    transect_distance_unit = models.ForeignKey('Unit', db_column='transectdistanceunitsid', on_delete=models.CASCADE)
     annotations = models.ManyToManyField('Annotation', related_name='annotated_transect_values', through='TransectResultValueAnnotation')
@@ -2032,63 +2036,63 @@ class Meta:

 class MeasurementResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('MeasurementResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('MeasurementResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'measurementresultvalueannotations'

 class CategoricalResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('CategoricalResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('CategoricalResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'categoricalresultvalueannotations'

 class PointCoverageResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('PointCoverageResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('PointCoverageResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'pointcoverageresultvalueannotations'

 class ProfileResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('ProfileResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('ProfileResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'profileresultvalueannotations'

 class SectionResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('SectionResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('SectionResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'sectionresultvalueannotations'

 class SpectraResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('SpectraResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('SpectraResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'spectraresultvalueannotations'

 class TimeSeriesResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('TimeSeriesResultValue', related_name='+', db_column='valueid')
+    value =
models.ForeignKey('TimeSeriesResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'timeseriesresultvalueannotations'

 class TrajectoryResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('TrajectoryResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('TrajectoryResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'trajectoryresultvalueannotations'

 class TransectResultValueAnnotation(ResultValueAnnotation):
-    value = models.ForeignKey('TransectResultValue', related_name='+', db_column='valueid')
+    value = models.ForeignKey('TransectResultValue', related_name='+', db_column='valueid', on_delete=models.CASCADE)

     class Meta:
         db_table = 'transectresultvalueannotations'
diff --git a/src/dataloaderinterface/ajax.py b/src/dataloaderinterface/ajax.py
new file mode 100644
index 00000000..77ed2740
--- /dev/null
+++ b/src/dataloaderinterface/ajax.py
@@ -0,0 +1,80 @@
+import json
+from typing import List, Dict, Any
+
+#PRT - temporarily avoiding the Django models because there appears to be a mismatch for the foreign key
+#from dataloaderinterface.models import SensorMeasurement, SiteSensor
+#SiteSensor -> odm2.results
+#SensorMeasurement -> odm2.resulttimeseries
+
+import sqlalchemy
+import pandas as pd
+import numpy as np
+from django.conf import settings
+
+_dbsettings = settings.DATABASES['odm2']
+_connection_str = f"postgresql://{_dbsettings['USER']}:{_dbsettings['PASSWORD']}@{_dbsettings['HOST']}:{_dbsettings['PORT']}/{_dbsettings['NAME']}"
+_db_engine = sqlalchemy.create_engine(_connection_str)
+
+def get_result_timeseries_recent(request_data:Dict[str,Any]) -> str:
+    result_id = int(request_data['resultid'])
+
+    #PRT - tried the django models approach, but these models have odd relationships
+    # and I'm not sure this is pulling the correct data. Using a sqlalchemy workaround for the short term.
+    #timeseries_results = SensorMeasurement.objects.filter(sensor__in=sensors)
+    #response = {'data':list(timeseries_results.values())}
+
+    #SQLAlchemy workaround to the models
+    with _db_engine.connect() as connection:
+        query = f'SELECT valueid, datavalue, valuedatetime, valuedatetimeutcoffset ' \
+                f'FROM odm2.timeseriesresultvalues WHERE resultid = {result_id} ' \
+                'AND valuedatetime >= ' \
+                f'(SELECT MAX(valuedatetime) FROM odm2.timeseriesresultvalues WHERE resultid = {result_id}) ' \
+                " - INTERVAL '3 DAYS' " \
+                'ORDER BY valuedatetime;'
+        df = pd.read_sql(query, connection)
+        return df.to_json(orient='records')
+
+def get_result_timeseries(request_data:Dict[str,Any]) -> str:
+    result_id = int(request_data['resultid'])
+
+    with _db_engine.connect() as connection:
+        query = f'SELECT valueid, datavalue, valuedatetime, valuedatetimeutcoffset ' \
+                f'FROM odm2.timeseriesresultvalues WHERE resultid = {result_id} ' \
+                'ORDER BY valuedatetime;'
+        df = pd.read_sql(query, connection)
+        #-9999 is used for NaN alternative by sensors
+        df = df.replace(-9999, np.nan)
+        df = df.dropna()
+        #convert from utc to local sensor time
+        df['valuedatetime'] = df['valuedatetime'] + pd.to_timedelta(df['valuedatetimeutcoffset'], unit='hours')
+        df = df.dropna()
+        data = df.to_json(orient='columns')
+        response = f'{{"result_id":{result_id}, "data":{data} }}'
+        return response
+
+def get_sampling_feature_metadata(request_data:Dict[str,Any]) -> str:
+    sampling_feature_code = str(request_data['sampling_feature_code'])
+
+    with _db_engine.connect() as connection:
+        query = f"SELECT rs.resultid, rs.resultuuid, samplingfeaturecode, "\
+                "samplingfeaturename, sampledmediumcv, un.unitsabbreviation, "\
+                "un.unitsname, variablenamecv, variablecode, zlocation, " \
+                "untrs.unitsabbreviation AS zlocationunits " \
+                "FROM odm2.samplingfeatures AS sf " \
+                "JOIN odm2.featureactions AS fa ON fa.samplingfeatureid=sf.samplingfeatureid " \
+                "JOIN odm2.results AS rs ON rs.featureactionid = fa.featureactionid " \
+                "JOIN odm2.variables AS vr ON vr.variableid = rs.variableid " \
+                "LEFT JOIN odm2.units AS un ON un.unitsid = rs.unitsid " \
+                f"LEFT JOIN odm2.timeseriesresults AS tsr ON tsr.resultid = rs.resultid " \
+                f"LEFT JOIN odm2.units AS untrs ON untrs.unitsid = tsr.zlocationunitsid "\
+                f"WHERE sf.samplingfeaturecode = '{sampling_feature_code}'; "
+        df = pd.read_sql(query, connection)
+        return df.to_json(orient='records', default_handler=str)
+
+def get_sampling_features(request_data:Dict[str,Any]) -> str:
+    with _db_engine.connect() as connection:
+        query = f'SELECT samplingfeatureuuid, samplingfeaturecode, samplingfeaturename ' \
+                f'FROM odm2.samplingfeatures ' \
+                f'ORDER BY samplingfeaturecode;'
+        df = pd.read_sql(query, connection)
+        return df.to_json(orient='records', default_handler=str)
\ No newline at end of file
diff --git a/src/dataloaderinterface/models.py b/src/dataloaderinterface/models.py
index f288d790..bb26006d 100644
--- a/src/dataloaderinterface/models.py
+++ b/src/dataloaderinterface/models.py
@@ -93,7 +93,7 @@ def __repr__(self):

 class SensorMeasurement(models.Model):
-    sensor = models.OneToOneField('SiteSensor', related_name='last_measurement', primary_key=True)
+    sensor = models.OneToOneField('SiteSensor', related_name='last_measurement', on_delete=models.CASCADE, primary_key=True)
     value_datetime = models.DateTimeField()
     value_datetime_utc_offset = models.DurationField()
     data_value = models.FloatField()
@@ -157,14 +157,14 @@ def __repr__(self):

 class SiteSensor(models.Model):
-    registration = models.ForeignKey('SiteRegistration', db_column='RegistrationID', related_name='sensors')
+    registration = models.ForeignKey('SiteRegistration', db_column='RegistrationID', on_delete=models.CASCADE, related_name='sensors')
     result_id = models.IntegerField(db_column='ResultID', unique=True, null=True)
     result_uuid = models.UUIDField(db_column='ResultUUID', unique=True, null=True)
     height = models.FloatField(blank=True, null=True)
     sensor_notes = models.TextField(blank=True, null=True)
-    sensor_output = models.ForeignKey('SensorOutput', related_name='sensor_instances', null=True)
+    sensor_output = models.ForeignKey('SensorOutput', related_name='sensor_instances', on_delete=models.CASCADE, null=True)

     class Meta:
         ordering = ['result_id']
@@ -181,21 +181,6 @@ def make_model(self):
     def sensor_identity(self):
         return "{0}_{1}_{2}".format(self.registration.sampling_feature_code, self.sensor_output.variable_code, self.result_id)

-    @property
-    def influx_url(self):
-        if not self.last_measurement:
-            return
-
-        return settings.INFLUX_URL_QUERY.format(
-            result_uuid=self.influx_identifier,
-            last_measurement=self.last_measurement.value_datetime.strftime('%Y-%m-%dT%H:%M:%SZ'),
-            days_of_data=settings.SENSOR_DATA_PERIOD
-        )
-
-    @property
-    def influx_identifier(self):
-        return 'uuid_{}'.format(str(self.result_uuid).replace('-', '_'))
-
     def __str__(self):
         return '%s' % (self.sensor_identity)
@@ -206,8 +191,8 @@ def __repr__(self):

 class SiteAlert(models.Model):
-    user = models.ForeignKey(settings.AUTH_USER_MODEL, db_column='User', related_name='site_alerts')
-    site_registration = models.ForeignKey('SiteRegistration', db_column='RegistrationID', related_name='alerts')
+    user = models.ForeignKey(settings.AUTH_USER_MODEL, db_column='User', on_delete=models.CASCADE, related_name='site_alerts')
+    site_registration = models.ForeignKey('SiteRegistration', db_column='RegistrationID', on_delete=models.CASCADE, related_name='alerts')
     last_alerted = models.DateTimeField(db_column='LastAlerted', blank=True, null=True)
     hours_threshold = models.DurationField(db_column='HoursThreshold', default=timedelta(hours=1))
@@ -217,4 +202,4 @@ def __str__(self):

     def __repr__(self):
         return "" % (
             self.id, self.site_registration.sampling_feature_code, self.last_alerted, self.hours_threshold,
-        )
+        )
\ No newline at end of file
diff --git a/src/dataloaderinterface/static/dataloaderinterface/css/style.css b/src/dataloaderinterface/static/dataloaderinterface/css/style.css
index 258c6f87..1ea2823f 100644
--- a/src/dataloaderinterface/static/dataloaderinterface/css/style.css
+++ b/src/dataloaderinterface/static/dataloaderinterface/css/style.css
@@ -642,6 +642,7 @@ footer {
     color: #FFF;
     min-height: 300px;
     padding: 60px 0;
+    overflow: hidden;
 }

 footer h5 {
diff --git a/src/dataloaderinterface/static/dataloaderinterface/js/site-detail.js b/src/dataloaderinterface/static/dataloaderinterface/js/site-detail.js
index 09e31b82..b46e5102 100644
--- a/src/dataloaderinterface/static/dataloaderinterface/js/site-detail.js
+++ b/src/dataloaderinterface/static/dataloaderinterface/js/site-detail.js
@@ -2,6 +2,8 @@
 const EXTENT_HOURS = 72;
 const GAP_HOURS = 6;
 const STALE_DATA_CUTOFF = new Date(new Date() - 1000 * 60 * 60 * EXTENT_HOURS);
+const LOCAL_UTC_OFFSET = new Date().getTimezoneOffset() / 60; //in hours
+
 function initMap() {
     var defaultZoomLevel = 18;
     var latitude = parseFloat($('#site-latitude').val());
@@ -23,9 +25,25 @@ function initMap() {
     });
 }

+function format_date(date) {
+    var year = String(date.getFullYear()).padStart(4, '0');
+    var month = String(date.getMonth()+1).padStart(2, '0');
+    var day = String(date.getDate()).padStart(2, '0');
+    var hour = String(date.getHours()).padStart(2, '0');
+    var minute = String(date.getMinutes()).padStart(2, '0');
+    var second = String(date.getSeconds()).padStart(2, '0');
+    return `${year}-${month}-${day} ${hour}:${minute}:${second}`;
+}
+
 function fillValueTable(table, data) {
     var rows = data.map(function (dataValue) {
-        return "" + dataValue.DateTime + "" + dataValue.TimeOffset + "" + dataValue.Value + "";
+        //looks to be 1 hour offset between python datetime integer and JS
+        var date = new Date(dataValue.valuedatetime + (dataValue.valuedatetimeutcoffset + LOCAL_UTC_OFFSET) * 3600000);
+        var row_string = "" +
+            format_date(date) + "" +
+            dataValue.valuedatetimeutcoffset + "" +
+            dataValue.datavalue + "";
+        return row_string;
     });
     table.append($(rows.join('')));
 }
@@ -69,18 +87,18 @@ function drawSparklinePlot(seriesInfo, seriesData) {
     }

     var lastRead = Math.max.apply(Math, seriesData.map(function(value){
-        return new Date(value.DateTime);
+        return new Date(value.valuedatetime);
     }));

     var dataTimeOffset = Math.min.apply(Math, seriesData.map(function(value){
-        return new Date(value.DateTime);
+        return new Date(value.valuedatetime);
     }));

     var xAxis = d3.scaleTime().range([0, width]);
     var yAxis = d3.scaleLinear().range([height, 0]);

     var yDomain = d3.extent(seriesData, function(d) {
-        return parseFloat(d.Value);
+        return parseFloat(d.datavalue);
     });
     var yPadding = (yDomain[1] - yDomain[0]) / 20;  // 5% padding
     yDomain[0] -= yPadding;
@@ -91,11 +109,11 @@ function drawSparklinePlot(seriesInfo, seriesData) {
     var line = d3.line()
         .x(function(d) {
-            var date = new Date(d.DateTime);
+            var date = new Date(d.valuedatetime);
             return xAxis(date);
         })
         .y(function(d) {
-            return yAxis(d.Value);
+            return yAxis(d.datavalue);
         });

     var svg = d3.select(plotBox.get(0)).append("svg")
@@ -118,7 +136,7 @@
     var paths = [];

     for (var i = 0; i < seriesData.length; i++) {
-        var currentDate = new Date(seriesData[i].DateTime);
+        var currentDate = new Date(seriesData[i].valuedatetime);

         if (previousDate) {
             gapOffset = new Date(currentDate - 1000 * 60 * 60 * GAP_HOURS);
@@ -144,7 +162,7 @@
             svg.append("circle")
                 .attr("r", 2)
                 .style("fill", "steelblue")
-                .attr("transform", "translate(" + xAxis(new Date(paths[i][0].DateTime)) + ", " + yAxis(paths[i][0].Value) + ")")
+                .attr("transform", "translate(" + xAxis(new Date(paths[i][0].valuedatetime)) + ", " + yAxis(paths[i][0].datavalue) + ")")
         }
         else {
             svg.append("path")
@@ -156,37 +174,48 @@
 }

 function getTimeSeriesData(sensorInfo) {
-    if (sensorInfo['influxUrl'] === 'None' ) { return; }
+    var request_data = {method:'get_result_timeseries_recent', 'resultid': sensorInfo['resultId']};
     $.ajax({
-        url: sensorInfo['influxUrl']
-    }).done(function(influx_data) {
-        var resultSet = influx_data.results ? influx_data.results.shift() : null;
-        if (resultSet && resultSet.series && resultSet.series.length) {
-            var influxSeries = resultSet.series.shift();
-            var indexes = {
-                time: influxSeries.columns.indexOf("time"),
-                value: influxSeries.columns.indexOf("DataValue"),
-                offset: influxSeries.columns.indexOf("UTCOffset")
-            };
-            var values = influxSeries.values.map(function(influxValue) {
-                return {
-                    DateTime: influxValue[indexes.time].match(/^(\d{4}\-\d\d\-\d\d([tT][\d:]*)?)/).shift(),
-                    Value: influxValue[indexes.value],
-                    TimeOffset: influxValue[indexes.offset]
-                }
-            });
-
-            fillValueTable($('table.data-values[data-result-id=' + sensorInfo['resultId'] + ']'), values);
-            drawSparklineOnResize(sensorInfo, values);
-            drawSparklinePlot(sensorInfo, values);
-        } else {
-            console.log('No data values were found for this site');
+        url: '../../dataloader/ajax/',
+        data: {request_data: JSON.stringify(request_data)},
+        method: 'POST',
+        success: function(data) {
+            var response_data = JSON.parse(data);
+            var $table = $('table.data-values[data-result-id=' + sensorInfo['resultId'] + ']');
+            fillValueTable($table, response_data);
+            drawSparklineOnResize(sensorInfo, response_data);
+            drawSparklinePlot(sensorInfo, response_data);
+            /*
+            var resultSet = influx_data.results ? influx_data.results.shift() : null;
+            if (resultSet && resultSet.series && resultSet.series.length) {
+                var influxSeries = resultSet.series.shift();
+                var indexes = {
+                    time: influxSeries.columns.indexOf("time"),
+                    value: influxSeries.columns.indexOf("DataValue"),
+                    offset: influxSeries.columns.indexOf("UTCOffset")
+                };
+                var values = influxSeries.values.map(function(influxValue) {
+                    return {
+                        DateTime: influxValue[indexes.time].match(/^(\d{4}\-\d\d\-\d\d([tT][\d:]*)?)/).shift(),
+                        Value: influxValue[indexes.value],
+                        TimeOffset: influxValue[indexes.offset]
+                    }
+                });
+
+                fillValueTable($('table.data-values[data-result-id=' + sensorInfo['resultId'] + ']'), values);
+                drawSparklineOnResize(sensorInfo, values);
+                drawSparklinePlot(sensorInfo, values);
+            } else {
+                console.log('No data values were found for this site');
+                drawSparklinePlot(sensorInfo, []); // Will just render the empty message
+                // console.info(series.getdatainflux);
+            }
+            */
+        },
+        error: function() { // $.ajax takes an "error" option, not "fail" (which it would silently ignore)
             drawSparklinePlot(sensorInfo, []); // Will just render the empty message
-            // console.info(series.getdatainflux);
+            console.log('data failed to load.');
         }
-    }).fail(function() {
-        drawSparklinePlot(sensorInfo, []); // Will just render the empty message
-        console.log('data failed to load.');
     });
 }
diff --git a/src/dataloaderinterface/static/timeseries_visualization/css/style.css b/src/dataloaderinterface/static/timeseries_visualization/css/style.css
new file mode 100644
index 00000000..44c89056
--- /dev/null
+++ b/src/dataloaderinterface/static/timeseries_visualization/css/style.css
@@ -0,0 +1,377 @@
+/* Glyphicons for collapsible panels*/
+.panel-heading .accordion-toggle:after {
+    /* symbol for "opening" panels */
+    font-family: 'Glyphicons Halflings'; /* essential for enabling glyphicon */
+    content: "\e114"; /* adjust as needed, taken from bootstrap.css */
+    float: right; /* adjust as needed */
+    color: grey; /* adjust as needed */
+}
+
+.panel-heading .accordion-toggle.collapsed:after {
+    /* symbol for "collapsed" panels */
+    content: "\e080"; /* adjust as needed, taken from bootstrap.css */
+}
+
+button span {
+    font-family: "Helvetica Neue", Helvetica, Arial, sans-serif;
+    font-size: 14px;
+    line-height: 1.428571429;
+}
+
+.bar rect {
+    shape-rendering: crispEdges;
+    stroke-width: 1;
+    stroke: black;
+    stroke-opacity: 0.8;
+}
+
+line.tick {
+    stroke: #cccccc;
+    fill: none;
+    shape-rendering: crispEdges;
+}
+
+/* body*/
+
+.panel-body {
+    padding: 0px;
+}
+
+/* Left Panel*/
+
+#leftPanel .list-group {
+    padding: 0;
+    margin: 0px;
+}
+
+.align-center {
+    list-style: none;
+    text-align: center;
+}
+
+.CheckBoxLabel {
+    /* Some browsers use different cursor notations*/
+    cursor: pointer;
+    cursor: hand;
+}
+
+#leftPanel .list-group-item {
+    border-top-right-radius: 0px !important;
+    border-top-left-radius: 0px !important;
+    border: 0px;
+    padding: 0px 15px;
+}
+
+#leftPanel li:hover {
+    background: #ddd;
+}
+
+.highlight {
+    background: #C1F9FC;
+}
+
+#leftPanel .panel, .row {
+    border-radius: 0;
+    margin: 0;
+    border-top: 0px;
+}
+
+#leftPanel .panel-group {
+    margin-bottom: 0;
+    height: 100%;
+    width: 300px;
+}
+
+.facets-container {
+    height: calc(100% - 26px);
+    overflow-y: auto;
+    margin-left: 25px;
+    margin-bottom: 0;
+}
+
+.facets-container .glyphicon {
+    color: #999;
+    left: -5px;
+}
+
+.facets-container .glyphicon:hover {
+    color: #555;
+}
+
+.panel-toolbar-left {
+    float: left;
+    height: 100%;
+    background: #fff;
+    display: inline;
+    clear: none;
+    width: 26px;
+    position: fixed;
+    z-index: 20;
+    border-right: 1px solid #ddd;
+}
+
+.toolbar-top-button {
+    float: right;
+    margin-right: 8px;
+    color: #999;
+    padding: 3px;
+    cursor: default;
+}
+
+.panel-toolbar-top {
+    background: #fff;
+    height: 26px;
+    border: 1px solid #ddd;
+    border-right: 0;
+}
+
+.toolbar-left-button {
+    margin-left: -16px;
+    margin-top: 16px;
+    height: 26px;
+    color: #555;
+    padding-right: 10px;
+    padding-left: 10px;
+    cursor: default;
+}
+
+.toolbar-left-button[data-enabled="true"]:hover { + color: #000; + background: #ddd; + + -webkit-touch-callout: none; + -webkit-user-select: none; + -khtml-user-select: none; + -moz-user-select: none; + -ms-user-select: none; + user-select: none; +} + +.toolbar-left-button[data-enabled="false"] { + opacity: 0.6; /* Real browsers */ + filter: alpha(opacity=60); /* IE */ +} + +.toolbar-top-button:hover { + color: #555; +} + +.rotate { + display: inline-block; + -webkit-transform: rotate(-90deg); + -moz-transform: rotate(-90deg); + -ms-transform: rotate(-90deg); + -o-transform: rotate(-90deg); + transform: rotate(-90deg); + + /* also accepts left, right, top, bottom coordinates; not required, but a good idea for styling */ + -webkit-transform-origin: 50% 50%; + -moz-transform-origin: 50% 50%; + -ms-transform-origin: 50% 50%; + -o-transform-origin: 50% 50%; + transform-origin: 50% 50%; + + /* Should be unset in IE9+ I think. */ + filter: progid:DXImageTransform.Microsoft.BasicImage(rotation=3); +} + +.col-sm-3, .col-sm-9 { + padding: 0px; + height: 100%; + border-right: 1px solid #ddd; + float: left; +} + +.col-sm-3 { + width: auto; + min-width: 25px; +} + +.col-sm-9 { + float: none; + overflow: hidden; + width: inherit; +} + +.row { + margin-right: 0px; + margin-left: 0px; + height: -moz-calc(100% - 81px); + height: -webkit-calc(100% - 81px); + height: calc(100% - 81px); +} + +#btnSetPlotOptions, +#load-site, +#datePanel { + float: right; +} + +.datepicker { + text-align: center; +} + +#visualizationContent { + width: 100%; + overflow: hidden; + padding: 0; +} + +#graphArea { + display: inline-block; + height: 100%; + width: 100%; +} + +#graphContainer { + width: -moz-calc(100% - 307px); + width: -webkit-calc(100% - 307px); + width: calc(100% - 307px); + float: left; + padding-top: 5px; +} + +#graphContainer .label { + font-size: 100%; + font-weight: normal; +} + +#panel-right ul li { + padding: 3px 15px; +} + +#panel-right { + background: white; + right: 0; + width: 
307px; + float: right; + border-left: 1px solid #aaa; +} + +#panel-right-container { + overflow-x: hidden; + padding-bottom: 0; +} + +#panel-right .panel { + min-height: 80px; + border: 1px solid #aaa; +} + +#panel-right .panel-heading { + background: #ddd; +} + +.panel-heading { + padding-left:10px; +} + +.container-title { + font-family: "Helvetica Neue", Helvetica, Arial, sans-serif; + font-size: 14px; +} + +#plotOptionsContainer .panel-content, +.panel-content { + padding: 10px; + height: 100%; + display: inline-block; +} + +#plotOptionsContainer table { + margin-bottom: 0; +} + +#plotOptionsContainer td { + border-top: 0; + vertical-align: middle; + padding-bottom: 2px; +} + +#plotOptionsContainer tr td:first-child { + text-align: right; +} + +#visualizationDropDown, #plotOptionsContainer .btn-group { + width: 100%; +} + +#plotOptionsContainer input { + border: 1px solid #ddd; + width: 100%; +} + +button.close { + margin-left: 3px; +} + +/* ================================================================= +highcharts implementation +*/ + +.series-panel { + font-size: 14px; + line-height: 16px; + margin-bottom: 6px;; +} + +.series-panel>span { + display: inline-block; + line-height: 16px; +} + +.series-panel>input { + vertical-align: top; + padding-right: 2px;; +} + +.series-panel>span>.uuid { + line-height:0px; + font-size:10px; +} + +#site-select { + width:100%; + margin-bottom: 10px; +} + +/*---------------------------------------------------------------- + messaging +*/ + +.message-box { + position: fixed; + top:40%; + left:50%; + min-width:300px; + max-width:500px; + display:none; + z-index: 999; + + background-color: white; + border: 2px solid black; + border-radius: 10px; +} + +.message-box>#title { + border-bottom: 2px solid black; + background: lightgray; + width: 100%; + display: block; + border-radius: 10px 10px 0px 0px; + padding-left: 5px; + font-weight: bold; + color: black; +} + +.message-box>#msg{ + padding: 10px; + font-size:16; + 
display:block; +} + +.message-box>input { + margin: 3px; + border-radius: 3px; + float:right; +} \ No newline at end of file diff --git a/src/dataloaderinterface/static/timeseries_visualization/js/plotting.js b/src/dataloaderinterface/static/timeseries_visualization/js/plotting.js new file mode 100644 index 00000000..cf4f5b31 --- /dev/null +++ b/src/dataloaderinterface/static/timeseries_visualization/js/plotting.js @@ -0,0 +1,228 @@ +function initChart(target_element) { + _chart = createChart(target_element); + _axes = [-999, -999, -999, -999, -999, -999]; +} + + +function createChart(renderTo) { + chart = new Highcharts.chart(renderTo, { + chart: { + plotBorderColor: '#CCC', + plotBorderWidth: 2, + type: 'line', + zoomType: 'x', + height:700, + spacingLeft:20, + spacingRight:20, + }, + credits: { + enabled: false + }, + title: { + text: '' + }, + subtitle: { + text: 'Click and drag in the plot area to zoom in' + }, + xAxis: { + type: 'datetime', + //tickInterval: 30, + //tickWidth: 1, + dateTimeLabelFormats: { + day: "%m/%d/%y", + month: "%b-%y" + }, + labels: { + style: { + fontSize: '14px' + } + /* + formatter: function () { + var dateparts = this.value.split('T') + return (dateparts[0]) + } + */ + }, + title: { + text: 'Monitoring Date (local time of sensor)', + style: { + fontSize: '16px' + } + } + }, + yAxis: [ + { + type: 'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + style: { + fontSize: '13px' + } + }, + min: -1, + }, + { + type: 'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + style: { + fontSize: '13px' + }, + }, + min: -1, + opposite: true, + }, + { + type: 'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + style: { + fontSize: '13px' + } + }, + min: -1, + }, + { + type: 'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + style: { + fontSize: '13px' + }, + }, + min: -1, + opposite: true, + }, + { + type: 
'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + style: { + fontSize: '13px' + } + }, + min: -1, + }, + { + type: 'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + style: { + fontSize: '13px' + }, + }, + min: -1, + opposite: true, + } + ], + legend: { + enabled: true, + layout: 'horizontal', + align: 'left', + verticalAlign: 'top', + backgroundColor: (Highcharts.theme && Highcharts.theme.background2) || 'white', + borderColor: '#CCC', + borderWidth: 1, + }, + tooltip: { + pointFormat: '{series.name}: ' + '{point.y:.1f}' + '' + }, + plotOptions: { + }, + pane: { + size: '60%' + }, + series: [] + }); + return chart; +} + +function addYAxis(align="left") { + axis_count = _chart.yAxis.length; + axis_buffer = .5 * axis_count % 2; + + opposite = false; + left = -axis_buffer; + right = 0; + if (align === "right") { + opposite = true; + left = -axis_buffer; + right = 0; + } + axis = _chart.addAxis({ + type: 'linear', + title: { + text: '', + style: { + fontSize: '15px' + } + }, + labels: { + align: align, + style: { + fontSize: '13px' + } + }, + min: -1, + opposite: opposite, + }); + _chart.redraw(); +} + +function addSeries(yAxis, axis_title, series_name, x, y) { + data = x.map((e,i) => [e,y[i]]); + + series = _chart.addSeries({ + type:'line', + data:data, + yAxis: yAxis, + connectNulls:false, + name: series_name, + gapSize: 1000, + }); + series_color = series.color; + + axis = _chart.yAxis[yAxis] + axis.setTitle({'text':axis_title, 'style':{'color':series_color}}); + axis.setTitle({'text':axis_title, 'style':{'color':series_color}}); + axis.update({'ColorString':series_color}); + extremes = axis.getExtremes(); + axis.setExtremes(extremes.dataMin,extremes.dataMax); +} + +function removeSeries(yAxis) { + axis = _chart.yAxis[yAxis]; + axis.series[0].remove(); + axis.setTitle({text:''}) + axis.setExtremes(); +} \ No newline at end of file diff --git 
a/src/dataloaderinterface/static/timeseries_visualization/js/visualization.js b/src/dataloaderinterface/static/timeseries_visualization/js/visualization.js new file mode 100644 index 00000000..2270aa42 --- /dev/null +++ b/src/dataloaderinterface/static/timeseries_visualization/js/visualization.js @@ -0,0 +1,301 @@ +var _chart; + +var _axes = []; +var _resultMetadata = {}; +var _resultsTimeSeries = {}; + +var min_date; +var max_date; + +var _init = true; + +$(function () { + initChart('cht_ts'); + //default to last year for xaxis + max_date = new Date(Date.now()); + min_date = new Date(Date.now()); + min_date.setFullYear(min_date.getFullYear() - 1); + updatePlotDateRange(min_date, max_date); + + if (_samplingfeaturecode !== undefined) { + getSamplingFeatureMetadata(_samplingfeaturecode); + } + + ajax({method:'get_sampling_features'}, populateSamplingFeatureSelect); + + $(document).on('click', '.plottable-series', function() { + resultid = $(this).attr('id').split("_")[1]; + checked = $(this).prop('checked'); + changeTimeSeries(resultid, checked); + }); + + $('#load-site').on('click', function() { + samplingfeaturecode = $('#site-select').find(':selected').prop('id'); + getSamplingFeatureMetadata(samplingfeaturecode); + }); + + $('#btnSetPlotOptions').on('click', function() { + min = $('#dpd1').val(); + min_date = null; + if (min !== '') {min_date = new Date(min);} + max = $('#dpd2').val(); + max_date = null; + if (max !== '') {max_date = new Date(max);} + updatePlotDateRange(min_date, max_date); + }); + + $('#btnLastYear').on('click', function() { + max_date = new Date(Date.now()); + min_date = new Date(Date.now()); + min_date.setFullYear(min_date.getFullYear() - 1); + updatePlotDateRange(min_date, max_date); + }); + + $('#btnLastMonth').on('click', function() { + max_date = new Date(Date.now()); + min_date = new Date(Date.now()); + min_date.setMonth(min_date.getMonth() - 1); + updatePlotDateRange(min_date, max_date); + }) + + $('#btnAll').on('click', function() { + 
min_date = null; + max_date = null; + updatePlotDateRange(min_date, max_date); + }); + + $('#series-filter').on('change', function() { + filter_text = $('#series-filter').val().toLowerCase(); + $series = $('#plottableSeries') + $series.find('.series-panel').each(function(index, element) { + if (element.innerText.toLowerCase().includes(filter_text)) { + $(element).show(); + } + else { + $(element).hide(); + } + }); + }); + + $('.message-box>input').on('click', function(){ + $('.message-box').hide(); + }); + +}); + +function displayMessage(title, msg) { + $('.message-box #title').text(title); + $('.message-box #msg').text(msg); + $('.message-box').show(); +} + +function dateToString(date) { + if (date !== null && date !== undefined) { + year = date.getUTCFullYear(); + month = date.getUTCMonth() + 1; + day = date.getUTCDate(); + return `${year}-${month}-${day}`; + } + return ''; +} + +function updatePlotDateRange(min, max) { + $('#dpd1').val(''); + $('#dpd2').val(''); + if (min != null) { + $('#dpd1').val(dateToString(min)); + min = min.getTime(); + } + if (max != null) { + $('#dpd2').val(dateToString(max)); + max = max.getTime(); + } + _chart.xAxis[0].update({'min':min, 'max':max}); + _chart.xAxis[0].setExtremes(); +} + +function changeTimeSeries(result_id, checked) { + $plotted = $('#plottedSeries'); + $notplotted = $('#plottableSeries'); + $panel = $(`#series-panel_${result_id}`) + if ($plotted.children().length == 6 && checked) { + $input = $panel.find('input') + $($input).prop("checked",false); + displayMessage("Warning: Too Many Time Series Selected", + "A maximum of six(6) time series can be plotted at a single time. Please " + + "remove a plot series by unchecking it prior to plotting any additional " + + "time series." 
+ ); + return; + } + + if (checked) { + $panel.remove(); + $plotted.append($panel); + if (result_id in _resultsTimeSeries) { + plotSeries(_resultsTimeSeries[result_id], result_id) + } + else { + getTimeseriesData(result_id); + } + } + if (!checked) { + unPlotSeries(result_id); + $panel.remove(); + $notplotted.append($panel); + } +} + +function initAddSeries(response) { + response_obj = JSON.parse(response); + for ([index, metadata] of Object.entries(response_obj)) { + _resultMetadata[metadata.resultid] = metadata + } + populateSeriesBlock(); + if (_init) { + _init = false; + if (_result_id !== undefined && _result_id !== '') { + changeTimeSeries(_result_id, true); + $(`#plot-series-check_${_result_id}`).prop('checked', true); + } + } +} + +function populateSeriesBlock(){ + $block = $('#plottableSeries') + $block.empty(); + for ([key, metadata] of Object.entries(_resultMetadata)) { + $panel = makeSeriesPanel(metadata); + $block.append($panel); + } +} + +function makeSeriesPanel(metadata) { + zlocation_text = '' + if (metadata.zlocation !== undefined && metadata.zlocation !== null) { + zlocation_text = `: ${metadata.zlocation} ${metadata.zlocationunits}` + } + + $panel = $(`
`) + $panel.append(``); + $panel.append(`` + + `${metadata.samplingfeaturecode} `+ + `(${metadata.sampledmediumcv}`+ + `${zlocation_text})
` + + `${metadata.variablecode} ` + + `(${metadata.unitsabbreviation})
` + + `UUID: ${metadata.resultuuid}` + + `
`); + return $panel +} + +function getEmptyAxis() { + for(i=0; i<_axes.length; i++) { + if (_axes[i] == -999 ) { + return i; + }; + } + return -1; +} + +function unPlotSeries(resultid) { + for(i=0; i<_axes.length; i++) { + if (_axes[i] == resultid) { + removeSeries(i); + _axes[i] = -999; + }; + } +} + +function getTimeseriesDataCallback(response_data) { + response_json = JSON.parse(response_data); + resultid = response_json.result_id; + _resultsTimeSeries[resultid] = { + 'x':Object.values(response_json.data.valuedatetime), + 'y':Object.values(response_json.data.datavalue) + }; + plotSeries(_resultsTimeSeries[resultid],resultid); +} + +function plotSeries(timeseriesData, resultid) { + x = timeseriesData['x']; + y = timeseriesData['y']; + metadata = _resultMetadata[resultid] + axis = getEmptyAxis() + if (axis >= 0) { + _axes[axis] = resultid; + + zlocation_text = '' + if (metadata.zlocation !== undefined && metadata.zlocation !== null) { + zlocation_text = `: ${metadata.zlocation} ${metadata.zlocationunits}` + } + + axis_title = `[${metadata.samplingfeaturecode} ${metadata.sampledmediumcv} ${zlocation_text}] ` + + `${metadata.variablecode} (${metadata.unitsabbreviation})`; + + addSeries(axis, axis_title, axis_title, x, y); + } +} + +function ajax(request_data, callback_success, callback_fail, url='/dataloader/ajax/') { + $.ajax({ + url: url, + data: {request_data: JSON.stringify(request_data)}, + method: 'POST', + success: function(response) { + if (typeof (callback_success) !== 'undefined') { + callback_success(response); + } + else if (typeof (response) !== 'undefined') { + return response ; + } + }, + fail: function(response) { + if (typeof (callback_fail) !== 'undefined') { + callback_fail(response); + } + else if (typeof (response) !== 'undefined') { + return response ; + } + } + }); +} + +function getSamplingFeatureMetadata(sampling_feature_code) { + request_data = { + method: 'get_sampling_feature_metadata', + sampling_feature_code: sampling_feature_code + } + 
ajax(request_data, initAddSeries); +} + +function getTimeseriesData(resultid, startdate, enddate) { + request_data = { + method: 'get_result_timeseries', + resultid: resultid, + startdate: startdate, + enddate: enddate + } + ajax(request_data, getTimeseriesDataCallback); +} + +function populateSamplingFeatureSelect(response) { + $select = $('#site-select'); + $select.empty(); + + data = JSON.parse(response); + + for ([index, samplingFeature] of Object.entries(data)) { + var selected = '' + if (samplingFeature.samplingfeaturecode == _samplingfeaturecode) { + selected = 'selected' + } + option = ``; + $select.append(option); + } +} + diff --git a/src/dataloaderinterface/templates/dataloaderinterface/base.html b/src/dataloaderinterface/templates/dataloaderinterface/base.html index 4f2ab142..76597688 100644 --- a/src/dataloaderinterface/templates/dataloaderinterface/base.html +++ b/src/dataloaderinterface/templates/dataloaderinterface/base.html @@ -82,8 +82,7 @@ Browse Sites diff --git a/src/dataloaderinterface/templates/dataloaderinterface/site_details.html b/src/dataloaderinterface/templates/dataloaderinterface/site_details.html index cd90538e..d80c713f 100644 --- a/src/dataloaderinterface/templates/dataloaderinterface/site_details.html +++ b/src/dataloaderinterface/templates/dataloaderinterface/site_details.html @@ -308,7 +308,7 @@
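Review note: the `ajax()` helper introduced in visualization.js funnels every backend call through a single endpoint. The whole request travels in one form field, `request_data`, whose value is a JSON object carrying a `method` name plus that method's arguments, and the success callbacks `JSON.parse` the response. A minimal Python sketch of that payload convention (the `build_payload` helper is hypothetical; the `request_data` field and the method names come from this diff):

```python
import json

def build_payload(method, **kwargs):
    # Hypothetical helper mirroring the JS side: everything rides in a
    # single 'request_data' form field holding a JSON object whose
    # 'method' key names the server-side handler to invoke.
    request_data = {'method': method, **kwargs}
    return {'request_data': json.dumps(request_data)}

payload = build_payload('get_result_timeseries', resultid=42)

# The server recovers the call spec with json.loads, exactly as
# ajax_router does with request.POST.get('request_data').
decoded = json.loads(payload['request_data'])
assert decoded == {'method': 'get_result_timeseries', 'resultid': 42}
```

A consequence of this design is that the URL carries no routing information itself: adding a new operation requires only a new handler function on the server, not a new URL pattern.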
diff --git a/src/dataloaderinterface/templates/timeseries_visualization/tool.html b/src/dataloaderinterface/templates/timeseries_visualization/tool.html new file mode 100644 index 00000000..73873193 --- /dev/null +++ b/src/dataloaderinterface/templates/timeseries_visualization/tool.html @@ -0,0 +1,104 @@ +{% extends 'dataloaderinterface/base.html' %} +{% load staticfiles %} + +{% block content %} + +
+ + +
+ +
+ +
+
+
+
+
+
+
+
+ {% comment %} + + {% endcomment %} +
+
+
+ Load Site +
+
+ + +
+
+
+
Date Range
+
+ +
+ + + +
+ + + + + + + + + + + + + + + +
Begin
End
+ +
+
+
+
+
Plotted Series +
+
+
+
+
Add Series +
+
+
+ + +
+
+
+
+ +
+
+
+ + + + + + + + + + + + + +{% endblock content %} diff --git a/src/dataloaderinterface/templatetags/helpers.py b/src/dataloaderinterface/templatetags/helpers.py index 971f1ec4..6584a4e7 100644 --- a/src/dataloaderinterface/templatetags/helpers.py +++ b/src/dataloaderinterface/templatetags/helpers.py @@ -33,13 +33,17 @@ def replace_hour(value, argv): return '' -@register.filter("is_stale") +@register.filter("is_stale") # Consider adding timezone: https://docs.djangoproject.com/en/3.2/howto/custom-template-tags/#filters-and-time-zones def is_stale(value, default): + ''' + Used to test if `SiteAlert.last_alerted` datetime value has exceeded a threshold vs datetime.utcnow(), + if and only if `site.latest_measurement.value_datetime` is null. (Anthony note: confirm this logic is correct.) + ''' if not value: return '' try: - if default > 0: + if default.last_alterted > 0: return (datetime.utcnow() - value) > timedelta(hours=default.hours_threshold.total_seconds()/3600) return (datetime.utcnow() - value) > timedelta(hours=72) except AttributeError: diff --git a/src/dataloaderinterface/urls.py b/src/dataloaderinterface/urls.py index 6a61a16b..954f7278 100644 --- a/src/dataloaderinterface/urls.py +++ b/src/dataloaderinterface/urls.py @@ -19,6 +19,8 @@ HomeView, BrowseSitesListView, SiteUpdateView, SiteDeleteView, StatusListView, LeafPackListUpdateView, \ TermsOfUseView, DMCAView, PrivacyView, CookiePolicyView +import dataloaderinterface.views as views + urlpatterns = [ url(r'^$', HomeView.as_view(), name='home'), url(r'^sites/$', SitesListView.as_view(), name='sites_list'), @@ -33,6 +35,8 @@ url(r'^sites/update/(?P.*?)/leafpacks/$', LeafPackListUpdateView.as_view(), name='leafpacks'), url(r'^sites/update/(?P.*)/$', SiteUpdateView.as_view(), name='site_update'), url(r'^sites/delete/(?P.*)/$', SiteDeleteView.as_view(), name='site_delete'), - url(r'^sites/(?P.*)/leafpack/', include('leafpack.urls', namespace='leafpack')), + url(r'^sites/(?P.*)/leafpack/', 
include(('leafpack.urls', 'leafpack'), namespace='leafpack')), url(r'^sites/(?P.*)/$', SiteDetailView.as_view(), name='site_detail'), + url(r'^dataloader/ajax/', views.ajax_router, name='ajax'), + ] diff --git a/src/dataloaderinterface/views.py b/src/dataloaderinterface/views.py index 5a149342..39f7b774 100644 --- a/src/dataloaderinterface/views.py +++ b/src/dataloaderinterface/views.py @@ -6,7 +6,7 @@ from django.contrib.auth.decorators import login_required from django.utils.decorators import method_decorator -from django.core.urlresolvers import reverse, reverse_lazy +from django.urls import reverse, reverse_lazy from django.core.exceptions import ObjectDoesNotExist from django.http.response import HttpResponseRedirect, Http404 from django.shortcuts import redirect @@ -21,6 +21,13 @@ from hydroshare.models import HydroShareResource, HydroShareAccount from leafpack.models import LeafPack +from django.views.decorators.csrf import csrf_exempt +import dataloaderinterface.ajax as ajax +from django.core.handlers.wsgi import WSGIRequest + +import json +from django.http import HttpResponse, JsonResponse +from typing import Union class LoginRequiredMixin(object): @classmethod @@ -119,7 +126,6 @@ def get_context_data(self, **kwargs): context['is_followed'] = self.object.followed_by.filter(id=self.request.user.id).exists() context['can_administer_site'] = self.request.user.is_authenticated and self.request.user.can_administer_site(self.object) context['is_site_owner'] = self.request.user == self.object.django_user - context['tsa_url'] = settings.TSA_URL context['leafpacks'] = LeafPack.objects.filter(site_registration=context['site'].pk).order_by('-placement_date') @@ -332,3 +338,13 @@ def post(self, request, *args, **kwargs): else: messages.error(request, 'There are still some required fields that need to be filled out!') return self.form_invalid(form) + +@csrf_exempt +def ajax_router(request: WSGIRequest) -> Union[JsonResponse,HttpResponse]: + request_data = 
json.loads(request.POST.get('request_data')) + try: + method = getattr(ajax, request_data['method']) + except AttributeError: # invalid method specified + return HttpResponse(status=405) + response = method(request_data) + return JsonResponse(response, safe=False) \ No newline at end of file diff --git a/src/dataloaderservices/views.py b/src/dataloaderservices/views.py index 0243eb4f..62e7e66d 100644 --- a/src/dataloaderservices/views.py +++ b/src/dataloaderservices/views.py @@ -1,10 +1,10 @@ -import codecs import csv import os from collections import OrderedDict -from datetime import timedelta, datetime +from datetime import time, timedelta, datetime -from StringIO import StringIO +from io import StringIO +from django.utils import encoding import requests from django.conf import settings @@ -16,7 +16,6 @@ from django.db.models import QuerySet from django.shortcuts import reverse from rest_framework.generics import GenericAPIView -from unicodecsv.py2 import UnicodeWriter from dataloader.models import SamplingFeature, TimeSeriesResultValue, Unit, EquipmentModel, TimeSeriesResult, Result from django.db.models.expressions import F @@ -34,10 +33,22 @@ from leafpack.models import LeafPack +from typing import Iterable, List, Tuple +from django.core.handlers.wsgi import WSGIRequest + +import pandas as pd + +#PRT - temporary workaround after replacing InfluxDB but not its replacement models +import sqlalchemy +from sqlalchemy.sql import text +from django.conf import settings +_dbsettings = settings.DATABASES['odm2'] +_connection_str = f"postgresql://{_dbsettings['USER']}:{_dbsettings['PASSWORD']}@{_dbsettings['HOST']}:{_dbsettings['PORT']}/{_dbsettings['NAME']}" +_db_engine = sqlalchemy.create_engine(_connection_str) + # TODO: Check user permissions to edit, add, or remove stuff with a permissions class. # TODO: Use generic api views for create, edit, delete, and list. 
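Review note: `ajax_router` resolves the requested method name to a function on the `dataloaderinterface.ajax` module via `getattr` and answers 405 when no such handler exists. A self-contained sketch of that dispatch pattern (the `_Handlers` stand-in, its method, and the returned codes are illustrative, not from the codebase):

```python
import json

class _Handlers:
    # Illustrative stand-in for the dataloaderinterface.ajax module.
    @staticmethod
    def get_sampling_features(request_data):
        return ['SITE_A', 'SITE_B']  # hypothetical sampling feature codes

def route(raw_request_data):
    # Decode the 'request_data' JSON, look up the named handler, call it.
    request_data = json.loads(raw_request_data)
    try:
        handler = getattr(_Handlers, request_data['method'])
    except AttributeError:
        return 405, None  # unknown method name
    return 200, handler(request_data)

status, body = route(json.dumps({'method': 'get_sampling_features'}))
assert (status, body) == (200, ['SITE_A', 'SITE_B'])
status_bad, _ = route(json.dumps({'method': 'no_such_method'}))
assert status_bad == 405
```

Keeping only the `getattr` inside the `try` ensures an `AttributeError` raised while a valid handler executes propagates as a real error rather than being reported as an unknown method. Bare `getattr` dispatch also exposes every attribute of the target module, so a whitelist of allowed method names is worth considering.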
- class ModelVariablesApi(APIView): authentication_classes = (SessionAuthentication, ) @@ -172,19 +183,23 @@ def post(self, request, format=None): class SensorDataUploadView(APIView): authentication_classes = (SessionAuthentication,) header_row_indicators = ('Data Logger', 'Sampling Feature', - 'Sensor', 'Variable', 'Result', 'Date and Time') + 'Sensor', 'Variable', 'Result', 'Date and Time','Code') def should_skip_row(self, row): if row[0].startswith(self.header_row_indicators): return True + def decode_utf8_sig(self, input_iterator:Iterable) -> str: + for item in input_iterator: + yield item.decode('utf-8-sig') + def build_results_dict(self, data_file): results = {'utc_offset': 0, 'site_uuid': '', 'results': {}} got_feature_uuid = False got_result_uuids = False got_UTC_offset = False - for index, row in enumerate(csv.reader(data_file)): + for row in csv.reader(self.decode_utf8_sig(data_file)): if row[0].startswith('Sampling Feature') and not got_feature_uuid: results['site_uuid'] = row[0].replace( @@ -248,63 +263,55 @@ def post(self, request, *args, **kwargs): data_value_units = Unit.objects.get(unit_name='hour minute') sensors = registration.sensors.all() - reader = csv.reader(data_file) - for row in reader: + warnings = [] + for row in csv.reader(self.decode_utf8_sig(data_file)): if self.should_skip_row(row): continue else: - # process data series try: measurement_datetime = parse_datetime(row[0]) - except ValueError: - print('invalid date {}'.format(row[0])) + if not measurement_datetime: + measurement_datetime = pd.to_datetime(row[0]) + if not measurement_datetime: + raise ValueError + except (ValueError, TypeError): + warnings.append('Unrecognized date format: {}'.format(row[0])) continue - - if not measurement_datetime: - print('invalid date format {}'.format(row[0])) - continue - measurement_datetime = measurement_datetime.replace(tzinfo=None) - timedelta(hours=results_mapping['utc_offset']) for sensor in sensors: uuid = str(sensor.result_uuid) if uuid not 
in results_mapping['results']:
-                print('uuid {} in file does not correspond to a measured variable in {}'.format(uuid, registration.sampling_feature_code))
+                #TODO - consider revised approach where we loop over column in CSV and not all sensors
+                #this would allow us to return warning that result uuid is not recognized.
                 continue

             data_value = row[results_mapping['results'][uuid]['index']]
-
-            results_mapping['results'][uuid]['values'].append((
-                long((measurement_datetime - datetime.utcfromtimestamp(0)).total_seconds()),  # -> timestamp
-                data_value  # -> data value (duh)
-            ))
-
-            try:
-                # Create data value
-                TimeSeriesResultValue.objects.create(
+            result_value = TimeseriesResultValueTechDebt(
                 result_id=sensor.result_id,
-                value_datetime_utc_offset=results_mapping['utc_offset'],
+                data_value=data_value,
+                utc_offset=results_mapping['utc_offset'],
                 value_datetime=measurement_datetime,
-                censor_code_id='Not censored',
-                quality_code_id='None',
+                censor_code='Not censored',
+                quality_code='None',
                 time_aggregation_interval=1,
-                time_aggregation_interval_unit=data_value_units,
-                data_value=data_value
-            )
-            except IntegrityError as ie:
-                print('value not created for {}'.format(uuid))
+                time_aggregation_interval_unit=data_value_units.unit_id,
+            )
+            try:
+                result = InsertTimeseriesResultValues(result_value)
+            except Exception as e:
+                warnings.append(f"Error inserting value '{data_value}'"\
+                    f"at datetime '{measurement_datetime}' for result uuid '{uuid}'")
                 continue

-        print('updating sensor metadata')
+        #block is responsible for keeping separate dataloader database metadata in sync
+        #long term plan is to eliminate this, but need to keep for the now
         for sensor in sensors:
             uuid = str(sensor.result_uuid)
             if uuid not in results_mapping['results']:
                 print('uuid {} in file does not correspond to a measured variable in {}'.format(uuid, registration.sampling_feature_code))
                 continue
-            last_data_value = row[results_mapping['results'][uuid]['index']]
-
-            # create last measurement object
             last_measurement = SensorMeasurement.objects.filter(sensor=sensor).first()
             if not last_measurement or last_measurement and last_measurement.value_datetime < measurement_datetime:
                 last_measurement and last_measurement.delete()
@@ -314,31 +321,17 @@ def post(self, request, *args, **kwargs):
                 value_datetime_utc_offset=timedelta(hours=results_mapping['utc_offset']),
                 data_value=last_data_value
             )
-
-        # Insert data values into influx instance.
-        influx_request_url = settings.INFLUX_UPDATE_URL
-        influx_series_template = settings.INFLUX_UPDATE_BODY
-
-        all_values = results_mapping['results'][uuid]['values']  # -> [(timestamp, data_value), ]
-        influx_request_body = '\n'.join(
-            [influx_series_template.format(
-                result_uuid=uuid.replace('-', '_'),
-                data_value=value,
-                utc_offset=results_mapping['utc_offset'],
-                timestamp_s=timestamp
-            ) for timestamp, value in all_values]
-        )
-
-        requests.post(influx_request_url, influx_request_body.encode())
-
-        # send email informing the data upload is done
-        print('sending email')
-        subject = 'Data Sharing Portal data upload completed'
-        message = 'Your data upload for site {} is complete.'.format(registration.sampling_feature_code)
-        sender = "\"Data Sharing Portal Upload\" "
-        addresses = [request.user.email]
-        if send_mail(subject, message, sender, addresses, fail_silently=True):
-            print('email sent!')
+        #end meta data syncing block
+
+        #TODO: Decouple email from this method by having email sender class
+        #subject = 'Data Sharing Portal data upload completed'
+        #message = 'Your data upload for site {} is complete.'.format(registration.sampling_feature_code)
+        #sender = "\"Data Sharing Portal Upload\" "
+        #addresses = [request.user.email]
+        #if send_mail(subject, message, sender, addresses, fail_silently=True):
+        #    print('email sent!')
+        if warnings:
+            return Response({'warnings': warnings}, status.HTTP_206_PARTIAL_CONTENT)

         return Response({'message': 'file has been processed successfully'}, status.HTTP_200_OK)

@@ -347,7 +340,7 @@ class CSVDataApi(View):
     date_format = '%Y-%m-%d %H:%M:%S'

-    def get(self, request, *args, **kwargs):
+    def get(self, request:WSGIRequest, *args, **kwargs) -> HttpResponse:
         """
         Downloads csv file for given result id's.
@@ -362,9 +355,7 @@ def get(self, request, *args, **kwargs):
             result_ids = [request.GET.get('result_id', [])]
         elif 'result_ids' in request.GET:
             result_ids = request.GET['result_ids'].split(',')
-
-        result_ids = filter(lambda x: len(x) > 0, result_ids)
-
+
         if not len(result_ids):
             return Response({'error': 'Result ID(s) not found.'})
@@ -378,7 +369,7 @@ def get(self, request, *args, **kwargs):
         return response

     @staticmethod
-    def get_csv_file(result_ids, request=None):  # type: (list, any) -> (str, StringIO)
+    def get_csv_file(result_ids:List[str], request:WSGIRequest=None) -> Tuple[str, StringIO]:
         """
         Gathers time series data for the passed in result id's to generate a csv file for download
         """
@@ -404,7 +395,7 @@ def get_csv_file(result_ids, request=None):  # type: (list, any) -> (str, String
             raise ValueError('Time Series Result(s) not found (result id(s): {}).'.format(', '.join(result_ids)))

         csv_file = StringIO()
-        csv_writer = UnicodeWriter(csv_file)
+        csv_writer = csv.writer(csv_file)
         csv_file.write(CSVDataApi.generate_metadata(time_series_result, request=request))
         csv_writer.writerow(CSVDataApi.get_csv_headers(time_series_result))
         csv_writer.writerows(CSVDataApi.get_data_values(time_series_result))
@@ -425,13 +416,13 @@ def get_csv_file(result_ids, request=None):  # type: (list, any) -> (str, String
         return filename, csv_file

     @staticmethod
-    def get_csv_headers(ts_results):  # type: ([TimeSeriesResult]) -> None
+    def get_csv_headers(ts_results:List[TimeSeriesResult]) -> None:
         headers = ['DateTime', 'TimeOffset', 'DateTimeUTC']
         var_codes = [ts_result.result.variable.variable_code for ts_result in ts_results]
         return headers + CSVDataApi.clean_variable_codes(var_codes)

     @staticmethod
-    def clean_variable_codes(varcodes):  # type: ([str]) -> [str]
+    def clean_variable_codes(varcodes:List[str]) -> List[str]:
         """
         Looks for duplicate variable codes and appends a number if collisions exist.
@@ -451,7 +442,7 @@ def clean_variable_codes(varcodes):  # type: ([str]) -> [str]
         return varcodes

     @staticmethod
-    def get_data_values(time_series_results):  # type: (QuerySet) -> object
+    def get_data_values(time_series_results:QuerySet) -> object:
         result_ids = [result_id[0] for result_id in time_series_results.values_list('pk')]
         data_values_queryset = TimeSeriesResultValue.objects.filter(result_id__in=result_ids).order_by('value_datetime').values('value_datetime', 'value_datetime_utc_offset', 'result_id', 'data_value')
         data_values_map = OrderedDict()
@@ -463,7 +454,7 @@ def get_data_values(time_series_results):  # type: (QuerySet) -> object
             })

         data = []
-        for timestamp, values in data_values_map.iteritems():
+        for timestamp, values in data_values_map.items():
             local_timestamp = timestamp + timedelta(hours=values['utc_offset'])
             row = [
                 local_timestamp.strftime(CSVDataApi.date_format),  # Local DateTime
@@ -481,22 +472,23 @@ def get_data_values(time_series_results):  # type: (QuerySet) -> object
         return data

     @staticmethod
-    def read_file(fname):
+    def read_file(fname:str) -> str:
         fpath = os.path.join(os.path.dirname(__file__), 'csv_templates', fname)
-        with codecs.open(fpath, 'r', encoding='utf-8') as fin:
-            return fin.read()
+        with open(fpath, 'r', encoding='utf8') as f:
+            contents = f.read()
+        return contents

     @staticmethod
-    def generate_metadata(time_series_results, request=None):  # type: (QuerySet, any) -> str
-        metadata = str()
+    def generate_metadata(time_series_results:QuerySet, request:WSGIRequest=None) -> str:
+        metadata = ''

         # Get the first TimeSeriesResult object and use it to get values for the
         # "Site Information" block in the header of the CSV
         tsr = time_series_results.first()
         site_sensor = SiteSensor.objects.select_related('registration').filter(result_id=tsr.result.result_id).first()
-        metadata += CSVDataApi.read_file('site_information.txt').format(
-            site=site_sensor.registration
-        ).encode('utf-8')
+        site_info_template = CSVDataApi.read_file('site_information.txt')
+        site_info_template = site_info_template.format(site=site_sensor.registration)
+        metadata += site_info_template

         time_series_results_as_list = [tsr for tsr in time_series_results]

@@ -581,77 +573,57 @@ class TimeSeriesValuesApi(APIView):
     authentication_classes = (UUIDAuthentication, )

     def post(self, request, format=None):
-        # make sure that the data is in the request (sampling_feature, timestamp(?), ...) if not return error response
-        # if 'sampling_feature' not in request.data or 'timestamp' not in request.data:
         if not all(key in request.data for key in ('timestamp', 'sampling_feature')):
             raise exceptions.ParseError("Required data not found in request.")

-        # parse the received timestamp
         try:
             measurement_datetime = parse_datetime(request.data['timestamp'])
         except ValueError:
             raise exceptions.ParseError('The timestamp value is not valid.')
-
         if not measurement_datetime:
             raise exceptions.ParseError('The timestamp value is not well formatted.')
-
         if measurement_datetime.utcoffset() is None:
             raise exceptions.ParseError('The timestamp value requires timezone information.')
-
         utc_offset = int(measurement_datetime.utcoffset().total_seconds() / timedelta(hours=1).total_seconds())
-
-        # saving datetimes in utc time now.
         measurement_datetime = measurement_datetime.replace(tzinfo=None) - timedelta(hours=utc_offset)

-        # get odm2 sampling feature if it matches sampling feature uuid sent
         sampling_feature = SamplingFeature.objects.filter(sampling_feature_uuid__exact=request.data['sampling_feature']).first()
         if not sampling_feature:
             raise exceptions.ParseError('Sampling Feature code does not match any existing site.')
-
-        # get all feature actions related to the sampling feature, along with the results, results variables, and actions.
         feature_actions = sampling_feature.feature_actions.prefetch_related('results__variable', 'action').all()
+        errors = []
         for feature_action in feature_actions:
             result = feature_action.results.all().first()
-            site_sensor = SiteSensor.objects.filter(result_id=result.result_id).first()
-
-            is_first_value = result.value_count == 0
-
-            # don't create a new TimeSeriesValue for results that are not included in the request
             if str(result.result_uuid) not in request.data:
                 continue

-            result_value = TimeSeriesResultValue(
+            result_value = TimeseriesResultValueTechDebt(
                 result_id=result.result_id,
-                value_datetime_utc_offset=utc_offset,
+                data_value=request.data[str(result.result_uuid)],
                 value_datetime=measurement_datetime,
-                censor_code_id='Not censored',
-                quality_code_id='None',
+                utc_offset=utc_offset,
+                censor_code='Not censored',
+                quality_code='None',
                 time_aggregation_interval=1,
-                time_aggregation_interval_unit=Unit.objects.get(unit_name='hour minute'),
-                data_value=request.data[str(result.result_uuid)]
-            )
+                time_aggregation_interval_unit=(Unit.objects.get(unit_name='hour minute')).unit_id)
             try:
-                result_value.save()
+                query_result = InsertTimeseriesResultValues(result_value)
             except Exception as e:
-                # continue adding the remaining measurements in the request.
-                # TODO: use a logger to log the failed request information.
-                continue
-                # raise exceptions.ParseError("{variable_code} value not saved {exception_message}".format(
-                #     variable_code=result.variable.variable_code, exception_message=e
-                # ))
-
+                errors.append(f"Failed to INSERT data for uuid('{result.result_uuid}')")
+
+            # PRT - long term we would like to remove dataloader database but for now
+            # this block of code keeps dataloaderinterface_sensormeasurement table in sync
             result.value_count = F('value_count') + 1
             result.result_datetime = measurement_datetime
             result.result_datetime_utc_offset = utc_offset
-
-            # delete last measurement
+            site_sensor = SiteSensor.objects.filter(result_id=result.result_id).first()
             last_measurement = SensorMeasurement.objects.filter(sensor=site_sensor).first()
             if not last_measurement:
                 SensorMeasurement.objects.create(
                     sensor=site_sensor,
                     value_datetime=result_value.value_datetime,
-                    value_datetime_utc_offset=timedelta(hours=result_value.value_datetime_utc_offset),
+                    value_datetime_utc_offset=timedelta(hours=result_value.utc_offset),
                     data_value=result_value.data_value
                 )
             elif last_measurement and result_value.value_datetime > last_measurement.value_datetime:
@@ -659,11 +631,11 @@ def post(self, request, format=None):
                 SensorMeasurement.objects.create(
                     sensor=site_sensor,
                     value_datetime=result_value.value_datetime,
-                    value_datetime_utc_offset=timedelta(hours=result_value.value_datetime_utc_offset),
+                    value_datetime_utc_offset=timedelta(hours=result_value.utc_offset),
                     data_value=result_value.data_value
                 )

-            if is_first_value:
+            if result.value_count == 0:
                 result.valid_datetime = measurement_datetime
                 result.valid_datetime_utc_offset = utc_offset
@@ -678,21 +650,57 @@ def post(self, request, format=None):
                     'valid_datetime', 'valid_datetime_utc_offset'
                 ])
             except Exception as e:
-                # Temporary fix. TODO: Use logger and be more specific. exception catch is too broad.
+                #PRT - An exception here means the dataloaderinterface data tables will not in sync
+                # for this sensor, but that is better than a fail state where data is lost so pass
+                # expection for now. Long term plan is to remove this whole block of code.
                 pass
-
-            # Insert data value into influx instance.
-            try:
-                influx_request_url = settings.INFLUX_UPDATE_URL
-                influx_request_body = settings.INFLUX_UPDATE_BODY.format(
-                    result_uuid=str(site_sensor.result_uuid).replace('-', '_'),
-                    data_value=result_value.data_value,
-                    utc_offset=result_value.value_datetime_utc_offset,
-                    timestamp_s=long((result_value.value_datetime - datetime.utcfromtimestamp(0)).total_seconds()),
-                )
-                requests.post(influx_request_url, influx_request_body.encode())
-            except Exception as e:
-                # Temporary fix. TODO: Use logger and be more specific. exception catch is too broad.
-                continue
+            # End dataloaderinterface_sensormeasurement sync block
+
+        if errors:
+            return Response(errors, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
         return Response({}, status.HTTP_201_CREATED)
+
+class TimeseriesResultValueTechDebt():
+    def __init__(self,
+            result_id:str,
+            data_value:float,
+            value_datetime:datetime,
+            utc_offset:int,
+            censor_code:str,
+            quality_code:str,
+            time_aggregation_interval:int,
+            time_aggregation_interval_unit:int) -> None:
+        self.result_id = result_id
+        self.data_value = data_value
+        self.utc_offset = utc_offset
+        self.value_datetime = value_datetime
+        self.censor_code = censor_code
+        self.quality_code = quality_code
+        self.time_aggregation_interval = time_aggregation_interval
+        self.time_aggregation_interval_unit = time_aggregation_interval_unit
+
+def InsertTimeseriesResultValues(result_value : TimeseriesResultValueTechDebt) -> None:
+    with _db_engine.connect() as connection:
+        query = text("INSERT INTO odm2.timeseriesresultvalues " \
+            "(valueid, resultid, datavalue, valuedatetime, valuedatetimeutcoffset, " \
+            "censorcodecv, qualitycodecv, timeaggregationinterval, timeaggregationintervalunitsid) " \
+            "VALUES ( " \
+            "(SELECT nextval('odm2.\"timeseriesresultvalues_valueid_seq\"'))," \
+            ":result_id, " \
+            ":data_value, " \
+            ":value_datetime, " \
+            ":utc_offset, " \
+            ":censor_code, " \
+            ":quality_code, " \
+            ":time_aggregation_interval, " \
+            ":time_aggregation_interval_unit);")
+        result = connection.execute(query,
+            result_id=result_value.result_id,
+            data_value=result_value.data_value,
+            value_datetime=result_value.value_datetime,
+            utc_offset=result_value.utc_offset,
+            censor_code=result_value.censor_code,
+            quality_code=result_value.quality_code,
+            time_aggregation_interval=result_value.time_aggregation_interval,
+            time_aggregation_interval_unit=result_value.time_aggregation_interval_unit,
+        )
+        return result
\ No newline at end of file
diff --git a/src/hydroshare/models.py b/src/hydroshare/models.py
index 27c55b73..7bf8001b 100644
--- a/src/hydroshare/models.py
+++ b/src/hydroshare/models.py
@@ -51,7 +51,7 @@ class Meta:

 # HSUAccount - holds information for user's Hydroshare account
 class HydroShareAccount(models.Model):
-    user = models.OneToOneField(settings.AUTH_USER_MODEL, null=True, blank=True, related_name='hydroshare_account')
+    user = models.OneToOneField(settings.AUTH_USER_MODEL, null=True, blank=True, related_name='hydroshare_account', on_delete=models.CASCADE)
     is_enabled = models.BooleanField(default=False)
     ext_id = models.IntegerField(unique=True)  # external hydroshare account id
     token = models.ForeignKey(OAuthToken, db_column='token_id', null=True, on_delete=models.CASCADE)
@@ -118,7 +118,7 @@ class HydroShareResource(models.Model):
     hs_account = models.ForeignKey(HydroShareAccount, db_column='hs_account_id', on_delete=models.CASCADE, null=True, blank=True)
     ext_id = models.CharField(max_length=255, blank=True, null=True, unique=True)  # external hydroshare resource id
-    site_registration = models.OneToOneField(SiteRegistration, related_name='hydroshare_resource')
+    site_registration = models.OneToOneField(SiteRegistration, related_name='hydroshare_resource', on_delete=models.CASCADE)
     sync_type = models.CharField(max_length=255, default='manual', choices=HYDROSHARE_SYNC_TYPES)
     update_freq = models.CharField(max_length=32, verbose_name='Update Frequency', default='daily')
     is_enabled = models.BooleanField(default=True)
diff --git a/src/hydroshare/urls.py b/src/hydroshare/urls.py
index 7e5dab3a..4707a022 100644
--- a/src/hydroshare/urls.py
+++ b/src/hydroshare/urls.py
@@ -1,7 +1,6 @@
 from django.conf.urls import url

-from hydroshare.views import HydroShareResourceUpdateView, HydroShareResourceCreateView, HydroShareResourceDeleteView, \
-    OAuthAuthorize, OAuthRedirect
+from hydroshare.views import (HydroShareResourceUpdateView, HydroShareResourceCreateView, HydroShareResourceDeleteView, OAuthAuthorize, OAuthRedirect)

 app_name = 'hydroshare'

 urlpatterns = [
diff --git a/src/hydroshare/views.py b/src/hydroshare/views.py
index 652f9026..b40711ff 100644
--- a/src/hydroshare/views.py
+++ b/src/hydroshare/views.py
@@ -11,7 +11,7 @@
 from django.contrib.auth.decorators import login_required
 from django.utils import timezone
-from django.core.urlresolvers import reverse
+from django.urls import reverse
 from django.core.exceptions import ObjectDoesNotExist
 from django.http.response import HttpResponse, JsonResponse, HttpResponseServerError
 from django.shortcuts import redirect
@@ -438,7 +438,7 @@ def get(self, request, *args, **kwargs):
         token_dict = AuthUtil.authorize_client_callback(request)  # type: dict
         auth_util = AuthUtil.authorize(token=token_dict)  # type: AuthUtil
     except Exception as e:
-        print 'Authorizition failure: {}'.format(e)
+        print('Authorizition failure: {}'.format(e))
         return HttpResponse(mark_safe("Error: Authorization failure! {e}".format(e=e)))

     client = auth_util.get_client()  # type: HydroShareAdapter
diff --git a/src/hydroshare_util/auth.py b/src/hydroshare_util/auth.py
index aee64ab1..f6342fc1 100644
--- a/src/hydroshare_util/auth.py
+++ b/src/hydroshare_util/auth.py
@@ -3,7 +3,7 @@
 import requests
 from oauthlib.oauth2 import InvalidGrantError
 from hs_restclient import HydroShareAuthOAuth2, HydroShareAuthBasic
-from adapter import HydroShareAdapter
+from hydroshare_util.adapter import HydroShareAdapter
 from . import HydroShareUtilityBaseClass, ImproperlyConfiguredError
 from django.shortcuts import redirect
 import logging as logger
diff --git a/src/hydroshare_util/resource.py b/src/hydroshare_util/resource.py
index 480905e1..2ae1cb5a 100644
--- a/src/hydroshare_util/resource.py
+++ b/src/hydroshare_util/resource.py
@@ -7,7 +7,7 @@
 from hs_restclient import HydroShareNotFound, HydroShareNotAuthorized
 from hydroshare_util.adapter import HydroShareAdapter
 from . import HydroShareUtilityBaseClass
-from coverage import CoverageFactory, Coverage
+from hydroshare_util.coverage import CoverageFactory, Coverage

 TMP_FILE_PATH = '~$hydroshare_tmp_files'
diff --git a/src/leafpack/models.py b/src/leafpack/models.py
index 278efee8..171c8efb 100644
--- a/src/leafpack/models.py
+++ b/src/leafpack/models.py
@@ -7,6 +7,7 @@
 from django.db.models import Sum, Q
 from operator import __or__ as OR
+from functools import reduce

 class Macroinvertebrate(models.Model):
     """
diff --git a/src/leafpack/views.py b/src/leafpack/views.py
index 123d3b9f..8ae9aa80 100644
--- a/src/leafpack/views.py
+++ b/src/leafpack/views.py
@@ -14,7 +14,7 @@
 from .models import LeafPack, Macroinvertebrate, LeafPackType, LeafPackSensitivityGroup
 from .forms import LeafPackForm, LeafPackBugForm, LeafPackBugFormFactory, LeafPackBug
-from csv_writer import LeafPackCSVWriter
+from leafpack.csv_writer import LeafPackCSVWriter

 class LeafPackViewMixin(object):
diff --git a/src/manage.py b/src/manage.py
index cab191e3..d745b1e2 100644
--- a/src/manage.py
+++ b/src/manage.py
@@ -3,7 +3,7 @@
 import sys

 if __name__ == "__main__":
-    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "WebSDL.settings")
+    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "WebSDL.settings.development")

     from django.core.management import execute_from_command_line
diff --git a/src/timeseries_visualization/urls.py b/src/timeseries_visualization/urls.py
new file mode 100644
index 00000000..80448457
--- /dev/null
+++ b/src/timeseries_visualization/urls.py
@@ -0,0 +1,10 @@
+from django.conf.urls import url
+from timeseries_visualization import views
+
+APP_URL = 'tsv'
+
+urlpatterns = [
+    url(r'^' + APP_URL + '/(?P<sampling_feature_code>.*)/(?P<result_id>.*)/$', views.tsv),
+    url(r'^' + APP_URL + '/(?P<sampling_feature_code>.*)/$', views.tsv),
+    url(r'^' + APP_URL + '/$', views.tsv, name='timeseries_visualization'),
+]
\ No newline at end of file
diff --git a/src/timeseries_visualization/views.py b/src/timeseries_visualization/views.py
new file mode 100644
index 00000000..34e1e200
--- /dev/null
+++ b/src/timeseries_visualization/views.py
@@ -0,0 +1,10 @@
+from django.http import HttpResponse, HttpRequest
+from django.shortcuts import render
+
+APP_DIR = 'timeseries_visualization'
+
+def tsv(request:HttpRequest, sampling_feature_code:str='', result_id:str='') -> HttpResponse:
+    args = {}
+    args['sampling_feature_code'] = sampling_feature_code
+    args['result_id'] = result_id
+    return render(request, f'{APP_DIR}/tool.html', args)
\ No newline at end of file
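A note on the timestamp handling kept by this patch: `TimeSeriesValuesApi.post` extracts the UTC offset in whole hours from an aware datetime, then stores a naive datetime shifted to UTC. The normalization can be sketched in isolation; this uses the stdlib `datetime.fromisoformat` as a stand-in for Django's `parse_datetime`, and the function name is illustrative, not part of the codebase:

```python
from datetime import datetime, timedelta

def normalize_to_utc(timestamp: str):
    """Parse an ISO-8601 timestamp, capture its UTC offset in whole hours,
    and return a naive UTC datetime plus the offset, mirroring the
    normalization performed before values are inserted."""
    dt = datetime.fromisoformat(timestamp)  # stand-in for django parse_datetime
    if dt.utcoffset() is None:
        raise ValueError('timestamp requires timezone information')
    # offset in hours, e.g. -18000 seconds -> -5
    utc_offset = int(dt.utcoffset().total_seconds() / timedelta(hours=1).total_seconds())
    # drop tzinfo, then shift by the offset so the stored naive value is UTC
    naive_utc = dt.replace(tzinfo=None) - timedelta(hours=utc_offset)
    return naive_utc, utc_offset

print(normalize_to_utc('2021-06-01T12:00:00-05:00'))
# (datetime.datetime(2021, 6, 1, 17, 0), -5)
```

Note this truncates sub-hour offsets (e.g. +05:30 becomes 5), which matches the integer-division behavior in the patched view.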
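The core pattern introduced by `InsertTimeseriesResultValues` is a raw INSERT with named bind parameters (`:name`) rather than string interpolation, which keeps values out of the SQL text. A minimal sketch of the same pattern against an in-memory SQLite database, since the real code targets `odm2.timeseriesresultvalues` in PostgreSQL through SQLAlchemy's `text()`; the simplified table and column set here are illustrative assumptions:

```python
import sqlite3

# simplified stand-in for the odm2.timeseriesresultvalues table
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE timeseriesresultvalues ('
             'valueid INTEGER PRIMARY KEY AUTOINCREMENT, '
             'resultid INTEGER, datavalue REAL, valuedatetime TEXT, utcoffset INTEGER)')

def insert_value(connection, result_id, data_value, value_datetime, utc_offset):
    # named bind parameters keep values out of the SQL string, mirroring
    # the SQLAlchemy text() query used by InsertTimeseriesResultValues
    query = ('INSERT INTO timeseriesresultvalues '
             '(resultid, datavalue, valuedatetime, utcoffset) '
             'VALUES (:result_id, :data_value, :value_datetime, :utc_offset)')
    return connection.execute(query, {
        'result_id': result_id,
        'data_value': data_value,
        'value_datetime': value_datetime,
        'utc_offset': utc_offset,
    })

insert_value(conn, 42, 17.5, '2021-06-01 17:00:00', -5)
row = conn.execute('SELECT resultid, datavalue FROM timeseriesresultvalues').fetchone()
print(row)  # (42, 17.5)
```

SQLite autoincrements the `valueid` primary key, standing in for the `nextval(...)` sequence call in the PostgreSQL version.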