
Repair EpicsSignal.set() for array PVs #1207

Merged: 12 commits from 1206-EpicsSignal-array-set into master on Aug 1, 2024

Conversation

@prjemian (Contributor) commented Jul 30, 2024

@prjemian added the bug label Jul 30, 2024
@prjemian self-assigned this Jul 30, 2024
@prjemian (Contributor, Author)

Existing CI passes, but this code needs unit tests. I will add them.

@prjemian requested review from tacaswell and a team on July 31, 2024 at 15:36
@prjemian (Contributor, Author)

This is ready for review. All CI tests related directly to this code pass. While the CI fails intermittently, the failures happen during tests of other code and do not appear to be triggered by this change.

@prjemian (Contributor, Author) commented Jul 31, 2024

@grace227 and @mpres0408: You are both invited to comment on this PR.

@prjemian (Contributor, Author)

@tacaswell asked in #1206 (comment):

If you put across CA to an array with less than the full array, is the expected semantics:

  • "here is your whole new array, please reduce your size to match"
  • "here is a partial update, overwrite just the start of the array but leave the rest the same"

?

When we "put" to an array should we be resetting the size as well or is the issue that the monitor is providing up to max size rather than the current size?

Empirically, it is a variation of option 2 above. Write the array with the new content and ZEROES for any remaining items. These arrays are taken directly from the C storage in the IOC.

There is no option for you to change the array size from an EPICS client. EPICS will return the array at the full size declared when the IOC was started. The EPICS waveform record has additional fields that provide some introspection about the full size and the size actually in use. PyEpics has some additional documentation about arrays and waveform records.
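
For concreteness, a minimal sketch of that introspection using pyepics, assuming a hypothetical waveform record at "ioc:wave" (the base PV name is made up; NELM and NORD are standard waveform record fields):

import epics

nelm = epics.caget("ioc:wave.NELM")  # allocated (maximum) element count
nord = epics.caget("ioc:wave.NORD")  # number of elements currently in use
print(f"using {nord} of {nelm} elements")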

@prjemian (Contributor, Author)

All tests pass. Some runners appear to stall out and need to be canceled, then restarted. Seems like SOP for the test suite here.

@tacaswell (Contributor)

I'm having trouble sorting out exactly where the trimming described in the pyepics docs happens, and the observed behavior appears to be at odds with the documented behavior from pyepics, although:

This feature was introduced in Epics CA 3.14.12.1, and may not work for data from IOCs running extremely old versions of Epics base.

My first question: just using pyepics (with none of our layers), do you get the behavior pyepics documents with this PV? If not, is this a bug in pyepics, a really old IOC, or something dependent on the record type?

https://github.com/pyepics/pyepics/blob/eace0a176312965bfc5c1f0e7cd889397a46dc07/epics/pv.py#L578-L579 This looks to be the place where count=None gets converted to something that can be passed to libca. If you change the get in

current_value = signal.get()

to be

current_value = signal.get(count=0)

do we get the IOC to trim the values for us?
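
For concreteness, a minimal sketch of that experiment using pyepics directly, with the PV from the session below (whether count=0 reaches libca unchanged, and so makes the server trim to the valid element count, is exactly the open question):

import epics

pv = epics.PV("gp:userArrayCalc1.AA")
pv.wait_for_connection()
print(len(pv.get()))          # full allocated length
print(repr(pv.get(count=0)))  # inspect whether the server trims the result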

I wonder if we are getting some "latching" in the metadata that we do not want...

Would it be better to pad the "target" array to include the expected 0s?

I am surprised this is only coming up now, do we really use variable length arrays super rarely?

If this is the only way to fix this, then so be it. But, as usual, I'm really worried about adding heuristics that run counter to the plain reading of the intent, as they tend to spawn their own workarounds later. [1, 2] and [1, 2, 0] as arrays are not "the same" under most understandings of "the same". (My own counterexample: tack a couple of extra 0s onto both, and then it makes sense if they are representing null-terminated strings, but that is a very specific case.)

@prjemian (Contributor, Author)

The EPICS IOC always gives us the full array.

@prjemian (Contributor, Author)

Writing $m$ points to an array of $n$ (where $m \le n$) is as common as raindrops, not unusual or rare. We encountered this problem with PyEpics as the control layer, and there is nothing special-case about it.

@prjemian (Contributor, Author)

Here's the PyEpics version:

In [2]: pv = epics.PV("gp:userArrayCalc1.AA")

In [3]: pv.wait_for_connection()
Out[3]: True

In [4]: len(pv.get())
Out[4]: 8000

In [5]: len(pv.get(count=10))
Out[5]: 10

In [6]: pv.get(count=10)
Out[6]: 
array([0.61020738, 0.35412478, 0.39598853, 0.06866379, 0.52719417,
       0.        , 0.        , 0.        , 0.        , 0.        ])

with this PV:

gp:userArrayCalc1.AA
    State:            connected
    Host:             192.168.144.94:45409
    Access:           read, write
    Native data type: DBF_DOUBLE
    Request type:     DBR_DOUBLE
    Element count:    8000

The IOC is from EPICS base 7.0.5, synApps 6.2.1 as a docker image.

@prjemian (Contributor, Author)

Could we call .get(count=m) from this .set() method, by modifying the _wait_for_value() function (when val is array-like)? Both occurrences of

current_value = signal.get()

would change. That might completely avoid the proposed modifications to _compare_maybe_enum().

@prjemian (Contributor, Author)

I am surprised this is only coming up now, do we really use variable length arrays super rarely?

Me, too. Maybe this is the first time we are seeing user-written code write shortened arrays to array PVs. Users are starting to write code that exercises EPICS IOC-directed fly scans. (Either the IOC directs the fly scan, such as with the sscan record, or the IOC tells the hardware to run the fly scan.)

@prjemian (Contributor, Author)

After commenting out my proposal, this diff implements the changes to _wait_for_value() described above:

@@ -267,7 +267,9 @@ def _wait_for_value(signal, val, poll_time=0.01, timeout=10, rtol=None, atol=Non
     TimeoutError if timeout is exceeded
     """
     expiration_time = ttime.time() + timeout if timeout is not None else None
-    current_value = signal.get()
+    array_like = (list, np.ndarray, tuple)
+    item_count = None if not isinstance(val, array_like) else len(val)
+    current_value = signal.get(count=item_count)
 
     if atol is None and hasattr(signal, "tolerance"):
         atol = signal.tolerance
@@ -305,7 +307,7 @@ def _wait_for_value(signal, val, poll_time=0.01, timeout=10, rtol=None, atol=Non
         ttime.sleep(poll_time)
         if poll_time < 0.1:
             poll_time *= 2  # logarithmic back-off
-        current_value = signal.get()
+        current_value = signal.get(count=item_count)
         if expiration_time is not None and ttime.time() > expiration_time:
             raise TimeoutError(
                 "Attempted to set %r to value %r and timed "

The tests pass locally (with a couple of adjustments).
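
As a usage sketch of the repaired behavior, assuming the array PV from the pyepics session above (the FailedStatus on timeout is what the linked issue reported):

from ophyd import EpicsSignal

sig = EpicsSignal("gp:userArrayCalc1.AA", name="sig")
sig.wait_for_connection()

# Write fewer elements than the PV's allocated length (8000 here).
# With the change above, the readback comparison requests only
# len(value) elements, so the status completes instead of timing out.
status = sig.set([1.0, 2.0, 3.0])
status.wait(timeout=10)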

ophyd/utils/epics_pvs.py: three review threads (outdated, resolved)
@tacaswell (Contributor)

I like this solution a lot better, but am very confused why the new tests are passing!

@prjemian (Contributor, Author)

I like that it tests with both PyEpics and caproto. Which tests concern you? For me, I expected this one to raise an Exception:

[[1, 2, 3], [1, 2, 3, 0], [], None, None, False], # different shape

(bluesky_2024_2) prjemian@arf:~/.../Bluesky/ophyd$ pytest -vvv --lf ./ophyd/tests/test_utils.py -k test_compare_maybe_enum
================================================================ test session starts =================================================================
platform linux -- Python 3.11.0, pytest-8.2.2, pluggy-1.5.0 -- /home/prjemian/.conda/envs/bluesky_2024_2/bin/python3.11
cachedir: .pytest_cache
rootdir: /home/prjemian/Documents/projects/Bluesky/ophyd
configfile: pytest.ini
plugins: anyio-4.4.0, pytest_notebook-0.10.0
collected 84 items / 38 deselected / 46 selected                                                                                                     
run-last-failure: 66 known failures not in selected tests

ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-0-0-enums0-None-None-True] PASSED                                                   [  2%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-0-1-enums1-None-None-False] PASSED                                                  [  4%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-NO-0-enums2-None-None-True] PASSED                                                  [  6%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-NO-1-enums3-None-None-False] PASSED                                                 [  8%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-0-YES-enums4-None-None-False] PASSED                                                [ 10%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-1-YES-enums5-None-None-True] PASSED                                                 [ 13%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-NO-YES-enums6-None-None-False] PASSED                                               [ 15%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-YES-YES-enums7-None-None-True] PASSED                                               [ 17%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a8-b8-enums8-None-None-True] PASSED                                                 [ 19%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a9-b9-enums9-None-None-False] PASSED                                                [ 21%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a10-b10-enums10-None-None-False] PASSED                                             [ 23%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-5-b11-enums11-None-None-False] PASSED                                               [ 26%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a12-5-enums12-None-None-False] PASSED                                               [ 28%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a13-b13-enums13-None-None-True] PASSED                                              [ 30%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a14-b14-enums14-None-None-True] PASSED                                              [ 32%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a15-b15-enums15-0.01-None-False] PASSED                                             [ 34%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a16-b16-enums16-0.2-None-True] PASSED                                               [ 36%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums17-None-0.01-False] PASSED                                           [ 39%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums18-None-0.2-True] PASSED                                             [ 41%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-True-False-enums19-None-None-False] PASSED                                          [ 43%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-1-1-enums20-None-None-True] PASSED                                                  [ 45%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums21-0.01-None-False] PASSED                                           [ 47%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums22-0.2-None-True] PASSED                                             [ 50%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-0-0-enums0-None-None-True] PASSED                                                   [ 52%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-0-1-enums1-None-None-False] PASSED                                                  [ 54%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-NO-0-enums2-None-None-True] PASSED                                                  [ 56%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-NO-1-enums3-None-None-False] PASSED                                                 [ 58%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-0-YES-enums4-None-None-False] PASSED                                                [ 60%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-1-YES-enums5-None-None-True] PASSED                                                 [ 63%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-NO-YES-enums6-None-None-False] PASSED                                               [ 65%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-YES-YES-enums7-None-None-True] PASSED                                               [ 67%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a8-b8-enums8-None-None-True] PASSED                                                 [ 69%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a9-b9-enums9-None-None-False] PASSED                                                [ 71%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a10-b10-enums10-None-None-False] PASSED                                             [ 73%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-5-b11-enums11-None-None-False] PASSED                                               [ 76%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a12-5-enums12-None-None-False] PASSED                                               [ 78%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a13-b13-enums13-None-None-True] PASSED                                              [ 80%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a14-b14-enums14-None-None-True] PASSED                                              [ 82%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a15-b15-enums15-0.01-None-False] PASSED                                             [ 84%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a16-b16-enums16-0.2-None-True] PASSED                                               [ 86%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums17-None-0.01-False] PASSED                                           [ 89%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums18-None-0.2-True] PASSED                                             [ 91%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-True-False-enums19-None-None-False] PASSED                                          [ 93%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-1-1-enums20-None-None-True] PASSED                                                  [ 95%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums21-0.01-None-False] PASSED                                           [ 97%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums22-0.2-None-True] PASSED                                             [100%]

================================================================== warnings summary ==================================================================
../../../../.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/epics/ca.py:28
  /home/prjemian/.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/epics/ca.py:28: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import resource_filename

../../../../.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/pkg_resources/__init__.py:2825
../../../../.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/pkg_resources/__init__.py:2825
  /home/prjemian/.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/pkg_resources/__init__.py:2825: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('sphinxcontrib')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=================================================== 46 passed, 38 deselected, 3 warnings in 0.06s ====================================================

@prjemian (Contributor, Author)

Most of the parameter sets I added test existing parts of _compare_maybe_enum(). The array test sets cover existing functionality plus the new case, where the array from EPICS is longer. I expected that one to fail. Now I understand it is filtered out by this line:

if atol is not None or rtol is not None:

I'll try an additional test locally.

        [[1, 2, 3], [1, 2, 3, 0], [], None, 1e-8, None, False],  # different shape

This set raises a ValueError exception from numpy: operands could not be broadcast together with shapes (3,) (4,). With the new changes we won't hit that condition, so it does not belong in the test suite.
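
The broadcast failure is reproducible directly with numpy; a minimal sketch, assuming _compare_maybe_enum() delegates to a numpy closeness check when a tolerance is given:

import numpy as np

# Shapes (3,) and (4,) cannot be broadcast against each other:
np.isclose(np.array([1, 2, 3]), np.array([1, 2, 3, 0]), atol=1e-8)
# ValueError: operands could not be broadcast together with shapes (3,) (4,)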

@prjemian (Contributor, Author) commented Aug 1, 2024

@tacaswell All CI runners have passed. Anything else for this PR?

@tacaswell merged commit 050b98f into master on Aug 1, 2024. 14 checks passed.
@tacaswell deleted the 1206-EpicsSignal-array-set branch August 1, 2024 14:58
@tacaswell (Contributor)

This has the additional benefit of reducing network traffic in cases where we are only using a small part of a big array.

@prjemian (Contributor, Author) commented Aug 1, 2024

@tacaswell Thanks for your contribution here. The final code is much simpler and consistent with the intent of the .set() method.

Successfully merging this pull request may close these issues:

FailedStatus from ophyd while writing array to EpicsSignal