Repair EpicsSignal.set() for array PVs #1207
Conversation
Existing CI passes but this code needs unit tests. I will add them.
This is ready for review. All CI tests related directly to this code pass. While the CI fails randomly, the failures happen during tests of other code and do not appear to be triggered by this code.
@grace227 and @mpres0408: You are both invited to comment on this PR.
@tacaswell asked in #1206 (comment):
Empirically, it is a variation of option 2 above. Write the array with the new content and ZEROES for any remaining items. These arrays are taken directly from the C storage in the IOC. There is no option for you to change the array size from an EPICS client. EPICS will return the array as described when the IOC is started.
All tests pass. Some runners appear to stall out and need to be canceled, then restarted. Seems like SOP for the test suite here.
I'm having trouble sorting out exactly where the trimming described in the pyepics docs is happening; the observed behavior appears to be at odds with the documented behavior from pyepics.
My first question: just using pyepics (with none of our layers), do you get the behavior pyepics documents with this PV? If not, is this a bug in pyepics, a really old IOC, or dependent on the record? https://github.com/pyepics/pyepics/blob/eace0a176312965bfc5c1f0e7cd889397a46dc07/epics/pv.py#L578-L579 looks to be the relevant place in pyepics.
If we change ophyd/ophyd/utils/epics_pvs.py line 308 (at 9c77d86) to be `current_value = signal.get(count=0)`, do we get the IOC to trim the values for us? I wonder if we are getting some "latching" in the metadata that we do not want. Would it be better to pad the "target" array to include the expected 0s? I am surprised this is only coming up now; do we really use variable-length arrays so rarely? If this is the only way to fix this, then so be it, but as usual I'm really worried about adding heuristics that are counter to the plain reading of the intent.
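The padding alternative raised above (extend the short "target" array with the expected zeros out to the PV's full element count, then compare against the full readback) could be sketched like this. This is a hypothetical helper, not ophyd API; the name `pad_to_element_count` is an assumption for illustration:

```python
import numpy as np

def pad_to_element_count(target, element_count):
    """Pad a short target array with trailing zeros out to the PV's
    full element count, so it can be compared against the full
    readback array.  Hypothetical helper, not part of ophyd."""
    arr = np.asarray(target, dtype=float)
    if len(arr) > element_count:
        raise ValueError("target is longer than the PV's element count")
    # EPICS returns the unwritten tail of the C storage as zeros,
    # so zero-padding makes the shapes (and values) comparable.
    return np.concatenate([arr, np.zeros(element_count - len(arr))])
```

With this, `np.allclose(pad_to_element_count([1, 2, 3], 8000), readback)` would compare a 3-element write against the full 8000-element readback without shape errors.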
The EPICS IOC always gives us the full array.
Writing
Here's the PyEpics version:

In [2]: pv = epics.PV("gp:userArrayCalc1.AA")

In [3]: pv.wait_for_connection()
Out[3]: True

In [4]: len(pv.get())
Out[4]: 8000

In [5]: len(pv.get(count=10))
Out[5]: 10

In [6]: pv.get(count=10)
Out[6]:
array([0.61020738, 0.35412478, 0.39598853, 0.06866379, 0.52719417,
       0.        , 0.        , 0.        , 0.        , 0.        ])

with this PV:

gp:userArrayCalc1.AA
    State:            connected
    Host:             192.168.144.94:45409
    Access:           read, write
    Native data type: DBF_DOUBLE
    Request type:     DBR_DOUBLE
    Element count:    8000

The IOC is from EPICS base 7.0.5, synApps 6.2.1 as a docker image.
Could we change the `signal.get()` calls at ophyd/ophyd/utils/epics_pvs.py line 270 (at 9c77d86) and line 308 (at 9c77d86)? Might avoid completely the proposed modifications to
Me, too. Maybe this is the first time we are seeing user-written code writing shortened arrays to array PVs. Users are starting to write code that exercises EPICS IOC-directed fly scans. (Either the IOC directs the fly scan, such as with the sscan record, or the IOC tells hardware to run the fly scan.)
After commenting out my proposal, these diffs implement changes to `_wait_for_value()`:

@@ -267,7 +267,9 @@ def _wait_for_value(signal, val, poll_time=0.01, timeout=10, rtol=None, atol=None):
         TimeoutError if timeout is exceeded
     """
     expiration_time = ttime.time() + timeout if timeout is not None else None
-    current_value = signal.get()
+    array_like = (list, np.ndarray, tuple)
+    item_count = None if not isinstance(val, array_like) else len(val)
+    current_value = signal.get(count=item_count)
     if atol is None and hasattr(signal, "tolerance"):
         atol = signal.tolerance
@@ -305,7 +307,7 @@ def _wait_for_value(signal, val, poll_time=0.01, timeout=10, rtol=None, atol=None):
         ttime.sleep(poll_time)
         if poll_time < 0.1:
             poll_time *= 2  # logarithmic back-off
-        current_value = signal.get()
+        current_value = signal.get(count=item_count)
     if expiration_time is not None and ttime.time() > expiration_time:
         raise TimeoutError(
             "Attempted to set %r to value %r and timed "

They pass the tests locally (with a couple of adjustments).
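The trimming this diff relies on can be sketched in isolation. This is a rough model, assuming `signal.get(count=n)` returns only the first `n` elements (as the PyEpics session above demonstrates); the names `item_count` and `matches` are illustrative, not ophyd API:

```python
import numpy as np

# Array-like types whose length determines how many elements to request.
ARRAY_LIKE = (list, np.ndarray, tuple)

def item_count(val):
    """Elements to request from the PV: len(val) for array-likes,
    None (meaning the whole array) for scalars."""
    return len(val) if isinstance(val, ARRAY_LIKE) else None

def matches(readback, val, atol=1e-8):
    """Compare a readback against the requested value, trimming the
    readback to len(val) the way get(count=item_count) would."""
    n = item_count(val)
    if n is None:
        return bool(np.isclose(readback, val, atol=atol))
    return bool(np.allclose(np.asarray(readback)[:n], val, atol=atol))
```

With an 8000-element readback that is zero past the written values, `matches(readback, [0.61, 0.35])` compares only the first two elements, so the shorter target no longer triggers a shape mismatch.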
I like this solution a lot better, but am very confused why the new tests are passing!
I like that it tests with both PyEpics and caproto. Which tests concern you? For me, I expected this one to raise an Exception: ophyd/ophyd/tests/test_utils.py line 153 (at 9df37cd).
(bluesky_2024_2) prjemian@arf:~/.../Bluesky/ophyd$ pytest -vvv --lf ./ophyd/tests/test_utils.py -k test_compare_maybe_enum
================================================================ test session starts =================================================================
platform linux -- Python 3.11.0, pytest-8.2.2, pluggy-1.5.0 -- /home/prjemian/.conda/envs/bluesky_2024_2/bin/python3.11
cachedir: .pytest_cache
rootdir: /home/prjemian/Documents/projects/Bluesky/ophyd
configfile: pytest.ini
plugins: anyio-4.4.0, pytest_notebook-0.10.0
collected 84 items / 38 deselected / 46 selected
run-last-failure: 66 known failures not in selected tests
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-0-0-enums0-None-None-True] PASSED [ 2%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-0-1-enums1-None-None-False] PASSED [ 4%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-NO-0-enums2-None-None-True] PASSED [ 6%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-NO-1-enums3-None-None-False] PASSED [ 8%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-0-YES-enums4-None-None-False] PASSED [ 10%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-1-YES-enums5-None-None-True] PASSED [ 13%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-NO-YES-enums6-None-None-False] PASSED [ 15%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-YES-YES-enums7-None-None-True] PASSED [ 17%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a8-b8-enums8-None-None-True] PASSED [ 19%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a9-b9-enums9-None-None-False] PASSED [ 21%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a10-b10-enums10-None-None-False] PASSED [ 23%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-5-b11-enums11-None-None-False] PASSED [ 26%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a12-5-enums12-None-None-False] PASSED [ 28%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a13-b13-enums13-None-None-True] PASSED [ 30%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a14-b14-enums14-None-None-True] PASSED [ 32%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a15-b15-enums15-0.01-None-False] PASSED [ 34%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-a16-b16-enums16-0.2-None-True] PASSED [ 36%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums17-None-0.01-False] PASSED [ 39%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums18-None-0.2-True] PASSED [ 41%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-True-False-enums19-None-None-False] PASSED [ 43%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-1-1-enums20-None-None-True] PASSED [ 45%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums21-0.01-None-False] PASSED [ 47%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[caproto-3-3.12345-enums22-0.2-None-True] PASSED [ 50%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-0-0-enums0-None-None-True] PASSED [ 52%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-0-1-enums1-None-None-False] PASSED [ 54%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-NO-0-enums2-None-None-True] PASSED [ 56%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-NO-1-enums3-None-None-False] PASSED [ 58%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-0-YES-enums4-None-None-False] PASSED [ 60%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-1-YES-enums5-None-None-True] PASSED [ 63%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-NO-YES-enums6-None-None-False] PASSED [ 65%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-YES-YES-enums7-None-None-True] PASSED [ 67%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a8-b8-enums8-None-None-True] PASSED [ 69%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a9-b9-enums9-None-None-False] PASSED [ 71%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a10-b10-enums10-None-None-False] PASSED [ 73%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-5-b11-enums11-None-None-False] PASSED [ 76%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a12-5-enums12-None-None-False] PASSED [ 78%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a13-b13-enums13-None-None-True] PASSED [ 80%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a14-b14-enums14-None-None-True] PASSED [ 82%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a15-b15-enums15-0.01-None-False] PASSED [ 84%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-a16-b16-enums16-0.2-None-True] PASSED [ 86%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums17-None-0.01-False] PASSED [ 89%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums18-None-0.2-True] PASSED [ 91%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-True-False-enums19-None-None-False] PASSED [ 93%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-1-1-enums20-None-None-True] PASSED [ 95%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums21-0.01-None-False] PASSED [ 97%]
ophyd/tests/test_utils.py::test_compare_maybe_enum[pyepics-3-3.12345-enums22-0.2-None-True] PASSED [100%]
================================================================== warnings summary ==================================================================
../../../../.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/epics/ca.py:28
/home/prjemian/.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/epics/ca.py:28: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import resource_filename
../../../../.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/pkg_resources/__init__.py:2825
../../../../.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/pkg_resources/__init__.py:2825
/home/prjemian/.conda/envs/bluesky_2024_2/lib/python3.11/site-packages/pkg_resources/__init__.py:2825: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('sphinxcontrib')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================================================================ 46 passed, 38 deselected, 3 warnings in 0.06s ================================================================
Most of the parameter sets I added were to test existing parts of ophyd/ophyd/utils/epics_pvs.py line 353 (at 9df37cd).
I'll try an additional test locally:

[[1, 2, 3], [1, 2, 3, 0], [], None, 1e-8, None, False],  # different shape

This set raises a ValueError exception from numpy.
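The ValueError comes from NumPy broadcasting: arrays of shape (3,) and (4,) cannot be broadcast together, so `np.allclose` raises before any values are compared. A minimal reproduction:

```python
import numpy as np

# Comparing arrays of different lengths raises ValueError rather than
# returning False, because the shapes cannot be broadcast together.
try:
    np.allclose([1, 2, 3], [1, 2, 3, 0])
    shape_error = False
except ValueError:
    shape_error = True

print(shape_error)  # True
```

This is why a shorter target array must be either trimmed against or padded to the readback before a tolerance comparison can run at all.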
@tacaswell All CI runners have passed. Anything else for this PR?
This has the additional benefit of reducing network traffic in cases where we are only using a small part of a big array.
@tacaswell Thanks for your contribution here. The final code is much simpler and consistent with the intent of the
Thanks to @grace227 and @mpres0408 for identifying this problem!