e2e test framework poc needs more flexibility #37
/assign @n3wscott |
Another example would be the API operations for different types https://github.com/knative/docs/blob/master/docs/serving/spec/knative-api-specification-1.0.md#service |
Do you have an example test with code so I can understand? |
If I were using https://github.com/knative-sandbox/reconciler-test/blob/master/pkg/test/context.go I'd write it as follows:

```go
// see poc: https://github.com/knative/networking/blob/de41c209249935d1b1c915ccc6179f2bf60ca221/test/conformance/ingress/run_test.go
func TestService(gotest *testing.T) {
	t := globals.NewT(gotest)
	t.Stable("Service", StableAPIOps)
}

func StableAPIOps(t *test.T) {
	var s v1.Service = someTestService()
	t.Must("Create", func(t *test.T) {
		t.ServingClient.Create(...)
		...
	})
	t.Must("Update", func(t *test.T) {
		t.ServingClient.Update(...)
		...
	})
	t.Must("Get", func(t *test.T) {
		t.ServingClient.Get(...)
		...
	})
	t.Should("Patch", func(t *test.T) {
		t.ServingClient.Patch(...)
		...
	})
	t.Must("Delete", func(t *test.T) {
		t.ServingClient.Delete(...)
		...
	})
}
```

|
I would assume that looks something like:

```go
func StableAPIOpsFeature() *feature.Feature {
	f := new(feature.Feature)
	f.Stable("Service").
		Must("Create", func(ctx context.Context, t *test.T) {
			servingclient.Get(ctx).Create(...)
			...
		}).
		Must("Update", func(ctx context.Context, t *test.T) {
			servingclient.Get(ctx).Update(...)
			...
		}).
		Must("Get", func(ctx context.Context, t *test.T) {
			servingclient.Get(ctx).Get(...)
			...
		}).
		Should("Patch", func(ctx context.Context, t *test.T) {
			servingclient.Get(ctx).Patch(...)
			...
		}).
		Must("Delete", func(ctx context.Context, t *test.T) {
			servingclient.Get(ctx).Delete(...)
			...
		})
	return f
}
```

We need to build some support code, like implementing namespace-scoped clients and injecting them into context: like the tbd

|
The caveat is that each test step should be independent of the previous one. If you depend on ordering, the test will fail, or the step you depend on may be filtered out when only a subset of steps is selected to run. |
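To make the independence and filtering caveat concrete, here is a minimal, stdlib-only sketch of a feature whose steps carry requirement levels (MUST/SHOULD/MAY) and can be filtered at run time. Every type and name here (`Feature`, `Step`, `Level`, `Run`) is invented for illustration; this is not the real reconciler-test API:

```go
package main

import "fmt"

// Level marks the requirement level of a step, mirroring the MUST/SHOULD/MAY
// language of the spec. All types here are illustrative only.
type Level int

const (
	MustLevel Level = iota
	ShouldLevel
	MayLevel
)

// Step is a named, independent assertion. Because steps may be filtered
// out by level, no step may rely on a previous step having run.
type Step struct {
	Name  string
	Level Level
	Fn    func() error
}

// Feature is an ordered collection of steps.
type Feature struct {
	Name  string
	Steps []Step
}

func (f *Feature) Must(name string, fn func() error) {
	f.Steps = append(f.Steps, Step{name, MustLevel, fn})
}

func (f *Feature) Should(name string, fn func() error) {
	f.Steps = append(f.Steps, Step{name, ShouldLevel, fn})
}

func (f *Feature) May(name string, fn func() error) {
	f.Steps = append(f.Steps, Step{name, MayLevel, fn})
}

// Run executes only the steps at or below the given level. A step that
// depended on a filtered-out sibling would break here, which is why each
// step has to be self-contained.
func (f *Feature) Run(max Level) []string {
	var ran []string
	for _, s := range f.Steps {
		if s.Level > max {
			continue
		}
		if err := s.Fn(); err != nil {
			fmt.Printf("step %q failed: %v\n", s.Name, err)
			continue
		}
		ran = append(ran, s.Name)
	}
	return ran
}

func main() {
	f := &Feature{Name: "Service"}
	f.Must("Create", func() error { return nil })
	f.Must("Update", func() error { return nil })
	f.Should("Patch", func() error { return nil })

	fmt.Println(f.Run(MustLevel))   // only Must steps survive the filter
	fmt.Println(f.Run(ShouldLevel)) // Must + Should steps run
}
```

Running with a stricter level silently drops the lower-level steps, which is exactly why a step like "Patch" cannot assume "Create" already ran in the same invocation.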
In the prior example I have control over the test sequence, since the steps naturally tie into Go's test lifecycle. This is the point I'm trying to make: this separation makes things more cumbersome. Here is another example, asserting certain filesystem properties in our runtime contracts:

```go
func TestRuntimePaths(gotest *testing.T) {
	t := globals.NewT(gotest)
	var s v1.Service = someTestService()
	t.ServingClient.Create(s)

	// once ready, fetch its runtime environment
	env := http.Get(s.Status.URL + "/runtime")

	for _, path := range runtimev1.MustFilesystemPaths {
		t.Must(path, checkPath(env, path))
	}
	for _, path := range runtimev1.MayFilesystemPaths {
		t.May(path, checkPath(env, path))
	}
}
```

|
bump |
I am trading cumbersome for composability and decoupled tests. I would write it as:

```go
func TestRuntimePaths(t *testing.T) {
	ctx, env := global.Environment()
	var s v1.Service = someTestService()

	// Use the existing helpers to make a KSVC.
	env.Prerequisite(ctx, t, features.ServiceIsCreatedAndReady(s))

	// once the KSVC is ready, fetch its runtime environment
	rt := http.Get(s.Status.URL + "/runtime")

	// RuntimePaths is the feature we are trying to test.
	env.Test(ctx, runtime.RuntimePaths(rt))
	env.Finish()
}

// ...in a runtime features package...

func RuntimePaths(env http.Response) *feature.Feature {
	f := new(feature.Feature)
	s := f.Stable("RuntimePaths")
	for _, path := range runtimev1.MustFilesystemPaths {
		s.Must("have "+path, checkPath(env, path))
	}
	for _, path := range runtimev1.MayFilesystemPaths {
		s.May("have "+path, checkPath(env, path))
	}
	return f
}
```

This might look like more code, but it lets us separate the thing we are trying to assert (the payload of the runtime contract) from how we get that response. So with some edits to the test entry point, we can point the runtime feature at another way of getting that data, say, a

|
The example above leads me to more questions/thoughts: 1) use of Feature in lieu of a list of steps in Prerequisites, 2) generating state from prerequisite steps and using it in assertions, 3) coalescing of Feature Steps.
|
I'm beginning to wonder if Maybe |
@dprotaso since you want to parallelize all tests, why not? I'm not sure I get your question |
Shouldn't test authors control the flow? What if assertions have side-effects on the remote object being tested? This seems relevant to #3 - To guarantee test isolation I would have to change my feature definition either by:
|
An assertion by definition shouldn't have a side effect right? Are you sure that particular assertion shouldn't live in the setup phase? |
Yup - I agree.
Ideally it should - but then I'm forced to split my
|
Can you provide a use case for that? It might help me reason on the issue |
I mentioned it all here, see point 2): #37 (comment)

Another hypothetical:

```go
kingress := new(feature.Feature)
//
kingress.Stable("websocket").Must("receive traffic", AssertWebSocketTraffic())
kingress.Stable("http2").Must("receive traffic", AssertHTTP2Traffic())
```

If each

With regard to side effects, this example doesn't have one - I'm just highlighting a use case and the feature-split implications. |
This is kinda similar to the discussion happening here: #51. My understanding is that to generate state you should use

I see this even more clearly in your sample with

Maybe what might be useful is something like a "feature template": in your case the setup and teardown are the same except (I guess) one step to trigger the state change that stimulates the kingress. But, assuming you create feature

So what we could do is to create a sort of "template" that you can use to define common setup and teardown for more features: the |
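The "feature template" idea can be sketched minimally: shared setup and teardown are defined once, and each variant contributes only the step that differs, producing independent features that can be scheduled separately. All names (`Template`, `Instantiate`, the step strings) are invented for illustration, not the framework's API:

```go
package main

import "fmt"

// Feature holds an ordered list of step names - enough structure to show
// how variants compose, without modeling execution.
type Feature struct {
	Name  string
	Steps []string
}

// Template captures the common shape: the setup and teardown every
// variant of the feature shares.
type Template struct {
	Setup    []string
	Teardown []string
}

// Instantiate produces an independent Feature per variant, so each one
// can run in its own namespace without sharing state with its siblings.
func (tpl Template) Instantiate(name, variantStep string) Feature {
	steps := append([]string{}, tpl.Setup...)
	steps = append(steps, variantStep)
	steps = append(steps, tpl.Teardown...)
	return Feature{Name: name, Steps: steps}
}

func main() {
	tpl := Template{
		Setup:    []string{"install kingress", "wait until ready"},
		Teardown: []string{"delete kingress"},
	}
	ws := tpl.Instantiate("websocket", "assert websocket traffic")
	h2 := tpl.Instantiate("http2", "assert http2 traffic")
	fmt.Println(ws.Steps)
	fmt.Println(h2.Steps)
}
```

Because each instantiation copies the setup/teardown slices, the websocket and http2 variants stay fully isolated from each other.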
The name
Ville wanted a setup step, external to the Test or Feature, to get the env to a state that is required by, but orthogonal to, the test you would like to perform. In Eventing this would be installing a class of Broker; in Serving this would be configuring the ingress type. To test out the usage I thought it would be handy to use
I think
The framework should provide the hooks but make no opinion on which way is best. For some cases having some magic thing in context might be the best, like a set of namespaced clients setup in some Prerequisite phase serving likes to use.
I don't think you can or want to. I would see this written as a series of:

```go
// RuntimeV1Conformance
func TestRuntimeV1Conformance(t *testing.T) {
	ctx, env := global.Environment()
	env.Test(ctx, t, conformance.Feature1())
	env.Test(ctx, t, conformance.Feature2())
	env.Test(ctx, t, conformance.Feature3())
	env.Test(ctx, t, conformance.Feature4())
	<... etc>
	env.Finish()
}
```

The results of this will be collected into a report that is more easily consumable and understood, because the Test phase focused on one aspect of conformance. (That is my hope.)
I would see this as something like:

```go
// RuntimeV1Conformance
func TestRuntimeV1Conformance(t *testing.T) {
	ctx, env := global.Environment()
	s := "ksvc-name"
	env.Precondition(ctx, t, conformance.GivenService(s))
	env.Test(ctx, t, conformance.Feature1(s))
	env.Test(ctx, t, conformance.Feature2(s))
	env.Test(ctx, t, conformance.Feature3(s))
	env.Test(ctx, t, conformance.Feature4(s))
	<... etc>
	env.Finish()
}
```
Big plus one - we need to support this; the runner code is total PoC. I am trying to focus on the following things:
I have run into some struggles so far with the PoC. One is that the timing and isolation of the Steps make it hard to pass results, so the step and feature isolation force you to think about how to compose the test so it has no or few dependencies. This results in each step having a bit more code than you might expect, but it also makes the steps composable in ways you did not originally write them for. As an example:

```go
// TestBrokerAsMiddleware
func TestBrokerAsMiddleware(t *testing.T) {
	t.Parallel()
	ctx, env := global.Environment(
		knative.WithKnativeNamespace(system.Namespace()),
		knative.WithLoggingConfig,
		knative.WithTracingConfig,
		k8s.WithEventListener,
	)

	// Install and wait for a Ready Broker.
	env.Prerequisite(ctx, t, features.BrokerGoesReady("default", "MTChannelBroker"))

	// Test that a Broker can act as middleware.
	env.Test(ctx, t, features.BrokerAsMiddleware("default"))

	env.Finish()
}
```

Here, I need a Broker to be ready, but the focus of the test is not a ready Broker. I wanted to write a feature that assumes a ready Broker of a given name, through which I can pass events. This means I can vendor the features, and downstream repos can leverage this same test:

```go
// <... In eventing-rabbitmq ...>

// TestBrokerAsMiddleware
func TestBrokerAsMiddleware(t *testing.T) {
	t.Parallel()
	ctx, env := global.Environment(
		knative.WithKnativeNamespace(system.Namespace()),
		knative.WithLoggingConfig,
		knative.WithTracingConfig,
		k8s.WithEventListener,
	)

	// Create a RabbitmqCluster in the env, the CO that creates the underlying RabbitMQ Broker (not knative).
	env.Prerequisite(ctx, t, rabbitfeatures.RabbitMQBrokerIsCreated())

	// Install and wait for a Ready Broker.
	env.Prerequisite(ctx, t, features.BrokerGoesReady("default", "RabbitMQBroker"))

	// Test that a Broker can act as middleware.
	env.Test(ctx, t, features.BrokerAsMiddleware("default"))

	env.Finish()
}
```

So the downstream repo has opted to include this test, but they only have to add the test entry point, not the features. The reasoning is that the downstream likely needs to do some additional setup, as above. |
Then I'm confused by the feature state methods (

A template approach would work - but I was pointing out in 2) that things are neither clearly defined nor consistent. |
I would need to know more about KIngress to answer in full, but I have been assuming a Feature tests a contained set of functionality, and variations would be passed down to it. I think you can compose the test you want, though you will have to do a bit more work when special-casing a particular feature of an implementation that is not generally testable by all implementations. |
1) Use of Feature in lieu of a list of steps in Prerequisites

Whatever goes here shouldn't have state (alpha, beta, stable) or level (must, should, may) decorations, since those could be skipped via environment flags etc. I.e. what if I were only testing alpha features of a broker - the broker-ready feature passed to a

2) Generating state from prerequisite steps & using them in assertions

The pointer doesn't change my interpretation -

3) Coalescing of Feature Steps

We do this in networking currently and it is good for discoverability. A side effect is that diffs become very clear: https://github.com/knative/networking/pull/277/files

Another example

I see only one service in this example. Also, we want to group relevant conformance features to be run together, to make it easier for downstream folks to run the correct set of tests. Right now in networking this is done using an exported
Can you elaborate? |
It's conformance, so there are no variations on the features being tested. The only thing that differs is the ingress installation and maybe some global test properties (i.e. does DNS work, which ingress class, etc.). |
I have made several issues to continue the discussion in the forks we have made:

- for 1): Framework should report Prerequisite features with Asserts #64
- for 2): Framework needs to help test authors understand how to pass state between steps #66
- for 3): Framework needs a way to group Features into Sets #65
I think we need to define this a bit more. The thinking is that the runner can get metadata out of the feature and generate data around the run of the test; conformance is somehow different, but I think we are waiting for the conformance group to ask for the data they need.
Tagging @nak3 @ZhiminXiang @tcnghia - can someone from networking start taking a look at the current framework and the open issues Scott created? I don't want to be the sole person representing the group. It may be worth doing a small POC with one or two KIngress features. |
This issue is stale because it has been open for 90 days with no activity. |
So none of the networking conformance tests currently have levels. Only serving has some.
I think for serving a runtime conformance test will be structured as
A problem I see is the info from step 2 needs to go to step 3. With the framework as is this is going to be cumbersome - especially if each levelled assertion has unique logic.
Originally posted by @dprotaso in #30 (comment)