Lambda Integration Tests - Investigate Stream return type #2530

Open
tippmar-nr opened this issue Jun 6, 2024 · 2 comments

@tippmar-nr (Member)

Some of our Lambda integration tests have function handlers that return a Stream rather than the usual response object (see ApplicationLoadBalancerRequestHandlerReturnsStream and other methods in LambdaSelfExecutingAssembly\Program.cs).
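
For context, here's a minimal sketch of a handler in that shape. This is not the actual Program.cs code; everything other than the ApplicationLoadBalancerRequestHandlerReturnsStream name and the ALB event types is an assumption:

    // Hypothetical sketch only -- the real handler lives in LambdaSelfExecutingAssembly\Program.cs.
    using System.IO;
    using System.Text.Json;
    using Amazon.Lambda.ApplicationLoadBalancerEvents;
    using Amazon.Lambda.Core;

    public static class HandlerSketch
    {
        public static Stream ApplicationLoadBalancerRequestHandlerReturnsStream(
            ApplicationLoadBalancerRequest request, ILambdaContext context)
        {
            var response = new ApplicationLoadBalancerResponse
            {
                StatusCode = 200,
                Body = "Hello from Lambda"
            };

            // Hand back the serialized response as a raw Stream instead of the response object.
            return new MemoryStream(JsonSerializer.SerializeToUtf8Bytes(response));
        }
    }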

The integration tests pass as expected, but the Lambda test tool fixture logs errors suggesting we might be doing something wrong in how we return the stream response:

    2024-06-06T20:01:01.231Z	000000000001	fail	Amazon.Lambda.Serialization.SystemTextJson.JsonSerializerException: Error converting the response object of type System.IO.Stream from the Lambda function to JSON: Timeouts are not supported on this stream.
     ---> System.InvalidOperationException: Timeouts are not supported on this stream.
       at System.IO.Stream.get_ReadTimeout()
       at System.Text.Json.Serialization.Metadata.JsonPropertyInfo`1.GetMemberAndWriteJson(Object obj, WriteStack& state, Utf8JsonWriter writer)
       at System.Text.Json.Serialization.Converters.ObjectDefaultConverter`1.OnTryWrite(Utf8JsonWriter writer, T value, JsonSerializerOptions options, WriteStack& state)
       at System.Text.Json.Serialization.JsonConverter`1.TryWrite(Utf8JsonWriter writer, T& value, JsonSerializerOptions options, WriteStack& state)
       at System.Text.Json.Serialization.JsonConverter`1.WriteCore(Utf8JsonWriter writer, T& value, JsonSerializerOptions options, WriteStack& state)
       at System.Text.Json.Serialization.Metadata.JsonTypeInfo`1.Serialize(Utf8JsonWriter writer, T& rootValue, Object rootValueBoxed)
       at System.Text.Json.JsonSerializer.Serialize[TValue](Utf8JsonWriter writer, TValue value, JsonSerializerOptions options)
       at Amazon.Lambda.Serialization.SystemTextJson.AbstractLambdaJsonSerializer.Serialize[T](T response, Stream responseStream)
       --- End of inner exception stack trace ---
       at Amazon.Lambda.Serialization.SystemTextJson.AbstractLambdaJsonSerializer.Serialize[T](T response, Stream responseStream)
       at Amazon.Lambda.RuntimeSupport.HandlerWrapper.<>c__DisplayClass44_0`2.<GetHandlerWrapper>b__0(InvocationRequest invocation)
       at Amazon.Lambda.RuntimeSupport.LambdaBootstrap.InvokeOnceAsync(CancellationToken cancellationToken)

We should investigate whether our implementation that returns a Stream is the correct way to do it in Lambda land.
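
One thing to check: the exception above comes from the test tool's JSON serializer reflecting over the returned Stream object itself (Stream.ReadTimeout throws on streams that don't support timeouts). In the class-library programming model, a handler whose input and output are both System.IO.Stream is documented to bypass the JSON serializer and copy the payload bytes straight through; whether the RuntimeSupport wiring in our self-executing assembly gets the same treatment is part of what we'd need to confirm. A rough sketch of that documented pattern (names here are assumptions, not our current implementation):

    // Hedged sketch of the Stream-in/Stream-out pattern; not our current handler.
    using System.IO;
    using Amazon.Lambda.Core;

    public static class StreamPassthroughSketch
    {
        public static Stream FunctionHandler(Stream request, ILambdaContext context)
        {
            // With Stream on both sides, the runtime copies the bytes through
            // rather than asking a serializer to serialize the Stream object.
            var output = new MemoryStream();
            request.CopyTo(output);
            output.Position = 0;
            return output;
        }
    }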

@nrcventura (Member)

The main purpose of this test scenario is to trigger a case that forces our instrumentation to make no assumptions about the return type used for these web request lambdas. The Stream type was chosen arbitrarily for these tests.
