Fix panic when agentpf.client creates a Tunnel #3775

Merged · 2 commits · Jan 20, 2025
4 changes: 2 additions & 2 deletions .github/workflows/release.yaml
```diff
@@ -148,9 +148,9 @@ jobs:
       - uses: actions/checkout@v4
-        if: needs.publish-release.semver_check.outputs.make_latest
+        if: steps.semver_check.outputs.make_latest == true
       - name: Update Homebrew
-        if: needs.publish-release.semver_check.outputs.make_latest
+        if: steps.semver_check.outputs.make_latest == true
        run: |
          v=${{ github.ref_name }}
          packaging/homebrew-package.sh "${v#v}" tel2oss "${{ vars.GH_BOT_USER }}" "${{ vars.GH_BOT_EMAIL }}" "${{ secrets.HOMEBREW_TAP_TOKEN }}"
```
10 changes: 10 additions & 0 deletions CHANGELOG.yml
```diff
@@ -33,6 +33,16 @@ docDescription: >-
   environments, access to instantaneous feedback loops, and highly
   customizable development environments.
 items:
+  - version: 2.21.2
+    date: (TBD)
+    notes:
+      - type: bugfix
+        title: Fix panic when agentpf.client creates a Tunnel
+        body: >-
+          A race could occur when several requests were made to `agentpf.client.Tunnel` on a client that had
+          errored while creating its port-forward to the agent. The implementation could handle one such request
+          but not several, resulting in a panic when multiple simultaneous requests were made to the same client
+          within a very short time period.
   - version: 2.21.1
     date: 2024-12-17
     notes:
```
7 changes: 7 additions & 0 deletions docs/release-notes.md
```diff
@@ -1,6 +1,13 @@
 
 [comment]: # (Code generated by relnotesgen. DO NOT EDIT.)
 # <img src="images/logo.png" height="64px"/> Telepresence Release Notes
+## Version 2.21.2
+## <div style="display:flex;"><img src="images/bugfix.png" alt="bugfix" style="width:30px;height:fit-content;"/><div style="display:flex;margin-left:7px;">Fix panic when agentpf.client creates a Tunnel</div></div>
+<div style="margin-left: 15px">
+
+A race could occur when several requests were made to `agentpf.client.Tunnel` on a client that had errored while creating its port-forward to the agent. The implementation could handle one such request but not several, resulting in a panic when multiple simultaneous requests were made to the same client within a very short time period.
+</div>
+
 ## Version 2.21.1 <span style="font-size: 16px;">(December 17)</span>
 ## <div style="display:flex;"><img src="images/bugfix.png" alt="bugfix" style="width:30px;height:fit-content;"/><div style="display:flex;margin-left:7px;">[Allow ingest of serverless deployments without specifying an inject-container-ports annotation](https://github.com/telepresenceio/telepresence/issues/3741)</div></div>
 <div style="margin-left: 15px">
```
5 changes: 5 additions & 0 deletions docs/release-notes.mdx
```diff
@@ -7,6 +7,11 @@ import { Note, Title, Body } from '@site/src/components/ReleaseNotes'
 [comment]: # (Code generated by relnotesgen. DO NOT EDIT.)
 
 # Telepresence Release Notes
+## Version 2.21.2
+<Note>
+    <Title type="bugfix">Fix panic when agentpf.client creates a Tunnel</Title>
+    <Body>A race could occur when several requests were made to `agentpf.client.Tunnel` on a client that had errored while creating its port-forward to the agent. The implementation could handle one such request but not several, resulting in a panic when multiple simultaneous requests were made to the same client within a very short time period.</Body>
+</Note>
 ## Version 2.21.1 <span style={{fontSize:'16px'}}>(December 17)</span>
 <Note>
     <Title type="bugfix" docs="https://github.com/telepresenceio/telepresence/issues/3741">Allow ingest of serverless deployments without specifying an inject-container-ports annotation</Title>
```
27 changes: 18 additions & 9 deletions pkg/client/agentpf/clients.go
```diff
@@ -53,6 +53,8 @@ func (ac *client) Tunnel(ctx context.Context, opts ...grpc.CallOption) (tunnel.C
 	select {
 	case err, ok := <-ac.ready:
 		if ok {
+			// Put error back on channel in case this Tunnel is used again before it's deleted.
+			ac.ready <- err
 			return nil, err
 		}
 		// ready channel is closed. We are ready to go.
@@ -74,14 +76,24 @@
 }
 
 func (ac *client) connect(ctx context.Context, deleteMe func()) {
-	defer close(ac.ready)
 	dialCtx, dialCancel := context.WithTimeout(ctx, 5*time.Second)
 	defer dialCancel()
 
-	conn, cli, _, err := k8sclient.ConnectToAgent(dialCtx, ac.info.PodName, ac.info.Namespace, uint16(ac.info.ApiPort))
+	var err error
+	defer func() {
+		if err == nil {
+			close(ac.ready)
+		} else {
+			deleteMe()
+			ac.ready <- err
+		}
+	}()
+
+	var conn *grpc.ClientConn
+	var cli agent.AgentClient
+
+	conn, cli, _, err = k8sclient.ConnectToAgent(dialCtx, ac.info.PodName, ac.info.Namespace, uint16(ac.info.ApiPort))
 	if err != nil {
-		deleteMe()
-		ac.ready <- err
 		return
 	}
 
@@ -94,10 +106,7 @@ func (ac *client) connect(ctx context.Context, deleteMe func()) {
 	intercepted := ac.info.Intercepted
 	ac.Unlock()
 	if intercepted {
-		if err = ac.startDialWatcherReady(ctx); err != nil {
-			deleteMe()
-			ac.ready <- err
-		}
+		err = ac.startDialWatcherReady(ctx)
 	}
 }
@@ -495,7 +504,7 @@ func (s *clients) updateClients(ctx context.Context, ais []*manager.AgentPodInfo
 		return oldValue, false
 	}
 	ac := &client{
-		ready:   make(chan error, 1),
+		ready:   make(chan error, 1),
 		session: s.session,
 		info:    ai,
 	}
```