
Compare commits


28 Commits

Author SHA1 Message Date
Kevin Pollet
4613ddd757 Prepare release v3.1.6 2024-10-09 15:54:05 +02:00
kevinpollet
5d5dd9dd30 Merge branch v2.11 into v3.1 2024-10-09 15:19:14 +02:00
Kevin Pollet
934ca5fd22 Prepare release v2.11.12 2024-10-09 14:32:04 +02:00
Michel Heusschen
f16d14cfa6 Reuse compression writers 2024-10-09 14:14:03 +02:00
mmatur
4625bdf5cb Merge current v2.11 into v3.1 2024-10-08 17:54:23 +02:00
Kevin Pollet
7b477f762a Upgrade to node 22.9 and yarn lock to fix vulnerabilities
Co-authored-by: Romain <rtribotte@users.noreply.github.com>
2024-10-08 17:52:04 +02:00
Dylan Rodgers
157cf75e38 Update business callout in docs 2024-10-08 12:06:04 +02:00
Jesper Noordsij
ab35b3266a Ensure shellcheck failure exit code is reflected in GH job result 2024-10-08 11:58:05 +02:00
Michel Heusschen
d339bfc8d2 Use correct default weight in Accept-Encoding 2024-10-08 11:48:04 +02:00
Dmitry Romashov
0a6b8780f0 Adopt a layout for the large amount of entrypoint port numbers 2024-10-08 10:44:04 +02:00
Kevin Pollet
fc563d3f6e Fix the resolved TAG_NAME for commit in multiple tags 2024-10-07 09:32:05 +02:00
ttys3
a762cce430 Close wasm middleware to prevent memory leak 2024-10-04 16:36:04 +02:00
Kevin Pollet
306d3f277d Bump github.com/klauspost/compress to dbd6c381492a 2024-10-04 10:48:04 +02:00
Ludovic Fernandez
6f7649fccc Bump golangci-lint to 1.61.0 2024-10-04 09:38:04 +02:00
Matt Brown
e8ab3af74d Clarify only header fields may be redacted in access-logs 2024-10-03 16:28:04 +02:00
Kevin Pollet
a2ab3e534d Prepare release v3.1.5 2024-10-02 14:42:05 +02:00
kevinpollet
8cfa68a8e1 Merge branch v2.11 into v3.1 2024-10-02 11:25:30 +02:00
Kevin Pollet
518caa79f9 Prepare release v2.11.11 2024-10-02 11:10:04 +02:00
romain
b641d5cf2a Merge current v2.11 into v3.1 2024-09-30 14:59:38 +02:00
Mathieu
4d6cb6af03 Ensure defaultGeneratedCert.main as Subject's CN 2024-09-30 12:10:05 +02:00
Kevin Pollet
9eb804a689 Bump github.com/klauspost/compress to 8e14b1b5a913 2024-09-30 11:56:04 +02:00
Jesper Noordsij
c02b72ca51 Disable IngressClass lookup when disableClusterScopeResources is enabled 2024-09-27 16:24:04 +02:00
Rémi BUISSON
2bb712135d Specify default format value for access log 2024-09-27 15:34:04 +02:00
Michel Heusschen
14e5d4b4b3 Remove unused boot files from webui 2024-09-27 15:22:04 +02:00
lyrandy
e485edbe9f Update API documentation to mention pagination 2024-09-27 15:00:06 +02:00
Romain
61bb3ab991 Rework condition to not log on timeout 2024-09-27 11:34:05 +02:00
Romain
e62f8af23b Rework condition to not log on timeout 2024-09-27 11:20:04 +02:00
Romain
a42d396ed2 Clean connection headers for forward auth request only
Co-authored-by: Kevin Pollet <pollet.kevin@gmail.com>
2024-09-27 11:18:05 +02:00
38 changed files with 1736 additions and 1614 deletions

View File

@@ -7,7 +7,7 @@ on:
env:
GO_VERSION: '1.23'
GOLANGCI_LINT_VERSION: v1.60.3
GOLANGCI_LINT_VERSION: v1.61.0
MISSPELL_VERSION: v0.6.0
jobs:

View File

@@ -25,7 +25,7 @@ global_job_config:
- export "PATH=${GOPATH}/bin:${PATH}"
- mkdir -vp "${SEMAPHORE_GIT_DIR}" "${GOPATH}/bin"
- export GOPROXY=https://proxy.golang.org,direct
- curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b "${GOPATH}/bin" v1.60.3
- curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b "${GOPATH}/bin" v1.61.0
- curl -sSfL https://gist.githubusercontent.com/traefiker/6d7ac019c11d011e4f131bb2cca8900e/raw/goreleaser.sh | bash -s -- -b "${GOPATH}/bin"
- checkout
- cache restore traefik-$(checksum go.sum)

View File

@@ -1,3 +1,52 @@
## [v3.1.6](https://github.com/traefik/traefik/tree/v3.1.6) (2024-10-09)
[All Commits](https://github.com/traefik/traefik/compare/v3.1.5...v3.1.6)
**Bug fixes:**
- **[middleware]** Reuse compression writers ([#11168](https://github.com/traefik/traefik/pull/11168) by [michelheusschen](https://github.com/michelheusschen))
- **[middleware]** Use correct default weight in Accept-Encoding ([#11084](https://github.com/traefik/traefik/pull/11084) by [michelheusschen](https://github.com/michelheusschen))
- **[plugins]** Close wasm middleware to prevent memory leak ([#11151](https://github.com/traefik/traefik/pull/11151) by [ttys3](https://github.com/ttys3))
**Misc:**
- Merge branch v2.11 into v3.1 ([#11179](https://github.com/traefik/traefik/pull/11179) by [kevinpollet](https://github.com/kevinpollet))
- Merge branch v2.11 into v3.1 ([#11174](https://github.com/traefik/traefik/pull/11174) by [mmatur](https://github.com/mmatur))
## [v2.11.12](https://github.com/traefik/traefik/tree/v2.11.12) (2024-10-09)
[All Commits](https://github.com/traefik/traefik/compare/v2.11.11...v2.11.12)
**Bug fixes:**
- **[middleware]** Bump github.com/klauspost/compress to dbd6c381492a ([#11162](https://github.com/traefik/traefik/pull/11162) by [kevinpollet](https://github.com/kevinpollet))
- **[webui]** Upgrade to node 22.9 and yarn lock to fix vulnerabilities ([#11173](https://github.com/traefik/traefik/pull/11173) by [kevinpollet](https://github.com/kevinpollet))
- **[webui]** Adopt a layout for the large amount of entrypoint port numbers ([#11157](https://github.com/traefik/traefik/pull/11157) by [framebassman](https://github.com/framebassman))
**Documentation:**
- **[accesslogs]** Clarify that only header fields may be redacted in access-logs ([#11139](https://github.com/traefik/traefik/pull/11139) by [mattbnz](https://github.com/mattbnz))
- Update business callout ([#11172](https://github.com/traefik/traefik/pull/11172) by [tomatokoolaid](https://github.com/tomatokoolaid))
## [v3.1.5](https://github.com/traefik/traefik/tree/v3.1.5) (2024-10-02)
[All Commits](https://github.com/traefik/traefik/compare/v3.1.4...v3.1.5)
**Bug fixes:**
- **[k8s/ingress,k8s]** Disable IngressClass lookup when disableClusterScopeResources is enabled ([#11111](https://github.com/traefik/traefik/pull/11111) by [jnoordsij](https://github.com/jnoordsij))
- **[server]** Rework condition to not log on timeout ([#11132](https://github.com/traefik/traefik/pull/11132) by [rtribotte](https://github.com/rtribotte))
**Misc:**
- Merge branch v2.11 into v3.1 ([#11149](https://github.com/traefik/traefik/pull/11149) by [kevinpollet](https://github.com/kevinpollet))
- Merge branch v2.11 into v3.1 ([#11142](https://github.com/traefik/traefik/pull/11142) by [rtribotte](https://github.com/rtribotte))
## [v2.11.11](https://github.com/traefik/traefik/tree/v2.11.11) (2024-10-02)
[All Commits](https://github.com/traefik/traefik/compare/v2.11.10...v2.11.11)
**Bug fixes:**
- **[acme]** Ensure defaultGeneratedCert.main as Subject's CN ([#10581](https://github.com/traefik/traefik/pull/10581) by [Lamatte](https://github.com/Lamatte))
- **[middleware,authentication]** Clean connection headers for forward auth request only ([#11095](https://github.com/traefik/traefik/pull/11095) by [rtribotte](https://github.com/rtribotte))
- **[middleware]** Bump github.com/klauspost/compress to 8e14b1b5a913 ([#11141](https://github.com/traefik/traefik/pull/11141) by [kevinpollet](https://github.com/kevinpollet))
- **[server]** Rework condition to not log on timeout ([#11133](https://github.com/traefik/traefik/pull/11133) by [rtribotte](https://github.com/rtribotte))
- **[webui]** Remove unused boot files from webui ([#11109](https://github.com/traefik/traefik/pull/11109) by [michelheusschen](https://github.com/michelheusschen))
**Documentation:**
- **[accesslogs]** Specify default format value for access log ([#11130](https://github.com/traefik/traefik/pull/11130) by [darkweaver87](https://github.com/darkweaver87))
- **[api]** Update API documentation to mention pagination ([#11115](https://github.com/traefik/traefik/pull/11115) by [lyrandy](https://github.com/lyrandy))
## [v3.1.4](https://github.com/traefik/traefik/tree/v3.1.4) (2024-09-19)
[All Commits](https://github.com/traefik/traefik/compare/v3.1.3...v3.1.4)

View File

@@ -1,13 +1,10 @@
SRCS = $(shell git ls-files '*.go' | grep -v '^vendor/')
TAG_NAME := $(shell git tag -l --contains HEAD)
TAG_NAME := $(shell git describe --abbrev=0 --tags --exact-match)
SHA := $(shell git rev-parse HEAD)
VERSION_GIT := $(if $(TAG_NAME),$(TAG_NAME),$(SHA))
VERSION := $(if $(VERSION),$(VERSION),$(VERSION_GIT))
GIT_BRANCH := $(subst heads/,,$(shell git rev-parse --abbrev-ref HEAD 2>/dev/null))
REPONAME := $(shell echo $(REPO) | tr '[:upper:]' '[:lower:]')
BIN_NAME := traefik
CODENAME ?= cheddar

View File

@@ -5,6 +5,6 @@
Add API Gateway or API Management capabilities seamlessly to your existing Traefik deployments.
No rip and replace. No learning curve.
- [Explore our API Gateway](https://traefik.io/traefik-hub-api-gateway/)
- [Explore our API Gateway](https://traefik.io/traefik-hub-api-gateway/) ([Watch the Demo Video](https://info.traefik.io/watch-traefik-api-gw-demo?cta=doc))
- [Explore our API Management](https://traefik.io/traefik-hub/)
- [Get 24/7/365 Commercial Support for Traefik OSS](https://info.traefik.io/request-commercial-support)

View File

@@ -640,3 +640,15 @@ Increasing the `readTimeout` value could be the solution notably if you are deal
- TCP: `Error while handling TCP connection: readfrom tcp X.X.X.X:X->X.X.X.X:X: read tcp X.X.X.X:X->X.X.X.X:X: i/o timeout`
- HTTP: `'499 Client Closed Request' caused by: context canceled`
- HTTP: `ReverseProxy read error during body copy: read tcp X.X.X.X:X->X.X.X.X:X: use of closed network connection`
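The `readTimeout` mentioned above is configured per entrypoint. A minimal sketch (YAML static configuration; the entrypoint name `web` and the `120s` value are illustrative, not part of this changeset):

```yaml
# Allow more time to read the full request before Traefik
# closes the connection (0 disables the timeout).
entryPoints:
  web:
    address: ":80"
    transport:
      respondingTimeouts:
        readTimeout: 120s
```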
## v2.11.3
### Connection headers
In `v2.11.3`, the handling of the request's Connection header directives changed to prevent abuse.
Before, Traefik removed any header listed in the Connection header just before forwarding the request to the backends.
Now, Traefik removes the headers listed in the Connection header as soon as the request is handled.
As a consequence, middlewares do not have access to those Connection headers,
and a new option has been introduced to specify which ones may pass through the middleware chain before being removed: `<entrypoint>.forwardedHeaders.connection`.
Please check out the [entrypoint forwarded headers connection option configuration](../routing/entrypoints.md#forwarded-headers) documentation.
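A minimal sketch of that option (YAML static configuration; the entrypoint name `web` and the header names are illustrative):

```yaml
# Let these Connection-listed headers reach the middleware chain
# before Traefik strips them from the forwarded request.
entryPoints:
  web:
    address: ":80"
    forwardedHeaders:
      connection:
        - Upgrade
        - Keep-Alive
```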

View File

@@ -67,6 +67,8 @@ accessLog:
### `format`
_Optional, Default="common"_
By default, logs are written using the Common Log Format (CLF).
To write logs in JSON, use `json` in the `format` option.
If the given format is unsupported, the default (CLF) is used instead.
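For reference, a minimal sketch of the option described above (YAML static configuration; the log file path is illustrative):

```yaml
# Write access logs as JSON instead of the default Common Log Format.
accessLog:
  filePath: /var/log/traefik/access.log
  format: json
```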
@@ -156,7 +158,8 @@ Each field can be set to:
- `keep` to keep the value
- `drop` to drop the value
- `redact` to replace the value with "redacted"
Header fields may also be set to `redact` to replace the value with "REDACTED".
The `defaultMode` for `fields.names` is `keep`.
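A minimal sketch combining these modes (YAML static configuration; the header name is illustrative):

```yaml
# Keep all request headers in the access log, but replace the value
# of Authorization with "REDACTED".
accessLog:
  fields:
    headers:
      defaultMode: keep
      names:
        Authorization: redact
```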

View File

@@ -136,6 +136,15 @@ api:
All the following endpoints must be accessed with a `GET` HTTP request.
!!! info "Pagination"
By default, up to 100 results are returned per page; whether a next page exists can be checked using the `X-Next-Page` HTTP header.
To control pagination, use the `page` and `per_page` query parameters.
```bash
curl "https://traefik.example.com:8080/api/http/routers?page=2&per_page=20"
```
| Path | Description |
|--------------------------------|---------------------------------------------------------------------------------------------|
| `/api/http/routers` | Lists all the HTTP routers information. |

go.mod
View File

@@ -30,10 +30,10 @@ require (
github.com/hashicorp/go-retryablehttp v0.7.7
github.com/hashicorp/go-version v1.6.0
github.com/hashicorp/nomad/api v0.0.0-20240122103822-8a4bd61caf74 // No tag on the repo.
github.com/http-wasm/http-wasm-host-go v0.6.0
github.com/http-wasm/http-wasm-host-go v0.7.0
github.com/influxdata/influxdb-client-go/v2 v2.7.0
github.com/influxdata/influxdb1-client v0.0.0-20200827194710-b269163b24ab // No tag on the repo.
github.com/klauspost/compress v1.17.9
github.com/klauspost/compress v1.17.11-0.20241004063537-dbd6c381492a // Required to have the content-type fix: https://github.com/klauspost/compress/pull/1013
github.com/kvtools/consul v1.0.2
github.com/kvtools/etcdv3 v1.0.2
github.com/kvtools/redis v1.1.0
@@ -60,7 +60,7 @@ require (
github.com/tailscale/tscert v0.0.0-20230806124524-28a91b69a046 // No tag on the repo.
github.com/testcontainers/testcontainers-go v0.32.0
github.com/testcontainers/testcontainers-go/modules/k3s v0.32.0
github.com/tetratelabs/wazero v1.7.2
github.com/tetratelabs/wazero v1.8.0
github.com/tidwall/gjson v1.17.0
github.com/traefik/grpc-web v0.16.0
github.com/traefik/paerser v0.2.1
@@ -369,6 +369,3 @@ replace (
// ambiguous import: found package github.com/tencentcloud/tencentcloud-sdk-go/tencentcloud/common/http in multiple modules
// tencentcloud uses monorepo with multimodule but the go.mod files are incomplete.
exclude github.com/tencentcloud/tencentcloud-sdk-go v3.0.83+incompatible
// Replace to handle new wasmexport in official go and wazergo for http calls.
replace github.com/http-wasm/http-wasm-host-go => github.com/traefik/http-wasm-host-go v0.0.0-20240618100324-3c53dcaa1a70

go.sum
View File

@@ -549,6 +549,8 @@ github.com/hashicorp/serf v0.8.2/go.mod h1:6hOLApaqBFA1NXqRQAsxw9QxuDEvNxSQRwA/J
github.com/hashicorp/serf v0.10.1 h1:Z1H2J60yRKvfDYAOZLd2MU0ND4AH/WDz7xYHDWQsIPY=
github.com/hashicorp/serf v0.10.1/go.mod h1:yL2t6BqATOLGc5HF7qbFkTfXoPIY0WZdWHfEvMqbG+4=
github.com/hpcloud/tail v1.0.0/go.mod h1:ab1qPbhIpdTxEkNHXyeSf5vhxWSCs/tWer42PpOxQnU=
github.com/http-wasm/http-wasm-host-go v0.7.0 h1:+1KrRyOO6tWiDB24QrtSYyDmzFLBBs3jioKaUT0mq1c=
github.com/http-wasm/http-wasm-host-go v0.7.0/go.mod h1:adXKcLmL7yuavH/e0kBAp7b3TgAHTo/enCduyN5bXGM=
github.com/huandu/xstrings v1.3.3/go.mod h1:y5/lhBue+AyNmUVz9RLU9xbLR0o4KIIExikq4ovT0aE=
github.com/huandu/xstrings v1.5.0 h1:2ag3IFq9ZDANvthTwTiqSSZLjDc+BedvHPAp5tJy2TI=
github.com/huandu/xstrings v1.5.0/go.mod h1:y5/lhBue+AyNmUVz9RLU9xbLR0o4KIIExikq4ovT0aE=
@@ -595,8 +597,8 @@ github.com/kisielk/errcheck v1.1.0/go.mod h1:EZBBE59ingxPouuu3KfxchcWSUPOHkagtvW
github.com/kisielk/errcheck v1.5.0/go.mod h1:pFxgyoBC7bSaBwPgfKdkLd5X25qrDl4LWUI2bnpBCr8=
github.com/kisielk/gotool v1.0.0/go.mod h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+oQHNcck=
github.com/klauspost/compress v1.10.3/go.mod h1:aoV0uJVorq1K+umq18yTdKaF57EivdYsUV+/s2qKfXs=
github.com/klauspost/compress v1.17.9 h1:6KIumPrER1LHsvBVuDa0r5xaG0Es51mhhB9BQB2qeMA=
github.com/klauspost/compress v1.17.9/go.mod h1:Di0epgTjJY877eYKx5yC51cX2A2Vl2ibi7bDH9ttBbw=
github.com/klauspost/compress v1.17.11-0.20241004063537-dbd6c381492a h1:cwHOqPB4H4iQq8177kf2SxpjNbcjJ2m3lNwKIe28Hqg=
github.com/klauspost/compress v1.17.11-0.20241004063537-dbd6c381492a/go.mod h1:pMDklpSncoRMuLFrf1W9Ss9KT+0rH90U12bZKk7uwG0=
github.com/klauspost/cpuid/v2 v2.0.9/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg=
github.com/klauspost/cpuid/v2 v2.2.5 h1:0E5MSMDEoAulmXNFquVs//DdoomxaoTY1kUhbc/qbZg=
github.com/klauspost/cpuid/v2 v2.2.5/go.mod h1:Lcz8mBdAVJIBVzewtcLocK12l3Y+JytZYpaMropDUws=
@@ -993,8 +995,8 @@ github.com/testcontainers/testcontainers-go v0.32.0 h1:ug1aK08L3gCHdhknlTTwWjPHP
github.com/testcontainers/testcontainers-go v0.32.0/go.mod h1:CRHrzHLQhlXUsa5gXjTOfqIEJcrK5+xMDmBr/WMI88E=
github.com/testcontainers/testcontainers-go/modules/k3s v0.32.0 h1:Z3DTMveNUqeGJZ+CXZhpvI7OF1BS71Ywi3SwoXLZ4Lc=
github.com/testcontainers/testcontainers-go/modules/k3s v0.32.0/go.mod h1:SYp1WtvNc3n/cg5atO6LvaOd2aqkQYMSDCcWPOUdaZg=
github.com/tetratelabs/wazero v1.7.2 h1:1+z5nXJNwMLPAWaTePFi49SSTL0IMx/i3Fg8Yc25GDc=
github.com/tetratelabs/wazero v1.7.2/go.mod h1:ytl6Zuh20R/eROuyDaGPkp82O9C/DJfXAwJfQ3X6/7Y=
github.com/tetratelabs/wazero v1.8.0 h1:iEKu0d4c2Pd+QSRieYbnQC9yiFlMS9D+Jr0LsRmcF4g=
github.com/tetratelabs/wazero v1.8.0/go.mod h1:yAI0XTsMBhREkM/YDAK/zNou3GoiAce1P6+rp/wQhjs=
github.com/tidwall/gjson v1.17.0 h1:/Jocvlh98kcTfpN2+JzGQWQcqrPQwDrVEMApx/M5ZwM=
github.com/tidwall/gjson v1.17.0/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/match v1.1.1 h1:+Ho715JplO36QYgwN9PGYNhgZvoUSc9X2c80KVTi+GA=
@@ -1009,8 +1011,6 @@ github.com/tklauser/numcpus v0.6.1/go.mod h1:1XfjsgE2zo8GVw7POkMbHENHzVg3GzmoZ9f
github.com/tmc/grpc-websocket-proxy v0.0.0-20190109142713-0ad062ec5ee5/go.mod h1:ncp9v5uamzpCO7NfCPTXjqaC+bZgJeR0sMTm6dMHP7U=
github.com/traefik/grpc-web v0.16.0 h1:eeUWZaFg6ZU0I9dWOYE2D5qkNzRBmXzzuRlxdltascY=
github.com/traefik/grpc-web v0.16.0/go.mod h1:2ttniSv7pTgBWIU2HZLokxRfFX3SA60c/DTmQQgVml4=
github.com/traefik/http-wasm-host-go v0.0.0-20240618100324-3c53dcaa1a70 h1:I+oBnV0orhmasb87yaX54tOAfqrV9+yKoQ1Cum5mq8w=
github.com/traefik/http-wasm-host-go v0.0.0-20240618100324-3c53dcaa1a70/go.mod h1:zQB3w+df4hryDEqBorGyA1DwPJ86LfKIASNLFuj6CuI=
github.com/traefik/paerser v0.2.1 h1:LFgeak1NmjEHF53c9ENdXdL1UMkF/lD5t+7Evsz4hH4=
github.com/traefik/paerser v0.2.1/go.mod h1:7BBDd4FANoVgaTZG+yh26jI6CA2nds7D/4VTEdIsh24=
github.com/traefik/yaegi v0.16.1 h1:f1De3DVJqIDKmnasUF6MwmWv1dSEEat0wcpXhD2On3E=

View File

@@ -13,22 +13,21 @@ const (
upgradeHeader = "Upgrade"
)
// Remover removes hop-by-hop headers listed in the "Connection" header.
// RemoveConnectionHeaders removes hop-by-hop headers listed in the "Connection" header.
// See RFC 7230, section 6.1.
func Remover(next http.Handler) http.HandlerFunc {
return func(rw http.ResponseWriter, req *http.Request) {
next.ServeHTTP(rw, Remove(req))
}
}
// Remove removes hop-by-hop header on the request.
func Remove(req *http.Request) *http.Request {
func RemoveConnectionHeaders(req *http.Request) {
var reqUpType string
if httpguts.HeaderValuesContainsToken(req.Header[connectionHeader], upgradeHeader) {
reqUpType = req.Header.Get(upgradeHeader)
}
removeConnectionHeaders(req.Header)
for _, f := range req.Header[connectionHeader] {
for _, sf := range strings.Split(f, ",") {
if sf = textproto.TrimString(sf); sf != "" {
req.Header.Del(sf)
}
}
}
if reqUpType != "" {
req.Header.Set(connectionHeader, upgradeHeader)
@@ -36,16 +35,4 @@ func Remove(req *http.Request) *http.Request {
} else {
req.Header.Del(connectionHeader)
}
return req
}
func removeConnectionHeaders(h http.Header) {
for _, f := range h[connectionHeader] {
for _, sf := range strings.Split(f, ",") {
if sf = textproto.TrimString(sf); sf != "" {
h.Del(sf)
}
}
}
}

View File

@@ -50,19 +50,13 @@ func TestRemover(t *testing.T) {
t.Run(test.desc, func(t *testing.T) {
t.Parallel()
next := http.HandlerFunc(func(rw http.ResponseWriter, req *http.Request) {})
h := Remover(next)
req := httptest.NewRequest(http.MethodGet, "https://localhost", nil)
for k, v := range test.reqHeaders {
req.Header.Set(k, v)
}
rw := httptest.NewRecorder()
h.ServeHTTP(rw, req)
RemoveConnectionHeaders(req)
assert.Equal(t, test.expected, req.Header)
})

View File

@@ -120,8 +120,6 @@ func (fa *forwardAuth) GetTracingInformation() (string, string, trace.SpanKind)
func (fa *forwardAuth) ServeHTTP(rw http.ResponseWriter, req *http.Request) {
logger := middlewares.GetLogger(req.Context(), fa.name, typeNameForward)
req = Remove(req)
forwardReq, err := http.NewRequestWithContext(req.Context(), http.MethodGet, fa.address, nil)
if err != nil {
logger.Debug().Msgf("Error calling %s. Cause %s", fa.address, err)
@@ -262,6 +260,8 @@ func (fa *forwardAuth) buildModifier(authCookies []*http.Cookie) func(res *http.
func writeHeader(req, forwardReq *http.Request, trustForwardHeader bool, allowedHeaders []string) {
utils.CopyHeaders(forwardReq.Header, req.Header)
RemoveConnectionHeaders(forwardReq)
utils.RemoveHeaders(forwardReq.Header, hopHeaders...)
forwardReq.Header = filterForwardRequestHeaders(forwardReq.Header, allowedHeaders)

View File

@@ -1,6 +1,7 @@
package compress
import (
"cmp"
"slices"
"strconv"
"strings"
@@ -19,7 +20,7 @@ const (
type Encoding struct {
Type string
Weight *float64
Weight float64
}
func getCompressionType(acceptEncoding []string, defaultType string) string {
@@ -37,11 +38,11 @@ func getCompressionType(acceptEncoding []string, defaultType string) string {
encoding := encodings[0]
if encoding.Type == identityName && encoding.Weight != nil && *encoding.Weight == 0 {
if encoding.Type == identityName && encoding.Weight == 0 {
return notAcceptable
}
if encoding.Type == wildcardName && encoding.Weight != nil && *encoding.Weight == 0 {
if encoding.Type == wildcardName && encoding.Weight == 0 {
return notAcceptable
}
@@ -83,11 +84,13 @@ func parseAcceptEncoding(acceptEncoding []string) ([]Encoding, bool) {
continue
}
var weight *float64
// If no "q" parameter is present, the default weight is 1.
// https://www.rfc-editor.org/rfc/rfc9110.html#name-quality-values
weight := 1.0
if len(parsed) > 1 && strings.HasPrefix(parsed[1], "q=") {
w, _ := strconv.ParseFloat(strings.TrimPrefix(parsed[1], "q="), 64)
weight = &w
weight = w
hasWeight = true
}
@@ -98,41 +101,9 @@ func parseAcceptEncoding(acceptEncoding []string) ([]Encoding, bool) {
}
}
slices.SortFunc(encodings, compareEncoding)
slices.SortFunc(encodings, func(a, b Encoding) int {
return cmp.Compare(b.Weight, a.Weight)
})
return encodings, hasWeight
}
func compareEncoding(a, b Encoding) int {
lhs, rhs := a.Weight, b.Weight
if lhs == nil && rhs == nil {
return 0
}
if lhs == nil && *rhs == 0 {
return -1
}
if lhs == nil {
return 1
}
if rhs == nil && *lhs == 0 {
return 1
}
if rhs == nil {
return -1
}
if *lhs < *rhs {
return 1
}
if *lhs > *rhs {
return -1
}
return 0
}

View File

@@ -74,6 +74,11 @@ func Test_getCompressionType(t *testing.T) {
values: []string{"gzip, *;q=0"},
expected: gzipName,
},
{
desc: "mixed weight",
values: []string{"gzip, br;q=0.9"},
expected: gzipName,
},
}
for _, test := range testCases {
@@ -98,10 +103,10 @@ func Test_parseAcceptEncoding(t *testing.T) {
desc: "weight",
values: []string{"br;q=1.0, zstd;q=0.9, gzip;q=0.8, *;q=0.1"},
expected: []Encoding{
{Type: brotliName, Weight: ptr[float64](1)},
{Type: zstdName, Weight: ptr(0.9)},
{Type: gzipName, Weight: ptr(0.8)},
{Type: wildcardName, Weight: ptr(0.1)},
{Type: brotliName, Weight: 1},
{Type: zstdName, Weight: 0.9},
{Type: gzipName, Weight: 0.8},
{Type: wildcardName, Weight: 0.1},
},
assertWeight: assert.True,
},
@@ -109,10 +114,10 @@ func Test_parseAcceptEncoding(t *testing.T) {
desc: "mixed",
values: []string{"zstd,gzip, br;q=1.0, *;q=0"},
expected: []Encoding{
{Type: brotliName, Weight: ptr[float64](1)},
{Type: zstdName},
{Type: gzipName},
{Type: wildcardName, Weight: ptr[float64](0)},
{Type: zstdName, Weight: 1},
{Type: gzipName, Weight: 1},
{Type: brotliName, Weight: 1},
{Type: wildcardName, Weight: 0},
},
assertWeight: assert.True,
},
@@ -120,10 +125,10 @@ func Test_parseAcceptEncoding(t *testing.T) {
desc: "no weight",
values: []string{"zstd, gzip, br, *"},
expected: []Encoding{
{Type: zstdName},
{Type: gzipName},
{Type: brotliName},
{Type: wildcardName},
{Type: zstdName, Weight: 1},
{Type: gzipName, Weight: 1},
{Type: brotliName, Weight: 1},
{Type: wildcardName, Weight: 1},
},
assertWeight: assert.False,
},
@@ -131,9 +136,9 @@ func Test_parseAcceptEncoding(t *testing.T) {
desc: "weight and identity",
values: []string{"gzip;q=1.0, identity; q=0.5, *;q=0"},
expected: []Encoding{
{Type: gzipName, Weight: ptr[float64](1)},
{Type: identityName, Weight: ptr(0.5)},
{Type: wildcardName, Weight: ptr[float64](0)},
{Type: gzipName, Weight: 1},
{Type: identityName, Weight: 0.5},
{Type: wildcardName, Weight: 0},
},
assertWeight: assert.True,
},
@@ -150,7 +155,3 @@ func Test_parseAcceptEncoding(t *testing.T) {
})
}
}
func ptr[T any](t T) *T {
return &t
}

View File

@@ -670,39 +670,32 @@ func Test1xxResponses(t *testing.T) {
assert.NotEqualValues(t, body, fakeBody)
}
func BenchmarkCompress(b *testing.B) {
func BenchmarkCompressGzip(b *testing.B) {
runCompressionBenchmark(b, gzipName)
}
func BenchmarkCompressBrotli(b *testing.B) {
runCompressionBenchmark(b, brotliName)
}
func BenchmarkCompressZstandard(b *testing.B) {
runCompressionBenchmark(b, zstdName)
}
func runCompressionBenchmark(b *testing.B, algorithm string) {
b.Helper()
testCases := []struct {
name string
parallel bool
size int
}{
{
name: "2k",
size: 2048,
},
{
name: "20k",
size: 20480,
},
{
name: "100k",
size: 102400,
},
{
name: "2k parallel",
parallel: true,
size: 2048,
},
{
name: "20k parallel",
parallel: true,
size: 20480,
},
{
name: "100k parallel",
parallel: true,
size: 102400,
},
{"2k", false, 2048},
{"20k", false, 20480},
{"100k", false, 102400},
{"2k parallel", true, 2048},
{"20k parallel", true, 20480},
{"100k parallel", true, 102400},
}
for _, test := range testCases {
@@ -716,7 +709,7 @@ func BenchmarkCompress(b *testing.B) {
handler, _ := New(context.Background(), next, dynamic.Compress{}, "testing")
req, _ := http.NewRequest(http.MethodGet, "/whatever", nil)
req.Header.Set("Accept-Encoding", "gzip")
req.Header.Set("Accept-Encoding", algorithm)
b.ReportAllocs()
b.SetBytes(int64(test.size))
@@ -724,7 +717,7 @@ func BenchmarkCompress(b *testing.B) {
b.ResetTimer()
b.RunParallel(func(pb *testing.PB) {
for pb.Next() {
runBenchmark(b, req, handler)
runBenchmark(b, req, handler, algorithm)
}
})
return
@@ -732,13 +725,13 @@ func BenchmarkCompress(b *testing.B) {
b.ResetTimer()
for range b.N {
runBenchmark(b, req, handler)
runBenchmark(b, req, handler, algorithm)
}
})
}
}
func runBenchmark(b *testing.B, req *http.Request, handler http.Handler) {
func runBenchmark(b *testing.B, req *http.Request, handler http.Handler, algorithm string) {
b.Helper()
res := httptest.NewRecorder()
@@ -747,7 +740,7 @@ func runBenchmark(b *testing.B, req *http.Request, handler http.Handler) {
b.Fatalf("Expected 200 but got %d", code)
}
assert.Equal(b, gzipName, res.Header().Get(contentEncodingHeader))
assert.Equal(b, algorithm, res.Header().Get(contentEncodingHeader))
}
func generateBytes(length int) []byte {

View File

@@ -8,6 +8,7 @@ import (
"mime"
"net"
"net/http"
"sync"
"github.com/andybalholm/brotli"
"github.com/klauspost/compress/zstd"
@@ -45,6 +46,7 @@ type CompressionHandler struct {
excludedContentTypes []parsedContentType
includedContentTypes []parsedContentType
next http.Handler
writerPool sync.Pool
}
// NewCompressionHandler returns a new compressing handler.
@@ -92,7 +94,7 @@ func NewCompressionHandler(cfg Config, next http.Handler) (http.Handler, error)
func (c *CompressionHandler) ServeHTTP(rw http.ResponseWriter, r *http.Request) {
rw.Header().Add(vary, acceptEncoding)
compressionWriter, err := newCompressionWriter(c.cfg.Algorithm, rw)
compressionWriter, err := c.getCompressionWriter(rw)
if err != nil {
logger := middlewares.GetLogger(r.Context(), c.cfg.MiddlewareName, typeName)
logger.Debug().Msgf("Create compression handler: %v", err)
@@ -100,6 +102,7 @@ func (c *CompressionHandler) ServeHTTP(rw http.ResponseWriter, r *http.Request)
rw.WriteHeader(http.StatusInternalServerError)
return
}
defer c.putCompressionWriter(compressionWriter)
responseWriter := &responseWriter{
rw: rw,
@@ -130,6 +133,8 @@ type compression interface {
// as it would otherwise send some extra "end of compression" bytes.
// Close also makes sure to flush whatever was left to write from the buffer.
Close() error
// Reset reinitializes the state of the encoder, allowing it to be reused.
Reset(w io.Writer)
}
type compressionWriter struct {
@@ -137,6 +142,19 @@ type compressionWriter struct {
alg string
}
func (c *CompressionHandler) getCompressionWriter(rw io.Writer) (*compressionWriter, error) {
if writer, ok := c.writerPool.Get().(*compressionWriter); ok {
writer.compression.Reset(rw)
return writer, nil
}
return newCompressionWriter(c.cfg.Algorithm, rw)
}
func (c *CompressionHandler) putCompressionWriter(writer *compressionWriter) {
writer.Reset(nil)
c.writerPool.Put(writer)
}
func newCompressionWriter(algo string, in io.Writer) (*compressionWriter, error) {
switch algo {
case brotliName:

View File

@@ -8,6 +8,7 @@ import (
"os"
"path/filepath"
"reflect"
"runtime"
"strings"
"github.com/http-wasm/http-wasm-host-go/handler"
@@ -135,7 +136,19 @@ func (b *wasmMiddlewareBuilder) buildMiddleware(ctx context.Context, next http.H
return nil, nil, fmt.Errorf("creating middleware: %w", err)
}
return mw.NewHandler(ctx, next), applyCtx, nil
h := mw.NewHandler(ctx, next)
// Traefik does not Close the middleware when creating a new instance on a configuration change.
// When the middleware is about to be garbage collected, we need to close it so the wasm instance is properly closed.
// Reference: https://github.com/traefik/traefik/issues/11119
runtime.SetFinalizer(h, func(_ http.Handler) {
if err := mw.Close(ctx); err != nil {
logger.Err(err).Msg("[wasm] middleware Close failed")
} else {
logger.Debug().Msg("[wasm] middleware Close ok")
}
})
return h, applyCtx, nil
}
// WasmMiddleware is an HTTP handler plugin wrapper.

View File

@@ -424,8 +424,7 @@ func (p *Provider) watchNewDomains(ctx context.Context) {
if len(route.TLS.Domains) > 0 {
domains := deleteUnnecessaryDomains(ctxRouter, route.TLS.Domains)
for i := range len(domains) {
domain := domains[i]
for _, domain := range domains {
safe.Go(func() {
dom, cert, err := p.resolveCertificate(ctx, domain, traefiktls.DefaultTLSStoreName)
if err != nil {
@@ -461,8 +460,7 @@ func (p *Provider) watchNewDomains(ctx context.Context) {
if len(route.TLS.Domains) > 0 {
domains := deleteUnnecessaryDomains(ctxRouter, route.TLS.Domains)
for i := range len(domains) {
domain := domains[i]
for _, domain := range domains {
safe.Go(func() {
dom, cert, err := p.resolveCertificate(ctx, domain, traefiktls.DefaultTLSStoreName)
if err != nil {
@@ -554,8 +552,11 @@ func (p *Provider) resolveDefaultCertificate(ctx context.Context, domains []stri
p.resolvingDomainsMutex.Lock()
sort.Strings(domains)
domainKey := strings.Join(domains, ",")
sortedDomains := make([]string, len(domains))
copy(sortedDomains, domains)
sort.Strings(sortedDomains)
domainKey := strings.Join(sortedDomains, ",")
if _, ok := p.resolvingDomains[domainKey]; ok {
p.resolvingDomainsMutex.Unlock()
@@ -955,12 +956,14 @@ func (p *Provider) certExists(validDomains []string) bool {
p.certificatesMu.RLock()
defer p.certificatesMu.RUnlock()
sort.Strings(validDomains)
sortedDomains := make([]string, len(validDomains))
copy(sortedDomains, validDomains)
sort.Strings(sortedDomains)
for _, cert := range p.certificates {
domains := cert.Certificate.Domain.ToStrArray()
sort.Strings(domains)
if reflect.DeepEqual(domains, validDomains) {
if reflect.DeepEqual(domains, sortedDomains) {
return true
}
}

View File

@@ -219,7 +219,7 @@ func (p *Provider) loadConfigurationFromIngresses(ctx context.Context, client Cl
var ingressClasses []*netv1.IngressClass
if !p.DisableIngressClassLookup {
if !p.DisableIngressClassLookup && !p.DisableClusterScopeResources {
ics, err := client.GetIngressClasses()
if err != nil {
log.Ctx(ctx).Warn().Err(err).Msg("Failed to list ingress classes")

View File

@@ -26,11 +26,12 @@ func Bool(v bool) *bool { return &v }
func TestLoadConfigurationFromIngresses(t *testing.T) {
testCases := []struct {
desc string
ingressClass string
expected *dynamic.Configuration
allowEmptyServices bool
disableIngressClassLookup bool
desc string
ingressClass string
expected *dynamic.Configuration
allowEmptyServices bool
disableIngressClassLookup bool
disableClusterScopeResources bool
}{
{
desc: "Empty ingresses",
@@ -1335,6 +1336,38 @@ func TestLoadConfigurationFromIngresses(t *testing.T) {
},
},
},
{
// Duplicate of the test case above, with the same fixture, but with the disableClusterScopeResources option set to true.
// Shows that disabling the ingressClass discovery still allows Ingresses carrying the ingress class annotation to be discovered.
desc: "Ingress with ingress annotation",
disableClusterScopeResources: true,
expected: &dynamic.Configuration{
HTTP: &dynamic.HTTPConfiguration{
Middlewares: map[string]*dynamic.Middleware{},
Routers: map[string]*dynamic.Router{
"testing-bar": {
Rule: "PathPrefix(`/bar`)",
Service: "testing-service1-80",
},
},
Services: map[string]*dynamic.Service{
"testing-service1-80": {
LoadBalancer: &dynamic.ServersLoadBalancer{
PassHostHeader: Bool(true),
ResponseForwarding: &dynamic.ResponseForwarding{
FlushInterval: ptypes.Duration(100 * time.Millisecond),
},
Servers: []dynamic.Server{
{
URL: "http://10.10.0.1:8080",
},
},
},
},
},
},
},
},
{
desc: "Ingress with ingressClass",
expected: &dynamic.Configuration{
@@ -1377,6 +1410,19 @@ func TestLoadConfigurationFromIngresses(t *testing.T) {
},
},
},
{
// Duplicate of the test case above, with the same fixture, but with the disableClusterScopeResources option set to true.
// Shows that disabling the ingressClass discovery prevents Ingresses referencing an IngressClass from being discovered.
desc: "Ingress with ingressClass",
disableClusterScopeResources: true,
expected: &dynamic.Configuration{
HTTP: &dynamic.HTTPConfiguration{
Middlewares: map[string]*dynamic.Middleware{},
Routers: map[string]*dynamic.Router{},
Services: map[string]*dynamic.Service{},
},
},
},
{
desc: "Ingress with named port",
expected: &dynamic.Configuration{
@@ -1455,9 +1501,10 @@ func TestLoadConfigurationFromIngresses(t *testing.T) {
clientMock := newClientMock(generateTestFilename(test.desc))
p := Provider{
IngressClass: test.ingressClass,
AllowEmptyServices: test.allowEmptyServices,
DisableIngressClassLookup: test.disableIngressClassLookup,
IngressClass: test.ingressClass,
AllowEmptyServices: test.allowEmptyServices,
DisableIngressClassLookup: test.disableIngressClassLookup,
DisableClusterScopeResources: test.disableClusterScopeResources,
}
conf := p.loadConfigurationFromIngresses(context.Background(), clientMock)

View File

@@ -28,7 +28,7 @@ func isPostgres(br *bufio.Reader) (bool, error) {
peeked, err := br.Peek(i)
if err != nil {
var opErr *net.OpError
if !errors.Is(err, io.EOF) && (!errors.As(err, &opErr) || opErr.Timeout()) {
if !errors.Is(err, io.EOF) && (!errors.As(err, &opErr) || !opErr.Timeout()) {
log.Error().Err(err).Msg("Error while Peeking first byte")
}
return false, err

View File

@@ -363,7 +363,7 @@ func clientHelloInfo(br *bufio.Reader) (*clientHello, error) {
hdr, err := br.Peek(1)
if err != nil {
var opErr *net.OpError
if !errors.Is(err, io.EOF) && (!errors.As(err, &opErr) || opErr.Timeout()) {
if !errors.Is(err, io.EOF) && (!errors.As(err, &opErr) || !opErr.Timeout()) {
log.Error().Err(err).Msg("Error while Peeking first byte")
}
return nil, err

View File

@@ -1,8 +1,8 @@
#!/usr/bin/env bash
# shellcheck disable=SC2046
set -e -o pipefail
# shellcheck disable=SC1091 # Cannot check source of this file
source /go/src/k8s.io/code-generator/kube_codegen.sh
git config --global --add safe.directory "/go/src/${PROJECT_MODULE}"

View File

@@ -4,11 +4,11 @@ RepositoryName = "traefik"
OutputType = "file"
FileName = "traefik_changelog.md"
# example new bugfix v3.1.4
# example new bugfix v3.1.6
CurrentRef = "v3.1"
PreviousRef = "v3.1.3"
PreviousRef = "v3.1.5"
BaseBranch = "v3.1"
FutureCurrentRefName = "v3.1.4"
FutureCurrentRefName = "v3.1.6"
ThresholdPreviousRef = 10
ThresholdCurrentRef = 10

View File

@@ -15,9 +15,9 @@ for os in linux darwin windows freebsd openbsd; do
go clean -cache
done
cat dist/**/*_checksums.txt >> dist/traefik_${VERSION}_checksums.txt
cat dist/**/*_checksums.txt >> "dist/traefik_${VERSION}_checksums.txt"
rm dist/**/*_checksums.txt
tar cfz dist/traefik-${VERSION}.src.tar.gz \
tar cfz "dist/traefik-${VERSION}.src.tar.gz" \
--exclude-vcs \
--exclude .idea \
--exclude .travis \
@@ -25,4 +25,4 @@ tar cfz dist/traefik-${VERSION}.src.tar.gz \
--exclude .github \
--exclude dist .
chown -R $(id -u):$(id -g) dist/
chown -R "$(id -u)":"$(id -g)" dist/

View File

@@ -6,6 +6,7 @@ script_dir="$( cd "$( dirname "${0}" )" && pwd -P)"
if command -v shellcheck
then
exit_code=0
# The list of shell scripts comes from the (grep ...) command, feeding the loop
while IFS= read -r script_to_check
do
@@ -18,7 +19,13 @@ then
| grep -v '.git/' | grep -v 'vendor/' | grep -v 'node_modules/' \
| cut -d':' -f1
)
wait # Wait for all background command to be completed
# Wait for all background commands to complete
for p in $(jobs -p)
do
wait "$p" || exit_code=$?
done
exit $exit_code
else
echo "== Command shellcheck not found in your PATH. No shell script checked."
exit 1
fi

View File

@@ -1,7 +1,7 @@
FROM node:20.14
FROM node:22.9-alpine3.20
# Current Active LTS release according to (https://nodejs.org/en/about/releases/)
ENV WEBUI_DIR /src/webui
ENV WEBUI_DIR=/src/webui
RUN mkdir -p $WEBUI_DIR
COPY package.json $WEBUI_DIR/

View File

@@ -5,6 +5,8 @@ import (
"io/fs"
)
// Files starting with . and _ are excluded by default
//
//go:embed static
var assets embed.FS

View File

@@ -20,16 +20,13 @@
"dependencies": {
"@quasar/extras": "^1.16.12",
"axios": "^1.7.4",
"bowser": "^2.11.0",
"chart.js": "^4.4.1",
"core-js": "^3.35.1",
"dot-prop": "^8.0.2",
"iframe-resizer": "^4.3.9",
"lodash.isequal": "4.5.0",
"moment": "^2.30.1",
"quasar": "^2.16.6",
"query-string": "^8.1.0",
"vh-check": "^2.0.5",
"vue": "^3.0.0",
"vue-chartjs": "^5.3.0",
"vue-router": "^4.0.12",
@@ -53,8 +50,11 @@
"postcss": "^8.4.14",
"vitest": "^1.6.0"
},
"resolutions": {
"cookie": "^0.7.0"
},
"engines": {
"node": "^20 || ^18 || ^16",
"node": "^22 || ^20 || ^18 || ^16",
"npm": ">= 6.13.4",
"yarn": ">= 1.22.22"
},

View File

@@ -13,9 +13,7 @@ module.exports = configure(function (ctx) {
// app boot file (/src/boot)
// --> boot files are part of "main.js"
boot: [
'api',
'_hacks',
'_init'
'api'
],
css: [

View File

@@ -1,16 +0,0 @@
import iframeResize from 'iframe-resizer/js/iframeResizer'
const resize = {
mounted (el, binding) {
const options = binding.value || {}
el.addEventListener('load', () => iframeResize(options, el))
},
unmounted (el) {
const resizableEl = el
if (resizableEl.iFrameResizer) {
resizableEl.iFrameResizer.removeListeners()
}
}
}
export default resize

View File

@@ -1,10 +0,0 @@
import { APP } from '../_helpers/APP'
import Boot from '../_middleware/Boot'
export default async ({ app, router, store }) => {
app.use(Boot)
APP.root = app
APP.router = router
APP.store = store
}

View File

@@ -1,13 +0,0 @@
import Bowser from 'bowser'
import vhCheck from 'vh-check'
const browser = Bowser.getParser(window.navigator.userAgent)
// In Mobile
if (browser.getPlatform().type === 'mobile') {
vhCheck()
}
export default async ({ app, Vue }) => {
}

View File

@@ -1,30 +0,0 @@
import { APP } from '../_helpers/APP'
import errors from '../_helpers/Errors'
import resize from '../_directives/resize'
export default async ({ app, router }) => {
// Directives
app.directive('resize', resize)
// Router
// ----------------------------------------------
router.beforeEach(async (to, from, next) => {
// Set APP
APP.routeTo = to
APP.routeFrom = from
next()
})
// Api (axios)
// ----------------------------------------------
APP.api.interceptors.request.use((config) => {
console.log('interceptors -> config', config)
// config.headers['Accept'] = '*/*'
return config
})
APP.api.interceptors.response.use((response) => {
console.log('interceptors -> response', response)
return response
}, errors.handleResponse)
}

View File

@@ -64,7 +64,7 @@ export default defineComponent({
&-focus {
border: solid 2px $accent;
}
&-ex-size{
&-ex-size {
.text-h3 {
font-size: 22px;
}

View File

@@ -15,7 +15,7 @@
<div
v-for="(entryItems, index) in entryAll.items"
:key="index"
class="col-12 col-sm-6 col-md-2"
class="col-12 col-sm-6 col-md-3"
>
<panel-entry
:name="entryItems.name"

File diff suppressed because it is too large