This will allow instance admins to view signup patterns for
public instances. It is modelled after the approaches of Discourse, Mastodon, and
MediaWiki.
Note: This has privacy implications, but since the above-mentioned open-source
projects take this approach, especially MediaWiki, which has no doubt
looked into this thoroughly, it is likely okay for us too. However, I
would appreciate any feedback on how this could be improved.
---------
Co-authored-by: Giteabot <teabot@gitea.io>
All refs under `refs/pull` should only be changed by Gitea itself, not
by pushes from outside of Gitea.
This PR prevents updates to the pull refs but allows other refs to be
updated in the same push, e.g. with `--mirror` operations.
The main change is to add checks to the `update` hook rather than
`pre-receive`, because `update` is invoked once per ref, while
`pre-receive` would revert all changes as soon as one ref update fails.
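For context, a minimal sketch (not the actual Gitea hook code) of the kind of per-ref check an `update` hook can perform. Git invokes the hook once per ref with the ref name as the first argument, so only the offending ref is rejected:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	// Git calls the update hook as: update <ref-name> <old-oid> <new-oid>
	if len(os.Args) < 4 {
		os.Exit(1)
	}
	refName := os.Args[1]

	// Reject direct pushes to refs/pull/*; these refs are managed internally.
	if strings.HasPrefix(refName, "refs/pull/") {
		fmt.Fprintf(os.Stderr, "%s is an internal pull reference and cannot be pushed\n", refName)
		os.Exit(1)
	}
	// A zero exit code lets this single ref update proceed.
}
```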
Change the copy to use `ActionsArtifact.StoragePath` instead of the
`ArtifactPath`. Skip artifacts that are expired, and don't error if the
file to copy does not exist.
---
When trying to migrate actions artifact storage from local to MinIO, we
encountered errors that prevented the process from completing
successfully:
* The migration tries to copy the files using the per-run
`ArtifactPath`, instead of the unique `StoragePath`.
* Artifacts that have been marked expired and had their files deleted
would throw an error.
* Artifacts that are pending but don't have a file uploaded yet would
throw an error.
This PR addresses these cases and allows the process to complete
successfully.
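As a rough sketch of the intended copy behavior (the storage interface and artifact fields here are simplified stand-ins, not Gitea's actual types):

```go
package example

import (
	"context"
	"errors"
	"io"
	"io/fs"
)

// Simplified stand-ins for the storage and model types, for illustration only.
type ObjectStorage interface {
	Open(path string) (io.ReadCloser, error)
	Save(path string, r io.Reader, size int64) (int64, error)
}

type ActionsArtifact struct {
	StoragePath string
	Expired     bool // stands in for the artifact's expired status
}

// copyArtifacts copies each artifact by its unique StoragePath, skipping
// expired artifacts and artifacts whose file does not exist yet.
func copyArtifacts(ctx context.Context, src, dst ObjectStorage, artifacts []*ActionsArtifact) error {
	for _, a := range artifacts {
		if a.Expired {
			continue // files of expired artifacts may already be deleted
		}
		f, err := src.Open(a.StoragePath)
		if err != nil {
			if errors.Is(err, fs.ErrNotExist) {
				continue // pending artifacts may have no uploaded file yet
			}
			return err
		}
		if _, err := dst.Save(a.StoragePath, f, -1); err != nil {
			f.Close()
			return err
		}
		f.Close()
	}
	return nil
}
```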
This PR implements object storage (LFS/Packages/Attachments, etc.)
for Azure Blob Storage. It depends on the official Azure Go SDK and
supports both the Azure Blob Storage cloud service and the Azurite mock
server.
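The Gitea storage implementation itself is not reproduced here; as a rough sketch of how the official SDK is typically used (the account name, key, container, and blob name below are hypothetical placeholders):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"strings"

	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob"
)

func main() {
	// Hypothetical account values; Azurite uses a well-known development
	// account and a different endpoint form.
	accountName := "myaccount"
	accountKey := "base64-account-key"

	cred, err := azblob.NewSharedKeyCredential(accountName, accountKey)
	if err != nil {
		log.Fatal(err)
	}
	serviceURL := fmt.Sprintf("https://%s.blob.core.windows.net/", accountName)
	client, err := azblob.NewClientWithSharedKeyCredential(serviceURL, cred, nil)
	if err != nil {
		log.Fatal(err)
	}

	// Upload a blob from a reader; a Save-style storage method looks similar.
	ctx := context.Background()
	if _, err := client.UploadStream(ctx, "gitea", "lfs/objects/ab/cd/abcd1234",
		strings.NewReader("hello"), nil); err != nil {
		log.Fatal(err)
	}
}
```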
Replaces #25458, fixes #22527
- [x] CI Tests
- [x] integration tests; the MSSQL integration tests are now based on
azureblob
- [x] unit test
- [x] CLI Migrate Storage
- [x] Documentation for configuration added
------
TODO (other PRs):
- [ ] Improve performance of `blob download`.
---------
Co-authored-by: yp05327 <576951401@qq.com>
Add a configuration item to enable S3 virtual-hosted style (V2) to solve
the problem caused by some S3 service providers not supporting path
style (V1).
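For illustration, Gitea's S3 support is built on minio-go, where this corresponds to the client's bucket-lookup option; a minimal sketch (endpoint and credentials are placeholders):

```go
package main

import (
	"log"

	"github.com/minio/minio-go/v7"
	"github.com/minio/minio-go/v7/pkg/credentials"
)

func main() {
	// BucketLookupDNS selects virtual-hosted-style requests;
	// BucketLookupPath selects path-style requests.
	client, err := minio.New("s3.example.com", &minio.Options{
		Creds:        credentials.NewStaticV4("ACCESS_KEY", "SECRET_KEY", ""),
		Secure:       true,
		BucketLookup: minio.BucketLookupDNS,
	})
	if err != nil {
		log.Fatal(err)
	}
	_ = client
}
```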
Merging a PR may fail because of various problems. The pull request may
end up in a dirty state because there is no transaction when merging a pull
request. Ref:
https://github.com/go-gitea/gitea/pull/25741#issuecomment-2074126393
This PR moves all database update operations for merging a pull request
into the post-receive handler and wraps them in a database transaction.
That means that if the database operations fail, the Git merge fails and
the Git client receives a failure result.
There are already many tests for pull request merging, so we don't need
to add a new one.
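A conceptual sketch of the transaction-in-post-receive idea, using plain database/sql and hypothetical table/column names rather than Gitea's actual models:

```go
package example

import (
	"context"
	"database/sql"
)

// Conceptual sketch only: the real code lives in Gitea's post-receive handling
// and uses Gitea's own models; the table and column names here are hypothetical.
// The point is that all merge-related database updates run in one transaction,
// so any failure rolls everything back and the git client sees a failed push.
func handleMergePostReceive(ctx context.Context, db *sql.DB, prID int64, mergedCommitID string) error {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return err
	}
	defer func() { _ = tx.Rollback() }() // no-op after a successful commit

	if _, err := tx.ExecContext(ctx,
		`UPDATE pull_request SET has_merged = ?, merged_commit_id = ? WHERE id = ?`,
		true, mergedCommitID, prID); err != nil {
		return err // rolled back: the merge is reported as failed
	}
	if _, err := tx.ExecContext(ctx,
		`UPDATE issue SET is_closed = ? WHERE id = (SELECT issue_id FROM pull_request WHERE id = ?)`,
		true, prID); err != nil {
		return err
	}
	return tx.Commit()
}
```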
---------
Co-authored-by: wxiaoguang <wxiaoguang@gmail.com>
Follow-up to #30472:
When a user is created by the command line `./gitea admin user create`:
Old behavior before #30472: the first user (admin or non-admin) doesn't
need to change their password.
Revert to the old behavior before #30472.
Notable additions:
- `redefines-builtin-id` forbids variable names that shadow Go builtins (illustrated below)
- `empty-lines` removes unnecessary empty lines that `gofumpt` does not
remove for some reason
- `superfluous-else` eliminates more superfluous `else` branches
Rules are also sorted alphabetically and I cleaned up various parts of
`.golangci.yml`.
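As a rough illustration (not Gitea code) of the kind of patterns the first and third rules flag:

```go
package example

// redefines-builtin-id: the local variable "len" shadows the builtin.
func shadowed(items []string) int {
	len := 0 // flagged: redefines builtin identifier "len"
	for range items {
		len++
	}
	return len
}

// superfluous-else: the else branch after a terminating `continue` is unnecessary.
func countPositive(xs []int) int {
	n := 0
	for _, x := range xs {
		if x <= 0 {
			continue
		} else { // flagged: the else can simply be dropped
			n++
		}
	}
	return n
}
```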
Attempts to resolve #28720.
---
Note that I am not a Gitea administrator, so I don't normally use the
gitea CLI. I just saw this issue and wanted an opportunity to understand
how this subcommand works and see if I can add this feature :^)
I tested both with `--skip-db` and without, and it does not appear to add any
database-specific files to the generated archive, i.e. I don't see a
`gitea-db.sql` or `gitea.db` file:
```console
$ TAGS="bindata sqlite sqlite_unlock_notify" make backend
Running go generate...
bindata for migration already up-to-date
bindata for options already up-to-date
bindata for public already up-to-date
bindata for templates already up-to-date
$ ./gitea dump --skip-db
2024/04/20 01:16:11 ...s/setting/session.go:77:loadSessionFrom() [I] Session Service Enabled
2024/04/20 01:16:11 ...s/storage/storage.go:176:initAttachments() [I] Initialising Attachment storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/attachments
2024/04/20 01:16:11 ...s/storage/storage.go:166:initAvatars() [I] Initialising Avatar storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/avatars
2024/04/20 01:16:11 ...s/storage/storage.go:192:initRepoAvatars() [I] Initialising Repository Avatar storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/repo-avatars
2024/04/20 01:16:11 ...s/storage/storage.go:186:initLFS() [I] Initialising LFS storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/lfs
2024/04/20 01:16:11 ...s/storage/storage.go:198:initRepoArchives() [I] Initialising Repository Archive storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/repo-archive
2024/04/20 01:16:11 ...s/storage/storage.go:208:initPackages() [I] Initialising Packages storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/packages
2024/04/20 01:16:11 ...s/storage/storage.go:219:initActions() [I] Initialising Actions storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/actions_log
2024/04/20 01:16:11 ...s/storage/storage.go:223:initActions() [I] Initialising ActionsArtifacts storage with type: local
2024/04/20 01:16:11 ...les/storage/local.go:33:NewLocalStorage() [I] Creating new Local Storage at /workspaces/gitea/data/actions_artifacts
2024/04/20 01:16:11 cmd/dump.go:172:runDump() [I] Dumping local repositories... /workspaces/gitea/data/gitea-repositories
2024/04/20 01:16:11 cmd/dump.go:195:runDump() [I] Skipping database
2024/04/20 01:16:11 cmd/dump.go:229:runDump() [I] Adding custom configuration file from /workspaces/gitea/custom/conf/app.ini
2024/04/20 01:16:11 cmd/dump.go:256:runDump() [I] Packing data directory.../workspaces/gitea/data
2024/04/20 01:16:11 cmd/dump.go:335:runDump() [I] Finish dumping in file /workspaces/gitea/gitea-dump-1713575771.zip
$ unzip /workspaces/gitea/gitea-dump-1713575771.zip -d example
Archive: /workspaces/gitea/gitea-dump-1713575771.zip
. . .
$ ls example/
app.ini custom data repos
$ ls example/data/
actions_artifacts actions_log avatars home indexers jwt queues repo-archive repo-avatars tmp
```
Co-authored-by: Giteabot <teabot@gitea.io>
The result returned for Agit should come from the `ProcReceive` hook, not
the `PostReceive` hook. Then, for all non-Agit pull requests, it will no
longer check the pull requests on every push to `refs/pull/%d/head`.
Major changes:
* Move some functions like "addReader" / "isSubDir" /
"addRecursiveExclude" to a separate package, and add tests (a sketch of `isSubDir` follows below)
* Clarify the filename & dump type logic and add tests
* Clarify the logger behavior and remove FIXME comments
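For instance, an `isSubDir` helper might look roughly like this (a sketch only; the actual implementation in the new package may differ in details such as symlink handling):

```go
package example

import (
	"path/filepath"
	"strings"
)

// isSubDir reports whether dir is inside (or equal to) parent, after
// resolving both to absolute paths.
func isSubDir(parent, dir string) (bool, error) {
	parentAbs, err := filepath.Abs(parent)
	if err != nil {
		return false, err
	}
	dirAbs, err := filepath.Abs(dir)
	if err != nil {
		return false, err
	}
	if dirAbs == parentAbs {
		return true, nil
	}
	return strings.HasPrefix(dirAbs, parentAbs+string(filepath.Separator)), nil
}
```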
Co-authored-by: Giteabot <teabot@gitea.io>
* use `setup(ctx, c.Bool("debug"))` like all other callers
* `setting.RunMode = "dev"` is a no-op.
* `if _, err := os.Stat(setting.RepoRootPath); err != nil` could be
simplified
The old code was not consistent in generating and decoding the JWT secrets.
Now, the callers only need to use two consistent functions:
`NewJwtSecretWithBase64` and `DecodeJwtSecretBase64`.
Also removes the uncommon function `Base64FixedDecode` from util.go.
Fixes #28660
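A minimal sketch of what such a consistent pair can look like (the exact secret length and encoding used by Gitea may differ):

```go
package example

import (
	"crypto/rand"
	"encoding/base64"
	"fmt"
)

// NewJwtSecretWithBase64 generates a random secret and returns it together
// with its base64 encoding, which is what gets written to the config file.
func NewJwtSecretWithBase64() ([]byte, string, error) {
	secret := make([]byte, 32)
	if _, err := rand.Read(secret); err != nil {
		return nil, "", err
	}
	return secret, base64.RawURLEncoding.EncodeToString(secret), nil
}

// DecodeJwtSecretBase64 is the inverse: it decodes a configured value and
// checks that it has the expected fixed length.
func DecodeJwtSecretBase64(s string) ([]byte, error) {
	secret, err := base64.RawURLEncoding.DecodeString(s)
	if err != nil {
		return nil, err
	}
	if len(secret) != 32 {
		return nil, fmt.Errorf("invalid JWT secret length: %d", len(secret))
	}
	return secret, nil
}
```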
Fixes an admin API bug related to `user.LoginSource`.
Fixes the `/user/emails` response not being identical to the GitHub API.
This PR unifies the user update methods. The goal is to keep the logic
in only one place (with audit logs in mind). For example, do the
password checks in only one method, not everywhere a password is updated.
After this PR is merged, user creation should be next.
This PR adds a new `must-change-password` parameter to the
`change-password` cli command.
We already have the `must-change-password` command, but it feels natural
to have this integrated into the `change-password` CLI command.
---------
Co-authored-by: 6543 <6543@obermui.de>
## Purpose
This is a refactor toward building an abstraction over managing git
repositories.
Afterwards, it does not matter anymore if they are stored on the local
disk or somewhere remote.
## What this PR changes
Previously, we used `git.OpenRepository` everywhere.
Now, we split it into two distinct functions:
Firstly, there are temporary repositories which do not change:
```go
git.OpenRepository(ctx, diskPath)
```
Gitea-managed repositories, which have a record in the database's
`repository` table, are moved to the new package `gitrepo`:
```go
gitrepo.OpenRepository(ctx, repo_model.Repo)
```
Why is `repo_model.Repository` the second parameter instead of the file
path?
Because it lets us easily adapt our repository storage strategy.
The repositories can be stored locally; however, they could just as well
be stored on a remote server.
## Further changes in other PRs
- A Git command wrapper could be created in package `gitrepo`, i.e.
`NewCommand(ctx, repo_model.Repository, commands...)`. For `git.RunOpts{Dir:
repo.RepoPath()}`, the directory should be empty before invoking this
method and should be filled in only inside the function. #28940
- Remove the `RepoPath()`/`WikiPath()` functions to reduce the
possibility of mistakes.
---------
Co-authored-by: delvh <dev.lh@web.de>
Mainly for MySQL/MSSQL.
It is important for Gitea to use a case-sensitive database charset
collation. If the database is using a case-insensitive collation, Gitea
will show startup error/warning messages and show the errors/warnings
on the admin panel's Self-Check page.
Make `gitea doctor convert` work for MySQL to convert the collations of
the database, tables, and columns.
* Fix #28131
## ⚠️ BREAKING ⚠️
It is not quite breaking, but it's highly recommended to convert the
database, tables, and columns to a consistent and case-sensitive collation.
The following four functions are duplicated, especially as interface methods.
I think we just need to keep `MustID` as the only one and remove the other three.
```go
MustID(b []byte) ObjectID
MustIDFromString(s string) ObjectID
NewID(b []byte) (ObjectID, error)
NewIDFromString(s string) (ObjectID, error)
```
Introduced the new interface method `ComputeHash`, which will replace
the `HasherInterface` interface. Now we don't need to keep two
interfaces.
Reintroduced `git.NewIDFromString` and `git.MustIDFromString`. The new
functions detect the hash length to decide which object format it is:
if it's 40, it's SHA1; if it's 64, it's SHA256. This is only correct if
the commit ID is a full one, so the parameter should always be a full
commit ID.
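A minimal sketch of the length-based detection described above, with simplified stand-in types instead of Gitea's real `ObjectFormat`/`ObjectID`:

```go
package example

import (
	"encoding/hex"
	"fmt"
)

// ObjectFormat is a simplified stand-in for illustration only.
type ObjectFormat int

const (
	SHA1 ObjectFormat = iota
	SHA256
)

// NewIDFromString classifies a full commit ID by its length: a 40-character
// hex string is treated as SHA1, a 64-character one as SHA256. Abbreviated
// IDs cannot be classified this way.
func NewIDFromString(s string) ([]byte, ObjectFormat, error) {
	var format ObjectFormat
	switch len(s) {
	case 40:
		format = SHA1
	case 64:
		format = SHA256
	default:
		return nil, 0, fmt.Errorf("unsupported hash length %d (a full commit ID is required)", len(s))
	}
	b, err := hex.DecodeString(s)
	if err != nil {
		return nil, 0, err
	}
	return b, format, nil
}
```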
@AdamMajer Please review.
Refactor Hash interfaces and centralize hash function. This will allow
easier introduction of different hash function later on.
This forms the "no-op" part of the SHA256 enablement patch.
1. Do not sort the "checks" slice again and again on every "Register" call;
it just wastes CPU while the Gitea instance runs (see the sketch below)
2. If a check doesn't exist, tell the end user
3. Add some tests
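A rough sketch of the first two points (the names are illustrative, not Gitea's actual doctor code):

```go
package example

import (
	"fmt"
	"sort"
)

type Check struct {
	Name     string
	Priority int
}

var checks []*Check

// Register just appends; no sorting happens here.
func Register(c *Check) {
	checks = append(checks, c)
}

// RunChecks sorts once, when the checks are actually run, and reports
// unknown check names to the user instead of silently ignoring them.
func RunChecks(names []string) error {
	sort.Slice(checks, func(i, j int) bool { return checks[i].Priority < checks[j].Priority })
	byName := map[string]*Check{}
	for _, c := range checks {
		byName[c.Name] = c
	}
	for _, name := range names {
		if _, ok := byName[name]; !ok {
			return fmt.Errorf("unknown check %q", name) // tell the end user
		}
		// ... run the check ...
	}
	return nil
}
```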
Hi,
This PR fixes #27988. The root cause of this issue is the use of
`path.Join` (which uses `/` as the file separator) to construct paths and
comparing them with paths constructed using `filepath.Join` (which uses the
platform-specific file separator).
The desired behavior is to ignore attachments when dumping the data
directory. Because of the mismatch above, the function
`addRecursiveExclude` does not actually ignore the attachments directory,
so it is written to the archive. The attachments directory is then
added to the archive again (with a different file separator, as mentioned in the
issue), causing a duplicate entry on Windows.
The solution is to use `filepath.Join` in `addRecursiveExclude` to
construct `currentAbsPath`.
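A small illustration of the mismatch (not the patch itself):

```go
package example

import (
	"path"
	"path/filepath"
)

// separatorsMatch illustrates the mismatch: path.Join always uses "/", while
// filepath.Join uses the platform separator. On Windows the two results differ
// (e.g. "data/attachments" vs `data\attachments`), so a string comparison
// between them silently fails and the exclude check never matches.
func separatorsMatch(base, name string) bool {
	slashPath := path.Join(base, name)  // always forward slashes
	osPath := filepath.Join(base, name) // backslashes on Windows
	return slashPath == osPath          // true on Linux/macOS, false on Windows
}
```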
The steps to reproduce it:
First, create a new OAuth2 source.
Then, a user logs in with this OAuth2 source.
Disable the OAuth2 source.
Visit users -> settings -> security; a 500 error will be displayed.
This is because this page only loads active OAuth2 sources, not all
OAuth2 sources.
This PR removes the second parameter `TestOptions.GiteaRoot` from
`unittest.MainTest`. Now it detects the root directory from the current
working directory.
---------
Co-authored-by: wxiaoguang <wxiaoguang@gmail.com>
Part of #27065
This reduces the usage of `db.DefaultContext`. I think I've got enough
files for the first PR. When this is merged, I will continue working on
this.
Considering how many files this PR affects, I hope it won't take too long
to merge, so I don't end up in merge conflict hell.
---------
Co-authored-by: wxiaoguang <wxiaoguang@gmail.com>