Compare commits

...

47 Commits

Author SHA1 Message Date
Gavin Mak
e71a8c6dd8 project: disable auto-gc for depth=1 in git config
During sync, `git checkout` can trigger fetch for missing objects in
partial clones. This internal fetch can trigger `git maintenance` or
`git gc` and cause delays during the local checkout phase. Set
maintenance.auto to false and gc.auto to 0 during `_InitRemote` if
`depth=1` to ensure that implicit fetches spawned by git skip GC.

Bug: 379111283
Change-Id: I6b22a4867f29b6e9598746cb752820a84dc2aeb6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540681
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2026-01-08 11:33:40 -08:00
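The config tweak described in this commit can be sketched as a tiny helper (hypothetical function and names; the actual change lives inside repo's `_InitRemote`):

```python
def gc_disable_config(depth):
    """Git config entries that stop implicit fetches from spawning GC.

    Hypothetical helper mirroring the change described above: only a
    depth=1 clone gets maintenance.auto=false and gc.auto=0; other
    depths are left untouched.
    """
    if depth != 1:
        return {}
    return {"maintenance.auto": "false", "gc.auto": "0"}
```

A caller would apply each key/value with `git config` in the project's gitdir.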
Mike Frysinger
c687b5df9e run_tests/release: require Python 3.9+
While we support running `repo` on clients with older Python versions,
we don't need to hold the runners & release code back.  These are only
used by repo devs on their systems to develop & release repo.

Python 3.9 was picked due to its typing changes, which we've already
started using in this code.

Change-Id: I6f8885c84298760514c25abeb1fccb0338947bf4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/539801
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2026-01-06 11:36:26 -08:00
Mike Frysinger
1dd9c57a28 tests: drop tox support
This hasn't been working out as well as we'd hope.  Tox relies on
the system having Python versions installed which distros don't
tend to carry anymore.  Our custom run_tests leverages vpython
when possible to run stable Python 3.8 & 3.11 versions, which
provides an OK level of coverage in practice.

Change-Id: Ida517f7be47ca95703e43bc0af5a24dd70c0467e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540001
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-01-06 11:32:42 -08:00
Mike Frysinger
4525c2e0ad github: add black check action
Change-Id: Ic87c1c5c72fb8a01108146c1f9d78466acb57278
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540021
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2026-01-06 11:00:32 -08:00
Mike Frysinger
45dcd738b7 tests: skip AF_UNIX tests when unavailable
UNIX sockets aren't available under Windows, so skip the test.

Change-Id: Ic4ca22d161c6dee628352aad07ac6aaceb472ac2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540002
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2026-01-06 10:17:53 -08:00
Mike Frysinger
1dad86dc00 check-metadata: skip files that do not exist
If the files don't exist, then they can't have errors, so skip checking.

Change-Id: I3ed4be4912b253c5454df41d690cb33dfe191289
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540003
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-01-06 10:17:32 -08:00
Mike Frysinger
622a5bf9c2 init: change --manifest-depth default to 1
Most users do not care about the manifest history in .repo/manifests/.
Let's change the default to 1 so things work smoothly for most people
most of the time.  For the rare folks who want the full history, they
can add --manifest-depth=0 to their `repo init`.

This has no effect on existing checkouts.

Spot checking Android & CrOS manifests shows significant speedups.
Full history can take O(10's seconds) to O(minutes) while depth of 1
takes constant time of O(~5 seconds).

Bug: 468033850
Change-Id: I4b8ed62a8a636babcc5226552badb69600d0c353
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/535481
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2026-01-05 06:36:08 -08:00
Gavin Mak
871e4c7ed1 sync: skip bloat check if fresh sync
Initial syncs won't have accumulated any garbage.

Bug: 379111283
Change-Id: I04b2ecde3e33f1f055038861a2705ab6aabb36d1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/536083
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-12-15 15:24:45 -08:00
Gavin Mak
5b0b5513d6 project: only use --no-auto-gc for git 2.23.0+
The flag for git fetch was introduced in git 2.23.0. Also skip the bloat
check after sync if using an older version.

Bug: 468589976
Bug: 379111283
Change-Id: Ib53e5494350c71a83906e5219d3a8c2b654e531f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/536082
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-12-15 11:32:49 -08:00
Gavin Mak
b5991d7128 sync: Add heuristic warning for bloated shallow repositories
For clone-depth="1" repositories that are dirty or have local commits,
add a check at the end of sync to detect excessive git object
accumulation.

This prevents silent performance degradation and disk exhaustion in
large prebuilts repos where automatic GC is typically disabled from
https://gerrit.googlesource.com/git-repo/+/7f87c54043ce9a35a5bb60a09ee846f9d7070352

Bug: 379111283
Change-Id: I376f38e1555cc6e906d852f6e63dc1c8f6331b4f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/534701
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-12-10 11:34:40 -08:00
Gavin Mak
7f87c54043 project: disable auto-gc on fetch for projects with clone-depth=1
This prevents GC hangs on repos with large binaries by skipping implicit
GC during network fetch, using clone-depth=1 as a heuristic.

Bug: 379111283
Change-Id: I977bf8cd521b11e37eba7ebc9f62120f2bbaf760
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/533802
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-08 12:18:48 -08:00
Kaushik Lingarkar
50c6226075 Prevent leftover bare gitdirs after failed sync attempts
The gitdir for a project may be left in a state with bare=true due
to a previous failed sync. In this state, during a subsequent sync
attempt, repo will skip initializing the gitdir (since the directory
already exists) and directly attempt to checkout the worktree, which
will fail because the project is bare. To reduce the chance of this
happening, initialize the gitdir in a temp directory and move it once
it is ready.

Bug: 457478027
Change-Id: I4767494a3a54e7734174eae3a0d939fa9d174288
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/524203
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-12-05 10:35:46 -08:00
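The temp-dir-then-move approach from this commit can be sketched as follows (illustrative names; `populate` stands in for repo's actual gitdir initialization):

```python
import os
import shutil
import tempfile


def init_dir_atomically(final_path, populate):
    """Build a directory in a temp location, then rename it into place.

    If populate() fails, final_path is never created, so a later sync
    attempt does not find a half-initialized gitdir (sketch of the
    approach described above).
    """
    parent = os.path.dirname(os.path.abspath(final_path))
    tmp = tempfile.mkdtemp(prefix=".tmp-gitdir-", dir=parent)
    try:
        populate(tmp)
        os.rename(tmp, final_path)  # atomic within one filesystem
    except BaseException:
        shutil.rmtree(tmp, ignore_errors=True)
        raise
```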
Peter Kjellerstedt
1e4b2887a7 project: Make the error message more logical when a linkfile fails
Due to the odd naming of the arguments to symlink(), the error when it
failed to create a symbolic link was misleading.

Change-Id: I1d0f30ade5970d80186f13e01c426b066cd1062f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/532541
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-03 08:48:11 -08:00
Peter Kjellerstedt
31b4b19387 info: Print a newline after printing the superproject's revision
Change-Id: Ib20233dad4e1f1fd54dbf5ca0324be22fe0e4db1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/528463
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-03 08:08:06 -08:00
Peter Kjellerstedt
2b6de52a36 Rename XmlManifest.GetGroupsStr() to XmlManifest.GetManifestGroupsStr()
This makes it more clear what kind of groups it refers to.

Change-Id: I47369050d1436efcc77f3a69d5b7c99a536b23bc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/528462
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-03 07:57:22 -08:00
Peter Kjellerstedt
91ec998598 manifest_xml, git_superproject: Rename an argument for XmlManifest.ToXml()
Rename the groups argument to filter_groups to make it more clear what
kind of groups it refers to.

Change-Id: I90e6e9aa74a7e3e697705dd4bf8676226055878b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/528461
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-12-02 11:44:24 -08:00
Mike Frysinger
08964a1658 docs: manifest-format: reformat spec to align the CDATA parts
Most of the file was doing this, but we've been inconsistent when
adding new entries.  Realign all of them.

Change-Id: I99ddb3a1e859235b249b6f08731bdadad8086d4e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/532461
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2025-12-02 10:43:56 -08:00
Peter Kjellerstedt
3073a90046 manifest: Propagate revision attribute through multiple levels of include
Make sure a revision attribute for an include element is propagated
through multiple levels of manifest includes.

Change-Id: If37d65b0cd47da673719976598175d0eb6b7cbbe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525341
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-11-26 02:08:44 -08:00
Peter Kjellerstedt
75773b8b9d manifest, project: Store project groups as sets
This helps a lot when including common manifests with groups and they
use extend-project.

Change-Id: Ic574e7d6696139d0eb90d9915e8c7048d5e89c07
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525323
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2025-11-26 02:08:07 -08:00
Peter Kjellerstedt
412367bfaf project: Use dicts to keep track of copyfiles and linkfiles
This avoids copying/linking the same file/link multiple times if a
copyfile/linkfile element with the same values has been specified
multiple times. This can happen when including a common manifest that
uses an extend-project element that has a copyfile/linkfile element.

This uses dicts rather than sets to store the copyfiles and linkfiles to
make sure the order they are specified in the manifest is maintained.
For Python 3.7+, maintaining the order that keys are added to dicts is
guaranteed, and for Python 3.6 it happened to be true.

The _CopyFile class and the _LinkFile class are changed to inherit from
NamedTuple to be able to store them in dicts.

Change-Id: I9f5a80298b875251a81c5fe7d353e262d104fae4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525322
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2025-11-26 02:07:35 -08:00
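The ordering and dedup behavior this commit relies on is easy to demonstrate (illustrative class, not repo's actual `_CopyFile`):

```python
from typing import NamedTuple


class CopyFile(NamedTuple):
    src: str
    dest: str


# NamedTuple instances are hashable, so a dict both deduplicates repeated
# copyfile entries and (a language guarantee since Python 3.7) keeps the
# order in which they first appeared in the manifest.
copyfiles = {}
for cf in (CopyFile("a", "x"), CopyFile("b", "y"), CopyFile("a", "x")):
    copyfiles[cf] = None

ordered = list(copyfiles)  # the duplicate ("a", "x") appears only once
```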
Peter Kjellerstedt
47c24b5c40 manifest: Make include groups propagate to extend-project elements
Any groups specified to an include element should propagate to any
extend-project elements and then on to the projects.

Change-Id: I62b95689cc13660858564ae569cbfd095961ecc7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525321
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-11-26 02:05:48 -08:00
Gavin Mak
be33106ffc wipe: Add new repo wipe subcommand
This new command allows users to delete projects from the worktree
and from the `.repo` directory. It is a destructive operation.

It handles shared projects by refusing to wipe them unless the
`--force` flag is used. It also checks for uncommitted changes
before wiping.

Bug: 393383056
Change-Id: Ia30d8ffdc781a3f179af56310ce31c9dae331bbe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/490801
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-11-21 10:48:42 -08:00
Mike Frysinger
5998c0b506 tests: manifest_xml: convert most path usage to pathlib
Should be functionally the same, but with pathlib APIs that we've
been slowly adopting in other places, especially unittests.

Change-Id: I81364117f8eaeaf138097cdfc484d4848b7ea5bd
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525881
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-11-11 10:58:51 -08:00
Peter Kjellerstedt
877ef91be2 man: Regenerate after manifest update
Change-Id: I0e7ef5d4189eaaf6878be709b437ecfb57570e3f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/524921
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2025-11-06 15:03:30 -08:00
Peter Kjellerstedt
4ab2284a94 manifest: Make extend-project support copyfile, linkfile and annotation
This allows an existing project to be extended by these elements.

Change-Id: I6826e518f39ca86485301491639101943b7e2ae0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/519781
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-10-27 11:38:07 -07:00
Gavin Mak
1afe96a7e9 sync: fix saving of fetch times and local state
Interleaved sync didn't save _fetch_times and _local_sync_state to disk.
Phased sync saved them, but incorrectly applied moving average smoothing
repeatedly when fetching submodules, and discarded historical data
during partial syncs.

Move .Save() calls to the end of main sync loops to ensure they run
once. Update _FetchTimes.Save() to merge new data with existing history,
preventing data loss.

Change-Id: I174f98a62ac86859f1eeea1daba65eb35c227852
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/519821
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-10-20 11:28:21 -07:00
Mike Frysinger
2719a8e203 run_tests: log each command run
This should make it clear to devs what commands are run and which fail
in the CI.

Change-Id: Ie863540cba6de7da933b4f32947ad09edee4aa45
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/519361
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-10-15 11:09:48 -07:00
Jeroen Dhollander
e4872ac8ba sync: Use 'git rebase' during 'repo sync --rebase'
'repo sync --rebase' should do a rebase if it encounters local commits
during a 'repo sync'.
This was broken by
https://gerrit-review.git.corp.google.com/c/git-repo/+/437421,
which caused this to execute the '_doff' hook (which stands for
'do fast forward'), which is implemented using 'git merge --no-stat'.

This caused *multiple* actual editor windows to pop up (*) during
'repo sync --rebase', asking the user to enter a commit message for the
merge.

In this CL I explicitly make that code path do a 'git rebase'.

(*) and if you use a terminal editor like 'vim', this means you have 2+
concurrent vim windows rendered in the same terminal, while 'repo sync'
keeps on printing other output lines, again in the same terminal. The
result is... not pretty, to say the least :(

Bug: b:434565811
Test: Used it myself for over a week.
Change-Id: I0bf3ff181f15b9d5b2e3f85f7f84e302139fdab7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/518602
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jeroen Dhollander <jeroendh@google.com>
Commit-Queue: Jeroen Dhollander <jeroendh@google.com>
2025-10-15 08:32:00 -07:00
Kaushik Lingarkar
4623264809 Fix submodule initialization in interleaved sync mode
With the introduction of interleaved sync mode, the submodule activation
logic broke because the 'has_submodules' attribute was no longer being
populated when needed. With this change, each submodule is initialized
when it enters the Sync_LocalHalf stage, whereas previously all
submodules were initialized at once when the parent repository entered
the Sync_LocalHalf stage. The init is now retried if it fails, as
submodules may concurrently modify the parent’s git config, potentially
causing contention when attempting to obtain a lock on it.

This change makes the submodule activation logic more robust and less
prone to breakage.

Bug: 444366154
Change-Id: I25eca4ea2a6868219045cfa088988eb01ded47d2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/509041
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@oss.qualcomm.com>
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-10-14 12:07:04 -07:00
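The retry-on-contention idea reduces to a small loop (illustrative helper; the attempt count and the retried exception type are assumptions, not repo's actual values):

```python
import time


def retry_call(func, attempts=3, delay=0.0, exc=(OSError,)):
    """Call func(), retrying on transient errors such as a locked
    parent git config when submodules initialize concurrently
    (sketch of the approach described above)."""
    for i in range(attempts):
        try:
            return func()
        except exc:
            if i == attempts - 1:
                raise
            time.sleep(delay)
```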
Kaushik Lingarkar
67383bdba9 Follow up "Fix shallow clones when upstream attribute is present"
This reverts commit 38d2fe11b9.

Reason for revert: The issue described in I00acd4c61 remains unresolved.
The previous fix incorrectly accessed use_superproject from the Project
class, though it was only defined in ManifestProject. This change uses
it from the manifest attr available in the Project class.

Bug: b/427093249
Change-Id: Ife6d46cd85840f2989f60c2ca4d5a7dcf5d7477a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/508821
Reviewed-by: Xin Li <delphij@google.com>
Reviewed-by: Krzysztof Wesolowski <krzysztof.wesolowski@volvocars.com>
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
2025-09-22 12:40:22 -07:00
Mike Frysinger
d30414bb53 forall: fix crash with no command
When callback= is used, optparse does not automatically initialize
the destination when a dest= is not specified.  Refine the test to
allow dest= options when callback= is used even when it seems like
it is otherwise redundant.

Bug: b/436611422
Change-Id: I5185f95cb857ca6d37357cac77fb117a83db9c0c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/509861
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-09-17 12:54:30 -07:00
Mike Frysinger
80d1a5ad3e run_tests: add file header checker for licensing blocks
Change-Id: Ic0bfa3b03e2ba46d565a5bc2c1b7a7463b7dca2c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/500103
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-21 11:16:35 -07:00
Mike Frysinger
c615c964fb man: regen after sync updates
Change-Id: I20937c365b3f0be76e278d17c05b76a0d5e59deb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/500101
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-08-21 11:11:38 -07:00
Mike Frysinger
5ed12ec81d standardize file header wrt licensing
We've been slightly inconsistent in the license header in files.
Standardize them so we can automate checking.

Change-Id: I3cdf85c9485d33cac2bb05c8080dfada3e5a5e8d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/500102
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-08-21 11:04:41 -07:00
Mike Frysinger
58a59fdfbc CONTRIBUTING: rename doc per Google OSS policies
Google OSS policies say to name this "CONTRIBUTING.md".

Change-Id: I037f52a443caacc89868b7c14af91dd3d1b681a9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/499761
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-08-20 14:15:53 -07:00
Gavin Mak
38d2fe11b9 Revert "Fix shallow clones when upstream attribute is present"
This reverts commit d9cc0a1526.

Reason for revert: AttributeError: 'Project' object has no attribute 'use_superproject'

Bug: b/427093249
Change-Id: I57b285ab21f58b040e68ec14b85425f43f0abcca
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498641
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-08-14 16:35:26 -07:00
Gavin Mak
854fe440f2 git_superproject: fix AttributeError in Superproject logging
Ensure _git_event_log is initialized before use in _LogMessage. This
avoids crashes when _git_event_log is accessed before it's set, such as
during repo info.

Bug: 435317391
Change-Id: I3adc32d6a9377558e852bbb43f9cf82041fcf1bc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498521
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-14 15:39:41 -07:00
Gavin Mak
d534a5537f sync: Fix missing error details in interleaved summary
When checkout errors occurred in interleaved sync, they were wrapped in
a SyncError with no message, causing blank lines in the final summary.
Refactor _SyncResult to hold a list of exceptions, ensuring the original
error messages are propagated correctly.

Bug: 438178765
Change-Id: Ic25e515068959829cb6290cfd9e4c2d3963bbbea
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498342
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-08-14 09:54:15 -07:00
Gavin Mak
a64149a7a7 sync: Record and propagate errors from deferred actions
Failures in deferred sync actions were not recorded because `_Later.Run`
discarded the `GitError` exception. Record the specific error using
`syncbuf.fail()` and propagate it for proper error aggregation and
reporting.

Bug: 438178765
Change-Id: Iad59e389f9677bd6b8d873ee1ea2aa6ce44c86fa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498141
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-13 23:17:56 -07:00
Gavin Mak
3e6acf2778 progress: Fix race condition causing fileno crash
A race condition occurs when sync redirects sys.stderr to capture worker
output, while a background progress thread simultaneously calls fileno()
on it. This causes an io.UnsupportedOperation error. Fix by caching the
original sys.stderr for all progress bar IO.

Change-Id: Idb1f45d707596d31238a19fd373cac3bf669c405
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498121
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-13 23:16:55 -07:00
Gavin Mak
a6e1a59ac1 sync: Avoid duplicate projects in error text
Keep track of finished projects, not just successful ones, when deciding
which projects still need to be synced. Also project errors are already
reported by sync workers so stall detection doesn't need to add failed
projects to the error list.

Bug: 438178765
Change-Id: Ibf15aad009ba7295e70c8df2ff158215085e9732
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498062
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-08-13 23:16:55 -07:00
Gavin Mak
380bf9546e sync: always show sync result stderr_text on error
_ProcessSyncInterleavedResults currently only shows stderr_text if
verbose. Show it if a sync worker fails, regardless of verbosity.

Bug: 438178765
Change-Id: If24dcb10fb5d6857386782d371e3f9c6844dece9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498061
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-08-13 23:16:55 -07:00
Krzysztof Wesolowski
d9cc0a1526 Fix shallow clones when upstream attribute is present
The _CheckForImmutableRevision method was modified in commit 0e776a58 to
include upstream branch validation for superproject scenarios. However,
this change inadvertently broke shallow clones when both clone-depth and
upstream attributes are specified in regular (non-superproject)
manifests.

Issue: When upstream is present, _CheckForImmutableRevision performs two
additional checks: 1. git rev-list on the upstream reference 2. git
merge-base --is-ancestor between revision and upstream

In shallow clones, the upstream branch history may not be available
locally, causing these checks to fail. This triggers the retry mechanism
that removes depth limitations, effectively converting shallow clones to
full clones, resulting in excessive disk usage.

Fix: Make upstream validation conditional on superproject usage. This
preserves the original superproject fix while restoring the method's
original behavior for regular scenarios - checking only if the immutable
revision (SHA1/tag) exists locally.

Note: The SetRevisionId method from the same commit 0e776a58 is left
unchanged as it only stores upstream information (no git operations),
which is beneficial for preserving branch context for commands like
'repo start' without causing fetch-related issues.

The fix ensures that manifests with both clone-depth and upstream work
correctly in non-superproject scenarios, maintaining shallow clone
efficiency and reducing disk usage.

Bug: b/427093249
Change-Id: I00acd4c61b179cd2abf796c2fecb7a2f38016a18
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/493883
Tested-by: Krzysztof Wesolowski <krzysztof.wesolowski@volvocars.com>
Commit-Queue: Krzysztof Wesolowski <krzysztof.wesolowski@volvocars.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Kamaljeet Maini <kamaljeet@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2025-08-05 08:28:37 -07:00
Gavin Mak
8c3585f367 project: fallback to reading HEAD when rev-parse fails
git rev-parse fails on invalid HEAD, e.g. after incomplete sync, causing
NoManifestException. Fall back to v2.56's direct file reading when
rev-parse fails.

Bug: 435045466
Change-Id: Ia14560335110c00d80408b2a93595a84446f8a57
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/495181
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-08-04 12:17:44 -07:00
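The direct-read fallback amounts to parsing the `HEAD` file by hand (a sketch of the v2.56-style path described above; error handling elided):

```python
import os


def read_head_file(gitdir):
    """Read HEAD straight from disk, for when `git rev-parse HEAD`
    fails on an invalid ref (e.g. after an interrupted sync).
    """
    with open(os.path.join(gitdir, "HEAD")) as f:
        line = f.read().strip()
    # A symbolic ref looks like "ref: refs/heads/main"; a detached HEAD
    # is a bare commit SHA-1.
    if line.startswith("ref: "):
        return line[len("ref: "):]
    return line
```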
Gavin Mak
239fad7146 hooks: verify hooks project has worktree before running
Skip hook if its project is not present on disk.

Bug: 434232630
Change-Id: I09a8b412d078af7a068d533f7be320d5b02327be
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/494441
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-07-28 08:37:08 -07:00
Kuang-che Wu
d3eec0acdd sync: fix connection error on macOS for interleaved sync
Bug: 377538810
Test: on macos, repo sync -j64
Change-Id: I6af4d4e6669dc882f165cbb9142ad4db9b346b73
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/494241
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
Tested-by: Kuang-che Wu <kcwu@google.com>
2025-07-28 02:05:24 -07:00
Gavin Mak
7f7d70efe4 project: Fix GetHead to handle detached HEADs
The switch to git rev-parse caused GetHead() to return the literal
string 'HEAD' when in a detached state. This broke repo prune, which
expects a commit SHA.

Bug: 434077990
Change-Id: I80b7d5965749096b59e854f61e913aa74c857b99
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/494401
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-07-25 14:30:07 -07:00
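The symptom comes from how `git rev-parse --abbrev-ref HEAD`-style output behaves when detached; a caller that needs a SHA must detect the literal string (illustrative helper, not repo's actual `GetHead`):

```python
def head_is_detached(abbrev_ref_output):
    """`git rev-parse --abbrev-ref HEAD` prints the literal string
    "HEAD" when no branch is checked out; callers that expect a branch
    name or SHA must special-case it (illustrating the bug above)."""
    return abbrev_ref_output.strip() == "HEAD"
```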
52 changed files with 1853 additions and 619 deletions

.github/workflows/black.yml

@@ -0,0 +1,16 @@
# GitHub actions workflow.
# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions
# https://black.readthedocs.io/en/stable/integrations/github_actions.html
name: Format
on:
  push:
    branches: [main]
jobs:
  format:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v5
      - uses: psf/black@stable


@@ -18,5 +18,5 @@ jobs:
Thanks for your contribution!
Unfortunately, we don't use GitHub pull requests to manage code
contributions to this repository.
-Instead, please see [README.md](../blob/HEAD/SUBMITTING_PATCHES.md)
+Instead, please see [README.md](../blob/HEAD/CONTRIBUTING.md)
which provides full instructions on how to get involved.


@@ -27,6 +27,6 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
-python -m pip install tox tox-gh-actions
-- name: Test with tox
-run: tox
+python -m pip install pytest
+- name: Run tests
+run: python -m pytest


@@ -1,4 +1,4 @@
-# Copyright 2023 The Android Open Source Project
+# Copyright (C) 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -43,17 +43,12 @@ probably need to split up your commit to finer grained pieces.
Lint any changes by running:
```sh
-$ tox -e lint -- file.py
+$ flake8
```
And format with:
```sh
-$ tox -e format -- file.py
-```
-Or format everything:
-```sh
-$ tox -e format
+$ black file.py
```
Repo uses [black](https://black.readthedocs.io/) with line length of 80 as its
@@ -73,15 +68,11 @@ the entire project in the included `.flake8` file.
[PEP 8]: https://www.python.org/dev/peps/pep-0008/
[flake8 documentation]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html#in-line-ignoring-errors
## Running tests
-We use [pytest](https://pytest.org/) and [tox](https://tox.readthedocs.io/) for
-running tests. You should make sure to install those first.
-To run the full suite against all supported Python versions, simply execute:
-```sh
-$ tox -p auto
-```
+We use [pytest](https://pytest.org/) for running tests. You should make sure to
+install that first.
We have [`./run_tests`](./run_tests) which is a simple wrapper around `pytest`:
```sh


@@ -14,7 +14,7 @@ that you can put anywhere in your path.
* Docs: <https://source.android.com/source/using-repo.html>
* [repo Manifest Format](./docs/manifest-format.md)
* [repo Hooks](./docs/repo-hooks.md)
-* [Submitting patches](./SUBMITTING_PATCHES.md)
+* [Contributing](./CONTRIBUTING.md)
* Running Repo in [Microsoft Windows](./docs/windows.md)
* GitHub mirror: <https://github.com/GerritCodeReview/git-repo>
* Postsubmit tests: <https://github.com/GerritCodeReview/git-repo/actions>


@@ -399,7 +399,7 @@ class Command:
result = []
if not groups:
-groups = manifest.GetGroupsStr()
+groups = manifest.GetManifestGroupsStr()
groups = [x for x in re.split(r"[,\s]+", groups) if x]
if not args:

View File

@@ -1,4 +1,4 @@
# Copyright 2021 The Android Open Source Project
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

View File

@@ -59,7 +59,7 @@ following DTD:
<!ATTLIST manifest-server url CDATA #REQUIRED>
<!ELEMENT submanifest EMPTY>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest remote IDREF #IMPLIED>
<!ATTLIST submanifest project CDATA #IMPLIED>
<!ATTLIST submanifest manifest-name CDATA #IMPLIED>
@@ -81,9 +81,9 @@ following DTD:
<!ATTLIST project sync-c CDATA #IMPLIED>
<!ATTLIST project sync-s CDATA #IMPLIED>
<!ATTLIST project sync-tags CDATA #IMPLIED>
<!ATTLIST project upstream CDATA #IMPLIED>
<!ATTLIST project upstream CDATA #IMPLIED>
<!ATTLIST project clone-depth CDATA #IMPLIED>
<!ATTLIST project force-path CDATA #IMPLIED>
<!ATTLIST project force-path CDATA #IMPLIED>
<!ELEMENT annotation EMPTY>
<!ATTLIST annotation name CDATA #REQUIRED>
@@ -95,19 +95,21 @@ following DTD:
<!ATTLIST copyfile dest CDATA #REQUIRED>
<!ELEMENT linkfile EMPTY>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile dest CDATA #REQUIRED>
<!ELEMENT extend-project EMPTY>
<!ATTLIST extend-project name CDATA #REQUIRED>
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project dest-path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ATTLIST extend-project revision CDATA #IMPLIED>
<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ELEMENT extend-project (annotation*,
copyfile*,
linkfile*)>
<!ATTLIST extend-project name CDATA #REQUIRED>
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project dest-path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ATTLIST extend-project revision CDATA #IMPLIED>
<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ATTLIST extend-project dest-branch CDATA #IMPLIED>
<!ATTLIST extend-project upstream CDATA #IMPLIED>
<!ATTLIST extend-project base-rev CDATA #IMPLIED>
<!ATTLIST extend-project upstream CDATA #IMPLIED>
<!ATTLIST extend-project base-rev CDATA #IMPLIED>
<!ELEMENT remove-project EMPTY>
<!ATTLIST remove-project name CDATA #IMPLIED>
@@ -116,7 +118,7 @@ following DTD:
<!ATTLIST remove-project base-rev CDATA #IMPLIED>
<!ELEMENT repo-hooks EMPTY>
<!ATTLIST repo-hooks in-project CDATA #REQUIRED>
<!ATTLIST repo-hooks in-project CDATA #REQUIRED>
<!ATTLIST repo-hooks enabled-list CDATA #REQUIRED>
<!ELEMENT superproject EMPTY>
@@ -125,7 +127,7 @@ following DTD:
<!ATTLIST superproject revision CDATA #IMPLIED>
<!ELEMENT contactinfo EMPTY>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
@@ -285,7 +287,7 @@ should be placed. If not supplied, `revision` is used.
`path` may not be an absolute path or use "." or ".." path components.
Attribute `groups`: List of additional groups to which all projects
Attribute `groups`: Set of additional groups to which all projects
in the included submanifest belong. This appends and recurses, meaning
all projects in submanifests carry all parent submanifest groups.
Same syntax as the corresponding element of `project`.
@@ -353,7 +355,7 @@ When using `repo upload`, changes will be submitted for code
review on this branch. If unspecified both here and in the
default element, `revision` is used instead.
Attribute `groups`: List of groups to which this project belongs,
Attribute `groups`: Set of groups to which this project belongs,
whitespace or comma separated. All projects belong to the group
"all", and each project automatically belongs to a group of
its name:`name` and path:`path`. E.g. for
@@ -401,7 +403,7 @@ of the repo client where the Git working directory for this project
should be placed. This is used to move a project in the checkout by
overriding the existing `path` setting.
Attribute `groups`: List of additional groups to which this project
Attribute `groups`: Set of additional groups to which this project
belongs. Same syntax as the corresponding element of `project`.
Attribute `revision`: If specified, overrides the revision of the original
@@ -427,19 +429,20 @@ Same syntax as the corresponding element of `project`.
### Element annotation
Zero or more annotation elements may be specified as children of a
project or remote element. Each element describes a name-value pair.
For projects, this name-value pair will be exported into each project's
environment during a 'forall' command, prefixed with `REPO__`. In addition,
there is an optional attribute "keep" which accepts the case insensitive values
"true" (default) or "false". This attribute determines whether or not the
project element, an extend-project element, or a remote element. Each
element describes a name-value pair. For projects, this name-value pair
will be exported into each project's environment during a 'forall'
command, prefixed with `REPO__`. In addition, there is an optional
attribute "keep" which accepts the case insensitive values "true"
(default) or "false". This attribute determines whether or not the
annotation will be kept when exported with the manifest subcommand.
### Element copyfile
Zero or more copyfile elements may be specified as children of a
project element. Each element describes a src-dest pair of files;
the "src" file will be copied to the "dest" place during `repo sync`
command.
project element, or an extend-project element. Each element describes a
src-dest pair of files; the "src" file will be copied to the "dest"
place during `repo sync` command.
"src" is project relative, "dest" is relative to the top of the tree.
Copying from paths outside of the project or to paths outside of the repo
@@ -450,10 +453,14 @@ Intermediate paths must not be symlinks either.
Parent directories of "dest" will be automatically created if missing.
The files are copied in the order they are specified in the manifests.
If multiple elements specify the same source and destination, they will
only be applied as one, based on the first occurrence. Files are copied
before any links specified via linkfile elements are created.
### Element linkfile
It's just like copyfile and runs at the same time as copyfile but
instead of copying it creates a symlink.
It's just like copyfile, but instead of copying it creates a symlink.
The symlink is created at "dest" (relative to the top of the tree) and
points to the path specified by "src" which is a path in the project.
@@ -463,6 +470,11 @@ Parent directories of "dest" will be automatically created if missing.
The symlink target may be a file or directory, but it may not point outside
of the repo client.
The links are created in the order they are specified in the manifests.
If multiple elements specify the same source and destination, they will
only be applied as one, based on the first occurrence. Links are created
after any files specified via copyfile elements are copied.
### Element remove-project
Deletes a project from the internal manifest table, possibly
@@ -560,13 +572,16 @@ the manifest repository's root.
"name" may not be an absolute path or use "." or ".." path components.
These restrictions are not enforced for [Local Manifests].
Attribute `groups`: List of additional groups to which all projects
Attribute `groups`: Set of additional groups to which all projects
in the included manifest belong. This appends and recurses, meaning
all projects in included manifests carry all parent include groups.
This also applies to all extend-project elements in the included manifests.
Same syntax as the corresponding element of `project`.
Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
default to which all projects in the included manifest belong.
default to which all projects in the included manifest belong. This recurses,
meaning it will apply to all projects in all manifests included as a result of
this element.
## Local Manifests {#local-manifests}

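The append-and-recurse group semantics described in the manifest docs above can be sketched as a small helper (hypothetical names, not repo's actual API):

```python
def inherit_groups(parent_groups, node_groups):
    """Append-and-recurse: a project carries all parent include/submanifest
    groups plus the groups declared on the node itself."""
    # Group strings are whitespace- or comma-separated; empties are dropped.
    own = {g for g in node_groups.replace(",", " ").split() if g}
    return set(parent_groups) | own
```

A project included two levels deep therefore ends up with the union of every ancestor's groups plus its own.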
View File

@@ -222,6 +222,12 @@ class GitConfig:
value = "true" if value else "false"
self.SetString(name, value)
def SetInt(self, name: str, value: int) -> None:
"""Set an integer value for a key."""
if value is not None:
value = str(value)
self.SetString(name, value)
def GetString(self, name: str, all_keys: bool = False) -> Union[str, None]:
"""Get the first value for a key, or None if it is not defined.

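The `SetInt` addition above stores integers by stringifying them before delegating to `SetString`; a standalone sketch of the same behavior (dict-backed stand-in, hypothetical names) looks like:

```python
class Config:
    """Minimal stand-in for GitConfig's string-backed storage."""

    def __init__(self):
        self._store = {}

    def set_string(self, name, value):
        self._store[name] = value

    def set_int(self, name, value):
        # Git config stores everything as strings; None passes through
        # unchanged so callers can unset a key.
        if value is not None:
            value = str(value)
        self.set_string(name, value)
```

This is what lets the depth=1 change write `gc.auto` as the string `"0"` through one call.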
View File

@@ -1,5 +1,4 @@
#!/bin/sh
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");

View File

@@ -190,7 +190,8 @@ class Superproject:
message = f"{self._LogMessagePrefix()} {fmt.format(*inputs)}"
if self._print_messages:
print(message, file=sys.stderr)
self._git_event_log.ErrorEvent(message, fmt)
if self._git_event_log:
self._git_event_log.ErrorEvent(message, fmt)
def _LogMessagePrefix(self):
"""Returns the prefix string to be logged in each log message"""
@@ -421,7 +422,8 @@ class Superproject:
)
return None
manifest_str = self._manifest.ToXml(
groups=self._manifest.GetGroupsStr(), omit_local=True
filter_groups=self._manifest.GetManifestGroupsStr(),
omit_local=True,
).toxml()
manifest_path = self._manifest_path
try:

View File

@@ -1,3 +1,19 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Event logging in the git trace2 EVENT format."""
from git_command import GetEventTargetPath
from git_command import RepoSourceVersion
from git_trace2_event_log_base import BaseEventLog

View File

@@ -101,12 +101,11 @@ class RepoHook:
self._abort_if_user_denies = abort_if_user_denies
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = None
if self._hooks_project and self._hooks_project.worktree:
self._script_fullpath = os.path.join(
self._hooks_project.worktree, self._hook_type + ".py"
)
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
@@ -443,6 +442,7 @@ class RepoHook:
if (
self._bypass_hooks
or not self._hooks_project
or not self._script_fullpath
or self._hook_type not in self._hooks_project.enabled_repo_hooks
):
return True

View File

@@ -1,5 +1,4 @@
#!/usr/bin/env python3
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "September 2024" "repo init" "Repo Manual"
.TH REPO "1" "December 2025" "repo init" "Repo Manual"
.SH NAME
repo \- repo init - manual page for repo init
.SH SYNOPSIS
@@ -53,7 +53,7 @@ create a git checkout of the manifest repo
.TP
\fB\-\-manifest\-depth\fR=\fI\,DEPTH\/\fR
create a shallow clone of the manifest repo with given
depth (0 for full clone); see git clone (default: 0)
depth (0 for full clone); see git clone (default: 1)
.SS Manifest (only) checkout options:
.TP
\fB\-c\fR, \fB\-\-current\-branch\fR

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "April 2025" "repo manifest" "Repo Manual"
.TH REPO "1" "December 2025" "repo manifest" "Repo Manual"
.SH NAME
repo \- repo manifest - manual page for repo manifest
.SH SYNOPSIS
@@ -139,7 +139,7 @@ include*)>
<!ATTLIST manifest\-server url CDATA #REQUIRED>
.IP
<!ELEMENT submanifest EMPTY>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest remote IDREF #IMPLIED>
<!ATTLIST submanifest project CDATA #IMPLIED>
<!ATTLIST submanifest manifest\-name CDATA #IMPLIED>
@@ -170,9 +170,9 @@ CDATA #IMPLIED>
<!ATTLIST project sync\-c CDATA #IMPLIED>
<!ATTLIST project sync\-s CDATA #IMPLIED>
<!ATTLIST project sync\-tags CDATA #IMPLIED>
<!ATTLIST project upstream CDATA #IMPLIED>
<!ATTLIST project upstream CDATA #IMPLIED>
<!ATTLIST project clone\-depth CDATA #IMPLIED>
<!ATTLIST project force\-path CDATA #IMPLIED>
<!ATTLIST project force\-path CDATA #IMPLIED>
.IP
<!ELEMENT annotation EMPTY>
<!ATTLIST annotation name CDATA #REQUIRED>
@@ -184,19 +184,34 @@ CDATA #IMPLIED>
<!ATTLIST copyfile dest CDATA #REQUIRED>
.IP
<!ELEMENT linkfile EMPTY>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile dest CDATA #REQUIRED>
.TP
<!ELEMENT extend\-project (annotation*,
copyfile*,
linkfile*)>
.TP
<!ATTLIST extend\-project name
CDATA #REQUIRED>
.TP
<!ATTLIST extend\-project path
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project dest\-path
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project groups
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project revision
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project remote
CDATA #IMPLIED>
.IP
<!ELEMENT extend\-project EMPTY>
<!ATTLIST extend\-project name CDATA #REQUIRED>
<!ATTLIST extend\-project path CDATA #IMPLIED>
<!ATTLIST extend\-project dest\-path CDATA #IMPLIED>
<!ATTLIST extend\-project groups CDATA #IMPLIED>
<!ATTLIST extend\-project revision CDATA #IMPLIED>
<!ATTLIST extend\-project remote CDATA #IMPLIED>
<!ATTLIST extend\-project dest\-branch CDATA #IMPLIED>
<!ATTLIST extend\-project upstream CDATA #IMPLIED>
<!ATTLIST extend\-project base\-rev CDATA #IMPLIED>
<!ATTLIST extend\-project upstream CDATA #IMPLIED>
<!ATTLIST extend\-project base\-rev CDATA #IMPLIED>
.IP
<!ELEMENT remove\-project EMPTY>
<!ATTLIST remove\-project name CDATA #IMPLIED>
@@ -205,7 +220,7 @@ CDATA #IMPLIED>
<!ATTLIST remove\-project base\-rev CDATA #IMPLIED>
.IP
<!ELEMENT repo\-hooks EMPTY>
<!ATTLIST repo\-hooks in\-project CDATA #REQUIRED>
<!ATTLIST repo\-hooks in\-project CDATA #REQUIRED>
<!ATTLIST repo\-hooks enabled\-list CDATA #REQUIRED>
.IP
<!ELEMENT superproject EMPTY>
@@ -214,7 +229,7 @@ CDATA #IMPLIED>
<!ATTLIST superproject revision CDATA #IMPLIED>
.IP
<!ELEMENT contactinfo EMPTY>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
.IP
<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
@@ -362,7 +377,7 @@ supplied, `revision` is used.
.PP
`path` may not be an absolute path or use "." or ".." path components.
.PP
Attribute `groups`: List of additional groups to which all projects in the
Attribute `groups`: Set of additional groups to which all projects in the
included submanifest belong. This appends and recurses, meaning all projects in
submanifests carry all parent submanifest groups. Same syntax as the
corresponding element of `project`.
@@ -424,7 +439,7 @@ Attribute `dest\-branch`: Name of a Git branch (e.g. `main`). When using `repo
upload`, changes will be submitted for code review on this branch. If
unspecified both here and in the default element, `revision` is used instead.
.PP
Attribute `groups`: List of groups to which this project belongs, whitespace or
Attribute `groups`: Set of groups to which this project belongs, whitespace or
comma separated. All projects belong to the group "all", and each project
automatically belongs to a group of its name:`name` and path:`path`. E.g. for
`<project name="monkeys" path="barrel\-of"/>`, that project definition is
@@ -468,8 +483,8 @@ repo client where the Git working directory for this project should be placed.
This is used to move a project in the checkout by overriding the existing `path`
setting.
.PP
Attribute `groups`: List of additional groups to which this project belongs.
Same syntax as the corresponding element of `project`.
Attribute `groups`: Set of additional groups to which this project belongs. Same
syntax as the corresponding element of `project`.
.PP
Attribute `revision`: If specified, overrides the revision of the original
project. Same syntax as the corresponding element of `project`.
@@ -493,19 +508,21 @@ element of `project`.
.PP
Element annotation
.PP
Zero or more annotation elements may be specified as children of a project or
remote element. Each element describes a name\-value pair. For projects, this
name\-value pair will be exported into each project's environment during a
\&'forall' command, prefixed with `REPO__`. In addition, there is an optional
attribute "keep" which accepts the case insensitive values "true" (default) or
"false". This attribute determines whether or not the annotation will be kept
when exported with the manifest subcommand.
Zero or more annotation elements may be specified as children of a project
element, an extend\-project element, or a remote element. Each element describes
a name\-value pair. For projects, this name\-value pair will be exported into each
project's environment during a 'forall' command, prefixed with `REPO__`. In
addition, there is an optional attribute "keep" which accepts the case
insensitive values "true" (default) or "false". This attribute determines
whether or not the annotation will be kept when exported with the manifest
subcommand.
.PP
Element copyfile
.PP
Zero or more copyfile elements may be specified as children of a project
element. Each element describes a src\-dest pair of files; the "src" file will be
copied to the "dest" place during `repo sync` command.
element, or an extend\-project element. Each element describes a src\-dest pair of
files; the "src" file will be copied to the "dest" place during `repo sync`
command.
.PP
"src" is project relative, "dest" is relative to the top of the tree. Copying
from paths outside of the project or to paths outside of the repo client is not
@@ -516,10 +533,14 @@ Intermediate paths must not be symlinks either.
.PP
Parent directories of "dest" will be automatically created if missing.
.PP
The files are copied in the order they are specified in the manifests. If
multiple elements specify the same source and destination, they will only be
applied as one, based on the first occurrence. Files are copied before any links
specified via linkfile elements are created.
.PP
Element linkfile
.PP
It's just like copyfile and runs at the same time as copyfile but instead of
copying it creates a symlink.
It's just like copyfile, but instead of copying it creates a symlink.
.PP
The symlink is created at "dest" (relative to the top of the tree) and points to
the path specified by "src" which is a path in the project.
@@ -529,6 +550,11 @@ Parent directories of "dest" will be automatically created if missing.
The symlink target may be a file or directory, but it may not point outside of
the repo client.
.PP
The links are created in the order they are specified in the manifests. If
multiple elements specify the same source and destination, they will only be
applied as one, based on the first occurrence. Links are created after any files
specified via copyfile elements are copied.
.PP
Element remove\-project
.PP
Deletes a project from the internal manifest table, possibly allowing a
@@ -620,13 +646,16 @@ repository's root.
"name" may not be an absolute path or use "." or ".." path components. These
restrictions are not enforced for [Local Manifests].
.PP
Attribute `groups`: List of additional groups to which all projects in the
Attribute `groups`: Set of additional groups to which all projects in the
included manifest belong. This appends and recurses, meaning all projects in
included manifests carry all parent include groups. Same syntax as the
included manifests carry all parent include groups. This also applies to all
extend\-project elements in the included manifests. Same syntax as the
corresponding element of `project`.
.PP
Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
default to which all projects in the included manifest belong.
default to which all projects in the included manifest belong. This recurses,
meaning it will apply to all projects in all manifests included as a result of
this element.
.PP
Local Manifests
.PP

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "June 2025" "repo smartsync" "Repo Manual"
.TH REPO "1" "August 2025" "repo smartsync" "Repo Manual"
.SH NAME
repo \- repo smartsync - manual page for repo smartsync
.SH SYNOPSIS
@@ -20,12 +20,11 @@ number of CPU cores)
.TP
\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
number of network jobs to run in parallel (defaults to
\fB\-\-jobs\fR or 1). Ignored when \fB\-\-interleaved\fR is set
\fB\-\-jobs\fR or 1). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
number of local checkout jobs to run in parallel
(defaults to \fB\-\-jobs\fR or 8). Ignored when \fB\-\-interleaved\fR
is set
(defaults to \fB\-\-jobs\fR or 8). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-f\fR, \fB\-\-force\-broken\fR
obsolete option (to be deleted in the future)
@@ -60,7 +59,10 @@ use the existing manifest checkout as\-is. (do not
update to the latest revision)
.TP
\fB\-\-interleaved\fR
fetch and checkout projects in parallel (experimental)
fetch and checkout projects in parallel (default)
.TP
\fB\-\-no\-interleaved\fR
fetch and checkout projects in phases
.TP
\fB\-n\fR, \fB\-\-network\-only\fR
fetch only, don't update working tree
@@ -149,6 +151,16 @@ operate on this manifest and its submanifests
.TP
\fB\-\-no\-repo\-verify\fR
do not verify repo source code
.SS post\-sync hooks:
.TP
\fB\-\-no\-verify\fR
Do not run the post\-sync hook.
.TP
\fB\-\-verify\fR
Run the post\-sync hook without prompting.
.TP
\fB\-\-ignore\-hooks\fR
Do not abort if post\-sync hooks fail.
.PP
Run `repo help smartsync` to view the detailed manual.
.SH DETAILS

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "June 2025" "repo sync" "Repo Manual"
.TH REPO "1" "August 2025" "repo sync" "Repo Manual"
.SH NAME
repo \- repo sync - manual page for repo sync
.SH SYNOPSIS
@@ -20,12 +20,11 @@ number of CPU cores)
.TP
\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
number of network jobs to run in parallel (defaults to
\fB\-\-jobs\fR or 1). Ignored when \fB\-\-interleaved\fR is set
\fB\-\-jobs\fR or 1). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
number of local checkout jobs to run in parallel
(defaults to \fB\-\-jobs\fR or 8). Ignored when \fB\-\-interleaved\fR
is set
(defaults to \fB\-\-jobs\fR or 8). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-f\fR, \fB\-\-force\-broken\fR
obsolete option (to be deleted in the future)
@@ -60,7 +59,10 @@ use the existing manifest checkout as\-is. (do not
update to the latest revision)
.TP
\fB\-\-interleaved\fR
fetch and checkout projects in parallel (experimental)
fetch and checkout projects in parallel (default)
.TP
\fB\-\-no\-interleaved\fR
fetch and checkout projects in phases
.TP
\fB\-n\fR, \fB\-\-network\-only\fR
fetch only, don't update working tree
@@ -156,6 +158,16 @@ operate on this manifest and its submanifests
.TP
\fB\-\-no\-repo\-verify\fR
do not verify repo source code
.SS post\-sync hooks:
.TP
\fB\-\-no\-verify\fR
Do not run the post\-sync hook.
.TP
\fB\-\-verify\fR
Run the post\-sync hook without prompting.
.TP
\fB\-\-ignore\-hooks\fR
Do not abort if post\-sync hooks fail.
.PP
Run `repo help sync` to view the detailed manual.
.SH DETAILS

man/repo-wipe.1 Normal file
View File

@@ -0,0 +1,61 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "November 2025" "repo wipe" "Repo Manual"
.SH NAME
repo \- repo wipe - manual page for repo wipe
.SH SYNOPSIS
.B repo
\fI\,wipe <project>\/\fR...
.SH DESCRIPTION
Summary
.PP
Wipe projects from the worktree
.SH OPTIONS
.TP
\fB\-h\fR, \fB\-\-help\fR
show this help message and exit
.TP
\fB\-f\fR, \fB\-\-force\fR
force wipe shared projects and uncommitted changes
.TP
\fB\-\-force\-uncommitted\fR
force wipe even if there are uncommitted changes
.TP
\fB\-\-force\-shared\fR
force wipe even if the project shares an object
directory
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help wipe` to view the detailed manual.
.SH DETAILS
.PP
The 'repo wipe' command removes the specified projects from the worktree (the
checked out source code) and deletes the project's git data from `.repo`.
.PP
This is a destructive operation and cannot be undone.
.PP
Projects can be specified either by name, or by a relative or absolute path to
the project's local directory.
.SH EXAMPLES
.SS # Wipe the project "platform/build" by name:
$ repo wipe platform/build
.SS # Wipe the project at the path "build/make":
$ repo wipe build/make

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "April 2025" "repo" "Repo Manual"
.TH REPO "1" "November 2025" "repo" "Repo Manual"
.SH NAME
repo \- repository management tool built on top of git
.SH SYNOPSIS
@@ -132,6 +132,9 @@ Upload changes for code review
.TP
version
Display the version of repo
.TP
wipe
Wipe projects from the worktree
.PP
See 'repo help <command>' for more information on a specific command.
Bug reports: https://issues.gerritcodereview.com/issues/new?component=1370071

View File

@@ -255,7 +255,7 @@ class _XmlSubmanifest:
project: a string, the name of the manifest project.
revision: a string, the commitish.
manifestName: a string, the submanifest file name.
groups: a list of strings, the groups to add to all projects in the
groups: a set of strings, the groups to add to all projects in the
submanifest.
default_groups: a list of strings, the default groups to sync.
path: a string, the relative path for the submanifest checkout.
@@ -281,7 +281,7 @@ class _XmlSubmanifest:
self.project = project
self.revision = revision
self.manifestName = manifestName
self.groups = groups
self.groups = groups or set()
self.default_groups = default_groups
self.path = path
self.parent = parent
@@ -304,7 +304,7 @@ class _XmlSubmanifest:
self.repo_client = RepoClient(
parent.repodir,
linkFile,
parent_groups=",".join(groups) or "",
parent_groups=groups,
submanifest_path=os.path.join(parent.path_prefix, self.relpath),
outer_client=outer_client,
default_groups=default_groups,
@@ -345,7 +345,7 @@ class _XmlSubmanifest:
manifestName = self.manifestName or "default.xml"
revision = self.revision or self.name
path = self.path or revision.split("/")[-1]
groups = self.groups or []
groups = self.groups
return SubmanifestSpec(
self.name, manifestUrl, manifestName, revision, path, groups
@@ -359,9 +359,7 @@ class _XmlSubmanifest:
def GetGroupsStr(self):
"""Returns the `groups` given for this submanifest."""
if self.groups:
return ",".join(self.groups)
return ""
return ",".join(sorted(self.groups))
def GetDefaultGroupsStr(self):
"""Returns the `default-groups` given for this submanifest."""
@@ -381,7 +379,7 @@ class SubmanifestSpec:
self.manifestName = manifestName
self.revision = revision
self.path = path
self.groups = groups or []
self.groups = groups
class XmlManifest:
@@ -393,7 +391,7 @@ class XmlManifest:
manifest_file,
local_manifests=None,
outer_client=None,
parent_groups="",
parent_groups=None,
submanifest_path="",
default_groups=None,
):
@@ -409,7 +407,8 @@ class XmlManifest:
manifests. This will usually be
|repodir|/|LOCAL_MANIFESTS_DIR_NAME|.
outer_client: RepoClient of the outer manifest.
parent_groups: a string, the groups to apply to this projects.
parent_groups: a set of strings, the groups to apply to this
manifest.
submanifest_path: The submanifest root relative to the repo root.
default_groups: a string, the default manifest groups to use.
"""
@@ -432,7 +431,7 @@ class XmlManifest:
self.manifestFileOverrides = {}
self.local_manifests = local_manifests
self._load_local_manifests = True
self.parent_groups = parent_groups
self.parent_groups = parent_groups or set()
self.default_groups = default_groups
if submanifest_path and not outer_client:
@@ -567,21 +566,29 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
"""
return [x for x in re.split(r"[,\s]+", field) if x]
def _ParseSet(self, field):
"""Parse fields that contain flattened sets.
These are whitespace & comma separated. Empty elements will be
discarded.
"""
return set(self._ParseList(field))
def ToXml(
self,
peg_rev=False,
peg_rev_upstream=True,
peg_rev_dest_branch=True,
groups=None,
filter_groups=None,
omit_local=False,
):
"""Return the current manifest XML."""
mp = self.manifestProject
if groups is None:
groups = mp.manifest_groups
if groups:
groups = self._ParseList(groups)
if filter_groups is None:
filter_groups = mp.manifest_groups
if filter_groups:
filter_groups = self._ParseList(filter_groups)
doc = xml.dom.minidom.Document()
root = doc.createElement("manifest")
@@ -654,7 +661,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
output_project(parent, parent_node, project)
def output_project(parent, parent_node, p):
if not p.MatchesGroups(groups):
if not p.MatchesGroups(filter_groups):
return
if omit_local and self.IsFromLocalManifest(p):
@@ -725,10 +732,9 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
le.setAttribute("dest", lf.dest)
e.appendChild(le)
default_groups = ["all", "name:%s" % p.name, "path:%s" % p.relpath]
egroups = [g for g in p.groups if g not in default_groups]
if egroups:
e.setAttribute("groups", ",".join(egroups))
groups = p.groups - {"all", f"name:{p.name}", f"path:{p.relpath}"}
if groups:
e.setAttribute("groups", ",".join(sorted(groups)))
for a in p.annotations:
if a.keep == "true":
@@ -1116,7 +1122,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups += f",platform-{platform.system().lower()}"
return groups
def GetGroupsStr(self):
def GetManifestGroupsStr(self):
"""Returns the manifest group string that should be synced."""
return (
self.manifestProject.manifest_groups or self.GetDefaultGroupsStr()
@@ -1171,12 +1177,12 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
b = b[len(R_HEADS) :]
self.branch = b
parent_groups = self.parent_groups
parent_groups = self.parent_groups.copy()
if self.path_prefix:
parent_groups = (
parent_groups |= {
f"{SUBMANIFEST_GROUP_PREFIX}:path:"
f"{self.path_prefix},{parent_groups}"
)
f"{self.path_prefix}"
}
# The manifestFile was specified by the user which is why we
# allow include paths to point anywhere.
@@ -1202,16 +1208,16 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
# Since local manifests are entirely managed by
# the user, allow them to point anywhere the
# user wants.
local_group = (
local_group = {
f"{LOCAL_MANIFEST_GROUP_PREFIX}:"
f"{local_file[:-4]}"
)
}
nodes.append(
self._ParseManifestXml(
local,
self.subdir,
parent_groups=(
f"{local_group},{parent_groups}"
local_group | parent_groups
),
restrict_includes=False,
)
@@ -1262,7 +1268,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self,
path,
include_root,
parent_groups="",
parent_groups=None,
restrict_includes=True,
parent_node=None,
):
@@ -1271,11 +1277,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
Args:
path: The XML file to read & parse.
include_root: The path to interpret include "name"s relative to.
parent_groups: The groups to apply to this projects.
parent_groups: The set of groups to apply to this manifest.
restrict_includes: Whether to constrain the "name" attribute of
includes.
parent_node: The parent include node, to apply attribute to this
projects.
parent_node: The parent include node, to apply attributes to this
manifest.
Returns:
List of XML nodes.
@@ -1299,6 +1305,14 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
nodes = []
for node in manifest.childNodes:
if (
parent_node
and node.nodeName in ("include", "project")
and not node.hasAttribute("revision")
):
node.setAttribute(
"revision", parent_node.getAttribute("revision")
)
if node.nodeName == "include":
name = self._reqatt(node, "name")
if restrict_includes:
@@ -1307,12 +1321,10 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
raise ManifestInvalidPathError(
f'<include> invalid "name": {name}: {msg}'
)
include_groups = ""
if parent_groups:
include_groups = parent_groups
include_groups = (parent_groups or set()).copy()
if node.hasAttribute("groups"):
include_groups = (
node.getAttribute("groups") + "," + include_groups
include_groups |= self._ParseSet(
node.getAttribute("groups")
)
fp = os.path.join(include_root, name)
if not os.path.isfile(fp):
@@ -1335,21 +1347,16 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
f"failed parsing included manifest {name}: {e}"
)
else:
if parent_groups and node.nodeName == "project":
nodeGroups = parent_groups
if node.hasAttribute("groups"):
nodeGroups = (
node.getAttribute("groups") + "," + nodeGroups
)
node.setAttribute("groups", nodeGroups)
if (
parent_node
and node.nodeName == "project"
and not node.hasAttribute("revision")
if parent_groups and node.nodeName in (
"project",
"extend-project",
):
node.setAttribute(
"revision", parent_node.getAttribute("revision")
)
nodeGroups = parent_groups.copy()
if node.hasAttribute("groups"):
nodeGroups |= self._ParseSet(
node.getAttribute("groups")
)
node.setAttribute("groups", ",".join(sorted(nodeGroups)))
nodes.append(node)
return nodes
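The hunks above migrate manifest group handling from comma-joined strings to Python sets. A minimal sketch of the approach, using a hypothetical `parse_set` helper standing in for the real `_ParseSet`:

```python
def parse_set(value):
    """Split a comma-separated groups attribute into a set, dropping blanks."""
    return {x.strip() for x in value.split(",") if x.strip()}

# Merging parent and node groups becomes a set union instead of string concat.
parent_groups = {"all", "name:platform/build"}
node_groups = parse_set("tools, ,notdefault")
merged = parent_groups | node_groups

# Serializing back to an XML attribute sorts for deterministic output.
attr = ",".join(sorted(merged))
```

Sets make membership checks and deduplication O(1) and avoid the ordering and duplicate-entry bugs of repeated string concatenation.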
@@ -1458,7 +1465,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
dest_path = node.getAttribute("dest-path")
groups = node.getAttribute("groups")
if groups:
groups = self._ParseList(groups)
groups = self._ParseSet(groups or "")
revision = node.getAttribute("revision")
remote_name = node.getAttribute("remote")
if not remote_name:
@@ -1479,7 +1486,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if path and p.relpath != path:
continue
if groups:
p.groups.extend(groups)
p.groups |= groups
if revision:
if base_revision:
if p.revisionExpr != base_revision:
@@ -1509,6 +1516,14 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
p.UpdatePaths(relpath, worktree, gitdir, objdir)
self._paths[p.relpath] = p
for n in node.childNodes:
if n.nodeName == "copyfile":
self._ParseCopyFile(p, n)
elif n.nodeName == "linkfile":
self._ParseLinkFile(p, n)
elif n.nodeName == "annotation":
self._ParseAnnotation(p, n)
if node.nodeName == "repo-hooks":
# Only one project can be the hooks project
if repo_hooks_project is not None:
@@ -1802,7 +1817,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups = ""
if node.hasAttribute("groups"):
groups = node.getAttribute("groups")
groups = self._ParseList(groups)
groups = self._ParseSet(groups)
default_groups = self._ParseList(node.getAttribute("default-groups"))
path = node.getAttribute("path")
if path == "":
@@ -1911,11 +1926,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
upstream = node.getAttribute("upstream") or self._default.upstreamExpr
groups = ""
if node.hasAttribute("groups"):
groups = node.getAttribute("groups")
groups = self._ParseList(groups)
if parent is None:
(
relpath,
@@ -1930,8 +1940,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
parent, name, path
)
default_groups = ["all", "name:%s" % name, "path:%s" % relpath]
groups.extend(set(default_groups).difference(groups))
groups = ""
if node.hasAttribute("groups"):
groups = node.getAttribute("groups")
groups = self._ParseSet(groups)
groups |= {"all", f"name:{name}", f"path:{relpath}"}
if self.IsMirror and node.hasAttribute("force-path"):
if XmlBool(node, "force-path", False):
@@ -1963,11 +1976,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
for n in node.childNodes:
if n.nodeName == "copyfile":
self._ParseCopyFile(project, n)
if n.nodeName == "linkfile":
elif n.nodeName == "linkfile":
self._ParseLinkFile(project, n)
if n.nodeName == "annotation":
elif n.nodeName == "annotation":
self._ParseAnnotation(project, n)
if n.nodeName == "project":
elif n.nodeName == "project":
project.subprojects.append(
self._ParseProject(n, parent=project)
)


@@ -25,7 +25,10 @@ except ImportError:
from repo_trace import IsTraceToStderr
_TTY = sys.stderr.isatty()
# Capture the original stderr stream. We use this exclusively for progress
# updates to ensure we talk to the terminal even if stderr is redirected.
_STDERR = sys.stderr
_TTY = _STDERR.isatty()
# This will erase all content in the current line (wherever the cursor is).
# It does not move the cursor, so this is usually followed by \r to move to
@@ -133,11 +136,11 @@ class Progress:
def _write(self, s):
s = "\r" + s
if self._elide:
col = os.get_terminal_size(sys.stderr.fileno()).columns
col = os.get_terminal_size(_STDERR.fileno()).columns
if len(s) > col:
s = s[: col - 1] + ".."
sys.stderr.write(s)
sys.stderr.flush()
_STDERR.write(s)
_STDERR.flush()
def start(self, name):
self._active += 1
@@ -211,9 +214,9 @@ class Progress:
# Erase the current line, print the message with a newline,
# and then immediately redraw the progress bar on the new line.
sys.stderr.write("\r" + CSI_ERASE_LINE)
sys.stderr.write(msg + "\n")
sys.stderr.flush()
_STDERR.write("\r" + CSI_ERASE_LINE)
_STDERR.write(msg + "\n")
_STDERR.flush()
self.update(inc=0)
def end(self):


@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import datetime
import errno
import filecmp
import glob
@@ -390,22 +391,17 @@ def _SafeExpandPath(base, subpath, skipfinal=False):
return path
class _CopyFile:
class _CopyFile(NamedTuple):
"""Container for <copyfile> manifest element."""
def __init__(self, git_worktree, src, topdir, dest):
"""Register a <copyfile> request.
Args:
git_worktree: Absolute path to the git project checkout.
src: Relative path under |git_worktree| of file to read.
topdir: Absolute path to the top of the repo client checkout.
dest: Relative path under |topdir| of file to write.
"""
self.git_worktree = git_worktree
self.topdir = topdir
self.src = src
self.dest = dest
# Absolute path to the git project checkout.
git_worktree: str
# Relative path under |git_worktree| of file to read.
src: str
# Absolute path to the top of the repo client checkout.
topdir: str
# Relative path under |topdir| of file to write.
dest: str
def _Copy(self):
src = _SafeExpandPath(self.git_worktree, self.src)
@@ -439,22 +435,17 @@ class _CopyFile:
logger.error("error: Cannot copy file %s to %s", src, dest)
class _LinkFile:
class _LinkFile(NamedTuple):
"""Container for <linkfile> manifest element."""
def __init__(self, git_worktree, src, topdir, dest):
"""Register a <linkfile> request.
Args:
git_worktree: Absolute path to the git project checkout.
src: Target of symlink relative to path under |git_worktree|.
topdir: Absolute path to the top of the repo client checkout.
dest: Relative path under |topdir| of symlink to create.
"""
self.git_worktree = git_worktree
self.topdir = topdir
self.src = src
self.dest = dest
# Absolute path to the git project checkout.
git_worktree: str
# Target of symlink relative to path under |git_worktree|.
src: str
# Absolute path to the top of the repo client checkout.
topdir: str
# Relative path under |topdir| of symlink to create.
dest: str
def __linkIt(self, relSrc, absDest):
# Link file if it does not exist or is out of date.
@@ -471,9 +462,7 @@ class _LinkFile:
os.makedirs(dest_dir)
platform_utils.symlink(relSrc, absDest)
except OSError:
logger.error(
"error: Cannot link file %s to %s", relSrc, absDest
)
logger.error("error: Cannot symlink %s to %s", absDest, relSrc)
def _Link(self):
"""Link the self.src & self.dest paths.
@@ -564,7 +553,7 @@ class Project:
revisionExpr,
revisionId,
rebase=True,
groups=None,
groups=set(),
sync_c=False,
sync_s=False,
sync_tags=True,
@@ -633,8 +622,9 @@ class Project:
self.subprojects = []
self.snapshots = {}
self.copyfiles = []
self.linkfiles = []
# Use dicts to dedupe while maintaining declared order.
self.copyfiles = {}
self.linkfiles = {}
self.annotations = []
self.dest_branch = dest_branch
@@ -642,10 +632,6 @@ class Project:
# project containing repo hooks.
self.enabled_repo_hooks = []
# This will be updated later if the project has submodules and
# if they will be synced.
self.has_subprojects = False
def RelPath(self, local=True):
"""Return the path for the project relative to a manifest.
@@ -852,9 +838,9 @@ class Project:
"""
default_groups = self.manifest.default_groups or ["default"]
expanded_manifest_groups = manifest_groups or default_groups
expanded_project_groups = ["all"] + (self.groups or [])
expanded_project_groups = {"all"} | self.groups
if "notdefault" not in expanded_project_groups:
expanded_project_groups += ["default"]
expanded_project_groups |= {"default"}
matched = False
for group in expanded_manifest_groups:
@@ -1539,18 +1525,14 @@ class Project:
force_checkout=False,
force_rebase=False,
submodules=False,
errors=None,
verbose=False,
):
"""Perform only the local IO portion of the sync process.
Network access is not required.
"""
if errors is None:
errors = []
def fail(error: Exception):
errors.append(error)
syncbuf.fail(self, error)
if not os.path.exists(self.gitdir):
@@ -1567,8 +1549,8 @@ class Project:
# TODO(https://git-scm.com/docs/git-worktree#_bugs): Re-evaluate if
# submodules can be init when using worktrees once its support is
# complete.
if self.has_subprojects and not self.use_git_worktrees:
self._InitSubmodules()
if self.parent and not self.use_git_worktrees:
self._InitSubmodule()
all_refs = self.bare_ref.all
self.CleanPublishedCache(all_refs)
revid = self.GetRevisionId(all_refs)
@@ -1597,6 +1579,9 @@ class Project:
self._FastForward(revid)
self._CopyAndLinkFiles()
def _dorebase():
self._Rebase(upstream="@{upstream}")
def _dosubmodules():
self._SyncSubmodules(quiet=True)
@@ -1688,19 +1673,24 @@ class Project:
if pub:
not_merged = self._revlist(not_rev(revid), pub)
if not_merged:
if upstream_gain and not force_rebase:
# The user has published this branch and some of those
# commits are not yet merged upstream. We do not want
# to rewrite the published commits so we punt.
fail(
LocalSyncFail(
"branch %s is published (but not merged) and is "
"now %d commits behind. Fix this manually or rerun "
"with the --rebase option to force a rebase."
% (branch.name, len(upstream_gain)),
project=self.name,
if upstream_gain:
if force_rebase:
# Try to rebase local published but not merged changes
# on top of the upstream changes.
syncbuf.later1(self, _dorebase, not verbose)
else:
# The user has published this branch and some of those
# commits are not yet merged upstream. We do not want
# to rewrite the published commits so we punt.
fail(
LocalSyncFail(
"branch %s is published (but not merged) and "
"is now %d commits behind. Fix this manually "
"or rerun with the --rebase option to force a "
"rebase." % (branch.name, len(upstream_gain)),
project=self.name,
)
)
)
return
syncbuf.later1(self, _doff, not verbose)
return
@@ -1794,7 +1784,7 @@ class Project:
Paths should have basic validation run on them before being queued.
Further checking will be handled when the actual copy happens.
"""
self.copyfiles.append(_CopyFile(self.worktree, src, topdir, dest))
self.copyfiles[_CopyFile(self.worktree, src, topdir, dest)] = True
def AddLinkFile(self, src, dest, topdir):
"""Mark |dest| to create a symlink (relative to |topdir|) pointing to
@@ -1805,7 +1795,7 @@ class Project:
Paths should have basic validation run on them before being queued.
Further checking will be handled when the actual link happens.
"""
self.linkfiles.append(_LinkFile(self.worktree, src, topdir, dest))
self.linkfiles[_LinkFile(self.worktree, src, topdir, dest)] = True
def AddAnnotation(self, name, value, keep):
self.annotations.append(Annotation(name, value, keep))
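Converting `_CopyFile`/`_LinkFile` to `NamedTuple` is what makes the dict-based dedupe above work: named tuples are hashable and compare by value. A minimal sketch with a hypothetical `CopyFile`:

```python
from typing import NamedTuple

class CopyFile(NamedTuple):
    git_worktree: str
    src: str
    topdir: str
    dest: str

# Python dicts preserve insertion order, so a dict keyed by the tuple
# dedupes repeated <copyfile> entries while keeping declared order.
copyfiles = {}
copyfiles[CopyFile("/wt", "a.txt", "/top", "a.txt")] = True
copyfiles[CopyFile("/wt", "a.txt", "/top", "a.txt")] = True  # duplicate, ignored
copyfiles[CopyFile("/wt", "b.txt", "/top", "b.txt")] = True
```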
@@ -2363,8 +2353,6 @@ class Project:
)
result.append(subproject)
result.extend(subproject.GetDerivedSubprojects())
if result:
self.has_subprojects = True
return result
def EnableRepositoryExtension(self, key, value="true", version=1):
@@ -2415,7 +2403,9 @@ class Project:
# throws an error.
revs = [f"{self.revisionExpr}^0"]
upstream_rev = None
if self.upstream:
# Only check upstream when using superproject.
if self.upstream and self.manifest.manifestProject.use_superproject:
upstream_rev = self.GetRemote().ToLocal(self.upstream)
revs.append(upstream_rev)
@@ -2427,7 +2417,9 @@ class Project:
log_as_error=False,
)
if self.upstream:
# Only verify upstream relationship for superproject scenarios
# without affecting plain usage.
if self.upstream and self.manifest.manifestProject.use_superproject:
self.bare_git.merge_base(
"--is-ancestor",
self.revisionExpr,
@@ -2580,6 +2572,16 @@ class Project:
if os.path.exists(os.path.join(self.gitdir, "shallow")):
cmd.append("--depth=2147483647")
# Use clone-depth="1" as a heuristic for repositories containing
# large binaries and disable auto GC to prevent potential hangs.
# Check the configured depth because the `depth` argument might be None
# if REPO_ALLOW_SHALLOW=0 converted it to a partial clone.
effective_depth = (
self.clone_depth or self.manifest.manifestProject.depth
)
if effective_depth == 1 and git_require((2, 23, 0)):
cmd.append("--no-auto-gc")
if not verbose:
cmd.append("--quiet")
if not quiet and sys.stdout.isatty():
@@ -3030,16 +3032,39 @@ class Project:
project=self.name,
)
def _InitSubmodules(self, quiet=True):
"""Initialize the submodules for the project."""
def _InitSubmodule(self, quiet=True):
"""Initialize the submodule."""
cmd = ["submodule", "init"]
if quiet:
cmd.append("-q")
if GitCommand(self, cmd).Wait() != 0:
raise GitError(
f"{self.name} submodule init",
project=self.name,
cmd.extend(["--", self.worktree])
max_retries = 3
base_delay_secs = 1
jitter_ratio = 1 / 3
for attempt in range(max_retries):
git_cmd = GitCommand(
None,
cmd,
cwd=self.parent.worktree,
capture_stdout=True,
capture_stderr=True,
)
if git_cmd.Wait() == 0:
return
error = git_cmd.stderr or git_cmd.stdout
if "lock" in error:
delay = base_delay_secs * (2**attempt)
delay += random.uniform(0, delay * jitter_ratio)
logger.warning(
f"Attempt {attempt+1}/{max_retries}: "
+ f"git {' '.join(cmd)} failed."
+ f" Error: {error}."
+ f" Sleeping {delay:.2f}s before retrying."
)
time.sleep(delay)
else:
break
git_cmd.VerifyCommand()
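`_InitSubmodule` above retries `git submodule init` with exponential backoff plus jitter when the error looks like lock contention. The retry shape can be sketched in isolation (`run` is a hypothetical callable returning `(ok, stderr)`):

```python
import random
import time

def retry_with_backoff(run, max_retries=3, base_delay_secs=1, jitter_ratio=1 / 3):
    """Retry `run` on lock contention; give up immediately on other errors."""
    for attempt in range(max_retries):
        ok, error = run()
        if ok:
            return True
        if "lock" not in error:
            break  # not contention; retrying won't help
        # Exponential backoff with jitter so concurrent workers spread out
        # instead of colliding on the same lock again.
        delay = base_delay_secs * (2 ** attempt)
        delay += random.uniform(0, delay * jitter_ratio)
        time.sleep(delay)
    return False
```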
def _Rebase(self, upstream, onto=None):
cmd = ["rebase"]
@@ -3059,8 +3084,13 @@ class Project:
raise GitError(f"{self.name} merge {head} ", project=self.name)
def _InitGitDir(self, mirror_git=None, force_sync=False, quiet=False):
# Prefix for temporary directories created during gitdir initialization.
TMP_GITDIR_PREFIX = ".tmp-project-initgitdir-"
init_git_dir = not os.path.exists(self.gitdir)
init_obj_dir = not os.path.exists(self.objdir)
tmp_gitdir = None
curr_gitdir = self.gitdir
curr_config = self.config
try:
# Initialize the bare repository, which contains all of the objects.
if init_obj_dir:
@@ -3080,27 +3110,33 @@ class Project:
# well.
if self.objdir != self.gitdir:
if init_git_dir:
os.makedirs(self.gitdir)
os.makedirs(os.path.dirname(self.gitdir), exist_ok=True)
tmp_gitdir = tempfile.mkdtemp(
prefix=TMP_GITDIR_PREFIX,
dir=os.path.dirname(self.gitdir),
)
curr_config = GitConfig.ForRepository(
gitdir=tmp_gitdir, defaults=self.manifest.globalConfig
)
curr_gitdir = tmp_gitdir
if init_obj_dir or init_git_dir:
self._ReferenceGitDir(
self.objdir, self.gitdir, copy_all=True
self.objdir, curr_gitdir, copy_all=True
)
try:
self._CheckDirReference(self.objdir, self.gitdir)
self._CheckDirReference(self.objdir, curr_gitdir)
except GitError as e:
if force_sync:
logger.error(
"Retrying clone after deleting %s", self.gitdir
)
try:
platform_utils.rmtree(os.path.realpath(self.gitdir))
if self.worktree and os.path.exists(
os.path.realpath(self.worktree)
):
platform_utils.rmtree(
os.path.realpath(self.worktree)
)
rm_dirs = (
tmp_gitdir,
self.gitdir,
self.worktree,
)
for d in rm_dirs:
if d and os.path.exists(d):
platform_utils.rmtree(os.path.realpath(d))
return self._InitGitDir(
mirror_git=mirror_git,
force_sync=False,
@@ -3151,18 +3187,21 @@ class Project:
m = self.manifest.manifestProject.config
for key in ["user.name", "user.email"]:
if m.Has(key, include_defaults=False):
self.config.SetString(key, m.GetString(key))
curr_config.SetString(key, m.GetString(key))
if not self.manifest.EnableGitLfs:
self.config.SetString(
curr_config.SetString(
"filter.lfs.smudge", "git-lfs smudge --skip -- %f"
)
self.config.SetString(
curr_config.SetString(
"filter.lfs.process", "git-lfs filter-process --skip"
)
self.config.SetBoolean(
curr_config.SetBoolean(
"core.bare", True if self.manifest.IsMirror else None
)
if tmp_gitdir:
platform_utils.rename(tmp_gitdir, self.gitdir)
tmp_gitdir = None
if not init_obj_dir:
# The project might be shared (obj_dir already initialized), but
# such information is not available here. Instead of passing it,
@@ -3179,6 +3218,27 @@ class Project:
if init_git_dir and os.path.exists(self.gitdir):
platform_utils.rmtree(self.gitdir)
raise
finally:
# Clean up the temporary directory created during the process,
# as well as any stale ones left over from previous attempts.
if tmp_gitdir and os.path.exists(tmp_gitdir):
platform_utils.rmtree(tmp_gitdir)
age_threshold = datetime.timedelta(days=1)
now = datetime.datetime.now()
for tmp_dir in glob.glob(
os.path.join(
os.path.dirname(self.gitdir), f"{TMP_GITDIR_PREFIX}*"
)
):
try:
mtime = datetime.datetime.fromtimestamp(
os.path.getmtime(tmp_dir)
)
if now - mtime > age_threshold:
platform_utils.rmtree(tmp_dir)
except OSError:
pass
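The `finally` block above sweeps both this run's temp dir and stale ones left behind by crashed earlier runs, using an mtime age threshold. A standalone sketch (helper name is hypothetical; `shutil.rmtree` stands in for `platform_utils.rmtree`):

```python
import datetime
import glob
import os
import shutil

TMP_PREFIX = ".tmp-project-initgitdir-"

def sweep_stale_tmpdirs(parent, max_age=datetime.timedelta(days=1)):
    """Remove leftover temp dirs older than max_age; tolerate races."""
    now = datetime.datetime.now()
    for d in glob.glob(os.path.join(parent, f"{TMP_PREFIX}*")):
        try:
            mtime = datetime.datetime.fromtimestamp(os.path.getmtime(d))
            if now - mtime > max_age:
                shutil.rmtree(d)
        except OSError:
            pass  # already gone, or in use by a concurrent init
```

The age threshold keeps the sweep from deleting a temp dir that a concurrent `repo sync` is still populating.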
def _UpdateHooks(self, quiet=False):
if os.path.exists(self.objdir):
@@ -3256,6 +3316,15 @@ class Project:
remote.ResetFetch(mirror=True)
remote.Save()
# Disable auto-gc for depth=1 to prevent hangs during lazy fetches
# inside git checkout for partial clones.
effective_depth = (
self.clone_depth or self.manifest.manifestProject.depth
)
if effective_depth == 1:
self.config.SetBoolean("maintenance.auto", False)
self.config.SetInt("gc.auto", 0)
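The same `clone_depth or manifest depth` fallback appears in both fetch setup and `_InitRemote` above. Its precedence, including the deliberate fall-through when depth is 0 (meaning "full clone"), can be sketched:

```python
def effective_depth(clone_depth, manifest_depth):
    """Project-level clone-depth wins; otherwise use the manifest depth.

    A falsy clone_depth (None, or 0 meaning "full clone") falls through
    to the manifest project's depth because of the `or`.
    """
    return clone_depth or manifest_depth

def should_disable_auto_gc(clone_depth, manifest_depth):
    # depth=1 is the heuristic for repos with large binaries, where implicit
    # fetches during checkout must not trigger `git gc` / `git maintenance`.
    return effective_depth(clone_depth, manifest_depth) == 1
```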
def _InitMRef(self):
"""Initialize the pseudo m/<manifest branch> ref."""
if self.manifest.branch:
@@ -3835,10 +3904,35 @@ class Project:
def GetHead(self):
"""Return the ref that HEAD points to."""
try:
return self.rev_parse("--symbolic-full-name", HEAD)
symbolic_head = self.rev_parse("--symbolic-full-name", HEAD)
if symbolic_head == HEAD:
# Detached HEAD. Return the commit SHA instead.
return self.rev_parse(HEAD)
return symbolic_head
except GitError as e:
logger.warning(
"project %s: unparseable HEAD; trying to recover.\n"
"Check that HEAD ref in .git/HEAD is valid. The error "
"was: %s",
self._project.RelPath(local=False),
e,
)
# Fallback to direct file reading for compatibility with broken
# repos, e.g. if HEAD points to an unborn branch.
path = self.GetDotgitPath(subpath=HEAD)
raise NoManifestException(path, str(e))
try:
with open(path) as fd:
line = fd.readline()
except OSError:
raise NoManifestException(path, str(e))
try:
line = line.decode()
except AttributeError:
pass
if line.startswith("ref: "):
return line[5:-1]
return line[:-1]
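The `GetHead` fallback above parses `.git/HEAD` directly when `rev-parse` fails. The file format is simple: `ref: <refname>` for a symbolic ref, or a bare commit SHA for detached HEAD. A minimal reader (hypothetical helper):

```python
def read_head(path):
    """Return the symref target from a HEAD file, or the raw commit SHA."""
    with open(path) as fd:
        line = fd.readline()
    if line.startswith("ref: "):
        return line[len("ref: "):].rstrip("\n")
    return line.rstrip("\n")
```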
def SetHead(self, ref, message=None):
cmdv = []
@@ -4002,7 +4096,8 @@ class _Later:
if not self.quiet:
out.nl()
return True
except GitError:
except GitError as e:
syncbuf.fail(self.project, e)
out.nl()
return False
@@ -4018,7 +4113,12 @@ class _SyncColoring(Coloring):
class SyncBuffer:
def __init__(self, config, detach_head=False):
self._messages = []
self._failures = []
# Failures that have not yet been printed. Cleared after printing.
self._pending_failures = []
# A persistent record of all failures during the buffer's lifetime.
self._all_failures = []
self._later_queue1 = []
self._later_queue2 = []
@@ -4033,7 +4133,9 @@ class SyncBuffer:
self._messages.append(_InfoMessage(project, fmt % args))
def fail(self, project, err=None):
self._failures.append(_Failure(project, err))
failure = _Failure(project, err)
self._pending_failures.append(failure)
self._all_failures.append(failure)
self._MarkUnclean()
def later1(self, project, what, quiet):
@@ -4053,6 +4155,11 @@ class SyncBuffer:
self.recent_clean = True
return recent_clean
@property
def errors(self):
"""Returns a list of all exceptions accumulated in the buffer."""
return [f.why for f in self._all_failures if f.why]
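`SyncBuffer` now keeps two failure lists: one cleared each time messages are printed, and one persistent record backing the new `errors` property. The split can be sketched with a hypothetical `FailureBuffer`:

```python
class FailureBuffer:
    """Track failures both for incremental printing and final reporting."""

    def __init__(self):
        self._pending = []  # cleared every time messages are printed
        self._all = []      # persistent record for callers of .errors

    def fail(self, err):
        self._pending.append(err)
        self._all.append(err)

    def flush(self):
        """Return failures not yet printed, then clear the pending list."""
        pending, self._pending = self._pending, []
        return pending

    @property
    def errors(self):
        return list(self._all)
```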
def _MarkUnclean(self):
self.clean = False
self.recent_clean = False
@@ -4071,18 +4178,18 @@ class SyncBuffer:
return True
def _PrintMessages(self):
if self._messages or self._failures:
if self._messages or self._pending_failures:
if os.isatty(2):
self.out.write(progress.CSI_ERASE_LINE)
self.out.write("\r")
for m in self._messages:
m.Print(self)
for m in self._failures:
for m in self._pending_failures:
m.Print(self)
self._messages = []
self._failures = []
self._pending_failures = []
class MetaProject(Project):


@@ -1,4 +1,4 @@
# Copyright 2023 The Android Open Source Project
# Copyright (C) 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -14,7 +14,6 @@
[tool.black]
line-length = 80
# NB: Keep in sync with tox.ini.
target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'
[tool.pytest.ini_options]

release/check-metadata.py (new executable file, 155 lines)

@@ -0,0 +1,155 @@
#!/usr/bin/env python3
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool to check various metadata (e.g. licensing) in source files."""
import argparse
from pathlib import Path
import re
import sys
import util
_FILE_HEADER_RE = re.compile(
r"""# Copyright \(C\) 20[0-9]{2} The Android Open Source Project
#
# Licensed under the Apache License, Version 2\.0 \(the "License"\);
# you may not use this file except in compliance with the License\.
# You may obtain a copy of the License at
#
# http://www\.apache\.org/licenses/LICENSE-2\.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied\.
# See the License for the specific language governing permissions and
# limitations under the License\.
"""
)
def check_license(path: Path, lines: list[str]) -> bool:
"""Check license header."""
# Enforce licensing on configs & scripts.
if not (
path.suffix in (".bash", ".cfg", ".ini", ".py", ".toml")
or lines[0] in ("#!/bin/bash", "#!/bin/sh", "#!/usr/bin/env python3")
):
return True
# Extract the file header.
header_lines = []
for line in lines:
if line.startswith("#"):
header_lines.append(line)
else:
break
if not header_lines:
print(
f"error: {path.relative_to(util.TOPDIR)}: "
"missing file header (copyright+licensing)",
file=sys.stderr,
)
return False
# Skip the shebang.
if header_lines[0].startswith("#!"):
header_lines.pop(0)
# If this file is imported into the tree, then leave it be.
if header_lines[0] == "# DO NOT EDIT THIS FILE":
return True
header = "".join(f"{x}\n" for x in header_lines)
if not _FILE_HEADER_RE.match(header):
print(
f"error: {path.relative_to(util.TOPDIR)}: "
"file header incorrectly formatted",
file=sys.stderr,
)
print(
"".join(f"> {x}\n" for x in header_lines), end="", file=sys.stderr
)
return False
return True
def check_path(opts: argparse.Namespace, path: Path) -> bool:
"""Check a single path."""
try:
data = path.read_text(encoding="utf-8")
except FileNotFoundError:
return True
lines = data.splitlines()
# NB: Use list comprehension and not a generator so we run all the checks.
return all(
[
check_license(path, lines),
]
)
def check_paths(opts: argparse.Namespace, paths: list[Path]) -> bool:
"""Check all the paths."""
# NB: Use list comprehension and not a generator so we check all paths.
return all([check_path(opts, x) for x in paths])
def find_files(opts: argparse.Namespace) -> list[Path]:
"""Find all the files in the source tree."""
result = util.run(
opts,
["git", "ls-tree", "-r", "-z", "--name-only", "HEAD"],
cwd=util.TOPDIR,
capture_output=True,
encoding="utf-8",
)
return [util.TOPDIR / x for x in result.stdout.split("\0")[:-1]]
def get_parser() -> argparse.ArgumentParser:
"""Get a CLI parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument(
"-n",
"--dry-run",
dest="dryrun",
action="store_true",
help="show everything that would be done",
)
parser.add_argument(
"paths",
nargs="*",
help="the paths to scan",
)
return parser
def main(argv: list[str]) -> int:
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
paths = opts.paths
if not opts.paths:
paths = find_files(opts)
return 0 if check_paths(opts, paths) else 1
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))


@@ -27,6 +27,9 @@ import sys
import util
assert sys.version_info >= (3, 9), "Release framework requires Python 3.9+"
def sign(opts):
"""Sign the launcher!"""
output = ""


@@ -30,6 +30,9 @@ import sys
import util
assert sys.version_info >= (3, 9), "Release framework requires Python 3.9+"
# We currently sign with the old DSA key as it's been around the longest.
# We should transition to RSA by Jun 2020, and ECC by Jun 2021.
KEYID = util.KEYID_DSA


@@ -24,7 +24,7 @@ from typing import List, Optional
import urllib.request
assert sys.version_info >= (3, 8), "Python 3.8+ required"
assert sys.version_info >= (3, 9), "Release framework requires Python 3.9+"
TOPDIR = Path(__file__).resolve().parent.parent


@@ -30,6 +30,10 @@ import tempfile
from typing import List
# NB: This script is currently imported by tests/ to unittest some logic.
assert sys.version_info >= (3, 6), "Python 3.6+ required"
THIS_FILE = Path(__file__).resolve()
TOPDIR = THIS_FILE.parent.parent
MANDIR = TOPDIR.joinpath("man")


@@ -14,7 +14,7 @@
"""Random utility code for release tools."""
import os
from pathlib import Path
import re
import shlex
import subprocess
@@ -24,8 +24,9 @@ import sys
assert sys.version_info >= (3, 6), "This module requires Python 3.6+"
TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser("~")
THIS_FILE = Path(__file__).resolve()
TOPDIR = THIS_FILE.parent.parent
HOMEDIR = Path("~").expanduser()
# These are the release keys we sign with.
@@ -54,7 +55,7 @@ def run(opts, cmd, check=True, **kwargs):
def import_release_key(opts):
"""Import the public key of the official release repo signing key."""
# Extract the key from our repo launcher.
launcher = getattr(opts, "launcher", os.path.join(TOPDIR, "repo"))
launcher = getattr(opts, "launcher", TOPDIR / "repo")
print(f'Importing keys from "{launcher}" launcher script')
with open(launcher, encoding="utf-8") as fp:
data = fp.read()

repo (5 changed lines)

@@ -1,5 +1,4 @@
#!/usr/bin/env python3
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -130,7 +129,7 @@ if not REPO_REV:
BUG_URL = "https://issues.gerritcodereview.com/issues/new?component=1370071"
# increment this whenever we make important changes to this script
VERSION = (2, 54)
VERSION = (2, 61)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (2, 3)
@@ -326,7 +325,7 @@ def InitParser(parser):
group.add_option(
"--manifest-depth",
type="int",
default=0,
default=1,
metavar="DEPTH",
help="create a shallow clone of the manifest repo with "
"given depth (0 for full clone); see git clone "


@@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -17,26 +17,37 @@
import functools
import os
import shlex
import shutil
import subprocess
import sys
from typing import List
# NB: While tests/* support Python >=3.6 to match requirements.json for `repo`,
# the higher level runner logic does not need to be held back.
assert sys.version_info >= (3, 9), "Test/release framework requires Python 3.9+"
ROOT_DIR = os.path.dirname(os.path.realpath(__file__))
def log_cmd(cmd: str, argv: list[str]) -> None:
"""Log a debug message to make history easier to track."""
print("+", cmd, shlex.join(argv), file=sys.stderr)
@functools.lru_cache()
def is_ci() -> bool:
"""Whether we're running in our CI system."""
return os.getenv("LUCI_CQ") == "yes"
def run_pytest(argv: List[str]) -> int:
def run_pytest(argv: list[str]) -> int:
"""Returns the exit code from pytest."""
if is_ci():
argv = ["-m", "not skip_cq"] + argv
log_cmd("pytest", argv)
return subprocess.run(
[sys.executable, "-m", "pytest"] + argv,
check=False,
@@ -44,11 +55,12 @@ def run_pytest(argv: List[str]) -> int:
).returncode
def run_pytest_py38(argv: List[str]) -> int:
def run_pytest_py38(argv: list[str]) -> int:
"""Returns the exit code from pytest under Python 3.8."""
if is_ci():
argv = ["-m", "not skip_cq"] + argv
log_cmd("[vpython 3.8] pytest", argv)
try:
return subprocess.run(
[
@@ -77,8 +89,10 @@ def run_black():
"release/update-hooks",
"release/update-manpages",
]
argv = ["--diff", "--check", ROOT_DIR] + extra_programs
log_cmd("black", argv)
return subprocess.run(
[sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
[sys.executable, "-m", "black"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -86,8 +100,10 @@ def run_black():
def run_flake8():
"""Returns the exit code from flake8."""
argv = [ROOT_DIR]
log_cmd("flake8", argv)
return subprocess.run(
[sys.executable, "-m", "flake8", ROOT_DIR],
[sys.executable, "-m", "flake8"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -95,8 +111,21 @@ def run_flake8():
def run_isort():
"""Returns the exit code from isort."""
argv = ["--check", ROOT_DIR]
log_cmd("isort", argv)
return subprocess.run(
[sys.executable, "-m", "isort", "--check", ROOT_DIR],
[sys.executable, "-m", "isort"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
def run_check_metadata():
"""Returns the exit code from check-metadata."""
argv = []
log_cmd("release/check-metadata.py", argv)
return subprocess.run(
[sys.executable, "release/check-metadata.py"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -109,8 +138,10 @@ def run_update_manpages() -> int:
print("update-manpages: help2man not found; skipping test")
return 0
argv = ["--check"]
log_cmd("release/update-manpages", argv)
return subprocess.run(
[sys.executable, "release/update-manpages", "--check"],
[sys.executable, "release/update-manpages"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -124,6 +155,7 @@ def main(argv):
run_black,
run_flake8,
run_isort,
run_check_metadata,
run_update_manpages,
)
# Run all the tests all the time to get full feedback. Don't exit on the


@@ -1,7 +1,7 @@
#!/usr/bin/env python3
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the 'License");
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#


@@ -133,7 +133,7 @@ without iterating through the remaining projects.
@staticmethod
def _cmd_option(option, _opt_str, _value, parser):
setattr(parser.values, option.dest or "command", list(parser.rargs))
setattr(parser.values, option.dest, list(parser.rargs))
while parser.rargs:
del parser.rargs[0]
@@ -161,6 +161,7 @@ without iterating through the remaining projects.
p.add_option(
"-c",
"--command",
dest="command",
help="command (and arguments) to execute",
action="callback",
callback=self._cmd_option,
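The `_cmd_option` callback above stores everything after `-c`/`--command` as the command and drains `parser.rargs` so optparse stops parsing. A standalone sketch of that optparse idiom (names here are illustrative, not from forall.py):

```python
import optparse


def cmd_option(option, _opt_str, _value, parser):
    # Capture all remaining arguments as the command, then drain
    # parser.rargs so optparse does not interpret them as options.
    setattr(parser.values, option.dest, list(parser.rargs))
    del parser.rargs[:]


parser = optparse.OptionParser()
parser.add_option(
    "-c",
    "--command",
    dest="command",
    action="callback",
    callback=cmd_option,
)
opts, args = parser.parse_args(["-c", "git", "status", "--short"])
print(opts.command)  # ['git', 'status', '--short']
```

This is why the diff adds an explicit `dest="command"`: with a callback action, optparse does not always derive a usable `dest`, and `option.dest or "command"` papered over that.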


@@ -88,7 +88,7 @@ class Info(PagedCommand):
self.manifest = self.manifest.outer_client
manifestConfig = self.manifest.manifestProject.config
mergeBranch = manifestConfig.GetBranch("default").merge
manifestGroups = self.manifest.GetGroupsStr()
manifestGroups = self.manifest.GetManifestGroupsStr()
self.heading("Manifest branch: ")
if self.manifest.default.revisionExpr:
@@ -106,6 +106,7 @@ class Info(PagedCommand):
srev = sp.commit_id if sp and sp.commit_id else "None"
self.heading("Superproject revision: ")
self.headtext(srev)
self.out.nl()
self.printSeparator()


@@ -87,6 +87,10 @@ _ONE_DAY_S = 24 * 60 * 60
_REPO_ALLOW_SHALLOW = os.environ.get("REPO_ALLOW_SHALLOW")
_BLOAT_PACK_COUNT_THRESHOLD = 10
_BLOAT_SIZE_PACK_THRESHOLD_KB = 10 * 1024 * 1024 # 10 GiB in KiB
_BLOAT_SIZE_GARBAGE_THRESHOLD_KB = 1 * 1024 * 1024 # 1 GiB in KiB
logger = RepoLogger(__file__)
@@ -204,14 +208,13 @@ class _SyncResult(NamedTuple):
relpath (str): The project's relative path from the repo client top.
remote_fetched (bool): True if the remote was actually queried.
fetch_success (bool): True if the fetch operation was successful.
fetch_error (Optional[Exception]): The Exception from a failed fetch,
or None.
fetch_errors (List[Exception]): The Exceptions from a failed fetch.
fetch_start (Optional[float]): The time.time() when fetch started.
fetch_finish (Optional[float]): The time.time() when fetch finished.
checkout_success (bool): True if the checkout operation was
successful.
checkout_error (Optional[Exception]): The Exception from a failed
checkout, or None.
checkout_errors (List[Exception]): The Exceptions from a failed
checkout.
checkout_start (Optional[float]): The time.time() when checkout
started.
checkout_finish (Optional[float]): The time.time() when checkout
@@ -224,12 +227,12 @@ class _SyncResult(NamedTuple):
remote_fetched: bool
fetch_success: bool
fetch_error: Optional[Exception]
fetch_errors: List[Exception]
fetch_start: Optional[float]
fetch_finish: Optional[float]
checkout_success: bool
checkout_error: Optional[Exception]
checkout_errors: List[Exception]
checkout_start: Optional[float]
checkout_finish: Optional[float]
@@ -976,9 +979,6 @@ later is required to fix a server side protocol bug.
sync_event.set()
sync_progress_thread.join()
self._fetch_times.Save()
self._local_sync_state.Save()
if not self.outer_client.manifest.IsArchive:
self._GCProjects(projects, opt, err_event)
@@ -1004,53 +1004,58 @@ later is required to fix a server side protocol bug.
to_fetch.extend(all_projects)
to_fetch.sort(key=self._fetch_times.Get, reverse=True)
result = self._Fetch(to_fetch, opt, err_event, ssh_proxy, errors)
success = result.success
fetched = result.projects
if not success:
err_event.set()
if opt.network_only:
# Bail out now; the rest touches the working tree.
if err_event.is_set():
e = SyncError(
"error: Exited sync due to fetch errors.",
aggregate_errors=errors,
)
logger.error(e)
raise e
return _FetchMainResult([])
# Iteratively fetch missing and/or nested unregistered submodules.
previously_missing_set = set()
while True:
self._ReloadManifest(None, manifest)
all_projects = self.GetProjects(
args,
missing_ok=True,
submodules_ok=opt.fetch_submodules,
manifest=manifest,
all_manifests=not opt.this_manifest_only,
)
missing = []
for project in all_projects:
if project.gitdir not in fetched:
missing.append(project)
if not missing:
break
# Stop fetching actually-missing repos forever: if the set of
# missing repos has not changed since the last fetch, break.
missing_set = {p.name for p in missing}
if previously_missing_set == missing_set:
break
previously_missing_set = missing_set
result = self._Fetch(missing, opt, err_event, ssh_proxy, errors)
try:
result = self._Fetch(to_fetch, opt, err_event, ssh_proxy, errors)
success = result.success
new_fetched = result.projects
fetched = result.projects
if not success:
err_event.set()
fetched.update(new_fetched)
if opt.network_only:
# Bail out now; the rest touches the working tree.
if err_event.is_set():
e = SyncError(
"error: Exited sync due to fetch errors.",
aggregate_errors=errors,
)
logger.error(e)
raise e
return _FetchMainResult([])
# Iteratively fetch missing and/or nested unregistered submodules.
previously_missing_set = set()
while True:
self._ReloadManifest(None, manifest)
all_projects = self.GetProjects(
args,
missing_ok=True,
submodules_ok=opt.fetch_submodules,
manifest=manifest,
all_manifests=not opt.this_manifest_only,
)
missing = []
for project in all_projects:
if project.gitdir not in fetched:
missing.append(project)
if not missing:
break
# Stop fetching actually-missing repos forever: if the
# set of missing repos has not changed since the last
# fetch, break.
missing_set = {p.name for p in missing}
if previously_missing_set == missing_set:
break
previously_missing_set = missing_set
result = self._Fetch(missing, opt, err_event, ssh_proxy, errors)
success = result.success
new_fetched = result.projects
if not success:
err_event.set()
fetched.update(new_fetched)
finally:
self._fetch_times.Save()
self._local_sync_state.Save()
return _FetchMainResult(all_projects)
@@ -1092,10 +1097,10 @@ later is required to fix a server side protocol bug.
force_sync=force_sync,
force_checkout=force_checkout,
force_rebase=force_rebase,
errors=errors,
verbose=verbose,
)
success = syncbuf.Finish()
errors.extend(syncbuf.errors)
except KeyboardInterrupt:
logger.error("Keyboard interrupt while processing %s", project.name)
except GitError as e:
@@ -1370,6 +1375,110 @@ later is required to fix a server side protocol bug.
t.join()
pm.end()
@classmethod
def _CheckOneBloatedProject(cls, project_index: int) -> Optional[str]:
"""Checks if a single project is bloated.
Args:
project_index: The index of the project in the parallel context.
Returns:
The name of the project if it is bloated, else None.
"""
project = cls.get_parallel_context()["projects"][project_index]
if not project.Exists or not project.worktree:
return None
# Only check dirty or locally modified projects. These can't be
# freshly cloned and will accumulate garbage.
try:
is_dirty = project.IsDirty(consider_untracked=True)
manifest_rev = project.GetRevisionId(project.bare_ref.all)
head_rev = project.work_git.rev_parse(HEAD)
has_local_commits = manifest_rev != head_rev
if not (is_dirty or has_local_commits):
return None
output = project.bare_git.count_objects("-v")
except Exception:
return None
stats = {}
for line in output.splitlines():
try:
key, value = line.split(": ", 1)
stats[key.strip()] = int(value.strip())
except ValueError:
pass
pack_count = stats.get("packs", 0)
size_pack_kb = stats.get("size-pack", 0)
size_garbage_kb = stats.get("size-garbage", 0)
is_fragmented = (
pack_count > _BLOAT_PACK_COUNT_THRESHOLD
and size_pack_kb > _BLOAT_SIZE_PACK_THRESHOLD_KB
)
has_excessive_garbage = (
size_garbage_kb > _BLOAT_SIZE_GARBAGE_THRESHOLD_KB
)
if is_fragmented or has_excessive_garbage:
return project.name
return None
def _CheckForBloatedProjects(self, projects, opt):
"""Check for shallow projects that are accumulating unoptimized data.
For projects with clone-depth="1" that are dirty (have local changes),
run 'git count-objects -v' and warn if the repository is accumulating
excessive pack files or garbage.
"""
# We only care about bloated projects if we have a git version that
# supports --no-auto-gc (2.23.0+), since that is what we use to
# disable auto-gc in Project._RemoteFetch.
if not git_require((2, 23, 0)):
return
projects = [p for p in projects if p.clone_depth]
if not projects:
return
bloated_projects = []
pm = Progress(
"Checking for bloat", len(projects), delay=False, quiet=opt.quiet
)
def _ProcessResults(pool, pm, results):
for result in results:
if result:
bloated_projects.append(result)
pm.update(msg="")
with self.ParallelContext():
self.get_parallel_context()["projects"] = projects
self.ExecuteInParallel(
opt.jobs,
self._CheckOneBloatedProject,
range(len(projects)),
callback=_ProcessResults,
output=pm,
chunksize=1,
)
pm.end()
for project_name in bloated_projects:
warn_msg = (
f'warning: Project "{project_name}" is accumulating '
'unoptimized data. Please run "repo sync --auto-gc" or '
'"repo gc --repack" to clean up.'
)
self.git_event_log.ErrorEvent(warn_msg)
logger.warning(warn_msg)
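The bloat check above parses `git count-objects -v` output into key/value stats and compares them against the pack-count, pack-size, and garbage-size thresholds defined near the top of the diff. A self-contained sketch with illustrative sample output (field values are made up for the example):

```python
# Sample `git count-objects -v` output; values are illustrative.
SAMPLE = """\
count: 120
size: 4096
in-pack: 250000
packs: 14
size-pack: 12582912
prune-packable: 0
garbage: 3
size-garbage: 2097152
"""

# Thresholds from the diff: 10 packs, 10 GiB of packs, 1 GiB of garbage.
PACK_COUNT_THRESHOLD = 10
SIZE_PACK_THRESHOLD_KB = 10 * 1024 * 1024
SIZE_GARBAGE_THRESHOLD_KB = 1 * 1024 * 1024


def is_bloated(output: str) -> bool:
    stats = {}
    for line in output.splitlines():
        key, _, value = line.partition(": ")
        if value:
            stats[key.strip()] = int(value.strip())
    fragmented = (
        stats.get("packs", 0) > PACK_COUNT_THRESHOLD
        and stats.get("size-pack", 0) > SIZE_PACK_THRESHOLD_KB
    )
    excessive_garbage = stats.get("size-garbage", 0) > SIZE_GARBAGE_THRESHOLD_KB
    return fragmented or excessive_garbage


print(is_bloated(SAMPLE))  # True: 2 GiB of garbage exceeds the 1 GiB threshold
```

Note that `count-objects -v` reports sizes in KiB, which is why the thresholds are expressed in KiB rather than bytes.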
def _UpdateRepoProject(self, opt, manifest, errors):
"""Fetch the repo project and check for updates."""
if opt.local_only:
@@ -1753,10 +1862,10 @@ later is required to fix a server side protocol bug.
mp.Sync_LocalHalf(
syncbuf,
submodules=mp.manifest.HasSubmodules,
errors=errors,
verbose=opt.verbose,
)
clean = syncbuf.Finish()
errors.extend(syncbuf.errors)
self.event_log.AddSync(
mp, event_log.TASK_SYNC_LOCAL, start, time.time(), clean
)
@@ -2001,6 +2110,9 @@ later is required to fix a server side protocol bug.
"experience, sync the entire tree."
)
if existing:
self._CheckForBloatedProjects(all_projects, opt)
if not opt.quiet:
print("repo sync has finished successfully.")
@@ -2210,7 +2322,7 @@ later is required to fix a server side protocol bug.
"""Syncs a single project for interleaved sync."""
fetch_success = False
remote_fetched = False
fetch_error = None
fetch_errors = []
fetch_start = None
fetch_finish = None
network_output = ""
@@ -2243,16 +2355,17 @@ later is required to fix a server side protocol bug.
)
fetch_success = sync_result.success
remote_fetched = sync_result.remote_fetched
fetch_error = sync_result.error
if sync_result.error:
fetch_errors.append(sync_result.error)
except KeyboardInterrupt:
logger.error(
"Keyboard interrupt while processing %s", project.name
)
except GitError as e:
fetch_error = e
fetch_errors.append(e)
logger.error("error.GitError: Cannot fetch %s", e)
except Exception as e:
fetch_error = e
fetch_errors.append(e)
logger.error(
"error: Cannot fetch %s (%s: %s)",
project.name,
@@ -2264,7 +2377,7 @@ later is required to fix a server side protocol bug.
network_output = network_output_capture.getvalue()
checkout_success = False
checkout_error = None
checkout_errors = []
checkout_start = None
checkout_finish = None
checkout_stderr = ""
@@ -2284,33 +2397,29 @@ later is required to fix a server side protocol bug.
project.manifest.manifestProject.config,
detach_head=opt.detach_head,
)
local_half_errors = []
project.Sync_LocalHalf(
syncbuf,
force_sync=opt.force_sync,
force_checkout=opt.force_checkout,
force_rebase=opt.rebase,
errors=local_half_errors,
verbose=opt.verbose,
)
checkout_success = syncbuf.Finish()
if local_half_errors:
checkout_error = SyncError(
aggregate_errors=local_half_errors
)
if syncbuf.errors:
checkout_errors.extend(syncbuf.errors)
except KeyboardInterrupt:
logger.error(
"Keyboard interrupt while processing %s", project.name
)
except GitError as e:
checkout_error = e
checkout_errors.append(e)
logger.error(
"error.GitError: Cannot checkout %s: %s",
project.name,
e,
)
except Exception as e:
checkout_error = e
checkout_errors.append(e)
logger.error(
"error: Cannot checkout %s: %s: %s",
project.name,
@@ -2335,8 +2444,8 @@ later is required to fix a server side protocol bug.
fetch_success=fetch_success,
remote_fetched=remote_fetched,
checkout_success=checkout_success,
fetch_error=fetch_error,
checkout_error=checkout_error,
fetch_errors=fetch_errors,
checkout_errors=checkout_errors,
stderr_text=stderr_text.strip(),
fetch_start=fetch_start,
fetch_finish=fetch_finish,
@@ -2382,7 +2491,7 @@ later is required to fix a server side protocol bug.
def _ProcessSyncInterleavedResults(
self,
synced_relpaths: Set[str],
finished_relpaths: Set[str],
err_event: _threading.Event,
errors: List[Exception],
opt: optparse.Values,
@@ -2398,7 +2507,8 @@ later is required to fix a server side protocol bug.
pm.update()
project = projects[result.project_index]
if opt.verbose and result.stderr_text:
success = result.fetch_success and result.checkout_success
if result.stderr_text and (opt.verbose or not success):
pm.display_message(result.stderr_text)
if result.fetch_start:
@@ -2425,19 +2535,19 @@ later is required to fix a server side protocol bug.
result.checkout_success,
)
if result.fetch_success and result.checkout_success:
synced_relpaths.add(result.relpath)
else:
finished_relpaths.add(result.relpath)
if not success:
ret = False
err_event.set()
if result.fetch_error:
errors.append(result.fetch_error)
if result.fetch_errors:
errors.extend(result.fetch_errors)
self._interleaved_err_network = True
self._interleaved_err_network_results.append(
result.relpath
)
if result.checkout_error:
errors.append(result.checkout_error)
if result.checkout_errors:
errors.extend(result.checkout_errors)
self._interleaved_err_checkout = True
self._interleaved_err_checkout_results.append(
result.relpath
@@ -2479,7 +2589,7 @@ later is required to fix a server side protocol bug.
self._interleaved_err_checkout_results = []
err_event = multiprocessing.Event()
synced_relpaths = set()
finished_relpaths = set()
project_list = list(all_projects)
pm = Progress(
"Syncing",
@@ -2494,112 +2604,120 @@ later is required to fix a server side protocol bug.
sync_event = _threading.Event()
sync_progress_thread = self._CreateSyncProgressThread(pm, sync_event)
with multiprocessing.Manager() as manager, ssh.ProxyManager(
manager
) as ssh_proxy:
ssh_proxy.sock()
with self.ParallelContext():
self.get_parallel_context()["ssh_proxy"] = ssh_proxy
# TODO(gavinmak): Use multiprocessing.Queue instead of dict.
self.get_parallel_context()[
"sync_dict"
] = multiprocessing.Manager().dict()
sync_progress_thread.start()
try:
with multiprocessing.Manager() as manager, ssh.ProxyManager(
manager
) as ssh_proxy:
ssh_proxy.sock()
with self.ParallelContext():
self.get_parallel_context()["ssh_proxy"] = ssh_proxy
# TODO(gavinmak): Use multiprocessing.Queue instead of dict.
self.get_parallel_context()[
"sync_dict"
] = multiprocessing.Manager().dict()
sync_progress_thread.start()
try:
# Outer loop for dynamic project discovery. This continues
# until no unsynced projects remain.
while True:
projects_to_sync = [
p
for p in project_list
if p.relpath not in synced_relpaths
]
if not projects_to_sync:
break
try:
# Outer loop for dynamic project discovery. This
# continues until no unsynced projects remain.
while True:
projects_to_sync = [
p
for p in project_list
if p.relpath not in finished_relpaths
]
if not projects_to_sync:
break
pending_relpaths = {p.relpath for p in projects_to_sync}
if previously_pending_relpaths == pending_relpaths:
stalled_projects_str = "\n".join(
f" - {path}"
for path in sorted(list(pending_relpaths))
)
logger.error(
"The following projects failed and could not "
"be synced:\n%s",
stalled_projects_str,
)
err_event.set()
# Include these in the final error report.
self._interleaved_err_checkout = True
self._interleaved_err_checkout_results.extend(
list(pending_relpaths)
)
break
previously_pending_relpaths = pending_relpaths
self.get_parallel_context()[
"projects"
] = projects_to_sync
project_index_map = {
p: i for i, p in enumerate(projects_to_sync)
}
# Inner loop to process projects in a hierarchical
# order. This iterates through levels of project
# dependencies (e.g. 'foo' then 'foo/bar'). All projects
# in one level can be processed in parallel, but we must
# wait for a level to complete before starting the next.
for level_projects in _SafeCheckoutOrder(
projects_to_sync
):
if not level_projects:
continue
objdir_project_map = collections.defaultdict(list)
for p in level_projects:
objdir_project_map[p.objdir].append(
project_index_map[p]
pending_relpaths = {
p.relpath for p in projects_to_sync
}
if previously_pending_relpaths == pending_relpaths:
stalled_projects_str = "\n".join(
f" - {path}"
for path in sorted(list(pending_relpaths))
)
logger.error(
"The following projects failed and could "
"not be synced:\n%s",
stalled_projects_str,
)
work_items = list(objdir_project_map.values())
if not work_items:
continue
jobs = max(1, min(opt.jobs, len(work_items)))
callback = functools.partial(
self._ProcessSyncInterleavedResults,
synced_relpaths,
err_event,
errors,
opt,
)
if not self.ExecuteInParallel(
jobs,
functools.partial(self._SyncProjectList, opt),
work_items,
callback=callback,
output=pm,
chunksize=1,
):
err_event.set()
break
previously_pending_relpaths = pending_relpaths
if err_event.is_set() and opt.fail_fast:
raise SyncFailFastError(aggregate_errors=errors)
self.get_parallel_context()[
"projects"
] = projects_to_sync
project_index_map = {
p: i for i, p in enumerate(projects_to_sync)
}
self._ReloadManifest(None, manifest)
project_list = self.GetProjects(
args,
missing_ok=True,
submodules_ok=opt.fetch_submodules,
manifest=manifest,
all_manifests=not opt.this_manifest_only,
)
pm.update_total(len(project_list))
finally:
sync_event.set()
sync_progress_thread.join()
# Inner loop to process projects in a hierarchical
# order. This iterates through levels of project
# dependencies (e.g. 'foo' then 'foo/bar'). All
# projects in one level can be processed in
# parallel, but we must wait for a level to complete
# before starting the next.
for level_projects in _SafeCheckoutOrder(
projects_to_sync
):
if not level_projects:
continue
objdir_project_map = collections.defaultdict(
list
)
for p in level_projects:
objdir_project_map[p.objdir].append(
project_index_map[p]
)
work_items = list(objdir_project_map.values())
if not work_items:
continue
jobs = max(1, min(opt.jobs, len(work_items)))
callback = functools.partial(
self._ProcessSyncInterleavedResults,
finished_relpaths,
err_event,
errors,
opt,
)
if not self.ExecuteInParallel(
jobs,
functools.partial(
self._SyncProjectList, opt
),
work_items,
callback=callback,
output=pm,
chunksize=1,
initializer=self.InitWorker,
):
err_event.set()
if err_event.is_set() and opt.fail_fast:
raise SyncFailFastError(
aggregate_errors=errors
)
self._ReloadManifest(None, manifest)
project_list = self.GetProjects(
args,
missing_ok=True,
submodules_ok=opt.fetch_submodules,
manifest=manifest,
all_manifests=not opt.this_manifest_only,
)
pm.update_total(len(project_list))
finally:
sync_event.set()
sync_progress_thread.join()
finally:
self._fetch_times.Save()
self._local_sync_state.Save()
pm.end()
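The interleaved-sync change above guards the outer discovery loop against livelock: if the set of still-pending projects is identical to the previous iteration's, nothing made progress, so the stuck projects are reported and the loop exits. A stripped-down sketch of that stall-detection shape (function and argument names are illustrative):

```python
def drain_with_stall_detection(pending, sync_one):
    """Repeatedly sync pending items, bailing out when no progress is made.

    pending: set of project paths still to sync (mutated in place).
    sync_one: callable returning True when a path syncs successfully.
    Returns the set of paths that stalled (never succeeded).
    """
    previously_pending = None
    stalled = set()
    while pending:
        if pending == previously_pending:
            # Same pending set as last round: nothing succeeded, so
            # report these as failed instead of looping forever.
            stalled = set(pending)
            break
        previously_pending = set(pending)
        for path in list(pending):
            if sync_one(path):
                pending.discard(path)
    return stalled


# 'bad/project' always fails, so it is reported as stalled.
stalled = drain_with_stall_detection(
    {"foo", "foo/bar", "bad/project"},
    lambda p: p != "bad/project",
)
print(stalled)  # {'bad/project'}
```

This mirrors why the diff renames `synced_relpaths` to `finished_relpaths`: a project that failed must still count as "finished" for loop-termination purposes, or it would be retried indefinitely.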
@@ -2703,17 +2821,19 @@ class _FetchTimes:
self._saved = {}
def Save(self):
if self._saved is None:
if not self._seen:
return
self._Load()
for name, t in self._seen.items():
# Keep a moving average across the previous/current sync runs.
old = self._saved.get(name, t)
self._seen[name] = (self._ALPHA * t) + ((1 - self._ALPHA) * old)
self._saved[name] = (self._ALPHA * t) + ((1 - self._ALPHA) * old)
try:
with open(self._path, "w") as f:
json.dump(self._seen, f, indent=2)
json.dump(self._saved, f, indent=2)
except (OSError, TypeError):
platform_utils.remove(self._path, missing_ok=True)
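The `_FetchTimes.Save` fix above writes the exponentially-weighted averages into `self._saved` (and dumps that) instead of `self._seen`, so projects that were not fetched this run keep their persisted times. A sketch of the merge step under the assumption of a simple EMA with weight `ALPHA` on the newest sample (the exact `_ALPHA` value here is an assumption):

```python
ALPHA = 0.5  # weight on the most recent sync; illustrative value


def merge_fetch_times(saved, seen, alpha=ALPHA):
    """Fold this run's fetch times into the persisted moving averages.

    saved: times loaded from disk (previous runs).
    seen: times measured during the current run.
    Returns a new dict to write back, preserving entries for projects
    that were not fetched this run.
    """
    merged = dict(saved)
    for name, t in seen.items():
        old = saved.get(name, t)
        merged[name] = alpha * t + (1 - alpha) * old
    return merged


saved = {"platform/build": 10.0, "platform/art": 30.0}
seen = {"platform/build": 20.0}
print(merge_fetch_times(saved, seen))
# {'platform/build': 15.0, 'platform/art': 30.0}
```

Dumping only `seen`, as the old code did, would have dropped `platform/art` from the file entirely; that lost data is what the one-line `self._saved[name] = ...` change restores.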

subcmds/wipe.py Normal file

@@ -0,0 +1,184 @@
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
from typing import List
from command import Command
from error import GitError
from error import RepoExitError
import platform_utils
from project import DeleteWorktreeError
class Error(RepoExitError):
"""Exit error when wipe command fails."""
class Wipe(Command):
"""Delete projects from the worktree and .repo"""
COMMON = True
helpSummary = "Wipe projects from the worktree"
helpUsage = """
%prog <project>...
"""
helpDescription = """
The '%prog' command removes the specified projects from the worktree
(the checked out source code) and deletes the project's git data from `.repo`.
This is a destructive operation and cannot be undone.
Projects can be specified either by name, or by a relative or absolute path
to the project's local directory.
Examples:
# Wipe the project "platform/build" by name:
$ repo wipe platform/build
# Wipe the project at the path "build/make":
$ repo wipe build/make
"""
def _Options(self, p):
# TODO(crbug.com/gerrit/393383056): Add --broken option to scan and
# wipe broken projects.
p.add_option(
"-f",
"--force",
action="store_true",
help="force wipe shared projects and uncommitted changes",
)
p.add_option(
"--force-uncommitted",
action="store_true",
help="force wipe even if there are uncommitted changes",
)
p.add_option(
"--force-shared",
action="store_true",
help="force wipe even if the project shares an object directory",
)
def ValidateOptions(self, opt, args: List[str]):
if not args:
self.Usage()
def Execute(self, opt, args: List[str]):
# Get all projects to handle shared object directories.
all_projects = self.GetProjects(None, all_manifests=True, groups="all")
projects_to_wipe = self.GetProjects(args, all_manifests=True)
relpaths_to_wipe = {p.relpath for p in projects_to_wipe}
# Build a map from objdir to the relpaths of projects that use it.
objdir_map = {}
for p in all_projects:
objdir_map.setdefault(p.objdir, set()).add(p.relpath)
uncommitted_projects = []
shared_objdirs = {}
objdirs_to_delete = set()
for project in projects_to_wipe:
if project == self.manifest.manifestProject:
raise Error(
f"error: cannot wipe the manifest project: {project.name}"
)
try:
if project.HasChanges():
uncommitted_projects.append(project.name)
except GitError:
uncommitted_projects.append(f"{project.name} (corrupted)")
users = objdir_map.get(project.objdir, {project.relpath})
is_shared = not users.issubset(relpaths_to_wipe)
if is_shared:
shared_objdirs.setdefault(project.objdir, set()).update(users)
else:
objdirs_to_delete.add(project.objdir)
block_uncommitted = uncommitted_projects and not (
opt.force or opt.force_uncommitted
)
block_shared = shared_objdirs and not (opt.force or opt.force_shared)
if block_uncommitted or block_shared:
error_messages = []
if block_uncommitted:
error_messages.append(
"The following projects have uncommitted changes or are "
"corrupted:\n"
+ "\n".join(f" - {p}" for p in sorted(uncommitted_projects))
)
if block_shared:
shared_dir_messages = []
for objdir, users in sorted(shared_objdirs.items()):
other_users = users - relpaths_to_wipe
projects_to_wipe_in_dir = users & relpaths_to_wipe
message = f"""Object directory {objdir} is shared by:
Projects to be wiped: {', '.join(sorted(projects_to_wipe_in_dir))}
Projects not to be wiped: {', '.join(sorted(other_users))}"""
shared_dir_messages.append(message)
error_messages.append(
"The following projects have shared object directories:\n"
+ "\n".join(sorted(shared_dir_messages))
)
if block_uncommitted and block_shared:
error_messages.append(
"Use --force to wipe anyway, or --force-uncommitted and "
"--force-shared to specify."
)
elif block_uncommitted:
error_messages.append("Use --force-uncommitted to wipe anyway.")
else:
error_messages.append("Use --force-shared to wipe anyway.")
raise Error("\n\n".join(error_messages))
# If we are here, either there were no issues, or --force was used.
# Proceed with wiping.
successful_wipes = set()
for project in projects_to_wipe:
try:
# Force the delete here since we've already performed our
# own safety checks above.
project.DeleteWorktree(force=True, verbose=opt.verbose)
successful_wipes.add(project.relpath)
except DeleteWorktreeError as e:
print(
f"error: failed to wipe {project.name}: {e}",
file=sys.stderr,
)
# Clean up object directories only if all projects using them were
# successfully wiped.
for objdir in objdirs_to_delete:
users = objdir_map.get(objdir, set())
# Check if every project that uses this objdir has been
# successfully processed. If a project failed to be wiped, don't
# delete the object directory, or we'll corrupt the remaining
# project.
if users.issubset(successful_wipes):
if os.path.exists(objdir):
if opt.verbose:
print(
f"Deleting objects directory: {objdir}",
file=sys.stderr,
)
platform_utils.rmtree(objdir)
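The safety core of the new wipe command is the shared-objdir check: an object directory may be deleted only when every project using it is being wiped, otherwise the surviving projects would be corrupted. A self-contained sketch of that subset-based partitioning (paths below are illustrative):

```python
def partition_objdirs(objdir_map, relpaths_to_wipe):
    """Split object dirs into safely deletable vs shared with survivors.

    objdir_map: {objdir: set of relpaths of projects using it}
    relpaths_to_wipe: relpaths of the projects being wiped.
    """
    deletable, shared = set(), {}
    for objdir, users in objdir_map.items():
        if not (users & relpaths_to_wipe):
            continue  # No wiped project uses this objdir.
        if users.issubset(relpaths_to_wipe):
            deletable.add(objdir)  # Every user is going away.
        else:
            shared[objdir] = users - relpaths_to_wipe  # Survivors remain.
    return deletable, shared


objdir_map = {
    ".repo/project-objects/a.git": {"a"},
    ".repo/project-objects/b.git": {"b", "b-mirror"},
}
deletable, shared = partition_objdirs(objdir_map, {"a", "b"})
print(deletable)  # {'.repo/project-objects/a.git'}
print(shared)     # {'.repo/project-objects/b.git': {'b-mirror'}}
```

The command applies the same subset test twice: once up front to block (or warn about) shared directories, and again after the worktree deletions, so an objdir is only removed when all of its users were actually wiped successfully.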


@@ -1,4 +1,4 @@
# Copyright 2022 The Android Open Source Project
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -1,4 +1,4 @@
# Copyright 2021 The Android Open Source Project
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -1,4 +1,4 @@
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -166,6 +166,30 @@ class GitConfigReadWriteTests(unittest.TestCase):
config = self.get_config()
self.assertIsNone(config.GetBoolean("foo.bar"))
def test_SetInt(self):
"""Test SetInt behavior."""
# Set a value.
self.assertIsNone(self.config.GetInt("foo.bar"))
self.config.SetInt("foo.bar", 10)
self.assertEqual(10, self.config.GetInt("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertEqual(10, config.GetInt("foo.bar"))
self.assertEqual("10", config.GetString("foo.bar"))
# Update the value.
self.config.SetInt("foo.bar", 20)
self.assertEqual(20, self.config.GetInt("foo.bar"))
config = self.get_config()
self.assertEqual(20, config.GetInt("foo.bar"))
# Delete the value.
self.config.SetInt("foo.bar", None)
self.assertIsNone(self.config.GetInt("foo.bar"))
config = self.get_config()
self.assertIsNone(config.GetInt("foo.bar"))
def test_GetSyncAnalysisStateData(self):
"""Test config entries with a sync state analysis data."""
superproject_logging_data = {}


@@ -368,6 +368,7 @@ class EventLogTestCase(unittest.TestCase):
with self.assertRaises(TypeError):
self._event_log_module.Write(path=1234)
@unittest.skipIf(not hasattr(socket, "AF_UNIX"), "Requires AF_UNIX sockets")
def test_write_socket(self):
"""Test Write() with Unix domain socket for |path| and validate received
traces."""


@@ -15,6 +15,7 @@
"""Unittests for the manifest_xml.py module."""
import os
from pathlib import Path
import platform
import re
import tempfile
@@ -97,36 +98,34 @@ class ManifestParseTestCase(unittest.TestCase):
def setUp(self):
self.tempdirobj = tempfile.TemporaryDirectory(prefix="repo_tests")
self.tempdir = self.tempdirobj.name
self.repodir = os.path.join(self.tempdir, ".repo")
self.manifest_dir = os.path.join(self.repodir, "manifests")
self.manifest_file = os.path.join(
self.repodir, manifest_xml.MANIFEST_FILE_NAME
self.tempdir = Path(self.tempdirobj.name)
self.repodir = self.tempdir / ".repo"
self.manifest_dir = self.repodir / "manifests"
self.manifest_file = self.repodir / manifest_xml.MANIFEST_FILE_NAME
self.local_manifest_dir = (
self.repodir / manifest_xml.LOCAL_MANIFESTS_DIR_NAME
)
self.local_manifest_dir = os.path.join(
self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME
)
os.mkdir(self.repodir)
os.mkdir(self.manifest_dir)
self.repodir.mkdir()
self.manifest_dir.mkdir()
# The manifest parsing really wants a git repo currently.
gitdir = os.path.join(self.repodir, "manifests.git")
os.mkdir(gitdir)
with open(os.path.join(gitdir, "config"), "w") as fp:
fp.write(
"""[remote "origin"]
gitdir = self.repodir / "manifests.git"
gitdir.mkdir()
(gitdir / "config").write_text(
"""[remote "origin"]
url = https://localhost:0/manifest
"""
)
)
def tearDown(self):
self.tempdirobj.cleanup()
def getXmlManifest(self, data):
"""Helper to initialize a manifest for testing."""
with open(self.manifest_file, "w", encoding="utf-8") as fp:
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
self.manifest_file.write_text(data, encoding="utf-8")
return manifest_xml.XmlManifest(
str(self.repodir), str(self.manifest_file)
)
@staticmethod
def encodeXmlAttr(attr):
@@ -243,12 +242,14 @@ class XmlManifestTests(ManifestParseTestCase):
def test_link(self):
"""Verify Link handling with new names."""
manifest = manifest_xml.XmlManifest(self.repodir, self.manifest_file)
with open(os.path.join(self.manifest_dir, "foo.xml"), "w") as fp:
fp.write("<manifest></manifest>")
manifest = manifest_xml.XmlManifest(
str(self.repodir), str(self.manifest_file)
)
(self.manifest_dir / "foo.xml").write_text("<manifest></manifest>")
manifest.Link("foo.xml")
with open(self.manifest_file) as fp:
self.assertIn('<include name="foo.xml" />', fp.read())
self.assertIn(
'<include name="foo.xml" />', self.manifest_file.read_text()
)
def test_toxml_empty(self):
"""Verify the ToXml() helper."""
@@ -406,10 +407,9 @@ class IncludeElementTests(ManifestParseTestCase):
def test_revision_default(self):
"""Check handling of revision attribute."""
root_m = os.path.join(self.manifest_dir, "root.xml")
with open(root_m, "w") as fp:
fp.write(
"""
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
@@ -418,17 +418,34 @@ class IncludeElementTests(ManifestParseTestCase):
<project name="root-name2" path="root-path2" />
</manifest>
"""
)
with open(os.path.join(self.manifest_dir, "stable.xml"), "w") as fp:
fp.write(
"""
)
(self.manifest_dir / "stable.xml").write_text(
"""
<manifest>
<include name="man1.xml" />
<include name="man2.xml" revision="stable-branch2" />
<project name="stable-name1" path="stable-path1" />
<project name="stable-name2" path="stable-path2" revision="stable-branch2" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(self.repodir, root_m)
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<project name="man1-name1" />
<project name="man1-name2" revision="stable-branch3" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<project name="man2-name1" />
<project name="man2-name2" revision="stable-branch3" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
for proj in include_m.projects:
if proj.name == "root-name1":
# Check include revision not set on root level proj.
@@ -442,12 +459,19 @@ class IncludeElementTests(ManifestParseTestCase):
if proj.name == "stable-name2":
# Check stable proj revision can override include node.
self.assertEqual("stable-branch2", proj.revisionExpr)
if proj.name == "man1-name1":
self.assertEqual("stable-branch", proj.revisionExpr)
if proj.name == "man1-name2":
self.assertEqual("stable-branch3", proj.revisionExpr)
if proj.name == "man2-name1":
self.assertEqual("stable-branch2", proj.revisionExpr)
if proj.name == "man2-name2":
self.assertEqual("stable-branch3", proj.revisionExpr)
def test_group_levels(self):
root_m = os.path.join(self.manifest_dir, "root.xml")
with open(root_m, "w") as fp:
fp.write(
"""
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
@@ -456,25 +480,23 @@ class IncludeElementTests(ManifestParseTestCase):
<project name="root-name2" path="root-path2" groups="r2g1,r2g2" />
</manifest>
"""
)
with open(os.path.join(self.manifest_dir, "level1.xml"), "w") as fp:
fp.write(
"""
)
(self.manifest_dir / "level1.xml").write_text(
"""
<manifest>
<include name="level2.xml" groups="level2-group" />
<project name="level1-name1" path="level1-path1" />
</manifest>
"""
)
with open(os.path.join(self.manifest_dir, "level2.xml"), "w") as fp:
fp.write(
"""
)
(self.manifest_dir / "level2.xml").write_text(
"""
<manifest>
<project name="level2-name1" path="level2-path1" groups="l2g1,l2g2" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(self.repodir, root_m)
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
for proj in include_m.projects:
if proj.name == "root-name1":
# Check include group not set on root level proj.
@@ -492,6 +514,41 @@ class IncludeElementTests(ManifestParseTestCase):
# Check level2 proj group not removed.
self.assertIn("l2g1", proj.groups)
def test_group_levels_with_extend_project(self):
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<include name="man1.xml" groups="top-group1" />
<include name="man2.xml" groups="top-group2" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<project name="project1" path="project1" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<extend-project name="project1" groups="eg1" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
proj = include_m.projects[0]
# Check project has inherited group via project element.
self.assertIn("top-group1", proj.groups)
# Check project has inherited group via extend-project element.
self.assertIn("top-group2", proj.groups)
# Check project has set group via extend-project element.
self.assertIn("eg1", proj.groups)
def test_allow_bad_name_from_user(self):
"""Check handling of bad name attribute from the user's input."""
@@ -510,9 +567,8 @@ class IncludeElementTests(ManifestParseTestCase):
manifest.ToXml()
# Setup target of the include.
target = os.path.join(self.tempdir, "target.xml")
with open(target, "w") as fp:
fp.write("<manifest></manifest>")
target = self.tempdir / "target.xml"
target.write_text("<manifest></manifest>")
# Include with absolute path.
parse(os.path.abspath(target))
@@ -526,12 +582,9 @@ class IncludeElementTests(ManifestParseTestCase):
def parse(name):
name = self.encodeXmlAttr(name)
# Setup target of the include.
with open(
os.path.join(self.manifest_dir, "target.xml"),
"w",
encoding="utf-8",
) as fp:
fp.write(f'<manifest><include name="{name}"/></manifest>')
(self.manifest_dir / "target.xml").write_text(
f'<manifest><include name="{name}"/></manifest>'
)
manifest = self.getXmlManifest(
"""
@@ -578,18 +631,18 @@ class ProjectElementTests(ManifestParseTestCase):
manifest.projects[0].name: manifest.projects[0].groups,
manifest.projects[1].name: manifest.projects[1].groups,
}
self.assertCountEqual(
result["test-name"], ["name:test-name", "all", "path:test-path"]
self.assertEqual(
result["test-name"], {"name:test-name", "all", "path:test-path"}
)
self.assertCountEqual(
self.assertEqual(
result["extras"],
["g1", "g2", "g1", "name:extras", "all", "path:path"],
{"g1", "g2", "name:extras", "all", "path:path"},
)
groupstr = "default,platform-" + platform.system().lower()
self.assertEqual(groupstr, manifest.GetGroupsStr())
self.assertEqual(groupstr, manifest.GetManifestGroupsStr())
groupstr = "g1,g2,g1"
manifest.manifestProject.config.SetString("manifest.groups", groupstr)
self.assertEqual(groupstr, manifest.GetGroupsStr())
self.assertEqual(groupstr, manifest.GetManifestGroupsStr())
def test_set_revision_id(self):
"""Check setting of project's revisionId."""
@@ -1214,6 +1267,166 @@ class ExtendProjectElementTests(ManifestParseTestCase):
self.assertEqual(len(manifest.projects), 1)
self.assertEqual(manifest.projects[0].upstream, "bar")
def test_extend_project_copyfiles(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject" />
<extend-project name="myproject">
<copyfile src="foo" dest="bar" />
</extend-project>
</manifest>
"""
)
self.assertEqual(list(manifest.projects[0].copyfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].copyfiles)[0].dest, "bar")
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<copyfile dest="bar" src="foo"/>'
"</project>"
"</manifest>",
)
def test_extend_project_duplicate_copyfiles(self):
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="myproject" />
<include name="man1.xml" />
<include name="man2.xml" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "common.xml").write_text(
"""
<manifest>
<extend-project name="myproject">
<copyfile dest="bar" src="foo"/>
</extend-project>
</manifest>
"""
)
manifest = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
self.assertEqual(len(manifest.projects[0].copyfiles), 1)
self.assertEqual(list(manifest.projects[0].copyfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].copyfiles)[0].dest, "bar")
def test_extend_project_linkfiles(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject" />
<extend-project name="myproject">
<linkfile src="foo" dest="bar" />
</extend-project>
</manifest>
"""
)
self.assertEqual(list(manifest.projects[0].linkfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].linkfiles)[0].dest, "bar")
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<linkfile dest="bar" src="foo"/>'
"</project>"
"</manifest>",
)
def test_extend_project_duplicate_linkfiles(self):
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="myproject" />
<include name="man1.xml" />
<include name="man2.xml" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "common.xml").write_text(
"""
<manifest>
<extend-project name="myproject">
<linkfile dest="bar" src="foo"/>
</extend-project>
</manifest>
"""
)
manifest = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
self.assertEqual(len(manifest.projects[0].linkfiles), 1)
self.assertEqual(list(manifest.projects[0].linkfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].linkfiles)[0].dest, "bar")
def test_extend_project_annotations(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject" />
<extend-project name="myproject">
<annotation name="foo" value="bar" />
</extend-project>
</manifest>
"""
)
self.assertEqual(manifest.projects[0].annotations[0].name, "foo")
self.assertEqual(manifest.projects[0].annotations[0].value, "bar")
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<annotation name="foo" value="bar"/>'
"</project>"
"</manifest>",
)
class NormalizeUrlTests(ManifestParseTestCase):
"""Tests for normalize_url() in manifest_xml.py"""


@@ -1,4 +1,4 @@
# Copyright 2021 The Android Open Source Project
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -1,4 +1,4 @@
# Copyright 2022 The Android Open Source Project
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -1,4 +1,4 @@
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -94,7 +94,12 @@ class AllCommands(unittest.TestCase):
"""Block redundant dest= arguments."""
def _check_dest(opt):
if opt.dest is None or not opt._long_opts:
"""Check the dest= setting."""
# If the destination is not set, nothing to check.
# If long options are not set, then there's no implicit destination.
# If callback is used, then a destination might be needed because
# optparse cannot assume a value is always stored.
if opt.dest is None or not opt._long_opts or opt.callback:
return
long = opt._long_opts[0]


@@ -681,6 +681,9 @@ class InterleavedSyncTest(unittest.TestCase):
# Mock _GetCurrentBranchOnly for worker tests.
mock.patch.object(sync.Sync, "_GetCurrentBranchOnly").start()
self.cmd._fetch_times = mock.Mock()
self.cmd._local_sync_state = mock.Mock()
def tearDown(self):
"""Clean up resources."""
shutil.rmtree(self.repodir)
@@ -801,6 +804,7 @@ class InterleavedSyncTest(unittest.TestCase):
with mock.patch("subcmds.sync.SyncBuffer") as mock_sync_buffer:
mock_sync_buf_instance = mock.MagicMock()
mock_sync_buf_instance.Finish.return_value = True
mock_sync_buf_instance.errors = []
mock_sync_buffer.return_value = mock_sync_buf_instance
result_obj = self.cmd._SyncProjectList(opt, [0])
@@ -809,8 +813,8 @@ class InterleavedSyncTest(unittest.TestCase):
result = result_obj.results[0]
self.assertTrue(result.fetch_success)
self.assertTrue(result.checkout_success)
self.assertIsNone(result.fetch_error)
self.assertIsNone(result.checkout_error)
self.assertEqual(result.fetch_errors, [])
self.assertEqual(result.checkout_errors, [])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_called_once()
@@ -832,8 +836,8 @@ class InterleavedSyncTest(unittest.TestCase):
self.assertFalse(result.fetch_success)
self.assertFalse(result.checkout_success)
self.assertEqual(result.fetch_error, fetch_error)
self.assertIsNone(result.checkout_error)
self.assertEqual(result.fetch_errors, [fetch_error])
self.assertEqual(result.checkout_errors, [])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_not_called()
@@ -870,7 +874,7 @@ class InterleavedSyncTest(unittest.TestCase):
self.assertFalse(result.fetch_success)
self.assertFalse(result.checkout_success)
self.assertEqual(result.fetch_error, fetch_error)
self.assertEqual(result.fetch_errors, [fetch_error])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_not_called()
@@ -892,8 +896,8 @@ class InterleavedSyncTest(unittest.TestCase):
self.assertTrue(result.fetch_success)
self.assertFalse(result.checkout_success)
self.assertIsNone(result.fetch_error)
self.assertEqual(result.checkout_error, checkout_error)
self.assertEqual(result.fetch_errors, [])
self.assertEqual(result.checkout_errors, [checkout_error])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_called_once()
@@ -909,6 +913,7 @@ class InterleavedSyncTest(unittest.TestCase):
with mock.patch("subcmds.sync.SyncBuffer") as mock_sync_buffer:
mock_sync_buf_instance = mock.MagicMock()
mock_sync_buf_instance.Finish.return_value = True
mock_sync_buf_instance.errors = []
mock_sync_buffer.return_value = mock_sync_buf_instance
result_obj = self.cmd._SyncProjectList(opt, [0])

tests/test_subcmds_wipe.py Normal file

@@ -0,0 +1,263 @@
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
from unittest import mock
import pytest
import project
from subcmds import wipe
def _create_mock_project(tempdir, name, objdir_path=None, has_changes=False):
"""Creates a mock project with necessary attributes and directories."""
worktree = os.path.join(tempdir, name)
gitdir = os.path.join(tempdir, ".repo/projects", f"{name}.git")
if objdir_path:
objdir = objdir_path
else:
objdir = os.path.join(tempdir, ".repo/project-objects", f"{name}.git")
os.makedirs(worktree, exist_ok=True)
os.makedirs(gitdir, exist_ok=True)
os.makedirs(objdir, exist_ok=True)
proj = project.Project(
manifest=mock.MagicMock(),
name=name,
remote=mock.MagicMock(),
gitdir=gitdir,
objdir=objdir,
worktree=worktree,
relpath=name,
revisionExpr="main",
revisionId="abcd",
)
proj.HasChanges = mock.MagicMock(return_value=has_changes)
def side_effect_delete_worktree(force=False, verbose=False):
if os.path.exists(proj.worktree):
shutil.rmtree(proj.worktree)
if os.path.exists(proj.gitdir):
shutil.rmtree(proj.gitdir)
return True
proj.DeleteWorktree = mock.MagicMock(
side_effect=side_effect_delete_worktree
)
return proj
def _run_wipe(all_projects, projects_to_wipe_names, options=None):
"""Helper to run the Wipe command with mocked projects."""
cmd = wipe.Wipe()
cmd.manifest = mock.MagicMock()
def get_projects_mock(projects, all_manifests=False, **kwargs):
if projects is None:
return all_projects
names_to_find = set(projects)
return [p for p in all_projects if p.name in names_to_find]
cmd.GetProjects = mock.MagicMock(side_effect=get_projects_mock)
if options is None:
options = []
opts = cmd.OptionParser.parse_args(options + projects_to_wipe_names)[0]
cmd.CommonValidateOptions(opts, projects_to_wipe_names)
cmd.ValidateOptions(opts, projects_to_wipe_names)
cmd.Execute(opts, projects_to_wipe_names)
def test_wipe_single_unshared_project(tmp_path):
"""Test wiping a single project that is not shared."""
p1 = _create_mock_project(str(tmp_path), "project/one")
_run_wipe([p1], ["project/one"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert not os.path.exists(p1.objdir)
def test_wipe_multiple_unshared_projects(tmp_path):
"""Test wiping multiple projects that are not shared."""
p1 = _create_mock_project(str(tmp_path), "project/one")
p2 = _create_mock_project(str(tmp_path), "project/two")
_run_wipe([p1, p2], ["project/one", "project/two"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert not os.path.exists(p1.objdir)
assert not os.path.exists(p2.worktree)
assert not os.path.exists(p2.gitdir)
assert not os.path.exists(p2.objdir)
def test_wipe_shared_project_no_force_raises_error(tmp_path):
"""Test that wiping a shared project without --force raises an error."""
shared_objdir = os.path.join(
str(tmp_path), ".repo/project-objects", "shared.git"
)
p1 = _create_mock_project(
str(tmp_path), "project/one", objdir_path=shared_objdir
)
p2 = _create_mock_project(
str(tmp_path), "project/two", objdir_path=shared_objdir
)
with pytest.raises(wipe.Error) as e:
_run_wipe([p1, p2], ["project/one"])
assert "shared object directories" in str(e.value)
assert "project/one" in str(e.value)
assert "project/two" in str(e.value)
assert os.path.exists(p1.worktree)
assert os.path.exists(p1.gitdir)
assert os.path.exists(p2.worktree)
assert os.path.exists(p2.gitdir)
assert os.path.exists(shared_objdir)
def test_wipe_shared_project_with_force(tmp_path):
"""Test wiping a shared project with --force."""
shared_objdir = os.path.join(
str(tmp_path), ".repo/project-objects", "shared.git"
)
p1 = _create_mock_project(
str(tmp_path), "project/one", objdir_path=shared_objdir
)
p2 = _create_mock_project(
str(tmp_path), "project/two", objdir_path=shared_objdir
)
_run_wipe([p1, p2], ["project/one"], options=["--force"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert os.path.exists(shared_objdir)
assert os.path.exists(p2.worktree)
assert os.path.exists(p2.gitdir)
def test_wipe_all_sharing_projects(tmp_path):
"""Test wiping all projects that share an object directory."""
shared_objdir = os.path.join(
str(tmp_path), ".repo/project-objects", "shared.git"
)
p1 = _create_mock_project(
str(tmp_path), "project/one", objdir_path=shared_objdir
)
p2 = _create_mock_project(
str(tmp_path), "project/two", objdir_path=shared_objdir
)
_run_wipe([p1, p2], ["project/one", "project/two"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert not os.path.exists(p2.worktree)
assert not os.path.exists(p2.gitdir)
assert not os.path.exists(shared_objdir)
def test_wipe_with_uncommitted_changes_raises_error(tmp_path):
"""Test wiping a project with uncommitted changes raises an error."""
p1 = _create_mock_project(str(tmp_path), "project/one", has_changes=True)
with pytest.raises(wipe.Error) as e:
_run_wipe([p1], ["project/one"])
assert "uncommitted changes" in str(e.value)
assert "project/one" in str(e.value)
assert os.path.exists(p1.worktree)
assert os.path.exists(p1.gitdir)
assert os.path.exists(p1.objdir)
def test_wipe_with_uncommitted_changes_with_force(tmp_path):
"""Test wiping a project with uncommitted changes with --force."""
p1 = _create_mock_project(str(tmp_path), "project/one", has_changes=True)
_run_wipe([p1], ["project/one"], options=["--force"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert not os.path.exists(p1.objdir)
def test_wipe_uncommitted_and_shared_raises_combined_error(tmp_path):
"""Test that uncommitted and shared projects raise a combined error."""
shared_objdir = os.path.join(
str(tmp_path), ".repo/project-objects", "shared.git"
)
p1 = _create_mock_project(
str(tmp_path),
"project/one",
objdir_path=shared_objdir,
has_changes=True,
)
p2 = _create_mock_project(
str(tmp_path), "project/two", objdir_path=shared_objdir
)
with pytest.raises(wipe.Error) as e:
_run_wipe([p1, p2], ["project/one"])
assert "uncommitted changes" in str(e.value)
assert "shared object directories" in str(e.value)
assert "project/one" in str(e.value)
assert "project/two" in str(e.value)
assert os.path.exists(p1.worktree)
assert os.path.exists(p1.gitdir)
assert os.path.exists(p2.worktree)
assert os.path.exists(p2.gitdir)
assert os.path.exists(shared_objdir)
def test_wipe_shared_project_with_force_shared(tmp_path):
"""Test wiping a shared project with --force-shared."""
shared_objdir = os.path.join(
str(tmp_path), ".repo/project-objects", "shared.git"
)
p1 = _create_mock_project(
str(tmp_path), "project/one", objdir_path=shared_objdir
)
p2 = _create_mock_project(
str(tmp_path), "project/two", objdir_path=shared_objdir
)
_run_wipe([p1, p2], ["project/one"], options=["--force-shared"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert os.path.exists(shared_objdir)
assert os.path.exists(p2.worktree)
assert os.path.exists(p2.gitdir)
def test_wipe_with_uncommitted_changes_with_force_uncommitted(tmp_path):
"""Test wiping uncommitted changes with --force-uncommitted."""
p1 = _create_mock_project(str(tmp_path), "project/one", has_changes=True)
_run_wipe([p1], ["project/one"], options=["--force-uncommitted"])
assert not os.path.exists(p1.worktree)
assert not os.path.exists(p1.gitdir)
assert not os.path.exists(p1.objdir)


@@ -1,4 +1,4 @@
# Copyright 2022 The Android Open Source Project
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

tox.ini

@@ -1,63 +0,0 @@
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# https://tox.readthedocs.io/
[tox]
envlist = lint, py36, py37, py38, py39, py310, py311, py312
requires = virtualenv<20.22.0
[gh-actions]
python =
3.6: py36
3.7: py37
3.8: py38
3.9: py39
3.10: py310
3.11: py311
3.12: py312
[testenv]
deps =
-c constraints.txt
black
flake8
isort
pytest
pytest-timeout
commands = {envpython} run_tests {posargs}
setenv =
GIT_AUTHOR_NAME = Repo test author
GIT_COMMITTER_NAME = Repo test committer
EMAIL = repo@gerrit.nodomain
[testenv:lint]
skip_install = true
deps =
-c constraints.txt
black
flake8
commands =
black --check {posargs:. repo run_tests release/update-hooks release/update-manpages}
flake8
[testenv:format]
skip_install = true
deps =
-c constraints.txt
black
flake8
commands =
black {posargs:. repo run_tests release/update-hooks release/update-manpages}
flake8