Compare commits


100 Commits
v2.54 ... v2.62

Author SHA1 Message Date
Mike Frysinger
ade45de770 docs: windows: mention Developer Mode for symlinks
This is probably better than recommending Administrator access.

Change-Id: Ic916f15fe03f7fa1e03c685265b4774bfc1279c2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/563581
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-03-20 14:07:06 -07:00
Lucas Tanure
0251fb33c4 project: don't re-shallow manually unshallowed repos during sync
If a user has manually unshallowed a repo (e.g. via
`git fetch --unshallow`), the absence of the `shallow` file in the
gitdir indicates a full clone. Re-applying depth during a subsequent
sync would undo the user's intent. Skip re-shallowing in this case
by clearing depth when the project is not new and no shallow file
is present.
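The check can be sketched roughly as follows (a minimal illustration with a hypothetical helper name; the real project.py code differs):

```python
import os

def effective_depth(gitdir, depth, is_new):
    # Hypothetical helper, not the actual project.py code.
    # A missing `shallow` file in an existing gitdir means the clone is
    # already full, so re-applying depth would undo the user's intent.
    if not is_new and not os.path.exists(os.path.join(gitdir, "shallow")):
        return None  # full clone: skip --depth on the next fetch
    return depth
```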

Change-Id: I4ee0e78018de9078fe1bd77a9615613ef0c40d33
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/558743
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Carlos Fernandez <carlosfsanz@meta.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Lucas Alves <ltanure@gmail.com>
Tested-by: Lucas Alves <ltanure@gmail.com>
Reviewed-by: Lucas Alves <ltanure@gmail.com>
2026-03-20 10:08:40 -07:00
Jacky Liu
0176586544 Use git_superproject.UseSuperproject() everywhere
Currently some places use git_superproject.UseSuperproject(), which
checks both the manifest config and the user's config, while other
places use manifest.manifestProject.use_superproject, which only checks
the manifest config. This causes inconsistent behavior for users who do
not set --use-superproject when doing repo init but have
repo.superprojectChoice in their git config.

Replace uses of manifest.manifestProject.use_superproject with
git_superproject.UseSuperproject() to respect the user's config and
avoid the inconsistency.

Bug: 454514213
Change-Id: I1f734235cdd67b8a6915f1d05967d1aaa4d03f2a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/561801
Commit-Queue: Jacky Liu <qsliu@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jacky Liu <qsliu@google.com>
2026-03-18 21:03:07 -07:00
Mike Frysinger
582804a59e pydev: drop Python 2 reference
Not sure who uses this anymore, but might as well delete obviously
wrong content.

Change-Id: I5cdf1cf699c81b7db32b400f371134d21f474743
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/563161
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2026-03-18 12:01:37 -07:00
Mike Frysinger
afc3d55d39 isort: merge config into pyproject.toml
Change-Id: I3a50de04897789c7b2f291882faf1c862645b054
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/563141
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-03-18 12:01:25 -07:00
Mike Frysinger
f24bc7aed5 tests: switch some test modules to pytest
Change-Id: I524b5ff2d77f8232f94e21921b00ba4027d2ac4f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/563081
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-03-18 11:13:26 -07:00
Gavin Mak
83b8ebdbbe git_superproject: avoid re-initing bare repo
Running sync with reftable on a files-backed workspace fails to re-init
the superproject dir with:
```
fatal: could not open
'.../.repo/exp-superproject/<hash>-superproject.git/refs/heads' for writing:
Is a directory
```

Bug: 476209856
Change-Id: Ie8473d66069aafefa5661bd3ea8e73b2b27c6a38
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/550981
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2026-03-17 14:30:35 -07:00
Gavin Mak
a0abfd7339 project: resolve unborn HEAD robustly in reftable repos
Use `git symbolic-ref` to resolve HEAD before trying to parse .git/HEAD
directly, which is unreliable for reftable repos.
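A minimal sketch of that approach (hypothetical helper; the real project.py implementation differs):

```python
import subprocess

def current_branch(gitdir):
    # `git symbolic-ref` resolves HEAD for both the files and reftable
    # backends, even on an unborn branch; parsing .git/HEAD by hand
    # does not work under reftable.
    result = subprocess.run(
        ["git", "--git-dir", gitdir, "symbolic-ref", "HEAD"],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        return None  # detached HEAD
    return result.stdout.strip()  # e.g. "refs/heads/main"
```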

Bug: 476209856
Change-Id: I60185d945c5b43c871945c0126cfdf52194e745d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/550762
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2026-03-17 14:30:20 -07:00
Gavin Mak
403fedfeb5 project: support reftable anchors in worktree .git migration
The reftable backend creates real refs/ and reftable/ dirs. Update
_MigrateOldWorkTreeGitDir to expect these dirs and remove them.

Bug: 476209856
Change-Id: I4700da70cb466e25ecbc51ba4de9a906b8716bd8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/550761
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2026-03-17 14:30:01 -07:00
Gavin Mak
f14c577fce project: avoid direct packed-refs writes during fetch
Replace raw file manipulation with native `git update-ref` commands
inside a try/finally block to ensure temp refs are created/cleaned up
regardless of storage format.
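The pattern reads roughly like this (a sketch with a hypothetical helper name; the real fetch code in project.py is more involved):

```python
import subprocess

def with_temp_ref(gitdir, name, sha, action):
    git = ["git", "--git-dir", gitdir]
    # `git update-ref` works with any ref-storage backend, unlike
    # writing the packed-refs file directly.
    subprocess.run(git + ["update-ref", name, sha], check=True)
    try:
        return action()
    finally:
        # try/finally guarantees cleanup even if action() raises.
        subprocess.run(git + ["update-ref", "-d", name], check=True)
```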

Bug: 476209856
Change-Id: I228e81d3d3b323328260f6672075193421c8dc47
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/550421
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2026-03-17 14:29:38 -07:00
Gavin Mak
67881c0c3b git_refs: read refs via git plumbing for files/reftable
Replace direct `packed-refs` file parsing with `git for-each-ref`
plumbing to support both `files` and `reftable` backends.
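The plumbing-based read can be sketched like this (hypothetical helper mirroring the git_refs.py change, not the actual code):

```python
import subprocess

def all_refs(gitdir):
    # `git for-each-ref` enumerates refs regardless of whether the
    # repository uses the files or reftable backend.
    out = subprocess.run(
        ["git", "--git-dir", gitdir, "for-each-ref",
         "--format=%(objectname) %(refname)"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout
    refs = {}
    for line in out.splitlines():
        sha, name = line.split(" ", 1)
        refs[name] = sha
    return refs
```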

Bug: 476209856
Change-Id: I2ad8ff8f3382426600f15370c997f9bc17165485
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/550401
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2026-03-17 14:28:39 -07:00
Mike Frysinger
551087cd98 tests: add a util module for sharing code
We've started duplicating code among test modules.  Start a common
utils module to hold that, and migrate over TempGitTree to start.

Change-Id: I10b2abd133535c90fbda4d6686602d7e5861d875
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/559041
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2026-03-12 20:09:19 -07:00
Mike Frysinger
8da56a0cc5 man: refresh after recent changes
Change-Id: Ibd60f89406e89255b3284413442b1d9c0ccbfb6d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/559601
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Jeffery Miller <jefferymiller@google.com>
2026-03-10 07:35:40 -07:00
Jeffery Miller
0f01cd24e9 docs: Document support for child elements in extend-project
Clarify the existence and behavior of child elements when added to
extend-project.

Change-Id: Id9f270166c8498d4051495b9a1f68360f66e9143
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/553742
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jeffery Miller <jefferymiller@google.com>
Commit-Queue: Jeffery Miller <jefferymiller@google.com>
2026-02-19 08:38:56 -08:00
Jeffery Miller
1ee98667cc tests: Add extend-project test for additional annotations
Multiple annotations can exist for the same name when
extending projects. Add a test case to show this behavior.

Change-Id: I12bbd25e642c7e615e32f66a1c364a39ac81902c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/553906
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Jeffery Miller <jefferymiller@google.com>
Tested-by: Jeffery Miller <jefferymiller@google.com>
2026-02-19 08:35:03 -08:00
Jordan Esh
6f9622fe1c sync: Remove dependency on ssh if not needed
When running on a machine without the `ssh` command, repo sync would
fail even if no ssh or ssh proxy was required. Use exception handling
inside ssh.ProxyManager to more gracefully handle the case where ssh
is not installed.
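The graceful-degradation idea can be sketched like this (a hypothetical probe function, not the actual ssh.ProxyManager code):

```python
import subprocess

def ssh_version():
    # Probe for `ssh` without crashing when it is missing: return None
    # instead of raising, so a sync that needs no ssh proxy proceeds.
    try:
        out = subprocess.run(["ssh", "-V"], capture_output=True, text=True)
    except FileNotFoundError:
        return None
    # OpenSSH historically prints its version banner to stderr.
    return (out.stdout or out.stderr).strip()
```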

Bug: 467714011
Change-Id: I602a0819638ead4d02de88b750839bc3d70549ce
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/535141
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Jordan Esh <esh.jordan@gmail.com>
Commit-Queue: Jordan Esh <esh.jordan@gmail.com>
2026-02-11 17:00:07 -08:00
Gavin Mak
5cb0251248 gc: fix untargeted projects being deleted
`delete_unused_projects` needs a full list of active projects to figure
out which orphaned .git dirs need to be deleted. Otherwise it thinks
that only the projects specified in args are active.

Bug: 447626164
Change-Id: I02beebf6a01c77742a8db78221452d71cd78ea73
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/550061
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2026-02-09 08:34:42 -08:00
Gavin Mak
a214fd31bd manifest: Introduce sync-j-max attribute to cap sync jobs
Add a way for manifest owners to limit how many sync jobs run in
parallel.
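The cap amounts to a simple clamp (a sketch with a hypothetical helper name; the attribute itself is the `sync-j-max` described above):

```python
def effective_jobs(requested, sync_j_max):
    # Cap the user-requested parallelism at the manifest's sync-j-max,
    # when the manifest sets one.
    if sync_j_max is None:
        return requested
    return min(requested, sync_j_max)
```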

Bug: 481100878
Change-Id: Ia6cbe02cbc83c9e414b53b8d14fe5e7e1b802505
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/548963
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2026-02-06 13:32:28 -08:00
Sam Saccone
62cd0de6cf Make git trace2 logging errors conditional on verbose mode.
Add a verbose attribute to the EventLog class, defaulting to False.
Error messages printed to sys.stderr within the EventLog.Write method
are now guarded by this verbose flag. In main.py, set EventLog.verbose
to True if the command-line --verbose option is used. This prevents
trace2 logging failures from being printed to stderr unless verbose
output is explicitly requested.
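The gating described above reduces to something like this (a simplified sketch; the real EventLog class is much larger):

```python
import sys

class EventLog:
    verbose = False  # main.py sets this to True under --verbose

    def Write(self, log_fn):
        try:
            log_fn()
        except OSError as e:
            # trace2 logging failures are only surfaced on request
            if self.verbose:
                print(f"repo: error writing trace2 log: {e}",
                      file=sys.stderr)
```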

PROMPT=convert all git trace2 logging print messages to verbose only
logging

BUG: b/479811034
Change-Id: I8757ee52117d766f2f3ec47856db64cc4f51143c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/547542
Tested-by: Sam Saccone <samccone@google.com>
Reviewed-by: Julia Tuttle <juliatuttle@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2026-02-03 11:53:48 -08:00
Mike Frysinger
b60512a75a run_tests: log tool versions
Change-Id: I4eee58786bae6d442773c63fa937fb11eda1e2f0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/547863
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2026-01-29 12:32:46 -08:00
Mike Frysinger
5d88972390 Revert "init: change --manifest-depth default to 1"
This reverts commit 622a5bf9c2.

CrOS infra is failing to sync code now for some reason.
Until we can investigate further, pull this back out.

Bug: 475668525
Bug: 468033850
Change-Id: I35a8623a95336df1be27ea870afbfc8065609f01
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/545141
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-01-21 12:32:59 -08:00
Gavin Mak
3c0e67bbc5 manifest_xml: prevent extend-project from inheriting local groups
When extending a project in a local manifest, the project inherits the
`local:` group. This causes the superproject override logic (which omits
projects with `local:` groups) to incorrectly exclude the project from
the override manifest. This leads to "extend-project element specifies
non-existent project" errors during sync reload.

Fix this by stripping `local:` groups from extended projects, ensuring
they remain visible to superproject overrides while still allowing other
inherited groups to persist.
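The stripping step is essentially a filter (hypothetical helper name; the actual fix lives in manifest_xml.py):

```python
def strip_local_groups(groups):
    # Drop inherited `local:` groups from an extended project so
    # superproject overrides still see it; all other groups persist.
    return [g for g in groups if not g.startswith("local:")]
```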

Bug: 470374343
Change-Id: I1a057ebffebc11a19dc14dde7cc13b9f18cdd0a3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/543222
Reviewed-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2026-01-15 10:27:32 -08:00
Nico Wald
3b7b20ac1d CONTRIBUTING: fix HTTP password URL
Change-Id: I7ae085896fe951c2b1c662689fa111a0661f988d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/539762
Tested-by: Nico Wald <nicowald@mac.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Nico Wald <nicowald@mac.com>
2026-01-12 09:48:47 -08:00
Gavin Mak
e71a8c6dd8 project: disable auto-gc for depth=1 in git config
During sync, `git checkout` can trigger fetch for missing objects in
partial clones. This internal fetch can trigger `git maintenance` or
`git gc` and cause delays during the local checkout phase. Set
maintenance.auto to false and gc.auto to 0 during `_InitRemote` if
`depth=1` to ensure that implicit fetches spawned by git skip GC.
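The two settings amount to this (a sketch with a hypothetical helper name; `_InitRemote` applies the same config values):

```python
import subprocess

def disable_auto_gc(gitdir):
    # Turn off maintenance/gc so implicit fetches spawned by
    # `git checkout` in partial clones skip GC.
    git = ["git", "--git-dir", gitdir, "config"]
    subprocess.run(git + ["maintenance.auto", "false"], check=True)
    subprocess.run(git + ["gc.auto", "0"], check=True)
```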

Bug: 379111283
Change-Id: I6b22a4867f29b6e9598746cb752820a84dc2aeb6
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540681
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2026-01-08 11:33:40 -08:00
Mike Frysinger
c687b5df9e run_tests/release: require Python 3.9+
While we support running `repo` on clients with older Python versions,
we don't need to hold the runners & release code back.  These are only
used by repo devs on their systems to develop & release repo.

Python 3.9 was picked due to its typing changes, which we've already
started using in this code.

Change-Id: I6f8885c84298760514c25abeb1fccb0338947bf4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/539801
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2026-01-06 11:36:26 -08:00
Mike Frysinger
1dd9c57a28 tests: drop tox support
This hasn't been working out as well as we'd hoped.  Tox relies on
the system having Python versions installed, which distros don't
tend to carry anymore.  Our custom run_tests leverages vpython
when possible to run stable Python 3.8 & 3.11 versions, which
provides an OK level of coverage in practice.

Change-Id: Ida517f7be47ca95703e43bc0af5a24dd70c0467e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540001
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-01-06 11:32:42 -08:00
Mike Frysinger
4525c2e0ad github: add black check action
Change-Id: Ic87c1c5c72fb8a01108146c1f9d78466acb57278
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540021
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2026-01-06 11:00:32 -08:00
Mike Frysinger
45dcd738b7 tests: skip AF_UNIX tests when unavailable
UNIX sockets aren't available under Windows, so skip the test.

Change-Id: Ic4ca22d161c6dee628352aad07ac6aaceb472ac2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540002
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2026-01-06 10:17:53 -08:00
Mike Frysinger
1dad86dc00 check-metadata: skip files that do not exist
If the files don't exist, then they can't have errors, so skip checking.

Change-Id: I3ed4be4912b253c5454df41d690cb33dfe191289
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/540003
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2026-01-06 10:17:32 -08:00
Mike Frysinger
622a5bf9c2 init: change --manifest-depth default to 1
Most users do not care about the manifest history in .repo/manifests/.
Let's change the default to 1 so things work smoothly for most people
most of the time.  For the rare folks who want the full history, they
can add --manifest-depth=0 to their `repo init`.

This has no effect on existing checkouts.

Spot checking Android & CrOS manifests shows significant speedups.
Full history can take O(10's seconds) to O(minutes) while depth of 1
takes constant time of O(~5 seconds).

Bug: 468033850
Change-Id: I4b8ed62a8a636babcc5226552badb69600d0c353
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/535481
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2026-01-05 06:36:08 -08:00
Gavin Mak
871e4c7ed1 sync: skip bloat check if fresh sync
Initial syncs won't have accumulated any garbage.

Bug: 379111283
Change-Id: I04b2ecde3e33f1f055038861a2705ab6aabb36d1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/536083
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-12-15 15:24:45 -08:00
Gavin Mak
5b0b5513d6 project: only use --no-auto-gc for git 2.23.0+
The flag for git fetch was introduced in git 2.23.0. Also skip the bloat
check after sync if using an older version.

Bug: 468589976
Bug: 379111283
Change-Id: Ib53e5494350c71a83906e5219d3a8c2b654e531f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/536082
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-12-15 11:32:49 -08:00
Gavin Mak
b5991d7128 sync: Add heuristic warning for bloated shallow repositories
For clone-depth="1" repositories that are dirty or have local commits,
add a check at the end of sync to detect excessive git object
accumulation.

This prevents silent performance degradation and disk exhaustion in
large prebuilts repos where automatic GC is typically disabled from
https://gerrit.googlesource.com/git-repo/+/7f87c54043ce9a35a5bb60a09ee846f9d7070352

Bug: 379111283
Change-Id: I376f38e1555cc6e906d852f6e63dc1c8f6331b4f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/534701
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-12-10 11:34:40 -08:00
Gavin Mak
7f87c54043 project: disable auto-gc on fetch for projects with clone-depth=1
This prevents GC hangs on repos with large binaries by skipping implicit
GC during network fetch, using clone-depth=1 as a heuristic.

Bug: 379111283
Change-Id: I977bf8cd521b11e37eba7ebc9f62120f2bbaf760
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/533802
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-08 12:18:48 -08:00
Kaushik Lingarkar
50c6226075 Prevent leftover bare gitdirs after failed sync attempts
The gitdir for a project may be left in a state with bare=true due
to a previous failed sync. In this state, during a subsequent sync
attempt, repo will skip initializing the gitdir (since the directory
already exists) and directly attempt to checkout the worktree, which
will fail because the project is bare. To reduce the chance of this
happening, initialize the gitdir in a temp directory and move it once
it is ready.
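The init-then-move approach can be sketched like this (hypothetical helper; the real change is in project.py):

```python
import os
import shutil
import subprocess
import tempfile

def init_gitdir_atomically(gitdir):
    # Initialize in a sibling temp dir, then rename into place, so a
    # failure mid-init never leaves a half-initialized gitdir behind.
    if os.path.exists(gitdir):
        return  # set up by a previous successful sync
    tmp = tempfile.mkdtemp(dir=os.path.dirname(gitdir) or ".",
                           prefix=".tmp-gitdir-")
    try:
        subprocess.run(["git", "init", "-q", "--bare", tmp], check=True)
        os.rename(tmp, gitdir)  # atomic on the same filesystem
    except BaseException:
        shutil.rmtree(tmp, ignore_errors=True)
        raise
```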

Bug: 457478027
Change-Id: I4767494a3a54e7734174eae3a0d939fa9d174288
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/524203
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-12-05 10:35:46 -08:00
Peter Kjellerstedt
1e4b2887a7 project: Make the error message more logical when a linkfile fail
Due to the odd naming of the arguments to symlink(), the error when it
failed to create a symbolic link was misleading.

Change-Id: I1d0f30ade5970d80186f13e01c426b066cd1062f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/532541
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-03 08:48:11 -08:00
Peter Kjellerstedt
31b4b19387 info: Print a newline after printing the superproject's revision
Change-Id: Ib20233dad4e1f1fd54dbf5ca0324be22fe0e4db1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/528463
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-03 08:08:06 -08:00
Peter Kjellerstedt
2b6de52a36 Rename XmlManifest.GetGroupsStr() to XmlManifest.GetManifestGroupsStr()
This makes it more clear what kind of groups it refers to.

Change-Id: I47369050d1436efcc77f3a69d5b7c99a536b23bc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/528462
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-12-03 07:57:22 -08:00
Peter Kjellerstedt
91ec998598 manifest_xml, git_superproject: Rename an argument for XmlManifest.ToXml()
Rename the groups argument to filter_groups to make it more clear what
kind of groups it refers to.

Change-Id: I90e6e9aa74a7e3e697705dd4bf8676226055878b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/528461
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-12-02 11:44:24 -08:00
Mike Frysinger
08964a1658 docs: manifest-format: reformat spec to align the CDATA parts
Most of the file was doing this, but we've been inconsistent when
adding new entries.  Realign all of them.

Change-Id: I99ddb3a1e859235b249b6f08731bdadad8086d4e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/532461
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2025-12-02 10:43:56 -08:00
Peter Kjellerstedt
3073a90046 manifest: Propagate revision attribute through multiple levels of include
Make sure a revision attribute for an include element is propagated
through multiple levels of manifest includes.

Change-Id: If37d65b0cd47da673719976598175d0eb6b7cbbe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525341
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-11-26 02:08:44 -08:00
Peter Kjellerstedt
75773b8b9d manifest, project: Store project groups as sets
This helps a lot when including common manifests with groups that
also use extend-project.

Change-Id: Ic574e7d6696139d0eb90d9915e8c7048d5e89c07
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525323
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2025-11-26 02:08:07 -08:00
Peter Kjellerstedt
412367bfaf project: Use dicts to keep track of copyfiles and linkfiles
This avoids copying/linking the same file/link multiple times if a
copyfile/linkfile element with the same values has been specified
multiple times. This can happen when including a common manifest that
uses an extend-project element that has a copyfile/linkfile element.

This uses dicts rather than sets to store the copyfiles and linkfiles to
make sure the order they are specified in the manifest is maintained.
For Python 3.7+, maintaining the order that keys are added to dicts is
guaranteed, and for Python 3.6 it happened to be true.

The _CopyFile class and the _LinkFile class are changed to inherit from
NamedTuple to be able to store them in dicts.
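The technique looks like this in miniature (illustrative names; the real `_CopyFile` class carries more fields):

```python
from typing import NamedTuple

class CopyFile(NamedTuple):
    # NamedTuple instances are hashable, so they can serve as dict keys.
    src: str
    dest: str

# A dict keyed by the element deduplicates repeated copyfile entries
# while preserving manifest order (guaranteed for Python 3.7+ dicts).
copyfiles = {}
for cf in [CopyFile("a", "b"), CopyFile("c", "d"), CopyFile("a", "b")]:
    copyfiles[cf] = True
```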

Change-Id: I9f5a80298b875251a81c5fe7d353e262d104fae4
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525322
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2025-11-26 02:07:35 -08:00
Peter Kjellerstedt
47c24b5c40 manifest: Make include groups propagate to extend-project elements
Any groups specified to an include element should propagate to any
extend-project elements and then on to the projects.

Change-Id: I62b95689cc13660858564ae569cbfd095961ecc7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525321
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-11-26 02:05:48 -08:00
Gavin Mak
be33106ffc wipe: Add new repo wipe subcommand
This new command allows users to delete projects from the worktree
and from the `.repo` directory. It is a destructive operation.

It handles shared projects by refusing to wipe them unless the
`--force` flag is used. It also checks for uncommitted changes
before wiping.

Bug: 393383056
Change-Id: Ia30d8ffdc781a3f179af56310ce31c9dae331bbe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/490801
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-11-21 10:48:42 -08:00
Mike Frysinger
5998c0b506 tests: manifest_xml: convert most path usage to pathlib
Should be functionally the same, but with pathlib APIs that we've
been slowly adopting in other places, especially unittests.

Change-Id: I81364117f8eaeaf138097cdfc484d4848b7ea5bd
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/525881
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-11-11 10:58:51 -08:00
Peter Kjellerstedt
877ef91be2 man: Regenerate after manifest update
Change-Id: I0e7ef5d4189eaaf6878be709b437ecfb57570e3f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/524921
Commit-Queue: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
2025-11-06 15:03:30 -08:00
Peter Kjellerstedt
4ab2284a94 manifest: Make extend-project support copyfile, linkfile and annotation
This allows an existing project to be extended by these elements.

Change-Id: I6826e518f39ca86485301491639101943b7e2ae0
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/519781
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Peter Kjellerstedt <peter.kjellerstedt@axis.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-10-27 11:38:07 -07:00
Gavin Mak
1afe96a7e9 sync: fix saving of fetch times and local state
Interleaved sync didn't save _fetch_times and _local_sync_state to disk.
Phased sync saved them, but incorrectly applied moving average smoothing
repeatedly when fetching submodules, and discarded historical data
during partial syncs.

Move .Save() calls to the end of main sync loops to ensure they run
once. Update _FetchTimes.Save() to merge new data with existing history,
preventing data loss.
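The merge-on-save idea can be sketched like this (hypothetical helper; the smoothing constant and names are illustrative, not the actual _FetchTimes code):

```python
def merge_fetch_times(existing, new, alpha=0.5):
    # Apply one exponential-moving-average step per project, so partial
    # syncs keep history and smoothing is not applied repeatedly.
    merged = dict(existing)
    for name, t in new.items():
        if name in merged:
            merged[name] = alpha * t + (1 - alpha) * merged[name]
        else:
            merged[name] = t
    return merged
```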

Change-Id: I174f98a62ac86859f1eeea1daba65eb35c227852
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/519821
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-10-20 11:28:21 -07:00
Mike Frysinger
2719a8e203 run_tests: log each command run
This should make it clear to devs what commands are run and which fail
in the CI.

Change-Id: Ie863540cba6de7da933b4f32947ad09edee4aa45
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/519361
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-10-15 11:09:48 -07:00
Jeroen Dhollander
e4872ac8ba sync: Use 'git rebase' during 'repo sync --rebase'
'repo sync --rebase' should do a rebase if it encounters local commits
during a 'repo sync'.
This was broken by
https://gerrit-review.git.corp.google.com/c/git-repo/+/437421,
which caused this to execute the '_doff' hook (which stands for
'do fast forward'), which is implemented using 'git merge --no-stat'.

This caused *multiple* actual editor windows to pop up (*) during
'repo sync --rebase', asking the user to enter a commit message for the
merge.

In this CL I explicitly make that code path do a 'git rebase'.

(*) and if you use a terminal editor like 'vim', this means you have 2+ concurrent vim windows rendered in the same terminal, while 'repo sync' keeps printing other output lines in that same terminal. The result is not pretty, to say the least :(

Bug: b:434565811
Test: Used it myself for over a week.
Change-Id: I0bf3ff181f15b9d5b2e3f85f7f84e302139fdab7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/518602
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Jeroen Dhollander <jeroendh@google.com>
Commit-Queue: Jeroen Dhollander <jeroendh@google.com>
2025-10-15 08:32:00 -07:00
Kaushik Lingarkar
4623264809 Fix submodule initialization in interleaved sync mode
With the introduction of interleaved sync mode, the submodule activation
logic broke because the 'has_submodules' attribute was no longer being
populated when needed. With this change, each submodule is initialized
when it enters the Sync_LocalHalf stage, whereas previously all
submodules were initialized at once when the parent repository entered
the Sync_LocalHalf stage. The init is now retried if it fails, as
submodules may concurrently modify the parent’s git config, potentially
causing contention when attempting to obtain a lock on it.

This change makes the submodule activation logic more robust and less
prone to breakage.

Bug: 444366154
Change-Id: I25eca4ea2a6868219045cfa088988eb01ded47d2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/509041
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Nasser Grainawi <nasser.grainawi@oss.qualcomm.com>
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-10-14 12:07:04 -07:00
Kaushik Lingarkar
67383bdba9 Follow up "Fix shallow clones when upstream attribute is present"
This reverts commit 38d2fe11b9.

Reason for revert: The issue described in I00acd4c61 remains unresolved.
The previous fix incorrectly accessed use_superproject from the Project
class, though it was only defined in ManifestProject. This change uses
it from the manifest attr available in the Project class.

Bug: b/427093249
Change-Id: Ife6d46cd85840f2989f60c2ca4d5a7dcf5d7477a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/508821
Reviewed-by: Xin Li <delphij@google.com>
Reviewed-by: Krzysztof Wesolowski <krzysztof.wesolowski@volvocars.com>
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
2025-09-22 12:40:22 -07:00
Mike Frysinger
d30414bb53 forall: fix crash with no command
When callback= is used, optparse does not automatically initialize
the destination when a dest= is not specified.  Refine the test to
allow dest= options when callback= is used even when it seems like
it is otherwise redundant.

Bug: b/436611422
Change-Id: I5185f95cb857ca6d37357cac77fb117a83db9c0c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/509861
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-09-17 12:54:30 -07:00
Mike Frysinger
80d1a5ad3e run_tests: add file header checker for licensing blocks
Change-Id: Ic0bfa3b03e2ba46d565a5bc2c1b7a7463b7dca2c
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/500103
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-21 11:16:35 -07:00
Mike Frysinger
c615c964fb man: regen after sync updates
Change-Id: I20937c365b3f0be76e278d17c05b76a0d5e59deb
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/500101
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-08-21 11:11:38 -07:00
Mike Frysinger
5ed12ec81d standardize file header wrt licensing
We've been slightly inconsistent in the license header in files.
Standardize them so we can automate checking.

Change-Id: I3cdf85c9485d33cac2bb05c8080dfada3e5a5e8d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/500102
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-08-21 11:04:41 -07:00
Mike Frysinger
58a59fdfbc CONTRIBUTING: rename doc per Google OSS policies
Google OSS policies say to name this "CONTRIBUTING.md".

Change-Id: I037f52a443caacc89868b7c14af91dd3d1b681a9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/499761
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-08-20 14:15:53 -07:00
Gavin Mak
38d2fe11b9 Revert "Fix shallow clones when upstream attribute is present"
This reverts commit d9cc0a1526.

Reason for revert: AttributeError: 'Project' object has no attribute 'use_superproject'

Bug: b/427093249
Change-Id: I57b285ab21f58b040e68ec14b85425f43f0abcca
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498641
Reviewed-by: Mike Frysinger <vapier@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-08-14 16:35:26 -07:00
Gavin Mak
854fe440f2 git_superproject: fix AttributeError in Superproject logging
Ensure _git_event_log is initialized before use in _LogMessage. This
avoids crashes when _git_event_log is accessed before it's set, such as
during repo info.

Bug: 435317391
Change-Id: I3adc32d6a9377558e852bbb43f9cf82041fcf1bc
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498521
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-14 15:39:41 -07:00
Gavin Mak
d534a5537f sync: Fix missing error details in interleaved summary
When checkout errors occurred in interleaved sync, they were wrapped in
a SyncError with no message, causing blank lines in the final summary.
Refactor _SyncResult to hold a list of exceptions, ensuring the original
error messages are propagated correctly.

Bug: 438178765
Change-Id: Ic25e515068959829cb6290cfd9e4c2d3963bbbea
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498342
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-08-14 09:54:15 -07:00
Gavin Mak
a64149a7a7 sync: Record and propagate errors from deferred actions
Failures in deferred sync actions were not recorded because `_Later.Run`
discarded the `GitError` exception. Record the specific error using
`syncbuf.fail()` and propagate it for proper error aggregation and
reporting.
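The fix can be sketched roughly like this, with simplified stand-ins for repo's `_Later` and sync buffer (`RuntimeError` stands in for `GitError`):

```python
class SyncBuffer:
    """Minimal stand-in: collects failures for later reporting."""

    def __init__(self):
        self.failures = []

    def fail(self, project, err):
        # Record the specific error instead of discarding it.
        self.failures.append((project, err))


class Later:
    """Deferred sync action; previously its run() swallowed errors."""

    def __init__(self, project, action):
        self.project = project
        self.action = action

    def run(self, syncbuf):
        try:
            self.action()
            return True
        except RuntimeError as e:  # stand-in for GitError
            syncbuf.fail(self.project, e)
            return False
```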

Bug: 438178765
Change-Id: Iad59e389f9677bd6b8d873ee1ea2aa6ce44c86fa
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498141
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-13 23:17:56 -07:00
Gavin Mak
3e6acf2778 progress: Fix race condition causing fileno crash
A race condition occurs when sync redirects sys.stderr to capture
worker output while a background progress thread simultaneously calls
fileno() on it, causing an io.UnsupportedOperation error. Fix by
caching the original sys.stderr for all progress bar IO.
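A minimal sketch of the caching approach (not repo's actual progress code): grab the real stream once at import time, so later reassignment of sys.stderr cannot break fileno()-based queries.

```python
import io
import os
import sys

# Cache the real stderr once, before anything redirects sys.stderr.
_ORIG_STDERR = sys.stderr


def terminal_width(default=80):
    """Width of the original stderr terminal, or a default."""
    try:
        return os.get_terminal_size(_ORIG_STDERR.fileno()).columns
    except (OSError, ValueError, io.UnsupportedOperation):
        # Not a real terminal (pipe, StringIO, closed stream).
        return default
```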

Change-Id: Idb1f45d707596d31238a19fd373cac3bf669c405
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498121
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-08-13 23:16:55 -07:00
Gavin Mak
a6e1a59ac1 sync: Avoid duplicate projects in error text
Keep track of finished projects, not just successful ones, when
deciding which projects still need to be synced. Also, project errors
are already reported by sync workers, so stall detection doesn't need
to add failed projects to the error list.

Bug: 438178765
Change-Id: Ibf15aad009ba7295e70c8df2ff158215085e9732
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498062
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-08-13 23:16:55 -07:00
Gavin Mak
380bf9546e sync: always show sync result stderr_text on error
_ProcessSyncInterleavedResults currently only shows stderr_text if
verbose. Show it if a sync worker fails, regardless of verbosity.

Bug: 438178765
Change-Id: If24dcb10fb5d6857386782d371e3f9c6844dece9
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/498061
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-08-13 23:16:55 -07:00
Krzysztof Wesolowski
d9cc0a1526 Fix shallow clones when upstream attribute is present
The _CheckForImmutableRevision method was modified in commit 0e776a58 to
include upstream branch validation for superproject scenarios. However,
this change inadvertently broke shallow clones when both clone-depth and
upstream attributes are specified in regular (non-superproject)
manifests.

Issue: When upstream is present, _CheckForImmutableRevision performs two
additional checks: 1. git rev-list on the upstream reference 2. git
merge-base --is-ancestor between revision and upstream

In shallow clones, the upstream branch history may not be available
locally, causing these checks to fail. This triggers the retry mechanism
that removes depth limitations, effectively converting shallow clones to
full clones, resulting in excessive disk usage.

Fix: Make upstream validation conditional on superproject usage. This
preserves the original superproject fix while restoring the method's
original behavior for regular scenarios - checking only if the immutable
revision (SHA1/tag) exists locally.

Note: The SetRevisionId method from the same commit 0e776a58 is left
unchanged as it only stores upstream information (no git operations),
which is beneficial for preserving branch context for commands like
'repo start' without causing fetch-related issues.

The fix ensures that manifests with both clone-depth and upstream work
correctly in non-superproject scenarios, maintaining shallow clone
efficiency and reducing disk usage.

Bug: b/427093249
Change-Id: I00acd4c61b179cd2abf796c2fecb7a2f38016a18
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/493883
Tested-by: Krzysztof Wesolowski <krzysztof.wesolowski@volvocars.com>
Commit-Queue: Krzysztof Wesolowski <krzysztof.wesolowski@volvocars.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Kamaljeet Maini <kamaljeet@google.com>
Reviewed-by: Xin Li <delphij@google.com>
2025-08-05 08:28:37 -07:00
Gavin Mak
8c3585f367 project: fallback to reading HEAD when rev-parse fails
git rev-parse fails on invalid HEAD, e.g. after incomplete sync, causing
NoManifestException. Fall back to v2.56's direct file reading when
rev-parse fails.
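The fallback can be sketched like so (hypothetical helper, not repo's actual code): try `git rev-parse`, and on failure read the HEAD file directly.

```python
import os
import subprocess


def get_head(gitdir):
    """Return HEAD's value, tolerating an invalid/corrupt HEAD.

    Prefer `git rev-parse HEAD`; if git errors out (e.g. after an
    incomplete sync), fall back to reading the HEAD file directly.
    """
    try:
        out = subprocess.run(
            ["git", "--git-dir", gitdir, "rev-parse", "HEAD"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        with open(os.path.join(gitdir, "HEAD")) as f:
            return f.read().strip()
```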

Bug: 435045466
Change-Id: Ia14560335110c00d80408b2a93595a84446f8a57
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/495181
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-08-04 12:17:44 -07:00
Gavin Mak
239fad7146 hooks: verify hooks project has worktree before running
Skip hook if its project is not present on disk.

Bug: 434232630
Change-Id: I09a8b412d078af7a068d533f7be320d5b02327be
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/494441
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-07-28 08:37:08 -07:00
Kuang-che Wu
d3eec0acdd sync: fix connection error on macOS for interleaved sync
Bug: 377538810
Test: on macos, repo sync -j64
Change-Id: I6af4d4e6669dc882f165cbb9142ad4db9b346b73
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/494241
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Kuang-che Wu <kcwu@google.com>
Tested-by: Kuang-che Wu <kcwu@google.com>
2025-07-28 02:05:24 -07:00
Gavin Mak
7f7d70efe4 project: Fix GetHead to handle detached HEADs
The switch to git rev-parse caused GetHead() to return the literal
string 'HEAD' when in a detached state. This broke repo prune, which
expects a commit SHA.

Bug: 434077990
Change-Id: I80b7d5965749096b59e854f61e913aa74c857b99
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/494401
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-07-25 14:30:07 -07:00
Gavin Mak
720bd1e96b sync: Don't checkout if no worktree
Interleaved sync should not try to check out a project if it's a mirror.

Change-Id: I2549faab197a3202d79a10e44b449b68d53e3fe7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/492942
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-07-23 15:57:49 -07:00
Gavin Mak
25858c8b16 sync: Default to interleaved mode
The previous default, "phased" sync (separate network and checkout
phases), can now be selected with `--no-interleaved`.

Bug: 421935613
Bug: 432082000
Change-Id: Ia8624daa609a28ea2f87f8ea4b42138d8b3e9269
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/489681
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-07-21 14:51:36 -07:00
Gavin Mak
52bab0ba27 project: Use git rev-parse to read HEAD
Don't directly read `.git/HEAD`; git already has a command for this.

Bug: 432200791
Change-Id: Iba030650224143eb07c44da1fa56341d9deb4288
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/492941
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-07-21 14:50:46 -07:00
Gavin Mak
2e6d0881d9 sync: Improve UI and error reporting for interleaved mode
This fixes two issues:
1. The progress bar could show a count greater than the total if new
   projects were discovered mid-sync. Update the progress bar total
   dynamically.
2. Make the "Stall detected" error message more actionable.

Bug: 432206932
Change-Id: Ie2a4ada5b1770cae0302fb06590641c522cbb7e7
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/491941
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-07-17 17:30:33 -07:00
Gavin Mak
74edacd8e5 project: Use plumbing commands to manage HEAD
Don't directly manipulate `.git/HEAD` since it bypasses Git's internal
state management.

Bug: 432200791
Change-Id: I1c9264bcf107d34574a82b60a22ea2c83792951b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/491841
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-07-17 15:41:59 -07:00
Gavin Mak
5d95ba8d85 progress: Make end() idempotent
This fixes the double "done" text on successful interleaved sync.
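Idempotent here simply means that calling end() a second time has no extra effect; a minimal sketch (not the actual Progress class):

```python
class Progress:
    """Sketch of a progress bar whose end() is safe to call twice."""

    def __init__(self, title):
        self.title = title
        self._ended = False
        self.output = []

    def end(self):
        if self._ended:
            return  # already finished; don't emit "done" again
        self._ended = True
        self.output.append(f"{self.title} done")
```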

Bug: 421935613
Change-Id: I4f01418cb0340129a8f0a2a5835f7e3fa6a6b119
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/487081
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-07-02 13:11:23 -07:00
Kenny Cheng
82d500eb7a sync: support post-sync hook in <repo-hooks>
Add support for a new hook type "post-sync" declared in the manifest using
<repo-hooks>. This allows executing a script automatically after a successful
`repo sync`.

This is useful for initializing developer environments, installing project-wide
Git hooks, generating configs, and other post-sync automation tasks.

Example manifest usage:

  <project name="myorg/repo-hooks" path="hooks" revision="main" />
  <repo-hooks in-project="myorg/repo-hooks" enabled-list="post-sync">
    <hook name="post-sync" />
  </repo-hooks>

The hook script must be named `post-sync.py` and located at the root of the
hook project.

The post-sync hook does not block `repo sync`; if the script fails, the sync
still completes successfully with a warning.
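A minimal `post-sync.py` sketch to pair with the manifest snippet above; the entry-point shape is an assumption for illustration, not a documented contract:

```python
#!/usr/bin/env python3
# Hypothetical post-sync.py: runs after a successful `repo sync`.
# If it fails, sync still completes and only a warning is shown.


def main():
    # e.g. install project-wide git hooks, generate local configs...
    print("post-sync: workspace ready")
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```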

Test: Added `post-sync.py` in hook project and verified it runs after `repo sync`

Bug: b/421694721
Change-Id: I69f3158f0fc319d73a85028d6e90fea02c1dc8c8
Signed-off-by: Kenny Cheng <chao.shun.cheng.tw@gmail.com>
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/480581
Reviewed-by: Scott Lee <ddoman@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
2025-07-01 16:11:50 -07:00
Matt Moeller
21269c3eed init: Add environment variable for git-lfs
Convenient way to always enable or disable git-lfs without having to
remember to put it on the command line.

Useful if you want to ALWAYS have git-lfs enabled on your system when
you 'init' a new project.

Also useful if you are using the Jenkins repo plugin as it doesn't
provide an option for enabling git-lfs in its UI.

Change-Id: Ieb1bbe83de9c21523ab69b30fc5047c257d02731
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/437661
Commit-Queue: Scott Lee <ddoman@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Fatahillah Wk <fatahillahwkwk@gmail.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Matt Moeller <moeller.matt@gmail.com>
Reviewed-by: Yingchun Li <sword.l.dragon@gmail.com>
2025-06-30 15:27:26 -07:00
Gavin Mak
99b5a17f2c sync: Share final error handling logic between sync modes
Dedupe error reporting logic for phased and interleaved sync modes by
extracting it into _ReportErrors.

Error reporting now distinguishes between network and local failures
and lists the specific repos that failed in each phase.

Bug: 421935613
Change-Id: I4604a83943dbbd71d979158d7a1c4b8c243347d2
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/484541
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-06-23 16:06:34 -07:00
Gavin Mak
df3c4017f9 sync: Share manifest list update logic between sync modes
Extract the manifest update loop from _SyncPhased into a new
_UpdateManifestLists method and use it in both sync types.

Bug: 421935613
Change-Id: If499a3ce4a0bbb3c4641dba52ca5c1c82b11f16f
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/484341
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-06-23 09:17:05 -07:00
Gavin Mak
f7a3f99dc9 sync: Share self-update logic between sync modes
The logic for checking for repo self-updates lives in _FetchMain, which
is part of the "phased" sync path.

Extract this logic into a new _UpdateRepoProject helper method. Call
this common helper from _ExecuteHelper before either sync mode begins,
so the repo self-update check is always performed.

Bug: 421935613
Change-Id: I9a804f43fbf6239c4146be446040be531f12fc8a
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/484041
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-06-23 09:12:41 -07:00
Gavin Mak
6b8e9fc8db sync: clarify job flags when using interleaved
--jobs-network and --jobs-checkout are ignored with --interleaved.

Bug: 421935613
Change-Id: Ib69413993c4f970b385bd09318972716e5ac3324
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/485021
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-06-18 15:23:59 -07:00
Gavin Mak
7b6ffed4ae sync: Implement --interleaved sync worker
For each assigned project, the worker sequentially calls
Sync_NetworkHalf and Sync_LocalHalf, respecting --local-only and
--network-only flags. To prevent scrambled progress bars, all stderr
output from the checkout phase is captured (shown with --verbose).
Result objects now carry status and timing information from the worker
for state updates.
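An illustrative sketch of the per-project worker flow (fake project callables, not repo's Project API):

```python
import contextlib
import io


def sync_one(project, local_only=False, network_only=False):
    """Fetch then check out one project, capturing checkout stderr.

    `project` is a stand-in dict with `fetch` and `checkout`
    callables returning True/False.
    """
    ok = True
    captured = ""
    if not local_only:
        ok = project["fetch"]()
    if ok and not network_only:
        buf = io.StringIO()
        with contextlib.redirect_stderr(buf):
            ok = project["checkout"]()
        captured = buf.getvalue()
    return ok, captured
```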

Bug: 421935613
Change-Id: I398602e08a375e974a8914e5fa48ffae673dda9b
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/483301
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-06-18 10:26:27 -07:00
Gavin Mak
b4b323a8bd sync: Add orchestration logic for --interleaved
Introduce the parallel orchestration framework for `repo sync
--interleaved`.

The new logic respects project dependencies by processing them in
hierarchical levels. Projects sharing a git object directory are grouped
and processed serially. Also reuse the familiar fetch progress bar UX.
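The serialization constraint can be sketched as a simple grouping step (field names here are illustrative):

```python
from collections import defaultdict


def group_by_objdir(projects):
    """Group projects that share a git object directory.

    Projects in the same group must be synced serially to avoid
    concurrent writers in one object store; distinct groups can
    run in parallel.
    """
    groups = defaultdict(list)
    for project in projects:
        groups[project["objdir"]].append(project["name"])
    return dict(groups)
```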

Bug: 421935613
Change-Id: Ia388a231fa96b3220e343f952f07021bc9817d19
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/483281
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-06-17 16:13:36 -07:00
Gavin Mak
f91f4462e6 upload: fix FileNotFoundError when no superproject
Upload gets a FileNotFoundError if not using superproject because it
tries to access the superproject's repo_id before checking if
superproject was actually enabled.

Reorder the logic to check use_superproject first.

Change-Id: I65cd2adab481e799dd7bb75e1a83553ad6e34d8d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/484401
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-06-17 13:31:02 -07:00
Gavin Mak
85352825ff sync: Add scaffolding for interleaved sync
Prepare for an interleaved fetch and checkout mode for `repo sync`. The
goal of the new mode is to significantly speed up syncs by running fetch
and checkout operations in parallel for different projects, rather than
waiting for all fetches to complete before starting any checkouts.

Bug: 421935613
Change-Id: I8c66d1e790c7bba6280e409b95238c5e4e61a9c8
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/482821
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
Tested-by: Gavin Mak <gavinmak@google.com>
2025-06-11 16:31:35 -07:00
Scott Lee
b262d0e461 info: fix mismatched format args and wrong symbol name
Bug: 416589884
Change-Id: Icbaade585932f0cbb51367e07925ef606f089697
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/482762
Commit-Queue: Scott Lee <ddoman@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Lint: Scott Lee <ddoman@google.com>
Tested-by: Scott Lee <ddoman@google.com>
2025-06-10 12:38:23 -07:00
Mike Frysinger
044e52e236 hooks: add internal check for external hook API
Add an internal check to make sure we always follow the API we've
documented for external authors.  Since the internal call is a bit
ad-hoc, it can be easy to miss a call site.

Change-Id: Ie8cd298d1fc34f10f3c5eb353512a3e881f42252
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/481721
Reviewed-by: Nasser Grainawi <nasser.grainawi@oss.qualcomm.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-06-06 11:12:13 -07:00
Gavin Mak
0cb88a8d79 git_superproject: Replace walrus operator
The walrus operator was added in Python 3.8, but repo still supports 3.6.
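The mechanical rewrite looks like this (a generic example, not the actual changed line):

```python
import re

PATTERN = re.compile(r"superproject\.(\w+)")


def find_key_38(line):
    # Python 3.8+ only: assignment expression inside the condition.
    if (m := PATTERN.search(line)):
        return m.group(1)
    return None


def find_key_36(line):
    # Python 3.6-compatible: hoist the assignment out.
    m = PATTERN.search(line)
    if m:
        return m.group(1)
    return None
```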

Bug: 422226033
Change-Id: I6bdd2cdbb074766ecfb1492d842c847781c4b264
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/481201
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-06-04 10:02:01 -07:00
Gavin Mak
08815ad3eb upload: Add rev to rootRepo push option
Bug: b/401147338
Change-Id: Iac19af5aadd250538702920d9beaeef9250c78fe
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/478801
Tested-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-05-28 11:44:55 -07:00
Scott Lee
3c8bae27ec info: print superproject revision
Bug: 416589884
Change-Id: I5d1c709518d76d777a7f07c4c774569773c5a265
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/478205
Lint: Scott Lee <ddoman@google.com>
Tested-by: Scott Lee <ddoman@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
Commit-Queue: Scott Lee <ddoman@google.com>
2025-05-27 11:49:32 -07:00
Mike Frysinger
06338abe79 subcmds: delete redundant dest= settings
Add a test to enforce this too.

Change-Id: I80b5cf567aa33db9c24b53428c66d69f9c1d8d74
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/478481
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
2025-05-27 09:26:43 -07:00
Gavin Mak
8d37f61471 upload: Add superproject identifier as push option
When uploading, add the root superproject repo as a push option in the
format `-o custom-keyed-value=rootRepo:$HOST/$PROJECT`.

Bug: b/401147338
Change-Id: I00230256eb7ae307b03840bb4090c28dc8a0505e
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/472601
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
Tested-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Gavin Mak <gavinmak@google.com>
2025-05-05 15:10:49 -07:00
Mike Frysinger
1acbc14c34 manifest: generalize --json as --format=<format>
This will make it easier to add more formats without exploding the
common --xxx space and checking a large set of boolean flags.

Also fill out the test coverage while we're here.
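The flag generalization can be sketched with optparse (which repo's subcommands use); the choices shown are illustrative:

```python
import optparse


def make_parser():
    p = optparse.OptionParser()
    p.add_option(
        "--format",
        choices=("xml", "json"),  # room to grow without new flags
        default="xml",
        help="output format",
    )
    # Keep --json working as an alias for --format=json.
    p.add_option(
        "--json",
        action="store_const",
        const="json",
        dest="format",
        help="deprecated; same as --format=json",
    )
    return p
```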

Bug: b/412725063
Change-Id: I754013dc6cb3445f8a0979cefec599d55dafdcff
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/471941
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
Tested-by: Mike Frysinger <vapier@google.com>
2025-04-30 12:25:15 -07:00
Mike Frysinger
c448ba9cc7 run_tests: only allow help2man skipping in CI
Make sure we run this for local devs.

Change-Id: I472b7c347086d54649dd9d5778eea4737447b353
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/471921
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Scott Lee <ddoman@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-04-30 11:47:18 -07:00
Mike Frysinger
21cbcc54e9 update-manpages: include in unittests
People often forget to regen when making interface changes.

We skip the test if help2man isn't installed since it's not common,
and it's not available on our CI bots currently.

Change-Id: Ib4911a0e3fa1294ad90e4ac8afc047a0b7c2b66d
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/469741
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-04-28 10:24:33 -07:00
Erik Elmeke
0f200bb3a1 flake8: Ignore .venv directory
By convention, .venv is a very common location for venvs and is the
default in some tools, such as Astral's uv. The third-party packages
installed there should not be linted.

Change-Id: I3278d90c2fdfc8a34a2488e82d4df8e836111ce1
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/469941
Tested-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
Commit-Queue: Josip Sokcevic <sokcevic@chromium.org>
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
2025-04-23 08:33:37 -07:00
Mike Frysinger
c8da28c3ed man: regenerate man pages
Change-Id: Ie348f7a29523655bf1d6247af8302ff885420d75
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/469742
Tested-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Commit-Queue: Mike Frysinger <vapier@google.com>
2025-04-22 11:37:45 -07:00
Erik Elmeke
c061593a12 manifest: Remove redundant re-raise of BaseExceptions
This change should be a noop from a functional point of view.
Exceptions inheriting directly from BaseException (KeyboardInterrupt,
SystemExit) are not caught by "except Exception", they will instead
continue raising upwards the stack, so there is no need to explicitly
catch and re-raise them.
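A tiny demonstration of why the explicit re-raise was redundant: `except Exception` never intercepts `KeyboardInterrupt` or `SystemExit`, which derive from `BaseException` directly.

```python
def guarded(fn):
    """Run fn, converting ordinary errors to a marker string.

    BaseException subclasses like SystemExit pass straight through
    without any explicit catch-and-reraise.
    """
    try:
        fn()
        return "ok"
    except Exception:
        return "caught"
```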

Change-Id: Ic10764af4a6c05d1162f8b21651e7864ed742286
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/469601
Reviewed-by: Mike Frysinger <vapier@google.com>
Reviewed-by: Josip Sokcevic <sokcevic@chromium.org>
Tested-by: Erik Elmeke <erik@haleytek.corp-partner.google.com>
2025-04-22 10:20:08 -07:00
Kaushik Lingarkar
a94457d1ce Fallback to full sync when depth enabled fetch of a sha1 fails
In sha1 mode, when depth is enabled, syncing the revision from
upstream may not work because some servers only allow fetching
named refs. Fetching a specific sha1 may result in an error like
'server does not allow request for unadvertised object'. In this
case, attempt a full sync with depth disabled.
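The fallback logic amounts to the following (hypothetical names; `RuntimeError` stands in for the fetch error):

```python
def fetch_sha1(fetch, sha1, depth):
    """Try a shallow fetch of a pinned sha1; retry without depth.

    Some servers reject fetching an unadvertised object through a
    shallow request, so a full (non-shallow) fetch is attempted
    before giving up.
    """
    try:
        return fetch(sha1, depth=depth)
    except RuntimeError:
        if depth is None:
            raise
        return fetch(sha1, depth=None)
```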

Bug: 410825502
Change-Id: If51bcf18b877cd9491706f5bc3d6fd13c0c3d4f3
Reviewed-on: https://gerrit-review.googlesource.com/c/git-repo/+/468282
Commit-Queue: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Tested-by: Kaushik Lingarkar <kaushikl@qti.qualcomm.com>
Reviewed-by: Gavin Mak <gavinmak@google.com>
Reviewed-by: Mike Frysinger <vapier@google.com>
2025-04-17 11:46:11 -07:00
90 changed files with 5153 additions and 1631 deletions

@@ -12,5 +12,6 @@ extend-ignore =
# E731: do not assign a lambda expression, use a def
E731,
exclude =
.venv,
venv,
.tox,

.github/workflows/black.yml vendored Normal file
@@ -0,0 +1,16 @@
# GitHub actions workflow.
# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions
# https://black.readthedocs.io/en/stable/integrations/github_actions.html
name: Format
on:
push:
branches: [main]
jobs:
format:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5
- uses: psf/black@stable

@@ -18,5 +18,5 @@ jobs:
Thanks for your contribution!
Unfortunately, we don't use GitHub pull requests to manage code
contributions to this repository.
Instead, please see [README.md](../blob/HEAD/SUBMITTING_PATCHES.md)
Instead, please see [README.md](../blob/HEAD/CONTRIBUTING.md)
which provides full instructions on how to get involved.

@@ -27,6 +27,6 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install tox tox-gh-actions
- name: Test with tox
run: tox
python -m pip install pytest
- name: Run tests
run: python -m pytest

@@ -1,41 +0,0 @@
# Copyright 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Config file for the isort python module.
# This is used to enforce import sorting standards.
#
# https://pycqa.github.io/isort/docs/configuration/options.html
[settings]
# Be compatible with `black` since it also matches what we want.
profile = black
line_length = 80
length_sort = false
force_single_line = true
lines_after_imports = 2
from_first = false
case_sensitive = false
force_sort_within_sections = true
order_by_type = false
# Ignore generated files.
extend_skip_glob = *_pb2.py
# Allow importing multiple classes on a single line from these modules.
# https://google.github.io/styleguide/pyguide#s2.2-imports
single_line_exclusions =
abc,
collections.abc,
typing,

@@ -5,6 +5,5 @@
<pydev_pathproperty name="org.python.pydev.PROJECT_SOURCE_PATH">
<path>/git-repo</path>
</pydev_pathproperty>
<pydev_property name="org.python.pydev.PYTHON_PROJECT_VERSION">python 2.7</pydev_property>
<pydev_property name="org.python.pydev.PYTHON_PROJECT_INTERPRETER">Default</pydev_property>
</pydev_project>

@@ -43,17 +43,12 @@ probably need to split up your commit to finer grained pieces.
Lint any changes by running:
```sh
$ tox -e lint -- file.py
$ flake8
```
And format with:
```sh
$ tox -e format -- file.py
```
Or format everything:
```sh
$ tox -e format
$ black file.py
```
Repo uses [black](https://black.readthedocs.io/) with line length of 80 as its
@@ -73,15 +68,11 @@ the entire project in the included `.flake8` file.
[PEP 8]: https://www.python.org/dev/peps/pep-0008/
[flake8 documentation]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html#in-line-ignoring-errors
## Running tests
We use [pytest](https://pytest.org/) and [tox](https://tox.readthedocs.io/) for
running tests. You should make sure to install those first.
To run the full suite against all supported Python versions, simply execute:
```sh
$ tox -p auto
```
We use [pytest](https://pytest.org/) for running tests. You should make sure to
install that first.
We have [`./run_tests`](./run_tests) which is a simple wrapper around `pytest`:
```sh
@@ -143,7 +134,7 @@ they have right to redistribute your work under the Apache License:
Ensure you have obtained an HTTP password to authenticate:
https://gerrit-review.googlesource.com/new-password
https://www.googlesource.com/new-password
Ensure that you have the local commit hook installed to automatically
add a ChangeId to your commits:

@@ -14,7 +14,7 @@ that you can put anywhere in your path.
* Docs: <https://source.android.com/source/using-repo.html>
* [repo Manifest Format](./docs/manifest-format.md)
* [repo Hooks](./docs/repo-hooks.md)
* [Submitting patches](./SUBMITTING_PATCHES.md)
* [Contributing](./CONTRIBUTING.md)
* Running Repo in [Microsoft Windows](./docs/windows.md)
* GitHub mirror: <https://github.com/GerritCodeReview/git-repo>
* Postsubmit tests: <https://github.com/GerritCodeReview/git-repo/actions>

@@ -399,7 +399,7 @@ class Command:
result = []
if not groups:
groups = manifest.GetGroupsStr()
groups = manifest.GetManifestGroupsStr()
groups = [x for x in re.split(r"[,\s]+", groups) if x]
if not args:

@@ -1,4 +1,4 @@
# Copyright 2021 The Android Open Source Project
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.

@@ -51,6 +51,7 @@ following DTD:
<!ATTLIST default dest-branch CDATA #IMPLIED>
<!ATTLIST default upstream CDATA #IMPLIED>
<!ATTLIST default sync-j CDATA #IMPLIED>
<!ATTLIST default sync-j-max CDATA #IMPLIED>
<!ATTLIST default sync-c CDATA #IMPLIED>
<!ATTLIST default sync-s CDATA #IMPLIED>
<!ATTLIST default sync-tags CDATA #IMPLIED>
@@ -59,7 +60,7 @@ following DTD:
<!ATTLIST manifest-server url CDATA #REQUIRED>
<!ELEMENT submanifest EMPTY>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest remote IDREF #IMPLIED>
<!ATTLIST submanifest project CDATA #IMPLIED>
<!ATTLIST submanifest manifest-name CDATA #IMPLIED>
@@ -81,9 +82,9 @@ following DTD:
<!ATTLIST project sync-c CDATA #IMPLIED>
<!ATTLIST project sync-s CDATA #IMPLIED>
<!ATTLIST project sync-tags CDATA #IMPLIED>
<!ATTLIST project upstream CDATA #IMPLIED>
<!ATTLIST project clone-depth CDATA #IMPLIED>
<!ATTLIST project force-path CDATA #IMPLIED>
<!ELEMENT annotation EMPTY>
<!ATTLIST annotation name CDATA #REQUIRED>
@@ -95,19 +96,21 @@ following DTD:
<!ATTLIST copyfile dest CDATA #REQUIRED>
<!ELEMENT linkfile EMPTY>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile dest CDATA #REQUIRED>
<!ELEMENT extend-project EMPTY>
<!ATTLIST extend-project name CDATA #REQUIRED>
<!ATTLIST extend-project path CDATA #IMPLIED>
<!ATTLIST extend-project dest-path CDATA #IMPLIED>
<!ATTLIST extend-project groups CDATA #IMPLIED>
<!ATTLIST extend-project revision CDATA #IMPLIED>
<!ATTLIST extend-project remote CDATA #IMPLIED>
<!ELEMENT extend-project (annotation*,
copyfile*,
linkfile*)>
<!ATTLIST extend-project dest-branch CDATA #IMPLIED>
<!ATTLIST extend-project upstream CDATA #IMPLIED>
<!ATTLIST extend-project base-rev CDATA #IMPLIED>
<!ELEMENT remove-project EMPTY>
<!ATTLIST remove-project name CDATA #IMPLIED>
@@ -116,7 +119,7 @@ following DTD:
<!ATTLIST remove-project base-rev CDATA #IMPLIED>
<!ELEMENT repo-hooks EMPTY>
<!ATTLIST repo-hooks in-project CDATA #REQUIRED>
<!ATTLIST repo-hooks enabled-list CDATA #REQUIRED>
<!ELEMENT superproject EMPTY>
@@ -125,7 +128,7 @@ following DTD:
<!ATTLIST superproject revision CDATA #IMPLIED>
<!ELEMENT contactinfo EMPTY>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
@@ -211,7 +214,9 @@ can be found. Used when syncing a revision locked manifest in
-c mode to avoid having to sync the entire ref space. Project elements
not setting their own `upstream` will inherit this value.
Attribute `sync-j`: Number of parallel jobs to use when synching.
Attribute `sync-j`: Number of parallel jobs to use when syncing.
Attribute `sync-j-max`: Maximum number of parallel jobs to use when syncing.
Attribute `sync-c`: Set to true to only sync the given Git
branch (specified in the `revision` attribute) rather than the
@@ -285,7 +290,7 @@ should be placed. If not supplied, `revision` is used.
`path` may not be an absolute path or use "." or ".." path components.
Attribute `groups`: List of additional groups to which all projects
Attribute `groups`: Set of additional groups to which all projects
in the included submanifest belong. This appends and recurses, meaning
all projects in submanifests carry all parent submanifest groups.
Same syntax as the corresponding element of `project`.
@@ -353,7 +358,7 @@ When using `repo upload`, changes will be submitted for code
review on this branch. If unspecified both here and in the
default element, `revision` is used instead.
Attribute `groups`: List of groups to which this project belongs,
Attribute `groups`: Set of groups to which this project belongs,
whitespace or comma separated. All projects belong to the group
"all", and each project automatically belongs to a group of
its name:`name` and path:`path`. E.g. for
@@ -393,6 +398,11 @@ attributes of an existing project without completely replacing the
existing project definition. This makes the local manifest more robust
against changes to the original manifest.
The `extend-project` element can also contain `annotation`, `copyfile`, and
`linkfile` child elements. These are added to the project's definition. A
`copyfile` or `linkfile` with a `dest` that already exists in the project
will overwrite the original.
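For illustration, a hypothetical local-manifest entry using these child elements might look like the following (the project name, annotation, and paths here are placeholders, not values from any real manifest):

```xml
<extend-project name="platform/example">
  <!-- Exported as REPO__BUILD_VARIANT during `repo forall`. -->
  <annotation name="BUILD_VARIANT" value="eng" />
  <!-- Copied from the project into the tree during `repo sync`. -->
  <copyfile src="configs/local.cfg" dest="out/local.cfg" />
  <!-- Symlinked at the top of the tree, pointing into the project. -->
  <linkfile src="tools/helper.sh" dest="bin/helper.sh" />
</extend-project>
```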
Attribute `path`: If specified, limit the change to projects checked out
at the specified path, rather than all projects with the given name.
@@ -401,7 +411,7 @@ of the repo client where the Git working directory for this project
should be placed. This is used to move a project in the checkout by
overriding the existing `path` setting.
Attribute `groups`: List of additional groups to which this project
Attribute `groups`: Set of additional groups to which this project
belongs. Same syntax as the corresponding element of `project`.
Attribute `revision`: If specified, overrides the revision of the original
@@ -427,19 +437,20 @@ Same syntax as the corresponding element of `project`.
### Element annotation
Zero or more annotation elements may be specified as children of a
project or remote element. Each element describes a name-value pair.
For projects, this name-value pair will be exported into each project's
environment during a 'forall' command, prefixed with `REPO__`. In addition,
there is an optional attribute "keep" which accepts the case insensitive values
"true" (default) or "false". This attribute determines whether or not the
project element, an extend-project element, or a remote element. Each
element describes a name-value pair. For projects, this name-value pair
will be exported into each project's environment during a 'forall'
command, prefixed with `REPO__`. In addition, there is an optional
attribute "keep" which accepts the case insensitive values "true"
(default) or "false". This attribute determines whether or not the
annotation will be kept when exported with the manifest subcommand.
### Element copyfile
Zero or more copyfile elements may be specified as children of a
project element. Each element describes a src-dest pair of files;
the "src" file will be copied to the "dest" place during `repo sync`
command.
project element, or an extend-project element. Each element describes a
src-dest pair of files; the "src" file will be copied to the "dest"
place during the `repo sync` command.
"src" is project relative, "dest" is relative to the top of the tree.
Copying from paths outside of the project or to paths outside of the repo
@@ -450,10 +461,14 @@ Intermediate paths must not be symlinks either.
Parent directories of "dest" will be automatically created if missing.
The files are copied in the order they are specified in the manifests.
If multiple elements specify the same source and destination, they will
only be applied as one, based on the first occurrence. Files are copied
before any links specified via linkfile elements are created.
### Element linkfile
It's just like copyfile and runs at the same time as copyfile but
instead of copying it creates a symlink.
It's just like copyfile, but instead of copying it creates a symlink.
The symlink is created at "dest" (relative to the top of the tree) and
points to the path specified by "src" which is a path in the project.
@@ -463,6 +478,11 @@ Parent directories of "dest" will be automatically created if missing.
The symlink target may be a file or directory, but it may not point outside
of the repo client.
The links are created in the order they are specified in the manifests.
If multiple elements specify the same source and destination, they will
only be applied as one, based on the first occurrence. Links are created
after any files specified via copyfile elements are copied.
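To make the ordering concrete, a hypothetical project using both element types (name and paths are placeholders) would have its copyfile applied before its linkfile:

```xml
<project name="example/tools" path="tools">
  <!-- Applied first: copied to "dest" at the top of the tree. -->
  <copyfile src="env.sh" dest="build/env.sh" />
  <!-- Applied second: symlink at "dest" pointing into the project. -->
  <linkfile src="scripts" dest="tools-scripts" />
</project>
```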
### Element remove-project
Deletes a project from the internal manifest table, possibly
@@ -560,13 +580,16 @@ the manifest repository's root.
"name" may not be an absolute path or use "." or ".." path components.
These restrictions are not enforced for [Local Manifests].
Attribute `groups`: List of additional groups to which all projects
Attribute `groups`: Set of additional groups to which all projects
in the included manifest belong. This appends and recurses, meaning
all projects in included manifests carry all parent include groups.
This also applies to all extend-project elements in the included manifests.
Same syntax as the corresponding element of `project`.
Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
default to which all projects in the included manifest belong.
default to which all projects in the included manifest belong. This recurses,
meaning it will apply to all projects in all manifests included as a result of
this element.
## Local Manifests {#local-manifests}

View File

@@ -133,3 +133,43 @@ def main(project_list, worktree_list=None, **kwargs):
kwargs: Leave this here for forward-compatibility.
"""
```
### post-sync
This hook runs when `repo sync` completes without errors.
Note: the hook still runs even in cases where no actual checkout occurs.
For example:
- `repo sync -n` performs network fetches only and skips the checkout phase.
- `repo sync <project>` only updates the specified project(s).
- Partial failures may still result in a successful exit.
This hook is useful for post-processing tasks such as setting up git hooks,
bootstrapping configuration files, or running project initialization logic.
The hook is defined using the existing `<repo-hooks>` manifest block and is
optional. If the hook script fails or is missing, `repo sync` will still
complete successfully, and the error will be printed as a warning.
Example:
```xml
<project name="myorg/dev-tools" path="tools" revision="main" />
<repo-hooks in-project="myorg/dev-tools" enabled-list="post-sync">
<hook name="post-sync" />
</repo-hooks>
```
The `post-sync.py` file should be defined like:
```py
def main(repo_topdir=None, **kwargs):
"""Main function invoked directly by repo.
We must use the name "main" as that is what repo requires.
Args:
repo_topdir: The absolute path to the top-level directory of the repo workspace.
kwargs: Leave this here for forward-compatibility.
"""
```
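As a purely illustrative example, a post-sync hook that records the completion time in a marker file at the top of the workspace might look like this (the `.last_sync` file name is invented, not part of repo):

```python
import datetime
import os


def main(repo_topdir=None, **kwargs):
    """Record when the last successful sync finished.

    The ".last_sync" marker file is a hypothetical example.
    """
    if repo_topdir is None:
        return
    marker = os.path.join(repo_topdir, ".last_sync")
    with open(marker, "w") as f:
        f.write(datetime.datetime.now().isoformat() + "\n")
```

If a hook like this raises an exception, `repo sync` reports it as a warning rather than failing the sync, per the behavior described above.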

View File

@@ -50,8 +50,11 @@ Git worktrees (see the previous section for more info).
Repo will use symlinks heavily internally.
On *NIX platforms, this isn't an issue, but Windows makes it a bit difficult.
There are some documents out there for how to do this, but usually the easiest
answer is to run your shell as an Administrator and invoke repo/git in that.
The easiest method to allow users to create symlinks is by enabling
[Windows Developer Mode](https://learn.microsoft.com/en-us/windows/advanced-settings/developer-mode).
The next easiest answer is to run your shell as an Administrator and invoke
repo/git in that.
This isn't a great solution, but Windows doesn't make this easy, so here we are.

View File

@@ -22,7 +22,6 @@ from typing import Any, Optional
from error import GitError
from error import RepoExitError
from git_refs import HEAD
from git_trace2_event_log_base import BaseEventLog
import platform_utils
from repo_logging import RepoLogger
@@ -83,7 +82,7 @@ def RepoSourceVersion():
proj = os.path.dirname(os.path.abspath(__file__))
env[GIT_DIR] = os.path.join(proj, ".git")
result = subprocess.run(
[GIT, "describe", HEAD],
[GIT, "describe", "HEAD"],
stdout=subprocess.PIPE,
stderr=subprocess.DEVNULL,
encoding="utf-8",

View File

@@ -222,6 +222,12 @@ class GitConfig:
value = "true" if value else "false"
self.SetString(name, value)
def SetInt(self, name: str, value: int) -> None:
"""Set an integer value for a key."""
if value is not None:
value = str(value)
self.SetString(name, value)
def GetString(self, name: str, all_keys: bool = False) -> Union[str, None]:
"""Get the first value for a key, or None if it is not defined.

View File

@@ -14,6 +14,7 @@
import os
from git_command import GitCommand
import platform_utils
from repo_trace import Trace
@@ -86,9 +87,8 @@ class GitRefs:
self._symref = {}
self._mtime = {}
self._ReadPackedRefs()
self._ReadLoose("refs/")
self._ReadLoose1(os.path.join(self._gitdir, HEAD), HEAD)
self._ReadRefs()
self._ReadSymbolicRef(HEAD)
scan = self._symref
attempts = 0
@@ -102,64 +102,95 @@ class GitRefs:
scan = scan_next
attempts += 1
def _ReadPackedRefs(self):
path = os.path.join(self._gitdir, "packed-refs")
self._TrackMtime(HEAD)
self._TrackMtime("config")
self._TrackMtime("packed-refs")
self._TrackTreeMtimes("refs")
self._TrackTreeMtimes("reftable")
@staticmethod
def _IsNullRef(ref_id: str) -> bool:
"""Check if a ref_id is a null object ID."""
return ref_id and all(ch == "0" for ch in ref_id)
def _ReadRefs(self) -> None:
"""Read all references using git for-each-ref."""
p = GitCommand(
None,
["for-each-ref", "--format=%(objectname)%00%(refname)%00%(symref)"],
capture_stdout=True,
capture_stderr=True,
bare=True,
gitdir=self._gitdir,
)
if p.Wait() != 0:
return
for line in p.stdout.splitlines():
ref_id, name, symref = line.split("\0")
if symref:
self._symref[name] = symref
elif ref_id and not self._IsNullRef(ref_id):
self._phyref[name] = ref_id
def _ReadSymbolicRef(self, name: str) -> None:
"""Read a symbolic reference."""
p = GitCommand(
None,
["symbolic-ref", "-q", name],
capture_stdout=True,
capture_stderr=True,
bare=True,
gitdir=self._gitdir,
)
if p.Wait() == 0:
ref = p.stdout.strip()
if ref:
self._symref[name] = ref
return
p = GitCommand(
None,
["rev-parse", "--verify", "-q", name],
capture_stdout=True,
capture_stderr=True,
bare=True,
gitdir=self._gitdir,
)
if p.Wait() == 0:
ref_id = p.stdout.strip()
if ref_id:
self._phyref[name] = ref_id
def _TrackMtime(self, name: str) -> None:
"""Track the modification time of a specific gitdir path."""
path = os.path.join(self._gitdir, name)
try:
fd = open(path)
mtime = os.path.getmtime(path)
self._mtime[name] = os.path.getmtime(path)
except OSError:
return
def _TrackTreeMtimes(self, root: str) -> None:
"""Recursively track modification times for a directory tree."""
root_path = os.path.join(self._gitdir, root)
try:
for line in fd:
line = str(line)
if line[0] == "#":
continue
if line[0] == "^":
continue
line = line[:-1]
p = line.split(" ")
ref_id = p[0]
name = p[1]
self._phyref[name] = ref_id
finally:
fd.close()
self._mtime["packed-refs"] = mtime
def _ReadLoose(self, prefix):
base = os.path.join(self._gitdir, prefix)
for name in platform_utils.listdir(base):
p = os.path.join(base, name)
# We don't implement the full ref validation algorithm, just the
# simple rules that would show up in local filesystems.
# https://git-scm.com/docs/git-check-ref-format
if name.startswith(".") or name.endswith(".lock"):
pass
elif platform_utils.isdir(p):
self._mtime[prefix] = os.path.getmtime(base)
self._ReadLoose(prefix + name + "/")
else:
self._ReadLoose1(p, prefix + name)
def _ReadLoose1(self, path, name):
try:
with open(path) as fd:
mtime = os.path.getmtime(path)
ref_id = fd.readline()
except (OSError, UnicodeError):
if not platform_utils.isdir(root_path):
return
except OSError:
return
try:
ref_id = ref_id.decode()
except AttributeError:
pass
if not ref_id:
return
ref_id = ref_id[:-1]
to_scan = [root]
while to_scan:
name = to_scan.pop()
self._TrackMtime(name)
path = os.path.join(self._gitdir, name)
if not platform_utils.isdir(path):
continue
if ref_id.startswith("ref: "):
self._symref[name] = ref_id[5:]
else:
self._phyref[name] = ref_id
self._mtime[name] = mtime
for child in platform_utils.listdir(path):
child_name = os.path.join(name, child)
child_path = os.path.join(self._gitdir, child_name)
if platform_utils.isdir(child_path):
to_scan.append(child_name)
else:
self._TrackMtime(child_name)

View File

@@ -1,5 +1,4 @@
#!/bin/sh
#
# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");

View File

@@ -23,16 +23,20 @@ Examples:
"""
import functools
import glob
import hashlib
import os
import sys
import tempfile
import time
from typing import NamedTuple
import urllib.parse
from git_command import git_require
from git_command import GitCommand
from git_config import RepoConfig
from git_refs import GitRefs
import platform_utils
_SUPERPROJECT_GIT_NAME = "superproject.git"
@@ -128,6 +132,30 @@ class Superproject:
"""Set the _print_messages attribute."""
self._print_messages = value
@property
def commit_id(self):
"""Returns the commit ID of the superproject checkout."""
cmd = ["rev-parse", self.revision]
p = GitCommand(
None, # project
cmd,
gitdir=self._work_git,
bare=True,
capture_stdout=True,
capture_stderr=True,
)
retval = p.Wait()
if retval != 0:
self._LogWarning(
"git rev-parse call failed, command: git {}, "
"return code: {}, stderr: {}",
cmd,
retval,
p.stderr,
)
return None
return p.stdout
@property
def project_commit_ids(self):
"""Returns a dictionary of projects and their commit ids."""
@@ -140,12 +168,33 @@ class Superproject:
self._manifest_path if os.path.exists(self._manifest_path) else None
)
@property
def repo_id(self):
"""Returns the repo ID for the superproject.
For example, if the superproject points to:
https://android-review.googlesource.com/platform/superproject/
Then the repo_id would be:
android/platform/superproject
"""
review_url = self.remote.review
if review_url:
parsed_url = urllib.parse.urlparse(review_url)
netloc = parsed_url.netloc
if netloc:
parts = netloc.split("-review", 1)
host = parts[0]
rev = GitRefs(self._work_git).get("HEAD")
return f"{host}/{self.name}@{rev}"
return None
def _LogMessage(self, fmt, *inputs):
"""Logs message to stderr and _git_event_log."""
message = f"{self._LogMessagePrefix()} {fmt.format(*inputs)}"
if self._print_messages:
print(message, file=sys.stderr)
self._git_event_log.ErrorEvent(message, fmt)
if self._git_event_log:
self._git_event_log.ErrorEvent(message, fmt)
def _LogMessagePrefix(self):
"""Returns the prefix string to be logged in each log message"""
@@ -169,30 +218,63 @@ class Superproject:
"""
if not os.path.exists(self._superproject_path):
os.mkdir(self._superproject_path)
if not self._quiet and not os.path.exists(self._work_git):
if os.path.exists(self._work_git):
return True
if not self._quiet:
print(
"%s: Performing initial setup for superproject; this might "
"take several minutes." % self._work_git
)
cmd = ["init", "--bare", self._work_git_name]
p = GitCommand(
None,
cmd,
cwd=self._superproject_path,
capture_stdout=True,
capture_stderr=True,
tmp_gitdir_prefix = ".tmp-superproject-initgitdir-"
tmp_gitdir = tempfile.mkdtemp(
prefix=tmp_gitdir_prefix,
dir=self._superproject_path,
)
retval = p.Wait()
if retval:
self._LogWarning(
"git init call failed, command: git {}, "
"return code: {}, stderr: {}",
tmp_git_name = os.path.basename(tmp_gitdir)
try:
cmd = ["init", "--bare", tmp_git_name]
p = GitCommand(
None,
cmd,
retval,
p.stderr,
cwd=self._superproject_path,
capture_stdout=True,
capture_stderr=True,
)
return False
return True
retval = p.Wait()
if retval:
self._LogWarning(
"git init call failed, command: git {}, "
"return code: {}, stderr: {}",
cmd,
retval,
p.stderr,
)
return False
platform_utils.rename(tmp_gitdir, self._work_git)
tmp_gitdir = None
return True
finally:
# Clean up the temporary directory created during the process,
# as well as any stale ones left over from previous attempts.
if tmp_gitdir and os.path.exists(tmp_gitdir):
platform_utils.rmtree(tmp_gitdir)
age_threshold = 60 * 60 * 24 # 1 day in seconds
now = time.time()
for tmp_dir in glob.glob(
os.path.join(self._superproject_path, f"{tmp_gitdir_prefix}*")
):
try:
mtime = os.path.getmtime(tmp_dir)
if now - mtime > age_threshold:
platform_utils.rmtree(tmp_dir)
except OSError:
pass
def _Fetch(self):
"""Fetches a superproject for the manifest based on |_remote_url|.
@@ -258,7 +340,7 @@ class Superproject:
Works only in git repositories.
Returns:
data: data returned from 'git ls-tree ...' instead of None.
data: data returned from 'git ls-tree ...'. None on error.
"""
if not os.path.exists(self._work_git):
self._LogWarning(
@@ -288,6 +370,7 @@ class Superproject:
retval,
p.stderr,
)
return None
return data
def Sync(self, git_event_log):
@@ -375,7 +458,8 @@ class Superproject:
)
return None
manifest_str = self._manifest.ToXml(
groups=self._manifest.GetGroupsStr(), omit_local=True
filter_groups=self._manifest.GetManifestGroupsStr(),
omit_local=True,
).toxml()
manifest_path = self._manifest_path
try:

View File

@@ -1,3 +1,19 @@
# Copyright (C) 2020 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Event logging in the git trace2 EVENT format."""
from git_command import GetEventTargetPath
from git_command import RepoSourceVersion
from git_trace2_event_log_base import BaseEventLog

View File

@@ -68,6 +68,7 @@ class BaseEventLog:
global p_init_count
p_init_count += 1
self._log = []
self.verbose = False
# Try to get session-id (sid) from environment (setup in repo launcher).
KEY = "GIT_TRACE2_PARENT_SID"
if env is None:
@@ -309,10 +310,12 @@ class BaseEventLog:
# ignore the attempt and continue to DGRAM below. Otherwise,
# issue a warning.
if err.errno != errno.EPROTOTYPE:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
if self.verbose:
print(
"repo: warning: git trace2 logging failed:",
f"{err}",
file=sys.stderr,
)
return None
if socket_type == socket.SOCK_DGRAM or socket_type is None:
try:
@@ -322,18 +325,20 @@ class BaseEventLog:
self._WriteLog(lambda bs: sock.sendto(bs, path))
return f"af_unix:dgram:{path}"
except OSError as err:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
if self.verbose:
print(
f"repo: warning: git trace2 logging failed: {err}",
file=sys.stderr,
)
return None
# Tried to open a socket but couldn't connect (SOCK_STREAM) or write
# (SOCK_DGRAM).
print(
"repo: warning: git trace2 logging failed: could not write to "
"socket",
file=sys.stderr,
)
if self.verbose:
print(
"repo: warning: git trace2 logging failed: could not "
"write to socket",
file=sys.stderr,
)
return None
# Path is an absolute path
@@ -348,9 +353,10 @@ class BaseEventLog:
self._WriteLog(f.write)
log_path = f.name
except FileExistsError as err:
print(
"repo: warning: git trace2 logging failed: %r" % err,
file=sys.stderr,
)
if self.verbose:
print(
"repo: warning: git trace2 logging failed: %r" % err,
file=sys.stderr,
)
return None
return log_path

View File

@@ -22,6 +22,13 @@ from error import HookError
from git_refs import HEAD
# The API we've documented to hook authors. Keep in sync with repo-hooks.md.
_API_ARGS = {
"pre-upload": {"project_list", "worktree_list"},
"post-sync": {"repo_topdir"},
}
class RepoHook:
"""A RepoHook contains information about a script to run as a hook.
@@ -56,6 +63,7 @@ class RepoHook:
hooks_project,
repo_topdir,
manifest_url,
bug_url=None,
bypass_hooks=False,
allow_all_hooks=False,
ignore_hooks=False,
@@ -75,6 +83,7 @@ class RepoHook:
run with CWD as this directory.
If you have a manifest, this is manifest.topdir.
manifest_url: The URL to the manifest git repo.
bug_url: The URL to report issues.
bypass_hooks: If True, then 'Do not run the hook'.
allow_all_hooks: If True, then 'Run the hook without prompting'.
ignore_hooks: If True, then 'Do not abort action if hooks fail'.
@@ -85,18 +94,18 @@ class RepoHook:
self._hooks_project = hooks_project
self._repo_topdir = repo_topdir
self._manifest_url = manifest_url
self._bug_url = bug_url
self._bypass_hooks = bypass_hooks
self._allow_all_hooks = allow_all_hooks
self._ignore_hooks = ignore_hooks
self._abort_if_user_denies = abort_if_user_denies
# Store the full path to the script for convenience.
if self._hooks_project:
self._script_fullpath = None
if self._hooks_project and self._hooks_project.worktree:
self._script_fullpath = os.path.join(
self._hooks_project.worktree, self._hook_type + ".py"
)
else:
self._script_fullpath = None
def _GetHash(self):
"""Return a hash of the contents of the hooks directory.
@@ -414,11 +423,26 @@ class RepoHook:
ignore the result through the option combinations as listed in
AddHookOptionGroup().
"""
# Make sure our own callers use the documented API.
exp_kwargs = _API_ARGS.get(self._hook_type, set())
got_kwargs = set(kwargs.keys())
if exp_kwargs != got_kwargs:
print(
"repo internal error: "
f"hook '{self._hook_type}' called incorrectly\n"
f" got: {sorted(got_kwargs)}\n"
f" expected: {sorted(exp_kwargs)}\n"
f"Please file a bug: {self._bug_url}",
file=sys.stderr,
)
return False
# Do not do anything in case bypass_hooks is set, or
# no-op if there is no hooks project or if hook is disabled.
if (
self._bypass_hooks
or not self._hooks_project
or not self._script_fullpath
or self._hook_type not in self._hooks_project.enabled_repo_hooks
):
return True
@@ -472,6 +496,7 @@ class RepoHook:
"manifest_url": manifest.manifestProject.GetRemote(
"origin"
).url,
"bug_url": manifest.contactinfo.bugurl,
}
)
return cls(*args, **kwargs)

View File

@@ -1,5 +1,4 @@
#!/usr/bin/env python3
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -338,6 +337,9 @@ class _Repo:
)
return 1
cmd.CommonValidateOptions(copts, cargs)
git_trace2_event_log.verbose = copts.verbose
if gopts.pager is not False and not isinstance(cmd, InteractiveCommand):
config = cmd.client.globalConfig
if gopts.pager:
@@ -360,7 +362,6 @@ class _Repo:
Execute the subcommand.
"""
nonlocal result
cmd.CommonValidateOptions(copts, cargs)
cmd.ValidateOptions(copts, cargs)
this_manifest_only = copts.this_manifest_only

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "December 2024" "repo gc" "Repo Manual"
.TH REPO "1" "April 2025" "repo gc" "Repo Manual"
.SH NAME
repo \- repo gc - manual page for repo gc
.SH SYNOPSIS
@@ -8,7 +8,7 @@ repo \- repo gc - manual page for repo gc
.SH DESCRIPTION
Summary
.PP
Cleaning up internal repo state.
Cleaning up internal repo and Git state.
.SH OPTIONS
.TP
\fB\-h\fR, \fB\-\-help\fR
@@ -19,6 +19,10 @@ do everything except actually delete
.TP
\fB\-y\fR, \fB\-\-yes\fR
answer yes to all safe prompts
.TP
\fB\-\-repack\fR
repack all projects that use partial clone with
filter=blob:none
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR

View File

@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "December 2024" "repo manifest" "Repo Manual"
.TH REPO "1" "March 2026" "repo manifest" "Repo Manual"
.SH NAME
repo \- repo manifest - manual page for repo manifest
.SH SYNOPSIS
@@ -30,8 +30,8 @@ if in \fB\-r\fR mode, do not write the dest\-branch field
(only of use if the branch names for a sha1 manifest
are sensitive)
.TP
\fB\-\-json\fR
output manifest in JSON format (experimental)
\fB\-\-format\fR=\fI\,FORMAT\/\fR
output format: xml, json (default: xml)
.TP
\fB\-\-pretty\fR
format output for humans to read
@@ -78,6 +78,10 @@ set to the ref we were on when the manifest was generated. The 'dest\-branch'
attribute is set to indicate the remote ref to push changes to via 'repo
upload'.
.PP
Multiple output formats are supported via \fB\-\-format\fR. The default output is XML,
and formats are generally "condensed". Use \fB\-\-pretty\fR for more human\-readable
variations.
.PP
repo Manifest Format
.PP
A repo manifest describes the structure of a repo client; that is the
@@ -127,6 +131,7 @@ include*)>
<!ATTLIST default dest\-branch CDATA #IMPLIED>
<!ATTLIST default upstream CDATA #IMPLIED>
<!ATTLIST default sync\-j CDATA #IMPLIED>
<!ATTLIST default sync\-j\-max CDATA #IMPLIED>
<!ATTLIST default sync\-c CDATA #IMPLIED>
<!ATTLIST default sync\-s CDATA #IMPLIED>
<!ATTLIST default sync\-tags CDATA #IMPLIED>
@@ -135,7 +140,7 @@ include*)>
<!ATTLIST manifest\-server url CDATA #REQUIRED>
.IP
<!ELEMENT submanifest EMPTY>
<!ATTLIST submanifest name ID #REQUIRED>
<!ATTLIST submanifest remote IDREF #IMPLIED>
<!ATTLIST submanifest project CDATA #IMPLIED>
<!ATTLIST submanifest manifest\-name CDATA #IMPLIED>
@@ -166,9 +171,9 @@ CDATA #IMPLIED>
<!ATTLIST project sync\-c CDATA #IMPLIED>
<!ATTLIST project sync\-s CDATA #IMPLIED>
<!ATTLIST project sync\-tags CDATA #IMPLIED>
<!ATTLIST project upstream CDATA #IMPLIED>
<!ATTLIST project clone\-depth CDATA #IMPLIED>
<!ATTLIST project force\-path CDATA #IMPLIED>
.IP
<!ELEMENT annotation EMPTY>
<!ATTLIST annotation name CDATA #REQUIRED>
@@ -180,19 +185,34 @@ CDATA #IMPLIED>
<!ATTLIST copyfile dest CDATA #REQUIRED>
.IP
<!ELEMENT linkfile EMPTY>
<!ATTLIST linkfile src CDATA #REQUIRED>
<!ATTLIST linkfile dest CDATA #REQUIRED>
.TP
<!ELEMENT extend\-project (annotation*,
copyfile*,
linkfile*)>
.TP
<!ATTLIST extend\-project name
CDATA #REQUIRED>
.TP
<!ATTLIST extend\-project path
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project dest\-path
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project groups
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project revision
CDATA #IMPLIED>
.TP
<!ATTLIST extend\-project remote
CDATA #IMPLIED>
.IP
<!ELEMENT extend\-project EMPTY>
<!ATTLIST extend\-project name CDATA #REQUIRED>
<!ATTLIST extend\-project path CDATA #IMPLIED>
<!ATTLIST extend\-project dest\-path CDATA #IMPLIED>
<!ATTLIST extend\-project groups CDATA #IMPLIED>
<!ATTLIST extend\-project revision CDATA #IMPLIED>
<!ATTLIST extend\-project remote CDATA #IMPLIED>
<!ATTLIST extend\-project dest\-branch CDATA #IMPLIED>
<!ATTLIST extend\-project upstream CDATA #IMPLIED>
<!ATTLIST extend\-project base\-rev CDATA #IMPLIED>
.IP
<!ELEMENT remove\-project EMPTY>
<!ATTLIST remove\-project name CDATA #IMPLIED>
@@ -201,7 +221,7 @@ CDATA #IMPLIED>
<!ATTLIST remove\-project base\-rev CDATA #IMPLIED>
.IP
<!ELEMENT repo\-hooks EMPTY>
<!ATTLIST repo\-hooks in\-project CDATA #REQUIRED>
<!ATTLIST repo\-hooks enabled\-list CDATA #REQUIRED>
.IP
<!ELEMENT superproject EMPTY>
@@ -210,7 +230,7 @@ CDATA #IMPLIED>
<!ATTLIST superproject revision CDATA #IMPLIED>
.IP
<!ELEMENT contactinfo EMPTY>
<!ATTLIST contactinfo bugurl CDATA #REQUIRED>
.IP
<!ELEMENT include EMPTY>
<!ATTLIST include name CDATA #REQUIRED>
@@ -290,7 +310,9 @@ when syncing a revision locked manifest in \fB\-c\fR mode to avoid having to syn
entire ref space. Project elements not setting their own `upstream` will inherit
this value.
.PP
Attribute `sync\-j`: Number of parallel jobs to use when synching.
Attribute `sync\-j`: Number of parallel jobs to use when syncing.
.PP
Attribute `sync\-j\-max`: Maximum number of parallel jobs to use when syncing.
.PP
Attribute `sync\-c`: Set to true to only sync the given Git branch (specified in
the `revision` attribute) rather than the whole ref space. Project elements
@@ -306,25 +328,7 @@ Element manifest\-server
At most one manifest\-server may be specified. The url attribute is used to
specify the URL of a manifest server, which is an XML RPC service.
.PP
The manifest server should implement the following RPC methods:
.IP
GetApprovedManifest(branch, target)
.PP
Return a manifest in which each project is pegged to a known good revision for
the current branch and target. This is used by repo sync when the \fB\-\-smart\-sync\fR
option is given.
.PP
The target to use is defined by environment variables TARGET_PRODUCT and
TARGET_BUILD_VARIANT. These variables are used to create a string of the form
$TARGET_PRODUCT\-$TARGET_BUILD_VARIANT, e.g. passion\-userdebug. If one of those
variables or both are not present, the program will call GetApprovedManifest
without the target parameter and the manifest server should choose a reasonable
default target.
.IP
GetManifest(tag)
.PP
Return a manifest in which each project is pegged to the revision at the
specified tag. This is used by repo sync when the \fB\-\-smart\-tag\fR option is given.
See the [smart sync documentation](./smart\-sync.md) for more details.
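The RPC flow above can be sketched with Python's standard `xmlrpc.client`. This is a minimal illustration, not repo's actual implementation; the server URL and branch name are placeholders supplied by the `manifest-server` element and the checkout:

```python
import os
import xmlrpc.client


def smart_sync_target():
    """Build the $TARGET_PRODUCT-$TARGET_BUILD_VARIANT string described
    above (e.g. "passion-userdebug"), or None when either variable is
    unset, in which case the server picks a reasonable default target."""
    product = os.environ.get("TARGET_PRODUCT")
    variant = os.environ.get("TARGET_BUILD_VARIANT")
    if product and variant:
        return f"{product}-{variant}"
    return None


def fetch_approved_manifest(url, branch):
    """Call GetApprovedManifest on an XML RPC manifest server.

    `url` is a placeholder; repo takes it from the manifest-server
    element's url attribute."""
    server = xmlrpc.client.ServerProxy(url)
    target = smart_sync_target()
    if target is None:
        return server.GetApprovedManifest(branch)
    return server.GetApprovedManifest(branch, target)
```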
.PP
Element submanifest
.PP
@@ -376,7 +380,7 @@ supplied, `revision` is used.
.PP
`path` may not be an absolute path or use "." or ".." path components.
.PP
Attribute `groups`: List of additional groups to which all projects in the
Attribute `groups`: Set of additional groups to which all projects in the
included submanifest belong. This appends and recurses, meaning all projects in
submanifests carry all parent submanifest groups. Same syntax as the
corresponding element of `project`.
@@ -438,7 +442,7 @@ Attribute `dest\-branch`: Name of a Git branch (e.g. `main`). When using `repo
upload`, changes will be submitted for code review on this branch. If
unspecified both here and in the default element, `revision` is used instead.
.PP
Attribute `groups`: List of groups to which this project belongs, whitespace or
Attribute `groups`: Set of groups to which this project belongs, whitespace or
comma separated. All projects belong to the group "all", and each project
automatically belongs to a group of its name:`name` and path:`path`. E.g. for
`<project name="monkeys" path="barrel\-of"/>`, that project definition is
@@ -474,6 +478,11 @@ of an existing project without completely replacing the existing project
definition. This makes the local manifest more robust against changes to the
original manifest.
.PP
The `extend\-project` element can also contain `annotation`, `copyfile`, and
`linkfile` child elements. These are added to the project's definition. A
`copyfile` or `linkfile` with a `dest` that already exists in the project will
overwrite the original.
.PP
Attribute `path`: If specified, limit the change to projects checked out at the
specified path, rather than all projects with the given name.
.PP
@@ -482,8 +491,8 @@ repo client where the Git working directory for this project should be placed.
This is used to move a project in the checkout by overriding the existing `path`
setting.
.PP
Attribute `groups`: List of additional groups to which this project belongs.
Same syntax as the corresponding element of `project`.
Attribute `groups`: Set of additional groups to which this project belongs. Same
syntax as the corresponding element of `project`.
.PP
Attribute `revision`: If specified, overrides the revision of the original
project. Same syntax as the corresponding element of `project`.
@@ -507,19 +516,21 @@ element of `project`.
.PP
Element annotation
.PP
Zero or more annotation elements may be specified as children of a project or
remote element. Each element describes a name\-value pair. For projects, this
name\-value pair will be exported into each project's environment during a
\&'forall' command, prefixed with `REPO__`. In addition, there is an optional
attribute "keep" which accepts the case insensitive values "true" (default) or
"false". This attribute determines whether or not the annotation will be kept
when exported with the manifest subcommand.
Zero or more annotation elements may be specified as children of a project
element, an extend\-project element, or a remote element. Each element describes
a name\-value pair. For projects, this name\-value pair will be exported into each
project's environment during a 'forall' command, prefixed with `REPO__`. In
addition, there is an optional attribute "keep" which accepts the case
insensitive values "true" (default) or "false". This attribute determines
whether or not the annotation will be kept when exported with the manifest
subcommand.
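The two behaviors above (the `REPO__` environment export for 'forall', and the `keep` filter applied by the manifest subcommand) can be sketched as follows; the function names and tuple shapes are illustrative, not repo's internal API:

```python
def forall_env(annotations):
    """Environment additions for `repo forall`: each annotation
    name-value pair becomes a REPO__-prefixed variable.

    annotations: iterable of (name, value) pairs."""
    return {f"REPO__{name}": value for name, value in annotations}


def kept_annotations(annotations):
    """Annotations retained when exporting via the manifest subcommand.

    `keep` is the case-insensitive attribute value; an empty/missing
    value defaults to "true"."""
    return [
        (name, value)
        for name, value, keep in annotations
        if (keep or "true").lower() == "true"
    ]
```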
.PP
Element copyfile
.PP
Zero or more copyfile elements may be specified as children of a project
element. Each element describes a src\-dest pair of files; the "src" file will be
copied to the "dest" place during `repo sync` command.
element, or an extend\-project element. Each element describes a src\-dest pair of
files; the "src" file will be copied to the "dest" place during `repo sync`
command.
.PP
"src" is project relative, "dest" is relative to the top of the tree. Copying
from paths outside of the project or to paths outside of the repo client is not
@@ -530,10 +541,14 @@ Intermediate paths must not be symlinks either.
.PP
Parent directories of "dest" will be automatically created if missing.
.PP
The files are copied in the order they are specified in the manifests. If
multiple elements specify the same source and destination, they will only be
applied as one, based on the first occurrence. Files are copied before any links
specified via linkfile elements are created.
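The first-occurrence deduplication described above can be sketched as a small order-preserving filter (an illustration of the stated behavior, not repo's internal code; `linkfile` elements are deduplicated the same way):

```python
def dedupe_file_ops(entries):
    """Keep only the first occurrence of each (src, dest) pair,
    preserving manifest order."""
    seen = set()
    result = []
    for pair in entries:
        if pair not in seen:
            seen.add(pair)
            result.append(pair)
    return result
```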
.PP
Element linkfile
.PP
It's just like copyfile and runs at the same time as copyfile but instead of
copying it creates a symlink.
It's just like copyfile, but instead of copying it creates a symlink.
.PP
The symlink is created at "dest" (relative to the top of the tree) and points to
the path specified by "src" which is a path in the project.
@@ -543,6 +558,11 @@ Parent directories of "dest" will be automatically created if missing.
The symlink target may be a file or directory, but it may not point outside of
the repo client.
.PP
The links are created in the order they are specified in the manifests. If
multiple elements specify the same source and destination, they will only be
applied as one, based on the first occurrence. Links are created after any files
specified via copyfile elements are copied.
.PP
Element remove\-project
.PP
Deletes a project from the internal manifest table, possibly allowing a
@@ -634,13 +654,16 @@ repository's root.
"name" may not be an absolute path or use "." or ".." path components. These
restrictions are not enforced for [Local Manifests].
.PP
Attribute `groups`: List of additional groups to which all projects in the
Attribute `groups`: Set of additional groups to which all projects in the
included manifest belong. This appends and recurses, meaning all projects in
included manifests carry all parent include groups. Same syntax as the
included manifests carry all parent include groups. This also applies to all
extend\-project elements in the included manifests. Same syntax as the
corresponding element of `project`.
.PP
Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`)
default to which all projects in the included manifest belong.
default to which all projects in the included manifest belong. This recurses,
meaning it will apply to all projects in all manifests included as a result of
this element.
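The recursive revision default above amounts to: any `project` or `include` node without its own `revision` inherits the including element's. A minimal sketch (operating on plain attribute dicts for illustration):

```python
def inherit_revision(attrs, include_revision):
    """Apply the include element's revision to a child node's
    attributes unless the child sets its own."""
    if include_revision and "revision" not in attrs:
        return {**attrs, "revision": include_revision}
    return attrs
```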
.PP
Local Manifests
.PP


@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "September 2024" "repo smartsync" "Repo Manual"
.TH REPO "1" "August 2025" "repo smartsync" "Repo Manual"
.SH NAME
repo \- repo smartsync - manual page for repo smartsync
.SH SYNOPSIS
@@ -20,11 +20,11 @@ number of CPU cores)
.TP
\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
number of network jobs to run in parallel (defaults to
\fB\-\-jobs\fR or 1)
\fB\-\-jobs\fR or 1). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
number of local checkout jobs to run in parallel
(defaults to \fB\-\-jobs\fR or 8)
(defaults to \fB\-\-jobs\fR or 8). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-f\fR, \fB\-\-force\-broken\fR
obsolete option (to be deleted in the future)
@@ -58,6 +58,12 @@ only update working tree, don't fetch
use the existing manifest checkout as\-is. (do not
update to the latest revision)
.TP
\fB\-\-interleaved\fR
fetch and checkout projects in parallel (default)
.TP
\fB\-\-no\-interleaved\fR
fetch and checkout projects in phases
.TP
\fB\-n\fR, \fB\-\-network\-only\fR
fetch only, don't update working tree
.TP
@@ -145,6 +151,16 @@ operate on this manifest and its submanifests
.TP
\fB\-\-no\-repo\-verify\fR
do not verify repo source code
.SS post\-sync hooks:
.TP
\fB\-\-no\-verify\fR
Do not run the post\-sync hook.
.TP
\fB\-\-verify\fR
Run the post\-sync hook without prompting.
.TP
\fB\-\-ignore\-hooks\fR
Do not abort if post\-sync hooks fail.
.PP
Run `repo help smartsync` to view the detailed manual.
.SH DETAILS


@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "September 2024" "repo sync" "Repo Manual"
.TH REPO "1" "August 2025" "repo sync" "Repo Manual"
.SH NAME
repo \- repo sync - manual page for repo sync
.SH SYNOPSIS
@@ -20,11 +20,11 @@ number of CPU cores)
.TP
\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
number of network jobs to run in parallel (defaults to
\fB\-\-jobs\fR or 1)
\fB\-\-jobs\fR or 1). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
number of local checkout jobs to run in parallel
(defaults to \fB\-\-jobs\fR or 8)
(defaults to \fB\-\-jobs\fR or 8). Ignored unless \fB\-\-no\-interleaved\fR is set
.TP
\fB\-f\fR, \fB\-\-force\-broken\fR
obsolete option (to be deleted in the future)
@@ -58,6 +58,12 @@ only update working tree, don't fetch
use the existing manifest checkout as\-is. (do not
update to the latest revision)
.TP
\fB\-\-interleaved\fR
fetch and checkout projects in parallel (default)
.TP
\fB\-\-no\-interleaved\fR
fetch and checkout projects in phases
.TP
\fB\-n\fR, \fB\-\-network\-only\fR
fetch only, don't update working tree
.TP
@@ -152,6 +158,16 @@ operate on this manifest and its submanifests
.TP
\fB\-\-no\-repo\-verify\fR
do not verify repo source code
.SS post\-sync hooks:
.TP
\fB\-\-no\-verify\fR
Do not run the post\-sync hook.
.TP
\fB\-\-verify\fR
Run the post\-sync hook without prompting.
.TP
\fB\-\-ignore\-hooks\fR
Do not abort if post\-sync hooks fail.
.PP
Run `repo help sync` to view the detailed manual.
.SH DETAILS

man/repo-wipe.1 Normal file

@@ -0,0 +1,61 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "November 2025" "repo wipe" "Repo Manual"
.SH NAME
repo \- repo wipe - manual page for repo wipe
.SH SYNOPSIS
.B repo
\fI\,wipe <project>\/\fR...
.SH DESCRIPTION
Summary
.PP
Wipe projects from the worktree
.SH OPTIONS
.TP
\fB\-h\fR, \fB\-\-help\fR
show this help message and exit
.TP
\fB\-f\fR, \fB\-\-force\fR
force wipe shared projects and uncommitted changes
.TP
\fB\-\-force\-uncommitted\fR
force wipe even if there are uncommitted changes
.TP
\fB\-\-force\-shared\fR
force wipe even if the project shares an object
directory
.SS Logging options:
.TP
\fB\-v\fR, \fB\-\-verbose\fR
show all output
.TP
\fB\-q\fR, \fB\-\-quiet\fR
only show errors
.SS Multi\-manifest options:
.TP
\fB\-\-outer\-manifest\fR
operate starting at the outermost manifest
.TP
\fB\-\-no\-outer\-manifest\fR
do not operate on outer manifests
.TP
\fB\-\-this\-manifest\-only\fR
only operate on this (sub)manifest
.TP
\fB\-\-no\-this\-manifest\-only\fR, \fB\-\-all\-manifests\fR
operate on this manifest and its submanifests
.PP
Run `repo help wipe` to view the detailed manual.
.SH DETAILS
.PP
The 'repo wipe' command removes the specified projects from the worktree (the
checked out source code) and deletes the project's git data from `.repo`.
.PP
This is a destructive operation and cannot be undone.
.PP
Projects can be specified either by name, or by a relative or absolute path to
the project's local directory.
.SH EXAMPLES
.SS # Wipe the project "platform/build" by name:
$ repo wipe platform/build
.SS # Wipe the project at the path "build/make":
$ repo wipe build/make


@@ -1,5 +1,5 @@
.\" DO NOT MODIFY THIS FILE! It was generated by help2man.
.TH REPO "1" "December 2024" "repo" "Repo Manual"
.TH REPO "1" "November 2025" "repo" "Repo Manual"
.SH NAME
repo \- repository management tool built on top of git
.SH SYNOPSIS
@@ -80,7 +80,7 @@ forall
Run a shell command in each project
.TP
gc
Cleaning up internal repo state.
Cleaning up internal repo and Git state.
.TP
grep
Print lines matching a pattern
@@ -132,6 +132,9 @@ Upload changes for code review
.TP
version
Display the version of repo
.TP
wipe
Wipe projects from the worktree
.PP
See 'repo help <command>' for more information on a specific command.
Bug reports: https://issues.gerritcodereview.com/issues/new?component=1370071


@@ -155,6 +155,7 @@ class _Default:
upstreamExpr = None
remote = None
sync_j = None
sync_j_max = None
sync_c = False
sync_s = False
sync_tags = True
@@ -255,7 +256,7 @@ class _XmlSubmanifest:
project: a string, the name of the manifest project.
revision: a string, the commitish.
manifestName: a string, the submanifest file name.
groups: a list of strings, the groups to add to all projects in the
groups: a set of strings, the groups to add to all projects in the
submanifest.
default_groups: a list of strings, the default groups to sync.
path: a string, the relative path for the submanifest checkout.
@@ -281,7 +282,7 @@ class _XmlSubmanifest:
self.project = project
self.revision = revision
self.manifestName = manifestName
self.groups = groups
self.groups = groups or set()
self.default_groups = default_groups
self.path = path
self.parent = parent
@@ -304,7 +305,7 @@ class _XmlSubmanifest:
self.repo_client = RepoClient(
parent.repodir,
linkFile,
parent_groups=",".join(groups) or "",
parent_groups=groups,
submanifest_path=os.path.join(parent.path_prefix, self.relpath),
outer_client=outer_client,
default_groups=default_groups,
@@ -345,7 +346,7 @@ class _XmlSubmanifest:
manifestName = self.manifestName or "default.xml"
revision = self.revision or self.name
path = self.path or revision.split("/")[-1]
groups = self.groups or []
groups = self.groups
return SubmanifestSpec(
self.name, manifestUrl, manifestName, revision, path, groups
@@ -359,9 +360,7 @@ class _XmlSubmanifest:
def GetGroupsStr(self):
"""Returns the `groups` given for this submanifest."""
if self.groups:
return ",".join(self.groups)
return ""
return ",".join(sorted(self.groups))
def GetDefaultGroupsStr(self):
"""Returns the `default-groups` given for this submanifest."""
@@ -381,7 +380,7 @@ class SubmanifestSpec:
self.manifestName = manifestName
self.revision = revision
self.path = path
self.groups = groups or []
self.groups = groups
class XmlManifest:
@@ -393,7 +392,7 @@ class XmlManifest:
manifest_file,
local_manifests=None,
outer_client=None,
parent_groups="",
parent_groups=None,
submanifest_path="",
default_groups=None,
):
@@ -409,7 +408,8 @@ class XmlManifest:
manifests. This will usually be
|repodir|/|LOCAL_MANIFESTS_DIR_NAME|.
outer_client: RepoClient of the outer manifest.
parent_groups: a string, the groups to apply to this projects.
parent_groups: a set of strings, the groups to apply to this
manifest.
submanifest_path: The submanifest root relative to the repo root.
default_groups: a string, the default manifest groups to use.
"""
@@ -432,7 +432,7 @@ class XmlManifest:
self.manifestFileOverrides = {}
self.local_manifests = local_manifests
self._load_local_manifests = True
self.parent_groups = parent_groups
self.parent_groups = parent_groups or set()
self.default_groups = default_groups
if submanifest_path and not outer_client:
@@ -567,21 +567,29 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
"""
return [x for x in re.split(r"[,\s]+", field) if x]
def _ParseSet(self, field):
"""Parse fields that contain flattened sets.
These are whitespace & comma separated. Empty elements will be
discarded.
"""
return set(self._ParseList(field))
def ToXml(
self,
peg_rev=False,
peg_rev_upstream=True,
peg_rev_dest_branch=True,
groups=None,
filter_groups=None,
omit_local=False,
):
"""Return the current manifest XML."""
mp = self.manifestProject
if groups is None:
groups = mp.manifest_groups
if groups:
groups = self._ParseList(groups)
if filter_groups is None:
filter_groups = mp.manifest_groups
if filter_groups:
filter_groups = self._ParseList(filter_groups)
doc = xml.dom.minidom.Document()
root = doc.createElement("manifest")
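The field-splitting behavior of `_ParseList`/`_ParseSet` shown in the hunk above can be demonstrated standalone (same regex, free functions instead of methods):

```python
import re


def parse_list(field):
    """Split a whitespace & comma separated manifest field;
    empty elements are discarded."""
    return [x for x in re.split(r"[,\s]+", field) if x]


def parse_set(field):
    """Set form used for groups fields."""
    return set(parse_list(field))
```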
@@ -624,6 +632,9 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if d.sync_j is not None:
have_default = True
e.setAttribute("sync-j", "%d" % d.sync_j)
if d.sync_j_max is not None:
have_default = True
e.setAttribute("sync-j-max", "%d" % d.sync_j_max)
if d.sync_c:
have_default = True
e.setAttribute("sync-c", "true")
@@ -654,7 +665,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
output_project(parent, parent_node, project)
def output_project(parent, parent_node, p):
if not p.MatchesGroups(groups):
if not p.MatchesGroups(filter_groups):
return
if omit_local and self.IsFromLocalManifest(p):
@@ -725,10 +736,9 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
le.setAttribute("dest", lf.dest)
e.appendChild(le)
default_groups = ["all", "name:%s" % p.name, "path:%s" % p.relpath]
egroups = [g for g in p.groups if g not in default_groups]
if egroups:
e.setAttribute("groups", ",".join(egroups))
groups = p.groups - {"all", f"name:{p.name}", f"path:{p.relpath}"}
if groups:
e.setAttribute("groups", ",".join(sorted(groups)))
for a in p.annotations:
if a.keep == "true":
@@ -1116,7 +1126,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups += f",platform-{platform.system().lower()}"
return groups
def GetGroupsStr(self):
def GetManifestGroupsStr(self):
"""Returns the manifest group string that should be synced."""
return (
self.manifestProject.manifest_groups or self.GetDefaultGroupsStr()
@@ -1171,12 +1181,12 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
b = b[len(R_HEADS) :]
self.branch = b
parent_groups = self.parent_groups
parent_groups = self.parent_groups.copy()
if self.path_prefix:
parent_groups = (
parent_groups |= {
f"{SUBMANIFEST_GROUP_PREFIX}:path:"
f"{self.path_prefix},{parent_groups}"
)
f"{self.path_prefix}"
}
# The manifestFile was specified by the user which is why we
# allow include paths to point anywhere.
@@ -1202,16 +1212,16 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
# Since local manifests are entirely managed by
# the user, allow them to point anywhere the
# user wants.
local_group = (
local_group = {
f"{LOCAL_MANIFEST_GROUP_PREFIX}:"
f"{local_file[:-4]}"
)
}
nodes.append(
self._ParseManifestXml(
local,
self.subdir,
parent_groups=(
f"{local_group},{parent_groups}"
local_group | parent_groups
),
restrict_includes=False,
)
@@ -1262,7 +1272,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
self,
path,
include_root,
parent_groups="",
parent_groups=None,
restrict_includes=True,
parent_node=None,
):
@@ -1271,11 +1281,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
Args:
path: The XML file to read & parse.
include_root: The path to interpret include "name"s relative to.
parent_groups: The groups to apply to this projects.
parent_groups: The set of groups to apply to this manifest.
restrict_includes: Whether to constrain the "name" attribute of
includes.
parent_node: The parent include node, to apply attribute to this
projects.
parent_node: The parent include node, to apply attributes to this
manifest.
Returns:
List of XML nodes.
@@ -1299,6 +1309,14 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
nodes = []
for node in manifest.childNodes:
if (
parent_node
and node.nodeName in ("include", "project")
and not node.hasAttribute("revision")
):
node.setAttribute(
"revision", parent_node.getAttribute("revision")
)
if node.nodeName == "include":
name = self._reqatt(node, "name")
if restrict_includes:
@@ -1307,12 +1325,10 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
raise ManifestInvalidPathError(
f'<include> invalid "name": {name}: {msg}'
)
include_groups = ""
if parent_groups:
include_groups = parent_groups
include_groups = (parent_groups or set()).copy()
if node.hasAttribute("groups"):
include_groups = (
node.getAttribute("groups") + "," + include_groups
include_groups |= self._ParseSet(
node.getAttribute("groups")
)
fp = os.path.join(include_root, name)
if not os.path.isfile(fp):
@@ -1328,33 +1344,23 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
)
# should isolate this to the exact exception, but that's
# tricky. actual parsing implementation may vary.
except (
KeyboardInterrupt,
RuntimeError,
SystemExit,
ManifestParseError,
):
except (RuntimeError, ManifestParseError):
raise
except Exception as e:
raise ManifestParseError(
f"failed parsing included manifest {name}: {e}"
)
else:
if parent_groups and node.nodeName == "project":
nodeGroups = parent_groups
if node.hasAttribute("groups"):
nodeGroups = (
node.getAttribute("groups") + "," + nodeGroups
)
node.setAttribute("groups", nodeGroups)
if (
parent_node
and node.nodeName == "project"
and not node.hasAttribute("revision")
if parent_groups and node.nodeName in (
"project",
"extend-project",
):
node.setAttribute(
"revision", parent_node.getAttribute("revision")
)
nodeGroups = parent_groups.copy()
if node.hasAttribute("groups"):
nodeGroups |= self._ParseSet(
node.getAttribute("groups")
)
node.setAttribute("groups", ",".join(sorted(nodeGroups)))
nodes.append(node)
return nodes
@@ -1463,7 +1469,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
dest_path = node.getAttribute("dest-path")
groups = node.getAttribute("groups")
if groups:
groups = self._ParseList(groups)
groups = self._ParseSet(groups or "")
revision = node.getAttribute("revision")
remote_name = node.getAttribute("remote")
if not remote_name:
@@ -1484,7 +1490,15 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
if path and p.relpath != path:
continue
if groups:
p.groups.extend(groups)
p.groups |= groups
# Drop local groups so we don't mistakenly omit this
# project from the superproject override manifest.
p.groups = {
g
for g in p.groups
if not g.startswith(LOCAL_MANIFEST_GROUP_PREFIX)
}
if revision:
if base_revision:
if p.revisionExpr != base_revision:
@@ -1514,6 +1528,14 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
p.UpdatePaths(relpath, worktree, gitdir, objdir)
self._paths[p.relpath] = p
for n in node.childNodes:
if n.nodeName == "copyfile":
self._ParseCopyFile(p, n)
elif n.nodeName == "linkfile":
self._ParseLinkFile(p, n)
elif n.nodeName == "annotation":
self._ParseAnnotation(p, n)
if node.nodeName == "repo-hooks":
# Only one project can be the hooks project
if repo_hooks_project is not None:
@@ -1745,6 +1767,13 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
% (self.manifestFile, d.sync_j)
)
d.sync_j_max = XmlInt(node, "sync-j-max", None)
if d.sync_j_max is not None and d.sync_j_max <= 0:
raise ManifestParseError(
'%s: sync-j-max must be greater than 0, not "%s"'
% (self.manifestFile, d.sync_j_max)
)
d.sync_c = XmlBool(node, "sync-c", False)
d.sync_s = XmlBool(node, "sync-s", False)
d.sync_tags = XmlBool(node, "sync-tags", True)
@@ -1807,7 +1836,7 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
groups = ""
if node.hasAttribute("groups"):
groups = node.getAttribute("groups")
groups = self._ParseList(groups)
groups = self._ParseSet(groups)
default_groups = self._ParseList(node.getAttribute("default-groups"))
path = node.getAttribute("path")
if path == "":
@@ -1916,11 +1945,6 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
upstream = node.getAttribute("upstream") or self._default.upstreamExpr
groups = ""
if node.hasAttribute("groups"):
groups = node.getAttribute("groups")
groups = self._ParseList(groups)
if parent is None:
(
relpath,
@@ -1935,8 +1959,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
parent, name, path
)
default_groups = ["all", "name:%s" % name, "path:%s" % relpath]
groups.extend(set(default_groups).difference(groups))
groups = ""
if node.hasAttribute("groups"):
groups = node.getAttribute("groups")
groups = self._ParseSet(groups)
groups |= {"all", f"name:{name}", f"path:{relpath}"}
if self.IsMirror and node.hasAttribute("force-path"):
if XmlBool(node, "force-path", False):
@@ -1968,11 +1995,11 @@ https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
for n in node.childNodes:
if n.nodeName == "copyfile":
self._ParseCopyFile(project, n)
if n.nodeName == "linkfile":
elif n.nodeName == "linkfile":
self._ParseLinkFile(project, n)
if n.nodeName == "annotation":
elif n.nodeName == "annotation":
self._ParseAnnotation(project, n)
if n.nodeName == "project":
elif n.nodeName == "project":
project.subprojects.append(
self._ParseProject(n, parent=project)
)


@@ -25,7 +25,10 @@ except ImportError:
from repo_trace import IsTraceToStderr
_TTY = sys.stderr.isatty()
# Capture the original stderr stream. We use this exclusively for progress
# updates to ensure we talk to the terminal even if stderr is redirected.
_STDERR = sys.stderr
_TTY = _STDERR.isatty()
# This will erase all content in the current line (wherever the cursor is).
# It does not move the cursor, so this is usually followed by \r to move to
@@ -101,6 +104,7 @@ class Progress:
self._units = units
self._elide = elide and _TTY
self._quiet = quiet
self._ended = False
# Only show the active jobs section if we run more than one in parallel.
self._show_jobs = False
@@ -118,6 +122,11 @@ class Progress:
if not quiet and show_elapsed:
self._update_thread.start()
def update_total(self, new_total):
"""Updates the total if the new total is larger."""
if new_total > self._total:
self._total = new_total
def _update_loop(self):
while True:
self.update(inc=0)
@@ -127,11 +136,11 @@ class Progress:
def _write(self, s):
s = "\r" + s
if self._elide:
col = os.get_terminal_size(sys.stderr.fileno()).columns
col = os.get_terminal_size(_STDERR.fileno()).columns
if len(s) > col:
s = s[: col - 1] + ".."
sys.stderr.write(s)
sys.stderr.flush()
_STDERR.write(s)
_STDERR.flush()
def start(self, name):
self._active += 1
@@ -195,7 +204,26 @@ class Progress:
)
)
def display_message(self, msg):
"""Clears the current progress line and prints a message above it.
The progress bar is then redrawn on the next line.
"""
if not _TTY or IsTraceToStderr() or self._quiet:
return
# Erase the current line, print the message with a newline,
# and then immediately redraw the progress bar on the new line.
_STDERR.write("\r" + CSI_ERASE_LINE)
_STDERR.write(msg + "\n")
_STDERR.flush()
self.update(inc=0)
def end(self):
if self._ended:
return
self._ended = True
self._update_event.set()
if not _TTY or IsTraceToStderr() or self._quiet:
return
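The elide logic in `Progress._write` above truncates the redraw string to the terminal width, marking the cut with "..". A standalone sketch (the leading "\r" is counted in the length but occupies no visible column):

```python
def elide(line, columns):
    """Truncate a redraw string ("\r" + text) to the terminal width,
    appending ".." when it is cut."""
    if len(line) > columns:
        return line[: columns - 1] + ".."
    return line
```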

File diff suppressed because it is too large


@@ -1,4 +1,4 @@
# Copyright 2023 The Android Open Source Project
# Copyright (C) 2023 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -14,9 +14,32 @@
[tool.black]
line-length = 80
# NB: Keep in sync with tox.ini.
target-version = ['py36', 'py37', 'py38', 'py39', 'py310', 'py311'] #, 'py312'
# Config file for the isort python module.
# This is used to enforce import sorting standards.
#
# https://pycqa.github.io/isort/docs/configuration/options.html
[tool.isort]
# Be compatible with `black` since it also matches what we want.
profile = 'black'
line_length = 80
length_sort = false
force_single_line = true
lines_after_imports = 2
from_first = false
case_sensitive = false
force_sort_within_sections = true
order_by_type = false
# Ignore generated files.
extend_skip_glob = '*_pb2.py'
# Allow importing multiple classes on a single line from these modules.
# https://google.github.io/styleguide/pyguide#s2.2-imports
single_line_exclusions = ['abc', 'collections.abc', 'typing']
[tool.pytest.ini_options]
markers = """
skip_cq: Skip tests in the CQ. Should be rarely used!

release/check-metadata.py Executable file

@@ -0,0 +1,155 @@
#!/usr/bin/env python3
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Helper tool to check various metadata (e.g. licensing) in source files."""
import argparse
from pathlib import Path
import re
import sys
import util
_FILE_HEADER_RE = re.compile(
r"""# Copyright \(C\) 20[0-9]{2} The Android Open Source Project
#
# Licensed under the Apache License, Version 2\.0 \(the "License"\);
# you may not use this file except in compliance with the License\.
# You may obtain a copy of the License at
#
# http://www\.apache\.org/licenses/LICENSE-2\.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied\.
# See the License for the specific language governing permissions and
# limitations under the License\.
"""
)
def check_license(path: Path, lines: list[str]) -> bool:
"""Check license header."""
# Enforce licensing on configs & scripts.
if not (
path.suffix in (".bash", ".cfg", ".ini", ".py", ".toml")
or lines[0] in ("#!/bin/bash", "#!/bin/sh", "#!/usr/bin/env python3")
):
return True
# Extract the file header.
header_lines = []
for line in lines:
if line.startswith("#"):
header_lines.append(line)
else:
break
if not header_lines:
print(
f"error: {path.relative_to(util.TOPDIR)}: "
"missing file header (copyright+licensing)",
file=sys.stderr,
)
return False
# Skip the shebang.
if header_lines[0].startswith("#!"):
header_lines.pop(0)
# If this file is imported into the tree, then leave it be.
if header_lines[0] == "# DO NOT EDIT THIS FILE":
return True
header = "".join(f"{x}\n" for x in header_lines)
if not _FILE_HEADER_RE.match(header):
print(
f"error: {path.relative_to(util.TOPDIR)}: "
"file header incorrectly formatted",
file=sys.stderr,
)
print(
"".join(f"> {x}\n" for x in header_lines), end="", file=sys.stderr
)
return False
return True
def check_path(opts: argparse.Namespace, path: Path) -> bool:
"""Check a single path."""
try:
data = path.read_text(encoding="utf-8")
except FileNotFoundError:
return True
lines = data.splitlines()
# NB: Use list comprehension and not a generator so we run all the checks.
return all(
[
check_license(path, lines),
]
)
def check_paths(opts: argparse.Namespace, paths: list[Path]) -> bool:
"""Check all the paths."""
# NB: Use list comprehension and not a generator so we check all paths.
return all([check_path(opts, x) for x in paths])
def find_files(opts: argparse.Namespace) -> list[Path]:
"""Find all the files in the source tree."""
result = util.run(
opts,
["git", "ls-tree", "-r", "-z", "--name-only", "HEAD"],
cwd=util.TOPDIR,
capture_output=True,
encoding="utf-8",
)
return [util.TOPDIR / x for x in result.stdout.split("\0")[:-1]]
def get_parser() -> argparse.ArgumentParser:
"""Get a CLI parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument(
"-n",
"--dry-run",
dest="dryrun",
action="store_true",
help="show everything that would be done",
)
parser.add_argument(
"paths",
nargs="*",
help="the paths to scan",
)
return parser
def main(argv: list[str]) -> int:
"""The main func!"""
parser = get_parser()
opts = parser.parse_args(argv)
paths = opts.paths
if not opts.paths:
paths = find_files(opts)
return 0 if check_paths(opts, paths) else 1
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))


@@ -27,6 +27,9 @@ import sys
import util
assert sys.version_info >= (3, 9), "Release framework requires Python 3.9+"
def sign(opts):
"""Sign the launcher!"""
output = ""


@@ -30,6 +30,9 @@ import sys
import util
assert sys.version_info >= (3, 9), "Release framework requires Python 3.9+"
# We currently sign with the old DSA key as it's been around the longest.
# We should transition to RSA by Jun 2020, and ECC by Jun 2021.
KEYID = util.KEYID_DSA


@@ -24,7 +24,7 @@ from typing import List, Optional
import urllib.request
assert sys.version_info >= (3, 8), "Python 3.8+ required"
assert sys.version_info >= (3, 9), "Release framework requires Python 3.9+"
TOPDIR = Path(__file__).resolve().parent.parent


@@ -27,9 +27,15 @@ import shutil
import subprocess
import sys
import tempfile
from typing import List
TOPDIR = Path(__file__).resolve().parent.parent
# NB: This script is currently imported by tests/ to unittest some logic.
assert sys.version_info >= (3, 6), "Python 3.6+ required"
THIS_FILE = Path(__file__).resolve()
TOPDIR = THIS_FILE.parent.parent
MANDIR = TOPDIR.joinpath("man")
# Load repo local modules.
@@ -42,9 +48,23 @@ def worker(cmd, **kwargs):
subprocess.run(cmd, **kwargs)
def main(argv):
def get_parser() -> argparse.ArgumentParser:
"""Get argument parser."""
parser = argparse.ArgumentParser(description=__doc__)
parser.parse_args(argv)
parser.add_argument(
"-n",
"--check",
"--dry-run",
action="store_const",
const=True,
help="Check if changes are necessary; don't actually change files",
)
return parser
def main(argv: List[str]) -> int:
parser = get_parser()
opts = parser.parse_args(argv)
if not shutil.which("help2man"):
sys.exit("Please install help2man to continue.")
@@ -117,6 +137,7 @@ def main(argv):
functools.partial(worker, cwd=tempdir, check=True), cmdlist
)
ret = 0
for tmp_path in MANDIR.glob("*.1.tmp"):
path = tmp_path.parent / tmp_path.stem
old_data = path.read_text() if path.exists() else ""
@@ -133,7 +154,17 @@ def main(argv):
)
new_data = re.sub(r'^(\.TH REPO "1" ")([^"]+)', r"\1", data, flags=re.M)
if old_data != new_data:
path.write_text(data)
if opts.check:
ret = 1
print(
f"{THIS_FILE.name}: {path.name}: "
"man page needs regenerating",
file=sys.stderr,
)
else:
path.write_text(data)
return ret
def replace_regex(data):


@@ -14,7 +14,7 @@
"""Random utility code for release tools."""
import os
from pathlib import Path
import re
import shlex
import subprocess
@@ -24,8 +24,9 @@ import sys
assert sys.version_info >= (3, 6), "This module requires Python 3.6+"
TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
HOMEDIR = os.path.expanduser("~")
THIS_FILE = Path(__file__).resolve()
TOPDIR = THIS_FILE.parent.parent
HOMEDIR = Path("~").expanduser()
# These are the release keys we sign with.
@@ -54,7 +55,7 @@ def run(opts, cmd, check=True, **kwargs):
def import_release_key(opts):
"""Import the public key of the official release repo signing key."""
# Extract the key from our repo launcher.
launcher = getattr(opts, "launcher", os.path.join(TOPDIR, "repo"))
launcher = getattr(opts, "launcher", TOPDIR / "repo")
print(f'Importing keys from "{launcher}" launcher script')
with open(launcher, encoding="utf-8") as fp:
data = fp.read()

repo

@@ -1,5 +1,4 @@
#!/usr/bin/env python3
#
# Copyright (C) 2008 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");


@@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -17,25 +17,37 @@
import functools
import os
import shlex
import shutil
import subprocess
import sys
from typing import List
# NB: While tests/* support Python >=3.6 to match requirements.json for `repo`,
# the higher level runner logic does not need to be held back.
assert sys.version_info >= (3, 9), "Test/release framework requires Python 3.9+"
ROOT_DIR = os.path.dirname(os.path.realpath(__file__))
def log_cmd(cmd: str, argv: list[str]) -> None:
"""Log a debug message to make history easier to track."""
print("+", cmd, shlex.join(argv), file=sys.stderr)
@functools.lru_cache()
def is_ci() -> bool:
"""Whether we're running in our CI system."""
return os.getenv("LUCI_CQ") == "yes"
def run_pytest(argv: List[str]) -> int:
def run_pytest(argv: list[str]) -> int:
"""Returns the exit code from pytest."""
if is_ci():
argv = ["-m", "not skip_cq"] + argv
log_cmd("pytest", argv)
return subprocess.run(
[sys.executable, "-m", "pytest"] + argv,
check=False,
@@ -43,11 +55,12 @@ def run_pytest(argv: List[str]) -> int:
).returncode
def run_pytest_py38(argv: List[str]) -> int:
def run_pytest_py38(argv: list[str]) -> int:
"""Returns the exit code from pytest under Python 3.8."""
if is_ci():
argv = ["-m", "not skip_cq"] + argv
log_cmd("[vpython 3.8] pytest", argv)
try:
return subprocess.run(
[
@@ -68,6 +81,14 @@ def run_pytest_py38(argv: List[str]) -> int:
def run_black():
"""Returns the exit code from black."""
argv = ["--version"]
log_cmd("black", argv)
subprocess.run(
[sys.executable, "-m", "black"] + argv,
check=True,
cwd=ROOT_DIR,
)
# Black by default only matches .py files. We have to list standalone
# scripts manually.
extra_programs = [
@@ -76,8 +97,10 @@ def run_black():
"release/update-hooks",
"release/update-manpages",
]
argv = ["--diff", "--check", ROOT_DIR] + extra_programs
log_cmd("black", argv)
return subprocess.run(
[sys.executable, "-m", "black", "--check", ROOT_DIR] + extra_programs,
[sys.executable, "-m", "black"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -85,8 +108,18 @@ def run_black():
def run_flake8():
"""Returns the exit code from flake8."""
argv = ["--version"]
log_cmd("flake8", argv)
subprocess.run(
[sys.executable, "-m", "flake8"] + argv,
check=True,
cwd=ROOT_DIR,
)
argv = [ROOT_DIR]
log_cmd("flake8", argv)
return subprocess.run(
[sys.executable, "-m", "flake8", ROOT_DIR],
[sys.executable, "-m", "flake8"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -94,8 +127,45 @@ def run_flake8():
def run_isort():
"""Returns the exit code from isort."""
argv = ["--version-number"]
log_cmd("isort", argv)
subprocess.run(
[sys.executable, "-m", "isort"] + argv,
check=True,
cwd=ROOT_DIR,
)
argv = ["--check", ROOT_DIR]
log_cmd("isort", argv)
return subprocess.run(
[sys.executable, "-m", "isort", "--check", ROOT_DIR],
[sys.executable, "-m", "isort"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
def run_check_metadata():
"""Returns the exit code from check-metadata."""
argv = []
log_cmd("release/check-metadata.py", argv)
return subprocess.run(
[sys.executable, "release/check-metadata.py"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
def run_update_manpages() -> int:
"""Returns the exit code from release/update-manpages."""
# Allow this to fail on CI, but not local devs.
if is_ci() and not shutil.which("help2man"):
print("update-manpages: help2man not found; skipping test")
return 0
argv = ["--check"]
log_cmd("release/update-manpages", argv)
return subprocess.run(
[sys.executable, "release/update-manpages"] + argv,
check=False,
cwd=ROOT_DIR,
).returncode
@@ -109,6 +179,8 @@ def main(argv):
run_black,
run_flake8,
run_isort,
run_check_metadata,
run_update_manpages,
)
# Run all the tests all the time to get full feedback. Don't exit on the
# first error as that makes it more difficult to iterate in the CQ.


@@ -1,7 +1,7 @@
#!/usr/bin/env python3
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the 'License");
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#

ssh.py

@@ -52,12 +52,12 @@ def _parse_ssh_version(ver_str=None):
@functools.lru_cache(maxsize=None)
def version():
"""return ssh version as a tuple"""
"""Return ssh version as a tuple.
If ssh is not available, a FileNotFoundError will be raised.
"""
try:
return _parse_ssh_version()
except FileNotFoundError:
print("fatal: ssh not installed", file=sys.stderr)
sys.exit(1)
except subprocess.CalledProcessError as e:
print(
"fatal: unable to detect ssh version"
@@ -102,9 +102,18 @@ class ProxyManager:
self._clients = manager.list()
# Path to directory for holding master sockets.
self._sock_path = None
# See if ssh is usable.
self._ssh_installed = False
def __enter__(self):
"""Enter a new context."""
# Check which version of ssh is available.
try:
version()
self._ssh_installed = True
except FileNotFoundError:
self._ssh_installed = False
return self
def __exit__(self, exc_type, exc_value, traceback):
@@ -282,6 +291,9 @@ class ProxyManager:
def preconnect(self, url):
"""If |url| will create a ssh connection, set up the ssh master for it."""  # noqa: E501
if not self._ssh_installed:
return False
m = URI_ALL.match(url)
if m:
scheme = m.group(1)
@@ -306,6 +318,9 @@ class ProxyManager:
This has all the master sockets so clients can talk to them.
"""
if not self._ssh_installed:
return None
if self._sock_path is None:
if not create:
return None
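The ssh.py change above degrades gracefully when ssh is absent: availability is probed once on context entry, cached as a flag, and that flag short-circuits `preconnect()` and socket-path lookup. A hypothetical sketch of the same pattern, with a `probe` callable standing in for the real `version()` check:

```python
class LazySshCheck:
    """Sketch of ProxyManager's lazy ssh-availability check (names are
    illustrative, not the real repo API)."""

    def __init__(self, probe):
        # |probe| stands in for ssh.version(); it raises FileNotFoundError
        # when the ssh binary is missing.
        self._probe = probe
        self._ssh_installed = False

    def __enter__(self):
        # Probe exactly once when the context is entered.
        try:
            self._probe()
            self._ssh_installed = True
        except FileNotFoundError:
            self._ssh_installed = False
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        return None

    def preconnect(self, url):
        # Mirrors the early-return guard added to ProxyManager.preconnect().
        if not self._ssh_installed:
            return False
        return True


with LazySshCheck(lambda: None) as mgr:
    print(mgr.preconnect("ssh://example.com"))  # ssh "found": prints True
```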


@@ -48,7 +48,6 @@ It is equivalent to "git branch -D <branchname>".
def _Options(self, p):
p.add_option(
"--all",
dest="all",
action="store_true",
help="delete all branches in all projects",
)
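Many of the hunks in this range drop explicit `dest=` arguments. That works because optparse derives `dest` from the first long option name automatically ("--all" becomes `opts.all`), so spelling it out was redundant. A small sketch:

```python
import optparse

parser = optparse.OptionParser()
# optparse derives dest from the first long option name ("--all" -> "all"),
# so the explicit dest="all" removed in the diff was redundant.
parser.add_option("--all", action="store_true", help="delete all branches")

opts, _ = parser.parse_args(["--all"])
print(opts.all)  # True
```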


@@ -35,7 +35,6 @@ to the Unix 'patch' command.
p.add_option(
"-u",
"--absolute",
dest="absolute",
action="store_true",
help="paths are relative to the repository root",
)


@@ -67,7 +67,9 @@ synced and their revisions won't be found.
def _Options(self, p):
p.add_option(
"--raw", dest="raw", action="store_true", help="display raw diff"
"--raw",
action="store_true",
help="display raw diff",
)
p.add_option(
"--no-color",
@@ -78,7 +80,6 @@ synced and their revisions won't be found.
)
p.add_option(
"--pretty-format",
dest="pretty_format",
action="store",
metavar="<FORMAT>",
help="print the log using a custom git pretty format string",


@@ -60,7 +60,6 @@ If no project is specified try to use current directory as a project.
p.add_option(
"-r",
"--revert",
dest="revert",
action="store_true",
help="revert instead of checkout",
)


@@ -141,7 +141,6 @@ without iterating through the remaining projects.
p.add_option(
"-r",
"--regex",
dest="regex",
action="store_true",
help="execute the command only on projects matching regex or "
"wildcard expression",
@@ -149,7 +148,6 @@ without iterating through the remaining projects.
p.add_option(
"-i",
"--inverse-regex",
dest="inverse_regex",
action="store_true",
help="execute the command only on projects not matching regex or "
"wildcard expression",
@@ -157,22 +155,20 @@ without iterating through the remaining projects.
p.add_option(
"-g",
"--groups",
dest="groups",
help="execute the command only on projects matching the specified "
"groups",
)
p.add_option(
"-c",
"--command",
help="command (and arguments) to execute",
dest="command",
help="command (and arguments) to execute",
action="callback",
callback=self._cmd_option,
)
p.add_option(
"-e",
"--abort-on-errors",
dest="abort_on_errors",
action="store_true",
help="abort if a command exits unsuccessfully",
)


@@ -284,7 +284,16 @@ class Gc(Command):
args, all_manifests=not opt.this_manifest_only
)
ret = self.delete_unused_projects(projects, opt)
# If the user specified projects, fetch the global list separately
# to avoid deleting untargeted projects.
if args:
all_projects = self.GetProjects(
[], all_manifests=not opt.this_manifest_only
)
else:
all_projects = projects
ret = self.delete_unused_projects(all_projects, opt)
if ret != 0:
return ret


@@ -120,7 +120,6 @@ contain a line that matches both expressions:
g.add_option(
"-r",
"--revision",
dest="revision",
action="append",
metavar="TREEish",
help="Search TREEish, instead of the work tree",


@@ -43,14 +43,12 @@ class Info(PagedCommand):
p.add_option(
"-o",
"--overview",
dest="overview",
action="store_true",
help="show overview of all local commits",
)
p.add_option(
"-c",
"--current-branch",
dest="current_branch",
action="store_true",
help="consider only checked out branches",
)
@@ -90,7 +88,7 @@ class Info(PagedCommand):
self.manifest = self.manifest.outer_client
manifestConfig = self.manifest.manifestProject.config
mergeBranch = manifestConfig.GetBranch("default").merge
manifestGroups = self.manifest.GetGroupsStr()
manifestGroups = self.manifest.GetManifestGroupsStr()
self.heading("Manifest branch: ")
if self.manifest.default.revisionExpr:
@@ -104,6 +102,11 @@ class Info(PagedCommand):
self.heading("Manifest groups: ")
self.headtext(manifestGroups)
self.out.nl()
sp = self.manifest.superproject
srev = sp.commit_id if sp and sp.commit_id else "None"
self.heading("Superproject revision: ")
self.headtext(srev)
self.out.nl()
self.printSeparator()


@@ -127,6 +127,7 @@ to update the working directory files.
return {
"REPO_MANIFEST_URL": "manifest_url",
"REPO_MIRROR_LOCATION": "reference",
"REPO_GIT_LFS": "git_lfs",
}
def _SyncManifest(self, opt):


@@ -40,7 +40,6 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
p.add_option(
"-r",
"--regex",
dest="regex",
action="store_true",
help="filter the project list based on regex or wildcard matching "
"of strings",
@@ -48,7 +47,6 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
p.add_option(
"-g",
"--groups",
dest="groups",
help="filter the project list based on the groups the project is "
"in",
)
@@ -61,21 +59,18 @@ This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
p.add_option(
"-n",
"--name-only",
dest="name_only",
action="store_true",
help="display only the name of the repository",
)
p.add_option(
"-p",
"--path-only",
dest="path_only",
action="store_true",
help="display only the path of the repository",
)
p.add_option(
"-f",
"--fullpath",
dest="fullpath",
action="store_true",
help="display the full work tree path instead of the relative path",
)


@@ -12,7 +12,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import enum
import json
import optparse
import os
import sys
@@ -23,6 +25,16 @@ from repo_logging import RepoLogger
logger = RepoLogger(__file__)
class OutputFormat(enum.Enum):
"""Type for the requested output format."""
# Canonicalized manifest in XML format.
XML = enum.auto()
# Canonicalized manifest in JSON format.
JSON = enum.auto()
class Manifest(PagedCommand):
COMMON = False
helpSummary = "Manifest inspection utility"
@@ -42,6 +54,10 @@ revisions set to the current commit hash. These are known as
In this case, the 'upstream' attribute is set to the ref we were on
when the manifest was generated. The 'dest-branch' attribute is set
to indicate the remote ref to push changes to via 'repo upload'.
Multiple output formats are supported via --format. The default output
is XML, and formats are generally "condensed". Use --pretty for more
human-readable variations.
"""
@property
@@ -86,11 +102,21 @@ to indicate the remote ref to push changes to via 'repo upload'.
"(only of use if the branch names for a sha1 manifest are "
"sensitive)",
)
# Replaced with --format=json. Kept for backwards compatibility.
# Can delete in Jun 2026 or later.
p.add_option(
"--json",
default=False,
action="store_true",
help="output manifest in JSON format (experimental)",
action="store_const",
dest="format",
const=OutputFormat.JSON.name.lower(),
help=optparse.SUPPRESS_HELP,
)
formats = tuple(x.lower() for x in OutputFormat.__members__.keys())
p.add_option(
"--format",
default=OutputFormat.XML.name.lower(),
choices=formats,
help=f"output format: {', '.join(formats)} (default: %default)",
)
p.add_option(
"--pretty",
@@ -108,7 +134,6 @@ to indicate the remote ref to push changes to via 'repo upload'.
p.add_option(
"-o",
"--output-file",
dest="output_file",
default="-",
help="file to save the manifest to. (Filename prefix for "
"multi-tree.)",
@@ -121,6 +146,8 @@ to indicate the remote ref to push changes to via 'repo upload'.
if opt.manifest_name:
self.manifest.Override(opt.manifest_name, False)
output_format = OutputFormat[opt.format.upper()]
for manifest in self.ManifestList(opt):
output_file = opt.output_file
if output_file == "-":
@@ -135,8 +162,7 @@ to indicate the remote ref to push changes to via 'repo upload'.
manifest.SetUseLocalManifests(not opt.ignore_local_manifests)
if opt.json:
logger.warning("warning: --json is experimental!")
if output_format == OutputFormat.JSON:
doc = manifest.ToDict(
peg_rev=opt.peg_rev,
peg_rev_upstream=opt.peg_rev_upstream,
@@ -152,7 +178,7 @@ to indicate the remote ref to push changes to via 'repo upload'.
"separators": (",", ": ") if opt.pretty else (",", ":"),
"sort_keys": True,
}
fd.write(json.dumps(doc, **json_settings))
fd.write(json.dumps(doc, **json_settings) + "\n")
else:
manifest.Save(
fd,
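The manifest diff above replaces the boolean `--json` flag with an enum-backed `--format` option: CLI choices come from the lowercased member names, and the parsed string maps back via name lookup. A condensed restatement of that pattern:

```python
import enum


class OutputFormat(enum.Enum):
    """Mirrors the enum introduced in the diff above."""

    XML = enum.auto()
    JSON = enum.auto()


# The --format choices are the lowercased member names...
formats = tuple(x.lower() for x in OutputFormat.__members__)
# ...and the user's choice maps back to a member via name lookup.
fmt = OutputFormat["json".upper()]
print(formats, fmt is OutputFormat.JSON)  # ('xml', 'json') True
```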


@@ -37,7 +37,6 @@ are displayed.
p.add_option(
"-c",
"--current-branch",
dest="current_branch",
action="store_true",
help="consider only checked out branches",
)


@@ -47,21 +47,18 @@ branch but need to incorporate new upstream changes "underneath" them.
g.add_option(
"-i",
"--interactive",
dest="interactive",
action="store_true",
help="interactive rebase (single project only)",
)
p.add_option(
"--fail-fast",
dest="fail_fast",
action="store_true",
help="stop rebasing after first error is hit",
)
p.add_option(
"-f",
"--force-rebase",
dest="force_rebase",
action="store_true",
help="pass --force-rebase to git rebase",
)
@@ -74,27 +71,23 @@ branch but need to incorporate new upstream changes "underneath" them.
)
p.add_option(
"--autosquash",
dest="autosquash",
action="store_true",
help="pass --autosquash to git rebase",
)
p.add_option(
"--whitespace",
dest="whitespace",
action="store",
metavar="WS",
help="pass --whitespace to git rebase",
)
p.add_option(
"--auto-stash",
dest="auto_stash",
action="store_true",
help="stash local modifications before starting",
)
p.add_option(
"-m",
"--onto-manifest",
dest="onto_manifest",
action="store_true",
help="rebase onto the manifest version instead of upstream "
"HEAD (this helps to make sure the local tree stays "


@@ -54,7 +54,6 @@ need to be performed by an end-user.
)
g.add_option(
"--repo-upgraded",
dest="repo_upgraded",
action="store_true",
help=optparse.SUPPRESS_HELP,
)


@@ -46,7 +46,6 @@ The '%prog' command stages files to prepare the next commit.
g.add_option(
"-i",
"--interactive",
dest="interactive",
action="store_true",
help="use interactive staging",
)


@@ -51,7 +51,6 @@ revision specified in the manifest.
def _Options(self, p):
p.add_option(
"--all",
dest="all",
action="store_true",
help="begin branch in all projects",
)


@@ -82,7 +82,6 @@ the following meanings:
p.add_option(
"-o",
"--orphans",
dest="orphans",
action="store_true",
help="include objects in working directory outside of repo "
"projects",

File diff suppressed because it is too large.

View File

@@ -27,6 +27,7 @@ from error import SilentRepoExitError
from error import UploadError
from git_command import GitCommand
from git_refs import R_HEADS
import git_superproject
from hooks import RepoHook
from project import ReviewableBranch
from repo_logging import RepoLogger
@@ -267,7 +268,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
"--cc",
type="string",
action="append",
dest="cc",
help="also send email to these email addresses",
)
p.add_option(
@@ -281,7 +281,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
p.add_option(
"-c",
"--current-branch",
dest="current_branch",
action="store_true",
help="upload current git branch",
)
@@ -310,7 +309,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
"-p",
"--private",
action="store_true",
dest="private",
default=False,
help="upload as a private change (deprecated; use --wip)",
)
@@ -318,7 +316,6 @@ Gerrit Code Review: https://www.gerritcodereview.com/
"-w",
"--wip",
action="store_true",
dest="wip",
default=False,
help="upload as a work-in-progress change",
)
@@ -628,6 +625,16 @@ Gerrit Code Review: https://www.gerritcodereview.com/
branch.uploaded = False
return
# If using superproject, add the root repo as a push option.
manifest = branch.project.manifest
push_options = list(opt.push_options)
if git_superproject.UseSuperproject(None, manifest):
sp = manifest.superproject
if sp:
r_id = sp.repo_id
if r_id:
push_options.append(f"custom-keyed-value=rootRepo:{r_id}")
branch.UploadForReview(
people,
dryrun=opt.dryrun,
@@ -640,7 +647,7 @@ Gerrit Code Review: https://www.gerritcodereview.com/
ready=opt.ready,
dest_branch=destination,
validate_certs=opt.validate_certs,
push_options=opt.push_options,
push_options=push_options,
patchset_description=opt.patchset_description,
)

subcmds/wipe.py

@@ -0,0 +1,184 @@
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
from typing import List
from command import Command
from error import GitError
from error import RepoExitError
import platform_utils
from project import DeleteWorktreeError
class Error(RepoExitError):
"""Exit error when wipe command fails."""
class Wipe(Command):
"""Delete projects from the worktree and .repo"""
COMMON = True
helpSummary = "Wipe projects from the worktree"
helpUsage = """
%prog <project>...
"""
helpDescription = """
The '%prog' command removes the specified projects from the worktree
(the checked out source code) and deletes the project's git data from `.repo`.
This is a destructive operation and cannot be undone.
Projects can be specified either by name, or by a relative or absolute path
to the project's local directory.
Examples:
# Wipe the project "platform/build" by name:
$ repo wipe platform/build
# Wipe the project at the path "build/make":
$ repo wipe build/make
"""
def _Options(self, p):
# TODO(crbug.com/gerrit/393383056): Add --broken option to scan and
# wipe broken projects.
p.add_option(
"-f",
"--force",
action="store_true",
help="force wipe shared projects and uncommitted changes",
)
p.add_option(
"--force-uncommitted",
action="store_true",
help="force wipe even if there are uncommitted changes",
)
p.add_option(
"--force-shared",
action="store_true",
help="force wipe even if the project shares an object directory",
)
def ValidateOptions(self, opt, args: List[str]):
if not args:
self.Usage()
def Execute(self, opt, args: List[str]):
# Get all projects to handle shared object directories.
all_projects = self.GetProjects(None, all_manifests=True, groups="all")
projects_to_wipe = self.GetProjects(args, all_manifests=True)
relpaths_to_wipe = {p.relpath for p in projects_to_wipe}
# Build a map from objdir to the relpaths of projects that use it.
objdir_map = {}
for p in all_projects:
objdir_map.setdefault(p.objdir, set()).add(p.relpath)
uncommitted_projects = []
shared_objdirs = {}
objdirs_to_delete = set()
for project in projects_to_wipe:
if project == self.manifest.manifestProject:
raise Error(
f"error: cannot wipe the manifest project: {project.name}"
)
try:
if project.HasChanges():
uncommitted_projects.append(project.name)
except GitError:
uncommitted_projects.append(f"{project.name} (corrupted)")
users = objdir_map.get(project.objdir, {project.relpath})
is_shared = not users.issubset(relpaths_to_wipe)
if is_shared:
shared_objdirs.setdefault(project.objdir, set()).update(users)
else:
objdirs_to_delete.add(project.objdir)
block_uncommitted = uncommitted_projects and not (
opt.force or opt.force_uncommitted
)
block_shared = shared_objdirs and not (opt.force or opt.force_shared)
if block_uncommitted or block_shared:
error_messages = []
if block_uncommitted:
error_messages.append(
"The following projects have uncommitted changes or are "
"corrupted:\n"
+ "\n".join(f" - {p}" for p in sorted(uncommitted_projects))
)
if block_shared:
shared_dir_messages = []
for objdir, users in sorted(shared_objdirs.items()):
other_users = users - relpaths_to_wipe
projects_to_wipe_in_dir = users & relpaths_to_wipe
message = f"""Object directory {objdir} is shared by:
Projects to be wiped: {', '.join(sorted(projects_to_wipe_in_dir))}
Projects not to be wiped: {', '.join(sorted(other_users))}"""
shared_dir_messages.append(message)
error_messages.append(
"The following projects have shared object directories:\n"
+ "\n".join(sorted(shared_dir_messages))
)
if block_uncommitted and block_shared:
error_messages.append(
"Use --force to wipe anyway, or use --force-uncommitted "
"and --force-shared to override each check individually."
)
elif block_uncommitted:
error_messages.append("Use --force-uncommitted to wipe anyway.")
else:
error_messages.append("Use --force-shared to wipe anyway.")
raise Error("\n\n".join(error_messages))
# If we are here, either there were no issues, or --force was used.
# Proceed with wiping.
successful_wipes = set()
for project in projects_to_wipe:
try:
# Force the delete here since we've already performed our
# own safety checks above.
project.DeleteWorktree(force=True, verbose=opt.verbose)
successful_wipes.add(project.relpath)
except DeleteWorktreeError as e:
print(
f"error: failed to wipe {project.name}: {e}",
file=sys.stderr,
)
# Clean up object directories only if all projects using them were
# successfully wiped.
for objdir in objdirs_to_delete:
users = objdir_map.get(objdir, set())
# Check if every project that uses this objdir has been
# successfully processed. If a project failed to be wiped, don't
# delete the object directory, or we'll corrupt the remaining
# project.
if users.issubset(successful_wipes):
if os.path.exists(objdir):
if opt.verbose:
print(
f"Deleting objects directory: {objdir}",
file=sys.stderr,
)
platform_utils.rmtree(objdir)

tests/README.md

@@ -0,0 +1,20 @@
# Repo Tests
There is a mixture of [pytest] & [Python unittest] in here. We adopted [pytest]
later on but didn't migrate existing tests (since they still work). New tests
should be written using [pytest] only.
## File layout
* `test_xxx.py`: Unittests for the `xxx` module in the main repo codebase.
Modules that are in subdirs normalize the `/` into `_`.
For example, [test_error.py](./test_error.py) is for the
[error.py](../error.py) module, and
[test_subcmds_forall.py](./test_subcmds_forall.py) is for the
[subcmds/forall.py](../subcmds/forall.py) module.
* [conftest.py](./conftest.py): Custom pytest fixtures for sharing.
* [utils_for_test.py](./utils_for_test.py): Helpers for sharing in tests.
[pytest]: https://pytest.org/
[Python unittest]: https://docs.python.org/3/library/unittest.html#unittest.TestCase


@@ -1,4 +1,4 @@
# Copyright 2022 The Android Open Source Project
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -15,60 +15,66 @@
"""Unittests for the color.py module."""
import os
import unittest
import pytest
import color
import git_config
def fixture(*paths):
def fixture(*paths: str) -> str:
"""Return a path relative to test/fixtures."""
return os.path.join(os.path.dirname(__file__), "fixtures", *paths)
class ColoringTests(unittest.TestCase):
"""tests of the Coloring class."""
@pytest.fixture
def coloring() -> color.Coloring:
"""Create a Coloring object for testing."""
config_fixture = fixture("test.gitconfig")
config = git_config.GitConfig(config_fixture)
color.SetDefaultColoring("true")
return color.Coloring(config, "status")
def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture."""
config_fixture = fixture("test.gitconfig")
self.config = git_config.GitConfig(config_fixture)
color.SetDefaultColoring("true")
self.color = color.Coloring(self.config, "status")
def test_Color_Parse_all_params_none(self):
"""all params are None"""
val = self.color._parse(None, None, None, None)
self.assertEqual("", val)
def test_Color_Parse_all_params_none(coloring: color.Coloring) -> None:
"""all params are None"""
val = coloring._parse(None, None, None, None)
assert val == ""
def test_Color_Parse_first_parameter_none(self):
"""check fg & bg & attr"""
val = self.color._parse(None, "black", "red", "ul")
self.assertEqual("\x1b[4;30;41m", val)
def test_Color_Parse_one_entry(self):
"""check fg"""
val = self.color._parse("one", None, None, None)
self.assertEqual("\033[33m", val)
def test_Color_Parse_first_parameter_none(coloring: color.Coloring) -> None:
"""check fg & bg & attr"""
val = coloring._parse(None, "black", "red", "ul")
assert val == "\x1b[4;30;41m"
def test_Color_Parse_two_entry(self):
"""check fg & bg"""
val = self.color._parse("two", None, None, None)
self.assertEqual("\033[35;46m", val)
def test_Color_Parse_three_entry(self):
"""check fg & bg & attr"""
val = self.color._parse("three", None, None, None)
self.assertEqual("\033[4;30;41m", val)
def test_Color_Parse_one_entry(coloring: color.Coloring) -> None:
"""check fg"""
val = coloring._parse("one", None, None, None)
assert val == "\033[33m"
def test_Color_Parse_reset_entry(self):
"""check reset entry"""
val = self.color._parse("reset", None, None, None)
self.assertEqual("\033[m", val)
def test_Color_Parse_empty_entry(self):
"""check empty entry"""
val = self.color._parse("none", "blue", "white", "dim")
self.assertEqual("\033[2;34;47m", val)
val = self.color._parse("empty", "green", "white", "bold")
self.assertEqual("\033[1;32;47m", val)
def test_Color_Parse_two_entry(coloring: color.Coloring) -> None:
"""check fg & bg"""
val = coloring._parse("two", None, None, None)
assert val == "\033[35;46m"
def test_Color_Parse_three_entry(coloring: color.Coloring) -> None:
"""check fg & bg & attr"""
val = coloring._parse("three", None, None, None)
assert val == "\033[4;30;41m"
def test_Color_Parse_reset_entry(coloring: color.Coloring) -> None:
"""check reset entry"""
val = coloring._parse("reset", None, None, None)
assert val == "\033[m"
def test_Color_Parse_empty_entry(coloring: color.Coloring) -> None:
"""check empty entry"""
val = coloring._parse("none", "blue", "white", "dim")
assert val == "\033[2;34;47m"
val = coloring._parse("empty", "green", "white", "bold")
assert val == "\033[1;32;47m"


@@ -14,43 +14,32 @@
"""Unittests for the editor.py module."""
import unittest
import pytest
from editor import Editor
class EditorTestCase(unittest.TestCase):
@pytest.fixture(autouse=True)
def reset_editor() -> None:
"""Take care of resetting Editor state across tests."""
def setUp(self):
self.setEditor(None)
def tearDown(self):
self.setEditor(None)
@staticmethod
def setEditor(editor):
Editor._editor = editor
Editor._editor = None
yield
Editor._editor = None
class GetEditor(EditorTestCase):
"""Check GetEditor behavior."""
def test_basic(self):
"""Basic checking of _GetEditor."""
self.setEditor(":")
self.assertEqual(":", Editor._GetEditor())
def test_basic() -> None:
"""Basic checking of _GetEditor."""
Editor._editor = ":"
assert Editor._GetEditor() == ":"
class EditString(EditorTestCase):
"""Check EditString behavior."""
def test_no_editor() -> None:
"""Check behavior when no editor is available."""
Editor._editor = ":"
assert Editor.EditString("foo") == "foo"
def test_no_editor(self):
"""Check behavior when no editor is available."""
self.setEditor(":")
self.assertEqual("foo", Editor.EditString("foo"))
def test_cat_editor(self):
"""Check behavior when editor is `cat`."""
self.setEditor("cat")
self.assertEqual("foo", Editor.EditString("foo"))
def test_cat_editor() -> None:
"""Check behavior when editor is `cat`."""
Editor._editor = "cat"
assert Editor.EditString("foo") == "foo"


@@ -1,4 +1,4 @@
# Copyright 2021 The Android Open Source Project
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -16,7 +16,9 @@
import inspect
import pickle
import unittest
from typing import Iterator, Type
import pytest
import command
import error
@@ -26,7 +28,7 @@ import project
from subcmds import all_modules
imports = all_modules + [
_IMPORTS = all_modules + [
error,
project,
git_command,
@@ -35,36 +37,35 @@ imports = all_modules + [
]
class PickleTests(unittest.TestCase):
"""Make sure all our custom exceptions can be pickled."""
def get_exceptions() -> Iterator[Type[Exception]]:
"""Return all our custom exceptions."""
for entry in _IMPORTS:
for name in dir(entry):
cls = getattr(entry, name)
if isinstance(cls, type) and issubclass(cls, Exception):
yield cls
def getExceptions(self):
"""Return all our custom exceptions."""
for entry in imports:
for name in dir(entry):
cls = getattr(entry, name)
if isinstance(cls, type) and issubclass(cls, Exception):
yield cls
def testExceptionLookup(self):
"""Make sure our introspection logic works."""
classes = list(self.getExceptions())
self.assertIn(error.HookError, classes)
# Don't assert the exact number to avoid being a change-detector test.
self.assertGreater(len(classes), 10)
def test_exception_lookup() -> None:
"""Make sure our introspection logic works."""
classes = list(get_exceptions())
assert error.HookError in classes
# Don't assert the exact number to avoid being a change-detector test.
assert len(classes) > 10
def testPickle(self):
"""Try to pickle all the exceptions."""
for cls in self.getExceptions():
args = inspect.getfullargspec(cls.__init__).args[1:]
obj = cls(*args)
p = pickle.dumps(obj)
try:
newobj = pickle.loads(p)
except Exception as e: # pylint: disable=broad-except
self.fail(
"Class %s is unable to be pickled: %s\n"
"Incomplete super().__init__(...) call?" % (cls, e)
)
self.assertIsInstance(newobj, cls)
self.assertEqual(str(obj), str(newobj))
@pytest.mark.parametrize("cls", get_exceptions())
def test_pickle(cls: Type[Exception]) -> None:
"""Try to pickle all the exceptions."""
args = inspect.getfullargspec(cls.__init__).args[1:]
obj = cls(*args)
p = pickle.dumps(obj)
try:
newobj = pickle.loads(p)
except Exception as e:
pytest.fail(
f"Class {cls} is unable to be pickled: {e}\n"
"Incomplete super().__init__(...) call?"
)
assert isinstance(newobj, cls)
assert str(obj) == str(newobj)
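The `@pytest.mark.parametrize("cls", get_exceptions())` change works because the generator is consumed once at collection time, producing one independent test per exception class. A self-contained sketch of the same roundtrip, using stand-in exception classes rather than repo's real `error` module (note the quirk inherited from the original test: the `__init__` parameter *names* are passed as the argument values):

```python
import inspect
import pickle


class HookError(Exception):
    def __init__(self, msg="hook failed"):
        super().__init__(msg)


class GitError(Exception):
    def __init__(self, message="git failed"):
        super().__init__(message)


def get_exceptions():
    """Yield the exception classes to parametrize over."""
    yield from (HookError, GitError)


def roundtrip(cls):
    """Instantiate cls from its __init__ arg names and pickle-roundtrip it."""
    args = inspect.getfullargspec(cls.__init__).args[1:]
    obj = cls(*args)
    return pickle.loads(pickle.dumps(obj))
```

An exception that forgets to forward its arguments through `super().__init__(...)` loses them in `__reduce__`, which is exactly the failure mode the real test's error message hints at.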


@@ -1,4 +1,4 @@
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.


@@ -15,176 +15,219 @@
"""Unittests for the git_config.py module."""
import os
import tempfile
import unittest
from pathlib import Path
from typing import Any
import pytest
import git_config
def fixture(*paths):
def fixture_path(*paths: str) -> str:
"""Return a path relative to test/fixtures."""
return os.path.join(os.path.dirname(__file__), "fixtures", *paths)
class GitConfigReadOnlyTests(unittest.TestCase):
"""Read-only tests of the GitConfig class."""
def setUp(self):
"""Create a GitConfig object using the test.gitconfig fixture."""
config_fixture = fixture("test.gitconfig")
self.config = git_config.GitConfig(config_fixture)
def test_GetString_with_empty_config_values(self):
"""
Test config entries with no value.
[section]
empty
"""
val = self.config.GetString("section.empty")
self.assertEqual(val, None)
def test_GetString_with_true_value(self):
"""
Test config entries with a string value.
[section]
nonempty = true
"""
val = self.config.GetString("section.nonempty")
self.assertEqual(val, "true")
def test_GetString_from_missing_file(self):
"""
Test missing config file
"""
config_fixture = fixture("not.present.gitconfig")
config = git_config.GitConfig(config_fixture)
val = config.GetString("empty")
self.assertEqual(val, None)
def test_GetBoolean_undefined(self):
"""Test GetBoolean on key that doesn't exist."""
self.assertIsNone(self.config.GetBoolean("section.missing"))
def test_GetBoolean_invalid(self):
"""Test GetBoolean on invalid boolean value."""
self.assertIsNone(self.config.GetBoolean("section.boolinvalid"))
def test_GetBoolean_true(self):
"""Test GetBoolean on valid true boolean."""
self.assertTrue(self.config.GetBoolean("section.booltrue"))
def test_GetBoolean_false(self):
"""Test GetBoolean on valid false boolean."""
self.assertFalse(self.config.GetBoolean("section.boolfalse"))
def test_GetInt_undefined(self):
"""Test GetInt on key that doesn't exist."""
self.assertIsNone(self.config.GetInt("section.missing"))
def test_GetInt_invalid(self):
"""Test GetInt on invalid integer value."""
self.assertIsNone(self.config.GetBoolean("section.intinvalid"))
def test_GetInt_valid(self):
"""Test GetInt on valid integers."""
TESTS = (
("inthex", 16),
("inthexk", 16384),
("int", 10),
("intk", 10240),
("intm", 10485760),
("intg", 10737418240),
)
for key, value in TESTS:
self.assertEqual(value, self.config.GetInt(f"section.{key}"))
@pytest.fixture
def readonly_config() -> git_config.GitConfig:
"""Create a GitConfig object using the test.gitconfig fixture."""
config_fixture = fixture_path("test.gitconfig")
return git_config.GitConfig(config_fixture)
class GitConfigReadWriteTests(unittest.TestCase):
"""Read/write tests of the GitConfig class."""
def test_get_string_with_empty_config_values(
readonly_config: git_config.GitConfig,
) -> None:
"""Test config entries with no value.
def setUp(self):
self.tmpfile = tempfile.NamedTemporaryFile()
self.config = self.get_config()
[section]
empty
def get_config(self):
"""Get a new GitConfig instance."""
return git_config.GitConfig(self.tmpfile.name)
"""
val = readonly_config.GetString("section.empty")
assert val is None
def test_SetString(self):
"""Test SetString behavior."""
# Set a value.
self.assertIsNone(self.config.GetString("foo.bar"))
self.config.SetString("foo.bar", "val")
self.assertEqual("val", self.config.GetString("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertEqual("val", config.GetString("foo.bar"))
def test_get_string_with_true_value(
readonly_config: git_config.GitConfig,
) -> None:
"""Test config entries with a string value.
# Update the value.
self.config.SetString("foo.bar", "valll")
self.assertEqual("valll", self.config.GetString("foo.bar"))
config = self.get_config()
self.assertEqual("valll", config.GetString("foo.bar"))
[section]
nonempty = true
# Delete the value.
self.config.SetString("foo.bar", None)
self.assertIsNone(self.config.GetString("foo.bar"))
config = self.get_config()
self.assertIsNone(config.GetString("foo.bar"))
"""
val = readonly_config.GetString("section.nonempty")
assert val == "true"
def test_SetBoolean(self):
"""Test SetBoolean behavior."""
# Set a true value.
self.assertIsNone(self.config.GetBoolean("foo.bar"))
for val in (True, 1):
self.config.SetBoolean("foo.bar", val)
self.assertTrue(self.config.GetBoolean("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertTrue(config.GetBoolean("foo.bar"))
self.assertEqual("true", config.GetString("foo.bar"))
def test_get_string_from_missing_file() -> None:
"""Test missing config file."""
config_fixture = fixture_path("not.present.gitconfig")
config = git_config.GitConfig(config_fixture)
val = config.GetString("empty")
assert val is None
# Set a false value.
for val in (False, 0):
self.config.SetBoolean("foo.bar", val)
self.assertFalse(self.config.GetBoolean("foo.bar"))
# Make sure the value was actually written out.
config = self.get_config()
self.assertFalse(config.GetBoolean("foo.bar"))
self.assertEqual("false", config.GetString("foo.bar"))
def test_get_boolean_undefined(readonly_config: git_config.GitConfig) -> None:
"""Test GetBoolean on key that doesn't exist."""
assert readonly_config.GetBoolean("section.missing") is None
# Delete the value.
self.config.SetBoolean("foo.bar", None)
self.assertIsNone(self.config.GetBoolean("foo.bar"))
config = self.get_config()
self.assertIsNone(config.GetBoolean("foo.bar"))
def test_GetSyncAnalysisStateData(self):
"""Test config entries with a sync state analysis data."""
superproject_logging_data = {}
superproject_logging_data["test"] = False
options = type("options", (object,), {})()
options.verbose = "true"
options.mp_update = "false"
TESTS = (
("superproject.test", "false"),
("options.verbose", "true"),
("options.mpupdate", "false"),
("main.version", "1"),
)
self.config.UpdateSyncAnalysisState(options, superproject_logging_data)
sync_data = self.config.GetSyncAnalysisStateData()
for key, value in TESTS:
self.assertEqual(
sync_data[f"{git_config.SYNC_STATE_PREFIX}{key}"], value
)
self.assertTrue(
sync_data[f"{git_config.SYNC_STATE_PREFIX}main.synctime"]
)
def test_get_boolean_invalid(readonly_config: git_config.GitConfig) -> None:
"""Test GetBoolean on invalid boolean value."""
assert readonly_config.GetBoolean("section.boolinvalid") is None
def test_get_boolean_true(readonly_config: git_config.GitConfig) -> None:
"""Test GetBoolean on valid true boolean."""
assert readonly_config.GetBoolean("section.booltrue") is True
def test_get_boolean_false(readonly_config: git_config.GitConfig) -> None:
"""Test GetBoolean on valid false boolean."""
assert readonly_config.GetBoolean("section.boolfalse") is False
def test_get_int_undefined(readonly_config: git_config.GitConfig) -> None:
"""Test GetInt on key that doesn't exist."""
assert readonly_config.GetInt("section.missing") is None
def test_get_int_invalid(readonly_config: git_config.GitConfig) -> None:
"""Test GetInt on invalid integer value."""
assert readonly_config.GetInt("section.intinvalid") is None
@pytest.mark.parametrize(
"key, expected",
(
("inthex", 16),
("inthexk", 16384),
("int", 10),
("intk", 10240),
("intm", 10485760),
("intg", 10737418240),
),
)
def test_get_int_valid(
readonly_config: git_config.GitConfig, key: str, expected: int
) -> None:
"""Test GetInt on valid integers."""
assert readonly_config.GetInt(f"section.{key}") == expected
@pytest.fixture
def rw_config_file(tmp_path: Path) -> Path:
"""Return a path to a temporary config file."""
return tmp_path / "config"
def test_set_string(rw_config_file: Path) -> None:
"""Test SetString behavior."""
config = git_config.GitConfig(str(rw_config_file))
# Set a value.
assert config.GetString("foo.bar") is None
config.SetString("foo.bar", "val")
assert config.GetString("foo.bar") == "val"
# Make sure the value was actually written out.
config2 = git_config.GitConfig(str(rw_config_file))
assert config2.GetString("foo.bar") == "val"
# Update the value.
config.SetString("foo.bar", "valll")
assert config.GetString("foo.bar") == "valll"
config3 = git_config.GitConfig(str(rw_config_file))
assert config3.GetString("foo.bar") == "valll"
# Delete the value.
config.SetString("foo.bar", None)
assert config.GetString("foo.bar") is None
config4 = git_config.GitConfig(str(rw_config_file))
assert config4.GetString("foo.bar") is None
def test_set_boolean(rw_config_file: Path) -> None:
"""Test SetBoolean behavior."""
config = git_config.GitConfig(str(rw_config_file))
# Set a true value.
assert config.GetBoolean("foo.bar") is None
for val in (True, 1):
config.SetBoolean("foo.bar", val)
assert config.GetBoolean("foo.bar") is True
# Make sure the value was actually written out.
config2 = git_config.GitConfig(str(rw_config_file))
assert config2.GetBoolean("foo.bar") is True
assert config2.GetString("foo.bar") == "true"
# Set a false value.
for val in (False, 0):
config.SetBoolean("foo.bar", val)
assert config.GetBoolean("foo.bar") is False
# Make sure the value was actually written out.
config3 = git_config.GitConfig(str(rw_config_file))
assert config3.GetBoolean("foo.bar") is False
assert config3.GetString("foo.bar") == "false"
# Delete the value.
config.SetBoolean("foo.bar", None)
assert config.GetBoolean("foo.bar") is None
config4 = git_config.GitConfig(str(rw_config_file))
assert config4.GetBoolean("foo.bar") is None
def test_set_int(rw_config_file: Path) -> None:
"""Test SetInt behavior."""
config = git_config.GitConfig(str(rw_config_file))
# Set a value.
assert config.GetInt("foo.bar") is None
config.SetInt("foo.bar", 10)
assert config.GetInt("foo.bar") == 10
# Make sure the value was actually written out.
config2 = git_config.GitConfig(str(rw_config_file))
assert config2.GetInt("foo.bar") == 10
assert config2.GetString("foo.bar") == "10"
# Update the value.
config.SetInt("foo.bar", 20)
assert config.GetInt("foo.bar") == 20
config3 = git_config.GitConfig(str(rw_config_file))
assert config3.GetInt("foo.bar") == 20
# Delete the value.
config.SetInt("foo.bar", None)
assert config.GetInt("foo.bar") is None
config4 = git_config.GitConfig(str(rw_config_file))
assert config4.GetInt("foo.bar") is None
def test_get_sync_analysis_state_data(rw_config_file: Path) -> None:
"""Test config entries with a sync state analysis data."""
config = git_config.GitConfig(str(rw_config_file))
superproject_logging_data: dict[str, Any] = {"test": False}
class Options:
"""Container for testing."""
options = Options()
options.verbose = "true"
options.mp_update = "false"
TESTS = (
("superproject.test", "false"),
("options.verbose", "true"),
("options.mpupdate", "false"),
("main.version", "1"),
)
config.UpdateSyncAnalysisState(options, superproject_logging_data)
sync_data = config.GetSyncAnalysisStateData()
for key, value in TESTS:
assert sync_data[f"{git_config.SYNC_STATE_PREFIX}{key}"] == value
assert sync_data[f"{git_config.SYNC_STATE_PREFIX}main.synctime"]
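The write-then-reopen pattern in these tests (create a second `GitConfig` on the same path to prove the value was persisted, not just cached) can be sketched with a toy key=value store; `ToyConfig` is an assumption standing in for `GitConfig`, which actually shells out to `git config`:

```python
from pathlib import Path


class ToyConfig:
    """Minimal persistent key=value store mimicking the GetString/SetString API."""

    def __init__(self, path):
        self._path = Path(path)

    def _load(self):
        if not self._path.exists():
            return {}
        return dict(
            line.split("=", 1)
            for line in self._path.read_text().splitlines()
            if "=" in line
        )

    def GetString(self, key):
        return self._load().get(key)

    def SetString(self, key, value):
        data = self._load()
        if value is None:
            data.pop(key, None)  # None deletes the key, as in GitConfig
        else:
            data[key] = value
        self._path.write_text("".join(f"{k}={v}\n" for k, v in data.items()))
```

Re-reading through a fresh instance is what distinguishes "written to disk" from "remembered in memory", which is why each migrated test constructs `config2`/`config3`/`config4`.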

tests/test_git_refs.py Normal file

@@ -0,0 +1,99 @@
# Copyright (C) 2026 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the git_refs.py module."""
import os
from pathlib import Path
import subprocess
import pytest
import utils_for_test
import git_refs
def _run(repo, *args):
return subprocess.run(
["git", "-C", repo, *args],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
encoding="utf-8",
check=True,
).stdout.strip()
def _init_repo(tmp_path, reftable=False):
repo = os.path.join(tmp_path, "repo")
ref_format = "reftable" if reftable else "files"
utils_for_test.init_git_tree(repo, ref_format=ref_format)
Path(os.path.join(repo, "a")).write_text("1")
_run(repo, "add", "a")
_run(repo, "commit", "-q", "-m", "init")
return repo
@pytest.mark.parametrize("reftable", [False, True])
def test_reads_refs(tmp_path, reftable):
if reftable and not utils_for_test.supports_reftable():
pytest.skip("reftable not supported")
repo = _init_repo(tmp_path, reftable=reftable)
gitdir = os.path.join(repo, ".git")
refs = git_refs.GitRefs(gitdir)
branch = _run(repo, "symbolic-ref", "--short", "HEAD")
head = _run(repo, "rev-parse", "HEAD")
assert refs.symref("HEAD") == f"refs/heads/{branch}"
assert refs.get("HEAD") == head
assert refs.get(f"refs/heads/{branch}") == head
@pytest.mark.parametrize("reftable", [False, True])
def test_updates_when_refs_change(tmp_path, reftable):
if reftable and not utils_for_test.supports_reftable():
pytest.skip("reftable not supported")
repo = _init_repo(tmp_path, reftable=reftable)
gitdir = os.path.join(repo, ".git")
refs = git_refs.GitRefs(gitdir)
head = _run(repo, "rev-parse", "HEAD")
assert refs.get("refs/heads/topic") == ""
_run(repo, "branch", "topic")
assert refs.get("refs/heads/topic") == head
_run(repo, "branch", "-D", "topic")
assert refs.get("refs/heads/topic") == ""
@pytest.mark.skipif(
not utils_for_test.supports_refs_migrate(),
reason="git refs migrate reftable support is required for this test",
)
def test_updates_when_storage_backend_toggles(tmp_path):
repo = _init_repo(tmp_path, reftable=False)
gitdir = os.path.join(repo, ".git")
refs = git_refs.GitRefs(gitdir)
head = _run(repo, "rev-parse", "HEAD")
assert refs.get("refs/heads/reftable-branch") == ""
_run(repo, "refs", "migrate", "--ref-format=reftable")
_run(repo, "branch", "reftable-branch")
assert refs.get("refs/heads/reftable-branch") == head
assert refs.get("refs/heads/files-branch") == ""
_run(repo, "refs", "migrate", "--ref-format=files")
_run(repo, "branch", "files-branch")
assert refs.get("refs/heads/files-branch") == head
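These tests exercise `GitRefs` against both ref storage backends. For the `files` backend, `.git/HEAD` holds either a `ref: refs/heads/<branch>` symref or a raw sha, which can be read directly; the reftable backend has no such loose files, so plumbing like `git symbolic-ref` is needed instead (an assumption about the module's strategy, illustrated here for the files case only):

```python
import os


def read_head(gitdir):
    """Parse .git/HEAD as the files backend stores it."""
    with open(os.path.join(gitdir, "HEAD")) as fp:
        data = fp.read().strip()
    if data.startswith("ref: "):
        return ("symref", data[len("ref: "):])
    return ("sha", data)
```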


@@ -461,6 +461,62 @@ class SuperprojectTestCase(unittest.TestCase):
"</manifest>",
)
def test_Init_success(self):
"""Test _Init succeeds and creates the work git dir."""
self.assertFalse(os.path.exists(self._superproject._work_git))
with mock.patch(
"git_superproject.GitCommand", autospec=True
) as mock_git_command:
instance = mock_git_command.return_value
instance.Wait.return_value = 0
self.assertTrue(self._superproject._Init())
mock_git_command.assert_called_once()
args, kwargs = mock_git_command.call_args
self.assertEqual(args[1][:2], ["init", "--bare"])
tmp_git_name = args[1][2]
self.assertTrue(
tmp_git_name.startswith(".tmp-superproject-initgitdir-")
)
self.assertTrue(os.path.exists(self._superproject._work_git))
tmp_git_path = os.path.join(
self._superproject._superproject_path, tmp_git_name
)
self.assertFalse(os.path.exists(tmp_git_path))
def test_Init_already_exists(self):
"""Test _Init returns early if the work git dir already exists."""
os.mkdir(self._superproject._superproject_path)
os.mkdir(self._superproject._work_git)
with mock.patch(
"git_superproject.GitCommand", autospec=True
) as mock_git_command:
self.assertTrue(self._superproject._Init())
mock_git_command.assert_not_called()
def test_Init_failure_cleans_up(self):
"""Test _Init cleans up the temporary directory if 'git init' fails."""
with mock.patch(
"git_superproject.GitCommand", autospec=True
) as mock_git_command:
instance = mock_git_command.return_value
instance.Wait.return_value = 1
instance.stderr = "mock git init failure"
self.assertFalse(self._superproject._Init())
mock_git_command.assert_called_once()
args, kwargs = mock_git_command.call_args
tmp_git_name = args[1][2]
tmp_git_path = os.path.join(
self._superproject._superproject_path, tmp_git_name
)
self.assertFalse(os.path.exists(tmp_git_path))
self.assertFalse(os.path.exists(self._superproject._work_git))
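The three `_Init` tests above pin down an init-into-temp-then-rename pattern: `git init --bare` runs in a uniquely named `.tmp-superproject-initgitdir-*` directory which is renamed into place on success and removed on failure, so a crash never leaves a half-initialized work git. A hedged sketch (names and the injectable `run_git_init` callback are illustrative, not the real implementation):

```python
import os
import tempfile


def init_gitdir(parent, name="superproject.git", run_git_init=None):
    """Initialize parent/name atomically; return True on success."""
    target = os.path.join(parent, name)
    if os.path.exists(target):
        return True  # already initialized: return early, run nothing
    tmp = tempfile.mkdtemp(prefix=".tmp-superproject-initgitdir-", dir=parent)
    ok = run_git_init(tmp) if run_git_init else True
    if not ok:
        os.rmdir(tmp)  # clean up the temp dir on failure
        return False
    os.rename(tmp, target)
    return True
```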
def test_Fetch(self):
manifest = self.getXmlManifest(
"""


@@ -14,6 +14,8 @@
"""Unittests for the git_trace2_event_log.py module."""
import contextlib
import io
import json
import os
import socket
@@ -65,13 +67,13 @@ class EventLogTestCase(unittest.TestCase):
def setUp(self):
"""Load the event_log module every time."""
self._event_log_module = None
self._event_log = None
# By default we initialize with the expected case where
# repo launches us (so GIT_TRACE2_PARENT_SID is set).
env = {
self.PARENT_SID_KEY: self.PARENT_SID_VALUE,
}
self._event_log_module = git_trace2_event_log.EventLog(env=env)
self._event_log = git_trace2_event_log.EventLog(env=env)
self._log_data = None
def verifyCommonKeys(
@@ -112,13 +114,13 @@ class EventLogTestCase(unittest.TestCase):
def test_initial_state_with_parent_sid(self):
"""Test initial state when 'GIT_TRACE2_PARENT_SID' is set by parent."""
self.assertRegex(self._event_log_module.full_sid, self.FULL_SID_REGEX)
self.assertRegex(self._event_log.full_sid, self.FULL_SID_REGEX)
def test_initial_state_no_parent_sid(self):
"""Test initial state when 'GIT_TRACE2_PARENT_SID' is not set."""
# Setup an empty environment dict (no parent sid).
self._event_log_module = git_trace2_event_log.EventLog(env={})
self.assertRegex(self._event_log_module.full_sid, self.SELF_SID_REGEX)
self._event_log = git_trace2_event_log.EventLog(env={})
self.assertRegex(self._event_log.full_sid, self.SELF_SID_REGEX)
def test_version_event(self):
"""Test 'version' event data is valid.
@@ -130,7 +132,7 @@ class EventLogTestCase(unittest.TestCase):
<version event>
"""
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
# A log with no added events should only have the version entry.
@@ -150,9 +152,9 @@ class EventLogTestCase(unittest.TestCase):
<version event>
<start event>
"""
self._event_log_module.StartEvent([])
self._event_log.StartEvent([])
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
@@ -172,9 +174,9 @@ class EventLogTestCase(unittest.TestCase):
<version event>
<exit event>
"""
self._event_log_module.ExitEvent(None)
self._event_log.ExitEvent(None)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
@@ -193,9 +195,9 @@ class EventLogTestCase(unittest.TestCase):
<version event>
<exit event>
"""
self._event_log_module.ExitEvent(2)
self._event_log.ExitEvent(2)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
@@ -213,11 +215,9 @@ class EventLogTestCase(unittest.TestCase):
<version event>
<command event>
"""
self._event_log_module.CommandEvent(
name="repo", subcommands=["init", "this"]
)
self._event_log.CommandEvent(name="repo", subcommands=["init", "this"])
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
@@ -241,10 +241,10 @@ class EventLogTestCase(unittest.TestCase):
"repo.partialclone": "true",
"repo.partialclonefilter": "blob:none",
}
self._event_log_module.DefParamRepoEvents(config)
self._event_log.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 3)
@@ -268,10 +268,10 @@ class EventLogTestCase(unittest.TestCase):
"git.foo": "bar",
"git.core.foo2": "baz",
}
self._event_log_module.DefParamRepoEvents(config)
self._event_log.DefParamRepoEvents(config)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 1)
@@ -292,10 +292,10 @@ class EventLogTestCase(unittest.TestCase):
"repo.syncstate.superproject.sys.argv": ["--", "sync", "protobuf"],
}
prefix_value = "prefix"
self._event_log_module.LogDataConfigEvents(config, prefix_value)
self._event_log.LogDataConfigEvents(config, prefix_value)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 5)
@@ -311,7 +311,7 @@ class EventLogTestCase(unittest.TestCase):
key = self.remove_prefix(key, f"{prefix_value}/")
value = event["value"]
self.assertEqual(
self._event_log_module.GetDataEventName(value), event["event"]
self._event_log.GetDataEventName(value), event["event"]
)
self.assertTrue(key in config and value == config[key])
@@ -324,9 +324,9 @@ class EventLogTestCase(unittest.TestCase):
"""
msg = "invalid option: --cahced"
fmt = "invalid option: %s"
self._event_log_module.ErrorEvent(msg, fmt)
self._event_log.ErrorEvent(msg, fmt)
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
log_path = self._event_log_module.Write(path=tempdir)
log_path = self._event_log.Write(path=tempdir)
self._log_data = self.readLog(log_path)
self.assertEqual(len(self._log_data), 2)
@@ -341,33 +341,34 @@ class EventLogTestCase(unittest.TestCase):
def test_write_with_filename(self):
"""Test Write() with a path to a file exits with None."""
self.assertIsNone(self._event_log_module.Write(path="path/to/file"))
self.assertIsNone(self._event_log.Write(path="path/to/file"))
def test_write_with_git_config(self):
"""Test Write() uses the git config path when 'git config' call
succeeds."""
with tempfile.TemporaryDirectory(prefix="event_log_tests") as tempdir:
with mock.patch.object(
self._event_log_module,
self._event_log,
"_GetEventTargetPath",
return_value=tempdir,
):
self.assertEqual(
os.path.dirname(self._event_log_module.Write()), tempdir
os.path.dirname(self._event_log.Write()), tempdir
)
def test_write_no_git_config(self):
"""Test Write() with no git config variable present exits with None."""
with mock.patch.object(
self._event_log_module, "_GetEventTargetPath", return_value=None
self._event_log, "_GetEventTargetPath", return_value=None
):
self.assertIsNone(self._event_log_module.Write())
self.assertIsNone(self._event_log.Write())
def test_write_non_string(self):
"""Test Write() with non-string type for |path| throws TypeError."""
with self.assertRaises(TypeError):
self._event_log_module.Write(path=1234)
self._event_log.Write(path=1234)
@unittest.skipIf(not hasattr(socket, "AF_UNIX"), "Requires AF_UNIX sockets")
def test_write_socket(self):
"""Test Write() with Unix domain socket for |path| and validate received
traces."""
@@ -388,10 +389,8 @@ class EventLogTestCase(unittest.TestCase):
with server_ready:
server_ready.wait(timeout=120)
self._event_log_module.StartEvent([])
path = self._event_log_module.Write(
path=f"af_unix:{socket_path}"
)
self._event_log.StartEvent([])
path = self._event_log.Write(path=f"af_unix:{socket_path}")
finally:
server_thread.join(timeout=5)
@@ -404,3 +403,59 @@ class EventLogTestCase(unittest.TestCase):
# Check for 'start' event specific fields.
self.assertIn("argv", start_event)
self.assertIsInstance(start_event["argv"], list)
class EventLogVerboseTestCase(unittest.TestCase):
"""TestCase for the EventLog module verbose logging."""
def setUp(self):
self._event_log = git_trace2_event_log.EventLog(env={})
def test_write_socket_error_no_verbose(self):
"""Test Write() suppression of socket errors when not verbose."""
self._event_log.verbose = False
with contextlib.redirect_stderr(
io.StringIO()
) as mock_stderr, mock.patch("socket.socket", side_effect=OSError):
self._event_log.Write(path="af_unix:stream:/tmp/test_sock")
self.assertEqual(mock_stderr.getvalue(), "")
def test_write_socket_error_verbose(self):
"""Test Write() printing of socket errors when verbose."""
self._event_log.verbose = True
with contextlib.redirect_stderr(
io.StringIO()
) as mock_stderr, mock.patch(
"socket.socket", side_effect=OSError("Mock error")
):
self._event_log.Write(path="af_unix:stream:/tmp/test_sock")
self.assertIn(
"git trace2 logging failed: Mock error",
mock_stderr.getvalue(),
)
def test_write_file_error_no_verbose(self):
"""Test Write() suppression of file errors when not verbose."""
self._event_log.verbose = False
with contextlib.redirect_stderr(
io.StringIO()
) as mock_stderr, mock.patch(
"tempfile.NamedTemporaryFile", side_effect=FileExistsError
):
self._event_log.Write(path="/tmp")
self.assertEqual(mock_stderr.getvalue(), "")
def test_write_file_error_verbose(self):
"""Test Write() printing of file errors when verbose."""
self._event_log.verbose = True
with contextlib.redirect_stderr(
io.StringIO()
) as mock_stderr, mock.patch(
"tempfile.NamedTemporaryFile",
side_effect=FileExistsError("Mock error"),
):
self._event_log.Write(path="/tmp")
self.assertIn(
"git trace2 logging failed: FileExistsError",
mock_stderr.getvalue(),
)
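The new `EventLogVerboseTestCase` checks that logging failures are silent unless `verbose` is set, using `contextlib.redirect_stderr` to capture output. The gate can be sketched with a stand-in class (`ToyLog` is illustrative, not the real `EventLog`):

```python
import contextlib
import io
import sys


class ToyLog:
    """Stand-in showing verbose-gated error reporting."""

    verbose = False

    def Write(self, path):
        try:
            raise OSError("Mock error")  # simulate a socket/file failure
        except OSError as e:
            if self.verbose:
                print(
                    f"repo: warning: git trace2 logging failed: {e}",
                    file=sys.stderr,
                )
            return None  # failures never propagate to the caller
```

Swallowing the error either way matters here: trace2 logging is best-effort telemetry and must not break the command being run.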


@@ -14,42 +14,47 @@
"""Unittests for the hooks.py module."""
import unittest
import pytest
import hooks
class RepoHookShebang(unittest.TestCase):
"""Check shebang parsing in RepoHook."""
@pytest.mark.parametrize(
"data",
(
"",
"#\n# foo\n",
"# Bad shebang in script\n#!/foo\n",
),
)
def test_no_shebang(data: str) -> None:
"""Lines w/out shebangs should be rejected."""
assert hooks.RepoHook._ExtractInterpFromShebang(data) is None
def test_no_shebang(self):
"""Lines w/out shebangs should be rejected."""
DATA = ("", "#\n# foo\n", "# Bad shebang in script\n#!/foo\n")
for data in DATA:
self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))
def test_direct_interp(self):
"""Lines whose shebang points directly to the interpreter."""
DATA = (
("#!/foo", "/foo"),
("#! /foo", "/foo"),
("#!/bin/foo ", "/bin/foo"),
("#! /usr/foo ", "/usr/foo"),
("#! /usr/foo -args", "/usr/foo"),
)
for shebang, interp in DATA:
self.assertEqual(
hooks.RepoHook._ExtractInterpFromShebang(shebang), interp
)
@pytest.mark.parametrize(
"shebang, interp",
(
("#!/foo", "/foo"),
("#! /foo", "/foo"),
("#!/bin/foo ", "/bin/foo"),
("#! /usr/foo ", "/usr/foo"),
("#! /usr/foo -args", "/usr/foo"),
),
)
def test_direct_interp(shebang: str, interp: str) -> None:
"""Lines whose shebang points directly to the interpreter."""
assert hooks.RepoHook._ExtractInterpFromShebang(shebang) == interp
def test_env_interp(self):
"""Lines whose shebang launches through `env`."""
DATA = (
("#!/usr/bin/env foo", "foo"),
("#!/bin/env foo", "foo"),
("#! /bin/env /bin/foo ", "/bin/foo"),
)
for shebang, interp in DATA:
self.assertEqual(
hooks.RepoHook._ExtractInterpFromShebang(shebang), interp
)
@pytest.mark.parametrize(
"shebang, interp",
(
("#!/usr/bin/env foo", "foo"),
("#!/bin/env foo", "foo"),
("#! /bin/env /bin/foo ", "/bin/foo"),
),
)
def test_env_interp(shebang: str, interp: str) -> None:
"""Lines whose shebang launches through `env`."""
assert hooks.RepoHook._ExtractInterpFromShebang(shebang) == interp
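The parametrized cases above fully specify the shebang-parsing contract: no shebang on the first line yields `None`, a direct path wins, and `#!...env foo` resolves to `foo`. A stand-in implementation satisfying all of these cases (repo's actual `RepoHook._ExtractInterpFromShebang` may differ in details):

```python
def extract_interp(data):
    """Return the interpreter from a script's shebang line, if any."""
    firstline = data.splitlines()[0] if data else ""
    if not firstline.startswith("#!"):
        return None  # only the first line counts as a shebang
    parts = firstline[2:].strip().split()
    if not parts:
        return None
    if parts[0].endswith("/env") and len(parts) > 1:
        return parts[1]  # `#!/usr/bin/env foo` launches foo
    return parts[0]  # direct path; trailing args are ignored
```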


@@ -15,6 +15,7 @@
"""Unittests for the manifest_xml.py module."""
import os
from pathlib import Path
import platform
import re
import tempfile
@@ -97,36 +98,34 @@ class ManifestParseTestCase(unittest.TestCase):
def setUp(self):
self.tempdirobj = tempfile.TemporaryDirectory(prefix="repo_tests")
self.tempdir = self.tempdirobj.name
self.repodir = os.path.join(self.tempdir, ".repo")
self.manifest_dir = os.path.join(self.repodir, "manifests")
self.manifest_file = os.path.join(
self.repodir, manifest_xml.MANIFEST_FILE_NAME
self.tempdir = Path(self.tempdirobj.name)
self.repodir = self.tempdir / ".repo"
self.manifest_dir = self.repodir / "manifests"
self.manifest_file = self.repodir / manifest_xml.MANIFEST_FILE_NAME
self.local_manifest_dir = (
self.repodir / manifest_xml.LOCAL_MANIFESTS_DIR_NAME
)
self.local_manifest_dir = os.path.join(
self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME
)
os.mkdir(self.repodir)
os.mkdir(self.manifest_dir)
self.repodir.mkdir()
self.manifest_dir.mkdir()
# The manifest parsing really wants a git repo currently.
gitdir = os.path.join(self.repodir, "manifests.git")
os.mkdir(gitdir)
with open(os.path.join(gitdir, "config"), "w") as fp:
fp.write(
"""[remote "origin"]
gitdir = self.repodir / "manifests.git"
gitdir.mkdir()
(gitdir / "config").write_text(
"""[remote "origin"]
url = https://localhost:0/manifest
"""
)
)
def tearDown(self):
self.tempdirobj.cleanup()
def getXmlManifest(self, data):
"""Helper to initialize a manifest for testing."""
with open(self.manifest_file, "w", encoding="utf-8") as fp:
fp.write(data)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
self.manifest_file.write_text(data, encoding="utf-8")
return manifest_xml.XmlManifest(
str(self.repodir), str(self.manifest_file)
)
@staticmethod
def encodeXmlAttr(attr):
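The setUp migration above swaps `os.path` string joins for `pathlib` operations; the two spellings name the same locations, and call sites that still require strings (like `XmlManifest` here) add `str()` explicitly. A quick equivalence check:

```python
import os
from pathlib import Path

tempdir = "/tmp/repo_tests"

# Old style: string paths built with os.path.join().
repodir_str = os.path.join(tempdir, ".repo")
manifest_dir_str = os.path.join(repodir_str, "manifests")

# New style: Path objects built with the / operator.
repodir = Path(tempdir) / ".repo"
manifest_dir = repodir / "manifests"

# Both spellings resolve to the same normalized locations.
assert str(repodir) == os.path.normpath(repodir_str)
assert str(manifest_dir) == os.path.normpath(manifest_dir_str)
```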
@@ -243,12 +242,14 @@ class XmlManifestTests(ManifestParseTestCase):
def test_link(self):
"""Verify Link handling with new names."""
manifest = manifest_xml.XmlManifest(self.repodir, self.manifest_file)
with open(os.path.join(self.manifest_dir, "foo.xml"), "w") as fp:
fp.write("<manifest></manifest>")
manifest = manifest_xml.XmlManifest(
str(self.repodir), str(self.manifest_file)
)
(self.manifest_dir / "foo.xml").write_text("<manifest></manifest>")
manifest.Link("foo.xml")
with open(self.manifest_file) as fp:
self.assertIn('<include name="foo.xml" />', fp.read())
self.assertIn(
'<include name="foo.xml" />', self.manifest_file.read_text()
)
def test_toxml_empty(self):
"""Verify the ToXml() helper."""
@@ -400,16 +401,41 @@ class XmlManifestTests(ManifestParseTestCase):
self.assertEqual(len(manifest.projects), 1)
self.assertEqual(manifest.projects[0].name, "test-project")
def test_sync_j_max(self):
"""Check sync-j-max handling."""
# Check valid value.
manifest = self.getXmlManifest(
'<manifest><default sync-j-max="5" /></manifest>'
)
self.assertEqual(manifest.default.sync_j_max, 5)
self.assertEqual(
manifest.ToXml().toxml(),
'<?xml version="1.0" ?>'
'<manifest><default sync-j-max="5"/></manifest>',
)
# Check invalid values.
with self.assertRaises(error.ManifestParseError):
manifest = self.getXmlManifest(
'<manifest><default sync-j-max="0" /></manifest>'
)
manifest.ToXml()
with self.assertRaises(error.ManifestParseError):
manifest = self.getXmlManifest(
'<manifest><default sync-j-max="-1" /></manifest>'
)
manifest.ToXml()
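The validation exercised above (positive integers accepted, zero and negative values rejected) can be sketched as follows. This is a simplified stand-in for the real parsing in manifest_xml.py, not its actual implementation:

```python
import xml.dom.minidom
from typing import Optional


def parse_sync_j_max(xml_text: str) -> Optional[int]:
    """Return the <default sync-j-max=...> value, rejecting values < 1.

    Simplified stand-in for the manifest_xml.py parsing under test.
    """
    doc = xml.dom.minidom.parseString(xml_text)
    for node in doc.documentElement.getElementsByTagName("default"):
        raw = node.getAttribute("sync-j-max")
        if raw:
            value = int(raw)
            if value < 1:
                raise ValueError(f"sync-j-max must be positive: {raw}")
            return value
    return None
```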
class IncludeElementTests(ManifestParseTestCase):
"""Tests for <include>."""
def test_revision_default(self):
"""Check handling of revision attribute."""
root_m = os.path.join(self.manifest_dir, "root.xml")
with open(root_m, "w") as fp:
fp.write(
"""
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
@@ -418,17 +444,34 @@ class IncludeElementTests(ManifestParseTestCase):
<project name="root-name2" path="root-path2" />
</manifest>
"""
)
with open(os.path.join(self.manifest_dir, "stable.xml"), "w") as fp:
fp.write(
"""
)
(self.manifest_dir / "stable.xml").write_text(
"""
<manifest>
<include name="man1.xml" />
<include name="man2.xml" revision="stable-branch2" />
<project name="stable-name1" path="stable-path1" />
<project name="stable-name2" path="stable-path2" revision="stable-branch2" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(self.repodir, root_m)
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<project name="man1-name1" />
<project name="man1-name2" revision="stable-branch3" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<project name="man2-name1" />
<project name="man2-name2" revision="stable-branch3" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
for proj in include_m.projects:
if proj.name == "root-name1":
# Check include revision not set on root level proj.
@@ -442,12 +485,19 @@ class IncludeElementTests(ManifestParseTestCase):
if proj.name == "stable-name2":
# Check stable proj revision can override include node.
self.assertEqual("stable-branch2", proj.revisionExpr)
if proj.name == "man1-name1":
self.assertEqual("stable-branch", proj.revisionExpr)
if proj.name == "man1-name2":
self.assertEqual("stable-branch3", proj.revisionExpr)
if proj.name == "man2-name1":
self.assertEqual("stable-branch2", proj.revisionExpr)
if proj.name == "man2-name2":
self.assertEqual("stable-branch3", proj.revisionExpr)
def test_group_levels(self):
root_m = os.path.join(self.manifest_dir, "root.xml")
with open(root_m, "w") as fp:
fp.write(
"""
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
@@ -456,25 +506,23 @@ class IncludeElementTests(ManifestParseTestCase):
<project name="root-name2" path="root-path2" groups="r2g1,r2g2" />
</manifest>
"""
)
with open(os.path.join(self.manifest_dir, "level1.xml"), "w") as fp:
fp.write(
"""
)
(self.manifest_dir / "level1.xml").write_text(
"""
<manifest>
<include name="level2.xml" groups="level2-group" />
<project name="level1-name1" path="level1-path1" />
</manifest>
"""
)
with open(os.path.join(self.manifest_dir, "level2.xml"), "w") as fp:
fp.write(
"""
)
(self.manifest_dir / "level2.xml").write_text(
"""
<manifest>
<project name="level2-name1" path="level2-path1" groups="l2g1,l2g2" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(self.repodir, root_m)
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
for proj in include_m.projects:
if proj.name == "root-name1":
# Check include group not set on root level proj.
@@ -492,6 +540,68 @@ class IncludeElementTests(ManifestParseTestCase):
# Check level2 proj group not removed.
self.assertIn("l2g1", proj.groups)
def test_group_levels_with_extend_project(self):
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<include name="man1.xml" groups="top-group1" />
<include name="man2.xml" groups="top-group2" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<project name="project1" path="project1" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<extend-project name="project1" groups="eg1" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
proj = include_m.projects[0]
# Check project has inherited group via project element.
self.assertIn("top-group1", proj.groups)
# Check project has inherited group via extend-project element.
self.assertIn("top-group2", proj.groups)
# Check project has set group via extend-project element.
self.assertIn("eg1", proj.groups)
def test_extend_project_does_not_inherit_local_groups(self):
"""Check that extend-project does not inherit local groups."""
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="project1" path="project1" />
<include name="man1.xml" groups="g1,local:g2" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<extend-project name="project1" groups="g3" />
</manifest>
"""
)
include_m = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
proj = include_m.projects[0]
self.assertIn("g1", proj.groups)
self.assertNotIn("local:g2", proj.groups)
self.assertIn("g3", proj.groups)
def test_allow_bad_name_from_user(self):
"""Check handling of bad name attribute from the user's input."""
@@ -510,9 +620,8 @@ class IncludeElementTests(ManifestParseTestCase):
manifest.ToXml()
# Setup target of the include.
target = os.path.join(self.tempdir, "target.xml")
with open(target, "w") as fp:
fp.write("<manifest></manifest>")
target = self.tempdir / "target.xml"
target.write_text("<manifest></manifest>")
# Include with absolute path.
parse(os.path.abspath(target))
@@ -526,12 +635,9 @@ class IncludeElementTests(ManifestParseTestCase):
def parse(name):
name = self.encodeXmlAttr(name)
# Setup target of the include.
with open(
os.path.join(self.manifest_dir, "target.xml"),
"w",
encoding="utf-8",
) as fp:
fp.write(f'<manifest><include name="{name}"/></manifest>')
(self.manifest_dir / "target.xml").write_text(
f'<manifest><include name="{name}"/></manifest>'
)
manifest = self.getXmlManifest(
"""
@@ -578,18 +684,18 @@ class ProjectElementTests(ManifestParseTestCase):
manifest.projects[0].name: manifest.projects[0].groups,
manifest.projects[1].name: manifest.projects[1].groups,
}
self.assertCountEqual(
result["test-name"], ["name:test-name", "all", "path:test-path"]
self.assertEqual(
result["test-name"], {"name:test-name", "all", "path:test-path"}
)
self.assertCountEqual(
self.assertEqual(
result["extras"],
["g1", "g2", "g1", "name:extras", "all", "path:path"],
{"g1", "g2", "name:extras", "all", "path:path"},
)
groupstr = "default,platform-" + platform.system().lower()
self.assertEqual(groupstr, manifest.GetGroupsStr())
self.assertEqual(groupstr, manifest.GetManifestGroupsStr())
groupstr = "g1,g2,g1"
manifest.manifestProject.config.SetString("manifest.groups", groupstr)
self.assertEqual(groupstr, manifest.GetGroupsStr())
self.assertEqual(groupstr, manifest.GetManifestGroupsStr())
def test_set_revision_id(self):
"""Check setting of project's revisionId."""
@@ -1214,6 +1320,206 @@ class ExtendProjectElementTests(ManifestParseTestCase):
self.assertEqual(len(manifest.projects), 1)
self.assertEqual(manifest.projects[0].upstream, "bar")
def test_extend_project_copyfiles(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject" />
<extend-project name="myproject">
<copyfile src="foo" dest="bar" />
</extend-project>
</manifest>
"""
)
self.assertEqual(list(manifest.projects[0].copyfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].copyfiles)[0].dest, "bar")
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<copyfile dest="bar" src="foo"/>'
"</project>"
"</manifest>",
)
def test_extend_project_duplicate_copyfiles(self):
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="myproject" />
<include name="man1.xml" />
<include name="man2.xml" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "common.xml").write_text(
"""
<manifest>
<extend-project name="myproject">
<copyfile dest="bar" src="foo"/>
</extend-project>
</manifest>
"""
)
manifest = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
self.assertEqual(len(manifest.projects[0].copyfiles), 1)
self.assertEqual(list(manifest.projects[0].copyfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].copyfiles)[0].dest, "bar")
def test_extend_project_linkfiles(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject" />
<extend-project name="myproject">
<linkfile src="foo" dest="bar" />
</extend-project>
</manifest>
"""
)
self.assertEqual(list(manifest.projects[0].linkfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].linkfiles)[0].dest, "bar")
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<linkfile dest="bar" src="foo"/>'
"</project>"
"</manifest>",
)
def test_extend_project_duplicate_linkfiles(self):
root_m = self.manifest_dir / "root.xml"
root_m.write_text(
"""
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="myproject" />
<include name="man1.xml" />
<include name="man2.xml" />
</manifest>
"""
)
(self.manifest_dir / "man1.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "man2.xml").write_text(
"""
<manifest>
<include name="common.xml" />
</manifest>
"""
)
(self.manifest_dir / "common.xml").write_text(
"""
<manifest>
<extend-project name="myproject">
<linkfile dest="bar" src="foo"/>
</extend-project>
</manifest>
"""
)
manifest = manifest_xml.XmlManifest(str(self.repodir), str(root_m))
self.assertEqual(len(manifest.projects[0].linkfiles), 1)
self.assertEqual(list(manifest.projects[0].linkfiles)[0].src, "foo")
self.assertEqual(list(manifest.projects[0].linkfiles)[0].dest, "bar")
def test_extend_project_annotations(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject" />
<extend-project name="myproject">
<annotation name="foo" value="bar" />
</extend-project>
</manifest>
"""
)
self.assertEqual(manifest.projects[0].annotations[0].name, "foo")
self.assertEqual(manifest.projects[0].annotations[0].value, "bar")
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<annotation name="foo" value="bar"/>'
"</project>"
"</manifest>",
)
def test_extend_project_annotations_multiples(self):
manifest = self.getXmlManifest(
"""
<manifest>
<remote name="default-remote" fetch="http://localhost" />
<default remote="default-remote" revision="refs/heads/main" />
<project name="myproject">
<annotation name="foo" value="bar" />
<annotation name="few" value="bar" />
</project>
<extend-project name="myproject">
<annotation name="foo" value="new_bar" />
<annotation name="new" value="anno" />
</extend-project>
</manifest>
"""
)
self.assertEqual(
[(a.name, a.value) for a in manifest.projects[0].annotations],
[
("foo", "bar"),
("few", "bar"),
("foo", "new_bar"),
("new", "anno"),
],
)
self.assertEqual(
sort_attributes(manifest.ToXml().toxml()),
'<?xml version="1.0" ?><manifest>'
'<remote fetch="http://localhost" name="default-remote"/>'
'<default remote="default-remote" revision="refs/heads/main"/>'
'<project name="myproject">'
'<annotation name="foo" value="bar"/>'
'<annotation name="few" value="bar"/>'
'<annotation name="foo" value="new_bar"/>'
'<annotation name="new" value="anno"/>'
"</project>"
"</manifest>",
)
class NormalizeUrlTests(ManifestParseTestCase):
"""Tests for normalize_url() in manifest_xml.py"""


@@ -1,4 +1,4 @@
# Copyright 2021 The Android Open Source Project
# Copyright (C) 2021 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -14,39 +14,35 @@
"""Unittests for the platform_utils.py module."""
import os
import tempfile
import unittest
from pathlib import Path
import pytest
import platform_utils
class RemoveTests(unittest.TestCase):
"""Check remove() helper."""
def test_remove_missing_ok(tmp_path: Path) -> None:
"""Check missing_ok handling."""
path = tmp_path / "test"
def testMissingOk(self):
"""Check missing_ok handling."""
with tempfile.TemporaryDirectory() as tmpdir:
path = os.path.join(tmpdir, "test")
# Should not fail.
platform_utils.remove(path, missing_ok=True)
# Should not fail.
platform_utils.remove(path, missing_ok=True)
# Should fail.
with pytest.raises(OSError):
platform_utils.remove(path)
with pytest.raises(OSError):
platform_utils.remove(path, missing_ok=False)
# Should fail.
self.assertRaises(OSError, platform_utils.remove, path)
self.assertRaises(
OSError, platform_utils.remove, path, missing_ok=False
)
# Should not fail if it exists.
path.touch()
platform_utils.remove(path, missing_ok=True)
assert not path.exists()
# Should not fail if it exists.
open(path, "w").close()
platform_utils.remove(path, missing_ok=True)
self.assertFalse(os.path.exists(path))
path.touch()
platform_utils.remove(path)
assert not path.exists()
open(path, "w").close()
platform_utils.remove(path)
self.assertFalse(os.path.exists(path))
open(path, "w").close()
platform_utils.remove(path, missing_ok=False)
self.assertFalse(os.path.exists(path))
path.touch()
platform_utils.remove(path, missing_ok=False)
assert not path.exists()


@@ -19,35 +19,18 @@ import os
from pathlib import Path
import subprocess
import tempfile
from typing import Optional
import unittest
import utils_for_test
import error
import git_command
import git_config
import manifest_xml
import platform_utils
import project
@contextlib.contextmanager
def TempGitTree():
"""Create a new empty git checkout for testing."""
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
# Tests need to assume, that main is default branch at init,
# which is not supported in config until 2.28.
cmd = ["git", "init"]
if git_command.git_require((2, 28, 0)):
cmd += ["--initial-branch=main"]
else:
# Use template dir for init.
templatedir = tempfile.mkdtemp(prefix=".test-template")
with open(os.path.join(templatedir, "HEAD"), "w") as fp:
fp.write("ref: refs/heads/main\n")
cmd += ["--template", templatedir]
subprocess.check_call(cmd, cwd=tempdir)
yield tempdir
class FakeProject:
"""A fake for Project for basic functionality."""
@@ -63,13 +46,16 @@ class FakeProject:
)
self.config = git_config.GitConfig.ForRepository(gitdir=self.gitdir)
def RelPath(self, local: Optional[bool] = None) -> str:
return self.name
class ReviewableBranchTests(unittest.TestCase):
"""Check ReviewableBranch behavior."""
def test_smoke(self):
"""A quick run through everything."""
with TempGitTree() as tempdir:
with utils_for_test.TempGitTree() as tempdir:
fakeproj = FakeProject(tempdir)
# Generate some commits.
@@ -116,6 +102,29 @@ class ProjectTests(unittest.TestCase):
"abcd00%21%21_%2b",
)
@unittest.skipUnless(
utils_for_test.supports_reftable(),
"git reftable support is required for this test",
)
def test_get_head_unborn_reftable(self):
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
subprocess.check_call(
[
"git",
"-c",
"init.defaultRefFormat=reftable",
"init",
"-q",
tempdir,
]
)
fakeproj = FakeProject(tempdir)
expected = subprocess.check_output(
["git", "-C", tempdir, "symbolic-ref", "-q", "HEAD"],
encoding="utf-8",
).strip()
self.assertEqual(expected, fakeproj.work_git.GetHead())
class CopyLinkTestCase(unittest.TestCase):
"""TestCase for stub repo client checkouts.
@@ -359,6 +368,7 @@ class MigrateWorkTreeTests(unittest.TestCase):
"""Check _MigrateOldWorkTreeGitDir handling."""
_SYMLINKS = {
# go/keep-sorted start
"config",
"description",
"hooks",
@@ -367,9 +377,11 @@ class MigrateWorkTreeTests(unittest.TestCase):
"objects",
"packed-refs",
"refs",
"reftable",
"rr-cache",
"shallow",
"svn",
# go/keep-sorted end
}
_FILES = {
"COMMIT_EDITMSG",
@@ -448,6 +460,25 @@ class MigrateWorkTreeTests(unittest.TestCase):
for name in self._SYMLINKS:
self.assertTrue((dotgit / name).is_symlink())
def test_reftable_anchor_with_refs_dir(self):
"""Migrate when reftable/ and refs/ are directories."""
with self._simple_layout() as tempdir:
dotgit = tempdir / "src/test/.git"
(dotgit / "refs").unlink()
(dotgit / "refs").mkdir()
(dotgit / "refs" / "heads").write_text("dummy")
(dotgit / "reftable").unlink()
(dotgit / "reftable").mkdir()
(dotgit / "reftable" / "tables.list").write_text("dummy")
project.Project._MigrateOldWorkTreeGitDir(str(dotgit))
self.assertTrue(dotgit.is_symlink())
self.assertEqual(
os.readlink(dotgit),
os.path.normpath("../../.repo/projects/src/test.git"),
)
class ManifestPropertiesFetchedCorrectly(unittest.TestCase):
"""Ensure properties are fetched properly."""
@@ -467,7 +498,7 @@ class ManifestPropertiesFetchedCorrectly(unittest.TestCase):
def test_manifest_config_properties(self):
"""Test we are fetching the manifest config properties correctly."""
with TempGitTree() as tempdir:
with utils_for_test.TempGitTree() as tempdir:
fakeproj = self.setUpManifest(tempdir)
# Set property using the expected Set method, then ensure


@@ -12,90 +12,96 @@
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unit test for repo_logging module."""
"""Unittests for the repo_logging.py module."""
import contextlib
import io
import logging
import unittest
import re
from unittest import mock
import pytest
from color import SetDefaultColoring
from error import RepoExitError
from repo_logging import RepoLogger
class TestRepoLogger(unittest.TestCase):
@mock.patch.object(RepoLogger, "error")
def test_log_aggregated_errors_logs_aggregated_errors(self, mock_error):
"""Test if log_aggregated_errors logs a list of aggregated errors."""
logger = RepoLogger(__name__)
logger.log_aggregated_errors(
RepoExitError(
aggregate_errors=[
Exception("foo"),
Exception("bar"),
Exception("baz"),
Exception("hello"),
Exception("world"),
Exception("test"),
]
)
)
mock_error.assert_has_calls(
[
mock.call("=" * 80),
mock.call(
"Repo command failed due to the following `%s` errors:",
"RepoExitError",
),
mock.call("foo\nbar\nbaz\nhello\nworld"),
mock.call("+%d additional errors...", 1),
@mock.patch.object(RepoLogger, "error")
def test_log_aggregated_errors_logs_aggregated_errors(mock_error) -> None:
"""Test if log_aggregated_errors logs a list of aggregated errors."""
logger = RepoLogger(__name__)
logger.log_aggregated_errors(
RepoExitError(
aggregate_errors=[
Exception("foo"),
Exception("bar"),
Exception("baz"),
Exception("hello"),
Exception("world"),
Exception("test"),
]
)
)
@mock.patch.object(RepoLogger, "error")
def test_log_aggregated_errors_logs_single_error(self, mock_error):
"""Test if log_aggregated_errors logs empty aggregated_errors."""
mock_error.assert_has_calls(
[
mock.call("=" * 80),
mock.call(
"Repo command failed due to the following `%s` errors:",
"RepoExitError",
),
mock.call("foo\nbar\nbaz\nhello\nworld"),
mock.call("+%d additional errors...", 1),
]
)
@mock.patch.object(RepoLogger, "error")
def test_log_aggregated_errors_logs_single_error(mock_error) -> None:
"""Test if log_aggregated_errors logs empty aggregated_errors."""
logger = RepoLogger(__name__)
logger.log_aggregated_errors(RepoExitError())
mock_error.assert_has_calls(
[
mock.call("=" * 80),
mock.call("Repo command failed: %s", "RepoExitError"),
]
)
@pytest.mark.parametrize(
"level",
(
logging.INFO,
logging.WARN,
logging.ERROR,
),
)
def test_log_with_format_string(level: int) -> None:
"""Test different log levels with format strings."""
name = logging.getLevelName(level)
# Set color output to "always" for consistent test results.
# This ensures the logger's behavior is uniform across different
# environments and git configurations.
SetDefaultColoring("always")
# Regex pattern to match optional ANSI color codes.
# \033 - Escape character
# \[ - Opening square bracket
# [0-9;]* - Zero or more digits or semicolons
# m - Ending 'm' character
# ? - Makes the entire group optional
opt_color = r"(\033\[[0-9;]*m)?"
output = io.StringIO()
with contextlib.redirect_stderr(output):
logger = RepoLogger(__name__)
logger.log_aggregated_errors(RepoExitError())
logger.log(level, "%s", "100% pass")
mock_error.assert_has_calls(
[
mock.call("=" * 80),
mock.call("Repo command failed: %s", "RepoExitError"),
]
)
def test_log_with_format_string(self):
"""Test different log levels with format strings."""
# Set color output to "always" for consistent test results.
# This ensures the logger's behavior is uniform across different
# environments and git configurations.
SetDefaultColoring("always")
# Regex pattern to match optional ANSI color codes.
# \033 - Escape character
# \[ - Opening square bracket
# [0-9;]* - Zero or more digits or semicolons
# m - Ending 'm' character
# ? - Makes the entire group optional
opt_color = r"(\033\[[0-9;]*m)?"
for level in (logging.INFO, logging.WARN, logging.ERROR):
name = logging.getLevelName(level)
with self.subTest(level=level, name=name):
output = io.StringIO()
with contextlib.redirect_stderr(output):
logger = RepoLogger(__name__)
logger.log(level, "%s", "100% pass")
self.assertRegex(
output.getvalue().strip(),
f"^{opt_color}100% pass{opt_color}$",
f"failed for level {name}",
)
assert re.search(
f"^{opt_color}100% pass{opt_color}$", output.getvalue().strip()
), f"failed for level {name}"
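The `opt_color` pattern above matches the log output whether or not color codes are emitted. A quick standalone check, assuming the "always" coloring wraps the message in standard ANSI escapes:

```python
import re

# Optional ANSI color escape: ESC, '[', zero or more digits/semicolons, 'm'.
opt_color = r"(\033\[[0-9;]*m)?"
pattern = f"^{opt_color}100% pass{opt_color}$"

# Matches both plain and colorized renderings of the same message.
assert re.search(pattern, "100% pass")
assert re.search(pattern, "\033[1;31m100% pass\033[m")
```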


@@ -1,4 +1,4 @@
# Copyright 2022 The Android Open Source Project
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -15,46 +15,37 @@
"""Unittests for the repo_trace.py module."""
import os
import unittest
from unittest import mock
import pytest
import repo_trace
class TraceTests(unittest.TestCase):
def test_trace_max_size_enforced(monkeypatch: pytest.MonkeyPatch) -> None:
"""Check Trace behavior."""
content = "git chicken"
def testTrace_MaxSizeEnforced(self):
content = "git chicken"
with repo_trace.Trace(content, first_trace=True):
pass
first_trace_size = os.path.getsize(repo_trace._TRACE_FILE)
with repo_trace.Trace(content, first_trace=True):
pass
first_trace_size = os.path.getsize(repo_trace._TRACE_FILE)
with repo_trace.Trace(content):
pass
assert os.path.getsize(repo_trace._TRACE_FILE) > first_trace_size
with repo_trace.Trace(content):
pass
self.assertGreater(
os.path.getsize(repo_trace._TRACE_FILE), first_trace_size
)
# Check we clear everything if the last chunk is larger than _MAX_SIZE.
monkeypatch.setattr(repo_trace, "_MAX_SIZE", 0)
with repo_trace.Trace(content, first_trace=True):
pass
assert os.path.getsize(repo_trace._TRACE_FILE) == first_trace_size
# Check we clear everything is the last chunk is larger than _MAX_SIZE.
with mock.patch("repo_trace._MAX_SIZE", 0):
with repo_trace.Trace(content, first_trace=True):
pass
self.assertEqual(
first_trace_size, os.path.getsize(repo_trace._TRACE_FILE)
)
# Check we only clear the chunks we need to.
new_max = (first_trace_size + 1) / (1024 * 1024)
monkeypatch.setattr(repo_trace, "_MAX_SIZE", new_max)
with repo_trace.Trace(content, first_trace=True):
pass
assert os.path.getsize(repo_trace._TRACE_FILE) == first_trace_size * 2
# Check we only clear the chunks we need to.
repo_trace._MAX_SIZE = (first_trace_size + 1) / (1024 * 1024)
with repo_trace.Trace(content, first_trace=True):
pass
self.assertEqual(
first_trace_size * 2, os.path.getsize(repo_trace._TRACE_FILE)
)
with repo_trace.Trace(content, first_trace=True):
pass
self.assertEqual(
first_trace_size * 2, os.path.getsize(repo_trace._TRACE_FILE)
)
with repo_trace.Trace(content, first_trace=True):
pass
assert os.path.getsize(repo_trace._TRACE_FILE) == first_trace_size * 2
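The move from `mock.patch` to the `monkeypatch` fixture above keeps each patch scoped to its test and undone automatically at teardown. The mechanism can be sketched directly with the public `pytest.MonkeyPatch` API, using a toy class standing in for a module attribute such as `repo_trace._MAX_SIZE`:

```python
import pytest


class Config:
    """Toy stand-in for a patched module attribute."""

    MAX_SIZE = 1024


mp = pytest.MonkeyPatch()
mp.setattr(Config, "MAX_SIZE", 0)
assert Config.MAX_SIZE == 0

# undo() restores the original value; the monkeypatch fixture calls
# this automatically when the test finishes.
mp.undo()
assert Config.MAX_SIZE == 1024
```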


@@ -1,4 +1,4 @@
# Copyright 2019 The Android Open Source Project
# Copyright (C) 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -16,65 +16,84 @@
import multiprocessing
import subprocess
import unittest
from typing import Tuple
from unittest import mock
import pytest
import ssh
class SshTests(unittest.TestCase):
"""Tests the ssh functions."""
@pytest.fixture(autouse=True)
def clear_ssh_version_cache() -> None:
"""Clear the ssh version cache before each test."""
ssh.version.cache_clear()
def test_parse_ssh_version(self):
"""Check _parse_ssh_version() handling."""
ver = ssh._parse_ssh_version("Unknown\n")
self.assertEqual(ver, ())
ver = ssh._parse_ssh_version("OpenSSH_1.0\n")
self.assertEqual(ver, (1, 0))
ver = ssh._parse_ssh_version(
"OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n"
)
self.assertEqual(ver, (6, 6, 1))
ver = ssh._parse_ssh_version(
"OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n"
)
self.assertEqual(ver, (7, 6))
ver = ssh._parse_ssh_version("OpenSSH_9.0p1, LibreSSL 3.3.6\n")
self.assertEqual(ver, (9, 0))
def test_version(self):
"""Check version() handling."""
with mock.patch("ssh._run_ssh_version", return_value="OpenSSH_1.2\n"):
self.assertEqual(ssh.version(), (1, 2))
@pytest.mark.parametrize(
"input_str, expected",
(
("Unknown\n", ()),
("OpenSSH_1.0\n", (1, 0)),
(
"OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n",
(6, 6, 1),
),
(
"OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n 7 Dec 2017\n",
(7, 6),
),
("OpenSSH_9.0p1, LibreSSL 3.3.6\n", (9, 0)),
),
)
def test_parse_ssh_version(input_str: str, expected: Tuple[int, ...]) -> None:
"""Check _parse_ssh_version() handling."""
assert ssh._parse_ssh_version(input_str) == expected
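The parametrized cases above reduce to pulling the dotted version off the `OpenSSH_` prefix and ignoring everything after it. A hedged sketch of equivalent logic (the real implementation is `ssh._parse_ssh_version`):

```python
import re
from typing import Tuple


def parse_ssh_version(output: str) -> Tuple[int, ...]:
    """Extract a version tuple from `ssh -V` style output.

    Illustrative sketch matching the parametrized cases above.
    """
    m = re.match(r"OpenSSH_([0-9.]+)", output)
    if not m:
        return ()
    # "6.6.1p1" matches only "6.6.1"; the patch level is dropped.
    return tuple(int(part) for part in m.group(1).split(".") if part)
```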
def test_context_manager_empty(self):
"""Verify context manager with no clients works correctly."""
with multiprocessing.Manager() as manager:
with ssh.ProxyManager(manager):
pass
def test_context_manager_child_cleanup(self):
"""Verify orphaned clients & masters get cleaned up."""
with multiprocessing.Manager() as manager:
def test_version() -> None:
"""Check version() handling."""
with mock.patch("ssh._run_ssh_version", return_value="OpenSSH_1.2\n"):
assert ssh.version() == (1, 2)
def test_context_manager_empty() -> None:
"""Verify context manager with no clients works correctly."""
with multiprocessing.Manager() as manager:
with ssh.ProxyManager(manager):
pass
def test_context_manager_child_cleanup() -> None:
"""Verify orphaned clients & masters get cleaned up."""
with multiprocessing.Manager() as manager:
with mock.patch("ssh.version", return_value=(1, 2)):
with ssh.ProxyManager(manager) as ssh_proxy:
client = subprocess.Popen(["sleep", "964853320"])
ssh_proxy.add_client(client)
master = subprocess.Popen(["sleep", "964853321"])
ssh_proxy.add_master(master)
# If the process still exists, these will throw timeout errors.
client.wait(0)
master.wait(0)
# If the process still exists, these will throw timeout errors.
client.wait(0)
master.wait(0)
def test_ssh_sock(self):
"""Check sock() function."""
manager = multiprocessing.Manager()
def test_ssh_sock(monkeypatch: pytest.MonkeyPatch) -> None:
"""Check sock() function."""
with multiprocessing.Manager() as manager:
proxy = ssh.ProxyManager(manager)
with mock.patch("tempfile.mkdtemp", return_value="/tmp/foo"):
# Old ssh version uses port.
with mock.patch("ssh.version", return_value=(6, 6)):
self.assertTrue(proxy.sock().endswith("%p"))
monkeypatch.setattr(
"tempfile.mkdtemp", lambda *args, **kwargs: "/tmp/foo"
)
proxy._sock_path = None
# New ssh version uses hash.
with mock.patch("ssh.version", return_value=(6, 7)):
self.assertTrue(proxy.sock().endswith("%C"))
# Old ssh version uses port.
with mock.patch("ssh.version", return_value=(6, 6)):
with proxy as ssh_proxy:
assert ssh_proxy.sock().endswith("%p")
proxy._sock_path = None
# New ssh version uses hash.
with mock.patch("ssh.version", return_value=(6, 7)):
with proxy as ssh_proxy:
assert ssh_proxy.sock().endswith("%C")
proxy._sock_path = None


@@ -15,77 +15,164 @@
"""Unittests for the subcmds module (mostly __init__.py than subcommands)."""
import optparse
import unittest
from typing import Type
import pytest
from command import Command
import subcmds
class AllCommands(unittest.TestCase):
"""Check registered all_commands."""
# NB: We don't test all subcommands as we want to avoid "change detection"
# tests, so we just look for the most common/important ones here that are
# unlikely to ever change.
@pytest.mark.parametrize(
"cmd", ("cherry-pick", "help", "init", "start", "sync", "upload")
)
def test_required_basic(cmd: str) -> None:
"""Basic checking of registered commands."""
assert cmd in subcmds.all_commands
def test_required_basic(self):
"""Basic checking of registered commands."""
# NB: We don't test all subcommands as we want to avoid "change
# detection" tests, so we just look for the most common/important ones
# here that are unlikely to ever change.
for cmd in {"cherry-pick", "help", "init", "start", "sync", "upload"}:
self.assertIn(cmd, subcmds.all_commands)
def test_naming(self):
"""Verify we don't add things that we shouldn't."""
for cmd in subcmds.all_commands:
# Reject filename suffixes like "help.py".
self.assertNotIn(".", cmd)
@pytest.mark.parametrize("name", subcmds.all_commands.keys())
def test_naming(name: str) -> None:
"""Verify we don't add things that we shouldn't."""
# Reject filename suffixes like "help.py".
assert "." not in name
# Make sure all '_' were converted to '-'.
self.assertNotIn("_", cmd)
# Make sure all '_' were converted to '-'.
assert "_" not in name
# Reject internal python paths like "__init__".
self.assertFalse(cmd.startswith("__"))
# Reject internal python paths like "__init__".
assert not name.startswith("__")
def test_help_desc_style(self):
"""Force some consistency in option descriptions.
Python's optparse & argparse have a few default options like --help.
Their option description text uses lowercase sentence fragments, so
enforce our options follow the same style so UI is consistent.
@pytest.mark.parametrize("name, cls", subcmds.all_commands.items())
def test_help_desc_style(name: str, cls: Type[Command]) -> None:
"""Force some consistency in option descriptions.
We enforce:
* Text starts with lowercase.
* Text doesn't end with period.
"""
Python's optparse & argparse have a few default options like --help.
Their option description text uses lowercase sentence fragments, so
enforce our options follow the same style so UI is consistent.
We enforce:
* Text starts with lowercase.
* Text doesn't end with period.
"""
cmd = cls()
parser = cmd.OptionParser
for option in parser.option_list:
if option.help == optparse.SUPPRESS_HELP or not option.help:
continue
c = option.help[0]
assert c.lower() == c, (
f"subcmds/{name}.py: {option.get_opt_string()}: "
f'help text should start with lowercase: "{option.help}"'
)
assert option.help[-1] != ".", (
f"subcmds/{name}.py: {option.get_opt_string()}: "
f'help text should not end in a period: "{option.help}"'
)
@pytest.mark.parametrize("name, cls", subcmds.all_commands.items())
def test_cli_option_style(name: str, cls: Type[Command]) -> None:
"""Force some consistency in option flags."""
cmd = cls()
parser = cmd.OptionParser
for option in parser.option_list:
for opt in option._long_opts:
assert "_" not in opt, (
f"subcmds/{name}.py: {opt}: only use dashes in "
"options, not underscores"
)
def test_cli_option_dest() -> None:
"""Block redundant dest= arguments."""
bad_opts: list[tuple[str, str]] = []
def _check_dest(opt: optparse.Option) -> None:
"""Check the dest= setting."""
# If the destination is not set, nothing to check.
# If long options are not set, then there's no implicit destination.
# If callback is used, then a destination might be needed because
# optparse cannot assume a value is always stored.
if opt.dest is None or not opt._long_opts or opt.callback:
return
long = opt._long_opts[0]
assert long.startswith("--")
# This matches optparse's behavior.
implicit_dest = long[2:].replace("-", "_")
if implicit_dest == opt.dest:
bad_opts.append((str(opt), opt.dest))
# Hook the option check list.
optparse.Option.CHECK_METHODS.insert(0, _check_dest)
try:
# Gather all the bad options up front so people can see all bad options
# instead of failing at the first one.
all_bad_opts: dict[str, list[tuple[str, str]]] = {}
for name, cls in subcmds.all_commands.items():
bad_opts = []
cmd = cls()
parser = cmd.OptionParser
for option in parser.option_list:
if option.help == optparse.SUPPRESS_HELP:
continue
# Trigger construction of parser.
_ = cmd.OptionParser
all_bad_opts[name] = bad_opts
c = option.help[0]
self.assertEqual(
c.lower(),
c,
msg=f"subcmds/{name}.py: {option.get_opt_string()}: "
f'help text should start with lowercase: "{option.help}"',
errmsg = ""
for name, bad_opts_list in sorted(all_bad_opts.items()):
if bad_opts_list:
if not errmsg:
errmsg = "Omit redundant dest= when defining options.\n"
errmsg += f"\nSubcommand {name} (subcmds/{name}.py):\n"
errmsg += "".join(
f" {opt}: dest='{dest}'\n" for opt, dest in bad_opts_list
)
if errmsg:
pytest.fail(errmsg)
finally:
# Make sure we aren't popping the wrong stuff.
assert optparse.Option.CHECK_METHODS.pop(0) is _check_dest
self.assertNotEqual(
option.help[-1],
".",
msg=f"subcmds/{name}.py: {option.get_opt_string()}: "
f'help text should not end in a period: "{option.help}"',
)
def test_cli_option_style(self):
"""Force some consistency in option flags."""
for name, cls in subcmds.all_commands.items():
cmd = cls()
parser = cmd.OptionParser
for option in parser.option_list:
for opt in option._long_opts:
self.assertNotIn(
"_",
opt,
msg=f"subcmds/{name}.py: {opt}: only use dashes in "
"options, not underscores",
)
@pytest.mark.parametrize("name, cls", subcmds.all_commands.items())
def test_common_validate_options(name: str, cls: Type[Command]) -> None:
"""Verify CommonValidateOptions sets up expected fields."""
cmd = cls()
opts, args = cmd.OptionParser.parse_args([])
# Verify the fields don't exist yet.
assert not hasattr(
opts, "verbose"
), f"{name}: has verbose before validation"
assert not hasattr(opts, "quiet"), f"{name}: has quiet before validation"
cmd.CommonValidateOptions(opts, args)
# Verify the fields exist now.
assert hasattr(opts, "verbose"), f"{name}: missing verbose after validation"
assert hasattr(opts, "quiet"), f"{name}: missing quiet after validation"
assert hasattr(
opts, "outer_manifest"
), f"{name}: missing outer_manifest after validation"
def test_attribute_error_repro() -> None:
"""Confirm that accessing verbose before CommonValidateOptions fails."""
from subcmds.sync import Sync
cmd = Sync()
opts, args = cmd.OptionParser.parse_args([])
# This confirms that without the fix in main.py, an AttributeError
# would be raised because CommonValidateOptions hasn't been called yet.
with pytest.raises(AttributeError):
_ = opts.verbose
cmd.CommonValidateOptions(opts, args)
assert hasattr(opts, "verbose")

View File


@@ -17,12 +17,12 @@
from io import StringIO
import os
from shutil import rmtree
import subprocess
import tempfile
import unittest
from unittest import mock
import git_command
import utils_for_test
import manifest_xml
import project
import subcmds
@@ -50,24 +50,6 @@ class AllCommands(unittest.TestCase):
"""Common teardown."""
rmtree(self.tempdir, ignore_errors=True)
def initTempGitTree(self, git_dir):
"""Create a new empty git checkout for testing."""
# Tests need to assume that main is the default branch at init,
# which is not supported in config until git 2.28.
cmd = ["git", "init", "-q"]
if git_command.git_require((2, 28, 0)):
cmd += ["--initial-branch=main"]
else:
# Use template dir for init
templatedir = os.path.join(self.tempdirobj.name, ".test-template")
os.makedirs(templatedir)
with open(os.path.join(templatedir, "HEAD"), "w") as fp:
fp.write("ref: refs/heads/main\n")
cmd += ["--template", templatedir]
cmd += [git_dir]
subprocess.check_call(cmd)
def getXmlManifestWith8Projects(self):
"""Create and return a setup of 8 projects with enough dummy
files and setup to execute forall."""
@@ -114,7 +96,7 @@ class AllCommands(unittest.TestCase):
)
)
git_path = os.path.join(self.tempdir, "tests/path" + str(x))
self.initTempGitTree(git_path)
utils_for_test.init_git_tree(git_path)
return manifest_xml.XmlManifest(self.repodir, self.manifest_file)

tests/test_subcmds_gc.py Normal file

@@ -0,0 +1,82 @@
# Copyright (C) 2026 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds/gc.py module."""
import unittest
from unittest import mock
from subcmds import gc
class GcCommand(unittest.TestCase):
"""Tests for gc command."""
def setUp(self):
self.cmd = gc.Gc()
self.opt, self.args = self.cmd.OptionParser.parse_args([])
self.opt.this_manifest_only = False
self.opt.repack = False
self.mock_get_projects = mock.patch.object(
self.cmd, "GetProjects"
).start()
self.mock_delete = mock.patch.object(
self.cmd, "delete_unused_projects", return_value=0
).start()
self.mock_repack = mock.patch.object(
self.cmd, "repack_projects", return_value=0
).start()
def tearDown(self):
mock.patch.stopall()
def test_gc_no_args(self):
"""Test gc without specific projects."""
self.mock_get_projects.return_value = ["all_projects"]
self.cmd.Execute(self.opt, [])
self.mock_get_projects.assert_called_once_with([], all_manifests=True)
self.mock_delete.assert_called_once_with(["all_projects"], self.opt)
self.mock_repack.assert_not_called()
def test_gc_with_args(self):
"""Test gc with specific projects uses all_projects for delete."""
self.mock_get_projects.side_effect = [["projA"], ["all_projects"]]
self.opt.repack = True
self.cmd.Execute(self.opt, ["projA"])
self.mock_get_projects.assert_has_calls(
[
mock.call(["projA"], all_manifests=True),
mock.call([], all_manifests=True),
]
)
self.mock_delete.assert_called_once_with(["all_projects"], self.opt)
self.mock_repack.assert_called_once_with(["projA"], self.opt)
def test_gc_exit_on_delete_failure(self):
"""Test gc exits if delete_unused_projects fails."""
self.mock_get_projects.return_value = ["all_projects"]
self.mock_delete.return_value = 1
self.opt.repack = True
ret = self.cmd.Execute(self.opt, [])
self.assertEqual(ret, 1)
self.mock_repack.assert_not_called()
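The gc tests above lean on the `mock.patch.object(...).start()` plus `mock.patch.stopall()` pairing: starting a patcher in `setUp` activates it immediately, and `stopall` in `tearDown` unwinds every patch started that way. A small sketch of the pattern, using a made-up `Engine` class:

```python
from unittest import mock


class Engine:
    def noise(self):
        return "vroom"


engine = Engine()
# .start() on a patcher activates it without a `with` block;
# mock.patch.stopall() later undoes every patch activated this way,
# which is why it pairs naturally with setUp()/tearDown().
mock.patch.object(engine, "noise", return_value="quiet").start()
assert engine.noise() == "quiet"
mock.patch.stopall()
assert engine.noise() == "vroom"
```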

View File


@@ -14,33 +14,36 @@
"""Unittests for the subcmds/init.py module."""
import unittest
from typing import List
import pytest
from subcmds import init
class InitCommand(unittest.TestCase):
"""Check registered all_commands."""
@pytest.mark.parametrize(
"argv",
([],),
)
def test_cli_parser_good(argv: List[str]) -> None:
"""Check valid command line options."""
cmd = init.Init()
opts, args = cmd.OptionParser.parse_args(argv)
cmd.ValidateOptions(opts, args)
def setUp(self):
self.cmd = init.Init()
def test_cli_parser_good(self):
"""Check valid command line options."""
ARGV = ([],)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
self.cmd.ValidateOptions(opts, args)
def test_cli_parser_bad(self):
"""Check invalid command line options."""
ARGV = (
# Too many arguments.
["url", "asdf"],
# Conflicting options.
["--mirror", "--archive"],
)
for argv in ARGV:
opts, args = self.cmd.OptionParser.parse_args(argv)
with self.assertRaises(SystemExit):
self.cmd.ValidateOptions(opts, args)
@pytest.mark.parametrize(
"argv",
(
# Too many arguments.
["url", "asdf"],
# Conflicting options.
["--mirror", "--archive"],
),
)
def test_cli_parser_bad(argv: List[str]) -> None:
"""Check invalid command line options."""
cmd = init.Init()
opts, args = cmd.OptionParser.parse_args(argv)
with pytest.raises(SystemExit):
cmd.ValidateOptions(opts, args)

View File


@@ -0,0 +1,156 @@
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unittests for the subcmds/manifest.py module."""
import json
from pathlib import Path
from unittest import mock
import manifest_xml
from subcmds import manifest
_EXAMPLE_MANIFEST = """\
<?xml version="1.0" encoding="UTF-8"?>
<manifest>
<remote name="test-remote" fetch="http://localhost" />
<default remote="test-remote" revision="refs/heads/main" />
<project name="repohooks" path="src/repohooks"/>
<repo-hooks in-project="repohooks" enabled-list="a, b"/>
</manifest>
"""
def _get_cmd(repodir: Path) -> manifest.Manifest:
"""Instantiate a manifest command object to test."""
manifests_git = repodir / "manifests.git"
manifests_git.mkdir()
(manifests_git / "config").write_text(
"""
[remote "origin"]
\turl = http://localhost/manifest
"""
)
client = manifest_xml.RepoClient(repodir=str(repodir))
git_event_log = mock.MagicMock(ErrorEvent=mock.Mock(return_value=None))
return manifest.Manifest(
repodir=client.repodir,
client=client,
manifest=client.manifest,
outer_client=client,
outer_manifest=client.manifest,
git_event_log=git_event_log,
)
def test_output_format_xml_file(tmp_path):
"""Test writing XML to a file."""
path = tmp_path / "manifest.xml"
path.write_text(_EXAMPLE_MANIFEST)
outpath = tmp_path / "output.xml"
cmd = _get_cmd(tmp_path)
opt, args = cmd.OptionParser.parse_args(["--output-file", str(outpath)])
cmd.Execute(opt, args)
# Normalize the output a bit as we don't exactly care.
normalize = lambda data: "\n".join(
x.strip() for x in data.splitlines() if x.strip()
)
assert (
normalize(outpath.read_text())
== """<?xml version="1.0" encoding="UTF-8"?>
<manifest>
<remote name="test-remote" fetch="http://localhost"/>
<default remote="test-remote" revision="refs/heads/main"/>
<project name="repohooks" path="src/repohooks"/>
<repo-hooks in-project="repohooks" enabled-list="a b"/>
</manifest>"""
)
def test_output_format_xml_stdout(tmp_path, capsys):
"""Test writing XML to stdout."""
path = tmp_path / "manifest.xml"
path.write_text(_EXAMPLE_MANIFEST)
cmd = _get_cmd(tmp_path)
opt, args = cmd.OptionParser.parse_args(["--format", "xml"])
cmd.Execute(opt, args)
# Normalize the output a bit as we don't exactly care.
normalize = lambda data: "\n".join(
x.strip() for x in data.splitlines() if x.strip()
)
stdout = capsys.readouterr().out
assert (
normalize(stdout)
== """<?xml version="1.0" encoding="UTF-8"?>
<manifest>
<remote name="test-remote" fetch="http://localhost"/>
<default remote="test-remote" revision="refs/heads/main"/>
<project name="repohooks" path="src/repohooks"/>
<repo-hooks in-project="repohooks" enabled-list="a b"/>
</manifest>"""
)
def test_output_format_json(tmp_path, capsys):
"""Test writing JSON."""
path = tmp_path / "manifest.xml"
path.write_text(_EXAMPLE_MANIFEST)
cmd = _get_cmd(tmp_path)
opt, args = cmd.OptionParser.parse_args(["--format", "json"])
cmd.Execute(opt, args)
obj = json.loads(capsys.readouterr().out)
assert obj == {
"default": {"remote": "test-remote", "revision": "refs/heads/main"},
"project": [{"name": "repohooks", "path": "src/repohooks"}],
"remote": [{"fetch": "http://localhost", "name": "test-remote"}],
"repo-hooks": {"enabled-list": "a b", "in-project": "repohooks"},
}
def test_output_format_json_pretty(tmp_path, capsys):
"""Test writing pretty JSON."""
path = tmp_path / "manifest.xml"
path.write_text(_EXAMPLE_MANIFEST)
cmd = _get_cmd(tmp_path)
opt, args = cmd.OptionParser.parse_args(["--format", "json", "--pretty"])
cmd.Execute(opt, args)
stdout = capsys.readouterr().out
assert (
stdout
== """\
{
"default": {
"remote": "test-remote",
"revision": "refs/heads/main"
},
"project": [
{
"name": "repohooks",
"path": "src/repohooks"
}
],
"remote": [
{
"fetch": "http://localhost",
"name": "test-remote"
}
],
"repo-hooks": {
"enabled-list": "a b",
"in-project": "repohooks"
}
}
"""
)
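The XML tests above compare output through a small `normalize` lambda so that indentation and blank lines don't matter. Written out as a plain function, it is just:

```python
def normalize(data):
    """Collapse layout so XML string comparisons only see content:
    strip each line and drop the blank ones."""
    return "\n".join(x.strip() for x in data.splitlines() if x.strip())


print(normalize("  <a>\n\n    <b/>\n</a>  "))
```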

View File


@@ -97,6 +97,35 @@ def test_cli_jobs(argv, jobs_manifest, jobs, jobs_net, jobs_check):
"""Tests --jobs option behavior."""
mp = mock.MagicMock()
mp.manifest.default.sync_j = jobs_manifest
mp.manifest.default.sync_j_max = None
cmd = sync.Sync()
opts, args = cmd.OptionParser.parse_args(argv)
cmd.ValidateOptions(opts, args)
with mock.patch.object(sync, "_rlimit_nofile", return_value=(256, 256)):
with mock.patch.object(os, "cpu_count", return_value=OS_CPU_COUNT):
cmd._ValidateOptionsWithManifest(opts, mp)
assert opts.jobs == jobs
assert opts.jobs_network == jobs_net
assert opts.jobs_checkout == jobs_check
@pytest.mark.parametrize(
"argv, jobs_manifest, jobs_manifest_max, jobs, jobs_net, jobs_check",
[
(["--jobs=10"], None, 5, 5, 5, 5),
(["--jobs=10", "--jobs-network=10"], None, 5, 5, 5, 5),
(["--jobs=10", "--jobs-checkout=10"], None, 5, 5, 5, 5),
],
)
def test_cli_jobs_sync_j_max(
argv, jobs_manifest, jobs_manifest_max, jobs, jobs_net, jobs_check
):
"""Tests --jobs option behavior with sync-j-max."""
mp = mock.MagicMock()
mp.manifest.default.sync_j = jobs_manifest
mp.manifest.default.sync_j_max = jobs_manifest_max
cmd = sync.Sync()
opts, args = cmd.OptionParser.parse_args(argv)
@@ -305,8 +334,21 @@ class LocalSyncState(unittest.TestCase):
class FakeProject:
def __init__(self, relpath):
def __init__(self, relpath, name=None, objdir=None):
self.relpath = relpath
self.name = name or relpath
self.objdir = objdir or relpath
self.worktree = relpath
self.use_git_worktrees = False
self.UseAlternates = False
self.manifest = mock.MagicMock()
self.manifest.GetProjectsWithName.return_value = [self]
self.config = mock.MagicMock()
self.EnableRepositoryExtension = mock.MagicMock()
def RelPath(self, local=None):
return self.relpath
def __str__(self):
return f"project: {self.relpath}"
@@ -513,3 +555,418 @@ class SyncCommand(unittest.TestCase):
self.cmd.Execute(self.opt, [])
self.assertIn(self.sync_local_half_error, e.aggregate_errors)
self.assertIn(self.sync_network_half_error, e.aggregate_errors)
class SyncUpdateRepoProject(unittest.TestCase):
"""Tests for Sync._UpdateRepoProject."""
def setUp(self):
"""Common setup."""
self.repodir = tempfile.mkdtemp(".repo")
self.manifest = manifest = mock.MagicMock(repodir=self.repodir)
# Create a repoProject with a mock Sync_NetworkHalf.
repoProject = mock.MagicMock(name="repo")
repoProject.Sync_NetworkHalf = mock.Mock(
return_value=SyncNetworkHalfResult(True, None)
)
manifest.repoProject = repoProject
manifest.IsArchive = False
manifest.CloneFilter = None
manifest.PartialCloneExclude = None
manifest.CloneFilterForDepth = None
git_event_log = mock.MagicMock(ErrorEvent=mock.Mock(return_value=None))
self.cmd = sync.Sync(manifest=manifest, git_event_log=git_event_log)
opt, _ = self.cmd.OptionParser.parse_args([])
opt.local_only = False
opt.repo_verify = False
opt.verbose = False
opt.quiet = True
opt.force_sync = False
opt.clone_bundle = False
opt.tags = False
opt.optimized_fetch = False
opt.retry_fetches = 0
opt.prune = False
self.opt = opt
self.errors = []
mock.patch.object(sync.Sync, "_GetCurrentBranchOnly").start()
def tearDown(self):
shutil.rmtree(self.repodir)
mock.patch.stopall()
def test_fetches_when_stale(self):
"""Test it fetches when the repo project is stale."""
self.manifest.repoProject.LastFetch = time.time() - (
sync._ONE_DAY_S + 1
)
with mock.patch.object(sync, "_PostRepoFetch") as mock_post_fetch:
self.cmd._UpdateRepoProject(self.opt, self.manifest, self.errors)
self.manifest.repoProject.Sync_NetworkHalf.assert_called_once()
mock_post_fetch.assert_called_once()
self.assertEqual(self.errors, [])
def test_skips_when_fresh(self):
"""Test it skips fetch when repo project is fresh."""
self.manifest.repoProject.LastFetch = time.time()
with mock.patch.object(sync, "_PostRepoFetch") as mock_post_fetch:
self.cmd._UpdateRepoProject(self.opt, self.manifest, self.errors)
self.manifest.repoProject.Sync_NetworkHalf.assert_not_called()
mock_post_fetch.assert_not_called()
def test_skips_local_only(self):
"""Test it does nothing with --local-only."""
self.opt.local_only = True
self.manifest.repoProject.LastFetch = time.time() - (
sync._ONE_DAY_S + 1
)
with mock.patch.object(sync, "_PostRepoFetch") as mock_post_fetch:
self.cmd._UpdateRepoProject(self.opt, self.manifest, self.errors)
self.manifest.repoProject.Sync_NetworkHalf.assert_not_called()
mock_post_fetch.assert_not_called()
def test_post_repo_fetch_skipped_on_env_var(self):
"""Test _PostRepoFetch is skipped when REPO_SKIP_SELF_UPDATE is set."""
self.manifest.repoProject.LastFetch = time.time()
with mock.patch.dict(os.environ, {"REPO_SKIP_SELF_UPDATE": "1"}):
with mock.patch.object(sync, "_PostRepoFetch") as mock_post_fetch:
self.cmd._UpdateRepoProject(
self.opt, self.manifest, self.errors
)
mock_post_fetch.assert_not_called()
def test_fetch_failure_is_handled(self):
"""Test that a fetch failure is recorded and doesn't crash."""
self.manifest.repoProject.LastFetch = time.time() - (
sync._ONE_DAY_S + 1
)
fetch_error = GitError("Fetch failed")
self.manifest.repoProject.Sync_NetworkHalf.return_value = (
SyncNetworkHalfResult(False, fetch_error)
)
with mock.patch.object(sync, "_PostRepoFetch") as mock_post_fetch:
self.cmd._UpdateRepoProject(self.opt, self.manifest, self.errors)
self.manifest.repoProject.Sync_NetworkHalf.assert_called_once()
mock_post_fetch.assert_not_called()
self.assertEqual(self.errors, [fetch_error])
class InterleavedSyncTest(unittest.TestCase):
"""Tests for interleaved sync."""
def setUp(self):
"""Set up a sync command with mocks."""
self.repodir = tempfile.mkdtemp(".repo")
self.manifest = mock.MagicMock(repodir=self.repodir)
self.manifest.repoProject.LastFetch = time.time()
self.manifest.repoProject.worktree = self.repodir
self.manifest.manifestProject.worktree = self.repodir
self.manifest.IsArchive = False
self.manifest.CloneBundle = False
self.manifest.default.sync_j = 1
self.outer_client = mock.MagicMock()
self.outer_client.manifest.IsArchive = False
self.cmd = sync.Sync(
manifest=self.manifest, outer_client=self.outer_client
)
self.cmd.outer_manifest = self.manifest
# Mock projects.
self.projA = FakeProject("projA", objdir="objA")
self.projB = FakeProject("projB", objdir="objB")
self.projA_sub = FakeProject(
"projA/sub", name="projA_sub", objdir="objA_sub"
)
self.projC = FakeProject("projC", objdir="objC")
# Mock methods that are not part of the core interleaved sync logic.
mock.patch.object(self.cmd, "_UpdateAllManifestProjects").start()
mock.patch.object(self.cmd, "_UpdateProjectsRevisionId").start()
mock.patch.object(self.cmd, "_ValidateOptionsWithManifest").start()
mock.patch.object(sync, "_PostRepoUpgrade").start()
mock.patch.object(sync, "_PostRepoFetch").start()
# Mock parallel context for worker tests.
self.parallel_context_patcher = mock.patch(
"subcmds.sync.Sync.get_parallel_context"
)
self.mock_get_parallel_context = self.parallel_context_patcher.start()
self.sync_dict = {}
self.mock_context = {
"projects": [],
"sync_dict": self.sync_dict,
}
self.mock_get_parallel_context.return_value = self.mock_context
# Mock _GetCurrentBranchOnly for worker tests.
mock.patch.object(sync.Sync, "_GetCurrentBranchOnly").start()
self.cmd._fetch_times = mock.Mock()
self.cmd._local_sync_state = mock.Mock()
def tearDown(self):
"""Clean up resources."""
shutil.rmtree(self.repodir)
mock.patch.stopall()
def test_interleaved_fail_fast(self):
"""Test that --fail-fast is respected in interleaved mode."""
opt, args = self.cmd.OptionParser.parse_args(
["--interleaved", "--fail-fast", "-j2"]
)
opt.quiet = True
# With projA/sub, _SafeCheckoutOrder creates two batches:
# 1. [projA, projB]
# 2. [projA/sub]
# We want to fail on the first batch and ensure the second isn't run.
all_projects = [self.projA, self.projB, self.projA_sub]
mock.patch.object(
self.cmd, "GetProjects", return_value=all_projects
).start()
# Mock ExecuteInParallel to simulate a failed run on the first batch of
# projects.
execute_mock = mock.patch.object(
self.cmd, "ExecuteInParallel", return_value=False
).start()
with self.assertRaises(sync.SyncFailFastError):
self.cmd._SyncInterleaved(
opt,
args,
[],
self.manifest,
self.manifest.manifestProject,
all_projects,
{},
)
execute_mock.assert_called_once()
def test_interleaved_shared_objdir_serial(self):
"""Test that projects with shared objdir are processed serially."""
opt, args = self.cmd.OptionParser.parse_args(["--interleaved", "-j4"])
opt.quiet = True
# Setup projects with a shared objdir.
self.projA.objdir = "common_objdir"
self.projC.objdir = "common_objdir"
all_projects = [self.projA, self.projB, self.projC]
mock.patch.object(
self.cmd, "GetProjects", return_value=all_projects
).start()
def execute_side_effect(jobs, target, work_items, **kwargs):
# The callback is a partial object. The first arg is the set we
# need to update to avoid the stall detection.
synced_relpaths_set = kwargs["callback"].args[0]
projects_in_pass = self.cmd.get_parallel_context()["projects"]
for item in work_items:
for project_idx in item:
synced_relpaths_set.add(
projects_in_pass[project_idx].relpath
)
return True
execute_mock = mock.patch.object(
self.cmd, "ExecuteInParallel", side_effect=execute_side_effect
).start()
self.cmd._SyncInterleaved(
opt,
args,
[],
self.manifest,
self.manifest.manifestProject,
all_projects,
{},
)
execute_mock.assert_called_once()
jobs_arg, _, work_items = execute_mock.call_args.args
self.assertEqual(jobs_arg, 2)
work_items_sets = {frozenset(item) for item in work_items}
expected_sets = {frozenset([0, 2]), frozenset([1])}
self.assertEqual(work_items_sets, expected_sets)
def _get_opts(self, args=None):
"""Helper to get default options for worker tests."""
if args is None:
args = ["--interleaved"]
opt, _ = self.cmd.OptionParser.parse_args(args)
# Set defaults for options used by the worker.
opt.quiet = True
opt.verbose = False
opt.force_sync = False
opt.clone_bundle = False
opt.tags = False
opt.optimized_fetch = False
opt.retry_fetches = 0
opt.prune = False
opt.detach_head = False
opt.force_checkout = False
opt.rebase = False
return opt
def test_worker_successful_sync(self):
"""Test _SyncProjectList with a successful fetch and checkout."""
opt = self._get_opts()
project = self.projA
project.Sync_NetworkHalf = mock.Mock(
return_value=SyncNetworkHalfResult(error=None, remote_fetched=True)
)
project.Sync_LocalHalf = mock.Mock()
project.manifest.manifestProject.config = mock.MagicMock()
self.mock_context["projects"] = [project]
with mock.patch("subcmds.sync.SyncBuffer") as mock_sync_buffer:
mock_sync_buf_instance = mock.MagicMock()
mock_sync_buf_instance.Finish.return_value = True
mock_sync_buf_instance.errors = []
mock_sync_buffer.return_value = mock_sync_buf_instance
result_obj = self.cmd._SyncProjectList(opt, [0])
self.assertEqual(len(result_obj.results), 1)
result = result_obj.results[0]
self.assertTrue(result.fetch_success)
self.assertTrue(result.checkout_success)
self.assertEqual(result.fetch_errors, [])
self.assertEqual(result.checkout_errors, [])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_called_once()
def test_worker_fetch_fails(self):
"""Test _SyncProjectList with a failed fetch."""
opt = self._get_opts()
project = self.projA
fetch_error = GitError("Fetch failed")
project.Sync_NetworkHalf = mock.Mock(
return_value=SyncNetworkHalfResult(
error=fetch_error, remote_fetched=False
)
)
project.Sync_LocalHalf = mock.Mock()
self.mock_context["projects"] = [project]
result_obj = self.cmd._SyncProjectList(opt, [0])
result = result_obj.results[0]
self.assertFalse(result.fetch_success)
self.assertFalse(result.checkout_success)
self.assertEqual(result.fetch_errors, [fetch_error])
self.assertEqual(result.checkout_errors, [])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_not_called()
def test_worker_no_worktree(self):
"""Test interleaved sync does not checkout with no worktree."""
opt = self._get_opts()
project = self.projA
project.worktree = None
project.Sync_NetworkHalf = mock.Mock(
return_value=SyncNetworkHalfResult(error=None, remote_fetched=True)
)
project.Sync_LocalHalf = mock.Mock()
self.mock_context["projects"] = [project]
result_obj = self.cmd._SyncProjectList(opt, [0])
result = result_obj.results[0]
self.assertTrue(result.fetch_success)
self.assertTrue(result.checkout_success)
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_not_called()
def test_worker_fetch_fails_exception(self):
"""Test _SyncProjectList with an exception during fetch."""
opt = self._get_opts()
project = self.projA
fetch_error = GitError("Fetch failed")
project.Sync_NetworkHalf = mock.Mock(side_effect=fetch_error)
project.Sync_LocalHalf = mock.Mock()
self.mock_context["projects"] = [project]
result_obj = self.cmd._SyncProjectList(opt, [0])
result = result_obj.results[0]
self.assertFalse(result.fetch_success)
self.assertFalse(result.checkout_success)
self.assertEqual(result.fetch_errors, [fetch_error])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_not_called()
def test_worker_checkout_fails(self):
"""Test _SyncProjectList with an exception during checkout."""
opt = self._get_opts()
project = self.projA
project.Sync_NetworkHalf = mock.Mock(
return_value=SyncNetworkHalfResult(error=None, remote_fetched=True)
)
checkout_error = GitError("Checkout failed")
project.Sync_LocalHalf = mock.Mock(side_effect=checkout_error)
project.manifest.manifestProject.config = mock.MagicMock()
self.mock_context["projects"] = [project]
with mock.patch("subcmds.sync.SyncBuffer"):
result_obj = self.cmd._SyncProjectList(opt, [0])
result = result_obj.results[0]
self.assertTrue(result.fetch_success)
self.assertFalse(result.checkout_success)
self.assertEqual(result.fetch_errors, [])
self.assertEqual(result.checkout_errors, [checkout_error])
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_called_once()
def test_worker_local_only(self):
"""Test _SyncProjectList with --local-only."""
opt = self._get_opts(["--interleaved", "--local-only"])
project = self.projA
project.Sync_NetworkHalf = mock.Mock()
project.Sync_LocalHalf = mock.Mock()
project.manifest.manifestProject.config = mock.MagicMock()
self.mock_context["projects"] = [project]
with mock.patch("subcmds.sync.SyncBuffer") as mock_sync_buffer:
mock_sync_buf_instance = mock.MagicMock()
mock_sync_buf_instance.Finish.return_value = True
mock_sync_buf_instance.errors = []
mock_sync_buffer.return_value = mock_sync_buf_instance
result_obj = self.cmd._SyncProjectList(opt, [0])
result = result_obj.results[0]
self.assertTrue(result.fetch_success)
self.assertTrue(result.checkout_success)
project.Sync_NetworkHalf.assert_not_called()
project.Sync_LocalHalf.assert_called_once()
def test_worker_network_only(self):
"""Test _SyncProjectList with --network-only."""
opt = self._get_opts(["--interleaved", "--network-only"])
project = self.projA
project.Sync_NetworkHalf = mock.Mock(
return_value=SyncNetworkHalfResult(error=None, remote_fetched=True)
)
project.Sync_LocalHalf = mock.Mock()
self.mock_context["projects"] = [project]
result_obj = self.cmd._SyncProjectList(opt, [0])
result = result_obj.results[0]
self.assertTrue(result.fetch_success)
self.assertTrue(result.checkout_success)
project.Sync_NetworkHalf.assert_called_once()
project.Sync_LocalHalf.assert_not_called()
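`test_interleaved_shared_objdir_serial` above expects projects that share an objdir to land in one work item (processed serially) while distinct objdirs stay parallelizable, yielding `{frozenset([0, 2]), frozenset([1])}` for two projects on `common_objdir`. A sketch of that grouping, assuming a simple index-by-objdir pass (not repo's actual implementation):

```python
from collections import defaultdict


def batch_by_objdir(objdirs):
    """Group project indices that share an objdir into one work item so
    they run serially; distinct objdirs can proceed in parallel."""
    groups = defaultdict(list)
    for idx, objdir in enumerate(objdirs):
        groups[objdir].append(idx)
    return {frozenset(idxs) for idxs in groups.values()}


print(batch_by_objdir(["common_objdir", "objB", "common_objdir"]))
# A set containing frozenset({0, 2}) and frozenset({1}); print order varies.
```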

tests/test_subcmds_wipe.py Normal file

@@ -0,0 +1,263 @@
# Copyright (C) 2025 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
from unittest import mock

import pytest

import project
from subcmds import wipe


def _create_mock_project(tempdir, name, objdir_path=None, has_changes=False):
    """Creates a mock project with necessary attributes and directories."""
    worktree = os.path.join(tempdir, name)
    gitdir = os.path.join(tempdir, ".repo/projects", f"{name}.git")
    if objdir_path:
        objdir = objdir_path
    else:
        objdir = os.path.join(tempdir, ".repo/project-objects", f"{name}.git")
    os.makedirs(worktree, exist_ok=True)
    os.makedirs(gitdir, exist_ok=True)
    os.makedirs(objdir, exist_ok=True)

    proj = project.Project(
        manifest=mock.MagicMock(),
        name=name,
        remote=mock.MagicMock(),
        gitdir=gitdir,
        objdir=objdir,
        worktree=worktree,
        relpath=name,
        revisionExpr="main",
        revisionId="abcd",
    )
    proj.HasChanges = mock.MagicMock(return_value=has_changes)

    def side_effect_delete_worktree(force=False, verbose=False):
        if os.path.exists(proj.worktree):
            shutil.rmtree(proj.worktree)
        if os.path.exists(proj.gitdir):
            shutil.rmtree(proj.gitdir)
        return True

    proj.DeleteWorktree = mock.MagicMock(
        side_effect=side_effect_delete_worktree
    )
    return proj


def _run_wipe(all_projects, projects_to_wipe_names, options=None):
    """Helper to run the Wipe command with mocked projects."""
    cmd = wipe.Wipe()
    cmd.manifest = mock.MagicMock()

    def get_projects_mock(projects, all_manifests=False, **kwargs):
        if projects is None:
            return all_projects
        names_to_find = set(projects)
        return [p for p in all_projects if p.name in names_to_find]

    cmd.GetProjects = mock.MagicMock(side_effect=get_projects_mock)

    if options is None:
        options = []
    opts = cmd.OptionParser.parse_args(options + projects_to_wipe_names)[0]
    cmd.CommonValidateOptions(opts, projects_to_wipe_names)
    cmd.ValidateOptions(opts, projects_to_wipe_names)
    cmd.Execute(opts, projects_to_wipe_names)
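The `_create_mock_project` helper above attaches a `MagicMock` whose `side_effect` performs real filesystem work, so the tests can assert on both call tracking and observable state. A minimal standalone sketch of that pattern (all names here are hypothetical, not part of repo's code):

```python
import os
import shutil
import tempfile
from unittest import mock

# Hypothetical demo: a mock whose side_effect really deletes a directory.
workdir = tempfile.mkdtemp(prefix="demo")
target = os.path.join(workdir, "tree")
os.makedirs(target)

def fake_delete(force=False):
    # Performs the real removal while the mock records the call.
    shutil.rmtree(target)
    return True

deleter = mock.MagicMock(side_effect=fake_delete)
assert deleter(force=True) is True  # side_effect's return value passes through
deleter.assert_called_once_with(force=True)
assert not os.path.exists(target)

shutil.rmtree(workdir)
```

Because the side effect mutates real state, assertions can check both `call_count` and the filesystem, which is exactly how the wipe tests verify `DeleteWorktree`.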
def test_wipe_single_unshared_project(tmp_path):
    """Test wiping a single project that is not shared."""
    p1 = _create_mock_project(str(tmp_path), "project/one")

    _run_wipe([p1], ["project/one"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert not os.path.exists(p1.objdir)


def test_wipe_multiple_unshared_projects(tmp_path):
    """Test wiping multiple projects that are not shared."""
    p1 = _create_mock_project(str(tmp_path), "project/one")
    p2 = _create_mock_project(str(tmp_path), "project/two")

    _run_wipe([p1, p2], ["project/one", "project/two"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert not os.path.exists(p1.objdir)
    assert not os.path.exists(p2.worktree)
    assert not os.path.exists(p2.gitdir)
    assert not os.path.exists(p2.objdir)


def test_wipe_shared_project_no_force_raises_error(tmp_path):
    """Test that wiping a shared project without --force raises an error."""
    shared_objdir = os.path.join(
        str(tmp_path), ".repo/project-objects", "shared.git"
    )
    p1 = _create_mock_project(
        str(tmp_path), "project/one", objdir_path=shared_objdir
    )
    p2 = _create_mock_project(
        str(tmp_path), "project/two", objdir_path=shared_objdir
    )

    with pytest.raises(wipe.Error) as e:
        _run_wipe([p1, p2], ["project/one"])

    assert "shared object directories" in str(e.value)
    assert "project/one" in str(e.value)
    assert "project/two" in str(e.value)
    assert os.path.exists(p1.worktree)
    assert os.path.exists(p1.gitdir)
    assert os.path.exists(p2.worktree)
    assert os.path.exists(p2.gitdir)
    assert os.path.exists(shared_objdir)


def test_wipe_shared_project_with_force(tmp_path):
    """Test wiping a shared project with --force."""
    shared_objdir = os.path.join(
        str(tmp_path), ".repo/project-objects", "shared.git"
    )
    p1 = _create_mock_project(
        str(tmp_path), "project/one", objdir_path=shared_objdir
    )
    p2 = _create_mock_project(
        str(tmp_path), "project/two", objdir_path=shared_objdir
    )

    _run_wipe([p1, p2], ["project/one"], options=["--force"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert os.path.exists(shared_objdir)
    assert os.path.exists(p2.worktree)
    assert os.path.exists(p2.gitdir)


def test_wipe_all_sharing_projects(tmp_path):
    """Test wiping all projects that share an object directory."""
    shared_objdir = os.path.join(
        str(tmp_path), ".repo/project-objects", "shared.git"
    )
    p1 = _create_mock_project(
        str(tmp_path), "project/one", objdir_path=shared_objdir
    )
    p2 = _create_mock_project(
        str(tmp_path), "project/two", objdir_path=shared_objdir
    )

    _run_wipe([p1, p2], ["project/one", "project/two"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert not os.path.exists(p2.worktree)
    assert not os.path.exists(p2.gitdir)
    assert not os.path.exists(shared_objdir)


def test_wipe_with_uncommitted_changes_raises_error(tmp_path):
    """Test wiping a project with uncommitted changes raises an error."""
    p1 = _create_mock_project(str(tmp_path), "project/one", has_changes=True)

    with pytest.raises(wipe.Error) as e:
        _run_wipe([p1], ["project/one"])

    assert "uncommitted changes" in str(e.value)
    assert "project/one" in str(e.value)
    assert os.path.exists(p1.worktree)
    assert os.path.exists(p1.gitdir)
    assert os.path.exists(p1.objdir)


def test_wipe_with_uncommitted_changes_with_force(tmp_path):
    """Test wiping a project with uncommitted changes with --force."""
    p1 = _create_mock_project(str(tmp_path), "project/one", has_changes=True)

    _run_wipe([p1], ["project/one"], options=["--force"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert not os.path.exists(p1.objdir)


def test_wipe_uncommitted_and_shared_raises_combined_error(tmp_path):
    """Test that uncommitted and shared projects raise a combined error."""
    shared_objdir = os.path.join(
        str(tmp_path), ".repo/project-objects", "shared.git"
    )
    p1 = _create_mock_project(
        str(tmp_path),
        "project/one",
        objdir_path=shared_objdir,
        has_changes=True,
    )
    p2 = _create_mock_project(
        str(tmp_path), "project/two", objdir_path=shared_objdir
    )

    with pytest.raises(wipe.Error) as e:
        _run_wipe([p1, p2], ["project/one"])

    assert "uncommitted changes" in str(e.value)
    assert "shared object directories" in str(e.value)
    assert "project/one" in str(e.value)
    assert "project/two" in str(e.value)
    assert os.path.exists(p1.worktree)
    assert os.path.exists(p1.gitdir)
    assert os.path.exists(p2.worktree)
    assert os.path.exists(p2.gitdir)
    assert os.path.exists(shared_objdir)


def test_wipe_shared_project_with_force_shared(tmp_path):
    """Test wiping a shared project with --force-shared."""
    shared_objdir = os.path.join(
        str(tmp_path), ".repo/project-objects", "shared.git"
    )
    p1 = _create_mock_project(
        str(tmp_path), "project/one", objdir_path=shared_objdir
    )
    p2 = _create_mock_project(
        str(tmp_path), "project/two", objdir_path=shared_objdir
    )

    _run_wipe([p1, p2], ["project/one"], options=["--force-shared"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert os.path.exists(shared_objdir)
    assert os.path.exists(p2.worktree)
    assert os.path.exists(p2.gitdir)


def test_wipe_with_uncommitted_changes_with_force_uncommitted(tmp_path):
    """Test wiping uncommitted changes with --force-uncommitted."""
    p1 = _create_mock_project(str(tmp_path), "project/one", has_changes=True)

    _run_wipe([p1], ["project/one"], options=["--force-uncommitted"])

    assert not os.path.exists(p1.worktree)
    assert not os.path.exists(p1.gitdir)
    assert not os.path.exists(p1.objdir)


@@ -1,4 +1,4 @@
# Copyright 2022 The Android Open Source Project
# Copyright (C) 2022 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -14,15 +14,10 @@
"""Unittests for the update_manpages module."""
import unittest
from release import update_manpages
class UpdateManpagesTest(unittest.TestCase):
"""Tests the update-manpages code."""
def test_replace_regex(self):
"""Check that replace_regex works."""
data = "\n\033[1mSummary\033[m\n"
self.assertEqual(update_manpages.replace_regex(data), "\nSummary\n")
def test_replace_regex() -> None:
"""Check that replace_regex works."""
data = "\n\033[1mSummary\033[m\n"
assert update_manpages.replace_regex(data) == "\nSummary\n"
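The hunk above converts a `unittest.TestCase` method into a bare pytest-style function. The general shape of that migration, sketched with a trivial hypothetical test:

```python
import unittest

# Before: unittest style, assertions via TestCase methods.
class MathTest(unittest.TestCase):
    def test_add(self):
        self.assertEqual(1 + 2, 3)

# After: pytest style, a plain function with a bare assert.
def test_add() -> None:
    assert 1 + 2 == 3
```

pytest collects both forms, so a suite can migrate file by file without breaking the test run.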


@@ -23,7 +23,8 @@ import tempfile
import unittest
from unittest import mock
import git_command
import utils_for_test
import main
import wrapper
@@ -408,18 +409,7 @@ class GitCheckoutTestCase(RepoWrapperTestCase):
        remote = os.path.join(cls.GIT_DIR, "remote")
        os.mkdir(remote)

        # Tests need to assume, that main is default branch at init,
        # which is not supported in config until 2.28.
        if git_command.git_require((2, 28, 0)):
            initstr = "--initial-branch=main"
        else:
            # Use template dir for init.
            templatedir = tempfile.mkdtemp(prefix=".test-template")
            with open(os.path.join(templatedir, "HEAD"), "w") as fp:
                fp.write("ref: refs/heads/main\n")
            initstr = "--template=" + templatedir

        run_git("init", initstr, cwd=remote)
        utils_for_test.init_git_tree(remote)
        run_git("commit", "--allow-empty", "-minit", cwd=remote)
        run_git("branch", "stable", cwd=remote)
        run_git("tag", "v1.0", cwd=remote)

tests/utils_for_test.py

@@ -0,0 +1,99 @@
# Copyright (C) 2026 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Various utility code used by tests.
If you want to write a per-test fixture, see conftest.py instead.
"""
import contextlib
import functools
from pathlib import Path
import subprocess
import tempfile
from typing import Optional, Union
import git_command
def init_git_tree(
path: Union[str, Path],
ref_format: Optional[str] = None,
) -> None:
"""Initialize `path` as a new git repo."""
with contextlib.ExitStack() as stack:
# Tests need to assume, that main is default branch at init,
# which is not supported in config until 2.28.
cmd = ["git"]
if ref_format:
cmd += ["-c", f"init.defaultRefFormat={ref_format}"]
cmd += ["init"]
if git_command.git_require((2, 28, 0)):
cmd += ["--initial-branch=main"]
else:
# Use template dir for init.
templatedir = stack.enter_context(
tempfile.mkdtemp(prefix="git-template")
)
(Path(templatedir) / "HEAD").write_text("ref: refs/heads/main\n")
cmd += ["--template", templatedir]
cmd += [path]
subprocess.run(cmd, check=True)
@contextlib.contextmanager
def TempGitTree():
"""Create a new empty git checkout for testing."""
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
init_git_tree(tempdir)
yield tempdir
@functools.lru_cache(maxsize=None)
def supports_reftable() -> bool:
"""Check if git supports reftable."""
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
proc = subprocess.run(
["git", "-c", "init.defaultRefFormat=reftable", "init"],
cwd=tempdir,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=False,
)
return proc.returncode == 0
@functools.lru_cache(maxsize=None)
def supports_refs_migrate() -> bool:
"""Check if git supports refs migrate."""
with tempfile.TemporaryDirectory(prefix="repo-tests") as tempdir:
subprocess.check_call(
["git", "-c", "init.defaultRefFormat=files", "init"],
cwd=tempdir,
)
proc = subprocess.run(
[
"git",
"refs",
"migrate",
"--ref-format=reftable",
"--dry-run",
],
cwd=tempdir,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=False,
)
return proc.returncode == 0
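`init_git_tree` branches on `git_command.git_require((2, 28, 0))`. The version-gate half of that check boils down to parsing `git version` output and comparing tuples; a self-contained sketch (hypothetical helper names, not repo's actual `git_command` API):

```python
import re
from typing import Tuple

def parse_git_version(output: str) -> Tuple[int, ...]:
    """Extract a comparable (major, minor, micro) tuple from `git version` output."""
    m = re.search(r"(\d+)\.(\d+)(?:\.(\d+))?", output)
    if not m:
        raise ValueError(f"unrecognized version output: {output!r}")
    # A missing micro component defaults to 0 so tuples stay comparable.
    return tuple(int(part) for part in m.groups(default="0"))

def version_at_least(output: str, needed: Tuple[int, ...]) -> bool:
    """True if the reported version meets `needed`; tuples compare lexicographically."""
    return parse_git_version(output) >= needed

# e.g. version_at_least("git version 2.28.0", (2, 28, 0)) is True
```

Tuple comparison handles the multi-component ordering for free, which is why version requirements are conventionally passed as tuples like `(2, 28, 0)`.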

tox.ini

@@ -1,63 +0,0 @@
# Copyright 2019 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# https://tox.readthedocs.io/
[tox]
envlist = lint, py36, py37, py38, py39, py310, py311, py312
requires = virtualenv<20.22.0

[gh-actions]
python =
    3.6: py36
    3.7: py37
    3.8: py38
    3.9: py39
    3.10: py310
    3.11: py311
    3.12: py312

[testenv]
deps =
    -c constraints.txt
    black
    flake8
    isort
    pytest
    pytest-timeout
commands = {envpython} run_tests {posargs}
setenv =
    GIT_AUTHOR_NAME = Repo test author
    GIT_COMMITTER_NAME = Repo test committer
    EMAIL = repo@gerrit.nodomain

[testenv:lint]
skip_install = true
deps =
    -c constraints.txt
    black
    flake8
commands =
    black --check {posargs:. repo run_tests release/update-hooks release/update-manpages}
    flake8

[testenv:format]
skip_install = true
deps =
    -c constraints.txt
    black
    flake8
commands =
    black {posargs:. repo run_tests release/update-hooks release/update-manpages}
    flake8