Merge branch 'ps/contrib-sweep'

Remove a bunch of stuff from the contrib/ hierarchy.

* ps/contrib-sweep:
  contrib: remove some scripts in "stats" directory
  contrib: remove "git-new-workdir"
  contrib: remove "emacs" directory
  contrib: remove "git-resurrect.sh"
  contrib: remove "persistent-https" remote helper
  contrib: remove "mw-to-git"
  contrib: remove "hooks" directory
  contrib: remove "thunderbird-patch-inline"
  contrib: remove remote-helper stubs
  contrib: remove "examples" directory
  contrib: remove "remotes2config.sh"
This commit is contained in:
Junio C Hamano
2025-07-02 12:08:05 -07:00
52 changed files with 0 additions and 6900 deletions

View File

@@ -1,33 +0,0 @@
This directory used to contain various modules for Emacs support.

These were added shortly after Git was first released. Since then
Emacs's own support for Git got better than what was offered by these
modes. There are also popular 3rd-party Git modes such as Magit which
offer replacements for these.

The following modules were available, and can be dug up from the Git
history:

* git.el:

  Wrapper for "git status" that provided access to other git commands.
  Modern alternatives to this include Magit, and VC mode that ships
  with Emacs.

* git-blame.el:

  A wrapper for "git blame" written before Emacs's own vc-annotate
  mode learned to invoke git-blame, which can be done via C-x v g.

* vc-git.el:

  This file used to contain the VC-mode backend for git, but it is no
  longer distributed with git. It is now maintained as part of Emacs
  and included in standard Emacs distributions starting from version
  22.2.

  If you have an earlier Emacs version, upgrading to Emacs 22 is
  recommended, since the VC mode in older Emacs is not generic enough
  to be able to support git in a reasonable manner, and no attempt has
  been made to backport vc-git.el.

View File

@@ -1,6 +0,0 @@
(error "git-blame.el no longer ships with git. It's recommended
to replace its use with Emacs's own vc-annotate. See
contrib/emacs/README in git's
sources (https://github.com/git/git/blob/master/contrib/emacs/README)
for more info on suggested alternatives and for why this
happened.")

View File

@@ -1,6 +0,0 @@
(error "git.el no longer ships with git. It's recommended to
replace its use with Magit, or simply delete references to git.el
in your initialization file(s). See contrib/emacs/README in git's
sources (https://github.com/git/git/blob/master/contrib/emacs/README)
for suggested alternatives and for why this happened. Emacs's own
VC mode is another viable alternative.")

View File

@@ -1,20 +0,0 @@
This directory used to contain scripted implementations of builtins
that have since been rewritten in C.

They have now been removed, but can be retrieved from an older commit
that removed them from this directory.

They're interesting for their reference value to any aspiring plumbing
users who want to learn how pieces can be fit together, but in many
cases have drifted far enough from the actual implementations Git uses
that they should no longer be treated as authoritative.

Other things that can be useful:

* Some commands such as git-gc wrap other commands, and what they're
  doing behind the scenes can be seen by running them under
  GIT_TRACE=1.

* Doing `git log` on paths matching '*--helper.c' will show
  incremental effort in the direction of moving existing shell
  scripts to C.
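Both tips can be exercised directly. The sketch below is a minimal, hypothetical demonstration (throwaway repository, temporary paths) of the GIT_TRACE technique:

```shell
# Create a throwaway repository so the trace has something to act on.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m "initial commit"

# GIT_TRACE=1 makes git print (to stderr) each sub-command that a
# wrapper such as "git gc" runs behind the scenes.
GIT_TRACE=1 git gc 2>&1 | head -n 5
```

The same environment variable works for any git command; the second tip (`git log -- '*--helper.c'`) only yields output inside a clone of git.git itself.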

View File

@@ -1,181 +0,0 @@
#!/bin/sh
USAGE="[-a] [-r] [-m] [-t] [-n] [-b <newname>] <name>"
LONG_USAGE="git-resurrect attempts to find traces of a branch tip
called <name>, and tries to resurrect it. Currently, the reflog is
searched for checkout messages, and with -r also merge messages. With
-m and -t, the history of all refs is scanned for Merge <name> into
other/Merge <other> into <name> (respectively) commit subjects, which
is rather slow but allows you to resurrect other people's topic
branches."
OPTIONS_KEEPDASHDASH=
OPTIONS_STUCKLONG=
OPTIONS_SPEC="\
git resurrect $USAGE
--
b,branch= save branch as <newname> instead of <name>
a,all same as -l -r -m -t
k,keep-going full rev-list scan (instead of first match)
l,reflog scan reflog for checkouts (enabled by default)
r,reflog-merges scan for merges recorded in reflog
m,merges scan for merges into other branches (slow)
t,merge-targets scan for merges of other branches into <name>
n,dry-run don't recreate the branch"
. git-sh-setup
search_reflog () {
	sed -ne 's~^\([^ ]*\) .* checkout: moving from '"$1"' .*~\1~p' \
		< "$GIT_DIR"/logs/HEAD
}

search_reflog_merges () {
	git rev-parse $(
		sed -ne 's~^[^ ]* \([^ ]*\) .* merge '"$1"':.*~\1^2~p' \
			< "$GIT_DIR"/logs/HEAD
	)
}

oid_pattern=$(git hash-object --stdin </dev/null | sed -e 's/./[0-9a-f]/g')

search_merges () {
	git rev-list --all --grep="Merge branch '$1'" \
		--pretty=tformat:"%P %s" |
	sed -ne "/^$oid_pattern \($oid_pattern\) Merge .*/ {s//\1/p;$early_exit}"
}

search_merge_targets () {
	git rev-list --all --grep="Merge branch '[^']*' into $branch\$" \
		--pretty=tformat:"%H %s" |
	sed -ne "/^\($oid_pattern\) Merge .*/ {s//\1/p;$early_exit} "
}
dry_run=
early_exit=q
scan_reflog=t
scan_reflog_merges=
scan_merges=
scan_merge_targets=
new_name=
while test "$#" != 0; do
	case "$1" in
	-b|--branch)
		shift
		new_name="$1"
		;;
	-n|--dry-run)
		dry_run=t
		;;
	--no-dry-run)
		dry_run=
		;;
	-k|--keep-going)
		early_exit=
		;;
	--no-keep-going)
		early_exit=q
		;;
	-m|--merges)
		scan_merges=t
		;;
	--no-merges)
		scan_merges=
		;;
	-l|--reflog)
		scan_reflog=t
		;;
	--no-reflog)
		scan_reflog=
		;;
	-r|--reflog-merges)
		scan_reflog_merges=t
		;;
	--no-reflog-merges)
		scan_reflog_merges=
		;;
	-t|--merge-targets)
		scan_merge_targets=t
		;;
	--no-merge-targets)
		scan_merge_targets=
		;;
	-a|--all)
		scan_reflog=t
		scan_reflog_merges=t
		scan_merges=t
		scan_merge_targets=t
		;;
	--)
		shift
		break
		;;
	*)
		usage
		;;
	esac
	shift
done
test "$#" = 1 || usage

all_strategies="$scan_reflog$scan_reflog_merges$scan_merges$scan_merge_targets"
if test -z "$all_strategies"; then
	die "must enable at least one of -lrmt"
fi

branch="$1"
test -z "$new_name" && new_name="$branch"

if test ! -z "$scan_reflog"; then
	if test -r "$GIT_DIR"/logs/HEAD; then
		candidates="$(search_reflog $branch)"
	else
		die 'reflog scanning requested, but' \
			'$GIT_DIR/logs/HEAD not readable'
	fi
fi
if test ! -z "$scan_reflog_merges"; then
	if test -r "$GIT_DIR"/logs/HEAD; then
		candidates="$candidates $(search_reflog_merges $branch)"
	else
		die 'reflog scanning requested, but' \
			'$GIT_DIR/logs/HEAD not readable'
	fi
fi
if test ! -z "$scan_merges"; then
	candidates="$candidates $(search_merges $branch)"
fi
if test ! -z "$scan_merge_targets"; then
	candidates="$candidates $(search_merge_targets $branch)"
fi

candidates="$(git rev-parse $candidates | sort -u)"

if test -z "$candidates"; then
	hint=
	test "z$all_strategies" != "ztttt" \
		&& hint=" (maybe try again with -a)"
	die "no candidates for $branch found$hint"
fi

echo "** Candidates for $branch **"
for cmt in $candidates; do
	git --no-pager log --pretty=tformat:"%ct:%h [%cr] %s" --abbrev-commit -1 $cmt
done \
| sort -n | cut -d: -f2-

newest="$(git rev-list -1 $candidates)"
if test ! -z "$dry_run"; then
	printf "** Most recent: "
	git --no-pager log -1 --pretty=tformat:"%h %s" $newest
elif ! git rev-parse --verify --quiet $new_name >/dev/null; then
	printf "** Restoring $new_name to "
	git --no-pager log -1 --pretty=tformat:"%h %s" $newest
	git branch $new_name $newest
else
	printf "Most recent: "
	git --no-pager log -1 --pretty=tformat:"%h %s" $newest
	echo "** $new_name already exists, doing nothing"
fi

View File

@@ -1,7 +0,0 @@
git-multimail is developed as an independent project at the following
website:
https://github.com/git-multimail/git-multimail
Please refer to that project page for information about how to report
bugs or contribute to git-multimail.

View File

@@ -1,759 +0,0 @@
#!/bin/sh
#
# Copyright (c) 2007 Andy Parkins
#
# An example hook script to mail out commit update information.
#
# NOTE: This script is no longer under active development. There
# is another script, git-multimail, which is more capable and
# configurable and is largely backwards-compatible with this script;
# please see "contrib/hooks/multimail/". For instructions on how to
# migrate from post-receive-email to git-multimail, please see
# "README.migrate-from-post-receive-email" in that directory.
#
# This hook sends emails listing new revisions to the repository
# introduced by the change being reported. The rule is that (for
# branch updates) each commit will appear on one email and one email
# only.
#
# This hook is stored in the contrib/hooks directory. Your distribution
# will have put this somewhere standard. You should make this script
# executable then link to it in the repository you would like to use it in.
# For example, on debian the hook is stored in
# /usr/share/git-core/contrib/hooks/post-receive-email:
#
# cd /path/to/your/repository.git
# ln -sf /usr/share/git-core/contrib/hooks/post-receive-email hooks/post-receive
#
# This hook script assumes it is enabled on the central repository of a
# project, with all users pushing only to it and not between each other. It
# will still work if you don't operate in that style, but it would become
# possible for the email to be from someone other than the person doing the
# push.
#
# To help with debugging and use on pre-v1.5.1 git servers, this script will
# also obey the interface of hooks/update, taking its arguments on the
# command line. Unfortunately, hooks/update is called once for each ref.
# To avoid firing one email per ref, this script just prints its output to
# the screen when used in this mode. The output can then be redirected if
# wanted.
#
# Config
# ------
# hooks.mailinglist
# This is the list that all pushes will go to; leave it blank to not send
# emails for every ref update.
# hooks.announcelist
# This is the list that all pushes of annotated tags will go to. Leave it
# blank to default to the mailinglist field. The announce emails lists
# the short log summary of the changes since the last annotated tag.
# hooks.envelopesender
# If set then the -f option is passed to sendmail to allow the envelope
# sender address to be set
# hooks.emailprefix
# All emails have their subjects prefixed with this prefix, or "[SCM]"
# if emailprefix is unset, to aid filtering
# hooks.showrev
# The shell command used to format each revision in the email, with
# "%s" replaced with the commit id. Defaults to "git rev-list -1
# --pretty %s", displaying the commit id, author, date and log
# message. To list full patches separated by a blank line, you
# could set this to "git show -C %s; echo".
# To list a gitweb/cgit URL *and* a full patch for each change set, use this:
# "t=%s; printf 'http://.../?id=%%s' \$t; echo;echo; git show -C \$t; echo"
# Be careful if "..." contains things that will be expanded by shell "eval"
# or printf.
# hooks.emailmaxlines
# The maximum number of lines that should be included in the generated
# email body. If not specified, there is no limit.
# Lines beyond the limit are suppressed and counted, and a final
# line is added indicating the number of suppressed lines.
# hooks.diffopts
# Alternate options for the git diff-tree invocation that shows changes.
# Default is "--stat --summary --find-copies-harder". Add -p to those
# options to include a unified diff of changes in addition to the usual
# summary output.
#
# Notes
# -----
# All emails include the headers "X-Git-Refname", "X-Git-Oldrev",
# "X-Git-Newrev", and "X-Git-Reftype" to enable fine tuned filtering and
# give information for debugging.
#
# ---------------------------- Functions
#
# Function to prepare for email generation. This decides what type
# of update this is and whether an email should even be generated.
#
prep_for_email()
{
	# --- Arguments
	oldrev=$(git rev-parse $1)
	newrev=$(git rev-parse $2)
	refname="$3"

	# --- Interpret
	# 0000->1234 (create)
	# 1234->2345 (update)
	# 2345->0000 (delete)
	if expr "$oldrev" : '0*$' >/dev/null
	then
		change_type="create"
	else
		if expr "$newrev" : '0*$' >/dev/null
		then
			change_type="delete"
		else
			change_type="update"
		fi
	fi

	# --- Get the revision types
	newrev_type=$(git cat-file -t $newrev 2> /dev/null)
	oldrev_type=$(git cat-file -t "$oldrev" 2> /dev/null)
	case "$change_type" in
	create|update)
		rev="$newrev"
		rev_type="$newrev_type"
		;;
	delete)
		rev="$oldrev"
		rev_type="$oldrev_type"
		;;
	esac

	# The revision type tells us what type the commit is, combined with
	# the location of the ref we can decide between
	#  - working branch
	#  - tracking branch
	#  - unannotated tag
	#  - annotated tag
	case "$refname","$rev_type" in
	refs/tags/*,commit)
		# un-annotated tag
		refname_type="tag"
		short_refname=${refname##refs/tags/}
		;;
	refs/tags/*,tag)
		# annotated tag
		refname_type="annotated tag"
		short_refname=${refname##refs/tags/}
		# change recipients
		if [ -n "$announcerecipients" ]; then
			recipients="$announcerecipients"
		fi
		;;
	refs/heads/*,commit)
		# branch
		refname_type="branch"
		short_refname=${refname##refs/heads/}
		;;
	refs/remotes/*,commit)
		# tracking branch
		refname_type="tracking branch"
		short_refname=${refname##refs/remotes/}
		echo >&2 "*** Push-update of tracking branch, $refname"
		echo >&2 "*** - no email generated."
		return 1
		;;
	*)
		# Anything else (is there anything else?)
		echo >&2 "*** Unknown type of update to $refname ($rev_type)"
		echo >&2 "*** - no email generated"
		return 1
		;;
	esac

	# Check if we've got anyone to send to
	if [ -z "$recipients" ]; then
		case "$refname_type" in
		"annotated tag")
			config_name="hooks.announcelist"
			;;
		*)
			config_name="hooks.mailinglist"
			;;
		esac
		echo >&2 "*** $config_name is not set so no email will be sent"
		echo >&2 "*** for $refname update $oldrev->$newrev"
		return 1
	fi

	return 0
}
#
# Top level email generation function. This calls the appropriate
# body-generation routine after outputting the common header.
#
# Note this function doesn't actually generate any email output, that is
# taken care of by the functions it calls:
# - generate_email_header
# - generate_create_XXXX_email
# - generate_update_XXXX_email
# - generate_delete_XXXX_email
# - generate_email_footer
#
# Note also that this function cannot 'exit' from the script; when this
# function is running (in hook script mode), the send_mail() function
# is already executing in another process, connected via a pipe, and
# if this function exits early, whatever has been generated up to that
# point will be sent as an email... even if nothing has been generated.
#
generate_email()
{
	# Email parameters
	# The email subject will contain the best description of the ref
	# that we can build from the parameters
	describe=$(git describe $rev 2>/dev/null)
	if [ -z "$describe" ]; then
		describe=$rev
	fi

	generate_email_header

	# Call the correct body generation function
	fn_name=general
	case "$refname_type" in
	"tracking branch"|branch)
		fn_name=branch
		;;
	"annotated tag")
		fn_name=atag
		;;
	esac

	if [ -z "$maxlines" ]; then
		generate_${change_type}_${fn_name}_email
	else
		generate_${change_type}_${fn_name}_email | limit_lines $maxlines
	fi

	generate_email_footer
}
generate_email_header()
{
	# --- Email (all stdout will be the email)
	# Generate header
	cat <<-EOF
	To: $recipients
	Subject: ${emailprefix}$projectdesc $refname_type $short_refname ${change_type}d. $describe
	MIME-Version: 1.0
	Content-Type: text/plain; charset=utf-8
	Content-Transfer-Encoding: 8bit
	X-Git-Refname: $refname
	X-Git-Reftype: $refname_type
	X-Git-Oldrev: $oldrev
	X-Git-Newrev: $newrev
	Auto-Submitted: auto-generated

	This is an automated email from the git hooks/post-receive script. It was
	generated because a ref change was pushed to the repository containing
	the project "$projectdesc".

	The $refname_type, $short_refname has been ${change_type}d
	EOF
}
generate_email_footer()
{
	SPACE=" "
	cat <<-EOF

	hooks/post-receive
	--${SPACE}
	$projectdesc
	EOF
}
# --------------- Branches
#
# Called for the creation of a branch
#
generate_create_branch_email()
{
	# This is a new branch and so oldrev is not valid
	echo " at $newrev ($newrev_type)"
	echo ""
	echo $LOGBEGIN
	show_new_revisions
	echo $LOGEND
}
#
# Called for the change of a pre-existing branch
#
generate_update_branch_email()
{
	# Consider this:
	#   1 --- 2 --- O --- X --- 3 --- 4 --- N
	#
	# O is $oldrev for $refname
	# N is $newrev for $refname
	# X is a revision pointed to by some other ref, for which we may
	# assume that an email has already been generated.
	# In this case we want to issue an email containing only revisions
	# 3, 4, and N. Given (almost) by
	#
	#  git rev-list N ^O --not --all
	#
	# The reason for the "almost", is that the "--not --all" will take
	# precedence over the "N", and effectively will translate to
	#
	#  git rev-list N ^O ^X ^N
	#
	# So, we need to build up the list more carefully. git rev-parse
	# will generate a list of revs that may be fed into git rev-list.
	# We can get it to make the "--not --all" part and then filter out
	# the "^N" with:
	#
	#  git rev-parse --not --all | grep -v N
	#
	# Then, using the --stdin switch to git rev-list we have effectively
	# manufactured
	#
	#  git rev-list N ^O ^X
	#
	# This leaves a problem when someone else updates the repository
	# while this script is running. Their new value of the ref we're
	# working on would be included in the "--not --all" output; and as
	# our $newrev would be an ancestor of that commit, it would exclude
	# all of our commits. What we really want is to exclude the current
	# value of $refname from the --not list, rather than N itself. So:
	#
	#  git rev-parse --not --all | grep -v $(git rev-parse $refname)
	#
	# Gets us to something pretty safe (apart from the small time
	# between refname being read, and git rev-parse running - for that,
	# I give up)
	#
	#
	# Next problem, consider this:
	#   * --- B --- * --- O ($oldrev)
	#          \
	#           * --- X --- * --- N ($newrev)
	#
	# That is to say, there is no guarantee that oldrev is a strict
	# subset of newrev (it would have required a --force, but that's
	# allowed). So, we can't simply say rev-list $oldrev..$newrev.
	# Instead we find the common base of the two revs and list from
	# there.
	#
	# As above, we need to take into account the presence of X; if
	# another branch is already in the repository and points at some of
	# the revisions that we are about to output - we don't want them.
	# The solution is as before: git rev-parse output filtered.
	#
	# Finally, tags: 1 --- 2 --- O --- T --- 3 --- 4 --- N
	#
	# Tags pushed into the repository generate nice shortlog emails that
	# summarise the commits between them and the previous tag. However,
	# those emails don't include the full commit messages that we output
	# for a branch update. Therefore we still want to output revisions
	# that have been output on a tag email.
	#
	# Luckily, git rev-parse includes just the tool. Instead of using
	# "--all" we use "--branches"; this has the added benefit that
	# "remotes/" will be ignored as well.

	# List all of the revisions that were removed by this update. In a
	# fast-forward update, this list will be empty, because rev-list O
	# ^N is empty. For a non-fast-forward, O ^N is the list of removed
	# revisions
	fast_forward=""
	rev=""
	for rev in $(git rev-list $newrev..$oldrev)
	do
		revtype=$(git cat-file -t "$rev")
		echo " discards $rev ($revtype)"
	done
	if [ -z "$rev" ]; then
		fast_forward=1
	fi

	# List all the revisions from baserev to newrev in a kind of
	# "table-of-contents"; note this list can include revisions that
	# have already had notification emails and is present to show the
	# full detail of the change from rolling back the old revision to
	# the base revision and then forward to the new revision
	for rev in $(git rev-list $oldrev..$newrev)
	do
		revtype=$(git cat-file -t "$rev")
		echo " via $rev ($revtype)"
	done

	if [ "$fast_forward" ]; then
		echo " from $oldrev ($oldrev_type)"
	else
		#  1. Existing revisions were removed. In this case newrev
		#     is a subset of oldrev - this is the reverse of a
		#     fast-forward, a rewind
		#  2. New revisions were added on top of an old revision,
		#     this is a rewind and addition.

		# (1) certainly happened, (2) possibly. When (2) hasn't
		# happened, we set a flag to indicate that no log printout
		# is required.

		echo ""

		# Find the common ancestor of the old and new revisions and
		# compare it with newrev
		baserev=$(git merge-base $oldrev $newrev)
		rewind_only=""
		if [ "$baserev" = "$newrev" ]; then
			echo "This update discarded existing revisions and left the branch pointing at"
			echo "a previous point in the repository history."
			echo ""
			echo " * -- * -- N ($newrev)"
			echo "            \\"
			echo "             O -- O -- O ($oldrev)"
			echo ""
			echo "The removed revisions are not necessarily gone - if another reference"
			echo "still refers to them they will stay in the repository."
			rewind_only=1
		else
			echo "This update added new revisions after undoing existing revisions. That is"
			echo "to say, the old revision is not a strict subset of the new revision. This"
			echo "situation occurs when you --force push a change and generate a repository"
			echo "containing something like this:"
			echo ""
			echo " * -- * -- B -- O -- O -- O ($oldrev)"
			echo "            \\"
			echo "             N -- N -- N ($newrev)"
			echo ""
			echo "When this happens we assume that you've already had alert emails for all"
			echo "of the O revisions, and so we here report only the revisions in the N"
			echo "branch from the common base, B."
		fi
	fi

	echo ""
	if [ -z "$rewind_only" ]; then
		echo "Those revisions listed above that are new to this repository have"
		echo "not appeared on any other notification email; so we list those"
		echo "revisions in full, below."
		echo ""
		echo $LOGBEGIN
		show_new_revisions

		# XXX: Need a way of detecting whether git rev-list actually
		# outputted anything, so that we can issue a "no new
		# revisions added by this update" message

		echo $LOGEND
	else
		echo "No new revisions were added by this update."
	fi

	# The diffstat is shown from the old revision to the new revision.
	# This is to show the truth of what happened in this change.
	# There's no point showing the stat from the base to the new
	# revision because the base is effectively a random revision at this
	# point - the user will be interested in what this revision changed
	# - including the undoing of previous revisions in the case of
	# non-fast-forward updates.
	echo ""
	echo "Summary of changes:"
	git diff-tree $diffopts $oldrev..$newrev
}
#
# Called for the deletion of a branch
#
generate_delete_branch_email()
{
	echo " was $oldrev"
	echo ""
	echo $LOGBEGIN
	git diff-tree -s --always --encoding=UTF-8 --pretty=oneline $oldrev
	echo $LOGEND
}
# --------------- Annotated tags
#
# Called for the creation of an annotated tag
#
generate_create_atag_email()
{
	echo " at $newrev ($newrev_type)"
	generate_atag_email
}
#
# Called for the update of an annotated tag (this is probably a rare event
# and may not even be allowed)
#
generate_update_atag_email()
{
	echo " to $newrev ($newrev_type)"
	echo " from $oldrev (which is now obsolete)"
	generate_atag_email
}
#
# Called when an annotated tag is created or changed
#
generate_atag_email()
{
	# Use git for-each-ref to pull out the individual fields from the
	# tag
	eval $(git for-each-ref --shell --format='
		tagobject=%(*objectname)
		tagtype=%(*objecttype)
		tagger=%(taggername)
		tagged=%(taggerdate)' $refname
	)

	echo " tagging $tagobject ($tagtype)"
	case "$tagtype" in
	commit)
		# If the tagged object is a commit, then we assume this is a
		# release, and so we calculate which tag this tag is
		# replacing
		prevtag=$(git describe --abbrev=0 $newrev^ 2>/dev/null)
		if [ -n "$prevtag" ]; then
			echo " replaces $prevtag"
		fi
		;;
	*)
		echo " length $(git cat-file -s $tagobject) bytes"
		;;
	esac
	echo " tagged by $tagger"
	echo " on $tagged"
	echo ""
	echo $LOGBEGIN

	# Show the content of the tag message; this might contain a change
	# log or release notes so is worth displaying.
	git cat-file tag $newrev | sed -e '1,/^$/d'

	echo ""
	case "$tagtype" in
	commit)
		# Only commit tags make sense to have rev-list operations
		# performed on them
		if [ -n "$prevtag" ]; then
			# Show changes since the previous release
			git shortlog "$prevtag..$newrev"
		else
			# No previous tag, show all the changes since time
			# began
			git shortlog $newrev
		fi
		;;
	*)
		# XXX: Is there anything useful we can do for non-commit
		# objects?
		;;
	esac
	echo $LOGEND
}
#
# Called for the deletion of an annotated tag
#
generate_delete_atag_email()
{
	echo " was $oldrev"
	echo ""
	echo $LOGBEGIN
	git diff-tree -s --always --encoding=UTF-8 --pretty=oneline $oldrev
	echo $LOGEND
}
# --------------- General references
#
# Called when any other type of reference is created (most likely a
# non-annotated tag)
#
generate_create_general_email()
{
	echo " at $newrev ($newrev_type)"
	generate_general_email
}
#
# Called when any other type of reference is updated (most likely a
# non-annotated tag)
#
generate_update_general_email()
{
	echo " to $newrev ($newrev_type)"
	echo " from $oldrev"
	generate_general_email
}
#
# Called for creation or update of any other type of reference
#
generate_general_email()
{
	# Unannotated tags are more about marking a point than releasing a
	# version; therefore we don't do the shortlog summary that we do for
	# annotated tags above - we simply show that the point has been
	# marked, and print the log message for the marked point for
	# reference purposes
	#
	# Note this section also catches any other reference type (although
	# there aren't any) and deals with them in the same way.

	echo ""
	if [ "$newrev_type" = "commit" ]; then
		echo $LOGBEGIN
		git diff-tree -s --always --encoding=UTF-8 --pretty=medium $newrev
		echo $LOGEND
	else
		# What can we do here? The tag marks an object that is not
		# a commit, so there is no log for us to display. It's
		# probably not wise to output git cat-file as it could be a
		# binary blob. We'll just say how big it is
		echo "$newrev is a $newrev_type, and is $(git cat-file -s $newrev) bytes long."
	fi
}
#
# Called for the deletion of any other type of reference
#
generate_delete_general_email()
{
	echo " was $oldrev"
	echo ""
	echo $LOGBEGIN
	git diff-tree -s --always --encoding=UTF-8 --pretty=oneline $oldrev
	echo $LOGEND
}
# --------------- Miscellaneous utilities
#
# Show new revisions as the user would like to see them in the email.
#
show_new_revisions()
{
	# This shows all log entries that are not already covered by
	# another ref - i.e. commits that are now accessible from this
	# ref that were previously not accessible
	# (see generate_update_branch_email for the explanation of this
	# command)

	# Revision range passed to rev-list differs for new vs. updated
	# branches.
	if [ "$change_type" = create ]
	then
		# Show all revisions exclusive to this (new) branch.
		revspec=$newrev
	else
		# Branch update; show revisions not part of $oldrev.
		revspec=$oldrev..$newrev
	fi

	other_branches=$(git for-each-ref --format='%(refname)' refs/heads/ |
	    grep -F -v $refname)
	git rev-parse --not $other_branches |
	if [ -z "$custom_showrev" ]
	then
		git rev-list --pretty --stdin $revspec
	else
		git rev-list --stdin $revspec |
		while read onerev
		do
			eval $(printf "$custom_showrev" $onerev)
		done
	fi
}
limit_lines()
{
	lines=0
	skipped=0
	while IFS="" read -r line; do
		lines=$((lines + 1))
		if [ $lines -gt $1 ]; then
			skipped=$((skipped + 1))
		else
			printf "%s\n" "$line"
		fi
	done
	if [ $skipped -ne 0 ]; then
		echo "... $skipped lines suppressed ..."
	fi
}
send_mail()
{
	if [ -n "$envelopesender" ]; then
		/usr/sbin/sendmail -t -f "$envelopesender"
	else
		/usr/sbin/sendmail -t
	fi
}
# ---------------------------- main()
# --- Constants
LOGBEGIN="- Log -----------------------------------------------------------------"
LOGEND="-----------------------------------------------------------------------"
# --- Config
# Set GIT_DIR either from the working directory, or from the environment
# variable.
GIT_DIR=$(git rev-parse --git-dir 2>/dev/null)
if [ -z "$GIT_DIR" ]; then
echo >&2 "fatal: post-receive: GIT_DIR not set"
exit 1
fi
projectdesc=$(sed -ne '1p' "$GIT_DIR/description" 2>/dev/null)
# Check if the description is unchanged from its default, and shorten it to
# a more manageable length if it is
if expr "$projectdesc" : "Unnamed repository.*$" >/dev/null
then
projectdesc="UNNAMED PROJECT"
fi
recipients=$(git config hooks.mailinglist)
announcerecipients=$(git config hooks.announcelist)
envelopesender=$(git config hooks.envelopesender)
emailprefix=$(git config hooks.emailprefix || echo '[SCM] ')
custom_showrev=$(git config hooks.showrev)
maxlines=$(git config hooks.emailmaxlines)
diffopts=$(git config hooks.diffopts)
: ${diffopts:="--stat --summary --find-copies-harder"}
# --- Main loop
# Allow dual mode: run from the command line just like the update hook, or
# if no arguments are given then run as a hook script
if [ -n "$1" -a -n "$2" -a -n "$3" ]; then
	# Output to the terminal in command line mode - if someone wanted to
	# resend an email; they could redirect the output to sendmail
	# themselves
	prep_for_email $2 $3 $1 && PAGER= generate_email
else
	while read oldrev newrev refname
	do
		prep_for_email $oldrev $newrev $refname || continue
		generate_email $maxlines | send_mail
	done
fi
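Configuring the hook happens entirely through git config in the receiving repository; the addresses and prefix in this sketch are hypothetical:

```shell
# Set up a (bare) server repository and the hook's configuration keys.
repo=$(mktemp -d)
git init -q --bare "$repo"
git -C "$repo" config hooks.mailinglist dev@example.com
git -C "$repo" config hooks.emailprefix "[myproject] "
# Adding -p includes a full unified diff in each notification email.
git -C "$repo" config hooks.diffopts "--stat --summary --find-copies-harder -p"

# Review everything the hook will read at run time.
git -C "$repo" config --get-regexp '^hooks\.'
```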

View File

@@ -1,42 +0,0 @@
#!/bin/sh
#
# An example hook script to verify if you are on battery, in case you
# are running Linux or OS X. Called by git-gc --auto with no arguments.
# The hook should exit with non-zero status after issuing an appropriate
# message if it wants to stop the auto repacking.
#
# This hook is stored in the contrib/hooks directory. Your distribution
# may have put this somewhere else. If you want to use this hook, you
# should make this script executable then link to it in the repository
# you would like to use it in.
#
# For example, if the hook is stored in
# /usr/share/git-core/contrib/hooks/pre-auto-gc-battery:
#
# cd /path/to/your/repository.git
# ln -sf /usr/share/git-core/contrib/hooks/pre-auto-gc-battery \
# hooks/pre-auto-gc
if test -x /sbin/on_ac_power && (/sbin/on_ac_power; test $? -ne 1)
then
	exit 0
elif test "$(cat /sys/class/power_supply/AC/online 2>/dev/null)" = 1
then
	exit 0
elif grep -q 'on-line' /proc/acpi/ac_adapter/AC/state 2>/dev/null
then
	exit 0
elif grep -q '0x01$' /proc/apm 2>/dev/null
then
	exit 0
elif grep -q "AC Power \+: 1" /proc/pmu/info 2>/dev/null
then
	exit 0
elif test -x /usr/bin/pmset && /usr/bin/pmset -g batt |
	grep -q "drawing from 'AC Power'"
then
	exit 0
fi

echo "Auto packing deferred; not on AC"
exit 1
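Installation follows the symlink recipe from the header comment; this sketch substitutes a stand-in script and temporary paths for the distribution-specific ones:

```shell
# A bare repository whose auto-gc we want to guard.
repo=$(mktemp -d)
git init -q --bare "$repo"

# Stand-in for the contrib script (normally shipped by your distro).
hook=$(mktemp)
printf '#!/bin/sh\nexit 0\n' > "$hook"
chmod +x "$hook"
ln -sf "$hook" "$repo/hooks/pre-auto-gc"

# "git gc --auto" now consults the hook first: exit status 0 lets the
# repack proceed, non-zero defers it.
"$repo/hooks/pre-auto-gc"
```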

View File

@@ -1,214 +0,0 @@
#!/usr/bin/perl
#
# Copyright (c) 2006 Josh England
#
# This script can be used to save/restore full permissions and ownership data
# within a git working tree.
#
# To save permissions/ownership data, place this script in your .git/hooks
# directory and enable a `pre-commit` hook with the following lines:
# #!/bin/sh
# SUBDIRECTORY_OK=1 . git-sh-setup
# $GIT_DIR/hooks/setgitperms.perl -r
#
# To restore permissions/ownership data, place this script in your .git/hooks
# directory and enable a `post-merge` and `post-checkout` hook with the
# following lines:
# #!/bin/sh
# SUBDIRECTORY_OK=1 . git-sh-setup
# $GIT_DIR/hooks/setgitperms.perl -w
#
use strict;
use Getopt::Long;
use File::Find;
use File::Basename;
my $usage =
"usage: setgitperms.perl [OPTION]... <--read|--write>
This program uses a file `.gitmeta` to store/restore permissions and uid/gid
info for all files/dirs tracked by git in the repository.
---------------------------------Read Mode-------------------------------------
-r, --read Reads perms/etc from working dir into a .gitmeta file
-s, --stdout Output to stdout instead of .gitmeta
-d, --diff Show unified diff of perms file (XOR with --stdout)
---------------------------------Write Mode------------------------------------
-w, --write Modify perms/etc in working dir to match the .gitmeta file
-v, --verbose Be verbose
\n";
my ($stdout, $showdiff, $verbose, $read_mode, $write_mode);
if ((@ARGV < 1) || !GetOptions(
	"stdout", \$stdout,
	"diff", \$showdiff,
	"read", \$read_mode,
	"write", \$write_mode,
	"verbose", \$verbose,
)) { die $usage; }
die $usage unless ($read_mode xor $write_mode);
my $topdir = `git rev-parse --show-cdup` or die "\n"; chomp $topdir;
my $gitdir = $topdir . '.git';
my $gitmeta = $topdir . '.gitmeta';
if ($write_mode) {
	# Update the working dir permissions/ownership based on data from .gitmeta
	open (IN, "<$gitmeta") or die "Could not open $gitmeta for reading: $!\n";
	while (defined ($_ = <IN>)) {
		chomp;
		if (/^(.*) mode=(\S+)\s+uid=(\d+)\s+gid=(\d+)/) {
			# Compare recorded perms to actual perms in the working dir
			my ($path, $mode, $uid, $gid) = ($1, $2, $3, $4);
			my $fullpath = $topdir . $path;
			my (undef,undef,$wmode,undef,$wuid,$wgid) = lstat($fullpath);
			$wmode = sprintf "%04o", $wmode & 07777;
			if ($mode ne $wmode) {
				$verbose && print "Updating permissions on $path: old=$wmode, new=$mode\n";
				chmod oct($mode), $fullpath;
			}
			if ($uid != $wuid || $gid != $wgid) {
				if ($verbose) {
					# Print out user/group names instead of uid/gid
					my $pwname = getpwuid($uid);
					my $grpname = getgrgid($gid);
					my $wpwname = getpwuid($wuid);
					my $wgrpname = getgrgid($wgid);
					$pwname = $uid if !defined $pwname;
					$grpname = $gid if !defined $grpname;
					$wpwname = $wuid if !defined $wpwname;
					$wgrpname = $wgid if !defined $wgrpname;
					print "Updating uid/gid on $path: old=$wpwname/$wgrpname, new=$pwname/$grpname\n";
				}
				chown $uid, $gid, $fullpath;
			}
		}
		else {
			warn "Invalid input format in $gitmeta:\n\t$_\n";
		}
	}
	close IN;
}
elsif ($read_mode) {
# Handle merge conflicts in the .gitperms file
if (-e "$gitdir/MERGE_MSG") {
if (`grep ====== $gitmeta`) {
# Conflict not resolved -- abort the commit
print "PERMISSIONS/OWNERSHIP CONFLICT\n";
print " Resolve the conflict in the $gitmeta file and then run\n";
print " `.git/hooks/setgitperms.perl --write` to reconcile.\n";
exit 1;
}
elsif (`grep $gitmeta $gitdir/MERGE_MSG`) {
# A conflict in .gitmeta has been manually resolved. Verify that
# the working dir perms matches the current .gitmeta perms for
# each file/dir that conflicted.
# This is here because a `setgitperms.perl --write` was not
# performed due to a merge conflict, so permissions/ownership
# may not be consistent with the manually merged .gitmeta file.
my @conflict_diff = `git show \$(cat $gitdir/MERGE_HEAD)`;
my @conflict_files;
my $metadiff = 0;
# Build a list of files that conflicted from the .gitmeta diff
foreach my $line (@conflict_diff) {
if ($line =~ m|^diff --git a/$gitmeta b/$gitmeta|) {
$metadiff = 1;
}
elsif ($line =~ /^diff --git/) {
$metadiff = 0;
}
elsif ($metadiff && $line =~ /^\+(.*) mode=/) {
push @conflict_files, $1;
}
}
# Verify that each conflict file now has permissions consistent
# with the .gitmeta file
foreach my $file (@conflict_files) {
my $absfile = $topdir . $file;
my $gm_entry = `grep "^$file mode=" $gitmeta`;
if ($gm_entry =~ /mode=(\d+) uid=(\d+) gid=(\d+)/) {
my ($gm_mode, $gm_uid, $gm_gid) = ($1, $2, $3);
my (undef,undef,$mode,undef,$uid,$gid) = lstat("$absfile");
$mode = sprintf("%04o", $mode & 07777);
if (($gm_mode ne $mode) || ($gm_uid != $uid)
|| ($gm_gid != $gid)) {
print "PERMISSIONS/OWNERSHIP CONFLICT\n";
print " Mismatch found for file: $file\n";
print " Run `.git/hooks/setgitperms.perl --write` to reconcile.\n";
exit 1;
}
}
else {
print "Warning! Permissions/ownership no longer being tracked for file: $file\n";
}
}
}
}
# No merge conflicts -- write out perms/ownership data to .gitmeta file
unless ($stdout) {
open (OUT, ">$gitmeta.tmp") or die "Could not open $gitmeta.tmp for writing: $!\n";
}
my @files = `git ls-files`;
my %dirs;
foreach my $path (@files) {
chomp $path;
# We have to manually add stats for parent directories
my $parent = dirname($path);
while (!exists $dirs{$parent}) {
$dirs{$parent} = 1;
next if $parent eq '.';
printstats($parent);
$parent = dirname($parent);
}
# Now the git-tracked file
printstats($path);
}
# diff the temporary metadata file to see if anything has changed
# If no metadata has changed, don't overwrite the real file
# This is just so `git commit -a` doesn't try to commit a bogus update
unless ($stdout) {
if (! -e $gitmeta) {
rename "$gitmeta.tmp", $gitmeta;
}
else {
my $diff = `diff -U 0 $gitmeta $gitmeta.tmp`;
if ($diff ne '') {
rename "$gitmeta.tmp", $gitmeta;
}
else {
unlink "$gitmeta.tmp";
}
if ($showdiff) {
print $diff;
}
}
close OUT;
}
# Make sure the .gitmeta file is tracked
system("git add $gitmeta");
}
sub printstats {
my $path = $_[0];
$path =~ s/@/\@/g;
my (undef,undef,$mode,undef,$uid,$gid) = lstat($path);
$path =~ s/%/\%/g;
if ($stdout) {
print $path;
printf " mode=%04o uid=$uid gid=$gid\n", $mode & 07777;
}
else {
print OUT $path;
printf OUT " mode=%04o uid=$uid gid=$gid\n", $mode & 07777;
}
}

View File

@@ -1,421 +0,0 @@
#!/usr/bin/perl
use strict;
use File::Spec;
$ENV{PATH} = '/opt/git/bin';
my $acl_git = '/vcs/acls.git';
my $acl_branch = 'refs/heads/master';
my $debug = 0;
=doc
Invoked as: update refname old-sha1 new-sha1
This script is run by git-receive-pack once for each ref that the
client is trying to modify. If we exit with a non-zero exit value
then the update for that particular ref is denied, but updates for
other refs in the same run of receive-pack may still be allowed.
We are run after the objects have been uploaded, but before the
ref is actually modified. We take advantage of that fact when we
look for "new" commits and tags (the new objects won't show up in
`rev-list --all`).
This script loads and parses the content of the config file
"users/$this_user.acl" from the $acl_branch commit of $acl_git ODB.
The acl file is a git-config style file, but uses a slightly more
restricted syntax as the Perl parser contained within this script
is not nearly as permissive as git-config.
Example:
[user]
committer = John Doe <john.doe@example.com>
committer = John R. Doe <john.doe@example.com>
[repository "acls"]
allow = heads/master
allow = CDUR for heads/jd/
allow = C for ^tags/v\\d+$
For all new commit or tag objects the committer (or tagger) line
within the object must exactly match one of the user.committer
values listed in the acl file ("HEAD:users/$this_user.acl").
For a branch to be modified an allow line within the matching
repository section must be matched for both the refname and the
opcode.
Repository sections are matched on the basename of the repository
(after removing the .git suffix).
The opcode abbreviations are:
C: create new ref
D: delete existing ref
U: fast-forward existing ref (no commit loss)
R: rewind/rebase existing ref (commit loss)
If no opcodes are listed before the "for" keyword, then "U" (for
fast-forward update only) is assumed, as this is the most common
usage.
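For example, given the parsing rules above, the following two lines
are equivalent, since "U" is the default opcode:

    allow = heads/maint
    allow = U for heads/maint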
Refnames are matched by always assuming a prefix of "refs/".
This hook forbids pushing or deleting anything not under "refs/".
Refnames that start with ^ are Perl regular expressions, and the ^
is kept as part of the regexp. \\ is needed to get just one \, so
\\d expands to \d in Perl. The 3rd allow line above is an example.
Refnames that don't start with ^ but that end with / are prefix
matches (2nd allow line above); all other refnames are strict
equality matches (1st allow line).
Anything pushed to "heads/" (ok, really "refs/heads/") must be
a commit. Tags are not permitted here.
Anything pushed to "tags/" (err, really "refs/tags/") must be an
annotated tag. Commits, blobs, trees, etc. are not permitted here.
Annotated tag signatures aren't checked, nor are they required.
The special subrepository of 'info/new-commit-check' can
be created and used to allow users to push new commits and
tags from another local repository to this one, even if they
aren't the committer/tagger of those objects. In a nutshell
the info/new-commit-check directory is a Git repository whose
objects/info/alternates file lists this repository and all other
possible sources, and whose refs subdirectory contains symlinks
to this repository's refs subdirectory, and to all other possible
sources refs subdirectories. Yes, this means that you cannot
use packed-refs in those repositories as they won't be resolved
correctly.
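A sketch of one possible layout for such a setup, with /vcs/this.git
standing for this repository and /vcs/other.git for another source
(all paths hypothetical, and a real setup also needs the usual bare
repository scaffolding such as HEAD):

    info/new-commit-check/objects/info/alternates, listing:
        /vcs/this.git/objects
        /vcs/other.git/objects
    info/new-commit-check/refs/this  -> symlink to /vcs/this.git/refs
    info/new-commit-check/refs/other -> symlink to /vcs/other.git/refs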
=cut
my $git_dir = $ENV{GIT_DIR};
my $new_commit_check = "$git_dir/info/new-commit-check";
my $ref = $ARGV[0];
my $old = $ARGV[1];
my $new = $ARGV[2];
my $new_type;
my ($this_user) = getpwuid $<; # REAL_USER_ID
my $repository_name;
my %user_committer;
my @allow_rules;
my @path_rules;
my %diff_cache;
sub deny ($) {
print STDERR "-Deny- $_[0]\n" if $debug;
print STDERR "\ndenied: $_[0]\n\n";
exit 1;
}
sub grant ($) {
print STDERR "-Grant- $_[0]\n" if $debug;
exit 0;
}
sub info ($) {
print STDERR "-Info- $_[0]\n" if $debug;
}
sub git_value (@) {
open(T,'-|','git',@_); local $_ = <T>; chop; close T; $_;
}
sub match_string ($$) {
my ($acl_n, $ref) = @_;
($acl_n eq $ref)
|| ($acl_n =~ m,/$, && substr($ref,0,length $acl_n) eq $acl_n)
|| ($acl_n =~ m,^\^, && $ref =~ m:$acl_n:);
}
sub parse_config ($$$$) {
my $data = shift;
local $ENV{GIT_DIR} = shift;
my $br = shift;
my $fn = shift;
return unless git_value('rev-list','--max-count=1',$br,'--',$fn);
info "Loading $br:$fn";
open(I,'-|','git','cat-file','blob',"$br:$fn");
my $section = '';
while (<I>) {
chomp;
if (/^\s*$/ || /^\s*#/) {
} elsif (/^\[([a-z]+)\]$/i) {
$section = lc $1;
} elsif (/^\[([a-z]+)\s+"(.*)"\]$/i) {
$section = join('.',lc $1,$2);
} elsif (/^\s*([a-z][a-z0-9]+)\s*=\s*(.*?)\s*$/i) {
push @{$data->{join('.',$section,lc $1)}}, $2;
} else {
deny "bad config file line $. in $br:$fn";
}
}
close I;
}
sub all_new_committers () {
local $ENV{GIT_DIR} = $git_dir;
$ENV{GIT_DIR} = $new_commit_check if -d $new_commit_check;
info "Getting committers of new commits.";
my %used;
open(T,'-|','git','rev-list','--pretty=raw',$new,'--not','--all');
while (<T>) {
next unless s/^committer //;
chop;
s/>.*$/>/;
info "Found $_." unless $used{$_}++;
}
close T;
info "No new commits." unless %used;
keys %used;
}
sub all_new_taggers () {
my %exists;
open(T,'-|','git','for-each-ref','--format=%(objectname)','refs/tags');
while (<T>) {
chop;
$exists{$_} = 1;
}
close T;
info "Getting taggers of new tags.";
my %used;
my $obj = $new;
my $obj_type = $new_type;
while ($obj_type eq 'tag') {
last if $exists{$obj};
$obj_type = '';
open(T,'-|','git','cat-file','tag',$obj);
while (<T>) {
chop;
if (/^object ([a-z0-9]{40})$/) {
$obj = $1;
} elsif (/^type (.+)$/) {
$obj_type = $1;
} elsif (s/^tagger //) {
s/>.*$/>/;
info "Found $_." unless $used{$_}++;
last;
}
}
close T;
}
info "No new tags." unless %used;
keys %used;
}
sub check_committers (@) {
my @bad;
foreach (@_) { push @bad, $_ unless $user_committer{$_}; }
if (@bad) {
print STDERR "\n";
print STDERR "You are not $_.\n" foreach (sort @bad);
deny "You cannot push changes not committed by you.";
}
}
sub load_diff ($) {
my $base = shift;
my $d = $diff_cache{$base};
unless ($d) {
local $/ = "\0";
my %this_diff;
if ($base =~ /^0{40}$/) {
# Don't load the diff at all; we are making the
# branch and have no base to compare to in this
# case. A file level ACL makes no sense in this
# context. Having an empty diff will allow the
# branch creation.
#
} else {
open(T,'-|','git','diff-tree',
'-r','--name-status','-z',
$base,$new) or return undef;
while (<T>) {
my $op = $_;
chop $op;
my $path = <T>;
chop $path;
$this_diff{$path} = $op;
}
close T or return undef;
}
$d = \%this_diff;
$diff_cache{$base} = $d;
}
return $d;
}
deny "No GIT_DIR inherited from caller" unless $git_dir;
deny "Need a ref name" unless $ref;
deny "Refusing funny ref $ref" unless $ref =~ s,^refs/,,;
deny "Bad old value $old" unless $old =~ /^[a-z0-9]{40}$/;
deny "Bad new value $new" unless $new =~ /^[a-z0-9]{40}$/;
deny "Cannot determine who you are." unless $this_user;
grant "No change requested." if $old eq $new;
$repository_name = File::Spec->rel2abs($git_dir);
$repository_name =~ m,/([^/]+)(?:\.git|/\.git)$,;
$repository_name = $1;
info "Updating in '$repository_name'.";
my $op;
if ($old =~ /^0{40}$/) { $op = 'C'; }
elsif ($new =~ /^0{40}$/) { $op = 'D'; }
else { $op = 'R'; }
# This is really an update (fast-forward) if the
# merge base of $old and $new is $old.
#
$op = 'U' if ($op eq 'R'
&& $ref =~ m,^heads/,
&& $old eq git_value('merge-base',$old,$new));
# Load the user's ACL file. Expand groups (user.memberof) one level.
{
my %data = ('user.committer' => []);
parse_config(\%data,$acl_git,$acl_branch,"external/$repository_name.acl");
%data = (
'user.committer' => $data{'user.committer'},
'user.memberof' => [],
);
parse_config(\%data,$acl_git,$acl_branch,"users/$this_user.acl");
%user_committer = map {$_ => $_} @{$data{'user.committer'}};
my $rule_key = "repository.$repository_name.allow";
my $rules = $data{$rule_key} || [];
foreach my $group (@{$data{'user.memberof'}}) {
my %g;
parse_config(\%g,$acl_git,$acl_branch,"groups/$group.acl");
my $group_rules = $g{$rule_key};
push @$rules, @$group_rules if $group_rules;
}
RULE:
foreach (@$rules) {
while (/\${user\.([a-z][a-zA-Z0-9]+)}/) {
my $k = lc $1;
my $v = $data{"user.$k"};
next RULE unless defined $v;
next RULE if @$v != 1;
next RULE unless defined $v->[0];
s/\${user\.$k}/$v->[0]/g;
}
if (/^([AMD ]+)\s+of\s+([^\s]+)\s+for\s+([^\s]+)\s+diff\s+([^\s]+)$/) {
my ($ops, $pth, $ref, $bst) = ($1, $2, $3, $4);
$ops =~ s/ //g;
$pth =~ s/\\\\/\\/g;
$ref =~ s/\\\\/\\/g;
push @path_rules, [$ops, $pth, $ref, $bst];
} elsif (/^([AMD ]+)\s+of\s+([^\s]+)\s+for\s+([^\s]+)$/) {
my ($ops, $pth, $ref) = ($1, $2, $3);
$ops =~ s/ //g;
$pth =~ s/\\\\/\\/g;
$ref =~ s/\\\\/\\/g;
push @path_rules, [$ops, $pth, $ref, $old];
} elsif (/^([CDRU ]+)\s+for\s+([^\s]+)$/) {
my $ops = $1;
my $ref = $2;
$ops =~ s/ //g;
$ref =~ s/\\\\/\\/g;
push @allow_rules, [$ops, $ref];
} elsif (/^for\s+([^\s]+)$/) {
# Mentioned, but nothing granted?
} elsif (/^[^\s]+$/) {
s/\\\\/\\/g;
push @allow_rules, ['U', $_];
}
}
}
if ($op ne 'D') {
$new_type = git_value('cat-file','-t',$new);
if ($ref =~ m,^heads/,) {
deny "$ref must be a commit." unless $new_type eq 'commit';
} elsif ($ref =~ m,^tags/,) {
deny "$ref must be an annotated tag." unless $new_type eq 'tag';
}
check_committers (all_new_committers);
check_committers (all_new_taggers) if $new_type eq 'tag';
}
info "$this_user wants $op for $ref";
foreach my $acl_entry (@allow_rules) {
my ($acl_ops, $acl_n) = @$acl_entry;
next unless $acl_ops =~ /^[CDRU]+$/; # Uhh.... shouldn't happen.
next unless $acl_n;
next unless $op =~ /^[$acl_ops]$/;
next unless match_string $acl_n, $ref;
# Don't test path rules on branch deletes.
#
grant "Allowed by: $acl_ops for $acl_n" if $op eq 'D';
# Aggregate matching path rules; allow if there aren't
# any matching this ref.
#
my %pr;
foreach my $p_entry (@path_rules) {
my ($p_ops, $p_n, $p_ref, $p_bst) = @$p_entry;
next unless $p_ref;
push @{$pr{$p_bst}}, $p_entry if match_string $p_ref, $ref;
}
grant "Allowed by: $acl_ops for $acl_n" unless %pr;
# Allow only if all changes against a single base are
# allowed by file path rules.
#
my @bad;
foreach my $p_bst (keys %pr) {
my $diff_ref = load_diff $p_bst;
deny "Cannot difference trees." unless ref $diff_ref;
my %fd = %$diff_ref;
foreach my $p_entry (@{$pr{$p_bst}}) {
my ($p_ops, $p_n, $p_ref, $p_bst) = @$p_entry;
next unless $p_ops =~ /^[AMD]+$/;
next unless $p_n;
foreach my $f_n (keys %fd) {
my $f_op = $fd{$f_n};
next unless $f_op;
next unless $f_op =~ /^[$p_ops]$/;
delete $fd{$f_n} if match_string $p_n, $f_n;
}
last unless %fd;
}
if (%fd) {
push @bad, [$p_bst, \%fd];
} else {
# All changes relative to $p_bst were allowed.
#
grant "Allowed by: $acl_ops for $acl_n diff $p_bst";
}
}
foreach my $bad_ref (@bad) {
my ($p_bst, $fd) = @$bad_ref;
print STDERR "\n";
print STDERR "Not allowed to make the following changes:\n";
print STDERR "(base: $p_bst)\n";
foreach my $f_n (sort keys %$fd) {
print STDERR " $fd->{$f_n} $f_n\n";
}
}
deny "You are not permitted to $op $ref";
}
deny "You are not permitted to $op $ref";

View File

@@ -1,2 +0,0 @@
git-remote-mediawiki
git-mw

View File

@@ -1,28 +0,0 @@
# These 3 rules demand adding the /s, /m and /x flags to *every* regexp. This
# is overkill and would harm readability.
[-RegularExpressions::RequireExtendedFormatting]
[-RegularExpressions::RequireDotMatchAnything]
[-RegularExpressions::RequireLineBoundaryMatching]
# This rule says that builtin functions should not be called with parentheses
# e.g.: (taken from CPAN's documentation)
# open($handle, '>', $filename); #not ok
# open $handle, '>', $filename; #ok
# Applying such a rule would mean modifying a huge number of lines for a
# question of style.
[-CodeLayout::ProhibitParensWithBuiltins]
# This rule states that each system call should have its return value checked
# The problem is that it includes the print call. Checking every print call's
# return value would be harmful to the code readability.
# This configuration keeps all default function but print.
[InputOutput::RequireCheckedSyscalls]
functions = open say close
# This rule demands adding a dependency on the Readonly module. This is not
# desired.
[-ValuesAndExpressions::ProhibitConstantPragma]
# This rule is not really useful (rather a question of style) and produces many
# warnings throughout the code.
[-ValuesAndExpressions::ProhibitNoisyQuotes]

View File

@@ -1,101 +0,0 @@
package Git::Mediawiki;
require v5.26;
use strict;
use POSIX;
use Git;
BEGIN {
our ($VERSION, @ISA, @EXPORT, @EXPORT_OK);
# Totally unstable API.
$VERSION = '0.01';
require Exporter;
@ISA = qw(Exporter);
@EXPORT = ();
# Methods which can be called as standalone functions as well:
@EXPORT_OK = qw(clean_filename smudge_filename connect_maybe
EMPTY HTTP_CODE_OK HTTP_CODE_PAGE_NOT_FOUND);
}
# MediaWiki filenames can contain forward slashes. This constant decides which pattern replaces them.
use constant SLASH_REPLACEMENT => '%2F';
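# For instance (illustrative mapping only): a wiki page named "Foo/Bar"
# corresponds to a local file "Foo%2FBar.mw" -- smudge_filename() below
# maps "/" to SLASH_REPLACEMENT on the way out, and clean_filename()
# maps it back.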
# Used to test for empty strings
use constant EMPTY => q{};
# HTTP codes
use constant HTTP_CODE_OK => 200;
use constant HTTP_CODE_PAGE_NOT_FOUND => 404;
sub clean_filename {
my $filename = shift;
$filename =~ s{@{[SLASH_REPLACEMENT]}}{/}g;
# [, ], |, {, and } are forbidden by MediaWiki, even URL-encoded.
# Do a variant of URL-encoding, i.e. looks like URL-encoding,
# but with _ added to prevent MediaWiki from thinking this is
# an actual special character.
$filename =~ s/[\[\]\{\}\|]/sprintf("_%%_%x", ord($&))/ge;
# If we used URI escaping earlier, we should unescape here
# before anything else.
return $filename;
}
sub smudge_filename {
my $filename = shift;
$filename =~ s{/}{@{[SLASH_REPLACEMENT]}}g;
$filename =~ s/ /_/g;
# Decode forbidden characters encoded in clean_filename
$filename =~ s/_%_([0-9a-fA-F][0-9a-fA-F])/sprintf('%c', hex($1))/ge;
return substr($filename, 0, NAME_MAX-length('.mw'));
}
sub connect_maybe {
my $wiki = shift;
if ($wiki) {
return $wiki;
}
my $remote_name = shift;
my $remote_url = shift;
my ($wiki_login, $wiki_password, $wiki_domain);
$wiki_login = Git::config("remote.${remote_name}.mwLogin");
$wiki_password = Git::config("remote.${remote_name}.mwPassword");
$wiki_domain = Git::config("remote.${remote_name}.mwDomain");
$wiki = MediaWiki::API->new;
$wiki->{config}->{api_url} = "${remote_url}/api.php";
if ($wiki_login) {
my %credential = (
'url' => $remote_url,
'username' => $wiki_login,
'password' => $wiki_password
);
Git::credential(\%credential);
my $request = {lgname => $credential{username},
lgpassword => $credential{password},
lgdomain => $wiki_domain};
if ($wiki->login($request)) {
Git::credential(\%credential, 'approve');
print {*STDERR} qq(Logged in mediawiki user "$credential{username}".\n);
} else {
print {*STDERR} qq(Failed to log in mediawiki user "$credential{username}" on ${remote_url}\n);
print {*STDERR} ' (error ' .
$wiki->{error}->{code} . ': ' .
$wiki->{error}->{details} . ")\n";
Git::credential(\%credential, 'reject');
exit 1;
}
}
return $wiki;
}
1; # Famous last words

View File

@@ -1,61 +0,0 @@
#
# Copyright (C) 2013
# Matthieu Moy <Matthieu.Moy@imag.fr>
#
# To build and test:
#
# make
# bin-wrapper/git mw preview Some_page.mw
# bin-wrapper/git clone mediawiki::http://example.com/wiki/
#
# To install, run Git's toplevel 'make install' then run:
#
# make install
# The default target of this Makefile is...
all::
GIT_MEDIAWIKI_PM=Git/Mediawiki.pm
SCRIPT_PERL=git-remote-mediawiki.perl
SCRIPT_PERL+=git-mw.perl
GIT_ROOT_DIR=../..
HERE=contrib/mw-to-git/
INSTALL = install
SCRIPT_PERL_FULL=$(patsubst %,$(HERE)/%,$(SCRIPT_PERL))
INSTLIBDIR=$(shell $(MAKE) -C $(GIT_ROOT_DIR)/ \
-s --no-print-directory prefix=$(prefix) \
perllibdir=$(perllibdir) perllibdir)
DESTDIR_SQ = $(subst ','\'',$(DESTDIR))
INSTLIBDIR_SQ = $(subst ','\'',$(INSTLIBDIR))
all:: build
test: all
$(MAKE) -C t
check: perlcritic test
install_pm:
$(INSTALL) -d -m 755 '$(DESTDIR_SQ)$(INSTLIBDIR_SQ)/Git'
$(INSTALL) -m 644 $(GIT_MEDIAWIKI_PM) \
'$(DESTDIR_SQ)$(INSTLIBDIR_SQ)/$(GIT_MEDIAWIKI_PM)'
build:
$(MAKE) -C $(GIT_ROOT_DIR) SCRIPT_PERL="$(SCRIPT_PERL_FULL)" \
build-perl-script
install: install_pm
$(MAKE) -C $(GIT_ROOT_DIR) SCRIPT_PERL="$(SCRIPT_PERL_FULL)" \
install-perl-script
clean:
$(MAKE) -C $(GIT_ROOT_DIR) SCRIPT_PERL="$(SCRIPT_PERL_FULL)" \
clean-perl-script
perlcritic:
perlcritic -5 $(SCRIPT_PERL)
-perlcritic -2 $(SCRIPT_PERL)
.PHONY: all test check install_pm install clean perlcritic

View File

@@ -1,14 +0,0 @@
#!/bin/sh
# git executable wrapper script for Git-Mediawiki to run tests without
# installing all the scripts and perl packages.
GIT_ROOT_DIR=../../..
GIT_EXEC_PATH=$(cd "$(dirname "$0")" && cd ${GIT_ROOT_DIR} && pwd)
GITPERLLIB="$GIT_EXEC_PATH"'/contrib/mw-to-git'"${GITPERLLIB:+:$GITPERLLIB}"
PATH="$GIT_EXEC_PATH"'/contrib/mw-to-git:'"$PATH"
export GITPERLLIB PATH
exec "${GIT_EXEC_PATH}/bin-wrappers/git" "$@"

View File

@@ -1,368 +0,0 @@
#!/usr/bin/perl
# Copyright (C) 2013
# Benoit Person <benoit.person@ensimag.imag.fr>
# Celestin Matte <celestin.matte@ensimag.imag.fr>
# License: GPL v2 or later
# Set of tools for git repo with a mediawiki remote.
# Documentation & bugtracker: https://github.com/Git-Mediawiki/Git-Mediawiki
use strict;
use warnings;
use Getopt::Long;
use URI::URL qw(url);
use LWP::UserAgent;
use HTML::TreeBuilder;
use Git;
use MediaWiki::API;
use Git::Mediawiki qw(clean_filename connect_maybe
EMPTY HTTP_CODE_PAGE_NOT_FOUND);
# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ':encoding(UTF-8)';
binmode STDOUT, ':encoding(UTF-8)';
# Global parameters
my $verbose = 0;
sub v_print {
if ($verbose) {
return print {*STDERR} @_;
}
return;
}
# Preview parameters
my $file_name = EMPTY;
my $remote_name = EMPTY;
my $preview_file_name = EMPTY;
my $autoload = 0;
sub file {
$file_name = shift;
return $file_name;
}
my %commands = (
'help' =>
[\&help, {}, \&help],
'preview' =>
[\&preview, {
'<>' => \&file,
'output|o=s' => \$preview_file_name,
'remote|r=s' => \$remote_name,
'autoload|a' => \$autoload
}, \&preview_help]
);
# Search for sub-command
my $cmd = $commands{'help'};
for (0..@ARGV-1) {
if (defined $commands{$ARGV[$_]}) {
$cmd = $commands{$ARGV[$_]};
splice @ARGV, $_, 1;
last;
}
};
GetOptions( %{$cmd->[1]},
'help|h' => \&{$cmd->[2]},
'verbose|v' => \$verbose);
# Launch command
&{$cmd->[0]};
############################# Preview Functions ################################
sub preview_help {
print {*STDOUT} <<'END';
USAGE: git mw preview [--remote|-r <remote name>] [--autoload|-a]
[--output|-o <output filename>] [--verbose|-v]
<blob> | <filename>
DESCRIPTION:
Preview is a utility to preview local content of a mediawiki repo as if it were
pushed to the remote.
For that, preview searches for the remote name of the current branch's
upstream if --remote is not set. If that remote is not found or if it
is not a mediawiki, it lists all mediawiki remotes configured and asks
you to replay your command with the --remote option set properly.
Then, it searches for a file named 'filename'. If it's not found in
the current dir, it will assume it's a blob.
The content retrieved in the file (or in the blob) will then be parsed
by the remote mediawiki and combined with a template retrieved from
the mediawiki.
Finally, preview will save the HTML result in a file, and autoload it
in your default web browser if the option --autoload is present.
OPTIONS:
-r <remote name>, --remote <remote name>
If the remote is a mediawiki, the template and the parse engine
used for the preview will be those of that remote.
If not, a list of valid remotes will be shown.
-a, --autoload
Try to load the HTML output in a new tab (or new window) of your
default web browser.
-o <output filename>, --output <output filename>
Change the HTML output filename. Default filename is based on the
input filename with its extension replaced by '.html'.
-v, --verbose
Show more information on what's going on under the hood.
END
exit;
}
sub preview {
my $wiki;
my ($remote_url, $wiki_page_name);
my ($new_content, $template);
my $file_content;
if ($file_name eq EMPTY) {
die "Missing file argument, see `git mw help`\n";
}
v_print("### Selecting remote\n");
if ($remote_name eq EMPTY) {
$remote_name = find_upstream_remote_name();
if ($remote_name) {
$remote_url = mediawiki_remote_url_maybe($remote_name);
}
if (! $remote_url) {
my @valid_remotes = find_mediawiki_remotes();
if (!@valid_remotes) {
print {*STDERR} "No mediawiki remote in this repo.\n";
exit 1;
} else {
my $remotes_list = join("\n\t", @valid_remotes);
print {*STDERR} <<"MESSAGE";
There are multiple mediawiki remotes, which of:
${remotes_list}
do you want? Use the -r option to specify the remote.
MESSAGE
}
exit 1;
}
} else {
if (!is_valid_remote($remote_name)) {
die "${remote_name} is not a remote\n";
}
$remote_url = mediawiki_remote_url_maybe($remote_name);
if (! $remote_url) {
die "${remote_name} is not a mediawiki remote\n";
}
}
v_print("selected remote:\n\tname: ${remote_name}\n\turl: ${remote_url}\n");
$wiki = connect_maybe($wiki, $remote_name, $remote_url);
# Read file content
if (! -e $file_name) {
$file_content = git_cmd_try {
Git::command('cat-file', 'blob', $file_name); }
"%s failed w/ code %d";
if ($file_name =~ /(.+):(.+)/) {
$file_name = $2;
}
} else {
open my $read_fh, "<", $file_name
or die "could not open ${file_name}: $!\n";
$file_content = do { local $/ = undef; <$read_fh> };
close $read_fh
or die "unable to close: $!\n";
}
v_print("### Retrieving template\n");
($wiki_page_name = clean_filename($file_name)) =~ s/\.[^.]+$//;
$template = get_template($remote_url, $wiki_page_name);
v_print("### Parsing local content\n");
$new_content = $wiki->api({
action => 'parse',
text => $file_content,
title => $wiki_page_name
}, {
skip_encoding => 1
}) or die "No response from remote mediawiki\n";
$new_content = $new_content->{'parse'}->{'text'}->{'*'};
v_print("### Merging contents\n");
if ($preview_file_name eq EMPTY) {
($preview_file_name = $file_name) =~ s/\.[^.]+$/.html/;
}
open(my $save_fh, '>:encoding(UTF-8)', $preview_file_name)
or die "Could not open: $!\n";
print {$save_fh} merge_contents($template, $new_content, $remote_url);
close($save_fh)
or die "Could not close: $!\n";
v_print("### Results\n");
if ($autoload) {
v_print("Launching browser w/ file: ${preview_file_name}");
system('git', 'web--browse', $preview_file_name);
} else {
print {*STDERR} "Preview file saved as: ${preview_file_name}\n";
}
exit;
}
# uses global scope variable: $remote_name
sub merge_contents {
my $template = shift;
my $content = shift;
my $remote_url = shift;
my ($content_tree, $html_tree, $mw_content_text);
my $template_content_id = 'bodyContent';
$html_tree = HTML::TreeBuilder->new;
$html_tree->parse($template);
$content_tree = HTML::TreeBuilder->new;
$content_tree->parse($content);
$template_content_id = Git::config("remote.${remote_name}.mwIDcontent")
|| $template_content_id;
v_print("Using '${template_content_id}' as the content ID\n");
$mw_content_text = $html_tree->look_down('id', $template_content_id);
if (!defined $mw_content_text) {
print {*STDERR} <<"CONFIG";
Could not combine the new content with the template. You might want to
configure `mediawiki.IDContent` in your config:
git config --add remote.${remote_name}.mwIDcontent <id>
and re-run the command afterward.
CONFIG
exit 1;
}
$mw_content_text->delete_content();
$mw_content_text->push_content($content_tree);
make_links_absolute($html_tree, $remote_url);
return $html_tree->as_HTML;
}
sub make_links_absolute {
my $html_tree = shift;
my $remote_url = shift;
for (@{ $html_tree->extract_links() }) {
my ($link, $element, $attr) = @{ $_ };
my $url = url($link)->canonical;
if ($url !~ /#/) {
$element->attr($attr, URI->new_abs($url, $remote_url));
}
}
return $html_tree;
}
sub is_valid_remote {
my $remote = shift;
my @remotes = git_cmd_try {
Git::command('remote') }
"%s failed w/ code %d";
my $found_remote = 0;
foreach my $candidate (@remotes) {
if ($candidate eq $remote) {
$found_remote = 1;
last;
}
}
return $found_remote;
}
sub find_mediawiki_remotes {
my @remotes = git_cmd_try {
Git::command('remote'); }
"%s failed w/ code %d";
my $remote_url;
my @valid_remotes = ();
foreach my $remote (@remotes) {
$remote_url = mediawiki_remote_url_maybe($remote);
if ($remote_url) {
push(@valid_remotes, $remote);
}
}
return @valid_remotes;
}
sub find_upstream_remote_name {
my $current_branch = git_cmd_try {
Git::command_oneline('symbolic-ref', '--short', 'HEAD') }
"%s failed w/ code %d";
return Git::config("branch.${current_branch}.remote");
}
sub mediawiki_remote_url_maybe {
my $remote = shift;
# Find remote url
my $remote_url = Git::config("remote.${remote}.url");
if ($remote_url =~ s/mediawiki::(.*)/$1/) {
return url($remote_url)->canonical;
}
return;
}
sub get_template {
my $url = shift;
my $page_name = shift;
my ($req, $res, $code, $url_after);
$req = LWP::UserAgent->new;
if ($verbose) {
$req->show_progress(1);
}
$res = $req->get("${url}/index.php?title=${page_name}");
if (!$res->is_success) {
$code = $res->code;
$url_after = $res->request()->uri(); # resolve all redirections
if ($code == HTTP_CODE_PAGE_NOT_FOUND) {
if ($verbose) {
print {*STDERR} <<"WARNING";
Warning: Failed to retrieve '$page_name'. Create it on the mediawiki if you want
all the links to work properly.
Trying to use the mediawiki homepage as a fallback template ...
WARNING
}
# LWP automatically redirects GET request
$res = $req->get("${url}/index.php");
if (!$res->is_success) {
$url_after = $res->request()->uri(); # resolve all redirections
die "Failed to get homepage @ ${url_after} w/ code ${code}\n";
}
} else {
die "Failed to get '${page_name}' @ ${url_after} w/ code ${code}\n";
}
}
return $res->decoded_content;
}
############################## Help Functions ##################################
sub help {
print {*STDOUT} <<'END';
usage: git mw <command> <args>
git mw commands are:
help Display help information about git mw
preview Parse and render local file into HTML
END
exit;
}

File diff suppressed because it is too large

View File

@@ -1,7 +0,0 @@
Git-Mediawiki is a project which aims to create a gateway between
git and mediawiki, allowing git users to push and pull objects
to and from mediawiki just as one would do with a classic git
repository, thanks to remote-helpers.
For more information, visit the wiki at
https://github.com/Git-Mediawiki/Git-Mediawiki
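For example, a wiki can be cloned through the remote helper with
(the URL is a placeholder):

	git clone mediawiki::http://example.com/wiki/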

View File

@@ -1,4 +0,0 @@
WEB/
mediawiki/
trash directory.t*/
test-results/

View File

@@ -1,32 +0,0 @@
#
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
#
## Test git-remote-mediawiki
# The default target of this Makefile is...
all:: test
-include ../../../config.mak.autogen
-include ../../../config.mak
T = $(wildcard t[0-9][0-9][0-9][0-9]-*.sh)
.PHONY: help test clean all
help:
@echo 'Run "$(MAKE) test" to launch test scripts'
@echo 'Run "$(MAKE) clean" to remove trash folders'
test:
@for t in $(T); do \
echo "$$t"; \
"./$$t" || exit 1; \
done
clean:
$(RM) -r 'trash directory'.*

View File

@@ -1,124 +0,0 @@
Tests for Mediawiki-to-Git
==========================
Introduction
------------
This manual describes how to install the git-remote-mediawiki test
environment on a machine with git installed on it.
Prerequisite
------------
In order to run this test environment correctly, you will need to
install the following packages (Debian/Ubuntu names, may need to be
adapted for another distribution):
* lighttpd
* php
* php-cgi
* php-cli
* php-curl
* php-sqlite
Principles and Technical Choices
--------------------------------
The test environment makes it easy to install and manipulate one or
several MediaWiki instances. To allow developers to run the testsuite
easily, the environment does not require root privilege (except to
install the required packages if needed). It starts a webserver
instance on the user's account (using lighttpd greatly helps for
that), and does not need a separate database daemon (thanks to the use
of sqlite).
Run the test environment
------------------------
Install a new wiki
~~~~~~~~~~~~~~~~~~
Once you have all the prerequisites, you need to install a MediaWiki
instance on your machine. If you already have a wiki, it is still
strongly recommended to install a fresh one with the provided script.
Here's how to use it:
a. change directory to contrib/mw-to-git/t/
b. if needed, edit test.config to choose your installation parameters
c. run `./install-wiki.sh install`
d. check in your favourite web browser that your wiki is correctly
installed.
Remove an existing wiki
~~~~~~~~~~~~~~~~~~~~~~~
Edit the file test.config to fit the wiki you want to delete, and then
execute the command `./install-wiki.sh delete` from the
contrib/mw-to-git/t directory.
Run the existing tests
~~~~~~~~~~~~~~~~~~~~~~
The provided tests are currently in the `contrib/mw-to-git/t` directory.
The files are all the t936[0-9]-*.sh shell scripts.
a. Run all tests:
To do so, run "make test" from the contrib/mw-to-git/ directory.
b. Run a specific test:
To run a given test <test_name>, run ./<test_name> from the
contrib/mw-to-git/t directory.
How to create new tests
-----------------------
Available functions
~~~~~~~~~~~~~~~~~~~
The test environment of git-remote-mediawiki provides some functions
useful to test its behaviour. For more details about the functions'
parameters, please refer to the `test-gitmw-lib.sh` and
`test-gitmw.pl` files.
** `test_check_precond`:
Check if the tests must be skipped or not. Please use this function
at the beginning of each new test file.
** `wiki_getpage`:
Fetch a given page from the wiki and put its content into the
directory given as a parameter.
** `wiki_delete_page`:
Delete a given page from the wiki.
** `wiki_edit_page`:
Create or modify a given page in the wiki. You can specify several
parameters, such as a summary for the edit, or add the page to a
given category.
See test-gitmw.pl for more details.
** `wiki_getallpage`:
Fetch all pages from the wiki into a given directory. The directory
is created if it does not exist.
** `test_diff_directories`:
Compare the content of two directories. The content must be the same.
Use this function to compare the content of a git directory and a wiki
one created by wiki_getallpage.
** `test_contains_N_files`:
Check if the given directory contains a given number of files.
** `wiki_page_exist`:
Test if a given page exists on the wiki.
** `wiki_reset`:
Reset the wiki, i.e. flush the database. Use this function at the
beginning of each new test, except if the test re-uses the same wiki
(and history) as the previous test.
How to write a new test
~~~~~~~~~~~~~~~~~~~~~~~
Please follow the standards given by git; see git/t/README.
New files should be named t936[0-9]-*.sh.
Be sure to reset your wiki regularly with the function `wiki_reset`.
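Putting the helper functions above together, a new test file might look like the following skeleton. This is a hypothetical example: the test number t9369, the page name "foo", and the messages are made up; the helper functions are the ones documented above.

```shell
# Write a hypothetical skeleton for a new git-remote-mediawiki test
# file; t9369 and the page "foo" are illustrative only.
cat >t9369-mw-example.sh <<'EOF'
#!/bin/sh
test_description='example: git clone sees a freshly edited page'
. ./test-gitmw-lib.sh
. $TEST_DIRECTORY/test-lib.sh

test_check_precond

test_expect_success 'clone after an edit' '
	wiki_reset &&
	wiki_editpage foo "hello world" false &&
	git clone mediawiki::"$WIKI_URL" mw_dir &&
	test_path_is_file mw_dir/Foo.mw
'

test_done
EOF
```

The skeleton follows the t/README conventions: a `test_description`, the precondition check at the top, `wiki_reset` at the start of the test, and `test_done` at the end.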


@@ -1,55 +0,0 @@
#!/bin/sh
# This script installs or deletes a MediaWiki on your computer.
# It requires a web server with PHP and SQLite running. In addition, if you
# do not have MediaWiki sources on your computer, the option 'install'
# downloads them for you.
# Please set the CONFIGURATION VARIABLES in ./test-gitmw-lib.sh
WIKI_TEST_DIR=$(cd "$(dirname "$0")" && pwd)
if test -z "$WIKI_TEST_DIR"
then
WIKI_TEST_DIR=.
fi
. "$WIKI_TEST_DIR"/test-gitmw-lib.sh
usage () {
echo "usage: "
echo " ./install-wiki.sh <install | delete | --help>"
echo " install | -i : Install a wiki on your computer."
echo " delete | -d : Delete the wiki and all its pages and "
echo " content."
echo " start | -s : Start the previously configured lighttpd daemon"
echo " stop : Stop lighttpd daemon."
}
# Argument: install, delete, --help | -h
case "$1" in
"install" | "-i")
wiki_install
exit 0
;;
"delete" | "-d")
wiki_delete
exit 0
;;
"start" | "-s")
start_lighttpd
exit
;;
"stop")
stop_lighttpd
exit
;;
"--help" | "-h")
usage
exit 0
;;
*)
echo "Invalid argument: $1"
usage
exit 1
;;
esac


@@ -1,144 +0,0 @@
test_push_pull () {
test_expect_success 'Git pull works after adding a new wiki page' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_1 &&
wiki_editpage Foo "page created after the git clone" false &&
(
cd mw_dir_1 &&
git pull
) &&
wiki_getallpage ref_page_1 &&
test_diff_directories mw_dir_1 ref_page_1
'
test_expect_success 'Git pull works after editing a wiki page' '
wiki_reset &&
wiki_editpage Foo "page created before the git clone" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_2 &&
wiki_editpage Foo "new line added on the wiki" true &&
(
cd mw_dir_2 &&
git pull
) &&
wiki_getallpage ref_page_2 &&
test_diff_directories mw_dir_2 ref_page_2
'
test_expect_success 'git pull works on conflict handled by auto-merge' '
wiki_reset &&
wiki_editpage Foo "1 init
3
5
" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_3 &&
wiki_editpage Foo "1 init
2 content added on wiki after clone
3
5
" false &&
(
cd mw_dir_3 &&
echo "1 init
3
4 content added on git after clone
5
" >Foo.mw &&
git commit -am "conflicting change on foo" &&
git pull &&
git push
)
'
test_expect_success 'Git push works after adding a file .mw' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_4 &&
wiki_getallpage ref_page_4 &&
(
cd mw_dir_4 &&
test_path_is_missing Foo.mw &&
touch Foo.mw &&
echo "hello world" >>Foo.mw &&
git add Foo.mw &&
git commit -m "Foo" &&
git push
) &&
wiki_getallpage ref_page_4 &&
test_diff_directories mw_dir_4 ref_page_4
'
test_expect_success 'Git push works after editing a file .mw' '
wiki_reset &&
wiki_editpage "Foo" "page created before the git clone" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_5 &&
(
cd mw_dir_5 &&
echo "new line added in the file Foo.mw" >>Foo.mw &&
git commit -am "edit file Foo.mw" &&
git push
) &&
wiki_getallpage ref_page_5 &&
test_diff_directories mw_dir_5 ref_page_5
'
test_expect_failure 'Git push works after deleting a file' '
wiki_reset &&
wiki_editpage Foo "wiki page added before git clone" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_6 &&
(
cd mw_dir_6 &&
git rm Foo.mw &&
git commit -am "page Foo.mw deleted" &&
git push
) &&
test_must_fail wiki_page_exist Foo
'
test_expect_success 'Merge conflict expected and solving it' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_7 &&
wiki_editpage Foo "1 conflict
3 wiki
4" false &&
(
cd mw_dir_7 &&
echo "1 conflict
2 git
4" >Foo.mw &&
git add Foo.mw &&
git commit -m "conflict created" &&
test_must_fail git pull &&
"$PERL_PATH" -pi -e "s/[<=>].*//g" Foo.mw &&
git commit -am "merge conflict solved" &&
git push
)
'
test_expect_failure 'git pull works after deleting a wiki page' '
wiki_reset &&
wiki_editpage Foo "wiki page added before the git clone" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_8 &&
wiki_delete_page Foo &&
(
cd mw_dir_8 &&
git pull &&
test_path_is_missing Foo.mw
)
'
}
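The 'Merge conflict expected and solving it' case above resolves the conflict by blanking every line from the first conflict-marker character onward, via `perl -pi -e "s/[<=>].*//g"`. A rough Python sketch of that (deliberately crude) filter, for illustration only:

```python
import re

def strip_conflict_markers(text):
    # Blank out everything from the first conflict-marker character
    # (<, = or >) to the end of each line -- a rough Python equivalent
    # of the perl one-liner used in the test above. It is crude by
    # design: content lines containing those characters get truncated
    # too, which is acceptable for the test's simple page bodies.
    return "".join(re.sub(r"[<=>].*", "", line)
                   for line in text.splitlines(keepends=True))
```

Applied to a conflicted `Foo.mw`, this removes the `<<<<<<<`, `=======`, and `>>>>>>>` marker lines while keeping both sides' content lines.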


@@ -1,257 +0,0 @@
#!/bin/sh
#
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
#
# License: GPL v2 or later
test_description='Test the Git Mediawiki remote helper: git clone'
. ./test-gitmw-lib.sh
. $TEST_DIRECTORY/test-lib.sh
test_check_precond
test_expect_success 'Git clone creates the expected git log with one file' '
wiki_reset &&
wiki_editpage foo "this is not important" false -c cat -s "this must be the same" &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_1 &&
(
cd mw_dir_1 &&
git log --format=%s HEAD^..HEAD >log.tmp
) &&
echo "this must be the same" >msg.tmp &&
test_cmp msg.tmp mw_dir_1/log.tmp
'
test_expect_success 'Git clone creates the expected git log with multiple files' '
wiki_reset &&
wiki_editpage daddy "this is not important" false -s="this must be the same" &&
wiki_editpage daddy "neither is this" true -s="this must also be the same" &&
wiki_editpage daddy "neither is this" true -s="same same same" &&
wiki_editpage dj "dont care" false -s="identical" &&
wiki_editpage dj "dont care either" true -s="identical too" &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_2 &&
(
cd mw_dir_2 &&
git log --format=%s Daddy.mw >logDaddy.tmp &&
git log --format=%s Dj.mw >logDj.tmp
) &&
echo "same same same" >msgDaddy.tmp &&
echo "this must also be the same" >>msgDaddy.tmp &&
echo "this must be the same" >>msgDaddy.tmp &&
echo "identical too" >msgDj.tmp &&
echo "identical" >>msgDj.tmp &&
test_cmp msgDaddy.tmp mw_dir_2/logDaddy.tmp &&
test_cmp msgDj.tmp mw_dir_2/logDj.tmp
'
test_expect_success 'Git clone creates only Main_Page.mw with an empty wiki' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_3 &&
test_contains_N_files mw_dir_3 1 &&
test_path_is_file mw_dir_3/Main_Page.mw
'
test_expect_success 'Git clone does not fetch a deleted page' '
wiki_reset &&
wiki_editpage foo "this page must be deleted before the clone" false &&
wiki_delete_page foo &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_4 &&
test_contains_N_files mw_dir_4 1 &&
test_path_is_file mw_dir_4/Main_Page.mw &&
test_path_is_missing mw_dir_4/Foo.mw
'
test_expect_success 'Git clone works with page added' '
wiki_reset &&
wiki_editpage foo " I will be cloned" false &&
wiki_editpage bar "I will be cloned" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_5 &&
wiki_getallpage ref_page_5 &&
test_diff_directories mw_dir_5 ref_page_5 &&
wiki_delete_page foo &&
wiki_delete_page bar
'
test_expect_success 'Git clone works with an edited page ' '
wiki_reset &&
wiki_editpage foo "this page will be edited" \
false -s "first edition of page foo" &&
wiki_editpage foo "this page has been edited and must be on the clone " true &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_6 &&
test_path_is_file mw_dir_6/Foo.mw &&
test_path_is_file mw_dir_6/Main_Page.mw &&
wiki_getallpage mw_dir_6/page_ref_6 &&
test_diff_directories mw_dir_6 mw_dir_6/page_ref_6 &&
(
cd mw_dir_6 &&
git log --format=%s HEAD^ Foo.mw > ../Foo.log
) &&
echo "first edition of page foo" > FooExpect.log &&
diff FooExpect.log Foo.log
'
test_expect_success 'Git clone works with several pages and some deleted ' '
wiki_reset &&
wiki_editpage foo "this page will not be deleted" false &&
wiki_editpage bar "I must not be erased" false &&
wiki_editpage namnam "I will not be there at the end" false &&
wiki_editpage nyancat "nyan nyan nyan delete me" false &&
wiki_delete_page namnam &&
wiki_delete_page nyancat &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_7 &&
test_path_is_file mw_dir_7/Foo.mw &&
test_path_is_file mw_dir_7/Bar.mw &&
test_path_is_missing mw_dir_7/Namnam.mw &&
test_path_is_missing mw_dir_7/Nyancat.mw &&
wiki_getallpage mw_dir_7/page_ref_7 &&
test_diff_directories mw_dir_7 mw_dir_7/page_ref_7
'
test_expect_success 'Git clone works with one specific page cloned ' '
wiki_reset &&
wiki_editpage foo "I will not be cloned" false &&
wiki_editpage bar "Do not clone me" false &&
wiki_editpage namnam "I will be cloned :)" false -s="this log must stay" &&
wiki_editpage nyancat "nyan nyan nyan you cant clone me" false &&
git clone -c remote.origin.pages=namnam \
mediawiki::'"$WIKI_URL"' mw_dir_8 &&
test_contains_N_files mw_dir_8 1 &&
test_path_is_file mw_dir_8/Namnam.mw &&
test_path_is_missing mw_dir_8/Main_Page.mw &&
(
cd mw_dir_8 &&
echo "this log must stay" >msg.tmp &&
git log --format=%s >log.tmp &&
test_cmp msg.tmp log.tmp
) &&
wiki_check_content mw_dir_8/Namnam.mw Namnam
'
test_expect_success 'Git clone works with multiple specific page cloned ' '
wiki_reset &&
wiki_editpage foo "I will be there" false &&
wiki_editpage bar "I will not disappear" false &&
wiki_editpage namnam "I be erased" false &&
wiki_editpage nyancat "nyan nyan nyan you will not erase me" false &&
wiki_delete_page namnam &&
git clone -c remote.origin.pages="foo bar nyancat namnam" \
mediawiki::'"$WIKI_URL"' mw_dir_9 &&
test_contains_N_files mw_dir_9 3 &&
test_path_is_missing mw_dir_9/Namnam.mw &&
test_path_is_file mw_dir_9/Foo.mw &&
test_path_is_file mw_dir_9/Nyancat.mw &&
test_path_is_file mw_dir_9/Bar.mw &&
wiki_check_content mw_dir_9/Foo.mw Foo &&
wiki_check_content mw_dir_9/Bar.mw Bar &&
wiki_check_content mw_dir_9/Nyancat.mw Nyancat
'
test_expect_success 'Mediawiki-clone of several specific pages on wiki' '
wiki_reset &&
wiki_editpage foo "foo 1" false &&
wiki_editpage bar "bar 1" false &&
wiki_editpage dummy "dummy 1" false &&
wiki_editpage cloned_1 "cloned_1 1" false &&
wiki_editpage cloned_2 "cloned_2 2" false &&
wiki_editpage cloned_3 "cloned_3 3" false &&
mkdir -p ref_page_10 &&
wiki_getpage cloned_1 ref_page_10 &&
wiki_getpage cloned_2 ref_page_10 &&
wiki_getpage cloned_3 ref_page_10 &&
git clone -c remote.origin.pages="cloned_1 cloned_2 cloned_3" \
mediawiki::'"$WIKI_URL"' mw_dir_10 &&
test_diff_directories mw_dir_10 ref_page_10
'
test_expect_success 'Git clone works with the shallow option' '
wiki_reset &&
wiki_editpage foo "1st revision, should be cloned" false &&
wiki_editpage bar "1st revision, should be cloned" false &&
wiki_editpage nyan "1st revision, should not be cloned" false &&
wiki_editpage nyan "2nd revision, should be cloned" false &&
git -c remote.origin.shallow=true clone \
mediawiki::'"$WIKI_URL"' mw_dir_11 &&
test_contains_N_files mw_dir_11 4 &&
test_path_is_file mw_dir_11/Nyan.mw &&
test_path_is_file mw_dir_11/Foo.mw &&
test_path_is_file mw_dir_11/Bar.mw &&
test_path_is_file mw_dir_11/Main_Page.mw &&
(
cd mw_dir_11 &&
test $(git log --oneline Nyan.mw | wc -l) -eq 1 &&
test $(git log --oneline Foo.mw | wc -l) -eq 1 &&
test $(git log --oneline Bar.mw | wc -l) -eq 1 &&
test $(git log --oneline Main_Page.mw | wc -l ) -eq 1
) &&
wiki_check_content mw_dir_11/Nyan.mw Nyan &&
wiki_check_content mw_dir_11/Foo.mw Foo &&
wiki_check_content mw_dir_11/Bar.mw Bar &&
wiki_check_content mw_dir_11/Main_Page.mw Main_Page
'
test_expect_success 'Git clone works with the shallow option with a deleted page' '
wiki_reset &&
wiki_editpage foo "1st revision, will be deleted" false &&
wiki_editpage bar "1st revision, should be cloned" false &&
wiki_editpage nyan "1st revision, should not be cloned" false &&
wiki_editpage nyan "2nd revision, should be cloned" false &&
wiki_delete_page foo &&
git -c remote.origin.shallow=true clone \
mediawiki::'"$WIKI_URL"' mw_dir_12 &&
test_contains_N_files mw_dir_12 3 &&
test_path_is_file mw_dir_12/Nyan.mw &&
test_path_is_missing mw_dir_12/Foo.mw &&
test_path_is_file mw_dir_12/Bar.mw &&
test_path_is_file mw_dir_12/Main_Page.mw &&
(
cd mw_dir_12 &&
test $(git log --oneline Nyan.mw | wc -l) -eq 1 &&
test $(git log --oneline Bar.mw | wc -l) -eq 1 &&
test $(git log --oneline Main_Page.mw | wc -l ) -eq 1
) &&
wiki_check_content mw_dir_12/Nyan.mw Nyan &&
wiki_check_content mw_dir_12/Bar.mw Bar &&
wiki_check_content mw_dir_12/Main_Page.mw Main_Page
'
test_expect_success 'Test of fetching a category' '
wiki_reset &&
wiki_editpage Foo "I will be cloned" false -c=Category &&
wiki_editpage Bar "Meet me on the repository" false -c=Category &&
wiki_editpage Dummy "I will not come" false &&
wiki_editpage BarWrong "I will stay online only" false -c=NotCategory &&
git clone -c remote.origin.categories="Category" \
mediawiki::'"$WIKI_URL"' mw_dir_13 &&
wiki_getallpage ref_page_13 Category &&
test_diff_directories mw_dir_13 ref_page_13
'
test_expect_success 'Test of resistance to modification of category on wiki for clone' '
wiki_reset &&
wiki_editpage Tobedeleted "this page will be deleted" false -c=Catone &&
wiki_editpage Tobeedited "this page will be modified" false -c=Catone &&
wiki_editpage Normalone "this page wont be modified and will be on git" false -c=Catone &&
wiki_editpage Notconsidered "this page will not appear on local" false &&
wiki_editpage Othercategory "this page will not appear on local" false -c=Cattwo &&
wiki_editpage Tobeedited "this page have been modified" true -c=Catone &&
wiki_delete_page Tobedeleted &&
git clone -c remote.origin.categories="Catone" \
mediawiki::'"$WIKI_URL"' mw_dir_14 &&
wiki_getallpage ref_page_14 Catone &&
test_diff_directories mw_dir_14 ref_page_14
'
test_done


@@ -1,24 +0,0 @@
#!/bin/sh
#
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
#
# License: GPL v2 or later
# tests for git-remote-mediawiki
test_description='Test the Git Mediawiki remote helper: git push and git pull simple test cases'
. ./test-gitmw-lib.sh
. ./push-pull-tests.sh
. $TEST_DIRECTORY/test-lib.sh
test_check_precond
test_push_pull
test_done


@@ -1,347 +0,0 @@
#!/bin/sh
#
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
#
# License: GPL v2 or later
# tests for git-remote-mediawiki
test_description='Test git-mediawiki with special characters in filenames'
. ./test-gitmw-lib.sh
. $TEST_DIRECTORY/test-lib.sh
test_check_precond
test_expect_success 'Git clone works for a wiki with accents in the page names' '
wiki_reset &&
wiki_editpage féé "This page must be délétéd before clone" false &&
wiki_editpage kèè "This page must be deleted before clone" false &&
wiki_editpage hàà "This page must be deleted before clone" false &&
wiki_editpage kîî "This page must be deleted before clone" false &&
wiki_editpage foo "This page must be deleted before clone" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_1 &&
wiki_getallpage ref_page_1 &&
test_diff_directories mw_dir_1 ref_page_1
'
test_expect_success 'Git pull works with a wiki with accents in the pages names' '
wiki_reset &&
wiki_editpage kîî "this page must be cloned" false &&
wiki_editpage foo "this page must be cloned" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_2 &&
wiki_editpage éàîôû "This page must be pulled" false &&
(
cd mw_dir_2 &&
git pull
) &&
wiki_getallpage ref_page_2 &&
test_diff_directories mw_dir_2 ref_page_2
'
test_expect_success 'Cloning a chosen page works with accents' '
wiki_reset &&
wiki_editpage kîî "this page must be cloned" false &&
git clone -c remote.origin.pages=kîî \
mediawiki::'"$WIKI_URL"' mw_dir_3 &&
wiki_check_content mw_dir_3/Kîî.mw Kîî &&
test_path_is_file mw_dir_3/Kîî.mw &&
rm -rf mw_dir_3
'
test_expect_success 'The shallow option works with accents' '
wiki_reset &&
wiki_editpage néoà "1st revision, should not be cloned" false &&
wiki_editpage néoà "2nd revision, should be cloned" false &&
git -c remote.origin.shallow=true clone \
mediawiki::'"$WIKI_URL"' mw_dir_4 &&
test_contains_N_files mw_dir_4 2 &&
test_path_is_file mw_dir_4/Néoà.mw &&
test_path_is_file mw_dir_4/Main_Page.mw &&
(
cd mw_dir_4 &&
test $(git log --oneline Néoà.mw | wc -l) -eq 1 &&
test $(git log --oneline Main_Page.mw | wc -l ) -eq 1
) &&
wiki_check_content mw_dir_4/Néoà.mw Néoà &&
wiki_check_content mw_dir_4/Main_Page.mw Main_Page
'
test_expect_success 'Cloning works when page name first letter has an accent' '
wiki_reset &&
wiki_editpage îî "this page must be cloned" false &&
git clone -c remote.origin.pages=îî \
mediawiki::'"$WIKI_URL"' mw_dir_5 &&
test_path_is_file mw_dir_5/Îî.mw &&
wiki_check_content mw_dir_5/Îî.mw Îî
'
test_expect_success 'Git push works with a wiki with accents' '
wiki_reset &&
wiki_editpage féé "lots of accents : éèàÖ" false &&
wiki_editpage foo "this page must be cloned" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_6 &&
(
cd mw_dir_6 &&
echo "A wild Pîkächû appears on the wiki" >Pîkächû.mw &&
git add Pîkächû.mw &&
git commit -m "A new page appears" &&
git push
) &&
wiki_getallpage ref_page_6 &&
test_diff_directories mw_dir_6 ref_page_6
'
test_expect_success 'Git clone works with accents and spaces' '
wiki_reset &&
wiki_editpage "é à î" "this page must be délété before the clone" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_7 &&
wiki_getallpage ref_page_7 &&
test_diff_directories mw_dir_7 ref_page_7
'
test_expect_success 'character $ in page name (mw -> git)' '
wiki_reset &&
wiki_editpage file_\$_foo "expect to be called file_$_foo" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_8 &&
test_path_is_file mw_dir_8/File_\$_foo.mw &&
wiki_getallpage ref_page_8 &&
test_diff_directories mw_dir_8 ref_page_8
'
test_expect_success 'character $ in file name (git -> mw) ' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_9 &&
(
cd mw_dir_9 &&
echo "this file is called File_\$_foo.mw" >File_\$_foo.mw &&
git add . &&
git commit -am "file File_\$_foo.mw" &&
git pull &&
git push
) &&
wiki_getallpage ref_page_9 &&
test_diff_directories mw_dir_9 ref_page_9
'
test_expect_failure 'capital at the beginning of file names' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_10 &&
(
cd mw_dir_10 &&
echo "my new file foo" >foo.mw &&
echo "my new file Foo... Finger crossed" >Foo.mw &&
git add . &&
git commit -am "file foo.mw" &&
git pull &&
git push
) &&
wiki_getallpage ref_page_10 &&
test_diff_directories mw_dir_10 ref_page_10
'
test_expect_failure 'special character at the beginning of file name from mw to git' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_11 &&
wiki_editpage {char_1 "expect to be renamed {char_1" false &&
wiki_editpage [char_2 "expect to be renamed [char_2" false &&
(
cd mw_dir_11 &&
git pull
) &&
test_path_is_file mw_dir_11/{char_1 &&
test_path_is_file mw_dir_11/[char_2
'
test_expect_success 'Pull page with title containing ":" other than namespace separator' '
wiki_editpage Foo:Bar content false &&
(
cd mw_dir_11 &&
git pull
) &&
test_path_is_file mw_dir_11/Foo:Bar.mw
'
test_expect_success 'Push page with title containing ":" other than namespace separator' '
(
cd mw_dir_11 &&
echo content >NotANameSpace:Page.mw &&
git add NotANameSpace:Page.mw &&
git commit -m "add page with colon" &&
git push
) &&
wiki_page_exist NotANameSpace:Page
'
test_expect_success 'test of correct formatting for file name from mw to git' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_12 &&
wiki_editpage char_%_7b_1 "expect to be renamed char{_1" false &&
wiki_editpage char_%_5b_2 "expect to be renamed char{_2" false &&
(
cd mw_dir_12 &&
git pull
) &&
test_path_is_file mw_dir_12/Char\{_1.mw &&
test_path_is_file mw_dir_12/Char\[_2.mw &&
wiki_getallpage ref_page_12 &&
mv ref_page_12/Char_%_7b_1.mw ref_page_12/Char\{_1.mw &&
mv ref_page_12/Char_%_5b_2.mw ref_page_12/Char\[_2.mw &&
test_diff_directories mw_dir_12 ref_page_12
'
test_expect_failure 'test of correct formatting for file name beginning with special character' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_13 &&
(
cd mw_dir_13 &&
echo "my new file {char_1" >\{char_1.mw &&
echo "my new file [char_2" >\[char_2.mw &&
git add . &&
git commit -am "committing some exotic file name..." &&
git push &&
git pull
) &&
wiki_getallpage ref_page_13 &&
test_path_is_file ref_page_13/{char_1.mw &&
test_path_is_file ref_page_13/[char_2.mw &&
test_diff_directories mw_dir_13 ref_page_13
'
test_expect_success 'test of correct formatting for file name from git to mw' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_14 &&
(
cd mw_dir_14 &&
echo "my new file char{_1" >Char\{_1.mw &&
echo "my new file char[_2" >Char\[_2.mw &&
git add . &&
git commit -m "committing some exotic file name..." &&
git push
) &&
wiki_getallpage ref_page_14 &&
mv mw_dir_14/Char\{_1.mw mw_dir_14/Char_%_7b_1.mw &&
mv mw_dir_14/Char\[_2.mw mw_dir_14/Char_%_5b_2.mw &&
test_diff_directories mw_dir_14 ref_page_14
'
test_expect_success 'git clone with /' '
wiki_reset &&
wiki_editpage \/fo\/o "this is not important" false -c=Deleted &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_15 &&
test_path_is_file mw_dir_15/%2Ffo%2Fo.mw &&
wiki_check_content mw_dir_15/%2Ffo%2Fo.mw \/fo\/o
'
test_expect_success 'git push with /' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_16 &&
echo "I will be on the wiki" >mw_dir_16/%2Ffo%2Fo.mw &&
(
cd mw_dir_16 &&
git add %2Ffo%2Fo.mw &&
git commit -m " %2Ffo%2Fo added" &&
git push
) &&
wiki_page_exist \/fo\/o &&
wiki_check_content mw_dir_16/%2Ffo%2Fo.mw \/fo\/o
'
test_expect_success 'git clone with \' '
wiki_reset &&
wiki_editpage \\ko\\o "this is not important" false -c=Deleted &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_17 &&
test_path_is_file mw_dir_17/\\ko\\o.mw &&
wiki_check_content mw_dir_17/\\ko\\o.mw \\ko\\o
'
test_expect_success 'git push with \' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_18 &&
echo "I will be on the wiki" >mw_dir_18/\\ko\\o.mw &&
(
cd mw_dir_18 &&
git add \\ko\\o.mw &&
git commit -m " \\ko\\o added" &&
git push
) &&
wiki_page_exist \\ko\\o &&
wiki_check_content mw_dir_18/\\ko\\o.mw \\ko\\o
'
test_expect_success 'git clone with \ in format control' '
wiki_reset &&
wiki_editpage \\no\\o "this is not important" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_19 &&
test_path_is_file mw_dir_19/\\no\\o.mw &&
wiki_check_content mw_dir_19/\\no\\o.mw \\no\\o
'
test_expect_success 'git push with \ in format control' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_20 &&
echo "I will be on the wiki" >mw_dir_20/\\fo\\o.mw &&
(
cd mw_dir_20 &&
git add \\fo\\o.mw &&
git commit -m " \\fo\\o added" &&
git push
) &&
wiki_page_exist \\fo\\o &&
wiki_check_content mw_dir_20/\\fo\\o.mw \\fo\\o
'
test_expect_success 'fast-import meta-characters in page name (mw -> git)' '
wiki_reset &&
wiki_editpage \"file\"_\\_foo "expect to be called \"file\"_\\_foo" false &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_21 &&
test_path_is_file mw_dir_21/\"file\"_\\_foo.mw &&
wiki_getallpage ref_page_21 &&
test_diff_directories mw_dir_21 ref_page_21
'
test_expect_success 'fast-import meta-characters in page name (git -> mw) ' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir_22 &&
(
cd mw_dir_22 &&
echo "this file is called \"file\"_\\_foo.mw" >\"file\"_\\_foo &&
git add . &&
git commit -am "file \"file\"_\\_foo" &&
git pull &&
git push
) &&
wiki_getallpage ref_page_22 &&
test_diff_directories mw_dir_22 ref_page_22
'
test_done
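The renaming rules these tests exercise can be summarized in a small sketch (an assumption drawn from the expected filenames above, not the helper's actual code): MediaWiki capitalizes the first letter of a title, the helper appends `.mw`, and `/` is percent-encoded so each page maps to a single path component.

```python
def page_to_filename(page):
    # Hypothetical sketch of the page-name -> file-name mapping that
    # the tests above check; the real helper escapes more characters
    # (fast-import meta-characters, format controls, etc.).
    title = page[:1].upper() + page[1:]   # MediaWiki capitalizes titles
    return title.replace("/", "%2F") + ".mw"
```

For example, page "foo" maps to `Foo.mw`, and "/fo/o" maps to `%2Ffo%2Fo.mw`, matching the paths asserted in the tests.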


@@ -1,218 +0,0 @@
#!/bin/sh
#
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
#
# License: GPL v2 or later
# tests for git-remote-mediawiki
test_description='Test the Git Mediawiki remote helper: git push and git pull simple test cases'
. ./test-gitmw-lib.sh
. $TEST_DIRECTORY/test-lib.sh
test_check_precond
test_git_reimport () {
git -c remote.origin.dumbPush=true push &&
git -c remote.origin.mediaImport=true pull --rebase
}
# Don't bother with permissions, be administrator by default
test_expect_success 'setup config' '
git config --global remote.origin.mwLogin "$WIKI_ADMIN" &&
git config --global remote.origin.mwPassword "$WIKI_PASSW" &&
test_might_fail git config --global --unset remote.origin.mediaImport
'
test_expect_failure 'git push can upload media (File:) files' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
(
cd mw_dir &&
echo "hello world" >Foo.txt &&
git add Foo.txt &&
git commit -m "add a text file" &&
git push &&
"$PERL_PATH" -e "print STDOUT \"binary content: \".chr(255);" >Foo.txt &&
git add Foo.txt &&
git commit -m "add a text file with binary content" &&
git push
)
'
test_expect_failure 'git clone works on previously created wiki with media files' '
test_when_finished "rm -rf mw_dir mw_dir_clone" &&
git clone -c remote.origin.mediaimport=true \
mediawiki::'"$WIKI_URL"' mw_dir_clone &&
test_cmp mw_dir_clone/Foo.txt mw_dir/Foo.txt &&
(cd mw_dir_clone && git checkout HEAD^) &&
(cd mw_dir && git checkout HEAD^) &&
test_path_is_file mw_dir_clone/Foo.txt &&
test_cmp mw_dir_clone/Foo.txt mw_dir/Foo.txt
'
test_expect_success 'git push can upload media (File:) files containing valid UTF-8' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
(
cd mw_dir &&
"$PERL_PATH" -e "print STDOUT \"UTF-8 content: éèàéê€.\";" >Bar.txt &&
git add Bar.txt &&
git commit -m "add a text file with UTF-8 content" &&
git push
)
'
test_expect_success 'git clone works on previously created wiki with media files containing valid UTF-8' '
test_when_finished "rm -rf mw_dir mw_dir_clone" &&
git clone -c remote.origin.mediaimport=true \
mediawiki::'"$WIKI_URL"' mw_dir_clone &&
test_cmp mw_dir_clone/Bar.txt mw_dir/Bar.txt
'
test_expect_success 'git push & pull work with locally renamed media files' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
test_when_finished "rm -fr mw_dir" &&
(
cd mw_dir &&
echo "A File" >Foo.txt &&
git add Foo.txt &&
git commit -m "add a file" &&
git mv Foo.txt Bar.txt &&
git commit -m "Rename a file" &&
test_git_reimport &&
echo "A File" >expect &&
test_cmp expect Bar.txt &&
test_path_is_missing Foo.txt
)
'
test_expect_success 'git push can propagate local page deletion' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
test_when_finished "rm -fr mw_dir" &&
(
cd mw_dir &&
test_path_is_missing Foo.mw &&
echo "hello world" >Foo.mw &&
git add Foo.mw &&
git commit -m "Add the page Foo" &&
git push &&
rm -f Foo.mw &&
git commit -am "Delete the page Foo" &&
test_git_reimport &&
test_path_is_missing Foo.mw
)
'
test_expect_success 'git push can propagate local media file deletion' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
test_when_finished "rm -fr mw_dir" &&
(
cd mw_dir &&
echo "hello world" >Foo.txt &&
git add Foo.txt &&
git commit -m "Add the text file Foo" &&
git rm Foo.txt &&
git commit -m "Delete the file Foo" &&
test_git_reimport &&
test_path_is_missing Foo.txt
)
'
# test failure: the file is correctly uploaded, and then deleted, but
# as no page links to it, the import (which looks at page revisions)
# doesn't notice the file deletion on the wiki. We fetch the list of
# files from the wiki, but as the file is deleted, it doesn't appear.
test_expect_failure 'git pull correctly imports media file deletion when no page link to it' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
test_when_finished "rm -fr mw_dir" &&
(
cd mw_dir &&
echo "hello world" >Foo.txt &&
git add Foo.txt &&
git commit -m "Add the text file Foo" &&
git push &&
git rm Foo.txt &&
git commit -m "Delete the file Foo" &&
test_git_reimport &&
test_path_is_missing Foo.txt
)
'
test_expect_success 'git push properly warns about insufficient permissions' '
wiki_reset &&
git clone mediawiki::'"$WIKI_URL"' mw_dir &&
test_when_finished "rm -fr mw_dir" &&
(
cd mw_dir &&
echo "A File" >foo.forbidden &&
git add foo.forbidden &&
git commit -m "add a file" &&
git push 2>actual &&
test_grep "foo.forbidden is not a permitted file" actual
)
'
test_expect_success 'setup a repository with media files' '
wiki_reset &&
wiki_editpage testpage "I am linking a file [[File:File.txt]]" false &&
echo "File content" >File.txt &&
wiki_upload_file File.txt &&
echo "Another file content" >AnotherFile.txt &&
wiki_upload_file AnotherFile.txt
'
test_expect_success 'git clone works with one specific page cloned and mediaimport=true' '
git clone -c remote.origin.pages=testpage \
-c remote.origin.mediaimport=true \
mediawiki::'"$WIKI_URL"' mw_dir_15 &&
test_when_finished "rm -rf mw_dir_15" &&
test_contains_N_files mw_dir_15 3 &&
test_path_is_file mw_dir_15/Testpage.mw &&
test_path_is_file mw_dir_15/File:File.txt.mw &&
test_path_is_file mw_dir_15/File.txt &&
test_path_is_missing mw_dir_15/Main_Page.mw &&
test_path_is_missing mw_dir_15/File:AnotherFile.txt.mw &&
test_path_is_missing mw_dir_15/AnotherFile.txt &&
wiki_check_content mw_dir_15/Testpage.mw Testpage &&
test_cmp mw_dir_15/File.txt File.txt
'
test_expect_success 'git clone works with one specific page cloned and mediaimport=false' '
test_when_finished "rm -rf mw_dir_16" &&
git clone -c remote.origin.pages=testpage \
mediawiki::'"$WIKI_URL"' mw_dir_16 &&
test_contains_N_files mw_dir_16 1 &&
test_path_is_file mw_dir_16/Testpage.mw &&
test_path_is_missing mw_dir_16/File:File.txt.mw &&
test_path_is_missing mw_dir_16/File.txt &&
test_path_is_missing mw_dir_16/Main_Page.mw &&
wiki_check_content mw_dir_16/Testpage.mw Testpage
'
# should behave like mediaimport=false
test_expect_success 'git clone works with one specific page cloned and mediaimport unset' '
test_when_finished "rm -fr mw_dir_17" &&
git clone -c remote.origin.pages=testpage \
mediawiki::'"$WIKI_URL"' mw_dir_17 &&
test_contains_N_files mw_dir_17 1 &&
test_path_is_file mw_dir_17/Testpage.mw &&
test_path_is_missing mw_dir_17/File:File.txt.mw &&
test_path_is_missing mw_dir_17/File.txt &&
test_path_is_missing mw_dir_17/Main_Page.mw &&
wiki_check_content mw_dir_17/Testpage.mw Testpage
'
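The three clone tests above exercise the helper's per-remote configuration through -c flags. Written as a .git/config fragment instead, the equivalent settings would look like this (the URL is a placeholder; the option names come straight from the tests):

```ini
[remote "origin"]
	url = mediawiki::http://example.com/wiki
	pages = testpage
	mediaimport = true
```

Leaving mediaimport out, or setting it to false, makes the clone skip media files, as the last two tests check.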
test_done

View File

@@ -1,17 +0,0 @@
#!/bin/sh
test_description='Test the Git Mediawiki remote helper: git pull by revision'
. ./test-gitmw-lib.sh
. ./push-pull-tests.sh
. $TEST_DIRECTORY/test-lib.sh
test_check_precond
test_expect_success 'configuration' '
git config --global mediawiki.fetchStrategy by_rev
'
test_push_pull
test_done

View File

@@ -1,23 +0,0 @@
#!/bin/sh
test_description='Test the Git Mediawiki remote helper: queries w/ more than 500 results'
. ./test-gitmw-lib.sh
. $TEST_DIRECTORY/test-lib.sh
test_check_precond
test_expect_success 'creating page w/ >500 revisions' '
wiki_reset &&
for i in $(test_seq 501)
do
echo "creating revision $i" &&
wiki_editpage foo "revision $i<br/>" true || return 1
done
'
test_expect_success 'cloning page w/ >500 revisions' '
git clone mediawiki::'"$WIKI_URL"' mw_dir
'
test_done

View File

@@ -1,432 +0,0 @@
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
# License: GPL v2 or later
#
# CONFIGURATION VARIABLES
# You might want to change these ones
#
. ./test.config
WIKI_BASE_URL=http://$SERVER_ADDR:$PORT
WIKI_URL=$WIKI_BASE_URL/$WIKI_DIR_NAME
CURR_DIR=$(pwd)
TEST_OUTPUT_DIRECTORY=$(pwd)
TEST_DIRECTORY="$CURR_DIR"/../../../t
export TEST_OUTPUT_DIRECTORY TEST_DIRECTORY CURR_DIR
if test "$LIGHTTPD" = "false" ; then
PORT=80
else
WIKI_DIR_INST="$CURR_DIR/$WEB_WWW"
fi
wiki_upload_file () {
"$CURR_DIR"/test-gitmw.pl upload_file "$@"
}
wiki_getpage () {
"$CURR_DIR"/test-gitmw.pl get_page "$@"
}
wiki_delete_page () {
"$CURR_DIR"/test-gitmw.pl delete_page "$@"
}
wiki_editpage () {
"$CURR_DIR"/test-gitmw.pl edit_page "$@"
}
die () {
die_with_status 1 "$@"
}
die_with_status () {
status=$1
shift
echo >&2 "$*"
exit "$status"
}
# Check the preconditions to run git-remote-mediawiki's tests
test_check_precond () {
if ! test_have_prereq PERL
then
skip_all='skipping gateway git-mw tests, perl not available'
test_done
fi
GIT_EXEC_PATH=$(cd "$(dirname "$0")" && cd "../.." && pwd)
PATH="$GIT_EXEC_PATH"'/bin-wrapper:'"$PATH"
if ! test -d "$WIKI_DIR_INST/$WIKI_DIR_NAME"
then
skip_all='skipping gateway git-mw tests, no mediawiki found'
test_done
fi
}
# test_diff_directories <dir_git> <dir_wiki>
#
# Compare the contents of directories <dir_git> and <dir_wiki> with diff
# and error out if they do not match. The .git directory is ignored
# in the process.
# Warning: the first argument MUST be the directory containing the git data
test_diff_directories () {
rm -rf "$1_tmp"
mkdir -p "$1_tmp"
cp "$1"/*.mw "$1_tmp"
diff -r -b "$1_tmp" "$2"
}
# $1=<dir>
# $2=<N>
#
# Check that <dir> contains exactly <N> files
test_contains_N_files () {
if test $(ls -- "$1" | wc -l) -ne "$2"; then
echo "directory $1 should contain $2 files"
echo "it contains these files:"
ls "$1"
false
fi
}
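The helper above leans on the `ls | wc -l` idiom to count directory entries. A minimal standalone sketch of that idiom (the scratch directory and file names here are throwaway, created just for the demonstration):

```shell
# Count entries in a directory the same way test_contains_N_files does
dir=$(mktemp -d)
touch "$dir/a.mw" "$dir/b.mw"
count=$(ls -- "$dir" | wc -l)
echo "$count"
rm -rf "$dir"
```

Note that `test "$count" -ne "$2"` tolerates the leading whitespace some wc implementations emit, which is why the helper compares with -ne rather than a string check.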
# wiki_check_content <file_name> <page_name>
#
# Compares the contents of the file <file_name> and the wiki page
# <page_name> and exits with error 1 if they do not match.
wiki_check_content () {
mkdir -p wiki_tmp
wiki_getpage "$2" wiki_tmp
# replace the forbidden "/" character in the file name
page_name=$(printf "%s\n" "$2" | sed -e "s/\//%2F/g")
diff -b "$1" wiki_tmp/"$page_name".mw
if test $? -ne 0
then
rm -rf wiki_tmp
error "ERROR: file $1 does not match wiki page $2"
fi
rm -rf wiki_tmp
}
# wiki_page_exist <page_name>
#
# Check that the page <page_name> exists on the wiki and exit
# with an error if it is absent.
wiki_page_exist () {
mkdir -p wiki_tmp
wiki_getpage "$1" wiki_tmp
page_name=$(printf "%s\n" "$1" | sed "s/\//%2F/g")
if test -f wiki_tmp/"$page_name".mw ; then
rm -rf wiki_tmp
else
rm -rf wiki_tmp
error "test failed: file $1 not found on wiki"
fi
}
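Both helpers above flatten "/" in page names before using them as file names. On its own, that substitution is a one-line sed call (the sample page name is hypothetical):

```shell
# Replace each "/" in a wiki page name with %2F, as the helpers do
page_name=$(printf "%s\n" "Foo/Bar" | sed -e "s/\//%2F/g")
echo "$page_name"
```

The same %2F encoding appears in test-gitmw.pl, so files written by the shell helpers and the Perl helper agree on names.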
# wiki_getallpagename
#
# Fetch the name of each page on the wiki.
wiki_getallpagename () {
"$CURR_DIR"/test-gitmw.pl getallpagename
}
# wiki_getallpagecategory <category>
#
# Fetch the name of each page belonging to <category> on the wiki.
wiki_getallpagecategory () {
"$CURR_DIR"/test-gitmw.pl getallpagename "$@"
}
# wiki_getallpage <dest_dir> [<category>]
#
# Fetch all the pages from the wiki and place them in the directory
# <dest_dir>.
# If <category> is defined, then wiki_getallpage fetches only the pages
# included in <category>.
wiki_getallpage () {
if test -z "$2";
then
wiki_getallpagename
else
wiki_getallpagecategory "$2"
fi
mkdir -p "$1"
while read -r line; do
wiki_getpage "$line" "$1"
done < all.txt
}
# ================= Install part =================
error () {
echo "$@" >&2
exit 1
}
# config_lighttpd
#
# Create the configuration files and the folders necessary to start lighttpd.
# Overwrite any existing file.
config_lighttpd () {
mkdir -p $WEB
mkdir -p $WEB_TMP
mkdir -p $WEB_WWW
cat > $WEB/lighttpd.conf <<EOF
server.document-root = "$CURR_DIR/$WEB_WWW"
server.port = $PORT
server.pid-file = "$CURR_DIR/$WEB_TMP/pid"
server.modules = (
"mod_rewrite",
"mod_redirect",
"mod_access",
"mod_accesslog",
"mod_fastcgi"
)
index-file.names = ("index.php" , "index.html")
mimetype.assign = (
".pdf" => "application/pdf",
".sig" => "application/pgp-signature",
".spl" => "application/futuresplash",
".class" => "application/octet-stream",
".ps" => "application/postscript",
".torrent" => "application/x-bittorrent",
".dvi" => "application/x-dvi",
".gz" => "application/x-gzip",
".pac" => "application/x-ns-proxy-autoconfig",
".swf" => "application/x-shockwave-flash",
".tar.gz" => "application/x-tgz",
".tgz" => "application/x-tgz",
".tar" => "application/x-tar",
".zip" => "application/zip",
".mp3" => "audio/mpeg",
".m3u" => "audio/x-mpegurl",
".wma" => "audio/x-ms-wma",
".wax" => "audio/x-ms-wax",
".ogg" => "application/ogg",
".wav" => "audio/x-wav",
".gif" => "image/gif",
".jpg" => "image/jpeg",
".jpeg" => "image/jpeg",
".png" => "image/png",
".xbm" => "image/x-xbitmap",
".xpm" => "image/x-xpixmap",
".xwd" => "image/x-xwindowdump",
".css" => "text/css",
".html" => "text/html",
".htm" => "text/html",
".js" => "text/javascript",
".asc" => "text/plain",
".c" => "text/plain",
".cpp" => "text/plain",
".log" => "text/plain",
".conf" => "text/plain",
".text" => "text/plain",
".txt" => "text/plain",
".dtd" => "text/xml",
".xml" => "text/xml",
".mpeg" => "video/mpeg",
".mpg" => "video/mpeg",
".mov" => "video/quicktime",
".qt" => "video/quicktime",
".avi" => "video/x-msvideo",
".asf" => "video/x-ms-asf",
".asx" => "video/x-ms-asf",
".wmv" => "video/x-ms-wmv",
".bz2" => "application/x-bzip",
".tbz" => "application/x-bzip-compressed-tar",
".tar.bz2" => "application/x-bzip-compressed-tar",
"" => "text/plain"
)
fastcgi.server = ( ".php" =>
("localhost" =>
( "socket" => "$CURR_DIR/$WEB_TMP/php.socket",
"bin-path" => "$PHP_DIR/php-cgi -c $CURR_DIR/$WEB/php.ini"
)
)
)
EOF
cat > $WEB/php.ini <<EOF
session.save_path ='$CURR_DIR/$WEB_TMP'
EOF
}
# start_lighttpd
#
# Start or restart the lighttpd daemon. The configuration files are
# rewritten on each start.
start_lighttpd () {
if test -f "$WEB_TMP/pid"; then
echo "Instance already running. Restarting..."
stop_lighttpd
fi
config_lighttpd
"$LIGHTTPD_DIR"/lighttpd -f "$WEB"/lighttpd.conf
if test $? -ne 0 ; then
echo "Could not execute http daemon lighttpd"
exit 1
fi
}
# stop_lighttpd
#
# Kill the lighttpd daemon; its files and folders are removed by wiki_delete.
stop_lighttpd () {
test -f "$WEB_TMP/pid" && kill $(cat "$WEB_TMP/pid")
}
wiki_delete_db () {
rm -rf \
"$FILES_FOLDER_DB"/* || error "Couldn't delete $FILES_FOLDER_DB/"
}
wiki_delete_db_backup () {
rm -rf \
"$FILES_FOLDER_POST_INSTALL_DB"/* || error "Couldn't delete $FILES_FOLDER_POST_INSTALL_DB/"
}
# Install MediaWiki using its install.php script. If the database file
# already exists, it will be deleted.
install_mediawiki () {
localsettings="$WIKI_DIR_INST/$WIKI_DIR_NAME/LocalSettings.php"
if test -f "$localsettings"
then
error "We already installed the wiki, since $localsettings exists" \
"perhaps you wanted to run 'delete' first?"
fi
wiki_delete_db
wiki_delete_db_backup
mkdir \
"$FILES_FOLDER_DB/" \
"$FILES_FOLDER_POST_INSTALL_DB/"
install_script="$WIKI_DIR_INST/$WIKI_DIR_NAME/maintenance/install.php"
echo "Installing MediaWiki using $install_script. This may take some time ..."
php "$WIKI_DIR_INST/$WIKI_DIR_NAME/maintenance/install.php" \
--server $WIKI_BASE_URL \
--scriptpath /wiki \
--lang en \
--dbtype sqlite \
--dbpath $PWD/$FILES_FOLDER_DB/ \
--pass "$WIKI_PASSW" \
Git-MediaWiki-Test \
"$WIKI_ADMIN" ||
error "Couldn't run $install_script, see errors above. Try to run ./install-wiki.sh delete first."
cat <<-'EOF' >>$localsettings
# Custom settings added by test-gitmw-lib.sh
#
# Uploading text files is needed for
# t9363-mw-to-git-export-import.sh
$wgEnableUploads = true;
$wgFileExtensions[] = 'txt';
EOF
# Copy the initially generated database file into our backup
# folder
cp -R "$FILES_FOLDER_DB/"* "$FILES_FOLDER_POST_INSTALL_DB/" ||
error "Unable to copy $FILES_FOLDER_DB/* to $FILES_FOLDER_POST_INSTALL_DB/*"
}
# Install a wiki in your web server directory.
wiki_install () {
if test $LIGHTTPD = "true" ; then
start_lighttpd
fi
# In this part, we change directory to $TMP in order to download,
# unpack and copy the files of MediaWiki
(
mkdir -p "$WIKI_DIR_INST/$WIKI_DIR_NAME"
if ! test -d "$WIKI_DIR_INST/$WIKI_DIR_NAME"
then
error "Folder $WIKI_DIR_INST/$WIKI_DIR_NAME doesn't exist.
Please create it and launch the script again."
fi
# Fetch MediaWiki's archive if not already present in the
# download directory
mkdir -p "$FILES_FOLDER_DOWNLOAD"
MW_FILENAME="mediawiki-$MW_VERSION_MAJOR.$MW_VERSION_MINOR.tar.gz"
cd "$FILES_FOLDER_DOWNLOAD"
if ! test -f $MW_FILENAME
then
echo "Downloading $MW_VERSION_MAJOR.$MW_VERSION_MINOR sources ..."
wget "http://download.wikimedia.org/mediawiki/$MW_VERSION_MAJOR/$MW_FILENAME" ||
error "Unable to download "\
"http://download.wikimedia.org/mediawiki/$MW_VERSION_MAJOR/"\
"$MW_FILENAME. "\
"Please fix your connection and launch the script again."
echo "$MW_FILENAME downloaded in $(pwd)/;" \
"you can delete it later if you want."
else
echo "Reusing existing $MW_FILENAME downloaded in $(pwd)/"
fi
archive_abs_path=$(pwd)/$MW_FILENAME
cd "$WIKI_DIR_INST/$WIKI_DIR_NAME/" ||
error "can't cd to $WIKI_DIR_INST/$WIKI_DIR_NAME/"
tar xzf "$archive_abs_path" --strip-components=1 ||
error "Unable to extract MediaWiki's files from $archive_abs_path to "\
"$WIKI_DIR_INST/$WIKI_DIR_NAME"
) || exit 1
echo Extracted in "$WIKI_DIR_INST/$WIKI_DIR_NAME"
install_mediawiki
echo "Your wiki has been installed. You can check it at
$WIKI_URL"
}
# Reset the database of the wiki and the password of the admin
#
# Warning: This function must be called only in a subdirectory of the t/ directory
wiki_reset () {
# Copy initial database of the wiki
if ! test -d "../$FILES_FOLDER_DB"
then
error "No wiki database at ../$FILES_FOLDER_DB, not installed yet?"
fi
if ! test -d "../$FILES_FOLDER_POST_INSTALL_DB"
then
error "No wiki backup database at ../$FILES_FOLDER_POST_INSTALL_DB, failed installation?"
fi
wiki_delete_db
cp -R "../$FILES_FOLDER_POST_INSTALL_DB/"* "../$FILES_FOLDER_DB/" ||
error "Can't copy ../$FILES_FOLDER_POST_INSTALL_DB/* to ../$FILES_FOLDER_DB/*"
echo "Database files in $FILES_FOLDER_DB/ have been reset"
}
# Delete the wiki created in the web server's directory and all its content
# saved in the database.
wiki_delete () {
if test $LIGHTTPD = "true"; then
stop_lighttpd
rm -fr "$WEB"
else
# Delete the wiki's directory.
rm -rf "$WIKI_DIR_INST/$WIKI_DIR_NAME" ||
error "Wiki's directory $WIKI_DIR_INST/" \
"$WIKI_DIR_NAME could not be deleted"
fi
wiki_delete_db
wiki_delete_db_backup
}

View File

@@ -1,223 +0,0 @@
#!/usr/bin/perl -w -s
# Copyright (C) 2012
# Charles Roussel <charles.roussel@ensimag.imag.fr>
# Simon Cathebras <simon.cathebras@ensimag.imag.fr>
# Julien Khayat <julien.khayat@ensimag.imag.fr>
# Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
# Simon Perrat <simon.perrat@ensimag.imag.fr>
# License: GPL v2 or later
# Usage:
# ./test-gitmw.pl <command> [argument]*
# Execute in terminal using the name of the function to call as first
# parameter, and the function's arguments as following parameters
#
# Example:
# ./test-gitmw.pl "get_page" foo .
# will call <wiki_getpage> with arguments <foo> and <.>
#
# Available functions are:
# "get_page"
# "delete_page"
# "edit_page"
# "getallpagename"
use MediaWiki::API;
use Getopt::Long;
use DateTime::Format::ISO8601;
use constant SLASH_REPLACEMENT => "%2F";
#Parsing of the config file
my $configfile = "$ENV{'CURR_DIR'}/test.config";
my %config;
open my $CONFIG, "<", $configfile or die "can't open $configfile: $!";
while (<$CONFIG>)
{
chomp;
s/#.*//;
s/^\s+//;
s/\s+$//;
next unless length;
my ($key, $value) = split (/\s*=\s*/,$_, 2);
$config{$key} = $value;
last if ($key eq 'LIGHTTPD' and $value eq 'false');
last if ($key eq 'PORT');
}
close $CONFIG or die "can't close $configfile: $!";
my $wiki_address = "http://$config{'SERVER_ADDR'}".":"."$config{'PORT'}";
my $wiki_url = "$wiki_address/$config{'WIKI_DIR_NAME'}/api.php";
my $wiki_admin = "$config{'WIKI_ADMIN'}";
my $wiki_admin_pass = "$config{'WIKI_PASSW'}";
my $mw = MediaWiki::API->new;
$mw->{config}->{api_url} = $wiki_url;
# wiki_login <name> <password>
#
# Log in the user <name> with <password> on the wiki referenced by
# the global variable $mw.
sub wiki_login {
$mw->login( { lgname => "$_[0]",lgpassword => "$_[1]" } )
|| die "wiki_login: login failed";
}
# wiki_getpage <wiki_page> <dest_path>
#
# Fetch the page <wiki_page> from the wiki referenced by the global
# variable $mw and copy its content into the directory <dest_path>.
sub wiki_getpage {
my $pagename = $_[0];
my $destdir = $_[1];
my $page = $mw->get_page( { title => $pagename } );
if (!defined($page)) {
die "getpage: wiki does not exist";
}
my $content = $page->{'*'};
if (!defined($content)) {
die "getpage: page does not exist";
}
$pagename=$page->{'title'};
# Replace spaces by underscore in the page name
$pagename =~ s/ /_/g;
$pagename =~ s/\//%2F/g;
open(my $file, ">:encoding(UTF-8)", "$destdir/$pagename.mw")
or die "can't open $destdir/$pagename.mw: $!";
print $file "$content";
close ($file);
}
# wiki_delete_page <page_name>
#
# delete the page with name <page_name> from the wiki referenced
# in the global variable $mw
sub wiki_delete_page {
my $pagename = $_[0];
my $exist=$mw->get_page({title => $pagename});
if (defined($exist->{'*'})){
$mw->edit({ action => 'delete',
title => $pagename})
|| die $mw->{error}->{code} . ": " . $mw->{error}->{details};
} else {
die "no page with such name found: $pagename\n";
}
}
# wiki_editpage <wiki_page> <wiki_content> <wiki_append> [-c=<category>] [-s=<summary>]
#
# Edit a page named <wiki_page> with content <wiki_content> on the wiki
# referenced with the global variable $mw
# If <wiki_append> == true, append <wiki_content> to the current
# content of the page <wiki_page>.
# If <wiki_page> doesn't exist, the page is created with <wiki_content>.
sub wiki_editpage {
my $wiki_page = $_[0];
my $wiki_content = $_[1];
my $wiki_append = $_[2];
my $summary = "";
my ($summ, $cat) = ();
GetOptions('s=s' => \$summ, 'c=s' => \$cat);
my $append = 0;
if (defined($wiki_append) && $wiki_append eq 'true') {
$append=1;
}
my $previous_text ="";
if ($append) {
my $ref = $mw->get_page( { title => $wiki_page } );
$previous_text = $ref->{'*'};
}
my $text = $wiki_content;
if (defined($previous_text)) {
$text="$previous_text$text";
}
# Optionally, add this page to a category.
if (defined($cat)) {
my $category_name="[[Category:$cat]]";
$text="$text\n $category_name";
}
if(defined($summ)){
$summary=$summ;
}
$mw->edit( { action => 'edit', title => $wiki_page, summary => $summary, text => "$text"} );
}
# wiki_getallpagename [<category>]
#
# Fetch all pages of the wiki referenced by the global variable $mw
# and print their names to the file all.txt, one per line.
# If the argument <category> is defined, then this function gets only
# the pages belonging to <category>.
sub wiki_getallpagename {
# fetch the pages of the wiki
if (defined($_[0])) {
my $mw_pages = $mw->list ( { action => 'query',
list => 'categorymembers',
cmtitle => "Category:$_[0]",
cmnamespace => 0,
cmlimit => 500 },
)
|| die $mw->{error}->{code}.": ".$mw->{error}->{details};
open(my $file, ">:encoding(UTF-8)", "all.txt");
foreach my $page (@{$mw_pages}) {
print $file "$page->{title}\n";
}
close ($file);
} else {
my $mw_pages = $mw->list({
action => 'query',
list => 'allpages',
aplimit => 500,
})
|| die $mw->{error}->{code}.": ".$mw->{error}->{details};
open(my $file, ">:encoding(UTF-8)", "all.txt");
foreach my $page (@{$mw_pages}) {
print $file "$page->{title}\n";
}
close ($file);
}
}
sub wiki_upload_file {
my $file_name = $_[0];
my $resultat = $mw->edit ( {
action => 'upload',
filename => $file_name,
comment => 'upload a file',
file => [ $file_name ],
ignorewarnings=>1,
}, {
skip_encoding => 1
} ) || die $mw->{error}->{code} . ' : ' . $mw->{error}->{details};
}
# Main part of this script: parse the command line arguments
# and select which function to execute
my $fct_to_call = shift;
wiki_login($wiki_admin, $wiki_admin_pass);
my %functions_to_call = (
upload_file => \&wiki_upload_file,
get_page => \&wiki_getpage,
delete_page => \&wiki_delete_page,
edit_page => \&wiki_editpage,
getallpagename => \&wiki_getallpagename,
);
die "$0 ERROR: wrong argument" unless exists $functions_to_call{$fct_to_call};
$functions_to_call{$fct_to_call}->(map { utf8::decode($_); $_ } @ARGV);
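The script ends with a small dispatch table keyed on the first command-line argument. The same pattern in POSIX sh (handler names and echoed strings here are hypothetical, just to show the shape) looks like:

```shell
# Dispatch the first CLI word to a handler, as test-gitmw.pl does in Perl
dispatch() {
	cmd=$1
	shift
	case $cmd in
	get_page) echo "get_page: $*" ;;
	edit_page) echo "edit_page: $*" ;;
	*) echo "unknown command: $cmd" >&2; return 1 ;;
	esac
}
out=$(dispatch get_page Main_Page .)
echo "$out"
```

Like the Perl hash lookup, the case statement rejects unknown commands instead of silently ignoring them.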

View File

@@ -1,40 +0,0 @@
# Name of the web server's directory dedicated to the wiki is WIKI_DIR_NAME
WIKI_DIR_NAME=wiki
# Login and password of the wiki's admin
WIKI_ADMIN=WikiAdmin
WIKI_PASSW=AdminPass1
# Address of the web server
SERVER_ADDR=localhost
# If LIGHTTPD is not set to true, the script will use the default
# web server running in WIKI_DIR_INST.
WIKI_DIR_INST=/var/www
# If LIGHTTPD is set to true, the script will use Lighttpd to run
# the wiki.
LIGHTTPD=true
# The variables below are useful only if LIGHTTPD is set to true.
PORT=1234
PHP_DIR=/usr/bin
LIGHTTPD_DIR=/usr/sbin
WEB=WEB
WEB_TMP=$WEB/tmp
WEB_WWW=$WEB/www
# Where our configuration for the wiki is located
FILES_FOLDER=mediawiki
FILES_FOLDER_DOWNLOAD=$FILES_FOLDER/download
FILES_FOLDER_DB=$FILES_FOLDER/db
FILES_FOLDER_POST_INSTALL_DB=$FILES_FOLDER/post-install-db
# The variables below are used by the script to install a wiki.
# You should not modify these unless you are modifying the script itself.
# tested versions: 1.19.X -> 1.21.1 -> 1.34.2
#
# See https://www.mediawiki.org/wiki/Download for what the latest
# version is.
MW_VERSION_MAJOR=1.34
MW_VERSION_MINOR=2

View File

@@ -1,202 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@@ -1,43 +0,0 @@
# Copyright 2012 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# The default target of this Makefile is...
all::
BUILD_LABEL=$(shell cut -d" " -f3 ../../GIT-VERSION-FILE)
TAR_OUT=$(shell go env GOOS)_$(shell go env GOARCH).tar.gz
all:: git-remote-persistent-https git-remote-persistent-https--proxy \
git-remote-persistent-http
git-remote-persistent-https--proxy: git-remote-persistent-https
ln -f -s git-remote-persistent-https git-remote-persistent-https--proxy
git-remote-persistent-http: git-remote-persistent-https
ln -f -s git-remote-persistent-https git-remote-persistent-http
git-remote-persistent-https:
case $$(go version) in \
"go version go"1.[0-5].*) EQ=" " ;; *) EQ="=" ;; esac && \
go build -o git-remote-persistent-https \
-ldflags "-X main._BUILD_EMBED_LABEL$${EQ}$(BUILD_LABEL)"
clean:
rm -f git-remote-persistent-http* *.tar.gz
tar: clean all
@chmod 555 git-remote-persistent-https
@tar -czf $(TAR_OUT) git-remote-persistent-http* README LICENSE
@echo
@echo "Created $(TAR_OUT)"

View File

@@ -1,72 +0,0 @@
git-remote-persistent-https
The git-remote-persistent-https binary speeds up SSL operations
by running a daemon job (git-remote-persistent-https--proxy) that
keeps a connection open to a server.
PRE-BUILT BINARIES
Darwin amd64:
https://commondatastorage.googleapis.com/git-remote-persistent-https/darwin_amd64.tar.gz
Linux amd64:
https://commondatastorage.googleapis.com/git-remote-persistent-https/linux_amd64.tar.gz
INSTALLING
Move all of the git-remote-persistent-http* binaries to a directory
in PATH.
USAGE
HTTPS requests can be delegated to the proxy by using the
"persistent-https" scheme, e.g.
git clone persistent-https://kernel.googlesource.com/pub/scm/git/git
Likewise, .gitconfig can be updated as follows to rewrite https urls
to use persistent-https:
[url "persistent-https"]
insteadof = https
[url "persistent-http"]
insteadof = http
You may also want to allow the use of the persistent-https helper for
submodule URLs (since any https URLs pointing to submodules will be
rewritten, and Git's out-of-the-box defaults forbid submodules from
using unknown remote helpers):
[protocol "persistent-https"]
allow = always
[protocol "persistent-http"]
allow = always
#####################################################################
# BUILDING FROM SOURCE
#####################################################################
LOCATION
The source is available in the contrib/persistent-https directory of
the Git source repository. The Git source repository is available at
git://git.kernel.org/pub/scm/git/git.git/
https://kernel.googlesource.com/pub/scm/git/git
PREREQUISITES
The code is written in Go (http://golang.org/) and the Go compiler is
required. Currently, the compiler must be built and installed from tip
of source, in order to include a fix in the reverse http proxy:
http://code.google.com/p/go/source/detail?r=a615b796570a2cd8591884767a7d67ede74f6648
BUILDING
Run "make" to build the binaries. See the section on
INSTALLING above.


@@ -1,189 +0,0 @@
// Copyright 2012 Google Inc. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package main
import (
"bufio"
"errors"
"fmt"
"net"
"net/url"
"os"
"os/exec"
"strings"
"syscall"
"time"
)
type Client struct {
ProxyBin string
Args []string
insecure bool
}
func (c *Client) Run() error {
if err := c.resolveArgs(); err != nil {
return fmt.Errorf("resolveArgs() got error: %v", err)
}
// Connect to the proxy.
uconn, hconn, addr, err := c.connect()
if err != nil {
return fmt.Errorf("connect() got error: %v", err)
}
// Keep the unix socket connection open for the duration of the request.
defer uconn.Close()
// Keep a connection to the HTTP server open, so no other user can
// bind on the same address so long as the process is running.
defer hconn.Close()
// Start the git-remote-http subprocess.
cargs := []string{"-c", fmt.Sprintf("http.proxy=%v", addr), "remote-http"}
cargs = append(cargs, c.Args...)
cmd := exec.Command("git", cargs...)
for _, v := range os.Environ() {
if !strings.HasPrefix(v, "GIT_PERSISTENT_HTTPS_SECURE=") {
cmd.Env = append(cmd.Env, v)
}
}
// Set the GIT_PERSISTENT_HTTPS_SECURE environment variable when
// the proxy is using a SSL connection. This allows credential helpers
// to identify secure proxy connections, despite being passed an HTTP
// scheme.
if !c.insecure {
cmd.Env = append(cmd.Env, "GIT_PERSISTENT_HTTPS_SECURE=1")
}
cmd.Stdin = os.Stdin
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
if err := cmd.Run(); err != nil {
if eerr, ok := err.(*exec.ExitError); ok {
if stat, ok := eerr.ProcessState.Sys().(syscall.WaitStatus); ok && stat.ExitStatus() != 0 {
os.Exit(stat.ExitStatus())
}
}
return fmt.Errorf("git-remote-http subprocess got error: %v", err)
}
return nil
}
func (c *Client) connect() (uconn net.Conn, hconn net.Conn, addr string, err error) {
uconn, err = DefaultSocket.Dial()
if err != nil {
if e, ok := err.(*net.OpError); ok && (os.IsNotExist(e.Err) || e.Err == syscall.ECONNREFUSED) {
if err = c.startProxy(); err == nil {
uconn, err = DefaultSocket.Dial()
}
}
if err != nil {
return
}
}
if addr, err = c.readAddr(uconn); err != nil {
return
}
// Open a tcp connection to the proxy.
if hconn, err = net.Dial("tcp", addr); err != nil {
return
}
// Verify the address hasn't changed ownership.
var addr2 string
if addr2, err = c.readAddr(uconn); err != nil {
return
} else if addr != addr2 {
err = fmt.Errorf("address changed after connect. got %q, want %q", addr2, addr)
return
}
return
}
func (c *Client) readAddr(conn net.Conn) (string, error) {
conn.SetDeadline(time.Now().Add(5 * time.Second))
data := make([]byte, 100)
n, err := conn.Read(data)
if err != nil {
return "", fmt.Errorf("error reading unix socket: %v", err)
} else if n == 0 {
return "", errors.New("empty data response")
}
conn.Write([]byte{1}) // Ack
var addr string
if addrs := strings.Split(string(data[:n]), "\n"); len(addrs) != 2 {
return "", fmt.Errorf("got %q, wanted 2 addresses", data[:n])
} else if c.insecure {
addr = addrs[1]
} else {
addr = addrs[0]
}
return addr, nil
}
func (c *Client) startProxy() error {
cmd := exec.Command(c.ProxyBin)
cmd.SysProcAttr = &syscall.SysProcAttr{Setpgid: true}
stdout, err := cmd.StdoutPipe()
if err != nil {
return err
}
defer stdout.Close()
if err := cmd.Start(); err != nil {
return err
}
result := make(chan error)
go func() {
bytes, _, err := bufio.NewReader(stdout).ReadLine()
if line := string(bytes); err == nil && line != "OK" {
err = fmt.Errorf("proxy returned %q, want \"OK\"", line)
}
result <- err
}()
select {
case err := <-result:
return err
case <-time.After(5 * time.Second):
return errors.New("timeout waiting for proxy to start")
}
panic("not reachable")
}
func (c *Client) resolveArgs() error {
if nargs := len(c.Args); nargs == 0 {
return errors.New("remote needed")
} else if nargs > 2 {
return fmt.Errorf("want at most 2 args, got %v", c.Args)
}
// Rewrite the url scheme to be http.
idx := len(c.Args) - 1
rawurl := c.Args[idx]
rurl, err := url.Parse(rawurl)
if err != nil {
return fmt.Errorf("invalid remote: %v", err)
}
c.insecure = rurl.Scheme == "persistent-http"
rurl.Scheme = "http"
c.Args[idx] = rurl.String()
if idx != 0 && c.Args[0] == rawurl {
c.Args[0] = c.Args[idx]
}
return nil
}
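(Editorial aside: the scheme rewrite in `resolveArgs` above is the heart of the client — both persistent schemes become plain `http`, because the local proxy is what actually speaks TLS upstream. A rough Python restatement; the function name is mine, not the helper's:)

```python
from urllib.parse import urlsplit, urlunsplit

def rewrite_remote(url):
    """Rewrite a persistent-http(s) URL to plain http, recording in
    `insecure` whether the original scheme was persistent-http.
    Illustrative sketch of the removed Go logic."""
    parts = urlsplit(url)
    insecure = parts.scheme == "persistent-http"
    return urlunsplit(("http",) + tuple(parts)[1:]), insecure
```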


@@ -1,82 +0,0 @@
// Copyright 2012 Google Inc. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
// The git-remote-persistent-https binary speeds up SSL operations by running
// a daemon job that keeps a connection open to a Git server. This ensures the
// git-remote-persistent-https--proxy is running and delegating execution
// to the git-remote-http binary with the http_proxy set to the daemon job.
// A unix socket is used to authenticate the proxy and discover the
// HTTP address. Note, both the client and proxy are included in the same
// binary.
package main
import (
"flag"
"fmt"
"log"
"os"
"strings"
"time"
)
var (
forceProxy = flag.Bool("proxy", false, "Whether to start the binary in proxy mode")
proxyBin = flag.String("proxy_bin", "git-remote-persistent-https--proxy", "Path to the proxy binary")
printLabel = flag.Bool("print_label", false, "Prints the build label for the binary")
// Variable that should be defined through the -X linker flag.
_BUILD_EMBED_LABEL string
)
const (
defaultMaxIdleDuration = 24 * time.Hour
defaultPollUpdateInterval = 15 * time.Minute
)
func main() {
flag.Parse()
if *printLabel {
// Short circuit execution to print the build label
fmt.Println(buildLabel())
return
}
var err error
if *forceProxy || strings.HasSuffix(os.Args[0], "--proxy") {
log.SetPrefix("git-remote-persistent-https--proxy: ")
proxy := &Proxy{
BuildLabel: buildLabel(),
MaxIdleDuration: defaultMaxIdleDuration,
PollUpdateInterval: defaultPollUpdateInterval,
}
err = proxy.Run()
} else {
log.SetPrefix("git-remote-persistent-https: ")
client := &Client{
ProxyBin: *proxyBin,
Args: flag.Args(),
}
err = client.Run()
}
if err != nil {
log.Fatalln(err)
}
}
func buildLabel() string {
if _BUILD_EMBED_LABEL == "" {
log.Println(`unlabeled build; build with "make" to label`)
}
return _BUILD_EMBED_LABEL
}
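(Editorial aside: the same binary serves as both client and proxy; `main` above picks proxy mode when `--proxy` is passed or when the binary is invoked through its `--proxy` symlink name. A minimal sketch of that dispatch, with a hypothetical function name:)

```python
def run_mode(argv0, force_proxy=False):
    """Choose between client and proxy mode the way main() did:
    proxy when --proxy is given or argv[0] ends in "--proxy"."""
    if force_proxy or argv0.endswith("--proxy"):
        return "proxy"
    return "client"
```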


@@ -1,190 +0,0 @@
// Copyright 2012 Google Inc. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package main
import (
"fmt"
"log"
"net"
"net/http"
"net/http/httputil"
"os"
"os/exec"
"os/signal"
"sync"
"syscall"
"time"
)
type Proxy struct {
BuildLabel string
MaxIdleDuration time.Duration
PollUpdateInterval time.Duration
ul net.Listener
httpAddr string
httpsAddr string
}
func (p *Proxy) Run() error {
hl, err := net.Listen("tcp", "127.0.0.1:0")
if err != nil {
return fmt.Errorf("http listen failed: %v", err)
}
defer hl.Close()
hsl, err := net.Listen("tcp", "127.0.0.1:0")
if err != nil {
return fmt.Errorf("https listen failed: %v", err)
}
defer hsl.Close()
p.ul, err = DefaultSocket.Listen()
if err != nil {
c, derr := DefaultSocket.Dial()
if derr == nil {
c.Close()
fmt.Println("OK\nA proxy is already running... exiting")
return nil
} else if e, ok := derr.(*net.OpError); ok && e.Err == syscall.ECONNREFUSED {
// Nothing is listening on the socket, unlink it and try again.
syscall.Unlink(DefaultSocket.Path())
p.ul, err = DefaultSocket.Listen()
}
if err != nil {
return fmt.Errorf("unix listen failed on %v: %v", DefaultSocket.Path(), err)
}
}
defer p.ul.Close()
go p.closeOnSignal()
go p.closeOnUpdate()
p.httpAddr = hl.Addr().String()
p.httpsAddr = hsl.Addr().String()
fmt.Printf("OK\nListening on unix socket=%v http=%v https=%v\n",
p.ul.Addr(), p.httpAddr, p.httpsAddr)
result := make(chan error, 2)
go p.serveUnix(result)
go func() {
result <- http.Serve(hl, &httputil.ReverseProxy{
FlushInterval: 500 * time.Millisecond,
Director: func(r *http.Request) {},
})
}()
go func() {
result <- http.Serve(hsl, &httputil.ReverseProxy{
FlushInterval: 500 * time.Millisecond,
Director: func(r *http.Request) {
r.URL.Scheme = "https"
},
})
}()
return <-result
}
type socketContext struct {
sync.WaitGroup
mutex sync.Mutex
last time.Time
}
func (sc *socketContext) Done() {
sc.mutex.Lock()
defer sc.mutex.Unlock()
sc.last = time.Now()
sc.WaitGroup.Done()
}
func (p *Proxy) serveUnix(result chan<- error) {
sockCtx := &socketContext{}
go p.closeOnIdle(sockCtx)
var err error
for {
var uconn net.Conn
uconn, err = p.ul.Accept()
if err != nil {
err = fmt.Errorf("accept failed: %v", err)
break
}
sockCtx.Add(1)
go p.handleUnixConn(sockCtx, uconn)
}
sockCtx.Wait()
result <- err
}
func (p *Proxy) handleUnixConn(sockCtx *socketContext, uconn net.Conn) {
defer sockCtx.Done()
defer uconn.Close()
data := []byte(fmt.Sprintf("%v\n%v", p.httpsAddr, p.httpAddr))
uconn.SetDeadline(time.Now().Add(5 * time.Second))
for i := 0; i < 2; i++ {
if n, err := uconn.Write(data); err != nil {
log.Printf("error sending http addresses: %+v\n", err)
return
} else if n != len(data) {
log.Printf("sent %d data bytes, wanted %d\n", n, len(data))
return
}
if _, err := uconn.Read([]byte{0, 0, 0, 0}); err != nil {
log.Printf("error waiting for Ack: %+v\n", err)
return
}
}
// Wait without a deadline for the client to finish via EOF
uconn.SetDeadline(time.Time{})
uconn.Read([]byte{0, 0, 0, 0})
}
func (p *Proxy) closeOnIdle(sockCtx *socketContext) {
for d := p.MaxIdleDuration; d > 0; {
time.Sleep(d)
sockCtx.Wait()
sockCtx.mutex.Lock()
if d = sockCtx.last.Add(p.MaxIdleDuration).Sub(time.Now()); d <= 0 {
log.Println("graceful shutdown from idle timeout")
p.ul.Close()
}
sockCtx.mutex.Unlock()
}
}
func (p *Proxy) closeOnUpdate() {
for {
time.Sleep(p.PollUpdateInterval)
if out, err := exec.Command(os.Args[0], "--print_label").Output(); err != nil {
log.Printf("error polling for updated binary: %v\n", err)
} else if s := string(out[:len(out)-1]); p.BuildLabel != s {
log.Printf("graceful shutdown from updated binary: %q --> %q\n", p.BuildLabel, s)
p.ul.Close()
break
}
}
}
func (p *Proxy) closeOnSignal() {
ch := make(chan os.Signal, 10)
signal.Notify(ch, os.Interrupt, os.Kill, os.Signal(syscall.SIGTERM), os.Signal(syscall.SIGHUP))
sig := <-ch
p.ul.Close()
switch sig {
case os.Signal(syscall.SIGHUP):
log.Printf("graceful shutdown from signal: %v\n", sig)
default:
log.Fatalf("exiting from signal: %v\n", sig)
}
}
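(Editorial aside: the unix-socket handshake between `handleUnixConn` above and the client's `readAddr` is a two-line payload — the HTTPS-fronting proxy address first, the plain-HTTP one second — and the client picks one by its `insecure` flag. A sketch of the parsing side, with a name of my choosing:)

```python
def parse_addr_payload(payload, insecure=False):
    """Split the "httpsAddr\nhttpAddr" payload the proxy writes over the
    unix socket and pick the address the client would use. Sketch only."""
    addrs = payload.split("\n")
    if len(addrs) != 2:
        raise ValueError("got %r, wanted 2 addresses" % (payload,))
    return addrs[1] if insecure else addrs[0]
```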


@@ -1,97 +0,0 @@
// Copyright 2012 Google Inc. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package main
import (
"fmt"
"log"
"net"
"os"
"path/filepath"
"syscall"
)
// A Socket is a wrapper around a Unix socket that verifies directory
// permissions.
type Socket struct {
Dir string
}
func defaultDir() string {
sockPath := ".git-credential-cache"
if home := os.Getenv("HOME"); home != "" {
return filepath.Join(home, sockPath)
}
log.Printf("socket: cannot find HOME path. using relative directory %q for socket", sockPath)
return sockPath
}
// DefaultSocket is a Socket in the $HOME/.git-credential-cache directory.
var DefaultSocket = Socket{Dir: defaultDir()}
// Listen announces the local network address of the unix socket. The
// permissions on the socket directory are verified before attempting
// the actual listen.
func (s Socket) Listen() (net.Listener, error) {
network, addr := "unix", s.Path()
if err := s.mkdir(); err != nil {
return nil, &net.OpError{Op: "listen", Net: network, Addr: &net.UnixAddr{Name: addr, Net: network}, Err: err}
}
return net.Listen(network, addr)
}
// Dial connects to the unix socket. The permissions on the socket directory
// are verified before attempting the actual dial.
func (s Socket) Dial() (net.Conn, error) {
network, addr := "unix", s.Path()
if err := s.checkPermissions(); err != nil {
return nil, &net.OpError{Op: "dial", Net: network, Addr: &net.UnixAddr{Name: addr, Net: network}, Err: err}
}
return net.Dial(network, addr)
}
// Path returns the fully specified file name of the unix socket.
func (s Socket) Path() string {
return filepath.Join(s.Dir, "persistent-https-proxy-socket")
}
func (s Socket) mkdir() error {
if err := s.checkPermissions(); err == nil {
return nil
} else if !os.IsNotExist(err) {
return err
}
if err := os.MkdirAll(s.Dir, 0700); err != nil {
return err
}
return s.checkPermissions()
}
func (s Socket) checkPermissions() error {
fi, err := os.Stat(s.Dir)
if err != nil {
return err
}
if !fi.IsDir() {
return fmt.Errorf("socket: got file, want directory for %q", s.Dir)
}
if fi.Mode().Perm() != 0700 {
return fmt.Errorf("socket: got perm %o, want 700 for %q", fi.Mode().Perm(), s.Dir)
}
if st := fi.Sys().(*syscall.Stat_t); int(st.Uid) != os.Getuid() {
return fmt.Errorf("socket: got uid %d, want %d for %q", st.Uid, os.Getuid(), s.Dir)
}
return nil
}
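(Editorial aside: `checkPermissions` above guards the socket directory before listen and dial. The same three checks — directory, mode 0700, owned by the current user — can be sketched in Python; Unix-only, function name mine:)

```python
import os
import stat

def check_socket_dir(path):
    """Verify the socket directory the way socket.go did: it must be a
    directory, mode 0700, owned by us. Raises OSError otherwise."""
    st = os.stat(path)
    if not stat.S_ISDIR(st.st_mode):
        raise OSError("got file, want directory for %r" % path)
    if stat.S_IMODE(st.st_mode) != 0o700:
        raise OSError("got perm %o, want 700 for %r"
                      % (stat.S_IMODE(st.st_mode), path))
    if st.st_uid != os.getuid():
        raise OSError("got uid %d, want %d for %r"
                      % (st.st_uid, os.getuid(), path))
```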


@@ -1,15 +0,0 @@
The remote-helper bridges to access data stored in Mercurial and
Bazaar are maintained outside the git.git tree in the repositories
of their primary author:
https://github.com/felipec/git-remote-hg (for Mercurial)
https://github.com/felipec/git-remote-bzr (for Bazaar)
You can pick a directory on your $PATH and download them from these
repositories, e.g.:
$ wget -O $HOME/bin/git-remote-hg \
https://raw.github.com/felipec/git-remote-hg/master/git-remote-hg
$ wget -O $HOME/bin/git-remote-bzr \
https://raw.github.com/felipec/git-remote-bzr/master/git-remote-bzr
$ chmod +x $HOME/bin/git-remote-hg $HOME/bin/git-remote-bzr


@@ -1,11 +0,0 @@
#!/bin/sh
cat >&2 <<'EOT'
WARNING: git-remote-bzr is now maintained independently.
WARNING: For more information visit https://github.com/felipec/git-remote-bzr
WARNING:
WARNING: You can pick a directory on your $PATH and download it, e.g.:
WARNING: $ wget -O $HOME/bin/git-remote-bzr \
WARNING: https://raw.github.com/felipec/git-remote-bzr/master/git-remote-bzr
WARNING: $ chmod +x $HOME/bin/git-remote-bzr
EOT


@@ -1,11 +0,0 @@
#!/bin/sh
cat >&2 <<'EOT'
WARNING: git-remote-hg is now maintained independently.
WARNING: For more information visit https://github.com/felipec/git-remote-hg
WARNING:
WARNING: You can pick a directory on your $PATH and download it, e.g.:
WARNING: $ wget -O $HOME/bin/git-remote-hg \
WARNING: https://raw.github.com/felipec/git-remote-hg/master/git-remote-hg
WARNING: $ chmod +x $HOME/bin/git-remote-hg
EOT


@@ -1,33 +0,0 @@
#!/bin/sh
# Use this tool to rewrite your .git/remotes/ files into the config.
. git-sh-setup
if [ -d "$GIT_DIR"/remotes ]; then
echo "Rewriting $GIT_DIR/remotes" >&2
error=0
# rewrite into config
{
cd "$GIT_DIR"/remotes
ls | while read f; do
name=$(printf "$f" | tr -c "A-Za-z0-9-" ".")
sed -n \
-e "s/^URL:[ ]*\(.*\)$/remote.$name.url \1 ./p" \
-e "s/^Pull:[ ]*\(.*\)$/remote.$name.fetch \1 ^$ /p" \
-e "s/^Push:[ ]*\(.*\)$/remote.$name.push \1 ^$ /p" \
< "$f"
done
echo done
} | while read key value regex; do
case $key in
done)
if [ $error = 0 ]; then
mv "$GIT_DIR"/remotes "$GIT_DIR"/remotes.old
fi ;;
*)
echo "git config $key "$value" $regex"
git config $key "$value" $regex || error=1 ;;
esac
done
fi
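(Editorial aside: the sed expressions above map the old `.git/remotes/<name>` file keys — `URL:`, `Pull:`, `Push:` — onto `remote.<name>.url`, `.fetch` and `.push` config variables, sanitizing the name the way `tr -c "A-Za-z0-9-" "."` did. A Python sketch of the same mapping, names mine:)

```python
import re

def remotes_to_config(name, content):
    """Translate an old-style .git/remotes file into (config key, value)
    pairs, roughly as the removed script's sed expressions did."""
    safe = re.sub(r"[^A-Za-z0-9-]", ".", name)
    mapping = {"URL": "url", "Pull": "fetch", "Push": "push"}
    out = []
    for line in content.splitlines():
        for key, cfg in mapping.items():
            if line.startswith(key + ":"):
                out.append(("remote.%s.%s" % (safe, cfg),
                            line.split(":", 1)[1].strip()))
    return out
```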


@@ -1,26 +0,0 @@
#!/bin/sh
# This script displays the distribution of longest common hash prefixes.
# This can be used to determine the minimum prefix length to use
# for object names to be unique.
git rev-list --objects --all | sort | perl -lne '
substr($_, 40) = "";
# uncomment next line for a distribution of bits instead of hex chars
# $_ = unpack("B*",pack("H*",$_));
if (defined $p) {
($p ^ $_) =~ /^(\0*)/;
$common = length $1;
if (defined $pcommon) {
$count[$pcommon > $common ? $pcommon : $common]++;
} else {
$count[$common]++; # first item
}
}
$p = $_;
$pcommon = $common;
END {
$count[$common]++; # last item
print "$_: $count[$_]" for 0..$#count;
}
'
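(Editorial aside: the Perl above exploits the fact that after sorting, each hash's longest common prefix is with one of its two neighbours — the string XOR yields a run of NUL bytes whose length is the shared-prefix length in hex digits. The same statistic in Python, without the XOR trick; names mine:)

```python
def prefix_histogram(names):
    """For each object name, the length of the longest prefix it shares
    with any other name (its nearest neighbour in sorted order), counted
    per length. Sketch of the removed one-liner's logic."""
    hs = sorted(names)

    def common(a, b):
        # length of the shared leading run of characters
        n = 0
        for x, y in zip(a, b):
            if x != y:
                break
            n += 1
        return n

    counts = {}
    for i, h in enumerate(hs):
        c = 0
        if i > 0:
            c = max(c, common(hs[i - 1], h))
        if i < len(hs) - 1:
            c = max(c, common(h, hs[i + 1]))
        counts[c] = counts.get(c, 0) + 1
    return counts
```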


@@ -1,70 +0,0 @@
#!/usr/bin/perl
use warnings 'all';
use strict;
use Getopt::Long;
my $match_emails;
my $match_names;
my $order_by = 'count';
Getopt::Long::Configure(qw(bundling));
GetOptions(
'emails|e!' => \$match_emails,
'names|n!' => \$match_names,
'count|c' => sub { $order_by = 'count' },
'time|t' => sub { $order_by = 'stamp' },
) or exit 1;
$match_emails = 1 unless $match_names;
my $email = {};
my $name = {};
open(my $fh, '-|', "git log --format='%at <%aE> %aN'");
while(<$fh>) {
my ($t, $e, $n) = /(\S+) <(\S+)> (.*)/;
mark($email, $e, $n, $t);
mark($name, $n, $e, $t);
}
close($fh);
if ($match_emails) {
foreach my $e (dups($email)) {
foreach my $n (vals($email->{$e})) {
show($n, $e, $email->{$e}->{$n});
}
print "\n";
}
}
if ($match_names) {
foreach my $n (dups($name)) {
foreach my $e (vals($name->{$n})) {
show($n, $e, $name->{$n}->{$e});
}
print "\n";
}
}
exit 0;
sub mark {
my ($h, $k, $v, $t) = @_;
my $e = $h->{$k}->{$v} ||= { count => 0, stamp => 0 };
$e->{count}++;
$e->{stamp} = $t unless $t < $e->{stamp};
}
sub dups {
my $h = shift;
return grep { keys($h->{$_}) > 1 } keys($h);
}
sub vals {
my $h = shift;
return sort {
$h->{$b}->{$order_by} <=> $h->{$a}->{$order_by}
} keys($h);
}
sub show {
my ($n, $e, $h) = @_;
print "$n <$e> ($h->{$order_by})\n";
}
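(Editorial aside: in its default `--emails` mode the script above groups author names per e-mail and prints the e-mails used under more than one name — the raw material for a `.mailmap`. A rough Python equivalent of that grouping, names mine:)

```python
import re

def emails_with_many_names(log_lines):
    """Group author names per e-mail from `git log --format='%at <%aE> %aN'`
    style lines and keep the e-mails appearing under more than one name.
    Sketch of the removed script's duplicate detection."""
    by_email = {}
    for line in log_lines:
        m = re.match(r"(\S+) <(\S+)> (.*)", line)
        if m:
            _stamp, email, name = m.groups()
            by_email.setdefault(email, set()).add(name)
    return {e: names for e, names in by_email.items() if len(names) > 1}
```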


@@ -1,20 +0,0 @@
appp.sh is a script that is supposed to be used together with ExternalEditor
for Mozilla Thunderbird. It will let you include patches inline in e-mails
in an easy way.
Usage:
- Generate the patch with git format-patch.
- Start writing a new e-mail in Thunderbird.
- Press the external editor button (or Ctrl-E) to run appp.sh
- Select the previously generated patch file.
- Finish editing the e-mail.
Any text that is entered into the message editor before appp.sh is called
will be moved to the section between the --- and the diffstat.
All S-O-B:s and Cc:s in the patch will be added to the CC list.
To set it up, just install External Editor and tell it to use appp.sh as the
editor.
Zenity is a required dependency.


@@ -1,55 +0,0 @@
#!/bin/sh
# Copyright 2008 Lukas Sandström <luksan@gmail.com>
#
# AppendPatch - A script to be used together with ExternalEditor
# for Mozilla Thunderbird to properly include patches inline in e-mails.
# ExternalEditor can be downloaded at http://globs.org/articles.php?lng=en&pg=2
CONFFILE=~/.appprc
SEP="-=-=-=-=-=-=-=-=-=# Don't remove this line #=-=-=-=-=-=-=-=-=-"
if [ -e "$CONFFILE" ] ; then
LAST_DIR=$(grep -m 1 "^LAST_DIR=" "${CONFFILE}"|sed -e 's/^LAST_DIR=//')
cd "${LAST_DIR}"
else
cd > /dev/null
fi
PATCH=$(zenity --file-selection)
if [ "$?" != "0" ] ; then
#zenity --error --text "No patchfile given."
exit 1
fi
cd - > /dev/null
SUBJECT=$(sed -n -e '/^Subject: /p' "${PATCH}")
HEADERS=$(sed -e '/^'"${SEP}"'$/,$d' $1)
BODY=$(sed -e "1,/${SEP}/d" $1)
CMT_MSG=$(sed -e '1,/^$/d' -e '/^---$/,$d' "${PATCH}")
DIFF=$(sed -e '1,/^---$/d' "${PATCH}")
CCS=$(printf '%s\n%s\n' "$CMT_MSG" "$HEADERS" | sed -n -e 's/^Cc: \(.*\)$/\1,/gp' \
-e 's/^Signed-off-by: \(.*\)/\1,/gp')
echo "$SUBJECT" > $1
echo "Cc: $CCS" >> $1
echo "$HEADERS" | sed -e '/^Subject: /d' -e '/^Cc: /d' >> $1
echo "$SEP" >> $1
echo "$CMT_MSG" >> $1
echo "---" >> $1
if [ "x${BODY}x" != "xx" ] ; then
echo >> $1
echo "$BODY" >> $1
echo >> $1
fi
echo "$DIFF" >> $1
LAST_DIR=$(dirname "${PATCH}")
grep -v "^LAST_DIR=" "${CONFFILE}" > "${CONFFILE}_"
echo "LAST_DIR=${LAST_DIR}" >> "${CONFFILE}_"
mv "${CONFFILE}_" "${CONFFILE}"
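(Editorial aside: the CC harvesting above — every `Cc:` and `Signed-off-by:` line from the commit message and mail headers becomes a CC recipient — can be restated compactly in Python; the function name is mine:)

```python
import re

def collect_ccs(text):
    """Gather the addresses appp.sh's sed expressions pulled from Cc: and
    Signed-off-by: lines. Illustrative sketch."""
    return [m.group(1)
            for m in re.finditer(r"^(?:Cc|Signed-off-by): (.*)$", text, re.M)]
```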


@@ -1 +0,0 @@
/git-new-workdir eol=lf


@@ -1,105 +0,0 @@
#!/bin/sh
usage () {
echo "usage:" $@
exit 127
}
die () {
echo $@
exit 128
}
failed () {
die "unable to create new workdir '$new_workdir'!"
}
if test $# -lt 2 || test $# -gt 3
then
usage "$0 <repository> <new_workdir> [<branch>]"
fi
orig_git=$1
new_workdir=$2
branch=$3
# want to make sure that what is pointed to has a .git directory ...
git_dir=$(cd "$orig_git" 2>/dev/null &&
git rev-parse --git-dir 2>/dev/null) ||
die "Not a git repository: \"$orig_git\""
case "$git_dir" in
.git)
git_dir="$orig_git/.git"
;;
.)
git_dir=$orig_git
;;
esac
# don't link to a configured bare repository
isbare=$(git --git-dir="$git_dir" config --bool --get core.bare)
if test ztrue = "z$isbare"
then
die "\"$git_dir\" has core.bare set to true," \
" remove from \"$git_dir/config\" to use $0"
fi
# don't link to a workdir
if test -h "$git_dir/config"
then
die "\"$orig_git\" is a working directory only, please specify" \
"a complete repository."
fi
# make sure the links in the workdir have full paths to the original repo
git_dir=$(cd "$git_dir" && pwd) || exit 1
# don't recreate a workdir over an existing directory, unless it's empty
if test -d "$new_workdir"
then
if test $(ls -a1 "$new_workdir/." | wc -l) -ne 2
then
die "destination directory '$new_workdir' is not empty."
fi
cleandir="$new_workdir/.git"
else
cleandir="$new_workdir"
fi
mkdir -p "$new_workdir/.git" || failed
cleandir=$(cd "$cleandir" && pwd) || failed
cleanup () {
rm -rf "$cleandir"
}
siglist="0 1 2 15"
trap cleanup $siglist
# create the links to the original repo. explicitly exclude index, HEAD and
# logs/HEAD from the list since they are purely related to the current working
# directory, and should not be shared.
for x in config refs logs/refs objects info hooks packed-refs remotes rr-cache svn reftable
do
# create a containing directory if needed
case $x in
*/*)
mkdir -p "$new_workdir/.git/${x%/*}"
;;
esac
ln -s "$git_dir/$x" "$new_workdir/.git/$x" || failed
done
# commands below this are run in the context of the new workdir
cd "$new_workdir" || failed
# copy the HEAD from the original repository as a default branch
cp "$git_dir/HEAD" .git/HEAD || failed
# the workdir is set up. if the checkout fails, the user can fix it.
trap - $siglist
# checkout the branch (either the same as HEAD from the original repository,
# or the one that was asked for)
git checkout -f $branch
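(Editorial aside: the core of git-new-workdir is the symlink farm above — repository state is shared, while `HEAD`, `index` and `logs/HEAD` stay private so each workdir can sit on its own branch. A minimal Python sketch of just the linking step, with no checkout and none of the script's safety checks:)

```python
import os

# Entries the removed script linked back to the original repository.
SHARED = ["config", "refs", "logs/refs", "objects", "info", "hooks",
          "packed-refs", "remotes", "rr-cache", "svn", "reftable"]

def link_workdir(git_dir, new_workdir):
    """Build the shared-state symlink farm under new_workdir/.git.
    Illustrative sketch only."""
    dot_git = os.path.join(new_workdir, ".git")
    for entry in SHARED:
        link = os.path.join(dot_git, entry)
        os.makedirs(os.path.dirname(link), exist_ok=True)
        os.symlink(os.path.join(git_dir, entry), link)
```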


@@ -178,7 +178,6 @@ integration_tests = [
't1015-read-index-unmerged.sh',
't1016-compatObjectFormat.sh',
't1020-subdirectory.sh',
't1021-rerere-in-workdir.sh',
't1022-read-tree-partial-clone.sh',
't1050-large.sh',
't1051-large-conversion.sh',


@@ -1,58 +0,0 @@
#!/bin/sh
test_description='rerere run in a workdir'
GIT_TEST_DEFAULT_INITIAL_BRANCH_NAME=main
export GIT_TEST_DEFAULT_INITIAL_BRANCH_NAME
. ./test-lib.sh
test_expect_success SYMLINKS setup '
git config rerere.enabled true &&
>world &&
git add world &&
test_tick &&
git commit -m initial &&
echo hello >world &&
test_tick &&
git commit -a -m hello &&
git checkout -b side HEAD^ &&
echo goodbye >world &&
test_tick &&
git commit -a -m goodbye &&
git checkout main
'
test_expect_success SYMLINKS 'rerere in workdir' '
rm -rf .git/rr-cache &&
"$SHELL_PATH" "$TEST_DIRECTORY/../contrib/workdir/git-new-workdir" . work &&
(
cd work &&
test_must_fail git merge side &&
git rerere status >actual &&
echo world >expect &&
test_cmp expect actual
)
'
# This fails because we don't resolve relative symlink in mkdir_in_gitdir()
# For the purpose of helping contrib/workdir/git-new-workdir users, we do not
# have to support relative symlinks, but it might be nicer to make this work
# with a relative symbolic link someday.
test_expect_failure SYMLINKS 'rerere in workdir (relative)' '
rm -rf .git/rr-cache &&
"$SHELL_PATH" "$TEST_DIRECTORY/../contrib/workdir/git-new-workdir" . krow &&
(
cd krow &&
rm -f .git/rr-cache &&
ln -s ../.git/rr-cache .git/rr-cache &&
test_must_fail git merge side &&
git rerere status >actual &&
echo world >expect &&
test_cmp expect actual
)
'
test_done


@@ -73,25 +73,6 @@ test_expect_success 'ls-files --others handles non-submodule .git' '
test_cmp expected1 output
'
test_expect_success SYMLINKS 'ls-files --others with symlinked submodule' '
git init super &&
git init sub &&
(
cd sub &&
>a &&
git add a &&
git commit -m sub &&
git pack-refs --all
) &&
(
cd super &&
"$SHELL_PATH" "$TEST_DIRECTORY/../contrib/workdir/git-new-workdir" ../sub sub &&
git ls-files --others --exclude-standard >../actual
) &&
echo sub/ >expect &&
test_cmp expect actual
'
test_expect_success 'setup nested pathspec search' '
test_create_repo nested &&
(