Hi,
A few proposals:
a) move the code from svn to github (separate repository per module).
I checked out the code yesterday, and the checkout took probably half an hour or so... svn was a bad idea when it arrived, and now it just needs to die...
also, I'm in the mood to do some major refactoring of the code... and not on a simple level where I have one tiny change that can be discussed here... so the only reasonable way I think I can do it is to fork the code and then just code ahead... any serious issues with me doing that?
This means changes like a full rewrite from C/Java to C++11/Scala, restructuring of the models, replacement of all libraries, a full refactoring to a DDD-style domain-model-based approach, a transformation to TDD and so on... I can't go into details, since such refactoring is always a matter of continuous high-speed trial and error...
Some suggestions
I speaken gutes English, does I nicht?
While changing revision control systems can be done, in the past it was done via vote/discussion. While SVN may be outdated now (hey, at least we're not on CVS), there are probably as many opinions on which RCS and hosting site should be used as there are choices. And there has to be a fairly compelling reason to do it, since there is some amount of work involved (more for everyone who has a checked-out copy and needs to resync/convert to a new RCS than on the server maintenance side).
Note that with SVN, it is possible to check out just the modules you need, so there is no requirement that you check everything out.
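For instance, a single-module checkout might look like the sketch below. The server/trunk path is an assumption for illustration; list the repository top level first to see the actual layout.

```shell
# Sketch: check out one module instead of the full tree.
# The path "server/trunk" is an assumption -- list the top level first
# to see what modules actually exist.
svn list svn://svn.code.sf.net/p/crossfire/code/
svn checkout svn://svn.code.sf.net/p/crossfire/code/server/trunk crossfire-server
```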
In terms of rewrite, a few notes:
- Such rewrites have been started before, but I'm not aware of any that actually completed (not to say they aren't out there - there are lots of forks of crossfire)
- While you are free to fork the code and do the rewrite, be aware that if you do it without any outside discussion, there is a high likelihood that it will always remain as a fork, which is to say it won't be accepted as the de facto versions of crossfire.
There are no issues with you doing so - numerous forks of crossfire do exist, and with GPL, you are certainly free to have your crossfire c++ version that you maintain and develop.
The one comment is that the amount of assistance you get may be limited. Not just in the C++ development aspect, but if you have questions about the existing C code, developers may be reluctant to spend much time answering questions about code that will never be part of the main trunk.
yeah, you can check out submodules in svn...
anyway, one brilliant thing with Git is that not only is it easy to work with, it is very easy to make changes in one's own local repository and submit a merge request for selected changes to the head branch... and to keep the forked repo up to date...
it makes forking less problematic, since forking is no longer a way of just copying everything and completely detaching it... it's just a way of keeping track of changes in parallel to the official version, with full flow back and forth... similar to the flow between the svn server and checked-out copies in svn...
for me, I would prefer to make one clone that is just for minor bugfixes, and a clone of that which is for more extreme experiments... if during the extreme experiments I find something that should be ported back to the more traditional version, I just push it back to that repo, and if those changes would be nice for the main project, then I could make a merge request for them there...
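That two-clone flow can be demonstrated entirely with local repositories. Everything below is an invented sketch (repository names and the throwaway directory are made up): a stand-in "official" repo, a clone for bugfixes, a clone of that clone for experiments, and a good change flowing back through the chain.

```shell
# Sketch of the two-clone workflow: official -> bugfix -> experimental,
# with a worthwhile experimental change pushed back toward official.
set -e
cd "$(mktemp -d)"

git init --bare official.git              # stands in for the upstream repo
git clone official.git bugfix             # clone for minor bugfixes
(
  cd bugfix
  git config user.email "dev@example.com" # throwaway identity for the demo
  git config user.name "Dev"
  git commit --allow-empty -m "initial"
  git push origin HEAD
)
git clone bugfix experimental             # clone of the clone, for experiments
(
  cd experimental
  git config user.email "dev@example.com"
  git config user.name "Dev"
  echo "wild idea" > idea.txt
  git add idea.txt
  git commit -m "experiment worth keeping"
  git push origin HEAD:experiment         # publish it to the bugfix clone
)
(
  cd bugfix
  git merge --no-edit experiment          # port the change back (fast-forward here)
  git push origin HEAD                    # and onward to "official"
)
```

Pushing to a named branch (`HEAD:experiment`) rather than the checked-out branch is what makes the repo-to-repo flow work without any server in between.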
I for myself had started with git in November and am submitting pull requests to a repository on GitHub. https://github.com/pricing says:
"Why don't I see any disk space limits? Your GitHub account has no size limitations. Repositories perform best when kept under 1GB. For more information on repository sizes, please read our Help article."
Git was a bit difficult to learn at first, due to how easily HEAD can become detached, but now I am quite comfortable with git.
I have also successfully pulled http://git.kernel.org/cgit/linux/kernel ... table.git/
after of course first pulling http://git.kernel.org/cgit/linux/kernel ... linux.git/
At first I used git clone instead of git init; git remote add SOMENAME URL; git pull, and that ended in the download being completely removed when I interrupted the shell after 30 minutes.
I am not sure how git resumes downloads; svn checkout seems to resume quite well.
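As far as I know, git clone cannot resume an interrupted transfer at all. A common workaround on a slow link is a shallow clone, deepened later in smaller steps; the URL below is a placeholder, not a real repository.

```shell
# git clone cannot resume an interrupted transfer, so keep each transfer small.
REPO=https://example.com/project.git   # placeholder -- substitute the real URL
git clone --depth 1 "$REPO" project    # only the latest revision
cd project
git fetch --deepen=100                 # pull in 100 more commits of history
git fetch --unshallow                  # or eventually fetch the full history
```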
That kernel experiment stressed my full-speed download limit of 5GB a lot, and over Christmas I decided to run
svn checkout svn://svn.code.sf.net/p/crossfire/code/ crossfire-code
The connection switched to 7KiB/s analog speed after roundabout 900MB, so the checkout took around 50 hours.
On the HDD it took around 4027 MB, not 7GB as it is written on http://crossfire.real-time.com/svn/index.html -- it seems I am missing the metadata(?), because whenever I run svn log or svn diff it downloads from sourceforge. That is fine for diffs of KB size, but I accidentally did a
for i in 04; do svn diff -r 63$((i-1)):63$i >../weather-r63$((i-1)):63$i.diff; done
which diffs 633:6304 and not 6303:6304 like I wanted... at 7KB/s it seems to take a day; the diff is now at 541MB.
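The loop goes wrong because of how the revision numbers are assembled: the arithmetic strips the leading digits, so `63$((i-1))` expands to 633, not 6303. A minimal demonstration and fix:

```shell
# With i=04, the arithmetic expands 63$((i-1)) to 633, not 6303:
i=04
echo "63$((i-1)):63$i"        # gives 633:6304 -- a diff across ~5700 revisions
# Keeping the shared prefix 630 outside the arithmetic yields the intended pair:
i=4
echo "630$((i-1)):630$i"      # gives 6303:6304
# So the intended loop would have been:
# for i in 4; do svn diff -r 630$((i-1)):630$i > ../weather-r630$((i-1)):630$i.diff; done
```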
If there are people who know how to best use svnadmin dump on the crossfire svn repository to fetch the metadata, please let me know.
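One note on that: svnadmin dump needs direct filesystem access to the repository on the server. For a remote repository, svnrdump (Subversion 1.7+) produces the same dump stream over the network; the revision number 5000 below is only an example starting point.

```shell
# svnadmin dump needs the repository on local disk; for a remote repository,
# svnrdump produces the same dump stream over the network:
svnrdump dump svn://svn.code.sf.net/p/crossfire/code > crossfire.dump
# An interrupted dump can be continued from the last complete revision
# (5000 is an example -- use the revision where the first run stopped):
svnrdump dump -r 5000:HEAD --incremental svn://svn.code.sf.net/p/crossfire/code >> crossfire.dump
```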
Last but not least:
http://john.albin.net/git/convert-subversion-to-git and similar sites have some code such as
svn log -q | awk -F '|' '/^r/ {sub("^ ", "", $2); sub(" $", "", $2); print $2" = "$2" <"$2">"}' | sort -u > authors-transform.txt
which seems to make it urgent to clean up the find -type f -name DEVELOPERS files before converting to git.
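For reference, the generated authors-transform.txt is usually handed to git svn clone during conversion. This is only a sketch: the --stdlayout flag assumes a conventional trunk/branches/tags layout at the repository root, which should be verified first.

```shell
# Sketch: use the generated authors file during conversion.
# --stdlayout assumes trunk/branches/tags at the repository root -- verify first.
git svn clone --stdlayout \
    --authors-file=authors-transform.txt \
    svn://svn.code.sf.net/p/crossfire/code crossfire-git
```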
Another way to transform to git seems to be here:
http://stackoverflow.com/questions/7916 ... itory?rq=1
which I haven't tested due to my bandwidth limit. The answer there says:
However, for a simple migration with all the history, here are the few simple steps:
Initialize the local repo:
mkdir project
cd project
git svn init http://svn.url
Mark how far back you want to start importing revisions:
git svn fetch -r42
(or just "git svn fetch" for all revs)
Actually fetch everything since then:
git svn rebase
You can check the result of the import with Gitk. I'm not sure if this works on Windows, but it works on OS X and Linux:
gitk
When you've got your SVN repo cloned locally, you may want to push it to a centralized Git repo for easier collaboration.
First create your empty remote repo (maybe on GitHub?):
git remote add origin git@github.com:user/project-name.git
Then, optionally sync your main branch so the pull operation will automatically merge the remote master with your local master, when both contain new stuff:
git config branch.master.remote origin
git config branch.master.merge refs/heads/master
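The quoted steps stop before anything is actually published. Assuming the origin remote added above, the usual final step is a sketch like:

```shell
# Publish the converted history to the remote configured above
# ("master" matches the default branch name of a git-svn conversion):
git push -u origin master
# After that, collaborators can work from the new central repo:
git clone git@github.com:user/project-name.git
```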

Wow, pretty thorough write up and summary. Thank you on that.
karl wrote:
On the HDD it took around 4027 MB, not 7GB as it is written on http://crossfire.real-time.com/svn/index.html -- seems I am missing the metadata(?), because whenever I run svn log or svn diff it downloads from sourceforge.
I just did a clean checkout using:
time svn co https://svn.code.sf.net/p/crossfire/code crossfire-code
And the results were:
Checked out revision 19192.
real 31m34.782s
user 4m41.700s
sys 2m6.700s
And as far as disk space:
leaf@cfserver:~/crossfire-code$ du -sch
7.0G .
7.0G total
I'm not sure why the disk space is so much lower for you than for me, given we both used the same checkout command.
"Put another, more succinct way: don't complain, contribute. It's more satisfying in the long run, and it's more constructive."
Eric Meyer
I'm not sure about the differences in size - they could come from several factors - and certainly if you do any development work, unless you separate the built files from the repo, the size can differ.
SVN is not very good here - it keeps a local copy of the latest version in the repository (so you can do an svn diff against the latest checked-in code without going to the network), but if you want to diff against older versions, it has to go to the network.
I can't really explain the performance you see either - while SVN isn't great, it certainly isn't that bad.
While SVN is out of date, on the plus side, things like git or mercurial provide plugins or the like that allow accessing an SVN repository through these other tools, with the features you expect.
However, performance on those is really slow. At least for the hg+svn interface, the only way for hg to get every version like it normally operates is to explicitly check out each one. For typical updates of a few a day, that's not a big problem, but if trying to check out the entire repository, I could see it taking days.
In that case, it's much easier to get a dump of the SVN repo and do the conversion locally rather than over the network. There used to be a way to get that through sourceforge - I don't know if there still is or not, as it's been a while since I last did it.
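That local-conversion route can be sketched as below; it assumes a dump file (here called crossfire.dump) has already been obtained somehow, e.g. from the project admins or an earlier remote dump.

```shell
# Sketch: convert from a local dump instead of over the network.
# Assumes crossfire.dump was obtained out of band.
svnadmin create crossfire-mirror
svnadmin load crossfire-mirror < crossfire.dump
# git svn now crawls each revision over file:// -- fast local I/O,
# no network round trips per revision:
git svn clone file://"$PWD"/crossfire-mirror crossfire-git
```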