Dear Open Hub Users,
We’re excited to announce that we will be moving the Open Hub Forum to
https://community.blackduck.com/s/black-duck-open-hub.
Beginning immediately, users can head over, register, get technical help, and discuss issues pertinent to the Open Hub. Registered users can also subscribe to Open Hub announcements there.
On May 1, 2020, we will be freezing https://www.openhub.net/forums and users will not be able to create new discussions. If you have any questions or concerns, please email us at
[email protected]
Hi AnMaster,
I deleted project 439.
Yes, it can take a long time to download the complete Subversion history. In the case of SuperTux, there are over 4500 patch sets to download. I think that the lethargik.org Subversion server is a little bit slow -- some of these patch sets are taking 10 or 15 minutes each to download. This might be because of all the binary media files in this project.
All of the subprojects in /supertux/trunk will be combined together and considered a single project in the Ohloh report. Separate totals will be computed for each language, and the main language
will be whichever language has the most lines, which will either be C++ or C# in this case. We will count the XML, but we don't consider XML or HTML when determining the main language
of a project.
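A toy sketch of that main-language rule, under my reading of the description above (largest line count wins, after excluding XML and HTML); the per-language line totals here are invented for illustration:

```shell
# Toy sketch (not Ohloh's actual code) of the main-language rule:
# the language with the most lines wins, but XML and HTML are
# excluded from consideration first. Line totals are invented.
langs="cpp:41000 csharp:9000 xml:60000 html:2000"

main=$(for entry in $langs; do
         case ${entry%%:*} in xml|html) continue ;; esac
         echo "${entry#*:} ${entry%%:*}"      # "lines language"
       done | sort -rn | head -n 1 | cut -d' ' -f2)

echo "main language: $main"
```

Even though XML has the largest raw total in this made-up data, it is skipped, so C++ is reported as the main language.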
Media files are mostly ignored by our system, and won't be counted at all.
Yes, this job failed earlier today because of a server timeout.
We don't restart from the beginning when a download fails. We are able to resume from where we left off -- which is the same way that we regularly get the latest updates for every project.
It looks like we are starting over because the percentage returns to 0%. Each time we restart the job, it recomputes the number of patch sets left to download and begins a new progress counter. The first time we ran the job, there were 4500 patch sets to download. The second time we started the job, there were 1700 patch sets left to download.
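To illustrate the arithmetic (this is only my sketch of the behavior described, not Ohloh's code): the percentage is computed against whatever total the current run starts with, so a resumed job reports 0% even though most of the history is already on disk.

```shell
# Toy illustration of why the progress bar returns to 0% on restart:
# each run recomputes its total from the patch sets still missing,
# so the counter starts over even though earlier work is kept.
head=4500            # patch sets in the repository
last_fetched=2800    # hypothetical: already downloaded before the restart

remaining=$(( head - last_fetched ))   # total for this run only: 1700
fetched_this_run=1
percent=$(( 100 * fetched_this_run / remaining ))

echo "run 2: $remaining patch sets to go, starting at ${percent}%"
```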
Hi, I don't know whether I'm hitting something similar or not, but this started yesterday (almost 24 hours ago) and it seems to have been stuck at 20% for several hours now. http://www.ohloh.net/projects/4656/enlistments
Any pointers?
spatialguru,
I took a look at this and your download is moving, although very slowly. There are 769 patch sets left to download, and it looks like they are taking about 4 minutes each. You've only got 51 hours left to wait :-).
I'm not sure if the speed problem is on our side or on the Subversion server. Does this repository contain large binary files?
I tried to checkout this repository on my laptop (a MacBook Pro) to see how long it took, and I got an odd error:
svn: Can't convert string from 'UTF-8' to native encoding:
svn: mapguide/MgDev/UnitTest/TestData/ResourceService/Western?\195?\169urop?\195?\168?\195?\160n.sdf
I hope that doesn't mean that this download is going to fail at some point.
Robin
Hi Robin, I guess this is a known issue at our end; it fails for me on a Mac too. Depending on what OS your SVN client runs on, it may fail on at least one file whose name is encoded poorly. It looks like there are some workarounds if you can set certain environment settings, but I assume that isn't an option? Who knows, maybe it's a non-issue for your client. I'll cross my fingers :)
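For what it's worth, the environment-settings workaround usually means forcing a UTF-8 locale before checking out, so the svn client's "native encoding" can represent the accented file names. A sketch, assuming an en_US.UTF-8 locale is installed on the client machine (the checkout command is left as a commented placeholder rather than a real URL):

```shell
# Common workaround for "Can't convert string from 'UTF-8' to native
# encoding": make the svn client's native encoding UTF-8. Assumes an
# en_US.UTF-8 locale is installed on the system.
export LC_ALL=en_US.UTF-8

# Then retry the checkout, e.g.:
#   svn checkout <repository-url> <working-copy-dir>
```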
I also gave a bad URL for the SVN source, which included a bunch of website content (ugh). So I've added a second enlistment that targets only the source code subdirectory and leaves out the website content, which is likely binary.
Perhaps we can kill the first enlistment and see if the 2nd goes faster?