[2014-12-09T23:29:27Z] <KLVTZ> I'll attempt this question here only because it involved cloning from GitHub. What is the strategy for shrinking remote repositories that have grown very large? More specifically, a project with 13,000+ objects that are being pulled. I've tried grafting and following the recommended suggestions, but cloning still takes almost a minute. If history is not an issue, would starting a new remote repo be appropriate, or using a graft?
[2014-12-09T23:52:41Z] <VxJasonxV> if history is not an issue, you can squash the whole repository into a single commit of the latest sources
[2014-12-09T23:53:12Z] <VxJasonxV> I'm not sure that process is particularly straightforward, though. I guess you could rebase -i to the first commit and find-and-replace all the 'pick' commands with 'squash' commands.
[2014-12-09T23:53:29Z] <Seveas> KLVTZ: grafts are a nasty kludge. You could do a shallow clone for some things, or indeed create a new repo. Though 13,000+ objects is not a big repo. Are you storing huge files in there?
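A minimal sketch of the squash VxJasonxV describes, done with an orphan branch instead of an interactive rebase (an orphan commit has no parents, so the whole tree becomes one commit). The throwaway demo repo and the branch name "master" are assumptions; the commented-out push shows how you would rewrite a remote named "origin".

```shell
# Demo setup: a throwaway repo with a few commits (assumption, for illustration).
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com
git config user.name demo
for i in 1 2 3; do echo "$i" > file.txt; git add file.txt; git commit -qm "commit $i"; done

# The actual squash: start an orphan branch (no parent commits),
# commit the current tree once, then make it the new "master".
git checkout -q --orphan squashed
git add -A
git commit -qm "Squash full history into one commit"
git branch -M squashed master

git rev-list --count master    # prints 1: the whole history is now one commit
# git push -f origin master    # would rewrite the remote -- destructive!
```

Seveas's shallow-clone suggestion avoids rewriting anything: `git clone --depth 1 <url>` fetches only the most recent snapshot, which is often enough for consumers who never need history.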
[2014-12-10T05:05:57Z] <graphitemaster> your dns routing seems to be awfully bad
[2014-12-10T05:06:18Z] <graphitemaster> traceroute from here in Canada at least puts me through almost 200 hops before I get to ae-3-80.edge3.Washington4.Level3.net (4.69.149.146)
[2014-12-10T05:06:26Z] <graphitemaster> which takes nearly 480ms
[2014-12-10T05:06:33Z] <graphitemaster> making the site absolutely unusably slow
[2014-12-10T06:00:48Z] <MagePsycho> has anyone used readthedocs?