+ [2014-12-09T23:29:27Z] KLVTZ I'll attempt this question here only because it involves cloning from GitHub. What is the strategy for shrinking a remote repository that has grown excessively? More specifically, a project with 13,000+ objects that are being pulled. I've tried graphing and doing the recommended suggestions. However, cloning takes almost a minute. If history is not an issue, would starting a new remote repo be appropriate, or using a graft?
+ [2014-12-09T23:52:41Z] VxJasonxV if history is not an issue, you can squash the whole repository into a single commit of the latest sources
+ [2014-12-09T23:53:12Z] VxJasonxV I'm not sure that process is particularly straightforward, though. I guess you could rebase -i to the first commit and find-and-replace all the 'pick' commands with 'squash' commands.
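The squash VxJasonxV describes can be sketched as follows. This is a hedged sketch, not from the log: it assumes a clean working tree and a branch named master, and it adds a non-interactive alternative (an orphan branch) that avoids the editor session entirely.

```shell
# Option 1: interactive rebase from the very first commit, as
# suggested above. In the editor that opens, keep the first 'pick'
# and change every following 'pick' to 'squash'.
git rebase -i --root

# Option 2 (non-interactive alternative): create an orphan branch
# that starts with no history; the index and working tree carry
# over, so a single commit captures the latest sources.
git checkout --orphan squashed
git commit -m "Squash entire history into one commit"
# Rename the orphan branch over the old one ('master' is an
# assumption; substitute your branch name).
git branch -M master
```

Either way the result must then be force-pushed, and everyone else has to re-clone, since the rewritten history shares nothing with the old one.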
+ [2014-12-09T23:53:29Z] Seveas KLVTZ: grafts are a nasty kludge. You could do a shallow clone for some things or indeed create a new repo. Though 13,000+ objects is not a big repo. Are you storing huge files in there?
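A minimal sketch of the shallow clone Seveas mentions, plus a quick check for whether large files are the real culprit. The repository URL and directory name are hypothetical placeholders, not from the log:

```shell
# Shallow clone: fetch only the most recent commit on the default
# branch, which skips most of the history transfer.
# (URL is a hypothetical placeholder.)
git clone --depth 1 https://github.com/user/repo.git repo
cd repo

# Inspect the object database to see how much space the objects
# actually take; huge numbers here suggest large binaries in history.
git count-objects -vH

# If full history is later needed, the shallow clone can be completed:
# git fetch --unshallow
```

A shallow clone only speeds up the client side; to shrink the repository itself, the large objects have to be rewritten out of history.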

message no. 65348

Posted by Seveas in #github at 2014-12-09T23:53:29Z
+ [2014-12-10T05:05:57Z] graphitemaster your dns routing seems to be awfully bad
+ [2014-12-10T05:06:18Z] graphitemaster traceroute from here in Canada at least puts me through almost 200 hops before I get to ae-3-80.edge3.Washington4.Level3.net (4.69.149.146)
+ [2014-12-10T05:06:33Z] graphitemaster making the site absolutely unusably slow
+ [2014-12-10T06:00:48Z] MagePsycho has anyone used readthedocs?