I have a repository full of zip files, and recompressing those files is a waste of time. I have tried setting core.compression = 0 on both the local and the remote copy, without success:

git config core.compression 0
git config core.loosecompression 0

git pull still reports:

remote: Counting objects: 23, done.
remote: Compressing objects: ...
The time problem I was hitting was caused by delta compression. The solution, in my case, was:

echo '*.zip -delta' > .gitattributes
git gc
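As a quick sanity check, git check-attr will confirm the attribute actually matches. A minimal sketch using a throwaway repository; the path archive.zip is just an example name:

```shell
# Create a scratch repo and apply the same attribute rule as above.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo '*.zip -delta' > .gitattributes
# For any matching path, the "delta" attribute should be reported as unset,
# meaning git will not attempt delta compression for it.
git check-attr delta -- archive.zip
```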
I will quote this reply, from a thread about severe performance problems with images, audio files, and other "non-code" data:

Note that the setting must be applied on the repository you are fetching/pulling from, not the repository you are fetching/pulling into. Git does spend a fair bit of time in zlib for some workloads, but it should not create problems on the order of minutes.
For pushing and pulling, you're probably seeing delta compression, which can be slow for large files.
core.compression 0 # Didn't seem to work.
That should disable zlib compression of loose objects and objects within packfiles. It can save a little time for objects which won't compress, but you will lose the size benefits for any text files.
But it won't turn off delta compression, which is what the "compressing..." phase during push and pull is doing. And which is much more likely the cause of slowness.
pack.window 0
It sets the number of other objects git will consider when doing delta compression. Setting it low should improve your push/pull times. But you will lose the substantial benefit of delta-compression of your non-image files (and git's meta objects). So the "-delta" option above for specific files is a much better solution.
echo '*.jpg -delta' >> .gitattributes
Also, consider repacking your repository, which will generate a packfile that will be re-used during push and pull.
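The repacking suggestion above can be sketched as follows (standard git flags; run inside the repository you serve fetches from):

```shell
# Rewrite all objects into a single packfile: -a packs everything reachable,
# -d deletes the now-redundant old packs and loose objects.
# With the -delta attribute rules in place, zip files are stored whole,
# and the finished pack can be reused during later push/pull instead of
# being rebuilt each time.
git repack -a -d
```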
If your .gitattributes file already has other content in it, append instead: echo '*.zip -delta' >> .gitattributes - scottgwald

"Compressing objects" generally refers to the pack operation, which includes comparing file trees and so on; the "compressing" there is not the compression that core.compression controls.
Another workaround is core.bigFileThreshold: setting it to 1 byte makes Git treat every file as "big", which skips delta compression for all of them; unsetting it restores the default:

git config --add core.bigFileThreshold 1
git config --unset core.bigFileThreshold
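If you would rather not change the repository configuration at all, the same threshold can be applied for a single command with -c. A sketch under that assumption:

```shell
# One-off: treat every file larger than 1 byte as "big" for this command only,
# so no delta compression is attempted while repacking. The repository's
# persistent configuration is left untouched.
git -c core.bigFileThreshold=1 repack -a -d
```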