I am uploading deep learning code to GitHub so my supervisor can take a look at it. The total folder size is 17 GB including the data. I have seen examples like this on GitHub before.
PS C:\Users\Admin\Heart_project\Heart_segmentation> git push origin main --force
Enumerating objects: 908, done.
Counting objects: 100% (908/908), done.
Delta compression using up to 16 threads
Compressing objects: 100% (899/899), done.
error: RPC failed; HTTP 500 curl 22 The requested URL returned error: 500
send-pack: unexpected disconnect while reading sideband packet
Writing objects: 100% (908/908), 3.22 GiB | 2.42 MiB/s, done.
Total 908 (delta 363), reused 0 (delta 0), pack-reused 0
fatal: the remote end hung up unexpectedly
Everything up-to-date
I was expecting my project to be uploaded, but the push keeps failing. I have tried several approaches, including the GitHub Desktop app.
2 Answers
You’re probably running up against repository / file size limits.
The size limits on GitHub at the time of this writing are as follows:
File size limits: pushes are blocked if they contain any file larger than 100 MiB, and files larger than 50 MiB trigger a warning.
Repository size limits: GitHub recommends keeping repositories under 1 GB, and strongly recommends staying under 5 GB.
With Git LFS: individual files can be up to 2 GB on the Free plan (larger on paid plans), and LFS storage and bandwidth quotas apply on top of that.
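If you want to see which files in your history exceed the hard limit, a sketch like the following works (standard git plumbing plus awk, so run it from Git Bash rather than PowerShell; the 100 MiB threshold matches GitHub's per-file limit):

git rev-list --objects --all |
git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
awk '$1 == "blob" && $3 > 100*1024*1024 {print $3/1024/1024 " MB", $4}' |
sort -rn

This lists every blob over 100 MiB, largest first, along with the path it was committed under.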
If your repository contains data files that can be perfectly regenerated from other tracked files in a reasonable amount of time, don't commit them to your repo: gitignore them and regenerate them after cloning or pulling. If they can't be regenerated that way and can't fit within the GitHub limits above, find some other way to share those files with your supervisor. Also compress them for sharing if you haven't already done so and they compress reasonably well (i.e., they are not essentially random data).
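A minimal sketch of that setup, assuming the datasets live in a data/ directory (the directory name is a guess; adjust it to your layout):

# .gitignore — keep the dataset out of version control
data/

# If data/ was already committed, untrack it (the files stay on disk):
git rm -r --cached data/
git commit -m "Stop tracking the dataset"

Note that git rm --cached only affects future commits; the 3.22 GiB already packed into your history would still be pushed. To actually shrink the push you would need to rewrite history (e.g. with git filter-repo) or start from a fresh repository that never contained the data.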
Please refer to the GitHub documentation.
Unfortunately, GitHub LFS isn't going to help you with a 17 GB upload either.
SUGGESTION: try uploading your data to cloud storage (AWS S3, MS OneDrive, Google Drive, etc.) and simply link to it from the Git project itself.
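For example, the README could carry a short pointer like this (the URL and archive name are placeholders, not a real location):

## Data
The ~17 GB dataset is not tracked in this repository.
Download heart_data.tar.gz from https://example.com/heart_data.tar.gz
and extract it into data/ before running the training scripts.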