Back everything up first to a safe location. If a GitHub repo is deleted it can be restored from Settings (GitHub keeps deleted repos recoverable for a while). These notes are for re-sending a large repo that got unlinked through obscure issues, like an old push to a repo with the same name and no shared history, so you are forced to push from local. If the project is over 2 GB you will run into a new issue: GitHub's per-push limit.
OVER 2 GB AND THE REPO IS GONE + HISTORY, BUT I HAVE THE LOCAL .git. WHAT DO I DO?
BACK UP VIA THE COMMANDS BELOW, THEN FOLLOW THE STEPS FROM A STACKOVERFLOW REC. (in the comments)
I AM DOING THIS FOR MY PROJECT, WHICH ALSO HAS GIT-LFS TRACKING AND IS OVER 140 GB.
Mount an external or internal drive and make sure it is formatted with a filesystem that can hold single files over 4 GB (FAT32 cannot; exFAT, ext4, or NTFS can). I can add a guide for Linux users and link it here later. You can rsync to a remote or local target, OR just tarball right onto the internal/external, as this was fastest for me.
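A quick way to check what filesystem a path sits on before copying anything big (a sketch using GNU coreutils `stat`; the current directory here is a stand-in for your mount point):

```shell
# FAT32 (shown as "msdos"/"vfat") caps single files at 4 GiB; exFAT/ext4/NTFS do not.
# Print the filesystem type of a path:
stat -f -c %T .
```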
tar -czvf /<mount-location>/<mounted-drive>/BACKUPS/misc_backup_080424.tar.gz ./MY-PROJECT/
OR
rsync -av --progress <DIRECTORY-COPYING-WITH-RSYNC>/ /<mount-location>/<mounted-drive>/<new-folder-named-project-name>
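Before deleting anything, verify the tarball reads back cleanly. A self-contained sketch using a throwaway directory in place of MY-PROJECT (all paths here are hypothetical):

```shell
set -e
tmp=$(mktemp -d)                       # stand-in for the mounted backup drive
mkdir -p "$tmp/MY-PROJECT/.git"
echo data > "$tmp/MY-PROJECT/.git/config"
tar -czf "$tmp/misc_backup.tar.gz" -C "$tmp" MY-PROJECT
# -t lists members without extracting; a truncated/corrupt archive exits non-zero
tar -tzf "$tmp/misc_backup.tar.gz" > /dev/null && echo "archive OK"
rm -rf "$tmp"
```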
See all commits (oldest first):
git rev-list --reverse HEAD
Back up the local commit hashes to a file:
git rev-list --reverse HEAD > ../local_commits.txt
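With the hash list saved, `wc -l ../local_commits.txt` tells you the commit count, and from that roughly how many batch pushes to expect. A sketch of the arithmetic with a made-up commit count:

```shell
# hypothetical: 3472 commits in batches of 50 -> ceil(3472/50) batch pushes
commits=3472
echo $(( (commits + 49) / 50 ))   # -> 70
```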
Push from the local .git in batches (every 50th commit) to bypass the 2 GB per-push limit. If any push still fails, step smaller so each batch stays under 2 GB:
git rev-list --reverse HEAD | \
perl -ne 'print unless $i++ % 50;' | \
xargs -I{} git push origin {}:master
If you actually fucked up the linking, master and main will differ. I waited to confirm the difference, since the histories were different enough that git would not let me do a PR. Then, once I felt confident, I ran the same thing against main with a force:
git rev-list --reverse HEAD | perl -ne 'print unless $i++ % 50;' | xargs -I{} git push origin {}:main --force
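Note that pushing only every 50th hash can leave up to 49 commits past the last marker unpushed, so finish with a plain push of the tip (the StackOverflow answer ends the same way). A self-contained rehearsal of the whole scheme with two throwaway local repos: batch size 3 and 7 commits stand in for 50 and thousands, and the full `refs/heads/main` form is used only because the demo remote starts empty (a real remote that already has the branch accepts the short form above):

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/remote.git"   # stand-in for the GitHub remote
git init -q "$tmp/work"
cd "$tmp/work"
git config user.email you@example.com
git config user.name you
for i in $(seq 1 7); do echo "$i" > f; git add f; git commit -qm "c$i"; done
git remote add origin "$tmp/remote.git"
# push every 3rd commit oldest-first, then the tip to catch the remainder
git rev-list --reverse HEAD | perl -ne 'print unless $i++ % 3;' \
  | xargs -I{} git push -q origin {}:refs/heads/main
git push -q origin HEAD:refs/heads/main
echo "remote commits: $(git -C "$tmp/remote.git" rev-list --count main)"
cd /
rm -rf "$tmp"
```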
https://stackoverflow.com/questions/70966245/i-want-to-add-a-9-7-gb-project-to-github